Human Computer Interaction

NUS CS3240 Team 9

Human-Computer Interaction Design in Sci-Fi Movies

Science fiction movies are usually set in worlds with advanced, fictional technology. To visualize these otherwise abstract and possibly invisible technologies, human-computer interfaces are an important component of the genre.
Inevitably, HCI research and sci-fi movies inspire each other in many different ways. The figure on the left shows the different ways in which filmmakers and researchers could collaborate.

There are many examples where movies depict a now well-known human-computer interaction method using technology far more advanced than what was actually possible at the time of production.

The speech interface was first depicted in “Colossus: The Forbin Project” (1970). Since then, many robots and computers in sci-fi movies seem to have a similar capability. The most popular example of a speech interface is probably the main computer of the USS Enterprise, the spaceship of “Star Trek: The Next Generation”.
Another example of realized technology first introduced through sci-fi films is the palm/fingerprint recognition system. It is depicted in a scene of “The Bourne Identity” where the protagonist needs to get past a high-tech security system to access his locker at a Swiss bank.
Of course, there are also movies with unique visions of human-computer interaction that have never been implemented and probably never will be, because the imagined technology is simply unrealistic.
The evolution of HCI in movies eventually led to collaboration between HCI scientists and film directors to portray sophisticated yet seemingly realistic technology. Steven Spielberg’s “Minority Report”, for example, involved HCI scientists as much as possible in order to construct an authentic vision of future computer usage.

Filed under: Interactive Devices, User Interface

[Week5 Post] Interactive Devices

Overview of Interactive Devices 

There are many different types of interaction devices being used and conceived today. Some are familiar tools from the past, and many are still distant concepts of the future. In this post we will discuss two trending interaction methods: the touch screen and gesture recognition.

Touch Screen 

Basic Information

There are different ways to achieve a touch screen experience and, through careful selection of these methods, developers are able to customize their touch screen experiences. Resistive touch screens, which use electrodes and a voltage comparison, are relatively simple and inexpensive to manufacture, which makes mass production of widely marketed devices easier. Different capacitive types allow for thicker glass, gloved hands, or the use of one or multiple fingers. Surface acoustic wave touch screens check for disturbances (caused by fingers) in emitted sound waves; they offer the clearest screen and can be used with many mediums (gloves, stylus, etc.). Beyond the hardware differences, software alone greatly differentiates one touch device from another, even when they use the same hardware methods.
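The resistive method's voltage comparison can be sketched in a few lines: the controller effectively reads a voltage divider on each layer and scales the analog-to-digital (ADC) reading to pixel coordinates. This is only an illustrative sketch; the ADC resolution and screen size below are assumed values, and real drivers also debounce and calibrate the readings.

```python
# Sketch of 4-wire resistive touch sensing: each layer acts as a voltage
# divider, so the ADC reading is proportional to the touch position.
# ADC resolution and screen dimensions are assumed for illustration.

ADC_MAX = 4095              # 12-bit ADC full-scale reading
SCREEN_W, SCREEN_H = 320, 240

def touch_position(adc_x: int, adc_y: int) -> tuple:
    """Map raw ADC voltage readings to pixel coordinates by linear scaling."""
    x = adc_x * SCREEN_W // ADC_MAX
    y = adc_y * SCREEN_H // ADC_MAX
    return x, y

print(touch_position(2048, 1024))  # -> (160, 60)
```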

Touch screens are such a big field because, by combining input and output, they give rise to a multitude of new experiences. Alongside new design experiences come new hardware and software possibilities, so the use of touch screens keeps growing rapidly. The way code interacts with the touch screen mechanism is a separate but equally important issue from the way the device interacts with people and other devices. A developer can further the field by syncing the touch device with other devices, letting the touch input and output also serve as input and output for other, not necessarily related, devices, as with haptics, the Nintendo 3DS, and some physical uses of the iPhone. A developer can also forgo pioneering physical aspects and instead investigate new ways that programs can receive input and give output via the touch screen, such as 3D interfaces and gesture recognition research.

Mobile Touch Screen Device

The touch screen’s potential for providing different styles of interface also allows touch-screen smartphones to serve as interaction devices for other hardware. A smartphone or tablet can act as a remote control for a TV or other electronic device, provided both devices are connected to the same network and the right app is installed. The Apple TV and the Samsung Smart TV are among the few devices currently utilizing this new technology.

In addition, these “smart TVs” have computer-like functions, such as the ability to surf the internet. Conventional TV remote controls aren’t sufficient or comfortable enough to make full use of all these functions, whereas a smartphone can provide a touchpad interface (just like a laptop’s) and a keyboard for typing.
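The phone-as-remote idea boils down to sending small command messages over the shared local network. The sketch below shows one minimal way this could work; the JSON command vocabulary, port, and use of UDP are assumptions for illustration, not the protocol any particular smart TV actually uses.

```python
# Minimal sketch of a "phone as TV remote": the phone app encodes an
# action as JSON and sends it to the TV over the shared local network.
# The message format and transport (UDP) are hypothetical.
import json
import socket

def send_command(action: str, tv_addr) -> None:
    """Send one remote-control action to the TV as a single UDP datagram."""
    msg = json.dumps({"action": action}).encode()
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.sendto(msg, tv_addr)

# e.g. send_command("volume_up", ("192.168.1.50", 9000))
```

A real app would also discover the TV's address on the network and likely use an acknowledged transport, but the core interaction is this simple.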


Gesture Recognition

Basic Information

In the future, more intuitive, flexible, and interactive devices will take the place of the fixed-size screens that rule our lives today. Gesture recognition is one such technology that will change our interaction with machines all around us.

Gesture recognition is a topic in computer science and language technology with the goal of interpreting human movement through mathematical algorithms. Current focuses in the field include emotion recognition from the face and hand gesture recognition. It builds a richer bridge between machines and humans than a primitive text user interface or even a GUI (graphical user interface). Gesture recognition enables humans to interact with machines naturally, without any mechanical devices, and could potentially make input devices such as mice, keyboards, or even touch screens redundant.
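To make "interpreting human movement through mathematical algorithms" concrete, here is a toy classifier that maps a tracked hand (or finger) trajectory to one of four swipe gestures based on its net displacement. Real systems use far richer models, but the core idea of turning movement into a symbolic gesture is the same.

```python
# Toy gesture interpretation: classify a trajectory of tracked points
# as a swipe gesture from its net displacement. Note that screen
# coordinates grow downward, so positive dy means "down".

def classify_swipe(points):
    """points: list of (x, y) samples along the tracked trajectory."""
    (x0, y0), (x1, y1) = points[0], points[-1]
    dx, dy = x1 - x0, y1 - y0
    if abs(dx) >= abs(dy):                       # mostly horizontal motion
        return "swipe_right" if dx > 0 else "swipe_left"
    return "swipe_down" if dy > 0 else "swipe_up"

print(classify_swipe([(0, 0), (40, 5), (90, 8)]))  # -> swipe_right
```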

Application of Gesture Recognition     

1. Kinect:


Kinect is a motion-sensing input device launched by Microsoft for the Xbox 360 game console. Its main function is to let you control the Xbox through voice or gestures rather than a physical controller.

Kinect is based on technologies developed by Microsoft and PrimeSense. It uses an infrared projector and a depth-sensing camera to read your gestures, enabling completely hands-free control of the gadget or game you are playing.
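One thing a depth sensor like this makes easy is separating the user from the background: the pixels nearest the sensor are typically the outstretched hand. The sketch below shows that idea on a tiny invented depth map (millimetre readings, 0 meaning no reading); it is an illustration of depth segmentation in general, not Kinect's actual pipeline.

```python
# Sketch of depth-based segmentation, the kind of step a depth camera
# enables: keep only pixels within a band of the nearest valid reading.
# Depth values (mm) are invented; 0 means "no reading" and is ignored.

def nearest_mask(depth, band=100):
    """Return a boolean mask of pixels within `band` mm of the nearest point."""
    nearest = min(d for row in depth for d in row if d > 0)
    return [[0 < d <= nearest + band for d in row] for row in depth]

depth = [[0, 900, 880],
         [1500, 870, 1490],
         [1520, 1510, 1480]]
print(nearest_mask(depth))  # True only for the cluster around 870-900 mm
```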

Microsoft has already sold more than 18 million Kinect units and plans to implement the same system and technology for the PC, with a release planned for February this year.

2. Eon Interactive Mirror

EON Interactive Mirror enables customers to virtually try on clothes, dresses, handbags, and accessories using gesture-based interaction. Changing from one dress to another is just a ‘swipe’ away, offering endless possibilities for mixing designs and accessories in a fun, quick, and intuitive way. Customers can snap a picture of their current selections and share it on Facebook or other social media to get instant feedback from friends.

Future Trend of Interactive Devices

An intelligent interactive interface like those seen in movies such as “Minority Report”, “Avatar”, and “Star Trek” may sound far-fetched, but it is possible to implement in the real world.

On Friday, February 12, 2010, John Underkoffler, the Chief Scientist at Oblong Industries and the leader of the team that created the futuristic interface concept used in “Minority Report”, gave a TED talk on the g-speak spatial operating environment.

John Underkoffler points to the future of UI

Sixth Sense

Sixth Sense is a wearable gadget that helps us interact with devices using gesture recognition. It was developed by Pranav Mistry, a researcher at the MIT Media Lab. It consists of a small projector and a camera connected to a smartphone: the projector projects virtual images of objects onto a surface, and the camera tracks your gestures, which then drive the system; multi-user interaction is also supported. This technology hasn’t been used at a commercial scale yet, but the researcher has released it as open-source software, so the code is available online.

References:

1. PrimeSense, Technology Review, <http://www.technologyreview.com/tr50/primesense/>

2. An interview with Pranav Mistry, the genius behind Sixth Sense, TED Blog, 11 Mar 2009. <http://blog.ted.com/2009/03/11/sixth_sense_pranav/>

3. Next generation computers will be highly interactive devices controlled by gestures, The Independent, 10 Feb 2010. <http://www.independent.co.uk/life-style/gadgets-and-tech/news/next-generation-computers-will-be-highly-interactive-devices-controlled-by-gestures-1901967.html>

Filed under: Interactive Devices