Future tech. If you can dream it, you can make it. The promise of sci-fi movies.
Multi-touch displays are one of those technologies. In the 2002 film Minority Report, Tom Cruise's character manipulates data with both hands on a huge virtual display. It seems fantastical, but the real thing isn't that far off. In fact, products with similar technology already exist.
Engineers have been working on multi-touch for over 25 years, starting in 1982 at the University of Toronto. More than simple pressure-sensitive touch screens, these displays keep track of multiple points of contact and link them together to recognize gestures. This allows on-screen elements to be manipulated more intuitively.
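The "keep track of multiple points of contact" part can be pictured with a minimal sketch: each frame, the sensor reports a bag of (x, y) points, and the system must decide which point belongs to which ongoing touch. The function below is a hypothetical illustration (nearest-neighbor matching with a made-up `max_jump` threshold), not how any particular product actually does it.

```python
import math

def update_touches(tracked, detected, max_jump=50.0):
    """Link newly detected contact points to tracked touches.

    tracked:  dict mapping touch id -> (x, y) last known position
    detected: list of (x, y) points sensed this frame

    A detected point is assigned to the nearest tracked touch within
    max_jump pixels; unmatched points start new touches. This is a
    simplified sketch; real drivers use more robust matching.
    """
    next_id = max(tracked, default=-1) + 1
    updated = {}
    remaining = dict(tracked)
    for pt in detected:
        best_id, best_d = None, max_jump
        for tid, pos in remaining.items():
            d = math.hypot(pt[0] - pos[0], pt[1] - pos[1])
            if d < best_d:
                best_id, best_d = tid, d
        if best_id is not None:
            # Continue an existing touch: same id, new position.
            updated[best_id] = pt
            del remaining[best_id]
        else:
            # No nearby touch: a finger just landed, start a new id.
            updated[next_id] = pt
            next_id += 1
    return updated
```

With stable ids frame after frame, the system can watch how each pair of touches moves relative to one another, which is what makes gesture recognition possible.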
Jeff Han @ TED
At the TED conference in February 2006, Jeff Han, an NYU research scientist, gave the first public demonstration of a high-resolution display that created excitement with simple, multi-user, multi-touch interfaces. The applications he showed were 'interface-free,' meaning there were no drop-down menus, no pointer - none of the usual contraptions used in today's GUI operating systems. To zoom in on a photo, you touched the image with two fingers and spread them apart. To zoom out, you did the reverse. Rotating an object worked the same way - two fingers moving separate corners wherever you wanted them. The whole system looked remarkably intuitive.
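The geometry behind those zoom and rotate gestures is simple enough to sketch. Given two fingers' start and current positions, the change in the distance between them gives a zoom scale, and the change in the angle of the line between them gives a rotation. This is a minimal illustration of the idea, not Han's actual implementation:

```python
import math

def pinch_params(p1_start, p2_start, p1_now, p2_now):
    """Derive a zoom scale and rotation angle from two touch points.

    Each argument is an (x, y) tuple: the two fingers' starting
    positions and their current positions.
    """
    def dist(a, b):
        return math.hypot(b[0] - a[0], b[1] - a[1])

    def angle(a, b):
        return math.atan2(b[1] - a[1], b[0] - a[0])

    # Spreading the fingers apart grows the distance -> scale > 1
    # (zoom in); pinching them together shrinks it -> scale < 1
    # (zoom out).
    scale = dist(p1_now, p2_now) / dist(p1_start, p2_start)

    # Rotating the fingers changes the angle of the line between them.
    rotation = angle(p1_now, p2_now) - angle(p1_start, p2_start)
    return scale, rotation
```

For example, fingers starting at (0, 0) and (100, 0) that spread to (0, 0) and (200, 0) yield a scale of 2.0 with no rotation - exactly the two-finger zoom Han showed on stage.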
In 2001, Stevie Bathiche and Andy Wilson from Microsoft started brainstorming a multi-touch interface for a virtual game table, an effort that eventually became Microsoft's Surface technology. Announced on May 29, 2007, Surface takes the form of a 30-inch tabletop display. The technology goes a step further than Han's interface in that it recognizes more than just fingers: Bluetooth cellphones and digital cameras are sensed as they are set on the table, and information can be downloaded or transferred.
June 29, 2007 saw the release of the much-hyped Apple iPhone. The phone dispenses with a physical keypad, relying instead on a multi-touch interface and a virtual keyboard. The two-finger gesture Jeff Han demonstrated in his TED talk is how you zoom in on webpages, photos and maps.
Just imagine what the future has in store...
Perhaps biometrics will be integrated so that one surface could serve multiple users, with the system knowing who is touching which area. This could be a way to ensure security in a shared interface.
TED Talks - Jeff Han