Computing with a wave of the hand
The iPhone’s familiar touch screen uses capacitive sensing, in which the proximity of a finger alters the electric field measured by a grid of sensors embedded in the display. A competing approach, which uses embedded optical sensors to track the movement of the user’s fingers, is just now coming to market. But researchers at MIT’s Media Lab have already figured out how to use such sensors to turn displays into giant lensless cameras. On Dec. 19 at Siggraph Asia (a recent spinoff of Siggraph, the premier graphics research conference), the MIT team is presenting the first application of its work: a display that lets users manipulate on-screen images using hand gestures.

Many other researchers have been working on such gestural interfaces, which would, for example, allow computer users to drag windows around a screen simply by pointing at them and moving their fingers, or to rotate a virtual object through three dimensions with a...
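To make that interaction concrete, here is a minimal sketch, in Python, of how a gestural interface might turn tracked fingertip positions into a window drag. It is not the MIT team's code: the `get_fingertip` and `move_window` calls are hypothetical placeholders for whatever a display's sensor layer and window manager would actually expose.

```python
# Hypothetical sketch: follow a tracked fingertip and apply its frame-to-frame
# displacement to a window, producing a "drag by pointing" interaction.

def drag_window(get_fingertip, move_window, frames=100):
    """Drag a window by following a tracked fingertip for a fixed number of frames."""
    prev = get_fingertip()                     # (x, y) in pixels, or None if no finger is seen
    for _ in range(frames):
        cur = get_fingertip()
        if prev is not None and cur is not None:
            dx, dy = cur[0] - prev[0], cur[1] - prev[1]
            move_window(dx, dy)                # move the window by the same offset
        prev = cur


if __name__ == "__main__":
    # Stand-in tracker: a finger moving steadily to the right across the screen.
    positions = iter([(100 + i, 200) for i in range(10)])
    window = [0, 0]

    def fake_tracker():
        return next(positions, None)

    def fake_move(dx, dy):
        window[0] += dx
        window[1] += dy
        print(f"window moved to {tuple(window)}")

    drag_window(fake_tracker, fake_move, frames=10)
```

The same pattern, with a richer tracker reporting several fingertips or hand orientation, is what would drive the three-dimensional rotation gesture mentioned above.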