Startup Tarsier promises a touchscreen floating in the air

Tarsier will show off its emerging user-interface technology at Demo Fall

IDG News Service | Consumerization of IT

While consumers go crazy for touchscreens on their smartphones and tablets, Minnesota startup Tarsier is working on what it calls the interface for the next 30 years: a touchscreen you don't have to touch.

Tarsier's MoveEye technology lets users reach out and manipulate icons, windows or images on a screen as if they're floating in the air, according to co-founder Shafa Wala. The company is looking toward TVs as a natural place to implement MoveEye, because users normally look at TVs from several feet away. Tarsier will demonstrate some of the MoveEye technology at the Demo Fall conference this week in Santa Clara, California.

Despite years of promises of interactive TV, the systems that have hit the market, such as Google TV and Microsoft's earlier WebTV, have never gained much traction.

"Today, we don't really have a way to interact with our TVs from a distance," Wala said. "The remote control, the mouse, and the keyboard are really not the ideal input devices for interacting with TVs."

In place of those familiar PC tools, MoveEye uses a special pair of glasses, a "media box" to run the interface, and software. The glasses have a built-in stereoscopic pair of cameras pointed at the screen, sensors to detect the viewer's eye movements, and Wi-Fi to talk to the media box. Together, these give the illusion of viewing and touching the interface in thin air.
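
Tarsier has not published MoveEye's internals, but the described data flow can be sketched roughly as follows. Everything here is hypothetical and assumed only for illustration: the glasses stream stereo frames and gaze readings over Wi-Fi, and the media box turns each reading into an on-screen pointer event.

from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class GlassesSample:
    """One reading sent from the glasses to the media box (hypothetical)."""
    left_frame: bytes                     # image from the left screen-facing camera
    right_frame: bytes                    # image from the right screen-facing camera
    gaze: Tuple[float, float, float]      # gaze direction from the eye sensors
    timestamp_ms: int


@dataclass
class PointerEvent:
    """UI event the media box would hand to the on-screen interface."""
    x: float                              # horizontal screen position, 0..1
    y: float                              # vertical screen position, 0..1
    action: str                           # e.g. "point", "grab", "release"


class MediaBox:
    """Placeholder for the box that runs the interface."""

    def handle(self, sample: GlassesSample) -> Optional[PointerEvent]:
        # The real system would locate the screen and the user's hand in the
        # stereo pair, fuse that with gaze, and classify the gesture.
        # This stub simply reports a neutral "point" at the screen's center.
        return PointerEvent(x=0.5, y=0.5, action="point")


# Minimal usage: feed one (empty) sample through the pipeline stub.
event = MediaBox().handle(GlassesSample(b"", b"", (0.0, 0.0, 1.0), 0))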

"You just point at what you want to interact with, or you just grab something that's projecting out of the screen," Wala said.

The first version of MoveEye will be two-dimensional, with the user manipulating objects on a plane suspended in space. Later, Tarsier will add 3D capability so users can feel as if they're reaching into an interface that has more than one layer.

Using gestures to control something on a screen is not new. Both Nintendo's Wii and Microsoft's Kinect let users play games and carry out other tasks with movements and gestures. But MoveEye can read hand gestures more accurately, as well as where those gestures are directed, Wala said. That precision is necessary for the full range of actions on a computer, such as grabbing a small object, he said.

MoveEye achieves this by viewing the screen from the user's perspective. While the Wii system follows a controller in the user's hand and Kinect faces the user to watch body movements, MoveEye captures the scene through the stereoscopic cameras in its glasses. That view, combined with the eye-tracking sensors, tells the system precisely where each element on the screen appears to the user, Wala said. Algorithms and software developed by Tarsier do the rest.
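
Tarsier has not disclosed its algorithms, but the geometry this implies can be sketched: once the glasses' cameras have located the screen and triangulated the fingertip in the same coordinate frame, pointing reduces to intersecting the eye-to-fingertip ray with the screen plane. The function below is an illustrative assumption, not Tarsier's method; all names and inputs are hypothetical.

import numpy as np


def point_on_screen(eye, fingertip, screen_top_left, screen_right, screen_down):
    """Return (u, v) in [0, 1] where the eye->fingertip ray meets the screen.

    eye, fingertip   -- 3D points in the glasses' coordinate frame
    screen_top_left  -- 3D position of the screen's top-left corner
    screen_right     -- edge vector spanning the screen's width
    screen_down      -- edge vector spanning the screen's height
    """
    eye = np.asarray(eye, float)
    direction = np.asarray(fingertip, float) - eye        # pointing ray
    origin = np.asarray(screen_top_left, float)
    right = np.asarray(screen_right, float)
    down = np.asarray(screen_down, float)

    normal = np.cross(right, down)                        # screen plane normal
    denom = direction @ normal
    if abs(denom) < 1e-9:
        return None                                       # ray parallel to the screen
    t = ((origin - eye) @ normal) / denom
    if t <= 0:
        return None                                       # screen is behind the ray
    hit = eye + t * direction                             # 3D intersection point
    rel = hit - origin
    u = (rel @ right) / (right @ right)                   # 0..1 across the width
    v = (rel @ down) / (down @ down)                      # 0..1 down the height
    return (u, v) if 0.0 <= u <= 1.0 and 0.0 <= v <= 1.0 else None


# Example: a 1.0 m x 0.6 m screen two meters in front of the viewer.
print(point_on_screen(eye=[0, 0, 0], fingertip=[0.1, -0.05, 0.5],
                      screen_top_left=[-0.5, 0.3, 2.0],
                      screen_right=[1.0, 0.0, 0.0], screen_down=[0.0, -0.6, 0.0]))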
