That's what Microsoft's research group said Tuesday as it kicked off the TechFest conference, an annual gathering of research scientists at Microsoft's Redmond, Wash., headquarters.
"In the months and years to come, a growing number of Microsoft products will recognize voices and gestures, read facial expressions and make computing easier, more intuitive and more productive," Microsoft said.
Microsoft Research "is working closely with Microsoft business units to develop new products" using technology similar to the natural user interface exhibited by Kinect, the company said. "Computers are moving rapidly toward ... interfaces that are more intuitive, that are easier to use, and that adapt to human habits and wishes, rather than forcing humans to adapt to computers."
Several of the 17 Microsoft projects exhibited at TechFest take advantage of natural user interfaces.
Microsoft's Applied Sciences Group is working on smart, interactive displays using Kinect technology, including one that improves the display of 3-D images. The project uses "a special, flat optical lens (Wedge) behind an LCD monitor to direct a narrow beam of light into each of a viewer's eyes," Microsoft said. "By using a Kinect head tracker, the user's relation to the display is tracked, and thereby, the prototype is able to steer that narrow beam to the user. The combination creates a 3-D image that is steered to the viewer without the need for glasses or holding your head in place."
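Microsoft hasn't published the prototype's internals, but the geometry it describes is straightforward: track the head, then aim a separate beam at each eye. A minimal, purely illustrative sketch (the function name, the 6.3 cm inter-pupillary distance, and the coordinate convention are all assumptions, not details from the project):

```python
import math

# Hypothetical sketch: given a head position from a Kinect-style tracker,
# compute the horizontal angles (in degrees) at which a steerable backlight
# would aim a narrow beam toward each of the viewer's eyes.

EYE_SEPARATION_M = 0.063  # assumed average inter-pupillary distance

def beam_angles(head_x_m: float, head_z_m: float):
    """Return (left_eye_deg, right_eye_deg) steering angles for a head
    centered at lateral offset head_x_m, at distance head_z_m from the
    screen. Positive angles aim to the viewer's right."""
    half = EYE_SEPARATION_M / 2.0
    left = math.degrees(math.atan2(head_x_m - half, head_z_m))
    right = math.degrees(math.atan2(head_x_m + half, head_z_m))
    return left, right

# A viewer centered 60 cm from the display gets two symmetric beams,
# one per eye -- the basis of glasses-free stereo 3-D.
print(beam_angles(0.0, 0.6))
```

As the tracker reports new head positions each frame, the angles are recomputed, which is why the viewer doesn't need to hold still.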
A related project uses a "Kinect-based virtual window," which tracks a "user's position relative to a 3-D display to create the illusion of looking through a window," Microsoft said. Along with other 3-D technologies, the Kinect-based virtual window could improve upon existing telepresence systems, Microsoft said.
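The "looking through a window" illusion described here is conventionally achieved with an off-axis (asymmetric) view frustum recomputed from the tracked head position; Microsoft hasn't detailed its implementation, so the following is a generic sketch of that standard technique, with illustrative window dimensions and names:

```python
# Hypothetical sketch of a head-tracked "virtual window": as the viewer
# moves, recompute an asymmetric view frustum so the screen behaves like
# a window into the scene. All dimensions and names are illustrative.

def window_frustum(head_x: float, head_y: float, head_z: float,
                   win_w: float = 0.5, win_h: float = 0.3,
                   near: float = 0.1):
    """Return (left, right, bottom, top) frustum extents at the near
    plane for a head at (head_x, head_y) relative to the window center,
    head_z meters in front of the screen plane. These four values could
    feed a projection call such as OpenGL's glFrustum."""
    scale = near / head_z  # project window edges onto the near plane
    left = (-win_w / 2 - head_x) * scale
    right = (win_w / 2 - head_x) * scale
    bottom = (-win_h / 2 - head_y) * scale
    top = (win_h / 2 - head_y) * scale
    return left, right, bottom, top

# Head centered 60 cm away: the frustum is symmetric.
print(window_frustum(0.0, 0.0, 0.6))
# Head shifted 10 cm right: the frustum skews, revealing more of the
# scene's left side -- exactly as a real window would.
print(window_frustum(0.1, 0.0, 0.6))
```

In a telepresence setting, the same per-frame recomputation would let a remote participant appear fixed in space behind the screen rather than painted onto it.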
Natural user interfaces figure in only a handful of the projects Microsoft is displaying at TechFest. Here are some more examples: