BARCELONA -- Software engineers at Intel are exploring new ways people can use the human voice, gestures and head-and-eye movements to operate computers.
Intel's Barry Solomon uses hand gestures in a demonstration of a perceptual computing toolkit being used by independent developers. (Photo by Matt Hamblen/Computerworld)
In coming years, their research is expected to help independent developers build computer games, help doctors control computers used in surgery, and help firefighters navigate burning buildings.
"We don't really know what this work will become, but it's going to be fascinating to watch it play out," said Craig Hurst, Intel's director of visual computing product management, in an interview at Mobile World Congress. "So far, what we've seen has gone beyond what we thought of originally."
Intel's visual computing unit, created two years ago, has grown to become a top priority for the chip maker, Hurst said. Last fall, the unit released several software toolkits that are used by independent developers to create a raft of new and sometimes unusual applications.
One of the toolkits, called the Perceptual Computing SDK (software development kit), was distributed to outside developers building applications that will be judged by Intel engineers. Intel plans to award $1 million in prizes to developers in 2013 for the most original application prototype designs, not only in gaming but also in work productivity and other areas.
Barry Solomon, a member of the visual computing product group, demonstrated how the Intel software is being used by developers on Windows 7 and Windows 8 desktops and laptops. With a special depth-perception camera clipped to the top of his laptop lid and connected over USB to the computer, Solomon was able to show how the SDK software rendered his facial expressions and hand gestures on the computer screen, accompanied by an overlay of lines and dots to show the precise position of his eyes and fingers. A full mesh model can then be rendered.
With that tracking information easily available, a developer can quickly insert a person's face and hands into an augmented reality scenario. Or, the person can be quickly overlaid onto a green screen commonly seen in video applications to make a weather or news report. The person's gestures could be used by a developer to interact with functions in a game or productivity application.
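As a rough illustration of that gesture-to-action idea, the sketch below classifies a tracked hand's horizontal positions as a left or right swipe and maps the result onto page navigation. The frame format, normalized coordinates, and threshold values are assumptions for illustration, not the SDK's actual API.

```python
# Hypothetical sketch: turn a stream of tracked hand x-positions
# (normalized 0.0-1.0, one sample per camera frame) into a swipe
# gesture that pages through an application. A toolkit like Intel's
# would supply the positions; the thresholds here are guesses.

def detect_swipe(x_positions, min_travel=0.3):
    """Return 'left', 'right', or None for one window of samples."""
    if len(x_positions) < 2:
        return None
    travel = x_positions[-1] - x_positions[0]
    if travel >= min_travel:
        return "right"
    if travel <= -min_travel:
        return "left"
    return None

def turn_page(current_page, gesture, page_count):
    """Map a swipe gesture onto page navigation, clamped to bounds."""
    if gesture == "right":
        return min(current_page + 1, page_count - 1)
    if gesture == "left":
        return max(current_page - 1, 0)
    return current_page

# Example: the hand moves steadily left to right across the frame.
samples = [0.1, 0.2, 0.35, 0.5, 0.65]
gesture = detect_swipe(samples)   # -> "right"
page = turn_page(0, gesture, 10)  # -> 1
```

The appeal of an SDK that exposes pre-computed tracking data is that an application developer only writes this kind of small mapping logic, rather than the computer-vision pipeline underneath it.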
A company called Touchcast is building a green-screen application that will be available later in 2013. The prototype depth camera Intel uses in its perceptual computing demonstrations with the SDK, called the Creative Interactive Gesture Camera, will also go on sale later this year.
Hurst said Intel's role in building the SDKs for developers is to "reduce the barriers" to making creative new applications. Voice software company Nuance worked with Intel on the speech recognition capabilities, while SoftKinetic provided depth recognition software for the camera and augmented reality software.
The camera's depth of field is not room-sized like that of Microsoft's Kinect for Xbox 360; Intel's version reaches from six inches to three feet from the camera attached to the laptop lid, Solomon said. Eventually, Intel expects to provide a perceptual computing toolkit for smartphones and tablets, which people interact with differently than they do with desktops and laptops.
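That six-inch-to-three-foot working range also explains the green-screen trick: keeping only the pixels whose measured depth falls inside the window isolates the user from the background. A minimal sketch, assuming depth values arrive in meters (a data-format assumption, not something the article specifies):

```python
# Hypothetical sketch: segment the user from the background by
# keeping only pixels whose depth falls inside the camera's
# working range (roughly 6 inches to 3 feet).

NEAR_M = 0.15  # ~6 inches, in meters (assumed units)
FAR_M = 0.91   # ~3 feet

def depth_mask(depth_row, near=NEAR_M, far=FAR_M):
    """True where a pixel is inside the usable depth window."""
    return [near <= d <= far for d in depth_row]

def composite_row(color_row, depth_row, background_row):
    """Green-screen-style composite: user pixels over a new background."""
    mask = depth_mask(depth_row)
    return [c if keep else b
            for c, keep, b in zip(color_row, mask, background_row)]

# One row of a toy frame: the middle pixel is the user at 0.5 m,
# the others are a wall well beyond the camera's range.
color = ["face", "face", "face"]
depth = [2.0, 0.5, 2.0]
weather_map = ["map", "map", "map"]
print(composite_row(color, depth, weather_map))  # ['map', 'face', 'map']
```

Unlike a real green screen, no special backdrop is needed, because the depth data itself says which pixels belong to the person.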
The perceptual computing concepts that Intel is building on with independent developers and partners have been around for years, but Hurst said that faster processors and better cameras, as well as consumer demand, "have risen to the point that it's become interesting."
It's possible that a surgeon could some day use a hand gesture to move through pages on a computer screen, rather than touching the screen and risking hand contamination. Voice commands could have the same advantage.
"There's a lot of nuance that you don't get from the keyboard and a mouse," Hurst said.
Hurst also said that the number of apps developers build with Intel's free tools will mushroom. At some point, Intel may decide to charge for the tools, but for now the company wants to build a global community of developers.
Hurst predicted that Intel's tools will get plenty of competition from other companies in the computing world. "Once developers see how easy it is to get access to these development capabilities, there will be an explosion in the ecosystem," he said. "This work is a very high priority for Intel."
In a remote corner of the massive MWC trade fair in hall 8.1, Intel is showing one of the prototype perceptual computing apps built by an independent German developer. The app lets users page through an on-screen catalog of photos with hand gestures and voice commands.
Solomon smiled as he pointed out that the prototype was built by a small game controller company that took the enigmatic name of "4tiitoo."
The company's playful name references the number 42, which plays a central role in the science fiction novel The Hitchhiker's Guide to the Galaxy, where it is described as the "Answer to the Ultimate Question of Life, the Universe, and Everything."
It's an ambitious name for an ambitious idea.
Matt Hamblen covers mobile and wireless, smartphones and other handhelds, and wireless networking for Computerworld.
This story, "Intel demos perceptual computing software toolkit" was originally published by Computerworld.