Bye-bye, mouse. Hello, mind control

By Maria Korolov, Network World | Hardware, voice recognition

"'Hannah,' for instance, for [UK's] M&S Bank, knows all about their credit cards, loans, and other financial service products," he says.

For companies that deploy virtual assistants like Hannah, the goal is to answer questions that normally are handled by human staff. According to Ezekiel, these virtual agents typically average 20% to 30% success rates, and the systems are continuously updated to learn from previous encounters so that they can handle more queries.

One Creative Virtual client, Telefónica UK, found that their intelligent agent Lucy reduced customer service calls by 10% to 15%. That doesn't mean that she only understands 10% to 15% of questions, says Telefónica knowledge base manager Richard Hagerty. "One of the key questions customers ask is, 'How do I contact customer service?'"

In other cases, Lucy might not yet know the answer, and the company will need to create one. "Maybe we wouldn't answer the question, anyway," he says.

What the company has learned over the past 12 months is that it's better to have one clear answer than to respond with several possible answers. In addition, Lucy needs to become a bit less human, he adds. For example, Lucy can handle a wide variety of personal questions. She says she likes Italian food, has seen Titanic several times, and enjoys tennis and salsa dancing.

"There's a back story that allows a customer to ask personal questions," Hagerty explains. "She lives in Wimbledon, and is engaged to her boyfriend. But some customers believe they are having a chat with a human being. So we are looking at reducing some of the elements of personalization so that our customers' expectations are managed correctly. We want to make it clear to our customers that it's an automated service they're using, not a human being."

Gestures a tough nut to crack

Interface designers looking to translate spoken -- or written -- words into practical goals have a solid advantage over those designing interfaces for gestures or other non-traditional input methods.

That's because designers are already familiar with the use of spoken language. And if they aren't, there is a great deal of research out there about how people use language to communicate, says MIT Media Lab's Holtzman. The language of human gestures is much less understood and less studied.

"We've been playing around with browser interfaces that work with you moving your body instead of moving a mouse," he says. But there are no common gesture equivalents to the "pinch to shrink" and "swipe to flip page" touch commands.

There are some gestures that are universally identifiable, but they may be less appropriate for the workplace.

"We're at the beginning of the gesture phase," he says. "And not just the gestures, but everything we can do with some kind of camera pointing at us, such as moving our eyebrows and moving our mouths. For example, the screen saver on the laptop -- why doesn't it use the camera on the lid to figure out whether to screen save? If your eyes are open and you're facing the display it should stay lit up."
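The presence-aware screen saver Holtzman describes is straightforward to sketch in logic, even before wiring in a real camera. The snippet below is a hypothetical decision loop, not any shipping feature: the `face_detected` input is assumed to come from a face detector watching the lid camera, and the display stays lit while someone is looking at it, blanking only after an idle timeout with no viewer.

```python
# Hypothetical presence-aware screen-saver logic, per Holtzman's idea:
# keep the display lit while the camera sees a face oriented toward it.
# A real face detector (e.g. a webcam feed) would supply face_detected.

IDLE_TIMEOUT = 300  # seconds without a detected face before blanking

class PresenceScreenSaver:
    def __init__(self, timeout=IDLE_TIMEOUT):
        self.timeout = timeout
        self.last_seen = 0.0    # timestamp a face was last detected
        self.display_on = True

    def update(self, now, face_detected):
        """Call once per camera frame with the current time in seconds."""
        if face_detected:
            self.last_seen = now      # viewer present: reset idle clock
            self.display_on = True
        elif now - self.last_seen >= self.timeout:
            self.display_on = False   # nobody watching: blank the screen
        return self.display_on

saver = PresenceScreenSaver(timeout=300)
saver.update(0.0, True)      # face visible -> display stays on
saver.update(200.0, False)   # viewer looked away, still within timeout
saver.update(301.0, False)   # timeout elapsed -> display blanks
```

The timeout keeps the screen from flickering off the moment a viewer glances away, which is the main practical wrinkle in this kind of presence detection.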

One company tracking hand motion is Infinite Z, which requires that users wear 3D glasses and use a stylus to touch objects which appear to float in the air in front of them.

"A virtual environment makes a lot of sense for computer-aided design, data visualization, pharmaceuticals, medicine, and oil and gas simulations," says David Chavez, the company's CTO. The product works with Unity 3D and other virtual environment engines, as well as the company's own Z-Space platform.

Another difficult technology to commercialize is eye tracking, which is commonly used to see which portions of an ad or website viewers look at first. It is also used to improve communication for the handicapped.

Reynold Bailey, a computer science professor at the Rochester Institute of Technology, uses eye-tracking technology to teach doctors to read mammograms better. The idea is to subtly highlight areas that the student should look at next, teaching them the scan patterns followed by experienced radiologists.

"If this works with mammograms, there are also other applications," he says. The same technology can be used to train pilots in how to check instruments, for example.

But he says he doesn't expect eye tracking to be used as an input device, to, say, replace a mouse for general-purpose use.

"The eye is not an input device," he says. "With the mouse, you can hover over a link and decide whether to click or not. With the eye, you might just be reading it, so you don't want to activate everything you look at. So you can do blink to click, but your eyes get tired from that. And we move our eyes around and blink involuntarily."

Limits of mind control

It may sound like science fiction, but mind-reading devices are already on the market -- and they don't require sensors or plugs to be implanted into your skull. Some work by sensing nerve signals sent to arms and legs, and are useful for helping restore mobility to the handicapped. Others read brain waves, such as the Intific, Emotiv and NeuroSky headsets.

The Intific and Emotiv headsets can be used to play video games with your mind. But these mind-reading devices can do more than just connect with computers. NeuroSky, for example, is the maker of the technology behind the Star Wars Force Trainer and Mattel's MindFlex Duel game, both of which allow players to levitate balls with the power of their minds.

That doesn't mean that office workers can sit back, think about the sentences they want to write, and have them magically appear on the screen. "If you're an able-bodied individual, typing words on a keyboard is just so much quicker and more reliable than doing it with the brain control interfaces," says MIT Media Lab's Holtzman.

A paralyzed person may benefit greatly from being able to pick out letters or move a paintbrush simply by thinking about it, he says. And moving a racecar around a track with your mind is a fun parlor trick. But it's still easier just to use a real paintbrush, or simply pick up the car with your hands and move it around.

But where mind reading can directly benefit an office worker is in picking up the user's mood, he says.
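Consumer EEG headsets expose coarse per-second readings rather than sentences -- NeuroSky's hardware, for instance, reports scores such as "attention" on a rough 0-100 scale. A mood-aware desktop could, hypothetically, smooth that signal and hold notifications while the wearer is deep in concentration. The sketch below is an illustration under assumed readings and an assumed threshold, not any vendor's actual API:

```python
# Hypothetical sketch: smooth a headset's 0-100 "attention" readings
# with a moving average and suppress desktop notifications while the
# user appears deeply focused. Readings, window, and threshold are all
# assumptions for illustration, not a real vendor API.
from collections import deque

FOCUS_THRESHOLD = 70   # smoothed attention above this = "do not disturb"
WINDOW = 5             # samples to average (headsets emit roughly 1/sec)

class FocusGate:
    def __init__(self, threshold=FOCUS_THRESHOLD, window=WINDOW):
        self.threshold = threshold
        self.samples = deque(maxlen=window)

    def allow_notifications(self, attention):
        """Feed one attention sample; return False while the user is focused."""
        self.samples.append(attention)
        smoothed = sum(self.samples) / len(self.samples)
        return smoothed < self.threshold

gate = FocusGate()
gate.allow_notifications(40)          # idle -> notifications allowed
for reading in (80, 85, 90, 88, 92):
    gate.allow_notifications(reading) # sustained focus -> held back
```

Smoothing matters here because raw EEG-derived scores are noisy; acting on single samples would toggle the interface constantly, while a short moving average captures the sustained concentration that actually signals "don't interrupt me."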


Originally published on Network World.