Speaktoit CEO talks about the future of natural language interface

More often than not these days, when you see someone talking on a smartphone, they aren't talking to another person; they're talking to the phone itself. And it answers back, often in a playful and engaging way. How Star Trek is that? In just a few years, the natural language interface has evolved from a clunky, atonal voice interface into something out of yesterday's science fiction.

Ilya Gelfenbeyn, CEO of Speaktoit, the top-rated virtual assistant for Android, has not only played a major role in making natural language assistants part of everyday life; he also has a vision for where the technology will be five years from now. That's the mark of a true visionary.

With four million downloads, Speaktoit was designed to go beyond interacting with the phone and calendar. "The idea was that it should be an intelligent interface for different services, so you don’t have to launch a browser or separate apps, you just tell it whatever you want and get results," explained Gelfenbeyn. Running through a long list of possibilities, he notes that you can update Facebook and Twitter with Speaktoit, check for places on Yelp, search Google or Wikipedia, or use ChaCha. "And we are expanding constantly," he adds, pointing to support for multiple languages as well as generic dialogue, so you can talk to it just because you’re bored and have nothing else to do.

And if that's not enough, the app puts a major focus on avatars. It may seem a minor point, but the ability to choose one's own avatar makes the personal assistant feel all the more lifelike.

A natural-language-based personal assistant is "smart"; that is, it learns over time. If you've ever played with one of these apps, you're surprised every now and then by the responses, and by the range of oddball questions you can ask. The goal, of course, is to make the assistant seem as realistic as possible, capable of holding something close to a real conversation rather than a simple question-and-answer exchange. Gelfenbeyn explained how this works. "We're trying to support generic dialogue, so you can just discuss your mood, or talk about random stuff with the assistant. And that helps us to involve our users. The user starts to ask different types of questions, and then even if this particular service is not yet supported by us, we get statistics that, for example, this particular request is typically asked, and we add this functionality. And it's not just a single question and answer. We take into account your preferences, we ask you qualifying questions, take into account where you are, and what you were talking about just before."
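Speaktoit has not published its internals, but a rough sketch helps make the mechanics concrete. The Python below is purely illustrative: it assumes a hypothetical DialogueContext that remembers the user's location and the previous topic, a handle function that asks a qualifying question when information is missing, and a simple counter of unsupported requests standing in for the usage statistics Gelfenbeyn describes.

    from collections import Counter

    class DialogueContext:
        """Rolling conversation state: last topic, user location, stored preferences."""
        def __init__(self):
            self.last_topic = None
            self.location = None
            self.preferences = {}

    # Requests the assistant could not handle; the most common ones indicate
    # which service integration to build next.
    unsupported_requests = Counter()

    def handle(utterance, ctx):
        """Very rough intent routing that leans on conversation context."""
        text = utterance.lower().strip()

        if "near" in text and ctx.location is None:
            return "Where are you right now?"         # qualifying question

        if "weather" in text:
            ctx.last_topic = "weather"
            return "Here is the weather for %s." % (ctx.location or "your area")

        if text in ("what about tomorrow?", "and tomorrow?") and ctx.last_topic == "weather":
            return "Tomorrow's forecast, coming up."  # uses what was said just before

        # Not supported yet: log it so usage statistics reveal the demand.
        unsupported_requests[text] += 1
        return "I can't do that yet, but I've made a note of it."

The point of the sketch is simply that "context" is ordinary state carried between turns; the hard part, which a real assistant handles with far more sophistication, is deciding which pieces of that state a given utterance should draw on.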

It seems, then, that the virtual assistant really does get smarter, actually "learning" over time. Looking to the future, Gelfenbeyn envisions getting assistance whenever it's needed, without even having to launch the assistant and ask a question. The assistant would, behind the scenes, track what you are doing and where you are, and push out information that may be relevant. "For example, it may know from your calendar that you’re late for a meeting, and it knows that the meeting is far from here. It might give you a reminder and suggest that you call a taxi." This level of intelligence comes from the use of meta-information: rather than interfacing with one service at a time, the assistant combines several services, just as a live person might, to figure out how best to use all of that information.
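To see how little it takes to combine two sources of meta-information, consider the sketch below. It is not Speaktoit's code; it is a minimal Python illustration assuming a hypothetical Meeting record drawn from the calendar, a distance figure derived from location data, and a proactive_suggestion function that decides whether to volunteer the taxi suggestion from Gelfenbeyn's example.

    from dataclasses import dataclass
    from datetime import datetime, timedelta

    @dataclass
    class Meeting:
        title: str
        starts_at: datetime
        distance_km: float            # distance from the user's current position

    def proactive_suggestion(meeting, now):
        """Combine calendar and location data to decide whether to speak up."""
        minutes_left = (meeting.starts_at - now).total_seconds() / 60
        travel_minutes = meeting.distance_km * 3      # crude city-traffic estimate

        if travel_minutes > minutes_left:
            return ("You may be late for '%s'; it's about %.0f km away. Call a taxi?"
                    % (meeting.title, meeting.distance_km))
        return None                                   # nothing worth interrupting for

    # A meeting in 15 minutes, 10 km away: the assistant volunteers a suggestion.
    meeting = Meeting("Board review", datetime.now() + timedelta(minutes=15), 10.0)
    print(proactive_suggestion(meeting, datetime.now()))

The interesting design question is not the arithmetic but the threshold for interrupting: push too often and the assistant becomes noise, too rarely and the proactive behavior goes unnoticed.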

The real future of the natural language interface, though, goes beyond the smartphone. "In five years, assistants will be introduced in lots of different devices, not just smartphones, but also in cars, homes, TVs and so on. You can think of it like a remote."

Having proven itself in the smartphone world, the natural language interface is already moving into the automotive realm, with Ford recently announcing integration of Siri. We already have "smart" homes, and it won't be much of a leap to reach the point where everyday consumer devices, from HVAC systems, entertainment consoles, and gaming devices to, yes, even your toaster and coffee pot, have an affordable natural language interface.
