Recently IDG News Service asked Google’s Marissa Mayer about the “perfect search engine.” The question posed was: “What is the perfect search engine? If you had a magic wand and could create it, what would it look like? What would it do?”
Mayer replied: “It would be a machine that could answer that question, really. It would be one that could understand speech, questions, phrases, what entities you’re talking about, concepts. It would be able to search all of the world’s information, [find] different ideas and concepts, and bring them back to you in a presentation that was really informative and coherent.”
What Mayer may have unknowingly described is Siri, a “virtual personal assistant” that uses artificial intelligence to determine user intent and then match that intent to data or applications that can fulfill it. The company will launch its iPhone application soon and already has a deal with a “tier one” US mobile carrier. The NY Times offers background on Siri and some of the technology behind the system:
SRI International’s software venture, called Siri, is more ambitious, in that it allows users to speak or write natural-language requests into the device (“Find me a place to eat dinner tonight with Karen, reserve a table and put it on our calendars.”), which will complete the task independently and inform you when it is done.
In terms of long-term predictions, Siri is actually an easy bet. Dag Kittlaus, the company’s chief executive, said one of the four major carriers would introduce the service early next year, and he said it would also be available as an iPhone app. But over the next two years the technology should be able to complete a wider range of tasks.
I’ve seen Siri in action and found it impressive. The system is not perfect, but it brings users closer to transactions and fulfillment of their objectives — at least in a range of use cases — than Google can on mobile devices today. It uses a voice interface to receive queries. You can use the keyboard if necessary, but that’s entirely secondary to the experience.
Interaction with it is “conversational” and “transactional” rather than a verbal version of a conventional search query.
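To make the distinction concrete, here is a toy sketch (purely illustrative, not Siri’s actual implementation) of the transactional model: the system maps a natural-language request to an intent and routes it to a task handler, rather than returning a page of links. The intent names and keyword lists are invented for the example; a real system would use statistical language understanding to extract entities and concepts.

```python
def classify_intent(request: str) -> str:
    """Crude keyword-based intent classifier (illustrative only)."""
    text = request.lower()
    if any(word in text for word in ("eat", "dinner", "restaurant", "table")):
        return "book_restaurant"
    if any(word in text for word in ("movie", "film", "showtimes")):
        return "find_movie"
    return "web_search"


def fulfill(intent: str, request: str) -> str:
    """Route the intent to a handler that performs the task,
    instead of rendering a list of search results."""
    handlers = {
        "book_restaurant": lambda r: f"Reserving a table based on: {r!r}",
        "find_movie": lambda r: f"Looking up showtimes for: {r!r}",
        "web_search": lambda r: f"Falling back to a web search for: {r!r}",
    }
    return handlers[intent](request)


request = "Find me a place to eat dinner tonight"
print(fulfill(classify_intent(request), request))
```

The point of the sketch is the routing step: a conventional search engine stops at ranked documents, while an assistant in this model commits to an interpretation of the request and acts on it.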
I moderated a panel at the recent Open Mobile Summit in San Francisco called “new directions in navigation and search,” which featured, among others, Siri CEO Dag Kittlaus. What became clear during the panel is that we’re going to see lots of innovation and change in mobile search, and that the present version of the experience could well be regarded as Jurassic in only a few years as the unique attributes of the device (e.g., the camera) become input mechanisms and search tools. Augmented reality is also a part of this, although in its present form it’s fairly undeveloped and limited.
And, as another example of how far things could develop away from the current “query box and blue links” search paradigm, look at the video demo below of “SixthSense,” a “wearable gestural interface”: