Eagerly Awaiting Google’s Voice Search For The iPhone
Yesterday, arguably the top tech story of the day was Google’s introduction of voice search for its iPhone app. As of this morning, the updated app still isn’t available in the iTunes store (that’s not Google’s fault). I spoke yesterday afternoon with Google’s Mike Cohen and Gummi Hafsteinsson about the app and how it would work. What they described was something that sounded qualitatively different in terms of accuracy and usability vs. competitive offerings now in the market.
Unfortunately, no one has had an opportunity to try the updated app yet. But, if it works as promised, it should be quite impressive.
The technology behind the new voice search capability is built partly on the same platform as Goog411, but apparently that’s only part of the story; there’s a good deal more going on, as well. In fact, Goog411 is reportedly improving and benefiting from the work done on voice search for the iPhone.
The first thing that is both intriguing and very different is that there are no buttons to push to initiate voice search. Once the app is open on the iPhone you hold it up to the side of your head as though you were going to talk on the phone and simply speak the query. Search results then appear as they would if you had manually entered a query.
Google says it has learned a tremendous amount from its experience with Goog411, but its desktop search query data is also contributing knowledge to the effort. These and other technical factors, beyond the scope of my expertise, will make the system more accurate than has been possible in the past, said Google’s Cohen.
Beyond its reported accuracy, the usability of the system is striking. Most voice control on mobile handsets requires that buttons be pushed. There’s also often a “walkie-talkie”-style experience, with the phone held out in front of the user to speak the query or command into the phone.
By removing the need to push a button and simply mimicking the experience and handset position of talking on the phone, Google’s voice search may prove to be quite a bit more natural and intuitive. Another benefit for Google is that by having the phone’s receiver closer to the mouth of the person speaking, the system gets a better, cleaner input.
I said yesterday in my postscript to Matt’s post that none of the voice apps or voice-initiated search tools currently on the market have proved to be a kind of “killer app” for mobile. That’s largely because of uneven accuracy, inconsistent success rates, or some other limitation or awkwardness. After my discussion with Google yesterday, I became hopeful that what Google was going to introduce would be a leap forward.
If it is, we should see increased query volumes and longer query strings — and increased search monetization from mobile for Google. But all this will become clearer once the app launches, hopefully today.
Voice search is launching in US English and will roll out to other handsets — and eventually other countries and languages — in the future.