Buried among the sexy hardware announcements and hair-raising Google Glass demo this morning were some fairly radical changes to the Google search experience for mobile devices.
To date, Google has compensated for the awkwardness of the PC SERP on smartphones with voice search and other marginal “fixes.” Today it took a potentially major step toward offering a distinct search experience designed specifically for mobile, one that may eventually bury the “old” mobile SERP.
The familiar smartphone SERP is still there and will probably continue to appear in a majority of cases. But the new experiences shown this morning will likely take over more and more. How that will affect SEM and Google’s mobile ad revenues isn’t clear, but I’m sure there are plenty of monetization scenarios.
What Google Announced
First, here’s what Google introduced:
- An improved Google search UI (incorporating the Knowledge Graph for more structured results)
- A Siri-like female voice to read you search results (in some circumstances)
- Google Now (a capability that takes your calendar, location and other data inputs and gives you information with no formal query prompting)
The images and screenshots went by very quickly and I was unable to capture any photos. The Google press site doesn’t include any images of the new experience or Google Now, which are all part of the Jelly Bean Android update. That update is rolling out to consumers in July.
Siri-like Spoken Search Results
Part of what Google is doing with its new approach to mobile search is responding to Apple and Siri, which represent a paradigm shift in the way we interact with machines. And partly Google is just adapting search to better fit mobile devices. Accordingly, what Google demonstrated this morning included improved voice search (with offline dictation capability).
Google also introduced a “female assistant” that can read back certain categories of search results. Google didn’t call it an “assistant” and didn’t name it (e.g., Majel); it was merely presented as an extension of voice search. Google Navigation has long offered spoken turn-by-turn directions, but this was a different female voice, and it sounded more natural and human than Siri does.
As with Siri, Google’s improved voice search seemed to trigger structured results presented as what might be called “answer cards.” Sports scores, local business information and other types of content were shown briefly. There are no screenshots that I’ve been able to find.
Google Now: Persistent Contextual Search
Another component of the new mobile search experience is what Google is calling “Google Now.” To some degree it merges with or extends the “assistant” metaphor, but it doesn’t seem to be triggered or accessible via voice search. (I might be wrong about this.) To access Google Now, users instead tap or touch the search box to get contextually relevant information based on location, time of day and their calendars.
Android knows what time it is, where you are and presumably what you’re doing if you’re using Google Calendar. It can give you transit information, tell you about travel time to your next meeting and so on. It’s pretty intriguing. All the content was presented on visually rich “answer cards” rather than a traditional mobile SERP. Users can also swipe away the answer or information “cards,” and the old Google mobile SERP reappears.
Dramatic Changes to Mobile Search Experience
Later on we’ll be able to play with Jelly Bean and the new experience and truly determine whether these are marginal use cases or whether they represent the kind of dramatic changes that I believe them to be. Indeed, Google Now, together with some of the other changes announced, is suggestive of Marissa Mayer’s description of “the perfect search engine.”
As mentioned, Jelly Bean will be available mid-July. Android developers can get access immediately. Stay tuned for more.