The future of search begins with a ‘V’
Voice and visual search will ultimately become significant drivers of query volume alongside text input.
The future of search begins with a “v” — as in voice and visual search. While “voice search” (on smart speakers) hasn’t taken off, as we enter the era of “ambient computing,” it’s clear that voice will be the universal interface for an increasingly diverse array of connected devices.
Not 50% but closing in — eventually. The oft-cited stat, “50% of all searches in 2020 will be voice searches,” has been discredited. However, in 2016 Google said, “in the Google app, 20% of searches are now by voice.” Since that time, adoption of voice and virtual assistants has grown significantly.
So while it might not be 50%, voice input (on smartphones) already drives a non-trivial percentage of queries. We just don’t know exactly how many because Google doesn’t break it out. And, as Amara’s law states, “We tend to overestimate the effect of a technology in the short run and underestimate the effect in the long run.” So it will be with voice.
Google and Bing visual search advancing rapidly. Visual search is less mature than voice but offers another compelling alternative to typing text into a box. Google has been rapidly developing its visual search tool Lens, which now has a wide array of capabilities: translating text, searching restaurant menus, scanning barcodes, identifying objects in the real world and driving commerce. Most recently, Google introduced “style ideas,” which let users search on items of clothing seen in stores or elsewhere in the real world and surface similar items, many of them available to buy.
This kind of visual search is built on computer vision, object recognition and machine learning. Microsoft has also done a great deal of work in this area since at least 2009 and has been making steady upgrades and improvements to Bing visual search. It also makes visual search available to third-party app developers.
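For developers curious what that third-party access looks like in practice, here is a minimal sketch of calling Bing Visual Search with an image. The endpoint URL, the `Ocp-Apim-Subscription-Key` header, the `image` form field and the `tags → actions → data.value` response shape reflect Microsoft’s public v7.0 documentation as I understand it, but treat all of them as assumptions to verify against the current docs; the key and image bytes here are placeholders.

```python
# Hedged sketch of a Bing Visual Search (v7.0) request/response flow.
# Endpoint, header name and response shape are assumptions based on
# Microsoft's public API docs; a real subscription key is required.

BING_VISUAL_SEARCH_URL = (
    "https://api.cognitive.microsoft.com/bing/v7.0/images/visualsearch"
)

def build_visual_search_request(api_key, image_bytes):
    """Assemble (url, headers, files) for a visual search POST.

    Pass the result to e.g. requests.post(url, headers=headers, files=files).
    """
    headers = {"Ocp-Apim-Subscription-Key": api_key}
    # The API expects the query image as multipart form data named "image".
    files = {"image": ("query.jpg", image_bytes)}
    return BING_VISUAL_SEARCH_URL, headers, files

def similar_image_urls(response):
    """Pull contentUrl values out of 'VisualSearch' actions in a response dict."""
    urls = []
    for tag in response.get("tags", []):
        for action in tag.get("actions", []):
            if action.get("actionType") == "VisualSearch":
                for item in action.get("data", {}).get("value", []):
                    urls.append(item.get("contentUrl"))
    return urls
```

The point of splitting request-building from response-parsing is that the parsing half can be exercised against captured JSON without network access.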
Pinterest Lens. Pinterest’s visual search efforts are a bit more under the radar than Google’s and Bing’s, but the company is doing pioneering work in the area. Pinterest enables users to isolate items in pins and search on them visually to find similar objects, or to use the smartphone camera to identify objects IRL and shop them online in the Pinterest app. The latter offering is also called “Lens,” setting up a potential naming or trademark dispute with Google at some point.
Most recently, Pinterest introduced shoppable Product Pins, connecting Lens image search results to e-commerce information (price and retailer link). In addition, photos captured with Lens can be saved to boards and become a source of future recommendations for those users.
Local visual search. In the event it wasn’t obvious, the ability to use a smartphone camera to search for objects, products or places offline is another form of local search. In other ways, too, image search and the smartphone camera are making their way into local. Google’s recent “search by photos” is one example. Another is the incorporation of augmented reality into Google Maps walking directions.
Why we should care. As a general matter, retailers and product marketers need to optimize for image search and local marketers need to ensure that their various local profiles (GMB, Yelp, Facebook, etc.) have a rich supply of images — profiles with optimized images significantly outperform those without.
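Beyond uploading photos to GMB, Yelp and Facebook profiles, one concrete way marketers can surface images to search engines is structured data on their own sites. Below is a minimal sketch of schema.org `LocalBusiness` markup carrying an `image` array, emitted as a JSON-LD script tag; the business name, address and image URLs are placeholders, and the exact properties search engines use should be checked against their current structured-data guidelines.

```python
# Minimal sketch: LocalBusiness JSON-LD with an "image" array.
# All business details below are placeholder values.
import json

local_business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Coffee Shop",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main St",
        "addressLocality": "Springfield",
    },
    # Several photos (storefront, interior, products) give search
    # engines more visual material to draw on.
    "image": [
        "https://example.com/photos/storefront.jpg",
        "https://example.com/photos/interior.jpg",
    ],
}

jsonld = json.dumps(local_business, indent=2)
# The snippet would be pasted into the page's <head> or <body>.
snippet = '<script type="application/ld+json">\n%s\n</script>' % jsonld
```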
Stepping back, we can say that the search box is expanding: SEO and content discovery are becoming more fragmented and complex. And while that day may not be here yet, marketers need to start preparing for a time when consumers use voice/virtual assistants and the smartphone camera as much as they use text in the traditional search box to access information and express buying intent.
Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land.