Google Lens improvements ready visual search and AR for mainstream adoption
Better text recognition, lookalike search and real-time functionality are among the upgrades.
Google Lens, the company’s visual search tool, was announced last year at I/O for Google’s Pixel devices and later rolled out for iOS and other Android devices, somewhat awkwardly, through the Photos app: you had to take a photo and then run Lens against it, rather than using the camera in real time.
This year at Google I/O, Lens’ distribution is expanding to a range of new devices and directly through the camera app as intended. The company also announced some updates:
- Lens can analyze text and provide explanatory pictures or background information. Google points out that this involves semantic understanding of the meaning and context of words and phrases.
- Lens will now show you lookalikes. If you see shoes or furniture you like out in the world, Lens will pull up more information about that item (assuming it's recognized) and can now also show you similar-looking alternatives.
- Google also says that Lens now works in real time “just by pointing your camera.”
In a related vein, Google Maps now has new Lens-like features. You can use the camera during navigation for improved walking directions and information about nearby places. The same computer vision and machine learning technologies behind Lens power these Maps features, as well as Lookout, a new app for the visually impaired.
Together with Apple’s ARKit, all of this suggests that augmented reality is poised to go mainstream, while virtual reality increasingly looks like a lucrative vertical or niche application.