Pinterest’s Lens update adds a Snapchat-style look and a fashion sense
Pinterest is making its image-scanning visual search tool easier to use and more useful for outfit ideas.
Pinterest wants to do for visual search what Google has done for text search. But to do that, Pinterest needs to make searching by taking a picture as easy as typing on a keyboard. So the search-slash-social platform continues to tweak its three-month-old in-app visual search tool, Lens.
Pinterest’s latest updates to Lens, announced on Wednesday, make the image-scanning feature easier to handle and handier for fashion.
For starters, Pinterest is updating the look of Lens. Of course that means it’s becoming more Snapchatty. When people open Lens by tapping the camera icon on the search page in Pinterest’s app, they will be shown a full-screen viewfinder a la Snapchat’s main screen. And like Snapchat’s camera screen, Lens’s new screen sports a capture button at the bottom for people to snap a photo for the visual search tool to scan.
More important than Lens’s familiar new look is its familiar new functionality. The new capture button gives people more control over the shots they take for Lens to process.
Previously, tapping Lens’s in-app viewfinder immediately captured the image to scan, so people couldn’t zoom in or out or focus their cameras before taking a photo with Lens. That made it more frustrating to frame the right shot and more likely that Lens would scan the wrong thing, too many things or nothing it could recognize. To make sure Lens zeroed in on the correct object, a person had to physically move closer to that object. To make sure Lens took in an entire scene, a person had to step back. And to make sure Lens could make out an object or scene at all, a person had to wait for the camera to autofocus, making it more difficult to discreetly scan a passerby’s shoes or bag before they walked out of view. As a result, people may have been better off taking a picture with their phone’s native camera app and later having Lens scan it from their camera roll.
With the Snapchat-style interface and a capture button that people can tap when they’re ready to take a photo, people can pinch Lens’s viewfinder to zoom in or out and tap it to focus before taking the shot. Those new controls appear intended to make people more likely to use Lens in the moment and less likely to see it as a hassle.
But people can still have Lens scan photos from their phones’ camera rolls. In fact, that’s now easier to do. Instead of requiring a tap on a button to pull up saved photos, Lens automatically displays them as a swipeable carousel at the bottom of the screen when it opens (assuming a person has given Pinterest’s app permission to access their phone’s camera roll).
Pinterest is also updating the types of results that Lens will present to people.
In keeping with Pinterest’s ambition to show people results they can act on, instead of gawk at, people scanning clothes with Lens will be shown outfit ideas in addition to similar-looking items. People have told the company they want Lens to give them “ideas for how to wear items they already own,” according to a company blog post. The outfit recommendations follow a similar move Pinterest made last month, which let people scan food items with Lens and find recipes in addition to photos of similar-looking food.
Finally, Pinterest is giving people a way to turn Lens’s results into their own search queries. The company is extending its Instant Ideas feature — through which a person taps a circle atop a pin to spawn a list of similar pins — to Lens. As a result, a person may use Lens to scan a jacket they own, see a list of outfit ideas, find one particular idea that almost hits the mark and tap it to refine the search.
Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land.