• paulaspeak

    The findings reflect what we would hope Google considers: local listings ought to be businesses that are close by, recommended, and high quality.

    What surprised me was the line below your four main points: “It doesn’t appear that third party reviews (e.g., Yelp) are factoring in the Carousel rankings…” Why wouldn’t Google consider Yelp ratings? People certainly use Yelp for recommendations, and Yelp itself cracks down on businesses that try to get reviews unnaturally. That review site, at least, seems fairly trustworthy … at least as trustworthy as Google Places’ ratings.

    Is it possible that Google is just making its own rating service more important? That seems too cut-throat and inappropriate for an organic search environment.

  • http://obpglobal.com/ Illa Hernandez

    I have always wondered what variables feed into Google's local carousel rankings. Now I have a bit of an idea, not that I'm planning to game the system for my little coffee shop.

  • Sergiu Draganus

    It would be very interesting to know how the study was conducted. Did you use local IP addresses to run the search queries, or only Google's location-change option? Local carousel results are different for local users in New York (for example) than for remote users whose Google location is set to New York.

    If the study did not use local IPs when searching, then all the results are only relative, not exact, and cannot be considered relevant.
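
    For reference, here is a rough sketch of what the local-IP approach would look like in practice: run the same query once directly and once through a geolocated proxy, then compare what comes back. The proxy endpoint below is a placeholder, not a real service.

        import requests

        QUERY_URL = "https://www.google.com/search"
        PARAMS = {"q": "pizza delivery"}

        # Hypothetical geolocated proxy with a New York IP; the endpoint is
        # a placeholder, not a real service.
        NY_PROXY = {"https": "http://user:pass@ny.proxy.example.com:8080"}

        # Same query from two vantage points: the tester's own IP, then a
        # New York IP.
        remote = requests.get(QUERY_URL, params=PARAMS, timeout=10)
        local = requests.get(QUERY_URL, params=PARAMS, proxies=NY_PROXY, timeout=10)

        # If carousel results depend on the searcher's IP, the two result
        # pages will list different businesses.
        print(remote.text == local.text)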

  • Kevin Mullaney

    Native reviews on Google are always going to be weighted significantly higher than those on any other third-party review platform when it comes to determining local carousel rank. However, what is missing from this study are the other key local ranking factors: it makes no mention of signals such as the quality and quantity of structured and unstructured citations, Google+ Local optimisation, and geo-tagged photos, all of which are believed to be an important part of carousel rankings. The study seems somewhat incomplete and biased toward reviews, although reviews undoubtedly play a key part.

  • Aaron Zwas

    Hi all, in response to questions and comments: the study was designed to determine what influence, if any, reviews have on Carousel rank. So we focused on reviews and set aside the many other complex factors, such as semantic, visual, personal/location, and price signals.

    1. The study has no opinion on Yelp or reviews on other sites. By design, we did not look at this. One should assume that Google reviews matter most, but, again, we did not collect data on this and therefore have no finding to share.

    2. Our findings show a high correlation, NOT that “reviews are the single most important variable” in determining ranking. As Kevin says, citations, geo-tags, and G+ quality all contribute too. Our study does not disregard the relevance of these items; it simply confirms that reviews are a notable ingredient in the mix. (A sketch of what measuring that correlation looks like follows this list.)

    3. Good point regarding potentially variable results by user location. To account for this, we conducted each specific search at least 3 times to gather directional data. These were desktop searches, and the users were usually not in the city for a given search term, yet the results were unexpectedly uniform. Although we do not count this as an official finding of the study, it appears that user location did NOT influence these desktop Carousel results.
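
    For anyone curious what “high correlation” means concretely, here is a minimal sketch of the kind of measurement involved: a Spearman rank correlation between carousel position and a review metric. The numbers below are invented for illustration; they are not data from the study.

        from scipy.stats import spearmanr

        # Hypothetical data for one query: carousel position (1 = leftmost
        # slot) and each listing's Google review count. All values are made up.
        carousel_position = [1, 2, 3, 4, 5, 6, 7, 8]
        review_count = [212, 180, 95, 140, 60, 33, 41, 12]

        # Spearman compares ranks: do listings with more reviews tend to sit
        # closer to the front of the carousel? A strongly negative rho means
        # yes, because position 1 is the best slot.
        rho, p_value = spearmanr(carousel_position, review_count)
        print(f"rho = {rho:.2f}, p = {p_value:.3f}")  # rho ≈ -0.95 here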

  • Illogicalthinker

    I don’t think the results would vary much. If the study had found that geographic signals were the biggest variable, then I would want to see results from local IPs. Maybe I am wrong, though.

  • Aaron Zwas

    Yes. As the author of the study, I can confirm that results were very consistent, regardless of user location. We would expect much more variability for mobile searches, but mobile was not included in this study.

  • http://www.flowerstation.co.uk/ Flower Delivery Guy

    The unfortunate truth:

    I think that Google is hopeless at tackling the issue of duplicate listings in its search results, hence the need to try to justify them.

    For example:

    https://www.google.co.uk/#q=flower+delivery+on+sunday&lrd=lrd

    Guess what: the top three listings are the same company, and it does not even offer the service. How ironic is that?

    So a company that does not offer a Sunday service is ranked 1, 2, and 3 purely because it is trying to damage the competitors that actually do offer the service.

    And to make things worse, it's the same company Google penalized for buying links only a few months before. WOW!!!!

    Will this get published? Highly unlikely.

    Will Google do something about this? Highly unlikely.

  • Aaron Zwas

    Flower Guy,
    Google doesn’t know what to do with your incomplete search term: you haven’t told it where you’d like your Sunday flower delivery.

    Try adding “in chelsea” or “in [any specific city/neighborhood]” to your search. Very different results appear.

  • http://www.flowerstation.co.uk/ Flower Delivery Guy

    You are missing the point… you have a big company trying to hurt a smaller company by outranking it on the term I mentioned above.

    They do not offer that service!

    Yet they are forcing themselves up there to prevent companies that do offer the service from getting customers.

  • Sergiu Draganus

    Thanks for the answer. It will be very interesting to see an analysis of local searches that use implicit keywords, without specifying the location in the search query.

  • Richrd Neal

    Yelp’s reviews are suspect at best. Many business owners, myself included, have found that Yelp filters out good reviews in the hope that we’ll be forced to pay to get to the top of Yelp’s listings. Bravo to Google for staying away from Yelp’s reviews…

  • http://www.fencing.net/ Craig

    What’s not clear from this study is how much review freshness/recency factors into the impact reviews have on the carousel result. I would expect the algorithm to evolve to include checks for review recency and relevancy in the reviews portion of the scoring; a rough sketch of one such check follows.
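
    If Google did fold recency into review scoring, one simple form would be an exponential time decay on each review's contribution. This is purely a hypothetical model for illustration; the half-life, function names, and dates below are all assumptions, not anything from the study.

        from datetime import date

        HALF_LIFE_DAYS = 180  # assumption: a review's weight halves every ~6 months

        def review_weight(review_date: date, today: date) -> float:
            """Exponential time decay: newer reviews count for more."""
            age_days = (today - review_date).days
            return 0.5 ** (age_days / HALF_LIFE_DAYS)

        def freshness_score(reviews, today):
            # Unnormalized sum of decayed star ratings, so a listing whose
            # reviews are all old scores low even if every review is 5 stars.
            return sum(review_weight(d, today) * stars for d, stars in reviews)

        # Two listings with identical ratings but different recency profiles:
        today = date(2014, 1, 15)
        fresh = [(date(2014, 1, 2), 5.0), (date(2013, 12, 1), 4.0)]
        stale = [(date(2011, 1, 2), 5.0), (date(2010, 12, 1), 4.0)]
        print(freshness_score(fresh, today))  # ≈ 8.1
        print(freshness_score(stale, today))  # ≈ 0.1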