Study: Google Reviews Determine Local Carousel Rankings

Since the launch of the Google “Local Carousel” in June, SEOs and marketers have been trying to reverse engineer the ranking variables that elevate listings into that hallowed ground. Mike Blumenthal offered a good early roundup of stories and analysis.

Now Digital Marketing Works (DMW) has conducted an extensive study of what might be called “Carousel results” and concluded that the quality and quantity of Google reviews are the single most important variables determining inclusion and ranking.

Google Local Carousel

The agency examined more than 4,500 search results in the hotels category across 47 US cities. Each SERP featured a Carousel result. Here’s more on the study’s methodology, quoted verbatim from DMW:

For the top 10 hotels of each search, we collected the hotel’s name, rating, quantity of reviews, and rank (as displayed in Carousel). We also recorded the travel time and distance from each hotel to Google’s definition of the given city . . .

Our research yielded approximately 42,000 data points, including data on approximately 1,900 distinct hotels . . .

Our study look[ed] for correlations between a hotel’s rank in a search result with each of the following: 1) Google review rating (out of five stars), 2) quantity of Google reviews, 3) travel time from the hotel to the searched city, and 4) driving distance from the hotel to the searched city. We can look for these correlations within all combinations of query type (like “best hotels in…”) and market tier. For example, we were not surprised to see a strong correlation between rank and travel time/distance: closer hotels ranked higher.  

There were four main findings from the study:

  1. Review quality and volume: “Carousel rank correlates highly with Google review ratings… Our study also showed an equally strong correlation for a hotel’s quantity of Google reviews”
  2. Distance and travel time are ranking variables
  3. Google appears to be weighting results based on a more nuanced understanding of queries and inferred user intent
  4. The findings and ranking variables held true in both large and small markets
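The correlation check DMW describes (Carousel rank vs. rating, review count, and travel time) can be sketched in a few lines of Python. The data below is hypothetical, and the study's actual dataset and tooling are not public; this is only an illustration of a Spearman rank correlation, a standard choice for ordinal data like Carousel positions.

```python
# Hypothetical sketch of the study's correlation check: does a hotel's
# Carousel position correlate with its Google review count?
# (Illustrative data only; DMW's actual dataset is not public.)

def rankdata(values):
    """Assign 1-based ranks to values; ties receive the average rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # Extend j over any run of tied values.
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of the 1-based positions i..j
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx, ry = rankdata(x), rankdata(y)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# A hypothetical top-10 Carousel result: position 1..10 vs. review count.
carousel_position = list(range(1, 11))
review_count = [420, 390, 510, 280, 260, 190, 150, 170, 90, 60]

# A strongly negative value means more reviews tend to go with a
# better (numerically lower) Carousel position.
print(round(spearman(carousel_position, review_count), 3))  # → -0.952
```

A study like DMW's would repeat this per query/market combination and per variable (rating, review count, travel time) to see which correlations hold across tiers.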

According to DMW, third-party reviews (e.g., Yelp) do not appear to factor into the Carousel rankings.

The agency goes into considerable additional detail about its methodology and scoring, as well as practical recommendations and takeaways from its findings. It’s worth taking a closer look.



About The Author: A Contributing Editor at Search Engine Land, he writes a personal blog, Screenwerk, about SoLoMo issues and connecting the dots between online and offline. He also posts at Internet2Go, which is focused on the mobile Internet. Follow him @gsterling.

  • paulaspeak

    The findings are what we would hope Google would consider. Local listings ought to be businesses that are close by, recommended, and high quality.

    What surprised me was the line below your four main points: “It doesn’t appear that third party reviews (e.g., Yelp) are factoring in the Carousel rankings…” Why wouldn’t Google consider Yelp ratings? People certainly use Yelp for recommendations, and Yelp itself cracks down on businesses that try to get reviews unnaturally. That review site, at least, seems fairly trustworthy … at least as trustworthy as Google Places’ ratings.

    Is it possible that Google is just making its own rating service more important? That seems too cut-throat and inappropriate for an organic search environment.

  • Illa Hernandez

    I have always wondered what variables contribute to those local carousels by Google. Now I have a bit of an idea, not that I’m planning to game the system for my little coffee shop.

  • Sergiu Draganus

    It would be very interesting to know how the study was conducted. Did you use local IP addresses to run the search queries, or only Google’s location-change option? Local carousel results differ for local users in New York (for example) versus remote users with their Google location set to New York.

    If the study did not use local IPs for the searches, then the results are only relative, not exact, and cannot be considered relevant.

  • Kevin Mullaney

    Native reviews on Google are always going to be weighted significantly higher than those on any other third-party review platform when it comes to determining local Carousel rank. However, what is missing from this study are the other key local ranking factors. It made no mention of other key ranking signals such as the quality and quantity of structured and unstructured citations, Google+ Local optimisation and geo-tagged photos, which are all believed to be an important part of Carousel rankings. It seems this study is somewhat incomplete and biased toward reviews, although they undoubtedly play a key part.

  • Aaron Zwas

    Hi all, in response to questions/comments: the study was designed to determine what influence, if any, reviews have on Carousel rank. So we designed it to focus on reviews and to set aside the many other complex factors, like semantic, visual, personal/location, and price.

    1. The study has no opinion on Yelp or reviews on other sites. By design, we did not look at this. One should assume that Google reviews matter most, but, again, we did not collect data on this and therefore have no finding to share.

    2. Our findings show high correlation, NOT that “reviews are the single most important variable” in determining ranking. As Kevin says, citations, geo-tags, and G+ quality all contribute too. Our study does not disregard the relevance of these items. It simply confirms that reviews are a notable ingredient in the mix.

    3. Good point regarding potentially variable results by user location. To account for this, we conducted each specific search at least 3 times to gain directional data. Because these were desktop searches and because the users were usually not in the city for a given search term, the results were unexpectedly uniform. Although we do not count this as an official finding of the study, it appears that user location did NOT influence these desktop Carousel results.

  • Illogicalthinker

    I don’t think the results would vary much. If the results said that geographical indicators were the biggest variant, then I would want to see results from local IPs. Maybe I am wrong though.

  • Aaron Zwas

    Yes. As the author of the study, I can confirm that results were very standard, regardless of user location. We expect much more variability for mobile searches, but did not include mobile in this study.

  • Flower Delivery Guy

    The unfortunate truth:

    I think that Google is hopeless at tackling the issue of duplicate listings in its search results, hence the need to try to justify them.

    For example:

    And guess what: the top three listings are the same company, and it doesn’t even offer the service. How ironic is that?

    So a company that does not offer a Sunday service is ranked 1, 2, 3 purely because it is trying to damage its competitors that actually do offer the service.

    And to make things worse, it’s the company they penalized for buying links only a few months before. WOW!!!!

    Will this get published? Highly unlikely.

    Will Google do something about this? Highly unlikely.

  • Aaron Zwas

    Flower Guy-
    Google doesn’t know what to do with your incomplete search term — you haven’t told it where you’d like your Sunday flower delivery.

    Try adding “in chelsea” or “in [any specific city / neighborhood]” to your search. Very different results appear.

  • Flower Delivery Guy

    You are missing the point… you have a big company trying to hurt a smaller company by outranking it on the term I mentioned above.

    they do not offer that service!

    yet they are forcing themselves up there to prevent companies that do offer the service from getting customers.

  • Sergiu Draganus

    Thanks for the answer. It will be very interesting to see an analysis of local searches using implicit keywords, without specifying the location in the search query.

  • Richrd Neal

    Yelp’s reviews are suspect at best. Many companies, like mine, have found that Yelp filters out some good reviews in hopes that, as a business owner, I’ll be forced to pay to get to the top of Yelp listings. Bravo for Google staying away from Yelp’s reviews…

  • Craig

    What’s not clear in this study is how much of a factor review freshness/recency is in the reviews’ impact on the Carousel result. I would expect this algorithm to evolve to include checks for review relevancy in the reviews portion of the scoring.
