Yellow Pages Sites Beat Google In Local Data Accuracy Test
In the brave new world of “SoLoMo” there are an increasing number of sites and mobile apps competing to help you choose a local business or lead you there. In addition to Google Maps, Yelp and Foursquare, there are the venerable yellow pages sites and many others. They all draw their local data from largely the same handful of sources, so one might expect all these sites to have comparably accurate information, right?
Roughly a month ago I spoke with Marc Brombert, the CEO of Implied Intelligence. His company provides a range of data-related services (e.g., enhancement, cleansing, de-duplication) to marketers and publishers. At the conclusion of our call I suggested that Implied Intelligence test the accuracy and completeness of the business listings data on several of the leading local search sites.
Surprise: Yellow pages beat Google for local search
Several weeks later Implied Intelligence sent me the results of its test. They’re a bit unexpected and illuminating. Google, which has probably devoted more effort and resources to local search than any of its competitors, did not come out on top in the test. Overall it placed third. Two yellow pages sites beat it.
Implied Intelligence crawled and hand checked 1,000 independent local business websites in the US (no chains or franchises were included in the test) and compared the information it captured to the data contained on the following sites:
- Bing Maps
- Foursquare
- Google Maps
- Superpages
- Yellowpages.com (YP.com)
- Yelp
The criteria and results
Implied Intelligence evaluated and scored the local search competitors on the basis of the following criteria:
- Coverage (was the listing present?)
- Number of duplicates
- Accuracy of information
- Richness of information (presence of additional information beyond business name, address and phone)
The first table below offers a comparison among these sites in terms of basic listings coverage and accuracy. The yellow highlighting indicates the winner in each category.
The table shows that Google Maps had the most complete coverage: 80 percent of the 1,000 local listings were present. No site had all 1,000 listings. Foursquare had the worst coverage at only 16.7 percent.
In terms of error percentages, yellow pages site Superpages outperformed the others. YP.com had the fewest duplicate listings in the test.
In terms of enhanced information, YP.com was the winner. Reviews and check-in data were not considered because Implied Intelligence felt including them wouldn’t allow for an “apples to apples” comparison across sites. However, had reviews content been included, Yelp, Google and Foursquare would likely have fared better.
Superpages the overall winner
Overall, Superpages was the winner, followed by YP.com, with Google Maps coming in third. Foursquare was the overall loser, and Yelp didn’t fare that well either.
To many people these results will be a surprise. (They were to me, to some degree.) And some may charge bias. While I didn’t supervise the test and wasn’t involved in its design, I can report that Implied Intelligence has no agenda here. I would, and do, take the results at face value.