Local Search Complexity = SMB Frustration

In my role as president at GetListed.org, I sometimes receive emails from users asking why one local search engine or another is displaying an old location for their business, or why the search engines still aren’t showing their new website address, or why the phone number listed for their retail location is actually the one for their warehouse.

The short answer I give them is that they can likely fix their particular issues by verifying and updating their information at each search engine’s Local Business Center or Local Listing Center individually. It’s (usually) a fairly painless process that yields quick results.

But the long answer to their question would usually require a deep consultation from an experienced local search marketer, and even then might only result in an educated guess as to where that bad or out-of-date information was coming from.

Amazingly, Local search might actually be more complex than traditional search. My “spaghetti” diagram of data provider relationships has intimidated more than a few marketers, let alone the handful of small business owners who have probably seen it.

Things in the Local Search space are out of the control of the typical small business owner; even if they know about the Google Local Business Center, Yahoo Local Listings, or Bing Local Listing Center, they’re only covering half their bases (according to last year’s 15 Miles survey). And other portals and data companies typically don’t push updates live with the same alacrity as Google and Bing.

At our recent Local University in Spokane, Mike Blumenthal explained to small business owners how Google and Bing assemble a local listing. The major search engines pull in a number of disparate pieces of information (street address, phone number, hours of operation, etc.) about a business from just about any source they can crawl, obviously weighted more heavily towards sources they trust, including their own Local databases.

He further blew our minds with the notion that this two-dimensional chart of location information not only expanded infinitely in two dimensions to cover additional local information sources, but also extended into a third dimension, with time as the additional variable. In other words, recent changes to listing information that aren’t corroborated by other sources may be overwhelmed by the weight of a particular listing’s history!
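To make that idea a little more concrete, here’s a minimal, purely hypothetical sketch of that kind of weighting in Python. The source types, trust scores, and decay factor are my own assumptions for illustration; neither Google nor Bing has published how they actually score listing data.

```python
from collections import defaultdict

# Illustrative only: each field of a listing (phone, address, hours) is
# "voted on" by many sources. Votes are weighted by how much the engine
# trusts the source and by how recent the observation is. The trust scores
# and decay factor below are made-up assumptions, not published values.

TRUST = {"claimed_listing": 1.0, "data_aggregator": 0.6, "web_crawl": 0.3}
DECAY_PER_YEAR = 0.5  # an observation loses half its weight each year

def pick_value(observations, current_year):
    """observations: list of (value, source_type, year_observed) tuples."""
    scores = defaultdict(float)
    for value, source, year in observations:
        age = current_year - year
        scores[value] += TRUST[source] * (DECAY_PER_YEAR ** age)
    return max(scores, key=scores.get)

# The "weight of history" effect: a new, correct phone number seen by one
# recent crawl loses to an old number corroborated by years of sources.
phone_observations = [
    ("503-555-0100", "web_crawl", 2010),        # the business's new number
    ("503-555-0199", "data_aggregator", 2008),  # the old number, widely copied
    ("503-555-0199", "web_crawl", 2009),
    ("503-555-0199", "web_crawl", 2007),
]
print(pick_value(phone_observations, 2010))     # -> 503-555-0199
```

In this toy model, claiming the listing is the highest-trust source there is, which is exactly why a verified update can outweigh years of stale history.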

This lack of transparency and understanding of how listing “clusters” are formed understandably leads to frustration on the part of the business owner, particularly when his experience to date has simply been to tell the Yellow Pages rep how he’d like his information represented year after year.

The bottom line: if you don’t claim and verify each and every listing on each and every search engine and data provider, Google and the other search engines are forced to make a “best guess” by clustering information that seems to match up, and they don’t always guess right.
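To see why those guesses misfire, here’s a toy sketch of NAP-style matching. The normalization step and the “two of three fields must agree” rule are assumptions of mine, not any engine’s actual logic, but they show both failure modes: formatting variations keep true duplicates apart, while a shared phone number can pull in the wrong location entirely.

```python
import re
from itertools import combinations

# Toy illustration of "best guess" clustering on Name, Address, Phone (NAP).
# The normalization and matching rule are invented for this example.

def normalize(text):
    """Lowercase and strip punctuation/whitespace before comparing."""
    return re.sub(r"[^a-z0-9]", "", text.lower())

def same_business(a, b):
    """Call two records the same business if at least two of the three
    NAP fields match exactly after normalization."""
    hits = sum(normalize(a[f]) == normalize(b[f]) for f in ("name", "address", "phone"))
    return hits >= 2

records = [
    {"name": "Acme Widgets",      "address": "100 Main St",        "phone": "503-555-0100"},
    {"name": "Acme Widgets Inc.", "address": "100 Main Street",    "phone": "503-555-0100"},
    {"name": "Acme Widgets",      "address": "200 Industrial Way", "phone": "503-555-0100"},  # the warehouse
]

for a, b in combinations(records, 2):
    print(a["name"], "<->", b["name"], same_business(a, b))

# Output:
#   Acme Widgets <-> Acme Widgets Inc. False   (same storefront, left as a duplicate)
#   Acme Widgets <-> Acme Widgets True         (storefront merged with the warehouse)
#   Acme Widgets Inc. <-> Acme Widgets False
```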

If there are listings that are showing information you don’t want to appear, or duplicates that you’d like to consolidate, somewhat counterintuitively you’ll want to claim all of them (at least at Google) and attempt to merge them as described here.

No doubt about it, the local search engines and data companies have a tough job: showing the most current contact information for tens of millions of businesses, consolidating many different taxonomies, and making sure that nefarious parties aren’t trying to hijack that information.

I’d love to see a more open, less time-intensive industry initiative that allows business owners to “push” accurate information to the major players in local search, though. KML and geositemaps may be one way to do it: for instance, a KML file on a merchant-verified top-level domain that any local search crawler could grab any time it chooses. This solution, though, would require some central repository for verified domains, which would be a massive effort in and of itself.
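As a rough sketch of what such a “push” file might look like, here’s a snippet that writes a one-location KML file. The business details and the locations.kml file name are hypothetical, and nothing here implies any engine currently consumes listings this way; the point is just how little markup it would take.

```python
# Hypothetical sketch: the kind of file a merchant could publish at a
# well-known URL on a verified domain for any local search crawler to fetch.
# Business details and the locations.kml file name are made up for illustration.
KML_TEMPLATE = """<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Placemark>
    <name>{name}</name>
    <address>{address}</address>
    <phoneNumber>{phone}</phoneNumber>
    <Point><coordinates>{lon},{lat}</coordinates></Point>
  </Placemark>
</kml>
"""

listing = {
    "name": "Acme Widgets",
    "address": "100 Main St, Portland, OR 97201",
    "phone": "+1-503-555-0100",
    "lon": -122.6765,  # longitude comes first in KML coordinates
    "lat": 45.5231,
}

with open("locations.kml", "w") as f:
    f.write(KML_TEMPLATE.format(**listing))
```

A geositemap, or even an ordinary sitemap entry, could then point crawlers at that file’s URL.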

But hey, if the Big Three search engines were able to come together on the sitemaps protocol, there’s still hope that they and the Big Three data providers can come together on something like this.

In the meantime, at the risk of drawing the ire of equine rights advocates for continuing to beat a very dead horse: maintaining ironclad consistency of basic information like Name, Address, and Phone Number (NAP), in as many places as possible, online and offline, is critical for search engines to associate the correct information with your business.


Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land.


About the author

David Mihm
Contributor
David Mihm is Director of Local Strategy at Moz, and the architect of Moz Local—a newly-released software product that distributes U.S. business listings to the primary local data aggregators and important local directories.
