• http://localreachlabs.com/ Russell Hayes

    Thanks for writing about this issue. What I’d love to see Google do is let us disavow incorrect listings inside our dashboards, just like we do for links. We’ll never stop data scrapers from scraping incorrect data, so we need a tool in our listing dashboards that lets us tell Google to stop creating duplicate listings, wherever they got the source data from. Google claims they do everything to provide a better search experience for their users; well, duplicate listings do not provide that in the least.

  • Andrew Shotland

    That would be an interesting solution, Russell. Google’s preference is to do this kind of thing algorithmically. They’re kind of like a germophobe who doesn’t want anything icky (aka human) to touch them.

  • http://localsearchforum.catalystemarketing.com/ Linda Buquet

    Great post, Andrew, and love the zombie dupes! Off to share with all my peeps.

    Well said, Russ, and great suggestion. This was also one of the suggestions in the recent BrightLocal survey of features we’d all like to see Google add to the Places dashboard. I know Google saw it because I brought it to their attention. Here’s hoping some of the recommendations, including that one, are implemented.

  • http://www.949local.com/ Jim Froling

    Ironically, Google itself contributes to the dupe headache, as they “create” Google+ pages for businesses, like my own, even when the business has already created and verified a G+ page. Google acknowledges their ambitious creativity and says that eventually the two pages will be merged. In the meantime, I see the Google-created page sitting there empty (besides NAP+U) and without content, and I’m scared to death to touch it.

  • Andrew Shotland

    @Jim, this could definitely be a matching/conflation issue where there are other sources with slightly different data that Google can’t catch, no matter how awesome their algo is.
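
    Just to illustrate what I mean by “slightly different data” (a rough Python sketch, not Google’s actual conflation logic; the records and threshold are made up): even a simple string-similarity check shows how two records for the same business can fall below a match threshold.

    # Rough sketch of the conflation problem. The similarity measure and
    # the 0.90 threshold are illustrative, not Google's actual logic.
    from difflib import SequenceMatcher

    record_a = "Jim's Plumbing, 100 W Main St, Springfield"      # one source
    record_b = "Jims Plumbing LLC, 100 West Main Street, Spfld"  # another source

    similarity = SequenceMatcher(None, record_a.lower(), record_b.lower()).ratio()
    print(f"similarity: {similarity:.2f}")

    # Below a strict threshold the records never merge, and a duplicate
    # listing is born -- even though a human sees one business.
    if similarity < 0.90:
        print("treated as two different businesses -> duplicate listing")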

  • http://localreachlabs.com/ Russell Hayes

    Well, if they can take the time to make a disavow tool for spammy links, why not one for duplicate listings? They’re a form of spam, and Google’s the one putting them into its index.

  • http://localreachlabs.com/ Russell Hayes

    Thanks Linda! Yeah, I’ve wanted a tool for this as far back as 2011, when it was still Places. I’d log into a client’s dashboard and find 3 listings that were either duped or just old addresses and remove them, over and over and over! lol I guess I can thank Google for some extra billable hours. =)

  • Ian Harper

    Google really needs to sort local out. I’m in carpet cleaning, and everyone I know has listings for many towns. How they do it I don’t care, but it’s spam and, more importantly, deceptive: customers will think that address is a real business, and it’s fake. The postcard check just does not work.

  • Andrew Shotland

    Unfortunately, Ian, it’s all too easy to create fake Google Places listings.

  • http://imprezziomarketing.com/ Colan Nielsen

    Thanks for the great post, Andrew. I won’t be able to get the image of brain-eating duplicate zombies out of my mind when I sit down to watch The Walking Dead this weekend.

    I must say that InfoUSA/ExpressUpdate seems to be a main culprit when it comes to feeding incorrect data to Google’s scrapers. I can’t tell you how many dupes we see that have the exact same title format as a corresponding ExpressUpdate listing. But, as you said, I know the duplicate is most likely made up of multiple sources.

  • RightTech

    This might be the stupid Google+ Local page they create, versus the Biz page you create. From what I’ve read, that is intentional and you are stuck with both – which is even more fun when you have multiple locations, since each one gets its own Google+ Local page.

  • RightTech

    I have a client with the opposite problem – they updated a Google Place with their new address, but Google continues to list the new address as “closed or relocated” even after contacting their support team… [Updated 2/21/14: finally fixed after a month of effort.]

  • Scott Davis

    Check to make sure your name field is the same on both, down to the character.

    Same with the street address field. If you use West Main on one and the other says W. Main, they won’t merge.
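
    In other words, the comparison is character-exact. Something like this hypothetical sketch (the field names and values are made up; Google’s real matcher isn’t public):

    # Hypothetical sketch of the character-exact check described above.
    map_listing = {"name": "Jake's Appliance Service", "address": "100 W. Main St."}
    plus_page   = {"name": "Jake's Appliance Service", "address": "100 West Main St."}

    for field in ("name", "address"):
        if map_listing[field] != plus_page[field]:  # must match to the character
            print(f"{field} mismatch: {map_listing[field]!r} vs {plus_page[field]!r}")
            print("-> these two will not merge until the field matches exactly")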

  • Scott Davis

    The Google+ Local page disappears when your map listing merges with your Google+ Page.

  • Scott Davis

    Especially destructive to your SERP set when your competition is stuffing keywords like “Appliance Repair Dallas” as their company name when the actual company name is Jake’s Appliance Service. For some reason Google can’t figure that trick out.

  • http://www.discountonlinefitness.com steve

    I started using Yext last year and it has really helped get all my NAP listings in order without much effort.

  • RightTech

    @Scott: how does one get their map listing to merge with a separately created Google+ page?

    And does that work when the biz has locations and verified Google Places in two different cities?

  • Scott Davis

    Totally. But you’d have to have a map listing for each location and a plus page for each location.

    There’s no “merge” button. PLEASE don’t waste time searching for one. It’s an automated process where Google ties the “more info” link on the map listing card to the Google+ page so users can read/write reviews, etc.

    In most cases, this happens within a few days of the plus page being created.

    A few things to note:
    Use the same email/Google account to set up both the map listing & the Google+ page. If you didn’t, add the email associated with the map listing as a manager of the Google+ page (it doesn’t have to be the owner).

    Make sure your NAP info is the same on both!!! If it isn’t, they will not merge. (Note: Google will change how they display your street name on occasion on the actual listing, so make sure you’re logging in to each and clicking on the ‘edit information’ button. If the map listing address field is populated with “Jones St.” & the Google+ address field is populated with “Jones Street”, they will not merge.) Same thing goes with your name field. Make sure they match, to the character.

    It also helps facilitate the merge if you add the email associated with the Google+ page as an owner of your webmaster tools account for your website.
    ________
    FYI: If you’re looking at the new Google Maps interface… “read reviews” has replaced the “more info” anchor text that takes users to the Google+ page.
    ___
    While this article doesn’t exactly address the topic of the blog post here, or your question specifically, it might provide some insight into things you can do to encourage Google to merge your map listing & plus page.

    http://blog.valetinteractive.com/2013/10/google-carousel-important-hotels/
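
    And if you want to pre-check your own fields before waiting on the automated merge, a rough sketch like this will flag the “Jones St.” vs “Jones Street” kind of mismatch (the abbreviation table is just an example, not exhaustive):

    # Rough pre-check for the "Jones St." vs "Jones Street" problem.
    # The abbreviation table below is illustrative only.
    SUFFIXES = {"st": "street", "ave": "avenue", "rd": "road",
                "w": "west", "e": "east", "n": "north", "s": "south"}

    def normalize(address: str) -> str:
        words = address.lower().replace(".", "").replace(",", "").split()
        return " ".join(SUFFIXES.get(w, w) for w in words)

    map_addr  = "123 Jones St."
    plus_addr = "123 Jones Street"

    if map_addr != plus_addr:
        if normalize(map_addr) == normalize(plus_addr):
            print("same place, different spelling -- edit one side so the")
            print("raw fields match to the character")
        else:
            print("genuinely different addresses")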

  • http://www.erikeric.com/ erikeric

    Great, simple overview. Thanks, Andrew.

  • RedLeader

    While it’s great that you’ve posted *what* the issue is, I think you’re breaking down the mechanisms for an issue we’re mostly familiar with. It’s like diagnosing an engine noise – yes, it’s interesting to understand the underlying mechanics, and it’s OK to say “ah’yep, you sure do got a engine problem”, but the real help would be to provide us with some actionable solutions we can implement.

    Telling us that there are issues with the local data is OK, but your article wraps up on a really defeatist note, stating all of the problems without identifying any solutions. Could you please follow up with some solutions to the issue?

  • http://www.mobilemarketingprofits.com/ Kim Dushinski

    Agreed. I am actively seeking a solution to this problem and since the kind of fix that works for zombies on The Walking Dead doesn’t actually work in this scenario…

    Steve mentioned above using Yext. There are tons of solutions like this and I would love to know if any of them are actually a good solution. Is it worth paying a recurring fee to have this problem solved? Does paying the recurring fee actually solve the problem?

  • http://www.mobilemarketingprofits.com/ Kim Dushinski

    Do you feel the recurring fee is worth it?

  • http://www.discountonlinefitness.com steve

    Yes, because it guarantees that thousands of NAP listings will have the most current information, and I can make changes in one place to update them all at once. It not only saves a lot of time but ensures that hundreds/thousands of listings are correct. The time it takes to verify each listing manually costs me more in time/labor/stress than having it centrally controlled. The cost averages out to about $38 a month.

  • Zak Tomlinson

    Well, maybe the following isn’t a solution to the problem itself, but rather to its consequences. You know which your main listing is: work on it, add details and photos, make regular updates… a little time will pass and the duplicate listings will disappear into the Nether.

  • Andrew Shotland

    @RedLeader:disqus “mostly familiar with” is the issue IMO. Based on my discussions with many practitioners, particularly many “expert” practitioners, including those who sell citation cleanup services, everyone (at least everyone I have spoken with) is under the mistaken impression that fixing dupes at the source level and the publisher level is how to solve the problem. It’s not, at least not in the long term, because sooner or later one of the sources is going to start spewing out more dupes – even though you thought you had squashed them. I personally squashed 10 dupe records of a single business at the source & publisher level six months ago, and five new ones showed up a few weeks ago.

    The problem is that there is currently no solution, because the entire system is built on an unstable foundation.

    My intention is not to be “defeatist”. My intention is to be honest and open about a fundamental problem with our business. The only way to get the industry as a whole to solve the problem is to continue to shine a spotlight on the issue.
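
    The best you can do right now is keep watching for the herd to come back. Here’s a crude monitoring sketch (hypothetical; in practice the scanned listings would come from whatever citation-tracking tool you use, and the threshold is made up):

    # Crude monitoring sketch: re-scan periodically and flag listings that
    # look like the canonical record but aren't it.
    from difflib import SequenceMatcher

    CANONICAL = "Jake's Appliance Service | 100 W Main St | 555-0100"

    def looks_like_dupe(listing: str, threshold: float = 0.75) -> bool:
        score = SequenceMatcher(None, CANONICAL.lower(), listing.lower()).ratio()
        return listing != CANONICAL and score >= threshold

    found_this_scan = [
        "Jake's Appliance Service | 100 W Main St | 555-0100",   # the real one
        "Jakes Appliance Svc | 100 West Main Street | 555-0100", # zombie dupe
    ]

    for listing in found_this_scan:
        if looks_like_dupe(listing):
            print("new zombie spotted:", listing)  # squash it (again)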

  • Andrew Shotland

    Yext does not solve the problem because Yext does not deal with duplicate records. In theory I guess you could try to claim each dupe listing with a Yext Powerlisting – but that would get expensive – and it still wouldn’t solve the problem of the source data providers spewing new dupes. That said, Yext’s concept of “locking” a listing at the publisher level is part of the solution IMO. Full disclosure – I do SEO consulting for Yext.

  • Andrew Shotland

    Sounds great, except it’s more likely that over time the duplicate listings will multiply and you will be overrun by the zombie herd.