Google & Bing Have Whitelists – “Exception Lists” – For Some Algorithm Signals

During the Spam Police panel at the SMX West conference, both Google and Bing said they have “exception lists” for sites that might get hit by some algorithm signals.

Neither suggested that such lists were used to boost sites in their search results or to exempt them from any overall algorithm changes. However, in the case of Google, sites might be excluded from the impact of particular ranking signals, said Google’s spam chief, Matt Cutts.

When I followed up with Cutts on the question during the session (I was moderating), I put it this way:

So Google might decide there’s some particular signal within the overall ranking algorithm that works for, say, 99% of sites as Google hopes, but maybe it also hits a few outlying sites in a way Google wouldn’t expect — in a way it feels harms the search results — and then Google might except those sites from that signal?

The answer was effectively yes (Matt’s actual answer was fairly long, but it was a “yes” in the end).

Imagine you found a new signal that detected a huge class of spammy sites — but for some reason, it also caught a few “innocent” sites. That’s where the exception list comes in. And Cutts stressed that as the signal gets more refined, eventually the exception lists can be dropped.

In the past, Google has said there was no whitelisting, which would now seem to put Google in conflict with statements it has made in legal cases about having no whitelists, as The Register points out. Google has also repeatedly said that it exempted no sites from the recent Farmer/Panda update.

The difference seems to be about exempting sites from particular signals and from an algorithm change overall. In the case of Panda, it’s entirely possible that no sites were excepted from that update, because that update, in the end, was an overall change to Google’s entire algorithm. However, it could be that individual sites have been excluded from particular signals that were added as part of that update — and that effectively could seem to act as an overall exception.
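
To make that distinction concrete, here is a minimal, purely hypothetical sketch in Python. The signal names, the SIGNAL_EXCEPTIONS table, the ranking_score function and the example domains are all invented for illustration and don’t reflect how Google or Bing actually build their systems; the sketch only shows the idea of a per-signal exception list (much like the essex.edu SafeSearch example in Google’s statement below) versus a site-wide whitelist.

```python
# Purely hypothetical sketch -- invented names and numbers, not Google's or
# Bing's actual systems. It only illustrates the distinction described above:
# an exception shields a site from one particular signal, not from the
# overall algorithm.

# Each "signal" scores a site; the overall algorithm combines all of them.
SIGNALS = {
    "content_quality": lambda site: 0.8,
    "spam_detector_v1": lambda site: -0.9 if "spam" in site else 0.0,
}

# Per-signal exception lists. A listed site is shielded from that one signal
# only; there is no master list exempting a site from every signal.
SIGNAL_EXCEPTIONS = {
    "spam_detector_v1": {"spam-museum.example.com"},  # innocent site caught by a naive rule
}


def ranking_score(site: str) -> float:
    """Combine all signals, skipping any signal the site is excepted from."""
    total = 0.0
    for name, signal in SIGNALS.items():
        if site in SIGNAL_EXCEPTIONS.get(name, set()):
            continue  # exception applies to this one signal only
        total += signal(site)
    return total


# The excepted site still passes through every other signal.
print(ranking_score("spam-museum.example.com"))   # 0.8 (spam signal skipped)
print(ranking_score("obvious-spam.example.com"))  # roughly -0.1 (spam signal applied)
```

In this toy model, dropping an exception list once a signal has been refined, as Cutts described, is simply a matter of emptying that signal’s entry in SIGNAL_EXCEPTIONS.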

We’ll follow up again about this — and we’ll work to get an excerpt of the video covering this question online shortly.

Also, I’ve updated this story and its headlines since it was originally written by Barry Schwartz, to better clarify some points. Our original article is below:

Google & Bing Have Whitelists/Exception Lists For Algorithms

During the Spam Police panel at the SMX West conference, Google and Bing admitted publicly to having “exception lists” for sites that were hit by algorithms when they should not have been.

Matt Cutts explained that there is no global whitelist, but for some algorithms that have a negative impact on a site in Google’s search results, Google may make an exception for individual sites. But didn’t Matt tell us there is no whitelist for the Farmer update?

That is true: Google currently doesn’t have a way to whitelist a site for the Farmer/Panda update. That doesn’t mean Google won’t add an exception/whitelist for the Panda release, but currently it does not have one.

Google and Bing explained that not all algorithms are 100% perfect and thus require these exception lists.

Postscript: Google has sent us an official statement. It reads:

Our goal is to provide people with the most relevant answers as quickly as possible, and we do that primarily with computer algorithms. In our experience, algorithms generate much better results than humans ranking websites page by page. And given the hundreds of millions of queries we get every day, it wouldn’t be feasible to handle them manually anyway.

That said, we do sometimes take manual action to deal with problems like malware and copyright infringement. Like other search engines (including Microsoft’s Bing), we also use exception lists when specific algorithms inadvertently impact websites, and when we believe an exception list will significantly improve search quality. We don’t keep a master list protecting certain sites from all changes to our algorithms.

The most common manual exceptions we make are for sites that get caught by SafeSearch–a tool that gives people a way to filter adult content from their results. For example, “essex.edu” was incorrectly flagged by our SafeSearch algorithms because it contains the word “sex.” On the rare occasions we make manual exceptions, we go to great lengths to apply our quality standards and guidelines fairly to all websites.

Of course, we would much prefer not to make any manual changes and not to maintain any exception lists. But search is still in its infancy, and our algorithms can’t answer all questions.

Postscript by Barry Schwartz: We have posted the full transcript and audio of the ‘whitelist’ discussion at Transcript: Google & Bing On Whitelists & Exception Lists.

  • Winooski

    Seems to me that, for a company which makes “non-human intervention” in its algorithmic output a point of great pride, admitting that there’s a search index whitelist is a significant loss of face.

  • http://www.michael-martinez.com/ Michael Martinez

    @Winooski: “Seems to me that, for a company which makes ‘non-human intervention’ in its algorithmic output a point of great pride, admitting that there’s a search index whitelist is a significant loss of face.”

    I think being accused of whitelisting a site and NOT acknowledging or denying the accusation hurt their credibility more. At this point, Google and Bing are being more open (perhaps due in part to pressure from Rich Skrenta and Blekko) about how the process works.

    At least we know there are no system-wide, permanent whitelists — which I would be far more concerned about than a sub-system-specific, iteration-dependent exception list that buys them time to fix the algorithm. That is reasonable and clearing the air on this topic should help rebuild some of the trust that was lost by their silence.

  • http://www.redmudmedia.com Red_Mud_Rookie

    I don’t have an issue with this at all because I think the algorithms would work better with more human intervention.
    Where I DO have an issue is where sites are blatantly, and I mean blatantly, buying links and not being punished at all.
    Let’s face it, anyone working in a competitive sector like travel or finance who is competing for the “big” search terms like “cheap…” has no choice but to offer some form of incentive to get links from other sites if they are to compete for the top spot or even the first page.

    Google themselves did it when they handed out Nexus phones to bloggers!

    I’m not suggesting Google ban all sites ranking for top terms if they have dodgy links, but there is certainly a lot of work to be done in creating a level playing field for sites that don’t have big link building budgets, but which present their websites well and offer a great user experience with quality content.

    Recent initiatives from Google, such as the “Block all [site] results” option, are a step in the right direction, but you need to be able to choose whether to block an entire domain or just a particular page.

    http://googleblog.blogspot.com/2011/03/hide-sites-to-find-more-of-what-you.html

  • http://www.seo.net/compare/seo-companies/united-states Artur – SEO.NeT

    @Red_Mud_Rookie,
    The most important law of economics says:
    If there is money to earn, there will be people to earn it. So I don’t think link selling/buying will stop. However, they will have to live with the consequences when they get smashed.

  • http://www.redmudmedia.com Red_Mud_Rookie

    @Artur – SEO.NeT Spot on. It won’t stop, especially as there is such a massive opportunity to tap into the growing social media network of friends and family and friends of friends of family and their friends…. “Get 50% of your *enter number* friends on Facebook to ‘Like’ our new product and we’ll send it to you for free!”

    I just prefer this social media “links with personality” approach because if your product is good and people like it, you will succeed :-) whereas if you’re Mr Big or Mr Small and you treat customers badly or sell poor quality products, then you will suffer.
