Google & Bing Have Whitelists – “Exception Lists” – For Some Algorithm Signals



During the Spam Police panel at the SMX West conference, both Google and Bing said they have “exception lists” for sites that might get hit by some algorithm signals.

Neither suggested that such lists were used to boost sites in their search results nor exempt them from any overall algorithm changes. However, in the case of Google, sites might be excluded from the impact of particular ranking signals, said Google’s spam chief, Matt Cutts.

When I followed up with Cutts on the question during the session (I was moderating), I put it this way:

So Google might decide there’s some particular signal within the overall ranking algorithm that works for, say, 99% of sites the way Google hopes, but that also hits a few outlying sites in a way Google wouldn’t expect, in a way it feels harms the search results. In that case, Google might exempt those sites?

The answer was effectively yes (Matt’s actual answer was fairly long, but it was a “yes” in the end).

Imagine you found a new signal that detected a huge class of spammy sites — but for some reason, it also caught a few “innocent” sites. That’s where the exception list comes in. And Cutts stressed that as the signal gets more refined, eventually the exception lists can be dropped.
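To make that mechanic concrete, here’s a minimal illustrative sketch in Python. To be clear, the heuristic, the penalty value and every site name below are invented for illustration; nothing here reflects Google’s actual implementation. It simply shows how an exception list could suppress one ranking signal for a handful of sites while leaving the rest of the algorithm untouched:

```python
# Hypothetical per-signal exception list. The heuristic, penalty value
# and site names are all invented; this is not how Google's ranking works.

SPAM_SIGNAL_EXCEPTIONS = {"innocent.spam-network.example"}  # known false positives

def spam_signal_penalty(site: str) -> float:
    """A made-up signal that demotes sites it classifies as spammy."""
    looks_spammy = site.endswith(".spam-network.example")  # stand-in heuristic
    return -10.0 if looks_spammy else 0.0

def score(site: str, base_score: float) -> float:
    """Combine a base score with the spam signal, honoring the exception list."""
    penalty = 0.0
    # The exception suppresses only this one signal; every other part of
    # the ranking algorithm still applies to the excepted site as usual.
    if site not in SPAM_SIGNAL_EXCEPTIONS:
        penalty = spam_signal_penalty(site)
    return base_score + penalty

print(score("innocent.spam-network.example", 50.0))  # 50.0: signal suppressed
print(score("junk.spam-network.example", 50.0))      # 40.0: signal applies
```

Note that in a setup like this, emptying the exception set changes nothing else, which matches Cutts’ point: once the signal is refined enough that it no longer misfires, the list can simply be dropped.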

In the past, Google has said there was no whitelisting, which would seem to put it in conflict with statements it has made in legal cases about having no whitelists, as The Register points out. Google has also repeatedly said that it exempted no sites from the recent Farmer/Panda update.

The difference seems to be between exempting sites from particular signals and exempting them from an algorithm change overall. In the case of Panda, it’s entirely possible that no sites were excepted from that update, because the update, in the end, was an overall change to Google’s entire algorithm. However, individual sites may have been excluded from particular signals that were added as part of that update, which could effectively act as an overall exception.

We’ll follow up again about this, and we’ll work to get an excerpt of the video covering this question online shortly.

Also, I’ve updated this story and its headline since it was originally written by Barry Schwartz, to better clarify some points. Our original article is below:

Google & Bing Have Whitelists/Exception Lists For Algorithms

During the Spam Police panel at the SMX West conference, Google and Bing admitted publicly to having “exception lists” for sites that were hit by algorithms but should not have been.

Matt Cutts explained that there is no global whitelist, but for some algorithms that negatively impact a site in Google’s search results, Google may make an exception for individual sites. But didn’t Matt tell us there is no whitelist for the Farmer update?

That is true: Google currently doesn’t have a way to whitelist a site for the Farmer/Panda update. That doesn’t mean Google won’t add an exception/whitelist for the Panda release, but currently it does not have one.

Google and Bing explained that not all algorithms are 100% perfect and thus require these exception lists.

Postscript: Google has sent us an official statement, which reads:

Our goal is to provide people with the most relevant answers as quickly as possible, and we do that primarily with computer algorithms. In our experience, algorithms generate much better results than humans ranking websites page by page. And given the hundreds of millions of queries we get every day, it wouldn’t be feasible to handle them manually anyway.

That said, we do sometimes take manual action to deal with problems like malware and copyright infringement. Like other search engines (including Microsoft’s Bing), we also use exception lists when specific algorithms inadvertently impact websites, and when we believe an exception list will significantly improve search quality. We don’t keep a master list protecting certain sites from all changes to our algorithms.

The most common manual exceptions we make are for sites that get caught by SafeSearch–a tool that gives people a way to filter adult content from their results. For example, “essex.edu” was incorrectly flagged by our SafeSearch algorithms because it contains the word “sex.” On the rare occasions we make manual exceptions, we go to great lengths to apply our quality standards and guidelines fairly to all websites.

Of course, we would much prefer not to make any manual changes and not to maintain any exception lists. But search is still in its infancy, and our algorithms can’t answer all questions.
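The essex.edu case in the statement is a classic substring false positive. As a deliberately naive sketch, assuming a plain keyword filter (this is not SafeSearch’s actual logic, and the names below are invented), the domain gets flagged because “sex” appears inside “essex,” and a per-algorithm exception list is the stopgap until the classifier improves:

```python
# Deliberately naive adult-content filter, purely for illustration.
# This is not SafeSearch's real logic, and the entries are hypothetical.
ADULT_KEYWORDS = ("sex",)
SAFESEARCH_EXCEPTIONS = {"essex.edu"}  # manually cleared false positives

def filtered_by_safesearch(domain: str) -> bool:
    """Return True if the naive filter would hide this domain."""
    if domain in SAFESEARCH_EXCEPTIONS:
        return False  # the exception list overrides the keyword match
    return any(keyword in domain for keyword in ADULT_KEYWORDS)

print(filtered_by_safesearch("essex.edu"))         # False: saved by the exception
print(filtered_by_safesearch("sexample.example"))  # True: substring match fires
```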

Postscript by Barry Schwartz: We have posted the full transcript and audio of the ‘whitelist’ discussion at Transcript: Google & Bing On Whitelists & Exception Lists.

