Panda 2.0: Google Rolls Out Panda Update Internationally & Incorporates Searcher Blocking Data

In late February, Google launched a substantial algorithm change (known as “Farmer” or “Panda”) aimed at identifying low-quality pages and sites. These are pages (often seen on so-called “content farms”) with text that is relevant for a query, but may not provide the best user experience. (Google calls it a “high quality sites algorithm”.) Today, Google has rolled this change out to all English-language queries and made a few minor updates (with an estimated impact on 2% of U.S. queries).

Live for All English Queries

The original algorithm update impacted only U.S. queries. As of today, this change is live for all English queries worldwide. This includes both English-speaking countries (such as searches on google.co.uk and google.com.au) and English queries in non-English countries (for instance, for a searcher using google.fr who has chosen English-language results).

In the United States, the initial launch impacted nearly 12% of queries, so it stands to reason that the impact may be similar for English-speaking searchers across the world.

Incorporating Searcher Data About Blocked Sites

Google has always used a number of signals in determining relevant search results. Some of these are on the pages themselves (such as the text on a page), some are on other sites (such as anchor text in links to a page), and some are based on user behavior (for instance, Google gathers data about how long pages take to load by using toolbar data from users who access those pages).

In recent months, Google has launched two ways for searchers to block particular sites from their search results. The first was a Chrome extension. More recently, Google launched a block link directly in the search results that appears once a searcher has clicked from the results to a site and then returned to the search results.

When Panda launched initially, Google said that they didn’t use data about what sites searchers were blocking as a signal in the algorithm, but they did use the data as validation that the algorithm change was on target. They found an 84% overlap between sites that were negatively impacted by Panda and sites that users had blocked with the Chrome extension.
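
Purely as an illustration of what an overlap figure like that means, here is a minimal sketch in Python using entirely hypothetical site lists. Google has not published how it computed the 84%, so the metric shown (the share of demoted sites that users had also blocked) is an assumption for illustration only.

```python
# Toy illustration: what fraction of algorithmically demoted sites
# were also blocked by users? All site names below are hypothetical.

demoted_sites = {"example-farm.com", "thin-content.net", "scraper-site.org",
                 "useful-reference.com", "aggregator.info"}

blocked_sites = {"example-farm.com", "thin-content.net", "scraper-site.org",
                 "aggregator.info", "spammy-blog.biz"}

# Share of demoted sites that appear in the user-blocked set.
overlap = len(demoted_sites & blocked_sites) / len(demoted_sites)
print(f"{overlap:.0%} of demoted sites were also blocked by users")  # -> 80%
```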

Now, they are using data about what searchers have blocked in “high confidence situations”. Google tells me this is a secondary, rather than primary, factor. If the site fits the overall pattern that this algorithm targets, searcher blocking behavior may be used as confirmation.

Impact Seen Across a Wider Variety of Sites

In the initial launch, large sites were primarily affected. This makes sense, as larger sites, with more pages, traffic, and links, have more signals available. With the latest update, smaller sites may see an impact. Amit Singhal, in charge of search quality at Google, notes in the blog post, “this change also goes deeper into the ‘long tail’ of low-quality websites to return higher-quality results where the algorithm might not have been able to make an assessment before.”

Amit Singhal told me,

“We’re focused on showing users the highest quality, most relevant pages on the web. We’re cautious not to roll out changes until we’re confident that they improve the user experience, while at the same time helping the broader web ecosystem. We incorporate new signals into our algorithm only after extensive testing, once we’ve concluded that they improve quality for our users.”

What To Do If Your Site Is Impacted

When this change was launched in the United States, site owners who were impacted were vocal in their unhappiness, and Google opened a thread in the Google Webmaster Central discussion forum so site owners could provide feedback. In the latest post, they said:

“Based on our testing, we’ve found the algorithm is very accurate at detecting site quality. If you believe your site is high-quality and has been impacted by this change, we encourage you to evaluate the different aspects of your site extensively. Google’s quality guidelines provide helpful information about how to improve your site. As sites change, our algorithmic rankings will update to reflect that. In addition, you’re welcome to post in our Webmaster Help Forums. While we aren’t making any manual exceptions, we will consider this feedback as we continue to refine our algorithms.”

As I noted in my previous articles, take an objective look at the user experience of the site:

  • Can visitors easily find their way around?
  • Is it obvious what topic each page is about?
  • Is the content original or is it aggregated from other sources?
  • Do the number and placement of ads impede the visitor’s ability to quickly access the content?
  • When looking objectively at the site, is the primary focus the user need or the business goal?
  • Is the content on the page authoritative and valuable? Does it answer the query better than other pages on the web?
  • If some of the pages on the site are very high quality and engaging, are other pages on the site not as high quality? (Google has stated that enough low quality content on a site can reduce the entire site’s rankings, not just the low quality pages.)

Use these findings to target improvements to your site that will enhance the overall user experience (which should also benefit overall engagement, loyalty, and conversion).



About the author

Vanessa Fox
Contributor
Vanessa Fox is a Contributing Editor at Search Engine Land. She built Google Webmaster Central and went on to found software and consulting company Nine By Blue and create Blueprint Search Analytics, which she later sold. Her book, Marketing in the Age of Google (updated edition, May 2012), provides a foundation for incorporating search strategy into organizations of all levels. Follow her on Twitter at @vanessafox.
