The Confusion Over Google’s Algorithm Updates & Refreshes

I spend a lot of time in the SEO space investigating claims of Google algorithm updates; it is a fascinating topic because there are so many variables involved and a lot of secrecy on Google’s end around its search algorithms.

Often, Google will not confirm updates or algorithm refreshes. Plus, with manual actions and link penalties happening all the time, it is hard to determine whether Google updated its ranking algorithms or pushed out a penalty that impacted a large number of sites.

Recently, there have been many rumors about Google testing a Penguin refresh. Note that it has been almost a year since the last official Penguin refresh, aka Penguin 2.1, so webmasters are eager to see their sites released from the algorithm.

This past Friday, there were reports from some sources of a Panda refresh. With Panda, we know Google won’t confirm future updates, for the most part. But there is strong evidence that Google ran a refresh, which these days happens roughly monthly.

This morning, I spotted an interesting post by Google’s John Mueller in a help thread, where John explains the complexities of Google’s algorithms. It is important that you read what John wrote, because Google’s search algorithm is incredibly complex:

In theory: If a site is affected by any specific algorithm or its data, and it fixes the issue that led to that situation, then the algorithm and/or its data must be refreshed in order to see those changes. Sometimes those changes aren’t immediately visible even after a refresh, that’s normal too.

In practice, a site is never in a void alone with just a single algorithm. We use over 200 factors in crawling, indexing, and ranking. While there are some cases where a site is strongly affected by a single algorithm, that doesn’t mean that it won’t see any changes until that algorithm or its data is refreshed. For example, if a site is strongly affected by a web-spam algorithm, and you resolve all of those web-spam issues and work to make your site fantastic, you’re likely to see changes in search even before that algorithm or its data is refreshed. Some of those effects might be directly related to the changes you made (other algorithms finding that your site is really much better), some of them might be more indirect (users loving your updated site and recommending it to others).

So yes, in a theoretical void of just your site and a single algorithm (and of course such a void doesn’t really exist!), you’d need to wait for the algorithm and/or its data to refresh to see any changes based on the new situation. In practice, however, things are much more involved, and improvements that you make (especially significant ones) are likely to have visible effects even outside of that single algorithm. One part that helps to keep in mind here is that you shouldn’t be focusing on individual factors of individual algorithms, it makes much more sense to focus on your site overall — cleaning up individual issues, but not assuming that these are the only aspects worth working on.

All that said, we do realize that it would be great if we could speed the refresh-cycle of some of these algorithms up a bit, and I know the team is working on that. I know it can be frustrating to not see changes after spending a lot of time to improve things. In the meantime, I’d really recommend – as above – not focusing on any specific aspect of an algorithm, and instead making sure that your site is (or becomes) the absolute best of its kind by far.


