• http://benacheson.com Ben Acheson

    Thanks for a great post Janet.

    I think the first and most important thing after identifying an organic search traffic drop is to check for a warning message in Webmaster Tools. The one that people fear so much.

    In the menu on the left, click on “Search Traffic” and then “Manual Actions”.

    If there’s a penalty related to links then, to put it simply, you’re mainly looking for over-optimised anchor text and low-quality or irrelevant sites. (In 2014 you’re probably looking mainly for obvious guest blog posts that meet those criteria.)

    We’re lucky that there has recently been a PageRank update. If a site has no PageRank on the homepage then see if it ranks for its brand name. If not then there is a good chance that Google doesn’t like that site.

    Google’s penalty-crazed fear campaign is getting out of hand.
    Google doesn’t own the web. But they are in danger of doing lasting damage to it.

  • Scott Davis

    “Has your robots.txt file from your test server been moved to the live server accidentally? If so, it could be blocking search engines from…”

    This is a common misconception. The robots.txt file only tells search engines not to crawl the pages. They can still end up indexed even if you’ve blocked them in robots.txt.
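    To illustrate the distinction, here is a minimal sketch of the kind of robots.txt in question (the directives are standard, but the path is a made-up example). Disallow stops crawling, yet a disallowed URL can still be indexed if other sites link to it; Google just can’t fetch a description for it.

    ```
    # robots.txt — blocks crawling only; a disallowed URL can still
    # appear in the index if other sites link to it
    User-agent: *
    Disallow: /private/
    ```

    Actually keeping a page out of the index requires a noindex directive, and the page has to remain crawlable so search engines can see that directive.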

  • http://www.twitter.com/EricPaul_ Eric Paul Abrantes

    This link was shared with me from @CraigBradford of Distilled earlier today.


    It is a Google Analytics overlay that puts Google Updates on your data. I have yet to try it, but it could save a lot of time going back and forth from Moz’s algo page.

  • andykillen

    That may be, but the description does not end up in the search results; instead there’s a warning message saying the site is blocking it. Not really what you want if you’re trying to get clicks, since not many people click on a warning.
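    For completeness, a hedged sketch of the alternative being implied here (the meta tag below is the standard mechanism, not something quoted from the thread): to keep a page out of the results entirely, leave it crawlable and serve a noindex directive instead of blocking it in robots.txt.

    ```html
    <!-- Keeps the page out of search results entirely; crawlers must be
         able to fetch the page to see this tag, so don't also Disallow it -->
    <meta name="robots" content="noindex">
    ```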

  • AA

    I tend to click on the link more whenever this happens, just to see what the site has failed to deindex.