Up Close @ SMX: Avoiding SEO Disasters

Contributor Russell Savage recaps Mark Munroe's presentation at SMX East, in which he discussed common SEO disasters, as well as ways to detect and prevent them.



The following is a recap of Mark Munroe’s presentation in the SMX East 2014 session “Conquering Today’s Technical SEO Challenges.”

“Whatever can go wrong, will go wrong.”

This is commonly known as Murphy’s Law, and anyone managing SEO for any substantial amount of time knows how true this can be… and how painful.

One day, you pull the organic traffic numbers and they’re down. Way down. Your heart sinks. Then you notice the email from your boss asking about it.

That sounds like a nightmare scenario, but it happens all too frequently in the SEO world.

Luckily for you, there is a set of known issues to check when you see a large, unexpected drop in organic traffic. The best way to avoid the scenario described above is to constantly monitor for these issues and catch them before they do real damage.

1. Check For De-Indexing Issues

When you see a sudden drop in traffic, the first thing to check is whether your site has been de-indexed by the search engines.

Maybe your development team forgot to switch off the noindex/nofollow tags when they pushed the new website to production. Maybe there was a change in site structure (how could they not tell you?!) and the robots.txt file was not updated.

Start by checking the HTML source of some of your most popular pages, plus at least one page from each unique template on your site. Are there any noindex/nofollow tags?

Those directives can also be served in the HTTP response headers, so you may need to fire up your browser’s developer tools and inspect the requests and responses between your browser and your site.

Checking for this automatically is a little tougher, but you can work with the development team to create an SEO test script that verifies pages do not contain, and are not served with, noindex/nofollow tags.
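For illustration, here is a minimal sketch of what such a check might look like in Python, assuming the third-party `requests` library is available. The URL list and the meta-tag regular expression are placeholders you would adapt to your own site and templates.

```python
# Minimal sketch of a noindex/nofollow check (assumes the `requests` library).
# PAGES_TO_CHECK is a hand-picked list: your top pages plus one page per template.
import re
import requests

PAGES_TO_CHECK = [
    "https://www.example.com/",
    "https://www.example.com/category/widgets/",
    "https://www.example.com/product/sample-widget/",
]

# Simple placeholder pattern for a meta robots tag in the HTML source.
META_ROBOTS = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']+)["\']',
    re.IGNORECASE,
)

def check_page(url):
    response = requests.get(url, timeout=10)
    problems = []

    # The directive can arrive as an HTTP header...
    header = response.headers.get("X-Robots-Tag", "")
    if "noindex" in header.lower() or "nofollow" in header.lower():
        problems.append(f"X-Robots-Tag header: {header}")

    # ...or as a meta robots tag in the page itself.
    match = META_ROBOTS.search(response.text)
    if match and ("noindex" in match.group(1).lower()
                  or "nofollow" in match.group(1).lower()):
        problems.append(f"meta robots tag: {match.group(1)}")

    return problems

for url in PAGES_TO_CHECK:
    for problem in check_page(url):
        print(f"WARNING {url} -> {problem}")
```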

The robots.txt file is a little easier to test. Grab the latest copy and head to Google Webmaster Tools. You can paste your current robots.txt file into the Robots.txt tester and enter a single URL to check.


Other tools are available to check multiple URLs against your robots.txt file, but remember that they use their own web crawlers to interpret the file, and in rare cases they may show different results than Google’s own crawler.

I recommend testing at least one page from each section of your site. A check for this should also be added to the developer test script.
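If you want to spot-check those section pages yourself, a rough sketch using Python’s built-in urllib.robotparser is shown below. Keep the caveat above in mind: any parser other than Google’s may occasionally disagree with Googlebot, and the URLs here are illustrative only.

```python
# Rough sketch of a robots.txt spot check using the standard library.
from urllib.robotparser import RobotFileParser

# Fetch and parse the live robots.txt file.
parser = RobotFileParser("https://www.example.com/robots.txt")
parser.read()

# One representative URL from each section of the site.
urls_by_section = [
    "https://www.example.com/blog/latest-post/",
    "https://www.example.com/category/widgets/",
    "https://www.example.com/product/sample-widget/",
]

for url in urls_by_section:
    if not parser.can_fetch("Googlebot", url):
        print(f"BLOCKED for Googlebot: {url}")
```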

2. Check Your Redirects

By now, you know that 301s should be used for 99.99% of the redirects on your site. But not everyone knows this.

Bad or broken redirects are tough to identify because, unless you are checking the request and response headers in your browser’s developer tools, the site looks and functions fine. But it’s not fine, because you just lost all of the link juice flowing through those redirects.

So what symptoms should you look for if you suspect this is happening? Focus your investigation on top pages that have a lot of internal links pointing to them from around the site. Have those pages dropped in rankings more than other areas of the site?

The other thing to watch out for is an increased number of 404s in your webmaster tools. Unfortunately, because of the crawl schedule, these may only show up after the issue has been live for a while.

The way to monitor for this issue is to add checks to your developer test script that follow a set of known redirects. Verify that every redirect in the chain (and there could be many) returns a 301 status code. You should also educate your tech team on the importance of using 301 redirects, which may prevent issues in server configurations that are harder to test for.
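As a starting point, here is a hedged sketch of such a check in Python, again assuming the `requests` library. The URL pairs are placeholders for the redirects you actually care about.

```python
# Sketch of a redirect-chain check: every hop should come back as a 301.
import requests

# Map each old URL to the final destination you expect it to land on.
REDIRECTS_TO_TEST = {
    "http://www.example.com/old-page/": "https://www.example.com/new-page/",
}

for start_url, expected_final in REDIRECTS_TO_TEST.items():
    response = requests.get(start_url, allow_redirects=True, timeout=10)

    # response.history holds every intermediate response in the redirect chain.
    for hop in response.history:
        if hop.status_code != 301:
            print(f"WARNING: {hop.url} returned {hop.status_code}, expected 301")

    if response.url != expected_final:
        print(f"WARNING: {start_url} ended at {response.url}, "
              f"expected {expected_final}")
```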

3. Check For Spammers

There are evil people out there, and they want to spam your site. If you’re not careful, they will leverage any area of your site where users can generate content to drive traffic to their shady online services.

The symptoms of this are pretty easy to find. Check your site’s comment sections for any spammy content. Look through Webmaster Tools to see if unexpected keywords start showing up. Many times, Google will warn you with an email from Webmaster Tools when spam starts hitting your site.

Keeping this from happening is a little harder. If you are using a popular CMS, it should have anti-spam plugins available for comment pages. You can also turn on comment moderation and review it regularly. If you have a large site with a search feature, you can regularly search it for spam keywords (check the spam folder in your email for ideas).
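If your site search is reachable over a simple URL, that recurring check can be as small as the sketch below. The search endpoint, query parameter, keywords, and “no results” marker are all assumptions you would replace with your own.

```python
# Sketch of a recurring spam-keyword check against a site-search page
# (assumes the `requests` library and a search endpoint that accepts ?q=).
import requests

SEARCH_URL = "https://www.example.com/search"   # hypothetical site-search endpoint
SPAM_KEYWORDS = ["cheap pills", "payday loan", "replica watches"]
NO_RESULTS_MARKER = "No results found"          # whatever your empty-results page says

for keyword in SPAM_KEYWORDS:
    response = requests.get(SEARCH_URL, params={"q": keyword}, timeout=10)
    if NO_RESULTS_MARKER not in response.text:
        print(f"Possible spam content indexed for: '{keyword}'")
```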

Conclusion

No one wants to walk into work on a Monday and be hit with one of these issues. Monitoring for them is key to avoiding uncomfortable meetings with your boss and sleepless nights worrying whether everything is OK.

Working with your technical teams to monitor for these issues and prevent them from making it into production is the best way to stay ahead of them. Of course, if you don’t have access to a dev team, there are third-party tools that will do this for you. Don’t let Murphy’s Law happen to your SEO.


Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land.


About the author

Russell Savage
Contributor
Russell Savage is an Application Engineer for Cask Data and loves tinkering with marketing data and automation in his spare time. He is the creator of FreeAdWordsScripts.com, where he posts AdWords Scripts for anyone to use.
