Losing Organic Traffic After A Redesign? Four Things to Check



While web designers and developers are often strong at designing and programming websites, not all are as well prepared to handle SEO issues, especially when relaunching websites. Just this week, I’ve worked with three separate web design firms who contacted me after a recent site relaunch caused a substantial loss of organic website traffic.

So, if you just relaunched your website and you’re losing organic traffic, how can you know where to start?

Here’s my basic list of the most common issues I see that are often overlooked in a relaunch. While there can be many reasons for traffic losses, these are the four areas I recommend checking first.

1. Check For An Algorithm Update

First and foremost, make sure you don’t mistakenly attribute a loss in traffic to a relaunch problem when it was actually the result of an algorithm update.

If you had the misfortune of relaunching at the same time as an algorithm update, you will likely need to investigate both the issues suspected to be part of the algorithm update and any relaunch-related issues.

To identify if the loss is algorithm-update related, first check to see if the organic traffic losses are occurring for multiple search engines.

If the traffic losses (by percentage) are significant only for one engine, then this may indicate an algorithm update. Also check Moz’s Google Algorithm Update History page and blogs like Search Engine Land to see if others are discussing a possible update.
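To make that comparison concrete, a few lines of Python can do the math from whatever before-and-after session counts your analytics tool reports per search engine. This is only a rough sketch with made-up placeholder numbers; substitute the real figures from your source/medium report for matching date ranges.

# Hypothetical organic session counts per engine, before and after the relaunch,
# pulled from an analytics source/medium report for comparable date ranges.
before = {"google": 12400, "bing": 1800, "duckduckgo": 650}
after = {"google": 7300, "bing": 1750, "duckduckgo": 640}

for engine, old_sessions in before.items():
    new_sessions = after.get(engine, 0)
    change = (new_sessions - old_sessions) / old_sessions * 100
    print(f"{engine}: {change:+.1f}%")

If the drop is steep for Google but flat for the other engines (as in these placeholder numbers), that pattern leans toward an algorithm update; a relaunch problem such as broken redirects tends to hit every engine at once.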

2. Check 301 Redirects

This is one of the areas I find most overlooked during a relaunch and often the main culprit of organic traffic loss.

301 redirects are like a “change of address card” for the search engine robots — they indicate for the engines that the old URL has permanently moved to a new URL.

If a web page is relaunched with a new URL, the search engine robot will still go to the old URL to index it. Links from other sites, for instance, likely still exist for the old URL, so when the search engine robot follows those links, it follows them to the old URL.

Without a 301 redirect to tell the robots where the new URL is, the robots will abandon trying to index that page and eventually it will drop from the search engine’s index.

To figure out if you’re missing 301 redirects (or perhaps they are not programmed correctly), look at the organic traffic to the individual pages of your site both before and after the redesign. I typically run a report that shows the top entry pages from organic search engines before the relaunch and compare that to the traffic after relaunch.
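If you export those two entry-page reports as CSV files, a short script can flag the pages with the biggest drops for you. The sketch below is an illustration, not a prescribed workflow: it assumes pandas is installed, and the file names and "page"/"sessions" column names are hypothetical, so adjust them to match whatever your analytics tool actually exports.

import pandas as pd

# Hypothetical CSV exports of organic entry-page sessions,
# one for the period before the relaunch and one for after.
before = pd.read_csv("organic_entry_pages_before.csv")  # columns: page, sessions
after = pd.read_csv("organic_entry_pages_after.csv")    # columns: page, sessions

merged = before.merge(after, on="page", how="left", suffixes=("_before", "_after"))
merged["sessions_after"] = merged["sessions_after"].fillna(0)
merged["pct_change"] = (
    (merged["sessions_after"] - merged["sessions_before"])
    / merged["sessions_before"] * 100
)

# Pages that lost more than half their organic entrances are the first ones to check.
print(merged.sort_values("pct_change").query("pct_change < -50"))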

For pages with major drops, check the URL itself by entering the URL in your browser. Were you redirected? If you received a 404 error, that’s likely what the search engine robots are finding, too.

Another problem may be the type of redirect used. Be sure to use 301 redirects in this case because a 301 redirect tells the search engines that the move is a permanent one. Other types of redirects, like 302s, shouldn’t be used in most relaunch situations.

To confirm that the redirect is a 301, run the old URL through an online redirect checker and look at the HTTP status code it reports.
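If you have a whole list of old URLs to verify, a small script can report the raw status code for each one. This is a minimal sketch using Python's requests library with hypothetical example URLs; swap in the old URLs that lost traffic. (A few servers handle HEAD requests poorly; if so, a GET with allow_redirects=False returns the same status information.)

import requests

# Hypothetical old URLs that lost organic traffic after the relaunch.
old_urls = [
    "https://www.example.com/old-page/",
    "https://www.example.com/old-category/old-post/",
]

for url in old_urls:
    # allow_redirects=False shows the raw status code the robots see first.
    resp = requests.head(url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    print(f"{url} -> {resp.status_code} {location}")

A 301 pointing to the correct new URL is what you want to see; a 302, a 404 or a long chain of hops is a sign the redirect rules need fixing.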

3. Check The Robots.txt

The robots.txt file serves as a set of instructions for search engine robots, indicating which pages to index and which to avoid.

It’s not uncommon to have a robots.txt on the test server (prior to website launch) that blocks search engine robots from indexing any of the pages on the test server (since the site is still being developed and approved).

Occasionally, when a website is moved to the live server, the robots.txt from the test server may be inadvertently copied to the live server.

If the robots.txt file is not updated to allow search engine robots to index the relaunched site on the live server, the search engines will not be able to crawl or view the pages, and those pages will eventually be removed from the search engine index.

To find out if your site’s robots.txt is blocking search engine robots, open a browser and enter your domain followed by /robots.txt in the address bar. This will show you the robots file. Then look for “disallow” on the page. While you may want to hide certain pages, like password-protected pages, only these “protected” pages should appear under a disallow statement.
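You can also spot-check this programmatically. The sketch below uses Python's built-in urllib.robotparser to test whether a handful of key URLs are crawlable under the live robots.txt. The domain and page list here are hypothetical placeholders; substitute your own important pages.

from urllib.robotparser import RobotFileParser

# Hypothetical domain and a few key pages to spot-check.
robots_url = "https://www.example.com/robots.txt"
key_pages = [
    "https://www.example.com/",
    "https://www.example.com/products/",
]

parser = RobotFileParser()
parser.set_url(robots_url)
parser.read()

for page in key_pages:
    allowed = parser.can_fetch("Googlebot", page)
    print(f"{page}: {'allowed' if allowed else 'BLOCKED'}")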

4. Check The Pages Themselves

In addition to blocking search engine robots through the robots.txt, individual pages can block robots by using a noindex directive in the robots meta tag. As with the robots.txt, there may be legitimate reasons to block certain pages from search engine indexing.

However, similar to the situation with a robots file being copied from the test server to the live server, I’ve seen the same mistake made with pages on the test server using the robots meta tag accidentally copied to the live server.

Check the page for a robots meta tag by viewing the page’s source code. A robots meta tag will resemble this one:

<META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">

… and will reside in the head area of the page.
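Rather than viewing source by hand on every page, you can fetch a page and look for a noindex directive with a short script. The sketch below assumes the requests and beautifulsoup4 packages are installed, and the URL is a hypothetical placeholder for a page that dropped out of the index.

import requests
from bs4 import BeautifulSoup

# Hypothetical page that lost its rankings after the relaunch.
url = "https://www.example.com/important-page/"

html = requests.get(url, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

# The name attribute may be written in any case, so compare case-insensitively.
robots_meta = soup.find("meta", attrs={"name": lambda v: v and v.lower() == "robots"})
if robots_meta and "noindex" in robots_meta.get("content", "").lower():
    print(f"{url} carries a noindex robots meta tag: {robots_meta}")
else:
    print(f"{url} has no noindex robots meta tag")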

Resolve Issues Quickly

Regardless of the reason for the traffic loss, once you find the issue, be sure to resolve it as quickly as possible. When search engine robots can’t find pages (or index them), they will soon remove them from the index to avoid sending searchers to invalid pages. So, act quickly to regain your rankings.





About the author

Janet Driscoll Miller
Contributor
Janet Driscoll Miller is the President and CEO of Marketing Mojo and has been working in digital marketing for over twenty-five years. She is the author of "Data-First Marketing: How to Compete and Win in the Age of Analytics" and is a frequent speaker on digital marketing and data analytics. She specializes in technical SEO, digital marketing analysis and management, and accurate attribution and data analytics.
