Website redesign mistakes that destroy SEO

To keep up with user preferences, you have to redesign your website now and then. Learn how to avoid the most common pitfalls when you do.


Redesigning a website, whether it’s your own or a client’s, is an essential part of marketing today. It’s essential because technology, trends, and the expectations of users change over time, and if we want to remain competitive, we must keep pace with these changes.

But this task, while essential, also presents certain risks from an SEO perspective. A number of things can go wrong during the process. These issues can potentially cause search engines to no longer view that website as the authoritative answer to relevant queries. In some cases, certain mistakes can even result in penalties.

No one wants that.

So in this article, we’re going to explore some of the common web design mistakes that can destroy SEO. Knowing the potential risks may help you avoid making the kind of mistakes that tank your organic search traffic.

Leaving the development environment crawlable / indexable

People handle development environments in a lot of different ways. Most simply set up a subfolder under their domain. Some may create a domain strictly for development. Then there are those who take the kind of precautions to hide their development environment that would give a CIA agent a warm fuzzy feeling in that empty spot where their heart should be.

I tend to fall into the latter category.

Search engines are generally going to follow links and index the content they find along the way — sometimes even when you explicitly tell them not to. That creates problems because they could index two versions of the same website, potentially causing issues with both content and links.

Because of that, I place as many roadblocks as possible in the way of search engines trying to access my development environment.

Here’s what I do. The first step is to use a clean URL that has never been used for a live website before. This ensures there are no links pointing to it. Next, disallow all bots using robots.txt, and set up an empty index page so that other folders are not visible. In the past, I’ve even gone as far as setting up password protection, but in most cases, that may be overkill. You can make that call.

From there, I’ll set up a separate folder for each website in development. Typically, the folder name will be a combination of incomplete words so that it’s unlikely to be found randomly. WordPress will then be installed in these folders, and configured to also block bots at this level.
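As a sketch, the blanket bot block described above takes just two lines in a robots.txt file placed at the root of the development domain:

```
User-agent: *
Disallow: /
```

Keep in mind that this is a request, not a lock; as noted above, crawlers sometimes ignore it, which is why the obscure folder names and optional password protection are worth layering on top.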

Arbitrarily changing image names on pages that rank well

This isn’t always an issue, but if a web page is ranking well, changing the name of an image on that page may cause a loss of ranking, especially if the web designer doesn’t know what they’re doing.

I’ve seen this happen more than a few times, where a client hires a web designer who doesn’t understand SEO to redesign a website that already ranks well. As part of the redesign process, they replace old images with new, larger images, but, lacking the appropriate experience, they use stupid image names that provide zero SEO value, like image1.jpg.

This takes away a vital piece of context that search engines use to determine where a particular web page should rank.
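To illustrate (the filenames here are made up), the difference comes down to markup like this:

```html
<!-- Generic filename and empty alt text: no context for search engines -->
<img src="image1.jpg" alt="">

<!-- Descriptive filename and alt text preserve that context -->
<img src="stainless-steel-water-bottle-32oz.jpg" alt="32 oz stainless steel water bottle">
```

If images must be replaced during a redesign, keeping the old filenames (or at least equally descriptive ones) helps preserve whatever context the originals were providing.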

Deleting pages or changing page URLs without redirecting them

During a redesign, some pages will almost certainly no longer be needed. Less experienced web designers will often simply delete them. Other pages may be moved and/or renamed, which in most cases changes their URL. Inexperienced web designers often make these changes and consider the task complete.

This is a big mistake because some of those pages may already rank well. They might have inbound links pointing to them or have been bookmarked by visitors.

When you delete pages that already have inbound links, you’ll lose all of the SEO value from those links. In some cases, this could result in a drastic loss of ranking.

The issue goes even deeper though. Anyone clicking those links or bookmarks will be greeted by a 404 page. That presents zero value to anyone, and more importantly, it creates a negative user experience. This is important because Google has confirmed that user experience is a ranking factor.

The proper way to delete pages is to redirect them to the most relevant page that currently exists. As for moving pages, which includes anything that changes the URL of that page in any way, it’s equally important to redirect the old URL to the new one.

In both scenarios, a 301 redirect should generally be used. This tells search engines that the old page has been permanently moved to the new location. For most hosting platforms, this is best accomplished by adding the appropriate entry into your .htaccess file.

If you’re unable to see a .htaccess file on your server, you may need to adjust the settings on your FTP program to view hidden files.

Some specialized hosting platforms may utilize a different method, so you may need to check with their support team to determine how to accomplish it.
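On a typical Apache host, a 301 redirect in .htaccess can be as simple as the following (the paths here are hypothetical examples, not a prescription):

```apache
# Deleted page: send visitors and link equity to the closest relevant page
Redirect 301 /old-services.html /services/

# Moved/renamed page: point the old URL at the new one
Redirect 301 /blog/2017/seo-tips/ /blog/seo-tips/
```

Each rule maps one old path to its replacement, so every URL you delete or move during the redesign should get its own entry.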

Not performing a full crawl after migration to and from the development environment

Regardless of the method you use for migration, you’re bound to run into some errors. Typically, you’ll first migrate the live website into your development environment, and then later, send it back to the live server after you’ve made and tested changes.

One that I run into frequently is links within content pointing to the wrong place. For example, within a page or post on the live website, you may have a link that points to:

domain.com/services/

Once migrated to the development environment, it may be:

devdomain.com/client123/services/

All well and good so far, right?

But sometimes, while migrating the completed website back over to the live server, the content in pages and posts may still contain links pointing to the pages within the development environment.

This is just one example. There are countless links to content within a website — including links to the essential image, JavaScript, and CSS files.

Fortunately, the solution is simple. A tool like Screaming Frog, which runs from your desktop, or a cloud-based tool like SEMrush, can be used to crawl every single link within your website. This includes the text links visible on the front end, as well as all of the links to image, JavaScript, and CSS files that are tucked away in the HTML of a website.

Be sure to review all links to external sources once the new website has been migrated to the live server, because any links still pointing to your development environment will show up as external links. When you find “external links” that should really be internal links, you can make the appropriate corrections.

This step is essential after migrating in either direction, in order to prevent potentially catastrophic errors.
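If you want a quick supplementary check alongside a full crawler, a few lines of Python can scan a page’s HTML for anything still pointing at the development host. This is only a rough sketch; the hostname and sample markup below are hypothetical stand-ins for your own:

```python
from html.parser import HTMLParser

DEV_HOSTS = {"devdomain.com"}  # hypothetical development hostname


class LinkAuditor(HTMLParser):
    """Collect href/src values that still point at a development host."""

    def __init__(self):
        super().__init__()
        self.leaks = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("href", "src") and value:
                # Flag any URL that references a dev host
                if any(host in value for host in DEV_HOSTS):
                    self.leaks.append((tag, value))


# In practice you'd feed this the fetched source of each live page
sample_html = """
<a href="https://domain.com/services/">Services</a>
<img src="https://devdomain.com/client123/logo.png">
"""

auditor = LinkAuditor()
auditor.feed(sample_html)
for tag, url in auditor.leaks:
    print(f"Leaked dev link in <{tag}>: {url}")
```

A dedicated crawler like Screaming Frog will still do this more thoroughly, since it also follows links into JavaScript and CSS files, but a script like this is handy for spot-checking a handful of templates.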

Failing to perform a complete function check on everything

Once a redesigned website has been migrated to the live server, you need to do more than quickly review a few pages to make sure things look OK. Instead, it’s essential to manually test everything to make sure it not only looks right, but also functions properly.

This includes:

  • Contact forms.
  • E-commerce functionality.
  • Search capabilities.
  • Interactive tools.
  • Multimedia players.
  • Analytics.
  • Google Search Console / Bing Webmaster Tools verification.
  • Tracking pixels.
  • Dynamic ads.

Failing to reconfigure WordPress and plugins after migration to the live server

Remember how we talked about the importance of putting up a wall between your development environment and the search engines’ crawlers? Well, it’s even more important to tear that wall down after migrating the website to the live server.

Failing to do this is easy. It’s also devastating. In fact, it’s a mistake I made several years ago.

After migrating a client’s website to their live server, I forgot to uncheck the box in Yoast SEO that told search engines not to crawl or index it. Unfortunately, no one noticed for a few days, at which point, the website had been almost completely dropped from Google’s index. Fortunately, they didn’t rely on organic traffic, and, once I unchecked that box, the website was quickly reindexed.

[Screenshot: the WordPress “Search Engine Visibility” setting]

Because of the impact mistakes like these can have, it’s critical that after migration to the live server, you immediately check the configuration of WordPress as well as any plugins that could affect how search engines treat your website.

This includes plugins for:

  • SEO.
  • Redirection.
  • Sitemaps.
  • Schema.
  • Caching.
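One quick way to verify the wall has actually come down is to check the live homepage for a leftover noindex meta tag. Here’s a minimal sketch using only the standard library; the page source is hard-coded for illustration, whereas in practice you’d fetch your own live URL:

```python
from html.parser import HTMLParser


class RobotsMetaCheck(HTMLParser):
    """Detect a noindex directive in a page's meta robots tag."""

    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            name = (d.get("name") or "").lower()
            content = (d.get("content") or "").lower()
            if name == "robots" and "noindex" in content:
                self.noindex = True


# Hypothetical page source fetched right after go-live
page = '<html><head><meta name="robots" content="noindex,nofollow"></head></html>'

checker = RobotsMetaCheck()
checker.feed(page)
print("noindex still set!" if checker.noindex else "page is indexable")
```

Also remember to check robots.txt on the live domain itself; a blanket Disallow copied over from the development environment is just as damaging as the meta tag.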

Neglecting to pay attention to detail

None of these mistakes are particularly complicated or difficult to avoid. You simply need to be aware of them, implement a plan to avoid them, and pay close attention to detail.




About the author

Jeremy Knauff
Contributor
Jeremy Knauff is the founder of Spartan Media, a digital marketing agency in Tampa, Florida. He's also a proud father, husband, and US Marine Corps veteran. After 18 years in the digital marketing industry, he's learned a thing or two, and today, while still serving his clients, he's working to share his knowledge with the industry to help even more people.
