Don’t Abuse Users’ Search Experience With 301 Redirects

100% Organic - A Column From Search Engine Land

Linkbait, link juice, PageRank, paid links—all of these terms are commonly associated with one of the fundamental building blocks of successful search engine optimization (SEO): link development.

One of the problems with link development gurus is their obsession with 301 redirects. Heck, I have even heard search engine software engineers tout 301 redirects as the magic solution to maintaining high quality link development. The problem with this 301-redirect obsession is that the focus is on search engine positioning, not the user.

Many SEO firms focus only on gaining top positions. What happens after a searcher clicks on a link to a web site? Not their concern: creating and maintaining a user-friendly web site that converts visitors into buyers is not what they were hired to do. To this group, losing a link is almost like losing a limb. “301 redirect, 301 redirect, 301 redirect,” is one of their mantras.

I believe that one of the most difficult things for an SEO to understand is that it is perfectly acceptable to lose a link, and possibly multiple links, because the search experience is compromised when 301 redirects are the only solution. In fact, there are situations when presenting a customized 404 Error Page is more acceptable than presenting a redirected web page.

When to use a 301 redirect

As I explained in a previous column, Understanding Search Engine Duplicate Content Issues, 301 is a status code that tells search engines that the content at a specific URL (web address) has permanently moved to another URL. I like to think of it as a Change of Address card for computers.
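To make the “Change of Address card” concrete, here is a minimal sketch in Python of how a server answers a request for a permanently moved page. The URLs and the mapping are hypothetical; a real site would typically do this in its web server configuration rather than application code.

```python
# Hypothetical map of permanently moved URLs: old address -> new address.
MOVED_PERMANENTLY = {
    "/reviews/blue-widget.html": "/products/blue-widget/review/",
}

def handle_request(path):
    """Return (status_code, headers) for a requested path (sketch only)."""
    if path in MOVED_PERMANENTLY:
        # 301: the Location header is the "Change of Address card".
        # Browsers follow it, and search engines update their index
        # to point at the new URL.
        return 301, {"Location": MOVED_PERMANENTLY[path]}
    return 200, {}
```

The key point is the pairing: the 301 status says “moved permanently,” and the `Location` header says where.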

Many SEO professionals, and even search engine software engineers, often state that 301 redirects should be implemented to preserve the “link juice” pointing to expired content. I completely disagree with that assessment. Why? The important word in that sentence is “expired”. If content expires on a web site, meaning that it is no longer available, then that expiration should be communicated to both search engines and site visitors.

Here is a typical SEO scenario. A searcher types a specific keyword phrase into a search engine and views the term highlighting in the search results. Term highlighting is very important for search usability since it provides a powerful information scent and increases user confidence that he/she will be delivered to the desired content. When a searcher clicks on a link to expired content, he/she is redirected to a home page to begin searching or browsing for the desired content. The scent of information is compromised. User confidence decreases. If the content has expired, then the search for this content is futile.

“But you must preserve link juice!” cries the link-obsessed SEO professional. I wholeheartedly disagree because, unlike many SEO professionals, I perform usability tests and have been doing so for many years. I observe the search behaviors and reactions to misguided redirects. Searchers become frustrated and leave with a negative impression of a web site. Even if the site’s search engine listing reappears for different keyword queries, searchers no longer click on that link because of the negative search experience.

In fact, in the past six months, I have observed a decline in user confidence with Google searches among some IT professionals. People who fit this particular profile/persona tend to go directly to vertical sites that provide them with a positive search experience, which includes an information scent and satisfactory content delivery.

In my opinion, 301 redirects are a reasonable solution when searchers are delivered to updated information. Of course, whenever a site must rewrite URLs due to a content management system (CMS) migration, a page-to-page 301 delivery is the solution. However, for expired content, delivering a custom 404 page is more appropriate, in spite of the “link juice” theory.
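The distinction drawn above, redirect when the content lives on at a new URL but answer honestly when it has expired, can be sketched as a simple decision. The Python below uses hypothetical URL sets for a CMS migration and a retired promotion:

```python
# Hypothetical data: pages rewritten in a CMS migration vs. retired pages.
MIGRATED = {"/old-cms/page-42": "/guides/choosing-redirects/"}
EXPIRED = {"/promotions/summer-2006-sale"}

def choose_status(path):
    """Decide how to answer a request for an old URL (sketch only)."""
    if path in MIGRATED:
        # Same content, new address: a page-to-page 301 is correct.
        return 301, MIGRATED[path]
    if path in EXPIRED:
        # Content is gone: serve a custom 404, do not redirect elsewhere.
        return 404, None
    return 200, None
```

The point is that the 301 and the 404 answer two different questions: “where did it go?” versus “it is gone.”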

When to use a custom 404 error page

I understand the mentality of “link juice” theorists. Even if a page’s content has expired, “link juice” theorists believe it is perfectly acceptable to substitute the expired content with similar content.

What “link juice” theorists fail to do is perform usability tests and observe searcher behavior. Is the substituted content acceptable to end users? If so, why is it acceptable? Do site visitors take the desired calls to action? If users do not find the substituted content to be acceptable, they will not link to the URL or take the desired call to action.

From a web site owner’s and an SEO professional’s perspective, the substituted content preserves link popularity and delivers prospects to the web site. From the user perspective, however, desired content is no longer available. Users want to know that the desired content is no longer available. They do not want to be redirected to content they did not request.

When content on a web site is no longer available, users prefer to see a customized 404 Error page. Content for a customized 404 Error page should come after a thorough analysis of keyword and clickstream data so that the scent of information is preserved.
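One way to preserve that information scent, sketched below in Python, is to let the custom 404 page suggest the most relevant live pages based on words found in the dead URL. The keyword-to-page map here is hypothetical and would come from the keyword and clickstream analysis described above:

```python
# Hypothetical map from high-traffic keywords (found in keyword and
# clickstream reports) to the live pages that best answer them.
TOP_PAGES = {
    "redirect": "/articles/redirects-explained/",
    "duplicate": "/articles/duplicate-content/",
    "usability": "/articles/search-usability/",
}

def suggestions_for_404(requested_path):
    """Pick links for a custom 404 page from words in the dead URL (sketch)."""
    words = requested_path.lower().replace("-", "/").split("/")
    return [url for keyword, url in TOP_PAGES.items() if keyword in words]
```

A visitor landing on a dead redirect-related URL would see a “this page is gone” message plus links that match the scent of the query that brought them there.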

If new, substituted content were truly valuable and useful, then web site owners should have no problems getting links to that content.

Conclusion

I understand that it might be a tough pill to swallow, hearing from an SEO professional that there are times when losing “link juice” is perfectly acceptable. Some of my colleagues probably think I am nuts for making such a statement. I would not remove content that consistently delivered high quality search engine traffic and conversions. Nevertheless, web site owners add and remove content from their sites all of the time, and they forget about the user experience and qualified search engine traffic. SEO professionals often forget about the user experience as well.

In the end, if SEO professionals would quit obsessing over positioning alone, they might deliver a better search experience, which ultimately results in higher conversions and increased sales.

Shari Thurow is the Founder and SEO Director at Omni Marketing Interactive and the author of the book Search Engine Visibility. The 100% Organic column appears Thursdays at Search Engine Land.

Opinions expressed in the article are those of the guest author and not necessarily Search Engine Land.

About The Author: Shari Thurow is the Founder and SEO Director at Omni Marketing Interactive and the author of the books Search Engine Visibility and When Search Meets Web Usability. Shari currently serves on the Board of Directors of the Information Architecture Institute (IAI) and the ASLIB Journal of Information Management. She also served on the board of the User Experience Professionals Association (UXPA).

  • Red_Mud_Rookie

    Great post Shari, clear and to the point as usual… however you’ve got me a little confused…

    I manage SEO for a large travel site which has recently moved onto a new CMS.
    The new pages live on new URLs and as a result are causing duplicate content issues with the old because Google doesn’t know that we have new versions of the old pages.
    I have put in motion a project to identify all the old URLs so that we can 301 redirect them to the new thereby nullifying the duplicate content threat.

    How does this relate to your article?

  • http://www.seo-theory.com/wordpress/ Michael Martinez

    Well, I can’t speak for Shari but as it sounds like the content simply resides at new URLs, I would implement 301 redirects myself.

    I also use custom 404 pages where there is nothing to find because I would rather show my visitors what my site is about than lead them on a wild goose chase.

    Custom 404 pages can provide a lot of useful information without confusing search engines. People who obsess over “link juice” are not doing anyone any favors.

  • http://www.seocharlie.com/blog seocharlie

    Great article Shari, thanks.
The main problem is that people are focusing more on the SERPs than ever. They want to keep their “green bar” and recycle all possible content.

So, they don’t want 404 pages because those “drop” some of the site’s authority. It’s hard to explain how things really work, especially to people who don’t see anything beyond rankings.

  • http://www.receptional.com/newsroom/ Receptional

    Hiya Shari,

    Firstly, Thanks so much for the wine and the chocs! Looking forward to the book!

    Your post is thought provoking – and I know you won’t mind me taking the other side of the debate. Someone had to take up the gauntlet! :)

So here’s my take. I believe that if links are genuinely built, then 301s are helping usability, not hindering it. Don’t get me wrong – I am not talking about buying up random expired names here, I am talking about the myriad of ways my users would be put out if I didn’t use 301s. Sometimes just minor confusion. Other times an outright “broken site” feel. Here are some examples:

1. When a revised site is built, I think about the poor guys that bookmarked the old copy and (yes) the people that will click on search engine results to a dead page on my site. Now I could give them a 404, but I still HAVE a page on (say) a review of the “Blue Lagoon in Iceland” (!) but now it’s somewhere else. Giving the user the right content is one step on from making them search again.

    2. One of the first uses of 301s for me is to create consistency for the user. “www” vs. non “www” is an obvious one, but there are others. Whether urls have trailing slashes and common mistypings of urls are both examples where we use 301s and I believe the consistent (and hopefully meaningful) urls and filename is an added indicator to the user that they are indeed where they want to be.

3. We use tracking urls on PPC adverts (it’s just about the algorithmic results) which help us immensely in understanding what our users like and don’t like. Things like adding:
?source=panama&adgroup=hotwater (hotwater being what I’m getting into disagreeing with Shari!) to an ad-link really makes our money go further because we are not advertising to the wrong users. But having these variables tacked onto the end of a domain URL just upsets people. I hate session variables as a user. Don’t you? But we 301 those tracking URLs as well. Not for any search benefit, only for the user’s benefit and the desire to keep consistency.
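The consistency rule in point 2 can be sketched roughly in Python. The canonical host below is hypothetical, and production sites would normally enforce this in the web server configuration rather than in application code:

```python
from urllib.parse import urlsplit, urlunsplit

CANONICAL_HOST = "www.example.com"  # hypothetical canonical hostname

def canonical_url(url):
    """Return (needs_301, canonical form) for a requested URL (sketch)."""
    scheme, host, path, query, fragment = urlsplit(url)
    # Add a trailing slash to directory-style paths (no file extension).
    if not path.endswith("/") and "." not in path.rsplit("/", 1)[-1]:
        path += "/"
    # Force the canonical "www" host regardless of how the user typed it.
    canonical = urlunsplit((scheme, CANONICAL_HOST, path, query, fragment))
    return canonical != url, canonical
```

Any request whose URL differs from the canonical form would get a single 301 to that form, so every page has exactly one address.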

    I do agree with the sentiment of your post. Please don’t stop being nice to me! Expired content should probably be executed and confined to supplemental hell or deleted from the index ASAP.

    I have always been a strong advocate of the “links are for users” camp and anyone who hears my views sees little difference between a link face to face and a link on a web page in business terms. They are both valuable links – and the face to face one is the most valuable of all. But a lack of consistency is always confusing for web users and multiple ways of writing urls that show the same content is confusing for the user.

    Dixon.

  • http://www.brulant.com chris boggs

I think you have taught effectively, Shari, the perils of using “from-the-hip” 301s. In many cases, a 404 (especially if customized) will be far more user friendly than guessing or simply sending the user to the home page or most likely category page.

    However, I (along with Dixon) want to stress what I feel was the most important thing you wrote in here:

    “301 redirects are a reasonable solution when searchers are delivered to updated information.”

    The “link juice” is not the sole reason for 301s. If a site is sending a lot of traffic to my old page, I want to establish the best one-to-one relationship that I can.

Maybe take at least the top pages from the old site, ranked by inbound traffic as well as inbound links, and establish a 301 for those; then “collect” the rest and make the choice between a 301 and a 404. Personally, I feel that if you have a good enough home page, the user experience in navigating from there to the originally intended topic should be pleasant. A sitemap or custom 404 should not have to be any more effective, IMO.

This is definitely a discussion that could go back and forth, but I am in the camp that believes properly structured 301s for at least the most important pages outweigh the use of 404s. Along with Dixon, I agree with you that there is usually plenty of content on an old site that can simply be dumped via 404.

  • http://www.jehochman.com JEHochman

I’m working on an interesting test case right now where we’ve taken 20,000 catalog pages offline. The purpose is to focus more PageRank on newer product info and to reduce duplication between last year’s catalog and this year’s. The old pages were drawing tons of traffic. We’ve elected to use a very customized 404 that explains why we moved the old catalog offline and provides the user with a path forward.

    You can make as many different 404 pages as you like, as long as you use server side scripting to return a 404 status code. This combined with silent URL rewriting (moves the user to the 404 error page without a redirect) for a specified group of pages can provide a better user experience while giving a proper response to the search engines.

    As a caveat, this catalog didn’t have many deep links. Most of the link juice was being poured in from the top, so it was easy to direct that where we wanted it to go instead.
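The technique described above, serving a helpful explanation page while still returning a true 404 status so search engines never index a “soft 404,” can be sketched like this in Python (the retired-catalog path prefix and page text are hypothetical):

```python
# Hypothetical path prefix for the retired catalog section.
RETIRED_PREFIX = "/catalog/2006/"

CUSTOM_404_BODY = (
    "<h1>This catalog page has been retired</h1>"
    "<p>Last year's catalog is offline. "
    '<a href="/catalog/">Browse the current catalog</a>.</p>'
)

def respond(path):
    """Return (status, body): a helpful page with an honest 404 status."""
    if path.startswith(RETIRED_PREFIX):
        # Visitors see an explanation and a path forward, while search
        # engines see status 404 and drop the URL from the index instead
        # of treating the helpful page as live, duplicate content.
        return 404, CUSTOM_404_BODY
    return 200, "<h1>Catalog</h1>"
```

The common mistake is the inverse: returning status 200 with a “not found” page, which is exactly the soft 404 that keeps dead URLs in the index.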

 
