• Red_Mud_Rookie

    Great post, Shari – clear and to the point as usual… however, you’ve got me a little confused…

    I manage SEO for a large travel site which has recently moved onto a new CMS.
    The new pages live on new URLs and, as a result, are causing duplicate content issues with the old ones, because Google doesn’t know that we have new versions of the old pages.
    I have put in motion a project to identify all the old URLs so that we can 301 redirect them to the new ones, thereby nullifying the duplicate content threat.

    How does this relate to your article?

  • http://www.seo-theory.com/wordpress/ Michael Martinez

    Well, I can’t speak for Shari, but since it sounds like the content simply resides at new URLs, I would implement 301 redirects myself.

    I also use custom 404 pages where there is nothing to find because I would rather show my visitors what my site is about than lead them on a wild goose chase.

    Custom 404 pages can provide a lot of useful information without confusing search engines. People who obsess over “link juice” are not doing anyone any favors.
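
    For anyone who wants to see what that combination looks like in practice, here is a minimal sketch (Python/WSGI – the URL mappings and page text are invented placeholders):

    ```python
    # Minimal WSGI sketch: 301 the URLs that have a real replacement, and serve a
    # custom 404 for everything else. Mappings and HTML here are placeholders.
    from wsgiref.simple_server import make_server

    REDIRECT_MAP = {
        "/old-cms/blue-lagoon-review.html": "/reviews/blue-lagoon-iceland/",
        "/old-cms/contact.html": "/contact/",
    }

    CUSTOM_404 = b"""<html><body>
    <h1>Sorry, that page no longer exists.</h1>
    <p>Try the <a href="/">home page</a> or the <a href="/sitemap/">site map</a>.</p>
    </body></html>"""

    def application(environ, start_response):
        path = environ.get("PATH_INFO", "/")

        new_location = REDIRECT_MAP.get(path)
        if new_location:
            # Permanent redirect: the user lands on the right content and
            # search engines update their index to the new URL.
            start_response("301 Moved Permanently", [("Location", new_location)])
            return [b""]

        # Nothing to redirect to: be helpful, but tell the truth with a 404.
        # (On a real site, normal page routing would run before this fallback.)
        start_response("404 Not Found", [("Content-Type", "text/html")])
        return [CUSTOM_404]

    if __name__ == "__main__":
        make_server("", 8000, application).serve_forever()
    ```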

  • http://www.seocharlie.com/blog seocharlie

    Great article, Shari, thanks.
    The main problem is that people are focusing more on the SERPs than ever. They want to keep their “green bar” and recycle all possible content.

    So they don’t care about 404 pages, because they feel a 404 “drops” some of their authority. It’s hard to explain how things really work, especially to people who can’t see anything beyond rankings.

  • http://www.receptional.com/newsroom/ Receptional

    Hiya Shari,

    Firstly, thanks so much for the wine and the chocs! Looking forward to the book!

    Your post is thought-provoking – and I know you won’t mind me taking the other side of the debate. Someone had to take up the gauntlet! :)

    So here’s my take. I believe that if links are genuinely built, then 301s are helping usability, not hindering it. Don’t get me wrong – I am not talking about buying up random expired domain names here; I am talking about the myriad ways my users would be put out if I didn’t use 301s. Sometimes it’s just minor confusion; other times it’s an outright “broken site” feel. Here are some examples:

    1. When a revised site is built, I think about the poor guys who bookmarked the old copy and (yes) the people who will click through from search engine results to a dead page on my site. Now I could give them a 404, but I still HAVE a page on (say) a review of the “Blue Lagoon in Iceland” (!) – it’s just somewhere else now. Giving the user the right content is one step on from making them search again.

    2. One of the first uses of 301s for me is to create consistency for the user. “www” vs. non-“www” is an obvious one, but there are others. Whether URLs have trailing slashes, and common mistypings of URLs, are both cases where we use 301s, and I believe that consistent (and hopefully meaningful) URLs and filenames are an added indicator to the user that they are indeed where they want to be.

    3. We use tracking URLs on PPC adverts (I know the post is just about the algorithmic results) which help us immensely in understanding what our users like and don’t like. Things like adding:
    ?source=panama&adgroup=hotwater (hot water being exactly what I’m getting into by disagreeing with Shari!) to an ad link really make our money go further, because we are not advertising to the wrong users. But having these variables tacked onto the end of a URL just upsets people. I hate session variables as a user. Don’t you? So we 301 those tracking URLs as well – not for any search benefit, only for the user’s benefit and the desire to keep consistency (sketched below).
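
    A rough sketch of what I mean by points 2 and 3 (Python; the domain and tracking parameter names are made-up placeholders, not our real setup):

    ```python
    # Rough sketch of a canonicalising 301: one hostname, a consistent trailing
    # slash, and no tracking variables in the indexed URL. Domain is illustrative.
    from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

    CANONICAL_HOST = "www.example-travel-site.com"   # hypothetical canonical host
    TRACKING_PARAMS = {"source", "adgroup"}          # e.g. ?source=panama&adgroup=hotwater

    def canonical_url(requested_url):
        """Return the URL to 301 to, or None if the request is already canonical."""
        scheme, host, path, query, _ = urlsplit(requested_url)

        host = CANONICAL_HOST                        # "www" vs. non-"www"
        last_segment = path.rsplit("/", 1)[-1]
        if not path.endswith("/") and "." not in last_segment:
            path += "/"                              # consistent trailing slash

        # Strip tracking variables so one clean URL serves the content.
        kept = [(k, v) for k, v in parse_qsl(query) if k not in TRACKING_PARAMS]
        query = urlencode(kept)

        canonical = urlunsplit((scheme, host, path, query, ""))
        return canonical if canonical != requested_url else None

    print(canonical_url("http://example-travel-site.com/iceland?source=panama&adgroup=hotwater"))
    # -> http://www.example-travel-site.com/iceland/
    ```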

    I do agree with the sentiment of your post. Please don’t stop being nice to me! Expired content should probably be executed and confined to supplemental hell or deleted from the index ASAP.

    I have always been a strong advocate of the “links are for users” camp, and anyone who hears my views sees little difference, in business terms, between a link made face to face and a link on a web page. They are both valuable links – and the face-to-face one is the most valuable of all. But a lack of consistency is always confusing for web users, and having multiple ways of writing URLs that show the same content only adds to that confusion.

    Dixon.

  • http://www.brulant.com chris boggs

    I think you have effectively taught the perils of using “from-the-hip” 301s, Shari. In many cases a 404 – especially a customized one – will be far more user-friendly than guessing or simply sending the user to the home page or the most likely category page.

    However, I (along with Dixon) want to stress what I feel was the most important thing you wrote in here:

    “301 redirects are a reasonable solution when searchers are delivered to updated information.”

    The “link juice” is not the sole reason for 301s. If a site is sending a lot of traffic to my old page, I want to establish the best one-to-one relationship that I can.

    Maybe take at least the top pages from the old site, ranked by inbound traffic as well as inbound links, and establish a 301 for those; then “collect” the rest and make the choice between a 301 and a 404 for each. Personally, I feel that if you have a good enough home page, the user experience in navigating from there to the originally intended topic should be pleasant. A sitemap or custom 404 shouldn’t need to be any more effective than that, IMO.
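
    Purely as an illustration (the thresholds, field names and figures below are invented), that triage might look something like this:

    ```python
    # Illustrative triage: 301 the old pages that still earn traffic or links and
    # have a real replacement; 404 the rest. All data and thresholds are invented.

    old_pages = [
        {"url": "/old/blue-lagoon.html", "visits": 4200, "links": 35, "new_url": "/reviews/blue-lagoon/"},
        {"url": "/old/press-2004.html",  "visits": 12,   "links": 0,  "new_url": None},
    ]

    def decide(page, min_visits=100, min_links=5):
        """Return ("301", target) for pages worth redirecting, else ("404", None)."""
        worth_keeping = page["visits"] >= min_visits or page["links"] >= min_links
        if worth_keeping and page["new_url"]:
            return ("301", page["new_url"])
        return ("404", None)

    for page in old_pages:
        print(page["url"], "->", decide(page))
    ```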

    This is definitely a discussion that could go back and forth, but I am in the camp that believes properly structured 301s for at least the most important pages outweigh the use of 404s. Along with Dixon, I agree with you that there is usually plenty of content on an old site that can simply be dumped via 404.

  • http://www.jehochman.com JEHochman

    I’m working on an interesting test case right now where we’ve taken 20,000 catalog pages offline. The purpose is to focus more PageRank on newer product info and to reduce duplication between last year’s catalog and this year’s. The old pages were drawing tons of traffic. We’ve elected to use a very customized 404 that explains why we moved the old catalog offline and provides the user with a path forward.

    You can make as many different 404 pages as you like, as long as you use server-side scripting to return a 404 status code. This, combined with silent URL rewriting (which serves the 404 error page without issuing a redirect) for a specified group of pages, can provide a better user experience while giving a proper response to the search engines.
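
    A bare-bones sketch of that idea (Python/WSGI; the /catalog-2006/ prefix and the wording are placeholders, not our actual catalog):

    ```python
    # Sketch: different 404 explanations for different groups of missing URLs,
    # while always returning a genuine 404 status. Paths and copy are placeholders.

    RETIRED_PREFIX = "/catalog-2006/"   # hypothetical retired catalog section

    CATALOG_404 = b"""<html><body>
    <h1>This catalog page has been retired.</h1>
    <p>Last year's catalog is offline; please
    <a href="/catalog/">browse the current catalog</a> instead.</p>
    </body></html>"""

    GENERIC_404 = b"<html><body><h1>Page not found.</h1></body></html>"

    def application(environ, start_response):
        # On a real site this runs only after normal routing finds nothing.
        path = environ.get("PATH_INFO", "/")
        body = CATALOG_404 if path.startswith(RETIRED_PREFIX) else GENERIC_404
        # The status stays 404 either way: the user gets an explanation and a
        # path forward, and search engines get the correct signal.
        start_response("404 Not Found", [("Content-Type", "text/html")])
        return [body]
    ```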

    As a caveat, this catalog didn’t have many deep links. Most of the link juice was being poured in from the top, so it was easy to direct that where we wanted it to go instead.