During consultations for search engine optimization services, the topic of blogs and other types of social media inevitably arises. I observe plenty of enthusiasm about blog content and fresh search results. I listen attentively to client ideas and concerns. Then I ask a simple question.

“What is your SEO archiving plan?”

Do you know what the typical response is? Stunned silence.

Then I ask another question:

“Who is in charge of archiving your blog content as well as your corporate website content?”

Do you know what the typical response to that question is? Continued silence. More stunned expressions. People looking at each other to see who might have the answer to that question.

You see, to too many people, SEO is all about keywords and rankings and freshness and the latest flavor-of-the-month tactic. A topic like archiving does not appear on an SEO professional’s radar until a specific situation arises, such as a site redesign or a migration to a new content management system (CMS).

The Costs Of Not Having An Archiving Plan

Longtime linking strategist Eric Ward, who also publishes LinkMoses Private, shared this URL archiving horror story with me:

“I worked with a client for many years, each month seeking links for the new content they added each month. This new content was always subject specific, and was placed within a subdirectory at a nice short URL that made the link seeking process easier. In this company, the marketing departments and IT departments didn’t communicate every decision they made to each other.”

“Well, the IT department made the decision to change web content delivery platforms, but they did not share this with the marketing folks, who had no idea it was happening. Overnight thousands of URLs changed, with no redirects in place. Nothing but 404s.”

“This meant that all the deep links I had obtained over the course of several years became useless. There was no migration or archiving plan in place. There was no old site map or list of previous URLs. The lesson from this is Web sites demand planning and cooperation across departments. One decision can wreck a lot of work.”

Unfortunately, I encounter this type of situation all too frequently. There is a mad scramble to put together properly programmed 301 redirects without considering the searcher experience.

Web pages that should return 404 File Not Found errors are instead redirected to the home page in a scramble to recover lost link juice. 301 redirects may have to be implemented and re-implemented, diminishing their value. Web content that once had solid link development and easy access is suddenly buried in the revised information architecture.
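
Before any platform change, it helps to keep an explicit old-to-new URL map and to test it. Below is a minimal sketch of such an audit, assuming a simple two-column CSV of old and new URLs and Python's requests library; the file name and layout are illustrative, not any platform's standard. It flags anything that is not a single, clean 301 hop to a live page:

    # Hypothetical audit: verify that each legacy URL answers with a single
    # 301 hop to a live (HTTP 200) destination. Assumes "redirect_map.csv"
    # with two columns, old_url and new_url (both names are illustrative).
    import csv

    import requests

    def audit_redirect(old_url, expected_url):
        resp = requests.get(old_url, allow_redirects=False, timeout=10)
        if resp.status_code != 301:
            return f"FAIL: {old_url} returned {resp.status_code}, not a 301"
        location = resp.headers.get("Location", "")
        if location != expected_url:
            return f"WARN: {old_url} redirects to {location}, expected {expected_url}"
        # A second request confirms the destination resolves without another
        # hop; redirect chains dilute link value and slow crawlers down.
        dest = requests.get(location, allow_redirects=False, timeout=10)
        if dest.status_code == 200:
            return f"OK: {old_url} -> {location}"
        return f"FAIL: destination {location} returned {dest.status_code}"

    with open("redirect_map.csv", newline="") as f:
        for row in csv.reader(f):
            if len(row) == 2:
                print(audit_redirect(row[0], row[1]))

Run against the full URL inventory before and after a changeover, a check like this catches blanket home-page redirects and multi-hop chains before searchers and crawlers encounter them.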

The costs of not having an effective archiving plan include lost search engine visibility (both temporary and long-term), diminished brand credibility, and considerable staff and outsourcing time and expense to repair the damage. Ultimately, these losses lead to lost prospects and lost sales.

Archiving Blog Content

(Image: a blog archive organized by date)

Many pre-formatted blog templates offer archives by date, but this feature is not enough for effective SEO and overall findability.

So let’s go back to my original situation. During the wave of enthusiasm for launching a blog for increased freshness, spidering, and (hopefully) increased search engine visibility, an archiving plan is never discussed.

And if archiving is mentioned? The answer is a typical brush-off: just use the pre-programmed archives in the blog software. End of archiving discussion.

Well, I can tell you that pre-programmed blog archives are not an effective way to archive content, because users/searchers generally do not discover or locate desired blog content by date.

They search for it by keywords using either a commercial Web search engine or a site search engine. They browse by categories and related content. Therefore, it is important for blog content to contain both parent-child and sibling-sibling links to related content.
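
As one illustration of what those links can look like, here is a small sketch of a build step that derives a parent-child link (post up to its category archive) and sibling-sibling links (other posts in the same category) from post metadata. The Post structure and URL scheme are assumptions for the example, not any particular blog platform's API:

    # Illustrative only: derive related-content links from post metadata.
    from collections import defaultdict
    from dataclasses import dataclass

    @dataclass
    class Post:
        url: str
        title: str
        category: str  # the post's parent in the hierarchy

    posts = [
        Post("/seo/archiving-plan", "Build an SEO Archiving Plan", "seo"),
        Post("/seo/redirect-maps", "Redirect Maps That Survive Migration", "seo"),
        Post("/usability/findability", "Findability Basics", "usability"),
    ]

    by_category = defaultdict(list)
    for post in posts:
        by_category[post.category].append(post)

    for post in posts:
        parent_link = f"/category/{post.category}/"           # parent-child
        siblings = [p.url for p in by_category[post.category]
                    if p.url != post.url]                     # sibling-sibling
        print(f"{post.url}: up to {parent_link}, across to {siblings}")

Because the links are computed from metadata rather than hand-placed, they survive in the template long after a post has scrolled off the home page.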

Too many blogs remain uncategorized or rely on tagged pages as a poor substitute for an effective information architecture.

(Note: Tagged pages on blogs typically lead to duplicate content delivery. Duplicate content delivery to search engines can result in fewer pages being indexed, important pages not being available to rank, and a compromised searcher experience.)
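
One common mitigation, sketched below, is to emit a noindex,follow robots meta tag on tag and date archives at template-render time, so crawlers still follow the links but do not index the near-duplicate listing pages. The page types here are illustrative, and whether this fits a given site is a judgment call:

    # Sketch: choose a robots meta tag per template type so near-duplicate
    # archive listings stay crawlable but out of the index.
    def robots_meta(page_type):
        if page_type in {"tag", "date_archive"}:
            return '<meta name="robots" content="noindex,follow">'
        return ""  # posts and category pages stay indexable

    for page_type in ("post", "category", "tag", "date_archive"):
        print(f"{page_type:13s} {robots_meta(page_type) or '(indexable)'}")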

Furthermore, blog content should not be written once and discarded. If you want your blog content to have long-term search engine visibility and grow stronger over time, archiving and categorizing are a necessary part of the SEO process.

“Posting valuable website content is not a one-time, attention-getting endeavour. It’s also about long-term value,” said Ezra Silverton, Website Architect at the Canadian-based 9th Sphere. “One key attribute to long-term content value is making it easily accessible to visitors and crawlers long after it’s posted.”

Think about this: every blog post you write is going to eventually disappear from the home page and top-level category pages. When those links disappear:

  • How are you making that content accessible to both searchers and search engines?
  • What parent-child links are available on blog template pages?
  • What sibling-sibling links are available on blog template pages?
  • If you cannot put these links in the template, how else are you accommodating natural finding behaviors?

If you didn’t have an archiving plan from the outset, imagine the amount of work it would take to implement these items and achieve the long-term benefits. Not having an archiving plan or strategy ultimately hurts all types of websites…not only blogs.

As SEO professionals, we understand that SEO should never be an afterthought during the site design or redesign process. SEO works best when it is addressed during the planning stages of website development. Archiving is no different. Archiving is an important part of the SEO planning process. It shouldn’t be an afterthought.

In other words, the answer to, “What is your SEO archiving plan?” should never be stunned silence.

Opinions expressed in the article are those of the guest author and not necessarily Search Engine Land.

About The Author: Shari Thurow is the Founder and SEO Director at Omni Marketing Interactive and the author of the books Search Engine Visibility and When Search Meets Web Usability. Shari currently serves on the Board of Directors of the Information Architecture Institute (IAI) and the ASLIB Journal of Information Management. She also served on the board of the User Experience Professionals Association (UXPA).


Comments
  • http://www.infatex.com Mikhail Tuknov

    I think you should list some backup solutions in your post!

  • http://www.search-usability.com/ Shari Thurow

    Hi Mikhail-

    There are minimal back-up solutions for a substandard information architecture. The best and only real solution is to fix the website, fix the architecture.

    Wayfinders and guides are one solution, if they are done effectively. But that is an entirely different article. I’ll write about that topic…thank you for inspiring it!

    Another solution is to go back into your blog or website and add related links, starting with your most important pages.

    As for 301s and other messes (as in Eric’s example), that’s difficult to fix. That solution is a slow one. People have to decide if they wish to implement the 301 redirects. If not, then they have to go back to the previous “linkers” and ask them to link to the revised URL.

  • http://saidulhassan.com SaidulHassan

    Well, I’m a low-profile SEO currently maintaining only 56 AU SMB sites. Keeping server backups and using HTTrack have always saved my day.

  • http://www.ericward.com Eric Ward

    Shari – I remember the marketing team called me, frantic, wondering what happened to all their traffic. I looked at their site and could see right away that the old ColdFusion URLs were now something different. They were baffled. It was one of the more memorable moments I’ve had, and a tough thing to explain: how IT’s change of content delivery platforms rendered every inbound link dead, and how it all could have been avoided if any of them had communicated with each other or with me before the changeover. Years of link equity gone in a blink.

  • http://seogrouch.wordpress.com S.G.

    SEOGrouch does not have an SEO archive plan because SEO is an unethical business practice. Design good websites. Create good content. Then, if your stuff don’t stink you’ll do okay. Game the system by having an archive plan? That’s unethical.

    His Majesty has written.

  • http://www.search-usability.com/ Shari Thurow

    Hello Grouch-

    Interesting. If you believe that “SEO is an unethical business,” then why give yourself a moniker with the abbreviation “SEO” in it?

    I think that it is a gross overgeneralization to consider archiving an “unethical” practice. The librarians and other archivists in the world are hardly “unethical” by any stretch of the imagination.

    I would challenge anyone to take me to task on my SEO ethics. :-)

    My 2 cents.

  • http://www.seotrainingnw.com ColleenWright

    I had a similar situation arise when the hosting company was not backing up the database for a blog and it became corrupted. Not only did we lose the correct URLs, we lost the actual data because they had no good archiving of data or backups to speak of.

    One way I was able to recreate what was lost was to quickly capture the data using the “site:” operator in Google. I was able to retrieve much of the lost data before the pages were re-indexed and the information was lost forever. This approach works, but it is probably better suited to smaller sites because capturing the lost data takes a lot of time. Also, what about the Wayback Machine http://www.archive.org/web/web.php? I was able to capture lost data for a client that way as well.
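
    For anyone who wants to script those Wayback lookups, here is a minimal sketch against the Internet Archive’s public availability API, using Python and the requests library (the URL list is just a placeholder):

        # Query the Wayback Machine's availability API for the closest
        # snapshot of each lost URL. The list below is a placeholder.
        import requests

        lost_urls = ["http://www.example.com/blog/old-post"]

        for url in lost_urls:
            resp = requests.get("https://archive.org/wayback/available",
                                params={"url": url}, timeout=10)
            closest = resp.json().get("archived_snapshots", {}).get("closest")
            if closest and closest.get("available"):
                print(url, "->", closest["url"])
            else:
                print(url, "-> no snapshot found")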

    Thank you for helping me think about this more; it is definitely a critical component!

 
