A Practical Guide To Information Architecture Changes

A change in information architecture (IA) can make or break your in-house SEO program. A successful IA makeover can open up a window to previously unimagined search engine domination, or it can see years of hard SEO work evaporate in the fluttering of a URL. Despite the complexity of IA changes, by following some practical guidelines throughout the process you will maximize the potential SEO value of a well-executed IA changeover and avoid catastrophe.

A practical approach to information architecture changes

Information architecture is the conceptual model around which a website is built, and forms the structural foundation by which a search engine assesses individual resources (chiefly pages), and the relationship between them. Accordingly, a change to this model has enormous ramifications for how a site is indexed, ranked and otherwise represented by the search engines.

Perhaps it is SEO that led the charge for an IA changeover in the first place – this is not uncommon, as a poor IA is one of the biggest structural impediments for SEO success. Conversely, the change may have been initiated by a different company unit for reasons wholly unrelated to search, such as the propagation of a new ecommerce platform. In either case, the in-house SEO has a role in both the architectural scheme itself, and the steps by which that blueprint is developed and transformed into reality.

This guide focuses on the practical steps involved in the planning and rollout process, rather than the design of the revised site architecture itself. Whether or not you have put together that strategic plan yourself, you will get the best results from an IA change if you take an active role in that plan’s execution.

Come early, stay late

It is essential that you make your presence felt at every juncture in the IA changeover process. From planning to code development to testing, you will end up with the best possible site architecture for findability and high search engine rankings if (and only if) you are an active advocate for SEO at every stage.

Not to put too fine a point on it, but never trust anyone else to adequately account for the needs of SEO in an IA revamp. Even when you have provided explicit implementation guidelines, be aware that these guidelines may be subject to interpretation or, not uncommonly, misinterpretation. Stick your nose into the process and keep it there.

Make an SEO requirements checklist

Make a comprehensive checklist of SEO requirements for the new architecture to ensure you have anticipated SEO needs. This is true even if the SEO department has initiated the changeover; it is still possible to overlook some nuts-and-bolts issues when your focus has been chiefly on how pages are arranged and interconnected.

If you have a team, conduct at least one comprehensive brainstorming session to help come up with a list of items to be addressed. In the absence of a dedicated SEO team, augment your initial checklist by extensive reading of blogs, forums and online resources, and tap into the expertise of the broader SEO community when you can.

At a minimum, your checklist should address the following:

  • File and folder naming conventions. Ensure your folder and file names are search-engine friendly, consisting of semantically meaningful keywords, with dashes used as word separators. Avoid deeply nested folders, and try as much as possible to create a sensibly hierarchical folder structure.
  • Singular URIs. Any website resource, including but not limited to web pages, should only resolve under a single URI. This can be particularly challenging in some ecommerce platforms, where, in the absence of your due diligence, product pages can end up living under multiple URLs based on their parent categories. However difficult, any potential sources of duplicate content must be neutralized to produce singular URLs. While you should not rely on the canonical tag to solve duplicate content issues unless absolutely necessary, do plan how you will deploy it (a spot-check sketch follows this list).
  • Parameter handling. Adding a parameter is often the path of least resistance for the tracking of internal and external referrers, but can lead to duplicate content, dilute the effectiveness of your inbound links and – ironically – cause problems with tracking pages in analytics.  Your working motto should be “parameters if necessary, but not necessarily parameters.”
  • Sitemap support. You will have an easier time of producing sitemaps if they are part of the information architecture plan, rather than tacked on after the fact. For any serious IA effort, a site map of the new structure will already have been produced during planning, and it can aid in the construction of both XML and onsite sitemaps. For sites that appear in Google News, or have substantial video content, account for the creation of Google News and Video sitemaps respectively.
  • RSS. If you have content that lends itself to syndication, account for the location, format and automated updating of RSS feeds.
  • Scalability. What happens when a new product, landing page, video, upper-level category page or any other class of resource is required down the line? Often an IA revamp is required precisely because the existing architecture cannot sufficiently handle new site elements. As much as possible, try to anticipate what type of content might end up on your site in the future, and ensure that the IA is capable of integrating it.
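
To make the singular-URI and parameter-handling items above concrete, here is a minimal spot-check sketch. It is an illustration only: it assumes the Python requests and beautifulsoup4 packages, and the example.com URLs are hypothetical. The idea is that every variant should either redirect to the canonical URL or declare it in a rel=canonical link element.

    # Hypothetical singular-URL spot-check (URLs and dependencies are assumptions,
    # not drawn from this article).
    import requests
    from bs4 import BeautifulSoup

    CANONICAL = "https://www.example.com/widgets/blue-widget"

    # Variants that should all collapse to the canonical URL, either by
    # redirecting to it or by declaring it in a rel=canonical link element.
    VARIANTS = [
        "https://www.example.com/widgets/blue-widget/",
        "https://www.example.com/Widgets/Blue-Widget",
        "https://www.example.com/widgets/blue-widget?ref=homepage",
        "https://www.example.com/category/widgets/blue-widget",
    ]

    for url in VARIANTS:
        resp = requests.get(url, allow_redirects=True, timeout=10)
        canonical_tag = None
        if resp.ok:
            link = BeautifulSoup(resp.text, "html.parser").find("link", rel="canonical")
            canonical_tag = link.get("href") if link else None
        ok = resp.url == CANONICAL or canonical_tag == CANONICAL
        print(f"{url} -> final: {resp.url}, canonical: {canonical_tag}, ok: {ok}")

Run against a development build, a script like this surfaces duplicate-URL problems long before the search engines find them.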

Search friendly content management systems

Is a new content management system (CMS) part of the IA makeover? Indeed, is the introduction of a CMS or change of CMS the reason for the IA change? As an SEO, be afraid. Be very afraid. Actually, some content management systems, particularly publishing platforms such as WordPress, can be very helpful for SEO. But, by and large, CMS platforms have traditionally been designed for task execution and database interactivity, with something of a blind eye toward SEO. This is particularly true of enterprise content management systems designed for large product or document catalogues.

Where CMS selection is part of the IA makeover process, ensure you are there at the vetting stage. Look for a CMS that, above all, shows flexibility in how content is output. I have never met a CMS vendor who has not made magnificent claims about how their product supports SEO, however much the reality differs from those claims. Due diligence here entails not only reading the specification sheets, but looking at real-life implementations of that CMS and, if at all possible, spending some time playing in a functional demonstration environment.

Once a CMS has been selected, and armed with your comprehensive list of SEO requirements, sit with your developers and determine what custom modifications are going to have to be made to the CMS to support SEO. This is important because you will need to account for that development time in the IA roadmap.  Those modifications will have to be made. To cite just one example, I worked with an enterprise CMS that happily returned a 200 response header for any file extension, as long as the root file name existed. Modifying that system so invalid file names either redirected or returned a 404 was essential to avoid duplicate content issues.
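
A minimal sketch of that kind of check, assuming the Python requests package and entirely made-up URLs, is to probe paths that should not exist and flag any that come back with a 200 instead of a 404 or a redirect:

    # Hedged example: probe invalid paths and flag "soft 404" responses.
    # The base URL and paths are hypothetical; requests is an assumed dependency.
    import requests

    BASE = "https://www.example.com"
    PROBES = [
        "/products/blue-widget.foo",        # bogus extension on a valid root file name
        "/products/blue-widget.html.bak",   # mangled extension
        "/products/this-page-does-not-exist",
    ]

    for path in PROBES:
        resp = requests.get(BASE + path, allow_redirects=False, timeout=10)
        if resp.status_code == 200:
            print(f"SOFT 404? {path} returned 200 - likely duplicate content risk")
        else:
            print(f"OK: {path} returned {resp.status_code}")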

Hopefully you have both a good working relationship with your developers and a strong personality, because you must actively resist taking “no” for an answer. Two common responses to requests for modifications will be that the system was not designed to work that way, and that what you are requesting cannot be done. To the first point – no, indeed, if the CMS was designed to be SEO friendly the modification would not be necessary; if a hack is required and it does not adversely impact the actual integrity of the system, then a hack it is. To the second point, computer systems can almost always be manipulated to do what you need them to do, and it is more often a question of development time and costs than of a task being flat-out impossible. In light of resource constraints, prioritize your requests, but be aware that this is the CMS you are going to have to live with.

Get access to the development environment and use it

Once one site component has been built on the back of another, changing how that first component functions is difficult, if not practically impossible. The best way to avoid such situations is to monitor the information architecture as it is being put together by viewing it in the development environment. Ideally, you will also want to create test pages in the authoring environment to uncover process or output issues you may not have anticipated.

There may be stages in the process where there is no meaningful output you can check, but work is moving forward. For those phases, arrange to sit with your developers so they can walk you through the latest developments. As with any project that impacts SEO, ongoing communication with key stakeholders is important.

Make an SEO migration checklist

A new site architecture, by definition, will result in existing site elements being either changed or removed altogether. You need to anticipate and enumerate what these changes will be to provide the search engines with a smooth transition from the old site to the new.

Both in the development environment and once the revised IA has gone live, use the migration checklist you have developed to ensure the new site functions as anticipated. Augment the efforts of a quality assurance and testing team, where one exists: even with your checklist in hand, they may be blind to unanticipated SEO issues that only you are able to identify by testing.

  • Redirects. In all probability, existing URLs on your site will change, and you need to create a redirect list to ensure old URLs are correctly redirected to their new locations. In many cases this will require the creation of regular expressions. Randomly test old URLs from various locations across the site to ensure you have not overlooked anything (a verification sketch follows this list).
  • Response headers. You will, of course, want to test redirects to ensure they are using the correct redirect logic: the preferred behavior will almost always be a 301 (permanent) redirect, and you want to be on the lookout for 302 (temporary) redirects used in error. However, you will also want to test the response headers of new pages to ensure they return a 200 response, and invalid file names to ensure they return a 404. To test for proper 404 behavior, try various combinations based on valid file and folder names: many content management systems will return a blank page with a 200 rather than a 404.
  • 404 and other error pages. Your existing “file not found” page, as well as other pages for server or authentication errors, may not appear or function as they did before. Test different error conditions to ensure correct behavior.
  • Webmaster tools. Once your site is live, log in to the webmaster consoles of the different search engines and make any required changes there, such as stipulating new sitemap locations or revised parameter handling. You will also need to ensure your selected validation method is still valid and functioning. Sometime after your new IA has been indexed, you may also need to block sitelinks that Google has extracted from the new architecture. And, of course, you will want to pay close attention to diagnostic information that is returned, such as crawl errors.
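
For the redirect and response-header items above, a simple script can do much of the legwork. The sketch below is illustrative only: it assumes the Python requests package and a hypothetical redirects.csv file with old_url,new_url columns (no header row), and it verifies that each old URL issues a 301 and that the chain ends at the expected new URL with a 200.

    # Illustrative redirect verification (redirects.csv and the requests
    # dependency are assumptions made for this example).
    import csv
    import requests

    with open("redirects.csv", newline="") as f:
        for old_url, new_url in csv.reader(f):
            resp = requests.get(old_url, allow_redirects=True, timeout=10)
            first_hop = resp.history[0].status_code if resp.history else None
            problems = []
            if first_hop != 301:
                problems.append(f"first hop {first_hop}, expected 301")
            if resp.url != new_url:
                problems.append(f"landed on {resp.url}, expected {new_url}")
            if resp.status_code != 200:
                problems.append(f"final status {resp.status_code}, expected 200")
            print(f"{old_url}: {'OK' if not problems else '; '.join(problems)}")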

Correctly executed, a new information architecture can breathe fresh life into your SEO efforts. Perhaps more so than with any other project that impacts SEO, take all the time and effort required, even if it means a few sleepless nights, to derive the maximum benefit from an IA makeover and avoid disaster.


About the author

Aaron Bradley
Contributor
Aaron Bradley is an SEO consultant specializing in organic search, and writes on search issues at his blog SEO Skeptic. He has worked in SEO since 2005, following ten years as a website designer.
