Creating And Maintaining An SEO Timeline
Nothing is more certain in the world of SEO than the fact that things will change over time. Rankings, search traffic, search algorithms and your website’s structure and content are all mutable. In order to maximize the value of your optimization efforts, it is important to know not only the nature of any given change, but when it occurred. At a fundamental level, it is impossible to establish a relationship between cause and effect without knowing the sequence of events.
To this end, it is useful for SEOs to plot relevant events and data against a timeline. This is especially true for in-house SEO efforts, as such efforts usually pertain to large, complex websites where numerous factors may influence a site’s search performance over time. Such a method is also most likely to be successful in an in-house environment, where access to information is easier and there is a better chance at long-term continuity of data collection.
Barring a data storage catastrophe, there are some historical metrics (such as traffic from specific keywords) that you should always be able to access. Other data points such as rankings, however, are bound to specific moments in time. And further factors that may impact your search efforts, such as changes to your site, may be omitted from an analysis if they slip your memory – or reside, unrecorded, only in the memory of a predecessor. Knowing which sort of information to record is the first step in developing a useful SEO timeline.
Some classes of data are inherently temporal: they change over time, and (obviously) you will not be able to use them for analysis unless you have been recording them on an ongoing basis.
Search engine rankings
However much rankings may have fallen out of vogue as a measure of organic search engine optimization success, they are invaluable for assessing the impact of on- and off-site factors on your site’s search engine performance. Monitoring keyword traffic is all well and good, but when that traffic changes it is useful to know, even using representative samples, which keyword rankings were impacted, and to what degree.
Typically, rankings are recorded for keywords that are important to a site – keywords for which you do, can, should or want to rank. Historical rankings for core keywords are very helpful, but there are other types of keywords you should consider tracking, and you should also ensure you are using the best methodology for tracking rankings.
- Ensure you are tracking lower-ranked keywords. A change from position 100 to position 50 may not make any difference whatsoever in your traffic, but can tell you a lot about the impact of your optimization efforts or an algorithm change.
- Include a number of relatively obscure long-tail keywords with stable rankings in your keyword basket. This can be helpful in determining whether traffic changes are due to a shift in the competitive environment (where these are likely to be unaffected) or a change in a search engine algorithm.
- Be consistent in how you conduct your queries. Searches should be conducted from the same location (ideally that of your most important target market) and with personalization disabled (in the case of Google, by appending the &pws=0 parameter to your query).
- Record representative rankings on a regular basis. Widely separated, ad hoc reports may not enable you to correlate ranking shifts with on-site or search engine changes.
- In circumstances where it is important to your business model, include seasonal terms in your keyword basket year-round. Knowing your ranking for a Christmas term in June can be very useful, particularly if you have year-over-year data (where, for example, you can correlate historical data to predict your performance in six months – and have time to build on opportunities or address potential problems accordingly).
It is obviously most efficient to use specialist software to record rankings. In the absence of an automated program you can, of course, record rankings manually – though you will have to work with a relatively small keyword basket. Google Webmaster Tools also now allows you to view and download average keyword position for different markets – the proviso here being that this report may not include all keywords which are of interest to you, especially those for which you rank poorly.
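In the absence of specialist software, the running ranking log described above can be as simple as a dated CSV append. The sketch below illustrates the idea; the file name, keywords and positions are all hypothetical, and how you obtain the positions (rank-tracking software, manual checks, Webmaster Tools exports) is left to your own tooling:

```python
import csv
from datetime import date
from pathlib import Path

def record_rankings(rankings, path="rankings.csv"):
    """Append today's (keyword, position) pairs to a running CSV log."""
    new_file = not Path(path).exists()
    with open(path, "a", newline="") as f:
        writer = csv.writer(f)
        if new_file:
            # Write the header only on first use
            writer.writerow(["date", "keyword", "position"])
        today = date.today().isoformat()
        for keyword, position in sorted(rankings.items()):
            writer.writerow([today, keyword, position])

# Example run with invented keywords and positions
record_rankings({"blue widgets": 7, "cheap blue widgets": 42})
```

Run on a regular schedule, this accumulates exactly the kind of widely comparable, consistently gathered ranking history the bullet points above call for.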
Internal and external links
Another temporal metric you will want to track is the number and nature of links in play on your site, both internal and external. Again, if you do not record link data on a regular basis, you will not be able to use link information in assessing search traffic changes.
Google Webmaster Tools has evolved to become the best free, reliable source of link data at an SEO’s disposal. Webmaster Tools will provide you with raw link counts, both internal and external, which will enable you to record and graph linkage patterns over time. Webmaster Tools also allows you to export detailed link data as a CSV, and it is useful to regularly download these reports for future reference.
You may be able to tell whether the number of links pointing to your site is rising or falling by recording link counts, but without the detailed link data, you will be unable to answer questions such as whether the loss or gain of specific, high-value links has played a role in changes to search traffic.
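Answering that question from two successive link downloads is a simple set comparison. A minimal sketch, with invented URLs standing in for the linking pages you would read out of your exported reports:

```python
def diff_link_exports(old_links, new_links):
    """Compare two lists of linking URLs (e.g. from successive
    Webmaster Tools CSV downloads) and report gains and losses."""
    old, new = set(old_links), set(new_links)
    return {"gained": sorted(new - old), "lost": sorted(old - new)}

# Hypothetical link exports from two different months
march = ["http://a.example/post", "http://b.example/page"]
april = ["http://b.example/page", "http://c.example/article"]
changes = diff_link_exports(march, april)
# changes["lost"] contains the link that disappeared between exports
```

Paired with the dates of the two exports, a lost high-value link surfaces immediately as a candidate explanation for a traffic dip.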
For as long as Yahoo! continues to support the tool, Site Explorer will provide you with link data similar to that of Webmaster Tools, although you may only download the top 1,000 links. Details aside, comparing and contrasting even the basic number of inbound links between Webmaster Tools and Site Explorer can help you determine whether changes in your inbound link environment are search engine specific, or a result of changes to your site or the sites that link to you. A number of commercial utilities, such as SEOmoz’s Open Site Explorer or Raven’s Link Manager, can also provide you with valuable link data if you have the budget.
Pages in index
A final piece of temporal information that is important to record is the number of pages that appear in the index of each search engine. The number of indexed pages is easily retrieved from each of the major search engines using the site: command. While the numbers returned by the site: command can fluctuate wildly and are, in general, not wholly reliable, they are nonetheless an important source of trending data for the breadth and speed of indexing on your site.
You may also find it useful to record the in-index counts of specific classes of pages if your URL structure supports this (e.g. site:example.com/products/ and site:example.com/blog/). Google Webmaster Tools will additionally display the number of pages in their index against the pages submitted for any given sitemap.
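Those per-section counts are another metric worth logging on a schedule. A minimal sketch, where the sections and counts are illustrative only and would be read off manually from each engine’s site: results (or from Webmaster Tools):

```python
from datetime import date

def record_index_counts(counts, log):
    """Append today's per-section indexed-page counts to a running log."""
    log.append({"date": date.today().isoformat(), **counts})
    return log

# Hypothetical site: query results for a site and two of its sections
log = []
record_index_counts({"site:example.com": 5400,
                     "site:example.com/products/": 3100,
                     "site:example.com/blog/": 950}, log)
```

Because the raw site: numbers fluctuate, it is the trend across these dated snapshots, not any single reading, that carries the signal.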
Site changes
When did we deploy that new section? When did the changes to the related items widget go live? When did we put in those redirects for the old blog tags? Mostly (but not always) the answer to such questions is retrievable, but it often takes a lot of digging or email inquiries to find out; recording changes to your site of any magnitude is far more efficient. And in the absence of a record of such changes, you may also be unable to correlate the impact of site changes on search traffic because you have forgotten about those changes altogether.
A simple, linear record of site changes can go a long way in helping you disentangle what impact such changes may or may not have had on your site performance. For the in-house SEO, this means staying on top of changes initiated by different departments – but it is just such corporate interaction and resulting intelligence which is one of the benefits of having an in-house SEO program. Needless to say, when site changes are undertaken specifically for SEO purposes, knowing when these changes occurred is critical in evaluating the effectiveness of any given on-site optimization technique.
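Such a linear change log also lends itself to simple queries, such as listing every recorded change in the weeks preceding a traffic shift. A sketch with invented dates and entries:

```python
from datetime import date, timedelta

# A simple linear change log: one dated entry per site change
# (the entries below are hypothetical examples)
change_log = [
    (date(2010, 3, 2), "Launched new /products/ section"),
    (date(2010, 3, 18), "301-redirected old blog tag URLs"),
    (date(2010, 4, 5), "Reworked related-items widget markup"),
]

def changes_near(event_date, window_days=14):
    """List logged site changes in the window before a traffic shift."""
    cutoff = event_date - timedelta(days=window_days)
    return [entry for d, entry in change_log if cutoff <= d <= event_date]

# Traffic dipped around April 10th; which recent changes are suspects?
suspects = changes_near(date(2010, 4, 10))
```

The same lookup works in reverse: after deploying an SEO-motivated change, you can check which other departmental changes landed in the same window before crediting (or blaming) your own work.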
Marketing campaigns
Whether or not they entail on-site changes, marketing campaigns can have a large impact on search, affecting metrics such as the creation of inbound links and the number of brand-related queries. Knowing what these initiatives are, when they launched and when they concluded can be valuable information for the search marketer.
In this realm, the intersection between SEO and social media initiatives is becoming increasingly important. The launch of a Facebook page or Twitter account may have both a direct and an indirect impact on your search traffic. Knowing and recording the status of social media efforts may help you correlate, even if indirectly, the still-fuzzy relationship between traction in social media and search-derived traffic.
Known search engine changes
Obviously, among the most critical factors that impact a site’s search engine traffic are changes made by the search engines themselves. Keeping track of these changes and plotting them against a timeline can help answer one of the most common questions asked by SEOs in response to shifts in rankings or traffic: is it something I did, or something they did?
Google, by its own admission, makes on average one or more changes to its search algorithm every day. It is difficult to even identify most of these changes, let alone know precisely when they occurred. However, on occasion the search engines themselves acknowledge a change to their ranking algorithm or indexing protocol – such as the completion of Google’s “Caffeine” indexing system – and these key occurrences can and should be plotted on your SEO timeline. SEO analysts, such as Barry Schwartz, may also aggregate webmaster reports of major shifts in rankings, and these can also be used as touch points when you regard the reports as both reliable and significant.
There are other changes in the search engine environment that are both evident to users, and typically confirmed (and even heralded) by the search engines. These can include changes to the layout of search result pages, inclusion of new verticals or other modifications to the search environment. Again, recording these as they occur will save you the effort of hunting down inception dates at a later time, or obviate the risk of overlooking these changes as you try to correlate multiple sources of data in analyzing search traffic and ranking patterns.
Putting it all together
The more you can consolidate different types of time-based information in one place, the better you will be able to see patterns and make associations between different data points. Most high-level metrics can be consolidated in a multi-sheet Excel workbook, with the notable exception of granular ranking or traffic information. Where such a utility exists (as in Google Analytics), making annotations that mark important site and search engine changes against an analytics timeline can often be very revealing.
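At its simplest, consolidation means merging the dated entries from each log into one chronological view. A sketch, with a few invented entries standing in for your ranking, site-change and search-engine logs:

```python
from datetime import date

# Hypothetical entries from three separate logs, each a (date, event) pair
rankings = [(date(2010, 4, 5), "rank: 'blue widgets' moved 12 -> 7")]
changes = [(date(2010, 4, 3), "site: reworked product page titles")]
engine = [(date(2010, 4, 1), "engine: Google layout change reported")]

# Merge and sort into a single chronological timeline
timeline = sorted(rankings + changes + engine)
for d, event in timeline:
    print(d.isoformat(), event)
```

Even in this toy form, reading the merged sequence makes the candidate cause-and-effect chain (engine change, then site change, then ranking movement) easy to spot.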
However you store and display data in your SEO timeline, the most important factor is ongoing and regular recording, especially for those temporal metrics such as rankings or pages in index that will not be retrievable at a later date. Whether the task you face is how to capitalize on an unanticipated search success, or a forensic investigation of a search traffic decline, a well-maintained SEO timeline can be one of the most important tools in your analytical arsenal.