Cleaning Up the Retail Site Navigation Mess
In recent years, many retailers have implemented powerful attribute-based (a.k.a. faceted) navigation systems that make it easier for customers to filter, sort, navigate, and buy from a retailer’s inventory. Most of Netconcepts’ clientele who have implemented these technologies report good results: increased conversions and sales. From the natural search perspective, many such marketers are pleasantly surprised to find increases in the number of their pages indexed by Google upon implementing such systems. But in this case, what’s good for users is actually not so good for search engines, or for your bottom line.
Recently, Google Webmaster Central took a position on this, pronouncing the “infinite filtering” and “resulting page duplication” produced by such guided navigation systems as negative for bots, and urging that the “blech” of duplicated pages be cleaned up:
“Another common scenario is websites which provide for filtering a set of search results in many ways. A shopping site might allow for finding clothing items by filtering on category, price, color, brand, style, etc. The number of possible combinations of filters can grow exponentially. This can produce thousands of URLs, all finding some subset of the items sold. This may be convenient for your users, but is not so helpful for the Googlebot, which just wants to find everything – once!
“…One fix is to eliminate whole categories of dynamically generated links using your robots.txt file… Another option is to block those problematic links with a “nofollow” link attribute.”
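In practice, Google’s first suggested fix might look like the robots.txt sketch below. The parameter names (`color`, `price`, `sort`) are purely illustrative; you would substitute whatever filter parameters your own navigation system appends to URLs:

```
# robots.txt — keep bots out of filtered/sorted URL variations
# (assumes filters arrive as query-string parameters; adjust to your URL scheme)
User-agent: *
Disallow: /*?*color=
Disallow: /*?*price=
Disallow: /*?*sort=
```

The second fix is applied per link rather than per pattern: each filter link in the navigation gets a `rel="nofollow"` attribute, e.g. `<a href="/dresses?color=red" rel="nofollow">Red</a>`, so bots following the category page don’t pour into the filtered variations.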
While useful for consumers, the attribute filters also produce unique URL parameters based on elements like product colors, sizes, price, popularity, the number of results per page, and the pagination construct itself, resulting in dozens or hundreds of unique crawlable URL permutations. Yet the content at these unique URLs is nearly identical in all respects. And duplicate content of this magnitude hurts your natural search marketing performance.
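To see how quickly the permutations pile up, here is a small sketch that enumerates every URL a single category page could emit under a handful of hypothetical filters (the filter names and values are invented for illustration; real faceted systems often have far more dimensions):

```python
from itertools import product

# Hypothetical filter dimensions on one category page.
# An empty string means "filter not applied".
filters = {
    "color":    ["", "red", "blue", "green"],
    "price":    ["", "0-50", "50-100"],
    "sort":     ["", "popularity", "price-asc"],
    "per_page": ["", "24", "48"],
}

def filtered_urls(base):
    """Enumerate every crawlable URL this faceted page can produce."""
    urls = []
    for combo in product(*filters.values()):
        params = [f"{k}={v}" for k, v in zip(filters, combo) if v]
        urls.append(base + ("?" + "&".join(params) if params else ""))
    return urls

urls = filtered_urls("/womens/dresses")
print(len(urls))  # 4 * 3 * 3 * 3 = 108 URLs for one set of products
```

Four modest filter dimensions already yield 108 distinct crawlable URLs for one assortment of products; add brand, size, and style facets and the count explodes into the thousands Google describes.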
Here’s an example from the Artful Home brand: Compare this page to this page. Unique URL addresses, but the same content. Click on the “price range” or “color” or “medium” attributes in the left navigation and you’ll get even more URLs with the same content. In fact, Google has indexed 20 URL permutations containing the exact same content for this one page. Same title tags, headings, body copy, and more.
Below are five natural search marketing performance issues introduced by attribute-based retail navigation models (as typically implemented):
- You end up creating self-competition between your site’s own pages. (If you were Googlebot, which of these do you consider the authority page?)
- Your pages don’t resonate with searchers. (People seldom include attributes like price in their search queries.)
- You bloat the engine index. (You don’t have as many pages indexed as you’re bragging about to your CEO.)
- You burn your “crawl equity.” (More of your unimportant pages get crawled with each visit a bot makes.)
- You fragment your available link popularity (PageRank) between so many different versions of the same/similar pages.
To deal with the situation, merchants with attribute-based navigation systems need more sophisticated strategies and execution capabilities.
For example, the on-page attributes that create the navigation scheme should be researched during the design phase to ensure your resulting landing page themes are consistent with searcher vocabulary. Filtering options that are not beneficial to your natural search performance should be similarly evaluated, and tactics put in place that make any “false” landing pages uncrawlable for bots. A combination of nofollow link attributes, meta noindex tags, and robots.txt disallow rules should be employed for this. This is easier said than done, as most merchants require IT resources to modify the site architecture in this way.
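The meta noindex piece of that combination might look like the fragment below, placed in the `<head>` of a “false” landing page (the URL is hypothetical). The `follow` directive is a common pairing: it keeps the duplicate page out of the index while still letting bots pass through to the links on it:

```html
<!-- On a filtered variation such as /dresses?color=red&sort=price-asc -->
<!-- Keep the page usable for shoppers, but keep it out of the index -->
<meta name="robots" content="noindex, follow">
```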
(If you’re in this boat and expending any serious amount of IT effort is not an option, you may wish to consider a proxy site technology such as GravityStream to generate and apply such rules-based changes simply and automatically across your large-scale site. (Disclaimer: I manage the GravityStream product at Netconcepts.) One benefit of this approach is that you can, without involving your IT department, automatically resolve the many parameter-filled URL permutations created by such navigation systems into singular, authoritative versions of each category, subcategory, and product level URL — a form of “canonicalization” via intelligent redirection. By eliminating page duplication in this fashion, you maximize the distribution of your available link popularity to the greatest number of unique landing pages.)
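The core of that rules-based canonicalization can be sketched in a few lines: strip the filter and sort parameters from an incoming URL to derive the single authoritative address, then 301-redirect any non-canonical request to it. This is only an illustration of the idea, not GravityStream’s actual implementation; the parameter list is a hypothetical stand-in for your own filter names:

```python
from urllib.parse import urlsplit, parse_qsl, urlencode, urlunsplit

# Illustrative set of filter/sort parameters that should never define
# a distinct indexable page.
FILTER_PARAMS = {"color", "price", "sort", "per_page", "page"}

def canonical_url(url):
    """Strip filter parameters to yield the single authoritative URL."""
    scheme, netloc, path, query, _fragment = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(query) if k not in FILTER_PARAMS]
    return urlunsplit((scheme, netloc, path, urlencode(kept), ""))

print(canonical_url("https://example.com/dresses?color=red&sort=price-asc"))
# https://example.com/dresses
```

A request whose URL differs from its `canonical_url` would get a 301 redirect to the canonical version, so link popularity accumulating on the many permutations is consolidated onto one page.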
Google’s position on this matter may seem to contradict their Webmaster Quality Guideline around “making your site for users not search engines.” After all, attribute-based navigation systems do offer many great user benefits. But they will unquestionably dilute your natural search marketing performance if not engineered with search engines in mind. Today’s savvy merchant is already implementing sophisticated strategies to capture the best of both worlds.
Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land.