Supercharge Your URLs For Maximum SEO Impact

When optimizing URLs for high rankings, little attention is given to optimizing the URL for maximum clickthrough. Yet the URL undeniably affects searcher clickthrough rates in the SERPs (Search Engine Results Pages), as demonstrated by MarketingSherpa in their eyetracking study published in the 2008 Search Marketing Benchmark Guide.

Specifically, MarketingSherpa found that short URLs get clicked on twice as often as long URLs (given equal ranking position). As you can see in the heatmaps below, experiment participants spent more time viewing the long URL, but less time viewing the listing as a whole. You could conclude from this that the long URL distracts the searcher from the listing’s title and description. Not a great outcome.


[Image: MarketingSherpa eyetracking heatmaps showing the impact of long URL length on listing viewing. Used with permission.]

Worse yet, a long URL appears to act as a deterrent to clicking, drawing attention away from its own listing and toward the listing below it, which then gets clicked 2.5 times more frequently. It’s open for debate, of course, as to what counts as a “short” URL or a “long” URL. But this is the first data I’ve seen that attempts to quantify the affinity searchers have for the URL component of natural search listings.

For us, these MarketingSherpa findings confirm that success at SEO still requires more than just Google Sitemaps, and that an unoptimized URL is money left on the table. Just because algorithms have evolved to handle dynamic URLs with multiple parameters, avoid session-based spider traps, and even fill out forms on occasion doesn’t mean we should be lulled into a false sense of security that our URLs are “good enough” and don’t need work. You should be on an unending mission to find and act on opportunities to test and optimize URLs for both rankings and clickthrough.

So even though URLs you’d never have dreamed of getting indexed a few years ago are now regularly making it into the index, this doesn’t mean that suboptimal URLs will rank well or convert searchers into clickers. Here at Netconcepts, we’ve conducted countless tests using our GravityStream platform, proving to ourselves and to our clients that optimized URLs consistently outperform unoptimized ones. Given that, here are some general best practices for URLs that we believe hold true (a rough sanity-check sketch follows the list):

  • The fewer the parameters in your dynamic URL, the better. One or two parameters is much better than seven or eight. Avoid superfluous/nonessential parameters like tracking codes.
  • A static-looking URL (containing no ampersands, equals signs, or question marks) is more search-optimal than a dynamic-looking one.
  • Having keywords in the URL is better than having none.
  • A keyword in the filename portion of the URL is more beneficial than in a directory/subdirectory name.
  • Hyphens are the preferred word separator, although underscores have gained more acceptance than in times past. So if you have multiple-word keyword phrases in your URLs, I’d recommend using hyphens to separate the words.
  • Stuffing too many keywords in the URL looks spammy. Three, four, or five words in a URL looks perfectly normal. A little longer and it starts to look worse to Google, according to Matt Cutts.
  • The domain name is not a good place for multiple hyphens, as they can make your URL look spammy. That said, sometimes a domain name should have a hyphen, as the domain faux pas ‘arsecommerce.com’ demonstrates (you may not get this joke if you don’t recognize Queen’s English!).

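To make these heuristics easier to apply, here’s a minimal sketch in Python. The `url_warnings` helper and its thresholds are my own illustrative assumptions drawn from the list above, not rules published by Google or MarketingSherpa.

```python
from urllib.parse import urlparse, parse_qs

# Illustrative thresholds based on the best practices above.
MAX_PARAMS = 2      # "one or two parameters is much better than seven or eight"
MAX_KEYWORDS = 5    # "three, four, or five words in a URL looks perfectly normal"

def url_warnings(url: str) -> list:
    """Return a list of warnings for a URL that strays from the practices above."""
    warnings = []
    parsed = urlparse(url)

    params = parse_qs(parsed.query)
    if len(params) > MAX_PARAMS:
        warnings.append(f"{len(params)} query parameters; fewer is better")
    if any(ch in url for ch in "?&="):
        warnings.append("dynamic-looking URL; a static-looking one is preferable")

    # Count separator-delimited words in the last path segment (the "filename").
    filename = parsed.path.rstrip("/").rsplit("/", 1)[-1]
    words = [w for w in filename.replace("_", "-").split("-") if w]
    if len(words) > MAX_KEYWORDS:
        warnings.append(f"{len(words)} words in the filename may look spammy")
    if "_" in filename:
        warnings.append("underscores used; hyphens are the preferred separator")
    if parsed.hostname and parsed.hostname.count("-") > 1:
        warnings.append("multiple hyphens in the domain can look spammy")

    return warnings

print(url_warnings("http://example.com/store/item.php?id=123&cat=7&sess=abc"))
print(url_warnings("http://example.com/blue-widgets"))  # -> []
```
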
Given the above, it’s absolutely worthwhile to rewrite your dynamic URLs to make them appear static and to include keyword phrases with hyphens separating the words (done within reason). So a targeted search term of “blue widgets” would be represented as “blue-widgets” in the URL. Bare spaces cannot be used in URLs, so if you want a “white space” character you have to use either the + (plus sign) or the character encoding for a space, %20. I’m not a fan of the character-encoded version, as it’s not quite as pretty: “blue%20widgets”.
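
As a quick illustration, here’s one way to generate such a slug in Python. The `slugify` helper is a generic sketch, not code from any particular platform:

```python
import re

def slugify(phrase: str) -> str:
    """Turn a keyword phrase into a hyphen-separated URL slug."""
    slug = phrase.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # collapse non-alphanumerics into hyphens
    return slug.strip("-")

print(slugify("Blue Widgets"))  # -> blue-widgets
```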

The above best practices are generally accepted. It gets a lot more contentious when talking about stability/permanence of your URLs. The general school of thought is that stable is better. In other words, decide on an optimal URL for a page and stick with it for the long haul. We have a different view: URLs can be as fluid as a title tag.

In our view, URLs can be experimented with and optimized iteratively over time, just like any other on-page factor. Why would you “set it and forget it” when it comes to your URLs when you don’t do that with your titles, H1 headlines, page copy, and internal linking structure? For example, all the following hypothetical URLs follow best practices—with the exception of the first URL, of course, which is actually the real URL; now which one will perform the best?

The only way to know for sure is to test.

If your CMS or ecommerce platform supports malleable URLs, then why not exploit that capability and embark on a regimen of testing and continuous improvement? WordPress supports this fairly well by automatically 301 redirecting requests for old permalink URLs once the “post slug” for a post has been changed in the admin. Unfortunately, most ecommerce platforms do not support such a capability. If you’re stymied by your platform, the only options are to replace your CMS with one that supports malleable URLs, customize the CMS to support them (assuming you have access to the source code), or put a layer on top of your CMS using an SEO proxy technology like GravityStream.
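
To illustrate the WordPress-style behavior described above, here’s a minimal sketch of a 301 redirect layer. The paths and the `OLD_TO_NEW` slug map are invented for this example; a real implementation would live inside your CMS or proxy layer:

```python
# When a page's slug changes, requests for the old URL get a 301 to the new one.
from wsgiref.simple_server import make_server

OLD_TO_NEW = {
    "/blue-widget-deals": "/blue-widgets",  # old slug -> current slug
}

def app(environ, start_response):
    path = environ["PATH_INFO"]
    if path in OLD_TO_NEW:
        # A permanent redirect passes along most of the old URL's link equity.
        start_response("301 Moved Permanently", [("Location", OLD_TO_NEW[path])])
        return [b""]
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"current page content"]

if __name__ == "__main__":
    make_server("", 8000, app).serve_forever()
```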

Regardless of how you accomplish continuous URL optimization, the MarketingSherpa study shows that complacency about iterative testing and improvement of your URLs (or any other on-page factor, for that matter) results in more traffic going to your competitors’ listings. This is fatal to your natural search program.

Stephan Spencer is founder and president of natural search marketing firm Netconcepts and inventor of the GravityStream SEO proxy technology. He’s currently authoring the upcoming O’Reilly book “The Art of SEO” along with co-authors Rand Fishkin and Jessie Stricchiola. The 100% Organic column appears Thursdays at Search Engine Land.

