A Turning Point In The Field Of SEO

We are at a turning point in the field of search engine optimization (SEO)—a positive turning point. For those of us who have been around for a long time, it’s an interesting (and very good) time to be involved in search. But it must be daunting for those outside the industry, or just getting started with their careers.

I can’t remember a time when there was more “noise” and opinion out there, thanks to the power of the web and social media. There is a tremendous amount of information generated every minute, not all of it accurate or even useful. I have hundreds of subscriptions in my RSS feed reader, and at any given time (depending on how diligent I’ve been about keeping up) thousands of unread posts. It’s a requirement in SEO that we stay up to speed on the latest developments, which are frequent and span several disciplines, and staying current is becoming a very onerous task. I’ve developed a system to keep up with this firehose of information that I’ll expand on in a future column. In short, you have to learn to distill the noise of many blog posts and news items (and Google innovations!) down to a useful signal of items that may have an impact on your work. It’s a challenge in efficiency.

In this environment of almost overwhelming voices—many of them yelling simply for attention rather than for the significance of their contributions—some important changes are underway that are reshaping the face of SEO.

Last week I attended the seminal conference SMX Advanced. It was my fourth year in attendance. I consider SMX Advanced to be the premier SEO conference of the year, partly due to the quality of the speakers and content, and partly due to the location and timing. Sunshine in Seattle is rare, but when it shines on that city it’s a special place to be, and June often delivers beautiful days.

In previous years I’ve noticed that Google (specifically Matt Cutts) will sometimes use SMX Advanced as a stage for announcing important changes to both its algorithms and its approach. In 2009 there was the infamous moment when Matt came on stage (after Stephan Spencer pressed him) during a discussion of the newly released rel=canonical tag. Matt stated flatly that nofollow no longer worked for internal PageRank sculpting; in fact, it hadn’t worked for over a year. That was fine with me, since I’d argued against using nofollow for PageRank sculpting in blog posts and at SMX Advanced in 2008.

So this year I was expecting more of the same, and potentially a strong statement from Matt on paid links during the traditional “You & A” with Danny Sullivan (which, if I had to vote, is probably the most important session of the year for SEO at any conference). While Matt did let out a sort of wicked chortle about how impressive Google’s tools are for sniffing out paid links, and emphasized that they’re still chasing these down, it wasn’t the dramatic stance I was expecting. That tells me Google is getting a pretty good handle on paid links, and doesn’t need to bang that drum quite so loudly anymore.

But there were more important issues that came up. I’ll go through each of them in turn.

Information architecture + SEO = BFF

Cue images of unicorns, rainbows and Care Bears, because for me, the union of information architecture (IA) and SEO is the promised land. Add user experience (UX) into a prominent position in this relationship and a triad of power emerges.

With SEO, we send relevant traffic to websites. Traffic simply means “people.” We send people to websites. Websites are about people, ultimately, not search engines. But we lose sight of that, from time to time, because search engines are a huge source of high-quality, relevant traffic. They are a superb online marketing channel.

In my presentation at SMX Advanced on Site Architecture and Advanced SEO, I hectored the crowd (briefly, before exposing them to the sublime genius of Bosom Buddies and The Hoff) on the importance of IA in the field of SEO. I pushed the crowd to think of SEO in ways beyond “putting links on a page.”

During my presentation I explained the approach we’re taking with SEO at Zappos and other companies:

  • First, make the best user experience possible
  • Then leverage for maximum SEO

SEO should be an invisible layer beneath a smart site architecture and fabulous user experience. When SEO gets pushed out in front of these things, the site suffers and we end up with “optimized footers” and gobs of anchor text links that serve no real purpose.

As SEOs, we need to evolve our concepts of navigation and internal linking. For example, many sites echo the global navigation along the left sidebar, repeating the same links in a different place. That’s not necessarily useful; it depends on the site and on user testing. Much more useful may be to use that left sidebar real estate for important links not featured in the global navigation, or for refinements and calls to action.

Furthermore, we need to build contextual navigation based on the category or sub-category experience. Amazon does this fabulously well, so that if I’m in Books, I’m presented with relevant sub-categories based on that primary heading, not a slew of other semi- or not-at-all-relevant links based on misguided SEO.

We really, really, really need to get rid of that horrible “SEO footer” as I call them—the footer packed with links and optimized anchor text to everything under the sun within the site. That’s not a useful user experience and it’s definitely not a useful SEO experience.

And we need to build better, smarter, more relevant internal links. Categories should link to related categories and sub-categories, but product pages should link across to related products, and up to parent categories. If we begin to think like a user, and make our sites incredibly search engine friendly, then we’ll be walking the path to the promised land.
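To make that concrete, here’s a minimal sketch of those linking rules in Python. The category tree and product data are entirely hypothetical; the point is simply that contextual links can be derived from the site’s own taxonomy rather than bolted on for SEO.

```python
# A minimal sketch of contextual internal linking, using a hypothetical
# category tree and product catalog. Names and structures are illustrative.

CATEGORIES = {
    "books":   {"parent": None,    "children": ["fiction", "history"]},
    "fiction": {"parent": "books", "children": []},
    "history": {"parent": "books", "children": []},
}

PRODUCTS = {
    "war-and-peace": {"category": "fiction"},
    "anna-karenina": {"category": "fiction"},
}

def category_links(slug):
    """A category links to its sub-categories and to sibling categories."""
    node = CATEGORIES[slug]
    links = list(node["children"])
    parent = node["parent"]
    if parent:
        links += [c for c in CATEGORIES[parent]["children"] if c != slug]
    return links

def product_links(slug):
    """A product links across to related products and up to its parent category."""
    category = PRODUCTS[slug]["category"]
    related = [p for p, meta in PRODUCTS.items()
               if meta["category"] == category and p != slug]
    return {"related_products": related, "parent_category": category}

print(category_links("fiction"))        # ['history']
print(product_links("war-and-peace"))   # related: ['anna-karenina'], parent: 'fiction'
```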

Faceted navigation

According to Peter Morville, faceted navigation (also called parametric search when the user must execute the search after selecting options) is one of the biggest breakthroughs in web design and search of the past decade, and a boon to users. It allows visitors to perform complex Boolean search logic simply by clicking intuitive links (attributes), narrowing a search in a nicely focused manner.

Side note: Faceted navigation is now “additive filters” because Google says so. Go read that.

But faceted navigation sucks for bots. Endless possible attributes appended to URLs cause mass confusion during the crawl experience, and more often than not, poor character choices are placed in the URL query string. Quotes (both single and double), ellipses, commas, spaces, brackets, pipes and tildes are just a few of the characters I’ve seen in URLs generated from faceted navigation schemes.
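At minimum, whatever code builds those URLs should percent-encode facet values instead of dumping raw characters into the query string. A minimal sketch, with hypothetical facet names and values:

```python
# A minimal sketch: urlencode percent-encodes hostile characters
# (quotes, spaces, pipes) before they reach the URL.
# The facet names and values here are hypothetical.
from urllib.parse import urlencode

facets = {"size": '11"', "style": "mary janes", "tags": "red|blue"}
print("/shoes?" + urlencode(sorted(facets.items())))
# /shoes?size=11%22&style=mary+janes&tags=red%7Cblue
```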

So what can be done? Well, first of all, get a good SEO consultant, because this issue is not simple and there are many ways to approach it. But briefly: pick the most important attributes (based primarily on search volume) and ensure those URLs are search friendly (dash-separated, or at most four or five clean parameters). Then, additional overhead attributes that don’t represent significant search volume can be appended to the end of the URL, with a rel=canonical hint in the head of those pages pointing back to the canonical “parent.” This requires some fancy programming, but it can work nicely to roll up all of the overhead facets that don’t affect your SEO strategy into the important URLs that do.
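Here’s a rough sketch of that roll-up, again in Python with hypothetical facet names: the page URL carries everything the user selected, but the rel=canonical points only at the facets that matter for search.

```python
# A sketch of rolling up "overhead" facets with rel=canonical.
# The facet lists are hypothetical and would come from keyword research.
from urllib.parse import urlencode

IMPORTANT = {"brand", "color"}  # facets with real search volume;
# everything else (sort order, view mode, price sliders...) is overhead

def canonical_url(base_path, selected_facets):
    """Build the canonical 'parent' URL from the important facets only."""
    kept = {k: v for k, v in sorted(selected_facets.items()) if k in IMPORTANT}
    return f"{base_path}?{urlencode(kept)}" if kept else base_path

selected = {"brand": "acme", "sort": "price-asc", "view": "grid"}
print(f'<link rel="canonical" href="{canonical_url("/shoes", selected)}" />')
# <link rel="canonical" href="/shoes?brand=acme" />
```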

Search result pages

Google doesn’t want search result pages in their search results. That’s Google’s current public stance, but it gets more complicated once you start to peel some layers back. Let’s examine this:

  • There are endless examples of website search results in Google’s search results
  • Some website search results provide a good user experience and convert well
  • Some websites are entirely built on search results, or have massive amounts of their sites built around search results, and these are also indexed by Google

As you begin to investigate this issue, it becomes clear that many different factors are at work, and while Google needs an official public stance against “search results,” the term can have multiple definitions depending on the quality of the website search results in question. Consider Amazon’s search results, for example.

I like what Epicurious is doing with their search result pages. Categories and articles link to popular search pages which are search friendly (and which appear in Google); these can then be drilled into further with faceted refinements, which are themselves carefully restricted from appearing in Google’s index with robots.txt. This is a great user experience and smart work.

Another interesting approach is to make the primary search page (usually mydomain.com/search or something shallow in the domain) a robust page with important links beneath the front-and-center search function. Then, all search result pages consolidate to the “search home page” with rel=canonical. This keeps all search results out of Google’s index, but passes authority from those pages which tend to build external and internal link equity. But, this requires that the search home page is a high-value page far beyond just a search box.
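In code, that consolidation can be as simple as emitting one canonical tag on every search result URL. A minimal sketch, assuming a hypothetical /search path and placeholder domain:

```python
# A minimal sketch of consolidating search result pages to the search
# home page with rel=canonical. The domain and path are placeholders.
SEARCH_HOME = "https://www.example.com/search"

def canonical_tag(request_path):
    """Every /search?q=... result page points back at the search home page."""
    if request_path.startswith("/search"):
        return f'<link rel="canonical" href="{SEARCH_HOME}" />'
    return ""

print(canonical_tag("/search?q=red+shoes"))
# <link rel="canonical" href="https://www.example.com/search" />
```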

Pagination

It’s a real sticky wicket, pagination. On ecommerce sites it’s a constant problem we struggle with, and, given the often unique challenges and circumstances each site faces, it’s best approached with an “it depends” mindset. There are several ways to deal with pagination for SEO. One of the better ones, first recommended by Maile Ohye at SMX West this year, is to roll up all paged versions with rel=canonical to the view-all page, which becomes your default browse page.
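A sketch of that roll-up, with hypothetical URL patterns: every paged URL in the series declares the view-all page as its canonical.

```python
# A sketch of the view-all roll-up: paged URLs canonicalize to the
# view-all page. The /all URL pattern is hypothetical.
def view_all_canonical(category_path):
    """Emit the same canonical tag on every page in the series."""
    return f'<link rel="canonical" href="{category_path}/all" />'

for page in (1, 2, 3):
    print(f"/widgets?page={page} ->", view_all_canonical("/widgets"))
# every page in the series points at /widgets/all
```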

If you have thousands of products or results, you obviously won’t want to show all of them on a single page. You have to decide what number works in those cases, but you can normally go well beyond 100 items per page if there isn’t a significant performance hit on the site. User testing has shown that, especially when shopping, people want to see a lot of products at once. It makes shopping more efficient and less frustrating than paging through 10 products at a time.

304 headers

Historically we’ve recommended that sites return a 304 Not Modified response to a conditional (If-Modified-Since) request when a page hasn’t changed. This lets the search engine fetch only the headers rather than the entire page. While this will probably still be useful (assuming Bing and Yahoo! respect the protocol as Google has), it’s becoming less important. Why? Hard to say for sure, but it appears Googlebot will now fetch the entire page regardless, even if you’ve returned a 304 saying, in effect, that the page hasn’t changed. Matt Cutts, speaking with Rand Fishkin recently, said it’s really not much more work for Googlebot to go ahead and grab the entire page when it’s already grabbing the headers anyway.

This is interesting because of what we’ve recently learned about Google’s new Caffeine architecture and the Mayday update. Caffeine includes, among other things, a new crawling and indexing technology that allows Google to index content as quickly as it gets crawled. This may be why Google prefers fetching the entire page rather than just the response headers, although it may still respect 304 if it’s implemented. Questions, questions.
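For reference, here’s what the conditional GET exchange looks like from the client side: a minimal sketch using only the Python standard library, with a placeholder URL and date.

```python
# A minimal sketch of a conditional GET. If the server honors
# If-Modified-Since, it can answer 304 with no body; urllib surfaces
# that 304 as an HTTPError. The URL and date are placeholders.
import urllib.request
from urllib.error import HTTPError

req = urllib.request.Request(
    "https://www.example.com/page",
    headers={"If-Modified-Since": "Sat, 12 Jun 2010 00:00:00 GMT"},
)
try:
    with urllib.request.urlopen(req) as resp:
        print(resp.status, "- page changed, full body returned")
except HTTPError as err:
    if err.code == 304:
        print("304 - not modified, no body to download")
    else:
        raise
```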

A turning point

With the latest changes at Google, Bing’s imminent replacement of Yahoo search results, and the evolution and coordination of SEO, IA and UX (yes, I’m hopeful), we are at an interesting turning point in the field of search marketing. Google cares a lot about real time search, and its interface is becoming ever more crowded and, perhaps, fragmented. Social media is changing everything. Sites are getting faster, or they need to be. Mayday is hammering long tail spam (or at least attempting to). Caffeine can index content almost immediately during the crawl. The reasoning behind the Bosom Buddies plot is still a riddle unsolved in time. And The Hoff is still 100% awesome sauce.

But it’s all about one thing, or one person: your visitor. It’s all about the visitor to your website. Stay focused on her—she’s precious and has money.




