Why Quality Is The Only Sustainable SEO Strategy

One of the most important takeaways from the Panda Update / Farmer fallout is to make your sites as high-quality and useful as possible. The next year should be interesting, as some sites invest in quality while others try to game signals, seeking shortcuts to the hard work. Both are valid choices as long as you’re ready to accept the risk that comes with shortcuts, but only the hard work will continue to yield results long-term.

Matt Cutts and Amit Singhal conceded that there are signals in the latest algorithm update that can be gamed. (Any algorithm can be gamed.) What are they? It will take time, but eventually some of them will be discovered.

If quality, credibility, and authority can all be algorithmically identified, then certainly they are based on distinct sets of factors that in sum create signals.

Things such as density of advertising, author names and titles, address and phone information, badges and memberships in known organizations, content density, the quality of a site’s link profile, maybe even W3C compliance (although that’s a stretch) are all potential areas to investigate. This advice is not just for the gamers; these are areas where high-quality, white-hat SEOs should be looking, too.
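To make one of those signals concrete, here’s a rough, hypothetical sketch of how an “ad density versus content density” figure could be approximated from a page’s HTML. The ad-host list and the ratio are assumptions made purely for illustration, not anything the engines have confirmed.

```python
# Crude, illustrative estimate of "ad density" relative to visible text on a page.
# The ad-host list and the ratio are assumptions for illustration, nothing official.
from html.parser import HTMLParser

AD_HOSTS = ("doubleclick.net", "googlesyndication.com")  # assumed example hosts

class DensityParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.text_chars = 0   # rough proxy for content density
        self.ad_tags = 0      # tags whose src points at a known ad host

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "iframe", "img"):
            src = dict(attrs).get("src") or ""
            if any(host in src for host in AD_HOSTS):
                self.ad_tags += 1

    def handle_data(self, data):
        self.text_chars += len(data.strip())

def ad_to_content_ratio(html):
    parser = DensityParser()
    parser.feed(html)
    return parser.ad_tags / max(parser.text_chars, 1)

if __name__ == "__main__":
    sample = ("<html><body><p>Some article text.</p>"
              "<script src='http://ads.doubleclick.net/x.js'></script></body></html>")
    print(round(ad_to_content_ratio(sample), 5))
```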

We are coming upon a graduation of sorts for SEO, one that will continue to bring various disciplines together: information architecture, user experience, even web design are all important to SEO and to how a site is scored.

How a site “feels” to a visitor and the credibility it conveys are areas where design plays an important part, something John Andrews was already talking about at SearchFest in 2010.

Above and beyond the granular tactical stuff we SEOs are obsessed with, we need to figure out what users want, because that’s where Google is going. Chasing users, not algorithms, will have the best long-term influence on a site’s rankings.

After all, Google (and any search engine) is basically a means to an end, a way to capture audience share (the users) who depend on search to find good information. I’ve been saying exactly the same thing for about 10 years, and it’s more true now than ever.

What Link Metrics Translate To Higher Rankings?

The conference season is in full swing and I’ve spoken at a couple of recent ones: SearchFest, where I presented on link building with Rand Fishkin, and SMX West, where I presented with Vanessa Fox, Dennis Goedegebuure and Tony Adam on enterprise SEO.

Building presentations is always a great exercise, because it forces you to distill your thoughts into actionable, quality information for conference attendees. I love the process and I really enjoy speaking at these shows.

My SearchFest presentation sought to communicate the following four points to our audience:

1. Traffic yield of the URL

While “people” need keywords to find what they’re looking for, keywords are just a proxy for the people who use them. As SEOs, we tend to obsess on keywords… after all, they’re where the money is. Right? Sort of. Keywords are a means to an end, they are bait on a hook. The hook is your quality resource which will attract and retain them. And that resource is best signified for SEOs by one thing and one thing only: the URL. In SEO, the URL is where all the value is, not the keywords.

Ranking reports are becoming more meaningless than ever. Google appears to be returning randomized results to IPs and/or user agents that look like rank scrapers. This creates a lot of noise and problems when building reports for clients.

What matters is not the ranking (funny, though, how Google reports on “average rank” in Webmaster Tools), but the total traffic yield of the URL.

How many keyword searches does the URL capture in total? How much volume do those searches have, and how much traffic does the URL actually see? And what is the quality of that traffic: bounce rate (hopefully low), average time on site or pages per visit, and conversion rate (hopefully high)?

That is much better information than a ranking report. All this said, ranking reports are not going away, because there is far too much education yet to be done on the client side. Ranking reports are comfortable; they are what has always been used to track SEO success. That needs to change.
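For illustration, here’s a minimal sketch of what rolling keyword-level data up to the URL might look like. The field names and numbers are invented; substitute whatever your analytics export actually provides.

```python
# Hypothetical keyword-level rows rolled up to the landing-page URL.
# Field names and values are illustrative, not a specific analytics export format.
from collections import defaultdict

rows = [
    # (landing_url, keyword, visits, bounces, conversions)
    ("/marketing-automation/", "marketing automation", 1200, 480, 36),
    ("/marketing-automation/", "marketing automation tools", 300, 90, 12),
    ("/blog/seo-tips/", "seo tips", 800, 560, 4),
]

def traffic_yield(rows):
    per_url = defaultdict(lambda: {"keywords": set(), "visits": 0,
                                   "bounces": 0, "conversions": 0})
    for url, keyword, visits, bounces, conversions in rows:
        agg = per_url[url]
        agg["keywords"].add(keyword)
        agg["visits"] += visits
        agg["bounces"] += bounces
        agg["conversions"] += conversions
    return {
        url: {
            "keyword_count": len(agg["keywords"]),
            "visits": agg["visits"],
            "bounce_rate": round(agg["bounces"] / agg["visits"], 3),
            "conversion_rate": round(agg["conversions"] / agg["visits"], 3),
        }
        for url, agg in per_url.items()
    }

for url, stats in traffic_yield(rows).items():
    print(url, stats)
```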

Getting back to the above point, the URL is where all the value is stored. Page scoring factors and many other criteria are rolled up into the URL, which is stored as a distinct field in the search engine databases.

A sample of SEO scoring factors

2. Preserving the power of URLs

This is why it’s absolutely critical that URLs are preserved. Well-aged URLs will score best, except in News and QDF (query deserves freshness) searches, where recency wins. Redirects greatly hamper SEO success. Any redirect.

Recent experiences have shown a great deal of equity loss when using 301s, and in some cases a rel canonical tag appears to work better to transfer equity. The idea that one can “store” internal PageRank to be used later with a 301 is basically what introduced the equity rot that is occurring with permanent redirects now.

I still recommend using a 301 when you can. It’s the best possible way to permanently move content. Just be open to rel canonical, because it’s quite powerful and can be a very strong signal for Google at this time. It’s also well adopted across the web. Bing supports it as well, but reports are mixed on how well it’s being used.
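If you want to sanity-check how a given URL is being handled, a quick audit might look like the sketch below: fetch the raw status code, note where any redirect points, and check the final page for a rel=canonical tag. This is a standard-library-only sketch and the URL is a placeholder.

```python
# Inspect a URL's redirect behavior and look for rel="canonical" on the final page.
# Standard library only; the example URL is a placeholder.
import re
import urllib.error
import urllib.request

def redirect_and_canonical(url):
    class NoRedirect(urllib.request.HTTPRedirectHandler):
        def redirect_request(self, req, fp, code, msg, headers, newurl):
            return None  # do not follow: we only want the raw status and Location

    opener = urllib.request.build_opener(NoRedirect)
    try:
        resp = opener.open(url, timeout=10)
        status, location = resp.getcode(), None
    except urllib.error.HTTPError as err:  # raised for 3xx once redirects are blocked
        status, location = err.code, err.headers.get("Location")

    # Second request, following redirects this time, to inspect the final HTML.
    final = urllib.request.urlopen(url, timeout=10)
    html = final.read(200_000).decode("utf-8", errors="ignore")
    # Naive pattern: assumes rel appears before href in the link tag.
    match = re.search(r'<link[^>]+rel=["\']canonical["\'][^>]*href=["\']([^"\']+)',
                      html, re.I)
    return status, location, (match.group(1) if match else None)

if __name__ == "__main__":
    print(redirect_and_canonical("http://www.example.com/old-page"))
```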

3. Looking beyond links

It’s not only about links. However, especially prior to Panda Farmer, links tend to brute-force top rankings on competitive SERPs (when overall domain authority, or the “Wikipedia effect,” doesn’t hold sway).

I took the time to analyze several competitive SERPs to see what factors really mattered when it comes to links: is it sheer quantity, unique domains, page-specific links, diversity, or anchor text? In my analysis, the four biggest factors were domain authority, total domain links and unique domains, page-specific links and uniques, and matching anchor text.

However, it was interesting to note that in several cases, prominence of exact-match anchors seemed to be very common in positions 7-10, possibly indicating up-and-coming competitors pushing hard for rankings using heavy anchor matching. Stronger competitors were benefiting from a more cohesive link strategy that also focused on sheer quantity, especially quantity of unique referring domains.

The below image shows the link profile for the SERP ‘marketing automation’ in unique referring links and matching anchor text. Can you find the Wikipedia entry? Bet you can.

Unique links versus matching anchor text
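For anyone curious how an analysis like this can be run, here’s a small, illustrative sketch: take the top 10 results for a query, record a couple of link metrics per result, and compute a simple rank correlation against position. The numbers are invented to mirror the pattern described above (note the exact-match anchors clustering toward positions 7-10); they are not real data.

```python
# Hypothetical top-10 SERP rows: (position, referring_domains, matching_anchors).
# The values are invented for illustration; the point is the shape of the analysis.
serp = [
    (1, 5200, 310), (2, 4100, 280), (3, 3900, 150), (4, 2200, 170), (5, 1800, 90),
    (6, 900, 120), (7, 400, 260), (8, 350, 240), (9, 300, 210), (10, 120, 190),
]

def spearman(xs, ys):
    """Spearman rank correlation, written out by hand to avoid dependencies (no ties)."""
    def ranks(values):
        order = sorted(range(len(values)), key=lambda i: values[i])
        result = [0] * len(values)
        for rank, i in enumerate(order, start=1):
            result[i] = rank
        return result
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    d_squared = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - (6 * d_squared) / (n * (n ** 2 - 1))

positions = [row[0] for row in serp]
domains = [row[1] for row in serp]
anchors = [row[2] for row in serp]

# A correlation near -1 means "more of this metric, better (lower) position."
print("referring domains vs. position:", round(spearman(domains, positions), 3))
print("matching anchors  vs. position:", round(spearman(anchors, positions), 3))
```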

4. Link factors in search algorithms

There are many link factors that could (and should) be taken into account by any algorithm. These include (at least):

  • Recency (do links “come and go,” have there been a lot or very few links recently, etc.)
  • Transience (do links disappear after a time)
  • Anchor text (how much exact match is there)
  • Context (is the link contextual)
  • Relevance (how related to the site’s content is the link)
  • Prominence of placement (is the link in a spot that maximizes its CTR, or is it lower left or in a footer)
  • Other links on the page (what quality are the other links on the page, and how well do they match)
  • Trends (what is the trend of links over time)
  • Co-citation (what kinds of links point to the page)
  • Frequency of linking (how frequently do the domains exchange links)

While the above is a fairly exhaustive list of link factors, in our analyses we’ve found time and again that there are basically four link factors that tend to influence performance:

  1. The domain authority of the ranking URL
  2. The quantity and diversity of links into the domain
  3. The quantity and diversity of links into the URL
  4. The amount of matching anchors

(“Diversity” here meaning the number of unique referring domains.)

There are always exceptions, and in fact every SERP is unique. Additionally, it’s impossible to isolate link scoring from on-page factors; rankings are more complex than links. But the results of link analyses tend to point back to the factors above.
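As a purely illustrative exercise, here’s what a composite “link strength” score built from those four factors might look like. The weights and the normalization are assumptions made for the sake of the sketch; this is not Google’s math.

```python
# Illustrative composite "link strength" score from the four factors discussed above.
# Weights and normalization are assumptions for the sketch, not a real algorithm.
import math

WEIGHTS = {
    "domain_authority": 0.40,   # authority of the ranking URL's domain
    "domain_links": 0.25,       # quantity + diversity of links into the domain
    "url_links": 0.25,          # quantity + diversity of links into the URL
    "matching_anchors": 0.10,   # amount of matching anchor text
}

def link_strength(domain_authority, domain_links, domain_ref_domains,
                  url_links, url_ref_domains, matching_anchors):
    # Log-scale the raw counts so one huge number doesn't swamp everything else.
    features = {
        "domain_authority": domain_authority / 100.0,
        "domain_links": math.log1p(domain_links) * math.log1p(domain_ref_domains) / 200.0,
        "url_links": math.log1p(url_links) * math.log1p(url_ref_domains) / 100.0,
        "matching_anchors": math.log1p(matching_anchors) / 10.0,
    }
    return sum(WEIGHTS[name] * min(value, 1.0) for name, value in features.items())

# Two hypothetical competitors on the same SERP: a strong domain with modest anchors,
# and a weaker domain pushing hard on exact-match anchor text.
print(round(link_strength(82, 1_500_000, 18_000, 4_000, 900, 250), 3))
print(round(link_strength(35, 20_000, 400, 9_000, 1_200, 2_500), 3))
```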

How To Achieve SEO Sustainability

In industrial-strength SEO, quality and scale must hold sway. On-page strategies, internal linking, and off-page strategies in social and link development should always emphasize quality and scalable techniques.

As we’ve found with the latest Google algorithm shift, when quality and the user are kept in focus, performance can withstand even dramatic algorithm adjustments. The name of the game in SEO is change, but by keeping focused on users and not algorithms, negative consequences can be minimized.

That’s not to say you shouldn’t keep an eye on what the engines are doing. On the contrary, I recommend studying the algos like a hawk! It’s essential to know what’s happening and why. Just don’t build your SEO strategy around the algorithms. Build your SEO strategy around your users.

Opinions expressed in the article are those of the guest author and not necessarily those of Search Engine Land.

About The Author: Adam Audette is the Chief Knowledge Officer at RKG, where he blogs regularly. You'll find him speaking at conferences around the world when he's not riding down mountains on something fast. Follow Adam on Twitter as @audette.

  • http://www.location3.com Tarla Cummings

    While I definitely agree that search positioning should not be considered a main performance metric, I still find it incredibly important on the optimization side. For example, if I’m targeting a keyword but the visits are not increasing, I need to know whether it’s because I’m not yet in a high enough position to see traffic increase, or because it’s just not a good keyword choice. Being able to run accurate position reports provides a lot of important information like this, which is why it’s so frustrating that getting that information is becoming increasingly difficult. But you are absolutely right about educating clients away from considering position to be a top judge of performance.

    Enjoyed the post, great info on link factors

  • http://www.audettemedia.com Adam Audette

    Great point, Tarla. I agree. What you’re describing is more of a spot-check thing than regular, on-going reporting. With Google giving average position via GWT, that might be a place to get that data. You can also set up Google Analytics to give position info via some custom segments (Google passes the URL rank position in the referrer). Or just check by hand.
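    Here’s a minimal sketch of that referrer trick (Google passes the rank in a cd= parameter); the referrer string and the parsing are just for illustration, not an official GA configuration:

    ```python
    # Pull the organic rank Google passes in the "cd" parameter of the referrer.
    # The referrer string below is illustrative; this is a sketch, not GA config.
    from urllib.parse import urlparse, parse_qs

    def rank_from_referrer(referrer):
        parsed = urlparse(referrer)
        if "google." not in parsed.netloc:
            return None
        cd = parse_qs(parsed.query).get("cd")
        return int(cd[0]) if cd else None

    print(rank_from_referrer(
        "http://www.google.com/url?sa=t&q=marketing+automation&cd=4&ved=0CB0QFjAD"
    ))  # -> 4
    ```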

  • http://newevolutiondesigns.com Tom

    Excellent post. Makes me feel a little better/hopeful for our blog.

  • http://www.visionefx.net Rick Vidallon

    I chuckle each time I read about the latest and greatest scheme for getting more website traffic, hits or better page position on Google and other search engines.

    Some so-called search engine optimization professionals say:

    You need to have a Blog.
    You need to have an XML sitemap for Google.
    You need to have a forum.
    You need to have back links.
    You need to optimize your local listing.
    You must have a special meta-tag.
    You need to optimize all your ALT-tags.
    You need to post videos.
    You need to get on Facebook, Twitter, LinkedIn and Foursquare.
    You must do this or you must do that and so on.
    When all else fails they blame your website woes on Google and tell you that Google has changed the rules and it’s not their fault.

    Some of the methodologies just mentioned do help to some extent, but only a little. In the end it all goes back to your website.

    A great website is EXACTLY like a good book. Good books make the New York Times best seller list because they have some or all of these qualities: they are interesting, informational, entertaining, enlightening, popular, useful, educational and/or illustrative.

    If webmasters and website owners put the amount of money, time and effort wasted on monthly SEO (Search Engine Optimization) into improving their website, search engine optimization would happen naturally and organically. You are better served hiring a website copywriter to build content versus hiring an SEO specialist to build back links.

    While some websites may see a temporary spike in traffic using some popular search engine optimization tactics, it is simply not good online business practice and may even hurt your brand or reputation.

  • http://www.visionefx.net Rick Vidallon

    Oops! – Pardon my manners, Adam. GREAT ARTICLE!

  • fabioricotta

    Hi Adam,

    Great article! I would like to know more about your recent experiences that showed you an equity loss when using 301s. Can you share more about this?

  • http://blog.webpro.in Bharati Ahuja

    Great article. Yes, quality has always been the sustainable measure, and that is why SEO is an ongoing process and not a one-time job.

    The ongoing endeavor of SEOs to adapt the site to changes in the algorithms is a constant challenge.

    My article on http://www.searchenginejournal.com/seo-is-more-important-and-more-needed-than-ever-before/28725/ expresses my views related to this topic.

    Though the SERPs keep on changing and should not be considered the primary performance measure of any SEO campaign, yes the higher you rank on organic listings the higher is your CTR and that can lead to increased business which is the primary goal for any business website. (The image in the article proves that.)

    This is a little off the topic but nevertheless thought of sharing it.

    In fact, it would be a great help to all web marketers, especially SEOs, if the SERP ranks were also displayed in the GA reports next to the keywords.

    I know that the SERPs keep on changing but it is an important reporting field for all SEO reports as the traffic does depend on how high you rank.

    Yes, the true approach is the holistic approach to SEO, but having this in the GA reports would reduce our dependence on other third-party reports, and we could just concentrate on Google Webmaster Tools and GA for analysis.

    I suggest that there could be two fields for SERP rank: one showing the rank for the keyword when the click took place, and one showing the current rank when the report is being accessed.

  • http://www.pagezero.com Andrew Goodman

    My good friend Adam, of course I agree with the overall sentiment, as against SEO “gamers”. Good points.

    But I can’t help but notice that you mention “money” only once, “conversion” once in a passing way, and “revenue” not at all. You rightly point out that SEO’s often lose focus, but by my standards, the analysis is frankly too interested in the ins and outs of SEO wizardry, and not enough on the “business why” of all of this.

    Let me echo back how you describe the lay of the land:

    “While “people” need keywords to find what they’re looking for, keywords are just a proxy for the people who use them. As SEOs, we tend to obsess on keywords… after all, they’re where the money is. Right? Sort of. Keywords are a means to an end, they are bait on a hook. The hook is your quality resource which will attract and retain them. And that resource is best signified for SEOs by one thing and one thing only: the URL. In SEO, the URL is where all the value is, not the keywords.”

    True. As science, accurate and helpful.

    BUT:

    You use the word *resource* a couple of times. It is euphemistic and terribly politically correct to refer to a store or a profit-making enterprise as a “resource”. Indeed, it’s admirable and flattering enough to the high-mindedness of many of these businesses as to be a complete crock. When are we going to start calling a store, a store?

    “Resources” don’t convert for many businesses engaged in direct ecommerce sales. “Resources” may work for some models (content businesses, experts), but that accounts for what, 10% of the revenue intent of the businesses online trying to figure out SEO? Did JC Penney get whacked for linkspam because they’re looking to get cool resources about prom dresses and sunglasses out there in the public realm?

    Or are they a store?

    The euphemism “resource” is typical of nice wording used by SEO’s who would love to misdirect conversations about business metrics to all of the fun times we can have writing copy and thinking about “users” who enjoy “consuming great content”. To say nothing of the (of course, quite important) conversations about various elements of the algorithm, especially link analysis. Understanding signals of relevancy and trust is paramount to understanding organic search rank, of course. But we are having a conversation in a vacuum if we hold it to that academic level all the time while implying there is a revenue impact to SEO content/resource shenanigans that may not (even indirectly) create business results.

    I believe Jill Whalen has a word for this: boondoggle.

    This is why most SEO’s don’t provide qualified help to most businesses. They maintain an ongoing weakness for rank reports, link authority scores, and other (peripheral, if important) data points, in complete isolation from frank reference to revenue related analytics. You shouldn’t do one without the other.

    When are we going to start calling a store, a store?

  • http://www.pagezero.com Andrew Goodman

    Note the muddled, tentative, and incomplete analysis of the previous comment:

    “…yes the higher you rank on organic listings the higher is your CTR and that can lead to increased business which is the primary goal for any business website.”

    So the higher you rank, the higher your CTR, and that “can” lead to increased business?

    Incredible. We posit CTR as some kind of helpful metric in this context (sort of, a bit, but not really), don’t mention click volume or the business value of the keyword(s) in question, and are incredibly uncurious about closing the loop between “can lead to” increased business and “did convert to x revenue”.

    Warning: SEO’s indicate that higher search rankings can, might, “lead to” increased business… or not. We’re not really sure. We haven’t looked into it all that closely.

  • http://blog.webpro.in Bharati Ahuja

    The logic is that a high CTR as a result of organic rankings leads to increased targeted visits, and if the number of targeted visits is high, the possibility of conversions is higher; that possibility again depends on many factors, which is why the word CAN was used.

    Good SEO results in targeted traffic which has the potential of getting converted into business.

    SEO is a subset of SEM (Search Engine Marketing) and yes the goals of marketing do apply to SEO also.

  • http://andrewkaufman andrewkaufman

    As someone who has been on both sides of the quality content equation (both in helping to produce crappy SEO copy and eventually moving away from that into high quality, well researched and comprehensive articles) this article rings incredibly true. While producing large amounts of “thin” content pages may work in the short term (or may have worked), it just doesn’t hold up in the long run – you’ll always be at the mercy of the next algorithm change.

    The problem is that high quality content takes a lot longer to show a positive ROI (just like any good SEO campaign in a way). A lot of businesses (especially new ones trying to find their business model) are in such a rush to generate revenue and exposure (like my former employer Mahalo) that they don’t stop to think about the implications down the road.

    The constant struggle we face is how to optimize for both users and search spiders. And while I’m no engineer or search expert, my intuition is that the Panda update was designed to close the gap between these two things. By giving more weight to bounce rate, time on page, and ad placement and frequency as ranking factors (all indicators of usability, design and IA), they seem to be looking at user behavior more closely in determining which sites deserve to have the top spots.

    A few other thoughts on how my time working at a content farm showed me that investing in quality content (and focusing on the user) is the best way for businesses to protect themselves from getting torched: http://www.freelancecontentstrategist.com/content-farm-hand.html

  • http://www.forefrontseo.com Todd McDonald

    @Andrew Goodman

    “It is euphemistic and terribly politically correct to refer to a store or a profit-making enterprise as a “resource”. ”

    I’m pretty sure “resource” was not used to describe a store, but rather something of value the store owns…How did you come to that conclusion?

  • http://www.realwebmarket.com RWM

    “The quantity and diversity of links into the domain
    The quantity and diversity of links into the URL”

    Could you describe the difference between them?

  • http://www.rimmkaufman.com George Michie

    Seminal post, Adam. Similar to weight loss plans that produce short term benefits but not long term results, gaming the system ‘works’ — folks wouldn’t bother with the black hat stuff if it didn’t — but not for the long run. Quality site (usable, fast, spider friendly), quality content, quality links are the way to go. Analogous to my new diet plan which I hope will catch on: DELAP GSE. (Don’t Eat Like a Pig — Get Some Exercise). Unfortunately, quality is hard and gimmicks aren’t.

    Andrew Goodman raises a valid point: at the end of the day the investment in SEO needs to pay off, but there is no reason to believe it can’t. It’s a matter of paying reasonable fees to smart people who do quality work be they salaries to employees or fees to quality agencies. All traffic is not created equal, but that’s just a matter of prudent use of resources to get leverage where it has the most impact.

  • http://TannerChristensen.com Tanner C

    Really a great synopsis of what’s changed recently and what webmasters should be keeping an eye on.

    I have to echo Rick Vidallon’s comment above though, as I think it’s critical to success online (and many beginning SEO managers and marketers overlook the point): if you want to be successful online, be interesting, informative, useful, etc.

    Targeting search algorithms can only get you so far in rankings — and what happens when the algorithm changes? Instead, target your audience, real people, and it won’t matter what changes with the search engines because your audience will always be there.

  • http://trustmetrics.com mike andrews

    great article — at trust metrics we rate websites for advertisers, and quality is one of the most important ratings we offer. we do this by looking at many features derived from crawling a site, features which we’ve determined correlate well (especially in aggregate) with quality.

 
