Worthy Alternatives To The Useless SEO Data Provided By Search Engines

The data classically used in search engine optimization (SEO) to benchmark sites and track performance is all but useless today. Ranking reports, still a necessary evil in SEO reporting, have been unreliable (at best) for at least three years. Their validity has been further eroded over the last 24 months by Google’s increased use of personalization, localization and experimentation with blended results, to name just a few factors.

Today, the convention of using indexed page counts and backlinks to benchmark and report on site performance is facing a similar demise.

Google has always been careful about hiding data from prying eyes, including its supplemental index results (which Google claims do not exist anyway) and backlinks. The link: command returns purposefully obfuscated data. Even for sites that have been validated in Webmaster Tools, the link information we’re given has not evolved much. We get a very simple result set of sites linking to us, but we don’t get to slice, dice or order the data much further than that. It’s pretty much the same tool today that it was when it launched in October 2007.

Indexed page counts returned using site: commands have been very inconsistent since at least November 2009, in my experience. Depending on how you search, when you search, or where you search from, you’re likely to see very different results—sometimes varying by tens or even hundreds of thousands!
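To put numbers on that inconsistency, one option is to log the reported count over time and compute its spread. A minimal sketch, using made-up counts purely for illustration:

```python
# Hypothetical indexed-page counts reported by a site: query,
# collected on different days (the numbers are illustrative only).
counts = [412_000, 388_000, 530_000, 301_000]

spread = max(counts) - min(counts)   # absolute swing between observations
mean = sum(counts) / len(counts)

print(spread)                        # 229000
print(f"{spread / mean:.1%}")        # swing relative to the average count
```

When the swing is more than half the average, as here, the count is telling you more about Google’s reporting than about your site.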

Yahoo gave us Site Explorer, the de facto backlink research tool in the SEO industry. Thanks, Yahoo, for that. But what’s happening with that tool? It’s been giving us very inconsistent, downright misleading data for the last few months. It’s no longer reliable.

Google is doing too good a job hiding the kinds of info we need as SEOs (supplemental index, indexed pages, backlinks) while Yahoo has given up completely and isn’t putting any resources into search. What we’re left with is a wasteland.

And the worst part? We keep right on doing what we’ve been doing. We really have no choice—we need data points like this. We’re basically stuck in a catch-22 of needing these metrics, while realizing they’re inconsistent and unreliable. That’s not a good place to be.

So just what is the enterprising enterprise SEO practitioner to do? There are options, but unless and until the search engines themselves give us valid and accurate data, these are workarounds and conduits rather than outright solutions.

What tools to use

To be fair, even with accurate data from the engines we’d still use our own tools and perform our own analyses, so really we’re back where we started. There are some excellent tools we can use to track backlinks. Indexed pages, not so much, but there are workarounds for that, too.

There are three primary tools that every SEO worth her salt should be using (and probably already is). These are Linkscape, MajesticSEO and Open Site Explorer (OSE).

Although SEOmoz has gotten a lot of flak for the technology behind its tool Linkscape, it’s actually an awesome piece of kit. (I wrote “kit” so I could sound like a cool Brit. Yes, I failed.) There are many different ways one can use Linkscape to analyze, dissect, reverse engineer and just basically own the link information of competing websites.


Majestic is another great tool and it really shines once a site is authenticated (for sites you don’t own, you can pay Majestic a fee to get the same data). However, I’m eager to see how its data ages and how accurate its trending information proves to be.


Both of these tools do a great job analyzing backlink profiles and showing important metrics such as the domain and URL authority, the number of unique domains linking to a page or website, anchor text information and much more. They also make use of proprietary scoring methods: “ACRank” for Majestic and “mozRank” for Linkscape.

OSE is the newcomer, also from SEOmoz, and it’s designed to replace the already dying Yahoo Site Explorer (thank god). It’s another good tool and is based on Linkscape.


While we can’t accurately track indexed pages outside of the search engines, we can use a tool like SEMrush to analyze overall keyword visibility in Google. No, this isn’t going to give us the number of URLs in Google’s index, but it will surface a selection of terms a domain is ranking for (the data can’t be definitive, nor does it need to be in order to be actionable). Pretty darn useful, if you ask me.


The best part about all of these tools, including Google’s own Webmaster Tools, is that the data can be exported into Excel for further analysis. That’s where the magic happens.
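As an illustration of that kind of follow-on analysis, once a backlink report is exported, even the standard library is enough for a first pass. A minimal sketch, assuming a hypothetical export with source_url and anchor_text columns (the schema is made up, not the actual Linkscape or Majestic format):

```python
import csv
import io
from collections import Counter
from urllib.parse import urlparse

# Hypothetical backlink export (column names are illustrative only).
raw = """source_url,anchor_text
http://blog.example.com/post-1,best widgets
http://blog.example.com/post-2,widgets
http://news.example.org/story,Example Widgets
http://forum.example.net/thread,click here
"""

rows = list(csv.DictReader(io.StringIO(raw)))

# Unique linking domains -- a rough proxy for link diversity.
domains = Counter(urlparse(r["source_url"]).netloc for r in rows)

# Anchor text distribution, normalized to lowercase.
anchors = Counter(r["anchor_text"].lower() for r in rows)

print(len(domains))          # number of unique linking domains
print(anchors.most_common(3))
```

The same counters pivot naturally in Excel, which is where the deeper slicing usually happens.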

What metrics to track

The primary metrics we track are the percentage of overall site traffic from free search, the percentage share of each engine, free search traffic at the keyword level (using clustering for related terms) and the delta between branded and non-branded free search traffic. Then, you can slice and dice as deeply as you need to go, looking at bounce rates, conversion data and more. Often, we do specific analyses at the category or product page level, too.
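As a rough sketch of how the first three of those metrics fall out of raw visit data, consider a hypothetical analytics export (the field names and brand tokens below are invented for illustration; real exports will differ):

```python
from collections import Counter

# Hypothetical analytics export: one record per visit.
visits = [
    {"medium": "organic", "engine": "google", "keyword": "acme widgets"},
    {"medium": "organic", "engine": "google", "keyword": "buy widgets"},
    {"medium": "organic", "engine": "bing",   "keyword": "widget reviews"},
    {"medium": "referral", "engine": None,    "keyword": None},
    {"medium": "direct",   "engine": None,    "keyword": None},
]

BRAND_TERMS = {"acme"}  # brand tokens for this hypothetical site

organic = [v for v in visits if v["medium"] == "organic"]

# 1) Share of overall site traffic from free (organic) search
organic_share = len(organic) / len(visits)

# 2) Share of each engine within organic traffic
engine_share = Counter(v["engine"] for v in organic)

# 3) Branded vs. non-branded split, matching brand tokens in the keyword
branded = sum(
    1 for v in organic
    if any(tok in v["keyword"].split() for tok in BRAND_TERMS)
)
non_branded = len(organic) - branded

print(f"organic share: {organic_share:.0%}")  # 60%
print(dict(engine_share))
print(branded, non_branded)                   # 1 2
```

Bounce rates, conversions and page-level cuts are just more columns in the same export, grouped the same way.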

These are really the primary metrics we should track. You can go further, too, to track things like landing page yield and keyword reach, as illustrated by Brian Klais in the Analyze This column (which I highly recommend adding to your feed reader subscription list).

I really want to stop tracking indexed pages and backlinks using Google and Yahoo’s data. I need to stop. But until there’s a better way, until a reliable and accurate set of metrics and tools exists, this is the standard we have. This is the best we can do. Sure, we can use our own tools, like the excellent ones outlined here, perform our own analyses, and ensure that cutting-edge, competitive SEO strategies are being formulated.

But C-level executives will continue to request things like ranking reports, indexed page counts, backlink counts (from the engines, not from a third-party tool) and even toolbar PageRank. The problem with these “metrics,” if you can even call them that, is they’re only useful in documenting what the engines are saying. They aren’t useful in documenting a site’s health, potential or competitive SEO stature.

Opinions expressed in the article are those of the guest author and not necessarily Search Engine Land.



About The Author: Adam Audette is the Chief Knowledge Officer at RKG, where he blogs regularly. You'll find him speaking at conferences around the world when he's not riding down mountains on something fast. Follow Adam on Twitter as @audette.




  • Duane Forrester

    What is it about Oregon and smart search marketing folks?!

    Great article Adam – very useful insights. Though I will admit I gave up on ranking reports about 7 years ago myself, it’s nice to see folks continuing to state this out loud and back up the logic.

    I’m saving this article for future reference. :)

  • http://www.blizzardinternet.com Carrie Hill

    Great info – and very timely for me. I have a client that obsesses over the rise and fall of the site: data for his site – DAILY! One more person telling him it’s unreliable is a great thing in my book.

    I’d also like to see the day when I can stop tracking link and indexed page data in Y & G – unfortunately we spent the last 5 years training clients HOW to do that, so now they’re obsessing. I guess it’s better than the previous years of TBPR obsession :)

    Thanks for writing this – appreciate the insight!

  • http://joblr.net Mikkel deMib Svendsen

    Adam, the problem with many of the data sets available – now and in the past – is that they are very “geeky”. They are great for the few of us who know how to value them, but close to impossible to use for the average webmaster, manager or web business leader.

    That is why I decided to invent a brand new approach to “visibility reporting” – a technology that is now US patent pending and used in its early stages on Joblr.net – a service I recently launched with my great team (some of you may know a couple of them: Anne Kennedy and Jamie Low).

    The great thing about the approach we have taken with this is that we are not dependent on specific data sources but can change them over time and from market to market – and still maintain consistent (indexed on a 1-100 scale) reporting that can be used over time and between sites. You can even compare sites and competitors in different markets.

    This makes it very useful to a much greater audience than any reporting has been in the past. And if you want to know more, you can dig into the sources we used – in aggregated (scored) formats or the raw numbers.

    Of course geeks like us will need more than what Joblr provides – but that’s no news :)

    Adam, give me a call if you want to know more about my patent and Joblr. It’s quite interesting, I think :)

  • http://www.ioninteractive.com allenkristina

    Thanks for the great SEO checklist! It makes sense that checklists increase accuracy because they keep us on tasks and ensure we don’t miss a step!

    -Kristina, @Ion_interactive

  • http://www.ioninteractive.com allenkristina

    Very insightful post, Adam. It’s true the C-suite will probably continue to request ranking reports, and the like — this is where it becomes important for SEOs to educate. If the C-suite is on the side of SEOs, company wide support for the correct initiatives will likely follow. Thanks for the great analysis of worthy SEO tools!

    -Kristina, @ion_interactive

    (hit ctrl + c and had the wrong thing copied, sorry!)

  • http://www.audettemedia.com Adam Audette

    @Duane – I think it’s the clean mountain air! Awesome to get your feedback here, thank you.

    @Carrie – It’s funny, just because we “can” doesn’t mean we “should” yet that’s exactly what happens with this stuff.

    @Mikkel – Definitely interested to talk about Joblr and your approach to visibility reporting. I’ll shoot you a DM.

    @Kristina – Absolutely true, we need to educate and train teams, and that includes the C-level teams. I’ve found they are typically very interested in search because of its ramifications across the bottom line, so that’s a good thing.

  • http://www.seoeffect.com Keesjan Deelstra

    @Adam Interesting post! Here in the Netherlands it’s just the same, with clients begging for PageRank data and more stuff like that. With our online tool SEOEffect we take a different approach: instead of the dying SERP metrics – due to personalized search – we calculate ‘traffic share’ down to the keyword level. This KPI, with the average traffic share as a top KPI, can help you keep track of your SEO campaigns. If you are interested I can send you an invitation for the private beta. More about this KPI on http://www.seoeffect.com/blog/Two-New-Features-of-the-SEO-Effect-Keyword-Tool-SEO-KPIsBlog-title/

    @Mikkel, what I understand from your visibility KPI here http://joblr.net/default.asp?id=118 is that it’s calculated on ‘old metrics’ like SERPs, Alexa rank, etc. I don’t want to offend you, but is that patent pending worth it?

