The Competitive Linking Analysis Trap
Several years ago, the search engines began to slowly turn off the fire-hose of linking data they freely gave anyone familiar with the link: operator. That operator became just about useless as a method for discovering competitor links, identifying potential link targets, gathering competitive intelligence and so on.
I have a vivid memory of speaking at one of Danny’s conferences ages ago and telling the audience that the search engines had no mandate to freely give us marketers whatever linking data we wanted. That the link: operator was not a birthright, and that someday it would all go away. And, away it has gone (from Google, but FYI the little-known engine DuckDuckGo.com supports link:). Have a look.
In the years since, a new industry niche grew, an industry of third-party data providers that aimed to provide that which the search engines no longer would: Extensive backlink data and metrics. Please let me state for the record that this column is not an indictment of these third-party services.
Whether it’s Majestic, OSE, ahrefs, or Blekko (which was a free secret link data weapon for many of us for a long time), the companies that take on the monumental task of creating a private crawl of the Web for the sake of providing competitive linking intelligence do a fantastic job. I use many of them, including my own private backlink data tools.
The trap is not the data itself.
The trap is in how you interpret the data and what strategic moves you make as a result.
Most linking data providers have a tiered fee structure for access to linking data, and rightfully so. There are significant costs involved in crawling the Web, extracting and storing key data (follow, PageRank, anchors, etc.), and maintaining historical data for clients that want to track such things over time. And, somewhere along the way, the various bits and pieces of collected backlink data turned into KPIs for link building activities.
And, it’s in those KPIs where the traps can be found. What many of us forget is that no matter how good the linking data is from any third-party provider, it’s all still a proxy for Google. We use this proxy data because the search engines will not give us the exact thing we most want, which is the accurate linking data the search engines have.
So, in the absence of that, we are all dependent upon third-party providers for data. Again, let’s be clear: these companies provide an awesome service, and I’m glad for the data they give me. But, here’s where that data can trap you.
A year ago, if you were examining your backlink profile and comparing your site to your competitors using any of the third-party data providers, you might have looked at a signal such as anchor text. Seeing that a competitor had a larger number of exact match keyword anchor text links than you did, you may have decided to alter your strategy so as to close that gap.
You built exact match keyword anchor text links to catch up to a competitor because according to the data you were looking at, that was a metric that stood out to you as a likely reason that competitor was outranking you. This all seems perfectly logical.
But, let’s look at what was going on behind the scenes during that same year. The search engines were busy devaluing anchor text links as a ranking signal.
In other words, the very thing the proxy linking data was telling you that you needed turned out to be completely wrong, but you couldn’t know that, because the search engines don’t broadcast what they are devaluing beforehand, at least not in an overly obvious manner. We find out after the fact, usually in a panic because a site’s rankings have tanked.
I use anchor text as one example, but you could just as easily use any of the many well-known metrics and still fall into the same trap: follow/nofollow ratio, TLD distribution, deep link ratio. The number of signals that can be measured borders on the absurd; yet, many people make strategic decisions based on the very signals Google may, in fact, be devaluing at the exact time they are pursuing them.
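To see how mechanical these metrics really are, here is a minimal sketch in Python that computes a few of them from a list of backlinks. The field names (source_url, target_url, follow) are a made-up schema for illustration, not any provider’s actual API; the point is that the numbers are trivial to produce, and none of them tell you how Google weighs the underlying links.

```python
from collections import Counter
from urllib.parse import urlparse

def profile_metrics(backlinks):
    """Summarize a backlink list into common third-party-style KPIs.

    `backlinks` is a list of dicts with keys source_url, target_url and
    follow (bool). This schema is hypothetical, purely for illustration.
    """
    total = len(backlinks)
    follow = sum(1 for b in backlinks if b["follow"])
    # TLD distribution of the linking pages
    tlds = Counter(
        urlparse(b["source_url"]).hostname.rsplit(".", 1)[-1] for b in backlinks
    )
    # Deep link ratio: share of links pointing anywhere other than the homepage
    deep = sum(1 for b in backlinks if urlparse(b["target_url"]).path not in ("", "/"))
    return {
        "follow_ratio": follow / total,
        "tld_distribution": dict(tlds),
        "deep_link_ratio": deep / total,
    }

# A toy backlink profile (fabricated example domains)
links = [
    {"source_url": "https://blog.example.com/post", "target_url": "https://site.com/", "follow": True},
    {"source_url": "https://news.example.org/story", "target_url": "https://site.com/products/widget", "follow": True},
    {"source_url": "https://forum.example.net/t/1", "target_url": "https://site.com/", "follow": False},
]
m = profile_metrics(links)
```

Any of these outputs can be turned into a KPI and chased; none of them answers whether the signal still counts for anything.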
I’d love to tell you there is a perfect link profile that can be replicated across all sites, but there is no such thing, and no metrics, no matter how accurate, are going to answer for you the most important questions of all: what is it that Google is truly looking for, and how can you get it?
The answer to those questions may well indeed be found inside the mountain of data we can all access if we’re willing to pay for it, and many of us are willing.
But, having the data and knowing what to do with it, especially in today’s world of continually changing signals, is where the real expertise is going to surface, and where we, as the linking strategists, are going to earn (or not) our keep.
Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land.