The Problem With Identifying Problem Links

Since the last Penguin update, there has been a lot of chatter about examining your link profile in order to identify bad links. Whether you’ve been hit by that update or your site remains unscathed (for now), the potential danger of “unnatural” links is top of mind. Many people who’ve never given a second thought to their link profiles are running backlink reports and trying to identify whether they stand a chance of being hit by a future update or manual penalty.

However, if you’ve ever used a tool like Link Detox or LinkRisk (for the record, I do use both), you might have noticed something: some of the links identified as being the worst offenders are actually good or even great links. Sometimes, the links that you’d pick out as being the bad ones don’t get flagged at all.

It’s the same problem that any algorithm faces: it’s impossible to determine whether a link is 100% good or bad, because the metrics don’t tell the full story. These tools definitely help us narrow down the list of potential offenders, and when you’re dealing with a profile that contains tens of thousands of linking domains, it would be exceptionally difficult to wade through those manually without any help.

However, once your tool of choice has identified potential suspicious links, it is important to review this list with a critical eye. For each link in question, ask yourself: “Is this link good for my site?”

The Problem With Relying Solely On Tools

When I was beta testing LinkRisk, I ran one of my sites — a music site — through its system to see if we had any bad inbound links. There had been no intentional link building for the site and no paid links; generally, nothing more than content creation and social promotion. In my naive mind, I assumed this meant that we shouldn’t and couldn’t have any really bad links.

The results were surprising, though. A link identified as being extremely toxic was one that I would consider among our best — it brings us relevant traffic, and it was editorially given from a thematically related site (in this case, a local music site).

A second link identified as being a very poor one was from the local college radio station where my site’s owners host a weekly radio show — another relevant link that sends us nice traffic.

A third is from a music venue where we have sponsored shows and do a lot of offline marketing.

Actually, almost all of the links identified as the worst ones are totally natural and good for the site. They’re from websites of bands we’ve interviewed or local partners we’ve worked with. The site has never been penalized or hit by any big update, but if I relied solely on what this tool told me, I might pursue removal of a few of the site’s most valuable backlinks.

Another issue is that the two main tools that I use for link risk management don’t always agree on which links are suspicious/unnatural:

My site’s link profile is high risk according to LinkRisk and low risk according to Link Detox. Different data sources and proprietary scoring algorithms are obviously among the reasons for the discrepancy, but this still leaves me confused. How is the average webmaster supposed to make an accurate decision about which links might need to be removed or disavowed?
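One practical way to deal with tools that disagree is to compare their flagged-link lists directly: links flagged by both tools are stronger candidates for review than links only one tool dislikes. Here is a minimal sketch of that idea; the tool exports and example URLs are hypothetical, and both tools’ real reports contain far more detail than a bare URL list.

```python
# Compare flagged-link lists exported from two link-audit tools.
# Links flagged by both tools are the highest-priority candidates for
# manual review; links flagged by only one deserve a more skeptical look
# before any removal or disavow request.

def compare_flagged(tool_a, tool_b):
    """Split two iterables of URLs into overlap and tool-specific sets."""
    a = {url.strip().lower() for url in tool_a}
    b = {url.strip().lower() for url in tool_b}
    return {
        "both": a & b,    # strongest candidates for manual review
        "only_a": a - b,  # double-check before trusting tool A alone
        "only_b": b - a,  # double-check before trusting tool B alone
    }

# Hypothetical exports, one URL per line in each tool's report:
linkrisk_urls = ["http://spamsite.example/page",
                 "http://localmusic.example/blogroll"]
linkdetox_urls = ["http://spamsite.example/page",
                  "http://expired-domain.example/"]

result = compare_flagged(linkrisk_urls, linkdetox_urls)
print(len(result["both"]), len(result["only_a"]), len(result["only_b"]))
# prints: 1 1 1
```

Even the overlap set is only a starting point — every URL in it still deserves the manual “is this link good for my site?” check described above.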

What Should You Do?

Figure out what it is you want from a link. For one of my sites, I want local relevance and a sign that referral traffic has an interest in our content. For another one, I want someone to fill out a contact form (and hopefully sign up for something). For another, I want good search engine rankings. Knowing what you want to accomplish helps you determine what to do.

Visit the sites and look at the links. There’s obviously a reason why they were identified as being risky, so see if you can tell what that reason is. For example, one of my main “bad” links is from a site-wide blogroll on a massive site. It may seem “spammy” at first glance, but every blogroll link on that site is relevant to the site itself. Is that why this inbound link is being flagged? I’m truly not sure, but what I do know is that there are many good links getting flagged as suspicious. (Link Detox does provide a reason for each flagged link, however, which is quite helpful.)

Find out if these links are sending you any traffic. Look at the other stats for that traffic, too, especially the conversion information (if you track that) and metrics like pages viewed per visit or time spent on the site.


One of my “bad” links sent me visitors who clicked through several pages of the site (5.67 on average) and that’s an important metric to me for this site. We’re local, and we want people to click through a few posts and read what we have to offer. I’d rather see 70 people viewing close to 6 pages each than 1000 people landing on the site and immediately leaving.
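That kind of engagement check is easy to run against an analytics export. The sketch below ranks referrers by pages per visit rather than raw visit counts; the sample rows are hypothetical, stand-in data shaped like a typical (referrer, visits, pageviews) report.

```python
# Rank referring sites by engagement rather than raw visit counts.
# A "bad" link sending 70 visitors who each read ~6 pages can matter more
# than a link sending 1,000 visitors who bounce immediately.

def pages_per_visit(rows):
    """rows: (referrer, visits, pageviews) tuples -> list sorted by
    pages per visit, most engaged referrers first."""
    ranked = [
        (referrer, visits, round(pageviews / visits, 2))
        for referrer, visits, pageviews in rows
        if visits > 0  # skip referrers with no recorded visits
    ]
    return sorted(ranked, key=lambda r: r[2], reverse=True)

# Hypothetical analytics-export rows:
sample = [
    ("localmusic.example", 70, 397),      # flagged "toxic", engaged readers
    ("bigdirectory.example", 1000, 1005), # high traffic, near-instant exits
]

for referrer, visits, ppv in pages_per_visit(sample):
    print(f"{referrer}: {visits} visits, {ppv} pages/visit")
```

Swap in whatever engagement metric matters for your site — conversions, time on site, sign-ups — the point is to judge a link by what its visitors do, not by a risk score alone.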

The Bottom Line: Use Your Brain

Relying on tools alone is a poor way to analyze links. Different tools use different data, and if you’ve ever run a report in more than one tool at a time, I’m sure you’ve noticed that the data rarely match up exactly. Even conversion data in AdWords doesn’t always match what is pulled in through Google Analytics.

Heck, I can’t even get the Majestic API data to match what I get when I run a site in their own tool. If you can’t get the same result twice, why would you trust it completely to make decisions that could impact your site in the future?

I’m not at all dissing these tools, either. I think they’re invaluable, but I also think they need to be used with caution. If you have been penalized or badly hit by an algorithm update, I’d highly suggest using these tools to see where your problems might be, because in my experience, many webmasters and site owners simply are not able to identify all of their problem links for various reasons. I have yet to analyze a link profile that did not contain loads of bad links that were still live after a webmaster or business owner had gone through and done cleanup.

So, use these tools… but use them wisely.

Opinions expressed in the article are those of the guest author and not necessarily Search Engine Land.



About The Author: owns the link development firm Link Fish Media and is one of the founding members of the SEO Chicks blog.

  • Nick Stamoulis

    Even the best tools still need a human on the other end to evaluate the data. I too have found “bad” links that actually sent really targeted traffic my way and would never cross my mind as something I should have been worried about. That’s why you have to look at multiple factors and not just one thing when evaluating the quality of a link.

  • Shannon Sofield

    I see your point, but what if the link testing tool is a perfect match for how google sees the link, shouldn’t you remove it then as long as the Organic referral gained is greater than the direct referral traffic from the link? Or, as long as the penalty decrease for the domain from the link is worth the loss of the inbound link traffic?

  • juliejoyce

    if it’s a perfect match and you’ve been penalized then I don’t see a problem with removing a link if you depend on Google and that fact outweighs everything else. How can you tell if the tool IS a perfect match though?

  • Shannon Sofield

    Thanks. I agree with your examples, but it wasn’t totally clear on the point I brought up.
    If I see a link like: http://www.wesellbacklinks/yoursite.html and we get a lot of traffic from it in referrals, it still seems to make sense to remove that link based on the negative organic impact. In fact, I would argue that even if you get a ton of traffic from that link, it makes sense to remove it as the automatic penalty will be domain wide from google’s point of view. I may be off on the last point here.

  • juliejoyce

    No I think you’re correct there. If the link is bad for you, for whatever reason, remove it. In the example with my own site, I don’t depend on Google for traffic there so I would not want to remove those links. I simply want people to look at more than metrics alone, but if you do depend on Google AND the tool can match their likes and dislikes, if it’s identified as being a problem, it probably is a problem and should be removed.

  • Nick Ker

    I’ve been using LRT for a while as a starting point for link auditing severely spammed sites. The links LRT flags as absolutely toxic seem to be pretty accurate, for an automated tool.

  • Stu Morris

    Nice post Julie. I just started using Link Detox and find, as you say, it’s more of a baseline tool. I just find it simply illogical that Google is making you throw out the baby with the bathwater. The line between good and bad is neatly tucked away in G’s little secret basket, and the work required, especially if you manage a product site that people link to, is ridiculous, considering Google could just discount these links through an algorithmic pass or fail. So sadly, as I go through my link profile, I find myself erring on the conservative side: rather than relishing the wonderful link http://www.joewebmaster gave me, I now have to ask him to take it down.

  • CarlosVelez

    I’m with you there. Who knows what is “good” or “bad” except for the algo? We’re dealing with links that come from articles that were written in 2007. Why are we wasting time removing backlinks from articles written in 2007?

    So as Julie rightly says, the only approach is to look at the link and make the decision. But to do this, you have to be conservative and ruthless, as you’re suggesting.

  • Derek Abbring

    A fast way I sort is by using a Majestic SEO report on the domains. Just looking at domains rather than individual links can speed up the process, and you can usually tell whether the site content is relevant and high quality, or poor with no authority, just by looking at the top-level domain.

  • Chris Green

    I’ve been running some backlink audits on Link Detox and LinkRisk side-by-side on some sites whose link profiles I’m very familiar with – to see if I could gain any more insight on the data.

    The long and the short of it? The data provided and the classification/risk appraisals were very comparable. Both pulled up a few false positives, however, which highlights why you should always try to check the majority of those links you plan to remove/disavow.

    My preference at the moment is Link Detox; the breakdown of the data in terms of toxicity as well as risk means you have a lot more ways to drill down into it. If you’re not going to put absolute faith in the results (i.e., you’re unwilling to request removal and/or disavow without checking), the more dimensions you have within the data, the better — it helps make more sense of it!

  • Unbound Marketing

    No link tool knows Google’s algorithm. The links pulled through are based on their metrics and not Google’s which means you have to be careful which links you remove, if you remove any at all.

    The tools will still pull links through even if there are no bad links on your site. Imagine paying for a tool and it finding no “bad backlinks” on your domain? They wouldn’t let their tools do that.

  • juliejoyce

    Excellent point.

  • juliejoyce

    That’s a good point. I do confess that I have never run a profile and seen zero bad links though, but I wonder what would happen if there were none based on a tool’s calibration? Would they just blow up?

  • juliejoyce

    The false positives worry me in any situation, but as you say, they are found in both tools and I’m sure that if I ran profiles through anything else, the same thing would be true. I love your point about breaking down the data as that’s a point that I didn’t make well enough. There IS a reason why a link gets flagged so it’s a good idea to dig into that.

  • juliejoyce

    That’s what I’ve seen too. From what I recall with the last profile I ran there, if the link was on a site that was no longer in Google’s index, it got flagged as toxic, and I’d definitely agree with that label.

  • Thom Disch

    I have to admit that this whole Penguin update has me a bit flustered (technical term). While I think we should manage our linking activity as best we can, I’m wondering if all of the discussion and hype relating to Penguin updates doesn’t cause us to beat ourselves up a lot more than Google ever would with their Penguin penalties.

    One of the sources of my frustration comes from the following: I operate in a fairly low-competition keyword market, and I have some competitors that have been very successful creating lots of links from just a few domains, all of which are owned by the same company. This self-promotion technique now results in very good search listings for these companies. This was something that I thought was eliminated many years ago. Does this sort of self-promotion technique fall under the Penguin umbrella? Has anyone else had similar experiences, where sites with self-created links are now doing well? Is this only happening in noncompetitive keyword markets?

  • juliejoyce

    You raise some good questions. First of all I do think there’s a lot of fear out there right now and that sites who are not having any problems are freaking out and removing links. There are people who are penalized but convinced it’s not due to having bad links but their profiles are full of nothing but low-quality links, but still they think it’s something else. I think it has everyone flustered to be honest. If you think about it though, it makes sense from Google’s perspective. If they can’t stop link manipulation with their algorithm, scaring people to death so that they don’t do it anymore is a viable option. What cannot be fixed by a machine can be (at least partially) fixed by fear.

    The fact remains that most techniques that violate Google’s guidelines or are considered to be very risky do still work in some cases. For every technique that’s said to have been handled so that it no longer works, you can find 10 SEOs who could point you to sites where it’s actually still working, and working very well.

    With regards to your other questions I’ll hush and see if anyone can answer you on the subject of similar experiences. Thanks very much for the comment.

  • Thom Disch

    Management by fear. I like it.

  • Emory Rowland

    No disrespect to tools in helping find the most egregious violations, but there is no way I would depend on a tool for something like this. Any link that gets disavowed needs to be looked at by a real pair of eyes since it “can potentially harm your site’s performance.”

  • Nick Garner

    We’re using LinkRisk a lot… the main problem is that the tool has to crawl individual links to score them, so if you have a big site with hundreds of thousands of links, it’s really hard to come up with an aggregate score that means something — simply because you will run out of crawl credits. So at the moment we use Majestic to build a randomized set of URLs to assess, to get our overall score for a site.

  • juliejoyce

    I agree. I think identifying problems would be close to impossible without the help of those tools and that’s where they really are invaluable.

  • Traian Neacsu

    … and that’s how link analysis should be started. Domain is good, dig deeper, domain is spam, email or disavow directly. That’s how I do it…

  • Jake

    Good link analysis always requires a manual review… this is why so many people are being swept up into some of these penguin issues despite having done a good job with their backlink profile…. We hope to make things more streamlined with Remove’em, but have run into this challenge as well.. You can’t solve everything with an algo…. yet!

