• http://www.brickmarketing.com/ Nick Stamoulis

    Even the best tools still need a human on the other end to evaluate the data. I too have found “bad” links that actually sent really targeted traffic my way and would never cross my mind as something I should have been worried about. That’s why you have to look at multiple factors and not just one thing when evaluating the quality of a link.

  • Shannon Sofield

I see your point, but what if the link-testing tool is a perfect match for how Google sees the link? Shouldn’t you remove it then, as long as the organic traffic gained is greater than the direct referral traffic from the link? Or as long as the penalty decrease for the domain is worth the loss of the inbound link traffic?

  • juliejoyce

If it’s a perfect match and you’ve been penalized, then I don’t see a problem with removing a link if you depend on Google and that fact outweighs everything else. How can you tell if the tool IS a perfect match, though?

  • Shannon Sofield

Thanks. I agree with your examples, but they didn’t totally address the point I brought up.
    If I see a link like http://www.wesellbacklinks/yoursite.html and we get a lot of referral traffic from it, it still seems to make sense to remove that link based on the negative organic impact. In fact, I would argue that even if you get a ton of traffic from that link, it makes sense to remove it, as the automatic penalty will be domain-wide from Google’s point of view. I may be off on the last point here.
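The trade-off Shannon describes can be written as a simple rule of thumb. This is entirely illustrative: the function name, the inputs, and the weighting are all assumptions, since nobody outside Google knows how a penalty is actually applied.

```python
def should_remove_link(est_organic_gain, referral_visits, domain_wide_penalty=False):
    """Illustrative rule of thumb for a suspect link: remove it when the
    organic traffic you expect to recover outweighs the referral traffic
    you would lose, or whenever the penalty is assumed to hit the whole
    domain (in which case no amount of referral traffic is worth keeping it)."""
    if domain_wide_penalty:
        return True
    return est_organic_gain > referral_visits

# A link sending 400 visits/month still goes if the penalty is domain-wide:
print(should_remove_link(est_organic_gain=50, referral_visits=400,
                         domain_wide_penalty=True))  # True
```

The point of the sketch is only that the decision has two inputs, not one, which is why a single metric from a tool can’t make it for you.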

  • juliejoyce

    No I think you’re correct there. If the link is bad for you, for whatever reason, remove it. In the example with my own site, I don’t depend on Google for traffic there so I would not want to remove those links. I simply want people to look at more than metrics alone, but if you do depend on Google AND the tool can match their likes and dislikes, if it’s identified as being a problem, it probably is a problem and should be removed.

  • http://kercommunications.com/ Nick Ker

    I’ve been using LRT for a while as a starting point for link auditing severely spammed sites. The links LRT flags as absolutely toxic seem to be pretty accurate, for an automated tool.

  • Stu Morris

Nice post, Julie. I just started using Link Detox and find, as you say, that it’s more of a baseline tool. I just find it simply illogical that Google is making you throw out the baby with the bathwater. The line between good and bad is neatly tucked away in G’s little secret basket, and the work required, especially if you manage a product site that people link to, is ridiculous considering Google could just discount these links through an algorithmic pass or fail. So, sadly, as I go through my link profile I find myself erring on the conservative side; rather than relishing the wonderful link http://www.joewebmaster gave me, I now have to ask him to take it down.

  • CarlosVelez

    I’m with you there. Who knows what is “good” or “bad” except for the algo? We’re dealing with links that come from articles that were written in 2007. Why are we wasting time removing backlinks from articles written in 2007?

    So as Julie rightly says, the only approach is to look at the link and make the decision. But to do this, you have to be conservative and ruthless, as you’re suggesting.

  • http://www.aomservices.com/ Derek Abbring

A fast way I sort is by running a Majestic SEO report on the domains. Looking at domains rather than individual links speeds up the process, and you can usually tell whether a site’s content is relevant and high quality, or poor with no authority, just by looking at the top-level domain.
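Derek’s domain-first triage can be sketched roughly like this. It’s a minimal illustration, not tied to any real Majestic export format; the example URLs and function name are made up.

```python
from collections import Counter
from urllib.parse import urlparse

def group_by_domain(backlink_urls):
    """Collapse a list of individual backlink URLs into a count per domain,
    so each linking site only needs to be reviewed once."""
    counts = Counter()
    for url in backlink_urls:
        domain = urlparse(url).netloc.lower()
        if domain.startswith("www."):
            domain = domain[4:]
        counts[domain] += 1
    # Review the biggest sources of links first
    return counts.most_common()

links = [
    "http://blog.example.com/post-1",
    "http://blog.example.com/post-2",
    "http://spammy-directory.example.net/yoursite.html",
]
print(group_by_domain(links))
# [('blog.example.com', 2), ('spammy-directory.example.net', 1)]
```

Two judgments on the domain level here replace three judgments on the URL level; on a profile with thousands of links from a few hundred domains, the savings are much larger.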

  • Chris Green

    I’ve been running some backlink audits on Link Detox and LinkRisk side-by-side on some sites whose link profiles I’m very familiar with – to see if I could gain any more insight on the data.

The long and the short of it? The data provided and the classification/risk appraisals were very comparable. Both pulled up a few false positives, however, which highlights why you should always try to manually vet the majority of the links you plan to remove/disavow.

My preference at the moment is Link Detox; the breakdown of the data with regard to toxicity as well as risk means you have a lot more ways to drill down into it. If you’re not going to put absolute faith in the results (i.e. you’re not willing to request removal and/or disavow without checking), the more dimensions you have within the data, the better – it helps you make more sense of it!

  • Unbound Marketing

No link tool knows Google’s algorithm. The links pulled through are based on the tool’s metrics, not Google’s, which means you have to be careful about which links you remove, if you remove any at all.

The tools will still pull links through even if there are no bad links in your profile. Imagine paying for a tool and having it find no “bad backlinks” on your domain? They wouldn’t let their tools do that.

  • juliejoyce

    Excellent point.

  • juliejoyce

    That’s a good point. I do confess that I have never run a profile and seen zero bad links though, but I wonder what would happen if there were none based on a tool’s calibration? Would they just blow up?

  • juliejoyce

    The false positives worry me in any situation, but as you say, they are found in both tools and I’m sure that if I ran profiles through anything else, the same thing would be true. I love your point about breaking down the data as that’s a point that I didn’t make well enough. There IS a reason why a link gets flagged so it’s a good idea to dig into that.

  • juliejoyce

    That’s what I’ve seen too. From what I recall with the last profile I ran there, if the link was on a site that was no longer in Google’s index, it got flagged as toxic, and I’d definitely agree with that label.

  • Thom Disch

I have to admit that this whole Penguin update has me a bit flustered (technical term). While I think we should manage our linking activity as best we can, I’m wondering if all of the discussion and hype relating to Penguin updates doesn’t cause us to beat ourselves up a lot more than Google ever would with their Penguin penalties.

One of the sources of my frustration is this: I operate in a fairly low-competition keyword market, and I have some competitors that have been very successful creating lots of links from just a few domains, all owned by the same company. This self-promotion technique now results in very good search listings for these companies, which is something I thought was eliminated many years ago. Does this sort of self-promotion fall under the Penguin umbrella? Has anyone else had similar experiences, where sites with self-created links are now doing well? Is this only happening in noncompetitive keyword markets?

  • juliejoyce

You raise some good questions. First of all, I do think there’s a lot of fear out there right now, and sites that are not having any problems are freaking out and removing links. Meanwhile, there are people who are penalized, whose profiles are full of nothing but low-quality links, yet who remain convinced the problem is something else. I think it has everyone flustered, to be honest. If you think about it, though, it makes sense from Google’s perspective: if they can’t stop link manipulation with their algorithm, scaring people to death so that they don’t do it anymore is a viable option. What cannot be fixed by a machine can be (at least partially) fixed by fear.

    The fact remains that most techniques that violate Google’s guidelines or are considered to be very risky do still work in some cases. For every technique that’s said to have been handled so that it no longer works, you can find 10 SEOs who could point you to sites where it’s actually still working, and working very well.

    With regards to your other questions I’ll hush and see if anyone can answer you on the subject of similar experiences. Thanks very much for the comment.

  • Thom Disch

    Management by fear. I like it.

  • Emory Rowland

    No disrespect to tools in helping find the most egregious violations, but there is no way I would depend on a tool for something like this. Any link that gets disavowed needs to be looked at by a real pair of eyes since it “can potentially harm your site’s performance.”

  • Nick Garner

We’re using LinkRisk a lot… the main problem is that the tool has to crawl individual links to score them, so if you have a big site with hundreds of thousands of links, it’s really hard to come up with an aggregate score that means something – simply because you will run out of crawl credits. So at the moment we use Majestic to build a randomised set of URLs to assess, to get our overall score for a site.
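Nick’s sampling workaround can be sketched like this. Everything here is an assumption for illustration: the function names are invented, and the per-URL scoring callback stands in for whatever tool actually crawls and scores each link.

```python
import random

def estimate_profile_risk(all_urls, score_url, sample_size=500, seed=42):
    """Score a random sample of backlink URLs instead of the full profile,
    so a large site stays within a tool's crawl credits. `score_url` is a
    stand-in for the external tool's per-URL risk score."""
    rng = random.Random(seed)  # fixed seed makes the sample reproducible
    sample = rng.sample(all_urls, min(sample_size, len(all_urls)))
    scores = [score_url(u) for u in sample]
    return sum(scores) / len(scores)  # mean risk over the sample

# Toy example: pretend the URL length is the "risk score"
urls = [f"http://example.com/link-{i}" for i in range(10_000)]
print(round(estimate_profile_risk(urls, score_url=len), 1))
```

The usual caveat applies: a random sample gives a defensible estimate of the overall profile, but it can miss the handful of truly toxic outliers, so it’s a triage number, not a verdict.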

  • juliejoyce

    I agree. I think identifying problems would be close to impossible without the help of those tools and that’s where they really are invaluable.

  • Traian Neacsu

    … and that’s how link analysis should be started. Domain is good, dig deeper, domain is spam, email or disavow directly. That’s how I do it…

  • Jake

Good link analysis always requires a manual review… this is why so many people are being swept up in some of these Penguin issues despite having done a good job with their backlink profiles. We hope to make things more streamlined with Remove’em, but we have run into this challenge as well. You can’t solve everything with an algo… yet!