Q&A With Google’s Matt Cutts On How To Use The Link Disavow Tool

It’s been almost two weeks since Google launched its link disavowal tool. Some have been busy diving in and using it, but others have had more detailed questions about it. We’ve got some answers, from the head of Google’s web spam team, Matt Cutts.

Question:

How do people know what links they should remove?

Answer:

When we’re taking targeted action on some specific links, the emails that go out now include examples of bad links. We provide example links to guide sites that want to clean up the bad links. At the same time, we don’t want to help bad actors learn how to spam better, which is why we don’t provide an exhaustive list.

Question:

Why not list the bad links?

Answer:

That’s related to the first question, of course. As I said, we don’t want to help bad actors learn how to spam better, which is why we don’t provide an exhaustive list.

Question:

Who should do this?

Answer:

The post [Google’s announcement post last week] says anyone with an unnatural link warning. It also mentions anyone hit by Penguin, but I keep getting asked about this. I’m going to reiterate that if you were hit by Penguin and know or think you have bad links, you should probably use this too.

Question:

What if you don’t try to remove links? Given what a pain it is to get links off the web, why wouldn’t someone just use disavow? I know Google recommends requesting link removals, but from a technical standpoint, if they don’t do that and just disavow, it’s pretty much going to work, right?

Answer:

No, I wouldn’t count on this. In particular, Google can look at the snapshot of links we saw when we took manual action. If we don’t see any links actually taken down off the web, then we can see that sites have been disavowing without trying to get the links taken down.
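For context, the disavow file itself is a plain text file with one URL or domain per line, and lines beginning with # are comments that Google ignores. One way to keep a record of removal attempts alongside the disavowals (the domains here are invented for illustration) is to note them in comments:

# Contacted the owner of spamdomain1.com on 6/1/2012 to
# ask for link removal, but got no response
domain:spamdomain1.com
# Owner of spamdomain2.com removed most links, but missed these
http://www.spamdomain2.com/contentA.html
http://www.spamdomain2.com/contentB.html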

Question:

How are you dealing with index files? Do you have to disavow all the variations, such as these:

https://badsiteiwanttodisavow.com
https://badsiteiwanttodisavow.com/
https://badsiteiwanttodisavow.com/index.html

Answer:

We tried to cover this in the last two to three questions [of the announcement post]. Technically these are different URLs, so if you want to be ultra-safe, then you would list the URL variants.

Practically speaking, though, Google normally canonicalizes such URLs to a single URL, so if you’re going off the backlinks that you download from google.com/webmasters/, then you should normally only need to list one URL.
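One way to apply that advice, using the hypothetical domain from the question: list every variant if you want to be ultra-safe, or use a single domain: line, which covers every URL on the site at once:

# Ultra-safe: every URL variant that might have been crawled
https://badsiteiwanttodisavow.com
https://badsiteiwanttodisavow.com/
https://badsiteiwanttodisavow.com/index.html
# Simpler: one line that covers the whole site
domain:badsiteiwanttodisavow.com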

Question:

If you download and re-upload a disavow list, is there still a several-week delay before the fresh upload is acted upon, even if you upload the new list the same day, perhaps after catching a mistake?

Answer:

I would count on it potentially still being a several-week delay. If you have URLs A and B, and you download the file and edit it to add a new URL C, then it shouldn’t really affect A and B, but it will take time for disavowing C to go into effect.
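To make that concrete, here is a sketch of the re-uploaded file, where only the last line is new; the domains are invented for illustration:

# URLs A and B were in the original upload and remain in effect
http://spamdomain-a.com/badpage.html
http://spamdomain-b.com/badpage.html
# URL C is newly added; disavowing it takes effect only after the usual multi-week delay
http://spamdomain-c.com/badpage.html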

Question:

How long will it take sites to see any potential improvement? It seems like potentially months.

I.e., say you upload a file. It takes several weeks for that to be read. Then you might wait several more weeks for the next Penguin Update before the change would be reflected, right?

Or when you say multiple weeks, do you mean that really, the file might get read right away, but the changes might not be reflected until some Penguin or other update can act on those changes?

Answer:

It can definitely take some time, and potentially months. There’s a time delay for data to be baked into the index. Then there can also be the time delay after that for data to be refreshed in various algorithms.

Question:

Just to double-check, reconsideration should only be done if they’ve gotten a message about a manual action, correct?

Answer:

That’s correct. If you don’t have a manual webspam action, then doing a reconsideration request won’t have any effect.

Question:

Do manual actions specifically say if they are related to bad links?

Answer:

The message you receive does indicate what the issue with your site is. If you have enough bad links that our opinion of your entire site is affected, we’ll tell you that. If we’re only distrusting some links to your site, we now tell you that with a different message and we’ll provide at least some example links.

Question:

What about the www prefix? It sounds like, to be safe, you should do this:

domain:badsite.com
domain:www.badsite.com

Answer:

You only need the first line. If you do domain:badsite.com, then that also ignores all links from www.

[NOTE: I’m pretty sure this also means Cutts is saying that if you only disavow from a domain with the www prefix, and it also has a non-www variation, those will still be counted. But I’m double-checking on this].
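Taken together, the answer and the note suggest the safe pattern is to disavow the bare domain, which also covers the www form (badsite.com is the hypothetical domain from the question):

# Covers links from both badsite.com and www.badsite.com
domain:badsite.com
# A separate domain:www.badsite.com line would be redundant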

Question:

What prevents (and I can’t believe I’m saying this) the seemingly inevitable “negative negative SEO”? In other words, what if someone decides to disavow links from good sites, perhaps as an attempt to send signals to Google that these are bad? More to the point, are you mining this data to better understand what the bad sites are?

Answer:

Right now, we’re using this data in the normal straightforward way, e.g. for reconsideration requests. We haven’t decided whether we’ll look at this data more broadly. Even if we did, we have plenty of other ways of determining bad sites, and we have plenty of other ways of assessing that sites are actually good.

We may do spot checks, but we’re not planning anything more broadly with this data right now. If a webmaster wants to shoot themselves in the foot and disavow high-quality links, that’s sort of like an IQ test and indicates that we wouldn’t want to give that webmaster’s disavowed links much weight anyway. It’s certainly not a scalable way to hurt another site, since you’d have to build a good site, then build up good links, then disavow those good links. Blackhats are normally lazy and don’t even get to the “build a good site” stage. :)

Question:

One last try on something I asked when the tool launched. Why not simply discount links so there’s no need for people to disavow, rather than considering some links as negative votes capable of harming a site?

Answer:

As part of our efforts to be more open about manual actions, we’ve been providing more information to site owners about when links to their site are affecting our opinion of their site. Because of that additional information, webmasters have been paying more attention to their link profiles and trying to move toward higher-quality links. That’s a good thing.

But we understand that migrating toward higher-quality links also means that some sites feel the need to clean up previous spammy or low-quality links. Right now it can be a difficult task to clean up a site’s backlinks, and from listening to the SEO community we wanted to provide a tool that could help after site owners had already taken substantial steps to try to clean up their site’s backlinks.

Question:

Any last thoughts, comments or perhaps warnings of mistakes you’ve seen people make?

Answer:

I have gotten a couple of people asking, “If I disavow links, do I still need to do a reconsideration request?” We answered that in the blog post, but the answer is yes.

We want to reiterate that if you have a manual action on your site (if you got a message in Webmaster Tools for example), and you decide to disavow links, you do still need to do a reconsideration request.

We recommend waiting a day or so after disavowing links before doing the reconsideration request to give our reconsideration request system time to pick up the disavowed links, and we also recommend mentioning that you disavowed links in the reconsideration request itself.


About the author

Danny Sullivan
Contributor
Danny Sullivan was a journalist and analyst who covered the digital and search marketing space from 1996 through 2017. He was also a cofounder of Third Door Media, which publishes Search Engine Land and MarTech, and produces the SMX: Search Marketing Expo and MarTech events. He retired from journalism and Third Door Media in June 2017. You can learn more about him on his personal site and blog. He can also be found on Facebook and Twitter.
