• http://www.dotdynamic.ca/ Matthew Shepherd

    Absolutely. Everyone hears you loud and clear, Google… cut out all the spammy tactics. Lessons learned… now it’s maybe time to Tame The Penguin.

  • http://www.leadfigures.com Lead

    Would the contents of “badlinks.xml” have to contain two pieces of information per bad link – the originating page and the destination link? Or would we consider all links from that site bad? I’m trying to think whether the scenario of two links from the same site, one good and the other bad, is a valid one.
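
    This distinction matters for the file format. As a minimal sketch, a per-link schema could sit alongside a domain-wide one – note that the element names, attributes, and the “badlinks.xml” schema itself are all hypothetical, since Google never published such a format:

```python
# Sketch of building a hypothetical "badlinks.xml" file.
# The schema (element and attribute names) is purely illustrative --
# Google never specified one.
import xml.etree.ElementTree as ET

def build_badlinks(per_link, whole_domains):
    """per_link: list of (source_page, target_url) pairs for individual
    bad links; whole_domains: domains where every inbound link is bad."""
    root = ET.Element("badlinks")
    for source, target in per_link:
        # one entry per bad link: where it lives and what it points at
        ET.SubElement(root, "link", source=source, target=target)
    for domain in whole_domains:
        # blanket entry: treat every link from this domain as bad
        ET.SubElement(root, "domain", name=domain)
    return ET.tostring(root, encoding="unicode")

xml = build_badlinks(
    per_link=[("http://mixed-site.example/spam-page.html",
               "http://www.mysite.example/")],
    whole_domains=["pure-spam.example"],
)
print(xml)
```

    A per-link entry would cover the mixed case raised above (one good link and one bad link from the same site), while a domain-wide entry would handle wholesale spam.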

  • http://twitter.com/DavidJo45324615 David Johnstone

    I agree with the sentiments of this article – negative SEO should not even be an issue in 2012 – it’s like Google’s taken a huge regressive step in punishing sites for off-page signals. 

    I just disagree with the way of going about it.  If webmasters have to be proactive, it means we have to constantly monitor our inbound links using special tools, THEN submit an .xml file to Google when we see suspicious links.  What if someone Xrumer’s your site? That’s a lot of heavy lifting right there.  

    I think Google just needs to be neutral on what they deem to be spammy links – end of story. Just don’t count them. Let spammers waste their time, and keep them in the dark.

  • http://twitter.com/doddssteeleroad stephen dodds

    I think that there should be some way to acknowledge dodgy links – as you point out, we are already submitting plenty of information to Google, either through our own direct actions or the very fact that we have sites we are responsible for.
    Also, as you point out, isn’t the current reconsideration process already halfway there? I am fed up of repeatedly trying to get old legacy links taken down from long-defunct sites – and as for scrapers…

  • MertS

    Google’s problem with that solution is this: how does link devaluation help prevent spammers from spamming? Devaluation makes things neutral, so what is the incentive for the spammer not to spam again? Spammers will say: “Oh, I spammed, oops, I got caught. Now I will lie to Google, saying someone is spamming me. Help, help.”

    I think that’s why Google has stopped doing that.

  • http://twitter.com/kevgibbo Kevin Gibbons

    Personally I think the rumours of having this as a feature in GWT is a great idea – would definitely be useful for a lot of sites.

    However, the main problem I can see from Google’s perspective is this: what if SEOs build a bunch of low-quality or paid links which they find aren’t working for them? Surely just being able to discredit links because they’re not working isn’t helping Google out – it almost encourages riskier tactics, even if only from a minority!

  • Felix McCelery

    Wouldn’t you just have an arseholes.txt listing crap domains you’ve found?
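
    Joking aside, a flat text file with one referring domain per line would be about the simplest workable format – something like this (the filename and format are hypothetical):

```
# one bad referring domain per line (hypothetical format)
spammy-site.example
link-farm.example
scraper-network.example
```

    The trade-off versus a per-link format is that a domain-wide list can’t express “one good link and one bad link from the same site”.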

  • http://www.seoautomatic.com shendison

    I think something like this is long long overdue.  The “suspicious links detected” emails have never done anyone any good, aside from Google as they can more easily identify sites where people have removed links.  

    I think Google really should either show us the actual “suspicious links” in Webmaster Tools or at least allow us to mark them as suspicious ourselves. Where is the upside to keeping them secret? I had an incident last summer (through no doing of my own) with over a million spam links pouring into a site over a month, and there was no way to report it. Many of them did show up in WMT, too, and I was in a panic – someone was bombing me. I did email the reinclusion team, but of course got no response, and I simply lived in fear that the site would tank. Instead of tanking, the site rose – sabotage backfire – then the links and the rise were both gone in a couple of months ;(

    We really need to be able to flag our links, Google, and I agree w/ Matt “Everyone hears you loud and clear”…

  • http://twitter.com/netagence netagence

    I second David Johnstone’s and Kevin Gibbons’s opinions. I’m concerned that having to do the work ourselves would require both a *lot* of time and resources such as MajesticSEO, which, no matter how great it is, requires some spending. Plus, I immediately see hackers trying out even crazier and shadier linking methods if they knew they had the safety net of having suspicious links discounted. Those? Nahh, I’m not the one who created thousands of forum signatures in 24h, just ignore them, dear GG!
    In the end, this remains a Google issue if they wish to remain pertinent, so the same work is at stake, which could need human supervision, but I don’t think webmasters should generally be the detection source. Having a way to signal an obviously bad linking operation against one’s site would be a plus, but given the recent examples of stolen/duplicate content reports not being taken into account six months after filing the official form, I still have to be convinced…
    Best regards from France !

  • JassyPets21

    So overworked small business owners must now GUESS at which links may or may not be bad, and report them to Google so that they can be ignored? This is an awful idea.

    Google can very easily make negative SEO impossible. Stop passing negative value from “bad” links. And stop penalizing pages for over-optimized anchor text.

    The problem is that they would RATHER allow negative seo (and its consequences) so that they can punish people who they believe are trying to game the system.

  • RyanMJones

    So I still haven’t seen solid proof that negative SEO is an issue. Where are the sites that are disappearing?

    From what I’ve seen, negative SEO IS possible, but not simply by posting a few spammy links to your competitors. (Hacking, DDoS, hijacking, fraudulent takedown notices, etc. still work, but that’s a different discussion.)

    The only sites I’ve seen get hit are those that only ranked due to shady practices to begin with. Theory: The added links seemed to cause the algorithm to take another look and discredit many of the spammy links they were once crediting.

    I’ve yet to see proof of any site that ranks based on quality factors getting hit because of links. That’s why nobody took Rand up on his offer to have negative SEO done to SEOmoz – they knew it couldn’t be done.

    I think we should confirm whether we actually have a problem before we ask Google to fix it.

  • http://majuter.us/ Leo Ari Wibowo

    Although it sounds like a good idea, I think it’s a waste of time. Just let Google fix their flawed system themselves. They know what their problems are, and I believe they know how to fix them (sometimes).

  • http://twitter.com/AaronFriedman Aaron Friedman

    @RyanMJones:disqus I will admit, I agree with you – we need proof that it actually works, and what Rand did was awesome and epic. Sites like Moz, though, will be tough to take down no matter who tries.

    Much of what we talk about is speculation. My point was simply: let users take action, help control link spam (or spam in general), and report it to the engines. It’s kinda like putting more police officers on the street to reduce crime :)

  • JassyPets21

    So RMJ, since you haven’t “seen” the proof, it can’t exist?

    Site A ranks #1 in google for a term. A couple guys take it upon themselves to test negative seo. They create TONS of spammy links with the exact same anchor text as the #1 term being ranked. And within weeks the ranking for the term tanks. This is pre-Penguin. Is this proof? Nope. But I’ve seen it.

    Site B (one that I do have VERY personal knowledge of) was ranking well for a variety of terms pre-penguin. Rankings tanked from the penguin update. For instance, page XX on the site ranks top 5 for “keywordCC”. After penguin that page no longer ranks in the top 100 for that term or many others. Yet on the same site, another page (page YY) now ranks top 20 for “keywordCC”.

    Page XX has a couple thousand inbound links (some good, some spammy) and over-optimized anchor text. The page itself is well optimized with original content – a page that a search engine should like for its on-page optimization.

    Page YY has virtually no content or inbound links. No reason it should rank well. Certainly no reason it should outrank page XX.

    If the “bad” links to page XX had only been devalued (there are good links too), it would still logically outrank page YY. But it doesn’t. Why? One of two things (or both) is happening: 1 – Google passes negative value from links they deem “bad”. And 2 – a page or site can be penalized for over-optimized inbound anchor text.

    Personally I suspect, but cannot prove, that Google is doing both. I don’t have the resources or time to conduct an adequate experiment.

    And just because attempting to tank SEOmoz is a stupid experiment and wasn’t done doesn’t mean it can’t be done. You can’t announce your target if you really want to conduct an experiment like this.

  • RyanMJones

    All I’m saying is that if this were really a problem that needed a solution, we’d be seeing a TON of sites being sabotaged by competitors – and we’re not. A lot of SEOs are just being paranoid, asking for a solution to a problem they don’t have just because they think it might be possible.

  • RyanMJones

    And also: if they do implement this, my first move will be to fire up ScrapeBox and Xrumer and let them run for a few weeks building me millions of links. Then, if I get banned, I’ll just log in to Webmaster Tools and say “hey, it wasn’t me.”

  • JassyPets21

     Maybe you’re right. I hope you are. I completely understand why Google wants to stop the spammy backlinks.

    My point would be this: in the past, people have been able to game Google for rankings with spammy links. People have used them because they worked. But what do you expect less legit SEO companies to do if easy, spammy backlinks no longer work, yet negative SEO does? You think they’ll just close their doors and say, “We’re OK with building spammy links that directly benefit our clients, but we would never build spammy links to hurt our clients’ competitors”?

    I’m concerned that Google is opening Pandora’s box. Negative SEO has been around for a long time, but it was difficult and time-consuming. It now appears to be neither difficult nor time-consuming to take down a mid-level/small company site for specific terms.

  • JassyPets21

     You won’t be alone in this tactic.

  • http://gyitsakalakis.com/ Gyi Tsakalakis

    Love the idea in theory, but as pointed out by @RyanMJones:disqus it’s just too susceptible to gamesmanship.

    Seems like the solution is negating the value of spammy links, not penalizing them.

  • Ramón Garcia

    There are two problems:

    First, a lot of good sites that built spammy links have dropped out of the SERPs. You can see it – the SERPs are lower quality than two months ago. Good sites with good content have dropped out of Google. Maybe you can argue that Google is a private company and can do whatever it wants. OK.

    But here is the second problem. A lot of webmasters are waiting to confirm whether negative SEO is real. If it is confirmed, we can expect a big increase in spam against websites. Think of:

    -100,000 forum posts for $120 (every post includes at least 3 links, because you have to create the profile).
    -100,000 blog comments for $30 (ScrapeBox).
    -Web 2.0 properties, Google+, Facebook likes, Facebook fans, Twitter retweets or followers – all quite inexpensive.

    We’ll see it in six months, once spammers are sure that their actions are “bad” for their competitors. Today they spam one site only moderately; tomorrow they’ll hit 30 more websites with spammy links.

    Thanks, Google – thanks for being the biggest promoter of spammers.

  • http://twitter.com/BeckyLehmann Rebecca Lehmann

    I do think you could totally test some spam links and then block them later. But I think that’s probably okay. You get immediate feedback on what works and what doesn’t, and perhaps some myths get dispelled. If you get caught for something you did that you knew was black hat, you’re still out the money and time spent on it – almost certainly time and money better spent somewhere else. And blocking it isn’t a guarantee that you’ll regain your rankings.  Any recovery from blocking a bad link is likely to be partial at best. 

  • http://twitter.com/DavidJo45324615 David Johnstone

    A million “unnatural links” notices (and resulting PENALTIES, not link loss) prove negative SEO exists. If you can inadvertently penalise your site from off-site tactics, a competitor can deliberately do the same to your site.  Google doesn’t know who built the links to your site.

  • http://twitter.com/ahmansoor SEO Specialist AHM

    There are a couple of flaws in this theory. A few have already been discussed; here is another one:
    If one could say to Google “please ignore this backlink”, Google would never be able to use sentiment as a ranking factor, because everyone would request removal of negative mentions/reviews.

  • http://twitter.com/AlexHavian Alex Havian

    Well, Google did post an update to Google Webmaster Tools on 5/22/2012 saying “Google works hard to prevent other webmasters from being able to harm your ranking or have your site removed from our index.” I assume they’re currently working on it and it should be fixed in a few months…

  • http://twitter.com/handtrucks2go Handtrucks2go

    A similar or better idea might be to be able to mark these links within Webmaster Tools under Links to Your Site. What do you think?

  • http://twitter.com/enovabiz Aniruddha Badola

    Google is making driverless cars but cannot make an algorithm which can find paid links or unnatural links with precision and no collateral damage. Why is Aaron pleading with Google to put the onus on webmasters for finding negative links? How is every website owner and webmaster supposed to find all the negative links?

    If Google finds a not-so-white link, it should just not count it instead of dishing out penalties.

  • http://www.brickmarketing.com/ Nick Stamoulis

    “Because really, in today’s day and age, it’s ridiculous that Negative SEO should be part of the conversation.”

    I definitely agree with you on this one. I don’t really believe that a website that is squeaky clean and totally white hat is actually going to be undone by negative SEO. I have to think that if a site has been online for 5+ years with nothing but great SEO under its belt, and all of a sudden 1,000 spammy links show up in its profile, that is going to signal to Google that something is amiss.

  • Johndx

    I don’t want to be mean, but this is ridiculous. I can buy a 3,000-link package on Fiverr to link to your site. Want to copy and paste 3,000 links into your BadLinks file? How about going over them one by one to see which is which?

    The only moral way for Google to deal with this is to discount the links. Make them worthless. They can still deindex networks and other crap so people won’t waste time and money on them, but asking webmasters to be proactive shows a lack of understanding of how easy bulk link building is these days.

  • http://www.bestrank.com/ Mike Shannon

    Having a robots.txt command (or Webmaster Tools feature) to ignore all links from a given site would be nice and easy.
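
    As a sketch, such a directive might look something like this – note that the `Ignore-links-from` syntax is entirely hypothetical; no such robots.txt command exists:

```
# hypothetical robots.txt extension -- not a real directive
User-agent: Googlebot
Ignore-links-from: spammy-site.example
Ignore-links-from: another-bad-domain.example
```

    Putting it in robots.txt would have the side benefit of proving ownership, since only the site operator can edit that file.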

    And removing an inbound link’s weight from page A on a spammy domain while not removing an inbound link from page B on the same domain sounds OK at first, but page A might well link to page B, thus still bringing over some negative weight to your domain… besides, you wouldn’t want any links pointing at you from a spammy domain anyway.

    I personally think Penguin had a lot to do with too many inbound links from sites within the same niche, and not necessarily over-optimized anchor text; over-optimized anchor text is easy to see since it’s a percentage, but the concept of a niche is an AI problem that probably got a boost with the Penguin update. So having the ability to remove entire domains from your inbound link profile seems to make sense.

  • http://www.facebook.com/ChristopherJamesNoble Christopher Noble

    I’m with most people here . . . no need to mess around, Google just needs to ignore/discount any links it considers “bad” . . . not penalize for them. Simple and no work for us or them.

  • http://twitter.com/RegDCP Reg Charie

    As far as I can see, and from what the big G tells us, linking has been devalued to the point of having no influence on SERPs, good or bad.

    Back in 2010, Google perfected its textual relevance algos to the point where input from links was not needed.
    In fact, their LSI-based algos were so good they substituted relevance for the PageRank figure of the linking page in their PageRank calculations.

    While PR is still around, it is a standalone metric and does not impact SERPs to any measurable extent.

    In response to Ryan M, I find that the sites that got hit were probably link building without concern for relevance.
    That is to say, their link building was based on the PR of the linking pages, relevant or not.
    Google has stated of MANY of the updates that they were not algo changes, but merely “recalculations”.

    Think how a page would be hit if its link profile were recalculated by removing credit for non-relevant links, using the link anchor text on the non-relevant pages.

    Google has gone to great lengths to remove “spammy” links by delisting link farms, thin content, ad-heavy pages, and similar pages of little or no value. They also consider a non-relevant link spam, regardless of the “quality” of the linking page.

    Jassy, what would happen if your example page XX was in the top position because of the number of non-relevant links, and these were removed?
    You state that page YY had no links, so it was unaffected by a link recalculation.

    Since Google has closed the doors on linking, negative SEO cannot exist.


  • http://twitter.com/RegDCP Reg Charie

     David, what if these unnatural link penalties were only applied to links built pre-Penguin?
    What if these unnatural link notices were sent because of linking PATTERNS and non-compliance with the relevance factor?

  • http://twitter.com/RegDCP Reg Charie

    Ramón, what if links do not count towards SERPs anymore?

  • http://twitter.com/RegDCP Reg Charie

     What if the site being penalized HAD been doing link building intended to influence their SERPs? Would you consider the damage as “collateral damage”?