• http://www.deepanshugahlaut.com/ Deepanshu Gahlaut

    Could you point me to the document where Google has described the degrees of Penguin?

  • http://meancreativity.com/ Mason Pelt

    I didn’t know Search Engine Land was a tabloid. This feels like the SEO version of an article about a Justin Bieber tweet.

  • http://babypickel.com/vincenzo.html Chenzo

    Has another update been confirmed recently?

  • Durant Imboden

    Penguin is an algorithm, not an on/off switch. Of course sites can be affected to different degrees.

  • http://www.megrisoft.com/ Mohnesh Kohli

    So bad link cleaning is a continuous process.

  • Scott Davis


    At this point can we just get a message in the WMT saying we’re penalized… How are we supposed to “guess” whether we’re under a penalty with multiple levels of penalization?

    This is especially a concern for new clients coming on board.

  • http://www.drugpossessionlaws.com/ David Matson

    This x1000. Whether you are trying to recover from an algorithmic penalty, or build a new site that has never been able to get top rankings, actually knowing what you are up against is important.

    Especially now that it is clear that Google places A LOT of the responsibility for managing your backlink profile on webmasters.
    Are those duplicative scraper sites hurting me? So many questions….

  • http://www.drugpossessionlaws.com/ David Matson

    Legit point, but this is actually an important insight that we’ve never heard before. It just happens to be buried in a tweet.

  • Shawn

    There is no external document for you to see.

  • Chris Koszo

    @Scott Davis:disqus if you get a message in GWT, partial or site-wide, it’s actually a blessing in disguise: if you clean up your links and get the penalty removed, it basically means you’re also good to go as far as Penguin and the algorithms are concerned, so you should be in good shape. Remember, too, that it’s somewhat common for websites to receive a manual penalty and then get hit by Penguin.

  • http://www.drugpossessionlaws.com/ David Matson

    Also, the tweet from Matt above has other people replying to it, including another reply from Matt about a different site: “you have a slightly stronger case of Penguin.”

    Soooo, if you are lucky, I guess Matt will tell you how strong your penguin penalty is, but that information is not available to just anyone who wants it.

  • Enrico Altavilla

    Well, I really hope that most SEL readers didn’t need the confirmation given in the article, but at least the post gives us the opportunity to talk a bit about rankings.

    What Matt Cutts said actually refers to a very common characteristic of search engines: rankings are produced by assigning each resource in the SERP a “ranking value” and then sorting the resources by those values.

    Algorithms like Penguin work as “modifiers”: they produce a “penguin value” for each resource or website; then, during the ranking phase, the “ranking values” are multiplied by the “penguin values”, and this is how a “ranking value” can be reduced.

    If the “penguin value” of a resource is 1, nothing changes, because anything multiplied by 1 stays the same. But if the “penguin value” is less than 1, for example 0.5, 0.4, 0.3… then the ranking of that resource is reduced according to its “penguin value”.

    So, “a very mild case” just means that the “ranking value” of a resource was multiplied by a “penguin value” very near to 1, for example 0.95. The “ranking value” decreases, but not by much.

    If these numbers are confusing, just think of them as percentages: the ranking value of a resource can be kept at 100% (no penalization), reduced to 95% (mild penalization) or reduced to 10% (extremely penalized).

    I’ve over-simplified the topic, but that is the underlying concept.
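    The “modifier” idea described above can be sketched in a few lines of Python. This is purely illustrative: the field names, base scores and “penguin values” are made up for the example and are not Google’s actual data or code.

    ```python
    # Hypothetical sketch of ranking with a multiplicative penalty:
    # final score = base ranking value * "penguin value" in (0, 1].
    # All names and numbers are invented for illustration.

    def rank(results):
        """Sort results by base_score * penguin_value, highest first."""
        return sorted(
            results,
            key=lambda r: r["base_score"] * r["penguin_value"],
            reverse=True,
        )

    results = [
        {"url": "clean-site.example",  "base_score": 80, "penguin_value": 1.00},  # unaffected
        {"url": "mild-case.example",   "base_score": 90, "penguin_value": 0.95},  # "very mild case"
        {"url": "severe-case.example", "base_score": 95, "penguin_value": 0.10},  # heavily devalued
    ]

    for r in rank(results):
        print(r["url"], r["base_score"] * r["penguin_value"])
    ```

    Note how the “severe” site has the highest base score yet drops to the bottom, while the “very mild case” only loses 5% of its score and stays near the top.
    
    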

  • http://www.drugpossessionlaws.com/ David Matson

    Good points.
    For me, the interesting take-away is that it’s trivial for Matt Cutts to tweet back at people telling them whether they have a “mild” or “strong” Penguin penalty, but it is basically impossible for anyone else to verify this for their own site.

    Other than wait 6 months for a refresh, and maybe be able to tell if they fixed the problem.

  • Ralph D. Klonz

    I’d like to find out where to look to see if some of my sites have a mild case of Penguin.
    Cheers from Texas
    Cheers from Texas

  • http://www.dreamtechie.com/ Yogita Aggarwal

    @megrisoft:disqus Just like quality link building… bad link cleaning is an ongoing process for webmasters.

  • Dave

    If Matt can tell people the level of impact of Penguin, he must be using some tool or application to determine it. Why not integrate it into Google Webmaster Tools? Then he wouldn’t have to reply to tweets about the level of impact; people could view it directly in GWT.

  • http://w3digit.wordpress.com/ Clal Lodh

    There are a lot of technological limitations at Google, and a monopoly. They can’t impart exact information in GWT, such as which links are unnatural and what degree of penalty results from off-page and on-page factors. Google has only published its guidelines but never resolves the problems webmasters are facing.

  • http://w3digit.wordpress.com/ Clal Lodh

    Could you point me to the source for your description of what Matt said?

  • http://www.otriadmarketing.com/ Christopher Skyi

    “I’d recommend continuing to clean backlinks though. You still have a very mild case of Penguin.”

    It sounds like the severity of Penguin is determined by the number of backlinks and the degree of poor quality; as you clean up, the severity (degree) of Penguin starts to go down. That’s a bit different than the idea of different “degrees” of Penguin, like they’re sitting in a box somewhere and Google pulls one out for one site and a different one for another site. It’s more like they turn up or down a dial depending on how bad things are in a link profile.

  • Enrico Altavilla

    I don’t have a single resource to suggest, because we are talking about a very basic feature of ranking algorithms and formulas, but a good first step toward understanding the mechanism of “weights” that change the ranking values of a resource is to look at the ranking/scoring function of a generic search engine, like Apache Lucene:


    You’ll see that most scoring formulas can be expressed in terms of variables that are multiplied together, and each of those variables can take many values. That means you never get a “yes/no” result, but many possible results, depending on the value of each variable.

    Algorithms like Panda and Penguin, in particular, share a common methodology that was described by Google’s Amit Singhal when Panda was released. This methodology classifies websites according to their distance from a boundary that separates two classes of resources, for example high quality and low quality, or spammy and not spammy. A resource can be very near the boundary or very far from it, so, again, these algorithms don’t classify resources in a black/white way but take into account many shades of grey.

    (Amit Singhal description of Panda: http://www.wired.com/2011/03/the-panda-that-hates-farms/all/1 )

    If you want to acquire more information about the inner workings of a ranking function, you could read a book about information retrieval, like http://nlp.stanford.edu/IR-book/
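    The distance-to-boundary idea mentioned above can be sketched with a toy linear classifier. Everything here is invented for illustration (the features, weights and thresholds are not Google’s): the point is only that the signed distance from the decision boundary gives a graded result, not a yes/no label.

    ```python
    # Toy sketch of classification by distance to a boundary.
    # A positive score means the "clean" side, negative means the "spammy"
    # side, and the magnitude gives the "shades of grey" Enrico describes.
    # Feature names and weights are hypothetical.

    def signed_distance(features, weights, bias):
        """Linear score: positive = clean side, negative = spammy side."""
        return sum(f * w for f, w in zip(features, weights)) + bias

    # Hypothetical features: [share of natural links, anchor-text diversity]
    weights = [2.0, 1.0]
    bias = -1.5

    clean_site  = [0.9, 0.8]   # far on the clean side
    border_site = [0.5, 0.5]   # sits right on the boundary: a "mild case"
    spam_site   = [0.1, 0.2]   # far on the spammy side

    for name, site in [("clean", clean_site), ("border", border_site), ("spam", spam_site)]:
        print(name, round(signed_distance(site, weights, bias), 2))
    ```

    A site near the boundary could plausibly be mapped to a penalty multiplier near 1 (a “very mild case”), while a site far on the spammy side would get a much smaller multiplier.
    
    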

  • Enrico Altavilla

    The “dial” is a very good metaphor for what actually happens. It’s just a value that establishes how much the rank of a resource should be devalued. That’s what Cutts meant.

  • http://www.myrtlebeachwebdesign.com/ Jan Chilton

    I’ve noticed that since the first of the month, real estate sites that were formerly filtered for overdoing keywords on the homepage are not only back, but are at the top of the SERPs. Put ’em on, you can’t win. Take ’em off, you can’t win. :-)

  • Eduardo Sobral Guilherme

    @Tannuty:disqus , unfortunately you are right. But don’t say it lightly, or as if it is a “naturally acceptable situation”. I don’t ask Google’s webspam team to come write my content or build links for me, so they shouldn’t ask (or force?) me to do research on places where I have backlinks – they are external and not on my website or in my webspace. It makes me really angry to see people talking casually about something that would have been outrageous some years ago.

    It’s ridiculous what Google has turned webmasters into, just because of their webspam team’s massive flaws in detecting and ignoring bad links.
    Matt Cutts has been doing a terrible job for the algorithm and for Google itself. Instead of doing his work, he is trying to make the community do it for him. Really smart, but Google should fire him and hire someone actually capable of making the algorithm artificially intelligent, or at least of leading a team in that direction. Matt Cutts is a great PR person, and that’s probably how he convinced and pulled the strings inside Google to take those crazy ideas forward (and nothing in this post is trying to attack him personally; I’m just giving my opinion regarding his – terrible – work as head of the webspam team). The top 10 search results have less spam. True. All hail Matt Cutts. But at what cost? How damaged is the internet after a decade? If you are or were the best-quality website for some micro niche, it’s not because of hundreds of bad backlinks that you stop having quality. It’s not because you don’t use disavow tools that you lose your quality, right?
    Google is taking the freedom from the internet, pushing “quality guidelines” that are more and more lacking in quality.

    Webmasters, don’t focus on quality content and user experience. Focus on working as Google’s henchmen, policing the internet for what the algorithm can’t detect! The internet, especially the search space, is becoming a rotten place. More and more I wish some new search engines would show up and Google’s monopoly dropped to 30% or so, split between three search engines. But guess what? That’s not gonna happen. So stick by Google and their rules and do free work for them.
    And then there’s the #1 holy reason from Google: “Don’t like it? Don’t do it and you’ll just disappear from the rankings. It’s up to you…”

    Maybe in a few years Google will ask webmasters to review random lists of links and rate them for quality (since their algo will keep being flawed), and the more hours you work for Google doing its job, the more chances you will have of increasing your rankings… And after that, after all webmasters stop complaining and get used to one more strange change, Google will start penalizing the ones who refuse to do free work for the Big G, and once again: “Don’t like it? Don’t do it and you’ll go out of our search space, since it’s ours and we do whatever we want with it.”

    I always thought search engines tried to mimic the actual world in terms of reputation. If you are respected in some niche, people value your opinion. And if you talk about me, I get respect in that niche. Now Google isn’t able to tell whether you’re saying good things about me because you want to or because I bribed you to do it. And when Google finds dodgy people talking about me, they will penalize me, when they should just ignore those people, since they have no standing in the matter. But no, they want me to go and spend hours looking at who’s talking about me, and tell Google, “Look, this scoundrel is saying X about me. Please disavow him.”
    No one should be blamed for something they did not do, and Google’s change after change from “ignoring” to “penalizing” only keeps proving their algo needs to be vastly improved.

    Wow, this was supposed to be a small note on your content, and it turned into a big complaint that went way off-topic (but still related to these “degrees of Penguin penalization”).

  • http://www.tylermcconville.com/ Tyler McConville

    The Penguin update is an addition to Google’s algorithm that adds further levels of filtration for the organic SERPs. Logically we can assume the algorithm is condition-based, which would yield different results based on your link portfolio (assuming that’s the only thing it’s looking at). Instead of worrying about Penguin variants, we should all be learning to identify its various forms in direct relation to how they’re triggered. After all, we are search engine optimization experts, no?

    Just my two cents.

  • Roman M

    I’d say no, they don’t hurt you (anymore)…I think G is pretty good at spotting these sites by now and not counting their links and/or content. Imagine how many times a day these sites get disavowed by webmasters.

  • http://www.ducktoes.com/webdesign/seo.php Cathie Dunklee-Donnell

    How would they select different sites for different levels? I don’t think that’s what he means. I think he means he still has some unnatural-looking links. He’s still a bit penalized by Penguin.