Google on the Penguin algorithm: it aims to ignore spammy links, but a strong pattern of them can lead to Google distrusting your site

Maybe it is best to avoid link building and build a site that naturally gains links?


When Google released the real-time Penguin algorithm update, which some SEOs code-named 4.0, back in 2016, Google told us this version devalues or ignores most spammy links. For the most part, Penguin no longer penalizes sites for bad links; instead, it aims to neutralize spammy links by simply not counting them.

John Mueller, a Search Advocate at Google, said on Friday in a video question-and-answer session that Penguin does try to ignore spammy links. However, in cases where Google cannot, because there is a “very strong pattern” of spammy links pointing to the site, Penguin may distrust and demote the site as a whole rather than act in the granular way it was designed for.

John Mueller said this at the 37:06 mark in this video he posted on Friday on the Google Search Central YouTube channel.

What was said. The question John was asked was: “Is the Penguin penalty still relevant at all, or less relevant? Spammy, toxic backlinks are more or less ignored by the ranking algorithms these days.”

John responded, “I’d say it’s a mix of both.” He explained, “For the most part, when we can recognize that something is problematic, any kind of a spammy link, we will try to ignore it.” He continued, “If our systems recognize that they can’t isolate and ignore these links across a website, if we see a very strong pattern there, then it can happen that our algorithms say well we really have kind of lost trust with this website and at the moment based on the bigger picture on the web, we kind of need to be more on almost a conservative side when it comes to understanding this website’s content and ranking it in the search results and then you can see kind of a drop in the visibility there.”

John is saying that, in some cases, Google’s Penguin link algorithm can demote the whole site based on the links rather than just ignore the specific spammy links. But it seems it takes a very high level of spammy links for that to happen.

“But for the most part like the web is pretty messy and we recognize that we have to ignore a lot of the links out there. So for the most part I think that’s fine. Usually, you would only see this kind of a drop if it’s really a strong and a clear pattern that’s associated with the website,” John added.

John came back on Twitter to clarify that “this is the case for many spam & low-quality signals” in the Google algorithms. He explained, “we’ll work to ignore the irrelevant effects, but if it’s hard to find anything useful that’s remaining, our algorithms can end up being skeptical with the site overall.” “Our spam algorithms are pretty nuanced and they do look at a number of factors,” he added.

Disavow links. So do we need to disavow links now, even when Google said we really don’t need to? The answer is no, you don’t need to disavow links, but you can. John Mueller said, “I’d either ignore it or use the disavow file (for the worse domains).”
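For reference, a disavow file is just a plain-text list uploaded through the disavow links tool in Search Console: one URL or domain per line, a “domain:” prefix to disavow an entire domain, and lines starting with “#” treated as comments. A minimal sketch is below; the domains shown are placeholders, not real examples from any audit:

# Domains identified as spammy in a link audit
domain:spammy-links-example.com
domain:paid-guest-posts-example.net
# A single offending page
https://example.com/paid-link-page.html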

The video. Here is the video embed where John said this:

SEO consultants chime in. I asked a few SEO consultants their opinion on what John said in this video and here is what they had to say:

Lily Ray, Senior Director, SEO & Head of Organic Research at Amsive Digital, told us, “John’s advice here shouldn’t come as much of a surprise to SEOs who have dealt with companies engaging in large-scale link building initiatives using tactics that violate Google’s guidelines, only to encounter massive declines in ranking and traffic.” She added, “Google doesn’t always send out a manual action when the sites run into trouble. But in many cases, sites can either struggle to rank or feel that they’ve received an algorithmic penalty without any formal notification. Often, when you take a look at their backlink profile and talk to the company about their SEO strategy, you might discover that most of the links are paid links, guest posts, footer links, exact match anchor text, etc. on websites no one has ever heard of. In these cases, I believe it’s important to reconcile Google’s trust issues with your site by disavowing the paid/offending links, as well as earning new, trustworthy links organically.”

Glenn Gabe, SEO consultant at G-Squared Interactive, told us, “Rolling out Penguin 4 was a great move by Google in 2016, since it devalued link spam versus penalizing it. But as John explained, if Google’s algorithms cannot find any useful links (which would be an extreme situation), and there is a strong and clear pattern of spammy links, then it can be skeptical with the site, and Google can lose trust with the site overall. As a result, the site can see a drop in search visibility. The problem is that many site owners believe they are being attacked via negative SEO (and that those link attacks are working — and it’s the reason they have seen drops over time). Google has explained in the past that negative SEO attacks don’t work and that its algorithms can just ignore the link spam (especially for sites with a normal mix of links). So for many of those sites fearing negative SEO attacks, the situation John covered in the latest Search Central Hangout would not really apply. In my opinion, if a site has an overwhelmingly spammy link profile (almost all of the links are unnatural and spammy), without any other quality links, then that obviously can be problematic. But for most sites that have a normal mix of links, what John is explaining should NOT be a problem. Unfortunately, I’m already hearing from site owners about this… when their sites definitely don’t fit into the situation John explained in the video.”

Dr. Marie Haynes, the CEO of MHC Inc., told us, “Google’s communication on what site owners need to know about Penguin has been frustrating. At a Pubcon conference in late 2016, Gary Illyes told us that it was indeed possible for Penguin to algorithmically cause harm, saying, ‘If Penguin sees signs of manipulation, it can decide to discount ALL the links, which can be pretty bad for a site.’ Our belief is that while this can happen, it is reserved for cases where there is an obvious history of links being built solely to manipulate PageRank on an astronomical scale. While we have seen improvements for some sites after filing a thorough disavow, even if no manual action is present, the only cases that we feel we can attribute improvements to disavow work are for sites with a history of years of very manipulative link building.”

Why we care. Ultimately, when it comes to link building, you should be careful. Don’t buy links, and don’t look for cheap and easy ways to get links to your site. Make sure to review Google’s link schemes help document and stay far away from those methods.

When working on client sites that may have link issues, you need to decide whether you really need to disavow links and, if so, which links to disavow. At the same time, Google is likely already ignoring most of the bad links, so you may not have to worry too much about it.

It may just be easier to avoid the practice of link building and build a website and content that other sites naturally want to link to without you asking them.


About the author

Barry Schwartz
Staff
Barry Schwartz is a technologist and a Contributing Editor to Search Engine Land and a member of the programming team for SMX events. He owns RustyBrick, a NY-based web consulting firm. He also runs Search Engine Roundtable, a popular search blog on very advanced SEM topics.

In 2019, Barry was awarded the Outstanding Community Services Award from Search Engine Land; in 2018, he was named "US Search Personality Of The Year" at the US Search Awards; and in 2023, he was listed as a top 50 most influential PPCer by Marketing O'Clock.

Barry can be followed on X, and you can learn more about Barry Schwartz on his personal site.
