Manual Spam Action Revoked! A Case Study
Checklists of how to get out of a Google penalty abound. Five things you should do, five things you shouldn’t do, etc. But the number one thing you shouldn’t do is ignore it. I want to tell you the story of a two-year-long penalty, the steps we took and how something we never thought would work… finally did.
No Impact On Organic Traffic
The client was a purveyor of a highly competitive retail product — one of the top in the world, with over 10 million visits per month. Back in July 2012, the client received the first notice that there were “unnatural links pointing to the site.” But the client didn’t notice a big drop in organic traffic, so at first we ignored it. We rationalized this strategy when Matt Cutts released a statement that said, in part:
[W]e sometimes target specific spammy or artificial links created as part of a link scheme and distrust only those links, rather than taking action on a site’s overall ranking. The new messages make it clear that we are taking “targeted action on the unnatural links instead of your site as a whole.”
This was the message we had received:
We continued to monitor organic traffic and didn’t see any big reductions, so we assumed the danger had passed and that the “penalty” wasn’t really a penalty so much as a warning (as the industry suggested). The client did well over the holiday season, and traffic was even up about 40% year-over-year. Traffic remained level with no big drops.
The Industry Shifts
In late July to early August 2012, the SEO industry underwent a seismic shift with regard to links, beginning with this statement from Matt Cutts:
So while the site’s overall rankings might not drop directly, likewise the site might not be able to rank for some phrases.
This was covered in Danny Sullivan’s article, “Google Explains New Link Warnings, Says Don’t Panic But Don’t Ignore.” Danny had this to say on the subject:
What’s a confused publisher to do? I think if you got one of the messages last week, don’t worry unless you also noticed a recent traffic drop from Google.
In the face of these conflicting opinions, we decided to check out the client’s rankings. (Because we do still track them, and rankings aren’t dead! But more on that another time.) What we found was telling: for a selected group of high value, non-branded terms, the site had dropped 20+ positions.
(Note: I hear the skeptics here saying that ranking is inaccurate, but I promise you we used the most accurate and consistent measure possible, and we looked at trended data. I’ll debate it with you in the comments if you want.)
The steady traffic was actually coming from an increase in brand traffic. We had believed for some time that their non-brand term pages were competing with their home page for brand term relevancy. Now we had proof. As the non-brand pages dropped 20+ positions, the brand terms soared to the top spots above all of their competitors and affiliates.
(Note: I’m going to stop here and clarify something. You’re probably thinking that brand terms should be easy to rank for. In a perfect world, that’s true. But when your brand contains a very competitive term and you have an industry where there’s a lot of black hat activity [more on that later], brand can be a struggle.)
The important thing to recognize here is that you can be impacted by a penalty and see no drop in overall traffic. That’s why it’s really important that you monitor traffic by page and — yes — rankings.
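To make that concrete, here’s a minimal sketch (in Python, with entirely made-up page paths and traffic numbers) of the kind of brand vs. non-brand split that surfaces a penalty hiding behind flat totals:

```python
# Hypothetical illustration: split organic landing-page traffic into
# brand and non-brand buckets, so a drop on non-brand pages can't
# hide behind rising brand traffic. All paths and numbers are invented.
BRAND_PAGES = {"/", "/about"}

def split_traffic(visits_by_page):
    """Return (brand_visits, non_brand_visits) from a {page: visits} dict."""
    brand = sum(v for page, v in visits_by_page.items() if page in BRAND_PAGES)
    return brand, sum(visits_by_page.values()) - brand

before = {"/": 40_000, "/about": 5_000, "/category/widgets": 55_000}
after = {"/": 70_000, "/about": 6_000, "/category/widgets": 24_000}

# Totals are flat (100,000 visits in both periods), but the non-brand
# bucket collapsed from 55,000 to 24,000 -- exactly the pattern above.
```

The framework doesn’t matter; what matters is segmenting organic traffic by landing page rather than trusting the top-line number.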
One Failed Reconsideration Request After Another
Over the next 18 months, we contacted webmasters, changed elements of the affiliate program, removed paid links, deleted directory links, begged for nofollows and submitted disavow files. All of our requests were rejected, with that horrible phrase:
We’ve reviewed your site and we still see links to your site that violate our quality guidelines.
We’d sigh, download a new list of links and start over. It became like playing a game of whack-a-mole!
Every time we downloaded a new list of links, there would be 1-2K (yes, that’s thousand) new links to deal with. We realized that as long as this many new links were being added on a regular basis, Google was never going to be sympathetic to our plight.
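If you’ve never built one, a disavow file is just a plain-text list — one URL or `domain:` entry per line, with `#` for comments — uploaded through Google’s disavow links tool. A minimal (entirely made-up) example:

```text
# Paid directory links -- owner never responded to removal requests
domain:spammy-directory.example
# Single hacked page we could not get taken down
http://hacked-library.example/blog/widgets-post.html
```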
Our Own House Was Dirty
In August of 2013, we realized that we’d been chasing our own tails and decided to stop actively contacting/disavowing links for a while and just let them pile up so we could study the patterns of the new links being added. What we found was shocking.
First, we discovered that our own house was dirty. Another department, unbeknownst to us, was paying a firm to create links. These links were being created on the false premise that one unnatural link without targeted anchor text on a page with 2-3 authoritative links would fly under Google’s radar. If you’re not familiar with this, you see it on blogs where a word like “here” or “of” is linked to an unrelated site. It tricked us as we were reviewing links because the anchor text seemed innocent, and there were other links on the page that were nofollowed. But it didn’t trick Google!
Needless to say, as soon as we discovered that, we got the contact list of bloggers who had posted those links and got all of those removed. Then we fired the link building company and changed internal policy so that all website vendors had to be reviewed by the VP of Marketing. But that still only accounted for a few hundred links over several months.
Our Affiliates Were Even Dirtier
While reviewing patterns, we discovered that a large number of the inbound links were what we would consider black hat. Sites that had recently expired (like one for land mine victims) now carried ads for our client. Sites like public libraries, fire stations, doctor’s offices, and others that had nothing to do with our client suddenly had a pop up on them or an inserted link. It was very clear they had been hacked. Here’s an example of one that doesn’t contain the client’s link:
We had to find out who had done this and put a stop to it. Unfortunately, it turned out to be their top affiliate — someone who regularly provides the site with about 20% of the sales… someone we did not want to fire or anger. Gently, we spoke to him/her and explained that we needed to put a stop to this kind of activity. The affiliate had thought this would not impact the client because (surprise!) all of the links were nofollowed.
Now, I know Google says that nofollowing links will make them “safe,” but I’m willing to bet that doesn’t apply to hacking the site of a children’s hospital. Of course, we got the affiliate to remove the links… apparently, there is an underground “service” that does this for people and by canceling his/her contract, all the links were magically removed.
Our Ah-Ha Moment
During this exercise with our affiliate and our trip into the seedy underbelly of the internet, we realized that in an industry like our client’s, we would never be able to keep up with the monitoring necessary to “control” our affiliates. Affiliate marketing is big money, and there are plenty of legitimate affiliates out there. But there are also plenty of affiliates that “push the envelope.” We realized we needed a better solution. I can hear the naysayers arguing that affiliate links aren’t counted, but that’s simply not true, as indicated in this Google+ conversation with John Mueller:
(I could write a whole other article on some of the incorrect statements made in that thread. The important corrections are: 302s DO pass PageRank, and noindexing or blocking a page in robots.txt does NOT stop PageRank from accruing.)
To Burn, Or Not To Burn?
There was so much legacy spam and so much new spam pointing to our client’s site every day that we seriously considered burning the domain and starting over — and I think if this statement from Google’s Matt Cutts had come out sooner, we might have:
Don’t Burn The Site, Burn The Page
So if we couldn’t control the affiliates and we didn’t want to burn the domain, how could we stop that negative energy (aka PageRank) from landing on our client’s site in the first place? I studied this diagram from John Mueller that shows ways to stop PageRank and eliminated all the areas we couldn’t control. That left one, and a plan began to form.
We’d thought about burning the page, but had dismissed it because the affiliates drove a lot of sales to our site. Suddenly, we realized that we could change how the search engines saw our page and not impact the customer experience. And no, I’m not talking about cloaking. I’m talking about a little tool in every SEO’s toolbox — one that we use as a best practice, but that I’d never even considered before as a way to stop PageRank: the custom 404 page. If we created a custom 404 page that emulated the same experience for the visitors, but returned a 404 status code, could we stop that negative activity from reflecting poorly on our client’s site? I asked the community:
The response was varied, but finally John Mueller chimed in and confirmed that this would be an okay scenario:
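Mechanically, the setup is straightforward: serve the visitor the normal page body, but return a 404 status so search engines treat the URL as gone. Here’s a minimal sketch, assuming a Python WSGI front end (the path and markup are hypothetical; in practice this would live in your web server or application framework):

```python
# Hypothetical sketch: affiliate landing pages return their normal HTML
# body, but with a 404 status code. Visitors see the same page; search
# engines see "not found", so the URL stops accruing inbound PageRank.
AFFILIATE_PATHS = {"/partner-landing"}

LANDING_HTML = b"<html><body>Normal landing-page content here</body></html>"

def app(environ, start_response):
    path = environ.get("PATH_INFO", "/")
    if path in AFFILIATE_PATHS:
        # Same content a visitor would normally get, but a 404 status:
        # the customer experience is unchanged, only the status line differs.
        start_response("404 Not Found", [("Content-Type", "text/html")])
        return [LANDING_HTML]
    start_response("200 OK", [("Content-Type", "text/html")])
    return [b"<html><body>Regular page</body></html>"]
```

The key detail is the status line, not the framework — any stack that can decouple the response body from the response code can do this.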
The One That Finally Worked
We notified the affiliates of our plan, set this scenario up on the site, did one last round of link cleanup and disavows, and submitted for reconsideration on January 8, 2014, with our fingers crossed. It took a while, and we started to get nervous… but finally, in early February, we received this:
The client still has a lot of work to do to bring their site fully within guidelines and improve the search experience, but they have already seen improvements in traffic and in rankings too:
It may not look like much, but this represents about 2,000 extra visits (or a 30% increase) in the daily Google traffic to the client’s site — over their high season of the holidays. Over 50% of it is new visitors, which is always a good thing. Rankings have also rallied, coming up about 20 positions for the terms that had dropped. Now the client needs to focus on improving the site. They’ve lost ground over the last 18-20 months with all these issues, and they had to disavow or remove most of the links that were propping them up to begin with. They need new high quality links and more high quality content.
Will This Work For You?
As you can guess, and in the words of John Mueller, this is not a “trivial, standard situation.” It’s a creative solution to an uncommon problem, and in reality, more of a band-aid than a fix. Thus, I don’t recommend that all of you go out and make your primary landing pages into custom 404s.
But through the exercise, I hope you’ve learned to question what you “know” and think creatively about how you can solve certain problems. I’ve heard from a lot of webmasters lately that are frustrated with multiple rounds of requests and rejections. If you’re in that boat, you may want to take a step back and look at the patterns to determine how you can resolve the issues. At the very least, I hope learning that a site with 10 million visits per month has this problem will help your morale. Go forth and clean links, and best of luck to you.
Oh, and don’t forget to debate with me in the comments!
Some opinions expressed in this article may be those of a guest author and not necessarily Search Engine Land. Staff authors are listed here.