SEO is as dirty as ever

Search engine optimization has built credibility over the years, but spammers and black-hat practitioners still give it a bad name. Columnist Patrick Stox shares his SEO horror stories.


As much as the SEO industry wants to grow up and shed the shady reputation it had in the past, it seems many don’t want to change and still cast a shadow over the industry. For every company doing things right, it seems there are many more who have not updated their practices or still want to take shortcuts.

I know it’s not October and not time for Halloween, but I’m already putting my costume together and was inspired to write this. All of these are true SEO horror stories I have seen over the last couple of years. There are outdated practices, mistakes and some seriously shady stuff that still goes on in our industry.

I’m okay with mistakes, but a lot of these stories involve companies just never updating their practices or intentionally doing things that give all SEOs a bad name. There are many great companies out there, but it seems like for every good one, there are still a few bad apples.

Outdated SEO practices

Believe it or not, I still see cases of keyword stuffing and even hidden text.

Stuffed titles look terrible; I’ve seen the same term repeated over and over, or every city known to man, crammed into a single title tag. I recently ran into a home page title that was over 800 characters long, with almost every city in the area!
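
If you want a quick sanity check on your own titles, a minimal sketch like the following can flag the obvious offenders. It assumes Python with the third-party requests library; the URL, the 70-character cutoff and the repeat threshold are placeholder choices of mine, not hard rules:

```python
import re
from collections import Counter

import requests  # third-party: pip install requests

def audit_title(url):
    """Fetch a page and flag title tags that look stuffed."""
    html = requests.get(url, timeout=10).text
    match = re.search(r"<title[^>]*>(.*?)</title>", html,
                      re.IGNORECASE | re.DOTALL)
    if not match:
        return ["no title tag found"]
    title = match.group(1).strip()
    counts = Counter(w.lower() for w in re.findall(r"\w+", title))
    repeats = [w for w, n in counts.items() if n > 2 and len(w) > 3]
    flags = []
    if len(title) > 70:  # rough cutoff; Google shows roughly 50-60 characters
        flags.append(f"title is {len(title)} characters long")
    if repeats:
        flags.append(f"terms repeated more than twice: {repeats}")
    return flags or ["title looks reasonable"]

print(audit_title("https://example.com/"))  # placeholder URL
```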

Hidden text is also a surprisingly common problem, where website content (text, internal links and so on) is barely readable or sometimes intentionally hidden. These aren’t sites launched years ago, either — some of them are less than a year old.

I’m also seeing more websites that use the exact same page content on multiple pages with only the city name swapped. These pages have become so prevalent that I have a hard time telling clients not to do this, but of course I still recommend against it. (If they choose to reuse page content, I ask that they add something additional that’s relevant and useful.) I even see pages that use obviously spun text still ranking well.
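
To spot swapped-city duplicates, a rough text comparison is usually enough. Here’s a minimal sketch, assuming Python with the requests library; the URLs and the 0.9 threshold are placeholders, not anything from a real audit:

```python
import re
from difflib import SequenceMatcher

import requests  # third-party: pip install requests

def visible_text(html):
    """Crudely strip scripts, styles and tags so we compare content, not markup."""
    html = re.sub(r"<(script|style).*?</\1>", " ", html,
                  flags=re.IGNORECASE | re.DOTALL)
    return re.sub(r"<[^>]+>", " ", html)

def page_similarity(url_a, url_b):
    a = visible_text(requests.get(url_a, timeout=10).text)
    b = visible_text(requests.get(url_b, timeout=10).text)
    return SequenceMatcher(None, a, b).ratio()

# Placeholder URLs for two city pages on the same site
score = page_similarity("https://example.com/plumber-raleigh",
                        "https://example.com/plumber-durham")
if score > 0.9:  # arbitrary threshold for "swapped-city duplicates"
    print(f"Pages are {score:.0%} similar; likely the same content")
```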

Link spam is the worst. I’m seeing a lot of sites using press release services that go out to local news websites. I see a lot of recently added links from general directories and article websites in backlink profiles. I still see a lot of web 2.0 and video spam. Sadly, I see a lot of obvious footprints from programs like ScrapeBox, XRumer, SEnuke and GSA SER. Sometimes websites are getting away with the spam, but other times, companies have come to me with a penalty, and it’s obvious from the backlink profile what the cause is.

Local is a joke these days, too. The local listings are so full of spam and fake reviews that it’s sickening, and I’ve reached the point that I really don’t trust the reviews anymore. I see people keyword stuffing Google My Business listing names, adding in alternate business names that are keyword-rich, using a high-ranking website or an authoritative profile instead of their website, using UPS store locations, Regus offices or co-working spaces for the address, having multiple listings (or even multiple websites) for the same business, and so much more. It’s like the Wild West all over again.

For those who might have missed it, I highly recommend you check out Joy Hawkins’ “The Ultimate Guide to Fighting Spam on Google Maps,” and have fun reporting the things you’ve seen.

Mistakes

I’m seeing websites blocking crawlers a lot these days. It feels like almost every search has at least one result that says, “A description for this result is not available because of this site’s robots.txt.”

Of course, this can be caused by a noindex tag as well as robots.txt. Whether it happens when bringing a website out of development, accidentally checking the wrong box, migrating a website or for some other reason, it gets overlooked far more often than it should and is a fairly common mistake.
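
Catching this before launch is cheap. Here’s a minimal sketch, assuming Python with the requests library; the URL is a placeholder, and the meta regex is deliberately rough (it assumes name comes before content in the tag):

```python
import re
from urllib.parse import urljoin
from urllib.robotparser import RobotFileParser

import requests  # third-party: pip install requests

def check_blocking(url, user_agent="Googlebot"):
    """Report whether a URL is disallowed in robots.txt or carries a noindex tag."""
    rp = RobotFileParser(urljoin(url, "/robots.txt"))
    rp.read()
    if not rp.can_fetch(user_agent, url):
        print(f"{url} is disallowed for {user_agent} in robots.txt")

    html = requests.get(url, timeout=10).text
    # Rough regex; assumes name appears before content in the meta tag
    meta = re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']*)["\']',
        html, re.IGNORECASE)
    if meta and "noindex" in meta.group(1).lower():
        print(f"{url} carries a robots meta directive: {meta.group(1)}")

check_blocking("https://example.com/")  # placeholder URL
```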

Less common, but growing in popularity, are various JavaScript frameworks like Angular and React where content never gets rendered for crawlers and pages never get indexed. For anyone whose company is starting to use these frameworks, I highly recommend reading through Adam Audette’s “We Tested How Googlebot Crawls Javascript And Here’s What We Learned” and Jody O’Donnell’s “What To Do When Google Can’t Understand Your JavaScript,” as well as Builtvisible’s AngularJS and React guides.
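
One cheap diagnostic is to fetch the raw HTML, with no JavaScript executed, and check whether a phrase you can see in the browser actually appears in the source. A minimal sketch, assuming Python with the requests library; the URL and phrase are placeholders:

```python
import requests  # third-party: pip install requests

def phrase_in_source(url, phrase):
    """True if the phrase appears in the server-delivered HTML (no JS executed)."""
    html = requests.get(url, timeout=10).text
    return phrase.lower() in html.lower()

# Placeholder URL and phrase; use text you can see on the rendered page
if not phrase_in_source("https://example.com/", "our services"):
    print("Phrase missing from raw HTML; content is likely rendered client-side")
```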

A pet peeve of mine is when a company redesigns a website without doing redirects. I’ve seen catastrophic drops in traffic as a result. I’ll give the benefit of the doubt that this is a mistake, but it’s likely either not part of the company’s process or something they cut because it’s time-consuming and they were on a deadline. I also see a lot of redirects done incorrectly when switching from HTTP to HTTPS, including redirect chains and 302s instead of 301s.
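
When auditing a migration, it helps to walk the redirect chain and confirm every hop is a single 301. The requests library keeps the hops in response.history, so a check can be as short as this sketch (the URL is a placeholder):

```python
import requests  # third-party: pip install requests

def trace_redirects(url):
    """Walk a redirect chain and print the status code of every hop."""
    r = requests.get(url, timeout=10, allow_redirects=True)
    for hop in r.history:
        note = "OK" if hop.status_code == 301 else "check this hop"
        print(f"{hop.status_code} {hop.url} ({note})")
    if len(r.history) > 1:
        print(f"Chain of {len(r.history)} redirects; collapse it to a single 301")
    print(f"Final: {r.status_code} {r.url}")

trace_redirects("http://example.com/old-page")  # placeholder URL
```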

Though it’s not always the fault of an SEO company, I’ve seen domain names expire or older domains dropped that have had substantial impacts on various businesses. This isn’t all that common, luckily, but it can be painful when it happens.

Shady SEO stuff

  • Shady sales tactics. I still see companies misrepresenting their Google Partner status as something more than a paid search designation. I have talked to many small business owners who signed simply because the salesperson made it seem like they had an inside connection at Google. I’m also disappointed by the companies that try to sell packages before they even talk to a client, or that sell a package instead of a custom plan after speaking with them about their current position and challenges.
  • Ridiculous contracts. I’ve had clients who had to go to court because their contracts said the SEO provider owned everything — not just content or design, but even the domain name. If I had one piece of advice to business owners out there, it’s to make sure you control all your own branded accounts and properties.
  • Proprietary CMS platforms. Some companies make it nearly impossible to leave by using their own homegrown CMS with no export options and no database access. This is where scraping comes in handy. Some of these platforms have serious SEO issues, and I’ve even seen cases where every client website was duplicated and indexed on a subdomain of the SEO company’s own website.
  • Not turning over account logins. I see this a lot: companies withhold login information, campaign details or even entire accounts. I particularly hate it when a client was never in control of their web analytics or PPC accounts and has to set them up from scratch with no history. A lot of times, agencies claim their methods are proprietary, but this is just shady. If you haven’t checked it out already, go read my “Checklist For Transitioning To A New Digital Agency.”
  • Private blog networks (PBNs), paid links and spam. I’m amazed I still see this stuff so much. People just really want to take shortcuts, and SEO companies still sell people on the easy (and risky) wins. I can’t tell you the number of people I’ve heard refer to PBNs as white-hat in the past few months; it seems companies aren’t explaining the risks involved at all.
  • Link networks. I’ve seen pages where all the clients of an SEO company linked to all other clients. I’ve seen pages where the company used sponsored or partner pages to link client sites together. I wish I could say I’ve only seen this once or twice, but sadly I see this a lot (especially with niche-specific SEO companies).
  • Removing links. I had a client whose rankings and traffic dropped within the first couple of months of working with me. When I looked into it, I discovered that his previous SEO company had actually put in the effort to remove all the links he had built in the past year! Another example I’ve run into a few times is where companies build links to a secondary website rather than the main website. These secondary websites are redirected to the main website, and when a client leaves, the secondary site is redirected to someone else, effectively taking the built-up value with it.
  • Adding noindex. While this could be classified as a mistake, here I mean the instances where a noindex tag is a malicious act, added right before a website is handed over, for instance, even when the client isn’t switching hosts. I’ve also seen some sneaky functions that add the tag specifically for Googlebot, and even options hidden behind a password-protected custom dashboard for the theme (the audit sketch after this list shows one way to catch that variant). Once, a company decided to do a change of address to a competitor in Google Search Console and redirected the website to the competitor. They refused to hand over the information to get this fixed in a timely manner.
  • Building another website on a different domain. If one website is good, then two or three must be better, right? I hate the providers who do this, and even some big companies do it. Why bother doing work on the main website when you can do it on a website you control, right? It’s even worse when they use a call tracking number instead of the actual number and hijack the Google My Business listings as well. I’ve had the worst headaches after a few companies did this and then went through a service like Yext that locked the NAP listings for the next year.
  • Canonical tags. I’ve seen so many shady things with canonical tags that it’s scary. Websites are copied from another site without changing the canonical, or the canonical is set to the web design company’s site. Some of the most frustrating cases are when companies post the same blogs with the canonical pointing to their own website, or when the canonical points to the website of a company’s “preferred” customer in an area. I’ve seen canonicals set so that every page is canonicalized to the home page. I’ve even had companies canonical an entire website to a different website when handing it over, thinking it wouldn’t get noticed, or to boost one of their other clients, I’m sure. (The audit sketch after this list catches cross-domain canonicals, too.)
  • Reusing content. I see this with many niche SEO providers. Service pages and blogs will be used across thousands and sometimes tens of thousands of websites. I guess there are only so many ways they could write about the same thing, so they just didn’t even bother. The worst I saw was a dentist whose service pages were used on over 30,000 other webpages.
  • Reviews. I’ve seen companies abuse account logins given in good faith to leave themselves a review from a client’s Gmail account. I’ve seen companies build fake reviews for their clients. I’ve seen companies mark up fake review stars on pages just so they show in the SERPs. Hint: If you see these, please report them: https://support.google.com/webmasters/contact/rich_snippets_spam?hl=en
  • Rolling back a website. I once had a company load a backup of a website from a time before they had done any on-page work. Of course, they said it was a “glitch” and that they didn’t have a recent backup. The backup they loaded was from over a year ago.
  • Threatening lawsuits. One of the stories I found the most interesting was where a company set up its company name as an exact match of one of the more popular search terms. This company actually sent out letters to top-ranking websites threatening lawsuits if those companies targeted its “brand.” It was sad to see, but many of the threatened companies actually asked their people to remove mentions of that phrase.
  • Not setting up conversion tracking. While this could go down as a mistake, if it happens across several businesses, and the reports always make a campaign look good or are vague enough to tell you nothing, I consider it shady. Especially when people are paying for your services, whether it’s content, SEO, social or PPC, if you’re not tracking, you’re doing it wrong.
  • 301 a penalized website. I’ve seen this a few times now, implemented in different ways. I’ve seen penalized websites redirected to a competitor, of course, but I’ve also seen them redirected to more authoritative websites. There’s also an example I’ve been sharing for a few years of a water damage company that ranks different websites until they are penalized, then redirects them to a public Google Doc that has their information. The Google Doc has ranked first for the past few years because of this tactic, and they usually have at least one or two other websites in the top 10.
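
For the sneaky noindex and canonical tricks mentioned above, one rough check is to fetch the same page with a normal user agent and a Googlebot-style one, then compare the robots meta and canonical tags. A minimal sketch, assuming Python with the requests library; the URL and the browser UA string are placeholders:

```python
import re

import requests  # third-party: pip install requests

UA_BROWSER = "Mozilla/5.0 (compatible; audit-check)"  # placeholder UA string
UA_GOOGLEBOT = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def extract_directives(html):
    """Pull the robots meta content and canonical href with rough regexes."""
    robots = re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']*)["\']',
        html, re.IGNORECASE)
    canonical = re.search(
        r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']*)["\']',
        html, re.IGNORECASE)
    return (robots.group(1) if robots else None,
            canonical.group(1) if canonical else None)

def audit(url):
    for label, ua in (("browser", UA_BROWSER), ("googlebot", UA_GOOGLEBOT)):
        html = requests.get(url, timeout=10, headers={"User-Agent": ua}).text
        robots, canonical = extract_directives(html)
        print(f"[{label}] robots={robots!r} canonical={canonical!r}")
    # A noindex that only appears for the Googlebot fetch, or a canonical
    # pointing at a different domain, deserves a very close look.

audit("https://example.com/")  # placeholder URL
```

Note that truly cloaked noindex tags are often served by IP rather than user-agent string, so a clean result here doesn’t prove the site is safe; it only catches the lazier variants.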

TL;DR

There’s still a lot of shady stuff happening in our industry, and I don’t feel we’ve cleaned up our act very well. Keep in mind that I’m just one guy, and I’ve personally seen all of the above in just the last couple of years.

It’s not just local or niche companies that are doing bad things; in fact, enterprise and large websites can get away with murder compared to smaller sites. This encourages some of the worst practices I’ve ever seen, and some of these companies do practically everything search engines tell them not to do.

Share with me on social media some of your own horror stories; I’d love to hear about the crazy stuff you have seen.


Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land.


About the author

Patrick Stox
Contributor
Patrick Stox is a Product Advisor, Technical SEO, & Brand Ambassador at Ahrefs. He was the lead author for the SEO chapter of the 2021 Web Almanac and is a reviewer for the 2022 SEO chapter. He’s an organizer for the Raleigh SEO Meetup, Raleigh SEO Conference and Beer & SEO Meetup. He also runs a Technical SEO Slack group and is a moderator for /r/TechSEO on Reddit.
