25 Super Common SEO Mistakes

No, these aren’t “myths” disguised as “common mistakes.” I’ve already beaten the SEO myths theme to death with my previous three articles.

What follows are innocent mistakes that many SEOs make. Some of these things catch even the best of us…

1.  Google AdWords Keyword Tool Set To Broad Match

The Google AdWords Keyword Tool defaults to “Broad match” mode, which yields useless data from an SEO perspective — useless in that the numbers are hugely inflated to include countless phrases incorporating the search term specified. For example, the Keyword Tool reports 30.4 million queries for “shoes”, but that includes multi-word phrases such as “dress shoes,” “leather shoes,” “high heeled shoes,” and even “horse shoes,” “snow shoes,” and “brake shoes.”

In Exact mode, the search query volume for “shoes” drops to 368,000. The difference between those numbers is striking, isn’t it? So always remember if you are doing keyword research for SEO in the AdWords Keyword Tool: untick the box next to Broad match and tick the box next to Exact.

[Screenshot: the Google AdWords Keyword Tool defaults to Broad match]

2.  Disallowing when you meant to Noindex

Ever notice listings in the Google SERPs (search engine results pages) without titles or snippets? That happens when your robots.txt file has disallowed Googlebot from visiting a URL, but Google still knows the URL exists because links were found pointing there. The URL can still rank for terms relevant to the anchor text in links pointing to disallowed pages. A robots.txt Disallow is an instruction to not spider the page content; it’s not an instruction to drop the URL from the index.

If you place a meta robots noindex tag on the page, you'll need to allow the spiders to access the page so they can see the tag. Another mistake is to use the URL Removal tool in Google Webmaster Tools instead of simply "noindexing" the page. Rarely (if ever) should the removal tool be used for anything. Also note that there's a Noindex directive in the REP (Robots Exclusion Protocol) that Googlebot obeys (unofficially). More on disallow and noindex here.
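
To make the distinction concrete, here's a minimal Python sketch (my own illustration, not official Google tooling) that uses the standard library's robots.txt parser plus the third-party requests package to check whether a crawler could even reach a page to see its noindex tag:

    from urllib import robotparser
    import requests  # third-party package; assumed installed

    def check_noindex_setup(url, robots_txt_url):
        rp = robotparser.RobotFileParser()
        rp.set_url(robots_txt_url)
        rp.read()

        if not rp.can_fetch("Googlebot", url):
            # Blocked pages can still surface in the SERPs as bare URLs,
            # and the crawler will never see a noindex tag on them.
            return "Disallowed: spiders can't reach the page to see any noindex tag"

        html = requests.get(url).text.lower()
        # Crude string check; a real audit would parse the HTML properly.
        if 'name="robots"' in html and "noindex" in html:
            return "Crawlable and noindexed: the URL should drop out of the index"
        return "Crawlable and indexable"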

3.  URL SERP Parameters & Google Instant

I just wrote about parameters you can append to Google SERP URLs. I’ve heard folks complain they aren’t able to add parameters to the end of Google SERP URLs anymore — such as &num=100 or &pws=0 — since Google Instant appeared on the scene. Fear not, it’s a simple matter of turning Google Instant off and URL parameters will work again.
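
If you script your rank checks, here's a tiny Python sketch of how those parameters get appended (treat the URL structure as illustrative; Google can change it at any time):

    from urllib.parse import urlencode

    def serp_url(query, results_per_page=100, personalization_off=True):
        params = {"q": query, "num": results_per_page}
        if personalization_off:
            params["pws"] = 0  # pws=0 suppresses personalized results
        return "https://www.google.com/search?" + urlencode(params)

    print(serp_url("digital camera"))
    # https://www.google.com/search?q=digital+camera&num=100&pws=0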

4.  Not using your customer’s vocabulary

Your customer doesn’t use industry-speak. They’ve never used the phrase “kitchen electrics” in a sentence, despite the fact that it’s the industry-accepted term for small kitchen appliances. Your customer may not search in the way you think makes intuitive sense. For example, I would have guessed that the plural “digital cameras” would beat the singular “digital camera” in query volume, yet it’s the other way around according to the various Google tools.

Sometimes it’s lawyers being sticklers that get in the way, such as a bank’s lawyers insisting the term “home loan” be used and never “mortgage” (since technically the latter is a “legal instrument” that the bank does not offer). Many times the right choice is obvious, but internal politics or inertia keeps the less popular terminology in place (e.g. “hooded sweatshirt” when “hoodie” is what folks are searching for).

5.  Skipping the keyword brainstorming phase

Too rarely do I hear that the site’s content plan was driven by keyword brainstorming. Keyword brainstorming can be as simple as using Google Suggest (which autocompletes as you type and is built into Google.com) or Soovle (which autocompletes simultaneously from Google, Bing, Yahoo, YouTube, Wikipedia, Amazon, and Answers.com). The idea is to think laterally.

For example, a baby furniture manufacturer discovers the popularity of “baby names” through looking at popular terms starting with “baby” and decides to build out a section of their site dedicated to related terms (“trends in baby names”, “baby name meanings”, “most overused baby names” etc.).
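
If you want to script that sort of brainstorming, here's a rough Python sketch against Google's autocomplete endpoint (the unofficial one browser search boxes use; it's undocumented, so treat the URL and response format as assumptions that may change):

    import json
    import requests  # third-party package; assumed installed

    def suggestions(seed):
        resp = requests.get(
            "https://suggestqueries.google.com/complete/search",
            params={"client": "firefox", "q": seed},
        )
        # Response is a JSON array: [seed, [suggestion, suggestion, ...]]
        return json.loads(resp.text)[1]

    print(suggestions("baby"))  # e.g. ['baby names', 'baby shower games', ...]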

6.  Mapping URLs to keywords, but not the other way around

It’s standard operating procedure to map all one’s site content to keyword themes (sometimes referred to as primary keywords, declared search terms, or gold words). What’s not so common is to start with a target (i.e. most desired) keyword list, map each keyword to the most appropriate page to rank for it, and then optimize the site around those keyword-to-URL pairs.

For example, “vegan restaurants in phoenix” could be relevant to five different pages, but the best candidate is then chosen. The internal linking structure is then optimized to favor that best candidate, i.e. internal links containing that anchor text are pointed to the best candidate rather than spread out across all five. This makes much more sense than competing against oneself and none of the pages winning.
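
In practice the keyword-to-URL map can be as simple as a lookup table that every internal link is generated from. A minimal Python sketch, with hypothetical URLs:

    # One best-candidate page per target keyword; all internal links using
    # that anchor text point at the mapped URL rather than competing pages.
    keyword_to_url = {
        "vegan restaurants in phoenix": "/phoenix/vegan-restaurants/",
        "vegan restaurants": "/vegan-restaurants/",
        "phoenix restaurants": "/phoenix/restaurants/",
    }

    def internal_link(keyword):
        return '<a href="{0}">{1}</a>'.format(keyword_to_url[keyword], keyword)

    print(internal_link("vegan restaurants in phoenix"))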

7.  Setting up a free hosted blog

Free hosted blog platforms like WordPress.com and Blogger.com provide a valuable service. Over 18 million blogs are hosted on WordPress.com. They’re just not a service I would sign up for if I cared about SEO or monetization. They aren’t flexible enough to let you install your own choice of plugins or themes/frameworks to trick out the blog with killer SEO. And for Heaven’s sake, don’t make your blog a subdomain of wordpress.com. For $10 per year, you can get a premium WordPress.com account under your own domain name.

Did you know putting AdSense ad units on your WordPress.com blog is against the service’s Terms & Conditions? Much better to get yourself a web host and install the self-hosted version of WordPress so you have full control over the thing.

8.  Not properly disabling Google personalization

Not long ago, Google started personalizing results based on search activity for non-logged-in users. For those who thought that logging out of Google was sufficient to get non-personalized results, I’ve got news for you: it isn’t. Click on “Web History” in the Google SERPs and then “Disable customizations based on search activity”. Or, for an individual query, you can add &pws=0 to the end of the Google SERP URL (but only if Google Instant is off; see above).

9.   Not logging in to the free tools

Some of the web-based tools we all use regularly, such as Google Trends, either restrict the features or give incomplete (or less accurate) data if not logged in. The Google AdWords Keyword Tool states quite plainly: “Sign in with your AdWords login information to see the full list of ideas for this search”. It would be wise to heed the instruction.

10.  Not linking to your top pages with your top terms on your home page

The categories you display on your home page should be thought through in terms of SEO. Same with your tag cloud if you have one. And the “Popular Products” that you feature. In your mind translate “Popular Products” into “Products for which I most want to get to the top of Google.”

11.  Not returning a 404 status code when you’re supposed to

As I mentioned previously, it’s important to return a 404 status code (rather than a 200 or 301) when the URL being requested is clearly bogus/non-existent. Otherwise, your site will look less trustworthy in the eyes of Google. And yes, Google does check for this.
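
It’s easy to spot-check: request an obviously bogus URL and confirm you get a 404 back, not a 200 (“soft 404”) or a redirect. A quick sketch using Python’s third-party requests package, with a made-up URL:

    import requests

    resp = requests.get(
        "https://www.example.com/this-page-should-not-exist-12345",
        allow_redirects=False,
    )
    print(resp.status_code)  # should be 404, not 200 or 301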

12. Not building links to pages that link to you

Many amateur SEOs overlook the importance of building links to pages that link to their sites. For commercial sites, it can be tough to get links that point directly to your site. But once you have acquired a great link, it can be a lot easier to build links to that linking page and thus you’ll enjoy the indirect benefit.

13.  Going over the top with copy and/or links meant for the spiders

Countless home pages have paragraphs of what I refer to as “SEO copy” below the footer (i.e. after the copyright statement and legal notices) at the very bottom of the page. Oftentimes they embed numerous keyword-rich text links within that copy. They may even treat each link with bold or strong tags. Can you get any more obvious than that? I suppose you could, if you put an HTML comment immediately preceding it that said “spider food for SEO!” (perhaps “Insert keyword spam for Google here” would be more apropos?)

14.  Not using the canonical tag

The canonical tag (errr, link element) may not always work, but it certainly doesn’t hurt. So go ahead and use it. Especially if it’s an ecommerce site. For example, if you have a product mapped to multiple categories resulting in multiple URLs, the canonical tag is an easy fix.
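
However you generate your pages, the fix amounts to pointing every duplicate URL’s canonical link element at one preferred URL. A small Python sketch with hypothetical paths:

    # Map each duplicate category path to the one preferred product URL.
    CANONICAL = {
        "/kitchen/blenders/acme-blender": "/products/acme-blender",
        "/sale/small-appliances/acme-blender": "/products/acme-blender",
    }

    def canonical_link_element(request_path):
        target = CANONICAL.get(request_path, request_path)
        return '<link rel="canonical" href="https://www.example.com{0}" />'.format(target)

    print(canonical_link_element("/sale/small-appliances/acme-blender"))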

15.  Not checking your neighborhood before settling in

If you’re buying a home, you’d check out the area schools and the crime statistics, right? Why wouldn’t you do the same when moving into a new IP neighborhood? Majestic SEO has an IP neighborhood checker. This is especially important for the small-time folks. You don’t want to be on the same IP address (shared hosting) with a bunch of dodgy Cialis sites.
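
Finding the IP your shared host puts you on is trivial (a tiny Python sketch below); discovering which other domains share that IP is the part that needs a reverse-IP service such as the Majestic SEO checker mentioned above:

    import socket

    # Substitute your own domain; this only resolves the IP address,
    # it doesn't list the other sites hosted on it.
    print(socket.gethostbyname("www.example.com"))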

16.  Doing too much internal linking

Don’t water down your link juice so much that only a trickle goes to each of your pages. An article page should flow PageRank to related topics, not to everything under the sun (i.e. hundreds of links).

17.  Trusting the data in Google webmaster tools

Ever notice Google Webmaster Tools’ data doesn’t jibe with your analytics data? Trust your analytics data over the webmaster tools data.

18.  Submitting your site for public site review at a conference where Google engineers are present

Doh! (Insert Homer Simpson voice here.) Unless you’re absolutely sure you have nothing weird going on within your site or link neighborhood, this is pretty much a suicide mission. Corollary: talking to Matt Cutts at a conference without covering your badge up with business cards. Note this mistake was contributed by a guy we’ll call “Leon” (you know who you are, “Leon”!)

19.  Cannibalizing organic search with PPC

Paying for traffic you would have gotten for free? Yeah that’s gotta hurt. I wrote about this before in Organic Search & Paid Search: Are they Synergistic or Cannibalistic?.

20.  Confusing causation with correlation

When somebody tells me they added H1 tags to their site and it really bumped up their Google rankings, the first question I ask is: “Did you already have the headline text there and just change a font tag into an H1, or did you add keyword-rich headlines that weren’t present before?” It’s usually the latter. The keyword-rich text at the top of the page bumped up the keyword prominence (causation). The H1 tag was a correlation that didn’t move the needle.

21.  Not thinking in terms of your (hypothetical) Google “rap sheet”

You may recall I’ve theorized about this before. Google may not be keeping a “rap sheet” of all your transgressions across your network of sites, but they’d be foolish not to. Submitting your site to 800 spam directories over a span of 3 days is just plain stupid. If it’s easy enough to see a big spike in links in Majestic SEO, then it’s certainly easy enough for Google to spot such anomalies.

22.  Not using a variety of anchor text

Using the same anchor text in every link pointing to a page just doesn’t look natural. Think link diversity.

23.  Treating all the links shown in Yahoo Site Explorer as “followed”

Don’t ask me why Yahoo Site Explorer includes nofollowed links in its reports, but it does. Many of its users wrongly assume all of the links reported under the “Inlinks” tab are followed links that pass link juice.
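
A quick way to spot-check a reported backlink is to fetch the linking page and look at the rel attribute yourself. A sketch using the third-party requests and beautifulsoup4 Python packages:

    import requests
    from bs4 import BeautifulSoup

    def followed_links(page_url):
        soup = BeautifulSoup(requests.get(page_url).text, "html.parser")
        followed = []
        for a in soup.find_all("a", href=True):
            rel = [r.lower() for r in (a.get("rel") or [])]
            if "nofollow" not in rel:
                followed.append(a["href"])
        return followed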

24.  Submitting a Reconsideration Request before EVERYTHING has been cleaned up

This may not be “super-common” because many SEOs have never submitted a “Reconsideration request” to Google. But if you have or plan to, then make sure everything — and I mean EVERYTHING — has been cleaned up and you’ve documented this in your submission.

25.  Submitting to the social sites from a non power user account

Nothing goes flat faster than a submission from an unknown user with no history, no followers, no “street cred”. Power users still rule, Digg redesign or not.

Bonus tip: Stop focusing on low- (or no) value activities

Yes, I’ll beat on the meta keywords tag yet again. Google never supported it. All it does is hand free info to your competitors. Guaranteed there are items on your SEO to-do list like this that aren’t worth doing. Be outcome-focused, not activity-focused. Focus on what matters.

Of course this wasn’t an exhaustive list. There are many, many more. I could easily make this a three article series too. I will try to resist the temptation. ;-)

What mistakes are you seeing your co-workers, clients, and competitors make? Share them in the comments!

Opinions expressed in the article are those of the guest author and not necessarily Search Engine Land.

About The Author: Stephan Spencer is the author of Google Power Search, creator of the Science of SEO, and co-author of The Art of SEO, now in its second edition, both published by O'Reilly. Spencer is also the founder of Netconcepts and inventor of the SEO technology platform GravityStream. He also blogs on his own site, Stephan Spencer's Scatterings.

  • http://www.web-savvy-marketing.com rebeccagill

Good article Stephan. My particular favorite is #13. It is a pet peeve of mine. If you can “see” attempts at SEO, then it is just bad SEO. Good SEO flows and becomes part of the website. Bad SEO jumps out at you like a neon sign.

  • http://www.experienceadvertising.com experienceadvertising

    Great post Stephan! Very informative. Also really enjoyed your presentation at the Performance Marketing Expo in Miami a few weeks ago…great live analysis!

  • http://www.mynextcustomer.com kerimorgret

    Related to #8 about disabling Google personalization, I’ve found using Safari as a browser just for testing to be helpful. There’s an Edit -> Reset Safari option where you can clear cookies, search history, auto-complete, and everything else. It’s an easy way to make sure you’re not logged in to anything.

  • http://www.portentinteractive.com Tom Schmitz

    Great list Stephan.

In reference to #1, I’ll add: check your geography. I always use the Local (USA) Exact Match and Local (USA) Phrase Match numbers. It doesn’t help my domestic clients when all the queries come from overseas.

    If the ratio of Phrase Match over Exact Match is large, like 2:1 or higher, then I look for a healthy collection of long-tail keywords. If the ratio is close to 1:1 then I know most people just search for the exact match.

  • Paresh.shrimali

It’s a nice review of our basic mistakes. We should correct our mistakes per the guidelines above. I learned something new about IP neighborhoods; it’s also very important for SEO activity.

  • http://www.borisjacquin.com borisjacquin

You start on the wrong foot. #1 is no longer a useful SEO tool. Google clearly announced in October that the Google AdWords Keyword Tool was only giving results with commercial value. See http://www.aimclearblog.com/2010/10/07/r-i-p-google-keyword-tool-long-live-seo/

  • http://www.sauravrimal.co.uk srimal

Not sure if “mapping URLs to keywords, but not the other way around” is a bad thing, especially now when you can have up to 4 pages listed for the same terms in the SERPs. Yes, logically we recommend focusing on one page, but would you not take the chance of having 3 to 4 pages listed in Google?

  • http://top-seo-blog.blogspot.com/ Sangeeta Mittal

Good points to keep in mind while doing SEO for sites.
    Thanks for the long list.

  • http://www.seolinkreports.com ogletree

I’m going to have to disagree with part of #21. Have you ever tested submitting a site to 800 free directories? I have test sites that I run and do just that, to see what Google thinks about it. Also, if you know anything about how submitting to directories works, you would understand that it does not work the way you say. If I submit to 800 directories right now, very few links will be acquired. Many of them have huge backlogs and take a while to get in. Also, most of the time your site is buried very deep in a PR3 or lower directory that does not get spidered that often. Google has no idea that you just submitted to a bunch of directories. Every single link that you get will be spidered on different days.

    Also, we run backlink profile reports with our software that show us how many directory links sites have. I have run these reports on hundreds of sites that rank in the top five for major keywords, and every single one of them has thousands of links from directories. Many people don’t understand how big directories are these days. Google can’t do anything about it because pretty much every site has links in them. I have sites that have nothing but directory links and they rank for some decent keywords.

    I do link building for a living, and I would never tell somebody they need to get a ton of directory links, but I do tell people that every link campaign should spend a little time on it. You should never have any one link type take up a large percentage of your links. The only reason I have some sites with only directories is for testing.

  • wordswordsseowords

    Not sure I understand the details in #12:

    “Many amateur SEOs overlook the importance of building links to pages that link to their sites. For commercial sites, it can be tough to get links that point directly to your site. But once you have acquired a great link, it can be a lot easier to build links to that linking page and thus you’ll enjoy the indirect benefit.”

    Do you mean that if a site writes an article about my client, then on my client’s site I should somehow create a link back to that site? What’s the ‘indirect benefit?’

  • http://getinnepal.com Prarthana Sharma

Great article. Enjoyed going through each and every point. Super common yet neglected errors :)
    Thanks!

  • Hariharan K

Really fantastic post. We can learn positives from these negatives. Good concept.

 
