36 SEO Myths That Won’t Die But Need To
Every day a new SEO myth is born; unfortunately, not every day does an old SEO myth die off. The net result is a growing population of myths. These are nearly impossible to squash because the snake-oil salesmen of our industry keep perpetuating them — bringing them back from the brink, even. You can talk at conferences till you're blue in the face. You can develop SEO checklists like this one, or even author a book. You'll still get asked how to write good meta keywords.
You can even pre-empt myths before they take hold, as Matt Cutts attempted in his recent post, Google Incorporating Site Speed in Search Rankings. Despite Matt’s best efforts, I am sure this won’t be the last time we hear (or read) the myths “site speed is a major new factor in determining Google rankings” and “the site speed signal will help big sites who can pay more for hosting.”
Sometimes the myths get debunked, only to end up coming back with a vengeance. Take the meta keywords tag, for instance. Google never did support this worthless tag. But apparently, Yahoo had been “supporting” the tag for some time. Remember when Yahoo went on the record (at the SMX East 2009 conference) to say they were no longer giving any credence to the meta keywords tag? Then, within days, Danny Sullivan published his findings from some of his own tests. Turned out Yahoo’s assertion on the meta keywords tag was wrong. In Yahoo it apparently still is a signal (albeit an incredibly minor one). Oops.
I, for one, hate misinformation and disinformation, and our industry, unfortunately, is rife with it. I’m going to do my part in fighting this menace and spreading the truth — by exposing some of the more insidious myths in this very article. I think this is only fitting, considering Covario’s oft-stated goal is to be “the source of truth” for our clients on the performance of their SEO and SEM.
And now, without any further ado, the list…
- Our SEO firm is endorsed/approved by Google. The following comes from an actual email a friend of mine received from an SEO firm last year:
“We are…Google Approved, a partner with Google, they endorse us as an optimizer, and their list includes very few partners, and we are one of them! To find us on their list please go to: http://www.google.com/websiteoptimizer/woac.html and select region: United States; scroll to the middle of the page and find National Positions.”
Hmm…. you won’t find them listed there anymore.
- Don’t use Google Analytics because Google will spy on you and use the information against you. This one comes straight from the conspiracy theorists. Google has made numerous assurances that they aren’t using your traffic or conversion data to profile you as a spammer.
- Your PageRank score, as reported by Google’s toolbar server, is highly correlated to your Google rankings. If only this were true, our jobs as SEOs would be so much easier! It doesn’t take many searches with SEO for Firefox running to see that low-PageRank URLs outrank high-PR ones all the time. It would be naive to assume that the PageRank reported by the Toolbar Server is the same as what Google uses internally for their ranking algorithm.
- Having an XML Sitemap will boost your Google rankings. I just heard this one from a fellow panelist in an SEO session at a conference I presented at within the last month (I won’t mention who, or which show). This made me cringe, but I bit my lip rather than embarrass and contradict them in front of the audience. Should I have spoken up? Did I do the audience a disservice by leaving this myth unchallenged? I struggled with that. In any event, Google will use your sitemaps file for discovery and potentially as a canonicalization hint if you have duplicate content. It won’t give a URL any more “juice” just because you include it in your sitemaps.xml, even if you assign a high priority level to it.
- Since the advent of personalization, there is no such thing as being ranked #1 anymore because everyone sees different results. Although it is true that Google personalizes search results based on the user’s search history (and now you don’t even have to be logged in to Google for this personalization to take place), the differences between personalized results and non-personalized results are relatively minor. Check for yourself. Get in the habit of re-running your queries — the second time adding &pws=0 to the end of the Google SERP URL — and observing how much (or how little) everything shifts around.
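The pws=0 check above is easy to script. Here's a minimal Python sketch that appends the parameter to a Google SERP URL; the `depersonalize` helper name is my own, and pws=0 is simply the depersonalization flag the article describes:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

def depersonalize(serp_url: str) -> str:
    """Append pws=0 to a Google SERP URL to request non-personalized results."""
    parts = urlparse(serp_url)
    query = parse_qsl(parts.query)        # e.g. [("q", "seo myths")]
    query.append(("pws", "0"))            # turn personalized web search off
    return urlunparse(parts._replace(query=urlencode(query)))

print(depersonalize("https://www.google.com/search?q=seo+myths"))
```

Compare the two result sets side by side; in most cases the rankings barely move.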
- Meta tags will boost your rankings. I’m so sick of hearing about meta tags. Optimizing your meta keywords is a complete waste of time. Period. They have been so abused by spammers that the engines haven’t put any stock in them for years and years. What about other meta tags — such as meta description, meta author, and meta robots — you ask? None of the various meta tags are given any real weight in the rankings algorithm.
- It’s good practice to include a meta robots tag specifying index, follow. This is a corollary to the myth immediately preceding. It’s totally unnecessary. The engines all assume they are allowed to index and follow unless you specify otherwise.
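To make the point concrete, here is the tag in question (a hypothetical page head):

```html
<!-- Unnecessary: indexing and following links are already the defaults -->
<meta name="robots" content="index, follow">

<!-- Only worth including when you want NON-default behavior, e.g.: -->
<meta name="robots" content="noindex">
```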
- It’s helpful if your targeted keywords are tucked away in HTML comment tags and title attributes (of IMG and A HREF tags). Since when have comment tags or title attributes been given any weight?
- Having country-specific sites creates “duplicate content” issues in Google. Google is smart enough to present your .com.au site to Google Australia users and your .co.nz site to Google New Zealand users. Not using a ccTLD? Then set the geographic target setting in Google Webmaster Tools; that’s what it’s there for. Where’s the problem here?
- Googlebot doesn’t read CSS. You’d better believe Google scans CSS for spam tactics like hidden divs.
- You should end your URLs in .html. Since when has that made a difference?
- You can boost the Google rankings of your home page for a targeted term by including that term in the anchor text of internal links. Testing done by SEOmoz found that the anchor text of your “Home” links is largely ignored. Use the anchor text “Home” or “San Diego real estate” — it’s of no consequence either way.
- It’s important for your rankings that you update your home page frequently (e.g., daily). This is another fallacy spread by the same aforementioned fellow panelist. Plenty of stale home pages rank just fine, thank you very much.
- Trading links helps boost PageRank and rankings. Particularly if done on a massive scale with totally irrelevant sites, right? Umm, no. Reciprocal links are of dubious value: they are easy for an algorithm to catch and to discount. Having your own version of the Yahoo directory on your site isn’t helping your users, nor is it helping your SEO.
- Linking out (such as to Google.com) helps rankings. Not true. Unless perhaps you’re hoarding all your PageRank by not linking out at all — in which case, that just looks unnatural. It’s the other way around, i.e. getting links to your site — that’s what makes the difference.
- It’s considered “cloaking” — and is thus taboo and risky — to clean up the URLs in your links selectively and only for spiders. If your intentions are honorable, then you have nothing to fear. All the major search engines have said as much. You are helping the engines by removing session IDs, tracking parameters and other superfluous parameters from the URLs across your site — whether it’s done by user-agent detection, cookie detection or otherwise. After all, if it were bad, would Yahoo be doing it? Check it for yourself: visit the Yahoo.com home page with the Googlebot user agent string (e.g. with Firefox using the User Agent Switcher extension). You’ll notice the “ylt” parameter has been stripped from all the links.
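The URL cleanup described above can be done with a simple rewrite step. Here's a minimal Python sketch; the parameter names in `STRIP_PARAMS` are hypothetical examples — substitute whatever session and tracking parameters your own site appends:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Hypothetical parameters to strip; adjust for your site's session/tracking setup.
STRIP_PARAMS = {"sessionid", "sid", "jsessionid",
                "utm_source", "utm_medium", "utm_campaign"}

def clean_url(url: str) -> str:
    """Remove superfluous session/tracking parameters from a URL."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k.lower() not in STRIP_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

print(clean_url("https://www.example.com/product?id=42&sessionid=abc123&utm_source=news"))
```

Whether you apply this selectively for spiders (via user-agent or cookie detection) or for all visitors, the key point from the article stands: the intent is to help the engines, not deceive them.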
- If you define a meta description, Google uses it in the snippet. We already learned from my last column (“Anatomy of a Google Snippet“) that this is oftentimes not the case.
- The bolding of words in a Google listing signifies that they were considered in the rankings determination. Also discussed in my last column, this phenomenon — known as “KWiC” in Information Retrieval circles — is there purely for usability purposes.
- H1 tags are a crucial element for SEO. Research by SEOmoz shows little correlation between the presence of H1 tags and rankings. Still, you should write good H1 headings, but do it primarily for usability and accessibility, not so much for SEO.
- There are some unique ranking signals for Google Mobile Search, and they include the markup being “XHTML Mobile”. Google Mobile Search results mirror those of Google Web Search. By all means, create a mobile-friendly version of your site; but do it for your users, not for SEO.
- SEO is a black art. And it’s done, usually in a dark room, by some rogue SEO consultant, without requiring the involvement of the client / rest of the company. If SEO were like that, our lives would read like spy novels.
- The Disallow directive in robots.txt can get pages de-indexed from Google. As I explained in my article “A Deeper Look at Robots.txt“, disallows can lead to snippet-less, title-less Google listings. Not a good look. To keep pages out of the index, use the Noindex robots.txt directive or the meta robots noindex tag — NOT a Disallow directive.
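To illustrate the difference, here's a sketch of the two robots.txt approaches for a hypothetical /private/ section:

```
# robots.txt
# Disallow blocks crawling, but the URL can still surface as a bare,
# title-less, snippet-less listing:
Disallow: /private/

# The (unofficial) Noindex directive described above keeps the pages
# out of Google's index entirely:
Noindex: /private/
```

The page-level alternative is a `<meta name="robots" content="noindex">` tag in each page's head — which requires that the page remain crawlable so the engine can see the tag.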
- SEO is a one-time activity you complete and are then done with. How many times have you heard someone say “We actually just finished SEOing our site”? It makes me want to scream “No!” with every fiber of my being. SEO is ongoing. Just like one’s website is never “finished,” neither is one’s SEO. Catalog marketers get this better than anyone else: they are used to optimizing every square inch of their printed catalog. There is always more performance to be wrung out. The “set it and forget it” misconception is particularly prevalent among IT workers — they tend to treat everything like a project so that they can get through assignments, close the “ticket” and move on, and thus maintain their sanity. I can’t say I blame them.
- Automated SEO is black-hat or spammy. There is nothing wrong or inappropriate about using automation. Indeed, it signals a level of maturity in the marketplace when industrial-strength tools and technologies for large-scale automation are available. Without automation, it would be difficult, if not impossible, for an enterprise company to scale its SEO efforts across the mass of content it has published on the Web. Chris Smith paints a compelling picture for SEO automation in this classic post.
- A site map isn’t for people. A good (HTML, not XML) site map is designed as much for human consumption as it is for spiders. Any time you create pages/copy/links solely for a search engine, hoping they won’t be seen by humans, you’re asking for trouble.
- There’s no need to link to all your pages for the spiders to see them. Just list all URLs in the XML Sitemap. Orphan pages rarely rank for anything but the most esoteric of search terms. If your web page isn’t good enough for even you to want to link to it, what conclusion do you think the engines will come to about the quality and worthiness of this page to rank?
- Google will not index pages that are only accessible by a site’s search form. This used to be the case, but Google has been able to fill out forms and crawl the results since at least 2008. Note this doesn’t give you permission to deliberately neglect your site’s accessibility to spiders, as you’d probably be disappointed with the results.
- Placing links in teeny-tiny size font at the bottom of your homepage is an effective tactic to raise the rankings of deep pages in your site. Better yet, make the links the same color as the page background, and/or use CSS to push the links way out to the side so they won’t detract from the homepage’s visual appearance! (I am being facetious here, don’t actually do this.)
- Using a service that promises to register your site with “hundreds of search engines” is good for your site’s rankings. If you believe that, then you may also be aware that there is a Nigerian prince who desperately needs your help to get a large sum of money smuggled out of his country, for which you will be richly rewarded.
- Home page PageRank on a domain means something. As in: “I have a PageRank 6 site.” In actuality it means nothing. As I already stated, toolbar PageRank is misleading at best, completely bogus at worst. Furthermore, a high PageRank on one’s home page doesn’t necessarily equate to high PageRank on internal pages. That’s a function of the site’s internal linking structure.
- Outsourcing link building to a far-away, hourly contractor with no knowledge of your business is a good link acquisition solution. And a sound business decision… NOT! As it is, the blogosphere is already clogged enough with useless, spammy comments in broken English from third-world link builders. No need to make it worse by hiring them to “promote” your site too.
- The clickthrough rate on the SERPs matters. If this were true then those same third-world link builders would also be clicking away on search results all day long.
- Keyword density is da bomb. Ok, no one says “da bomb” anymore, but you get the drift. Monitoring keyword density values is pure folly.
- Hyphenated domain names are best for SEO. As in: san-diego-real-estate-for-fun-and-profit.com. Separate keywords with hyphens in the rest of the URL after the .com, but not in the domain itself.
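The rule of thumb above — hyphens in the path slug, not in the domain — can be sketched in Python. The `slugify` helper here is a hypothetical illustration of generating a hyphen-separated path segment:

```python
import re

def slugify(title: str) -> str:
    """Turn a page title into a hyphen-separated URL slug (for the path, not the domain)."""
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower())  # non-alphanumerics -> hyphens
    return slug.strip("-")                             # drop leading/trailing hyphens

# Good: hyphens after the domain...
print("https://example.com/" + slugify("San Diego Real Estate"))
# ...not a domain like san-diego-real-estate-for-fun-and-profit.com
```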
- Great Content = Great Rankings. Just like great policies equals successful politicians, right?
Now it’s up to you, dear reader, to do your part in spreading the truth. Whenever possible, stand with your SEO brothers and sisters united against this damaging SEO mythology. Let’s all be SEO mythbusters.
Which SEO myths are you most sick of? What did I miss in my list? Talk back in the comments.
Postscript: Be sure to read the follow-up column to this, 36 More SEO Myths That Won’t Die But Need To.
Some opinions expressed in this article may be those of a guest author and not necessarily Search Engine Land. Staff authors are listed here.