Deadly Sins Against SEO: Part 2
In Deadly Sins Against SEO, Part 1, I outlined some of the most egregious mistakes that website owners, web designers/developers, information architects, landing page experts and even search engine optimization (SEO) professionals make — mistakes that block qualified search engine traffic and conversions. Here are more of the top SEO sins I commonly encounter.
Sin #4: Using tools vs. developing skills
A tool can be effective if the person using the tool has aptitude, skill and talent. A tool is not a substitute for skill. For example, I can hammer a nail into a wall to hang a picture, but I do not have the skill and aptitude to build an entire house. This is analogous to SEO. Knowing how to use various keyword research tools does not automatically make a person an effective search-engine friendly copywriter or information architect.
Don’t get me wrong. I understand the initial need and usefulness of many SEO tools. A keyword density tool can be useful for people who are not accustomed to writing with keywords and applying them appropriately on a web page. Some of the search engines’ webmaster tools help site owners pinpoint easily overlooked crawlability roadblocks.
Problems often arise when SEO professionals use these tools as a crutch instead of becoming true optimization chefs. For example, one tool will say that your content’s keyword density is too high, while another tool will say the same content is perfect. And search engines have not used keyword density as a ranking factor for a very long time. What about users/searchers? What keyword density tool determines what your target audience thinks? At some point, search-engine friendly copywriters should know how to write effective content without relying on such tools.
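To see why such tools disagree, consider what a keyword density tool actually computes. The sketch below is a minimal, hypothetical version (the sample text and scoring choices are mine, not any real tool’s): it counts phrase hits and divides by total words. Tools diverge simply because each one makes different choices about tokenization, stop words and phrase counting — none of which tells you anything about what the search engines or your audience value.

```python
import re

def keyword_density(text, phrase):
    """Naive keyword density: (phrase hits x words in phrase) / total words, as a percentage."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    if not words or n == 0:
        return 0.0
    # Slide a window over the word list and count exact phrase matches.
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == phrase_words)
    return 100.0 * hits * n / len(words)

sample = "Blue widgets are popular. Our blue widgets ship fast."
print(keyword_density(sample, "blue widgets"))  # about 44.4: 2 hits x 2 words / 9 total words
```

Swap the tokenizer or strip stop words and the same page scores differently — which is exactly why two tools give two verdicts on one piece of copy.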
Sin #5: Misinterpreting data by taking numbers out of context
A number without context is just a number. A number taken out of context can lead website owners down the wrong path.
Here is an example that has been driving me crazy for many years. I know of a sales/conversion expert (who touts himself as an SEO professional) who created this word calculator. This calculator supposedly determines the number of times you use your company name on a web page. The reasoning is that if you use your company name too much, your site content is more focused on your company name and brand than on site visitors.
On the surface, this sounds somewhat reasonable. But what if your company name contains keywords, including the primary keyword phrase you wish to target? Or your trademark? What about navigational queries, when the searcher’s intent is to go directly to a website, or even to a specific page within a website? Navigational queries are far more common than we might imagine — often up to 33% of search engine queries. Why would any website owner make it difficult for a person to arrive at the official company website?
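For concreteness, here is roughly what such a word calculator computes — a minimal sketch, with the company name and page copy invented for illustration. Notice the flaw the critique above points out: when the company name contains a target keyword, a raw mention count cannot distinguish branding from keyword targeting.

```python
def name_mentions(text, company_name):
    """Count case-insensitive occurrences of a company name in page copy."""
    return text.lower().count(company_name.lower())

# Hypothetical page copy for a company whose name contains its primary keyword.
copy = "Acme Widgets builds widgets. Trust Acme Widgets."

print(name_mentions(copy, "Acme Widgets"))  # 2 brand mentions
print(name_mentions(copy, "widgets"))       # 3 keyword hits -- two overlap with the brand name
```

The number alone cannot say whether those mentions serve the brand, the target keyword phrase, or a navigational searcher — that judgment requires context the calculator does not have.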
Bounce rates can indicate that searcher expectations were met (for a “quick fact” informational query) or that they were not. Increased page views per visitor can indicate confusion (poor navigation and labeling) or genuine interest. Eye-tracking data can show the page elements (text, graphic images, videos) that people view and the order in which they view them. But people view content differently depending on their individual scenarios and goals, especially during eye-tracking tests.
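The bounce rate itself is trivial arithmetic — a sketch below, using an invented list of per-session page view counts — which is precisely the point: the computation is easy, and everything hard is in the interpretation.

```python
def bounce_rate(session_page_counts):
    """Percentage of sessions that viewed exactly one page."""
    if not session_page_counts:
        return 0.0
    bounces = sum(1 for pages in session_page_counts if pages == 1)
    return 100.0 * bounces / len(session_page_counts)

# Five hypothetical sessions: three single-page visits, two multi-page visits.
print(bounce_rate([1, 1, 1, 3, 5]))  # 60.0
```

A 60% bounce rate on a “quick fact” page may mean searchers found their answer instantly; the same 60% on a category page may mean they gave up. The metric cannot tell you which.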
I have been a strong believer in web analytics and usability testing since the mid-to-late 1990s. But I am an equally strong believer in accurately interpreting that data.
Sin #6: Treating symptoms instead of solving the problem
Sometimes, I swear the SEO industry has become the industry of band-aids and workarounds. Is your entire site designed in Flash? There’s a workaround for that. Do you have a content management system (CMS) that generates an uncrawlable URL structure? There’s a workaround for that, too.
Again, don’t get me wrong—some SEO workarounds are necessary because the “powers that be” who are employed at the commercial web search engines do not yet know how to deal with the new and emerging technologies that enhance and enrich the searcher experience. In addition, software developers seem to discount or ignore crawlability and indexation issues when they create CMS software. Therefore, we SEO professionals find it necessary to use these workarounds until software is developed to accommodate web crawlers.
Nevertheless, many workarounds are not workarounds at all — they are band-aids for genuine site problems. For example, site maps (both wayfinder and XML site maps) are still used as a substitute for fixing a site’s poor information architecture and crawlability issues. A wayfinder site map is a web page that every site should have as part of defensive design. But if the wayfinder site map is the only way users can directly access desired content, the real solution is to fix site navigation and supplemental page interlinking.
Likewise, XML site maps (also spelled XML sitemaps) are commonly used as a crutch. If a site has 50,000 pages and a search engine is not crawling all of them, the problem might be duplicate content delivery, substandard or nonexistent third-party link development, poor navigation and interlinking, poorly implemented URL workarounds, and so forth. A URL list is not going to fix those problems.
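To underline why: an XML sitemap really is nothing more than a structured URL list. The sketch below builds a minimal one (the example URLs are invented). Submitting this file tells crawlers where pages are — it does not repair duplicate content, navigation or link equity problems.

```python
def build_sitemap(urls):
    """Build a minimal XML sitemap: just a list of <url><loc> entries, nothing more."""
    entries = "\n".join(f"  <url><loc>{u}</loc></url>" for u in urls)
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{entries}\n"
            "</urlset>")

print(build_sitemap(["https://example.com/", "https://example.com/about"]))
```

Every structural problem listed above survives the sitemap submission untouched — the file only enumerates URLs; it carries no navigation, no interlinking and no signal about which duplicate is canonical.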
I have to admit that I found it very difficult to identify my top SEO sins, as each SEO professional has unique challenges. What SEO sins do you commonly observe? Let us know in the comments section below.
Some opinions expressed in this article may be those of a guest author and not necessarily Search Engine Land.