SEO Checklist Part 2: Best Practices

Continued from Part 1: 29 Worst Practices & Most Common SEO Failures.

Best Practices

Implementing the 14 best practices below (or at least some of them) and avoiding the worst practices should offer you a straightforward approach to better visibility in search engines, including Google, Yahoo!, and Bing.

For each best practice below, mark one of: Doing now / Will do soon / Won’t or N/A.
1. Are the keywords you are targeting relevant and popular with searchers?
2. Do your page titles lead with your targeted keywords?
3. Is your body copy of sufficient length and keyword-rich?
4. Does the anchor text pointing to various pages within your site include good keywords?
5. Do you employ text links from your home page to your most important secondary pages?
6. If you must have graphical navigation, do you use the CSS image replacement technique as a workaround, and do those graphics have descriptive and keyword-rich ALT attributes that are useful for both humans and engines?
7. Does your Web site have an XML Sitemap, as well as an HTML site map with text links?
8. Are the URLs of your dynamic (database driven) pages short, simple, and static-looking?
9. Do your home page and other key pages of your site have sufficient PageRank (link authority)?
10. Does your site have an optimized internal linking structure?
11. Do your pages have keyword-rich meta descriptions with a compelling call to action?
12. Does your site have a custom error page that returns the correct “status code”?
13. Do your filenames and directory names include targeted keywords?
14. Are you actively building links to your Web site?


Best and Worst Practice Explanations

Curious about the importance or relevance of some of the questions on the checklists? Read on for full descriptions of the implications of these questions.

Best Practices Explanations

  1. Are the keywords that you are targeting not only relevant but also popular with searchers? There is no point going after high rankings for keywords that no one searches for. Compare the relative popularity of keywords using Google’s free tools (the Google AdWords Keyword Tool and Google Insights for Search) and/or paid tools like KeywordDiscovery.com and WordTracker.com before deciding which keywords to employ on your Web pages.

    Despite the popularity of individual words, it’s best to target two- or three-word phrases (or even longer). Because of the staggering number of Web pages indexed by the major search engines, competing for a spot on the first or second page of search results on a one-word keyword will typically be a losing battle (unless you have killer link authority). This should go without saying, but the keywords you select should be relevant to your business.

  2. Do your page titles lead with your targeted keywords? The text within your page title (a.k.a. the title tag) is given more weight by the search engines than any other text on the page. The keywords at the beginning of the title tag are given the most weight. Thus, by leading with keywords that you’ve chosen carefully, you make your page appear more relevant to those keywords in a search.
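
    For example (the targeted phrase and business here are hypothetical), leading the title tag with the keyword might look like:

```html
<!-- Targeted phrase "organic dog food" leads; the brand name follows -->
<title>Organic Dog Food &amp; Treats | Acme Pet Supply</title>
```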

  3. Is your body copy of sufficient length and keyword-rich? Ideally, incorporate at least several hundred words on each page so there’s enough “meat” for the search engines to sink their teeth into and determine the page’s keyword theme. Include relevant keywords high up in the page, where they are weighted more heavily by the search engines than keywords mentioned only at the bottom of the page, almost as an afterthought. This is known as keyword prominence. Think in terms of keyword prominence in the HTML source, not the rendered page on the screen; Google doesn’t realize that something is at the top of the third column if it appears low in the HTML. Be careful not to go overboard to the point that your copy doesn’t read well; that’s called “keyword stuffing” and is discussed later, under “Worst Practices.”

  4. Does the anchor text pointing to various pages within your site include good keywords? Google, Yahoo, and Bing all treat the anchor text of a hyperlink as highly relevant to the page being linked to. So use good keywords in the anchor text to help the engines better ascertain the theme of the page you are linking to. Keep the link text relatively succinct and tightly focused on just one keyword or key phrase; the longer the anchor text, the more diluted the overall theme conveyed to the engines.
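
    For example (the URLs and anchors below are hypothetical):

```html
<!-- Succinct, keyword-focused anchor text -->
<a href="/cordless-drills/">cordless drills</a>

<!-- Diluted: long anchor text spreads the theme thin -->
<a href="/cordless-drills/">click here to browse our full selection of products</a>
```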

  5. Do you employ text links from your home page to your most important secondary pages? Text links are, by far, the better option over ALT attributes in conveying to the search engine the context of the page to which you are linking. (An ALT attribute is the text that appears in a small box when you hover your cursor over an image.) ALT attributes can have an effect, but it’s small in comparison with that of text links. If you have graphical navigation buttons, switch them to keyword-rich text links; if that’s not an option, at least include text link navigation repeated elsewhere on the page, such as in the footer (note however that footer links are partially devalued), or consider the CSS image replacement technique, described below.

  6. If you must have graphical navigation, do you use the CSS image replacement technique as a workaround, and do those graphics have descriptive and keyword-rich ALT attributes that are useful for both humans and search engines? Image replacement is a technique that employs CSS (Cascading Style Sheets) to substitute replacement copy and HTML – such as a text link or heading tag – when the stylesheet is not loaded (as is the case when the search engine spiders come to visit). The text-based replacement is weighted more heavily by the engines than the IMG ALT attribute, and is thus preferable to relying solely on the ALT attribute. Of the many ways to implement image replacement, most use CSS to physically move the text off the screen (text-indent: -9999em, left: -9999em, display: none, etc.), which is not ideal because the search engines may discount this as hidden text.

    Important: resist the temptation to work additional keywords or text into the replacement copy, or your site may be hit with a penalty. A few CSS image replacement methods are preferable because they don’t physically move the content off-page and remain accessible, namely the Leahy/Langridge Method, the Gilder/Levin Method, and the ‘Shea Enhancement’. It is still useful to have ALT attributes on your images, more for usability/accessibility than for SEO. ALT attributes should contain relevant keywords that convey the key information from the image that the user would not receive if she had image loading turned off.
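
    As a sketch of the Gilder/Levin Method (the heading text, image path, and dimensions below are hypothetical), the real text stays in the document flow and an empty span lays the image over it:

```html
<!-- The heading text remains in place for the search engines -->
<h1 id="logo">Acme Widgets<span></span></h1>

<style>
  /* The empty span is stretched over the heading and painted with the
     logo image, covering the text without moving it off-screen. */
  #logo { position: relative; width: 200px; height: 60px; overflow: hidden; }
  #logo span { position: absolute; top: 0; left: 0; width: 100%; height: 100%;
               background: url(/images/logo.png) no-repeat; }
</style>
```

    One known trade-off of this method: it doesn’t work with transparent images, since the underlying text would show through.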

  7. Does your Web site have an XML Sitemap, as well as an HTML site map with text links? An XML Sitemap file provides the search engines with a comprehensive list of the URLs of the pages/documents on your website. This helps ensure all of your pages get indexed by the search engines. But an XML Sitemap is more than just a list of URLs; it can include additional information about each URL, such as the page’s last-modified date and priority (which can affect how frequently the page is visited by the search engine spiders and thus how quickly it is refreshed).

    It’s a best practice to also include the location of your Sitemap file(s) in your site’s robots.txt, so that the search engines can “autodiscover” them on their own without you having to specify the location of the file(s) in each search engine’s Webmaster Center. An HTML site map is a different thing altogether: it’s simply a page on your website with links to all your important pages, usually displayed in a hierarchical fashion. A link to the site map is typically present in the footer of every page of the site.

    HTML site maps have long been touted as good “spider food” because they provide the search engine spiders with links to key pages to explore and index. Use text links, since they are more search-engine-friendly than graphical links, as already mentioned. Bear in mind that you should ideally stay within 100 links per page, a best practice recommended by Google (this is a rough guideline, not a hard-and-fast rule). That may mean breaking your site map into multiple HTML pages.
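
    A minimal XML Sitemap might look like the following (example.com, the date, and the priority value are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/widgets/</loc>
    <lastmod>2009-06-15</lastmod>
    <priority>0.8</priority>
  </url>
</urlset>
```

    And the corresponding autodiscovery line in robots.txt:

```text
Sitemap: http://www.example.com/sitemap.xml
```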

  8. Are the URLs of your dynamic (database-driven) pages short, simple and static-looking? Pages with URLs that contain a question mark and numerous ampersands and equals signs aren’t as palatable to the search engines as simple, static-looking URLs. Either install a server module/plug-in that allows you to “rewrite” your URLs, or recode your site to embed your variables in the path info instead of the query string; or, if you need to minimize resource requirements by your IT team, you can enlist a “proxy serving” solution such as Organic Search Optimizer.

    I’ve written about this at length in this two-part article. Another, oft-neglected aspect of URL optimization is making them short for improved click-through from the search results. In my previous article on URL optimization I discussed an interesting study by MarketingSherpa that found that short URLs get clicked on twice as often as long URLs in the Google SERPs.
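
    On Apache, for instance, a URL-rewriting sketch might look like the following (the paths, parameter names, and script name are hypothetical):

```apache
# Serve the static-looking URL /widgets/blue-widget/ from the underlying
# dynamic script /product.php?category=widgets&item=blue-widget
RewriteEngine On
RewriteRule ^widgets/([a-z0-9-]+)/$ /product.php?category=widgets&item=$1 [L,QSA]
```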

  9. Do your home page and other key pages of your site have sufficient PageRank (link authority)? PageRank is Google’s way of quantifying the importance of a Web page. It’s as much about the quality of the links pointing to a given Web page as it is about the quantity (more so, actually). PageRank has been the cornerstone of Google’s ranking algorithm since the beginning. More important (PageRank-endowed) pages wield more voting power; a page’s “vote” gets divvied up among all the links on the page and passed on to those pages.

    Of course, this is a massive over-simplification, and the PageRank algorithm has evolved over the years to include such things as trust and authority to stay ahead of the spammers. Nonetheless, a form of PageRank is still in use today by Google. You can check Google PageRank scores using the Google Toolbar: mouse over the toolbar’s PageRank meter to display the numerical rating, an integer value between 0 and 10. Yahoo’s importance-scoring equivalent to PageRank has been referred to internally as both LinkFlux and Yahoo! Web Rank at various times. It’s best to refer to the PageRank-like algorithms of the three major engines more generally as “link authority,” “link equity,” or “link juice.” The PageRank scores delivered by Google’s toolbar server are on a logarithmic scale, meaning that integer increments are not evenly spaced. Thus, garnering more links and gaining in PageRank score from 3 to 4 is easy, but going from 6 to 7 is a lot harder.

    Also bear in mind that the PageRank displayed in the Google Toolbar is not the same PageRank as what is used by Google’s ranking algorithm. In fact, the correlation between the two has degraded over time. A potentially better predictor of your true PageRank score is the “mozRank” score available from Linkscape, which approximates Google PageRank using a sophisticated algorithm and an index of 30+ billion pages. mozRank scores are also on a logarithmic scale. A PageRank or mozRank score of 7 or 8 for your home page is a laudable goal. Note that each page has its own PageRank score. Because most of the inbound links your site has garnered point to the home page, your home page almost invariably ends up being the most PageRank-endowed page of your site. The PageRank that has accumulated on your home page is passed to your internal pages through your internal linking structure.

    Bottom line: if a given page on your site doesn’t have enough PageRank (I’m referring to the super-secret, internal PageRank that Google doesn’t share with us SEOs via the Toolbar), then it doesn’t deserve to rank.


  10. Does your site have an optimized internal linking structure? Your site’s hierarchical internal linking structure conveys to the search engines how important you consider each page of your site, comparatively. This of course impacts these pages’ PageRank scores and ultimately their Google rankings. The deeper down a page is in the site tree (i.e. the more clicks away the page in question is from the home page), the less PageRank with which that page will be endowed.

    Therefore, it’s critical that you think carefully about how you spend that hard-earned PageRank, i.e., where and how you link from your home page and from your site-wide navigation to the rest of your site. Generally speaking, the deeper in your hierarchy you hide key content, the less important that content appears to the search engines — if they even find it (which is not a given if it’s very deep). As an aside, this concept applies not only to your linking structure but also to your URL structure: too many slashes in the URL (i.e., too many sub-directories deep) and you convey to the engines that the page is unimportant. A flat directory structure, which minimizes the number of slashes in the URL, helps ensure more pages of your site get indexed.
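
    To illustrate (the paths are hypothetical), compare:

```text
Deep (conveys low importance):  www.example.com/products/tools/power/drills/cordless/model-123.html
Flat (preferred):               www.example.com/cordless-drills/model-123.html
```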

  11. Do your pages have keyword-rich meta descriptions with a compelling call to action? Because meta tags are tucked away in the HTML and hidden from the view of the human visitor, they have been abused like crazy by spammers trying to hide keywords out of view. The original purpose of meta tags was to provide meta-information about the page which could then be used by search engine spiders and other algorithms. One such piece of meta-information is a description of the page (e.g., its content and its purpose), a.k.a. the “meta description”. Although it won’t improve your rankings to define a meta description (or meta keywords or any other meta tag, for that matter), it is useful from the standpoint of influencing what text appears within your listing in the search results (i.e. the “snippet”), in order to better persuade the user to click through to your site.

    Yahoo will frequently employ the meta description as the description in your search results listing, and Bing also displays meta descriptions in its listings. Google may incorporate some or all of your meta description into the snippet displayed in your search listing; it’s more likely if the searcher’s keywords are present in your meta description. More on the intricacies of Google snippets here. The user’s search terms, and related keywords such as those sharing the same root, are bolded in the search listing, which improves the clickthrough rate to your page from the search results. This is known as KWiC (KeyWords in Context).
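
    A sketch of a keyword-rich meta description with a call to action (the business and copy are hypothetical):

```html
<meta name="description" content="Shop organic dog food and treats at Acme Pet Supply. Free shipping on orders over $50. Order today.">
```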

  12. Does your site have a custom error page that returns the correct “status code”? Don’t greet users with the default “File not found” error page when they click through from a search engine results page to a page on your site that no longer exists. Offer a custom error page instead, with your logo and branding, navigation, site map, and search box. Important from an SEO standpoint: make sure that error page returns a “status code” of 404 in the HTTP header (or potentially a different 400- or 500-level status code, depending on the nature of the error), or 301 redirects to a URL that returns a 404. You can check this with a server header checker, such as this one. If you mistakenly send a 200 status code instead, the error page will likely end up in the index, and thus in the search results. This is discussed further under “Worst Practices.” No matter the reason for the page’s unavailability (e.g., discontinued product, site redesign, renamed file, server or database issues), you shouldn’t drive visitors away with an ugly error page that provides no path to your home page and other key areas of your site.
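
    On Apache, for example, a custom error page can be wired up as follows (the path is hypothetical). Note that when ErrorDocument points to a local path, Apache still returns the 404 status code; pointing it to a full URL instead triggers a redirect, and the error page then returns a 200:

```apache
# Branded "File not found" page; Apache still sends the 404 status code
ErrorDocument 404 /errors/not-found.html

# Avoid: ErrorDocument 404 http://www.example.com/not-found.html
# (a full URL causes a redirect, and the error page then returns 200)
```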

  13. Do your filenames and directory names include targeted keywords? Google engineer Matt Cutts has blogged that this is a useful “signal” to Google, so if it’s easy to do, why not? Separate keywords with hyphens, not with underscores. Avoid packing more than a few keywords into a filename or directory name, as it could look spammy to the search engines.

  14. Are you actively building links to your site? A steady stream of high-quality links doesn’t just “happen,” just as ongoing, consistently great media coverage doesn’t just “happen.” If it did, link builders and public relations pros would all be out of a job. The most basic starting point for link building is the authoritative directories, like the Yahoo! Directory and the Open Directory. Not only do high-quality directories improve your PageRank and consequently your rankings; they also drive direct click-through traffic. If you aren’t already listed in the Yahoo! Directory or Open Directory, identify the category most relevant to your business and submit your site. A listing in the Open Directory also ensures a listing in the (largely forgotten) Google Directory and numerous other directories powered by Open Directory data.

    Submitting to Yahoo’s directory costs $299, then $299 per year recurring (it’s free for noncommercial sites, though). Submitting to the Open Directory is free, but it has become practically impossible to get into, at least in the most appropriate category for your site, since the Open Directory’s owner (AOL) and its volunteer editors have left the directory semi-abandoned. Don’t waste your time and money submitting to hundreds of directories; just pick the most critical ones that are relevant to your business/industry and that Google would likely consider authoritative and trustworthy.

    For example, a business-to-business company may wish to submit to business.com and ThomasNet.com. Directories that primarily target webmasters and SEOs to sell them listings, rather than end users who would actually browse the directory, are most likely being devalued by Google and thus would be a waste of your time and money to submit to.

    What’s next after the directories? I can’t get into that or this already overly long article would quickly become a book! There’s an entire Search Engine Land column dedicated to this important topic: Link Week. Suffice it to say that within link building lies quite a spectrum of tactics, from the more basic like optimized press releases, article syndication, and guest blogging to the more advanced like consistently hitting the Digg.com front page with killer link bait. Diversify your link building tactics like you diversify your investment portfolio. Don’t just rely on one tactic.

Opinions expressed in the article are those of the guest author and not necessarily Search Engine Land.


About The Author: Stephan Spencer is the author of Google Power Search, creator of the Science of SEO, and co-author of The Art of SEO, now in its second edition, all published by O'Reilly. Spencer is also the founder of Netconcepts and inventor of the SEO technology platform GravityStream. He also blogs on his own site, Stephan Spencer's Scatterings.




  • http://www.twitter.com/ogletree ogletree

    Do we really need another “SEO best practices” article, and does it need to be on the front page of SEL? How many times has this article been written? Are we running out of things to say? How about we just not publish something, or not republish an old article, when we run out of things to say.

  • Matt McGee

    I think it would be a mistake to assume that all people reading SEL have been reading SEO articles for years. We tend to get recommended as a starting point when new people are entering the industry, and I think we also get recommended regularly to webmasters and business owners/staff outside of the search industry. And so our content should reflect that. Not every article is going to be targeted to every reader.

  • http://www.magikalhotels.com edrad80

    Excellent article. Having read hundreds over the last two months since the launch of our site, this is by far the most informative and clear article out there.

  • http://www.masterblade.net diamondtools

    Good tips in this article. I’ve read most of them before; my question is regarding the internal linking structure. What’s the deal with people using “nofollow” on their about and contact pages? Is that to try to sculpt PageRank internally, or to reduce the PR value of outbound links? I never really did understand that bit. Anyway, thanks for the list, and if you could shed any light on the internal linking it would be appreciated.

  • http://www.sendeasy.gr/ μετακομισεις

    Thanks for the tips! I would really like to clear up an issue in my mind about anchor text pointing to a website from another. I agree that using more than one phrase will divide the link relevance, giving each term less weight than a single term would receive, and that the prominence we give to two key phrases in the anchor text surely affects the weight on each. For example, I have the anchor text “moving companies” (one term), and I think of “transportation” as a good alternative keyword, so I decide to turn my anchor into “moving companies - transportation” (two terms). This way I divide the weight between two terms, giving more weight to the first. I slow the rate of growth for each keyword, but in the end I create brand awareness around two competitive keywords instead of just one. Now to the point: I’ve read a lot of articles stating that we should optimize a page around only one keyword. But in many cases there is more than one great choice, or in small markets one keyword is simply not enough. In addition, optimizing more than two to three pages of a site sounds pretty difficult and time-consuming. My conclusion is that the dominant page, that is, the home page, should be optimized for two keywords, and then one more page should be optimized for a third keyword. My question: while optimizing a page for two keywords, is it better to divide their weight within the same anchor, or to link to the page switching the anchor each time? Thanks, and sorry for the long analysis.

 
