Standards Compliance: Just Do it, Already

Dear coders: please write your HTML/XHTML code to be standards-compliant.

Yes, I know — we’ve been down this road before. Folks have tested compliant versus non-compliant code, and it doesn’t help with search rankings, so why bother, blah blah blah.

Just use the damned standards, OK? Standards compliance does impact rankings, albeit indirectly: Standards compliance leads to lean code, which leads to deeper crawls, which then leads to deeper indexation of your site.

Standards Compliance Means Deeper Crawls

Search bots come to your site with a ‘budget’ defined by the amount of time they’ll spend crawling your site. Don’t believe me? Look at this Google Webmaster Tools crawl stats report:

[Chart: Google Webmaster Tools crawl stats — pages crawled per day versus time spent downloading a page]

Compare the highlighted areas. When the time spent downloading a page goes down, the number of pages crawled goes up. And more pages crawled means a better shot at better rankings for more terms.

One way to decrease the time search bots spend downloading each page on your site? Reduce the amount of code on each page, so that there’s less to download.

And one way to do that? Standards compliance.

Why Standards Compliance Leads To Streamlined Code

Since I can’t establish a direct link between standards compliance and rankings, I sometimes give in to coders and say, “Fine, don’t use standards”. Inevitably, the resulting code is rife with:

  • Inline and embedded JavaScript;
  • Inline and embedded CSS;
  • Table-itis, where tables are used to position every element on the page;
  • Steam-powered attributes like font size=.

I could go on, but I’ll start grinding my teeth.
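Here’s a hypothetical “before” snippet — the markup and file names are invented for illustration, but every bad habit in it comes straight from the list above:

```html
<!-- Hypothetical bloated markup: table layout, font tags, inline CSS and JS -->
<table width="100%" border="0" cellpadding="5">
  <tr>
    <td bgcolor="#eeeeee">
      <font size="4" color="#333333">Welcome to our site</font>
    </td>
    <td style="text-align: right; padding-top: 10px;">
      <a href="#" onclick="window.open('promo.html'); return false;">
        <font size="2">Special offer!</font>
      </a>
    </td>
  </tr>
</table>
```

Every byte of that styling and scripting gets downloaded again on every single page view.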

When folks stick to standards, though, they tend to root out all the inline and embedded garbage. It’s a lot easier to write code that follows web standards if you can actually see it, and eliminating a few hundred lines of unnecessary stuff goes a long way toward readability. So inline JavaScript ends up in .js include files, where it should be. Tables are used for data. And things like font attributes tend to disappear.

Also, web standards push you to move as much of the stuff that controls a page’s appearance as possible into external .css files. So again, inline and embedded styles and tags that control appearance move out into external files.
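For contrast, here’s a sketch of the same content written the standards-compliant way, with appearance and behavior pushed out to external files (the file names and class names are illustrative):

```html
<!-- Same content, cleaned up: appearance lives in styles.css, behavior in site.js -->
<head>
  <link rel="stylesheet" href="/css/styles.css">
  <script src="/js/site.js"></script>
</head>
<body>
  <div id="header">
    <h1>Welcome to our site</h1>
    <a href="promo.html" class="promo-link">Special offer!</a>
  </div>
</body>
```

The .css and .js files get cached by the browser after the first page view, and the markup that’s left is the part the search engines actually care about.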

Removal of all that extra bloat leads to faster-loading, streamlined code. And it really does make life easier for the search engines. When a search spider visits your site, it ignores JavaScript and CSS that lives in include files. So the spiders spend less time on each page.

And it’s not a small difference. Even on small, simple sites, cleaner code can reduce the file size of a typical page by 30–40%.

You’ve Got Nothing To Lose

To see the ranking benefits, you don’t need 100% perfect, standards-compliant HTML/XHTML throughout your site. All you need to do is clean things up, use elements the way they’re meant to be used, and move everything that controls appearance to the stylesheet(s).

The real argument around web standards isn’t why — it’s why not. Even ignoring search engine optimization, even if you’ve got your eye on HTML 5, getting your site compliant with current standards will mean an easier upgrade later on. It will also simplify converting your site to a mobile layout, and give visitors the most consistent experience across browsers and platforms (note I didn’t say perfect – I said most consistent.)

Put in the time now, and start realizing the benefits to SEO, site performance and future site changes.

Opinions expressed in the article are those of the guest author and not necessarily Search Engine Land.



About The Author: Ian Lurie is Chief Marketing Curmudgeon and President at Portent, Inc., a firm he started in 1995. Portent is a full-service internet marketing company whose services include SEO, SEM and strategic consulting.




Comments
  • Michael Martinez

    There is much to be said for advocating standards-compliant code, but using one Website’s chart — and showing only an event, not a trend — doesn’t even begin to make a case.

    Your approach seems to have merit, but it lacks punch. You should be publishing far more data than this sole example in order to build a scientifically valid argument.

  • T Campbell

    I have to agree with Michael, which is a shame because I really like the article’s sass.

  • Webmaster T

    Ummm, tables aren’t standards-compliant? Using the font tag isn’t compliant? It was for HTML 4, and HTML 5 is at best a year away. If you knew the difference between a standard, an RFC and best practices, I might take you seriously.

    It is best practice to avoid the techniques you mentioned; it is not non-standards-compliant code. The current standard is 4, and HTML 5 is an RFC, not a standard. And in the end I can use any version of HTML I want, so long as the version used is indicated.

    I would also add that the gains from doing this would be milliseconds, so unless you have a very, very large site there would be no change to the indexing of the site.

  • Ian Lurie

    Good point, Michael.

    Of course, if I’d published 20 graphs and charts, someone would’ve complained that I overwhelmed them with data. :)

    Such is life when you write articles.

    I’ll write a follow-up that includes all of the data I’ve collected over the years.

    Webmaster T, if you code it right when your site is 100 pages, then you won’t have to do it when it grows to 1000 pages. I can’t tell you how many webmasters have told me one year “Our site’s too small for this to matter” and then the next year said “Oh, our site’s too big, we can’t take the time to fix it now”. Do it right from the start.

  • trashknob

    Great article Ian.

    After getting our ecommerce site properly coded, we saw a whopping 65% improvement in page load time.

    The funny thing is that it’s so easy to get things properly hand-coded cheaply and quickly with all the PSD to HTML/CSS service providers out there. We’ve used Dan over at with great success, for example.

  • adrianb

    Very well structured article Ian. Thank you. It just makes perfect sense to want to get a web page as well organised and streamlined as possible. Some web designers need to take a serious look at the way they ‘design’ web pages.

  • Ruth_OL

    Even if it had no effect on SEO it’s still worth doing for maintenance and accessibility reasons. Your boss wants all the text on your 1000-page site changing from grey to blue? With a separate CSS file, that takes seconds.

    Plus, with separate CSS and layout in tags, your users, including ones with physical impairments that may prevent them from using regular browsers, can display your site in a way that suits them, rather than cursing your name for making their life so difficult. Who do you think such users would be more likely to buy from, the site that tries to cater for their needs, or the one that excluded them by using sloppy code?

  • Ruth_OL

    That was supposed to be “layout in DIV tags”, but I put it in angle brackets and it got stripped out.

  • mrportman

    All decent developers are building with clean code already, so you’re kind of stating the obvious, but it is okay to deviate from standards sometimes.

    What’s your take on standards compliance for anyone building with HTML5? I think the ability to use more than one h1 element is a good thing for document structure (when used with the new header and hgroup elements), and it will validate.

    I wonder what Google’s doing in these situations.

  • mauricewalshe

    I should mention the quote:

    “Standards are great; that’s why we have so many.”

    Standards lead to streamlined code and better code? Hmm, “yeah, right”. In a Panglossian world this would be true; in reality you end up with sites that have huge numbers of included CSS files and multiple bloated JavaScript libraries where 90% of the code is never used.

    In some ways an old-school site with layout done in tables is much easier for someone to pick up the code and look at.

    I looked at one site that had 18 CSS files and 5 JavaScript ones: an absolute nightmare to work out wtf was going on, and as the site had been developed over time there was zero documentation. I felt bad giving it to a new developer to sort out.

    You can get the biggest speedup for a lot of sites by optimising the images in Photoshop, and don’t be fooled by the compression savings the Firebug tool suggests: PS is much better.

