• stuart mcmillan

    Daniel, I’d say if page performance is really tied to your database performance then you should seriously consider decoupling the two. A site where the database can so badly affect front-end performance will typically have scaling problems and won’t perform well under heavy load. Look at strategies where pages are generated on a regular basis (every minute, if you need to be that fresh) and the HTML is cached on the server. Essentially, one page request does the build and all subsequent requests get the benefit. Most pages don’t actually need to be truly dynamic; most have only small portions which need to be freshly generated. In those cases, use some JavaScript to pull in the fresh content post page-load.
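    A minimal sketch of that “one request builds, all subsequent requests benefit” pattern (names like `getPage` and the in-memory `Map` store are illustrative; a real deployment would more likely use a reverse proxy or file-based cache):

    ```javascript
    // Sketch of server-side HTML caching with a one-minute TTL.
    // The cache key is the URL; only the first request within the
    // window pays the cost of the database-backed render.
    const cache = new Map(); // url -> { html, builtAt }
    const TTL_MS = 60 * 1000; // regenerate at most once a minute

    function getPage(url, renderPage, now = Date.now()) {
      const entry = cache.get(url);
      if (entry && now - entry.builtAt < TTL_MS) {
        return entry.html; // cache hit: no database work at all
      }
      const html = renderPage(url); // expensive build hits the database
      cache.set(url, { html, builtAt: now });
      return html;
    }
    ```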

    A good example would be a global basket value/quantity indicator on an ecommerce website. This can be pulled in post-render, substituting a static “go to basket page” link with the dynamic content, giving a fallback if JS fails.
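    The basket example might look something like this (the `/api/basket` endpoint, the `basket-link` element id, and the injectable `doc`/`fetchFn` parameters are all hypothetical; they are just one way to wire up the substitution with a no-JS fallback):

    ```javascript
    // Progressive-enhancement sketch: after the page loads, replace the
    // static "go to basket" link text with the live item count. If JS or
    // the API fails, the static link remains as the fallback.
    async function enhanceBasketLink(doc = document, fetchFn = fetch) {
      const link = doc.getElementById('basket-link');
      if (!link) return;
      try {
        const res = await fetchFn('/api/basket');
        const { count } = await res.json();
        link.textContent = `Basket (${count})`;
      } catch {
        // Network or parsing failure: leave the static link untouched
      }
    }
    ```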

    Database optimisation *is* important, and indices are probably the most important part of that (although too many indices can actually hurt performance in some situations), but it’s best to remove the problem altogether. A middle ground is to use query caching to store commonly used result sets and save hitting the underlying tables every time.
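    The query-caching middle ground can be sketched like this (`runQuery`, the SQL string, and the unbounded in-memory `Map` are placeholders; a production setup would use the database’s own query cache or something like Redis with invalidation):

    ```javascript
    // Sketch of caching a commonly used result set so repeat queries
    // skip the underlying tables. Only the first caller pays the cost.
    const resultCache = new Map(); // sql -> rows

    function cachedQuery(sql, runQuery) {
      if (resultCache.has(sql)) {
        return resultCache.get(sql); // served from cache, no table access
      }
      const rows = runQuery(sql); // real database work happens once
      resultCache.set(sql, rows);
      return rows;
    }
    ```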

  • http://www.neverpaintagain.co.uk/ Never Paint Again UK

    These are actually really good tips, and a saving on site speed can be made simply by using point #2 above. I looked in my header and found verification keys for Google, Bing, Yahoo (!) and Pinterest. As I am running All in One SEO, I simply took the codes/keys, entered them into All in One SEO, and deleted the others. Now all these keys load in the footer, the Yahoo one is gone as it obviously wasn’t needed, and it has shaved a FULL SECOND off my home page load time. It’s that easy. Thanks for the tip. I hope Google rewards us in some way! (*waits patiently*) :)

  • Enrico Altavilla

    Fast sites have a LOT of advantages and it’s always a good idea to improve speed. Unfortunately, there is also a lot of misinformation about speed and rankings, and I think it’s worth laying out the full picture here.

    The author said “Google considers the page speed of the relevant matches, and delivers a ranking bonus to the speediest,” but this statement is false; in fact, it’s the opposite: faster sites do not get a bonus, it’s slower sites that are “demoted”.

    “You don’t get a boost for having a fast site. Sites that are outliers in terms of being slow will rank lower. All other things being equal, a site that’s too slow will rank lower.” — Matt Cutts


    I would like to give two suggestions:

    1) do not rely on patents when you want to understand what a search engine actually does with its rankings. Patents are not a way of telling people what happens; they are valuable assets produced to increase a company’s value, and they try to cover a large set of theoretical possibilities, not the facts.

    2) speed is important even if there is no ranking boost for faster sites. So focus on the quality of the user experience and keep improving performance!

  • dancristo

    Hey Enrico,

    Thanks for stopping by and sharing your thoughts.

    When I said, “Google considers the page speed of the relevant matches, and delivers a ranking bonus to the speediest,” it was within the context of what the referenced patent said.

    The exact patent wording is, “At query time, the search results adjusting engine receives search results responsive to the query. Each search result includes a resource locator referencing a resource and is associated with an initial score for the result. For each search result including a resource locator associated with load time data in the resource index, the search results adjusting engine computes a multiplier factor for the search result. Each multiplier factor is a measure of the effect the load time of the resource will have on the initial score of the search result and is dependent on a load time measure for the resource referred to by the search result. The search results adjusting engine applies the multiplier factor for each search result to the initial score for the search result (i.e., multiplies the initial score for the search result by the multiplier factor for the search result) to generate a load-adjusted score. The search results can then be ranked according to their load-adjusted scores. Alternatively, an additive factor for each result can be determined based on the load time measures. The additive factors can then be added to each initial score to generate the load-adjusted scores.”

    So the patent is clearly talking about page speed scores increasing or improving ranking scores; however, it’s not necessarily excluding a “dampening” effect, it’s just that the emphasis is more on the boost.

    That said, I wholeheartedly agree with your point about not taking patents as evidence of how Google currently works. Patent owners have the right to use all, some or none of their invention.

    Where I would caution you is in putting too much emphasis on what Matt Cutts says. The % of not provided is a great example of why we must take what he says with a grain of salt, and remember that what he says today:
    1) Can change tomorrow
    2) Many times is open to interpretation

    So treat each patent, Matt Cutts statement, industry survey, piece of third-party research, etc. as a single data point in the greater context of “this is how SEO works”.

  • dancristo

    Great advice.

    Clearly there is a lot of room for website and database optimization in the example I used.

    To your point though, just because you can speed up pages by tweaking the database doesn’t mean that the site is “fixed”. If a tweak like that needs to be done then there is likely a much larger architectural issue at play.

  • dancristo

    That’s great. A full second is no joke. Great job!