• stuart mcmillan

    Daniel, I’d say if page performance is really tied to your database performance then you should seriously consider decoupling the two. A site where the database can so badly affect front-end performance will typically have scaling problems and won’t hold up under heavy load. Look at strategies where pages are generated on a regular basis (every minute, if you need to be that fresh) and the HTML is cached on the server. Essentially, one page request does the build and all subsequent requests get the benefit. Most pages don’t actually need to be truly dynamic; they usually have only small portions that need to be freshly generated. In those cases, use some JavaScript to pull in the fresh content after the page loads.

    A good example would be a global basket value/quantity indicator on an ecommerce website. This can be pulled in post-render, substituting a static “go to basket page” link with the dynamic content, giving a fallback if JS fails.
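
    A minimal sketch of that pattern, assuming jQuery and a hypothetical /basket/summary endpoint returning JSON (the element id and response shape are illustrative):

    $(function() {
        // Once the cached page has rendered, fetch the live basket data.
        $.get("/basket/summary", function(data) {
            // e.g. data = { count: 3, total: "24.99" }; swap the static
            // fallback link text for the fresh values.
            $("#basket-link").text(data.count + " items (" + data.total + ")");
        });
    });

    If the request (or JavaScript itself) fails, the static “go to basket page” link simply stays in place.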

    Database optimisation *is* important, and indices are probably the most important part of it (although too many indices can actually hurt performance in some situations), but it’s best to remove the problem altogether. A middle ground would be to use query caching to store commonly used result sets, saving a hit on the underlying tables every time.
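
    A rough sketch of that middle ground in Node.js (the db client, cache key and TTL are illustrative assumptions, not a particular caching product):

    // Serve commonly used result sets from memory; only hit the
    // underlying tables once the cached copy has expired.
    const cache = new Map();

    async function cachedQuery(sql, ttlMs = 60 * 1000) {
        const hit = cache.get(sql);
        if (hit && Date.now() - hit.at < ttlMs) return hit.rows;
        const rows = await db.query(sql); // "db" stands in for your database client
        cache.set(sql, { rows, at: Date.now() });
        return rows;
    }

    In practice you would more likely lean on the database’s own query cache or a shared store such as Redis, but the principle is the same: common reads shouldn’t touch the tables every time.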

  • http://www.neverpaintagain.co.uk/ Never Paint Again UK

    These are actually really good tips, and savings on site speed can be made simply, using point #2 above. I looked in my header and had verification keys for Google, Bing, Yahoo (!) and Pinterest. As I am running All in One SEO, I simply took the codes/keys, entered them into All in One SEO and deleted the originals; now all these keys load in the footer, the Yahoo one is gone as it obviously wasn’t needed, and it has shaved a FULL SECOND off my home page load time. It’s that easy. Thanks for the tip. I hope Google rewards us in some way! (…….(*waits patiently)…. :)

  • Enrico Altavilla

    Fast sites have a LOT of advantages and it’s always a good idea to improve performance. Unfortunately, there is also a lot of misinformation about speed and rankings, and I think it’s worth laying out the full picture here.

    The author said “Google considers the page speed of the relevant matches, and delivers a ranking bonus to the speediest”, but this statement is false; in fact it’s exactly the opposite: faster sites do not get a bonus, it’s slower sites that are “demoted”.

    “You don’t get a boost for having a fast site. Sites that are outliers in terms of being slow will rank lower. All other things being equal, a site that’s too slow will rank lower.” — Matt Cutts

    Source:
    http://searchengineland.com/smx-advanced-conversation-with-matt-cutts-162925

    I would like to give two suggestions:

    1) Do not rely on patents when you want to understand what a search engine actually does with its rankings. Patents are not a way of telling people what happens; they are valuable assets produced to increase the value of a company, and they try to cover a large set of theoretical possibilities, not the facts.

    2) Speed is important even if there is no ranking boost for faster sites. So focus on the quality of the user experience and keep improving performance!

  • dancristo

    Hey Enrico,

    Thanks for stopping by and sharing your thoughts.

    When I said, “Google considers the page speed of the relevant matches, and delivers a ranking bonus to the speediest”, it was within the context of what the referenced patent said.

    The exact patent wording is, “At query time, the search results adjusting engine receives search results responsive to the query. Each search result includes a resource locator referencing a resource and is associated with an initial score for the result. For each search result including a resource locator associated with load time data in the resource index, the search results adjusting engine computes a multiplier factor for the search result. Each multiplier factor is a measure of the effect the load time of the resource will have on the initial score of the search result and is dependent on a load time measure for the resource referred to by the search result. The search results adjusting engine applies the multiplier factor for each search result to the initial score for the search result (i.e., multiplies the initial score for the search result by the multiplier factor for the search result) to generate a load-adjusted score. The search results can then be ranked according to their load-adjusted scores. Alternatively, an additive factor for each result can be determined based on the load time measures. The additive factors can then be added to each initial score to generate the load-adjusted scores.”

    So the patent is clearly talking about page speed increasing or improving ranking scores; however, it’s not necessarily excluding a “dampening” effect, it’s just that the emphasis is more on the boost.

    That said, I wholeheartedly agree with your point about not taking patents as evidence of how Google currently works. Patent owners have the right to use all, some or none of their invention.

    Where I would caution you is in putting too much emphasis on what Matt Cutts says. The percentage of “not provided” is a great example of why we must take what he says with a grain of salt, and remember that what he says today:
    1) Can change tomorrow
    2) Is often open to interpretation

    So treat each patent, Matt Cutts statement, industry survey, third-party study, etc. as a single data point in the greater context of “this is how SEO works”.

  • dancristo

    Great advice.

    Clearly there is a lot of room for website and database optimization in the example I used.

    To your point though, just because you can speed up pages by tweaking the database doesn’t mean that the site is “fixed”. If a tweak like that needs to be done, there is likely a much larger architectural issue at play.

  • dancristo

    That’s great. A full second is no joke. Great job!

  • http://www.neverpaintagain.co.uk/ Never Paint Again UK

    Thank you! I’m pleased with that too! I am looking to carry on with more of this. Currently it seems that Facebook Open Graph and Shareaholic are the two culprits still slowing it down, but maybe if I removed them too it would have a negative effect on the site?

  • stuart mcmillan

    So you’re saying that the site you were working on had/has major architectural issues? Optimising must have been an interesting challenge, but I guess there was real opportunity to make improvements. Transforming the architecture of an established site can be fraught with danger; was speed improvement one of the main goals of the project?

  • http://charismaworks.com.au/ davidbobis

    Thanks for the tips, Daniel. I hadn’t thought about optimising the database before. I also loved your parting advice to do it for the user. In the end, we need to make websites that users will love!

  • http://www.mikearnesen.com/ Mike Arnesen

    Great post! Time to make the web (even) faster! Bonus points for using Flula!

  • John Biundo

    Hi Daniel, I was intrigued by your statement “This time tracker sends device and page speed information to Google every time you visit a page within your Chrome browser.” I’m not aware of this “feature” of Chrome, and couldn’t find anything to substantiate it. I did a careful read of Google’s privacy pages and representations of what it tracks, and didn’t see any reference to this. Of course that doesn’t mean it’s not happening, but if you could provide a reference to something to substantiate this statement, that would be a *really* valuable piece of information to have in the community.

  • Jason Purcell

    Great post! I love seeing more people talk about the importance of site speed, both as related to search engine rankings and revenue. My only addition would be that you can make a HUGE impact on your payload size (and therefore site speed) just by enabling compression. It’s a relatively simple change – regardless of server type – that (in my experience at least) not a lot of people take the time to turn on.
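
    As one illustration, here is a minimal Node/Express sketch using the widely used compression middleware (on Apache the equivalent is mod_deflate; on nginx, the gzip directive):

    const express = require("express");
    const compression = require("compression"); // npm install compression

    const app = express();
    // Compress response bodies (above the default size threshold)
    // whenever the client sends Accept-Encoding: gzip.
    app.use(compression());

    app.get("/", function(req, res) {
        res.send("<html>...</html>"); // a typical HTML response
    });

    app.listen(3000);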

  • http://www.krishtechnolabs.com/ Krish TechnoLabs

    I would like to add one more tip: “If you notice that a particular third-party element consistently loads slowly on your website, consider removing it to boost your page load speed.”

  • http://www.binary-garden.com/ Frank Huebner (binary-garden)

    I like the topic and I like the post, but nevertheless here are some remarks.

    – In my opinion the most important page speed tool is not Google’s “PageSpeed Insights”, it’s gtmetrix.com. It gives you much more information than Google about the things that have to be done, and in many cases it delivers the solution right away (e.g. when it comes to minifying HTML/CSS/JS or compressing images). You can store your project and watch it getting better and better on a history timeline.

    – Slow websites are not always a matter of bad practice. If you are hosting your site on the cheapest plan your provider has to offer, chances are that gzip compression (among other things) is not supported via .htaccess.

    – If there are any third-party plugins that are putting on the brakes, try binding their execution to user-generated events like mousemove or scroll. jQuery provides the perfect function for that (“one” means the handler is executed only once):

    $(document).one("mousemove", function() {
        // initialise the slow third-party plugin here; this runs once,
        // on the first mousemove after page load
    });

    I once had a customer who wanted a carousel on his start page, showing only the first 4 pictures initially, with 44 more waiting (hidden) in the pipeline, and it worked great.
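
    A sketch of how that can look with the same pattern (the #carousel selector and data-src attributes are illustrative): the hidden images carry their real URL in data-src, so the browser doesn’t fetch them up front.

    $(document).one("mousemove scroll", function() {
        // On the first user interaction, swap the real URLs in so the
        // remaining carousel images start loading.
        $("#carousel img[data-src]").each(function() {
            $(this).attr("src", $(this).data("src")).removeAttr("data-src");
        });
    });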

    Try to be a little creative and you will soon find plenty of useful opportunities.

    (Please, can somebody bring back my beloved magazine “webtechniques”…?)