Breaking Down Page Speed Events For SEO Gain


We all know page speed is important to SEO, but how can we accurately — and quickly — assess whether our performance is good enough, and identify where the issues actually lie?

While Google’s PageSpeed tool is excellent for diagnosing page issues, site-wide testing requires more. A single performance snapshot also doesn’t cut it for a true assessment of your page load times; so what tools can we use to supercharge our diagnostics?

Additionally, it’s important to get clear on the value of changes in terms of SEO impact before allocating in-demand tech time to fixes. To give just one example: if your page returns in less than one second, what impact will fixing render-blocking CSS have on absolute page-load times?

PageSpeed Vs Time to First Byte


What Are We Measuring?

When we talk about page speed, we are really corralling together a number of key page-speed events, each important for different reasons. Here are some of the key events we should care about for SEO, in (roughly!) their order of load:

  • Time To First Byte (TTFB): An absolute measure of the performance of your server architecture (or any overlaying CDN) under load. Measures how long after the client page request is made the first byte of data takes to return.
  • DOM Load: This is the point at which all HTML content within the initial page HTML tag has been received from the server, along with assets like CSS and JavaScript. Usually this is also the point when a number of critical JavaScript events fire; regardless, the next step is the parsing of CSS and JavaScript by the user-agent (AKA the browser). Note: this load event may well be supplemented by AJAX or similar functionality delivering further HTML content after user action, so this is also sometimes referred to as the complete static load of an HTML page, in contrast to a “dynamic” page load time.
  • Critical Render Path: This is the point at which everything in the initial device viewport is rendered. You can also think of the viewport as everything “above the fold.” This is the point most usability folks are looking for with their sub-three second target for “page load.” In effect, they don’t care if content not in the viewport isn’t loaded, as the user can’t see it yet and therefore doesn’t have a perception that it’s “slow loading.” This is certainly a rational point, though the sub-three second threshold standard is dropping quickly these days and will be largely usurped by an “always faster” attitude.
  • Headless Browser Render: Essentially, this is the same as the full page render below, except no actual render is displayed. Similar to the question of whether a tree falling in a forest with no one around makes a sound, we can ask whether human eyes need to see the render output for the render to be complete. (Answers: not at the quantum level, and no.)
  • Full Page Render: The page is now fully loaded and all required rendering is complete. And, indeed, a human can clap eyes on the content and marvel at your Bootstrapped Geo-Cities styling choices.
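Most of these events are exposed in the browser by developer tooling, but TTFB is simple enough to sample from a script as well. Here is a minimal sketch in Python (the helper name is mine, and a real audit would repeat this over many requests, devices and connection types):

```python
import http.client
import time

def measure_ttfb(host: str, path: str = "/", port: int = 80,
                 timeout: float = 10.0) -> float:
    """Time from sending a GET request until the response status line
    and headers arrive back -- a close proxy for time to first byte."""
    conn = http.client.HTTPConnection(host, port, timeout=timeout)
    try:
        start = time.perf_counter()
        conn.request("GET", path)
        conn.getresponse()  # blocks until the server's first bytes return
        return time.perf_counter() - start
    finally:
        conn.close()
```

Note this measures from the moment the request is sent, so it excludes the DNS lookup and TCP handshake a cold client would also pay; browser tooling reports those phases separately.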
Make your sites cutting-edge with Geo-Cities Bootstrapped code!

Bear in mind, we can segment all of these events by device type. Typically for SEO, we care about desktop, tablet and mobile (really smartphones) categories, though you may want to further segment into Phablets, iPhone/Target Android Devices, Kindle Fire, etc., depending on your target market.

And of course, each event will record different average load-times over different connection conditions (for example, 3G vs. 4G vs. Wi-Fi TTFB load times for smartphones will be significantly different).

Further, bear in mind that we can also go wide when optimising page loads: using multiple parallel requests to load multiple assets at the same time. This has limits depending on the browser, so optimising for additional parallel connections based on user-agent requires either multiple domain hosts (via subdomains or asset-specific domains) or use of a framework like HeadJS.

Here’s how parallel loads look when under analysis:

Page Speed load times

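The win from going wide can be illustrated with a toy fetcher, where each simulated asset request simply sleeps for a fixed network latency (the asset names and the 200ms latency are invented for illustration):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fetch(asset: str, latency: float = 0.2) -> str:
    """Stand-in for an HTTP asset request: sleeps to mimic network latency."""
    time.sleep(latency)
    return asset

assets = ["app.css", "app.js", "logo.png", "hero.jpg"]

# Serial: four requests queue behind each other (~0.8s total).
start = time.perf_counter()
for asset in assets:
    fetch(asset)
serial_time = time.perf_counter() - start

# Parallel: four connections in flight at once (~0.2s total).
start = time.perf_counter()
with ThreadPoolExecutor(max_workers=len(assets)) as pool:
    list(pool.map(fetch, assets))
parallel_time = time.perf_counter() - start
```

The same principle is what sharding assets across hostnames exploits: more simultaneous connections, so total wall-clock time approaches the slowest single asset rather than the sum of all of them.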

Which Events Should I Care About For SEO?

Google cares about time to first byte (TTFB) and critical render path. Both are measured in Google’s day-to-day indexing, and Googlebot has used headless browser analysis for a long time to determine real page-load times.

We also know that from very early on, Google was aware that rendering a page was necessary to fight spam: using CSS to take blocks of heavily optimised text and shift them outside the viewport was a common black hat technique back in the day.

Firing JavaScript to make the user experience for human visitors different from that of Googlebot is also used to this day by spammers out to deceive search engines. (Their reasoning is a lot less sound these days, though, especially with over-optimisation penalties: you have to ask what the SEO benefit is of letting Google easily encounter your over-optimised content, then hiding it from users.)

SEL readers will also remember Google’s since-removed test SERP, which presented an image of the initial page render when hovering over a search result listing.

When you’re assessing thresholds for TTFB and critical render path, think about beating the average first, then getting as well-optimised as possible second. Why these two steps?

Improving page speed is often hard. You need milestones that are achievable early on in the process as you tune your website delivery engine to its peak performance capability. Beating what’s around you will give you an SEO boost. How? By delivering reduced SERP bounce behaviour.

Wait, What? SERP Bounce Behaviour?

A common belief in SEO forums is that Google uses Google Analytics data to determine if your page load times are fast enough or not.

This is, of course, wrong. Google does not use any Google Analytics data in its algorithm. Not only is there a clear technical reason not to — despite being popular, GA is still far from having sufficient dominance to be a fair global feedback tool for something as important as their search algorithm — but there are obvious legal and anti-trust reasons, as well.

However, this myth is perpetuated because many webmasters spot that when their page speed performance as recorded in GA gets significantly worse, they see ranking reductions soon after. So, how can this be the case?

Simply: Google knows at exactly what time you click on a link in its SERPs.

If you click a link and lose patience with a slow-loading site, what do you do? Jump back to the SERP (using your browser’s back button) and click the next result you think is most relevant. Google can see you’ve done that, as your session is still exposed through the referral URL. It also knows the time the last referral URL was passed, revealing the order of, and time between, clicks.

SERP Click Tracked


For a slow-loading site, this behaviour will scale across all results for which you have prominent listings. This sort of mass behaviour is exactly the sort of trend that will not only result in your individual results being demoted, but also your domain itself scoring badly for “quality” and incurring another poor performance filter.

So, ultimately, there are a number of different aspects of Google’s algorithm that are triggered by slow-loading sites, and tripping any one of them — from TTFB delays when being crawled to slow critical render paths impacting SERP bounce rates for your domain — will prevent you from achieving maximum SEO performance.

What’s The SEO Impact?

Answering this depends upon two variables: your domain performance context and the relative improvement in speed gained.

If you are a strongly visible site with high-traffic generic terms driving lots of traffic to your domain, a delay in page-load times of even a few milliseconds may be enough to tip user behaviour trends against your site in comparison with your competition. That trend will snowball, and you’ll see negative ranking creep in across the board, with the highest-traffic terms hit first.

That’s a major issue, and it needs to be resolved ASAP.

If you’re a brand-new site with few rankings to speak of and you suffer the same slowdown in page load times, you will want to address the issue, but you may well have more pressing SEO objectives right now, such as building good-quality content to develop quality inbound links. And you’d be right.

Practical Outcomes

You can always improve page speed, and larger businesses need to set some thresholds — so, I would offer up these standards to allow you to get technical work allocated against solving your page speed issues:

  • Set sub one-second critical render path as your absolute goal across all devices and all network paths. This is ambitious — but once attained, it will ensure excellent SEO performance against your competitors based on page load-time SEO impact.
  • Segment out your key user visitor paths, particularly for smartphone and tablet devices, and benchmark their current TTFB and critical render path. Aim to improve both by 50%. Benchmark again, rinse and repeat. This will be easier with critical path than TTFB.
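The benchmark-and-improve loop above can be scripted: sample an event repeatedly and track the median and worst case rather than a single snapshot. A sketch, where `measure` is any timing probe such as a TTFB check (the helper names are mine; the 50% target is the article’s):

```python
import statistics
from typing import Callable, Dict

def benchmark(measure: Callable[[], float], samples: int = 20) -> Dict[str, float]:
    """Run a timing probe repeatedly and summarise the distribution."""
    times = sorted(measure() for _ in range(samples))
    return {
        "median": statistics.median(times),
        "p90": times[int(0.9 * (len(times) - 1))],  # rough 90th percentile
        "worst": times[-1],
    }

def hit_target(before: Dict[str, float], after: Dict[str, float]) -> bool:
    """True when the median load time has improved by at least 50%."""
    return after["median"] <= before["median"] / 2
```

Comparing distributions rather than one-off readings is what makes the “benchmark again, rinse and repeat” step meaningful, since single samples vary wildly with network conditions.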

Generally speaking, there are plenty of tools out there offering general best-practice advice; using the above targets, schedule their recommendations for execution according to priority. Start with Google’s PageSpeed tool, but also consider setting up Selenium to programmatically survey your development server, so you can confirm you are at least matching existing site performance before pushing changes to the live server.


Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land. Staff authors are listed here.


About the author

Chris Liversidge
Contributor
Chris Liversidge has over twelve years web development experience & is the founder of QueryClick Search Marketing, a UK agency specialising in SEO, PPC and Conversion Rate Optimisation strategies that deliver industry-leading ROI.
