JavaScript rendering and indexing: Cautionary tales and how to avoid them

The results of a JavaScript rendering and indexing experiment highlight some challenges of running JS-dependent content.

I recently read Ziemek Bucko’s fascinating article, Rendering Queue: Google Needs 9X More Time To Crawl JS Than HTML, on the Onely blog.

Bucko described a test showing that Googlebot followed links on JavaScript-reliant pages significantly more slowly than links in plain HTML.

While it isn’t a good idea to rely on only one test like this, their experience matches up with my own. I have seen and supported many websites relying too much on JavaScript (JS) to function properly. I expect I’m not alone in that respect.

My experience is that JavaScript-only content can take longer to get indexed compared to plain HTML. 

I recall several instances of fielding phone calls and emails from frustrated clients asking why their stuff wasn’t showing up in search results. 

In all but one case, the problem appeared to stem from the pages being built on a JS-only or mostly-JS platform.

Before we go further, I want to clarify that this is not a “hit piece” on JavaScript. JS is a valuable tool. 

Like any tool, however, it’s best used for tasks other tools cannot do. I’m not against JS. I’m against using it where it doesn’t make sense.

But there are other reasons to consider judiciously using JS instead of relying on it for everything. 

Here are some tales from my experience to illustrate some of them.

1. Text? What text?!

A site I supported was relaunched with an all-new design on a platform that relied heavily on JavaScript. 

Within a week of the new site going live, organic search traffic plummeted to near zero, causing an understandable panic among the clients.

A quick investigation revealed that besides the site being considerably slower (see the next tales), Google’s live page test showed the pages to be blank. 

My team did an evaluation and surmised that it would take Google some time to render the pages. After 2-3 more weeks, though, it was apparent that something else was going on. 

I met with the site’s lead developer to puzzle through what was happening. As part of our conversation, they shared their screen to show me what was happening on the back end. 

That’s when the “aha!” moment hit. As the developer stepped through the code line by line in their console, I noticed that each page’s text was loading outside the viewport using a line of CSS but was pulled into the visible frame by some JS. 

This was intended to make for a fun animation effect where the text content “slid” into view. However, because the page rendered so slowly in the browser, the text was already in view when the page’s content was finally displayed. 

The actual slide-in effect was never visible to users. My guess was that Google couldn't process the slide-in effect and therefore never saw the content.
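To illustrate, here is a minimal, hypothetical reconstruction of the pattern described above (the class names and timing are my own, not the client's actual code): the text starts outside the viewport via CSS, and a script is responsible for pulling it into view.

```html
<!-- Hypothetical sketch of the slide-in pattern, not the client's code -->
<style>
  /* The text starts offscreen, to the left of the viewport... */
  .slide-in { transform: translateX(-100vw); transition: transform 0.6s; }
  /* ...and only this class, added by JavaScript, brings it into view */
  .slide-in.visible { transform: translateX(0); }
</style>

<p class="slide-in">Page copy that only appears after the script runs.</p>

<script>
  // If this script is slow to load, fails, or isn't executed by a
  // crawler, the paragraph above stays outside the viewport.
  document.querySelectorAll(".slide-in")
    .forEach((el) => el.classList.add("visible"));
</script>
```

The fragile part is that the content's visibility depends entirely on the script executing; a pure-CSS animation that starts from a visible state would have degraded more gracefully.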

Once that effect was removed and the site was recrawled, the traffic numbers started to recover.

2. It’s just too slow

This could be several tales, but I'll summarize them in one. JS frameworks like AngularJS and React are fantastic for rapidly developing applications, including websites.

They are well-suited for sites needing dynamic content. The challenge comes in when websites have a lot of static content that is dynamically driven. 

Several pages on one website I evaluated scored very low in Google’s PageSpeed Insights (PSI) tool. 

As I dug into it using the Coverage report in Chrome’s Developer Tools across those pages, I found that 90% of the downloaded JavaScript wasn’t used, accounting for over 1MB of code. 

Viewed from the Core Web Vitals side, that accounted for nearly 8 seconds of blocking time, because all of that code had to be downloaded and run in the browser.
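One common remedy for shipping code that most pages never run is to defer expensive work until it is actually needed. This sketch (with a hypothetical chart library standing in for a real heavy dependency) shows the lazy-initialization idea behind code splitting with dynamic `import()`:

```javascript
// Minimal sketch: build a value only on first use. This is the same
// idea behind splitting a bundle and loading pieces with import().
function lazy(factory) {
  let value;
  let loaded = false;
  return function get() {
    if (!loaded) {
      value = factory(); // in a real app: await import("./chart.js")
      loaded = true;
    }
    return value;
  };
}

// Hypothetical heavy dependency that most pages never use
const getChartLib = lazy(() => ({
  draw: (points) => `chart with ${points.length} points`,
}));

// Nothing is built until a page actually needs the chart
console.log(getChartLib().draw([1, 2, 3]));
```

Pages that never call `getChartLib()` never pay for the dependency, which is exactly the unused 90% the Coverage report was flagging.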

When I talked to the development team, they pointed out that front-loading all the JavaScript and CSS the site would ever need would make subsequent page visits that much faster for visitors, since the code would already be in the browser's cache.

While the former developer in me agreed with that concept, the SEO in me could not accept how Google’s apparent negative perception of the site’s user experience was likely to degrade traffic from organic search. 

Unfortunately, in my experience, SEO often loses out to a lack of desire to change things once they have been launched.

3. This is the slowest site ever!

Similar to the previous tale comes a site I recently reviewed that scored zero on Google’s PSI. Up to that time, I’d never seen a zero score before. Lots of twos, threes and a one, but never a zero.

I’ll give you three guesses about what happened to that site’s traffic and conversions, and the first two don’t count!

Sometimes, it’s more than just JavaScript

To be fair, excessive CSS, images that are far larger than needed, and autoplay video backgrounds can also slow download times and cause indexing issues.

I wrote a bit about those in two previous articles.

For example, in my second tale, the sites involved also tended to have excessive CSS that was not used on most pages.

So, what is an SEO to do in these situations?

Solutions to problems like this involve close collaboration between SEO, development, and client or other business teams. 

Building a coalition can be delicate and involves giving and taking. As an SEO practitioner, you must work out where compromises can and cannot be made and move accordingly. 

Start from the beginning

It’s best to build SEO into a website from the start. Once a site is launched, changing or updating it to meet SEO requirements is much more complicated and expensive.

Work to get involved in the website development process at the very beginning when requirements, specifications, and business goals are set. 

Try to get search engine bots included as user stories early in the process, so teams can understand their unique quirks and help get content spidered and indexed quickly and efficiently.

Be a teacher

Part of the process is education. Developer teams often need to be informed about the importance of SEO, so you need to tell them. 

Put your ego aside and try to see things from the other teams’ perspectives. 

Help them learn the importance of implementing SEO best practices while understanding their needs and finding a good balance between them. 

Sometimes it's helpful to hold a lunch-and-learn session and bring some food. Sharing a meal during discussions helps break down walls, and it doesn't hurt as a bit of a bribe either.

Some of the most productive discussions I’ve had with developer teams have been over a few slices of pizza.

For existing sites, get creative

You’ll have to get more creative if a site has already launched. 

Frequently, the developer teams have moved on to other projects and may not have time to circle back and “fix” things that are working according to the requirements they received. 

There is also a good chance that clients or business owners will not want to invest more money in another website project. This is especially true if the website in question was recently launched.

One possible solution is server-side rendering. This offloads the client-side work and can speed things up significantly. 

A variation of this is combining server-side rendering with caching of the plain HTML output. This can be an effective solution for static or semi-static content.

It also saves a lot of overhead on the server side because pages are rendered only when changes are made or on a regular schedule instead of each time the content is requested.
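As a rough illustration of that caching variant (the function names here are hypothetical; a real site would render with its own templates or a framework's server renderer), the server builds each page's HTML once and reuses it until the content changes:

```javascript
// Minimal sketch: render each page's HTML once, then serve the
// cached copy until the underlying content changes.
const htmlCache = new Map();

// Hypothetical renderer; a real site would run its templates or
// framework rendering here instead.
function renderPage(slug) {
  return `<main><h1>${slug}</h1></main>`;
}

function getPage(slug) {
  if (!htmlCache.has(slug)) {
    htmlCache.set(slug, renderPage(slug)); // expensive work happens once
  }
  return htmlCache.get(slug); // later requests are a cheap lookup
}

// Call this when an editor updates a page, so it re-renders
// on the next request (or run it on a schedule).
function invalidate(slug) {
  htmlCache.delete(slug);
}
```

The browser (and the crawler) receives ready-made HTML either way; the cache just determines how often the rendering cost is paid.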

Other alternatives that can help but may not totally solve speed challenges are minification and compression. 

Minification strips unnecessary whitespace, line breaks and comments, making files smaller. GZIP compression can then be applied to the JS and CSS files as they are downloaded.

Minification and compression don’t resolve blocking time challenges. But, at least they reduce the time needed to pull down the files themselves.

Google and JavaScript indexing: What gives?

For a long time, I believed that at least part of the reason Google was slower in indexing JS content was the higher cost of processing it. 

It seemed logical based on the way I’ve heard this described: 

  • A first pass grabbed all the plain text.
  • A second pass was needed to grab, process, and render JS.

I surmised that the second step would require more bandwidth and processing time.

I asked Google’s John Mueller on Twitter if this was a fair assumption, and he gave an interesting answer. 

From what he sees, JS pages are not a huge cost factor. What is expensive in Google’s eyes is respidering pages that are never updated. 

In the end, the most important factor to them was the relevance and usefulness of the content.


Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land.


About the author

Elmer Boutin
Contributor
Elmer Boutin is the Director of Search Engine Optimization at Rocket Central – a Detroit-based company that provides services to Rocket Companies (NYSE: RKT) – where he supports various websites across the Rocket Companies platform. Following a career in the U.S. Army as a translator and intelligence analyst, he has worked in digital marketing for over 25 years doing everything from coding and optimizing websites to managing online reputation management efforts as an independent contractor, corporate webmaster and in agency settings. He has vast experience and expertise working for businesses of all sizes from SMBs to Fortune 5-sized corporations including PFS, Displays Fine Art Services, Banfield Pet Hospital, Corner Bakery Cafe, Ford Motor Company, Kroger, Mars Corporation, and Valvoline; optimizing websites focusing on local, e-commerce, informational, educational and international.
