Common JavaScript errors that can harm your SEO [Video]

And, why JavaScript may not be the real cause of your crawling issues.


“Often, I see JavaScript being blamed when the problem is something else,” said Martin Splitt, search developer advocate at Google, describing the underlying issue behind many of the site errors he comes across. During our crawling and indexing session of Live with Search Engine Land, Splitt discussed the most common JavaScript-related issues that can hurt a site’s SEO, as well as some ways to avoid them.

One popular misconception is that JavaScript simply does not work well for search engines. “Well, you could [have JavaScript work well for search engines] if your JavaScript wouldn’t be roboted, so we [Google] can’t access your JavaScript,” Splitt said. When external JavaScript files are part of a page, some SEOs and site owners use their robots.txt file to block Google from accessing that code, unaware of the consequence: it doesn’t break functionality for users, but it does prevent search engines from fetching the JavaScript they need to render the page.
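
As a hypothetical illustration (the directory path below is made up), a robots.txt rule like the first one hides the very scripts Googlebot needs to render the page, while explicitly allowing that directory keeps those rendering resources crawlable:

    # Problematic: blocks crawlers from the scripts used to render the page
    User-agent: Googlebot
    Disallow: /assets/js/

    # Fix: let crawlers fetch the JavaScript used to render the page
    User-agent: Googlebot
    Allow: /assets/js/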

In contrast, “We do see people breaking websites for users, rather than for search engines,” said Splitt. These sites are indexable, but they don’t provide a good user experience because they may send abnormally large amounts of data just to load a simple list of products, for example.
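
As a minimal sketch of the leaner approach (the endpoint and parameters below are illustrative, not from the session), the page could request only the products needed for the current view instead of pulling down the entire catalog:

    // Hypothetical example: fetch one small page of products for the current view
    // rather than the full catalog. Endpoint and parameters are made up.
    async function loadProductList(page = 1) {
      const response = await fetch(`/api/products?page=${page}&per_page=20`);
      if (!response.ok) throw new Error(`Request failed: ${response.status}`);
      return response.json();
    }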

“Another thing that I see relatively often is that people rely on JavaScript to do things that you can do without JavaScript,” Splitt said, adding, “That’s not something that you need to inherently be careful about, it’s just something that I think is pointless.” Splitt’s go-to example of unnecessary JavaScript is using it in place of the standard HTML link. That can cause problems because Googlebot does not click or otherwise interact with such features, so it may skip over links that exist only as JavaScript behavior.
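
For illustration (the URL and function name are hypothetical), the first element below gives Googlebot no crawlable URL, while the second is a standard link it can discover and follow:

    <!-- Fragile: navigation happens only through a JavaScript click handler -->
    <span onclick="goToProducts()">Products</span>

    <!-- Robust: a standard HTML link crawlers can discover and follow -->
    <a href="/products">Products</a>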

Why we care. When implemented conscientiously, JavaScript can enhance your site with interactive features or web applications, for example, without sacrificing your organic visibility. Even so, site owners should opt for simple, reliable techniques over more complex solutions when possible, as complicated workarounds may lead to crawling issues down the road.



About the author

George Nguyen
Contributor
George Nguyen is the Director of SEO Editorial at Wix, where he manages the Wix SEO Learning Hub. His career is focused on disseminating best practices and reducing misinformation in search. George formerly served as an editor for Search Engine Land, covering organic and paid search.
