Common oversights that can impede Google from crawling your content [Video]

SEOs and developers shouldn’t over-engineer workarounds when basic, reliable solutions already exist, says Martin Splitt in this clip from Live with Search Engine Land.


“I don’t know why people are reinventing the wheel,” said Martin Splitt, search developer advocate for Google, during our crawling and indexing session of Live with Search Engine Land. As more and more techniques are developed to provide SEOs and webmasters with flexible solutions to existing problems, Splitt worries that relying on these workarounds, instead of sticking to the basics, can end up hurting a site’s organic visibility.

“We have a working mechanism to do links . . . so, why are we trying to recreate something worse than what we already have built-in?” Splitt said, expressing frustration over how some developers and SEOs are diverging from the standard HTML link in favor of fancier solutions, such as using buttons as links and forsaking the href attribute for onclick handlers. These techniques may create problems for web crawlers, which increases the likelihood that those crawlers skip your links.
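The standard `<a href>` element is what makes a link discoverable in the first place. As a rough illustration of why (the markup below is hypothetical, and real crawlers are far more sophisticated), a parser that harvests URLs the way a crawler discovers them only sees the anchor tag's `href` — the button and `onclick` variants yield nothing:

```python
from html.parser import HTMLParser

# Three ways sites build "links"; only the first is a real HTML link.
SNIPPET = """
<a href="/pricing">Pricing</a>
<button onclick="window.location='/about'">About</button>
<span class="link" onclick="navigate('/contact')">Contact</span>
"""

class LinkExtractor(HTMLParser):
    """Collects hrefs from <a> tags, roughly how a crawler discovers URLs."""
    def __init__(self):
        super().__init__()
        self.urls = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.urls.append(href)

parser = LinkExtractor()
parser.feed(SNIPPET)
print(parser.urls)  # only '/pricing' is discoverable
```

The `/about` and `/contact` destinations exist only inside JavaScript strings, which is exactly the "something worse" Splitt describes: the navigation works for users but leaves no standard trail for a crawler to follow.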

Another common issue arises when SEOs and developers use the robots.txt file to block search engines from the API endpoints or JavaScript files their pages depend on, while still expecting the web crawler to render the content those resources supply. “When you block us from loading that, we don’t see any of your content, so your website, as far as we know, is blank,” Splitt said, adding, “And I wouldn’t know why, as a search engine, I would keep a blank website in my index.”
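The failure mode can be sketched with Python's standard-library robots.txt parser (the rules and URLs below are hypothetical): the HTML shell is allowed, but the endpoint that fills it with content is not, so the rendered page is effectively blank to the crawler.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that blocks the API the page's JavaScript relies on.
ROBOTS_TXT = """\
User-agent: *
Disallow: /api/
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# The HTML shell itself is crawlable...
shell_ok = rp.can_fetch("Googlebot", "https://example.com/products")
# ...but the endpoint that supplies its content is not.
api_ok = rp.can_fetch("Googlebot", "https://example.com/api/products")

print(shell_ok, api_ok)  # True False
```

Whether a renderer can fetch a page's supporting resources is the thing to verify before shipping a robots.txt rule, not after rankings drop.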

Why we care. “Oftentimes, people are facing a relatively simple problem and then over-engineer a solution that seems to work, but then actually fails in certain cases and these cases usually involve crawlers,” Splitt said. When simple, widely accepted techniques already exist, site owners should opt for those solutions to ensure that their pages can get crawled and subsequently indexed and ranked. The more complex a workaround is, the higher the chances are that the technique will lead to unforeseen problems down the road.

Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land.


About the author

George Nguyen
Contributor
George Nguyen is the Director of SEO Editorial at Wix, where he manages the Wix SEO Learning Hub. His career is focused on disseminating best practices and reducing misinformation in search. George formerly served as an editor for Search Engine Land, covering organic and paid search.
