Consumers over bots: Why site crawlers don’t hold all the answers
Site crawlers are valuable tools in the SEO toolbox, but columnist David Freeman warns that search marketers should not use them as a replacement for manual analysis.
Advances in technology have driven efficiency in SEO, and site crawlers such as Botify, DeepCrawl and Screaming Frog have flourished as a result.
These tools are an essential part of the SEO toolbox and are great at uncovering and visualizing technical issues such as broken links, 404 errors and invalid canonical tags. They are becoming the default source of technical performance analysis, which means SEOs spend less time interacting with, and analyzing, websites in a browser and/or site analytics.
On the surface, this doesn’t look like anything to be concerned about; these tools deliver vast amounts of technical analysis at speed.
However, these tools are bots — they analyze the site’s source code looking for identifiable issues against an audit checklist which, while useful, won’t necessarily correspond to the issues consumers face.
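To make the checklist idea concrete, here is a minimal, illustrative sketch of how a crawler-style audit inspects raw HTML for rule violations. This is not how Botify, DeepCrawl or Screaming Frog are actually implemented; the `AuditParser` class, `audit` function and the three rules checked are hypothetical examples chosen for brevity.

```python
# Illustrative sketch only: parse a page's raw HTML source and flag
# violations from a small, fixed audit checklist. The rules below
# (canonical tag, title tag, outbound links) are example checks, not
# any commercial crawler's real rule set.
from html.parser import HTMLParser


class AuditParser(HTMLParser):
    """Collects the elements our example checklist cares about."""

    def __init__(self):
        super().__init__()
        self.canonical = None   # href of <link rel="canonical">, if any
        self.has_title = False  # whether a <title> tag was seen
        self.links = []         # hrefs of all <a> tags

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonical = attrs.get("href")
        elif tag == "title":
            self.has_title = True
        elif tag == "a" and "href" in attrs:
            self.links.append(attrs["href"])


def audit(html: str) -> list[str]:
    """Return a list of checklist issues found in the page source."""
    parser = AuditParser()
    parser.feed(html)

    issues = []
    if parser.canonical is None:
        issues.append("missing canonical tag")
    if not parser.has_title:
        issues.append("missing title tag")
    if not parser.links:
        issues.append("no links found")
    return issues
```

A bot applying rules like these will reliably catch every missing canonical tag at scale, but nothing in the checklist tells it whether the page is slow, the content is irrelevant, or the checkout form is painful to complete on a phone.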
Search results are centered around the consumer
Studies from SEMrush and Searchmetrics both cite user signals and the consumer experience, including mobile-friendliness, content relevancy, site speed, bounce rate/search sequence, time on site and content format, as key ranking factors.
However, with site crawlers becoming the default for website analysis and reducing the time that SEOs spend analyzing live websites, the consumer experience is being neglected, leaving untapped opportunities to improve performance.
Additionally, as of last November, web access via mobile devices overtook desktop for the first time. This further disconnects site crawlers from consumer behavior: while site crawlers are catching up, they predominantly still default to desktop analysis.
With this in mind, it is critical to analyze and diagnose websites the same way consumers interact with them, in addition to bot-based crawling.
Prioritize user signals and consumer experience
A consumer-first approach is essential to succeed in the search results, now and in the future. To deliver this, we need to put ourselves in the shoes of the consumer and interact with websites, as well as analyze site analytics and Google Search Console, rather than purely relying on site crawlers.
Interaction with, and deep analysis of, a website helps us understand the challenges that consumers face across devices, from their initial search through to conversion. Site analytics and Google Search Console, meanwhile, help uncover the issues consumers encounter once they arrive.
For instance, is the consumer journey optimized across mobile and desktop? Are consumers’ content needs considered across devices? Is this content experience optimized for the device? Are forms simple to fill out and submit?
Interacting with websites as a consumer and crawling them as a bot can never replace each other; each provides unique and highly relevant insight that allows brands to maximize their potential within organic search. To realize that potential, it is essential that neither method is neglected.
Site crawlers continue to make our lives significantly easier by ensuring we can efficiently identify technical site issues at scale. However, with user signals and consumer experience playing an ever more important role in driving organic search performance, it is essential that we do not solely rely on site crawlers.
It is critical that we put ourselves in the shoes of consumers and walk through their journey, understand their needs and remove frustrating usability barriers on websites. From here, strategic consultancy around the informational needs of the consumer at each stage of the purchase funnel is essential to increase relevancy and drive conversions.
Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land.