Bing says it is improving web crawler efficiency

Bing is working to make sure its crawler doesn't miss new content while also not overloading your web servers.



Fabrice Canel, principal program manager for Bing Webmaster Tools, provided an update on his team’s efforts to improve the efficiency of their web crawler, BingBot.

Responding to user feedback. The update is a follow-up to his talk at SMX Advanced in June, during which he announced an 18-month effort to improve BingBot. Canel asked the audience to submit suggestions and feedback.

In a blog post Tuesday, Canel said the team has made numerous improvements based on this feedback and thanked the SMX audience for its contributions. He said the team is “continuing to improve” the crawler and will share what it has done in a new “BingBot series” on the Bing webmaster blog.

BingBot’s goal. In this first post, Canel outlined the goal for BingBot, which is to use an algorithm to determine “which sites to crawl, how often, and how many pages to fetch from each site.” To ensure sites’ servers aren’t overloaded, BingBot aims to limit its “crawl footprint” on a site while keeping the content in its index as fresh as possible.

This “crawl efficiency” is the balance Bing is working to strike at scale. Canel said, “We’ve heard concerns that bingbot doesn’t crawl frequently enough and their content isn’t fresh within the index; while at the same time we’ve heard that bingbot crawls too often causing constraints on the websites resources.” It’s a work in progress.
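To make that trade-off concrete, here is a minimal, hypothetical sketch (in Python) of the kind of adaptive politeness logic a crawler might use: revisit a page sooner when its content keeps changing, and back off when the server signals strain. This is an illustration of the concept only, not Bing's actual BingBot code; all names and thresholds are assumptions.

```python
# Illustrative sketch of an adaptive "crawl efficiency" throttle.
# Not BingBot's real implementation -- delays and rules are hypothetical.

import time
import urllib.request
from urllib.error import HTTPError, URLError


class PoliteCrawler:
    def __init__(self, base_delay=5.0, min_delay=1.0, max_delay=300.0):
        self.delay = base_delay      # seconds to wait between fetches to one host
        self.min_delay = min_delay
        self.max_delay = max_delay
        self.last_body = None

    def fetch(self, url):
        """Fetch a URL and adapt the per-host delay based on the outcome."""
        time.sleep(self.delay)
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                body = resp.read()
        except HTTPError as err:
            if err.code in (429, 503):
                # Server is overloaded or rate-limiting: back off sharply.
                self.delay = min(self.delay * 2, self.max_delay)
            return None
        except URLError:
            # Network trouble: also back off.
            self.delay = min(self.delay * 2, self.max_delay)
            return None

        if body != self.last_body:
            # Content changed since the last crawl: revisit sooner to stay fresh.
            self.delay = max(self.delay * 0.5, self.min_delay)
        else:
            # Nothing new: stretch the interval to reduce the crawl footprint.
            self.delay = min(self.delay * 1.5, self.max_delay)

        self.last_body = body
        return body


if __name__ == "__main__":
    crawler = PoliteCrawler()
    for _ in range(3):
        crawler.fetch("https://example.com/")
        print(f"next fetch in {crawler.delay:.1f}s")
```

In practice, a real search engine balances these signals across billions of URLs and many hosts at once, which is exactly the scale problem Canel describes.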

Why should you care? Bing is clearly listening to the webmaster and SEO community. The Webmaster Tools team is making changes to ensure its crawler does not overload your servers while, at the same time, becoming faster and more efficient at finding new content on your web site. Bing is actively working on this and says it will continue to do so.

How does this impact you? If you add new content to your web site and Bing doesn’t see it, it won’t rank it. That means searchers using Bing will not find your new content.

Recently, Bing shut down the anonymous submit URL tool, and we have seen reports that Bing is not acting on submit URL requests even in Bing Webmaster Tools. It is possible the tweaks and changes Bing is making are causing some of this slowness with crawling and indexing. But ultimately, Bing is clearly working on the issue.

Canel will be speaking at SMX East in New York City next week. See the full agenda here.


About the author

Barry Schwartz
Staff
Barry Schwartz is a Contributing Editor to Search Engine Land and a member of the programming team for SMX events. He owns RustyBrick, a NY-based web consulting firm. He also runs Search Engine Roundtable, a popular search blog on very advanced SEM topics. Barry can be followed on Twitter here.
