Google Can Now Execute AJAX & JavaScript For Indexing

This morning we reported that Facebook comments are being indexed by Google. Google’s Matt Cutts just confirmed on Twitter that Google is now able to “execute AJAX/JS to index some dynamic comments.”

This gives Google’s spider, GoogleBot, the ability to read comments that are loaded dynamically via AJAX or JavaScript, such as Facebook comments, Disqus comments and others. More broadly, it means Google is better at seeing the content behind more of your JavaScript or AJAX.
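To illustrate why this matters: comment widgets of this kind typically fetch their data as JSON after the initial page load and build the markup in client-side script, so none of it appears in the HTML the server sends. A minimal sketch of that pattern (the endpoint and field names here are hypothetical, not Facebook's or Disqus's actual API):

```javascript
// Hypothetical JSON payload a comment widget might fetch after page load,
// e.g. via XMLHttpRequest from something like /comments?thread=123.
// None of the HTML produced below exists in the raw page source.
function renderComments(comments) {
  return comments
    .map(function (c) {
      return '<div class="comment"><b>' + c.author + '</b>: ' + c.text + '</div>';
    })
    .join('\n');
}

// A crawler that only reads the server's HTML sees an empty container;
// one that executes JavaScript sees the markup this function produces.
var html = renderComments([
  { author: 'Alice', text: 'First!' },
  { author: 'Bob', text: 'Nice post.' }
]);
```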

Postscript: Google now has an official blog post up with more details.



About The Author: Barry is Search Engine Land's News Editor and owns RustyBrick, a NY based web consulting firm. He also runs Search Engine Roundtable, a popular search blog on very advanced SEM topics. Barry's personal blog is named Cartoon Barry and he can be followed on Twitter.






  • T.A.

    The timing on this is great. An article came out of Distilled’s SearchLove conference today which posits that GoogleBot is actually based on Chrome.

    If you’re interested, you can check it out here:

  • Scott Bartell

    Does this mean Google can now index Facebook comments?

  • timacheson

    It’s about time.

    It’s a search engine’s job to index content on the web!

    Why has the corporation taken this long to deal with Facebook comments?

    I welcomed this wholeheartedly when I saw the first report.

    However, it’s manifestly lamentable that Google has taken this long to react to Ajax, a web technology introduced by Microsoft a decade ago and showcased by IE6.

    This is symptomatic of a web search monopoly that has been resting on its laurels. Google has neglected to innovate and has been left behind.

    Not long ago, Google was proposing a whole new framework to allow the indexing of Ajax-type content, which the corporation expected all affected websites to implement in order to make Google’s job easier! As I pointed out at the time on my blog and in comments on Google’s announcement, this would have been a bad solution. Here’s an extract from my own comment, posted at the time:

    “The role of Google’s search engine is to facilitate search. That means working with the web as it is, not trying to change it. Yes, it’s a challenge to index Ajax; but it’s not impossible. Yes, it will require work by Google. Guys, that’s your job. The next-generation of indexing wizardry needs to be able to do this without every website implementing something that Google has come up with. I wish Google would stop trying to tell us how to create the web.”
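    [For context, the framework referred to above is Google’s “crawlable AJAX” proposal, under which a “#!” (hashbang) URL is rewritten by the crawler into an `_escaped_fragment_` query parameter, and the site is expected to serve a static HTML snapshot at the rewritten URL. A rough sketch of that rewrite, for illustration:]

```javascript
// Sketch of the URL mapping in Google's proposed AJAX crawling scheme:
// the fragment after "#!" is moved into an _escaped_fragment_ query
// parameter, which the site answers with a pre-rendered HTML snapshot.
function toEscapedFragmentUrl(url) {
  var i = url.indexOf('#!');
  if (i === -1) return url; // not a crawlable-AJAX URL under the scheme
  var base = url.slice(0, i);
  var fragment = url.slice(i + 2);
  var sep = base.indexOf('?') === -1 ? '?' : '&';
  return base + sep + '_escaped_fragment_=' + encodeURIComponent(fragment);
}
```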

  • Ashley Sheridan

    @timacheson it’s not as simple as you make it out to be. Ajax is just the method used to grab the page data, but often it’s triggered by something that only a user can do. You can’t seriously expect a search engine crawler to try what a click would do on every button or object on your page just to see what Ajax calls result.

    As an example, consider a typical Facebook page. Often comments are hidden beneath a ‘read more’ link that makes an Ajax request for the rest of the comments. Without context (and a search bot shouldn’t be context specific, it should be generic enough to look at content across any web page, not just Facebook) the bot has to traverse everything that can be interacted with. That’s not always easy, especially as more and more people move event handlers out of the HTML and into external Javascript files using frameworks.

    What about other events, such as scripts looking for a specific keyboard key being depressed? Without some hefty code behind the search bot, it would never ‘know’ to look at Ajax calls being made from a trigger such as this, and any such Ajax call would most likely use the specific key as an argument in the call to the server, so you can’t just search the Javascript code for any Ajax-y looking URL and assume it’s the full one used in the call.

    It’s often the things that seem the most simple that are incredibly complicated.
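    [The point about runtime-built requests can be sketched as follows: the full request URL only exists once a user event supplies its arguments, so it cannot be scraped out of the static script. The handler and endpoint names here are hypothetical:]

```javascript
// Hypothetical keypress-driven AJAX call: the full request URL is built
// from the key the user pressed, so it never appears literally in the
// page source. A crawler grepping the script finds only the URL prefix.
function buildSuggestUrl(key) {
  return '/api/suggest?q=' + encodeURIComponent(key);
}

// In the page this might be wired up as:
//   document.addEventListener('keydown', function (e) {
//     var xhr = new XMLHttpRequest();
//     xhr.open('GET', buildSuggestUrl(e.key));
//     xhr.send();
//   });
```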
