Google Offers A Proposal To Make AJAX Crawlable
It’s one of the most common pieces of SEO advice you’ll find: Don’t build your website with AJAX if you want it to be crawled and indexed by search engines. AJAX-based websites are essentially a locked and bolted door when a spider comes crawling.
At our SMX East conference today, Google announced a proposal to change that: a new standard that would make AJAX crawlable. If it comes to pass, and if the other major search engines go along with the idea, the proposal could serve as a green light for developers who want to enjoy the rich features of AJAX without sacrificing search engine visibility.
The details of Google’s proposal are, to this non-developer, highly technical and more than I care to recap here. (Read Google’s blog post, linked above, for the details.) Google’s goals in creating the new standard are summed up in less technical language:
- Minimal changes are required as the website grows
- Users and search engines see the same content (no cloaking)
- Search engines can send users directly to the AJAX URL (not to a static copy)
- Site owners have a way of verifying that their AJAX website is rendered correctly and thus that the crawler has access to all the content
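To give a rough sense of how a scheme like this could work in practice, here is a minimal sketch. It assumes the convention described in Google's proposal, in which a stateful AJAX URL marks its fragment with "#!" and the crawler requests an equivalent URL carrying an `_escaped_fragment_` query parameter; the function name and example URLs below are illustrative, not part of any official implementation.

```python
# Sketch of the URL mapping in Google's AJAX crawling proposal.
# Assumption: the "#!" / _escaped_fragment_ convention; names here
# are illustrative only.
from urllib.parse import quote

def crawler_url(ajax_url: str) -> str:
    """Map an AJAX URL with a #! fragment to the URL a crawler would fetch."""
    if "#!" not in ajax_url:
        # No hash-bang fragment: the page is already crawlable as-is.
        return ajax_url
    base, fragment = ajax_url.split("#!", 1)
    # Append the fragment as a query parameter so the server can
    # return the fully rendered content for that state.
    sep = "&" if "?" in base else "?"
    return f"{base}{sep}_escaped_fragment_={quote(fragment)}"

print(crawler_url("http://example.com/page#!state=1"))
# http://example.com/page?_escaped_fragment_=state%3D1
```

Because the server answers the `_escaped_fragment_` request with the same content a user sees at the "#!" URL, users and crawlers get identical content (no cloaking) and search results can link directly to the AJAX URL rather than a static copy.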
Google estimates that about 70% of all web content is created dynamically, and that figure is likely to grow. “This hurts search,” Google says. “Not solving AJAX crawlability holds back progress on the web.” Those quotes are from a Google Docs presentation deck about the proposal, embedded below.
Google is asking for feedback on the proposal in the Google Webmaster Central help forum.