An Update On JavaScript Menus And SEO

SEOs have traditionally recommended search engine friendly menu systems, such as those based on HTML styled with CSS, over menus written in JavaScript. The primary language understood by search engines has been HTML, but several search engines have now learned how to read JavaScript. I recently found an example that shows how much progress has been made. My observations confirm what software engineer Janis Stipins said at SMX East in October 2008: Google is doing a much better job spidering JavaScript.

About a month ago, Monitronics deployed a new site built with DotNetNuke (DNN) and the Solpart version 1.7.2 menu system. You can see which technologies were used by reading the comments in the source code. DNN is an open source framework for building websites that run on ASP.NET, and its Solpart menu system relies on JavaScript. In the past I have recommended ripping out that menu system and using something else. Like most companies working with DNN, Monitronics just went with the default menus. When I visit their site with scripting disabled in my browser (Firefox with the NoScript add-on), I see nothing: no main menu items and no drop-downs. In former times, this would have been a bad omen for SEO.
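
A quick way to approximate what a scriptless client (or a spider that does not execute JavaScript) sees is to fetch the raw HTML and look for plain anchor links. The following is only a minimal sketch; the URL and the user-agent string are placeholders, not Monitronics' actual details.

from html.parser import HTMLParser
from urllib.request import Request, urlopen

class LinkCollector(HTMLParser):
    # Collects every href found in ordinary <a> tags in the raw HTML.
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

req = Request("http://www.example.com/", headers={"User-Agent": "Mozilla/5.0"})
html = urlopen(req).read().decode("utf-8", errors="replace")

collector = LinkCollector()
collector.feed(html)
print(len(collector.links), "plain HTML links found")
# Menu items that exist only after JavaScript runs will not appear in this count.

If the navigation contributes nothing to that count, the menu exists only after scripts run, which is exactly the situation described above.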

As promised, Google has spidered the new JavaScript menus. The Google cache shows all the menu items, lined up in a neat list with the links fully operational. Google is also showing reasonable Sitelinks for a search on [Monitronics], which is a good indication that the site has been properly spidered. When I look at the Yahoo cache, I see the main menu text, though not the drop-downs. Yahoo has cached the text generated by the JavaScript rather than the JavaScript code itself; however, it has failed to spider the drop-down menus and failed to extract the menu links. When I visit Microsoft's cache, all the menu links are there, organized in the same neat hierarchy as in Google's cache, and the links work. When a website is built with Microsoft technology, I would expect the Microsoft search engine to be able to decipher the code, and Microsoft does seem to be doing a good job reading DNN JavaScript menus.

Does this mean we can forget about search engine friendly menus? I would not ignore the issue, but before investing resources in a new menu system, I would check the performance of the old one. If something is working, or partially working, it might be better to put those resources toward another priority. Getting good performance from Google and Microsoft, and having Yahoo index a site via its sitemap, is perhaps 90% as effective as having fully spiderable menus.

In November, DNN posted a note that its newest menu system, version 1.8.0, is “SEO compatible”. Perhaps Yahoo’s spidering capabilities will soon catch up with Google’s and Microsoft’s. Depending on the potential traffic that a site’s interior pages could generate, the website owner needs to decide whether it is worth the investment of redoing a menu system to gain that final 10% of performance, or whether procrastination might be an effective strategy. In 2009, I think Google and Microsoft will master JavaScript, Yahoo will follow, and SEOs will have fewer menu gripes.

PostScript, January 12, 2009: Upon investigating reader comments, it appears that DNN’s menu system includes a user-agent cloaking module. Some details are available in a thread at the DNN Community Blog. Apparently the module serves up search-friendly menus when the user agent matches a search engine spider. The Firefox User Agent Switcher add-on provides a way to test this: install the add-on, import a list of user agents, such as this one, and switch your user agent to Googlebot, Yahoo Slurp, or MSNBot to see the same pages that appear in all three search engine caches. Oddly, the cloaking module does not provide correct output for Yahoo.
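
The same test can be scripted: request the page with a normal browser user agent and again with spider user agents, then compare how many plain links each response contains. This is only a sketch; the URL is a placeholder, and the user-agent strings are the commonly published ones, which a cloaking module may or may not match on.

import re
from urllib.request import Request, urlopen

URL = "http://www.example.com/"  # placeholder, not the actual site
USER_AGENTS = {
    "browser":   "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) Gecko/2008052906 Firefox/3.0",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    "slurp":     "Mozilla/5.0 (compatible; Yahoo! Slurp; http://help.yahoo.com/help/us/ysearch/slurp)",
    "msnbot":    "msnbot/1.1 (+http://search.msn.com/msnbot.htm)",
}

for name, ua in USER_AGENTS.items():
    html = urlopen(Request(URL, headers={"User-Agent": ua})).read().decode(
        "utf-8", errors="replace")
    # Count the ordinary anchor links present in the HTML as served.
    links = re.findall(r'<a\s[^>]*href=', html, flags=re.IGNORECASE)
    print(name, "->", len(links), "links")

If a user-agent cloaking module is in place, the spider requests should come back with noticeably more plain links than the browser request.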

People debate whether this sort of cloaking is black hat or accessibility programming. The search-engine version of the page appears to be identical in content to the one served to users, so there appears to be no deception. In addition, I think it would be odd for the search engines to ban a large number of sites built with a common menu system whose details are probably not apparent to the majority of webmasters managing those sites. It remains an open question whether the search engines could read these menus without the assistance of friendly cloaking.
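
For readers unfamiliar with how such a module works, here is a conceptual sketch of user-agent-based menu rendering: plain HTML links for known spiders, the JavaScript-driven menu for everyone else, with the same items behind both. This illustrates the idea only; it is not the DNN/Solpart implementation.

KNOWN_SPIDERS = ("googlebot", "slurp", "msnbot")

def render_menu(user_agent, items):
    # items is a list of (label, url) pairs; both versions carry the same content.
    ua = user_agent.lower()
    if any(bot in ua for bot in KNOWN_SPIDERS):
        # Search-friendly version: ordinary anchors a spider can follow.
        links = "".join('<li><a href="%s">%s</a></li>' % (url, label)
                        for label, url in items)
        return "<ul>" + links + "</ul>"
    # Browser version: a placeholder that the menu script populates client-side.
    return '<div id="menu"></div><script src="/js/menu.js"></script>'

print(render_menu("Mozilla/5.0 (compatible; Googlebot/2.1)",
                  [("Home", "/"), ("Services", "/services")]))

Whether serving different markup to spiders, even with identical content, counts as cloaking is precisely the debate described above.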

Nevertheless, the essential advice of the article remains correct.  Before replacing a menu system, check the actual performance.  Don’t assume that a menu won’t work with search just because of JavaScript.  Search engines claim to be able to read JavaScript, and some menu systems provide their own code fixes when they detect a search engine spider.

Opinions expressed in the article are those of the guest author and not necessarily Search Engine Land.




About The Author: The author has two degrees in computer science from Yale University and is a founder of Hochman Consultants, an internet marketing company, and CodeGuard, a computer security service.




  • Alexander Gounder (http://www.gounder.co.in/)

    Though Google can crawl these menus, two questions remain:

    a.) When you check the cached text version of the page, the links do not show up, and one cannot be sure how the search engines are using them. It is like Flash: search engines do cache the text inside Flash files. I once saw a snippet of text from a client’s Flash header appear in a SERP result, but I had no way to know whether all of the text was cached or whether the links inside were passing any link juice.

    b.) What about the other engines? Corporate clients are often more interested in large fancy reports than in actual results. Though Google is the clear winner and provides 80% of the traffic, clients still want you to go after Yahoo and MSN as well, and engines like MSN can still take several weeks to cache a site.

    Anyway, such changes at Google are good to know about, because they speak volumes about why Google is the market leader (hint to Yahoo & MSN).

  • BET_wayne

    In the Sphinn discussion, an author said that they are just cloaking (reference: http://www.linkvendor.com/blog/google-und-javascript-menus-bislang-nur-cloaking.html).
    As far as I can see, that is correct: they cloak the menu for the search engines.
