Does this mean we can forget about search engine friendly menus? I would not ignore the issue, but before investing resources in a new menu system, I would check how the old one is actually performing. If something is working, or partially working, it might be better to put those resources toward another priority. Getting good performance from Google and Microsoft, and having Yahoo index a site via its sitemap, is perhaps 90% as effective as having fully spiderable menus.
PostScript, January 12, 2009: Prompted by reader comments, I looked into this further, and it appears that DNN’s menu system includes a user-agent cloaking module. Some details are available in a thread at the DNN Community Blog. Apparently the module serves up search-friendly menus when the user agent matches a search engine spider. The Firefox User Agent Switcher add-on provides a way to test this: install the add-on, import a list of user agents, such as this one, and switch your user agent to Googlebot, Yahoo Slurp, or MSNbot to see the same pages that appear in the three search engines’ caches. Oddly, the cloaking module does not provide correct output for Yahoo.
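If you prefer testing from the command line rather than a browser add-on, a small script along these lines can do the same kind of check: request the page with a normal browser user agent and with each crawler's user agent, then compare the responses. This is only a sketch; the URL and the exact user-agent strings below are placeholders you would substitute for the site and spiders you care about, not values taken from DNN's module.

```python
# Rough cloaking check: fetch the same URL with a browser user agent and
# with common crawler user agents, then report whether the HTML differs.
# URL and user-agent strings are illustrative placeholders.
import urllib.request

URL = "http://www.example.com/"  # substitute the page you want to test

USER_AGENTS = {
    "browser": "Mozilla/5.0 (Windows NT 6.1; rv:5.0) Gecko/20100101 Firefox/5.0",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    "slurp": "Mozilla/5.0 (compatible; Yahoo! Slurp; http://help.yahoo.com/help/us/ysearch/slurp)",
    "msnbot": "msnbot/2.0b (+http://search.msn.com/msnbot.htm)",
}

def fetch(url, user_agent):
    """Return the response body for url when requested with user_agent."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req) as resp:
        return resp.read()

baseline = fetch(URL, USER_AGENTS["browser"])
for name, ua in USER_AGENTS.items():
    body = fetch(URL, ua)
    marker = "same as browser" if body == baseline else "DIFFERENT from browser"
    print(f"{name:10s} {len(body):8d} bytes  {marker}")
```

A byte-for-byte comparison is crude (timestamps or session tokens can make otherwise identical pages differ), but a large size gap between the browser version and the crawler versions is a quick signal that the menu markup being served to spiders is not what users see.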
People debate whether this sort of cloaking is black hat or accessibility programming. The search-engine version of the page appears to be identical in content to the one served to users, so there is no apparent deception. In addition, I think it would be odd for the search engines to ban a large number of sites built on a common menu system, the details of which are probably not apparent to the majority of webmasters managing those sites. It remains an open question whether the search engines could read these menus without the assistance of friendly cloaking.