• http://www.countryroadwebdesigns.com SueM

    Thank you. The information was very useful.

  • http://www.seo-theory.com/ Michael Martinez

    Danny, this is a great article and I welcome the news with open arms.

    Just one request for the future: would you please say “XML sitemaps” to distinguish these files from “HTML sitemaps”, or otherwise indicate the difference (since .TXT files and Atom feeds can also be used)?

    People very easily confuse ambiguous references to XML sitemaps with the HTML sitemaps that many sites include in their standard secondary navigation.
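
    To make the distinction concrete, here is a minimal sketch of the two machine-readable file formats (the URLs are placeholders; an “HTML sitemap,” by contrast, is just an ordinary web page of links):

        urls = [
            "http://www.example.com/",
            "http://www.example.com/about.html",
        ]

        # XML Sitemap, per the sitemaps.org protocol.
        xml = ['<?xml version="1.0" encoding="UTF-8"?>',
               '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
        xml += ["  <url><loc>%s</loc></url>" % u for u in urls]
        xml.append("</urlset>")
        with open("sitemap.xml", "w") as f:
            f.write("\n".join(xml))

        # Plain-text sitemap: one URL per line, nothing else.
        with open("sitemap.txt", "w") as f:
            f.write("\n".join(urls))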

  • eCopt

    Good suggestion, Michael.

    I’ve had to distinguish between the many formats with clients too. That’s understandable, since some formats are only beginning to be used this way. If you haven’t kept up with sitemaps since they first appeared, you wouldn’t really know the differences.

  • http://www.HippoIMT.com Hippo

    This is wonderful news for site owners and the SEO industry. It’s nice to see the search engines working together and allowing us to “talk to them,” in effect.

    What a welcome change this industry has seen over the last five years or so. I remember attending SES in Boston back then, and it seemed as if the attendees were hanging on the search engine reps’ every word while the reps had very little interest in what we were saying.

    The times are changing. It’s nice.

    Thanks Danny! You’ve been an integral part of “the revolution”.

  • http://www.ioncannon.net/ mcdonaec

    This is nice coming right after Google added support for embedding KML into sitemaps.

    http://googlemapsapi.blogspot.com/2007/01/get-more-traffic-to-your-maps-api-site.html

  • http://www.blizzardinternet.com Mary Bowling

    We’re debating here at my outfit whether we should give the SEs an XML sitemap telling them about all the pages on a site, or simply upload the verification file from Google and then wait to see how they crawl the site and what errors they find.
    There are good arguments on both sides. Do you have an opinion as to which method is best? Thanks

  • eCopt

    Easy: try both. Verify your site with Google and Yahoo, create a sitemap in a valid format, and then either reference it in your robots.txt or notify the engines with HTTP ping requests (see the sketch at the end of this comment).

    I think the greatest advantage of autodiscovery is the access it gives to Ask and Live.com. Those were the only two with no way to discover your pages other than crawling or manual submission. Now they have one.

    Both will make you accessible to all the SEs. Why start slow and wait to see what happens? You might as well do all you can; I think Vanessa Fox said to notify the SEs in every way you can, if possible.
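
    A minimal sketch of both approaches (the sitemap URL is a placeholder, and the ping endpoints are the ones the engines publicized around this time, so verify them before relying on this):

        import urllib.parse
        import urllib.request

        SITEMAP = "http://www.example.com/sitemap.xml"  # placeholder URL

        # Option 1 - autodiscovery: a single line in robots.txt lets any
        # engine that supports the convention find the sitemap on its own.
        print("Add to robots.txt:  Sitemap: %s" % SITEMAP)

        # Option 2 - HTTP ping: actively notify each engine. Endpoints as
        # publicized around the time of this thread (verify before use).
        PING_ENDPOINTS = [
            "http://www.google.com/webmasters/sitemaps/ping?sitemap=",
            "http://search.yahooapis.com/SiteExplorerService/V1/ping?sitemap=",
            "http://submissions.ask.com/ping?sitemap=",
        ]

        for endpoint in PING_ENDPOINTS:
            url = endpoint + urllib.parse.quote(SITEMAP, safe="")
            with urllib.request.urlopen(url) as resp:
                print(resp.status, endpoint)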

  • http://www.antezeta.com/google/sitemap-standard.html Sean Carlos

    Ask has since documented sitemap support here: http://about.ask.com/en/docs/about/webmasters.shtml#22. While the ping mechanism does work, I haven’t seen them actually crawl the file, but that could be attributed to their low overall crawling rate.

    Before embracing sitemaps autodiscovery, one should ask: is it sensible to facilitate content scraping by providing a list of all of your URLs to the world? (The sketch below shows how little effort that takes.) I believe it is wiser to stick with manual notification via Google’s Webmaster Dashboard and Yahoo!’s Site Explorer. Since there are only a few major search engines, manually pinging the rest is not a burden; for now that means just Ask, as Microsoft has not yet begun supporting sitemaps.
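
    The scraping concern is easy to demonstrate: once the Sitemap line is published in robots.txt, anyone, not just a search engine, can harvest the full URL list. A sketch, with a placeholder site:

        import re
        import urllib.request
        from xml.etree import ElementTree

        SITE = "http://www.example.com"  # placeholder site

        # Anyone can read robots.txt and spot the autodiscovery line ...
        robots = urllib.request.urlopen(SITE + "/robots.txt").read().decode()
        m = re.search(r"(?im)^sitemap:\s*(\S+)", robots)

        # ... and the sitemap then hands over every URL without any crawling.
        if m:
            tree = ElementTree.parse(urllib.request.urlopen(m.group(1)))
            ns = "{http://www.sitemaps.org/schemas/sitemap/0.9}"
            urls = [loc.text for loc in tree.iter(ns + "loc")]
            print("Harvested %d URLs" % len(urls))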

  • http://www.ourhometools.com odls

    Although submitting both URLs and sitemaps to Google and Yahoo is easy enough, I still don’t see how to submit either to Ask, nor a sitemap to MSN. Or am I missing something?

  • http://oiloffshoremarine.wordpress.com Paul