Sitemaps.org Update: You Can Now Store Your XML Sitemap Files Anywhere!



The major search engines have announced an update to the sitemaps.org protocol which enables site owners to store their XML Sitemap files in any location — even on a different domain than the one referenced in the Sitemap. This will be a welcome change for those who manage multiple domains and would like to keep all Sitemap files in one place, as well as for those who would like to store their Sitemap in a location other than the root.

The only caveat? You have to be able to edit the robots.txt file of the domain the Sitemap file references.

The search engines made the announcement today on the Search Engineers Q&A panel at SMX West. Below, more about how this works and how to implement it on your site.


Historically, your Sitemap file had to be stored in the same location as the URLs listed in that Sitemap. For instance:

https://www.example.com/sitemap.xml could include:
www.example.com
www.example.com/folder1/page1.html
www.example.com/folder2/page1.html

But it could not include:
subdomain1.example.com
www.example2.com

Similarly, https://www.example.com/folder1/sitemap.xml could include:
www.example.com/folder1/page1.html
www.example.com/folder1/images/page2.html

And could not include:
www.example.com
www.example.com/folder2/page1.html
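For reference, a Sitemap file is just an XML document listing page URLs. The sketch below (not part of the announcement; a minimal illustration in Python using the standard library) builds a Sitemap in the format defined at sitemaps.org:

```python
from xml.etree import ElementTree

def build_sitemap(urls):
    """Build a minimal XML Sitemap (urlset of <url><loc> entries)."""
    urlset = ElementTree.Element(
        "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
    )
    for page in urls:
        url = ElementTree.SubElement(urlset, "url")
        ElementTree.SubElement(url, "loc").text = page
    return ElementTree.tostring(urlset, encoding="unicode")

xml = build_sitemap([
    "https://www.example.com/",
    "https://www.example.com/folder1/page1.html",
])
print(xml)
```

A real Sitemap can also carry optional per-URL elements such as lastmod and changefreq, which are omitted here for brevity.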

The protocol required this to ensure that whoever could modify the Sitemap file also had ownership rights over the URLs listed in the Sitemap. However, this setup was problematic for some sites, including those with multiple domains and subdomains, which would have found it difficult or technically infeasible to write Sitemap files to multiple locations. Some sites also found it technically difficult to write Sitemap files to the root of the domain.

With Google, site owners can overcome this issue by verifying ownership of all domains in Webmaster Central, then submitting Sitemaps for any of those domains from any location. However, as this solution only works for Google, it’s not useful for those who want one process for generating Sitemaps for submission to all engines. (Which is, after all, the spirit of sitemaps.org.)

With the latest sitemaps.org update:

  • Each Sitemap file can reference only one domain.
  • The Sitemap can be placed anywhere (on a different domain or in a subfolder) as long as its location is referenced in the domain’s robots.txt file.
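The first rule above means every <loc> entry in a given Sitemap must point at the same host. As a hedged illustration (my own sketch, not code from the announcement), a quick Python check for this using the standard library might look like:

```python
from urllib.parse import urlparse
from xml.etree import ElementTree

# Namespace used by the sitemaps.org protocol
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_hosts(xml_text):
    """Return the set of hosts referenced by a Sitemap's <loc> entries."""
    root = ElementTree.fromstring(xml_text)
    return {urlparse(el.text.strip()).netloc for el in root.iter(NS + "loc")}

sample = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/</loc></url>
  <url><loc>https://www.example.com/folder1/page1.html</loc></url>
</urlset>"""

hosts = sitemap_hosts(sample)
# Under the updated protocol, a valid Sitemap references exactly one host
print(len(hosts) == 1)
```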

Referencing The Sitemap File In robots.txt
Simply point at the Sitemap location in your robots.txt file. For instance, if you have a Sitemap file for www.example.com and you want to store it at www.example1.com/sitemap.xml, add the following line to www.example.com/robots.txt:

Sitemap: https://www.example1.com/sitemap.xml

This line lets the search engines know that you have ownership of www.example.com (since you can modify the robots.txt file for it) and that the Sitemap for that domain is located at https://www.example1.com/sitemap.xml.
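Python's standard library can already read this directive: urllib.robotparser exposes any Sitemap lines it finds via site_maps() (available in Python 3.8 and later). The robots.txt content below is a hypothetical example matching the one in the article:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content for www.example.com
robots_txt = """\
User-agent: *
Disallow:

Sitemap: https://www.example1.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# site_maps() returns the list of Sitemap URLs declared in robots.txt
print(parser.site_maps())
```

In a real deployment you would call parser.set_url() with the live robots.txt URL and parser.read() instead of parsing a local string.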

If you want to store Sitemaps for multiple domains in one location, simply create a Sitemap for each domain, then modify each domain's robots.txt file to point at its Sitemap's location.

The same technique works if you want to store your Sitemap in a location other than the root. If you want to store the Sitemap at https://www.example.com/sitemap/sitemap.xml, simply modify the robots.txt file for www.example.com with the following:

Sitemap: https://www.example.com/sitemap/sitemap.xml

For more information, see the sitemaps.org documentation or each search engine’s blog post:

Microsoft Live Search
Google Webmaster Central
Yahoo! Search Blog




About the author

Vanessa Fox
Contributor
Vanessa Fox is a Contributing Editor at Search Engine Land. She built Google Webmaster Central and went on to found software and consulting company Nine By Blue and create Blueprint Search Analytics, which she later sold. Her book, Marketing in the Age of Google (updated edition, May 2012), provides a foundation for incorporating search strategy into organizations of all levels. Follow her on Twitter at @vanessafox.
