Enable findability by using sitemaps and robots.txt

For search engines (Google, Bing, Yahoo, etc.) to properly index websites, it helps for the website to publish a sitemap. Sitemaps are a vendor-neutral way for websites to list the URLs they would like crawled and, optionally, when those resources were last updated. Additionally, a robots.txt file is used to identify the location of the sitemap and to indicate which portions of the website are crawlable and at what rate. It would be useful to see guidance on how federal websites should use these technologies to ensure their content is available where people are looking for it in a timely manner.
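As a rough sketch of how the two pieces fit together, a site might serve a robots.txt at its root that points crawlers to the sitemap and marks off-limits paths (the paths and domain here are placeholders, not from any real agency site):

```
# robots.txt — served at https://example.gov/robots.txt
User-agent: *
Disallow: /internal/
Crawl-delay: 10

Sitemap: https://example.gov/sitemap.xml
```

The sitemap itself is an XML file following the sitemaps.org protocol, listing each URL and, optionally, its last-modified date:

```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.gov/</loc>
    <lastmod>2011-01-15</lastmod>
  </url>
  <url>
    <loc>https://example.gov/services/</loc>
    <lastmod>2011-01-10</lastmod>
  </url>
</urlset>
```

Note that `Crawl-delay` is honored by some crawlers (e.g. Bing) but not all; Google ignores it in favor of settings in its own webmaster tools.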


27 votes
Idea No. 92