Campaign: Search

Enable findability by using sitemaps and robots.txt

For search engines (Google, Bing, Yahoo, etc.) to properly index a website, it helps for the site to publish a sitemap (http://www.sitemaps.org/). Sitemaps are a vendor-neutral way for websites to list the URLs they would like crawled and, optionally, when those resources were last updated. Additionally, a robots.txt file (http://www.robotstxt.org/robotstxt.html) is used to identify the location of the sitemap and to indicate which portions of the website are crawlable and at what rate. It would be useful to see guidance on how federal websites should use these technologies to ensure their content is available, in a timely manner, where people are looking for it.
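As a minimal sketch of what this looks like in practice: a sitemap is an XML file listing URLs with optional last-modified dates, and robots.txt points crawlers to it. The domain example.gov, the /internal/ path, the date, and the Crawl-delay value below are illustrative placeholders, not references to any real site:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.gov/</loc>
        <!-- optional: when this resource was last updated -->
        <lastmod>2011-06-01</lastmod>
      </url>
    </urlset>

    # robots.txt, served at http://www.example.gov/robots.txt
    User-agent: *
    Disallow: /internal/     # keep crawlers out of this section
    Crawl-delay: 10          # seconds between requests; a de facto extension, not part of the original standard
    Sitemap: http://www.example.gov/sitemap.xml

Note that Crawl-delay is honored by some crawlers but not all (Google ignores it, and its crawl rate is instead configured through the search engine's own webmaster tools), which is one more reason government-wide guidance on these files would help.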

Voting: 27 votes
Status: Active
Idea No. 92