For search engines (Google, Bing, Yahoo, etc.) to properly index websites, it helps for the website to publish a sitemap (http://www.sitemaps.org/). Sitemaps are a vendor-neutral way for websites to list the URLs they would like crawled and, optionally, to indicate when those resources were last updated. Additionally, a robots.txt file (http://www.robotstxt.org/robotstxt.html) is used to identify the location of the sitemap and to indicate which portions of the website are crawlable and at what rate. It would be useful to see guidance on how federal websites should use these technologies to ensure their content is available, in a timely manner, where people are looking for it.
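To make the idea concrete, here is a minimal sketch of what this might look like, using the hypothetical domain www.example.gov as a placeholder. A robots.txt at the site root points crawlers at the sitemap and marks a section as off-limits (note that Crawl-delay is a de facto extension honored by some crawlers but not all; Google, for instance, ignores it):

```
# robots.txt — hypothetical example for a federal site
User-agent: *
Disallow: /internal/
Crawl-delay: 10

Sitemap: https://www.example.gov/sitemap.xml
```

The sitemap itself is an XML file in the sitemaps.org format, listing URLs with optional last-modified dates:

```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.gov/press-releases</loc>
    <lastmod>2013-05-01</lastmod>
  </url>
</urlset>
```

The URLs and dates above are illustrative only; any guidance would need to specify the real conventions agencies should follow.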