Provide a Search Concierge that drives live updates on a search query, with multiple media types in the results. The interface would be a dashboard widget that updates as new results are found, with audio alerts keyed off particular words. Let the technology do all the work while informing and entertaining the searcher with a live window into what he seeks -- a peephole into the "now" of his need.
What can we do to improve how the public is able to search for federal content, via federal websites and commercial search engines?
All government agencies should use centralized services to deliver the most relevant content to the citizen quickly
I should be able to give my ZIP code (opt-in) or my GPS location so I can find the nearest government offices and services
Searching gov sites should be as simple as bing.com and google.com
Hi all and thanks for joining the search dialogue-a-thon and the 2 week conversation! During this focused hour, I'll be consolidating some ideas that have been posted and suggesting some ways we can continue evolving those ideas into implementation plans. Please post new ideas, comment on existing ones, and let us know how you can help further!
As a citizen, there have been many times I would have liked to read a copy of a proposed bill, along with the pros and cons of the bill, including the potential costs. I have yet to find a site that links me to that information.
Build metadata management as a key part of any web project. It can enable searching, sorting, audience personalization and content reuse. We must stop forcing the visitor to weed through pages of content that are not relevant to them, and help them search and find the content that is relevant.
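As a rough sketch of what this could look like in practice, page-level metadata can be expressed as simple meta tags in each page's head. The field names and values below are hypothetical, loosely modeled on Dublin Core conventions rather than any mandated federal schema:

```html
<!-- Hypothetical metadata block for an agency web page -->
<head>
  <title>Small Business Loan Programs | Example Agency</title>
  <meta name="description" content="Overview of federal loan programs for small businesses, including eligibility and how to apply.">
  <meta name="keywords" content="small business, loans, financing">
  <meta name="DC.subject" content="Business and Economics">
  <meta name="DC.audience" content="Small business owners">
  <meta name="DC.date.modified" content="2010-06-15">
</head>
```

Consistent fields like subject, audience, and last-modified date are what make the searching, sorting, personalization, and reuse described above possible downstream.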
Have a main search engine that searches the information and data on all government sites. Sometimes it is unclear which department holds the information on a subject. A general search that uses keyword-tag technology would eliminate the need for the user to navigate multiple government websites to find what they are looking for.
Following FRBR concepts in XML, catalog web pages and make them findable through a national catalog similar to (but hopefully better than) a library book catalog. Create an easy form agency content creators can fill out to add their record (i.e., web page) to the catalog. Use link resolvers to check for broken links automatically. Obviously, not every page of every website will be cataloged, but at a minimum every website …
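As a rough illustration, a catalog record for one web page might look something like the following. The element names here are invented for the sketch, not drawn from any actual FRBR or library schema:

```xml
<!-- Hypothetical catalog record for one agency web page -->
<record>
  <title>Student Aid Eligibility</title>
  <url>http://www.example.gov/student-aid/eligibility</url>
  <agency>Department of Education</agency>
  <subject>Education; Financial aid</subject>
  <lastChecked>2010-06-15</lastChecked>
  <linkStatus>ok</linkStatus>
</record>
```

A link resolver could periodically re-fetch each url and update linkStatus, flagging broken links for the content owner before a searcher ever hits them.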
For search engines (Google, Bing, Yahoo, etc.) to properly index websites, it helps for the website to publish a sitemap (http://www.sitemaps.org/). Sitemaps are a vendor-neutral way for websites to list URLs they would like crawled and, optionally, when the resources were last updated. Additionally, a robots.txt file (http://www.robotstxt.org/robotstxt.html) is used to identify the location of the sitemap, and also …
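To make the idea concrete, here is a minimal sitemap in the standard format defined at http://www.sitemaps.org/ (the URLs are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.gov/</loc>
    <lastmod>2010-06-01</lastmod>
  </url>
  <url>
    <loc>http://www.example.gov/services/passports</loc>
    <lastmod>2010-05-20</lastmod>
  </url>
</urlset>
```

and the single line in robots.txt that points crawlers at it:

```
Sitemap: http://www.example.gov/sitemap.xml
```

Publishing both files is low-cost for an agency and works identically across the major commercial search engines.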
I imagine there are dozens of standard taxonomies and nomenclatures for .gov websites. They should be coordinated and combined, then humanized (remove or cross-reference the "alphabet soup" and nonsensical technical jargon). The final taxonomy should be treated as a "living" taxonomy, one that is constantly changing with our culture, and not "set in stone". The taxonomy should be public and shareable with non-government …
Ensure that every new site follows the basic principles of optimizing its content for search engine ranking. Too many sites don't even contain the keywords commonly used by most people to describe the primary subject area of the site.