For those of you who are obsessed with Sitemaps and are under the impression they're going to get your pages indexed any faster, here's what Google says about Sitemaps:
Google doesn't guarantee that we'll crawl or index all of your URLs. However, we use the data in your Sitemap to learn about your site's structure, which will allow us to improve our crawler schedule and do a better job crawling your site in the future.
Given the statement highlighted above, why do you guys keep telling people to submit a Sitemap to get indexed faster?
Crawling is the process by which Googlebot discovers new and updated pages to be added to the Google index.
We use a huge set of computers to fetch (or "crawl") billions of pages on the web. The program that does the fetching is called Googlebot (also known as a robot, bot, or spider). Googlebot uses an algorithmic process: computer programs determine which sites to crawl, how often, and how many pages to fetch from each site.
Google's crawl process begins with a list of web page URLs, generated from previous crawl processes, and augmented with Sitemap data provided by webmasters. As Googlebot visits each of these websites it detects links on each page and adds them to its list of pages to crawl. New sites, changes to existing sites, and dead links are noted and used to update the Google index.
- Webmaster Tools Help
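For reference, a Sitemap under the sitemaps.org protocol is just an XML file listing URLs with optional metadata. A minimal sketch (the domain, dates, and frequency are made-up placeholders) looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- loc is the only required child of url -->
    <loc>http://www.example.com/</loc>
    <!-- lastmod and changefreq are optional hints, not commands -->
    <lastmod>2010-01-15</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```

Note what this file actually is: a list of hints. Per the quote above, Google uses it to learn your site's structure and tune its crawl schedule, not as a queue of pages it promises to index.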
Sooo, given that second quote, the best way to get your site indexed quickly is to get a link from an existing site that Google crawls often, and to have a super duper internal linking strategy in place, including a traditional HTML site map for your visitors and the search bots (including the ones that don't support the Sitemaps protocol).
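A traditional HTML site map is nothing fancy: just a regular page of plain links that any visitor or bot can follow. A sketch (the page names and paths are placeholders):

```html
<!-- sitemap.html: an ordinary page of crawlable links -->
<h1>Site Map</h1>
<ul>
  <li><a href="/">Home</a></li>
  <li><a href="/about.html">About</a></li>
  <li><a href="/products.html">Products</a>
    <ul>
      <li><a href="/products/widgets.html">Widgets</a></li>
    </ul>
  </li>
  <li><a href="/contact.html">Contact</a></li>
</ul>
```

Because these are plain anchor links on a crawlable page, every bot that follows links can discover your pages this way, whether or not it supports the Sitemaps protocol.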