Bigmouthmedia reports that Google Sitemaps has a new feature that lets you choose how often Googlebot crawls your site. You can select from 5 values, from slowest to fastest, though keep in mind that a faster crawl uses more bandwidth. For the moment, this feature is still experimental, so you may not find it in your Google Sitemaps account.
"We are testing an alpha version of our new tool with a small percentage of webmasters who use Sitemaps. You should leave this control at the Normal setting unless you are having trouble with the speed at which Googlebot is crawling your server.
Simply select the rate at which you would like the Googlebot to crawl your server and click save. During this stage of testing, we will evaluate requests to determine the best way of using this data and providing this tool to everyone."
Google Sitemaps, recently rebranded as Google Webmaster Central, is a control panel for webmasters, where they can view search statistics, check crawling errors, and submit sitemaps so that Google finds their pages faster.
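For readers unfamiliar with the format, a sitemap is just an XML file listing your URLs, which you then submit through the control panel. A minimal sketch might look like this (the URL and date are placeholders for your own values):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want Google to discover -->
  <url>
    <loc>http://www.example.com/</loc>
    <!-- Optional hints: last modification date, change frequency, priority -->
    <lastmod>2006-07-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Only `<loc>` is required; the other tags are optional hints that crawlers may or may not honor.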
{ Thank you, TomHTML. }