Here is a good opportunity to invite Googlebot to crawl your site so that it gets indexed in search engines. Google uses sophisticated algorithms to decide how much to crawl each site; the goal is to crawl as many pages per visit as possible without overwhelming your server's bandwidth. Adjusting the crawl rate can help your pages get indexed sooner if you update your blog frequently. Blogger users get a sensible default, and others can change it as needed.
Crawl rate refers to the speed of Googlebot's requests during the crawl process. Note that a new crawl rate stays in effect for 90 days.
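The idea behind a crawl rate is simple: a polite bot spaces its requests out so the server is never flooded. Here is a minimal sketch of that pattern, assuming a caller supplies its own `fetch` function (the names `fetch_politely` and `min_interval` are illustrative, not anything Google publishes):

```python
import time

def fetch_politely(urls, fetch, min_interval=2.0):
    """Fetch each URL, waiting at least `min_interval` seconds
    between requests so the target server is not overwhelmed."""
    results = []
    last_request = 0.0
    for url in urls:
        # Sleep off whatever remains of the interval since the last request.
        wait = min_interval - (time.time() - last_request)
        if wait > 0:
            time.sleep(wait)
        last_request = time.time()
        results.append(fetch(url))
    return results
```

Raising the crawl rate in Webmaster Tools effectively shrinks that interval; lowering it stretches the interval out.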
How to change the crawl rate:
- On the Webmaster Tools Home page, click the site you want.
- Under Site configuration, click Settings.
- In the Crawl rate section, select the option you want.
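After changing the setting, you can sanity-check the effect by looking at how often Googlebot appears in your server's access log. Below is a small, hypothetical helper for a standard Apache/Nginx combined-format log; the log layout and the "Googlebot" user-agent substring are assumptions about a typical setup, not part of the Webmaster Tools feature itself:

```python
import re

def googlebot_hits(log_lines):
    """Count access-log lines whose trailing user-agent field
    mentions Googlebot (combined log format assumed)."""
    # The user-agent is the last quoted field on each line.
    pattern = re.compile(r'"[^"]*Googlebot[^"]*"\s*$')
    return sum(1 for line in log_lines if pattern.search(line))
```

Comparing this count across days before and after the change gives a rough picture of whether the new crawl rate has taken hold.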
Please note that Google's crawl process begins with a list of web page URLs generated from previous crawls and augmented with Sitemap data provided by webmasters. As Googlebot visits each of these pages, it detects the links on them and adds those to its list of pages to crawl. New sites, changes to existing sites, and dead links are noted and used to update the Google index.
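That process, start from a seed list, pull links out of each fetched page, and queue anything not seen before, is a classic breadth-first crawl. Here is a self-contained sketch of the idea (the function names and the plugged-in `fetch_html` callback are illustrative; real crawlers add politeness, robots.txt checks, and URL normalization on top):

```python
from collections import deque
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href values from <a> tags, mimicking how a bot
    detects links on each page it visits."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seed_urls, fetch_html, limit=100):
    """Breadth-first crawl: visit each queued URL, extract its links,
    and enqueue any URL that has not been seen before."""
    queue = deque(seed_urls)
    seen = set(seed_urls)
    order = []
    while queue and len(order) < limit:
        url = queue.popleft()
        order.append(url)
        parser = LinkExtractor()
        parser.feed(fetch_html(url))
        for link in parser.links:
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return order
```

The `seen` set is what keeps dead ends and circular links from being crawled twice, which is also why dead links can be noticed and dropped from the index.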
As Google says in Webmaster Tools, “You can change the crawl rate (the time used by Googlebot to crawl the site) for sites that are at the root level—for example, http://subdomain.example.com. You can't change the crawl rate for sites that are not at the root level.”