(from an earlier post to the infrastructure mailing list)
You may remember the problems we had with Googlebot in the summer. It was crawling about 150k pages per day, which caused problems on our servers. I then asked Google (via the Webmaster Tools, thanks to Morbus Iff) to crawl less often, and the rate dropped to below 10k pages per day. That setting expired in November, and as of December Googlebot is back in force.
For now this does not pose a problem, so I'll leave it as it is.
More interesting is that page generation time has improved over the past months: we are now well below one second for anonymous pages.
This is probably due to the third webserver that was deployed. Thanks, SUN!