I am curious as to why it needed to send out 188 of its little bots to my site. That seems excessive. I am not really concerned about bandwidth or server load, but it makes my most-users-online counter look absurd.
Yahoo Slurp/Inktomi Bot
They like to do the job as fast as possible. I remember when vB.org and vB.com switched to vB3, we had thousands of people online between .com and .org, mostly search bots.
I just got hit by over 240 of the suckers before I found this on the Yahoo! site.
There is a Yahoo! Slurp-specific extension to robots.txt which allows you to set a lower limit on our crawler request rate.
You can add a "Crawl-delay: xx" instruction, where "xx" is the minimum delay in seconds between successive crawler accesses. If the crawler rate is a problem for your server, you can set the delay up to 60 or 300 or whatever value is comfortable for your server.
Setting a crawl-delay of 20 seconds for Yahoo! Slurp would look something like:
User-agent: Slurp
Crawl-delay: 20
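
For anyone who wants to check this programmatically, here is a minimal sketch using Python's standard urllib.robotparser (the two rules are inlined here rather than fetched from a real robots.txt URL, so treat it as an illustration rather than anything Yahoo! provides):

import urllib.robotparser

# The same two rules as the example above, fed in directly.
rules = [
    "User-agent: Slurp",
    "Crawl-delay: 20",
]

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules)
rp.modified()  # record a "fetch" time; crawl_delay() returns None until one is set

print(rp.crawl_delay("Slurp"))      # 20 - the delay the example asks Slurp to honour
print(rp.crawl_delay("Googlebot"))  # None - no rule applies to other crawlers

In a real script you would point it at your own robots.txt with rp.set_url(...) and rp.read() instead of parse(), and a well-behaved crawler sleeps for the returned number of seconds between requests.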