Yahoo Slurp/Inktomi Bot

  • SilentK
    Member
    • Nov 2004
    • 67

    Yahoo Slurp/Inktomi Bot

I am curious as to why it needed to send out 188 of its little bots to my site. That seems excessive. I am not really concerned about bandwidth or server load, but it makes my most-online counter look absurd.
    Xbox Live teams.com
  • Zachery
    Former vBulletin Support
    • Jul 2002
    • 59097

    #2
They like to do the job as fast as possible. I remember when vB.org and vB.com switched to vB3, we had thousands of people online between .com and .org, mostly search bots.

    • tgmorris
      Senior Member
      • Nov 2003
      • 187
      • 3.6.x

      #3
I just got hit by over 240 of the suckers before I found this on the Yahoo! site:

      There is a Yahoo! Slurp-specific extension to robots.txt which allows you to set a lower limit on our crawler request rate.

You can add a "Crawl-delay: xx" instruction, where "xx" is the minimum delay in seconds between successive crawler accesses. If the crawler rate is a problem for your server, you can set the delay to 60 or 300 seconds, or whatever value is comfortable for your server.

      Setting a crawl-delay of 20 seconds for Yahoo! Slurp would look something like:

      User-agent: Slurp
Crawl-delay: 20

I added it to my robots.txt file and they slowly decreased to a reasonable number.
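
If you want to confirm the delay is actually being honored, here is a minimal sketch (not from the Yahoo! docs) that measures the gaps between Slurp requests in an Apache access log. The log path and the combined log format are assumptions; adjust them for your server.

from datetime import datetime

LOG_PATH = "/var/log/apache2/access.log"  # assumption: adjust for your server

times = []
with open(LOG_PATH) as log:
    for line in log:
        # In the combined log format the user-agent is the last quoted
        # field, so after split('"') it lands at index 5.
        parts = line.split('"')
        if len(parts) >= 6 and "Slurp" in parts[5]:
            # Apache timestamps look like 10/Oct/2005:13:55:36 -0700
            stamp = line.split("[", 1)[1].split("]", 1)[0]
            times.append(datetime.strptime(stamp, "%d/%b/%Y:%H:%M:%S %z"))

times.sort()
gaps = [(b - a).total_seconds() for a, b in zip(times, times[1:])]
print(len(times), "Slurp requests")
if gaps:
    print("shortest gap: %.0fs, average gap: %.0fs" % (min(gaps), sum(gaps) / len(gaps)))

If the shortest gap comes out near your Crawl-delay value, the directive is being respected. Keep in mind Crawl-delay is a nonstandard robots.txt extension; Yahoo! Slurp honors it, but not every crawler does.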
