vBulletin 4 Forum exemplar robots.txt file

  • ap0llo_*
    New Member
    • Join Date: Mar 2010
    • Posts: 17
    • vBulletin Version: 4.0.0

    Hi,

    Here's an example robots.txt file for a vB4 forum installation. It is designed to combat/limit bots that consume a lot of bandwidth, and it disallows and allows the appropriate files.

    Code:
    # Replace with the full URL of your forum's sitemap
    Sitemap: http://www.example.com/sitemap_index.xml
    
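    # Bandwidth-heavy crawlers: blocked from the whole site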
    User-agent: BoardTracker
    Disallow: /
    
    User-agent: Gigabot
    Disallow: /
    
    User-agent: Twiceler
    Disallow: /
    
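    # Ask Yahoo! Slurp and msnbot to wait 2 seconds between requests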
    User-agent: Slurp
    Crawl-delay: 2
    
    User-agent: msnbot
    Crawl-delay: 2
    
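    # All other bots: keep them out of action and per-user pages (posting, searching,
    # private messages, user CP) that have no value in a search index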
    User-agent: *
    Disallow: /editpost.php
    Disallow: /gamercard.php
    Disallow: /inlinemod.php
    Disallow: /member.php
    Disallow: /memberlist.php
    Disallow: /newreply.php
    Disallow: /newthread.php
    Disallow: /payments.php
    Disallow: /printthread.php
    Disallow: /private.php
    Disallow: /profile.php
    Disallow: /report.php
    Disallow: /search.php
    Disallow: /sendmessage.php
    Disallow: /showpost.php
    Disallow: /usercp.php
    Disallow: /usernote.php
    
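    # Mediapartners-Google is the AdSense crawler; letting it fetch these pages helps it target ads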
    User-agent: Mediapartners-Google
    Allow: /member.php
    Allow: /private.php
    Allow: /usercp.php

    Please note that this robots.txt file isn't a quick fix for all websites. It was written for a domain with only a vB4 forum installed. If you have any additional installations on your domain, such as a CMS, rules for those should also be added.
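
    For example, if a second application (such as the vBulletin CMS) were installed alongside the forum, its own edit, search and user pages could be kept out of the index by extending the User-agent: * section. The paths below are purely hypothetical placeholders rather than real script names; swap in the scripts and directories of whatever you actually have installed.

    Code:
    # Hypothetical additions for a second application installed under /cms/.
    # Append these to the existing "User-agent: *" group and adjust the paths
    # to match your real installation.
    Disallow: /cms/search.php
    Disallow: /cms/editpost.php
    Disallow: /cms/private/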

    Rhys
    Tech-Reviews | Tech-Reviews Forums