Hi,
Here's an example robots.txt file for a vB4 forum installation. It's designed to limit high-bandwidth bots and disallows / allows the correct files.
Please note that this robots.txt file isn't a quick fix for all websites. It assumes a vB4 forum is the only installation on the domain in question. If you have any additional installations on your domain, such as a CMS, you should add rules for those as well.
Code:
Sitemap: Sitemap URL here

User-agent: BoardTracker
Disallow: /

User-agent: Gigabot
Disallow: /

User-agent: Twiceler
Disallow: /

User-agent: Slurp
Crawl-delay: 2

User-agent: msnbot
Crawl-delay: 2

User-agent: *
Disallow: /editpost.php
Disallow: /gamercard.php
Disallow: /inlinemod.php
Disallow: /member.php
Disallow: /memberlist.php
Disallow: /newreply.php
Disallow: /newthread.php
Disallow: /payments.php
Disallow: /printthread.php
Disallow: /private.php
Disallow: /profile.php
Disallow: /report.php
Disallow: /search.php
Disallow: /sendmessage.php
Disallow: /showpost.php
Disallow: /usercp.php

User-agent: Mediapartners-Google
Allow: /member.php
Allow: /private.php
Allow: /usercp.php
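If you want to sanity-check your finished file before uploading it, one option is Python's built-in urllib.robotparser module. The sketch below (domain and bot name are just placeholders) parses a cut-down version of the rules above and confirms that a generic crawler is blocked from the disallowed scripts but can still fetch ordinary thread pages:

```python
from urllib.robotparser import RobotFileParser

# A cut-down copy of the rules from the file above.
rules = """\
User-agent: *
Disallow: /editpost.php
Disallow: /search.php
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Disallowed scripts should be blocked for a generic bot...
print(rp.can_fetch("SomeBot", "https://example.com/search.php"))          # False
# ...while normal thread pages remain crawlable.
print(rp.can_fetch("SomeBot", "https://example.com/showthread.php?t=1"))  # True
```

Note that robotparser doesn't evaluate Crawl-delay when deciding can_fetch, so this only verifies the Disallow/Allow rules, not the rate-limiting directives.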
Rhys