Well... now that Googlebot accepts wildcards in robots.txt, I have put together some disallows that will hopefully solve all of vB's duplicate content penalties (I HAVE TONS). One or two dupes are no big deal, but if Google crawls all the links on your forums like it did mine, you can end up with 8+ dupes of the same thread, and that is NO GOOD - they all go to the supplemental index.
Someone please check this robots.txt entry and let me know if it is safe to use. I am not too hip on robots exclusion syntax and would appreciate a friendly "looks good to me" for my lines below.
NOTE: I WANT to have showthread.php indexed... just without the params listed below:
Any flaws with this? Thanks!
Code:
User-agent: Googlebot
Disallow: /*goto=
Disallow: /*mode=
Disallow: /*showthread.php?p=
Disallow: /*&pp=
Disallow: /*postcount=
Disallow: /*daysprune=
Disallow: /*&sort=
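For anyone who wants to sanity-check the matching, I threw together a quick script. Python's built-in robotparser doesn't understand Google's wildcard extension, so this is a rough sketch of how I understand Googlebot matches these patterns: each Disallow is anchored at the start of the URL path, with * matching any characters. The sample URLs at the bottom are hypothetical, so swap in real ones from your own forum.

Code:
import re

# The Disallow patterns from the robots.txt above
rules = [
    "/*goto=",
    "/*mode=",
    "/*showthread.php?p=",
    "/*&pp=",
    "/*postcount=",
    "/*daysprune=",
    "/*&sort=",
]

def is_blocked(path, rules):
    """Return True if any Disallow pattern matches the start of the path."""
    for rule in rules:
        # Escape regex metacharacters, then turn the escaped wildcard
        # back into ".*" so it matches any run of characters
        pattern = "^" + re.escape(rule).replace(r"\*", ".*")
        if re.match(pattern, path):
            return True
    return False

# Hypothetical vBulletin URLs -- adjust these to match your own forum
tests = [
    "/showthread.php?t=1234",               # plain thread view: should stay crawlable
    "/showthread.php?p=5678",               # post permalink: should be blocked
    "/showthread.php?t=1234&goto=newpost",  # goto redirect: should be blocked
    "/forumdisplay.php?f=7&daysprune=30",   # pruned listing: should be blocked
]

for path in tests:
    print(("BLOCKED " if is_blocked(path, rules) else "allowed ") + path)

If the plain thread view comes back "allowed" and the param variants come back "BLOCKED", the rules are doing what I intended. Note this sketch ignores Google's "$" end-of-URL anchor, since none of the rules above use it.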