Robots.txt for VB 5

  • Alsuheimat
    Member
    • Jan 2016
    • 66

    Robots.txt for VB 5

    I need a robots.txt file for vBulletin 5.2.4.


  • Trevor Hannant
    vBulletin Support
    • Aug 2002
    • 24358
    • 5.7.X

    #2
    This isn't something we supply; you will need to create your own.
    Vote for:

    - Admin Settable Paid Subscription Reminder Timeframe (vB6)
    - Add Admin ability to auto-subscribe users to specific channel(s) (vB6)


    • Wayne Luke
      vBulletin Technical Support Lead
      • Aug 2000
      • 74132

      #3
      The reason we do not supply a robots.txt is because the file is required to be in the root of the website. There is no guarantee that vBulletin will be in the root of the website. Many people actually do not install vBulletin in the root of their website.
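      For illustration, a minimal sketch of what such a file might look like, assuming a hypothetical installation at /forum/ under the placeholder domain example.com: robots.txt itself still sits at the web root, and the Disallow paths carry the subdirectory prefix.

      Code:
      # Sketch only: assumes vBulletin is installed at /forum/, not the web root.
      # The file itself lives at https://www.example.com/robots.txt
      User-agent: *
      Disallow: /forum/admincp/

      Sitemap: https://www.example.com/forum/vbulletin_sitemap_index.xml.gz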

      Translations provided by Google.

      Wayne Luke
      The Rabid Badger - a vBulletin Cloud demonstration site.
      vBulletin 5 API


      • SpaceStar
        Senior Member
        • Apr 2005
        • 171

        #4
        Alsuheimat,

        I faced a similar issue. Google said it could not crawl the site because robots.txt did not exist. I created a dummy file with the following text in it:

        User-Agent:*

        and put it in the root of my website with sufficient read permissions. Google and Bing picked it up.
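        For reference, a minimal sketch of the conventional allow-everything form of such a file (an empty Disallow means nothing is blocked):

        Code:
        # Minimal "allow everything" robots.txt, placed at the web root
        User-agent: *
        Disallow: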


        • Alsuheimat
          Member
          • Jan 2016
          • 66

          #5
          Originally posted by SpaceStar
          Alsuheimat,

           I faced a similar issue. Google said it could not crawl the site because robots.txt did not exist. I created a dummy file with the following text in it:

           User-Agent:*

           and put it in the root of my website with sufficient read permissions. Google and Bing picked it up.
           Thanks.
           Please attach the robots.txt file that you have.


          • Alsuheimat
            Member
            • Jan 2016
            • 66

            #6
            Originally posted by SpaceStar
            Alsuheimat,

             I faced a similar issue. Google said it could not crawl the site because robots.txt did not exist. I created a dummy file with the following text in it:

             User-Agent:*

             and put it in the root of my website with sufficient read permissions. Google and Bing picked it up.
            I want the file that you have


            • Madekozu
              New Member
              • Aug 2015
              • 23
              • 5.1.x

              #7
               Here is part of my robots.txt as an example:


              Code:
              Sitemap: http://madekozu.de/vbulletin_sitemap_index.xml.gz
              
              User-agent: *
              Disallow: /admincp/
              Disallow: /usage/
              Disallow: /logs/
              Disallow: /rausda.php
              
              User-agent: SentiBot
              Disallow: /
              
              User-agent: SentiBot www.sentibot.eu (compatible with Googlebot)
              Disallow: /
              
              User-agent: MJ12bot
              Disallow: /
              
              User-agent: MegaIndex.ru/2.0
              Disallow: /

               It makes sense to use your own settings. Which bots may crawl and which may not? You'll have to decide that for your project!
               It works.


              • SpaceStar
                Senior Member
                • Apr 2005
                • 171

                #8
                Originally posted by Alsuheimat

                I want the file that you have
                 It is just a blank file with only one entry; I've attached it anyway....

                Attached Files


                • Alsuheimat
                  Member
                  • Jan 2016
                  • 66

                  #9
                  Originally posted by Madekozu
                   Here is part of my robots.txt as an example:


                  Code:
                  Sitemap: http://madekozu.de/vbulletin_sitemap_index.xml.gz
                  
                  User-agent: *
                  Disallow: /admincp/
                  Disallow: /usage/
                  Disallow: /logs/
                  Disallow: /rausda.php
                  
                  User-agent: SentiBot
                  Disallow: /
                  
                  User-agent: SentiBot www.sentibot.eu (compatible with Googlebot)
                  Disallow: /
                  
                  User-agent: MJ12bot
                  Disallow: /
                  
                  User-agent: MegaIndex.ru/2.0
                  Disallow: /

                   It makes sense to use your own settings. Which bots may crawl and which may not? You'll have to decide that for your project!
                   Thanks.


                  • Alsuheimat
                    Member
                    • Jan 2016
                    • 66

                    #10
                    Originally posted by SpaceStar

                     It is just a blank file with only one entry; I've attached it anyway....
                     Thanks.


                    • SpaceStar
                      Senior Member
                      • Apr 2005
                      • 171

                      #11
                      Originally posted by Madekozu
                       Here is part of my robots.txt as an example:


                      Code:
                      Sitemap: http://madekozu.de/vbulletin_sitemap_index.xml.gz
                      
                      User-agent: *
                      Disallow: /admincp/
                      Disallow: /usage/
                      Disallow: /logs/
                      Disallow: /rausda.php
                      
                      User-agent: SentiBot
                      Disallow: /
                      
                      User-agent: SentiBot www.sentibot.eu (compatible with Googlebot)
                      Disallow: /
                      
                      User-agent: MJ12bot
                      Disallow: /
                      
                      User-agent: MegaIndex.ru/2.0
                      Disallow: /

                       It makes sense to use your own settings. Which bots may crawl and which may not? You'll have to decide that for your project!
                      Good example there. I may also exclude Baidu ...
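                       As a sketch, excluding Baidu would just be one more block in the same style (Baiduspider is the user-agent Baidu's crawler reports):

                       Code:
                       # Block Baidu's crawler in addition to the bots above
                       User-agent: Baiduspider
                       Disallow: /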

