Change most users ever online to specific amount, rather than reset completely

  • tnedator
    Senior Member
    • Aug 2007
    • 329

    Change most users ever online to specific amount, rather than reset completely

    I don't know what happened last night, but my max users ever online jumped from 155 or so to 501. Based on Google Analytics there is no way that was actual users/guests, so I imagine there must have been a search-bot flood or something else.

    I have seen queries posted that look like they will reset the max online users to 0, but is there any way for me to set it back to 155? My members like checking that figure, and while the jump to 501 makes them think something is wrong, I would hate to revert it all the way back to 0.

    Thanks
    BroncosForums.com - Broncos Fan Forum
    Total Broncos - Denver Broncos Blog and News
  • Jake Bunce
    Senior Member
    • Dec 2000
    • 46598
    • 3.6.x

    #2
    Run this query:

    UPDATE datastore
    SET data = 'a:2:{s:9:"maxonline";i:563;s:13:"maxonlinedate";i:1182947492;}'
    WHERE title = 'maxloggedin'

    The first number is the max online users. The second number is the timestamp.
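    If you don't want to work out the timestamp by hand, you can let MySQL build the value for you with something along these lines (the count and date below are only examples; substitute your own):

    UPDATE datastore
    SET data = CONCAT(
        'a:2:{s:9:"maxonline";i:', 155,                                      -- the max online count you want
        ';s:13:"maxonlinedate";i:', UNIX_TIMESTAMP('2007-08-15 12:00:00'),   -- the date/time it "happened"
        ';}')
    WHERE title = 'maxloggedin'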


    • tnedator
      Senior Member
      • Aug 2007
      • 329

      #3
      Originally posted by Jake Bunce
      Run this query:

      UPDATE datastore
      SET data = 'a:2:{s:9:"maxonline";i:563;s:13:"maxonlinedate";i:1182947492;}'
      WHERE title = 'maxloggedin'

      The first number is the max online users. The second number is the timestamp.
      Ok. Good news, bad news. The query worked. Thanks a bunch. For anyone else trying to figure out how to do this, here is a link to a Unix timestamp converter: http://www.onlineconversion.com/unix_time.htm
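      (If you are running queries anyway, MySQL can also do the conversion itself; a couple of example queries, with illustrative values:)

      SELECT UNIX_TIMESTAMP('2007-08-28 12:00:00');   -- date/time -> Unix timestamp
      SELECT FROM_UNIXTIME(1182947492);               -- Unix timestamp -> date/time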

      Now the bad news. There are currently around 300 Yahoo spiders on my site, so once again the max count just shot up.

      Do you know if there is any way to not count search bots, or to limit the number of spiders Yahoo sends at once? On the one hand I am glad Yahoo has started to catalog my site big time, but I hate that it is skewing my stats so badly.

      Thanks
      BroncosForums.com - Broncos Fan Forum
      Total Broncos - Denver Broncos Blog and News


      • tnedator
        Senior Member
        • Aug 2007
        • 329

        #4
        I did some searching after I posted my last message and figured that a robots.txt file (which I previously didn't have) was the best option. My robots.txt currently contains the following. Does this look good?

        Code:
         
        User-agent: *
        Disallow: /forums/memberlist.php
        Disallow: /forums/attachment.php
        Disallow: /forums/avatar.php
        Disallow: /forums/editpost.php
        Disallow: /forums/member.php
        Disallow: /forums/member2.php
        Disallow: /forums/misc.php
        Disallow: /forums/moderator.php
        Disallow: /forums/newreply.php
        Disallow: /forums/newthread.php
        Disallow: /forums/online.php
        Disallow: /forums/poll.php
        Disallow: /forums/postings.php
        Disallow: /forums/printthread.php
        Disallow: /forums/private.php
        Disallow: /forums/report.php
        Disallow: /forums/search.php
        Disallow: /forums/sendtofriend.php
        Disallow: /forums/threadrate.php
        Disallow: /forums/usercp.php
        Disallow: /forums/admincp/
        Disallow: /forums/modcp/
        Disallow: /forums/images/
        Disallow: /forums/sendmessage.php
        Disallow: /forums/register.php
        Disallow: /forums/subscription.php
        User-agent: Slurp
        Crawl-delay: 60
        BroncosForums.com - Broncos Fan Forum
        Total Broncos - Denver Broncos Blog and News


        • Jake Bunce
          Senior Member
          • Dec 2000
          • 46598
          • 3.6.x

          #5
          A robots file is a good way to block search spiders. There is no feature in vBulletin to exclude spiders from the online count.


          • tnedator
            Senior Member
            • Aug 2007
            • 329

            #6
            Originally posted by Jake Bunce
            A robots file is a good way to block search spiders. There is no feature in vBulletin to exclude spiders from the online count.
            Thanks, as always, for the quick reply.

            Do you know if that list of "disallows" is about right to make sure the spiders still crawl the posts, but not the non-post pages?
            BroncosForums.com - Broncos Fan Forum
            Total Broncos - Denver Broncos Blog and News


            • Jake Bunce
              Senior Member
              • Dec 2000
              • 46598
              • 3.6.x

              #7
              That looks correct to me.


              • ceho
                Senior Member
                • Mar 2008
                • 439
                • 4.1.x

                #8
                Hi, I ran the query as stated above (adjusted number & date of course) but received this error message:

                An error occurred while attempting to execute your query. The following information was returned.
                error number: 1146
                error desc: Table 'DB123456.datastore' doesn't exist
                My version is vb 3.6.9. Any idea what might be wrong?

                Thanks a lot!


                • Jake Bunce
                  Senior Member
                  • Dec 2000
                  • 46598
                  • 3.6.x

                  #9
                  You must be using a table prefix. Add the prefix onto the word datastore. The prefix can be seen in your includes/config.php file.
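                  For example, if your prefix happened to be vb_ (just an illustration; use whatever is actually set in config.php), the query becomes:

                  UPDATE vb_datastore
                  SET data = 'a:2:{s:9:"maxonline";i:563;s:13:"maxonlinedate";i:1182947492;}'
                  WHERE title = 'maxloggedin'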


                  • ceho
                    Senior Member
                    • Mar 2008
                    • 439
                    • 4.1.x

                    #10
                    I still need to get used to this... ;-)

                    Thank you very much, it worked easily!


                    • juice
                      Member
                      • Jan 2008
                      • 75
                      • 3.6.x

                      #11
                      Originally posted by tnedator
                      I did some searching after I posted my last message and figured a robots.txt (which I previously didn't have) was the best option. My robots.txt currently has the following. Does this look good?

                      Code:
                       
                      User-agent: *
                      Disallow: /forums/memberlist.php
                      Disallow: /forums/attachment.php
                      Disallow: /forums/avatar.php
                      Disallow: /forums/editpost.php
                      Disallow: /forums/member.php
                      Disallow: /forums/member2.php
                      Disallow: /forums/misc.php
                      Disallow: /forums/moderator.php
                      Disallow: /forums/newreply.php
                      Disallow: /forums/newthread.php
                      Disallow: /forums/online.php
                      Disallow: /forums/poll.php
                      Disallow: /forums/postings.php
                      Disallow: /forums/printthread.php
                      Disallow: /forums/private.php
                      Disallow: /forums/report.php
                      Disallow: /forums/search.php
                      Disallow: /forums/sendtofriend.php
                      Disallow: /forums/threadrate.php
                      Disallow: /forums/usercp.php
                      Disallow: /forums/admincp/
                      Disallow: /forums/modcp/
                      Disallow: /forums/images/
                      Disallow: /forums/sendmessage.php
                      Disallow: /forums/register.php
                      Disallow: /forums/subscription.php
                      User-agent: Slurp
                      Crawl-delay: 60
                      How and where do you add this robots.txt file? I want to avoid this situation as well.


                      • Jake Bunce
                        Senior Member
                        • Dec 2000
                        • 46598
                        • 3.6.x

                        #12
                        The file is called robots.txt and it is placed in your web root.
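                        A rough sketch of a typical layout (public_html as the web root is just an assumption; the point is that the file sits at the top level of your domain, not inside the /forums/ directory):

                        public_html/robots.txt          <-- create the file here (www.yoursite.com/robots.txt)
                        public_html/forums/index.php    <-- the forum itself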

