Way to avoid being penalised by Google and Yahoo

  • funfun168
    New Member
    • May 2006
    • 16
    • 3.5.x

    Way to avoid being penalised by Google and Yahoo

    If a search engine such as Google or Yahoo delists or penalises a site, it can kill it. On a forum it is difficult to control the content and outbound links in members' posts. Is there any way to avoid being penalised by Google and Yahoo?

    One case comes to mind: if a member posts a bad link (i.e. a link that has already been penalised by a search engine), or at the extreme a link farm, what will the search engine do to my forum?

    Are there any other cases that would lead to a penalty?
    May we discuss them here? Thank you.
  • TruthElixirX
    Senior Member
    • Sep 2004
    • 1004
    • 3.6.x

    #2
    Personally I see SEO as modern-day snake oil. Just build content and link-backs will come naturally.

    Comment

    • Floris
      Senior Member
      • Dec 2001
      • 37767

      #3
      Originally posted by TruthElixirX
      Personally I see SEO as modern-day snake oil. Just build content and link-backs will come naturally.
      Yep, I agree with that

      Comment

      • Joe Page
        Senior Member
        • Sep 2002
        • 671

        #4
        Don't worry too much about it

        Comment

        • Icheb
          Senior Member
          • Nov 2002
          • 1291

          #5
          Originally posted by funfun168
          One case comes to mind: if a member posts a bad link (i.e. a link that has already been penalised by a search engine), or at the extreme a link farm, what will the search engine do to my forum?
          Add rel="nofollow" to all such links.
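          The suggestion above can be sketched in a few lines. A minimal Python sketch, assuming posts are stored as HTML and with `forums.example.com` standing in for your own host (the hostname and the regex-based approach are illustrative; a real implementation would use an HTML parser or vBulletin's own output code):

```python
import re

FORUM_HOST = "forums.example.com"  # hypothetical host; substitute your own

def add_nofollow(html: str) -> str:
    """Add rel="nofollow" to anchor tags that point off-site, so search
    engines do not pass link equity to (or associate you with) posted URLs."""
    def rewrite(match: re.Match) -> str:
        attrs = match.group(1)
        # Leave internal links, and tags that already carry a rel, untouched.
        if FORUM_HOST in attrs or "rel=" in attrs:
            return match.group(0)
        return '<a rel="nofollow"' + attrs + ">"
    return re.sub(r"<a([^>]*)>", rewrite, html, flags=re.IGNORECASE)
```

          With this in place, a posted link such as `<a href="http://www.yahoo.com">Yahoo</a>` is served with `rel="nofollow"` added, which tells crawlers not to count the link as an endorsement.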

          Comment

          • funfun168
            New Member
            • May 2006
            • 16
            • 3.5.x

            #6
            Originally posted by Icheb
            Add rel="nofollow" to all such links.
            Yes, I think it is necessary to "kill" such outbound links for our own good.

            For example, www.yahoo.com

            How do I remove this outbound link while preserving its meaning?

            Comment

            • Shining Arcanine
              Senior Member
              • Feb 2003
              • 2482
              • 3.0.3

              #7
              Originally posted by funfun168
              If a search engine such as Google or Yahoo delists or penalises a site, it can kill it. On a forum it is difficult to control the content and outbound links in members' posts. Is there any way to avoid being penalised by Google and Yahoo?

              One case comes to mind: if a member posts a bad link (i.e. a link that has already been penalised by a search engine), or at the extreme a link farm, what will the search engine do to my forum?

              Are there any other cases that would lead to a penalty?
              May we discuss them here? Thank you.
              Use the rel="nofollow" attribute on links. Currently this is hard-coded (I do not know if 3.6.0 changes this), so it is impossible to change without modifying files, which would nullify support from Jelsoft.

              Comment

              • ThorstenA
                Senior Member
                • Nov 2004
                • 3082
                • 4.0.x

                #8
                Originally posted by Shining Arcanine
                Use the rel="nofollow" attribute on links. Currently this is hard-coded (I do not know if 3.6.0 changes this), so it is impossible to change without modifying files, which would nullify support from Jelsoft.
                I think I saw a product on vb.org that added a rel="nofollow" to all external links.

                Comment

                • kevinmanphp
                  Senior Member
                  • Jul 2005
                  • 389

                  #9
                  Hell... the real problem arises with duplicate content and numerous URLs pointing to the same file. I asked whether the solution I came up with would work... but have yet to hear a response. Personally... I can't believe no one has really looked at this:
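                  The "numerous URLs to the same file" problem can be made concrete: a single forum thread is reachable under many query-string variants (session id, highlight terms, style parameters). A small Python sketch of the normalisation a deduplicating crawler might apply; the parameter names here are an assumption for illustration, not a documented list:

```python
from urllib.parse import urlparse, parse_qsl, urlencode

# Parameters assumed to change presentation but not content; session ids
# and highlight terms are the typical culprits on vBulletin-style URLs.
IGNORED_PARAMS = {"s", "highlight", "styleid"}

def canonical(url: str) -> str:
    """Collapse the many URL variants a forum emits for one page into a
    single canonical form, roughly as a deduplicating crawler might."""
    parts = urlparse(url)
    kept = sorted((k, v) for k, v in parse_qsl(parts.query)
                  if k not in IGNORED_PARAMS)
    query = "?" + urlencode(kept) if kept else ""
    return parts.scheme + "://" + parts.netloc + parts.path + query
```

                  Under this scheme, both `showthread.php?s=deadbeef&t=42` and `showthread.php?t=42&highlight=seo` reduce to the same `showthread.php?t=42`.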

                  Comment

                  • Scott MacVicar
                    Former vBulletin Developer
                    • Dec 2000
                    • 13286

                    #10
                    We're not looking because Google says the issue with duplicate content isn't real; it's been posted on their development blog that the algorithms they use are smart enough to know it's not real duplicate content.

                    Read http://www.bytestart.co.uk/content/p...-content.shtml

                    Then read http://www.searchenginepromotionhelp...nalty-real.php

                    What is the proof that duplicate-content penalties actually exist?

                    I'm all for making things easier for a search engine, but should we go through the files and hurt usability by removing features some people use, making the code more complex and potentially using more resources, based on something that may or may not exist?

                    Google implements a filter, not a penalty: only one copy of duplicate content will be shown.
                    Scott MacVicar

                    My Blog | Twitter

                    Comment

                    • kevinmanphp
                      Senior Member
                      • Jul 2005
                      • 389

                      #11
                      I am well aware of the status of duplicate content with Google. However, the problem is the "filtering process" you made mention of: the Supplemental Index, which, as we all know, is not included in the search results, since everything in that index is deemed "untrustworthy".

                      Furthermore, if Google notices that a large portion (say 80%) of your website is being filtered into the supplemental index (which is exactly what is happening to my forums), then Google will deliberately slow its crawling of the links and may even halt deeper crawling of certain sections of the site, since it is a complete waste of Googlebot's time to index a site when 80% of its crawl will be dumped into the supplemental index.

                      This is the very reason many dynamic sites, such as e-commerce sites, have a very hard time getting Google to crawl deep pages past the initial 30% scrape. I am certainly not trying to bust anyone's balls here, so don't get me wrong. But I do want to at least highlight that there are "deficiencies" that can be cured.
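                      One mitigation admins used for this was to keep crawlers out of the duplicate entry points entirely via robots.txt. A sketch, assuming a stock vBulletin 3.x install in the web root (printthread.php and showpost.php serve the same content as showthread.php; adjust paths to your own setup):

```
User-agent: *
Disallow: /printthread.php
Disallow: /showpost.php
Disallow: /sendmessage.php
```

                      This does not fix the underlying URL proliferation, but it stops Googlebot from spending its crawl budget on pages that would only land in the supplemental index anyway.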

                      Comment

                      • TruthElixirX
                        Senior Member
                        • Sep 2004
                        • 1004
                        • 3.6.x

                        #12
                        Originally posted by kevinmanphp
                        I am well aware of the status of duplicate content with Google. However, the problem is the "filtering process" you made mention of: the Supplemental Index, which, as we all know, is not included in the search results, since everything in that index is deemed "untrustworthy".

                        Furthermore, if Google notices that a large portion (say 80%) of your website is being filtered into the supplemental index (which is exactly what is happening to my forums), then Google will deliberately slow its crawling of the links and may even halt deeper crawling of certain sections of the site, since it is a complete waste of Googlebot's time to index a site when 80% of its crawl will be dumped into the supplemental index.

                        This is the very reason many dynamic sites, such as e-commerce sites, have a very hard time getting Google to crawl deep pages past the initial 30% scrape. I am certainly not trying to bust anyone's balls here, so don't get me wrong. But I do want to at least highlight that there are "deficiencies" that can be cured.
                        Sources?

                        Comment

                        • Scott MacVicar
                          Former vBulletin Developer
                          • Dec 2000
                          • 13286

                          #13
                          I'd like to read your sources on this too; you quote figures and behaviours, but I presumed that information was only available to Google engineers.

                          I'm just curious about what is truth and what is myth, since most SEO documents are either syndicated from another site or appear to be pure speculation.
                          Scott MacVicar

                          My Blog | Twitter

                          Comment
