Our topics in forums are on Google!


blb9556


I wonder if that could have anything to do with the forum becoming slow or unusable (too many connections error) occasionally. Maybe Google's crawler is trying to suck down the content too quickly and overloading the server.

Interesting thought to post right now. For the last half hour or so, the Forums have been nearly unusable. I have a very slow dialup Internet connection, but have been able to get to GC.com pages, while these Forums have been almost completely unresponsive . . . :rolleyes:


I wonder if that could have anything to do with the forum becoming slow or unusable (too many connections error) occasionally. Maybe Google's crawler is trying to suck down the content too quickly and overloading the server.

 

Highly unlikely. Google does a very good job at not overloading sites when it sends bots in to index.


I wonder if that could have anything to do with the forum becoming slow or unusable (too many connections error) occasionally. Maybe Google's crawler is trying to suck down the content too quickly and overloading the server.

 

Highly unlikely. Google does a very good job at not overloading sites when it sends bots in to index.

 

We had almost 500 users simultaneously banging on the site at that time. Fortunately it looks as though the forum software had a performance upgrade recently. When we get back from CES we'll look into upgrading the site. Unfortunately there is some custom code linking Geocaching.com to the forums, so we'll have to update that code and test it before we can install the upgrade.


It's possible to stop most robots from crawling your website by creating a file called robots.txt in the root of the site. Using certain commands in the file, you can tell the robots what they are/aren't allowed to look at. It's not going to stop all of them, but any that adhere to the robots.txt file (most of the major search engines) will be stopped.
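
For anyone curious, a bare-bones robots.txt along those lines might look something like the lines below. The path and the bot name are just made-up examples, not the actual site layout:

# Keep well-behaved crawlers out of the forums (example path only)
User-agent: *
Disallow: /forums/

# Block one hypothetical badly behaved bot from the whole site
User-agent: SomeBadBot
Disallow: /

Bots that follow the standard fetch /robots.txt from the root of the site before crawling anything else, and an empty Disallow: line means a bot is allowed to crawl everything.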

 

I'm sure the web guys already know this, but I thought I'd share with everyone else :anicute:

Highly unlikely. Google does a very good job at not overloading sites when it sends bots in to index.

 

I agree. Google is great at this and will sometimes even abort its crawling if it sees that the site is being overloaded. Because of this, it can take days and several bot visits to completely index a site.

This topic is now closed to further replies.