+blb9556 Posted January 5, 2008
If you type your username into Google you get your forum posts and maybe even some finds. Look at mine (Visit Link). I thought you had to pay to have things on Google, but I guess not.
Jeremy Posted January 5, 2008
You don't pay to have Google index your site. If it is publicly visible they suck it down and index it, then slap a contextual ad on their search results to make money on the content they index.
+Yossarian Posted January 5, 2008
I wonder if that could have anything to do with the forum becoming slow or unusable (too many connections error) occasionally. Maybe Google's crawler is trying to suck down the content too quickly and overloading the server.
+Miragee Posted January 5, 2008
"I wonder if that could have anything to do with the forum becoming slow or unusable (too many connections error) occasionally. Maybe Google's crawler is trying to suck down the content too quickly and overloading the server."
Interesting thought to post right now. For the last half hour or so, the Forums have been nearly unusable. I have a very slow dialup Internet connection, but have been able to get to GC.com pages, while these Forums have been almost completely unresponsive . . .
+carleenp Posted January 6, 2008
"I wonder if that could have anything to do with the forum becoming slow or unusable (too many connections error) occasionally. Maybe Google's crawler is trying to suck down the content too quickly and overloading the server."
Highly unlikely. Google does a very good job of not overloading sites when it sends bots in to index.
Jeremy Posted January 6, 2008
"Highly unlikely. Google does a very good job of not overloading sites when it sends bots in to index."
We had almost 500 users simultaneously banging on the site at that time. Fortunately it looks as though the forum software had a performance upgrade recently. When we get back from CES we'll look into upgrading the site. Unfortunately there is some custom code linking Geocaching.com to the forums, so we'll have to update and test that before we can install the upgrade.
+TeamHoopz Posted January 6, 2008
It's possible to stop most robots from crawling your website by creating a file called robots.txt in the root of the site. Using certain directives in the file, you can tell the robots what they are and aren't allowed to look at. It's not going to stop all of them, but any that adhere to the robots.txt file (most of the major search engines) will be stopped. I'm sure the web guys already know this, but I thought I'd share with everyone else.
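As a sketch of what that file looks like: the paths below are hypothetical (a real site would use its actual forum URLs), and note that compliance is voluntary on the crawler's part.

```text
# robots.txt, served from the site root, e.g. https://example.com/robots.txt
# Hypothetical example: keep well-behaved crawlers out of expensive
# dynamic pages while leaving the rest of the site indexable.

User-agent: *
Disallow: /forums/search/
Disallow: /forums/profile/

# Some crawlers (not Googlebot) also honor a non-standard Crawl-delay,
# a minimum number of seconds between requests:
User-agent: Slurp
Crawl-delay: 10
```

Each `User-agent` line starts a rule group for the named bot (`*` matches any), and the `Disallow` lines under it list URL path prefixes that bot shouldn't fetch. An empty `Disallow:` or a missing group means everything is allowed.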
+altosaxplayer Posted January 6, 2008
"Highly unlikely. Google does a very good job of not overloading sites when it sends bots in to index."
I agree. Google is great at this and sometimes will even abort their crawling if they see that the site is being overloaded. It can sometimes take them days and several bots to completely index a site because of this.