
Rate Limit Exceeded


ice13-333


But as far as this one, there's a very easy fix for it ..... DON'T THROTTLE PREMIUM MEMBERS !!! Bots don't typically pay.

 

I wish this were true. It would be an easy way to relieve the pain for a certain group of users. Unfortunately, several of the bots were using sock puppet accounts (thanks to Jon Stanley for getting me hip with the lingo), and some IPs use more than one paid account, and some of the bots were using accounts from more than one IP. So, it's not like we're being targeted by an intelligence agency, but it's more than just someone who turned on a screen scraper from their home server.

Link to comment

The warning sounds like a funny one (no offense to people who have been frustrated by it). I've tried my darndest, doing everything that people have said they have done while getting this warning, and I can't get it to happen to me. I'm quite put out. :lol:

Link to comment

This is just hilarious. I understand that some throttling is needed in order to keep bots from scraping the whole database, but normal members should under no circumstances be affected by this.

 

I haven't tried yet, but by reading the reports and talking to friends who have been banned, it appears as though my normal routine of logging caches would get me banned in no time.

Link to comment

One thing that may be confusing is that not every page is throttled. The home page, for example, is not throttled; even if you're throttled, you can still access it. The only pages we've throttled are those that heavily use the db or allow site scrapers to violate the terms of use. So you may have triggered the throttling code, gotten the error, gone back to the home page, thought you were good to go, and then returned to a throttled page and gotten the "Whoa" message again. It's easy to think you were throttled a second time when you were most likely still throttled from the first time.
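To make the behavior described above concrete, here is a minimal sketch, not Groundspeak's actual code: the route names and the 5-minute suspension are assumptions. It shows throttling that is checked only on db-heavy pages, with a suspension that persists even while unthrottled pages keep loading fine.

```python
import time

# Hypothetical sketch of the behavior described above. Route names, the
# 5-minute suspension, and the overall structure are illustrative
# assumptions, not the site's real implementation.

THROTTLED_ROUTES = {"/seek/cache_details.aspx", "/map/default.aspx", "/seek/nearest.aspx"}
SUSPEND_SECONDS = 300

suspended_until = {}   # account -> unix time when the suspension expires

def handle_request(account, route, now=None):
    now = time.time() if now is None else now
    if route not in THROTTLED_ROUTES:
        return "200 OK"                       # home page etc. is never throttled
    if suspended_until.get(account, 0) > now:
        return "Whoa! Slow down"              # still the *original* suspension
    if over_rate_limit(account, now):         # rate check runs only on heavy pages
        suspended_until[account] = now + SUSPEND_SECONDS
        return "Whoa! Slow down"
    return "200 OK"

def over_rate_limit(account, now):
    # placeholder; see the sliding-window sketch later in the thread
    return False
```

Under a scheme like this, loading the home page successfully tells you nothing about whether the earlier suspension has expired.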

Link to comment

Well, bots, premiums, whatever. It's broken, as several members have pointed out. And it broke yesterday with an update, since we had never experienced it before. I wasn't going to mention anything about this having repercussions for members wanting to pay or not, although I was sure it would. But it has now been mentioned, which should serve as plenty of notice that another way needs to be found to combat bots .... rather than punishing honest members for doing what they have been doing for years without this problem.

 

It's been brought to the attention of those in charge plenty of times, so I'm not going to add anything further to it. I'll just sit back, use the site (if it lets me now) and trust that TPTB have recognized that a serious problem has occurred & will get it fixed.

 

Thanks for the work you do. But still .... please do this type of thing early in the week from now on. That one thing will keep you all from working on the weekend, and keep us from having our weekend caching plans scrunched.

Link to comment

This is just hilarious. I understand that some throttling is needed in order to keep bots from scraping the whole database, but normal members should under no circumstances be affected by this.

 

I haven't tried yet, but by reading the reports and talking to friends who have been banned, it appears as though my normal routine of logging caches would get me banned in no time.

 

The challenge is distinguishing normal users from bots. Bots access the system the way normal users do, except that bots can access the system much faster. We've set limits that we've only been able to exceed by repeatedly hitting F5 on a throttled page, without taking time to let the page fully refresh, let alone read any of it.
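For illustration only, a sliding-window counter of the sort implied above might look like the sketch below; the 60-second window and 30-hit limit are invented numbers, not the real thresholds.

```python
from collections import deque
import time

WINDOW = 60    # seconds (assumed)
LIMIT = 30     # page loads per window (assumed)

hits = {}      # account -> deque of recent request timestamps

def over_rate_limit(account, now=None):
    now = time.time() if now is None else now
    recent = hits.setdefault(account, deque())
    recent.append(now)
    while recent and recent[0] < now - WINDOW:
        recent.popleft()            # forget hits older than the window
    return len(recent) > LIMIT

# A reader opening a cache page every few seconds stays around 10-20 hits/min,
# well under the limit; holding F5 for several full reloads per second passes
# 30 hits in well under a minute and trips the check almost immediately.
```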

 

If you're being throttled, please let us know what pages you were browsing before you first saw the throttling code, along with the browser you're using and any plugins and Greasemonkey scripts you may be using for the site. We are trying to reproduce the problem so we can fix it.

Link to comment

ALL of the throttling pages have occurred for me when working with maps .... either trying to scroll them, switch between different views, zoom in or out, or click on a cache. I've experienced the issue at least once on each of these actions since about noon yesterday .... and at least a half-dozen times on most of them. Just happened again, the 2nd time today, when scrolling to another location about 3 inches away.

 

The GOOD thing is that it appears we are now only "banned" for about 5 minutes, compared to upwards of 20 minutes last night. So it MIGHT be getting better, but it's still not entirely fixed.

 

Latest edition of FF, no bots, no scripts, no GreaseMonkey, no nothing else .... just surfing & accessing the site as I always have in the past.

 

P.S. Might I kindly suggest that you ditch the Super-Hamster. Seems like the addition of him is what broke the site. Replace him with a Gerbil. I've extensively researched this in the last 45 seconds & they SHOULD co-habitate properly with the "regular" hamsters. But having an occasional Type A personality, I don't think the Gerbil will get along with a Super-Hamster ..... UNLESS ..... you use a female Gerbil. They will most likely be better at pushing the right buttons, but might be a little slower on the generator-wheel. And a female shouldn't get into a fight with the big hamster, & possibly even calm him down a bit.

 

P.P.S. Guinea Pigs are out of the question. They get too fat & too lazy too quickly. I guess you COULD starve them a bit & dangle the lettuce leaf just out of reach. But that might also get the unwanted attention of the HSUS. You don't want that.

Link to comment

We found a culprit. The time sync for one of our servers had failed and it was two minutes faster than the others. This could have caused our algorithm to think some users were accessing the site at a higher rate than they were. We got the time synced and saw the rate of suspensions drop. We're not convinced this was the only culprit and will continue to investigate for other causes.
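To see how a two-minute clock skew can do this, here is a small, purely illustrative simulation (the window, limit, and traffic pattern are all made up, not Groundspeak's real values): requests stamped by the fast server appear 120 seconds newer than they are, so a "hits in the last 60 seconds" count sweeps up far more requests than the user actually made.

```python
SKEW = 120      # server B's clock runs two minutes fast
WINDOW = 60     # assumed rate window, seconds
LIMIT = 20      # assumed per-window limit

# One request every 5 s for 3 minutes: 36 requests, a true rate of 12/min.
# Every other request happens to be handled (and timestamped) by the fast box.
requests = []
for i in range(36):
    real = i * 5
    stamped = real + (SKEW if i % 2 else 0)
    requests.append((real, stamped))

now = 36 * 5    # evaluate at t = 180 s on a correct clock
true_hits     = sum(1 for real, _ in requests if real >= now - WINDOW)
apparent_hits = sum(1 for _, stamp in requests if stamp >= now - WINDOW)

print("true hits in last 60 s:  ", true_hits)        # 12 -> under the limit
print("apparent hits with skew: ", apparent_hits)    # 24 -> looks like a bot
print("throttled!" if apparent_hits > LIMIT else "ok")
```

Once the clocks agree, the two counts match again, which is consistent with the drop in suspensions after the sync was fixed.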

Link to comment

It happened to me also. I was not logging finds, but simply looking at a number of caches / logs. I use Firefox. I was shut out for at least 15 minutes... could have been longer - I left and did something else, and came back a few hours later. Even closing all of my browser windows down and reopening them did not help.

Link to comment

We found a culprit. The time sync for one of our servers had failed and it was two minutes faster than the others. This could have caused our algorithm to think some users were accessing the site at a higher rate than they were. We got the time synced and saw the rate of suspensions drop. We're not convinced this was the only culprit and will continue to investigate for other causes.

 

Thank you for your time spent huntin' this one down. Hopefully you nailed it.

Sorry it ate into your Holiday weekend.

Link to comment
I wish this were true. It would be an easy way to relieve the pain for a certain group of users. Unfortunately, several of the bots were using sock puppet accounts (thanks to Jon Stanley for getting me hip with the lingo), and some IPs use more than one paid account, and some of the bots were using accounts from more than one IP. So, it's not like we're being targeted by an intelligence agency, but it's more than just someone who turned on a screen scraper from their home server.

So IP address is one of your primary criteria for deciding whether activity comes from a bot or not?

 

There are at least 3 premium members in my office. We all appear to come from the same external IP. If we all access the site simultaneously, you're saying we stand a higher chance of being flagged as TOU-violators?

Link to comment

 

If you're being throttled, please let us know what pages you were browsing before you first saw the throttling code, along with the browser you're using and any plugins and Greasemonkey scripts you may be using for the site. We are trying to reproduce the problem so we can fix it.

Could you please point us to even one example of a Greasemonkey script that has caused the throttling trigger to trip? I'm having a tough time seeing a script engine blamed for a problem possibly caused by server synchronization issues. I'm no Greasemonkey guru, but it seems to me that everything it does is client-side and occurs only after the HTML has already been fetched (anybody who knows differently, please educate me).
Link to comment

 

If you're being throttled, please let us know what pages you were browsing before you first saw the throttling code, along with the browser you're using and any plugins and Greasemonkey scripts you may be using for the site. We are trying to reproduce the problem so we can fix it.

Could you please point us to even one example of a Greasemonkey script that has caused the throttling trigger to trip? I'm having a tough time seeing a script engine blamed for a problem possibly caused by server synchronization issues. I'm no Greasemonkey guru, but it seems to me that everything it does is client-side and occurs only after the HTML has already been fetched (anybody who knows differently, please educate me).

A GM script can fetch things from the server if the programmer decides to do so. But I haven't seen any of that in the scripts that I've been using. In fact, Lil Devil's most recent script for the maps pages should actually lessen the load on the servers by preventing all the extra cache-loading while scrolling.

 

Blaming GM scripts is taking a cheap shot - blaming users opening tabs on their own is just plain low.

Link to comment
I wish this were true. It would be an easy way to relieve the pain for a certain group of users. Unfortunately, several of the bots were using sock puppet accounts (thanks to Jon Stanley for getting me hip with the lingo), and some IPs use more than one paid account, and some of the bots were using accounts from more than one IP. So, it's not like we're being targeted by an intelligence agency, but it's more than just someone who turned on a screen scraper from their home server.

So IP address is one of your primary criteria for deciding whether activity comes from a bot or not?

 

There are at least 3 premium members in my office. We all appear to come from the same external IP. If we all access the site simultaneously, you're saying we stand a higher chance of being flagged as TOU-violators?

 

Not if you're logged in.
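A hedged reading of that one-line answer: the limiter presumably keys its counters on the account when a session is logged in, and only falls back to the shared IP for anonymous traffic, so several premium members behind one office address would each get their own counter. A tiny sketch of that keying (an inference from the reply above, not a confirmed implementation detail):

```python
def rate_limit_key(account_id, client_ip):
    # key by account when logged in, by source IP otherwise (assumed behavior)
    return f"user:{account_id}" if account_id else f"ip:{client_ip}"

# Three logged-in colleagues behind one office IP get three separate counters...
assert rate_limit_key("alice", "203.0.113.7") != rate_limit_key("bob", "203.0.113.7")
# ...while anonymous visitors from that IP share a single one.
assert rate_limit_key(None, "203.0.113.7") == rate_limit_key(None, "203.0.113.7")
```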

Link to comment

 

If you're being throttled, please let us know what pages you were browsing before you first saw the throttling code, along with the browser you're using and any plugins and Greasemonkey scripts you may be using for the site. We are trying to reproduce the problem so we can fix it.

Could you please point us to even one example of a Greasemonkey script that has caused the throttling trigger to trip? I'm having a tough time seeing a script engine blamed for a problem possibly caused by server synchronization issues. I'm no Greasemonkey guru, but it seems to me that everything it does is client-side and occurs only after the HTML has already been fetched (anybody who knows differently, please educate me).

A GM script can fetch things from the server if the programmer decides to do so. But I haven't seen any of that in the scripts that I've been using. In fact, Lil Devil's most recent script for the maps pages should actually lessen the load on the servers by preventing all the extra cache-loading while scrolling.

 

Blaming GM scripts is taking a cheap shot - blaming users opening tabs on their own is just plain low.

 

I honestly don't know enough about the GM scripts being used to know whether they fetch additional data from the site or not. We know that it's possible and thus can't rule it out as a possibility. I also appreciate that there are some talented GM script creators who make our site easier to use. They are not being targeted. We are just doing our due diligence of investigating all possibilities.

Link to comment

 

If you're being throttled, please let us know what pages you were browsing before you first saw the throttling code, along with the browser you're using and any plugins and Greasemonkey scripts you may be using for the site. We are trying to reproduce the problem so we can fix it.

Could you please point us to even one example of a Greasemonkey script that has caused the throttling trigger to trip? I'm having a tough time seeing a script engine blamed for a problem possibly caused by server synchronization issues. I'm no Greasemonkey guru, but it seems to me that everything it does is client-side and occurs only after the HTML has already been fetched (anybody who knows differently, please educate me).

A GM script can fetch things from the server if the programmer decides to do so. But I haven't seen any of that in the scripts that I've been using. In fact, Lil Devil's most recent script for the maps pages should actually lessen the load on the servers by preventing all the extra cache-loading while scrolling.

 

Blaming GM scripts is taking a cheap shot - blaming users opening tabs on their own is just plain low.

 

I honestly don't know enough about the GM scripts being used to know whether they fetch additional data from the site or not. We know that it's possible and thus can't rule it out as a possibility. I also appreciate that there are some talented GM script creators who make our site easier to use. They are not being targeted. We are just doing our due diligence of investigating all possibilities.

 

I beg to differ. They are being targeted. They are being targeted by the language that has been used in this thread, and also by the language that is used on the "Whoah! Slow Down" page: Disable any add-ons to your browser, like Greasemonkey, that may be hitting the site too fast.

I have taken to using Firefox with the Greasemonkey script engine (I stress "engine", because it does nothing more than host the scripts) because 3rd party freeware authors have taken it upon themselves to provide, via Greasemonkey scripts, improvements that many of us have been asking TPTB for, for years, with little or no satisfaction.

Link to comment

Before anyone else claims we are persecuting the evil Greasemonkey, I'd just like to emphasize that we do not have a problem with users taking advantage of browser addons to improve their experience.

 

In terms of troubleshooting a problem, asking you to disable addons is akin to requesting you clear your cache or start a new browsing session. These are the kinds of things that help us get a pure reproduction of the error and in no way reflect a bias against 3rd party addons.

 

So please, relax and enjoy what's left of your weekend. We will keep workin' on it.

Link to comment
Before anyone else claims we are persecuting the evil Greasemonkey, I'd just like to emphasize that we do not have a problem with users taking advantage of browser addons to improve their experience.

 

In terms of troubleshooting a problem, asking you to disable addons is akin to requesting you clear your cache or start a new browsing session. These are the kinds of things that help us get a pure reproduction of the error and in no way reflect a bias against 3rd party addons.

 

So please, relax and enjoy what's left of your weekend. We will keep workin' on it.

Then may I suggest that you remove the part about Greasemonkey and say what it is that you are really trying to say?

 

Disable any add-ons to your browser, like Greasemonkey, that may be hitting the site too fast.

 

This whole thread began by falsely accusing the innocent. Let's not perpetuate that, OK?

 

[Edited to add: Of course, if the throttling is working properly, only the guilty will ever see that specific wording, so perhaps I'm over-reacting?]

Edited by knowschad
Link to comment
Then may I suggest that you remove the part about Greasemonkey and say what it is that you are really trying to say?

 

Disable any add-ons to your browser, like Greasemonkey, that may be hitting the site too fast.

 

This whole thread began by falsely accusing the innocent. Let's not perpetuate that, OK?

I agree with knowschad. In addition to being singled out on the page, we also have Jeremy on record as saying:

 

If you are running any Greasemonkey scripts you can quickly hit the throttling code.

(emphasis mine)

Link to comment

I know of one especially virulent Greasemonkey script that hits the site very hard.

 

It is not a Prime Suspect script, a Lil Devil script or other such script that we all rave about as being so helpful for shortcuts, page layout, etc.

 

Wouldn't it be silly and counterproductive to name this Greasemonkey script, or any of its robot cousins?

Link to comment

Wouldn't it be silly and counterproductive to name this Greasemonkey script, or any of its robot cousins?

No, it wouldn't be.

 

If people knew they were running a bad script, they might stop running it.

 

Also, without naming it, most people would still see this as a generic blaming of greasemonkey scripts overall, indicating that gs is still guessing why their throttling code is so buggy.

Link to comment
Wouldn't it be silly and counterproductive to name this Greasemonkey script, or any of its robot cousins?

No, actually, IMO it would not, as then people would know what to avoid.

 

In fact, if I were Groundspeak, I would publish a list of bad applications/scripts with the statement that using one would get you banned, along with another list of approved applications/scripts.

 

I am not privy to everything, but I don't believe that secrecy is helpful in this case.

Link to comment

Wouldn't it be silly and counterproductive to name this Greasemonkey script, or any of its robot cousins?

No, it wouldn't be.

 

If people knew they were running a bad script, they might stop running it.

 

Also, without naming it, most people would still see this as a generic blaming of greasemonkey scripts overall, indicating that gs is still guessing why their throttling code is so buggy.

I have no dog in this hunt, I'm only watching it because I do use Greasemonkey scripts even though I have no idea how they work. If there are Greasemonkey scripts that negatively impact the site I would like to know which ones they are so I can avoid them.

 

I suspect that anyone with ill intent could find them pretty easily, so not mentioning their name doesn't offer you any protection but leaves us vulnerable to unknowingly loading them.

Link to comment

The challenge is distinguishing normal users from bots. Bots access the system the way normal users do, except that bots can access the system much faster. We've set limits that we've only been able to exceed by repeatedly hitting F5 on a throttled page, without taking time to let the page fully refresh, let alone read any of it.

I haven't experienced the page in question, but I suspect that my normal usage (human + browser only, with no scripts, bots or pre-fetch functionality) could trigger a "ban".

 

Firstly, why should I read the whole page if I only want the GPX file (which has the same data, meant for later/offline-in-field reading)?

 

Secondly, the majority of users have a browser with "tab" capability. I have a tab-enabled browser as well. When I log finds, and it has been less than 7 days since I requested a "My Finds" PQ, I go to the page with my geocache logs http://www.geocaching.com/my/logs.aspx?s=1 . Then I center-click on each cache I want a GPX file from, which opens the geocache in a new tab without changing focus.

 

When I have selected all the caches I then go to the first tab, click on "Get GPX" (it automagically saves the GPX file where I want it), then type Ctrl+W (close tab, focus on the next one), and so on. Another option could be... click on "Get GPX", Ctrl+Tab to the next cache, and so on. When all the GPX downloads have finished, close the browser.

 

Depending on the response time of gs's servers sending me the GPX file, I would guess that I could manage more than one GPX request per second. And if I had 50+ finds, that might be 50 GPX requests within a minute (excluding the time I spend opening tabs).
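Running the arithmetic above against the kind of invented threshold sketched earlier in the thread (both numbers are assumptions, not Groundspeak's real limits) shows why this entirely manual workflow can look like a scraper:

```python
finds = 50
seconds_per_gpx = 1.2                      # one middle-click plus one "Get GPX"
burst_seconds = finds * seconds_per_gpx
hits_per_minute = finds * 60 / burst_seconds

print(f"{finds} GPX downloads in {burst_seconds:.0f} s -> {hits_per_minute:.0f} hits/min")
# ~50 hits/min comfortably exceeds an assumed 30-per-minute limit, so a pure
# rate rule cannot tell this tab-based routine apart from a bot.
```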

 

If I plan a geocaching tour, I might do the same for the 30-50 geocaches that I might want to search for on that tour, so I have up-to-date info and logs in the field. I do get PQs, but that data might be up to 7 days out of date.

 

Lastly, I would say that it is unfair that a few people can ruin it for the many. But I do think you are firing at abusers without aiming properly.

 

Ys

Thomas

Link to comment

Logging by hand is breaking the terms of use? What did I miss?

 

"by hand" means "using an automated tool" in snark, of which I am fluent.

 

At least it should be. Our records show that the user was throttled several times. This only happens if the user isn't looking at what they are doing, which generally means it is a bot. Regardless, read the message and slow down.

Why would I look at the whole cache description, if all I want is the GPX file for later/offline reading? In my other post, I have demonstrated that I easily could be seen as a bot (/abuser), simply by using the capabilities of my browser. But I do have some other thoughts/points for this post...

 

I would suggest that you take the time to think about why a few of your customers find it necessary to violate their EULA/TOU.

 

I do think you can boil it down to very few reasons. If you could make a few changes, I'll bet you that you could remove 90%, if not all, of the need for spidering and other abuse of gs's site:

  • Include the same number of logs in a PQ as in a directly downloaded GPX file for a cache
  • When defining a PQ, add the ability to get (recently?) archived geocaches
  • Lastly, I would suggest that requesting a GPX file for one's own caches should give you *all* logs

Why spider for GPX-files, if you can get the same in a PQ ?

 

Why spider for GPX-files on archived caches, if you can get it with a PQ ?

 

Instead of starting a battle against (potentially) millions of imaginative abusers, why not just remove the need to become an abuser? Then you would not accidentally harm legitimate customers, and if there is no need to abuse, the number of abusers would be very limited.

 

I'll bet you that buying CPU power and bandwidth is very much cheaper in the long run (hence eliminating most of the need for the battle) than paying developers and sysadmins to fight a never-ending battle.

 

Your staff (who are doing a very good job, btw) could then spend time developing nice-to-haves, instead of fighting abusers hiding among happy customers. In the end your employees would be happier as well, when they can dedicate their time to making things that make customers happy.

 

Just some thoughts from my keyboard...

 

Ys

Thomas

Link to comment

Logging by hand is breaking the terms of use? What did I miss?

 

"by hand" means "using an automated tool" in snark, of which I am fluent.

 

At least it should be. Our records show that the user was throttled several times. This only happens if the user isn't looking at what they are doing, which generally means it is a bot. Regardless, read the message and slow down.

 

Haven't ever heard of 'by hand' referring to an 'automated tool' in the programming and engineering world. We use 'by hand' to describe a true plug and chug / pencil and paper manual setup or debugging process.

 

Maybe we should just state that we feel it's like a "bite of the hand that feeds". Got the throttle message over an hour ago while manually retrieving (no scripts, bots, or anything other than straight mouse clicks in Firefox) the pages of a prolific area hider who had archived a large number of caches this past Saturday. The site let me back in about 15 minutes later, but it still won't load the maps. Just a limit-exceeded icon in their place.

 

To think I just gave a gift Premium Membership to someone on Friday. Really wondering if I should have just gotten them something else.

Edited by c_dog
Link to comment

Logging by hand is breaking the terms of use? What did I miss?

 

"by hand" means "using an automated tool" in snark, of which I am fluent.

 

At least it should be. Our records show that the user was throttled several times. This only happens if the user isn't looking at what they are doing, which generally means it is a bot. Regardless, read the message and slow down.

 

Haven't ever heard of 'by hand' referring to an 'automated tool' in the programming and engineering world. We use 'by hand' to describe a true plug and chug / pencil and paper manual setup or debugging process.

I think the point was that many times, when someone feels the need to point out that they're doing things "the slow way", they're really not, but trying to cover something up.

Link to comment

I have a novel idea. Instead of being on the defensive, go on the offensive. Use the programming time & funds available to create a top-notch, fully efficient cloud of servers that can kick that certain GC script's butt (which, btw, is one awesome tool in reality) by doing what it does (thus eliminating the problem) and providing an awesome platform for all of the world to enjoy geocaching. Imagine if Google deployed a "whoa page" on searches? That premise smacks of college sorority house mentality to me. The model is broken here, men. Turn your efforts to efficiency, not throttling.

 

If little ole Devil can do in a short time what many have been begging for (for a LONG time), can you just imagine what the experts (who KNOW the code) could do with that mentality!

 

Just saying, that is all - not attacking, just saying...

Link to comment

I've tried my darndest, doing everything that people have said they have done while getting this warning, and I can't get it to happen to me. I'm quite put out.

Me too! Well, I haven't tried to get it to occur, but I've done things as usual and have never gotten the super-human hamster warnings.

 

I have been very gentle on the map. I zoom-out one level, and every cache on the ENTIRE PLANET starts to load onto the screen, and now it's impossible to scroll it. I'm actually concerned about the heavy server load I cause with one mouse click, when I didn't even want a whole mass of icons.

 

I've only been super-human at the Atlanta Airport, using the $9.95 Wi-Fi, last April. I got blocked by Sprint Wi-Fi after 30 minutes, and the ONLY thing I can think happened is, I had 4 browser windows open. No power scripts, no add-ons, no monkey (I wasn't even using Geocaching.com back then). The Airport Wi-Fi Administration said I was obviously a major hacker with the “THOUSANDS of hits” I was overloading the network with. Nevermind that it doesn't happen at any other wi-fi hotspot. It's just an example of how it could happen.

 

What if I open 3 or 4 instances of IE to geocaching.com? It's possible to, say, start Tabs up, to look up a Travel Bug and a map, before doing a cache log. And if a couple of those Tabs aren't "logged in" -- I'd swear this is one of the TOUGHEST sites to stay logged into -- now I'm approaching super-human territory. I'm not going to test it, just noting that simultaneous Internet browsers have caused me issues in some situations.

Link to comment

OK, I've been watching this thread for a couple of days now and I'm seeing quite a bit of finger pointing from TPTB at Greasemonkey. Lil Devil made a point and now I'd like to reiterate - I cannot think of a single popular Geocaching Greasemonkey script that hits a Groundspeak domain with even a single request. Most simply rearrange or modify existing page elements and/or incorporate data from outside geocaching.com (like additional mapping, etc) into the page.

 

So, I'd like to request that the lackeys show us a list of these scripts that you say are hitting the site. If you can show me a script that hits the site, I for one will stop using it. If you can't, quit pointing the finger at a legitimate browser extension and the authors who put so much work into improving the site's usability for our benefit!

Link to comment

I've tried my darndest, doing everything that people have said they have done while getting this warning, and I can't get it to happen to me. I'm quite put out.

Me too! Well, I haven't tried to get it to occur, but I've done things as usual and have never gotten the super-human hamster warnings.

 

I have been very gentle on the map. I zoom-out one level, and every cache on the ENTIRE PLANET starts to load onto the screen, and now it's impossible to scroll it. I'm actually concerned about the heavy server load I cause with one mouse click, when I didn't even want a whole mass of icons.

 

I've only been super-human at the Atlanta Airport, using the $9.95 Wi-Fi, last April. I got blocked by Sprint Wi-Fi after 30 minutes, and the ONLY thing I can think happened is, I had 4 browser windows open. No power scripts, no add-ons, no monkey (I wasn't even using Geocaching.com back then). The Airport Wi-Fi Administration said I was obviously a major hacker with the “THOUSANDS of hits” I was overloading the network with. Nevermind that it doesn't happen at any other wi-fi hotspot. It's just an example of how it could happen.

 

What if I open 3 or 4 instances of IE to geocaching.com? It's possible to, say, start Tabs up, to look up a Travel Bug and a map, before doing a cache log. And if a couple of those Tabs aren't "logged in" -- I'd swear this is one of the TOUGHEST sites to stay logged into -- now I'm approaching super-human territory. I'm not going to test it, just noting that simultaneous Internet browsers have caused me issues in some situations.

 

Yesterday, I had about 10-15 tabs opened in IE, and 10-15 tabs opened in Firefox (with two different geocaching accounts) at the same time. I was actively using the geocaching map feature, zooming in and out, panning around and clicking on many caches.

 

Nuttin' happened to me. :) But I was able to do maintenance on a long (driving) multi, and found two new caches and bumped into some local cachers and found the caches with them. B)

Edited by Ambrosia
Link to comment

!UNBELIEVABLE!

 

That's the first time that a site has stopped me from surfing. And - nota bene - I'm a premium member and pay every year for having fun with gc.com. BUT now, should I really pay again? I've never seen anything like that before - hope this is only an April Fool's joke :-((

Link to comment

Wow. You guys sure can overreact.

OK - scary things are happening...

 

I'm agreeing with sbell111. :)

I think this is one of the signs of the coming of the Apocalypse.

 

Anyway, it isn't really clear - are people still being throttled now that the time sync issue has been fixed?

Link to comment

Adding my bug report since it looks like folks are still getting this:

 

4/3 11pm PST

Chrome, Vista, single tab, no add ons or scripts, logged in as Premium User

Looking for caches via the Google Map, about 50% zoomed. Scroll, pause to see if there are caches, scroll, pause for caches. I was looking in the Yukon Territory in Canada, so it wasn't like a lot of caches were loading. I had to wait about 20 mins to be let back into the site.

 

Thankfully I was just playing. If I had been trying to plan a caching trip for Sunday I wouldn't have been so good humored. Sounds like users are guilty until proven innocent, I feel bad for that poor German OP.

Link to comment

Logging by hand is breaking the terms of use? What did I miss?

 

"by hand" means "using an automated tool" in snark, of which I am fluent.

 

At least it should be. Our records show that the user was throttled several times. This only happens if the user isn't looking at what they are doing, which generally means it is a bot. Regardless, read the message and slow down.

Why would I look at the whole cache description, if all I want is the GPX file for later/offline reading? In my other post, I have demonstrated that I easily could be seen as a bot (/abuser), simply by using the capabilities of my browser. But I do have some other thoughts/points for this post...

 

I would suggest that you take the time to think about why a few of your customers find it necessary to violate their EULA/TOU.

 

I do think you can boil it down to very few reasons. If you could make a few changes, I'll bet you that you could remove 90%, if not all, of the need for spidering and other abuse of gs's site:

  • Include the same number of logs in a PQ as in a directly downloaded GPX file for a cache
  • When defining a PQ, add the ability to get (recently?) archived geocaches
  • Lastly, I would suggest that requesting a GPX file for one's own caches should give you *all* logs

Why spider for GPX-files, if you can get the same in a PQ ?

 

Why spider for GPX-files on archived caches, if you can get it with a PQ ?

 

Instead of starting a battle against (potentially) millions of imaginative abusers, why not just remove the need to become an abuser? Then you would not accidentally harm legitimate customers, and if there is no need to abuse, the number of abusers would be very limited.

 

I'll bet you that buying CPU power and bandwidth is very much cheaper in the long run (hence eliminating most of the need for the battle) than paying developers and sysadmins to fight a never-ending battle.

 

Your staff (who are doing a very good job, btw) could then spend time developing nice-to-haves, instead of fighting abusers hiding among happy customers. In the end your employees would be happier as well, when they can dedicate their time to making things that make customers happy.

 

Just some thoughts from my keyboard...

 

Ys

Thomas

 

You know this entire post really comes off as a threat.

Give it to me or else!! - is how I read it.

 

I think the users 'demanding' these features are VERY few and far between. Why can't you just live with the site as presented? Seems like a novel idea.

 

I am sorry for the legit users seeing issues but for the 'bad guys' suddenly having issues - I have no pity.

Link to comment

Wow. You guys sure can overreact.

OK - scary things are happening...

 

I'm agreeing with sbell111. B)

I think this is one of the signs of the coming of the Apocalypse.

 

Anyway, it isn't really clear - are people still being throttled now that the time sync issue has been fixed?

 

We're seeing less than .01% of users/hr being throttled. We have yet to determine how many of those are valid. We're still seeing reports from users that they're being throttled, but very few in comparison to before the time sync issue. We plan to release an update to the throttling code tomorrow to gather more information to help determine what issues still exist.

Link to comment

 

We're seeing less than .01% of users/hr being throttled. We have yet to determine how many of those are valid. We're still seeing reports from users that they're being throttled, but very few in comparison to before the time sync issue. We plan to release an update to the throttling code tomorrow to gather more information to help determine what issues still exist.

 

I for one would like to thank the gc.com team for working hard to alleviate any problems, and if you have things to the point of only having .01%/hr being throttled, you're doing a great job.

 

I agree, that some people are very quick to overreact. I'm a Web Developer / Programmer myself and it can be an extremely thankless job. Many people don't understand how much work goes into a website, especially one that gets SOOO much traffic.

 

I am happy to pay for my premium membership. I know how hard you folks must be working. Thank you.

Link to comment

We're seeing less than .01% of users/hr being throttled. We have yet to determine how many of those are valid. We're still seeing reports from users that they're being throttled, but very few in comparison to before the time sync issue. We plan to release an update to the throttling code tomorrow to gather more information to help determine what issues still exist.

80,000 premium members, so .01% would be a grand total of 8 users being throttled?

 

I kinda think from the volume of this thread that may be off just a tad.

Link to comment
80,000 premium members, so .01% would be a grand total of 8 users being throttled?

 

I kinda think from the volume of this thread that may be off just a tad.

Questions and observations:

  • 0.01% of users per hour
  • Where does the 80,000 premium users figure come from?
  • Are basic users affected as well? I thought they were, but perhaps I was mistaken.

Link to comment

I would suggest that you take the time to think about why a few of your customers find it necessary to violate their EULA/TOU.

 

I do think you can boil it down to very few reasons. If you could make a few changes, I'll bet you that you could remove 90%, if not all, of the need for spidering and other abuse of gs's site:

  • Include the same number of logs in a PQ as in a directly downloaded GPX file for a cache
  • When defining a PQ, add the ability to get (recently?) archived geocaches
  • Lastly, I would suggest that requesting a GPX file for one's own caches should give you *all* logs

Why spider for GPX-files, if you can get the same in a PQ ?

 

Why spider for GPX-files on archived caches, if you can get it with a PQ ?

 

Instead of starting a battle against (potentially) millions of imaginative abusers, why not just remove the need to become an abuser? Then you would not accidentally harm legitimate customers, and if there is no need to abuse, the number of abusers would be very limited.

 

I'll bet you that buying CPU power and bandwidth is very much cheaper in the long run (hence eliminating most of the need for the battle) than paying developers and sysadmins to fight a never-ending battle.

 

Your staff (who are doing a very good job, btw) could then spend time developing nice-to-haves, instead of fighting abusers hiding among happy customers. In the end your employees would be happier as well, when they can dedicate their time to making things that make customers happy.

 

Just some thoughts from my keyboard...

 

Ys

Thomas

 

You know this entire post really comes off as a threat.

Give it to me or else!! - is how I read it.

 

I'm sorry if that's how it sounds, because it was definitely not how it was intended.

I think the users 'demanding' these features are VERY few and far between. Why can't you just live with the site as presented? Seems like a novel idea.

To clarify... I don't need these changes; I'm perfectly happy with the functionality as is. Actually, I still haven't seen the warning page in question.

 

My point was that if you rethink spending time battling abusers, and instead put the same resources into changing the site so that abusers no longer have any reason to abuse, those resources would be used in a constructive way instead.

 

I.e. instead of trying to battle abusers (and harming some legitimate users in the fight), why not just remove the reason for abusing?

 

I did not state that we should get 20 logs/cache in a PQ, just that you should get the same number in a directly downloaded GPX as you get in a PQ. That could be 10, or even just the last 5 logs, in both the GPX and the PQ. The main thing is that you remove the reason to spider for GPX files when you can get the same data in a PQ.

I am sorry for the legit users seeing issues but for the 'bad guys' suddenly having issues - I have no pity.

(sorry for going AOL) me2.

 

But I do have pity on the legit users, getting slapped on their hands, because it has been decided to battle abusers instead of just removing their reasons to abuse.

 

Lastly, I want to note that the site has been slow the last ~2 days. But I don't know if it's the cables from Europe or gs's servers sweating due to some mean abusers.

 

Ys

Thomas

Link to comment

We're seeing less than .01% of users/hr being throttled. We have yet to determine how many of those are valid. We're still seeing reports from users that they're being throttled, but very few in comparison to before the time sync issue. We plan to release an update to the throttling code tomorrow to gather more information to help determine what issues still exist.

80,000 premium members, so .01% would be a grand total of 8 users being throttled?

 

I kinda think from the volume of this thread that may be off just a tad.

 

You would think that was off a tad given the tone of the thread. I haven't calculated numbers from before we fixed the time sync issues, but after the sync, yes, we're looking at an average of 10 users per hour being throttled. This does not include accounts we have identified as bots, which have a much higher rate of being throttled.

 

To be fair to the tone of the thread, most of the complaints were filed before we fixed the time sync bug. At that point, I would be (and was) quite frustrated as well.

Edited by mambero
Link to comment
80,000 premium members, so .01% would be a grand total of 8 users being throttled?

 

I kinda think from the volume of this thread that may be off just a tad.

Questions and observations:

  • 0.01% of users per hour
  • Where does the 80,000 premium users figure come from?
  • Are basic users affected as well? I thought they were, but perhaps I was mistaken.

 

All users must pass through the gauntlet.

Link to comment
Imagine if google deployed a "whoa page" on searches?
I don't need much imagination for that, because they actually do that. Although rather than just blocking you, they show you a captcha.

 

Our company network is connected to the Internet over a single IP address, which means 50,000+ users could potentially access Google at the same time, triggering their "bot check". We solved it by getting ourselves whitelisted by Google.

 

Also, one of the GM scripts I know of (but don't use!) that "hammers" the site is GC Tour. (At least it did when I looked at it some time ago.)

Edited by BBosman
Link to comment
This topic is now closed to further replies.