
New possible Bot alert


rockhead15

Recommended Posts

Simply add a validation code entry box to all the log entry pages. Ta-da! Bots are stopped.

 

You can see an example of one on pages like this...

 

http://geocheck.org/geo_inputchkcoord.php?...e3-134b2afbe676

That's essentially what people mean when they refer to CAPTCHA. <snip> You want to add the burden of a validation code for each cache?

 

Do you really want to do that every time you log a cache?

 

 

So you want to inconvenience/punish tens of thousands of cachers in order to stop one bozo?????

 

That is even more onerous, and horrible, than asking the CO's to delete the bot's logs.

 

What is wrong with asking GS to take complete ownership?

 

Yes. I don't have a problem with entering a "captcha" thingie while logging every cache. The fact that there are a handful of people who are so concerned about racking up cache counts that they need to go out and find 60 caches per day doesn't bother me. Why does the majority have to pay for the convenience of the few?

 

It's not "punishment" and there are certainly thousands of these anti-caching "bozos" out there that need to be dealt with. It is just a smarter way to do business in this age of widespread evil.

 

What I'm hearing is that people don't want to take some simple security measures to prevent problems and they'd rather have the company (GS) respond in superhero fashion to clean up the mess, but only after the bad guys have done their damage.

 

That is akin to being too lazy to lock the front door of your house and then expecting the government to respond after the place has been burglarized to not only hunt down the perpetrator, but also clean and fix your house up. If there were better door locks in the first place, you probably wouldn't have to clean the house and waste tax-payer money on chasing down the perps. (Sure, there are a couple dedicated criminals who could still get through a locked door, but the locks will stop the majority).

 

And if the Found and Note logs are deemed to be insignificant, but the NM & NA logs are now being used with greater impact to the system, then just add the captcha gizmo to those log types.

Link to comment

The only time I had more than 60 caches to log was after a 10 day vacation. Usually I have five or six to log at most. I still don't want to have to deal with a captcha system that I often find myself trying two or three times before being successful. I particularly don't want to have to deal with it when the experts are telling me that it won't work. Pretty much pointless.

Link to comment
<snip>

 

Yes. I don't have a problem with entering a "captcha" thingie while logging every cache. The fact that there are a handful of people who are so concerned about racking up cache counts that they need to go out and find 60 caches per day doesn't bother me. Why does the majority have to pay for the convenience of the few?

 

That is akin to being too lazy to lock the front door of your house and then expecting the government to respond after the place has been burglarized to not only hunt down the perpetrator, but also clean and fix your house up. <snip>

Did you see what I posted about Googling methods of evading Captcha? I found copy/paste source code in my very first hit.

 

I have found 105 caches in one day, and it isn't unusual for me to get 25-35 in a day, and I'm hardly special in that. A bad captcha is totally worthless, and a really good captcha is very difficult for humans to do, often requiring numerous refreshes to get one that is human readable.

 

The door lock analogy only goes so far. CAPTCHA is essentially a skeleton key. If you know how to pick that lock, all it takes is a hairpin. If you don't, you're locked out.

Link to comment
<snip>

What I'm hearing is that people don't want to take some simple security measures to prevent problems and they'd rather have the company (GS) respond in superhero fashion to clean up the mess, but only after the bad guys have done their damage.

 

I disagree. First of all, deploying a captcha system does not stop the bot; it only makes it temporarily ineffective. When that happens there are two likely scenarios. First, the creator of the bot may try to work around the captcha mechanism using an easily googled method, as mentioned in another response, and may decide at that point to do something more nefarious than just posting Note or Found It logs. The other scenario is that the creator of the bot may decide "my work is done here": I've effectively changed the way that every geocacher has to log their finds (who knows, the motivation for creating the bot in the first place may have stemmed from some of their logs getting deleted).

 

In either case, the "fix" punishes the innocent (even if you don't consider it a big burden) and lets the perpetrator essentially go unpunished. There *are* laws in place which provide criminal penalties for what this bot is doing. Yes, it's probably cheaper for GS just to install a captcha, but it doesn't "stop the bot".

Link to comment
Hey, the good news is that it's only a bot, not an FTF Hoarder. Count your blessings.
I just got FTF on a cache that the bot was first to log, so actually. . .
Oh, geeze... you're kidding, right? :rolleyes: Too funny!!!
Nope, not kidding. It was my 2000th find.
That is akin to being too lazy to lock the front door of your house and then expecting the government to respond after the place has been burglarized to not only hunt down the perpetrator, but also clean and fix your house up. If there were better door locks in the first place, you probably wouldn't have to clean the house and waste tax-payer money on chasing down the perps. (Sure, there are a couple dedicated criminals who could still get through a locked door, but the locks will stop the majority).
The door lock analogy only goes so far. CAPTCHA is essentially a skeleton key. If you know how to pick that lock, all it takes is a hairpin. If you don't, you're locked out.
True, Dog-With-Glasses, but the analogy breaks down before that.

 

We, the cache owners, don't own the house, we've left our valuables in Groundspeak's house. They don't want to lock the front door because we wouldn't be able to see the valuables inside, but they don't want to fix what vandals have done due to the lack of security.

 

Frankly, I don't know that I want to hide any more caches if it means I'm gonna have to sit for however long, deleting bot-logs because Groundspeak won't bulk delete them.

 

This brings me to another question: Is it that Groundspeak won't bulk delete these, or that they can't? I'm operating under the assumption that, since the whole website is, at its base, a database, they can delete the logs associated with a particular account in one fell swoop. Of course, if that isn't the way it works, that means it will fall to the local reviewers. I can understand their refusal to ask volunteers to delete all those logs if that is the case.

 

So, anyone know which it is? :D If it is the latter, I bet a lot more people will grumble less next time they're sitting and deleting logs from a bot. . . .

Link to comment

Hmmmmm. An alternative solution comes to mind, though it is the one Groundspeak would rather you didn't deploy.

 

Since:

 

- The bot is gonna keep on doing what it's doing

- We can't ask for casual users to take extra steps like filling out a CAPTCHA, or some other "are you human" task

- Groundspeak is not interested in (or able to?) bulk-removing logs from cache pages, even if a CO gets hundreds of logs to remove

 

Cache Owners can:

 

- Setup a mail filter, and redirect all "[LOG] Owner:" emails from noreply@geocaching.com to "Deleted Items"/"Trash"/"/dev/null".

 

Result:

 

- Cache Owners have more time to go caching, place caches, and so on.

- Cache Owners no longer get timely information about their caches but hey, no more spam logs in the inbox...

- Cache pages will be cluttered with useless logs, some containing links to activist sites or spyware

- More caches get archived as CO's simply ignore NM logs until reviewers act on the maintenance issue

- Cache seekers (including the precious new ones) will find PQ's / Official Geocaching apps useless for reading logs in the field as the 5 most recent logs are by bots.

 

* Just wait until the bot author(s) figures out a way to automate account creation, and a distributed attack base.
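The mail-filter idea in the list above can be sketched in a few lines of Python; this is a hypothetical illustration only (the sender address and "[LOG] Owner:" subject prefix are copied from the post and may not match Groundspeak's actual notification format):

```python
import email

def is_bot_owner_log(raw_message: bytes) -> bool:
    """Return True for owner-log notification emails to be auto-trashed.

    The header values matched here ("noreply@geocaching.com",
    "[LOG] Owner:") are assumptions taken from the post above.
    """
    msg = email.message_from_bytes(raw_message)
    sender = msg.get("From", "")
    subject = msg.get("Subject", "")
    # Trash anything that looks like an owner-log notification.
    return "noreply@geocaching.com" in sender and subject.startswith("[LOG] Owner:")
```

A real deployment would hang this off a mail client's rule engine (or a Sieve script) rather than a standalone script, but the matching logic is the same.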

Link to comment

 

This brings me to another question: Is it that Groundspeak won't bulk delete these, or that they can't? I'm operating under the assumption that, since the whole website is, at its base, a database, they can delete the logs associated with a particular account in one fell swoop. Of course, if that isn't the way it works, that means it will fall to the local reviewers. I can understand their refusal to ask volunteers to delete all those logs if that is the case.

 

So, anyone know which it is? :rolleyes: If it is the latter, I bet a lot more people will grumble less next time they're sitting and deleting logs from a bot. . . .

 

of course they can remove any trace of this BOT and future ones, unfortunately they chose to Ban it instead of deleting the account

 

the banning option can't delete all the logs because someone can be banned temporarily so you can't wipe out their existence since they will be back

 

they should have DELETED the BOT accounts, this option SHOULD eliminate any trace of this user, including the logs

 

i've been hit twice and at 3rd strike i will make all my caches PMO's

 

i will put a note in my profile saying sorry to the regular members but they can take it up with Groundspeak

Link to comment

 

i've been hit twice and at 3rd strike i will make all my caches PMO's

 

i will put a note in my profile saying sorry to the regular members but they can take it up with Groundspeak

 

You will be wasting your time. All of my caches are PMO and they were hit.

 

 

.

Link to comment
<snip>

You will be wasting your time. All of my caches are PMO and they were hit.
While I don't have any PMO caches, there were a number here in NH that were hit by the bot, so I can verify that it will do no good.

 

Here's one now.

Edited by Too Tall John
Link to comment
i've been hit twice and at 3rd strike i will make all my caches PMO's

 

i will put a note in my profile saying sorry to the regular members but they can take it up with Groundspeak

You will be wasting your time. All of my caches are PMO and they were hit.
While I don't have any PMO caches, there were a number here in NH that were hit by the bot, so I can verify that it will do no good.

 

Here's one now.

 

The method of logging a PMO cache is well published. Probably as easy to Google as the bypass for CAPTCHA.

 

.

Edited by Tequila
Link to comment

it would be so easy. limit the number of logs per minute for new users. let's say no more than 5 logs per minute for users with less than 100 finds, possibly with an additional throttling mechanism (the more logs you make in less time, the longer you have to wait). power users would be unaffected, bots would be thwarted, and real newbies probably would never hit that limit anyway.

 

then the next step for bots can only be bulk account creation, which is a wholly different level, and could also be thwarted by limiting the number of accounts created from a single IP address. after that they'd need many IP addresses to run the attack from, at which point it reaches the level of a large scale DDoS and may become impossible to automatically detect anyway.
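the per-user limit described above (at most 5 logs per minute for low-find accounts) could be sketched as a sliding-window counter; the class and method names here are invented for illustration, and only the 5-per-60-seconds numbers come from the post:

```python
import time
from collections import deque

class SlidingWindowLimiter:
    """Allow at most `limit` actions per `window` seconds for each key
    (e.g. a username, or an IP address for account creation)."""

    def __init__(self, limit=5, window=60.0):
        self.limit = limit
        self.window = window
        self.events = {}  # key -> deque of event timestamps

    def allow(self, key, now=None):
        """Record one action for `key` if permitted; return True/False."""
        now = time.monotonic() if now is None else now
        q = self.events.setdefault(key, deque())
        # Drop events that have aged out of the window.
        while q and now - q[0] > self.window:
            q.popleft()
        if len(q) >= self.limit:
            return False  # throttled
        q.append(now)
        return True
```

The same structure works for the second idea in the post (capping account creations per IP address): the key just becomes the IP instead of the username.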

Edited by dfx
Link to comment

 

<snip>

 

You will be wasting your time. All of my caches are PMO and they were hit.

 

.

 

well that royally sucks and Groundspeak better fix that loophole, which i thought was only open through iPhone use, or remove the option of allowing PMO's

 

of course since nobody from Groundspeak ever comes in here to comment we are completely barking up the wrong tree and their lack of a response in this issue is seriously becoming ridiculous and annoying

Link to comment

 

<snip>

 

well that royally sucks and Groundspeak better fix that loophole, which i thought was only open through iPhone use, or remove the option of allowing PMO's

 

of course since nobody from Groundspeak ever comes in here to comment we are completely barking up the wrong tree and their lack of a response in this issue is seriously becoming ridiculous and annoying

 

It is not a loophole per se. It is a well known fact that Jeremy has blessed the backdoor for logging PMO caches. You can find that info in several threads. I doubt it will be closed.

Link to comment

 

It is not a loophole per se. It is a well known fact that Jeremy has blessed the backdoor for logging PMO caches. You can find that info in several threads. I doubt it will be closed.

 

so what is the point of having PMO's?

 

There are several reasons. Limiting the number of visitors to a site. Helping protect the cache from maggots. Promoting support of the site. Just because they can. And I am sure there are more. But if we talk about them in this thread we will give the mods a panic attack.

Link to comment

 

It is not a loophole per se. It is a well known fact that Jeremy has blessed the backdoor for logging PMO caches. You can find that info in several threads. I doubt it will be closed.

 

so what is the point of having PMO's?

 

I don't have a picture of a dead horse being flogged but I know some forum members have. So I will leave it to them to post it.

 

Suffice it to say, that question has been discussed ad nauseam in countless other threads. I suggest you do a thread search and you will quickly find it. In a nutshell: not much. Most, if not all, PMO features are easily circumvented or flawed in their completeness (i.e. the audit log).

 

.

Edited by Tequila
Link to comment

Does a PMO cache prohibit the leaving of notes by non-PMs? I thought it was only logging a "Found It" that required the backdoor?

 

As far as I know a non-pm can not even see the page so they would need a pm to open even the note log for them.

Not true. Even without a basic membership, you can view the cache page and read the logs (just log yourself out and have a look). I don't have a basic account to use to see what log types are available to them.
Link to comment

Does a PMO cache prohibit the leaving of notes by non-PMs? I thought it was only logging a "Found It" that required the backdoor?

 

As far as I know a non-pm can not even see the page so they would need a pm to open even the note log for them.

Not true. Even without a basic membership, you can view the cache page and read the logs (just log yourself out and have a look). I don't have a basic account to use to see what log types are available to them.

 

All I see with my super secret alternate identity is the cache name & number and the size and d/t ratings. Everything else has been replaced with an ad for premium membership.

Link to comment
All I see with my super secret alternate identity is the cache name & number and the size and d/t ratings. Everything else has been replaced with an ad for premium membership.

copy the link to the cache page without opening it and paste it into the address bar of a new browser window. replace "cache_details.aspx" with "log.aspx". presto.

 

or use the following template:

http://www.geocaching.com/seek/log.aspx?wp=GCXXXX
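the substitution described here is literally a one-line string replacement; as a trivial sketch (the function name is made up):

```python
def log_url(cache_page_url: str) -> str:
    """Turn a cache_details.aspx URL into the corresponding log.aspx URL,
    per the swap described in the post above."""
    return cache_page_url.replace("cache_details.aspx", "log.aspx")
```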

Edited by dfx
Link to comment

 

Suffice it to say, that question has been discussed ad nauseam in countless other threads. I suggest you do a thread search and you will quickly find it. In a nutshell: not much. Most, if not all, PMO features are easily circumvented or flawed in their completeness (i.e. the audit log).

 

.

 

i would, but with the current capabilities of this forum's search function i'd rather go sit in a pile of snow

 

luckily someone posted a link to one of the old threads

 

http://forums.Groundspeak.com/GC/index.php...p;#entry3701827

 

so logging a PMO can be done with the aid of a PM that would provide the ID of the PMO cache, which i have no problem with

 

my question becomes...how did the BOT get hold of the PMO's ID's?

did he/she have help from a PM?

Link to comment
<snip>

The method of logging a PMO cache is well published. Probably as easy to Google as the bypass for CAPTCHA.

 

.

Just to clarify, I was confirming what you said, not arguing with it. The "loophole" has been acknowledged by Jeremy, who has promised not to close it. Guess that makes it more of a "back door" than a loophole. :rolleyes:
Link to comment

 

<snip>

 

my question becomes...how did the BOT get hold of the PMO's ID's?

did he/she have help from a PM?

 

Randomly. Log cache #GC56T1 then #GC56T2 and so on. And the GC# is one of the things I can see with a non-PM account.

Edited by GOF and Bacall
Link to comment
my question becomes...how did the BOT get hold of the PMO's ID's?

did he/she have help from a PM?

I just logged in under my wife's account (she's a basic member) and when I went to "Newest Caches in New Hampshire" all the PMO caches show up in the search, you just can't see the cache info if you click on the link. All the bot needs, though, is the waypoint code to log the cache, which is right there. I suspect the BOT did something like this, since, if you look at their finds, they are in chronological order of when they were hidden.
Link to comment

it would be so easy. limit the number of logs per minute for new users. let's say no more than 5 logs per minute for users with less than 100 finds, possibly with an additional throttling mechanism (the more logs you make in less time, the longer you have to wait). power users would be unaffected, bots would be thwarted, and real newbies probably would never hit that limit anyway.

 

This is about the least intrusive suggestion I've seen so far. Without the additional throttling mechanism it would be too easy to circumvent: just add a 21-second delay to the bot's logging loop and you only post three logs a minute. IMHO, some sort of throttling mechanism might be the best approach, and it might even have the side effect of limiting cut-n-paste logs.

Link to comment
Just add a 21-second delay to the bot's logging loop and you only post three logs a minute.

and that would already kill the purpose of having a bot. the people running them usually don't want to wait hours in order to post a significant number of logs. and chances are that by the time they get around to having posted a large number, they've already been reported and (hopefully) banned.

 

IMHO, some sort of throttling mechanism might be the best approach and it might even have the side effect of limiting cut-n-paste logs.

the challenge here is to come up with an algorithm that works well, and it would also require the system clocks of all the servers to be synchronized, which is something that GS seems to have problems with, as we've seen in the past :rolleyes:

 

usually it's implemented as a kind of timed penalty score. every time you post a log, the penalty score on your account is increased by a certain amount, which can be a fixed amount per log, or can depend on the current penalty score and/or the time elapsed since the last log you posted. then the penalty score decreases at a steady rate over time (usually it's one point per second). once the score hits a certain threshold, you can't post any more logs until your score drops below the threshold again.
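that timed penalty score might look like this in code; a sketch only, with made-up cost and threshold values, and the one-point-per-second drain taken from the description above:

```python
import time

class LogThrottle:
    """Timed penalty score: each log adds points, points drain at one
    per second, and posting is blocked while the score is at or above
    the threshold. Cost and threshold values are illustrative."""

    def __init__(self, cost_per_log=20, threshold=50):
        self.cost = cost_per_log   # points added per log (assumed value)
        self.threshold = threshold  # block at/above this score (assumed value)
        self.score = 0.0
        self.last = time.monotonic()

    def _drain(self):
        # Decrease the score by one point per elapsed second, floored at 0.
        now = time.monotonic()
        self.score = max(0.0, self.score - (now - self.last))
        self.last = now

    def try_post(self):
        """Attempt to post one log; return True if allowed."""
        self._drain()
        if self.score >= self.threshold:
            return False  # throttled until the score drains below threshold
        self.score += self.cost
        return True
```

With these numbers a burst of rapid posts is cut off after the third, while a user spacing logs out by half a minute or so never hits the threshold, which matches the intent described above.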

Edited by dfx
Link to comment

Just ignore the problem and it will go away. They'll get bored. This entire thread is just feeding the issue. If I were a botmaster and that was my thing, I'd keep hitting you all because some of you are so irritated by it. The more irritated posts by y'all, the more they will hit you.

 

Delete this thread.

Link to comment

it would be so easy. limit the number of logs per minute for new users. let's say no more than 5 logs per minute for users with less than 100 finds, possibly with an additional throttling mechanism (the more logs you make in less time, the longer you have to wait). power users would be unaffected, bots would be thwarted, and real newbies probably would never hit that limit anyway.

 

This is about the least intrusive suggestion I've seen so far. Without the additional throttling mechanism it would be too easy to circumvent: just add a 21-second delay to the bot's logging loop and you only post three logs a minute. IMHO, some sort of throttling mechanism might be the best approach, and it might even have the side effect of limiting cut-n-paste logs.

 

So if I'm understanding this right, the bot would be limited to 5 a minute for the first 20 minutes? Then it is how fast can it log the trash before being caught? Perhaps it should be more like 5 a minute forever? I could live with that; even with my cachemate logging macro I probably don't hit more than 4 a minute. I'll have to time myself next time, I might hit 5 a minute.

 

But throttling the logging is probably the best idea I've heard so far. It will be interesting to see if the bot shows up again now that the captcha is implemented on the .loc search results for non-pm users.

Link to comment

Just ignore the problem and it will go away.

That's what they said about spam 20 years ago, and I think it's pretty obvious how that's worked out.

 

Ignoring bad behavior doesn't punish those that are exhibiting bad behavior.

That part was tongue-in-cheek. TPTB will do something. Whatever they do someone will still be unhappy.

Link to comment

Just ignore the problem and it will go away.

That's what they said about spam 20 years ago, and I think it's pretty obvious how that's worked out.

 

Ignoring bad behavior doesn't punish those that are exhibiting bad behavior.

That part was tongue-in-cheek. TPTB will do something. Whatever they do someone will still be unhappy.

 

You sure about that?

Link to comment

TPTB will do something.

 

you really believe that?

 

there's not a trace of TPTB in this thread or anywhere else discussing this issue, and it's been going on for about a month now

 

Maybe because they are spending more time working on it than trying to convince people that they are working on it?

 

Are these the same people that are working on fixing the forum issues?

Link to comment

<snip>

Are these the same people that are working on fixing the forum issues?

 

The forum that's not vital to my caching experience?

Link to comment

<snip>

 

The forum that's not vital to my caching experience?

 

Still the same people.

Link to comment

<snip>

 

Still the same people.

 

I'm just not feeling the angst.

Link to comment

<snip>

 

there's not a trace of TPTB in this thread or anywhere else discussing this issue, and it's been going on for about a month now

 

Brad has been in the thread several times notifying that a certain bot account had been banned.

 

TPTB have a long-standing and well-known policy of not commenting on their processes etc. in the forums. And there is lots of proof that they won't be goaded into commenting either.

 

However, the fact that they implemented CAPTCHA for non-PM downloading would seem to be a clear sign that they are doing something.

 

If I was TPTB, I doubt that I would go public with all of what I was doing to put a halt to the bots. That just makes it easier for the bot to figure out how to get around things.

 

Several posts in this thread start with "...it could be worse, the bot could be doing this...". This just fuels the guy(s) with ideas.

 

I tend to agree with knight2000. This thread is feeding the bot guy. He has cachers arguing with each other. He has cachers giving him new ideas. He has cachers admitting how frustrating it is. etc etc.

 

It may not make him go away, but if this thread were reduced to simply "here is another account...", it certainly wouldn't be as much fun for him to read.

 

 

.

Link to comment

Just ignore the problem and it will go away.

 

That's what they said about spam 20 years ago, and I think it's pretty obvious how that's worked out.

 

Ignoring bad behavior doesn't punish those that are exhibiting bad behavior.

 

Spammers spam because they get paid to. This guy is doing it because he has anger issues. Eventually he'll get bored.

Link to comment
