
Feature request: Sharing PQs, and removing the 500 limit.


bjorges


As I was reading the last few days' posts in this thread, I flagged several posts to comment on. Eventually, I got weary of it all and just decided to reply to this one tiny bit before blathering in a different direction:

Or maybe Jeremy could listen to the people who are supporting his website and consider making some changes like increasing PQ limitations, with or without an increase in membership rates.
The actions taken by companies cannot always be based completely on the desires of its customers. Sometimes, changes requested by customers would be bad for the company in ways that are completely unrelated to the short-term goals of the individuals clamoring for them.

 

On a somewhat different topic, I'm reminded of a quick weekend my wife and I spent in Central Florida last month. We weren't there long enough to schedule any geocaching time, but I was using my trusty GPSr to route us from A to B. I noticed that it was still carrying around POIs for caches that I had loaded into it back in February. I knew that because my data was so old, some of the caches I might look for would be long archived, but most likely would not be. I almost considered going after a few of them, but didn't have the time.

 

That got me thinking about this thread's issue of people wanting larger off-line databases. You see, my position in these threads has normally been "Why would anyone need more than 17,500 caches in their 'to find' dbase?" However, I realized that since the data doesn't spoil THAT fast, you can actually build an off-line database that is much larger than 17,500 caches. If you ran your PQs every two weeks instead of once per week, you could get twice as much data in your precious off-line databases, as much as 35,000 caches, in fact. That's enough PQ capacity to grab the data for every cache in the UK even if you ran your PQs without ANY filtering.
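The capacity arithmetic is easy to check. This is a back-of-the-envelope sketch, assuming the limits discussed in this thread (5 PQs per day at 500 caches each):

```python
# Back-of-the-envelope PQ capacity, using the limits discussed in this
# thread: 5 pocket queries per day, 500 caches per query.
PQS_PER_DAY = 5
CACHES_PER_PQ = 500

def offline_capacity(refresh_days: int) -> int:
    """Max distinct caches you can hold if each daily batch of PQ slots
    is reused only once every `refresh_days` days."""
    return PQS_PER_DAY * CACHES_PER_PQ * refresh_days

print(offline_capacity(7))   # weekly refresh: 17500 caches
print(offline_capacity(14))  # biweekly refresh: 35000 caches
```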

 

In practice, I wouldn't have all of my PQs running every other week. I'd set the PQ for the very newest caches to run much more frequently, since that is the PQ likely to contain the bulk of the data that goes stale quickest. The oldest caches are unlikely to go stale at all. PQs for these old-guard caches could even be run monthly.

Edited by sbell111

SBell's got one of the best answers I've seen here. If you set up your queries, and get notifications for all archived and disabled caches in that area, you can manage your database pretty handily.

I get archived caches within 25 miles, in a pretty big metropolitan area - lots of caches around. I don't really get all that many archived cache notifications - only when someone gets grumpy and pulls all his caches at once does it become a minor issue.
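The bookkeeping described above - pruning an offline database as archive notifications arrive - can be sketched minimally like this (the GC codes and the dict layout are made up for illustration; this is not GSAK's actual mechanism):

```python
# Minimal sketch of pruning an offline database using archive notifications.
# The GC codes and the dict-of-dicts layout are illustrative only.
offline_db = {
    "GC1ABC": {"name": "Example Hide 1"},
    "GC2DEF": {"name": "Example Hide 2"},
    "GC3GHI": {"name": "Example Hide 3"},
}

# Codes collected from archive-notification emails:
archived_notifications = ["GC2DEF"]

for code in archived_notifications:
    offline_db.pop(code, None)  # drop archived caches; ignore unknown codes

print(sorted(offline_db))  # ['GC1ABC', 'GC3GHI']
```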


As I was reading the last few days' posts in this thread, I flagged several posts to comment on. Eventually, I got weary of it all and just decided to reply to this one tiny bit before blathering in a different direction:

Or maybe Jeremy could listen to the people who are supporting his website and consider making some changes like increasing PQ limitations, with or without an increase in membership rates.
The actions taken by companies cannot always be based completely on the desires of its customers. Sometimes, changes requested by customers would be bad for the company in ways that are completely unrelated to the short-term goals of the individuals clamoring for them.

 

On a somewhat different topic, I'm reminded of a quick weekend my wife and I spent in Central Florida last month. We weren't there long enough to schedule any geocaching time, but I was using my trusty GPSr to route us from A to B. I noticed that it was still carrying around POIs for caches that I had loaded into it back in February. I knew that because my data was so old, some of the caches I might look for would be long archived, but most likely would not be. I almost considered going after a few of them, but didn't have the time.

 

That got me thinking about this thread's issue of people wanting larger off-line databases. You see, my position in these threads has normally been "Why would anyone need more than 17,500 caches in their 'to find' dbase?" However, I realized that since the data doesn't spoil THAT fast, you can actually build an off-line database that is much larger than 17,500 caches. If you ran your PQs every two weeks instead of once per week, you could get twice as much data in your precious off-line databases, as much as 35,000 caches, in fact. That's enough PQ capacity to grab the data for every cache in the UK even if you ran your PQs without ANY filtering.

 

In practice, I wouldn't have all of my PQs running every other week. I'd set the PQ for the very newest caches to run much more frequently, since that is the PQ likely to contain the bulk of the data that goes stale quickest. The oldest caches are unlikely to go stale at all. PQs for these old-guard caches could even be run monthly.

If you remember from previous threads I've posted regarding PQs, I base my PQs on caches updated over the last 7 days. I run my PQs by date placed, and I can assure you the old-guard caches get updated plenty in one week's time. So much so that one PQ of updates in Washington does not cover them all.

 

Two example PQs running weekly for updates in the land of 13,000+ active caches. Format is YYYYMMDD.

1857382.gpx (20000501-20040207) 366

1857383.gpx (20040208-20041231) 252

 

In summertime, the numbers post closer to 450 each.
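TotemLake's date-placed partitioning can be sketched roughly like this. The dates are synthetic, and the grouping is a simplification: real PQ date filters work on whole days, so caches sharing a boundary date can't actually be split across queries.

```python
# Sketch of the "date placed" partitioning: split all caches into
# consecutive placed-date buckets of at most 500, so each fits one PQ.
from datetime import date, timedelta

MAX_PER_PQ = 500

def date_buckets(placed_dates):
    """Greedily group sorted placed-dates into runs of <= MAX_PER_PQ.
    Simplification: ignores ties on the boundary date."""
    dates = sorted(placed_dates)
    buckets, start = [], 0
    while start < len(dates):
        end = min(start + MAX_PER_PQ, len(dates))
        buckets.append((dates[start], dates[end - 1], end - start))
        start = end
    return buckets

# 1,300 synthetic caches placed one per day from 2000-05-01:
placed = [date(2000, 5, 1) + timedelta(days=i) for i in range(1300)]
for first, last, count in date_buckets(placed):
    print(first, last, count)  # three buckets of 500, 500, 300
```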


TotemLake: I started using your "date placed" process back in June 08 and it's now my main process. It generates 4200 caches within 30 miles of GZ (home). But to keep up with updates, I was running all ten three times a week, using up most of my PQ allowance. Just recently, I hit on the idea of using the "updated in last 7 days" routine and think that will reduce my PQ burden significantly. I also hit over 500 in a seven-day period. The 500 reaches out to 23.6 miles (an interesting statistic, I guess), so I'll have to try your combo idea of applying the date-placed process to the update PQ too. That will still reduce the PQ counts significantly.

 

As an aside, I still run a PQ for the current month and into the future to grab new ones more often than the "once a week" master update.


I run a PQ called Placed in the last month and updated in the last week once a week. It generates about 300 caches per report depending on the month, and it runs on Friday so I have as up-to-date a database as possible to pull from when planning my hikes. All told, I have 20 PQs running right now for the entire state. One of them seeks out inactive caches. This minimizes the work I do once every 6 months or so to check caches that haven't been updated since 3 months back. I have an instant notification of caches that are archived within 50 miles of me, but haven't used that data for updates at all. It's too much daily work. I see an average of 3-5 per day getting archived. I use that mainly to see who is committing geocide in my area these days. Otherwise, I barely pay attention to it. When I do plan my hike, I check the caches along the route that may play into the hike to be sure they haven't been deactivated or archived.

 

I guess my point is, the current limits have nothing to do with making my effort any easier or more difficult. Having larger PQ counts won't add to the activity in the same exponential manner as the increase. The backchecking is still going to happen for current data. So the request for the increase isn't really thought through as much as any requester would like to believe. It's a big-hammer solution to a perceived problem that isn't clearly defined. They just see their diameter is getting smaller, so obviously larger returns would push it back out. They don't look far enough to see it is a band-aid fix. The diameter will still shrink as the saturation gets higher. People who see that are accused of being against new or good ideas. I'm not against new or good ideas, I'm just against ideas that clearly won't solve the problem.

 

Even though CR's suggestion of better filters, along with his other suggestions, is more desirable, programming-wise I have no idea how much effort some of that would take with the current code. As it is, the attribute filters are almost useless because pre-existing caches are not required to be updated to reflect them properly.

Edited by TotemLake
... The oldest caches are unlikely to go stale, at all. PQs for these old guard caches could even be run monthly.
... I run my PQs by date placed and I can assure you the old guard caches get updated plenty in one week's worth of time. ...
I think that you and I merely have a different definition of 'stale'. By 'stale', I am referring to the chance that the cache will be archived or go missing prior to my next PQ. Certainly, the chance of this happening reduces as the cache ages.

 

I realize that the less often that PQs are run, the more likely that logs will be missed. That, in my opinion, is the price to pay for keeping a very large external database. Certainly, if it were my desire to keep an offline database of every cache in the UK on the off chance that I would one day look for any specific cache, I would be fine with the fact that my database might not include every single log that had ever been posted to ALL of those caches.

 

Instead, my method would be useful for those people who are likely to find themselves called to business in any number of different areas on a moment's notice and would like an offline database of all the caches available. It is in response to typical requests made by road warriors and free-spirited vacationers, not anal-retentive cache historians.

Edited by sbell111
... The oldest caches are unlikely to go stale, at all. PQs for these old guard caches could even be run monthly.
... I run my PQs by date placed and I can assure you the old guard caches get updated plenty in one week's worth of time. ...
I think that you and I merely have a different definition of 'stale'. By 'stale', I am referring to the chance that the cache will be archived or go missing prior to my next PQ. Certainly, the chance of this happening reduces as the cache ages.

 

I realize that the less often that PQs are run, the more likely that logs will be missed. That, in my opinion, is the price to pay for keeping a very large external database. Certainly, if it were my desire to keep an offline database of every cache in the UK on the off chance that I would one day look for any specific cache, I would be fine with the fact that my database might not include every single log that had ever been posted to ALL of those caches.

 

Instead, my method would be useful for those people who are likely to find themselves called to business in any number of different areas on a moment's notice and would like an offline database of all the caches available. It is in response to typical requests made by road warriors and free-spirited vacationers, not anal-retentive cache historians.

It's fine that we have different perspectives. That being said, I do have a problem with the last part of your last statement.


Another change that I would like to see is for gc.com to use "date/time PQ requested" to determine which day a PQ counts against for the 5/day limit.

 

Currently, they use the date/time the PQ ran. This is unfair when PQs are late and run the next day. It has a cascading effect. I just got one of yesterday's PQs (almost 24 hours late) and it counts against today instead of yesterday.
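The cascading effect can be illustrated with a toy model. The day labels and the quota accounting here are simplifications, not how gc.com actually implements it:

```python
# Toy model of the complaint above: quota is charged on the day a PQ
# *runs*, so a late PQ eats the next day's allowance.
from collections import Counter

DAILY_LIMIT = 5

def charge_by_run_day(requests):
    """requests: list of (requested_day, ran_day) pairs. Returns per-day
    usage when quota is charged against the day the query actually ran."""
    return dict(Counter(ran for _, ran in requests))

# Five PQs requested Thursday; one runs almost 24h late, on Friday.
requests = [("Thu", "Thu")] * 4 + [("Thu", "Fri")]
print(charge_by_run_day(requests))  # {'Thu': 4, 'Fri': 1}
```

Even though all five were requested on Thursday, Friday now has one fewer slot, and any Friday PQ that slips to Saturday pushes the shortfall along again.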


Another change that I would like to see is for gc.com to use "date/time PQ requested" to determine which day a PQ counts against for the 5/day limit.

 

Currently, they use the date/time the PQ ran. This is unfair when PQs are late and run the next day. It has a cascading effect. I just got one of yesterday's PQs (almost 24 hours late) and it counts against today instead of yesterday.

I think this is a good call. It is definitely proving to be a problem for those trying to do the right thing.


I, too, did not receive 3 of 5 PQs scheduled for yesterday. Again, a Friday. Seems I see a pattern here. And I was active on the list yesterday. I think there should be some trigger that moves your PQs up the list if you are active on the site that day. Now I'll have yesterday's (which still have not been generated) count against today.

 

PS: Totemlake, I see you changed your avatar!!!


I run a PQ called Placed in the last month and updated in the last week once a week. It generates about 300 caches per report depending on the month and that is run on Friday so I have an as up to date database as possible to pull from when planning my hikes.

...

 

I wish I had your luck. If I run a PQ for caches updated in the last week, I hit the 500-cache limit and only manage to cover an area of about a 25-mile radius. I guess it's just the curse of living in a cache-dense area that is also cacher-dense. All these caches get visited regularly, to the point that, many times, the data is out of date the day after it is run.

 

Adding insult to injury, a local county FPD (Forest Preserve District) has added new requirements to placement in their area, causing lots of existing caches to get archived. It is hard to keep up with it all. I may see 20-30 archive notices in a day when a cacher chooses to comply with the new regulations and archives their hides in that area.


As I was reading the last few days' posts in this thread, I flagged several posts to comment on. Eventually, I got weary of it all and just decided to reply to this one tiny bit before blathering in a different direction:

Or maybe Jeremy could listen to the people who are supporting his website and consider making some changes like increasing PQ limitations, with or without an increase in membership rates.
The actions taken by companies cannot always be based completely on the desires of its customers. Sometimes, changes requested by customers would be bad for the company in ways that are completely unrelated to the short-term goals of the individuals clamoring for them.

 

The problem with this is that the majority of companies have to deal with some form of competition. If they don't do things to keep their customers happy, the customers go to another company that does what they are asking for. This is bad for the company in ways that are completely understandable to the majority of people - lost revenue.

 

In this case, this company has a virtual monopoly - no one else has a large enough database, or makes it easy enough to access that database, to make it worthwhile to switch to another company. Since I don't see any application of federal anti-trust laws in this case, they will likely maintain their monopoly. And since it is only a small group of people who appear to disagree (at least vocally in these forums) with the way the person in charge chooses to limit access to the data we provide to him, it is unlikely that we will get what we are asking for unless he decides to change on his own.

 

Workarounds are always nice, but they don't solve the problem. How would auto manufacturers survive if they kept producing products that did OK, but always ignored you when you wanted them to work better, so that you had to go out and modify the vehicles in substantial ways? Yes, we already do that, but my point is that at least they listen, and put some of those features in as they develop new and better models. They don't put in everything that everyone asks for - but if they don't do enough to keep up with the requests, people buy other models of cars.

 

I just think that a company which would not survive without the support of its users, both through purchased memberships and the continual addition of data to their lists, would be more amenable to listening to the requests of those users, rather than more concerned with protecting data which really isn't even theirs - they just collect and list other people's data in a database, and limit those same people's access to that information through TOS rules, limited bandwidth, and PQs. Sounds like a bad idea to me.

 

But then, I've always supported the idea of freedom of information.


I have had problems receiving my PQ's four of the last six Fridays. THAT is why I maintain an offline database.

THAT is why I don't schedule any for Fridays (and maintain an offline database).

 

That shouldn't be an issue. If we pay for the PQs - and a paid membership is the only way to get them - they should be available as stated. If they always want us to use the freshest data, then they should make sure to provide that fresh data.


There are many reasons for keeping an offline database; here are some of mine:

  • If my internet connection goes down, I can still cache. (happens often when ISP is upgrading)
  • If GC.com is experiencing problems with the site, I can still go caching.
  • If the PQ generator is slow, I can still cache.
  • If I want to leave the house before 09:00 (the time the PQs arrive in the UK), I can still cache.
  • If I go on a road trip that's longer than 200 miles, I can still cache.
  • If I'm on a road trip and the motel doesn't have internet, I can still cache.
  • If my brother-in-law cancelled my mother-in-law's internet, I can still cache (happened this week).
  • If I get stuck on a cache, I get more hints from the log history.
  • I get to play with GSAK Macros to automate everything.
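The kind of merge those GSAK macros automate can be sketched like this. This is a simplified stand-in, not GSAK's actual behaviour, and the field names are made up rather than taken from the GPX schema:

```python
# Simplified stand-in for merging weekly pocket queries into an offline
# database: keep one entry per GC code, preferring the most recent data.
# Field names ('code', 'updated') are illustrative, not the GPX schema.
def merge_pq(db, pq_caches):
    """db: {gc_code: record}; pq_caches: records with 'code' and
    'updated' (ISO date string, so string comparison orders correctly)."""
    for cache in pq_caches:
        current = db.get(cache["code"])
        if current is None or cache["updated"] > current["updated"]:
            db[cache["code"]] = cache
    return db

db = {}
merge_pq(db, [{"code": "GC1AAA", "updated": "2008-09-01"}])
merge_pq(db, [{"code": "GC1AAA", "updated": "2008-09-08"},
              {"code": "GC1BBB", "updated": "2008-09-08"}])
print(len(db), db["GC1AAA"]["updated"])  # 2 2008-09-08
```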

One point on wireless hotspots: unsecured networks may be fair game in the US, but it is illegal to use them in the UK except where the network has been advertised as a free wireless hotspot (very rare).

 

I have recently looked into getting an iPhone, and found that my network does not support them, and I know I cannot get a contract with another network thanks to the wonders of credit checks. As I said in a previous post, if the app gets ported to Nokia's answer to the iPhone, I'll possibly sign up. The great thing about the Nokia is that it is Windows Mobile based and therefore would cover a lot more phones than the iPhone app.


* If my internet connection goes down, I can still cache. (happens often when ISP is upgrading)

 

What if your battery on the laptop goes dead without a cable or power source handy? Not impacted by the 500 limit.

 

* If GC.com is experiencing problems with the site, I can still go caching.

 

What happens if your files get corrupted? Again, no impact of current limits to PQ size.

 

* If the PQ generator is slow, I can still cache.

 

So can anyone. Not sure what this means.

 

* If I want to leave the house before 09:00 (the time the PQs arrive in the UK), I can still cache.

 

Doesn't this contradict the point before it?

 

* If I go on a road trip that's longer than 200 miles, I can still cache.

 

Good point for an off-line DB and is easily done with GSAK and the 500 limit.

 

* If I'm on a road trip and the motel doesn't have internet, I can still cache.

 

And you will still have between 500 and 17,500 from the previous week's PQs to hunt.

 

* If my brother-in-law cancelled my mother-in-law's internet, I can still cache (happened this week).

 

See point right above.

 

* If I get stuck on a cache, I get more hints from the log history.

 

Easily done within the 500 limit.

 

* I get to play with GSAK Macros to automate everything.

 

Can still do this with the current 17,500 in PQs you get.

 

One point on wireless hotspots, unsecured networks may be fair game in the US, but it is illegal to use them in the UK except where it has been advertised as a free wireless hotspot (very rare)

 

It is illegal to use unauthorized hotspots in the US as well. Just like in the US, it is ignored and rarely enforced in the UK. Since I have family there and have visited with some regularity for work, I also know that the Free WiFi spots are almost as common, if not more so, than here.

 

I have recently looked into getting an iPhone, and found that my network does not support them, and I know I cannot get a contract with another network thanks to the wonders of credit checks. As I said in a previous post, if the app gets ported to nokia's answer to the iPhone, I'll possibly sign up. The great thing about the nokia is that it is Windows Mobile based and therefore would cover a lot more phones than the iPhone app.

 

Here in the US, the iPhone is only available through one carrier; however, almost every carrier has phones that are internet-enabled (many Nokias are) and/or has PDAs like the Treo or Blackberry that allow access to http://wap.geocaching.com or even http://geocaching.com. They even have programs that let you carry that whole offline DB with you.

 

The references were that you cannot use offline DBs as an argument, since GC does not support them - not that they themselves are inherently bad; some just prefer not to use them. If this were an argument against offline DBs, I would probably be standing right there with you. However, it is not. It is rather whether we can cache effectively within the current limits. Since there are cachers with tens of thousands of finds using the current limits, and even more who get dozens or even hundreds of finds each month, the answer to effectiveness must be yes.

 

Even if it were not, this is what TPTB (I hate the term) have decided to do with their site and their data. Make no mistake about it: it is theirs to do with as they want. It can be argued all day long - and has been, by some who like to argue for argument's sake - that it is not. However, just like sending a picture to your newspaper or a tape to America's Funniest Home Videos, unless you have made prior arrangements, you agree that it has become their property.

 

Since the majority of those who support the site do not appear to have an issue, if the site were mine, I probably would not explain why I was not doing it as many times as Jeremy and those associated with the site already have.


Baloo&bd, I was answering some anti-offline-database comments that were made prior to this post, so I believe my reasons as they stand are relevant to the discussion. They are also relevant to the limits in areas of high density.

 

What if your battery on the laptop goes dead without a cable or power source handy? Not impacted by the 500 limit

 

Apart from the fact that I have 4 ways of accessing this data on the road (TomTom, GPSr memory card, PDA & laptop), I have in-car power for the laptop, and this is just an argument for offline DBs, nothing more.

 

What happens if your files get corrupted? Again, no impact of current limits to PQ size.

 

Hate to sound like a tech geek, but don't you keep regular backups? (For me, a mirror fileserver and Gmail back up all my data, plus I keep a full copy on both the laptop and desktop.)

 

* If the PQ generator is slow, I can still cache.

So can anyone. Not sure what this means.

 

A lot of people are complaining about Friday PQs here, and most of them are from the US - add another 8 hours to my inconvenience (see below). Additional PQs/memberships etc. don't solve this directly; they are a kludgy workaround in themselves. But extra revenue from enhanced memberships would pay for new PQ servers, plus you could plan ahead and schedule more on a Thursday, leaving less important PQs for Friday.

 

* If I want to leave the house before 09:00 (the time the PQs arrive in the UK), I can still cache.

 

Doesn't this contradict the point before it?

 

Nope, it complements it: if PQs are slow on a Friday, I can only use my offline database, as I won't get the PQs until the afternoon. Even if my PQs are the first to run, I won't be able to get out before 09:00, as PST midnight is 08:00 GMT.
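The timezone arithmetic checks out (PST is UTC-8, so Seattle's midnight lands at 08:00 in the UK outside of daylight saving):

```python
# PST midnight -> GMT: PST is UTC-8, GMT is UTC+0, so add 8 hours.
from datetime import datetime, timedelta, timezone

pst = timezone(timedelta(hours=-8))
midnight_pst = datetime(2008, 11, 7, 0, 0, tzinfo=pst)  # arbitrary winter date
print(midnight_pst.astimezone(timezone.utc).strftime("%H:%M"))  # 08:00
```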

 

* If I go on a Roadtrip thats longer than 200 Miles, I can still cache.

 

Good point for an off-line DB and is easily done with GSAK and the 500 limit.

 

1 week's data in Northern California gives approximately a 200-mile radius at the moment, the bulk being centred around the I-5 corridor. This takes you from SF to southern Oregon, easily done in 6 hours or less. If you're headed in the direction of Washington State, forget it - you're going to be passing some 30,000+ caches.

 

* If I'm on a Roadtrip and the Motel doesn't have Internet, I can still cache.

 

And you will still have between 500 and 17500 from the previous weeks PQ's to hunt.

 

The whole point here is that people are saying the request for increased data limits is only relevant if you are keeping an offline DB. I do not agree - it's partly the opposite; offline DBs reduce the need for larger PQs - although I am putting forward the positives of offline DBs anyway.

 

* If my brother-in-law cancelled my Mother-in-law's Internet, I can still cache (happened this week)

 

See point right above.

 

I'm staying at my mother-in-law's for a week; it's out of the area set up in my offline DB, so I can't do a lot of caching unless I do something naughty.

 

* If I get stuck on a cache, I get more hints from the log history.

 

Easily done within the 500 limit.

 

Only if you keep an offline DB and repeat your queries (or get an iPhone, or pay $10/MB for data roaming - that's just 2 pages from the full site).

 

* I get to play with GSAK Macros to automate everything.

 

Can still do this with the current 17,500 in PQ's you get.

 

yeah

 

It is illegal to use unauthorized hotspots in the US as well. Just like in the US, it is ignored and rarely enforced in the UK. Since I have family there and have visited with some regularity for work, I also know that the Free WiFi spots are almost as common, if not more so, than here.

 

Ignored or not, illegal activity is illegal activity; I think we'd better let that drop, eh? There were a few prosecutions last year in the UK under the Data Protection Act of persons caught by the police using unsecured networks; due to the wording of the Act, the owner of the network does not need to be involved in the prosecution. Punishments included large fines, a little jail time and confiscation of computer equipment; for me this would also mean the loss of my job. Also, I have never seen a free WiFi hotspot advertised in the UK unless it was of the "Customers Only" sort.

 

Here in the US, iPhone is only available through one carrier, however almost every carrier has available phones that are internet enable (many Nokia's are) and/or have PDA's like Treo or Blackberry that allow access to http://wap.geocaching.com or even http://geocaching.com. They even have programs that let you carry that whole offline DB with you.

 

That's great - got one, but I can't use it in the US without filing for bankruptcy when I get home.

 

References have been that you can not use offline DB's as an argument since GC does not support them, not that they themselves were inherently bad, just some prefer not to use them. If this was an argument against offline DB's, I would probably be standing right there with you. However, it is not. It is rather can we effectively cache within the current limits. Since there are cachers with tens of thousands of finds using the current limits, and even more that get dozens or even hundreds of finds each month, the answer to effectiveness must be yes.

 

It must be nice for those mega cachers not to have to work for a living. (do the math, they can't possibly have time to work) If they can afford not to work, they have plenty of time to work within the limits and plan ahead. Also, they've cleared out the majority of their local caches, so their first PQ probably takes them out to a 100+ mile radius. Hopefully they don't decide to run the finds query on a Friday and take down the PQ machine. For me, getting home from work 15 hours after I left in the morning, I really don't need to sit down and plan anything; I'd rather let my automated macros do it and grab my gear on my first day off. Also, see my previous notes about PQ data sharing - it does happen, and the more cachers you know, the more likely you are to have the opportunity to join such syndicates.

 

Someone mentioned the phrase "anal-retentive cache historians" in regard to how some people appear to see offline-database keepers. That gave me an idea: if I could get my finds up to 500/month, I could start with the first 500 placed in the UK, then the next, and so on. During the winter months I'll outstrip the placing and be able to use my PQs more efficiently; I could even try to do them in order of placement. On second thought, a second PM will be cheaper than all the extra driving, so I'll be signing up for one this week.

 

My final point: yes, I do want more data for a larger offline DB, because I only want to get involved with planning once a month or so, not every time I feel like going caching - and the extent of planning I want to do is rearranging my PQs for efficiency.

 

(Edit Note: Yup, I know I've made a cock up somewhere on the quotes, but at this time of night, the word quote and its spelling has lost all meaning to me as has the difference between back and forward slashes and the ability to count past 2, so if anyone would like to point out which tag I've gotten wrong, I'd be happy to change it)

Edited by Volvo Man
(Edit Note: Yup, I know I've made a cock up somewhere on the quotes, but at this time of night, the word quote and its spelling has lost all meaning to me as has the difference between back and forward slashes and the ability to count past 2, so if anyone would like to point out which tag I've gotten wrong, I'd be happy to change it)
There is a ten-quote limit in the posts. Once you have exceeded that limit, your post becomes a big unreadable mess.

It must be nice for those mega cachers not to have to work for a living. (do the math, they can't possibly have time to work) If they can afford not to work, they have plenty of time to work within the limits and plan ahead.

Please don't make assumptions like this. It makes your argument look foolish when you don't know these cachers. I know some and they all work for a living full time.


There is a ten-quote limit in the posts. Once you have exceeded that limit, your post becomes a big unreadable mess.

That's interesting to know.

I wonder if it also applies to nested quotes, ten deep...

 

P.S. I don't work for a living; I retired at 45, 4 years ago. And I find it hard to find the time to cache. Yet I somehow manage, without being a Premium Member, to find sometimes 56 caches in 19 hours, and to plan all my activities using the 20-month-old, very stale database of caches for all the New England states from when I was a Premium Member.


It must be nice for those mega cachers not to have to work for a living. (do the math, they can't possibly have time to work) If they can afford not to work, they have plenty of time to work within the limits and plan ahead.

Please don't make assumptions like this. It makes your argument look foolish when you don't know these cachers. I know some and they all work for a living full time.

30 caches on Saturday and 20 on Sunday, times 52 weeks. Add in some extras while on vacation, subtract some for DNFs and weekends where you don't cache, multiply by 5 or 6 years of caching, and one could easily have 15,000+ finds while working a full-time job. And that doesn't include extra caching during the week, at night after work, and things like that.
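The arithmetic holds up as a rough check, using the figures in the post:

```python
# Rough check of the weekend-caching arithmetic above.
per_weekend = 30 + 20          # Saturday + Sunday finds
per_year = per_weekend * 52    # caching every weekend
print(per_year)                # 2600 finds per year
print(per_year * 6)            # 15600 over six years, before vacations/DNFs
```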

This topic is now closed to further replies.