
Pocket Queries - will they ever go beyond 500


GEOJuice


 

As a premium member, or a non-premium member, you certainly are within your rights to request that Groundspeak provide additional service to you either free of charge or as part of what you already pay for. And it may be that the company will decide that, because of the growth in cache density or the fact that newer GPS units can hold 1000 or more waypoints, they can provide larger PQs. Don't be surprised, however, when others post in response that they have no need for more caches and would prefer that those who do should pay extra for them. Also don't be surprised when others offer suggestions on how they are able to geocache using the current limitation.

 

 

I'm not surprised at any of these things, nor did I ever say that I was.

Yet when people make these suggestions you ask them to get out of your way. You have made your request and stated your reasons that you would like to download more caches in your PQs. With your 47 finds (and yes, I read where you don't log all your finds and that you also look for caches that are not listed on Geocaching.com), I don't buy that you really have thought about ways to enjoy geocaching given the limits of PQs. But if you say you have, and you must be able to download every cache in a 100-mile radius, then it must be so.

 

 

Don't believe that the way you geocache is so much different than anyone else's.

 

 

The large number of people who reply to my posts on this subject whenever it comes up seem to make it pretty clear that the way I cache *is* different from the way a lot of people do it. So forgive me if their prior posts have already rebutted you.

I'm pretty sure that if Geocaching.com allowed people to download every cache in a 100-mile radius, many people would do it. Using GSAK and other tools you can set up much more specific queries than you can with PQs. In addition, you might get results in less than the 10 minutes that it generally takes a new PQ to run and be delivered. This would save tremendous time for someone who can't decide where to cache until the last minute before they head out the door. (Or if you have your GSAK database on a laptop, I suppose you don't need to run a query till you get there.) You may be different than others in that you want to postpone your decisions till the last minute, but most of the other responses you got were from people who do find ways to be spontaneous about where they go caching and are simply willing to do this with only the last 5 logs instead of the last 50 logs.

 

 

I suspect many people like to be flexible and spontaneous and would like to have an offline database of all the geocaches within a two-hour drive of their home. But they realize that you don't need to keep all the caches up to date all the time. Maybe they miss that newest cache (or not, if they also get instant notifications). Maybe they look for an archived cache every once in a while (or you can get instant notification when caches are archived, disabled, or re-enabled). Sometimes they might not have that one log that had a spoiler or corrected coordinates. They might still find that cache, or they might accept that sometimes you DNF a cache. You may believe you need more, but you're not going to convince people who are having fun making do with what the website already gives them.

 

 

You are sadly mistaken as to what is going on, I'm afraid. I'm not trying to convince anyone that there's anything wrong with the way they cache (many of them cannot say the same). Nor am I trying to convince anyone that my way of caching is superior to anyone else's (also, many of them cannot say the same). The primary point I've been trying to make is that my way of caching is JUST AS VALID as anyone else's - and it is neither their place, nor yours, to tell me otherwise.

 

If you're okay with all of those "maybe" compromises you list above, that is 100 percent your prerogative. But don't tell me what I should be happy with. I will remind you that I was not the one who started this thread regarding the PQ limitations. I am simply *another* person who believes that the PQ limitations are archaic and too constricting. If you're completely happy with them, then that's wonderful - honestly. But don't get in the way of people for whom it doesn't work.

Your way of geocaching may be just as valid as anybody else's. I have my personal opinion about this, but for the most part I don't care how a person chooses to cache.

 

If you really want spontaneity, Groundspeak does provide a service with the Trimble Geocache Navigator application for GPS-equipped cellphones that gives you access to the latest information from the Geocaching.com database any place where you have cell phone coverage. Of course, there are additional fees involved.

 

I will get out of your way now and wish you good luck in trying to convince Groundspeak to give you the access to their database you seem to feel you need to geocache.

Link to comment

 

I maintain three databases, all of which are updated weekly. One gives me all the unfound caches within 150 miles of my home. A second gives me unfound caches in Eastern Pennsylvania, where I travel several times a year. The third gives me the nearest 1000 unfound caches near my parents' home in Upstate New York.

 

Even with running 30 queries per week, I still have room left over to run my "All Finds" query, or a group of five queries for trips to other areas of the country. The secret is using the "placement date" method to maximize the efficiency of my PQ's. The process is so automated that I don't give much thought to how many separate queries are involved.

 

It will take me quite a while to find all the caches in my "Home 150 miles" database -- all 10,000 of them. Last year I found nearly 1000 caches in nine states, and about 20% of those finds were outside the areas covered by my regular queries. Within my regular areas, I love having the ability to jump in the car and head off in any direction to find caches without any advance planning. That's freedom!

 

Yep, I'm getting my $30 worth. :(

 

 

The placement date has nothing to do with when a cache may change, be stolen, be submerged, or anything else - which is why I update all caches, not just the newer ones. So it doesn't really do much for efficiency, as far as that's concerned.

 

That being said, if you think you're getting good value for your money - I'm happy for you, honestly. I just happen to feel differently, that's all.

Link to comment

 

I maintain three databases, all of which are updated weekly. One gives me all the unfound caches within 150 miles of my home. A second gives me unfound caches in Eastern Pennsylvania, where I travel several times a year. The third gives me the nearest 1000 unfound caches near my parents' home in Upstate New York.

 

Even with running 30 queries per week, I still have room left over to run my "All Finds" query, or a group of five queries for trips to other areas of the country. The secret is using the "placement date" method to maximize the efficiency of my PQ's. The process is so automated that I don't give much thought to how many separate queries are involved.

 

It will take me quite a while to find all the caches in my "Home 150 miles" database -- all 10,000 of them. Last year I found nearly 1000 caches in nine states, and about 20% of those finds were outside the areas covered by my regular queries. Within my regular areas, I love having the ability to jump in the car and head off in any direction to find caches without any advance planning. That's freedom!

 

Yep, I'm getting my $30 worth. :(

 

 

The placement date has nothing to do with when a cache may change, be stolen, be submerged, or anything else - which is why I update all caches, not just the newer ones. So it doesn't really do much for efficiency, as far as that's concerned.

 

That being said, if you think you're getting good value for your money - I'm happy for you, honestly. I just happen to feel differently, that's all.

So how many caches do you think they should give you for paying only 8 cents/day? Right now you get ~300 for each penny you spend. That's a heck of a lot of entertainment for 1 cent! :laughing: I actually think we are getting a great deal and a lot of people would pay more. :( Edited by TrailGators
Link to comment

 

Yet when people make these suggestions you ask them to get out of your way.

 

 

I think you need to go back and read a bit more closely. I didn't make that statement simply because an alternate suggestion was made. I made that statement because you were implying that I was being unreasonable in not caching the way you and many others do.

 

 

You have made your request and stated your reasons that you would like to download more caches in your PQs. With your 47 finds (and yes, I read where you don't log all your finds and that you also look for caches that are not listed on Geocaching.com), I don't buy that you really have thought about ways to enjoy geocaching given the limits of PQs. But if you say you have, and you must be able to download every cache in a 100-mile radius, then it must be so.

 

 

I have thought about it extensively. And the maximum flexibility I can get to maximize my enjoyment comes from being able to execute as many or as complex filters as I need through GSAK - which occurs instantaneously, does not require an internet connection, and is not dependent on whether GC.com is running or not.

 

 

I'm pretty sure that if Geocaching.com allowed people to download every cache in a 100-mile radius, many people would do it. Using GSAK and other tools you can set up much more specific queries than you can with PQs. In addition, you might get results in less than the 10 minutes that it generally takes a new PQ to run and be delivered. This would save tremendous time for someone who can't decide where to cache until the last minute before they head out the door. (Or if you have your GSAK database on a laptop, I suppose you don't need to run a query till you get there.) You may be different than others in that you want to postpone your decisions till the last minute, but most of the other responses you got were from people who do find ways to be spontaneous about where they go caching and are simply willing to do this with only the last 5 logs instead of the last 50 logs.

 

 

You're making many assumptions about how I cache - most, if not all of them, completely wrong.

 

1. Sometimes, I take several days to plan out which caches I want to attempt. This involves many hours of searching for what I want and where I want it.

 

2. I don't always know I'm going to cache, or even have the opportunity to, until it happens. This has nothing to do with me being a person "who can't decide where to cache until the last minute".

 

3. It takes so many PQs to get a complete listing under the current limitations that, given the useless restriction of 40 *stored* PQs, I don't always have slots free to add new ones.

 

And here you go again, when you reference people "willing to do this with only the last 5 logs instead of the last 50 logs". Again, you're implying that my way of caching is unreasonable, and everyone else is doing it "the right way".

 

I'm sorry, but it's not your place to decide how a person should or should not cache. If you want to do it with 5 logs - more power to you. If someone wants to do it with 50 logs, it's none of your business, and it's not your place to judge.

 

I am not trying, nor have I ever tried, to tell everyone else that they're caching with too little research and too little data. They cache how they want to cache, and if it makes them happy - that's what counts. But a lot of you can't seem to reciprocate with that same type of open-mindedness - and THAT'S where I take issue.

 

 

Your way of geocaching may be just as valid as anybody else's. I have my personal opinion about this, but for the most part I don't care how a person chooses to cache.

 

If you really want spontaneity, Groundspeak does provide a service with the Trimble Geocache Navigator application for GPS-equipped cellphones that gives you access to the latest information from the Geocaching.com database any place where you have cell phone coverage. Of course, there are additional fees involved.

 

 

I appreciate the suggestion - but it's not an acceptable solution because:

 

1. It requires cell phone coverage - many areas do not have coverage, or do not have reliable coverage.

 

2. It requires a GPS enabled phone, which not everyone has (I certainly don't).

 

3. It assumes that GC's servers are up and running, running well, and without errors - personally, I'd rather buy a lottery ticket than gamble on that one. :(

Link to comment

 

So how many caches do you think they should give you for paying only 8 cents/day? Right now you get ~300 for each penny you spend. That's a heck of a lot of entertainment for 1 cent! :laughing: I actually think we are getting a great deal and a lot of people would pay more. :(

 

 

See, you're making a few assumptions here - and I'm guessing that it's based on how you cache. A lot of people seem to cache by simply going through a list of what's nearby. These are typically going to be the folks that make comments like "I've found 1000 of the 2100 caches around me - therefore I still have 1100 to go. Plenty for me!"

 

And if they enjoy doing that - fantastic, I'm glad that they can be happy with that. Personally, I can't stomach finding eight hundred lamppost/guardrail/I-had-30-seconds-while-taking-a-pee-so-I-thought-I'd-toss-a-cache-over-my-shoulder caches (and this is only one category, not all). In other words, most of the caches in a given area don't do it for me. So for a person who just goes down the list, you can say they get 300 caches per penny. But that doesn't hold true for me, or anyone else who's selective about their caches.

 

I remember that someone else once suggested that they should just have a regularly updated list of 52 files, representing all caches organized by state. You have to admit, if they did that, it would significantly cut down on the load on their SQL Servers. :(

Link to comment

 

So how many caches do you think they should give you for paying only 8 cents/day? Right now you get ~300 for each penny you spend. That's a heck of a lot of entertainment for 1 cent! :laughing: I actually think we are getting a great deal and a lot of people would pay more. :(

 

 

See, you're making a few assumptions here - and I'm guessing that it's based on how you cache. A lot of people seem to cache by simply going through a list of what's nearby. These are typically going to be the folks that make comments like "I've found 1000 of the 2100 caches around me - therefore I still have 1100 to go. Plenty for me!"

 

And if they enjoy doing that - fantastic, I'm glad that they can be happy with that. Personally, I can't stomach finding eight hundred lamppost/guardrail/I-had-30-seconds-while-taking-a-pee-so-I-thought-I'd-toss-a-cache-over-my-shoulder caches (and this is only one category, not all). In other words, most of the caches in a given area don't do it for me. So for a person who just goes down the list, you can say they get 300 caches per penny. But that doesn't hold true for me, or anyone else who's selective about their caches.

 

I remember that someone else once suggested that they should just have a regularly updated list of 52 files, representing all caches organized by state. You have to admit, if they did that, it would significantly cut down on the load on their SQL Servers. :(

I don't cache that way either. I maintain a favorites list. I also tap into the must-dos or favorites of places I travel to. Edited by TrailGators
Link to comment

 

I remember that someone else once suggested that they should just have a regularly updated list of 52 files, representing all caches organized by state. You have to admit, if they did that, it would significantly cut down on the load on their SQL Servers. :(

 

I'd even be willing to "pay" my 5 daily PQs for any state's complete file. No SQL work on Groundspeak's side, just the e-mail. Let's say I want to find a cache in every county in Virginia. That would certainly make the task easier.

Edited by beejay&esskay
Link to comment

The secret is using the "placement date" method to maximize the efficiency of my PQ's. The process is so automated that I don't give much thought to how many separate queries are involved.

 

The placement date has nothing to do with when a cache may change, be stolen, be submerged, or anything else - which is why I update all caches, not just the newer ones. So it doesn't really do much for efficiency, as far as that's concerned.

 

That being said, if you think you're getting good value for your money - I'm happy for you, honestly. I just happen to feel differently, that's all.

Your comment makes it clear to me that you're not understanding my method, which is a common one used by many others.

 

The placement date is just a way to split up all the caches in the desired area so there is no overlap and no missed caches. EACH cache, whether it was hidden in 2001 or last week, gets updated once each week. The first PQ runs on Monday morning and covers the oldest 500 unfound caches, with hidden dates ranging from 2000 to 2003. The last PQ runs on Friday morning -- capturing all the newest caches before the weekend -- and covers caches hidden on or after December 15, 2007. If a cache got disabled or archived in the past week, I know about it. If there were three DNF's last week on a cache hidden in 2002, I know about it. I've never wasted time looking for a cache that the owner had pulled or archived, as the odds are good that week-old data is OK for the oldest caches.

 

If you don't understand how to split up a huge area (in my case, 10,000 caches in a circle with a 150 mile radius), and how to work with the resulting data, please ask and we'll be happy to help.
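The placement-date split described above can be sketched in a few lines of Python. This is a hypothetical illustration of the bookkeeping only - the data and function names are made up, not anything Groundspeak or GSAK actually provides.

```python
from datetime import date

MAX_PER_PQ = 500  # results cap per Pocket Query

def split_by_placement_date(caches):
    """Given (placed_date, gc_code) pairs, return non-overlapping
    placement-date ranges, each covering at most MAX_PER_PQ caches.
    (A real split would also nudge boundaries so that a single
    placement day never lands in two ranges.)"""
    caches = sorted(caches, key=lambda c: c[0])
    ranges = []
    for i in range(0, len(caches), MAX_PER_PQ):
        chunk = caches[i:i + MAX_PER_PQ]
        # (oldest, newest) placement date covered by this PQ
        ranges.append((chunk[0][0], chunk[-1][0]))
    return ranges
```

Each resulting date range becomes one stored PQ, so every cache in the area comes down exactly once per update cycle, with no overlap and no gaps.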

Link to comment

Forgive me, but I'm not sure I understand where the 180,000 is coming from.

 

You said:

And yes, that means 1500 results per PQ, 15 PQs per day, and 120 PQs per week.

 

Maybe I'm doing the math wrong...

1500 caches per PQ

120 PQs per week

1500 x 120 = 180,000 caches per week

 

Are you telling me that you wolf down everything in sight when you're at a buffet? Of course not. I'm just saying that the option should be there. If I build a system for someone, I'm not just taking into account what they need to do, but what they *might* need to do.

 

Right now, the system was designed so that people could load 500 caches (the standard number for the maximum waypoints GPS units could hold at the time). They built it so that people could get that same number 5 times per day, every day, for a total of 17,500 caches per week. They built it not for what people need, but for what they *might* need. You have already indicated that you've maxed out that limitation, as have others.

 

I'm confident that if we had 1500 caches per PQ and 15 PQs per day, people would pull down 22,500 caches per day. They might not pull down a different 22,500 caches every day, but they'd pull down as close to 22,500 caches per day as they can.
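Just to make the arithmetic in this exchange explicit (these are the thread's own figures, nothing official):

```python
# Current limits quoted in this thread: 500 caches per PQ, 5 PQs per day.
current_weekly = 500 * 5 * 7   # 17,500 caches per week

# Tripled figures floated above: 1500 caches per PQ, 15 PQs per day.
proposed_daily = 1500 * 15     # 22,500 caches per day

# The 180,000/week figure came from multiplying 1500 by "120 PQs per week".
quoted_weekly = 1500 * 120     # 180,000 caches per week
```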

 

You made the buffet analogy - and I'll take it a little further, as it seems a good analogy. The cooks spend time and money maintaining the food, keeping it hot and the warmers running. They're giving out standard-sized plates, and most people are filling their plates and are satisfied with the food. What you seem to be asking for is to go to the buffet with a bigger plate so you can take more food and only eat what you want. Isn't it possible to take a smaller plate back to the buffet and only get the food you really want?

 

I'm done arguing this. I'll wait and see if any official response comes from Groundspeak, because they have been very responsive to this type of request in the past :(

Link to comment

 

Your comment makes it clear to me that you're not understanding my method, which is a common one used by many others.

 

The placement date is just a way to split up all the caches in the desired area so there is no overlap and no missed caches. EACH cache, whether it was hidden in 2001 or last week, gets updated once each week. The first PQ runs on Monday morning and covers the oldest 500 unfound caches, with hidden dates ranging from 2000 to 2003. The last PQ runs on Friday morning -- capturing all the newest caches before the weekend -- and covers caches hidden on or after December 15, 2007. If a cache got disabled or archived in the past week, I know about it. If there were three DNF's last week on a cache hidden in 2002, I know about it. I've never wasted time looking for a cache that the owner had pulled or archived, as the odds are good that week-old data is OK for the oldest caches.

 

If you don't understand how to split up a huge area (in my case, 10,000 caches in a circle with a 150 mile radius), and how to work with the resulting data, please ask and we'll be happy to help.

 

 

I understand that method just fine, and I use it myself. I think I just misunderstood what you were saying by bringing it up. For me, I use it to eliminate overlap, and it *still* takes 25 PQs to get a 100 mile radius. :(

Link to comment

Wow. 12,000+ caches within 100 miles is pretty cache-dense. Consider whittling down that data. One way, of course, is to find caches. I'm having fun right now, trying to keep my database below 10,000 caches. Between finding 14 caches in the past week and seeing a bunch of others get archived, I got the number down to 9991. :(

 

Another way is to be selective in the caches requested by your pocket queries. If I ever max out my queries, I'd likely eliminate mystery/unknown caches, since I rarely hunt for them. Others don't like micros so they exclude that cache size from their queries.

 

In your case I noticed that in three years, you've never attended an event, or logged a virtual or webcam cache. And, you've only found one cache with a terrain rating greater than two stars. These are criteria you might consider using to slim down your queries.

Link to comment

 

Maybe I'm doing the math wrong...

1500 caches per PQ

120 PQs per week

1500 x 120 = 180,000 caches per week

 

 

Ah, that was a mistake on my part, for which I apologize. My referencing 120 PQs per week was a tripling of the current limit of 40 PQs - but that's not a weekly limit, it's the total number of stored PQs period.

 

What I should have said was that there should no longer be a limit on stored PQs. If they're already limiting the number of PQs that can be run, there's no need to limit the number that you can create to choose from.

 

 

You made the buffet analogy - and I'll take it a little further, as it seems a good analogy. The cooks spend time and money maintaining the food, keeping it hot and the warmers running. They're giving out standard-sized plates, and most people are filling their plates and are satisfied with the food. What you seem to be asking for is to go to the buffet with a bigger plate so you can take more food and only eat what you want. Isn't it possible to take a smaller plate back to the buffet and only get the food you really want?

 

 

Well, the buffet analogy worked for what I was pointing out - but it doesn't work so well here, because there's no need for people to keep track of the total food on the table.

 

A slightly better analogy here is computer memory. A few years ago, the average consumer PC had around 512 MB of RAM. That was fine at the time, but nowadays there are software apps showing 1-2 GB *minimum* requirements. Currently, people's needs and wants have pushed the average consumer PC to around 2 GB of RAM. And the cost of 2 GB of RAM today is approximately what 512 MB cost years ago. The price is about equal, but you're getting more, because current apps use more memory and because people are doing more with their PCs.

 

What GC is doing is the equivalent of saying that 512 MB was sufficient years ago, so they're only going to offer 512 MB now. Sorry, but cache density has grown - and what they offer should grow to match it.

Link to comment

 

Wow. 12,000+ caches within 100 miles is pretty cache-dense. Consider whittling down that data. One way, of course, is to find caches. I'm having fun right now, trying to keep my database below 10,000 caches. Between finding 14 caches in the past week and seeing a bunch of others get archived, I got the number down to 9991. :(

 

Another way is to be selective in the caches requested by your pocket queries. If I ever max out my queries, I'd likely eliminate mystery/unknown caches, since I rarely hunt for them. Others don't like micros so they exclude that cache size from their queries.

 

In your case I noticed that in three years, you've never attended an event, or logged a virtual or webcam cache. And, you've only found one cache with a terrain rating greater than two stars. These are criteria you might consider using to slim down your queries.

 

 

As I've said before, I often don't log my caches, so the count is misleading.

 

As to your other suggestion - I already limit the categories of caches that my PQs pull. What's left is still a large set, but I choose from all the remaining categories and difficulty/terrain levels, so I can't limit it any further.

Link to comment
I have thought about it extensively. And the maximum flexibility I can get to maximize my enjoyment comes from being able to execute as many or as complex filters as I need through GSAK - which occurs instantaneously, does not require an internet connection, and is not dependant on whether GC.com is running or not.

 

...So you support my assertion that we should have better and more complex filtering options for a PQ??

 

I mean seriously, if you could target that data more precisely from the source, your downloads would be more efficient and require downloading fewer caches, right??

 

...and best as I can tell, gc.com is up, running, and available somewhere way north of 98% of the time.

Link to comment
I have thought about it extensively. And the maximum flexibility I can get to maximize my enjoyment comes from being able to execute as many or as complex filters as I need through GSAK - which occurs instantaneously, does not require an internet connection, and is not dependant on whether GC.com is running or not.

 

...So you support my assertion that we should have better and more complex filtering options for a PQ??

 

I mean seriously, if you could target that data more precisely from the source, your downloads would be more efficient and require downloading fewer caches, right??

 

...and best as I can tell, gc.com is up, running, and available somewhere way north of 98% of the time.

Better filtering is the way to go. Later this year, V2 of the site is going to let people give awards to caches, like most scenic, best camo, etc. I think this will also entice people to try to get awards, which is great. This should also allow some better filtering.
Link to comment

 

...So you support my assertion that we should have better and more complex filtering options for a PQ??

 

I mean seriously, if you could target that data more precisely from the source, your downloads would be more efficient and require downloading fewer caches, right??

 

...and best as I can tell, gc.com is up, running, and available somewhere way north of 98% of the time.

 

I'm afraid you don't understand. What I was saying worked best for me was the ability to run as many filters as I needed *** through GSAK ***. In other words, having a complete set of data on hand and being able to switch what I'm looking for at a moment's notice - which is something that you can't do if you've filtered the data set itself.

Link to comment

Since you can get five PQs per day, containing almost 2500 caches, particularly if you set up your PQs by "Date Placed," how many more caches do you need? :unsure:

 

Hilarious. :D:):lol:

 

Except that there's a serious response. If I only like 10 caches out of 1000, then the answer to how many more I need is "a lot". Some of you really need to realize that not everyone is as... indiscriminate... about which caches they go for.

Link to comment

 

...So you support my assertion that we should have better and more complex filtering options for a PQ??

 

I mean seriously, if you could target that data more precisely from the source, your downloads would be more efficient and require downloading fewer caches, right??

 

...and best as I can tell, gc.com is up, running, and available somewhere way north of 98% of the time.

 

I'm afraid you don't understand. What I was saying worked best for me was the ability to run as many filters as I needed *** through GSAK ***. In other words, having a complete set of data on hand and being able to switch what I'm looking for at a moment's notice - which is something that you can't do if you've filtered the data set itself.

I am afraid - you do not understand - if you could do ALL of those advanced filters on the PQ itself before any of the data got to you - would the 500/2500 limits be enough for you??

 

Since you can get five PQs per day, containing almost 2500 caches, particularly if you set up your PQs by "Date Placed," how many more caches do you need? :unsure:

 

Hilarious. :D:):lol:

 

Except that there's a serious response. If I only like 10 caches out of 1000, then the answer to how many more I need is "a lot". Some of you really need to realize that not everyone is as... indiscriminate... about which caches they go for.

 

Again - can't you see that if you could run some more evolved PQs via filtering that the limits start making much more sense. No need for GSAK filters - no need to worry about what you will find etc.....

 

BTW - your memory analogy is quite incorrect. The limit is designed around how many waypoints will fit into many GPSr units. That number (for even many of the newest units - [see garmin "h" line]) is either 500 or 1000. You keep assuming that the limits had (have) something to do with density or server limits or something. In fact the limits are designed around what a GPS can hold - that has not changed over the years. TPTB do not support the creation and use of offline databases. Never have. The idea behind a PQ - is to target the types of caches you would like to find in an area you will be in. Then go find them.

 

What has changed, IMHO - is the need to do more advanced filtering - things like log length, number of logs, keywords in logs and descriptions, types of logs, date since last found - etc.........

Link to comment

 

I am afraid - you do not understand - if you could do ALL of those advanced filters on the PQ itself before any of the data got to you - would the 500/2500 limits be enough for you??

 

 

You still don't get it. I run different filters all the time, based on what I'm looking for. I might run 10 filters in 10 minutes, while I'm researching. I might be in the mood for one thing, one day, and be in the mood for something else, the next. I'm not going to run any filters to limit the root data set because once I do that, I shoot myself in the foot as I can no longer change the filter.

 

Do you understand, now?
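As a sketch of that workflow: keep one complete local dataset and swap filter predicates freely - only the filter changes, never the underlying data. (The field names and sample records below are hypothetical, not GSAK's actual schema.)

```python
# Hypothetical local cache database - the complete, unfiltered dataset.
caches = [
    {"code": "GCAAA1", "type": "Traditional", "size": "Micro",   "terrain": 1.5},
    {"code": "GCAAA2", "type": "Multi",       "size": "Regular", "terrain": 3.0},
    {"code": "GCAAA3", "type": "Traditional", "size": "Regular", "terrain": 2.0},
]

def run_filter(dataset, predicate):
    """Filter the full dataset; the dataset itself is never reduced,
    so a different predicate can be applied a moment later."""
    return [c for c in dataset if predicate(c)]

# Two different moods, ten seconds apart - no new download required.
no_micros   = run_filter(caches, lambda c: c["size"] != "Micro")
hilly_hikes = run_filter(caches, lambda c: c["terrain"] >= 3.0)
```

The point of contention in this exchange is exactly this: a filter baked into the PQ itself reduces the root dataset, while a filter run locally can be thrown away and replaced at a moment's notice.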

 

 

BTW - your memory analogy is quite incorrect. The limit is designed around how many waypoints will fit into many GPSr units. That number (for even many of the newest units - [see garmin "h" line]) is either 500 or 1000. You keep assuming that the limits had (have) something to do with density or server limits or something. In fact the limits are designed around what a GPS can hold - that has not changed over the years. TPTB do not support the creation and use of offline databases. Never have. The idea behind a PQ - is to target the types of caches you would like to find in an area you will be in. Then go find them.

 

What has changed, IMHO - is the need to do more advanced filtering - things like log length, number of logs, keywords in logs and descriptions, types of logs, date since last found - etc.........

 

 

First of all, there have been various statements by GC staff that contradict that. Responses to requests to raise the limits often have nothing to do with handheld GPS limits - but rather, state that GC staff feel that the limits are sufficient for people's needs in general. In light of those statements, my memory analogy works just fine.

 

And while I have not done an extensive search on the subject, I don't recall any staff member yelling at people because they stated that they used GSAK or a similar program. So, while the current limits might have been *originally* intended to mesh with handheld GPS units, that is an archaic standard - not only because programs like GSAK, which didn't exist then, exist now - but also because many handheld units can store much more than 500 waypoints.

 

As to what has changed, you forgot something - nowadays, many people want and expect the ability to download cache data for the purpose of creating offline databases. Whether GC is happy with that or not is irrelevant to the fact that many people are using programs like GSAK - and *like* it.

 

Cell phone carriers have long said that they don't support their customers unlocking their phones. They can say it all they want - the CUSTOMERS have stated that these are *their* phones, they paid for them, and they'll do what they want with them. And you know what? The courts have agreed with them. So don't use a company's position on something as the be-all and end-all when it comes to a discussion of what customers need and want. It doesn't work.

Link to comment
You still don't get it. I run different filters all the time, based on what I'm looking for. I might run 10 filters in 10 minutes, while I'm researching. I might be in the mood for one thing, one day, and be in the mood for something else, the next. I'm not going to run any filters to limit the root data set because once I do that, I shoot myself in the foot as I can no longer change the filter.

 

Do you understand, now?

No.

 

I still believe the ability to fine-tune the filtering (at any time) is best served on the servers. All the data is stored there, it will not be changing there, and it is the most current - so I should expect to be able to do it there. If I continue with the current scheme, I can change my mind about what I want five times per day, every day. And still get all the cache info I want for up to 500 caches in the 4.8 hours between each change I make.

 

Jeremy himself has stated that PQs were never intended to create an offline database. He further states that he does not understand why people think that PQs have that purpose. Use the search - it is there. It is not Groundspeak's fault that you have made the choice to use them in an unsupported manner. Do not heap the results of that choice on them as a problem that needs to be fixed. They created a tool to load caches into your GPSr and go caching. You chose to keep it. You chose to build it up. You chose to further slice and dice it. Don't go blaming somebody else for not supporting your choices.

 

Many people use butter knives as screwdrivers. I've seen it. The fact that my butter knife makes a poor screwdriver should not be blamed on the manufacturer. I should not expect the cutlery maker to add a grip and interchangeable blades because the original is not good at driving in screws. It wasn't meant to be used that way. Same for PQs. The mere fact that it "can" be used to create large offline data sets is not reason enough to say it was created for that purpose.

Link to comment
You still don't get it. I run different filters all the time, based on what I'm looking for. I might run 10 filters in 10 minutes, while I'm researching. I might be in the mood for one thing, one day, and be in the mood for something else, the next. I'm not going to run any filters to limit the root data set because once I do that, I shoot myself in the foot as I can no longer change the filter.

 

Do you understand, now?

No.

 

I still believe the ability to fine-tune the filtering (at any time) is best served on the servers. All the data is stored there, it will not be changing there, and it is the most current - so I should expect to be able to do it there. If I continue with the current scheme, I can change my mind about what I want five times per day, every day. And still get all the cache info I want for up to 500 caches in the 4.8 hours between each change I make.

 

 

<sigh> Let me try to put this in simpler terms for you:

 

------------------------------------------

If I did things My Way:

 

"Hmm... I have the time to grab about six caches tomorrow. I think I'd like to grab a couple of micros."

 

--- Run GSAK Filter ---

 

"Okay, there's two of them that are promising. Hmm... I think I'd also like to grab a couple of traditionals."

 

--- Run GSAK Filter ---

 

"Way too many. Let's try ones that are at least a Difficulty of 3."

 

--- Run GSAK Filter ---

 

"There we go, I'll grab those. I think I'd also like to do a couple of Mystery caches."

 

--- Run GSAK Filter ---

 

"There's a lot here, but they're not the type that I like. Hmmm... I remember liking the ones by Avroair, let's try looking for those."

 

--- Run GSAK Filter ---

 

"Aha! Okay, now I've got my list!"

------------------------------------------

 

------------------------------------------

If I did things Your Way:

 

"Hmm... I have the time to grab about six caches tomorrow. I think I'd like to grab a couple of micros."

 

--- Set PQ to grab only micros ---

 

"Okay, there's two of them that are promising. Hmm... I think I'd also like to grab a couple of traditionals."

 

"... shoot."

 

------------------------------------------

 

You talk about changing your mind up to 5 times a day. Well, I can change what I'm looking for 500 times a day (which isn't the same as changing my mind), and it doesn't require GC to be up and running, it doesn't require an internet connection, I don't have to be home, it puts zero load on GC's servers...

 

If you still don't get it, I'm not sure I can make it any clearer for you.

 

 

 

Jeremy himself has stated that PQs were never intended to create an offline database. He further states that he does not understand why people think that PQs have that purpose. Use the search - it is there. It is not Groundspeak's fault that you have made the choice to use them in an unsupported manner. Do not heap the results of that choice on them as a problem that needs to be fixed. They created a tool to load caches into your GPSr and go caching. You chose to keep it. You chose to build it up. You chose to further slice and dice it. Don't go blaming somebody else for not supporting your choices.

 

Many people use butter knives as screwdrivers. I've seen it. The fact that my butter knife makes a poor screwdriver should not be blamed on the manufacturer. I should not expect the cutlery maker to add a grip and interchangeable blades because the original is not good at driving in screws. It wasn't meant to be used that way. Same for PQs. The mere fact that it "can" be used to create large offline data sets is not reason enough to say it was created for that purpose.

 

 

I never said that PQs were originally created for the purpose of offline databases. That doesn't change the fact that many people TODAY want to use them for that purpose. Record companies may have *intended* for music to be listened to by playing the original CDs directly. That doesn't change the fact that many people TODAY want to be able to load it on their MP3 players, or create a mix CD from the music they've already bought, or put it on their computer so they can listen through iTunes/WinAmp/etc... The companies and artists that recognize this are well positioned to flourish through the upcoming years. The ones who insist on standing by their original intentions, crafted years ago before MP3 players or consumer CD burners even existed, are being left in the dust.

 

That's a much better analogy than your butter knife and screwdriver one. Because programs like GSAK *intend* for people to use those GPX files to populate them. Your analogy is about people using an item for something it's not suited for - which isn't the same as using an item for something it may not have been intended for, but works just fine all the same.

Link to comment

Many people use butter knives as screwdrivers. I've seen it. The fact that my butter knife makes a poor screwdriver should not be blamed on the manufacturer.

 

Although I am a database maintainer using GSAK (and thus on the other side of this argument), that is a nice analogy.

 

Jeremy is in the butter knife business. Some of us have found the Geocaching Swiss Army Knives work better for us. Every so often we try to convince the butter knife manufacturer to make things easier for the army knife users. The manufacturer rarely responds but his supporters tell us we are wrong to expect any help.

 

So we soldier on, enhancing the butter knife output with our army knives.

Edited by beejay&esskay
Link to comment

....

That's a much better analogy than your butter knife and screwdriver one. Because programs like GSAK *intend* for people to use those GPX files to populate them. Your analogy is about people using an item for something it's not suited for - which isn't the same as using an item for something it may not have been intended for, but works just fine all the same.

Ahhhh - but I myself have actually successfully used a butter knife as a screwdriver. It worked. I was successful.

 

But they never intended for me to do that and still don't support it.

 

oh BTW

 

I would also propose that I slice and dice all that data on the servers and then when I get the result set I like (including a diverse set of criteria) - I download the PQ. Still allowing for everything you are trying to do.

 

It is the same data up on the servers, isn't it? Tell me why you think you can slice it and dice it better on your machine. I want to see a vastly expanded set of filtering criteria. You seem to be stuck on the idea that you will ALWAYS do it better. Tell us what criteria you want to see - that would be more helpful. I think we have a much better shot at that than at ever seeing more data coming down the pike.

Link to comment

The only changes to the PQ system planned for release when the new Phoenix project is completed will be the introduction of instant downloads. There are some details that have yet to be finalized concerning the storage of those PQs on Geocaching.com, but those issues will be worked out.

 

For those of you who are not satisfied with the limits set by the Pocket Query generator there are numerous ways to refine your searches; many of which have been pointed out in this thread. There are important reasons for Groundspeak setting those limits, foremost being that the site performs better when they are enforced, but also because we want you to visit the site frequently to retrieve fresh data. I'm sorry if that inconveniences some of you.

Link to comment

 

Ahhhh - but I myself have actually successfully used a butter knife as a screwdriver. It worked. I was successful.

 

But they never intended for me to do that and still don't support it.

 

 

That's not quite it - after all, you said yourself that a butter knife makes a *poor* screwdriver. It's not that you can never make it work, but it's a lousy substitute under most circumstances. There is, however, zero loss of function if I take music from a purchased CD and put it on my computer or burn a mix CD.

 

And in any case, while people might occasionally use a butter knife as a screwdriver, how many people *expect* a butter knife to work just as well as a screwdriver? Not many, if any. On the other hand, there are many people who want and expect music CDs to be rippable for legal reasons, and many people who want and expect downloaded caches to work with offline DB programs. And they both work perfectly with those alternate functions. That's the way things are nowadays - and just like the record companies, you adapt or you get left behind.

 

 

oh BTW

 

I would also propose that I slice and dice all that data on the servers and then when I get the result set I like (including a diverse set of criteria) - I download the PQ. Still allowing for everything you are trying to do.

 

 

No, it doesn't allow for everything I'm trying to do. Are you even reading the posts?

 

Offline DB: You can change what you're looking for an UNLIMITED number of times per day.

GC Website: You can change what you're looking for FIVE times per day.

 

Offline DB: Works, regardless of whether or not GC.com is up, down, running slow, or having problems.

GC Website: ONLY works if GC is up and running, not having problems, and is running at a minimally useable speed.

 

Offline DB: Filtering results are shown immediately.

GC Website: PQs are unpredictable as to when, or even IF they get sent out. You can minimize the time by creating a new PQ, instead of modifying an old one, but that still means you might wait 10 minutes per change, as opposed to TWO SECONDS.

 

Offline DB: You can save as many filters as you want, to be used at any time.

GC Website: You are limited to 40 PQs, and if you have the temerity to say it's not enough, you get piled on by a bunch of people who tell you that there must be something wrong with you if you're not doing things the way they do.

 

Offline DB: Works, regardless of whether you have an internet connection or not.

GC Website: ONLY works if you have internet access.

 

Offline DB: Filtering puts zero load on GC's servers.

GC Website: Filtering puts load on GC's servers - not only their web servers, but also their SQL servers.

 

 

How can you think the two options are equal in functionality?

 

 

It is the same data up on the servers isn't it? Tell me why you think you can slice it and dice it better on your machine. I want to see a vastly expanded set of filtering criteria. You seem to be stuck on the idea that you will ALWAYS do it better. Tell us what criteria you want to see - that would be more helpful. I think we have a much better shot at that then ever seeing more data coming down the pike.

 

 

GSAK offers more flexibility and more speed than GC's PQ creation. If you want to know exactly what it can handle, take a look at their website - or better yet, try downloading and learning to use it.

 

Oh, and there's something else that's a very significant difference - the author of GSAK is very responsive and nice when it comes to user requests. GC on the other hand... I don't think I really need to finish that.

Link to comment

 

For those of you who are not satisfied with the limits set by the Pocket Query generator there are numerous ways to refine your searches; many of which have been pointed out in this thread. There are important reasons for Groundspeak setting those limits, foremost being that the site performs better when they are enforced, but also because we want you to visit the site frequently to retrieve fresh data. I'm sorry if that inconveniences some of you.

 

 

PQs get emailed. I can get fresh data without ever needing to visit the site, so it's not an inconvenience at all. The current number limits, however, are. And as has been stated before, along with the increase in caches over the years has also come an increase in members, and thus an increase in funds available to upgrade those systems.

 

And that's aside from suggestions like the 52 Complete State files, which would almost completely eliminate the PQ load on your SQL servers.

Link to comment

 

I am afraid - you do not understand - if you could do ALL of those advanced filters on the PQ itself before any of the data got to you - would the 500/2500 limits be enough for you??

 

 

You still don't get it. I run different filters all the time, based on what I'm looking for. I might run 10 filters in 10 minutes, while I'm researching. I might be in the mood for one thing, one day, and be in the mood for something else, the next. I'm not going to run any filters to limit the root data set because once I do that, I shoot myself in the foot as I can no longer change the filter.

 

Do you understand, now?

You do understand that you can preview a pocket query online before you get the GPX file sent to you. If there were more extensive tools online you could run your five hundred queries a day, use the result to build bookmark lists of the caches you might want to find, and then, when you're ready, get a GPX file with just those caches to load into your PDA and GPS.

 

However, I'm going to guess that no matter how TPTB improve on the online querying capability, someone will develop GSAK macros to allow even more slicing and dicing of the data to help you select the caches you want to find. So long as this is true, I suspect that you will find you need to have your own copy of the Geocaching.com database (at least for the 100-mile radius of your home) so that you don't miss out on that one must-do cache you wouldn't find using the online tools. Jeremy may at some point increase the number of caches you can download in a pocket query. Because of increased cache density - which might include MicroSpew or other caches that you would like to eliminate from your hunts - I could see a change that would be helpful to cachers who are unable to find other ways to cope. But I strongly suspect there will always be some limit, simply so that Groundspeak can claim proprietary rights to the collection of cache data. In addition, as OpioNate's post states, allowing larger downloads would put a strain on the servers that Groundspeak currently uses. If you were willing to pay for the increased load to meet your need, there might be some room for negotiating.

 

 

As to what has changed, you forgot something - nowadays, many people want and expect the ability to download cache data for the purpose of creating offline databases. Whether GC is happy with that or not is irrelevant to the fact that many people are using programs like GSAK - and *like* it.

 

Cell phone carriers have long said that they don't support their customers unlocking their phones. They can say it all they want - the CUSTOMERS have stated that it's *their* phones, they paid for it, and they'll do what they want with them. And you know what? The courts have agreed with them. So don't use a company's position on something as the be all and end all when it comes to a discussion of what customers need and want. It doesn't work.

Your premium membership entitles you to a certain number of caches in your PQs. I don't know what Groundspeak's stance is on getting multiple premium memberships if you need to get more data. That might be the fairer way to handle it. It may be that cell phone providers can't stop someone, once they have purchased the hardware, from modifying it to work on a different network. I suppose that in the same way, once you get your PQ from Geocaching.com, they can't stop you from storing it in an offline database. What they can do is limit the amount of data you can get at one time, and make it difficult to keep track of when the data goes stale or a cache gets archived. Aggregators of data like Groundspeak rely on people visiting their websites in order to sell advertising for their revenue. The fact that for $3/month they allow you to download a not insignificant amount of data is a bonus.

 

I'm going to log onto BitTorrent now and download a movie :unsure:

Link to comment
PQs get emailed. I can get fresh data without ever needing to visit the site, so it's not an inconvenience at all. The current number limits, however, are. And as has been stated before, along with the increase in caches over the years, also has come an increase in members, and thus an increase in funds available to upgrade those systems.
Did I miss your reply as to why you can't (and shouldn't be expected to) simply buy an additional membership if you want to download a greater number of caches?

 

Seriously, if you 'need' three times as many PQs as you can currently download, simply buy three memberships. For $1.75 per week, you can have what you want. Why don't you do this if you absolutely need to have that much data to cache the way you want to?

And that's aside from suggestions like the 52 Complete State files, which would almost completely eliminate the PQ load on your SQL servers.
Here's the problem with that idea:

 

In the United States, we have the freedom to move about the country freely. Personally, I own a home in Tennessee, but I frequently travel to New York, California, and Florida. Do I need cache data for each of the 78,424 caches in these four states? Of course not, but I might want it if PQs were packaged that way. In addition, if I am close to the edge of those states, I may visit any number of the surrounding states. I should be able to get all the cache data for those states, also. After all, it wouldn't be a huge load on GS's servers to simply email me that information. Of course, it would quickly do away with the one thing that TPTB have to sell.

Link to comment

And that's aside from suggestions like the 52 Complete State files, which would almost completely eliminate the PQ load on your SQL servers.

Granted, I've been on vacation for the last two weeks, but I'm sure I would have heard about two additional states being added to the country. The flags I saw on my trip all had 50 stars on them.

 

The underlying suggestion is a good one though: being able to download your entire state as a PQ. Maybe it could be run once a month (especially in cache-rich states like California), and then you could run daily/weekly PQs of caches that have changed since your last whole-state download. Even better would be a 100-mile download for the people that live near state borders, so they wouldn't have to run whole-state PQs for several states.
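The whole-state-dump-plus-deltas scheme suggested above amounts to a merge keyed by cache code. A minimal sketch, assuming hypothetical record fields (`code`, `archived`) invented for illustration - this is not Groundspeak's actual GPX schema:

```python
# Sketch of the monthly full dump + daily/weekly delta scheme described above.
# Cache records are keyed by their GC code; newer records replace older ones.
# Field names here are illustrative assumptions, not a real schema.

def merge_updates(full_dump, deltas):
    """Merge incremental PQ results into a full-state snapshot."""
    db = {cache["code"]: cache for cache in full_dump}
    for cache in deltas:
        db[cache["code"]] = cache  # newer data wins
    # Drop anything flagged archived since the last full dump
    return [c for c in db.values() if not c.get("archived", False)]

state = [
    {"code": "GC1111", "name": "Old Oak", "archived": False},
    {"code": "GC2222", "name": "River Walk", "archived": False},
]
weekly = [
    {"code": "GC2222", "name": "River Walk", "archived": True},   # archived since dump
    {"code": "GC3333", "name": "New Bridge", "archived": False},  # newly placed
]

current = merge_updates(state, weekly)
print(sorted(c["code"] for c in current))  # → ['GC1111', 'GC3333']
```

The catch, as other posts in this thread note, is knowing about archivals: a delta PQ of "caches changed since date X" would have to include archived listings, or the offline copy slowly fills with dead caches.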

Link to comment

 

Let's say you want to get the closest 1000 caches and have it be 1000 caches. You can easily determine an approximate radius by going to www.geocaching.com/my and searching for the caches closest to your home coordinates, excluding your finds.

 

Then add &dist=30 or some other number to the end of the URL and see how many return (the default for the distance is 50 miles, changing it to 30 reduces it to 30 miles). Change the number until you find something approximating 1,000 caches. You can even use decimals like 27.42.

 

Although this is not the ideal solution, the quoted info (the &dist= and the fact you could use decimal values) is very, very handy. Thanks for sharing it.

 

MrW.
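For what it's worth, the quoted `&dist=` trick is easy to script when you're trying different radii. A minimal sketch, assuming you paste in the search-results URL from your browser - the example URL below is a placeholder, not a documented endpoint:

```python
# Sketch of the &dist= trick quoted above: take the search-results URL from
# your browser and append (or replace) a dist value in miles. Per the post,
# decimal values like 27.42 work. The example URL is a placeholder only.

def with_dist(url, miles):
    """Append or replace the dist parameter on a search URL."""
    base, _, query = url.partition("?")
    params = [p for p in query.split("&") if p and not p.startswith("dist=")]
    params.append(f"dist={miles}")
    return base + "?" + "&".join(params)

url = "http://www.geocaching.com/seek/nearest.aspx?origin=home"
print(with_dist(url, 30))                     # ...&dist=30
print(with_dist(with_dist(url, 50), 27.42))   # dist replaced, not duplicated
```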

Link to comment

Something strange happened to me while reading this thread -- I changed my mind.

 

I used to argue that the existing PQ limits were too low and needed to be increased - either more PQs per day, more caches per PQ, or whatever.

 

After reading this thread I realize that the current limits work fine for a majority of people, and when I think about it they work fine for me 99% of the time too. I think if one were to do a cost/benefit analysis of increasing the limits it would be hard to justify it.

 

As a sys admin we run into this sort of thing all the time when it comes to storage quotas. We set a limit which works for the vast majority of people. The minority of people who need more complain to us and we tell them they can have more but they will pay more than others. If only there was some way to do this for pocket queries where those who need more queries could get them without all of us having to pay...

 

Oh wait, there is! And, it has been suggested in this thread many times. Buy more premium memberships and you run as many queries as you want. :huh:

Link to comment

One "state" would be the District of Columbia.

 

The District of Columbia is a district, not a state.

 

hence the use of the quotation marks around the word "state"...signifying that while not a state of the union, it would be included in the list of available defined geographical regions for PQs.

Link to comment

For me, the 500 cache/5 PQ a day limit is quite onerous. We frequently travel and cache, or maybe just cache and follow our noses, or work on challenge caches - for example, the DeLorme challenges. When I planned the Oregon DeLorme cache trip - which we completed last fall, driving over 6,500 miles over three trips and about 9 days - it took me a LOT of time to judge and size pocket queries for Oregon that I could load into GSAK, and then to set up route filters to accommodate the 1,000-waypoint limit of my GPS at the time.

In order to determine which caches meet the criteria for each DeLorme page, I had to drop them into MapPoint with custom DeLorme grid lines drawn. Only then could I visually determine the best route and select the caches as we needed them. But all those PQs had to be designed and run first.

 

For us, the freedom to have up-to-date cache information available really adds to the spontaneity and increases our fun. The seeking is fun, but so is the finding - and working with out-of-date info, such as old PQs, means more DNFs.

Adding more memberships could be an alternative, but they don't allow PQs that drop another member's (your) finds.

IF it truly costs more for the system to provide larger PQs, then perhaps a higher-level membership - allowing much larger PQs, more PQs per day, and the storage of more PQs formatted and ready for use - could be made available to those of us who can make use of the larger numbers.

It really isn't as much fun spending so much indoor time trying to work around the current limitations, at the cost of time actually out caching with friends.

There are enough forum posts about this subject that maybe it is time that something be done. We aren't privy to all the reasons not to update this part of the system but we can say that changing the limitations would be greatly appreciated by many.

Link to comment

I have read this and many other threads on the same subject, but I still cannot understand why someone needs information on 2500 caches per day.

 

But as stated, a solution could be that if you need more queries, you could get them by paying - let's say, for example, $10 per query.

Link to comment

So, just to throw my 2 cents into the mix: I really think there is a misunderstanding of how the people who want a slightly higher limit are actually using their pocket queries.

 

First, my details. I live near Indianapolis. I have 7 queries set up to get all the caches within just 50 miles of my house - not really that unrealistic a distance for spur-of-the-moment caching. Those caches are broken up, as suggested by others here, into date ranges. I did it, but it was a pain. I get updates on the caches close to my house once a week, so they are up to date.

 

I then use GSAK to put them into a single database and export the file back out to my PDA. If I'm going caching, I grab the area near where I'm going and dump the nearby caches onto the GPS.

 

My point for wanting a higher limit isn't to download 500,000 caches/week. It's about ease of use. I really liked the idea someone else put in of making it a # caches/week downloaded limit instead of "max 5 queries/day of 500 caches each".

 

You're right, 17,500 caches can be downloaded every week. I don't want that many. I just would like an easier way to get the caches within 50 miles of my house. So, from my perspective, let's stop arguing about upping the max caches/query or queries/day and push towards a max # caches/week limit or something like that.

 

The other reason for this type of thing is the example of me going on vacation in a few months. I'm going, apparently, to a fairly cache-dense area in Tennessee. I will have zero internet connection while there. I'm going with others, so I can't "plan" my entire week ahead of time. I know we're going to try to go hiking a couple of days, and that will take me to areas possibly 50-60 miles from my condo. Unfortunately, I can't hit that kind of range - I can only get about 20 miles from the condo before hitting the limit. I'd GLADLY disable my normal queries, and again would just like the ability to get about 2000 caches in one query to save me the hours of setting up the Pocket Queries to get that data.
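The date-range trick mentioned above - splitting one area into several PQs by placed date so each stays under 500 caches - can be sketched as follows. The placed dates below are fabricated for illustration, and in practice you would also need to make sure a boundary date doesn't land in two PQs:

```python
# Sketch of splitting an area's caches into placed-date ranges so that each
# pocket query returns at most 500 caches. The dates are fabricated examples.
from datetime import date, timedelta

def date_ranges(placed_dates, limit=500):
    """Partition sorted placed dates into (start, end) ranges of <= limit caches."""
    dates = sorted(placed_dates)
    ranges = []
    for i in range(0, len(dates), limit):
        chunk = dates[i:i + limit]
        ranges.append((chunk[0], chunk[-1]))
    return ranges

# 1200 caches placed one per day starting 2004-01-01 → three PQ date ranges
placed = [date(2004, 1, 1) + timedelta(days=i) for i in range(1200)]
for start, end in date_ranges(placed):
    print(start, "to", end)
```

The pain the poster describes is that these boundaries drift: as new caches are placed, each range creeps toward 500 and the PQs have to be re-balanced by hand.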

Link to comment

(With the rate that new caches are being placed I totally agree that we should be able to produce more query results, but)

I would be much happier if I could retain more than 30 queries.

I have a set of queries for my home area and several sets for when I plan to travel around the country. I don't want to run them all, all of the time, but I want them kept in my library so I can run them when I need to. I can't see how the overhead required to let each premium member maintain 100 queries could be anything other than trivial.

 

Please, please increase this limit. :lol: Thanks for listening.

Link to comment

So, just to throw my 2 cents into the mix: I really think there is a misunderstanding of how the people who want a slightly higher limit are actually using their pocket queries.

 

First, my details. I live near Indianapolis. I have 7 queries set up to get all the caches within just 50 miles of my house - not really that unrealistic a distance for spur-of-the-moment caching. Those caches are broken up, as suggested by others here, into date ranges. I did it, but it was a pain. I get updates on the caches close to my house once a week, so they are up to date.

 

I then use GSAK to put them into a single database and export the file back out to my PDA. If I'm going caching, I grab the area near where I'm going and dump the nearby caches onto the GPS.

 

My point for wanting a higher limit isn't to download 500,000 caches/week. It's about ease of use. I really liked the idea someone else put in of making it a # caches/week downloaded limit instead of "max 5 queries/day of 500 caches each".

 

You're right, 17,500 caches can be downloaded every week. I don't want that many. I just would like an easier way to get the caches within 50 miles of my house. So, from my perspective, let's stop arguing about upping the max caches/query or queries/day and push towards a max # caches/week limit or something like that.

 

The other reason for this type of thing is the example of me going on vacation in a few months. I'm going, apparently, to a fairly cache-dense area in Tennessee. I will have zero internet connection while there. I'm going with others, so I can't "plan" my entire week ahead of time. I know we're going to try to go hiking a couple of days, and that will take me to areas possibly 50-60 miles from my condo. Unfortunately, I can't hit that kind of range - I can only get about 20 miles from the condo before hitting the limit. I'd GLADLY disable my normal queries, and again would just like the ability to get about 2000 caches in one query to save me the hours of setting up the Pocket Queries to get that data.

Just trying to help.....

 

Do you really need EVERY cache in the area? Wouldn't just having the 2/2 and lower small caches work for today and then tomorrow you can load up all the 3/2 micros?

 

Have you tried caches along a route, to narrow down the possibilities to just those along a route? It would work for your hiking trip.

Link to comment

And that's aside from suggestions like the 52 Complete State files, which would almost completely eliminate the PQ load on your SQL servers.

Granted, I've been on vacation for the last two weeks, but I'm sure I would have heard about two additional states being added to the country. The flags I saw on my trip all had 50 stars on them.

One "state" would be the District of Columbia.

 

The District of Columbia is a district, not a state.

 

Hence the use of the quotation marks around the word "state"...signifying that while not a state of the union, it would be included in the list of available defined geographical regions for PQs.

Even counting DC as a "state" would only make 51, not 52 :lol:


Do you really need EVERY cache in the area? Wouldn't having just the 2/2-and-lower small caches work for today, and then tomorrow you can load up all the 3/2 micros?

 

 

I keep hearing this argument made in the forum, but it really doesn't apply to areas of high density.

 

Let's say you're a business traveller headed to San Jose. You don't feel like tackling anything harder than a 1.5/1.5, and you toss everything but traditionals. A 500-cache query gets you only 11 miles from the airport. That's a pretty conservative PQ and a pretty small distance; it gets you just far enough from the A/P not to be deafened by the planes flying overhead.

 

500 caches in rural areas is a crazy big circle. 500 in a cache-dense area isn't. Even with the best tools available to make use of your PQs, 500 in a cache-rich area is frustrating.
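To put numbers on the density argument: treating a PQ as a circle over roughly uniform cache density, the radius a query reaches only grows with the square root of the cache count. A rough sketch using the figures from this post (the density is back-calculated from the 11-mile/500-cache airport example, not an official number):

```python
import math

def pq_radius(num_caches, density_per_sq_mile):
    """Radius (miles) of a circle holding num_caches at a uniform density."""
    return math.sqrt(num_caches / (math.pi * density_per_sq_mile))

# Density implied by the San Jose example: 500 caches inside an 11-mile circle.
density = 500 / (math.pi * 11 ** 2)        # ~1.3 caches per square mile

print(round(pq_radius(500, density), 1))   # 11.0 miles
print(round(pq_radius(2000, density), 1))  # 22.0 - 4x the caches only doubles the reach
```

So even quadrupling the per-query limit would only double the distance covered in an area that dense.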


Let's say you're a business traveller headed to San Jose. You don't feel like tackling anything harder than a 1.5/1.5, and you toss everything but traditionals. A 500-cache query gets you only 11 miles from the airport. That's a pretty conservative PQ and a pretty small distance; it gets you just far enough from the A/P not to be deafened by the planes flying overhead.

 

500 caches in rural areas is a crazy big circle. 500 in a cache-dense area isn't. Even with the best tools available to make use of your PQs, 500 in a cache-rich area is frustrating.

 

I suppose it's frustrating if you really want the feeling that you could have found all 900 or 1,500 caches (or whatever) if you'd had the time. And yes, it's (very slightly) frustrating to get home, log your finds, and discover - via "nearest caches" - that instead of walking 0.4 miles for that last nano you decided you had time for before heading to the airport (and didn't find), you could have walked 0.3 miles in the other direction and found a simpler cache which didn't make it into the PQ.

 

But really, if you're going on a business trip somewhere, you're presumably going principally for the business. You'd like to grab 2 or 3 caches if you have some time, but you aren't going to spend hours of poring-over-screen time and weeks of elapsed time preparing for it, like you would for a DeLorme trip to Oregon. So does it really matter whether you select those 2 or 3 from 500 instead of from 900 or 1,500? If the area is so densely packed, finding 2 or 3 won't be hard anyway. And if you have to wade through 900 or 1,500 instead of 500, how much more time will you spend planning that trip, when the chances are you'll only have time to find 2 or 3 near your hotel or local place of business anyway?

 

The "magic number" of 500 is not, and AFAIK never was, a function of the size of the database ("we'll set the number to allow a Premium Member to download X% of all the caches"); it's a function of how many caches you can reasonably expect to need in order to have a fun day's caching - DNFs, caches archived the day before you set out, and all. That number does not change significantly with the size of the database.


I'm wondering WHY it is that Groundspeak does not allow more than 500 caches per PQ. I'm confident that the servers have been upgraded over the years and could probably handle the workload.

 

As the number of geocaches in each province increases, the need for larger PQs increases proportionally. Shouldn't Groundspeak increase the number of geocaches allowed in each PQ to reflect that growth? This is a service offered to their customers and should be reviewed at least once a year.

 

Questions? Comments?

I would rather see them change the maximum mileage for a PQ; as it is, if I take a 1,500-mile trip I have to break it into three parts.

Edited by DWBur

And that's aside from suggestions like the 52 Complete State files, which would almost completely eliminate the PQ load on your SQL servers.

Granted, I've been on vacation for the last two weeks, but I'm sure I would have heard about two additional states being added to the country. The flags I saw on my trip all had 50 stars on them.

One "state" would be the District of Columbia.

 

The District of Columbia is a district, not a state.

 

Hence the use of the quotation marks around the word "state"...signifying that while not a state of the union, it would be included in the list of available defined geographical regions for PQs.

Even counting DC as a "state" would only make 51, not 52 :)

Don't forget Canada. :laughing:

Do you really need EVERY cache in the area? Wouldn't having just the 2/2-and-lower small caches work for today, and then tomorrow you can load up all the 3/2 micros?
I keep hearing this argument made in the forum, but it really doesn't apply to areas of high density.

 

Let's say you're a business traveller headed to San Jose. You don't feel like tackling anything harder than a 1.5/1.5, and you toss everything but traditionals. A 500-cache query gets you only 11 miles from the airport. That's a pretty conservative PQ and a pretty small distance; it gets you just far enough from the A/P not to be deafened by the planes flying overhead.

 

500 caches in rural areas is a crazy big circle. 500 in a cache-dense area isn't. Even with the best tools available to make use of your PQs, 500 in a cache-rich area is frustrating.

I have the same issues when I go to the LA area. However, if I limit my PQs to just those kinds of caches that I am realistically going to visit, I can include all of the caches that I am interested in, from Malibu to Dana Point, in eight PQs. That gives me an area spanning 60 cache-rich miles in less than two days' worth of PQs.

I have not read all the replies, but unless you have a GPS that can hold more than 500 or 1,000 caches, a PQ with over 1,000 caches is not going to do you a lot of good. Sure, you can use GSAK to load them into a PPC/PDA as an HTML file, but you will have to enter them into the GPS by hand.

 

The Magellan Meridians and the Magellan eXplorist 400, 500 and 600 allow loading several cache files onto the SD card, as well as several maps. The problem with the Magellans, though, is that each file is limited to 200 caches. With my eXplorist 500 and Meridian I have files broken down into the areas that I tend to cache in.

 

I just have lots of files; there have been times when I have had several thousand caches loaded into my Magellan eXplorist 500.

Edited by JohnnyVegas
The "magic number" of 500 is not, and AFAIK never was, a function of the size of the database ("we'll set the number to allow a Premium Member to download X% of all the caches"); it's a function of how many caches you can reasonably expect to need in order to have a fun day's caching - DNFs, caches archived the day before you set out, and all. That number does not change significantly with the size of the database.

You're totally correct - it doesn't change significantly with the size of the database, but it does change significantly with cache density. Living in a cache-dense area, it is frustrating to try to plan a day of caching, let's say, to San Francisco (I live in San Jose). I can't, at the spur of the moment, easily plan out a day of caching because of the number of Pocket Queries I would need to run to pick up the caches between here and San Francisco. Then there's the issue of overlapping data due to the circular nature of Pocket Queries.

 

So, realistically, it's more a function of distance than of the number of caches. I think it would be cool if you could run one PQ a week (similar to how the my finds PQ works) that had a max radius of 100 miles, and an unlimited number of caches. I could probably even live with a max radius of 50 miles, but I'd prefer 100, since that covers all the places that I might go to on any given day (without an over night stay). I personally think this would drastically improve the performance of the PQ servers (I could be wrong though).
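The overlap issue is easy to quantify: covering a straight corridor with circular PQs forces the circle centers closer together than one diameter, so part of every circle is wasted. A back-of-the-envelope sketch (straight-line geometry only; the 11-mile radius comes from the San Jose example earlier in the thread, and the 10-mile corridor width is just an assumption for illustration):

```python
import math

def circles_to_cover(route_miles, radius, corridor_width):
    """Number of radius-`radius` circles needed so every point within
    corridor_width/2 of a straight route falls inside some circle."""
    # A circle of radius r covers a corridor of width w for a length of
    # 2*sqrt(r^2 - (w/2)^2) along the route, so centers must be spaced
    # at most that far apart.
    span = 2 * math.sqrt(radius ** 2 - (corridor_width / 2) ** 2)
    return math.ceil(route_miles / span)

# ~45 road miles San Jose -> San Francisco, 11-mile PQ circles, 10-mile corridor.
print(circles_to_cover(45, 11, 10))  # 3
```

Widen the corridor and the circles must overlap even more - which is exactly the duplicated data problem with stitching together circular queries.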

 

--Marky

I have not read all the replies but unless you have a GPS that can hold more than 500 or 1000 caches

For some time now, many Garmin GPS units have been able to load over 10,000 caches at a time as custom POIs. There are a lot of GPSMap 60 and 76 users out there who could take advantage of this (not to mention Colorado users) if they used the custom-POI method.

This topic is now closed to further replies.