
Increase number of caches in Pocket queries


Teddy Steiff


I cannot see any reason why Groundspeak maintains the limit of max 1,000 caches in each pocket query. In many areas there are so many caches that one needs a series of PQs to keep updated. It is frustrating to have to fine-tune PQs to stay within the 1,000-cache limit. As for resources: it must take more computing power to run five PQs of a little less than 1,000 caches each than to run and send one PQ with 5,000 caches.

 

I would also like to see one new type of PQ: the n most recent caches in a given area, making it easier to update before returning to an area where one has not been for a while. It could also be the n caches with recent activity.

Link to comment

Scroll down:

 

Under "Placed During"

 

Choose the middle option

 

Drop down the box, and choose "the last week" or "the last month", or even "the last year"

 

There's your newest caches.

More accurately, those would be the caches with placed dates that fall within that range. That doesn't necessarily mean they are the newest caches. If a cache is published with an older placed date, it wouldn't be captured by such a filter.
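To make that workaround and its caveat concrete, here is a minimal offline sketch in Python (namespace URI assumed for a typical Pocket Query GPX export) that applies the same "placed during the last N days" idea to a downloaded PQ. It filters on each waypoint's <time> element, which carries the placed date, so a cache published late with an older placed date would still slip through, exactly as noted above.

from datetime import datetime, timedelta, timezone
import xml.etree.ElementTree as ET

GPX_NS = "{http://www.topografix.com/GPX/1/0}"  # assumed namespace of a PQ export

def recently_placed(gpx_path, days=30):
    """Return (GC code, placed date) for caches placed within the last `days` days."""
    cutoff = datetime.now(timezone.utc) - timedelta(days=days)
    hits = []
    for wpt in ET.parse(gpx_path).getroot().iter(f"{GPX_NS}wpt"):
        name = wpt.findtext(f"{GPX_NS}name")    # GC code, e.g. GC12345
        stamp = wpt.findtext(f"{GPX_NS}time")   # placed date, not publish date
        if not name or not stamp:
            continue
        placed = datetime.fromisoformat(stamp.replace("Z", "+00:00"))
        if placed.tzinfo is None:
            placed = placed.replace(tzinfo=timezone.utc)
        if placed >= cutoff:
            hits.append((name, placed.date()))
    return sorted(hits, key=lambda h: h[1], reverse=True)

if __name__ == "__main__":
    for code, placed in recently_placed("my_pocket_query.gpx", days=30):
        print(code, placed)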

Link to comment

Interesting. There need to be limits, of course; otherwise we could all ask for millions of caches and cripple the servers.

 

I do wonder why there is a limit per PQ rather than (just) per day. Currently, you can have 10 PQs of 1000, so 10,000 caches per day. Allowing 1 PQ of 10,000, or 2 PQs of 5000, etc would be more flexible, and avoid some of the work cachers need to do to split queries into multiple PQs. I suspect it is just that this is how it was implemented; changing it to be total number of caches would require development.

 

The API has just a total limit. In one day, I can do a single query of 6,000, or 100 queries of 60 caches etc.

 

I can live with it as it is, but as to why larger numbers can be helpful: Recently I was on vacation in New England, and while there didn't have regular internet access. I knew I'd do some caching with my brother-in-law, and we could end up anywhere in Maine, NH, or Mass. I loaded all the caches in these states onto my GPSr, so that wherever we went, I was ready. It was around 36,000 caches. I split it into multiple PQs over 4 days.

 

When based at home, or when I'm travelling to a specific area, I generally use the API and download the caches just before I go.
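To put rough numbers on that comparison, here is a back-of-the-envelope sketch in Python. The figures (1,000 caches per PQ, 10 PQs per day, 6,000 full cache downloads per day via the API, and the ~36,000-cache New England load) are simply the ones quoted in this thread, not official quotas.

import math

def pq_plan(total_caches, per_pq_cap=1000, pqs_per_day=10):
    """PQs and days needed under the current per-query cap."""
    pqs_needed = math.ceil(total_caches / per_pq_cap)
    days_needed = math.ceil(pqs_needed / pqs_per_day)
    return pqs_needed, days_needed

def api_days(total_caches, daily_full_limit=6000):
    """Days needed when only a daily total applies, however the queries are sliced."""
    return math.ceil(total_caches / daily_full_limit)

if __name__ == "__main__":
    print(pq_plan(36_000))   # (36, 4): 36 PQs spread over 4 days, as described above
    print(api_days(36_000))  # 6 days of full downloads at 6,000/day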

Link to comment

Interesting. There need to be limits, of course; otherwise we could all ask for millions of caches and cripple the servers.

 

I do wonder why there is a limit per PQ rather than (just) per day. Currently, you can have 10 PQs of 1000, so 10,000 caches per day. Allowing 1 PQ of 10,000, or 2 PQs of 5000, etc would be more flexible, and avoid some of the work cachers need to do to split queries into multiple PQs. I suspect it is just that this is how it was implemented; changing it to be total number of caches would require development.

 

The API has just a total limit. In one day, I can do a single query of 6,000, or 100 queries of 60 caches etc.

 

I can live with it as it is, but as to why larger numbers can be helpful: Recently I was on vacation in New England, and while there didn't have regular internet access. I knew I'd do some caching with my brother-in-law, and we could end up anywhere in Maine, NH, or Mass. I loaded all the caches in these states onto my GPSr, so that wherever we went, I was ready. It was around 36,000 caches. I split it into multiple PQs over 4 days.

 

When based at home, or when I'm travelling to a specific area, I generally use the API and download the caches just before I go.

 

I don't know if this is why, but I would have a limit just because so many devices using GPX files can only handle 2,000 or 5,000 caches.

Link to comment

 

I do wonder why there is a limit per PQ rather than (just) per day. Currently, you can have 10 PQs of 1000, so 10,000 caches per day. Allowing 1 PQ of 10,000, or 2 PQs of 5000, etc would be more flexible, and avoid some of the work cachers need to do to split queries into multiple PQs. I suspect it is just that this is how it was implemented; changing it to be total number of caches would require development.

 

 

I totally agree with this. I wish GS would decide how many caches they want us to be able to download in a day, and then let us decide how we want to slice it.

Link to comment

I don't know if this is why, but I would have a limit just because so many devices using GPX files can only handle 2,000 or 5,000 caches.

 

I load caches as GGZ files. No (real) limit ... I've had 30000 caches loaded as just 1 file.

PQs are not limited to just putting them on a GPS, many load them in software like GSAK where limits are not a factor.
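For receivers that do hit a per-file ceiling, an oversized GPX can also be split client-side. Here is a minimal sketch in Python using only the standard library; the namespace URIs are assumed from a typical PQ export, and element prefixes may be rewritten on output.

import xml.etree.ElementTree as ET

GPX_NS = "http://www.topografix.com/GPX/1/0"        # assumed PQ namespace
GS_NS = "http://www.groundspeak.com/cache/1/0/1"    # assumed extension namespace

def split_gpx(path, chunk_size=2000, prefix="chunk"):
    """Write the waypoints of one large GPX into files of at most chunk_size caches."""
    ET.register_namespace("", GPX_NS)          # keep GPX as the default namespace
    ET.register_namespace("groundspeak", GS_NS)
    root = ET.parse(path).getroot()
    waypoints = root.findall(f"{{{GPX_NS}}}wpt")
    for wpt in waypoints:                      # leave only the header in the template
        root.remove(wpt)
    header = ET.tostring(root)
    for i in range(0, len(waypoints), chunk_size):
        part_root = ET.fromstring(header)      # fresh copy of the header per chunk
        part_root.extend(waypoints[i:i + chunk_size])
        ET.ElementTree(part_root).write(f"{prefix}_{i // chunk_size + 1}.gpx",
                                        encoding="utf-8", xml_declaration=True)

if __name__ == "__main__":
    split_gpx("my_state_query.gpx", chunk_size=2000)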

Link to comment

I don't know if this is why, but I would have a limit just because so many devices using GPX files can only handle 2,000 or 5,000 caches.

 

I load caches as GGZ files. No (real) limit ... I've had 30000 caches loaded as just 1 file.

PQs are not limited to just putting them on a GPS, many load them in software like GSAK where limits are not a factor.

 

That's nice, but I think Geocaching.com would be inundated with whining if GPX files exceeded the max size that works on common GPSrs.

Link to comment

That's nice, but I think Geocaching.com would be inundated with whining if GPX files exceeded the max size that works on common GPSrs.

 

No GPS should choke on a file that contains more entries than it can handle. It should gracefully detect its limit, and if not, they should fire the developer.

 

If we look at folks that really want a large number of caches, they figure out how to get it done, leveraging solutions such as GSAK and multiple PQs. If GC.com increases the limit per PQ, I'd expect the masses to change their "home" PQ to the max limit, increasing the load on GC.com's infrastructure. Would those that pull down 1000 of the closest unfound caches change their PQ to pull down 10,000 even though 1000 could already be more than they needed? Just a thought.

Link to comment

That's nice, but I think Geocaching.com would be inundated with whining if GPX files exceeded the max size that works on common GPSrs.

 

No GPS should choke on a file that contains more entries than it can handle. It should gracefully detect its limit, and if not, they should fire the developer.

 

If we look at folks that really want a large number of caches, they figure out how to get it done, leveraging solutions such as GSAK and multiple PQs. If GC.com increases the limit per PQ, I'd expect the masses to change their "home" PQ to the max limit, increasing the load on GC.com's infrastructure. Would those that pull down 1000 of the closest unfound caches change their PQ to pull down 10,000 even though 1000 could already be more than they needed? Just a thought.

 

Yes, that's nice, but Geocaching.com doesn't make the devices; they just supply files that people like to put in them. From my view in the cheap seats it's obviously easier to provide people with limited files that work within the factual limitations of the devices that are common.

 

The current limits are quite adequate for the vast majority of geocachers. Most of us don't need to run an inferior geocaching data warehouse on our home systems for no apparent reason.

Link to comment

That's nice, but I think Geocaching.com would be inundated with whining if GPX files exceeded the max size that works on common GPSrs.

 

They don't exceed it by default. There's still the option to set a maximum number of caches per PQ, and even now the default on a new PQ is 500 caches, which the user can change to anything up to 1000. Should the max go up to 10000, users could still set it to 1000, 2000 or whatever their GPS will accept.

It's no use always doing things for the lowest common denominator just because it has to be "easy" for all (see the same views on challenges).

Link to comment

The current limits are quite adequate for the vast majority of geocachers. Most of us don't need to run an inferior geocaching data warehouse on our home systems for no apparent reason.

 

<OT>

I guess you don't use GSAK. It's anything but inferior. You might not need it or see any use for it but it wins hands down in organizing our cache days, "todo" list management, mystery solving and logging caches/trackables via API over the website any day.

</OT>

Link to comment

The current limits are quite adequate for the vast majority of geocachers. Most of us don't need to run an inferior geocaching data warehouse on our home systems for no apparent reason.

 

A couple of comments....

 

1) No one said the "vast majority" would need the feature but the option for those that could benefit from it shouldn't be discounted. Still, my previous concerns need to be considered.

 

2) Your GC profile seems to indicate you use GSAK. If that's accurate, you are accomplishing, in a bit more manual manner, what the folks requesting larger PQs could achieve with less effort. GSAK is your personal [inferior?] data warehouse. Just as the vast majority of users might not need GSAK, it doesn't mean there isn't an advantage of the program to those that leverage it.

 

 

Link to comment

That's nice, but I think Geocaching.com would be inundated with whining if GPX files exceeded the max size that works on common GPSrs.

 

They don't exceed it by default. There's still the option to set a maximum number of caches per PQ, and even now the default on a new PQ is 500 caches, which the user can change to anything up to 1000. Should the max go up to 10000, users could still set it to 1000, 2000 or whatever their GPS will accept.

It's no use always doing things for the lowest common denominator just because it has to be "easy" for all (see the same views on challenges).

 

I can see where you're coming from, but for a company dealing with people who are mostly entering at a very low capability level, it would be a customer service nightmare.

 

I don't know if this is the actual reason. It's just speculation. No need to get excessively angsty about these musings.

Link to comment

The current limits are quite adequate for the vast majority of geocachers. Most of us don't need to run an inferior geocaching data warehouse on our home systems for no apparent reason.

 

A couple of comments....

 

1) No one said the "vast majority" would need the feature but the option for those that could benefit from it shouldn't be discounted. Still, my previous concerns need to be considered.

 

2) Your GC profile seems to indicate you use GSAK. If that's accurate, you are accomplishing, in a bit more manual manner, what the folks requesting larger PQs could achieve with less effort. GSAK is your personal [inferior?] data warehouse. Just as the vast majority of users might not need GSAK, it doesn't mean there isn't an advantage of the program to those that leverage it.

 

Let's be clear that my comments are speculation. I don't work for Geocaching.com so there's no reason to get too fussed about shouting down my conjecture about possible reasons for the limits.

 

I use GSAK on occasion but I don't expect the website to support me downloading every cache in its database so I can have my own out-of-date copy of every cache at home at all times.

Link to comment

The current limits are quite adequate for the vast majority of geocachers. Most of us don't need to run an inferior geocaching data warehouse on our home systems for no apparent reason.

 

<OT>

I guess you don't use GSAK. It's anything but inferior. You might not need it or see any use for it but it wins hands down in organizing our cache days, "todo" list management, mystery solving and logging caches/trackables via API over the website any day.

</OT>

 

What I personally use isn't relevant to my conjecture about why Geocaching.com may not see a need to let everyone download a million geocaches at a time.

Edited by narcissa
Link to comment

Some of the replies seem to be missing that this isn't necessarily about increasing the total number of caches an individual can download, but about changing the approach so they can download up to that same number however they like. Today a paying subscriber can download 10 PQs each containing 1000 caches, or 10,000 caches in total. The request appears to be to allow someone to download up to 10k caches in a single PQ in the course of a day, if that's what makes the most sense for that user. Or maybe 2 PQs each with 5k caches.

Link to comment

I use GSAK on occasion but I don't expect the website to support me downloading every cache in its database so I can have my own out-of-date copy of every cache at home at all times.

 

As Team DEMP wrote, it's not about more caches, but more caches in one PQ. A total of 10,000 caches/day is probably enough for most of us.

As for "out of date", my Belgian database has just under 29000 not found caches and is refreshed weekly (Mon/Tue/Wed) and as I'm not a FTF hunter that's OK for planning a weekend of caching. Once a selection is made a Friday evening refresh of the selected caches before exporting to GPS and copying to GDAK (Android) has never given my "out of date" info.

No one needs "every cache" but having all info on hand without internet connection needed is something I couldn't do without anymore. Especially when away from home.

 

So bottom line, a 10000 cache limit would be better than 10 times 1000.

Link to comment

I use GSAK on occasion but I don't expect the website to support me downloading every cache in its database so I can have my own out-of-date copy of every cache at home at all times.

 

As Team DEMP wrote, it's not about more caches, but more caches in one PQ. A total of 10,000 caches/day is probably enough for most of us.

As for "out of date", my Belgian database has just under 29000 not found caches and is refreshed weekly (Mon/Tue/Wed) and as I'm not a FTF hunter that's OK for planning a weekend of caching. Once a selection is made a Friday evening refresh of the selected caches before exporting to GPS and copying to GDAK (Android) has never given my "out of date" info.

No one needs "every cache" but having all info on hand without internet connection needed is something I couldn't do without anymore. Especially when away from home.

 

So bottom line, a 10000 cache limit would be better than 10 times 1000.

 

To be clear, I have absolutely no opposition to this at all. I am just riffing on a possible reason why the supplier might not want to implement a bigger file size. No need to convince me of anything with these exhaustive explanations of personal geocache file management strategies.

Link to comment

To be clear, I have absolutely no opposition to this at all. I am just riffing on a possible reason why the supplier might not want to implement a bigger file size. No need to convince me of anything with these exhaustive explanations of personal geocache file management strategies.

 

I know that won't work anyway ;)

 

File size may be bigger but the amount of files will be less so it might just even out.

I now run 30 PQs/week, and it could be just 3 for the same amount of data downloaded.

Link to comment

To be clear, I have absolutely no opposition to this at all. I am just riffing on a possible reason why the supplier might not want to implement a bigger file size. No need to convince me of anything with these exhaustive explanations of personal geocache file management strategies.

 

I know that won't work anyway ;)

 

File size may be bigger but the amount of files will be less so it might just even out.

I now run 30 PQs/week, and it could be just 3 for the same amount of data downloaded.

 

The forum's general failure to be convincing is not indicative of an inability or unwillingness to be convinced.

Link to comment

Interesting. There need to be limits, of course; otherwise we could all ask for millions of caches and cripple the servers.

 

I do wonder why there is a limit per PQ rather than (just) per day. Currently, you can have 10 PQs of 1000, so 10,000 caches per day. Allowing 1 PQ of 10,000, or 2 PQs of 5000, etc would be more flexible, and avoid some of the work cachers need to do to split queries into multiple PQs. I suspect it is just that this is how it was implemented; changing it to be total number of caches would require development.

 

The API has just a total limit. In one day, I can do a single query of 6,000, or 100 queries of 60 caches etc.

 

I can live with it as it is, but as to why larger numbers can be helpful: Recently I was on vacation in New England, and while there didn't have regular internet access. I knew I'd do some caching with my brother-in-law, and we could end up anywhere in Maine, NH, or Mass. I loaded all the caches in these states onto my GPSr, so that wherever we went, I was ready. It was around 36,000 caches. I split it into multiple PQs over 4 days.

 

When based at home, or when I'm travelling to a specific area, I generally use the API and download the caches just before I go.

 

I don't know if this is why, but I would have a limit just because so many devices using GPX files can only handle 2,000 or 5,000 caches.

 

2,000 would be best because sometimes caches in a list of 1,500 get filtered out by others because of the 1,000 limit.

Link to comment

2,000 would be best because sometimes caches in a list of 1,500 get filtered out by others because of the 1,000 limit.

Doh. That's why mine are at 900-990 caches. I run 27 PQs to give me everything within 65 miles, plus all of New Jersey.

It wasn't that long ago that PQs were set at 500 caches. 1000 is great!

But, yes, 2000 would be nicer. Then I wouldn't have to redo my PQs as often.

It would also be nice to have a county filter. 65 miles takes me pretty far out onto Long Island. I do not cache in Nassau, Suffolk, Kings, Queens, Richmond or The Bronx Counties any more. I can set the filter on GSAK, but it would be nice on Groundspeak. So I don't download all those counties I will never visit.

Link to comment

2,000 would be best because sometimes caches in a list of 1,500 get filtered out by others because of the 1,000 limit.

Doh. That's why mine are at 900-990 caches. I run 27 PQs to give me everything within 65 miles, plus all of New Jersey.

It wasn't that long ago that PQs were set at 500 caches. 1000 is great!

But, yes, 2000 would be nicer. Then I wouldn't have to redo my PQs as often.

It would also be nice to have a county filter. 65 miles takes me pretty far out onto Long Island. I do not cache in Nassau, Suffolk, Kings, Queens, Richmond or The Bronx Counties any more. I can set the filter on GSAK, but it would be nice on Groundspeak. So I don't download all those counties I will never visit.

 

hmm, 27pq? is there something specific you're looking for?

 

do you cache with a phone, or standalone?

 

just curious, dassa lotta queries

Link to comment

2,000 would be best because sometimes caches in a list of 1,500 get filtered out by others because of the 1,000 limit.

Doh. That's why mine are at 900-990 caches. I run 27 PQs to give me everything within 65 miles, plus all of New Jersey.

It wasn't that long ago that PQs were set at 500 caches. 1000 is great!

But, yes, 2000 would be nicer. Then I wouldn't have to redo my PQs as often.

It would also be nice to have a county filter. 65 miles takes me pretty far out onto Long Island. I do not cache in Nassau, Suffolk, Kings, Queens, Richmond or The Bronx Counties any more. I can set the filter on GSAK, but it would be nice on Groundspeak. So I don't download all those counties I will never visit.

 

hmm, 27pq? is there something specific you're looking for?

 

do you cache with a phone, or standalone?

 

just curious, dassa lotta queries

 

Ah. Thought I had mentioned that. Anything within 65 miles. (Excepting Long Island, and most of NYC.) And all of NJ. I use a Garmin eTrex 30.

Link to comment

2,000 would be best because sometimes caches in a list of 1,500 get filtered out by others because of the 1,000 limit.

Doh. That's why mine are at 900-990 caches. I run 27 PQs to give me everything within 65 miles, plus all of New Jersey.

It wasn't that long ago that PQs were set at 500 caches. 1000 is great!

But, yes, 2000 would be nicer. Then I wouldn't have to redo my PQs as often.

It would also be nice to have a county filter. 65 miles takes me pretty far out onto Long Island. I do not cache in Nassau, Suffolk, Kings, Queens, Richmond or The Bronx Counties any more. I can set the filter on GSAK, but it would be nice on Groundspeak. So I don't download all those counties I will never visit.

 

hmm, 27pq? is there something specific you're looking for?

 

do you cache with a phone, or standalone?

 

just curious, dassa lotta queries

 

Ah. Thought I had mentioned that. Anything within 65 miles. (Excepting Long Island, and most of NYC.) And all of NJ. I use a Garmin eTrex 30.

 

oh OK, it's smorgasbord, not sniper, then?

Edited by ohgood
Link to comment

oh OK, it's smorgasbord, not sniper, then?

The poster you just responded to has ~6k finds. The more finds you have, the farther you have to go from your home location in different directions. Depending on the situation (weather, time available, goal, etc.), a cacher needs options when planning their hunts: a small pocket of hard caches, a large pocket of easier caches, caches of a certain type, age or D/T, going with another cacher and looking for caches you both haven't found, etc. Planning can take more time than the hunt. A large number of caches allows the skilled finder to accomplish their goal.

 

I was an early adopter of GSAK back when it was initially released. I became inactive for a few years and now I'm using Project-GC, as it currently meets my needs. Project-GC has so far minimized *my* need for PQs, based on how the site works. If I started to rely on GSAK again, what others in this thread are asking for would be beneficial.

 

If you have no need, that's fine, but that doesn't minimize the need of others that could benefit from adjusting the existing process.

 

Link to comment

2,000 would be best because sometimes caches in a list of 1,500 get filtered out by others because of the 1,000 limit.

Doh. That's why mine are at 900-990 caches. I run 27 PQs to give me everything within 65 miles, plus all of New Jersey.

It wasn't that long ago that PQs were set at 500 caches. 1000 is great!

But, yes, 2000 would be nicer. Then I wouldn't have to redo my PQs as often.

It would also be nice to have a county filter. 65 miles takes me pretty far out onto Long Island. I do not cache in Nassau, Suffolk, Kings, Queens, Richmond or The Bronx Counties any more. I can set the filter on GSAK, but it would be nice on Groundspeak. So I don't download all those counties I will never visit.

 

hmm, 27pq? is there something specific you're looking for?

 

do you cache with a phone, or standalone?

 

just curious, dassa lotta queries

 

Ah. Thought I had mentioned that. Anything within 65 miles. (Excepting Long Island, and most of NYC.) And all of NJ. I use a Garmin eTrex 30.

 

oh OK, it's smorgasbord, not sniper, then?

It's getting all the caches in the area so you have them on your GPSr.

 

I do the same thing on a smaller scale. I have one PQ for small traditional caches, one for regulars, one for larges, one for solved puzzles, and then one for other caches. It covers the caches I'm interested in finding within 50 miles or so.

 

Obviously this is available via the app, but the GPS reception on my phone is nowhere near as good as on my Garmin, and I don't have a robust data plan, as I'm not interested in using my phone all the time.

Link to comment

oh OK, it's smorgasbord, not sniper, then?

The poster you just responded to has ~6k finds. The more finds you have, the farther you have to go from your home location in different directions. Depending on the situation (weather, time available, goal, etc.), a cacher needs options when planning their hunts: a small pocket of hard caches, a large pocket of easier caches, caches of a certain type, age or D/T, going with another cacher and looking for caches you both haven't found, etc. Planning can take more time than the hunt. A large number of caches allows the skilled finder to accomplish their goal.

 

I was an early adopter of GSAK back when it was initially released. I became inactive for a few years and now I'm using Project-GC, as it currently meets my needs. Project-GC has so far minimized *my* need for PQs, based on how the site works. If I started to rely on GSAK again, what others in this thread are asking for would be beneficial.

 

If you have no need, that's fine, but that doesn't minimize the need of others that could benefit from adjusting the existing process.

 

I'm not really sure why you're answering for the other poster, it kind of sounds like you've assumed i was interested in the numbers of caches found, and whether or not there is a need for different databases either on or offline. had i been interested in those subjects, i would have asked, but thank you for your opinion just the same. :-) i could care less if there are two or two million finds, and less about which type of database is used to store them. :-) if someone uses a truck full of paper, that's cool too. :-)

 

what i was asking about was (smorgasbord reference) gathering lots of caches in an area vs very specific (sniper reference) types of caches. it seemed to me that running more queries in a small area would be for specific types, but here i go learning stuff. i could have been more specific, so the person i asked would understand what i hoped to learn, that's for sure. :-)

Edited by ohgood
Link to comment

oh OK, it's smorgasbord, not sniper, then?

The poster you just responded to has ~6k finds. The more finds you have, the farther you have to go from your home location in different directions. Depending on the situation (weather, time available, goal, etc.), a cacher needs options when planning their hunts: a small pocket of hard caches, a large pocket of easier caches, caches of a certain type, age or D/T, going with another cacher and looking for caches you both haven't found, etc. Planning can take more time than the hunt. A large number of caches allows the skilled finder to accomplish their goal.

 

I was an early adopter of GSAK back when it was initially released. I became inactive for a few years and now I'm using Project-GC, as it currently meets my needs. Project-GC has so far minimized *my* need for PQs, based on how the site works. If I started to rely on GSAK again, what others in this thread are asking for would be beneficial.

 

If you have no need, that's fine, but that doesn't minimize the need of others that could benefit from adjusting the existing process.

 

I'm not really sure why you're answering for the other poster, it kind of sounds like you've assumed i was interested in the numbers of caches found, and whether or not there is a need for different databases either on or offline. had i been interested in those subjects, i would have asked, but thank you for your opinion just the same. :-) i could care less if there are two or two million finds, and less about which type of database is used to store them. :-) if someone uses a truck full of paper, that's cool too. :-)

 

what i was asking about was (smorgasbord reference) gathering lots of caches in an area vs very specific (sniper reference) types of caches. it seemed to me that running more queries in a small area would be for specific types, but here i go learning stuff. i could have been more specific, so the person i asked would understand what i hoped to learn, that's for sure. :-)

I happen to have known Harry Dolphin and his frequent caching partner Andy Bear since they began in 2004. Even if I didn't, anyone with his find count and stats (you can see them on his page), having cached the entire eastern half of the US, would have the same need, which I referenced in my previous post. If he goes caching with Andy Bear he might have a different plan than if he goes alone, and you can't use the web site to figure this out. He might want to filter out caches that have 2 or more recent DNFs/Maints, and you can't use the web site to figure that out either. The list goes on and on, and the options are plentiful.

 

For me today, I had one plan if the weather was good and another for heavy rain, a plan if I needed to go to a friend's wake late today and one if it wouldn't be held today, and a plan if my wife & I were going to Lancaster, PA because there was no wake. These plans were optimized in a way that can't be done just using the GC.com site and require offline tools or an online tool like Project-GC. Like many cachers, I have many plans at any given time: specific areas, specific goals across wide areas, etc.

Edited by Team DEMP
Link to comment

oh OK, it's smorgasbord, not sniper, then?

The poster you just responded to has ~6k finds. The more finds you have, the farther you have to go from your home location in different directions. Depending on the situation (weather, time available, goal, etc.), a cacher needs options when planning their hunts: a small pocket of hard caches, a large pocket of easier caches, caches of a certain type, age or D/T, going with another cacher and looking for caches you both haven't found, etc. Planning can take more time than the hunt. A large number of caches allows the skilled finder to accomplish their goal.

 

I was an early adopter of GSAK back when it was initially released. I became inactive for a few years and now I'm using Project-GC, as it currently meets my needs. Project-GC has so far minimized *my* need for PQs, based on how the site works. If I started to rely on GSAK again, what others in this thread are asking for would be beneficial.

 

If you have no need, that's fine, but that doesn't minimize the need of others that could benefit from adjusting the existing process.

 

I'm not really sure why you're answering for the other poster, it kind of sounds like you've assumed i was interested in the numbers of caches found, and whether or not there is a need for different databases either on or offline. had i been interested in those subjects, i would have asked, but thank you for your opinion just the same. :-) i could care less if there are two or two million finds, and less about which type of database is used to store them. :-) if someone uses a truck full of paper, that's cool too. :-)

 

what i was asking about was (smorgasbord reference) gathering lots of caches in an area vs very specific (sniper reference) types of caches. it seemed to me that running more queries in a small area would be for specific types, but here i go learning stuff. i could have been more specific, so the person i asked would understand what i hoped to learn, that's for sure. :-)

 

....These plans were optimized in a way that can't be done just using the GC.com site and require offline tools or an online tool like Project-GC...

 

this part is interesting to me. could you be more specific in this subject?

Link to comment

 

....These plans were optimized in a way that can't be done just using the GC.com site and require offline tools or an online tool like Project-GC...

 

this part is interesting to me. could you be more specific in this subject?

Look at https://www.geocaching.com/blog/2014/08/jasmer-fizzy-and-365/ and you'll see three typical challenges someone might try to accomplish. There's no way I know of to use the GC.com web site to find the caches needed to fill in your Fizzy Challenge grid. You could run a series of queries, but if you download a bunch of PQs to filter offline, or use Project-GC, you just click a button and it shows you. The same goes for finding caches placed on the specific dates or months/years referenced by the other challenges on that page.

 

I'm going to a new area not near my home location and I'd like to find some of the oldest caches, say 2003 or older.

 

I sometimes look and see if there are caches which haven't been found in 6+ months and consider going after those.

 

I might try and apply any of the above to a route I might be driving.

 

Those examples, and many more, are either not possible or extremely inefficient to attempt using the GC.com website search functions.
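As one concrete illustration of the Fizzy case, here is a minimal sketch in Python that scans downloaded PQ GPX files and reports which of the 81 D/T combinations they contain. The namespace URI is assumed from a typical PQ export; Project-GC and GSAK do this properly against your finds, while this just inventories whatever files you feed it.

import xml.etree.ElementTree as ET

GS = "{http://www.groundspeak.com/cache/1/0/1}"   # assumed extension namespace
STEPS = [1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0, 4.5, 5.0]

def dt_combinations(gpx_paths):
    """Collect the (difficulty, terrain) pairs present in the given GPX files."""
    seen = set()
    for path in gpx_paths:
        for cache in ET.parse(path).getroot().iter(f"{GS}cache"):
            d = cache.findtext(f"{GS}difficulty")
            t = cache.findtext(f"{GS}terrain")
            if d and t:
                seen.add((float(d), float(t)))
    return seen

if __name__ == "__main__":
    have = dt_combinations(["my_query_1.gpx", "my_query_2.gpx"])
    missing = [(d, t) for d in STEPS for t in STEPS if (d, t) not in have]
    print(f"{81 - len(missing)}/81 D/T combinations covered; missing: {missing}")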

 

 

 

Link to comment

2,000 would be best because sometimes caches in a list of 1,500 get filtered out by others because of the 1,000 limit.

Doh. That's why mine are at 900-990 caches. I run 27 PQs to give me everything within 65 miles, plus all of New Jersey.

It wasn't that long ago that PQs were set at 500 caches. 1000 is great!

But, yes, 2000 would be nicer. Then I wouldn't have to redo my PQs as often.

It would also be nice to have a county filter. 65 miles takes me pretty far out onto Long Island. I do not cache in Nassau, Suffolk, Kings, Queens, Richmond or The Bronx Counties any more. I can set the filter on GSAK, but it would be nice on Groundspeak. So I don't download all those counties I will never visit.

 

hmm, 27pq? is there something specific you're looking for?

 

do you cache with a phone, or standalone?

 

just curious, dassa lotta queries

 

Ah. Thought I had mentioned that. Anything within 65 miles. (Excepting Long Island, and most of NYC.) And all of NJ. I use a Garmin eTrex 30.

 

oh OK, its smorgasbord, not sniper, then?

 

I guess that I did not understand your question. Smorgasbord or Sniper? Not my standard English.

I download my PQs during the week. Import into GSAK. Then decide where I might go geocaching over the weekend. Remove most caches with the last two DNFs, or three of the last four. Any caches that are inactive. Use the county filter to eliminate caches in counties I am not going to visit.

This weekend, that is 3600 caches. All loaded into the GPSr and nüvi. I don't need to pick individual caches to load. Wherever I choose to go, all the caches that interest me are loaded.

If I decide to go to Delaware, or odd spots in Pennsylvania, I can upload those GPX files as well. (I run those on Thursday.) I need a cache in Pennsylvania DeLorme Page 93, but that will probably be an overnighter; I have a GPX file set up for that, to upload when needed. Lots of GPX files set up, for use when needed: Saybrook, CT. Maine State Star. Erie County, PA. (Need to change that to Montgomery County, PA.)

With 5974 finds, who knows where I will go geocaching this weekend? But my GPSr and nüvi are set!
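For anyone without GSAK who wants to approximate that weekly clean-up offline, here is a minimal sketch in Python. The namespace URIs, attribute names and log-type strings are assumptions based on a typical PQ export, and a PQ only carries the last few logs, so the DNF check is an approximation of the full history rather than GSAK's exact filter.

import xml.etree.ElementTree as ET

GPX = "{http://www.topografix.com/GPX/1/0}"        # assumed PQ namespace
GS = "{http://www.groundspeak.com/cache/1/0/1}"    # assumed extension namespace

def keep_cache(wpt):
    """Drop disabled/archived caches and those whose two most recent logs are DNFs."""
    cache = wpt.find(f"{GS}cache")
    if cache is None:
        return True                                # additional waypoints: leave alone
    if cache.get("available", "True") != "True" or cache.get("archived", "False") == "True":
        return False
    logs = cache.findall(f"{GS}logs/{GS}log")
    logs.sort(key=lambda lg: lg.findtext(f"{GS}date", ""), reverse=True)  # newest first
    last_two = [lg.findtext(f"{GS}type", "") for lg in logs[:2]]
    return not (len(last_two) == 2 and all(t == "Didn't find it" for t in last_two))

def filter_gpx(path):
    tree = ET.parse(path)
    root = tree.getroot()
    for wpt in root.findall(f"{GPX}wpt"):
        if not keep_cache(wpt):
            root.remove(wpt)
    return tree

if __name__ == "__main__":
    ET.register_namespace("", "http://www.topografix.com/GPX/1/0")
    ET.register_namespace("groundspeak", "http://www.groundspeak.com/cache/1/0/1")
    filter_gpx("weekend_area.gpx").write("weekend_filtered.gpx",
                                         encoding="utf-8", xml_declaration=True)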

Link to comment

Semi-OT:

 

If you want all the caches in an area, don't run distance-based queries around a center. Set up a series of PQs for everything in a selected region (state/province), split by placed-date ranges, down to the day if needed. Once you have all the caches in an old-date PQ, you'll never need to run that PQ again; just set the ranges up from the oldest dates through a range that includes today, watch each cache count to keep it under 1000 so you don't miss any, and download them all.

 

Occasionally a cache might get published with an old placed date, so it's good to re-run perhaps the last few recent PQs to fill in the holes, but otherwise you only need to regularly run the 'current' PQ to keep your list updated.

 

The API supports FAR more cache downloads than the PQ, so once you have the caches just update them via API close to where you're caching. I tend to just do a status update on as many as possible, then grab the last few logs for any I'm looking for.

 

For example, here's my list of PQs for all of Ontario. I now have a complete database of all caches in Ontario both in GSAK and in Geosphere. So it's very easy to search, check, update, do challenges, cache on the road by updating nearby caches, etc. Occasional status checks will weed out new archivals, and running the 6 or so recent PQs keeps the active list up to date. No worries about PQ limits (other than keeping each around the 950 mark, give or take)
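Here is a small sketch of the date-splitting step in Python: given the placed dates of the caches already in an offline database (from GSAK, or pulled out of existing GPX files), it proposes "placed between" windows that each stay under a chosen cap. The input format and the 950 safety margin are assumptions for illustration.

from datetime import date

def date_ranges(placed_dates, per_pq=950):
    """Return (first_day, last_day) placed-date windows of at most per_pq caches each."""
    ds = sorted(placed_dates)
    ranges, start = [], 0
    while start < len(ds):
        end = min(start + per_pq, len(ds))
        # "Placed between" filters are whole days, so back the cut up to a day
        # boundary (if one single day alone exceeds the cap, adjacent windows
        # will overlap on that day and need a manual tweak).
        while end < len(ds) and end > start + 1 and ds[end] == ds[end - 1]:
            end -= 1
        ranges.append((ds[start], ds[end - 1]))
        start = end
    return ranges

if __name__ == "__main__":
    # Toy data: 3,000 caches spread over the years; real dates would come from
    # your regional database.
    sample = [date(2001 + i % 16, 1 + i % 12, 1 + i % 28) for i in range(3000)]
    for first, last in date_ranges(sample):
        print(f"Placed between {first} and {last}")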

 

[Attached image: pq-ontario-set.png, a screenshot of the Ontario placed-date PQ set described above.]

Link to comment

2,000 would be best because sometimes caches in a list of 1,500 get filtered out by others because of the 1,000 limit.

Doh. That's why mine are at 900-990 caches. I run 27 PQs to give me everything within 65 miles, plus all of New Jersey.

It wasn't that long ago that PQs were set at 500 caches. 1000 is great!

But, yes, 2000 would be nicer. Then I wouldn't have to redo my PQs as often.

It would also be nice to have a county filter. 65 miles takes me pretty far out onto Long Island. I do not cache in Nassau, Suffolk, Kings, Queens, Richmond or The Bronx Counties any more. I can set the filter on GSAK, but it would be nice on Groundspeak. So I don't download all those counties I will never visit.

 

hmm, 27pq? is there something specific you're looking for?

 

do you cache with a phone, or standalone?

 

just curious, dassa lotta queries

 

Ah. Thought I had mentioned that. Anything within 65 miles. (Excepting Long Island, and most of NYC.) And all of NJ. I use a Garmin eTrex 30.

 

oh OK, it's smorgasbord, not sniper, then?

 

I guess that I did not understand your question. Smorgasbord or Sniper? Not my standard English.

I download my PQs during the week. Import into GSAK. Then decide where I might go geocaching over the weekend. Remove most caches with the last two DNFs, or three of the last four. Any caches that are inactive. Use the county filter to eliminate caches in counties I am not going to visit.

This weekend, that is 3600 caches. All loaded into the GPSr and nüvi. I don't need to pick individual caches to load. Wherever I choose to go, all the caches that interest me are loaded.

If I decide to go to Delaware, or odd spots in Pennsylvania, I can upload those GPX files as well. (I run those on Thursday.) I need a cache in Pennsylvania DeLorme Page 93, but that will probably be an overnighter; I have a GPX file set up for that, to upload when needed. Lots of GPX files set up, for use when needed: Saybrook, CT. Maine State Star. Erie County, PA. (Need to change that to Montgomery County, PA.)

With 5974 finds, who knows where I will go geocaching this weekend? But my GPSr and nüvi are set!

 

OK, that explains a lot. i think my ignorance of standalone processes and limitations is showing. i usually do the downloading and filtering/post-processing on my phone, so a lot of this is foreign to me when it comes to why pocket queries are such a big deal. again, ignorance. thanks for the explanation

 

are you using two devices to get to the caches, or just using planned redundancies?

Edited by ohgood
Link to comment

GS should align the limits across BM lists, PQs and the API. I.e. when doing a search we can currently only map and bookmark 1000 caches because of the bookmark list's limit. That should be enhanced first; then update the PQ limit. As most Garmins can hold at least 5000 geocaches in a GPX file, 5000 might be a suitable number for a new limit on PQs, BM lists and search result mapping. This enhanced limit would also give us a way to search and load a unit in one go.

 

Hans

Link to comment
On 4.5.2017 at 9:04 AM, HHL said:

GS should align the limits across BM lists, PQs and the API. I.e. when doing a search we can currently only map and bookmark 1000 caches because of the bookmark list's limit. That should be enhanced first; then update the PQ limit. As most Garmins can hold at least 5000 geocaches in a GPX file, 5000 might be a suitable number for a new limit on PQs, BM lists and search result mapping. This enhanced limit would also give us a way to search and load a unit in one go.

 

Hans

Bump

Link to comment

I'd rather have the mile limit expanded. For example, I like to find virtual caches, as so many of them are very interesting. One PQ for the entire country would be ideal, rather than trying to set up multiple PQs that clearly overlap. As for the number of caches? I guess it would have to be more than 1000 for my purpose. Or I could just travel more and reduce the total.

Link to comment
4 minutes ago, HHL said:

There is no distance limit when you select a country or a state.

Except that there are some countries (like USA) that you cannot select; you must select by state(s).  To get the entire country, you'd have to (AFAIK) select all 50.

I don't think there's a way to increase the number of caches in a PQ beyond 1000, so multiple PQs by (smaller) groups of states would be necessary anyway.

Link to comment
On 10/1/2017 at 1:09 PM, RufusClupea said:

Except that there are some countries (like USA) that you cannot select; you must select by state(s).  To get the entire country, you'd have to (AFAIK) select all 50.

I don't think there's a way to increase the number of caches in a PQ beyond 1000, so multiple PQs by (smaller) groups of states would be necessary anyway.

Yeah--I go by zip code, and it takes about 8 to cover the entire US.  Granted, there's overlap, but downloading the PQs takes care of that.

Link to comment
