
It's time to increase the number of caches in each Pocket Query


Teddy Steiff


I'm actually kind of surprised the following suggestion hasn't been made yet here.....

 

For dense cache areas, set up PQs to query by Placed Date. Run each one only once, starting from May 2000, covering as long a period as necessary to max out at just under 1000 caches, then start the next PQ on the following day. Continue until you reach the PQ that includes today (for which I just leave the end date as Dec 31). Eventually (after the PQs have run over a few days, due to the daily run limit) you'll have every single active cache in the region of your choice.
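Scripted, the date splitting amounts to a greedy scan over Placed Dates. A minimal sketch, assuming a hypothetical `count_caches(start, end)` helper that returns how many caches in your search area were placed in that inclusive date range:

```python
from datetime import date, timedelta

MAX_PER_PQ = 1000  # current Pocket Query result cap

def split_date_ranges(count_caches, start=date(2000, 5, 1), end=None):
    """Greedily extend each Placed Date range one day at a time; when the
    next day would push the count over the PQ cap, close the range and
    start a new one. `count_caches(a, b)` is a hypothetical helper."""
    end = end or date.today()
    ranges, lo, day = [], start, start
    while day < end:
        nxt = day + timedelta(days=1)
        if count_caches(lo, nxt) > MAX_PER_PQ:
            ranges.append((lo, day))  # closed just under 1000 caches
            lo = nxt
        day = nxt
    ranges.append((lo, end))  # the final range is the one covering today
    return ranges
```

Only the last range ever needs to be re-run for new publishes; the older ranges stay valid until a cache with an old Placed Date is published into one of them.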

 

 

That's almost what I do. (258,768 caches in the GPS as of today) I try to keep my GPS full because I travel on my bike a lot, and if I get tired I'll look for a cache. I don't make plans on where I go. I just go. I try to avoid big cities; they're too dangerous on a motorcycle in many ways. But I do have them in the GPS. It can take several pocket queries to cover a city like Denver or Fort Collins.

 

So I'm on a download cycle. I have 413 pocket queries (now including 16 state queries that I created a few days ago) that I just download every day from A to Z. It takes 42 days to get them all. They are pre-selected to download even if I'm not home. I've set up the API with the 16 states and I'll start downloading that when I can. Not necessarily every day, but before I upload them to the GPS I'll run the Status Check and Get Recent Logs. It's not as time-consuming as you might think. The longest part is selecting the pocket queries. I just do them, like I say, from A to Z.
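For what it's worth, the 42-day cycle falls straight out of the daily PQ cap (10 per day, per the limit quoted later in this thread):

```python
import math

total_pqs = 413  # pocket queries in the A-to-Z rotation
daily_cap = 10   # PQs that can run per day

print(math.ceil(total_pqs / daily_cap))  # -> 42 days per full cycle
```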

 

I recently tried a macro called CacheSeries to help with the power cache nightmare, but it's not that great. It looks for caches with the same type of name to help find power caches so they can be deleted. Then you have to look at each group on a map to see whether it's a chain and add them to your ignore list.

 

Over the last year or two I've watched many of my query circles shrink and shrink because of power caches. So I've had to add more pocket queries to cover the empty areas caused by the power caches. The best thing to do would be to add power caches to the pocket query setup so we can choose not to download them. Instead, you have to find them on a map and add them to your ignore list. But if the download limit for the state pocket queries were increased, it could make the power cache issue less devastating than it currently is.

Edited by Jake81499

Circles and radii are why I opted to go with the date range method; no need to repeatedly download old date ranges unless you think there might be a new publish with a PlacedDate in that range. To keep current, all you need is the most recent date range (covering today), and status updates for nearby caches. That's my reasoning anyway.

I hated running those 'give me everything within 500mi' PQ circles that often overlapped unnecessarily and were dramatically affected by high-density trails and forests.

Now it's basically just 'give me everything published in the last 2 months within Ontario', with the occasional 'update status of caches nearby xyz location'. With some minor adjustments occasionally, no newly published cache is missed :)


Circles and radii are why I opted to go with the date range method; no need to repeatedly download old date ranges unless you think there might be a new publish with a PlacedDate in that range. To keep current, all you need is the most recent date range (covering today), and status updates for nearby caches. That's my reasoning anyway.

I hated running those 'give me everything within 500mi' PQ circles that often overlapped unnecessarily and were dramatically affected by high-density trails and forests.

Now it's basically just 'give me everything published in the last 2 months within Ontario', with the occasional 'update status of caches nearby xyz location'. With some minor adjustments occasionally, no newly published cache is missed :)

 

I always run the 500 mile circle with 1000 caches. In an area like Denver DIA you might get a 5 mile circle if you're lucky. I tried the date range too.

 

I can sum up the problem, and probably piss off somebody somewhere. The whole system is designed for GPSs that only hold a couple thousand caches. It's not designed for the new GPS systems that can hold a few hundred thousand. The pocket query system needs to be updated to accommodate new GPSs and the habits of the people who use them. I would imagine bandwidth is an issue too, but internet speeds have gotten much better over the years.


I'm actually kind of surprised the following suggestion hasn't been made yet here.....

 

For dense cache areas, set up PQs to query by Placed Date.

I thought that had been mentioned. Maybe I imagined it. <_<

 

Yes, this is the way I do it, too, though as I do like to keep my database up-to-date, the family of (currently) nine PQs is run and imported roughly weekly.

 

It's still a pain to set up, and -- if you do want to run it regularly -- maintain the date cutoffs over time. One or two larger PQs would be much more convenient.

 

I would also expect running a single, larger, less-constrained PQ to consume fewer resources server-side than ten smaller PQs, all with mostly-the-same-plus-a-couple constraints. But as we've come to understand, Groundspeak's database apparently does not conform to normal expectations...


Circles and radii are why I opted to go with the date range method; no need to repeatedly download old date ranges unless you think there might be a new publish with a PlacedDate in that range. To keep current, all you need is the most recent date range (covering today), and status updates for nearby caches. That's my reasoning anyway.

I hated running those 'give me everything within 500mi' PQ circles that often overlapped unnecessarily and were dramatically affected by high-density trails and forests.

Now it's basically just 'give me everything published in the last 2 months within Ontario', with the occasional 'update status of caches nearby xyz location'. With some minor adjustments occasionally, no newly published cache is missed :)

 

I always run the 500 mile circle with 1000 caches. In an area like Denver DIA you might get a 5 mile circle if you're lucky. I tried the date range too.

 

I can sum up the problem, and probably piss off somebody somewhere. The whole system is designed for GPSs that only hold a couple thousand caches. It's not designed for the new GPS systems that can hold a few hundred thousand. The pocket query system needs to be updated to accommodate new GPSs and the habits of the people who use them. I would imagine bandwidth is an issue too, but internet speeds have gotten much better over the years.

 

I don't believe that it was designed based on the technology available to save results. I think it was designed so that the results would provide a list of caches that one might actually go out and find before the next time they ran a PQ. While it might be nice to be able to store a few hundred thousand cache listings on your GPS, with 5 PQs a day, 7 days a week, that's way more caches than anyone is going to be able to find in a week.

 

 


Circles and radii are why I opted to go with the date range method; no need to repeatedly download old date ranges unless you think there might be a new publish with a PlacedDate in that range. To keep current, all you need is the most recent date range (covering today), and status updates for nearby caches. That's my reasoning anyway.

I hated running those 'give me everything within 500mi' PQ circles that often overlapped unnecessarily and dramatically affected by high-density trails and forests.

Now it's basically just 'give me everything published in the last 2 months within Ontario', with the occasional 'update status of caches nearby xyz location'. With some minor adjustments occasionally, no newly published cache is missed :)

 

I always run the 500 mile circle with 1000 caches. In an area like Denver DIA you might get a 5 mile circle if your lucky. I tried the date range too.

 

I can sum up the problem, and probably piss off somebody somewhere. The whole system is designed for GPSs that only hold a couple thousand caches. It's not designed for the new GPS systems that can hold a few hundred thousand. The pocket query system needs to be updated to accommodate new GPSs and the habits of the people who use them. I would imagine bandwidth is an issue too, but internet speeds have gotten much better over the years.

 

I don't believe that it was designed based on the technology available to save results. I think it was designed so that the results would provide a list of caches that one might actually go out and find before the next time they ran a PQ. While it might be nice to be able to store a few hundred thousand cache listings on your GPS, with 5 PQs a day, 7 days a week, that's way more caches than anyone is going to be able to find in a week.

 

True. That's why the whole system needs updating. It needs to allow us to download more caches at a time. Update the limits, including the API limits. And the API timing needs to be fixed: instead of resetting 24 hours from when you last ran it, it should be midnight to midnight. Power caches need to be categorized as such. Like you say, the system wasn't designed for today's GPSs. I would agree that you can't satisfy all the people all the time, but just those simple changes would sure help. It's become risky to even run a route any more because of power caches, so updating the numbers would help with that too.

Edited by Jake81499

I don't believe that it was designed based on the technology available to save results. I think it was designed so that the results would provide a list of caches that one might actually go out and find before the next time they ran a PQ. While it might be nice to be able to store a few hundred thousand cache listings on your GPS, with 5 PQs a day, 7 days a week, that's way more caches than anyone is going to be able to find in a week.

 

I've only been at this a few years longer than you have, but let's review.

 

I remember a 500 cache per PQ limit, with a maximum of 5 PQ's per day = 2,500 total caches per day.

Then there was the increase after the fire at the building where the servers were housed.

Was that the increase to 1000 caches per PQ? If so, that was then 5,000 total caches per day.

The current daily PQ limit is not 5 as you suggested, but 10 PQ's per day.

10 PQ's per day with a 1000 cache per PQ limit = 10,000 total caches per day.
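Restating that arithmetic as a quick check, using the numbers from this post:

```python
# (per-PQ cache limit, PQs per day) for each era described above
eras = {
    "original":  (500, 5),    # 500 caches/PQ, 5 PQs/day
    "post-fire": (1000, 5),   # cap raised to 1000/PQ
    "current":   (1000, 10),  # daily PQ count raised to 10
}
totals = {era: per_pq * per_day for era, (per_pq, per_day) in eras.items()}
print(totals)  # {'original': 2500, 'post-fire': 5000, 'current': 10000}
```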

 

So while Groundspeak could make the process of acquiring 10,000 caches in one area every day easier, that is the one thing they have never done. Why would anyone think there is any real chance it ever will happen? That data (created by us) has value, and Groundspeak has always been careful to not let that data flow too freely. And they don't NEED to do anything. What are we going to do? Take our business to their viable competitor? What is the name of that competitor again?


And the API timing needs to be fixed: instead of resetting 24 hours from when you last ran it, it should be midnight to midnight.

 

And which "midnight" would that be? Midnight PST is 9 in the morning here, making it inconvenient to run just before and again just after 9. A 24-hour window means server load is spread out and not everyone in a timezone will hit "fetch" at xx:01.


And the API timing needs to be fixed: instead of resetting 24 hours from when you last ran it, it should be midnight to midnight.

 

And which "midnight" would that be? Midnight PST is 9 in the morning here, making it inconvenient to run just before and again just after 9. A 24-hour window means server load is spread out and not everyone in a timezone will hit "fetch" at xx:01.

 

The pocket query download is set to midnight somewhere. I think it's set to Pacific time, but I could be wrong. Set it to that same midnight. Few people would stay up and hit the fetch button exactly at midnight, and it's terribly inconvenient the way it is now. If you want to run the API every day, you practically have to write down the time you did it last. Like many others, I get up in the morning, download the pocket queries, and run the API if I can. Normally I can't run the API until long after I download the queries because the time is so screwed up. I just now downloaded my queries, and I wouldn't be able to do the API until almost 4pm. That was the second biggest reason I gave up on the API.


And they don't NEED to do anything.

 

I agree. I'm not holding my breath for any advancements. I see a lot of people with excuses, and most of the excuses are kinda lame. Some of the others in this thread have had some excellent ideas, which I'm sure are falling on deaf ears.

 

The way I personally do my download works perfectly for the 259,070 caches in the GPS. I don't really need a better way or more caches in the PQ or API. But I'm fighting power caches. Adding more caches to the API and PQ would help repair the problem of power caches ruining PQs. Power caches need to be added to the PQ and cache-creation pages so we can ignore them more easily without having to add them to an ignore list manually. I currently have 11,000 power caches in my ignore list, and I'm tempted to start a forum thread pointing out power caches for the ignore lists of other people who are also sick of them.

 

PQ's would have to be raised to at least 5000 because of the way power caches are springing up. It's a disease that's taking the fun out of things.

 

I'm not holding my breath on changes.

Edited by Jake81499

The pocket query download is set to midnight somewhere. I think it's set to Pacific time, but I could be wrong. Set it to that same midnight. Few people would stay up and hit the fetch button exactly at midnight, and it's terribly inconvenient the way it is now. If you want to run the API every day, you practically have to write down the time you did it last. Like many others, I get up in the morning, download the pocket queries, and run the API if I can. Normally I can't run the API until long after I download the queries because the time is so screwed up. I just now downloaded my queries, and I wouldn't be able to do the API until almost 4pm. That was the second biggest reason I gave up on the API.

 

:lol:

 

Making it midnight PST may be convenient for you (getting up in the morning) but very inconvenient for people further away from PST. I would have to wait until 9AM; people further east would have to wait longer. Very convenient if you want to fetch new caches or update older ones before leaving on a cache day, NOT. The 24-hour system guarantees you can schedule API access efficiently wherever you are.


Ok, I have to wait until PST to download my pocket queries. That's pretty darned inconvenient. I can make excuses too. LOL

 

Consider yours a luxury position. Unless you're up before 3 in the morning it's after midnight in Seattle for you.

And you can download PQs any time you want.


Ok, I have to wait until PST to download my pocket queries. That's pretty darned inconvenient. I can make excuses too. LOL

 

Consider yours a luxury position. Unless you're up before 3 in the morning it's after midnight in Seattle for you.

And you can download PQs any time you want.

 

Ok, so what time do you download pocket queries? What's wrong with doing the API at that same time? I do my downloads between 5AM and 10AM Mountain Time every day. I should be able to do the API at that same time every day. As of today I have to wait until almost 8PM, because every time you run them using the current time system it sets back a little. That is, unless you have a bot or something to try to do it at the same time every day. Personally, the API is still useless anyhow. Doing an API pull on 16 states, it's been coming up with only half a dozen or fewer new caches almost every time I've run it.

Edited by Jake81499

Ok, so what time do you download pocket queries? What's wrong with doing the API at that same time? I do my downloads between 5AM and 10AM Mountain Time every day. I should be able to do the API at that same time every day. As of today I have to wait until almost 8PM, because every time you run them using the current time system it sets back a little. That is, unless you have a bot or something to try to do it at the same time every day. Personally, the API is still useless anyhow. Doing an API pull on 16 states, it's been coming up with only half a dozen or fewer new caches almost every time I've run it.

 

PQ's are mostly available between 9:00-9:15 in the morning local time (00:00-00:15 PST), as I made sure they ran as soon as possible after midnight PST the first time. If you want to "reset" the time of your API limits, just skip one day and you're good to go at whatever time again. It's simple really :ph34r:

If I "needed" API access at 7 in the morning but the reset happens at 12 because I keep using it then, skipping one day would reset the limit at 12, and waiting until the next day would let me get the full 6000/10000 limits again at 7.
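The drift complained about earlier can be modelled. This is a sketch under the assumption that the allowance works as a simple rolling window (full limits return 24 hours after the last run), so a user who always fetches as soon as possible, but a few minutes late each time, slips later every day:

```python
from datetime import datetime, timedelta

def earliest_next_run(last_run, delay_minutes=5):
    # Rolling-window assumption (not Groundspeak's documented behaviour):
    # the full allowance is back 24 h after the last run; `delay_minutes`
    # models how late you actually press "fetch" once it's available.
    return last_run + timedelta(hours=24, minutes=delay_minutes)

run = datetime(2016, 5, 1, 7, 0)  # first fetch at 7:00
for _ in range(12):               # twelve daily fetches later...
    run = earliest_next_run(run)
print(run)  # 2016-05-13 08:00 -- the 7:00 fetch has drifted to 8:00
```

Skipping one day, as suggested above, lets the window expire entirely, so the next run can happen at whatever clock time you like.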

 

I have all Belgian caches in a database, plus another with all caches in the Netherlands and one with Luxembourg caches. I run weekly PQs (Mo-We) to keep Belgium up to date and use the API on Friday to get new caches (published in the last 2 weeks) for BE, NL, LX. I make a selection of the caches we want to do during the weekend and refresh them too.

This has never gotten me in trouble with any limits. I don't want or need 250,000 caches "just in case we might go somewhere" anyway.

 

As for the API, there's a way around it, but I don't want to go into that here. Just read the API info carefully and you'll understand what I mean.


Again, as others have asked, WHY? Why do you need >250K caches to be up to date? What use are they? The #1 cacher in the world is only up to about half that number, and it took him 13 years to get there.

 

I'm not going to get into why or why not. Or the API times. They've all been answered over and over. It's all lame; all the excuses are lame.

 

My whole gripe is over power caches. A pocket query with a limit of 1000 caches and a radius of 500 miles becomes seriously limited when a power cache of 1000+ caches mysteriously springs up within the boundaries of that query. Perhaps raising the pocket query limit to 10,000, so that our pocket queries are not completely ruined by a power cache, might be in order. It's obvious that the powers that be are not going to write rules concerning power caches, add a power cache option to the cache creation page, or add one to the pocket query creation page so that we could simply not download them if we don't want them. Instead we have to search them out and add them to an ignore list, or use a macro that doesn't work all that well to help find them so we can add them to the ignore list. Maybe it just might be a whole lot easier to increase the pocket query size. It might satisfy a whole lot of people's needs, and the whole power cache problem might become moot for a while.

 

History shows the next thing we will hear is: 'who sets 1000 caches in a day', 'how do you define a power cache', 'why do you need 250,000 caches', 'I don't cache like that', 'what's wrong with power caches', 'use the API to download caches', 'there's ways to add them to the ignore list', 'the servers won't handle the data', 'my gps only holds 1000 caches'.... On and on....

Edited by Jake81499

I have the opposite problem: lots of PQ's with small cache counts. Probably 50 or so with 50-250 caches in each. I would like more PQ's per day rather than larger PQ's.

 

Depending on the results you want, the API may be a better choice for some of them.


I have the opposite problem: lots of PQ's with small cache counts. Probably 50 or so with 50-250 caches in each. I would like more PQ's per day rather than larger PQ's.

 

I would recommend GSAK's API function Get Geocaches as well. You can save a dozen or more settings, one for every stopover of your coming European tour.

 

Example:

 

[screenshot: a saved Get Geocaches settings dialog in GSAK]

 

Combine those saved Get Geocaches settings within a macro. Done.

#*******************************************
# MacVersion = 1.0
# MacDescription = GetGeocaches Tour 2016
# MacAuthor = HHL
# MacFileName = EuropeIn10Days.gsk
# MacUrl =
#*******************************************
DATABASE Name="TestDB" Action=select

GcGetCaches Settings="EE Reval (3/3) 10 km" Load=Y ShowSummary=No
GcGetCaches Settings="CZ Prague (3/3) 10 km" Load=Y ShowSummary=No

# and so on ...

 

Hans

Edited by HHL

Again, as others have asked, WHY? Why do you need >250K caches to be up to date? What use are they? The #1 cacher in the world is only up to about half that number, and it took him 13 years to get there.

 

For me personally, there are two reasons for wanting a lot of caches in my GPSr.

 

1) I'm at work and when I step out the door at quitting time, I decide to go cache. It is a spur of the moment decision. If I have the caches in my GPSr already, I'm free to go where I want. I don't have to go home, sync my GPSr to my home computer, and then decide where to go cache.

 

2) I'm planning to go on vacation to a faraway land. I don't have a real itinerary, so I don't know where I'll be on any given day. I may or may not have an internet connection. For this, I need to have my bases covered and have a lot of caches loaded into my GPSr.

 

Now, I will admit, I don't need 250,000 caches in either scenario.

 

For my home location, I like to have about 2000 caches loaded. This is perfectly doable with 2 or 3 PQs. But if there were a modest increase to the size of the PQ, I could do this in one download and be done with it. Lazy? Maybe. But it would ensure I have all the geocaches loaded and I wouldn't have to worry about a gap in the overlap area.

 

For vacations, especially in areas with high cache density, I'm looking at dozens of PQs. Within 50 miles of Los Angeles, there are 25,000 caches. Paris has 13,000 caches, London has 38,000, Toronto has 15,000.

 

I wouldn't advocate an increase from 1,000 to 10,000 caches, but even an increase from 1,000 to 2,000 would cut down on querying the database. In an ideal world 12 PQs would pull in 12,000 caches, but realistically, with overlap areas, you are pulling in a lot less. Why query the database for information you already have?

Edited by igator210
[...]

I wouldn't advocate an increase from 1,000 to 10,000 caches, but even an increase from 1,000 to 2,000 would cut down on querying the database. In an ideal world 12 PQs would pull in 12,000 caches, but realistically, with overlap areas, you are pulling in a lot less. Why query the database for information you already have?

 

I would like to have the PQ generator working like an API-aware application. That way we could generate PQs with different content until the daily API balance is down to zero.

E.g.: let's generate a PQ with 8250 caches, another with 875 caches, and some smaller special-purpose ones with approx. 100 caches each. Another user might generate 3 PQs holding approx. 3000 caches each. And so on...

Let's count the content only - not the PQs.

And yes, I want to generate a 5000-cache PQ - load the Garmin - and GO.
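That "count the content, not the PQs" rule could be sketched as a simple budget check. The `plan_pqs` name is hypothetical, and the 10,000/day figure is just today's 10 PQs x 1000 caches:

```python
DAILY_CONTENT_BUDGET = 10_000  # assumed: 10 PQs/day x 1000 caches/PQ

def plan_pqs(requested_sizes, budget=DAILY_CONTENT_BUDGET):
    """Approve PQ requests of any size until the daily cache budget is
    spent, instead of counting the number of queries."""
    approved, remaining = [], budget
    for size in requested_sizes:
        if size <= remaining:
            approved.append(size)
            remaining -= size
    return approved, remaining

# the example above: one big PQ, one medium, a few small special-purpose ones
approved, left = plan_pqs([8250, 875, 100, 100, 100])
print(approved, left)  # all five fit, with 575 caches of budget to spare
```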

 

Hans

[...]

I wouldn't advocate an increase from 1,000 to 10,000 caches, but even an increase from 1,000 to 2,000 would cut down on querying the database. In an ideal world 12 PQs would pull in 12,000 caches, but realistically, with overlap areas, you are pulling in a lot less. Why query the database for information you already have?

 

I would like to have the PQ generator working like an API-aware application. That way we could generate PQs with different content until the daily API balance is down to zero.

E.g.: let's generate a PQ with 8250 caches, another with 875 caches, and some smaller special-purpose ones with approx. 100 caches each. Another user might generate 3 PQs holding approx. 3000 caches each. And so on...

Let's count the content only - not the PQs.

And yes, I want to generate a 5000-cache PQ - load the Garmin - and GO.

 

Hans

 

My suggestion would be to set the PQ limit to the maximum number of caches allowed in a GPS's GPX file, as advertised by the manufacturer of the GPS that holds the most. I believe that number is 5000. Look at all the GPS manufacturers and their brands. This does NOT mean GGZ files. A GGZ file allows 250,000+ caches, and GSAK requires a macro (GarminExport.gsk) to send them to the GPS. So look at all the GPSs, find out how many caches each holds in a GPX file, pick the largest number, and use that as the PQ's upper limit.

 

I like the example you gave by the way. Thank you.

Edited by Jake81499

Again, as others have asked, WHY? Why do you need >250K caches to be up to date? What use are they? The #1 cacher in the world is only up to about half that number, and it took him 13 years to get there.

 

For me personally, there are two reasons for wanting a lot of caches in my GPSr.

 

1) I'm at work and when I step out the door at quitting time, I decide to go cache. It is a spur of the moment decision. If I have the caches in my GPSr already, I'm free to go where I want. I don't have to go home, sync my GPSr to my home computer, and then decide where to go cache.

 

2) I'm planning to go on vacation to a faraway land. I don't have a real itinerary, so I don't know where I'll be on any given day. I may or may not have an internet connection. For this, I need to have my bases covered and have a lot of caches loaded into my GPSr.

 

Now, I will admit, I don't need 250,000 caches in either scenario.

 

For my home location, I like to have about 2000 caches loaded. This is perfectly doable with 2 or 3 PQs. But if there were a modest increase to the size of the PQ, I could do this in one download and be done with it. Lazy? Maybe. But it would ensure I have all the geocaches loaded and I wouldn't have to worry about a gap in the overlap area.

 

For vacations, especially in areas with high cache density, I'm looking at dozens of PQs. Within 50 miles of Los Angeles, there are 25,000 caches. Paris has 13,000 caches, London has 38,000, Toronto has 15,000.

 

I wouldn't advocate an increase from 1,000 to 10,000 caches, but even an increase from 1,000 to 2,000 would cut down on querying the database. In an ideal world 12 PQs would pull in 12,000 caches, but realistically, with overlap areas, you are pulling in a lot less. Why query the database for information you already have?

 

I have the exact same reasons for wanting more in a query, but I am not lazy, I am busy. I am a full-time working mother. One query for my entire area, or the entire area I may be vacationing in, is a much more efficient use of my time.


I have the opposite problem: lots of PQ's with small cache counts. Probably 50 or so with 50-250 caches in each. I would like more PQ's per day rather than larger PQ's.

 

Depending on the results you want, the API may be a better choice for some of them.

 

I use the API extensively, but PQ's allow me to schedule a weekly update in a set-and-forget manner, where the API doesn't make that easy.


What everyone seems to miss is that you don't need ALL of the caches, but rather the ones that match your criteria for hunting. I've posted this before, but I'll give an updated scenario.

 

At my last data pull over multiple days, there were 9,699 caches within 30 miles of my home.

My preferences can further limit the data:

  • I'm not partial to micros, and if truth be told, this area also uses "Not chosen" and "Other" for nano and micros. If I cut all of those out, it leaves 3,363 caches.
  • If I'm caching "on the fly", I'd like the cache to be at the posted coordinates. If I cut out all of the caches that aren't "Traditional", it leaves 2,790 caches
  • Terrain 1 typically represents parking lot caches, and "on the fly" I usually like to limit my upper end to 3.5 terrain, so eliminating Terrain 1.0, 4.0, 4.5 and 5.0 leaves 2,605
  • Difficulty on the fly - probably 3.0 or less, and that leaves 2,588.
  • There are 29 disabled remaining, so 2,559
  • There are 302 remaining with "Needs Maintenance", so 2,257

 

So what started as 9,699 ended as 2,257, about 23% of the full list.
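The whittling-down above, as a running tally (counts taken from this post; only the last two steps are plain subtractions we can check):

```python
# remaining cache count after each successive filter
cascade = [
    ("within 30 miles of home",              9699),
    ("drop micros / 'Not chosen' / 'Other'", 3363),
    ("Traditional only",                     2790),
    ("terrain between 1.5 and 3.5",          2605),
    ("difficulty 3.0 or less",               2588),
    ("drop 29 disabled",                     2588 - 29),   # 2559
    ("drop 302 with Needs Maintenance",      2559 - 302),  # 2257
]
final = cascade[-1][1]
print(final, f"{final / 9699:.0%}")  # 2257, about 23% of the full list
```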

 

Of these 2,257 caches, I'm pretty sure I'd have a good time finding any one of them, and I've pre-filtered problems or ones not to my taste. These are "ready to go" ones that I could easily download and be sure to go out and get. Am I possibly going to miss some good caches? Yep. But I'm still out there having fun, right? It's better to get out there and find them than to worry about the upkeep of your own database trying to post-filter the data locally.


What everyone seems to miss is that you don't need ALL of the caches, but rather the ones that match your criteria for hunting. I've posted this before, but I'll give an updated scenario.

 

At my last data pull over multiple days, there were 9,699 caches within 30 miles of my home.

My preferences can further limit the data:

  • I'm not partial to micros, and if truth be told, this area also uses "Not chosen" and "Other" for nano and micros. If I cut all of those out, it leaves 3,363 caches.
  • If I'm caching "on the fly", I'd like the cache to be at the posted coordinates. If I cut out all of the caches that aren't "Traditional", it leaves 2,790 caches
  • Terrain 1 typically represents parking lot caches, and "on the fly" I usually like to limit my upper end to 3.5 terrain, so eliminating Terrain 1.0, 4.0, 4.5 and 5.0 leaves 2,605
  • Difficulty on the fly - probably 3.0 or less, and that leaves 2,588.
  • There are 29 disabled remaining, so 2,559
  • There are 302 remaining with "Needs Maintenance", so 2,257

 

So what started as 9,699 ended as 2,257, about 23% of the full list.

 

Of these 2,257 caches, I'm pretty sure I'd have a good time finding any one of them, and I've pre-filtered problems or ones not to my taste. These are "ready to go" ones that I could easily download and be sure to go out and get. Am I possibly going to miss some good caches? Yep. But I'm still out there having fun, right? It's better to get out there and find them than to worry about the upkeep of your own database trying to post-filter the data locally.

 

Jeepers!! If someone didn't want to subtract the "Not chosen" caches or micros and wasn't worried about terrain or difficulty, how many pocket queries would it take, all in the same area, to download all those? I'd agree you could probably do it with the API, but still, that's a lot of caches in a small area.

At my last data pull over multiple days, there were 9,699 caches within 30 miles of my home.

 

So I'll do the math - but let's round up to 9,700 to keep it easy

 

1000 x 5 PQs = 5,000 caches - well, maybe 4,900, given the need to keep each one just under 1,000

 

That leaves 4,800 to pull from the API in the same day.

Since there's 6,000 per day with the API, I'd still be able to pull an additional 1,200
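Spelling out that same-day budget with the numbers as rounded in this post:

```python
need = 9700               # ~9,699 caches within 30 miles, rounded up
pq_pull = 5 * 1000 - 100  # ~4,900: five date-split PQs, each just under 1,000
api_cap = 6000            # daily full-data cap cited here for the API

from_api = need - pq_pull      # 4,800 left to fetch via the API
headroom = api_cap - from_api  # 1,200 of API allowance still to spare
print(from_api, headroom)
```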

 

And you must have missed this part...

At my last data pull over multiple days, there were 9,699 caches within 30 miles of my home.

[...]

1000 x 5 PQs = 5,000 caches - well, maybe 4,900, given the need to keep each one just under 1,000

[...]

 

That's the point: we want just one PQ with 5000 caches to load the GPSr in one go. Got it? ;-)

(It's not the daily limit itself)

 

Hans

Edited by HHL
At my last data pull over multiple days, there were 9,699 caches within 30 miles of my home.

 

So I'll do the math - but let's round to 9,700 for easy

 

1000 x 5 PQs = 5,000 caches - well, maybe 4,900, with the need to keep each one under 1,000

 

That leaves 4,800 to pull from the API in the same day.

Since there's 6,000 per day with the API, I'd still be able to pull an additional 1,200

 

And you must have missed this part...

At my last data pull over multiple days, there were 9,699 caches within 30 miles of my home.

 

Suggestions to make the packaging of PQ's more flexible are not new. Lack of interest by Groundspeak to allow that flexibility is not new. We have no leverage. Stalemate.

 

But consider that so few users need anywhere near the available PQ limit that it seems to be a little-known fact that the daily limit was increased from 5 PQ's to 10 PQ's quite some time ago. From the PQ page:

 

You can create up to 1000 queries. However, a maximum of 10 queries can be run each day.

 

And yes, it is a royal pain to determine the date ranges that allow you to combine those 10,000 (minus just a bit of shrinkage) caches into one large radius. I think I once saw a third-party site that offered an automated solution for that, but I don't use that site. So I have to do it the Groundspeak way.
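For anyone inclined to script that chore themselves, here's a minimal sketch of the greedy date splitting, assuming you can export each cache's placed date (e.g. from a GSAK database); the function and variable names are made up for illustration:

```python
from collections import Counter
from datetime import date

def split_by_placed_date(placed_dates, cap=990):
    """Greedily group placed dates into contiguous ranges holding at most
    `cap` caches each, so every range fits inside one Pocket Query.
    (A single day with more than `cap` caches still gets its own,
    oversized range - a PQ date filter can't split mid-day.)"""
    per_day = sorted(Counter(placed_dates).items())  # [(day, count), ...]
    ranges, start, count = [], per_day[0][0], 0
    for day, n in per_day:
        if count and count + n > cap:     # adding this day would overflow
            ranges.append((start, prev, count))
            start, count = day, 0
        count += n
        prev = day
    ranges.append((start, prev, count))
    return ranges

# e.g. split 9 caches placed over three days into PQs of at most 7:
demo = [date(2020, 1, 1)] * 3 + [date(2020, 1, 2)] * 3 + [date(2020, 1, 3)] * 3
print(split_by_placed_date(demo, cap=7))
```

Each returned tuple is a (start date, end date, cache count) range you'd type into one PQ's "Placed during" fields.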

Link to comment

What everyone seems to miss is that you don't need ALL of the caches, but rather the ones that match your criteria for hunting. I've posted this before, but I'll give an updated scenario.

 

At my last data pull over multiple days, there were 9,699 caches within 30 miles of my home.

My preferences can further limit the data:

  • I'm not partial to micros, and if truth be told, this area also uses "Not chosen" and "Other" for nano and micros. If I cut all of those out, it leaves 3,363 caches.
  • If I'm caching "on the fly", I'd like the cache to be at the posted coordinates. If I cut out all of the caches that aren't "Traditional", it leaves 2,790 caches
  • Terrain 1 typically represents parking-lot caches, and "on the fly" I usually like to limit my upper end to 3.5 terrain, so eliminating Terrain 1.0, 4.0, 4.5 and 5.0 leaves 2,605
  • Difficulty on the fly - probably 3.0 or less, and that leaves 2,588.
  • There are 29 disabled remaining, so 2,559
  • There are 302 remaining with "Needs Maintenance"

 

So what started as 9,699 ended as 2,257, about 24% of the full list.

 

Of these 2,257 caches, I'm pretty sure I'd have a good time finding any one of them, and I've pre-filtered out the problems and the ones not to my taste. These are "ready to go" caches that I could easily download and be sure to go out and get. Am I possibly going to miss some good caches? Yep. But I'm still out there having fun, right? It's better to get out there and find them than to worry about the upkeep of your own database trying to post-filter the data locally.
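The whittling-down quoted above is exactly the kind of thing a few lines over an exported cache list can do; a sketch, with made-up field names (this is not an actual GSAK or API schema):

```python
# Hypothetical filter chain mirroring the quoted preferences:
# no micros/"Not chosen"/"Other", Traditional only, terrain 1.5-3.5,
# difficulty <= 3.0, and nothing disabled or flagged Needs Maintenance.
caches = [
    {"type": "Traditional", "size": "Regular", "terrain": 2.0,
     "difficulty": 1.5, "disabled": False, "needs_maint": False},
    {"type": "Multi-cache", "size": "Micro", "terrain": 1.0,
     "difficulty": 2.0, "disabled": False, "needs_maint": False},
]

keep = [c for c in caches
        if c["size"] not in ("Micro", "Not chosen", "Other")
        and c["type"] == "Traditional"
        and 1.5 <= c["terrain"] <= 3.5
        and c["difficulty"] <= 3.0
        and not c["disabled"]
        and not c["needs_maint"]]

print(len(keep))  # 1 of the 2 sample caches survives the chain
```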

 

I'm happy that the above solution works for you. You found a way to make the PQs work for your caching style. Your style of caching is not mine. Your preference for cache type is not mine.

 

If you came to vacation in my home town, you would miss 21 of the top 25 favorited caches.

Link to comment
At my last data pull over multiple days, there were 9,699 caches within 30 miles of my home.

 

So I'll do the math - but let's round to 9,700 for easy

 

1000 x 5 PQs = 5,000 caches - well, maybe 4,900, with the need to keep each one under 1,000

 

That leaves 4,800 to pull from the API in the same day.

Since there's 6,000 per day with the API, I'd still be able to pull an additional 1,200

 

And you must have missed this part...

At my last data pull over multiple days, there were 9,699 caches within 30 miles of my home.

 

Suggestions to make the packaging of PQ's more flexible are not new. Lack of interest by Groundspeak to allow that flexibility is not new. We have no leverage. Stalemate.

 

But consider that so few users need anywhere near the available PQ limit that it seems to be a little-known fact that the daily limit was increased from 5 PQ's to 10 PQ's quite some time ago. From the PQ page:

 

You can create up to 1000 queries. However, a maximum of 10 queries can be run each day.

 

And yes, it is a royal pain to determine the date ranges that allow you to combine those 10,000 (minus just a bit of shrinkage) caches into one large radius. I think I once saw a third-party site that offered an automated solution for that, but I don't use that site. So I have to do it the Groundspeak way.

 

Few cachers really need to worry about the limit any more. Just go find a power cache near you and you've got a few hundred caches that day. It's sad.

 

I've said it before and I'll say it again: I've got 413 PQ's that I run in a cycle. That's 16 states that I ride the bike to regularly, whether I want to cache or not. Raising the PQ limit to 5000 would reduce the number of PQ's and shorten my cycle, until the Power Caches overwhelm the 5000. Running the API is all fine and dandy, but it isn't random; it selects the same caches over and over, with maybe a few new ones added. The last states API I ran pulled one new cache and updated about 400. That's a 16-state API run using the current API limits. Even if I just wanted to pull Wyoming, no other state, I'd still need 10 or 15 pocket queries. What I might be able to do is a different API run each day on a single state. I might try that rather than 16 states and see what I come up with. But the API limit would still need to be increased, because there are way more caches than it would pick up under the current limits.

 

And someone else hit the nail on the head when they said, 'We need the limits to increase, we need rules placed on power caches, they know we need these things, but they don't NEED to do anything.'

Edited by Jake81499
Link to comment

And yes it is a royal pain to determine the date ranges that allow you to combine those 10,000 (minus just a bit of shrinkage) caches into one large radius. I think I once saw a third party site that offered an automated solution for that, but I don't use that site. So I have to do it the Grounspeak way.

As an FYI for whoever might be interested in the date splitting you mentioned: Project-GC's PQ Splitter is a free service. Single or multiple states can be selected, and some filters can be applied, such as 'not found' to exclude any caches you've already found.

 

Just for curiosity, I checked on splitting all caches in Wyoming. It said I'd need 7 PQ's and gave me the dates needed for each of those 7 PQ's.

Link to comment

Adding a new type of cache would solve the problem with power caches. I'm pretty sure it's been mentioned. If not, make a traditional/series cache. If a series of caches is more than 10 caches or so, it would be mandatory to make it a "series" cache. It would still be traditional, but set apart from normal traditionals. Most so-called power caches I've seen have had numbers attached to them. But I'm new; I haven't seen that many "power caches"

 

Doing this, you could set a filter for it. Problem solved?

Link to comment

Adding a new type of cache would solve the problem with power caches. I'm pretty sure it's been mentioned. If not, make a traditional/series cache. If a series of caches is more than 10 caches or so, it would be mandatory to make it a "series" cache. It would still be traditional, but set apart from normal traditionals. Most so-called power caches I've seen have had numbers attached to them. But I'm new; I haven't seen that many "power caches"

 

Doing this, you could set a filter for it. Problem solved?

 

It's been suggested almost exactly as you mentioned it. Then we could just shut off Power Caches on the PQ creation page. Peer pressure would really help enforce the new rules, but don't hold your breath. The best workaround right now would be to increase the number of caches in the PQ and API Get Caches. I just ran the API Light on California. It looked for 10,000 caches and gave me 83 new ones. It's better than none, I guess.

 

Just an FYI, You say you haven't seen a lot of power caches. Here's one for you. Look at Highway to H.E.L.L in southern California. It's one of the worst.

Edited by Jake81499
Link to comment
Adding a new type of cache would solve the problem with power caches. I'm pretty sure it's been mentioned. If not, make a traditional/series cache. If a series of caches is more than 10 caches or so, it would be mandatory to make it a "series" cache. It would still be traditional, but set apart from normal traditionals. Most so-called power caches I've seen have had numbers attached to them. But I'm new; I haven't seen that many "power caches"
I like the idea of some sort of attribute for numbers run caches, but I think that "series" is the wrong name for this attribute. There are a number of cache series around here. None of them are numbers run caches. For many of them, the caches in the series are united by a theme, but are not particularly close geographically, and certainly not 528ft/161m from each other along a trail.
Link to comment

I just ran the API Light on California. It looked for 10,000 caches and gave me 83 new ones. It's better than none I guess.

 

Depending on when you last updated your CA database, you should have run the API with "was published in the last xx days" (xx<30). That would give you only new ones (and probably more than 83 <_< ).

 

Not only that but with updated databases you could even select the states needed and run the API fetch daily for caches "published in the last 2 days" unless more than 6000/10000 are published in the whole area you're covering.

The caches already in your database can then be updated by PQ's as usual.

Link to comment


 

Just an FYI, You say you haven't seen a lot of power caches. Here's one for you. Look at Highway to H.E.L.L in southern California. It's one of the worst.

 

I looked at that. How does one figure they are contributing to the game by doing that? I bet that person gets hate mail daily.

 

I'm sure there are some people out there that want to increase their find count and do caches like those. How boring.

Link to comment

Depending on when you last updated your CA database, you should have run the API with "was published in the last xx days" (xx<30). That would give you only new ones (and probably more than 83).

Actually for PQs it's "Placed" not published date. And the placed date could be any date. I have PQs covering the last half year, and I'm still occasionally finding new publishes that don't show up in those PQs because they were placed over 6 months ago, sometimes even years ago. Almost guaranteed, a PQ filtering for Placed Date < 30 days will not get you every new nearby cache you don't yet have.

 

I have this general setup (all maxed where applicable at 1000 caches, 500 miles, in Ontario):

1] PQ for all unfound caches centered on home

2] PQ for all caches centered on home placed in the past 3ish months (this gives a much wider radius, and I adjust the date occasionally)

3] PQ for all caches in Ontario Placed between [end of previous date-series-PQ] and Dec 31 this year (this piggy-backs all the way down, adjusting dates to keep each PQ cache count < 1000 results)

 

Not only that but with updated databases you could even select the states needed and run the API fetch daily for caches "published in the last 2 days" unless more than 6000/10000 are published in the whole area you're covering. The caches already in your database can then be updated by PQ's as usual.

Can't wait for 'published date' to be a search option. I believe that was announced by GS as in the works.

Link to comment

 

I looked at that. How does one figure they are contributing to the game by doing that? I bet that person gets hate mail daily.

 

I'm sure there are some people out there that want to increase their find count and do caches like those. How boring.

 

Exactly. I just added 1590 more power caches to the ignore list. Walk About, The Connector, Solene's Trail, and Josephs Dream. As I find them or as they pop up I add them to the ignore list.

 

If the cache limits were expanded I probably wouldn't need to ignore them unless they are ridiculous, like the Highway To H.E.L.L group.

Link to comment

Depending on when you last updated your CA database, you should have run the API with "was published in the last xx days" (xx<30). That would give you only new ones (and probably more than 83).

Actually for PQs it's "Placed" not published date. And the placed date could be any date. I have PQs covering the last half year, and I'm still occasionally finding new publishes that don't show up in those PQs because they were placed over 6 months ago, sometimes even years ago. Almost guaranteed, a PQ filtering for Placed Date < 30 days will not get you every new nearby cache you don't yet have.

 

Now read again... <_<

If you have an updated database (most recent caches in the PQ within 30 days) then you will get by with API imports "published in the last xx days".

When importing PQs, GSAK will color red any PQ that hit 1000 caches, so you'll know to adjust the dates on that one and the later ones. It happens, although not often, that a lot of caches are published months after their placed date, but it only takes a run of the placedPQ macro to adjust the dates in my PQs. As caches are archived/found, I adjust dates once in a while anyway.

Link to comment

Ok, sorry, I misread - if I'm understanding you correctly, you're referring to the strategy not of getting the most recent publishes directly by a query, but of determining which are recent publishes by the fact that a more recent query returns caches that weren't included in the previous query (or API call). In that case, yeah, those are new caches that were published since your last query.

 

Just wanted to make it clear that there's no way to query caches by "published date" [afaik], which is how I read that quoted part... that's what had me confused. :)

Link to comment

Just an FYI, You say you haven't seen a lot of power caches. Here's one for you. Look at Highway to H.E.L.L in southern California. It's one of the worst.

 

I looked at that. How does one figure they are contributing to the game by doing that? I bet that person gets hate mail daily.

 

I'm sure there are some people out there that want to increase their find count and do caches like those. How boring.

Here's one near you in List view and in map view.

 

ETA: By the way, this would also be considered a "GeoArt" series. Just mentioning that since you said you were new and so maybe you haven't heard that term before.

Edited by noncentric
Link to comment

Just an FYI, You say you haven't seen a lot of power caches. Here's one for you. Look at Highway to H.E.L.L in southern California. It's one of the worst.

 

I looked at that. How does one figure they are contributing to the game by doing that? I bet that person gets hate mail daily.

 

I'm sure there are some people out there that want to increase their find count and do caches like those. How boring.

Here's one near you in List view and in map view.

 

ETA: By the way, this would also be considered a "GeoArt" series. Just mentioning that since you said you were new and so maybe you haven't heard that term before.

 

That's one of the few I have seen. I personally wouldn't consider it art. I'd call it graffiti.

Link to comment

Just an FYI, You say you haven't seen a lot of power caches. Here's one for you. Look at Highway to H.E.L.L in southern California. It's one of the worst.

 

I looked at that. How does one figure they are contributing to the game by doing that? I bet that person gets hate mail daily.

 

I'm sure there are some people out there that want to increase their find count and do caches like those. How boring.

Here's one near you in List view and in map view.

 

ETA: By the way, this would also be considered a "GeoArt" series. Just mentioning that since you said you were new and so maybe you haven't heard that term before.

 

There is another thread concerning GeoArt and Power caches which I started, and many that others have started. We need a band-aid to help deal with them. Increasing the cache limits would be a band-aid. The band-aid will eventually fall off, and the infection of Power caches will need to be dealt with. There have been dozens of methods posted for dealing with power caches, but raising the cache limit might be the best until the geofathers realize that we are right and there is a problem.

 

I just ignored a Route 66 chain in California and Dear and Antelope Play in Nevada, adding 747 more to the list.

 

Wild Ride in Wyoming, and When Two Worlds Collide in Nevada just got ignored for 1031.

Edited by Jake81499
Link to comment
