
Feature Request: "Standard PQs"


TwinTraveller


It is my understanding that PQs require a lot of resources from the GC server. For this reason, one is limited to running 5 queries a day, with each individual query runnable only once per day, and the "My Finds" query, which is not limited to 500 caches, runnable only once a week. This is all understandable, but has anyone ever wondered what queries people are actually running? Perhaps a lot could be optimized?

 

Allow me to clarify with an example:

The Belgian Geocache forum has a pinned topic that specifies date ranges for retrieving all geocaches in Belgium with as few pocket queries as possible. As of today, we have reached the need for 10 PQs. The fact that this is a pinned topic suggests that many GC premium members are running (some of) these queries every single day. I don't know how the GC server is set up to handle PQs, but if this is done on an individual basis, we are wasting a lot of server resources...

 

So my feature request is:

Would it not be possible to generate daily standard PQs for certain regions, and allow premium members to download these pregenerated files or subscribe to a mailing list? These PQs would not necessarily need to be limited to 500 caches. IMHO this would remove the need for a lot of individual PQs and potentially free up server resources.

 

TT

If you ran a PQ for an area, how would you know whether or not I have already found a cache in that area? I do not want to run a PQ for caches I have already found. I think this is where the big issue lies.

That's fine, but there's no reason that one couldn't run a PQ without your found caches marked and still know which is which if you keep an offline record of the caches you've found. GSAK does this beautifully. A canned PQ and a "your found" PQ would be all you need.

 

To reduce load even further, one could download only those caches changed in the last 7 days for both. There's no need to download information that isn't useful--meaning, if you already have it, being sent it again is not useful. On a weekly basis, my offline database only updates about one quarter to one third of the caches in it. That would be another significant reduction.

 

Another would be sending only, but all of, the information that changed. In a "differential" PQ, there is no need to resend logs that have already been sent or descriptions that haven't changed.

 

In my tests, aggressively pursuing a differential scheme for PQ downloads would reduce the size of the downloads by in excess of 90%. For canned PQs, the savings are multiplied by every person who uses them.
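A minimal sketch of what such a differential export could look like. The record layout here is entirely hypothetical (Groundspeak's actual data model isn't public): keep a `last_sent` timestamp per subscriber and emit only the caches, and only the logs, modified since then.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Cache:
    code: str
    description: str
    updated: datetime                          # last change to the listing itself
    logs: list = field(default_factory=list)   # (timestamp, text) pairs

def differential_pq(caches, last_sent):
    """Return only the data that changed since the subscriber's last download."""
    delta = []
    for c in caches:
        new_logs = [(t, txt) for (t, txt) in c.logs if t > last_sent]
        if c.updated > last_sent or new_logs:
            delta.append({
                "code": c.code,
                # resend the description only if the listing itself changed
                "description": c.description if c.updated > last_sent else None,
                "logs": new_logs,
            })
    return delta
```

A cache with one new log but an unchanged listing would come back as just its code and that single log, which is exactly where the bulk of the 90% saving would come from.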

 

Canned PQs, differential or not, would not supplant custom PQs, but augment them. In my situation, I'd replace a series of PQs that grab the caches in South Carolina with a single weekly PQ.

 

Additionally, if the PQs could be shared much like Caches Along a Route, then priority could be given to PQs with the highest number of shared users. This would encourage folks to share and reward those who provide the most savings to the site. That way Groundspeak wouldn't have to do the work of dividing up the areas and could let the community decide the best way to do it.
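That prioritization could be as simple as ordering the shared-query job queue by subscriber count. A sketch with made-up query names and subscriber numbers:

```python
# Hypothetical shared-PQ job queue: run the queries serving the most
# premium members first, since one server run replaces many individual PQs.
shared_pqs = [
    {"name": "Belgium full set", "subscribers": 240},
    {"name": "South Carolina weekly", "subscribers": 85},
    {"name": "My local 50 miles", "subscribers": 1},
]

run_order = sorted(shared_pqs, key=lambda pq: pq["subscribers"], reverse=True)
for pq in run_order:
    print(pq["name"], "->", pq["subscribers"], "subscribers served per run")
```

One server run of the top entry would then replace 240 individual queries, while one-off personal queries naturally fall to the back of the line.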

Another reason for limiting the number of PQs each person is allowed to run is to prevent the creation of offline databases.

 

I do understand that Groundspeak does not want someone to set up an unofficial mirror site or "hammer" their site to extract data for offline storage, but I don't see any harm in creating your own offline copy for personal use. If they did not want you to do that, why would they list an application like GSAK on their Geocaching Software page and offer the option of running 5 PQs of 500 entries per day?

Edited by TwinTraveller

To reduce load even further, one could download only those caches changed in the last 7 days for both. There's no need to download information that isn't useful--meaning, if you already have it, being sent it again is not useful. On a weekly basis, my offline database only updates about one quarter to one third of the caches in it. That would be another significant reduction.

 

I'd love to be able to use "updated in the last 7 days"; however, it does not account for caches that become archived. There is simply no way to tell, and they remain in the database.

 

Today, if you run a PQ, or set of PQs, monthly, asking for caches that you do not own and have not found, then after you run your PQs you can run a filter showing which caches are unfound and received no GPX update in the last seven days. Those results can then be deleted, because they are archived.
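That workaround amounts to a simple filter over the offline database. A minimal sketch, with illustrative field names loosely modeled on GSAK's "last GPX" date (the cache codes are made up):

```python
from datetime import date, timedelta

def likely_archived(db, today, window_days=7):
    """Unowned, unfound caches that no recent PQ has touched are
    presumed archived and can be deleted from the offline database."""
    cutoff = today - timedelta(days=window_days)
    return [c for c in db
            if not c["owned"] and not c["found"] and c["last_gpx"] < cutoff]

db = [
    {"code": "GC100", "owned": False, "found": False, "last_gpx": date(2008, 5, 1)},
    {"code": "GC200", "owned": False, "found": False, "last_gpx": date(2008, 5, 20)},
    {"code": "GC300", "owned": True,  "found": False, "last_gpx": date(2008, 5, 1)},
]
stale = likely_archived(db, today=date(2008, 5, 21))
# only GC100 is unowned, unfound, and untouched within the window
```

The logic only works because the PQs were set to exclude found and owned caches in the first place; a cache that stops appearing in the results must have been archived.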

 

Having said that, and back to the OP's post, I don't think that 10 PQs (5000 caches) is unreasonable to get a whole region or country. It will take 2 days, 3 if you want to leave a slot or two open for emergencies. It sounds like the desire is to keep an offline, on-demand database of the region. As such, and given that Groundspeak has made it clear many times (for some of the reasons listed in previous posts) that they have no desire to change the PQ limits, running these PQs weekly seems to be the method for the foreseeable future.

I'd love to be able to use "updated in the last 7 days"; however, it does not account for caches that become archived. There is simply no way to tell, and they remain in the database.

'Tis true. The last-7-days PQ is pretty much useless for most of us. Those of us who don't keep an offline database need a full set of caches simply because we don't keep the previous week's set. Those of us who do keep one need a full set to know which caches are missing from it, to determine which are archived.
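Detecting archived caches from two full downloads is just a set difference on the GC codes. A sketch with made-up codes:

```python
def newly_archived(previous_codes, fresh_codes):
    """Whatever appeared in last week's full download but is missing
    from this week's is presumed archived."""
    return set(previous_codes) - set(fresh_codes)

last_week = {"GC1", "GC2", "GC3", "GC4"}
this_week = {"GC1", "GC2", "GC4", "GC5"}   # GC3 is gone, GC5 is new
print(newly_archived(last_week, this_week))  # {'GC3'}
```

The expensive part isn't this comparison; it's that `this_week` has to be a complete fresh download just to reveal the handful of missing codes.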

 

This is highly inefficient. OTOH, TPTB have only provided this much up to this point. Right now I'm using the potential of 11,500 caches' worth of downloads to maintain a database of under 6000 caches (including all my finds and owned caches). In any single week only about a quarter to a third of these listings are even updated.

 

For instance, I hadn't updated GSAK since the 18th. I had all of my downloads from the 19th to today waiting in the queue. I just loaded them, and here are the results:

Waypoints in file(s) loaded..............12134 
New waypoints added to GSAK.................68 
Existing Waypoints updated in GSAK........3118 
Waypoints already up to date in GSAK......8942 
Ignored because in permanent delete list.....6 
Additional child waypoints added.............7 

That's two weeks of downloads, and 73.7% were essentially tossed into the bit bucket. However, because I don't know which caches are archived, those caches are useful only to know which ones are not missing. In those two weeks, 19 were archived. I needed those 8942 (or about 4500 for one week) to know which 19 to remove from my default database.
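That figure checks out against the GSAK load summary above; a quick back-of-the-envelope calculation:

```python
total = 12134            # waypoints in the loaded files
already_current = 8942   # "already up to date in GSAK"
redundant = already_current / total * 100
print(f"{redundant:.1f}% of the download carried no new information")  # 73.7%
```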

 

Now, of those 3118 that were updated, judging from browsing the GSAK update log, the vast majority of the changes were a single log. In those cases, all of the cache's data and 5 logs were sent only to say that a single log had been added. I was only able to find 34 long-description updates and 4 short-description updates, 2 of which were only short-description updates. 3082 cache descriptions were useless because I already had the latest version.

 

There are a lot of changes that could be made to make the PQ system a lot more efficient. Those who keep offline databases could actually save the site massive amounts of server load and bandwidth, well beyond those who only pull what they need on any one day.

 

I'm hoping TPTB haven't turned a deaf ear to my past ranting on the subject, ahead of the soon-to-be-released PQ upgrades.

Edited by CoyoteRed

Many of you are assuming that reducing server load is the primary goal of the PQ process.

 

I would guess that the primary goal is to give Geocachers access to a customized (and limited) set of data for a day of Geocaching. They would like to reduce server load while keeping that primary goal intact.

Many of you are assuming that reducing server load is the primary goal of the PQ process.

I'm not. Never have. Getting the caches that geocachers want for their pursuit of the hobby in front of them is.

 

I would guess that the primary goal is to give Geocachers access to a customized (and limited) set of data for a day of Geocaching. They would like to reduce server load while keeping that primary goal intact.

When Groundspeak can provide filtering as powerful as GSAK's, with the same instant access and trivial ordering, then I'll see your point. They don't, so I'll continue to use GSAK as my primary means of doing so. First, though, I have to get an acceptable subset of local caches; that's where PQs come into play.


Maybe it's just me, but why would anyone need a list of every cache in their entire area? I have no ambition to keep an entire database of Wisconsin and the UP of Michigan, but at the drop of a hat I can run a pocket query and be gone in about 10 minutes with 500 caches loaded in the laptop, the nuvi, and the GPS. Why keep a database when we have a website to do that for us? It is all these unnecessary PQs run daily that backlog the system. We don't need special queries set up; in fact, the limit should be lowered to 2 or 3 a day instead of 5. Then people might think a bit before using a precious PQ just to update every cache within 1000 miles of their house.

 

Barry of sweetlife

Maybe it's just me, but why would anyone need a list of every cache in their entire area...

Huh, have you not been reading thread after thread of folks having problems getting their PQs? What about the website being down? You can't get an instant PQ when the site is down.

 

Oh, I bet I can be out the door faster than you can. I already have my PQs downloaded before you even get on the website--if you can get on the website.

 

Never mind that I don't have to plan where in our stomping grounds we're headed before we hit the door. We just go. We change our minds mid-trip and, again, no issue, as we already have the data to do so.

 

Sorry, just because I prepare differently than you do doesn't mean your way is any better than mine. If you can get by with the way you do it, fine. I like the way I do it.

It is all these unnecessary PQs run daily that backlog the system...

 

True.

 

Hence the request to avoid running the same query over and over for each individual, and instead offer pre-processed queries, so that there is room for those individuals who do want to create a truly personal query, and so that the system responds within a reasonable time.

(snip) I like the way I do it.

 

... and so do I - I do it the exact same way. 5061 caches within 70 or so miles of my house. I can go off hunting in any direction I want and change my mind mid-hunt. I can filter by cache owner within seconds, or filter by distance and direction from my house within seconds. GSAK is just superior to a plain PQ-and-go. But as CR said, how you do it is how you do it and how I do it is how I do it. I think he said that :P

This topic is now closed to further replies.