
We need Super Pocket Queries


benmwatson


Pocket Queries are GREAT. I love them, really. They make it easy to get a ton of caches in an area.

 

But it's not enough anymore. Here's what I really want to do:

* One query. All of the caches in Washington state that I haven't found. Updated every week.

 

That's it.

 

Here's my reasoning.

 

* With the advent of GPS devices like Garmin's 600/650, which can hold millions of caches, the limiting factor now becomes this website.

* There are so many caches in many areas (like Washington, where I am) that it's difficult to cover a significant geographic area in a single query, unless you spend a LOT of time tweaking the queries.

* Many weekends, I just want to get in my car and drive, and see what caches are there. It's not really possible to do this without significant upfront planning. I have to figure out where I'm going to be, create a new PQ, download it, etc. If I just knew that my device already had all of the caches on it, how much easier would this be?! As it is now, I sometimes just don't bother.

* It would allow far more spontaneity in general.

 

I know it's possible to work around these--I could create multiple queries, but figuring out how to get suitably wide coverage without getting overlapping results is next to impossible, and this is just too much work in the end.

 

Alternative:

* Generic, downloadable files that contain all caches per state/country/etc. They would contain caches you've already found, but I think most devices recognize that anyway.

Link to comment
I know it's possible to work around these--I could create multiple queries, but figuring out how to get suitably wide coverage without getting overlapping results is next to impossible, and this is just too much work in the end.
Are you familiar with the system Markwell describes for splitting up multiple PQs based on date ranges? It is available here:

http://www.markwell.us/pq.htm#tips

 

It doesn't seem "next to impossible" to me.

Link to comment
but figuring out how to get suitably wide coverage without getting overlapping results is next to impossible, and this is just too much work in the end.

 

I suggest you set up a date ranged series of queries. You don't have to do a bunch of "figuring out", it's been done for you.

 

http://project-gc.com/Tools/PQSplit?multi_region[]=Washington&submit=Filter

 

Use these date ranges, plus the "That I haven't found" option.

And there you are.

 

Kind of a pain the first time you set it up, but once it's set up, fairly painless.

I wouldn't bother to adjust over time, just add new queries as time passes.
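For what it's worth, the date-splitting idea can be sketched in a few lines. This is a toy illustration, not anything Groundspeak or Project-GC actually runs: given the hidden dates of the caches in an area (sample data, in this sketch), it greedily picks date ranges that each hold at most 1000 caches, i.e. one PQ's worth.

```python
PQ_LIMIT = 1000  # current per-PQ cache limit


def split_by_date(hidden_dates, limit=PQ_LIMIT):
    """Greedily group a list of cache hidden-dates into (start, end)
    date ranges holding at most `limit` caches each."""
    dates = sorted(hidden_dates)
    ranges = []
    i = 0
    while i < len(dates):
        j = min(i + limit, len(dates))
        # PQ date filters are inclusive, so a range can't end in the
        # middle of a day: back up until the cut falls between days.
        while j < len(dates) and j > i + 1 and dates[j - 1] == dates[j]:
            j -= 1
        ranges.append((dates[i], dates[j - 1]))
        i = j
    return ranges
```

One known wart, mentioned later in this thread: a power trail that publishes more than `limit` caches hidden on a single day cannot be split by date at all, and that day's range will simply overflow.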

Link to comment

If you are a GSAK user, are you taking advantage of the API? Through the API you can receive complete details and up to 30 logs on as many as 6000 caches per day.

 

As I visit Washington State frequently, I keep an up-to-date database this way, by running just one set of pocket queries for the state and then keeping the database up-to-date with API updates. I do the same to maintain databases for other places where I visit regularly or am planning to visit.

Link to comment

@Keystone

 

I keep recommending GSAK for things like this, and keep getting blown out of the water with [a] "I run a Mac and I don't want to / can't run a Windows emulator", [b] "doing so would violate my Mac principles", or [c] "we shouldn't have to pay extra for features that gc.com should be giving to us with our premium membership".

 

I also keep using it for exactly the sorts of reasons that existing users, like yourself, keep mentioning!

 

It seems that, without realizing it, people expect the website to provide all of the functionality of the API, including managing and manipulating their own personal databases, which, of course, is done far more efficiently on their own PCs than 'in the Groundspeak cloud'.

 

While I have you here, Keystone, the OP's question reminds me of one I have as well:

One downside to the GSAK API interface is a limitation on geographic bounds for an area-defined query. Rectangles are limited to 62.1 miles corner to corner, and circles are limited to a 31.1 mile radius. Is this an actual limitation of the API call, or is this a limitation that is being imposed by Clyde? I haven't spotted the actual API spec anywhere to answer my own question. Could be it's only exposed to developers with API agreements. I'll ask over at GSAK as well.

Link to comment

Pocket Queries are GREAT. I love them, really. They make it easy to get a ton of caches in an area.

 

But it's not enough anymore. Here's what I really want to do:

* One query. All of the caches in Washington state that I haven't found. Updated every week.

 

That's it.

 

Here's my reasoning.

 

* With the advent of GPS devices like Garmin's 600/650, which can hold millions of caches, the limiting factor now becomes this website.

* There are so many caches in many areas (like Washington, where I am) that it's difficult to cover a significant geographic area in a single query, unless you spend a LOT of time tweaking the queries.

* Many weekends, I just want to get in my car and drive, and see what caches are there. It's not really possible to do this without significant upfront planning. I have to figure out where I'm going to be, create a new PQ, download it, etc. If I just knew that my device already had all of the caches on it, how much easier would this be?! As it is now, I sometimes just don't bother.

* It would allow far more spontaneity in general.

 

I know it's possible to work around these--I could create multiple queries, but figuring out how to get suitably wide coverage without getting overlapping results is next to impossible, and this is just too much work in the end.

 

Alternative:

* Generic, downloadable files that contain all caches per state/country/etc. They would contain caches you've already found, but I think most devices recognize that anyway.

Maybe a smart phone app will work for you? Depending on what part of Washington you live in, you may have 4g or wifi all over the place.

Link to comment
I know it's possible to work around these--I could create multiple queries, but figuring out how to get suitably wide coverage without getting overlapping results is next to impossible, and this is just too much work in the end.
Are you familiar with the system Markwell describes for splitting up multiple PQs based on date ranges? It is available here:

http://www.markwell.us/pq.htm#tips

 

It doesn't seem "next to impossible" to me.

 

I don't really see the sense in having a dozen PQs for an area split by date. To me, this sounds a bit cumbersome, because I have to look through different PQs: there might be caches within my radius that are split across different PQs because of their publish dates.

 

Having it all in one would be great. Or, as an idea, every cacher could create exactly one PQ with 10,000 caches once a week or so.

Link to comment

I don't really see the sense in having a dozen PQs for an area split by date. To me, this sounds a bit cumbersome, because I have to look through different PQs: there might be caches within my radius that are split across different PQs because of their publish dates.

 

Having it all in one would be great. Or, as an idea, every cacher could create exactly one PQ with 10,000 caches once a week or so.

 

No thanks! That would severely limit my weekly cache files! I use the 'split by date' format for all of New Jersey and anything within 65 miles of the Dolphinarium. (Yes. I do have to update it every few months.) That gives me 20,000 - 30,000 caches to choose from. I run them through GSAK, pick the area I'm interested in, and put them into GUPY using Garmin POI loader. I certainly would not want to be limited to 10,000 caches. I like the way it works now.

Link to comment
I know it's possible to work around these--I could create multiple queries, but figuring out how to get suitably wide coverage without getting overlapping results is next to impossible, and this is just too much work in the end.
Are you familiar with the system Markwell describes for splitting up multiple PQs based on date ranges? It is available here:

http://www.markwell.us/pq.htm#tips

 

It doesn't seem "next to impossible" to me.

 

I don't really see the sense in having a dozen PQs for an area split by date. To me, this sounds a bit cumbersome, because I have to look through different PQs: there might be caches within my radius that are split across different PQs because of their publish dates.

 

Having it all in one would be great. Or, as an idea, every cacher could create exactly one PQ with 10,000 caches once a week or so.

Project GC can help, if you are willing to try something other than a magic solution that requires no work on your part.

Link to comment

Instead of shooting down the OP, how about trying to spiral in on a compromise.

 

Like....

 

How about if total download from PQs could be 10k caches per day, rather than 1k per PQ so that if someone wants to burn that up in 1 PQ that's up to them, and if someone else wants to perform 10 smaller PQs that's up to them. The OP might be happy with that while keeping everyone else happy too.

 

And/Or...

 

How about GC.com performs a daily, or even weekly PQ of each state, or select geographical areas, and then allows premium members to download one of those pre-compiled state query files, over and above their normal allocation, perhaps once a week. The query would only run once no matter how many downloads, and the downloadable GPX file could be located anywhere, not necessarily on gc.com's core servers, so wouldn't necessarily affect website load/speed.

 

I travel quite a lot and when I travel, I'm pretty haphazard about where I go, on a whim, and this kind of query would be really useful for me to grab a few days before visiting some new country/state where otherwise I'd have to carefully plan out my trip and create a whole lot of overlapping circle pocket queries, or perhaps dozens of date ranged queries (causing a whole lot more query load on the servers than a single pre-compiled GPX file I might add)...

Link to comment

 

And/Or...

 

How about GC.com performs a daily, or even weekly PQ of each state, or select geographical areas, and then allows premium members to download one of those pre-compiled state query files, over and above their normal allocation, perhaps once a week. The query would only run once no matter how many downloads, and the downloadable GPX file could be located anywhere, not necessarily on gc.com's core servers, so wouldn't necessarily affect website load/speed.

 

I travel quite a lot and when I travel, I'm pretty haphazard about where I go, on a whim, and this kind of query would be really useful for me to grab a few days before visiting some new country/state where otherwise I'd have to carefully plan out my trip and create a whole lot of overlapping circle pocket queries, or perhaps dozens of date ranged queries (causing a whole lot more query load on the servers than a single pre-compiled GPX file I might add)...

 

I travel quite a lot as well and actually enjoy doing the research, constructing PQs of the areas, and downloading maps of places I might visit. A single pre-compiled PQ might make it easier, but the amount of "work" involved, to me, isn't tedious or difficult.

Link to comment

Instead of shooting down the OP, how about trying to spiral in on a compromise.

How about if total download from PQs could be 10k caches per day, rather than 1k per PQ so that if someone wants to burn that up in 1 PQ that's up to them, and if someone else wants to perform 10 smaller PQs that's up to them. The OP might be happy with that while keeping everyone else happy too.

Hmm, interesting idea. I think I like it..... a maximum of 10K caches and 10 PQs, distribute caches amongst PQs as you see fit within those limits. Yes.

Link to comment

There are some good ideas in here, and a healthy dose of snark, which I was not expecting--wouldn't you all WANT to have a new feature like this?

 

The date-bound PQs are interesting, and I will probably do something like that for my home area, but let's be honest: that's a workaround for a systemic limitation. It's silly on the face of it. And do I have to repeat this technique everywhere I visit?

 

I also don't think a 10K limit is even high enough. 100K or higher would be more reasonable for some areas.

 

The point of my post was not really to explore the workarounds. I know they exist. I really just want the convenience feature and to gauge others' interest in it as well.

 

Perhaps this feature is tiered? Once you find 100 caches, your PQ limit increases to 10,000. Once you find 1000 caches, it increases to 100,000. That would naturally limit these queries only to the most serious of cachers and preserve processing power.

 

I would much rather be outside caching than on the computer tricking the web-site into giving me the data.

Link to comment

I don't really see the sense in having a dozen PQs for an area split by date. To me, this sounds a bit cumbersome, because I have to look through different PQs: there might be caches within my radius that are split across different PQs because of their publish dates.

 

If you're loading them into another application or into your GPS receiver, it makes no difference if it comes in 1 file or in multiple files. You can combine multiple GPX files into one as well fairly easily if the app you're using can only handle one GPX file at a time.

 

It is a bit cumbersome, I admit.
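On the "combine multiple GPX files into one" point, here is a minimal sketch of how that can be done with Python's standard library. The file names are hypothetical, it assumes plain GPX 1.0 files like those delivered in Pocket Queries, and it drops duplicate caches by their `<name>` element (the GC code):

```python
import xml.etree.ElementTree as ET

GPX_NS = "http://www.topografix.com/GPX/1/0"
ET.register_namespace("", GPX_NS)  # keep GPX as the default namespace on output


def merge_gpx(paths, out_path):
    """Merge the <wpt> waypoints of several GPX files into the first
    file's tree, skipping duplicate GC codes, and write the result."""
    merged = None
    seen = set()
    for path in paths:
        root = ET.parse(path).getroot()
        if merged is None:
            merged = root  # reuse the first file's header/metadata
        for wpt in root.findall(f"{{{GPX_NS}}}wpt"):
            code = wpt.findtext(f"{{{GPX_NS}}}name")
            if code in seen:
                continue
            seen.add(code)
            if root is not merged:
                merged.append(wpt)
    ET.ElementTree(merged).write(out_path, xml_declaration=True,
                                 encoding="UTF-8")
```

A real PQ GPX also carries Groundspeak extension elements inside each `<wpt>`; since whole waypoint elements are moved intact, those come along for free.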

 

How about if total download from PQs could be 10k caches per day, rather than 1k per PQ so that if someone wants to burn that up in 1 PQ that's up to them, and if someone else wants to perform 10 smaller PQs that's up to them. The OP might be happy with that while keeping everyone else happy too.

 

I'd like that. I think it unlikely that Groundspeak would do so, but I don't speak for them.

 

How about GC.com performs a daily, or even weekly PQ of each state, or select geographical areas, and then allows premium members to download one of those pre-compiled state query files, over and above their normal allocation, perhaps once a week. The query would only run once no matter how many downloads, and the downloadable GPX file could be located anywhere, not necessarily on gc.com's core servers, so wouldn't necessarily affect website load/speed.

 

I can think of a couple of reasons why this won't fly :

 

1) Some people live near the border of a state / province / region / country, and they may want 2 or more files.

 

2) Having a large number of caches in a single file encourages "sharing" of the files, which is strictly against the TOU. When I started geocaching, I recall coming across someone selling GPX files filled with geocaches on eBay.

 

There are some good ideas in here, and a healthy dose of snark, which I was not expecting--wouldn't you all WANT to have a new feature like this?

Welcome to the forums! :) It has been mentioned a number of times that the forums are the most unfriendly aspect of geocaching.

Link to comment

There are some good ideas in here, and a healthy dose of snark, which I was not expecting--wouldn't you all WANT to have a new feature like this?

 

Once, requests for very large PQs and/or canned pocket queries would be met with the explanation that Groundspeak didn't want to make it easy to copy their database. It is the constantly up-to-date database of geocaches that attracts people to geocaching.com and makes this the preeminent geocaching site. At one time they may have really feared people would abuse the ability to download cache information; by limiting the data you could download and making you work to get any significant number of caches, they felt they were protecting this information. You still have to agree to the license that says you will not share the data.

 

I think over time, Groundspeak has realized there is a small but active subset of geocachers who "need" a large number of caches to cover the areas they cache in. On several occasions they have increased the number of PQs and the size of each. One can now run 10 PQs of 1000 caches each per day. I remember when we could only run 5 of 500 per day.

 

In addition, they have released an API that allows premium members to get 10,000 caches per day, and people are creating new apps and tools that use the API.

 

Most people who are geocaching over a broad area and don't know in advance where they are going end up using a smartphone with an app using the API.

 

I don't believe the number of people who would be helped by Super Pocket Queries has ever been that great, and with smartphone apps I'm sure it is much smaller now.

 

The date-bound PQs are interesting, and I will probably do something like that for my home area, but let's be honest: that's a workaround for a systemic limitation. It's silly on the face of it. And do I have to repeat this technique everywhere I visit?

It does seem a silly workaround now. One advantage of the current system is that if too many caches are being returned in a circle, you can limit the results to those closest to the center. There are several ways to limit the number of caches besides distance, but the date placed was a simple one to implement. This field seldom changes, and in most areas few enough caches are placed per day that it also gives fine granularity on the number of caches being returned. (Power trails have thrown a bit of a wrench in that: 500 caches, all placed on one day, may now be published at once.)

 

It's a bit of a pain to set up the dates, but most people will do this once and only have to update it once a year or so (depending on how many new caches are placed in your area).

 

I also don't think a 10K limit is even high enough. 100K or higher would be more reasonable for some areas.

Maybe. But see the issue with Groundspeak being protective of their database. When you load up an enormous number of caches on your device, it is probably with the intention of doing it once and not keeping it up to date. What will happen then is that you will miss new caches, and you won't know whether the old caches have been disabled or archived. Most people eventually decide that having current data is more important than having a lot of data. It's far more common to decide, just before you leave, where you are going for the day and download just the caches for that area. If that doesn't work for you, then most people find that a smartphone, where the app always has the latest data, works best.

 

I understand that some people won't use a smartphone for geocaching on principle, or may be in an area without service so the smartphone needs to be preloaded anyhow. And that some people may not care that they don't have the latest caches and might sometimes have data for caches that aren't there anymore. But, forgive my snarkiness, this is the exception.

 

The point of my post was not really to explore the workarounds. I know they exist. I really just want the convenience feature and to gauge others' interest in it as well.

 

Perhaps this feature is tiered? Once you find 100 caches, your PQ limit increases to 10,000. Once you find 1000 caches, it increases to 100,000. That would naturally limit these queries only to the most serious of cachers and preserve processing power.

 

I would much rather be outside caching than on the computer tricking the web-site into giving me the data.

One nice thing in the forums is that people can share how they address these issues using existing tools with "workarounds".

 

Since Groundspeak does seem to understand that there are geocachers who feel they need a super PQ, you might get some response in the way of implementing some ideas. The most likely thing to happen is that they increase the limits on the current system since that isn't too much work. They have stated that they don't plan to make any major changes to PQs as they see the API as the modern way to get data. I think that we might see more tools like GSAK that use the API to create GPX and other types of files that can be loaded on GPS units as a replacement for PQs.

Link to comment

There are some good ideas in here, and a healthy dose of snark, which I was not expecting--wouldn't you all WANT to have a new feature like this?

Because we wouldn't use it? I can already pull down 6K unique caches with full logs and 10K more caches with 'lite' information (using 'lite' mostly to obtain a friend's not-founds to compare with my own), for a total of 16K in a day -- and that seems to be enough to serve anything I've done in the way of caching so far.
Link to comment
And do I have to repeat this technique everywhere I visit?
Only if you need the listing data for thousands of geocaches everywhere you visit.

 

There are a handful of numbers run trails with more than 1000 caches, but other than that, I'm having a tough time imagining the need for that many cache listings.

Link to comment

There are a handful of numbers run trails with more than 1000 caches, but other than that, I'm having a tough time imagining the need for that many cache listings.

"I am going on an overseas holiday to $bigplace for two weeks, I won't have mobile data, and don't know to any better accuracy than about 100km where I might be from one day to the next.."

 

(Yes, that is a real world example from personal experience.)

Link to comment
There are a handful of numbers run trails with more than 1000 caches, but other than that, I'm having a tough time imagining the need for that many cache listings.
"I am going on an overseas holiday to $bigplace for two weeks, I won't have mobile data, and don't know to any better accuracy than about 100km where I might be from one day to the next.."

 

(Yes, that is a real world example from personal experience.)

Yeah, especially if there is a 1000+ cache numbers run trail in $bigplace that fills up your PQs.

 

I've done pretty much the same thing, but 1000 caches covered a much larger radius than I was likely to travel in. But maybe my $bigplace wasn't as densely packed with caches. And it didn't have any numbers run trails.

Link to comment

"I am going on an overseas holiday to $bigplace for two weeks, I won't have mobile data, and don't know to any better accuracy than about 100km where I might be from one day to the next.."

If $bigplace is cache dense, chances are that you will have wifi where you're staying, so you can download a more targeted PQ before you set off. Or you can get a prepaid SIM card to use there for relatively cheap.

Link to comment

"I am going on an overseas holiday to $bigplace for two weeks, I won't have mobile data, and don't know to any better accuracy than about 100km where I might be from one day to the next.."

If $bigplace is cache dense, chances are that you will have wifi where you're staying, so you can download a more targeted PQ before you set off. Or you can get a prepaid SIM card to use there for relatively cheap.

 

Yeah, I can't imagine a place in the world with more than a couple thousand caches in a small area (say a 50km radius) and absolutely no cheap or free means to access the internet. If not open wifi, then an internet cafe or a coffee shop with wifi; or even McDonald's, which is worldwide? :blink:

Edited by thebruce0
Link to comment

"I am going on an overseas holiday to $bigplace for two weeks, I won't have mobile data, and don't know to any better accuracy than about 100km where I might be from one day to the next.."

If $bigplace is cache dense, chances are that you will have wifi where you're staying, so you can download a more targeted PQ before you set off. Or you can get a prepaid SIM card to use there for relatively cheap.

 

Yeah, I can't imagine a place in the world with more than a couple thousand caches in a small area (say a 50km radius) and absolutely no cheap or free means to access the internet. If not open wifi, then an internet cafe or a coffee shop with wifi; or even McDonald's, which is worldwide? :blink:

But how many people have a device that can download the PQ and then load it onto a GPS? Essentially people would be forced to geocache with smartphones, and we can't have that. The Super PQ user simply wants a GPS pre-loaded with every cache so they never have to find the free WiFi. And who's to say that the free WiFi in $bigplace doesn't block Geocaching.com? :unsure:

Link to comment

But how many people have a device that can download the PQ and then load it onto a GPS? Essentially people would be forced to geocache with smartphones, and we can't have that. The Super PQ user simply wants a GPS pre-loaded with every cache so they never have to find the free WiFi. And who's to say that the free WiFi in $bigplace doesn't block Geocaching.com? :unsure:

 

I suppose that's arguable. I wouldn't call that a Pocket Query though :P I always considered 'pocket' to be in the context of mini, concise, like 'pocket guide' for example. I think in the case of "I want to load my device with every findable cache in the wider region/state/province/country as an extended offline resource", it would be a different feature - like the previously suggested feature to provide a collection of caches exported each day or week for people to download and import on their own time. I could get behind that.

Edited by thebruce0
Link to comment

If you're loading them into another application or into your GPS receiver, it makes no difference if it comes in 1 file or in multiple files. You can combine multiple GPX files into one as well fairly easily if the app you're using can only handle one GPX file at a time.

 

It is a bit cumbersome, I admit.

 

That is the point: I have a Nokia Lumia with WP 7.5 and I use the maaloo app. The problem is that this app can apparently handle just one PQ at a time. Maybe I should try to merge some PQs into one big one, but I can't say whether my app can handle a PQ with e.g. 3,000 caches.

 

Just imagine someone wants to travel to the USA, maybe the SF area: a 1,000-cache PQ barely covers SF and some small parts of South SF (assuming the PQ covers all cache types and all D/T combinations). If you limit the PQ to traditionals with D/T of 3 or lower, the covered area is only slightly bigger, reaching the coastal areas of Oakland and Alameda.

So if you really want to travel around the Bay Area, you have to create a lot of PQs or a few heavily filtered ones (regardless of whether you merge them in the end).

Having wifi everywhere in cafes or restaurants is not really an argument against better PQ handling...

Link to comment

There are some good ideas in here, and a healthy dose of snark, which I was not expecting--wouldn't you all WANT to have a new feature like this?

 

 

Increasing the size of PQ results is going to put an additional strain on servers. If they increased it to 10,000, I suspect that a lot of geocachers would use it just because they can. I, personally, would never need to download 10,000 caches at once. I suppose it's possible, but I can't fathom someone finding that many caches in a week. There aren't that many caches in the continents of Africa and South America combined. While a small percentage might find the ability to download that many caches at once useful, it would reduce the site's performance for everyone else who has no use for such a feature.

Link to comment

Increasing the size of PQ results is going to put an additional strain on servers.

Quite the opposite, actually. If, as mentioned above, you make the limit 10k caches spread across up to 10 PQs, instead of 10 PQs of 1k each (the same total number of results), I would expect the server load to decrease significantly.

 

To get those 10k caches now (which you can do today), you need to hit the server 10 times with more complicated queries -- extra WHERE clauses in the SELECT to limit by date, for example. If one query with fewer WHERE clauses returning the same number of rows causes more load than ten more complex queries, then there is Something Very Wrong with the database[1].
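To make the argument concrete, here is a toy illustration using SQLite in Python. The schema and data are invented for the example (this is not Groundspeak's database): one unrestricted query and a series of date-ranged slices return exactly the same rows; the slices just carry extra WHERE conditions.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE caches (gc_code TEXT, state TEXT, placed TEXT)")
# 10,000 fake Washington caches, placed dates spread over 2000-2013.
con.executemany(
    "INSERT INTO caches VALUES (?, 'WA', ?)",
    [(f"GC{i:05d}", f"20{i % 14:02d}-06-01") for i in range(10_000)],
)

# One "super PQ": a single simple query.
one_big = con.execute(
    "SELECT gc_code FROM caches WHERE state = 'WA'"
).fetchall()

# The workaround: the same rows fetched as date-ranged slices.
boundaries = [f"20{y:02d}-01-01" for y in range(0, 16, 2)]  # 2000..2014
sliced = []
for lo, hi in zip(boundaries, boundaries[1:]):
    sliced += con.execute(
        "SELECT gc_code FROM caches "
        "WHERE state = 'WA' AND placed >= ? AND placed < ?",
        (lo, hi),
    ).fetchall()

assert sorted(one_big) == sorted(sliced)  # identical result sets
```

Which variant is cheaper for the server depends on the engine and the workload, as the posts below discuss; the sketch only shows that the two shapes of query are informationally equivalent.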

 

As an aside, I have heard it claimed that the 1k PQ limit is due to a commercial agreement with a certain GPS vendor. Would be interested to hear if anyone else has any information on this (or can debunk it.)

 

[1]although this would come as no great surprise....

Link to comment

Technically, it's more a variance of load. It's like the difference between loading an enormous web page with ALL the content at once, or loading it as needed. All at once is more convenient for the end user (save a somewhat longer load time), but a heavier short-term load on the server (possible slowness for simultaneous users). Loading as needed may require more bandwidth, and each smaller request is conceptually more complex, but the load is stretched out over time, so there's less risk of side effects for simultaneous users.

Or, think of it like a loan. You could buy something straight up for its selling price, or break it into payments to pay less now but ultimately more over time because of interest.

 

A 10k PQ may be 'simpler', but if everyone ran them just because they could, there would be side effects across multiple users. Even though the smaller PQs covering the same area may be more complex, they're spread out, and the experience is smoother for multiple users.

 

It's not that one option is better than another - they both have strengths and weaknesses; it's what is more feasible and desirable in context, and that's a balance between what Groundspeak is willing to offer and what the community is willing to live with.

 

With the API growing in use, and the smartphone community and its capabilities beginning to overshadow completely offline device use, it's very unlikely we'll see an improvement in PQ capabilities.

 

I'd also be tempted to consider storage space. If PQs were allowed 10k caches, storage space for the collection of all users' PQs would grow 10-fold. PQs do expire, but the question is whether Groundspeak values the PQ limit increase enough to prioritize a massive storage increase. I wouldn't be surprised if the storage for all current PQ zip files were larger than the database itself :P

 

Rather, I'd still promote the idea of providing a periodic snapshot of data for popular regions, provinces, or states, for people to download. It wouldn't be updated automatically (otherwise just use the PQ/API system), but it would be a source of data for offline users who want to store mass catalogues of cache data (as of a certain date).

Edited by thebruce0
Link to comment

How about something in between? Maybe not a 10,000-cache single PQ, but instead maybe 2500? I cache with an older Garmin GPS receiver (Colorado 400) which is limited to 2000 caches (which most of the time is completely fine for me; that covers about a 30km radius from home, but is more problematic when traveling, for reasons already stated in this thread). Anyway, with a 2500-cache limit per PQ, those of us with older Garmins (which I suspect is a fair number of GPSr-carrying geocachers) could set a single PQ with a 2000-cache limit, and others (for example those with newer Garmins, or Magellans, or maybe some app users who use GPX files; admittedly I know little about that) would only have to run 4 PQs instead of 10. The maximum number of caches per 24-hour period is not changed (still 10,000); the only change is increased convenience (as for network loads, here again I have no knowledge or comment).

No, it is not that big of a deal to set up two PQs vs. one. Or even 10, for that matter (BTW, it would take 47 1000-cache PQs to capture all 45k active geocaches in Ontario). But if it could be 1 (or 4), well, that is even more convenient. I think that is the point of the OP (well, besides the request for 100,000 caches). Statements suggesting the use of a cell phone or a secondary program are okay as a workaround, but they don't serve as a reason why small changes which may serve to increase "customer convenience and satisfaction" should be ignored.

Link to comment

Increasing the size of PQ results is going to put an additional strain on servers.

Quite the opposite, actually. If, as mentioned above, you make the limit 10k caches spread across up to 10 PQs, instead of 10 PQs of 1k each (which is the same total number of results), I would expect the server load to be significantly decreased.

 

To get those 10k caches now (which you can do today), you need to hit the server 10 times, with more complicated queries -- extra WHERE clauses in the SELECT to limit by date, for example. If one query with fewer WHERE clauses returning the same number of rows causes more load than ten more complex queries, then there is Something Very Wrong with the database[1].
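For illustration only, here is a toy in-memory SQLite version of that argument (the table and column names are invented; Groundspeak's actual schema is unknown). Ten date-windowed SELECTs return exactly the same rows as one plain SELECT; they just cost ten round trips and ten query plans.

```python
import sqlite3
from datetime import date, timedelta

# Illustrative only: a toy SQLite table standing in for the cache database.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE caches (gc_code TEXT, placed TEXT)")
base = date(2015, 1, 1)
rows = [(f"GC{i:05d}", (base + timedelta(days=i % 300)).isoformat())
        for i in range(3000)]
con.executemany("INSERT INTO caches VALUES (?, ?)", rows)

# One simple query: no date filter needed.
all_at_once = con.execute(
    "SELECT gc_code FROM caches ORDER BY gc_code").fetchall()

# Ten "PQ-style" queries: each adds WHERE clauses to stay under a cap,
# here by slicing the placed-date range into ten 30-day windows.
chunks = []
for k in range(10):
    lo = (base + timedelta(days=30 * k)).isoformat()
    hi = (base + timedelta(days=30 * (k + 1))).isoformat()
    chunks += con.execute(
        "SELECT gc_code FROM caches WHERE placed >= ? AND placed < ?",
        (lo, hi)).fetchall()

# Same rows either way; the single query is simply less work to plan and run.
assert sorted(chunks) == all_at_once
print(len(all_at_once), "caches either way")
```

The only extra cost of the single query is a bigger result set per request, which is a bandwidth and memory question, not a query-complexity one.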

 

As an aside, I have heard it claimed that the 1k PQ limit is due to a commercial agreement with a certain GPS vendor. Would be interested to hear if anyone else has any information on this (or can debunk it.)

 

[1] Although this would come as no great surprise...

With memory inexpensive, this is probably a correct assessment. However, if a query becomes large enough, splitting it into smaller pieces that each require fewer resources can pay off. When running many queries that each need a lot of memory, you run into situations where a query is blocked because the resources it needs are not available. You may even hit deadlock, or at least a lot of churning as queries are suspended so others can run. Smaller queries not only avoid these situations but can also be distributed over multiple servers more easily. It may seem counterintuitive, but smaller queries most definitely can result in more efficient use of servers than big ones.

 

However, I don't think that server load is really what constrains the size of PQs anymore. Whether or not there is some agreement with a GPS vendor, the fact remains that some older units were limited to GPX files of 1000 or 2000 geocaches. It may simply be easier to leave the size of the GPX file at 1000 so that users don't have an issue loading a GPX file on these units. Units that allow bigger files also allow multiple files to be used.

Link to comment

I'd also be tempted to consider storage space. If PQs were allowed 10k caches, storage space for the collection of all users' PQs would grow 10-fold. PQs do expire, but the question is whether Groundspeak values the PQ limit increase enough to justify a massive storage increase. I wouldn't be surprised if the storage of all current PQ zip files is even larger than the database itself :P

 

The following comments are somewhat off-topic, but I've wondered about this. What is the utility in making a PQ available for one week after it has been run? I know I would surely be willing to see this timeframe reduced if the freed-up resources could be shifted to allow for larger PQ sizes. In my case, I run a PQ when I want to replace the outdated files on my GPSr (anywhere from weekly to monthly, depending on time of year, number of recently published caches in my area, etc.). I almost always download the files within a day of running the query. I couldn't care less if the zipped PQ file sits there on the GS servers for the next 6 days. That is my experience; I'd be interested to hear what others think. BTW, I am not suggesting the timeframe be reduced to 1 day. I'm only stating my opinion that 1 week may be longer than required for many (most?) users.

Link to comment

Yeah, times have changed in 15 years, both in device memory (the 1000-2000 cache limit per NYPaddleCacher's comment) and in the 1-week availability window (perhaps based on the average internet use of those who played the game back then; internet access is now much more common and prevalent than a decade ago).

 

I'd support a small increase in PQ limit (not 10k, unless Groundspeak is positive that they can handle the load), and reducing the PQ availability time*, especially if that directly counters the larger query requirements. And yeah, I'm all for "small changes which may serve to increase 'customer convenience and satisfaction'" ;)

 

* I wonder if GS monitors PQ downloads to find out the average time a PQ sits between its last download and its expiry? They should use that to determine the expiry age.

Link to comment

I'd also be tempted to consider storage space. If PQs were allowed 10k caches, storage space for the collection of all users' PQs would grow 10-fold. PQs do expire, but the question is whether Groundspeak values the PQ limit increase enough to justify a massive storage increase. I wouldn't be surprised if the storage of all current PQ zip files is even larger than the database itself :P

We aren't (well, I'm not) talking about allowing 10-fold (or any for that matter) growth in what users can request, just the way they can get it.

 

Right now, users can request 10k caches a day, split across 10 PQs.

 

Instead, the proposal is to allow 10k caches a day, split across up to 10 PQs.

 

That could be one 10k PQ, five 2k PQs, or 10 1k PQs. The total storage remains the same, the only difference is in how it's distributed.

 

You won't get people taking ten times the resources, because after they've generated their 10k PQ, they can't generate any more that day anyway. B)

Link to comment

I don't see an increase in the size of PQs happening. Here's why:

Placing the request for the data in the "My Finds" PQ into an API operation separate from the requests for other PQs does not address the main reasons that the "My Finds" PQ is not available. It would still bring mobile devices to their knees, for example.

We've been told that one of the reasons why the My Finds PQ hasn't been made available through the API is that its size would cause problems for members with lots of finds. The same problem would exist with the proposed larger regular PQs. For this reason alone, I don't see Groundspeak increasing the size of PQs.

Link to comment

"I am going on an overseas holiday to $bigplace for two weeks, I won't have mobile data, and don't know to any better accuracy than about 100km where I might be from one day to the next.."

If $bigplace is cache dense, chances are that you will have wifi where you're staying, so you can download a more targeted PQ before you set off. Or you can get a prepaid SIM card to use there for relatively cheap.

 

Yeah, I can't imagine a place in the world where caches within a small area (say a 50km radius) would number more than a couple thousand, with absolutely no cheap or free means to access the internet. If not open wifi, then an internet cafe or a coffee shop with wifi, or even a McDonald's, which is worldwide? :blink:

 

When I travel and use GSAK to get everything within 30 miles, I regularly get results in the 3 to 5 thousand range.

Link to comment

Yeah, I can't imagine a place in the world where caches within a small area (say a 50km radius) would number more than a couple thousand, with absolutely no cheap or free means to access the internet. If not open wifi, then an internet cafe or a coffee shop with wifi, or even a McDonald's, which is worldwide? :blink:

 

When I travel and use GSAK to get everything within 30 miles, I regularly get results in the 3 to 5 thousand range.

 

You missed the key point - "with absolutely no cheap or free means to access the internet" ;P

There are plenty of areas with cache density that high, and much higher. But those are practically guaranteed to be areas of high population or high activity, where there will be places with free or cheap wifi. You'll never get that density far from any discernible amount of civilization.

Anyway, that's a minor point to begin with, countered by the fact that there are people who don't own or travel with smartphones, let alone load their offline devices via smartphone, and who may wish to load thousands of caches for extended trips. I'd say take your laptop on trips like that, but that may not be a possibility either.

Edited by thebruce0
Link to comment

How about GC.com performs a daily, or even weekly, PQ of each state or select geographical areas, and then allows premium members to download one of those pre-compiled state query files, over and above their normal allocation, perhaps once a week? The query would only run once no matter how many downloads, and the downloadable GPX file could be hosted anywhere, not necessarily on gc.com's core servers, so it wouldn't necessarily affect website load/speed.

 

I can think of a couple of reasons why this won't fly :

 

1) Some people live near the border of a state / province / region / country, and they may want 2 or more files.

 

Those people could have PQ(s) set up for the other state/country, or could download both files in alternate weeks...?

 

2) Having a large number of caches in a single file encourages "sharing" of the files, which is strictly against the TOU. When I started geocaching, I recall coming across someone selling GPX files filled with geocaches on eBay.

 

I don't know whether it "encourages" sharing of files. We all agree to the TOU, and frankly if someone wanted to share geocache data, they can do so right now. If anything, NOT having such a facility encourages data sharing!

Link to comment

To put it simply, I agree with the original poster. I am trying to do the initial load of all the caches in my home state and have to split my Pocket Queries by date. You can only run 10 PQs per day. For some reason, no matter what PQ run settings I use, only about three run per day. It's really frustrating, and I just wish I could run one query for my entire state, then just have my weekly update query run each week after that.

Link to comment
