
5 queries per day limit


bones3


I was surprised after paying for a premium membership that pocket queries are limited to 5 per day. Some days I'd like to run more than 5, and other days I wouldn't run any at all. I'd rather have a per month or per week limit rather than per day, that way I could run 10 or so one day and not run any until the next week. Maybe something to think about...?


This made me curious as to what was happening in north Alabama. I checked the area and found that to reach the 2,500-cache limit from Huntsville, you'd have to go close to 83 miles out. At that same distance out, I've got about 3,600 caches.

 

</sarcastic voice off>

 

Sorry, I couldn't resist. The idea with the Pocket Queries is not so that you have a complete listing of caches every day. That's why the limitation is in place. Remember you can pull 2500 caches PER DAY, so (if you created your queries correctly) in a week's time, you could have 17,500 caches and then start them updating again the next week. So - as Prime suggested, plan ahead and choose your data requests wisely.


The idea with the Pocket Queries is not so that you have a complete listing of caches every day. That's why the limitation is in place. Remember you can pull 2500 caches PER DAY, so (if you created your queries correctly) in a week's time, you could have 17,500 caches and then start them updating again the next week. So - as Prime suggested, plan ahead and choose your data requests wisely.

 

Does anyone have any suggestions as to how to go about building a series of PQs to download every cache listing for a specific state? I'm thinking of the DeLorme and similar challenges here, where it would be really handy to set out on an epic road trip with a notebook, and a GSAK database of everything in the state. Then as you reach new target areas, use the GSAK center and filter features to load the GPSr with all the local caches.
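The center-and-filter step described above can be sketched outside GSAK too. This is a hypothetical stand-in for what GSAK does, not its actual implementation: filter an in-memory cache list by great-circle (haversine) distance from the current target area.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in statute miles."""
    r = 3958.8  # Earth's mean radius in miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def caches_near(caches, center_lat, center_lon, radius_miles):
    """Keep only the caches within radius_miles of the current target area."""
    return [c for c in caches
            if haversine_miles(c["lat"], c["lon"], center_lat, center_lon) <= radius_miles]
```

For example, `caches_near(db, 34.73, -86.59, 30)` would pull roughly a 30-mile circle around Huntsville out of a statewide database.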

 

The GSAK stuff I can manage, but since PQs are based on radius, you would either have a lot of overlap, or lots of missing cache listings. The gaps wouldn't really matter of course, so long as you had enough caches to fulfill the challenge region, but I'm something of a completist <_<

 

The alternate plan is to drive around residential communities as I reach each area looking for unsecured WiFi points, and manually download GPX files as I need them. :laughing:


The idea with the Pocket Queries is not so that you have a complete listing of caches every day. That's why the limitation is in place. Remember you can pull 2500 caches PER DAY, so (if you created your queries correctly) in a week's time, you could have 17,500 caches and then start them updating again the next week. So - as Prime suggested, plan ahead and choose your data requests wisely.

 

Does anyone have any suggestions as to how to go about building a series of PQs to download every cache listing for a specific state? I'm thinking of the DeLorme and similar challenges here, where it would be really handy to set out on an epic road trip with a notebook, and a GSAK database of everything in the state. Then as you reach new target areas, use the GSAK center and filter features to load the GPSr with all the local caches.

 

The GSAK stuff I can manage, but since PQs are based on radius, you would either have a lot of overlap, or lots of missing cache listings. The gaps wouldn't really matter of course, so long as you had enough caches to fulfill the challenge region, but I'm something of a completist <_<

 

The alternate plan is to drive around residential communities as I reach each area looking for unsecured WiFi points, and manually download GPX files as I need them. :laughing:

 

A series of PQs covering the entire state and only the state: pull 500 at a time based on date placed instead of radius.

 

However, for the challenge you need only find one cache per page, so it shouldn't be hard to find one interesting cache to visit just using the Google Maps feature.


I think using the route feature in Google Earth and then finding the caches along a route feature on GC, you should be able to satisfy most of the DeLorme Challenge page requirements. If I recall correctly, one page in AL had only one or two caches. It's easy enough to download those odds and ends separately and not have to worry about those that are too far off the chosen path.


A series of PQs covering the entire state and only the state: pull 500 at a time based on date placed instead of radius.

 

I wasn't aware of the date placed option. That sounds like the best approach, and it will certainly be a LOT easier than anything I had come up with. Thanks!

 

However, for the challenge you need only find one cache per page, so it shouldn't be hard to find one interesting cache to visit just using the Google Maps feature.

 

Well, my thinking is that if I'm driving that far, I sure the heck am going to do more than one per region. After all, why not see more of what there is to see in each area? Besides, I'm a numbers freak :laughing:


A series of PQs covering the entire state and only the state: pull 500 at a time based on date placed instead of radius.

 

I wasn't aware of the date placed option. That sounds like the best approach, and it will certainly be a LOT easier than anything I had come up with. Thanks!

 

However, for the challenge you need only find one cache per page, so it shouldn't be hard to find one interesting cache to visit just using the Google Maps feature.

 

Well, my thinking is that if I'm driving that far, I sure the heck am going to do more than one per region. After all, why not see more of what there is to see in each area? Besides, I'm a numbers freak :laughing:

 

After you have got a full set of caches for your area downloaded into GSAK, change the PQs to only get the ones updated in the last 7 days. This will keep your GSAK up to date easily.


Going back to the OP, and the follow-on question of why anyone would need more than 5 PQs in a day: here's why I like the OP's idea. About twice a year, I get to take a vacation to a new area, often in another country. In preparation for one of these tours, I will do PQs that cover the entire area. One example: I recently took a tour to New Zealand. By the time I found that crossing the 180-degree longitude caused a problem, and then figured out where to place center points to cover the country, I needed more than the 5 PQs per day, and the total search took 2 days to complete. Not a big deal, since I start planning 5 months out, but still a slight inconvenience. Normally, I use 1 PQ per week to receive updates of local caches, and if I were to go to an increased daily limit, I would severely limit the monthly total to a lot less than the 150 PQs normally available. I have a feeling that limiting increased PQs to only a certain day of the week when server load is minimal might be difficult, so that option probably wouldn't work.


No one has mentioned the true way to work within the system to get this fixed.

 

It sounds like most of the premium members that use pocket queries can manage by writing efficient pocket queries and planning ahead, but there are a few that, for short periods of time (or for whatever reason), want more than 2,500 per day or 17,500 per week. :)

 

If that's the case - these "Power Pocket Query Users" could create a second account and pay the extra cost for a second premium membership. If you create the queries correctly using both accounts, you can get 5,000 caches per day and 35,000 caches per week. The U.K. has about 15,167 caches, New Zealand has about 2,577, and Australia has 8,751. Heck, California has 35,257 - which would put it over the 35,000 per week - but 1,928 of them are temporarily disabled, leaving 33,329. If that's STILL not enough, then pay the extra $3 for a one-month subscription and get a third account.

 

By doing that, the people who feel they NEED the extra pocket query ability are helping support the infrastructure that they are using to a higher capacity than most of the premium members.

 

It's a little like paying more for his electric bill than the neighbors that don't use quite so much electricity. :)

Edited by Markwell

After you have got a full set of caches for your area downloaded into GSAK, change the PQs to only get the ones updated in the last 7 days. This will keep your GSAK up to date easily.

 

If you follow that procedure, how will you know when a cache is archived?

I do something similar: I run the "base" queries about once a week, then run the "updated in last 7 days" queries daily.

 

That way nothing is more than a week old, but I have current data for a larger circle than I would get if I just downloaded the closest 500 daily.

 

Closest 500 only gives me about a 30-mile circle (it used to be smaller), whereas updated in the last 7 days gives me well over a 100-mile circle.

 

This gives me quite a bit of extra data, since I like to "day trip" at the last minute all the time.

 

Archived? More than 7 days since the last .gpx; that's an easy filter in GSAK.
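That staleness check is straightforward to reproduce outside GSAK as well. A minimal sketch, assuming each record carries the date of the last GPX file that mentioned it (the `last_gpx` field name is hypothetical, not a GSAK column):

```python
from datetime import date, timedelta

def probably_archived(caches, today, max_age_days=7):
    """Return caches not seen in any GPX for more than max_age_days.
    If the weekly base queries all ran and a cache still didn't show up,
    it was most likely archived (or fell out of the query's scope)."""
    cutoff = today - timedelta(days=max_age_days)
    return [c for c in caches if c["last_gpx"] < cutoff]
```

Anything flagged this way is worth a quick check on the cache page before deleting it from the database, since a query that simply failed to run produces the same gap.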

 

Doing this, I run 4 queries a day (including the daily one) on most days (3 on a couple of days), and this gives me coverage for my entire state and the entire state next to me (I live right on a state line), plus I have a "free spot" to run at least one extra query per day if I want.

... In preparation for doing one of these tours, I will do PQs that cover the entire area. One example, I recently took a tour to New Zealand. By the time I found that crossing the 180 degree longitude caused a problem, and then figured out where to place center points to cover the country, I had a need for more than the 5 PQs per day and the total search took 2 days to complete. ...
I bet that every one of us has done what you describe (or had similar errors that caused a PQ to be 'bad'). Luckily, there is a solution: when you build a PQ, don't set a day for it to run. Run it online to make sure that it is giving you the data that you need. Once you know that it is good, set the day for it to run.

 

BTW, as StarBrand suggested, there's an easier way than finding 'center points' to give you all of the caches. Instead, set the PQs to return the caches by date placed, rather than distance to a given point. Adjust the dates so each PQ returns just under 500 caches and you'll be able to get all of them very easily with no duplication. Also, you'll be able to reuse the PQs for this area without the fear that new caches will mess them up. New caches might require you to enter an end date on the latest one and create a new PQ for new caches, but they won't cause the PQs for earlier date ranges to grow. In fact, you can often tweak those 'earlier' PQs to account for archived caches and not have to add a new PQ, at all.
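The date-splitting can even be automated once you have an initial dump of placed dates (from a first round of PQs, say). A sketch of a greedy binning under the 500-cache cap; the caveat in the comments about a single very busy day is my own assumption, not anything the site documents:

```python
from datetime import date

def pq_date_ranges(placed_dates, cap=500):
    """Greedily bin sorted placed dates into (start, end) ranges of at
    most `cap` caches each, one range per date-filtered Pocket Query.
    A range never splits one placed date across two PQs, because the
    PQ date filter works in whole days."""
    dates = sorted(placed_dates)
    ranges = []
    i = 0
    while i < len(dates):
        j = min(i + cap, len(dates))
        if j < len(dates):
            # Pull the boundary back to the start of the date it lands on.
            while j > i and dates[j] == dates[j - 1]:
                j -= 1
            if j == i:
                # A single day holds more than `cap` caches; no clean split
                # exists, so this PQ would overflow in practice.
                j = min(i + cap, len(dates))
        ranges.append((dates[i], dates[j - 1]))
        i = j
    return ranges
```

Each resulting (start, end) pair becomes the date-placed filter for one PQ, with no overlap between queries.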


Thanks for the suggestion about using dates instead of radius from a certain point. That works MUCH better, no overlapping data! I don't need 2,500 caches per day, but I was having so much overlap by using radius from a zip code that I did need more than 5 queries with 500 caches each to get complete results for my area.

Edited by bones3

I currently have all the caches in Canada loaded in GSAK. I did it (as noted above) with a series of PQs by date and named them (NB01, NB02... NS01, NS02...) and so on. Once I had them all loaded, I now only run the last one in the series for most Provinces; for NB and NS, where I do most of my caching, I run all of them. By only running the last one, I capture the new caches. If I were to take a trip to a Province other than NB or NS, I would set all the PQs for that province to run and also filter out the archived caches.

 

Markwell is right, you need to run the PQs fully to see what caches are archived. I do this in NB and NS and then use a macro to pull out the caches that have not been updated in the last X days depending on when my PQs run.

 

I have all the dates on a spreadsheet. Anyone who wants a copy can email me through my profile.

 

Joe


Yeah, I used to manage the closest 7,500 caches, until I got tired of the constant fiddling and my complete lack of geocaching skill. The most I ever found in a day was 19. So now, I just take a shot at where I am heading and let the 190 that come in be enough.

 

Just because they have an 'all you can eat' menu, doesn't mean you have to put your hog suit on...


I currently have all the caches in Canada loaded in GSAK. I did it (as noted above) with a series of PQs by date and named them (NB01, NB02... NS01, NS02...) and so on. Once I had them all loaded, I now only run the last one in the series for most Provinces; for NB and NS, where I do most of my caching, I run all of them. By only running the last one, I capture the new caches. If I were to take a trip to a Province other than NB or NS, I would set all the PQs for that province to run and also filter out the archived caches.

 

Markwell is right, you need to run the PQs fully to see what caches are archived. I do this in NB and NS and then use a macro to pull out the caches that have not been updated in the last X days depending on when my PQs run.

 

I have all the dates on a spreadsheet. Anyone who wants a copy can email me through my profile.

 

Joe

You never know when you might need to go to Saskatchewan.


I wish they would limit the number of caches you could download per day instead of the 5 PQ limit.

 

That would be much, much worse, especially for newer cachers. I may look new, but I have been a tag-along for ages and just finally got an old GPS; download limiting would really put a crimp on things. After doing the maximum of 5 PQs, you can still download GPX files. I know of some people who just hate doing PQs and like the main search page.


I wish they would limit the number of caches you could download per day instead of the 5 PQ limit.

 

That would be much, much worse, especially for newer cachers. I may look new, but I have been a tag-along for ages and just finally got an old GPS; download limiting would really put a crimp on things. After doing the maximum of 5 PQs, you can still download GPX files. I know of some people who just hate doing PQs and like the main search page.

 

You now have a limit of 5 PQs a day, each with a maximum of 500 caches, giving a total of 2,500 caches. What I propose is to limit the total number of caches you can download (using PQs).

So I would be able to make 1 PQ containing 2,500 caches, or 10 PQs each containing 250 caches, or 1 PQ containing 1,500 and a second PQ containing 1,000 caches.

 

My 2 eurocents.


I wish they would limit the number of caches you could download per day instead of the 5 PQ limit.

 

That would be much, much worse, especially for newer cachers. I may look new, but I have been a tag-along for ages and just finally got an old GPS; download limiting would really put a crimp on things. After doing the maximum of 5 PQs, you can still download GPX files. I know of some people who just hate doing PQs and like the main search page.

 

You now have a limit of 5 PQs a day, each with a maximum of 500 caches, giving a total of 2,500 caches. What I propose is to limit the total number of caches you can download (using PQs).

So I would be able to make 1 PQ containing 2,500 caches, or 10 PQs each containing 250 caches, or 1 PQ containing 1,500 and a second PQ containing 1,000 caches.

 

My 2 eurocents.

This ignores the "oops!" factor - accidentally keying in 2500 when you meant 250, and then not being able to run any more PQs for the next 24 hours.


 

This ignores the "oops!" factor - accidentally keying in 2500 when you meant 250, and then not being able to run any more PQs for the next 24 hours.

 

Yes, and after that you know better...

 

LOL, I am suddenly reminded of some bad Java I wrote once...


After you have got a full set of caches for your area downloaded into GSAK, change the PQs to only get the ones updated in the last 7 days. This will keep your GSAK up to date easily.

 

If you follow that procedure, how will you know when a cache is archived?

 

Instant notification of archiving: all you need to do is set up notifications centred on your home, and you get every archival within 50 miles. You do need to remember to set one for each cache type. :laughing:

This topic is now closed to further replies.