
Pocket Query strategies



What strategies do you use to create your pocket queries?

 

I am in Massachusetts and would like my GSAK database to hold all traditional and multi caches within 100 miles, updated weekly so that the current finds are there too.

 

Should I create the PQs according to dates placed within the area, or is it easier to create them based on a series of central points? And once I have downloaded the cache data, do I just need to run them once per week and click the "has been updated in the last week" switch?


If you create them as a series of date-placed ranges, you won't have to adjust the queries nearly as frequently. It's also a more efficient way of querying the database: as long as the date ranges don't overlap, no single cache can be included in more than one query.

 

Take a read here for a more detailed explanation.
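
To make the date-splitting concrete, here is a rough sketch of the arithmetic behind it, assuming you can export the placed dates of all caches in your area (from GSAK, for instance). The function and the 490-cache budget are purely illustrative, not a Geocaching.com or GSAK feature.

```python
def split_date_ranges(placed_dates, limit=490):
    """Partition placed dates into non-overlapping ranges of at most
    `limit` caches each -- one Pocket Query per returned range."""
    dates = sorted(placed_dates)
    ranges = []
    i = 0
    while i < len(dates):
        j = min(i + limit, len(dates))
        # Pull the cut point back so one calendar day never spans two queries.
        while i < j < len(dates) and dates[j - 1] == dates[j]:
            j -= 1
        if j == i:  # degenerate case: a single day holds more than `limit` caches
            j = min(i + limit, len(dates))
        ranges.append((dates[i], dates[j - 1]))
        i = j
    return ranges
```

Each returned (start, end) pair becomes the date-placed range of one PQ; because the cut points fall between distinct dates, the ranges can't overlap, which is exactly what keeps the queries duplicate-free.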


That is what I do. I found a "centerpoint" that would give me a 50-mile radius instead of using my Home Coordinates, since I have the ocean on one side and a foreign country to the south :blink: I set up the PQs by "Date Placed," trying to get about 490 caches in each. It takes eight PQs to cover a 100-mile circle in this "cache-rich" area . . .


Yes, I thought the search-by-date-placed strategy would eliminate any duplicates, which of course makes the PQs more efficient.

 

Am I correct in thinking that once I have my "date placed" PQs created and have run them once, I can change them to add the stipulation that I only want data on the caches that have been updated in the previous week? I would think this way my database would never be more than a week out of date.


No ... you let them send you all the caches active in that date range. The number will never "grow" (well, leave a little headroom, but not much) because you can't publish a cache in the past (though on rare occasions a cache will be unarchived). I have mine include both active and inactive caches (so I see them go disabled or enabled) and exclude caches that I've already found. I load these into GSAK using the "always" load option so it refreshes the caches with each new receipt. I also download the "All my finds" special query weekly to bring in the caches I've already found. I spread the queries out across the week, running the older ones once a week, and I run the most recent date-range query (which has a "future" end date for its range) daily so that I catch any new publications.

 

You can then write a quick search query that finds any cache listings that haven't been updated in over two weeks (by GPX file update date) and that you haven't found, and delete everything that matches (those are caches that were archived and thus are not showing up any more). Or, if you're someone who likes to keep all cache information even for archived caches you didn't find, you can write a quick macro to mark them as archived instead.
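
As a sketch of that cleanup filter, over a hypothetical in-memory copy of the database (GSAK itself would express this as a filter or macro; the field names here are made up for illustration):

```python
from datetime import datetime, timedelta

STALE = timedelta(weeks=2)

def prune_stale(caches, now=None):
    """Split cache records into keepers and likely-archived listings:
    not found by you, and not refreshed by any PQ in over two weeks."""
    now = now or datetime.now()
    kept, stale = [], []
    for c in caches:
        # c["last_gpx"]: datetime when a PQ last refreshed this listing
        # c["found"]: whether you have logged a find on it
        if not c["found"] and now - c["last_gpx"] > STALE:
            stale.append(c)
        else:
            kept.append(c)
    return kept, stale
```

Whether you then delete the stale list or just flag those records as archived is the choice described above.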

 

I cover a 75-mile radius around my house with 8 queries using this approach and never automatically run more than 2 queries a day, which leaves lots of room for ad hoc cache runs.

 

Oh yeah ... I keep everything in GSAK.

 


Edited by Lasagna

Thank you for all of your input.

 

To get all of the desired caches within 75 miles of my center point I had to create 20 queries, probably because I am in a cache-dense area. I essentially eliminated the event caches.

 

I didn't exclude the caches I have placed (since I don't have that many yet) or my finds, since I wanted them updated regularly too (as well as running the special My Finds PQ every week or so).

 

I set up all of the queries so that they each return about 490 caches, and I run 3 or 4 per day.

 

I also figured out how to run the GSAK macro that reads the PQs in from my Gmail account automatically. Pretty sweet, although it does take a couple of minutes.
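
For anyone curious what that mail-fetching step amounts to, here is a minimal sketch using Python's standard imaplib rather than the GSAK macro itself. The sender filter and folder name are assumptions, and it presumes PQs arrive as .zip attachments, as they did by email at the time.

```python
import email
import imaplib
import os

def fetch_pq_zips(user, password, outdir="pq_inbox"):
    """Download .zip attachments from unread Pocket Query mails."""
    os.makedirs(outdir, exist_ok=True)
    imap = imaplib.IMAP4_SSL("imap.gmail.com")
    imap.login(user, password)
    imap.select("INBOX")
    # Assumed search criteria: unread mail from geocaching.com.
    _, data = imap.search(None, '(FROM "geocaching.com" UNSEEN)')
    for num in data[0].split():
        _, parts = imap.fetch(num, "(RFC822)")
        msg = email.message_from_bytes(parts[0][1])
        for part in msg.walk():
            name = part.get_filename()
            if name and name.lower().endswith(".zip"):
                with open(os.path.join(outdir, name), "wb") as f:
                    f.write(part.get_payload(decode=True))
    imap.logout()
```

From there the downloaded zips can be pointed at GSAK's normal GPX import.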


I set them up by dates, even if going somewhere on vacation and setting up a PQ with a 35-mile radius. Usually there are more than 500 caches in an area like that, and by using dates you get no duplicates.

 

An example is Long Island. I set up 2 PQs for 900 caches in a 25-mile radius. No overlap. Markwell's link is very helpful.


I create my PQs to reflect which caches I want to search for. I have a PQ that only shows micros. My next PQ shows only smalls and regulars. Then I have one just for large. I have one that shows everything, and I filter that PQ down in GSAK before transferring to my unit. I have some PQs that are set up just for specific towns. A PQ just for TBs is handy. This way, whatever mood I'm in or whatever time I have to cache, a PQ is ready that meets my needs. You could also just have one large PQ containing all cache types and do your filtering in GSAK. I also adjust each PQ to return as close to 500 waypoints as possible. For example, my micro PQ is set for 10 miles from home and hits 500 waypoints easily, but my large PQ is set for 25 miles and only returns around 300 waypoints. The combinations are endless and should reflect your needs and wants.
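
The same per-size split can also be done locally after one big PQ instead of with separate queries; a tiny sketch, assuming each cache record carries a container field:

```python
from collections import defaultdict

def by_container(caches):
    """Group hypothetical cache records by container size."""
    groups = defaultdict(list)
    for c in caches:
        groups[c["container"]].append(c)  # e.g. "Micro", "Small", "Regular"
    return groups
```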


One caveat for date-placed PQs: your combined database is only complete, in terms of radius, out to the radius of the smallest query.

 

I just switched to date-placed queries. It takes 8 PQs to get all caches within 124 km of my house, in spite of my having found 1900 caches. Seven of the PQs return caches beyond the 124 km radius, but since only one query reaches out that far, I am missing some, but not all, of the caches beyond that point.

 

So, periodically, check to see which query is the "snuggest" and be sure to "trim" your combined database to that radius. Otherwise, you might miss some caches if you go on a trip to the outer edge.
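
A sketch of that trim, assuming records carry decimal-degree coordinates; the distance math is the standard haversine formula, and the field names are invented:

```python
import math

def haversine_km(a, b):
    """Great-circle distance in km between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * math.asin(math.sqrt(h))

def trim(caches, center, radius_km=124):
    """Keep only caches inside the radius of the snuggest query."""
    return [c for c in caches if haversine_km(center, c["coords"]) <= radius_km]
```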

 

Another piece of advice: if you completely update all the PQs weekly, then check to see which caches in your master database have not been updated by one of those queries. Those are probably caches that have been archived. You can do this by looking at the Last GPX Date in GSAK.


Yes, I thought the search-by-date-placed strategy would eliminate any duplicates, which of course makes the PQs more efficient.

 

Am I correct in thinking that once I have my "date placed" PQs created and have run them once, I can change them to add the stipulation that I only want data on the caches that have been updated in the previous week? I would think this way my database would never be more than a week out of date.

 

Yes and no. "Updated" only means there was some activity on the cache in the last week; if there was none, that listing is not sent and never gets refreshed.

 

Most GSAK users just run the queries once a week, then delete any caches that are not in the PQs, since those have probably been archived. Of course, make sure that filter doesn't include the caches you have found, or those will be deleted too.
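
That delete-what-didn't-return step is just a set difference over waypoint codes; a minimal sketch, with the three inputs assumed to be lists of GC codes:

```python
def likely_archived(db_codes, pq_codes, found_codes):
    """Codes in the database that no fresh PQ returned and that you
    haven't found -- the listings that have probably been archived."""
    return (set(db_codes) - set(pq_codes)) - set(found_codes)

# e.g. likely_archived(["GC1A", "GC2B", "GC3C"], ["GC1A"], ["GC3C"])
# -> {"GC2B"}
```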

