
Pocket Queries For Geocache Saturated Areas


JohnTee

Recommended Posts

How do you go about setting up PQs for areas with dense cache saturation? I thought about breaking an area up into quadrants and running a PQ from the center of each quadrant . . . but wondered if I might lose some caches in the corners of some of the quadrants. There would be a lot of duplicates, but GSAK would deal with them without trouble.
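For scale, here's a rough back-of-the-envelope sketch (plain Python; the area dimensions are made up) of how large each quadrant's PQ radius would have to be so the circle still reaches that quadrant's corners:

```python
# Rough sketch, not any official tool: estimates the minimum PQ radius
# needed so a circular query centered on each quadrant still reaches
# that quadrant's corners. Area dimensions below are made up.
import math

def min_pq_radius(area_width_mi, area_height_mi):
    quad_w = area_width_mi / 2          # each quadrant is half the width...
    quad_h = area_height_mi / 2         # ...and half the height of the area
    # A circle centered on a quadrant covers its corners only if the
    # radius is at least half the quadrant's diagonal.
    return math.hypot(quad_w, quad_h) / 2

# Example: a 60 x 60 mile area split into four 30 x 30 mile quadrants
# needs roughly a 21.2 mile radius per PQ (not 15) to avoid corner gaps.
print(round(min_pq_radius(60, 60), 1))
```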

 

JohnTee

Link to comment

How do you go about setting up PQs for areas with dense cache saturation? ...

I break up my PQs in cache-dense areas by the date the cache was placed.

Link to comment

A bit more elaboration on what awhsom said. If you want every cache in an area that likely has more than 500, use the date range for cache placement.

Preview it. Ask for all caches placed between Jan 1, 2000 and Dec 31, 2003, and take a look at what the preview is giving you (jump to the last page; obviously you want fewer than 25 pages). Then run another PQ from Jan 1, 2004 forward.
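If you'd rather work the cut-off dates out ahead of time, here's a rough sketch (plain Python; the field names and sample data are made up for illustration) of bucketing an existing export by placed date so each range stays under the 500-cache cap:

```python
# Rough sketch: split caches into placed-date ranges so each pocket
# query stays under the 500-cache cap. Assumes you already have a list
# of (gc_code, placed_date) pairs from some export; that format and the
# sample data below are hypothetical.
from datetime import date

PQ_CAP = 500

def date_buckets(caches, cap=PQ_CAP):
    """Return (start_date, end_date, count) ranges, each at most `cap` caches."""
    ordered = sorted(caches, key=lambda c: c[1])      # oldest placed first
    buckets, start, count, last = [], None, 0, None
    for gc_code, placed in ordered:
        if start is None:
            start, count = placed, 0
        count += 1
        last = placed
        if count == cap:                              # close this range
            buckets.append((start, placed, count))
            start, count = None, 0
    if start is not None:                             # leftover partial range
        buckets.append((start, last, count))
    return buckets

# Tiny example with fake data and a cap of 2, just to show the shape.
caches = [("GC100", date(2001, 5, 1)), ("GC200", date(2004, 7, 9)),
          ("GC300", date(2006, 2, 3))]
for start, end, n in date_buckets(caches, cap=2):
    print(f"PQ: placed {start} through {end} -> {n} caches")
# In practice, start the next PQ's date range the day after the previous
# range's end date so the two queries don't overlap.
```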

Link to comment

A bit more elaboration on what awhsom said. If you want every cache in an area that likely has more than 500, use the date range for cache placement. ...

 

No need to jump to the last page of the preview... just look for this at the top of the preview page:

 

[screenshot: the total result count shown at the top of the PQ preview page]

Link to comment

Thanks awhsom, Isonzo Karst and Stunod. I'll give that a try then. The only thing more frustrating than not getting all of the caches for an area downloaded is not having up-to-date caches and trying to search for something that has been archived.

 

I'm going to try to get something set up for weekly updates so I can update GSAK, CacheMate and my GPSr.

 

JohnTee

Link to comment

How do you go about setting up PQs for areas with dense cache saturation? ...

 

I use a slightly different technique than many. I have my queries broken down not by cache age, but by container size and cache type. One query returns all regular and large containers in a 75-mile radius, and my 500 limit actually gets me to just over 40 miles. The next gives me small and micro caches in the same 75-mile radius, and it caps out just below 45 miles from home. One more gives me multi-stage and "unknown" caches, which comes in just below the 500 limit (by fewer than 10 caches). I also have one for events, virtuals, etc. By the time the dust settles, I have just over 2200 caches in my GSAK database, and the furthest is less than 75 miles from my home. Granted, I miss a few of the caches over 40 miles from home, but I seldom go that far without preplanning, so I can run a custom query to bring back more caches in that area. This is how I got my total GSAK database over 2000 in the first place.

 

I also do a couple of other things differently than many others. My "main" PQs only return active caches. Then I have a PQ I run once a week that gives me all inactive caches in my search radius, but I only use it to "UPDATE" existing records rather than to add new ones. This way I learn when a cache is deactivated (within a week), so I know not to look for it any longer.
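If you wanted to mimic that "update only, don't add" idea outside of GSAK, a rough sketch might look like the following (the database layout and field names are invented for illustration; this is not GSAK's actual mechanism):

```python
# Rough sketch of "update existing records only": apply an inactive-only
# PQ to a local database keyed by GC code, flagging known caches as
# unavailable without ever adding new records. Data layout is made up.

def apply_inactive_pq(db, inactive_codes):
    """Mark caches listed in the inactive-only PQ as unavailable; skip unknowns."""
    for code in inactive_codes:
        if code in db:                       # update only, never add
            db[code]["available"] = False
    return db

db = {"GC1ABC": {"name": "Oak Hollow", "available": True},
      "GC2DEF": {"name": "River Walk", "available": True}}
apply_inactive_pq(db, ["GC2DEF", "GC9ZZZ"])  # GC9ZZZ isn't in the db, so it's ignored
print(db["GC2DEF"]["available"])             # False -> don't go looking for it
```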

Link to comment

That's fine until you hit the 500 mark for traditional micros, as will be the case in our area. :blink:

 

By doing the date sort, you can set it up and let it run for quite some time before having to tweak the queries, and you can let GSAK or some other software spin the files into whatever subset you want.

 

For example: do the date range thing, set it up to capture the caches within a standard radius, say 45 miles, and divide the PQs by date. Then in GSAK filter on traditional micros and save as a GPX file. Then filter on traditional regular-sized and save as a GPX file. Lather, rinse, repeat.
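Outside of GSAK, the same "filter, then save one file per subset" step could be sketched roughly like this (the cache records and field names below are made up; writing real GPX is left to whatever tool you already use):

```python
# Rough sketch: group a merged cache list into (type, size) subsets, one
# output file per subset, e.g. traditional micros vs. traditional regulars.
# The dictionaries and field names here are hypothetical.
from collections import defaultdict

def split_by_subset(caches):
    subsets = defaultdict(list)
    for cache in caches:
        subsets[(cache["type"], cache["size"])].append(cache)
    return subsets

caches = [
    {"code": "GC1AAA", "type": "Traditional", "size": "Micro"},
    {"code": "GC1BBB", "type": "Traditional", "size": "Regular"},
    {"code": "GC1CCC", "type": "Multi", "size": "Regular"},
]
for (ctype, size), group in split_by_subset(caches).items():
    filename = f"{ctype.lower()}_{size.lower()}.gpx"     # lather, rinse, repeat
    print(filename, [c["code"] for c in group])          # write real GPX here
```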

Link to comment

I must be missing something here. I still don't see how breaking PQs by date placed is more efficient than splitting by cache size/type. By splitting by size/type, I can run the PQ several times a week for the most common ones, and as little as once a week for the least common, so my GSAK data is most current for the most common cache sizes/types I will be seeking. If I split by cache placement date, I have to run all the PQs to get the updates for one cache size/type, because they are split across all PQs.

 

In other words, I currently run 3 queries 4 times a week and always have the most recent (within the last 2 days) information/updates for the cache sizes/types I most commonly seek. If I change to splitting the PQs by date placed, I either need to run 6 PQs a day on some days, or not have the most recent updates for some of the caches in my "high probability" target group. One last note: I sorted my GSAK DB by date placed, and the screen showing the oldest caches had logs for every one of them within the last 2 months.

 

Perhaps we can discuss this some time so that I can better understand what I'm missing... Maybe at a GONIL M-n-G?

Link to comment

The dates will not change. You will never need to adjust the lower dates, because it is impossible for more caches to be placed between Jan '01 and Jan '03, for instance.

 

Going by size and container type will constantly be in flux.

 

The dates the caches were placed may not change, but the most recent logs will change. I use the previous log information to assist in finding some caches. If a cache gets muggled, you can see logs and owner notes indicating this, provided you are updating the caches via your PQs. If you run one query once to get a list of all caches from 2005 and earlier, and a cache later gets disabled or re-enabled, you wouldn't know unless you re-run that PQ. But per your suggestion, I can't re-run all those PQs regularly, since I need 6 of them to cover the area I currently cover with my queries by cache size.

 

As I mentioned in my last post, the 25 oldest caches in my GSAK DB (placed 2003 or before) have all been logged within the last 2 months. In order to have that information in my GSAK database, I need to run those PQs. Since I can choose to run my events and virtuals less frequently (as they update less frequently), I narrow down the number of queries I need to run regularly to keep the logs in my GSAK DB current.

Link to comment

I must be missing something here. I still don't see how breaking PQs by date placed is more efficient than splitting by cache size/type....

 

It's not more efficient; it's just another way to break the problem down into manageable bites.

 

Personally, when there are more caches than I can find, I split it out so that I'm looking for larger caches. Time enough for micros when I run out of the big ones.

Link to comment

I must be missing something here. I still don't see how breaking PQs by date placed is more efficient than splitting by cache size/type....

 

It's not more efficient; it's just another way to break the problem down into manageable bites.

 

...

 

OK, that's what I was thinking, but they kept implying that my method of splitting the files won't work well in "saturated areas" like we live in (Markwell is in my general area). My Small/Micro PQ only returns caches that are active and that I haven't found. My query hits the 500-cache limit at about 42-43 miles from my home coordinates. Yeah, this means that I don't have all the Small & Micro caches within a 50-mile radius of my house, but I still have the 500 closest ones, and if I run out of those, I also have the closest 500 Regular & Large caches, and almost 500 more Multi & Unknown/Puzzle caches to keep me busy for a while. :mad:

Link to comment

Another idea . . . make GSAK work for you!

 

Run them all for the entire area/city. Using GSAK, export them to MS Streets & Trips. Look at them on the map and simply pick areas to work, sweeping across the city from one side to the other.

 

To make it a more targeted workout, limit by terrain/difficulty . . . maybe run all the 3.5/3.5 and lower caches first, then go after the higher-rated ones.

 

Remember, once in GSAK, you can move your center by making any selected cache the new centerpoint and working out from that point . . . GSAK is grand, especially on an on-board laptop!

Edited by GRANPA ALEX
Link to comment

OK, I just set up my queries to run by date range. After the first download picks up all the active caches for those dates, does it sound right to check "updated in the last 7 days" to limit the size of each successive download (I'm doing this on a weekly basis)? I also have one query that provides all new caches in the last week.

 

Will I stay up to date?

Link to comment

I like the date range idea. I may give that a shot.

 

I used Google Earth, measured out 4 spots, each about 30 miles from the center of Nashville, with a little slop room for good measure, and run PQs for 30 miles out from each of these spots. I get a total of about 1350 caches, GSAK sorts out the dupes, and I end up with about 1060 or so. I weed out the last 70 (most distant) or so because I can only stick about 990 of them in my Garmin. This works for me, but only because I've already found more than a thousand caches in the area.
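For anyone curious, the dedupe-and-trim part could be sketched roughly like this in Python (the home coordinates, field names, and the 990 limit are just placeholders, not anything GSAK actually exposes):

```python
# Rough sketch: merge overlapping PQs, drop duplicates by GC code, and
# keep only the closest N caches for a GPSr with a waypoint limit.
import math

HOME = (36.1627, -86.7816)   # placeholder center point (downtown Nashville)

def miles_between(a, b):
    """Great-circle distance in miles between two (lat, lon) pairs."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 3959 * 2 * math.asin(math.sqrt(h))

def merge_and_trim(pq_files, limit=990):
    unique = {}
    for caches in pq_files:                  # each PQ is a list of cache dicts
        for cache in caches:
            unique[cache["code"]] = cache    # later copies overwrite the dupes
    ranked = sorted(unique.values(),
                    key=lambda c: miles_between(HOME, (c["lat"], c["lon"])))
    return ranked[:limit]                    # closest caches only

# Tiny fake example: the duplicate GC1AAA is kept once, then both survive the trim.
pq_a = [{"code": "GC1AAA", "lat": 36.20, "lon": -86.80}]
pq_b = [{"code": "GC1AAA", "lat": 36.20, "lon": -86.80},
        {"code": "GC1BBB", "lat": 36.30, "lon": -86.60}]
print([c["code"] for c in merge_and_trim([pq_a, pq_b])])
```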

Link to comment

OK, I just set up my queries to run by date range. ... I also have one query that provides all new caches in the last week.

 

I just realized the query for new caches in the last week has a flaw: it won't update the new caches after 500 have accumulated. So I modified it to be a separate query that loads all new caches placed 1-1-07 to 3-30-07. It should not go above 500 before then. Then on April 1st, I'll create another query going forward to, let's say, June 30, 2007, depending on the number of new hides.

 

Does anyone notice any problems with this?
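One rough way to keep an eye on that window (a Python sketch run against a local export; the export format and field names are assumptions, not GSAK's actual ones):

```python
# Rough sketch: count how many caches in a local export were placed inside
# the current "new caches" date window and warn before it fills the
# 500-cache cap. Dates and sample data below are made up.
from datetime import date

PQ_CAP = 500
WARN_AT = 450        # leave headroom so one busy week doesn't overflow the PQ

def window_status(placed_dates, window_start, window_end):
    in_window = sum(window_start <= d <= window_end for d in placed_dates)
    if in_window >= WARN_AT:
        return f"{in_window} caches in window - time to start the next date range"
    return f"{in_window} caches in window - still comfortably under {PQ_CAP}"

placed = [date(2007, 1, 15), date(2007, 2, 2), date(2007, 3, 10)]
print(window_status(placed, date(2007, 1, 1), date(2007, 3, 30)))
```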

Link to comment

OK, I just set up my queries to run by date range. ... Will I stay up to date?

 

I have been doing this for some time now and my GSAK is always up to date. I run them at least twice a week.

 

The PQ will provide all changes to the cache descriptions, as well as logs that are dated before the 7-day period but were entered in the last 7 days. It will also give you all the new caches approved in the last 7 days.

 

You can then reduce the number of PQs you run, but watch out: during busy times, they may go over the 500 limit.

Edited by Crew 153
Link to comment
