
Pocket Queries: Best technique for blanketing an area?


edelen


I live in the Greater Cincinnati area. We have a particularly vibrant geocaching community, and as a result there are probably 1700+ caches within a sixty mile radius of downtown Cincinnati.

 

I use GSAK to keep track of all the caches in the rough oval running E/W around Cincinnati, but I'm getting frustrated having to run nine queries to cover the area.

 

At issue here are concentrations of caches and the 500 cache limit imposed on queries. You would think that four or five thirty mile radius circles would cover the area, but that simply doesn't work. Some query circles have far more than 500 caches in them, while others do not. Trying to find the right spot to center a circle is maddening, and I really don't want to have to run nine queries to blanket the area. Circles aren't a particularly effective search, either, since covering an area with circles necessitates a lot of overlap, compounding the 500 cache limit problem. As a result, I have nine circles of varying sizes, many of which are forced to overlap.

 

GSAK allows one to search directionally, but the Geocaching.com queries don't. It would be nice to draw a rectangle around the area I wish to search, then search into the rectangle from the corners, using directionality. This would reduce the number of queries, the amount of overlap, and stray caches outside the search area. Increasing the 500 cache limit seems logical, too, since I suspect I could reduce the number of queries I run down to five or less, which has got to be better for the Geocaching.com query server than the nine I'm running right now.

 

Does anyone have any good techniques for building queries that blanket an area while reducing overlap, also accounting for large pockets of cache concentration? Managing nine query results is wasting my time.


You say there are 1700+ in a 60 mile radius. You should be able to get all of those in four queries by dividing the query date. The first one would run from the beginning of time (1/1/2000) to whatever date gets you about 490 caches. You want to allow room for caches that are offline to come back without bumping into the 500-cache limit. The next query in the sequence would start the next day and end when you again get 490 or so caches. Continue until you have all of the caches in the radius.

 

EDITED TO ADD: The reason this technique is superior in general is because once you set up the earlier dated PQs you don't have to worry about them ever getting larger. The number of caches placed between two dates in the past can only stay the same or get smaller--except taking into account those caches that come out of retirement. Secondly, it's easy to eliminate duplicates.
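The technique described above can be sketched in code. This is a minimal illustration, not anything Geocaching.com or GSAK provides: assuming you have the placed dates of your local caches (e.g. exported from GSAK) as `datetime.date` objects, it accumulates whole days into ranges of at most 490 caches each, which become the "placed between" bounds of your Pocket Queries.

```python
from collections import Counter
from datetime import date, timedelta

def split_by_placed_date(placed_dates, limit=490):
    """Partition caches into contiguous placed-date ranges, each
    holding at most `limit` caches, so each range can be one PQ.
    Returns a list of (start_date, end_date, cache_count) tuples.
    Whole days are never split between two ranges."""
    per_day = sorted(Counter(placed_dates).items())
    ranges = []
    start, count = date(2000, 1, 1), 0   # "beginning of time"
    for day, n in per_day:
        # close out the current range before it would exceed the limit
        if count + n > limit and count > 0:
            ranges.append((start, prev_day, count))
            start, count = prev_day + timedelta(days=1), 0
        count += n
        prev_day = day
    if count:
        ranges.append((start, per_day[-1][0], count))
    return ranges
```

The margin below 500 (490 here) is exactly the headroom mentioned above for archived caches that come back online inside an old date range.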

 

Hope this helps.

Edited by CoyoteRed

I believe that a popular solution is running your pocket queries using the "date placed" function, e.g. one PQ for all caches placed between '01 and '03, one for '04, one for '05, one for '06, etc...

 

You'll have to fudge with the settings a little before you find which dates maximize your results, but this way you won't have duplicate caches between PQs...

 

Does that make sense? Can someone explain this better? :)

 

EDIT: Yup... what CR said :)

Edited by Cache Heads
Does anyone have any good techniques for building queries that blanket an area while reducing overlap, also accounting for large pockets of cache concentration?

Markwell: PQ Tips and Tricks

 

If you're looking to get more than 500 caches (e.g. you're in a cache-dense area), you can split the PQ up by date ranges. Make the first one Jan 1 2000 - Jun 30 2003 (or whatever date works) and the second one Jul 1 2003 (again, play with it, but make it one day later than the ending date of the other query) to Dec 31 2008. Splitting them up by date placed lets each query use its maximum potential to grab data. While GSAK and Watcher will both eliminate duplicate data, why have a query pull extra data only for it to be eliminated? Some cachers try to use overlapping circles plotted out in Microsoft Streets and Trips. While that works for grabbing caches along a route, it seems inefficient for a standard query of your home area.

You say there are 1700+ in a 60 mile radius. You should be able to get all of those in four queries by dividing the query date. The first one would run from the beginning of time (1/1/2000) to whatever date gets you about 490 caches. You want to allow room for caches that are offline to come back without bumping into the 500-cache limit. The next query in the sequence would start the next day and end when you again get 490 or so caches. Continue until you have all of the caches in the radius.

 

By "date," do you mean the date the cache was placed?


Placed-by-date PQ generation (dates in day-month-year format):

PQ   From         To             Count
01   21-1-2001    2-11-2003      490
02   3-11-2003    15-8-2004      489
03   16-8-2004    4-5-2005       490
04   5-5-2005     2-11-2005      490
05   3-11-2005    7-5-2006       486
06   8-5-2006     13-9-2006      482
07   14-9-2006    Maximum date   101

 

This is an example from all the caches in the Netherlands that I have in my database.
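One property worth checking when you maintain a table like this by hand: each range should start exactly one day after the previous one ends, so no cache can fall into two queries or into none. A small sketch (the ranges below are transcribed from the table above, using day-month-year dates; the open-ended final PQ is left out since it has no fixed end date):

```python
from datetime import date, timedelta

# Closed date ranges from the table above: (start, end, count).
ranges = [
    (date(2001, 1, 21), date(2003, 11, 2), 490),
    (date(2003, 11, 3), date(2004, 8, 15), 489),
    (date(2004, 8, 16), date(2005, 5, 4), 490),
    (date(2005, 5, 5), date(2005, 11, 2), 490),
    (date(2005, 11, 3), date(2006, 5, 7), 486),
    (date(2006, 5, 8), date(2006, 9, 13), 482),
]

def check_contiguous(ranges):
    """Return True if every range starts the day after the previous
    one ends -- i.e. the ranges have no gaps and no overlaps."""
    for (_, prev_end, _), (start, _, _) in zip(ranges, ranges[1:]):
        if start != prev_end + timedelta(days=1):
            return False
    return True
```

This also explains why duplicates are easy to eliminate with this technique: with contiguous, non-overlapping date ranges there should be none to begin with.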


Excellent! Thank you everyone. :)

 

I was actually able to cut the queries down to three (and lose a few caches out on the very fringes of town that I would never get to anyway).

 

My question should be a FAQ, I think. I didn't see one about this in the FAQ list.

 

(And Jurgen, thanks for the GSAK macro. It helped much!)
