
Question for PQ users


mambero


The current system works quite well for me. I maintain an OLDB and use Date Placed to eliminate overlap.

 

I live in a cache-intense area and it takes 19 PQ's to cover the area I normally cache. The 19th PQ has a date range that ends in the future; this picks up the new caches. I have 7 copies of the 19th PQ and each one runs weekly. I spread the remaining 18 PQ's over 6 days, running 3 per day. This leaves me 1 PQ per day for random queries. It took a while to set this up, and twice a year I adjust the date ranges. Obviously, increasing PQ's to 1,000 caches will be a bonus. Thank you.
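The date-split approach described above can be sketched in code. This is a minimal illustration, not anything Groundspeak or GSAK actually provides: it assumes you have exported (cache code, Date Placed) pairs from your offline database, and it greedily cuts them into consecutive date ranges that each hold at most the PQ cache limit, leaving the last range open-ended so it keeps catching new placements.

```python
from datetime import date

# Hypothetical export from an offline database (e.g. GSAK):
# (cache code, date placed) pairs. Codes and dates are made up.
caches = [
    ("GC0001", date(2003, 5, 1)),
    ("GC0002", date(2005, 8, 14)),
    ("GC0003", date(2008, 2, 2)),
    ("GC0004", date(2009, 11, 30)),
]

def date_ranges(caches, per_query=500):
    """Cut caches, sorted by Date Placed, into consecutive chunks of
    at most `per_query` caches and return each chunk's (start, end)
    dates. The last range gets end=None, i.e. it runs into the
    future so it keeps picking up newly placed caches."""
    ordered = sorted(caches, key=lambda c: c[1])
    ranges = []
    for i in range(0, len(ordered), per_query):
        chunk = ordered[i:i + per_query]
        last = i + per_query >= len(ordered)
        ranges.append((chunk[0][1], None if last else chunk[-1][1]))
    return ranges
```

With `per_query=2` the four sample caches split into two ranges, the second open-ended. In practice you would re-run this a couple of times a year, as the post describes, because the open-ended range slowly fills up; a real split also must not cut between two caches placed on the same day, since PQ date filters work on whole days.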

 

With this system, my data is at most 6 days old.

 

I use GSAK to determine which caches I want to find on a particular trip and then use the "Send to GPX" option to refresh the data for those caches. Typically this is never more than 5 or 6 caches, so it is not a big deal.

 

With very few exceptions, all of my PQ's are waiting in my email when I wake up. I live in the Eastern Time Zone and the PQ's usually run at around 3:30 a.m. (12:30 a.m. PST). However, it is no big deal if they arrive later.

 

Reading these various threads gives one an appreciation for how complex the PQ system is.

 

My observation is that I doubt any system will be ideal for everyone. Geocaching is so diverse and everyone has their own style. A system that spreads the workload more evenly over the server farm would obviously be nice. If it means some cachers get their PQ's earlier, great.

 

The real bonus to me is the improved information sharing by Groundspeak. For that, everyone is a winner.

Link to comment

personally i would like to see an option to filter for caches by distance from specified coordinates - not only "within xxx km/miles" as we have it now, but also being able to give a minimum distance required. in other words, a PQ returning caches that are "between xxx and yyy km/miles away" from the specified point. this would make creating multiple PQs to cover a larger area much easier than using the date-hidden approach. but i know that's never gonna happen :)
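For what it's worth, the "between xxx and yyy km" ring filter is easy to apply client-side once you have coordinates loaded from a GPX file. A rough Python sketch (the function names and the great-circle formula choice are mine, not any Groundspeak feature):

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    p1, p2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(p1) * cos(p2) * sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

def in_ring(cache, centre, min_km, max_km):
    """True if a (lat, lon) cache lies between min_km and max_km
    from the (lat, lon) centre point."""
    d = haversine_km(centre[0], centre[1], cache[0], cache[1])
    return min_km <= d <= max_km
```

The same caveat applies as with concentric circles, though: the ring boundaries need re-tuning by hand as cache density grows.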

Link to comment

The current system works quite well for me. I maintain an OLDB and use Date Placed to eliminate overlap.

 

I live in a cache-intense area and it takes 19 PQ's to cover the area I normally cache.

I am interested to know how large an area you query for that you need 19?

Link to comment
personally i would like to see an option to filter for caches by distance from specified coordinates - not only "within xxx km/miles" as we have it now, but also being able to give a minimum distance required. in other words, a PQ returning caches that are "between xxx and yyy km/miles away" from the specified point. this would make creating multiple PQs to cover a larger area much easier than using the date-hidden approach. but i know that's never gonna happen :)

You'd still have the same issue...and a problem. You'd have to adjust the inner circle to hold just under the cache limit, then adjust each outer circle.

 

The problem is that, if you're not paying attention, any one circle's cache count can grow beyond the maximum. With date-based division, this can't happen unless some goof back-dates his placements. Only the latest PQ would grow; the rest would shrink from attrition.

Link to comment

The current system works quite well for me. I maintain an OLDB and use Date Placed to eliminate overlap.

 

I live in a cache-intense area and it takes 19 PQ's to cover the area I normally cache.

I am interested to know how large an area you query for that you need 19?

 

200km. 9650 caches. And that is with several hundred on my Ignore List. And over 2,000 found.

Link to comment

The current system works quite well for me. I maintain an OLDB and use Date Placed to eliminate overlap.

 

I live in a cache-intense area and it takes 19 PQ's to cover the area I normally cache.

I am interested to know how large an area you query for that you need 19?

 

200km. 9650 caches. And that is with several hundred on my Ignore List. And over 2,000 found.

 

 

Wow! That is a ton, in not a huge area.

 

I live on an island that isn't 50 miles across, and the Atlantic is 100 yards or so south of me. Plus I filter out NJ and CT in my 50-mile radius, which is more of a semicircle because of the ocean. I do 5 PQ's and then 1 more for the eastern end of the island.

Link to comment
personally i would like to see an option to filter for caches by distance from specified coordinates - not only "within xxx km/miles" as we have it now, but also being able to give a minimum distance required. in other words, a PQ returning caches that are "between xxx and yyy km/miles away" from the specified point. this would make creating multiple PQs to cover a larger area much easier than using the date-hidden approach. but i know that's never gonna happen :)

You'd still have the same issue...and a problem. You'd have to adjust the inner circle to hold just under the cache limit, then adjust each outer circle.

 

The problem is that, if you're not paying attention, any one circle's cache count can grow beyond the maximum. With date-based division, this can't happen unless some goof back-dates his placements. Only the latest PQ would grow; the rest would shrink from attrition.

That's something I wondered about.

PQ's for when you go on holiday.

Run the first PQ to get the nearest 500 caches, note how far out that reaches (xx miles), then set a second PQ for xx to yy miles to get the next 500.

But have to agree, for regular PQ's it would be a bit of a pain in the ...

Link to comment

On the right day is close enough for me.

 

Traditionally, PQs would arrive semi-unpredictably on the day requested. Brand-new PQs showed up the fastest, while older ones would take somewhat longer. That was known and understood, and logical if you think about it; a brand-new one is one that you're waiting for.

 

Probably now, they ALL show up faster than they used to. This is plenty great, thanks.

Link to comment

The current system works quite well for me. I maintain an OLDB and use Date Placed to eliminate overlap.

 

I live in a cache-intense area and it takes 19 PQ's to cover the area I normally cache.

I am interested to know how large an area you query for that you need 19?

 

200km. 9650 caches. And that is with several hundred on my Ignore List. And over 2,000 found.

 

 

Wow! That is a ton, in not a huge area.

 

I live on an island that isn't 50 miles across, and the Atlantic is 100 yards or so south of me. Plus I filter out NJ and CT in my 50-mile radius, which is more of a semicircle because of the ocean. I do 5 PQ's and then 1 more for the eastern end of the island.

I guess I'm in a very rich area, as I get 5000 caches within 30 miles (48 km). And that's low, because I've found 3200 within that circle. And, I restrict the terrain rating to 3.0 or less! This takes 11 PQs (split by date placed) run over three days. I'm looking forward to the day the 1000 limit is implemented.

Link to comment

The current system works quite well for me. I maintain an OLDB and use Date Placed to eliminate overlap.

 

I live in a cache-intense area and it takes 19 PQ's to cover the area I normally cache.

I am interested to know how large an area you query for that you need 19?

 

200km. 9650 caches. And that is with several hundred on my Ignore List. And over 2,000 found.

 

 

Wow! That is a ton, in not a huge area.

 

I live on an island that isn't 50 miles across, and the Atlantic is 100 yards or so south of me. Plus I filter out NJ and CT in my 50-mile radius, which is more of a semicircle because of the ocean. I do 5 PQ's and then 1 more for the eastern end of the island.

 

Mine is a semicircle as well. I do not include New York State caches within 200 km; that would include Buffalo and Rochester.

 

"Cache intense" does not equate to "cache rich". There are a lot of crappy caches in those 9650.

Link to comment
This topic is now closed to further replies.