
GSAK PQ's and the Archived cache


Recommended Posts

My PQs load into GSAK OK and I am made aware of unavailable caches, but I am not notified of caches that have been archived.

 

Is there a filter I can set for this?

For example: if a cache is not updated, immediately delete the waypoint/cache from the database.

 

It's something I've been wondering about for a couple of weeks or so, and I've noticed other people have this problem. How do you deal with it?

 

Discuss :D

Link to comment

I don't delete the database when I load new PQs because the Past Logs will keep building up and sometimes you need more than just the most recent five logs for a cache to get that little hint enabling you to find an elusive cache. :D

 

After refreshing my GSAK database, I do the "Last Update .gpx filter" from the Date tab in GSAK. That filter will return all the caches that did not update with the last PQ. You can check each of those online to make sure they were either Archived or Disabled and then Delete all the waypoints in that filter from your database.
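GSAK filters have their own interface, but the logic behind that Last-GPX check is easy to sketch in Python (the cache codes and dates below are made up for illustration):

```python
from datetime import date

# Hypothetical cache records: GC code -> date of the last GPX file that
# contained the cache (GSAK's "Last GPX" column).
caches = {
    "GC1AAA": date(2007, 5, 14),
    "GC1BBB": date(2007, 5, 14),
    "GC1CCC": date(2007, 5, 7),   # missing from the latest PQ load
}

latest = max(caches.values())

# Equivalent of the Date tab's "Last Update .gpx" filter: everything whose
# Last GPX date is older than the most recent load is a deletion candidate.
stale = [gc for gc, d in caches.items() if d < latest]
print(stale)  # ['GC1CCC']
```

Anything the filter returns is only a candidate: it may have been archived or disabled, or it may simply have been missed by the PQ, which is why each one should be checked online before deleting.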

Link to comment

I am useless with macros :lol: Is there somewhere I can get info on writing my first macro? Maybe someone's done it already :lol:

There is a lot of info on the GSAK Forums about handling archived caches. There is a Macro Library that lists macros to help deal with them (here and here are examples). Try reading GSAK 301 macros for help in writing one.

 

The "problem" happens because GC.com doesn't send any info about archived caches in PQs, so some manual steps must be used.

Link to comment

In the GSAK macro library there is a macro that will set all caches in the database to archived. The idea behind that is that any caches updated from the PQ will be un-archived. I'm at my work computer so I can't tell you the name of the macro, but it should be easy to find :lol:
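The macro itself is written in GSAK's macro language, but the idea is simple enough to sketch in Python (cache codes here are hypothetical):

```python
# Sketch of the "mark everything archived, then un-archive what the PQ
# refreshes" idea. Names are illustrative, not the actual GSAK macro.
caches = {"GC1AAA": {"archived": False},
          "GC1BBB": {"archived": False},
          "GC1CCC": {"archived": False}}

# Step 1: assume every cache in the database is archived.
for c in caches.values():
    c["archived"] = True

# Step 2: the fresh PQ contains only live caches; clear the flag for them.
pq_contents = ["GC1AAA", "GC1BBB"]   # GC1CCC was archived, so it's absent
for gc in pq_contents:
    caches[gc]["archived"] = False

# Whatever is still flagged was not in the PQ.
print([gc for gc, c in caches.items() if c["archived"]])  # ['GC1CCC']
```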

Edited by Postman Pat
Link to comment

You don't have to use a macro. Just sort by the "Last GPX" field and delete all the ones that aren't with the most recent. If you get the inactive and active ones, the ones that don't update are the archived.

Not quite true. If you have new caches in a PQ, older ones (still active) may be "pushed out" of the PQ: the closest 500 PQ may reach out to, say 12.8 miles last week, but only 12.7 miles this week (due to the large number of new caches placed). The caches between 12.7 and 12.8 may still be active, but won't update.
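A toy model of that shrinking-radius effect, sketched in Python (the distances are invented):

```python
# Toy model of a "closest N" PQ: a new hide inside the circle shrinks the
# radius, so an active cache near the old edge stops updating even though
# nothing was archived.
def closest_n(caches, n):
    """Return the n caches nearest home (caches are (gc, miles) pairs)."""
    return set(gc for gc, miles in sorted(caches, key=lambda c: c[1])[:n])

last_week = [("GC_A", 1.0), ("GC_B", 5.0), ("GC_EDGE", 12.8)]
this_week = last_week + [("GC_NEW", 0.5)]   # new cache placed close to home

old = closest_n(last_week, 3)
new = closest_n(this_week, 3)
pushed_out = old - new
print(pushed_out)  # {'GC_EDGE'} - still active, but it won't update
```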

Link to comment

But that's when you have to know that your PQs are going to be less than 500 - especially since the GSAK macro relies on the Last GPX date. :unsure:

I don't understand how that changes Jester's point. My home area PQ has 450 closest (less than 500) and I still have the same issue. Because of new caches placed inside the circle, the active caches on the outside of the circle are not getting updated.

 

To deal with the issue, I've deleted all caches that weren't updated in the last PQ, knowing that I'd be getting rid of some useful information (the non-updated, non-archived caches) because there's no easy way to filter them so they stay. I've traded them for the convenience of keeping the known 450 closest active caches.

 

I only get 450 instead of 500 because my old GPSr (Garmin Vista) had a favorites folder that had 20-30 favorite locations waypointed that I wanted to keep. My new GPSr (Vista Cx) doesn't have that folder, so I might as well learn how to load POIs and use the date filters to get the closest 2500 caches instead. :)

Link to comment

Because this area is so "cache-rich," and because I can head in different directions while running my errands, I use locations as my centerpoints. I have more than 1300 caches in my database of caches less than 35 miles from my house. I have other databases that cover areas beyond that distance.

 

If I am heading in a southwesterly direction from home, I will request the one or two PQs that cover that area. Then I use one of the caches in that area as a centerpoint and filter down to fewer than 450 caches for my Vista C. After I have that filter, I do the "Last 2 DNF" filter to get rid of any caches that haven't been found by the last two cachers. Then I do the "Last .gpx" date filter to make sure any that I am putting into my GPSr haven't been Archived since the last time I got PQs for that area.

Link to comment

OK - looking at the pertinent posts:

 

StarBrand

Use a macro to filter out caches that did not update (were not in) a PQ.

Markwell

You don't have to use a macro. Just sort by the "Last GPX" field and delete all the ones that aren't with the most recent. If you get the inactive and active ones, the ones that don't update are the archived.

The Jester

Not quite true. If you have new caches in a PQ, older ones (still active) may be "pushed out" of the PQ: the closest 500 PQ may reach out to, say 12.8 miles last week, but only 12.7 miles this week (due to the large number of new caches placed). The caches between 12.7 and 12.8 may still be active, but won't update.

Markwell

But that's when you have to know that your PQs are going to be less than 500 - especially since the GSAK macro relies on the Last GPX date

 

What I was saying is that if you're relying on the macro to update your database to eliminate the archived caches, then it's doing the SAME THING as sorting by the Last GPX date and deleting the older caches.

 

It is true that if you have a cache-radius PQ that maxes out over the 500 limit, this will also eliminate the farthest caches from the GPX update. In that case, the macro will, in essence, treat these farthest-out caches like they have been archived. The Last GPX date will not update, and they will look just like archived caches to both the manual sort and the macro.

 

SO, if you want to be sure that your caches aren't being excluded because of a truncated radius, make sure that your PQs are going to be less than 500.

 

The other point I forgot to mention is to split your PQs by date ranges of the date placed. If your "early" PQs are less than 500 (say 485) and your "recently placed" PQ is set somewhere WAY less than 500, then everything should be stable in that radius, and the ONLY reason a cache would get pushed out would be if it were archived.
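A sketch of the bookkeeping behind that date-placed split (the ranges and counts below are invented for illustration):

```python
# Splitting PQs by "date placed" keeps each one safely under the 500-cache
# cap. Old date ranges are stable (caches can only leave them by being
# archived); only the newest range grows as caches are hidden.
pq_ranges = [
    ("placed 2000-01-01 .. 2004-12-31", 485),
    ("placed 2005-01-01 .. 2006-12-31", 460),
    ("placed 2007-01-01 .. today",      120),  # the only one that grows
]

# If any range creeps toward the cap, split it; otherwise the only reason
# a cache drops out of these PQs is that it was archived.
needs_split = [label for label, count in pq_ranges if count >= 490]
print(needs_split)  # [] while every range still has headroom
```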

 

EITHER WAY, the macro or the sort method still relies on the Last GPX Update date.

Link to comment

Sorry, didn't mean to cause such an "uproar" - what I meant to object to is the "those that don't update are the archived". I was just pointing out that some caches could be not-updated and still be active.

 

My own macros use the distance range of the PQ (set user flag, filter to user flag, sort by distance, look at the farthest) and check the caches within that range. I never look at the Last GPX Update. I did it this way because my DB has about 1500 caches ranging out about 40 miles, but my closest-to-home PQ can only return 500 and ranges out about 12.7 miles. I also use the Online Page in the split screen to manually look at each non-updated cache's page - just so I can see why (i.e. read the last logs).
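That distance-range check, sketched in Python rather than GSAK's macro language (the field names and distances are illustrative):

```python
# Instead of trusting Last GPX dates, find how far the latest PQ actually
# reached, then only scrutinize non-updated caches inside that radius.
db = [
    {"gc": "GC_A",    "miles":  3.0, "in_last_pq": True},
    {"gc": "GC_EDGE", "miles": 12.7, "in_last_pq": True},
    {"gc": "GC_B",    "miles": 10.0, "in_last_pq": False},  # inside reach
    {"gc": "GC_FAR",  "miles": 30.0, "in_last_pq": False},  # beyond reach
]

pq_reach = max(c["miles"] for c in db if c["in_last_pq"])  # farthest updated

suspects = [c["gc"] for c in db
            if not c["in_last_pq"] and c["miles"] <= pq_reach]
print(suspects)  # ['GC_B'] - only these need a manual look online
```

Caches beyond the PQ's reach (like the hypothetical GC_FAR) are left alone, since there is no reason to expect them to have updated.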

Link to comment
It is true that if you have a cache-radius PQ that maxes out over the 500 limit, this will also eliminate the farthest caches from the GPX update. In that case, the macro will, in essence, treat these farthest-out caches like they have been archived. The Last GPX date will not update, and they will look just like archived caches to both the manual sort and the macro.

 

SO, if you want to be sure that your caches aren't being excluded because of a truncated radius, make sure that your PQs are going to be less than 500.

I'm still not getting it.

 

If you only have the 10 closest caches in your PQ, and someone hides one closer than any of them, then the 10th cache will no longer be included in your PQ. The macro, or the Last GPX Date filter, will remove that 10th cache.

 

So it doesn't matter how far below the 500 limit you are with your PQ request, right?

Link to comment

True, but if you're using a buckshot approach to try to capture *ALL* of the caches in a particular area, you wouldn't limit it to 10 caches - you'd use the 500 limit, and then preview the PQ to make sure it shows fewer than 500. Then when new caches are added that fit the criteria, you should still capture them.

 

If you're using that (usually automated) method, AS LONG AS THE PQ RESULTS AREN'T LIMITED BY QUANTITY - the only reason a cache should drop off would be if it were archived.

 

If you limit the quantity (either by manually making it 10, or by the fact that there are more than 500 caches that match your criteria), then you WILL get caches dropping off for other reasons.

 

Since I try to keep my GSAK DB up-to-date with the area in which I might cache, I try not to have caches drop off the list for any reason OTHER than archiving.

 

Does that make my position on this clearer? I'm really NOT trying to argue, just explaining.

 

But the key on this is that this is how the standard GSAK macro works as well.

Link to comment

Not arguing here either, just trying to understand, so I appreciate the patience. :o

 

I think I do see now what I didn't see before. My PQs aren't filtered in a way that limits the quantity to less than 500 based on anything other than distance. I get a PQ of 450 caches within 100 miles of my home coords - and the furthest one is probably less than 30 miles away. This is why new caches would push the furthest caches out of my PQ, and they'd get deleted when trying to rid myself of archived caches.

 

Perhaps the issue will go away when I start filtering them by date AND distance, in an effort to maximize the number of caches I can keep as POIs in my GPSr.

Link to comment
