
Pocket Query Also Archived Caches?


m.zielinski


I also use GSAK to document my found caches for myself, make notes, etc. But when generating my pocket queries, I found that about 100 of my found caches are missing; it seems those are the "archived" ones.

Is there a way to get a GPX file with them?

 

Also, because of the 500-waypoint limit, it would be nice to be able to specify something like "found by me until" a given date, to get, for example, my last 500 finds.

Link to comment

There currently isn't a way to get archived caches in PQs, but in the future, a PQ for your finds is supposed to deliver ALL your finds.

 

Right now, if you REALLY want them, you'll need to go to each cache's page, download each one individually as a .gpx file, then merge them into one file.

Link to comment

Errr . . . one of the answers I needed. Got a little overly enthusiastic; indoors too much today . . . not enough sunlight and oxygen to the brain . . . must go caching.

 

Anyway, I am trying to figure out an approach for dealing with archives as well, but my issue is a little different (although I had the same one too).

 

Scenario: an active cache has a record in GSAK. The cache becomes archived. The cache's information is no longer exported; therefore, archived caches appear active in GSAK.

 

I've tried changing the PQs several different ways, but no luck. I went to three caches today, only to find when I got back online that they had been archived.

 

The only thing I have been able to come up with, and I don't really like it, is deleting all the records in the GSAK database prior to uploading the new PQ records.

 

Suggestions? Ideas? Thoughts?

Link to comment

If you are using GSAK, just click on the heading of the "last update" column.

 

This will sort all your caches by the date they were last updated. At the very beginning of the list will be dates that are older than all the rest (that is, older than the date of your last GPX load). In my experience, these caches are all archived ones (you can always confirm this via right-click, "Show online").

 

You can then set them to archived yourself (right-click, "Toggle archive status").

Link to comment

I have a huge database in GSAK myself, and I've had the same problem with Watcher before as well. On a trip to Cincinnati, we searched for several caches without knowing they were archived because of this problem. Fortunately, I came up with a solution for this in GSAK earlier today. Hopefully something like this will work with Watcher as well.

 

I tend to add one state at a time to GSAK, so it might not work as well for you.

 

First, make a filter similar to the pocket queries you'll be loading and export it as a GPX file. Next, edit the GPX file with a text editor like WordPad: search and replace 'archived="False"' with 'archived="True"'. Save the file and reload it into GSAK using the Always option instead of Newer Only. At this point all of those caches will be marked archived in GSAK. Then load the new pocket queries with Always still selected; after this, your list should match the site's list. You can switch back to Newer Only afterwards. Unless you know there are some recently archived caches, you probably won't need to repeat this for a while.
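
If you'd rather not do the search-and-replace by hand each time, it can be scripted. Here is a minimal sketch in Python, assuming the exported GPX stores the flag exactly as archived="False" / archived="True" as described above (the function name is mine):

```python
# Flip every archived="False" flag in a GPX export to archived="True",
# assuming the attribute appears exactly as shown (as in the post above).
from pathlib import Path

def mark_all_archived(src: str, dst: str) -> int:
    """Write a copy of src with all caches marked archived.

    Returns how many caches were flipped."""
    text = Path(src).read_text(encoding="utf-8")
    flipped = text.count('archived="False"')
    Path(dst).write_text(
        text.replace('archived="False"', 'archived="True"'),
        encoding="utf-8",
    )
    return flipped
```

You would then load the resulting file into GSAK with Always selected, exactly as in the manual procedure.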

 

So... is there any reason I don't want to use the Always option for this? It seems to work great for me so far and I still seem to have all the old logs, etc.

 

I hope this helps you!

Link to comment

Thanks ClydeE & Man in The Wild. I'll tinker around with those suggestions today. I'm still figuring out the best way for me to use GSAK, but both sound good.

 

Is this something that changed recently with the PQs? With GSAK? Both? It appears that at one time I was getting archive exports. I haven't been using GSAK that long . . . or, another possibility is that there is a difference between archived and inactive. Maybe I was getting inactive caches and not archived ones; hadn't thought of that, I'll have to check it out. Hmmm . . .

Link to comment

Knowing which caches have been archived is one of the things that remains a problem for PQ users. There are several ways of dealing with the issue, of course.

  1. Don't keep reusing old PQ files. This isn't a solution to the problem, but it is the only workaround that exists and will eliminate the stale-file problem.
  2. Include archived caches in PQs. This has been discussed on many occasions, with both the complete and abbreviated <wpt> ideas covered. All signs point to archived caches not being included in general PQs ("my finds" PQs being a possible exception), and the reasons are indeed valid.
  3. Provide a web service style call for "get cache status". This is the most logical solution, but it has not had much attention.

Implementation is obviously at the sole discretion of the Powers, but in this case, I will give a few examples of possible "get cache status" calls to explain what I am attempting to communicate.

  1. GetCacheStatus?id=##### This call would return the archived/available information for the cache. An application could call this for each cache, either on displaying the "cache page" or possibly in response to a "verify cache availability" option. The downside would obviously be that an app trying to verify the status of 13,000 caches (yes, people have had that many in Watcher at one time) would undoubtedly cause undue strain (multiply by the number of users).
  2. GetCachesStatus (with IDs in the POST data) This would function like a batch version of the first call. It would eliminate the overhead of a great number of individual HTTP connections when verifying a quantity of caches. It would not, however, markedly reduce the database load, as each cache would still need to be checked.
  3. Cache status file (with all caches) This could be generated at the beginning of the PQ run. The file would hold the archived/active status of each cache. A program would merely need to download today's file once (if it did not already have it), and all status checking could then be handled client-side. The downside of this approach is bandwidth (since you'd fetch the whole file, not just the caches you need), but that could be greatly reduced in a number of ways.
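
To make the batch idea (#2) concrete, here is a rough sketch of what a client might look like. Everything here is invented for illustration: the URL, the comma-separated POST body, and the "id,status" response lines are assumptions, not an endpoint that actually exists on the site.

```python
# Sketch of a client for a hypothetical batch "GetCachesStatus" call.
# The URL, request body, and "id,status" response format are all invented
# for illustration; no such endpoint exists on the real site.
import urllib.request

def parse_status_response(body: str) -> dict:
    """Parse "id,status" lines into {cache_id: status}."""
    statuses = {}
    for line in body.splitlines():
        if not line.strip():
            continue
        cache_id, status = line.split(",", 1)
        statuses[int(cache_id)] = status
    return statuses

def get_caches_status(ids, url="https://example.com/GetCachesStatus"):
    # One POST carrying all the IDs replaces one HTTP round trip per cache.
    data = ",".join(str(i) for i in ids).encode("utf-8")
    with urllib.request.urlopen(url, data=data) as resp:
        return parse_status_response(resp.read().decode("utf-8"))
```

The point of the batching is purely to cut HTTP overhead; as noted above, the server still has to look up each cache individually.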

One way to reduce the bandwidth hit of a cache status file would be to make it a simple bitmapped flat file. To find out if cache 143 is archived or inactive, the app would just AND byte 143 with 1 for archived and 2 for active. Such a file would almost certainly compress rather well as a zip or whatever for additional savings (you could even pack the bits to use less than 1 byte per cache, if you really wanted to).
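
As a sketch of that byte layout (one byte per cache ID, bit 0 = archived, bit 1 = active; this is the proposal above, not a real file the site offers):

```python
# Sketch of the proposed bitmapped status file: one byte per cache id,
# bit 0 meaning "archived" and bit 1 meaning "active". This layout is
# the poster's proposal for illustration, not anything the site provides.

ARCHIVED = 1  # bit 0
ACTIVE = 2    # bit 1

def build_status_file(statuses: dict, highest_id: int) -> bytes:
    """statuses maps cache id -> flag byte; unlisted ids stay 0."""
    buf = bytearray(highest_id + 1)
    for cache_id, flags in statuses.items():
        buf[cache_id] = flags
    return bytes(buf)

def cache_status(data: bytes, cache_id: int) -> dict:
    # ANDing the cache's byte with each flag answers the status question
    # entirely client-side, with no per-cache server call.
    byte = data[cache_id]
    return {"archived": bool(byte & ARCHIVED), "active": bool(byte & ACTIVE)}
```

At one byte per cache, even a few hundred thousand caches would fit in a few hundred kilobytes before compression, which is why the bandwidth cost could be kept small.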

 

Anyway, my point is that there are several different avenues that could be explored to solve the very annoying stale-file problem, and it would be very welcome if a dialog could be opened about something workable (such as a "cache status file"-type solution).

Link to comment

I'll bite.

 

I use #1, but would like #2 for downloading my finds. Because I have had some of my finds deleted by a local bevis, I'd like to be able to get archived caches other than just those that currently show as a 'find'. What are the reasons for not being able to get archived caches other than 'finds'? I can't think of any other than possibly 'busy servers'.

 

You totally lost me with #3. Of course, I usually get lost as soon as the programmers of the group start sharing.

Link to comment
This will be added eventually. We're currently going through an overhaul of the servers to keep up with capacity.

I dug around a bunch of similar threads, but this is the only one I've found with a comment from Jeremy.

 

The quote, if you don't intend to read the rambling yet historical thread, is in regards to the ability to download a GPX file of all of your finds, including archived ones, and possibly beyond the normal 500-cache limit.

 

I recall this being a hot topic before I left for the summer. Upon my return, I didn't see it being brought up. I guess it was just a fad.

 

I liked the idea. Is it still on the table?

 

Jamie

Link to comment

The main reason I can see for not including archived caches in PQs is that a cache could be archived for good reason (landowner objections, for example), and if someone got it in a GPX file, they might try to hunt it anyway.

 

I don't know whether that's the main objection, but if so, it would seem the information downloaded for "archived, don't show" caches could be greatly abbreviated to only the GCxxxx code and the archived status, leaving out the coordinates, or with dummy coordinates (say, the north or south pole) supplied in the coordinate fields.

 

Sure would like to see the PQs somehow tell me that caches I've been tracking have been archived, though. And I'd like to see the PQ that documents all our finds too. I guess there are a lot of other things with higher priorities, but it probably doesn't hurt to ask again.

 

Jon

Edited by jon & miki
Link to comment
This topic is now closed to further replies.