
Pocket Queries not returning archived caches


soulnavigator


I ran a PQ to update my GSAK files and generate info for Cachemate, Garmin, etc., when I noticed that a recently archived cache from my area was not part of the results. It's obviously a GC.com issue and not a GSAK issue, because the pocket query didn't include the cache in its results. Here's exactly what I did:

 

Ran the standard 100-cache, 100-mile PQ from 58436. I got the results I expected, except that GCVWX4, which just recently went into archived status, was not included in the PQ results. I re-ran the PQ with the "not active" filter set and still did not receive GCVWX4 in the results.

 

My question, therefore, is this: I have already found this cache, so I won't be looking for it. However, had it been one of my unfound caches, I might have continued looking, since the PQ didn't report the cache as archived. Make sense? I've had other "temporarily unavailable" caches update without any trouble, and they show up in the PQ results as well.

 

I suppose what's happening is that none of the permanently archived caches are included in PQ or search results because they'll never be active again, which is really nice most of the time. However, it would be nice if the PQ would report the permanent-archive status for a couple of weeks or so, so that GSAK would be updated, etc.

 

Any thoughts?


This is not a fluke; it is by design, for the reasons you suppose. For now, the best thing to do is check the "Last GPX Date" in GSAK and visit the cache pages to confirm their status before manually marking them as "Archived" in GSAK.

 

This topic has been discussed repeatedly here in the forums, and I don't recall any hint from TPTB that they have any plans to change this behavior, though if I'm incorrect in that statement, I'm sure someone will correct me.

 

There are a couple of different macros available in the GSAK forums, specifically in the Macro Library, that can help automate this process, depending on exactly how you want to approach it.


TPTB really never designed Pocket Queries to be used in an offline database of any kind. They are intended to be the newest, freshest data available at the time of generation; that is why archived caches are excluded. Archive issues of the kind you describe are a problem created by how you use the PQs, not by any problem with the PQs themselves.


TPTB really never designed Pocket Queries to be used in an offline database of any kind. They are intended to be the newest, freshest data available at the time of generation; that is why archived caches are excluded. Archive issues of the kind you describe are a problem created by how you use the PQs, not by any problem with the PQs themselves.

 

It's a fact of life that many cachers have evolved to use their own offline database such as GSAK - and if a gpx file does not have archived cache data, then it is not the 'newest freshest data available'.

 

It's also a fact that if all cachers searched the GC database for every cache they wanted to search for, the server probably would not be able to keep up with the demand (especially on the weekends).

 

Without Archived cache information, cachers may find themselves searching in sensitive areas when the cache has already been removed. This definitely goes against the idea of removing a cache because of damage to the area or even more importantly because a landowner has asked for the cache to be removed. Also consider the time lost/wasted by people searching for something that isn't there.

 

There is no reason for us to be constantly denied archived cache information within a gpx file. There is no need to include log data - just the cache id and the archived flag.

 

I'm sure that this would be enough to satisfy people and allow them to update their offline database in the knowledge that it was in fact "the newest freshest data available".

 

My point of view - Wayne
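Wayne's stripped-down file is close to what Groundspeak's GPX extension already carries: the `<groundspeak:cache>` element has `archived` and `available` attributes. A minimal sketch of how a tool could read such a flag follows; the sample fragment and namespace handling are illustrative, not an exact copy of a real PQ file:

```python
import xml.etree.ElementTree as ET

# Made-up, minimal GPX fragment: just a waypoint name (the GC code)
# and the archived flag from the Groundspeak cache extension.
SAMPLE_GPX = """<?xml version="1.0"?>
<gpx xmlns="http://www.topografix.com/GPX/1/0"
     xmlns:groundspeak="http://www.groundspeak.com/cache/1/0">
  <wpt lat="46.0" lon="-98.5">
    <name>GCVWX4</name>
    <groundspeak:cache archived="True" available="False"/>
  </wpt>
</gpx>"""

NS = {"gpx": "http://www.topografix.com/GPX/1/0",
      "gs": "http://www.groundspeak.com/cache/1/0"}

def archived_codes(gpx_text):
    """Return the GC codes of waypoints whose cache element is archived."""
    root = ET.fromstring(gpx_text)
    codes = []
    for wpt in root.findall("gpx:wpt", NS):
        cache = wpt.find("gs:cache", NS)
        if cache is not None and cache.get("archived") == "True":
            codes.append(wpt.findtext("gpx:name", namespaces=NS))
    return codes

print(archived_codes(SAMPLE_GPX))  # ['GCVWX4']
```

A tool like GSAK could use exactly this kind of list to flip the "Archived" flag on matching records in the offline database.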


TPTB really never designed Pocket Queries to be used in an offline database of any kind. They are intended to be the newest, freshest data available at the time of generation; that is why archived caches are excluded. Archive issues of the kind you describe are a problem created by how you use the PQs, not by any problem with the PQs themselves.

 

It's a fact of life that many cachers have evolved to use their own offline database such as GSAK - and if a gpx file does not have archived cache data, then it is not the 'newest freshest data available'.

 

 

I have to agree with Wayne here. When I run a pocket query and a cache's status has changed from available to unavailable because it was labeled "temporarily unavailable," the cache is part of the results and GSAK is updated. However, I'm not getting any indication of status changes for newly archived caches. Obviously I don't want these listed forever, but I think it would be very valuable to have them included in the PQ results for, say, 2-4 weeks (or whatever a reasonable amount of time is), during which most users will update their offline databases, and then the issue is resolved. Just having the cache "disappear" from all results is definitely not helping anyone, including people who only use GC.com's database.


TPTB really never designed Pocket Queries to be used in an offline database of any kind. They are intended to be the newest, freshest data available at the time of generation; that is why archived caches are excluded. Archive issues of the kind you describe are a problem created by how you use the PQs, not by any problem with the PQs themselves.

 

It's a fact of life that many cachers have evolved to use their own offline database such as GSAK - and if a gpx file does not have archived cache data, then it is not the 'newest freshest data available'.

 

 

I have to agree with Wayne here. When I run a pocket query and a cache's status has changed from available to unavailable because it was labeled "temporarily unavailable," the cache is part of the results and GSAK is updated. However, I'm not getting any indication of status changes for newly archived caches. Obviously I don't want these listed forever, but I think it would be very valuable to have them included in the PQ results for, say, 2-4 weeks (or whatever a reasonable amount of time is), during which most users will update their offline databases, and then the issue is resolved. Just having the cache "disappear" from all results is definitely not helping anyone, including people who only use GC.com's database.

 

I absolutely have to agree with you guys here. This is one of the most frustrating things about paperless caching: never really knowing if you are searching for a cache that is offline for whatever reason. I have even resorted to planning my trip and updating the individual GPX file for every cache I plan on visiting the next day. A huge task when you plan on 40+ caches in a day.

 

To say that this is a result of our way of caching and that it is not what the PQ's were designed for is plain silly. All things evolve, and the game that we all love is no exception! I would be happy to have a query of only archived or unavailable caches that I can load into GSAK, so that I upload only current waypoints to my GPS. I cannot see a reason why this functionality cannot be made available.

 

Anyhow, just my thoughts. I'm sure someone will take great pleasure in shooting them down in flames... :huh:

 

Matt.


For most of us who use GSAK, keeping it up to date using the features within GSAK makes this a non-issue. It has not been addressed as a problem, and I seriously doubt it ever will be, especially since the site, as a stated policy, does not support offline databases.

 

If you are using it as an "offline" database, as an earlier message indicated, it is quite simple. It means you are probably running your PQ's either daily or weekly - some sort of regular interval. As an example, with weekly PQ's you can set up a filter to show all the caches that have neither been found nor had a GPX update in the last 7 days. Then just right-click and delete everything in the filter. You can check each one, but sooner or later you will find that if you have your PQ's set right (staying within the 500 limit), this works more or less flawlessly.
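That filter-and-delete workflow can be sketched outside GSAK roughly as follows; the field names, dates, and cache codes are hypothetical, and a real GSAK setup would use its own filter dialog or macro language rather than Python:

```python
from datetime import date, timedelta

# Hypothetical cache records as they might sit in an offline database:
# waypoint code, whether we've found it, and the date of the last GPX
# (PQ) update for that cache.
caches = [
    {"code": "GCVWX4", "found": False, "last_gpx": date(2007, 3, 1)},
    {"code": "GC1ABC", "found": False, "last_gpx": date(2007, 3, 20)},
    {"code": "GC2DEF", "found": True,  "last_gpx": date(2007, 3, 2)},
]

def likely_archived(caches, today, interval_days=7):
    """Unfound caches that missed the last PQ cycle are the candidates
    to verify online and then delete or mark as archived."""
    cutoff = today - timedelta(days=interval_days)
    return [c["code"] for c in caches
            if not c["found"] and c["last_gpx"] < cutoff]

print(likely_archived(caches, today=date(2007, 3, 21)))  # ['GCVWX4']
```

The logic works because a weekly PQ refreshes the GPX date of every still-active cache it returns; anything that stops getting refreshed has dropped out of the query, and archiving is the usual reason.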

 

I maintain a GSAK DB of about 5000 caches because of my erratic travel patterns in the area. Even with the potential for 6-day-old data, I have only gone looking for one cache that was archived, and that one was archived after I left the house, so it would not have mattered.


TPTB really never designed Pocket Queries to be used in an offline database of any kind. They are intended to be the newest, freshest data available at the time of generation; that is why archived caches are excluded. Archive issues of the kind you describe are a problem created by how you use the PQs, not by any problem with the PQs themselves.

 

It's a fact of life that many cachers have evolved to use their own offline database such as GSAK - and if a gpx file does not have archived cache data, then it is not the 'newest freshest data available'.

 

It's also a fact that if all cachers searched the GC database for every cache they wanted to search for, the server probably would not be able to keep up with the demand (especially on the weekends).

 

Without Archived cache information, cachers may find themselves searching in sensitive areas when the cache has already been removed. This definitely goes against the idea of removing a cache because of damage to the area or even more importantly because a landowner has asked for the cache to be removed. Also consider the time lost/wasted by people searching for something that isn't there.

 

There is no reason for us to be constantly denied archived cache information within a gpx file. There is no need to include log data - just the cache id and the archived flag.

 

I'm sure that this would be enough to satisfy people and allow them to update their offline database in the knowledge that it was in fact "the newest freshest data available".

 

My point of view - Wayne

StarBrand tells you how things are; arguing about how you want them isn't going to do any good. If you want it changed, you need to get Jeremy et al. to decide a change would be good. Since it's been repeatedly said they don't like people keeping offline databases, I really doubt they'll flip-flop right now, but at least try to get your desire across to the right people.


 

StarBrand tells you how things are; arguing about how you want them isn't going to do any good. If you want it changed, you need to get Jeremy et al. to decide a change would be good. Since it's been repeatedly said they don't like people keeping offline databases, I really doubt they'll flip-flop right now, but at least try to get your desire across to the right people.

 

Ah-aahh... but we are communicating with TPTB by posting here. I totally agree with the creep-over-time to the offline database model; it's been one of the best spin-offs of PQs and makes the premium membership worth its weight in gold. A simple one- or two-week retention of "Archived" caches within PQs would be very helpful... we already have "temp unavailable" caches included. Until then, there are plenty of good and easy workarounds, as highlighted above.


<snip>.. A simple one- or two-week retention of "Archived" caches within PQs would be very helpful... we already have "temp unavailable" caches included. Until then, there are plenty of good and easy workarounds, as highlighted above.

 

The reason the unavailable caches are still included is that they are still viable caches, temporarily unavailable until some issue is fixed or addressed. Archived caches are considered no longer viable and will rarely come back. From a user's point of view, the data is no longer valid, since the cache cannot (should not) be hunted.


The reason the unavailable caches are still included is that they are still viable caches, temporarily unavailable until some issue is fixed or addressed. Archived caches are considered no longer viable and will rarely come back. From a user's point of view, the data is no longer valid, since the cache cannot (should not) be hunted.

 

Yup, I got the reasoning as to why temp. unavail. caches are included. I still think an Archived flag could be included for a fortnight or so, to help people who cache (suboptimally, according to TPTB) with an offline database avoid searching for caches that were placed inappropriately or dangerously. People who d/l PQs less frequently than that "flagging period" are certainly more prone to stale data anyway.

 

I do agree that longer-term flagging of archived caches would not be productive or a good use of PQs.


But wait a minute. There is already a way to know this from the GSAK data. Gnbrotz gave the solution wayyyy up in post #2...

the best thing to do is check the "Last GPX Date" in GSAK and visit the cache pages to confirm their status, before manually marking them as "Archived" in GSAK
Since this already works, I'm not sure GC.com would be convinced to implement something ELSE to tell GSAK that a cache is archived.

 

Not trying to say NO - just trying to ask WHY?


I have been using GSAK for all of five weeks, and I was a reluctant convert. Since then, I've built up a database of all the unfound caches within 100 miles of my home -- more than 3,000 of them. I figured out pretty quickly, all on my own, to use the sort method that gnbrotz and Markwell describe, when I spotted caches that didn't get updated in the last batch of pocket queries. It took no time at all to check their status online while I was watching TV and update them as archived or disabled in my GSAK database. If there's a macro to do the same thing, that's even better. I haven't gotten that fancy yet.

 

If I'm going to maintain an offline database, then it's my responsibility to keep it up to date and to accept the fact that I may have stale data. If I need to be absolutely current, I use the Trimble Geocache Navigator application on my cellphone when I'm out in the field.


I'll throw in this as well...

 

If PQ results returned archived caches and I lived in a cache-dense area, then over time, as more and more caches were archived, a "100 caches closest to home" query would include fewer and fewer active caches.

 

I also use the "check the GPX update date" method described above. In fact, I have a simple macro which finds any caches that haven't been updated in the last two weeks and purges them from my GSAK database.


 

Not trying to say NO - just trying to ask WHY?

 

I maintain an offline database (gasp!) of caches in areas where I am planning future cache trips. I want to keep this relatively up-to-date, so I run a lot of PQs weekly.

 

If TPTB provided a "status changed in the last 2 weeks" option that included archived caches (or even just had the current "changed in last week" include archived) I could run a lot fewer PQs.

 

That would seem to be a benefit to TPTB.

 

When a producer creates a product, they have some idea of how the consumer will use it. When it turns out that the consumer uses it in a different manner, the producer has several choices.

 

One choice is to say "you are doing it wrong", which seems to be the answer here.

 

Another choice is to modify the product to make it more useful to the consumer.


 

Not trying to say NO - just trying to ask WHY?

 

I maintain an offline database (gasp!) of caches in areas where I am planning future cache trips. I want to keep this relatively up-to-date, so I run a lot of PQs weekly.

 

As do I. I just use the existing features in GSAK to purge the data. If you want it COMPLETELY up-to-date, there's another feature in GSAK as well.

[screenshot: GSAK's purge-and-load option]

 

That will make it so that you won't have ANY stale data in your GSAK load. Of course, that will also purge the logs on the ones that are kept.

 

Is your goal to have a COMPLETELY offline database, or are you just concerned about making sure that you don't hunt for stale ones? If the latter, then the GSAK purge-and-load method accomplishes this, right?


...

I maintain an offline database (gasp!) of caches in areas where I am planning future cache trips. I want to keep this relatively up-to-date, so I run a lot of PQs weekly...

 

 

TPTB want your data to be up-to-date as well so they conveniently keep it fresh and easily available.

 

 

Online.

 

 

Plus they generously allow you to download 2500 cache descriptions per day to use while not online - and that data never includes caches that are archived or "stale" so you don't waste your time. Within just a few minutes (most of the time) - I can have all the new fresh data I need for a long day of caching.

 

If you choose to store that data for any length of time then you run the risk of it not being as up-to-date or current as the online data. Your choice.

 

The best way to check "in the field" is via the WAP interface.


 

If you choose to store that data for any length of time then you run the risk of it not being as up-to-date or current as the online data. Your choice.

 

The best way to check "in the field" is via the WAP interface.

 

I know. I'm doing it "wrong". They could make things easier for me but insist I do things the "right" way.

Edited by beejay&esskay

TPTB really never designed the Pocket Queries to be used in an offline database of any kind. They are intended to be used as the newest freshest data available at the time of generation. That is why no archived ones. Archive issues in the manner you describe are a problem created by your use of them not by any problem with the PQs.

 

It seems to me that an offline database is just about all you can do with a pocket query. What else can I do with 2500 cache listings other than put them in an offline database? If I downloaded a pocket query straight into my PDA, wouldn't that be an offline database? Even if I sent them straight to my printer, it would be an offline database. If pocket queries downloaded all the logs, I would use only fresh data. Many times there is necessary info in old logs, and the only reasonable way to have them with that many cache listings is to compile an offline database and update it often.

 

TPTB want your data to be up-to-date as well so they conveniently keep it fresh and easily available.

 

 

Online.

 

 

Plus they generously allow you to download 2500 cache descriptions per day to use while not online - and that data never includes caches that are archived or "stale" so you don't waste your time. Within just a few minutes (most of the time) - I can have all the new fresh data I need for a long day of caching.

 

The data that is needed to find a cache may not be fresh data. I have seen logs that report bad coordinates, etc. I have seen logs that can be used to help figure out different ways or places to look - something like "I put my glove on and went for it." If you want to go paperless and choose the caches you hunt on the fly, you will benefit from keeping old logs. It seems that the system to remove archived caches from GSAK should work. I just find it hard to accept the idea that we pay for the ability to use Pocket Queries but we're not "really" supposed to keep that generous allotment of listings in an offline database. :D

If we are supposed to search and get all our current and up-to-date info right before we go out caching, what exactly is the point of being able to download 2500 cache listings per day? It would take you all day to review and choose; you would never make it out the door. :D


It seems to me that an offline database is just about all you can do with a pocket query. What else can I do with 2500 cache listings other than put them in an offline database? If I downloaded a pocket query straight into my PDA, wouldn't that be an offline database? Even if I sent them straight to my printer, it would be an offline database.

 

It seems that the system to remove archived caches from GSAK should work. I just find it hard to accept the idea that we pay for the ability to use Pocket Queries but we're not "really" supposed to keep that generous allotment of listings in an offline database. :D

 

Ha! I love how the conversation has evolved from "how do I...?" to "do TPTB have a moral obligation...?" And it would appear that it rightly should. Traildad makes the best point yet, I think. It would appear that PQ's were implemented to allow us to have cache info in a form more practical and accessible than carrying my laptop into the field looking for hotspots so I can get more caches. TPTB should be working on a fix to this issue so that THEIR service to us as a customer works at the highest level possible. If they implemented PQ's, they should be committed to making them the best they can be...

If we are supposed to search and get all our current and up-to-date info right before we go out caching, what exactly is the point of being able to download 2500 cache listings per day? It would take you all day to review and choose; you would never make it out the door. :D

 

Why not slice and dice your caches in GSAK offline and make a bookmark list of the targets that you really enjoy? Set them up as a PQ, and if you REALLY want fresh data, run the PQ right before you go out the door.

 

The point that some of us are trying to make is that the tools you need already exist in either GSAK or through SOME of the online features of GC.com.


TPTB should be working on a fix to this issue so that THEIR service to us as a customer works at the highest level possible. If they implemented PQ's, they should be committed to making them the best they can be...

 

If it were viewed as an issue, I am sure they would. They seem to be relatively responsive.

 

Now, let's look at things from a different angle. Let's assume that GC does agree with you and that it is on their list. They often do this without necessarily making it known. Scanning through the forums, there are a lot of things requested: buddy bookmarks, larger PQ's (although you seem to be advocating smaller), etc. - not to mention that some have complained about server issues, and they have the day-to-day maintenance of a system such as this.

 

If you had a long laundry list of things to do, I am fairly certain you would use similar criteria to decide what gets priority:

 

- How many requests have there been?

- Do I have the resources to devote to it?

- How much time will it take?

- Will it put any additional burden on the system?

- Will it adversely affect users or another part of the system?

- Is there a method to currently accomplish the same thing?

 

That last one is the thing people have been trying to politely say. As a customer - which, by the way, is not how I view you or me - you should want them to devote their resources to higher-priority items, especially if there is a cost-effective, simple means of doing what you are requesting. I want them working on the things that cannot be done easily by any other means, such as a buddy watch list, different search criteria, or the ability to have emails sent to my alternate addresses.

 

As an IT person, if there is another tool or method for doing something easily - not to mention the rest of the criteria being applied - it gets moved to the bottom or completely removed from the list.

 

No one is saying that someone is doing it right or wrong, just suggesting that there is a solution already in place. Many of us prefer that archived caches not be included, for the previously stated reasons. Short of using WAP or WiFi in the field, there is no way to solve the stale-data issue. The moment you receive the data, it is already stale. We've all just come to accept that.


 

The point that some of us are trying to make is that the tools you need already exist in either GSAK or through SOME of the online features of GC.com.

 

I don't mind the suggestion that I should be able to use GSAK to figure out if a cache has been archived. I have only gone looking for one archived cache so far, and it was archived the same day, so it could not be helped. I am slightly bemused by the idea that some seem to be putting forth that we can download 2500 listings per day but they are not meant to be used as an offline database.

 

On the other hand, it doesn't seem it would be too hard for the system to simply keep a list of all archived cache waypoints that could be downloaded and run through GSAK so the offline database can be updated.


... I am slightly bemused by the idea that some seem to be putting forth that we can download 2500 listings per day but they are not meant to be used as an offline database....

 

Not just an idea - TPTB are on record in many posts in this very forum that they do not (did not) intend the data from PQs to be used in long term stored offline databases. There is but one source of current data - the web site. If you keep data on your own and it isn't up-to-date - your problem. Once Archived - they are a non-issue for up-to-date data being downloaded for the purpose of a hunt. The argument that you "need" more than 5 logs to find a cache is a weak one - many cachers find caches without logs or hints. The problem of stale data (archived caches) is yours - not the web site's - and anyway, GSAK makes culling them out easy (as has been stated).


long term stored offline databases.

OK, if you want to change the term, then it seems OK now.

There is but one source of current data - the web site. If you keep data on your own and it isn't up-to-date - your problem. Once Archived - they are a non-issue for up-to-date data being downloaded for the purpose of a hunt.

 

I am still not sure why anyone would need to download 2500 listings per day to go on a hunt. If you download them into your PDA and then refresh the data, adding any new logs, you are able to cache on the fly and have all available info with you.

 

The argument that you "need" more than 5 logs to find a cache is a weak one - many cachers find caches without logs or hints. The problem of stale data (archived caches) is yours - not the web site's - and anyway, GSAK makes culling them out easy (as has been stated).

 

Please don't put words in my mouth. I never said you "need" more than 5 logs to find a cache. I said that it is a "benefit". I "want", not "need", all the logs. The easiest way for me to do this is to let them collect in GSAK. I can then pick through the caches on my PDA when I am out in the field and have all the logs I want available. The problem of stale data is not mine, as I previously stated; it is the suggestion that pocket queries were not meant to be used as an offline database, and you have already corrected that, so, as my kids would say, "It's all good."

:rolleyes:


Sorry - back again...

 

See this post from Jeremy....

#4

No. Pocket Queries were created so you could create a list of caches to put in a handheld device. There was no intent to create an offline database. I'm not sure where you got that idea from Geocaching.com

 

http://forums.Groundspeak.com/GC/index.php...hl=offline+data

 

It would seem to me that placing data into a handheld device (or printing it, or storing it in GSAK, etc.) is creating an offline database. It's not the entire GC database, though it can be quite extensive. The online database contains the information necessary to hunt a cache, and if I am given that information from GC.com and place it in my possession in a different form, I have created a database. So I think Jeremy is playing with words a little if he doesn't want to call digital data in a handheld device a "database." Even loading the files into my GPS is creating an offline database. The information used is limited and device-specific, but I am using it without being in contact with the GC.com server. I think that's the key to defining what is and isn't an offline DB. All I am asking is that TPTB maintain a service they're providing by adding a simple way to include permanently archived caches in PQ results, at least temporarily. It cannot be that hard.

 

In addition, so that I don't sound only critical, I would like to say that I have found the GC.com website to be intuitive and user friendly. As a relatively new geocacher, I have thoroughly enjoyed GC's site at every level. It is extremely well done and very easy to use. I think TPTB have done a fine job. I am 100% satisfied. However, I just noticed this fluke in the PQ's and wanted some discussion.

Edited by timberghost27
...It would seem to me that placing data into a handheld device (or printing it, or storing it in GSAK, etc.) is creating an offline database. It's not the entire GC database....

Parse the words all you want - I know you understand what is meant. Download the PQ, load it in the GPS, and go caching. Do it that way (as is intended) and you will never worry about archived caches. That is all I have been trying to say. Use the workaround in GSAK or don't, but my take on this is that the website won't be changing anything. Sorry.


I'll throw in this as well...

 

If PQ results returned archived caches and I lived in a cache-dense area, then over time, as more and more caches were archived, a "100 caches closest to home" query would include fewer and fewer active caches.

 

I also use the "check the GPX update date" method described above. In fact, I have a simple macro which finds any caches that haven't been updated in the last two weeks and purges them from my GSAK database.

 

 

I'd like to see an archived-cache search feature, but make it a members-only feature in the pocket query, if everyone is worried about it returning results in the normal search on the website.

My wife and I are planning on putting out some caches (finally), and we'd like to see areas to avoid. I know a lot of caches get archived because of safety concerns or permissions being revoked and whatnot, but there are the occasional muggles stealing them too. I know of one park that has had two caches (maybe more) stolen in it, and it's for that reason that we would like to see an "archived cache search" feature.

The only reason I know about the stolen caches is that one was stolen a few days after I found one, and I checked the history of previous cachers in the area to see what caches they had found. I had to grab the coordinates and plot them in my MapSource program to see that they practically overlap.

So far, in my manual digging through local cachers' geocaching pages, I've found a couple of caches that were archived because of theft. It's those areas we want to avoid, or at least, where we'd use a different size container so as not to recreate an opportunity for non-geocachers.

Now that I've said that, someone will probably tell me about an easier way to find archived caches on the website, or through the pocket queries...

 

--Will


StarBrand, what if, say, I'm going on a road trip and won't have access to the internet?

 

Or, what if the geocaching.com website is down again?

I did a 10-day road trip last summer - ran 5 PQs along the route, loaded the 960 caches I liked into my GPS and PDA, and off I went. Found 74 along the way - great trip, never accessed the Internet the whole time!! (Had 8 DNFs too.) I ran into one that turned out to have been archived after the trip started - oh well. 2 hours of planning for 10 days of fun!! The data got wiped off my home HD as soon as I ran my next PQ about 2 weeks later.

 

Were you trying to make a point??

Link to comment

Now that I've said that, someone will probably tell me about an easier way to find archived caches on the website, or through the pocket queries...

Well, you can find them on the website. For your example, go to the cache in the park that you know, click the map link, then click "list archived" and "identify." You get an image and list of all the caches (including archived) in that area of the map.

Edited by FamilyDNA
Link to comment

Now that I've said that, someone will probably tell me about an easier way to find archived caches on the website, or through the pocket queries...

Well, you can find them on the website. For your example, go to the cache in the park that you know, click the map link, then click "list archived" and "identify." You get an image and list of all the caches (including archived) in that area of the map.

 

Thanks for the tip. I checked it out this morning, and it is what we're looking for.

 

--Will

Link to comment

Well, you can find them on the website. For your example, go to the cache in the park that you know, click the map link, then click "list archived" and "identify." You get an image and list of all the caches (including archived) in that area of the map.

 

It seems to me mobile geocaching is speeding faster and faster towards being completely paperless. As a new cacher, I discovered very quickly that paperless is the way to go and I'll bet the trend continues to gain advocates. So, I think this thread is valuable and forward-looking.

 

Obviously the archived cache information is there on the website. It's not like I'm asking them to generate new information. Instead, I'm simply asking that we be given access to it in a form that is a little more usable. All they need to do is include the newly archived caches in the "is not active" filter OR create a new filter that only returns permanently archived caches.

 

Apparently the need for this type of filter has already been determined because as FamilyDNA notes, you can pull these caches up on the map page - which is great for viewing but worthless for other purposes. The information is there, it's just not in usable form.

Link to comment

I recently developed a method for dealing with this issue that I find useful.

 

I do use the method indicated above of inspecting caches that have not been updated with recent PQs. But many times I forget and then I end up searching for a few caches that I should not be looking for.

 

For the GSAK Cachemate export, I have the "CacheMate name" field set to:

%Name %macro="C:\Program Files\GSAK\Macros\LastGPXdate.txt"

and that txt file contains this one line:

$_Special="["+DateToString($d_LastGPXdate)+"]"

so that the cache name ends in the date of the last update that I had for that cache. I generally know what it is supposed to be (although not always).

 

Now when I find myself searching for a cache that seems to be missing, I can see the last GPX date while I am out geocaching. It's not a perfect solution, but it does help.
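The same name-tagging trick is easy to see in miniature. This Python sketch is my own illustration, not the GSAK macro itself: it appends the bracketed last GPX date to a cache name the way the $_Special line above does. Note that the real DateToString output depends on your GSAK date-format settings, so mm/dd/yyyy here is only an assumption.

```python
from datetime import date

def cachemate_name(name, last_gpx_date):
    # Append "[last GPX date]" to the cache name so the CacheMate
    # export shows at a glance how fresh each cache's data is.
    return f"{name} [{last_gpx_date.strftime('%m/%d/%Y')}]"

label = cachemate_name("Riverside Stash", date(2006, 3, 15))
# label -> "Riverside Stash [03/15/2006]"
```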

Link to comment

StarBrand, what if, say, I'm going on a road trip and won't have access to the internet?

 

Or, what if the geocaching.com website is down again?

I did a 10 day road trip last summer - ran 5 PQs for along the route and loaded the 960 caches I liked into my GPS and PDA - off I went. Found 74 along the way - great trip, never accessed the Internet the whole time!! (had 8 DNFs too) - ran into one that turned out to have been archived after the trip started - oh well. 2 hours of planning for 10 days of fun!! Data got wiped off my home HD as soon as I ran my next PQ about 2 weeks later.

 

Were you trying to make a point??

Yes. They were examples that were intended to demonstrate the fact that offline databases can be useful in some situations to some people. You posted an anecdote as evidence that my question was invalid. Why don't I tell you a nice story too? Anecdotes don't prove anything.

 

Again, I ask since it was missed before. Perhaps rewording helps.

 

How would an extra option (unticked by default) called "include archived caches (description not included)" on PQ pages hurt anyone? It's already been demonstrated that it would help some people.

 

The only possible problem I can see is that perhaps archived caches are moved out of the active cache database, and would require some technical work behind the scenes. That's just speculation on my part; if it's true why doesn't Groundspeak acknowledge it?

Link to comment

I suppose that sooner or later, someone needs to come up with an idea that Groundspeak will take hold of. But first - a few points.

  • 1 - Archived cache data belongs in the general cache database.
  • 2 - Archived cache data is available in GPX files from the cache page.
  • 3 - Archived cache data is available from a GPX file of my found caches (which I run every so often)
  • 4 - Archived caches are Valid caches until they are marked as Archived in my (and others') Offline Db and they may be hunted / looked for.

Groundspeak, I am one (of many) of your customers who want Archived cache data in regular GPX files. I'm not asking for much - no data, just the cache id and the flag. There are many of us that want this extra data to make geocaching simpler, and in some cases save us time and money.
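To make the request concrete, here is a rough sketch of what such a stripped-down waypoint might look like in a PQ's GPX file. The coordinates and cache id reuse the example from this thread, and the exact attribute names in the Groundspeak cache extension are an assumption on my part, not a confirmed schema:

```xml
<wpt lat="46.123" lon="-98.456">
  <name>GCVWX4</name>
  <type>Geocache</type>
  <!-- just the id and the flag: no description, hints, or logs -->
  <groundspeak:cache archived="True" available="False" />
</wpt>
```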

 

You don't have the right to tell us how we have to deal with the data that you supplied, because you never supplied us with a program to use the data with. Instead you left that to other sources and now we have GSAK, Cachemate and a host of other database programs.

 

Give us what we are asking for. Charge a little more for the data if you want, but stop stone-walling and give us fully up-to-date data.

 

It could be incorporated in a Pocket Query for "Any Activity in the last 7 or 14 days" (cache status and all logs over the period). If this information was available to me in one pocket query, I would disable most of my queries each day, knowing that I wouldn't lose any information. It would be a major benefit to you as well.

 

Wayne (getting down off the box now) :P

Link to comment

I agree with your desire for archived cache information, but I don't think it will ever happen.

 

Jeremy believes this will encourage the use of offline databases which he thinks is the wrong way to use the data. The fact that these databases will continue to exist in any case or that giving archived information would probably decrease the PQs run makes no difference.

Link to comment

...

How would an extra option (unticked by default) called "include archived caches (description not included)" on PQ pages hurt anyone? It's already been demonstrated that it would help some people.

...

It would encourage the use of offline databases - something that TPTB have stated they wish to discourage.

 

And my point throughout.

 

And having said that - there is an EASY workaround in GSAK - described in many posts above.

Link to comment

1 - Archived cache data belongs in the general cache database.

(snipped for brevity)

4 - Archived caches are Valid caches until they are marked as Archived in my (and others') Offline Db and they may be hunted / looked for.

(snipped for brevity)

You don't have the right to tell us how we have to deal with the data that you supplied, because you never supplied us with a program to use the data with. Instead you left that to other sources and now we have GSAK, Cachemate and a host of other database programs.

(snipped for brevity)

Give us what we are asking for. Charge a little more for the data if you want, but stop stone-walling and give us fully up-to-date data.

 

Point one, archived data belongs in the general cache database. Not according to Groundspeak. A land manager comes by and finds that a cache has been placed illegally. They don't want that particular location attracting visitors because the last 14 Romanian Giant Sea Slugs in the world make their nest at those coordinates. Cachers would trample them and make them extinct. The land manager contacts Groundspeak and demands that the site be stricken from the records. Archive the cache. Does that site belong in the general cache database?

 

Point two, archived caches are Valid caches until they are marked as Archived in my (and others) Offline Db and they may be hunted / looked for. Again, the example above is why Groundspeak does not encourage offline databases. If you keep one, and it is out-of-date, it is not the fault of Groundspeak. Analogy: Ginsu makes a pretty nasty knife - it can cut through a tin can. Is it Ginsu's fault when someone slices their finger while trying to cut a tin can? Should they have to provide a special sheath that would allow people to cut a can, but not cut their finger?

 

Point three, You don't have the right to tell us how we have to deal with the data that you supplied... Sure they do. Here's a link. But do they have a right to tell you you cannot make an offline database? No. Did they tell you that you can't? No. They have just said that they do not intend to provide archived caches in Pocket Queries except for the caches you've already found. If you choose to create an offline database using other software that Groundspeak does not create or support, isn't the onus on the user or the other software creator to make a way to keep the data up-to-date? GSAK has done this.

 

Point four, Give us what we are asking for...give us fully up-to-date data. They do. When you receive the PQ, it is fully up-to-date, with only the non-archived caches. That is what Groundspeak considers up-to-date.

 

I'm really having a VERY hard time seeing how there is a need to provide a list of archived caches, when GSAK and other software ALL have a method for culling these caches from offline databases.

Link to comment

I'm really having a VERY hard time seeing how there is a need to provide a list of archived caches, when GSAK and other software ALL have a method for culling these caches from offline databases.

 

Yes, there is a method. I use that method. And I run a lot more PQs than I would if the simple addition of recently archived caches were available.

 

I think what TPTB are providing is suboptimal. And I understand that is the way things are likely to remain.

Link to comment

Far fewer people would be tempted to keep offline databases of their own in gsak if.......

 

(1) The website was fully available more of the time (less of the server busy errors).

 

(2) The website included some of the more advanced searching options that gsak offers.

 

I know things are being done about (1).

But I think (2) would be well worth doing. TPTB could take a good look at the advanced filtering facilities gsak offers, and implement some of them into the existing search and PQ systems.

 

JMHO.

Edited by Jaz666
Link to comment

My wife and I are planning on putting out some caches (finally), and we'd like to see areas to avoid. I know a lot of caches get archived because of safety concerns or permissions being revoked and what-not, but there are the occasional muggles stealing them too. I know of one park that has had two caches (maybe more) stolen in it, and it's because of that reason that we would like to see an "archived cache search" feature.

The only reason I know about the stolen caches is because one was stolen a few days after I found one, and I checked the history of previous cachers in the area to see what caches they had found. I had to grab the coordinates and plot them in my mapsource program to see them practically overlap.

So far, in my manual history digging through local cachers' geocaching pages, I've found a couple of caches that were archived because of theft. It's those areas we want to avoid, or at least, use a different size container so as to not recreate another opportunity for non-geocachers.

Now that I've said that, someone will probably tell me about an easier way to find archived caches on the website, or through the pocket queries...

 

--Will

The Map It page will show archived caches if you're a Premium Member. Make sure "Show Archived Caches" is checked, then pan the map to the area you're curious about. You'll then be able to see every cache that's ever been hidden there.

Link to comment

...

How would an extra option (unticked by default) called "include archived caches (description not included)" on PQ pages hurt anyone? It's already been demonstrated that it would help some people.

...

It would encourage the use of offline databases - something that TPTB have stated they wish to discourage.

 

And my point throughout.

 

And having said that - there is an EASY workaround in GSAK - described in many posts above.

And if you really want to make sure you're up-to-date, turn on your Instant Notifications to send you an email when any cache gets archived within the area you cover with your regular PQs.

Link to comment

Mr Markwell,

 

Thank you for your point of view without stooping to sarcasm (as often happens on these forums), however I stand by the points that I made. I also assume that you are on the Groundspeak payroll (no problem with that - in fact I feel as if I'm being listened to!), so perhaps you can get the following to Jeremy and/or TPTB.

 

Groundspeak may have their rules / policies on what data they include in GPX files, but we are your paying customers and we are not asking for very much.

 

Like many many others, I have an offline (Gsak) database - it allows me to do what GC.COM does not, and like some others, I am quite computer literate and, having been a professional programmer, I can deal with Archived caches.

 

The thing that concerns me most is that there are no requirements to start geocaching - anyone can start anytime they want. In time, new cachers will hear about things like offline databases from older cachers and organise their own. If they do not know what is missing, they will hunt for all/any caches, including Archived caches. Like many, I plan a day out caching the night before, with whatever data I have available and my latest GPX files. And occasionally I end up looking for a cache that has been archived. Sure it's my fault, but I know how to find out which caches are archived.

 

To quote you > A land manager comes by and finds that a cache has been placed illegally. They don't want that particular location attracting visitors because the... - I agree, but the simple fact is that Archived caches DO get looked for, and Land Managers DO get 'annoyed' - probably because some of us are using old data. And Groundspeak CAN help to reduce that from happening.

 

Land managers' demands should be heard and acted on, but I don't think that they have the right to expect cache details to be removed from the system. Why should they have a right above and beyond those of paying members? They do have the right, though, to expect that we all stop going onto the property/into the area, looking for that cache. Your system as it is does not ensure that. Geocaching the GC.COM way probably does happen, but certainly not by all geocachers.

 

Geocaching out here has evolved to use offline databases for various reasons, including that internet access is generally line based. I think that it's time for Groundspeak to catch up and solve some of these problems, instead of stone-walling and saying it's our problem/responsibility. Keep in mind that currently, I can get Archived cache data in GPX files in two different ways. We are only asking for minimal information to be included in pocket query GPX files, and then only for a short period, and Groundspeak is the only one that can do that.

 

Offline databases are a fact of life now, and they DO (sometimes) contain old data after a cache has been Archived. Groundspeak CAN do something about it.

 

What is so hard? Why can't we have that?

 

Wayne

Link to comment

Let me go officially on record as a "paying customer". I do not want archived caches to appear in our PQ's.

 

First, it appears to be a vocal minority that is making this request. Second, it is easier and more efficient to identify them simply by learning to use the tools available, such as GSAK, for those of us that maintain some sort of list of caches offline. Third, depending on how it was done, it would increase the size of PQ's.

 

Last, an archived cache is no longer. Having them available in PQ's will confuse the newbie simply because someone who has an "offline database" wants someone else to maintain it for them.

 

Jeremy and gang, if you're listening: spend your time on improvements, the system can find its own ways to break without help.

Link to comment
This topic is now closed to further replies.