
notification of archived state in PQ


mtrax

Recommended Posts

I gather archived caches are not included in PQs, so if I download a PQ and at some stage a cache gets archived, how can I get my offline database updated to remove it?

 

I gather this was a feature which may be implemented in the future... is it time to implement it yet?

 

I spent quite a bit of time looking for a cache which, when I returned home, I found had been archived... very frustrating.

Link to comment

Throughout the history of PQs there have been several axioms:

 

GC.com didn't create PQs for users to maintain offline databases. They were created as a means for querying against the database and downloading sets of caches for quick upload to your GPS.

 

GC.com takes a hard-line stance that while caches stay in the database for archival purposes, they do not send out data on archived caches. As far as PQs are concerned, archived caches do not exist, except in the "All My Finds" query.

 

Since offline databases are stored and maintained in third-party software, the onus is on the third-party software to weed out archived caches.

 

So - after years of requesting this, I can say with a good deal of certainty, it ain't gonna happen.

 

There are macros and methods in GSAK to weed out archived caches from your offline database, most of which involve the "Last GPX" date being older than the most recent upload of caches OR clearing the database on load. For details, head over to the GSAK forum on GSAK.net.
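
To make that idea concrete, here is a minimal Python sketch of the "Last GPX" approach (illustrative only, not actual GSAK macro code; the field names are hypothetical): any cache that the most recent PQ load did not refresh is presumed archived, or at least fallen outside the query, and can be flagged for deletion.

from datetime import date, timedelta

# Hypothetical offline records; GSAK tracks a "Last GPX" date per cache.
caches = [
    {"code": "GC1ABC", "last_gpx": date(2009, 5, 1)},
    {"code": "GC2DEF", "last_gpx": date(2009, 6, 20)},
]

newest_load = max(c["last_gpx"] for c in caches)  # date of the most recent PQ import
grace = timedelta(days=3)                         # PQs covering one area may run on different days

# Anything the latest PQs did not touch is a candidate for deletion.
stale = [c for c in caches if c["last_gpx"] < newest_load - grace]
for c in stale:
    print(c["code"], "was not refreshed by the latest PQs - likely archived")

Note the caveat: a cache can also go stale because it dropped out of your query radius, so this flags candidates rather than proving archival.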

Link to comment

As Markwell said, this is just not going to happen.

 

It has been frequently requested and frequently discussed. The bottom line is that HQ does not support the creation and use of long-term offline databases.

 

They happily keep the data fresh, ready and always up-to-date online for you.

Link to comment

I gather archived caches are not included in PQs, so if I download a PQ and at some stage a cache gets archived, how can I get my offline database updated to remove it?

 

I gather this was a feature which may be implemented in the future... is it time to implement it yet?

 

I spent quite a bit of time looking for a cache which, when I returned home, I found had been archived... very frustrating.

If you use GSAK, you can get archived and disabled caches out of your offline database with the "Last .gpx Update" date filter.

 

I also set up a notification to receive archive and disable notices. That way I can update my GSAK database whenever a cache is archived, in between the times I refresh the entire database with the "Date Placed" PQs I created for this large, cache-rich area.
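
As a rough sketch of that notification-driven update (hypothetical Python, not a GSAK feature; the email subject format shown is an assumption, since real notification formats vary): extract the GC code from each archive notice and drop that one cache from the offline store.

import re

# Hypothetical subject line of an archive notification email.
subject = "[GEO] Notification: My Favorite Cache (GC1ABC) has been archived"

# Toy stand-in for an offline database keyed by GC code.
offline_db = {"GC1ABC": "My Favorite Cache", "GC2DEF": "Another Cache"}

match = re.search(r"\((GC[0-9A-Z]+)\)", subject)
if match and match.group(1) in offline_db:
    removed = offline_db.pop(match.group(1))
    print("Removed archived cache", match.group(1), "-", removed)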

Link to comment

The actual answer to your question is one of the following:

 

A. Before loading any ZIP files into a GSAK database, do a complete clearing out of that database.

Then you are guaranteed NOT to have any archived caches in it.

 

B. Use the "Last Updated" column in GSAK to see that a particular cache has not been updated in your GSAK database in a very long time.

 

C. Run one of the GSAK macros designed for this very purpose.

 

D. Just plain be smart: read the cache page in GSAK, online or offline, and see that it has not been found in months or years and that the last post was an archive notice, a Needs Maintenance log, or a Needs Archived log.

Link to comment

It seems to me that if so many people are asking, perhaps this feature is something that everyone wants...

 

Also, it's a bit unrealistic to expect people to maintain archive lists offline... surely it's not too much to ask for a small notification for a short period, e.g. 1-2 months: the PQ would indicate that a cache is archived before it disappears from results.

Link to comment

I'd like a Series 300, too. That ain't gonna happen.

 

Also, it's a bit unrealistic to expect people to maintain archive lists offline... surely it's not too much to ask for a small notification for a short period, e.g. 1-2 months: the PQ would indicate that a cache is archived before it disappears from results.

 

I'm confused. I actually think it's a bit unrealistic to expect Groundspeak to bend its standard policy against publishing archived cache data just because someone wants to maintain an offline database (something Groundspeak doesn't want people to do anyway).

 

The work-arounds mentioned above do work.

Link to comment
It seems to me that if so many people are asking, perhaps this feature is something that everyone wants...

Please don't speak for me. I have no need or desire for archived cache data.

 

There are maybe a few dozen people asking for this "feature", yet...

In the last 7 days, there have been 280,859 new logs written by 39,919 account holders.

If 400 people were asking for this, it would only be 1% of last week's active account holders. Hardly "so many people" :lol:

Link to comment

OK, let's assume offline databases are absolutely required, unless you are assuming everyone has roaming internet access everywhere, or that when I'm visiting out of town I should manually re-check all my PQs every time I run them...

This is a real issue. I know you are suggesting a lot of fiddly ways to detect this, but surely common sense should prevail.

 

As to maintenance of an offline database: what do you think PQs are used for?

 

Is Groundspeak trying to make our life difficult, or is there another reason?

I'm not after an indefinite notification, just for a month or two...

Link to comment

OK, let's assume offline databases are absolutely required, unless you are assuming everyone has roaming internet access everywhere, or that when I'm visiting out of town I should manually re-check all my PQs every time I run them...

This is a real issue. I know you are suggesting a lot of fiddly ways to detect this, but surely common sense should prevail.

 

As to maintenance of an offline database: what do you think PQs are used for?

 

Is Groundspeak trying to make our life difficult, or is there another reason?

I'm not after an indefinite notification, just for a month or two...

I don't accept your assumption. I get a new PQ. I load it into my GPS and PDA. I go geocaching. Now why exactly do I need roaming Internet access?

Link to comment

If you don't want to use the workarounds, then just set GSAK to delete all the data before loading a new PQ.

 

Since I have needed more than just five past logs in my Palm to get a "hint" for a tricky cache, I don't choose that option; instead, I have figured out how to use the "Last .gpx Update" filter. :)

Link to comment

OK, let's assume offline databases are absolutely required, unless you are assuming everyone has roaming internet access everywhere, or that when I'm visiting out of town I should manually re-check all my PQs every time I run them...

This is a real issue. I know you are suggesting a lot of fiddly ways to detect this, but surely common sense should prevail.

 

As to maintenance of an offline database: what do you think PQs are used for?

 

Is Groundspeak trying to make our life difficult, or is there another reason?

I'm not after an indefinite notification, just for a month or two...

Inside of 5 minutes (usually) I can run a new PQ for an area or route I am taking, load it into my PC, and go. I travel a lot, and I have never seen an absolute "need" for keeping that data around longer than the trip. The only way archived caches are a problem is if you have kept that offline data too long; essentially it is old 5 minutes after you run it. The only fully reliable data is the online data.

Link to comment

I agree that the onus is on the cacher to make sure that they have the latest information.

 

As somebody mentioned, GS makes this information readily available. Sure, sometimes you don't have time to run a new PQ before heading out, but that's the way it is. To ask them to support something offline and outside of their control is unreasonable, IMO.

 

If you don't run the latest PQ, there are many more issues that you could uncover, not just archived caches:

Coords could be updated.

Container may have been removed.

Cache may be disabled.

Area may be temporarily closed.

 

etc.

Link to comment
This question may be ready to be included in the FAQ.
I'll add it to mine within the next couple of weeks.
How about that... It only took me two days...

 

Part of the FAQ

 

And the first method mentioned:

 

Purging your database

This is by far the easiest way. When you are importing your data into GSAK, check the setting shown in the image below.

 

This will purge all of the logs, markings, waypoints, additional waypoints, and all other data from the database before loading the new data. Since archived caches aren't included in pocket queries, they are deleted with the purge and won't come back.

[Image: pqarchive_eraseold.jpg - the GSAK load option that erases old data before adding the new PQ]
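
The logic behind the purge, as a toy Python sketch (GSAK's real import is far more involved; this just shows why the method works): a fresh PQ can only contain unarchived caches, so "clear, then load" leaves archived leftovers nowhere to hide, whereas a plain merge keeps them forever.

def load_pq(db, pq_caches, purge_first=True):
    # Toy model of importing a PQ into an offline database.
    if purge_first:
        db.clear()                    # drop everything, archived leftovers included
    for cache in pq_caches:
        db[cache["code"]] = cache     # insert or refresh from the new PQ
    return db

db = {"GC_OLD": {"code": "GC_OLD"}}   # archived, so absent from any new PQ
fresh_pq = [{"code": "GC_NEW"}]
print(load_pq(db, fresh_pq))          # GC_OLD is gone; only GC_NEW remains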

Link to comment

Of course there are ways to fix this on the "client" side, but all of these are workarounds for the fact that this information doesn't get sent to the "client", i.e. us, the users of the data.

So it seems we can jump through all these hoops, or GC.com can just add a simple line to their code indicating that a cache is now archived, for about one month after it's archived, and thereafter go back to the current policy.

 

i.e. if archived and archive_date <= 30 days ago, then include in PQ
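
Spelled out as a sketch of the proposed rule (hypothetical field names; this reflects the poster's suggestion, not anything in Groundspeak's actual code):

from datetime import date, timedelta

GRACE = timedelta(days=30)

def include_in_pq(cache, today):
    # Proposed rule: archived caches stay visible for ~30 days, then drop out.
    if not cache["archived"]:
        return True
    return (today - cache["archive_date"]) <= GRACE

print(include_in_pq({"archived": True, "archive_date": date(2009, 6, 25)},
                    today=date(2009, 7, 10)))   # True: archived only 15 days ago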

Link to comment

Of course there are ways to fix this on the "client" side, but all of these are workarounds for the fact that this information doesn't get sent to the "client", i.e. us, the users of the data.

So it seems we can jump through all these hoops, or GC.com can just add a simple line to their code indicating that a cache is now archived, for about one month after it's archived, and thereafter go back to the current policy.

 

i.e. if archived and archive_date <= 30 days ago, then include in PQ

 

Wouldn't that be like showing addresses on new maps that are no longer there? Archived, no longer available. No need to clutter up PQs.

Link to comment

The "All Finds" PQ includes all the caches you have found, including those caches that are Archived.

 

In GSAK, I created a separate Database for "Found" caches. That way I have one less filter to run before sending waypoints to my GPSr from my "Default" database.

 

I don't delete the caches in my GSAK database when I load new PQs. That way I can have more than five Past Logs for the caches that have been in the database for a while.

Link to comment
i.e. if archived and archive_date <= 30 days ago, then include in PQ
Why 30 days? Suppose I have a 75-day-old GSAK DB that I want to update? Suppose Markwell has a 3-year-old DB that he wants to update? What's the limit? There is no "natural" limit, which implies that this is a Bad Idea.

 

As others have pointed out, this feature amounts to supporting an offline mirror image of a subset of the gc.com database. If they start down this road, there is no natural stopping point short of supporting mirror images fully. That's a huge task and would be a huge load on the servers. Those of us interested in seeing much more useful features on the web site will oppose such uses of resources.

 

Edward

Link to comment

There you go. Basically, the folks above are right: every official response has been that it is not going to happen.

There are workarounds, many detailed above.

 

Mine is to set up instant notifications for all archivals and published caches in the areas I will cache in; I then click the link and download the individual GPX file for the cache.

 

What I would like is to set up square notification areas rather than round ones, as I occasionally get the same notice more than once due to overlaps.

 

This method keeps my offline database (very useful when the site is down) more up to date than any pocket query.

 

Note: instant notifications are not 100% reliable.

Archived caches in PQs would be nice and useful to get, but we don't get them, so the workarounds will continue.

Link to comment

Yeah, use the workarounds. Who cares if it creates a higher workload on the servers? If TPTB wanted less of a load, they'd work something out so that archived caches were properly marked in our offline databases. Until then, full datasets will be downloaded to the fullest extent allowed.

 

BTW, folks have short memories: it was only this past weekend that gc.com went down hard and folks couldn't get the oh-so-critical freshest data possible. Aggregators simply shrugged and carried on.

Link to comment

Yeah, use the workarounds. Who cares if it creates a higher workload on the servers? If TPTB wanted less of a load, they'd work something out so that archived caches were properly marked in our offline databases. Until then, full datasets will be downloaded to the fullest extent allowed.

 

BTW, folks have short memories: it was only this past weekend that gc.com went down hard and folks couldn't get the oh-so-critical freshest data possible. Aggregators simply shrugged and carried on.

As it stands now, you can run 5 PQs per day of 500 caches each: 2,500 caches. Seven days a week gives 17,500. If TPTB gave the "aggregators" the ability to see recently archived caches so they didn't have to do full downloads, would there be less of a load on the servers? No. What people would do is use their 5 PQs per day to get 17,500 caches that have recently been updated (since they wouldn't need to re-fetch the majority of caches in their offline database that didn't change). The load on the servers would be the same, and the "aggregators" would have offline databases of 175,000 caches (assuming 10% of caches get updated each week). That is more than a third of all the active caches worldwide. Groundspeak does not want to make it easy for people to copy a significant part of their database, but they do want to allow geocachers to get a reasonable number of caches to plan their cache outings.
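
The arithmetic behind that claim, as a quick Python sketch (the 10% weekly churn figure is the poster's assumption, not a published statistic):

pq_per_day = 5
caches_per_pq = 500
weekly_pull = pq_per_day * caches_per_pq * 7   # 17,500 caches per week

churn = 0.10                                   # assumed: 10% of caches change weekly
coverable_db = weekly_pull / churn             # 175,000 caches kept current

print(weekly_pull, int(coverable_db))          # 17500 175000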

Link to comment

As it stands now, you can run 5 PQs per day of 500 caches each: 2,500 caches. Seven days a week gives 17,500. If TPTB gave the "aggregators" the ability to see recently archived caches so they didn't have to do full downloads, would there be less of a load on the servers? No.

 

I don't think you know what "aggregators" will do. I'm an aggregator. I run 3 PQs a day on a routine basis, not 5. If I could get recently archived caches, that number would drop for me.

 

But TPTB aren't making this decision based on server load; they are making it based on how we "should" use the data.

Link to comment

I don't think you know what "aggregators" will do. I'm an aggregator. I run 3 PQs a day on a routine basis, not 5. If I could get recently archived caches, that number would drop for me.

 

But TPTB aren't making this decision based on server load; they are making it based on how we "should" use the data.

True enough. But by the TOU agreement, it is their data, right? Read it, starting at section 3 and the paragraph "The Site and all content available on the Site are protected..."

 

Also, just because your number of caches in the PQ would drop if they made archived caches available, I don't think that EVERYONE's would drop. I think it more likely that the prediction made by tozainamboku would be the norm: people would find a way to get only the updated and archived caches, and thus get updated and archived caches for a much WIDER base.

 

You use 3 PQs a day on a routine basis. That means you're pulling anywhere from 1,001 to 1,498 caches. That probably covers a certain radius; let's say, for the sake of argument, 50 miles. If you could get the archived caches and the updated caches in that 50 miles in one PQ of 500 on a daily basis, you might be happy with that. I believe MOST people would say, "Wow - now I can extend that radius to 150 miles instead," and use up the 3 PQs. That is the point that I believe tozainamboku is trying to make.

Link to comment
As it stands now, you can run 5 PQs per day of 500 caches each: 2,500 caches. Seven days a week gives 17,500. If TPTB gave the "aggregators" the ability to see recently archived caches so they didn't have to do full downloads, would there be less of a load on the servers? No. What people would do is use their 5 PQs per day to get 17,500 caches that have recently been updated (since they wouldn't need to re-fetch the majority of caches in their offline database that didn't change). The load on the servers would be the same, and the "aggregators" would have offline databases of 175,000 caches (assuming 10% of caches get updated each week). That is more than a third of all the active caches worldwide. Groundspeak does not want to make it easy for people to copy a significant part of their database, but they do want to allow geocachers to get a reasonable number of caches to plan their cache outings.
Using these calculations, the value of having a one-day-old copy of the entire database of active caches is about $60/month. Allowing the information to be up to a week old drops that to under $10/month. Since the big worry is supposedly someone duplicating the whole database, why has nobody done it yet? Obviously nobody wants to, because the value of the database lies in being able to add logs to it. This explanation is tiresome to see repeated over and over again.

 

I believe MOST people would say, "Wow - now I can extend that radius to 150 miles instead," and use up the 3 PQs.
What, exactly, is wrong with the possibility that some people might get more bang for their buck?
Link to comment

...Using these calculations, the value of having a one-day-old copy of the entire database of active caches is about $60/month. Allowing the information to be up to a week old drops that to under $10/month. Since the big worry is supposedly someone duplicating the whole database, why has nobody done it yet? Obviously nobody wants to, because the value of the database lies in being able to add logs to it. This explanation is tiresome to see repeated over and over again. ...

 

To help extend that "argument": not only is it their data to do with as they please, they also have a strong interest in getting users to visit the website frequently and look for new/additional information.

Link to comment
I believe MOST people would say, "Wow - now I can extend that radius to 150 miles instead," and use up the 3 PQs.
What, exactly, is wrong with the possibility that some people might get more bang for their buck?

 

You took this out of context. Markwell was replying to the claim that allowing PQs of caches that no longer exist would lessen the server load, which it most likely would not.

Link to comment

I think it's quite obvious that certain folks have no idea of the dynamics of pulling differential PQs.

 

A differential PQ, one that only provides the caches that have changed in the past 7 days, is different each day. By definition, the PQ will contain however many caches have been logged or otherwise changed in the last 7 days, and one will never know exactly how many that is. Therefore you must provide enough "headroom" in your PQs to allow for that variation; if you didn't, you'd miss some cache updates. This means you'd never get anywhere near the theoretical limit alluded to in some of the above posts.
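
A quick sketch of that headroom problem (the numbers below are purely illustrative): because the true weekly change count fluctuates and is unknowable in advance, you must budget capacity above the average or silently miss updates, which keeps real throughput well under the theoretical cap.

import math

avg_changed = 1200   # illustrative: average caches changed per week in your area
headroom = 1.5       # safety factor for week-to-week variation

budget = avg_changed * headroom
pqs_needed = math.ceil(budget / 500)   # each PQ is capped at 500 caches

print(int(budget), pqs_needed)         # 1800 4 - well above the "perfect world" minimum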

 

Additionally, 17,500 caches a week is only achievable in a perfect world. I run a butt-load of PQs a week to cover all of our stomping grounds, and yet it only yields about 7K caches. 17,500? That number is nearly impossible to reach: a difference of just one day can shift several caches between queries. The work involved would be quite enormous, and you'd be chasing it weekly as folks add and remove caches.

 

Would folks expand their territory if the PQ system became efficient? You betcha. Would everyone expand the caches they pull? I doubt it. I wouldn't. I could maintain a large database, but I don't.

 

In fact, if I wanted to, I could download more than 17,500 caches; it'd just take me a few weeks to do it. The idea that the limits are there to prevent someone from stealing the database and starting a competing website is laughable.

 

Oh, and the share of caches updated weekly is about a quarter to a third. Only about half are updated monthly.

 

There are plenty of good reasons to provide a feature that removes archived caches from offline databases. There's only been one reason not to: Groundspeak just doesn't wanna.

Link to comment

Yeah, you are correct: there really is no debate, as it is the provider's choice what services they want to provide.

 

It is interesting, and kind of funny given that Groundspeak does not want to encourage offline databases, that one of the best things about my offline DB has been its availability when I could not get the geocaching.com website to load. I still could not go out with the latest data, but at least I had my recently updated offline DB to save the day.

Link to comment
This topic is now closed to further replies.