
Keeping other databases (GSAK, etc.) updated


W8TTS

Recommended Posts

As I think has been said many times in this thread, TPTB will not give you what you want. After a cache has been Archived, Groundspeak doesn't want to send out data about it, except in the All Finds PQs.

 

The best you can do is to figure out how to use the tool you are already using -- GSAK -- to solve your "stale data" problem. :blink:

 

The only problem is that you don't know it's stale until you bite into it. <_< And, I don't like stale anything.

Link to comment

My solution to this problem is to use an instant notification each time a cache is archived within 50 miles of my house. To set it up, go to "set up notifications" on your http://www.geocaching.com/my/ page, then, on the instant notifications page, go to the bottom of your list and select "Create a new notification." From there you can create Archive notices along with Publish notices (so you have a chance at FTF), but you have to create one for every cache type. I can't find a how-to page on this, so post here if you have any questions.

 

Once instant notifications are set up, you can watch your email for caches being archived or published. When you see an archive notification, go directly to that cache page and download its GPX file into the same directory where you save the PQs from your email. The next time you run "get data from email" in GSAK, that information will load automatically.

 

I know this is a low-tech approach, but until GSAK can read these emails and automatically archive the caches they mention, or Groundspeak starts attaching GPX files to this type of notification so GSAK can pick them up, this notification method is what we have.

 

Very low tech. <_< And in this day and age, not the way to do things. We're using satellites and GPSrs to find things, yet relying on a notification message and manual steps to keep the data up to date. IMHO, not the way to go.

 

If you run a PQ that asks for updates in the last 7 days, then it should include the Archived caches. That's an update, isn't it?

But . . . but . . . if TPTB are not going to give you that information, you have to set up your PQs differently. Either set them up by date placed to get a nearly full complement of 2,500 caches every day, or set them up around location centerpoints so that when you are heading in that direction, you get the most recent past logs for the caches in that area.

 

Since you are not going to get information about Archived caches, except in your All Finds PQ, you are going to have to figure out how to fix the problem you have with the stale data in your database.

 

I do not do what StarBrand does because I want the Past Logs to build. Sometimes it is the eighth Past Log that has the information that enables me to find the cache. However, if you think five Past Logs is enough, clear your database and start over with new fresh PQs . . .

Link to comment

 

But . . . but . . . if TPTB are not going to give you that information, you have to set up your PQs differently. Either set them up by date placed to get a nearly full complement of 2,500 caches every day, or set them up around location centerpoints so that when you are heading in that direction, you get the most recent past logs for the caches in that area.

 

Since you are not going to get information about Archived caches, except in your All Finds PQ, you are going to have to figure out how to fix the problem you have with the stale data in your database.

 

I do not do what StarBrand does because I want the Past Logs to build. Sometimes it is the eighth Past Log that has the information that enables me to find the cache. However, if you think five Past Logs is enough, clear your database and start over with new fresh PQs . . .

 

I can only hope that TPTB will see the wisdom in supplying the archived information.

 

I use the date placed and updated in the past 7 days, and it works for me.

 

I don't like the idea of clearing the DB and reloading all the time; mine builds as yours does.

Link to comment

Why don't you get the individual PQs each week instead of the "Updated in the past 7 days" PQ?

 

If you get all your PQs once or twice a week, the "Last .gpx update" filter will work very nicely for you, once you get your "Found" caches, and the other stale data, out of your existing database. <_<

Link to comment

I don't completely reload my GSAK database; I just apply updates. And I only select "updated in the past 7 days" in the PQs that I run for selected placement dates.

You really must think you are saving bandwidth by asking only for the caches that have been updated in the past 7 days. Jeremy would probably thank you, but if he said that was the method to use to maintain an offline database, then people would be able, after initially downloading the caches (which would take a few weeks), to maintain an offline copy of the Geocaching.com database. You can download 2,500 caches per day, or 17,500 caches per week. Let's say that 10% of caches get updated in an average week. Then update-only queries would let you maintain a database of 175,000 caches while running the maximum number of pocket queries with the maximum of 500 caches per query. And there are only 434,995 active caches in the world. <_< Now, I know you are probably using "updated in the last 7 days" to reduce the number of caches you have to download in your pocket queries each week. But Jeremy is much more concerned about the person who wants to make a copy of the Geocaching.com database for the purpose of competing with Groundspeak than with any bandwidth that is saved by your method. By not including the caches that have been archived in the last seven days, he is forcing you to download all caches, which limits you to 17,500 caches - about 4% of all the caches in the world. That should be plenty for your personal use, and since you'd be missing the other 96% of the caches, you will probably not be in a position to compete with GC.com :blink:
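
To make the arithmetic concrete, here is a quick back-of-the-envelope sketch in Python. The 2,500-per-day cap and the 434,995 active caches come from the post above; the 10% weekly update rate is that post's assumption, not a published figure:

PQ_CAP_PER_DAY = 2500      # 5 pocket queries x 500 caches each
ACTIVE_CACHES = 434995     # active caches worldwide, per the post above
WEEKLY_UPDATE_RATE = 0.10  # assumed fraction of caches updated per week

weekly_download = PQ_CAP_PER_DAY * 7                    # 17,500 caches/week
update_only_db = weekly_download / WEEKLY_UPDATE_RATE   # 175,000 caches

print(f"Full PQs sustain {weekly_download:,} caches "
      f"({weekly_download / ACTIVE_CACHES:.0%} of the world)")
print(f"Update-only PQs could sustain {update_only_db:,.0f} caches "
      f"({update_only_db / ACTIVE_CACHES:.0%} of the world)")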

 

Here's another approach. Maintain your offline database the way you are currently doing. But then, just before you load the caches into your GPS, run a pocket query that returns just the caches you are planning to go looking for. A newly built ad hoc Pocket Query will usually run within 10 minutes. Load the GPX file from this pocket query into GSAK and eliminate, from the caches you are planning to hunt, any that didn't get updated by that GPX file. Hope this helps.
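
For anyone comfortable with a little scripting, here is a minimal sketch of that last-minute check in Python. The file name and hunt list are placeholders, and the GPX 1.0 namespace is an assumption about the pocket query files of the time:

import xml.etree.ElementTree as ET

GPX_NS = "{http://www.topografix.com/GPX/1/0}"  # assumed PQ GPX namespace

def codes_in_gpx(path):
    """Return the set of GC codes (waypoint names) in a GPX file."""
    root = ET.parse(path).getroot()
    return {wpt.findtext(f"{GPX_NS}name") for wpt in root.iter(f"{GPX_NS}wpt")}

planned = {"GCXCFV", "GCXC51", "GCMWD3"}        # caches you meant to hunt
fresh = codes_in_gpx("fresh_area_query.gpx")    # the just-run ad hoc PQ

print("Hunt:", sorted(planned & fresh))
print("Skip (missing from the fresh PQ):", sorted(planned - fresh))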

 

Or you can continue to insist that Jeremy should allow you to get the archived caches and maybe you will convince him. While you are at it, maybe you can convince him to bring back virtual caches. :huh:

Link to comment
<snip>

 

Or you can continue to insist that Jeremy should allow you to get the archived caches and maybe you will convince him. While you are at it, maybe you can convince him to bring back virtual caches. <_<

Oh . . . now there's a good idea. :huh: I love the Virtual caches. :blink: Do you think we will get them back at the same time we start getting the Archived caches in our PQs? :P

Link to comment

Why don't you get the individual PQs each week instead of the "Updated in the past 7 days" PQ?

 

If you get all your PQs once or twice a week, the "Last .gpx update" filter will work very nicely for you, once you get your "Found" caches, and the other stale data, out of your existing database. <_<

 

Basically because 1) I'd have to run more PQs than I do now for no reason, and 2) I hate the idea of rebuilding a DB all the time, because I may have entered user data that would be lost. Besides, there are 2,500 active caches within 50 miles of my home, and there'd be a lot more if Lake Erie weren't 30 miles north of here.

 

And, like I said before, you do it your way and I'll do it my way.

Link to comment
<snip>

 

Or you can continue to insist that Jeremy should allow you to get the archived caches and maybe you will convince him. While you are at it, maybe you can convince him to bring back virtual caches. <_<

Oh . . . now there's a good idea. :huh: I love the Virtual caches. :blink: Do you think we will get them back at the same time we start getting the Archived caches in our PQs? :P

 

I like the virtual ones too. Since the local parks system and the NPS don't allow caches, it was the only way to do them there. At least the Ohio State Parks like them; in fact, they are even placing some of their own.

Link to comment

Of course this is what Groundspeak would prefer. Pocket Queries were not created with the intention that you would maintain an offline database of all the caches in your area of interest. Tools like GSAK came along later, and people began to use them to maintain their databases. So of course they want a way to remove the archived caches from their database (or at least mark them archived). TPTB have repeatedly said that they won't provide this capability, yet they continue to refer people who ask for it to the forums. The only thing the forum regulars can do is describe how they maintain their offline databases. I basically get a whole bunch of PQs over the course of the week to cover the areas I generally cache in. This downloads every cache in the area (although I suppose I could leave out the caches I've already found and just run the My Finds query every so often to update those). Each week I run a GSAK filter to show the caches in my area that haven't been updated in a week. These are the archived caches. I usually delete them, but if it's a cache I've found or Did Not Find, I'll mark it archived and leave it in my database for statistical reasons.

 

Here's the fallacy in this train of thought about caches not being updated in two days. It's been stated that "You can then use GSAK to find the caches in that area that haven't been updated in the past two days. These would be the archived caches. It's pretty easy, once you accept Groundspeak's reasoning." This two-day rule - or, actually, any number of days since the last update - is not a good way to do this. Why?

 

I looked back at my finds and found a number that had a Last GPX date much earlier than my find. For example, when I found Dick Cheney Cache (GCXCFV), its GPX date was 7/1/2007 and I found it on 7/26/2007. If I had used your method, I would have thought the cache was archived and not looked for it. But I looked for it and found it.

 

And Rock Clingers Clinging Here (GCXC51): last GPX date 6/10/2007; I found it on 7/26/2007.

 

And Lost Dog (GCMWD3): last GPX date 7/1/2007; I found it on 7/26/2007.

 

And, so on . . .

 

So, if I had used your two days of no updates to assume that a cache was archived, I wouldn't have looked for these.

 

What you are assuming is that every active cache is updated every day or two, and this just isn't true. A cache is only updated when someone finds it and logs it, or when there is some maintenance on it. Rock Clingers Clinging wasn't updated for a month and a half; it wasn't archived, it was still there and active.

 

Think about caches that aren't available in the winter, maybe no activity for two or more months. They aren't archived, just unavailable.

Link to comment

The problem lies with your insistence that the last update filter won't work. It does - if you run a complete PQ and not just an update PQ.

 

We are trying to help you given that it is highly unlikely that archived caches will ever be included. Please accept that help.

Link to comment

Here's the fallacy in this train of thought about caches not being updated in two days. It's been stated that "You can then use GSAK to find the caches in that area that haven't been updated in the past two days. These would be the archived caches. It's pretty easy, once you accept Groundspeak's reasoning." This two-day rule - or, actually, any number of days since the last update - is not a good way to do this. Why?

 

<snip>

 

What you are assuming is that every active cache is updated every day or two, and this just isn't true. A cache is only updated when someone finds it and logs it, or when there is some maintenance on it. Rock Clingers Clinging wasn't updated for a month and a half; it wasn't archived, it was still there and active.

 

Think about caches that aren't available in the winter, maybe no activity for two or more months. They aren't archived, just unavailable.

I think you're missing that the "last GPX update" field is updated every time you load the cache from a GPX file, whether or not the cache itself was actually updated. But for this to work you have to request all caches, not just the ones that were updated in the last 7 days. This does not reset your database: GSAK is smart about merging logs into your database and retaining any notes or corrected coordinates you may have.
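
Outside of GSAK itself, that bookkeeping can be sketched in a few lines of Python: stamp every cache that appears in this week's GPX files, and anything whose stamp falls behind the refresh cycle has dropped out of the PQs. The paths, the 10-day cutoff, and the GPX namespace are assumptions for illustration, not GSAK's actual internals:

import datetime
import glob
import json
import os
import xml.etree.ElementTree as ET

GPX_NS = "{http://www.topografix.com/GPX/1/0}"  # assumed PQ GPX namespace
STATE_FILE = "last_seen.json"  # placeholder: persisted "last GPX update" stamps

last_seen = {}
if os.path.exists(STATE_FILE):
    with open(STATE_FILE) as f:
        last_seen = json.load(f)

# Stamp every cache present in this week's pocket queries.
today = datetime.date.today().isoformat()
for gpx in glob.glob("pocket_queries/*.gpx"):
    for wpt in ET.parse(gpx).getroot().iter(f"{GPX_NS}wpt"):
        code = wpt.findtext(f"{GPX_NS}name")
        if code:
            last_seen[code] = today

with open(STATE_FILE, "w") as f:
    json.dump(last_seen, f)

# Anything not stamped for longer than one refresh cycle has dropped out
# of the PQs - presumably archived (or found/filtered out of the queries).
cutoff = (datetime.date.today() - datetime.timedelta(days=10)).isoformat()
stale = sorted(code for code, seen in last_seen.items() if seen < cutoff)
print(f"{len(stale)} caches presumed archived:", stale[:10])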

 

I understand that by getting only caches that have been updated in the last 7 days you can get your update with fewer PQs. I think I have explained why Groundspeak actually may prefer you use your pocket queries less efficiently.

 

Apparently you can't take no for an answer (OK, I don't speak for Jeremy so perhaps he will change his mind, but this has been asked many times and so far he hasn't). So keep asking. I'm done.

Link to comment

Why don't you get the individual PQs each week instead of the "Updated in the past 7 days" PQ?

 

If you get all your PQs once or twice a week, the "Last .gpx update" filter will work very nicely for you, once you get your "Found" caches, and the other stale data, out of your existing database. <_<

 

Basically because 1) I'd have to run more PQs than I do now for no reason, and 2) I hate the idea of rebuilding a DB all the time, because I may have entered user data that would be lost. Besides, there are 2,500 active caches within 50 miles of my home, and there'd be a lot more if Lake Erie weren't 30 miles north of here.

 

And, like I said before, you do it your way and I'll do it my way.

I'm here to tell you that the method we're describing will work just fine in your area. My pocket queries cover a radius of more than 70 miles around your first cache (which usually is close to your home coordinates - in this case, the Akron area). You could drive west nearly to Sandusky and not miss any caches.

 

I have all those Ohio caches, even though I live in Pittsburgh, because I love to go on impromptu roadtrips just like you described. My database covers a 150 mile radius from my home in Pittsburgh, giving me all caches I haven't found in a circle that includes parts of Pennsylvania, Ohio, West Virginia, Virginia, Maryland and New York.

 

This is all done with a set of queries that are run once each week, with the newest caches arriving on Friday just in time for my weekend caching. 3 or 4 queries run each day, separated by placement date range. And there's plenty of room for me to run other queries any day of the week in case I take a trip outside that 150 mile circle.

 

I accumulate all the old logs so they're available to read if I get stumped, or want to see who found the cache in January. I never have to dump old data.

 

Before a caching trip, I sort my 150 mile database by "Last GPX Date" and I check the couple dozen caches that have slipped off the pocket queries. Some I found, so I mark those off. The others I check online, and I mark them as disabled or archived. I do this while watching TV or listening to the radio. I am interested in seeing what happened to the caches that were disabled or archived. It's just a couple of mouseclicks per cache. When I'm done, I filter out all the found, disabled and archived caches, and load the "good" current data. If I'm in a hurry, I will skip the individual cache checks and just cut off the disabled, archived and found caches with a filter based on the last GPX date.

 

I have been doing this since January, and I've never searched for a cache based on data more than a week old. I've never hunted in vain for a cache that's been removed, though it's always possible. I've never missed a cache that was just published or just re-enabled.

 

I am here to say that it's possible to do all this. It's easy to do this. I'm not very technical, and I needed some help to do this. The guy who answered my questions when I was setting it up lives just a few miles away from you. He would be happy to show you all of this and way, way more. Geocachers like to help each other.

Link to comment

TPTB do notify you of archived caches via one solid method - email notifications. As such, I have written a script to scrape my incoming email and mark the caches in archival notifications as archived in GSAK. This does not involve any touching or scraping of Groundspeak's servers, and seems to be pretty much the best method.

 

Once I get the scripts solid, I plan to post them. Currently it's written as a Perl script, so it has a lot of dependencies and is not a 'one-click' operation anyone can use, unfortunately.
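
In the same spirit, here is a rough sketch of that idea in Python rather than Perl, using only the standard library. The mail server details are placeholders, the subject-line pattern is an assumption about what the notification emails contain, and since the GSAK side isn't specified, the codes are just written to a text file for whatever marks them archived offline:

import imaplib
import re

HOST, USER, PASSWORD = "imap.example.com", "you", "secret"  # placeholders

with imaplib.IMAP4_SSL(HOST) as imap:
    imap.login(USER, PASSWORD)
    imap.select("INBOX")
    # Assumption: archive notifications carry "Archive" in the subject.
    _, data = imap.search(None, '(SUBJECT "Archive")')
    codes = set()
    for num in data[0].split():
        _, msg = imap.fetch(num, "(BODY[HEADER.FIELDS (SUBJECT)])")
        subject = msg[0][1].decode(errors="replace")
        codes.update(re.findall(r"\bGC[0-9A-Z]+\b", subject))

# Hand the list to whatever marks caches archived offline.
with open("archived_codes.txt", "w") as out:
    out.write("\n".join(sorted(codes)))
print(f"Collected {len(codes)} archived cache codes")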

Link to comment

I see the last couple of postings do mention other ways of keeping other database programs up to date. My issue is that I have over 10,000 caches in my database (all in WA). I can only fully update it every month or so. Since each file is limited to 500 waypoints, the easy solution would be to run a PQ of the archived caches and let everything update that way. Plus, some archived caches had really good containers and locations.

As I have said many times before, "Why do you need so much data?" :huh:

 

You can't possibly go hunting for all 10,000 caches. All the data you need is on this website. Why do you personally have to keep all that data? :blink: Can't you just keep your local area, and areas you travel to/through frequently updated?

 

I only update the areas in my GSAK database that I think I will be traveling through or to, when I am headed in that direction. Otherwise I would have stale data. And no one wants stale data, depending on why a cache went from "active" to "Archived" in one step . . . <_<

 

Is there some reason you have to have all the caches in Washington state? What about those just across the border in Oregon and Idaho? Maybe some of those should be included, as well . . . :P

 

Actually, that was just my example... I also have ID. At one time I did have the whole PNW region (WA, ID, MT, WY, and OR), but after GSAK kept crashing when I'd try to update things, I had to start over... I try to get everything in WA (and ID) because it's easier than searching for caches directly on the website, and I know my DB is current within the week (with the exception of archived caches...). It'd be really nice to be able to download an archived cache listing to keep it all updated.

 

The Steaks

Link to comment

It is so simple: just add one option in the "that" section: "is archived".

 

The archived caches are part of the online database (I can see them, so it is online information).

 

And if you don't want to push this (to some, old) information too much, just add a restriction so that pocket queries for archived caches can only run once a week or month, whatever.

 

I would be very happy with it, because the "Last .gpx update" approach doesn't work for me.

Link to comment

I stand corrected. Maybe. :):)

 

I'm looking into doing my PQs differently, now that I understand how the "Last GPX date" works. We'll see how it goes. But I still feel that "Archived" caches should be available in a PQ. :P

Link to comment

Is there some reason you have to have all the caches in Washington state? What about those just across the border in Oregon and Idaho? Maybe some of those should be included, as well . . . :)

 

Maybe. He may travel all over the state and want to know where the active caches are. :P:)

Actually, the WA state PQ does capture some of the caches just across the border, considering the radius of the search pattern.

 

Yep. It all depends on how close to the border you are.

No it doesn't.

 

The search radius on a state PQ starts at the center of the state and overlaps borders to catch the far reaches of the state's corners. The keyword is radius, indicating that a circular region is being used rather than a polygon.

Link to comment

GSAK does handle archived caches very nicely if you set up your PQ's properly by placed date at regular intervals. Details can be found here.

 

I think you will find most people using the PQs, especially if they use GSAK or something similar, don't want the file size increased any more than it is by archived caches that don't really add value to the PQ or DB. With the methods in the quoted thread, I even keep the status of my found caches up to date.

Edited by baloo&bd
Link to comment

Is there some reason you have to have all the caches in Washington state? What about those just across the border in Oregon and Idaho? Maybe some of those should be included, as well . . . :)

 

Maybe. He may travel all over the state and want to know where the active caches are. :P:)

Actually, the WA state PQ does capture some of the caches just across the border, considering the radius of the search pattern.

 

Yep. It all depends on how close to the border you are.

No it doesn't.

 

The search radius on a state PQ starts at the center of the state and overlaps borders to catch the far reaches of the state's corners. The keyword is radius, indicating that a circular region is being used rather than a polygon.

 

Actually, it's really nice. I have the PQs set up with a point in the center of the state, a distance of 500 miles, and State: Washington.

 

Then I load that into my WA database, and repeat for ID.

 

The Steaks

Link to comment

It is so simple: just add one option in the "that" section: "is archived".

 

The archived caches are part of the online database (I can see them, so it is online information).

 

And if you don't want to push this (to some, old) information too much, just add a restriction so that pocket queries for archived caches can only run once a week or month, whatever.

 

I would be very happy with it, because the "Last .gpx update" approach doesn't work for me.

 

I think that's the consensus of this thread. The issue is that Groundspeak doesn't seem to want to implement this great feature.

Link to comment

TPTB do notify you of archived caches via one solid method - email notifications. As such, I have written a script to scrape my incoming email and mark the caches in archival notifications as archived in GSAK. This does not involve any touching or scraping of Groundspeak's servers, and seems to be pretty much the best method.

 

Once I get the scripts solid, I plan to post them. Currently it's written as a Perl script, so it has a lot of dependencies and is not a 'one-click' operation anyone can use, unfortunately.

Ohhh... That's it. Of course, that would only work if I had one for each type of cache for every 50 miles...

 

I understand that there has to be great demand for a feature for it to be recognized and actively debated by Groundspeak, so let's keep this going in the direction of being able to get a GPX of the requested archived caches, even if it's only once a week.

 

The Steaks

Link to comment

Ohhh... That's it. Of course, that would only work if I had one for each type of cache for every 50 miles...

 

I understand that there has to be great demand for a feature for it to be recognized and actively debated by Groundspeak, so let's keep this going in the direction of being able to get a GPX of the requested archived caches, even if it's only once a week.

 

I think the point being missed is that this has been addressed by "TPTB" on more than one occasion. GC recognizes that some may keep a limited offline database, but does not support them.

 

In regard to archived caches, they are viewed as gone - not able to be sought by the cacher - so they are considered information that isn't needed for download. If a user has a specific need, they probably also have either the GC number or the hider's name, both of which would make the cache accessible (e.g. you found it before it was archived and haven't logged it yet).

 

Not to offend (really), but you are in a very small, possibly vocal, minority who want to be able to have these in PQs. With GSAK, you get the best of both worlds.

Link to comment

....

 

I understand that there has to be great demand for a feature for it to be recognized and actively debated by Groundspeak, so let's keep this going in the direction of being able to get a GPX of the requested archived caches, even if it's only once a week.

 

The Steaks

What you seem to have missed in here somewhere is the fact that this has been brought up nearly weekly for over two years. It isn't going to happen anytime soon.

 

TPTB have actively stated that this is not going to be a part of any planned upgrade. There are good workarounds that are discussed here - use one of them.

 

Sorry to be so blunt but fact is fact.

Link to comment

Ohhh... That's it. Of course, that would only work if I had one for each type of cache for every 50 miles...

 

I understand that there has to be great demand for a feature for it to be recognized and actively debated by Groundspeak, so let's keep this going in the direction of being able to get a GPX of the requested archived caches, even if it's only once a week.

 

I think the point being missed is that this has been addressed by "TPTB" on more than one occasion. GC recognizes that some may keep a limited offline database, but does not support them.

 

In regard to archived caches, they are viewed as gone - not able to be sought by the cacher - so they are considered information that isn't needed for download. If a user has a specific need, they probably also have either the GC number or the hider's name, both of which would make the cache accessible (e.g. you found it before it was archived and haven't logged it yet).

 

Not to offend (really), but you are in a very small, possibly vocal, minority who want to be able to have these in PQs. With GSAK, you get the best of both worlds.

 

I do agree that archived caches are viewed as gone and not able to be sought by the cacher, but I don't agree that the information won't be needed for download. The only reason people ask for it is because they want or need it. I feel that if this request were implemented, the feature could be tracked to determine usage and guide further implementation.

 

Talking about the PQ implementation, what I forgot to say is that if the "Archived" checkbox is not checked, archived caches will not show up. If it is checked, only the archived caches will show up, and that PQ would be restricted to once a week (with the further restriction that no more than 3 separate archived PQs could run in a week's time).

 

I may be in "a very small, possibly vocal, minority who wants to be able to have these in PQ's," but you also have to remember that a lot of people never come into the forums, or just look around. Heck, I found this thread not because I was searching for it, but because it happened to look interesting enough to read. However, this is a feature I feel many people could and would use. I will say that the monitoring period right after implementation would see a rise in bandwidth usage, but after what I assume would be a week or two, usage would drop (because people wouldn't have to check each cache to make sure it hasn't been archived).

 

The Steaks

Link to comment

....

 

I understand that there has to be great demand for a feature for it to be recognized and actively debated by Groundspeak, so let's keep this going in the direction of being able to get a GPX of the requested archived caches, even if it's only once a week.

 

The Steaks

What you seem to have missed in here somewhere is the fact that this has been brought up nearly weekly for over two years. It isn't going to happen anytime soon.

 

TPTB have actively stated that this is not going to be a part of any planned upgrade. There are good workarounds that are discussed here - use one of them.

 

Sorry to be so blunt but fact is fact.

 

Hmm. Asked for over and over. That should tell you and the powers at geocaching.com that it's a wanted option.

Link to comment

GSAK does handle archived caches very nicely if you set up your PQ's properly by placed date at regular intervals. Details can be found here.

 

I think you will find most people using the PQs, especially if they use GSAK or something similar, don't want the file size increased any more than it is by archived caches that don't really add value to the PQ or DB. With the methods in the quoted thread, I even keep the status of my found caches up to date.

 

GSAK provides an automated system for updating its database. Handling archived caches, which some people like to keep, is a manual update that is subject to error. It would be so much better if it could be done automatically when the cache is archived.

Link to comment

it is so simple: just add one option in the section "that": "is archived"

 

Since the archived caches are a part of the online database (I can see them! so it is online information).

 

And when you don't want to push the (for some old) information too much, just put an option that pocket queries for archived caches only can run once a week/month whatever.

 

I would be very happy with it, because the last.gpx update doesn't work for me

 

I think that's the consensus of this thread. The issue is that Groundspeak doesn't seem to want to implement this great feature.

 

I wonder if we are ever going to hear from Groundspeak on this.

Link to comment

GSAK provides an automated system for updating its database. Handling archived caches, which some people like to keep, is a manual update that is subject to error. It would be so much better if it could be done automatically when the cache is archived.

 

Not to sound contrary, but I'm not too sure about the "subject to error" part. Many have been doing this for quite some time, myself for over two years, with no error.

Link to comment

I wonder if we are ever going to hear from Groundspeak on this.

 

They have, repeatedly, in other threads. They do not see a need for information on caches that no longer exist, nor do they support offline databases, due to the stale nature of those databases.

My offline database is not stale. The only problem for me is that I have to manually remove the archived caches. I could use a macro, but sometimes there is another reason a cache has not been updated: moving caches sometimes pop up, and I delete those and mark them not to be imported again. This is an extra hassle that would be avoided if I could get a PQ of archived caches. I could also avoid all this if I could get more than the last 5 logs when I get a list of caches to hunt.

Link to comment

The moment the PQ leaves the server the data is stale.

 

(beating a dead horse)

 

But in reality, it's not stale if nothing has changed for the cache you are looking at. Trust me, I have up-to-date info all the time. Of course, I don't know when caches are archived... Wouldn't it be nice to get a PQ of archived caches on request...

 

The Steaks

Link to comment

A really simple low-tech solution would be to include caches archived within the last month.

 

What I'd really like to see is an FTP download made available for each state (or the UK in my case). This would reduce the load on the database and email massively. I'm sure GSAK could even build in an FTP process to make it transparent. Everyone would be happy and PQs would be redundant - but that ain't ever gonna happen. :blink::blink::laughing:

Link to comment

Just to add my two cents to the subject.

 

While TPTB have said they don't want to create PQs of archived caches because they don't want cachers trying to find them, I would like a PQ of archived caches for that very same reason: because I don't want to find them. If I have created a series of PQs in anticipation of a trip, it would be nice to set them up and download them in advance, and then at the last minute run another that downloads just the caches that have changed recently. The current 'updated in 7 days' PQ filter is a bit useless, because normal finds trigger the 'updated' status and flood the PQ with caches that have not really changed, using up the 500-cache limit. Setting up notifications on archived caches does not always work, because often the area does not fit within a 50-mile radius - and before someone mentions caching along a route, they should read all the threads that describe the bugs it has.

 

I would agree that there is no reason to get caches that were archived a year or even a month ago, but within the last 7 days is reasonable. So I would propose changing the 'updated in the last 7 days' filter so that it is not triggered by 'Found' logs and does include caches archived within that time. If someone is annoyed by the removal of found caches from the 'updated in the last 7 days' queries, they can easily set up a PQ for caches 'found in the last 7 days'.
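
To illustrate the proposal, here is a client-side sketch of those semantics in Python: a cache counts as "changed" only if it has a recent log that is not a plain "Found it". The file name is a placeholder, and the Groundspeak extension namespace and log element names are my assumptions about the PQ GPX format of the day:

import datetime
import xml.etree.ElementTree as ET

GPX = "{http://www.topografix.com/GPX/1/0}"       # assumed namespaces
GS = "{http://www.groundspeak.com/cache/1/0}"

def recently_changed(wpt, cutoff):
    """True if any non-'Found it' log is newer than the cutoff date."""
    for log in wpt.iter(f"{GS}log"):
        log_type = (log.findtext(f"{GS}type") or "").strip()
        log_date = (log.findtext(f"{GS}date") or "")[:10]  # YYYY-MM-DD
        if log_type != "Found it" and log_date >= cutoff:
            return True
    return False

cutoff = (datetime.date.today() - datetime.timedelta(days=7)).isoformat()
root = ET.parse("weekly_query.gpx").getroot()     # placeholder file name
changed = [wpt.findtext(f"{GPX}name") for wpt in root.iter(f"{GPX}wpt")
           if recently_changed(wpt, cutoff)]
print("Caches with meaningful changes:", changed)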

Link to comment

Just to add my two cents to the subject.

 

While TPTB have said they don't want to create PQs of archived caches because they don't want cachers trying to find them, I would like a PQ of archived caches for that very same reason: because I don't want to find them. If I have created a series of PQs in anticipation of a trip, it would be nice to set them up and download them in advance, and then at the last minute run another that downloads just the caches that have changed recently. The current 'updated in 7 days' PQ filter is a bit useless, because normal finds trigger the 'updated' status and flood the PQ with caches that have not really changed, using up the 500-cache limit. Setting up notifications on archived caches does not always work, because often the area does not fit within a 50-mile radius - and before someone mentions caching along a route, they should read all the threads that describe the bugs it has.

 

I would agree that there is no reason to get caches that were archived a year or even a month ago, but within the last 7 days is reasonable. So I would propose changing the 'updated in the last 7 days' filter so that it is not triggered by 'Found' logs and does include caches archived within that time. If someone is annoyed by the removal of found caches from the 'updated in the last 7 days' queries, they can easily set up a PQ for caches 'found in the last 7 days'.

 

But what about those of us who DO want to know the caches that have been archived in the last 6 months? It usually takes me that long to build a complete DB. I just hate having to start over every few months to 'sort out' the dead caches...

Link to comment

I jumped into Clayjar's chat tonight, and asked "Would it be useful to have an option to get a PQ of archived caches?"

 

5 people responded. 4 said they would like the option in some way, 1 said they only wanted active caches.

 

I know this is just a smidgen of the whole community, but I think it shows a random cross-section.

 

The Steaks

Link to comment

Just to add my two cents to the subject.

 

While TPTB have said they don't want to create PQs of archived caches because they don't want cachers trying to find them, I would like a PQ of archived caches for that very same reason: because I don't want to find them. If I have created a series of PQs in anticipation of a trip, it would be nice to set them up and download them in advance, and then at the last minute run another that downloads just the caches that have changed recently. The current 'updated in 7 days' PQ filter is a bit useless, because normal finds trigger the 'updated' status and flood the PQ with caches that have not really changed, using up the 500-cache limit. Setting up notifications on archived caches does not always work, because often the area does not fit within a 50-mile radius - and before someone mentions caching along a route, they should read all the threads that describe the bugs it has.

 

I would agree that there is no reason to get caches that were archived a year or even a month ago, but within the last 7 days is reasonable. So I would propose changing the 'updated in the last 7 days' filter so that it is not triggered by 'Found' logs and does include caches archived within that time. If someone is annoyed by the removal of found caches from the 'updated in the last 7 days' queries, they can easily set up a PQ for caches 'found in the last 7 days'.

 

But what about those of us who DO want to know the caches that have been archived in the last 6 months? It usually takes me that long to build a complete DB. I just hate having to start over every few months to 'sort out' the dead caches...

:lol:

 

(beating a dead horse)

 

:anitongue:

Link to comment
But what about those of us who DO want to know the caches that have been archived in the last 6 months? It usually takes me that long to build a complete DB.

I think that post right there illustrates exactly why Groundspeak frowns on offline databases, and isn't going to add features to support them. If parts of your database are up to 6 months old, you not only will have archived caches in there, you'll also have disabled, moved, replaced and just plain missing caches in there.

 

Run a fresh PQ right before you go out and you'll know you have current information on active caches. Simple.

Link to comment

GSAK provides an automated system for updating its database. Handling archived caches, which some people like to keep, is a manual update that is subject to error. It would be so much better if it could be done automatically when the cache is archived.

 

Not to sound contrary, but I'm not too sure about the "subject to error" part. Many have been doing this for quite some time, myself for over two years, with no error.

 

But it's a manual process. How about the non-techie cachers who can't figure out how to do it the way you've been doing it? I have a friend, a great cache hunter, but a complete box of rocks when it comes to computers. Trying to explain your method to him would not work. Make it automated, so that everyone can benefit.

Link to comment

Doing one additional filter in GSAK shouldn't be very hard to figure out, even for someone who is a "complete box of rocks" when it comes to computers, if they are already using GSAK.

 

All they have to do is download fresh PQs to their database and run the "Last .gpx Update" filter. Easy peasy. <_<

Link to comment

Doing one additional filter in GSAK shouldn't be very hard to figure out, even for someone who is a "complete box of rocks" when it comes to computers, if they are already using GSAK.

 

All they have to do is download fresh PQs to their database and run the "Last .gpx Update" filter. Easy peasy. :o

 

I can't believe how stubborn some people are. Every other part of this is completely automatic. It's just this one thing that isn't. Manual processes are stupid and have NO PLACE in the world of automation that GSAK can provide.

 

GSAK is getting close to being able to automate this. If the new "Check Status" functionality were made available as a function in a macro, I could easily write a macro that would automatically clean out the garbage. Not quite as nice as Groundspeak actually providing the data, but better than going through your "not found, no GPX in the last 10 days" list and manually checking them all.

 

And just how much busier would the Groundspeak servers be if it weren't for the offline GSAK databases? I believe that GSAK is taking a significant load off of their servers and they should be happy to support it.

Edited by ak8b
Link to comment

:o Is the previous post directed at me? :laughing:

 

If so, I sure don't understand. I just updated my GSAK database and when I did the "Last .gpx Update" filter, it returned four caches. That was few enough for me to check the individual cache pages and Delete the Archived or recently-Disabled caches from my database.

 

Is there a problem doing it that way . . . ? :blink:

Link to comment

There is no manual checking. Go to the link that I provided earlier to see.

 

If you have your queries set up right, you just set a filter to check for caches with no GPX update in "x" days and delete them all. No need to check each one. GSAK even has macros already written to do it for you. Go ahead and check the few that come up; you will realize in short order that it isn't necessary.

 

You can keep trying to change GC's position on offline DB's, causing yourself further frustration, or learn some new tricks (workarounds) that do EXACTLY what you are trying to accomplish.

 

This is not being stubborn, just realistic about expectations.

Link to comment

I can't believe how stubborn some people are. Every other part of this is completely automatic. It's just this one thing that isn't. Manual processes are stupid and have NO PLACE in the world of automation that GSAK can provide.

 

GSAK is getting close to being able to automate this. If the new "Check Status" functionality were made available as a function in a macro, I could easily write a macro that would automatically clean out the garbage. Not quite as nice as Groundspeak actually providing the data, but better than going through your "not found, no GPX in the last 10 days" list and manually checking them all.

 

And just how much busier would the Groundspeak servers be if it weren't for the offline GSAK databases? I believe that GSAK is taking a significant load off of their servers and they should be happy to support it.

I can't believe how many people keep doing things Jeremy has asked them not to do because they feel they are taking a significant load off of Groundspeak's servers. Jeremy of course has a direct cost in having to run servers that support the people who access the website online to get the latest data - or who run several pocket queries each time they go looking for caches, to cover the areas they may be interested in. But he has an asset too, and wants to limit the amount of data you can copy from the website at any time. He is very generous in allowing up to 2,500 caches per day. He is probably a little miffed at Clyde for coming up with GSAK, though I expect he knew that someone would develop a way to maintain an offline database as soon as he started delivering PQs in GPX format with the logs and such. He has made a business decision that he would rather upgrade servers so users can get the latest information from the website each time they go caching than make it easy to maintain offline databases with pocket queries that report only changes, including when caches are archived. An unscrupulous person could use that kind of query to replicate the entire Geocaching database (or at least a large portion of it).

Link to comment

There is no manual checking. Go to the link that I provided earlier to see.

 

If you have your queries set up right, you just set a filter to check for caches with no GPX update in "x" days and delete them all. No need to check each one. GSAK even has macros already written to do it for you. Go ahead and check the few that come up; you will realize in short order that it isn't necessary.

 

You can keep trying to change GC's position on offline DB's, causing yourself further frustration, or learn some new tricks (workarounds) that do EXACTLY what you are trying to accomplish.

 

This is not being stubborn, just realistic about expectations.

 

There are people that don't understand that concept. See below.

 

.

Edited by W8TTS
Link to comment

Well, it happened today - the very thing that's made me ask for archived caches to be sent in PQs one way or another.

 

I was out caching with a buddy who is about as computer literate as a brick. It's taken two of us months of work just to get him to use a Palm instead of paper (save the trees), and he still writes down every GC number he finds in a notebook. He doesn't trust himself or the technology. Getting him to understand the "Last GPX date" concept would be next to impossible.

 

He had chosen the next cache to go after using his GPSr's "find closest" function. When he told me the GC number, I couldn't find it in my GPSr or Palm, and I told him I thought it might be archived. But, being himself, he insisted that we look for it. So we did.

 

Guess what? We couldn't find it. Which really bummed him out - he doesn't like to leave without finding the cache. And our 10-minute rule turned into a 20-minute hunt, and it was a 1.5/1.5. But I was finally able to talk him into giving up.

 

Now, if the "archived" caches had been available to him through a PQ, like I've asked, so that his list could be updated automatically, there wouldn't have been a problem.

 

And don't give me the line that we should have checked every cache against gc.com's files. There were over 400 caches within the 20-mile circle we were going to be working in, so that wasn't an option.

Link to comment

I can't believe how stubborn some people are. Every other part of this is completely automatic. It's just this one thing that isn't. Manual processes are stupid and have NO PLACE in the world of automation that GSAK can provide.

 

GSAK is getting close to being able to automate this. If the new "Check Status" functionality were made available as a function in a macro, I could easily write a macro that would automatically clean out the garbage. Not quite as nice as Groundspeak actually providing the data, but better than going through your "not found, no GPX in the last 10 days" list and manually checking them all.

 

And just how much busier would the Groundspeak servers be if it weren't for the offline GSAK databases? I believe that GSAK is taking a significant load off of their servers and they should be happy to support it.

I can't believe how many people keep doing things Jeremy has asked them not to do because they feel they are taking a significant load off of Groundspeak's servers. Jeremy of course has a direct cost in having to run servers that support the people who access the website online to get the latest data - or who run several pocket queries each time they go looking for caches, to cover the areas they may be interested in. But he has an asset too, and wants to limit the amount of data you can copy from the website at any time. He is very generous in allowing up to 2,500 caches per day. He is probably a little miffed at Clyde for coming up with GSAK, though I expect he knew that someone would develop a way to maintain an offline database as soon as he started delivering PQs in GPX format with the logs and such. He has made a business decision that he would rather upgrade servers so users can get the latest information from the website each time they go caching than make it easy to maintain offline databases with pocket queries that report only changes, including when caches are archived. An unscrupulous person could use that kind of query to replicate the entire Geocaching database (or at least a large portion of it).

 

If what you say about Jeremy is true, let's hear it from him, not from a third party.

Edited by W8TTS
Link to comment

:P Is the previous post directed at me? :laughing:

 

If so, I sure don't understand. I just updated my GSAK database and when I did the "Last .gpx Update" filter, it returned four caches. That was few enough for me to check the individual cache pages and Delete the Archived or recently-Disabled caches from my database.

 

Is there a problem doing it that way . . . ? :laughing:

 

Not a problem if you understand the concept - and some people don't. But why shouldn't it be automated like everything else? That's what we're getting at.

 

.

Link to comment

Well, it happened today - the very thing that's made me ask for archived caches to be sent in PQs one way or another.

 

I was out caching with a buddy who is about as computer literate as a brick. It's taken two of us months of work just to get him to use a Palm instead of paper (save the trees), and he still writes down every GC number he finds in a notebook. He doesn't trust himself or the technology. Getting him to understand the "Last GPX date" concept would be next to impossible.

 

He had chosen the next cache to go after using his GPSr's "find closest" function. When he told me the GC number, I couldn't find it in my GPSr or Palm, and I told him I thought it might be archived. But, being himself, he insisted that we look for it. So we did.

 

Guess what? We couldn't find it. Which really bummed him out - he doesn't like to leave without finding the cache. And our 10-minute rule turned into a 20-minute hunt, and it was a 1.5/1.5. But I was finally able to talk him into giving up.

 

Now, if the "archived" caches had been available to him through a PQ, like I've asked, so that his list could be updated automatically, there wouldn't have been a problem.

 

And don't give me the line that we should have checked every cache against gc.com's files. There were over 400 caches within the 20-mile circle we were going to be working in, so that wasn't an option.

Maybe your friend should just load a fresh PQ and skip the GSAK database.

Link to comment

Well, it happened today - the very thing that's made me ask for archived caches to be sent in PQs one way or another.

 

I was out caching with a buddy who is about as computer literate as a brick. It's taken two of us months of work just to get him to use a Palm instead of paper (save the trees), and he still writes down every GC number he finds in a notebook. He doesn't trust himself or the technology. Getting him to understand the "Last GPX date" concept would be next to impossible.

 

He had chosen the next cache to go after using his GPSr's "find closest" function. When he told me the GC number, I couldn't find it in my GPSr or Palm, and I told him I thought it might be archived. But, being himself, he insisted that we look for it. So we did.

 

Guess what? We couldn't find it. Which really bummed him out - he doesn't like to leave without finding the cache. And our 10-minute rule turned into a 20-minute hunt, and it was a 1.5/1.5. But I was finally able to talk him into giving up.

 

Now, if the "archived" caches had been available to him through a PQ, like I've asked, so that his list could be updated automatically, there wouldn't have been a problem.

 

And don't give me the line that we should have checked every cache against gc.com's files. There were over 400 caches within the 20-mile circle we were going to be working in, so that wasn't an option.

Maybe your friend should just load a fresh PQ and skip the GSAK database.

 

Maybe GC should just do it the correct way and provide archived caches.

 

.

Link to comment
This topic is now closed to further replies.