
PQs - archived caches


Pajaholic


I'm gutted. The cache I was saving for my 100th was archived last month and I've only just found out about it. I suspect the reason for this is that pocket queries seem not to return archived caches and I'm relying on GSAK, updated with weekly PQs, rather than searching through GC.com each time.

 

I can't find a way of either including archived caches in my PQs or just having a PQ return only archived caches.

 

Can someone point the way?

 

TIA,

 

Geoff

Link to comment

Pocket queries, by design, do not return any archived caches. The reasoning is that if they are archived, then the cache can't be searched for. The purpose of PQs is to provide a list of active caches to be searched for based on your specific criteria. They are not to be used to maintain off-line databases. They do provide the data for creating one, though!

 

Since you use GSAK, the best way I've found to weed out the archived ones is to use the "Last GPX" date column (add it via the "View" function at the top). After you load your latest PQ, sort by the "Last GPX" field. The oldest ones are most likely the archived ones (or they no longer meet your current PQ criteria). You can open them one by one on-line to verify their status.
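Not a GSAK macro, but the idea is simple enough to sketch in a few lines of Python. This assumes you've exported the grid to a CSV with hypothetical "code" and "last_gpx" columns (GSAK's real export column names may differ); it lists the caches whose Last GPX date is older than the weekly PQ cycle so you can check them on the site:

import csv
from datetime import datetime, timedelta

STALE_AFTER = timedelta(days=7)  # anything the weekly PQs haven't refreshed

def suspect_archived(export_path, now=None):
    # Return GC codes whose "Last GPX" date is older than the latest PQ cycle.
    now = now or datetime.now()
    suspects = []
    with open(export_path, newline="") as f:
        for row in csv.DictReader(f):
            last_gpx = datetime.strptime(row["last_gpx"], "%Y-%m-%d")
            if now - last_gpx > STALE_AFTER:
                suspects.append(row["code"])
    return suspects

for gc in suspect_archived("gsak_export.csv"):
    print("http://www.geocaching.com/seek/cache_details.aspx?wp=" + gc)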

Edited by Cache O'Plenty
Link to comment

Pocket queries, by design, do not return any archived caches. The reasoning is that if they are archived, then the cache can't be searched for. The purpose of PQs is to provide a list of active caches to be searched for based on your specific criteria. They are not to be used to maintain off-line databases. They do provide the data for creating one, though!

 

Since you use GSAK, the best way I've found to weed out the archived ones is to use the "Last GPX" date column (add it via the "View" function at the top). After you load your latest PQ, sort by the "Last GPX" field. The oldest ones are most likely the archived ones (or they no longer meet your current PQ criteria). You can open them one by one on-line to verify their status.

 

Since the PQ filters out archived caches, why is GSAK needed to weed out archived caches? I'm confused.

Link to comment
Since the PQ filters out archived caches, why is GSAK needed to weed out archived caches? I'm confused.

 

I think the point is that there are archived caches in an offline GSAK database. Therefore if you're using GSAK to house the cache list, you need to use GSAK to eliminate the archived caches. So it isn't really a PQ question so much as a GSAK question.

Edited by Markwell
Link to comment

Thanks guys. I guess that GS will never provide exactly the service that I thought I was signing up for when I bought premium membership. It's come as a bit of a shock that I need six overlapping PQs to cover the area I need, that I can only run five PQs per day, and that they don't provide the means to automatically mark caches as archived.

 

Also, PQs only return the 5 most recent logs, whereas an offline GSAK database usually maintains the full log history from the time you first added a cache to the database. IMO the last five logs just aren't enough. Those last 5 could all be (and sometimes have been) DNFs followed by an owner maintenance log, and you need the earlier logs to get a hint or two.

 

I wonder whether it's possible to compare two GSAK databases: the "master" and one formed from just the latest PQ run? Then any cache in the master but not in the latest run is likely to be archived.
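That comparison is easy enough to do outside GSAK as well. Here's a minimal Python sketch, assuming "master_export.gpx" is a GPX export of the master database and the pq_*.gpx files are this week's PQs (the filenames are placeholders, and the namespace is the GPX 1.0 one PQ files used; adjust if yours differ). Any waypoint in the master but not in the fresh PQs is a candidate for having been archived:

import glob
import xml.etree.ElementTree as ET

GPX_NS = "{http://www.topografix.com/GPX/1/0}"  # namespace used by the PQ GPX files

def gc_codes(gpx_path):
    # Collect the GC codes (the waypoint <name> elements) from one GPX file.
    root = ET.parse(gpx_path).getroot()
    return {wpt.find(GPX_NS + "name").text for wpt in root.iter(GPX_NS + "wpt")}

master = gc_codes("master_export.gpx")
latest = set()
for pq in glob.glob("pq_*.gpx"):
    latest |= gc_codes(pq)

for code in sorted(master - latest):
    print(code, "is in the master but not in this week's PQs - probably archived")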

Link to comment

Hi Pajaholic

 

Your remark about "overlapping PQs" suggests that you haven't mastered the use of date range to define an area. Rather than pull all the caches in a radius, with overlap, pull 1000 caches between certain dates.

If you use this technique, you'll be able to get 5000 unique caches in a large area.

 

You can keep grabbing and grabbing caches for the same area over and over, each time getting the last 5 logs.

 

You can use GSAK filters to eliminate the archived, and add the new logs to the GSAK database with each new PQ.

Link to comment
Your remark about "overlapping PQs" suggests that you haven't mastered the use of date range to define an area. Rather than pull all the caches in a radius, with overlap, pull 1000 caches between certain dates.

If you use this technique, you'll be able to get 5000 unique caches in a large area.

Thanks for your response. However, I can't see that option. Can you help further?

 

FWIW, I've found a checkbox for "Updated within the last 7 days" but nothing to choose caches updated between user-specified dates. While I've found options for caches placed between user-specified dates, that's not the same thing. Since I don't run PQs at the same time every week, "Updated within the last 7 days" doesn't do what I need. That said, GSAK reports that about a third to a half of all caches in each PQ have been updated, and that's not counting those that have been updated by one of the overlapping PQs already loaded. Thus I suspect I'd only be able to reduce the number of PQs by one or two.

 

Thanks again,

 

Geoff

Link to comment

Read my pq page, specifically the second bullet in the tips and tricks section.

Thanks, but that seems potentially difficult to maintain. As caches get archived, the number of caches per "split date query" in the area in which you're interested reduces while the number of caches returned outside that area (which would clutter up your GSAK database) increases. That is, the "signal to noise ratio" of each PQ reduces with time. As time progresses, you have to add more PQs to the "front end" of the chain or you have to spend a long time every so often rearranging the cut-off dates. From time to time you'd also need to check the geographical boundaries actually being returned, and they would be different for each of your "split-date" queries. I do appreciate your input, but your method just seems another kludge to work around a limitation that GS have imposed, and no better than the overlapping PQs I'm currently using.

 

FWIW, I'd prefer to just have one PQ that returned every cache within a distance of a specified centre and have the distance limited rather than the number of caches as it's done now. It wouldn't cost GS any more resources (and would probably save some due to the removal of overlaps) if they allowed you to run one (say) 100-mile radius, unlimited-caches PQ per day or up to five of the current PQs that are limited to 1,000 caches each. Even better would be allowing us to specify a polygon in the same way that GSAK does, perhaps limiting the enclosed area.

 

Thanks again,

 

Geoff

Edited by Pajaholic
Link to comment
FWIW, I'd prefer to just have one PQ that returned every cache within a distance of a specified centre and have the distance limited rather than the number of caches as it's done now.

AFAIK it does that. Regular PQs return the caches sorted by distance from the origin, and after the cache limit is reached the results are simply cut off. That means if you select 1000 caches within a 500-mile radius, you'll get the 1000 closest to the origin. Then again, that may not be what you want.
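In other words, the cutoff works something like this (a toy sketch of my understanding, not Groundspeak's actual code):

from math import radians, sin, cos, asin, sqrt

def miles(lat1, lon1, lat2, lon2):
    # Great-circle (haversine) distance in statute miles.
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 3959 * 2 * asin(sqrt(a))

def pq_results(caches, origin, radius=500, limit=1000):
    # Keep caches inside the radius, sort by distance from the origin,
    # then simply cut the list off at the limit.
    inside = [c for c in caches if miles(origin[0], origin[1], c["lat"], c["lon"]) <= radius]
    inside.sort(key=lambda c: miles(origin[0], origin[1], c["lat"], c["lon"]))
    return inside[:limit]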

Edited by dfx
Link to comment

I've often thought that gc.com could save themselves a lot of CPU cycles by allowing for rectangular sections if a user wanted to go that route. Much easier to compute simple bounds at the corners vs. the whole "points around a circle" thing.

 

x = cx + r * cos(a)

y = cy + r * sin(a)

 

I have my own caching "grid" areas (rectangles in the Front Range of Colorado) with names, and keep track of caches within each of those areas. Would be nice to be able to pull up PQs that match those as a matter of convenience. As it stands now, gc.com gets to go through the computational exercise above to see what fits within a radius of a circle, and then I get to go through the extra effort of post-processing everything with GSAK to restore things to rectangular sections. Seems the hard way around for those who are OK with the simpler approach.
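The post-processing side of that is trivial, which is rather the point. A rough sketch of sorting a PQ's caches into named rectangles (the grid names, coordinates and sample data here are made up):

# Made-up "grid" rectangles as (south, north, west, east) in decimal degrees.
GRIDS = {
    "Front Range North": (40.20, 40.80, -105.40, -104.80),
    "Front Range South": (39.50, 40.20, -105.40, -104.80),
}

def grid_for(lat, lon):
    # Return the first named rectangle containing the point, or None.
    for name, (south, north, west, east) in GRIDS.items():
        if south <= lat <= north and west <= lon <= east:
            return name
    return None

caches = [{"code": "GC12345", "lat": 40.35, "lon": -105.10}]  # placeholder data from a PQ
for c in caches:
    print(c["code"], "->", grid_for(c["lat"], c["lon"]))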

Edited by ecanderson
Link to comment
Thanks, but that seems potentially difficult to maintain. As caches get archived, the number of caches per "split date query" in the area in which you're interested reduces while the number of caches returned outside that area (which would clutter up your GSAK database) increases. That is, the "signal to noise ratio" of each PQ reduces with time. As time progresses, you have to add more PQs to the "front end" of the chain or you have to spend a long time every so often rearranging the cut-off dates.

I re-arrange the date set about two or three times a year. GSAK has a macro to help with that, taking your current data set and working out what the date ranges need to be so that each group stays somewhere under 1,000 caches.

From time to time you'd also need to check the geographical boundaries actually being returned, and they would be different for each of your "split-date" queries. I do appreciate your input, but your method just seems another kludge to work around a limitation that GS have imposed, and no better than the overlapping PQs I'm currently using.
The way I do it is one large boundary - 51 miles from a set of coordinates. Once a week, over the course of two days' worth of queries, I get 8,000 caches in that 51-mile radius using nine queries. When I last tweaked it on May 8, it was this setup (I've included the change in each set over the two weeks):

Jan-01-1995 - Apr-06-2006 - July 8: 973 caches; July 20: 971 caches - change: -2

Apr-07-2006 - Aug-31-2007 - July 6: 958 caches; July 20: 956 caches - change: -2

Sep-01-2007 - May-30-2008 - July 6: 981 caches; July 20: 978 caches - change: -3

May-31-2008 - Nov-23-2008 - July 6: 964 caches; July 20: 956 caches - change: -8

Nov-24-2008 - May-15-2009 - July 6: 956 caches; July 20: 954 caches - change: -2

May-16-2009 - Sep-01-2009 - July 6: 968 caches; July 20: 966 caches - change: -2

Sep-01-2009 - Feb-11-2010 - July 6: 970 caches; July 20: 969 caches - change: -1

Feb-12-2010 - May-08-2010 - July 6: 859 caches; July 20: 854 caches - change: -5

May-09-2010 - Dec-31-2011 - July 6: 763 caches; July 20: 869 caches - change: +106

So I leave the queries alone until the last one (going to 2011) gets really close to 1,000 - then adjust them all using the GSAK macro. Takes about 15 minutes of adjusting and usually lasts about three months unless there's some major glut in placement. This current grouping was set up on May 9 (about 10 weeks so far).
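The arithmetic the macro does is essentially this (a simplified sketch - it ignores the case where a lot of caches share a boundary date, which is why the cut-offs sometimes need nudging by a day):

def date_ranges(placed_dates, max_per_pq=990):
    # Split the sorted "date placed" values into runs of at most max_per_pq,
    # returning (first_date, last_date, count) for each pocket query.
    placed_dates = sorted(placed_dates)
    ranges = []
    for i in range(0, len(placed_dates), max_per_pq):
        chunk = placed_dates[i:i + max_per_pq]
        ranges.append((chunk[0], chunk[-1], len(chunk)))
    return ranges

# placed_dates would come from the offline database, e.g. a list of datetime.date objects.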

 

FWIW, I'd prefer to just have one PQ that returned every cache within a distance of a specified centre and have the distance limited rather than the number of caches as it's done now.

That is indeed how it works for radius-based caches (center point and distance rather than geographical boundary).

 

Even better would be allowing us to specify a polygon in the same way that GSAK does, perhaps limiting the enclosed area.
There are convoluted ways to do this with routes, but I do agree a polygon would be a great solution.

 

The only problem with polygon queries is that you would almost need to handle them like geographical boundary queries, where results are truncated by some characteristic other than distance, such as date placed. If you had a polygon (first the system would have to ensure that it is a closed polygon), you would have to limit it by choosing the first 1000 by date placed or by GC code (the numeric sequence), since radial distance from a polygon is pretty hard to do.

In the end though I do agree that polygon queries would be great.
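For what it's worth, the polygon test itself isn't the hard part - a standard ray-casting check does it in a few lines. A sketch (it treats lat/lon as planar, which is fine at PQ scales, and doesn't handle polygons crossing the 180° meridian):

def point_in_polygon(lat, lon, vertices):
    # vertices: list of (lat, lon) pairs; the last vertex joins back to the first.
    inside = False
    n = len(vertices)
    for i in range(n):
        lat_i, lon_i = vertices[i]
        lat_j, lon_j = vertices[(i + 1) % n]
        # Does this edge straddle the cache's latitude, and does a ray cast
        # due east from the cache cross it?
        if (lat_i > lat) != (lat_j > lat):
            crossing_lon = (lon_j - lon_i) * (lat - lat_i) / (lat_j - lat_i) + lon_i
            if lon < crossing_lon:
                inside = not inside
    return inside

print(point_in_polygon(51.5, -0.1, [(51.0, -1.0), (52.0, -1.0), (52.0, 0.5), (51.0, 0.5)]))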

Link to comment

FWIW, I'd prefer to just have one PQ that returned every cache within a distance of a specified centre and have the distance limited rather than the number of caches as it's done now. It wouldn't cost GS any more resources (and would probably save some due to the removal of overlaps) if they allowed you to run one (say) 100-mile radius, unlimited-caches PQ per day or up to five of the current PQs that are limited to 1,000 caches each. Even better would be allowing us to specify a polygon in the same way that GSAK does, perhaps limiting the enclosed area.

 

Holy cow!!! :(

 

A 100 mile radius may not be much in your neck of the woods...but in some areas that could easily be 10,000+ caches. That would be a h-u-g-e PQ!! :(

 

Why does anyone need that many results in a single PQ? :) I'm still using the 500 result PQs (based on radius, like you run, so I have some overlap) and in one week I get about 25 of them and they cover a LOT more area than I can find in a few months of daily caching.

Link to comment

Holy cow!!! :(

 

A 100 mile radius may not be much in your neck of the woods...but in some areas that could easily be 10,000+ caches. That would be a h-u-g-e PQ!! :(

 

Why does anyone need that many results in a single PQ? :) I'm still using the 500 result PQs (based on radius, like you run, so I have some overlap) and in one week I get about 25 of them and they cover a LOT more area than I can find in a few months of daily caching.

I suggested 100 miles as that would bring in about 5,000 caches where I live and from what I've seen I suspect that there aren't many places where 100 miles would bring in more than twice that - but it's only a suggestion and TPTB would set the limit. The thing is that you'd get every cache in the specified area (and polygon queries would be even more precise) and so you'd only need to run one of these each week to cover the whole area whereas you say you need 25?! :(

 

I suspect that one large polygon query would be more efficient than several smaller, geographically overlapping queries. However, this is all beside the point, since I started this thread to try to find a way to automatically update my offline database WRT archived caches, and that's something I still haven't "nailed".

Actually, I do know a way to do it, but it's not permitted by GS's TOU as it involves using a "scraper" to update all caches in the offline database likely to have been archived.

 

Geoff

Link to comment

Holy cow!!! :(

 

A 100 mile radius may not be much in your neck of the woods...but in some areas that could easily be 10,000+ caches. That would be a h-u-g-e PQ!! :(

 

Why does anyone need that many results in a single PQ? :) I'm still using the 500 result PQs (based on radius, like you run, so I have some overlap) and in one week I get about 25 of them and they cover a LOT more area than I can find in a few months of daily caching.

I suggested 100 miles as that would bring in about 5,000 caches where I live and from what I've seen I suspect that there aren't many places where 100 miles would bring in more than twice that - but it's only a suggestion and TPTB would set the limit. The thing is that you'd get every cache in the specified area (and polygon queries would be even more precise) and so you'd only need to run one of these each week to cover the whole area whereas you say you need 25?! :(

 

I suspect that one large polygon query would be more efficient than several smaller, geographically overlapping queries. However, this is all beside the point, since I started this thread to try to find a way to automatically update my offline database WRT archived caches, and that's something I still haven't "nailed".

Actually, I do know a way to do it, but it's not permitted by GS's TOU as it involves using a "scraper" to update all caches in the offline database likely to have been archived.

 

Geoff

 

I had a similar problem to yourself, and while I disagree with you that the PQ based on date placed option is difficult to manage (I've only had to change it twice since I set up the ranges), I do agree that archived caches can be tricky to maintain.

 

My solution is to use a couple of GSAK macros. I refresh the db on a daily / weekly basis by automating the downloading (grabbing the caches by email), and all my PQs run on a weekly cycle. This means that I have the most up to date db I can manage. I then run a macro to clean up the db of any caches which are temporarily unavailable (unless they have already been found). That then leaves me with a db with either active caches, found caches or (potentially) archived caches (as these by their nature will not have been updated - though practically quite a lot become temporarily unavailable first anyway).

 

I then run the "review for archive" macro. As all live caches should have a "last GPX date" of less than 7 days, it simply runs through those with an older date and displays the cache page in the browser. I simply tick "archived" (easy to see as there is red text at the top) or "ignore". Caches are not archived that rapidly, so its a five minute task every month or so across my 4800 cache db.

 

It's not perfect, but the logic of having PQs exclude archived caches is perfectly sound (they are archived, so why would you want to grab their details anyway!), so (short of screen scraping) this seems the most sensible option.

 

HTH!

 

Matt

Link to comment

Markwell's way works great. I do a variation that depends on running the PQs for your area once a week. With the 1,000-cache PQs, that means I can effectively cover an 85-mile radius with 12 PQs sorted by placed date. The 12th runs every day until it gets close to the limit; its date range is set from the day after the 11th PQ's range ends to some time in the future, and I keep an eye on the number.

 

I keep everything in one database rather than do as some and separate out the found.

 

Here are the high level steps;

 

1. I use the PlacedPQ macro in GSAK to break up my PQs into manageable segments. As mentioned above, I do this when my daily new PQ starts to reach max. The only type I filter out is events, as I run a separate PQ for those every 2 weeks or so.

 

2. Every day after loading up my PQs, I run a filter set to no GPX in last seven days/not found/events excluded/dnf excluded (I keep these for personal stat info). I could do this once a week as well, or even once a month as it would still apply, however it only takes seconds to do.

 

3. From the results of the filter, I simply delete everything. Since I run my PQs on a regular basis, anything not included is archived.

 

By doing this, the 11 weekly PQs shrink enough that it is rare (even when I used the 500 method) that I have to add another PQ, though it is possible for some. When I re-run the placed-date PQs, the adjustments are usually sufficient to reduce that 12th PQ. All my PQs are set to a max of 990 to allow for the occasional unarchiving, although that is so rare this is probably just me being paranoid.

 

Before anyone says it, I know the info is stale. It is stale the minute I download it. The business I own has me covering a 4000+ square mile area on a regular basis. If it was not for that, my radius would be much smaller. The PlacedPQ is the only macro I need to run, and that is occasional. I have done this now for several years and, when spot checking, have yet to find a problem.

Link to comment

Once again we have a geocacher who is trying to "help" Geocaching.com be more efficient by maintaining a copy of a significant portion of the Geocaching.com database locally on his computer. Of course he wants to help out by asking for changes to PQs that, in his opinion, would put less load on the Geocaching.com servers while allowing him to maintain a larger offline database.

 

Groundspeak is not interested in people maintaining copies of significant portions of the database, even if it means less load on the Geocaching.com servers. They view the database (the aggregation of geocache data) as intellectual property and exercise their rights to protect it. They want to make the point that there is only one place where you can get up to date information on geocaches. The geocaching data is constantly changing. Caches are enabled and disabled; new caches are published; caches get archived; people post logs for finding or not finding caches. Trying to duplicate any significant portion of this data and keep it up to date is not going to be successful and Groundspeak is likely to keep trying to prevent you from doing so.

 

Groundspeak provides pocket queries to premium members to allow them to download a reasonable number of geocaches which can be loaded onto a GPS or further processed with third-party tools to plan a geocaching outing. In some cases, the data can be used as an offline database for situations when the geocacher cannot access the Geocaching.com site. The pocket query data may also be used to generate personal geocaching statistics using third-party tools.

 

I'm only guessing, but I believe that most people probably get a Pocket Query of the caches they are going out to find and load it directly into their GPS (or use a tool to convert it into a format they can load). The idea of keeping an offline database never crosses their mind. However, a significant number of people, including myself, do build up a database of their local area and use it to plan geocaching outings. They have developed various techniques to ensure they have the latest (or at least relatively recent) updates when they are ready to go geocaching (see Markwell and other posts above).

 

The limitations that Groundspeak imposes mean that we have to "limit" the size of the offline database. There may be times when we don't have the caches for an area we are going to in our offline database. Since we have to get up-to-date data in any case, we often end up getting a fresh PQ of just the area we are heading out to.

 

For those who travel a lot and don't always know where they will be, there are now several approved applications for smartphones. These apps provide access to the online Geocaching.com database from anywhere you have a phone connection. You download specific caches and save them for when you are outside of cell phone coverage.

Link to comment
Groundspeak is not interested in people maintaining copies of significant portions of the database, even if it means less load on the Geocaching.com servers. They view the database (the aggregation of geocache data) as intellectual property and exercise their rights to protect it.

In this context I feel obligated to comment that all the content of cache listings, and therefore all the content of PQs, comes from the users, which means that it's the respective user who is the owner and copyright holder of the content of each cache listing, and not Groundspeak.

 

Of course, the TOU state that by submitting the content to the website, you grant Groundspeak full rights to the content, totally free of charge, allowing them to do whatever they want with it, even sell it. So there is that. It doesn't transfer copyright ownership to Groundspeak, though.

 

Now it's up to everyone individually to judge what the big plan behind restricting PQs is. Did Groundspeak fall victim to corporate greed and want to keep all the content they so conveniently have full rights over to themselves as much as possible, so they can put themselves in a monopoly position? Or do they want to protect the rights of the actual copyright owners, the users, by not giving their content out to third parties too easily? Your call...

Edited by dfx
Link to comment
Since the PQ filters out archived caches, why is GSAK needed to weed out archived caches? I'm confused.

 

I think the point is that there are archived caches in an offline GSAK database. Therefore if you're using GSAK to house the cache list, you need to use GSAK to eliminate the archived caches. So it isn't really a PQ question so much as a GSAK question.

 

Thank you. I didn't realize that there might be confusion wrt having to use GSAK to manage entities stored in a GSAK database.

Link to comment

You did read my archived caches page, right?

I did - thanks. Purging would remove the log history, which I don't want to do. Your second method seems better but I can't quite bring myself to just delete everything before the last GPX, so I've opened the suspect caches in GC.com and then marked them as archived in GSAK if GC.com says they're archived. If they "come back from the dead" at any later stage, their history will be resurrected in my offline database.

Link to comment
Groundspeak is not interested in people maintaining copies of significant portions of the database, even if it means less load on the Geocaching.com servers. They view the database (the aggregation of geocache data) as intellectual property and exercise their rights to protect it.

In this context I feel obligated to comment that all the content of cache listings, and therefore all the content of PQs, comes from the users, which means that it's the respective user who is the owner and copyright holder of the content of each cache listing, and not Groundspeak.

If you noticed, I indicated that it is the aggregation of the data that Groundspeak views as intellectual property. If you want to get each cache owner to send you a hand-built GPX file with the same information they provided to Geocaching.com to load in your offline database, go for it. Groundspeak doesn't want you to grab information from Geocaching.com, using either tools they have provided or third-party tools, for any purpose beyond what you agreed to when you clicked the Waypoint License Agreement. They believe that pocket queries provide you more than generous access to what you would need for those purposes.

Link to comment
Groundspeak is not interested in people maintaining copies of significant portions of the database, even if it means less load on the Geocaching.com servers. They view the database (the aggregation of geocache data) as intellectual property and exercise their rights to protect it.

In this context I feel obligated to comment that all the content of cache listings, and therefore all the content of PQs, comes from the users, which means that it's the respective user who is the owner and copyright holder of the content of each cache listing, and not Groundspeak.

But it's not just the cache description and coordinates that are the issue here. It's also the found, not found, and especially the archive and disable logs, many of which were written by people other than the cache owner. It is this aggregation that Groundspeak wants to protect.

Link to comment

I've been lurking for long enough. My thoughts:

 

1) 1 PQ per STATE would be nice to be able to download once per week (max 5 states). This would take the place of one day's worth of per-state PQs. Only "Available" caches would be included. These could be generated by Groundspeak once or twice a week as a 'canned' PQ.

 

2) Archived Caches - Would be nice to JUST get the GC code, and the Archived Attribute. Archived caches COULD be added to PQs ONLY for 2 WEEKS after Archival! (This is to Ensure that cachers are NOT searching for Archived Caches)

 

Currently, I have 6 PQs (by Date) for MO that I run Once a week, and then 3 for MO, and 3 for KS that are Only "Updated in the last 7 days". All my PQs are only Caches that "I haven't found", and are "Active".

 

The Steaks

Link to comment

1) 1 PQ per STATE would be nice to be able to download once per week (max 5 states). This would take the place of one day's worth of per-state PQs. Only "Available" caches would be included. These could be generated by Groundspeak once or twice a week as a 'canned' PQ.

OK, if I can choose 5 states, how about California (81,686), Washington State (20,624), Oregon (20,890), Nevada (10,040) and Arizona (16,506). Right now, I can get close to 35,000 caches per week. Your proposal would be 149,746 caches this week. Really? :laughing:

 

2) Archived Caches - Would be nice to JUST get the GC code, and the Archived Attribute. Archived caches COULD be added to PQs ONLY for 2 WEEKS after Archival! (This is to Ensure that cachers are NOT searching for Archived Caches)
I'd support that, but after years of asking and getting "no" I don't think Groundspeak is going to change on this.
Link to comment

I've been lurking for long enough. My thoughts:

 

1) 1 PQ per STATE would be nice to be able to download once per week (max 5 states). This would take the place of one day's worth of per-state PQs. Only "Available" caches would be included. These could be generated by Groundspeak once or twice a week as a 'canned' PQ.

 

2) Archived Caches - Would be nice to JUST get the GC code, and the Archived Attribute. Archived caches COULD be added to PQs ONLY for 2 WEEKS after Archival! (This is to Ensure that cachers are NOT searching for Archived Caches)

 

Currently, I have 6 PQs (by Date) for MO that I run Once a week, and then 3 for MO, and 3 for KS that are Only "Updated in the last 7 days". All my PQs are only Caches that "I haven't found", and are "Active".

 

The Steaks

 

Why just states? I live in Washington, why not British Columbia? You're on a very slippery slope here: every country has some sort of political boundaries that could be analogous to states. You would have to include each and every one of them. Including some with others might be a very politically touchy subject. As Markwell has pointed out, having five states can be a very large number.

 

I have never figured out the fascination with archived caches. I've heard some pretty good stories, but for the most part they come down to one simple reason ... so I can clean up my GSAK database.

Link to comment

Once again we have a geocacher who is trying to "help" Geocaching.com be more efficient by maintaining a copy of a significant portion of the Geocaching.com database locally on his computer. Of course he wants to help out by asking for changes to PQs that, in his opinion, would put less load on the Geocaching.com servers while allowing him to maintain a larger offline database.

 

[...]

 

For those who travel a lot and don't always know where they will be, there are now several approved applications for smartphones. These apps provide access to the online Geocaching.com database from anywhere you have a phone connection. You download specific caches and save them for when you are outside of cell phone coverage.

 

[edited by Pajaholic to remove what might be misconstrued as a personal attack]

 

I asked how to configure PQs to include archived caches. I thought that I must be missing something, since GS state in their "Getting started as a premium member" page that you can use pocket queries to:

Update the status of caches that you have already downloaded so that you don't head outdoors looking for a cache that is no longer there
That's all I'm trying to do, and I can't think of a way to do it using PQs unless they can include archived caches.

 

WRT smartphones :laughing: Where I live, they cost rather a lot unless you take out at least a 24-month contract that either costs more and/or gives a lot less than my present tariff. Somehow I don't think that anyone should have to buy a $500 phone just to work around Groundspeak's limitations. In any case and AFAICT smartphone apps are useless for "drive-by" caching where there's no 3G phone signal (which is the case in about a third of my territory).

Edited by Pajaholic
Link to comment

I asked how to configure PQs to include archived caches. I thought that I must be missing something, since GS state in their "Getting started as a premium member" page that you can use pocket queries to:

Update the status of caches that you have already downloaded so that you don't head outdoors looking for a cache that is no longer there
That's all I'm trying to do, and I can't think of a way to do it using PQs unless they can include archived caches.

 

Sure you can - wipe the data of your offline database with each load. That's the way to do it with Pocket Queries. The problem is that you want to accumulate the logs offline. That's where there's a breakdown.

Link to comment

Once again we have a geocacher who is trying to "help" Geocaching.com be more efficient by maintaining a copy of a significant portion of the Geocaching.com database locally on his computer. Of course he wants to help out by asking for changes to PQs that, in his opinion, would put less load on the Geocaching.com servers while allowing him to maintain a larger offline database.

 

[...]

 

For those who travel a lot and don't always know where they will be, there are now several approved applications for smartphones. These apps provide access to the online Geocaching.com database from anywhere you have a phone connection. You download specific caches and save them for when you are outside of cell phone coverage.

 

[edited by Pajaholic to remove what might be misconstrued as a personal attack]

 

I asked how to configure PQs to include archived caches. I thought that I must be missing something, since GS state in their "Getting started as a premium member" page that you can use pocket queries to:

Update the status of caches that you have already downloaded so that you don't head outdoors looking for a cache that is no longer there
That's all I'm trying to do, and I can't think of a way to do it using PQs unless they can include archived caches.

 

WRT smartphones :laughing: Where I live, they cost rather a lot unless you take out at least a 24-month contract that either costs more and/or gives a lot less than my present tariff. Somehow I don't think that anyone should have to buy a $500 phone just to work around Groundspeak's limitations. In any case and AFAICT smartphone apps are useless for "drive-by" caching where there's no 3G phone signal (which is the case in about a third of my territory).

 

Take a look at post #20.

Link to comment

Sure you can - wipe the data of your offline database with each load. That's the way to do it with Pocket Queries. The problem is that you want to accumulate the logs offline. That's where there's a breakdown.

Playing devil's advocate here. If they wipe their database then they lose their corrected coordinates, additional child waypoints and notes. If the site allowed for those items to be stored online and downloaded in the PQ then a database wipe wouldn't be a problem for most people.

Link to comment
Sure you can - wipe the data of your offline database with each load. That's the way to do it with Pocket Queries. The problem is that you want to accumulate the logs offline. That's where there's a breakdown.

Playing devil's advocate here. If they wipe their database then they lose their corrected coordinates, additional child waypoints and notes. If the site allowed for those items to be stored online and downloaded in the PQ then a database wipe wouldn't be a problem for most people.

That sounds like a design problem with the offline database, rather than a problem with the PQs. The PQs are working exactly the way Groundspeak designed them. For those of us that use them as intended, they work perfectly. If your offline database isn't working the way you'd like, complain to its author, not to Groundspeak. :laughing:

Link to comment

Sure you can - wipe the data of your offline database with each load. That's the way to do it with Pocket Queries. The problem is that you want to accumulate the logs offline. That's where there's a breakdown.

Playing devil's advocate here. If they wipe their database then they lose their corrected coordinates, additional child waypoints and notes. If the site allowed for those items to be stored online and downloaded in the PQ then a database wipe wouldn't be a problem for most people.

There is a way to prevent that in GSAK (or, at least "protect" the updated coords)...but, at this moment...I can't recall how...

Link to comment
Take a look at post #20.

I did, thanks. However, when I went through everything with a GPX date prior to the last run there were half a dozen non-archived caches there. I'll need to investigate why that happened, but in the meantime I'm wary of deleting everything with an old last-GPX date.

Link to comment

I do check all of the non-updated caches before actually archiving them - but therein lies the problem with radius searches versus date-placed ranges. The radius shrinks if you hit the maximum limit, and caches that aren't archived won't get updated. Also, you should make sure you're not limiting the caches to "active", because the disabled ones won't get updated either.

 

If you set up the date ranges, once you set up the group they shouldn't climb above the 1,000 maximum unless a whole bunch get un-archived (with the exception of the most recent date group). The multiple-radius approach won't work as reliably - caches move, one portion may gain a whole bunch in an area if someone places 50 caches in one park, etc.

Link to comment

That sounds like a design problem with the offline database, rather than a problem with the PQs. The PQs are working exactly the way Groundspeak designed them. For those of us that use them as intended, they work perfectly. If your offline database isn't working the way you'd like, complain to its author, not to Groundspeak. :laughing:

Wrong Devil. :laughing:

 

It's a shortcoming of the site since if you remove the offline database from the equation and load the PQs directly into the GPS you get the same problem. No corrected coordinates, no additional personal waypoints and no notes.

 

It's these site shortcomings that drive many to use an offline database in the first place. Then people run into the archived issues. The suggested fix was to wipe the database.

 

If you wipe the database you lose your corrected coordinates, additional personal waypoints, and notes. You're back to the site shortcomings again.

 

My point was that if the site had those items then people wouldn't need to do a database wipe, and most people wouldn't need an offline database at all.

Link to comment

 

Wrong Devil. :laughing:

 

It's a shortcoming of the site since if you remove the offline database from the equation and load the PQs directly into the GPS you get the same problem. No corrected coordinates, no additional personal waypoints and no notes.

 

It's these site shortcomings that drive many to use an offline database in the first place. Then people run into the archived issues. The suggested fix was to wipe the database.

 

If you wipe the database you lose your corrected coordinates, additional personal waypoints, and notes. You're back to the site shortcomings again.

 

My point was that if the site had those items then people wouldn't need to do a database wipe, and most people wouldn't need an offline database at all.

 

Easy solution to your issues...

 

Keep a copy of all your notes, corrected coordinates, additional waypoints in a second database (yes, for those who only use the Default database in GSAK...you can maintain multiple databases! I think I have about 25-30 of them...for various reasons.).

 

Wipe the main database before loading the new PQs. Copy the waypoints from the second database to the main database, updating existing waypoints only.

 

All your additional notes, corrected coordinates, etc will be carried over from the second database and since you wiped the database before loading...there will be no Archived caches.
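As a rough illustration of that flow (plain Python dicts standing in for the two GSAK databases; the cache codes and data here are made up): wipe the main set, reload it from the fresh PQs, then copy your own data back onto the waypoints that still exist:

def rebuild(main_db, fresh_pq_caches, extras_db):
    # Wipe the main database, reload it from the PQs, then re-apply personal data.
    # Only caches still present in the fresh PQs get the extras copied back,
    # so archived caches simply drop out.
    main_db.clear()
    for cache in fresh_pq_caches:
        main_db[cache["code"]] = dict(cache)
    for code, extras in extras_db.items():
        if code in main_db:  # "only updating existing waypoints"
            main_db[code].update(extras)
    return main_db

extras = {"GC1ABCD": {"corrected_coords": (51.1234, -0.5678), "note": "final is up the oak tree"}}
pq = [{"code": "GC1ABCD", "lat": 51.10, "lon": -0.56}, {"code": "GC2WXYZ", "lat": 51.20, "lon": -0.60}]
print(rebuild({}, pq, extras))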

 

:laughing:

 

This is NOT the method I use...but it does solve the issue you pointed out.

 

I use overlapping circles for my PQs and the Last GPX method of removing archives and I always have plenty of caches to look for. If I lose a few near the edges of a circle, who cares...eventually they will return as I find caches near the center.

 

And, I maintain a database which contains all my local PQs...so if I need to look at the BIG PICTURE then I can always use that database. More often than not, a waypoint lost from one PQ probably still exists in the adjacent PQ...so it's very rare that something is missing from that combined database unless it has been archived.

Link to comment

1) 1 PQ per STATE would be nice to be able to download once per week (max 5 states). This would take the place of one day's worth of per-state PQs. Only "Available" caches would be included. These could be generated by Groundspeak once or twice a week as a 'canned' PQ.

OK, if I can choose 5 states, how about California (81,686), Washington State (20,624), Oregon (20,890), Nevada (10,040) and Arizona (16,506). Right now, I can get close to 35,000 caches per week. Your proposal would be 149,746 caches this week. Really? :laughing:

 

Sorry, after putting this same idea out in a forum (over a YEAR ago), I had forgotten to mention that CA and TX would be broken into two. I think there may be another state that could be split as well, but I don't have access to the current numbers of caches per state. I was also throwing out a MAX of 5 states as an extreme, realizing that even I would only use 2; I would even be fine with a limit of 3. This would allow for all of CA, and Oregon.

 

The Steaks

Link to comment

1) 1 PQ per STATE would be nice to be able to download once per week (max 5 states). This would take the place of one day's worth of per-state PQs. Only "Available" caches would be included. These could be generated by Groundspeak once or twice a week as a 'canned' PQ.

OK, if I can choose 5 states, how about California (81,686), Washington State (20,624), Oregon (20,890), Nevada (10,040) and Arizona (16,506). Right now, I can get close to 35,000 caches per week. Your proposal would be 149,746 caches this week. Really? :laughing:

 

Sorry, after putting this same idea out in a forum (over a YEAR ago), I had forgotten to mention that CA and TX would be broken into two. I think there may be another state that could be split as well, but I don't have access to the current numbers of caches per state. I was also throwing out a MAX of 5 states as an extreme, realizing that even I would only use 2; I would even be fine with a limit of 3. This would allow for all of CA, and Oregon.

 

The Steaks

 

Let's not forget Florida (25,480)!! :laughing:

Link to comment

That sounds like a design problem with the offline database, rather than a problem with the PQs. The PQs are working exactly the way Groundspeak designed them. For those of us that use them as intended, they work perfectly. If your offline database isn't working the way you'd like, complain to its author, not to Groundspeak. :laughing:

Wrong Devil. :laughing:

 

It's a shortcoming of the site since if you remove the offline database from the equation and load the PQs directly into the GPS you get the same problem. No corrected coordinates, no additional personal waypoints and no notes.

 

It's these site shortcomings that drive many to use an offline database in the first place. Then people run into the archived issues. The suggested fix was to wipe the database.

 

If you wipe the database you lose your corrected coordinates, additional personal waypoints, and notes. You're back to the site shortcomings again.

 

My point was that if the site had those items then people wouldn't need to do a database wipe, and most people wouldn't need an offline database at all.

 

That's not true. There were at least three solutions given and you have zeroed in on one. Admittedly, Markwell's document shows the way most DB admins would recommend. With the method I showed in post #20, I maintain all logs in my offline DB, all corrected waypoints, all my personal waypoints and all notes, and still only remove archived caches that I haven't found or DNF'd (the DNF'd ones could easily be removed as well, but I choose to keep them).

 

It is actually not a shortcoming but a feature of the site. There has never been a valid reason for including caches in a PQ that are no longer there. When you load your GPSr from a PQ, they are no longer there, and if you maintain an offline DB, there is a simple, painless way to remove these archived caches without affecting the integrity of your collected data.

Link to comment

That's not true. There were at least three solutions given and you have zeroed in on one.

That's why I said I was playing the Devil's Advocate. Just pointing out why that simple solution is the least desirable. I don't use that myself.

 

It is actually not a shortcoming but a feature of the site. There has never been a valid reason for including caches in a PQ that are no longer there. When you load your GPSr from a PQ, they are no longer there, and if you maintain an offline DB, there is a simple, painless way to remove these archived caches without affecting the integrity of your collected data.

Here you misunderstand me. I wasn't referring to not sending archived caches as the shortcoming. The shortcoming is not being able to keep track of personal game data on the site.

 

Having the ability to add corrected coordinates to puzzles, additional waypoints (for when you're working on a large multi or the owner only lists them in the text), and notes, and then having THAT data come down in a PQ is what's missing.

Link to comment

There has never been a valid reason for including caches in a PQ that are no longer there. When you load your GPSr from a PQ, they are no longer there, and if you maintain an offline DB, there is a simple, painless way to remove these archived caches without affecting the integrity of your collected data.

I agree that sending all the data for an archived cache is not needed. Having a list of GC codes that have been archived in a certain area would be nice to have, though. It doesn't look like it will happen.

Link to comment

my personal "solution" (AKA workaround): notification emails for "archive" logs for all caches within the max radius, automatically processed to flag the respective caches as archived in the DB.

 

Which, if I'm not mistaken, is only 50 miles. ** Just checked, and it's actually 49.7096953789867 **

 

Really sucks when you go on quick trips around an entire 2 STATE area.

 

The Steaks

 

P.S. Yes, my idea for PQs of all states would also apply to provinces/territories.

Link to comment