
Pocket Queries not returning archived caches


soulnavigator


For all those who have called the "Sort by last GPX date" option a great workaround:

 

If you use the "have been updated in the past 7 days" option to prevent useless PQ generation, this doesn't work. Am I just not supposed to use that option, and have my PQ's return all caches, even if they haven't been updated?

Link to comment

I am not sure how many more ways I can flat out state that the owners of this website DO NOT intend for users to create and maintain offline databases of the data downloaded.

 

I am not sure how many different ways some users can continue to ignore this fact and ask for tools that ignore the intent.

 

If archived caches within your offline database cause you or anybody else a problem, then you alone are at fault - not the website.

 

TPTB would have to change their mind before archived caches can be included in a PQ. Markwell has thoughtfully pointed out workarounds and some other good points.

 

Offline databases are a fact of life now, and they DO (sometimes) contain old data after a cache has been archived. Groundspeak CAN do something about it.

 

Groundspeak has done something about it. They make sure the web site is up-to-date and that PQ's only include non-archived cache information. They encourage you NOT to keep the old information. I am sorry they did not implement the solution you think is best.

Link to comment

<snip>... the owners of this website DO NOT intend for users to create and maintain offline databases of the data downloaded.

<snip>...

Groundspeak has done something about it. They make sure the web site is up-to-date and that PQ's only include non-archived cache information. They encourage you NOT to keep the old information. I am sorry they did not implement the solution you think is best.

Groundspeak can intend whatever they like - but geocachers around the world will still do what they want with the data supplied in pocket queries. Offline databases will continue to exist, and when a cacher upsets a land owner, who do you think gets the blame?

 

The way that I see it ...

  • Groundspeak acknowledges that many geocachers have and maintain their own offline databases with data from pocket queries.
  • Groundspeak supports the use of offline databases by both advertising these databases on its website and also by continuing to produce GPX file data.
  • Groundspeak acknowledges that there is a problem with offline databases.
  • Groundspeak could include the requested extra data in pocket queries, but continues to refuse to do so.

In my non-legal opinion, that makes Groundspeak an accessory before the fact when new geocachers use old data from their offline database.

 

Wayne

Link to comment

I am not sure how many more ways I can flat out state that the owners of this website DO NOT intend for users to create and maintain offline databases of the data downloaded.

 

If this is indeed the case (and I believe you, Starbrand, when you say it is) then can you please help me to understand why GC.com implemented the use of pq's? Because it seems to me, the only thing they're really used for is creating and maintaining data in an offline form.

Link to comment

In my non legal opinion, that makes Groundspeak an accessory before the fact when new geocachers use old data from their offline database.

 

Wayne

Suppose I'm driving down the road at night following my highway map that was printed in 1998. I make a left turn to cross a river and plunge 200 feet to my death because the bridge that is shown on the map was removed in 2002. Of course, if I had been using the 2007 map which I already had in my glove compartment, I would have seen that the new bridge is a half mile farther up the road.

 

Who's at fault?

 

The mapmaker for not explicitly letting me know that I should be using the most current map available, or me for using the older map? Per your argument, it would seem that you would hold the mapmaker responsible.

Link to comment

I am not sure how many more ways I can flat out state that the owners of this website DO NOT intend for users to create and maintain offline databases of the data downloaded.

 

If this is indeed the case (and I believe you, Starbrand, when you say it is) then can you please help me to understand why GC.com implemented the use of pq's? Because it seems to me, the only thing they're really used for is creating and maintaining data in an offline form.

PQs were created to make it easy to get a large number of caches into your geocaching devices without requiring you to punch all the data in by hand or print out reams of paper. Their intent (most likely) was that whenever you went out caching, you would generate a fresh PQ of active caches. PQs existed before GSAK did, so they could not have known that people would be keeping databases of thousands of caches.

 

Perhaps they have accepted that apps like GSAK exist (as evidenced by the GPX developers thread) but it all comes down to taking responsibility for yourself. Just because you choose to do something unexpected with your PQs doesn't mean that Groundspeak is responsible if you go off hunting caches that no longer exist.

Link to comment

The number of responses this thread has generated is testament to how people cache today. I couldn't cache without GSAK. I find it crucial to have more than 5 logs in the field and this isn't possible without GSAK.

Also my ISP went bust last September and I was forced to use dial up for several weeks. Without GSAK I would have had to put caching on hold.

 

The suggested solutions are so simple I just don't understand where all the resistance is coming from.

Link to comment

The suggested solutions are so simple I just don't understand where all the resistance is coming from.

 

No resistance, we're just pointing out the fact that Groundspeak does not support offline databases.

 

I have also yet to see where including an archived cache in a PQ assists in keeping offline data fresh.

Link to comment

I have also yet to see where including an archived cache in a PQ assists in keeping offline data fresh.

Is it because you aren't listening, or you just don't believe what we say?

How about if the question were reworded this way:

 

"I have also yet to see where including an archived cache in a PQ assists is necessary in for keeping offline data fresh."

Link to comment

I have also yet to see where including an archived cache in a PQ assists in keeping offline data fresh.

Is it because you aren't listening, or you just don't believe what we say?

How about if the question were reworded this way:

 

"I have also yet to see where including an archived cache in a PQ assists is necessary in for keeping offline data fresh."

 

Sure. We all agree it's possible to keep our offline data correct without it. A lot of us are doing it. We just know it would be easier with a simple additional option. But it won't happen...because if it were available we might maintain offline databases.

Link to comment

I have also yet to see where including an archived cache in a PQ assists in keeping offline data fresh.

Is it because you aren't listening, or you just don't believe what we say?

 

Actually, it is because I am paying attention and do not believe what is being said.

 

I will use GSAK since it is what I am most familiar with.

 

Currently, we get a PQ and use a filter or macro to remove all caches that were not in the last PQ (if you are maintaining your PQ's properly, this assumption is ALWAYS correct). We leave to cache at 8 AM. At 9:45, StarBrand archives a cache that they went to do maintenance on and found missing. At 11:30, we try to locate the cache and come up with a DNF.

 

Under your proposal, we get a PQ that includes archived caches. We leave to cache at 7:58 AM (no filter to run, so we got out 2 minutes earlier). At 9:45, StarBrand archives a cache that they went to do maintenance on and found missing. At 11:30, we try to locate the cache and come up with a DNF.

 

What's different? In both scenarios I had a reasonable assumption that my data was as fresh as possible. At least in the first one, the one I use, I also have an assumption that in rare instances, one will get missed. The same would apply to what you are proposing.
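
For concreteness, the "remove everything that was not in the last PQ" step from the first scenario looks roughly like this in Python - a minimal sketch, not an actual GSAK macro, with the file name and the dict-shaped offline database both standing in for whatever your own setup uses:

import xml.etree.ElementTree as ET

# Namespace used by the PQ GPX files (GPX 1.0 assumed here; adjust if yours differ).
GPX_NS = {"gpx": "http://www.topografix.com/GPX/1/0"}

def gc_codes_in_gpx(path):
    """Return the set of GC codes (the waypoint <name> elements) in one PQ file."""
    root = ET.parse(path).getroot()
    return {wpt.findtext("gpx:name", namespaces=GPX_NS)
            for wpt in root.findall("gpx:wpt", GPX_NS)}

def prune_offline_db(offline_db, latest_pq_path):
    """Drop every cache that did not come back in the latest PQ.

    offline_db is assumed to be a plain dict keyed by GC code (a stand-in
    for whatever the offline tool actually stores); anything absent from
    the fresh PQ is presumed archived and removed.
    """
    fresh = gc_codes_in_gpx(latest_pq_path)
    for code in [c for c in offline_db if c not in fresh]:
        del offline_db[code]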

 

I just don't see, if someone is unwilling to maintain their offline DB, why GC, who have clearly stated their disdain for said offline DBs and have taken the position of ensuring as much as possible that their own data is current, should have to change their position. Add to this the fact that only a handful of the thousands, possibly tens of thousands, of cachers who have offline DBs are unwilling to maintain their own data using programs unsupported (albeit acknowledged) by GC.

 

When all is said and done, we can discuss this until we are blue in the face. Like Microsoft's position that you can only legally install XP or Vista on one computer, even though you have two at home, GC is not going to change their position. Unlike MS, GC appears to be quite responsive to the reasonable requests of the majority. Seems to be working pretty well to this point.

Link to comment

Actually, it is because I am paying attention and do not believe what is being said.

<snip>...

I'm not so sure.

My primary reason for getting involved in this thread is not so much for myself, though I would benefit a little.

 

It's for the (probably) thousands of geocachers who don't know enough to maintain their offline databases, let alone download and install a macro so that it's easy to use. Leaving them high and dry is just a cop-out, while the change requested would benefit them greatly.

 

Wayne

Link to comment

I tried using the macro in GSAK and it is not a snap by any means. It seems from what I saw it is necessary to click through on each listing to see if it is archived or just temporarily disabled. If I can still bring up the cache page when I check each one manually it would seem that this could be automated. If there was a "flag" set to archived or disabled that could be read by GSAK then the cache listing in my offline database would reflect this. It would not require the data to be in the pocket query but it could still be automated. I would not have to go through the pages one by one and I could be sure that I had the proverbial "fresh" data on my palm. :)
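
As far as I can tell, a flag like that already exists in the GPX data itself: the groundspeak:cache element inside each waypoint carries "available" and "archived" attributes, so this check can be scripted rather than clicked through. A rough Python sketch, assuming the 1/0 extension namespace (and bearing in mind that archived caches never arrive in a PQ in the first place, so the archived attribute only tells you something if the file came from elsewhere):

import xml.etree.ElementTree as ET

GPX = "{http://www.topografix.com/GPX/1/0}"
GS = "{http://www.groundspeak.com/cache/1/0}"   # Groundspeak cache extension

def cache_status(gpx_path):
    """Yield (GC code, status) for every cache waypoint in a GPX file."""
    for wpt in ET.parse(gpx_path).getroot().iter(GPX + "wpt"):
        code = wpt.findtext(GPX + "name")
        cache = wpt.find(GS + "cache")
        if cache is None:
            continue                      # not a geocache waypoint
        if cache.get("archived", "False") == "True":
            yield code, "archived"
        elif cache.get("available", "True") == "False":
            yield code, "disabled"
        else:
            yield code, "active"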

Link to comment
I tried using the macro in GSAK and it is not a snap by any means. It seems from what I saw it is necessary to click through on each listing to see if it is archived or just temporarily disabled.

 

If you include the temporarily disabled in your PQ, then any caches that you don't get have been archived (so you don't need to review each one individually).

Link to comment
I tried using the macro in GSAK and it is not a snap by any means. It seems from what I saw it is necessary to click through on each listing to see if it is archived or just temporarily disabled.

 

If you include the temporarily disabled in your PQ, then any caches that you don't get have been archived (so you don't need to review each one individually).

I am not sure if I understand. Are you saying do a PQ and exclude temporarily disabled caches so they are not downloaded? Then run the macro using that PQ as the last update? The temporarily disabled caches are already in my database so I don't know. :)

Link to comment
I tried using the macro in GSAK and it is not a snap by any means. It seems from what I saw it is necessary to click through on each listing to see if it is archived or just temporarily disabled.

 

If you include the temporarily disabled in your PQ, then any caches that you don't get have been archived (so you don't need to review each one individually).

I am not sure if I understand. Are you saying do a PQ and exclude temporarily disabled caches so they are not downloaded? Then run the macro using that PQ as the last update? The temporarily disabled caches are already in my database so I don't know. :)

 

It is easiest done with a filter if you get your PQ's at regular intervals. Filter so it shows everything that was not found and does not have a GPX date since the last GPX was loaded, then erase everything in the filter.
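
A rough Python equivalent of that filter, assuming each offline record keeps a "found" flag and the date of the last GPX it appeared in (both hypothetical field names, standing in for whatever GSAK actually stores):

from datetime import datetime

def prune_by_gpx_date(offline_db, last_load):
    """Delete caches not marked found whose record predates the latest GPX load."""
    stale = [code for code, rec in offline_db.items()
             if not rec["found"] and rec["last_gpx"] < last_load]
    for code in stale:
        del offline_db[code]

# Toy example: GC10000 was not refreshed by the June 1 load, so it goes.
db = {
    "GC10000": {"found": False, "last_gpx": datetime(2007, 5, 20)},
    "GC10001": {"found": False, "last_gpx": datetime(2007, 6, 1)},
}
prune_by_gpx_date(db, last_load=datetime(2007, 6, 1))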

Link to comment

It's for the (probably) thousands of geocachers who don't know enough to maintain their offline databases, let alone download and install a macro so that it's easy to use. Leaving them high and dry is just a cop-out, while the change requested would benefit them greatly.

 

So if I choose to use a tool, in this case an offline DB, but do not know how to maintain it, it is GC's responsibility to fix it?

 

How come no one is whining that GC does not create the PRC to go directly into PDA's or make their system download directly into a GPS?

Link to comment

I am not sure if I understand. Are you saying do a PQ and exclude temporarily disabled caches so they are not downloaded? Then run the macro using that PQ as the last update? The temporarily disabled caches are already in my database so I don't know. :)

 

No, just the opposite. INCLUDE the temporarily disabled in your PQ. That way you know that anything not received is archived and you can delete them with no further checking.

Link to comment

 

So if I choose to use a tool, in this case a offline DB, but do not know how to maintain it, it is GC's responsibility to fix it?

No - not their responsibility. But they know of the problem and CAN do something to assist - but refuse to help.

 

How come no one is whining that GC does not create the PRC to go directly into PDA's or make their system download directly into a GPS?

 

Because not everyone uses or wants PRC files. <_<

 

If you want PRC files directly from GC just start a thread and let me know - I'll join in and oppose it using the same arguments as I have seen on this thread (I know that they don't hold water, but Groundspeak will most likely take notice of them).

 

Wayne

Link to comment

Bottom line: Offline DB's are not supported.

 

Arguing this appears to be like wrestling with a pig in the mud. After a while, you realize that the pig is enjoying it. :(

 

In other words, Groundspeak have made a decision, and are not prepared to change it, even if it's found to be bad?

Link to comment

One point of clarification: None of us on this thread speak for or represent Groundspeak. We are just stating positions as we understand them that have been made clear, both in the guidelines and in other threads, by Groundspeak.

 

Your mileage may vary.

Link to comment

One point of clarification: None of us on this thread speak for or represent Groundspeak. We are just stating positions as we understand them that have been made clear, both in the guidelines and in other threads, by Groundspeak.

 

Your mileage may vary.

I am glad you mentioned this. From the passion some have on this subject you would think they were being asked to pay for it. :P

Link to comment
Am I just not supposed to use that option, and have my PQ's return all caches, even if they haven't been updated?

Yes.

 

This subject has been argued until everyone is blue in the face. The current "solution" is to bog down the PQ servers with useless queries that do little to nothing for the end user except weed out archived caches. Many solutions and workarounds have been suggested. TPTB have made it clear they want it done the way they're doing it right now.

 

I've even pointed out that a typical week's worth of PQs only update about a quarter of the caches. That means Groundspeak is sending out 4 times the data necessary. YMMV in your area. I've even gone nearly a month before importing the PQs with very similar results.

 

There have been solutions presented that counter every reason why archived caches should not be included. Probably the clearest and most workable solution was the one where archived caches are only sent when archived within the last 7 days. This would go right along with the "updated in the last 7 days" option. If you consider two things, this scheme makes sense: folks use stale data until the day they throw out the old and load the new, and they typically do this on a weekly schedule. Not sending out a cache that has been archived for a week or less is not going to get it out of the hands of the seekers any sooner. I really don't see the problem.
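
Stated as code, that rule is only a few lines. A sketch of the inclusion check it amounts to - hypothetical field names, obviously not Groundspeak's actual implementation:

from datetime import datetime, timedelta

def include_in_pq(cache, now=None):
    """Active caches go out exactly as they do today; an archived cache is
    included only if it was archived within the last 7 days, so a weekly
    offline refresh can flag it for deletion before it drops out for good."""
    now = now or datetime.utcnow()
    if not cache["archived"]:
        return True
    return cache["archived_date"] >= now - timedelta(days=7)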

 

~shrugs~

 

That's all well and fine. TPTB want us to download 4 times the data necessary? No problem. As long as I get what I pay for.

Link to comment

 

Then perhaps someone who does represent Groundspeak can come and discuss it with us. There are many people who want this feature added in.

 

Wayne

 

I really don't expect that. Jeremy has been to a thread like this before. CoyoteRed (who doesn't sound blue in the face, but looks it) has nicely summarized the situation.

 

Jeremy understands that some users would be happier and the PQ load would diminish, but he has a vision of what we SHOULD be doing and doesn't want to make it easier to maintain our databases.

 

Many have tried to find the right words to convince him otherwise, but I suspect those words don't exist.

Link to comment

Jeremy understands that some users would be happier and the PQ load would diminish, but he has a vision of what we SHOULD be doing and doesn't want to make it easier to maintain our databases.

 

Then until he decides to listen to what the people are saying, Groundspeak will continue to lose money.

Link to comment

Jeremy understands that some users would be happier and the PQ load would diminish, but he has a vision of what we SHOULD be doing and doesn't want to make it easier to maintain our databases.

 

Then until he decides to listen to what the people are saying, Groundspeak will continue to lose money.

 

Lose money? I doubt that. Leave money on the table? Hey, it's their company. Maybe he found a place that allows (effectively) un-metered bandwidth.

Link to comment
This topic is now closed to further replies.