Suggestions For Pq Generation System


Hynr

We are all running our queries each week (or day). Every time we do this, 90-something percent of the data is identical, but every run extracts, zips up and emails all of the data. When we import it into GSAK (or similar), what's the first thing it does? Yes, it throws away all of that duplicate data. What a waste of resources.

In practical terms one solution would be to run a full extract if a query has never been run before and then on subsequent runs only extract cache data where the cache has been updated since the last PQ run. You could have a timestamp column on the cache master row to indicate when the last update was made to the cache associated details.

This method assumes that a significant number of caches have not changed since your last run, which in my experience is true.
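The full-then-delta scheme above can be sketched in a few lines. This is only an illustration of the idea, not Groundspeak's actual implementation; the table name, column names and SQLite backend are all assumptions made for the example:

```python
import sqlite3
from datetime import datetime, timedelta

# Hypothetical schema: a "caches" table with a last_updated timestamp
# column on the cache master row, as suggested above.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE caches (id TEXT PRIMARY KEY, name TEXT, last_updated TEXT)")

now = datetime(2024, 1, 10)
rows = [
    ("GC1", "Old cache",       (now - timedelta(days=30)).isoformat()),
    ("GC2", "Recently logged", (now - timedelta(days=2)).isoformat()),
    ("GC3", "Brand new",       (now - timedelta(hours=5)).isoformat()),
]
conn.executemany("INSERT INTO caches VALUES (?, ?, ?)", rows)

def run_query(last_run=None):
    """Full extract if the query has never run; otherwise only changed caches."""
    if last_run is None:
        cur = conn.execute("SELECT id FROM caches")
    else:
        # ISO-8601 strings compare correctly as text, so a plain > works.
        cur = conn.execute(
            "SELECT id FROM caches WHERE last_updated > ?",
            (last_run.isoformat(),),
        )
    return [r[0] for r in cur]

full = run_query()                                    # first run: everything
delta = run_query(last_run=now - timedelta(days=7))   # later runs: changes only
print(full)   # ['GC1', 'GC2', 'GC3']
print(delta)  # ['GC2', 'GC3']
```

On the second run only the two caches touched in the last week are extracted, zipped and mailed; the unchanged 90-something percent never leaves the server.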

 

BTW, my old scheduled PQ actually ran last night! :P

So now even though it supposedly was generated and I never received it

 

That's most likely a problem with your e-mail/ISP and not a problem with the PQ system. Who do you use for your e-mail? AOL users are having trouble, as AOL is apparently throttling stuff again. Your ISP could have flagged it, or it could be in a spam folder somehow. Do you have the geocaching e-mail bots in your "safe list"?

 

I've been using Yahoo for years and have never had a problem before this. And even if it did go into the spam folder, which it didn't, at least I would have been able to use it, as I always double-check it just in case something does get by. So no, I think it is a GC.com problem and not my ISP's.

Edited by haffy
If given the option, I would gladly uncheck the logs, TBs, even the description. Just keep the URL so that I can link from MapSource. For me it's essentially just the waypoints, which would be significantly smaller and probably much faster to generate.
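Even without a server-side option, a downloaded PQ can be slimmed down this way before import. Here is a minimal sketch that strips the logs, travel bugs and long description from each cache while keeping the name and URL; the GPX snippet and the exact Groundspeak element names are illustrative, not a guarantee of the real PQ file layout:

```python
import xml.etree.ElementTree as ET

# Toy GPX snippet with Groundspeak-style extensions (real PQ files
# carry more fields; structure here is an assumption for the example).
GPX = """<gpx xmlns="http://www.topografix.com/GPX/1/0"
     xmlns:groundspeak="http://www.groundspeak.com/cache/1/0">
  <wpt lat="47.6" lon="-122.3">
    <name>GC1234</name>
    <url>http://www.geocaching.com/seek/cache_details.aspx?wp=GC1234</url>
    <groundspeak:cache>
      <groundspeak:long_description>pages of text...</groundspeak:long_description>
      <groundspeak:logs><groundspeak:log>Found it!</groundspeak:log></groundspeak:logs>
      <groundspeak:travelbugs><groundspeak:travelbug>TB1</groundspeak:travelbug></groundspeak:travelbugs>
    </groundspeak:cache>
  </wpt>
</gpx>"""

GS = "{http://www.groundspeak.com/cache/1/0}"
STRIP = {GS + "long_description", GS + "logs", GS + "travelbugs"}

root = ET.fromstring(GPX)
for cache in root.iter(GS + "cache"):
    # Copy the child list so removal is safe while iterating.
    for child in list(cache):
        if child.tag in STRIP:
            cache.remove(child)

slim = ET.tostring(root, encoding="unicode")
print("logs" in slim)  # False: logs, TBs and description are gone; name and URL stay
```

The waypoint coordinates, name and URL survive, so MapSource linking still works, while the bulk of the file size (logs and descriptions) is dropped.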

This topic is now closed to further replies.