SidAndBob · +Premium Members · 784 posts

Everything posted by SidAndBob

  1. If I had been dropping the TB off, I would have put a note in the logbook explaining that I wouldn't be able to log the drop-off for n days, or maybe even put a note in with the TB if it was in a bag. Then no one would get their knickers in a twist.
  2. We are all running our queries each week (or day). Every time we do, 90-something per cent of the data is identical, yet every run extracts, zips up and emails all of it. When we import it into GSAK (or similar), what's the first thing it does? Yes, it throws away all of that duplicate data. What a waste of resources. In practical terms, one solution would be to run a full extract the first time a query is run, and on subsequent runs extract only caches that have been updated since the last PQ run. You could have a timestamp column on the cache master row indicating when the cache's associated details were last changed (see the sketch after this list). This method assumes that a significant number of caches won't have changed since your last run, which in my experience is true. BTW, my old scheduled PQ actually ran last night!
  3. What about offering a suite of commonly used queries optimised for performance? Personally, I'm only really interested in the nearest caches I haven't done, not owned by me and not archived. If a lot of people used these queries, would it improve performance? I haven't read the entire thread, so forgive me if this has already been suggested.
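A minimal sketch of the delta-extract idea from post 2, assuming a hypothetical caches table with a last_updated column and a per-query last-run timestamp. The schema, column names and SQLite storage are illustrative assumptions, not how Pocket Queries are actually implemented server-side.

```python
import sqlite3

# Hypothetical schema: one row per cache, with a last_updated timestamp
# maintained whenever any of the cache's associated details change.
SCHEMA = """
CREATE TABLE IF NOT EXISTS caches (
    gc_code      TEXT PRIMARY KEY,
    name         TEXT,
    last_updated TEXT  -- ISO 8601 UTC timestamp
);
"""

def run_pocket_query(conn, last_run):
    """Return only caches changed since the previous run.

    If last_run is None (the query has never been run before),
    fall back to a full extract, as suggested in the post.
    """
    if last_run is None:
        cur = conn.execute("SELECT gc_code, name FROM caches")
    else:
        cur = conn.execute(
            "SELECT gc_code, name FROM caches WHERE last_updated > ?",
            (last_run,),  # ISO 8601 strings compare correctly as text
        )
    return cur.fetchall()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript(SCHEMA)
    conn.executemany(
        "INSERT INTO caches VALUES (?, ?, ?)",
        [
            ("GC1234", "Old cache", "2009-01-01T00:00:00Z"),
            ("GC5678", "Recently edited cache", "2009-06-01T00:00:00Z"),
        ],
    )
    # First run: full extract. Later runs: delta only.
    full = run_pocket_query(conn, None)
    delta = run_pocket_query(conn, "2009-05-01T00:00:00Z")
    print(len(full), "caches in full extract;", len(delta), "in delta")
```

The importer's job then shrinks to merging the small delta into its local database instead of discarding a near-complete duplicate of it on every run.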