We are all running our queries each week (or day). Every time we do, 90-something percent of the data is identical, yet every run extracts, zips up and emails all of it. When we import it into GSAK (or similar), what's the first thing it does? Yes, it throws away all of that duplicate data. What a waste of resources.
In practical terms, one solution would be to run a full extract if a query has never been run before, and then on subsequent runs extract only the caches that have been updated since the last PQ run. A timestamp column on the cache master row could record when the last change was made to the cache's associated details.
This method assumes that the vast majority of caches haven't changed since your last run, which in my experience is true.
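To make the idea concrete, here's a minimal sketch of the full-then-incremental logic using SQLite. The schema is entirely hypothetical (a `caches` table with a `last_updated` column and a `pq_runs` table recording each query's last run), purely to illustrate the timestamp comparison; the real PQ database will of course look nothing like this.

```python
import sqlite3
from datetime import datetime, timedelta, timezone

# Hypothetical schema: `caches` carries a last_updated timestamp on the
# master row; `pq_runs` remembers when each query was last extracted.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE caches (code TEXT PRIMARY KEY, name TEXT, last_updated TEXT);
    CREATE TABLE pq_runs (query_id TEXT PRIMARY KEY, last_run TEXT);
""")

now = datetime.now(timezone.utc)
conn.executemany(
    "INSERT INTO caches VALUES (?, ?, ?)",
    [
        ("GC1001", "Unchanged cache", (now - timedelta(days=30)).isoformat()),
        ("GC1002", "Recently edited", (now - timedelta(hours=2)).isoformat()),
    ],
)

def run_query(query_id):
    """Full extract on the first run; incremental on every run after that."""
    row = conn.execute(
        "SELECT last_run FROM pq_runs WHERE query_id = ?", (query_id,)
    ).fetchone()
    if row is None:
        # Never run before: extract everything.
        rows = conn.execute("SELECT code FROM caches").fetchall()
    else:
        # Subsequent run: only caches touched since the last extract.
        rows = conn.execute(
            "SELECT code FROM caches WHERE last_updated > ?", (row[0],)
        ).fetchall()
    conn.execute(
        "INSERT OR REPLACE INTO pq_runs VALUES (?, ?)",
        (query_id, now.isoformat()),
    )
    return [r[0] for r in rows]

first = run_query("my_weekly_pq")   # full extract: every cache
second = run_query("my_weekly_pq")  # incremental: nothing changed since
```

The second run returns an empty list because neither cache was updated after the first extract, which is exactly the saving being proposed: most weeks, most caches simply wouldn't be sent again.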
BTW, my old scheduled PQ actually ran last night!