Everything posted by Lucy's Ranger

  1. How about a bookmark list, so that we can create a PQ and get the route plotted? Much easier than trying to create our own from the route PDF. Thanks.
  2. Seven days have gone by; any updates to the route? Looking forward to this event.
  3. Just seeing this thread for the first time today. US Army enlisted, 1990-current: Ft Benning, 3/75, 1990-2003; Ft Bragg, USASOC, 2003-2005; Ft Lewis, 2/75 & 4-2ID, 2005-current. About to PCS to C-38 Cav (LRS-ABN) within the next 45 days. Spent way too much time in the sandbox, both countries, including the invasion of both. Still remember those old Trimble Trimpacks. Saw geocaching in a Ft Bragg publication and decided to give it a try; the rest has been history.
  4. Is there a way to have GSAK update a custom database that is not based on a PQ? For instance, if I have a database made up of individual GPX files for caches I intend to find, and want to keep it separate from a PQ laundry list of caches, is there a way to update the cache info contained within my database? TIA
  5. If a bug has completed its mission, is it ethical and moral to reassign the original bug to a new mission?
  6. Clyde, great program, and I am still learning the functions and just how useful it really is. Like everyone else here, though, I do have a question. Do you know of a macro, or is there a function built into the program, to download the current GPX file for a given set of caches in the database? I know this can be done via PQs, but my databases aren't always that linear: I might have a few caches from many PQs in a single database, or caches created manually along a route. As you can imagine, it is quite time consuming to open each cache page's URL, download the GPX file, and then drag and drop the resulting GPX into the database. Normally this wouldn't be a problem, but when creating a multi-state route it can mean hundreds of caches.

     An example: I have spent the last few weeks, as time allows, plotting a route across the US for a work-related move, beginning in NC and ending in WA. I am picking and choosing which caches to find along the route based on ease of locating (I am on a time crunch) and the cache's draw. The resulting database has about a hundred caches and represents many hours of looking at page after page. By the time I leave, some of that information will potentially be weeks old and may be badly out of date. In one case, when I went back for a quick look to make sure I was where I thought I was, the cache had been archived. It would have been a pain to arrive in the area not knowing the cache had already been picked up and was no longer available. An automated way to update the cache pages already in my databases would prevent that (a rough sketch of that kind of batch check follows below this list). Does this make any sense, or is it a rambling diatribe? Thanks
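
A minimal sketch of the kind of batch check post 6 describes, assuming the caches have already been saved as individual GPX files. This is not a GSAK macro or any built-in GSAK feature; it is a generic Python script that walks a local folder (the "route_caches" folder name is made up) and reports any cache whose Groundspeak <cache> extension is flagged archived or unavailable. It only reads the status flags already stored in each file, so it cannot refresh anything from geocaching.com; it simply illustrates the per-cache processing the post is asking to have automated.

     import glob
     import xml.etree.ElementTree as ET


     def local(tag):
         # Strip any "{namespace}" prefix so the check works for GPX 1.0 or 1.1 files.
         return tag.rsplit("}", 1)[-1]


     def stale_caches(path):
         # Yield (waypoint code, cache name, status) for caches flagged as
         # archived or unavailable inside one saved GPX file.
         root = ET.parse(path).getroot()
         for wpt in root.iter():
             if local(wpt.tag) != "wpt":
                 continue
             cache = next((el for el in wpt.iter() if local(el.tag) == "cache"), None)
             if cache is None:
                 continue  # plain waypoint with no Groundspeak cache extension
             archived = cache.get("archived", "False").lower() == "true"
             available = cache.get("available", "True").lower() == "true"
             if archived or not available:
                 code = next((el.text for el in wpt if local(el.tag) == "name"), "?")
                 name = next((el.text for el in cache if local(el.tag) == "name"), "?")
                 yield code, name, "archived" if archived else "unavailable"


     if __name__ == "__main__":
         # Assumed layout: one .gpx file per cache in a local "route_caches" folder.
         for gpx in sorted(glob.glob("route_caches/*.gpx")):
             for code, name, status in stale_caches(gpx):
                 print(f"{code}  {name}: {status}  [{gpx}]")

Run against freshly re-downloaded GPX files, this would list the caches worth dropping from a route database before the trip; against stale files it only reports what was known at download time.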