
Posts posted by washboy

  1. Still think PQs odd. Looking at your dates, this PQ

     

    PQ No {19} 3rd Nov 2005 to 17 December 2005

     

    returns for me no caches placed between 4th Nov and 11th Nov inclusive. 478 records in all.

    Did you actually download that PQ/gpx and view it in GSAK or did you just look at the preview listing as a web page on gc.com? Those on-line listings aren't in date order - they're all mixed up! When I look, I see dates including 9 Nov 05 on page 23 of 24, for example.

     

    Just a thought.

    Just because they appear not to be in date order does not mean that the data in the PQ is incorrect.

    That was my point exactly! :)

  2. Still think PQs odd. Looking at your dates, this PQ

     

    PQ No {19} 3rd Nov 2005 to 17 December 2005

     

    returns for me no caches placed between 4th Nov and 11th Nov inclusive. 478 records in all.

    Did you actually download that PQ/gpx and view it in GSAK or did you just look at the preview listing as a web page on gc.com? Those on-line listings aren't in date order - they're all mixed up! When I look, I see dates including 9 Nov 05 on page 23 of 24, for example.

     

    Just a thought.
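
    A quick way to settle this sort of question, without relying on the gc.com preview pages at all, is to pull the placed dates straight out of the downloaded GPX and sort them yourself. A minimal Python sketch of the idea follows; it assumes the standard GPX 1.0 namespace and a <time> element per waypoint, and the miniature sample data is entirely made up:

```python
import xml.etree.ElementTree as ET

# Made-up miniature PQ extract; a real file holds hundreds of <wpt> entries.
SAMPLE_GPX = """<?xml version="1.0"?>
<gpx xmlns="http://www.topografix.com/GPX/1/0">
  <wpt lat="51.5" lon="-0.1"><time>2005-11-12T00:00:00Z</time><name>GCAAAA</name></wpt>
  <wpt lat="51.6" lon="-0.2"><time>2005-11-03T00:00:00Z</time><name>GCBBBB</name></wpt>
  <wpt lat="51.7" lon="-0.3"><time>2005-12-01T00:00:00Z</time><name>GCCCCC</name></wpt>
</gpx>"""

NS = {"gpx": "http://www.topografix.com/GPX/1/0"}

def placed_dates(xml_text):
    """Return (date, GC code) pairs for every waypoint, sorted by placed date."""
    root = ET.fromstring(xml_text)
    pairs = [(wpt.findtext("gpx:time", namespaces=NS)[:10],
              wpt.findtext("gpx:name", namespaces=NS))
             for wpt in root.findall("gpx:wpt", NS)]
    return sorted(pairs)

for date, name in placed_dates(SAMPLE_GPX):
    print(date, name)
```

    Any gap in the placed dates (such as the missing 4th-11th Nov above) then shows up immediately in the sorted output, whatever order the web preview happens to display.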

  3. I am wondering if, at certain points when caches were being organised in the Groundspeak databases, some were given the identities England, Scotland and Wales. These then aren't listed in 'UK' PQs.

     

    Dave

    Got any more examples? There's no option for Wales, Scotland, etc. now, so you may have some very early PQs.

    Yes. Can you quote a few GC codes for caches in Scotland, England & Wales please? (i.e. ones showing as such in your GSAK 'Country' column).

  4. When I try to run GSAK from within a batch file (and GSAK then runs a macro), GSAK appears to pass control back to the batch file long before the macro is complete. Why is this? :)

     

    More to the point, is there anything I can do to prevent the remainder of the batch file from executing until GSAK has finished processing the macro? :)
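
    For what it's worth, the general issue here is synchronous versus asynchronous launching: if the batch file starts GSAK in a way that returns immediately, the next batch line runs while the macro is still going. The distinction is easy to see in Python, purely as an illustration (the child command below is just a stand-in for "GSAK running a macro"):

```python
import subprocess
import sys
import time

# Stand-in for "GSAK running a macro": a child process taking ~0.3 s to finish.
CHILD = [sys.executable, "-c", "import time; time.sleep(0.3)"]

# Asynchronous launch: control comes back at once, like a batch file that
# starts a GUI program and immediately executes its next line.
t0 = time.monotonic()
proc = subprocess.Popen(CHILD)
returned_after = time.monotonic() - t0   # near zero
proc.wait()                              # explicitly block until the child exits

# Synchronous launch: run() does not return until the child has finished.
t0 = time.monotonic()
subprocess.run(CHILD, check=True)
waited_for = time.monotonic() - t0       # at least ~0.3 s

print(f"async returned after {returned_after:.3f}s, sync after {waited_for:.3f}s")
```

    In a Windows batch file the synchronous form is what `start /wait` gives you (e.g. `start /wait "" gsak.exe ...`) — with the caveat that this only helps if GSAK itself doesn't return early by handing the macro off to another process, which I can't vouch for.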

  5. It would be nice to be able to store/load the definitions of exported files.

    For example, I use %drop2%typ=2 as a waypoint name and %Name (%Dif/%Ter) %typ as the description format in one case, but for another purpose the values are %code and %Name (%Dif%typ1%Ter).

    It would be really nice if I didn't need to change these manually. (And of course the path & filename are different too).

     

    You can do this now. That is exactly what "settings" are for. Just set up your export the way you like, then click on the save button to save your settings under a name that is meaningful to you. Now you can restore those settings at any time without having to rekey them. The "settings" are also available for you to use in macros. This is also a great feature for those that use more than one GPSr as the settings include the GPSr options and icon settings.

     

    I must be doing something wrong but I can't get this to work reliably with Custom export. 90% of the time, the settings details are lost. I fill in the settings, including the Statement to run, click Save and give it a name. However, when I subsequently select the saved settings from the Settings dropdown, the Statement to run is blank!

     

    What am I doing wrong, please?

  6. In my book, a 'Series' cache is one of a number of caches that are connected by a common theme. In a Series, there will usually be a bonus cache which can only be found if you have successfully completed all the other caches in the series. Scotty's Cluedo series was a good example, or the Alchemy Quest series.

     

    In the days when Virtual caches were freely permitted, it was not necessary to have a real container cache for each element in a series.

     

    Any cache which involves visiting several locations, simply in order to collect clues to the location of the cache container, is a 'MultiCache'.

     

    Creating Virtual caches of nondescript locations, just because the location can be used for a clue (information boards, monumental inscriptions, etc.), is wrong. There should be some particular distinction to the location. In this respect, McDeHack's own (excellent) London Rainbow series might have difficulty getting past the approvers these days <_< The Jack the Ripper series, interestingly, has been passed as a series of 'Mystery' caches when, in fact, each clue element is probably a valid Virtual. Go figure, as they say.

     

    Incidentally, I'm most definitely not a numbers man. So, when I came out of self-imposed geo-retirement to do the Jack the Ripper series, I logged only the final, physical, cache as a find - thus losing 6 points for the intermediate locations. To me, it is really a MultiCache.

     

    Different strokes... <_<

  7. Oooh, this makes me so angry!  :lol:  :lol:  :huh:  :lol:  :lol:

     

    Dave and Peter have the full support of the UK community and are doing a sterling job.  Caches get approved at lightning speed and this is (was!) a happier place to be than I can ever remember it.  They are masters both of the arts of diplomacy, and of herding cats!

     

    <snip>

     

    ...will hold the post of UK moderator vacant until they have again regained control of their volunteer staff's behaviour and can persuade both Peter and Dave to take their rightful jobs back.

    I don't think I can say it better or more succinctly than Ian has done (and I have several drafts to prove it :huh:).

     

    I can quite understand that Peter and Dave might feel they need a break from moderating and approving (not least because their sterling work must be very time-consuming). They certainly should not feel they let us down at all by their decision (despite what some have said).

     

    But, for all our sakes, I would ask them both to consider simply taking a sabbatical, rather than resigning completely.

     

    Things really have been brilliant around here since you took up the reins, Lacto & Eckers. Thank-you both :huh::huh:

  8. When importing a directory full of zipped gpx files (or loc, for that matter) from gc.com, what determines the order in which GSAK processes the files? Is it the file's date/timestamp, or a sort on the filename, or is it essentially random?

    As per the help file, it is the file date/time stamp that is used. Files are processed in chronological sequence.

    I had read the help file and presumed that would be the case but my eyes tell me otherwise. Noting the gpx filenames as the zips are being processed, they appear to be taken in the order they would be if sorted on the filename of the zip. Assuming you have program control over this behaviour, it would be great if it were user-selectable, i.e. by filename or by date/time stamp.

     

    (Each GPX file from Groundspeak has a creation date inside the GPX file (regardless of the time stamp of the file), and GSAK uses this to determine if the incoming data is "newer" than what you already have in the database)

    Just to clarify then, this date comparison is on a cache-by-cache basis, yes? It's not that, if the incoming gpx has a Groundspeak creation date earlier than the latest update date of the database, the whole gpx will be skipped?

     

    Clear as mud?  :lol:

     

    Hope this helps

    I do feel sometimes as if I'm as thick as mud! :lol: Yes, it does help a lot. Thanks :(
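
    The two candidate orderings are easy to reproduce outside GSAK. A small Python sketch showing how a directory of PQ zips sorts by filename versus by timestamp — the file names and times here are invented, and GSAK's actual behaviour is of course whatever the help file (and Clyde) say it is:

```python
import os
import tempfile

# Invented PQ zip names; the modification times are deliberately set in a
# DIFFERENT order from the alphabetical one, so the two sorts disagree.
names_and_mtimes = [
    ("2004-11-01 Home Counties.zip", 300),   # newest timestamp
    ("2004-11-07 Home Counties.zip", 100),   # oldest timestamp
    ("2004-11-14 Home Counties.zip", 200),
]

with tempfile.TemporaryDirectory() as d:
    for name, mtime in names_and_mtimes:
        path = os.path.join(d, name)
        open(path, "w").close()
        os.utime(path, (mtime, mtime))       # force the modification time

    files = os.listdir(d)
    by_name = sorted(files)
    by_mtime = sorted(files, key=lambda f: os.path.getmtime(os.path.join(d, f)))

print("by name: ", by_name)
print("by mtime:", by_mtime)
```

    If the order matters to you, names that embed the date (like the "2004-11-01 Home Counties.zip" style mentioned below) at least make the name sort deterministic, whatever the mail client did to the file timestamps when saving the attachments.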

  9. Hi Clyde,

     

    When importing a directory full of zipped gpx files (or loc, for that matter) from gc.com, what determines the order in which GSAK processes the files? Is it the file's date/timestamp, or a sort on the filename, or is it essentially random?

     

    The question arises because I guess it would be important in the following case (or similar):

     

    PQ#1 arrives from gc.com on, say, Nov 1st but is not loaded into GSAK. It contains details of a busy cache.

     

    PQ#2 arrives a week later. Since PQ#1 was generated, there have been 4 new logs for the busy cache.

     

    Add subsequent PQs, if you like, each containing an extra few logs for the cache.

     

    Assume that the PQ zips get saved to the import directory with structured names, e.g.: 2004-11-01 Home Counties.zip, 2004-11-07 Home Counties.zip, 2004-11-14 Home Counties.zip, etc. (and each one contains a file named 123456.gpx). The zips may or may not have a timestamp of their creation date/time at gc.com (depending upon the method used to save the mail attachment).

     

    If the files were processed in non-date order, would some of the logs be ignored/overlooked? The significance of the Database Update Options is bewildering me :lol:

     

    Oh what it is to be a GSAK newbie :lol:

  10. There are plenty of geocachers who enjoy placing caches, so we don't need to make anyone feel obliged to set one.

    99 Finds since June '02 / 0 Hides

     

    It's difficult not to feel guilty, though.

     

    I look at my local area (N. London) and I think, 'Would I want to come here in search of a cache, especially perhaps only a micro? Is it interesting, scenic or spectacular enough to justify one? Do I know the area of any potential cache site well enough to feel it's safe to bring other cachers to?'.

     

    I suppose I totally lack imagination :lol: As a consequence, after over two years' gentle caching, I've hung up my spurs until I do break my duck. I'll still haunt the forums, though :lol:

  11. However the "show me updates in the past 7 days" might be a better way of doing this - something i found last week and am currently playing with.

    Don't make the mistake of thinking that this will give you the caches about which anything has changed, e.g. new logs or archiving. It doesn't!

     

    It would be marvellous to be able to make a one-time snapshot to populate a GSAK database and then keep it up to date with such an 'updated' PQ. Unfortunately, Jeremy & co haven't got around to it yet. I'm sure gc.com is burdened with many more PQs than need be, as a consequence ;)

  12. ...and I've received none of the PQs I regularly get on a Monday.

    All 5 of your queries on Monday ran and were delivered to your ISP's mail server. I'll send you a PM with the log entries in case you want to ask your ISP about the missing messages.

     

    :o Elias

    Many thanks for checking, Elias, and for the e-mailed report.

     

    One of the missing five did, in fact, arrive. I'll check with my ISP but, since subsequent days' PQs appear uncompromised, I'll only be concerned if there's a repeat of the issue.

     

    Thanks again for a timely response - much appreciated! :lol:

  13. ...and I've received none of the PQs I regularly get on a Monday.

     

    They ran, allegedly, but it's as if every one of my PQs generated between midday and midnight (PST) on Monday went into a black hole.

     

    Tuesday morning's PQs have arrived OK though. Was there a problem with a server on Monday?

  14. Any change to the cache in the last 7 days, including DNFs and notes, has been suggested AND thought a good idea by TPTB. However, it seems not to have made it onto the to-do list.

     

    Ah! I did some searching before posting but didn't unearth that. Perhaps TPTB will confirm if it did make it to the to-do-list, etc.

     

    What I've been doing is sorting by "Last Updated" and, if the cache isn't in my last query, I look at the page for a reason

     

    I'm trying to avoid PQing all 3500 caches each session. My PQs are intended to provide just the new and modified stuff (except they don't - that's the point) so, sadly, your method won't work for me. Thanks for your suggestion, though :lol:

     

    What's needed is the enhancement(s) I mentioned.

  15. In the 'That' section are various AND flags including 'Updated in the last 7 days' and 'Found in the last 7 days'. But what constitutes an update?

     

    I want to identify caches, within my local area, that have changed recently, i.e. have received new log entries, have been archived or suspended or have had their description changed by the owner.

     

    'Found within...' appears to select only 'Found' log entries (as one might expect) but I also want to know about DNFs, Notes and archive requests.

     

    I'd assumed that 'Updated in...' means that something about the cache has changed (description, logs or status). However, I've concluded that's not the case and that it's simply that the cache owner has edited the cache description. Yes? Does that include suspending or archiving it? If not, maybe it should.

     

    This question has arisen as a result of my trying to use GSAK (a most excellent tool). I'm in the UK which, I guess, is equivalent to a large US state, in cache terms (3500-odd active caches). It's perfectly reasonable for me to want PQs covering the whole country/island.

     

    GSAK facilitates the maintenance of such a small database but all the PQs I've tried so far work fine until a cache becomes archived/suspended or goes missing (usually signalled by DNFs). I'd hoped that using GSAK would mean fewer and smaller PQs since, after initially populating GSAK, ongoing PQs would be only for newly-placed and archive/suspended caches, plus new log entries. I'd have thought that's less of a strain on GC.com's PQ server(s) than regularly PQing all 3500 caches.

     

    I'd like to see an extra item in the 'That' section of PQs: a simple 'Caches With New Data in...' to include ANY change (like a 'dirty' flag). Right now, I'd settle for a 'Have New Logs in...' to include all types of log entry. Oh, and while you're at it ( ;) ), it would be really good to allow selection of the number of days, from 1 to 30 days, instead of just 7.

     

    I really appreciate the hard work that all those involved in Groundspeak put into the service. I know that just keeping ahead of demand on the servers is a challenge in itself, and that it comes before development and enhancement of special facilities. Still, what are the chances, Jeremy?

     

    Erm, having said which, I do hope that Groundspeak do not actually seek to discourage maintenance of small, up-to-date, local, offline databases by premium members for personal use (?).

  16. It seems that what's missing from PQs, and what would make keeping an up-to-date GSAK database a doddle, is a query for 'New Log Entries' within the last n days.

     

    There's 'Found' and 'Updated' and 'Placed', all of which can be refined by date, but not 'DNF', etc.

     

    It would actually reduce the load on GC.com to provide this because, after using Stuey's date-spread method to get the basic 'set' of 3500-odd caches (currently), all you then need is a handful of daily queries to provide updates to that set.

     

    I did as Stuey suggested and created 8 queries, which gave me all caches from Dec 1 2000 to date. I then created a handful of queries to give me those 'Found', 'Placed' and 'Updated' in the last 7 days. I'd happily reduce this to the last 24hrs, since I'd want to run them daily.

     

    So, I've been doing this for a couple of weeks and I thought everything was just grand until, on a couple of occasions, I tried to find a cache whose most recent log, according to my GSAK database, was a 'Found'. I couldn't find 'em (not unusual for me :D) but later, when I made my 'DNF' logs on-line, I discovered that they had, in fact, had recent 'DNF's which had not updated my GSAK database.

     

    I did some investigating and concluded that 'Found' and 'Updated' do not include 'DNF' or 'Note' logs :D

     

    I'll be very happy if someone can tell me I've just made a simple mistake and that I don't actually have to download all 3500 caches weekly. Please...
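
    Until the PQ side offers a proper 'any new logs' filter, one workaround is to diff two successive full downloads locally: key each cache by its GC code, count its logs, and flag any cache whose count changed, DNFs and Notes included. A bare-bones Python sketch of the idea — the GPX fragments and the flat <log> element here are simplified stand-ins, not the real Groundspeak schema, which nests logs inside namespaced <groundspeak:cache> extensions:

```python
import xml.etree.ElementTree as ET

# Simplified stand-ins for two weekly PQ downloads of the same area.
WEEK_1 = """<gpx>
  <wpt><name>GCAAAA</name><log>Found</log></wpt>
  <wpt><name>GCBBBB</name><log>Found</log></wpt>
</gpx>"""

WEEK_2 = """<gpx>
  <wpt><name>GCAAAA</name><log>Found</log></wpt>
  <wpt><name>GCBBBB</name><log>Found</log><log>DNF</log><log>DNF</log></wpt>
</gpx>"""

def log_counts(xml_text):
    """Map each cache's GC code to its number of logs in this snapshot."""
    root = ET.fromstring(xml_text)
    return {w.findtext("name"): len(w.findall("log")) for w in root.findall("wpt")}

def changed_caches(old_xml, new_xml):
    """GC codes whose log count differs between the two snapshots."""
    old, new = log_counts(old_xml), log_counts(new_xml)
    return sorted(code for code in new if new.get(code) != old.get(code))

print(changed_caches(WEEK_1, WEEK_2))
```

    Anything flagged this way can then be looked at by hand — including the DNF-only caches that a 'Found in the last 7 days' filter never sees. It doesn't remove the need to download the full set, of course, which is exactly the load problem described above.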

  17. Sometimes I believe you will get a discrepancy

    You'll get a discrepancy all the time - of around 100m!

     

    "British Grid" position format is only relevant if you're using "Ord Srvy GB" map datum. Using hddd mm.mmm position format with OSGB datum is useless with the WGS84 references quoted for geocaching. Using British Grid when set for WGS84 datum is equally useless for geocaching.

     

    So if you display both British Grid and hddd mm.mmm position formats simultaneously then one of them is always going to be around 100m out!

     

    I don't use an eTrex so apologies if I've misunderstood how you use it or otherwise got hold of the wrong end of the stick :lol:
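
    The 'around 100m' figure is the right order of magnitude: the WGS84/OSGB36 datum shift over Britain is commonly quoted as being on the order of 100-120 m, and a shift of just 0.001° of latitude is already about 111 m on the ground. A quick haversine check in Python — the two points are arbitrary, chosen only to illustrate the scale, not actual datum-shifted coordinates:

```python
from math import asin, cos, radians, sin, sqrt

EARTH_RADIUS_M = 6_371_000  # mean Earth radius, metres

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points (degrees)."""
    p1, p2 = radians(lat1), radians(lat2)
    dphi, dlam = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(p1) * cos(p2) * sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

# A 0.001 degree latitude shift near London: roughly the size of the
# WGS84 vs OSGB36 datum difference.
d = haversine_m(51.500, -0.100, 51.501, -0.100)
print(f"{d:.0f} m")
```

    So reading a WGS84 reference while the GPSr is set to the wrong datum (or vice versa) really does land you the best part of a football pitch away from the cache.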

  18. I don't think it's supported as the mxf format doesn't cater for it.

    True, the .mxf format doesn't support "hotspots". The .csv format does but that's been taken out of v2004.

     

    I have the tech specs from Memory Map somewhere, so I'll check.

    Do you have the tech spec for the .mmo file? That would be very interesting to see.
