Everything posted by SnoWake

  1. Okay, I'm seeing more strange behavior - and I know Clyde is going to tell me it's a "personal problem" but here goes... After running macros (most recently and specifically my "process found" macro, which copies waypoints to a "Report" database) - GSAK seems to be in an unusual state. Hitting CTRL-D, or selecting "Database->Select" from the menu, does not bring up the database selection dialog box - but instead does a "Calculating distance and bearing" progress bar, and then refreshes the current list. The only way I can resolve this is to exit GSAK and restart. This is "semi-reproducible" - it has happened three times this evening, but does not happen EVERY time I run a macro. Weird. That, coupled with the previous issue, would make me think I don't have the latest/greatest version - but I certainly appear to. Hmmmm.... like I said - must be a personal problem. Have a great evening! Billy
  2. I'm running 5.1.1. Help->About shows: 5.1.1 (Build 14) Interesting - in my case, each time I've run it, selecting different starting caches - the "center point" as displayed in the bottom status bar is not one of the caches in the raid. I'm wondering if it might be the "next" cache? Haven't really walked through the logic, nor am I that concerned - it just caught my eye. At first, I thought it was just incrementing, like a "route distance" or something - but then realized that wasn't the case. No big deal, at all - I was just curious. Well, the PostNuke content management system I use provides a download section, complete with categories, ratings, links, etc. I'm just using the "out-of-the-box" one right now, but there are enhanced ones that provide more functionality (remote user uploads, etc). I would certainly be more than happy to provide such a service, if it were deemed valuable. On the other hand, I wouldn't bother taking the time to configure it if everyone is just as happy posting them here... Whaddya think? Billy
  3. One more, even more simplistic macro: My "found" PQs come in three chunks, to get them all returned under the 500-cache-per-PQ limit (I may have to go to four, soon). Again, I extract all the PQs into a particular directory, and then run the following script, which pulls them into my Found database, exports to Streets and Trips, and does a MOVECOPY to a Report database. At that point, I'd like to purge all the logs except my own - resulting in an HTML "journal" of sorts - of my finds, in chronological order, with only my log entries displayed (as seen here: http://www.gotwake.com/caches/found - My Caching Journal). Don't hesitate to ask if you have any questions on filter or export settings that go along with either of these macros. Next will be to automate my "Tahoe Arc" process, where I pick out all the caches within 2 miles of my route on my weekly commute from the Bay Area to Lake Tahoe. Very cool getting these things down to a single macro... make that Menu Bar Button Click!

# Load directory of Found GPX files - includes database selection
LOAD Settings=found Database=found
# Export to Streets and Trips
EXPORT Type=SNT Settings=found
# Copy the waypoints to the Reports Database
MOVECOPY Settings=Report
# Hmmm - no purge function via Macro commands?
# At least, not the purge I'm looking for (PURGENOTE is different)
# Perform this step manually for now
# Export to HTML 'journal'
EXPORT Type=HTML Settings=Found

Nothing very fancy here - but with the frequency that I perform this task, it still saves me measurable time. Have a great evening, Billy
  4. Here's a fairly simple macro I use to process my incoming GPX files (PQs) by importing into GSAK, and then filtering and exporting to CacheMate, Streets and Trips, and even (optionally) the GPSr:

# Load directory of NotFound GPX files - includes database selection
LOAD Settings=ClearLoadNotFound Database=NotFound
# Export ALL Not Found caches to CacheMate
EXPORT Type=CMT Settings=CMnotfound
# Center on home
CENTRE Location=Home
# Apply "Close" filter - selecting < 1000 caches closest to home
FILTER Name=close
# Export to HTML
EXPORT Type=HTML Settings=HomeNotFound
# Export to Streets and Trips
EXPORT Type=SNT Settings=HomeNotFound
# Send to GPS receiver
PAUSE Msg="Is GPSr connected?"
EXPORT Type=GPS Settings=60CS

Of course, you would have to customize to match your database and settings names, etc. Nothing fancy, but it sure saves me a whole bunch of clicks! I've just started using the Thunderbird "Attachment Extractor" plug-in, which streamlines the process of downloading/saving the attachments - but I'm considering this a temporary fix, until Clyde adds POP support and (once again) further simplifies the whole process. Let me know if you've got any questions, Billy
  5. Clyde: That macro is way cool, and rather clever, as well. I bumped it up to creating a string of 10 caches, and it's working as advertised. The only thing that seems a bit odd is that it generates an alert box on each pass through, that reads "Error: Filter contains 0 waypoints - it will now be cleared" - and the user has to acknowledge this - so, I have to sit here and watch it run, and hit enter 10 times - it would be nice if that alert could be suppressed, at least in this context. One more quick question: Once this macro completes, and GSAK is displaying the chain2 database with the (in my case, 10) 'chained' waypoints - what does the value in the distance column represent? The distance between the current cache, and the next one in the list - or vice versa? I can't quite make sense of it. Either way - it's definitely cool. Couple this feature with the new(ish) "Cache Density" option available in the HTML export, and you've got some pretty useful tools for designing a "cache raid", as you so appropriately put it. I sometimes think of them as a blitz, or a spree, or even a rampage... If it's simply a matter of storage space, I would be happy to host a library of GSAK macros at http://www.gotwake.com/ - but the Forums and/or gsak.net are probably more suitable locations. Let me know if there's anything I can do to be of assistance, and... Thanks! Billy
  6. Just wanted to publicly acknowledge Clyde for all his outstanding work and continued development/enhancement of GSAK. I'm sure we've all replaced several applications and a lot of clunky processes with this great tool (and the ones "under the hood"). I just caught wind of a proposed feature addition in 5.2, which is support for POP3, allowing GSAK to connect to your mailbox and pull down PQ emails. I can't WAIT!! That is going to be a critical component in automating the process of getting PQs from my Inbox into GSAK, my Palm, my GPSr, and HTML "journals". With regard to that feature: I'm hoping (and have every reason to believe, based on the rest of GSAK) that the filters for selecting PQs will be fairly granular. If I can merely do a REGEX on the subject header, that should be sufficient - it's just that I get both "found" and "notfound" PQs which need to get imported to different databases depending on the query. Anyway - counting the days until that release hits the streets! On a completely unrelated note: I've been doing some work to automate routine/repetitive tasks within GSAK, and came across a function I can't seem to find in the macro "dictionary". In order to create a "journal", of sorts, containing just my finds, sorted by date - I copy my "found" database to a "report" database, and then perform a "Database->Purge Logs" with a particular set of options. Just wondering how hard it would be to expose this functionality as part of the macro language? It seems that all the other macro functions support a "saved settings" concept on their respective dialogue boxes, while the "Purge Logs" function does not. I have no idea if this would hinder exposing this function, but... may I make a feature request that, in some future release, it be made available from a macro? Compared to the POP support, it's WAY down on the list - I'd just like to see it make its way into version 7 or 8.... Thanks again for an outstanding tool that makes my life SO much easier!!
Billy
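For what it's worth, the subject-header routing described above is only a few lines in any scripting language. Here's a minimal Python sketch - the subject strings are hypothetical (the real gc.com PQ subject format may differ), and the database names are just the ones from my setup:

```python
import re

# Order matters: "notfound" must be tested before "found",
# since "found" is a substring of "notfound".
SUBJECT_RULES = [
    (re.compile(r"notfound", re.IGNORECASE), "NotFound"),
    (re.compile(r"found", re.IGNORECASE), "Found"),
]

def route_pq(subject):
    """Pick a target GSAK database for a PQ email, based on its Subject header."""
    for pattern, database in SUBJECT_RULES:
        if pattern.search(subject):
            return database
    return "Unsorted"  # fall-through for anything unrecognized
```

So a query named "CA_found_new" lands in the Found database, while "home_notfound" lands in NotFound - exactly the split I'm hoping the 5.2 POP filters will support.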
  7. Wow - gotta love the forums! Much thanks to all - I'm still holding out hope for a procmail recipe and Perl modules or something to that effect to automate this process on the Linux server - but LOTS of great ideas here. Then Clyde goes and drops a hint as to that 5.2 feature, and... WOW. I'll experiment with some of these others in the interim, but that will be a big enhancement, from my perspective. Time to go grab that Attachment Extractor - as it sounds like, at a minimum, that could streamline my manual process while I wait for POP support in GSAK. I guess asking for IMAP is just getting greedy - but I've already got that compiled/installed/configured, appropriate firewall rules set, etc for that protocol. Adding POP will be well worth the reward! Thanks again, and keep those ideas coming!
  8. As a result of the hard work and great efforts of many others, I have slowly migrated my caching "processes" from manual, paper-intensive ones to a simple, efficient eCaching system. Thanks to Jeremy and everyone at GC.COM (Pocket Queries), Clyde and Robert Lipe (GSAK and GPSBabel, respectively), Brian Smith (CacheMate) and Jeff Boulter (Express Logger and other useful tools), I never print cache pages, manually enter coordinates, or even write notes on paper in the field - it's all electronic, integrated, and free of data-entry errors/omissions/etc. Now that I've got a process/workflow all designed and functioning - it's time to AUTOMATE, right? Thanks to GSAK's macro capabilities, I'm able to automate the processing of GPX files, applying filters, and then output to HTML, CacheMate, Streets and Trips, and even the GPSr, complete with customized icons by cache type. All very cool. The only part that's still fairly manual-intensive is the processing of the daily PQ emails. Since everything else is running on Windows, it's easiest for me to fire up a mail client (Mozilla Thunderbird), log in to my mail server (which is in the other room, here on my home network), navigate to the folder I route them to, and then, one at a time, select them, save the attachment off, and then delete. Once this is complete, I'm off and running. Now, if I could just automate this email attachment extraction, I would be golden - and could probably have the entire process run on a schedule (either via cron, or some Windows scheduler). Ideally, the whole solution could run on a Linux server - but so far I've had no luck getting GSAK to run under WINE. My mail is sitting on a Linux server, and it SEEMS like between Procmail and Perl modules, there should be a fairly straightforward way to accomplish this. I think I may have even had a couple of leads awhile back, based on a previous posting here, but they're long gone.
I've already got procmail filtering the incoming messages, and sorting them into a folder. I'm just looking for a "recipe" that pipes it to a script which extracts the attachment (and could then simply discard the message). In a perfect world, I'd even like to replace the PQ# (numeric) filename with the friendlier Pocket Query name (present in the email Subject header). This is a step in my manual process today. Alternatively, the mail is also exposed (on my internal network only) via IMAP - so I would certainly consider any Windows-based solutions as well (perhaps I should crack the Thunderbird manual). Is anyone doing this, or does everyone process PQ emails manually? I get 3-5 per day, so that's a lot of "Select Message->Right Click on Attachment->Save As->Navigate to target location" that I'm doing each day... over, and over, and OVER. A prime target for automation - now I just need to find the right tool(s)! Thanks much for any ideas, suggestions, or links to relevant packages, other threads...? Have a great day, Billy (aka SnoWake)
  9. Yep - the site redesign inadvertently "broke" Boulter's Coordinates Grabber - but, like his excellent Express Logger, he made some quick adjustments, and it seems to be functional again today. As a perfect example - I recently found the cache "Rock(lin) of Ages" on my way home from Tahoe. The Grabber would have extracted all six sets of coordinates from the cache page, greatly simplifying tedious (and error-prone!) data entry into the GPSr. Alternatively, if you have the ability to post something to the web (personal site, geocities, AOL, ...?), you can do exactly what is being discussed here: Create a GPX file with all the relevant coordinates, place it on the website you have access to, and simply include a link to it from the cache page. Sure, it's easier if we make Jeremy do all the work and provide the storage - but this is a viable workaround for many, until the feature is implemented at gc.com. Here's a great example: Hynr's Arboretum Geocaching Tour. Finally, the no-tech way: What I usually do on multis is just go to the first waypoint, and then "edit" that waypoint to reflect the next stage. It has me doing manual data entry, but on the other hand, it cuts down on "clutter" in the GPSr. Of course, on multis/puzzles where there's math involved in calculating the next coords, you all but HAVE to do it this way... Just some thoughts, Billy (aka SnoWake)
  10. I use CacheMate almost exclusively for eCaching - coupled with Boulter's "Express Logger", it really completes the circle and adds another level of "automation", if you will. Sadly, the new "anti-frames" protection against cross-site scripting (XSS) and phishing attacks has made that tool a little more clunky - but it still works. The only reason I bother with any other format is for those caches that require you to see the images in the cache page: e.g. "SOMA Little Pictures", "Columba", etc. So, every once in awhile, I'll take a collection of GPX data (via GSAK) and export a dataset as HTML, and then use Plucker to convert it into a Palm-friendly format. I rarely have to revert to it, and if there's just a single cache on a planned outing that I need the images for, I'll actually print a single page - because the Plucker process takes HOURS to convert/import 1000 caches. That's the only reason I ever stray from CacheMate, and I refer to it < 1% of the time.
  11. I understand the desire for the "locked" dates for certain types of maintenance activities - but if that can be implemented in a way that doesn't force the page reload for all the "common", date-flexible activities, that would be optimal. Of course, Jeremy already said he was planning on removing it - so I guess I'm preachin' to the choir. Thanks again for the redesign - overall, I'm a big fan.
  12. I would like to second boulter's comments. I, too, am enjoying the new look and feel (apologies to those in the "creative and proud" camp - but I appreciate the optimized use of screen real estate. You STILL get a little bit of background - and can embed all the images you want WITHIN your page, so... isn't that already a compromise?) But I digress - my point is about frames. I absolutely understand and appreciate the security concerns about content being 'framed', from cross-site scripting to phishing scams. However, I also have grown completely dependent on boulter's Express Logger. This tool completed the circle on my paperless eCaching, allowing me to take my field notes from CacheMate and quickly and easily log all my finds. In addition to security concerns, I appreciate that Jeremy and company have significant performance concerns, and need to be wary of external entities developing automated interfaces to the geocaching.com platform. However, in the case of this particular tool, server load on gc.com is actually reduced significantly, as I am saved multiple "search->render results->select->render results->Log this find" transactions, instead just having quick access to a chronologically-sorted list of all the finds for the day (week/month), complete with my trade and TB notes, timestamp for the find - even the appropriate link to email the cache owner in the event it's a virtual or ... Okay, so enough cheerleading. I understand the justification for adding the "no-frame" javascript - but like the background camp, I plead for a compromise: Given some form of review, could "approved" sites pull page elements as frames? I guess I should just shoot for the sky, and ask for a standards-compliant Web Services interface exposing a handful of key functions... but how do I motivate Jeremy into doing that?? Thanks again for the upgrade, and the new features. Looks great!! Have a great weekend, Billy (aka SnoWake)
  13. Hey there, SciFi Man! Dingo and I are MTZ residents... we try to make it to BADGES events when they roll around, and we're always good for a hike...
  14. Greetings, Fellow Cachers (specifically, you of the snow-loving variety)! As you've certainly realized, Tahoe has recently been buried in snow. I have first hand experience, as I had to dig myself out every morning (noon and night)! Anyway - I like to mix up my activities, and was just wondering: Anyone interested in a "Geocacher Ski Daze"? I know there are at least a few of us skiers/boarders among the community, and more likely there are a LOT. We could carpool, or caravan - and meet at any resort the group decides on. Depending on location, we can even do some caching WHILE riding the slopes! If you, or someone you know, might be interested... drop me a line and I'll start building a list. Then I'll propose a few dates, and see what works for people. Midweek or weekend - I'm down for anything, anywhere. I normally ride Squaw - but am willing to ride anywhere in Tahoe. See you on the slopes! Billy (aka SnoWake)
  15. Ditto - another Mozilla / Firefox user here who was baffled by the new icons, until I found this thread. Fired up The Evil Empire browser (IE) and finally saw the mouse-overs. Until then, I was trying to figure out if it was a graphical representation combining difficulty and terrain, with how long it's been since the cache was found as a modifier... Thanks for the new feature - even if I don't make use of it via the browser, it will certainly be handy in GSAK/CacheMate, once I figure out how to leverage it there.
  16. Okay, so this might be a bit of a stretch, but in my mind, this IS a geocaching topic ... I recently installed a Pioneer AVIC-N1 in my Tahoe. It's not entirely unlike my recent experience with a Blackberry (converged cell phone/PDA/wireless email/calendar device): It's pretty cool, but has a few major shortcomings. This isn't intended to be a product review, so I'll save that for my site and the Pioneer engineers. However, relative to caching - it would be GREAT if we (I) could burn a CD or DVD containing waypoint data in the appropriate format, and have the device auto-route me, much like the 60CS does (or this unit does, for data contained on its Navigation DVDs). Anyone else got one of these units, or had any luck using something other than the factory-provided DVDs for navigation points? My concern is - even if we could provide the waypoint data, it seems as if the unit is dependent on the DVD containing all the maps and supporting data... so it might be a matter of making a "backup" of the stock DVD, with additional waypoints added in. I haven't a clue, but just wondering if anyone else is using one of these. I'm sure the same concept holds true for any factory (OEM) installed navigation system, as well - although the data formats are likely to be proprietary in each case. I must say, the voice prompts are nice. For now (if I feel the need/interest), I'll manually select a point near my target cache, and let it lead me there, switching over to the 60CS when I get close. I'll be sure to post any discoveries... Billy (aka SnoWake)
  17. Now included in GSAK release 4.1 Thanks for picking that up. Yes, the parser is treating the underscore the same as a space so you would need to include the file name in double quotes. I will see if I can work around this, or at least add this information to the help file. Clyde - Glad to hear that feature will be included in an upcoming release (automating the export of a Mapsource file). On the topic of automation, I have one quick observation - although I'm quite certain it's entirely outside your control. When exporting a CacheMate file, the Palm installer is invoked. When that pops up (it's about in the middle of my script of MUCH processing, against two different databases), the whole process stalls until I click "Done". Just wondering if there's any switch or parameter that can be passed to the Install Tool to suppress this required interaction? On the topic of underscores: I don't have a problem with needing to quote them (e.g. I don't think you need to "deal with it") - but perhaps just a couple extra words in the relevant help page, to read something like: "You only need to surround these parameters in double quotes if they contain spaces or other special characters." It only took me two runs to figure it out - and with all your (space-containing) examples being quoted, trying that was an obvious first tactic. Thanks again for all your efforts, in front of AND behind-the-scenes!! BTW: Just upgraded to 4.0.2 (your response sent me looking for 4.1! ) and see that I was missing a couple of important fixes that followed the 4.0.0 release I was running. Getting closer by the minute to fully-automated e-caching! Thanks again to all, Billy
  18. There's a way to do that with Rick's tools on http://geo.rkkda.com but I can't look up which one right now... <reeling> Wow. Where has THAT link been all my life?!? I'm just getting started on the README - but this is amazing. You answered my next question before I even posted it!! MANY thanks again - to ALL! I'll report back soon on progress... Quick Update: I've set up a couple of simple batch files to automate the more routine bits of my processing of PQs and (some) exports within GSAK, as well as getting Rick Richardson's geo-* tools working with gpsbabel on my Linux server. Now if I can just automate the extraction of the GPX attachment from my email (or figure out how to make use of the geo-gpxmail command). I've updated my Procmail recipes to invoke it on PQs, but think I have to wait for tomorrow (the geo-demand command doesn't seem to do anything - presumably because I've already run my 5 PQs for the day?). Anyway - this is all really, REALLY cool stuff. Here's a couple of things I noticed, RE: GSAK automation:
- Couldn't automate the export of a Mapsource file (for uploading waypoints into GPSr)
- I found that I had to quote the IMPORT filename, even if it didn't include spaces, or I was getting weird errors from GSAK. Now, my filenames DID have underscores in them - perhaps you have to quote ANY special characters?
This is going to save me some SERIOUS time! Thanks again, Billy
  19. Clyde -- you rule. What's funny is that I was already using that option ("Use gc.com logon ID") - but I had entered my userID ("SnoWake") instead of my actual "owner ID" or "finder ID" - the numeric 'uid' (in UNIX terms ;-) - that is what's wanted for this field. So - I have no idea why it was working quite like it was before - particularly with regard to finds, where the user doesn't have the option to "override" the name, like you do when you place a cache. Who CARES why it was working like it was (matching roughly 3/4 of the finds)... Guess how many caches in my "found" database don't have a "found by me" date, now?? Zero. In the words of Kyle.... Kick a**! Thanks, Clyde. I'll keep an eye out for enhancements to the export options (glad to see someone else found some value in the idea, as well!) as time and priorities permit. In the meantime, I'm WAY closer, and can at least generate a simple, chronological output of my finds: http://www.gotwake.com/caches/found/UserFound.htm I could probably write a Perl script to omit all the log entries but mine - although it would take me longer than just waiting for Clyde to incorporate the functionality in release 6.3 Thanks again - what a GREAT tool. Now if I could just automate some of the extraction of attachments from my email (I already sort them to a folder on my Linux server using Procmail) - I could wrap that with a little GSAK automation (some could be done locally on the Linux box with GPSBabel, but... I kind of need access to GSAK's databases - at least by my current thinking).
The ultimate goal would be an automated system which frees me of some of the (nearly daily) drudgery of taking the day's 5 PQs, parsing them into the appropriate database, and generating a Mapsource file (for uploading waypoints into the GPSr), a CacheMate file (for sync to the Palm), a Streets and Trips file (for pretty pictures), and HTML (for the link above, or a personal "not found, closest to home" index "plucked" onto my Palm), and even back into GPX (for visualizing in ExpertGPS), etc. Granted, I've got the routine down pat - but it STILL takes awhile, and I'd love for it to just magically happen while I sleep (depending on when the daily batch of PQs arrive). To get started towards that goal, I'm going to at LEAST "batch up" the part of the process that's easily accomplished with the GSAK automation commands. I'll let Clyde, and y'all, know how it goes. Thanks again, Billy
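The "omit all the log entries but mine" script mentioned above is simple enough to sketch outside GSAK. Here's a minimal Python illustration (rather than Perl) that strips every log from a GPX file except those by a given finder - it assumes the Groundspeak 1.0 GPX extension namespace, so check the xmlns declaration in your own PQ files:

```python
import xml.etree.ElementTree as ET

# Namespace used by Groundspeak's GPX extensions (1.0 schema assumed here)
GS = "http://www.groundspeak.com/cache/1/0"

def keep_only_my_logs(gpx_xml, my_name):
    """Return the GPX document with every <groundspeak:log> removed
    except those whose <groundspeak:finder> matches my_name."""
    ET.register_namespace("groundspeak", GS)
    root = ET.fromstring(gpx_xml)
    for logs in root.iter("{%s}logs" % GS):
        # Copy the child list first, since we mutate while iterating
        for log in list(logs):
            finder = log.find("{%s}finder" % GS)
            if finder is None or finder.text != my_name:
                logs.remove(log)
    return ET.tostring(root, encoding="unicode")
```

Feed the result back through GSAK's HTML export (or any GPX-aware tool) and you've got the chronological "journal" with just your own entries.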
  20. A quick cross-post here - and actually, considering the location of both Bill, AND Delta Butter - I probably should have posted it here originally. Congrats to Bill of Green Achers for FTF! Also, would like to extend the invite to any of the CVC gang that would like to come along for a boat ride and a couple (few?) finds on Sunday, 8/22. See link above for details, or just drop me a line if you'd like to join in. Have a great evening - and again, MANY congrats to Bill for his perseverance! Billy (aka SnoWake)
  21. First, I'd like to second the KaiserKlan praise for Clyde and all his efforts for creating, maintaining, supporting and enhancing such a valuable tool for our community. I don't know where I'd be without it - and if you haven't registered it yet - I would strongly urge you to do so. When thinking about the money I shell out for caching: New GPSr (or two), upgraded PDAs, ammo cans, log books, disposable cameras, gas money, new tires, signature items... the cost of registering this program, compared to the value it adds, was a no-brainer. Now for the question part of my post: I sure hope this doesn't turn out to be an RTFM issue, but I have a problem I can't seem to figure out... In my "found" database, many (hundreds?) of the entries don't have a "Found by me" date. At first, I thought this might have been a result of the old, archived caches I had to import via .LOC files (before .GPX files were available for individual caches) -- and that made sense. However, I have since realized that this is not the case, and that this problem continues to grow. There are caches that I've found in the past week, still active, and returned by one of my "found" PQs - which, when imported, DO show my log entry... but no "found by me" date. I started through the process of just manually editing in this data - until I realized the scope of the problem, and how long it would take me given my current, totally manual process. Plus, the potential for data entry error is huge... While I've got your attention, I might as well describe my "goal" - as even once I solve the 'date found by me' issue, I've still got a little work to do. Here's what I'd like: Some form of output (HTML, plain text, CSV file, GPX file... whatever) which represents my 'caching journal'. E.g. it would be a chronological list of my finds, with JUST my log entries.
I guess a simpler way of saying it is something that approximates the look/functionality of "My Cache Page" at gc.com - but an "offline" (or even web-postable) version. I thought about rolling my own MySQL database, with a PHP web-front end, and just transferring all the data - but that felt a lot like re-inventing the wheel, when I think Clyde (and Robert Lipe, et al) have already done all the heavy lifting (not to mention, I'm more of an infrastructure guy than a developer - I'd struggle with such an undertaking). So - my apologies in advance if this is clearly covered in the documentation (it hasn't jumped out at me) or has already been discussed here or in another thread. Also, thanks in advance for any guidance or suggestions - on either or both issues. Have a great evening, Billy
  22. I'll THIRD that. I recently lost a GPSMAP 60CS (DOH! ) and had to order a replacement. Well, the sad reality was, even once the replacement arrived, I still couldn't make use of my auto and bike mounts... because I realized that the "yoke" (for lack of a better term) that allows the 60CS to clip into these various mounts was lost with the unit (and obtained as part of the "Auto Navigation Kit" I bought the first time around). I dealt with both mail-order shops I purchased the original and replacement units from (getfeetwet.com and compuplus.com), asking if it was possible to obtain JUST the "yoke" - and if not, what the cheapest way to obtain the part was. GFW had the nerve to tell me they'd sell me just that piece - for $20 - when an entire new bicycle mount has an MSRP of $18. When I responded with that, the "customer service rep" (and I use the term loosely) responded with "that mount won't work". HUH? At any rate -- to make a short story long... I finally emailed Garmin tech support, and described my "problem". Not only did they fully understand it - but they drop-shipped a complete bicycle mount (which included the part I needed) that same day. Similarly - the cartography group was incredibly friendly, helpful, and knowledgeable with regard to "recycling" my unlock code for City Select, so that I still retained the ability to unlock two units for my personal use. So, I've got to give Garmin the BIG thumbs up, based on my experiences... YMMV, Billy
  23. OUCH! My blisters weep for you, Marky! Like Nurse Dave, I'm a "real man" and seem to ALWAYS cache in shorts (and short sleeves, and...). I've actually gone to the doctor for a shot and some meds (the "DosePak"), but I'd say at its worst, my various exposures have paled in comparison to virtually ALL of those photos. Makes me envy guys like Kemosabe, who report immunity to the stuff. What a treat that would be, huh? I just consider P.O. an "occupational hazard", at this point: realizing that there are actions I can take to mitigate the risk, but that it's not entirely avoidable... ESPECIALLY when there's a free-running dog involved! On a different note - my TiVo is all set up to record "Word Wars" - and I'm second in line for a Kablooey autograph. My new cache should be approved before Sunday - so anyone interested (I've got a couple takers already) should be able to score at least a 3-fer (assuming you haven't already trekked out to Bethel Island for "Bethel Island Beauty") on the outing. Have a GREAT afternoon! Billy (aka SnoWake)
  24. I just wanted to congratulate Bill of Green Achers for his triumphant FTF on Delta Butter. Talk about overcoming adversity! A missing cache, and countless boat troubles - but he prevailed! Nice work. I believe it was Bill who once gave Skier4000 major credit for the lengths he went to in order to find a cache... but read the logs on this one, and I think you'll see that this takes the cake! Okay, not everyone has access to a boat. SO... I'm organizing a little outing to this cache - AND another I just placed out on the Delta yesterday. The latter IS auto-accessible, but probably easier by boat. I can take up to 9 more people on this coming Sunday, 8/22/2004. I keep my boat at Russo's Marina on Bethel Island, so we'll launch from there. We can head out and hit the two caches pretty quickly, and then if anyone wants to stick around for some wakeboarding -- you're MORE than welcome. My crew has an amazing success rate with first-time riders... If you're interested in coming along on this aquatic adventure, just drop me a line and let me know. Also, if you can't make it on that date, but would like to go - we can definitely work something out, as I'm out there all the time (this time of year ;-). I wonder when the new cache ("Valve Open" - an inside joke among my wakeboarding crew) will be approved, and if it will still be unfound come Sunday morning? I'll check in on this thread over the next few days, but your best bet is to email me if you're interested. See you on the sloughs, Billy (aka SnoWake)
  25. EUREKA!! This is EXACTLY the answer I've been looking for. You see, I was foolishly a**uming that my CA_found PQ was returning my 500 most recent finds - but not so! I guess they're sorted by date placed, or perhaps CacheID - so I was finally able to figure out why my query often had huge gaps, even a week after caches were found. So now, I've got two PQs: CA_found_old and CA_found_new, and selected a date that has each of them returning between 300-400 caches - obviously I'll have to adjust that date over time, but at least now - I'm getting them all!! WooHOO!! Thanks so much to all - but especially to IV_Warrior for holding my hand and showing me what I kept blindly overlooking. Now - I'm STILL a big fan of the feature request for a single, "all caches found" PQ, which would include archived caches. Unfortunately, I don't have log data for some of my finds (in GSAK) as a result of downloading them by way of GPX files individually, long after my log entry had aged off the page. Being able to run a single query for all finds, including locationless, disabled and archived caches of all types, would be INCREDIBLE!! Thanks again for all the help, Billy (aka SnoWake)