Posts posted by caderoux

  1. Even copies of PQs which haven't run for weeks seem to be handled much slower than fresh un-run PQs.

    I disagree. I enabled an old PQ to run on Friday mid-day. The PQ had last run 19 days previous. It was in my inbox within 5 minutes. Another PQ that had last run 4 days before and I enabled at the same time, took over 2 hours to run.

    Yep, it seems to vary greatly. I had one which came two hours later and hadn't been run in nearly a week, which would seem to indicate that a weekly run is not going to come at a consistent time. The poster's intention of running it daily instead of twice weekly - while making it more likely he'll have one no more than a day old - isn't as good as just using really old copies or virgin PQs on demand.

     

    My point being that you can re-use queries which haven't run in a long time, but you are better off not scheduling them at all and using copies which haven't run in well over a week (since the longest scheduled interval is a week - and even that doesn't reliably arrive within minutes the way a virgin PQ does).

  2. I would think that donating to a cause more closely associated with Geocaching/nature would make more sense.

     

    A local hider puts livestrong bracelets (or the coordinates for a one-time cache containing the bracelet) in caches as FTF prizes - they are popular.

     

    I think knock-offs would not be as popular, but GC-related would definitely be cool.

     

    I thought it might be a nice signature item, also, before you could buy three-packs with random sayings at the local service station.

     

    But yes, I also think the whole thing has gone way too far, and there's probably a new bandwagon just around the corner.

  3. I only run this query twice per week but I'll probably add to the server burden by requesting it more often just so I know I'll have a recent one to load and research before heading out Saturday morning.

    You may not do yourself much of a service with that technique, either. I found it is a lot more reliable to simply make a new PQ. Even copies of PQs which haven't run for weeks seem to be handled much slower than fresh un-run PQs (which appear to run almost instantly - begging the question of whether any of these problems with server load have anything at all to do with PQs). I know they are handled in separate batches, but it seems kind of odd to me.

  4. Oddly enough, for all the talk here of monetary support and clamoring for that as part of the structure of the solution (separate members-only servers, etc.), I haven't seen anyone say whether the direct cause of the problems is monetary.

     

    Money's important, and paying members are important - but will having more money actually improve anything more quickly in any way?

  5. Here are some alternatives which ARE within people's power while they are waiting for gc.com to be able to handle the load:

     

    1. Log in GSAK

     

    2. Don't log at all (there are plenty of people who don't log "lame micros", for some reason)

     

    3. Build another system - post your own logs on your own site and just link to gc.com via the waypoint

     

    4. Go to another geocache listing site and hunt and log caches there

  6. Come to think of it, the last cache I placed was made from a "Proximity Fuse" cylinder which I picked up from cheaperthandirt.com.

     

    Plus, it's in (a tree in) a cemetery, so although I guess I might end up getting geocaching outlawed in Louisiana (as looks like it may happen in SC with all the misunderstandings going on), all the people in the cemetery are already dead!

     

    Yes, I did paint over it - it is quite roomy and has a rubber gasket.

  7. Why not serve up smaller pages to reduce the server load and start limiting non-paying users to a more basic set of features?

     

    I don't see how gc.com can be a business based on paying users and simultaneously attempt to be a pro bono resource for the world geocaching community (unless they can figure out another way to fund it).

  8. When it comes to caching, the only number that's important to me is the number "one"...as in, "the next one".  :anicute:

    Exactly - to me it's just 1+1+1+1+1+1+1...

     

    And for that reason, I log all my finds and DNFs, lame or not lame - I even log notes if I drive by a cache and think about trying to find it again - I view that as the history of my activity. I keep my GSAK database up to date. I understand CoyoteRed not logging lame caches, but I simply log them and they come off my hitlist (and my unfound filter, too).

     

    Since lame caches rarely take a great deal of effort, it doesn't seem to me to be that big of a deal.

     

    Perhaps some places have a bunch of lame caches which clutter people's PQs or maps. I've basically cleared out a 25 mile radius (minus 18 caches on my hitlist for various reasons) from my home.

  9. What I do is change the day that is checked - then go to the bottom and submit it again. It usually comes in just a few minutes - last night it took a while, but it was better than waiting on the uncertainty of the Daily PQ.

    That's what I started doing, but re-triggering a PQ last sent a week ago was still two hours slower than creating a new one which had never run (I believe they are processed separately according to other posts here - never run queries have highest priority, then scheduled queries in order of last run).

     

    Not sure if I would have had better luck re-triggering a PQ run more than a week ago.
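The priority ordering described above is only inferred from forum reports, but as an illustration, a queue where never-run PQs jump ahead of scheduled ones (oldest last-run first) could be sketched like this - all names and dates here are hypothetical, not anything from gc.com:

```python
from datetime import datetime, timedelta

def pq_priority(last_run):
    """Sort key: never-run queries (last_run is None) come first,
    then previously run queries, oldest last-run date first."""
    return (last_run is not None, last_run or datetime.min)

# Hypothetical pending queries: name -> last run time (None = never run)
now = datetime(2005, 7, 1, 12, 0)
pending = {
    "fresh-copy": None,                 # never run, highest priority
    "weekly": now - timedelta(days=7),  # last ran a week ago
    "recent": now - timedelta(days=4),  # last ran four days ago
}

order = sorted(pending, key=lambda name: pq_priority(pending[name]))
print(order)  # never-run first, then oldest last-run
```

Under that assumed ordering, a brand-new copy always beats a re-triggered old one, which matches what people are seeing in practice.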

  10. This would be useful. The information must already be in a database table somewhere since reviewers check the .1 mile rule against all the stages of other multis (unless I've misunderstood something).

    Doubt it - the last time I made a multi all the other locations were simply input in the notes.

  11. Anybody adapting GSAK to geodashing (or should I say "adapting geodashing to GSAK"!)?

     

    Side note:  I tried to search for discussions on this topic and this thread showed in the search results but I had no luck finding which of the 14 pages of this thread contained the text I was searching for.  How does one search threads like this that are VERY long for a specific word?

    I am using it for gc.com, the few dashpoints I've done, and the terracaches, also - all in one database (when I tried geodashing I imported into another database temporarily, since there are so many dashpoints - I only brought over the ones I was going to attempt). On some multis, I also copy the main cache waypoint to additional waypoints of the form GCXXXX-n and correct the coordinates as I find them (on caches which aren't solved in a single outing).
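The GCXXXX-n convention mentioned above is just string bookkeeping; a minimal sketch of generating those child waypoint names (the cache code and coordinates are illustrative, and this is not a built-in GSAK feature):

```python
def stage_waypoints(gc_code, stage_coords):
    """Pair each solved stage of a multi with a child waypoint
    name of the form GCXXXX-n, numbered in the order solved."""
    return [(f"{gc_code}-{n}", coords)
            for n, coords in enumerate(stage_coords, start=1)]

stages = stage_waypoints("GCABCD", ["N 30 00.000 W 090 00.000",
                                    "N 30 00.100 W 090 00.100"])
print(stages)  # [("GCABCD-1", ...), ("GCABCD-2", ...)]
```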

  12. I don't know if this has been suggested, but a good option would be to include a GPX file attached and being able to specify the email address, too.

     

    Then GSAK would pull in the email automatically from my GPX mailbox and everything would be tied into my automated GPSr prep.

     

    Right now, I watch the new caches in my state and then trigger a PQ and then use GSAK.

  13. I unscheduled almost all my PQs, and went with the multiple copies method (and leaving them unscheduled and manually attempting to trigger them using the oldest one).

     

    But previously run queries seem to be taking a very long time to be handled: I went and triggered one which had last run eight hours shy of a full week ago - still not run after a few minutes. Creating a new copy runs immediately.

     

    It looks like the best strategy is shifting towards using the new-query feature for anything you expect to be able to control, and not re-using previously run queries (no matter how old) at all.

  14. idiosyncratic, that's a great idea.

     

    In the past, I've pulled out the laptop and cables from my car to show cachers I met on the hunt and give a quick demo.

     

    I brought along my laptop to the last event I hosted, expecting to be asked a few questions, but there was too much going on to give a demo. Please let us know how you pull it off at an event cache and any tips you might have.

     

    Thanks,

     

    Cade

  15. 30/90 has placed a few in New Orleans (U-Boat is one) along the bike path along Lake Pontchartrain. He has had some teething troubles, but they work. They are pill bottles weighted down with bricks. They are accessible at low tide by walking on the rocks.

     

    As far as maintenance, I don't think I would place one of these myself (he lives very close, I believe). The more permanent you make it, the more difficult to replace when it breaks - the less sturdy, the easier to replace, but the more likely to be compromised...

  16. Excellent, I will look through these.

     

    I will bring my CamelBak and expect not to attempt any hikes more than an hour from the car.

     

    Wife has already warned me that she's worried about me going caching alone!

     

    I guess y'all have quite a few more poisonous snakes than we have here. I'll have to keep all that in mind.

     

    Thanks a lot,

     

    Cade

  17. Someone deleted one of my locationless finds on a technicality - a rule I didn't even violate 100%. I don't mind that. What I do mind is that when your log is deleted, you don't get a copy of it with the deletion notification.

     

    However, thanks to GSAK, I never have to worry about losing information which used to be on the site here.

    When a log is deleted, you should have received an e-mail notification. In that notification is a hyperlink to the log's separate "permalink" page, so you can still read the text, restore the log, copy it elsewhere, etc.

    I got an email but I don't think it had a permalink - I'll have to check my archives.
