

Everything posted by Volvo Man

  1. You know, everybody is assuming that this item was really "mistaken for a suspicious device." Perhaps the person calling it in, or the first officer on scene, was a geocacher who really hates Walmart guard rail caches. Of course, that's not going to narrow down the field much.
  2. I take it from this comment that you ride a cruiser or similar, 'cause loose dirt is hella fun on a dirt bike! But I'd also like to know if it's a dirt road, so I can choose which bike to use. Like you say, loose dirt is not so fun on an expensive Harley, and it means you're gonna spend hours cleaning it later.
  3. Personally, any time this happens, I'd keep an eye out to see if any cachers who list their occupation as SWAT officer or bomb disposal expert post a cache less than 528 feet away within a few weeks. Or perhaps it's a new OC tactic.
  4. OZ2CPU, I think you need a taste of what things were like before anyone was paying for this service (which was provided by Jeremy). Step Into My Time Machine
  5. I think if nobody paid, the site would quickly vanish, or at the very least become unusably bogged down by popup adverts and the like. Not supporting GC.com means not supporting the geocaching community as a whole. You should have worked this out at any of the 12 events you've attended (which would not have been publicised for free without GC).

     I know where the money goes; it's not cheap to buy and run the equipment that GC are using to keep this site as fast as it is. And yes, it's probably only keeping a handful of people employed, and yes, I'm pretty sure there is a nice profit every year for those in the top echelon to enjoy. That's what ensures that they have an incentive to keep running and improving the site.

     This isn't overpriced CDs or DVDs we're talking about here, where only pennies go to the originating talent and the rest goes to fat cats like Simon Cowell. The nearest analogy I can think of is that we're subscribing to a high quality, free-to-post listing service that makes our hobby simple to get into and simple to do. If you want to see what could happen if there were no subscribers, try Craigslist and imagine you were trying to find a geocache on it from among over 1.5 million others.
  6. The point is "supporting the site" and the "geocaching community." And yes, it's real easy to do what you're doing; it'd be just as easy to buy 10 single-month memberships and raid as much data as you can in a month, then just use API status checking to keep your offline DB "fresh-ish" for however long you want, then lather, rinse, repeat. But is it "right"? That's for you to decide.

     For context, I happily do some things using my internet connection that certain "Industry Associations" would really rather I did not. But I am still more than happy to have 3 PM subscriptions renew annually here. Why? The active word is "community."
  7. I'm gonna have to disagree with you here. Lifetime memberships are a fantastic way for a company to raise revenue and help itself grow, but they have to be set up carefully, and the revenue raised managed appropriately.

     Most lifetime memberships are based on a 10-year model, that is, 10 times the expected annual revenue from a subscriber. As the PM price hasn't changed for a very long time, and probably won't, given the rapid growth in membership coupled with the cost-per-member reductions presented by economies of scale, that gives us a starting price of $300. If that $300 is invested in safe, long-term instruments such as FDIC-insured CDs, it's going to return annual interest sufficient to at least cover the annual costs per member. If the yield is more than the cost per member, the excess can either be taken annually to boost profits or left invested to yield higher returns in successive years. This also leaves a nice little sum in the bank for emergencies, and negates the need for a separate contingency fund (to some extent).

     Now, lifetime memberships also have to be sold in a controlled way. If everybody takes the lifetime option, revenue will be huge for one year, then drop to a trickle, plus whatever return on investments you might get. Whilst this in theory means that, if invested as mentioned above, the company would have a secure and solid revenue, in practice it tends to breed complacency and stagnation in growth and development. Flat profits, even if they are high, will inevitably result in senior management boredom and can lead to rapid collapse of a business. So, any lifetime membership program must have limits. This can be a set number allowed per year (up to a max of 10% of subscribers is a good figure) on a first come, first served basis. Or it could be on a length-of-membership basis, perhaps over 5 years as a PM, for instance. It can even be done as a draw, but this would more than likely result in allegations of fixing from those who don't win, and would lead to some losses of existing customers (not a good idea).

     A lifetime membership option can even drive continuous growth in annual membership sales, as it gives new subscribers the opportunity to sample the benefits, along with something to aspire to if they find they like it. A simple example of this concept is the monthly option. Arguably, GS should get more benefit from monthly memberships (12 x $3 = $36) than annual ones, assuming they have negotiated a good per-transaction deal with the bank, but this means revenue trickles in, making forecasting and investment in growth tricky. So, they give us a discount to get our money now instead of over the next 12 months. I'll also bet that monthly subscriptions have a tendency to falter around December/January time.

     GS have a good advantage in the lifetime membership stakes compared to many other companies, as their offering is centralized and virtual. A lifetime membership that includes a regular printed newsletter or magazine means the costs per member will remain forever and will likely go up over time (postage and paper), even if the member has lost interest. The cost impact of GC's premium members is data queries and bandwidth; if a member loses interest, bandwidth costs stop within a week, and server time stops as soon as their PQs are unticked or cleared out after running. Those costs are also likely to go down in the future as bandwidth per dollar and processor power per dollar go up.
     $300 is a good price if GC decided to do LMs without increasing what you get from your membership, i.e. 5 PQs and 6,000 API refreshes per day, though it is a little on the low side. $500 would be a good figure (around the price of a top-end GPSr), but this would need to include some added benefits to attract customers, such as increasing to 20 PQs per day with unlimited API refreshes. Something exclusive should also be thrown in, perhaps a piece of software to automate PQ grid management and definition submission without using the website (a bit like eBay Turbo Lister).

     Advertising-supported free software with a paid ad-free option uses exactly this business model: a popular piece of software that is heavily used probably attracts more revenue from the push ads than paid downloads, but the paid downloads are secure and not dependent upon use. A similar concept is the lifetime warranty. I'm sure long ago someone told Snap-on Tools they were crazy to offer a lifetime warranty, because once someone buys a tool, they never have to buy another one (of that type). Well, they're still the market leader after 92 years, with an annual revenue of $2.9 billion! Why? Because there's always someone new to the trade to buy and grow their tool kit, and nothing grows brand loyalty like replacing a damaged wrench that's over 40 years old for free.

     So my vote goes thus: I'd pay $300 for lifetime membership at the current (but matching the future) PM benefits. I'd pay $500 for an increased-access LM as I've listed above, or better. I'd pay $750 for an LM with unlimited PQs and API access, plus unlimited whatever else comes along in the future. Incidentally, the unlimited $750 option would probably decrease the load on the servers somewhat, as I'd just run a full query of my PQ area the day before I go caching, instead of having to do so continuously because it takes over 2 weeks to run the full set once.
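     To make the arithmetic behind that 10-year model concrete, here's a minimal sketch. The $300 price comes from 10x the current $30 annual PM fee as described above, but the 4% yield and the $8 annual cost per member are purely illustrative assumptions, not GC figures:

```python
# Back-of-envelope check of the lifetime membership model sketched above.
# The price is 10x the $30 annual PM fee; the yield and per-member cost
# are invented for illustration only.
LIFETIME_PRICE = 300.00   # 10-year model: 10 x $30 annual fee
ASSUMED_YIELD = 0.04      # hypothetical CD interest rate
ASSUMED_COST = 8.00       # hypothetical annual cost to serve one member

principal = LIFETIME_PRICE
for year in range(1, 11):
    interest = principal * ASSUMED_YIELD
    surplus = interest - ASSUMED_COST   # excess left invested to compound
    principal += surplus
    print(f"Year {year:2d}: interest ${interest:6.2f}, "
          f"surplus ${surplus:5.2f}, principal ${principal:7.2f}")
```

     Under these assumptions the interest covers the annual cost from year one and the principal slowly grows, which is the whole point of the model: the member's money keeps paying for their own service indefinitely.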
  8. Another strategy: I use a netbook in the car with a large cache database, which is also loaded into various mapping software, including satellite view. If I'm going off the beaten path in a rural area, I first study the map to find the best point on a nearby trail or path from which to launch my attempt at the cache, then go to that point, which is rarely more than a few hundred feet from the cache. This usually means I have a decent landmark on the nearest trail to head for, such as an unusual tree, a lone bush, or something similar.

     Once back at a beaten path, memory will serve to get me back to the car if the GPSr isn't helping (I use tracks). If I'm walking the path and I see more than a couple of features I don't recognise, I am probably going in the wrong direction. At the end of the day, if you can tell the difference between a human-made trail and a deer track, following the human-made trail is going to get you somewhere helpful. If you know you've walked further on the path after the cache than before it, you're probably going the wrong way, so turn around. All human-made paths originate from somewhere humans can access easily enough, and go somewhere humans want to go.
  9. "lunarcaching.com is registered. How about lunaticcaching.com, that way ALL of US could take part! Since we all seem to qualify properly. Doug 7rxc"

     So, what's the premise? Are we hiding lunatics, then going to look for them, or are the lunatics doing the hiding and finding all by themselves? I see this would probably need a partner site to keep things under control: lunaticcatching.com. If we're hiding lunatics, it could be a good little online supply store, though shipping may be an issue.
  10. I note several things from the many, many threads that have come up over the years on this subject. Every thread is pretty much the same, apart from a couple of significant differences:

      1: The pro camp in each thread are mostly a different group of people from the last bring-back-virtuals debate.
      2: Most of the against camp are the same as the last one, and the one before that, and the one before that. This list of names changes only very slowly across the years.
      3: We all know that many reviewers cache under a different name (for valid reasons, no sock puppets here), and I often wonder how many of them post against virts under their caching profile rather than their reviewing profile. (Totally legitimate, but it'd be interesting to know.)
      4: When someone posts solid stats in support of the pro view, they are pretty much ignored by the against camp.
      5: This is probably the most civilized thread I have seen on this subject to date.
      6: The arguments used by the against camp never change.
      7: Whilst the pro camp use everything that has come up before, there are always a few new arguments for their case.
  11. As I have recently announced Here, I will shortly be releasing a line of Virtual Cache containers; this should clear up the arguments and pave the way for the return of virtuals. Now if you'll excuse me, I have a consignment of Air Guitars to shift.
  12. A little advice from an expert on this subject: I'm a Terminal Manager at a major international airport. Unfortunately, since 9/11, security has been getting progressively tighter. Whilst virtuals are not particularly a problem in the vicinity of airport facilities, the logging requirements need to be easy to find without drawing attention, such as reading a dedication plaque. Anything that requires abnormal behaviour to answer, such as reading the serial number from the underside of a bench, is going to draw the kind of attention you don't want, and unfortunately, my colleagues in security are not known for their sense of humour.

      In the case of physical caches, anything inside the boundary of a major airport, or in many cases even a minor one, has the potential to land you in jail if you don't have permission, and the bigger the airport, the more red tape you'll have to wade through to get it. Anyone deliberately hiding an item within airport grounds for any reason is creating a security risk. Anything deemed deliberate or potentially deliberate (I won't expand on that) will result in an escalated security alert, and anyone found to have deliberately placed such an item causing an alert will almost certainly be arrested (this could apply to both the original hider and all subsequent finders in the case of a cache). So, permission is paramount at an airport.

      I know for a fact that we would not allow a physical cache to be hidden anywhere on the airport, but one method I have seen in the past could work, and wouldn't require too much red tape. I once found a cache which was kept inside a barber's shop (anyone who knows me will know how ironic that is). You went to the co-ords listed, which put you in front of the shop, went inside, and asked the guy for the geocache. He wasn't a cacher, but I guess he knew someone who was, and kept the cache on a shelf inside. This would work at any airport retail outlet or information desk, as long as the cache container was completely clear, rather than the usual slightly cloudy Tupperware, and contained nothing which could hide something else. Permission for this would only need to come from the unit manager in the case of retail, or the Terminal Manager in the case of an info desk, but access would only be during business hours, and you would need to do a good sales job with plenty of information about geocaching if going in cold.

      Outside the airport is also becoming an issue nowadays. Anything on the final approach path is a no-no; anyone rooting around in the bushes under such an area runs the risk of explaining their presence in handcuffs at the business end of an M16. (Again, I'll not expand on this.) We had a TB hotel once, quite a way from the perimeter, but as the location was causing security alerts from the calls about cachers rooting in the bushes, it had to be moved.
  13. But would you bring back virtuals? I've already written the press release and recruited the reviewers, plus I have a whole new line of Geostashing gear ready to go (well, a box of magic markers and the old gear, mostly). In addition, I'll be doing a whole new line of Virtual Geostashing merchandise, such as Exclusive Virtual Stash Containers in all sizes (including Magnetic Nanos), Virtual Stash Log Books, Virtual Travel Insects, Virtual Stash Pens, Virtual T-Shirts, Virtual Hats, Virtual Rucksacks, Virtual Torches & Virtual Keyrings. Don't worry, you'll be able to get your GroundWhisper Inc gear well ahead of time; I'll be releasing it on eBay shortly, just as soon as I've made space by shifting this consignment of Air Guitars.

      Sadly, though, we will have to lay off Signal with a golden flippershake to make way for our new brand mascot, Cyril the Squirrel. Among new policies will be a donation-based voting system for new or old cache types to come back or be banned. Basically, the way this will work is that every month we'll pick a vote topic, and you all can vote for or against by writing yes or no on the back of a dollar bill (don't forget to log it at Where's George), then send it in. At the end of the month, we'll count the Dollars Votes and announce the winner. Oh, and we'll be rebranding the lackeys too; henceforth they will be known as minions..... Mwahhahahahhahahah
  14. I'll bet the most-caches-in-a-day record will soon be rising (as soon as there are enough mooncaches to do it).
  15. For US$20 million, I'd give up a lot of stuff, including geocaching. Then I'd buy out Jeremy, change the name to Geostashing.com, and become a pro Geostasher.
  16. It will never happen. I'm not that lucky. "They don't have a special icon, so there's no point." Not yet, but there is a campaign.
  17. You know, I just had this flash of the future, so I'm going to start my campaign now: BRING BACK MICROS
  18. To add some stats to ponder: In the UK, there are currently 205 active virtuals; of this number, 96 were placed before the wow factor requirement.

      Live caches in the UK at 31/12/2003: ~2,549
      Live virtuals in the UK at 31/12/2003: ~223 (8% of total)
      Remaining caches in the UK placed before 2004: 1,180
      Remaining virtuals in the UK placed before 2004: 174 (14.5% of total)

      Overall, that's almost a 100% better survival rate for those virtuals than for other types. Of those survivors, more than half are pre-wow factor, so maybe elsewhere they were not of such good quality, but here in the UK they obviously were, and without the help of the wow factor requirement. (I used pre-2004 stats as I don't have a full UK database for before Oct 2003.)

      The point that has been made here so many times is that if you can think of a "lame" virtual scenario (and I know our reviewers can reel so many of them off), there is a matching physical cache scenario. Perhaps they would share with us some of the "lame" traditionals that have been tried and pushed back on, and probably caused as much hassle. What sticks in my mind is the story of the rotten sneaker in the woods. This rotten sneaker is brought up as an example of a bad virtual submission, but I have to ask, how is it any different that this was submitted as a virtual rather than as a traditional? I really hope to the gods of things lost in the woods (I'll find out who that is later) that it would have been refused as a physical cache too, along with the dead animal carcass et al. (It's worth noting the animal carcass would not have been there for long anyway; I'd give it 3 weeks, then the bones another 6 months.)

      -edit- I promised earlier that I would find out who is the god of things lost in the woods. According to my research, the Viking name for one who through mystical powers was able to find hidden and lost things was Volva. Therefore, it turns out, I am the god of things lost in the woods. Wasn't expecting that outcome.
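      For anyone who wants to check the survival-rate claim, here is the arithmetic as a quick sketch, assuming "survival" means a pre-2004 cache that is still live, using only the figures quoted above:

```python
# Survival rates computed from the UK figures quoted above.
live_2003 = 2549          # all live caches at 31/12/2003
live_virts_2003 = 223     # live virtuals at 31/12/2003
remaining_all = 1180      # pre-2004 caches still live
remaining_virts = 174     # pre-2004 virtuals still live

virt_rate = remaining_virts / live_virts_2003                                    # ~78%
other_rate = (remaining_all - remaining_virts) / (live_2003 - live_virts_2003)   # ~43%

print(f"Virtuals:    {virt_rate:.0%} survived")
print(f"Other types: {other_rate:.0%} survived")
print(f"Advantage:   {virt_rate / other_rate - 1:.0%}")  # ~80%, roughly the 'almost 100%' claimed
```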
  19. It would be nice to have a way for outside applications to submit and manage pocket query definitions. I understand that the GC web interface is designed in a certain way to allow creation of a query, and I am not arguing with that. What I am saying is that I have loads of PQs which are basically the same but for a single change, and they work together to generate a larger group of results (just to get 15 miles out takes 10+ PQs adjusted by date placed). It gets very laborious changing the parameters for the whole group. There are three possible solutions I can see for this, two of which get flogged to death every few months on this forum:

      1: Allow the combination of PQs into a single super-PQ (i.e. allow the option to choose 5x1000 or 1x5000 per day). This could also be achieved by changing the PQ limits from number of queries to number of results per day (i.e. 5,000), kinda like the way the API has been set up. (Flogged to death in the past.)
      2: Increase PQ result limits to reflect current cache density. (This one really gets flogged to death.)
      3: Include an API method to submit PQ definitions from a remote source. There are several ways to do this; I think the easiest would be to allow the submission of a formatted text file using an agreed protocol. This could then be created and managed by offline programs, leaving the programming work to the authors of outside software. (A sketch of what that might look like follows below.)
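      Purely as an illustration of option 3 — no such endpoint exists, and every field name, URL, and function here is invented — a date-split grid of definitions might be built and submitted like this:

```python
# Hypothetical sketch only: the endpoint, field names, and format are all
# invented to show what a remote PQ-definition protocol could look like.
import json
import urllib.request

def make_pq_definition(name, center, radius_miles, placed_after, placed_before):
    """Build one PQ definition; every field name is illustrative."""
    return {
        "name": name,
        "origin": {"lat": center[0], "lon": center[1]},
        "radius_miles": radius_miles,
        "placed_between": [placed_after, placed_before],
        "max_results": 1000,
    }

# A date-split grid like the one described above: the same 15-mile circle,
# sliced by placed date so each slice stays under the 1000-result cap.
date_bands = [("2000-01-01", "2005-12-31"),
              ("2006-01-01", "2009-12-31"),
              ("2010-01-01", "2012-12-31")]
definitions = [make_pq_definition(f"Home {a}..{b}", (51.5, -0.12), 15, a, b)
               for a, b in date_bands]

# The submission call is entirely hypothetical (placeholder URL).
req = urllib.request.Request(
    "https://api.example.com/pq/definitions",
    data=json.dumps(definitions).encode(),
    headers={"Content-Type": "application/json"},
)
# urllib.request.urlopen(req)  # would submit the batch if the endpoint existed
```

      The point of the text-file approach is that changing one parameter across the whole group becomes a one-line edit and a single resubmission, instead of walking every definition through the web interface by hand.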
  20. I've seen about 5 personally, and I suspect there are a few times as many that I never noticed. But still, I think there are definitely few enough that they aren't a serious issue. That's part of the reason I'm prepared to assume the reviewers are doing it for a perfectly good reason, even if it's not entirely clear to me what that reason is. By the way, retraction isn't a complete deletion; from what I've seen, it just returns the listing to the reviewing state. One cache had several logs, including a find and discussion of why it shouldn't have been published. It was retracted, but then later republished after the problems were sorted out. In fact, I'd say all of the caches I've seen retracted could conceivably have been republished. Perhaps it's cleaner to republish a cache that's been retracted instead of archived.

      I don't think anyone here is really arguing about whether retracting a cache is right or wrong per se, although a couple of examples have been given where perhaps more appropriate methods could have been used. What's at issue with these caches is the "cleanliness" of the retraction. If a cache is found to be inappropriate after publication and finding, no matter whether it should have been allowed in the first place or not, it should be archived. If, for some reason, it is also deemed that access to the cache details should be removed from public view, then, and only after it is archived, should it be retracted. If a cache is mistakenly published before it is ready, then by all means retract it and await its release shortly, though I know this clogs up the reviewer's work queue if it's there too long. The reviewers have told many a hider who takes too long, or cannot maintain a cache for too long, that if circumstances change they are happy to un-archive at some point in the future, so it's obviously not an issue to archive any cache and resurrect it later if need be.

      It does appear that TPTB have fixed the dodgy data issue which these practices were causing, so the effect on offline databases is limited to the occasional incorrect status that just won't ever update. In the case of caches where there are concerns about people trying to find them in places they should never have been hidden, these absolutely should be archived before being retracted. Should a cacher not notice that the waypoint has not been refreshed for 5 years (or perhaps even that it was retracted 2 weeks ago), they might go looking for it and cause the very damage the retraction was intended to prevent (or, worse still, personal injury).

      In the vast majority of cases that have been discussed here, perhaps a more appropriate method would be for the reviewer to archive the cache, change the co-ords to a nearby location which would not be an issue (such as next to a live cache), and then lock the cache. In the cases of uncooperative or belligerent COs, any inflammatory or otherwise troublesome content could also be removed from the cache page; then at least the find logs would be visible to all for posterity.
  21. "Were you a reviewer when the site accepted virtual cache submissions? I was. As a statement of fact, virtuals were the single biggest source of aggravation, flames, insults and threats out of any activity that's been part of my volunteer work. The day that Groundspeak asks me to start reviewing virtual caches again is my last day as a volunteer for them. They know this, so systems like Waymarking and Challenges have been designed around that fact of life. For an illustration of how community-based voting works here, look at the first week of Challenges, when they counted as geocaching finds. Users were shooting down Challenges just for sport."

      I'm sure you are a valued and cherished contributor, but allocating so many resources to keep one volunteer happy is a strange way to run a business. Community-based voting? That was anarchy. The botched Challenges launch was a good example of what happens when you completely misread your customers. It was not really surprising when the initial revulsion immediately turned hostile. It's also not surprising that a lot of the big-numbers people saw Challenges as a threat.

      Just to note, my understanding is that the flak the reviewers got came from their judgements of the "wow" factor once it was introduced; should this requirement be added to any other type of cache, the abuse would return. It should also be noted that this "wow" factor was a change to the requirements that came from GC.
  22. "This argument is also amusing, since GC is known only for the highest caliber of hides! Funny how this is stated so matter-of-factly, and accepted as fact in this thread. But bring up that there are too many lame micros on this website in a lame micro thread, and you're like a mean, cold-hearted member of the radical vocal minority. There have been a few people out there who have downloaded the entire Opencaching.com cache database and crunched numbers: how many caches there are, how many are unique, etc. Where are these people? I'd like to see the current numbers."

      I think Fendmar was being wryly sarcastic with that comment (note the smiley).
  23. "That was absolutely NOT the right thing to do. That cache listing should have been locked, not retracted. That way the listing is still available to be viewed, but no one can add any logs to it and the CO can't edit it. I've seen listings be locked because they've been used as a forum, the CO is allowing armchair logs, etc. To me, retracting a long-active listing is unacceptable. Retraction should only be used to correct mistakes, such as a mistakenly-published cache (or for reviewers to have fun on April 1). As soon as there are any finds on a cache, it should never be retracted, but rather archived."

      I didn't know it was possible for a cache to be locked, as even an archived cache can be logged if you know how (I've been using one of mine for pit-stopping my TBs for years). So, on that piece of information, I agree: locking would be the right thing to do in this case.
  24. Well, I seem to have succeeded in my aim for this thread. I have just tried some API refreshes of the caches mentioned in this thread, and whilst their various statuses have not changed, suddenly the API no longer passes invalid data for them; it simply skips over them as if they didn't exist, thereby not wasting any API credits and not crashing my refresh halfway through, as was happening a few days ago. Thanks to GC for fixing that bug; much appreciated. BTW, the status check still returns the cache status. I tried manually changing the status in GSAK, then status-checking them, and it returned them to what they have been showing all along: archived for GC8D0D and disabled for GC1C5QB.
  25. On the matter of GC8D0D, as it is one that I have logged personally, I can absolutely vouch that it is a retracted cache. What happened was that the cache was archived by a reviewer not long after I found it. The reasons for archiving were the result of the CO setting up an autoreply to all contact via GC.com (one of the ALRs for the cache) which made a somewhat political statement about his view of a change in GC policy at the time.

      The CO didn't let it stop at the archiving, and began posting rebuttal notes to some of his cache pages that had been archived but were cross-listed and active "elsewhere". This was before the change in the way URLs were formed for cache pages, when anyone who knew the GC code for a cache could access and log it regardless of its status. I believe there were some similar goings-on on these forums from the CO too, which resulted in him being banned (this was at one time actually shown on his GC profile, as opposed to his forum profile, although it doesn't show anymore). Anyway, eventually, and quite rightly, the caches that were being used as soapboxes were retracted, as a cache page is not a discussion forum.

      At this point, the cache details and a few logs were in my GSAK database as finds, and got transferred over to a separate my-finds database. I hadn't ever refreshed the data in that database since the entries, once found, were simply moved over to record the point at which I had found the cache, and I had no further interest in using my PQs to keep logs and details up to date for caches I would never be searching for again. Fast forward to last week: I was using my finds data to test out some changes to a refresh macro, trying to stop some intermittent errors I had been experiencing, and I discovered that this cache's unpublished status was what was causing them. I then did a status check, and it changed from available to archived, which cured the issue, as my filter ignores archived caches. Having that personal evidence that the cache had once been active but now was "unpublished" led to the conclusion that retraction was being used on it.

      Now, in this case, due to the course of events, the retraction was done the way it really should be: the cache was first archived, then retracted, so it will not hamper the API (albeit this happened by chance long before the API existed). Once a status check has been done, the cache will forever be ignored by any well-formed refresh method, i.e. status first, then refresh what's live. In highlighting this issue, discovering the reasons for it, and then further examining the possible scenarios, I hope to bring to attention the need to archive a cache before retracting it. As can be seen in the case of GC1C5QB, if the cache is not archived before being retracted, anyone who has that cache in their DB, perhaps for tracking their finds or whatever, will not be able to eliminate it with a status check: even manually changing it to archived gets restored to temporarily disabled when status-checked, yet the cache is impossible to refresh, so an error results.
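      For anyone implementing their own refresh, here is a minimal sketch of that "status first, then refresh what's live" pattern. fetch_status and fetch_details are hypothetical stand-ins for whatever API wrapper is actually in use (a GSAK macro, an API client, etc.):

```python
# Minimal sketch of a well-formed refresh: check statuses first, then pull
# full details only for caches that are still live or temporarily disabled.
# fetch_status() and fetch_details() are hypothetical stand-ins.

def refresh(db, fetch_status, fetch_details):
    # Skip anything already known to be archived; once a status check has
    # marked a cache archived, it is never touched again.
    active = [code for code, cache in db.items()
              if cache["status"] != "archived"]

    # Cheap pass: statuses only. A retracted cache that was archived first
    # (the GC8D0D case) comes back "archived" here and drops out for good.
    for code, status in fetch_status(active).items():
        db[code]["status"] = status

    # Expensive pass: full details. A cache retracted *without* archiving
    # (the GC1C5QB case) still reports "disabled" above but fails here,
    # which is exactly the problem described in the post.
    for code in active:
        if db[code]["status"] in ("available", "disabled"):
            db[code].update(fetch_details(code))
```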