Feature request: Sharing PQs, and remove the 500 limit.


bjorges


One would think that if Groundspeak is ok with a cacher purchasing multiple PM's (and why wouldn't they be), then they would recognize the need for a "platinum" PM and create it. Charge accordingly.

 

There seems to be a constant mentality in the forums of "things are fine the way they are. Just develop some cumbersome kludgy workaround for what you want."

Yep, and if workarounds were the only solution, nobody would ever need to fix things that could be done more elegantly :laughing:

 

I am glad this thread has got new postings with more support for my request than nagging about what not to do.

A suggestion, IMHO, should be possible to make without getting flamed by those who think their world is perfect and don't care that anyone else has a different view.

 

Frankly, I had actually expected a response from GS on this thread, but they are obviously silent :D

Link to comment

Hmmm, I see a possible solution here.

 

Ok, so I buy additional subscriptions to get the extra PQs, fine.

 

How about then a feature request instead.........

 

The ability to link multiple Premium accounts together, with finds from one account treated as finds for all as far as the PQ server is concerned. Maybe treat the extra account as a slave account, enabling 10 queries a day instead of 5 and a library of 80.

 

On the library point, it would be nice if I had the ability to submit my PQs remotely, thereby enabling me to keep an unlimited library on my machine and have my machine send them in daily as needed. (Or is this already in the works? I seem to remember hearing a rumour.)

 

Sounds like a possible "Bell & Whistle" that will get additional Premium subscriptions sold.

 

From a data crunching point of view, it would probably only take a few extra lines of code in the Member profile system and the PQ handler. (I'm no SQL expert, but I've done similar things in Access and GSAK with little extra effort.)

 

This is a good idea - it solves two problems: it keeps your finds available on both accounts, so you're not downloading caches you have already done, and it keeps TPTB happy because they're getting the extra $$.

 

How about it, Jeremy?

 

And I very much agree with the previous poster: just because you don't see a use for something, or have no need or desire for a change or improvement (especially when it would have NO effect on you whatsoever if you choose not to use it), doesn't mean you should constantly put down the people who are trying to make things better.

Edited by FireRef
Link to comment
Why do I need so many caches?? I am part of a volunteer Search and Rescue group that covers an area of Ontario that is over 300 km in diameter. And we don't know when or where the next call will be. By maintaining a large volume in my GPS or on my laptop, I can cache in an area, after the search, without any last minute effort prior to departing for the search.

 

I realize that not everyone has this need but it is narrow minded to say no one has the need.

I have a friend who's on call 24/7 and can get called on at any time to head anywhere in the world.

 

Just recently he was out with his family one weekend 100 miles from home and got a call saying he had to catch a flight in 5 hours to Africa.

 

So, he drove 2 hours home, had 30 minutes to get ready, drove 30 minutes to the airport and then caught his flight 2 hours later.

 

I think Groundspeak should make a "world-wide" download available so that folks like my friend can download every cache in the world and have them at his fingertips no matter where he is.

 

In this case, it didn't matter because the nearest cache was 90 miles away, but still, he could end up anywhere and should be able to have all nearby caches on hand.

 

Where do you draw the line? Some people want to be able to download entire states and now entire countries. Why not the world?

 

Well, I'm sure Groundspeak is going with what works for the majority of people. It's just like what internet providers do. They offer several pricing structures. Slow internet is cheap. Fast is more and the fastest is even more expensive.

 

For the vast majority of people, being able to download 17,500 caches a week is more than enough, but nothing is stopping anyone from getting a second or third account if they need more than that.

 

Edit to add that I just realized multiple memberships wouldn't work because the 2nd account would download all the finds from the primary account since there's no way to eliminate caches found by certain people.

Edited by Skippermark
Link to comment

Sounds like a possible "Bell & Whistle" that will get additional Premium subscriptions sold.

The powers that be are not interested in getting more Premium Memberships sold.

At least that's what I read into their removing the option to buy a 1 month membership for $3 when they made a big change a few months ago.

At least they do have a new 1 season membership for $10 option available, until they take that away.

Salesmen they are not, or have none, or...

Edited by trainlove
Link to comment

Sounds like a possible "Bell & Whistle" that will get additional Premium subscriptions sold.

The powers that be are not interested in getting more Premium Memberships sold.

At least that's what I read into their removing the option to buy a 1 month membership for $3 when they made a big change a few months ago.

At least they do have a new 1 season membership for $10 option available, until they take that away.

Salesmen they are not, or have none, or...

 

Actually that was a good move on their part. I'm sure a number of folks would pop in for a month, download a bunch and then be gone for a good while, only to pop in for another month. Now you have to stay a while. And if you do this more than once a year, you start to think: for just a bit more I get the whole year, so I might as well go for it. Looks to me like they upped the revenue stream, which is not a bad thing, and did a good job of selling more premium memberships.

 

Jim

Link to comment

Actually that was a good move on their part. I'm sure a number of folks would pop in for a month, download a bunch and then be gone for a good while, only to pop in for another month. Now you have to stay a while.

But, but, but.

With the premium (sic) markup of 20% for a one-month membership, they actually would make more off of those kinds of people. And the premium markup of 33.3% for a one-season membership might be too much to spend for people who have no idea how good a PM would be or what they could get from it (don't say that it's all spelled out somewhere). Short-term memberships allow people to try things out cheaply, and generate MORE memberships than not having short-term memberships, I think.

Link to comment
With the premium (sic) markup of 20% for a one-month membership, they actually would make more off of those kinds of people.

 

Only if they signed up for at least 10 months out of 12 - which jholly was asserting they probably didn't - and even then, only if PayPal didn't take a fixed part of the fee for each transaction - which they do. I suspect that 12 one-month subscriptions probably resulted in Groundspeak seeing about $1.50 of the extra $6, with the rest going in fees.
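To make the fee argument concrete, here is a rough sketch of the arithmetic. The fee structure used (a fixed $0.30 per transaction plus 2.9% of the amount) is an assumed, PayPal-like rate for illustration only, not Groundspeak's actual deal:

```python
# Compare a year of $3 monthly payments against one $30 annual payment,
# after per-transaction processor fees. Fee rates here are assumptions.

def net_revenue(price, payments_per_year, fixed_fee=0.30, pct_fee=0.029):
    """Revenue left after per-transaction processor fees for a year."""
    gross = price * payments_per_year
    fees = payments_per_year * (fixed_fee + pct_fee * price)
    return gross - fees

monthly = net_revenue(3.00, 12)  # twelve $3 one-month payments
annual = net_revenue(30.00, 1)   # one $30 annual payment
print(f"monthly plan nets {monthly:.2f}, annual plan nets {annual:.2f}")
print(f"kept from the extra $6 gross: {monthly - annual:.2f}")
```

Under these assumed rates about $2.50 of the extra $6 survives the fees; heavier fixed fees per transaction push that figure down toward the $1.50 estimated above. Either way, the fixed fee eats a disproportionate share of small monthly payments.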

Link to comment
Where do you draw the line? Some people want to be able to download entire states and now entire countries. Why not the world?

OMG.

I must be stupid, of course GS actually should remove the 500 limit, and replace it with a limit on 1 - one cache. Why the H*** should anyone need more than that????

 

Sorry Skippermark: Your argument is just stupid IMHO.

Link to comment

I'm really not as much of a "status quo" person as people make me out to be. I'm just the kind of guy that offers solutions and work-arounds within an existing system until other (better) solutions come along.

 

But we're all missing the point here: Groundspeak has said over and over again that they did not create PQs so that people can maintain an offline database of caches - it was for the ease of downloading multiple points into your GPS. I doubt that they're going to bend over backward to make their model of Pocket Queries fit the needs of a function that they never intended it to meet.

=======================

 

But for the sake of argument, let's say that you're limited to 2480 caches PER DAY (I say that because even with 5 x 500 you can't get all 2500, even with the date range method).

 

The solution is to limit the criteria of the caches to ones you like to find. In my perfect world, I go for Traditional Caches with more than 1.0 for terrain and less than 4.5 for difficulty. I'm not much for micros (and there are some "non-categorized" micros out there), so I would exclude caches that are not Large or Regular. I'm also not going to download caches that I have found, and I'll exclude temporarily disabled caches.

 

My definition of "the Chicago Region" encompasses an area that has around 4,800 caches. However, those that fit the criteria of what I like to find limit the list to 829 caches (17%). I can get all the caches in the Chicago area that fit those criteria in 2 pocket queries - every day.

 

If users are more selective with the PQ criteria, rather than trying to download the ENTIRE database and pick and choose in GSAK, they can get a pretty good sampling of caches THAT THEY KNOW THEY WOULD LIKE. Go out, find the caches in your GPS, and you'll know they are the good ones (according to you). When you run out of caches, go back and download some more. There'll be more waiting for you.

 

The one complaint I have is that I wish Geocaching.com had more of the filters in their PQs that GSAK has (polygon searches, ranges of terrain and difficulty, etc.)

 

=======================

 

 

Even if THAT scenario doesn't meet your needs, how "up-to-date" is good enough?

 

Let's also assume that for some reason I can't get all of the caches I want every single day. What's the worst that can happen? I go for a walk outside and log a DNF because I didn't have a listing as up-to-date as I needed. :rolleyes:

 

I download the local caches once a week. Each week I get all the caches I need and they're pretty close to up-to-date. Is that good enough? Probably. It's not a matter of life or death to have up-to-the-minute copies of the Geocaching.com database on your local drive.

 

 

People need to stop being database administrators and let geocaching.com do that work. We get to go out and play with the results.

Link to comment
Where do you draw the line? Some people want to be able to download entire states and now entire countries. Why not the world?

OMG.

I must be stupid, of course GS actually should remove the 500 limit, and replace it with a limit on 1 - one cache. Why the H*** should anyone need more than that????

 

Sorry Skippermark: Your argument is just stupid IMHO.

That's okay. We don't have to agree.

Link to comment

Oh my god, I can't believe I get to correct the famous Markwell on a point, minor though it may be!

 

If TPTB were not interested in changing their original PQ model, there would not be logs attached to them, and they would not have doubled the saved-query limit to 40, which is enough to cover a full week's worth of individual PQs, or 5 duplicated ones that run quickly when the server resets the counts at midnight PST.

 

Up-to-a-week-old data is IMHO good enough for my purposes; the percentage chance that a particular cache has been archived in the last week is pretty low, and if I have trouble finding a cache, the history file I construct offline will give me a good enough idea to judge whether it's disappeared, been archived, or is just a good hide. This database is held on my laptop, which lives in my glove compartment.

 

As I've previously stated, if you're planning a road trip without a specific route, one day's worth doesn't get you very far at all, and 17,500 caches doesn't get you that far in some places either. My next road trip is northern California and southern Oregon. I won't know which direction I'm heading until I turn onto the freeway from my centrepoint; 17,500 caches gives me a 200 mile radius to work with, which is OK for my purposes this time as I'm doing a centre-based trip.

 

My last road trip covered California, Oregon and Washington, took 7 days and covered over 3,000 miles. I didn't do any caching on that trip as we had other priorities, but had I had my TomTom with its POI database, it would probably have been a different story, as I am sure the places we went would have had caches. It was a Twin Peaks locations tour, which was a lot like going caching, but with cool locations from our favourite TV show instead.

 

As for filtering, I don't do a lot of Multis but will do the shorter ones in passing; same for Puzzles. Traditional caches, though: if it's there, I will go find it, with no particular preference for size or difficulty.

 

And talking about missing the point: Geocaching's popularity is due to the flexibility of the sport/hobby. You make your own rules for how you do it; there are only a few basic common-sense guidelines in place regarding conduct in the field and placement of caches, mostly about protecting the environment we are using and our fellow cachers. The GS TOU are not the rules of geocaching.

 

What this thread is about is the standard commercial practice of supply and demand. In this case, some customers are "demanding" more data from the "supplier" in the hope that the supplier will recognize that "demand" and set a price to fulfill our requirements. In any other supply and demand model, the supplier is only too happy to supply more product to the customers, and often encourages this through price breaks for larger deliveries. Very few companies are interested when some other customers say they don't want any more product than they are currently receiving, and they certainly don't prevent larger customers from obtaining more product (limited-supply special sale prices excepted, as these are usually loss leaders).

 

In its basic terms: they have data, I have money, let's do a swap.

Link to comment

Oh, and just in case, let's call that upgraded membership "Gold" and reserve "Platinum" and "Diamond" for later requirements.

 

One other thing to remember: when 5% of your customers want to double their payments, they effectively become 10% (no, I'm not going to do the precise math at this time of night) (oh, it got the better of me: 9.5238%).
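The arithmetic behind that 9.5238% aside: if 5% of members each pay for two memberships while the rest pay for one (assuming every membership costs the same base price), the doublers' slice of total revenue is 10 parts out of 105:

```python
# 5% of members pay double; what share of total revenue do they become?
doublers = 0.05                       # fraction of members paying double
revenue = 0.95 * 1 + doublers * 2     # total revenue, in single-membership units
share = (doublers * 2) / revenue      # the doublers' share of that revenue
print(f"{share:.4%}")                 # 9.5238%
```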

Link to comment

What this thread is about is the standard commercial practice of supply and demand. In this case, some customers are "demanding" more data from the "supplier" in the hope that the supplier will recognize that "demand" and set a price to fulfill our requirements. In any other supply and demand model, the supplier is only too happy to supply more product to the customers, and often encourages this through price breaks for larger deliveries. Very few companies are interested when some other customers say they don't want any more product than they are currently receiving, and they certainly don't prevent larger customers from obtaining more product (limited-supply special sale prices excepted, as these are usually loss leaders).

 

Groundspeak is happy to supply you with more product right now - you can buy 2 PMs. I agree that it would be nice if there were a discount, say with the 5th PM being half-price, but then they'd have to decide if they wanted to address the issue of people clubbing together to buy them. ;) And it would also be nice if you could have both PMs on the same account, but if you're doing it for the PQs, you can use the same e-mail address, so there's just the minor inconvenience of managing the actual PQ entries.

 

A platinum membership with 25 PQs per day would be both "more product" in that you get more data, but also a different product in terms of marketing and software. I suspect that the amount of programming required to go from "1" to "more than 1" level of membership, to allow this, would be substantial, compared to the amount of revenue to be generated. Are there really that many people who would pay, say, $100/year for 20 PQs but won't pay $120 for 4x5 PQs?

 

Actually I think there might be a market for Platinum Membership, but mainly as a "vanity" item. A lot of PMs never run a pocket query and are just proud to "support the site". I suspect quite a lot of those would be prepared to support it quite a bit more. Kind of like the people who have black credit cards, or pay $25,000 for dinner :rolleyes: to listen to a presidential candidate speak. Of course, then you'd have a load of "Geocaching should be free, I could run this site for $200 a year" people complaining about how Premium Membership was already too elitist, and Platinum Membership proves that Jeremy is in fact Gordon Gekko.

 

More generally, don't underestimate Groundspeak's business acumen. For a small Internet company funded by cashflow, I don't think they're doing too badly. :)

Edited by sTeamTraen
Link to comment

What this thread is about is the standard commercial practice of supply and demand. In this case, some customers are "demanding" more data from the "supplier" in the hope that the supplier will recognize that "demand" and set a price to fulfill our requirements. In any other supply and demand model, the supplier is only too happy to supply more product to the customers, and often encourages this through price breaks for larger deliveries. Very few companies are interested when some other customers say they don't want any more product than they are currently receiving, and they certainly don't prevent larger customers from obtaining more product (limited-supply special sale prices excepted, as these are usually loss leaders).

 

Groundspeak is happy to supply you with more product right now - you can buy 2 PMs. I agree that it would be nice if there were a discount, say with the 5th PM being half-price, but then they'd have to decide if they wanted to address the issue of people clubbing together to buy them. ;) And it would also be nice if you could have both PMs on the same account, but if you're doing it for the PQs, you can use the same e-mail address, so there's just the minor inconvenience of managing the actual PQ entries.

 

A platinum membership with 25 PQs per day would be both "more product" in that you get more data, but also a different product in terms of marketing and software. I suspect that the amount of programming required to go from "1" to "more than 1" level of membership, to allow this, would be substantial, compared to the amount of revenue to be generated. Are there really that many people who would pay, say, $100/year for 20 PQs but won't pay $120 for 4x5 PQs?

 

Actually I think there might be a market for Platinum Membership, but mainly as a "vanity" item. A lot of PMs never run a pocket query and are just proud to "support the site". I suspect quite a lot of those would be prepared to support it quite a bit more. Kind of like the people who have black credit cards, or pay $25,000 for dinner :rolleyes: to listen to a presidential candidate speak. Of course, then you'd have a load of "Geocaching should be free, I could run this site for $200 a year" people complaining about how Premium Membership was already too elitist, and Platinum Membership proves that Jeremy is in fact Gordon Gekko.

 

More generally, don't underestimate Groundspeak's business acumen. For a small Internet company funded by cashflow, I don't think they're doing too badly. :)

 

I think it's reasonably obvious that there is a decent number of people willing to pay a little more to get a little more information in PQs (or a few more PQs). What isn't obvious is why Jeremy is unwilling to tap this revenue stream. Multiple memberships don't work unless the person decides to log every find with every account, so that when they run the PQs, they can filter their finds on the website rather than offline (therefore getting more caches online). And that would probably get rather irritating to cache owners: "Logging this find to keep my PQs straight", with 3-4 account names?

 

Why not just increase the limit? Or create the ability to tie accounts together - this has been requested before. It makes much more sense than randomly shutting down cache types (Virtuals) or moving caches to another website (Locationless) which doesn't have nearly the functionality that this one does, and which appears to have been at a standstill for a long time now in terms of new features (PQs for Waymarks being a primary one). But that's another issue... mostly.

Link to comment

GPS units now support up to 2000 caches (Garmin Oregon and Colorado). Presumably to meet a consumer demand.

 

A major function of GSAK is the ability to maintain some sort of offline database. Presumably a consumer demand.

 

Recently, there have been a couple of Fridays where PQs were not delivered.

 

In the Toronto area, there are over 4000 caches within 100 miles and Toronto is not the most dense area in the world.

 

Clearly the dynamics of geocaching are changing at a blistering pace.

 

It would appear that for some, but not all, the current PM offering is not keeping pace with the sport.

 

Seems like a fairly strong case for a change to the business model.

 

Perhaps Groundspeak should survey the customer base in a more formal way and see what the feedback is.

Link to comment
As I've previously stated, if you're planning a road trip without a specific route, one day's worth doesn't get you very far at all, and 17,500 caches doesn't get you that far in some places either. My next road trip is northern California and southern Oregon. I won't know which direction I'm heading until I turn onto the freeway from my centrepoint; 17,500 caches gives me a 200 mile radius to work with, which is OK for my purposes this time as I'm doing a centre-based trip.

Originally, I thought that 17,500 a week was more than enough for anyone, but after reading your post and remembering our trip to northern CA, I realized that it may not be. Within something like 35 miles of our hotel were over 5,000 caches. Add in your drive to OR and other places (400 total miles in diameter) and you'll be way over that.
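How far a weekly budget of 17,500 caches actually reaches depends entirely on local cache density. A quick sketch of the geometry; both density figures below are invented for illustration, as real densities vary hugely between rural and urban areas:

```python
# Radius of the circular area a fixed cache budget can cover, for an
# assumed uniform cache density (caches per square mile).
import math

def coverage_radius(cache_budget, caches_per_sq_mile):
    """Radius in miles of the circle a given cache budget can fill."""
    area = cache_budget / caches_per_sq_mile
    return math.sqrt(area / math.pi)

print(f"sparse (0.15/sq mi): {coverage_radius(17500, 0.15):.0f} mile radius")
print(f"dense  (4.0/sq mi):  {coverage_radius(17500, 4.0):.0f} mile radius")
```

At the sparse density the budget stretches to a bit over 190 miles of radius, close to the 200-mile figure quoted above; around a dense metro area it collapses to under 40 miles.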

 

Perhaps Groundspeak will implement a tiered structure. That way, those that want more can pay for it. Those that don't, don't have to.

 

I would say you could sign up for something like Trimble Navigator that will give you live, up to the minute results of cache status, but being from the UK, I'm sure that would do you no good once you go back home.

 

And that would probably get rather irritating to cache owners: "Logging this find to keep my PQs straight", with 3-4 account names?

Originally, I thought just getting a second PM account would do the trick, but then I realized it wouldn't work, for this very reason. You'd have to log all the previous caches you found with the new account so that choosing "found by me" would work correctly.

 

Maybe, as mentioned by others, GS will add a way to eliminate caches placed by and found by certain cachers.

Link to comment

A more powerful feature for eliminating caches in a PQ would be to have a PQ subtract caches that are on certain bookmark lists. Create a bookmark list of your primary account's finds and then have your secondary account subtract those caches.

 

What would make this much more powerful is that the function would also allow folks to make bookmark lists for any reason: a master ignore list is one good example. A group of friends could get together and create a bookmark list of caches they don't want to do. Share the bookmark list and each would be able to eliminate those caches.

 

Another way it could be used is when two cachers want to cache together and want to eliminate caches the other has found. They'd just run their PQ and subtract the other person's finds bookmark list.

 

It wouldn't be all that hard to do. Just treat any number of bookmark lists as ignore lists.
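Pending a site-side feature, the subtraction can be approximated client-side today. The sketch below drops every waypoint from a PQ's GPX file whose GC code appears in a shared text file; the one-code-per-line bookmark format and the function name are assumptions for illustration:

```python
# Client-side sketch of "PQ minus a bookmark list": remove every waypoint
# whose GC code appears in a shared ignore file (one code per line).
import xml.etree.ElementTree as ET

GPX_NS = "http://www.topografix.com/GPX/1/0"  # namespace used by PQ GPX files

def subtract_bookmark(gpx_path, bookmark_path, out_path):
    """Write a copy of gpx_path with all bookmarked GC codes removed."""
    with open(bookmark_path) as f:
        ignore = {line.strip() for line in f if line.strip()}
    tree = ET.parse(gpx_path)
    root = tree.getroot()
    for wpt in root.findall(f"{{{GPX_NS}}}wpt"):
        name = wpt.find(f"{{{GPX_NS}}}name")
        if name is not None and name.text in ignore:
            root.remove(wpt)
    tree.write(out_path)
```

This is essentially what GSAK's filters already do offline; the thread's request is simply to move the same set-difference to the PQ generator so the excluded caches never count against the 500 limit.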

Link to comment

They'd just run their PQ and subtract the other person's finds bookmark list.

So, let me understand this.

Each and every cache that exists, would have a public bookmark list for each and every cacher who has found that cache?

Sounds like an exponential growth situation in the making, let alone making regular bookmark lists impossible to see/use.

Link to comment

They'd just run their PQ and subtract the other person's finds bookmark list.

So, let me understand this.

Each and every cache that exists, would have a public bookmark list for each and every cacher who has found that cache?

Sounds like an exponential growth situation in the making, let alone making regular bookmark lists impossible to see/use.

I don't think that is what CR is asking for.

 

Currently a premium member can get a PQ that returns the caches on any public bookmark list, whether owned by them or someone else. That's how we do it when I go group caching with friends. One of us makes a bookmark list of the caches, then we each get our own nearly identical PQ so we can load up our GPSrs and PDAs without violating the TOU by sharing the PQ :o

 

CR is suggesting that we should also be able to get a PQ that excludes all the caches on any public bookmark list. A cacher who purchases extra premium memberships in order to download more caches could keep a bookmark list of found caches on their primary account, and the secondary accounts could then exclude those caches from their pocket queries.

Link to comment

Again, I'm just trying to understand why people need such an offline database*, when Geocaching.com has an online version that has all of the data. One comment was

A major function of GSAK is the ability to maintain some sort of offline database. Presumably a consumer demand.
But this wasn't written by Groundspeak.

 

I guess what I don't understand is this: even if you're going out with a limitation of 2,500 caches on a single day (or 17,500 in a week), if you're limiting the caches to ones that you want to find, that should be more than sufficient for your needs in finding caches, right?

 

Let's use a scenario:

 

Let's say I'm driving I-5 from Los Angeles to Oregon. Let's also say that for some reason I'm not going to use the caches-along-a-route method. In 50-mile chunks, I can cover that distance in about 8 circles.

[Attached image: map of the I-5 corridor from Los Angeles to Oregon covered by eight 50-mile circles]

 

In each of those 8 circles there are probably thousands of caches. If my criteria are such that I want to find caches that are traditional, multi and unknown, and I don't care about terrain, difficulty or container size, I'm stuck. But if I choose a date range that keeps it just under the 500 mark, I should get a sufficient number of caches to deal with in each circle.

 

For example, let's look at the third circle from the top, which covers 50 miles around Sacramento (near 38.71049, -121.56189). If I try to find a radius containing 500 caches around that point with no other criteria, I can only go out 11.86 miles. But I CAN push it out to 50 miles and keep it to 496 caches in that PQ if I choose only caches placed before May 31, 2004. That would give me 496 caches in a 50-mile radius around Sacramento that I could find.
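The hunt for a cutoff date like "May 31 2004" can also be done systematically. A hypothetical sketch: given any way to count the caches in a circle placed on or before a date (`count_before` below is a stand-in you would drive by hand from the site's search results), binary-search for the latest date that keeps the result at or under 500:

```python
# Binary search for the latest placed-before date that keeps a PQ at or
# under the 500-cache cap. count_before is a caller-supplied stand-in for
# "how many caches in my circle were placed on or before this date".
from datetime import date, timedelta

def latest_date_under_cap(count_before, start, end, cap=500):
    """Latest date in [start, end] where count_before(d) <= cap, else None."""
    lo, hi, best = start, end, None
    while lo <= hi:
        mid = lo + (hi - lo) // 2
        if count_before(mid) <= cap:
            best = mid                  # still under the cap; try later dates
            lo = mid + timedelta(days=1)
        else:
            hi = mid - timedelta(days=1)
    return best
```

Each probe costs one search-result count, so even a span of several years resolves in a dozen or so probes, which matches the "all of 5 minutes" experience described below.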

 

The same could be done for each of those circles (it took me all of 5 minutes to come up with that date range). By limiting the criteria to something (date range, container size, terrain, difficulty), you aren't limited by the radius any more, and you should be able to get a good sampling of caches THAT YOU'D LIKE TO FIND.

 

Will you get all of the potential caches? No. But these will be ones you'd like and would be in the area in which you're traveling.

 

====================

 

The point I'm trying to make is that while it would be really neat to slice and dice all of the data, Geocaching.com is always the best source for the data, as everything else is instantly stale. If you only download what you need to go hunting, and are creative in the ways you use the wonderful system you've got, you'll probably be able to spend more time actually caching.

 

 

*Do I keep an offline database? Yes. But I don't use it in such a way that I need to pull more than about 4000 caches per week, well within the tolerance of having one single account.

Edited by Markwell
Link to comment

Seems like an extremely cumbersome way to get around a limitation that is dated.

 

And what happens when you go to run your cumbersome set of queries and that is the day (and those days come frequently) when the PQ generator cacks? Let me guess: it's your own fault for waiting till the last minute to run the PQs.

 

Here is another way of looking at it. If there is demand for it and they are willing to pay for it, why would you not give it to them? I doubt that Groundspeak is so revenue rich that additional sources of revenue are not welcome.

 

The reality is that many cachers maintain some sort of offline database, if only because you can slice and dice in GSAK with ease compared to gc.com. And you avoid the days when the PQ generator cacks. And by having all of the caches for an area (as opposed to some cumbersome query that you may have gotten wrong), you can slice and dice while on the trip, even if you do not have access to the Internet.

 

I am not saying it is for everyone. In fact, I doubt I would take advantage of it. After developing a set of Date Placed PQs, I get what I need with the 5 per day... assuming they run when I ask them to run. But the fact that the topic continues to arise here, and the fact that it seems to draw such emotional responses, clearly demonstrates a need for the service.

 

Again, I offer the suggestion that Groundspeak survey the user community and see what the demand is. If it is minuscule, then forget it. But if there is sufficient demand, put a price on it and offer it.

 

Enough workarounds have been offered here. The point of the discussion is how we convince Groundspeak to offer us a more feature-rich membership.

Link to comment

Oddly enough I found a reference in a thread about raising the 500 cache limit.

 

There's no plans to adjust the way Pocket Queries are handled at this time. Instead we have been concentrating on applications that get you access in real time data via a mobile phone. As TotemLake has indicated you can get the information you need today by being creative with how your pocket queries are generated.

 

I haven't heard anything in any of the gobs of requests to increase the limits from anyone on the official front.

Link to comment

 

Again, I offer the suggestion that Groundspeak survey the user community and see what the demand is. If it is minuscule, then forget it. But if there is sufficient demand, put a price on it and offer it.

 

Enough workarounds have been offered here. The point of the discussion is how we convince Groundspeak to offer us a more feature-rich membership.

I suspect the demand would be minuscule. Most people either cache in a limited area or are willing to take time to plan their cache trips and prepare days in advance. Those who find they need to cache anywhere they might be, and who might be anywhere, use a mobile app like Trimble or the Groundspeak iPhone app, or they use the WAP site, or they find a wireless access point. The people who say they need offline access to a huge number of caches find all kinds of excuses why mobile access doesn't work for them, but most often it's the expense. Yet they say they would pay extra so that Groundspeak could profit from this. I wonder where the price point is at which Groundspeak would consider this, versus what the people who say they need it would still be willing to pay.

 

Groundspeak allows us to have offline databases in the first place because there was user demand for this. They would probably prefer that all users look up each cache on the website just prior to hunting so that they would have the latest information. In order to encourage getting the latest information the number of caches you can get in a PQ is limited. But you can get plenty of caches from which you can use third party tools like GSAK to filter and plan your caching. And then the next time you plan to go caching you get a new PQ for either the same or a different area. If you really don't know where your next trip will be until you actually leave, they want you to use a mobile solution to get caches to find. But if you know that you might be in one of two or three or even five places, you can get PQs for all of those.

 

Of course, once you have an offline database there is no guarantee that you will use it as Groundspeak would prefer. You can consolidate the data from many pocket queries over time and use stale data when caching. Experienced cachers soon learn the value of getting fresh data so they have the newest caches and can eliminate the ones that have been archived or disabled. Groundspeak has refused many requests to include archived caches in most PQs. This is probably just a way to limit the usefulness of building big offline databases that you really don't need.

 

There may be some people who claim that the current options don't work for them but they haven't made a convincing argument yet.

Link to comment

Firstly, let me say that I'm neutral on the PQ sharing issue raised here; my agenda is merely to advocate the proposal to raise the PQ limits by some means, financial if necessary. I do, however, know that there are PQ syndicates around with reciprocal sharing agreements (all PMs contributing PQs equally). For this to be viable in the UK, for instance, would take 70 PQs a day, which adds up to a significant number of members in the syndicate, assuming that they do not each use all 5 of their PQs for this purpose.

 

There's no point arguing about TOU violations; everybody knows PQ sharing is a violation of the TOU, but it still happens, and I'm sure that GS commits some effort to tracking down such activity and sanctioning such people. Perhaps those efforts should be directed at the cause, not the symptom, as this would result in a more lasting solution.

 

With regard to database load, comparing each query to a public ignore list would not place as much load on the system as some people think; it just depends on your methodology for when the comparison is executed. As far as I know, GS use MySQL for the DB, and this is a pretty powerful piece of software. Add to that the fact that processor speed has grown (with a continuing downward price trend) at a far faster rate than geocaching. By my calculations, the size of a database that includes the last five logs for all the caches in the world, and all the data that would be included in a PQ, would be around 2.7 GB in XML format, which is not so big when you consider the power of a quad-processor machine these days. Database speed and load rely not on the size of the dataset so much as the quality and relevancy of the indices.

 

It's also far more efficient to query a large dataset for a large number of results a few times than to query for a few results many times. Incidentally, Markwell, your suggestions on narrowing down the search by adding more and more criteria actually increase the load on the server quite significantly. Just consider the implications of calculating the distance of every cache in the dataset from an arbitrary centrepoint: this is something that just can't be indexed. The best you can do is have the system initially eliminate all caches outside a top-left/bottom-right box, then calculate the distance for each cache in that box. This load climbs if you just go for the nearest 500 and ignore the radius parameter; in effect this method requires the constant creation of temporary indices. It's far simpler and quicker for the system to eliminate all caches that don't meet a criterion on a fixed index such as State and then spit out all results that do.
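The two-stage radius search described here (a cheap bounding-box cut that an index can serve, then the exact distance calculation only for the survivors) can be sketched in a few lines. This is an illustrative Python sketch over an in-memory list, not Groundspeak's actual implementation:

```python
import math

EARTH_RADIUS_MI = 3958.8  # mean Earth radius in miles

def haversine_mi(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles between two lat/lon points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_MI * math.asin(math.sqrt(a))

def radius_query(caches, center_lat, center_lon, radius_mi):
    """Two-stage radius search: cheap bounding box first, exact distance second."""
    # Stage 1: bounding box. A fixed index on lat/lon can answer this part.
    dlat = radius_mi / 69.0  # roughly 69 miles per degree of latitude
    dlon = radius_mi / (69.0 * max(math.cos(math.radians(center_lat)), 1e-6))
    box = [c for c in caches
           if abs(c["lat"] - center_lat) <= dlat
           and abs(c["lon"] - center_lon) <= dlon]
    # Stage 2: exact great-circle distance, computed only for the survivors.
    return [c for c in box
            if haversine_mi(c["lat"], c["lon"], center_lat, center_lon) <= radius_mi]
```

The same principle carries over to SQL: the bounding-box stage can be served by an index, while the trigonometry has to be computed per surviving row, which is why centrepoint queries cost more than fixed-index criteria like State.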

 

There is another point to make: my experience of tuning a query to give around 490 results is that you need at least 5 previews to get it right, and this climbs a lot when you are not familiar with the area and the caching activity there. Each of these previews places a similar load on the system as a full PQ, so again, by tuning, you are placing more and more strain on the server, effectively running 25 previews to get 5 PQs, for a total of 30.

 

Let's examine the math of that for a moment. Imagine two cachers who are lucky enough not to have to work, and who spend 6 days a week caching and one day planning the week. One of them tunes his queries and stays within the 5/500 limit; the other has convinced Jeremy to create a platinum account, has access to, say, 20 PQs a day, and maintains an offline database of the area he covers with that. Let's assume that each continues this for 12 months. We'll also assume that they use the seventh day's PQs to add to the week's data.

 

Cacher 1 - each day runs 5 PQs and 25 previews to tune them: 30 a day, 210 per week, 10,920 per year, generating a total of 1,820 individual PQs. If we assume a low overlap of repeating data of 60%, that gives us 364,000 caches queried.

 

Cacher 2 - runs 20 PQs a day with the corresponding 100 previews to tune them. Say he runs different PQs each day of the week but keeps the week's PQs constant for the year: that's 120 a day for 1 week, 840 in total, with a further 7,140 for the rest of the year, giving 7,980 per year and 70,000 caches queried. We'll also allow another 300 previews per year for re-tuning, which can be done first time without previews if you use your offline database properly. We'll also say he's finding 20 a day and tuning his queries to use that space, but does so with his offline database, not affecting the server query load. That gives 76,240 caches queried.

 

In summary:

 

Cacher 1 - 10,920 queries, 364,000 caches queried, $30 subscription paid, 6,240 finds, 52 days spent online at GC.com tuning. (Found 1.7% of caches queried)

 

Cacher 2 - 7,980 queries, 76,240 caches queried, $120 subscription paid, 6,240 finds, 1 hard day spent tuning queries, 51 days spent with his feet up after checking his offline map and saying "I'll go there next week". (Found 8.2% of caches queried)
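The figures in this worked example can be sanity-checked with a few lines of arithmetic (a sketch reproducing the assumptions stated above: 5 previews per PQ tuned, 60% data overlap for Cacher 1, and a constant weekly set of 140 PQs for Cacher 2):

```python
# Cacher 1: re-tunes 5 PQs from scratch every day (5 previews per PQ).
c1_year = 5 * (1 + 5) * 7 * 52        # 30 server queries a day -> 10,920 per year
c1_pqs = 5 * 7 * 52                   # 1,820 individual PQs run
c1_caches = int(c1_pqs * 500 * 0.40)  # 60% overlap leaves 40% fresh data: 364,000

# Cacher 2: tunes a weekly set of 20 PQs once, then just re-runs it all year.
c2_year = 20 * (1 + 5) * 7 + 20 * 7 * 51  # 840 set-up queries + 7,140 re-runs = 7,980
c2_caches = 20 * 7 * 500 + 6240           # 140 distinct PQs + finds-driven re-tuning = 76,240
```

These are the poster's own modelling assumptions, so the totals only hold as well as those assumptions do.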

 

We'll forget the logging of caches, as it is equal for both cachers in this example. The question is: who's putting more load on the server now, even without factoring in centrepoint queries versus state-based date queries? Also, who's got more data than he needs? And finally, who's the better customer from a revenue-to-product point of view? Who's contributing more towards equipment and bandwidth costs?

 

There are other advantages, Cacher 2 will be able to grab nearby caches along his route to the main caching grounds on impulse whereas Cacher 1 has to plan to do this.

 

Tools like Trimble Navigator are a great idea, but come nowhere near the level of functionality that a laptop with an offline database coupled to a sat nav with a POI database can give, not to mention my previous point about cell tower coverage: check out the coverage maps for Northern California, for instance; your iPhone wouldn't be a lot of help there. On top of this, my current combination can be utilised anywhere in the world, without massive overseas roaming charges. With international roaming as high as $8/minute for calls, I'd hate to think what data roaming would cost.

 

As for the elitism comment, that seems to be more prevalent amongst the opponents of a Platinum tier than the proponents; I certainly note they use the word "newbie" a lot. One of the reasons I weighed in on this discussion was to show that some old hands are for it too.

 

And finally, in answer to the I5 route: simply, "the road less travelled". There's a reason there is a series of caches titled "I Hate I5", and yes, they are on my finds list.

Edited by Volvo Man
Link to comment

Oh, this has really got the mathematical wheels turning for me.

 

It has just occurred to me, from a cost-to-implement point of view, just how logical the request for a Platinum Membership would be:

 

Cost of suitable server to satisfy this "Miniscule Demand"

$1000

 

Cost to copy PQ software to second server and change the number 5 to 20 and the number 40 to 160

$200

 

Cost of simple broad band connection per year:

$400

 

Electricity to run above per year

$200 (generous)

 

Total $1800

 

#of "Platinum $120" members required to recover Year 1 costs

Just 15 (a total of 300 MB/day upload, well within basic broadband connection capability and TOU)

 

Annual profitability if no further members upgrade

$1200

 

Wow, now that's what you call a return on investment. A 12-month break-even and 200% markup on ongoing costs are dream scenarios for any business in the world, except perhaps the recording industry :yikes:

 

Naturally, economies of scale would make this even more profitable, as bulk bandwidth charges are much lower per gig, and it's as easy to clone 5 hard drives as it is to clone 1. Plus, that $1000 server has a massively greater capacity to run queries than is being used in the above example.
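For what it's worth, the break-even arithmetic above condenses to a few lines (using the hypothetical cost figures quoted in this post, not Groundspeak's actual costs):

```python
# One-off and recurring costs for the hypothetical Platinum PQ server.
server, setup = 1000, 200        # hardware, plus copying/adjusting the PQ software
bandwidth, power = 400, 200      # recurring: broadband and electricity, per year

year1_cost = server + setup + bandwidth + power        # $1,800
platinum_price = 120
break_even_members = -(-year1_cost // platinum_price)  # ceiling division: 15 members

# If those 15 members renew and nobody else ever upgrades:
ongoing = bandwidth + power
year2_profit = break_even_members * platinum_price - ongoing  # $1,200 per year
```

The break-even count is only as good as the cost estimates feeding it; double the server or bandwidth figure and the member count scales accordingly.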

 

How about it, is that a good enough business case? I have at least 15 cachers in my address book who would be willing to pay $120/year for this. I really can't be bothered to trawl the forum to count how many others would be pro-Platinum, but I'm willing to bet that server would pay for itself pretty quickly. Don't forget all those subscriptions would be up-front payments, so the break-even could come as early as Day 1 if it was properly announced and advertised to the membership. (I'd make the Platinum payment 12 months only and not allow split payments.)

Link to comment

Oddly enough I found a reference in a thread about raising the 500 cache limit.

 

There's no plans to adjust the way Pocket Queries are handled at this time. Instead we have been concentrating on applications that get you access in real time data via a mobile phone. As TotemLake has indicated you can get the information you need today by being creative with how your pocket queries are generated.

 

I haven't heard anything in any of the gobs of requests to increase the limits from anyone on the official front.

 

Ironic choice of quotes given the fact that mobile phone access no longer works unless you have an iPhone.

Link to comment

There's already a "Platinum" membership. But you aren't allowed to talk about it. :laughing:

... and you just forfeited yours because you mentioned it. Take some time today to decide which 15 of your pocket queries to cut out once you're throttled back to 5 tomorrow. :yikes:

 

In all seriousness, I very much doubt that Jeremy is limited to 5 a day, and I'll bet he isn't above extending that capability to others (and why should he be? RHIP).

 

On the point of several levels of membership, there already are more levels of access: Basic, Premium, Charter, Founder, Reviewer, Administrator, Jeremy.

 

You can guarantee that the software engineers have a test account on the live database that has no limits on daily PQs, for the purpose of testing new functionality.

 

Adding a new level of access is likely closer to adding half a dozen lines to an ini file or some such to dictate the new parameters/limits etc.

Edited by Volvo Man
Link to comment

Over the summer, a group of friends went on a road trip. The original goal was to travel north. We'd all done all the work, setting up pocket queries, and gathering data.

Then, we decided to head in a completely different direction, with a different goal - 100 finds in 24 hours. So, new queries.

About halfway through the trip, we decided to change it up again. Time for another batch of queries.

But guess what? With selective filtering and a few stops for wi-fi, we all managed to get our queries, of the caches we wanted to hunt, within our daily allotment. Granted, if we hadn't run the first queries a day earlier, we could have had some problems. And this was the only time I've ever maxed out my PQs for the day (that does, indeed, make me pretty unique).

 

I like the pocket query setup as it sits right now. When I go caching, or when I'm getting ready to go caching, I really don't have a need for puzzles - I need time to solve them. Maybe I'll work on those next week, when I have some free time, and go get them when I head out that way. Don't need those loaded into the GPS. Multis? I'm looking for a cache or two as I'm headed to the office in another part of the state. I won't be hitting any multis. No need to download them. What's that? A five star terrain? I don't have any rock-climbing gear. They're gone. Virtuals? Hate 'em, don't do 'em (actually, I do like them - this is only an example). They're off the list.

Can you see where this is headed? Pocket queries allow for flexibility. You can choose what you want to do, when you're ready.

As others have noted, if you really want to get all the caches in your state, you just need to do a little homework and tweak your pocket queries a bit.

I'd much rather filter my query on the site, then in GSAK, than deal with that mess, thankyouverymuch. But to each his (or her) own.

Link to comment

There's already a "Platinum" membership. But you aren't allowed to talk about it. :laughing:

... and you just forfeited yours because you mentioned it. Take some time today to decide which 15 of your pocket queries to cut out once you're throttled back to 5 tomorrow. :yikes:

...

on the point of several levels of membership, there already are more levels of access - Basic, Premium, Charter, Founder, Reviewer, Administrator, Jeremy.

...

It would surprise me if all those other than Basic were just different names for the same level of access outside the forums. I do know that Charter and Premium are the same. Reviewers do have access to the review process (obviously), and I'd assume Jeremy and other admins have a bit more 'super secret' access so they can administer the site, but that's one of the bennies of being an admin. I know I get more access on the office computer than my 'underlings,' but that's because I need it to do my job.

 

(edit to add... that was a bit contradictory, wasn't it?) :laughing::laughing:

Edited by PJPeters
Link to comment

It would surprise me if all those other than Basic were just different names for the same level of access outside the forums. I do know that Charter and Premium are the same. Reviewers do have access to the review process (obviously), and I'd assume Jeremy and other admins have a bit more 'super secret' access so they can administer the site, but that's one of the bennies of being an admin. I know I get more access on the office computer than my 'underlings,' but that's because I need it to do my job.

 

(edit to add... that was a bit contradictory, wasn't it?) :laughing::yikes:

 

he he, thanks for accidentally agreeing with me. :laughing:

 

I don't doubt that Charter and Premium are just the same level of access, but it does demonstrate that the system already has multiple account types with minor changes in their config, in this case the word Charter replacing the word Premium. This shows that vast amounts of programming would not be required to create another account type for a second tier of membership.

Edited by Volvo Man
Link to comment

Some people have voiced the opinion "you don't NEED all that data"

 

You may well be right! (Shocked you, didn't I?)

 

But if we're to start making that qualification, let's ask some more questions:

 

Do you need a GPS? Actually, you don't; you can achieve the same result with a compass and a map. That's called letterboxing.

 

Do you need a cell phone? Not at all; there are plenty of call boxes around, and besides, it probably won't work out in the boonies anyway. Much better to carry a distress flare and a portable CB radio; no pesky roaming charges either.

 

Do you need that shiny SUV? I know you've got them, but honestly, when did you last go off-road in it? And even if you have, you could have walked there just as easily; it just takes a bit more time. I honestly hope it isn't used for driving up to every cache in the woods. Anyway, there are so many caches around these days that a bicycle would be enough to get you around, and should keep you occupied for quite a while.

 

Do you need a premium membership? You can get all that data for free from the site online, one page at a time! If too many people start asking that question, GC.com disappears.

 

People asking "Do I really need it?" on a global scale is causing some pretty serious economic crises right now. If companies started supplying only what customers needed rather than what they want, some major brands would disappear overnight. Harley-Davidson, Ferrari, Bang & Olufsen, Apple, Cadillac, Magellan, Garmin: all these brands spring to mind as manufacturers of products designed exclusively for people who want more than they need to achieve the same basic result, and who are willing to pay a premium.

 

One more point to add: using open wifi points is illegal in the UK (and therefore technically a breach of GC's TOU), unless it's an advertised public-use hotspot (which I've never seen). There are a handful of subscription-based hotspot providers ($240/year), but they are usually inside businesses like Starbucks, who like you to actually be a customer if you're going to sit down and plug in your laptop in their store. Not sure how the law sits in the US regarding open wifi networks.

 

At the end of the day, our sport is a development of the more basic sport of letterboxing. Basically, geocaching is based upon the utilisation of ever-developing technology and systems, stemming from the US government's decision to open up the functionality of an existing system, at no cost to the rest of the world. Some of us are just asking for some areas to continue developing to keep pace with the rest.

Edited by Volvo Man
Link to comment
For a cacher who purchases extra premium memberships in order to download more caches: they can keep a bookmark list of found caches on their primary account, and the secondary accounts can then exclude those caches from their pocket queries.

 

I didn't quite understand what CR was saying at first, but this made it clear. Thanks!

 

We sometimes cache with others and try to avoid going to caches that someone else has already done. The found-bookmark idea would work, but only if everyone else has created one. Having an option to eliminate caches found or placed by certain people would do the same thing, and it would avoid having to require people to create bookmarks. It could be set up to work with regular expressions so that people could choose not to show caches found by multiple people.

 

The way we do it now is to pick an area, then use Lil Devil's VIP script and manually look at each cache to see if one of the group has found it. That works, and it's not too bad, but it would be much more efficient to say "Don't show caches found by CacherA | CacherB | CacherC".
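The suggested "Don't show caches found by CacherA | CacherB | CacherC" option amounts to a regex filter over each cache's list of finders. A minimal sketch of the idea, using hypothetical cache records rather than real GC.com data (this is not an existing site feature):

```python
import re

# Hypothetical records: each cache with the set of cachers who have logged a find.
caches = [
    {"code": "GC1ABC", "name": "Riverside Micro", "found_by": {"CacherA"}},
    {"code": "GC2DEF", "name": "Hilltop Ammo Can", "found_by": set()},
    {"code": "GC3GHI", "name": "Old Oak Multi", "found_by": {"CacherB", "CacherC"}},
]

def hide_found_by(caches, pattern):
    """Drop any cache with a finder matching the regex, e.g. 'CacherA|CacherB|CacherC'."""
    rx = re.compile(pattern)
    return [c for c in caches
            if not any(rx.fullmatch(finder) for finder in c["found_by"])]

todo = hide_found_by(caches, r"CacherA|CacherB|CacherC")
# Only GC2DEF survives: nobody in the group has found it yet.
```

In practice, GSAK users can already approximate this offline by merging each group member's finds into the database and filtering there; the suggestion is to have the PQ engine do it server-side.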

Link to comment
The point I'm trying to make is that while it would be really neat to slice and dice all of the data, Geocaching.com is the constant best source for the data as everything else is instantly stale.

GC.com's data is stale, too. The data is only as fresh as the last log written on any one cache. GC.com is only one level up the stale ladder.

 

If you only download what you need to go hunting and be creative in the ways you use the wonderful system you've got - you'll probably be able to spend more time actually caching.

Actually, the less creative you have to be in getting the caches that cover your caching range and criteria, the more caching you can do.

 

Also, you can't download more than 5 (6 if you've found that cache) logs at a time. Offline databases are much more powerful in this respect.

 

Offline databases are also handy when GC.com goes down, the user's internet goes down, or any other issue gets between the GC.com database and the user's.

Link to comment

It amazes me how people become experts on database performance and also on business models in order to argue for this idea of providing large PQs. Why don't you send your resume to Groundspeak and apply for the position of Director of Platinum Membership? Jeremy and company are obviously doing it wrong and need your expertise to set them straight :yikes:

Link to comment

Offline database are also handy when GC.com goes down, the user's internet goes down, or any other issue that could get between the GC.com database and the user's.

Actually, when geocaching.com goes down, I can still read every cache page I need and log finds by going to wap.geocaching.com; that's on a different server, and both have never been down simultaneously. As for the database itself, I have not seen that go down.

Link to comment

It amazes me how people become experts on database performance and also on business models in order to argue for this idea of providing large PQs . Why don't you send your resume to Groundspeak and apply for the position of Director of Platinum Membership? Jeremy and company are obviously doing it wrong and need your expertise to set them straight :yikes:

 

Expert? I never said I was, but to present my credentials: I have worked with both SQL and Access database design, maintenance and management in previous jobs, and it has become a bit of a hobby for me, as I find it a great tool to organise information, funnily enough, and I apply this in whatever I do that involves information.

 

My current job? I'm a terminal manager at one of the world's busiest international airports, which requires me to have a strong knowledge of dozens of different business models and practices, and to consider the needs and capabilities of the hundred-plus companies required to make an airport work, not including the 150-plus airlines we deal with and the thousands of people who pass through every day oblivious to what goes on behind the scenes. This, and powers of observation, give me the business insights I have explained here. Every company I deal with, including my own employer, the airport itself, offers tiered service levels for various premiums. The only model that doesn't entirely match the one I have suggested is the airlines' first and business class services, as the cost-to-service profile is reversed, but this is mainly designed to keep the service exclusive (elitist, I know, but it wasn't my idea).

 

Besides, people have used the lack of a convincing business case as a con argument, so don't get sore when someone who has thought carefully about the majority of the issues and can address the most legitimate concerns finally delivers one.

 

In the end, how would it affect you negatively if GS offered a Platinum membership for $120 a year? Or even linkable PM accounts? Another possibility would be copyable ignore lists, or better still, the capability to turn your finds list into an ignore list for another account, which would also be a great way to compete with another cacher. Everyone against seems to assume that GS would not invest in hardware to support such features, thereby harming their own caching experience. Now who's got a downer on GS's business acumen?

 

As I've shown, a brand new server to support such services would break even with as few as 15 members, and I'm sure the current system could support the equivalent of 45 new members in the short term until the revenue to invest in new hardware is safely in the bank. This would also mean Platinum members' queries being moved to another server, so your queries would be quicker and more reliable. It's pretty clear from some of the comments here that the PQ server is in need of upgrading, and this would be a great way to raise revenue towards that.

 

As for my resume, I don't think they'd want to pay my salary requirements, and while Seattle's a nice place to visit, I don't fancy living there, although I do really enjoy watching the flying fish guys at Pike Place Market. :laughing:

Link to comment

The point I'm trying to make is that while it would be really neat to slice and dice all of the data, Geocaching.com is the constant best source for the data as everything else is instantly stale. If you only download what you need to go hunting and be creative in the ways you use the wonderful system you've got - you'll probably be able to spend more time actually caching.

 

 

Excuse me, but does this "defence" of GS go with the moderator role?

Link to comment

Some people have taken the position that PQ size should be expanded because GPSrs can now hold more waypoints.

 

I fail to see the connection between these two bits of information. Sure, when I upgraded to a Venture Cx, I went from being able to load 500 caches to being able to load a seemingly infinite number of caches into the GPSr, but I fail to see why my GPSr having more capacity has anything to do with whether I should be able to download cache information for more than 17,500 caches per week. After all, my cache finding ability hasn't been upgraded.

Link to comment

Some people have taken the position that PQ size should be expanded because GPSrs can now hold more waypoints.

 

I fail to see the connection between these two bits of information. Sure, when I upgraded to a Venture Cx, I went from being able to load 500 caches to being able to load a seemingly infinite number of caches into the GPSr, but I fail to see why my GPSr having more capacity has anything to do with whether I should be able to download cache information for more than 17,500 caches per week. After all, my cache finding ability hasn't been upgraded.

 

Actually, it has been, and it can be upgraded further with what we are suggesting, i.e. paying for a higher level of service.

 

Your ability to find caches has been upgraded by the significant reduction in distance between caches; that is to say, less time travelling between them, more time at the cache sites trying to find them. From a yearly point of view in the UK, we are experiencing exponential growth at the moment: in 2006 the number of PQs needed to cover the whole country was 16, last year it was in the 30s, and this year it's 70. When I started in 2003, you could cover the whole UK in 3 PQs.

 

Further upgrade to your ability to find caches could be achieved by spending less time tuning your queries and planning each day's caching, and more time out there actually caching. This could be done by setting up repeating queries for a large area that don't need much maintenance (this is the beauty of date-range queries: unless time travellers start placing geocaches, all you need to allow for is a half dozen or so under the 500 limit for temporarily disabled caches coming back online). The only maintenance needed is to re-tune for efficiency once your finding has brought some PQs down to the 460s, and this is easily achieved by sorting your database in GSAK by date, then working out your new date cutoffs.

 

So to recap: in the last 4 years

GPSr - Upgraded

PC/Laptop - Upgraded

Internet Connection - Upgraded

Geocache numbers - Upgraded

Geocache types - Upgraded (although this can be a matter of opinion)

Ability to find caches - Partially Upgraded

Pocket Queries - not upgraded

Link to comment
But for the sake of argument, let's say that you're limited to 2480 caches PER DAY (I say that because even with 5 x 500 you can't get all 2500, even with the date range method).

Can you expound on this? I have a couple PQs that had 499 caches when I set them up.

The PQs could still be tweaked to offer 500 caches. Sure, they might miss a few, but they'll still give 500 caches.

Link to comment
But for the sake of argument, let's say that you're limited to 2480 caches PER DAY (I say that because even with 5 x 500 you can't get all 2500, even with the date range method).

Can you expound on this? I have a couple PQs that had 499 caches when I set them up.

 

There are a couple of reasons why the date range approach doesn't get you 500 caches per PQ:

1) If there's an average of 9 caches per day, then 55 days will get you 495 and 56 days will get you 504... oops. So you test until you get some number under 500.

2) Suppose you get the ranges set up with 10 consecutive PQs each returning 495 caches. Then a few get archived in each date range. Unless you're prepared to clunk through the PQ interface again, your PQs have become less productive. Once they get down to 460 or whatever, you have to take a deep breath and re-do all of them, shuffling them up to get closer to 500, but you're not going to do that until you have to.
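The bucketing that points 1) and 2) describe can be automated once you have the placement dates offline (e.g. exported from GSAK). A greedy sketch, with a hypothetical helper name and the under-500 headroom target assumed from the discussion above:

```python
from collections import Counter
from datetime import date, timedelta

def split_date_ranges(placed_dates, limit=495):
    """Greedily group consecutive placement days into date ranges of <= limit caches.

    Targeting a little under 500 leaves headroom for temporarily disabled
    caches coming back online.
    """
    per_day = Counter(placed_dates)
    ranges, start, count = [], None, 0
    for day in sorted(per_day):
        if start is None:
            start, count = day, per_day[day]
        elif count + per_day[day] > limit:
            ranges.append((start, prev, count))   # close the full bucket
            start, count = day, per_day[day]
        else:
            count += per_day[day]
        prev = day
    if start is not None:
        ranges.append((start, prev, count))
    return ranges

# Hypothetical region averaging 9 placements a day for 120 days:
days = [date(2008, 1, 1) + timedelta(d) for d in range(120) for _ in range(9)]
ranges = split_date_ranges(days)  # three buckets: 495, 495 and 90 caches
```

When archiving shrinks a bucket below your comfort level, re-running this over fresh data is the "deep breath and re-do all of them" step, just without the manual clunking through the PQ interface.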

 

As Markwell said above, time spent messing with PQs is time not spent caching. It's slightly cool to have all the caches in your country in a database, but only slightly, and only once.

Link to comment

Some people have taken the position that PQ size should be expanded because GPSrs can now hold more waypoints.

 

I fail to see the connection between these two bits of information. Sure, when I upgraded to a Venture Cx, I went from being able to load 500 caches to being able to load a seemingly infinite number of caches into the GPSr, but I fail to see why my GPSr having more capacity has anything to do with whether I should be able to download cache information for more than 17,500 caches per week. After all, my cache finding ability hasn't been upgraded.

 

Did you ever wonder why GPSr capacity was increased beyond 500?

Link to comment
But for the sake of argument, let's say that you're limited to 2480 caches PER DAY (I say that because even with 5 x 500 you can't get all 2500, even with the date range method).

Can you expound on this? I have a couple PQs that had 499 caches when I set them up.

 

This all depends on cache density and rate of placement in the area you are setting up. If you are going for a chunk of the California database, for example, you're going to have to pick a centrepoint and a fixed radius, otherwise you're only going to get a small fraction of all the caches.

 

With a query set designed to produce, say, 16,500 a week (allowing space for later placement), you're going to have more and more trouble getting 499 the closer you get to the present date, as placements can reach 15 a day. As I said above, you need to allow for caches being un-archived, so 495 is a good target, give or take a couple. I managed to get most of mine within a couple of caches of this target, but a couple are down to 487 due to nearly 20 being placed on the day at the end of the query.

Link to comment
This topic is now closed to further replies.