
500 cache limit on PQ


rdmnks


Seems that the rate of new caches just is not slowing down (a good thing). Maintaining a good database of the surrounding states you may cache in is not possible with the 40-PQ limit. I keep queries for not-found caches in CT, MA, and RI. I also wanted to keep data for NJ, NH, and VT, but I just don't have enough queries. I wind up having to delete and recreate them depending on which state needs to be "refreshed". I wish GSpeak would let us run a query for an entire state at some interval, maybe once a week, or raise the number of caches per query to 1000. Are others encountering this frustration? I am absolutely certain that this has been discussed many, many times, but if you don't speak up the answer is no. Why would GSpeak be so reluctant to increase customer satisfaction? With today's computer technology you would think this would be an easy fix.


Seems that the rate of new caches just is not slowing down (a good thing). Maintaining a good database of the surrounding states you may cache in is not possible with the 40-PQ limit. I keep queries for not-found caches in CT, MA, and RI. I also wanted to keep data for NJ, NH, and VT, but I just don't have enough queries. I wind up having to delete and recreate them depending on which state needs to be "refreshed". I wish GSpeak would let us run a query for an entire state at some interval, maybe once a week, or raise the number of caches per query to 1000. Are others encountering this frustration? I am absolutely certain that this has been discussed many, many times, but if you don't speak up the answer is no. Why would GSpeak be so reluctant to increase customer satisfaction? With today's computer technology you would think this would be an easy fix.

 

Simple answer, Groundspeak doesn't want you to keep your own database.

 

The listing of the database is the only product they have. If they let you keep your own database that could potentially be sold by you then they lose their product.


I am absolutely certain that this has been discussed many, many times, but if you don't speak up the answer is no. Why would GSpeak be so reluctant to increase customer satisfaction? With today's computer technology you would think this would be an easy fix.

Yes, this has been discussed many, many times. Groundspeak has given its answer many times as well. I wrote this in response to the question, with a summary of my take on Groundspeak's reasons. (While some of this is from Groundspeak's stated reasons, much is my personal interpretation. Based on PMs I received, I think it is fairly accurate.) For those who do feel the need to be able to access the nearest caches wherever they go, there are solutions with today's technology: Groundspeak provides an iPhone app, and Trimble Geocache Navigator is available on several other smart phones.

Edited by tozainamboku

I wouldn't mind seeing a 2,500-cache limit a day, broken up however you see fit: 1 PQ or several.

 

How about making it real simple?

 

17,500 caches per week, plus the My Finds PQ.

 

Any way you slice it or dice it: from 17,500 PQs with one cache each to 1 PQ with 17,500 caches in it.

 

I think this approach of a total cache count per week would add the flexibility we need. That would be perfect! How do we start raising our collective voice about this?

It's fine as it is.

There is no logical reason to increase the total number of caches you receive in PQs each week. Nobody could possibly find that many caches in a week.

 

Agreed, no one could find 17,500 caches in a week. In my case I enjoy maintaining a database of every cache in Kansas minus the archived caches. Why? Don't know... It's just something I like to do. Right now I use 12 PQs out of the 35 for a week. One of these days, years from now, I won't be able to do that. Thank goodness I don't live in California or other states where the number of caches exceeds, or is very close to exceeding, 17,500.

What benefit have I gotten from keeping such a database? Well, for one, it has helped me look for caches for the DeLorme/County challenges. Why not use the "cache along a route" option? It's too much of a headache and not very efficient. Another benefit is that no matter where I go within the state, I always have local caches to view and seek, since I have them on my Nuvi as a POI file.
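For anyone curious how the Nuvi POI trick works: a Pocket Query arrives as a GPX file, and Garmin's POI Loader will accept a plain CSV of longitude, latitude, name, comment. A minimal sketch; the embedded two-cache GPX stands in for a real PQ download, and the cache codes and coordinates are made up:

```python
# Turn Pocket Query GPX waypoints into CSV rows for Garmin's POI Loader.
# The embedded GPX is a stand-in for a real PQ file; codes/coords are fake.
import csv
import io
import xml.etree.ElementTree as ET

NS = "{http://www.topografix.com/GPX/1/0}"  # Pocket Queries use GPX 1.0

SAMPLE_GPX = """<?xml version="1.0"?>
<gpx xmlns="http://www.topografix.com/GPX/1/0">
  <wpt lat="38.9717" lon="-95.2353"><name>GC1AAAA</name><desc>Sample cache</desc></wpt>
  <wpt lat="39.0473" lon="-95.6752"><name>GC2BBBB</name><desc>Another cache</desc></wpt>
</gpx>"""

def gpx_to_poi_rows(gpx_text):
    """POI Loader CSVs list longitude first, then latitude, name, comment."""
    root = ET.fromstring(gpx_text)
    return [[w.get("lon"), w.get("lat"),
             w.findtext(NS + "name", default=""),
             w.findtext(NS + "desc", default="")]
            for w in root.iter(NS + "wpt")]

buf = io.StringIO()
csv.writer(buf).writerows(gpx_to_poi_rows(SAMPLE_GPX))
print(buf.getvalue().strip())
```

Point POI Loader at the resulting CSV and the whole state rides along on the Nuvi.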


Seems that the rate of new caches just is not slowing down (a good thing). Maintaining a good database of the surrounding states you may cache in is not possible with the 40-PQ limit. I keep queries for not-found caches in CT, MA, and RI. I also wanted to keep data for NJ, NH, and VT, but I just don't have enough queries. I wind up having to delete and recreate them depending on which state needs to be "refreshed". I wish GSpeak would let us run a query for an entire state at some interval, maybe once a week, or raise the number of caches per query to 1000. Are others encountering this frustration? I am absolutely certain that this has been discussed many, many times, but if you don't speak up the answer is no. Why would GSpeak be so reluctant to increase customer satisfaction? With today's computer technology you would think this would be an easy fix.

 

Simple answer, Groundspeak doesn't want you to keep your own database.

 

The listing of the database is the only product they have. If they let you keep your own database that could potentially be sold by you then they lose their product.

 

Unless you paid extra for that offline database. Groundspeak sells the Geomate Jr. with 250,000 caches loaded.

...no one has found 500 caches in a day, much less 2500...
Nobody could possibly find that many caches in a week.

The argument against raising the limit based on the fact that nobody would be able to find all the caches within the limit is SO lame.

 

Folks that fill their GPSs don't do so because they want to find them all, it's because they're not sure where they'll be when they realize they have time to cache, and it's nice to have some cache information with you at that time. If you don't have mobile internet access the only way to do this is with as many caches in your GPS as possible.

It's fine as it is.

There is no logical reason to increase the total number of caches you receive in PQs each week. Nobody could possibly find that many caches in a week.

 

Agreed, no one could find 17,500 caches in a week. In my case I enjoy maintaining a database of every cache in Kansas minus the archived caches. Why? Don't know... It's just something I like to do. Right now I use 12 PQs out of the 35 for a week. One of these days, years from now, I won't be able to do that. Thank goodness I don't live in California or other states where the number of caches exceeds, or is very close to exceeding, 17,500.

What benefit have I gotten from keeping such a database? Well, for one, it has helped me look for caches for the DeLorme/County challenges. Why not use the "cache along a route" option? It's too much of a headache and not very efficient. Another benefit is that no matter where I go within the state, I always have local caches to view and seek, since I have them on my Nuvi as a POI file.

 

Be careful. Maintaining your own database is against the Terms of Service you agreed to. You can lose your account for doing that, and especially admitting to it.


The argument against raising the limit based on the fact that nobody would be able to find all the caches within the limit is SO lame.

 

Folks that fill their GPSs don't do so because they want to find them all, it's because they're not sure where they'll be when they realize they have time to cache, and it's nice to have some cache information with you at that time. If you don't have mobile internet access the only way to do this is with as many caches in your GPS as possible.

 

We're talking about a very small segment of the GC population who are too lazy to prepare before they go out, or for whom the current limits would not work in their area.

Since the current limit of 17,500 covers a fairly wide area in even the most cache-dense region, and caching along a route is available, problem solved. Take out events and terrain-5 caches, which even the most foolhardy will tell you need preparation, and the area increases even more.

Combine that with the fact that a GPS that still gives you all the descriptions, hints, etc. will currently only hold 2,000 caches, and arguing for an increase seems shortsighted and not well thought out, at the very least. You would still need a laptop, and a wireless card is either included or very cheap.

Having said all this, constant requests are futile: between their business model and the fact that data becomes increasingly stale and potentially incorrect the longer the time between download and actual use, Groundspeak increasing the limit is unlikely.


Be careful. Maintaining your own database is against the Terms of Service you agreed to. You can lose your account for doing that, and especially admitting to it.

 

Wow, make me jump to the other side why dontchya? Please point to that because it would mean not being able to load 2 or more caches on my GPSr, which would be, in fact, an offline DB.

 

While I do not think GS encourages or supports them, I do not think they are disallowed for personal use.

The argument against raising the limit based on the fact that nobody would be able to find all the caches within the limit is SO lame.

 

Folks that fill their GPSs don't do so because they want to find them all, it's because they're not sure where they'll be when they realize they have time to cache, and it's nice to have some cache information with you at that time. If you don't have mobile internet access the only way to do this is with as many caches in your GPS as possible.

We're talking about a very small segment of the GC population who are too lazy to prepare before they go out, or for whom the current limits would not work in their area.

Since the current limit of 17,500 covers a fairly wide area in even the most cache-dense region, and caching along a route is available, problem solved. Take out events and terrain-5 caches, which even the most foolhardy will tell you need preparation, and the area increases even more.

Combine that with the fact that a GPS that still gives you all the descriptions, hints, etc. will currently only hold 2,000 caches, and arguing for an increase seems shortsighted and not well thought out, at the very least. You would still need a laptop, and a wireless card is either included or very cheap.

Having said all this, constant requests are futile: between their business model and the fact that data becomes increasingly stale and potentially incorrect the longer the time between download and actual use, Groundspeak increasing the limit is unlikely.

See, that's a lot better. Those arguments against the increase, while hard to follow because of the incomplete thoughts, are WAY better than "you can't find 500 in a day".

 

With the current limits I'm able to keep 1,000 of the closest caches in my GPS, and I get them refreshed daily so there is an acceptably small chance that they're out of date by the time I go hunt for a cache. So for me there is no need to increase the PQ limitations.

 

I can, however, see why others could benefit, so I'd support the change to increase the limits in PQs.


Be careful. Maintaining your own database is against the Terms of Service you agreed to. You can lose your account for doing that, and especially admitting to it.

 

Wow, make me jump to the other side why dontchya? Please point to that because it would mean not being able to load 2 or more caches on my GPSr, which would be, in fact, an offline DB.

 

While I do not think GS encourages or supports them, I do not think they are disallowed for personal use.

 

I don't make the rules. I was just letting you know that it was in the TOS.

Edited by bittsen

Be careful. Maintaining your own database is against the Terms of Service you agreed to. You can lose your account for doing that, and especially admitting to it.

 

Wow, make me jump to the other side why dontchya? Please point to that because it would mean not being able to load 2 or more caches on my GPSr, which would be, in fact, an offline DB.

 

While I do not think GS encourages or supports them, I do not think they are disallowed for personal use.

 

I don't make the rules. I was just letting you know that it was in the TOS.

 

Apparently you are making up the rules/guidelines. I can't find it anywhere in there, and you can't point to it.

In fact, in "3. License to Use Site; Restrictions" it states:

 

"You may make a single copy of the Site Materials solely for Your personal, noncommercial use..."

 

You did read the ToS, right?

Edited by baloo&bd

You do realize that you really don't have to find each and every cache that exists out there, right? Target all the 2/2 small caches - the 2,500 daily limit will load a very large area. Next, you can download all the 2/3 micros along your usual travel route. Later you can grab all the 3/1 regulars centered around a town 25 miles south. Yes, you will return to some of the same areas, but with new caches going out all the time you do that anyway. You just don't "need" 100% of all the caches across a huge area at your fingertips. If you think you do, the solution is either the iPhone app or the Trimble Navigator app. It is quite easy to load up more than enough caches to keep you busy with a little tweaking.


Be careful. Maintaining your own database is against the Terms of Service you agreed to. You can lose your account for doing that, and especially admitting to it.

 

Wow, make me jump to the other side why dontchya? Please point to that because it would mean not being able to load 2 or more caches on my GPSr, which would be, in fact, an offline DB.

 

While I do not think GS encourages or supports them, I do not think they are disallowed for personal use.

 

I don't make the rules. I was just letting you know that it was in the TOS.

 

Apparently you are making up the rules/guidelines. I can't find it anywhere in there, and you can't point to it.

In fact, in "3. License to Use Site; Restrictions" it states:

 

"You may make a single copy of the Site Materials solely for Your personal, noncommercial use..."

 

You did read the ToS, right?

 

Yes, I read the ToS

 

5. Access and Interference

Much of the information on the Site is updated on a real time basis and is proprietary or is licensed to Groundspeak by our users or third parties. You agree that you will not use any robot, spider, scraper or other automated means to access the Site for any purpose without our express written permission. Additionally, you agree that you will not: (a) take any action that imposes, or may impose in our sole discretion an unreasonable or disproportionately large load on our infrastructure; or (b) interfere or attempt to interfere with the proper working of the Site or any activities conducted on the Site or other measures we may use to prevent or restrict access to the Site.

 

The "maintaining a database" comment that I read and referenced (albeit incorrectly) was sent to me in response to a question. I thought it WAS part of the ToS, but I guess not. The actual content is above. The term "large load" applies to maintaining a database (as opposed to creating one).

 

My bad (possibly)


I also like having the limit based on the total number of caches. Most of my PQs are far smaller than 500 caches (bookmark-list PQs, PQs for small areas around a park, etc.). Having more PQs available, even keeping the total number of caches the same, would be a huge benefit.


The maintaining of a database comment that I read and referenced (albeit incorrectly) was sent to me in response to a question. I thought it WAS part of the ToS but I guess not. The actual content is above. The term "large load" applies to maintaining a database (as opposed to creating one).

 

This is where we run into problems. Even when it's in writing, people choose to get creative. It says what it says. These myths and interpretations are why there are as many guidelines as there are.

 

No, it does not mean, or have anything to do with, offline DBs, unless your point is that raising the limit would put a larger burden on the servers. It is referencing scraping methods or spiders which, in some cases, can put a load similar to a DoS on the system.

 

Sorry for the OT. Back to your regularly scheduled thread.


It's fine as it is.

There is no logical reason to increase the total number of caches you receive in PQs each week. Nobody could possibly find that many caches in a week.

 

While that is the way I use PQs (I grab one for the day's caching, and that's it), other people use them in other ways. Some people DO try to maintain their own database, for example. They will have a number of PQs set up to run on a scheduled basis and extracted into GSAK automatically via macros, and then select a day's worth of caching from that database. From the sounds of it, the OP is using PQs like this, and the 500 limit isn't keeping up.

 

So, yes... there IS a logical reason for the request... just not for the way that you and I use PQs


Be careful. Maintaining your own database is against the Terms of Service you agreed to. You can lose your account for doing that, and especially admitting to it.

 

Wow, make me jump to the other side why dontchya? Please point to that because it would mean not being able to load 2 or more caches on my GPSr, which would be, in fact, an offline DB.

 

While I do not think GS encourages or supports them, I do not think they are disallowed for personal use.

 

I don't make the rules. I was just letting you know that it was in the TOS.

 

Please support your claim with the facts. I would like to be able to quote those references if I ever have to tell someone that. Thanks.


I do a lot of random geocaching in my area. 500 caches makes for an amazingly small area around my house. 1k would be nice, just for convenience's sake. (Yeah, I run two PQs...)

 

10k would be better... but my GPS doesn't know how to deal with that many geocaches.

Edited by Arrow42

It's fine as it is.

There is no logical reason to increase the total number of caches you receive in PQs each week. Nobody could possibly find that many caches in a week.

 

Every once in a while, I want to run a query of all caches (found and unfound) in my area. The reason being, there is always a possibility of coming across a new location where I might want to hide a new cache. Having these caches loaded into the GPSr would let me know instantly if I was too close to another cache to hide a new one. As it stands now, there are over 500 caches placed around our area, so I have to run 2 queries to get them all, with the additional headache of having to figure out locations to run them from, then load them into GSAK and let it do the sorting to get them all. It can be done, but it is a pain which would be avoided if the query limit were increased...
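That "am I too close?" check is easy to do offline once the queries are merged. A rough sketch against the 0.1-mile (528 ft) saturation guideline, using great-circle (haversine) distance; the cache list and candidate coordinates below are invented:

```python
# Check a proposed hide against the 0.1-mile (528 ft) saturation guideline
# using the haversine great-circle distance. Coordinates are made up.
from math import asin, cos, radians, sin, sqrt

EARTH_RADIUS_FT = 20_902_231  # mean Earth radius (~6,371 km) in feet

def distance_ft(lat1, lon1, lat2, lon2):
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * EARTH_RADIUS_FT * asin(sqrt(a))

def too_close(candidate, caches, limit_ft=528):
    """Return the GC codes within limit_ft of the candidate point."""
    return [gc for gc, (lat, lon) in caches.items()
            if distance_ft(candidate[0], candidate[1], lat, lon) < limit_ft]

existing = {"GC1AAAA": (29.7604, -95.3698),   # hypothetical caches
            "GC2BBBB": (29.7800, -95.4000)}
print(too_close((29.7605, -95.3699), existing))  # ['GC1AAAA']
```

Feed it every cache from the merged PQs and it answers the "too close?" question instantly, no website round trip needed.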

 

There have also been times when we've wanted to cache in a cache-rich area where I wasn't positive which part I was going to. Houston, for example, has over 2,000 caches I believe, and trying to piece together a query for that area is not easy.

Seems that the rate of new caches just is not slowing down (a good thing). Maintaining a good database of the surrounding states you may cache in is not possible with the 40-PQ limit. I keep queries for not-found caches in CT, MA, and RI. I also wanted to keep data for NJ, NH, and VT, but I just don't have enough queries. ...
If the number of queries that you are allowed to run is insufficient, buy a second membership. You'll then get to run twice as many queries. Dump them all into GSAK and have it yank out your finds based on a 'my finds' PQ.

 

You'll end up with exactly what you are asking for.

 

Another possibility would be to tweak your PQs to maximize their effectiveness. Exclude as many caches as you can that you won't go after, and it will be just like TPTB increased your limit. Also, you could run some PQs less often: if it's rare that you go to NJ, don't run those PQs every single week. Older caches rarely get archived, so run those PQs less often too.

Edited by sbell111

There have also been times when we've wanted to cache in a cache-rich area where I wasn't positive which part I was going to. Houston, for example, has over 2,000 caches I believe, and trying to piece together a query for that area is not easy.

 

That means 5 PQs, 495 caches each to leave a buffer. Splitting by placed date takes a few minutes to set up, and you have it. Even you are saying, or at least implying, that this will be an exception and not the rule. You can't program for every "possible" exception, especially in light of the fact that the current system will work in almost every conceivable case.

As mentioned earlier, take out the ones you know you won't be doing and the number becomes significantly smaller, meaning 4, or even 3, PQs will do it.

Sorry, but it's not just that it is not valid for me; there has not been a valid reason given yet for an increase. Every one mentioned can be achieved easily within the current parameters.

I wouldn't mind seeing a 2,500-cache limit a day, broken up however you see fit: 1 PQ or several.

I like this idea.

 

I'm okay with the limit on the number of caches available to download per day, but if the limit of caches per PQ was raised, it would simplify caching for me when I don't have cell service.

 

We use a regular GPS for caching and an iPhone for looking up cache info. We often go to northern New England where cell service can be spotty. In the White Mountains, the Green Mountains and parts of Massachusetts, there are areas without service for miles around.

 

The latest iPhone version lets you download PQs directly to your phone. That's awesome, but in one area of the White Mountains there's no cell service for 30-35 miles across the mountains, and it would take 3 PQs to get all the caches.

 

Now, I realize I won't find 1500 caches, but if I'm in the area, I'd like to have access to them all because I'm not sure what I'll want to find. Again, if I had cell service, this wouldn't be an issue.

 

Creating PQs using the circle method isn't efficient, and if I do them by date, I'd have to load each one, one at a time, on the iPhone to see which PQ had the cache we were looking for.

 

If I could download one PQ with 1500 caches, that's all I'd have to download to the phone, and it would be really sweet.

If the number of queries that you are allowed to run is insufficient, buy a second membership. You'll then get to run twice as many queries. Dump them all into GSAK and have it yank out your finds based on a 'my finds' PQ.

That wouldn't work because there's no provision with PQs for filtering caches found or placed by another username.

 

If the OP chooses to exclude "caches placed by me" and "caches that I've found," that would do him no good because the second account probably hasn't found or placed any caches, and he would end up downloading all the caches owned and placed by his original username.

 

Edit to add that in this situation, the OP could double log all the caches he finds, using each of his usernames so that caches won't appear when he chooses to hide ones he's found or placed.

Edited by Skippermark

There have also been times when we've wanted to cache in a cache-rich area where I wasn't positive which part I was going to. Houston, for example, has over 2,000 caches I believe, and trying to piece together a query for that area is not easy.

 

That means 5 PQs, 495 caches each to leave a buffer. Splitting by placed date takes a few minutes to set up, and you have it. Even you are saying, or at least implying, that this will be an exception and not the rule. You can't program for every "possible" exception, especially in light of the fact that the current system will work in almost every conceivable case.

As mentioned earlier, take out the ones you know you won't be doing and the number becomes significantly smaller, meaning 4, or even 3, PQs will do it.

Sorry, but it's not just that it is not valid for me; there has not been a valid reason given yet for an increase. Every one mentioned can be achieved easily within the current parameters.

 

Better leave a bigger buffer than 5, as all child waypoints usually count when loading into the GPSr. I am still trying to figure out how to get the 17,500 caches requested earlier into my GPSr <_<
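The buffer point can be sanity-checked with a little arithmetic: on a GPSr with a 2,000-waypoint ceiling, child waypoints (parking, trailheads, stages) from the PQ's extra waypoints file count against the same limit as the caches. The numbers below are invented:

```python
# Budget check for a GPSr with a 2,000-waypoint ceiling: child waypoints
# from the PQ's additional-waypoints file count too. Numbers are made up.
GPSR_LIMIT = 2000

def fits(num_caches, num_child_waypoints, limit=GPSR_LIMIT):
    """Return (does it fit?, total waypoints consumed)."""
    total = num_caches + num_child_waypoints
    return total <= limit, total

print(fits(1900, 250))  # (False, 2150): the caches alone fit, the total doesn't
print(fits(495, 40))    # (True, 535): one full PQ plus children is fine
```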


Be careful. Maintaining your own database is against the Terms of Service you agreed to. You can lose your account for doing that, and especially admitting to it.

 

Wow, make me jump to the other side why dontchya? Please point to that because it would mean not being able to load 2 or more caches on my GPSr, which would be, in fact, an offline DB.

 

While I do not think GS encourages or supports them, I do not think they are disallowed for personal use.

 

I don't make the rules. I was just letting you know that it was in the TOS.

 

Groundspeak would go bankrupt if they whacked the account of everyone who maintains an offline database.

 

What do you think GSAK does??????

As mentioned earlier, take out the ones you know you won't be doing and the number becomes significantly smaller, meaning 4, or even 3, PQs will do it.

When we went on vacation a while ago, we filtered out everything over a 2 terrain (I always take all difficulties), and it only reduced the number of caches from 1500 to 1200.

 

I like keeping higher terrains loaded because there are a LOT that are overrated. Park on the side of the road, bushwhack 450 feet and find a cache. That's not a 4 terrain.

 

So, I filter out 5 terrains and it reduces the 1500 to 1475. Add 4.5s and you end up with 1465.

 

In my experience, the majority of caches are 2 terrain or less, so unless you filter out anything over 1.5, it doesn't reduce PQs by much.


The simple fact is that Groundspeak has no interest in helping you keep a large offline database. I can load enough caches to keep me busy for a week on a 900-mile trip with the services available to me (probably longer). I believe the system we have in place is just fine the way it is.

 

There is no reason to have bigger queries or more of them. This is coming from someone who keeps pretty close to 2000 caches in my GPSr and would have more if I could. I have to update them based on where I am going, but that isn't a problem. I want the most up-to-date caches at all times. Any offline database would go "stale" quickly.

 

We all have access to a massive database of caches. It is called www.geocaching.com.

 

StaticTank

If the number of queries that you are allowed to run is insufficient, buy a second membership. You'll then get to run twice as many queries. Dump them all into GSAK and have it yank out your finds based on a 'my finds' PQ.

That wouldn't work because there's no provision with PQs for filtering caches found or placed by another username.

 

If the OP chooses to exclude "caches placed by me" and "caches that I've found," that would do him no good because the second account probably hasn't found or placed any caches, and he would end up downloading all the caches owned and placed by his original username.

 

Edit to add that in this situation, the OP could double log all the caches he finds, using each of his usernames so that caches won't appear when he chooses to hide ones he's found or placed.

I think that you misunderstood my post.

 

You can set GSAK to filter out the caches that are in your 'my finds' PQ (or simply don't load them to your GPSr). There is no need for the PQ generator to filter out those caches or to double-log them.
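In other words, the second-account workaround is just set subtraction on GC codes once everything is in GSAK: the area PQs minus the My Finds PQ. A toy illustration; all the GC codes here are invented:

```python
# What the GSAK filter does conceptually: subtract the My Finds PQ from
# the area PQs run under a second account. GC codes are invented.
area_pqs = {"GC1AAAA", "GC1BBBB", "GC1CCCC", "GC1DDDD", "GC1EEEE"}
my_finds = {"GC1BBBB", "GC1DDDD"}

still_to_find = area_pqs - my_finds   # load only these to the GPSr
print(sorted(still_to_find))          # ['GC1AAAA', 'GC1CCCC', 'GC1EEEE']
```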

Edited by sbell111

I have to ask: why is it that every time a thread is started suggesting a new feature, those who are not interested in that feature have to be automatically AGAINST it?

That stinks, and so does reading all the pile-on in such threads.

You do not have to be against something others may want but that you may not use or need.

I would love to see a total update to the PQ system: one that uses a more efficient method of downloading directly instead of the poorest way possible (via email attachments). This would save tons of CPU usage (the single biggest drain on any internet server); sending by email uses tons of server CPU. Besides, email is an ancient and, at best, inefficient system, and not always reliable.

I have to ask: why is it that every time a thread is started suggesting a new feature, those who are not interested in that feature have to be automatically AGAINST it?

That stinks, and so does reading all the pile-on in such threads.

You do not have to be against something others may want but that you may not use or need.

I would love to see a total update to the PQ system: one that uses a more efficient method of downloading directly instead of the poorest way possible (via email attachments). This would save tons of CPU usage (the single biggest drain on any internet server); sending by email uses tons of server CPU. Besides, email is an ancient and, at best, inefficient system, and not always reliable.

I'd like a pony, but that's not going to happen. Since I can't have a pony, sharing in my desire for a pony would serve no purpose. Therefore, when I express my desire for a pony, people tend to explain why I can't have one and give me alternatives to a pony.

 

The fact that people often ask for a pony, and that the entire thing has been discussed many, many, many times before, tends to cause people to give less attention to explaining the many reasons not to get a pony and more attention to the pony alternatives and workarounds.

I'd like a pony, but that's not going to happen. Since I can't have a pony, sharing in my desire for a pony would serve no purpose. Therefore, when I express my desire for a pony, people tend to explain why I can't have one and give me alternatives to a pony.

 

The fact that people often ask for a pony, and that the entire thing has been discussed many, many, many times before, tends to cause people to give less attention to explaining the many reasons not to get a pony and more attention to the pony alternatives and workarounds.

 

QFT

I think that you misunderstood my post.

 

You can set GSAK to filter out the caches that are in your 'my finds' PQ (or simply don't load them to your GPSr). There is no need for the PQ generator to filter out those caches or to double-log them.

I knew what you meant.

 

I'm not saying this because I want it; I don't want/need a second account. I'm just thinking of possible issues that could come up with a second account.

For instance, if someone has a good amount of finds, they're not going to get another 40 PQs available to them.

 

For instance, if I had a second account and ran a PQ for where I live, 6 PQs would be filled up with caches I've already found or have been placed by me. Add in the other areas I regularly cache and another 6 PQs would be taken with caches I've already found.

 

Even filtering them out in GSAK wouldn't help because I'd be downloading so many I've already done.

I think that you misunderstood my post.

 

You can set GSAK to filter out the caches that are in your 'my finds' PQ (or simply don't load them to your GPSr). There is no need for the PQ generator to filter out those caches or to double-log them.

I knew what you meant.

 

I'm not saying this because I want it; I don't want or need a second account. I'm just thinking of possible issues that could come up with a second account.

 

For instance, someone with a good number of finds isn't really going to get another 40 usable PQs out of a second account.

 

For instance, if I had a second account and ran a PQ for where I live, 6 PQs would be filled up with caches I've already found or have been placed by me. Add in the other areas I regularly cache and another 6 PQs would be taken with caches I've already found.

 

Even filtering them out in GSAK wouldn't help because I'd be downloading so many I've already done.

The ignore list would be your friend, I would think.

Link to comment

I agree that downloading the data would be better.

 

**Work-around Warning** (but there's a point)

 

Let's use a hypothetical area of Chicago (pretty dense in caches). There are about 6,000 caches in the Chicago region covered by GONIL (see my sig).

 

Let's say I limit it to Traditionals only, but I don't care about the size. Let's also say that I limit it to terrain >= 1.5 (leaving all difficulties). That only narrows it to 4,026 caches (67.2%), which is still more than the 2,500 I can get in one day's worth of PQs.

 

Therefore I'm missing 1,526 caches from my offline database.

 

However, you could get a reasonably random distribution of these by grabbing the first 2,500 placed.

Jan-01-2000 - Dec-26-2005 - 495

Dec-27-2005 - Jan-27-2007 - 499

Jan-28-2007 - Oct-23-2007 - 499

Oct-24-2007 - Apr-19-2008 - 499

Apr-20-2008 - Aug-15-2008 - 497

This will NOT get you all 4,026 caches, but it does get you 2,489 of the caches you would like to find based on your criteria, and they are pretty randomly distributed. The other commonality is that these caches have all been around for a while and are not likely "fly-by-night" caches.

 

The point is that with the tools available, I could even get 2500 caches along a route or in an area - in a single day's worth of Pocket Queries that would be ones I would likely enjoy finding. If they are evenly distributed throughout the area by some other artificial means (like all caches placed earlier than August 15 2008), then you are finding a widely distributed bunch of caches that you would likely enjoy - 2500 of them.

 

It doesn't take much creativity or technical know-how to come up with solutions like this to fit each and every cacher's needs within the system. Unless you are determined to implement an offline database with more than the 17,500 caches allowed per user account per week. That's the true limitation that the site managers have implemented.

 

And with the availability of secondary premium accounts, there's not even an absolute limitation (as SBell111 pointed out). For $150 per year I could conceivably get 87,500 caches per week. That's all of the caches within 380 miles of my house, or an 8½ hour drive in ANY direction. <_<

Link to comment
I have to ask: why is it that every time a thread is started suggesting a new feature, those who are not interested in that feature have to be automatically AGAINST it?

 

That stinks, and so does reading all the pile-on in such threads.

 

You do not have to be against something that others may want, even if you may not use or need it.

 

I would love to see a total update to the PQ system, one that downloads queries directly instead of the poorest way possible: via email attachments. That would save tons of CPU usage, the single biggest drain on any internet server, since sending by email uses plenty of it. Besides, email is an ancient and, at best, inefficient system, and not always reliable.

I'd like a pony, but that's not going to happen. Since I can't have a pony, sharing in my desire for a pony would serve no purpose. Therefore, when I express my desire for a pony, people tend to explain why I can't have one and give me alternatives to a pony.

 

The fact that people often ask for a pony and the entire thing has been discussed many, many, many times before tends to cause people to give less attention to explaining the many reasons not to get a pony and more attention to the pony alternatives and workarounds.

 

But in this case, the subject is not a pony. Actually, the subject is a HUGE performance hit on the GC servers. Fixing the system should be priority number one for the team, after the forum.

 

And just because you think a subject is a pony doesn't mean you are correct. It goes back to my thought from weeks ago: around here, some people talk too much.... <_< (not directed at anyone specific!)

 

No doubt a revamp of the PQ system is in order. Email attachments just do not cut it for server CPU usage. I wonder how many PQs emailed out simply get deleted! A system where you had to go and click on a link to get the PQ would cut back on server usage big time.

Edited by Frank Broughton
Link to comment
The ignore list would be your friend, I would think.

I've never ignored a cache, so I don't know how it works, but if ignored caches don't show up in PQs that would definitely work.

 

With the lousy site performance, it would take you days to ignore a couple of thousand finds.

 

Edit to add: OMG. Just saw Skippermark has almost 6000 finds. Ignoring all of those would be a full-time job. Not to mention that the Ignore function might not support such a large number; I doubt anyone has tested it to that level.

Edited by Tequila
Link to comment

Markwell, is there an efficient way to download 2 or 3 PQs worth of caches in an area?

 

I usually set up my PQs by date and have found the tools on hand more than sufficient... up until version 2.1 of the iPhone app, which can download PQs directly from the GC site. Normally I cache in areas with service and wouldn't need it, but let's say I'm going somewhere with no service and 2 PQs is enough to cover all the caches in the area.

 

PQs load pretty slowly in the app, and if they're set up by date, a nearby cache could be in either PQ. It's not practical to keep reloading them to see if there are any nearby caches in PQ1 and then PQ2.

 

I'd like to set up those 2 PQs by area so that I only have to load one PQ until I move far away to another area, but I don't want a lot of overlap or missing areas.

Link to comment

I get around the 500 Cache limit this way.

 

If I'm gonna hit an area dense with caches, I set up 3 PQs for that specific area: one for traditional caches, one for multis, and one for unknowns (puzzles). I add in a few filters (attributes/terrain/difficulty) and I'm set to go.

 

I also do the same for my own area so I don't miss anything.

 

I've been lucky, though, living in a rural area, so I don't have many caches to choose from.

 

I would hate to have to run PQs somewhere like, oh, West Bend, Wisconsin, which claims to have 500 caches in just a 7-mile radius. Eventually I'll have to, as I'm going to West Bend next month. I'm going to have to start running PQs a week beforehand so I can make sure I'm covered by the time my trip comes around.

Link to comment

There have also been times when we've wanted to cache in a cache-rich area when I wasn't positive which area I was going to. Houston, for example, has over 2000 caches I believe, and trying to piece together a query for that area is not easy.

 

That means 5 PQs, 495 each, to leave a buffer. Set by place date, it takes a few minutes to set up and you have it. Even you are saying, or at least implying, that this will be an exception and not the rule. You can't program for every possible exception, especially in light of the fact that the current system works in almost every conceivable case.

 

As mentioned earlier, take out the ones you know you won't be doing and it becomes significantly less, meaning 4, or even 3, PQs will do it.

 

Sorry, but it's not just that it isn't valid for me; no valid reason has yet been given for an increase. Every one mentioned can be achieved easily within the current parameters.

 

Not easily...

 

At first I thought you had something there when you said to run queries by placed date. But then I got to thinking about it and can see that, most likely, some caches aren't going to make it into the query, at least on the first try.

 

For instance, I use set dates of 1-1-01 to 1-1-02. The 500-cache limit is reached before getting all the caches placed in that time period. Now I have to go back, try another, shorter date period, and see how the results come out. I may or may not get them all this time, but as you can see, I'm having to run another query just to find out. Unless I'm missing something, there may be quite a bit of trial and error in getting an accurate query, making this not as easy to accomplish as you make it sound.

 

I do admit that the 500 limit is more than satisfactory most of the time, so this is not a big issue for me. But there is no doubt that a higher query limit could and would be useful at times. Groundspeak allows 5 queries of 500 caches each, 2,500 total per day. I have to ask: would it really be more taxing on the servers to allow the same number of caches in a single query? If not, what is the reason for limiting it to 500?
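The trial and error described above can at least be made systematic instead of open-ended. Here is a minimal sketch of the idea, assuming a hypothetical `preview_count(start_day, end_day)` helper that stands in for checking the PQ preview page by hand (days are plain integers here; nothing like this exists on the site itself):

```python
def widest_range_under_limit(start, last_day, preview_count, limit=500):
    """Binary-search for the latest end day such that
    preview_count(start, end) <= limit.  Assumes the count is
    non-decreasing as the end day grows, which always holds for a
    placed-date filter."""
    lo, hi = start, last_day
    best = start  # fall back to a one-day range
    while lo <= hi:
        mid = (lo + hi) // 2
        if preview_count(start, mid) <= limit:
            best, lo = mid, mid + 1  # fits: try a later end day
        else:
            hi = mid - 1  # too many caches: pull the end day in
    return best
```

Each probe costs one preview, so pinning down the boundary within a year of dates takes roughly nine previews rather than repeated guessing.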

Link to comment

Markwell, is there an efficient way to download 2 or 3 PQs worth of caches in an area?

 

I usually set up my PQs by date and have found the tools on hand more than sufficient... up until version 2.1 of the iPhone app, which can download PQs directly from the GC site. Normally I cache in areas with service and wouldn't need it, but let's say I'm going somewhere with no service and 2 PQs is enough to cover all the caches in the area.

 

PQs load pretty slowly in the app, and if they're set up by date, a nearby cache could be in either PQ. It's not practical to keep reloading them to see if there are any nearby caches in PQ1 and then PQ2.

 

I'd like to set up those 2 PQs by area so that I only have to load one PQ until I move far away to another area, but I don't want a lot of overlap or missing areas.

You might look at selecting caches by type within the PQs (such as 1 = trads, 2 = all others). Depending on how you load the GPSr, you can have the cache type in the name, thereby telling you which PQ to look at. You could also use GSAK to put a PQ identifier (number/letter) in the name (that way you could use any PQ selection).
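The renaming trick can be sketched like so; the helper name and the one-letter type codes are my own invention for illustration (in GSAK itself you would do the equivalent with its waypoint-name formatting options):

```python
# Hypothetical sketch: prefix each waypoint name with a one-letter
# cache-type code so the GPSr listing shows which PQ a cache came from.
TYPE_CODES = {
    "Traditional Cache": "T",
    "Multi-cache": "M",
    "Unknown Cache": "U",
}

def tag_waypoint(name, cache_type):
    """Return the waypoint name with a type prefix, e.g. 'T-GC1234'."""
    return f"{TYPE_CODES.get(cache_type, 'X')}-{name}"
```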

Link to comment

Not easily...

 

At first I thought you had something there when you said to run queries by placed date. But then I got to thinking about it and can see that, most likely, some caches aren't going to make it into the query, at least on the first try.

 

For instance, I use set dates of 1-1-01 to 1-1-02. The 500-cache limit is reached before getting all the caches placed in that time period. Now I have to go back, try another, shorter date period, and see how the results come out. I may or may not get them all this time, but as you can see, I'm having to run another query just to find out. Unless I'm missing something, there may be quite a bit of trial and error in getting an accurate query, making this not as easy to accomplish as you make it sound.

 

The part you appear to be missing is that you do not have to run the PQ every time. Just preview it, an option available after setting it up as well as from the list, to check how many caches are in the query and adjust accordingly.

 

I do admit that the 500 limit is more than satisfactory most of the time so this is not a big issue for me. But there is no doubt that having a higher query limit could and would be useful at times. Groundspeak allows 5 queries, 500 caches each, 2500 total for the day. I have to ask, would it really be more taxing on servers to allow the same number of caches, but in a single query? If not, what is the reason for limiting to 500?

 

The server load keeps being mentioned. Apparently GC does not feel it is significant or, at the very least, not significant enough to outweigh the disadvantages.

Link to comment
