
500 Waypoint Pocket Query Limit


ThePropers


With so many of the newer GPSrs being able to hold 1000 or more waypoints, I was thinking (quit rolling your eyes!).

 

I understand upping this limit to 1000 would cause headaches on the server, but would it be possible to allow ONE pocket query each day return 1000? Or barring that, one each week?

 

I assume most of us run what I would call a "primary" PQ for around our home coordinates, then run several "secondary" PQs for outlying areas or vacation spots we'll be traveling to that the 500 waypoint limit is more than acceptable. If I could set up my primary one that I use 95% of the time to return 1000, that would be a huge improvement for me, and might even end up saving some overhead since people wouldn't need to run several PQs to return the same area that it would cover.

 

Any thoughts on that?

Link to comment

My Magellan with a 1GB memory card can hold the entire eastern US in detail maps. It would also hold every active geocache in the world if that were possible. So why not make PQ size unlimited?

 

In reality there has to be a limit, and I think 500 is a good number. If you want more than 500, then make two PQs.

 

Here is how I set up my local PQs:

 

PQ1: All caches placed prior to 2002

PQ2: All caches placed in 2003

PQ3: All caches placed in 2004

PQ4: All caches placed in 2005

PQ5: All caches placed in 2006

 

That is a total of 2165 caches within 100 miles of my house, since 2002 & 2003 don't quite return 500 each. Now that I have these I never need to run PQ1-4 except to eliminate the archived caches. If I needed to, I could then split 2006 into 2 PQs, but I haven't found the need to do that.
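The date-bucket approach above can be sketched in code. This is only an illustration with made-up placement dates; the `split_into_buckets` helper and the 500-per-PQ cap are my assumptions. It greedily groups a sorted list of placement dates into consecutive ranges that each stay under the cap:

```python
from datetime import date, timedelta
import random

def split_into_buckets(placed_dates, limit=500):
    """Greedily group a sorted list of cache placement dates into
    consecutive date ranges that each hold at most `limit` caches.
    Assumes no single day has more than `limit` placements."""
    dates = sorted(placed_dates)
    buckets = []  # list of (start_date, end_date, count)
    start = 0
    while start < len(dates):
        end = min(start + limit, len(dates))
        # don't split one day across two buckets: back off until the
        # boundary falls between two different days
        while end < len(dates) and dates[end - 1] == dates[end]:
            end -= 1
        buckets.append((dates[start], dates[end - 1], end - start))
        start = end
    return buckets

# fake data: 2,165 caches placed over roughly six and a half years
random.seed(1)
epoch = date(2000, 1, 1)
placed = [epoch + timedelta(days=random.randint(0, 2400)) for _ in range(2165)]
for lo, hi, n in split_into_buckets(placed):
    print(lo, "->", hi, n)
```

Each printed range is one PQ's "placed between" filter; re-running the helper as the area fills in tells you when a bucket needs splitting.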

 

Really, there is no need to have a PQ download more than 500.

 

I find hunting the caches from PQ1 to be much more rewarding, and it isn't just because of their age. They just don't hide them like they used to!

Edited by AB4N
Link to comment

My Magellan with a 1GB memory card can hold the entire eastern US in detail maps. It would also hold every active geocache in the world if that were possible. So why not make PQ size unlimited?

 

In reality there has to be a limit, and I think 500 is a good number. If you want more than 500, then make two PQs.

 

Let's say you want the 1000 waypoints closest to your house. You run a query and get 500, and they're in a circle closest to your home. How would you then expand that circle to include a wider sweep of the next 500 out, without including the original 500? You can't. Running 2 PQs won't cut it. Having one run of 1000, either per day or per week, would eliminate needing to use 3 or 4 queries to get the same info in a circular pattern. I'd love to have the option of running one 1000-point query instead of two 500-point queries. ThePropers' idea is a good one.

Link to comment

Expand the radius, but split it in half by the date placed - find a midpoint, e.g. caches placed on or before 1/1/2005 and caches placed on or after 1/2/2005.

 

For example, I can get a radius of caches 51 miles surrounding my house with 5 queries.

 

When I set up the current set of queries I use, this is the distribution I got.

 

JAN 01 2000 - APR 15 2004 = 497

APR 16 2004 - MAR 24 2005 = 494

MAR 25 2005 - OCT 28 2005 = 487

OCT 29 2005 - APR 27 2006 = 493

APR 28 2006 - DEC 31 2009 = 396

 

The last time I ran it, there were 2,389 caches, but if I were relying on one pocket query with a radius of my home, I wouldn't get any further than 13 miles from home.

 

So it CAN be done with multiple queries, and since you're probably not going to find more than 2,500 caches per day, I would think with the current system you could get just about EVERYTHING you're looking for within the guidelines established by the PQ generator.

Link to comment

Thanks for the heads up on a good system. You are correct that I wouldn't need more than 2500 caches... I actually only need 1000, so I should be able to do that with 2 queries. I would still rather have just one query (so they'd arrive all together and I wouldn't have to merge files and all that), but it'll work.
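For what it's worth, the "merge files" step is mostly mechanical; the only subtlety is dropping caches that appear in more than one PQ. A rough sketch, assuming the standard topografix GPX 1.0 namespace that PQ files use and deduplicating on the `<name>` element (the GC code); the toy `pq1`/`pq2` documents are invented, and real PQ files carry Groundspeak extensions this ignores:

```python
import xml.etree.ElementTree as ET

NS = "http://www.topografix.com/GPX/1/0"

def merge_gpx(gpx_texts):
    """Merge waypoints from several GPX documents, de-duplicating on
    the <name> child (the GC code), which is how the same cache shows
    up in two overlapping PQs."""
    ET.register_namespace("", NS)
    merged = ET.Element(f"{{{NS}}}gpx", version="1.0")
    seen = set()
    for text in gpx_texts:
        root = ET.fromstring(text)
        for wpt in root.findall(f"{{{NS}}}wpt"):
            code = wpt.findtext(f"{{{NS}}}name")
            if code not in seen:
                seen.add(code)
                merged.append(wpt)
    return merged

# two toy PQ files that share one cache (GC1002)
pq1 = (f'<gpx xmlns="{NS}"><wpt lat="42.0" lon="-83.0"><name>GC1001</name></wpt>'
       f'<wpt lat="42.1" lon="-83.1"><name>GC1002</name></wpt></gpx>')
pq2 = (f'<gpx xmlns="{NS}"><wpt lat="42.1" lon="-83.1"><name>GC1002</name></wpt>'
       f'<wpt lat="42.2" lon="-83.2"><name>GC1003</name></wpt></gpx>')

merged = merge_gpx([pq1, pq2])
print(len(merged))  # 3 unique caches
```

Tools like GSAK do exactly this kind of merge for you; the sketch just shows why overlapping PQs aren't a data problem, only a convenience one.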

 

I still wonder, though: would it be less overhead (server-wise) to run fewer queries that return more results each (i.e. with the higher limit), or more queries that return fewer results per query? Serious question.

Link to comment

I have over 800 finds and there's a big donut-shaped hole around my home coordinates. I run 4 PQ's twice a week to get the closest caches to me in all directions. If I could merely get the closest 1000, I would only need to run that 1 query, so I concur with the consensus. I believe it had been proposed before - rather than 5 queries at 500, simply permit up to 2,500 results per day however you would want to receive them.

 

I do have date-based queries set up for Michigan for when we travel cache - takes 12 or 13 queries to get the entire state. With a higher number of returns per query, I could do it in many fewer.

 

And lastly, for caches along a route - I like to get caches within 5 miles of the highway - sometimes the query isn't perfect and if I stop for a meal, I want to be able to get off the road and find some caches. Traveling from Detroit to Kansas City requires entirely too many queries because of the 500 limit. I put more stress on the server building and testing these queries than I would if I could create 1 route and get all results in 1 PQ.

 

Here's an idea - I can run a 500 result query 7 times a week, right? And in the summer, I run my main ones every single day. Why not permit up to, say, 20 PQs to be installed, but each one can only return 3,000 results weekly. If I run a 500 one daily, that's 3,500 results. But if it returns 1,000 - I could only run it 3x a week. If it returned 2,000, only once. Seems good to me...

 

Cheers,

PittCaleb

Link to comment

There may be another reason for the 500 cache limit, equipment limitations.

 

Depending on whatever is happening between the time the database spits out the data and it is sent out, it could be faster to work in two 500 cache chunks than one 1000 cache chunk. The more data you have to shuffle at a time the more memory you have to have and the slower some functions work.

 

This doesn't preclude automatically splitting a single 1000 cache chunk into smaller chunks and sending those out, but that's a bit of programming and not exactly user intuitive--as in "why can't we have larger files"--though it may likely be completely transparent to a good chunk of users.

Link to comment

There may be another reason for the 500 cache limit, equipment limitations.

 

Depending on whatever is happening between the time the database spits out the data and it is sent out, it could be faster to work in two 500 cache chunks than one 1000 cache chunk. The more data you have to shuffle at a time the more memory you have to have and the slower some functions work.

 

This doesn't preclude automatically splitting a single 1000 cache chunk into smaller chunks and sending those out, but that's a bit of programming and not exactly user intuitive--as in "why can't we have larger files"--though it may likely be completely transparent to a good chunk of users.

 

If true, that would explain why ALL FINDS is only allowed once every 7 days :P

Link to comment

Boy, with people pulling pocket queries for 2,500 caches every day, no wonder the servers are taxed to the max. To me, it seems like some of you are just pulling way too much data -- and you want to double that?

 


 

If you had 500 finds per day, maybe I could understand it. If you had 500 finds per week, maybe I could understand it. As is, I don't understand it. If you miss a log or two per day, or even a cache or two per day, is your world going to end? Has your whole trip been ruined because a cache was listed yesterday and you missed it in your PQ? I've seen that on trips. There were still plenty of other caches in my PQ to find.

Link to comment

I used to use 'date placed' to split up my PQs, but I've recently changed it. I now request them by cache size. I have one PQ that returns nothing but micros and another that gives me everything else. They each run once per week.

 

This way, I can dump all of them into my Quest and go after whatever I want if I'm caching alone. If I'm caching with my wife, we will only look for non-micros. If I just feel like going numbers nutty, I can just charge after the micros. Since they reside in three different files in Plucker (micros, non-micros, everything), I need not be bothered by the caches that I don't want to look for on any particular day.

Edited by sbell111
Link to comment

I am not really sure why 500 caches is not enough. I don't see many people that do 500 caches in a week at all. If they do they have usually gone travelling. I have a hard time going through the 300 that I pull every week for my home area. If you are looking for more variety do more specific searches for cache types. Just a thought.

Link to comment

I am blown away by the way people respond to feature requests. Inevitably, someone requests a feature, and within a couple of posts someone is explaining why we should not have it. I have never witnessed such a love for the status quo.

 

Take this thread, for instance: why in the world would you argue against it? Having an upper limit of 1000 would not in any way negatively affect your caching experience. If it would help others, then it seems like a valid request.

 

Also, it drives me nuts how quickly individuals speak on behalf of the developers at Groundspeak. Talk about backseat drivers - how can you all speak about server loads, or development agendas, or the like?

 

If there is a valid disadvantage then that needs to be discussed, but let the experts talk about server-side issues. They are the ones who know what is possible.

Edited by SG-MIN
Link to comment

I am not really sure why 500 caches is not enough. I don't see many people that do 500 caches in a week at all. If they do they have usually gone travelling. I have a hard time going through the 300 that I pull every week for my home area. If you are looking for more variety do more specific searches for cache types. Just a thought.

Because pulling the 500 closest caches may not really be that big of a circle.

 

I am quite often in different parts of town and if I only have the 500 closest, I may very well be outside that circle. I like to keep my GPS and PDA loaded up (my gps holds 1000), for the times that I have extra time before I have to get home or between appointments.

 

I guess the argument could be made for even more in cache dense areas.

 

500 caches just doesn't cut a big enough circle for me; 1000 does. So I would like to see it bumped to 1000, but I kinda doubt that's going to happen. Until then, I'm running twice as many PQ's to keep the data current.

 

If (as some are arguing) 1000 puts too much of a load on the PQ generator, then running two or three PQ's with 500 results is going to put MORE of a load.

 

I'm guessing the 500 limit was decided quite a while back before some areas were so cache dense, and in that time with growth and stuff GC has probably done some hardware upgrades, so maybe this should be evaluated again.

 

On top of that, I see that you've found around 2800 caches. That means a PQ that returns 500 results would probably give you a much, much larger circle than someone who only has a couple hundred finds. So what may not be a problem for you could be a big problem for your new caching buddy next door. (I'm assuming most PQ's have "not found by me" checked; mine do.)

Link to comment

I am blown away by the way people respond to feature requests. Inevitably, someone requests a feature, and within a couple of posts someone is explaining why we should not have it. I have never witnessed such a love for the status quo.

 

Yeah, it's really bizarre. I have commented on this in the past. Browse through the feature requests on this forum and you will find multiple people explaining why that shouldn't be done in basically every thread.

 

Load issues: Anyone who knows anything about database tech can tell you that size doesn't really matter. Individual queries do. 5 queries will almost always be more DB load than 1 query, even if the 1 query is 5 times the number of results.

<3000 results is nothing to a database.
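Nobody outside Groundspeak knows the real schema, so take this as a rough illustration only: a toy in-memory SQLite table showing why five paged 500-row queries tend to cost more than one 2,500-row query. Each paged `ORDER BY` over an unindexed column repeats the sort that the single query does once; the table name and paging scheme here are invented for the demo.

```python
import sqlite3
import time

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE caches (id INTEGER PRIMARY KEY, lat REAL)")
con.executemany("INSERT INTO caches (lat) VALUES (?)",
                [(40.0 + i * 1e-4,) for i in range(50_000)])

def one_big(limit=2500):
    # one query, one sort over the table
    return con.execute(
        "SELECT id FROM caches ORDER BY lat LIMIT ?", (limit,)).fetchall()

def five_small(limit=500):
    # five paged queries; each ORDER BY re-sorts the unindexed column
    rows = []
    for page in range(5):
        rows += con.execute(
            "SELECT id FROM caches ORDER BY lat LIMIT ? OFFSET ?",
            (limit, page * limit)).fetchall()
    return rows

t0 = time.perf_counter(); big = one_big(); t1 = time.perf_counter()
small = five_small(); t2 = time.perf_counter()
print(f"1x2500: {t1 - t0:.4f}s  5x500: {t2 - t1:.4f}s  same rows: {big == small}")
```

An index on `lat` would narrow the gap considerably, which is exactly why the formatting/mail pipeline, not the database read, may be the real constraint others in the thread are pointing at.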

Edited by benh57
Link to comment

I am blown away by the way people respond to feature requests. Inevitably, someone requests a feature, and within a couple of posts someone is explaining why we should not have it. I have never witnessed such a love for the status quo.

 

Yeah, it's really bizarre. I have commented on this in the past. Browse through the feature requests on this forum and you will find multiple people explaining why that shouldn't be done in basically every thread.

 

Browse through Jeremy's responses to this request.

 

Jeremy himself has said numerous times that he won't be increasing the PQ limit. He has also indicated that he didn't make PQs so that people can have an offline immediately-out-of-date database of caches. He made it so that you can pull cache information for caches you want to take with you.

 

So, if you want to get a ridiculous number of caches (which will be out of date by the time you get them loaded), build 5x500 (2,500) caches per day if you like, build a separate set of queries for the next day and get ANOTHER 2,500 - OR buy a second premium account and get 5,000 caches per day if you want. GSAK can handle it without any problems.

 

But I don't think that Jeremy is going to be increasing the limits any time soon.

Link to comment

I am blown away by the way people respond to feature requests. Inevitably, someone requests a feature, and within a couple of posts someone is explaining why we should not have it. I have never witnessed such a love for the status quo.

 

Yeah, it's really bizarre. I have commented on this in the past. Browse through the feature requests on this forum and you will find multiple people explaining why that shouldn't be done in basically every thread.

 

Browse through Jeremy's responses to this request.

 

Jeremy himself has said numerous times that he won't be increasing the PQ limit. He has also indicated that he didn't make PQs so that people can have an offline immediately-out-of-date database of caches. He made it so that you can pull cache information for caches you want to take with you.

 

So, if you want to get a ridiculous number of caches (which will be out of date by the time you get them loaded), build 5x500 (2,500) caches per day if you like, build a separate set of queries for the next day and get ANOTHER 2,500 - OR buy a second premium account and get 5,000 caches per day if you want. GSAK can handle it without any problems.

 

But I don't think that Jeremy is going to be increasing the limits any time soon.

 

The merits of this particular request aren't relevant to what you quoted.

 

This is about every single request - even reasonable ones - being shot down every time. Note that most people replying didn't justify their negativity with the correct answer (Jeremy's point of view), but rather with invalid reasons involving DB load.

Link to comment
...involving DB load.

 

I think you were the only one who mentioned DB load. I and others were referring to the server. The results from the DB have to be converted to a useful format. Additionally, 500 caches from one table can turn into 2,500 logs for those caches and no telling how many hitchhikers and child waypoints.

 

Then again, we don't know the table structure of the PQ server's DB, so this is all speculation.

 

Also, referring back to the OP, this is only a convenience request. You can still get the same end result with a little bit of work.

Link to comment

Pardon me? The main part of my post dealt with how you use the data in getting finds.

 

Frankly, my point is illustrated very well by the number of finds SG-MIN and benh57 have. Neither of you has over 200 finds. Yet you talk down to me because I can't understand why you need to pull 5,000 caches each day? The main part of my post did not talk about server load. I'm talking about the reality of being able to *actually use the data*. At least the OP is something of a power cacher at almost 900 finds. Still, getting 2,500 caches emailed to you a day seems like enough to me as it relates to *actually being able to find those caches*.

 

I'll say it again.

 

If you had 500 finds per day, maybe I could understand it. If you had 500 finds per week, maybe I could understand it. As is, I don't understand it.

 

How about some facts for you. Even a team of mega power cachers trying their hardest to find as many caches as they could in 24 hours that broke into two groups at times and signed the outside of the containers instead of the logbooks didn't even go over 400 finds and DNF's combined in 24 hours. The facts speak for themselves.

 

Edited to add... let's do the math. 2,500 caches divided by 24 hours is 104.16 caches an hour. Divide that by 60 minutes and that is 1.736 caches per minute. Best of luck on achieving that.

Edited by mtn-man
Link to comment

How about some facts for you. Even a team of mega power cachers trying their hardest to find as many caches as they could in 24 hours that broke into two groups at times and signed the outside of the containers instead of the logbooks didn't even go over 400 finds and DNF's combined in 24 hours. The facts speak for themselves.

 

Edited to add... let's do the math. 2,500 caches divided by 24 hours is 104.16 caches an hour. Divide that by 60 minutes and that is 1.736 caches per minute. Best of luck on achieving that.

 

But sometimes it's not about quantity, it's about opportunity. In my case, the 500 closest caches are a smaller circle than my daily travels; the 1000 closest caches fit the bill. My ideal circle currently has about 880 caches, so I could reduce my circle down to one PQ (and that PQ wouldn't even be maxed out), and that leaves me with something that fits on my GPS (1000) and PDA (which seems to choke up once a file gets to around 1000). Now I realize there are areas (especially in California) where even 1000 can still be a pretty small circle, so 1000 may not help everyone, but in a lot of the areas I look at, 1000 seems to be a much better circle than 500.

Link to comment

I do consider myself a heavy user of PQs even though I'm not a power cacher.

 

Having a large number of caches in a PQ offline DOES make it easier to browse and slice and dice with GSAK. Given that the Chicago area (at least what I call the Chicago area) has over 2500 caches, if I want to get that data to mine it does take all 5 PQs of 500 caches each.

 

Data mining is not something that PQs do well. I can't do a single PQ of caches where it's in a rectangle, or within a certain polygon, and through GSAK I can. To accurately get all of the caches within a particular rectangle or polygon in the area, I need to over-pull the data. That's why I gather the 2500 caches in Chicago once per week.
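The over-pull-then-filter step described above is easy to sketch. GSAK does this internally; the snippet below just shows the idea with a standard ray-casting point-in-polygon test. The "Chicago area" polygon and the cache tuples are invented for the illustration:

```python
def in_polygon(lon, lat, poly):
    """Standard ray-casting point-in-polygon test.
    poly is a list of (lon, lat) vertices given in order around the shape."""
    inside = False
    j = len(poly) - 1
    for i in range(len(poly)):
        xi, yi = poly[i]
        xj, yj = poly[j]
        # does the horizontal ray from (lon, lat) cross edge (j, i)?
        if ((yi > lat) != (yj > lat)) and \
           (lon < (xj - xi) * (lat - yi) / (yj - yi) + xi):
            inside = not inside
        j = i
    return inside

# toy "target area" polygon (lon, lat) and an over-pulled cache list
area = [(-88.3, 41.5), (-87.5, 41.5), (-87.5, 42.2), (-88.3, 42.2)]
pulled = [("GC1", -87.9, 41.9),   # inside
          ("GC2", -87.0, 41.9),   # outside (too far east)
          ("GC3", -88.0, 42.0)]   # inside
kept = [c for c in pulled if in_polygon(c[1], c[2], area)]
print([c[0] for c in kept])  # ['GC1', 'GC3']
```

Since PQs only select by radius, you have to pull a circle big enough to contain the polygon and discard the excess locally, which is exactly the over-pulling being described.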

 

If it were possible to get a single GPX file using all of the criteria I could on GSAK or other filters, then I would never come close to the upper limits. But there would have to be some major developments in the PQ criteria, AND some improvements on site response (see the recent down times) for me to forego getting 2500 caches a week.

 

But I just don't think that 2500 caches per day is an unreasonable limitation to the system.

Link to comment

How about some facts for you. Even a team of mega power cachers trying their hardest to find as many caches as they could in 24 hours that broke into two groups at times and signed the outside of the containers instead of the logbooks didn't even go over 400 finds and DNF's combined in 24 hours. The facts speak for themselves.

 

Edited to add... let's do the math. 2,500 caches divided by 24 hours is 104.16 caches an hour. Divide that by 60 minutes and that is 1.736 caches per minute. Best of luck on achieving that.

 

But sometimes it's not about quantity, it's about opportunity. In my case, the 500 closest caches are a smaller circle than my daily travels; the 1000 closest caches fit the bill. My ideal circle currently has about 880 caches, so I could reduce my circle down to one PQ (and that PQ wouldn't even be maxed out), and that leaves me with something that fits on my GPS (1000) and PDA (which seems to choke up once a file gets to around 1000). Now I realize there are areas (especially in California) where even 1000 can still be a pretty small circle, so 1000 may not help everyone, but in a lot of the areas I look at, 1000 seems to be a much better circle than 500.

I live in Atlanta. I understand that totally. I travel across the country. Most of my travels are to the top 10 markets because of my advertising job. I've never had a problem.

Link to comment
Pardon me? The main part of my post dealt with how you use the data in getting finds.

 

Frankly, my point is illustrated very well by the number of finds SG-MIN and benh57 have. Neither of you has over 200 finds. Yet you talk down to me because I can't understand why you need to pull 5,000 caches each day? The main part of my post did not talk about server load. I'm talking about the reality of being able to *actually use the data*. At least the OP is something of a power cacher at almost 900 finds. Still, getting 2,500 caches emailed to you a day seems like enough to me as it relates to *actually being able to find those caches*.

 

I'll say it again.

 

If you had 500 finds per day, maybe I could understand it. If you had 500 finds per week, maybe I could understand it. As is, I don't understand it.

 

How about some facts for you. Even a team of mega power cachers trying their hardest to find as many caches as they could in 24 hours that broke into two groups at times and signed the outside of the containers instead of the logbooks didn't even go over 400 finds and DNF's combined in 24 hours. The facts speak for themselves.

 

Edited to add... let's do the math. 2,500 caches divided by 24 hours is 104.16 caches an hour. Divide that by 60 minutes and that is 1.736 caches per minute. Best of luck on achieving that.

 

And where did I talk down to you? If you read my post, it has nothing to do with pulling PQs or database load or anything like that (and thus is admittedly off topic). I was just stating my disdain for people who argue against a new site feature for no reason. Maybe I was talking about you, but more than likely I was not. I just think it is ridiculous for people to balk at new feature ideas when, in truth, if they were implemented it would have absolutely no effect on them.

 

A side point to that, concerning the database, was the fact that many people, in the course of arguing against feature ideas, often talk as if they have authority over GC.com's servers. Only the developers know the limitations of a particular system, and thus they should be the ones talking about feasibility and the like, rather than John Q. Cacher.

 

And by the way, how does having 140 caches instead of 1527 make me any less qualified to speak to the attitude of cachers in regard to new feature requests?

Link to comment
And by the way, how does having 140 caches instead of 1527 make me any less qualified to speak to the attitude of cachers in regard to new feature requests?

In the same way that you tell us we should allow Groundspeak to speak to these issues, maybe you could allow those who actually find a lot of caches on a daily basis to speak to the need for the feature.

 

In other words, practice what you preach.

Link to comment

And by the way, how does having 140 caches instead of 1527 make me any less qualified to speak to the attitude of cachers in regard to new feature requests?

 

Actually, for this particular request, I think the cachers with lower counts should have a word here, so I agree that you are not only qualified, but MORE qualified than a higher count cacher. The feature being requested DOESN'T HELP cachers with larger counts.

 

Someone with 1500 finds who runs a PQ that limits results to 500 is going to end up with a MUCH MUCH larger circle of caches than someone who only has a couple hundred finds.

 

For cachers with larger find counts... So what if your circle returns caches that are a couple hundred miles out, mine don't, so please don't argue against a feature that will help me just because it won't help you...

Edited by Potato Finder
Link to comment

I never said you should not have a voice in the matter. Here are my points (and my only points):

 

1.) There is no reason to argue against a proposed feature unless that feature would clearly negatively affect your personal caching experience.

 

2.) The most common argument against a feature is often its effect on the website/servers or the difficulty of developing it. I am simply saying that the people who speak to these server/development issues should be the developers, rather than the average cacher who has absolutely no connection with the actual behind-the-scenes development and implementation of the features.

 

Now if someone were advocating a feature such as the ability to "impeach" a cache (i.e. 10 votes against a cache and it is archived), then yes, people could argue against that, because it could have a negative effect on people's personal experience. In that case, experienced cachers would have much more to say.

Link to comment
Also, it drives me nuts how quickly individuals speak on behalf of the developers at Groundspeak. Talk about backseat drivers - how can you all speak about server loads, or development agendas, or the like?
I never said you should not have a voice in the matter.

:blink: I'll get back in the back seat now.

Edited by mtn-man
Link to comment

I would just hope for an atmosphere where we can be encouraging of good ideas. I apologize if I was rude.

 

Mtn-man, I value your opinion on caches because of your extensive find count (and reputation); I would, however, rather hear a developer comment on the feasibility of new features.

Link to comment

But sometimes it's not about quantity, it's about opportunity. In my case, the 500 closest caches are a smaller circle than my daily travels; the 1000 closest caches fit the bill. My ideal circle currently has about 880 caches, so I could reduce my circle down to one PQ (and that PQ wouldn't even be maxed out), and that leaves me with something that fits on my GPS (1000) and PDA (which seems to choke up once a file gets to around 1000). Now I realize there are areas (especially in California) where even 1000 can still be a pretty small circle, so 1000 may not help everyone, but in a lot of the areas I look at, 1000 seems to be a much better circle than 500.

 

This was pretty much my reason for the OP, as 500 caches currently covers a 27 mile radius from my house, but I very frequently travel 40-50 miles away and tend to be more of a "when the opportunity arises" cacher.

 

However, the 'PQ by date hidden' method mentioned earlier fits my needs to get the radius that works for me, and I get the 1000 caches that I wanted (well, I only get about 900, but that leaves room for the child waypoints). Whether me running 3 PQs to get that is better than running 1 PQ that would return the same 1000 is another matter, but as an end user, it works for me (despite the PQs coming in at different times, so I have to wait for them all to arrive).
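Figures like "500 caches covers a 27-mile radius" are just the distance to the 500th-closest cache from home. A quick sketch of that calculation using the haversine formula; the helper names and the toy coordinates are my own, not anything from the site:

```python
import math

def haversine_mi(lat1, lon1, lat2, lon2):
    """Great-circle distance in statute miles (mean Earth radius 3958.8 mi)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    a = (math.sin((p2 - p1) / 2) ** 2
         + math.cos(p1) * math.cos(p2)
         * math.sin(math.radians(lon2 - lon1) / 2) ** 2)
    return 2 * 3958.8 * math.asin(math.sqrt(a))

def pq_radius(home, caches, n=500):
    """Distance to the n-th closest cache: the radius a PQ capped at
    n results actually covers from `home` (a (lat, lon) pair)."""
    dists = sorted(haversine_mi(home[0], home[1], lat, lon)
                   for lat, lon in caches)
    return dists[min(n, len(dists)) - 1]

# toy caches strung due north of home, 0.1 degrees (~6.9 mi) apart
home = (42.0, -83.0)
caches = [(42.0 + 0.1 * i, -83.0) for i in range(1, 21)]
print(round(pq_radius(home, caches, n=10), 1))  # ~69.1 mi to the 10th cache
```

Run against your own GPX data, this also shows why the same 500-cache cap yields a big circle for a high-find-count cacher and a small one for a newcomer: the "not found by me" filter changes which caches count toward n.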

 

Regardless, I am satisfied.

Link to comment

My "new" GPSr can only hold 500 waypoints, so that's how many I load it up with whenever I go out of town for a business trip. Typically I only find between 1 and 10 caches on these trips, and more often than not it's more like 1 or 2. :D

 

However, IF my GPSr held 1,000 waypoints, I'd want to fill it up. Why? Because even though I'd never find 1,000 caches on a trip (or even 100) I never know where I'll be in the city when I find time to cache.

 

Example - I went to Austin recently and had been getting the closest 500 to my hotel for several weeks prior to the trip to make sure I had plenty of logs in my PDA. One morning I knew I was going to have a breakfast meeting but before I got to town I didn't know where it would be. As it turned out, I was well outside the circle of caches in my GPSr when I got to the meeting about an hour early.

 

So there I was, time on my hands for caching, and no close caches in my GPSr. Luckily I had my laptop, which has a wireless card that I can use to dial up anywhere that has cell phone coverage, and while sitting in the parking lot I saw that there was an Off Your Rocker cache only 400 feet from me.

 

If I'd had 1,000 caches loaded I probably would have had that one and several others already loaded and would have had more time to find more than just one.

 

Obviously this 500 waypoint limit in my GPSr isn't something this website can fix, but the story does show a good example of why having more than 500 caches downloaded into your GPSr, even though you won't find them all, is helpful for those whose units can hold 1,000.

 

If mine could have held 1,000, I would have had it loaded up with 1000, even though the website has a 500 limit per PQ. Would doing it the "hard" way have been more of a load on the site than getting them all with one PQ?

Link to comment

Boy, with people pulling pocket queries for 2,500 caches daily every day, no wonder the servers are taxed to the max. To me, it seems like some of you are just pulling way to much data -- and are wanting to double that?

 


 

If you had 500 finds per day, maybe I could understand it.

Others have explained why one might want a large number of caches available offline. I'll add one more reason that I haven't seen thrown out here. Groundspeak's current cache mapping options kind of stink, at least in comparison with what can be done using offline data.

 

Here's how I use my PQs: I have three PQs that run a few times a week for areas near my home, in different directions. These yield the roughly 1000 caches that I always have in my GPSMap 60csx as waypoints. I also have a half dozen or so PQs for more outlying areas that I may travel to in the near future, again in various directions from home. These are combined in an "outlying" GSAK database and exported as (1) custom points of interest that go into the 60csx; (2) a combined GPX file accessed via GPX Sonar on my PDA (for access to cache info in the field, where access to the online database is either unavailable or at best unwieldy); and (3) files for various mapping programs (MapSource; Topo!; Google Earth). Having all of the caches in outlying areas I might visit available in my mapping programs helps me decide where I might like to take my next weekend excursion. I can pan around, zoom in and out, and see what's available - functionality that barely exists on gc.com. (Yes, the Google Maps beta is a nice start, but it just plain doesn't work for this purpose right now: the "add bookmark" function works less than half the time; if you visit the bookmark list you cannot go back to the map without losing your location; it doesn't hide caches you've already found; etc.)

 

So for me, it's not about how many caches I can find in a day - that's irrelevant. It's how many are at my disposal. A cache that is not in my offline database might as well not exist, given the way I cache and the way I plan my caching excursions. I'm not complaining about what we get for $3/month, just explaining why some people might want greater flexibility (and in my case, would be willing to pay for it).

Link to comment

Groundspeak's current cache mapping options kind of stink, at least in comparison with what can be done using offline data.

 

Have you tried the Google Earth interface? Not precise (intentionally), but it does have a really cool mapping interface: pan, zoom, tilt in 3D, etc., etc.

Link to comment

Hmm... I'm pretty much in the same situation as ThePropers was at the beginning of this topic: the GPS holds 1000 waypoints, and I can only get 500 per PQ. Like him, I enjoy caching when the opportunity arises, so not having to pre-plan where I'll be caching comes into play. I don't particularly want to set up four PQs that kinda 'surround' my home coordinates, have GSAK remove the doubles, sort by distance, then trim anything over 1000 results... you get the idea. A lot of work for something that would take a LOT of tweaking to maximize the distance and minimize the duplicates.
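For what it's worth, the merge/dedupe/sort/trim workflow described above can be automated. Below is a rough sketch (not an official GSAK or Groundspeak tool; the function names and home coordinates are illustrative) that combines several PQ GPX files, drops duplicate caches by GC code, sorts by straight-line distance from home, and keeps only the nearest 1000:

```python
# Hypothetical helper for combining overlapping Pocket Queries.
# Not a GSAK or Groundspeak feature - just a sketch of the idea.
import math
import xml.etree.ElementTree as ET

GPX_NS = "{http://www.topografix.com/GPX/1/0}"  # PQ files use GPX 1.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lon points."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def load_waypoints(gpx_path):
    """Return (gc_code, lat, lon) tuples for every <wpt> in a PQ GPX file."""
    root = ET.parse(gpx_path).getroot()
    out = []
    for wpt in root.iter(GPX_NS + "wpt"):
        name = wpt.findtext(GPX_NS + "name")  # the GC code, e.g. "GC1234"
        out.append((name, float(wpt.get("lat")), float(wpt.get("lon"))))
    return out

def merge_and_trim(waypoint_lists, home_lat, home_lon, limit=1000):
    """Dedupe by GC code, sort by distance from home, keep the nearest `limit`."""
    seen, merged = set(), []
    for wpts in waypoint_lists:
        for name, lat, lon in wpts:
            if name not in seen:
                seen.add(name)
                merged.append((name, lat, lon))
    merged.sort(key=lambda w: haversine_km(home_lat, home_lon, w[1], w[2]))
    return merged[:limit]
```

It doesn't remove the tweaking problem - the four queries still have to cover the right area - but it does make the dedupe/sort/trim step a non-issue.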

 

I have, however, thought of an alternate method to the 'date' system listed above. If I'm not mistaken, you can get a PQ to return results for specific types of caches. Hence, I'm thinking I'll set up one PQ for the 500 closest Traditional caches, and another for the 500 closest "everything else" caches. One will quite likely reach farther than the other, but there would be no duplicates, and the end result would be exactly 1000 caches. Hence, no sorting, no removing, no cleaning. The uneven traditional/non-traditional distances would be an acceptable loss for the benefits therein :)

 

So... just thought I'd toss the idea into the fray, IF the above PQ setup is even possible. I'm at work now, so I can't test this theory or compare the distances obtained.

Link to comment

Groundspeak's current cache mapping options kind of stink, at least in comparison with what can be done using offline data.

 

Have you tried the Google Earth interface? Not precise (intentionally), but it does have a really cool mapping interface: pan, zoom, tilt in 3D, etc., etc.

Sure. As I mentioned, I have GSAK export to Google Earth for when that makes sense. Most of the time, it really doesn't. GE is cool, but for this application, not particularly useful. Even with roads turned on, I find it pretty difficult to get my bearings in GE. I'm far from a newbie when it comes to maps, but GE doesn't label locations (cities, etc.) in any predictable, usable manner, unless you're zoomed way in. GE is also slow on my home connection, pretty heavy on PC resources, and useless if I have no internet connection. The Geocaching KML also does not account for caches I've already found. I find MapSource and Topo! to be much more useful in locating places to go caching, and planning the excursion.

Link to comment

There is no duplication with a properly set up, date-divided series of queries. No one has been able to devise a scheme superior to the date method.

True, I could see the 'date' method working well initially. You'd find the approximate halfway point in cache age to get an even split of 500 caches per query. However, that method would require tweaking every month or so: there will always be new caches, but never more caches in the "between 2000 and 2004" bracket, or whatever you choose. Hence, the 'newer' PQ will constantly be growing, while the other one will never change. It's basically the tweaking I was wanting to avoid. I want a sorta "set it and forget it" type of thing, so to speak.

Link to comment

I would just hope for an atmosphere where we can be encouraging of good ideas. I apologize if I was rude.

 

Mtn-main, I value your opinion on caches because of your extensive find count (and reputation); I however would rather hear a developer comment on the feasibility of new features.

You were direct, but not rude (to me, anyhow), so I'll try to reply to your comments. :laughing:

 

I'm definitely no authority on Groundspeak server issues, but there's an "effect" whenever a change is made, for example, servers getting loaded up again after Caches Along a Route was implemented. Not sure others are sticking to the status quo for this reason, but I am.

 

Having said that, I can see the benefits of increasing the PQ limit which would increase the cache radius, or for Caches Along a Route PQ increase the distance. It does seem to allow more freedom and sense of adventure.

 

From personal experience, though, I'm not sure this will enhance my Geocaching, as I've tried Geocaching without PQ, with PQ, and with Caches Along a Route. Even with Caches Along a Route, density in urban areas is intimidating enough that I usually drive right past them until the map becomes sparse enough where I can choose my adventure. I often stop on the side of the road to read the title and the description to see if a cache is interesting enough, so there's still time spent on "preparation."

 

I even carry more than one GPSr, so in theory, I can carry more than 1000 unique waypoints, but so far, I haven't even come close to utilizing all that information. It's really not going to help the "Mega Cachers" either, since they already spend tremendous amount of time in preparation, although I'm sure they'll take what they can get when the PQ limit increases.

 

It also helps that I'm not as "smiley greedy" as I can be, so I don't lament on caches I missed out, but instead, cherish the ones I've found. :laughing:

Link to comment

Yeah, the tweaking is minimal. The older PQs only have to be tweaked once you've found a significant number in each to warrant an adjustment, or as caches are archived.

 

The latest PQ is not tweaked unless earlier PQs are adjusted to allow for an earlier start date of that query, or the number of returned caches approach 500 at which time another query is created. Additionally, only the latest in a series is the one you have to worry about growing beyond the 500 cache limit.

 

Believe me, I've tried all sorts of different angles and "by date" is the best over all method. Like Markwell, I only tweak every few months.

Link to comment

Sorry for dredging up an older topic... but the subject of said topic is exactly what I'm wanting to refer to, and it also relates to my previous response in this topic.

 

Nonetheless... if anyone is still kinda batting around ideas to get more than 500 different waypoints... I've come up with the perfect "set it and forget it" type solution. Well... it won't be 100% set-and-forget, but significantly less tweaking than the date method, where one PQ will consistantly get more caches in it, and the other won't.

 

Yesterday, I decided to play around with PQ's again. I figured I'd attempt to find a way to get 900 PQ's onto my Garmin (the internal-memory max being 1000, since like... 95% of my memory card is full of maps :anibad:). I picked 900 so I'd have room to play around with personal waypoints, as well as leave room for my "specifically winter-friendly" list of about 80 caches. Again, I wanted to avoid the 'date' method, do to it's constantly changing nature.

 

Hence... I decided to try to work with cache size. I attempted to play with cache-type once, but could never get a relatively even balance when using two PQ's, but cache size was a new playing field.

 

I basically kept trying different combinations, different numbers per PQ (totalling 900), and then previewing it, going to the last page, and seeing how far the furthest one from me was. If I could get the two distances to near-match, things would be good.

 

 

And thus... I got a near-perfect balance between the two. Since I decided to go with 900 specifically, it gave me a bit of room to play with the amount per PQ as well, to even things out. I've found that (at least in Manitoba, anyway), if you go with 400 Virtual, Regular, Other, Unknown (or 400 VROU as I called it), and 500 Small, Large, Micro (listed as 500 SLM), they near perfect balance eachother out.

 

The VROU ended up being 291.4 km away for the furthest point it reached. The SLM was 296.1 km for the furthest. Technically, I could have probably tweaked it to 498 and 402 or so... but 500 and 400 are just nice, round numbers :anibad:.

 

BUT... yes... my objective had been reached. Since I'm going by cache sizes, it's going to be an average percentage of each size over the 900 total. Hence, as more caches are added, they should have at least approximately the same percentage of types added over time... thus not needing to tweak it as much. Obviously, you'll have to adjust for the percentage of types wherever anyone else lives, but one should be able to get close to even distance per PQ if tweaked enough.

 

So... just thought I'd share my tale with anyone else looking to want more than 500 geocaches in their PQ, and not want to work with date.

Edited by Kabuthunk
Link to comment
Guest
This topic is now closed to further replies.
×
×
  • Create New...