
Pocket Queries: Raising the 500 limit


TheCarterFamily


Just curious as to why you'd want this? I'm not sure about anyone else, but I know I have a hard time getting through 200 caches before they get "stale," much less 500, not to mention 1000.

Edited by PJPeters
Link to comment
Just curious as to why you'd want this? I'm not sure about anyone else, but I know I have a hard time getting through 200 caches before they get "stale," much less 500, not to mention 1000.

 

A few reasons. The main one: we go caching as a family, and my wife can never tell me which way or how far we're going, so I tend to do a circle around our current area. 500 is great if I circle around just north of my home, but the closer I get to Toronto, the more the limit takes over. There are 1000+ caches just in the surrounding areas.

 

Another reason is that I tend to search for caches based on things that pocket queries can't pull (i.e. within 100 metres of a road, placed by a particular user, found by a particular user within the last 7 days, etc.). So what I tend to do is pull 5 queries blanketing an area, then run my custom queries against the combined GPX files.

 

Personally, I'd like to just be able to pull by province, so if we are going on a driving trip I don't have to worry about where we're going; I just load the GPS on the road from the GPX file.

Link to comment

Oh, and not to mention you only get 5 queries a day. If you screw up on one and no rows are returned, you're now down to 4. So usually I don't play around; I just pull a blanket query and go from there.

 

To solve *that* problem, you can set up the PQ but not check any days for it to run.

 

Then, after you've got it set up, preview it to be sure it is doing kinda what you want. If it is working properly, you can go back and check the day(s) that you want it to run.

 

I expect with the faster hardware that they have in place, a brand new PQ setup will run very fast -- no more time delay to check out the query results before it runs...

Link to comment

I have more than 30 PQs in my list. They are centered on different areas around this cache-rich area. I don't "Schedule" any of them. If I am heading out in one direction, I get the 500 caches out that way. If I want 2500 caches in any one day, I run five different PQs to populate the different databases I have in GSAK.

 

I really don't see the need for more than 500 caches in any one PQ . . .

Edited by Miragee
Link to comment

I really don't see the need for more than 500 caches in any one PQ . . .

Therefore nobody who does see the need should be able to enjoy a 1,000-cache PQ?

 

Your method of having 30 PQs around your area is great, but what about people who travel a lot? I just found out today that I'll be driving from Atlanta to Huntsville on Monday evening, staying in Huntsville that night, spending an unknown amount of time on Tuesday in a meeting (at an as-yet-unknown location in the city), and driving home Tuesday night.

 

A PQ for the drive will be easy to set up. No problems there. But Huntsville has WAY more than 500 caches. Even if I limit them to less than 3/3 I still have a PQ of 500 that only covers a small portion of the city.

 

I can ensure the area around the hotel will be covered, but since I don't know where the meeting is, and potentially where I'll be with some free time to cache, I could easily be outside the area covered by my PQ. If I had the ability to download 1,000 per PQ, or even 2,500 caches a day in a single PQ, I'd be more likely to be able to grab 4 or 5 caches.

Link to comment

When you are traveling, doesn't the "Caches Along A Route" feature solve that problem? Seems it would be better to get caches along the highways you think you will be traveling instead of getting a giant circle of 1000 caches . . . :laughing:

Link to comment

When you are traveling, doesn't the "Caches Along A Route" feature solve that problem? Seems it would be better to get caches along the highways you think you will be traveling instead of getting a giant circle of 1000 caches . . . :laughing:

That's true, for the drive. Which is why in my post I said, "A PQ for the drive will be easy to set up. No problems there. " :laughing:

 

However, if I find myself with a few hours to kill but I don't know where in the city I'll be until I get there, then Caches Along A Route doesn't do anything. It's even less helpful than a small circle of 500 caches might be if I'm not inside that circle. :laughing:

Link to comment
When you are traveling, doesn't the "Caches Along A Route" feature solve that problem? Seems it would be better to get caches along the highways you think you will be traveling instead of getting a giant circle of 1000 caches . . . :laughing:

 

On my personal DB, sure, I can do polygons and have the route, and a 20 km circle on every click... but I need over 1000 to get it there. Caches Along A Route on GC.com gets complicated. Since I never know where I'm really going, how do I make a route for it? I can't count how many times I've planned to go caching west and then my wife gets in the car and wants to go east.

Link to comment

Oh, and not to mention you only get 5 queries a day. If you screw up on one and no rows are returned, you're now down to 4. So usually I don't play around; I just pull a blanket query and go from there.

Well that's an error between the chair and keyboard when you don't preview your PQ to make sure it works.

 

No matter who is satisfied by this request, there will always be the need to increase it by xxx because it doesn't get someone past xx miles from home. For instance, a 10-mile radius around my zip code nets me 995 caches.

 

So, take advantage of 3rd party software such as GSAK, and go paperless with a PDA. With the PDA you can carry a mini database and you should be able to export to your GPS with the appropriate software. Then you can use the PQs for updating fresh data to your GSAK database and not worry about the limitations of the PQ.

Edited by TotemLake
Link to comment

Give me 500 and I will ask for 1000; give me 1000 and I will ask for 2000.

 

The one argument I see in the forums all the time is akin to "I would never use it/do it, so no one else would ever use it/do it so why bother?" Does not hold water with me. I don't eat meat so why should any restaurants serve meat? Because I am not the only one out there.

 

If you want to argue why PQs should not be increased to 1000, give a reason that makes a difference.

 

Loch Cache

Link to comment

In the last 30 days, I have traveled to Northwest GA out of my normal "GA Radial" query area, Minneapolis, Seattle, Pittsburgh, Birmingham, Raleigh and Charleston (drove to Charleston and used a route query going there and a radial query in Charleston). Five queries of 500 caches each per day have worked fine for me.

 

By the way, Huntsville wasn't a good example. A 500 PQ from downtown of all caches 3/3 or less takes you out to Decatur 22 miles away. You should have no problems on your trip there with the system as is. Raleigh, with GW5 coming up, is actually a better example. You have to be careful with the center of your PQ. I would center right on GW5 myself and probably still have enough caches returned to have a cache or two to find. Either that, or the halfway point between GW5 and the area you are staying in (like me, near the airport).

 

(I highly recommend that people going to GW5 pull a query early (like this weekend) to have one ready. Test it, check it, then pull a last minute one to catch last minute cache listings if you want it. At least this way you have a good query now that you can use in case of some strange issues late next week.)

Link to comment
If you want to argue why PQs should not be increased to 1000, give a reason that makes a difference.
For right now, the PQ server is overloaded. Just look at the topics regarding folks not getting their PQs. Until the functionality is there to support 1,000 returns on a PQ, why further bog down the system at this time?
Link to comment

Before I went on a trip from California to Colorado, I created several PQs all along highways I thought I would travel. Since I can have up to 40 PQs in my list, this was great. I created different databases in GSAK for the different states I was traveling through. Along the way, I refreshed the data.

 

Maybe in Denver, a huge circle of 1000 caches would have been convenient, but I think the two PQs of 500 caches each, centered on the opposite sides of the city I visited, got all the caches I would have chanced upon. Seems like being able to get five PQs of 500 caches each every day is plenty of data for most people.

 

I have more than 1600 caches in my Default GSAK database, and many other caches in additional GSAK databases. :laughing: If some people want 1000 caches in their PQs, okay . . . :laughing: I'm thankful for all the other changes they are making to the site. For me, this is very low on a "list of things that need changing." :laughing:

Link to comment

By the way, Huntsville wasn't a good example. A 500 PQ from downtown of all caches 3/3 or less takes you out to Decatur 22 miles away.

I did this trip to Huntsville last week, but instead of going to the engineer's office (which I still don't know the location of) we went to the project site, in a town called New Market. This was definitely outside the circle of 500 caches I'd downloaded prior to making the trip, though I might not have kept it to 3/3, I can't remember. But your point about cities like Raleigh is still valid. I've been on several trips where I've found myself outside the circle of 500 when I was ready to cache.

 

The fact that I ended up not having time to cache on last week's trip just makes a sad story worse. There was, however, a Hooter's Bikini Contest going on across the street from the hotel, so it wasn't a completely wasted trip. :laughing::laughing:

 

:laughing:

Link to comment

Give me 500 and I will ask for 1000; give me 1000 and I will ask for 2000.

 

The one argument I see in the forums all the time is akin to "I would never use it/do it, so no one else would ever use it/do it so why bother?" Does not hold water with me. I don't eat meat so why should any restaurants serve meat? Because I am not the only one out there.

 

If you want to argue why PQs should not be increased to 1000, give a reason that makes a difference.

 

Loch Cache

If you want to give a valid argument that it should be increased to 1000 give it. Otherwise you're just posting rhetoric that doesn't make a difference.

 

The fact is cache density is not a valid reason to increase it beyond 500 when so many people have found ways to work within the limitations as provided by Groundspeak.

 

To gain caches for the entire state, I do it by date starting with the earliest and working it out till I max out at or near 500. That takes me 21 PQs out of the 40 I can create, and out of the 35 I can run. I combine those into GSAK and I use GSAK to generate what I need when I need it AND I don't have to depend on Groundspeak to give me more in the PQ, nor do I have to depend on them to send me an immediate PQ for the day.

 

Downloading to my PDA, I can have all the caches I want within my given parameters and export them to the GPS at any given time. I don't have to be hooked up to Groundspeak for that either. Imagine the freedom gained, which you so blithely ignored when the previous posters said it isn't needed.

 

An increase to the PQ isn't needed.

Link to comment

I'm thankful for all the other changes they are making to the site. For me, this is very low on a "list of things that need changing." :laughing:

I agree. There are also more important things I'd rather be fixed/changed. I was just giving my support to the topic since it has been a limiting factor for me in the past.

 

Not so limiting that I couldn't work around it before, or keep working around it in the future. It's not keeping me from having fun at all. :P:):laughing:;):laughing:

Link to comment

An increase to the PQ isn't needed.

It depends on how you define "needed".

 

Geocaching itself isn't needed. But it sure is fun and I hope they don't take it away.

500 in a PQ isn't needed. The workarounds would work for a 100-cache limit too. Would you be okay if the limit were reduced?

PQs themselves aren't needed. I know several people, with lots and lots of finds, that have never used a single PQ. They have as much fun as you or I do.

 

Does something have to be required by everybody before it can be used by anybody?

Link to comment

 

Does something have to be required by everybody before it can be used by anybody?

No, but the gent did call for a valid argument on why it isn't needed, and I gave it and concluded with it. There are plenty of other ways of handling the problem without increasing the cache count of the PQ, and this increase request will only result in requests for more increases. I could easily say I need 2500 to get past my 10-mile radius. I choose not to, instead using the tools available to me to build whatever count I want into any database I create for my own use. As a result, it isn't needed in the manner the OP requested with the reason given.

 

I'll end this with the following statement.....

 

It doesn't have to be needed by everybody, just by the majority. And so far, the majority says bleh.

Edited by TotemLake
Link to comment
If you want to argue why PQs should not be increased to 1000, give a reason that makes a difference.
For right now, the PQ server is overloaded. Just look at the topics regarding folks not getting their PQs. Until the functionality is there to support 1,000 returns on a PQ, why further bog down the system at this time?

 

A few thoughts on this.

1. They just upgraded the hardware and I've seen many reports that the pocket queries are running much faster.

2. GPX files are just text files. So the bulk of the load would be more on querying the data than on building the file.

3. If the DB is doing a full table scan, it takes less server power to run the query once and pull 1,000 caches than to pull two sets of 500. So increasing the limit to 1000 may actually speed up the server, depending on how the data is being queried.
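
A rough sketch of point 3, with purely hypothetical table, column, and function names (Groundspeak's actual schema isn't public):

 -- hypothetical: caches(cache_id, lat, lon, placed_date); dist_km() returns km between two points
 select cache_id
   from (select cache_id,
                row_number() over (order by placed_date) as rn
           from caches
          where dist_km(lat, lon, :center_lat, :center_lon) <= :radius_km) q
  where q.rn <= 1000;

One PQ of 1,000 filters and sorts the table once; two PQs of 500 do that same work twice, and with overlapping circles they also hand back many of the same rows twice.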

Link to comment
Before I went on a trip from California to Colorado, I created several PQs all along highways I thought I would travel. Since I can have up to 40 PQs in my list, this was great. I created different databases in GSAK for the different states I was traveling through. Along the way, I refreshed the data.

 

Maybe in Denver, a huge circle of 1000 caches would have been convenient, but I think the two PQs of 500 caches each, centered on the opposite sides of the city I visited, got all the caches I would have chanced upon. Seems like being able to get five PQs of 500 caches each every day is plenty of data for most people.

 

I have more than 1600 caches in my Default GSAK database, and many other caches in additional GSAK databases. :D If some people want 1000 caches in their PQs, okay . . . <_< I'm thankful for all the other changes they are making to the site. For me, this is very low on a "list of things that need changing." :D

 

Here's another point on load. Think circles. To blanket a given area (and catch all caches), the two circles need to overlap by about 50%. Thus roughly ~50% of the caches are listed in both files. Hence the problem.

 

We are overloading the PQ server by running queries that are at least ~50% redundant, not to mention the waste of bandwidth, e-mail, and database time. If increasing to 1000+ caches is not a good idea, then how about a way to link several PQs together to maximize the data you get back?
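
To put a rough number on that redundancy (a side calculation, not anything from the site): the shared lens-shaped area of two circles of radius r whose centers are a distance d apart is a standard geometry result, and with the centers one radius apart, for example, about 39% of each circle is duplicated -- the same ballpark as the ~50% above.

 A_{\text{overlap}} = 2r^{2}\arccos\!\left(\frac{d}{2r}\right) - \frac{d}{2}\sqrt{4r^{2}-d^{2}},
 \qquad
 \frac{A_{\text{overlap}}}{\pi r^{2}}\bigg|_{d=r} = \frac{2\pi/3 - \sqrt{3}/2}{\pi} \approx 0.39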

Link to comment
Before I went on a trip from California to Colorado, I created several PQs all along highways I thought I would travel. Since I can have up to 40 PQs in my list, this was great. I created different databases in GSAK for the different states I was traveling through. Along the way, I refreshed the data.

 

Maybe in Denver, a huge circle of 1000 caches would have been convenient, but I think the two PQs of 500 caches each, centered on the opposite sides of the city I visited, got all the caches I would have chanced upon. Seems like being able to get five PQs of 500 caches each every day is plenty of data for most people.

 

I have more than 1600 caches in my Default GSAK database, and many other caches in additional GSAK databases. :D If some people want 1000 caches in their PQs, okay . . . <_< I'm thankful for all the other changes they are making to the site. For me, this is very low on a "list of things that need changing." :D

 

Here's another point on load. Think circles. To blanket a given area (and catch all caches), the two circles need to overlap by about 50%. Thus roughly ~50% of the caches are listed in both files. Hence the problem.

 

We are overloading the PQ server by running queries that are at least ~50% redundant, not to mention the waste of bandwidth, e-mail, and database time. If increasing to 1000+ caches is not a good idea, then how about a way to link several PQs together to maximize the data you get back?

Instead of overlapping circles, get the PQs by placed date in one larger circle. Use a program like Geocaching Swiss Army Knife to load them together.

Link to comment

Give me 500 and I will ask for 1000; give me 1000 and I will ask for 2000.

 

The one argument I see in the forums all the time is akin to "I would never use it/do it, so no one else would ever use it/do it so why bother?" Does not hold water with me. I don't eat meat so why should any restaurants serve meat? Because I am not the only one out there.

 

If you want to argue why PQs should not be increased to 1000, give a reason that makes a difference.

 

Loch Cache

If you want to give a valid argument that it should be increased to 1000 give it. Otherwise you're just posting rhetoric that doesn't make a difference.

 

The fact is cache density is not a valid reason to increase it beyond 500 when so many people have found ways to work within the limitations as provided by Groundspeak.

 

To gain caches for the entire state, I do it by date starting with the earliest and working it out till I max out at or near 500. That takes me 21 PQs out of the 40 I can create, and out of the 35 I can run. I combine those into GSAK and I use GSAK to generate what I need when I need it AND I don't have to depend on Groundspeak to give me more in the PQ, nor do I have to depend on them to send me an immediate PQ for the day.

 

Downloading to my PDA, I can have all the caches I want within my given parameters and export them to the GPS at any given time. I don't have to be hooked up to Groundspeak for that either. Imagine the freedom gained, which you so blithely ignored when the previous posters said it isn't needed.

 

An increase to the PQ isn't needed.

 

Sorry for all the replies at once. I program faster than I can read.. <_<

 

I love this... we don't need an increase to 10,500 caches because I can run 20 pocket queries to return the exact same amount. You've obviously never had to tune a web server before. :D

 

Let's take a lesson from Oracle's bulk collect/bind features. In the beginning, to do an update you needed to run things in a row-by-row loop.

 

 for x in (select * from someTable) loop
   -- one context switch between the PL/SQL and SQL engines per row
   update someOtherTable set someColumn = x.someOtherColumn
    where someKeyColumn = x.someKeyColumn; -- placeholder key column, so each row updates only its match
 end loop;

 

So on large data sets this could take a good 20 minutes, because it must bounce back and forth between the PL/SQL and SQL engines for each row.

 

So Oracle introduced bulk binding (the FORALL statement):

 

 forall x in someValue.first..someValue.last
   update someOtherTable set someColumn = someNewValue(x)
    where someKeyColumn = someValue(x); -- one bulk-bound statement instead of a per-row loop

 

This sped things up from 20 minutes to about 3 seconds. Why? Because it runs the update in one shot; it doesn't have to cross between the PL/SQL and SQL engines for every row.

 

So now the math example:

 

Say 1000 users all run 21 queries each to blanket an area. That's 21,000 rounds of opening a SQL connection, pulling data, closing the connection, building the file, zipping the file, and sending it. So let's say 6 steps per query, totalling 126,000 tasks to do it the current way.

 

Now let's increase the limit so those 21 queries collapse into 1 query: 1000 users times 6 steps, so that's 6,000 tasks.

 

Increasing the limit might actually reduce the load by quite a bit. (Yes, I tune SQL and servers as part of my job, and I have seen this sort of thing before.)

Link to comment

Instead of overlapping circles, get the PQs by placed date in one larger circle. Use a program like Geocaching Swiss Army Knife to load them together.

 

The point was that if I run 10 queries I don't get 5000 caches but more like 2500, because of the redundancy. If one could run just 1 query of 5000, then we'd get the maximum out of it. <_<

 

Also, I currently do blanket the entire province of Ontario and use a program I created (like Swiss Army Knife), but with a 750 km limit on the circle I still get overlap. Not to mention that archived caches never get updated in pocket queries, so a Swiss-Army-type system would need to be refreshed frequently.

Edited by TheCarterFamily
Link to comment

Here's another point on load. Think circles. To blanket a given area (and catch all caches), the two circles need to overlap by about 50%. Thus roughly ~50% of the caches are listed in both files. Hence the problem.

 

We are overloading the PQ server by running queries that are at least ~50% redundant, not to mention the waste of bandwidth, e-mail, and database time. If increasing to 1000+ caches is not a good idea, then how about a way to link several PQs together to maximize the data you get back?

The thing is, I only order a PQ when I am heading in one direction or another. I don't need every last log for every cache in this area. I don't average more than two PQs per day for this very cache-rich area. All I need is fresh data before I head out on a caching adventure.

 

And, when I was in Denver, the two 500-cache PQs I got didn't overlap by 50% either. They overlapped, but not by that much. They covered a nice diagonal area from Lakewood to Littleton. A 1000-cache PQ centered in downtown Denver would not have covered the areas I was going to be in, or travel through, as well as the two 500-cache PQs did . . . <_<

Link to comment

Instead of overlapping circles, get the PQs by placed date in one larger circle. Use a program like Geocaching Swiss Army Knife to load them together.

 

The point was that if I run 10 queries I don't get 5000 caches but more like 2500, because of the redundancy. If one could run just 1 query of 5000, then we'd get the maximum out of it. <_<

 

Also, I currently do blanket the entire province of Ontario and use a program I created (like Swiss Army Knife), but with a 750 km limit on the circle I still get overlap. Not to mention that archived caches never get updated in pocket queries, so a Swiss-Army-type system would need to be refreshed frequently.

 

I don't think you understood the quote you quoted.

"get the PQ by placed date"

There is NO OVERLAP if you get multiple PQs based on placed date, thus you can cover more area with fewer queries.
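
A sketch of how the date splitting works, again against a purely hypothetical caches table: number the caches in placed-date order, start a new bucket every 500 rows, and the earliest and latest placed date in each bucket become the from/to dates for one PQ.

 -- hypothetical: caches(cache_id, placed_date, province)
 select pq_bucket,
        min(placed_date) as from_date,
        max(placed_date) as to_date,
        count(*)         as cache_count
   from (select placed_date,
                ceil(row_number() over (order by placed_date) / 500.0) as pq_bucket
           from caches
          where province = 'Ontario') q
  group by pq_bucket
  order by pq_bucket;

Each bucket holds at most 500 caches and the date ranges never overlap, which is why date-split PQs return no duplicates.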

Link to comment

Give me 500 and I will ask for 1000; give me 1000 and I will ask for 2000.

 

The one argument I see in the forums all the time is akin to "I would never use it/do it, so no one else would ever use it/do it so why bother?" Does not hold water with me. I don't eat meat so why should any restaurants serve meat? Because I am not the only one out there.

 

If you want to argue why PQs should not be increased to 1000, give a reason that makes a difference.

 

Loch Cache

If you want to give a valid argument that it should be increased to 1000 give it. Otherwise you're just posting rhetoric that doesn't make a difference.

 

The fact is cache density is not a valid reason to increase it beyond 500 when so many people have found ways to work within the limitations as provided by Groundspeak.

 

To gain caches for the entire state, I do it by date starting with the earliest and working it out till I max out at or near 500. That takes me 21 PQs out of the 40 I can create, and out of the 35 I can run. I combine those into GSAK and I use GSAK to generate what I need when I need it AND I don't have to depend on Groundspeak to give me more in the PQ, nor do I have to depend on them to send me an immediate PQ for the day.

 

Downloading to my PDA, I can have all the caches I want within my given parameters and export them to the GPS at any given time. I don't have to be hooked up to Groundspeak for that either. Imagine the freedom gained, which you so blithely ignored when the previous posters said it isn't needed.

 

An increase to the PQ isn't needed.

 

Sorry for all the replies at once. I program faster than I can read.. <_<

 

I love this... we don't need an increase to 10,500 caches because I can run 20 pocket queries to return the exact same amount. You've obviously never had to tune a web server before. :D

 

Let's take a lesson from Oracle's bulk collect/bind features. In the beginning, to do an update you needed to run things in a row-by-row loop.

 

 for x in (select * from someTable) loop
   -- one context switch between the PL/SQL and SQL engines per row
   update someOtherTable set someColumn = x.someOtherColumn
    where someKeyColumn = x.someKeyColumn; -- placeholder key column, so each row updates only its match
 end loop;

 

So on large data sets this could take a good 20 minutes, because it must bounce back and forth between the PL/SQL and SQL engines for each row.

 

So Oracle introduced bulk binding (the FORALL statement):

 

 forall x in someValue.first..someValue.last
   update someOtherTable set someColumn = someNewValue(x)
    where someKeyColumn = someValue(x); -- one bulk-bound statement instead of a per-row loop

 

This sped things up from 20 minutes to about 3 seconds. Why? Because it runs the update in one shot; it doesn't have to cross between the PL/SQL and SQL engines for every row.

 

So now the math example:

 

Say 1000 users all run 21 queries each to blanket an area. That's 21,000 rounds of opening a SQL connection, pulling data, closing the connection, building the file, zipping the file, and sending it. So let's say 6 steps per query, totalling 126,000 tasks to do it the current way.

 

Now let's increase the limit so those 21 queries collapse into 1 query: 1000 users times 6 steps, so that's 6,000 tasks.

 

Increasing the limit might actually reduce the load by quite a bit. (Yes, I tune SQL and servers as part of my job, and I have seen this sort of thing before.)

If it speeds things up, I'd rather run one 1000-cache query instead of two or more 500s. Circles don't fit together well, so you have to have overlap (waste).
Link to comment
There is NO OVERLAP if you get multiple PQs based on placed date, thus you can cover more area with fewer queries.
Interesting! I've never tried that. You learn something new every day! <_<

 

... And if your GSAK database is up to date you can run this macro to generate and optimize the from and to dates for all your PQs that you need.

Link to comment
If you want to argue why PQs should not be increased to 1000, give a reason that makes a difference.
For right now, the PQ server is overloaded. Just look at the topics regarding folks not getting their PQs. Until the functionality is there to support 1,000 returns on a PQ, why further bog down the system at this time?

 

Just a question, but does a single PQ returning 1000, or 2 PQs returning 500 each, use more system resources on the servers? If the single PQ uses less, then upping the limit to 1000 may make sense.

Link to comment
If you want to argue why PQs should not be increased to 1000, give a reason that makes a difference.
For right now, the PQ server is overloaded. Just look at the topics regarding folks not getting their PQs. Until the functionality is there to support 1,000 returns on a PQ, why further bog down the system at this time?

 

Just a question, but does a single PQ returning 1000, or 2 PQs returning 500 each, use more system resources on the servers? If the single PQ uses less, then upping the limit to 1000 may make sense.

 

Most people don't think of more taking less. Someone who has access to the source code would have to check and see how much processing goes on around the PQs.

Link to comment

Many people have asked for this in the past. The point is that you can receive up to 2,500 caches PER DAY already - and planned correctly (using the date placed feature) that's 2,500 UNIQUE caches.

 

If you REALLY feel the need to have more it is possible. Set up a second account and pay a second premium membership. That way you can get 5,000 caches PER DAY. :blink::D Want more? Buy another for an additional $30 and get 7,500 caches per day. If you want to tax the system heavier, I think it's great that Groundspeak allows for heavy-end Pocket Queries users to participate in this scalable fashion.

 

But more importantly, why not try to be more selective in your pocket queries? Unless you're doing a numbers run in a highly dense area, most likely you're going to be taxing the PQs grabbing caches that you never would even consider "on the fly". Consider the types that you're going after and the style you truly hunt for. Are you really going to be doing 5/5 Puzzle caches with scuba gear required on a cache run? No? Then try limiting some of the criteria.

 

And for those who complained above that the archived caches don't get updated in PQs - that's by design. Archived caches "no longer exist". Geocaching.com and PQs are not designed for everyone to have offline databases. If you use an offline database to store caches before you go on a hunt and haven't found the method provided by GSAK for weeding out the archived caches, I don't believe it should be GC.com that has to make some type of change to accommodate your using the PQs in an unintended way.

Link to comment

Most GPS these days support up to 1000 waypoints.

 

I personally know of none.

 

Over here, Garmin sells the most units, and AFAIK their standard GPSrs only provide memory for 500 waypoints.

 

ime

Link to comment
There is NO OVERLAP if you get multiple PQs based on placed date, thus you can cover more area with fewer queries.
Interesting! I've never tried that. You learn something new every day! :D

 

... And if your GSAK database is up to date you can run this macro to generate and optimize the from and to dates for all your PQs that you need.

This macro doesn't do me a whole lot of good because of the complicated way I split up the areas queried. I've tried using the same criteria with a GSAK filter as the PQ area, but for some reason I come up with differing results. So, I have to manually split up the area by date. It is a pain.

 

I don't know if the return would be worth it, but Groundspeak could create a little button that counted caches from a particular start date and returned an end date. This would eliminate the trial and error scheme which, in turn, would reduce the hits to the server. A little AJAX, anyone?
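
A back-of-the-envelope version of that button, again with hypothetical table and column names, would simply return the placed date of the 500th cache on or after the chosen start date:

 -- hypothetical: caches(cache_id, placed_date, state)
 select max(placed_date) as suggested_end_date
   from (select placed_date
           from caches
          where state = 'WA'
            and placed_date >= :start_date
          order by placed_date) q
  where rownum <= 500;

That's one cheap query per click, instead of a cacher running PQ previews over and over to find the cutoff by trial and error.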

Link to comment
There is NO OVERLAP if you get multiple PQs based on placed date, thus you can cover more area with fewer queries.
Interesting! I've never tried that. You learn something new every day! :D

Amazing. I say it in post 19, and it takes Starbrand's post 25 for it to kick in.

Link to comment
There is NO OVERLAP if you get multiple PQs based on placed date, thus you can cover more area with fewer queries.
Interesting! I've never tried that. You learn something new every day! :D

Amazing. I say it in post 19, and it takes Starbrand's post 25 for it to kick in.

Sorry about that. I only read the last few posts before I posted. I'm not sure how you can determine the correct date to get equal-sized circles. I guessed and got fairly close. I figured there were about the same number of caches placed in the last 2 years as in the first 5 years.

Edited by TrailGators
Link to comment
There is NO OVERLAP if you get multiple PQs based on placed date, thus you can cover more area with fewer queries.
Interesting! I've never tried that. You learn something new every day! :D

Amazing. I say it in post 19, and it takes Starbrand's post 25 for it to kick in.

Sorry about that. I only read the last few posts before I posted. I'm not sure how you can determine the correct date to get equal-sized circles. I guessed and got fairly close. I figured there have been about the same number of caches placed in the last 2 years as in the first 5 years.

Currently, that's what it takes. CR's suggestion is a good one:

 

Groundspeak could create a little button that counted caches from a particular start date and returned an end date. This would eliminate the trial and error scheme which, in turn, would reduce the hits to the server. A little AJAX, anyone?

 

About every 6 months, I go back and look at these circles. Some of them condense as caches become archived, so a minor adjustment on each date circle will maximize the circle and can reduce the total needed by 1 or 2 circles. I get the entire state of Washington this way, with overlaps into Idaho, Oregon, and a piece of Canada, in 21 PQs. That leaves 19 PQs free to do whatever I want.

Link to comment

I have the Garmin GPSmap60CS and it holds 1000 waypoints. According to Garmin's own specs the new GPSmap60CSX also holds 1000 waypoints.

 

My GPS, a Magellan, can hold INFINITELY many caches. Sure, only 500 at a time, but a 2 GB SD card can hold some hundred thousand separate files of 500 each. Shouldn't there be no restriction on query number?

Just kidding; the hamsters have it hard enough already processing all the queries currently being run.

Link to comment

Most GPS these days support up to 1000 waypoints.

 

I personally know of none.

 

Over here, Garmin sells the most units, and AFAIK their standard GPSrs only provide memory for 500 waypoints.

 

ime

My Lowrance holds 1000 waypoints, and with the SD card, more than I can ever search!

Link to comment

Bottom line is that no matter HOW many waypoints your GPS has, Groundspeak has already made it possible for you to load up to 2,500 per day into your GPS. Purchase an additional premium account and you can load 5,000 waypoints.

 

Right?

Link to comment

I have a question for one of you computer-savvy types -- at what point does the size of the PQ become a problem for "most" email servers?

 

It seems to me that at some point, the download would either slow down so much that you might as well run two smaller ones, or it would be rejected by the recipient's email program.

 

I'm content with the PQ system the way it is, by the way. I'm just curious.

Link to comment
This topic is now closed to further replies.