
500 Cache Limit on PQ's


Geofool


I'd like to see this too. I'm surprised no one else has chimed in.

 

Not that I really need 1000 caches, but if I'm going somewhere, I want to get all the caches and pick out the few that suit me. In many places there are more than 500 caches, so you have to restrict the PQ and/or set up more than one.

 

Jamie

 

[edit] Guess I spoke too late.

Edited by Jamie Z
Link to comment

Is there a way to split the result up into two separate files for the user to join? (Even if it counts as two queries for the day?)

 

I'd rather have that than run two queries that overlap and result in less than 1000 when the duplicates are thrown out, not to mention the area would look more like an "8" than a circle.
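
Just to sketch what I mean by "joining" -- a rough, untested example in Python, nothing official; the file names are made up, and it assumes each waypoint in the PQ GPX carries the GC code in its <name> element:

import xml.etree.ElementTree as ET

GPX_NS = "http://www.topografix.com/GPX/1/0"   # namespace used by Pocket Query GPX files
ET.register_namespace("", GPX_NS)

def merge_pocket_queries(paths, out_path):
    """Copy waypoints from several GPX files into one, keeping each GC code only once."""
    merged_tree = ET.parse(paths[0])            # use the first file as the skeleton
    merged_root = merged_tree.getroot()
    seen = {w.findtext(f"{{{GPX_NS}}}name") for w in merged_root.findall(f"{{{GPX_NS}}}wpt")}

    for path in paths[1:]:
        for wpt in ET.parse(path).getroot().findall(f"{{{GPX_NS}}}wpt"):
            code = wpt.findtext(f"{{{GPX_NS}}}name")
            if code not in seen:                # skip caches the other query already returned
                seen.add(code)
                merged_root.append(wpt)

    merged_tree.write(out_path, xml_declaration=True, encoding="UTF-8")

merge_pocket_queries(["pq_west.gpx", "pq_east.gpx"], "merged.gpx")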

Link to comment

If you can't do 1000 it would be nice to sequence one search to pick up where another left off. Then you could string your daily 5 PQ's together and have the same result.

 

Or you could do a 2500-result limit per day, which would cap the total at what you are already allowed to have with five 500-result searches. The only remaining constraint would be your ISP's mail limitations.

Link to comment
Is there a way to split the result up into two separate files for the user to join? (Even if it counts as two queries for the day?)

 

I'd rather have that than run two queries that overlap and result in less than 1000 when the duplicates are thrown out, not to mention the area would look more like an "8" than a circle.

Sure you can. Build one PQ of traditional caches and one starting at the same point for all other kinds of caches.

Link to comment
If you can't do 1000 it would be nice to sequence one search to pick up where another left off.  Then you could string your daily 5 PQ's together and have the same result.

DING! DING! DING!

 

Now there's a good idea. All you would need to do is add one field to the query: a starting record number. Default the value to one, but allow users to enter an alternate number. That way I could get the 500 closest in one query, and then in another query, get the 501st through the 1000th closest.
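
Just to make the idea concrete, here's a toy illustration of what that one extra field would do -- not how the site works today, only the behavior being proposed, with the list of caches assumed to be pre-sorted by distance:

# Toy illustration of the proposed "starting record number" field.
# `all_caches` is a stand-in for a distance-sorted result set.
all_caches = [f"GC{i:04d}" for i in range(1, 1501)]

def page_of_results(caches, start_record=1, page_size=500):
    """Return records start_record .. start_record + page_size - 1 (1-based)."""
    return caches[start_record - 1 : start_record - 1 + page_size]

first_pq  = page_of_results(all_caches, start_record=1)    # the 500 closest
second_pq = page_of_results(all_caches, start_record=501)  # the 501st through 1000th closest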

 

Jeremy, this seems like a pretty reasonable solution to the problem. What do you think?

 

--Marky

Link to comment

Yup. Although I don't have a GPS that will hold 1000, I've thought the option should be there for people that do. Some way to limit PQs by total file size for the day rather than by number of PQs would be cool. Or just have a PQ that is over 500 count as two runs for that day.

Link to comment

Well, there are other reasons too.

 

The initial 500 limit was created because most GPS units only allowed 500 waypoints at the time the feature was created.

 

The 500 limit was hard-coded, so changing it would involve additional programming time that currently needs to be spent on moving the rest of the site to the new codebase.

 

The 500 limit is tangible and easy to understand, while a 2500-cache total listing limit would create problems with people getting incomplete results on their request once their daily total hits the limit.

Link to comment
The initial 500 limit was created because most GPS units only allowed 500 waypoints at the time the feature was created.

Yes, and many of the newer ones will now hold more than 500. And many programs have since come along that allow users to work with the data in ways they see fit. I don't think these programs have limits of 500.

 

The 500 limit was hard-coded, so changing it would involve additional programming time that currently needs to be spent on moving the rest of the site to the new codebase.

So this is code that was just rewritten, and a poor programming practice was repeated, knowing full well that the 500 limit has been an issue for some time.

 

The 500 limit is tangible and easy to understand, while a 2500-cache total listing limit would create problems with people getting incomplete results on their request once their daily total hits the limit.

I am not even sure what that means. 500 is no more tangible than 100 or 2500 or 10000. And there are no more problems with getting incomplete results on requests that hit a limit, whatever that limit might be.

Edited by GrizzlyJohn
Link to comment

Having just completed a week-long road trip that included some caching in at least 8 different states, I too would like to see some way of occasionally running a query of more than 500 caches.

 

Is there any way to let us download complete states occasionally? Maybe allow a premium member to download each complete state once a year, or give us 10 or 15 complete-state downloads during the year.

 

For myself, I think it could result in fewer queries in the end as I would just need to update with new or changed caches since the last complete download if returning to an area.

 

I know that in preparing for my trip, over a couple of weeks of intense queries, I had to do many of them twice if I made an error in my initial request while trying to piece together state data.

 

I had quite a map full of caches in the end, but still had some holes in the map for areas I somehow missed.

 

Being able to download all 8 states would have definitely made life easier and the coverage complete. And in the future I'd only need to download much smaller queries for new and changed caches.

Link to comment
The 500 limit is tangible and easy to understand, while a 2500-cache total listing limit would create problems with people getting incomplete results on their request once their daily total hits the limit.

I am not even sure what that means. 500 is no more tangible than 100 or 2500 or 10000. And there are no more problems with getting incomplete results on requests that hit a limit, whatever that limit might be.

 

What I think he means here is that the number of caches remaining in the query budget would be non-obvious. I think he's trying to avoid the inevitable deluge of questions asking, "How come on my last query I only got 138 caches when I asked for 500?", when the answer is that in their previous 3 queries that day they had accumulated 2362 results.
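
To put numbers on that confusion, here's a tiny mock-up of a shared daily budget; the 2500 figure and the request sizes are just the example above, not anything the site actually does:

DAILY_BUDGET = 2500   # hypothetical daily total, as in the example above

def run_queries(requested_sizes, budget=DAILY_BUDGET):
    """Each query gets min(requested, remaining budget); the last one silently comes up short."""
    remaining = budget
    delivered = []
    for want in requested_sizes:
        got = min(want, remaining)
        delivered.append(got)
        remaining -= got
    return delivered

# Three earlier queries totalling 2362 leave only 138 for the fourth request of 500.
print(run_queries([800, 800, 762, 500]))   # -> [800, 800, 762, 138]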

Link to comment
Is there a way to split the result up into two separate files for the user to join? (Even if it counts as two queries for the day?)

 

I'd rather have that than run two queries that overlap and result in less than 1000 when the duplicates are thrown out, not to mention the area would look more like an "8" than a circle.

Sure you can. Build one PQ of traditional caches and one starting at the same point for all other kinds of caches.

This is a solution, up to a point. The problem is that most caches are traditional (I'm guessing around 80%). This will max out the 500 limit in a cache-dense area and not reach out as far as I would prefer.

 

I only have one query on a Friday. I would prefer to manipulate one file and not have to be bothered with two. I thought computers were supposed to make things easier. I think this is a reasonable request for premium members.

 

As for the concern about files being too large: some people have high-speed access, and no one would be required to request the full limit. The query filters already allow that flexibility.

 

I have two GPSr's, and they both hold up to 1000 waypoints. I live near three major cities that are cache dense. I'd like to have one query and not have to create another query at the last minute (and wait for it in my mailbox) because I decided to visit one of these other areas.

 

Again, this seems like a reasonable request.

Link to comment
This topic is now closed to further replies.