
Feature request (Increase PQ Results Quantity)


The Outlaw

Recommended Posts

As the sport (hobby? whatever) has grown in popularity over the years, it has become increasingly difficult to maintain a decent, accurate database of my home area. My suggestion: could you explore the possibility of increasing the pocket query size from 500 to 1,000? I for one would be willing to pay an extra $5 or $10 a year for this. Maybe you could create a second level of premium membership for those of us who would like this added feature. 500 was a great number in the beginning, but the proliferation of caches has made it less effective lately, and this is only likely to get worse. It's a good problem to have, but a problem nonetheless. In any event, I hope this is something you all will consider in the future.

 

Thanks,

Wayne Lind

AKA The Outlaw

Austin Tx.

Link to comment

Simple solution if you are willing to pay more - just buy a 2nd premium account.

 

Not a particularly helpful solution. I would have to re-log all my finds in a second account.

Alternatively, you could just ignore them or set up GSAK to weed them out.

Edited by sbell111
Link to comment

Simple solution if you are willing to pay more - just buy a 2nd premium account.

 

Not a particularly helpful solution. I would have to re-log all my finds in a second account.

Alternatively, you could just ignore them or set up GSAK to weed them out.

Ignore what? My finds? A current pocket query yields 500 caches, and I have some 10,000 finds within 100 miles of my house. A second account is not going to be a solution to this.

Link to comment

First of all, I support your request. I too would benefit from more caches in a PQ file, especially since newer GPS units can handle several thousand waypoints/caches.

 

But while waiting for GS to implement your request, we need to find a workaround. In order to come up with some creative ideas, I would like to know what you are storing in your database and what you use it for.

Link to comment

Simple solution if you are willing to pay more - just buy a 2nd premium account.

 

Not a particularly helpful solution. I would have to re-log all my finds in a second account.

Actually, there is a way to do this. Here's how:

 

When logged into the 2nd "query only" account, go to this URL:

 

http://www.geocaching.com/seek/nearest.asp...untName&f=1 (replacing "FirstAccountName" with your main account).

 

If anything is listed there, add it to the ignore list. This takes the longest when you first start, but as you keep it up, it doesn't really take any time.

 

Then, for the PQs on the "query only" account, check "Are not on my ignore list". With all your finds on that account's ignore list, this acts the same as "I have found".

 

Every couple of days, or however often you rework your PQs, go to the URL; if anything is there (your finds since the last time you checked), ignore it.
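
If you want to script the bookkeeping, it boils down to a set difference. A minimal sketch (the GC codes below are made up; in practice you'd paste in the codes from each account's page):

```python
# Finds on the main account vs. the ignore list on the query-only account.
# These sets are hypothetical placeholders for the real GC code lists.
my_finds = {"GC101", "GC202", "GC303"}
ignored = {"GC101", "GC202"}

# Whatever is found but not yet ignored still needs adding to the ignore list.
still_to_ignore = sorted(my_finds - ignored)
print(still_to_ignore)  # ['GC303']
```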

Link to comment

Simple solution if you are willing to pay more - just buy a 2nd premium account.

 

Not a particularly helpful solution. I would have to re-log all my finds in a second account.

 

If you are using GSAK, then as long as your second account has no finds it makes no difference: the GPX file with the found log takes precedence over the GPX file with no found log. If you pay $3 for one month's worth of PQs and try one of them, you will be able to test this out.

 

Having said that, I agree the current limits vs. the current growth do not match up very well.
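
To illustrate the precedence rule above, here is a minimal sketch (not GSAK's actual code; the GC codes are made up): when the same cache arrives from two GPX files, the copy that carries your found log wins.

```python
def merge(records):
    """Merge (gc_code, has_found_log) pairs; a copy with a found log always wins."""
    db = {}
    for code, found in records:
        # Keep the record unless we already hold a found-log copy of this cache.
        if code not in db or (found and not db[code]):
            db[code] = found
    return db

# The main account's GPX carries the found log; the query-only account's does not.
print(merge([("GC1234", False), ("GC1234", True), ("GC5678", False)]))
# -> {'GC1234': True, 'GC5678': False}
```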

Link to comment

First of all, I support your request. I too would benefit from more caches in a PQ file, especially since newer GPS units can handle several thousand waypoints/caches.

 

But while waiting for GS to implement your request, we need to find a workaround. In order to come up with some creative ideas, I would like to know what you are storing in your database and what you use it for.

I keep a running GSAK database of all active, unfound caches within 100 miles of my house. Every Saturday morning, I run 5 pocket queries: difficulty 1 (all terrains), difficulty 1.5 (all terrains), difficulty 2 (all terrains), difficulty 2.5 (all terrains), and finally difficulty 3 and higher (all terrains). For the moment, anyway, this is working for me, but my difficulty 1.5 PQ is always dangerously close to maxing out at 500. Since 1.5 is the most common hide (at least here, anyway), I am constantly fighting to keep it below 500. If the PQ size were increased to 1,000, the PQs would all fit comfortably. Hope this makes sense.

Link to comment

First of all, I support your request. I too would benefit from more caches in a PQ file, especially since newer GPS units can handle several thousand waypoints/caches.

 

But while waiting for GS to implement your request, we need to find a workaround. In order to come up with some creative ideas, I would like to know what you are storing in your database and what you use it for.

I keep a running GSAK database of all active, unfound caches within 100 miles of my house. Every Saturday morning, I run 5 pocket queries: difficulty 1 (all terrains), difficulty 1.5 (all terrains), difficulty 2 (all terrains), difficulty 2.5 (all terrains), and finally difficulty 3 and higher (all terrains). For the moment, anyway, this is working for me, but my difficulty 1.5 PQ is always dangerously close to maxing out at 500. Since 1.5 is the most common hide (at least here, anyway), I am constantly fighting to keep it below 500. If the PQ size were increased to 1,000, the PQs would all fit comfortably. Hope this makes sense.

The problem is that you are not running your PQs efficiently. If you ran them by 'placed date' instead of your current method, you could tweak the dates to maximize your queries.

 

You can also run your queries on multiple days.

Edited by sbell111
Link to comment

I guess I've never understood why people have to keep these huge databases. I just run a query of the area I want to cache in. Why do all of that work? :D

Agreed - I let the website keep the database for me.

Even though I don't keep a database of my own, I would like to get all the geocaches in the area I will be caching in with one PQ. It would be less load on the servers to produce one PQ with 1,000 caches than three PQs of 500 caches each to cover the same area. A lot of caches would end up in two or three of the PQs just to pick up the ones on the outskirts.

Edited by nils-Olov
Link to comment

I guess I've never understood why people have to keep these huge databases. I just run a query of the area I want to cache in. Why do all of that work? :D

Agreed - I let the website keep the database for me.

Even though I don't keep a database of my own, I would like to get all the geocaches in the area I will be caching in with one PQ. It would be less load on the servers to produce one PQ with 1,000 caches than three PQs of 500 caches each to cover the same area. A lot of caches would end up in two or three of the PQs just to pick up the ones on the outskirts.

Did you not read my previous post about your PQs being run inefficiently?

 

Also, do you agree that it would be a bigger server load to create 10,000 PQs of 1,000 caches each than it would be to create 10,000 PQs of 500 caches each?

Link to comment

I guess I've never understood why people have to keep these huge databases. I just run a query of the area I want to cache in. Why do all of that work? :D

 

Databases are great for pulling all your individual requirements together in one place, e.g. adding notes, corrected coordinates for puzzle caches, child waypoints for parking that are only given in the description, or your own defined areas, etc. :D

 

All this information can easily be included in your printouts and transferred to your electronic gadgets. :D

 

There would be no need to keep offline databases up to date if there were the ability to generate custom PQs (within the current limits), selected by a database filter on this user-added criteria.

 

There are an awful lot of caches out there; I won't get around them all, and I would prefer to select those I visit by richer means than the current PQ selection criteria.

 

So a timely, customisable PQ built from a database filter is at the top of my Christmas wish list :D

 

Do I believe in Christmas, or that anyone shares my views? :D

Link to comment
There would be no need to keep offline databases up to date if there were the ability to generate custom PQs (within the current limits), selected by a database filter on this user-added criteria.

You can always create a bookmark list and generate a PQ based on it.

 

For online DB only, what I'd like are private notes and corrected coordinates stored in Groundspeak that only the person creating the log (and administrators) can view. It's been asked before, and doesn't look like it is going to happen.

Link to comment
There would be no need to keep offline databases up to date if there were the ability to generate custom PQs (within the current limits), selected by a database filter on this user-added criteria.

You can always create a bookmark list and generate a PQ based on it.

 

For online DB only, what I'd like are private notes and corrected coordinates stored in Groundspeak that only the person creating the log (and administrators) can view. It's been asked before, and doesn't look like it is going to happen.

 

Bookmark lists are far from the speedy, reselectable alternative I suggest.

 

It is also a pain in the rectum to remove caches one by one from a bookmark list once you've found them.

 

No such problem with a database system; it always includes just the caches you desire. :)

 

My system would ease the burden on Groundspeak's servers; yours would increase it. :)

Link to comment

I guess I've never understood why people have to keep these huge databases. I just run a query of the area I want to cache in. Why do all of that work? :)

Agreed - I let the website keep the database for me.

Even though I don't keep a database of my own, I would like to get all the geocaches in the area I will be caching in with one PQ. It would be less load on the servers to produce one PQ with 1,000 caches than three PQs of 500 caches each to cover the same area. A lot of caches would end up in two or three of the PQs just to pick up the ones on the outskirts.

It would be easier to let users pull up to their daily limit of 2,500 with one PQ, but we have to play the cards we are dealt. Anyhow, you can pull 1,000 with two PQs by using the date-placed feature. For example, in my area:

 

If I set the dates to return caches placed between 1/1/2000 and 12/31/2005 and preview the search, I get a circle of 30 miles and 500 caches.

 

Then if I set the dates to return caches placed between 1/1/2006 and 12/31/2009 and preview the search, I get a circle of 10 miles and 500 caches.

 

So now I have to adjust the 1/1/2006 boundary date until both PQs return the same radius. It takes a little while, but once it's done you are set and can send those to yourself every week. You can also do this for 5 PQs and 2,500 caches...

 

My center date ended up being 1/1/2009. So there are 1,000 caches within 12.5 miles of my center point (zip code), and 500 of them were placed this year! Wow!
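
If you'd rather not hunt for the boundary dates by trial and error, the same split can be computed from a list of placed dates (one per cache), which you could export from GSAK or read off PQ previews. A minimal sketch; the random dates below are just a stand-in for real data:

```python
import random
from datetime import date, timedelta

# Synthetic stand-in: one placed-date per cache within the search radius.
random.seed(1)
epoch = date(2000, 1, 1)
placed = sorted(epoch + timedelta(days=random.randint(0, 3650)) for _ in range(1200))

CAP = 500  # the current per-PQ limit

# Walk the sorted dates in blocks of CAP; each block becomes one PQ's date range.
for i in range(0, len(placed), CAP):
    block = placed[i:i + CAP]
    print(f"PQ: placed {block[0]} to {block[-1]} -> {len(block)} caches")
```

Since the PQ date filter is inclusive, nudge any boundary day that two ranges share into one range or the other.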

Edited by TrailGators
Link to comment

I guess I've never understood why people have to keep these huge databases. I just run a query of the area I want to cache in. Why do all of that work? :)

 

I try to keep a database of my state because I travel all over the state with my laptop on a regular basis and like to grab caches as I go. It is frustrating to look for a cache and find out that it has been unavailable or archived for a couple of weeks. My preference would be for GS to allow a PQ of caches archived within the last week. How is that for a Christmas wish?

Link to comment

I guess I've never understood why people have to keep these huge databases. I just run a query of the area I want to cache in. Why do all of that work? :)

 

I try to keep a database of my state because I travel all over the state with my laptop on a regular basis and like to grab caches as I go. It is frustrating to look for a cache and find out that it has been unavailable or archived for a couple of weeks. My preference would be for GS to allow a PQ of caches archived within the last week. How is that for a Christmas wish?

 

Yeah, but with my system you wouldn't need that.

 

Filter the database for the 500 oldest caches, or for the area you wish to visit.

Create the PQ from this filter, using up one of the PQs from your allowance.

Job done. :)

 

The Christmas wish is just for the ability to create that PQ from the filter, or for the Groundspeak TOU to permit such action. :) B)

 

Currently you can create a PQ from a bookmark list, so it should be easy enough to implement, but it needs Groundspeak's sanction/implementation, as it would breach the present TOU. :)

Link to comment
Simple solution if you are willing to pay more - just buy a 2nd premium account.
Not a particularly helpful solution. I would have to re-log all my finds in a second account.
not to mention you'd STILL get the same 500 as the other account!
That is only true if you ran identical PQs on both accounts. Why would you do that?
I guess I've never understood why people have to keep these huge databases. I just run a query of the area I want to cache in. Why do all of that work? :)
Agreed - I let the website keep the database for me.
Even though I don't keep a database of my own, I would like to get all the geocaches in the area I will be caching in with one PQ. It would be less load on the servers to produce one PQ with 1,000 caches than three PQs of 500 caches each to cover the same area. A lot of caches would end up in two or three of the PQs just to pick up the ones on the outskirts.
It would be easier to let users pull up to their daily limit of 2,500 with one PQ, but we have to play the cards we are dealt. Anyhow, you can pull 1,000 with two PQs by using the date-placed feature. For example, in my area:

 

If I set the dates to return caches placed between 1/1/2000 and 12/31/2005 and preview the search, I get a circle of 30 miles and 500 caches.

 

Then if I set the dates to return caches placed between 1/1/2006 and 12/31/2009 and preview the search, I get a circle of 10 miles and 500 caches.

 

So now I have to adjust the 1/1/2006 boundary date until both PQs return the same radius. It takes a little while, but once it's done you are set and can send those to yourself every week. You can also do this for 5 PQs and 2,500 caches...

 

My center date ended up being 1/1/2009. So there are 1,000 caches within 12.5 miles of my center point (zip code), and 500 of them were placed this year! Wow!

Alternatively, you can set both (all) PQs to a radius of 30 miles and tweak the dates until each (every) PQ returns 499(ish) caches.

Edited by sbell111
Link to comment

I guess I've never understood why people have to keep these huge databases. I just run a query of the area I want to cache in. Why do all of that work? :o

 

Databases are great for pulling all your individual requirements together in one place, e.g. adding notes, corrected coordinates for puzzle caches, child waypoints for parking that are only given in the description, or your own defined areas, etc. :D

 

All this information can easily be included in your printouts and transferred to your electronic gadgets. :D

 

There would be no need to keep offline databases up to date if there were the ability to generate custom PQs (within the current limits), selected by a database filter on this user-added criteria.

 

There are an awful lot of caches out there; I won't get around them all, and I would prefer to select those I visit by richer means than the current PQ selection criteria.

 

So a timely, customisable PQ built from a database filter is at the top of my Christmas wish list :D

 

Do I believe in Christmas, or that anyone shares my views? :)

 

Guess Christmas is truly make-believe then. ;)

Link to comment

First of all, I support your request. I too would benefit from more caches in a PQ file, especially since newer GPS units can handle several thousand waypoints/caches.

 

But while waiting for GS to implement your request, we need to find a workaround. In order to come up with some creative ideas, I would like to know what you are storing in your database and what you use it for.

I keep a running GSAK database of all active, unfound caches within 100 miles of my house. Every Saturday morning, I run 5 pocket queries: difficulty 1 (all terrains), difficulty 1.5 (all terrains), difficulty 2 (all terrains), difficulty 2.5 (all terrains), and finally difficulty 3 and higher (all terrains). For the moment, anyway, this is working for me, but my difficulty 1.5 PQ is always dangerously close to maxing out at 500. Since 1.5 is the most common hide (at least here, anyway), I am constantly fighting to keep it below 500. If the PQ size were increased to 1,000, the PQs would all fit comfortably. Hope this makes sense.

 

Well, there's your problem...you're only running PQs one day a week and expecting to get everything.

 

I run 3 to 5 PQs a day to keep my offline database as up to date as I can. I live near the intersection of two major highways (I-4 and I-95) and can travel 100 miles in three directions (N/S/W) with plenty of unfound caches to keep me busy for a few days (if I decide to make a weekend trip of it), no matter which direction I travel.

 

A simple adjustment to your PQs should take care of your issue.

Link to comment

First of all, I support your request. I too would benefit from more caches in a PQ file, especially since newer GPS units can handle several thousand waypoints/caches.

 

But while waiting for GS to implement your request, we need to find a workaround. In order to come up with some creative ideas, I would like to know what you are storing in your database and what you use it for.

I keep a running GSAK database of all active, unfound caches within 100 miles of my house. Every Saturday morning, I run 5 pocket queries: difficulty 1 (all terrains), difficulty 1.5 (all terrains), difficulty 2 (all terrains), difficulty 2.5 (all terrains), and finally difficulty 3 and higher (all terrains). For the moment, anyway, this is working for me, but my difficulty 1.5 PQ is always dangerously close to maxing out at 500. Since 1.5 is the most common hide (at least here, anyway), I am constantly fighting to keep it below 500. If the PQ size were increased to 1,000, the PQs would all fit comfortably. Hope this makes sense.

 

Well, there's your problem...you're only running PQs one day a week and expecting to get everything.

 

I run 3 to 5 PQs a day to keep my offline database as up to date as I can. I live near the intersection of two major highways (I-4 and I-95) and can travel 100 miles in three directions (N/S/W) with plenty of unfound caches to keep me busy for a few days (if I decide to make a weekend trip of it), no matter which direction I travel.

 

A simple adjustment to your PQs should take care of your issue.

His problem isn't that he's failing to capture every log. His problem is that his PQs grow to exceed the 500-cache limit. Obviously, this is because he runs his PQs by difficulty rather than placed date. If he ran his PQs by placed date, the only query ever at risk of going over 500 caches would be the one containing the caches currently being placed. All other queries would slowly shrink regardless of his caching activity. By occasionally tweaking the dates of these PQs, he could very easily slow (or stop) his need for additional PQs.

Link to comment

First of all, I support your request. I too would benefit from more caches in a PQ file, especially since newer GPS units can handle several thousand waypoints/caches.

 

But while waiting for GS to implement your request, we need to find a workaround. In order to come up with some creative ideas, I would like to know what you are storing in your database and what you use it for.

I keep a running GSAK database of all active, unfound caches within 100 miles of my house. Every Saturday morning, I run 5 pocket queries: difficulty 1 (all terrains), difficulty 1.5 (all terrains), difficulty 2 (all terrains), difficulty 2.5 (all terrains), and finally difficulty 3 and higher (all terrains). For the moment, anyway, this is working for me, but my difficulty 1.5 PQ is always dangerously close to maxing out at 500. Since 1.5 is the most common hide (at least here, anyway), I am constantly fighting to keep it below 500. If the PQ size were increased to 1,000, the PQs would all fit comfortably. Hope this makes sense.

 

Well, there's your problem...you're only running PQs one day a week and expecting to get everything.

 

I run 3 to 5 PQs a day to keep my offline database as up to date as I can. I live near the intersection of two major highways (I-4 and I-95) and can travel 100 miles in three directions (N/S/W) with plenty of unfound caches to keep me busy for a few days (if I decide to make a weekend trip of it), no matter which direction I travel.

 

A simple adjustment to your PQs should take care of your issue.

His problem isn't that he's failing to capture every log. His problem is that his PQs grow to exceed the 500-cache limit. Obviously, this is because he runs his PQs by difficulty rather than placed date. If he ran his PQs by placed date, the only query ever at risk of going over 500 caches would be the one containing the caches currently being placed. All other queries would slowly shrink regardless of his caching activity. By occasionally tweaking the dates of these PQs, he could very easily slow (or stop) his need for additional PQs.

 

From his post, "Every Saturday morning, I run 5 pocket queries..." Hence my comment that he's only running his PQs once a week. If he created additional PQs to fill in whatever he's missing and ran them on two days, or three days, or however many days it requires, he should be able to get around his current issue.

 

Where did I say he was failing to capture every log? B)

 

For the record... I don't worry about capturing every log. I just want to keep the status updated to within a few days, to avoid hunting for missing/archived caches. By updating my PQs once a week it's very rare that I come across something that's been archived, and I keep my DNFs to a minimum by not loading geocaches with multiple consecutive DNFs unless there is a current maintenance log by the owner stating that the cache is present and ready to be found. Back before I knew how to manage the data, I had some stale-data issues. A simple home-brewed macro in GSAK has taken care of that problem for me. I only run each PQ on my list once a week... my goal isn't to capture every log, it's to cover a large area so I have many options for where to go caching at a moment's notice.
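
That macro isn't posted here, but the stale-data rule is simple enough to sketch outside GSAK. A minimal example in Python, assuming a standard Groundspeak 1.0 PQ GPX with logs listed newest first ("my_pq.gpx" is a hypothetical filename):

```python
import xml.etree.ElementTree as ET

NS = {"gpx": "http://www.topografix.com/GPX/1/0",
      "gs": "http://www.groundspeak.com/cache/1/0"}

def dnf_streak(wpt):
    """Count consecutive 'Didn't find it' logs at the top of a cache's log list."""
    streak = 0
    for log in wpt.findall(".//gs:log", NS):
        if log.findtext("gs:type", "", NS) == "Didn't find it":
            streak += 1
        else:
            break  # a find, owner maintenance, or any other log ends the streak
    return streak

root = ET.parse("my_pq.gpx").getroot()
# Keep caches whose most recent logs are not two or more straight DNFs;
# a newer owner-maintenance log ends the streak, so those caches are kept.
keep = [w for w in root.findall("gpx:wpt", NS) if dnf_streak(w) < 2]
print(f"kept {len(keep)} of {len(root.findall('gpx:wpt', NS))} caches")
```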

Link to comment

First of all, I support your request. I too would benefit from more caches in a PQ file, especially since newer GPS units can handle several thousand waypoints/caches.

 

But while waiting for GS to implement your request, we need to find a workaround. In order to come up with some creative ideas, I would like to know what you are storing in your database and what you use it for.

I keep a running GSAK database of all active, unfound caches within 100 miles of my house. Every Saturday morning, I run 5 pocket queries: difficulty 1 (all terrains), difficulty 1.5 (all terrains), difficulty 2 (all terrains), difficulty 2.5 (all terrains), and finally difficulty 3 and higher (all terrains). For the moment, anyway, this is working for me, but my difficulty 1.5 PQ is always dangerously close to maxing out at 500. Since 1.5 is the most common hide (at least here, anyway), I am constantly fighting to keep it below 500. If the PQ size were increased to 1,000, the PQs would all fit comfortably. Hope this makes sense.

 

Well, there's your problem...you're only running PQs one day a week and expecting to get everything.

 

I run 3 to 5 PQs a day to keep my offline database as up to date as I can. I live near the intersection of two major highways (I-4 and I-95) and can travel 100 miles in three directions (N/S/W) with plenty of unfound caches to keep me busy for a few days (if I decide to make a weekend trip of it), no matter which direction I travel.

 

A simple adjustment to your PQs should take care of your issue.

His problem isn't that he's failing to capture every log. His problem is that his PQs grow to exceed the 500-cache limit. Obviously, this is because he runs his PQs by difficulty rather than placed date. If he ran his PQs by placed date, the only query ever at risk of going over 500 caches would be the one containing the caches currently being placed. All other queries would slowly shrink regardless of his caching activity. By occasionally tweaking the dates of these PQs, he could very easily slow (or stop) his need for additional PQs.

 

From his post, "Every Saturday morning, I run 5 pocket queries..." Hence my comment that he's only running his PQs once a week. If he created additional PQs to fill in whatever he's missing and ran them on two days, or three days, or however many days it requires, he should be able to get around his current issue.

 

Where did I say he was failing to capture every log? B)

 

For the record... I don't worry about capturing every log. I just want to keep the status updated to within a few days, to avoid hunting for missing/archived caches. By updating my PQs once a week it's very rare that I come across something that's been archived, and I keep my DNFs to a minimum by not loading geocaches with multiple consecutive DNFs unless there is a current maintenance log by the owner stating that the cache is present and ready to be found. Back before I knew how to manage the data, I had some stale-data issues. A simple home-brewed macro in GSAK has taken care of that problem for me. I only run each PQ on my list once a week... my goal isn't to capture every log, it's to cover a large area so I have many options for where to go caching at a moment's notice.

Still, you are advocating that he stick with a method that is inefficient and requires more babysitting of his PQs. If he ran his PQs efficiently, he would likely be able to continue running them all on one day.
Link to comment
This topic is now closed to further replies.