
Another PQ Solution (dream)


TheAprilFools


I have read several topics over time where people have asked for archived caches, or for all the caches they have found regardless of quantity or whether they are archived.

 

I have also read topics and responses where we have had problems with the PQ server not being able to generate all the queries it needs to in a single day, because of the resources required to pull up all those notes and generate the GPX file. I may have an idea that could help with all of these things.

 

We allow users to create a query that lifts the 500-mile limit, the archived restriction, and the 500-cache limit (raising it to some higher number like 5000; there has to be some limit), provided that the PQ only generates a LOC file, or a GPX file without the Groundspeak extensions, and no ebook.
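To give a sense of how little data that stripped-down output carries, here is a minimal sketch of writing a LOC-style file in Python; the field layout is simplified and the cache list is made up for illustration:

```python
# Sketch of a stripped-down LOC-style export: each waypoint is just
# an ID, a name, and a coordinate pair -- no logs, no descriptions,
# no hints, none of the Groundspeak extension data.
from xml.sax.saxutils import escape

def write_loc(caches, path):
    """caches: iterable of (gc_code, name, lat, lon) tuples (hypothetical)."""
    with open(path, "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n<loc version="1.0">\n')
        for code, name, lat, lon in caches:
            f.write('  <waypoint>\n')
            f.write(f'    <name id="{escape(code)}">{escape(name)}</name>\n')
            f.write(f'    <coord lat="{lat:.6f}" lon="{lon:.6f}"/>\n')
            f.write('  </waypoint>\n')
        f.write('</loc>\n')

# Example with made-up data:
write_loc([("GC1234", "Sample Cache", 47.606200, -122.332100)], "finds.loc")
```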

 

I know there are some who would like complete descriptions of the caches they have found, but be honest: who really needs the descriptions or the notes for caches they have already found? They are not likely to go and find them again.

 

If you are traveling, you could use it to download the locations of all the caches in a state, not to actually use it to find the caches, but to see which areas to focus on so you could then create normal PQs for them.

 

Before everyone gets out their popcorn, I already have mine ready.

 

:mad:

Link to comment

The only flaw I see in your logic is that this PQ would take roughly the same amount of processing time to generate with or without the gc.com extensions. Also, given that there are a handful of geocachers soon to be in the 5 digit cache finds range, it seems like maybe 5000 would be too low.

 

Isn't what you are asking for pretty much the same as if we requested a .loc file for our PQ?

 

--Marky

Link to comment
The only flaw I see in your logic is that this PQ would take roughly the same amount of processing time to generate with or without the gc.com extensions. Also, given that there are a handful of geocachers soon to be in the 5 digit cache finds range, it seems like maybe 5000 would be too low.

 

Isn't what you are asking for pretty much the same as if we requested a .loc file for our PQ?

 

--Marky

From my understanding, the notes are stored in the Groundspeak extensions, and it's gathering the notes that takes the bulk of the resources. Among the few things that remain in the GPX file after you remove the Groundspeak stuff are the cache type and whether it's found or not, which I find useful; those should not require much more resource to read from the database, since it already has the cache record.
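For what it's worth, those two fields do live outside the extension block in a geocaching.com GPX file: the base `<type>` element carries the cache type, and a `<sym>` value of "Geocache Found" flags a find. A rough Python sketch of reading just those fields (the namespace and element names follow GPX 1.0; treat the details as assumptions):

```python
import xml.etree.ElementTree as ET

NS = {"gpx": "http://www.topografix.com/GPX/1/0"}  # base GPX 1.0 namespace

def basic_waypoint_info(gpx_path):
    """Yield (name, cache_type, found) for each waypoint, using only
    base-GPX elements and ignoring the Groundspeak extensions."""
    root = ET.parse(gpx_path).getroot()
    for wpt in root.findall("gpx:wpt", NS):
        name = wpt.findtext("gpx:name", default="", namespaces=NS)
        ctype = wpt.findtext("gpx:type", default="", namespaces=NS)
        sym = wpt.findtext("gpx:sym", default="", namespaces=NS)
        yield name, ctype, sym == "Geocache Found"
```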

 

As for the 5000 limit, right now, based on lists I have seen, it would affect maybe 11 cachers; lift it to 10000 and it would affect one (and a second soon). I realise you don't want someone creating a query that would get all the caches in North America; there has to be some limit, and 5000 seemed like a reasonable suggestion.

Link to comment

Interesting. TPTB found 500 to be reasonable.

 

My point is that a 'found caches' PQ is not really necessary; neither is an 'archived' query. There are ways to get this information already. Granted it takes a little work on the part of the cacher, but the work only has to be done once.

 

I really don't see a good reason to lift the number of caches included in PQs beyond the current limit. Through the careful building of multiple PQs you can get what you need.

Link to comment

I will pipe up and ask for this as a member benefit again, please. Even if you could only run this found PQ once a month it would be great. I tried putting together a series of PQs to get this information last week, but after running the required 12 PQs over three days it was still incomplete, since I have found several hundred caches which are now archived. On top of that, there is no way to keep it current without running all 12 PQs again. I was excited that I had finally broken it down and figured out the necessary PQs to make it happen, even if it did take three days to get them all in. Looking back at it, I realized that just a few finds in states where I intend to cache in the next two months will make it necessary to throw out the whole thing and start over, recalculating how many states I can fit into each PQ and how many PQs, divided by which dates, I will have to use to get some states.

 

So please, please let us have a "found" PQ that includes archived caches and is not limited to 500 or even 5000, but only by your total finds. I am serious that I am willing to pay extra, above and beyond my premium membership, for it.

Link to comment
My point is that a 'found caches' PQ is not really necessary; neither is an 'archived' query.  There are ways to get this information already.  Granted it takes a little work on the part of the cacher, but the work only has to be done once.

A "little" work? A LITTLE work?

 

AFAIK, the only way to get all the caches you have found is an enormous amount of work, involving many carefully-crafted PQs, followed by a tedious manual search to figure out which ones have been archived, and manual downloading of each of those.

 

Is there some technique you are privy to that makes it only a little work?

 

(Doesn't matter to me, as I have been keeping a DB of my finds since about 6 months after I started caching. But I know many other cachers who never did that, and now consider themselves completely out of luck.)

Edited by fizzymagic
Link to comment

For me, and I suspect for most cachers who generate a PQ of our found caches, we are using them to generate maps of where we have been. I don't think full descriptions are needed, so a simplified output file is all that's required, and I think a LOC file would fit the bill.

 

I am sure everyone has used the cache display screen for their own finds or someone else's finds, or even to show all the caches close to a ZIP code, and you see in the corner the total number of records and which page of 20 is currently on the screen. Each time this page is displayed, the server code fetches the list of caches from the database and displays the total count and whichever page is needed. It already has all the information it needs to generate a LOC file of the entire result without having to hit the database again. I suspect hitting "next" a couple of times requires more resources than downloading the result as a LOC file once.
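To make that comparison concrete, here is a toy sketch of the two access patterns; the data structures are hypothetical, but the point is that browsing re-requests the same filtered result set one screenful at a time, while an export walks it once:

```python
PAGE_SIZE = 20

def view_page(results, page_num):
    # What each click of "next" effectively asks the server for:
    # the same filtered result set, sliced to one screenful.
    start = page_num * PAGE_SIZE
    return results[start:start + PAGE_SIZE]

def export_loc(results):
    # A LOC export touches each row exactly once.
    return [(r["code"], r["lat"], r["lon"]) for r in results]

# Browsing 5600 finds a page at a time means ~280 page requests
# (each one re-running the search); one export produces the same
# coordinates in a single pass.
```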

Link to comment

I agree; I do not desire the entire cache information, just the coordinates. I am looking to create a "where I have hunted" kind of map. It would also be nice to play with the stats, see how many in each state, or check average difficulty/terrain, etc., but the big thing is to get the pins on the map.
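The pin map and the quick stats both fall out of a coordinates-plus-basics file. As a sketch of the kind of tallying meant here, assuming each find record carries state, difficulty, and terrain fields (all names hypothetical):

```python
from collections import Counter
from statistics import mean

def find_stats(finds):
    """finds: list of dicts with 'state', 'difficulty', 'terrain' keys."""
    per_state = Counter(f["state"] for f in finds)
    return {
        "per_state": dict(per_state),
        "avg_difficulty": mean(f["difficulty"] for f in finds),
        "avg_terrain": mean(f["terrain"] for f in finds),
    }

# Example with made-up data:
print(find_stats([
    {"state": "FL", "difficulty": 1.5, "terrain": 2.0},
    {"state": "GA", "difficulty": 3.0, "terrain": 1.5},
]))
```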

Link to comment

The problem with keeping it updated is that when you find too many caches in an area, it screws up the entire system. If I find ten or fifteen more traditionals in Florida, then I will go over the 500 threshold, which means it will have to be split into two PQs searching Florida by date. The problem is that searching by date does not search by the date of your log, but by the date the cache was placed. So you can't just build a database and run a single PQ to keep it accurate. Basically, I will periodically have to run all 12 (soon to be 13) PQs again to catch older GC numbers that I may have found. It seems to me that it is less stress on the servers to send a single PQ with 5K+ results than to run 12 or 13 individual queries to get those same results.
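The recalculation being described looks roughly like this sketch: derive placed-date boundaries so no single PQ exceeds 500 results, knowing that any new find can shift every boundary after it. (The data structures are made up for illustration.)

```python
def split_by_placed_date(caches, limit=500):
    """caches: list of (placed_date, gc_code) pairs, hypothetical.
    Returns (first_date, last_date, count) buckets of at most `limit`
    caches each -- the ranges you would type into the PQ date filters."""
    caches = sorted(caches)          # order by placed date
    buckets, current = [], []
    for cache in caches:
        current.append(cache)
        if len(current) == limit:    # bucket full: close out this PQ
            buckets.append((current[0][0], current[-1][0], len(current)))
            current = []
    if current:
        buckets.append((current[0][0], current[-1][0], len(current)))
    return buckets

# One new find that lands in a full bucket pushes a cache into the
# next range, which can cascade -- so every PQ has to be redefined.
```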

 

To answer another post: you could pull up the "found list" on the My Account page, then go to the cache and download the individual cache coordinates, drop them into a database file, and repeat, 5600 times. I was just hoping for an easier answer than that.

 

I know this is a low-priority issue, but I would think that it would also be fairly easy to address; of course, it is also only a benefit for a small percentage of cachers. So if anyone has some free time, please put this one on the table to be considered.

Link to comment

If you use Watcher or GSAK you only have to run a PQ for the caches you have found in the last week. Add that to the database you already have in Watcher or GSAK and save. Next week do another PQ for caches you found in that week and add them in. One PQ, only 100 caches if you have a good week. (OK maybe more than 100 caches if you had a really good week)
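Sketched outside of Watcher or GSAK, that workflow amounts to keeping a master file keyed by GC code and folding each week's small PQ into it, with newer records replacing older ones. (The file layout and field names here are assumptions, not anything either program actually uses.)

```python
import json

def merge_weekly(master_path, weekly_finds):
    """weekly_finds: dict of gc_code -> cache record from this week's
    PQ (hypothetical layout). Newer records overwrite older ones, and
    nothing is ever deleted, so archived caches already in the master
    file are never lost."""
    try:
        with open(master_path) as f:
            master = json.load(f)
    except FileNotFoundError:
        master = {}                  # first run: start a new master file
    master.update(weekly_finds)
    with open(master_path, "w") as f:
        json.dump(master, f, indent=2)
    return len(master)
```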

Link to comment
If you use Watcher or GSAK you only have to run a PQ for the caches you have found in the last week. Add that to the database you already have in Watcher or GSAK and save. Next week do another PQ for caches you found in that week and add them in. One PQ, only 100 caches if you have a good week. (OK maybe more than 100 caches if you had a really good week)

Yes, this is true: the database of personal cache finds is fairly easy to maintain and update once you create the database. And that creation is the roadblock.

 

I decided to create a master GPX file of all my finds after I reached 1,000 finds. I think I was at around 1,200 at the time. After figuring out how to divvy up the state queries (two for Pennsylvania, one for Tennessee and Kentucky, one for everywhere else...), I found I was missing more than 100 caches that had been archived. I paged through 50+ screens of "found by" search results, opened each archived cache in a new window, and downloaded all the individual GPX files for those caches. Then I merged them all together into the main file. This was quite a lot of work, even with a high-speed internet connection.

 

Having built the database, I now try to update it once or twice per month by merging in the newer versions of the "caches I've found" queries. But I noticed recently that I'm off by about ten finds. This is likely due to caches being archived very soon after I found them, and before I got them into the updated master file.

 

An "all caches found by me, including archived caches" specialized query would solve both the initial barrier to creating a useful personal file, and it would save lots of time and periodic queries to keep things up to date. This feature has been noted favorably by the site developers, and I hope to see it implemented once some of the current projects are finished.

Link to comment
I agree; I do not desire the entire cache information, just the coordinates. I am looking to create a "where I have hunted" kind of map. It would also be nice to play with the stats, see how many in each state, or check average difficulty/terrain, etc., but the big thing is to get the pins on the map.

I remember reading in the forums that Jeremy said he'd make the "found by" PQ available someday, but for the archived caches he wouldn't include the coords.

 

So we'll still need to go back to all the archived cache pages and download GPX files, or cut and paste the coords from them.

Link to comment
If you use Watcher or GSAK you only have to run a PQ for the caches you have found in the last week.

Maybe I am dense, but I don't see any option in the PQ generator for "caches I found in the last week."

 

How do you set up this query? I'd love to be able to run that once a week, instead of downloading pages by hand, which is what I have to do now.

Link to comment
You need to select caches found by me and then found in the last week. Two check boxes.

Sorry. That won't work for me. That brings up all caches that I have ever found that were found by anyone in the last week. In my area, that comes to well over 500 caches, which may or may not include the ones I found during the week.

 

Your query would work for people with < 500 finds, but it is by no means a general solution to keeping one's data up-to-date.

 

I think that a personal PQ, such as outlined by Lep in this thread, could solve the problem. A very minor change to the PQ page and logic for these queries would do it. Instead of having date ranges based on when the cache was placed, have them based on when you found the cache. That way, people could run an initial series of PQs (none of which would exceed the 500-cache limit) to populate their database, and then run very small weekly PQs to keep their data up-to-date.
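In the same made-up terms as the placed-date sketch earlier, the change would look like this; the difference is that your past found dates never change, so closed buckets stay valid and only the newest one grows:

```python
def split_by_found_date(finds, limit=500):
    """finds: list of (found_date, gc_code) pairs, hypothetical.
    Same bucketing as splitting by placed date, but keyed on when
    *you* logged the find. Full buckets never change afterwards, so
    the initial PQs run once and only the newest range needs the
    small weekly refresh."""
    finds = sorted(finds)
    buckets, current = [], []
    for find in finds:
        current.append(find)
        if len(current) == limit:
            buckets.append((current[0][0], current[-1][0]))
            current = []
    if current:
        buckets.append((current[0][0], current[-1][0]))
    return buckets
```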

 

I believe that this would result in a net reduction in total PQ resources used.

Link to comment

I'm sure that I don't do this the easier way, because I almost always make technical things hard on myself, but I don't even create a PQ for my recent finds. I already have the caches in GSAK in my most recent PQ (the one I loaded to my pda to go out and find them). I just save those that I find to my little 'finds' database. It just takes a couple of minutes.

Link to comment
I already have the caches in GSAK in my most recent PQ (the one I loaded to my pda to go out and find them). I just save those that I find to my little 'finds' database. It just takes a couple of minutes.

Not being terribly familiar with GSAK (despite having paid for it), I didn't know about that feature. Does it allow you to record the time and date of your find in the 'finds' database? That would make it worthwhile.

Link to comment

Well, I save mine as an Excel file and create columns for things like date. I could dump it into an Access database and really go crazy, I guess.

 

edit: If I were President I'd make up my own words. Then, I'd never have to correct typos.

Edited by sbell111
Link to comment

I know a number of people who have two or more PQs set up to generate their found caches weekly. For me, unlike others who have commented here, I don't want the archived caches in my list, because of the way I use it: I create maps of available caches for a group I often cache with, and even though I have found a cache, the map shows that it's available for others in the group to find.

 

The point I would like to make is this: rather than making someone set up a number of PQs (up to twelve or more for some) that use up a lot of server resources, let's create an incentive to create PQs that are not as costly for the server to produce.

 

There have been times when I was planning a trip to another city or state and I wanted to see a map of all the caches in the area, so I ended up downloading them 20 at a time into ExpertGPS and combining them into a single map; this has to take a terrible toll on the server. Why not make a stripped-down PQ that shows more caches? I would think in the end everyone would win, as the users get the data they want with a minimum of work on the part of the GC.com servers.

Link to comment
Now I'm really confused. I thought this thread was about PQs of found caches. :D

Well, when I originally opened this thread, I was proposing a type of PQ that would lift the limits on cache quantity and distance provided it generated a stripped-down output file (like a LOC file), on the assumption that this would reduce the demand on the server generating all the PQs. This would also help solve the problem of getting all your found caches in a single PQ that a number of users are complaining about.

Link to comment
This topic is now closed to further replies.