
Pocket Queries For All Found Caches?


robotman

Recommended Posts

Is there a way to do a pocket query for ALL my found caches? I've tried a zip code origin with a 10,000-mile radius, but that doesn't work. For some reason, I only get 143 of the 197 caches I've found.

 

I know I have a couple of virtual caches, but not that many. I have a few caches in England (~20). I don't see why I'm not pulling up more of my found caches when I run a pocket query.

 

Any tricks?!

 

John

Link to comment

Leave the origin blank but highlight all countries (click the first and shift-click the last). That will give you all caches found in all countries (except for the archived ones and the US ones). You then need a second query using all states instead of all countries, since for some odd reason the US doesn't count as a country :lol: . Combining the two in GSAK will give you the complete list of active found caches.

Link to comment

Thanks! The international trick worked. I actually only had 13 in England. Still missing quite a few. [:D] I imported these into GSAK and got up to 157 out of my 198.

 

I don't think I have more than 3-4 reverse (locationless) caches.

 

After thinking about it a bit more, I tried "show all caches" on my account page and ALL 198 showed up (plus DNFs and notes). Exactly what I wanted. It was a helpful list because it shows that only 5 have been archived. A great summary of my geo-life. But...

 

Is there any easy way to generate a GPX file from the cache list on my account page?

 

Thanks!

 

John

Edited by robotman
Link to comment

If you didn't have any limits on caches other than "that I have found," those 2 queries should have returned everything except the archived ones. Are you sure you didn't have an extra box or two checked? Limiting difficulty or terrain, or excluding certain cache or container types?

 

The only way to get archived ones, as well as anything from a search list or the cache list on your profile, is one cache at a time. After you have them all, merge them together with Watcher and save them as a single file.
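If you'd rather script that merge than do it all by hand, here's a minimal sketch in Python of the same idea (combine several GPX files, dropping duplicates). It assumes the GPX 1.0 format geocaching.com generates; the file names are made up:

```python
# Minimal sketch: merge several pocket-query GPX files into one,
# skipping duplicate caches. Assumes GPX 1.0 (the geocaching.com
# format); the file names below are hypothetical.
import xml.etree.ElementTree as ET

NS = "http://www.topografix.com/GPX/1/0"
ET.register_namespace("", NS)  # write output without ns0: prefixes

def merge_gpx(paths, out_path):
    merged = ET.parse(paths[0])                     # first file is the base
    root = merged.getroot()
    seen = {w.findtext("{%s}name" % NS) for w in root.iter("{%s}wpt" % NS)}
    for path in paths[1:]:
        for wpt in ET.parse(path).getroot().iter("{%s}wpt" % NS):
            code = wpt.findtext("{%s}name" % NS)    # the GC code, e.g. GC1234
            if code not in seen:                    # skip caches already present
                seen.add(code)
                root.append(wpt)
    merged.write(out_path, xml_declaration=True, encoding="UTF-8")

merge_gpx(["found_states.gpx", "found_countries.gpx"], "found_all.gpx")
```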

Link to comment

Yup... I double-checked the query. There is NOTHING checked except "All container types" and "that I have found".

 

I selected "no origin", highlighted ALL the states with a radius of 999 miles, and it still returns only 158 of my finds. I've only cached in the San Fran area, so I don't have any caches that would be that far away.

 

Puzzled...

 

John

Link to comment
Copy and paste the URL of your query and other Premium Members can look at it, but not edit it.

I didn't know you could do that. That's a handy feature for sharing settings (and helping debug a friend's PQ). Thanks for pointing this out.

 

--Marky

Link to comment
Thanks! The international trick worked. I actually only had 13 in England. Still missing quite a few. [:(] I imported these into GSAK and got up to 157 out of my 198.

It should give you 193. This URL will give you a list of all your 193 unarchived finds:

 

http://www.geocaching.com/seek/nearest.aspx?ul=robotman

 

You should be able to duplicate that list in a pocket query (at least it works for me :D ). Might want to check one that is in the 193 list but not in the 157 list and see what is different about it.
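If you end up with both lists on disk as GPX files (say, the PQ result and a file built from the individual page downloads), the odd ones out can be found mechanically instead of by eye. A quick sketch, with hypothetical file names, again assuming GPX 1.0:

```python
# Sketch: list the GC codes present in one GPX file but not another.
import xml.etree.ElementTree as ET

NS = "{http://www.topografix.com/GPX/1/0}"

def gc_codes(path):
    """Collect the GC code (the <name> of each waypoint) from a GPX file."""
    return {w.findtext(NS + "name") for w in ET.parse(path).getroot().iter(NS + "wpt")}

# Caches on the full list but missing from the PQ result:
for code in sorted(gc_codes("all_finds.gpx") - gc_codes("pq_finds.gpx")):
    print(code)
```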

Link to comment
This URL will give you a list of all your 193 unarchived finds: http://www.geocaching.com/seek/nearest.aspx?ul=robotman

You should be able to duplicate that list in a pocket query (at least it works for me :( ). Might want to check one that is in the 193 list but not in the 157 list and see what is different about it.

That list includes all the archived finds.

 

It was a helpful list because it shows that only 5 have been archived.

I did a quick check of your finds and you have actually found about 39 caches (including 1 LC) that have been archived. A further 6 caches you found are currently disabled.

 

The issue is going to be those archived caches that you have found. Those are the ones you need to download individually and merge with your PQ.

Link to comment
That list includes all the archived finds.

You're right - my mistake. The List All Cache Finds URL includes archived caches but not locationless (I think it did at one time, but I must have missed the change). The total caches found on the user stats tab and the user image found count include both archived and locationless. PQs should give you everything except archived.

 

Now I've got myself confused, since in my case I still have a discrepancy of one cache, but no big deal - it's not about the numbers :(

Link to comment
This looks great - but what if you have more than 500 found caches? Isn't that the limit on how many total caches a PQ will return?

 

I'd like to be able to read all my "Found it" logs without searching cache by cache online; that's more than 500 finds by now.

Currently, you have to figure out a way to break it into multiple PQs. Say, one for "traditional" caches and one for everything else, or separate queries for different states - whatever makes it possible to get them into groups of 500 or fewer - then merge however many it takes into one big file using Watcher (or similar) after you get the e-mails.

 

For archived caches, you'll have to do 'single page downloads' from the cache page.

 

Hopefully the "all finds" PQ mentioned previously (in other threads) will make it to production soon and will simplify this.

Link to comment

Thanks for everyone's help. Something still doesn't exactly jibe.

 

If I use this link:

http://www.geocaching.com/seek/nearest.aspx?ul=robotman

 

I see 193 caches.

 

If I go to 'my account' and list all the caches I've found:

 

I see 197 caches (which is correct according to my own record keeping).

 

I counted 38 archived caches. I was initially confused by the icons on the 'my account' page. What I had been counting were notes I posted recommending that a cache be archived, so I thought only 5 were archived.

 

197 - 38 = 159. Currently in GSAK I'm showing 159.

 

SO... PQs DO show locationless caches, based on their page coordinates (which are usually an example of the LC).

 

I don't get the 193 vs. 197 discrepancy, though. Any ideas?

 

And any ideas how to get the archived caches without going into each page and downloading the GPX info?

 

John

Link to comment

I think that the ability to get all of your finds worldwide, including archived caches, would be an awesome premium-member feature. Make it cost an extra 50 bucks. I would pay it without batting an eye.

 

Man, I wish I had kept up a found PQ from the time I started.

Link to comment

Phew! I think I finally got all the numbers to match.

 

I copied and pasted the 197 list from "My Account" into Excel and sorted by name.

 

I manually downloaded all the archived caches from the (http://www.geocaching.com/seek/nearest.aspx?ul=robotman) link and added them to GSAK. Still only came up with 196!

 

So, I copied and pasted the sorted list from GSAK into Excel.

 

GSAK doesn't sort the same way as Excel. And I just realized I should have re-sorted the GSAK names in Excel instead of trying to piece the two lists together.

 

After way too much time, I finally determined that the one archived locationless cache was the discrepancy. I went to that page and added it, and now GSAK is up to date.

 

That only took about 1.5 hours.

 

Yes! How do we request that a PQ give us all our finds?! That would be a much easier way to do this. :grin:

 

John

Link to comment

As offered before, and as others have also agreed, a small fee, say $10.00, for a one-time download of ALL finds would be very worthwhile for those of us who would like to save a few hours pasting things together - a project that could get painfully annoying as your finds get into the thousands.

 

Thanks again,

maleki

Link to comment

I certainly agree with maleki on this!

 

I believe Jeremy had promised this feature, but it hasn't happened yet, and doesn't seem to be a high priority. However, while waiting for this to be implemented, the task of assembling our own lists becomes more and more daunting...

 

I'd be happy to pay a fee for a one-time PQ of all my own logs, including Finds, No Finds, archived caches, etc. I could keep it up to date myself after that.

Link to comment

Yeah... once you have it up to date, it's very easy to maintain!

 

Actually, you don't even have to maintain it... it just stays correct as you import your new finds into GSAK. It's worth the effort to get everything reconciled, since you don't have to constantly be "fixing" things.

 

I like seeing the actual number of found caches in my database match my geo-reality. <_<

Link to comment

I dunno about paying extra above and beyond the existing premium member fee, but... if that's what it took to fund a dedicated server that could spit these out, I guess I'd chip in :lol:

 

Just to keep this thread pinned, and reiterate my interest: I, too, would VERY much like a PQ of ALL found caches, regardless of status. If it turned out that we still had to "chunk them up" into groups of fewer than 500, I could live with that, but it would be less useful (it's annoying to have to burn 3 or 4 of the 20 PQ slots just to try to keep up with finds).

 

And yes, the only reason I'm reiterating my interest is cuz I recently fat-fingered my GSAK "found" database, lost a few, and am now faced with a pretty hefty reconciliation process to locate the missing ones.

 

Here's hoping this one can inch up the priority list - though if it has a negative impact on server availability and performance... it can wait :lol:

 

Thanks!

Billy

Link to comment
Actually, you don't even have to maintain it... it just stays correct as you import your new finds into GSAK.

I must be missing something. How do you easily maintain and update your new finds?

 

OK, so after MANY annoying PQs trying to patch together a Found database for GSAK, I finally have something that is sort of complete.

 

We have 2200+ finds and I'm still missing 50 or 60. Yes, I know it's probably because of archived caches, locationless caches, or whatever. I'm happy with what I have even though it's somewhat incomplete. I'm not gonna sit there and page through 2k+ finds to fix it any time soon, and apparently an occasional one-time query of ALL finds is not coming my way any time soon either. My offer to pay a nominal amount for a one-time download still stands.

 

Perhaps we can at least be mildly appeased by some ability to PQ our recent finds, or finds by date found instead of date placed. The ability to query for my last 500 finds, or finds since xx/xx/xxxx, or finds in the last month, would allow me to update what I have easily.

 

Without a simple query like that, I need to piece together, with 2200+ finds, 5 queries based on the date the caches were placed just to update anything found in the last month. I just need the 100 or so from the last month, but I have to burden the system with 5 queries to update the last month's activity. Something's wrong here. For a while I was running the 5 queries weekly to update things, but it got too annoying for both me and the GC system. Surely it can be done more easily, but perhaps I've missed something. If not, I'll need to piece together my 5 'found by date placed' queries again, which is another hour I could be out caching and a very unnecessary drain on already overloaded system resources.
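For what it's worth, the date-placed cut-offs for those queries don't have to be found by trial and error. A minimal sketch of the arithmetic, assuming you can export (GC code, date placed) pairs from GSAK or similar; the sample data is made up:

```python
# Sketch: pick "date placed" ranges so each pocket query stays under
# the 500-cache cap. The input data below is illustrative only.
from datetime import date

def placed_date_ranges(finds, cap=500):
    """Split finds into consecutive date-placed ranges of at most `cap` caches."""
    dates = sorted(placed for _, placed in finds)
    chunks = (dates[i:i + cap] for i in range(0, len(dates), cap))
    return [(chunk[0], chunk[-1]) for chunk in chunks]

finds = [("GC0001", date(2001, 5, 2)), ("GC0002", date(2002, 7, 9))]  # ...etc.
for start, end in placed_date_ranges(finds):
    print("PQ: caches placed between %s and %s" % (start, end))
# If two adjacent ranges end up sharing a boundary date, nudge one of
# them by a day so the queries don't overlap.
```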

 

Time to get off the soapbox again. I need time to get my 5 PQs recreated.

Link to comment
I must be missing something. How do you easily maintain and update your new finds?

I just d/l the GPX file for each cache after I log it, then add them to the found GPX file I keep on my computer. I actually divide my finds by year this way, so I have one found GPX file for 2003, one for 2004, and one for 2005.
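That per-year split can also be scripted once everything is in one file. A rough sketch, assuming the classic GPX 1.0 + Groundspeak 1.0 extensions (the log element names here are assumed from that format, so verify against your own files); 'found_all.gpx' and the username are made up from this thread:

```python
# Sketch: split one big "found" GPX into per-year files, keyed on the
# date of your own "Found it" log in the Groundspeak extensions.
import xml.etree.ElementTree as ET
from collections import defaultdict

GPX = "{http://www.topografix.com/GPX/1/0}"
GS = "{http://www.groundspeak.com/cache/1/0}"
ET.register_namespace("", GPX[1:-1])
ET.register_namespace("groundspeak", GS[1:-1])

def found_year(wpt, me="robotman"):
    """Year of `me`'s 'Found it' log on this waypoint, if any."""
    for log in wpt.iter(GS + "log"):
        if ((log.findtext(GS + "type") or "").strip() == "Found it"
                and (log.findtext(GS + "finder") or "").strip() == me):
            return (log.findtext(GS + "date") or "")[:4]  # "2004-08-15..." -> "2004"
    return None

root = ET.parse("found_all.gpx").getroot()
by_year = defaultdict(list)
for wpt in root.iter(GPX + "wpt"):
    by_year[found_year(wpt) or "unknown"].append(wpt)

for year, wpts in by_year.items():
    out = ET.Element(GPX + "gpx")   # bare root; real files carry more metadata
    out.extend(wpts)
    ET.ElementTree(out).write("found_%s.gpx" % year,
                              xml_declaration=True, encoding="UTF-8")
```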

Link to comment

Yes, that would work, but it's too many extra steps if you cache a lot, compared to an occasional query of found caches. With a populated GSAK database I could easily pull out any year or date range I wanted, if I could easily keep the database current.

 

I can go into GSAK and manually tag each cache I found during the day as found, but I'd prefer to update it with an occasional (and small) download instead of my huge 5 queries for all of my (almost) 2222 Found Caches. One step instead of MANY! I wouldn't spend the time to download a GPX for each individual cache. I actually tried that for a couple of caches earlier today before realizing it was too annoying and time-consuming. If everyone did them that way, one at a time, it would probably be quite a drain on GC system resources too - or maybe not. It might be a possible workaround for some, though, so thanks.

Link to comment

I have been known to cache a bit and never had a problem doing things the way I described. It does not take long at all to d/l the individual GPX file, maybe 5 seconds(?), and then maybe an extra minute or so to add them in bulk to your found database. The reason I keep them separated by year is 1) personal preference and 2) to keep the file size a bit more manageable for Plucker (I also keep all of this on my Palm).

 

If you already have a database with all of your finds, why do you need to run the PQs for them all the time? Just create one PQ that runs every day and includes all of your finds from the past 7 days. Add that one to your database as often as you like. The only problem I can see, once you have all of your finds, is if you want absolutely up-to-the-minute info on each and every one of your finds.
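And that weekly file doesn't have to go through GSAK either; folding it into a master found file is simple enough to script. A small sketch under the same GPX 1.0 assumption, with invented file names, where a fresh copy of a cache simply replaces the stale one by GC code:

```python
# Sketch: merge a small "found in the last 7 days" PQ into a master
# found file, replacing stale entries by GC code. Assumes <wpt>
# elements sit directly under <gpx>, as in geocaching.com GPX 1.0.
import xml.etree.ElementTree as ET

NS = "{http://www.topografix.com/GPX/1/0}"
ET.register_namespace("", NS[1:-1])

master = ET.parse("found_all.gpx")
root = master.getroot()
by_code = {w.findtext(NS + "name"): w for w in root.iter(NS + "wpt")}

for wpt in ET.parse("last7days.gpx").getroot().iter(NS + "wpt"):
    code = wpt.findtext(NS + "name")
    if code in by_code:
        root.remove(by_code[code])   # drop the stale copy
    root.append(wpt)                 # keep the fresh one

master.write("found_all.gpx", xml_declaration=True, encoding="UTF-8")
```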

Link to comment


Being from the same area, I know you do cache a bit - I've signed the logs at caches right after you many times. One of these days I'm sure we'll cross paths. We all have our own ways of doing things, software and hardware products we use, and ways we might like to see things done IF it were possible.

 

Once again, perhaps I'm missing something; if anyone can tell me how to do a query for my last 7 days' finds, that would be simply wonderful for now. I'd love to be able to do my last month's finds (or a period of my choice) even better, but I'd settle for the last seven days for now.

 

I'd just like to be able to use the data that's already there to eliminate extra busywork and have an easy way to update my finds. For now I just try to remember to manually tag found caches in my GSAK database. It works, I guess, but some flexibility in the query system as far as found caches go will be a welcome addition someday.

Link to comment

I'm actually surprised we haven't bumped into each other at some point... hehe.

 

I went to check and am beginning to see the problem better. I've been doing them individually for so long that I forgot how much of a pain it is to try to do them any other way. Thanks for being patient with this young grasshopper. Glad I got my database set before the numbers got out of hand.

 

So what would be needed is a selection that says 'list all caches that I have found within the last 7 days' (or a pull-down menu so you can select a time frame). Of course, that would be in addition to the 'one-time, all finds' query (unless the time frame in the pull-down could go back to your member start date). And of course, both would need to include archived caches in some way.

Link to comment

Maybe it's time for a Super Premium user level, for a fee, that can have queries greater than 500, queries that include archived caches, queries with named GPX & PRC files, priority queries, queries that do caches within x miles of a route, etc.

 

I would gladly pay double for these and maybe a few other features.

 

Dean

Link to comment
This topic is now closed to further replies.