JL_HSTRE

Downloading Large Numbers Of Caches Based On Date Last Found


I maintain a series of bookmark lists for Lonely Caches in Florida. For those unfamiliar, a Lonely Cache has zero Found logs during the past year (365 days). To keep things simple for myself, I update the lists at the end of the year. So the update I'm doing right now will feature all Florida caches not found during calendar year 2018.

 

Updating the list requires a time-consuming process outlined here. Short version: use the old search to bring up all caches within the state, sort by Last Found, then download lots of LOC files (updated in GSAK).

 

To make matters more difficult, geocaching.com now has reCAPTCHA, meaning I keep having to prove to the website that I am not a bot as I download hundreds of caches, slowing down an already slow process. It's rather unfortunate that I'm trying to do a good deed for the geocaching community and now have extra hurdles in the way.

 

Pocket Queries are not an alternative, as they only allow querying by Placed Date, not Found Date. Also, keep in mind there are over 41,000 active cache listings in Florida.

 

Does anyone have a suggestion on a better way I might go about doing this? Something through Project-GC or a similar site using the API?


Why not use GSAK's API access?

The new API lets you download 16,000 caches/day with full data and 10,000 with "light data" (as long as the current API remains active, you could even switch between the two and get an additional 6,000 full and 10,000 light API calls per day).

You might start by getting all FL caches into a database with time-based (placed date) PQs. Once you have that database up to date, use the API to get newly published caches every week. The only thing left to do then is to filter the database for caches that were not found in the last year and use the API to add them to your bookmark list.

Depending on the number of caches in FL, it may take some time to populate your database, but once that's done, keeping it updated shouldn't be a huge task.

 



In GSAK, using API v1 (aka the new API):

1. Download the BM list via API access into a new, empty database.

2. Delete all caches found in 2018.

3. Empty your online BM list.

4. Bulk upload the remaining caches to the online BM list via API access.

Actually it's a snap updating your BM list. 😉
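Step 2 is the only real filtering involved. A rough sketch of it in Python, with illustrative field names rather than GSAK's actual schema:

```python
from datetime import date

def drop_found_in_year(caches: list[dict], year: int) -> list[dict]:
    """Keep only caches whose most recent Found log is NOT in `year`."""
    keep = []
    for cache in caches:
        last_found = cache.get("last_found")  # a datetime.date or None
        if last_found is not None and last_found.year == year:
            continue  # found during the target year: drop from the list
        keep.append(cache)
    return keep
```

Never-found caches (`last_found` is `None`) are kept, since they are lonely by definition.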

 

Hans

 

NB: you may also use a different BM list view beforehand to visually sort and check the list.

https://www.geocaching.com/play/search?bm=BM4PJK4&t=1.0-5.0&d=1.0-5.0#Content

There is a GSAK macro that creates this view's URL.
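Roughly, such a macro just assembles query parameters onto the search URL. A hypothetical Python sketch, with the terrain/difficulty values copied from the link above:

```python
from urllib.parse import urlencode

def bookmark_view_url(bm_code: str) -> str:
    """Build a geocaching.com search URL scoped to one bookmark list."""
    params = {
        "bm": bm_code,      # bookmark list reference code
        "t": "1.0-5.0",     # terrain range
        "d": "1.0-5.0",     # difficulty range
    }
    return "https://www.geocaching.com/play/search?" + urlencode(params) + "#Content"
```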


I'd tackle this from a Florida GSAK database. You can create this once and then update it. A PQ splitter (which uses date ranges to divide existing caches into groups of ~990) can be found on Project-GC:

 https://project-gc.com/Tools/pqsplit
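The splitting itself amounts to chunking the sorted placed dates. A simplified sketch (a real splitter like Project-GC's also avoids cutting in the middle of a single placed date; this one ignores that edge case):

```python
from datetime import date

def split_into_pq_ranges(placed_dates: list[date], max_per_pq: int = 990):
    """Chunk placed dates into consecutive ranges of at most max_per_pq
    caches, each small enough to fit in one Pocket Query."""
    dates = sorted(placed_dates)
    ranges = []
    for start in range(0, len(dates), max_per_pq):
        chunk = dates[start:start + max_per_pq]
        ranges.append((chunk[0], chunk[-1]))  # (first placed, last placed)
    return ranges
```

Each returned pair becomes the Placed Date range for one PQ.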

As a history guy, I suggest you might want two databases, i.e., one you don't update and one you do. This allows for some comparisons over time.

 

As a matter of idle curiosity, I'd use the current search (as opposed to the old search or the beta map search), ask for caches in Florida (sans events, virtuals, webcams, and earthcaches; they're never lonely), and rank on date of last find. From the return, you can get 1,000 on a list, but that only takes you back to a last found date of 08/29/2016...

Florida, physical caches, ranked on date of last find

 

 



Thanks all. The PQ splitter is definitely helpful.

 

I guess it's time for me to break down and build an "every cache in FL" GSAK database, which will be useful not only for Lonely Caches but also for other criteria like Oldest Caches by County.


Remember, if you run the PQ through the website you are limited to 6,000 caches per 24 hours; using GSAK you can grab 22,000 in the same time period by switching between the old and new API. Instead of making three queries of 990 caches (for example), combine three rows of the splitter display and set your maximum cache return slightly above the total of the combined rows; for three rows of 990, set the maximum to around 3,000. As far as I know you can call 16,000 caches in one query, but I limit a query to around 3,000 caches, which takes right at 15 minutes to download to a database on my machine.
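The row-combining arithmetic can be sketched like this (group size and headroom here are my own habits, not fixed rules):

```python
def combine_rows(row_counts: list[int], per_group: int = 3, headroom: int = 60):
    """Group splitter rows and suggest a maximum cache return for each
    combined query: the group's total plus a little headroom."""
    groups = []
    for i in range(0, len(row_counts), per_group):
        rows = row_counts[i:i + per_group]
        groups.append((rows, sum(rows) + headroom))
    return groups
```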

5 hours ago, 31BMSG said:

Remember, if you run the PQ through the website you are limited to 6,000 caches per 24 hours

That's wrong. You are limited to 10 PQs of 1,000 caches each. That is: 10,000 caches per weekday.

NB: You may set API download date ranges as well.

 

Hans

17 minutes ago, HHL said:

That's wrong. You are limited to 10 PQs of 1,000 caches each. That is: 10,000 caches per weekday.

 

5 hours ago, 31BMSG said:

Remember, if you run the PQ through the website you are limited to 6,000 caches per 24 hours; using GSAK you can grab 22,000 in the same time period by switching between the old and new API.

 

We got GSAK for Christmas although we have been exploring it for a few months. 

I understand the 10 PQs of 1,000 caches, but yesterday I ran into an error message, something like "exceeded your allowance of 6000".

I've been reading the help notes while I wait 24 hours for another 6,000, but haven't found the answer yet.

Please could you explain old and new API?

2 hours ago, searcherdog said:

I understand the 10 PQs of 1,000 caches, but yesterday I ran into an error message, something like "exceeded your allowance of 6000".

You're mixing up the PQ feature* and the API download feature. Those are two separate ways to get caches. API downloads with an allowance are not PQs.

 

Hans

 

* https://www.geocaching.com/help/index.php?pg=kb.chapter&id=7&pgid=118

3 hours ago, HHL said:

That's wrong. You are limited to 10 PQs of 1,000 caches each. That is: 10,000 caches per weekday.

NB: You may set API download date ranges as well. 

Hans is correct; I was going from two-year-old memory that apparently failed me. I only run one PQ once a week to maintain a large DB, and that results in fewer than 100 caches a week.


A side note for those interested in lonely caches... Geooh Live for Android shows the last found date plus a description of how long ago that was. If longer than a year, the text blinks to highlight that it's a lonely cache.

 

[Screenshot: Geooh Live cache screen showing the last found date]


6 hours ago, HHL said:

You're mixing up the PQ feature* and the API download feature. Those are two separate ways to get caches. API downloads with an allowance are not PQs.

 

Hans

 

* https://www.geocaching.com/help/index.php?pg=kb.chapter&id=7&pgid=118

 

Thanks for the info on PQs, which I already understand and use. I cannot see how this answers my question, which was "Please could you explain old and new API?"

Perhaps that information will become clearer once we have worked with our own database and dates. It's only 15,000 caches, and we can split it into smaller databases to find the info we would like.

 

1 hour ago, searcherdog said:

I cannot see how this answers my question, which was "Please could you explain old and new API?"

Of course you can't. I just didn't answer that part. 😉
