
Just Created 29 Pocket Queries To Load all of So. California



I wanted all the geocaches within 100 miles of my home in So. California. So, I created 29 Pocket Queries to do that with no overlap using the date method. I just realized that it took me 2 hours. I'm glad my wife didn't catch me on the computer doing this! I guess it will be worth it once I load all (approx. 30,000) of them into GSAK by the end of the week.
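
In case anyone wants to see what the "date method" actually is: you just split the placed-date range into chunks so each PQ stays under the 1,000-cache cap. Here's a rough sketch of the idea in Python; the dates and counts are made up, not my actual queries, and in practice I just eyeballed the preview counts and nudged the boundaries by hand:

# Rough sketch of the "date method": split the caches into placed-date ranges
# so each Pocket Query stays under the 1,000-cache limit. The dates below are
# invented for illustration.

import random
from datetime import date

MAX_PER_PQ = 1000

def date_ranges(placed_dates, limit=MAX_PER_PQ):
    """Greedily group a sorted list of placed dates into ranges of at most `limit` caches."""
    ranges = []
    start = 0
    while start < len(placed_dates):
        end = min(start + limit, len(placed_dates)) - 1
        # Don't split a single placed date across two queries: back the boundary
        # up to a change of date. (A single day with more than `limit` caches
        # would have to be split by distance instead.)
        while end > start and end + 1 < len(placed_dates) and placed_dates[end] == placed_dates[end + 1]:
            end -= 1
        ranges.append((placed_dates[start], placed_dates[end]))
        start = end + 1
    return ranges

# Hypothetical example: 3,000 caches placed between 2001 and 2010.
random.seed(0)
fake_dates = sorted(date(2001 + random.randrange(10), random.randrange(1, 13), random.randrange(1, 29))
                    for _ in range(3000))
for first, last in date_ranges(fake_dates):
    print(f"PQ: placed between {first} and {last}")

You end up with a handful of non-overlapping placed-date windows; set each PQ to one window at max distance and together they cover the whole circle exactly once.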

Link to comment

Why?

x2

 

I am fairly certain you have little or no interest in at least some block of those that could have been filtered out.

 

The data will age and be near useless within just a few weeks.

 

It is unlikely that you will get them all loaded in a very useful fashion on any unit.

Link to comment

I wanted all the geocaches within 100 miles of my home in So. California. So, I created 29 Pocket Queries to do that with no overlap using the date method. I just realized that it took me 2 hours. I'm glad my wife didn't catch me on the computer doing this! I guess it will be worth it once I load all (approx. 30,000) of them into GSAK by the end of the week.

 

Could you explain to me how you got 30,000 caches out of PQ's that would give you 29,000? This might be an interesting and useful trick.

 

edit: on second thought you might not want to post that publicly since Raine might read it and fix the bug. :unsure:

Edited by jholly
Link to comment

So, I created 29 Pocket Queries to do that with no overlap using the date method. I just realized that it took me 2 hours. I'm glad my wife didn't catch me on the computer doing this! I guess it will be worth it once I load all (approx. 30,000) of them into GSAK by the end of the week.

 

Could you explain to me how you got 30,000 caches out of PQ's that would give you 29,000?

 

He did say "approximately." 29,000 is approximately 30,000.

Link to comment

I guess it will be worth it once I load all (approx. 30,000) of them into GSAK by the end of the week.

Actually it won't, unless garbage data excites you.

 

What you will have is a database of 29,000 static cache listings, in 1,000-cache blocks that are accurate only as of the day each particular PQ was run.

 

Cache listings are dynamic, meaning that they change frequently. GSAK is static, so data does not change unless it is replaced by new data.

 

Sticking data in a static database is sorta like printing it... in a dynamic world if it's printed it's probably out of date the moment it leaves the printer! Same with GSAK.

 

So you will have at the end of the week a huge database with an unknown percentage of bad data.

 

For it to be even close to accurate you would have to run all 29 PQs every day, and even if you could I expect that someone at the Lilly Pad would be explaining to you why that's a bad plan! :unsure:

Link to comment

 

For it to be even close to accurate you would have to run all 29 PQs every day, and even if you could I expect that someone at the Lilly Pad would be explaining to you why that's a bad plan! :unsure:

 

Running them weekly would qualify as close to accurate IMO. But it would be quite a bit of work to check which ones were archived over the past week unless you just automatically dumped any that weren't updated in the last round of PQs.
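
That "dump anything that wasn't updated" step really is all there is to it; this little Python sketch shows the idea (the field names and GC codes are made up, and it's not GSAK syntax, though I believe GSAK's last-GPX-update date can be filtered the same way):

# Sketch of the "dump anything not updated in the last round of PQs" idea.
# cache_db maps GC code -> record; 'last_seen' is the date the cache last
# appeared in a downloaded PQ. Field names and GC codes are illustrative only.

from datetime import date, timedelta

def cull_presumed_archived(cache_db, latest_run, grace_days=8):
    """Drop caches that did not show up in any PQ within `grace_days` of the latest run."""
    cutoff = latest_run - timedelta(days=grace_days)
    return {code: rec for code, rec in cache_db.items() if rec["last_seen"] >= cutoff}

cache_db = {
    "GC10001": {"name": "Roadside quickie", "last_seen": date(2010, 3, 20)},
    "GC10002": {"name": "Long gone",        "last_seen": date(2010, 2, 1)},   # presumably archived
}
cache_db = cull_presumed_archived(cache_db, latest_run=date(2010, 3, 21))
print(sorted(cache_db))   # ['GC10001']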

Link to comment

I guess it will be worth it once I load all (approx. 30,000) of them into GSAK by the end of the week.

Actually it won't, unless garbage data excites you.

 

I thought about doing something similar for the state of North Carolina. My motive? I was planning to identify the oldest active cache in each of the 100 counties in North Carolina.

 

I still might do this someday. But it's such a low priority I might never get around to it.

Link to comment

I guess it will be worth it once I load all (approx. 30,000) of them into GSAK by the end of the week.

Actually it won't, unless garbage data excites you.

 

I thought about doing something similar for the state of North Carolina. My motive? I was planning to identify the oldest active cache in each of the 100 counties in North Carolina.

 

I still might do this someday. But it's such a low priority I might never get around to it.

Wouldn't it make sense to run a PQ with the placed date as a qualifier to get that answer?

Run one for the max. distance from say each corner with a set date range.
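
Once those PQ results are sitting in GSAK (or anywhere else), picking the oldest cache per county is the easy part; a throwaway Python sketch, assuming you've already attached a county to each cache somehow (the sample records are invented):

# Given cache records that already carry a county and a placed date, pick the
# oldest per county.

from datetime import date

caches = [
    {"code": "GC1AAA", "county": "Wake",   "placed": date(2001, 5, 12)},
    {"code": "GC1BBB", "county": "Wake",   "placed": date(2003, 8, 2)},
    {"code": "GC1CCC", "county": "Durham", "placed": date(2002, 1, 30)},
]

oldest = {}
for c in caches:
    if c["county"] not in oldest or c["placed"] < oldest[c["county"]]["placed"]:
        oldest[c["county"]] = c

for county, c in sorted(oldest.items()):
    print(county, c["code"], c["placed"])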

Link to comment

I guess it will be worth it once I load all (approx. 30,000) of them into GSAK by the end of the week.

Actually it won't, unless garbage data excites you.

 

I thought about doing something similar for the state of North Carolina. My motive? I was planning to identify the oldest active cache in each of the 100 counties in North Carolina.

 

I still might do this someday. But it's such a low priority I might never get around to it.

Wouldn't it make sense to run a PQ with the placed date as a qualifier to get that answer?

Run one for the max. distance from say each corner with a set date range.

 

not if he's in north carolina. :unsure::lol:

Link to comment

I guess it will be worth it once I load all (approx. 30,000) of them into GSAK by the end of the week.

Actually it won't, unless garbage data excites you.

 

I thought about doing something similar for the state of North Carolina. My motive? I was planning to identify the oldest active cache in each of the 100 counties in North Carolina.

 

I still might do this someday. But it's such a low priority I might never get around to it.

Wouldn't it make sense to run a PQ with the placed date as a qualifier to get that answer?

Run one for the max. distance from say each corner with a set date range.

 

not if he's in north carolina. :unsure::lol:

 

Yeah, I was going to ask: "What's a corner?" Must be one of those square-state things.

Link to comment

I wanted all the geocaches within 100 miles of my home in So. California. So, I created 29 Pocket Queries to do that with no overlap using the date method. I just realized that it took me 2 hours. I'm glad my wife didn't catch me on the computer doing this! I guess it will be worth it once I load all (approx. 30,000) of them into GSAK by the end of the week.

:unsure: Reading the replies it seems not too many others will follow suit, but kudos to another local cacher (I live in Menifee/Sun City) and I've seen your handle on some logs.

Link to comment

Wouldn't it make sense to run a PQ with the placed date as a qualifier to get that answer?

Run one for the max. distance from say each corner with a set date range.

 

not if he's in north carolina. :lol::)

 

Yeah, I was going to ask: "What's a corner?" Must be one of those square-state things.

 

I do live in Colorado. One of those square states. :unsure:

Link to comment

Why?

Because he can.

 

Yes. I fail to see what the problem is here. GC permits 35 GPX files per week. That's one of the benefits of Premium Membership. I load all of New Jersey, plus anything else within 65 miles. Under 11,000 caches. New Caches (within 65 miles) runs on Friday. Takes me less than an hour to update the GPS and Nuvi, including deleting archived caches and 'currently unavailable' caches. I'm set to go almost anywhere I care to drive to hunt caches. Yup. I redo it every week. Maybe I have too much time on my hands? Only takes an hour. But I don't see what the problem is. GC provides its premium members with the opportunity to do this. Our Palm does not have an expansion card, so I seldom load it with more than a thousand caches. But, if necessary, the nuvi has all the information we need.

I thought this was the way all premium members operated! Boy, are youse guys missing out!

Link to comment

I guess it will be worth it once I load all (approx. 30,000) of them into GSAK by the end of the week.

Actually it won't, unless garbage data excites you.

 

What you will have is a database of 29,000 static cache listings, in 1,000-cache blocks that are accurate only as of the day each particular PQ was run.

 

Cache listings are dynamic, meaning that they change frequently. GSAK is static, so data does not change unless it is replaced by new data.

 

Sticking data in a static database is sorta like printing it... in a dynamic world if it's printed it's probably out of date the moment it leaves the printer! Same with GSAK.

 

So you will have at the end of the week a huge database with an unknown percentage of bad data.

 

For it to be even close to accurate you would have to run all 29 PQs every day, and even if you could I expect that someone at the Lilly Pad would be explaining to you why that's a bad plan! :unsure:

Why load more than the caches you are planning on hunting that day? A GPSr's cache data is static too, so it doesn't change unless it is replaced by new data.

 

Sticking data in a static unit is sorta like printing it...

 

you get the idea. The static-data/stale-data argument is valid for any use of data downloaded from GC.com - it's old as soon as it "leaves" the web page. Why do people jump all over someone who downloads data to GSAK but never say a word to someone who loads the same data (via POI) onto their GPSr? It's getting kind of old.

Link to comment

Wow, I'm off the computer for only 12 plus hours and I get a bunch of (nasty??) replies. You guys are starting to sound like my wife. Geez. Yes dear, I'll get it done right away.

__________

My goal is to keep my geocaches very current. I do plan to keep only active caches in my GSAK database by doing the following steps daily:

 

1. Grind my coffee beans right after I turn on my laptop.

 

2. Log on to the GC website and download pocket queries to the following directory on my Oregon 450: \garmin\GPX\needabeernow.

 

3. Put ground beans in the coffee pot, add water (filtered of course), then start the coffee pot.

 

4. Start up GSAK and delete all the waypoints from my database.

 

5. Have GSAK load all of the files in the \garmin\GPX\needabeernow directory.

 

6. By this time, my coffee is ready and steaming hot. I drink mine black, like my President. Actually, I've been drinking it black since my old scouting days in the Army. My 1st Sgt. used to say, "I like mine black, like my women." Note: He was an old white Cajun from the Bayou married to a beautiful black lady.

 

7. As usual, like all the projects that I start, I never finish them. Sort of like when my wife sends me to the market to get some tomatoes, and I come back with a 12-pack of Newcastles and no recollection of her ever asking me for anything.

 

For those who have gotten this far into this post: I am doing this because I am always driving all over So. California with my girls, who play club soccer. There is not one query that I could run that covers all of these areas. Some may say that I am keeping static information. But do geocaches really change that much over a one-week period? Nah! If a geocache happens to be archived before I run my next query, I'll just log it as a DNF. I just hope that my laptop doesn't blow its lid from bouncing around in my Jeep all the time.

Link to comment

I wanted all the geocaches within 100 miles of my home in So. California. So, I created 29 Pocket Queries to do that with no overlap using the date method. I just realized that it took me 2 hours. I'm glad my wife didn't catch me on the computer doing this! I guess it will be worth it once I load all (approx. 30,000) of them into GSAK by the end of the week.

:unsure: Reading the replies it seems not too many others will follow suit, but kudos to another local cacher (I live in Menifee/Sun City) and I've seen your handle on some logs.

 

Yeah, I know. I didn't mean to imply that others should do what I am doing. I wish I could just run one query and have it cover all of the different areas that we travel to every week.

 

Menifee is one of our stomping grounds! My oldest girl has played quite a few games behind the Mt. San Jacinto College campus over the past year.

Link to comment

29,000 is approximately 30,000.

 

NASA-think!

That's how we end up with satellites burning up in Mars' atmosphere, or never returning to the moon.

 

But I'm a statistician by training, and I work in media research. I have a whole bunch of software applications that will happily report a number such as "202,881 viewers of TV station WXXX drank a Diet Pepsi in the past 7 days." This irks me, because I know that the survey methodology doesn't support that level of precision. When I use such data, I round it off to something more sensible.

Link to comment

 

1. Grind my coffee beans right after I turn on my laptop.

 

 

wrong move there.....

 

cover those beans in chocolate....and eat'm on the road! bbbbbbbbbbzzzzzzzzzzzzzzztttttttttttttttttttttt :blink::D

 

Then, I could share my inner-beans with whoever is riding with me. :D

 

What was this discussion about?

 

Oh yeah, loading a lot of geocaches. I have since changed my mind, and I am not going to keep loading 29k+ geocaches into GSAK on a weekly basis. Not because I don't want to have them handy; it's just that I don't want to have to spend extra time using GSAK (not that I don't like GSAK). What I have now decided to do is use 5 pocket queries that have little overlap and include all the geocaches placed within 40 miles of my home. That covers roughly 5,000 square miles of geocaching territory! I will have the queries run once or twice a week and download them directly onto my Oregon 450. Less time on the computer leaves more time for geocaching.

Link to comment

I was really disappointed by the replies to this post. Why is it that people feel compelled to say some variation on "I personally don't need that many caches in my GPS, so you must be some kind of idiot"? I am not going to candy-coat this: you guys are out of line. OK, rant off, on with my post.

 

I travel every week over a swath of southeastern MN and eastern Iowa. I use the same date-range method so that I don't have to have overlapping circles and thus waste my PQ quota on duplicates. I actually have 2 large circles instead of one huge one, so there is a little overlap, but it works like a charm.

 

I have about 16,000 caches in my travel footprint. I have PQs set up so that the less-traveled areas get updated once a week and the more-traveled areas twice a week, to keep all that data up to date. Then I run a daily query on each of my two big circles that returns all the caches I haven't found that were placed in the last 30 days. So I get daily updates on the newest caches, twice-a-week updates on everything else along my travel routes, and weekly updates on the surrounding areas.

 

So I have pretty complete, current data in GSAK for my travel area. I then use Prospero's macro to load all 16,000 caches (21,000 points, as I include all the children) into my 60csx as POIs, *not* as waypoints. This gives me the basics on my 60: about 4 lines of data, like last4 and last find date, plus the hint. Then I also load all 16,000 caches onto my Nuvi. I have been using pilotsnipe's unbelievable macro for this, but have experimented more lately with jjreds' makeover of pilotsnipe's. It is slow; the macro can take 5-10 minutes to run, so I start it and work on other things. But it places the complete details... the cache description, hint, and all logs... on my nuvi, again as POIs, not as waypoints.
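
For anyone who doesn't want to dig into those macros: the heart of the POI trick is just reformatting the GPX into the simple longitude,latitude,name,comment CSV that (as far as I know) Garmin's POI Loader will accept. A bare-bones Python sketch, nowhere near as capable as pilotsnipe's macro, and with made-up file names:

# Bare-bones GPX -> Garmin POI Loader CSV (longitude,latitude,name,comment).
# Only a rough approximation of what the GSAK macros do.

import csv
import xml.etree.ElementTree as ET

def local(tag):
    """Strip the XML namespace so it doesn't matter which GPX version the PQ uses."""
    return tag.rsplit('}', 1)[-1]

def gpx_to_poi_csv(gpx_path, csv_path):
    root = ET.parse(gpx_path).getroot()
    with open(csv_path, "w", newline="", encoding="utf-8") as out:
        writer = csv.writer(out)
        for wpt in root:
            if local(wpt.tag) != "wpt":
                continue
            name = desc = ""
            for child in wpt:
                if local(child.tag) == "name":
                    name = child.text or ""
                elif local(child.tag) == "desc":
                    desc = child.text or ""
            writer.writerow([wpt.get("lon"), wpt.get("lat"), name, desc])

# Example with made-up file names:
# gpx_to_poi_csv("1234567.gpx", "geocaches.csv")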

 

And finally, I have a filter that identifies archived caches that I haven't found and it culls those out of my data. This all sounds like a lot of work, but I am investing 5-10 minutes a day in return for the 100+ spontaneous caches I find every month with little or no planning.

Link to comment
Or get an iPhone or Android-type device and do it all on the road with no local database at all.

 

 

I do this... I run "Trimble Navigator" on my BlackBerry and I love it... But that only works where there's coverage, and I am more of an out-in-the-woods kinda cacher. I do use it frequently, but it doesn't at all replace the data, and the map presentation, of my nuvi with all my caches loaded on it.

Link to comment

Yeah, I know. I didn't mean to imply that others should do what I am doing. I wish I could just run one query and have it cover all of the different areas that we travel to every week.

I'm with you. I run 7 PQs a week (which oddly produce 7,523 caches?) which cover pretty much the whole of SW England (up to Scotland for the oldest-placed caches) - and, erm, a bit of France too.

 

I import them to GSAK every week, just overwriting the data each time.

Then a quick tomtom macro export and my tomtom is loaded with every cache in Devon and Cornwall I am ever likely to drive to/past. It also shows which are disabled, which I have already found (in case I'm out with friends who haven't found the ones I have found), what cache type they are etc etc.

 

I can then export a GPX file from GSAK for a day's caching and pop it onto my phone.

If I travel out of my planned area on any day, I have full details of any caches I might want to look at to hand, on the laptop in the van.

 

I also generate a weekly Garmin POI export from GSAK to load into Tinker's Garmin GPS, so that's all up to date too. So wherever we decide to go caching on a Sunday, 5 mins down the road or the other side of the county, the Garmin has all the caches in it we are likely to encounter.

The Garmin has the location and hint, my phone/PDA has the full cache details, pictures etc and up to date logs.

 

The data is never more than 48 hours old as the PQs run on a Friday and Saturday and our usual caching day is a Sunday.

 

The whole thing takes about 10-15 minutes each week, as you say, over a cuppa (although tea in my case):

- download/save PQs

- import each into GSAK while the next downloads

- click the tomtom macro button

- copy the files to the tomtom SD card

- click the Garmin POI macro button

- copy the file across to the GPS

 

Do I need over 7500 caches?

Do I need to cover such a large area?

Nope.

 

But using the date placed method of constructing the PQs (only recently pointed out to me), you might as well set the PQ to run to max distance and get the full 1000 caches.

 

Personally I probably wouldn't run 29,000 caches, but then I might not travel as far as you :lol:

 

So I can fully understand why you would do it and how you would easily make sure the data is kept up to date enough for all practical purposes.

 

We are obviously in a minority.

 

Maybe some of it is down to the fact that we can, so we do :)

Link to comment
