
GPX for an Entire State


mlgnight

Recommended Posts

I've been a premium member since September, and very much appreciate the ability to get pocket queries and download the GPX files to use with the GSAK program.

 

What I'd love, though, is to be able to get a GPX file for my entire state (Wisconsin). Otherwise, if I'm traveling around on vacation, I have to figure out how to get multiple queries that effectively cover the area(s) I will be in. Perhaps we could suggest that a special query, breaking the limit of 500 data points, be allowed to run, say, only once a month (or week) and/or in non-peak hours.

Link to comment

I can understand the demands that large searches place on database engines. Maybe the site could run the 50 searches for the states (plus 13 for the Canadian provinces and territories, etc.) and just put them up as downloadable links. Updated once daily, the downloads could be there for the taking for anyone who needs them at any given time.

 

I'd take 5 states, put them into GSAK, then plot an arc along my vacation route. The query would then be burning up CPU cycles on my machine, not Geocaching.com.

Link to comment

I can understand the demands that large searches place on database engines. Maybe the site could run the 50 searches for the states (plus 13 for the Canadian provinces and territories, etc.) and just put them up as downloadable links. Updated once daily, the downloads could be there for the taking for anyone who needs them at any given time.

 

I'd take 5 states, put them into GSAK, then plot an arc along my vacation route. The query would then be burning up CPU cycles on my machine, not Geocaching.com.

 

Great idea!

 

Cached common queries.

Link to comment

I can understand the demands that large searches place on database engines. Maybe the site could run the 50 searches for the states (plus 13 for the Canadian provinces and territories, etc.) and just put them up as downloadable links. Updated once daily, the downloads could be there for the taking for anyone who needs them at any given time.

 

I'd take 5 states, put them into GSAK, then plot an arc along my vacation route. The query would then be burning up CPU cycles on my machine, not Geocaching.com.

 

I love the collaboration, and think this would be a GREAT idea to accomplish what I'm thirsting for!

Link to comment

If this is such a good idea, then my rule of "All Good Ideas Have Been Markwelled" means this thread will soon be trounced upon by the Markwellians.

Happy to oblige, I'd hate to disappoint! :) This has been suggested many times before and never acted upon.

 

Perhaps the request is reasonable for a place like West Virginia (707 active caches) or even the OP's home state of Wisconsin (3,645). But what about Texas (11,648) or Germany (16,071) or California (25,508)? That is a lot of data to jam down the pipe, even if it's once a week.

 

The OP has found 60-something caches. Are you worried about burning through the 3,580 other Wisconsin caches before you can update your pocket queries? If so, set up the 8 pocket queries necessary to build your offline database. Run them 4 per day, saving one for special emergencies, and you'll have data that's no more than a few days old (assuming they always run).

 

Me, I keep a few queries active for my home area, and if I'm thinking of traveling, I will set up a few for my destination and for along the way. Even on a few days notice, I am all set to go. And those queries filter out my finds, caches that are disabled, and caches of a type I don't like.

 

I would much rather see pocket query programming resources devoted to solving the "Caches along a route" challenge at the database level -- a targeted solution.

Link to comment

If this is such a good idea, then my rule of "All Good Ideas Have Been Markwelled" means this thread will soon be trounced upon by the Markwellians.

Happy to oblige, I'd hate to disappoint!

:D:):laughing::laughing::lol:B)B)B):D:)B):)

 

Tonight's forecast: rain.

 

Tomorrow morning my wife WILL be in the mood.

 

My kid just made an "A" on her spelling test.

 

I'm writing down tomorrow night's winning lottery numbers as we speak.

 

I'll live a long happy life and die in a freak accident involving the Saintsations cheerleaders and a hot tub.

Link to comment

Happy to oblige, I'd hate to disappoint! :) This has been suggested many times before and never acted upon.

 

Perhaps the request is reasonable for a place like West Virginia (707 active caches) or even the OP's home state of Wisconsin (3,645). But what about Texas (11,648) or Germany (16,071) or California (25,508)? That is a lot of data to jam down the pipe, even if it's once a week.

So make it once per month then.

Link to comment

The problem is that any time someone has HUGE chunks of data, they are working with a stale database. PQs were not designed to make a long-term offline database. They were designed to have people work offline with some filtered data and then come back to the site to confirm that what they were searching for was still there.

 

Stale data is a real problem. If a land manager wants a cache removed from listing because the cache is right next to the home of a colony of Myotis grisescens, it needs to be removed NOW. If you've got a PQ from a month ago and go out searching, you might be searching for a cache that a) isn't there and b) might get you in a lot of trouble for searching.

 

If you're going to be traveling around the state, take a look at the different methods of finding caches along a route in the sticky thread at the top. The Google Earth/Bookmark List method is great for situations like "My mother-in-law wants to go to the new store in Eau Claire - what's around there?"

Link to comment
... PQs were not designed to make a long-term offline database. They were designed to have people work offline with some filtered data and then come back to the site to confirm that what they were searching for was still there.

Hi Markwell, do you have a link to a Groundspeak policy document where this is stated as the design criteria for PQs?

Link to comment

The problem is that any time someone has HUGE chunks of data, they are working with a stale database. PQs were not designed to make a long-term offline database.

 

They may not be designed for creating a database, but they ARE used for that purpose. There are lots of reasons why a local database is useful. Travelling salespersons, for example, might well not know what part of the country they are heading to tomorrow. As I understand it, PQ's can take a while to come through, so you can't even use them for creating a last-minute database to take travelling.

 

If they had a largely up-to-date database, it wouldn't take long to get it correct using a synchronization process. Under the current system there isn't an easy way to keep it up to date, so the sad truth is some people don't make much of an effort.

 

Now if there was a lightweight mechanism for receiving updates to the data - the most important being archiving, etc. - then the database wouldn't be such a problem and it would even take strain off the main servers.
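
For what it's worth, here is a minimal Python sketch of what the client side of such an update mechanism could look like. It is purely hypothetical - no such feed exists on the site - and the table and column names are invented for illustration:

```python
# Hypothetical: applies a lightweight "what changed" feed (cache code plus new
# status) to a local offline database, instead of re-downloading full PQs.
# The "caches" table and its columns are invented for this sketch.
import sqlite3

def apply_updates(db_path, updates):
    """Apply (gc_code, status) changes, e.g. [("GC1234", "archived")]."""
    conn = sqlite3.connect(db_path)
    with conn:  # commits on success, rolls back on error
        conn.executemany(
            "UPDATE caches SET status = ? WHERE gc_code = ?",
            [(status, code) for code, status in updates],
        )
    conn.close()
```

A daily feed of just the archived and disabled cache codes would be a few kilobytes, versus megabytes for the equivalent full queries.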

Link to comment
... PQs were not designed to make a long-term offline database. They were designed to have people work offline with some filtered data and then come back to the site to confirm that what they were searching for was still there.

Hi Markwell, do you have a link to a Groundspeak policy document where this is stated as the design criteria for PQs?

 

Since the search is working, run a search on "state" and "PQs" in posts by Jeremy; you will find it.

 

Also remember this bit from the Terms of Use:

Groundspeak may also impose limits on certain features offered on the Site with or without notice.
Link to comment
Stale data is a real problem.

 

Not really.

 

Jeremy doesn't mind either:

  • You have stale data or ...
  • You chew up massive amounts of bandwidth.

This is highlighted by the refusal to allow downloads of archived caches, even ones only 7 days old, so that offline databases can keep their data current. The alternative is to continually download full databases, using a tremendous amount of resources.

 

There are perfectly valid reasons for many of us to keep offline databases. We want to keep them current and we want to keep the loads light on the gc.com servers, but we're not the ones saying, "no."

 

Besides, if stale data were such a big problem then PQs would be free, so folks wouldn't be working from paper binders. Now there's stale data.

Link to comment

[Image: ColoradoCaches.jpg]

 

It takes me 5 PQ's to get all caches in Colorado. I usually download it once a month. Sometimes caches go missing, and even having real-time data wouldn't allow me to find every cache.

 

To set this up, I made the first PQ get all active caches from January 1, 2000 to whatever date after that got the query to return about 490 caches. That way, if a cache was made active later, it wouldn't go over the limit of 500 per query. The second PQ is set for the day after the first one ends, and so forth. I go back and update those PQ's every 6 months or so as caches are archived and new ones are published.

It really isn't all that hard to set up, and once it's done it only requires minimal work to keep up to date. It uses 5 of my PQ's, leaving me more than enough (30) for the rest of the week. Even if I'm going on vacation I won't use 5 PQ's a day for a whole week, and this set of 5 is only run once a month. If you're in a state like California you might, but do you really need the entire state even once a month? I don't know that many people that commute from San Diego to Sacramento :anicute:
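
For anyone scripting the same setup, here is a minimal Python sketch of the date-splitting logic. It assumes you already have the placement dates on hand (say, exported from an existing GSAK database); the PQ page itself won't compute this for you:

```python
# Split a sorted list of cache placement dates into PQ date ranges of at most
# ~490 caches each, keeping all caches placed on the same day in one range
# (the headroom under the 500 limit absorbs the occasional overrun).
from datetime import date, timedelta

def split_date_ranges(placed_dates, limit=490):
    """Return (start, end) placement-date ranges for a set of PQs."""
    dates = sorted(placed_dates)
    ranges = []
    start = date(2000, 1, 1)  # the first PQ starts at January 1, 2000
    while dates:
        cut = min(limit, len(dates))
        # don't split a single day's caches across two ranges
        while cut < len(dates) and dates[cut] == dates[cut - 1]:
            cut += 1
        chunk, dates = dates[:cut], dates[cut:]
        ranges.append((start, chunk[-1]))
        # the next PQ is set for the day after the previous one ends
        start = chunk[-1] + timedelta(days=1)
    return ranges
```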

Link to comment

It took me 11 "time span" queries to get all the caches for Wisconsin.

 

While I understand that you don't want to go around with "stale" information, and I understand that the execution time can be an issue, I still think that a "state" query breaking the 500-cache limit, restricted in how often it can run (like the new "My Finds" query) and/or what time of day it gets processed, would be a real plus. Otherwise, here I am scheduling 11 queries to run at staggered times throughout the week. Not to mention all the processing time people spend trial-running queries to figure out what time spans work without hitting the 500-cache limit. This seems like it would be more of a resource drain.

 

But, I've said what I wanted. And made the suggestions that I have to offer.

 

Enjoy your caching!

Link to comment

It took me 11 "time span" queries to get all the caches for Wisconsin.

 

While I understand that you don't want to go around with "stale" information, and I understand that the execution time can be an issue, I still think that a "state" query breaking the 500-cache limit, restricted in how often it can run (like the new "My Finds" query) and/or what time of day it gets processed, would be a real plus. Otherwise, here I am scheduling 11 queries to run at staggered times throughout the week. Not to mention all the processing time people spend trial-running queries to figure out what time spans work without hitting the 500-cache limit. This seems like it would be more of a resource drain.

 

But, I've said what I wanted. And made the suggestions that I have to offer.

 

Enjoy your caching!

I wasn't trying to say your idea wasn't good, I was just giving you a work-around to get the same data. I actually like the idea of getting an entire state in one file, perhaps once a month for your home state. Maybe you could get two or three of these a month if you live in one of those little states back east ;) but for us westerners just the one would work fine.

Link to comment

Since I live in the NE corner of Georgia, I am far more interested in geocaches inside a 50-mile circle than in a list of all Georgia geocaches. If there were a geocache in the SW corner of Georgia, it would be about 250 miles away. So a GPX of my home state of GA would not be nearly as useful as getting some sort of archive notice for geocaches inside that 50-mile circle.
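
Trimming a statewide GPX down to a home circle is easy to script, for what it's worth. A minimal Python sketch, assuming a plain (lat, lon, name) list rather than any particular export format, with made-up home coordinates:

```python
# Filter a cache list down to those within a fixed radius of home, using the
# haversine great-circle distance formula.
from math import radians, sin, cos, asin, sqrt

def miles_between(lat1, lon1, lat2, lon2):
    """Great-circle distance in statute miles."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 3959 * asin(sqrt(a))  # mean Earth radius is about 3,959 miles

HOME = (34.6, -83.4)  # placeholder NE Georgia coordinates, for illustration

def within_circle(caches, radius=50):
    """caches is a list of (lat, lon, name) tuples; keep those inside the circle."""
    return [c for c in caches if miles_between(HOME[0], HOME[1], c[0], c[1]) <= radius]
```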

Link to comment

So a GPX of my home state of GA would not be nearly as useful as...

 

But you can't "can" a query based around you. A query based on a state is easily canned and very useful to a lot of people.

 

As for the argument about massive numbers of caches like in CA, states can easily be broken into regions.

 

Canned queries are very work-able, useful, and would be a benefit to us all.

 

...well, except Groundspeak apparently.

Link to comment

But you can't "can" a query based around you. A query based on a state is easily canned and very useful to a lot of people.

I understand that. So if there were canned state queries, I would want GA, NC and SC, not because I really want all of those geocaches, but because I could then filter them to the ones within 50 or so miles of me. But if Team GPSaxophone's idea of
I actually like the idea of getting an entire state in one file, perhaps once a month for your home state
was implemented, I wouldn't be able to get NC and SC. However, if your original idea of having the "Changed in the last 7 days" PQ include caches archived in the last 7 days were implemented, then I would be OK.

 

Of course Jeremy has said no to both ideas, so I'll stick with getting all geocaches within 50 miles and checking the un-updated geocaches to find the archived ones. That way, anytime I decide to head out, I can use GSAK to filter by route to get a custom list in the direction I am heading.

Link to comment

I have presented an idea in the past for pocket queries that I still think would work and also meet Jeremy's objection to getting a whole state.

 

If every major city had a generic pocket query returning all caches within 75 miles of the center of the city, I think you could accommodate a large number of cachers without allowing someone to build an alternate site database.

 

For instance, in Kansas: a query each for Wichita, Salina, Topeka and Kansas City.

For Missouri: Kansas City, Springfield, Columbia, St. Joseph and St. Louis.

 

My guess would be that this would cover 85 percent of the cachers in Missouri and Kansas.

Edited by webscouter.
Link to comment

I wasn't trying to say your idea wasn't good, I was just giving you a work-around to get the same data.

Don't get me wrong, I valued the suggested work-around and put it to use to accomplish what I was trying to do. The message of "there's still got to be a better way both for me and the system" was more directed at those in charge.

 

Again, thanks!

Edited by mlgnight
Link to comment

Would it be feasible to work out a set of PQ's nicely covering a certain province, and then sharing this PQ-set publicly?

 

In this case, all cachers wanting to go find caches in that province can make use of the same pre-defined PQs.

 

:ph34r: Other than violating the terms of use of the site :mellow: ... nope nothing wrong with that :ph34r:

Link to comment

I would think that most people have their PQs set to filter out their own finds. So any pre-packaged GPX of an entire city or state will not meet their needs.

 

GSAK interrogates the logs for those written by you and adjusts accordingly. In fact, one way I can "ignore" a cache that I've found but not logged on gc.com is by writing my own "Found It" log. GSAK picks it up beautifully and marks it as found.

 

Just set a canned query's caches to not found and you're all set. Users keep a copy of their found caches on hand and everything will work out.
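
For the curious, the log-scanning idea is straightforward to reproduce outside GSAK. A rough Python sketch that reads a PQ's GPX directly; the Groundspeak extension namespace and log field names are recalled from the GPX 1.0 PQ format and should be treated as assumptions:

```python
# Collect the names of caches that have a "Found it" log by a given user,
# reading the Groundspeak <log> extensions inside each GPX waypoint.
import xml.etree.ElementTree as ET

GPX = "{http://www.topografix.com/GPX/1/0}"
GS = "{http://www.groundspeak.com/cache/1/0}"  # assumed extension namespace

def found_by(gpx_path, username):
    """Return the waypoint names with a 'Found it' log by `username`."""
    found = set()
    for wpt in ET.parse(gpx_path).getroot().iter(GPX + "wpt"):
        for log in wpt.iter(GS + "log"):
            if (log.findtext(GS + "type", "") == "Found it"
                    and log.findtext(GS + "finder", "") == username):
                found.add(wpt.findtext(GPX + "name", ""))
    return found
```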

Link to comment

Having my own personal local database is very useful: I maintain a GSAK database of all caches within 60 miles of my home; it currently takes 13 PQ's to generate it, and I run them once a week. If I could create a "super query" to run once a week that was not limited by number of waypoints but had a lower radius - say 60, 75, or 100 miles rather than the 500-mile radius for a normal query - that would save me a lot of time. But that does not fix the issue raised by the OP.

 

What the OP really needs is a PQ that gets all the caches along a route. You could plug in a series of coordinates and get all caches within, say, 10 miles of that route, up to the 500 max for a PQ. You can't really load more than that into your GPS anyway, so getting the whole state is a bit wasteful.
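
Until something like that exists server-side, the corridor filter is easy to approximate locally. A minimal Python sketch, assuming the route is given as a reasonably dense list of (lat, lon) points; real tools (GSAK's arc/poly filter, for one) measure distance to the route's segments rather than just its vertices:

```python
# Keep only the caches within `corridor` miles of some point on the route,
# using the haversine great-circle distance.
from math import radians, sin, cos, asin, sqrt

def miles_between(lat1, lon1, lat2, lon2):
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 3959 * asin(sqrt(a))

def along_route(caches, route, corridor=10):
    """caches: [(lat, lon, name)]; route: [(lat, lon)] -> caches near the route."""
    return [c for c in caches
            if any(miles_between(c[0], c[1], r[0], r[1]) <= corridor for r in route)]
```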

 

Last time I did something similar (I took a road trip from the SF Bay Area to Yellowstone and back), I created a series of overlapping PQ's to cover the route I was going to take (so many that I got a second premium membership for a month so I could run them all), then used a combination of MapSource and GSAK to identify the coordinates for a route and filter all waypoints within 5 miles of it for what I expected to drive that day. It was a lot of work.

Edited by Blanston12
Link to comment

Would it be feasible to work out a set of PQ's nicely covering a certain province, and then sharing this PQ-set publicly?

 

In this case, all cachers wanting to go find caches in that province can make use of the same pre-defined PQs.

 

If you know a cacher who has already done the work and she/he is willing to share, you can run the same PQ's yourself.

 

All you need to do is have them send you the URL of their pocket query setup page. If you open that URL, it will open the query generation page with the same parameters set for you. Then you save it as a pocket query for yourself and voilà! Your query is set up.

 

To try it out, you can use my query for the nearest unfound caches of difficulty 1.5 and up, centered on Liberty, Missouri:

http://www.geocaching.com/pocket/gcquery.a...c3-babb48b9c9e7

 

Just click on the link and you should get the same query that I run.

Link to comment

Would it be feasible to work out a set of PQ's nicely covering a certain province, and then sharing this PQ-set publicly?

 

In this case, all cachers wanting to go find caches in that province can make use of the same pre-defined PQs.

 

:ph34r: Other than violating the terms of use of the site :mellow: ... nope nothing wrong with that :ph34r:

 

Sharing the GPX results would violate the terms of use. Sharing the criteria used to get a GPX is not (in my uninformed opinion) a violation, or Jeremy would not have made the ability to share a URL for the PQ available.

Link to comment

I also don't really have a problem with the whole-state or whole-city thing. I only have a problem with the "once-a-month" aspect.

 

But since Jeremy has said No...

 

Oh and BTW...

Hi Markwell, do you have a link to a Groundspeak policy document where this is stated as the design criteria for PQs?

 

I love being able to search the forums again

 

Just adjust your searches. Remember, this was designed to help you go geocaching, not build yourself an offline database of caches.

 

I think 20mb in Pocket Queries a day is sufficient, thanks.

Link to comment

What the OP really needs is a PQ that gets all the caches along a route. You could plug in a series of coordinates and get all caches within, say, 10 miles of that route, up to the 500 max for a PQ. You can't really load more than that into your GPS anyway, so getting the whole state is a bit wasteful.

My eTrex Legend holds 1000 waypoints, thank you very much.

 

The wife's eXplorist 500 holds quite a bit more than that on a 128MB SD card

Edited by Team GPSaxophone
Link to comment
I love being able to search the forums again

 

Just adjust your searches. Remember, this was designed to help you go geocaching, not build yourself an offline database of caches.

 

...and I think it's awfully short-sighted to think there is only one way to prepare for an outing.

 

Our way does make it easier for us to go geocaching. We keep a complete database of the area we're likely to go to at a moment's notice. In fact, there is no piddling with PQs. No adjusting coords, radii, and placement dates. We just go.

Link to comment

You know, occasionally, I've been known to go to other parts of the country without much notice. It would be really helpful if I could get a PQ of the entire country, just in case. I realize that this data could get old, so could I have a new PQ sent to me daily? It might be too large to email, but you could just burn a DVD and overnight it to me.

 

While I'm thinking about it, I've been known to go to Canada on occasion. Therefore, can you make that daily PQ for the continent, instead of just the country? That would really help me out. Also, if you could include all archived caches in a separate file, that would be good.

 

Sending me the data this way will make it easier for me to go geocaching. I keep a complete database of the continent I'm likely to go to at a moment's notice. In fact, there is no piddling with PQs. No adjusting coords, radii, and placement dates. I just go.

 

...and I think it's awfully short-sighted to think there is only one way to prepare for an outing.

 

;)

Link to comment
This topic is now closed to further replies.