
UT Topo Mapset almost done - help me pick next state...


IndyJpr


Indy,

 

That was a good overall view... At the contouring stage, have you found it easier to process the whole state as one file or to tile it? And to save it as one or in a tiled format? I imagine tiling could cause some contour alignment issues at tile boundaries unless sufficient overlap was included before clipping. It's also interesting that the Colorado 400t now comes with its topo as one laaaaaarge img file... no tiling at all

 

I might also try processing some land ownership shading as I have that data also.

 

I like the idea of these maps being free... after all we have already paid for the source data with our taxes... :-)

 

Thanks...

Link to comment
At the contouring stage, have you found it easier to process the whole state as one file or to tile it? And to save it as one or in a tiled format? I imagine tiling could cause some contour alignment issues at tile boundaries unless sufficient overlap was included before clipping.
Hi pasayten_pete,

 

I use GlobalMapper (GM) and several of its features to generate contours. I'll describe the process I use and the corresponding GM terminology but I'm guessing most GIS applications have equivalent functionality.

 

- I add all of the NED DEM files into a GM catalog. The catalog is simply an index of all the data files and can be configured as to what zoom levels the data appears at. The catalog can then be loaded into GM and manipulated or utilized like any other data layer. The key feature is that GM will only load into memory those tiles of the catalog that are needed for the current view or process.
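That catalog behavior - loading only the tiles that overlap the area being worked on - amounts to a simple bounding-box filter. Here's a stdlib Python sketch of the idea; the tile names and bounds are invented for illustration and are not GM's actual index format:

```python
# Hypothetical sketch of a DEM tile "catalog": index tiles by bounding
# box and load only those intersecting the current view or process area.

def intersects(a, b):
    """True if two (west, south, east, north) boxes overlap."""
    aw, as_, ae, an = a
    bw, bs, be, bn = b
    return aw < be and bw < ae and as_ < bn and bs < an

# Catalog: tile name -> (west, south, east, north) in degrees (made up)
catalog = {
    "ned_111w_40n":  (-111.0, 40.0, -110.0, 40.5),
    "ned_111w_405n": (-111.0, 40.5, -110.0, 41.0),
    "ned_112w_40n":  (-112.0, 40.0, -111.0, 40.5),
}

def tiles_for_view(view):
    """Return only the catalog tiles needed for the current view."""
    return sorted(name for name, box in catalog.items() if intersects(box, view))

# A view covering part of one tile should load just that tile:
print(tiles_for_view((-110.7, 40.1, -110.5, 40.3)))  # ['ned_111w_40n']
```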

 

- I use GM to generate the contours into shapefiles for the entire state. I have GM grid the output back into 100K tiles and also compress the shapefiles into a zip-like archive.

 

I've checked the seams between tiles on various occasions and never noticed any issues.

Link to comment
Okay, so I only seem to have 45 of the 48 NHD I requested and I don't know how to figure out which ones I got and which ones I didn't. :anibad: I have the Reference Polygon ID's of the ones I got, but I have no idea how that refers to a name. Plus, one came back as ID = 0, and that file was way bigger than the others (3 - 4 times bigger). So, I'm pretty much confused about this. Was my problem that I queued up multiple requests, where I should have just submitted one, waited for it to come back, saved it and moved on to the next?

So far I haven't found any reference with the NHD zip files that indicates the quad...the way I check is to load them up into GM and see where the gaps are. They may trickle in later - I wouldn't worry about them for now.

Okay, I found that CELL ID on the Extract Selection window is the same as the Reference Polygon ID in the email. So now I can tell which ones I've received and which ones I didn't.

 

--Marky

Link to comment

Marky,

 

I mailed my California data to IndyJpr today.

 

In addition to the 25 NED quads that you are aware of, I also included about 25 NHD quads.

 

Unless you have grabbed all of the NHD we are probably still short on the NHD data.

 

Go HERE to see the data I sent to IndyJpr

 

I am prepared to grab additional NHD quads, but I just don't know which NHD quads you have already downloaded.

 

What do you suggest?

Edit: I can't tell what you sent from that list in the link. The NHD filenames are arbitrarily generated. Unless you tell me the Reference Polygon ID of those files, I have no idea what they are.

 

I started with the top left corner (Crescent City) and did the first ten rows, ending in Goldfield.

 

Did you go by rows?

 

--Marky

Edited by Marky
Link to comment

Here's what I did so far:

Crescent City 67355

Happy Camp 67631

Yreka 68629

Tulelake 72205

Cedarville 67258

Orick 68098

Hoopa 67681

Mount Shasta 68008

McArthur 67923

Alturas 67034

Eureka 67492

Hayfork 67650

Redding 68220

Burney 67203

Eagle Lake 67448

Cape Mendocino 67232

Garberville 67563

Red Bluff 68218

Lake Almanor 67790

Susanville 68427

West of Covelo 0

Covelo 67348

Willows 72211

Chico 67288

Portola 68184

Ukiah 68506

Lakeport 67806

Yuba City 68630

Truckee 76841

Point Arena 68167

Healdsburg 67656

Sacramento 76836

Placerville 68159

Smith Valley 68380

Bodega Bay 67159

Napa 68027

Lodi 67863

San Andreas 68299

Bridgeport 67182

Excelsior Mountains 72157

Farallon Islands 67505

San Francisco 68305

Stockton 72199

Oakdale 68071

Yosemite Valley 68627

Benton Range 76779

Goldfield 67584

Link to comment
So far I haven't found any reference with the NHD zip files that indicates the quad...the way I check is to load them up into GM and see where the gaps are. They may trickle in later - I wouldn't worry about them for now.
The name of the quad or subbasin was in the notification email - in Nov, 2007.
Show me where the name is in this email response from the system:

From: NHDAutoEmailer@usgs.gov

Subject: NHD data available

Date: February 26, 2008 5:42:35 AM PST

 

Your NHD data request has been processed and is available at:

ftp://nhdftp.usgs.gov/NHD#######.zip

 

Requester ................ > Marky

Request Type ............. > ShapeFile Extract

Request Date ............. > 26-FEB-2008

Resolution ............... > High

Reference Layer Name ..... > Index 100K

Reference Polygon Id(s)... > 76779,

 

Please Note:

If you are having problems connecting to the FTP site,

you may need to disable you passive FTP setting in Internet Explorer.

To change this setting click Tools -> Options -> Advanced and scroll down

about half way until you find a checkbox for 'Use Passive FTP...'.

If you uncheck this box, you should be able to get to the FTP site.

 

This file will be removed from the system in 14 days.

 

National Hydrography Dataset Program

United States Geological Survey

http://nhd.usgs.gov

As I posted above, the Reference Polygon Id matches the Cell Id in the extract request window, so with a little work you can figure out which ones you have and haven't received. If I do this next time, I'm copying the info from the extract request window into a spreadsheet and then marking off the ones I receive back, so I know which ones to re-request.
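That spreadsheet bookkeeping boils down to a set difference. A quick Python sketch - the helper name is made up, and the IDs are just examples pulled from the quad list earlier in the thread:

```python
# Compare the Cell Ids you requested (copied from the extract request
# window) against the Reference Polygon Ids that arrived by email.

requested = {67355, 67631, 68629, 72205, 67258}   # from the spreadsheet
received  = {67355, 68629, 67258}                 # from the emails so far

def still_missing(requested, received):
    """Quads requested but not yet delivered - re-request these."""
    return sorted(requested - received)

print(still_missing(requested, received))  # [67631, 72205]
```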

 

--Marky

Link to comment

Here's what I did so far:

Crescent City 67355

Happy Camp 67631

SNIP

 

Ok, I started at San Francisco and worked my way down two or three rows. I will attempt to get the remainder of CA south of the Bay Area today.

Sounds good. Let me know if you need any help. Otherwise, I will assume you've got it covered. :anibad:

 

--Marky

Link to comment
Ok, I started at San Francisco and worked my way down two or three rows. I will attempt to get the remainder of CA south of the Bay Area today.

IndyJpr, should I send you everything I downloaded, or should I just send you north of SF, since Barrikady already sent you from S.F. south? The former is much easier than the latter, since I would have to go through all the emails and figure out which files are S.F. south. Since some of the first files I requested were some of the last to arrive, I can't tell by file date/time.

 

--Marky

Link to comment

Show me where the name is in this email response from the system:

 

Marky,

Sorry, I should have checked. Wanted to get the info to you in case you were doing a direct ftp from the site and not using the emails. Now that I see it, I remember for the 100K's I opened each in GlobalMapper and renamed the file.

Link to comment
IndyJpr, should I send you everything I downloaded, or should I just send you north of SF, since Barrikady already sent you from S.F. south? The former is much easier than the latter, since I would have to go through all the emails and figure out which files are S.F. south. Since some of the first files I requested were some of the last to arrive, I can't tell by file date/time.

You can send it all - I should be able to sort it out pretty easily.

 

How do four large chunky DEM files for MT sound? I mean, can you work with DEM files that big or should I break them up even further? Each of the four files is 5.7GB. Workable or no?

Are these DEM or shapefiles (contours)? Shapefiles shouldn't be a problem. If it's DEM I can give it a try - I've never tried a file that big.

Link to comment
IndyJpr, should I send you everything I downloaded, or should I just send you north of SF, since Barrikady already sent you from S.F. south? The former is much easier than the latter, since I would have to go through all the emails and figure out which files are S.F. south. Since some of the first files I requested were some of the last to arrive, I can't tell by file date/time.

You can send it all - I should be able to sort it out pretty easily.

Okay, I'll start writing all I have to DVD.

 

--Marky

Link to comment

All of the California NHD data and California NED data is now complete.

 

I grabbed 84 NHD quad ftp links today, some are duplicate files, but I wanted to make sure we give IndyJpr all of the necessary data.

 

I still have 64 quads to download from the government ftp server, but dealing with that finicky USGS NHD Geodatabase web site should be finished.

 

IndyJpr, I want to thank you again for your generous offer to process California and the other western states. I know that your wonderful topo maps will be well used by many grateful GPS users.

 

I will send you the data I collected today no later than Thursday. If any of the data is corrupt or missing please let me or Marky know, and we will grab the necessary files.

 

Marky, I also want to thank you for the work you put into this project. Without your fast Internet connection and your perseverance, the project would be less than 25% complete.

 

Barrikady

Edited by Barrikady
Link to comment

Hi Oz,

 

I have a 100K index of the US (got it somewhere in Google land); I simply selected all the quads in a state, saved the subset to a shapefile, and then opened the corresponding dbf file within my favorite spreadsheet application. Here's one for AZ:

 

http://www.miscjunk.org/forum/ned/az_quads.xls

 

Hope that helps,

I could have done that. I had the 100k quad already; just how to make it work in Excel was the question.

 

In addition, I figured out a way to speed up getting the data from USGS seamless. I added two columns for tracking what I had gotten and then another one that creates the urls to immediately load up the four downloads. So I am given: http://extract.cr.usgs.gov/Website/distreq...&PL=ND301HZ,

for each quad automatically.

 

The formula for the cell is:

=CONCATENATE("http://extract.cr.usgs.gov/Website/distreq/RequestSummary.jsp?AL=",D11,",",G11,",",F11,",",E11,"&PL=ND301HZ,")

and it will work in any of your spreadsheets. Just put it in one cell then drag it down for everything. It doesn't speed things up a ton but it does get rid of some of the annoyance. Now if only I could figure out how seamless requests the download then I could write a script to get all the files for me :-D
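The same URL construction is easy in a script, too. Here's a Python sketch mirroring the CONCATENATE formula above - note it's an assumption on my part that columns D, G, F and E hold the quad's bounding coordinates; the function name is made up:

```python
# Python equivalent of the spreadsheet formula: build a Seamless
# request-summary URL from the four per-quad values (presumably the
# quad's bounding coordinates - that column mapping is an assumption).

BASE = "http://extract.cr.usgs.gov/Website/distreq/RequestSummary.jsp"

def request_url(d, g, f, e):
    """Mirror of =CONCATENATE(...,D11,",",G11,",",F11,",",E11,...)."""
    return f"{BASE}?AL={d},{g},{f},{e}&PL=ND301HZ,"

# One URL per quad row, same as dragging the formula down:
print(request_url(42.0, -111.0, -110.0, 41.5))
```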

 

I'm sure I'll have more Global Mapper questions as I get all the data but for now I'm just getting data. Where do you get your road data from? I got mine from the Arizona GIS Clearinghouse, which I have a username for, but I was wondering if you had better sources.

Link to comment

I am new to the forum. Glad to see that someone else is working on this project. I recently started developing maps for the East coast DC, MD, VA and WV, focusing only on certain regions. I am using USGS Arc 1/3 high res, and NHD hires datasets.

I would be happy to work with any other members interested in getting east coast maps. email/pm me.

Link to comment
In addition, I figured out a way to speed up getting the data from USGS seamless. I added two columns for tracking what I had gotten and then another one that creates the urls to immediately load up the four downloads. So I am given: http://extract.cr.usgs.gov/Website/distreq...&PL=ND301HZ,

for each quad automatically.

 

The formula for the cell is:

=CONCATENATE("http://extract.cr.usgs.gov/Website/distreq/RequestSummary.jsp?AL=",D11,",",G11,",",F11,",",E11,"&PL=ND301HZ,")

and it will work in any of your spreadsheets. Just put it in one cell then drag it down for everything. It doesn't speed things up a ton but it does get rid of some of the annoyance. Now if only I could figure out how seamless requests the download then I could write a script to get all the files for me :-D

Thanks for the tip! That will help tremendously.

 

I'm sure I'll have more Global Mapper questions as I get all the data but for now I'm just getting data. Where do you get your road data from? I got mine from the Arizona GIS Clearinghouse, which I have a username for, but I was wondering if you had better sources.

No, I don't have any special sources. I generally look for the following (in my preferred order):

- state GIS agency

- state DOT

- other (Google)

- Tiger

 

CO it was DOT, WY was Tiger, UT was state GIS...

 

Thanks,

Link to comment
Marky, I also want to thank you for the work you put into this project. Without your fast Internet connection and your perseverance, the project would be less than 25% complete.

Sharing the workload is always fun, and working on a project like this is awesome. I'll try to get my data in the mail by week's end. Hopefully DVD+R disks are okay.

 

--Marky

Link to comment

Does anyone else have problems with the NHD ftp server? I have a bunch of data "available" but their ftp isn't working. Ideas?

Yes, initially I was also unable to download. Did you disable "Passive Mode (PASV)" on your ftp client? If not, doing that should resolve the problem. Otherwise, it may be that the NHD ftp server is just being difficult; I had problems with the web site. Try later in the evening.

 

Quote from the server:

 

Please Note:

If you are having problems connecting to the FTP site,

you may need to disable you passive FTP setting in Internet Explorer.

To change this setting click Tools -> Options -> Advanced and scroll down

about half way until you find a checkbox for 'Use Passive FTP...'.

If you uncheck this box, you should be able to get to the FTP site.

 

 

 


Edited by Barrikady
Link to comment
At the contouring stage, have you found it easier to process the whole state as one file or to tile it? And to save it as one or in a tiled format? I imagine tiling could cause some contour alignment issues at tile boundaries unless sufficient overlap was included before clipping.
Hi pasayten_pete,

 

I use GlobalMapper (GM) and several of its features to generate contours. I'll describe the process I use and the corresponding GM terminology but I'm guessing most GIS applications have equivalent functionality.

 

- I add all of the NED DEM files into a GM catalog. The catalog is simply an index of all the data files and can be configured as to what zoom levels the data appears at. The catalog can then be loaded into GM and manipulated or utilized like any other data layer. The key feature is that GM will only load into memory those tiles of the catalog that are needed for the current view or process.

 

- I use GM to generate the contours into shapefiles for the entire state. I have GM grid the output back into 100K tiles and also compress the shapefiles into a zip-like archive.

 

I've checked the seams between tiles on various occasions and never noticed any issues.

I have all of Arizona's NED DEM files. I put them all in a catalog and told it to start going on the best computer I had. I then did a test of just one 100k grid tile and it took 27 minutes to create the 40ft contours for that one quad. This would mean that it would take around 34 hours to process the entire state. Does this sound right? I know some quads will take less (the desert) and others more (the Grand Canyon). But if it's 34 hours for that, it'll probably be about a day for each part of the water data.
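As a sanity check on that estimate: 27 minutes per quad over the 68 quads mentioned later in the thread works out to roughly 30 hours, so ~34 hours is in the right ballpark once slower quads like the Grand Canyon are weighted in. The arithmetic:

```python
# Back-of-the-envelope state-wide contouring estimate, assuming every
# quad takes as long as the one test quad (an optimistic assumption -
# rugged quads will take longer).

minutes_per_quad = 27
quads = 68          # Arizona 100K quad count, per a later post
hours = minutes_per_quad * quads / 60
print(round(hours, 1))  # 30.6
```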

 

Also, how did you export back to the proper 100k grids? I can export into quads that are 1 degree wide by half a degree tall but they may not necessarily line up with the 100k boundaries, especially for my road data and stuff.

 

EDIT: In addition, the one 100k test quad is a 414mb *.mp file. Considering I have 68 quads, that's gonna take a long time for cgpsmapper to process. I must be doing something odd.

Edited by -Oz-
Link to comment
I have all of Arizona's NED DEM files. I put them all in a catalog and told it to start going on the best computer I had. I then did a test of just one 100k grid tile and it took 27 minutes to create the 40ft contours for that one quad. This would mean that it would take around 34 hours to process the entire state. Does this sound right? I know some quads will take less (the desert) and others more (the Grand Canyon). But if it's 34 hours for that, it'll probably be about a day for each part of the water data.
Sounds reasonable. I just checked WY and it took ~25 hours to generate the contours.

 

Also, how did you export back to the proper 100k grids? I can export into quads that are 1 degree wide by half a degree tall but they may not necessarily line up with the 100k boundaries, especially for my road data and stuff.
To force your quads to line up to the 100K, use the Export Bounds tab in GM; set the West to the leftmost quad, East to the rightmost, etc.
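Snapping a bounding box outward to the 100K grid (1 deg wide by 1/2 deg tall) is simple floor/ceil arithmetic. A Python sketch of what typing the quad edges into the Export Bounds tab accomplishes - the function name is made up:

```python
# Snap a (west, south, east, north) box outward to the 100K quad grid:
# columns are 1 degree wide, rows are 1/2 degree tall.

import math

def snap_to_100k(west, south, east, north):
    return (math.floor(west),           # out to the next 1-deg column
            math.floor(south * 2) / 2,  # out to the next 1/2-deg row
            math.ceil(east),
            math.ceil(north * 2) / 2)

# An Arizona-ish data extent snapped to whole 100K quads:
print(snap_to_100k(-111.3, 35.1, -109.8, 36.7))  # (-112, 35.0, -109, 37.0)
```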

 

EDIT: In addition, the one 100k test quad is a 414mb *.mp file. Considering I have 68 quads thats gonna take a long time for cgpsmapper to process. I must be doing something odd.
That seems too big. My WY 100K MP files averaged about 30MB. Now those didn't have the NHD data but I wouldn't think that data would increase the MP by a factor of 10...

 

Let me know if any of that doesn't make any sense...

Link to comment
That seems too big. My WY 100K MP files averaged about 30MB. Now those didn't have the NHD data but I wouldn't think that data would increase the MP by a factor of 10...

I haven't added the NHD data yet. That was just contours. 40 ft contours, interpolating if needed.

 

On the NHD data, the meta data files are useless for telling me what I actually have but so far I have: StreamRiver, CanalDitch, ArtificialPath, Pipeline as Flowlines.

 

I think washes are the Artificial Paths.

 

For areas I have: Wash, Inundation Area, CanalDitch,

 

For points I have: Well, SpringSeep, Gate,

 

I have only checked one quad so far, and I think it was a "dry" quad (what isn't in Arizona), but I'm trying to think of a way to preprocess the dbf files to make it easier in global mapper. I don't think I'll be including the points because the springs I'll include I'll get from the USGS GNIS system. I also know somewhere I'll have WaterBody data.

 

I'm just wondering how much of the flowline data to include.

Edited by -Oz-
Link to comment
That seems too big. My WY 100K MP files averaged about 30MB. Now those didn't have the NHD data but I wouldn't think that data would increase the MP by a factor of 10...
I haven't added the NHD data yet. That was just contours. 40 ft contours, interpolating if needed.
Wow, that is big just for contours... Did you adjust the simplification in GM or leave it at default? And this was for a 100K quad - 1deg x 1/2 deg? If you want and if you have a server you can put it on I could download it and take a look...

 

On the NHD data, the meta data files are useless for telling me what I actually have but so far I have: StreamRiver, CanalDitch, ArtificialPath, Pipeline as Flowlines.

 

I think washes are the Artificial Paths.

 

For areas I have: Wash, Inundation Area, CanalDitch,

 

For points I have: Well, SpringSeep, Gate,

 

I have only checked one quad so far, and I think it was a "dry" quad (what isn't in Arizona), but I'm trying to think of a way to preprocess the dbf files to make it easier in global mapper. I don't think I'll be including the points because the springs I'll include I'll get from the USGS GNIS system. I also know somewhere I'll have WaterBody data.

 

I'm just wondering how much of the flowline data to include.

I haven't finalized my mapping yet. I kept most of the area/waterbody data but simplified it down to only a few types. For the flowline data - not final - but I'm pretty sure I'm going to leave the intermittent stuff out...it's just a little too much.
Link to comment
Wow, that is big just for contours... Did you adjust the simplification in GM or leave it at default? And this was for a 100K quad - 1deg x 1/2 deg? If you want and if you have a server you can put it on I could download it and take a look...

 

I'm uploading the shape file now since it's a bit smaller: 264mb (186mb zipped). It'll take about an hour to upload and I'll edit this then.

 

I haven't finalized my mapping yet. I kept most of the area/waterbody data but simplified it down to only a few types. For the flowline data - not final - but I'm pretty sure I'm going to leave the intermittent stuff out...it's just a little too much.

 

What types did you simplify down to for area and waterbody? I can't seem to find anything that says whether it's intermittent or not.

Link to comment
What types did you simplify down to for area and waterbody? I can't seem to find anything that says whether it's intermittent or not.
I used the following reference

http://nhd.usgs.gov/NHDinGEO_FCodes_by_layer.pdf

to decode the ftype/fcode from the shapefile.

 

I don't have my laptop booted yet - where my notes are - but off the top of my head the waterbody mapped pretty well: glacier lake, wetlands... For area I believe I mapped almost everything (paths, canals, connectors, etc.) to small streams...
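The decoding can be sketched as a small lookup table. The StreamRiver FCodes below are my reading of that FCode reference - verify them against the PDF before relying on this; the filter-rule helper is made up:

```python
# Minimal decode table for the FCode attribute in the NHD flowline dbf.
# For StreamRiver features the FCode distinguishes the hydrographic
# category (perennial vs intermittent vs ephemeral).

FCODES = {
    46000: ("StreamRiver", None),            # category unspecified
    46003: ("StreamRiver", "intermittent"),
    46006: ("StreamRiver", "perennial"),
    46007: ("StreamRiver", "ephemeral"),
}

def keep_flowline(fcode, include_intermittent=False):
    """Sketch of a filter rule: drop intermittent/ephemeral unless asked."""
    ftype, category = FCODES.get(fcode, ("unknown", None))
    if category in ("intermittent", "ephemeral"):
        return include_intermittent
    return ftype != "unknown"

print(keep_flowline(46006))  # True  - perennial streams stay
print(keep_flowline(46003))  # False - intermittent dropped by default
```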

Link to comment

Here's my contours as a shape file: http://www.oztheory.com/contours.zip

Hi Oz,

 

I think your problem is too much detail. Here is a portion of your contours at 1:5000:

 

v2.png

 

Here is a portion of one of my contours at 1:5000:

 

v1.png

 

Each "dot" is a vertex. Now my example contours are probably not detailed enough, but I think yours are way too detailed - they look fantastic, but I have a feeling that it would take "forever" to compile and your img file sizes will end up pretty large. Also, I'm not an expert on the compiling, but I think your resolution is probably greater than what the GPS is capable of, so the compiler will probably spend a lot of time simplifying the data anyways...
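For the curious, the kind of simplification involved here is classic Douglas-Peucker: drop vertices that sit within a tolerance of the chord between their neighbors. A stdlib Python sketch of the algorithm (not GM's or cgpsmapper's actual implementation):

```python
# Douglas-Peucker line simplification: recursively keep only the
# vertex farthest from the chord, if it exceeds the tolerance.

def _point_seg_dist(p, a, b):
    """Distance from point p to segment a-b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == dy == 0:
        return ((px - ax) ** 2 + (py - ay) ** 2) ** 0.5
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    cx, cy = ax + t * dx, ay + t * dy
    return ((px - cx) ** 2 + (py - cy) ** 2) ** 0.5

def simplify(points, tol):
    if len(points) < 3:
        return points
    # find the vertex farthest from the chord between the endpoints
    dmax, idx = 0.0, 0
    for i in range(1, len(points) - 1):
        d = _point_seg_dist(points[i], points[0], points[-1])
        if d > dmax:
            dmax, idx = d, i
    if dmax <= tol:
        return [points[0], points[-1]]
    left = simplify(points[: idx + 1], tol)
    right = simplify(points[idx:], tol)
    return left[:-1] + right

# A wiggly contour segment collapses to its essential shape:
line = [(0, 0), (1, 0.01), (2, -0.01), (3, 0.02), (4, 0)]
print(simplify(line, 0.1))  # [(0, 0), (4, 0)]
```

The tolerance plays the same role as GM's simplification setting: larger values mean fewer vertices, smaller MP/img files, and faster compiles.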

 

Maybe you could take a small area and do some testing - see how long it takes and what the file sizes end up being...

 

Hope that helps - let me know what you find out (if you do any testing).

Link to comment

Thanks for zooming in. Mine are definitely way way too detailed. I will have to experiment with one of the quarters of the 100k quad. It would probably take me a month of computing power just to compile everything.

 

On the note of compiling, I have GNIS points, roads, water, etc, how do you process them all? Load them all together in one map then export it or?

 

Also, how are you sorting the NHD data? I loaded about 3/4 of the state as a catalog but it seems to overload when I do the search by attributes. I assume you have a trick, or you're working on a smaller scale.

 

As always, thanks.

Link to comment
On globalmapper contouring, what tile size are you contouring at a time and what are your contour settings for resolution in arc degrees and simplification value?
I'm gridding to 100K tiles - 1 deg x 1/2 deg. The settings (all default) for GM are:

 

Resolution

X: 0.000104605841386575 arc deg

Y: 0.000081169112836251 arc deg

 

Interpolate... checked

Smooth... checked

 

Simplification

0.50

 

I also use the "Export Contours Directly to Package Files..." option.

Link to comment
On the note of compiling, I have GNIS points, roads, water, etc, how do you process them all? Load them all together in one map then export it or?
I process them all separately and in the end load them together and export to MP.

 

Also, how are you sorting the NHD data? I loaded about 3/4 of the state as a catalog but it seems to overload when I do the search by attributes. I assume you have a trick, or you're working on a smaller scale.
I loaded up the whole state and suffered through the slowness to separate the perennial from the intermittent. From there working with the perennial wasn't too bad.

 

One GM trick you might already know - zoom in extremely close before doing the search by attributes, the processing still takes time but this removes the long screen refreshes...

 

Thanks,

Link to comment

Thanks for zooming in. Mine are definitely way way too detailed. I will have to experiment with one of the quarters of the 100k quad. It would probably take me a month of computing power just to compile everything.

 

Love your contours. However, and IndyJpr could verify this as I have only been looking at a 1-degree square in Colorado, my '.img' contour line files are about 1/8 the size of the '.mp' files. That would make your 414mb '.mp' file about a 50mb '.img' file. With about 60 square degrees in Arizona, you would have about 6GB of '.img' files. IndyJpr could probably say if a 50mb '.img' is too big for cgpsmapper. The early version of GM9 was set to use the previous simplification even though the slider bar was showing 0.5 when you entered that screen.

Link to comment
What types did you simplify down to for area and waterbody? I can't seem to find anything that says whether it's intermittent or not.
I used the following reference

http://nhd.usgs.gov/NHDinGEO_FCodes_by_layer.pdf

to decode the ftype/fcode from the shapefile.

 

I don't have my laptop booted yet - where my notes are - but off the top of my head the waterbody mapped pretty well: glacier lake, wetlands... For area I believe I mapped almost everything (paths, canals, connectors, etc.) to small streams...

 

The FCodes have intermittent versus perennial information. I have a dBase file I can send you, but have no idea how to do so with the computer I am at now, at the public library. I'll try to do it first thing tomorrow.

Link to comment
I'd love to do something too, but am not willing to pony up the bucks for GM.
Well, if you're only wanting to do small areas, there's GpsMapEdit, MapMan and Mapwel:

http://www.geopainting.com/en/

http://www.mapman.org.uk/

http://www.mapwel.biz/

 

Larger areas (states) will inevitably require some type of processing - working with shapefiles, DEM files, etc. GM is very handy for this but there are also some freeware/open source GIS applications out there. I haven't used any of them so I don't have any recommendations...

Link to comment
Is there a step by step guide on this entire process? I'd love to help do something in my home state, Indiana
Hey, my home state too!

 

Here are some tutorials that will give you an idea of what's involved:

 

http://home.cinci.rr.com/creek/garmin.htm

http://www.gpsinformation.org/adamnewham/a...1/gpsmapper.htm

http://forums.Groundspeak.com/GC/index.php?showtopic=146955

 

There's also a good pdf linked to earlier in this thread.

 

Hope that helps,

Link to comment
What types did you simplify down to for area and waterbody? I can't seem to find anything that says whether it's intermittent or not.
I used the following reference

http://nhd.usgs.gov/NHDinGEO_FCodes_by_layer.pdf

to decode the ftype/fcode from the shapefile.

 

I don't have my laptop booted yet - where my notes are - but off the top of my head the waterbody mapped pretty well: glacier lake, wetlands... For area I believe I mapped almost everything (paths, canals, connectors, etc.) to small streams...

 

The FCodes have intermittent versus perennial information. I have a dBase file I can send you, but have no idea how to do so with the computer I am at now, at the public library. I'll try to do it first thing tomorrow.

On my drive up to skiing I started processing a single quad just to see.

 

So, interesting issue. I processed a quad of DEM data once and it was 40mb, similar to yours (jagged but still good). I tried to process the same quad again tonight and it's back to being 400mb. I can't for the life of me figure out what is going on but it is definitely interesting. I just saw that you posted your contour processing data so I'm gonna try that and I'll report back.

 

I'll have intermittent internet this weekend but am interested in the dBase file.

 

Also, how do you restrict data to just the state you're working on? I have the Arizona boundary but there appears to be no way to just end everything at that boundary. I even asked at the Global Mapper forums and they said you can't do it with vector data. That seems silly to me.

 

I may do some of this processing in ArcView. I have it from a class I took.
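Until something like ArcView handles the clip, a rough cut is possible in a script: keep only features whose vertices fall inside the state polygon (ray casting). Note this only filters - a real clip would also split lines at the boundary. A hypothetical stdlib sketch:

```python
# Ray-casting point-in-polygon test, usable as a crude vector filter
# against a state boundary polygon.

def point_in_polygon(x, y, poly):
    """poly is a list of (x, y) vertices; returns True if inside."""
    inside = False
    j = len(poly) - 1
    for i in range(len(poly)):
        xi, yi = poly[i]
        xj, yj = poly[j]
        # count crossings of a ray cast to the east of (x, y)
        if (yi > y) != (yj > y) and x < (xj - xi) * (y - yi) / (yj - yi) + xi:
            inside = not inside
        j = i
    return inside

# Toy "state" boundary (a unit square) and two candidate points:
state = [(0, 0), (1, 0), (1, 1), (0, 1)]
print(point_in_polygon(0.5, 0.5, state))  # True
print(point_in_polygon(1.5, 0.5, state))  # False
```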

Edited by -Oz-
Link to comment

Indy,

 

With the contour shape files from globalmapper, how are you setting the Garmin levels for zoom in the subsequent programs. I was going to use gpsmapedit. I see that there are major, intermediate, and minor contours "names" in the dbf file, but they are not the garmin 0x0020-22 format

 

Thanks

Link to comment
With the contour shape files from globalmapper, how are you setting the Garmin levels for zoom in the subsequent programs. I was going to use gpsmapedit. I see that there are major, intermediate, and minor contours "names" in the dbf file, but they are not the garmin 0x0020-22 format.
Hi pasayten_pete,

 

If you're doing GM -> shapefile -> GpsMapEdit then I'm not sure how to get the zoom levels in there...I've not done that before. I use the DICTIONARY in the header that sets the zoom based on MP type - this way you don't have to set zoom levels for each item.

 

I'm not sure I'm understanding the question completely so let me know if I'm off track...

 

Thanks,

Link to comment
With the contour shape files from globalmapper, how are you setting the Garmin levels for zoom in the subsequent programs. I was going to use gpsmapedit. I see that there are major, intermediate, and minor contours "names" in the dbf file, but they are not the garmin 0x0020-22 format.
Hi pasayten_pete,

 

If you're doing GM -> shapefile -> GpsMapEdit then I'm not sure how to get the zoom levels in there...I've not done that before. I use the DICTIONARY in the header that sets the zoom based on MP type - this way you don't have to set zoom levels for each item.

 

I'm not sure I'm understanding the question completely so let me know if I'm off track...

 

Thanks,

 

So, are you exporting elev shapefiles from gm? Then do you recombine them with the NHD water shapefiles back into gm and then export as a mp file?

 

Man, there are a lot of ways to construct these maps...

 

Thanks

Link to comment
So, are you exporting elev shapefiles from gm? Then do you recombine them with the NHD water shapefiles back into gm and then export as a mp file?
I do most of my processing in GM and when finished I will export a clean shapefile. I also generate contours and export shapefiles.

Then I have a master GM "workspace" where I load in all the finished shapefiles. Once I have all the data I want I export MP out of GM (gridded) and compile.

 

Man, there are a lot of ways to construct these maps...
Yes - I know what you mean :blink:
Link to comment
So, are you exporting elev shapefiles from gm? Then do you recombine them with the NHD water shapefiles back into gm and then export as a mp file?
I do most of my processing in GM and when finished I will export a clean shapefile. I also generate contours and export shapefiles.

Then I have a master GM "workspace" where I load in all the finished shapefiles. Once I have all the data I want I export MP out of GM (gridded) and compile.

 

Man, there are a lot of ways to construct these maps...
Yes - I know what you mean :blink:

 

OK, I now have the picture of how you are doing it... The DICT tag is a great way to parse this stuff up! Much easier than worrying about it too early on...

 

Off to playing around with my maps... :-)

 

Thanks

Link to comment
