


Posts posted by monkeykat

  1. Nice looking page, papa bear. I've been following your progress somewhat, as I also work on my generic Google Maps solution. I think it has been a bit of an exercise for both of us, massaging the data into the formats we want. It's really amazing how many marks there are in the NY City area.


    As you probably know, some pretty nice weather is moving into the Northeast for us this weekend. So I think my computer benchmark hacking will have to go on hold. Gotta go out and look for some physical marks!

  2. Assuming that a database update is different than a benchmark recovery submission, what types of things are Deb, or you, looking for in a database update? I thought I read somewhere that things like water towers, standpipes, or other "building" stations usually aren't used much by surveyors anymore because they are not as accurate as the actual disks are. I remember seeing someone write that the NGS would be "happy to take these off the books" if they can be confirmed destroyed.


    Perhaps for my benefit, and the benefit of others who come across this thread, someone can reply with what type of "database" information Deb is looking for. This may also help Deb avoid sifting through emails of simple benchmark recoveries and such from people who stumbled across this thread and simply emailed her.

  3. Looked into the AWK scripting, BDT; it seems like a pretty powerful language for parsing files. I've been pretty happy with what I can do with Perl, but it's nice to see another option out there in case I choose to pursue it. I, too, am pretty amazed at what the shape files have to offer. I expected some kind of lat/lng box coordinates or something.


    I'm still putting in some effort on the Google Maps API, although Google Earth definitely seems more powerful at this point. For now the website-based input has been killing me. I decided to look into subdomain management to see if I could set up something like a benchmarking.mydomain.com/ type of site, so each maps project I did could have its own domain area.


    The subdomain was set up kind of like a linking web site. The files are still stored in my main directory at mydomain, in a separate folder called "benchmarking". After a little tinkering I discovered that even though the files are in mydomain/benchmarking, the Google Maps API says I need to get a new key, because the website //benchmarking.mydomain.com is sufficiently different from the key I had for //mydomain.com.


    So I got a new key for my "new" subdomain. Then I found I couldn't load any data to my website. After about 2 hours of struggling, including going almost all the way back down to "hello world", I discovered that the function I used to get data (GDownloadUrl) considered the data in mydomain/benchmarking to be in the original domain. So the API said I needed a new key because I was in a new domain, but the file download function said I couldn't access the file data because it felt the file was back in the original domain. Grrrrrr :rolleyes:


    I hope this will all be resolved when I switch over to a SQL database, as then there will be no question of which domain has the data; it will all be in one database.


    I'm still enjoying this experimenting more than the programming I am doing at work though...

  4. To answer John's question, finally: I am not sure that any Google Maps API application would necessarily have more functionality than what you are doing with Google Earth and GSAK. That does seem like a pretty accurate, and not too difficult, way of accessing the data. Thanks for posting the images, I was interested to see what it looked like. It's pretty neat looking.


    I think what I am looking at is maybe more oriented to the newer user, who hasn't downloaded the county info, and maybe doesn't have Google Earth. With an online web-based application, you might be able to just scroll around near your house and see benchmarks nearby. Or maybe if you are on a vacation to Cape Cod, you could just go to your location and instantly have the benchmark data available on the web, without having downloaded the data sheets (even though NGS has made that quite easy).


    Using the shape file, I was able to create an application that displays all the NY State benchmarks. A new problem: it's really SLOW! I should have expected this when trying to load and manage 22,000 markers. :huh: I have downloaded the Rhode Island database, which is much smaller and easier to manage, but it is still pretty slow.


    I am currently working on managing the displayable markers. I have made it so you need to zoom in pretty close (level 13) before any markers appear. This helps manage the display better, but there is still the initial load time that kills you. Reading online, I have found that organizing the data into an SQL database may drastically improve load time; apparently SQL queries are faster than text-file parsing. I haven't worked with an SQL database before, so hooray, a new learning opportunity! :smile:
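    The zoom-gating idea boils down to something like this sketch. Plain objects stand in for the real API markers here, and names like visibleMarkers and MIN_ZOOM are mine, not the Maps API's:

```javascript
// Minimal sketch of zoom-gated marker display. Plain objects stand in for
// real Maps API markers; visibleMarkers and MIN_ZOOM are illustrative names.
const MIN_ZOOM = 13; // markers only appear at this zoom level or closer

function visibleMarkers(markers, zoom, bounds) {
  if (zoom < MIN_ZOOM) return []; // zoomed out too far: draw nothing
  // only markers inside the current viewport get drawn
  return markers.filter(m =>
    m.lat >= bounds.south && m.lat <= bounds.north &&
    m.lng >= bounds.west && m.lng <= bounds.east);
}
```

    Redoing this filter whenever the map moves or zooms keeps the number of drawn markers small, though it does nothing for the initial load time of the raw data.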


    For now, and I make no guarantees of this working, here is what I made to access Rhode Island. I wonder, if enough people try it out, whether it will crash the web site or server.


    Test - Rhode Island Benchmarks

  5. Hmm, I did some more poking around on the NGS web site, and I found - Shape Files. Has anyone looked at these shape files before? It seems like they provide the data I am looking for, in a rather compact form.


    Here's an example:


    11,20070301,http://www.ngs.noaa.gov/cgi-bin/ds_mark.prl?PidBox=OF1452, -77.4423864528,43.0766217000,OF1452,013 NYGS 1969,NY,MONROE,FAIRPORT (1978),43 04 35.83812(N),077 26 32.59123(W),NAD 83,(1996),ADJUSTED, 198.40 ,NAVD 88 ,VERTCON,,,2,Y,,,,,,,1935 ,19950324,MARK NOT FOUND ,USPSQD,,,D


    It provides the fields shown in that line, all delimited by commas. I downloaded an archived shape file for NY State, and it is only 6.4 megabytes in size. :laughing: Seems relatively small to me, for the amount of NGS data it holds. I think with this data in a compact form, I could easily parse for the latitude and longitude, the PID, the name, and provide a link to the data source when the marker is clicked on.
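    To give a sense of how simple that parsing could be, here is a sketch that splits one of those lines on commas. The field positions are inferred from the single sample line above, so treat the indices as assumptions rather than the official NGS layout:

```javascript
// Sketch of parsing one line of the shape-file attribute dump shown above.
// Field positions are inferred from that one sample line, so the indices
// are assumptions, not the documented NGS column order.
function parseShapeLine(line) {
  const f = line.split(",");
  return {
    url: f[2].trim(),      // link back to the NGS data sheet
    lng: parseFloat(f[3]), // longitude appears before latitude here
    lat: parseFloat(f[4]),
    pid: f[5].trim(),
    designation: f[6].trim(),
  };
}
```

    That object would be everything a map marker needs: a position, a label, and a link to open when clicked.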


    The best thing is it may be small enough to store on the server (which is my friend's PC). That would greatly simplify the process of getting the data out, because we can use the GDownloadUrl method to "download" the file into the application for processing. I wonder if it will have a problem parsing a 6-megabyte file? Load times? Or will the marker manager take care of that?


    Another intriguing possibility.

  6. Hello everyone, I started this thread to talk about the Google Maps API and the NGS database of benchmarks. First off, stop me if this has been done before, but I haven't found an application that displays benchmarks from the NGS database on a Google map. I read on the message board about importing data into Google Earth; however, I didn't look too much into it, how easy it is, etc...


    I frequently use the Geocaching web site's Google Maps Beta feature to look for nearby geocaches. It displays the 20 caches nearest map center, color-coded by type, as markers on a Google map. However, when you try to do something similar for benchmarks, I think the best you can get is the 1 mark you are looking for.


    I really wanted to have similar functionality for benchmarks in my area, so I jumped right in and decided to develop my own Google Maps application to display nearby benchmarks. So far it has been an excellent learning experience and a nightmare all in one. What programming project isn't?


    It's certainly not done yet, but I had some questions to try and gauge community interest. First off, I have found the Google Maps API to be extremely user friendly. It works by assigning a Maps key to your specified domain to access the Google Maps info. However, one big issue is the gathering of data, in this case the benchmarks. The Google Maps API can only display data that you have local to your domain, meaning if I wanted to display all the NGS benchmarks, I would need to have a full copy of their database on my domain. Not really feasible with a database as big as the NGS's.


    I have found a way to import data from a file on a different domain to my home domain with the google map. I can build hyperlinks to access NGS data and return it to myself in an HTML file. This web page: NGS GET_MARK_LIST page describes how you can get NGS benchmark data in a variety of different ways. In this way I can prompt a user for what region they are trying to find data for, then display it on a google map for them.


    The part I don't like is the format the data comes back in. It gives the following:


    |Dist|PID...|H V|Vert_Source|Approx.|Approx..|Stab|Designation

    |----|------|- -|-----------|-------|--------|----|-----------

    |....|DE6651|. 2|88/ADJUSTED|N300613|W0820940|D...|135.032

    |....|DE6647|. 2|88/ADJUSTED|N300658|W0821214|D...|51 CMP


    As you can see, the coordinates provided for the points are approximate. I can use the PID provided to link to the actual data sheet if the user clicks on the Google Maps marker; that is no big deal. So would the potential user be interested in getting a localized map of the benchmarks they were planning on seeking, even though the data returned would be approximate?
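    Approximate or not, those packed coordinates (like N300613 and W0820940, which read as degrees, minutes, seconds) can still be turned into decimal degrees for placing markers. A sketch, with the format inferred from the sample rows above and the helper name my own:

```javascript
// Convert an approximate coordinate packed as hemisphere + DDMMSS (e.g.
// "N300613") or DDDMMSS (e.g. "W0820940") into signed decimal degrees.
// The packing is inferred from the GET_MARK_LIST sample rows, not from
// any official NGS format spec.
function approxToDecimal(s) {
  const hemi = s[0];                           // N, S, E, or W
  const digits = s.slice(1);
  const secs = parseInt(digits.slice(-2), 10); // last two digits: seconds
  const mins = parseInt(digits.slice(-4, -2), 10);
  const degs = parseInt(digits.slice(0, -4), 10);
  const dd = degs + mins / 60 + secs / 3600;
  return (hemi === "S" || hemi === "W") ? -dd : dd;
}
```

    Since the packed value only resolves to whole seconds, the marker can be off by a couple hundred feet even before any "scaled" error, which fits the "approximate" label.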


    I had hoped to also color-code the markers based on scaled or adjusted horizontal coordinates; however, the GET_MARK_LIST script provided by NGS returns the Vertical Source data, not Horizontal, so I am not sure I can notify users whether the horizontal coords are scaled or not. Therefore, I think all points would have to be considered scaled.


    I hope to maybe have a test application running after this weekend. And if not then, maybe Easter weekend (the wife will be out of town, leaving me to fight between benchmark contest hunting and Google Maps applications).


    So does anyone think this might be useful? I was going to start by allowing users to select a state and county and display the marks. There may be issues here because the maps will slow down when over 100 points are displayed, but I think I can use the Maps API to limit the number of points displayed at a time. Kinda like how geocaching limits the number of nearby caches displayed to 20. Then when you scroll the map around, new marks appear and disappear based on your focus.


    I was then going to allow other forms of querying the database, like radius, etc. My home page with GoDaddy will not ever allow me to have users upload files, as viruses could be loaded to their servers, so I don't think I can allow users to submit a file of PIDs to view, but I may be able to have a text box they can supply them in. In fact, GoDaddy won't even let me read the contents of a file, so I am experimenting on another site.


    So I guess I am asking, is there any interest in an application that can do this? Even though I will probably finish this anyway. Has anything like this been done already? Any comments? I will post a link once I have something remotely functional.

  7. Still a little snowy out for any detailed searching for me, but I was able to confirm with the town historian and a retired town official that mark OF2468 was destroyed in 1996. That, and I visited the "adjusted" coordinates and didn't see an 80-foot silver tower there. The previous logger erroneously logged a newer standpipe that does not fit the description.


    OF2468 ERR

  8. Very interesting seeing photos of the different types of standpipes. In the Rochester, NY area, the standpipes look like this:


    This is Mark NB2100



    And another nearby standpipe:



    They oftentimes have a Water Authority logo painted on them (a W above an A, with a water drop as the center of the A). As a comparison, the water towers in the area have 4-6 legs holding them 80-110 feet off the ground.


    Thanks for sharing your standpipe photos, they definitely were more interesting looking in the past.

  9. Well, this may be my only post for a while, as this benchmark hunting in the snow is not working for me, and we are getting more snow today. Most of the marks I went looking for were buried and scaled, a bad combo, so most of these finds are standpipes and water towers. Photos of the GPS and these towers did not work.


    NB2030 - REG

    NB2076 - REG

    OF2396 - REG

    OF2453 - REG

    OF2475 - REG

    NB2100 - REG

    OF2392 - REG or DNF?


    OF2392 is a mark I couldn't find in some really tall grass this fall. I drove by the site today on the way to a geocache and saw a stake and pink tape poking out. Looks like a surveyor must have dug it out recently. So in a way, I didn't even find it, someone else did! :blink:


    I am happy with REG credit for it, as calling it a DNF recovery brings up some ethics of finding your own DNFs. But I certainly wouldn't turn down DNF points if they are offered by the judges.


    If this snow keeps up, next weekend I'll have to find church steeples!

  10. I agree, BDT; when you think about it, just 2 or 3 extra lines where they aren't needed in a file of 10 PIDs ends up costing you 20-30 extra lines, which can be 2-3 pages on a palm. Thanks for the input, and I'll have to try and find some time to do a little fine tuning before the contest begins. (Although I think I have some time to spare, as we still have a couple feet of snow on the ground here.)


    I'm thinking of prompting the user for "lite" and "heavy" output options, where "heavy" would include extras like the ASCII box of detailed directions.

  11. I wrote this script for fun, and it is suitable for my benchmarking purposes so far. I'm offering it out to the community in case anyone else would find it useful. If you are interested in trying it, I welcome your input, and am more than happy to assist you in using it.


    Are there fields that you feel should definitely be included? Is there a different format you would like to see? More user customizing? One thing I plan on doing is introducing a variable for column width. Right now it is set to 30, but I could see someone wanting to set it manually, in case your palm has better resolution and can handle 40 characters per line or something.
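    The column-width idea boils down to a simple greedy word wrap. This sketch is not the actual script's code (that's Perl, and the names here are mine), just the shape of it:

```javascript
// Greedy word wrap to a configurable column width (30 by default, matching
// the script's current setting; bump to 40 for a sharper screen).
// Illustrative sketch, not the real Perl implementation.
function wrapText(text, width = 30) {
  const lines = [];
  let line = "";
  for (const word of text.split(/\s+/).filter(Boolean)) {
    if (line && line.length + 1 + word.length > width) {
      lines.push(line); // current line is full, start a new one
      line = word;
    } else {
      line = line ? line + " " + word : word;
    }
  }
  if (line) lines.push(line);
  return lines;
}
```

    Making width a user-set variable is then just a matter of reading it from a prompt instead of hard-coding 30.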




    Data Sheet Parsing Script

  12. Now I know there are a lot of valuable fields I have left out. But I think having this data, and the descriptions of where the marks lie, is valuable enough to find them and any RMs in the area. For a more detailed, difficult search, you would probably want the reference objects and survey control data. But for most finds I think this may be ample.


    Here are some examples of a data sheet txt file and its output palm file. I use the TIBR book reader software to convert the txt file to a palm-formatted file.


    Data Sheet Text File of 3 Benchmarks

    Output for the Palm Pilot in TXT format


    To run the Perl script, you will need to have Perl installed on your PC. Not everybody has this, but man, Perl is so good at parsing. If there is significant interest, I may try to port it to a Windows-friendly version.


    Active State Perl is Free

    Zip file with Benchmarking Perl Script in it


    I find it best to do your data sheet query and then sort by PID. The script will take a filename as input (the data file to parse) and prompt you for the number of PIDs you want in a file. The default is 10. If you have a large number of marks to search through, it could take a long time to scroll through them to find the mark you want; I found 10 to be a good number per file. The script will output a file using the first PID in the file as a filename (that's why it is good to sort first, so your files will be closest to alphabetized). The script accepts files in data sheet text or HTML format.
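    The batching scheme above amounts to slicing the sorted PID list into groups of N and naming each output file after its first PID. A sketch (the names are illustrative, and the real script is Perl, not JavaScript):

```javascript
// Split a sorted list of PIDs into batches of perFile (default 10, as in
// the script), naming each output file after the batch's first PID so the
// files sort roughly alphabetically. Illustrative sketch only.
function batchPids(pids, perFile = 10) {
  const batches = [];
  for (let i = 0; i < pids.length; i += perFile) {
    const group = pids.slice(i, i + perFile);
    batches.push({ filename: group[0] + ".txt", pids: group });
  }
  return batches;
}
```

    This is also why sorting first matters: an unsorted input would still batch fine, but the filenames would no longer tell you which marks are inside.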


    I have a 1GB MicroSD card in my GPS for storing map data, and I took the 64MB MicroSD card that came with my Garmin Vista Cx and put it (with an adapter) into the Palm m500. I can now store 64MB of benchmark text files on the palm.

  13. I have been doing a little work on paperless benchmarking after I spent an hour one day printing and hole-punching data sheets for a recent weekend of benchmarking. "There's got to be a better way," I thought.


    I investigated a lot of the paperless caching messages and went and bought myself a Palm m500. The m500 has a black-and-white screen, so it does not use as much battery as a color screen, and it is sharper outdoors. The palm I bought was used on eBay, and I got it for $28 ($7 was shipping fees). This particular model cost several hundred dollars when it first came out in 2001. I think it is a great target for beginning paperless benchmarking because most people look for the color screens, so you can buy these pretty cheap. Plus, I think B/W screens are better for outdoors, as there is more contrast. This model also has an SD card slot for additional memory.


    Next, I was a little disappointed with the Cachemate software, as there seemed to be multiple screens and a lot of "clicking" to do to get to the information I wanted. So I decided to write up a program to look at the data sheets in text format, filter out the data I need, and display it on the palm screen. I know you can just take the data sheets in txt format and view them on your palm, but the formatting is all off because the palm screens are only about 30 chars wide.


    I wrote a Perl script that parses through a data sheet file and stores what I think is the most valuable information in a more readable format for the palm. It saves the following fields from the data sheet:


    1. BM Designation

    2. PID

    3. State/County

    4. Quad

    5. Text about scaled or adjusted

    6. Marker type

    7. Setting

    8. Stability

    9. History - Dates and Descriptions

  14. I'm in, still a bit of a newbie, but learning with each one. Going to be a fun challenge, as we have about 6 inches of snow on the ground here in Rochester at this time of the year, but it's gotta melt some time soon. I'll have to search the forums to learn more about last year's contest. Benchmarks are way more fun than caches! Woo-hoo!
