Everything posted by tossedsalad

  1. Thanks for the link. It doesn't look at all familiar to me. But then I really don't remember doing it at all. I believe I added the map about 2 years ago, so I'm not surprised that I don't recall. The part that makes me think it was somehow automatic is that there is one for my travelbugs/geocoins and I don't know where they have been. So I have no idea how I generated that map, but that doesn't mean I didn't do it!!!
  2. Interesting. It has been so long that I guess I have completely forgotten how I did that. But I am sure I did not manually select the states. I have no idea where my TB and geocoins have been. I expect there was a web page of some sort that generated the code from my logs, etc and I cut and pasted it. However I have *NO* idea where that was... Alzheimer's is a terrible thing...
  3. I have cached in several states including Tennessee, but this state does not show on the map on my profile page. I visited those caches around New Years of 05/06. Any idea why the map does not show that I have cached in TN?
  4. It will take a little longer than you would like to wait, but not long enough that you find the mistakes you made...
  5. You weren't in Mrs. Lloyd's Latin class, were you? She was 70 years old, but she thought that was just hilarious. So what does it mean? The Latin translator I tried gives me nonsense. Oh, nevermind...
  7. Now the real question is, is it necessary? The difference is probably 10-12 feet of accuracy vs 20-25 ft. If you are finding your position on a map, or finding landmarks, that amount of accuracy may be sufficient for your needs. You might try running it for a day with Battery Saver off, then with it on, and see if the difference in battery life vs increased accuracy is worth it for you.

Where did you get these numbers? I have done a fair amount of testing and I have never been able to distinguish a difference in accuracy between using SBAS (WAAS and EGNOS) and not using it. In a conversation with a manufacturer's rep (who had nothing to gain by telling me this) I was told that SBAS has a *very minimal* impact on accuracy, about 1.5 to 2 meters. SBAS was actually developed to provide verification of accuracy so GPS could be used for aircraft landings. If you check the accuracy numbers from the manufacturers, they are all over the map, so I don't believe any of them; accuracy depends greatly on conditions. If you have actually been able to measure this degree of accuracy change when you disable WAAS, I would be surprised. In the back country your accuracy will vary with the terrain (valleys between high hills see fewer sats, clustered tighter together, giving a higher DOP), with cover (less sensitive GPSrs lose sat signals under heavy tree cover), and possibly with weather (similar issues as cover). If anyone can conclusively show me that WAAS makes a bigger improvement in accuracy than 2 meters, I will be happy to change my opinion, but I have not seen it in objective testing.
  8. Early on, WAAS was described as "buoy bumping" accuracy. That is essentially the best way to describe it. When in thick fog on the water, with WAAS you'll find that buoy the first time; without WAAS it may take 4 or 5 passes. A lot of fuss about a bit of accuracy, but sometimes that bit of accuracy can save a life. But you have my permission not to use it, being so dead set against it.

The only trouble with what you say is that it is not really quantitative. Your description is essentially "sales" talk, as Jotne is saying. The diagram you provide is not factually accurate. I can assure you that you do not get a 5x improvement in accuracy as is shown (15 m >>> 3 m). I get 3 meter accuracy (50% CEP) without WAAS and I don't see any measurable improvement with WAAS working. Where did you get this diagram? Certainly there is little downside to using WAAS, but it is silly to expect it to make a difference in anything you do in normal usage. If I were trying to "bump a buoy" in dense fog and my life depended on it, well, I wouldn't want to be counting on a commercial GPS receiver anyway!!! I'm curious, why would you need to "bump" a buoy to save your life? Why would you need to be that accurate if you were in open water? If you were navigating, I don't think you need that sort of accuracy, and if you were trying to maintain a position, why would you need the buoy rather than just the GPS?
  9. I assume the source of your info is Sam Wormley's page, "GPS Errors & Estimating Your Receiver's Accuracy". The table data is shown here, aligned as best the forum allows:

One-sigma error, m
Error source              Bias   Random  Total  DGPS
----------------------------------------------------
Ephemeris data             2.1     0.0    2.1    0.0
Satellite clock            2.0     0.7    2.1    0.0
Ionosphere                 4.0     0.5    4.0    0.4
Troposphere                0.5     0.5    0.7    0.2
Multipath                  1.0     1.0    1.4    1.4
Receiver measurement       0.5     0.2    0.5    0.5
----------------------------------------------------
User equivalent range
error (UERE), rms          5.1     1.4    5.3    1.6
Filtered UERE, rms         5.1     0.4    5.1    1.5

However, you omitted the Bias and Random columns, which are very important. Sam's page describes these two columns: "Each error is described as a bias (persistence of minutes or more) and a random effect that is, in effect, 'white' noise and exhibits little correlation between samples of range." This says to me that the "random" component should not be correctable by a non-real-time differential method such as SBAS. However, the satellite clock error is shown as being eliminated completely. There are other methods of differential correction than SBAS or WAAS; the table does not specifically talk about SBAS or WAAS, but rather DGPS. I don't see how SBAS can achieve this level of correction, as it is not truly real time in terms of providing unique corrections for each sample. To compensate for "random" errors which do not correlate between samples, you would need a real-time correction such as a direct radio link to a differential ground station. Am I misinterpreting the data above? I will say that my measurements have not shown a significant improvement in accuracy when using WAAS, but like yours, they are fairly limited.
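As a quick sanity check on Sam's numbers: each column total is just the root-sum-square (RSS) of the independent one-sigma components above it. A minimal sketch, using only the values quoted in the post:

```python
import math

def rss(components):
    """Root-sum-square: how independent one-sigma errors combine."""
    return math.sqrt(sum(c * c for c in components))

# "Total" column: ephemeris, clock, ionosphere, troposphere, multipath, receiver
total = rss([2.1, 2.1, 4.0, 0.7, 1.4, 0.5])
# "DGPS" column: ephemeris/clock eliminated, ionosphere/troposphere mostly corrected
dgps = rss([0.0, 0.0, 0.4, 0.2, 1.4, 0.5])

print(f"UERE without DGPS: {total:.1f} m")  # ~5.2 m, matching the table's 5.3 within rounding
print(f"UERE with DGPS:    {dgps:.1f} m")   # ~1.6 m, matching the table's 1.6
```

Note that the DGPS improvement comes almost entirely from removing the bias terms; the multipath and receiver-noise terms survive untouched, which is consistent with the argument that a non-real-time method cannot correct the uncorrelated "random" component.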
  10. Specs on Garmin 60cx with SiRF chip: GPS accuracy <10 meters (33 feet) 95% typical; DGPS (WAAS accuracy in North America) 3-5 meters (10-16 feet) 95% typical. Garmin Map76 non-SiRF GPS accuracy: <15 meters (49 feet) RMS 95% typical; DGPS (USCG) 3-5 meters (10-16 feet) 95% typical; DGPS (WAAS) 3 meters (10 ft) 95% typical with DGPS corrections. The SiRF chip does some interesting things. It is not necessarily more accurate. It grabs every available signal, rejecting none as others will, and computes the heck out of it for the best position. This is why you can get a fix in the woods. At times, such as on start-up, it will make assumptions to provide you with a quick fix and position, as I understand it.

I wonder why the 60cx is spec'd to perform so much more poorly than SiRF rates their own products. Their GSC3f/LP eval kit spec sheet gives a horizontal position accuracy rating of <2.5 m autonomous and <2.0 m with SBAS. Of course the devil is in the details; there are a lot of conditions that have to be spec'd in order to state an accuracy. I would be curious how they perform in the real world. Like I have said in other posts, if you just record all the readings from a unit over a period of time, regardless of whether SBAS is on or off, the measured values range over a 20 to 30 m (or larger) circle, with the 95% radius being much smaller, say 10 m. But I have never seen a unit get anywhere near the 3 meters (or less) that they are rated at. BTW, how do you know how the SiRF chip works internally differently from other GPS chips?
  11. What you have just done is to verify that the same receiver on the same day will find the point it just measured. That is not the same thing as getting the actual coordinates more accurately. If you want to measure the location more accurately, you need to measure it many times over multiple days. There are a lot of different sources of error in a GPS measurement, and to average them out you need to collect a lot of data over a long period of time. For the purposes of geocaching, there is not much point. It is better to just accept that the error can be larger than you might like and live with that. But then maybe that is what you are saying...
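To make the "average over multiple days" idea concrete, here is a minimal sketch (the coordinates are made up for illustration): log one fix per visit and take the plain mean, which is adequate over an area of a few tens of meters.

```python
def average_position(fixes):
    """Average many (lat, lon) fixes in decimal degrees.

    Over a small area (tens of meters) a plain arithmetic mean is fine.
    Averaging across different days lets slowly varying errors, such as
    ionospheric delay and ephemeris bias, partially cancel out.
    """
    lats = [lat for lat, lon in fixes]
    lons = [lon for lat, lon in fixes]
    return sum(lats) / len(lats), sum(lons) / len(lons)

# Hypothetical fixes of the same cache location taken on different days
fixes = [(35.1001, -85.3002), (35.1003, -85.2999), (35.0999, -85.3000)]
print(average_position(fixes))
```

The point is simply that each day's fix carries a different bias, so the mean of many sessions converges toward the true location in a way that repeated fixes within one session cannot.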
  12. I would say it is more user hype rather than marketing hype. I have never seen a manufacturer claim that it makes a significant difference. The data sheets all show between 1 and 2 meters improvement which, as you say, in the midst of larger, more obvious errors, is not an improvement at all really.
  13. I almost forgot, the purpose of WAAS is to provide an indication of whether your calculated position is within an appropriate error range or not. This is used in aircraft so that an erroneous reading is not used to guide the aircraft on landing where even an excessive error can cause a problem. I don't know exactly how WAAS provides this safety check, but that is its purpose, not greatly improved accuracy.
  14. Better yet, put the receiver into NMEA mode and record the readings over a long period. I have done that with my bluetooth receiver and found that the raw, un-averaged data wanders within (if you are lucky) a 50 m circle. Of course the 95% circle is much smaller than that, but it shows that even under static conditions, a receiver will get large errors at some time. That is why a hand held receiver averages the raw data to display to the user. You will see this when you walk or drive and it takes the cursor a second or two to catch up when you stop. In fact, I expect that is the source of the "overshoot" that the Magellans are so famous for. I have found that most of the larger error readings happen in a "burst" where the position wanders away from the current spot for up to 10 seconds and then returns to the smaller error. I can't say if this is something external or internal to the unit being tested. I have not tested a lot of units. Also, a note about WAAS. While trying to get diff GPS mode turned on in my bluetooth unit, I had a number of conversations on one of the GPS forums. One was with a representative of one of the bluetooth GPS makers and he indicated that WAAS does *not* give a significantly more accurate position. The improvement is on the order of 1 to 2 meters. In the spec sheet this can be up to 50% of the non WAAS accuracy. But that is under very ideal conditions only seen in the lab. In the real world (even standing in an open field with 12 SVs in perfect locations) there are significant sources of error and you will only get 10 to 15 meters regardless of what the receiver tells you. The EPE reported by a GPS receiver is the "estimated" position error and is calculated from the location and number of SVs. ***There is NO WAY the receiver can KNOW the actual error*** If that were possible, it could be subtracted out of the calculation and corrected. 
This number is just an indication of how the constellation will degrade the accuracy and tells you nothing about the other sources of error.
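For anyone who wants to reduce such a static log to numbers, a minimal sketch (assuming the NMEA sentences have already been parsed into decimal-degree lat/lon pairs): convert each reading to meters from a reference point with a flat-earth approximation, then pick off the 50% (CEP) and 95% radii.

```python
import math

def error_radii(readings, ref_lat, ref_lon):
    """Return the 50% (CEP) and 95% error radii, in meters, of a
    static log of (lat, lon) readings around a reference point.

    Uses a flat-earth approximation, which is fine over tens of meters.
    """
    m_per_deg = 111_320.0  # meters per degree of latitude (approximate)
    dists = []
    for lat, lon in readings:
        dy = (lat - ref_lat) * m_per_deg
        dx = (lon - ref_lon) * m_per_deg * math.cos(math.radians(ref_lat))
        dists.append(math.hypot(dx, dy))
    dists.sort()
    cep50 = dists[int(0.50 * (len(dists) - 1))]
    r95 = dists[int(0.95 * (len(dists) - 1))]
    return cep50, r95
```

Run over a long log, this makes the post's point visible: the 95% radius is much smaller than the full spread of the data, yet the worst excursions (the "bursts") can sit far outside it.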
  15. Actually, I have a PLGR unit... my understanding was that the only differences between these and the commercial units are that the PLGR is (1) ruggedized, (2) able to get 3 m accuracy even with SA enabled, and (3) able to operate in a jamming environment. I got my Garmin because it's smaller & lighter than the PLGR, SA is disabled, and if someone wants to jam my GPS signal so I don't find my cache, well, I guess that's okay. That does make me think though... maybe I should take it out with my Garmin side by side and do a comparison... I fat-fingered my original post -- I meant 20 feet, not 10. I'll walk up to a cache with my GPSR saying "2 feet" and pointing me at something, then flip over to the satellite page to see the accuracy at "+-20" or higher. I just figured I had something misconfigured. I'll look for the "D" next to the satellites next time I'm out...

You have the PLGR down pat. The added accuracy of the mil units was from not being subject to SA. Once that was turned off, all GPS receivers are on equal footing... at least until the next generation of mil GPS comes on line :^) I have always thought it was interesting that most mil users go with the Garmin. I expect it is just a matter of using what the guys around you are using; going with something different is of no point when the standard works just fine. I actually worked for Thales (not in the GPS division) and did some work with GPS. We took a commercial GPS receiver module and put it into a remote control of a piece of gear. Remote as in on a cable, not on an RF link. We put the GPS receiver in the remote so it could get a clear view of the sky when worn on the chest or the shoulder. It worked pretty well, and as long as SA is not turned on, the only disadvantage compared to the PLGR is the lack of anti-spoof and anti-jam.
  16. Is it just me, or does anyone else think a 2.2" screen is a bit small? My current Merigold's screen is about 2.7 inches, and I would like it even larger! The units with touch screens have screens about that size, but considering the high resolution (QVGA) it still seems too small. Is it just me and my aging eyes?
  17. It must have been a cost issue. Most consumer electronic companies push to save every last penny. When you think about it, it really wouldn't be a deal breaker for buying one. Maybe that's what they figured. I'm sorry I can't subscribe to, or abide by, the new mindset . . . "Never do any more than you absolutely have to, to keep from getting fired" or "Sometimes just good enough really is!". It's counter to my upbringing. Norm

I hope I got the quotes right; no one was trimming and it got rather long... First, this may be a nit, but I want to clarify the meaning of USB 1.1 vs 2.0. USB 1.1 includes low-speed and full-speed operation. USB 2.0 is a superset of 1.1 and is designed to be fully interoperable: either type of device can be plugged into either type of host and still operate. That said, USB 2.0 does have some features that USB 1.1 does not, including optional high-speed operation. So "USB 2.0" is not synonymous with high-speed operation, which is what I think is really being discussed here.

To the real issue, why they didn't provide high-speed operation: I can tell you that it is definitely more expensive to add high-speed operation to most designs. There are many CPUs which include both the PHY and the logic for a full-speed USB interface, but I have not seen one (to my recollection, which is not so good these days) that includes the high-speed PHY. This part consumes a lot more power and would add some $20 or more to the final product cost. So it is definitely a product cost issue. While you can buy a card reader for $10 or less and many computers now include them as standard features, a $20 price difference would have cost a significant number of sales to the competition. This is also why they don't include the USB cable. There is very much a demand curve, with sales falling off very quickly as price rises.
  18. If I am not mistaken, there are several people who have cracked the map formats for the Garmin units and produce custom maps for their areas. They are not cheap, IIRC, but both the maps and the formats are available. I think I read about these here, but I have no idea which thread.
  19. What is wrong with the MapSource topo maps? They are intended for hiking and such. Are you aware that you can download them to your GPS?
  20. Thanks for the link. They still have these and the shipping is still free. I think this is the best deal I have seen in a while and it doesn't require *any* rebates!!! I ordered three of them. :^)
  21. I am glad that you called about this. A lot of people post misinformation here and then it is picked up and repeated as gospel. I find it odd that they would make this product so different from their other units, given the difficulty another poster is having with downloads. But this may be due to the combination of two distinct product lines with incompatible software bases in terms of these features. For example, the road units may have different download protocols since they do very different things than the trail units.

One thing I don't believe was posted here is about the internal hard drive. It is common that the term "hard drive" is used when talking about a flash drive, to separate it from flash that connects directly to a processor and can be executed from. I don't even know of any hard drives that are only 4 GB, although there may be some of the 1" units in that size range. I know the makers were really pushing to have them designed in, but even two years ago it was clear that flash had longer legs in the embedded market. Now I would not use a rotating hard drive in anything under 50 GB. Even if it is a bit pricey right now, in a year it will be much more affordable... maybe. Flash prices have really bottomed out lately. The makers are barely making a profit and are switching a lot of capacity to DRAM. So look for PC memory prices to continue to fall and flash prices to stabilize and possibly go up over the next 3 to 6 months. I haven't seen this unit, but this is the sort of unit I would like to have. I just want it to run open source software.
  22. If it would make you feel any better, I would greatly appreciate a detailed explanation of UTM, and even info on how to convert between lat/lon, UTM, and that other one that is X,Y,Z referenced to the center of the earth. I have been reading a bit on these and understand the basics, but I have not been able to find enough info to allow me to generate the equations or write code to perform the conversions. I have some code that was written by someone else with no comments at all. One of the things it does is perform these sorts of conversions. I can reverse engineer it, but it will be a bear. The author may be willing to help out some, but I would prefer to learn as much as possible myself without bothering him. Knowledge about the conversions will help me a lot in understanding the code.

I am aware that UTM divides the earth into a number of zones. Each zone is mapped to a grid measured in meters rather than lat/lon coordinates. I believe the zones are small enough that the errors this mapping causes can be ignored for anything other than survey work. Where is the info on exactly what the zones are and how to convert between lat/lon and UTM, and are there significant issues with traversing the boundaries between zones? I read that ECEF is an XYZ-based description that ignores the shape of the earth and just gives a result in 3-space with a reference frame that rotates with the earth. I think I can convert to spherical coordinates, but it actually needs to be converted to the ellipsoid defined by WGS-84. Where do I get the specs on WGS-84?
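For the ECEF half of the question, the lat/lon-to-ECEF conversion only needs the two defining WGS-84 ellipsoid constants (semi-major axis and flattening). A minimal sketch of the standard formula:

```python
import math

# WGS-84 defining ellipsoid parameters
A = 6378137.0                # semi-major axis, meters
F = 1.0 / 298.257223563      # flattening
E2 = F * (2.0 - F)           # first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, h=0.0):
    """Convert WGS-84 geodetic lat/lon (degrees) and height (m) to ECEF X, Y, Z in meters."""
    lat = math.radians(lat_deg)
    lon = math.radians(lon_deg)
    # Radius of curvature in the prime vertical
    n = A / math.sqrt(1.0 - E2 * math.sin(lat) ** 2)
    x = (n + h) * math.cos(lat) * math.cos(lon)
    y = (n + h) * math.cos(lat) * math.sin(lon)
    z = (n * (1.0 - E2) + h) * math.sin(lat)
    return x, y, z

# A point on the equator at the prime meridian sits exactly one
# semi-major axis from the earth's center:
print(geodetic_to_ecef(0.0, 0.0))  # (6378137.0, 0.0, 0.0)
```

Going the other way (ECEF back to lat/lon/height) is the harder direction because latitude appears on both sides of the equations; it is usually done iteratively or with Bowring's closed-form approximation.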
  23. What exactly is better about the SiRF-based units compared to any other GPSr designed in the last couple of years? I have been looking very hard at GPSrs of all types and I have not found that much difference between them in regard to what the SiRF front end would provide. For example, a lot of people tout the sensitivity of the SiRF III, but there are other units on par with the SiRF III in that department, notably the Nemerix devices that are being used in a number of bluetooth receivers. I expect you will not be able to observe significant differences in handheld units if you compare units of similar design age. Maybe I am just on a rant, but I have heard a *lot* about how much better the SiRF III devices are and I am not seeing it. I see that a lot of newer devices do just as well or better than the SiRF in all regards, and I hate to see a receiver promoted on the buzzword "SiRF".
  24. A very pertinent question indeed, as many of us are likely to have been in this position or may be in the future. Unfortunately the responses so far are not addressing this question. I bought a Vista Cx, loaded my maps onto it (hence using up one unlock code). I was not happy with the unit and sent it back to the dealer for an upgrade to the 60CSx. I emailed Garmin (UK) to ask for another unlock code to replace the one I had 'used up' on the Vista but got no reply. I had to use my second unlock code to get maps onto my 60CSx. And, of course, I only have one unit! So, I can't answer your question as Garmin never answered it for me. I guess it might come down to whichever person you get through to at Garmin, and how generous they are feeling. They might want proof that you no longer own the unit, and that might be difficult to give them.

Yes, your situation is one that points out the major difference between hardware and software. It is very easy for anyone to determine that a piece of hardware has been returned. But with software it can be much harder to verify that software is not being used. In your case the hardware that the software was locked to was returned and there is no way the software could be used with that unit (or any other using the same lock code). So it seems reasonable that you should get another lock code for free. But obviously Garmin has no provision for this in their system. You would have suffered the same loss if you had returned the unit outright and not upgraded. You could not have returned the unlocked software for a refund. Again, Garmin has no provision for this in their system (nor does nearly anyone else). I find it interesting that you never got a reply to your message. I hear so much about how good Garmin support is, but this sort of thing still happens. Did you follow up with a phone call?
  25. I spent the last week learning about my new Holux 240 bluetooth receiver. I downloaded a number of different applications trying to figure out how to enable WAAS support. In the process I thought that these bluetooth units just did not include WAAS capability, or did not provide a means of turning it on. In the end I found that it is just a lack of good software and documentation; they are really not hard at all to use.

In short, the bluetooth units are a bit special because they have a true serial port between the two chips inside the receiver: the GPS chip and the bluetooth chip. This serial data rate normally defaults to 38,400 bps. Many of the advanced features and status displays for the SiRF III chip require the use of SiRF binary mode and are not available using the NMEA command set. Various programs will let you send and receive the binary commands, but they typically don't use the binary commands for data display. The SiRFDemo program from SiRF is intended to be used with the SiRF demo boards and not end user devices. For whatever reason, they decided that when you switch it to binary mode, the serial data rate is changed to 57,600 bps. But in a bluetooth unit, only the GPS chip sees this command and changes its serial data rate; the bluetooth chip remains at the old rate and all communication is garbled. The operation of the receiver is corrupted until the battery drains and the unit "forgets" its new setting.

The way to use the bluetooth receivers with SiRFDemo is to not use the menu command to switch to binary mode, but to manually send the NMEA command to change modes *WHILE KEEPING THE SERIAL DATA RATE THE SAME*. In my unit this command is "PSRF100,0,38400,8,1,0". The menu command "Action - Transmit Serial Message" allows you to enter this string, and it will append the $ and checksum. Be sure to select NMEA as the "Protocol Wrapper". You can even store this command in a file so you don't have to remember it.
From then on your unit will be in binary mode until it is set back to NMEA with the appropriate binary command or the battery is drained. Several commands are required to put the unit into SBAS or WAAS mode. They can be executed from the menus. Under "Navigation" set "DGPS Mode" to "Automatic" with the default timeout, and set "DGPS Source" to "SBAS channel". Under "SBAS Control" set "SBAS PRN" to "Auto Scan", "Mode" to "Testing" and "Timeout Source" to "Default". Then, if you have an SBAS SV in sight, you should see an SV in channel 12 with a PRN greater than 32. When the signal is used for corrections it will show in the "Navigation View" window as "3d+DGPS".

Another note on the Holux 240 SiRF III receiver and SBAS that I figured out with some advice from Dennis Gröning. It seems that the SiRFDemo program won't display the actual corrections from the SiRF III receivers, but they are calculated and are sent over the comm port. Dennis suggested that I turn Message ID 27 (DGPS Status) off and turn Message ID 29 (Nav Lib DGPS Data) on. I had tried this before but saw no difference in the display, which when MID 27 is used shows a constant 14.27 meter correction for all SVs. Looking at the logged commands, I saw that MID 29 does indeed indicate that corrections are enabled, and different corrections are shown for different SVs. However, I did not see a correction for *every* SV being used in the fix. If I restart SiRFDemo with MID 27 disabled and MID 29 enabled, nothing shows up in the DGPS view window, as if SiRFDemo does not report MID 29. I believe in one of the GPSPassion forum threads, Carl from SiRF said that the SiRF III does not report corrections even when they are calculated and applied. I guess they *are* reported, but not with MID 27. If you want full info on how to use SiRFDemo to control and monitor a bluetooth GPS receiver, go to http://www.gpspassion.com/forumsen/topic.a...amp;whichpage=1 and check out both pages. That is the thread I will be updating as I find more info.
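For anyone sending these sentences from their own software instead of SiRFDemo: the checksum that gets appended is just the XOR of every character between the "$" and the "*". A minimal sketch, using the PSRF100 payload from the post above:

```python
def nmea_wrap(payload):
    """Wrap an NMEA payload with '$', '*' and its two-hex-digit XOR checksum."""
    checksum = 0
    for ch in payload:
        checksum ^= ord(ch)
    return "${}*{:02X}".format(payload, checksum)

# Switch the SiRF chip to binary mode while keeping the baud rate at 38,400
print(nmea_wrap("PSRF100,0,38400,8,1,0"))  # $PSRF100,0,38400,8,1,0*3C
```

Sending the result over the bluetooth serial link (followed by CR/LF) does exactly what the "Transmit Serial Message" menu item does.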