accuracy


fatsbreeze

Recommended Posts

I've got a Palm Pre too, and it works GREAT for Geocaching! The GoTo software shows you the EPE right on the screen, and it's usually plenty accurate for Geocaching. Maybe not "as good" as a dedicated standalone GPSr, but good enough.

 

Remember, even a dedicated standalone GPSr won't get you "0 feet" of accuracy.

Link to comment

My HTC Touch (Vogue) has a Qualcomm GPSOne chipset. It is an A-GPS (the A is for Assisted). When in cell phone coverage, it is as good as the best of them for accuracy. It is about 3 m (based on benchmark testing). When out of cell phone coverage, it truly blows.

 

The big issue with a lot of smartphones is that they have "static navigation". This is a nice feature when driving, but when geocaching it will drive you nuts. If yours has this 'feature', and it drives you nuts, consider getting an external Bluetooth GPSr (they are cheap).

Link to comment

Is there anything I need to do to make the coords for the caches I hide more accurate?

Further to my previous post: if your phone has 'static navigation', averaging needs to be done correctly or you will get bad coordinates.

 

Basically you need to take several readings, and then average the decimals of the minutes (a small sketch of the arithmetic follows the steps below). Each reading should be done like this:

Quickly walk away from the cache, looking at your phone's coordinate readout.

When the readout starts to change (significantly, not the small fake movement present even during 'static navigation' lock-up), walk back to the cache.

When you get to the cache, let the GPS settle for 3-10 seconds.

Take the reading.
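A rough sketch of the averaging step, in Python. The sample readings are made up, and it assumes the degrees and whole minutes are identical across readings, so only the decimal minutes need averaging:

```python
# Illustrative only: average the decimal minutes of several latitude readings
# taken with the walk-away / walk-back procedure above.
readings = ["N 47 36.123", "N 47 36.119", "N 47 36.127", "N 47 36.121"]

def avg_minutes(readings):
    minutes = [float(r.split()[2]) for r in readings]   # e.g. 36.123
    return sum(minutes) / len(minutes)

print(f"N 47 {avg_minutes(readings):.3f}")   # averaged latitude; repeat for longitude
```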

Link to comment
Basically you need to take several readings, and then average the decimals of the minutes. Each reading should be done like this:

Quickly walk away from the cache, looking at your phone's coordinate readout.

When the readout starts to change (significantly, not the small fake movement present even during 'static navigation' lock-up), walk back to the cache.

When you get to the cache, let the GPS settle for 3-10 seconds.

Take the reading.

Alternatively, you can try using an app that does continuous averaging for you, just as you would with a GPS that doesn't have static navigation, and keep walking around in a big circle around GZ while the app does the averaging.

 

No guarantees though :shocked: :laughing:

Link to comment
Averaging won't do much good, unless you're willing to take hundreds of readings over many hours. Otherwise, you're just as likely to be making things worse. Linky

What part of that page do you interpret as "just as likely to be making things worse"? All I can see is good proof that averaging always improves accuracy.

Edited by dfx
Link to comment
Cell phones use the cell signal to triangulate your position.

It is not true GPS, which uses satellite signals to triangulate.

That may have been true in the past, but many modern phones do use true GPS (unless you've turned the GPS receiver off). They also use other signals (cell towers, WiFi networks) to identify your approximate location; they'll fall back on that approximate location when GPS is turned off, and they'll use it to get a faster GPS lock when GPS is on.
Link to comment
Averaging won't do much good, unless you're willing to take hundreds of readings over many hours. Otherwise, you're just as likely to be making things worse. Linky

What part of that page do you interpret as "just as likely to be making things worse"? All I can see is good proof that averaging always improves accuracy.

Averaging for just a short period of time means you may be just collecting a lot of bad data points, if the current constellation is poor, or you're getting multipath error. And averaging bad data doesn't turn it into good data. Averaging goes back to the days when SA inserted a random "fuzz factor" into the signal. Since it was randomized, averaging actually did some good. But the errors we get today are usually caused by one or more specific external factors and are not random. If one out of every 5 data points is consistently 30 feet to the east, simply averaging them isn't going to give you a "better" set of coordinates. You're better off letting the GPS settle down and taking one reading. If the constellation's bad or you only have a few sat locks, pack it up and come back another time.
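To put a number on that 1-in-5 example (figures are purely illustrative), a quick check in Python shows the bias is diluted rather than removed:

```python
# Back-of-the-envelope check of the example above: 4 of every 5 fixes land on
# the true spot, 1 of every 5 is 30 ft east of it (illustrative numbers only).
east_error_ft = [0, 0, 0, 0, 30] * 20                 # 100 fixes
bias = sum(east_error_ft) / len(east_error_ft)
print(f"plain average ends up {bias:.0f} ft east of the true spot")  # ~6 ft
```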

Link to comment
Averaging for just a short period of time means you may be just collecting a lot of bad data points, if the current constellation is poor, or you're getting multipath error. And averaging bad data doesn't turn it into good data.

Yes, only that any single reading under such circumstances will also give you bad coordinates. In fact, statistically it will be worse, because you have no data to compare it to. So how can averaging "make it worse"?

 

But the errors we get today are usually caused by one or more specific external factors and are not random.

If there's a specific reason such as multipath for getting bad coordinates, then yes, but otherwise no. The errors are quite random, and even under optimum conditions every consumer-grade GPS receiver will "wander" around GZ while sitting in the same spot. But whatever the reasons may be, the point previously made remains: a single reading will never be better than an averaged reading.

 

If one out of every 5 data points is consistently 30 feet to the east, simply averaging them isn't going to give you a "better" set of coordinates.

Actually it does, because averaging does exactly that: it gets rid of such error spikes. When you look at your GPS, observe the coordinates shown, try to ignore obvious error spikes, and take a reading of coordinates that you think are close to what they should be, you're actually doing a kind of averaging in your head. You might as well just let the GPS unit do it.

 

Taking a single reading without averaging is doing it blindly. When you do that, you have nothing to compare it to, and you won't know if it's anywhere close to where it should be. In order to get a feel for what the coordinates should be, you need more data, more coordinate readings, and to see which coordinates they hover around. And that's exactly what averaging does.

 

Of course, you need a good averaging algorithm for that to work, one that's weighted and ignores obvious errors. From what I hear, the Garmin 60 series may be doing a bad job of it, and that may be the reason why people believe that averaging is a bad thing.
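For illustration only, here is a minimal sketch of the kind of weighted, spike-rejecting average described above; the rejection threshold and the EPE weighting are assumptions made for the example, not any manufacturer's actual algorithm:

```python
# A toy weighted average with crude spike rejection (illustrative assumptions).
from statistics import median

def weighted_average(fixes):
    """fixes: list of (lat, lon, epe_m) tuples; returns an averaged (lat, lon)."""
    lat_med = median(f[0] for f in fixes)
    lon_med = median(f[1] for f in fixes)
    SPIKE_DEG = 0.0003   # ~33 m of latitude; treat anything farther from the median as a spike
    kept = [f for f in fixes
            if abs(f[0] - lat_med) < SPIKE_DEG and abs(f[1] - lon_med) < SPIKE_DEG]
    if not kept:
        kept = fixes     # fall back if everything looked like a spike
    # Weight each remaining fix by 1 / EPE, so tighter fixes count for more.
    weights = [1.0 / max(f[2], 1.0) for f in kept]
    lat = sum(w * f[0] for w, f in zip(weights, kept)) / sum(weights)
    lon = sum(w * f[1] for w, f in zip(weights, kept)) / sum(weights)
    return lat, lon
```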

Link to comment

I was outside a local Golden Corral today and decided to try out the GPS feature. I turned on the GPS and it locked on in seconds. Looking on Google Earth, it showed me in the proper place. I walked out into the parking lot and GE showed me on the correct painted parking line and in the proper place. I am certain A-GPS was in effect, and it won't be this accurate without cell tower coverage, but I am on Verizon and coverage in my area is 100%. I will have to try it in the woods.

Link to comment
Averaging for just a short period of time means you may be just collecting a lot of bad data points, if the current constellation is poor, or you're getting multipath error. And averaging bad data doesn't turn it into good data.

Yes, only that any single reading under such circumstances will also give you bad coordinates. In fact, statistically it will be worse, because you have no data to compare it to. So how can averaging "make it worse"?

 

But the errors we get today are usually caused by one or more specific external factors and are not random.

If there's a specific reason such as multipath for getting bad coordinates, then yes, but otherwise no. The errors are quite random, and even under optimum conditions every consumer-grade GPS receiver will "wander" around GZ while sitting in the same spot. But whatever the reasons may be, the point previously made remains: a single reading will never be better than an averaged reading.

 

If one out of every 5 data points is consistently 30 feet to the east, simply averaging them isn't going to give you a "better" set of coordinates.

Actually it does, because averaging does exactly that: it gets rid of such error spikes. When you look at your GPS, observe the coordinates shown, try to ignore obvious error spikes, and take a reading of coordinates that you think are close to what they should be, you're actually doing a kind of averaging in your head. You might as well just let the GPS unit do it.

 

Taking a single reading without averaging is doing it blindly. When you do that, you have nothing to compare it to, and you won't know if it's anywhere close to where it should be. In order to get a feel for what the coordinates should be, you need more data, more coordinate readings, and to see which coordinates they hover around. And that's exactly what averaging does.

 

Of course, you need a good averaging algorithm for that to work, one that's weighted and ignores obvious errors. From what I hear, the Garmin 60 series may be doing a bad job of it, and that may be the reason why people believe that averaging is a bad thing.

 

What you're describing is data post-processing, where the outliers can be identified and compensated for. Unfortunately, that's not what consumer-grade GPSs do. They just perform a simple averaging function, which means any outliers make the data worse, not better. If you'll look at the first chart on the link I provided previously, you'll see that it takes several hours' worth of data points before the trials start to level off. In several cases, the coordinates being averaged get progressively worse for 2 or 3 hours before they start leveling off.

Link to comment
Averaging for just a short period of time means you may be just collecting a lot of bad data points, if the current constellation is poor, or you're getting multipath error. And averaging bad data doesn't turn it into good data.

Yes, only that any single reading under such circumstances will also give you bad coordinates. In fact, statistically it will be worse, because you have no data to compare it to. So how can averaging "make it worse"?

 

But the errors we get today are usually caused by one or more specific external factors and are not random.

If there's a specific reason such as multipath for getting bad coordinates, then yes, but otherwise no. The errors are quite random, and even under optimum conditions every consumer-grade GPS receiver will "wander" around GZ while sitting in the same spot. But whatever the reasons may be, the point previously made remains: a single reading will never be better than an averaged reading.

 

If one out of every 5 data points is consistently 30 feet to the east, simply averaging them isn't going to give you a "better" set of coordinates.

Actually it does, because averaging does exactly that: it gets rid of such error spikes. When you look at your GPS, observe the coordinates shown, try to ignore obvious error spikes, and take a reading of coordinates that you think are close to what they should be, you're actually doing a kind of averaging in your head. You might as well just let the GPS unit do it.

 

Taking a single reading without averaging is doing it blindly. When you do that, you have nothing to compare it to, and you won't know if it's anywhere close to where it should be. In order to get a feel for what the coordinates should be, you need more data, more coordinate readings, and to see which coordinates they hover around. And that's exactly what averaging does.

 

Of course, you need a good averaging algorithm for that to work, one that's weighted and ignores obvious errors. From what I hear, the Garmin 60 series may be doing a bad job of it, and that may be the reason why people believe that averaging is a bad thing.

 

This dialog, I like.

 

I like a whole lot.

Link to comment
What you're describing is data post-processing, where the outliers can be identified and compensated for. Unfortunately, that's not what consumer-grade GPSs do. They just perform a simple averaging function, which means any outliers make the data worse, not better.

Wrong, the data gets statistically better with each and every sample, assuming it's processed properly (weighted). It may or may not get closer to the real GZ, but you don't know that without knowing where GZ really is, and you know that only by looking at the sum of the collected data. Well, you still don't know it, but you can make an educated guess. And that's what averaging does, because that's what statistics is all about. It doesn't give you any guarantees of getting good coordinates, but it improves the likelihood.

 

And yes, consumer-grade GPSrs do perform weighted averaging. Maybe not the 60 series (heh), but other/newer models do. Even if it's just a simple DOP filter mask, given enough samples the end result will be better than any single reading during that time.

 

If you'll look at the first chart on the link I provided previously, you'll see that it takes several hours' worth of data points before the trials start to level off. In several cases, the coordinates being averaged get progressively worse for 2 or 3 hours before they start leveling off.

Wrong again. Take a look at the graph yourself, especially at the very beginning. This is what you get by performing a single reading without any averaging, and as you will see, the extremes of the graph at this point (8 meters off) have much worse accuracy than even the worst peaks after a while of averaging (4 meters off).

 

Now why are those peaks there? Because at that time, the GPS was giving coordinates that are further off GZ than the averaged coords had been up to that point, thus shifting the averaged coords further off. Which means that any single reading during that time would have given you coords that are even further off GZ than the averaged coords up to that point. QED.

 

Of course, some single readings will be closer to GZ than some averaged readings, even after 24 hours of averaging. The point is that you don't know that from only that single reading, because there's an identical chance that the single reading may be 8+ meters off (e.g. as seen in the graph). This is where statistics come into play, and averaging is statistics. Its power lies in the number of samples.
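As a rough illustration of that statistical point, here is a small simulation. It assumes fix errors are independent and random (which is exactly the premise being debated), so with correlated errors like multipath the benefit would shrink:

```python
# Compare a single fix against the mean of 60 fixes, assuming independent
# random errors of roughly 5 m per axis. Illustrative only; numbers are made up.
import random

random.seed(1)
TRIALS, FIXES, SIGMA_M = 10_000, 60, 5.0

single_err = avg_err = 0.0
for _ in range(TRIALS):
    xs = [random.gauss(0, SIGMA_M) for _ in range(FIXES)]
    ys = [random.gauss(0, SIGMA_M) for _ in range(FIXES)]
    single_err += (xs[0] ** 2 + ys[0] ** 2) ** 0.5
    avg_err += ((sum(xs) / FIXES) ** 2 + (sum(ys) / FIXES) ** 2) ** 0.5

print(f"mean single-fix error: {single_err / TRIALS:.1f} m")
print(f"mean averaged error:   {avg_err / TRIALS:.1f} m")
```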

Link to comment

Well I think you guys are nit-picking too much :anibad:

 

Yeah, both sides of the argument have valid points, but at the end of the day, in my opinion, averaging a coordinate does more good than harm. Fair assumption?

 

OK, sure, there will be occasions when averaging doesn't help, but you can't necessarily know *this time* is one of those times, so just go ahead and average. My old Lowrance iFinder Pro had a nice averaging screen that would show the little averaging dots accumulating right on the screen, and even a casual GPS user could see if the dots are fairly linear or popping up erratically. Do Garmins do that?

 

Back to the OP question. Yes, the Palm Pre does indeed do the A-GPS cell tower triangulation to "quickly" determine your general area, and then goes on to use the satellites to pinpoint you. I think A-GPS is a GREAT feature when used this way, and is a shining star of smartphone GPSrs. It doesn't make up for all the other smartphone deficiencies :cry: but it is a nice feature.

Link to comment
Yeah, both sides of the argument have valid points, but at the end of the day, in my opinion, averaging a coordinate does more good than harm. Fair assumption?

Well, PS here and a few other users don't seem to agree with that, and actually try to tell everyone that averaging is a bad thing and shouldn't be done. Which is total BS, but from my observations it's mostly (only?) users of Garmin 60 devices who recommend against it, so my theory is that averaging on those devices is flawed or outright broken. It's the only explanation I can come up with for how anyone would ever come to the conclusion that averaging is a bad thing.

 

Of course, if taking a single reading works for them, then by all means continue to do that, but don't tell people that it's the only way, especially when even the Groundspeak how-to for placing caches tells people to do averaging.

Edited by dfx
Link to comment

I use my VZW BlackBerry to cache and it's very accurate! I've hidden 2 caches so far; I just walked away from the cache, walked quickly back to it, and then recorded the coordinates when I got to the cache. I did that 3 times per cache and it seems right on. I'll know more when my hidden caches have more finds!

 

I like using my BlackBerry to cache. I can get caches on the go, wherever I am. I'm getting a rugged case to protect it out there.

Link to comment
Averaging won't do much good, unless you're willing to take hundreds of readings over many hours. Otherwise, you're just as likely to be making things worse. Linky

What part of that page do you interpret as "just as likely to be making things worse"? All I can see is good proof that averaging always improves accuracy.

 

In my cache, I added a note saying something along the lines of "If you could, please write down the coordinates that your GPS shows at this cache to help improve the accuracy of this listing."

 

I figure getting multiple readings, from multiple receivers, on multiple days can't be a bad thing.

 

I went to my GZ on 3 different days and wrote down the coordinates before making the cache public.

Link to comment

I use my VZW BlackBerry to cache and it's very accurate! I've hidden 2 caches so far; I just walked away from the cache, walked quickly back to it, and then recorded the coordinates when I got to the cache. I did that 3 times per cache and it seems right on. I'll know more when my hidden caches have more finds!

 

I like using my BlackBerry to cache. I can get caches on the go, wherever I am. I'm getting a rugged case to protect it out there.

 

How about battery life?

Link to comment

Of the caches that I have hidden, my version of 'averaging' has worked out quite well. Perhaps if I had hidden more caches, I would feel differently; hard to say.

 

On only one occasion during my original hides did I get accuracy complaints, and that was due to mistyping the coordinates.

 

I do suppose that the averaging technique could be irrelevant, I however will continue using it. And right or wrong, it is moderately comforting to know that many fellow cache hiders use the same technique.

Link to comment

My HTC Touch (Vogue) has a Qualcomm GPSOne chipset. It is an A-GPS (the A is for Assisted). When in cell phone coverage, it is as good as the best of them for accuracy. It is about 3 m (based on benchmark testing). When out of cell phone coverage, it truly blows.

 

 

I thought A-GPS allowed the unit to get a rough position very quickly, and then the "true" GPS chip in the phone took over and provided a more accurate position. So, once out of cell tower range, you'd still get a good position fix; it would just take a lot longer.

Link to comment

I use my VZW BlackBerry to cache and it's very accurate! I've hidden 2 caches so far; I just walked away from the cache, walked quickly back to it, and then recorded the coordinates when I got to the cache. I did that 3 times per cache and it seems right on. I'll know more when my hidden caches have more finds!

 

I like using my BlackBerry to cache. I can get caches on the go, wherever I am. I'm getting a rugged case to protect it out there.

 

How about battery life?

 

The thing that drains the battery is having the backlight on. So I make sure that when I'm not looking at the screen, I put the BlackBerry in standby. Then I put it on the charger in the car if needed. I'm thinking of getting a solar charger.

 

I'd say my BlackBerry lasts about 4 hours when caching heavily with no charge.

Link to comment

And yes, consumer-grade GPSrs do perform weighted averaging. Maybe not the 60 series (heh), but other/newer models do. Even if it's just a simple DOP filter mask, given enough samples the end result will be better than any single reading during that time.

Citation please?

 

Wrong again. Take a look at the graph yourself, especially at the very beginning. This is what you get by performing a single reading without any averaging, and as you will see, the extremes of the graph at this point (8 meters off) have much worse accuracy than even the worst peaks after a while of averaging (4 meters off).

 

Now why are those peaks there? Because at that time, the GPS was giving coordinates that are further off GZ than the averaged coords had been up to that point, thus shifting the averaged coords further off. Which means that any single reading during that time would have given you coords that are even further off GZ than the averaged coords up to that point. QED.

Sorry, but no. The graph shows the running average. That's the whole point of it. Think about it. If it were showing at-the-moment data points, instead of a running average, it would never even out, would it?

 

Averaging can be used to get a more precise coordinate, but it requires taking a large number of readings at different times during the day, to compensate for constellation changes and other transient factors (and Garmin agrees). But the way most geocachers use it, by letting it run for a minute or two in a single session, is just smoke and mirrors. It might make you feel like you're getting a better reading, but that's not necessarily the case. You're not getting the Truth. You're getting Truthiness. If you don't have the time, you're better off just letting the numbers settle down, then taking a single reading. If you happen to hit that Mark button during an error spike, it will be obvious, since the values won't match what you just saw. Had you been averaging during that period, that spike would have been tossed into the coordinate stew and likely made your result worse.

 

This side-topic really isn't a "Getting Started" issue, and belongs in the Tech section with all the other Averaging topics. I suggest continuing there.

Edited by Prime Suspect
Link to comment
Sorry, but no. The graph shows the running average. That's the whole point of it. Think about it. If it were showing at-the-moment data points, instead of a running average, it would never even out, would it?

You missed the point there. Of course the graph shows the (cumulative) moving average. But 1) the "average" at the very beginning of the graph is thus not an "average", but rather corresponds to a single reading, and 2) for the averaged coords to move further away from GZ, the data points from that time would have to be consistently even further away from GZ. If they'd been closer to GZ than the average so far, then the average would also move closer to GZ, not away from it.
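For reference (standard math, not from either poster), the cumulative moving average updates as

$$\bar{x}_{n+1} = \bar{x}_n + \frac{x_{n+1} - \bar{x}_n}{n+1},$$

so each new fix pulls the running average a small step toward itself. Along each coordinate axis, the running average can only move farther from GZ if the new fix is itself farther from GZ on that axis than the current average is.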

 

Apart from that, I don't feel like repeating what I've already posted. You may want to look at this page, especially the first graph. It's from a Garmin 60C and is a plot of all the data points from a 5-minute session of averaging. It clearly shows how the data points (= single waypoint readings) are randomly pretty much all over the place and up to ~8 meters off, while the averaged coordinates are pretty darn close to GZ, less than 2 meters off. But I'm sure you have an explanation for that too :ph34r:

Link to comment

I use my VZW BlackBerry to cache and it's very accurate! I've hidden 2 caches so far; I just walked away from the cache, walked quickly back to it, and then recorded the coordinates when I got to the cache. I did that 3 times per cache and it seems right on. I'll know more when my hidden caches have more finds!

 

I like using my BlackBerry to cache. I can get caches on the go, wherever I am. I'm getting a rugged case to protect it out there.

 

How about battery life?

Those lithium-ion batteries in smartphones last for quite a while. And swapping in a replacement is quite easy. Or you can charge them from a AA battery charging pack. Battery life is not a significant issue. I use my smartphone for backcountry hiking with no problem. My last one was 11.5 hours through very rugged terrain.

Link to comment

My HTC Touch (Vogue) has a Qualcomm GPSOne chipset. It is an A-GPS (the A is for Assisted). When in cell phone coverage, it is as good as the best of them for accuracy. It is about 3 m (based on benchmark testing). When out of cell phone coverage, it truly blows.

 

 

I thought A-GPS allowed the unit to get a rough position very quickly, and then the "true" GPS chip in the phone took over and provided a more accurate position. So, once out of cell tower range, you'd still get a good position fix; it would just take a lot longer.

There are several variations of A-GPS. I don't know how each one works. Mine (Qualcomm GPSOne) works the way I explained it. The cell phone network somehow augments the GPS. When out of coverage, the GPS has a very hard time even locking on, and the accuracy sucks. Cloud cover can block the signal. When in cell coverage, it is quite sensitive. It will work under bridges and in other places where I would not expect it to work, and I get about 3 m accuracy (on benchmarks). When I am planning to be out of cell coverage, I bring my Bluetooth GPSr. It has great performance.

Link to comment
