GPS reported accuracy, error function

Most GPS systems report "accuracy" in units of meters, with the figure varying over orders of magnitude. What does this figure mean? How can it be translated to an error function for estimation, i.e. the probability of an actual position given the GPS reading and its reported accuracy?
According to the Wikipedia article on GPS accuracy, a reading down to 3 meters can be achieved by precisely timing the radio signals arriving at the receiver. This seems to correspond with the tightest error margin reported by e.g. an iPhone. But that wouldn't account for external signal distortion.
It sounds like an error function should have two domains, with a gentle linear slope out to the reported accuracy and then a polynomial or exponential increase further out.
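For illustration, a minimal sketch of that two-domain shape; the linear slope, the quadratic growth, and the coefficients are guesses, not anything a vendor specifies:

    def position_penalty(distance_m, reported_accuracy_m):
        """Illustrative two-domain penalty: gentle linear slope inside the
        reported accuracy radius, quadratic growth beyond it. The
        coefficients are arbitrary placeholders, not vendor-specified."""
        if distance_m <= reported_accuracy_m:
            return distance_m / reported_accuracy_m         # 0..1 inside the circle
        excess = distance_m - reported_accuracy_m
        return 1.0 + (excess / reported_accuracy_m) ** 2    # grows quickly outside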
Is there a better approach than to tinker with it? Do different GPS chipset vendors conform to any kind of standard meaning, or do they all provide only some kind of number for the sake of feature parity?

The number reported is usually called HEPE, Horizontal Estimated Position Error. In theory, 67% of the time the measurement should be within HEPE of the true position, and 33% of the time the measurement should be in horizontal error by more than the HEPE.
In practice, no one checks HEPEs very carefully, and in my experience, the HEPEs reported for 3- or 4-satellite fixes are much larger than they need to be. That is, in my experience, 3-satellite fixes are accurate to within a HEPE distance much more than 67% of the time.
The "assumed" error distribution is circular gaussian. So in principle you could find the right ratios for a circular gaussian and derive the 95% probability radius and so on. But to have these numbers be meaningful, you would need to do extensive statistical testing to verify that indeed you are getting around 95%.
The above are my impressions from working in the less accuracy-sensitive parts of GPS over the years. Conceivably, people who work on using GPS for aircraft landing may have a better sense of how to predict errors and error rates, but the techniques and methods they use are likely not available in consumer GPS devices.
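If you take the circular-Gaussian assumption at face value, the distance error is Rayleigh distributed, and scaling a HEPE to another confidence radius is a one-liner. A minimal sketch, with the 67%-containment reading of HEPE as the stated assumption:

    import math

    def hepe_to_radius(hepe_m, prob=0.95, hepe_prob=0.67):
        """Scale a HEPE (assumed: the 67%-containment radius of a circular
        Gaussian) to the radius containing `prob` of the probability mass.
        For a circular Gaussian, P(r <= R) = 1 - exp(-R^2 / (2 sigma^2))."""
        sigma = hepe_m / math.sqrt(-2.0 * math.log(1.0 - hepe_prob))
        return sigma * math.sqrt(-2.0 * math.log(1.0 - prob))

    print(hepe_to_radius(10.0))  # ~16.4 m: the 95% radius for a 10 m HEPE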

Related

Calibration of a magnetometer attached to a vehicle, as figure-8 calibration isn't possible in such a scenario

I was trying to find a way to calibrate a magnetometer attached to a vehicle, as the figure-8 method of calibration is not really possible on a vehicle.
Also, removing the magnetometer, calibrating it, and refitting it won't give exact results, since fixing it back to the vehicle introduces more hard-iron distortion; it was calibrated without the vehicle environment.
My device also has an accelerometer and GPS. Can I use accelerometer or GPS data (these are calibrated) to automatically calibrate the magnetometer?
Given that you are not happy with the results of off-vehicle calibration, I doubt that accelerometer and GPS data will help you much unless you measure many times to average out the noise (although technically it depends on the precision of the sensors: if you have a 0.001% accelerometer, you might get very good data out of it and compensate for the inaccuracy of the GPS data).
From the question, I assume you want just 2D data and you'll be using the Earth's magnetic field as a source (otherwise, GPS wouldn't help).
You might be better off renting a vehicle rotation stand for a day: it will have a steady, well-known angular velocity, and you can record the magnetometer data for a long period of time (say an hour, over 500 rotations or so) and then process it by averaging out any noise. Your vehicle will produce a different magnetic field while the engine is off, idle, and running, so you might want to do three different experiments (or more, to deduce the effect of engine RPM on the field it produces). Also, if the magnetometer is located close to the passengers, you will have additional influences from them and their devices.
If a rotation stand is not available (or not affordable), you can make a calibration experiment with the GPS (whether to use the accelerometers will depend on their precision) as follows:
- find a large, flat, empty paved surface with no underground magnetic sources (walk around with your magnetometer to check)
- put the vehicle into a turn on this surface and fix the steering wheel
- use the cruise control to fix the speed
- wait for a couple of circles to ensure they are equal
- make a recording of 100 circles (or 500 to get better precision)
- then average the GPS noise out
You can do this at different speeds to isolate the influence of the engine's magnetic field as a function of its RPM.
I performed a similar procedure to calibrate the optical sensor on the steering wheel, to build a model of vehicle angular rotation from the steering wheel angle and current speed. That did not produce very accurate results, due to the tires slipping differently on different surfaces, but it should work okay for your problem.
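As a sketch of what to do with such a recording: while the vehicle drives circles, an ideal 2D magnetometer trace is a circle centered at the hard-iron offset, so a least-squares circle fit recovers that offset. A minimal sketch, assuming the noise has already been averaged out and ignoring soft-iron distortion:

    import numpy as np

    def fit_hard_iron_2d(mx, my):
        """Least-squares circle fit (Kasa method) to 2D magnetometer samples
        recorded while driving circles. The circle center is the hard-iron
        offset; soft-iron (ellipse) distortion is ignored here."""
        mx, my = np.asarray(mx, float), np.asarray(my, float)
        # Solve a*mx + b*my + c = mx^2 + my^2 for a, b, c; center = (a/2, b/2).
        A = np.column_stack([mx, my, np.ones_like(mx)])
        rhs = mx**2 + my**2
        (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
        center = np.array([a / 2.0, b / 2.0])
        radius = np.sqrt(c + center @ center)
        return center, radius  # subtract `center` from raw readings to calibrate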

How does the precision of a GPS fix depend on the number of satellites?

I know that the precision of GPS depends a lot on the quality of the receiver, but if you know approximately how the precision of GPS tracking depends on the number of satellites, I would be very thankful.
The number of satellites has a counterintuitively low impact on GPS precision.
In theory, 3 satellites will provide a perfect fix; in reality they will create a pyramid-shaped confidence region in space that, when projected onto a map, becomes an error triangle.
More satellites will add more facets to the confidence region, but after projection onto a 2D map this gives only a modest reduction of the error area.
The biggest benefit of more satellites is the possibility to discard outliers - this is especially true in built-up areas where a reflected signal can create a wildly wrong area with a high confidence. If you have, say, 5 satellites with 4 agreeing well and one outlier, you have a good argument to discard it.
This works out to a situation where 4 satellites with a good signal provide a much better fix than 7 satellites with a bad signal.
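To make the outlier argument concrete, here is a minimal sketch (nothing like production RAIM) of a pseudorange least-squares fix that drops the single worst-fitting satellite when its residual is implausibly large; the 30 m threshold is an arbitrary illustration:

    import numpy as np

    def solve_fix(sat_pos, pseudoranges, iters=10):
        """Gauss-Newton solve for receiver position and clock bias from
        satellite positions (N x 3, meters) and pseudoranges (N, meters)."""
        x = np.zeros(4)  # [px, py, pz, clock_bias_m], start at Earth's center
        for _ in range(iters):
            diff = x[:3] - sat_pos
            rho = np.linalg.norm(diff, axis=1)
            H = np.column_stack([diff / rho[:, None], np.ones(len(rho))])
            dx, *_ = np.linalg.lstsq(H, pseudoranges - (rho + x[3]), rcond=None)
            x += dx
        residuals = pseudoranges - (np.linalg.norm(x[:3] - sat_pos, axis=1) + x[3])
        return x, residuals

    def solve_discarding_outlier(sat_pos, pseudoranges, thresh_m=30.0):
        """Drop the worst-fitting satellite (e.g. a reflected signal) and
        re-solve. Residuals only expose an outlier with 5+ satellites."""
        x, res = solve_fix(sat_pos, pseudoranges)
        worst = np.argmax(np.abs(res))
        if len(pseudoranges) > 4 and abs(res[worst]) > thresh_m:
            keep = np.arange(len(pseudoranges)) != worst
            x, res = solve_fix(sat_pos[keep], pseudoranges[keep])
        return x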

Correcting SLAM drift error using GPS measurements

I'm trying to figure out how to correct drift errors introduced by a SLAM method using GPS measurements. I have two point sets in Euclidean 3D space taken at fixed moments in time:
The red dataset comes from GPS and contains no drift errors, while the blue dataset is based on the SLAM algorithm and drifts over time.
The idea is that SLAM is accurate over short distances but eventually drifts, while GPS is accurate over long distances and inaccurate over short ones. So I would like to figure out how to fuse the SLAM data with GPS in a way that takes the best accuracy from both measurements. At the very least, how should I approach this problem?
Since your GPS looks like it is very locally biased, I'm assuming it is low-cost and doesn't use any correction techniques, e.g. that it is not differential. As you probably are aware, GPS errors are not Gaussian. The authors of this paper show that a good way to model GPS noise is as v + eps, where v is a locally constant "bias" vector (it is usually constant for a few meters, and then changes more or less smoothly or abruptly) and eps is Gaussian noise.
Given this information, one option would be to use Kalman-based fusion: you add the GPS noise and bias to the state vector, define your transition equations appropriately, and proceed as you would with an ordinary EKF. Note that if we ignore the prediction step of the Kalman filter, this is roughly equivalent to minimizing an error function of the form
measurement_constraints + some_weight * GPS_constraints
and that gives you a more straightforward second option. For example, if your SLAM is visual, you can just use the sum of squared reprojection errors (i.e. the bundle adjustment error) as the measurement constraints, and define your GPS constraints as ||x - x_{gps}||, where the x are 2D or 3D GPS positions (you might want to ignore the altitude with low-cost GPS).
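A minimal sketch of that second option for a trajectory of 2D positions; slam_residuals, gps_xy, slam_poses, and the weight are placeholders for whatever your SLAM system provides:

    import numpy as np
    from scipy.optimize import least_squares

    def fused_residuals(flat_poses, slam_residuals, gps_xy, weight):
        """Stack the SLAM residuals with weighted GPS position constraints.
        `slam_residuals(poses)` stands in for e.g. reprojection errors."""
        poses = flat_poses.reshape(-1, 2)  # one 2D position per frame
        gps_term = np.sqrt(weight) * (poses - gps_xy).ravel()
        return np.concatenate([slam_residuals(poses), gps_term])

    # usage: start from the SLAM trajectory and pull it toward the GPS track
    # result = least_squares(fused_residuals, slam_poses.ravel(),
    #                        args=(slam_residuals, gps_xy, 0.1))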
If your SLAM is visual and feature-point based (you didn't really say what type of SLAM you are using, so I assume the most widespread type), then fusion with any of the methods above can lead to "inlier loss": a sudden, violent correction increases the reprojection errors, which means that you lose inliers in SLAM's tracking. So you have to re-triangulate points, and so on. Plus, note that even though the paper I linked to above presents a model of the GPS errors, it is not a very accurate model, and assuming that the distribution of GPS errors is unimodal (necessary for the EKF) seems a bit adventurous to me.
So, I think a good option is to use barrier-term optimization. Basically, the idea is this: since you don't really know how to model GPS errors, assume that you have more confidence in SLAM locally, and minimize a function S(x) that captures the quality of your SLAM reconstruction. Let x_opt denote the minimizer of S. Then, fuse with GPS data as long as doing so does not deteriorate S(x_opt) by more than a given threshold. Mathematically, you'd want to minimize
some_coef / (thresh - S(x)) + ||x - x_{gps}||
and you'd initialize the minimization with x_opt. A good choice for S is the bundle adjustment error, since by not degrading it, you prevent inlier loss. There are other choices of S in the literature, but they are usually meant to reduce computational time and add little in terms of accuracy.
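A minimal sketch of that barrier term with scipy, treating S as a black-box SLAM cost; S, thresh, some_coef, and x_opt are the placeholders from the formula above:

    import numpy as np
    from scipy.optimize import minimize

    def barrier_objective(x, S, x_gps, thresh, some_coef):
        """Pull x toward the GPS position while the barrier term keeps the
        SLAM cost S(x) below `thresh`; it blows up as S(x) approaches it."""
        s = S(x)
        if s >= thresh:  # outside the region where SLAM quality is acceptable
            return np.inf
        return some_coef / (thresh - s) + np.linalg.norm(x - x_gps)

    # usage: initialize at the SLAM-only optimum x_opt
    # result = minimize(barrier_objective, x_opt,
    #                   args=(S, x_gps, thresh, some_coef), method='Nelder-Mead')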
This, unlike the EKF, does not have a nice probabilistic interpretation, but it produces very nice results in practice (I have used it for fusion with other things than GPS too, and it works well). You can, for example, see this excellent paper that explains thoroughly how to implement this, how to set the threshold, etc.
Hope this helps. Please don't hesitate to tell me if you find inaccuracies/errors in my answer.

Distance estimation based on signal strength

I have a set of data which includes the position of a car and the signal level of an unknown emitter. I have to estimate the distance based on this. Basically, signal level varies inversely with the square of distance. But when we include effects like multipath and reflections, we need to use a different equation. Here the Hata-Okumura model comes in, which can give us the path loss based on distance. However, the distance is unknown, as I don't know where the emitter is. I only have access to different lat/long sets and the received signal level.
What I am asking is: could you guys please guide me to techniques which would help me estimate the distance based on the current position and signal strength? All I am asking for is guidance towards a technique which might be useful.
I have looked into "How to calculate distance from Wifi router using Signal Strength?" but he has 3 fixed WiFi signals and can use the FSPL. However, in an urban environment it does not work.
Since the car is moving, using any diffraction model would be very difficult. The multipath environment is constantly changing due to the moving car, and any reflection/diffraction model requires well-known object geometry around the car.
In your problem you have a known time series of moving car positions [x(t), y(t)]. You also have a time series of rough measurements of the distance between the car and the emitter, [r(t)], where the emitter position is unknown. You need to solve for the stationary unknown emitter position (X, Y). So you have many noisy measurements with two unknown parameters to estimate. This is a classic least-squares estimation problem. You can formulate

    r(t_i) = sqrt((x(t_i) - X)^2 + (y(t_i) - Y)^2)

feed your data into this equation, and do least-squares estimation. The data obviously is noisy due to multipath, but the emitter is stationary, and over time, during the estimation process, the noise can be more or less smoothed out.
Least Square Estimation
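A minimal sketch of that least-squares fit with scipy, assuming the rough distances r(t_i) have already been derived from the path-loss model and that positions are in a local metric frame (locate_emitter and the sample numbers are illustrative):

    import numpy as np
    from scipy.optimize import least_squares

    def locate_emitter(car_xy, ranges_m):
        """Least-squares estimate of a stationary emitter position (X, Y)
        from car positions (N x 2, meters) and rough range estimates (N,)."""
        def residuals(p):
            return np.linalg.norm(car_xy - p, axis=1) - ranges_m
        guess = car_xy.mean(axis=0)  # start at the centroid of the track
        return least_squares(residuals, guess).x

    # usage: three car positions, ranges consistent with an emitter at (0, 100)
    car_xy = np.array([[0.0, 0.0], [100.0, 0.0], [100.0, 100.0]])
    print(locate_emitter(car_xy, np.array([100.0, 141.4, 100.0])))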

What does horizontalAccuracy exactly mean?

I am working on an iOS application using location services. Having a background in experimental physics, I am wondering what exactly horizontalAccuracy in a location found in locationManager:didUpdateToLocation:fromLocation: stands for. The documentation is a bit sparse...
I assume that the accuracy gives a confidence interval based on a Gaussian (or Poisson?) distribution. Thus, with a certain probability, the actual position is within a circle with a radius of horizontalAccuracy, but it could just as well be outside that area. The question is then: how big is that probability? If horizontalAccuracy corresponds to 1σ, I'd have a 68% probability of being within the circle of radius horizontalAccuracy; but looking at it the other way around, in nearly one third of the cases the actual position will be outside that area. Thus, in certain cases, I'd rather calculate with 2σ (2*horizontalAccuracy) or even 3σ (3*horizontalAccuracy).
To put it short: is there any indication anywhere of which confidence interval horizontalAccuracy has?
Comment to all who respond "Apple says it is within":
Well, the measurement cannot be exact. It must have a certain level of uncertainty. If you repeat the measurement very often, you will get a distribution of results, probably a Gaussian distribution. This Gaussian has a certain width, which corresponds to the level of uncertainty of the measurements. Measuring the position more often will reduce the uncertainty and thus increase accuracy, but it will never give you a definite interval in which the actual position is guaranteed to be. You will only get a probability. But if the accuracy is 3σ, we have 99.7%, which is close to certain.
To put it short - I doubt the documentation from Apple.
I have been looking for the same information and could not find any answers. The only pointer I have is that on Android, they are using 1σ:
http://developer.android.com/reference/android/location/Location.html#getAccuracy%28%29
To all the non-believers, this link also explains a little bit how the accuracy thing works.
My guess is, the same is true on iOS, but there is no way to be sure - except for asking the guy who wrote the code ;)
Edit:
After some playing around and checking location updates against the device's physical location, it seems more likely that it is 3σ on iOS. Two observations lead me to believe that is true:
On Android locations that come from WiFi triangulation are usually reported as having an accuracy between 20 and 50 meters. On iOS it's between 65 and 165 meters.
When measuring the distance between a reported location and the device's physical location, it has been within the reported accuracy every time so far.
The iOS documentation doesn't specify the probability of containment, but Android reports a one-sigma horizontal accuracy, which they define to represent a 68% probability that the true location is within the circle.
Their explanation is that location errors follow a normal distribution, and therefore +/- one-sigma represents 68% probability. However, 68% is the probability for a one-dimensional normal distribution. In two dimensions, a one-sigma error represents 39% probability of containment within a circle (the distance error follows a Rayleigh distribution, a.k.a. a chi distribution with two degrees of freedom).
There are two possible explanations.
1. The circle truly represents 68% probability of containment, in which case the Android developers have scaled the one-dimensional sigma by a factor of about 1.5 so that the circle happens to represent 68%. In this case, their choice of 68% is completely arbitrary.
2. The circle actually represents 39% probability of containment. In this case, their description would be correct if you replaced the one-dimensional Gaussian with a two-dimensional one and its associated probability.
I think the second explanation is more likely.
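For reference, the containment probabilities above follow from the Rayleigh CDF. A quick check under the circular-Gaussian assumption:

    import math

    def containment_probability(k):
        """P(true position within k*sigma of the estimate) when the 2D error
        is circular Gaussian: the distance error is Rayleigh distributed."""
        return 1.0 - math.exp(-k * k / 2.0)

    print(containment_probability(1.0))  # ~0.393: the one-sigma circle, 39%
    print(containment_probability(1.5))  # ~0.675: roughly the "68%" circle
    print(containment_probability(3.0))  # ~0.989: the three-sigma circle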
iOS: https://developer.apple.com/library/ios/documentation/CoreLocation/Reference/CLLocation_Class/index.html#//apple_ref/occ/instp/CLLocation/horizontalAccuracy
Android: http://developer.android.com/reference/android/location/Location.html#getAccuracy%28%29
This denotes the accuracy level of the location. For example, a horizontalAccuracy of 0 means high accuracy, while a horizontalAccuracy of 500 means low accuracy.
The location services provider updates the location based on the consolidated best value from cellular, WiFi (in the case of WiFi connections), and GPS. So the location value will oscillate depending on coverage. You can filter it by using this horizontalAccuracy.
A horizontal accuracy of X indicates that your horizontal position can be X meters off. Remember, location can be found using GPS, cell tower triangulation, or WiFi location data. CLLocationManager gives you the most accurate location from these 3 methods, and says there is a chance it may be off by at most X meters.
In what way is the documentation sparse?
The radius of uncertainty for the location, measured in meters. (read-only)
The location’s latitude and longitude identify the center of the circle, and this value indicates the radius of that circle. A negative value indicates that the location’s latitude and longitude are invalid.
So your location is within the circle. It isn't outside the circle, or the radius would be bigger. Your assumption about confidence intervals is incorrect.