Conversion from WGS84 to local coordinates using proj4, conserving angles and distances

I am trying to convert WGS84 point data from Google Maps into a local x,y reference frame in metres. I have looked at many posts on this site and others, but none of them explain how to define the local reference system so that true distances and angles are obtained.
To do this conversion, I am using the Proj4.js library with the following call:
var LocalProjection = "+proj=merc +lat_ts=43.6 +lon_0=3.9 +x_0=0 +y_0=0 +ellps=WGS84 +units=m +no_defs"; // defines the local reference system
proj4('WGS84', LocalProjection, Point);
However, when doing this, I get distances between points that do not match the ones I measure on Google Maps, so I believe there is an issue in my definition of the local reference system that I cannot figure out.
Would you have any clue about this, and especially about the parameters of the local projection?
Example:
I consider the following points, representing the outline of a typical building.
var PointTopLeft=[43.587778, 3.868792]
var PointTopRight=[43.587744, 3.868873]
var PointBottomRight=[43.587695, 3.868743]
var PointBottomLeft=[43.587666, 3.868822]
The distances between consecutive points in the above order are calculated as 7.0 m, 9.3 m, 7.5 m and 11 m respectively, using the Pythagorean formula on the differences between the projected coordinates.
The distance between TopLeft and BottomLeft, and between TopRight and BottomRight, should be 10 m according to measurements done in Google Earth. Similarly, the distance between TopLeft and BottomRight, and between TopRight and BottomLeft, should be 7.5 m.
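For what it's worth, here is a minimal sketch of how the projected-distance check might be written with Proj4.js; note that proj4 expects coordinates in [longitude, latitude] order, while the points above are written [latitude, longitude], so the components are swapped before projecting. Node-style loading is assumed (in the browser, proj4 is a global after including proj4.js).

var proj4 = require('proj4');

// Local projection string from the question, with scale true along lat_ts = 43.6.
var LocalProjection = "+proj=merc +lat_ts=43.6 +lon_0=3.9 +x_0=0 +y_0=0 +ellps=WGS84 +units=m +no_defs";

// proj4 expects [longitude, latitude]; the points above are written [latitude, longitude].
function toLocal(latLon) {
    return proj4('WGS84', LocalProjection, [latLon[1], latLon[0]]);
}

// Pythagorean distance on the projected (metre) coordinates.
function planarDistance(a, b) {
    var p = toLocal(a), q = toLocal(b);
    return Math.sqrt(Math.pow(q[0] - p[0], 2) + Math.pow(q[1] - p[1], 2));
}

console.log(planarDistance([43.587778, 3.868792], [43.587744, 3.868873])); // TopLeft -> TopRight, in metres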

Related

How to calculate the Horizontal and Vertical FOV for the KITTI cameras from the camera intrinsic matrix?

I would like to calculate the horizontal and vertical field of view from the camera intrinsic matrix for the cameras used in the KITTI dataset. I need the field of view in order to convert a depth map into 3D point clouds.
Though this question was asked quite a long time ago, I felt it needed an answer, as I ran into the same issue and was unable to find any information on it.
I have, however, solved it using the information available in this document and some more general camera calibration documents.
Firstly, we need to convert the supplied disparity into distance. This is done by first converting the disparity map into floats using the method in the dev_kit, where they state:
disp(u,v) = ((float)I(u,v))/256.0;
This disparity can then be converted into a distance using the standard stereo vision equation:
Depth = Baseline * Focal length / Disparity
Now come some tricky parts. I searched high and low for the focal length and was unable to find it in the documentation.
I realised just now, while writing this, that the baseline is documented in the aforementioned source; however, from section IV.B we can see that it can also be found indirectly in P(i)rect.
The P_rect matrices can be found in the calibration files and are used both for calculating the baseline and for the translation from uv in the image to xyz in the real world.
The steps are as follows:
For each pixel in the depth map:
xyz_normalised = P_rect \ [u, v, 1]
where u and v are the x and y coordinates of the pixel respectively,
which will give you an xyz_normalised of the form [x, y, z, 0] with z = 1.
You can then multiply it by the depth given at that pixel to obtain an xyz coordinate.
For completeness, as P_rect relates to the depth map here, you need to use P_3 from the cam_cam calibration txt files to get the baseline (as it contains the baseline between the colour cameras), while P_2 belongs to the left camera, which is used as the reference for the occ_0 files.
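For illustration, here is a minimal sketch (not the KITTI dev-kit itself) of the back-projection described above. The intrinsics fx, fy, cx, cy would be read from a P_rect matrix (P[0][0], P[1][1], P[0][2], P[1][2]) and the baseline derived from the fourth-column translation terms of the P_rect matrices (e.g. -P_rect_3[0][3] / fx relative to the reference camera); the numeric values below are placeholders, not real calibration data, and the small rectification translation terms are ignored.

// Placeholder intrinsics and baseline; substitute values from the calibration files.
var fx = 721.5, fy = 721.5, cx = 609.6, cy = 172.9;
var baseline = 0.54;                               // metres

function pixelToPoint(u, v, rawDisparity) {
    var disparity = rawDisparity / 256.0;          // dev_kit scaling: disp(u,v) = I(u,v)/256
    var depth = baseline * fx / disparity;         // Depth = Baseline * Focal length / Disparity
    var X = (u - cx) * depth / fx;                 // pinhole back-projection
    var Y = (v - cy) * depth / fy;
    return [X, Y, depth];                          // 3D point in the camera frame
}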

Path mapping using VectorNav VN100 IMU to map a route between two GPS coordinates

I'm trying to use a VectorNav VN100 IMU to map a path through an underground tunnel (a GPS-denied environment) and am wondering what the best approach is.
I get lots of data points from the VN100; these include orientation/pose (Euler angles, quaternions) and acceleration and gyroscope values in three dimensions. The acceleration and gyro values are given in raw and filtered formats, where the filtered outputs have been passed through an onboard Kalman filter.
In addition to the IMU measurements, I also measure GPS-RTK coordinates in three dimensions at the start and end points of the tunnel.
How should I approach this mapping problem? I'm quite new to this area and do not know how to extract position from the acceleration and orientation data. I know acceleration can be integrated once to give velocity, which in turn can be integrated again to give position, but how do I combine this with the orientation data (quaternions) to get the path?
In robotics, mapping means representing the environment using a perception sensor (such as a 2D/3D laser scanner or cameras).
Once you have the map, the robot can use it to know its location (localization). The map is also used to find a path between locations in order to move from one place to another (path planning).
In your case, you need a perception sensor to get a better location estimate. With only an IMU you can track the position using an Extended Kalman Filter (EKF), but it drifts quickly.
The Robot Operating System (ROS) has an EKF implementation you can refer to.
OK, so I came across a solution that gets me somewhat closer to my goal of finding the path travelled underground. Although it is by no means the final solution, I'm posting my algorithm here in the hope that it helps someone else.
My method is as follows:
Rotate the acceleration vector A = [Ax, Ay, Az] output by the VectorNav VN100 into the North, East, Down (NED) frame by multiplying by the quaternion the VectorNav outputs, Q = [q0, q1, q2, q3]. How to multiply a vector by a quaternion is outlined in this other post.
Basically, you take the acceleration vector and add a fourth component to act as the scalar term, then multiply by the quaternion and its conjugate (N.B. the scalar terms in both quantities should be in the same position; in this case the scalar quaternion term is the first term, so a zero scalar term should be added to the start of the acceleration vector, e.g. A = [0, Ax, Ay, Az]). Then perform the following multiplication:
A_ned = Q A Q*
where Q* is the complex conjugate of the quaternion (i, j, and k terms are negated).
Integrate the rotated acceleration vector to get the velocity vector: V_ned
Integrate the Velocity vector to get the position in north, east, down: R_ned
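As an illustration of the rotation and integration steps above, here is a minimal sketch. It assumes a constant sample interval dt, scalar-first quaternions quat[i] = [q0, q1, q2, q3], accelerometer samples acc[i] = [Ax, Ay, Az] with gravity already removed, and zero initial velocity and position; the function names are mine, not VectorNav's.

// Hamilton product of two scalar-first quaternions.
function quatMultiply(a, b) {
    return [
        a[0]*b[0] - a[1]*b[1] - a[2]*b[2] - a[3]*b[3],
        a[0]*b[1] + a[1]*b[0] + a[2]*b[3] - a[3]*b[2],
        a[0]*b[2] - a[1]*b[3] + a[2]*b[0] + a[3]*b[1],
        a[0]*b[3] + a[1]*b[2] - a[2]*b[1] + a[3]*b[0]
    ];
}

// A_ned = Q * [0, A] * Q*  (rotate a body-frame vector into NED).
function rotateToNed(q, a) {
    var qConj = [q[0], -q[1], -q[2], -q[3]];
    var r = quatMultiply(quatMultiply(q, [0, a[0], a[1], a[2]]), qConj);
    return [r[1], r[2], r[3]];
}

// Rotate every sample, then integrate twice (acceleration -> velocity -> position).
function integratePath(acc, quat, dt) {
    var v = [0, 0, 0], p = [0, 0, 0], path = [];
    for (var i = 0; i < acc.length; i++) {
        var aNed = rotateToNed(quat[i], acc[i]);
        for (var k = 0; k < 3; k++) {
            v[k] += aNed[k] * dt;
            p[k] += v[k] * dt;
        }
        path.push(p.slice());
    }
    return path;
}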
There is substantial drift in the velocity and position due to sensor bias. This can be corrected for somewhat if we know the start and end velocities and the start and end positions. In this case the start and end velocities were zero, so I used this to correct the drift in the velocity vector.
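A minimal sketch of that linear drift correction, assuming the true velocity is zero at both the start and the end of the run (the function name is mine):

// Subtract a linearly growing share of the end-of-run velocity residual from every sample.
function removeVelocityDrift(vel) {                // vel[i] = [Vn, Ve, Vd]
    var n = vel.length, end = vel[n - 1];
    return vel.map(function (v, i) {
        var f = i / (n - 1);                       // fraction of the run completed
        return [v[0] - f * end[0], v[1] - f * end[1], v[2] - f * end[2]];
    });
}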
(Figures: uncorrected velocity vs. corrected velocity.)
My final comparison between the IMU position and the GPS is shown here (read: there's still a long way to go).
(Figure: GPS-RTK data vs. VectorNav IMU data.)
Now I just need to come up with a sensor fusion algorithm to try to improve the position estimation...

How to calculate Altitude using GPS latitude and longitude

How do I calculate altitude from GPS latitude and longitude values? What is the exact mathematical equation to solve this problem?
For a given lat/lon it is possible to determine the height of the ground (above sea level, or above the reference ellipsoid). But since the earth's surface, mountains, etc. do not follow a mathematical formula, there are laser scans, performed by satellites, that have measured such heights, e.g. every 30 meters. So there exist files where you can look up such a height. This is called a Digital Elevation Model, or DEM for short.
Read more here: https://en.wikipedia.org/wiki/Digital_elevation_model
Such files are huge, so very few applications use that approach. Many just take the altitude value as delivered by the GPS receiver.
You can find some charts with altitude data, like Maptech's. Each pixel has corresponding lat, long, and alt/depth information.
As @AlexWien said, these files are huge and most of them must be bought.
If you are interested in using these files, I can help you with C++ code to read them.
You can calculate the geocentric radius, i.e., the radius of the reference ellipsoid that is used as the basis for the GPS altitude. It can be calculated from the GPS latitude with this formula:
R(lat) = sqrt( ((a^2 * cos(lat))^2 + (b^2 * sin(lat))^2) / ((a * cos(lat))^2 + (b * sin(lat))^2) )
where a = 6378137 m and b = 6356752.3142 m are the WGS84 semi-major and semi-minor axes.
Read more about this at Wikipedia.
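As a sketch, the same geocentric-radius formula in code, using the WGS84 semi-major and semi-minor axes (the function name is mine):

// Geocentric radius of the WGS84 reference ellipsoid at a given latitude, in metres.
function geocentricRadius(latDeg) {
    var a = 6378137.0, b = 6356752.3142;           // WGS84 semi-major / semi-minor axes
    var phi = latDeg * Math.PI / 180;
    var c = Math.cos(phi), s = Math.sin(phi);
    var num = Math.pow(a * a * c, 2) + Math.pow(b * b * s, 2);
    var den = Math.pow(a * c, 2) + Math.pow(b * s, 2);
    return Math.sqrt(num / den);
}

console.log(geocentricRadius(45.0));               // roughly 6367490 m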

GEOS C API - calculating areas with WGS84 coords (SRID=4326)

I create a polygon where each x/y point is a WGS84-format lat/long value.
The polygons are good approximations to circles and sectors of radius R (each circumference/arc point is a projected lat/long value at distance R from a centre/apex coordinate, which I have verified is correct by computing the haversine distance between the edge and reference points and getting a value of R back).
I use GEOSSetSRID(4326) to indicate the coords are in WGS84 format, and GEOSGetSRID() confirms the SRID is set.
Use of GEOSArea then gives a value not even remotely close to the expected value.
I do not see what else I can do programmatically.
If I set the points in Cartesian format and then set the SRID to 4326, will GEOS implicitly convert the polygon points to WGS84?
Is the basic GEOS C API incapable of doing the above?
Does the SRID have no meaning to the API at all?
Any info/pointers to correct usage/solutions would be much appreciated.
TIA.
The distance that is given is effectively in degrees between the two points. In actuality, the GEOS API (at least the C++ interface) is units-agnostic; the units it gives the distance in are whatever you passed in.
In general, multiplying the result by 111000 gives you a fairly accurate measurement in meters. For area, you have to multiply by 111000^2.
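As a rough sketch of that conversion (the function name is mine; the cos(latitude) factor accounts for degrees of longitude being shorter than degrees of latitude away from the equator):

// Convert an area in square degrees to square metres near a given latitude.
function squareDegreesToSquareMetres(areaInSquareDegrees, latitudeDeg) {
    var metresPerDegree = 111000;                  // approximate length of one degree of latitude
    var lonScale = Math.cos(latitudeDeg * Math.PI / 180);
    return areaInSquareDegrees * metresPerDegree * metresPerDegree * lonScale;
}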

How to compute direction between 2 locations

Similar to Direction between 2 Latitude/Longitude points in C#, but in Objective-C.
I would also like a formula that works for large distances near the poles, if possible.
You'll need the following complete but rather difficult stuff. A slightly easier description is found on Wikipedia.
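For reference, here is a minimal sketch of the standard great-circle initial-bearing (forward azimuth) formula, shown in JavaScript for illustration; the function name and 0-360 degree convention are my own choices, and it remains valid for long distances and high latitudes.

// Initial bearing from point 1 to point 2, in degrees clockwise from north.
function initialBearing(lat1, lon1, lat2, lon2) {
    var toRad = Math.PI / 180;
    var phi1 = lat1 * toRad, phi2 = lat2 * toRad;
    var dLambda = (lon2 - lon1) * toRad;
    var y = Math.sin(dLambda) * Math.cos(phi2);
    var x = Math.cos(phi1) * Math.sin(phi2) - Math.sin(phi1) * Math.cos(phi2) * Math.cos(dLambda);
    var theta = Math.atan2(y, x) * 180 / Math.PI;  // -180..180, relative to north
    return (theta + 360) % 360;                    // normalise to 0..360
}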
Or you could save yourself a lot of time and use CLLocation's distanceFromLocation method:
distanceFromLocation:
Returns the distance (in meters) from the receiver’s location to the specified location.
Discussion
This method measures the distance between the two locations by tracing a line between them that follows the curvature of the Earth. The resulting arc is a smooth curve and does not take into account specific altitude changes between the two locations.
http://developer.apple.com/library/ios/DOCUMENTATION/CoreLocation/Reference/CLLocation_Class/CLLocation/CLLocation.html#//apple_ref/occ/instm/CLLocation/distanceFromLocation: