Conversion of quaternion to Euler angles for pitch crossing 90 degrees

I am using 9-axis IMU fused orientation data and I need to convert the quaternion pose to Euler angles for the purpose of graphing the data for the user. I want all three angles to be in the range [-180, 180], and I found a lot of references on how to do that easily.
Roll and yaw work well, but when the pitch crosses +90 degrees it starts going back towards -90 and then back to 0 as I increase the Y component of my quaternion. This video explains it better.
You can also see the raw data from the sensor in this image:
There are in total three full rotations, one around each of the axes, one after another. The first two (solid red and green lines) are OK. The blue one (mislabelled "roll" on the plot, but actually the pitch rotation) does well until it reaches -90 degrees and then flips direction until +90 and goes back to 0. While this happens, both of the other axes give wrong readings (very fast changes).
Is there a way of circumventing this issue around ±90 degrees and computing the absolute pose in the world frame?
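
What you are seeing is the pitch singularity of the Euler parameterization itself, not a conversion bug: in the usual Z-Y-X (yaw-pitch-roll) convention, pitch comes out of an asin and therefore can never leave [-90, 90]. A minimal sketch of that standard conversion, assuming a unit quaternion in (w, x, y, z) order:

```python
import math

def quaternion_to_euler(w, x, y, z):
    """Unit quaternion (w, x, y, z) -> Z-Y-X Euler angles (roll, pitch, yaw) in degrees."""
    # Roll about X: full (-180, 180] range via atan2
    roll = math.atan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y))

    # Pitch about Y: asin only returns [-90, 90], which is exactly why the
    # plotted pitch folds back at +-90 degrees
    s = max(-1.0, min(1.0, 2.0 * (w * y - z * x)))  # clamp against numeric drift
    pitch = math.asin(s)

    # Yaw about Z: full (-180, 180] range via atan2
    yaw = math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))

    return tuple(math.degrees(a) for a in (roll, pitch, yaw))
```

The simultaneous jumps on the other two axes are the same effect: an orientation with pitch beyond 90 degrees is represented as roll and yaw shifted by 180 degrees with the pitch reflected. If you need a fold-free plot of that rotation, either choose an Euler sequence whose ±90 degree singularity sits on an axis you rotate through less, or graph a different representation (the quaternion components or axis-angle) instead.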

Related

How to calculate a rotation matrix for rotation from the GPS co-ordinate system to the SLAM co-ordinate system

In our SLAM algorithm, we want to rotate GPS co-ordinates into the SLAM co-ordinate system so that those GPS positions can be used for bundle adjustment in monocular SLAM (scale correction).
We have used the following procedure for the rotation (sketched in code below).
Convert Lat, Long, Alt GPS co-ordinates to XYZ co-ordinates using the Geoconverter library (resetting the origin to be the same as the SLAM co-ordinate system's origin).
Then calculate the rotation matrix between the XYZ GPS co-ordinates and the SLAM co-ordinates using the Rodrigues rotation formula for the first vector after the origin.
Then, for each XYZ GPS position, use the rotation matrix to calculate the GPS co-ordinates in the SLAM co-ordinate system.
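
For reference, a minimal sketch of steps 2 and 3 as described, assuming hypothetical arrays gps_xyz and slam_xyz of positions that share their origin at index 0:

```python
import numpy as np

def rotation_between(u, v):
    """Rodrigues formula: rotation matrix taking the direction of u onto v."""
    u = u / np.linalg.norm(u)
    v = v / np.linalg.norm(v)
    axis = np.cross(u, v)
    s = np.linalg.norm(axis)   # sin(theta)
    c = np.dot(u, v)           # cos(theta)
    if s < 1e-12:
        if c > 0:
            return np.eye(3)   # already aligned
        raise ValueError("anti-parallel vectors: pick a perpendicular axis explicitly")
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]]) / s   # unit-axis skew matrix
    return np.eye(3) + s * K + (1.0 - c) * (K @ K)

# Hypothetical example data: (N, 3) positions sharing their origin at index 0
gps_xyz = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [2.0, 1.0, 0.0]])
slam_xyz = np.array([[0.0, 0.0, 0.0], [0.0, 1.0, 0.0], [-1.0, 2.0, 0.0]])

# Step 2: rotation from the first vector pair after the shared origin
R = rotation_between(gps_xyz[1], slam_xyz[1])
# Step 3: rotate every GPS position into the SLAM frame
gps_in_slam = gps_xyz @ R.T
```

Note that a single vector pair leaves the rotation about that vector undetermined; the Rodrigues construction silently picks the minimal rotation, which in general will not bring the rest of the GPS plane into the SLAM plane. Fitting the rotation over all point pairs (e.g. a Kabsch/Umeyama-style alignment) avoids that ambiguity.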
However, the results are not as expected: the GPS co-ordinate plane does not align with the SLAM co-ordinates, hence bundle adjustment fails.
To check the actual position of the GPS co-ordinates after rotation, we plotted them (as shown below).
[Plots: front view (GPS in olive, SLAM in blue, MapPoints in black); front view without MapPoints; side view showing GPS and SLAM lie in different planes; side view with MapPoints; side view from a different angle]
Can you point out the mistake in the process, or suggest an alternative process to align the GPS data?
Thanks
Saurabh Kumar

Interpolating rotation around an axis - quaternions and slerp vs linear interpolation of angles

I'm making a simple 3D game where I've got some boats colliding and changing directions to avoid each other.
Part of the collision handling is built around bouncing and then diverting the heading direction slightly (boat A hits boat B, A bounces back, then rotates say 10 degrees to the left and resumes movement).
So far, I've just updated the heading direction instantly, which looks a bit abrupt. I intend to interpolate from the old heading to the new one. It is very simple: the heading is always just an angle around one axis, so basically it's going from, say, 90 degrees to 110 degrees.
I'm aware of quaternions and slerp, which would give me a constant velocity (my rotation should be silky smooth). But I end up feeling like that's using a sledgehammer to kill a fly. What is really the consequence of just doing regular vanilla linear interpolation from 90 to 110 for the rotation angle? Will it even be visually noticeable that I have used quaternions instead of the much simpler and much cheaper linear interpolation of angle values? I have no special important "key frames" that need to be hit - there is no animation data at all; the 3D models are static.
So if someone could shed some light on what potential problems I could run into if I just interpolate the rotation degrees instead of using slerp, it would be much appreciated.
Thanks
/Jim
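
For a rotation about a single fixed axis, slerp between the endpoint quaternions and linear interpolation of the angle trace exactly the same orientations at the same constant rate, so for this use case plain lerp is fine; problems only appear if the path crosses the ±180 degree wrap-around or if several axes get composed. A minimal sketch illustrating the equivalence (hypothetical Y-up heading, using scipy):

```python
import numpy as np
from scipy.spatial.transform import Rotation, Slerp

start_deg, end_deg = 90.0, 110.0
t = np.linspace(0.0, 1.0, 5)

# Plain linear interpolation of the heading angle
lerp_deg = start_deg + (end_deg - start_deg) * t

# Slerp between the corresponding single-axis quaternions
key_rots = Rotation.from_euler("y", [start_deg, end_deg], degrees=True)
slerp_deg = Slerp([0.0, 1.0], key_rots)(t).as_euler("yxz", degrees=True)[:, 0]

print(np.allclose(lerp_deg, slerp_deg))  # True: identical headings
```

If the wrap-around case can occur (e.g. turning from 170 degrees to -170 degrees), interpolate the shortest signed angular difference instead of the raw values and lerp stays correct.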

Detecting Angle using Kinect

I have a flat pan and am using a Kinect v1.
I want to measure the angle of the pan using the Kinect camera.
For example: if I place the pan at a 45-degree angle, the Kinect should read the closest or exact angle at which it is placed.
Is this possible, or are there any solutions?
Thanks.
I don't know exactly how the data comes back in Kinect V1, but I believe this methodology should work for you.
First: You have to assume that the Kinect is your level of reference; if you need the pan's angle relative to the ground, then make sure the Kinect is level with the ground.
Second: Separate the pan data from all other data. This should be straightforward: the pan should be the closest object, so convert the closest measurements into 3D coordinate points (an array of x, y, z).
Third: Assuming you want the horizontal angle, find the highest and lowest groups of data points and average their depths from the camera. Then save both those depths and the vertical distance between them.
Fourth: Now you can essentially do the math for a triangle, as sketched below. Given that you know the width of the pan (it saves steps to know the object's size; otherwise you have to estimate that too), you can solve a triangle with side a: distance to point 1, side b: distance to point 2, and side c: width of the pan. Finding the angle where sides a and c (or b and c) meet will give you the horizontal angle of the pan relative to the Kinect.
Fifth: To verify that your measurements came back correct, you can then use the angle you found to calculate the width of the pan from the angle and the distances to the topmost and bottommost points.
Needless to say, you need to make sure that your understanding of trig is solid for this task.
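
A minimal sketch of the triangle step with the law of cosines, using hypothetical measurements:

```python
import math

def pan_angle_deg(a, b, c):
    """Angle (degrees) at the vertex where sides a and c meet, i.e. the
    angle opposite side b, via the law of cosines.
    a: distance from the Kinect to point 1 (top of the pan)
    b: distance from the Kinect to point 2 (bottom of the pan)
    c: known width of the pan (distance between points 1 and 2)
    """
    cos_angle = (a * a + c * c - b * b) / (2.0 * a * c)
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_angle))))

# Hypothetical depth measurements in metres
a, b, c = 1.20, 1.35, 0.30
print(f"pan angle relative to the sight line: {pan_angle_deg(a, b, c):.1f} deg")

# Fifth step (verification): measure the angle between the two sight lines
# from the Kinect's field-of-view geometry, then recompute the pan width as
#   c_check = sqrt(a**2 + b**2 - 2*a*b*cos(camera_angle))
# and compare it with the known width c.
```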

Optical Flow egomotion estimation

Below you can see the result of the optical flow when a camera makes a translational movement. If the camera makes a roll rotation, the result looks like the second picture. Is it possible to retrieve the yaw angle of a camera if it only rotates around the yaw axis?
I think you can recognize in the optical flow whether the camera is rotating around the yaw axis (z-axis), but I don't know how to retrieve how much the camera has rotated.
I would be grateful for any hints. Thanks
[Pictures: optical flow under translation; optical flow under roll rotation; orientation of the camera]
If you have a pure rotation of your camera then you can use findHomography. You need four point correspondences between your pictures. For a pure rotation the homography matrix (in normalized camera coordinates) is already a rotation matrix; otherwise you need to decompose the homography matrix. For a camera movement with 6 DOF you can use findEssentialMat and decompose it into translation and rotation.
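
A minimal, self-contained sketch of that idea with OpenCV and synthetic data (the intrinsics K and the 12-degree rotation are hypothetical): for a purely rotating camera H = K R K⁻¹ up to scale, so conjugating the homography by K recovers the rotation and the yaw can be read off.

```python
import cv2
import numpy as np

# Hypothetical intrinsics and a known 12-degree rotation about the optical (z) axis
K = np.array([[600.0, 0.0, 320.0],
              [0.0, 600.0, 240.0],
              [0.0, 0.0, 1.0]])
theta = np.radians(12.0)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])

# Synthesize matched pixels: for a pure rotation, x2 ~ K @ R @ inv(K) @ x1
pts1 = np.random.default_rng(0).uniform(100.0, 500.0, (20, 2))
h2 = (K @ R_true @ np.linalg.inv(K) @ np.hstack([pts1, np.ones((20, 1))]).T).T
pts2 = h2[:, :2] / h2[:, 2:]

# At least four correspondences are needed, as noted above
H, _ = cv2.findHomography(pts1.astype(np.float32), pts2.astype(np.float32))

# Undo the conjugation by K and the homography's arbitrary scale
R = np.linalg.inv(K) @ H @ K
R /= np.cbrt(np.linalg.det(R))

yaw_deg = np.degrees(np.arctan2(R[1, 0], R[0, 0]))
print(yaw_deg)  # ~12.0
```

With real images you would get pts1/pts2 from feature matching and, since matches are noisy, pass cv2.RANSAC to findHomography.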

Remove gravity from IMU accelerometer

I've found this beautiful quick way to remove gravity from accelerometer readings. However, I have a 6-DOF IMU (xyz gyro, xyz accel, no magnetometer), so I am not sure whether I can use this code (I tried, and it doesn't work correctly).
How would someone remove the gravity component? It's a big obstacle because I can't proceed with my project.
EDIT:
What I have:
a quaternion depicting the orientation of the aircraft (obtained using an Extended Kalman Filter)
acceleration sensor readings (unfiltered; axes aligned with the plane's body frame; gravity is also incorporated in these readings)
What I want:
remove the gravity
correct (rotate) the accelerometer readings so their axes are aligned with the Earth frame of reference's axes
read the acceleration towards the Earth (now the Z component of the accelerometer)
Basically I want to read the acceleration towards the Earth no matter how the plane is oriented! But the first step is to remove gravity, I guess.
UPDATE: OK, so what you need is to rotate a vector with a quaternion. See here or here.
You rotate the measured acceleration vector with the quaternion (corresponding to the orientation), then you subtract gravity [0, 0, 9.81] (you may have -9.81 depending on your sign conventions) from the result. That's all.
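A minimal sketch of exactly that, using scipy and assuming the quaternion rotates body-frame vectors into the world frame (note scipy wants (x, y, z, w) order; some filters hand you the inverse convention):

```python
import numpy as np
from scipy.spatial.transform import Rotation

GRAVITY = np.array([0.0, 0.0, 9.81])  # sign depends on your conventions

def linear_acceleration(quat_xyzw, accel_body):
    """Rotate body-frame accelerometer readings into the world frame and
    remove gravity; quat_xyzw is the EKF orientation in scipy's order."""
    r = Rotation.from_quat(quat_xyzw)   # body -> world, per our assumption
    accel_world = r.apply(accel_body)   # measurement expressed in world axes
    return accel_world - GRAVITY        # index [2] is the acceleration towards Earth

# Sanity check: a level, stationary sensor that reads +g on its z axis
print(linear_acceleration([0.0, 0.0, 0.0, 1.0], [0.0, 0.0, 9.81]))  # ~[0 0 0]
```

If the stationary result comes out near 2g instead of zero, the quaternion is most likely the world-to-body rotation; apply its inverse (r.inv().apply(...)) instead.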
I have implemented sensor fusion for Shimmer 2 devices based on this manuscript; I highly recommend it. It only uses accelerometers and gyroscopes, no magnetometer, and does exactly what you are looking for.
The resource you link to in your question is misleading: it relies on the quaternion that comes from sensor fusion. In other words, somebody already did the heavy lifting for you and prepared the gravity compensation in advance.