Kinect joint coordinate values change in the opposite way

I placed a Kinect v2, stood in front of it, and moved my arm up in front of me and then back down (e.g. forward flexion).
I found that the y-coordinate of my wrist joint changed from large (0.17) to small (0.11) and back to around 0.16.
I found this strange, because in the Kinect guide the positive y-axis points upward.
https://msdn.microsoft.com/en-us/library/hh973078.aspx
It seems like the wrist's y-coordinate should be larger when the arm is raised, yet I am getting the opposite result. Can anyone comment on this?
Q. Are we supposed to get a decreasing y-value of the wrist when it is moving upward?
Q. If not, does anyone have any idea why this happens?
Q. In addition, I found that my other wrist (the left one) has a negative value. Can anyone comment on why the left wrist has a negative value?

That happens because of the Kinect v2 reference system. The center of the sensor corresponds to (0, 0, 0) in (x, y, z).
If you move to the left or right along the X axis, or up and down along the Y axis, it is perfectly normal to get negative values on one side of the origin.
The origin (x=0, y=0, z=0) is located at the center of the IR sensor on Kinect
X grows to the sensor’s left
Y grows up (note that this direction is based on the sensor’s tilt)
Z grows out in the direction the sensor is facing
1 unit = 1 meter
https://msdn.microsoft.com/en-us/library/dn785530.aspx
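As a minimal illustration of that convention (the helper function and sample numbers below are made up for this answer, not Kinect SDK calls), a camera-space joint position can be read directly against the origin at the IR lens:

```python
# Minimal sketch of the Kinect v2 camera-space convention described above.
# Assumes you already extracted a joint position (x, y, z) in meters from the
# body frame; the describe() helper and sample values are purely illustrative.

def describe(x, y, z):
    """Interpret a camera-space point relative to the sensor origin."""
    side = "sensor's left" if x > 0 else "sensor's right"
    height = "above" if y > 0 else "below"
    print(f"{abs(x):.2f} m to the {side}, "
          f"{abs(y):.2f} m {height} the sensor, "
          f"{z:.2f} m in front of it")

# Example: a wrist slightly above sensor height, to one side, 1.5 m away.
describe(x=-0.20, y=0.17, z=1.50)
```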
Hope this helps you.

Related

Convert a Lat/Lon coordinate on a map based on preset coordinates

First off, I am not sure if this is the right place so I apologize if this belongs elsewhere - please let me know if it does. I am currently doing some prototyping with this in VB so that's why I come here first.
My Goal
I am trying to make a program to be able to log different types of information for a video game that I play. I would like to be able to map out the entire game with my program and add locations for mobs, resources, etc.
What I have
The in-game map can be downloaded, so I have literally just stuck it in as a background image on the form (just for now). The downloaded map is not exactly as the map appears in the game, though, since the game adds extra water around everything when you scroll around. This makes it a bit tricky to match up where the origin of the in-game map is compared to where it would be on the downloaded map.
The nice thing though is that while I am in the game I can print my current coordinates to the screen. So I thought that maybe I can somehow use this to get the right calculation for the rest of the points on the map.
Here is an example image I will refer to now:
In the above map you will see a dotted bounding box. This is an invisible box in the game: once you move your mouse outside of it, the longitude and latitude readout no longer shows. This is what I meant above when I said I can't find the exact point of origin for the in-game map.
You will also see 2 points: A and B. In the game there are teleporters. This is what I would use to get the most accurate position possible. I am thinking I can find the position (in game) of point A and point B and then somehow calculate that into a conversion for my mouse drag event in VB.
In VB the screen starts at the top-left, which is 0,0. I did already try to get the 2 points like this and just add or subtract the numbers from the x and y pixel position of the mouse, but it didn't quite line up right.
So with all this information does anyone know if it is possible to write a lon/lat conversion to pixels based on this kind of data?
I appreciate any thoughts and suggestions and if you need any clarification of any information I have posted please let me know and I will be happy to expand on it. I am really hoping I can get this solved!
Thanks!
EDIT:
I also want to mention that I am not sure whether there is an exact pixel-to-lat/lon ratio for the in-game map, i.e. the in-game map could be 1 pixel = 100 latitude units or something. So I might also need to figure out what that conversion factor is?
Some clarifications about converting between pixel locations and 'latitude and longitude'.
First, the map in your game is in a geometric coordinate system, which means everything lies in 2D and you can measure the distance between two points directly from their pixel positions.
But when we talk about longitude and latitude, we are actually talking about a geographic coordinate system, which is a 3D model of the surface of the earth. Every map of the earth is reduced from 3D to 2D through a step called projection, like Google Maps or your GPS. In this projection process the 3D model is converted to a 2D model, but some part of the map is always distorted, so the same distance in pixels on a map can correspond to different lengths in reality.
So if you don't care about that accuracy, you can treat the geometric point as the geographic point. Otherwise, you need to use a GIS library to handle geodesic distances and compute the geographic point based on the projection coordinate system.
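If treating the game map as flat is acceptable, the whole conversion reduces to a per-axis scale and offset that you can derive from the two teleporter points A and B, since their coordinates are known in both systems; this also covers the EDIT, because the pixels-per-latitude factor falls out of the two reference points. A rough sketch (in Python rather than VB, with made-up point values):

```python
# Sketch of a flat (non-projected) conversion between in-game lat/lon and
# screen pixels, derived from two reference points A and B whose coordinates
# are known in both systems. All numbers below are placeholders.

def make_converters(px_a, game_a, px_b, game_b):
    """Return (pixel -> game, game -> pixel) conversion functions."""
    sx = (game_b[0] - game_a[0]) / (px_b[0] - px_a[0])  # lon units per pixel
    sy = (game_b[1] - game_a[1]) / (px_b[1] - px_a[1])  # lat units per pixel
                                                        # (negative if screen Y grows down)
    def pixel_to_game(px, py):
        return (game_a[0] + (px - px_a[0]) * sx,
                game_a[1] + (py - px_a[1]) * sy)

    def game_to_pixel(lon, lat):
        return ((lon - game_a[0]) / sx + px_a[0],
                (lat - game_a[1]) / sy + px_a[1])

    return pixel_to_game, game_to_pixel

# Teleporter A and B: (pixel x, pixel y) and (in-game lon, lat) -- made up.
to_game, to_pixel = make_converters((120, 430), (-350.0, 210.0),
                                    (780, 95),  (415.0, 980.0))
print(to_game(450, 260))   # mouse position -> in-game coordinates
```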

Detecting Angle using Kinect

I have a flat pan and am using Kinect v1.
I want to get the angle of the pan using the Kinect camera.
For example, if I place the pan at 45 degrees, the Kinect should read the closest or exact angle it is placed at.
Is this possible, or are there any solutions?
Thanks.
I don't know exactly how the data comes back in Kinect v1, but I believe this methodology should work for you.
First: You have to assume that the Kinect is your reference level; if you need the pan's angle relative to the ground, make sure the Kinect is level with the ground.
Second: Separate the pan data from all other data. This should be straightforward: the pan should be the closest object, so convert the closest measurements into 3D coordinate points (an array of x, y, z).
Third: Assuming you want the horizontal angle, find the highest and lowest points of the data and average their depths from the camera. Then save both of those depths and the vertical distance between them.
Fourth: Now you can essentially do the math for a triangle. Given that you know the width of the pan (knowing the object's size saves steps, otherwise you have to estimate that too), you can solve a triangle with side a: distance to point 1, side b: distance to point 2, side c: size of the pan. Finding the angle where sides a and c (or b and c) meet will give you the angle of the pan relative to the Kinect.
Fifth: To verify that your measurements came back correct, you can then use the angle you found to calculate the width of the pan from the angle and the distances to the top-most and bottom-most points.
Needless to say, you need to make sure that your understanding of trig is solid for this task.
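As a rough sketch of that trigonometry: the version below skips the law-of-cosines triangle and uses the two 3D points directly, which carries the same information. It assumes the Kinect is level, and the sample coordinates are made up.

```python
import math

# Sketch: estimate the pan's tilt from the highest and lowest 3D points
# segmented out of the depth data (steps 2-3 above).

def pan_tilt_degrees(top, bottom):
    """Angle of the pan from horizontal, given two (x, y, z) points in meters."""
    dy = abs(top[1] - bottom[1])   # vertical separation of the two edge points
    dz = abs(top[2] - bottom[2])   # difference in depth from the camera
    return math.degrees(math.atan2(dy, dz))

# A pan whose far edge is 0.20 m higher and 0.20 m closer than its near edge
# reads as ~45 degrees.
print(pan_tilt_degrees(top=(0.0, 0.35, 1.10), bottom=(0.0, 0.15, 1.30)))
```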

Center of Kinect Coordinate System

I am working on a Kinect project and need to know the center of the coordinate system of Kinect. I have searched and found no definite working diagrams or spec sheet regarding this. Can anyone help me on this problem?
The coordinate origin should be the location of your sensor (the IR lens, I assume). Everything it sees in its frustum is then viewed as follows:
X axis is horizontal
Y axis is vertical
Z axis is depth (close objects have a small depth value/far objects have a larger depth value)
The depth unit with most SDKs is millimeters
You can use a 4x4 transformation matrix to define your own custom coordinate system and make adjustments as you need them (e.g. consider the origin to be in a different location, etc.)
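For example, here is a sketch of such a 4x4 transform; the numbers are arbitrary (they just move the origin to a floor 0.8 m below the sensor and mirror X):

```python
import numpy as np

# Sketch of using a 4x4 homogeneous transform to re-express a Kinect camera-space
# point (meters, origin at the IR lens) in a custom frame of your choosing.

T = np.array([
    [-1, 0, 0, 0.0],   # flip X
    [ 0, 1, 0, 0.8],   # put the origin on a floor 0.8 m below the sensor
    [ 0, 0, 1, 0.0],   # keep depth as-is
    [ 0, 0, 0, 1.0],
])

def to_custom_frame(point_xyz):
    """Apply the 4x4 transform to an (x, y, z) camera-space point."""
    x, y, z = point_xyz
    return (T @ np.array([x, y, z, 1.0]))[:3]

print(to_custom_frame((0.3, 0.0, 1.5)))   # -> [-0.3, 0.8, 1.5]
```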

Remove gravity from IMU accelerometer

I've found this beautiful quick way to remove gravity from accelerometer readings. However, I have a 6dof IMU (xyz gyro, xyz accel, no magnetometer) so I am not sure if I can use this code (I tried and it doesn't work correctly).
How would someone remove the gravity component? It's a big obstacle because I can't proceed with my project.
EDIT:
What I have:
a quaternion describing the orientation of the aircraft (obtained using an Extended Kalman Filter)
acceleration sensor readings (unfiltered; axes aligned with the aircraft; gravity is also present in these readings)
What I want:
remove the gravity
correct (rotate) the accelerometer readings so that its axes are aligned with the axes of the earth's frame of reference
read the acceleration towards earth (now Z component of accelerometer)
Basically I want to read the acceleration towards earth no matter how the plane is oriented! But first step is to remove gravity I guess.
UPDATE: OK, so what you need is to rotate a vector with a quaternion. See here or here.
You rotate the measured acceleration vector with the quaternion (corresponding to the orientation), then you subtract gravity [0, 0, 9.81] (you may have -9.81 depending on your sign conventions) from the result. That's all.
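A minimal sketch of those two steps, assuming a (w, x, y, z) quaternion that rotates body-frame vectors into the earth frame (adjust the sign of gravity to your own convention):

```python
import numpy as np

# Rotate the body-frame accelerometer reading into the earth frame using the
# orientation quaternion from the EKF, then subtract gravity.

def rotate_by_quaternion(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    R = np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])
    return R @ np.asarray(v, dtype=float)

def linear_acceleration(q_body_to_earth, accel_body, g=9.81):
    """Accelerometer reading with gravity removed, expressed in the earth frame."""
    accel_earth = rotate_by_quaternion(q_body_to_earth, accel_body)
    return accel_earth - np.array([0.0, 0.0, g])

# Level and at rest: the accelerometer reports +9.81 on Z, which cancels out.
print(linear_acceleration((1.0, 0.0, 0.0, 0.0), (0.0, 0.0, 9.81)))  # ~[0, 0, 0]
```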
I have implemented sensor fusion for Shimmer 2 devices based on this manuscript, I highly recommend it. It only uses accelerometers and gyroscopes but no magnetometer, and does exactly what you are looking for.
The resource you link to in your question is misleading. It relies on the quaternion that comes from sensor fusion. In other words, somebody already did the heavy lifting for you, already prepared the gravity compensation for you.

Working with Accelerometer

I am working on gestures using acceleration values (x, y, z) from a device.
If I hold the device in my hand in a resting position, (x, y, z) = (0, 0, 0). But if I change the orientation of the device (still at rest), the values change to something like (766, 766, 821), as all the x, y, z axes have changed compared to their original orientations.
Is there any way (a trigonometric function or something else) to resolve this issue?
The acceleration due to gravity will always be present. It appears you are subtracting that value from one of the axes when the device is in a particular orientation.
What you will need to do to detect gestures is to detect the tiny difference that momentarily appears, on top of the acceleration due to gravity, as the device begins moving. You won't be able to detect whether the device is stationary or moving at a constant velocity, but you will be able to determine whether it is turning or being accelerated.
The (x, y, z) values give you a vector, which gives the direction of the acceleration. You can compute the (square of the) length of this vector as x^2 + y^2 + z^2. If this is the same as when the device is at rest, then you know the device is unaccelerated, but in a certain orientation. (Either at rest, or moving at a constant velocity.)
To detect movement, you need to notice the momentary change in the length of this vector as the device begins to move, and again when it is brought to a stop. This change will likely be tiny compared to gravity.
You will need to compare the orientation of the acceleration vector during the movement to determine the direction of the motion. Note that you won't be able to distinguish every gesture. For example, moving the device forward (and stopping there) has the same effect as tilting the device slightly, and then bringing it back to the same orientation.
The easier gestures to detect are those which change the orientation of the device. Other gestures, such as a punching motion, will be harder to detect. They will show up as a change in the length of the acceleration vector, but the amount of change will likely be tiny.
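A rough sketch of that length test; the rest length and threshold here are assumptions to tune for your device, and the values are in m/s^2 for readability (raw counts work the same way once zeroed):

```python
import math

# Compare the length of the (already zeroed) acceleration vector against the
# at-rest length. A reading whose length matches gravity means "unaccelerated,
# in some orientation"; a momentary deviation means speeding up or slowing down.

REST_LENGTH = 9.81     # length of the vector when the device is unaccelerated
THRESHOLD = 0.6        # how much deviation counts as movement (tune per device)

def is_accelerating(x, y, z):
    length = math.sqrt(x*x + y*y + z*z)
    return abs(length - REST_LENGTH) > THRESHOLD

print(is_accelerating(0.0, 0.0, 9.81))   # False: at rest (or constant velocity)
print(is_accelerating(0.0, 4.0, 9.81))   # True: being pushed sideways
```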
EDIT:
The above discussion is for normalized values of x, y, and z. You will need to determine the values to subtract from the readings to get the vector. From a comment above, it looks like 766 is the "zero" value to subtract, but it might differ between the axes on your device. Measure the readings with the device oriented in all six directions, i.e. get the maximum and minimum values for x, y, and z. The central values should be halfway between the extremes (and hopefully 766).
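A sketch of that calibration step (the six sample readings below are made up):

```python
# Record raw readings with the device held still in all six orientations
# (+X up, -X up, +Y up, ...), then take the midpoint of each axis's extremes
# as its zero value and half the spread as the reading corresponding to 1 g.

def zero_and_scale(readings):
    """readings: list of (x, y, z) raw tuples covering all six orientations."""
    offsets, scales = [], []
    for axis in range(3):
        values = [r[axis] for r in readings]
        lo, hi = min(values), max(values)
        offsets.append((hi + lo) / 2.0)   # raw value that means "zero"
        scales.append((hi - lo) / 2.0)    # raw counts per 1 g
    return offsets, scales

def normalize(raw, offsets, scales):
    return tuple((r - o) / s for r, o, s in zip(raw, offsets, scales))

# Six made-up still readings, one per orientation.
samples = [(1022, 766, 821), (510, 766, 821), (766, 1020, 821),
           (766, 512, 821), (766, 766, 1077), (766, 766, 565)]
offsets, scales = zero_and_scale(samples)
print(normalize((766, 766, 1077), offsets, scales))   # ~(0, 0, 1) -> 1 g on Z
```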
Certain gestures will have telltale signatures.
Dropping the device will reduce the acceleration vector momentarily, then increase it momentarily as the device is brought to a stop.
Raising the device will increase the vector momentarily, before decreasing it momentarily.
A forward motion will increase the vector momentarily, but tilt it slightly forward, then increase it again momentarily, but tilted backward, as the device is brought to a stop.
Most of the time the length of the vector will equal the acceleration due to gravity.
If the device is not compensating automatically for the gravitational acceleration, you need to subtract the (0, 0, ~9.8 m/s^2) vector from the output of the device.
However, you will also need the orientation of the device (Euler angles or a rotation matrix). If your device isn't providing that, it's basically impossible to tell whether the signaled acceleration is caused by actually moving the device (linear acceleration) or by simply rotating it (gravity changing direction).
Your compensated acceleration will become:
OutputAcc = InputAcc x RotMat - (0,0,9.8)
This way your OutputAcc vector will always be in a fixed coordinate frame (i.e. Z is always up).
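A rough sketch of that formula, with the rotation matrix built from roll/pitch/yaw Euler angles (one of the two orientation forms mentioned above); angle order and sign conventions vary between IMUs, so treat this as an illustration rather than a drop-in:

```python
import numpy as np

def rotation_matrix(roll, pitch, yaw):
    """Body-to-fixed rotation matrix from Euler angles (Z-Y-X convention)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def compensated(accel_body, roll, pitch, yaw, g=9.8):
    # OutputAcc = InputAcc rotated by RotMat, minus (0, 0, g)
    return rotation_matrix(roll, pitch, yaw) @ np.asarray(accel_body) - np.array([0, 0, g])

# Device rolled 90 degrees so its Y axis points up: gravity appears on body Y
# and still cancels once rotated into the fixed frame.
print(compensated((0.0, 9.8, 0.0), roll=np.pi/2, pitch=0.0, yaw=0.0))  # ~[0, 0, 0]
```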
I find your question unclear. What exactly do you measure and what do you expect?
In general, an accelerometer held in a fixed position will measure the gravity of the earth. This is reported as an acceleration upwards, which might sound strange at first but is completely correct: gravity accelerates things "down", and since the device is held in a fixed position, some force in the opposite direction, i.e. "up", must be applied. The force you need to hold the device in a fixed position is this force, and it has a corresponding acceleration in the "up" direction.
Depending on your device, this gravity acceleration might be subtracted before you get the values on the PC. But if you turn the accelerometer, the gravity acceleration is still there and still points in the same "up" direction. If, before turning the accelerometer, "up" corresponded to the x axis, it will correspond to a different axis after turning 90°, say y. Thus, the measured acceleration on both the x and y axes will change.
So to answer your question it's necessary to know how your accelerometer presents the values. I doubt that in a resting position the acceleration values measured are (0, 0, 0).
Your comment makes your question clearer. What you need to do is calibrate your accelerometer every time the orientation changes. There is no getting around this. You could make it a UI element in your application, or, if it fits your use case, recalibrate to 0 whenever the acceleration has been relatively constant for some amount of time (this won't work if you measure long accelerations).
Calibration is either built into the device's API (check the documentation) or something you have to do manually. To do it manually, read the current acceleration and store those 3 values. Then, whenever you take a reading from the device, subtract those 3 stored values from the corresponding read values.
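A tiny sketch of that manual procedure; read_raw is a stand-in for whatever call your device's API actually provides:

```python
# Capture one reading while the device is at rest in the orientation you care
# about, then subtract it from every subsequent reading.

class CalibratedAccelerometer:
    def __init__(self, read_raw):
        self.read_raw = read_raw   # callable returning a raw (x, y, z) tuple
        self.offset = (0, 0, 0)

    def calibrate(self):
        """Call while the device is held still; stores the current reading."""
        self.offset = self.read_raw()

    def read(self):
        """Return the current reading with the stored offsets subtracted."""
        raw = self.read_raw()
        return tuple(r - o for r, o in zip(raw, self.offset))
```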