CMAttitudeReferenceFrameXTrueNorthZVertical doesn't point north - Objective-C

I am using this code to read the device attitude. The roll and pitch values look good (they are 0 when the device is flat on the table), but when yaw is 0 the compass points west. What's wrong?
[self.motionManager startDeviceMotionUpdatesUsingReferenceFrame:CMAttitudeReferenceFrameXTrueNorthZVertical
                                                        toQueue:[NSOperationQueue currentQueue]
                                                    withHandler:^(CMDeviceMotion *motion, NSError *error) {
    [self performSelectorOnMainThread:@selector(handleDeviceMotion:) withObject:motion waitUntilDone:YES];
}];
- (void)handleDeviceMotion:(CMDeviceMotion *)motion {
    CMAttitude *attitude = motion.attitude;
    double yaw   = attitude.yaw   * 180 / M_PI;
    double pitch = attitude.pitch * 180 / M_PI;
    double roll  = attitude.roll  * 180 / M_PI;
    self.xLabel.text = [NSString stringWithFormat:@"%7.4f", yaw];
    self.yLabel.text = [NSString stringWithFormat:@"%7.4f", pitch];
    self.zLabel.text = [NSString stringWithFormat:@"%7.4f", roll];
    [self sendDeviceAttitudeLogWithYaw:yaw pitch:pitch roll:roll];
}

When yaw is 0, the phone should point north, no?
What do you mean by "the phone"? The question is: which part of the phone? Let's read the docs:
CMAttitudeReferenceFrameXTrueNorthZVertical
Describes a reference frame in which the Z axis is vertical and the X axis points toward true north.
The X axis runs out the right side of the device. So you should expect yaw to be 0 when the right side of the device is north.

Related

The yaw (angle) value is not stable; it drifts by a few degrees. Does anyone know how to solve this?

I am currently working on an iOS project where I use motion data.
I get good results for the pitch and roll values, but the yaw value drifts constantly. I have applied a Kalman filter, but the results remain the same.
Does anyone have an idea how to solve this?
Here is some source code (Objective-C):
[self.motionManager startDeviceMotionUpdatesUsingReferenceFrame:CMAttitudeReferenceFrameXArbitraryCorrectedZVertical
                                                        toQueue:[NSOperationQueue currentQueue]
                                                    withHandler:^(CMDeviceMotion *motion, NSError *error)
{
    //NSString *yaw = [NSString stringWithFormat:@" %.3f", motion.attitude.yaw];
    NSString *pitch = [NSString stringWithFormat:@" %.3f", motion.attitude.pitch];
    NSString *roll = [NSString stringWithFormat:@" %.3f", motion.attitude.roll];

    // Converting the NSString variables into doubles
    //double a_yaw = [yaw doubleValue];
    double a_pitch = [pitch doubleValue];
    double a_roll = [roll doubleValue];

    CMQuaternion quat = self.motionManager.deviceMotion.attitude.quaternion;
    double yaw = 180/M_PI * asin(2*quat.x*quat.y + 2*quat.w*quat.z);

    // Kalman filtering
    static float q = 0.1; // process noise
    static float r = 0.1; // sensor noise
    static float p = 0.1; // estimated error
    static float k = 0.5; // Kalman filter gain
    float x = motionLastYaw;
    p = p + q;
    k = p / (p + r);
    x = x + k*(yaw - x);
    p = (1 - k)*p;
    motionLastYaw = x;

    // Converting angles to degrees
    //yaw = yaw * 180/M_PI;
    a_pitch = a_pitch * 180/M_PI;
    a_roll = a_roll * 180/M_PI;
}];
The "yaw" value needs a reference to an additional coordinate system (the camera, for example).

CMMotionManager changes from cached reference frame not working as expected

When I call startDeviceMotionUpdatesUsingReferenceFrame, then cache a reference to my first reference frame and call multiplyByInverseOfAttitude on all my motion updates after that, I don't get the change from the reference frame that I am expecting. Here is a really simple demonstration of what I'm not understanding.
self.motionQueue = [[NSOperationQueue alloc] init];
self.motionManager = [[CMMotionManager alloc] init];
self.motionManager.deviceMotionUpdateInterval = 1.0/20.0;
[self.motionManager startDeviceMotionUpdatesUsingReferenceFrame: CMAttitudeReferenceFrameXArbitraryZVertical toQueue:self.motionQueue withHandler:^(CMDeviceMotion *motion, NSError *error){
[[NSOperationQueue mainQueue] addOperationWithBlock:^{
CMAttitude *att = motion.attitude;
if(self.motionManagerAttitudeRef == nil){
self.motionManagerAttitudeRef = att;
return;
}
[att multiplyByInverseOfAttitude:self.motionManagerAttitudeRef];
NSLog(@"yaw:%+0.1f, pitch:%+0.1f, roll:%+0.1f", att.yaw, att.pitch, att.roll);
}];
}];
First off, in my application I only really care about pitch and roll. But yaw is in there too to demonstrate my confusion.
Everything works as expected if I lay the phone flat on my desk, launch the app, and look at the logs. The yaw, pitch, and roll values are all 0.0, and if I spin the phone 90 degrees without lifting it off the surface, only the yaw changes. So all good there.
To demonstrate what I think is the problem: now put the phone inside (for example) an empty coffee mug, so that all of the angles are slightly tilted and gravity has some fractional value along every axis. Launch the app, and with the code above you would think everything is working, because yaw, pitch, and roll again all start at 0.0. But now spin the coffee mug 90 degrees without lifting it from the table surface. Why do I see a significant change in all of the yaw, pitch, and roll values? Since I cached my initial attitude (which is now my new reference attitude) and called multiplyByInverseOfAttitude:, shouldn't I just be getting a change in the yaw?
I don't really understand why multiplying by the inverse of a cached reference attitude doesn't work, and I don't think it is a gimbal-lock problem. But here is what gets me exactly what I need. If you try the coffee mug experiment described above, this provides exactly the expected results: spinning the mug on a flat surface doesn't affect the pitch and roll values, and tilting the mug in any other direction affects only one axis at a time. Also, instead of saving a reference frame, I just save the reference pitch and roll, so when the app starts everything is zeroed out until there is some movement.
So all good now. But I still wish I understood why the other method did not work as expected.
self.motionQueue = [[NSOperationQueue alloc] init];
self.motionManager = [[CMMotionManager alloc] init];
self.motionManager.deviceMotionUpdateInterval = 1.0/20.0;
[self.motionManager startDeviceMotionUpdatesUsingReferenceFrame: CMAttitudeReferenceFrameXArbitraryZVertical toQueue:self.motionQueue withHandler:^(CMDeviceMotion *motion, NSError *error)
{
[[NSOperationQueue mainQueue] addOperationWithBlock:^{
if(self.motionManagerAttitude == nil){
CGFloat x = motion.gravity.x;
CGFloat y = motion.gravity.y;
CGFloat z = motion.gravity.z;
refRollF = atan2(y, x) + M_PI_2;
CGFloat r = sqrtf(x*x + y*y + z*z);
refPitchF = acosf(z/r);
self.motionManagerAttitude = motion.attitude;
return;
}
CGFloat x = motion.gravity.x;
CGFloat y = motion.gravity.y;
CGFloat z = motion.gravity.z;
CGFloat rollF = refRollF - (atan2(y, x) + M_PI_2);
CGFloat r = sqrtf(x*x + y*y + z*z);
CGFloat pitchF = refPitchF - acosf(z/r);
//I don't care about yaw, so just printing out whatever the value is in the attitude
NSLog(@"yaw: %+0.1f, pitch: %+0.1f, roll: %+0.1f", (180.0f/M_PI)*motion.attitude.yaw, (180.0f/M_PI)*pitchF, (180.0f/M_PI)*rollF);
}];
}];

How To Create a Rotating Wheel Control with UIKit

Hi, I'm trying to create a rotating wheel in iOS, and I found this fantastic tutorial:
How to Create a Rotation Wheel Control
It is very nice and complete, but in its example the selected object is on the left, and I need the object on the right.
So I'm wondering if somebody knows what I need to change in order to select the right side.
Well, in the example we can see the following code in the endTrackingWithTouch:withEvent: handler:
// 1 - Get current container rotation in radians
CGFloat radians = atan2f(container.transform.b,container.transform.a);
NSLog(@"Radians %f", radians);
// 2 - Initialize new value
CGFloat newVal = 0.0;
// 3 - Iterate through all the sectors
for (SMSector *s in sectors) {
// 4 - Check for anomaly (occurs with even number of sectors)
if (s.minValue > 0 && s.maxValue < 0) {
if (s.maxValue > radians || s.minValue < radians) {
// 5 - Find the quadrant (positive or negative)
if (radians > 0) {
newVal = radians - M_PI;
} else {
newVal = M_PI + radians;
}
currentSector = s.sector;
}
}
// 6 - All non-anomalous cases
else if (radians > s.minValue && radians < s.maxValue) {
newVal = radians - s.midValue;
currentSector = s.sector;
}
}
Doing the math on the radians and comparing them against each sector's min and max values gives us the selected sector. Also, if I change CGFloat radians = atan2f(container.transform.b, container.transform.a); to CGFloat radians = atan2f(container.transform.d, container.transform.c); I'm able to get the sector from the bottom.
I think you can simply put your wheel inside another view and rotate that view by π. Something like this:
UIView *testView = [[UIView alloc]initWithFrame:CGRectMake(10, 80,300, 300)];
SMRotaryWheel *wheel = [[SMRotaryWheel alloc] initWithFrame:CGRectMake(0, 0,300, 300)
andDelegate:self
withSections:5];
[testView addSubview:wheel];
testView.transform = CGAffineTransformMakeRotation(M_PI);
[self.view addSubview:testView];

Converting meters to miles inaccurate result

I'm trying to convert the meters travelled into miles. The problem I'm having is that the results are completely inaccurate and I'm having trouble finding out why.
Here is the code I'm using:
CLLocation* newLocation = [locations lastObject];
NSTimeInterval age = -[newLocation.timestamp timeIntervalSinceNow];
if (age > 120) return;
if (newLocation.horizontalAccuracy < 0) return;
if (self.oldLocation == nil || self.oldLocation.horizontalAccuracy < 0) {
self.oldLocation = newLocation;
return;
}
CLLocationDistance distance = [newLocation distanceFromLocation: self.oldLocation];
NSLog(@"%6.6f/%6.6f to %6.6f/%6.6f for %2.0fm, accuracy +/-%2.0fm",
self.oldLocation.coordinate.latitude,
self.oldLocation.coordinate.longitude,
newLocation.coordinate.latitude,
newLocation.coordinate.longitude,
distance,
newLocation.horizontalAccuracy);
NSLog(@"distance is %f", distance);
totalDistanceBetween =+ (distance * 0.000621371192);
NSString *cycleDistanceString = [[NSString alloc]
                                 initWithFormat:@"%f meters",
                                 totalDistanceBetween];
_CurrentDistance.text = cycleDistanceString;
self.oldLocation = newLocation;
Can anyone give me an idea where I'm going wrong?
Thanks
Your code says this:
totalDistanceBetween =+ (distance * 0.000621371192);
That uses the unary + operator, which does nothing. It seems likely that you meant this:
totalDistanceBetween += (distance * 0.000621371192);
Also, the units of distance are meters (because that is what distanceFromLocation: returns), and the units of totalDistanceBetween are miles, because 0.000621371192 miles = 1 meter. But in the next statement, your format string says %f meters. It should say %f miles.

know the position of the finger in the trackpad under Mac OS X

I am developing a Mac application and I would like to know the position of the finger on the trackpad when there is a touch.
Is this possible and, if so, how?
Your view needs to be set to accept touches ([self setAcceptsTouchEvents:YES]). When you get a touch event like -touchesBeganWithEvent:, you can figure out where the finger lies by looking at its normalizedPosition (range is [0.0, 1.0] x [0.0, 1.0]) in light of its deviceSize in big points (there are 72 bp per inch). The lower-left corner of the trackpad is treated as the zero origin.
So, for example:
- (id)initWithFrame:(NSRect)frameRect {
self = [super initWithFrame:frameRect];
if (!self) return nil;
/* You need to set this to receive any touch event messages. */
[self setAcceptsTouchEvents:YES];
/* You only need to set this if you actually want resting touches.
* If you don't, a touch will "end" when it starts resting and
* "begin" again if it starts moving again. */
[self setWantsRestingTouches:YES];
return self;
}
/* One of many touch event handling methods. */
- (void)touchesBeganWithEvent:(NSEvent *)ev {
NSSet *touches = [ev touchesMatchingPhase:NSTouchPhaseBegan inView:self];
for (NSTouch *touch in touches) {
/* Once you have a touch, getting the position is dead simple. */
NSPoint fraction = touch.normalizedPosition;
NSSize whole = touch.deviceSize;
NSPoint wholeInches = {whole.width / 72.0, whole.height / 72.0};
NSPoint pos = wholeInches;
pos.x *= fraction.x;
pos.y *= fraction.y;
NSLog(@"%s: Finger is touching %g inches right and %g inches up "
      @"from lower left corner of trackpad.", __func__, pos.x, pos.y);
}
}
(Treat this code as an illustration, not as tried and true, battle-worn sample code; I just wrote it directly into the comment box.)
Swift 3:
I've written an extension to NSTouch that returns the trackpad-touch pos, relative to an NSView:
extension NSTouch {
    /**
     * Returns the position of the touch relative to the view.
     * NOTE: normalizedPosition is the relative location on the trackpad; values range from 0 to 1 and are y-flipped.
     * TODO: debug whether the touch area is correct by drawing a rect with a green stroke
     */
    func pos(_ view: NSView) -> CGPoint {
        let w = view.frame.size.width
        let h = view.frame.size.height
        /* Flip the touch coordinates so y grows in the same direction as the view. */
        let touchPos = CGPoint(x: self.normalizedPosition.x, y: 1 - self.normalizedPosition.y)
        let deviceSize: CGSize = self.deviceSize
        let deviceRatio = deviceSize.width / deviceSize.height /* aspect ratio of the trackpad */
        let viewRatio = w / h
        var touchArea = CGSize(width: w, height: h)
        /* Uniform-shrink the device to the view frame */
        if deviceRatio > viewRatio { /* device is wider than view */
            touchArea.height = h / viewRatio
            touchArea.width = w
        } else if deviceRatio < viewRatio { /* view is wider than device */
            touchArea.height = h
            touchArea.width = w / deviceRatio
        } /* else the ratios are the same */
        /* Center the touchArea in the view. */
        let touchAreaPos = CGPoint(x: (w - touchArea.width) / 2, y: (h - touchArea.height) / 2)
        return CGPoint(x: touchPos.x * touchArea.width + touchAreaPos.x,
                       y: touchPos.y * touchArea.height + touchAreaPos.y)
    }
}
Here is an article I wrote about my GestureHUD class on macOS, with a link to a ready-made extension as well: http://eon.codes/blog/2017/03/15/Gesture-HUD/
I don't know if there's an Objective-C interface, but you might find the C HID Class Device Interface interesting.
At the Cocoa (Objective-C) level, try the following, although remember that many users are still using a mouse:
http://developer.apple.com/mac/library/documentation/cocoa/conceptual/EventOverview/HandlingTouchEvents/HandlingTouchEvents.html