CoreMotion won't give me roll, pitch and yaw - objective-c

Before UIAccelerometer was deprecated from iOS I used the x, y and z data from that class to calculate pitch, roll and yaw. I also had to do some filtering, but now I see that with the CoreMotion framework I can get these values from the CMAttitude class. I would really like to use these properties, but somehow I fail to do so.
Now, what I have done is to instantiate:
CMMotionManager *motionManager;
CMDeviceMotion *deviceMotion;
CMAttitude *attitude;
...
deviceMotion = motionManager.deviceMotion;
attitude = deviceMotion.attitude;
motionManager.accelerometerUpdateInterval = 0.065; // 65ms
[motionManager startAccelerometerUpdates];
I am able to read x, y and z from motionManager.accelerometerData.acceleration.<x, y or z>, but trying to read from attitude.<roll, pitch or yaw> gives me 0.
NSLog(@"Roll: %f", attitude.roll); // = 0
I read out the values in a method triggered by a continuous timer every 100 ms.
Any ideas on what I'm missing?

In order to use deviceMotion.attitude you have to call [motionManager startDeviceMotionUpdates].
startAccelerometerUpdates provides accelerometer data only, just as startGyroUpdates will only give you gyroData. Note that device motion data is more than just accelerometer and gyro data: the two are combined (sensor fusion) to achieve more precision.
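For illustration, a minimal sketch of the block-based variant (the names and the update interval are carried over from the question; delivering updates to the main queue is an assumption):
CMMotionManager *motionManager = [[CMMotionManager alloc] init];
motionManager.deviceMotionUpdateInterval = 0.065; // ~65ms, as in the question
[motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue mainQueue]
                                   withHandler:^(CMDeviceMotion *motion, NSError *error) {
    if (motion != nil) {
        CMAttitude *attitude = motion.attitude;
        NSLog(@"Roll: %f Pitch: %f Yaw: %f", attitude.roll, attitude.pitch, attitude.yaw);
    }
}];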


Objective-C Continuous Control Loop

I am trying to create a flight using the DJI SDK via Virtual Sticks. I've worked out how to fly the drone in the direction of a GPS coordinate by using the atan2 function to calculate the bearing between two GPS coordinates, then yawing the drone to that angle and pitching to move in that direction.
I want to re-calculate that compass bearing, yaw and pitch every couple of seconds to account for wind, drift, etc., but I don't want to run it on the main thread in case it blocks the UI, or a UI interaction causes the timer not to fire and a calculation is missed.
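For reference, the bearing computation being described is presumably something like the following hedged sketch (initial great-circle bearing between two coordinates; the function name is mine, not a DJI SDK call):
#import <CoreLocation/CoreLocation.h>
#include <math.h>

// Returns the initial bearing in degrees (0 = north, positive = clockwise toward east)
static double BearingBetweenCoordinates(CLLocationCoordinate2D from, CLLocationCoordinate2D to) {
    double lat1 = from.latitude * M_PI / 180.0;
    double lat2 = to.latitude * M_PI / 180.0;
    double dLon = (to.longitude - from.longitude) * M_PI / 180.0;
    double y = sin(dLon) * cos(lat2);
    double x = cos(lat1) * sin(lat2) - sin(lat1) * cos(lat2) * cos(dLon);
    return atan2(y, x) * 180.0 / M_PI; // range -180..180 degrees
}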
Should I use dispatch_source_set_timer in conjunction with GCD, or is there a better method to achieve this and avoid memory leaks? Sample code below:
Code sample taken from another question's answer:
// Create a dispatch source that'll act as a timer on a concurrent global queue
dispatch_source_t dispatchSource = dispatch_source_create(DISPATCH_SOURCE_TYPE_TIMER, 0, 0,
    dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0));

// Set up params for creation of a recurring timer
double interval = 2.0;
dispatch_time_t startTime = dispatch_time(DISPATCH_TIME_NOW, 0);
uint64_t intervalTime = (uint64_t)(interval * NSEC_PER_SEC);
dispatch_source_set_timer(dispatchSource, startTime, intervalTime, 0);

// Attach the block you want to run on each timer fire
dispatch_source_set_event_handler(dispatchSource, ^{
    // code to calculate bearing and send Virtual Stick commands to yaw and pitch the drone
});

// Start the timer
dispatch_resume(dispatchSource);

// ----

// When you want to stop the timer, you need to suspend the source
dispatch_suspend(dispatchSource);

// If on iOS 5 and/or using MRC, you'll need to release the source too
dispatch_release(dispatchSource);
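One caveat worth adding: the dispatch source must be kept alive or the timer will stop firing, so store it in something with a strong reference rather than a local variable, e.g. (the property name is illustrative):
@property (nonatomic, strong) dispatch_source_t bearingTimer;
Under ARC (iOS 6 and later) dispatch objects are memory-managed automatically, which is why the dispatch_release call is only needed on older deployment targets or under MRC.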

Cocos2D project with tileMap. Invalid positions returned

I have a project in which I am trying to implement a pathfinding class but I have stumbled upon an issue I can't seem to fix.
A video of the error as it occurs can be found here:
https://www.dropbox.com/s/25ajb0mc5p4a3a9/Tilemap%20Error.mov
The error message thrown is: Assertion failure in -[CCTMXLayer tileGIDAt:withFlags:], /Users/Dave/Desktop/AI_DISS_new copy/AI_DISS/libs/cocos2d/CCTMXLayer.m:304
Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: 'TMXLayer: invalid position'.
The point is returned in pixel values, while the map and layer sizes are returned in tile values, so I tried dividing the variables holding pixel values by the tile width or height, but nothing seems to have worked.
The GID for tileAt:point comes back too high when pixel values are used (understandably), and when the values are converted to tile values, the search runs for a very long time, eventually moving off the map and searching continuously until Xcode crashes.
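For what it's worth, the conversion that cocos2d tilemap tutorials typically use looks like the sketch below (orthogonal maps; note the flipped y-axis, which a plain division misses; _tileMap is the CCTMXTiledMap from the code further down):
- (CGPoint)tileCoordForPosition:(CGPoint)position {
    NSInteger x = position.x / _tileMap.tileSize.width;
    // TMX tile coordinates start at the top-left, so flip the y-axis
    NSInteger y = ((_tileMap.mapSize.height * _tileMap.tileSize.height) - position.y)
                  / _tileMap.tileSize.height;
    return ccp(x, y);
}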
I have tried to explain this in as much detail as possible; I apologize if this isn't the kind of question best suited to this forum. I would really appreciate any assistance.
If you need further details, please ask.
The original instructions for using the class were on the cocos2d forums, which have just had an overhaul; the thread appears to have been moved or deleted, so I can't quote it, but I am sure the tilemap (a TMX tilemap) and layer attributes have been passed in correctly.
From my HelloWorldLayer class the values are:
// Pathfinding related code
////////////////////////////
_tileMap = [CCTMXTiledMap tiledMapWithTMXFile:@"NewMap.tmx"];
_pathFinder = [[AStar alloc] initWithTileMap:_tileMap groundLayer:@"Collision"]; // collidable tiles
[_pathFinder setCollideKey:@"Collidable"]; // defaults to COLLIDE
[_pathFinder setCollideValue:@"True"]; // defaults to 1
_pathFinder2 = [[AStar alloc] initWithTileMap:_tileMap groundLayer:@"Collision"]; // collidable tiles
[_pathFinder2 setCollideKey:@"Collidable"]; // defaults to COLLIDE
[_pathFinder2 setCollideValue:@"True"]; // defaults to 1
/////////////////////////////
and the move method is called as:
/////////////// player path finding related code //////////////////
[_pathFinder moveSprite:_player from:s to:f atSpeed:0.003];
///////////// end player path finding related code //////////////
Thanks in advance.
The GID-checking code, as discussed in the comments below:
if (tileGid)
{
    // check the tile for the collide property
    NSDictionary *dict = [tileMap propertiesForGID:tileGid];
    if (dict)
    {
        // check the tile for the collideKey and return YES on a match
        NSString *collide = [dict valueForKey:collideKey];
        if (collide && [collide compare:collideValue] == NSOrderedSame)
            return YES;
    }
}

iOS: Deprecation of AudioSessionInitialize and AudioSessionSetProperty

I'm very new to Objective-C and am trying to update some code that's about 3 years old to work with iOS 7. There are two instances of AudioSessionSetProperty and AudioSessionInitialize in the code:
1:
- (void)applicationDidFinishLaunching:(UIApplication *)application {
    AudioSessionInitialize(NULL, NULL, NULL, NULL);
    [[SCListener sharedListener] listen];
    timer = [NSTimer scheduledTimerWithTimeInterval:0.5 target:self selector:@selector(tick:) userInfo:nil repeats:YES];
    // Override point for customization after app launch
    [window addSubview:viewController.view];
    [window makeKeyAndVisible];
}
And 2:
- (id)init {
    if ((self = [super init]) == nil) {
        return nil;
    }
    AudioSessionInitialize(NULL, NULL, NULL, NULL);
    Float64 rate = kSAMPLERATE;
    UInt32 size = sizeof(rate);
    AudioSessionSetProperty(kAudioSessionProperty_PreferredHardwareSampleRate, size, &rate);
    return self;
}
For some reason this code works on iOS 7 in the simulator but not on a device running iOS 7, and I suspect these deprecations are the cause. I've been reading through the docs and related questions on this website, and it appears that I need to use AVAudioSession instead. I've been trying to update the code for a long time now, and I'm unsure how to properly switch over to AVAudioSession. Does anyone know how these two methods above need to look?
Side note: I've managed to hunt down an article that outlines the transition:
https://github.com/software-mariodiana/AudioBufferPlayer/wiki/Replacing-C-functions-deprecated-in-iOS-7
But I can't seem to apply this to the code above.
The code I'm trying to update is a small frequency detection app from git:
https://github.com/jkells/sc_listener
Alternatively, if someone could point me to a sample demo app that can detect frequencies on iOS devices, that would be awesome.
As you have observed, pretty much all of the old Core Audio AudioSession functions have been deprecated in favour of AVAudioSession.
The AVAudioSession is a singleton object which will get initialised when you first call it:
[AVAudioSession sharedInstance]
There is no separate initialize method. But you will want to activate the audio session:
BOOL activated = [[AVAudioSession sharedInstance] setActive:YES error:&error];
As regards setting the hardware sample rate using AVAudioSession, please refer to my answer here:
How can I obtain the native (hardware-supported) audio sampling rates in order to avoid internal sample rate conversion?
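To make the shape of the change concrete, here is a hedged sketch of what the second snippet might become (kSAMPLERATE is the constant from the original code; the category choice is an assumption that depends on what SCListener needs):
- (id)init {
    if ((self = [super init]) == nil) {
        return nil;
    }
    NSError *error = nil;
    AVAudioSession *session = [AVAudioSession sharedInstance];
    [session setCategory:AVAudioSessionCategoryPlayAndRecord error:&error];
    [session setPreferredSampleRate:kSAMPLERATE error:&error]; // iOS 6+; replaces the old property
    [session setActive:YES error:&error];
    return self;
}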
For other comparisons between the Core Audio audio session functions and AVFoundation's AVAudioSession, here are some of my other answers around the same topic:
How Do I Route Audio to Speaker without using AudioSessionSetProperty?
use rear microphone of iphone 5
Play audio through upper (phone call) speaker
How to control hardware mic input gain/level on iPhone?
I wrote a short tutorial that discusses how to update to the new AVAudioSession objects. I posted it on GitHub: "Replacing C functions deprecated in iOS 7."

How to access animated GIF's frames

I have an animated GIF successfully loaded into an NSData or NSBitmapImageRep object.
I've figured how to return data like the number of frames in that gif using:
NSNumber *frames = [bitmapRep valueForProperty:@"NSImageFrameCount"];
However, I'm a bit confused as to how I can actually access that frame as its own object.
I think one of these two methods will help, but I'm not actually sure how they'll get the individual frame for me.
+ representationOfImageRepsInArray:usingType:properties:
– representationUsingType:properties:
Any help appreciated. Thanks
I've figured how to return data like the number of frames in that gif using:
NSNumber *frames = [bitmapRep valueForProperty:@"NSImageFrameCount"];
However, I'm a bit confused as to how I can actually access that frame as its own object.
To access a specific frame indexOfFrame (0 <= indexOfFrame < [frames intValue]), you only need to set NSImageCurrentFrame and you are done. There is no need to use CG functions or make copies of frames; you can stay in the object-oriented Cocoa world. A small example shows the duration of all GIF frames:
NSNumber *frames = [bitmapRep valueForProperty:@"NSImageFrameCount"];
if (frames != nil) {   // bitmapRep is a GIF imageRep
    for (NSUInteger i = 0; i < [frames intValue]; i++) {
        [bitmapRep setProperty:NSImageCurrentFrame
                     withValue:[NSNumber numberWithUnsignedInteger:i]];
        NSLog(@"%2lu duration=%@",
              (unsigned long)i, [bitmapRep valueForProperty:NSImageCurrentFrameDuration]);
    }
}
Another example: write all frames of a GIF image as PNG files to the filesystem:
NSNumber *frames = [bitmapRep valueForProperty:@"NSImageFrameCount"];
if (frames != nil) {   // bitmapRep is a GIF imageRep
    for (NSUInteger i = 0; i < [frames intValue]; i++) {
        [bitmapRep setProperty:NSImageCurrentFrame
                     withValue:[NSNumber numberWithUnsignedInteger:i]];
        NSData *repData = [bitmapRep representationUsingType:NSPNGFileType
                                                  properties:nil];
        [repData writeToFile:
            [NSString stringWithFormat:@"/tmp/gif_%02lu.png", (unsigned long)i] atomically:YES];
    }
}
I've figured how to return data like the number of frames in that gif using:
NSNumber *frames = [bitmapRep valueForProperty:@"NSImageFrameCount"];
However, I'm a bit confused as to how I can actually access that frame as its own object.
As far as I know, you can't—not from an NSBitmapImageRep.
Instead, create a CGImageSource from the GIF data, and use CGImageSourceCreateImageAtIndex to extract each frame (preferably as you need it).
Alternatively, you might try setting the NSImageCurrentFrame property. If you need a rep for each frame, make as many copies as there are frames (minus one, since you have the original), and set each rep's current frame to a different number. But I haven't tried that, so I'm not sure it will actually work.
Basically, NSBitmapImageRep's GIF support is weird, so you should just use CGImageSource.
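To flesh that out, a hedged sketch of the CGImageSource route, assuming gifData is the NSData mentioned in the question (the ImageIO framework must be linked):
#import <ImageIO/ImageIO.h>

CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)gifData, NULL);
size_t frameCount = CGImageSourceGetCount(source);
for (size_t i = 0; i < frameCount; i++) {
    CGImageRef frame = CGImageSourceCreateImageAtIndex(source, i, NULL);
    // Wrap the frame in a rep (or draw it) before releasing it
    NSBitmapImageRep *frameRep = [[NSBitmapImageRep alloc] initWithCGImage:frame];
    CGImageRelease(frame);
    // ... use frameRep ...
}
CFRelease(source);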
I think one of these two methods will help, but I'm not actually sure how they'll get the individual frame for me.
+ representationOfImageRepsInArray:usingType:properties:
– representationUsingType:properties:
No, those methods are for serializing an image (or image rep). They're for writing data out, not reading it in. (Notice what constants those methods expect in their type parameters.)
If you want to have a look at some working source code for a GIF decoder for iOS (it works on Mac OS X too), you can find AVGIF89A2MvidResourceLoader.m on GitHub. The approach is to use the ImageIO framework and call CGImageSourceCreateWithData() along with CGImageSourceCreateImageAtIndex() to get access to the Nth GIF image in the file. But there are some tricky details that might not be obvious, related to detecting whether a transparent pixel appears in the GIF, and to writing the results to a file so you don't run out of memory if the GIF is really long.

CALayer opacity animation

I want to create a CALayer animation that gives sort of a 'flashy' effect. For that I'm trying to animate the 'opacity' property, but my problem is that I have no idea where to start and how to do it.
Here is a graphical explanation of the animation:
opacity
| ___
1 | | |
| | | * repeatCount
0 |___| |_ . . .
-------------------------> time
|______|
duration
The opacity starts at 0, then animates to 1, then to 0 again (this 0-to-1-to-0 animation takes a number of seconds equal to duration). Then this process is repeated 'repeatCount' times.
Here's some background on the code:
float duration = ...;  // 0.2 secs, 1 sec, 3 secs, etc.
int repeatCount = ...; // 1, 2, 5, 6, etc.
CALayer *layer = ...;  // I have a CALayer from another part of the code
layer.opacity = 0;
// Animation here
done = YES; // at the END of the animation, set this ivar to YES
What is the best way to accomplish this? I have never used CALayers before, so this is also a good opportunity to learn how their animation system works. By the way, I have searched the docs and I understand how you add one or two simple animations, but I have no idea how to do this particular one.
The best way to accomplish this is to use an explicit animation (see guide) by creating an instance of CABasicAnimation and adding it to the layer.
The code would look something like this:
CABasicAnimation *flash = [CABasicAnimation animationWithKeyPath:@"opacity"];
flash.fromValue = [NSNumber numberWithFloat:0.0];
flash.toValue = [NSNumber numberWithFloat:1.0];
flash.duration = 1.0;     // 1 second
flash.autoreverses = YES; // back to 0 afterwards
flash.repeatCount = 3;    // or whatever
[layer addAnimation:flash forKey:@"flashAnimation"];
If you want to know when the animation is done, you can set a delegate and implement the animationDidStop:finished: method; however, it's best to use a completion block, as that allows all the code to be in the same place. If you are writing for iOS 4 or OS X, you can use the excellent CAAnimationBlocks category to accomplish this.
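As an illustration of the completion-block route, a hedged sketch using CATransaction (one built-in way to set the done ivar from the question):
[CATransaction begin];
[CATransaction setCompletionBlock:^{
    done = YES; // runs once the animation added inside this transaction finishes
}];
[layer addAnimation:flash forKey:@"flashAnimation"];
[CATransaction commit];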
Trojanfoe's answer is excellent. I just want to add that if you want more control over the "timeline" (how long should it take to fade out? how long should we then wait? then how long should it take to fade in? and so on) you're going to want to combine multiple CABasicAnimations into a CAAnimationGroup.
You might want to read my book chapter on this topic, the last part of which constitutes a tutorial on CAAnimation and its offspring:
http://www.apeth.com/iOSBook/ch17.html#_core_animation
Note that my discussion is directed at iOS; on Mac OS X, if that's where you are, the view/layer architecture is a little different, but what it says about CAAnimation is still correct.