Animation iOS with sound file - objective-c

I have some animations which need to appear on screen at very specific timings; these are stored in an SQLite database. What I am planning to do is use an NSTimer to keep time and pop the animation when the specific time is reached.
I'm thinking of using an NSTimer to count for the duration of the sound file and animation, then pop an on-screen animation when certain points are reached. The problem is that the stored timings look like 55.715000 seconds, so they are very precise, and they need to sync with an audio track that plays alongside the animation.
Firstly, is this even possible? Secondly, how can I compare such specific timings? The problem I seem to be facing is that the code can't run quickly enough, and the time jumps by more than 0.001 of a second.
I have no knowledge of OpenGL ES or Cocos2d, and learning them is not really feasible on my timescale.

If your visuals need to be exactly in sync with the audio (for example a music app where animations have to appear on the beats) then you need to use the following approach. It works on very old iPhone hardware and there is basically nothing that can go wrong at runtime.
Pre-render each "frame" of your visuals so that each one is stored as a full-screen image. This could be done on the desktop, or you can run the render logic on the phone and then capture the output to a file as pixels.
Once each frame is saved as an array of pixels, play the audio via the standard AVAudioPlayer APIs on iOS. This API takes care of the audio playback and reports a time that you will use to determine which video frame to display.
Once audio is playing, get the "time" and multiply it by your video framerate to determine which image from an array of N images to display. Get the image data and wrap it up in a CGImageRef/UIImage; this will blit the image data to the screen in an optimal way.
If you would like to see working source code for this approach, take a look at AVSync. This example code shows the implementation that I used in my own app called iPractice in the app store. It is very fast and can run at 30FPS even on the old iPhone 3G.
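For illustration, a minimal sketch of the frame-lookup step described above, assuming the frames are pre-rendered into an array of UIImages and shown in a UIImageView (audioPlayer, frames, and imageView are illustrative names, not part of the AVSync code):
// Called repeatedly while audio plays (e.g. from a CADisplayLink).
// Assumes self.audioPlayer (AVAudioPlayer), self.frames (NSArray of UIImage),
// and self.imageView already exist; all names here are illustrative.
- (void)updateFrame
{
    const double framesPerSecond = 30.0;
    NSTimeInterval now = self.audioPlayer.currentTime;
    NSUInteger index = (NSUInteger)(now * framesPerSecond);
    if (index < self.frames.count) {
        self.imageView.image = self.frames[index];
    }
}
Because the index is always recomputed from the audio clock, the visuals cannot drift out of sync even if a frame is occasionally skipped.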

Regardless of how accurate your timings are, you aren't going to get the device to display more frequently than about 60Hz (16.7 ms per frame). As I see it, you have at least two options here:
1) Use a CADisplayLink callback to check the playback progress of the audio and trigger each animation as it becomes timely.
Display link timers are created in a similar way to regular NSTimers:
- (void)viewDidLoad
{
    // ...
    self.displayLink = [CADisplayLink displayLinkWithTarget:self
                                                   selector:@selector(scheduleAnimations:)];
    [self.displayLink addToRunLoop:[NSRunLoop mainRunLoop]
                           forMode:NSDefaultRunLoopMode];
}
- (void)scheduleAnimations:(CADisplayLink *)displayLink
{
    // get playback time from the player object (self.audioPlayer, assumed AVAudioPlayer)
    NSTimeInterval playbackTime = self.audioPlayer.currentTime;

    // iterate the queue of timings (assumed NSMutableArray of NSNumber, sorted ascending);
    // if the timestamp of a timing is equal to or less than the playback time,
    // create its animation for immediate execution and remove it from the queue
    while (self.timingQueue.count > 0 &&
           [self.timingQueue[0] doubleValue] <= playbackTime) {
        [self runAnimationForTiming:self.timingQueue[0]]; // hypothetical helper: your animation code
        [self.timingQueue removeObjectAtIndex:0];
    }
}
2) Create a collection of animations, setting their respective beginTime properties to the appropriate trigger time (or, if using implicit or block-based animations, use the delay parameter).
[CATransaction begin];
// iterate collection of timings
CABasicAnimation *animation = [CABasicAnimation animationWithKeyPath:@"key"];
animation.beginTime = /* trigger time, in the layer's timebase: typically CACurrentMediaTime() + delay */;
animation.removedOnCompletion = NO;
[layer addAnimation:animation forKey:nil];
[CATransaction commit];

Related

SKAudioNode volume adjustment delayed

I am developing a game where an SKAudioNode plays the game music. When the player starts a new game, I want the music to fade in so I wrote the following code:
SKAudioNode *SFXNode = [[SKAudioNode alloc] initWithFileNamed:@"GameMusic.mp3"];
[SFXNode runAction:[SKAction changeVolumeTo:0 duration:0]];
SFXNode.positional = NO;
[self addChild:SFXNode];
[SFXNode runAction:[SKAction changeVolumeTo:1 duration:1]];
However, when the scene initiates, the music plays at full volume for a split second, then it mutes and fades back in as it is supposed to. Does anyone have any idea why this is happening? (The music is only at full volume after the user has triggered the modal segue to the scene, but before it has been displayed on the screen. As soon as the scene is displayed, the music fades in normally.) Also, the music does seem to be positional: as the player moves about in the scene, the volume changes, which I do not want happening. Solutions to either of these problems are much appreciated. Thank you!
This is because your volume change is happening after the audio has already been sent to the audio buffer. You need to reset the volume before that update happens. You can't do this as soon as you create the node; you need to do it after it has been added to the audio engine, so I would recommend doing it when your scene moves to its view (in the didMoveToView: method).
The code to actually change the volume should be something like ((AVAudioMixerNode *)SFXNode.avAudioNode).volume = 0;, but I would recommend checking first that avAudioNode exists and that it actually is an AVAudioMixerNode (its volume property comes from the AVAudioMixing protocol).
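As a sketch of that, assuming the node from the question is stored in an sfxNode property (didMoveToView: is the standard SKScene callback):
// In the SKScene subclass; assumes self.sfxNode was created and added
// to the scene earlier, as in the question. Names are illustrative.
- (void)didMoveToView:(SKView *)view
{
    [super didMoveToView:view];
    AVAudioNode *node = self.sfxNode.avAudioNode;
    if ([node isKindOfClass:[AVAudioMixerNode class]]) {
        ((AVAudioMixerNode *)node).volume = 0; // silence before the first buffer is heard
    }
    [self.sfxNode runAction:[SKAction changeVolumeTo:1 duration:1]];
}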

Can't get a smooth step-by-step rotation with CoreAnimation and CALayers

I'm trying to rotate a CALayer with various sublayers according to time. There is a UDP multicast receiver in place which receives new timecodes. I fetch new times via a timer:
[NSTimer scheduledTimerWithTimeInterval:0.02f
                                 target:self
                               selector:@selector(fetchNewTimeAndRotate)
                               userInfo:nil
                                repeats:YES];
The CALayer should be rotated according to time; in my case, one rotation per 1.8 seconds. Actually I don't need any animation, I just need the angle to be set very often, so that it produces the actual animation.
I've tried setting the layer's rotation in various ways:
1st I tried via CATransform3DMakeRotation:
CATransform3D rotation1 = CATransform3DMakeRotation([self DegreesToRadians:newAngle], 0.0f, 0.0f, 1.0f);
self.circleLayer.transform = rotation1;
2nd try was using an instant rotation via an animation:
CABasicAnimation *rotation = [CABasicAnimation animationWithKeyPath:@"transform.rotation.z"]; // declaration added for completeness; the original snippet omitted it
rotation.fromValue = [NSNumber numberWithFloat:[self DegreesToRadians:oldAngle]];
rotation.toValue = [NSNumber numberWithFloat:[self DegreesToRadians:newAngle]];
rotation.duration = 0.0f;
rotation.repeatCount = 0.0f;
rotation.removedOnCompletion = NO; // also used YES here with no effect
[self.circleLayer addAnimation:rotation forKey:@"transform.rotation.z"];
oldAngle = newAngle;
3rd try was just setting an angle via "transform.rotation.z":
[self.circleLayer setValue:[NSNumber numberWithFloat:[self DegreesToRadians:newAngle]] forKeyPath:@"transform.rotation.z"];
All of the above approaches do work, but lead to significant stuttering in the rotation. I've tried several different timings for fetching and for animation length. Nothing seems to get rid of the problem.
Using an automatic rotation of 3.6° every 0.02 s is the only option that produced smooth results for me. Am I making a fundamental mistake here, or have I misunderstood the concept of Core Animation? I know it is meant to drive the animation itself, but I need it to react to user input and change immediately.
I'm thankful for any help.
You mean CADisplayLink, don't you?
DisplayLink objects are commonly used for frame-based animation in OpenGL, but they do not "use OpenGL". It's perfectly valid to create a display link and attach it to your application to trigger drawing.
As the other poster said, NSTimer is not a good choice for fine control of animation frames. If you write a program that just runs a timer and measures it, it will look good. However, timers depend on your app visiting the event loop frequently. They are not preemptive, and if your app is busy when the timer should fire, it will be off, or even miss a firing interval.
If you continue to use a display link, you might want to write your code to check the time elapsed since the last frame and calculate the amount to animate based on that. That way, if you drop a frame, the animation still proceeds at the correct velocity (if you drop a frame, twice as much time elapses before the next frame, so you move your objects twice as much).
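A sketch of that approach, assuming the rotation is driven by a CADisplayLink and completes once every 1.8 seconds (displayLink and startTime are illustrative property names; circleLayer is from the question):
// Display link callback; self.startTime is assumed to have been set to
// link.timestamp when the rotation began.
- (void)step:(CADisplayLink *)link
{
    CFTimeInterval elapsed = link.timestamp - self.startTime;
    // one full turn per 1.8 s, correct even if frames are dropped
    CGFloat angle = fmod(elapsed / 1.8, 1.0) * 2.0 * M_PI;

    [CATransaction begin];
    [CATransaction setDisableActions:YES]; // set the angle directly, with no implicit animation
    [self.circleLayer setValue:@(angle) forKeyPath:@"transform.rotation.z"];
    [CATransaction commit];
}
Disabling implicit actions matters here: setting a layer property normally triggers a 0.25 s implicit animation, which can fight a rapid series of direct updates and read as stutter.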

UIView animateWithDuration: slows down animation frame rate

I am using CADisplayLink to draw frames using the EAGLView method in a game at 60 times per second.
When I call UIView animateWithDuration: the framerate drops down to exactly half, from 60 to 30 fps for the duration of the animation. Once the animation is over, the fps rises instantly back up to 60.
I also tried using NSTimer animation method instead of CADisplayLink and still get the same result.
The same behavior happens when I press the volume buttons while the speaker icon is fading out, so it may be using animateWithDuration. As I would like to be able to handle the speaker icon smoothly in my app, this means I can't just rewrite my animation code to use a different method other than animateWithDuration, but need to find a solution that works with it.
I am aware that there is an option to slow down animations for debug on the simulator, however, I am experiencing this on the device and no such option is enabled. I also tried using various options for animateWithDuration such as the linear one and the user interaction, but none had an improvement.
I am also aware I can design an engine that can still work with a frame rate that varies widely. However, this is not an ideal solution to this problem, as high fps is desirable for games.
Has someone seen this problem or solved it before?
The solution to this is to do your own animation and blit during the CADisplayLink callback.
1) For the volume issue, put a small volume icon in the corner, or show it when the user takes some predefined touch action, and give them touch controls. With that input you can use AVAudioPlayer to vary the volume and avoid the system control altogether. You might even be able to detect that the user has pressed the volume buttons and pop up a note asking them to do it your way. This keeps the system from triggering any animations at all.
2) When you have an animation you want to run, create a series of images in code (either at that moment or beforehand), and every so many display link callbacks blit the next image to the screen, as sketched below.
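A sketch of point 2, assuming the frames were prepared as UIImages ahead of time (animationFrames, tickCount, frameIndex, and overlayView are illustrative names):
// CADisplayLink callback; blits the next pre-rendered image every 2nd
// tick (60 Hz / 2 = 30 fps). All property names here are illustrative.
- (void)displayLinkFired:(CADisplayLink *)link
{
    // ... the game's normal per-frame drawing ...

    self.tickCount++;
    if (self.tickCount % 2 == 0 && self.frameIndex < self.animationFrames.count) {
        UIImage *frame = self.animationFrames[self.frameIndex];
        self.overlayView.layer.contents = (__bridge id)frame.CGImage;
        self.frameIndex++;
    }
}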
Here's an old thread that describes similar drops in frame rate. In that case, the cause of the problem was adding two or more semi-transparent sprites, but I'd guess that any time you try to composite several layers together you may be doing enough work to cut the frame rate, and animateWithDuration very likely does exactly that kind of thing.
Either use OpenGL or Core Animation; they are not compatible.
To test this remove any UIView animation, the frame rate will be what you expect. Add back UIView animation, it will drop to 30fps.
You said:
When I call UIView animateWithDuration: the framerate drops down to exactly half, from 60 to 30 fps for the duration of the animation. Once the animation is over, the fps rises instantly back up to 60
I don't know why you're not accepting my answer; this is exactly what happens when you combine UIView animation with Core Animation that doesn't go through a UIView.

AVPlayerLayer - ReProgramming the Wheel?

I'm currently using an AVPlayer, along with an AVPlayerLayer to play back some video. While playing back the video, I've registered for time updates every 30th of a second during the video. This is used to draw a graph of the acceleration at that point in the video, and have it update along with the video. The graph is using the CMTime from the video, so if I skip to a different portion of the video, the graph immediately represents that point in time in the video with no extra work.
Anywho, as far as I'm aware, if I want to get an interface similar to what the MediaPlayer framework offers, I'm going to have to do that myself.
What I'm wondering is, is there a way to use my AVPlayer with the MediaPlayer framework? (Not that I can see.) Or, is there a way to register for incremental time updates with the MediaPlayer framework.
My code, if anyone is interested, follows :
[moviePlayer addPeriodicTimeObserverForInterval:CMTimeMake(1, 30)
                                          queue:dispatch_queue_create("eventQueue", NULL)
                                     usingBlock:^(CMTime time) {
    loopCount = (int)(CMTimeGetSeconds(time) * 30);
    if (loopCount < [dataPointArray count]) {
        dispatch_sync(dispatch_get_main_queue(), ^{
            [graphLayer setNeedsDisplay];
        });
    }
}];
Thanks!
If you're talking about the window chrome displayed by MPMoviePlayer then I'm afraid you are looking at creating this UI yourself.
AFAIK there is no way of achieving the timing behaviour you need using the MediaPlayer framework, which is very much a simple "play some media" framework. You're doing the right thing by using AVFoundation.
Which leaves you needing to create the UI yourself. My suggestion would be to start with a XIB file to create the general layout; toolbar at the top with a done button, a large view that represents a custom playback view (using your AVPlayerLayer) and a separate view to contain your controls.
You'll need to write some custom controller code to automatically show/hide the playback controls and toolbar as needed if you want to simulate the MPMoviePlayer UI.
You can use https://bitbucket.org/brentsimmons/ngmovieplayer as a starting point (though it may not have existed at the time you asked).
From the project page: "Replicates much of the behavior of MPMoviePlayerViewController -- but uses AVFoundation."
You might want to look at the AVSynchronizedLayer class. I don't think there's a lot about it in the official programming guide, but you can find bits of info here and there: subfurther, Otter Software.
In O'Reilly's Programming iOS 4 (or 5) there's also a short reference on how to make a square move/stop along a line in sync with the playback.
Another demo (not a lot of code) is shown during WWDC 2011 session Working with Media in AV Foundation.
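For orientation, the basic shape of AVSynchronizedLayer usage is small. A hedged sketch, assuming an existing playerItem property and reusing the graphLayer from the question:
// Animations added to sublayers of a synchronized layer run on the
// player item's timeline instead of the wall clock.
AVSynchronizedLayer *syncLayer =
    [AVSynchronizedLayer synchronizedLayerWithPlayerItem:self.playerItem];
[syncLayer addSublayer:graphLayer];
[self.view.layer addSublayer:syncLayer];

CABasicAnimation *slide = [CABasicAnimation animationWithKeyPath:@"position.x"];
slide.fromValue = @0;
slide.toValue = @320;
slide.beginTime = AVCoreAnimationBeginTimeAtZero; // "time zero" of the player item
slide.duration = CMTimeGetSeconds(self.playerItem.asset.duration);
slide.removedOnCompletion = NO;
[graphLayer addAnimation:slide forKey:@"slide"];
Seeking in the video then moves the animation to the matching point automatically, much like the graph behaviour already described in the question.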

Performance problems on iPhone using simple graphics

We're working on a couple simple games and we have some performance problems that we keep running into, and I was curious if it was a code issue or a "we're using the wrong objects" issue.
Our games are basically simple ones that work as follows:
We have a custom view that we load, and a custom object that is our game engine. The view has a timer that fires like such:
[NSTimer scheduledTimerWithTimeInterval:1.0 / 30.0 target:self selector:@selector(animationTimerMethod:) userInfo:nil repeats:YES];
and in our timer method, we have the following:
- (void)animationTimerMethod:(NSTimer *)timer
{
    if ([gameEngine doGameLoop]) // only redraw if we need to
        [self setNeedsDisplay];
}
Our drawing code is very simple, we're just drawing a couple of images on the screen, using code similar to the following:
- (void)drawRect:(CGRect)rect
{
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    CGContextDrawImage(ctx, someRect, someImage);
}
The main problem we have is responsiveness to touch. Much of the time our code gets no callback in the touchesBegan method. Is this just a limitation of the Core Graphics library? Should we be using OpenGL? Are there other ways to do simple animations (not even objects moving on screen, just 6 or so frames of animation for an object in the same place) that would be more responsive?
Thanks so much!
There are definitely ways to speed this up. Manually redrawing the images every frame means you are redrawing textures you could be reusing. You could reuse them either by moving to OpenGL, or moving the images into a CALayer and then just repositioning the layer instead of redrawing the image.
If you do not want to move to OpenGL you would probably also see a significant performance win by moving to CAAnimation instead of having your code calculate each frame, but that might require significant changes to your engine.
Also, if you can avoid alpha compositing, it is a big speedup.
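As a sketch of the layer-reuse idea (spriteImage and newPosition stand in for whatever your engine provides):
// One-time setup: hand the bitmap to a layer once.
CALayer *spriteLayer = [CALayer layer];
spriteLayer.contents = (__bridge id)spriteImage.CGImage; // uploaded once, then reused
spriteLayer.bounds = CGRectMake(0, 0, spriteImage.size.width, spriteImage.size.height);
[self.view.layer addSublayer:spriteLayer];

// Per frame: just move the layer; no drawRect: and no image redraw.
[CATransaction begin];
[CATransaction setDisableActions:YES]; // move immediately, without an implicit animation
spriteLayer.position = newPosition;
[CATransaction commit];
For the "6 frames in place" case, the same trick applies: keep one layer and swap its contents between pre-decoded CGImages instead of redrawing.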
My first question is: do your images have transparency?
Images with transparency will HIGHLY affect performance.
Apple's docs make reference to this point several times.
The second thing I would ask is: have you tried running your app in Instruments and/or Shark? Both of these tools can help give you an idea of where your problems are happening.