SKAudioNode volume adjustment delayed - objective-c

I am developing a game where an SKAudioNode plays the game music. When the player starts a new game, I want the music to fade in so I wrote the following code:
SKAudioNode *SFXNode = [[SKAudioNode alloc] initWithFileNamed:@"GameMusic.mp3"];
[SFXNode runAction:[SKAction changeVolumeTo:0 duration:0]];
SFXNode.positional = NO;
[self addChild:SFXNode];
[SFXNode runAction:[SKAction changeVolumeTo:1 duration:1]];
However, when the scene initiates, the music plays at full volume for a split second, then it mutes and fades back in as it is supposed to. Does anyone have any idea why this is happening? (The music is only at full volume after the user has triggered the modal segue to the scene, but before it has been displayed on the screen. As soon as the scene is displayed the music fades in normally.) Also, the music does seem to be positional: as the player moves about in the scene the volume changes, which I do not want happening. Solutions to either of these problems are much appreciated. Thank you!

This is because your volume change happens after the audio has already been sent to the audio buffer. You need to reset the volume before that update happens. You can't do this as soon as you create the audio node; you need to do it after it has been added to the audio engine, so I would recommend doing it when your scene moves to the view (in didMoveToView:).
The code to actually change the volume should be something like ((AVAudioMixerNode *)SFXNode.avAudioNode).volume = 0, but I would recommend checking that avAudioNode exists first and that it actually responds to the volume property (which comes from the AVAudioMixing protocol) before casting.
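Here is a minimal sketch of that approach, assuming the audio node is kept in a property called SFXNode on the scene, the fade-in action is moved out of the initializer, and AVFoundation/SpriteKit are imported:

// In your SKScene subclass. Assumes self.SFXNode was created, muted via the
// SKAction, and added to the scene in the initializer as shown above.
- (void)didMoveToView:(SKView *)view {
    [super didMoveToView:view];

    // The underlying AVAudioNode is only attached to the scene's audio engine
    // once the node is in the tree, so guard before touching it.
    AVAudioNode *node = self.SFXNode.avAudioNode;
    if ([node conformsToProtocol:@protocol(AVAudioMixing)]) {
        // Silence the mixer before the next audio render pass...
        ((id<AVAudioMixing>)node).volume = 0;
    }

    // ...then fade the music back in over one second.
    [self.SFXNode runAction:[SKAction changeVolumeTo:1 duration:1]];
}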

Related

How to hear sound from mpmovieplayer at a specific time using UISlider?

I'm working on an iOS movie editor project. For this editor, I use MPMoviePlayer to show the video file selected by the user.
I use custom controls, and I have a UISlider that lets the user move the player's currentTime position. When the user touches the slider, the movie is paused and its currentTime changes along with the UISlider's value.
Everything works perfectly, but now I need to let the user hear the sound at this currentTime position.
For those who know iMovie: when you move your mouse over a movie event, you see the image and hear the sound at that position, and that's what I'd like in my editor.
I've tried calling the player's play method with an NSTimer to stop after 0.2 seconds, but the result is kind of messy.
Has anyone already managed to do something like this?
Thanks!
Best regards.
Seeking takes time; that's why you've ended up using a timer. The real problem here is that MPMoviePlayerController, while convenient because it gives you controls, is a blunt instrument; it's just a massive simplified convenience built on top of AVFoundation. But you don't need the built-in controls, so I would suggest throwing away your entire implementation and getting down to the real stuff, using AVFoundation instead (AVPlayer etc). Now you have a coherent way to seek and get a notification when the seek has completed (seekToTime:completionHandler:), so you'll be able to start playing as soon as possible. Plus, AVFoundation is the level where you'll be doing all your "editing" anyway.
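As a rough sketch (assuming an AVPlayer property called player created from the user's asset, and the slider wired to a value-changed action), the scrubbing handler could look something like this:

- (void)sliderValueChanged:(UISlider *)slider {
    CMTime target = CMTimeMakeWithSeconds(slider.value, NSEC_PER_SEC);

    // Zero tolerance so the preview lands exactly on the slider position.
    [self.player seekToTime:target
            toleranceBefore:kCMTimeZero
             toleranceAfter:kCMTimeZero
          completionHandler:^(BOOL finished) {
        if (!finished) return;
        // Play a short burst from the new position, iMovie-style, then pause again.
        [self.player play];
        dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(0.2 * NSEC_PER_SEC)),
                       dispatch_get_main_queue(), ^{
            [self.player pause];
        });
    }];
}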

Movement of objects in game

I have a question about movement of objects in the app (game) I am creating. I tried moving an ImageView with a timer, so that every 0.01 seconds the object moves by 1 px, but the movement is not smooth. I also tried animations, but there is a problem: if I close the app and run it again (while it was still open in the background), the view's image stays at the end position of the animation. I also want to check every 0.01 seconds whether the moving object and my character have collided, so animation is not the best option. Is there a way to move my object smoothly using the phone's local time, or is there some other way to move the object?
It sounds like you're not using SpriteKit or any game engine for that matter. Any game engine will come with an update loop that does all of this.
You can learn about SpriteKit here... http://www.raywenderlich.com/42699/spritekit-tutorial-for-beginners
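For illustration, a minimal SpriteKit update loop that moves an object each frame and checks for a collision might look like this (GameScene, _object and _character are placeholder names for your own scene and nodes):

#import <SpriteKit/SpriteKit.h>

@interface GameScene : SKScene
@end

@implementation GameScene {
    SKSpriteNode *_object;        // the moving image
    SKSpriteNode *_character;     // the player
    NSTimeInterval _lastUpdateTime;
}

- (void)update:(NSTimeInterval)currentTime {
    // Scale movement by the time since the last frame so motion stays smooth
    // even when the frame rate fluctuates, instead of polling a 0.01 s timer.
    NSTimeInterval delta = _lastUpdateTime > 0 ? currentTime - _lastUpdateTime : 0;
    _lastUpdateTime = currentTime;

    _object.position = CGPointMake(_object.position.x + 100.0 * delta,  // 100 pt/s
                                   _object.position.y);

    // Simple per-frame overlap test for the collision check.
    if (CGRectIntersectsRect(_object.frame, _character.frame)) {
        // handle the collision here
    }
}

@end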

Is there a way to play (rewind/fwd) a paused movie with left right gesture in objective c

I have a video clip that will work like an intro video when the application starts. The video contains many objects that come in from all directions and interact with each other.
I have two ways to do this:
1) I start the video in paused mode. The video goes forward or backward with left-right finger movement, exactly like scrubbing the play strip of an MPMoviePlayerController. (The video remains paused and never plays on its own.)
2) I make every object a separate view inside a UIScrollView and move them as the video describes, with the help of the - (void)scrollViewDidScroll:(UIScrollView *)scrollView method.
The 2nd way is too complicated because the video is quite lengthy and has nearly 45 objects. So how do I go about the 1st option?
You should export your movie as a series of images. Then load the images into a UIImageView (its animationImages property holds such a list), bind a UIGestureRecognizer to your image view, and display one image instead of another depending on the direction of the slide.
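A rough sketch of that idea, assuming the exported frames are already loaded (in playback order) into an NSArray property called frames, and startFrame is an NSInteger property remembering where the last scrub ended:

- (void)viewDidLoad {
    [super viewDidLoad];
    self.imageView.image = self.frames.firstObject;
    self.imageView.userInteractionEnabled = YES;

    UIPanGestureRecognizer *pan =
        [[UIPanGestureRecognizer alloc] initWithTarget:self
                                                action:@selector(handlePan:)];
    [self.imageView addGestureRecognizer:pan];
}

- (void)handlePan:(UIPanGestureRecognizer *)pan {
    // Map horizontal finger travel to a frame index: drag right to scrub
    // forward, drag left to scrub backward (about 5 pt of travel per frame).
    CGFloat x = [pan translationInView:self.imageView].x;
    NSInteger index = self.startFrame + (NSInteger)(x / 5.0);
    index = MAX(0, MIN((NSInteger)self.frames.count - 1, index));
    self.imageView.image = self.frames[index];

    if (pan.state == UIGestureRecognizerStateEnded) {
        self.startFrame = index;  // remember where the scrub ended
    }
}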

Animation iOS with sound file

I have some animations which need to appear on screen at very specific timings; these are stored in an SQLite database. What I am planning to do is use an NSTimer to keep time and pop the animation when the specific time is reached.
I'm thinking of using an NSTimer to count through the duration of the sound file and animation, and then, when certain points are reached, pop an on-screen animation. The problem is the stored timings look like 55.715000 seconds, so they are very precise, and they need to stay in sync with an audio track that will be played with the animation.
Firstly, is this even possible? Secondly, how can I compare such specific timings? The problem I seem to be facing is that the code can't run quickly enough, and the time jumps by more than 0.001 of a second.
I have no knowledge of OpenGL ES or Cocos2d, and learning these is not really feasible on the time scales involved.
If your visuals need to be exactly in sync with the audio (for example a music app where animations have to appear on the beats) then you need to use the following approach. It works on very old iPhone hardware and there is basically nothing that can go wrong at runtime.
Pre-render each "frame" of your visuals so that each one is stored as a full screen image. This could be done on the desktop or you can run the render logic on the phone and then capture the output to a file as pixels.
Once each frame is saved as an array of pixels, play the audio via the standard AVAudioPlayer APIs on iOS. This API takes care of the audio playback and reports a time that you will use to determine which video frame to display.
Once audio is playing, get the current playback time and multiply it by your video framerate to determine which image from an array of N images to display. Get the image data and wrap it up in a CGImageRef/UIImage; this will blit the image data to the screen in an optimal way.
If you would like to see working source code for this approach, take a look at AVSync. This example code shows the implementation that I used in my own app called iPractice in the app store. It is very fast and can run at 30FPS even on the old iPhone 3G.
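The time-to-frame mapping itself is only a couple of lines. A sketch, assuming the frames were pre-rendered at 30 FPS into self.frames, the audio plays through an AVAudioPlayer in self.audioPlayer, and a CADisplayLink (or timer) drives the callback:

- (void)displayLinkFired:(CADisplayLink *)link {
    NSTimeInterval now = self.audioPlayer.currentTime;   // seconds into the track
    NSUInteger frameIndex = (NSUInteger)(now * 30.0);    // playback time * framerate
    if (frameIndex < self.frames.count) {
        self.imageView.image = self.frames[frameIndex];
    }
}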
Regardless of how accurate your timings are, you aren't going to get the device to display more frequently than about 60Hz (16.7 ms per frame). As I see it, you have at least two options here:
1) Use a CADisplayLink callback to check the playback progress of the audio and trigger each animation as it becomes timely.
Display link timers are created in a similar way to regular NSTimers:
- (void)viewDidLoad
{
    // ...
    self.displayLink = [CADisplayLink displayLinkWithTarget:self selector:@selector(scheduleAnimations:)];
    [self.displayLink addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSDefaultRunLoopMode];
}
- (void)scheduleAnimations:(CADisplayLink *)displayLink
{
    // get playback time from player object
    // iterate queue of timings
    // if the timestamp of a timing is equal to or less than the playback time
    //     create its animation for immediate execution
    //     remove it from the queue
}
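One possible way to fill that skeleton in, assuming the trigger times live in a mutable array of NSNumbers (pendingTimings, sorted ascending), the audio runs through an AVAudioPlayer (audioPlayer), and startAnimationForTime: is your own method that kicks off the matching animation:

- (void)scheduleAnimations:(CADisplayLink *)displayLink
{
    NSTimeInterval playbackTime = self.audioPlayer.currentTime;

    // Fire every timing whose timestamp has been reached, then drop it from the queue.
    while (self.pendingTimings.count > 0 &&
           [self.pendingTimings.firstObject doubleValue] <= playbackTime) {
        NSNumber *timing = self.pendingTimings.firstObject;
        [self.pendingTimings removeObjectAtIndex:0];
        [self startAnimationForTime:timing.doubleValue];
    }
}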
2) Create a collection of animations, setting their respective beginTime properties to the appropriate trigger time (or, if using implicit or block-based animations, use the delay parameter).
[CATransaction begin];
// iterate collection of timings
CABasicAnimation *animation = [CABasicAnimation animationWithKeyPath:@"key"];
animation.beginTime = /* time to trigger animation */;
animation.removedOnCompletion = NO;
[layer addAnimation:animation forKey:nil];
[CATransaction commit];

How can I programmatically pipe a QuickTime movie into a Quartz Composer input?

I'm working on an app that applies Quartz Composer effects to QuickTime movies. Think Photo Booth, except with a QuickTime movie for the input, not a camera. Currently, I am loading a QuickTime movie as a QTMovie object, then have an NSTimer firing 30 times a second. At some point I'll switch to a CVDisplayLink, but NSTimer is okay for now. Every time the NSTimer fires, the app grabs one frame of the QuickTime movie as an NSImage and passes it to one of the QCRenderer's image inputs. This works, but is extremely slow. I've tried pulling frames from the movie in all of the formats that [QTMovie frameImageAtTime:withAttributes:error:] supports. They are all either really slow, or don't work at all.
I'm assuming that the slowness is caused by moving the image data to main memory, then moving it back for QC to work on it.
Unfortunately, using QC's QuickTime movie patch is out of the question for this project, as I need more control of movie playback than that provides. So the question is, how can I move QuickTime movie images into my QCRenderer without leaving VRAM?
Check out the v002 Movie Player QCPlugin, which is open source. Out of curiosity, what additional control over playback do you need, exactly?