AVPlayer long song buffering issue - objective-c

I've got an issue with long songs while using AVPlayer.
I have tested it on a 64 min song.
The issue is: when the AVPlayer buffer is full, it stops playback (rate = 0.0f) and stops downloading new time ranges. When I resume playback manually, it plays for a few seconds and stops again. I think it does keep downloading new content into the buffer, but the process is very slow and not suitable for gapless playback.
Is it possible to control this situation to achieve gapless playback?
Am I allowed to modify loaded time ranges (clean the buffer) during playback?
Am I allowed to increase buffer size?

Are you running it on the main thread? Try something like this:
#import <dispatch/dispatch.h>

dispatch_queue_t playQueue = dispatch_queue_create("com.example.playqueue", NULL);
AVAudioPlayer *player = ...; // your already-configured player
dispatch_async(playQueue, ^{
    [player play];
});
If that doesn't work I'd suggest giving OpenAL a try.
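Another angle worth trying (just a sketch, not a guaranteed fix; playerItem and player are assumed to be your existing AVPlayerItem and AVPlayer ivars): observe the item's buffering state with KVO and resume playback once it reports it can keep up again:
[playerItem addObserver:self
             forKeyPath:@"playbackLikelyToKeepUp"
                options:NSKeyValueObservingOptionNew
                context:nil];

- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context
{
    // Resume once enough new time ranges have been loaded.
    if ([keyPath isEqualToString:@"playbackLikelyToKeepUp"]
        && playerItem.playbackLikelyToKeepUp
        && player.rate == 0.0f) {
        [player play];
    }
}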

Related

How do you designate audio device channels when using AVPlayer?

I am attempting to designate output channels of an audio interface for sending audio during video playback on macOS. I am using AVPlayer to play a video file:
auto player = [AVPlayer playerWithPlayerItem:playerItem]; // playerItem created from a file path
player.audioOutputDeviceUniqueID = deviceID; // deviceID determined by passing kAudioObjectPropertyName to AudioObjectGetPropertyData and inspecting device name, e.g. "Babyface Pro"
/* select hardware channels? */
[player play];
The AVPlayer defaults to playing channels 1/2 of my Babyface Pro, but I'd like to select, for instance, channels 3/4. I cannot select these channel routes globally (i.e. through TotalMix FX) since I am using more than one AVPlayer and selecting separate channels for each.
I came across a nearly identical question; however, that question did not require playing video, only audio.

multi track mp3 playback for iOS application

I am doing an application that involves playing back a song in a multi track format (drums, vocals, guitar, piano, etc...). I don't need to do any fancy audio processing to each track, all I need to be able to do is play, pause, and mute/unmute each track.
I had been using multiple instances of AVAudioPlayer, but when performing device testing, I noticed that the tracks play very slightly out of sync when they are first started. Furthermore, when I pause and play the tracks they continue to drift further out of sync. After a bit of research I've realized that AVAudioPlayer just has too much latency and won't work for my application.
In my application I basically had an NSArray of AVAudioPlayers that I would loop through and play each one or pause/stop each one, I'm sure this is what caused it to get out of sync on the device.
It seemed like Apple's audio mixer would work well for me, but when I try implementing it I get an EXC_BAD_ACCESS error that I can't figure out.
I know the answer is to use OpenAL or audio units, but it just seems unnecessary to spend weeks learning about these when all I need to do is play around 5 .mp3 tracks at the same time. Does anyone have any suggestions on how to accomplish this? Thanks
Thanks to admsyn's suggestion I was able to come up with a solution.
AVAudioPlayer has a currentTime property that returns the current time of a track and can also be set.
So I implemented startSynchronizedPlayback as described by admsyn and then added the following for stopping the tracks:
- (void)stopAll
{
    NSUInteger count = [tracksArr count];
    for (NSUInteger i = 0; i < count; i++)
    {
        trackModel = [tracksArr objectAtIndex:i];
        if (i == 0)
        {
            // Grab the reference time from the first track...
            currentTime = [trackModel currentTime];
        }
        [trackModel stop];
        // ...then rewind every track to that same position.
        [trackModel setCurrentTime:currentTime];
    }
}
This code basically loops through my array of tracks, each of which holds its own AVAudioPlayer, grabs the current time from the first track, then sets all of the following tracks to that time. Now when I use the startSynchronizedPlayback method they all play in sync, and pausing/unpausing keeps them in sync as well. Hope this is helpful to someone else trying to keep tracks in sync.
If you're issuing individual play messages to each AVAudioPlayer, it is entirely likely that the messages are arriving at different times, or that the AVAudioPlayers finish their warm-up phase out of sync with each other. You should be using playAtTime: and the deviceCurrentTime property to achieve proper synchronization. Note the description of deviceCurrentTime:
Use this property to indicate “now” when calling the playAtTime: instance method. By configuring multiple audio players to play at a specified offset from deviceCurrentTime, you can perform precise synchronization—as described in the discussion for that method.
Also note the example code in the playAtTime: discussion:
// Before calling this method, instantiate two AVAudioPlayer objects and
// assign each of them a sound.
- (void)startSynchronizedPlayback
{
    NSTimeInterval shortStartDelay = 0.01; // seconds
    NSTimeInterval now = player.deviceCurrentTime;
    [player playAtTime:now + shortStartDelay];
    [secondPlayer playAtTime:now + shortStartDelay];
    // Here, update state and user interface for each player, as appropriate
}
If you are able to decode the files to disk, then audio units are probably the solution that would provide the best latency. If you decide to use such an architecture, you should also check out Novocaine:
https://github.com/alexbw/novocaine
That framework takes a lot of the headache out of dealing with audio units.

AVQueuePlayer playing items simultaneously rather than in sequence

I am trying to put together a simple app to play a pre-roll video followed by some content video.
Currently, I'm trying to use the AVQueuePlayer class to get this done. Unfortunately, it doesn't seem to want to play the videos in sequence properly.
For example, the pre-roll plays by itself for a few seconds, then (prior to the pre-roll being complete) it starts to try and play the content video. For a few seconds, the player seems to be at war with itself and switches back and forth between the two videos (neither playing very well at all). Finally, when the pre-roll is finished, the content video then plays the rest of the way normally.
I've looked through the documentation for the AVQueuePlayer and I don't see anything obvious that I'm missing.
My code is pretty basic:
AVPlayerItem *preRollItem = [AVPlayerItem playerItemWithURL: preRollUrl];
AVPlayerItem *contentItem = [AVPlayerItem playerItemWithURL: contentUrl];
self.player = [AVQueuePlayer queuePlayerWithItems:[NSArray arrayWithObjects:preRollItem, contentItem, nil]];
[self.player play];
What is the trick to getting the videos to play in sequence?
Make sure you are actually testing on the device. In my experience the iOS 5 simulator has big problems with AVQueuePlayer and does bizarre things.
I found the iOS 4.3 simulator does a much better job when it comes to testing AVFoundation.
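If the device build still misbehaves, one possible workaround (just a sketch, reusing the preRollItem and contentItem variables from the question and assuming self.player is retyped as a plain AVPlayer; preRollDidFinish: is a made-up selector) is to skip AVQueuePlayer and chain the items manually:
// Play the pre-roll by itself and listen for it finishing.
self.player = [AVPlayer playerWithPlayerItem:preRollItem];
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(preRollDidFinish:)
                                             name:AVPlayerItemDidPlayToEndTimeNotification
                                           object:preRollItem];
[self.player play];

- (void)preRollDidFinish:(NSNotification *)note
{
    // Swap in the content item only after the pre-roll has actually ended.
    [self.player replaceCurrentItemWithPlayerItem:contentItem];
    [self.player play];
}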

How to speed up saving a UIImagePickerController image from the camera to the filesystem via UIImagePNGRepresentation()?

I'm making an application that lets users take a photo and shows it both as a thumbnail and in a photo viewer.
I have an NSManagedObject class called Photo, which has a method that takes a UIImage, converts it to PNG using UIImagePNGRepresentation(), and saves it to the filesystem.
After that, it resizes the image to thumbnail size and saves the thumbnail as well.
The problem is that UIImagePNGRepresentation() and the resizing seem to be really slow, and I don't know if this is the right way to do it.
Does anyone know the best way to accomplish this?
Thank you in advance.
Depending on the image resolution, UIImagePNGRepresentation can indeed be quite slow, as can any writing to the file system.
You should always execute these types of operations on an asynchronous queue. Even if the performance seems good enough for your application when testing, you should still do it on an async queue -- you never know what other processes the device might have going on that might slow the save down once your app is in the hands of users.
Newer versions of iOS make saving asynchronously really, really easy using NSOperationQueue, which sits on top of Grand Central Dispatch (GCD). The steps are:
Create an NSBlockOperation which saves the image
In the block operation's completion block, read the image from disk & display it. The only caveat here is that you must use the main queue to display the image: all UI operations must occur on the main thread.
Add the block operation to an operation queue and watch it go!
That's it. And here's the code:
// Create a block operation with our saves
NSBlockOperation *saveOp = [NSBlockOperation blockOperationWithBlock:^{
    [UIImagePNGRepresentation(image) writeToFile:file atomically:YES];
    [UIImagePNGRepresentation(thumbImage) writeToFile:thumbfile atomically:YES];
}];

// Use the completion block to update our UI from the main queue
[saveOp setCompletionBlock:^{
    [[NSOperationQueue mainQueue] addOperationWithBlock:^{
        UIImage *image = [UIImage imageWithContentsOfFile:thumbfile];
        // TODO: Assign image to imageview
    }];
}];

// Kick off the operation, sit back, and relax. Go answer some stackoverflow
// questions or something.
NSOperationQueue *queue = [[NSOperationQueue alloc] init];
[queue addOperation:saveOp];
Once you are comfortable with this code pattern, you will find yourself using it a lot. It's incredibly useful when generating large datasets, long operations on load, etc. Essentially, any operation that makes your UI laggy in the least is a good candidate for this code. Just remember, you can't do anything to the UI while you aren't in the main queue and everything else is cake.
Yes, it does take time on an iPhone 4, where the image is around 6 MB. The solution is to execute UIImagePNGRepresentation() on a background thread, e.g. using performSelectorInBackground:withObject:, so that your UI thread does not freeze.
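For completeness, a minimal sketch of that approach (savePhotoInBackground: and self.photoPath are made-up names):
// Kick the encode/write off the main thread.
[self performSelectorInBackground:@selector(savePhotoInBackground:) withObject:image];

- (void)savePhotoInBackground:(UIImage *)image
{
    @autoreleasepool { // a background selector must set up its own pool
        NSData *data = UIImagePNGRepresentation(image);
        [data writeToFile:self.photoPath atomically:YES];
    }
}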
It will probably be much faster to do the resizing before converting to PNG.
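For example (a sketch; the 100x100 target size is arbitrary), drawing into a small image context first means UIImagePNGRepresentation() only has to encode the thumbnail's pixels:
// Scale down first, then encode the much smaller result.
CGSize thumbSize = CGSizeMake(100.0f, 100.0f);
UIGraphicsBeginImageContextWithOptions(thumbSize, YES, 0.0f);
[image drawInRect:CGRectMake(0.0f, 0.0f, thumbSize.width, thumbSize.height)];
UIImage *thumbImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData *thumbData = UIImagePNGRepresentation(thumbImage);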
Try UIImageJPEGRepresentation with a medium compression quality. If the bottleneck is IO, this may prove faster, as the file size will generally be smaller than a PNG's.
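For instance (0.5 is a middle-of-the-road quality guess):
// JPEG at medium quality is usually far smaller than PNG for photos.
NSData *jpegData = UIImageJPEGRepresentation(image, 0.5f);
[jpegData writeToFile:file atomically:YES];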
Use Instruments to check whether UIImagePNGRepresentation is the slow part or whether it is writing the data out to the filesystem which is slow.

How do you know when a user chooses to play a video through AirPlay?

I have a custom video player set up with custom controls, and I utilize MPVolumeView to provide an AirPlay button. When a user chooses to use AirPlay, they interact with that Apple UI and there is no event (that I can find) that says "hey, the video is now playing over AirPlay".
The problem is that if I close the player and reopen it, it loads the movie (the load state changes to MPMovieLoadStatePlayable), I play it, and I immediately get a playback-did-finish notification with reason MPMovieFinishReasonPlaybackEnded, while the video continues to try to play through AirPlay. I'm certain the movie stops and is deallocated whenever I close the player.
If anyone has any advice on how to handle this, knows some events to listen for, or has any ideas about this whatsoever, please let me know. Thanks!
The answer here turns out to be that, at least up to iOS 4.3, there is no way to determine this through code.
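(For anyone on a newer SDK: iOS 5 added an airPlayVideoActive property to MPMoviePlayerController, so a check like the following becomes possible; it was not available when this question was asked.)
// iOS 5+ only -- not available up to 4.3, as noted above.
if (moviePlayer.airPlayVideoActive) {
    // The movie is currently being sent to an AirPlay device.
}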
The problem in this case is how you dispose of the MPMoviePlayerController when you're finished with it. Even if the video plays through, before you finally release it, you have to call pause and then stop. Like this:
MPMoviePlayerController *mp = [[MPMoviePlayerController alloc] init];
// use the player. then when done with it:
[mp pause];
[mp stop];
[mp release];
If you don't do this, then the next time you create an MPMoviePlayerController, certain properties are somehow ghosted in the framework. Playing a video progressively caused audio from the previous player to play while the new one did its initial buffering. Also, if the previous video was playing over AirPlay, the next video would get a notification that the video had finished right after it started, and some other weirdness.
Long story short, dispose of your video players with the above sequence to avoid issues with later movie players.