How can I handle separate tracks of an AVURLAsset independently? - objective-c

Here's my goal: I would like to load a .3gp movie file into an AVURLAsset. I would then like to take the video track and pump the output frames into an OpenGL ES texture. This will be the video playback. I would then like to continue leveraging AVFoundation to play back the audio. The framework is pretty vast, so I'm hoping for some veteran assistance on this one.
I actually have both parts working separately, but something always goes wrong when I try to do both at the same time. Here's my current attempt, in a nutshell (All error handling is omitted for brevity):
I load the .3gp file into the AVURLAsset and load the tracks:
NSURL* fileURL = [[NSBundle mainBundle] URLForResource:someName withExtension:someExtension];
AVURLAsset* asset = [AVURLAsset URLAssetWithURL:fileURL options:nil];
[asset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:@"tracks"] completionHandler:^{ /* More Code */ }];
In the completion handler, I get a reference to the audio and video track:
// Tracks loaded, grab the audio and video tracks.
AVAssetTrack* videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
AVAssetTrack* audioTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
Next, I create separate AVMutableCompositions that contain just the audio track and just the video track. I'm not sure if this is completely necessary, but it seems like a good idea and it does also seem to work:
// Make a composition with the video track.
AVMutableComposition* videoComposition = [AVMutableComposition composition];
AVMutableCompositionTrack* videoCompositionTrack = [videoComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
[videoCompositionTrack insertTimeRange:[videoTrack timeRange] ofTrack:videoTrack atTime:CMTimeMake(0, 1) error:nil];
// Make a composition with the audio track.
AVMutableComposition* audioComposition = [AVMutableComposition composition];
AVMutableCompositionTrack* audioCompositionTrack = [audioComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
[audioCompositionTrack insertTimeRange:[audioTrack timeRange] ofTrack:audioTrack atTime:CMTimeMake(0, 1) error:nil];
Now I get into the specifics of how to handle each track. I'm fairly confident that I have the one-and-only way of handling the video track, which is to create an AVAssetReader for the video composition and add an AVAssetReaderTrackOutput created with the video composition track. By keeping a reference to that track output, I can call its -copyNextSampleBuffer method to get the info I need to pump the video output into an OpenGL ES texture. This works well enough by itself:
// Create Asset Reader and Output for the video track.
NSDictionary* settings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey:(NSString *)kCVPixelBufferPixelFormatTypeKey];
_assetReader = [[AVAssetReader assetReaderWithAsset:vComposition error:nil] retain];
_videoTrackOutput = [[AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:vCompositionTrack outputSettings:settings] retain];
[_assetReader addOutput:_videoTrackOutput];
[_assetReader startReading];
What seems to spoil the whole thing is attempting to play back the audio in any way. I'm not really sure which approach to take for the remaining audio track. Just sticking to the realm of AVFoundation, I see two possible approaches. The first is to use an AVPlayer to play the audio composition:
// Create a player for the audio.
AVPlayerItem* audioPlayerItem = [AVPlayerItem playerItemWithAsset:aComposition];
AVPlayer* audioPlayer = [[AVPlayer playerWithPlayerItem:audioPlayerItem] retain];
[audioPlayer play];
This works, inasmuch as I can hear the desired audio. Unfortunately creating this player guarantees that the AVAssetReaderTrackOutput for the video composition fails with a cryptic error when calling -copyNextSampleBuffer:
AVAssetReaderStatusFailed

Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed"
UserInfo=0x456e50 {NSLocalizedFailureReason=An unknown error occurred (-12785),
NSUnderlyingError=0x486570 "The operation couldn’t be completed. (OSStatus error -12785.)",
NSLocalizedDescription=The operation could not be completed}
I'm confused about how they might be interfering with each other, but regardless, that approach seems to be a dead end.
The other option I considered for the audio playback was the AVAudioPlayer class, but I could not get it to work with an AVAsset as a starting point. I attempted to use its -initWithData:error: method with an NSData built by aggregating the contents of CMSampleBufferRefs taken with an approach identical to the one I use on the video track, but it does not appear to be formatted correctly.
At this point, I feel like I'm flailing around blindly, and would love it so very much if someone could tell me if this approach is even feasible. If it's not I would, of course, appreciate a feasible one.

Creating AVMutableCompositions (basically new AVAssets) for each track seems roundabout to me; I'd simply use an AVAssetReader on the audio track. Also, your videoComposition doesn't seem to be used anywhere, so why create it?
In any case, to get either solution to work, set your audio session category to kAudioSessionCategory_MediaPlayback and enable kAudioSessionProperty_OverrideCategoryMixWithOthers.
I've never found any documentation that explains why this is necessary.
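In case it helps, here is a minimal sketch (not part of the original answer) of setting those two properties with the old C-based Audio Session API; it assumes AudioToolbox is linked and omits error handling:
#import <AudioToolbox/AudioToolbox.h>
// Sketch: configure the audio session so AVAssetReader-based video decoding
// and AVPlayer audio playback can coexist. Call once before starting playback.
static void ConfigureAudioSessionForMixedPlayback(void)
{
    AudioSessionInitialize(NULL, NULL, NULL, NULL);

    UInt32 category = kAudioSessionCategory_MediaPlayback;
    AudioSessionSetProperty(kAudioSessionProperty_AudioCategory,
                            sizeof(category), &category);

    UInt32 mixWithOthers = 1;
    AudioSessionSetProperty(kAudioSessionProperty_OverrideCategoryMixWithOthers,
                            sizeof(mixWithOthers), &mixWithOthers);

    AudioSessionSetActive(true);
}
And if you skip the compositions entirely, a sketch of reading both tracks from the original asset with a single AVAssetReader could look like this (the audio output settings are an assumption; linear PCM is a common choice):
// Sketch: one AVAssetReader on the original asset, one output per track.
NSError *readerError = nil;
AVAssetReader *reader = [AVAssetReader assetReaderWithAsset:asset error:&readerError];

NSDictionary *videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey:(NSString *)kCVPixelBufferPixelFormatTypeKey];
AVAssetReaderTrackOutput *videoOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack outputSettings:videoSettings];

NSDictionary *audioSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kAudioFormatLinearPCM] forKey:AVFormatIDKey];
AVAssetReaderTrackOutput *audioOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:audioTrack outputSettings:audioSettings];

[reader addOutput:videoOutput];
[reader addOutput:audioOutput];
[reader startReading];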

Related

How to load the next file while current one is playing through HTTP Live Streaming?

I'm working on an app in which I need to stream a large collection of audio files ranging from 5 to 15 seconds each.
I would like to reduce the load time between the files as much as possible.
Main Question:
Is there a way to start buffering the next file (through HLS) while the current one is playing the last segment?
Is AVQueuePlayer an appropriate solution for this on the iOS side?
A detailed explanation will be much appreciated, since I am new to both HTTP Live Streaming and AVFoundation.
Related Question:
How do radio apps stream their audio with no lag between the songs?
Yes, AVQueuePlayer is an appropriate solution for playing a sequence of audio streamed from the internet via HTTP protocol.
I've been using AVQueuePlayer for quite a while now with excellent results and no lag between songs. Here is a simple example of how to use AVQueuePlayer:
NSURL *url1 = [NSURL URLWithString:[urlsToPlay objectAtIndex:0]];
NSURL *url2 = [NSURL URLWithString:[urlsToPlay objectAtIndex:1]];
NSURL *url3 = [NSURL URLWithString:[urlsToPlay objectAtIndex:2]];
self.item1 = [[AVPlayerItem alloc] initWithURL:url1];
self.item2 = [[AVPlayerItem alloc] initWithURL:url2];
self.item3 = [[AVPlayerItem alloc] initWithURL:url3];
self.radioPlayerURLs = [[NSArray alloc] initWithObjects:self.item1,self.item2, self.item3, nil];
self.onDemandPlayer = [AVQueuePlayer queuePlayerWithItems:self.radioPlayerURLs];
[self.onDemandPlayer play];
For more details, please consult the Apple documentation:
AVFoundation
AVQueuePlayer
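If you need to grow the queue while it is already playing (for example, as more URLs arrive), AVQueuePlayer can also append items on the fly; a small sketch, where urlString is a placeholder for the next file's URL:
// Sketch: append another item to a playing AVQueuePlayer.
AVPlayerItem *nextItem = [[AVPlayerItem alloc] initWithURL:[NSURL URLWithString:urlString]];
if ([self.onDemandPlayer canInsertItem:nextItem afterItem:nil]) {
    // Passing nil appends the item to the end of the queue.
    [self.onDemandPlayer insertItem:nextItem afterItem:nil];
}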

Mute an HTTP Live Stream in an AVPlayer

I've been trying to work out this problem for a good 48 hours now and haven't come up with anything. I have two AVPlayer objects playing different HTTP live streams. Obviously, I don't want them both playing audio at the same time, so I need a way to mute one of the videos.
Apple suggests this for muting an audio track playing in AVPlayer...
NSMutableArray *allAudioParams = [NSMutableArray array];
for (AVPlayerItemTrack *track in [_playerItem tracks]) {
    if ([track.assetTrack.mediaType isEqualToString:AVMediaTypeAudio]) {
        AVMutableAudioMixInputParameters *audioInputParams = [AVMutableAudioMixInputParameters audioMixInputParameters];
        [audioInputParams setVolume:0.0 atTime:CMTimeMakeWithSeconds(0, 1)];
        [audioInputParams setTrackID:[track.assetTrack trackID]];
        [allAudioParams addObject:audioInputParams];
        // Added to what Apple suggested
        [track setEnabled:NO];
    }
}
AVMutableAudioMix *audioZeroMix = [AVMutableAudioMix audioMix];
[audioZeroMix setInputParameters:allAudioParams];
[_playerItem setAudioMix:audioZeroMix];
When this didn't work (after many iterations), I found the enabled property of AVPlayerItemTrack and tried setting that to NO. Also nothing. This doesn't even register as doing anything, because when I try an NSLog(@"%x", track.enabled), it still shows up as 1.
I'm at a loss and I can't think of another piece of documentation I can read and re-read to get a good answer. If anyone out there can help, that would be fantastic.
*Update: I got hold of Apple, and according to the AVFoundation team it is impossible to mute or disable a track of an HLS video. I personally feel like this is a bug, so I submitted a bug report (you should do the same to tell Apple that this is a problem). You can also try to submit a feature enhancement request via their feedback page.
New iOS 7 answer: AVPlayer now has two new properties, volume and muted. Use those!
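For example:
// iOS 7 and later: mute or restore audio directly on the player.
self.player.muted = YES;    // silence this player
self.player.volume = 0.0f;  // or lower the volume instead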
And here is the original answer for life before iOS 7:
I've been dealing with the same thing. We created muted streams and streams with audio. To mute or unmute you call [player replaceCurrentItemWithPlayerItem:muteStream].
I also submitted a bug report. It looks like AVPlayer has this functionality on Mac OS X 10.7, but it hasn't made it to iOS yet.
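A minimal sketch of that swap, assuming your server publishes two variants of the same HLS stream (both URLs below are placeholders):
// Sketch: keep one AVPlayerItem per stream variant and swap between them.
AVPlayerItem *audibleItem = [AVPlayerItem playerItemWithURL:[NSURL URLWithString:@"http://example.com/stream/audible.m3u8"]];
AVPlayerItem *mutedItem = [AVPlayerItem playerItemWithURL:[NSURL URLWithString:@"http://example.com/stream/muted.m3u8"]];

// "Mute" by switching to the silent variant; switch back to unmute.
[player replaceCurrentItemWithPlayerItem:mutedItem];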
AVAudioMix is documented not to work on URL assets here
Of course I tried it anyway, and like you I found it really doesn't work.
The best solution for this would be to actually publish the stream URL feed with two audio tracks: one with the normal audio, and the other with muted (silent) audio.
It makes more sense to do it this way rather than the way ComPuff suggested, since his way you're actually creating two separate URL streams, which is not required.
Here is the code that you could use to switch the audio tracks:
float volume = 0.0f;
AVPlayerItem *currentItem = self.player.currentItem;
NSArray *audioTracks = self.player.currentItem.tracks;
DLog(@"%@", currentItem.tracks);
NSMutableArray *allAudioParams = [NSMutableArray array];
for (AVPlayerItemTrack *track in audioTracks) {
    if ([track.assetTrack.mediaType isEqual:AVMediaTypeAudio]) {
        AVMutableAudioMixInputParameters *audioInputParams = [AVMutableAudioMixInputParameters audioMixInputParameters];
        [audioInputParams setVolume:volume atTime:kCMTimeZero];
        [audioInputParams setTrackID:[track.assetTrack trackID]];
        [allAudioParams addObject:audioInputParams];
    }
}
if ([allAudioParams count] > 0) {
    AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
    [audioMix setInputParameters:allAudioParams];
    [currentItem setAudioMix:audioMix];
}
The only problem is that my stream URL is only showing two tracks (one video and one audio) when it should actually have three (two audio tracks). I can't work out whether this is a problem with the stream URL or my code! Can anyone spot any mistakes in the code?

Cocoa Add Image At End Of Video

I'm recording a video from the iSight camera using QTCaptureSession.
I would like to add an image at the end of the video, so I've implemented the didFinishRecordingToOutputFileAtURL delegate method. Here's what I've done so far:
- (void)captureOutput:(QTCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL forConnections:(NSArray *)connections dueToError:(NSError *)error
{
    // Prepare final video
    QTMovie *originalMovie = [QTMovie movieWithURL:outputFileURL error:nil];
    [originalMovie setAttribute:[NSNumber numberWithBool:YES] forKey:QTMovieEditableAttribute];

    NSImage *splashScreen = [NSImage imageNamed:@"video-ending.jpg"];
    NSImage *tiffImage = [[NSImage alloc] initWithData:[splashScreen TIFFRepresentation]];
    id attr = [NSDictionary dictionaryWithObjectsAndKeys:@"tiff", QTAddImageCodecType,
               [NSNumber numberWithLong:codecHighQuality], QTAddImageCodecQuality,
               nil];
    [originalMovie addImage:tiffImage forDuration:QTMakeTime(2, 1) withAttributes:attr];
    [tiffImage release];
    [originalMovie updateMovieFile];
}
The problem with this code is that while QuickTime plays it fine, other players don't. I'm sure I'm missing something basic here.
It would also be cool to add the image to the video before it gets saved (to avoid doing it two times). Here's how I stop recording right now:
- (void)stopRecording
{
// It would be cool to add an image here
[mCaptureMovieFileOutput recordToOutputFileURL:nil];
}
While I used Cocoa Touch, this might still apply. I have two tips based on my experience writing images to movies. First, while I'll bet that addImage:forDuration: takes care of a lot of things that AVAssetExportSessions do not, I had to make sure that images were added more regularly than a couple of times a second, or they would not work well with all players. Second, if there is a network-streaming option, such as AVAssetExportSession's shouldOptimizeForNetworkUse, which moves as much of the metadata and headers as possible toward the front of the movie, I found that it made the video compatible with more players as well.
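For illustration only (this is the AVFoundation side rather than QTKit, and both composition and outputURL are placeholders), a minimal export with that flag enabled might look like:
// Sketch: re-export a composition with metadata moved toward the front of the file.
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetHighestQuality];
exportSession.outputURL = outputURL;
exportSession.outputFileType = AVFileTypeQuickTimeMovie;
exportSession.shouldOptimizeForNetworkUse = YES; // move headers/metadata forward

[exportSession exportAsynchronouslyWithCompletionHandler:^{
    if (exportSession.status == AVAssetExportSessionStatusCompleted) {
        NSLog(@"Export finished: %@", exportSession.outputURL);
    } else {
        NSLog(@"Export failed: %@", exportSession.error);
    }
}];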

AVURLAsset for long sound on ipod

long time reader, first time asker...
I am making a music app which uses AVAssetReader to read MP3 data from the iTunes library. I need precise timing, so when I create an AVURLAsset, I use the AVURLAssetPreferPreciseDurationAndTimingKey option to extract timing data. This has some overhead (and I have no problems when I don't use it, but I need it!).
Everything works fine on iPhone (4) and iPad (1). I would like it to work on my iPod touch (2nd gen), but it doesn't: if the sound file is too long (> ~7 minutes), the AVAssetReader cannot start reading and throws an error (AVFoundationErrorDomain error -11800).
It appears that I am hitting a wall in terms of the more limited resources of the iPod touch. Any ideas what is happening, or how to manage the overhead of creating the AVURLAsset so that it can handle long files?
(I tried running this with the performance tools, and I don't see a major spike in memory).
Thanks, Dan
Maybe you're starting to read too soon? As far as I understand, for MP3 it will need to go through the entire file in order to enable precise timing. So, try delaying the reading.
You can also try registering as an observer for some of the AVAsset properties. iOS 4.3 has a 'readable' property. I've never tried it, but my guess would be that it's initially set to NO and gets set to YES as soon as the AVAsset has finished loading.
EDIT:
Actually, I just looked into the docs. You're supposed to use the AVAsynchronousKeyValueLoading protocol for that, and Apple provides an example:
NSURL *url = <#A URL that identifies an audiovisual asset such as a movie file#>;
AVURLAsset *anAsset = [[AVURLAsset alloc] initWithURL:url options:nil];
NSArray *keys = [NSArray arrayWithObject:@"duration"];

[anAsset loadValuesAsynchronouslyForKeys:keys completionHandler:^() {
    NSError *error = nil;
    AVKeyValueStatus durationStatus = [anAsset statusOfValueForKey:@"duration" error:&error];
    switch (durationStatus) {
        case AVKeyValueStatusLoaded:
            [self updateUserInterfaceForDuration];
            break;
        case AVKeyValueStatusFailed:
            [self reportError:error forAsset:anAsset];
            break;
        case AVKeyValueStatusCancelled:
            // Do whatever is appropriate for cancellation.
            break;
    }
}];
If 'duration' won't help, try 'readable' (but, as I mentioned before, 'readable' requires 4.3). Maybe this will solve your issue.
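If you do try 'readable', the pattern is the same with the key swapped; a sketch (again, 4.3+ only):
// Sketch: wait for 'readable' before creating the AVAssetReader.
[anAsset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:@"readable"] completionHandler:^{
    NSError *error = nil;
    AVKeyValueStatus status = [anAsset statusOfValueForKey:@"readable" error:&error];
    if (status == AVKeyValueStatusLoaded && anAsset.readable) {
        // Safe to create the AVAssetReader now.
    }
}];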

How to make QTMovie play file from URL with forced (MP3) type?

I'm using QTKit to progressively download and play an MP3 from a URL. According to this documentation, this is the code I should use to accomplish that:
NSURL *mp3URL = [NSURL URLWithString:@"http://foo.com/bar.mp3"];
NSError *error = nil;
QTMovie *sound = [[QTMovie alloc] initWithURL:mp3URL error:&error];
[sound play];
This works, and does exactly what I want — the MP3 URL is lazily downloaded and starts playing immediately. However, if the URL does not have the ".mp3" path extension, it fails:
NSURL *mp3URL = [NSURL URLWithString:@"http://foo.com/bar"];
NSError *error = nil;
QTMovie *sound = [[QTMovie alloc] initWithURL:mp3URL error:&error];
[sound play];
No error is given, no exception is raised; the duration of the sound is just set to zero, and nothing plays.
The only way I have found to work around this is to force a type by loading the data manually and using a QTDataReference:
NSURL *mp3URL = [NSURL URLWithString:@"http://foo.com/bar"];
NSData *mp3Data = [NSData dataWithContentsOfURL:mp3URL];
QTDataReference *dataReference =
    [QTDataReference dataReferenceWithReferenceToData:mp3Data
                                                 name:@"bar.mp3"
                                             MIMEType:nil];
NSError *error = nil;
QTMovie *sound = [[QTMovie alloc] initWithDataReference:dataReference error:&error];
[sound play];
However, this forces me to completely download ALL of the MP3 synchronously before I can start playing it, which is obviously undesirable. Is there any way around this?
Thanks.
Edit
Actually, it seems that the path extension has nothing to do with it; the Content-Type is simply not being set in the HTTP header. Even so, the latter code works and the former does not. Anyone know of a way to fix this, without having access to the server?
Edit 2
Anyone? I can't find information about this anywhere, and Google frustratingly now shows this page as the top result for most of my queries...
Two ideas (the first one being a bit hacky):
To work around the missing content type, you could embed a small Cocoa web server that supplies the missing header field and route your NSURL through that "proxy".
Some Cocoa http server implementations:
http://code.google.com/p/cocoahttpserver/
http://cocoawithlove.com/2009/07/simple-extensible-http-server-in-cocoa.html
http://culturedcode.com/cocoa/
The second would be to switch to a lower-level framework (from QTKit to AudioToolbox).
You'd need more code, but there are some very good resources out there on how to stream MP3 using AudioToolbox,
e.g.:
http://cocoawithlove.com/2008/09/streaming-and-playing-live-mp3-stream.html
Personally, I'd go with the second option. AudioToolbox isn't as straightforward as QTKit, but it offers a clean solution to your problem. It's also available on both iOS and Mac OS, so you will find plenty of information.
Update:
Did you try to use another initializer? e.g.
+ (id)movieWithAttributes:(NSDictionary *)attributes error:(NSError **)errorPtr
You can insert your URL for the key QTMovieURLAttribute and maybe you can compensate the missing content type by providing other attributes in that dictionary.
This open source project has a QTMovie category that contains methods to accomplish similar things:
http://vidnik.googlecode.com/svn-history/r63/trunk/Source/Categories/QTMovie+Async.m
If you thought weichsel's first solution was hacky, you're going to love this one:
The culprit is the Content-Type header, as you have determined. Had QTKit.framework used Objective-C internally, this would be a trivial matter of overriding -[NSHTTPURLResponse allHeaderFields] with a category of your choosing. However, QTKit.framework (for better or worse) uses Core Foundation (and Core Services) internally. These are both C-based frameworks and there is no elegant way of overriding functions in C.
That said, there is a method, just not a pretty one. Function interposition is even documented by Apple, but seems to be a bit behind the times, compared to the remainder of their documentation.
In essence, you want something along the following lines:
typedef struct interpose_s {
    void *new_func;
    void *orig_func;
} interpose_t;

CFStringRef myCFHTTPMessageCopyHeaderFieldValue (
    CFHTTPMessageRef message,
    CFStringRef headerField
);

static const interpose_t interposers[] __attribute__ ((section("__DATA, __interpose"))) = {
    { (void *)myCFHTTPMessageCopyHeaderFieldValue, (void *)CFHTTPMessageCopyHeaderFieldValue }
};

CFStringRef myCFHTTPMessageCopyHeaderFieldValue (
    CFHTTPMessageRef message,
    CFStringRef headerField
) {
    if (CFStringCompare(headerField, CFSTR("Content-Type"), 0) == kCFCompareEqualTo) {
        return CFSTR("audio/x-mpeg");
    } else {
        return CFHTTPMessageCopyHeaderFieldValue(message, headerField);
    }
}
You might want to add logic specific to your application in terms of handling the Content-Type field lest your application break in weird and wonderful ways when every HTTP request is determined to be an audio file.
Try replacing http:// with icy://.
Just create an instance like this...
QTMovie *aPlayer = [QTMovie movieWithAttributes:[NSDictionary dictionaryWithObjectsAndKeys:
                        fileUrl, QTMovieURLAttribute,
                        [NSNumber numberWithBool:YES], QTMovieOpenForPlaybackAttribute,
                        /*[NSNumber numberWithBool:YES], QTMovieOpenAsyncOKAttribute,*/
                        nil] error:error];