Compositing 2 videos on top of each other with alpha - objective-c

AVFoundation allows you to "compose" 2 assets (2 videos) as 2 "tracks", just like in Final Cut Pro, for example.
In theory I can have 2 videos on top of each other, with alpha, and see both.
Either I'm doing something wrong, or there's a bug somewhere, because the following test code, although a bit messy, should show 2 videos, yet I only see one, as seen here: http://lockerz.com/s/172403384 -- the "blue" square is IMG_1388.m4v.
For whatever reason, IMG_1383.MOV is never shown.
NSError *error = nil;
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithBool:YES], AVURLAssetPreferPreciseDurationAndTimingKey, nil];
AVMutableComposition *composition = [AVMutableComposition composition];
CMTimeRange timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(4, 1));
AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
// Track B
NSURL *urlVideo2 = [NSURL URLWithString:@"file://localhost/Users/me/Movies/Temp/IMG_1388.m4v"];
AVAsset *video2 = [AVURLAsset URLAssetWithURL:urlVideo2 options:options];
AVMutableCompositionTrack *videoTrack2 = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:0];
NSArray *videoAssetTracks2 = [video2 tracksWithMediaType:AVMediaTypeVideo];
AVAssetTrack *videoAssetTrack2 = ([videoAssetTracks2 count] > 0 ? [videoAssetTracks2 objectAtIndex:0] : nil);
[videoTrack2 insertTimeRange:timeRange ofTrack:videoAssetTrack2 atTime:kCMTimeZero error:&error];
AVMutableVideoCompositionLayerInstruction *to = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoAssetTrack2];
[to setOpacity:.5 atTime:kCMTimeZero];
[to setTransform:CGAffineTransformScale(videoAssetTrack2.preferredTransform, .5, .5) atTime:kCMTimeZero];
// Track A
NSURL *urlVideo = [NSURL URLWithString:@"file://localhost/Users/me/Movies/Temp/IMG_1383.MOV"];
AVURLAsset *video = [AVURLAsset URLAssetWithURL:urlVideo options:options];
AVMutableCompositionTrack *videoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:1];
NSArray *videoAssetTracks = [video tracksWithMediaType:AVMediaTypeVideo];
AVAssetTrack *videoAssetTrack = ([videoAssetTracks count] > 0 ? [videoAssetTracks objectAtIndex:0] : nil);
[videoTrack insertTimeRange:timeRange ofTrack:videoAssetTrack atTime:kCMTimeZero error:nil];
AVMutableVideoCompositionLayerInstruction *from = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoAssetTrack];
[from setOpacity:.5 atTime:kCMTimeZero];
// Video Compostion
AVMutableVideoCompositionInstruction *transition = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
transition.backgroundColor = [[UIColor clearColor] CGColor];
transition.timeRange = timeRange;
transition.layerInstructions = [NSArray arrayWithObjects:to, from, nil];
videoComposition.instructions = [NSArray arrayWithObjects:transition, nil];
videoComposition.frameDuration = CMTimeMake(1, 30);
videoComposition.renderSize = CGSizeMake(480, 360);
// Export
NSURL *outputURL = [NSURL URLWithString:@"file://localhost/Users/me/Movies/Temp/export.MOV"];
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:[[composition copy] autorelease] presetName:AVAssetExportPresetHighestQuality];
[exportSession setOutputFileType:@"com.apple.quicktime-movie"];
exportSession.outputURL = outputURL;
exportSession.videoComposition = videoComposition;
[exportSession exportAsynchronouslyWithCompletionHandler:nil];
// Player
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:composition];
playerItem.videoComposition = videoComposition;
AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
Do you see anything wrong?
The "goal" of this code is to "record" the camera input (video 1) and the OpenGL output (video 2). I also tried to "compose" them "directly" with buffers and all that, but I was unsuccessful there as well :( Turns out AVFoundation is far less trivial than I thought.

It looks good, except this part:
AVMutableVideoCompositionLayerInstruction *from = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoAssetTrack];
AVMutableVideoCompositionLayerInstruction *to = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoAssetTrack2];
You need to build the layer instructions from videoTrack and videoTrack2 (the tracks added to the composition) instead of from the original asset tracks videoAssetTrack and videoAssetTrack2.
Also, adding a transform to rotate the video is a bit trickier (like anything in AVFoundation beyond the basics).
I've just commented out that line to make it play the 2 videos.
This is your code with the modifications:
NSError *error = nil;
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithBool:YES], AVURLAssetPreferPreciseDurationAndTimingKey, nil];
AVMutableComposition *composition = [AVMutableComposition composition];
CMTimeRange timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(4, 1));
AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
// Track B
NSURL *urlVideo2 = [[NSBundle mainBundle] URLForResource:@"b" withExtension:@"mov"];
AVAsset *video2 = [AVURLAsset URLAssetWithURL:urlVideo2 options:options];
AVMutableCompositionTrack *videoTrack2 = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:0];
NSArray *videoAssetTracks2 = [video2 tracksWithMediaType:AVMediaTypeVideo];
AVAssetTrack *videoAssetTrack2 = ([videoAssetTracks2 count] > 0 ? [videoAssetTracks2 objectAtIndex:0] : nil);
[videoTrack2 insertTimeRange:timeRange ofTrack:videoAssetTrack2 atTime:kCMTimeZero error:&error];
AVMutableVideoCompositionLayerInstruction *to = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack2];
[to setOpacity:.5 atTime:kCMTimeZero];
//[to setTransform:CGAffineTransformScale(videoAssetTrack2.preferredTransform, .5, .5) atTime:kCMTimeZero];
// Track A
NSURL *urlVideo = [[NSBundle mainBundle] URLForResource:@"a" withExtension:@"mov"];
AVURLAsset *video = [AVURLAsset URLAssetWithURL:urlVideo options:options];
AVMutableCompositionTrack *videoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:1];
NSArray *videoAssetTracks = [video tracksWithMediaType:AVMediaTypeVideo];
AVAssetTrack *videoAssetTrack = ([videoAssetTracks count] > 0 ? [videoAssetTracks objectAtIndex:0] : nil);
[videoTrack insertTimeRange:timeRange ofTrack:videoAssetTrack atTime:kCMTimeZero error:nil];
AVMutableVideoCompositionLayerInstruction *from = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
[from setOpacity:.5 atTime:kCMTimeZero];
// Video Compostion
AVMutableVideoCompositionInstruction *transition = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
transition.backgroundColor = [[UIColor clearColor] CGColor];
transition.timeRange = timeRange;
transition.layerInstructions = [NSArray arrayWithObjects:to, from, nil];
videoComposition.instructions = [NSArray arrayWithObjects:transition, nil];
videoComposition.frameDuration = CMTimeMake(1, 30);
videoComposition.renderSize = composition.naturalSize; // CGSizeMake(480, 360);
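If you want the half-size overlay back, a minimal sketch of one way to re-apply it (assuming the source track's preferredTransform already encodes the camera orientation) is to concatenate that transform with the scale and set it on the layer instruction built from the composition track:
// Sketch: re-enable the 50% overlay for track B.
CGAffineTransform overlayTransform = videoAssetTrack2.preferredTransform; // orientation from the source asset
overlayTransform = CGAffineTransformConcat(overlayTransform, CGAffineTransformMakeScale(0.5, 0.5)); // shrink to half size
[to setTransform:overlayTransform atTime:kCMTimeZero];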

I think you've got it wrong.
A video file may contain multiple streams of data. For example, a video with sound has 2 streams: the audio stream and the video stream. Another example is a surround-sound video file, which may contain 5 or more audio streams and 1 video stream.
As with audio, most video container formats (mov, mp4, etc.) support multiple video streams in one file, but that doesn't mean the streams have any relation to each other; they are simply stored in the same container. If you open such a file with QuickTime, for example, you get as many windows as there are video streams in the file.
Either way, the video streams will not get 'mixed' this way.
What you're trying to achieve is related to signal processing of the video stream, and I really recommend reading more about it.
If you don't actually need to 'mix' the video data into a file, you might want to display both video files on top of each other using two players, as in the sketch below. Keep in mind that processing video data is usually a CPU-intensive problem that you might (sometimes) not be able to solve on today's iOS devices.
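If on-screen playback is all you need, a minimal sketch (using AVPlayerLayer rather than the MP-player classes, reusing urlVideo / urlVideo2 from the question; overlayView is a hypothetical UIView) would be to stack two player layers and lower the opacity of the top one:
// Sketch: on-screen overlay only, no exported file. overlayView is a hypothetical UIView.
AVPlayer *playerA = [AVPlayer playerWithURL:urlVideo];   // bottom video
AVPlayer *playerB = [AVPlayer playerWithURL:urlVideo2];  // top video
AVPlayerLayer *layerA = [AVPlayerLayer playerLayerWithPlayer:playerA];
AVPlayerLayer *layerB = [AVPlayerLayer playerLayerWithPlayer:playerB];
layerA.frame = overlayView.bounds;
layerB.frame = overlayView.bounds;
layerB.opacity = 0.5f;                                   // let the bottom video show through
[overlayView.layer addSublayer:layerA];
[overlayView.layer addSublayer:layerB];
[playerA play];
[playerB play];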

Related

AVFoundation - AVAssetExportSession - Operation Stopped on Second Export attempt

I am creating a Picture-in-Picture video, and this function has worked flawlessly (as far as I know) for 1.5 years. Now, on iOS 11, it appears to work only the first time it is called... when it is called to do a second video (without force-closing the app first) I get the error message below.
I found this article on Stack Overflow, but I am already using the asset track correctly as per this article: AVAssetExportSession export fails non-deterministically with error: “Operation Stopped, NSLocalizedFailureReason=The video could not be composed.”
I have included the exact method I am using. Any help would be greatly appreciated!
Error Message:
Error: Error Domain=AVFoundationErrorDomain Code=-11841 "Operation Stopped"
UserInfo={NSLocalizedFailureReason=The video could not be composed.,
NSLocalizedDescription=Operation Stopped,
NSUnderlyingError=0x1c04521e0
{Error Domain=NSOSStatusErrorDomain Code=-17390 "(null)"}}
Method Below:
- (void) composeVideo:(NSString*)videoPIP onVideo:(NSString*)videoBG
{
@try {
NSError *e = nil;
AVURLAsset *backAsset, *pipAsset;
// Load our 2 movies using AVURLAsset
pipAsset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:videoPIP] options:nil];
backAsset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:videoBG] options:nil];
if ([[NSFileManager defaultManager] fileExistsAtPath:videoPIP])
{
NSLog(#"PIP File Exists!");
}
else
{
NSLog(#"PIP File DOESN'T Exist!");
}
if ([[NSFileManager defaultManager] fileExistsAtPath:videoBG])
{
NSLog(#"BG File Exists!");
}
else
{
NSLog(#"BG File DOESN'T Exist!");
}
float scaleH = VIDEO_SIZE.height / [[[backAsset tracksWithMediaType:AVMediaTypeVideo ] objectAtIndex:0] naturalSize].width;
float scaleW = VIDEO_SIZE.width / [[[backAsset tracksWithMediaType:AVMediaTypeVideo ] objectAtIndex:0] naturalSize].height;
float scalePIP = (VIDEO_SIZE.width * 0.25) / [[[pipAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] naturalSize].width;
// Create AVMutableComposition Object - this object will hold our multiple AVMutableCompositionTracks.
AVMutableComposition* mixComposition = [[AVMutableComposition alloc] init];
// Create the first AVMutableCompositionTrack by adding a new track to our AVMutableComposition.
AVMutableCompositionTrack *firstTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
// Set the length of the firstTrack equal to the length of the firstAsset and add the firstAsset to our newly created track at kCMTimeZero so video plays from the start of the track.
[firstTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, pipAsset.duration) ofTrack:[[pipAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error:&e];
if (e)
{
NSLog(#"Error0: %#",e);
e = nil;
}
// Repeat the same process for the 2nd track and also start at kCMTimeZero so both tracks will play simultaneously.
AVMutableCompositionTrack *secondTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
[secondTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, backAsset.duration) ofTrack:[[backAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error:&e];
if (e)
{
NSLog(#"Error1: %#",e);
e = nil;
}
// We also need the audio track!
AVMutableCompositionTrack *audioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
[audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, backAsset.duration) ofTrack:[[backAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:kCMTimeZero error:&e];
if (e)
{
NSLog(#"Error2: %#",e);
e = nil;
}
// Create an AVMutableVideoCompositionInstruction object - Contains the array of AVMutableVideoCompositionLayerInstruction objects.
AVMutableVideoCompositionInstruction * MainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
// Set Time to the shorter Asset.
MainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, (pipAsset.duration.value > backAsset.duration.value) ? pipAsset.duration : backAsset.duration);
// Create an AVMutableVideoCompositionLayerInstruction object to make use of CGAffinetransform to move and scale our First Track so it is displayed at the bottom of the screen in smaller size.
AVMutableVideoCompositionLayerInstruction *FirstlayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:firstTrack];
//CGAffineTransform Scale1 = CGAffineTransformMakeScale(0.3f,0.3f);
CGAffineTransform Scale1 = CGAffineTransformMakeScale(scalePIP, scalePIP);
// Top Left
CGAffineTransform Move1 = CGAffineTransformMakeTranslation(3.0, 3.0);
[FirstlayerInstruction setTransform:CGAffineTransformConcat(Scale1,Move1) atTime:kCMTimeZero];
// Repeat for the second track.
AVMutableVideoCompositionLayerInstruction *SecondlayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:secondTrack];
CGAffineTransform Scale2 = CGAffineTransformMakeScale(scaleW, scaleH);
CGAffineTransform rotateBy90Degrees = CGAffineTransformMakeRotation( M_PI_2);
CGAffineTransform Move2 = CGAffineTransformMakeTranslation(0.0, ([[[backAsset tracksWithMediaType:AVMediaTypeVideo ] objectAtIndex:0] naturalSize].height) * -1);
[SecondlayerInstruction setTransform:CGAffineTransformConcat(Move2, CGAffineTransformConcat(rotateBy90Degrees, Scale2)) atTime:kCMTimeZero];
// Add the 2 created AVMutableVideoCompositionLayerInstruction objects to our AVMutableVideoCompositionInstruction.
MainInstruction.layerInstructions = [NSArray arrayWithObjects:FirstlayerInstruction, SecondlayerInstruction, nil];
// Create an AVMutableVideoComposition object.
AVMutableVideoComposition *MainCompositionInst = [AVMutableVideoComposition videoComposition];
MainCompositionInst.instructions = [NSArray arrayWithObject:MainInstruction];
MainCompositionInst.frameDuration = CMTimeMake(1, 30);
// Set the render size to the screen size.
// MainCompositionInst.renderSize = [[UIScreen mainScreen] bounds].size;
MainCompositionInst.renderSize = VIDEO_SIZE;
NSString *fileName = [NSString stringWithFormat:@"%@%@", NSTemporaryDirectory(), @"fullreaction.MP4"];
// Make sure the video doesn't exist.
if ([[NSFileManager defaultManager] fileExistsAtPath:fileName])
{
[[NSFileManager defaultManager] removeItemAtPath:fileName error:nil];
}
// Now we need to save the video.
NSURL *url = [NSURL fileURLWithPath:fileName];
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition
presetName:QUALITY];
exporter.videoComposition = MainCompositionInst;
exporter.outputURL=url;
exporter.outputFileType = AVFileTypeMPEG4;
[exporter exportAsynchronouslyWithCompletionHandler:
^(void )
{
NSLog(#"File Saved as %#!", fileName);
NSLog(#"Error: %#", exporter.error);
[self performSelectorOnMainThread:#selector(runProcessingComplete) withObject:nil waitUntilDone:false];
}];
}
@catch (NSException *ex) {
UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Error 3" message:[NSString stringWithFormat:@"%@", ex]
delegate:self cancelButtonTitle:@"OK" otherButtonTitles:nil];
[alert show];
}
}
The Cause:
It turns out the "MainInstruction" timeRange is incorrect.
CMTime values cannot be compared with their raw value fields, because the two durations may have different timescales. Instead, you must use CMTIME_COMPARE_INLINE.
To fix, replace this line:
MainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, (pipAsset.duration.value > backAsset.duration.value) ? pipAsset.duration : backAsset.duration);
With this line:
MainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, CMTIME_COMPARE_INLINE(pipAsset.duration, >, backAsset.duration) ? pipAsset.duration : backAsset.duration);
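If you'd rather avoid the macro, Core Media's CMTimeMaximum() gives the same result:
// Equivalent, using CMTimeMaximum from Core Media:
MainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeMaximum(pipAsset.duration, backAsset.duration));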

iOS: AVPlayerItem to AVAsset

I have 2 AVAssets, and I have changes applied via a videoComposition and an audioMix on an AVPlayerItem. After that, I take the asset from the AVPlayerItem, but the videoComposition and audioMix are not applied.
I want the resulting asset to have both the videoComposition and the audioMix applied.
Here's the code.
+ (AVAsset *)InitAsset:(AVAsset *)asset AtTime:(double)start ToTime:(double)end {
CGFloat colorComponents[4] = {1.0,1.0,1.0,0.0};
//Create AVMutableComposition Object.This object will hold our multiple AVMutableCompositionTrack.
AVMutableComposition* mixComposition = [[AVMutableComposition alloc] init];
//Here we are creating the first AVMutableCompositionTrack.See how we are adding a new track to our AVMutableComposition.
AVMutableCompositionTrack *masterTrack =
[mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
preferredTrackID:kCMPersistentTrackID_Invalid];
//Now we set the length of the firstTrack equal to the length of the firstAsset and add the firstAsset to our newly created track at kCMTimeZero so the video plays from the start of the track.
[masterTrack insertTimeRange:CMTimeRangeMake(CMTimeMakeWithSeconds(start, 1), CMTimeMakeWithSeconds(end, 1))
ofTrack:[[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
atTime:kCMTimeZero error:nil];
// Each video layer instruction
AVMutableVideoCompositionLayerInstruction *masterLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:masterTrack];
[masterLayerInstruction setOpacity:1.0f atTime:kCMTimeZero];
[masterLayerInstruction setOpacityRampFromStartOpacity:1.0f
toEndOpacity:0.0
timeRange:CMTimeRangeMake(CMTimeMakeWithSeconds(end, 1), CMTimeMakeWithSeconds(end + ANIMATION_FADE_TIME, 1))];
//See how we are creating the AVMutableVideoCompositionInstruction object. This object will contain the array of our AVMutableVideoCompositionLayerInstruction objects. You set the duration of the layer; it should be equal to the duration of the longer asset.
AVMutableVideoCompositionInstruction * MainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
[MainInstruction setTimeRange:CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(end + ANIMATION_FADE_TIME, 1))];
[MainInstruction setLayerInstructions:[NSArray arrayWithObjects:masterLayerInstruction,nil]];
[MainInstruction setBackgroundColor:CGColorCreate(CGColorSpaceCreateDeviceRGB(), colorComponents)];
//Now we create the AVMutableVideoComposition object. We can add multiple AVMutableVideoCompositionInstruction objects to it; we have only one in our example. You can use multiple AVMutableVideoCompositionInstruction objects to add multiple layers of effects such as fades and transitions, but make sure that their time ranges don't overlap.
AVMutableVideoComposition *MainCompositionInst = [AVMutableVideoComposition videoComposition];
MainCompositionInst.instructions = [NSArray arrayWithObject:MainInstruction];
MainCompositionInst.frameDuration = CMTimeMake(1, 30);
MainCompositionInst.renderSize = CGSizeMake(1280, 720);
// [MainCompositionInst setFra]
AVMutableCompositionTrack *masterAudio = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
[masterAudio insertTimeRange:CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(end + ANIMATION_FADE_TIME, 1))
ofTrack:[[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:kCMTimeZero error:nil];
// Each Audio
AVMutableAudioMixInputParameters *masterAudioMix = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:masterAudio];
[masterAudioMix setVolume:1.0f atTime:kCMTimeZero];
[masterAudioMix setVolumeRampFromStartVolume:1.0f
toEndVolume:0.0f
timeRange:CMTimeRangeMake(CMTimeMakeWithSeconds(end, 1), CMTimeMakeWithSeconds(end + ANIMATION_FADE_TIME, 1))];
// [SecondTrackMix setVolume:1.0f atTime:CMTimeMake(2.01, 1)];
AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
audioMix.inputParameters = [NSArray arrayWithObjects:masterAudioMix,nil];
//Finally just add the newly created AVMutableComposition with multiple tracks to an AVPlayerItem and play it using AVPlayer.
AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:mixComposition];
item.videoComposition = MainCompositionInst;
item.audioMix = audioMix;
return [item asset];
}
Does anyone have any idea?
Best Regards.
Use AVAssetExportSession...
"An AVAssetExportSession object transcodes the contents of an AVAsset source object to create an output of the form described by a specified export preset."
Use AVAssetExportSession's audioMix and videoComposition properties.
audioMix
Indicates whether non-default audio mixing is enabled for export, and supplies the parameters for audio mixing.
@property(nonatomic, copy) AVAudioMix *audioMix
videoComposition
Indicates whether video composition is enabled for export, and supplies the instructions for video composition.
@property(nonatomic, copy) AVVideoComposition *videoComposition
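A minimal sketch of that export, reusing the names from the question (mixComposition, MainCompositionInst, audioMix) and assuming outputURL is a writable file URL you provide:
// Sketch: bake the videoComposition and audioMix into a new file, then reload it as a plain asset.
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                                   presetName:AVAssetExportPresetHighestQuality];
exporter.videoComposition = MainCompositionInst;
exporter.audioMix = audioMix;
exporter.outputFileType = AVFileTypeQuickTimeMovie;
exporter.outputURL = outputURL; // assumption: a writable file URL
[exporter exportAsynchronouslyWithCompletionHandler:^{
    if (exporter.status == AVAssetExportSessionStatusCompleted) {
        AVAsset *flattenedAsset = [AVAsset assetWithURL:outputURL]; // the edits are now part of the asset
        // use flattenedAsset...
    }
}];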

CAKeyframeAnimation not animating CATextLayer when exporting video

I have an application in which I am attempting to put a timestamp on a video. To do this I am using AVFoundation and Core Animation to place a CATextLayer over the video layer. If I place text into the CATextLayer's string property, the string is properly displayed in the exported video. If I then add the animation to the CATextLayer, the text never changes. I figure I've overlooked something, but I can't find what it is.
Thank you in advance for any help.
Here is a code example.
AVMutableComposition *composition = [AVMutableComposition composition];
AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.frameDuration = CMTimeMake(1, 30);
AVMutableCompositionTrack *videoCompositionTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
AVMutableCompositionTrack *audioCompositionTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
NSDictionary *assetOptions = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES] forKey:AVURLAssetPreferPreciseDurationAndTimingKey];
AVAsset *asset = [[AVURLAsset alloc] initWithURL:myUrl options:assetOptions];
AVAssetTrack *audioAssetTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
[audioCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, [asset duration]) ofTrack:audioAssetTrack atTime:kCMTimeZero error:nil];
AVAssetTrack *videoAssetTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
[videoCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, [asset duration]) ofTrack:videoAssetTrack atTime:kCMTimeZero error:nil];
videoComposition.renderSize = videoCompositionTrack.naturalSize;
AVMutableVideoCompositionInstruction *videoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
videoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, composition.duration);
AVMutableVideoCompositionLayerInstruction *videoCompositionLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoCompositionTrack];
[videoCompositionLayerInstruction setOpacity:1.0f atTime:kCMTimeZero];
videoCompositionInstruction.layerInstructions = @[videoCompositionLayerInstruction];
videoComposition.instructions = @[videoCompositionInstruction];
CALayer *parentLayer = [CALayer layer];
parentLayer.frame = CGRectMake(0.0f, 0.0f, videoComposition.renderSize.width, videoComposition.renderSize.height);
CALayer *videoLayer = [CALayer layer];
videoLayer.frame = CGRectMake(0.0f, 0.0f, videoComposition.renderSize.width, videoComposition.renderSize.height);
[parentLayer addSublayer:videoLayer];
CATextLayer *textLayer = [CATextLayer layer];
textLayer.font = (__bridge CFTypeRef)([UIFont fontWithName:@"Helvetica Neue" size:45.0f]);
textLayer.fontSize = 45.0f;
textLayer.foregroundColor = [UIColor colorWithRed:1.0f green:1.0f blue:1.0f alpha:1.0f].CGColor;
textLayer.shadowColor = [UIColor colorWithRed:0.0f green:0.0f blue:0.0f alpha:1.0f].CGColor;
textLayer.shadowOffset = CGSizeMake(0.0f, 0.0f);
textLayer.shadowOpacity = 1.0f;
textLayer.shadowRadius = 4.0f;
textLayer.alignmentMode = kCAAlignmentCenter;
textLayer.truncationMode = kCATruncationNone;
CAKeyframeAnimation *keyFrameAnimation = [CAKeyframeAnimation animationWithKeyPath:@"string"];
// Step 8: Set the animation values to the object.
keyFrameAnimation.calculationMode = kCAAnimationLinear;
keyFrameAnimation.values = @[@"12:00:00", @"12:00:01", @"12:00:02", @"12:00:03",
@"12:00:04", @"12:00:05", @"12:00:06", @"12:00:07",
@"12:00:08", @"12:00:09"];
keyFrameAnimation.keyTimes = @[[NSNumber numberWithFloat:0.0f], [NSNumber numberWithFloat:0.1f],
[NSNumber numberWithFloat:0.2f], [NSNumber numberWithFloat:0.3f],
[NSNumber numberWithFloat:0.4f], [NSNumber numberWithFloat:0.5f],
[NSNumber numberWithFloat:0.6f], [NSNumber numberWithFloat:0.7f],
[NSNumber numberWithFloat:0.8f], [NSNumber numberWithFloat:0.9f]];
keyFrameAnimation.beginTime = AVCoreAnimationBeginTimeAtZero;
keyFrameAnimation.duration = CMTimeGetSeconds(composition.duration);
keyFrameAnimation.removedOnCompletion = YES;
[textLayer addAnimation:keyFrameAnimation forKey:@"string"];
textLayer.frame = CGRectMake(0.0f, 20.0f, videoComposition.renderSize.width, 55.0f);
[parentLayer addSublayer:textLayer];
videoComposition.animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];
AVAssetExportSession *exportSession = [AVAssetExportSession exportSessionWithAsset:composition presetName:AVAssetExportPreset1920x1080];
exportSession.videoComposition = videoComposition;
exportSession.audioMix = audioMix;
exportSession.outputFileType = [exportSession.supportedFileTypes objectAtIndex:0];
exportSession.outputURL = mySaveUrl;
[exportSession exportAsynchronouslyWithCompletionHandler:^{
self.delegate = nil;
switch (exportSession.status) {
case AVAssetExportSessionStatusCancelled:
NSLog(#"Export session cancelled.");
break;
case AVAssetExportSessionStatusCompleted:
NSLog(#"Export session completed.");
break;
case AVAssetExportSessionStatusFailed:
NSLog(#"Export session failed.");
break;
case AVAssetExportSessionStatusUnknown:
NSLog(#"Export session unknown status.");
break;
case AVAssetExportSessionStatusWaiting:
NSLog(#"Export session waiting.");
break;
default:
break;
}
NSError *error = exportSession.error;
if (error != nil) {
NSLog(#"An error ocurred while exporting. Error: %#: %#", error.localizedDescription, error.localizedFailureReason);
}
}];
I stayed in contact with Apple support, and they told me it's a bug in their SDK. They are trying to fix the issue.
I also opened an incident in the Apple bug reporter. If you are an Apple developer, I recommend you open a new one to put more pressure on this incident.
https://bugreport.apple.com/logon
Best regards.
Finally I found a solution for the CATextLayer animation. It's probably not the best solution, but at least it works fine.
To work around the issue, render each NSString into a UIImage (or CALayer), put all those images (CGImages) in an NSMutableArray, load that array into a CAKeyframeAnimation, and animate the layer's contents property:
-(UIImage *)imageFromText:(NSString *)text
{
UIGraphicsBeginImageContextWithOptions(sizeImageFromText,NO,0.0);//Better resolution (antialiasing)
//UIGraphicsBeginImageContext(sizeImageFromText); // iOS is < 4.0
[text drawInRect:aRectangleImageFromText withAttributes:attributesImageFromText];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return image;
}
Adding images to NSMutableArray
[NSMutableArrayAux addObject:(__bridge id)[self imageFromText:@"Hello"].CGImage];
Set up a CAKeyframeAnimation that modifies the layer's contents (i.e., changes the image):
CAKeyframeAnimation *overlay = [CAKeyframeAnimation animationWithKeyPath:@"contents"];
overlay.values = [[NSArray alloc] initWithArray:NSMutableArrayAux];
It works really well, but it takes 1 or 2 seconds to build an array of 5,000 or 6,000 images, and with 15,000/16,000 images the export crashes with an overflow error. This is another bug in the framework.
As you know, this is an issue in the framework, and this is at least a solution until Apple fixes the CATextLayer animation issue; I have also sent this solution to Apple.
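Putting the pieces together, a sketch of the full animation setup (timestampLayer is a hypothetical plain CALayer that replaces the CATextLayer, and composition is the AVMutableComposition from the question):
// Sketch: animate the layer's contents through the pre-rendered text images.
CAKeyframeAnimation *overlay = [CAKeyframeAnimation animationWithKeyPath:@"contents"];
overlay.values = [NSArray arrayWithArray:NSMutableArrayAux];   // CGImages built with imageFromText: above
overlay.calculationMode = kCAAnimationDiscrete;                // jump between frames, no interpolation
overlay.beginTime = AVCoreAnimationBeginTimeAtZero;            // required when exporting with the animation tool
overlay.duration = CMTimeGetSeconds(composition.duration);
overlay.removedOnCompletion = NO;
[timestampLayer addAnimation:overlay forKey:@"contents"];      // timestampLayer: hypothetical CALayer inside parentLayer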

What iOS framework is needed for this particular audio manipulation

Before I go sit down and read an entire book on Core Audio, I wanted to know if it is the best framework for me to study, or if AVFoundation can do what I need. I want to be able to download a small portion of an MP3 located on a remote server, let's say 20 seconds of the file, preferably without downloading the entire file first and then trimming it.
Then I want to layer 2 tracks of audio and bounce them into one file.
Will I need to delve into Core Audio, or can AVFoundation do the trick? Advice is much appreciated.
The downloading part is up to you, but if you want to mix 2 or more audio files into one, AVFoundation is probably the easiest route to take, using AVAssetExportSession to do the exporting and AVMutableAudioMix to do the mix. There is some example code for a simple editor floating around in the Apple docs, but I can't seem to find it; if I do, I will post the link.
Here is a method that actually does the mix. Keep in mind that I'm adding video here as well; _audioTracks and _videoTracks are mutable arrays containing AVAssets.
-(void)createMix
{
CGSize videoSize = [[_videoTracks objectAtIndex:0] naturalSize];
AVMutableComposition *composition = [AVMutableComposition composition];
AVMutableVideoComposition *videoComposition = nil;
AVMutableAudioMix *audioMix = nil;
composition.naturalSize = videoSize;
AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
AVAsset *videoAsset=[_videoTracks objectAtIndex:0];
CMTimeRange timeRangeInAsset = CMTimeRangeMake(kCMTimeZero, [videoAsset duration]);
AVAssetTrack *clipVideoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
[compositionVideoTrack insertTimeRange:timeRangeInAsset ofTrack:clipVideoTrack atTime:kCMTimeZero error:nil];
AVAssetTrack *clipAudioTrack = [[videoAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
[compositionAudioTrack insertTimeRange:timeRangeInAsset ofTrack:clipAudioTrack atTime:kCMTimeZero error:nil];
NSMutableArray *trackMixArray = [NSMutableArray array];
if(_audioTracks && _audioTracks.count>0)
{
for(AVAsset *audio in _audioTracks)
{
// CMTimeRange timeRangeInAsset = CMTimeRangeMake(kCMTimeZero, [audio duration]);
// AVAssetTrack *clipAudioTrack = [[audio tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
//[compositionAudioTrack insertTimeRange:timeRangeInAsset ofTrack:clipAudioTrack atTime:kCMTimeZero error:nil];
NSInteger i;
NSArray *tracksToDuck = [audio tracksWithMediaType:AVMediaTypeAudio]; // before we add the commentary
// Clip commentary duration to composition duration.
CMTimeRange commentaryTimeRange = CMTimeRangeMake(kCMTimeZero, audio.duration);
if (CMTIME_COMPARE_INLINE(CMTimeRangeGetEnd(commentaryTimeRange), >, [composition duration]))
commentaryTimeRange.duration = CMTimeSubtract([composition duration], commentaryTimeRange.start);
// Add the commentary track.
AVMutableCompositionTrack *compositionCommentaryTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
[compositionCommentaryTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, commentaryTimeRange.duration) ofTrack:[[audio tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:commentaryTimeRange.start error:nil];
CMTime rampDuration = CMTimeMake(1, 2); // half-second ramps
for (i = 0; i < [tracksToDuck count]; i++) {
AVMutableAudioMixInputParameters *trackMix = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:[tracksToDuck objectAtIndex:i]];
[trackMix setVolumeRampFromStartVolume:1.0 toEndVolume:0.2 timeRange:CMTimeRangeMake(CMTimeSubtract(commentaryTimeRange.start, rampDuration), rampDuration)];
[trackMix setVolumeRampFromStartVolume:0.2 toEndVolume:1.0 timeRange:CMTimeRangeMake(CMTimeRangeGetEnd(commentaryTimeRange), rampDuration)];
[trackMixArray addObject:trackMix];
}
}
}
// audioMix.inputParameters = trackMixArray;
if (videoComposition) {
// Every videoComposition needs these properties to be set:
videoComposition.frameDuration = CMTimeMake(1, 30); // 30 fps
videoComposition.renderSize = videoSize;
}
AVAssetExportSession *session = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPreset1280x720];
session.videoComposition = videoComposition;
session.audioMix = audioMix;
NSUInteger count = 0;
NSString *filePath;
do {
filePath = NSTemporaryDirectory();
NSString *numberString = count > 0 ? [NSString stringWithFormat:@"-%lu", (unsigned long)count] : @"";
filePath = [filePath stringByAppendingPathComponent:[NSString stringWithFormat:@"Output%@.mp4", numberString]];
count++;
} while([[NSFileManager defaultManager] fileExistsAtPath:filePath]);
session.outputURL = [NSURL fileURLWithPath:filePath];
session.outputFileType = AVFileTypeQuickTimeMovie;
[session exportAsynchronouslyWithCompletionHandler:^
{
dispatch_async(dispatch_get_main_queue(), ^{
NSLog(#"Exported");
if(session.error)
{
NSLog(#"had an error %#", session.error);
}
if(delegate && [delegate respondsToSelector:@selector(didFinishExportingMovie:)])
{
[delegate didFinishExportingMovie:filePath];
}
});
}];
}
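For the audio-only case in the question, a stripped-down sketch (urlA and urlB are hypothetical local file URLs for the two clips) would skip the video track entirely and drive the levels with AVMutableAudioMix:
// Sketch: combine two audio files into one composition and set per-track levels.
AVMutableComposition *mix = [AVMutableComposition composition];
NSMutableArray *mixParams = [NSMutableArray array];
for (NSURL *url in @[urlA, urlB]) { // urlA / urlB: hypothetical local file URLs
    AVAsset *asset = [AVAsset assetWithURL:url];
    AVMutableCompositionTrack *track = [mix addMutableTrackWithMediaType:AVMediaTypeAudio
                                                         preferredTrackID:kCMPersistentTrackID_Invalid];
    [track insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                   ofTrack:[[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                    atTime:kCMTimeZero error:nil];
    AVMutableAudioMixInputParameters *params = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:track];
    [params setVolume:0.8f atTime:kCMTimeZero]; // per-track level
    [mixParams addObject:params];
}
AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
audioMix.inputParameters = mixParams;
// Hand mix and audioMix to an AVAssetExportSession (or an AVPlayerItem) as in the method above.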
Hope it helps.
Daniel

Why is the only available AVAssetExportSession.outputFileType = AVFileTypeQuickTimeMovie

I've set up an AVAssetExportSession with just 2 tracks of audio and no video, which plays just like I want it to in the AVPlayer - but as I go to export it, the only available outputFileType is AVFileTypeQuickTimeMovie - Why can't I choose an audio format?
When I NSLog(@"%@", [session supportedFileTypes]); I get:
[51330:c07] (
"com.apple.quicktime-movie"
)
Here is my code:
- (AVMutableComposition *)getComposition {
AVAsset *backingAsset = [AVAsset assetWithURL:self.urlForEightBarAudioFile];
AVAsset *vocalsAsset = [AVAsset assetWithURL:self.recorder.url];
AVMutableComposition *composition = [AVMutableComposition composition];
AVMutableCompositionTrack *compositionBackingTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack *compositionVocalTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
AVAssetTrack *backingAssetTrack = [backingAsset.tracks objectAtIndex:0];
AVAssetTrack *vocalsAssetTrack = [vocalsAsset.tracks objectAtIndex:0];
CMTimeRange timeRange = CMTimeRangeFromTimeToTime(kCMTimeZero, backingAsset.duration);
[compositionBackingTrack insertTimeRange:timeRange ofTrack:backingAssetTrack atTime:kCMTimeZero error:nil];
[compositionVocalTrack insertTimeRange:timeRange ofTrack:vocalsAssetTrack atTime:kCMTimeZero error:nil];
return composition;
}
- (IBAction)acceptRecording:(id)sender {
AVAssetExportSession * session = [[AVAssetExportSession alloc] initWithAsset:[self getComposition] presetName:AVAssetExportPresetMediumQuality];
NSURL *output = [self.urlForPathToEightBarRecordings URLByAppendingPathComponent:@"mix.mov"];
session.outputURL = output;
session.outputFileType = AVFileTypeQuickTimeMovie;
NSLog(#"%#", [session supportedFileTypes]);
[session exportAsynchronouslyWithCompletionHandler:^() {
switch (session.status) {
case AVAssetExportSessionStatusCompleted:
NSLog(#"It's done...hallelujah");
break;
default:
break;
}
}];
}
Ah, right. The reason it was only giving me the option of a QuickTime movie was that my preset was set to AVAssetExportPresetMediumQuality, which I guess is a video-only preset. I set my preset to AVAssetExportPresetAppleM4A and the output file type to AVFileTypeAppleM4A, and the export was a success!
You can use these settings for 128 kbps:
Preset: AVAssetExportPresetMediumQuality
Output file type: AVFileTypeMPEG4
Format: mp4
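For the M4A route from the first answer, a sketch of the corrected export block, based on the question's code, might look like this:
// Sketch: audio-only export with an audio preset and a matching file type.
AVAssetExportSession *session = [[AVAssetExportSession alloc] initWithAsset:[self getComposition]
                                                                  presetName:AVAssetExportPresetAppleM4A];
session.outputURL = [self.urlForPathToEightBarRecordings URLByAppendingPathComponent:@"mix.m4a"];
session.outputFileType = AVFileTypeAppleM4A; // now listed in supportedFileTypes for this preset
[session exportAsynchronouslyWithCompletionHandler:^{
    if (session.status == AVAssetExportSessionStatusCompleted) {
        NSLog(@"Audio export finished.");
    }
}];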