AVMutableComposition Black Screen on Replay After 3rd clip - objective-c

In my app I am attempting to merge video clips dynamically. I have two properties: one holds the previous recording and the other holds the next recording to be appended to the end. This basically acts as a pause function, and the user can then play the result back on the fly.
The app writes the first clip to "video.mp4" in the documents directory and sets it as the previous recording. It then writes the next clip to "video2.mp4", sets it as the next recording, and merges the two with AVMutableComposition:
AVMutableComposition *mashVideosTogether = [AVMutableComposition composition];
NSError *error;
if ([mashVideosTogether insertTimeRange:CMTimeRangeMake(kCMTimeZero, [self.previousRecording duration])
                                ofAsset:self.previousRecording
                                 atTime:kCMTimeZero
                                  error:&error]) NSLog(@"Successfully added one");
else NSLog(@"error: %@", error.description);
if ([mashVideosTogether insertTimeRange:CMTimeRangeMake(kCMTimeZero, [self.nextRecording duration])
                                ofAsset:self.nextRecording
                                 atTime:CMTimeAdd(kCMTimeZero, [self.previousRecording duration])
                                  error:&error]) NSLog(@"Success on 2");
else NSLog(@"error: %@", error.description);
This appends the first and second videos. I then export the result to "combined.mp4"; when that finishes successfully I delete the file at "video.mp4" and export the combined video to "video.mp4" as well (so at this point the combined video exists in two places). This plays fine in my player. If the user taps record again, the newly combined video at "video.mp4" is set as the previous recording, the newly recorded clip is set as the next recording, and the whole process repeats: they are appended and exported all over again.
However, once I add a third (or more) clip, the earlier clips in the new composition play back black. Their duration is kept, but there is no video or audio. In other words, any time I build a new composition from an old composition, the old composition's content shows up blank; only its duration and the new recording are preserved. Is this data lost when a composition is fed into another composition? Do I need to manually add them as tracks? Any help is appreciated!
SOLVED
I read over Apple's AVEditDemo and it seems my original assumption was correct: when I was appending videos together using AVMutableComposition alone (i.e. without creating individual composition tracks and inserting into them), the track data was lost when that composition was added to another composition.
So I created individual composition tracks for the audio and video of every clip when merging them, and now I have a working setup where I can shoot video, stop, then begin shooting again, and the clips are concatenated on the fly.
if (self.previousRecording && self.nextRecording) {
    NSArray *assetArray = [NSArray arrayWithObjects:
                           self.previousRecording, self.nextRecording, nil];
    NSURL *fileURL = [self getURLWithPathComponent:@"combined.mp4"];

    AVMutableComposition *mashVideosTogether = [AVMutableComposition composition];
    NSError *error;
    CMTime nextClipStartTime = kCMTimeZero;

    AVMutableCompositionTrack *compositionVideoTrack =
        [mashVideosTogether addMutableTrackWithMediaType:AVMediaTypeVideo
                                        preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack *compositionAudioTrack =
        [mashVideosTogether addMutableTrackWithMediaType:AVMediaTypeAudio
                                        preferredTrackID:kCMPersistentTrackID_Invalid];

    for (int i = 0; i < [assetArray count]; i++) {
        AVURLAsset *asset = [assetArray objectAtIndex:i];
        CMTimeRange timeRangeInAsset = CMTimeRangeMake(kCMTimeZero, [asset duration]);

        AVAssetTrack *clipVideoTrack =
            [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
        [compositionVideoTrack insertTimeRange:timeRangeInAsset
                                       ofTrack:clipVideoTrack atTime:nextClipStartTime error:nil];

        AVAssetTrack *clipAudioTrack =
            [[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
        [compositionAudioTrack insertTimeRange:timeRangeInAsset
                                       ofTrack:clipAudioTrack atTime:nextClipStartTime error:nil];

        nextClipStartTime = CMTimeAdd(nextClipStartTime, timeRangeInAsset.duration);
    }

    // do exports down here and then reset previous recording, etc.
}
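The export step (where the comment above says to do exports, inside that if block) might look roughly like the following sketch. This is only a sketch: the preset, output file type, and cleanup are assumptions rather than the exact code I used.
// Export sketch (assumed preset and file type; fileURL is the combined.mp4 URL from above).
AVAssetExportSession *exporter =
    [[AVAssetExportSession alloc] initWithAsset:mashVideosTogether
                                     presetName:AVAssetExportPresetHighestQuality];
// Remove any stale file at the destination so the export does not fail.
[[NSFileManager defaultManager] removeItemAtURL:fileURL error:nil];
exporter.outputURL = fileURL;
exporter.outputFileType = AVFileTypeMPEG4;
[exporter exportAsynchronouslyWithCompletionHandler:^{
    if (exporter.status == AVAssetExportSessionStatusCompleted) {
        // Overwrite video.mp4 with the combined file and reset previousRecording here.
    } else {
        NSLog(@"Export failed: %@", exporter.error);
    }
}];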

Related

iOS: AVPlayerItem to AVAsset

I have two AVAssets, and I apply a videoComposition and an audioMix to an AVPlayerItem. After that, I take the asset back from the AVPlayerItem, but the videoComposition and audioMix are not applied.
I want the resulting asset to have both the videoComposition and the audioMix applied.
Here's the code.
+ (AVAsset *)InitAsset:(AVAsset *)asset AtTime:(double)start ToTime:(double)end {
CGFloat colorComponents[4] = {1.0,1.0,1.0,0.0};
//Create AVMutableComposition Object.This object will hold our multiple AVMutableCompositionTrack.
AVMutableComposition* mixComposition = [[AVMutableComposition alloc] init];
//Here we are creating the first AVMutableCompositionTrack.See how we are adding a new track to our AVMutableComposition.
AVMutableCompositionTrack *masterTrack =
[mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
preferredTrackID:kCMPersistentTrackID_Invalid];
//Now we set the length of the firstTrack equal to the length of the firstAsset and add the firstAsset to out newly created track at kCMTimeZero so video plays from the start of the track.
[masterTrack insertTimeRange:CMTimeRangeMake(CMTimeMakeWithSeconds(start, 1), CMTimeMakeWithSeconds(end, 1))
ofTrack:[[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
atTime:kCMTimeZero error:nil];
// Each video layer instruction
AVMutableVideoCompositionLayerInstruction *masterLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:masterTrack];
[masterLayerInstruction setOpacity:1.0f atTime:kCMTimeZero];
[masterLayerInstruction setOpacityRampFromStartOpacity:1.0f
toEndOpacity:0.0
timeRange:CMTimeRangeMake(CMTimeMakeWithSeconds(end, 1), CMTimeMakeWithSeconds(end + ANIMATION_FADE_TIME, 1))];
//See how we are creating the AVMutableVideoCompositionInstruction object. This object will contain the array of our AVMutableVideoCompositionLayerInstruction objects. You set the duration of the layer; it should be at least as long as the duration of the longer asset.
AVMutableVideoCompositionInstruction * MainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
[MainInstruction setTimeRange:CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(end + ANIMATION_FADE_TIME, 1))];
[MainInstruction setLayerInstructions:[NSArray arrayWithObjects:masterLayerInstruction,nil]];
[MainInstruction setBackgroundColor:CGColorCreate(CGColorSpaceCreateDeviceRGB(), colorComponents)];
//Now we create the AVMutableVideoComposition object. We can add multiple AVMutableVideoCompositionInstruction objects to it; we have only one in this example. You can use multiple AVMutableVideoCompositionInstruction objects to add layered effects such as fades and transitions, but make sure their time ranges don't overlap.
AVMutableVideoComposition *MainCompositionInst = [AVMutableVideoComposition videoComposition];
MainCompositionInst.instructions = [NSArray arrayWithObject:MainInstruction];
MainCompositionInst.frameDuration = CMTimeMake(1, 30);
MainCompositionInst.renderSize = CGSizeMake(1280, 720);
// [MainCompositionInst setFra]
AVMutableCompositionTrack *masterAudio = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
[masterAudio insertTimeRange:CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(end + ANIMATION_FADE_TIME, 1))
ofTrack:[[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:kCMTimeZero error:nil];
// Each Audio
AVMutableAudioMixInputParameters *masterAudioMix = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:masterAudio];
[masterAudioMix setVolume:1.0f atTime:kCMTimeZero];
[masterAudioMix setVolumeRampFromStartVolume:1.0f
toEndVolume:0.0f
timeRange:CMTimeRangeMake(CMTimeMakeWithSeconds(end, 1), CMTimeMakeWithSeconds(end + ANIMATION_FADE_TIME, 1))];
// [SecondTrackMix setVolume:1.0f atTime:CMTimeMake(2.01, 1)];
AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
audioMix.inputParameters = [NSArray arrayWithObjects:masterAudioMix,nil];
//Finally just add the newly created AVMutableComposition with multiple tracks to an AVPlayerItem and play it using AVPlayer.
AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:mixComposition];
item.videoComposition = MainCompositionInst;
item.audioMix = audioMix;
return [item asset];
}
Does anyone have any idea?
Best regards.
Use AVAssetExportSession...
"An AVAssetExportSession object transcodes the contents of an AVAsset source object to create an output of the form described by a specified export preset."
Use AVAssetExportSession's audioMix and videoComposition properties.
audioMix
Indicates whether non-default audio mixing is enabled for export, and supplies the parameters for audio mixing.
@property(nonatomic, copy) AVAudioMix *audioMix
videoComposition
Indicates whether video composition is enabled for export, and supplies the instructions for video composition.
@property(nonatomic, copy) AVVideoComposition *videoComposition
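Put together, wiring the question's composition, video composition, and audio mix into an export session might look roughly like this sketch; the preset, output file type, and the outputURL variable are assumptions, not part of the original code.
// Sketch: reuses mixComposition, MainCompositionInst, and audioMix from the question.
AVAssetExportSession *exporter =
    [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                     presetName:AVAssetExportPresetHighestQuality];
exporter.videoComposition = MainCompositionInst; // bakes in the layer instructions
exporter.audioMix = audioMix;                    // bakes in the volume ramp
exporter.outputFileType = AVFileTypeQuickTimeMovie;
exporter.outputURL = outputURL;                  // assumed destination file URL
[exporter exportAsynchronouslyWithCompletionHandler:^{
    // Load the file at outputURL as an AVURLAsset; it now has the edits applied.
}];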

Cocoa: AVAsset loaded from file has 0 tracks

I'm attempting to concatenate some audio files using the technique shown here. My audio files are .m4a and I can verify that they play fine in QuickTime. Here's the code I'm trying to use to concatenate them:
[currFile.audioContent writeToFile:tempOldFilePath atomically:NO];
AVURLAsset *oldAudioAsset = [AVURLAsset URLAssetWithURL:[NSURL URLWithString:tempOldFilePath] options:nil];
AVURLAsset *newAudioAsset = [AVURLAsset URLAssetWithURL:[NSURL URLWithString:tempInputFilePath] options:nil];
NSLog(@"oldAsset num tracks = %lu", (unsigned long)oldAudioAsset.tracks.count);
NSLog(@"newAsset num tracks = %lu", (unsigned long)newAudioAsset.tracks.count);

AVAssetTrack *oldTrack = [[oldAudioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
AVAssetTrack *newTrack = [[newAudioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];

AVMutableComposition *mutableComposition = [AVMutableComposition composition];
AVMutableCompositionTrack *compTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];

NSError *error = nil;
[compTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, oldTrack.timeRange.duration) ofTrack:oldTrack atTime:kCMTimeZero error:&error];
if (error) {
    NSLog(@"%@", error.localizedDescription);
    error = nil;
}
[compTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, newTrack.timeRange.duration) ofTrack:newTrack
                    atTime:oldTrack.timeRange.duration error:&error];
if (error) {
    NSLog(@"%@", error.localizedDescription);
    error = nil;
}

exporter = [[AVAssetExportSession alloc] initWithAsset:mutableComposition presetName:AVAssetExportPresetAppleM4A];
exporter.outputURL = [NSURL URLWithString:tempCompFilePath];
exporter.outputFileType = AVFileTypeAppleM4A;
[exporter exportAsynchronouslyWithCompletionHandler:^{
    NSLog(@"handler");
    NSError *error = nil;
    NSData *newData = [NSData dataWithContentsOfFile:tempCompFilePath options:0 error:&error];
    NSLog(@"%lu", (unsigned long)newData.length);
    if (error) {
        NSLog(@"%@", error.localizedDescription);
    }
    currFile.audioContent = newData;
    [[AppDelegate sharedDelegate] saveAction:nil];
}];
The first problem I noticed is that the exporter's completion handler is never called. I'm guessing the reason is the other problem I noticed: after creating my AVAssets from URLs, log statements show that they contain 0 tracks. Apple's example doesn't exactly show how the AVAssets are loaded.
Any advice on how to get this working?
As you've already found, you need to use fileURLWithPath:, not URLWithString:, to create your URLs.
URLWithString: expects a string that describes a URL, such as @"file:///path/to/file" or @"http://example.com/". When your string describes a path alone, such as @"/path/to/file", you must use fileURLWithPath:, which will fill in the missing pieces correctly.
More technically, URLWithString: will interpret a path as simply a URL with only a path but no particular scheme, which you could go on to use relative to a base URL in any file-oriented scheme, such as HTTP (GET /path/to/file). fileURLWithPath: will interpret a path as a local file path, and return a file: URL accordingly.
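To illustrate the difference (the path here is just a placeholder):
NSString *path = @"/path/to/file.m4a";          // a plain filesystem path
NSURL *broken = [NSURL URLWithString:path];     // scheme-less URL; the asset reports 0 tracks
NSURL *working = [NSURL fileURLWithPath:path];  // proper file:// URL
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:working options:nil];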
I found the reason this error occurred and eventually solved it.
Originally I had set the source file path to ".mp4", but the recorded video file was actually of type MOV, so I changed it to ".mov":
NSString *source_file_path = @"temp_video.mov";
instead of
NSString *source_file_path = @"temp_video.mp4";
The problem was fixed and it is working well now.
Hope this is helpful.
Apparently I was using the wrong NSURL method. I changed it to this:
AVURLAsset *oldAudioAsset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:tempOldFilePath] options:nil];
Now my AVAssets each have 1 track in them. If anyone can provide a good explanation as to why this is necessary, I will accept that answer.

AVPlayerItemVideoOutput file cannot be used in AVComposition on iOS 7

I have a fully working app in iOS 6 that breaks in iOS 7 when using AVComposition.
Here is the problem:
In a previous view controller, I capture the user's screen using AVPlayerItemVideoOutput and AVAssetWriterInputPixelBufferAdaptor to generate a video output file. In my current view controller, I take that generated video file and add it to an AVComposition to build a composition of this file and some audio. In iOS 6 this process works perfectly and the AVAssetExportSession completes. In iOS 7 the export never completes (the completion handler is never called and the status stays at AVAssetExportSessionStatusExporting).
Here is my pseudocode:
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:self.metaInfo.videoCaptureFile options:nil];
AVMutableComposition* mixComposition = [[AVMutableComposition alloc] init];
AVMutableCompositionTrack *track = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
[track insertTimeRange:CMTimeRangeMake(kCMTimeZero, firstAsset.duration) ofTrack:[[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error:&error];
AVMutableCompositionTrack *audioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
[audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, secondAsset.duration) ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:kCMTimeZero error:&error];
AVMutableVideoCompositionInstruction * MainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
/// Code for processing the composition
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPreset640x480];
// Code for setting up exporter
[exporter exportAsynchronouslyWithCompletionHandler:^
{
dispatch_async(dispatch_get_main_queue(), ^{
[self exportDidFinished:exporter];
});
}];
After debugging my code, the problem came down to the generated video file that was being used. If I change the asset to a hardcoded file, the exporter completes. What's interesting is that the video file generated from AVPlayerItemVideoOutput plays fine in an MPMoviePlayerViewController. Did something change with AVPlayerItemVideoOutput or AVAssetWriterInputPixelBufferAdaptor in iOS 7 that would prevent the output file from being used in a composition? Do I need to add additional settings to the composition? Does it have to do with different frame rates?
Thanks for the help!
This is likely a low-storage issue specific to your test device, where the AVAssetExportSession cannot write the final movie asset (or some intermediate data) to disk. I would check whether the problem exists on other devices. I had the same problem on an iPhone 4 (iOS 7) but not on an iPad 2 (iOS 7) with a common build.
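If you want to rule that out in code, a rough free-space check before starting the export could look like this sketch (the 200 MB threshold is an arbitrary assumption; size it to your assets):
NSError *fsError = nil;
NSDictionary *attrs = [[NSFileManager defaultManager]
    attributesOfFileSystemForPath:NSTemporaryDirectory() error:&fsError];
unsigned long long freeBytes = [[attrs objectForKey:NSFileSystemFreeSize] unsignedLongLongValue];
if (freeBytes < 200ULL * 1024 * 1024) {
    // Assumed threshold: warn when less than ~200 MB is free.
    NSLog(@"Only %llu bytes free; the export may stall or fail.", freeBytes);
}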

AVFoundation to play a video in a loop

I need to play a video indefinitely (restarting it when it ends) in my OpenGL application.
To do so I'm trying to use AVFoundation.
I created an AVAssetReader and an AVAssetReaderTrackOutput, and I use the copyNextSampleBuffer method to get a CMSampleBufferRef and create an OpenGL texture for each frame.
NSString *path = [[NSBundle mainBundle] pathForResource:videoFileName ofType:type];
_url = [NSURL fileURLWithPath:path];
//Create the AVAsset
_asset = [AVURLAsset assetWithURL:_url];
//Get the asset AVAssetTrack
NSArray *arrayAssetTrack = [_asset tracksWithMediaType:AVMediaTypeVideo];
_assetTrackVideo = [arrayAssetTrack objectAtIndex:0];
//create the AVAssetReaderTrackOutput
NSDictionary *dictCompressionProperty = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey:(id) kCVPixelBufferPixelFormatTypeKey];
_trackOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:_assetTrackVideo outputSettings:dictCompressionProperty];
//Create the AVAssetReader
NSError *error;
_assetReader = [[AVAssetReader alloc] initWithAsset:_asset error:&error];
if (error) {
    NSLog(@"error in AssetReader %@", error);
}
[_assetReader addOutput:_trackOutput];
//_assetReader.timeRange = CMTimeRangeMake(kCMTimeZero, _asset.duration);
//Asset reading start reading
[_assetReader startReading];
And in the -update method of my GLKViewController I call the following:
if (_assetReader.status == AVAssetReaderStatusReading) {
    if (_trackOutput) {
        CMSampleBufferRef sampleBuffer = [_trackOutput copyNextSampleBuffer];
        [self createNewTextureVideoFromOutputSampleBuffer:sampleBuffer]; // create the new texture
    }
} else if (_assetReader.status == AVAssetReaderStatusCompleted) {
    NSLog(@"restart");
    [_assetReader startReading];
}
Everything works fine while the AVAssetReader is in the reading status, but when it finishes reading and I try to restart it with another call to [_assetReader startReading], the application crashes without any output.
What am I doing wrong? Is it correct to restart an AVAssetReader once it has completed its reading?
AVAssetReader doesn't support seeking or restarting, it is essentially a sequential decoder. You have to create a new AVAssetReader object to read the same samples again.
Thanks Costique! Your suggestion put me back on track. I finally restarted the reading by creating a new AVAssetReader. However, to do that I found that a new AVAssetReaderTrackOutput must also be created and added to the new AVAssetReader,
e.g.
[newAssetReader addOutput:newTrackOutput];
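Put together, the restart path might look like the sketch below; the ivar names mirror the question's setup, the output settings mirror the original ones, and the method name is just illustrative:
// Recreate both the reader and its track output whenever playback loops.
- (void)restartReading {
    NSError *error = nil;
    NSDictionary *outputSettings = [NSDictionary dictionaryWithObject:
                                        [NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                                                forKey:(id)kCVPixelBufferPixelFormatTypeKey];
    _trackOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:_assetTrackVideo
                                                              outputSettings:outputSettings];
    _assetReader = [[AVAssetReader alloc] initWithAsset:_asset error:&error];
    if (error) {
        NSLog(@"error recreating AVAssetReader %@", error);
        return;
    }
    [_assetReader addOutput:_trackOutput];
    [_assetReader startReading];
}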

Merge two videos without ffmpeg (Cocoa)

I've looked and looked for an answer, but can't seem to find one. Lots have asked, but none have gotten answers. I have an app that has two video paths. Now I just want to merge them into one file that can be saved in ".mov" format. Does anyone have any clue as to how this can be done?
Note: I want to do this without installing (and obviously without using) ffmpeg.
If you have time, some code would be very helpful.
First, obviously you need to make sure that the movie type is readable/playable by the QuickTime libraries.
But, assuming that's the case, the procedure is basically like this:
Get a pointer to some memory to store the data:
QTMovie *myCombinedMovie = [[QTMovie alloc] initToWritableData:[NSMutableData data] error:nil];
Next, grab the first movie that you want to use and insert it into myCombinedMovie. You can keep the parts you want combined in an array and enumerate over them to combine as many parts as you like. Also, if you wanted, you could alter the destination range to add an offset:
QTMovie *firstMovie = [QTMovie movieWithURL:firstURL error:nil];
// NOTE THAT THE 3 LINES BELOW WERE CHANGED FROM MY ORIGINAL POST!
QTTimeRange timeRange = QTMakeTimeRange(QTZeroTime, [firstMovie duration]);
QTTime insertionTime = [myCombinedMovie duration];
[myCombinedMovie insertSegmentOfMovie:firstMovie timeRange:timeRange atTime:insertionTime];
Rinse and repeat for the second movie part.
Then, output the flattened movie (flattening makes it self-contained):
NSDictionary *writeAttributes = [NSDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithBool:YES], QTMovieFlatten, nil]; //note that you can add a QTMovieExport key with an appropriate value here to export as a specific type
[myCombinedMovie writeToFile:destinationPath withAttributes:writeAttributes];
EDITED: I edited the above because the insertion times were being calculated incorrectly; this way seems easier. Below is the code all together, including enumerating through an array of movies and plenty of error logging.
NSError *err = nil;
QTMovie *myCombinedMovie = [[QTMovie alloc] initToWritableData:[NSMutableData data] error:&err];
if (err)
{
    NSLog(@"Error creating myCombinedMovie: %@", [err localizedDescription]);
    return;
}
NSArray *myMovieURLs = [NSArray arrayWithObjects:[NSURL fileURLWithPath:@"/path/to/the/firstmovie.mov"], [NSURL fileURLWithPath:@"/path/to/the/secondmovie.mov"], nil];
for (NSURL *url in myMovieURLs)
{
    QTMovie *theMovie = [QTMovie movieWithURL:url error:&err];
    if (err) {
        NSLog(@"Error loading one of the movies: %@", [err localizedDescription]);
        return;
    }
    QTTimeRange timeRange = QTMakeTimeRange(QTZeroTime, [theMovie duration]);
    QTTime insertionTime = [myCombinedMovie duration];
    [myCombinedMovie insertSegmentOfMovie:theMovie timeRange:timeRange atTime:insertionTime];
}
NSDictionary *writeAttributes = [NSDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithBool:YES], QTMovieFlatten, nil];
BOOL success = [myCombinedMovie writeToFile:@"/path/to/outputmovie.mov" withAttributes:writeAttributes error:&err];
if (!success)
{
    NSLog(@"Error writing movie: %@", [err localizedDescription]);
    return;
}