Core Animation, AVFoundation and the ability to export video - Objective-C

I'm looking for the correct way to export my picture sequence into a QuickTime video.
I know that AV Foundation has the ability to merge or recombine videos and also to add an audio track, building a single video asset.
Now... my goal is a little bit different. I would like to create a video from scratch. I have a set of UIImages and I need to render all of them into a single video.
I read all the Apple documentation about AV Foundation and found the AVVideoCompositionCoreAnimationTool class, which has the ability to take a Core Animation layer and re-encode it as a video. I also checked the AVEditDemo project provided by Apple, but something doesn't seem to work in my project.
Here are my steps:
1) I create the Core Animation layer
CALayer *animationLayer = [CALayer layer];
[animationLayer setFrame:CGRectMake(0, 0, 1024, 768)];
CALayer *backgroundLayer = [CALayer layer];
[backgroundLayer setFrame:animationLayer.frame];
[backgroundLayer setBackgroundColor:[UIColor blackColor].CGColor];
CALayer *anImageLayer = [CALayer layer];
[anImageLayer setFrame:animationLayer.frame];
CAKeyframeAnimation *changeImageAnimation = [CAKeyframeAnimation animationWithKeyPath:@"contents"];
[changeImageAnimation setDelegate:self];
changeImageAnimation.duration = [[albumSettings transitionTime] floatValue] * [uiImagesArray count];
changeImageAnimation.repeatCount = 1;
changeImageAnimation.values = [NSArray arrayWithArray:uiImagesArray];
changeImageAnimation.removedOnCompletion = YES;
[anImageLayer addAnimation:changeImageAnimation forKey:nil];
[animationLayer addSublayer:anImageLayer];
2) Then I instantiate the AVComposition
AVMutableComposition *composition = [AVMutableComposition composition];
composition.naturalSize = CGSizeMake(1024, 768);
CALayer *wrapLayer = [CALayer layer];
wrapLayer.frame = CGRectMake(0, 0, 1024, 768);
CALayer *videoLayer = [CALayer layer];
videoLayer.frame = CGRectMake(0, 0, 1024, 768);
[wrapLayer addSublayer:animationLayer];
[wrapLayer addSublayer:videoLayer];
AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
AVMutableVideoCompositionInstruction *videoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
videoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeMake([imagesFilePath count] * [[albumSettings transitionTime] intValue] * 25, 25));
AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstruction];
videoCompositionInstruction.layerInstructions = [NSArray arrayWithObject:layerInstruction];
videoComposition.animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:wrapLayer];
videoComposition.frameDuration = CMTimeMake(1, 25); // 25 fps
videoComposition.renderSize = CGSizeMake(1024, 768);
videoComposition.instructions = [NSArray arrayWithObject:videoCompositionInstruction];
3) I export the video to the Documents path
AVAssetExportSession *session = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetLowQuality];
session.videoComposition = videoComposition;
NSString *filePath = nil;
filePath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
filePath = [filePath stringByAppendingPathComponent:@"Output.mov"];
session.outputURL = [NSURL fileURLWithPath:filePath];
session.outputFileType = AVFileTypeQuickTimeMovie;
[session exportAsynchronouslyWithCompletionHandler:^{
    dispatch_async(dispatch_get_main_queue(), ^{
        NSLog(@"Export Finished: %@", session.error);
        if (session.error) {
            [[NSFileManager defaultManager] removeItemAtPath:filePath error:NULL];
        }
    });
}];
At the end of the export I get this error:
Export Finished: Error Domain=AVFoundationErrorDomain Code=-11822 "Cannot Open" UserInfo=0x49a97c0 {NSLocalizedFailureReason=This media format is not supported., NSLocalizedDescription=Cannot Open}
I found it inside documentation: AVErrorInvalidSourceMedia = -11822,
AVErrorInvalidSourceMedia
The operation could not be completed because some source media could not be read.
I'm totally sure that the Core Animation I build is correct, because I rendered it into a test layer and could see the animation progress correctly.
Can anyone help me understand where my error is?

Maybe you need a fake movie that contains only black frames to fill the video layer, and then add a CALayer to manipulate the images.
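A minimal sketch of that suggestion, assuming a short black placeholder movie (a hypothetical blank.mov bundled with the app) and reusing the composition from the question:
// Sketch only: give the composition a real video track of black frames so the
// animation tool has source media to post-process. "blank.mov" is an assumed
// placeholder asset, not something from the original post.
NSURL *blankURL = [[NSBundle mainBundle] URLForResource:@"blank" withExtension:@"mov"];
AVURLAsset *blankAsset = [AVURLAsset URLAssetWithURL:blankURL options:nil];
AVAssetTrack *blankTrack = [[blankAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
AVMutableCompositionTrack *compositionTrack =
    [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                             preferredTrackID:kCMPersistentTrackID_Invalid];
NSError *insertError = nil;
[compositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, blankAsset.duration)
                          ofTrack:blankTrack
                           atTime:kCMTimeZero
                            error:&insertError];
// The layer instruction should then reference this composition track.
AVMutableVideoCompositionLayerInstruction *layerInstruction =
    [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionTrack];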

"I found the AVVideoCompositionCoreAnimationTool class, which has the ability to take a Core Animation layer and re-encode it as a video"
My understanding was that this instead was only able to take CoreAnimation and add it to an existing video. I just checked the docs, and the only methods available require a video layer too.
EDIT: yep. Digging in docs and WWDC videos, I think you should be using AVAssetWriter instead, and appending images to the writer. Something like:
NSError *error = nil;
AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:somePath] fileType:AVFileTypeQuickTimeMovie error:&error];
NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:AVVideoCodecH264, AVVideoCodecKey, [NSNumber numberWithInt:320], AVVideoWidthKey, [NSNumber numberWithInt:480], AVVideoHeightKey, nil];
AVAssetWriterInput *writerInput = [[AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings] retain];
[videoWriter addInput:writerInput];
[videoWriter startWriting];
[videoWriter startSessionAtSourceTime:CMTimeMakeWithSeconds(0, 30)];
// ... append one sample buffer per frame here ...
[writerInput appendSampleBuffer:sampleBuffer];
[writerInput markAsFinished];
[videoWriter endSessionAtSourceTime:CMTimeMakeWithSeconds(60, 30)];
[videoWriter finishWriting];
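Since the sketch above leaves sampleBuffer undefined, a common variant is to push CVPixelBuffers through an AVAssetWriterInputPixelBufferAdaptor, one per UIImage. A minimal sketch of that approach, reusing the question's uiImagesArray; pixelBufferFromCGImage: is a hypothetical helper (the newPixelBufferFromCGImage: method in the last question below shows one way such a helper might look):
// Variant of the sketch above: create the adaptor before startWriting,
// then append one pixel buffer per image instead of calling appendSampleBuffer:.
AVAssetWriterInputPixelBufferAdaptor *adaptor =
    [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                                                      sourcePixelBufferAttributes:nil];
[videoWriter startWriting];
[videoWriter startSessionAtSourceTime:kCMTimeZero];
int32_t fps = 25; // assumed frame rate
for (NSUInteger i = 0; i < [uiImagesArray count]; i++) {
    // Crude back-pressure handling, good enough for a sketch.
    while (!writerInput.readyForMoreMediaData) {
        [NSThread sleepForTimeInterval:0.05];
    }
    UIImage *image = [uiImagesArray objectAtIndex:i];
    CVPixelBufferRef buffer = [self pixelBufferFromCGImage:image.CGImage]; // hypothetical helper
    [adaptor appendPixelBuffer:buffer withPresentationTime:CMTimeMake(i, fps)];
    CVPixelBufferRelease(buffer);
}
[writerInput markAsFinished];
[videoWriter finishWriting];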

Related

CAKeyframeAnimation not animating CATextLayer when exporting video

I have an application in which I am attempting to put a timestamp on a video. To do this I am using AVFoundation and Core Animation to place a CATextLayer over the video layer. If I place text into the CATextLayer's string property, the string is properly displayed in the exported video. If I then add an animation to the CATextLayer, the text never changes. I figure I've overlooked something, but I can't find what it is.
Thank you in advance for any help.
Here is a code example.
AVMutableComposition *composition = [AVMutableComposition composition];
AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.frameDuration = CMTimeMake(1, 30);
AVMutableCompositionTrack *videoCompositionTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
AVMutableCompositionTrack *audioCompositionTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
NSDictionary *assetOptions = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES] forKey:AVURLAssetPreferPreciseDurationAndTimingKey];
AVAsset *asset = [[AVURLAsset alloc] initWithURL:myUrl options:assetOptions];
AVAssetTrack *audioAssetTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
[audioCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, [asset duration]) ofTrack:audioAssetTrack atTime:kCMTimeZero error:nil];
AVAssetTrack *videoAssetTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
[videoCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, [asset duration]) ofTrack:videoAssetTrack atTime:kCMTimeZero error:nil];
videoComposition.renderSize = videoCompositionTrack.naturalSize;
AVMutableVideoCompositionInstruction *videoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
videoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, composition.duration);
AVMutableVideoCompositionLayerInstruction *videoCompositionLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoCompositionTrack];
[videoCompositionLayerInstruction setOpacity:1.0f atTime:kCMTimeZero];
videoCompositionInstruction.layerInstructions = @[videoCompositionLayerInstruction];
videoComposition.instructions = @[videoCompositionInstruction];
CALayer *parentLayer = [CALayer layer];
parentLayer.frame = CGRectMake(0.0f, 0.0f, videoComposition.renderSize.width, videoComposition.renderSize.height);
CALayer *videoLayer = [CALayer layer];
videoLayer.frame = CGRectMake(0.0f, 0.0f, videoComposition.renderSize.width, videoComposition.renderSize.height);
[parentLayer addSublayer:videoLayer];
CATextLayer *textLayer = [CATextLayer layer];
textLayer.font = (__bridge CFTypeRef)([UIFont fontWithName:@"Helvetica Neue" size:45.0f]);
textLayer.fontSize = 45.0f;
textLayer.foregroundColor = [UIColor colorWithRed:1.0f green:1.0f blue:1.0f alpha:1.0f].CGColor;
textLayer.shadowColor = [UIColor colorWithRed:0.0f green:0.0f blue:0.0f alpha:1.0f].CGColor;
textLayer.shadowOffset = CGSizeMake(0.0f, 0.0f);
textLayer.shadowOpacity = 1.0f;
textLayer.shadowRadius = 4.0f;
textLayer.alignmentMode = kCAAlignmentCenter;
textLayer.truncationMode = kCATruncationNone;
CAKeyframeAnimation *keyFrameAnimation = [CAKeyframeAnimation animationWithKeyPath:@"string"];
// Step 8: Set the animation values to the object.
keyFrameAnimation.calculationMode = kCAAnimationLinear;
keyFrameAnimation.values = @[@"12:00:00", @"12:00:01", @"12:00:02", @"12:00:03",
                             @"12:00:04", @"12:00:05", @"12:00:06", @"12:00:07",
                             @"12:00:08", @"12:00:09"];
keyFrameAnimation.keyTimes = @[[NSNumber numberWithFloat:0.0f], [NSNumber numberWithFloat:0.1f],
                               [NSNumber numberWithFloat:0.2f], [NSNumber numberWithFloat:0.3f],
                               [NSNumber numberWithFloat:0.4f], [NSNumber numberWithFloat:0.5f],
                               [NSNumber numberWithFloat:0.6f], [NSNumber numberWithFloat:0.7f],
                               [NSNumber numberWithFloat:0.8f], [NSNumber numberWithFloat:0.9f]];
keyFrameAnimation.beginTime = AVCoreAnimationBeginTimeAtZero;
keyFrameAnimation.duration = CMTimeGetSeconds(composition.duration);
keyFrameAnimation.removedOnCompletion = YES;
[textLayer addAnimation:keyFrameAnimation forKey:@"string"];
textLayer.frame = CGRectMake(0.0f, 20.0f, videoComposition.renderSize.width, 55.0f);
[parentLayer addSublayer:textLayer];
videoComposition.animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];
AVAssetExportSession *exportSession = [AVAssetExportSession exportSessionWithAsset:composition presetName:AVAssetExportPreset1920x1080];
exportSession.videoComposition = videoComposition;
exportSession.audioMix = audioMix;
exportSession.outputFileType = [exportSession.supportedFileTypes objectAtIndex:0];
exportSession.outputURL = mySaveUrl;
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    self.delegate = nil;
    switch (exportSession.status) {
        case AVAssetExportSessionStatusCancelled:
            NSLog(@"Export session cancelled.");
            break;
        case AVAssetExportSessionStatusCompleted:
            NSLog(@"Export session completed.");
            break;
        case AVAssetExportSessionStatusFailed:
            NSLog(@"Export session failed.");
            break;
        case AVAssetExportSessionStatusUnknown:
            NSLog(@"Export session unknown status.");
            break;
        case AVAssetExportSessionStatusWaiting:
            NSLog(@"Export session waiting.");
            break;
        default:
            break;
    }
    NSError *error = exportSession.error;
    if (error != nil) {
        NSLog(@"An error occurred while exporting. Error: %@: %@", error.localizedDescription, error.localizedFailureReason);
    }
}];
I stayed in contact with Apple support and they told me it's a bug in their SDK; they are trying to fix the issue.
I also opened an incident in the Apple Bug Reporter. If you are an Apple developer, I recommend you open a new one to put more pressure on this incident.
https://bugreport.apple.com/logon
Best regards.
Finally I found a solution for the CATextLayer animation issue. It's probably not the best solution, but at least it works fine.
To fix the issue, render each NSString into a UIImage (or CALayer), put all of these images (as CGImages) into an NSMutableArray, load that mutable array into a CAKeyframeAnimation, and animate the contents property:
- (UIImage *)imageFromText:(NSString *)text
{
    UIGraphicsBeginImageContextWithOptions(sizeImageFromText, NO, 0.0); // Better resolution (antialiasing)
    //UIGraphicsBeginImageContext(sizeImageFromText); // use this if iOS is < 4.0
    [text drawInRect:aRectangleImageFromText withAttributes:attributesImageFromText];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}
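The helper relies on a few pieces of state that the answer does not show; purely as an illustration (these values are hypothetical, and in real code they would presumably be ivars or properties), they might look like this:
// Hypothetical definitions for the state the helper above relies on.
CGSize sizeImageFromText = CGSizeMake(480.0f, 55.0f);
CGRect aRectangleImageFromText = CGRectMake(0.0f, 0.0f, 480.0f, 55.0f);
NSDictionary *attributesImageFromText = @{
    NSFontAttributeName            : [UIFont fontWithName:@"Helvetica Neue" size:45.0f],
    NSForegroundColorAttributeName : [UIColor whiteColor]
};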
Adding the images to the NSMutableArray:
[NSMutableArrayAux addObject:(__bridge id)[self imageFromText:@"Hello"].CGImage];
Set up a CAKeyframeAnimation to modify the contents of the layer (i.e. change the image):
CAKeyframeAnimation *overlay = [CAKeyframeAnimation animationWithKeyPath:@"contents"];
overlay.values = [[NSArray alloc] initWithArray:NSMutableArrayAux];
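For completeness, a sketch of how the rest of the wiring might look, reusing the overlay animation above and the parentLayer/composition setup from the question's code; the image layer here is a hypothetical replacement for the CATextLayer, shown only for illustration:
CALayer *timestampLayer = [CALayer layer]; // hypothetical replacement for textLayer
timestampLayer.frame = CGRectMake(0.0f, 20.0f, videoComposition.renderSize.width, 55.0f);
overlay.beginTime = AVCoreAnimationBeginTimeAtZero;
overlay.duration = CMTimeGetSeconds(composition.duration);
overlay.calculationMode = kCAAnimationDiscrete; // jump between images instead of blending
overlay.removedOnCompletion = NO;
[timestampLayer addAnimation:overlay forKey:@"contents"];
[parentLayer addSublayer:timestampLayer];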
It works really well, but it takes 1 or 2 seconds for 5,000 or 6,000 images in the array, and with 15,000/16,000 images the export crashes with an overflow error. This is another bug in the framework.
As you know, this is an issue in the framework, and this is a workaround at least until Apple fixes the CATextLayer animation issue; I have passed this solution on to Apple as well.

Creating an animated GIF in Cocoa - defining frame type

I've been able to adapt some code found on SO to produce an animated GIF from "screenshots" of my view, but the results are unpredictable. GIF frames are sometimes full images, full frames ("replace" mode, as GIMP labels it), and other times just a "diff" from the previous frame ("combine" mode).
From what I've seen, when there are fewer and/or smaller frames involved, Core Graphics writes the GIF in "combine" mode but fails to get the colors right. Actually, the moving parts are colored correctly; the background is wrong.
When Core Graphics saves the GIF as full frames, the colors are OK. The file size is larger, but hey, obviously you cannot have the best of both worlds. :)
Is there a way to either:
a) force CG to create "full frames" when saving the GIF
b) fix the colors (color table?)
What I do is (ARC mode):
capture the visible part of the view with
[[scrollView contentView] dataWithPDFInsideRect:[[scrollView contentView] visibleRect]];
convert and resize it to an NSBitmapImageRep of PNG type
-(NSMutableDictionary*) pngImageProps:(int)quality {
NSMutableDictionary *pngImageProps;
pngImageProps = [[NSMutableDictionary alloc] init];
[pngImageProps setValue:[NSNumber numberWithBool:NO] forKey:NSImageInterlaced];
double compressionF = 1;
[pngImageProps setValue:[NSNumber numberWithFloat:compressionF] forKey:NSImageCompressionFactor];
return pngImageProps;
}
-(NSData*) resizeImageToData:(NSData*)data toDimX:(int)xdim andDimY:(int)ydim withQuality:(int)quality{
NSImage *image = [[NSImage alloc] initWithData:data];
NSRect inRect = NSZeroRect;
inRect.size = [image size];
NSRect outRect = NSMakeRect(0, 0, xdim, ydim);
NSImage *outImage = [[NSImage alloc] initWithSize:outRect.size];
[outImage lockFocus];
[image drawInRect:outRect fromRect:inRect operation:NSCompositeCopy fraction:1];
NSBitmapImageRep* bitmapRep = [[NSBitmapImageRep alloc] initWithFocusedViewRect:outRect];
[outImage unlockFocus];
NSMutableDictionary *imageProps = [self pngImageProps:quality];
NSData* imageData = [bitmapRep representationUsingType:NSPNGFileType properties:imageProps];
return [imageData copy];
}
get the array of BitmapReps and create the GIF
-(CGImageRef) pngRepDataToCgImageRef:(NSData*)data {
CFDataRef imgData = (__bridge CFDataRef)data;
CGDataProviderRef imgDataProvider = CGDataProviderCreateWithCFData (imgData);
CGImageRef image = CGImageCreateWithPNGDataProvider(imgDataProvider, NULL, true, kCGRenderingIntentDefault);
return image;
}
////////// create GIF from
NSArray *images; // holds all BitmapReps
CGImageDestinationRef destination = CGImageDestinationCreateWithURL((__bridge CFURLRef)[NSURL fileURLWithPath:pot],
kUTTypeGIF,
allImages,
NULL);
// set frame delay
NSDictionary *frameProperties = [NSDictionary
dictionaryWithObject:[NSDictionary
dictionaryWithObject:[NSNumber numberWithFloat:0.2f]
forKey:(NSString *) kCGImagePropertyGIFDelayTime]
forKey:(NSString *) kCGImagePropertyGIFDictionary];
// set gif color properties
NSMutableDictionary *gifPropsDict = [[NSMutableDictionary alloc] init];
[gifPropsDict setObject:(NSString *)kCGImagePropertyColorModelRGB forKey:(NSString *)kCGImagePropertyColorModel];
[gifPropsDict setObject:[NSNumber numberWithBool:YES] forKey:(NSString *)kCGImagePropertyGIFHasGlobalColorMap];
// set gif loop
NSDictionary *gifProperties = [NSDictionary
dictionaryWithObject:gifPropsDict
forKey:(NSString *) kCGImagePropertyGIFDictionary];
// loop through frames and add them to GIF
for (int i=0; i < [images count]; i++) {
NSData *imageData = [images objectAtIndex:i];
CGImageRef imageRef = [self pngRepDataToCgImageRef:imageData];
CGImageDestinationAddImage(destination, imageRef, (__bridge CFDictionaryRef) (frameProperties));
}
// save the GIF
CGImageDestinationSetProperties(destination, (__bridge CFDictionaryRef)(gifProperties));
CGImageDestinationFinalize(destination);
CFRelease(destination);
I've checked the NSBitmapImageReps; when saved as PNGs individually, they are just fine.
As I understand it, the color tables should be handled by Core Graphics, or am I responsible for producing the dithered colors myself? How would I do that?
Even when doing the same animation repeatedly, the GIFs produced may vary.
This is a single BitmapRep (source: andraz.eu).
And this is the GIF with the invalid colors ("combine" mode) (source: andraz.eu).
I read your code. Please double-check the "allImages" count you pass while creating the CGImageDestinationRef, and compare it with "[images count]".
The following test code works fine:
NSDictionary *prep = [NSDictionary dictionaryWithObject:[NSDictionary dictionaryWithObject:[NSNumber numberWithFloat:0.2f] forKey:(NSString *) kCGImagePropertyGIFDelayTime] forKey:(NSString *) kCGImagePropertyGIFDictionary];
CGImageDestinationRef dst = CGImageDestinationCreateWithURL((__bridge CFURLRef)(fileURL), kUTTypeGIF, [filesArray count], nil);
for (int i=0;i<[filesArray count];i++)
{
//load anImage from array
...
CGImageRef imageRef=[anImage CGImageForProposedRect:nil context:nil hints:nil];
CGImageDestinationAddImage(dst, imageRef,(__bridge CFDictionaryRef)(prep));
}
bool fileSave = CGImageDestinationFinalize(dst);
CFRelease(dst);

Issue capturing a screenshot of a video

I want to take a screenshot of a video in my iPad app.
I searched on SO and found a lot of sample code. I tried everything, but nothing seems to work.
I tried all of these methods:
1) Try with : MPMoviePlayerController
- (void) previewWithPlayer:(NSString*)path image:(UIImageView*)imView
{
MPMoviePlayerController *player = [[MPMoviePlayerController alloc] initWithContentURL:[NSURL URLWithString:path]];
UIImage *thumbnail = [player thumbnailImageAtTime:1.0 timeOption:MPMovieTimeOptionNearestKeyFrame];
[player stop];
[player release];
imView.image = thumbnail;
}
2) Try with : AVAssetImageGenerator - v1
- (void) generateImage:(NSString*)path image:(UIImageView*)imView
{
AVAsset *asset = [AVAsset assetWithURL:[NSURL URLWithString:path]];
AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc]initWithAsset:asset];
CMTime time = CMTimeMake(1, 1);
UIImage *thumbnail = [UIImage imageWithCGImage:[imageGenerator copyCGImageAtTime:time actualTime:NULL error:NULL]];
imView.image = thumbnail;
}
3) Try with : AVAssetImageGenerator - v2
- (void) generateImage:(NSString*)path image:(UIImageView*)imView
{
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:[NSURL URLWithString:path] options:nil];
AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
generator.appliesPreferredTrackTransform=TRUE;
[asset release];
CMTime thumbTime = CMTimeMakeWithSeconds(2,30);
AVAssetImageGeneratorCompletionHandler handler = ^(CMTime requestedTime, CGImageRef im, CMTime actualTime, AVAssetImageGeneratorResult result, NSError *error){
if (result != AVAssetImageGeneratorSucceeded) {
NSLog(@"couldn't generate thumbnail, error:%@", error);
}
UIImage *thumbImg = [[UIImage imageWithCGImage:im] retain];
imView.image = thumbImg;
[generator release];
};
CGSize maxSize = CGSizeMake(320, 180);
generator.maximumSize = maxSize;
[generator generateCGImagesAsynchronouslyForTimes:[NSArray arrayWithObject:[NSValue valueWithCMTime:thumbTime]] completionHandler:handler];
}
But nothing works.
I tried with MOV and MP4 - nothing.
The path is correct, and the video plays.
NSString *fPath = [[NSBundle mainBundle] pathForResource:@"VideoA" ofType:@"mp4"];
NSLog(@"%@", fPath);
[self generateImage:fPath image:_ImgA];
What could be the problem? My image view shows nothing and no errors are returned.
iOS is 6.0/5.1, on the iPad simulator and device.
The video is 854×480 pixels, H.264, AAC, about 30 MB in size.
Please help me, because I'm going crazy with this issue.
Thanks.
Edit:
On the device it returns this error:
couldn't generate thumbnail, error:Error Domain=NSURLErrorDomain
Code=-1 "unknown error" UserInfo=0x1e0a2f30
{NSUnderlyingError=0x1e0a3900 "The operation couldn’t be completed.
(OSStatus error -12935.)", NSLocalizedDescription=unknown error}
Solved.
The trick: use fileURLWithPath:, not URLWithString:. Apparently the difference is really, really significant.
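For example, with the bundle path from the question, the asset would be created roughly like this:
NSString *fPath = [[NSBundle mainBundle] pathForResource:@"VideoA" ofType:@"mp4"];
NSURL *videoURL = [NSURL fileURLWithPath:fPath]; // file URL, not URLWithString:
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:videoURL options:nil];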
thanks to Noah. https://stackoverflow.com/a/4201419/88461

How to create a video from its frames on iPhone

I have done some R&D and succeeded in getting frames, as images, from a video file played in an MPMoviePlayerController.
I got all the frames with this code and saved all the images into one array.
for(int i= 1; i <= moviePlayerController.duration; i++)
{
UIImage *img = [moviePlayerController thumbnailImageAtTime:i timeOption:MPMovieTimeOptionNearestKeyFrame];
[arrImages addObject:img];
}
Now the question is: after changing some of the image files, like adding emoticons to the images or adding filters (such as movie reel or black and white), how can we create the video again and store it in the Documents directory with the same frame rate and without losing video quality?
After changing some images, I used the following code to save the video again.
- (void) writeImagesAsMovie:(NSString*)path
{
NSError *error = nil;
UIImage *first = [arrImages objectAtIndex:0];
CGSize frameSize = first.size;
AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:
[NSURL fileURLWithPath:path] fileType:AVFileTypeQuickTimeMovie
error:&error];
NSParameterAssert(videoWriter);
NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
AVVideoCodecH264, AVVideoCodecKey,
[NSNumber numberWithInt:640], AVVideoWidthKey,
[NSNumber numberWithInt:480], AVVideoHeightKey,
nil];
AVAssetWriterInput* writerInput = [[AVAssetWriterInput
assetWriterInputWithMediaType:AVMediaTypeVideo
outputSettings:videoSettings] retain];
AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
sourcePixelBufferAttributes:nil];
NSParameterAssert(writerInput);
NSParameterAssert([videoWriter canAddInput:writerInput]);
[videoWriter addInput:writerInput];
[videoWriter startWriting];
[videoWriter startSessionAtSourceTime:kCMTimeZero];
int frameCount = 0;
CVPixelBufferRef buffer = NULL;
for(UIImage *img in arrImages)
{
buffer = [self newPixelBufferFromCGImage:[img CGImage] andFrameSize:frameSize];
if (adaptor.assetWriterInput.readyForMoreMediaData)
{
CMTime frameTime = CMTimeMake(frameCount,(int32_t) kRecordingFPS);
[adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];
if(buffer)
CVBufferRelease(buffer);
}
frameCount++;
}
[writerInput markAsFinished];
[videoWriter finishWriting];
}
- (CVPixelBufferRef) newPixelBufferFromCGImage: (CGImageRef) image andFrameSize:(CGSize)frameSize
{
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
[NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
nil];
CVPixelBufferRef pxbuffer = NULL;
CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, frameSize.width,
frameSize.height, kCVPixelFormatType_32ARGB, (CFDictionaryRef) options,
&pxbuffer);
NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);
CVPixelBufferLockBaseAddress(pxbuffer, 0);
void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
NSParameterAssert(pxdata != NULL);
CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(pxdata, frameSize.width,
frameSize.height, 8, 4*frameSize.width, rgbColorSpace,
kCGImageAlphaNoneSkipFirst);
NSParameterAssert(context);
CGContextConcatCTM(context, CGAffineTransformMakeRotation(0));
CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image),
CGImageGetHeight(image)), image);
CGColorSpaceRelease(rgbColorSpace);
CGContextRelease(context);
CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
return pxbuffer;
}
I am new to this topic, so please help me solve this question.
You can refer to the following links; I hope they help:
http://www.iphonedevsdk.com/forum/iphone-sdk-development-advanced-discussion/77999-make-video-nsarray-uiimages.html
Using FFMPEG library with iPhone SDK for video encoding
Iphone SDK,Create a Video from UIImage
iOS Video Editing - Is it possible to merge (side by side not one after other) two video files into one using iOS 4 AVFoundation classes?

Compositing 2 videos on top of each other with alpha

AVFoundation allows you to "compose" 2 assets (2 videos) as 2 "tracks", just like in Final Cut Pro, for example.
The theory says I can have 2 videos on top of each other, with alpha, and see both.
Either I'm doing something wrong, or there's a bug somewhere, because the following test code, although a bit messy, clearly states I should see 2 videos, and I only see one, as seen here: http://lockerz.com/s/172403384 -- the "blue" square is IMG_1388.m4v
For whatever reason, IMG_1383.MOV is never shown.
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithBool:YES], AVURLAssetPreferPreciseDurationAndTimingKey, nil];
AVMutableComposition *composition = [AVMutableComposition composition];
CMTimeRange timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(4, 1));
AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
// Track B
NSURL *urlVideo2 = [NSURL URLWithString:@"file://localhost/Users/me/Movies/Temp/IMG_1388.m4v"];
AVAsset *video2 = [AVURLAsset URLAssetWithURL:urlVideo2 options:options];
AVMutableCompositionTrack *videoTrack2 = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:0];
NSArray *videoAssetTracks2 = [video2 tracksWithMediaType:AVMediaTypeVideo];
AVAssetTrack *videoAssetTrack2 = ([videoAssetTracks2 count] > 0 ? [videoAssetTracks2 objectAtIndex:0] : nil);
[videoTrack2 insertTimeRange:timeRange ofTrack:videoAssetTrack2 atTime:kCMTimeZero error:&error];
AVMutableVideoCompositionLayerInstruction *to = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoAssetTrack2];
[to setOpacity:.5 atTime:kCMTimeZero];
[to setTransform:CGAffineTransformScale(videoAssetTrack2.preferredTransform, .5, .5) atTime:kCMTimeZero];
// Track A
NSURL *urlVideo = [NSURL URLWithString:@"file://localhost/Users/me/Movies/Temp/IMG_1383.MOV"];
AVURLAsset *video = [AVURLAsset URLAssetWithURL:urlVideo options:options];
AVMutableCompositionTrack *videoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:1];
NSArray *videoAssetTracks = [video tracksWithMediaType:AVMediaTypeVideo];
AVAssetTrack *videoAssetTrack = ([videoAssetTracks count] > 0 ? [videoAssetTracks objectAtIndex:0] : nil);
[videoTrack insertTimeRange:timeRange ofTrack:videoAssetTrack atTime:kCMTimeZero error:nil];
AVMutableVideoCompositionLayerInstruction *from = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoAssetTrack];
[from setOpacity:.5 atTime:kCMTimeZero];
// Video Compostion
AVMutableVideoCompositionInstruction *transition = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
transition.backgroundColor = [[UIColor clearColor] CGColor];
transition.timeRange = timeRange;
transition.layerInstructions = [NSArray arrayWithObjects:to, from, nil];
videoComposition.instructions = [NSArray arrayWithObjects:transition, nil];
videoComposition.frameDuration = CMTimeMake(1, 30);
videoComposition.renderSize = CGSizeMake(480, 360);
// Export
NSURL *outputURL = [NSURL URLWithString:@"file://localhost/Users/me/Movies/Temp/export.MOV"];
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:[[composition copy] autorelease] presetName:AVAssetExportPresetHighestQuality];
[exportSession setOutputFileType:@"com.apple.quicktime-movie"];
exportSession.outputURL = outputURL;
exportSession.videoComposition = videoComposition;
[exportSession exportAsynchronouslyWithCompletionHandler:nil];
// Player
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:composition];
playerItem.videoComposition = videoComposition;
AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
Do you see anything wrong?
The "goal" of this code is to "record" the camera input (video 1) and the OpenGL output (video 2). I also tried to "compose" them "directly" with buffers and all that, but I was unsuccessful there as well :( It turns out AVFoundation is way less trivial than I thought.
It looks good, except this part:
AVMutableVideoCompositionLayerInstruction *from = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoAssetTrack];
AVMutableVideoCompositionLayerInstruction *to = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoAssetTrack2];
You need to use videoTrack and videoTrack2 to build the layer instructions, i.e., the tracks added to composition, instead of the original assets videoAssetTrack and videoAssetTrack2.
Also, adding a transformation to rotate the video is a bit trickier (like anything in AVFoundation beyond the basics).
I've just commented out that line to make it play the 2 videos.
This is your code with the modifications:
NSError *error = nil;
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithBool:YES], AVURLAssetPreferPreciseDurationAndTimingKey, nil];
AVMutableComposition *composition = [AVMutableComposition composition];
CMTimeRange timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(4, 1));
AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
// Track B
NSURL *urlVideo2 = [[NSBundle mainBundle] URLForResource:@"b" withExtension:@"mov"];
AVAsset *video2 = [AVURLAsset URLAssetWithURL:urlVideo2 options:options];
AVMutableCompositionTrack *videoTrack2 = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:0];
NSArray *videoAssetTracks2 = [video2 tracksWithMediaType:AVMediaTypeVideo];
AVAssetTrack *videoAssetTrack2 = ([videoAssetTracks2 count] > 0 ? [videoAssetTracks2 objectAtIndex:0] : nil);
[videoTrack2 insertTimeRange:timeRange ofTrack:videoAssetTrack2 atTime:kCMTimeZero error:&error];
AVMutableVideoCompositionLayerInstruction *to = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack2];
[to setOpacity:.5 atTime:kCMTimeZero];
//[to setTransform:CGAffineTransformScale(videoAssetTrack2.preferredTransform, .5, .5) atTime:kCMTimeZero];
// Track A
NSURL *urlVideo = [[NSBundle mainBundle] URLForResource:@"a" withExtension:@"mov"];
AVURLAsset *video = [AVURLAsset URLAssetWithURL:urlVideo options:options];
AVMutableCompositionTrack *videoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:1];
NSArray *videoAssetTracks = [video tracksWithMediaType:AVMediaTypeVideo];
AVAssetTrack *videoAssetTrack = ([videoAssetTracks count] > 0 ? [videoAssetTracks objectAtIndex:0] : nil);
[videoTrack insertTimeRange:timeRange ofTrack:videoAssetTrack atTime:kCMTimeZero error:nil];
AVMutableVideoCompositionLayerInstruction *from = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
[from setOpacity:.5 atTime:kCMTimeZero];
// Video Compostion
AVMutableVideoCompositionInstruction *transition = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
transition.backgroundColor = [[UIColor clearColor] CGColor];
transition.timeRange = timeRange;
transition.layerInstructions = [NSArray arrayWithObjects:to, from, nil];
videoComposition.instructions = [NSArray arrayWithObjects:transition, nil];
videoComposition.frameDuration = CMTimeMake(1, 30);
videoComposition.renderSize = composition.naturalSize; // CGSizeMake(480, 360);
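If you do end up needing the rotation, a rough sketch (not from the original answer) would be to feed the source track's preferredTransform into the layer instruction; render sizes may need adjusting as well:
CGAffineTransform rotation = videoAssetTrack.preferredTransform; // transform of the source asset track
[from setTransform:rotation atTime:kCMTimeZero];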
I think you've got it wrong.
A video file may have multiple streams of data. For example, if it's a video with sound, the file will have 2 streams, the audio stream and the video stream. Another example is a surround-sound video file, which may include 5 or more audio streams and 1 video stream.
As with audio, most video container formats (MOV, MP4, etc.) support multiple video streams in one file, but in fact this doesn't mean that the streams have any relation to each other; they are just stored in the same file container. If you open such a file with QuickTime, for example, you will get as many windows as there are video streams in the file.
Anyhow, the video streams will not get 'mixed' this way.
What you're trying to achieve is related to signal processing of the video streams, and I really recommend reading more about it.
If you don't really need to 'mix' the video data together into a file, you might want to display both video files on top of each other using MPMediaPlayers. Keep in mind that dealing with video data is usually a CPU-intensive problem which you might (sometimes) not be able to solve using today's iOS devices.