AVFoundation - why can't I get the video orientation right - objective-c

I am using AVCaptureSession to capture video from a device's camera and then using AVAssetWriterInput and AVAssetTrack to compress/resize the video before uploading it to a server. The final videos will be viewed on the web via an HTML5 video element.
I'm running into multiple issues trying to get the orientation of the video correct. My app only supports landscape orientation, and all captured videos should be in landscape orientation. However, I would like to allow the user to hold their device in either landscape direction (i.e. home button on either the left or the right hand side).
I am able to make the video preview show in the correct orientation with the following line of code:
_previewLayer.connection.videoOrientation = UIDevice.currentDevice.orientation;
The problems start when processing the video via AVAssetWriterInput and friends. The result does not seem to account for the left vs. right landscape mode the video was captured in. In other words, sometimes the video comes out upside down. After some googling I found many people suggesting that the following line of code would solve this issue:
writerInput.transform = videoTrack.preferredTransform;
...but this doesn't seem to work. After a bit of debugging I found that videoTrack.preferredTransform is always the same value, regardless of the orientation the video was captured in.
I tried manually tracking which orientation the video was captured in and setting writerInput.transform to CGAffineTransformMakeRotation(M_PI) as needed, which solved the problem!
...sorta
When I viewed the results on the device this solution worked as expected. Videos were right-side-up regardless of left vs. right orientation while recording. Unfortunately, when I viewed the exact same videos in another browser (Chrome on a MacBook) they were all upside-down!
What am I doing wrong?
EDIT
Here's some code, in case it's helpful...
-(void)compressFile:(NSURL*)inUrl;
{
    NSString* fileName = [@"compressed." stringByAppendingString:inUrl.lastPathComponent];
    NSError* error;
    NSURL* outUrl = [PlatformHelper getFilePath:fileName error:&error];

    NSDictionary* compressionSettings = @{ AVVideoProfileLevelKey: AVVideoProfileLevelH264Main31,
                                           AVVideoAverageBitRateKey: [NSNumber numberWithInt:2500000],
                                           AVVideoMaxKeyFrameIntervalKey: [NSNumber numberWithInt:30] };

    NSDictionary* videoSettings = @{ AVVideoCodecKey: AVVideoCodecH264,
                                     AVVideoWidthKey: [NSNumber numberWithInt:1280],
                                     AVVideoHeightKey: [NSNumber numberWithInt:720],
                                     AVVideoScalingModeKey: AVVideoScalingModeResizeAspectFill,
                                     AVVideoCompressionPropertiesKey: compressionSettings };

    NSDictionary* videoOptions = @{ (id)kCVPixelBufferPixelFormatTypeKey: [NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange] };

    AVAssetWriterInput* writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];
    writerInput.expectsMediaDataInRealTime = YES;

    AVAssetWriter* assetWriter = [AVAssetWriter assetWriterWithURL:outUrl fileType:AVFileTypeMPEG4 error:&error];
    assetWriter.shouldOptimizeForNetworkUse = YES;
    [assetWriter addInput:writerInput];

    AVURLAsset* asset = [AVURLAsset URLAssetWithURL:inUrl options:nil];
    AVAssetTrack* videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

    // !!! this line does not work as expected and causes all sorts of issues (videos display sideways in some cases) !!!
    //writerInput.transform = videoTrack.preferredTransform;

    AVAssetReaderTrackOutput* readerOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack outputSettings:videoOptions];
    AVAssetReader* assetReader = [AVAssetReader assetReaderWithAsset:asset error:&error];
    [assetReader addOutput:readerOutput];

    [assetWriter startWriting];
    [assetWriter startSessionAtSourceTime:kCMTimeZero];
    [assetReader startReading];

    [writerInput requestMediaDataWhenReadyOnQueue:_processingQueue usingBlock:
    ^{
        /* snip */
    }];
}

The problem is that modifying the writerInput.transform property only adds a tag to the video file metadata which instructs the video player to rotate the file during playback. That's why the videos play in the correct orientation on your device (I'm guessing they also play correctly in QuickTime Player).
The pixel buffers captured by the camera are still laid out in the orientation in which they were captured. Many video players will not check for the preferred orientation metadata tag and will just play the file in the native pixel orientation.
If you want the user to be able to record video holding the phone in either landscape mode, you need to rectify this at the AVCaptureSession level before compression by performing a transform on the CVPixelBuffer of each video frame. This Apple Q&A covers it (look at the AVCaptureVideoDataOutput documentation as well):
https://developer.apple.com/library/ios/qa/qa1744/_index.html
Investigating the link above is the correct way to solve your problem. An alternate fast n' dirty way to solve the same problem would be to lock the recording UI of your app into only one landscape orientation and then rotate all of your videos server-side using ffmpeg.
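For the capture-level fix, here is a minimal sketch (my addition, assuming videoOutput is an already-configured AVCaptureVideoDataOutput on your session): setting videoOrientation on the output's connection makes the session deliver buffers that are already rotated, so no playback-time metadata is needed.
// Rotate the buffers at capture time instead of tagging them for playback.
AVCaptureConnection* connection = [videoOutput connectionWithMediaType:AVMediaTypeVideo];
if (connection.isVideoOrientationSupported) {
    // Pick the value that matches how the user is holding the device (landscape left vs. right).
    connection.videoOrientation = AVCaptureVideoOrientationLandscapeRight;
}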

In case it's helpful for anyone, here's the code I ended up with. I had to do the work on the video as it was being captured, instead of as a post-processing step. This is a helper class that manages the capture.
Interface
#import <Foundation/Foundation.h>
#import <AVFoundation/AVFoundation.h>

@interface VideoCaptureManager : NSObject<AVCaptureVideoDataOutputSampleBufferDelegate>
{
    AVCaptureSession* _captureSession;
    AVCaptureVideoPreviewLayer* _previewLayer;
    AVCaptureVideoDataOutput* _videoOut;
    AVCaptureDevice* _videoDevice;
    AVCaptureDeviceInput* _videoIn;
    dispatch_queue_t _videoProcessingQueue;
    AVAssetWriter* _assetWriter;
    AVAssetWriterInput* _writerInput;
    BOOL _isCapturing;
    NSString* _gameId;
    NSString* _authToken;
}

-(void)setSettings:(NSString*)gameId authToken:(NSString*)authToken;
-(void)setOrientation:(AVCaptureVideoOrientation)orientation;
-(AVCaptureVideoPreviewLayer*)getPreviewLayer;
-(void)startPreview;
-(void)stopPreview;
-(void)startCapture;
-(void)stopCapture;

@end
Implementation (w/ a bit of editing and a few little TODO's)
@implementation VideoCaptureManager

-(id)init;
{
    self = [super init];
    if (self) {
        NSError* error;
        _videoProcessingQueue = dispatch_queue_create("VideoQueue", DISPATCH_QUEUE_SERIAL);
        _captureSession = [AVCaptureSession new];
        _videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        _previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:_captureSession];
        [_previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
        _videoOut = [AVCaptureVideoDataOutput new];
        _videoOut.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey: [NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange] };
        _videoOut.alwaysDiscardsLateVideoFrames = YES;
        _videoIn = [AVCaptureDeviceInput deviceInputWithDevice:_videoDevice error:&error];
        // handle errors here
        [_captureSession addInput:_videoIn];
        [_captureSession addOutput:_videoOut];
    }
    return self;
}
-(void)setOrientation:(AVCaptureVideoOrientation)orientation;
{
    _previewLayer.connection.videoOrientation = orientation;
    for (AVCaptureConnection* item in _videoOut.connections) {
        item.videoOrientation = orientation;
    }
}

-(AVCaptureVideoPreviewLayer*)getPreviewLayer;
{
    return _previewLayer;
}

-(void)startPreview;
{
    [_captureSession startRunning];
}

-(void)stopPreview;
{
    [_captureSession stopRunning];
}
-(void)startCapture;
{
    if (_isCapturing) return;

    NSURL* url = nil; // TODO: put code here to create your output url

    NSDictionary* compressionSettings = @{ AVVideoProfileLevelKey: AVVideoProfileLevelH264Main31,
                                           AVVideoAverageBitRateKey: [NSNumber numberWithInt:2500000],
                                           AVVideoMaxKeyFrameIntervalKey: [NSNumber numberWithInt:1],
                                           };

    NSDictionary* videoSettings = @{ AVVideoCodecKey: AVVideoCodecH264,
                                     AVVideoWidthKey: [NSNumber numberWithInt:1280],
                                     AVVideoHeightKey: [NSNumber numberWithInt:720],
                                     AVVideoScalingModeKey: AVVideoScalingModeResizeAspectFill,
                                     AVVideoCompressionPropertiesKey: compressionSettings
                                     };

    _writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];
    _writerInput.expectsMediaDataInRealTime = YES;

    NSError* error;
    _assetWriter = [AVAssetWriter assetWriterWithURL:url fileType:AVFileTypeMPEG4 error:&error];
    // handle errors
    _assetWriter.shouldOptimizeForNetworkUse = YES;
    [_assetWriter addInput:_writerInput];

    [_videoOut setSampleBufferDelegate:self queue:_videoProcessingQueue];

    _isCapturing = YES;
}
-(void)stopCapture;
{
    if (!_isCapturing) return;

    [_videoOut setSampleBufferDelegate:nil queue:nil]; // TODO: seems like there could be a race condition between this line and the next (could end up trying to write a buffer after calling writingFinished)

    dispatch_async(_videoProcessingQueue, ^{
        [_assetWriter finishWritingWithCompletionHandler:^{
            [self writingFinished];
        }];
    });
}

-(void)writingFinished;
{
    // TODO: need to check _assetWriter.status to make sure everything completed successfully
    // do whatever post processing you need here
}

-(void)captureOutput:(AVCaptureOutput*)captureOutput didDropSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection*)connection;
{
    NSLog(@"Video frame was dropped.");
}

-(void)captureOutput:(AVCaptureOutput*)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    if (_assetWriter.status != AVAssetWriterStatusWriting) {
        CMTime lastSampleTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
        [_assetWriter startWriting]; // TODO: need to check the return value (a BOOL)
        [_assetWriter startSessionAtSourceTime:lastSampleTime];
    }

    if (!_writerInput.readyForMoreMediaData || ![_writerInput appendSampleBuffer:sampleBuffer]) {
        NSLog(@"Failed to write video buffer to output.");
    }
}

@end
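For completeness, a rough usage sketch (my own illustration, not part of the original answer; the view-controller wiring is assumed): the key point is forwarding the current landscape direction to setOrientation: so both the preview connection and the data-output connections are rotated before recording starts.
// Hypothetical call site in a landscape-only view controller.
VideoCaptureManager* captureManager = [VideoCaptureManager new];
[self.view.layer addSublayer:[captureManager getPreviewLayer]];
[[captureManager getPreviewLayer] setFrame:self.view.bounds];

// Map the device orientation to a capture orientation (note the left/right swap).
AVCaptureVideoOrientation orientation =
    (UIDevice.currentDevice.orientation == UIDeviceOrientationLandscapeLeft)
        ? AVCaptureVideoOrientationLandscapeRight
        : AVCaptureVideoOrientationLandscapeLeft;
[captureManager setOrientation:orientation];

[captureManager startPreview];
[captureManager startCapture];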

For compressing/resizing the video, we can use AVAssetExportSession.
We can upload a video of up to about 3:30 minutes in duration; if the video is longer than that, it will trigger a memory warning.
As we are not applying any transform to the video here, the video stays exactly as it was recorded.
Below is sample code for compressing the video.
We can check the video size before and after compression.
-(void)trimVideoWithURL:(NSURL *)inputURL {
    NSString *path1 = [inputURL path];
    NSData *data = [[NSFileManager defaultManager] contentsAtPath:path1];
    NSLog(@"size before compress video is %lu", (unsigned long)data.length);

    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:inputURL options:nil];
    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPreset640x480];

    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *outputURL = paths[0];
    NSFileManager *manager = [NSFileManager defaultManager];
    [manager createDirectoryAtPath:outputURL withIntermediateDirectories:YES attributes:nil error:nil];
    outputURL = [outputURL stringByAppendingPathComponent:@"output.mp4"];
    NSURL *fullPath = [NSURL fileURLWithPath:outputURL]; // use a file URL for a filesystem path, not URLWithString:

    // Remove existing file
    [manager removeItemAtPath:outputURL error:nil];

    exportSession.outputURL = [NSURL fileURLWithPath:outputURL];
    exportSession.shouldOptimizeForNetworkUse = YES;
    exportSession.outputFileType = AVFileTypeQuickTimeMovie;

    CMTime start = CMTimeMakeWithSeconds(1.0, 600);
    CMTime duration = CMTimeMakeWithSeconds(1.0, 600);
    CMTimeRange range = CMTimeRangeMake(start, duration);
    exportSession.timeRange = range;

    [exportSession exportAsynchronouslyWithCompletionHandler:^(void)
    {
        switch (exportSession.status) {
            case AVAssetExportSessionStatusCompleted: {
                NSString *path = [fullPath path];
                NSData *data = [[NSFileManager defaultManager] contentsAtPath:path];
                NSLog(@"size after compress video is %lu", (unsigned long)data.length);
                NSLog(@"Export Complete %ld %@", (long)exportSession.status, exportSession.error);
                /*
                 Do your necessary stuff here after compression
                 */
            }
                break;
            case AVAssetExportSessionStatusFailed:
                NSLog(@"Failed:%@", exportSession.error);
                break;
            case AVAssetExportSessionStatusCancelled:
                NSLog(@"Canceled:%@", exportSession.error);
                break;
            default:
                break;
        }
    }];
}
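Note that the timeRange above keeps only a one-second slice starting one second in, so the snippet as written trims as well as compresses. If the goal is to compress the whole clip, the time range can be left at its default or set to cover the full asset, for example:
// Export the entire asset instead of a one-second slice.
exportSession.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration);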

Related

Second video recording with UIImagePickerController override the path of first video recording

I am trying to record two videos with UIImagePickerController. Everything works fine, but recording the second video seems to override the path of the first recorded video.
I need to upload both videos to the server, but the first video's path comes back nil while uploading and the app crashes. Is there any way to record the second video at a different path?
The video paths are as follows:
/private/var/mobile/Containers/Data/Application/1465EC90-4B57-41FF-996E-0CCB7713ECE7/tmp/50332801315__A883E4DB-ED72-4D31-9564-22FB363779BD.MOV
/private/var/mobile/Containers/Data/Application/1465EC90-4B57-41FF-996E-0CCB7713ECE7/tmp/50332802324__F11733AD-EB62-426D-BA1C-7E87D2BF66D0.MOV
Here is my imagePicker delegate code:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    NSString *mediaType = [info objectForKey:UIImagePickerControllerMediaType];
    if (CFStringCompare((__bridge CFStringRef)mediaType, kUTTypeMovie, 0) == kCFCompareEqualTo) {
        NSURL *videoUrl = (NSURL *)[info objectForKey:UIImagePickerControllerMediaURL];
        NSString *moviePath = [videoUrl path];
        if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(moviePath)) {
            UISaveVideoAtPathToSavedPhotosAlbum(moviePath, nil, nil, nil);
        }

        NSLog(@"videoUrl: %@", videoUrl);
        NSLog(@"moviePath: %@", moviePath);
        // self.moviePath_1 = @"";
        // self.moviePath_2 = @"";
        NSLog(@"picker.title: %@", picker.title);

        if ([picker.title isEqualToString:@"Video_1"]) {
            self.moviePath_1 = moviePath;
            self.video_1 = YES;
            NSLog(@"self.moviePath_1: %@", self.moviePath_1);
            self.video_1_Data = [NSData dataWithContentsOfURL:[NSURL fileURLWithPath:self.moviePath_1]];
            NSLog(@"Video_1 Size: %@", [NSByteCountFormatter stringFromByteCount:self.video_1_Data.length countStyle:NSByteCountFormatterCountStyleFile]);
            [self setupAndPlayback:@"Video_1"];
        } else {
            self.moviePath_2 = moviePath;
            self.video_2 = YES;
            NSLog(@"self.moviePath_2: %@", self.moviePath_2);
            self.video_2_Data = [NSData dataWithContentsOfURL:[NSURL fileURLWithPath:self.moviePath_2]];
            NSLog(@"Video_2 Size: %@", [NSByteCountFormatter stringFromByteCount:self.video_2_Data.length countStyle:NSByteCountFormatterCountStyleFile]);
            [self setupAndPlayback:@"Video_2"];
        }
    }
    [self dismissViewControllerAnimated:YES completion:nil];
}
Save each video to a different path. You are overwriting the same path; that is why this issue happens. Add a timestamp or an increasing number to the path and save it there.
self.moviePath_1 = [NSString stringWithFormat:@"%@-%d.mov", moviePath, num];
num += 1; // for next time
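A sketch of that idea (illustrative only, building on the moviePath variable from the delegate above): move the temporary recording into Documents under a timestamped name so the second recording cannot clobber the first.
// Build a unique destination path in Documents using the current time.
NSString *documentsDir = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES).firstObject;
NSString *uniqueName = [NSString stringWithFormat:@"recording-%.0f.mov", [NSDate timeIntervalSinceReferenceDate]];
NSString *destinationPath = [documentsDir stringByAppendingPathComponent:uniqueName];

// Move the temp file UIImagePickerController gave us before it gets overwritten.
NSError *moveError = nil;
[[NSFileManager defaultManager] moveItemAtPath:moviePath toPath:destinationPath error:&moveError];
if (moveError) {
    NSLog(@"Failed to move recording: %@", moveError);
} else {
    self.moviePath_1 = destinationPath; // or moviePath_2, depending on which picker finished
}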

Play Last recording in Watchkit OS 2

I use the sample code from Apple for recording and playing the last recording, but I can't play the last recording. Here is the code:
- (IBAction)playLastRecording {
    // Present the media player controller for the last recorded URL.
    NSDictionary *options = @{
        WKMediaPlayerControllerOptionsAutoplayKey : @YES
    };
    [self presentMediaPlayerControllerWithURL:self.lastRecordingURL options:options completion:^(BOOL didPlayToEnd, NSTimeInterval endTime, NSError * __nullable error) {
        if (!didPlayToEnd) {
            NSLog(@"The player did not play all the way to the end. The player only played until time - %.2f.", endTime);
        }
        if (error) {
            NSLog(@"There was an error with playback: %@.", error);
        }
    }];
}
I think we need to use NSBundle and NSURLConnection, but I don't know how to use them for self.lastRecordingURL. Please show the correct code for this problem. Here is the error:
Optional(Error Domain=com.apple.watchkit.errors Code=4 "The operation could not be completed" UserInfo={NSLocalizedFailureReason=An unknown error occurred (1), NSUnderlyingError=0x17d9bf50 {Error Domain=NSPOSIXErrorDomain Code=1 "Operation not permitted"}, NSLocalizedDescription=The operation could not be completed})
I had this problem. Make sure that the audio data is saved to the App Group correctly, with the correct file extension.
If you are trying to record the audio file, use the following code.
- (void)startRecording
{
    // Creating a path for saving the recorded audio file.
    // We have to write the files to the shared group folder, as this is the only place both the app and extension can see.
    NSURL *container = [[NSFileManager defaultManager] containerURLForSecurityApplicationGroupIdentifier:APP_GROUP_IDENTIFIER];
    NSURL *outputURL = [container URLByAppendingPathComponent:@"AudioFile.m4a"];

    // Setting the recorder options.
    NSDictionary *dictMaxAudioRec = @{WKAudioRecorderControllerOptionsMaximumDurationKey:MAX_REC_DURATION};

    // Presenting the default audio recorder.
    [self presentAudioRecorderControllerWithOutputURL:outputURL preset:WKAudioRecorderPresetWideBandSpeech options:dictMaxAudioRec completion:^(BOOL didSave, NSError * error) {
        if (didSave)
        {
            // Successfully saved the file.
            NSURL *extensionDirectory = [[NSFileManager defaultManager] URLsForDirectory:NSDocumentDirectory inDomains:NSUserDomainMask].firstObject;
            NSUInteger timeAtRecording = (NSUInteger)[NSDate timeIntervalSinceReferenceDate];
            NSString *dirName = [NSString stringWithFormat:@"AudioFile%lu/", (unsigned long)timeAtRecording];
            NSURL *outputExtensionURL = [extensionDirectory URLByAppendingPathComponent:dirName];

            // Move the file to the new directory.
            NSError *moveError;
            BOOL success = [[NSFileManager defaultManager] moveItemAtURL:outputURL toURL:outputExtensionURL error:&moveError];
            if (!success) {
                NSLog(@"Failed to move the outputURL to the extension's documents directory: %@", moveError);
            }
            else {
                NSData *audioData = [NSData dataWithContentsOfURL:outputExtensionURL];
                NSLog(@"Actual Audio Data length: %lu", (unsigned long)audioData.length);
                if (audioData.length)
                {
                    // We have valid audio data; do whatever you want with it here.
                }
            }
        }
        else
        {
            // Something went wrong.
            NSLog(@"%s - %@", __PRETTY_FUNCTION__, error);
        }
    }];
}
Or, if you are trying to play audio that you downloaded from another source or that was passed from the paired phone, write the audio data to the App Group first, with a file extension. The following code may help with that.
- (void)writeAudioToAppGroupsWithData:(NSData *)audioData
{
    // Writing the audio data to the App Groups
    NSURL *containerURL = [[NSFileManager defaultManager] containerURLForSecurityApplicationGroupIdentifier:APP_GROUP_IDENTIFIER];
    containerURL = [containerURL URLByAppendingPathComponent:[NSString stringWithFormat:DIRECTORY_PATH]];
    [audioData writeToURL:containerURL atomically:YES];
}
In this case, make sure that your DIRECTORY_PATH is Library/Caches/filename.extension, e.g. Library/Caches/Audio.mp3.
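For example, DIRECTORY_PATH could be defined like this (illustrative; use whatever file name and extension match your audio data):
#define DIRECTORY_PATH @"Library/Caches/Audio.mp3"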
For playing the saved audio use the following codes.
- (void)playAudio
{
    // Playing the audio from the url using the default controller
    [self presentMediaPlayerControllerWithURL:[self getAudioUrl] options:nil completion:^(BOOL didPlayToEnd, NSTimeInterval endTime, NSError * _Nullable error) {
        NSLog(@"Error = %@", error);
    }];
}
You can get the audio url from the App Groups.
- (NSURL *)getAudioUrl
{
    // Getting the audio url from the App Groups
    NSURL *container = [[NSFileManager defaultManager] containerURLForSecurityApplicationGroupIdentifier:APP_GROUP_IDENTIFIER];
    NSURL *outputURL = [container URLByAppendingPathComponent:[NSString stringWithFormat:DIRECTORY_PATH]];
    return outputURL;
}
Hope this will fix your issue.

Composing multiple videos causes hang

I am working on an app that composes multiple video clips taken by the user. The clips are recorded on the camera, overlaid with another video, and then the recorded clips are composed together into one long clip. The length of each clip is determined by the overlaying video file.
I am using an AVAssetExportSession and exportAsynchronouslyWithCompletionHandler. The odd thing is that this works with some clips and not others. The real problem is that the exporter doesn't report any errors or failures, just zero progress, and never calls the completion handler.
I don't even know where to begin looking to find out what the issue is. Here's the function I use to compose the clips together:
- (void) setupAndStitchVideos:(NSMutableArray*)videoData
{
    // Filepath to where the final generated video is stored
    NSURL * exportUrl = nil;

    // Contains information about a single asset/track
    NSDictionary * assetOptions = nil;
    AVURLAsset * currVideoAsset = nil;
    AVURLAsset * currAudioAsset = nil;
    AVAssetTrack * currVideoTrack = nil;
    AVAssetTrack * currAudioTrack = nil;

    // Contains all tracks and time ranges used to build the final composition
    NSMutableArray * allVideoTracks = nil;
    NSMutableArray * allVideoRanges = nil;
    NSMutableArray * allAudioTracks = nil;
    NSMutableArray * allAudioRanges = nil;
    AVMutableCompositionTrack * videoTracks = nil;
    AVMutableCompositionTrack * audioTracks = nil;

    // Misc time values used when calculating a clip's start time and total length
    float animationLength = 0.0f;
    float clipLength = 0.0f;
    float startTime = 0.0f;
    CMTime clipStart = kCMTimeZero;
    CMTime clipDuration = kCMTimeZero;
    CMTimeRange currRange = kCMTimeRangeZero;

    // The final composition to be generated and exported
    AVMutableComposition * finalComposition = nil;

    // Cancel any already active exports
    if (m_activeExport)
    {
        [m_activeExport cancelExport];
        m_activeExport = nil;
    }

    // Initialize and setup all composition related member variables
    allVideoTracks = [[NSMutableArray alloc] init];
    allAudioTracks = [[NSMutableArray alloc] init];
    allVideoRanges = [[NSMutableArray alloc] init];
    allAudioRanges = [[NSMutableArray alloc] init];
    exportUrl = [NSURL fileURLWithPath:[MobveoAnimation getMergeDestination]];
    finalComposition = [AVMutableComposition composition];
    videoTracks = [finalComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    audioTracks = [finalComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    assetOptions = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES] forKey:AVURLAssetPreferPreciseDurationAndTimingKey];
    animationLength = m_animation.videoDuration;
    // Define all of the audio and video tracks that will be used in the composition
    for (NSDictionary * currData in videoData)
    {
        currVideoAsset = [AVURLAsset URLAssetWithURL:[currData objectForKey:KEY_STITCH_VIDEO_URL] options:assetOptions];
        currAudioAsset = [AVURLAsset URLAssetWithURL:[currData objectForKey:KEY_STITCH_AUDIO_URL] options:assetOptions];
        currVideoTrack = [[currVideoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

        NSArray *currAudioTracks = [currAudioAsset tracksWithMediaType:AVMediaTypeAudio];
        if ( currAudioTracks != nil && currAudioTracks.count > 0 )
        {
            currAudioTrack = currAudioTracks[0];
        }
        else
        {
            currAudioTrack = nil;
        }

        clipLength = animationLength * [(NSNumber*)[currData objectForKey:KEY_STITCH_LENGTH_PERCENTAGE] floatValue];
        clipStart = CMTimeMakeWithSeconds(startTime, currVideoAsset.duration.timescale);
        clipDuration = CMTimeMakeWithSeconds(clipLength, currVideoAsset.duration.timescale);

        NSLog(@"Clip length: %.2f", clipLength);
        NSLog(@"Clip Start: %lld", clipStart.value);
        NSLog(@"Clip duration: %lld", clipDuration.value);

        currRange = CMTimeRangeMake(clipStart, clipDuration);

        [allVideoTracks addObject:currVideoTrack];
        if ( currAudioTrack != nil )
        {
            [allAudioTracks addObject:currAudioTrack];
            [allAudioRanges addObject:[NSValue valueWithCMTimeRange:currRange]];
        }
        [allVideoRanges addObject:[NSValue valueWithCMTimeRange:currRange]];

        startTime += clipLength;
    }

    [videoTracks insertTimeRanges:allVideoRanges ofTracks:allVideoTracks atTime:kCMTimeZero error:nil];
    if ( allAudioTracks.count > 0 )
    {
        [audioTracks insertTimeRanges:allAudioRanges ofTracks:allAudioTracks atTime:kCMTimeZero error:nil];
    }
    for ( int i = 0; i < allVideoTracks.count - allAudioTracks.count; ++i )
    {
        CMTimeRange curRange = [allVideoRanges[i] CMTimeRangeValue];
        [audioTracks insertEmptyTimeRange:curRange];
    }
    // Delete any previous exported video files that may already exist
    [[NSFileManager defaultManager] removeItemAtURL:exportUrl error:nil];

    // Begin the composition generation and export process!
    m_activeExport = [[AVAssetExportSession alloc] initWithAsset:finalComposition presetName:AVAssetExportPreset1280x720];
    [m_activeExport setOutputFileType:AVFileTypeQuickTimeMovie];
    [m_activeExport setOutputURL:exportUrl];
    NSLog(@"Exporting async");
    [m_activeExport exportAsynchronouslyWithCompletionHandler:^(void)
    {
        NSLog(@"Export complete");

        // Cancel the update timer
        [m_updateTimer invalidate];
        m_updateTimer = nil;

        // Dismiss the displayed dialog
        [m_displayedDialog hide:TRUE];
        m_displayedDialog = nil;

        // Re-enable touch events
        [[UIApplication sharedApplication] endIgnoringInteractionEvents];

        // Report the success/failure result
        switch (m_activeExport.status)
        {
            case AVAssetExportSessionStatusFailed:
                [self performSelectorOnMainThread:@selector(videoStitchingFailed:) withObject:m_activeExport.error waitUntilDone:FALSE];
                break;
            case AVAssetExportSessionStatusCompleted:
                [self performSelectorOnMainThread:@selector(videoStitchingComplete:) withObject:m_activeExport.outputURL waitUntilDone:FALSE];
                break;
        }

        // Clear our reference to the completed export
        m_activeExport = nil;
    }];
}
EDIT:
Thanks to Josh in the comments I noticed there were error parameters I wasn't making use of. In the case where it is failing now I am getting the ever so useful "Operation could not be completed" error on inserting the time ranges of the video tracks:
NSError *videoError = nil;
[videoTracks insertTimeRanges:allVideoRanges ofTracks:allVideoTracks atTime:kCMTimeZero error:&videoError];
if ( videoError != nil )
{
    NSLog(@"Error adding video track: %@", videoError);
}
Output:
Error adding video track: Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo=0x17426dd00 {NSUnderlyingError=0x174040cc0 "The operation couldn’t be completed. (OSStatus error -12780.)", NSLocalizedFailureReason=An unknown error occurred (-12780), NSLocalizedDescription=The operation could not be completed}
It is worth noting, however, that nowhere in this entire codebase is URLWithString used instead of fileURLWithPath, so that isn't the problem.
Judging from your for-in enumeration of the videoData array, after you've initialized the composition member variables, it looks as if you're blocking the calling thread. Although accessing each AVAssetTrack instance is permitted, the values for its keys are not always immediately available, and accessing them loads synchronously.
Instead, try registering for change notifications using the AVAsynchronousKeyValueLoading protocol. Apple's documentation should help you straighten out the issue and get you on your way!
Apple also publishes a few more recommendations along these lines for AVFoundation that are worth reviewing.
Hopefully this will do the trick! Good luck and let me know if you have any further questions/problems.
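A minimal sketch of that approach (my addition, reusing currVideoAsset from the question): load the "tracks" key asynchronously and only build the composition once loading has succeeded.
// Ask the asset to load its tracks off the calling thread.
[currVideoAsset loadValuesAsynchronouslyForKeys:@[@"tracks"] completionHandler:^{
    NSError *loadError = nil;
    AVKeyValueStatus status = [currVideoAsset statusOfValueForKey:@"tracks" error:&loadError];
    if (status == AVKeyValueStatusLoaded) {
        // Safe to call tracksWithMediaType: and build the composition here.
    } else {
        NSLog(@"Failed to load tracks: %@", loadError);
    }
}];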

uiWebView printing a pdf

Inside of a UIWebView, what is a good way to print a pdf document?
The pdf is accessible via a url, or it can be loaded inside of an iframe.
Using the standard javascript window.print() function will not work.
I am considering using a javascript bridge such as:
- (BOOL)webView:(UIWebView*)webView shouldStartLoadWithRequest:(NSURLRequest*)request navigationType:(UIWebViewNavigationType)navigationType {
    NSURL *URL = [request URL];
    if ([[URL scheme] isEqualToString:@"native"]) {
        NSString *urlString = [[request URL] absoluteString];
        NSArray *urlParts = [urlString componentsSeparatedByString:@":"];
        NSString *cmd = [urlParts objectAtIndex:1];
        if ( [cmd isEqualToString:@"printPdf"] ) {
            // [self dosomething];
        }
    }
    return YES;
}
At this point I need some sort of native method that accepts a path to the pdf and sends it to AirPrint.
Is this a good approach? I am searching for examples of how to print a pdf inside a UIWebView.
As I've earned the tumbleweed badge for no upvotes and no responses, I'll post my solution.
This fetches the pdf doc and opens the AirPrint dialog, all within a UIWebView.
So if iOS would simply allow the javascript window.print() to function inside of a UIWebView, my app would not be sitting in the app store waiting for approval and re-release.
Anyway, here's a working solution:
- (void)printInit:(NSString *)parm {
    UIPrintInteractionController *controller = [UIPrintInteractionController sharedPrintController];
    if (!controller) {
        NSLog(@"Couldn't get shared UIPrintInteractionController!");
        return;
    }

    NSString *base = @"https://someurl.com/";
    NSString *ustr = [base stringByAppendingString:parm];
    //NSURL *url = [NSURL fileURLWithPath:ustr];
    NSURL *url = [NSURL URLWithString:ustr];
    NSData *thePdf = [NSData dataWithContentsOfURL:url];
    controller.printingItem = thePdf;

    UIPrintInfo *printInfo = [UIPrintInfo printInfo];
    printInfo.outputType = UIPrintInfoOutputGeneral;
    printInfo.jobName = @"PDFDoc";
    controller.printInfo = printInfo;

    void (^completionHandler)(UIPrintInteractionController *, BOOL, NSError *) =
        ^(UIPrintInteractionController *printController, BOOL completed, NSError *error) {
            if (!completed && error) {
                NSLog(@"FAILED! error = %@", [error localizedDescription]);
            }
        };

    CGRect rect = CGRectMake(310, 5, 100, 5);
    [controller presentFromRect:rect inView:self.webView animated:YES completionHandler:completionHandler];
}
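One note on the presentation call (an addition, not part of the original answer): presentFromRect:inView:animated:completionHandler: anchors a popover, which mainly matters on iPad; on iPhone the modal presentation below may be a better fit.
// On iPhone, present the print dialog modally instead of from a rect.
[controller presentAnimated:YES completionHandler:completionHandler];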

zXing (ios version) Black/White screen error

I'm working over application, that using zxing library to read QRcodes. I have problem with ZxingWidgetController - when view is showed, during application is in background/not active (eg. screen is lock) image from camera is not shown on screen - only background is visible, and scanner seems to be not working.
when i call initCapture method again, after a little delay video from camera is showed, but in this case, every time when application lose activity i need to reinitialize scanner - this behavior is not comfortable at all.
this bug can be repeated on almost all aplication used zXing, so i suppose that is some zXing bug.
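For reference, here is a minimal sketch of the re-initialization workaround described above (my own illustration, not part of the zXing sample; widgetController and its stopCapture method are assumptions): rebuild the capture whenever the app becomes active again.
// Hypothetical wiring: re-run the capture setup shown below when the app returns to the foreground.
[[NSNotificationCenter defaultCenter] addObserverForName:UIApplicationDidBecomeActiveNotification
                                                  object:nil
                                                   queue:[NSOperationQueue mainQueue]
                                              usingBlock:^(NSNotification *note) {
    [widgetController stopCapture];   // assumed teardown method on the widget controller
    [widgetController initCapture];   // re-run the capture setup shown below
}];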
zXing initCapture method code is:
- (void)initCapture {
#if HAS_AVFF
    AVCaptureDeviceInput *captureInput =
        [AVCaptureDeviceInput deviceInputWithDevice:
            [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo]
                                              error:nil];
    if (!captureInput)
    {
        NSLog(@"ERROR - CaptureInputNotInitialized");
    }

    AVCaptureVideoDataOutput *captureOutput = [[AVCaptureVideoDataOutput alloc] init];
    captureOutput.alwaysDiscardsLateVideoFrames = YES;
    if (!captureOutput)
    {
        NSLog(@"ERROR - CaptureOutputNotInitialized");
    }

    [captureOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];

    NSString* key = (NSString*)kCVPixelBufferPixelFormatTypeKey;
    NSNumber* value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
    NSDictionary* videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
    [captureOutput setVideoSettings:videoSettings];

    self.captureSession = [[[AVCaptureSession alloc] init] autorelease];
    self.captureSession.sessionPreset = AVCaptureSessionPresetMedium; // 480x360 on a 4

    if ([self.captureSession canAddInput:captureInput])
    {
        [self.captureSession addInput:captureInput];
    }
    else
    {
        NSLog(@"ERROR - cannot add input");
    }

    if ([self.captureSession canAddOutput:captureOutput])
    {
        [self.captureSession addOutput:captureOutput];
    }
    else
    {
        NSLog(@"ERROR - cannot add output");
    }
    [captureOutput release];

    if (!self.prevLayer)
    {
        [self.prevLayer release];
    }
    self.prevLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
    // NSLog(@"prev %p %@", self.prevLayer, self.prevLayer);
    self.prevLayer.frame = self.view.bounds;
    self.prevLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [self.view.layer addSublayer: self.prevLayer];

    [self.captureSession startRunning];
#endif
}
Maybe you guys know what is wrong?
I don't understand your question. If the application is in the background or not active, of course it can't keep working. You should make your question clearer.