I want to implement a feature that lets the user trim an audio file (.caf) which they previously recorded. The recording part already works, but how can I add a trimming feature similar to the one in the Voice Memos app? Is there an API for the audio trimmer Apple uses?
Any help would be great...
How about using AVFoundation? Import the audio file into an AVAsset (a composition, etc.), then you can export it to a file, setting the desired start time and duration.
I wrote a stock function a while ago that exports an asset to a file; you can also specify an audio mix. As written it exports the whole file, but you could set a CMTimeRange on exporter.timeRange and there you go. I haven't tested that, but it should work. Another alternative could be to adjust the time ranges when creating the AVAsset and its tracks. Of course, this exporter only outputs m4a (AAC). Sorry if this wasn't what you wanted.
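For example, setting that range on the export session created in the function below might look something like this (trimStartSeconds and trimEndSeconds are illustrative values that would come from your trimming UI):
// Hypothetical trim window chosen by the user, in seconds.
Float64 trimStartSeconds = 2.0;
Float64 trimEndSeconds = 7.5;
CMTime start = CMTimeMakeWithSeconds(trimStartSeconds, asset.duration.timescale);
CMTime duration = CMTimeMakeWithSeconds(trimEndSeconds - trimStartSeconds, asset.duration.timescale);
// Only the samples inside this range end up in the exported file.
exporter.timeRange = CMTimeRangeMake(start, duration);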
-(void)exportAsset:(AVAsset*)asset toFile:(NSString*)filename overwrite:(BOOL)overwrite withMix:(AVAudioMix*)mix {
//NSArray* availablePresets = [AVAssetExportSession exportPresetsCompatibleWithAsset:asset];
AVAssetExportSession* exporter = [AVAssetExportSession exportSessionWithAsset:asset presetName:AVAssetExportPresetAppleM4A];
if (exporter == nil) {
DLog(#"Failed creating exporter!");
return;
}
DLog(#"Created exporter! %#", exporter);
// Set output file type
DLog(#"Supported file types: %#", exporter.supportedFileTypes);
for (NSString* filetype in exporter.supportedFileTypes) {
if ([filetype isEqualToString:AVFileTypeAppleM4A]) {
exporter.outputFileType = AVFileTypeAppleM4A;
break;
}
}
if (exporter.outputFileType == nil) {
DLog(#"Needed output file type not found? (%#)", AVFileTypeAppleM4A);
return;
}
// Set outputURL
NSArray* paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString* parentDir = [NSString stringWithFormat:@"%@/", [paths objectAtIndex:0]];
NSString* outPath = [NSString stringWithFormat:@"%@%@", parentDir, filename];
NSFileManager* manager = [NSFileManager defaultManager];
if ([manager fileExistsAtPath:outPath]) {
DLog(#"%# already exists!", outPath);
if (!overwrite) {
DLog(#"Not overwriting, uh oh!");
return;
}
else {
// Overwrite
DLog(#"Overwrite! (delete first)");
NSError* error = nil;
if (![manager removeItemAtPath:outPath error:&error]) {
DLog(#"Failed removing %#, error: %#", outPath, error.description);
return;
}
else {
DLog(#"Removed %#", outPath);
}
}
}
NSURL* const outUrl = [NSURL fileURLWithPath:outPath];
exporter.outputURL = outUrl;
// Specify a time range in case only part of file should be exported
//exporter.timeRange =
if (mix != nil)
exporter.audioMix = mix; // important
DLog(#"Starting export! (%#)", exporter.outputURL);
[exporter exportAsynchronouslyWithCompletionHandler:^(void) {
// Export ended for some reason. Check in status
NSString* message;
switch (exporter.status) {
case AVAssetExportSessionStatusFailed:
message = [NSString stringWithFormat:#"Export failed. Error: %#", exporter.error.description];
DLog(#"%#", message);
[self showAlert:message];
break;
case AVAssetExportSessionStatusCompleted: {
/*if (playfileWhenExportFinished) {
DLog(#"playfileWhenExportFinished!");
[self playfileAfterExport:exporter.outputURL];
playfileWhenExportFinished = NO;
}*/
message = [NSString stringWithFormat:#"Export completed: %#", filename];
DLog(#"%#", message);
[self showAlert:message];
break;
}
case AVAssetExportSessionStatusCancelled:
message = [NSString stringWithFormat:#"Export cancelled!"];
DLog(#"%#", message);
[self showAlert:message];
break;
default:
DLog(#"Export unhandled status: %d", exporter.status);
break;
}
}];
}
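To trim a previously recorded .caf with the function above, the call might look like this (the recording filename is just an example, and you would set exporter.timeRange inside the function as noted):
// Load the previously recorded .caf (path is illustrative).
NSArray* paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString* recordingPath = [[paths objectAtIndex:0] stringByAppendingPathComponent:@"recording.caf"];
AVAsset* recordedAsset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:recordingPath] options:nil];
// Export to Documents/trimmed.m4a; nil audio mix since we only want to trim.
[self exportAsset:recordedAsset toFile:@"trimmed.m4a" overwrite:YES withMix:nil];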
The above answer from @Jonny is correct. Here I'm adding the use of an audio mix to apply a fade-in effect while trimming the audio.
Output: Audio asset trimmed to 20 seconds with a 10 second fade in.
The trim being set up in the code snippet takes place at the 30-second mark of the asset, and therefore the track duration should be at least 50 seconds.
- (BOOL)exportAssettoFilePath:(NSString *)filePath {
NSString *inputFilePath = <inputFilePath>;
NSURL *videoToTrimURL = [NSURL fileURLWithPath:inputFilePath];
AVAsset *avAsset = [AVAsset assetWithURL:videoToTrimURL];
// we need the audio asset to be at least 50 seconds long for this snippet
CMTime assetTime = [avAsset duration];
Float64 duration = CMTimeGetSeconds(assetTime);
if (duration < 50.0) return NO;
// get the first audio track
NSArray *tracks = [avAsset tracksWithMediaType:AVMediaTypeAudio];
if ([tracks count] == 0) return NO;
AVAssetTrack *track = [tracks objectAtIndex:0];
// create the export session
// no need for a retain here, the session will be retained by the
// completion handler since it is referenced there
AVAssetExportSession *exportSession = [AVAssetExportSession
exportSessionWithAsset:avAsset
presetName:AVAssetExportPresetAppleM4A];
if (nil == exportSession) return NO;
// create trim time range - 20 seconds starting from 30 seconds into the asset
CMTime startTime = CMTimeMake(30, 1);
CMTime stopTime = CMTimeMake(50, 1);
CMTimeRange exportTimeRange = CMTimeRangeFromTimeToTime(startTime, stopTime);
// create fade in time range - 10 seconds starting at the beginning of trimmed asset
CMTime startFadeInTime = startTime;
CMTime endFadeInTime = CMTimeMake(40, 1);
CMTimeRange fadeInTimeRange = CMTimeRangeFromTimeToTime(startFadeInTime,
endFadeInTime);
// setup audio mix
AVMutableAudioMix *exportAudioMix = [AVMutableAudioMix audioMix];
AVMutableAudioMixInputParameters *exportAudioMixInputParameters =
[AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:track];
[exportAudioMixInputParameters setVolumeRampFromStartVolume:0.0 toEndVolume:1.0
timeRange:fadeInTimeRange];
exportAudioMix.inputParameters = [NSArray
arrayWithObject:exportAudioMixInputParameters];
// configure export session output with all our parameters
exportSession.outputURL = [NSURL fileURLWithPath:filePath]; // output path
exportSession.outputFileType = AVFileTypeAppleM4A; // output file type
exportSession.timeRange = exportTimeRange; // trim time range
exportSession.audioMix = exportAudioMix; // fade in audio mix
// perform the export
[exportSession exportAsynchronouslyWithCompletionHandler:^{
if (AVAssetExportSessionStatusCompleted == exportSession.status) {
NSLog(#"AVAssetExportSessionStatusCompleted");
} else if (AVAssetExportSessionStatusFailed == exportSession.status) {
// a failure may happen because of an event out of your control
// for example, an interruption like a phone call comming in
// make sure and handle this case appropriately
NSLog(#"AVAssetExportSessionStatusFailed");
} else {
NSLog(#"Export Session Status: %ld", (long)exportSession.status);
}
}];
return YES;}
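A possible way to call it (the output path here is only an example):
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *outPath = [[paths objectAtIndex:0] stringByAppendingPathComponent:@"trimmed-fadein.m4a"];
if (![self exportAssettoFilePath:outPath]) {
    NSLog(@"Could not start the trimmed export");
}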
Thanks.
For more details: https://developer.apple.com/library/ios/qa/qa1730/_index.html
Related
I am trying to record two videos with UIImagePickerController. Everything is working fine, but while recording the second video it seems to overwrite the path of the first recorded video.
I need to upload both videos to the server, but the first video path becomes nil while uploading and the app crashes. Is there any way to record the second video at a different path?
Video Path as follows:
/private/var/mobile/Containers/Data/Application/1465EC90-4B57-41FF-996E-0CCB7713ECE7/tmp/50332801315__A883E4DB-ED72-4D31-9564-22FB363779BD.MOV
/private/var/mobile/Containers/Data/Application/1465EC90-4B57-41FF-996E-0CCB7713ECE7/tmp/50332802324__F11733AD-EB62-426D-BA1C-7E87D2BF66D0.MOV
Here is my imagePicker delegate code:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
NSString *mediaType = [info objectForKey:UIImagePickerControllerMediaType];
if (CFStringCompare ((__bridge CFStringRef) mediaType, kUTTypeMovie, 0) == kCFCompareEqualTo) {
NSURL *videoUrl = (NSURL*)[info objectForKey:UIImagePickerControllerMediaURL];
NSString *moviePath = [videoUrl path];
if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum (moviePath)) {
UISaveVideoAtPathToSavedPhotosAlbum (moviePath, nil, nil, nil);
}
NSLog(#"videoUrl: %#", videoUrl);
NSLog(#"moviePath: %#", moviePath);
// self.moviePath_1 = #"";
// self.moviePath_2 = #"";
NSLog(#"picker.title: %#", picker.title);
if ([picker.title isEqualToString:#"Video_1"]) {
self.moviePath_1 = moviePath;
self.video_1 = YES;
NSLog(#"self.moviePath_1: %#", self.moviePath_1);
self.video_1_Data = [NSData dataWithContentsOfURL:[NSURL fileURLWithPath:self.moviePath_1]];
NSLog(#"Video_1 Size: %#",[NSByteCountFormatter stringFromByteCount:self.video_1_Data.length countStyle:NSByteCountFormatterCountStyleFile]);
[self setupAndPlayback:#"Video_1"];
} else {
self.moviePath_2 = moviePath;
self.video_2 = YES;
NSLog(#"self.moviePath_2: %#", self.moviePath_2);
self.video_2_Data = [NSData dataWithContentsOfURL:[NSURL fileURLWithPath:self.moviePath_2]];
NSLog(#"Video_2 Size: %#",[NSByteCountFormatter stringFromByteCount:self.video_2_Data.length countStyle:NSByteCountFormatterCountStyleFile]);
[self setupAndPlayback:#"Video_2"];
}
}
[self dismissViewControllerAnimated:YES completion:nil];
}
Save the videos to different paths. You are overwriting the same path, which is why this happens. Add a timestamp or an incrementing number to the path and save to that, as in the line below.
self.moviePath_1 = [NSString stringWithFormat:@"%@-%d.png", moviePath, num];
num += 1; // for next time
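A slightly more robust sketch of the same idea (the destination directory is an assumption, the property names mirror the question): copy each freshly recorded clip out of tmp to a unique, timestamped name before starting the next recording, so nothing gets overwritten.
// Build a unique destination in Documents using a timestamp (illustrative).
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *uniqueName = [NSString stringWithFormat:@"video-%.0f.mov", [[NSDate date] timeIntervalSince1970]];
NSString *destPath = [[paths objectAtIndex:0] stringByAppendingPathComponent:uniqueName];
NSError *copyError = nil;
if ([[NSFileManager defaultManager] copyItemAtPath:moviePath toPath:destPath error:&copyError]) {
    self.moviePath_1 = destPath; // keep a path that won't be reused by the picker
} else {
    NSLog(@"Could not copy recorded video: %@", copyError);
}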
I am working on an app that composes multiple video clips taken by the user. The clips are recorded with the camera and overlaid with another video, and then the recorded clips are composed together into one long clip. The length of each clip is determined by the overlaying video file.
I am using an AVAssetExportSession and exportAsynchronouslyWithCompletionHandler. The odd thing is this works with some clips and not others. The real problem is that the exporter doesn't report any errors or failures, just zero progress, and never calls the completion handler.
I don't even know where to begin looking to find out what the issue is. Here's the function I use to compose the clips together:
- (void) setupAndStitchVideos:(NSMutableArray*)videoData
{
// Filepath to where the final generated video is stored
NSURL * exportUrl = nil;
// Contains information about a single asset/track
NSDictionary * assetOptions = nil;
AVURLAsset * currVideoAsset = nil;
AVURLAsset * currAudioAsset = nil;
AVAssetTrack * currVideoTrack = nil;
AVAssetTrack * currAudioTrack = nil;
// Contains all tracks and time ranges used to build the final composition
NSMutableArray * allVideoTracks = nil;
NSMutableArray * allVideoRanges = nil;
NSMutableArray * allAudioTracks = nil;
NSMutableArray * allAudioRanges = nil;
AVMutableCompositionTrack * videoTracks = nil;
AVMutableCompositionTrack * audioTracks = nil;
// Misc time values used when calculating a clips start time and total length
float animationLength = 0.0f;
float clipLength = 0.0f;
float startTime = 0.0f;
CMTime clipStart = kCMTimeZero;
CMTime clipDuration = kCMTimeZero;
CMTimeRange currRange = kCMTimeRangeZero;
// The final composition to be generated and exported
AVMutableComposition * finalComposition = nil;
// Cancel any already active exports
if (m_activeExport)
{
[m_activeExport cancelExport];
m_activeExport = nil;
}
// Initialize and setup all composition related member variables
allVideoTracks = [[NSMutableArray alloc] init];
allAudioTracks = [[NSMutableArray alloc] init];
allVideoRanges = [[NSMutableArray alloc] init];
allAudioRanges = [[NSMutableArray alloc] init];
exportUrl = [NSURL fileURLWithPath:[MobveoAnimation getMergeDestination]];
finalComposition = [AVMutableComposition composition];
videoTracks = [finalComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
audioTracks = [finalComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
assetOptions = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES] forKey:AVURLAssetPreferPreciseDurationAndTimingKey];
animationLength = m_animation.videoDuration;
// Define all of the audio and video tracks that will be used in the composition
for (NSDictionary * currData in videoData)
{
currVideoAsset = [AVURLAsset URLAssetWithURL:[currData objectForKey:KEY_STITCH_VIDEO_URL] options:assetOptions];
currAudioAsset = [AVURLAsset URLAssetWithURL:[currData objectForKey:KEY_STITCH_AUDIO_URL] options:assetOptions];
currVideoTrack = [[currVideoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
NSArray *audioTracks = [currAudioAsset tracksWithMediaType:AVMediaTypeAudio];
if ( audioTracks != nil && audioTracks.count > 0 )
{
currAudioTrack = audioTracks[0];
}
else
{
currAudioTrack = nil;
}
clipLength = animationLength * [(NSNumber*)[currData objectForKey:KEY_STITCH_LENGTH_PERCENTAGE] floatValue];
clipStart = CMTimeMakeWithSeconds(startTime, currVideoAsset.duration.timescale);
clipDuration = CMTimeMakeWithSeconds(clipLength, currVideoAsset.duration.timescale);
NSLog(#"Clip length: %.2f", clipLength);
NSLog(#"Clip Start: %lld", clipStart.value );
NSLog(#"Clip duration: %lld", clipDuration.value);
currRange = CMTimeRangeMake(clipStart, clipDuration);
[allVideoTracks addObject:currVideoTrack];
if ( currAudioTrack != nil )
{
[allAudioTracks addObject:currAudioTrack];
[allAudioRanges addObject:[NSValue valueWithCMTimeRange:currRange]];
}
[allVideoRanges addObject:[NSValue valueWithCMTimeRange:currRange]];
startTime += clipLength;
}
[videoTracks insertTimeRanges:allVideoRanges ofTracks:allVideoTracks atTime:kCMTimeZero error:nil];
if ( allAudioTracks.count > 0 )
{
[audioTracks insertTimeRanges:allAudioRanges ofTracks:allAudioTracks atTime:kCMTimeZero error:nil];
}
for ( int i = 0; i < allVideoTracks.count - allAudioTracks.count; ++i )
{
CMTimeRange curRange = [allVideoRanges[i] CMTimeRangeValue];
[audioTracks insertEmptyTimeRange:curRange];
}
// Delete any previous exported video files that may already exist
[[NSFileManager defaultManager] removeItemAtURL:exportUrl error:nil];
// Begin the composition generation and export process!
m_activeExport = [[AVAssetExportSession alloc] initWithAsset:finalComposition presetName:AVAssetExportPreset1280x720];
[m_activeExport setOutputFileType:AVFileTypeQuickTimeMovie];
[m_activeExport setOutputURL:exportUrl];
NSLog(#"Exporting async");
[m_activeExport exportAsynchronouslyWithCompletionHandler:^(void)
{
NSLog(#"Export complete");
// Cancel the update timer
[m_updateTimer invalidate];
m_updateTimer = nil;
// Dismiss the displayed dialog
[m_displayedDialog hide:TRUE];
m_displayedDialog = nil;
// Re-enable touch events
[[UIApplication sharedApplication] endIgnoringInteractionEvents];
// Report the success/failure result
switch (m_activeExport.status)
{
case AVAssetExportSessionStatusFailed:
[self performSelectorOnMainThread:@selector(videoStitchingFailed:) withObject:m_activeExport.error waitUntilDone:FALSE];
break;
case AVAssetExportSessionStatusCompleted:
[self performSelectorOnMainThread:@selector(videoStitchingComplete:) withObject:m_activeExport.outputURL waitUntilDone:FALSE];
break;
}
// Clear our reference to the completed export
m_activeExport = nil;
}];
}
EDIT:
Thanks to Josh in the comments, I noticed there were error parameters I wasn't making use of. In the case where it is failing, I am now getting the ever-so-useful "Operation could not be completed" error when inserting the time ranges of the video tracks:
NSError *videoError = nil;
[videoTracks insertTimeRanges:allVideoRanges ofTracks:allVideoTracks atTime:kCMTimeZero error:&videoError];
if ( videoError != nil )
{
NSLog(#"Error adding video track: %#", videoError);
}
Output:
Error adding video track: Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo=0x17426dd00 {NSUnderlyingError=0x174040cc0 "The operation couldn’t be completed. (OSStatus error -12780.)", NSLocalizedFailureReason=An unknown error occurred (-12780), NSLocalizedDescription=The operation could not be completed}
It is worth noting, however, that nowhere in this codebase is URLWithString used instead of fileURLWithPath, so that isn't the problem.
Judging from your for-in enumeration of the videoData array after you've initialized the composition member variables, it looks as if you're blocking the calling thread. Although accessing each AVAssetTrack instance is permitted, the values for its keys are not always immediately available, and loading them that way runs synchronously.
Instead, try loading the values asynchronously and responding when they become available, via the AVAsynchronousKeyValueLoading protocol. Apple's documentation should help you straighten out the issue and get you on your way!
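For example, a minimal sketch of loading the tracks key before reading it, reusing the asset setup from your loop (the completion body here is only illustrative):
AVURLAsset *videoAsset = [AVURLAsset URLAssetWithURL:[currData objectForKey:KEY_STITCH_VIDEO_URL] options:assetOptions];
[videoAsset loadValuesAsynchronouslyForKeys:@[@"tracks", @"duration"] completionHandler:^{
    NSError *loadError = nil;
    if ([videoAsset statusOfValueForKey:@"tracks" error:&loadError] == AVKeyValueStatusLoaded) {
        // Safe to call tracksWithMediaType: and build the composition from here.
        AVAssetTrack *videoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] firstObject];
        NSLog(@"Video track loaded: %@", videoTrack);
    } else {
        NSLog(@"Failed loading asset keys: %@", loadError);
    }
}];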
Here are a few more Apple recommendations I've aggregated for AVFoundation:
Hopefully this will do the trick! Good luck and let me know if you have any further questions/problems.
I'm writing an MP4 video file with an AVAssetWriter using an AVAssetWriterInputPixelBufferAdaptor.
The source is a video from a UIImagePickerController, either freshly captured with the camera or taken from the asset library. Quality right now is UIImagePickerControllerQualityTypeMedium.
Sometimes the writer fails. Its status is AVAssetWriterStatusFailed and the AVAssetWriter object's error property is:
Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed"
UserInfo=0xf5d8990 {NSLocalizedFailureReason=An unknown error occurred (-536870210),
NSUnderlyingError=0x4dd8e0 "The operation couldn’t be completed. (OSStatus error -536870210.)",
NSLocalizedDescription=The operation could not be completed
The error occurs approximately 20% of the times the code is run. It seems to fail more frequently on iPhone 4 / 4S than on iPhone 5.
It also occurs more frequently if the source video quality is higher.
Using UIImagePickerControllerQualityTypeLow the error doesn't happen so often.
Using UIImagePickerControllerQualityTypeHigh, the error happens a little more frequently.
I have also noticed something else:
It seems to come in waves. When it fails, the following runs will often fail too, even though I delete the app and reinstall it. That leaves me wondering whether my program leaks some memory and whether that memory stays alive even after the app is killed (is that even possible?).
Here is the code I use to render my video:
- (void)writeVideo
{
offlineRenderingInProgress = YES;
/* --- Writer Setup --- */
[locationQueue cancelAllOperations];
[self stopWithoutRewinding];
NSError *writerError = nil;
BOOL success;
success = [[NSFileManager defaultManager] removeItemAtURL:self.outputURL error:nil];
// DLog(@"Url: %@, success: %i", self.outputURL, success);
writer = [AVAssetWriter assetWriterWithURL:self.outputURL fileType:(NSString *)kUTTypeQuickTimeMovie error:&writerError];
//writer.shouldOptimizeForNetworkUse = NO;
if (writerError) {
DLog(#"Writer error: %#", writerError);
return;
}
float bitsPerPixel;
CMVideoDimensions dimensions = CMVideoFormatDescriptionGetDimensions((__bridge CMVideoFormatDescriptionRef)([readerVideoOutput.videoTracks[0] formatDescriptions][0]));
int numPixels = dimensions.width * dimensions.height;
int bitsPerSecond;
// Assume that lower-than-SD resolutions are intended for streaming, and use a lower bitrate
if ( numPixels < (640 * 480) )
bitsPerPixel = 4.05; // This bitrate matches the quality produced by AVCaptureSessionPresetMedium or Low.
else
bitsPerPixel = 11.4; // This bitrate matches the quality produced by AVCaptureSessionPresetHigh.
bitsPerSecond = numPixels * bitsPerPixel;
NSDictionary *videoCompressionSettings = [NSDictionary dictionaryWithObjectsAndKeys:
AVVideoCodecH264, AVVideoCodecKey,
[NSNumber numberWithFloat:videoSize.width], AVVideoWidthKey,
[NSNumber numberWithInteger:videoSize.height], AVVideoHeightKey,
[NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInteger:bitsPerSecond], AVVideoAverageBitRateKey, // use the bitrate computed above
[NSNumber numberWithInteger:30], AVVideoMaxKeyFrameIntervalKey,
nil], AVVideoCompressionPropertiesKey,
nil];
writerVideoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoCompressionSettings];
writerVideoInput.transform = movie.preferredTransform;
writerVideoInput.expectsMediaDataInRealTime = YES;
[writer addInput:writerVideoInput];
NSDictionary *sourcePixelBufferAttributesDictionary = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:kCVPixelFormatType_32ARGB], kCVPixelBufferPixelFormatTypeKey, nil];
writerPixelAdaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerVideoInput
sourcePixelBufferAttributes:sourcePixelBufferAttributesDictionary];
BOOL couldStart = [writer startWriting];
if (!couldStart) {
DLog(#"Could not start AVAssetWriter!");
abort = YES;
[locationQueue cancelAllOperations];
return;
}
[self configureFilters];
CIContext *offlineRenderContext = [CIContext contextWithOptions:@{kCIContextUseSoftwareRenderer : @NO}];
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
if (!self.canEdit) {
[self createVideoReaderWithAsset:movie timeRange:CMTimeRangeFromTimeToTime(kCMTimeZero, kCMTimePositiveInfinity) forOfflineRender:YES];
} else {
[self createVideoReaderWithAsset:movie timeRange:CMTimeRangeWithNOVideoRangeInDuration(self.thumbnailEditView.range, movie.duration) forOfflineRender:YES];
}
CMTime startOffset = reader.timeRange.start;
DLog(#"startOffset: %llu", startOffset.value);
[self.thumbnailEditView removeFromSuperview];
// self.thumbnailEditView = nil;
[glLayer removeFromSuperlayer];
glLayer = nil;
[playerView removeFromSuperview];
playerView = nil;
glContext = nil;
[writerVideoInput requestMediaDataWhenReadyOnQueue:dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0) usingBlock:^{
@try {
BOOL didWriteSomething = NO;
DLog(@"Preparing to write...");
while ([writerVideoInput isReadyForMoreMediaData]) {
if (abort) {
NSLog(#"Abort == YES");
[locationQueue cancelAllOperations];
[writerVideoInput markAsFinished];
videoConvertCompletionBlock(NO, writer.error.localizedDescription);
}
if (writer.status == AVAssetWriterStatusFailed) {
DLog(@"Writer.status: AVAssetWriterStatusFailed, error: %@", writer.error);
DLog(@"Source file exists: %i", [[NSFileManager defaultManager] fileExistsAtPath:movie.URL.relativePath]);
[[NSUserDefaults standardUserDefaults] setObject:[NSNumber numberWithInt:1] forKey:@"QualityOverride"];
[[NSUserDefaults standardUserDefaults] synchronize];
abort = YES;
[locationQueue cancelAllOperations];
videoConvertCompletionBlock(NO, writer.error.localizedDescription);
return;
}
DLog(#"Writing started...");
CMSampleBufferRef buffer = nil;
if (reader.status != AVAssetReaderStatusUnknown) {
if (reader.status == AVAssetReaderStatusReading) {
buffer = [readerVideoOutput copyNextSampleBuffer];
if (didWriteSomething == NO) {
DLog(#"Copying sample buffers...");
}
}
if (!buffer) {
[writerVideoInput markAsFinished];
DLog(#"Finished...");
CGColorSpaceRelease(colorSpace);
[self offlineRenderingDidFinish];
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
[writer finishWriting];
if (writer.error != nil) {
DLog(#"Error: %#", writer.error);
} else {
DLog(#"Succes!");
}
if (writer.status == AVAssetWriterStatusCompleted) {
videoConvertCompletionBlock(YES, nil);
}
else {
abort = YES;
videoConvertCompletionBlock(NO, writer.error.localizedDescription);
}
});
return;
}
didWriteSomething = YES;
}
else {
DLog(#"Still waiting...");
//Reader just needs a moment to get ready...
continue;
}
CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(buffer);
if (pixelBuffer == NULL) {
DLog(#"Pixelbuffer == NULL");
continue;
}
//DLog(#"Sample call back! Pixelbuffer: %lu", CVPixelBufferGetHeight(pixelBuffer));
//NSDictionary *options = [NSDictionary dictionaryWithObject:(__bridge id)CGColorSpaceCreateDeviceRGB() forKey:kCIImageColorSpace];
CIImage *ciimage = [CIImage imageWithCVPixelBuffer:pixelBuffer options:nil];
CIImage *outputImage = [self filteredImageWithImage:ciimage];
CVPixelBufferRef outPixelBuffer = NULL;
CVReturn status;
CFDictionaryRef empty; // empty value for attr value.
CFMutableDictionaryRef attrs;
empty = CFDictionaryCreate(kCFAllocatorDefault, // our empty IOSurface properties dictionary
NULL,
NULL,
0,
&kCFTypeDictionaryKeyCallBacks,
&kCFTypeDictionaryValueCallBacks);
attrs = CFDictionaryCreateMutable(kCFAllocatorDefault,
1,
&kCFTypeDictionaryKeyCallBacks,
&kCFTypeDictionaryValueCallBacks);
CFDictionarySetValue(attrs,
kCVPixelBufferIOSurfacePropertiesKey,
empty);
CFDictionarySetValue(attrs,
kCVPixelBufferCGImageCompatibilityKey,
(__bridge const void *)([NSNumber numberWithBool:YES]));
CFDictionarySetValue(attrs,
kCVPixelBufferCGBitmapContextCompatibilityKey,
(__bridge const void *)([NSNumber numberWithBool:YES]));
status = CVPixelBufferCreate(kCFAllocatorDefault, ciimage.extent.size.width, ciimage.extent.size.height, kCVPixelFormatType_32BGRA, attrs, &outPixelBuffer);
//DLog(#"Output image size: %f, %f, pixelbuffer height: %lu", outputImage.extent.size.width, outputImage.extent.size.height, CVPixelBufferGetHeight(outPixelBuffer));
if (status != kCVReturnSuccess) {
DLog(#"Couldn't allocate output pixelBufferRef!");
continue;
}
[offlineRenderContext render:outputImage toCVPixelBuffer:outPixelBuffer bounds:outputImage.extent colorSpace:colorSpace];
CMTime currentSourceTime = CMSampleBufferGetPresentationTimeStamp(buffer);
CMTime currentTime = CMTimeSubtract(currentSourceTime, startOffset);
CMTime duration = reader.timeRange.duration;
if (CMTIME_IS_POSITIVE_INFINITY(duration)) {
duration = movie.duration;
}
CMTime durationConverted = CMTimeConvertScale(duration, currentTime.timescale, kCMTimeRoundingMethod_Default);
float durationFloat = (float)durationConverted.value;
float progress = ((float) currentTime.value) / durationFloat;
//DLog(#"duration : %f, progress: %f", durationFloat, progress);
[self updateOfflineRenderProgress:progress];
if (pixelBuffer != NULL && writerVideoInput.readyForMoreMediaData) {
[writerPixelAdaptor appendPixelBuffer:outPixelBuffer withPresentationTime:currentTime];
} else {
continue;
}
if (writer.status == AVAssetWriterStatusWriting) {
DLog(#"Writer.status: AVAssetWriterStatusWriting");
}
CFRelease(buffer);
CVPixelBufferRelease(outPixelBuffer);
}
}
@catch (NSException *exception) {
DLog(@"Catching exception: %@", exception);
}
}];
}
Ok, I think I solved it myself. The bad guy was this line:
[writerVideoInput requestMediaDataWhenReadyOnQueue:dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0) usingBlock:^{ ....
The global queue I was passing is a concurrent queue. This allows a new callback to be made before the previous one is finished. The asset writer is not designed to be written to from more than one thread at a time.
Creating and using a new serial queue seems to remedy the problem:
assetWriterQueue = dispatch_queue_create("AssetWriterQueue", DISPATCH_QUEUE_SERIAL);
[writerVideoInput requestMediaDataWhenReadyOnQueue:assetWriterQueue usingBlock:^{...
I've spent two days googling and reading the Bluetooth programming guide while trying to piece together a small Mac app that will retrieve images from a drop folder and send any new files to a predetermined device over Bluetooth. There don't seem to be many good examples available.
I'm at the point where I'm able to spawn the Bluetooth Service Browser and select the device and its OBEX service, establishing a service and creating a connection, but then nothing more happens. Could anyone please point me in the direction of/show me a simple example that would work?
AppDelegate source code enclosed. Thanks for reading!
#import "AppDelegate.h"
@implementation AppDelegate
- (void)applicationDidFinishLaunching:(NSNotification *)aNotification {
IOBluetoothServiceBrowserController *browser = [IOBluetoothServiceBrowserController serviceBrowserController:0];
[browser runModal];
//IOBluetoothSDPServiceRecord
IOBluetoothSDPServiceRecord *result = [[browser getResults] objectAtIndex:0];
[self describe:result];
if ([[result.device.name substringToIndex:8] isEqualToString:@"Polaroid"]) {
printer = result.device;
serviceRecord = result;
[self testPrint];
}
else {
NSLog(#"%# is not a valid device", result.device.name);
}
}
- (void) testPrint {
currentFilePath = #"/Users/oyvind/Desktop/_DSC8797.jpg";
[self sendFile:currentFilePath];
}
- (void) sendFile:(NSString *)filePath {
IOBluetoothOBEXSession *obexSession = [[IOBluetoothOBEXSession alloc] initWithSDPServiceRecord:serviceRecord];
if( obexSession != nil )
{
NSLog(#"OBEX Session Established");
OBEXFileTransferServices *fst = [OBEXFileTransferServices withOBEXSession:obexSession];
OBEXDelegate *obxd = [[OBEXDelegate alloc] init];
[obxd setFile:filePath];
[fst setDelegate:obxd];
OBEXError cnctResult = [fst connectToObjectPushService];
if( cnctResult != kIOReturnSuccess ) {
NSLog(#"Error creating connection");
return;
}
else {
NSLog(#"OBEX Session Created. Sending file: %#", filePath);
[fst sendFile:filePath];
[printer openConnection];
}
}
else {
NSLog(#"Error creating OBEX session");
NSLog(#"Error sending file");
}
}
@end
OK; here's what ultimately became the core parts of the functionality. The application I made was a sort of print server for Polaroid instant printers that would only accept images over Object Push.
First, ensure the watched folder exists.
/*
Looks for a directory named PolaroidWatchFolder in the user's desktop directory
and creates it if it does not exist.
*/
- (void) ensureWatchedFolderExists {
NSFileManager *fileManager = [NSFileManager defaultManager];
NSURL *url = [NSURL URLWithString:#"PolaroidWatchFolder" relativeToURL:[[fileManager URLsForDirectory:NSDesktopDirectory inDomains:NSUserDomainMask] objectAtIndex:0]];
BOOL isDir;
if ([fileManager fileExistsAtPath:[url path] isDirectory:&isDir] && isDir) {
[self log:[NSString stringWithFormat:#"Watched folder exists at %#", [url absoluteURL]]];
watchFolderPath = url;
}
else {
NSError *theError = nil;
if (![fileManager createDirectoryAtURL:url withIntermediateDirectories:NO attributes:nil error:&theError]) {
[self log:[NSString stringWithFormat:#"Watched folder could not be created at %#", [url absoluteURL]]];
}
else {
watchFolderPath = url;
[self log:[NSString stringWithFormat:#"Watched folder created at %#", [url absoluteURL]]];
}
}
}
Then scan for available printers:
/*
Loops through all paired Bluetooth devices and retrieves OBEX Object Push service records
for each device whose name starts with "Polaroid".
*/
- (void) findPairedDevices {
NSArray *pairedDevices = [IOBluetoothDevice pairedDevices];
devicesTested = [NSMutableArray arrayWithCapacity:0];
for (IOBluetoothDevice *device in pairedDevices)
{
if ([self deviceQualifiesForAddOrRenew:device.name])
{
BluetoothPushDevice *pushDevice = [[BluetoothPushDevice alloc] initWithDevice:device];
if (pushDevice != nil)
{
[availableDevices addObject:pushDevice];
[pushDevice testConnection];
}
}
}
}
That last function call is to the BluetoothPushDevice's built-in method to test the connection. Here is the delegate handler for the response:
- (void) deviceStatusHandler: (NSNotification *)notification {
BluetoothPushDevice *device = [notification object];
NSString *status = [[notification userInfo] objectForKey:@"message"];
if ([devicesTested count] < [availableDevices count] && ![devicesTested containsObject:device.name]) {
[devicesTested addObject:device.name];
}
}
Upon server start, this method will run in response to a timer tick or manual scan:
- (void) checkWatchedFolder {
NSError *error = nil;
NSArray *properties = [NSArray arrayWithObjects: NSURLLocalizedNameKey, NSURLCreationDateKey, NSURLLocalizedTypeDescriptionKey, nil];
NSArray *files = [[NSFileManager defaultManager]
contentsOfDirectoryAtURL:watchFolderPath
includingPropertiesForKeys:properties
options:(NSDirectoryEnumerationSkipsHiddenFiles)
error:&error];
if (files == nil) {
[self log:#"Error reading watched folder"];
return;
}
if ([files count] > 0) {
int newFileCount = 0;
for (NSURL *url in files) {
if (![filesInTransit containsObject:[url path]]) {
NSLog(#"New file: %#", [url lastPathComponent]);
[self sendFile:[url path]];
newFileCount++;
}
}
}
}
When new files are found, we first need to find a device that is not busy receiving or printing a file:
/*
Loops through all discovered devices and returns the first one that is not currently connected
(meaning it is not in use; connections are ad-hoc per print).
*/
- (BluetoothPushDevice*) getIdleDevice {
for (BluetoothPushDevice *device in availableDevices) {
if ([device.status isEqualToString:kBluetoothDeviceStatusReady]) {
return device;
}
}
return nil;
}
Then a file is sent with this method:
- (void) sendFile:(NSString *)filePath {
BluetoothPushDevice *device = [self getIdleDevice];
if( device != nil ) {
NSLog(#"%# is available", device.name);
if ([device sendFile:filePath]) {
[self log:[NSString stringWithFormat:#"Sending file: %#", filePath]];
[filesInTransit addObject:filePath];
}
else {
[self log:[NSString stringWithFormat:#"Error sending file: %#", filePath]];
}
}
else {
NSLog(#"No idle devices");
}
}
Upon transfer complete, this delegate method is called:
/*
Responds to BluetoothPushDevice's TransferComplete notification
*/
- (void) transferStatusHandler: (NSNotification *) notification {
NSString *status = [[notification userInfo] objectForKey:@"message"];
NSString *file = ((BluetoothPushDevice*)[notification object]).file;
if ([status isEqualToString:kBluetoothTransferStatusComplete]) {
if ([filesInTransit containsObject:file]) {
NSFileManager *fileManager = [NSFileManager defaultManager];
NSError *error = nil;
if (![fileManager removeItemAtPath:file error:&error]) {
[self log:[NSString stringWithFormat:@"**ERROR** File %@ could not be deleted (%@)", file, error.description]];
}
else {
[self log:[NSString stringWithFormat:@"File deleted: %@", file]];
}
[filesInTransit removeObject:file];
}
else {
[self log:[NSString stringWithFormat:#"**ERROR** filesInTransit array does not contain file %#", file]];
}
}
[self updateDeviceStatusDisplay];
}
I hope this helps someone!
I tested the export method on an iOS application and it works fine. But when I moved it to a command line project the method exportAsynchronouslyWithCompletionHandler doesn't work.
I suspect the cause is that the command line tool cannot handle the asynchronous call because it returns immediately. The function is pretty straightforward: it just takes the path of the video, a quality setting, and an output path. Here is my code.
void exportVideo(NSString *source, NSString *quality, NSString *filePath) {
NSString *preset = nil;
if ([quality isEqualToString:#"high"]) {
preset = AVAssetExportPreset960x540; // 16:9 recommended resolution for iPhone 4
} else if ([quality isEqualToString:#"middle"]) {
preset = AVAssetExportPresetAppleM4VWiFi; // 16:9 recommended resolution for iPhone 3GS
}
// Create the asset from the path
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:source] options:nil];
if (!asset) NSLog(@"There is no video in the asset");
//NSLog(@"Print all presets: %@", [AVAssetExportSession allExportPresets]);
//NSLog(@"Print all presets compatible with the asset: %@", [AVAssetExportSession exportPresetsCompatibleWithAsset:asset]);
NSArray *compatiblePresets = [AVAssetExportSession exportPresetsCompatibleWithAsset:asset];
//NSLog(@"All available presets: %@", compatiblePresets);
if ([compatiblePresets containsObject:preset]) {
// create export session
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:asset presetName:preset];
exportSession.outputURL = [NSURL fileURLWithPath:filePath]; // output path
if (preset == AVAssetExportPreset960x540) {
exportSession.outputFileType = AVFileTypeQuickTimeMovie;
} else if (preset == AVAssetExportPresetAppleM4VWiFi) {
exportSession.outputFileType = AVFileTypeAppleM4V;
}
dispatch_semaphore_t sema = dispatch_semaphore_create(0); // Add this line
// In command line tool, it doesn't execute this method
[exportSession exportAsynchronouslyWithCompletionHandler:^{
if (exportSession.status == AVAssetExportSessionStatusCompleted) {
NSLog(#"AVAssetExportSessionStatusCompleted");
} else if (exportSession.status == AVAssetExportSessionStatusFailed) {
NSLog(#"AVAssetExportSessionStatusFailed");
NSLog (#"FAIL %#", exportSession.error);
} else {
NSLog(#"Export Session Status: %ld", exportSession.status);
}
dispatch_semaphore_signal(sema); // Add this line
}];
dispatch_semaphore_wait(sema, DISPATCH_TIME_FOREVER); // Add this line
dispatch_release(sema); // Add this line
} else {
NSLog(#"Requested quality is not available.");
}
}
int main(int argc, const char * argv[])
{
@autoreleasepool {
NSLog(@"Input the source file path:");
char str[100];
scanf("%s", str);
NSString *inputSource = [NSString stringWithUTF8String:str];
NSLog(#"Print the input source: %#", inputSource);
NSLog(#"Input the output video quality:");
scanf("%s", str);
NSString *quality = [NSString stringWithUTF8String:str];
NSLog(#"Print the quality: %#", quality);
NSLog(#"Input the output file path:");
scanf("%s", str);
NSString *outputPath = [NSString stringWithUTF8String:str];
NSLog(#"Print the output path: %#", outputPath);
exportVideo(inputSource, quality, outputPath);
}
return 0;
}
The usage of dispatch_semaphore_t works for my case. I got this idea from an answer on Stack Overflow. Thanks for all your help; I'm sharing the complete code above.
That is correct: you need to block your main function from returning until the asynchronous method is complete. The simplest way to do it is with a wait flag: either have a static boolean, or pass a boolean pointer into your export function, and sleep the main function while the boolean is false. This is a terrible practice though, and is only acceptable in this small situation since main does this one thing and nothing else. Don't do it in general.
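A minimal sketch of that wait-flag pattern, assuming exportVideo is changed so its completion handler can flip the flag (the flag name and poll interval are illustrative):
#import <Foundation/Foundation.h>
// Set by the export completion handler (illustrative; exportVideo would need
// access to this flag, e.g. as a static or via a pointer parameter).
static volatile BOOL exportFinished = NO;
int main(int argc, const char * argv[])
{
    @autoreleasepool {
        // exportVideo(...) would be called here; its completion handler
        // would set exportFinished = YES when the export ends.
        while (!exportFinished) {
            [NSThread sleepForTimeInterval:0.1]; // crude polling, as warned above
        }
    }
    return 0;
}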