I'm using Apple's sample code for recording and for playing back the last recording, but I can't play the last recording. Here is the code:
- (IBAction)playLastRecording {
    // Present the media player controller for the last recorded URL.
    NSDictionary *options = @{
        WKMediaPlayerControllerOptionsAutoplayKey : @YES
    };
    [self presentMediaPlayerControllerWithURL:self.lastRecordingURL options:options completion:^(BOOL didPlayToEnd, NSTimeInterval endTime, NSError * __nullable error) {
        if (!didPlayToEnd) {
            NSLog(@"The player did not play all the way to the end. The player only played until time - %.2f.", endTime);
        }
        if (error) {
            NSLog(@"There was an error with playback: %@.", error);
        }
    }];
}
And here is the error. I think we need to use NSBundle and NSURLConnection, but how do I use them with self.lastRecordingURL? Please write the correct code for this problem:
Optional(Error Domain=com.apple.watchkit.errors Code=4 "The operation could not be completed" UserInfo={NSLocalizedFailureReason=An unknown error occurred (1), NSUnderlyingError=0x17d9bf50 {Error Domain=NSPOSIXErrorDomain Code=1 "Operation not permitted"}, NSLocalizedDescription=The operation could not be completed})
I had this problem. Make sure that the audio data is saved to the App Group correctly, with the correct file extension. If you are trying to record the audio file, use the following code:
- (void)startRecording
{
    // Create a path for saving the recorded audio file.
    // We have to write the files to the shared App Group folder, as this is the only place both the app and the extension can see.
    NSURL *container = [[NSFileManager defaultManager] containerURLForSecurityApplicationGroupIdentifier:APP_GROUP_IDENTIFIER];
    NSURL *outputURL = [container URLByAppendingPathComponent:@"AudioFile.m4a"];
    // Set the recorder options.
    NSDictionary *dictMaxAudioRec = @{WKAudioRecorderControllerOptionsMaximumDurationKey : MAX_REC_DURATION};
    // Present the default audio recorder.
    [self presentAudioRecorderControllerWithOutputURL:outputURL preset:WKAudioRecorderPresetWideBandSpeech options:dictMaxAudioRec completion:^(BOOL didSave, NSError *error) {
        if (didSave)
        {
            // Successfully saved the file.
            NSURL *extensionDirectory = [[NSFileManager defaultManager] URLsForDirectory:NSDocumentDirectory inDomains:NSUserDomainMask].firstObject;
            NSUInteger timeAtRecording = (NSUInteger)[NSDate timeIntervalSinceReferenceDate];
            NSString *dirName = [NSString stringWithFormat:@"AudioFile%lu/", (unsigned long)timeAtRecording];
            NSURL *outputExtensionURL = [extensionDirectory URLByAppendingPathComponent:dirName];
            // Move the file to the new directory.
            NSError *moveError;
            BOOL success = [[NSFileManager defaultManager] moveItemAtURL:outputURL toURL:outputExtensionURL error:&moveError];
            if (!success) {
                NSLog(@"Failed to move the outputURL to the extension's documents directory: %@", moveError);
            }
            else {
                NSData *audioData = [NSData dataWithContentsOfURL:outputExtensionURL];
                NSLog(@"Actual audio data length: %lu", (unsigned long)audioData.length);
                if (audioData.length)
                {
                    // We have valid audio data; do whatever you want with it.
                }
            }
        }
        else
        {
            // Something went wrong.
            NSLog(@"%s - %@", __PRETTY_FUNCTION__, error);
        }
    }];
}
Or, if you are trying to play audio that you downloaded from another source or received from the paired phone, first write the audio data to the App Group with a file extension. The following code may help with that:
- (void)writeAudioToAppGroupsWithData:(NSData *)audioData
{
    // Write the audio data to the App Group.
    NSURL *containerURL = [[NSFileManager defaultManager] containerURLForSecurityApplicationGroupIdentifier:APP_GROUP_IDENTIFIER];
    containerURL = [containerURL URLByAppendingPathComponent:DIRECTORY_PATH];
    [audioData writeToURL:containerURL atomically:YES];
}
In this case, make sure that your DIRECTORY_PATH is Library/Caches/filename.extension, e.g. Library/Caches/Audio.mp3.
To play the saved audio, use the following code:
- (void)playAudio
{
    // Play the audio from the URL using the default controller.
    [self presentMediaPlayerControllerWithURL:[self getAudioUrl] options:nil completion:^(BOOL didPlayToEnd, NSTimeInterval endTime, NSError * _Nullable error) {
        NSLog(@"Error = %@", error);
    }];
}
You can get the audio url from the App Groups.
- (NSURL *)getAudioUrl
{
    // Get the audio URL from the App Group.
    NSURL *container = [[NSFileManager defaultManager] containerURLForSecurityApplicationGroupIdentifier:APP_GROUP_IDENTIFIER];
    NSURL *outputURL = [container URLByAppendingPathComponent:DIRECTORY_PATH];
    return outputURL;
}
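Note that APP_GROUP_IDENTIFIER, DIRECTORY_PATH, and MAX_REC_DURATION in the snippets above are placeholders the answer leaves undefined. A minimal sketch of how they might be defined, with hypothetical values you would replace with your own:

```objectivec
// Hypothetical values -- substitute your own App Group ID and file name.
#define APP_GROUP_IDENTIFIER @"group.com.example.myapp"
#define DIRECTORY_PATH       @"Library/Caches/Audio.mp3"
#define MAX_REC_DURATION     @30.0  // seconds, boxed so it can go in the options dictionary
```

The App Group identifier must match the one enabled in both the app's and the extension's entitlements.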
Hope this fixes your issue.
I'm working on recording a voice note, changing the pitch of the audio, and saving the result.
I call the method below after recording the voice, when the user taps the button to change the pitch. The new file is created, but I'm not able to hear anything when playing it back; the audio is generated without the voice. What could be the error?
-(void)saveEffectedAudioToFolder
{
    NSError *error;
    if (_audioEngine) {
        AVAudioUnitTimePitch *pitchEffect = [AVAudioUnitTimePitch new];
        pitchEffect.pitch = pitch;
        AVAudioFormat *commonFormat = [[AVAudioFormat alloc] initWithCommonFormat:AVAudioPCMFormatFloat32 sampleRate:44100 channels:2 interleaved:NO];
        AVAudioFile *outputFile = [[AVAudioFile alloc] initForWriting:[NSURL fileURLWithPath:_filePathDstaud] settings:commonFormat.settings error:&error];
        if (error) {
            NSLog(@"Error 1: %@", [error localizedDescription]);
        }
        [pitchEffect installTapOnBus:0 bufferSize:8192 format:commonFormat block:^(AVAudioPCMBuffer *buffer, AVAudioTime *when)
        // here is that error
        {
            if (outputFile.length < _audioFile.length)
            {
                // Lets us know when to stop saving the file; otherwise it keeps saving infinitely.
                NSError *error;
                NSAssert([outputFile writeFromBuffer:buffer error:&error], @"error writing buffer data to file, %@", [error localizedDescription]);
            } else {
                _audioEngine = nil;
                [pitchEffect removeTapOnBus:0]; // If we don't remove it, it will keep tapping infinitely.
                NSLog(@"Did you like it? Please, vote up for my question");
            }
        }];
    }
    NSString *documentsPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
    NSString *filePath = [documentsPath stringByAppendingPathComponent:name];
    [fileManager removeItemAtPath:filePath error:&error];
}
I know a few questions have been asked about the behavior of NSData's writeToFile:atomically: and how it is supposed to be a synchronous method. But in my experience it does not behave like that. I've included pseudocode of my method below; it accepts an array of file names and saves the files from the Camera Roll to the app's Documents folder.
The for loop runs through each item in the array and calls writeToFile. My NSLogs show that the log message after loop completion is displayed somewhere in the middle of the files being saved. For example, when I pass 4 images to this array, the NSLogs show that the first file was saved, then the log that says it reached the end of the for loop, and then the rest of the files being saved.
My question is: how do I make this synchronous, or set up a completion handler of some sort that will only get called when the file writes have completely finished? I eventually need to upload the files once they've been saved in the app's Documents folder. The rest of the logic is in place; I'm just having trouble finding the moment when saving of all files has completed.
I do not want to hack around this with a delay, because my file sizes and number of files can vary widely. Any guidance on this is much appreciated. Thanks.
// Each item in 'arrayOfSelectedMedia' holds a dictionary with the keys 'imageData', 'absolutePathToFileInCameraRoll' and 'filename'.
-(void) saveArrayOfFiles:(NSMutableArray *)arrayOfSelectedMedia
      toLocation:(NSString *)subFolder
      whenCompletePostNotificationTo:(NSString *)backToCaller {
    NSError *error;
    if (![[NSFileManager defaultManager] fileExistsAtPath:globalChatsMediaFolder]){
        [[NSFileManager defaultManager] createDirectoryAtPath:globalChatsMediaFolder withIntermediateDirectories:NO attributes:nil error:&error]; // Create folder
    }
    else {
        NSLog(@"folder %@ exists already", globalChatsMediaFolder);
    }
    NSString *completePath = [globalChatsMediaFolder stringByAppendingPathComponent:subFolder];
    if (![[NSFileManager defaultManager] fileExistsAtPath:completePath]){
        [[NSFileManager defaultManager] createDirectoryAtPath:completePath withIntermediateDirectories:NO attributes:nil error:&error]; // Create folder
        NSLog(@"creating folder %@", completePath);
    }
    else {
        NSLog(@"folder %@ exists already", completePath);
    }
    /******* Go through all files in the array and save each to its specific folder ***********/
    for (int i = 0; i < [arrayOfSelectedMedia count]; i++){
        NSString *saveFileAs = [NSString stringWithFormat:@"%@", [arrayOfSelectedMedia[i] objectForKey:@"filename"]];
        NSURL *absolutePathToSelectedFile = [arrayOfSelectedMedia[i] objectForKey:@"absolutePathToFileInCameraRoll"];
        // Create the assets library.
        ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
        // Try to load the asset at mediaURL.
        [library assetForURL:absolutePathToSelectedFile resultBlock:^(ALAsset *asset) {
            // If the asset exists
            if (asset) {
                ALAssetRepresentation *rep = [asset defaultRepresentation];
                Byte *buffer = (Byte *)malloc((NSUInteger)rep.size);
                NSUInteger buffered = [rep getBytes:buffer fromOffset:0 length:(NSUInteger)rep.size error:nil];
                NSData *dataToUpload = [NSData dataWithBytesNoCopy:buffer length:buffered freeWhenDone:YES];
                NSLog(@"%s, byte size of file: %lu bytes", __PRETTY_FUNCTION__, (unsigned long)[dataToUpload length]);
                NSLog(@"absolute path to asset is %@", asset.defaultRepresentation.url);
                NSString *absolutePathToFile = [completePath stringByAppendingPathComponent:saveFileAs];
                NSError *writeError = nil;
                BOOL writeStatus = [dataToUpload writeToFile:absolutePathToFile options:NSDataWritingAtomic error:&writeError];
                if (writeStatus){
                    NSLog(@"file %@ with %lu bytes was successfully saved locally", saveFileAs, (unsigned long)[dataToUpload length]);
                }
                else {
                    NSLog(@"write to local file failed: %@", writeError);
                }
            }
            else {
                NSLog(@"%s, file at path '%@' could not be found.", __PRETTY_FUNCTION__, absolutePathToSelectedFile);
            }
        } failureBlock:^(NSError *error) {
            // Reached on failure (e.g. when the user doesn't allow access in your app).
            NSLog(@"%s, failure to read file at path: %@. Check if the user has allowed access to the file location.", __PRETTY_FUNCTION__, absolutePathToSelectedFile);
        }];
    } // End of for loop.
    NSLog(@"Finished saving array of files to local disk. backToCaller provided as '%@'", backToCaller);
    if (backToCaller.length > 0){ // i.e. it was not nil and I need to post a notification
        NSLog(@"posting notification to go back to caller %@", backToCaller);
        [[NSNotificationCenter defaultCenter] postNotificationName:backToCaller object:nil userInfo:nil];
    }
}
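For what it's worth, writeToFile itself is synchronous; it is assetForURL: that delivers its result blocks asynchronously, which is why the post-loop log fires before the writes finish. One way to get a reliable completion point is a dispatch group that is entered before each asset load and left when its block finishes. A minimal sketch under that assumption (handleAllFilesSaved is a hypothetical callback, not part of the original code):

```objectivec
// Sketch: pair each asynchronous asset load with dispatch_group_enter/leave,
// then get notified once every result/failure block has run.
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
dispatch_group_t group = dispatch_group_create();
for (NSDictionary *item in arrayOfSelectedMedia) {
    dispatch_group_enter(group);
    [library assetForURL:[item objectForKey:@"absolutePathToFileInCameraRoll"]
             resultBlock:^(ALAsset *asset) {
                 // ... write the file to disk as in the method above ...
                 dispatch_group_leave(group);
             }
            failureBlock:^(NSError *error) {
                 dispatch_group_leave(group);
             }];
}
dispatch_group_notify(group, dispatch_get_main_queue(), ^{
    // All result/failure blocks have completed; it is now safe to start the upload.
    [self handleAllFilesSaved]; // hypothetical callback
});
```

The notification at the end of the original method could then move into the dispatch_group_notify block.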
My iCloud Core Data app was running great on iOS 7 and was ready to launch. When I test on iOS 8, I get the following error when trying to upload data to iCloud, and I can't seem to fix it.
I suspect my problem is related to how I am getting the documents directory, and to changes in the documents directory in iOS 8, but I just can't figure this out.
2014-10-12 15:14:17.862 XXXXXXX [4662:236693] __45-[PFUbiquityFilePresenter processPendingURLs]_block_invoke(439): CoreData: Ubiquity: Librarian returned a serious error for starting downloads Error Domain=BRCloudDocsErrorDomain Code=6 "The operation couldn’t be completed. (BRCloudDocsErrorDomain error 6 - Path is outside of any CloudDocs container, will never sync)" UserInfo=0x7f8b1a525f60 {NSDescription=Path is outside of any CloudDocs container, will never sync, NSFilePath=/Users/garyrea/Library/Developer/CoreSimulator/Devices/9AADFE8E-5ECC-4969-9418-57DA45B747C9/data/Containers/Data/Application/AD2E5E62-7295-4371-A08D-1790E8FCCD96/Documents/CoreDataUbiquitySupport/nobody~simA28745A4-A67F-598C-9260-F9AC36609ECF/iCloud/5B8BFA36-1ACA-4966-B7ED-A7344D36ACF1/container/nobody~simA28745A4-A67F-598C-9260-F9AC36609ECF/iCloud/2trlqdMQVpJ~wlEfiLvjWtQfrUJ8YiNCd84KW_xiw4A=/F0CF5F29-D437-4728-B0A2-C5BB90BBC239.1.cdt} with userInfo {
NSDescription = "Path is outside of any CloudDocs container, will never sync";
NSFilePath = "/Users/garyrea/Library/Developer/CoreSimulator/Devices/9AADFE8E-5ECC-4969-9418-57DA45B747C9/data/Containers/Data/Application/AD2E5E62-7295-4371-A08D-1790E8FCCD96/Documents/CoreDataUbiquitySupport/nobody~simA28745A4-A67F-598C-9260-F9AC36609ECF/iCloud/5B8BFA36-1ACA-4966-B7ED-A7344D36ACF1/container/nobody~simA28745A4-A67F-598C-9260-F9AC36609ECF/iCloud/2trlqdMQVpJ~wlEfiLvjWtQfrUJ8YiNCd84KW_xiw4A=/F0CF5F29-D437-4728-B0A2-C5BB90BBC239.1.cdt";
} for these urls: (
"file:///Users/garyrea/Library/Developer/CoreSimulator/Devices/9AADFE8E-5ECC-4969-9418-57DA45B747C9/data/Containers/Data/Application/AD2E5E62-7295-4371-A08D-1790E8FCCD96/Documents/CoreDataUbiquitySupport/nobody~simA28745A4-A67F-598C-9260-F9AC36609ECF/iCloud/5B8BFA36-1ACA-4966-B7ED-A7344D36ACF1/container/nobody~simA28745A4-A67F-598C-9260-F9AC36609ECF/iCloud/2trlqdMQVpJ~wlEfiLvjWtQfrUJ8YiNCd84KW_xiw4A=/F0CF5F29-D437-4728-B0A2-C5BB90BBC239.1.cdt"
)
My app delegate extension code where I create my persistent store is as follows. I have a seed database for first-time installation.
- (NSPersistentStoreCoordinator *)createPersistentStoreCoordinator{
    NSPersistentStoreCoordinator *persistentStoreCoordinator = nil;
    NSManagedObjectModel *managedObjectModel = [self createManagedObjectModel];
    persistentStoreCoordinator = [[NSPersistentStoreCoordinator alloc] initWithManagedObjectModel:managedObjectModel];
    NSURL *storeURL = [[self applicationDocumentsDirectory] URLByAppendingPathComponent:@"CoreData.sqlite"];
    if (![[NSFileManager defaultManager] fileExistsAtPath:[storeURL path]]){
        NSURL *preloadURL = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"SeedDatabase" ofType:@"sqlite"]];
        NSError *error = nil;
        if (![[NSFileManager defaultManager] copyItemAtURL:preloadURL toURL:storeURL error:&error]){
            NSLog(@"File couldn't save");
        }
    }
    NSUbiquitousKeyValueStore *kvStore = [NSUbiquitousKeyValueStore defaultStore];
    if (![kvStore boolForKey:@"SEEDED_DATA"]){
        NSLog(@"In the new database");
        NSURL *seedStoreURL = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"SeedDatabase" ofType:@"sqlite"]];
        NSError *seedStoreError;
        NSDictionary *seedStoreOptions = @{NSReadOnlyPersistentStoreOption: @YES};
        NSPersistentStore *seedStore = [persistentStoreCoordinator addPersistentStoreWithType:NSSQLiteStoreType configuration:nil URL:seedStoreURL options:seedStoreOptions error:&seedStoreError];
        NSDictionary *iCloudOptions = @{NSPersistentStoreUbiquitousContentNameKey: @"iCloud",
                                        NSMigratePersistentStoresAutomaticallyOption: @YES,
                                        NSInferMappingModelAutomaticallyOption: @YES
                                        };
        NSOperationQueue *queue = [[NSOperationQueue alloc] init];
        [queue addOperationWithBlock:^{
            NSError *error;
            [persistentStoreCoordinator migratePersistentStore:seedStore toURL:storeURL options:iCloudOptions withType:NSSQLiteStoreType error:&error];
            NSLog(@"Persistent store migrated");
            [kvStore setBool:YES forKey:@"SEEDED_DATA"];
            // [self checkForDuplicates];
        }];
    } else {
        NSError *error;
        NSDictionary *storeOptions = @{NSPersistentStoreUbiquitousContentNameKey: @"iCloud"};
        if (![persistentStoreCoordinator addPersistentStoreWithType:NSSQLiteStoreType
                                                      configuration:nil
                                                                URL:storeURL
                                                            options:storeOptions
                                                              error:&error]) {
            NSLog(@"Unresolved error %@, %@", error, [error userInfo]);
            abort();
        }
    }
    return persistentStoreCoordinator;
}
- (NSURL *)applicationDocumentsDirectory{
return [[[NSFileManager defaultManager] URLsForDirectory:NSDocumentDirectory inDomains:NSUserDomainMask] lastObject];
}
I was able to resolve this error by specifying the iCloud Drive directory (same name as the one on the developer.apple.com interface):
-(NSURL *)cloudDirectory
{
    NSFileManager *fileManager = [NSFileManager defaultManager];
    NSString *teamID = @"iCloud";
    NSString *bundleID = [[NSBundle mainBundle] bundleIdentifier];
    NSString *cloudRoot = [NSString stringWithFormat:@"%@.%@", teamID, bundleID];
    NSURL *cloudRootURL = [fileManager URLForUbiquityContainerIdentifier:cloudRoot];
    NSLog(@"cloudRootURL = %@", cloudRootURL);
    return cloudRootURL;
}
and including it in the iCloudOptions dictionary as the NSPersistentStoreUbiquitousContentURLKey:
NSDictionary *storeOptions = @{NSPersistentStoreUbiquitousContentNameKey: @"iCloud",
                               NSPersistentStoreUbiquitousContentURLKey: [self cloudDirectory],
                               };
I was getting some strange errors, so I removed the app from all devices, deleted the iCloud Drive file, and re-ran on an actual device, and it worked fine. I'm not sure whether it runs on iOS 7 now, but since I only added the NSPersistentStoreUbiquitousContentURLKey, I am pretty confident it should be fine.
I had the same issue while loading some test data.
Before the load, I was deleting all records.
To avoid the exception, a simple sleep(1) between cleaning and loading was enough.
I am using AVCaptureSession to capture video from a devices camera and then using AVAssetWriterInput and AVAssetTrack to compress/resize the video before uploading it to a server. The final videos will be viewed on the web via an html5 video element.
I'm running into multiple issues trying to get the orientation of the video correct. My app only supports landscape orientation and all captured videos should be in landscape orientation. However, I would like to allow the user to hold their device in either landscape direction (i.e. home button on either the left or the right hand side).
I am able to make the video preview show in the correct orientation with the following line of code
_previewLayer.connection.videoOrientation = UIDevice.currentDevice.orientation;
The problems start when processing the video via AVAssetWriterInput and friends. The result does not seem to account for the left vs. right landscape mode the video was captured in. In other words, sometimes the video comes out upside down. After some googling I found many people suggesting that the following line of code would solve this issue
writerInput.transform = videoTrack.preferredTransform;
...but this doesn't seem to work. After a bit of debugging I found that videoTrack.preferredTransform always has the same value, regardless of the orientation the video was captured in.
I tried manually tracking which orientation the video was captured in and setting writerInput.transform to CGAffineTransformMakeRotation(M_PI) as needed, which solved the problem!
...sorta.
When I viewed the results on the device, this solution worked as expected: videos were right-side-up regardless of left vs. right orientation while recording. Unfortunately, when I viewed the exact same videos in another browser (Chrome on a MacBook), they were all upside down.
What am I doing wrong?
EDIT
Here's some code, in case it's helpful...
-(void)compressFile:(NSURL *)inUrl
{
    NSString *fileName = [@"compressed." stringByAppendingString:inUrl.lastPathComponent];
    NSError *error;
    NSURL *outUrl = [PlatformHelper getFilePath:fileName error:&error];
    NSDictionary *compressionSettings = @{ AVVideoProfileLevelKey: AVVideoProfileLevelH264Main31,
                                           AVVideoAverageBitRateKey: [NSNumber numberWithInt:2500000],
                                           AVVideoMaxKeyFrameIntervalKey: [NSNumber numberWithInt:30] };
    NSDictionary *videoSettings = @{ AVVideoCodecKey: AVVideoCodecH264,
                                     AVVideoWidthKey: [NSNumber numberWithInt:1280],
                                     AVVideoHeightKey: [NSNumber numberWithInt:720],
                                     AVVideoScalingModeKey: AVVideoScalingModeResizeAspectFill,
                                     AVVideoCompressionPropertiesKey: compressionSettings };
    NSDictionary *videoOptions = @{ (id)kCVPixelBufferPixelFormatTypeKey: [NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange] };
    AVAssetWriterInput *writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];
    writerInput.expectsMediaDataInRealTime = YES;
    AVAssetWriter *assetWriter = [AVAssetWriter assetWriterWithURL:outUrl fileType:AVFileTypeMPEG4 error:&error];
    assetWriter.shouldOptimizeForNetworkUse = YES;
    [assetWriter addInput:writerInput];
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:inUrl options:nil];
    AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    // !!! This line does not work as expected and causes all sorts of issues (videos display sideways in some cases) !!!
    //writerInput.transform = videoTrack.preferredTransform;
    AVAssetReaderTrackOutput *readerOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack outputSettings:videoOptions];
    AVAssetReader *assetReader = [AVAssetReader assetReaderWithAsset:asset error:&error];
    [assetReader addOutput:readerOutput];
    [assetWriter startWriting];
    [assetWriter startSessionAtSourceTime:kCMTimeZero];
    [assetReader startReading];
    [writerInput requestMediaDataWhenReadyOnQueue:_processingQueue usingBlock:
     ^{
         /* snip */
     }];
}
The problem is that modifying the writerInput.transform property only adds a tag to the video file's metadata which instructs the video player to rotate the file during playback. That's why the videos play in the correct orientation on your device (I'm guessing they also play correctly in QuickTime).
The pixel buffers captured by the camera are still laid out in the orientation in which they were captured. Many video players will not check for the preferred orientation metadata tag and will just play the file in the native pixel orientation.
If you want the user to be able to record video holding the phone in either landscape mode, you need to rectify this at the AVCaptureSession level before compression by performing a transform on the CVPixelBuffer of each video frame. This Apple Q&A covers it (look at the AVCaptureVideoOutput documentation as well):
https://developer.apple.com/library/ios/qa/qa1744/_index.html
Investigating the link above is the correct way to solve your problem. An alternate fast n' dirty way to solve the same problem would be to lock the recording UI of your app into only one landscape orientation and then to rotate all of your videos server-side using ffmpeg.
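To sketch the server-side ffmpeg route for the upside-down case described above (file names here are placeholders), a 180° rotation can be done with the hflip/vflip filters:

```
# Rotate a video 180 degrees by flipping it horizontally and vertically;
# the video stream is re-encoded, the audio stream is copied unchanged.
ffmpeg -i input.mp4 -vf "hflip,vflip" -c:a copy output.mp4
```

You would only run this for uploads whose tracked capture orientation requires it.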
In case it's helpful for anyone, here's the code I ended up with. I ended up having to do the work on the video as it was being captured instead of as a post processing step. This is a helper class that manages the capture.
Interface
#import <Foundation/Foundation.h>
#import <AVFoundation/AVFoundation.h>
@interface VideoCaptureManager : NSObject<AVCaptureVideoDataOutputSampleBufferDelegate>
{
AVCaptureSession* _captureSession;
AVCaptureVideoPreviewLayer* _previewLayer;
AVCaptureVideoDataOutput* _videoOut;
AVCaptureDevice* _videoDevice;
AVCaptureDeviceInput* _videoIn;
dispatch_queue_t _videoProcessingQueue;
AVAssetWriter* _assetWriter;
AVAssetWriterInput* _writerInput;
BOOL _isCapturing;
NSString* _gameId;
NSString* _authToken;
}
-(void)setSettings:(NSString*)gameId authToken:(NSString*)authToken;
-(void)setOrientation:(AVCaptureVideoOrientation)orientation;
-(AVCaptureVideoPreviewLayer*)getPreviewLayer;
-(void)startPreview;
-(void)stopPreview;
-(void)startCapture;
-(void)stopCapture;
@end
Implementation (w/ a bit of editing and a few little TODO's)
@implementation VideoCaptureManager
-(id)init;
{
self = [super init];
if (self) {
NSError* error;
_videoProcessingQueue = dispatch_queue_create("VideoQueue", DISPATCH_QUEUE_SERIAL);
_captureSession = [AVCaptureSession new];
_videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
_previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:_captureSession];
[_previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
_videoOut = [AVCaptureVideoDataOutput new];
_videoOut.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey: [NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange] };
_videoOut.alwaysDiscardsLateVideoFrames = YES;
_videoIn = [AVCaptureDeviceInput deviceInputWithDevice:_videoDevice error:&error];
// handle errors here
[_captureSession addInput:_videoIn];
[_captureSession addOutput:_videoOut];
}
return self;
}
-(void)setOrientation:(AVCaptureVideoOrientation)orientation;
{
_previewLayer.connection.videoOrientation = orientation;
for (AVCaptureConnection* item in _videoOut.connections) {
item.videoOrientation = orientation;
}
}
-(AVCaptureVideoPreviewLayer*)getPreviewLayer;
{
return _previewLayer;
}
-(void)startPreview;
{
[_captureSession startRunning];
}
-(void)stopPreview;
{
[_captureSession stopRunning];
}
-(void)startCapture;
{
if (_isCapturing) return;
NSURL* url = nil; // put code here to create your output url
NSDictionary* compressionSettings = @{ AVVideoProfileLevelKey: AVVideoProfileLevelH264Main31,
                                       AVVideoAverageBitRateKey: [NSNumber numberWithInt:2500000],
                                       AVVideoMaxKeyFrameIntervalKey: [NSNumber numberWithInt:1],
                                     };
NSDictionary* videoSettings = @{ AVVideoCodecKey: AVVideoCodecH264,
                                 AVVideoWidthKey: [NSNumber numberWithInt:1280],
                                 AVVideoHeightKey: [NSNumber numberWithInt:720],
                                 AVVideoScalingModeKey: AVVideoScalingModeResizeAspectFill,
                                 AVVideoCompressionPropertiesKey: compressionSettings
                               };
_writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];
_writerInput.expectsMediaDataInRealTime = YES;
NSError* error;
_assetWriter = [AVAssetWriter assetWriterWithURL:url fileType:AVFileTypeMPEG4 error:&error];
// handle errors
_assetWriter.shouldOptimizeForNetworkUse = YES;
[_assetWriter addInput:_writerInput];
[_videoOut setSampleBufferDelegate:self queue:_videoProcessingQueue];
_isCapturing = YES;
}
-(void)stopCapture;
{
if (!_isCapturing) return;
[_videoOut setSampleBufferDelegate:nil queue:nil]; // TODO: there seems to be a race condition between this line and the next (we could end up trying to write a buffer after calling writingFinished).
dispatch_async(_videoProcessingQueue, ^{
[_assetWriter finishWritingWithCompletionHandler:^{
[self writingFinished];
}];
});
}
-(void)writingFinished;
{
// TODO: need to check _assetWriter.status to make sure everything completed successfully
// do whatever post processing you need here
}
-(void)captureOutput:(AVCaptureOutput*)captureOutput didDropSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection*)connection;
{
NSLog(@"Video frame was dropped.");
}
-(void)captureOutput:(AVCaptureOutput*)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
if(_assetWriter.status != AVAssetWriterStatusWriting) {
CMTime lastSampleTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
[_assetWriter startWriting]; // TODO: need to check the return value (a bool)
[_assetWriter startSessionAtSourceTime:lastSampleTime];
}
if (!_writerInput.readyForMoreMediaData || ![_writerInput appendSampleBuffer:sampleBuffer]) {
NSLog(#"Failed to write video buffer to output.");
}
}
@end
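For completeness, a hypothetical usage of this helper from a view controller might look like the following (the view and orientation values are illustrative, not part of the original answer):

```objectivec
// Sketch: wire the capture manager into a view controller.
VideoCaptureManager *manager = [[VideoCaptureManager alloc] init];
[manager setSettings:gameId authToken:authToken];
[manager setOrientation:AVCaptureVideoOrientationLandscapeRight];

AVCaptureVideoPreviewLayer *preview = [manager getPreviewLayer];
preview.frame = self.view.bounds;
[self.view.layer addSublayer:preview];

[manager startPreview];   // show live camera
[manager startCapture];   // begin writing frames
// ... later, e.g. when the user taps "stop" ...
[manager stopCapture];    // finishes the file and calls writingFinished
```

Setting the orientation on every AVCaptureConnection before capture starts is what makes the written pixel buffers come out right-side-up, so no transform metadata is needed.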
For compressing/resizing the video, we can use AVAssetExportSession.
We can upload a video of up to 3.30 minutes in duration; if the video is longer than 3.30 minutes, it will show a memory warning.
Since we are not applying any transform to the video here, the video stays as it was while recording.
Below is sample code for compressing the video. We can check the video size before and after compression.
-(void)trimVideoWithURL:(NSURL *)inputURL{
    NSString *path1 = [inputURL path];
    NSData *data = [[NSFileManager defaultManager] contentsAtPath:path1];
    NSLog(@"size before compressing video is %lu", (unsigned long)data.length);
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:inputURL options:nil];
    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPreset640x480];
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *outputURL = paths[0];
    NSFileManager *manager = [NSFileManager defaultManager];
    [manager createDirectoryAtPath:outputURL withIntermediateDirectories:YES attributes:nil error:nil];
    outputURL = [outputURL stringByAppendingPathComponent:@"output.mp4"];
    fullPath = [NSURL fileURLWithPath:outputURL];
    // Remove any existing file.
    [manager removeItemAtPath:outputURL error:nil];
    exportSession.outputURL = [NSURL fileURLWithPath:outputURL];
    exportSession.shouldOptimizeForNetworkUse = YES;
    exportSession.outputFileType = AVFileTypeQuickTimeMovie;
    CMTime start = CMTimeMakeWithSeconds(1.0, 600);
    CMTime duration = CMTimeMakeWithSeconds(1.0, 600);
    CMTimeRange range = CMTimeRangeMake(start, duration);
    exportSession.timeRange = range;
    [exportSession exportAsynchronouslyWithCompletionHandler:^(void)
     {
         switch (exportSession.status) {
             case AVAssetExportSessionStatusCompleted: {
                 NSString *path = [fullPath path];
                 NSData *data = [[NSFileManager defaultManager] contentsAtPath:path];
                 NSLog(@"size after compressing video is %lu", (unsigned long)data.length);
                 NSLog(@"Export complete %d %@", exportSession.status, exportSession.error);
                 /*
                  Do your necessary stuff here after compression.
                  */
             }
                 break;
             case AVAssetExportSessionStatusFailed:
                 NSLog(@"Failed: %@", exportSession.error);
                 break;
             case AVAssetExportSessionStatusCancelled:
                 NSLog(@"Canceled: %@", exportSession.error);
                 break;
             default:
                 break;
         }
     }];
}
I am using AVFoundation to capture video. All seems to be well in my app until I get to here:
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
      fromConnections:(NSArray *)connections
                error:(NSError *)error
{
    NSLog(@"didFinishRecordingToOutputFileAtURL - enter");
    BOOL recordedSuccessfully = YES;
    if ([error code] != noErr)
    {
        // A problem occurred: find out if the recording was successful.
        id value = [[error userInfo] objectForKey:AVErrorRecordingSuccessfullyFinishedKey];
        if (value)
        {
            recordedSuccessfully = [value boolValue];
        }
    }
    if (recordedSuccessfully)
    {
        //----- RECORDED SUCCESSFULLY -----
        NSLog(@"didFinishRecordingToOutputFileAtURL - success");
        ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
        if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:outputFileURL])
        {
            [library writeVideoAtPathToSavedPhotosAlbum:outputFileURL
                                        completionBlock:^(NSURL *assetURL, NSError *error)
            {
                if (error)
                {
                }
            }];
        }
        [library release];
    }
}
When running it on a device I receive the "didFinishRecordingToOutputFileAtURL - success" message, and then it crashes and I get this error:
Video /private/var/mobile/Applications/EDC62CED-3710-45B2-A658-A2FE9238F517/tmp/output.mov cannot be saved to the saved photos album: Error Domain=NSOSStatusErrorDomain Code=-12950 "Movie could not be played." UserInfo=0xe6749b0 {NSLocalizedDescription=Movie could not be played.}
I haven't been able to find a whole lot of information about this anywhere and am not really sure what's going on.
Here's the code with the temporary output URL:
NSString *outputPath = [[NSString alloc] initWithFormat:@"%@%@", NSTemporaryDirectory(), @"output.mov"];
NSURL *outputURL = [[NSURL alloc] initFileURLWithPath:outputPath];
NSFileManager *fileManager = [NSFileManager defaultManager];
if ([fileManager fileExistsAtPath:outputPath])
{
    NSError *error;
    if ([fileManager removeItemAtPath:outputPath error:&error] == NO)
    {
        // Error - handle if required.
    }
}
[outputPath release];
// Start recording.
[movieFileOutput startRecordingToOutputFileURL:outputURL recordingDelegate:self];
[outputURL release];
I've already tried releasing the outputURL elsewhere after it saves, and that didn't help, so I don't think that's it.
I figured this out. I was an idiot and kept overlooking the fact that I didn't actually connect the video input to the capture session. Whoops.