I'm developing an iOS app. I need to capture video from the camera, record that video to a file, and also get the uncompressed frames, which is why I need to use both an AVCaptureMovieFileOutput and an AVCaptureVideoDataOutput.
I read this in Apple's documentation: "You can configure multiple inputs and outputs, coordinated by a single session." So I think it must be doable, but I'm having problems with it.
I'm adding both outputs to the session:
self.fileOutput.maxRecordedDuration = CMTimeMake(5000, 1); // value/timescale, i.e. 5000 seconds
self.fileOutput.minFreeDiskSpaceLimit = 3000;
if ([self.captureSession canAddOutput:self.fileOutput]) {
    [self.captureSession addOutput:self.fileOutput];
    NSLog(@"Added File Video Output");
} else {
    NSLog(@"Couldn't add movie file output");
}
if ([self.captureSession canAddOutput:videoOutput]) {
    [self.captureSession addOutput:videoOutput];
    NSLog(@"Added Data Video Output");
} else {
    NSLog(@"Couldn't add video data output");
}
I'm getting both "positive" confirmation messages. After that I call:
NSString *assetPath = [self createAssetFilePath:@"mov"];
NSURL *outputURL = [[NSURL alloc] initFileURLWithPath:assetPath];
[self.fileOutput startRecordingToOutputFileURL:outputURL recordingDelegate:self];
[self.captureSession startRunning];
And then I have my delegate function:
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error {
    NSLog(@"Output File URL: %@", outputFileURL);
    BOOL recordedSuccessfully = YES;
    if ([error code] != noErr) {
        NSLog(@"Error: %@", error);
        id value = [[error userInfo] objectForKey:AVErrorRecordingSuccessfullyFinishedKey];
        NSLog(@"Error: %@", value);
        if (value) {
            recordedSuccessfully = [value boolValue];
        }
    }
}
And I'm getting no error, but the AVCaptureVideoDataOutput delegate, which was working before I added the AVCaptureMovieFileOutput, no longer fires.
So, is it possible to do both? Any ideas?
Thanks!
The answer to this question, Simultaneous AVCaptureVideoDataOutput and AVCaptureMovieFileOutput, indicates that you cannot have an AVCaptureVideoDataOutput and an AVCaptureMovieFileOutput on your session simultaneously. Unfortunately, I can't verify this in the Apple documentation. My experience is that I no longer receive messages to my AVCaptureVideoDataOutput's sampleBufferDelegate after I add an AVCaptureMovieFileOutput to the session's outputs, which seemingly backs up that assertion.
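The usual workaround is to drop the AVCaptureMovieFileOutput entirely and write the movie file yourself from the frames you are already receiving, using an AVAssetWriter. A minimal sketch of that approach follows; the property names (assetWriter, writerInput), the output settings, and the dimensions are assumptions for illustration, not code from the question:

#import <AVFoundation/AVFoundation.h>

- (void)setupWriterWithURL:(NSURL *)outputURL {
    // Create a writer that compresses the incoming frames to H.264 in a .mov container.
    NSError *error = nil;
    self.assetWriter = [[AVAssetWriter alloc] initWithURL:outputURL
                                                 fileType:AVFileTypeQuickTimeMovie
                                                    error:&error];
    NSDictionary *settings = @{ AVVideoCodecKey  : AVVideoCodecH264,
                                AVVideoWidthKey  : @480,
                                AVVideoHeightKey : @360 };
    self.writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                          outputSettings:settings];
    self.writerInput.expectsMediaDataInRealTime = YES;
    [self.assetWriter addInput:self.writerInput];
}

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    // Process the uncompressed frame here as before, then hand it to the writer.
    if (self.assetWriter.status == AVAssetWriterStatusUnknown) {
        [self.assetWriter startWriting];
        [self.assetWriter startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
    }
    if (self.writerInput.isReadyForMoreMediaData) {
        [self.writerInput appendSampleBuffer:sampleBuffer];
    }
}

When recording should stop, call [self.writerInput markAsFinished] and then [self.assetWriter finishWritingWithCompletionHandler:...] to close the file.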
This method sets up the background session object.
- (void)downloadWithURL:(NSMutableArray *)urlArray
                pathArr:(NSMutableArray *)pathArr
              mediaInfo:(MediaInfo *)mInfo
{
    bgDownloadMediaInfo = mInfo;
    reqUrlCount = urlArray.count;
    dict = [NSDictionary dictionaryWithObjects:pathArr
                                       forKeys:urlArray];
    mutableDictionary = [dict mutableCopy];
    backgroundConfigurationObject = [NSURLSessionConfiguration backgroundSessionConfigurationWithIdentifier:@"XXXXX"];
    backgroundConfigurationObject.sessionSendsLaunchEvents = YES;
    backgroundConfigurationObject.discretionary = YES;
    backgroundSession = [NSURLSession sessionWithConfiguration:backgroundConfigurationObject
                                                      delegate:self
                                                 delegateQueue:[NSOperationQueue currentQueue]];
    self.requestUrl = [urlArray objectAtIndex:0];
    download = [backgroundSession downloadTaskWithURL:self.requestUrl];
    [download resume];
}
These are the delegate callbacks.
#pragma mark - NSURLSessionDownloadDelegate

- (void)URLSession:(NSURLSession *)session
      downloadTask:(NSURLSessionDownloadTask *)downloadTask
didFinishDownloadingToURL:(NSURL *)location
{
    LogDebug(@"Download complete for request url (%@)", downloadTask.currentRequest.URL);
    NSString *temp = [mutableDictionary objectForKey:downloadTask.currentRequest.URL];
    NSString *localPath = [NSString stringWithFormat:@"%@", temp];
    NSFileManager *fileManager = [NSFileManager defaultManager];
    NSURL *destinationURL = [NSURL fileURLWithPath:localPath];
    NSError *error = nil;
    [fileManager moveItemAtURL:location toURL:destinationURL error:&error];
    LogDebug(@"Moving download file at url : (%@) to : (%@)", downloadTask.currentRequest.URL, destinationURL);
    reqUrlCount--;
    downloadSegment++;
    // Hand over remaining download requests to the OS
    if ([finalUrlArr count] != 0) {
        // Remove the request that was just downloaded.
        [finalUrlArr removeObjectAtIndex:0];
        [finalPathArr removeObjectAtIndex:0];
        if ([finalUrlArr count] > 0) {
            // Proceed with the next request on top.
            self.requestUrl = [finalUrlArr objectAtIndex:0];
            download = [backgroundSession downloadTaskWithURL:self.requestUrl];
            [download resume];
        }
    }
    if ([adsArray count] > 0) {
        adsArrayCount--;
        // Delegate back once all the ad segments have been downloaded.
        if (adsArrayCount == 0) {
            for (int i = 0; i < [adsArray count]; i++) {
                NSArray *ads = [adsArray objectAtIndex:i];
                for (int j = 0; j < [ads count]; j++) {
                    MediaInfo *ad = [ads objectAtIndex:j];
                    [self setDownloadComplete:ad];
                    // Skip sending the downloadFinish delegate if the media is marked as downloadDone
                    if (!ad.downloadDone) {
                        [delegate MediaDownloadDidFinish:ad.mediaId error:NO];
                    }
                    ad.downloadDone = YES;
                }
            }
            downloadSegment = 0;
        }
    }
    // Delegate back once all the main media segments have been downloaded.
    if (reqUrlCount == 0) {
        [self setDownloadComplete:mediaInfo];
        state = DownloadState_Done;
        // Skip sending the downloadFinish delegate if the media is marked as downloadDone
        if (!bgDownloadMediaInfo.downloadDone) {
            [delegate MediaDownloadDidFinish:bgDownloadMediaInfo.mediaId error:NO];
        }
        bgDownloadMediaInfo.downloadDone = YES;
        [urlArr release];
        [pathArr release];
        [finalUrlArr release];
        [finalPathArr release];
        // Invalidate the NSURLSession once complete
        [backgroundSession invalidateAndCancel];
    }
}

- (void)URLSession:(NSURLSession *)session
              task:(NSURLSessionTask *)task
didCompleteWithError:(NSError *)error
{
    if (error) {
        NSLog(@"Failure to download request url (%@) with error (%@)", task.originalRequest.URL, error);
    }
}

- (void)URLSession:(NSURLSession *)session
      downloadTask:(NSURLSessionDownloadTask *)downloadTask
      didWriteData:(int64_t)bytesWritten
 totalBytesWritten:(int64_t)totalBytesWritten
totalBytesExpectedToWrite:(int64_t)totalBytesExpectedToWrite
{
    // Save the total downloaded size
    [self downloaderDidReceiveData:bytesWritten];
    // Enable this log only for debugging purposes.
    // LogDebug(@"totalBytesExpectedToWrite %llu, totalBytesWritten %llu, %@", totalBytesExpectedToWrite, totalBytesWritten, downloadTask.currentRequest.URL);
}
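As an aside, this didWriteData callback is also the natural place to report progress as a fraction; a small sketch, inside the same delegate method (LogDebug is the macro used above):

if (totalBytesExpectedToWrite > 0) {
    double fractionCompleted = (double)totalBytesWritten / (double)totalBytesExpectedToWrite;
    LogDebug(@"Download %.0f%% complete for (%@)", fractionCompleted * 100.0, downloadTask.currentRequest.URL);
}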
Without this code (beginBackgroundTaskWithExpirationHandler:), the download stops when the app is pushed into the background.
// AppDelegate_Phone.m
- (void)applicationDidEnterBackground:(UIApplication *)application
{
    NSLog(@"applicationDidEnterBackground");
    UIApplication *app = [UIApplication sharedApplication];
    UIBackgroundTaskIdentifier bgTask;
    bgTask = [app beginBackgroundTaskWithExpirationHandler:^{
        [app endBackgroundTask:bgTask];
    }];
}
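Note that, as written, bgTask is a local variable that the block captures by value before it has been assigned, and the task is only ever ended in the expiration handler. A more conventional pattern (a sketch; the bgTask property name and the helper method are assumptions) stores the identifier and ends the task when the work actually completes:

// Assumed property on the app delegate:
// @property (nonatomic) UIBackgroundTaskIdentifier bgTask;

- (void)applicationDidEnterBackground:(UIApplication *)application
{
    self.bgTask = [application beginBackgroundTaskWithExpirationHandler:^{
        // Time is up; end the task so the system doesn't terminate the app.
        [application endBackgroundTask:self.bgTask];
        self.bgTask = UIBackgroundTaskInvalid;
    }];
}

// Call this when the background work finishes:
- (void)endBackgroundTaskIfNeeded
{
    if (self.bgTask != UIBackgroundTaskInvalid) {
        [[UIApplication sharedApplication] endBackgroundTask:self.bgTask];
        self.bgTask = UIBackgroundTaskInvalid;
    }
}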
Have you implemented application:handleEventsForBackgroundURLSession:completionHandler: in your app delegate? That should save the completion handler and start the background session with the specified identifier.
If you don't implement that method, your app will not be informed if the download finishes after the app has been suspended (or subsequently terminated in the course of normal app lifecycle). As a result, it might look like the download didn't finish, even though it did.
(As an aside, note that if the user force-quits the app, that not only terminates the download, but obviously won't inform your app that the download was terminated until the user manually restarts the app at some later point and your app re-instantiates the background session. This is a second-order concern that you might not worry about until you get the main background processing working, but it's something to be aware of.)
Also, your URLSessionDidFinishEventsForBackgroundURLSession: must call that saved completion handler (and dispatch this to the main queue).
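A minimal sketch of those two pieces, assuming a backgroundSessionCompletionHandler property on the app delegate and a sharedDownloader singleton that re-creates the background session (both names are assumptions):

// AppDelegate_Phone.m
- (void)application:(UIApplication *)application
handleEventsForBackgroundURLSession:(NSString *)identifier
  completionHandler:(void (^)(void))completionHandler
{
    // Save the handler, and re-instantiate the background session with the
    // same identifier so the pending delegate messages can be delivered.
    self.backgroundSessionCompletionHandler = completionHandler;
    [MediaDownloader sharedDownloader];
}

// In the NSURLSession delegate:
- (void)URLSessionDidFinishEventsForBackgroundURLSession:(NSURLSession *)session
{
    AppDelegate_Phone *appDelegate = (AppDelegate_Phone *)[[UIApplication sharedApplication] delegate];
    void (^handler)(void) = appDelegate.backgroundSessionCompletionHandler;
    appDelegate.backgroundSessionCompletionHandler = nil;
    if (handler) {
        dispatch_async(dispatch_get_main_queue(), ^{
            handler();
        });
    }
}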
Also, your design looks like it will issue only one request at a time. (I'd advise against that, but let's just assume it is as you've outlined above.) So, let's imagine that you have issued the first request and the app is suspended before it's done. Then, when the download is done, the app is restarted in the background and handleEventsForBackgroundURLSession is called. Let's assume you fixed that to make sure it restarts the background session so that the various delegate methods can be called. Make sure that when you issue that second request for the second download that you use the existing background session, not instantiating a new one. You can have only one background session per identifier. Bottom line, the instantiation of the background session should be decoupled from downloadWithURL:pathArr:mediaInfo:. Only instantiate a background session once.
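For example, the session could be created lazily, exactly once, in its own accessor (a sketch; the @"XXXXX" identifier placeholder is from the question):

- (NSURLSession *)backgroundSession
{
    static NSURLSession *session = nil;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        NSURLSessionConfiguration *configuration =
            [NSURLSessionConfiguration backgroundSessionConfigurationWithIdentifier:@"XXXXX"];
        configuration.sessionSendsLaunchEvents = YES;
        session = [NSURLSession sessionWithConfiguration:configuration
                                                delegate:self
                                           delegateQueue:[NSOperationQueue mainQueue]];
    });
    return session;
}

downloadWithURL:pathArr:mediaInfo: would then just call [self backgroundSession] instead of creating a new session each time.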
Add "Required background modes" in your .plist
There, add the item "App downloads content from the network"
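In the raw Info.plist XML, "Required background modes" is the UIBackgroundModes array, and "App downloads content from the network" is the fetch value:

<key>UIBackgroundModes</key>
<array>
    <string>fetch</string>
</array>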
I'm using Apple's sample code for recording and playing the last recording, but I can't play the last recording.
Here is the code:
- (IBAction)playLastRecording {
    // Present the media player controller for the last recorded URL.
    NSDictionary *options = @{
        WKMediaPlayerControllerOptionsAutoplayKey : @YES
    };
    [self presentMediaPlayerControllerWithURL:self.lastRecordingURL options:options completion:^(BOOL didPlayToEnd, NSTimeInterval endTime, NSError * __nullable error) {
        if (!didPlayToEnd) {
            NSLog(@"The player did not play all the way to the end. The player only played until time - %.2f.", endTime);
        }
        if (error) {
            NSLog(@"There was an error with playback: %@.", error);
        }
    }];
}
I think we need to use NSBundle and NSURLConnection, but how do I use them with self.lastRecordingURL? Please show the correct code for this problem. Here is the error:
Optional(Error Domain=com.apple.watchkit.errors Code=4 "The operation could not be completed" UserInfo={NSLocalizedFailureReason=An unknown error occurred (1), NSUnderlyingError=0x17d9bf50 {Error Domain=NSPOSIXErrorDomain Code=1 "Operation not permitted"}, NSLocalizedDescription=The operation could not be completed})
I had this problem. Make sure that the audio data is saved to the App Group correctly, with the correct file extension.
If you are trying to record the audio file, use the following code.
- (void)startRecording
{
    // Create a path for saving the recorded audio file.
    // We have to write the file to the shared group folder, as this is the only place both the app and the extension can see.
    NSURL *container = [[NSFileManager defaultManager] containerURLForSecurityApplicationGroupIdentifier:APP_GROUP_IDENTIFIER];
    NSURL *outputURL = [container URLByAppendingPathComponent:@"AudioFile.m4a"];
    // Set the recorder options.
    NSDictionary *dictMaxAudioRec = @{@"WKAudioRecorderControllerOptionsMaximumDurationKey" : MAX_REC_DURATION};
    // Present the default audio recorder.
    [self presentAudioRecorderControllerWithOutputURL:outputURL preset:WKAudioRecorderPresetWideBandSpeech options:dictMaxAudioRec completion:^(BOOL didSave, NSError *error) {
        if (didSave)
        {
            // Successfully saved the file.
            NSURL *extensionDirectory = [[NSFileManager defaultManager] URLsForDirectory:NSDocumentDirectory inDomains:NSUserDomainMask].firstObject;
            NSUInteger timeAtRecording = (NSUInteger)[NSDate timeIntervalSinceReferenceDate];
            NSString *dirName = [NSString stringWithFormat:@"AudioFile%lu/", (unsigned long)timeAtRecording];
            NSURL *outputExtensionURL = [extensionDirectory URLByAppendingPathComponent:dirName];
            // Move the file to the new directory.
            NSError *moveError;
            BOOL success = [[NSFileManager defaultManager] moveItemAtURL:outputURL toURL:outputExtensionURL error:&moveError];
            if (!success) {
                NSLog(@"Failed to move the outputURL to the extension's documents directory: %@", moveError);
            }
            else {
                NSData *audioData = [NSData dataWithContentsOfURL:outputExtensionURL];
                NSLog(@"Actual Audio Data length: %lu", (unsigned long)audioData.length);
                if (audioData.length)
                {
                    // We have valid audio data; do whatever you want with it.
                }
            }
        }
        else
        {
            // Something went wrong.
            NSLog(@"%s - %@", __PRETTY_FUNCTION__, error);
        }
    }];
}
Or, if you are trying to play audio that you have downloaded from another source or passed from the paired phone, write the audio data to the App Group first, with a file extension. The following code may help with that.
- (void)writeAudioToAppGroupsWithData:(NSData *)audioData
{
    // Write the audio data to the App Group
    NSURL *containerURL = [[NSFileManager defaultManager] containerURLForSecurityApplicationGroupIdentifier:APP_GROUP_IDENTIFIER];
    containerURL = [containerURL URLByAppendingPathComponent:[NSString stringWithFormat:DIRECTORY_PATH]];
    [audioData writeToURL:containerURL atomically:YES];
}
In this case, make sure that your DIRECTORY_PATH is Library/Caches/filename.extension, e.g. Library/Caches/Audio.mp3.
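For instance, the macro might simply be defined as follows (an assumption about how DIRECTORY_PATH is declared, for illustration):

#define DIRECTORY_PATH @"Library/Caches/Audio.mp3"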
For playing the saved audio, use the following code.
- (void)playAudio
{
    // Play the audio from the URL using the default controller
    [self presentMediaPlayerControllerWithURL:[self getAudioUrl] options:nil completion:^(BOOL didPlayToEnd, NSTimeInterval endTime, NSError * _Nullable error) {
        NSLog(@"Error = %@", error);
    }];
}
You can get the audio URL from the App Group.
- (NSURL *)getAudioUrl
{
    // Get the audio URL from the App Group
    NSURL *container = [[NSFileManager defaultManager] containerURLForSecurityApplicationGroupIdentifier:APP_GROUP_IDENTIFIER];
    NSURL *outputURL = [container URLByAppendingPathComponent:[NSString stringWithFormat:DIRECTORY_PATH]];
    return outputURL;
}
Hope this will fix your issue.
I'm working on an application that uses the zxing library to read QR codes. I have a problem with ZXingWidgetController: when the view is shown while the application is in the background/not active (e.g. the screen is locked), the image from the camera is not shown on screen. Only the background is visible, and the scanner seems not to be working.
When I call the initCapture method again after a little delay, video from the camera is shown, but in this case, every time the application loses activity, I need to reinitialize the scanner. This behavior is not comfortable at all.
This bug can be reproduced in almost all applications that use ZXing, so I suppose it is a ZXing bug.
The ZXing initCapture method code is:
- (void)initCapture {
#if HAS_AVFF
    AVCaptureDeviceInput *captureInput =
        [AVCaptureDeviceInput deviceInputWithDevice:
                    [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo]
                                              error:nil];
    if (!captureInput)
    {
        NSLog(@"ERROR - CaptureInputNotInitialized");
    }
    AVCaptureVideoDataOutput *captureOutput = [[AVCaptureVideoDataOutput alloc] init];
    captureOutput.alwaysDiscardsLateVideoFrames = YES;
    if (!captureOutput)
    {
        NSLog(@"ERROR - CaptureOutputNotInitialized");
    }
    [captureOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];
    NSString *key = (NSString *)kCVPixelBufferPixelFormatTypeKey;
    NSNumber *value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
    NSDictionary *videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
    [captureOutput setVideoSettings:videoSettings];
    self.captureSession = [[[AVCaptureSession alloc] init] autorelease];
    self.captureSession.sessionPreset = AVCaptureSessionPresetMedium; // 480x360 on a 4
    if ([self.captureSession canAddInput:captureInput])
    {
        [self.captureSession addInput:captureInput];
    }
    else
    {
        NSLog(@"ERROR - cannot add input");
    }
    if ([self.captureSession canAddOutput:captureOutput])
    {
        [self.captureSession addOutput:captureOutput];
    }
    else
    {
        NSLog(@"ERROR - cannot add output");
    }
    [captureOutput release];
    if (!self.prevLayer)
    {
        [self.prevLayer release]; // note: this branch only runs when prevLayer is nil, so the release is a no-op
    }
    self.prevLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
    // NSLog(@"prev %p %@", self.prevLayer, self.prevLayer);
    self.prevLayer.frame = self.view.bounds;
    self.prevLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [self.view.layer addSublayer:self.prevLayer];
    [self.captureSession startRunning];
#endif
}
Maybe you guys know what is wrong?
I don't understand your question. If the application is in the background/not active, of course it can't work. You should make that clearer.
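One workaround worth trying (my assumption, not an official ZXing fix): rather than tearing the capture stack down and rebuilding it with initCapture, keep the session and just stop/start it around app activation, for example from the widget controller:

- (void)viewDidLoad {
    [super viewDidLoad];
    // Restart the existing session when the app becomes active again,
    // and stop it when the app resigns active (e.g. screen lock).
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(appDidBecomeActive:)
                                                 name:UIApplicationDidBecomeActiveNotification
                                               object:nil];
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(appWillResignActive:)
                                                 name:UIApplicationWillResignActiveNotification
                                               object:nil];
}

- (void)appWillResignActive:(NSNotification *)note {
    [self.captureSession stopRunning];
}

- (void)appDidBecomeActive:(NSNotification *)note {
    if (![self.captureSession isRunning]) {
        [self.captureSession startRunning];
    }
}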
I am using AVFoundation to capture video. All seems to be well in my app until I get to here:
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
      fromConnections:(NSArray *)connections
                error:(NSError *)error
{
    NSLog(@"didFinishRecordingToOutputFileAtURL - enter");
    BOOL RecordedSuccessfully = YES;
    if ([error code] != noErr)
    {
        // A problem occurred: Find out if the recording was successful.
        id value = [[error userInfo] objectForKey:AVErrorRecordingSuccessfullyFinishedKey];
        if (value)
        {
            RecordedSuccessfully = [value boolValue];
        }
    }
    if (RecordedSuccessfully)
    {
        //----- RECORDED SUCCESSFULLY -----
        NSLog(@"didFinishRecordingToOutputFileAtURL - success");
        ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
        if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:outputFileURL])
        {
            [library writeVideoAtPathToSavedPhotosAlbum:outputFileURL
                                        completionBlock:^(NSURL *assetURL, NSError *error)
            {
                if (error)
                {
                }
            }];
        }
        [library release];
    }
}
When running it on a device I receive the "didFinishRecordingToOutputFileAtURL - success" message, and then it crashes and I get this error:
Video /private/var/mobile/Applications/EDC62CED-3710-45B2-A658-A2FE9238F517/tmp/output.mov cannot be saved to the saved photos album: Error Domain=NSOSStatusErrorDomain Code=-12950 "Movie could not be played." UserInfo=0xe6749b0 {NSLocalizedDescription=Movie could not be played.}
I haven't been able to find a whole lot of information about this anywhere and am not really sure what's going on.
Here's the code with the temporary output URL:
NSString *outputPath = [[NSString alloc] initWithFormat:@"%@%@", NSTemporaryDirectory(), @"output.mov"];
NSURL *outputURL = [[NSURL alloc] initFileURLWithPath:outputPath];
NSFileManager *fileManager = [NSFileManager defaultManager];
if ([fileManager fileExistsAtPath:outputPath])
{
    NSError *error;
    if ([fileManager removeItemAtPath:outputPath error:&error] == NO)
    {
        // Error - handle if required
    }
}
[outputPath release];
// Start recording
[movieFileOutput startRecordingToOutputFileURL:outputURL recordingDelegate:self];
[outputURL release];
I've already tried releasing the outputURL elsewhere after it saves, and that didn't help, so I don't think that's it.
I figured this out. I was an idiot and kept overlooking the fact that I didn't actually connect the video input to the capture session. Whoops.
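For anyone hitting the same -12950 error, the missing step looks roughly like this (a sketch; the variable names are assumptions):

// Create and attach the camera input. Without it, the movie file output
// produces a file with no playable video track.
AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *inputError = nil;
AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice
                                                                         error:&inputError];
if (videoInput && [captureSession canAddInput:videoInput]) {
    [captureSession addInput:videoInput];
}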
So, I have a QTCaptureSession set up thusly:
// Setup Camera
cameraSession = [[QTCaptureSession alloc] init];
QTCaptureDevice *camera = [QTCaptureDevice deviceWithUniqueID:cameraID];
BOOL success = [camera open:&error];
if (!success || error)
{
    NSLog(@"Could not open device %@.", cameraID);
    NSLog(@"Error: %@", [error localizedDescription]);
    return nil;
}
// Setup Input Session
QTCaptureDeviceInput *cameraInput = [[QTCaptureDeviceInput alloc] initWithDevice:camera];
success = [cameraSession addInput:cameraInput error:&error];
if (!success || error)
{
    NSLog(@"Could not initialize input session.");
    NSLog(@"Error: %@", [error localizedDescription]);
    return nil;
}
// Setup Output
QTCaptureDecompressedVideoOutput *cameraOutput = [[QTCaptureDecompressedVideoOutput alloc] init];
[cameraOutput setDelegate:self];
success = [cameraSession addOutput:cameraOutput error:&error];
if (!success || error)
{
    NSLog(@"Could not initialize output session.");
    NSLog(@"Error: %@", [error localizedDescription]);
    return nil;
}
And the QTCaptureDecompressedVideoOutput delegate's captureOutput:didOutputVideoFrame:withSampleBuffer:fromConnection: thusly:
- (void)captureOutput:(QTCaptureOutput *)captureOutput didOutputVideoFrame:(CVImageBufferRef)videoFrame withSampleBuffer:(QTSampleBuffer *)sampleBuffer fromConnection:(QTCaptureConnection *)connection
{
    NSLog(@"starting convert\n");
}
I then start the capture processing using:
[cameraSession startRunning];
All of the variables initialize fine, and the session starts fine, but captureOutput:didOutputVideoFrame:withSampleBuffer:fromConnection: never gets called.
Context
This is a command-line app, compiled with GCC. It's linked against the following frameworks:
Foundation
Cocoa
QTKit
QuartzCore
Relevant Miscellany
The frame is not likely dropping because captureOutput:didDropVideoFrameWithSampleBuffer:fromConnection: is also not getting called.
So, with some help from Mike Ash, I managed to figure out that my program was terminating immediately and not waiting for the delegate callback (which, according to Apple's QTKit docs, might occur on a separate thread).
My solution was to add a BOOL property named captureIsFinished to my object, then add this to the main() function:
// Wait until capture is finished
while (![snap captureIsFinished])
{
    [[NSRunLoop currentRunLoop] runUntilDate:[NSDate dateWithTimeIntervalSinceNow:1]];
}
This effectively keeps the app's run loop alive in one-second increments, checking after each second whether the capture is finished before running for another second.
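Presumably captureIsFinished is then set from the delegate callback once the frame has been handled, along these lines (a sketch; assumes captureIsFinished is a readwrite BOOL property on the object):

- (void)captureOutput:(QTCaptureOutput *)captureOutput
  didOutputVideoFrame:(CVImageBufferRef)videoFrame
     withSampleBuffer:(QTSampleBuffer *)sampleBuffer
       fromConnection:(QTCaptureConnection *)connection
{
    // ... convert/save the frame ...
    self.captureIsFinished = YES;  // lets the while loop in main() exit
}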