QTCaptureOutput.delegate captureOutput:didOutputVideoFrame:... never called - objective-c

Source
So, I have a QTCaptureSession set up thusly:
//Setup Camera
cameraSession = [[QTCaptureSession alloc] init];
QTCaptureDevice *camera = [QTCaptureDevice deviceWithUniqueID: cameraID];
BOOL success = [camera open: &error];
if (!success || error)
{
NSLog(#"Could not open device %#.", cameraID);
NSLog(#"Error: %#", [error localizedDescription]);
return nil;
}
//Setup Input Session
QTCaptureDeviceInput *cameraInput = [[QTCaptureDeviceInput alloc] initWithDevice: camera];
success = [cameraSession addInput: cameraInput error: &error];
if (!success || error)
{
NSLog(#"Could not initialize input session.");
NSLog(#"Error: %#", [error localizedDescription]);
return nil;
}
//Setup Output
QTCaptureDecompressedVideoOutput *cameraOutput = [[QTCaptureDecompressedVideoOutput alloc] init];
[cameraOutput setDelegate: self];
success = [cameraSession addOutput: cameraOutput error: &error];
if (!success || error)
{
NSLog(#"Could not initialize output session.");
NSLog(#"Error: %#", [error localizedDescription]);
return nil;
}
And the QTCaptureDecompressedVideoOutput delegate's captureOutput:didOutputVideoFrame:withSampleBuffer:fromConnection: is implemented like so:
- (void)captureOutput:(QTCaptureOutput *)captureOutput didOutputVideoFrame:(CVImageBufferRef)videoFrame withSampleBuffer:(QTSampleBuffer *)sampleBuffer fromConnection:(QTCaptureConnection *)connection
{
NSLog(#"starting convert\n");
}
I then start the capture processing using:
[cameraSession startRunning];
All of the variables initialize fine, and the session starts fine, but captureOutput:didOutputVideoFrame:withSampleBuffer:fromConnection: never gets called.
Context
This is a command-line app, compiled with GCC. It's linked against the following frameworks:
Foundation
Cocoa
QTKit
QuartzCore
Relevant Miscellany
Frames are most likely not being dropped, since captureOutput:didDropVideoFrameWithSampleBuffer:fromConnection: is also never called.

So, with some help from Mike Ash, I managed to figure out that my program was terminating immediately and not waiting for the delegate callback (which, according to Apple's QTKit docs, might occur on a separate thread).
My solution was to add a BOOL property named captureIsFinished to my object, then add this to the main() function:
//Wait Until Capture is Finished
while (![snap captureIsFinished])
{
[[NSRunLoop currentRunLoop] runUntilDate: [NSDate dateWithTimeIntervalSinceNow: 1]];
}
This effectively runs the app's run loop for one second, checks whether the capture is finished, and if not runs it for another second.
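For completeness, here is a minimal sketch of how the delegate callback might flag completion so that the wait loop in main() can exit. The captureIsFinished property is the one described above; the frame handling itself is just a placeholder.
- (void)captureOutput:(QTCaptureOutput *)captureOutput didOutputVideoFrame:(CVImageBufferRef)videoFrame withSampleBuffer:(QTSampleBuffer *)sampleBuffer fromConnection:(QTCaptureConnection *)connection
{
    // ... convert or save the frame here (placeholder) ...
    // Flag completion so the while loop in main() stops spinning the run loop.
    [self setCaptureIsFinished: YES];
}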

Related

How to pause screen recording in AVFoundation

I have to implement pause functionality while recording the screen with AVFoundation in Cocoa.
To start recording I am using this function:
-(void)takeScreenRecording:(CGRect)rect saveAtPath:(NSURL*)destPath {
// Create a capture session
mSession = [[AVCaptureSession alloc] init];
// Set the session preset as you wish
mSession.sessionPreset = AVCaptureSessionPresetPhoto;
CGDirectDisplayID displayId = kCGDirectMainDisplay;
// Create a ScreenInput with the display and add it to the session
AVCaptureScreenInput *input =
[[AVCaptureScreenInput alloc] initWithDisplayID:displayId];
[input setCropRect:rect];
if (!input) {
mSession = nil;
return;
}
if ([mSession canAddInput:input])
[mSession addInput:input];
// Create a MovieFileOutput and add it to the session
mMovieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
if ([mSession canAddOutput:mMovieFileOutput])
[mSession addOutput:mMovieFileOutput];
// Start running the session
[mSession startRunning];
// Delete any existing movie file first
if ([[NSFileManager defaultManager] fileExistsAtPath:[destPath path]])
{
NSError *err;
if ( ![[NSFileManager defaultManager] removeItemAtPath:[destPath path]
error:&err] )
{
NSLog(#"Error deleting existing movie %#",[err localizedDescription]);
}
}
[mMovieFileOutput startRecordingToOutputFileURL:destPath
recordingDelegate:self];
}
To stop screen recording I am using these functions:
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
fromConnections:(NSArray *)connections
error:(NSError *)error {
NSLog(#"Did finish recording to %# due to error %#",
[outputFileURL description], [error description]);
[mSession stopRunning];
mSession = nil;
}
-(void)finishRecord {
// Stop recording to the destination movie file
NSLog(#"Stopping record");
[mMovieFileOutput stopRecording];
}
But I can't work out the logic for pause functionality. Can anyone tell me how I can pause and resume while recording the screen?
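One avenue worth checking (a sketch only, not verified against this exact setup): on OS X, AVCaptureFileOutput declares -pauseRecording and -resumeRecording, so pause and resume may reduce to something like the following. The method names pauseRecord and resumeRecord are just illustrative.
// Sketch: relies on AVCaptureFileOutput's -pauseRecording / -resumeRecording,
// which are available on OS X; untested against the code above.
-(void)pauseRecord {
    [mMovieFileOutput pauseRecording];
}
-(void)resumeRecord {
    [mMovieFileOutput resumeRecording];
}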

Can I have AVCaptureVideoDataOutput and AVCaptureMovieFileOutput in the same session?

I'm developing an app in iOS. I need to capture video from the camera, record that video to a file, and also get the uncompressed frames, which is why I need to use both AVCaptureOutput...
I read this in Apple's documentation: "You can configure multiple inputs and outputs, coordinated by a single session." So I think it must be doable, but I'm having problems with it...
I'm setting both to the session doing:
self.fileOutput.maxRecordedDuration = CMTimeMake(5000, 1);
self.fileOutput.minFreeDiskSpaceLimit = 3000;
if([self.captureSession canAddOutput:self.fileOutput]){
[self.captureSession addOutput:self.fileOutput];
NSLog(#"Added File Video Output");
}else{
NSLog(#"Couldn't add video output");
}
if ([self.captureSession canAddOutput:videoOutput]){
[self.captureSession addOutput:videoOutput];
NSLog(#"Added Data Video Output");
}else{
NSLog(#"Couldn't add video output");
}
I'm getting both 'positive' confirmation messages. After that I'm calling:
NSString *assetPath = [self createAssetFilePath:@"mov"];
NSURL *outputURL = [[NSURL alloc] initFileURLWithPath:assetPath];
[self.fileOutput startRecordingToOutputFileURL:outputURL recordingDelegate:self];
[self.captureSession startRunning];
And then I have my delegate function:
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error {
NSLog(#"Output File URL: %# ", outputFileURL);
BOOL recordedSuccessfully = YES;
if ([error code] != noErr) {
NSLog(#"Error: %#", error);
id value = [[error userInfo] objectForKey:AVErrorRecordingSuccessfullyFinishedKey];
NSLog(#"Error: %#", value);
if (value) {
recordedSuccessfully = [value boolValue];
}
}
}
And I'm getting no error, but the AVCaptureVideoDataOutput was working before I added the AVCaptureMovieFileOutput, and now it's not...
So... is it possible to do both?! Any idea?!
Thanks!
The answer to this question: Simultaneous AVCaptureVideoDataOutput and AVCaptureMovieFileOutput indicates that you cannot have an AVCaptureVideoDataOutput and an AVCaptureMovieFileOutput attached to your session simultaneously. Unfortunately, I can't verify this in the Apple documentation. My experience is that I no longer receive messages to my AVCaptureVideoDataOutput's sampleBufferDelegate after I add an AVCaptureMovieFileOutput to the session's outputs, which seems to back up that assertion.
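If both a recorded file and uncompressed frames are genuinely needed, a commonly suggested workaround (sketched here, not verified against this code) is to keep only the AVCaptureVideoDataOutput and write the movie yourself with AVAssetWriter from the same sample buffers. The assetWriterInput property is assumed to have been set up elsewhere with the desired output URL and settings.
// Sketch: one AVCaptureVideoDataOutput serves both purposes; its sample
// buffers are forwarded to an AVAssetWriterInput (configured elsewhere).
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    // ... use the uncompressed frame here ...
    if (self.assetWriterInput.isReadyForMoreMediaData) {
        [self.assetWriterInput appendSampleBuffer:sampleBuffer];
    }
}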

Error trying to assigning __block ALAsset from inside assetForURL:resultBlock:

I am trying to create a method that will return an ALAsset for a given asset URL. (I need to upload the asset later and want to do that outside the result block.)
+ (ALAsset*) assetForPhoto:(Photo*)photo
{
ALAssetsLibrary* library = [[[ALAssetsLibrary alloc] init] autorelease];
__block ALAsset* assetToReturn = nil;
NSURL* url = [NSURL URLWithString:photo.assetUrl];
NSLog(#"assetForPhoto: %#[", url);
[library assetForURL:url resultBlock:^(ALAsset *asset)
{
NSLog(#"asset: %#", asset);
assetToReturn = asset;
NSLog(#"asset: %# %d", assetToReturn, [assetToReturn retainCount]);
} failureBlock:^(NSError *error)
{
assetToReturn = nil;
}];
NSLog(#"assetForPhoto: %#]", url);
NSLog(#"assetToReturn: %#", assetToReturn); // Invalid access exception coming here.
return assetToReturn;
}
The problem is assetToReturn gives an EXC_BAD_ACCESS.
Is there some problem with assigning to pointers from inside the block? The examples of blocks I've seen always use simple types like integers, etc.
A few things:
You must keep the ALAssetsLibrary instance around that created the ALAsset for as long as you use the asset.
You must register an observer for ALAssetsLibraryChangedNotification; when it is received, any ALAssets you hold, and any other AssetsLibrary objects, will need to be refetched, as they will no longer be valid. This can happen at any time.
You shouldn't expect -assetForURL:resultBlock:failureBlock:, or any of the AssetsLibrary methods that take a failureBlock:, to be synchronous. They may need to prompt the user for access to the library and will not always execute their blocks immediately. It's better to put actions that need to happen on success in the success block itself.
If you absolutely must make this method synchronous in your app (which I'd advise against), you'll need to wait on a semaphore after calling assetForURL:resultBlock:failureBlock:, and additionally spin the run loop if you would otherwise block the main thread.
The following implementation should work as a synchronous call in all situations, but really, you should try very hard to make your code asynchronous instead.
- (ALAsset *)assetForURL:(NSURL *)url {
__block ALAsset *result = nil;
__block NSError *assetError = nil;
dispatch_semaphore_t sema = dispatch_semaphore_create(0);
[[self assetsLibrary] assetForURL:url resultBlock:^(ALAsset *asset) {
result = [asset retain];
dispatch_semaphore_signal(sema);
} failureBlock:^(NSError *error) {
assetError = [error retain];
dispatch_semaphore_signal(sema);
}];
if ([NSThread isMainThread]) {
while (!result && !assetError) {
[[NSRunLoop currentRunLoop] runMode:NSDefaultRunLoopMode beforeDate:[NSDate distantFuture]];
}
}
else {
dispatch_semaphore_wait(sema, DISPATCH_TIME_FOREVER);
}
dispatch_release(sema);
[assetError release];
return [result autorelease];
}
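For illustration, a call site (using the question's Photo object) might then look like this; uploadAsset: is a placeholder for whatever upload code follows.
// Hypothetical usage; uploadAsset: stands in for the caller's own upload logic.
ALAsset *asset = [self assetForURL:[NSURL URLWithString:photo.assetUrl]];
if (asset) {
    [self uploadAsset:asset];
}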
You should retain and autorelease the asset:
// ...
assetToReturn = [asset retain];
// ...
return [assetToReturn autorelease];

addInput method of QTCaptureSession not returning

I have the following code:
BOOL success;
QTCaptureSession *session = [[QTCaptureSession alloc] init];
QTCaptureDevice *device = [QTCaptureDevice defaultInputDeviceWithMediaType: QTMediaTypeVideo];
success = [device open: &e];
if ( !success )
{
NSLog(#"error opening input device: %#", e);
return;
}
QTCaptureDeviceInput *input = [QTCaptureDeviceInput deviceInputWithDevice: device];
success = [session addInput: input error: &e];
if ( !success )
{
NSLog(#"error adding input device to session: %#", e);
return;
}
QTCaptureDecompressedVideoOutput *output = [[QTCaptureDecompressedVideoOutput alloc] init];
[output setDelegate: self];
success = [session addOutput: output error: &e];
if ( !success )
{
NSLog(#"error adding output device to session: %#", e);
return;
}
[session startRunning];
This is located in a bundle loaded at runtime and is part of a method used as the selector of an NSThread (i.e., it runs in a background thread, not the main thread).
My problem is that the call to addInput:error: never returns. What am I missing here?
The problem was that the main thread of the application was not using the run loop normally provided by NSApplicationMain() in Cocoa applications. Instead, I was handling events within my own while loop.
The solution was calling:
CFRunLoopRunInMode(kCFRunLoopDefaultMode, 0, YES);
See this thread on Apple's QuickTime mailing list for a more detailed explanation of the problem.
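In context, the fix amounts to pumping the default run loop from inside the custom event loop, roughly like this (a sketch; keepRunning and the event handling are placeholders):
// Sketch of a hand-rolled main loop that also services the default run loop,
// so QTKit's internal callbacks (and -addInput:error:) can make progress.
while (keepRunning) {
    // ... handle application events here ...
    CFRunLoopRunInMode(kCFRunLoopDefaultMode, 0, YES);
}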

Can't receive NSInputStream events in OCUnitTest

I'm trying to learn how to use the NSInputStream class on the iPhone using a unit test. I can get the NSStream to read data from a file using the polling method, but for some reason the delegate/event method is not working.
I've posted the relevant code below. Please ignore memory leak errors and such since I'm just trying to ensure I know how to use the NSStream class in a sandboxed environment before rolling it into my larger project.
I'm wondering if maybe I'm missing something with regards to how the run loops work?
This is the logic test that creates a streamer class to read from a file.
#import "StreamingTests.h"
#import "Streamer.h"
@implementation StreamingTests
- (void) testStream {
NSLog(#"Starting stream test.");
Streamer * streamer = [[Streamer alloc] init];
streamer.usePolling = NO;
streamer.readingStream = YES;
NSThread * readThread = [[NSThread alloc] initWithTarget:streamer selector:@selector(startStreamRead:) object:nil];
[readThread start];
while(streamer.readingStream) {
[NSThread sleepForTimeInterval:0.5];
}
[readThread cancel];
}
@end
This is a simple test helper object that reads from an NSStream. When usePolling == YES it reads data and outputs the appropriate NSLog messages. However, if usePolling == NO the delegate's stream event function is never called.
@implementation Streamer
@synthesize readingStream, usePolling;
- (void) startStreamRead:(NSObject*) context {
NSAutoreleasePool * pool = [[NSAutoreleasePool alloc] init];
NSLog(#"starting stream read.");
readingStream = YES;
/*
NSURL * url = [NSURL URLWithString:@"http://www.google.com"];
NSLog(@"Loading: %@",[url description]);
NSInputStream * inStream = [[NSInputStream alloc] initWithURL:url];
*/
NSInputStream * inStream = [[NSInputStream alloc] initWithFileAtPath:@"sample.ttc"];
if(!usePolling) {
[inStream setDelegate: self];
[inStream scheduleInRunLoop: [NSRunLoop currentRunLoop]
forMode: NSDefaultRunLoopMode];
}
[inStream open];
if(usePolling) {
while(1) {
if([inStream hasBytesAvailable]) {
uint8_t buf[1024];
unsigned int len = 0;
len = [(NSInputStream *)inStream read:buf maxLength:1024];
NSLog(#"Read: %d",len);
}
NSStreamStatus status = [inStream streamStatus];
if(status != NSStreamStatusOpen && status != NSStreamStatusOpening) {
NSLog(#"Stream not open.");
break;
}
}
readingStream = NO;
NSStreamStatus status = [inStream streamStatus];
NSError * error = [inStream streamError];
NSLog(#"Status: %d Error Desc: %# Reason: %#",(int)status,[error localizedDescription], [error localizedFailureReason]);
[pool release];
}
}
- (void)stream:(NSStream *)stream handleEvent:(NSStreamEvent)eventCode {
NSMutableData * _data = nil;
NSNumber * bytesRead = nil;
NSLog(#"Event fired.");
switch(eventCode) {
case NSStreamEventHasBytesAvailable:
{
if(!_data) {
_data = [[NSMutableData data] retain];
}
uint8_t buf[1024];
unsigned int len = 0;
len = [(NSInputStream *)stream read:buf maxLength:1024];
if(len) {
[_data appendBytes:(const void *)buf length:len];
// bytesRead is an instance variable of type NSNumber.
//[bytesRead setIntValue:[bytesRead intValue]+len];
NSLog(#"Read %d bytes",len);
} else {
NSLog(#"no buffer!");
}
break;
}
case NSStreamEventEndEncountered:
{
[stream close];
[stream removeFromRunLoop:[NSRunLoop currentRunLoop]
forMode:NSDefaultRunLoopMode];
[stream release];
stream = nil; // stream is ivar, so reinit it
readingStream = NO;
break;
}
default:
{
NSLog(#"Another event occurred.");
break;
}
// continued ...
}
}
@end
Thanks in advance,
b
The reason is most likely that the run loop is blocked while the unit test is executing. Referring to the NSRunLoop documentation, the method runUntilDate: can be used to run the main run loop from the unit test's thread of execution, like this:
[[NSRunLoop mainRunLoop] runUntilDate:[NSDate dateWithTimeIntervalSinceNow:1]];
This lets the run loop run for 1 second, giving it time to process part of your file. Note that this is not a reliable approach to unit testing (the required time interval varies with how much work the run loop has to do) and may therefore be unsuitable. By giving your unit an interface that can be used to check the status of the input-stream read operation (a 'finished reading' state), such as
-(BOOL)hasFinishedReadingFile
the unit test can repeatedly execute the run loop until the above method returns YES and the file has been read completely.
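Put together, the test's wait loop might look roughly like this (a sketch; hasFinishedReadingFile is the interface suggested above, and streamer is the test's Streamer instance):
// Sketch: spin the run loop in short increments until the streamer reports
// that the file has been read completely.
while (![streamer hasFinishedReadingFile]) {
    [[NSRunLoop mainRunLoop] runUntilDate:[NSDate dateWithTimeIntervalSinceNow:0.1]];
}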
Addendum: this Stack Overflow question also deals with the problem in a different way.