I have the following code:
BOOL success;
NSError *e = nil;
QTCaptureSession *session = [[QTCaptureSession alloc] init];
QTCaptureDevice *device = [QTCaptureDevice defaultInputDeviceWithMediaType: QTMediaTypeVideo];
success = [device open: &e];
if ( !success )
{
    NSLog(@"error opening input device: %@", e);
    return;
}
QTCaptureDeviceInput *input = [QTCaptureDeviceInput deviceInputWithDevice: device];
success = [session addInput: input error: &e];
if ( !success )
{
    NSLog(@"error adding input device to session: %@", e);
    return;
}
QTCaptureDecompressedVideoOutput *output = [[QTCaptureDecompressedVideoOutput alloc] init];
[output setDelegate: self];
success = [session addOutput: output error: &e];
if ( !success )
{
    NSLog(@"error adding output device to session: %@", e);
    return;
}
[session startRunning];
This code lives in a bundle that is loaded at runtime, and it is part of a method used as the selector of an NSThread (i.e. it runs on a background thread, not the main thread).
My problem is that the call to -addInput:error: never returns. What am I missing here?
The problem was that the main thread of the application was not using the run loop normally provided by NSApplicationMain() in Cocoa applications. Instead, I was handling events in my own while loop.
The solution was calling:
CFRunLoopRunInMode(kCFRunLoopDefaultMode, 0, YES);
See this thread on Apple's QuickTime mailing list for a more detailed explanation of the problem.
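For reference, a minimal sketch of where that call could sit in a hand-rolled main-thread event loop (the loop structure and the keepRunning flag are assumed here, not taken from the original code):
while (keepRunning)
{
    // Dispatch pending AppKit events as before.
    NSEvent *event = [NSApp nextEventMatchingMask: NSAnyEventMask
                                        untilDate: [NSDate distantPast]
                                           inMode: NSDefaultRunLoopMode
                                          dequeue: YES];
    if (event)
        [NSApp sendEvent: event];

    // Give the default run loop a chance to service QTKit's internal sources,
    // which is what -addInput:error: was blocked waiting for.
    CFRunLoopRunInMode(kCFRunLoopDefaultMode, 0, YES);
}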
I have an iOS app with a simple UIView placed in the view controller. I am trying to show the live feed from the front-facing camera in that UIView. I am not trying to take a picture or record a video; I simply want to show the live feed.
I have tried to implement AVCaptureVideoPreviewLayer, but the feed I get is blank. Nothing seems to happen. Here is my code:
AVCaptureSession *session = [[AVCaptureSession alloc] init];
[session setSessionPreset:AVCaptureSessionPresetPhoto];

AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

NSError *error = nil;
AVCaptureDeviceInput *input;
@try {
    input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
} @catch (NSException *exception) {
    NSLog(@"Error: %@", error);
} @finally {
    if (error == nil) {
        if ([session canAddInput:input]) {
            [session addInput:input];

            AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
            [stillImageOutput setOutputSettings:@{AVVideoCodecKey : AVVideoCodecJPEG}];

            if ([session canAddOutput:stillImageOutput]) {
                [session setSessionPreset:AVCaptureSessionPresetHigh];
                [session addOutput:stillImageOutput];

                AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
                [previewLayer setVideoGravity:AVLayerVideoGravityResizeAspect];
                [previewLayer.connection setVideoOrientation:AVCaptureVideoOrientationPortrait];
                [backgroundStreamView.layer addSublayer:previewLayer];

                [session startRunning];
                NSLog(@"session running");
            } else {
                NSLog(@"cannot add output");
            }
        } else {
            NSLog(@"cannot add input");
        }
    } else {
        NSLog(@"general error: %@", error);
    }
}
The session runs fine; however, no video feed is shown. What am I doing wrong?
Managed to fix it myself; it turned out to be a fairly simple issue - I hadn't specified the frame of the AVCaptureVideoPreviewLayer, so it was not appearing (presumably because it defaults to a zero-sized frame).
To fix this, I set the layer's frame to match the frame of my custom UIView:
[previewLayer setFrame:backgroundStreamView.bounds];
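For context, this is the relevant part of the setup from the question with the frame line added (backgroundStreamView is the custom UIView the feed is shown in):
AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
[previewLayer setVideoGravity:AVLayerVideoGravityResizeAspect];
[previewLayer.connection setVideoOrientation:AVCaptureVideoOrientationPortrait];
// Without an explicit frame the layer is zero-sized and nothing is visible.
[previewLayer setFrame:backgroundStreamView.bounds];
[backgroundStreamView.layer addSublayer:previewLayer];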
Deprecation code fix
AVCaptureStillImageOutput is also deprecated, so to fix that I replaced it with the AVCapturePhotoOutput class. The code changed from:
AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
[stillImageOutput setOutputSettings:@{AVVideoCodecKey : AVVideoCodecJPEG}];
to the following:
AVCapturePhotoOutput *stillImageOutput = [[AVCapturePhotoOutput alloc] init];
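Note that AVCapturePhotoOutput does not take output settings up front; the settings are supplied per capture instead. A rough sketch of how a capture would then be triggered (assuming self adopts AVCapturePhotoCaptureDelegate, which the original code does not show):
AVCapturePhotoOutput *stillImageOutput = [[AVCapturePhotoOutput alloc] init];
if ([session canAddOutput:stillImageOutput]) {
    [session addOutput:stillImageOutput];
}

// Later, when a photo is actually needed:
AVCapturePhotoSettings *settings = [AVCapturePhotoSettings photoSettings]; // default settings
[stillImageOutput capturePhotoWithSettings:settings delegate:self];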
I'm trying to add a few tests to a Core Data app. The app itself works fine; it's the tests that fail miserably when I try to create the stack.
I get the error when trying to add an NSPersistentStore of SQLite type to the NSPersistentStoreCoordinator:
- (NSPersistentStoreCoordinator *)storeCoordinator
{
    if (_storeCoordinator == nil) {
        _storeCoordinator = [[NSPersistentStoreCoordinator alloc] initWithManagedObjectModel:self.model];
        NSError *err = nil;
        if (![_storeCoordinator addPersistentStoreWithType:NSSQLiteStoreType
                                             configuration:nil
                                                       URL:self.dbURL
                                                   options:[[self class] persistentStoreCordinatorOptions]
                                                     error:&err]) {
            // This is where I get the error
            NSNotification *note = [NSNotification
                notificationWithName:[[self class] persistentStoreCoordinatorErrorNotificationName]
                              object:self
                            userInfo:@{@"error" : err}];
            [[NSNotificationCenter defaultCenter] postNotification:note];
            NSLog(@"Error while adding a Store: %@", err);
            return nil;
        }
    }
    return _storeCoordinator;
}
This is the error:
CoreData: error: -addPersistentStoreWithType:SQLite configuration:(null) URL:file:///Users/cfisher/Library/Developer/CoreSimulator/Devices/860C8F97-354D-4A9D-B1E9-CC018680F487/data/Library/Caches/TestModel options:{
NSInferMappingModelAutomaticallyOption = 1;
NSMigratePersistentStoresAutomaticallyOption = 1;
NSReadOnlyPersistentStoreOption = 1;
} ... returned error Error Domain=NSCocoaErrorDomain Code=260 "The operation couldn’t be completed. (Cocoa error 260.)" with userInfo dictionary {
}
The URL mentioned in the error is obtained in the following way (the dbURL property below):
- (void)setUp {
    [super setUp];
    NSURL *cache = [[[NSFileManager defaultManager]
                       URLsForDirectory:NSCachesDirectory
                              inDomains:NSUserDomainMask] lastObject];
    self.dbURL = [cache URLByAppendingPathComponent:self.testModelName];
    self.testModelName = @"TestModel";
    self.testBundle = [NSBundle bundleForClass:[self class]];
}
This only happens within the test. The App works fine. Any ideas?
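Cocoa error 260 is NSFileReadNoSuchFileError, and the options in the test include NSReadOnlyPersistentStoreOption, so one thing worth checking (a debugging sketch using the property names above, not a confirmed fix) is whether a store file actually exists at dbURL before the coordinator tries to open it read-only:
// Quick sanity check in the test before building the stack:
BOOL storeExists = [[NSFileManager defaultManager] fileExistsAtPath:[self.dbURL path]];
NSLog(@"store file exists at %@: %d", self.dbURL, storeExists);
// If this logs 0, a read-only SQLite store has nothing to open, which would be
// consistent with the "no such file" error; a seed database would then need to be
// copied from self.testBundle into the Caches directory in -setUp first.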
I have to implement pause functionality while recording the screen with AVFoundation in Cocoa.
To start the recording I am using this function:
-(void)takeScreenRecording:(CGRect)rect saveAtPath:(NSURL*)destPath {
    // Create a capture session
    mSession = [[AVCaptureSession alloc] init];

    // Set the session preset as you wish
    mSession.sessionPreset = AVCaptureSessionPresetPhoto;

    CGDirectDisplayID displayId = kCGDirectMainDisplay;

    // Create a ScreenInput with the display and add it to the session
    AVCaptureScreenInput *input =
        [[AVCaptureScreenInput alloc] initWithDisplayID:displayId];
    [input setCropRect:rect];
    if (!input) {
        mSession = nil;
        return;
    }
    if ([mSession canAddInput:input])
        [mSession addInput:input];

    // Create a MovieFileOutput and add it to the session
    mMovieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
    if ([mSession canAddOutput:mMovieFileOutput])
        [mSession addOutput:mMovieFileOutput];

    // Start running the session
    [mSession startRunning];

    // Delete any existing movie file first
    if ([[NSFileManager defaultManager] fileExistsAtPath:[destPath path]])
    {
        NSError *err;
        if ( ![[NSFileManager defaultManager] removeItemAtPath:[destPath path]
                                                         error:&err] )
        {
            NSLog(@"Error deleting existing movie %@", [err localizedDescription]);
        }
    }

    [mMovieFileOutput startRecordingToOutputFileURL:destPath
                                  recordingDelegate:self];
}
For stopping the screen recording I am using these functions:
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
      fromConnections:(NSArray *)connections
                error:(NSError *)error {
    NSLog(@"Did finish recording to %@ due to error %@",
          [outputFileURL description], [error description]);
    [mSession stopRunning];
    mSession = nil;
}

-(void)finishRecord {
    // Stop recording to the destination movie file
    NSLog(@"Stopping record");
    [mMovieFileOutput stopRecording];
}
But I cannot work out the logic for the pause functionality. Can anyone tell me how I can pause a screen recording in progress?
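A minimal sketch of one possible approach, assuming the mMovieFileOutput ivar from the code above: on macOS, AVCaptureMovieFileOutput inherits -pauseRecording and -resumeRecording from AVCaptureFileOutput, so pause and resume can simply wrap those calls.
-(void)pauseRecord {
    // Pauses writing to the current movie file without finishing the recording.
    [mMovieFileOutput pauseRecording];
}

-(void)resumeRecord {
    // Continues appending to the same output file where the pause left off.
    [mMovieFileOutput resumeRecording];
}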
Source
So, I have a QTCaptureSession set up thusly:
//Setup Camera
cameraSession = [[QTCaptureSession alloc] init];

NSError *error = nil;
QTCaptureDevice *camera = [QTCaptureDevice deviceWithUniqueID: cameraID];
BOOL success = [camera open: &error];
if (!success || error)
{
    NSLog(@"Could not open device %@.", cameraID);
    NSLog(@"Error: %@", [error localizedDescription]);
    return nil;
}

//Setup Input Session
QTCaptureDeviceInput *cameraInput = [[QTCaptureDeviceInput alloc] initWithDevice: camera];
success = [cameraSession addInput: cameraInput error: &error];
if (!success || error)
{
    NSLog(@"Could not initialize input session.");
    NSLog(@"Error: %@", [error localizedDescription]);
    return nil;
}

//Setup Output
QTCaptureDecompressedVideoOutput *cameraOutput = [[QTCaptureDecompressedVideoOutput alloc] init];
[cameraOutput setDelegate: self];
success = [cameraSession addOutput: cameraOutput error: &error];
if (!success || error)
{
    NSLog(@"Could not initialize output session.");
    NSLog(@"Error: %@", [error localizedDescription]);
    return nil;
}
And the QTCaptureDecompressedVideoOutput delegate's captureOutput:didOutputVideoFrame:withSampleBuffer:fromConnection: is implemented thusly:
- (void)captureOutput:(QTCaptureOutput *)captureOutput didOutputVideoFrame:(CVImageBufferRef)videoFrame withSampleBuffer:(QTSampleBuffer *)sampleBuffer fromConnection:(QTCaptureConnection *)connection
{
    NSLog(@"starting convert\n");
}
I then start the capture processing using:
[cameraSession startRunning];
All of the variables initialize fine, and the session starts fine, but captureOutput:didOutputVideoFrame:withSampleBuffer:fromConnection: never gets called.
Context
This is a command-line app, compiled with GCC. It's linked against the following frameworks:
Foundation
Cocoa
QTKit
QuartzCore
Relevant Miscellany
Frames are probably not being dropped, because captureOutput:didDropVideoFrameWithSampleBuffer:fromConnection: is also never called.
So, with some help from Mike Ash, I managed to figure out that my program was terminating immediately and not waiting for the delegate callback (which, according to Apple's QTKit docs, might occur on a separate thread).
My solution was to add a BOOL property named captureIsFinished to my object, then add this to the main() function:
//Wait Until Capture is Finished
while (![snap captureIsFinished])
{
[[NSRunLoop currentRunLoop] runUntilDate: [NSDate dateWithTimeIntervalSinceNow: 1]];
}
This effectively keeps the app's run loop going for one second at a time, checking after each pass whether the capture has finished before running for another second.
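For completeness, the delegate side of this would set the flag once a frame has been handled, roughly like so (a sketch: the captureIsFinished property comes from the description above, and the actual frame handling is omitted):
- (void)captureOutput:(QTCaptureOutput *)captureOutput didOutputVideoFrame:(CVImageBufferRef)videoFrame withSampleBuffer:(QTSampleBuffer *)sampleBuffer fromConnection:(QTCaptureConnection *)connection
{
    // ... process or save the frame ...

    // Signal the loop in main() that it can stop pumping the run loop.
    self.captureIsFinished = YES;
}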
I am trying to create a method that will return an ALAsset for a given asset URL. (I need to upload the asset later and want to do that outside the result block.)
+ (ALAsset*) assetForPhoto:(Photo*)photo
{
    ALAssetsLibrary* library = [[[ALAssetsLibrary alloc] init] autorelease];

    __block ALAsset* assetToReturn = nil;

    NSURL* url = [NSURL URLWithString:photo.assetUrl];
    NSLog(@"assetForPhoto: %@[", url);
    [library assetForURL:url resultBlock:^(ALAsset *asset)
    {
        NSLog(@"asset: %@", asset);
        assetToReturn = asset;
        NSLog(@"asset: %@ %d", assetToReturn, [assetToReturn retainCount]);
    } failureBlock:^(NSError *error)
    {
        assetToReturn = nil;
    }];
    NSLog(@"assetForPhoto: %@]", url);

    NSLog(@"assetToReturn: %@", assetToReturn); // Invalid access exception coming here.
    return assetToReturn;
}
The problem is that accessing assetToReturn gives an EXC_BAD_ACCESS.
Is there some problem with assigning to pointers from inside the block? I have seen examples of blocks, but they always use simple types like integers.
A few things:
You must keep the ALAssetsLibrary instance that created the ALAsset around for as long as you use the asset.
You must register an observer for ALAssetsLibraryChangedNotification; when it is received, any ALAssets you hold (and any other AssetsLibrary objects) need to be refetched, as they are no longer valid. This can happen at any time. (See the short registration sketch after this list.)
You shouldn't expect -assetForURL:resultBlock:failureBlock:, or any of the AssetsLibrary methods that take a failureBlock:, to be synchronous. They may need to prompt the user for access to the library and will not always execute their blocks immediately. It's better to put the work that needs to happen on success in the success block itself.
If you absolutely must make this method synchronous (which I'd advise against), you'll need to wait on a semaphore after calling assetForURL:resultBlock:failureBlock: and, if you end up blocking the main thread, spin the run loop as well.
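A minimal sketch of the observer registration mentioned above (the selector name assetsLibraryDidChange: is a placeholder, not an API requirement):
// In the object that owns the ALAssetsLibrary (register once, keep the library alive):
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(assetsLibraryDidChange:)
                                             name:ALAssetsLibraryChangedNotification
                                           object:nil];

// Elsewhere in the same class:
- (void)assetsLibraryDidChange:(NSNotification *)note {
    // Any ALAssets fetched earlier are stale after this point and must be refetched.
}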
The following implementation should work as a synchronous call in all situations, but really, you should try very hard to make your code asynchronous instead.
- (ALAsset *)assetForURL:(NSURL *)url {
__block ALAsset *result = nil;
__block NSError *assetError = nil;
dispatch_semaphore_t sema = dispatch_semaphore_create(0);
[[self assetsLibrary] assetForURL:url resultBlock:^(ALAsset *asset) {
result = [asset retain];
dispatch_semaphore_signal(sema);
} failureBlock:^(NSError *error) {
assetError = [error retain];
dispatch_semaphore_signal(sema);
}];
if ([NSThread isMainThread]) {
while (!result && !assetError) {
[[NSRunLoop currentRunLoop] runMode:NSDefaultRunLoopMode beforeDate:[NSDate distantFuture]];
}
}
else {
dispatch_semaphore_wait(sema, DISPATCH_TIME_FOREVER);
}
dispatch_release(sema);
[assetError release];
return [result autorelease];
}
You should retain and autorelease the asset:
// ...
assetToReturn = [asset retain];
// ...
return [assetToReturn autorelease];