Each time I start an AVCaptureSession, I receive a memory warning that eventually leads to crashes.
I'm starting the session asynchronously, and the Instruments tool says the application consumes about 2 MB of memory.
Do you have any idea how to overcome this issue? Is 2 MB of allocated memory too much?
Thanks!
[iOS 4.3, ARC]
@autoreleasepool {
    // Init capture session
    session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPresetPhoto;

    // Resize container view
    CGRect cameraContainerFrame = cameraContainerView.frame;
    cameraContainerFrame.size = CGSizeMake(320, 426);
    cameraContainerView.frame = cameraContainerFrame;
    CALayer *viewLayer = [cameraContainerView layer];
    [viewLayer setMasksToBounds:YES];

    // Create preview layer
    captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    CGRect bounds = [cameraContainerView bounds];
    [captureVideoPreviewLayer setFrame:bounds];
    if ([captureVideoPreviewLayer isOrientationSupported]) {
        [captureVideoPreviewLayer setOrientation:AVCaptureVideoOrientationPortrait];
    }
    [captureVideoPreviewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    [viewLayer addSublayer:captureVideoPreviewLayer];

    // Get input device
    captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if ([captureDevice lockForConfiguration:nil]) {
        captureDevice.focusMode = AVCaptureFocusModeContinuousAutoFocus;
        captureDevice.whiteBalanceMode = AVCaptureWhiteBalanceModeContinuousAutoWhiteBalance;
        [captureDevice unlockForConfiguration];
    }

    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:captureDevice error:&error];
    if (!input) {
        // Handle the error appropriately.
        DLog(@"ERROR: trying to open camera: %@", error);
    }

    // Add input to session
    [session addInput:input];

    // Output
    stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = [NSDictionary dictionaryWithObject:AVVideoCodecJPEG forKey:AVVideoCodecKey];
    [stillImageOutput setOutputSettings:outputSettings];
    [session addOutput:stillImageOutput];

    // Save state
    cameraSessionInitialized = YES;
    [session startRunning];
}
session.sessionPreset = AVCaptureSessionPresetMedium;
If you don't care about quality, this gets rid of the memory warnings. I'm still trying to figure out how to make it work with AVCaptureSessionPresetPhoto.
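For reference, here's a minimal sketch of that workaround (assuming the session and stillImageOutput variables from the code above): run the preview on the lighter preset, and only switch up to the photo preset right before capturing.

// Sketch: preview on a lighter preset to reduce memory pressure.
[session beginConfiguration];
session.sessionPreset = AVCaptureSessionPresetMedium;
[session commitConfiguration];

// Just before taking the picture, switch up if the device supports it:
if ([session canSetSessionPreset:AVCaptureSessionPresetPhoto]) {
    session.sessionPreset = AVCaptureSessionPresetPhoto;
}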
I have an iOS app with a simple UIView placed in the view controller. I am trying to show the camera feed of the front-facing camera in the UIView. I am not trying to take a picture or record a video; I simply want to show the live feed in a UIView.
I have tried to implement AVCaptureVideoPreviewLayer, but the feed I get is blank. Nothing seems to happen. Here is my code:
AVCaptureSession *session = [[AVCaptureSession alloc] init];
[session setSessionPreset:AVCaptureSessionPresetPhoto];
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input;
@try {
    input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
} @catch (NSException *exception) {
    NSLog(@"Error: %@", error);
} @finally {
    if (error == nil) {
        if ([session canAddInput:input]) {
            [session addInput:input];
            AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
            [stillImageOutput setOutputSettings:@{AVVideoCodecKey : AVVideoCodecJPEG}];
            if ([session canAddOutput:stillImageOutput]) {
                [session setSessionPreset:AVCaptureSessionPresetHigh];
                [session addOutput:stillImageOutput];
                AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
                [previewLayer setVideoGravity:AVLayerVideoGravityResizeAspect];
                [previewLayer.connection setVideoOrientation:AVCaptureVideoOrientationPortrait];
                [backgroundStreamView.layer addSublayer:previewLayer];
                [session startRunning];
                NSLog(@"session running");
            } else {
                NSLog(@"cannot add output");
            }
        } else {
            NSLog(@"cannot add input");
        }
    } else {
        NSLog(@"general error: %@", error);
    }
}
The session runs perfectly fine, however no video feed is shown. What am I doing wrong?
Managed to fix it myself; it turned out to be a fairly simple issue: I didn't specify the frame of the AVCaptureVideoPreviewLayer, and as a result it was not appearing (presumably because it defaults to a zero-sized frame).
To fix this, I set the layer's frame to match the bounds of my custom UIView:
[previewLayer setFrame:backgroundStreamView.bounds];
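If the view can resize (rotation, Auto Layout), it may be safer to set the frame from a layout callback so the layer always tracks the view. A sketch, assuming the layer and view are held in properties named previewLayer and backgroundStreamView:

- (void)viewDidLayoutSubviews {
    [super viewDidLayoutSubviews];
    // Keep the preview layer glued to its host view's bounds.
    self.previewLayer.frame = self.backgroundStreamView.bounds;
}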
Deprecation code fix
AVCaptureStillImageOutput is also deprecated, so to fix that I replaced it with the AVCapturePhotoOutput class. The code thus changed from:
AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
[stillImageOutput setOutputSettings:@{AVVideoCodecKey : AVVideoCodecJPEG}];
to the following:
AVCapturePhotoOutput *stillImageOutput = [[AVCapturePhotoOutput alloc] init];
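Note that AVCapturePhotoOutput takes its settings per capture instead of via setOutputSettings:. A minimal sketch of how a JPEG capture might look with it (iOS 11+; assumes the controller adopts AVCapturePhotoCaptureDelegate):

// Trigger a capture; settings are supplied per shot.
AVCapturePhotoSettings *settings = [AVCapturePhotoSettings photoSettingsWithFormat:@{AVVideoCodecKey : AVVideoCodecTypeJPEG}];
[stillImageOutput capturePhotoWithSettings:settings delegate:self];

// Delegate callback delivering the photo data.
- (void)captureOutput:(AVCapturePhotoOutput *)output didFinishProcessingPhoto:(AVCapturePhoto *)photo error:(NSError *)error {
    if (error) {
        NSLog(@"Capture failed: %@", error);
        return;
    }
    NSData *jpegData = [photo fileDataRepresentation];
    UIImage *image = [UIImage imageWithData:jpegData];
    // Use the image...
}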
My application crashes when scanning barcodes using AVFoundation.
The following is my code:
_session = [[AVCaptureSession alloc] init];
_device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
_input = [AVCaptureDeviceInput deviceInputWithDevice:_device error:&error];
if (_input) {
    [_session addInput:_input];
} else {
    NSLog(@"Error: %@", error);
}
_output = [[AVCaptureMetadataOutput alloc] init];
[_output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
[_session addOutput:_output];
_output.metadataObjectTypes = [_output availableMetadataObjectTypes];
_prevLayer = [AVCaptureVideoPreviewLayer layerWithSession:_session];
_prevLayer.frame = _previewView.bounds;
_prevLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
//[self.view.layer addSublayer:_prevLayer];
[_previewView.layer addSublayer:_prevLayer];
//[self.view];
//[_session startRunning];
[_previewView bringSubviewToFront:_highlightView];
/* code ends */
It crashes with a bad access error on this line:
[_previewView.layer addSublayer:_prevLayer];
That line runs after the frame is set. Try adding the layer first and then setting the frame. I'm sure you've moved on, but this answer could benefit someone else.
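In other words, something like this ordering, based on the code in the question:

_prevLayer = [AVCaptureVideoPreviewLayer layerWithSession:_session];
_prevLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
[_previewView.layer addSublayer:_prevLayer]; // add the layer first...
_prevLayer.frame = _previewView.bounds;      // ...then size it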
I am developing an iPhone app with a requirement to pause and resume the camera, so I used AVFoundation for it instead of UIImagePickerController.
My code is:
- (void)startup:(BOOL)isFrontCamera
{
    if (_session == nil)
    {
        NSLog(@"Starting up server");
        self.isCapturing = NO;
        self.isPaused = NO;
        _currentFile = 0;
        _discont = NO;

        // create capture device with video input
        _session = [[AVCaptureSession alloc] init];
        AVCaptureDevice *cameraDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        if (isFrontCamera)
        {
            NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
            AVCaptureDevice *captureDevice = nil;
            for (AVCaptureDevice *device in videoDevices)
            {
                if (device.position == AVCaptureDevicePositionFront)
                {
                    captureDevice = device;
                    break;
                }
            }
            cameraDevice = captureDevice;
        }
        cameraDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:cameraDevice error:nil];
        [_session addInput:input];

        // audio input from default mic
        AVCaptureDevice *mic = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
        AVCaptureDeviceInput *micinput = [AVCaptureDeviceInput deviceInputWithDevice:mic error:nil];
        [_session addInput:micinput];

        // create an output for YUV output with self as delegate
        _captureQueue = dispatch_queue_create("uk.co.gdcl.cameraengine.capture", DISPATCH_QUEUE_SERIAL);
        AVCaptureVideoDataOutput *videoout = [[AVCaptureVideoDataOutput alloc] init];
        [videoout setSampleBufferDelegate:self queue:_captureQueue];
        NSDictionary *setcapSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                        [NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange], kCVPixelBufferPixelFormatTypeKey,
                                        nil];
        videoout.videoSettings = setcapSettings;
        [_session addOutput:videoout];
        _videoConnection = [videoout connectionWithMediaType:AVMediaTypeVideo];
        [_videoConnection setVideoOrientation:AVCaptureVideoOrientationPortrait];
        NSDictionary *actual = videoout.videoSettings;
        _cy = [[actual objectForKey:@"Width"] integerValue];
        _cx = [[actual objectForKey:@"Height"] integerValue];

        AVCaptureAudioDataOutput *audioout = [[AVCaptureAudioDataOutput alloc] init];
        [audioout setSampleBufferDelegate:self queue:_captureQueue];
        [_session addOutput:audioout];
        _audioConnection = [audioout connectionWithMediaType:AVMediaTypeAudio];

        [_session startRunning];

        _preview = [AVCaptureVideoPreviewLayer layerWithSession:_session];
        _preview.videoGravity = AVLayerVideoGravityResizeAspectFill;
    }
}
Here I am facing a problem when I switch to the front camera: when I call the above method with the front camera selected, the preview layer gets stuck and no preview appears. My doubt is: can we change the capture device in the middle of a capture session? Please guide me on where I am going wrong, or suggest a solution for how to switch between the front and back cameras while recording.
Thanks in advance.
Yes, you can. There are just a few things you need to take care of:
1) Use AVCaptureVideoDataOutput and its delegate for recording.
2) Make sure you remove the previous deviceInput before adding the new deviceInput.
3) Remove and recreate the AVCaptureVideoDataOutput as well.
I am using these two functions for it right now, and they work while the session is running.
- (void)configureVideoWithDevice:(AVCaptureDevice *)camera {
    [_session beginConfiguration];

    [_session removeInput:_videoInputDevice];
    _videoInputDevice = nil;
    _videoInputDevice = [AVCaptureDeviceInput deviceInputWithDevice:camera error:nil];
    if ([_session canAddInput:_videoInputDevice]) {
        [_session addInput:_videoInputDevice];
    }

    [_session removeOutput:_videoDataOutput];
    _videoDataOutput = nil;
    _videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
    [_videoDataOutput setSampleBufferDelegate:self queue:_outputQueueVideo];
    NSDictionary *setcapSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                    [NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange], kCVPixelBufferPixelFormatTypeKey,
                                    nil];
    _videoDataOutput.videoSettings = setcapSettings;
    [_session addOutput:_videoDataOutput];
    _videoConnection = [_videoDataOutput connectionWithMediaType:AVMediaTypeVideo];
    if ([_videoConnection isVideoOrientationSupported]) {
        [_videoConnection setVideoOrientation:AVCaptureVideoOrientationLandscapeRight];
    }

    [_session commitConfiguration];
}

- (void)configureAudioWithDevice:(AVCaptureDevice *)microphone {
    [_session beginConfiguration];

    _audioInputDevice = [AVCaptureDeviceInput deviceInputWithDevice:microphone error:nil];
    if ([_session canAddInput:_audioInputDevice]) {
        [_session addInput:_audioInputDevice];
    }

    [_session removeOutput:_audioDataOutput];
    _audioDataOutput = nil;
    _audioDataOutput = [[AVCaptureAudioDataOutput alloc] init];
    [_audioDataOutput setSampleBufferDelegate:self queue:_outputQueueAudio];
    [_session addOutput:_audioDataOutput];
    _audioConnection = [_audioDataOutput connectionWithMediaType:AVMediaTypeAudio];

    [_session commitConfiguration];
}
You can't change the captureDevice mid-session, and you can only have one capture session running at a time. You could end the current session and create a new one; there will be a slight lag (maybe a second or two, depending on your CPU load).
I wish Apple would allow multiple sessions, or at least multiple devices per session... but they do not... yet.
Have you considered having multiple sessions and then processing the video files afterwards to join them into one?
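If you go the multiple-files route, AVMutableComposition can stitch the clips together afterwards. A rough sketch, where clipOneURL and clipTwoURL are placeholders for your recorded files:

// Append recorded clips back-to-back into one composition (sketch).
AVMutableComposition *composition = [AVMutableComposition composition];
CMTime cursor = kCMTimeZero;
for (NSURL *url in @[clipOneURL, clipTwoURL]) {
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
    NSError *error = nil;
    [composition insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                         ofAsset:asset
                          atTime:cursor
                           error:&error];
    cursor = CMTimeAdd(cursor, asset.duration);
}
// Export the composition with AVAssetExportSession if you need a single file.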
I spent two whole weeks just trying to resolve this problem. So frustrating!
The following two functions are what I'm using to fetch an image from the device library.
If I use the setImage function multiple times, I keep losing free memory on my iOS device.
I think the call [imageFromAsset initWithCGImage:[[myasset defaultRepresentation] fullScreenImage]]; in the assetImage function causes the problem.
Can anyone help me? Any clues or ideas would be SUPER appreciated! Please!
- (void)setImage:(NSURL *)imageURL {
    UIImage *imageFromDeviceLibrary = [[UIImage alloc] init];
    [[DevicePhotoControl sharedInstance] assetImage:[imageURL absoluteString] imageToStore:imageFromDeviceLibrary];
    UIImageView *fullImageView = [[UIImageView alloc] initWithImage:imageFromDeviceLibrary];
    [imageFromDeviceLibrary release];
    [self.view addSubview:fullImageView];
    [fullImageView release];
}

- (void)assetImage:(NSString *)assetURL imageToStore:(UIImage *)imageFromAsset {
    // Handle the exception case where the URL lacks the assets-library scheme
    if ([assetURL hasPrefix:@"assets-library:"] == NO) {
        assetURL = [NSString stringWithFormat:@"%@%@", @"assets-library:", assetURL];
    }
    __block BOOL busy = YES;
    ALAssetsLibrary *assetslibrary = [[[ALAssetsLibrary alloc] init] autorelease];

    // get image data by URL
    ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset)
    {
        [imageFromAsset initWithCGImage:[[myasset defaultRepresentation] fullScreenImage]];
        busy = NO;
    };
    ALAssetsLibraryAccessFailureBlock failureblock = ^(NSError *myerror)
    {
        NSLog(@"Library Image Fetching Failed : %@", [myerror localizedDescription]);
        busy = NO;
    };
    [assetslibrary assetForURL:[NSURL URLWithString:assetURL]
                   resultBlock:resultblock
                  failureBlock:failureblock];
    while (busy == YES) {
        [[NSRunLoop currentRunLoop] runMode:NSDefaultRunLoopMode beforeDate:[NSDate distantFuture]];
    }
}
The moment the ALAssetsLibrary is released, all of the asset objects retrieved from it go with it.
I suggest you create your assetsLibrary in the app delegate to keep it alive, and only reset it when you receive the ALAssetsLibraryChangedNotification change notification from the ALAssetsLibrary. That may help you.
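A sketch of that idea, using MRC to match the question's code (the method name is illustrative):

// One long-lived library instance, e.g. on the app delegate or a singleton.
static ALAssetsLibrary *sharedLibrary = nil;

+ (ALAssetsLibrary *)sharedAssetsLibrary {
    if (sharedLibrary == nil) {
        sharedLibrary = [[ALAssetsLibrary alloc] init];
        [[NSNotificationCenter defaultCenter] addObserverForName:ALAssetsLibraryChangedNotification
                                                          object:nil
                                                           queue:nil
                                                      usingBlock:^(NSNotification *note) {
            // The library changed; recreate it so stale assets are dropped.
            [sharedLibrary release];
            sharedLibrary = [[ALAssetsLibrary alloc] init];
        }];
    }
    return sharedLibrary;
}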
I want to take a screenshot of a video in my iPad app.
I searched on SO and found a lot of sample code. I tried everything, but nothing seems to work.
I tried all of these methods:
1) Try with: MPMoviePlayerController
- (void)previewWithPlayer:(NSString *)path image:(UIImageView *)imView
{
    MPMoviePlayerController *player = [[MPMoviePlayerController alloc] initWithContentURL:[NSURL URLWithString:path]];
    UIImage *thumbnail = [player thumbnailImageAtTime:1.0 timeOption:MPMovieTimeOptionNearestKeyFrame];
    [player stop];
    [player release];
    imView.image = thumbnail;
}
2) Try with: AVAssetImageGenerator - v1
- (void)generateImage:(NSString *)path image:(UIImageView *)imView
{
    AVAsset *asset = [AVAsset assetWithURL:[NSURL URLWithString:path]];
    AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    CMTime time = CMTimeMake(1, 1);
    UIImage *thumbnail = [UIImage imageWithCGImage:[imageGenerator copyCGImageAtTime:time actualTime:NULL error:NULL]];
    imView.image = thumbnail;
}
3) Try with: AVAssetImageGenerator - v2
- (void)generateImage:(NSString *)path image:(UIImageView *)imView
{
    AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:[NSURL URLWithString:path] options:nil];
    AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    generator.appliesPreferredTrackTransform = TRUE;
    [asset release];
    CMTime thumbTime = CMTimeMakeWithSeconds(2, 30);
    AVAssetImageGeneratorCompletionHandler handler = ^(CMTime requestedTime, CGImageRef im, CMTime actualTime, AVAssetImageGeneratorResult result, NSError *error) {
        if (result != AVAssetImageGeneratorSucceeded) {
            NSLog(@"couldn't generate thumbnail, error:%@", error);
        }
        UIImage *thumbImg = [[UIImage imageWithCGImage:im] retain];
        imView.image = thumbImg;
        [generator release];
    };
    CGSize maxSize = CGSizeMake(320, 180);
    generator.maximumSize = maxSize;
    [generator generateCGImagesAsynchronouslyForTimes:[NSArray arrayWithObject:[NSValue valueWithCMTime:thumbTime]] completionHandler:handler];
}
But nothing works.
I tried MOV and MP4 files; nothing.
The path is correct, and the video plays fine.
NSString *fPath = [[NSBundle mainBundle] pathForResource:@"VideoA" ofType:@"mp4"];
NSLog(@"%@", fPath);
[self generateImage:fPath image:_ImgA];
What could be the problem? My image view shows nothing, and no errors are returned.
iOS is 6.0/5.1, on the iPad simulator and device.
The video is 854×480 pixels, H.264/AAC, about 30 MB in size.
Please help me, because I'm going crazy with this issue.
Thanks.
Edit:
On the device it returns this error:
couldn't generate thumbnail, error:Error Domain=NSURLErrorDomain
Code=-1 "unknown error" UserInfo=0x1e0a2f30
{NSUnderlyingError=0x1e0a3900 "The operation couldn’t be completed.
(OSStatus error -12935.)", NSLocalizedDescription=unknown error}
Solved.
The trick: use fileURLWithPath:, not URLWithString:. Apparently the difference is really, really significant.
Thanks to Noah: https://stackoverflow.com/a/4201419/88461
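For anyone else hitting OSStatus error -12935: inside generateImage:, build the asset URL like this for a local file:

// Local file paths need fileURLWithPath:, not URLWithString:.
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:path] options:nil];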