How to make a full screen landscape camera without any buttons? - objective-c

When the app launches I just want to make it display a camera view in full screen, without any on screen buttons, just the actual part which the camera is seeing.

You won't want to use UIImagePickerController for that. There are multiple ways to do this; the quickest is through an AVCaptureVideoPreviewLayer.
Check out this answer: Get Camera Preview to AVCaptureVideoPreviewLayer
- (void)initCapture
{
    AVCaptureDevice *inputDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *captureInput = [AVCaptureDeviceInput deviceInputWithDevice:inputDevice error:nil];
    if (!captureInput) {
        return;
    }

    AVCaptureVideoDataOutput *captureOutput = [[AVCaptureVideoDataOutput alloc] init];
    /* frames are delivered to the captureOutput:didOutputSampleBuffer:fromConnection: delegate method */
    [captureOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];

    NSString *key = (NSString *)kCVPixelBufferPixelFormatTypeKey;
    NSNumber *value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
    NSDictionary *videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
    [captureOutput setVideoSettings:videoSettings];

    self.captureSession = [[AVCaptureSession alloc] init];
    NSString *preset = nil;
    if (!preset) {
        preset = AVCaptureSessionPresetMedium;
    }
    self.captureSession.sessionPreset = preset;

    if ([self.captureSession canAddInput:captureInput]) {
        [self.captureSession addInput:captureInput];
    }
    if ([self.captureSession canAddOutput:captureOutput]) {
        [self.captureSession addOutput:captureOutput];
    }

    // Set up the preview layer
    if (!self.captureVideoPreviewLayer) {
        self.captureVideoPreviewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
    }
    // If you want to adjust the preview layer frame, do it here
    self.captureVideoPreviewLayer.frame = self.view.bounds;
    self.captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [self.view.layer addSublayer:self.captureVideoPreviewLayer];

    [self.captureSession startRunning];
}
Also see:
Live camera in UIImageView
https://developer.apple.com/library/ios/documentation/AVFoundation/Reference/AVCaptureVideoPreviewLayer_Class/Reference/Reference.html
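Since the question asks specifically for landscape, note that the preview layer's connection orientation can be set once the session is configured. A minimal sketch, assuming the captureVideoPreviewLayer property from the snippet above (landscape-right is an arbitrary choice; the connection property is available on AVCaptureVideoPreviewLayer from iOS 6):
// Lock the preview to landscape (sketch; landscape-right is an assumption)
AVCaptureConnection *previewConnection = self.captureVideoPreviewLayer.connection;
if ([previewConnection isVideoOrientationSupported]) {
    previewConnection.videoOrientation = AVCaptureVideoOrientationLandscapeRight;
}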

Related

Zooming while capturing video using AVCapture in iOS

I am using AVCapture to capture video and save it, but I need to provide a zooming option, like pinch-to-zoom or a zoom button. Also, the video should be saved exactly as it is displayed: when zoomed in, it should be saved zoomed. Any help or link is appreciated. My code for setting up the AVCapture session is:
- (void)setupAVCapture
{
    session = [[AVCaptureSession alloc] init];
    session.automaticallyConfiguresApplicationAudioSession = YES;
    [session beginConfiguration];
    session.sessionPreset = AVCaptureSessionPresetMedium;

    AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    captureVideoPreviewLayer.frame = self.view.bounds;
    [self.view.layer addSublayer:captureVideoPreviewLayer];

    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!input) {
        // Handle the error appropriately.
        NSLog(@"ERROR: trying to open camera: %@", error);
    }
    [session addInput:input];

    movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
    [session addOutput:movieFileOutput];

    [session commitConfiguration];
    [session startRunning];
}
I faced the same problem, and I solved it using these two steps:
1. Add a UIPinchGestureRecognizer handler, something like this, in your camera preview view controller:
- (IBAction)handlePinchGesture:(UIPinchGestureRecognizer *)gestureRecognizer
{
    if ([gestureRecognizer isMemberOfClass:[UIPinchGestureRecognizer class]]) {
        effectiveScale = beginGestureScale * gestureRecognizer.scale;
        if (effectiveScale < 1.0)
            effectiveScale = 1.0;
        CGFloat maxScaleAndCropFactor = [[self.stillImageOutput connectionWithMediaType:AVMediaTypeVideo] videoMaxScaleAndCropFactor];
        if (effectiveScale > maxScaleAndCropFactor)
            effectiveScale = maxScaleAndCropFactor;
        [CATransaction begin];
        [CATransaction setAnimationDuration:.025];
        [self.previewView.layer setAffineTransform:CGAffineTransformMakeScale(effectiveScale, effectiveScale)];
        [CATransaction commit];
        if ([[self videoDevice] lockForConfiguration:nil]) {
            [[self videoDevice] setVideoZoomFactor:effectiveScale];
            [[self videoDevice] unlockForConfiguration];
        }
    }
}
Note that the key method for persisting the zoom level on the video device is -setVideoZoomFactor:.
2. In the IBAction of the record button, add this code to capture the video (recording), then save the recorded video to a certain path with a certain name:
- (IBAction)recordButtonClicked:(id)sender
{
    dispatch_async([self sessionQueue], ^{
        if (![[self movieFileOutput] isRecording]) {
            [self setLockInterfaceRotation:YES];
            if ([[UIDevice currentDevice] isMultitaskingSupported]) {
                // Set up a background task. This is needed because the
                // captureOutput:didFinishRecordingToOutputFileAtURL: callback is not
                // received until the app returns to the foreground unless you request
                // background execution time. This also ensures that there is time to
                // write the file to the assets library when the app is backgrounded.
                // To conclude this background execution, -endBackgroundTask is called
                // in -recorder:recordingDidFinishToOutputFileURL:error: after the
                // recorded file has been saved.
                [self setBackgroundRecordingID:[[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:nil]];
            }
            // Update the orientation on the movie file output video connection before starting recording.
            // Start recording to a temporary file.
            NSString *outputFilePath = [NSTemporaryDirectory() stringByAppendingPathComponent:[@"movie" stringByAppendingPathExtension:@"mov"]];
            [[self movieFileOutput] startRecordingToOutputFileURL:[NSURL fileURLWithPath:outputFilePath] recordingDelegate:self];
        }
        else {
            [[self movieFileOutput] stopRecording];
        }
    });
}
I hope that helps.
Add a UIPinchGestureRecognizer to your view and handle its callback like this:
- (void)zoomPinchGestureRecognizerAction:(UIPinchGestureRecognizer *)sender
{
    static CGFloat initialVideoZoomFactor = 0;
    if (sender.state == UIGestureRecognizerStateBegan) {
        initialVideoZoomFactor = _captureDevice.videoZoomFactor;
    } else {
        CGFloat scale = MIN(MAX(1, initialVideoZoomFactor * sender.scale), 4);
        [CATransaction begin];
        [CATransaction setAnimationDuration:0.01];
        _previewLayer.affineTransform = CGAffineTransformMakeScale(scale, scale);
        [CATransaction commit];
        if ([_captureDevice lockForConfiguration:nil] == YES) {
            _captureDevice.videoZoomFactor = scale;
            [_captureDevice unlockForConfiguration];
        }
    }
}

How do I dismiss a UIView after scanning a barcode?

I have an iPad app that I want to add a barcode reader to. This is the initialization code for the barcode scanner:
- (void)scanInitializationCode
{
    _highlightView = [[UIView alloc] init];
    _highlightView.autoresizingMask = UIViewAutoresizingFlexibleTopMargin | UIViewAutoresizingFlexibleLeftMargin | UIViewAutoresizingFlexibleRightMargin | UIViewAutoresizingFlexibleBottomMargin;
    _highlightView.layer.borderColor = [UIColor greenColor].CGColor;
    _highlightView.layer.borderWidth = 3;
    [self.view addSubview:_highlightView];

    // define the label to display the results of the scan
    _label = [[UILabel alloc] init];
    _label.frame = CGRectMake(0, self.view.bounds.size.height - 40, self.view.bounds.size.width, 40);
    _label.autoresizingMask = UIViewAutoresizingFlexibleTopMargin;
    _label.backgroundColor = [UIColor colorWithWhite:0.15 alpha:0.65];
    _label.textColor = [UIColor whiteColor];
    _label.textAlignment = NSTextAlignmentCenter;
    _label.text = @"(none)";
    [self.view addSubview:_label];

    // session initialization
    _session = [[AVCaptureSession alloc] init];
    _device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;

    // define the input device
    _input = [AVCaptureDeviceInput deviceInputWithDevice:_device error:&error];
    if (_input) {
        [_session addInput:_input];
    } else {
        NSLog(@"Error: %@", error);
    }

    // and output device
    _output = [[AVCaptureMetadataOutput alloc] init];
    [_output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
    [_session addOutput:_output];
    _output.metadataObjectTypes = [_output availableMetadataObjectTypes];

    // and preview layer
    _prevLayer = [AVCaptureVideoPreviewLayer layerWithSession:_session];
    _prevLayer.frame = self.view.bounds;
    _prevLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [self.view.layer addSublayer:_prevLayer];
}
This is the AVCaptureMetadataOutputObjectsDelegate code:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects fromConnection:(AVCaptureConnection *)connection
{
    CGRect highlightViewRect = CGRectZero;
    AVMetadataMachineReadableCodeObject *barCodeObject;
    NSString *detectionString = nil;
    NSArray *barCodeTypes = @[AVMetadataObjectTypeEAN13Code];
    for (AVMetadataObject *metadata in metadataObjects) {
        for (NSString *type in barCodeTypes) {
            if ([metadata.type isEqualToString:type]) {
                barCodeObject = (AVMetadataMachineReadableCodeObject *)[_prevLayer transformedMetadataObjectForMetadataObject:(AVMetadataMachineReadableCodeObject *)metadata];
                highlightViewRect = barCodeObject.bounds;
                detectionString = [(AVMetadataMachineReadableCodeObject *)metadata stringValue];
                break;
            }
        }
        if (detectionString != nil) {
            _label.text = detectionString;
            oISBNField.text = detectionString; // move detectionString to ISBN textbox
            [_session stopRunning];
            [_highlightView removeFromSuperview];
            break;
        } else {
            _label.text = @"(none)";
        }
    }
}
This is the code that starts the scanning process by having the user tap a UIButton:
- (IBAction)aReadBarcode:(UIButton *)sender
{
    [self scanInitializationCode];
    [_session startRunning];
    // display the activity
    [self.view bringSubviewToFront:_highlightView];
    [self.view bringSubviewToFront:_label];
    oISBNField.text = scanResults;
}
The problem is that once the scan has found the barcode, the camera view stays visible; what I want is to return to the UIView that has the button that started the scanning (in other words, I want the _highlightView to disappear). I have tried all kinds of "dismissal" methods, even putting it at the back of the z-order, but none of them work. How can I make the highlight view disappear from the screen?
The answer: call [_prevLayer removeFromSuperlayer]; after [_session stopRunning].
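Applied to the delegate method from the question, the teardown could look like this sketch (whether _label should also be removed is an assumption; keep it if you still want the result text on screen):
if (detectionString != nil) {
    _label.text = detectionString;
    oISBNField.text = detectionString;
    [_session stopRunning];
    [_prevLayer removeFromSuperlayer];   // removes the live camera preview
    [_highlightView removeFromSuperview];
    break;
}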

How to add Dynamic Visual effects to running videos in iOS?

I want to change visual effects dynamically on a running video. I am using the GPUImage framework for the visual effects. I downloaded the sample project from Here and chose the SimpleVideoFileFilter example. This example ships with one filter; I modified the code and it currently supports 10 filters. My issue: while a video file is playing in the GPUImageView and I select another filter, the video effect changes, but the video restarts from the beginning. I want to change the filter dynamically for the currently playing video.
My code is:
#pragma mark - Play Video with Effects
- (void)getVideo:(NSURL *)url
{
    movieFile = [[GPUImageMovie alloc] initWithURL:url];
    movieFile.runBenchmark = YES;
    movieFile.playAtActualSpeed = YES;
    // filter = [[GPUImagePixellateFilter alloc] init];
    [movieFile addTarget:filter];

    // Only rotate the video for display; leave orientation the same for recording
    filterView = (GPUImageView *)self.view;
    [filter addTarget:filterView];

    // In addition to displaying to the screen, write out a processed version of the movie to disk
    NSString *pathToMovie = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Movie.m4v"];
    unlink([pathToMovie UTF8String]); // If a file already exists, AVAssetWriter won't let you record new frames, so delete the old movie
    NSURL *movieURL1 = [NSURL fileURLWithPath:pathToMovie];
    movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL1 size:CGSizeMake(640.0, 480.0)];
    [filter addTarget:movieWriter];

    // Configure this for video from the movie file, where we want to preserve all video frames and audio samples
    movieWriter.shouldPassthroughAudio = YES;
    movieFile.audioEncodingTarget = movieWriter;
    [movieFile enableSynchronizedEncodingUsingMovieWriter:movieWriter];
    [movieWriter startRecording];
    [movieFile startProcessing];

    [movieWriter setCompletionBlock:^{
        [filter removeTarget:movieWriter];
        [movieWriter finishRecording];
        UISaveVideoAtPathToSavedPhotosAlbum(pathToMovie, nil, nil, nil);
    }];
}
- (void)event:(UIButton *)sender
{
    [filter removeTarget:filterView];
    UIButton *selectedBtn = sender;
    [movieFile removeTarget:filter];
    switch (selectedBtn.tag) {
        case 0:
            filter = [[GPUImageBrightnessFilter alloc] init];
            break;
        case 1:
            filter = [[GPUImageGrayscaleFilter alloc] init];
            break;
        case 2:
            filter = [[GPUImageSketchFilter alloc] init];
            break;
        case 3:
            filter = [[GPUImageToonFilter alloc] init];
            break;
        case 4:
            filter = [[GPUImageMonochromeFilter alloc] init];
            break;
        case 5:
            filter = [[GPUImagePixellateFilter alloc] init];
            break;
        case 6:
            filter = [[GPUImageCrosshatchFilter alloc] init];
            break;
        case 7:
            filter = [[GPUImageVignetteFilter alloc] init];
            break;
        case 8:
            filter = [[GPUImageColorInvertFilter alloc] init];
            break;
        case 9:
            filter = [[GPUImageLevelsFilter alloc] init];
            [(GPUImageLevelsFilter *)filter setRedMin:1.0 gamma:1.0 max:0.0 minOut:0.5 maxOut:0.5];
            break;
        default:
            break;
    }
    [self getVideo:movieURL];
}
Please help me to resolve this issue.
I found the answer myself. The solution is:
- (void)event:(UIButton *)sender
{
    // isMoviePlayCompleted = NO;
    if (btnTag != sender.tag) {
        btnTag = (int)sender.tag;
        NSLog(@"tag:%d", btnTag);
        [self applyFilter:sender.tag];
    }
}
Applying Filter
- (void)applyFilter:(NSInteger)tag
{
    [[NSFileManager defaultManager] removeItemAtURL:saveTempUrl error:nil];
    recording = NO;
    switch (tag) {
        case 0:
            filter = nil;
            filter = [[GPUImagePixellateFilter alloc] init];
            [(GPUImagePixellateFilter *)filter setFractionalWidthOfAPixel:0.0];
            break;
        case 1:
            filter = nil;
            filter = [[GPUImageGrayscaleFilter alloc] init];
            break;
        case 2:
            filter = nil;
            filter = [[GPUImageSketchFilter alloc] init];
            break;
        case 3:
            filter = nil;
            filter = [[GPUImageToonFilter alloc] init];
            break;
        case 4:
            filter = nil;
            filter = [[GPUImageMonochromeFilter alloc] init];
            break;
        case 5:
            filter = nil;
            filter = [[GPUImageVignetteFilter alloc] init];
            break;
        default:
            break;
    }
    [self getVideo:movieURL];
}
Play Video with Effects
- (void)getVideo:(NSURL *)url
{
    [filter removeAllTargets];
    movieFile.audioEncodingTarget = nil;
    [movieWriter cancelRecording];
    [movieFile cancelProcessing];
    [movieWriter finishRecording];
    movieWriter = nil;
    movieFile = nil;
    filterView = nil;
    recording = YES;

    anAsset = [[AVURLAsset alloc] initWithURL:url options:nil];
    movieFile = [[GPUImageMovie alloc] initWithURL:url];
    movieFile.delegate = self;
    movieFile.runBenchmark = NO;
    movieFile.playAtActualSpeed = YES;
    [movieFile addTarget:filter];

    // Only rotate the video for display; leave orientation the same for recording
    filterView = (GPUImageView *)self.view;
    [filter addTarget:filterView];

    NSString *pathName = [NSString stringWithFormat:@"Doc.MOV"];
    // In addition to displaying to the screen, write out a processed version of the movie to disk
    NSString *pathToMovie = [NSTemporaryDirectory() stringByAppendingPathComponent:pathName];
    NSFileManager *fileTmp = [[NSFileManager alloc] init];
    if ([fileTmp fileExistsAtPath:pathToMovie]) {
        [fileTmp removeItemAtPath:pathToMovie error:nil];
    }
    unlink([pathToMovie UTF8String]); // If a file already exists, AVAssetWriter won't let you record new frames, so delete the old movie
    saveTempUrl = [NSURL fileURLWithPath:pathToMovie];
    movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:saveTempUrl size:size];
    [filter addTarget:movieWriter];
    [movieFile enableSynchronizedEncodingUsingMovieWriter:movieWriter];
    [movieWriter startRecording];
    [movieFile startProcessing];

    __unsafe_unretained typeof(self) weakSelf = self;
    [weakSelf->movieWriter setCompletionBlock:^{
        NSLog(@"write completed");
        [filter removeTarget:movieWriter];
        [movieWriter finishRecording];
        movieWriter = nil;
        movieFile = nil;
        filterView = nil;
        recording = NO;
        if (saveFilter) {
            saveFilter = NO;
            UISaveVideoAtPathToSavedPhotosAlbum([saveTempUrl path], self, @selector(video:didFinishSavingWithError:contextInfo:), nil);
            shareFilter = YES;
        }
    }];
}
That's it. Now when I choose any filter, it is applied fresh, so the memory issue is solved and it works fine in my application.
// Use this code
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(movieFinished) name:MPMoviePlayerPlaybackDidFinishNotification object:videoPlayer];
[videoPlayer play];

- (void)movieFinished
{
    [videoPlayer play];
}
- (void)playTheVideo:(NSURL *)videoURL
{
    NSTimeInterval time = videoPlayer.currentPlaybackTime;
    UIView *parentView = imageViewFiltered; // adjust as needed
    CGRect bounds = parentView.bounds; // get bounds of parent view
    CGRect subviewFrame = CGRectInset(bounds, 0, 0);
    videoPlayer.view.frame = subviewFrame;
    videoPlayer.view.autoresizingMask = (UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight);
    [parentView addSubview:videoPlayer.view];
    videoPlayer.contentURL = videoURL;
    [videoPlayer setCurrentPlaybackTime:time];
    [videoPlayer stop];
    NSLog(@"Video player stopped or playing in this view");
    [videoPlayer play];
    self.showLoading = NO;
}

Lazy loading of PhotoLibrary Images

I found an issue with Photo Library images: they do not display the first time in my view; the image views are blank while loading.
I found that the Asset Library block runs on another thread. After reloading my view I can see all the images, but the first time the image views are blank.
Can anyone tell me a good way to deal with this problem?
It works with bundle images.
Also, the console sometimes shows that the app is crashing with:
Program received signal: “0”. Data Formatters temporarily unavailable, will re-try after a 'continue'. (Unknown error loading shared library "/Developer/usr/lib/libXcodeDebuggerSupport.dylib")
My Code:
for (int j = 0; j < 9; j++)
{
    // allocate the container view
    UIView *smallView = [[UIView alloc] initWithFrame:CGRectMake(xCordImage, yCordImage, 200, 190)];

    // allocate the image view
    imageViewTopic = [[[UIImageView alloc] init] autorelease];

    typedef void (^ALAssetsLibraryAssetForURLResultBlock)(ALAsset *asset);
    typedef void (^ALAssetsLibraryAccessFailureBlock)(NSError *error);

    ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset)
    {
        ALAssetRepresentation *rep = [myasset defaultRepresentation];
        CGImageRef iref = [rep fullResolutionImage];
        UIImage *images;
        if (iref) {
            images = [UIImage imageWithCGImage:iref];
        }
        else {
            images = [UIImage imageNamed:@"Nofile.png"];
        }
        imageViewTopic.image = images;
    };

    ALAssetsLibraryAccessFailureBlock failureblock = ^(NSError *myerror)
    {
        imageViewTopic.image = [UIImage imageNamed:@"Nofile.png"];
        NSLog(@"booya, can't get image - %@", [myerror localizedDescription]);
    };

    NSString *string;
    MyClass *obj = [imageFileNameArray objectAtIndex:j];
    // obj.fileName contains the ALAsset URL of an image
    string = obj.fileName;
    NSURL *asseturl = [NSURL URLWithString:string];
    ALAssetsLibrary *assetslibrary = [[[ALAssetsLibrary alloc] init] autorelease];
    [assetslibrary assetForURL:asseturl resultBlock:resultblock failureBlock:failureblock];

    imageViewTopic.userInteractionEnabled = YES;
    imageViewTopic.frame = CGRectMake(0, 0, 200, 150);
    [currentView addSubview:scroller];

    // add the image view to the view hierarchy
    [smallView addSubview:imageViewTopic];
    [myView addSubview:smallView];
    [scroller addSubview:myView];
}
I am using this method to show images in a scroll view with lazy loading, and it works well.
First initialize the value of _j; data is the image data coming from a loop over an array.
dispatch_async(dispatch_get_global_queue(0, 0), ^{
    NSData *data = [[NSData alloc] initWithContentsOfURL:url];
    if (data == nil)
        return;
    dispatch_async(dispatch_get_main_queue(), ^{
        __block int j1 = _j;
        // WARNING: is the cell still using the same data by this point??
        // NSURL *url = [NSURL URLWithString:imageName];
        UIImage *image = [UIImage imageWithData:data];
        image1 = [[UIImageView alloc] initWithFrame:CGRectMake(j1, 10, image.size.width, image.size.height)];
        image1.image = image;
        CALayer *layer = [image1 layer];
        [layer setMasksToBounds:YES];
        [layer setCornerRadius:0.0]; // note that when the radius is 0, the border is a rectangle
        [layer setBorderWidth:3.0];
        [layer setBorderColor:[[UIColor whiteColor] CGColor]];
        [portfolio_scroll addSubview:image1];
    });
});
_j = _j + 320;

Objective-c Changing UIImagePickerController to video mode

I have an application in which I only want to show, in the background, the video feed from the camera. I have the following code in my view controller:
#if !TARGET_IPHONE_SIMULATOR
    imagePickerController = [[UIImagePickerController alloc] initWithRootViewController:self];
    imagePickerController.delegate = self;
    imagePickerController.sourceType = UIImagePickerControllerSourceTypeCamera;
    imagePickerController.navigationBarHidden = YES;
    imagePickerController.toolbarHidden = NO;
    imagePickerController.showsCameraControls = NO;
    //...
    [self.view addSubview:self.imagePickerController.view];
    [imagePickerController viewWillAppear:YES];
    [imagePickerController viewDidAppear:YES];
#endif
//...
[self.view addSubview:otherthings];
Then I add other views on top, and I have sounds too. However, when I changed the image picker's mode to video, it freezes whenever a sound plays. Here's what I changed:
#if !TARGET_IPHONE_SIMULATOR
    imagePickerController = [[UIImagePickerController alloc] init]; // initWithRootViewController:self];
    imagePickerController.delegate = self;
    imagePickerController.sourceType = UIImagePickerControllerSourceTypeCamera;
    NSArray *mediaTypes = [UIImagePickerController availableMediaTypesForSourceType:UIImagePickerControllerSourceTypeCamera];
    NSArray *videoMediaTypesOnly = [mediaTypes filteredArrayUsingPredicate:[NSPredicate predicateWithFormat:@"(SELF contains %@)", @"movie"]];
    BOOL movieOutputPossible = (videoMediaTypesOnly != nil);
    if (movieOutputPossible) {
        imagePickerController.mediaTypes = videoMediaTypesOnly;
        imagePickerController.videoQuality = UIImagePickerControllerQualityTypeHigh;
        imagePickerController.navigationBarHidden = YES;
        imagePickerController.toolbarHidden = YES;
        imagePickerController.showsCameraControls = NO;
    }
#endif
Does anyone know why the camera picker freezes when a sound plays? The sound is an AVAudioPlayer, by the way.
Solution: use AVFoundation instead of UIImagePickerController.
videoBackground = [[UIView alloc] initWithFrame:CGRectMake(0.0, 0.0, 320.0, 480.0)];

AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetMedium;

CALayer *viewLayer = videoBackground.layer;
NSLog(@"viewLayer = %@", viewLayer);

AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
captureVideoPreviewLayer.frame = videoBackground.bounds;
[videoBackground.layer addSublayer:captureVideoPreviewLayer];

AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (!input) {
    // Handle the error appropriately.
    NSLog(@"ERROR: trying to open camera: %@", error);
}
[session addInput:input];
[session startRunning];
[self.view addSubview:videoBackground];