iPad capturing 16:9 photos - objective-c

I am building a prototype app on iOS, and I’m cannibalizing some Apple sample code to do it (thin ice, I know—this code uses goto statements :\ ). I am using the AVCam project from Session 520 - What's New in Camera Capture. I don’t need video capture capability, just still photos.
The device inputs and outputs are set up thusly:
// Init the device inputs
AVCaptureDeviceInput *newVideoInput = [[AVCaptureDeviceInput alloc] initWithDevice:[self backFacingCamera] error:nil];
AVCaptureDeviceInput *newAudioInput = [[AVCaptureDeviceInput alloc] initWithDevice:[self audioDevice] error:nil];

// Setup the still image file output
AVCaptureStillImageOutput *newStillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = @{AVVideoCodecKey : AVVideoCodecJPEG};
[newStillImageOutput setOutputSettings:outputSettings];

// Create session (use default AVCaptureSessionPresetHigh)
AVCaptureSession *newCaptureSession = [[AVCaptureSession alloc] init];

// Add inputs and output to the capture session
if ([newCaptureSession canAddInput:newVideoInput]) {
    [newCaptureSession addInput:newVideoInput];
}
if ([newCaptureSession canAddInput:newAudioInput]) {
    [newCaptureSession addInput:newAudioInput];
}
if ([newCaptureSession canAddOutput:newStillImageOutput]) {
    [newCaptureSession addOutput:newStillImageOutput];
}

[self setStillImageOutput:newStillImageOutput];
[self setVideoInput:newVideoInput];
[self setAudioInput:newAudioInput];
[self setSession:newCaptureSession];
And here is the method that’s called when I tap the shutter button:
- (void)captureStillImage
{
    AVCaptureConnection *stillImageConnection = [[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo];
    if ([stillImageConnection isVideoOrientationSupported])
        [stillImageConnection setVideoOrientation:orientation];

    [[self stillImageOutput]
        captureStillImageAsynchronouslyFromConnection:stillImageConnection
        completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
            ALAssetsLibraryWriteImageCompletionBlock completionBlock = ^(NSURL *assetURL, NSError *error) {
                if (error)
                {
                    if ([[self delegate] respondsToSelector:@selector(captureManager:didFailWithError:)])
                    {
                        [[self delegate] captureManager:self didFailWithError:error];
                    }
                }
            };

            if (imageDataSampleBuffer != NULL)
            {
                NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
                ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
                UIImage *image = [[UIImage alloc] initWithData:imageData];

                if ([self.delegate respondsToSelector:@selector(captureManagerCapturedImage:)])
                {
                    dispatch_async(dispatch_get_main_queue(), ^{
                        [self.delegate captureManagerCapturedImage:image];
                    });
                }

                [library writeImageToSavedPhotosAlbum:[image CGImage]
                                          orientation:(ALAssetOrientation)[image imageOrientation]
                                      completionBlock:completionBlock];
            }
            else
            {
                completionBlock(nil, error);
            }

            if ([[self delegate] respondsToSelector:@selector(captureManagerStillImageCaptured:)])
            {
                [[self delegate] captureManagerStillImageCaptured:self];
            }
        }];
}
This code successfully captures an image and saves it to the library. However, at some point while I was working on it, it changed from capturing 5-megapixel 4:3 images to capturing 1920x1080 16:9 images. I can’t find anywhere that the aspect ratio is specified, and I didn’t change any of the code relating to the configuration of the camera, capture sessions, or capture connection. Why did my camera start taking 16:9 photos?
Update: I just re-ran Apple's original sample code, and it appears that it is also saving 16:9 images captured directly from the video feed. It is quite possible that I was insane before, or I took a test shot with Camera.app and was looking at that. So my real question is: how do I show a live feed from the camera on the screen while I'm shooting, and take a full-resolution photo? I can't use UIImagePickerController, because I need to be able to overlay things on top of the live camera feed.
Update 2: I was able to solve this by throwing out the AVCapture code I was using. It turns out that UIImagePickerController does what I needed. I didn’t realize you could overlay custom controls - I thought it took over the whole screen until you were done taking a picture.
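For anyone else who lands here: the pieces that make the overlay approach work are showsCameraControls, cameraOverlayView, and takePicture. A minimal sketch, with overlayView standing in for whatever custom controls you build (this is not code from the original project):
// Full-resolution stills with a custom overlay on the live camera feed.
UIImagePickerController *picker = [[UIImagePickerController alloc] init];
picker.sourceType = UIImagePickerControllerSourceTypeCamera;
picker.showsCameraControls = NO;  // hide the stock shutter and controls
UIView *overlayView = [[UIView alloc] initWithFrame:picker.view.bounds];
// ...add custom buttons to overlayView; have one of them call [picker takePicture]...
picker.cameraOverlayView = overlayView;
picker.delegate = self;  // conform to UIImagePickerControllerDelegate and UINavigationControllerDelegate
[self presentViewController:picker animated:YES completion:nil];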

If you're capturing frames from a video source, you'll end up with 16:9 frames at the video resolution (e.g. 1920x1080), not the sensor's full photo resolution. Capturing frames from a video stream and taking photos are different things.
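If you do want to stay with AVFoundation, the usual approach is to run the session at the photo preset, so stills come out at full 4:3 sensor resolution while the preview layer keeps showing a live feed. A minimal sketch (not part of the original AVCam sample):
AVCaptureSession *session = [[AVCaptureSession alloc] init];
// The Photo preset delivers full-resolution 4:3 stills; the default
// High preset configures the session for 16:9 video-sized frames.
if ([session canSetSessionPreset:AVCaptureSessionPresetPhoto]) {
    [session setSessionPreset:AVCaptureSessionPresetPhoto];
}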

Related

Display camera feed in UIView in iOS

I have an iOS app with a simple UIView placed in the view controller. I am trying to show the live feed from the front-facing camera in the UIView. I am not trying to take a picture or record a video; I simply want to show the live feed in a UIView.
I have tried to implement AVCaptureVideoPreviewLayer, however the feed I get is blank. Nothing seems to happen. Here is my code:
AVCaptureSession *session = [[AVCaptureSession alloc] init];
[session setSessionPreset:AVCaptureSessionPresetPhoto];
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input;
@try {
    input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
} @catch (NSException *exception) {
    NSLog(@"Error: %@", error);
} @finally {
    if (error == nil) {
        if ([session canAddInput:input]) {
            [session addInput:input];
            AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
            [stillImageOutput setOutputSettings:@{AVVideoCodecKey : AVVideoCodecJPEG}];
            if ([session canAddOutput:stillImageOutput]) {
                [session setSessionPreset:AVCaptureSessionPresetHigh];
                [session addOutput:stillImageOutput];
                AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
                [previewLayer setVideoGravity:AVLayerVideoGravityResizeAspect];
                [previewLayer.connection setVideoOrientation:AVCaptureVideoOrientationPortrait];
                [backgroundStreamView.layer addSublayer:previewLayer];
                [session startRunning];
                NSLog(@"session running");
            } else {
                NSLog(@"cannot add output");
            }
        } else {
            NSLog(@"cannot add input");
        }
    } else {
        NSLog(@"general error: %@", error);
    }
}
The session runs perfectly fine, however no video feed is shown. What am I doing wrong?
Managed to fix it myself; it turned out to be a fairly simple issue. I didn't set the frame of the AVCaptureVideoPreviewLayer, and as a result it was not appearing (presumably because it defaults to a zero-sized frame).
To fix this I set the frame to match the frame of my custom UIView:
[previewLayer setFrame:backgroundStreamView.bounds];
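If the hosting view can change size (rotation, Auto Layout), it may also be worth re-applying the frame whenever the view lays out. A small sketch, assuming you keep the layer in an ivar:
// Keep the preview layer in sync with the view's bounds after layout changes.
- (void)viewDidLayoutSubviews {
    [super viewDidLayoutSubviews];
    previewLayer.frame = backgroundStreamView.bounds;
}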
Deprecation code fix
AVCaptureStillImageOutput is also deprecated (as of iOS 10), so to fix that, I replaced it with the AVCapturePhotoOutput class. Thus the code changed from:
AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
[stillImageOutput setOutputSettings:@{AVVideoCodecKey : AVVideoCodecJPEG}];
to the following:
AVCapturePhotoOutput *stillImageOutput = [[AVCapturePhotoOutput alloc] init];
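AVCapturePhotoOutput no longer takes output settings up front; instead you pass an AVCapturePhotoSettings object with each capture and receive the result through a delegate. A minimal sketch of the capture call, using the iOS 11+ delegate method (check availability against your deployment target):
// Trigger a capture; self must conform to AVCapturePhotoCaptureDelegate.
AVCapturePhotoSettings *settings =
    [AVCapturePhotoSettings photoSettingsWithFormat:@{AVVideoCodecKey : AVVideoCodecTypeJPEG}];
[stillImageOutput capturePhotoWithSettings:settings delegate:self];

// Delegate callback (iOS 11+):
- (void)captureOutput:(AVCapturePhotoOutput *)output
didFinishProcessingPhoto:(AVCapturePhoto *)photo
                error:(NSError *)error
{
    if (error) { return; }
    NSData *imageData = [photo fileDataRepresentation]; // JPEG data
    // ...use imageData...
}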

UIActivityViewController progress & result

I am using UIActivityViewController to save a bunch of video assets to the user's camera roll, but the problem is there is no way to know whether the save to the photo library was successful, nor to get an error code if it was unsuccessful. Is there any way to override the default behavior of the built-in activity? The completionHandler of UIActivityViewController is pretty useless in this regard.
Save the videos yourself with ALAssetsLibrary and use its completionBlock to find out whether each save succeeded.
ALAssetsLibrary *lib = [[[ALAssetsLibrary alloc] init] autorelease];
if ([lib videoAtPathIsCompatibleWithSavedPhotosAlbum:videoURL]) {
    [lib writeVideoAtPathToSavedPhotosAlbum:videoURL
                            completionBlock:^(NSURL *assetURL, NSError *error) {
        if (!error)
        {
            [self performSelectorOnMainThread:@selector(dismissAlertView) withObject:nil waitUntilDone:NO];
        }
        else
        {
            // The save failed; error carries the reason, which is what the question asks for.
            NSLog(@"Saving video failed: %@", error);
        }
    }];
}

- (void)dismissAlertView
{
    // Dismiss your alert view here.
}
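If you can target iOS 8+, the Photos framework gives you the same completion-with-error pattern without the deprecated ALAssetsLibrary; a hedged sketch (requires photo library authorization):
#import <Photos/Photos.h>

[[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
    [PHAssetChangeRequest creationRequestForAssetFromVideoAtFileURL:videoURL];
} completionHandler:^(BOOL success, NSError *error) {
    if (!success) {
        NSLog(@"Saving video failed: %@", error); // the error code the question asks for
    }
}];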

Video Recording with zoom in/out functionality [closed]

Is there any framework in iOS that provides the ability to record a video with zoom in/out functionality? I tried the AVFoundation framework but didn't have any success.
I am using the following code to record a video:
- (void)startCamera
{
    NSLog(@"Setting up capture session");
    CaptureSession = [[AVCaptureSession alloc] init];

    //----- ADD INPUTS -----
    NSLog(@"Adding video input");

    //ADD VIDEO INPUT
    AVCaptureDevice *VideoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if (VideoDevice)
    {
        NSError *error;
        VideoInputDevice = [AVCaptureDeviceInput deviceInputWithDevice:VideoDevice error:&error];
        if (!error)
        {
            if ([CaptureSession canAddInput:VideoInputDevice])
                [CaptureSession addInput:VideoInputDevice];
            else
                NSLog(@"Couldn't add video input");
        }
        else
        {
            NSLog(@"Couldn't create video input");
        }
    }
    else
    {
        NSLog(@"Couldn't create video capture device");
    }

    //ADD AUDIO INPUT
    NSLog(@"Adding audio input");
    AVCaptureDevice *audioCaptureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
    NSError *error = nil;
    AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioCaptureDevice error:&error];
    if (audioInput)
    {
        [CaptureSession addInput:audioInput];
    }

    //----- ADD OUTPUTS -----

    //ADD VIDEO PREVIEW LAYER
    NSLog(@"Adding video preview layer");
    [self setPreviewLayer:[[AVCaptureVideoPreviewLayer alloc] initWithSession:CaptureSession]];
    PreviewLayer.orientation = AVCaptureVideoOrientationPortrait; //<<SET ORIENTATION. You can deliberately set this wrong to flip the image and may actually need to set it wrong to get the right image
    [[self PreviewLayer] setVideoGravity:AVLayerVideoGravityResizeAspectFill];

    //ADD MOVIE FILE OUTPUT
    NSLog(@"Adding movie file output");
    MovieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
    Float64 TotalSeconds = 60;          //Total seconds
    int32_t preferredTimeScale = 30;    //Frames per second
    CMTime maxDuration = CMTimeMakeWithSeconds(TotalSeconds, preferredTimeScale); //<<SET MAX DURATION
    MovieFileOutput.maxRecordedDuration = maxDuration;
    MovieFileOutput.minFreeDiskSpaceLimit = 1024 * 1024; //<<SET MIN FREE SPACE IN BYTES FOR RECORDING TO CONTINUE ON A VOLUME
    if ([CaptureSession canAddOutput:MovieFileOutput])
        [CaptureSession addOutput:MovieFileOutput];

    //SET THE CONNECTION PROPERTIES (output properties)
    [self CameraSetOutputProperties]; //(We call a method as it also has to be done after changing camera)

    //----- SET THE IMAGE QUALITY / RESOLUTION -----
    //Options:
    //  AVCaptureSessionPresetHigh - Highest recording quality (varies per device)
    //  AVCaptureSessionPresetMedium - Suitable for WiFi sharing (actual values may change)
    //  AVCaptureSessionPresetLow - Suitable for 3G sharing (actual values may change)
    //  AVCaptureSessionPreset640x480 - 640x480 VGA (check it's supported before setting it)
    //  AVCaptureSessionPreset1280x720 - 1280x720 720p HD (check it's supported before setting it)
    //  AVCaptureSessionPresetPhoto - Full photo resolution (not supported for video output)
    NSLog(@"Setting image quality");
    [CaptureSession setSessionPreset:AVCaptureSessionPresetMedium];
    if ([CaptureSession canSetSessionPreset:AVCaptureSessionPreset640x480]) //Check size-based configs are supported before setting them
        [CaptureSession setSessionPreset:AVCaptureSessionPreset640x480];

    //----- DISPLAY THE PREVIEW LAYER -----
    //Display it full screen under our view controller's existing controls
    NSLog(@"Display the preview layer");
    CGRect layerRect;
    layerRect = CGRectMake(284, 0, 200, 200);
    [PreviewLayer setBounds:layerRect];
    [PreviewLayer setPosition:CGPointMake(CGRectGetMidX(layerRect),
                                          CGRectGetMidY(layerRect))];

    //[[[self view] layer] addSublayer:[[self CaptureManager] previewLayer]];
    //We use this instead so it goes on a layer behind our UI controls (avoids us having to manually bring each control to the front):
    UIView *CameraView = [[UIView alloc] init];
    [[self view] addSubview:CameraView];
    [self.view sendSubviewToBack:CameraView];
    [[CameraView layer] addSublayer:PreviewLayer];

    //----- START THE CAPTURE SESSION RUNNING -----
    [CaptureSession startRunning];
}
//********** CAMERA SET OUTPUT PROPERTIES **********
- (void)CameraSetOutputProperties
{
    //SET THE CONNECTION PROPERTIES (output properties)
    AVCaptureConnection *CaptureConnection = [MovieFileOutput connectionWithMediaType:AVMediaTypeVideo];

    //Set landscape (if required)
    if ([CaptureConnection isVideoOrientationSupported])
    {
        AVCaptureVideoOrientation orientation = AVCaptureVideoOrientationPortrait; //<<<<<SET VIDEO ORIENTATION IF LANDSCAPE
        [CaptureConnection setVideoOrientation:orientation];
    }

    //Set frame rate (if required)
    CMTimeShow(CaptureConnection.videoMinFrameDuration);
    CMTimeShow(CaptureConnection.videoMaxFrameDuration);
    if (CaptureConnection.supportsVideoMinFrameDuration)
        CaptureConnection.videoMinFrameDuration = CMTimeMake(1, CAPTURE_FRAMES_PER_SECOND);
    if (CaptureConnection.supportsVideoMaxFrameDuration)
        CaptureConnection.videoMaxFrameDuration = CMTimeMake(1, CAPTURE_FRAMES_PER_SECOND);
    CMTimeShow(CaptureConnection.videoMinFrameDuration);
    CMTimeShow(CaptureConnection.videoMaxFrameDuration);
}
//********** DID FINISH RECORDING TO OUTPUT FILE AT URL **********
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
      fromConnections:(NSArray *)connections
                error:(NSError *)error
{
    NSLog(@"didFinishRecordingToOutputFileAtURL - enter");
    BOOL RecordedSuccessfully = YES;
    if ([error code] != noErr)
    {
        // A problem occurred: find out if the recording was successful.
        id value = [[error userInfo] objectForKey:AVErrorRecordingSuccessfullyFinishedKey];
        if (value)
        {
            RecordedSuccessfully = [value boolValue];
        }
    }
    if (RecordedSuccessfully)
    {
        //----- RECORDED SUCCESSFULLY -----
        NSLog(@"didFinishRecordingToOutputFileAtURL - success");
        ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
        if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:outputFileURL])
        {
            [library writeVideoAtPathToSavedPhotosAlbum:outputFileURL
                                        completionBlock:^(NSURL *assetURL, NSError *error)
            {
                if (error)
                {
                    // Handle the save error here.
                }
            }];
        }
    }
}
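As for the zoom in/out part of the question: on iOS 7 and later, AVFoundation exposes zoom directly through AVCaptureDevice's videoZoomFactor, and it applies to both the preview layer and the recorded movie. A minimal sketch, reusing VideoDevice from the code above:
// Zoom while recording (iOS 7+); clamp to the device's supported range.
NSError *zoomError = nil;
CGFloat requestedZoom = 2.0;
CGFloat maxZoom = VideoDevice.activeFormat.videoMaxZoomFactor;
if ([VideoDevice lockForConfiguration:&zoomError]) {
    VideoDevice.videoZoomFactor = MIN(requestedZoom, maxZoom);
    // Or animate the change: [VideoDevice rampToVideoZoomFactor:requestedZoom withRate:1.0];
    [VideoDevice unlockForConfiguration];
}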

How can I release a UIImage which was fetched from ALAssetsLibrary

I've spent two whole weeks just trying to resolve this problem. So frustrating!
The following two functions are what I'm using to fetch an image from the device library.
If I call the setImage function multiple times, I keep losing free memory on my iOS device.
I think the [imageFromAsset initWithCGImage:[[myasset defaultRepresentation] fullScreenImage]] call in the assetImage function causes the problem.
Can you guys help me? Any clues or thoughts would be SUPER appreciated! Please!
- (void)setImage:(NSURL *)imageURL
{
    UIImage *imageFromDeviceLibrary = [[UIImage alloc] init];
    [[DevicePhotoControl sharedInstance] assetImage:[imageURL absoluteString] imageToStore:imageFromDeviceLibrary];
    UIImageView *fullImageView = [[UIImageView alloc] initWithImage:imageFromDeviceLibrary];
    [imageFromDeviceLibrary release];
    [self.view addSubview:fullImageView];
    [fullImageView release];
}

- (void)assetImage:(NSString *)assetURL imageToStore:(UIImage *)imageFromAsset
{
    // Handle the exception case where the URL doesn't have the assets-library scheme
    if ([assetURL hasPrefix:@"assets-library:"] == NO) {
        assetURL = [NSString stringWithFormat:@"%@%@", @"assets-library:", assetURL];
    }
    __block BOOL busy = YES;
    ALAssetsLibrary *assetslibrary = [[[ALAssetsLibrary alloc] init] autorelease];
    // Get image data by URL
    ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset)
    {
        [imageFromAsset initWithCGImage:[[myasset defaultRepresentation] fullScreenImage]];
        busy = NO;
    };
    ALAssetsLibraryAccessFailureBlock failureblock = ^(NSError *myerror)
    {
        NSLog(@"Library Image Fetching Failed : %@", [myerror localizedDescription]);
        busy = NO;
    };
    [assetslibrary assetForURL:[NSURL URLWithString:assetURL]
                   resultBlock:resultblock
                  failureBlock:failureblock];
    while (busy == YES) {
        [[NSRunLoop currentRunLoop] runMode:NSDefaultRunLoopMode beforeDate:[NSDate distantFuture]];
    }
}
The moment the ALAssetsLibrary is released, all the asset objects fetched from it go with it.
I suggest that you create your ALAssetsLibrary in the app delegate to keep it alive, and only reset it when you receive the ALAssetsLibraryChangedNotification change notification from the library. See here; it may help you.
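Beyond the library's lifetime, re-sending initWithCGImage: to an already-initialized UIImage is itself unsupported and can leak. A cleaner shape is to hand the image out through a completion block. A rough sketch, where sharedLibrary stands for the long-lived ALAssetsLibrary suggested above:
- (void)imageForAssetURL:(NSURL *)assetURL completion:(void (^)(UIImage *image))completion
{
    [sharedLibrary assetForURL:assetURL
                   resultBlock:^(ALAsset *asset) {
                       // imageWithCGImage: returns an autoreleased image; no manual init dance.
                       UIImage *image = [UIImage imageWithCGImage:[[asset defaultRepresentation] fullScreenImage]];
                       completion(image);
                   }
                  failureBlock:^(NSError *error) {
                      NSLog(@"Library image fetch failed: %@", [error localizedDescription]);
                      completion(nil);
                  }];
}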

zXing (iOS version) Black/White screen error

I'm working on an application that uses the zXing library to read QR codes. I have a problem with ZXingWidgetController: when the view is shown while the application is in the background/not active (e.g. the screen is locked), the image from the camera is not shown on screen; only the background is visible, and the scanner does not seem to be working.
When I call the initCapture method again, the video from the camera appears after a short delay, but then I need to reinitialize the scanner every time the application loses activity, which is not comfortable at all.
This bug can be reproduced in almost every application that uses zXing, so I suppose it is a zXing bug.
The zXing initCapture method code is:
- (void)initCapture {
#if HAS_AVFF
    AVCaptureDeviceInput *captureInput =
        [AVCaptureDeviceInput deviceInputWithDevice:
                  [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo]
                                              error:nil];
    if (!captureInput)
    {
        NSLog(@"ERROR - CaptureInputNotInitialized");
    }

    AVCaptureVideoDataOutput *captureOutput = [[AVCaptureVideoDataOutput alloc] init];
    captureOutput.alwaysDiscardsLateVideoFrames = YES;
    if (!captureOutput)
    {
        NSLog(@"ERROR - CaptureOutputNotInitialized");
    }

    [captureOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];

    NSString *key = (NSString *)kCVPixelBufferPixelFormatTypeKey;
    NSNumber *value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
    NSDictionary *videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
    [captureOutput setVideoSettings:videoSettings];

    self.captureSession = [[[AVCaptureSession alloc] init] autorelease];
    self.captureSession.sessionPreset = AVCaptureSessionPresetMedium; // 480x360 on a 4

    if ([self.captureSession canAddInput:captureInput])
    {
        [self.captureSession addInput:captureInput];
    }
    else
    {
        NSLog(@"ERROR - cannot add input");
    }
    if ([self.captureSession canAddOutput:captureOutput])
    {
        [self.captureSession addOutput:captureOutput];
    }
    else
    {
        NSLog(@"ERROR - cannot add output");
    }
    [captureOutput release];

    if (!self.prevLayer)
    {
        [self.prevLayer release];
    }
    self.prevLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
    // NSLog(@"prev %p %@", self.prevLayer, self.prevLayer);
    self.prevLayer.frame = self.view.bounds;
    self.prevLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [self.view.layer addSublayer:self.prevLayer];

    [self.captureSession startRunning];
#endif
}
Maybe you guys know what is wrong?
I don't understand your question. If the application is in the background/not active, of course it can't keep working. You should make the question clearer.
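If re-running initCapture after the app becomes active again is what works for you, you can at least automate that workaround by listening for the app-state notification; a minimal sketch:
// Re-run the capture setup whenever the app becomes active again.
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(initCapture)
                                             name:UIApplicationDidBecomeActiveNotification
                                           object:nil];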