iOS full screen camera overlay - objective-c

I am new to Objective-C and have been searching for an answer to this question for a while...
I am creating a photo-sharing app and I want the camera to be full-screen. I have used the following code to hide the default camera overlay, create my own, and make the camera full-screen. On the next screen, I want to display the captured image full-screen again, but the image is getting squashed sideways.
Setting the camera overlay:
-(void)loadCamera {
    self.imagePicker = [[UIImagePickerController alloc] init];
    self.imagePicker.sourceType = UIImagePickerControllerSourceTypeCamera;
    self.imagePicker.delegate = self;
    self.imagePicker.cameraCaptureMode = UIImagePickerControllerCameraCaptureModePhoto;
    self.imagePicker.showsCameraControls = NO;

    self.overlayIV = [[WAUCameraOverlayView alloc] initWithFrame:self.imagePicker.cameraOverlayView.frame];
    [self.imagePicker.cameraOverlayView addSubview:self.overlayIV];

    // This slots the preview exactly in the middle of the screen by moving it down 71 points
    CGAffineTransform translate = CGAffineTransformMakeTranslation(0.0, 71.0);
    self.imagePicker.cameraViewTransform = translate;

    CGAffineTransform scale = CGAffineTransformScale(translate, 1.333333, 1.333333);
    self.imagePicker.cameraViewTransform = scale;

    [self presentViewController:self.imagePicker animated:YES completion:nil];
}
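(An aside on the magic numbers, not from the original post: 71 and 1.333333 are what you get for a 4:3 camera preview on a 320x568-point screen, i.e. a 4-inch iPhone. A sketch that derives them instead of hard-coding, assuming the default preview is 4:3:)

// Sketch, not from the original post: derive the translate/scale values
// instead of hard-coding them. Assumes the default camera preview is 4:3.
CGSize screen = [UIScreen mainScreen].bounds.size;
CGFloat previewHeight = screen.width * 4.0 / 3.0;              // 426.67 on a 320-pt-wide screen
CGFloat verticalShift = (screen.height - previewHeight) / 2.0; // ~71 on a 4-inch iPhone
CGFloat fillScale = screen.height / previewHeight;             // ~1.333 on a 4-inch iPhone
CGAffineTransform transform = CGAffineTransformMakeTranslation(0.0, verticalShift);
transform = CGAffineTransformScale(transform, fillScale, fillScale);
self.imagePicker.cameraViewTransform = transform;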
That loadCamera code works great and makes the camera full-screen with my custom overlay. Then, this block of code gets executed:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    [self.overlayIV removeFromSuperview];
    UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
}
I pass "image" to the next screen and initialize it to the size of the screen, but it gets stretched... any way to save/display the image captured from the full-screen camera in the exact ratio and size?
EDIT: The code I use to display the image is:
UIImageView *capturedImageView = [[UIImageView alloc] initWithFrame:self.frame];
capturedImageView.image = aImage;
capturedImageView.contentMode = UIViewContentModeScaleAspectFill;
[self addSubview:capturedImageView];
I want the image to be the size of the screen, with the same ratio/aspect it was captured at in the camera.
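One approach that would fit here (a hedged sketch, not an answer from the thread): since the full-screen preview effectively shows a center crop of the 4:3 photo, crop the captured UIImage to the screen's aspect ratio before displaying it. The helper name cropImage:toAspectOfSize: is hypothetical:

// Hypothetical helper: center-crop an image to match a target aspect ratio.
// Note: camera photos carry an imageOrientation; a production version would
// map the crop rect into the raw CGImage's (unrotated) coordinate space.
- (UIImage *)cropImage:(UIImage *)image toAspectOfSize:(CGSize)targetSize {
    CGFloat targetRatio = targetSize.width / targetSize.height;
    CGFloat width = image.size.width;
    CGFloat height = image.size.height;
    CGRect cropRect;
    if (width / height > targetRatio) {
        // Image is proportionally wider than the screen: trim the sides.
        CGFloat newWidth = height * targetRatio;
        cropRect = CGRectMake((width - newWidth) / 2.0, 0.0, newWidth, height);
    } else {
        // Image is proportionally taller than the screen: trim top and bottom.
        CGFloat newHeight = width / targetRatio;
        cropRect = CGRectMake(0.0, (height - newHeight) / 2.0, width, newHeight);
    }
    CGImageRef cropped = CGImageCreateWithImageInRect(image.CGImage, cropRect);
    UIImage *result = [UIImage imageWithCGImage:cropped
                                          scale:image.scale
                                    orientation:image.imageOrientation];
    CGImageRelease(cropped);
    return result;
}

After cropping, aspect-fill and aspect-fit coincide, so the view should show exactly what the full-screen preview showed.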

Related

Bug on submitted app binary but not in the simulator - CALayer position contains NaN

I submitted my app to the App Store, where it is ready for download. Since then I've received some interesting crash reports when people select an image from the image picker in one of my views.
This bug (see below) makes the app crash. I was wondering two things:
Can anyone spot the problem in the code below?
How do you deal with bugs that only show up in the App Store binary and cannot be reproduced in the dev environment? I can make the app crash with the binary that is on the App Store, but when I do the same on the simulator or on my test phone, the app works perfectly.
The Crash report in BugSense
CALayer position contains NaN: [798 nan]
Class:
CALayerInvalidGeometry
0x00120e99 -[imageCroppingViewController imagePickerController:didFinishPickingMediaWithInfo:] (imageCroppingViewController.m:126) + 163481
The Code
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
    imageView.image = image;
    CGRect rect;
    rect.size.width = image.size.width;
    rect.size.height = image.size.height;
    imageView.center = scrollView.center;
    [imageView setFrame:rect];
    scrollView.contentSize = imageView.frame.size;
    self.navigationController.navigationBar.hidden = NO;
    [myPicker.view removeFromSuperview];
}
You do not completely initialize your local variable rect. Its origin member is undefined and may contain anything (including NaN) by the time you send [imageView setFrame:rect];.
Add rect.origin = CGPointZero; or use CGRectMake to initialize rect:
rect = CGRectMake(0, 0, image.size.width, image.size.height);
Or use any origin you want for your rect.
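In context, the relevant lines of the delegate method would become (a sketch; note it also moves the centering after the frame assignment, since setFrame: overwrites the earlier center):

UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
imageView.image = image;
// Every field of rect is now initialized; no garbage origin can reach the layer.
CGRect rect = CGRectMake(0, 0, image.size.width, image.size.height);
[imageView setFrame:rect];
imageView.center = scrollView.center; // center after the frame is set, or the frame resets it
scrollView.contentSize = imageView.frame.size;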

Loading a png into a UIImageView iOS

I have two images that need to be overlaid on one another, and they are both PNG images (since I need to be able to make them transparent). However, when I load them into a UIImageView in my xib file, neither of them displays at all! When I try using the JPG versions of the same images it works fine, but because JPG doesn't support transparency, the overlay effect I need is lost. How can I get the PNG images to actually display in the window?
This is the kind of task that is easier to do from code than from Interface "Crappy" Builder:
CGRect imageFrame = CGRectMake(x, y, width, height);
UIImage *image1 = /* however you obtain your 1st image */;
UIImage *image2 = /* however you obtain your 2nd image */;

UIImageView *imgView1 = [[UIImageView alloc] initWithImage:image1];
// Adjust the alpha of the view
imgView1.alpha = 1.0f; // most advisably 1.0 for the bottom image
imgView1.backgroundColor = [UIColor clearColor];
imgView1.frame = imageFrame;

UIImageView *imgView2 = [[UIImageView alloc] initWithImage:image2];
// Adjust the alpha of the view (imgView2 this time, not imgView1)
imgView2.alpha = 0.5f; // or whatever you find practical
imgView2.backgroundColor = [UIColor clearColor];
imgView2.frame = imageFrame;

// Assume a view controller
[self.view addSubview:imgView1];
[self.view addSubview:imgView2]; // add the image view last that you want on top of the other one

// In a non-ARC environment, we need to take care of the precious RAM
[imgView1 release];
[imgView2 release];
Try opening your PNG images in a photo editor like Photoshop or Pixelmator and re-saving them as NOT interlaced (in the PNG save options).

UIImagePickerController: invoke takePicture without shuttering animation

I'm invoking UIImagePickerController's takePicture method to programmatically take pictures with the iPhone.
However, I would like to disable the shutter animation and just leave the screen as it is when the picture is taken (and disable the sound as well).
UPDATE
I'm now using the AVFoundation APIs to get the image, but the camera feed is no longer displayed on the screen. Should I initialize a UIImagePickerController just to display the camera feed on the screen?
Thanks
Check out this sample; it uses AVCaptureSession and an image picker to capture images from the video feed:
http://developer.apple.com/library/ios/#samplecode/AVCam/Listings/Classes_AVCamCaptureManager_m.html
Also this one: https://github.com/jj0b/AROverlayExample
UIImagePickerController has some options, but not a lot. This other question has some tacky but probably effective ideas for disabling it.
For complete control, you need to use the AVFoundation APIs. Here is a capture example: https://developer.apple.com/library/ios/#samplecode/AVCam/Introduction/Intro.html#//apple_ref/doc/uid/DTS40010112
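On the UPDATE specifically: AVFoundation can draw its own preview, so no picker is needed just for display. A minimal sketch, assuming this runs in a view controller and omitting error handling and teardown:

#import <AVFoundation/AVFoundation.h>

// Minimal preview setup: the session feeds an AVCaptureVideoPreviewLayer,
// which replaces the picker's on-screen camera feed.
AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetPhoto;

AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
if (input) {
    [session addInput:input];
}

AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
previewLayer.frame = self.view.bounds;
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
[self.view.layer addSublayer:previewLayer];

[session startRunning];

Capturing the still then goes through the session's outputs, with no shutter animation imposed, which is what the AVCam sample above demonstrates.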
I think this is useful for 3.0:
-(void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    NSString *mediaType = [info objectForKey:UIImagePickerControllerMediaType];
    if ([mediaType isEqualToString:@"public.image"])
    {
        UIImage *selectedImage = [info objectForKey:UIImagePickerControllerOriginalImage];
        UIImage *cameraimage = [self scaleAndRotateImage:selectedImage];
        CGRect rect = CGRectMake(0, 0, 640, 875);
        CGImageRef imageRef = CGImageCreateWithImageInRect([selectedImage CGImage], rect);
        UIImage *img = [UIImage imageWithCGImage:imageRef];
        CGImageRelease(imageRef);
        self.captureimage.image = cameraimage;
        NSLog(@"selectedimage width:%f ht:%f", img.size.width, img.size.height);
    }
    [picker dismissModalViewControllerAnimated:YES];
}

Objective-C: Issue with CGRect .frame intersect/contains

I have two UIImageViews located about center of a horizontal screen, and when the user clicks a button another UIImageView, located off screen, slides in from the right. I'm just trying to detect when the view being brought onto the screen collides with the two static views. The problem is that when I run my code and check the CGRect frames, they are returned from where the views start, not where they end, regardless of where I place the frame calls in my method, or even if I place them in a separate method. I'm a bit new to Obj-C, and I understand that Core Animation runs on a separate thread; I am guessing that is why I am getting the starting values. (Correct me if I am wrong here.)
So, I guess the question here is how do I detect collisions when one item is static and another is animated. Here's my code (feel free to clean it up):
- (IBAction)movegirl
{
    // disabling button
    self.movegirlbtn.enabled = NO;
    // getting graphics center points
    CGPoint pos = bushidogirl.center;
    CGPoint box1 = topbox.center;
    CGPoint box2 = bottombox.center;
    // firing up animation
    [UIView beginAnimations:nil context:NULL];
    // setting animation speed
    [UIView setAnimationDuration:2.0];
    // setting final position of bushidogirl, then resetting her center position
    pos.x = 260;
    bushidogirl.center = pos;
    // running animations
    [UIView commitAnimations];
    // playing gong sound
    [self.audioplayer play];
    // enabling button
    self.movegirlbtn.enabled = YES;
    [self collisiondetection:bushidogirl :topbox];
}

- (void)collisiondetection:(UIImageView *)item1 :(UIImageView *)item2
{
    CGRect item1frame = item1.frame;
    CGRect item2frame = item2.frame;
    // %d cannot print a CGRect; use NSStringFromCGRect instead
    NSLog(@"Item1 frame = %@ and Item 2 frame = %@",
          NSStringFromCGRect(item1frame), NSStringFromCGRect(item2frame));
}
How about:
if (CGRectIntersectsRect(item1frame, item2frame))
{
    // collision!!
}
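One caveat worth adding (my note, not part of the original answer): during an animation the model layer's frame already holds the destination value; the in-flight geometry lives on the layer's presentation layer. A sketch of the check against it, assuming you poll it from a timer or CADisplayLink while the animation runs:

#import <QuartzCore/QuartzCore.h>

- (void)collisiondetection:(UIImageView *)item1 :(UIImageView *)item2
{
    // The presentation layer reflects what is currently on screen,
    // not the final value the animation is heading toward.
    CGRect item1frame = [(CALayer *)[item1.layer presentationLayer] frame];
    CGRect item2frame = [(CALayer *)[item2.layer presentationLayer] frame];
    if (CGRectIntersectsRect(item1frame, item2frame)) {
        NSLog(@"Collision at %@ vs %@",
              NSStringFromCGRect(item1frame), NSStringFromCGRect(item2frame));
    }
}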

UIScrollView setZoomScale not working?

I am struggling with my UIScrollView, trying to get it to zoom in on the underlying UIImageView. In my view controller I set
- (UIView *)viewForZoomingInScrollView:(UIScrollView *)scrollView {
    return myImageView;
}
In the viewDidLoad method I try to set the zoomScale (to 0.7 in the code below) as follows (note: the UIImageView and image are set in Interface Builder):
- (void)viewDidLoad {
    [super viewDidLoad];
    myScrollView.contentSize = CGSizeMake(myImageView.frame.size.width, myImageView.frame.size.height);
    myScrollView.contentOffset = CGPointMake(941.0, 990.0);
    myScrollView.minimumZoomScale = 0.1;
    myScrollView.maximumZoomScale = 10.0;
    myScrollView.zoomScale = 0.7;
    myScrollView.clipsToBounds = YES;
    myScrollView.delegate = self;
    NSLog(@"zoomScale: %.1f, minZoomScale: %.3f", myScrollView.zoomScale, myScrollView.minimumZoomScale);
}
I tried a few variations of this, but the NSLog always shows a zoomScale of 1.0.
Any ideas where I screw this one up?
I finally got this to work. What caused the problem was the delegate assignment being at the end; I moved it up and... there we go.
The new code looks like this:
- (void)viewDidLoad {
    [super viewDidLoad];
    myScrollView.delegate = self;
    myScrollView.contentSize = CGSizeMake(myImageView.frame.size.width, myImageView.frame.size.height);
    myScrollView.contentOffset = CGPointMake(941.0, 990.0);
    myScrollView.minimumZoomScale = 0.1;
    myScrollView.maximumZoomScale = 10.0;
    myScrollView.zoomScale = 0.7;
    myScrollView.clipsToBounds = YES;
}
Here is another example I made. This one uses an image included in the resources folder. Compared to yours, it adds the UIImageView to the view as a subview and then applies the zoom to the whole view.
-(void)viewDidLoad {
    [super viewDidLoad];
    UIImage *image = [UIImage imageNamed:@"random.jpg"];
    imageView = [[UIImageView alloc] initWithImage:image];
    [self.view addSubview:imageView];
    [(UIScrollView *)self.view setContentSize:[image size]];
    [(UIScrollView *)self.view setMaximumZoomScale:2.0];
    [(UIScrollView *)self.view setMinimumZoomScale:0.5];
}
I know this is quite late as answers go, but the problem is that your code calls zoomScale before it sets the delegate. You are right the other things in there don't require the delegate, but zoomScale does because it has to be able to call back when the zoom is complete. At least that's how I think it works.
My code must be completely crazy, because the scale I use is the complete opposite of what tutorials and others are doing. For me, minScale = 1 means the image is fully zoomed out and fits the UIImageView that contains it.
Here's my code:
[self.imageView setImage:image];
// Makes the content size the same size as the imageView size.
// Since the image size and the scroll view size should be the same, the scroll view shouldn't scroll, only bounce.
self.scrollView.contentSize = self.imageView.frame.size;
// despite what tutorials say, the scale actually goes from one (image sized to fit screen) to max (image at actual resolution)
CGRect scrollViewFrame = self.scrollView.frame;
CGFloat minScale = 1;
// max is calculated by finding the max ratio factor of the image size to the scroll view size (which will change based on the device)
CGFloat scaleWidth = image.size.width / scrollViewFrame.size.width;
CGFloat scaleHeight = image.size.height / scrollViewFrame.size.height;
self.scrollView.maximumZoomScale = MAX(scaleWidth, scaleHeight);
self.scrollView.minimumZoomScale = minScale;
// ensure we are zoomed out fully
self.scrollView.zoomScale = minScale;
This works as I expect. When I load the image into the UIImageView, it is fully zoomed out. I can then zoom in and then I can pan the image.