Crop UIImageView's image in scale to fill mode - objective-c

I have a UIImageView in scale-to-fill mode, and I want to crop the image inside it. There is a rectangle on top of this image view that the user can move and resize. Once a size is selected, I want to crop the image using that frame. But because the image view's mode is scale to fill, the cropping rectangle gets mapped to the wrong region and a different part of the image is cropped. Has anyone faced this problem? How do I crop properly?

Your options are to calculate the mapping yourself (a sketch of that calculation follows the snippet), or to use this workaround:
-(UIImage *)imageFromImageView:(UIImageView *)view withCropRect:(CGRect)cropRect
{
    // Render the image view exactly as it appears on screen (scale-to-fill included)
    UIGraphicsBeginImageContextWithOptions(view.bounds.size, view.opaque, 0.0);
    [view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *largeImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    // Crop the rendered bitmap with the rect taken from the on-screen overlay
    CGImageRef imageRef = CGImageCreateWithImageInRect(largeImage.CGImage, cropRect);
    UIImage *img = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef);
    return img;
}
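For the calculate option, here is a rough sketch of mapping the crop rect from the image view's coordinate space into the image's pixel space under UIViewContentModeScaleToFill. It is untested, the method name is illustrative, and it assumes the image's orientation is UIImageOrientationUp:

-(UIImage *)cropImageOfScaleToFillImageView:(UIImageView *)imageView toRect:(CGRect)cropRect
{
    UIImage *image = imageView.image;
    CGSize viewSize = imageView.bounds.size;

    // UIViewContentModeScaleToFill stretches the image independently on each
    // axis, so each axis has its own view-points-to-image-points factor.
    CGFloat scaleX = image.size.width  / viewSize.width;
    CGFloat scaleY = image.size.height / viewSize.height;

    // Map the view-space rect into the image's point space, then into
    // CGImage pixels via the image's scale.
    CGRect imageRect = CGRectMake(cropRect.origin.x * scaleX * image.scale,
                                  cropRect.origin.y * scaleY * image.scale,
                                  cropRect.size.width  * scaleX * image.scale,
                                  cropRect.size.height * scaleY * image.scale);

    CGImageRef croppedRef = CGImageCreateWithImageInRect(image.CGImage, imageRect);
    UIImage *cropped = [UIImage imageWithCGImage:croppedRef
                                           scale:image.scale
                                     orientation:image.imageOrientation];
    CGImageRelease(croppedRef);
    return cropped;
}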

Related

CGImageCreateWithImageInRect

I am trying to create a simple cropping app. I have a view overlaid on a UIImage, and I want to crop a new image out of only the part of the UIImage that the view covers. I used this code:
CGRect cropRect = _cropView.frame;
UIImage *cropImage = [UIImage imageWithData:imageData];
CGImageRef cropped = CGImageCreateWithImageInRect([cropImage CGImage], cropRect);
UIImage *croppedImage = [UIImage imageWithCGImage:cropped];
self.imageView.image = croppedImage;
but it resulted in an unexpected image that seemed very zoomed in and not what I wanted. How can I crop a new image from the image behind the view?
A CGImage deals with pixels, not "points" (the abstract measurement unit of UIKit). When the crop view's contentScaleFactor is larger than 1, its pixel dimensions will be larger than its point dimensions. So, to get a crop rect that corresponds to the pixels, you need to multiply all of the coordinates and dimensions of the view's frame by its contentScaleFactor.
CGRect cropRect = _cropView.frame;
cropRect.origin.x *= _cropView.contentScaleFactor;
cropRect.origin.y *= _cropView.contentScaleFactor;
cropRect.size.width *= _cropView.contentScaleFactor;
cropRect.size.height *= _cropView.contentScaleFactor;
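Putting that together with the question's code, a rough sketch (this assumes the underlying image's pixel grid actually lines up with the scaled view coordinates, which may not hold for arbitrary photos; imageData, _cropView, and self.imageView come from the question):

CGFloat scale = _cropView.contentScaleFactor;
CGRect cropRect = _cropView.frame;
cropRect = CGRectMake(cropRect.origin.x * scale,
                      cropRect.origin.y * scale,
                      cropRect.size.width * scale,
                      cropRect.size.height * scale);

UIImage *cropImage = [UIImage imageWithData:imageData];
CGImageRef cropped = CGImageCreateWithImageInRect(cropImage.CGImage, cropRect);
UIImage *croppedImage = [UIImage imageWithCGImage:cropped];
CGImageRelease(cropped); // CGImageCreateWithImageInRect returns a +1 reference
self.imageView.image = croppedImage;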
Maybe this will help you:
// Create a context the size of the crop area
CGSize size = _cropView.frame.size;
UIGraphicsBeginImageContextWithOptions(size, NO, 0.0);
// Draw the full image offset so the crop region lands at the context's origin
// (assumes existingImage is displayed at its natural size in _cropView's coordinate space)
CGRect rect = CGRectMake(-_cropView.frame.origin.x,
                         -_cropView.frame.origin.y,
                         existingImage.size.width,
                         existingImage.size.height);
[existingImage drawInRect:rect];
// Grab the cropped image and end the context
UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

UIImageView is resized to match its UIImage - Objective-C

I am creating a UIImage from a CGImageRef:
CGImageRef drawImage = CGImageCreateWithImageInRect(croppedImage.CGImage, frame);
UIImage *newImage = [UIImage imageWithCGImage:drawImage];
CGImageRelease(drawImage);
and assigning it to the image view:
self.imageview.image = newImage;
After the line self.imageview.image = newImage; my image view is resized to the size of the UIImage (newImage).
When I log self.imageview's frame I get the correct value, so I cannot work out the exact reason behind this.
A UIImageView will automatically resize to fit its image's size if you haven't set the image view's frame.
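If you want the image view to keep its own size, one option (a rough sketch, using placeholder numbers) is to pin its frame and rely on a content mode instead:

// Keep the image view at a fixed frame and let the content mode decide how
// the image fits, instead of letting the view grow to the image's size.
self.imageview.frame = CGRectMake(0, 0, 200, 200); // hypothetical size
self.imageview.contentMode = UIViewContentModeScaleAspectFit;
self.imageview.clipsToBounds = YES;
self.imageview.image = newImage;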

Resizing UIImage is flipping it?

I am trying to resize a UIImage. I take the image, set its width to 640 (for example), work out the scale factor I had to use, and apply it to the image's height as well.
Somehow the image sometimes flips: even if it was a portrait image, it becomes landscape.
I am probably overlooking something here.
//scale 'newImage'
CGRect screenRect = CGRectMake(0, 0, 640.0, newImage.size.height* (640/newImage.size.width) );
UIGraphicsBeginImageContext(screenRect.size);
[newImage drawInRect:screenRect];
UIImage *scaled = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
OK, I have the answer, but it is not in the code above.
When I took the image from the asset library, I used:
CGImageRef imageRef = result.defaultRepresentation.fullResolutionImage;
fullResolutionImage returns the raw pixels without applying the orientation metadata, which is why the image came out flipped. Now I use:
CGImageRef imageRef = result.defaultRepresentation.fullScreenImage;
and the image is just fine.
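If you do need the full-resolution pixels, another option (a sketch, assuming result is an ALAsset) is to carry the representation's scale and orientation over when building the UIImage, since fullResolutionImage does not apply them:

ALAssetRepresentation *rep = result.defaultRepresentation;
CGImageRef imageRef = rep.fullResolutionImage;
// ALAssetOrientation values match UIImageOrientation, so the cast preserves
// the orientation that fullResolutionImage leaves unapplied.
UIImage *newImage = [UIImage imageWithCGImage:imageRef
                                        scale:rep.scale
                                  orientation:(UIImageOrientation)rep.orientation];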

Save UIView at a specified size

I have a UIImageView that is 600x600 and a UITextView loaded in a 200x200 UIView.
I would like to export this UIView at 600x600, the same as the UIImageView's size.
However, with the code below I can only generate a 400x400 image, which is the retina size of the UIView.
Is there any way I can achieve this?
UIGraphicsBeginImageContextWithOptions(outputView.bounds.size, outputView.opaque, 2);
[outputView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *img = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Have you tried changing the bounds of the view to what you need?
outputView.bounds = CGRectMake(0, 0, 600, 600);
// A scale of 1.0 makes the bitmap exactly 600x600 pixels (2.0 would give 1200x1200)
UIGraphicsBeginImageContextWithOptions(outputView.bounds.size, outputView.opaque, 1.0);
[outputView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *img = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
outputView.bounds = ... original bounds here ...
If the internal view is set to resize automatically, this should work seamlessly. Otherwise, you could change the frame of the internal view as well.
You could use the UIImage category of AliSoftware to resize it
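Another alternative that avoids mutating the view's bounds is to scale the drawing context to the target size instead; a rough sketch (untested, method name illustrative):

- (UIImage *)imageOfView:(UIView *)view scaledToSize:(CGSize)targetSize
{
    // A scale of 1.0 makes the bitmap exactly targetSize in pixels.
    UIGraphicsBeginImageContextWithOptions(targetSize, view.opaque, 1.0);
    CGContextRef ctx = UIGraphicsGetCurrentContext();

    // Scale the context so the view's bounds fill the target size.
    CGContextScaleCTM(ctx,
                      targetSize.width  / view.bounds.size.width,
                      targetSize.height / view.bounds.size.height);

    [view.layer renderInContext:ctx];
    UIImage *img = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return img;
}

Calling [self imageOfView:outputView scaledToSize:CGSizeMake(600, 600)] should then give a 600x600 image regardless of the view's on-screen size.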

Scale UIImageView relatively

I've got two UIImageViews and have managed to merge and save them. But I want the overlaid image to scale and position itself relative to the chosen image, so that the saved image looks the same as the UIImageView you see on the screen.
- (UIImage *)combineImages{
UIGraphicsBeginImageContext(CGSizeMake(theimageView.image.size.width,theimageView.image.size.height));
// Draw image1
[theimageView.image drawInRect:CGRectMake(0,0, theimageView.image.size.width,theimageView.image.size.height)];
// Draw image2
[Birdie.image drawInRect:CGRectMake(Birdie.center.x, Birdie.center.y, (theimageView.image.size.width / 2), (theimageView.image.size.height / 2))];
UIImage *resultingImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return resultingImage;
}
theImageView and Birdie are the two UIImageViews.
Thanks in advance!
Please try this:
- (UIImage *)combineImages{
    UIGraphicsBeginImageContext(theimageView.bounds.size);
    // Draw image1 to fill the context
    [theimageView.image drawInRect:CGRectMake(0, 0, theimageView.bounds.size.width, theimageView.bounds.size.height)];
    // Draw image2 at Birdie's on-screen frame so it keeps its relative position and size
    [Birdie.image drawInRect:Birdie.frame];
    UIImage *resultingImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return resultingImage;
}
// If you have already added Birdie as a subview of theimageView, you can skip the two
// drawInRect: calls above and simply render theimageView's layer into the context instead:
// [theimageView.layer renderInContext:UIGraphicsGetCurrentContext()];
Thanks,
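If you want the merged output at the full size of theimageView's image rather than its on-screen size, here is a sketch of the relative-scaling idea (assuming theimageView uses scale-to-fill and Birdie's frame is expressed in theimageView's coordinate space; the method name is illustrative):

- (UIImage *)combineImagesAtFullResolution {
    CGSize imageSize = theimageView.image.size;

    // Ratio between the image's own size and the on-screen view size
    CGFloat scaleX = imageSize.width  / theimageView.bounds.size.width;
    CGFloat scaleY = imageSize.height / theimageView.bounds.size.height;

    UIGraphicsBeginImageContext(imageSize);
    [theimageView.image drawInRect:CGRectMake(0, 0, imageSize.width, imageSize.height)];

    // Map Birdie's on-screen frame into the full-size image's coordinates so the
    // overlay keeps the same relative position and size as on screen
    CGRect birdieRect = CGRectMake(Birdie.frame.origin.x * scaleX,
                                   Birdie.frame.origin.y * scaleY,
                                   Birdie.frame.size.width  * scaleX,
                                   Birdie.frame.size.height * scaleY);
    [Birdie.image drawInRect:birdieRect];

    UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return result;
}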