Make rounded image with storyboards - objective-c

I want to make a rounded photo from Facebook, but the image always scales.
So I have a storyboard with the following parameters:
http://prntscr.com/5bpuqy
align center X, align center Y, width equals 72, height equals 72.
I understand that the problem may be the 72/72 size, but the image mode in the storyboard is "Aspect Fit".
I call my methods to download the image by URL and then round its corners with a radius.
// call
[UIImage setRoundImageView:self.p_photo WithURL:[p_user fullURL] withCornerSize:37];

+ (void)setRoundImageView:(UIImageView *)imageView WithURL:(NSURL *)url withCornerSize:(CGFloat)corner
{
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^(void) {
        [[SDWebImageManager sharedManager] downloadImageWithURL:url
                                                        options:0
                                                       progress:^(NSInteger receivedSize, NSInteger expectedSize) {
                                                       } completed:^(UIImage *image, NSError *error, SDImageCacheType cacheType, BOOL finished, NSURL *imageURL) {
                                                           if (image && finished)
                                                           {
                                                               [self setRoundImage:image forImageView:imageView withCornerSize:corner];
                                                           }
                                                       }];
    });
}

+ (void)setRoundImage:(UIImage *)image forImageView:(UIImageView *)imageView withCornerSize:(CGFloat)corner
{
    dispatch_async(dispatch_get_main_queue(), ^{
        UIGraphicsBeginImageContextWithOptions(imageView.bounds.size, NO, [UIScreen mainScreen].scale);
        [[UIBezierPath bezierPathWithRoundedRect:imageView.bounds
                                    cornerRadius:corner] addClip];
        [image drawInRect:imageView.bounds];
        imageView.image = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
    });
}

Your code is way more complex than it needs to be. You are manually drawing the image into the image view and that is causing the problem. Try this instead:
+ (void)setRoundImageView:(UIImageView *)imageView WithURL:(NSURL *)url
{
    // you don't need to wrap this in a dispatch queue; SDWebImageManager takes care of that for you.
    [[SDWebImageManager sharedManager] downloadImageWithURL:url options:0 progress:^(NSInteger receivedSize, NSInteger expectedSize) {
    } completed:^(UIImage *image, NSError *error, SDImageCacheType cacheType, BOOL finished, NSURL *imageURL) {
        if (image && finished) {
            [self setRoundImage:image forImageView:imageView];
        }
    }];
}

+ (void)setRoundImage:(UIImage *)image forImageView:(UIImageView *)imageView
{
    // you don't need to wrap this in a dispatch queue; it will be called on the main thread.
    // you don't need to manually draw the image into the image view. Just set the image and let the view do its thing.
    imageView.layer.cornerRadius = CGRectGetHeight(imageView.bounds) / 2;
    imageView.layer.masksToBounds = YES;
    [imageView setImage:image];
}
And make sure the image view is set to "Aspect Fill" mode in the storyboard.
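If you prefer to set this up in code rather than in the storyboard, a rough equivalent is sketched below (a minimal sketch, assuming the self.p_photo outlet from the question; run it after the view has been laid out so the bounds are final):

// Sketch: configure the 72x72 image view from the question in code.
self.p_photo.contentMode = UIViewContentModeScaleAspectFill; // fill the square, cropping the excess
self.p_photo.clipsToBounds = YES;
self.p_photo.layer.cornerRadius = CGRectGetHeight(self.p_photo.bounds) / 2; // 36 for a 72x72 view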

Related

How to dynamically get the height of an image in a dispatch_async method in a scroll view

I am using the code below to get images from a server. I want to get the height of each image dynamically and add the image to a scroll view.
With the code below, when I read the height outside the dispatch_async block it is zero.
How can I get the image height dynamically with an asynchronous image load?
- (void)viewDidLoad {
    [self LoadViewPublicEvents];
}

- (void)LoadViewPublicEvents
{
    for (int i = 0; i < arrayPublicEvents.count; i++)
    {
        UIImageView *img_vw1 = [[UIImageView alloc] init];
        dispatch_async(dispatch_get_global_queue(0, 0), ^{
            NSData *imageData = [NSData dataWithContentsOfURL:[NSURL URLWithString:[NSString stringWithFormat:@"http://abc.us/uploads/event/%@", [[arrayPublicEvents objectAtIndex:i] valueForKey:@"image"]]]];
            UIImage *images = [[UIImage alloc] initWithData:imageData];
            dispatch_async(dispatch_get_main_queue(), ^{
                img_vw1.image = images;
                scaledHeight = images.size.height;
            });
        });
        NSLog(@"%f", scaledHeight); // it prints zero
        img_vw1.backgroundColor = [UIColor clearColor];
        img_vw1.frame = CGRectMake(0, y + 5, screen_width, 197);
        [img_vw1 setContentMode:UIViewContentModeScaleAspectFit];
        img_vw1.backgroundColor = [UIColor clearColor];
        [self.scrll_vw addSubview:img_vw1];
    }
}
Thanks in advance
Your code:
NSLog(@"%f", scaledHeight); // it prints zero
img_vw1.backgroundColor = [UIColor clearColor];
img_vw1.frame = CGRectMake(0, y + 5, screen_width, 197);
[img_vw1 setContentMode:UIViewContentModeScaleAspectFit];
img_vw1.backgroundColor = [UIColor clearColor];
[self.scrll_vw addSubview:img_vw1];
is executed before the image has been loaded.
So you either have to wait (for this you could use a semaphore until the background work has finished; see the sketch at the end of this answer) OR you place the code inside your block.
Since you want to modify the UI, it makes sense to place it inside the main-queue block:
UIImageView *img_vw1 = [[UIImageView alloc] init];
dispatch_async(dispatch_get_global_queue(0, 0), ^{
    NSData *imageData = [NSData dataWithContentsOfURL:[NSURL URLWithString:[NSString stringWithFormat:@"http://abc.us/uploads/event/%@", [[arrayPublicEvents objectAtIndex:i] valueForKey:@"image"]]]];
    UIImage *images = [[UIImage alloc] initWithData:imageData];
    dispatch_async(dispatch_get_main_queue(), ^{
        img_vw1.image = images;
        scaledHeight = images.size.height;
        NSLog(@"%f", scaledHeight); // now prints the actual height
        img_vw1.backgroundColor = [UIColor clearColor];
        img_vw1.frame = CGRectMake(0, y + 5, screen_width, 197);
        [img_vw1 setContentMode:UIViewContentModeScaleAspectFit];
        img_vw1.backgroundColor = [UIColor clearColor];
        [self.scrll_vw addSubview:img_vw1];
    });
});
For more information, here is a link to Apple's documentation: https://developer.apple.com/library/content/documentation/General/Conceptual/ConcurrencyProgrammingGuide/OperationQueues/OperationQueues.html
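As mentioned above, the alternative to moving the code into the block is to wait on a semaphore. A minimal sketch of that approach (it blocks the calling thread, so it must not run on the main thread; the URL construction is elided):

__block CGFloat scaledHeight = 0;
dispatch_semaphore_t semaphore = dispatch_semaphore_create(0);
dispatch_async(dispatch_get_global_queue(0, 0), ^{
    // build imageURL from the server path as in the question
    NSData *imageData = [NSData dataWithContentsOfURL:imageURL];
    UIImage *image = [[UIImage alloc] initWithData:imageData];
    scaledHeight = image.size.height;
    dispatch_semaphore_signal(semaphore);
});
// blocks the current (background) thread until the download completes
dispatch_semaphore_wait(semaphore, DISPATCH_TIME_FOREVER);
NSLog(@"%f", scaledHeight); // now prints the real height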

Zooming while capturing video using AVCapture in iOS

I am using AVCapture to capture and save video. But I need to provide a zooming option, such as pinch to zoom or a zoom button. The video should also be saved exactly as it is displayed, i.e. when it is zoomed in it should be saved zoomed in. Any help or link is appreciated. My code for setting up the AVCapture session is:
- (void)setupAVCapture {
    session = [[AVCaptureSession alloc] init];
    session.automaticallyConfiguresApplicationAudioSession = YES;
    [session beginConfiguration];
    session.sessionPreset = AVCaptureSessionPresetMedium;

    AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    captureVideoPreviewLayer.frame = self.view.bounds;
    [self.view.layer addSublayer:captureVideoPreviewLayer];

    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!input) {
        // Handle the error appropriately.
        NSLog(@"ERROR: trying to open camera: %@", error);
    }
    [session addInput:input];

    movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
    [session addOutput:movieFileOutput];

    [session commitConfiguration];
    [session startRunning];
}
I faced the same problem and solved it with these two steps:
1. Add a pinch gesture recognizer handler like this to your camera preview view controller:
- (IBAction)handlePinchGesture:(UIPinchGestureRecognizer *)gestureRecognizer
{
    if ([gestureRecognizer isMemberOfClass:[UIPinchGestureRecognizer class]])
    {
        effectiveScale = beginGestureScale * ((UIPinchGestureRecognizer *)gestureRecognizer).scale;
        if (effectiveScale < 1.0)
            effectiveScale = 1.0;
        CGFloat maxScaleAndCropFactor = [[self.stillImageOutput connectionWithMediaType:AVMediaTypeVideo] videoMaxScaleAndCropFactor];
        if (effectiveScale > maxScaleAndCropFactor)
            effectiveScale = maxScaleAndCropFactor;

        [CATransaction begin];
        [CATransaction setAnimationDuration:.025];
        [self.previewView.layer setAffineTransform:CGAffineTransformMakeScale(effectiveScale, effectiveScale)];
        [CATransaction commit];

        if ([[self videoDevice] lockForConfiguration:nil]) {
            [[self videoDevice] setVideoZoomFactor:effectiveScale];
            [[self videoDevice] unlockForConfiguration];
        }
    }
}
Note that the key method for persisting the zoom level on the video device is [device setVideoZoomFactor:].
2. In the IBAction of the record button, add this code to capture (record) the video and then save the recording to a certain path with a certain name:
- (IBAction)recordButtonClicked:(id)sender {
    dispatch_async([self sessionQueue], ^{
        if (![[self movieFileOutput] isRecording])
        {
            [self setLockInterfaceRotation:YES];

            if ([[UIDevice currentDevice] isMultitaskingSupported])
            {
                // Set up a background task. This is needed because the captureOutput:didFinishRecordingToOutputFileAtURL: callback is not received until the app returns to the foreground unless you request background execution time. This also ensures that there will be time to write the file to the assets library when the app is backgrounded. To conclude this background execution, -endBackgroundTask is called in -recorder:recordingDidFinishToOutputFileURL:error: after the recorded file has been saved.
                [self setBackgroundRecordingID:[[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:nil]];
            }

            // Update the orientation on the movie file output video connection before starting recording.
            // Start recording to a temporary file.
            NSString *outputFilePath = [NSTemporaryDirectory() stringByAppendingPathComponent:[@"movie" stringByAppendingPathExtension:@"mov"]];
            [[self movieFileOutput] startRecordingToOutputFileURL:[NSURL fileURLWithPath:outputFilePath] recordingDelegate:self];
        }
        else
        {
            [[self movieFileOutput] stopRecording];
        }
    });
}
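Note that [self sessionQueue] in the snippet above is assumed to be a serial dispatch queue created elsewhere (as in Apple's AVCam sample); a minimal sketch of that setup, with the property name carried over as an assumption, could be:

@property (nonatomic) dispatch_queue_t sessionQueue;

// e.g. in setupAVCapture, before the queue is used:
self.sessionQueue = dispatch_queue_create("session queue", DISPATCH_QUEUE_SERIAL);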
I hope that helps you
Add a UIPinchGestureRecognizer object to your preview view and handle its callback like this:
- (void)zoomPinchGestureRecognizerAction:(UIPinchGestureRecognizer *)sender {
    static CGFloat initialVideoZoomFactor = 0;
    if (sender.state == UIGestureRecognizerStateBegan) {
        initialVideoZoomFactor = _captureDevice.videoZoomFactor;
    } else {
        CGFloat scale = MIN(MAX(1, initialVideoZoomFactor * sender.scale), 4);

        [CATransaction begin];
        [CATransaction setAnimationDuration:0.01];
        _previewLayer.affineTransform = CGAffineTransformMakeScale(scale, scale);
        [CATransaction commit];

        if ([_captureDevice lockForConfiguration:nil] == YES) {
            _captureDevice.videoZoomFactor = scale;
            [_captureDevice unlockForConfiguration];
        }
    }
}
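The recognizer still has to be attached to the view that hosts the preview layer. A minimal sketch (the previewView property is an assumption matching the handler above):

// Sketch: attach the pinch recognizer to the view backing the preview layer.
UIPinchGestureRecognizer *pinch =
    [[UIPinchGestureRecognizer alloc] initWithTarget:self
                                              action:@selector(zoomPinchGestureRecognizerAction:)];
[self.previewView addGestureRecognizer:pinch];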

Objective-C: Uploading too many images memory pressure causing app to quit

I am using QBImagePicker to allow multiple image upload. It works fine for up to about 25 images, but with more than that the app quits due to memory pressure while uploading. I would like to allow an unlimited number of images and am unsure how to do so without memory becoming an issue (i.e. perhaps clearing memory after each save). Here is my method to save images (called from a loop inside the main QBImagePickerController delegate method to save all the selected images):
- (void)saveTheImage:(UIImage *)image fileName:(NSString *)name width:(CGFloat)width height:(CGFloat)height quality:(CGFloat)quality extension:(int)fileNumberExtension
{
    UIImage *resizedImage = [self resizeImage:image width:width height:height]; // a simple method I have to resize the image sent from the picker
    NSData *data = UIImageJPEGRepresentation(resizedImage, quality); // save as a JPEG
    NSString *fileName = [NSString stringWithFormat:@"%@%d", name, fileNumberExtension]; // set the filename
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0]; // will be saved in Documents
    NSString *tempPath = [documentsDirectory stringByAppendingPathComponent:fileName]; // with the filename given

    // create a block operation to save
    NSBlockOperation *saveOp = [NSBlockOperation blockOperationWithBlock:^{
        [data writeToFile:tempPath atomically:YES];
    }];
    NSOperationQueue *queue = [[NSOperationQueue alloc] init];
    [queue addOperation:saveOp];
}
Thanks in advance!
EDIT
My method to resize the image:
- (UIImage *)resizeImage:(UIImage *)image width:(CGFloat)width height:(CGFloat)height
{
    UIImage *resizedImage;
    CGSize size = CGSizeMake(width, height);
    UIGraphicsBeginImageContextWithOptions(size, NO, 0.0f);
    [image drawInRect:CGRectMake(0, 0, width, height)];
    resizedImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return resizedImage;
}
EDIT 2
Additional methods:
- (void)imagePickerController:(QBImagePickerController *)imagePickerController didSelectAssets:(NSArray *)assets
{
    for (int i = 0; i < assets.count; i++)
    {
        ALAssetRepresentation *rep = [[assets objectAtIndex:i] defaultRepresentation];
        CGImageRef iref = [rep fullResolutionImage];
        UIImage *pickedImage = [UIImage imageWithCGImage:iref scale:[rep scale] orientation:(UIImageOrientation)[rep orientation]];
        int fileNumberExtension = [self getHighestImageNumber] + 1; // new images all get a higher file name

        // set the ratio (width of image is 294)
        CGFloat ratio = pickedImage.size.width / 294;
        CGFloat newHeight = pickedImage.size.height / ratio;
        if (newHeight < 430) // image is too wide
        {
            [self saveTheImage:pickedImage fileName:@"img" width:294 height:newHeight quality:0.8f extension:fileNumberExtension];
        }
        else // the image is too narrow
        {
            // set the ratio (height of image is 430)
            CGFloat ratio = pickedImage.size.height / 430;
            CGFloat newWidth = pickedImage.size.width / ratio;
            [self saveTheImage:pickedImage fileName:@"img" width:newWidth height:430 quality:0.8f extension:fileNumberExtension];
        }
        [self saveTheImage:pickedImage fileName:@"thm" width:78 height:78 quality:0.0f extension:fileNumberExtension]; // save the thumbnail
    }
    [self dismissImagePickerController];
}
- (void)dismissImagePickerController
{
    [self dismissViewControllerAnimated:YES completion:nil];
}

- (void)addImageClicked
{
    QBImagePickerController *imagePickerController = [[QBImagePickerController alloc] init];
    imagePickerController.delegate = self;
    imagePickerController.allowsMultipleSelection = YES;
    imagePickerController.maximumNumberOfSelection = 20; // allow up to 20 photos at once
    imagePickerController.filterType = QBImagePickerControllerFilterTypePhotos;
    UINavigationController *navigationController = [[UINavigationController alloc] initWithRootViewController:imagePickerController];
    [self presentViewController:navigationController animated:YES completion:nil];
}
Solved this issue by using @autoreleasepool around the for loop in this method:
- (void)imagePickerController:(QBImagePickerController *)imagePickerController didSelectAssets:(NSArray *)assets
This thread was very useful.
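For reference, a sketch of what that placement looks like (the per-asset body is abbreviated; one common placement is inside the loop so the pool drains each iteration):

- (void)imagePickerController:(QBImagePickerController *)imagePickerController didSelectAssets:(NSArray *)assets
{
    for (int i = 0; i < assets.count; i++)
    {
        @autoreleasepool
        {
            // ... same per-asset work as above (load the asset, resize, save) ...
            // The pool drains at the end of each iteration, so each full-resolution
            // image is released before the next asset is loaded.
        }
    }
    [self dismissImagePickerController];
}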
You have a memory leak. Leaks usually don't happen because ARC takes care of memory for you (every time you finish using an image, it gets cleared from memory). However, NOT ALL objects are governed by ARC. There are some object types (like CGColorSpaceRef, etc.) that need to be freed manually.
You can check this by running Static Analysis in Xcode. In the top menu bar, select Product -> Analyze. If there are places where you need to free your objects, it will tell you.
To free an object, do:
CGColorSpaceRelease(ref); //where ref is a CGColorSpaceRef.
CGImageRelease(iref); //where iref is a CGImageRef.
or the corresponding release function for your object's type.

setNeedsDisplay on UIImageView doesn't refresh the view immediately

I'm using AVCaptureSession to capture an image and then send it to my server.
[self.stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
    if (imageSampleBuffer != NULL) {
        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
        self.videoImageData = [imageData copy];
        [self processImage:[UIImage imageWithData:imageData]];
        NSString *response = [self uploadImage];
        [activityIndicator removeFromSuperview];
        self.takePictureBtn.enabled = YES;
        [self setUserMessage:response];
    }
}];
- (void)processImage:(UIImage *)image
{
    ...
    [UIView beginAnimations:@"rotate" context:nil];
    [UIView setAnimationDuration:0.5];
    int degrees = [self rotationDegrees];
    CGFloat radians = degrees * M_PI / 180.0;
    self.captureImageView.transform = CGAffineTransformMakeRotation(radians);
    [UIView commitAnimations];
    [self.captureImageView setNeedsDisplay];
}
- (NSString *)uploadImage sends the photo to the server.
The behavior I'm expecting is that after processImage finishes, the image is displayed in the UIImageView captureImageView with the activity indicator rotating on top of it. Instead I get a blank white view with the indicator rotating, and only after the server POST request finishes does captureImageView display the image.
How can I achieve the desired behavior?
Another problem I'm facing is that imageSampleBuffer returns a mirror image of the one taken (an object on the left appears on the right).
From the docs:
This method makes a note of the request and returns immediately. The view is not actually redrawn until the next drawing cycle, at which point all invalidated views are updated.
Since you're doing other work after you call setNeedsDisplay, the current pass through the event loop doesn't end, so the view isn't redrawn immediately.
If you want to do more work, one approach is to schedule the rest of it in a new block on the main queue, for example with:
[[NSOperationQueue mainQueue] addOperationWithBlock:^{
    NSString *const response = [self uploadImage];
    [activityIndicator removeFromSuperview];
    self.takePictureBtn.enabled = YES;
    [self setUserMessage:response];
}];

SDWebImage resizing/scaling/cropping images coming from url before caching them

How can I resize, scale, and crop images coming from a URL before caching them? I tried to achieve this with the following code.
[cell.profilePicture setImageWithURL:[NSURL URLWithString:[rowData objectForKey:@"pictureUrl"]] placeholderImage:nil];
cell.profilePicture.image = [self imageByScalingAndCroppingForSize:CGSizeMake(cell.profilePicture.frame.size.width, cell.profilePicture.frame.size.height) image:cell.profilePicture.image];
This produced a strange result. At first it shows the image without resizing/scaling/cropping, but when I scroll down and then go back, it shows a resized/scaled/cropped version. I know this code first caches the image and then resizes/scales/crops it, but I could not come up with a better solution. I think there should be something that allows downloading, resizing/scaling/cropping, and caching to be done independently. Any ideas?
Thank you.
I figured out the solution on my own.
Instead of
[cell.profilePicture setImageWithURL:[NSURL URLWithString:[rowData objectForKey:@"pictureUrl"]] placeholderImage:nil];
which downloads and caches images automatically, use this
SDWebImageManager *manager = [SDWebImageManager sharedManager];
[manager downloadWithURL:url delegate:self options:0 success:^(UIImage *image)
{
    cell.profilePicture.image = [self imageByScalingAndCroppingForSize:CGSizeMake(cell.profilePicture.frame.size.width, cell.profilePicture.frame.size.height) image:image];
} failure:nil];
Don't forget to include
#import <SDWebImage/UIImageView+WebCache.h>

@interface YourViewController : UIViewController <SDWebImageManagerDelegate>
If you want to scale a UIImage, try this approach:
- (void)viewDidLoad {
    [super viewDidLoad];

    SDWebImageManager *manager = [SDWebImageManager sharedManager];
    [manager downloadImageWithURL:[NSURL URLWithString:yourPhotoURL]
                          options:SDWebImageRefreshCached
                         progress:nil
                        completed:^(UIImage *image, NSError *error, SDImageCacheType cacheType, BOOL finished, NSURL *imageURL) {
                            CGSize size = CGSizeMake(50, 50); // width, height
                            imageView.image = [self imageWithImage:image scaledToSize:size];
                        }];
}

- (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize {
    UIGraphicsBeginImageContextWithOptions(newSize, NO, 0.0);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
imageWithImage method source: The simplest way to resize an UIImage?
This is a good example of how to implement it using SDWebImageManagerDelegate.
#import <Foundation/Foundation.h>
#import <SDWebImage/SDWebImageManager.h>

@interface ImageLoader : NSObject <SDWebImageManagerDelegate>

+ (instancetype)sharedImageLoader;

@end

@implementation ImageLoader

+ (instancetype)sharedImageLoader
{
    static ImageLoader *loader = nil;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        loader = [[ImageLoader alloc] init];
    });
    return loader;
}

- (UIImage *)imageManager:(SDWebImageManager *)imageManager
 transformDownloadedImage:(UIImage *)image
                  withURL:(NSURL *)imageURL
{
    // Place your image size here
    CGFloat width = 50.0f;
    CGFloat height = 50.0f;
    CGSize imageSize = CGSizeMake(width, height);

    UIGraphicsBeginImageContext(imageSize);
    [image drawInRect:CGRectMake(0, 0, width, height)];
    image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}

@end
Then just add the delegate where needed.
[SDWebImageManager sharedManager].delegate = [ImageLoader sharedImageLoader];
If anyone is looking for the Swift 3 version:
self.postImage.sd_setImage(with: URL(string: image)!,
                           placeholderImage: UIImage(named: "image_placeholder"),
                           options: SDWebImageOptions(rawValue: 0),
                           completed: { (image, error, cacheType, imageURL) in
    // Scale or do whatever you want to do to the image
    let newImage = self.scaleImage(image: image!, view: self.postImage)
    self.postImage.image = newImage
})

func scaleImage(image: UIImage, view: UIImageView) -> UIImage {
    let oldWidth = image.size.width
    let oldHeight = image.size.height
    let viewWidth: CGFloat = view.frame.width
    // Scale to the view's width while preserving the aspect ratio
    // (both branches of the original if/else computed the same values).
    let scaleFactor = oldHeight / oldWidth
    let newWidth = viewWidth
    let newHeight = viewWidth * scaleFactor
    UIGraphicsBeginImageContext(CGSize(width: newWidth, height: newHeight))
    image.draw(in: CGRect(x: 0, y: 0, width: newWidth, height: newHeight))
    let newImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return newImage!
}