Resize UIImageView in UITableViewCell - objective-c

I have a 16x16 pixel image that I want to display in a UIImageView. So far, no problem; however, 16x16 is a bit small, so I want to resize the image view to 32x32 and thus also scale the image up.
But I can't get it to work: it always shows the image at 16x16, no matter what I try. I googled a lot and found many snippets here on Stack Overflow, but it still doesn't work.
Here is my code so far:
[[cell.imageView layer] setMagnificationFilter:kCAFilterNearest];
[cell.imageView setAutoresizingMask:UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight];
[cell.imageView setClipsToBounds:NO];
[cell.imageView setFrame:CGRectMake(0, 0, 32, 32)];
[cell.imageView setBounds:CGRectMake(0, 0, 32, 32)];
[cell.imageView setImage:image];
I don't want to create a new 32x32 pixel image because I already have some memory problems on older devices, and creating two images instead of having just one seems like a very bad approach to me (the images can be perfectly well scaled up, and it doesn't matter if they lose quality).

I have successfully made it using CGAffineTransformMakeScale!
cell.imageView.image = cellImage;
//self.rowWidth is the desired Width
//self.rowHeight is the desired height
CGFloat widthScale = self.rowWidth / cellImage.size.width;
CGFloat heightScale = self.rowHeight / cellImage.size.height;
//this line will do it!
cell.imageView.transform = CGAffineTransformMakeScale(widthScale, heightScale);

I think you need to set the contentMode:
cell.imageView.contentMode = UIViewContentModeScaleAspectFit;
In context:
UIImage *image = [UIImage imageWithContentsOfFile:[[NSBundle mainBundle] pathForResource:@"slashdot" ofType:@"png"]];
imageView = [[UIImageView alloc] initWithImage:image];
[imageView setBackgroundColor:[UIColor greenColor]];
[imageView setFrame:CGRectMake(x,y,32,32)];
imageView.contentMode = UIViewContentModeScaleAspectFit;
[self.view addSubview:imageView];
Note: I've set a background colour so you can debug the on-screen boundaries of the UIImageView. Also x and y are arbitrary integer coordinates.

Using CGAffineTransformMakeScale as @ahmed said is valid and does not seem like a duct-tape solution at all! For instance, if you have a large image and put it into a UITableViewCell (say the image is 2x larger than the one that fits into a table cell), scaling by 0.9 shows no visible result; only scaling by less than 0.5 does (because 0.5 * 2.0 = 1.0, which is the size of the cell). So it seems that inside the API, Apple is doing exactly that.
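For example, a rough illustration of that observation with made-up numbers (a 64x64 image in a cell whose image slot is 32x32):
// The cell already fits the 64x64 image down to 32x32, so any scale >= 0.5
// (e.g. 0.9) produces no visible change; only scales below 0.5 shrink it further.
CGFloat scale = 24.0 / 64.0;   // 0.375: visibly smaller than the cell's slot
cell.imageView.transform = CGAffineTransformMakeScale(scale, scale);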

You need to override the layoutSubviews method. By default, it resizes the image view based on the cell's height.
-(void)layoutSubviews
{
    [super layoutSubviews];
    self.imageView.frame = CGRectMake(self.imageView.frame.origin.x,
                                      self.imageView.frame.origin.y,
                                      MY_ICON_SIZE,
                                      MY_ICON_SIZE);
}
You'll probably want to recalculate the origin as well so it's vertically centered.
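For example, a minimal sketch of that vertical centering, assuming the same cell subclass and the hypothetical MY_ICON_SIZE constant used above:
-(void)layoutSubviews
{
    [super layoutSubviews];
    // Recompute the y origin so the icon sits vertically centered in the cell.
    CGFloat iconY = (CGRectGetHeight(self.contentView.bounds) - MY_ICON_SIZE) / 2.0;
    self.imageView.frame = CGRectMake(self.imageView.frame.origin.x,
                                      iconY,
                                      MY_ICON_SIZE,
                                      MY_ICON_SIZE);
}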

UIView changing the size of the image?

I am trying to add a UIImage to a UIView.
The image is exactly the size of the view, as I defined the view's rect.
Somehow I see that the image is displayed wider and taller (it's stretched).
Why does my view change the image's size?
UIImage *strip = [UIImage imageNamed:@"albumStrip.png"];
UIImageView *imageView = [[UIImageView alloc] initWithImage:strip];
UIView *aView = [[UIView alloc] initWithFrame:CGRectMake(0.03*winSize.width, 0.85*winSize.height, 0.95*winSize.width, winSize.width/10)];
aView.backgroundColor = [UIColor whiteColor];
aView.tag = 31000;
aView.layer.cornerRadius=1;
[aView addSubview:imageView];
EDIT:
I can see that my image is 640x960. Is it possible that for the iPhone 4, UIImage doesn't know to divide it by a factor of 2?
Try setting the UIImageView's contentMode (http://developer.apple.com/library/ios/#documentation/uikit/reference/UIImageView_Class/Reference/Reference.html)
imageView.contentMode = UIViewContentModeScaleAspectFit;
Use
imageView.layer.masksToBounds = YES;
which will keep the image from drawing wider and taller than the view's bounds.
If the masksToBounds property is set to YES, any sublayers of the layer that extend outside its boundaries will be clipped to those boundaries. Think of the layer, in that case, as a window onto its sublayers; anything outside the edges of the window will not be visible. When masksToBounds is NO, no clipping occurs, and any sublayers that extend outside the layer's boundaries will be visible in their entirety (as long as they don't go outside the edges of any superlayer that does have masking enabled).
Of course your image is going to be stretched and cannot be shown properly in your view, because its frame is a lot bigger than the size of the UIView.
You should set the frame of the UIImageView to be the same size as your UIView, then add it as a subview:
UIImage *strip = [UIImage imageNamed:@"albumStrip.png"];
CGRect frame = CGRectMake(0.03*winSize.width, 0.85*winSize.height , 0.95*winSize.width, winSize.width/10);
// create the uiimageView with the same frame size as its parentView.
UIImageView *imageView=[[UIImageView alloc] initWithFrame:frame];
// set the image
imageView.image = strip;
// create the view with the same frame
UIView * aView = [ [UIView alloc] initWithFrame:frame];
aView.backgroundColor = [UIColor whiteColor];
aView.tag = 31000;
aView.layer.cornerRadius=1;
[aView addSubview:imageView];
And of course, you could also just make the UIImageView smaller ;)

Adding an image border

OK, this is what I'm trying to do :
Get an NSImage containing, let's say a photo (1000+ x 1000+ dimensions).
Get another NSImage containing just a transparent background and a simple black border (500x500).
"Combine" the 2 images, so that the resulting image is the photo with a border.
This is what I've achieved so far :
NSImage *resultImage = [[[self drop] image] copy];
[resultImage lockFocus];
NSRect newRect = NSMakeRect(0, 0, [[[self drop] image] size].width, [[[self drop] image] size].height);
[[[self drop2] image] drawInRect:newRect
                        fromRect:NSZeroRect
                       operation:NSCompositeSourceOver
                        fraction:1.0];
[resultImage unlockFocus];
[[self drop] setImage:resultImage];
Where [self drop] is an ImageWell containing the photo, and [self drop2] an ImageWell containing the border.
The thing is that it IS working. However, the resulting image is - quite obviously - showing a somewhat "stretched" border.
How could I resolve that? Given that the original photo could be of ANY dimensions, how can I make it use a border (of some fixed dimensions) and still avoid stretching?
How about doing the border directly with CALayer, e.g.:
#import <QuartzCore/QuartzCore.h>
CALayer *layer = imageView.layer;
layer.borderColor = [[NSColor blackColor] CGColor];
layer.borderWidth = 10;
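Note that on OS X views are not layer-backed by default, so you may need to enable that first before the layer is available, e.g.:
[imageView setWantsLayer:YES];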
I would do this differently. Just size the image as desired and then add the border. You could do this by having a simple view with a black background, or a suitable image (assuming you want customizable image borders, like frames), sized so that the resulting border stays constant. Then you can generate a new image from that view, if you need to.
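For instance, here is a rough sketch of that idea, assuming a hypothetical fixed target size and border width: the photo is drawn first, then a constant-width border is stroked on top, instead of compositing a pre-rendered border image.
CGFloat borderWidth = 10.0;                    // fixed border width, independent of the photo's size
NSSize targetSize = NSMakeSize(500.0, 500.0);  // whatever output size you want
NSImage *photo = [[self drop] image];

NSImage *resultImage = [[NSImage alloc] initWithSize:targetSize];
[resultImage lockFocus];

// Draw the arbitrarily sized photo scaled into the target rect.
NSRect targetRect = NSMakeRect(0, 0, targetSize.width, targetSize.height);
[photo drawInRect:targetRect
         fromRect:NSZeroRect
        operation:NSCompositeSourceOver
         fraction:1.0];

// Stroke a border of constant width on top, inset by half the line width
// so the stroke stays fully inside the image.
[[NSColor blackColor] set];
NSBezierPath *border = [NSBezierPath bezierPathWithRect:NSInsetRect(targetRect, borderWidth / 2.0, borderWidth / 2.0)];
[border setLineWidth:borderWidth];
[border stroke];

[resultImage unlockFocus];
[[self drop] setImage:resultImage];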

Loading a png into a UIImageView iOS

I have two images that need to be overlaid on one another, and they are both PNG images (since I need to be able to make them transparent). However, when I load them into a UIImageView in my xib file, neither of them displays at all! When I try using the JPG versions of the same images it works fine, but because JPG doesn't support transparency, the overlay effect I need is lost. How can I get the PNG images to actually display in the window?
This is the kind of task that is easier to do from code than from Interface "Crappy" Builder:
CGRect imageFrame = CGRectMake(x, y, width, height);
UIImage *image1 = // however you obtain your 1st image
UIImage *image2 = // however you obtain your 2nd image
UIImageView *imgView1 = [[UIImageView alloc] initWithImage:image1];
// Adjust the alpha of the view
imgView1.alpha = 1.0f; // This is most advisably 1.0 (always)
imgView1.backgroundColor = [UIColor clearColor];
imgView1.frame = imageFrame;
UIImageView *imgView2 = [[UIImageView alloc] initWithImage:image2];
// Adjust the alpha of the view
imgView2.alpha = 0.5f; // or whatever you find practical
imgView2.backgroundColor = [UIColor clearColor];
imgView2.frame = imageFrame;
// Assume a view controller
[self.view addSubview:imgView1];
[self.view addSubview:imgView2]; // add the image view you want on top last
// If non-ARC environment, we need to take care of the precious RAM
[imgView1 release];
[imgView2 release];
Try opening your PNG images in a photo editor like Photoshop or Pixelmator and saving them again as NOT interlaced (in the PNG save options).

setting view boundaries

I have a scrollview with an image as a subview. I would like to set the boundaries of the scrollview to be the size of the image view, so that you wouldn't be able to see any of the background.
Right now you can move the image out of the way and see the background, and I don't want that happening anymore.
The weird part is that after you zoom in or out on the image, the boundaries seem to fix themselves, and you can no longer move the image out of the way and see the background.
This is the code I have:
-(UIView *)viewForZoomingInScrollView:(UIScrollView *)scrollView
{
    // return the subview we want to zoom
    return self.imageView;
}

-(void)viewDidLoad
{
    [super viewDidLoad];
    [self sendLogMessage:@"Second View Controller Loaded"];
    // scales the initial view to fit the screen
    self.imageView.frame = CGRectMake(0, 0, CGRectGetWidth(self.view.bounds), CGRectGetHeight(self.view.bounds));
    // sets the content size to be the size of our whole frame
    self.scrollView.contentSize = self.imageView.image.size;
    // sets the scrollview's delegate to itself
    self.scrollView.delegate = self;
    // sets the maximum zoom to 2.5, meaning the picture can only become at most 2.5x as big
    [self.scrollView setMaximumZoomScale:2.5];
    // sets the minimum zoom to 1.0 so the scrollview can never be smaller than the image (no matter how far in/out we're zoomed)
    [self.scrollView setMinimumZoomScale:1.0];
    [imageView addSubview:button];
}
I thought that this line would solve my problem
//sets the content size to be the size of our whole frame
self.scrollView.contentSize = self.imageView.image.size;
But like I said, it only works after I zoom in or out.
EDIT: When I switch
self.scrollView.contentSize = self.imageView.image.size;
to
self.scrollView.frame = self.imageView.frame;
It works like I want it to (you can't see the background), except the toolbar on the top is covered by the image.
imageView.image.size isn't necessarily the same as the frame of the imageView itself. Try setting
scrollView.frame = imageView.frame;
and then
scrollView.contentSize = imageView.image.size;
Then you won't see any border. If you want the image to be the maximum size to start with, do:
imageView.frame = CGRectMake(0, 0, image.size.width, image.size.height);
[imageView setImage:image];
scrollView.frame = self.view.frame; // or desired size
[scrollView addSubview:imageView];
[scrollView setContentSize:image.size]; // or imageView.frame.size
To fix this, I ended up declaring a new CGRect, setting its origin to my scrollView's origin, setting its size from the bounds of my view, and then assigning this CGRect back to my scrollView's frame:
CGRect scrollFrame;
scrollFrame.origin = self.scrollView.frame.origin;
scrollFrame.size = CGSizeMake(CGRectGetWidth(self.view.bounds), CGRectGetHeight(self.view.bounds));
self.scrollView.frame = scrollFrame;

Why after rotating UIImageView size is getting changed?

I'm new to using transformations and am still confused about how they work.
What I'm trying to do is rotate my UIImageView by a given angle. But after rotating, the size of the image changes; it gets smaller. I'm also scaling the ImageView so it won't be upside down. How can I rotate and keep the size that was given in CGRectMake when the ImageView was allocated?
UIImageView *myImageView = [[UIImageView alloc] initWithFrame:CGRectMake(x, y, width, height)];
myImageView.contentMode = UIViewContentModeScaleAspectFit;
[myImageView setImage:[UIImage imageNamed:@"image.png"]];
myImageView.layer.anchorPoint = CGPointMake(0.5, 0.5);
// flip vertically so the image won't be upside down, then rotate by 30 degrees
CGAffineTransform newTransform = CGAffineTransformMakeScale(1, -1);
newTransform = CGAffineTransformRotate(newTransform, 30*M_PI/180);
myImageView.transform = newTransform;
[self.window addSubview:myImageView];
Thanks a lot!
OK, I promised I'd look into it, so here's my answer:
I created a scene which should be somewhat equivalent to yours, code as follows:
UIImageView *imageView = [[UIImageView alloc] initWithFrame:CGRectMake(self.view.bounds.size.width/2 - 100,
                                                                       self.view.bounds.size.height/2 - 125,
                                                                       200,
                                                                       250)];
imageView.image = [UIImage imageNamed:@"testimage.jpg"];
imageView.contentMode = UIViewContentModeScaleAspectFill;
/*
 * I added clipsToBounds, because my test image didn't have a size of 200x250px
 */
imageView.clipsToBounds = YES;
[self.view addSubview:imageView];

NSLog(@"frame: %@", [NSValue valueWithCGRect:imageView.frame]);
NSLog(@"bounds: %@", [NSValue valueWithCGRect:imageView.bounds]);

imageView.layer.anchorPoint = CGPointMake(0.5, 0.5);
imageView.transform = CGAffineTransformMakeRotation(30*M_PI/180);

NSLog(@"frame after rotation: %@", [NSValue valueWithCGRect:imageView.frame]);
NSLog(@"bounds after rotation: %@", [NSValue valueWithCGRect:imageView.bounds]);
This code assumes that you are using ARC. If not add
[imageView release];
at the end.
Using this code the logs look like this:
[16221:207] frame: NSRect: {{60, 105}, {200, 250}}
[16221:207] bounds: NSRect: {{0, 0}, {200, 250}}
[16221:207] frame after rotation: NSRect: {{10.897461, 71.746826}, {298.20508, 316.50635}}
[16221:207] bounds after rotation: NSRect: {{0, 0}, {200, 250}}
As you can see, the bounds always stay the same. What actually changes due to the rotation is the frame, because an image which has been rotated by 30° is of course wider than if it hadn't been rotated. And since the anchor point has been set to the actual center of the view, the origin of the frame also changes (being pushed to the left and the top). Notice that the size of the image itself does not change. I didn't use the scale transformation, since the result can be achieved without scaling.
But to make it clearer, here are some pictures for you (0°, 30°, and 90° rotation):
They already look pretty similar, right? I drew the actual frames to make it clear what the difference between bounds and frame is. The next one really makes it clear: I overlaid all images, rotating them back by the negative of the degrees by which the UIImageView was rotated, giving the following result:
So you see it's pretty straightforward how to rotate images. Now to your problem: you actually want the frame to stay the same. If you want the final frame to have the size of your original frame (in this example a width of 200 and a height of 250), then you will have to scale the resulting frame down. But this will of course result in scaling of the image, which you do not want. I actually think a larger frame will not be a problem for you; you just need to know that you have to take it into account because of the rotation.
In short: it is not possible to have a UIImageView whose frame stays the same after rotation. This isn't possible for any UIView. Just think of a rectangle: if you rotate it, its axis-aligned bounding box won't have the same width and height anymore, will it?
Of course you could put your UIImageView inside another UIView which will have a non-rotated frame with a width of 200 and a height of 250 but that would just be superficial, since it won't really change the fact that a rotated rectangle has a different width and height than the original.
I hope this helps. :)
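If you really need a stable, non-rotated 200x250 frame to lay out against, a minimal sketch of the container-view idea mentioned above (reusing the same geometry and test image from my example) could look like this:
// A non-rotated container that keeps the original 200x250 frame.
UIView *container = [[UIView alloc] initWithFrame:CGRectMake(self.view.bounds.size.width/2 - 100,
                                                              self.view.bounds.size.height/2 - 125,
                                                              200,
                                                              250)];
UIImageView *imageView = [[UIImageView alloc] initWithFrame:container.bounds];
imageView.image = [UIImage imageNamed:@"testimage.jpg"];
imageView.contentMode = UIViewContentModeScaleAspectFill;
imageView.clipsToBounds = YES;

// Rotate only the image view; the container's frame stays 200x250.
imageView.transform = CGAffineTransformMakeRotation(30*M_PI/180);

[container addSubview:imageView];
[self.view addSubview:container];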
Do not set the contentMode property that UIImageView inherits from UIView; it leads to changes in the frame due to scaling, transformation, and device rotation, depending on the UIViewContentMode you select.
Also, if you just want to move the view, you can simply change the frame using:
[UIView beginAnimations:@"Rotate" context:nil];
[UIView setAnimationDuration:1.0];
[UIView setAnimationDelegate:self];
CGRect frame = yourView.frame;
frame.origin.y += distance you want to move;
yourView.frame = frame;
[UIView commitAnimations];
If you don't want the animation, just change the frame directly.
Try using this:
CABasicAnimation *animation = [CABasicAnimation animationWithKeyPath:@"transform.rotation.z"];
animation.fromValue = [NSNumber numberWithFloat:0.0f];
animation.toValue = [NSNumber numberWithFloat:2*M_PI];
animation.duration = 0.5f;
animation.repeatCount = HUGE_VALF; // HUGE_VALF is defined in math.h, so import it
[self.reloadButton.imageView.layer addAnimation:animation forKey:@"rotation"];