Downloading images from a web server for the Retina display - iOS - Objective-C

I am downloading images from a webserver for display in a table view in my iOS application using the following code:
NSURL *url = [NSURL URLWithString:[imageArray objectAtIndex:indexPath.row]];
UIImage *myImage = [UIImage imageWithData:[NSData dataWithContentsOfURL:url]];
cell.imageView.image = myImage;
The image view is a 60x60 placeholder, which is 120x120 pixels on the Retina display. I am going to assume the user has an iPhone 4. However, if I size the image on the web server to 120x120, that does not correct the issue; the image just becomes too big for the image view. If I size it to 60x60 on the server, the image fits fine but it is a little fuzzy. Does anyone know how to fix this issue?
Thanks!

Let's first agree that your UIImageView is 60x60 points, meaning 60x60 pixels for a standard display and 120x120 pixels for a retina display.
For a UIImageView at 60x60 points, the image should be 60x60 pixels at scale 1.0 for a standard display and 120x120 pixels at scale 2.0 for a retina display. This means that your UIImage should always have a size of 60x60 points, but should have a different scale depending on the display resolution.
When getting the image data from your server, you should first check the scale of the device's screen and then request the appropriate image size (in pixels), like so:
if ([UIScreen mainScreen].scale == 1.0) {
    // Build URL for 60x60 pixels image
}
else {
    // Build URL for 120x120 pixels image
}
Then you should put the image data in a UIImage of size 60x60 points, at the appropriate scale:
NSData *imageData = [NSData dataWithContentsOfURL:url];
CFDataRef cfdata = CFDataCreate(NULL, [imageData bytes], [imageData length]);
CGDataProviderRef imageDataProvider = CGDataProviderCreateWithCFData (cfdata);
CGImageRef imageRef = CGImageCreateWithJPEGDataProvider(imageDataProvider, NULL, true, kCGRenderingIntentDefault);
UIImage *image = [[UIImage alloc] initWithCGImage:imageRef
                                            scale:[UIScreen mainScreen].scale
                                      orientation:UIImageOrientationUp];
CFRelease(imageRef);
CFRelease(imageDataProvider);
CFRelease(cfdata);
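As a shorter alternative on iOS 6 and later (a minimal sketch, assuming the same url as above), UIImage can take the scale directly when created from data:
NSData *imageData = [NSData dataWithContentsOfURL:url];
// The scale tells UIKit how many pixels correspond to one point, so a
// 120x120 px payload becomes a 60x60 pt image on a Retina screen.
UIImage *image = [UIImage imageWithData:imageData scale:[UIScreen mainScreen].scale];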
Hope this helps.

If your downloaded image is 2x the size of your UIImageView (in points), it will look fine on the Retina display of an iPhone 4.
see also there: http://mobile.tutsplus.com/tutorials/iphone/preparing-your-iphone-app-for-higher-resolutions/

Images downloaded from the web are not treated as Retina assets the way images bundled with your app (using the "@2x" suffix) are.
You can get/set a view's contentScaleFactor property (UIView has no scale:/setScale: methods), or better, create the UIImage itself with the appropriate scale, to better display content from the web when you know it's intended for Retina Display devices.

Well, the answer is as simple as it can be:
I would recommend always downloading the double-density images from the server (users without a Retina display are very few now) and setting them on the image view.
What you didn't do is configure the image view to automatically fit its content.
You can do this either in Interface Builder, by selecting the image view and setting its Mode (contentMode) to Scale To Fill, or in code:
imageView.contentMode = UIViewContentModeScaleToFill;
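Putting that together with the download code from the question, the cell configuration might look roughly like this (a sketch only; it keeps the synchronous dataWithContentsOfURL: from the question, which you would normally move off the main thread):
// In tableView:cellForRowAtIndexPath:
NSURL *url = [NSURL URLWithString:[imageArray objectAtIndex:indexPath.row]]; // URL of the 120x120 px version
UIImage *myImage = [UIImage imageWithData:[NSData dataWithContentsOfURL:url]];
cell.imageView.image = myImage;
// Let the 120x120 px image shrink to fill the 60x60 pt image view;
// since both are square, Scale To Fill does not distort it.
cell.imageView.contentMode = UIViewContentModeScaleToFill;
cell.imageView.clipsToBounds = YES;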

Related

iOS 7 UITextView: size of NSTextAttachment doubles after reopening the application

I am building a note editor using Text Kit in iOS 7. Earlier I had trouble rendering custom-size NSTextAttachments because it slowed rendering down considerably. I solved that issue by scaling the images before adding them to the text view. You can find my answer in
iOS 7.0 UITextView gettings terribly slow after adding images to it
After scaling the images, the text view renders without any lag. The attributed text of the text view is stored in Core Data. During a running session of the application the text view displays the images correctly. Even after saving the attributed text to Core Data and retrieving it again, the images look fine. But after killing the app and running it again, the images get enlarged to 2x their size. While scaling the images I used the following method, with [UIScreen mainScreen].scale to maintain image quality:
- (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize {
    UIGraphicsBeginImageContextWithOptions(newSize, NO, [UIScreen mainScreen].scale);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
If I scale the images with a scale of 1.0 they don't expand, but the image quality is very poor.
Where I think the problem lies:
The problem lies in the layout manager.
What I have tried:
I have tried subclassing NSLayoutManager and overriding
- (void)drawGlyphsForGlyphRange:(NSRange)glyphsToShow atPoint:(CGPoint)origin
What I see is that the attachment size doubles when a new session of the application runs. If I try to check the size of the attachment and resize it, the lag starts again.
I have been stuck on this problem for quite some time. Any suggestions would be much appreciated.
Could the reason be the Retina display? If it is Retina, you might need to reduce the size by 50% before storing. How about trying this:
//Original Size that you want to store
CGSize imageSize = CGSizeMake(320.0f, 320.0f);
//Make the image 50% of the size for retina
if ([[UIScreen mainScreen] respondsToSelector:@selector(displayLinkWithTarget:selector:)] && ([UIScreen mainScreen].scale == 2.0)) {
    // Retina display
    imageSize = CGSizeMake(160.0f, 160.0f);
}
UIImage *storeImage = [self imageWithImage:self.image scaledToSize:imageSize];
//TODO: Store this image locally or whatever you want to do.
@interface MMTextAttachment : NSTextAttachment
{
}
@end

@implementation MMTextAttachment
// I want my emoticon to have the same size as the line's height
- (CGRect)attachmentBoundsForTextContainer:(NSTextContainer *)textContainer proposedLineFragment:(CGRect)lineFrag glyphPosition:(CGPoint)position characterIndex:(NSUInteger)charIndex NS_AVAILABLE_IOS(7_0)
{
    return CGRectMake(0, 0, lineFrag.size.height, lineFrag.size.height);
}
@end
I think you can try this.
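For reference, a rough usage sketch once you have the scaled image (scaledImage here stands for whatever your imageWithImage:scaledToSize: helper returned):
MMTextAttachment *attachment = [[MMTextAttachment alloc] init];
attachment.image = scaledImage; // hypothetical: the image you already scaled for the screen
NSAttributedString *attachmentString = [NSAttributedString attributedStringWithAttachment:attachment];
[textView.textStorage appendAttributedString:attachmentString];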

How to correctly show @2x retina images

I'm using Xcode 4.5.2, and the app is targeting iOS 6.0.
In my application, I am having trouble showing iPad Retina images. I know I have to use the @2x~ipad.png suffix to get them to show properly, and I do that. My images are named accordingly, so they all share the same base name, with the appropriate suffix for each device.
Here is how I named it.
However,
NSString *name = @"appearanceConnection.png";
//NSString *name = @"appearanceConnection"; //I did check this
//I did check the target
UIImage *image = [UIImage imageNamed:name];
NSLog(@"%f", image.size.width);
shows me the width of the @1x image.
If I try to set the proper image name manually:
if ([[UIScreen mainScreen] respondsToSelector:@selector(displayLinkWithTarget:selector:)] &&
    ([UIScreen mainScreen].scale == 2.0)) {
    // Retina display
    name = @"appearanceConnection@2x~ipad.png";
    NSLog(@"retina");
}
else
{
    name = @"appearanceConnection.png";
    NSLog(@"non-retina");
}
UIImage *image = [UIImage imageNamed:name];
NSLog(@"%f", image.size.width);
the proper @2x image is picked, but then it is scaled by 2.0 when I add it to the view.
I have cleaned/rebuilt my project many times and reset the iOS Simulator settings, with no result.
What am I doing wrong?
update:
non-retina
retina:
NSLog(@"%f", image.scale)
after the imageNamed: call gives me 1.0.
UPDATE 2:
I'm trying to optimize my animation's appearance, so I want to load one big image once and cut it into frames.
First, I load the big image (@2x or @1x); then, using the loop below,
for (int i = 0; i < numberOfFrames; i++)
{
    CGImageRef imageRef = CGImageCreateWithImageInRect(image.CGImage,
                                                       CGRectMake(i*width, 0.0f, width, height));
    UIImage *animationImage = [UIImage imageWithCGImage:imageRef];
    if (isFlip)
    {
        [animationImages insertObject:animationImage atIndex:0];
    }
    else
    {
        [animationImages addObject:animationImage];
    }
    CGImageRelease(imageRef);
}
I cut this big image into frames and create the animation.
width and height are multiplied by 2 if it is a Retina display.
Maybe the caveat is here? Does
UIImage *animationImage = [UIImage imageWithCGImage:imageRef];
return an image scaled twice?
However, if I apply a (0.5, 0.5) scale to the UIImageView initialized with this image, it fits. Any idea why it is scaled twice?
The Retina and non-Retina images will have the same width and height (in points) when used as a UIImage or similar; think of it as the DPI of the image being doubled on Retina displays, not its width or height.
This is to keep things transparent when coding for Retina and non-Retina.
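In other words, UIImage's size is in points and its scale records the pixel density, so to inspect pixels you multiply the two. And for UPDATE 2: when you slice a @2x sheet with CGImageCreateWithImageInRect and rewrap the pieces with imageWithCGImage:, the new UIImage defaults to scale 1.0, which is why each frame comes out twice as big. A sketch of both points, assuming the variables (image, width, height, i) from the question:
// Pixel vs point dimensions of a loaded image
CGFloat pixelWidth = image.size.width * image.scale;
NSLog(@"%.0f points wide, %.0f pixels wide, scale %.1f", image.size.width, pixelWidth, image.scale);

// When cutting frames out of a @2x sprite sheet, hand the scale back,
// otherwise each slice is a scale-1.0 image that is twice as big in points.
CGImageRef frameRef = CGImageCreateWithImageInRect(image.CGImage,
                                                   CGRectMake(i * width, 0.0f, width, height));
UIImage *animationImage = [UIImage imageWithCGImage:frameRef
                                              scale:[UIScreen mainScreen].scale // or image.scale, if the sheet was loaded with the right scale
                                        orientation:UIImageOrientationUp];
CGImageRelease(frameRef);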

iOS - Rotating an image or picture

My iOS app downloads some images from the internet and displays them on the screen (iPhone portrait layout). Some of these images are wider than they are tall, and in that case, when the image is presented on the screen, it appears squished (imagine a picture of a widescreen TV shrunk to the iPhone's width). What I want to do is, every time the width of the image is greater than its height, rotate the picture 90 degrees clockwise (into landscape), save it in the app's Documents folder, and then present it on the screen; this way the picture of the widescreen TV (for example) appears rotated 90 degrees, but its aspect ratio is not destroyed.
For various complicated reasons, I can't use a landscape layout in my app (too many other side effects). So this is the code I wrote:
UIImage *image = [[UIImage alloc] initWithData:[NSData dataWithContentsOfURL:[NSURL URLWithString:imageURL]]];
CGFloat width = image.size.width;
CGFloat height = image.size.height;
if (width > 1.2*height) {
    NSLog(@"rotate the image");
    CGImageRef imageRef = [image CGImage];
    image = [UIImage imageWithCGImage:imageRef scale:1.0 orientation:UIImageOrientationLeft];
}
Then I save the image into the app's Documents folder. A new UIViewController then opens, reads the image file saved in the Documents folder, and displays it. The problem is that the image doesn't appear rotated at all; it looks the same as the original, without any rotation. I know the code above runs, because I see the "rotate the image" NSLog in the console. But somehow the image doesn't get saved as the rotated image.
So, how should I approach this issue?
EDIT:
Code to save my image:
NSString *documentsDirectory = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents"];
// Create image name
NSString *path = [@"" stringByAppendingFormat:@"%@%@", @"image", @".png"];
// Create full image path
path = [documentsDirectory stringByAppendingPathComponent:path];
path = [NSString stringWithFormat:@"%@", path];
// Write image to image path
NSData *data1 = [NSData dataWithData:UIImagePNGRepresentation(image)];
[data1 writeToFile:path atomically:YES];
The following website's solution ultimately worked for me:
http://www.catamount.com/blog/uiimage-extensions-for-cutting-scaling-and-rotating-uiimages/
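A likely reason the saved file ignores the rotation: imageWithCGImage:scale:orientation: only attaches an orientation flag, it doesn't touch the pixels, and that flag gets lost when the image is written out as a PNG. One workaround (a sketch under that assumption, not the code from the linked article) is to redraw the image so the rotation is baked into the pixels before saving:
// Hypothetical helper: returns a copy of the image with its orientation
// rendered into the pixels, so the bitmap itself is rotated.
- (UIImage *)imageWithBakedInOrientation:(UIImage *)image {
    // image.size already reflects the orientation (width/height swapped for Left/Right),
    // and drawInRect: applies the orientation transform while drawing.
    UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
    [image drawInRect:CGRectMake(0, 0, image.size.width, image.size.height)];
    UIImage *rotated = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return rotated;
}
Passing the result of this helper to UIImagePNGRepresentation should then write a file that is actually rotated.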

Retina display for an image from URL

I have some images I need to get from the web. Just using data from a URL.
They need to show correctly on Retina Display.
When I get the images from the web, they still look pixelated. I need to set the images' scale for the Retina display (2.0), but I must be missing something.
Here's what I did so far.
UIImage *img = [UIImage imageWithData:[NSData dataWithContentsOfURL:[NSURL URLWithString:@"http://www.msdomains.com/tmp/test.png"]]];
CGRect labelFrame = CGRectMake(0, 0, 64, 64);
UIImageView *imageView = [[UIImageView alloc] initWithFrame:labelFrame];
imageView.contentScaleFactor = [UIScreen mainScreen].scale;
[imageView setImage:img];
[self addSubview:imageView];
[imageView release];
Try adding #@2x.png at the end of your URL. That won't change the URL, but the image will be recognized as a Retina @2x image. It worked for me, but I used this method with SDWebImage.
e.g. using http://www.msdomains.com/tmp/test.png#@2x.png.
Your code should work pretty much as-is. I don't know what the original dimensions of your image were, but I'd guess they were 64x64 px. In order to scale down correctly, the original image would need to be 128x128 px.
As a test, the following code correctly displayed my photo in Retina resolution on the Simulator, and on my iPhone 4:
UIImage *img = [UIImage imageWithData:[NSData dataWithContentsOfURL:[NSURL URLWithString:@"http://www.seenobjects.org/images/mediumlarge/2006-08-19-native-lilac.jpg"]]];
CGRect labelFrame = CGRectMake(0, 0, 375, 249.5);
UIImageView *imageView = [[UIImageView alloc] initWithFrame:labelFrame];
[imageView setImage:img];
[self.view addSubview:imageView];
Note that the UIImageView is 375x249.5 points, which is half of the original (pixel) dimensions of the photo. Also, setting the contentScaleFactor didn't seem to be necessary.
(As an aside, I can't see that specifying @2x on the URL will help in this case, as the call to dataWithContentsOfURL: will return an opaque blob of data, with no trace of the filename left. It's that opaque data that's then passed to imageWithData: to load the image.)
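Following on from that aside: since the filename is gone once you have raw data, the explicit way to get a Retina image from downloaded bytes is to supply the scale yourself when creating the UIImage (a minimal sketch using the URL from the question; imageWithData:scale: requires iOS 6, on earlier systems use imageWithCGImage:scale:orientation: as in the first answer on this page):
NSData *data = [NSData dataWithContentsOfURL:[NSURL URLWithString:@"http://www.msdomains.com/tmp/test.png"]];
// Assuming the server image is 128x128 px, this yields a 64x64 pt image at scale 2.0.
UIImage *img = [UIImage imageWithData:data scale:[UIScreen mainScreen].scale];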
When you assign the image URL directly to the imageView, it will not be treated as Retina.
imageView.imageURL = [NSURL URLWithString:@"http://example.com/image.png"];
will not give you a Retina image.
So even though your image is 200x200, if your imageView is 100x100 it will show a pixelated image on Retina devices.
The solution is to use the image property of the imageView instead of imageURL.
imageView.image = [UIImage imageWithData:[NSData dataWithContentsOfURL:[NSURL URLWithString:@"http://example.com/image.png"]]];
This assigns the 200x200 image to the 100x100 imageView, and the image will not be pixelated.
For Retina display, add the same image at exactly double the resolution of the original image, and don't forget to add "@2x" at the end of the image name. E.g. if "image_header.png" is a 320x100 image, then another image named "image_header@2x.png" (dimensions 640x200) will be selected automatically by the OS for the Retina display.
Hope it helps.
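For example (assuming both image_header.png and image_header@2x.png are bundled with the app), the same imageNamed: call works on either display:
UIImage *header = [UIImage imageNamed:@"image_header"];
// On a non-Retina device this loads image_header.png (scale 1.0);
// on a Retina device it loads image_header@2x.png and still reports
// a size of 320x100 points, now with scale 2.0.
NSLog(@"size %@ at scale %.1f", NSStringFromCGSize(header.size), header.scale);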

UIImageView, CGImage, and Retina Art

I've got a few CALayers in my interface, and I'm drawing images directly to the layers as opposed to imageViews.
Here's a snippet:
UIImage *anImage = [UIImage imageNamed:@"anyImage"];
CGImageRef anImageRef = [anImage CGImage];
CALayer *aLayer = [CALayer layer];
CGFloat anImageWidth = CGImageGetWidth(anImageRef);
CGFloat anImageHeight = CGImageGetHeight(anImageRef);
CGRect layerFrame = CGRectMake(0, 0, anImageWidth, anImageHeight);
[aLayer setFrame:layerFrame];
[aLayer setContents:(__bridge id)anImageRef];
[parentLayer addSublayer:aLayer];
So my problem is that I'm getting inconsistent results with the size of the image. On a Retina device, the image that appears is double the size anticipated (i.e., it matches the pixel size of the @2x image). In the Simulator in Retina mode, the image drawn to the layer is the anticipated size (where points match the pixels of the non-Retina image).
Rather than statically set the size, or halve the size (which corrects the issue on the device but breaks compatibility with non-retina displays), what is a good solution or workaround to this scenario? Why is it happening?
The UIImage contains a scale property. It will be 2.0 for retina display images. See the docs for more info.
CGImageGetWidth() and CGImageGetHeight() return the number of pixels whereas you need the image size in points. Use -[UIImage size] instead.
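Putting the two answers together, a sketch of the fix: size the layer in points from UIImage, and tell the layer the scale of its contents.
UIImage *anImage = [UIImage imageNamed:@"anyImage"];
CALayer *aLayer = [CALayer layer];
// anImage.size is in points, so this frame is right on Retina and non-Retina alike.
aLayer.frame = CGRectMake(0, 0, anImage.size.width, anImage.size.height);
aLayer.contents = (__bridge id)anImage.CGImage;
// Match the layer's contentsScale to the image so its pixels map 1:1 to screen pixels.
aLayer.contentsScale = anImage.scale;
[parentLayer addSublayer:aLayer];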