How to get [UIImage imageWithContentsOfFile:] and High Res Images working - objective-c

As many people have complained, there seems to be a bug in the Apple SDK for the Retina display: imageWithContentsOfFile does not automatically load the @2x images.
I've stumbled upon a nice post describing a function that detects the UIScreen scale factor and loads the low- or high-res image accordingly ( http://atastypixel.com/blog/uiimage-resolution-independence-and-the-iphone-4s-retina-display/ ), but that solution loads the @2x image with its scale factor still set to 1.0, which results in a @2x image scaled 2x (so it appears four times bigger than it should).
imageNamed seems to load low- and high-res images correctly, but it is not an option for me.
Does anybody have a solution for loading low/high-res images without relying on the automatic loading of imageNamed or imageWithContentsOfFile? (Or, alternatively, a way to make imageWithContentsOfFile work correctly?)

OK, the actual solution was found by Michael here:
http://atastypixel.com/blog/uiimage-resolution-independence-and-the-iphone-4s-retina-display/
He figured out that UIImage has the method "initWithCGImage:", which also takes a scale factor as input (I guess the only method where you can set the scale factor yourself):
[UIImage initWithCGImage:scale:orientation:]
And this seems to work great: you can load your high-res images yourself and just set the scale factor to 2.0.
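For illustration, a minimal sketch of that approach (the file name "photo@2x.png" and the MRC-style autorelease are illustrative, not part of Michael's post):
NSString *path = [[NSBundle mainBundle] pathForResource:@"photo@2x" ofType:@"png"];
UIImage *raw = [UIImage imageWithContentsOfFile:path]; // comes back with scale 1.0
UIImage *img = [[[UIImage alloc] initWithCGImage:raw.CGImage
                                           scale:2.0
                                     orientation:UIImageOrientationUp] autorelease];
// img.size is now reported in points (half the pixel size), so it draws at the intended size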
The problem with imageWithContentsOfFile is that, since it currently does not work properly, we can't trust it even once it's fixed (because some users will still have an older iOS on their devices).

We just ran into this here at work.
Here is my work-around that seems to hold water:
NSString *imgFile = ...; // path to your file
NSData *imgData = [[NSData alloc] initWithContentsOfFile:imgFile];
UIImage *img = [[UIImage alloc] initWithData:imgData];
[imgData release]; // balance the alloc above (pre-ARC code)

imageWithContentsOfFile works properly (loading @2x images with the correct scale) from iOS 4.1 onwards.
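If you still have to support iOS 4.0.x, a minimal sketch of a runtime guard (loadImageManually: is a hypothetical stand-in for any of the workarounds on this page):
NSString *ver = [[UIDevice currentDevice] systemVersion];
BOOL fixedLoader = ([ver compare:@"4.1" options:NSNumericSearch] != NSOrderedAscending);
UIImage *img = fixedLoader ? [UIImage imageWithContentsOfFile:path]
                           : [self loadImageManually:path];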

Enhancing Lisa Rossellis's answer to keep retina images at desired size (not scaling them up):
NSString *imagePath = ...; // path to your image
UIImage *image = [UIImage imageWithData:[NSData dataWithContentsOfFile:imagePath] scale:[UIScreen mainScreen].scale];

I've developed a drop-in workaround for this problem.
It uses method swizzling to replace the behavior of the "imageWithContentsOfFile:" method of UIImage.
It works fine on iPhones/iPods pre/post retina.
Not sure about the iPad.
Hope this is of help.
#import <objc/runtime.h> // for class_getClassMethod() and method_exchangeImplementations()

@implementation NSString (LoadHighDef)

/** If self is the path to an image, returns the nominal path to the high-res variant of that image
 *  (e.g. "img.png" -> "img@2x.png", and "img~ipad.png" -> "img@2x~ipad.png"). */
- (NSString *)stringByInsertingHighResPathModifier {
    NSString *path = [self stringByDeletingPathExtension];

    // We determine whether a device modifier is present and, in case it is, where
    // the "split position" is at which the "@2x" token is to be added
    NSArray *deviceModifiers = [NSArray arrayWithObjects:@"~iphone", @"~ipad", nil];
    NSInteger splitIdx = [path length];
    for (NSString *modifier in deviceModifiers) {
        if ([path hasSuffix:modifier]) {
            splitIdx -= [modifier length];
            break;
        }
    }

    // We insert the "@2x" token in the string at the proper position; if no
    // device modifier is present, the token is added at the end of the string
    NSString *highDefPath = [NSString stringWithFormat:@"%@@2x%@",
                             [path substringToIndex:splitIdx], [path substringFromIndex:splitIdx]];

    // We possibly re-add the extension, if there is any extension at all
    NSString *ext = [self pathExtension];
    return [ext length] > 0 ? [highDefPath stringByAppendingPathExtension:ext] : highDefPath;
}

@end
@implementation UIImage (LoadHighDef)

/* Upon loading this category, the implementation of "imageWithContentsOfFile:" is exchanged with the
 * implementation of our custom "imageWithContentsOfFile_custom:" method, whereby we replace and fix
 * the behavior of the system selector. */
+ (void)load {
    Method originalMethod    = class_getClassMethod([UIImage class], @selector(imageWithContentsOfFile:));
    Method replacementMethod = class_getClassMethod([UIImage class], @selector(imageWithContentsOfFile_custom:));
    method_exchangeImplementations(replacementMethod, originalMethod);
}

/** This method works just like the system "imageWithContentsOfFile:", but it loads the high-res version
 * of the image instead of the default one in case the device's screen is high-res and the high-res
 * variant of the image is present.
 *
 * We assume that the original "imageWithContentsOfFile:" implementation properly sets the "scale" factor
 * upon loading a "@2x" image (this is its behavior as of iOS 4.0.1).
 *
 * Note: The "imageWithContentsOfFile_custom:" invocations in this code are not recursive calls, by virtue
 * of method swizzling. In fact, the original UIImage implementation of "imageWithContentsOfFile:" gets called.
 */
+ (UIImage *)imageWithContentsOfFile_custom:(NSString *)imgName {
    // If high-res is supported by the device...
    UIScreen *screen = [UIScreen mainScreen];
    if ([screen respondsToSelector:@selector(scale)] && [screen scale] >= 2.0) {
        // ...then we look for the high-res version of the image first
        UIImage *hiDefImg = [UIImage imageWithContentsOfFile_custom:[imgName stringByInsertingHighResPathModifier]];
        // If such a high-res version exists, we return it.
        // The scale factor will be set correctly because, once imageWithContentsOfFile:
        // is given the full hi-res path, it properly takes it into account.
        if (hiDefImg != nil)
            return hiDefImg;
    }
    // If the device does not support high-res, or it does but there is
    // no high-res variant of imgName, we return the base version
    return [UIImage imageWithContentsOfFile_custom:imgName];
}

@end
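Because the exchange happens in +load, call sites don't need to change; a plain call such as the following (the path lookup is illustrative) transparently resolves to the @2x variant on Retina devices:
UIImage *img = [UIImage imageWithContentsOfFile:[[NSBundle mainBundle] pathForResource:@"img" ofType:@"png"]];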

[UIImage imageWithContentsOfFile:] doesn't load @2x graphics if you specify an absolute path.
Here is a solution:
- (UIImage *)loadRetinaImageIfAvailable:(NSString *)path {
    NSString *retinaPath = [[path stringByDeletingLastPathComponent] stringByAppendingPathComponent:
                            [NSString stringWithFormat:@"%@@2x.%@",
                             [[path lastPathComponent] stringByDeletingPathExtension], [path pathExtension]]];
    if ([UIScreen mainScreen].scale == 2.0 && [[NSFileManager defaultManager] fileExistsAtPath:retinaPath] == YES)
        return [[[UIImage alloc] initWithCGImage:[[UIImage imageWithData:[NSData dataWithContentsOfFile:retinaPath]] CGImage]
                                           scale:2.0
                                     orientation:UIImageOrientationUp] autorelease];
    else
        return [UIImage imageWithContentsOfFile:path];
}
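Usage is then a single call (the resource name is illustrative):
UIImage *img = [self loadRetinaImageIfAvailable:[[NSBundle mainBundle] pathForResource:@"img" ofType:@"png"]];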
Credit goes to Christof Dorner for his simple solution (which I modified and pasted here).

Related

Memory issues with ARC on iOS and Mac

I am trying to mirror the screen of my Mac to an iPhone. I have this method in the Mac app delegate to capture the screen into a base64 string:
- (NSString *)baseString {
    CGImageRef screen = CGDisplayCreateImage(displays[0]);
    CGFloat w = CGImageGetWidth(screen);
    CGFloat h = CGImageGetHeight(screen);
    NSImage *image = [[NSImage alloc] initWithCGImage:screen size:(NSSize){w, h}];
    [image lockFocus];
    NSBitmapImageRep *bitmapRep = [[NSBitmapImageRep alloc] initWithFocusedViewRect:NSMakeRect(0, 0, w, h)];
    [bitmapRep setCompression:NSTIFFCompressionJPEG factor:.3];
    [image unlockFocus];
    NSData *imageData = [bitmapRep representationUsingType:NSJPEGFileType properties:_options];
    NSString *base64String = [imageData base64EncodedStringWithOptions:0];
    image = nil;
    bitmapRep = nil;
    imageData = nil;
    return base64String;
}
After that I send it to the iPhone and present it in a UIImageView.
The delay between screenshots is 40 milliseconds. Everything works as expected until memory runs low. After a minute of streaming the Mac starts swapping and uses 6 GB of RAM; the iOS app's memory usage also grows linearly. By the time the iOS app reaches 90 MB of RAM, the Mac is at 6 GB.
Even if I stop streaming, the memory is not released.
I'm using ARC in both projects. Would it make any difference to migrate to manual reference counting?
I also tried an @autoreleasepool {...} block, but it didn't help.
Any ideas?
EDIT
My iOS code is here:
NSString *message = [NSString stringWithFormat:@"data:image/png;base64,%@", base64];
NSURL *url = [NSURL URLWithString:message];
NSData *imageData = [NSData dataWithContentsOfURL:url];
UIImage *ret = [UIImage imageWithData:imageData];
self.image.image = ret;
You have a serious memory leak. The docs for CGDisplayCreateImage clearly state:
The caller is responsible for releasing the image created by calling CGImageRelease.
Update your code with a call to:
CGImageRelease(screen);
I'd add that just after creating the NSImage.
We can't help with your iOS memory leaks since you didn't post your iOS code, but I see a big memory leak in your Mac code.
You are calling a Core Foundation function, CGDisplayCreateImage. Core Foundation objects are not managed by ARC. If a Core Foundation function has "Create" (or "Copy") in its name, it follows the "Create Rule" and you are responsible for releasing the returned CF object when you are done with it.
Some CF objects have special release calls. For those that don't, just call CFRelease. CGImageRef has a special release call, CGImageRelease().
You need a corresponding call to CGImageRelease(screen), probably after the call to initWithCGImage.
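Applied to the capture method above, a minimal sketch of the corrected opening lines (only the affected part is shown):
CGImageRef screen = CGDisplayCreateImage(displays[0]);
CGFloat w = CGImageGetWidth(screen);
CGFloat h = CGImageGetHeight(screen);
NSImage *image = [[NSImage alloc] initWithCGImage:screen size:(NSSize){w, h}];
CGImageRelease(screen); // balance the +1 ownership from the "Create" function; ARC will not do this for you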

use xcassets without imageNamed to prevent memory problems?

According to the Apple documentation, it is recommended to use xcassets for iOS 7 applications and to reference those images via imageNamed.
But as far as I'm aware, there have always been problems with imageNamed and memory.
So I made a short test application, referencing images out of the xcassets catalogue with imageNamed, and started the profiler... the result was as expected: once-allocated memory wasn't released again, even after I removed the image view from its superview and set it to nil.
I'm currently working on an iPad application with many large images, and this strange image-view behavior leads to memory warnings.
But in my tests I wasn't able to access xcassets images via imageWithContentsOfFile.
So what is the best approach to work with large images on iOS 7? Is there a way to access images from the xcassets catalogue in another (more performant) way? Or shouldn't I use xcassets at all, so that I can work with imageWithContentsOfFile?
Thank you for your answers!
UPDATE: Cache eviction works fine (at least since iOS 8.3).
I decided to go with the new Images.xcassets from Apple, too. Things started to go bad when I had about 350 MB of images in the app and it constantly crashed (on a Retina iPad, probably because of the size of the loaded images).
I have written a very simple test app where I load the images in three different ways (watching the profiler):
imageNamed: loading from an asset catalog: the images never get released and the app crashes (for me it was around 400 images, but it really depends on the image size)
imageNamed: (images conventionally included in the project): the memory usage is high and once in a while (> 400 images) I see a call to didReceiveMemoryWarning:, but the app keeps running fine.
imageWithContentsOfFile:([[NSBundle mainBundle] pathForResource:...]): the memory usage is very low (< 20 MB) because only one image is loaded at a time.
I really would not blame the caching of the imageNamed: method for everything, as caching is a good idea if you have to show your images again and again; but it is kind of sad that Apple did not implement it for the assets (or did not document that it is not implemented). In my use case, I will go for the non-caching imageWithData:, because the user won't see the images again.
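For reference, a minimal sketch of the non-caching load path from the third test above (the image name is illustrative):
NSString *path = [[NSBundle mainBundle] pathForResource:@"bigImage" ofType:@"jpg"];
UIImage *image = [UIImage imageWithContentsOfFile:path]; // bypasses imageNamed:'s cache entirely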
As my app is almost final and I really like the loading mechanism that finds the right image automatically, I decided to wrap its usage:
I removed Images.xcassets from the project target's copy phase and added all images "again" to the project and the copy phase (simply add the top-level folder of Images.xcassets directly, and make sure that the checkbox "Add To Target xxx" is checked along with "Create groups for any added folders"; I did not bother about the useless Contents.json files).
During the first build, check for new warnings about multiple images having the same name (and rename them in a consistent way).
For the app icon and launch images, set "Don't use asset catalog" in the project target's General tab and reference them manually there.
I have written a shell script to generate a json model from all the Contents.json files (to have the same information that Apple uses in its asset-access code).
Script:
cd projectFolderWithImageAsset
echo "{\"assets\": [" > a.json
find Images.xcassets/ -name \*.json | while read jsonfile; do
    tmppath=${jsonfile%.imageset/*}
    assetname=${tmppath##*/}
    echo "{\"assetname\":\"${assetname}\",\"content\":" >> a.json
    cat $jsonfile >> a.json
    echo '},' >> a.json
done
echo ']}' >> a.json
Remove the last "," comma from the json output, as I did not bother to handle it in the script.
I have used the following app to generate json-model-access code: https://itunes.apple.com/de/app/json-accelerator/id511324989?mt=12 (currently free), with the prefix IMGA.
I have written a nice category using method swizzling in order not to change existing code (and hopefully I can remove my code again very soon):
(The implementation is not complete for all devices and fallback mechanisms!)
#import "UIImage+Extension.h"
#import <objc/objc-runtime.h>
#import "IMGADataModels.h"
#implementation UIImage (UIImage_Extension)
+ (void)load{
static dispatch_once_t onceToken;
dispatch_once(&onceToken, ^{
Class class = [self class];
Method imageNamed = class_getClassMethod(class, #selector(imageNamed:));
Method imageNamedCustom = class_getClassMethod(class, #selector(imageNamedCustom:));
method_exchangeImplementations(imageNamed, imageNamedCustom);
});
}
+ (IMGABaseClass*)model {
static NSString * const jsonFile = #"a";
static IMGABaseClass *baseClass = nil;
static dispatch_once_t onceToken;
dispatch_once(&onceToken, ^{
NSString *fileFilePath = [[NSBundle mainBundle] pathForResource:jsonFile ofType:#"json"];
NSData* myData = [NSData dataWithContentsOfFile:fileFilePath];
__autoreleasing NSError* error = nil;
id result = [NSJSONSerialization JSONObjectWithData:myData
options:kNilOptions error:&error];
if (error != nil) {
ErrorLog(#"Could not load file %#. The App will be totally broken!!!", jsonFile);
} else {
baseClass = [[IMGABaseClass alloc] initWithDictionary:result];
}
});
return baseClass;
}
+ (UIImage *)imageNamedCustom:(NSString *)name{
NSString *imageFileName = nil;
IMGAContent *imgContent = nil;
CGFloat scale = 2;
for (IMGAAssets *asset in [[self model] assets]) {
if ([name isEqualToString: [asset assetname]]) {
imgContent = [asset content];
break;
}
}
if (!imgContent) {
ErrorLog(#"No image named %# found", name);
}
if (is4InchScreen) {
for (IMGAImages *image in [imgContent images]) {
if ([#"retina4" isEqualToString:[image subtype]]) {
imageFileName = [image filename];
break;
}
}
} else {
if ( UI_USER_INTERFACE_IDIOM() == UIUserInterfaceIdiomPhone ) {
for (IMGAImages *image in [imgContent images]) {
if ([#"iphone" isEqualToString:[image idiom]] && ![#"retina4" isEqualToString:[image subtype]]) {
imageFileName = [image filename];
break;
}
}
} else {
if (isRetinaScreen) {
for (IMGAImages *image in [imgContent images]) {
if ([#"universal" isEqualToString:[image idiom]] && [#"2x" isEqualToString:[image scale]]) {
imageFileName = [image filename];
break;
}
}
} else {
for (IMGAImages *image in [imgContent images]) {
if ([#"universal" isEqualToString:[image idiom]] && [#"1x" isEqualToString:[image scale]]) {
imageFileName = [image filename];
if (nil == imageFileName) {
// fallback to 2x version for iPad unretina
for (IMGAImages *image in [imgContent images]) {
if ([#"universal" isEqualToString:[image idiom]] && [#"2x" isEqualToString:[image scale]]) {
imageFileName = [image filename];
break;
}
}
} else {
scale = 1;
break;
}
}
}
}
}
}
if (!imageFileName) {
ErrorLog(#"No image file name found for named image %#", name);
}
NSString *imageName = [[NSBundle mainBundle] pathForResource:imageFileName ofType:#""];
NSData *imgData = [NSData dataWithContentsOfFile:imageName];
if (!imgData) {
ErrorLog(#"No image file found for named image %#", name);
}
UIImage *image = [UIImage imageWithData:imgData scale:scale];
DebugVerboseLog(#"%#", imageFileName);
return image;
}
#end

Caching images with NSMutableDictionary

I am attempting to create an application that goes through various images from the net and aim to cache them onto the iPhone for offline use. The code I am currently working with is:
NSMutableDictionary *Cache;

- (UIImage *)CachedImage:(NSString *)url {
    UIImage *image = [Cache objectForKey:url];
    if (image == nil) {
        image = [UIImage imageWithData:[NSData dataWithContentsOfURL:[NSURL URLWithString:url]]];
        [Cache setObject:image forKey:url];
        //NSLog (@"Stored");
        return image;
    } else {
        //NSLog (@"Taken");
        return image;
    }
}
I call the function and place the image into an image view using the line of code below:
[self.imageView setImage:[self CachedImage:url]]; // change url to the desired URL
Using the NSLogs, I found that the code doesn't actually store the image: the lookup always returns nil. Why is that, and are there other ways of storing images for offline use?
Thanks in advance.
-Gon
Use NSCache to cache UIImages. You can also save the image locally (if you reuse these images a lot and across multiple launches), so whenever your app closes or you flush your cache, you can get the images back immediately from your local directory.
https://developer.apple.com/library/mac/#documentation/Cocoa/Reference/NSCache_Class/Reference/Reference.html
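For the local-save part, a minimal sketch (the Caches location and the hash-based file naming are illustrative assumptions, not a library API):
// Hypothetical helpers: persist downloaded image data so it survives relaunches
- (NSString *)localPathForKey:(NSString *)key {
    NSString *dir = [NSSearchPathForDirectoriesInDomains(NSCachesDirectory, NSUserDomainMask, YES) objectAtIndex:0];
    return [dir stringByAppendingPathComponent:[NSString stringWithFormat:@"%lu.img", (unsigned long)[key hash]]];
}
- (void)saveImageData:(NSData *)data forKey:(NSString *)key {
    [data writeToFile:[self localPathForKey:key] atomically:YES];
}
- (UIImage *)savedImageForKey:(NSString *)key {
    return [UIImage imageWithContentsOfFile:[self localPathForKey:key]]; // nil if nothing was saved yet
}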
You are looking for
NSCache
Check it out here: http://nshipster.com/nscache/
Poor NSCache, always being overshadowed by NSMutableDictionary in the
most inappropriate circumstances. It's like no one knows it's there,
ready to provide all of that garbage collection behavior that
developers take great pains to re-implement themselves.
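To make that concrete, a minimal sketch of the questioner's method rebuilt on NSCache (names are illustrative); note that, unlike the NSMutableDictionary in the question, the cache is actually allocated before use:
static NSCache *imageCache = nil;

- (UIImage *)cachedImageForURL:(NSString *)url {
    if (imageCache == nil) {
        imageCache = [[NSCache alloc] init]; // the original Cache dictionary was never created, hence the nils
    }
    UIImage *image = [imageCache objectForKey:url];
    if (image == nil) {
        image = [UIImage imageWithData:[NSData dataWithContentsOfURL:[NSURL URLWithString:url]]];
        if (image != nil) {
            [imageCache setObject:image forKey:url];
        }
    }
    return image;
}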

Does the UIImage Cache image?

UIImage *img = [[UIImage alloc] initWithContentsOfFile:@"xx.jpg"];
UIImage *img = [UIImage imageNamed:@"xx.jpg"];
Will the image get cached in the second case, whereas in the first case it doesn't get cached?
The -initWithContentsOfFile: method creates a new image without caching; it's an ordinary initialization method.
The +imageNamed: method uses cache. Here's a documentation from UIImage Reference:
This method looks in the system caches for an image object with the specified name and returns that object if it exists. If a matching image object is not already in the cache, this method loads the image data from the specified file, caches it, and then returns the resulting object.
UIImage will retain the loaded image, keeping it alive until a low-memory condition causes the cache to be purged.
Update for Swift:
In Swift the UIImage(named: "...") function is the one that caches the image.
Just wanted to leave this here to help deal with the pathname issue. This is a method that you can put in a UIImage category:
+ (UIImage *)imageNamed:(NSString *)name cache:(BOOL)cache {
    if (cache)
        return [UIImage imageNamed:name];
    name = [[NSBundle mainBundle] pathForResource:[name stringByDeletingPathExtension] ofType:[name pathExtension]];
    UIImage *retVal = [[UIImage alloc] initWithContentsOfFile:name];
    return retVal;
}
If you don't have an easy way to switch to the cached variant, you might end up just using imageNamed:. That is a big mistake in most cases. See this great answer for more details (and upvote both question and answer!).
@Dan Rosenstark's answer in Swift:
extension UIImage {
    static func imageNamed(name: String, cache: Bool) -> UIImage? {
        if (cache) {
            return UIImage(named: name)
        }
        // Using NSString for stringByDeletingPathExtension
        let fullName = NSString(string: name)
        let fileName = fullName.stringByDeletingPathExtension
        let ext = fullName.pathExtension
        let resourcePath = NSBundle.mainBundle().pathForResource(fileName, ofType: ext)
        if let path = resourcePath {
            return UIImage(contentsOfFile: path)
        }
        return nil
    }
}
Correct, the second item is cached.

Preview image of map

How can I make a preview icon like in the iPhone Map application?
Screenshot http://img835.imageshack.us/img835/1619/img0016x.png
Is there a function for this?
In iOS 7, use MKMapSnapshotter. From NSHipster:
MKMapSnapshotOptions *options = [[MKMapSnapshotOptions alloc] init];
options.region = self.mapView.region;
options.size = self.mapView.frame.size;
options.scale = [[UIScreen mainScreen] scale];

NSURL *fileURL = [NSURL fileURLWithPath:@"path/to/snapshot.png"];

MKMapSnapshotter *snapshotter = [[MKMapSnapshotter alloc] initWithOptions:options];
[snapshotter startWithCompletionHandler:^(MKMapSnapshot *snapshot, NSError *error) {
    if (error) {
        NSLog(@"[Error] %@", error);
        return;
    }
    UIImage *image = snapshot.image;
    NSData *data = UIImagePNGRepresentation(image);
    [data writeToURL:fileURL atomically:YES];
}];
Before iOS 7, do one of the following:
Add a live map and disable interaction.
Use the Google Static Maps API (explained below).
Create an MKMapView and render it to an image. I tried this (see previous edit) but couldn't get the tiles to load; I tried calling needsLayout and other methods, but it didn't work.
Apple's deal with Google is more reliable than a free API but, on the bright side, the free API is really simple. Documentation is at http://code.google.com/apis/maps/documentation/staticmaps/
This is the minimal set of parameters for practical use:
http://maps.googleapis.com/maps/api/staticmap?center=40.416878,-3.703530&zoom=15&size=290x179&sensor=false
A summary of the parameters:
center: this can be an address, or a latitude,longitude pair with up to 6 decimals (more than 6 are ignored).
format: png8 (default), png24, gif, jpg, jpg-baseline.
language: use any language; the default is used if the requested one isn't available.
maptype: one of the following: roadmap, satellite, hybrid, terrain.
scale: 1, 2, or 4. Use 2 to return twice the number of pixels on Retina displays; 4 is restricted to premium customers.
Non-paid resolution limits are 640x640 for scales 1 and 2.
sensor: true or false. Mandatory. It indicates whether the user is being located using a device sensor.
size: size in pixels.
zoom: 0 (whole planet) to 21.
There are a few more parameters to add polygons and custom markers, and to style the map.
NSData *data = [NSData dataWithContentsOfURL:[NSURL URLWithString:@"http://maps.googleap..."]];
UIImage *img = [UIImage imageWithData:data];
You can use the Google Static Maps API to generate an image instead of loading an entire MKMapView.