What's the difference between "cached image" and "local image"? - Objective-C

I am a beginner. I want to show images from URLs in a UITableView, so I use AFNetworking + SDURLCache, as below:
// init URLCache
SDURLCache *URLCache = [[SDURLCache alloc] initWithMemoryCapacity:1024*1024*2 diskCapacity:1024*1024*20 diskPath:[SDURLCache defaultCachePath]];
[URLCache setIgnoreMemoryOnlyStoragePolicy:YES];
[SDURLCache setSharedURLCache:URLCache];
========================================================
// set cell
[cell.imageView setImageWithURL:[NSURL URLWithString:IMAGE_URL] placeholderImage:[UIImage imageNamed:@"placeholder.png"]];
[[SDURLCache sharedURLCache] cachedResponseForRequest:[NSURLRequest requestWithURL:[NSURL URLWithString:IMAGE_URL]]];
Here I use the same image URL on all the cells, and everything works OK. The first time I run my project, displaying the images on the cells takes a long time because my web images are very large. The second time I run it, the TableView loads the images one by one, immediately. Why? Why doesn't it load all the images at once? When I replaced the cached image with a local image, the TableView showed all the images at once. What's the difference between a "cached image" and a "local image"? Did SDURLCache not work?

I have no iOS experience, but I would presume a cached image is an image stored in the web client's cache so that the resource loads faster the next time, unless it has been modified and the cached copy invalidated since it was stored.
A local image, I would assume, is an image stored on the device itself. Obviously this will be faster than downloading it, since storage on iOS devices consists of NAND flash chips and the like, which far outstrip a domestic Internet connection.
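To make the distinction concrete, here is a minimal sketch, assuming the SDURLCache from the question is already installed as the shared NSURLCache and that IMAGE_URL is defined; the helper method itself is hypothetical and not part of AFNetworking or SDURLCache:
// Hypothetical helper illustrating "local" versus "cached".
- (UIImage *)imageForCellWithURLString:(NSString *)urlString {
    // A *local* image ships inside the app bundle and is read straight from flash storage.
    UIImage *localImage = [UIImage imageNamed:@"placeholder.png"];

    // A *cached* image is a previously downloaded HTTP response that SDURLCache
    // (an NSURLCache subclass) stored on disk; it only exists after the first
    // successful download and still has to be read back and decoded.
    NSURLRequest *request = [NSURLRequest requestWithURL:[NSURL URLWithString:urlString]];
    NSCachedURLResponse *cached = [[NSURLCache sharedURLCache] cachedResponseForRequest:request];
    if (cached != nil) {
        return [UIImage imageWithData:cached.data];
    }

    // Not cached yet: fall back to the bundled placeholder while the download runs.
    return localImage;
}
This also suggests why the cells fill in one by one on the second run: a local image can be set synchronously while the cell is being configured, whereas a cached one still goes through the asynchronous request pipeline and is merely answered from disk instead of the network.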

Related

Load of multiple image files from a remote repository ios7

In my project I have about 100 images, all tiled, used to display large zooms with CATiledLayer. That is a lot of files which, if put in the local bundle, quickly make the app bundle too big.
I want to load these files from a web repository, using SDWebImage or any other SDK.
Can you suggest a method to achieve this? Please keep in mind that we are not talking about loading a single image, or multiple images for a UITableView, but lots of little tiled images that are used when the user zooms in on a given view; these tiles therefore have no UIImageView object to hold them, they are just in the bundle (for now).
We are using iOS 7.
Thank you in advance.
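No SDK-specific answer is shown here, but as a rough illustration of the on-demand approach, a tiny tile loader built only on Foundation (so no assumptions about SDWebImage's API; the TileLoader class, the URL pattern, and the method names are hypothetical) could fetch each tile when the tiled view asks for it and keep recent tiles in an NSCache:
#import <UIKit/UIKit.h>

// Hypothetical tile loader sketch; not part of SDWebImage or any other SDK.
@interface TileLoader : NSObject
@property (nonatomic, strong) NSCache *tileCache;   // keeps recently used tiles in memory
@end

@implementation TileLoader

- (instancetype)init {
    if ((self = [super init])) {
        _tileCache = [[NSCache alloc] init];
        _tileCache.countLimit = 200;                 // arbitrary limit; tune for your tile size
    }
    return self;
}

// Fetch one tile for a given zoom level / column / row from the remote repository.
- (void)tileForLevel:(NSInteger)level
                 col:(NSInteger)col
                 row:(NSInteger)row
          completion:(void (^)(UIImage *tile))completion {
    NSString *key = [NSString stringWithFormat:@"%ld_%ld_%ld", (long)level, (long)col, (long)row];
    UIImage *cachedTile = [self.tileCache objectForKey:key];
    if (cachedTile) { completion(cachedTile); return; }

    // The URL layout below is a placeholder; adapt it to the real repository.
    NSString *urlString = [NSString stringWithFormat:
        @"http://example.com/tiles/%ld/%ld_%ld.png", (long)level, (long)col, (long)row];
    [[[NSURLSession sharedSession] dataTaskWithURL:[NSURL URLWithString:urlString]
            completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
        UIImage *tile = data ? [UIImage imageWithData:data] : nil;
        if (tile) { [self.tileCache setObject:tile forKey:key]; }
        dispatch_async(dispatch_get_main_queue(), ^{ completion(tile); });
    }] resume];
}

@end
Note that a tiled layer asks for its tiles synchronously while drawing, so in practice the completion block would trigger a redraw of the tile's rect once the download finishes.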

Does @2x work with [UIImage imageNamed:]?

[self.distanceSlider setThumbImage:[UIImage imageNamed:@"handle-slider"] forState:UIControlStateNormal];
Let's take a look at that code.
If I use a retina display, will the image loaded be handle-slider@2x instead of handle-slider?
Notice that this could raise an issue. Imagine I load an image for the sole purpose of processing it and I really, really want handle-slider, not handle-slider@2x. Having iOS override my decision and arbitrarily load the @2x image would be rather silly.
On the other hand, most of the time I use [UIImage imageNamed:] to populate a button, and in that case it makes perfect sense to pick up the @2x version.
In any case, which file does Apple eventually use, and if possible, what's the reference?
I searched Stack Overflow; the answers are inconsistent, with some suggesting one behavior and some the other.
The documentation for [UIImage imageNamed:] has exactly the information you want to know.
This method looks in the system caches for an image object with the specified name and returns that object if it exists. If a matching image object is not already in the cache, this method loads the image data from the specified file, caches it, and then returns the resulting object.
On a device running iOS 4 or later, the behavior is identical if the device's screen has a scale of 1.0. If the screen has a scale of 2.0, this method first searches for an image file with the same filename with an @2x suffix appended to it. For example, if the file's name is button, it first searches for button@2x. If it finds a 2x version, it loads that image and sets the scale property of the returned UIImage object to 2.0. Otherwise, it loads the unmodified filename and sets the scale property to 1.0.
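Putting the documented behaviour into code, a minimal sketch (assuming handle-slider.png and handle-slider@2x.png are both in the bundle) shows which variant imageNamed: actually picked, and one way to read a specific file when you genuinely do not want the substitution:
// imageNamed: applies the @2x lookup automatically on a retina screen; the
// scale property of the result tells you which variant was loaded.
UIImage *resolved = [UIImage imageNamed:@"handle-slider"];
NSLog(@"scale = %.0f (2 means handle-slider@2x was chosen)", resolved.scale);

// To process one specific file with no name substitution, load its exact
// path yourself; imageWithData: knows nothing about naming conventions.
NSString *path = [[NSBundle mainBundle] pathForResource:@"handle-slider" ofType:@"png"];
UIImage *exactFile = path ? [UIImage imageWithData:[NSData dataWithContentsOfFile:path]] : nil;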

Strange issues with file names: Xcode simulator vs. iOS device

I just spent a day trying to figure out why some simple code was not loading images from the resources folder. Just for kicks I went into the Finder and renamed one of the files to exactly the same name, and it loaded. Then I did the others, simply renaming them to the same name. I looked, and there weren't any strange characters or whitespace before or after the file names. So now they all load fine into an array in the simulator. But on my device they aren't loaded into the array, returning nil and throwing exceptions. I know that iOS devices are case-sensitive where the simulator is not, but I checked this and the naming is all fine. The only thing I can think of is that my images are named for retina (@2x~iphone) and my device is not retina. But I have other images named the same way, for retina, and those load fine.
I am using imageWithContentsOfFile to load the images and also getting a memory warning after it tries to load the images. I am not sure if the memory warning is related.
I think the problem might be in your file naming. Remember that you should include both retina and non-retina graphics in your app. Seeing as you added "~iphone" I'm assuming this is a universal application so your images should be named as follows:
myImage~iphone.png
myImage@2x~iphone.png
myImage~ipad.png
myImage@2x~ipad.png
Then in your code you only reference the "myImage" part of the file name so iOS can sort the rest out for you.
For example:
UIImage *image = [UIImage imageNamed:@"myImage"];
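Since the simulator's file system is case-insensitive while the device's is case-sensitive, a useful check for the original problem is to ask the bundle on the device which path, if any, actually resolves for each name before handing it to imageWithContentsOfFile:. A small diagnostic sketch; the base names in the array are placeholders:
// Log which bundle resources actually resolve on the device.
NSArray *baseNames = @[@"myImage", @"anotherImage"];   // placeholder names
for (NSString *name in baseNames) {
    // pathForResource:ofType: performs the ~iphone / ~ipad lookup; a nil result
    // on the device (but not in the simulator) usually points to a case mismatch
    // or a missing device-specific variant.
    NSString *path = [[NSBundle mainBundle] pathForResource:name ofType:@"png"];
    NSLog(@"%@ -> %@", name, path ?: @"NOT FOUND");

    UIImage *image = path ? [UIImage imageWithContentsOfFile:path] : nil;
    NSLog(@"%@ loaded: %@ (scale %.0f)", name, image ? @"yes" : @"no", image.scale);
}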

How to save UIWebView content into photo library

How can I save the content of a UIWebView into the photo library?
- (IBAction)save:(id)sender {
    UIImage *image = nil;
    UIGraphicsBeginImageContext(webView.frame.size);
    {
        image = UIGraphicsGetImageFromCurrentImageContext();
    }
    UIGraphicsEndImageContext();
    if (image != nil) {
        UIImageWriteToSavedPhotosAlbum(image, self, nil, nil);
    }
}
This code saves an empty page.
I've released an app (Web2Pic) that does exactly this, and trust me: UIGraphicsBeginImageContext(webView.frame.size) can do nothing more than capture a small image of the visible area of our UIWebView ;-(
The right way is a bit complex, but it just works:
1. Use JavaScript in our UIWebView to get these float values:
//Whole page size in HTML coordinate
document.body.scrollWidth
document.body.scrollHeight
//UIWebView visible size in HTML coordinate
window.innerWidth
window.innerHeight
2. Now we can 'cut' the whole page into dozens of UIWebView-sized pieces, capture each small piece individually, and save it to a cache. I implemented this by calculating page offsets and using UIGraphicsBeginImageContext(webView.frame.size) to build an array of images. In addition, you should cache the image array to the file system, or the app will eventually crash!
3. When we finally have all the small pieces, we can start a full-resolution context: UIGraphicsBeginImageContext(CGSizeMake(document.body.scrollWidth, document.body.scrollHeight));
4. Render every small image into the big context at its page coordinates. Be careful at the edges: the last image in each row/column may not be a full tile.
5. There is still one step left: saving the big image. Do not save it to the photo album, because iOS automatically reduces the resolution of images saved there. Instead, save it to the file system and enable the app's iTunes File Sharing support, or even write a simple in-app photo manager.
Hope these can help ;-)
Yichao Peak Ji
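For reference, here is a rough Objective-C sketch of steps 1-4 above. It assumes ARC, a webView property, QuartzCore imported for renderInContext:, and a page rendered at 1:1 scale (window.innerWidth equal to the view width); the method name is made up, a production version should write the tiles to disk as step 2 warns, and it may need to let the run loop turn between scrolling and capturing:
// Rough sketch only; not production code.
- (UIImage *)captureFullWebViewPage {
    // Step 1: whole page size in HTML coordinates.
    CGFloat pageWidth  = [[self.webView stringByEvaluatingJavaScriptFromString:
                           @"document.body.scrollWidth"] floatValue];
    CGFloat pageHeight = [[self.webView stringByEvaluatingJavaScriptFromString:
                           @"document.body.scrollHeight"] floatValue];
    CGSize tileSize = self.webView.frame.size;

    // Step 3: one full-resolution context for the whole page.
    UIGraphicsBeginImageContext(CGSizeMake(pageWidth, pageHeight));

    // Steps 2 and 4: scroll tile by tile, capture the visible area, and draw
    // each capture into the big context at its page offset.
    for (CGFloat y = 0; y < pageHeight; y += tileSize.height) {
        for (CGFloat x = 0; x < pageWidth; x += tileSize.width) {
            NSString *js = [NSString stringWithFormat:@"window.scrollTo(%f, %f);", x, y];
            [self.webView stringByEvaluatingJavaScriptFromString:js];

            UIGraphicsBeginImageContext(tileSize);
            [self.webView.layer renderInContext:UIGraphicsGetCurrentContext()];
            UIImage *tile = UIGraphicsGetImageFromCurrentImageContext();
            UIGraphicsEndImageContext();

            // Edge tiles may extend past the page; drawing at the page offset
            // still places the valid region correctly.
            [tile drawAtPoint:CGPointMake(x, y)];
        }
    }

    UIImage *fullPage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return fullPage;
}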

How to optimize flipping through 50+ images, which are downloaded

I have an iPad app with 50+ full-screen images (PNG) and I want to be able to flip back and forth between them. To keep the app size small I download the images as I need them, using NSURLConnection. I also cache about 15 images. The problem I am running into is that even with the cache it is quite easy to flip past the cached range to an image that has not been downloaded yet.
I am wondering what you would suggest to fix this. Should I just increase the cache, or should I reduce the resolution of the images? Do I have to limit the number of simultaneous downloads? Many thanks!
This is how I start each image download
NSURLConnection *conn = [[NSURLConnection alloc]
    initWithRequest:[NSURLRequest requestWithURL:[NSURL URLWithString:theUrlString]]
           delegate:self
   startImmediately:NO];
[conn scheduleInRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
[conn start];
Concerning flipping through the photos once they have been downloaded, here are a few tips to try.
Have both a low-resolution and a high-resolution version of each photo available.
Whenever one picture is displayed, bring the high-res versions of its immediate neighbors into memory. In other words, load, but don't display, those pictures.
Load the low-res images for some range surrounding the displayed picture into memory. So if picture 5 is displayed and your range is 5, load low-res pictures 0 through 10.
While the user is flipping through, render the low-res version first, then load the high-res picture.
These tips should cover a user flipping through a few pictures to find the desired photo, pausing on it, and then flipping through some more.
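One way to put those tips together, sketched with hypothetical helper names (displayBestAvailableImageAtIndex:, loadLowResImageAtIndex: and loadHighResImageAtIndex: stand in for whatever download-and-cache routines already wrap the NSURLConnection code above):
// Called whenever the user lands on a new page. Keeps a low-res window of
// +/- kLowResRange pages warm and the high-res neighbors ready.
static const NSInteger kLowResRange = 5;

- (void)didFlipToIndex:(NSInteger)index {
    // Show whatever we have immediately: low-res first, swap in high-res later.
    [self displayBestAvailableImageAtIndex:index];            // hypothetical

    // Warm the high-res versions of the immediate neighbors.
    for (NSInteger i = index - 1; i <= index + 1; i++) {
        if (i >= 0 && i < self.imageCount) {
            [self loadHighResImageAtIndex:i];                 // hypothetical download/cache call
        }
    }

    // Keep a low-res window around the current page so fast flipping
    // always has something to draw.
    for (NSInteger i = index - kLowResRange; i <= index + kLowResRange; i++) {
        if (i >= 0 && i < self.imageCount) {
            [self loadLowResImageAtIndex:i];                  // hypothetical download/cache call
        }
    }
}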