Slow loading using initWithContentsOfURL - lazy loading

I have a web service and I make HTTP calls to it from cocoa using this line of code:
NSData *imageData = [[NSData alloc] initWithContentsOfURL:url options:NSDataReadingUncached error:&error];
Sometimes it takes 10 seconds, sometimes 30 seconds, to load the picture from this URL.
I tried loading this URL from a normal browser and it takes 1-2 seconds.
I'm doing lazy loading and the problem is the time it takes to load the contents of this URL vs a normal browser. Both tests were done from the same network.
Download.m
NSError *error;
NSData *imageData = [[NSData alloc] initWithContentsOfURL:self.photoRecord.URL options:NSDataReadingUncached error:&error];
NSMutableArray *jsonArray = [NSJSONSerialization JSONObjectWithData:imageData options:NSJSONReadingAllowFragments error:&myError];
NSMutableDictionary *pictureInfo = [jsonArray objectAtIndex:0];
NSString *picture;
picture = [pictureInfo valueForKey:@"Picture"];
NSData *base64Data = [[NSData alloc]initWithBase64Encoding:picture];
if (base64Data) {
UIImage *downloadedImage = [UIImage imageWithData:base64Data];
self.photoRecord.image = downloadedImage;
}
[(NSObject *)self.delegate performSelectorOnMainThread:@selector(imageDownloaderDidFinish:) withObject:self waitUntilDone:NO];
Then on the main thread PictureController.m:
-(void) loadScrollViewWithPage:(NSUInteger)page{
....
NSIndexPath *myIndexPath = [NSIndexPath indexPathForRow:page inSection:0];
[self startOperationsForPhotoRecord:aRecord atIndexPath:myIndexPath];
}
- (void)startOperationsForPhotoRecord:(PictureRecord *)record atIndexPath:(NSIndexPath *)indexPath {
Download *imageD = [[Download alloc] initWithPhotoRecord:record atIndexPath:indexPath delegate:self];
[self.pendingOperations.downloadsInProgress setObject:imageD forKey:indexPath];
[self.pendingOperations.downloadQueue addOperation:imageD];
}
Then I have the delegate method that updates the UIImageView when the download is done, which is working just fine.
The size of the content being loaded is around 400kb.
Any ideas?
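One way to narrow down where the time goes is to time a single plain request outside the operation queue, for example with NSURLSession. This is only a minimal diagnostic sketch, assuming the same photoRecord.URL as above:
// Sketch: time one request on its own to see whether the delay is in the
// network call itself or in the surrounding NSOperation / decoding work.
NSDate *start = [NSDate date];
NSURLSessionDataTask *task = [[NSURLSession sharedSession] dataTaskWithURL:self.photoRecord.URL
completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
    NSLog(@"Fetched %lu bytes in %.2f s (error: %@)",
          (unsigned long)data.length,
          [[NSDate date] timeIntervalSinceDate:start],
          error);
}];
[task resume];
If the plain request is fast, the bottleneck is more likely in the operation queue or the JSON/base64 decoding than in the download itself.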

Related

memory leaks issue while saving images to directory

Please try to understand my question.
I am picking images from the phone library and saving them into the Documents directory. But when I pick a large number of images, the memory used increases gradually, reaches over 400 MB, and then my app crashes. Can anybody tell me what I should do? I'm a newcomer to Objective-C. Any response will be appreciated.
Here is my code.
When the picker finishes picking:
- (void)agImagePickerController:(AGImagePickerController *)picker didFinishPickingMediaWithInfo:(NSArray *)info {
[self ShowLoadingView:@"Files Are Loading...."];
[self performSelectorInBackground:@selector(saveAllSelectedImages:) withObject:info];}
And then I save the images to the directory:
-(void) saveAllSelectedImages:(NSArray*)imagesArray{
for (int i=0; i<imagesArray.count; i++) {
ALAsset *asset = [imagesArray objectAtIndex:i];
ALAssetRepresentation *alassetRep = [asset defaultRepresentation];
NSDate *currentDate = [NSDate date];
NSString* DucPath = [[AppDelegate GetDocumentDirectoryPath] stringByAppendingPathComponent:@"Media"];
if (![[NSFileManager defaultManager] fileExistsAtPath:DucPath])
[[NSFileManager defaultManager] createDirectoryAtPath:DucPath withIntermediateDirectories:NO attributes:nil error:nil];
if ([[asset valueForProperty:ALAssetPropertyType] isEqualToString:ALAssetTypeVideo])
{
long long DataSize = [alassetRep size];
Byte *buffer = (Byte*)malloc(DataSize);
NSUInteger buffered = (NSUInteger)[alassetRep getBytes:buffer fromOffset:0.0 length:alassetRep.size error:nil];
NSData *videoData = [NSData dataWithBytesNoCopy:buffer length:buffered freeWhenDone:YES];
NSString* newVideoName = [NSString stringWithFormat:@"video_%d_%d.mov",(int)currentDate,i];
NSString* newVideoPath = [DucPath stringByAppendingPathComponent:newVideoName];
[videoData writeToFile:newVideoPath atomically:YES];
[pImageMediaArray addObject:newVideoName];
}
else
{
UIImage *image = [UIImage imageWithCGImage:[alassetRep fullResolutionImage]];
/************************************Full Resolution Images ******************************************/
NSData *imageData = UIImageJPEGRepresentation(image, 0.8);
image = nil;
NSString *originalPath = [NSString stringWithFormat:@"IMAGE_%d_%d.jpg",(int)currentDate,i];
NSString* pImagePath = [DucPath stringByAppendingPathComponent:originalPath];
[imageData writeToFile:pImagePath atomically:YES];
[pImageMediaArray addObject:originalPath];
}
/************************************Low Resolution Images ******************************************/
UIImage *image = [UIImage imageWithCGImage:[alassetRep fullResolutionImage]];
UIImage *thumbImage = [self imageWithImage:image scaledToSize:CGSizeMake(50, 50)];
NSData *thumbImageData = UIImageJPEGRepresentation(thumbImage, 0.8);
NSString *thumbOriginalPath = [NSString stringWithFormat:@"SMALL_IMAGE_%d_%d.jpg",(int)currentDate,i];
NSString* thumbImagePath = [DucPath stringByAppendingPathComponent:thumbOriginalPath];
NSLog(#"Image path At Save Time:%#",thumbImagePath);
[thumbImageData writeToFile:thumbImagePath atomically:YES];
[pMediaArray addObject:thumbOriginalPath];
}
[appDelegate setPMediaArray:pImageMediaArray];
[pGridView reloadData];
imagesArray = nil;
[imagesArray release];
[pImageMediaArray release];
[self performSelectorOnMainThread:@selector(closeLoadindView) withObject:nil waitUntilDone:YES];}
Could it be that the buffer from Byte *buffer = (Byte*)malloc(DataSize); is not being freed?
I had the exact same issue. What worked for me was to use an autorelease pool block when saving each image. This releases the temporary objects at the end of every iteration instead of keeping them in memory until the containing loop has finished running.
Example: In the method that you are using to save the images add code that looks like this:
@autoreleasepool {
NSString *filePath = [[NSArray arrayWithObjects:self.imagePath, @"/", GUID, @".png", nil] componentsJoinedByString:@""];
NSData *imageData = [NSData dataWithData:UIImagePNGRepresentation(image)];
BOOL res = [imageData writeToFile:filePath atomically:YES];
imageData = nil;
}
You need to add the autorelease pool for tasks that run in the background. In the code above, the body of saveAllSelectedImages: should be wrapped in an autorelease pool; otherwise the memory won't be released.
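As a rough sketch of what that looks like with the question's own method (same helper names as above, the video branch trimmed for brevity):
-(void)saveAllSelectedImages:(NSArray *)imagesArray {
    NSString *mediaPath = [[AppDelegate GetDocumentDirectoryPath] stringByAppendingPathComponent:@"Media"];
    for (int i = 0; i < imagesArray.count; i++) {
        @autoreleasepool {
            // Everything created in this iteration is released when the pool drains.
            ALAsset *asset = [imagesArray objectAtIndex:i];
            ALAssetRepresentation *rep = [asset defaultRepresentation];
            UIImage *image = [UIImage imageWithCGImage:[rep fullResolutionImage]];
            NSData *imageData = UIImageJPEGRepresentation(image, 0.8);
            NSString *fileName = [NSString stringWithFormat:@"IMAGE_%d.jpg", i];
            [imageData writeToFile:[mediaPath stringByAppendingPathComponent:fileName] atomically:YES];
        }
    }
    [self performSelectorOnMainThread:@selector(closeLoadindView) withObject:nil waitUntilDone:YES];
}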

Generating thumbnail from video - ios7

I am using this for reference: Getting thumbnail from a video url or data in IPhone SDK
The method there uses the MPMoviePlayerController class instead of AVFoundation, and I think I want to use it as well, because people said the MPMoviePlayerController way is faster than the AVFoundation way.
The problem is, the method used to create the thumbnails, [player thumbnailImageAtTime:1.0 timeOption:MPMovieTimeOptionNearestKeyFrame] is deprecated in iOS 7.0.
By looking at the apple docs, the remaining supported ways to create thumbnails are by the methods (void)requestThumbnailImagesAtTimes:(NSArray *)playbackTimes timeOption:(MPMovieTimeOption)option and (void)cancelAllThumbnailImageRequests. But, as the method signatures dictate, these methods return nothing. So how do I access the UIImage thumbnail created by these methods?
If it helps, this is what I have so far in terms of code:
self.videoURL = info[UIImagePickerControllerMediaURL];
NSData *videoData = [NSData dataWithContentsOfURL:self.videoURL];
//Create thumbnail image
MPMoviePlayerController *player = [[MPMoviePlayerController alloc] initWithContentURL:self.videoURL];
[player requestThumbnailImagesAtTimes:@[@1] timeOption:MPMovieTimeOptionNearestKeyFrame];
//UIImage *thumbnail = ???
How do I get a UIImage reference to the thumbnail?
EDIT
I figured out how to register for a notification for the thumbnail image request (using this question as reference). However, the request runs asynchronously from the main thread, and my notification handler method doesn't seem to ever be called.
This is what I have now.
self.videoURL = info[UIImagePickerControllerMediaURL];
NSData *videoData = [NSData dataWithContentsOfURL:self.videoURL];
MPMoviePlayerController *player = [[MPMoviePlayerController alloc] initWithContentURL:self.videoURL];
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(handleThumbnailImageRequestFinishNotification:) name:MPMoviePlayerThumbnailImageRequestDidFinishNotification object:player];
[player requestThumbnailImagesAtTimes:@[@1] timeOption:MPMovieTimeOptionNearestKeyFrame];
And then my handler method:
-(void)handleThumbnailImageRequestFinishNotification:(NSNotification*)notification
{
NSDictionary *userinfo = [notification userInfo];
NSError* value = [userinfo objectForKey:MPMoviePlayerThumbnailErrorKey];
if (value != nil)
{
NSLog(#"Error creating video thumbnail image. Details: %#", [value debugDescription]);
}
else
{
UIImage *thumbnail = [userinfo valueForKey:MPMoviePlayerThumbnailImageKey];
}
}
But the handler never gets called (or so it appears).
Try this way.
import AVFoundation framework
in *.h
#import <AVFoundation/AVFoundation.h>
in *.m
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:self.urlForConevW options:nil];
AVAssetImageGenerator *generateImg = [[AVAssetImageGenerator alloc] initWithAsset:asset];
NSError *error = NULL;
CMTime time = CMTimeMake(1, 65);
CGImageRef refImg = [generateImg copyCGImageAtTime:time actualTime:NULL error:&error];
NSLog(@"error==%@, Refimage==%@", error, refImg);
UIImage *FrameImage = [[UIImage alloc] initWithCGImage:refImg];
CGImageRelease(refImg); // release the copied CGImage; the UIImage keeps its own reference
Here is some code to make a thumbnail of the video and save the image to the Documents directory:
//pass the video_path to NSURL
NSURL *videoURL = [NSURL fileURLWithPath:strVideoPath];
AVURLAsset *asset1 = [[AVURLAsset alloc] initWithURL:videoURL options:nil];
AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset1];
generator.appliesPreferredTrackTransform = YES;
//Set the time and size of thumbnail for image
NSError *err = NULL;
CMTime thumbTime = CMTimeMakeWithSeconds(0,30);
CGSize maxSize = CGSizeMake(425,355);
generator.maximumSize = maxSize;
CGImageRef imgRef = [generator copyCGImageAtTime:thumbTime actualTime:NULL error:&err];
UIImage *thumbnail = [[UIImage alloc] initWithCGImage:imgRef];
CGImageRelease(imgRef); // release the copied CGImage; the UIImage keeps its own reference
//And you can save the image to the DocumentDirectory
NSData *data = UIImagePNGRepresentation(thumbnail);
//Path for the documentDirectory
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
[data writeToFile:[documentsDirectory stringByAppendingPathComponent:currentFileName] atomically:YES];
If your URL is to an HTTP live stream, then it won't return anything, per the docs. For a file URL, I found that I had to start the request after playing the movie, or it would never get called.
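If you stay with MPMoviePlayerController, a rough sketch of that ordering looks like the following. It assumes the player is kept alive in a property so it isn't deallocated before the notification fires, and it prepares playback first, per the note above; the handler is the one from the question:
// Sketch: keep a strong reference to the player, register for the
// notification first, then prepare playback before requesting the thumbnail.
self.player = [[MPMoviePlayerController alloc] initWithContentURL:self.videoURL];
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(handleThumbnailImageRequestFinishNotification:)
                                             name:MPMoviePlayerThumbnailImageRequestDidFinishNotification
                                           object:self.player];
[self.player prepareToPlay]; // for a file URL the request may never complete without starting/preparing playback
[self.player requestThumbnailImagesAtTimes:@[@1] timeOption:MPMovieTimeOptionNearestKeyFrame];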

saving images from a url takes a long time

The code below successfully gets the image from the link and stores it in my cache directory. But I want to get many (around 100) images from different URLs (on the same website; only the filename differs). It works fine for fetching those images, but I have to wait a long time. Is there any way to get the images easily and make the response time much faster?
NSString *UCIDLink = [NSString stringWithFormat:@"http://www.example.com/picture.png"];
NSURL * imageURL = [NSURL URLWithString:UCIDLink];
NSData * imageData = [NSData dataWithContentsOfURL:imageURL];
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSCachesDirectory, NSUserDomainMask, YES);
NSString *filePath = [[paths objectAtIndex:0] stringByAppendingPathComponent:[NSString stringWithFormat:@"picture.png"]];
NSError *writeError = nil;
[imageData writeToFile:filePath options:NSDataWritingAtomic error:&writeError];
if (writeError) {
NSLog(@"Failed: %@", writeError);
}else{
NSLog(@"Success");
}
The code you are using takes time to load the image contents, so prefer to load the image asynchronously. Use the code below:
dispatch_queue_t q = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0ul);
dispatch_async(q, ^{
/* Fetch the image from the server... */
NSData *data = [NSData dataWithContentsOfURL:url];
UIImage *img = [[UIImage alloc] initWithData:data];
dispatch_async(dispatch_get_main_queue(), ^{
/* This is the main thread again, where we set the tableView's image to
be what we just fetched. */
cell.imgview.image = img;
});
});
or you can use:
AsyncImageView *asyncImageView = [[AsyncImageView alloc]initWithFrame:CGRectMake(30,32,100, 100)];
[asyncImageView loadImageFromURL:[NSURL URLWithString:your url]];
[YourImageView addSubview:asyncImageView];
[asyncImageView release];
Download the files from here:
https://github.com/nicklockwood/AsyncImageView
Use multi-threading in order to make a number of image fetches happen simultaneously. That way, you can reduce your waiting time a lot.
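A minimal sketch of that idea using a dispatch group on a concurrent queue (the fileNames array and the base URL are placeholders standing in for the ~100 filenames on the same site):
// Sketch: kick off all downloads concurrently and get notified once they all finish.
dispatch_group_t group = dispatch_group_create();
dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
NSString *cacheDir = [NSSearchPathForDirectoriesInDomains(NSCachesDirectory, NSUserDomainMask, YES) objectAtIndex:0];
for (NSString *fileName in fileNames) { // e.g. @[@"picture1.png", @"picture2.png", ...]
    dispatch_group_async(group, queue, ^{
        NSURL *url = [NSURL URLWithString:[@"http://www.example.com/" stringByAppendingString:fileName]];
        NSData *imageData = [NSData dataWithContentsOfURL:url];
        [imageData writeToFile:[cacheDir stringByAppendingPathComponent:fileName] atomically:YES];
    });
}
dispatch_group_notify(group, dispatch_get_main_queue(), ^{
    NSLog(@"All downloads finished");
});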

Loading Images from JSON to Xcode

I'm having a bit of difficulty loading images from a JSON feed into the UIImage table cells in Xcode. I tried to load the images from the server into an NSArray and then populate the table view's image cells. Is there something I am missing here?
The images are located on a SQL server.
Thanks for the help.
Here is the server output from the PHP into Xcode (cover_image):
(
"13497074790148.jpeg",
"13494650900147.png",
"13494606630147.png",
"13494605220147.jpeg",
"13494602920147.jpeg",
"13494601850147.jpeg",
"13491916300147.jpeg"
)
Here is the code in Xcode
NSArray *itemsimages = [[NSArray alloc]initWithArray:[results valueForKeyPath:@"cover_image"]];
self.itemImages = itemsimages;
Here is the code in table cells
UIImage *imageitm = [UIImage imageNamed: [self.itemImages objectAtIndex: [indexPath row]]];
cell.itmImage.image = imageitm;
return cell;
You don't have those images stored locally, so imageNamed: has no image to display. I suggest using SDWebImage to provide asynchronous image loading from a remote location plus a caching mechanism.
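As a rough sketch of how that typically looks in a cell: the base URL below is a placeholder you would replace with wherever your server hosts the cover_image files, and recent SDWebImage versions use the sd_ prefixed method while older ones use setImageWithURL:placeholderImage:.
// Sketch: let SDWebImage download and cache the image asynchronously.
#import <SDWebImage/UIImageView+WebCache.h>

NSString *fileName = [self.itemImages objectAtIndex:indexPath.row];
NSURL *imageURL = [NSURL URLWithString:[@"http://yourserver.com/images/" stringByAppendingString:fileName]];
[cell.itmImage sd_setImageWithURL:imageURL placeholderImage:[UIImage imageNamed:@"placeholder.png"]];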
-(void) viewDidLoad
{
NSURL *url = [NSURL URLWithString:@"YOUR URL"];
NSData *data = [NSData dataWithContentsOfURL:url];
NSError *error;
NSMutableDictionary *json = [NSJSONSerialization JSONObjectWithData:data options:kNilOptions error:&error];
NSMutableArray *img = [[NSMutableArray alloc]init];
NSArray *websiteDetails = (NSArray *) [json objectForKey:@"logos"];
for(int count=0; count<[websiteDetails count]; count++)
{
NSDictionary *websiteInfo = (NSDictionary *) [websiteDetails objectAtIndex:count];
imagefile = (NSString *) [websiteInfo objectForKey:@"image_file"];
if([imagefile length]>0)
{
NSLog(@"Imagefile URL is: %@",imagefile);
[img addObject:imagefile];
}
}
//NSarray listofURL
listofURL = img;
}
-(UITableViewCell *)tableView:(UITableView *)tableView cellForRowAtIndexPath:(NSIndexPath *)indexPath
{
static NSString *simpleTableIdentifier = @"SimpleTableItem";
UITableViewCell *cell = [tableView dequeueReusableCellWithIdentifier:simpleTableIdentifier];
if (cell == nil) {
cell = [[UITableViewCell alloc] initWithStyle:UITableViewCellStyleDefault reuseIdentifier:simpleTableIdentifier];
}
dispatch_queue_t concurrentQueue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
//this will start the image loading in bg
dispatch_async(concurrentQueue, ^{
NSURL *url = [NSURL URLWithString:[listofURL objectAtIndex:indexPath.row]];
NSData *image = [[NSData alloc] initWithContentsOfURL:url];
//this will set the image when loading is finished
dispatch_async(dispatch_get_main_queue(), ^{
cell.imageView.image = [UIImage imageWithData:image];
});
});
return cell;
}
You need to have a proper URL in the JSON response, or you can store the common part of the URL in the code itself and append the image name returned from the server, as sketched below.
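A minimal sketch of that appending approach (the base URL is a placeholder, not something from the question):
// Sketch: combine a hard-coded base URL with the file name from the JSON response.
static NSString *const kImageBaseURL = @"http://yourserver.com/uploads/"; // placeholder
NSString *fileName = [self.itemImages objectAtIndex:indexPath.row]; // e.g. "13497074790148.jpeg"
NSURL *imageURL = [NSURL URLWithString:[kImageBaseURL stringByAppendingString:fileName]];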
I did the following in the same situation:
__autoreleasing NSError* error = nil;
id result = [NSJSONSerialization JSONObjectWithData:data options:kNilOptions error:&error];
NSDictionary *dict = ((NSDictionary *) result)[@"result"];
NSString *url = dict[@"imageURL"];
NSData *imageData = [[NSData alloc] initWithContentsOfURL:[NSURL URLWithString:url]];
UIImage *image = [[UIImage alloc] initWithData:imageData];
[_buttonImageView setImage:image forState:UIControlStateNormal];
where data is the response returned from the server.

Getting Image from URL Objective C

I'm trying to get an image from a URL and it doesn't seem to be working for me. Can someone point me in the right direction?
Here is my code:
NSURL *url = [NSURL URLWithString:@"http://myurl/mypic.jpg"];
NSString *newIMAGE = [[NSString alloc] initWithContentsOfURL:url
encoding:NSUTF8StringEncoding error:nil];
cell.image = [UIImage imageNamed:newIMAGE];
When I debug, the newIMAGE string is nil, so something isn't working there.
What you want is to get the image data, then initialize a UIImage using that data:
NSData * imageData = [[NSData alloc] initWithContentsOfURL: [NSURL URLWithString: @"http://myurl/mypic.jpg"]];
cell.image = [UIImage imageWithData: imageData];
[imageData release];
As requested, here's an asynchronous version:
dispatch_async(dispatch_get_global_queue(0,0), ^{
NSData * data = [[NSData alloc] initWithContentsOfURL: [NSURL URLWithString: @"http://myurl/mypic.jpg"]];
if ( data == nil )
return;
dispatch_async(dispatch_get_main_queue(), ^{
// WARNING: is the cell still using the same data by this point??
cell.image = [UIImage imageWithData: data];
});
[data release];
});
OK, there are a couple of things wrong here:
The conversion from the URL (url) to an NSString (newIMAGE) is incorrect; what the code actually does there is try to load the contents of "http://myurl/mypic.jpg" into the NSString.
The -imageNamed: method takes the name of a local image file as its argument, not a URL.
You need to use an NSData object as an intermediary, like in this example:
http://blogs.oreilly.com/digitalmedia/2008/02/creating-an-uiimage-from-a-url.html
The accepted answer's asynchronous version worked very slowly in my code. An approach using NSOperation worked light years faster. The code was provided by Joe Masilotti in "objective - C : Loading image from URL?" and is pasted below:
-(void) someMethod {
// set placeholder image
UIImage* memberPhoto = [UIImage imageNamed:@"place_holder_image.png"];
// retrieve image for cell in using NSOperation
NSURL *url = [NSURL URLWithString:group.photo_link[indexPath.row]];
[self loadImage:url];
}
- (void)loadImage:(NSURL *)imageURL
{
NSOperationQueue *queue = [NSOperationQueue new];
NSInvocationOperation *operation = [[NSInvocationOperation alloc]
initWithTarget:self
selector:@selector(requestRemoteImage:)
object:imageURL];
[queue addOperation:operation];
}
- (void)requestRemoteImage:(NSURL *)imageURL
{
NSData *imageData = [[NSData alloc] initWithContentsOfURL:imageURL];
UIImage *image = [[UIImage alloc] initWithData:imageData];
[self performSelectorOnMainThread:@selector(placeImageInUI:) withObject:image waitUntilDone:YES];
}
- (void)placeImageInUI:(UIImage *)image
{
[self.memberPhotoImage setImage:image];
}
In Swift 3 and 4
let theURL = URL(string:"https://exampleURL.com")
let imagedData = NSData(contentsOf: theURL!)!
let theImage = UIImage(data: imagedData as Data)
cell.theImageView.image = theImage
This will be done on the main thread.
And to perform the same work asynchronously, loading the data on a background queue and updating the image view back on the main queue:
DispatchQueue.global().async {
let theURL = URL(string:"https://exampleURL.com")
let imagedData = NSData(contentsOf: theURL!)!
let theImage = UIImage(data: imagedData as Data)
DispatchQueue.main.async {
cell.theImageView.image = theImage
}
}
Updating Jim Dovey's answer: [data release] is no longer required under the current Apple guidelines, since memory management is handled automatically by ARC (Automatic Reference Counting).
Here is the updated asynchronous call:
dispatch_async(dispatch_get_global_queue(0,0), ^{
NSData * data = [[NSData alloc] initWithContentsOfURL: [NSURL URLWithString: @"your_URL"]];
if ( data == nil )
return;
dispatch_async(dispatch_get_main_queue(), ^{
self.your_UIimage.image = [UIImage imageWithData: data];
});
});