If UIImage is an autoreleased object, why does the static analyzer complain that the second line below is a potential leak of an object stored into image?
NSData *data = [[NSData alloc] initWithContentsOfURL: ImageURL];
UIImage *image = [[UIImage alloc] initWithData: data];
[data release];
// Do we want to round the corners?
image = [self roundCorners: image];
// Is it PNG or JPG/JPEG?
// Running the image representation function writes the data from the image to a file
if([ImageURLString rangeOfString: @".png" options: NSCaseInsensitiveSearch].location != NSNotFound)
{
[UIImagePNGRepresentation(image) writeToFile: uniquePath atomically: YES];
}
else if(
[ImageURLString rangeOfString: @".jpg" options: NSCaseInsensitiveSearch].location != NSNotFound ||
[ImageURLString rangeOfString: @".jpeg" options: NSCaseInsensitiveSearch].location != NSNotFound
)
{
[UIImageJPEGRepresentation(image, 100) writeToFile: uniquePath atomically: YES];
}
Why do you say your UIImage is autoreleased? I see only
UIImage *image = [[UIImage alloc] initWithData: data];
Use instead
UIImage *image = [[[UIImage alloc] initWithData: data] autorelease];
As an alternative you may use:
UIImage *tmp = [[UIImage alloc] initWithData: data];
UIImage *image = [self roundCorners: tmp];
[tmp release];
(assuming roundCorners returns an autoreleased object).
In your code, on the second line, your UIImage isn't autoreleased. As soon as you use an alloc/init method, you own (retain) the object and are responsible for releasing it. A convenience method like imageNamed: returns an autoreleased object instead.
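For reference, a minimal MRC (non-ARC) sketch of the balanced-ownership version, reusing the names from the question (ImageURL, uniquePath, and roundCorners: are assumed to exist as in the original code, and roundCorners: is assumed to return an autoreleased image):
NSData *data = [[NSData alloc] initWithContentsOfURL:ImageURL];
UIImage *image = [[UIImage alloc] initWithData:data];
[data release];

// roundCorners: returns an autoreleased image, so the alloc'ed original can be
// released right away; this balances the +1 from alloc/init and silences the analyzer.
UIImage *rounded = [self roundCorners:image];
[image release];

[UIImagePNGRepresentation(rounded) writeToFile:uniquePath atomically:YES];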
Please try to understand my question.
I am picking images from the phone library and saving them into the Documents directory. When I pick a large number of images, the memory used increases gradually, climbs above 400 MB, and then my app crashes. Can anybody tell me what I should do? I'm a newcomer to Objective-C. Any response will be appreciated.
Here is my code.
When the picker finishes picking:
- (void)agImagePickerController:(AGImagePickerController *)picker didFinishPickingMediaWithInfo:(NSArray *)info {
[self ShowLoadingView:@"Files Are Loading...."];
[self performSelectorInBackground:@selector(saveAllSelectedImages:) withObject:info];}
And then I save the images to the directory:
-(void) saveAllSelectedImages:(NSArray*)imagesArray{
for (int i=0; i<imagesArray.count; i++) {
ALAsset *asset = [imagesArray objectAtIndex:i];
ALAssetRepresentation *alassetRep = [asset defaultRepresentation];
NSDate *currentDate = [NSDate date];
NSString* DucPath = [[AppDelegate GetDocumentDirectoryPath] stringByAppendingPathComponent:@"Media"];
if (![[NSFileManager defaultManager] fileExistsAtPath:DucPath])
[[NSFileManager defaultManager] createDirectoryAtPath:DucPath withIntermediateDirectories:NO attributes:nil error:nil];
if ([[asset valueForProperty:ALAssetPropertyType] isEqualToString:ALAssetTypeVideo])
{
long long DataSize = [alassetRep size];
Byte *buffer = (Byte*)malloc(DataSize);
NSUInteger buffered = (NSUInteger)[alassetRep getBytes:buffer fromOffset:0.0 length:alassetRep.size error:nil];
NSData *videoData = [NSData dataWithBytesNoCopy:buffer length:buffered freeWhenDone:YES];
NSString* newVideoName = [NSString stringWithFormat:@"video_%d_%d.mov",(int)currentDate,i];
NSString* newVideoPath = [DucPath stringByAppendingPathComponent:newVideoName];
[videoData writeToFile:newVideoPath atomically:YES];
[pImageMediaArray addObject:newVideoName];
}
else
{
UIImage *image = [UIImage imageWithCGImage:[alassetRep fullResolutionImage]];
/************************************Full Resolution Images ******************************************/
NSData *imageData = UIImageJPEGRepresentation(image, 0.8);
image = nil;
NSString *originalPath = [NSString stringWithFormat:@"IMAGE_%d_%d.jpg",(int)currentDate,i];
NSString* pImagePath = [DucPath stringByAppendingPathComponent:originalPath];
[imageData writeToFile:pImagePath atomically:YES];
[pImageMediaArray addObject:originalPath];
}
/************************************Low Resolution Images ******************************************/
UIImage *image = [UIImage imageWithCGImage:[alassetRep fullResolutionImage]];
UIImage *thumbImage = [self imageWithImage:image scaledToSize:CGSizeMake(50, 50)];
NSData *thumbImageData = UIImageJPEGRepresentation(thumbImage, 0.8);
NSString *thumbOriginalPath = [NSString stringWithFormat:@"SMALL_IMAGE_%d_%d.jpg",(int)currentDate,i];
NSString* thumbImagePath = [DucPath stringByAppendingPathComponent:thumbOriginalPath];
NSLog(#"Image path At Save Time:%#",thumbImagePath);
[thumbImageData writeToFile:thumbImagePath atomically:YES];
[pMediaArray addObject:thumbOriginalPath];
}
[appDelegate setPMediaArray:pImageMediaArray];
[pGridView reloadData];
imagesArray = nil;
[imagesArray release];
[pImageMediaArray release];
[self performSelectorOnMainThread:@selector(closeLoadindView) withObject:nil waitUntilDone:YES];}
Byte *buffer = (Byte*)malloc(DataSize);
is not being freed?
I had the exact same issue. What worked for me was to use an @autoreleasepool block around the code that saves each image. This drains the temporary objects on every pass through the loop instead of keeping them in memory until the containing loop has finished running.
Example: In the method that you are using to save the images, add code that looks like this:
@autoreleasepool {
NSString *filePath = [[NSArray arrayWithObjects:self.imagePath, @"/", GUID, @".png", nil] componentsJoinedByString:@""];
NSData *imageData = [NSData dataWithData:UIImagePNGRepresentation(image)];
BOOL res = [imageData writeToFile:filePath atomically:YES];
imageData = nil;
}
You need an autorelease pool for work that is performed in the background. In the code above, the body of saveAllSelectedImages: should be wrapped in @autoreleasepool; otherwise the memory won't be released. See the sketch below.
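For illustration, a minimal sketch of the idea applied to the loop from the question (only the image branch is shown; AppDelegate's GetDocumentDirectoryPath and the file-naming scheme are taken from the question and slightly simplified):
-(void) saveAllSelectedImages:(NSArray*)imagesArray{
    NSString *ducPath = [[AppDelegate GetDocumentDirectoryPath] stringByAppendingPathComponent:@"Media"];
    for (int i = 0; i < imagesArray.count; i++) {
        @autoreleasepool {
            ALAsset *asset = [imagesArray objectAtIndex:i];
            ALAssetRepresentation *alassetRep = [asset defaultRepresentation];
            UIImage *image = [UIImage imageWithCGImage:[alassetRep fullResolutionImage]];
            NSData *imageData = UIImageJPEGRepresentation(image, 0.8);
            NSString *path = [ducPath stringByAppendingPathComponent:
                                 [NSString stringWithFormat:@"IMAGE_%d.jpg", i]];
            [imageData writeToFile:path atomically:YES];
        } // all temporaries created in this iteration are released here, keeping memory flat
    }
}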
I'm very new to Objective-C and am having some beginner issues. I have an application with an area that is supposed to behave somewhat like a photo gallery. The user chooses a picture from their camera roll, and the photos get displayed in UIImageViews. I'm trying to save the images that they select. I have nine UIImageViews, and the issue is that when I select a different photo for each UIImageView, then close and relaunch the app, the other eight UIImageViews display the photo that is stored in the first image view. Here is the code that I'm working with:
- (NSString *)dataFilePath {
NSArray *paths = NSSearchPathForDirectoriesInDomains(
NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
return [documentsDirectory stringByAppendingPathComponent:kFilename9];
}
- (void)applicationDidEnterBackground:(UIApplication*)application {
NSLog(#"Image on didenterbackground: %#", imageView);
self.imageData = [NSData dataWithData:UIImagePNGRepresentation(imageView.image)];
self.imageData = [NSData dataWithData:UIImagePNGRepresentation(imageView2.image)];
self.imageData = [NSData dataWithData:UIImagePNGRepresentation(imageView3.image)];
self.imageData = [NSData dataWithData:UIImagePNGRepresentation(imageView4.image)];
self.imageData = [NSData dataWithData:UIImagePNGRepresentation(imageView5.image)];
self.imageData = [NSData dataWithData:UIImagePNGRepresentation(imageView6.image)];
self.imageData = [NSData dataWithData:UIImagePNGRepresentation(imageView7.image)];
self.imageData = [NSData dataWithData:UIImagePNGRepresentation(imageView8.image)];
self.imageData = [NSData dataWithData:UIImagePNGRepresentation(imageView9.image)];
[self.imageData writeToFile:[self dataFilePath] atomically:YES];
NSLog(#"The image is: %#", [[imageView image] description]);
NSLog(#"dataFilePath is: %#", [self dataFilePath]);
}
- (void)viewDidLoad
{
NSString *filePath = [self dataFilePath];
NSLog(#"FilePath: %#", filePath);
NSLog(#"Image: %#", imageView);
if ([[NSFileManager defaultManager] fileExistsAtPath:filePath]) {
NSData *vdlData = [[NSData alloc] initWithContentsOfFile:filePath];
imageView.image = [[UIImage alloc] initWithData:vdlData];
imageView2.image = [[UIImage alloc] initWithData:vdlData];
imageView3.image = [[UIImage alloc] initWithData:vdlData];
imageView4.image = [[UIImage alloc] initWithData:vdlData];
imageView5.image = [[UIImage alloc] initWithData:vdlData];
imageView6.image = [[UIImage alloc] initWithData:vdlData];
imageView7.image = [[UIImage alloc] initWithData:vdlData];
imageView8.image = [[UIImage alloc] initWithData:vdlData];
imageView9.image = [[UIImage alloc] initWithData:vdlData];
}
UIApplication *app = [UIApplication sharedApplication];
[[NSNotificationCenter defaultCenter] addObserver:self
selector:@selector(applicationDidEnterBackground:)
name:UIApplicationDidEnterBackgroundNotification
object:app];
[super viewDidLoad];
}
I'm trying to figure out what I need to change to get the UIImageViews to display the correct pictures, rather than them all displaying the same picture. This is probably a simple fix, but any help would be greatly appreciated, thanks!
Okay, here's how I would do it:
Use NSUserDefaults to save your images as a mutable array:
ViewController.h
@property(retain) NSUserDefaults *user;
ViewController.m
@synthesize user;
- (void)viewWillAppear:(BOOL)animated
{
self.user = [NSUserDefaults standardUserDefaults];
Edit
NSMutableArray* array = [[self.user objectForKey:@"images"]mutableCopy];
while(array == nil)
{
[self.user setObject:[NSMutableArray arrayWithObject:@""] forKey:@"images"];
array = [[self.user objectForKey:@"images"]mutableCopy];
NSLog(@"%@",@"attempting to create an array to store the images in");
}
End Edit
}
- (void)applicationDidEnterBackground:(UIApplication*)application {
NSLog(#"Image on didenterbackground: %#", imageView);
NSMutableArray* array = [NSMutableArray arrayWithObject:[NSData dataWithData:UIImagePNGRepresentation(imageView.image)]];
[array addObject:[NSData dataWithData:UIImagePNGRepresentation(imageView2.image)]];
[array addObject:[NSData dataWithData:UIImagePNGRepresentation(imageView3.image)]];
[array addObject:[NSData dataWithData:UIImagePNGRepresentation(imageView4.image)]];
[array addObject:[NSData dataWithData:UIImagePNGRepresentation(imageView5.image)]];
[array addObject:[NSData dataWithData:UIImagePNGRepresentation(imageView6.image)]];
[array addObject:[NSData dataWithData:UIImagePNGRepresentation(imageView7.image)]];
[array addObject:[NSData dataWithData:UIImagePNGRepresentation(imageView8.image)]];
[array addObject:[NSData dataWithData:UIImagePNGRepresentation(imageView9.image)]];
[self.user setObject:array forKey:@"images"];
}
- (void)viewDidLoad
{
NSMutableArray* array = [[self.user objectForKey:@"images"]mutableCopy];
EDIT
if(array.count == 9)
{
imageView.image = [[UIImage alloc] initWithData:[array objectAtIndex:0]];
imageView2.image = [[UIImage alloc] initWithData:[array objectAtIndex:1]];
imageView3.image = [[UIImage alloc] initWithData:[array objectAtIndex:2]];
imageView4.image = [[UIImage alloc] initWithData:[array objectAtIndex:3]];
imageView5.image = [[UIImage alloc] initWithData:[array objectAtIndex:4]];
imageView6.image = [[UIImage alloc] initWithData:[array objectAtIndex:5]];
imageView7.image = [[UIImage alloc] initWithData:[array objectAtIndex:6]];
imageView8.image = [[UIImage alloc] initWithData:[array objectAtIndex:7]];
imageView9.image = [[UIImage alloc] initWithData:[array objectAtIndex:8]];
}
END EDIT
UIApplication *app = [UIApplication sharedApplication];
[[NSNotificationCenter defaultCenter] addObserver:self
selector:@selector(applicationDidEnterBackground:)
name:UIApplicationDidEnterBackgroundNotification
object:app];
[super viewDidLoad];
}
- (void)viewDidUnload
{
self.user = nil;
}
This way, you will not lose the images or data, they will be stored and easily accessed, and they will not disappear even if you update your app.
Cheers!
Before I start with the solution, I have to warn you that the way you're doing this isn't the right one. I suggest that you start learning iOS development from the ground up. Apple's own documentation is a pretty good start. http://developer.apple.com/library/ios/#documentation/iPhone/Conceptual/iPhoneOSProgrammingGuide/Introduction/Introduction.html
Now, back to your question. What you do here is save only one image, not all nine of them. You reassign self.imageData for each image you process, overwriting its previous value, so only the last image ends up being saved to file.
So, in order to make your code work, you would have to use a separate imageData object for each image view, then write each data object to its own file.
In your case, it's probably best to optimize the code by using loops instead of multiple individually named objects (imageView, imageView2, ...).
Also, make sure that you take care of your memory; e.g., the UIImage assigned to imageView.image is allocated but never released.
Well, I see two issues. First and foremost, in viewDidLoad, all your images are getting initWithData:vdlData... so they're all getting the same data. That's why they're all the same.
Also, when you're trying to save them in ...didEnterBackground, you are overwriting the value of imageData over and over again, so when you write it out, it's just the last one you assigned to imageData. You probably want to create an NSArray, store the image data objects in there, and pull them back out of the array in viewDidLoad, as in the sketch below.
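A minimal sketch of that array-based approach, reusing dataFilePath and the nine imageView outlets from the question (the single-file layout and the array literal syntax are assumptions, not the asker's original code):
- (void)applicationDidEnterBackground:(UIApplication *)application {
    NSArray *views = @[imageView, imageView2, imageView3, imageView4, imageView5,
                       imageView6, imageView7, imageView8, imageView9];
    NSMutableArray *allImageData = [NSMutableArray array];
    for (UIImageView *iv in views) {
        // NSData is a property-list type, so the array can be written straight to disk
        NSData *png = UIImagePNGRepresentation(iv.image);
        [allImageData addObject:(png ? png : [NSData data])]; // keep indexes aligned even if a slot is empty
    }
    [allImageData writeToFile:[self dataFilePath] atomically:YES];
}

- (void)viewDidLoad {
    [super viewDidLoad];
    NSArray *allImageData = [NSArray arrayWithContentsOfFile:[self dataFilePath]];
    NSArray *views = @[imageView, imageView2, imageView3, imageView4, imageView5,
                       imageView6, imageView7, imageView8, imageView9];
    for (NSUInteger idx = 0; idx < views.count && idx < allImageData.count; idx++) {
        UIImageView *iv = [views objectAtIndex:idx];
        iv.image = [UIImage imageWithData:[allImageData objectAtIndex:idx]];
    }
}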
I have six image URLs in a string named aux. I split it with componentsSeparatedByCharactersInSet: using the newline character set. I want to pick the images one by one from aux and display each in a UIImageView. My code is:
aux = [_homeText.text stringByAppendingString:[NSString stringWithFormat:@" %@ ",item.shortDescription]];
[_homeText setText:aux];
NSString *list =aux;
NSArray *listItems = [list componentsSeparatedByCharactersInSet:[NSCharacterSet newlineCharacterSet]];
[listItems count];
if ([listItems objectAtIndex:0]) {
NSURL *url = [NSURL URLWithString:aux];
/*UIAlertView *alert = [[UIAlertView alloc]initWithTitle:@"hai" message:aux delegate:self cancelButtonTitle:@"ok" otherButtonTitles:nil];
[alert show];
[alert release];*/
NSData *data = [NSData dataWithContentsOfURL:url];
UIImage *image = [[UIImage alloc] initWithData:data];
[_homeImage setImage:image];
}
else if ([listItems objectAtIndex:1])
{
NSURL *url = [NSURL URLWithString:aux];
NSData *data = [NSData dataWithContentsOfURL:url];
UIImage *image = [[UIImage alloc] initWithData:data];
[_homeImage setImage:image];
}
There is no error but I get no image. Why? How do I fix this?
If both data and image are not nil, you may need to check that _homeImage is correctly initialized.
I don't know for sure, but I suspect you created _homeImage like this:
_homeImage = [[UIImageView alloc] init];
Try to do something like this:
CGSize imageSize = [image size];
[_homeImage setFrame:CGRectMake([_homeImage frame].origin.x,
[_homeImage frame].origin.y,
imageSize.width, imageSize.height)];
[_homeImage setImage:image];
[image release], image = nil;
I'm going around in circles with the following code, which gives me a memory leak on the pics object, apparently linked to the imageName object.
for (int i = 0;i<[potatoesIndexesArray count];i++){
int imageNumber = [[potatoesIndexesArray objectAtIndex:i]intValue];
NSString *imageName = [[NSString alloc] initWithFormat:@"texture%d",imageNumber];
UIImage *image = [[UIImage alloc] initWithContentsOfFile:[[NSBundle mainBundle] pathForResource:imageName ofType:@"png"]];
//UIImage *imageHighlighted = [[UIImage alloc]initWithContentsOfFile:[[NSBundle mainBundle] pathForResource:imageName ofType:@"png"]];
NSArray *pics = [[NSArray alloc] initWithObjects:
[self maskImage:image withMask:[mainDelegate.masksArray objectAtIndex:i]],
[self maskImage:image withMask:[mainDelegate.masksArray objectAtIndex:i]],
imageName,
nil]; // pics becomes owner of objects
[textures addObject:[pics retain]]; // textures becomes owner of pics; since a release occurs later, we retain pics to keep it available in textures.
[imageName release];
[image release];
[pics release];
//[imageHighlighted release];
}
I've read the Apple documentation on memory management, but I can't find what I did wrong there... any idea?
Cheers,
Tibi.
If textures is an NSMutableArray, then your [textures addObject:] call already sends a retain to pics, so the extra retain is what leaks. The code should be:
[textures addObject:pics];
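For clarity, here is a minimal sketch of the corrected loop body under manual reference counting (maskImage:withMask:, mainDelegate.masksArray, textures, and the loop variables are the names from the question):
NSString *imageName = [[NSString alloc] initWithFormat:@"texture%d", imageNumber];
UIImage *image = [[UIImage alloc] initWithContentsOfFile:
                     [[NSBundle mainBundle] pathForResource:imageName ofType:@"png"]];
NSArray *pics = [[NSArray alloc] initWithObjects:
                     [self maskImage:image withMask:[mainDelegate.masksArray objectAtIndex:i]],
                     [self maskImage:image withMask:[mainDelegate.masksArray objectAtIndex:i]],
                     imageName,
                     nil];
[textures addObject:pics]; // the mutable array retains pics itself; no extra retain needed
[imageName release];
[image release];
[pics release];            // balances the alloc above; textures still owns its own reference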
I'm trying to get an image from a URL and it doesn't seem to be working for me. Can someone point me in the right direction?
Here is my code:
NSURL *url = [NSURL URLWithString:@"http://myurl/mypic.jpg"];
NSString *newIMAGE = [[NSString alloc] initWithContentsOfURL:url
encoding:NSUTF8StringEncoding error:nil];
cell.image = [UIImage imageNamed:newIMAGE];
When I debug, the newIMAGE string is nil, so something isn't working there.
What you want is to get the image data, then initialize a UIImage using that data:
NSData * imageData = [[NSData alloc] initWithContentsOfURL: [NSURL URLWithString: @"http://myurl/mypic.jpg"]];
cell.image = [UIImage imageWithData: imageData];
[imageData release];
As requested, here's an asynchronous version:
dispatch_async(dispatch_get_global_queue(0,0), ^{
NSData * data = [[NSData alloc] initWithContentsOfURL: [NSURL URLWithString: @"http://myurl/mypic.jpg"]];
if ( data == nil )
return;
dispatch_async(dispatch_get_main_queue(), ^{
// WARNING: is the cell still using the same data by this point??
cell.image = [UIImage imageWithData: data];
});
[data release];
});
OK, there are a couple of things wrong here:
The conversion from the URL (url) to the NSString (newIMAGE) is incorrect: what that code actually does is try to load the contents of "http://myurl/mypic.jpg" into the string as text.
The imageNamed: method takes the name of an image file in your app bundle as its argument, not a URL.
You need to use an NSData object as an intermediary, like in this example:
http://blogs.oreilly.com/digitalmedia/2008/02/creating-an-uiimage-from-a-url.html
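For reference, a minimal sketch of that NSData round trip, using the URL from the question (this loads synchronously and will block the calling thread; error handling is left out):
NSURL *url = [NSURL URLWithString:@"http://myurl/mypic.jpg"];
NSData *imageData = [NSData dataWithContentsOfURL:url];   // downloads the raw bytes
if (imageData != nil) {
    cell.image = [UIImage imageWithData:imageData];       // decodes the bytes into a UIImage
}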
The accepted answer's asynchronous version worked very slowly in my code. An approach using NSOperation worked light years faster. The code was provided by Joe Masilotti in Objective-C: Loading image from URL? (and is pasted below):
-(void) someMethod {
// set placeholder image
UIImage* memberPhoto = [UIImage imageNamed:@"place_holder_image.png"];
// retrieve image for cell in using NSOperation
NSURL *url = [NSURL URLWithString:group.photo_link[indexPath.row]];
[self loadImage:url];
}
- (void)loadImage:(NSURL *)imageURL
{
NSOperationQueue *queue = [NSOperationQueue new];
NSInvocationOperation *operation = [[NSInvocationOperation alloc]
initWithTarget:self
selector:@selector(requestRemoteImage:)
object:imageURL];
[queue addOperation:operation];
}
- (void)requestRemoteImage:(NSURL *)imageURL
{
NSData *imageData = [[NSData alloc] initWithContentsOfURL:imageURL];
UIImage *image = [[UIImage alloc] initWithData:imageData];
[self performSelectorOnMainThread:@selector(placeImageInUI:) withObject:image waitUntilDone:YES];
}
- (void)placeImageInUI:(UIImage *)image
{
[self.memberPhotoImage setImage:image];
}
In Swift 3 and 4
let theURL = URL(string:"https://exampleURL.com")
let imagedData = NSData(contentsOf: theURL!)!
let theImage = UIImage(data: imagedData as Data)
cell.theImageView.image = theImage
This runs on the main thread.
And to perform the same work asynchronously on a background thread:
DispatchQueue.global(qos: .background).async {
    let theURL = URL(string: "https://exampleURL.com")
    let imageData = try? Data(contentsOf: theURL!)
    DispatchQueue.main.async {
        if let imageData = imageData {
            cell.theImageView.image = UIImage(data: imageData)
        }
    }
}
Updating Jim Dovey's answer: [data release] is no longer required under the current Apple guidelines, because memory management is handled automatically by ARC (Automatic Reference Counting).
Here is the updated asynchronous call,
dispatch_async(dispatch_get_global_queue(0,0), ^{
NSData * data = [[NSData alloc] initWithContentsOfURL: [NSURL URLWithString: @"your_URL"]];
if ( data == nil )
return;
dispatch_async(dispatch_get_main_queue(), ^{
self.your_UIimage.image = [UIImage imageWithData: data];
});
});