Can't get resized image to save at new size - objective-c

I have the following code:
if (![url getResourceValue:&isDirectory forKey:NSURLIsDirectoryKey error:&error]) {
    // handle error
}
else if (![isDirectory boolValue]) {
    // No error and it's not a directory; do something with the file
    NSString *outP = [url absoluteString];
    NSString *extension = [outP substringFromIndex:outP.length - 3];
    NSString *img = @"png";
    if ([extension isEqualToString:img]) {
        NSImage *original = [[NSImage alloc] initWithContentsOfURL:url];
        NSSize sizes = NSMakeSize(240, 240);
        NSImage *small = original;
        [small setSize:sizes];
        NSArray *representations = [small representations];
        NSInteger total = [substrings count]; // unused for now
        NSData *bitmapData = [NSBitmapImageRep representationOfImageRepsInArray:representations usingType:NSPNGFileType properties:nil];
        [bitmapData writeToFile:@"/Users/testuser/downloads/test/test_tn.png" atomically:NO];
    }
}
When I run it, it does not resize the image; it just gives me a straight copy of the original with the name test_tn.png instead of a smaller image. I'm not sure what I'm doing wrong. This is a Mac app, by the way, and there is some unused code in there that is for later. Is the problem how I'm passing the NSImage into the NSData after I resize it?
Edit: OK, so I converted the NSData back into an NSImage, and it appears the NSData is getting the original data, not the resized data.
Edit 2: OK, so this works:
if ([extension isEqualToString:img]) {
    NSImage *image = [[NSImage alloc] initWithContentsOfURL:url];
    NSData *sourcedata = [image TIFFRepresentation];
    NSSize newSize;            // currently unused; the target size is hard-coded to 240x240 below
    newSize.height = 160;
    newSize.width = 120;
    NSImage *sourceImage = [[NSImage alloc] initWithData:sourcedata];
    NSImage *resizedImage = [[NSImage alloc] initWithSize:NSMakeSize(240, 240)];
    NSSize originalSize = [sourceImage size];
    [resizedImage lockFocus];
    [sourceImage drawInRect:NSMakeRect(0, 0, 240, 240) fromRect:NSMakeRect(0, 0, originalSize.width, originalSize.height) operation:NSCompositeSourceOver fraction:1.0];
    [resizedImage unlockFocus];
    NSData *resizedData = [resizedImage TIFFRepresentation];        // unused
    NSData *resizedPreviewData = [resizedImage TIFFRepresentation];
    NSBitmapImageRep *resizedCaptureImageBitmapRep = [[NSBitmapImageRep alloc] initWithData:resizedPreviewData];
    NSData *saveData = [resizedCaptureImageBitmapRep representationUsingType:NSPNGFileType properties:nil];
    [saveData writeToFile:@"/Users/testuser/downloads/test/test_tn.png" atomically:YES];
    count++;
}
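
For reference, here is a slightly shorter variant of the same lockFocus/draw idea that skips the extra TIFF round-trip. This is only a sketch based on the code above (same hard-coded path and 240x240 size), not part of the original post:
NSImage *resized = [[NSImage alloc] initWithSize:NSMakeSize(240, 240)];
[resized lockFocus];
// An empty fromRect means "draw the whole source image"
[sourceImage drawInRect:NSMakeRect(0, 0, 240, 240) fromRect:NSZeroRect operation:NSCompositeSourceOver fraction:1.0];
// Capture the focused drawing directly as a bitmap rep
NSBitmapImageRep *rep = [[NSBitmapImageRep alloc] initWithFocusedViewRect:NSMakeRect(0, 0, 240, 240)];
[resized unlockFocus];
NSData *pngData = [rep representationUsingType:NSPNGFileType properties:nil];
[pngData writeToFile:@"/Users/testuser/downloads/test/test_tn.png" atomically:YES];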

This works for me. I am using this code to layer images on top of each other, but it should be easily adapted to your purposes:
CGSize destinationSize = smallSize;
// First pass: composite the two source images at the original size
UIGraphicsBeginImageContext(original.size);
[Original1 drawInRect:CGRectMake(0, 0, original.size.width, original.size.height)];
[Original2 drawInRect:CGRectMake(0, 0, original.size.width, original.size.height)];
UIImage *returnImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
// Second pass: redraw the composite at the smaller destination size
UIGraphicsBeginImageContext(destinationSize);
[returnImage drawInRect:CGRectMake(0, 0, destinationSize.width, destinationSize.height)];
UIImage *smallImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
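
If the goal is still a PNG file on disk, a minimal follow-up sketch (the path and variable names here are assumptions, not from the original answer) would be:
NSData *pngData = UIImagePNGRepresentation(smallImage);
NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:@"test_tn.png"];
[pngData writeToFile:path atomically:YES];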

Related

Can't save CIImage to file on iOS without memory leaks

The following snippet of code saves a CIImage to disk using a UIImage.
- (void)applicationWillResignActive:(UIApplication *)application
{
    NSString *filename = @"Test.png";
    UIImage *image = [UIImage imageNamed:filename];
    // make some image processing then store the output
    CIImage *processedImage = [CIImage imageWithCGImage:image.CGImage];
#if 1 // save using context
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgiimage = [context createCGImage:processedImage fromRect:processedImage.extent];
    image = [UIImage imageWithCGImage:cgiimage];
    CGImageRelease(cgiimage);
#else
    image = [UIImage imageWithCIImage:processedImage];
#endif
    // save the image
    NSString *filePath = [[[NSBundle mainBundle] bundlePath] stringByAppendingPathComponent:[@"../Documents/" stringByAppendingString:filename]];
    [UIImagePNGRepresentation(image) writeToFile:filePath atomically:YES];
}
However, it leaks the CGImageRef even though it is released by calling CGImageRelease.
If the line with #if 1 is changed to #if 0, the UIImage is created directly from the CIImage and there are no memory leaks, but then the UIImage is not saved to disk.
Wrap the saving inside an autorelease pool:
- (void)applicationWillResignActive:(UIApplication *)application
{
    NSString *filename = @"Test.png";
    UIImage *image = [UIImage imageNamed:filename];
    // make some image processing then store the output
    CIImage *processedImage = [CIImage imageWithCGImage:image.CGImage];
    @autoreleasepool {
        CIContext *context = [CIContext contextWithOptions:nil];
        CGImageRef cgiimage = [context createCGImage:processedImage fromRect:processedImage.extent];
        image = [UIImage imageWithCGImage:cgiimage];
        CGImageRelease(cgiimage);
        // save the image
        NSURL *documentsDir = [[[NSFileManager defaultManager] URLsForDirectory:NSDocumentDirectory inDomains:NSUserDomainMask] firstObject];
        NSURL *fileURL = [documentsDir URLByAppendingPathComponent:filename];
        [UIImagePNGRepresentation(image) writeToURL:fileURL atomically:YES];
    }
}
Also note that I updated how you were retrieving the Documents directory to work for iOS 8 (more info).

Save NSBitmapImageRep as grayscale PNG without alpha

I am trying to save an NSBitmapImageRep without an alpha channel.
My code is:
NSBitmapImageRep *rep = [[NSBitmapImageRep alloc] initWithCIImage:img];
NSColorSpace *targetColorSpace = [NSColorSpace genericGrayColorSpace];
NSBitmapImageRep *targetImageRep = [rep bitmapImageRepByConvertingToColorSpace:targetColorSpace
                                                               renderingIntent:NSColorRenderingIntentPerceptual];
NSData *PNGData = [targetImageRep representationUsingType:NSPNGFileType properties:nil];
[PNGData writeToURL:url atomically:YES];
[rep release];
But it doesn't work: it saves my image with the GrayscaleAlpha pixel type.
How can I specify that the alpha channel should not be included?
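
One possible direction (a sketch under assumptions, not an answer from the original thread): build the target bitmap rep yourself with hasAlpha:NO, draw the existing rep into it, and encode that instead:
// Create a grayscale rep with no alpha channel (assumes `rep` and `url` from the code above)
NSBitmapImageRep *grayRep = [[NSBitmapImageRep alloc]
    initWithBitmapDataPlanes:NULL
                  pixelsWide:rep.pixelsWide
                  pixelsHigh:rep.pixelsHigh
               bitsPerSample:8
             samplesPerPixel:1
                    hasAlpha:NO
                    isPlanar:NO
              colorSpaceName:NSCalibratedWhiteColorSpace
                 bytesPerRow:0
                bitsPerPixel:0];
[NSGraphicsContext saveGraphicsState];
[NSGraphicsContext setCurrentContext:[NSGraphicsContext graphicsContextWithBitmapImageRep:grayRep]];
[rep drawInRect:NSMakeRect(0, 0, rep.pixelsWide, rep.pixelsHigh)];
[NSGraphicsContext restoreGraphicsState];
NSData *grayPNG = [grayRep representationUsingType:NSPNGFileType properties:nil];
[grayPNG writeToURL:url atomically:YES];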

iOS memory management UIImage

If UIImage is an autoreleased object, why, when I run the analyzer, does it complain that on the 2nd line below there is a potential leak stored to image:
NSData *data = [[NSData alloc] initWithContentsOfURL:ImageURL];
UIImage *image = [[UIImage alloc] initWithData:data];
[data release];
// Do we want to round the corners?
image = [self roundCorners:image];
// Is it PNG or JPG/JPEG?
// Running the image representation function writes the data from the image to a file
if ([ImageURLString rangeOfString:@".png" options:NSCaseInsensitiveSearch].location != NSNotFound)
{
    [UIImagePNGRepresentation(image) writeToFile:uniquePath atomically:YES];
}
else if (
    [ImageURLString rangeOfString:@".jpg" options:NSCaseInsensitiveSearch].location != NSNotFound ||
    [ImageURLString rangeOfString:@".jpeg" options:NSCaseInsensitiveSearch].location != NSNotFound
)
{
    [UIImageJPEGRepresentation(image, 100) writeToFile:uniquePath atomically:YES];
}
Why do you say your UIImage is autoreleased? I see only
UIImage *image = [[UIImage alloc] initWithData: data];
Use this instead:
UIImage *image = [[[UIImage alloc] initWithData: data] autorelease];
As an alternative you may use:
UIImage *tmp = [[UIImage alloc] initWithData: data];
UIImage *image = [self roundCorners: tmp];
[tmp release];
(assuming roundCorners returns an autoreleased object).
In your code, on the second line, your UIImage isn't autoreleased. As soon as you use alloc/init methods, you're retaining. Using a convenience method like imageNamed: creates an autoreleased object.

Error when compressing JPEG data using UIImageJPEGRepresentation

I get the following errors when trying to compress some JPEG data using UIImageJPEGRepresentation(image, 0.5f):
<Error>: Unsupported pixel description - 4 components, 8 bits-per-component, 32 bits-per-pixel
<Error>: CGBitmapContextCreateWithDictionary: failed to create delegate.
<Error>: CGImageDestinationAddImage image could not be converted to destination format.
<Error>: CGImageDestinationFinalize image destination does not have enough images
Here is the code snippet where I download and compress the data...
ASIHTTPRequest *request = [[ASIHTTPRequest alloc] initWithURL:[NSURL URLWithString:object.largeimage]];
[request startSynchronous];
NSData *imageData = [request responseData];
UIImage *image = [[UIImage alloc] initWithData:imageData];
NSData *compressedImageData = UIImageJPEGRepresentation(image, 0.5f);
//If the data was compressed properly...
if (compressedImageData) [compressedImageData writeToFile:filePath atomically:YES];
//If it wasn't...
else [imageData writeToFile:filePath atomically:YES];
[image release];
if (request.responseStatusCode == 200)
    object.fullImageFilename = filePath;
[request release];
Thanks in advance for any help :)
Still not sure of the "proper answer", but I managed to find a workaround for anyone who's interested :)
ASIHTTPRequest *request = [[ASIHTTPRequest alloc] initWithURL:[NSURL URLWithString:object.largeimage]];
[request startSynchronous];
NSData *imageData = [request responseData];
UIImage *image = [[UIImage alloc] initWithData:imageData];
NSData *compressedImageData = UIImageJPEGRepresentation(image, 0.5f);
//If the data was compressed properly...
if (compressedImageData) [compressedImageData writeToFile:filePath atomically:NO];
//If it wasn't...
else
{
    // Redraw the image into a plain bitmap context, then try compressing again
    UIGraphicsBeginImageContext(image.size);
    [image drawInRect:CGRectMake(0, 0, image.size.width, image.size.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    compressedImageData = UIImageJPEGRepresentation(newImage, 0.5f);
    [compressedImageData writeToFile:filePath atomically:NO];
}
[image release];
if (request.responseStatusCode == 200)
    object.fullImageFilename = filePath;
[request release];
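
The redraw appears to be what fixes things: drawing into a fresh bitmap context normalizes the pixel format into something UIImageJPEGRepresentation can encode. As a hedged sketch only (the helper name is an assumption, not from the original answer), the workaround could be factored into a small function:
// Redraw an image into a standard bitmap context so that
// UIImageJPEGRepresentation gets a pixel layout it supports.
// Returns an autoreleased UIImage.
static UIImage *NormalizedImage(UIImage *image) {
    UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
    [image drawInRect:CGRectMake(0, 0, image.size.width, image.size.height)];
    UIImage *normalized = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return normalized;
}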

How to compress an image programmatically

I am receiving images from an XML feed. They are very large, so how can I compress them before displaying them in a table cell?
My code is:
int blogEntryIndex1 = [indexPath indexAtPosition:[indexPath length] - 1];
imgstring = [[blogEntries objectAtIndex:blogEntryIndex1] objectForKey:@"image"];
NSURL *url = [NSURL URLWithString:imgstring];
NSData *data = [NSData dataWithContentsOfURL:url];
UIImage *img = [[UIImage alloc] initWithData:data];
cell.imageView.image = [img autorelease];
Please help me if you can.
I've got this utility method that will scale an image:
- (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize {
    UIGraphicsBeginImageContext(newSize);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
Use it like this:
int blogEntryIndex1 = [indexPath indexAtPosition:[indexPath length] - 1];
imgstring = [[blogEntries objectAtIndex:blogEntryIndex1] objectForKey:@"image"];
NSURL *url = [NSURL URLWithString:imgstring];
NSData *data = [NSData dataWithContentsOfURL:url];
UIImage *img = [self imageWithImage:[UIImage imageWithData:data] scaledToSize:CGSizeMake(20, 20)]; // Scale it to 20x20px
cell.imageView.image = img; // already autoreleased, so no extra autorelease needed
NB: I can't for the life of me remember where I got it from, but it's served me well in the past.
Simple to use:
-(UIImage *)fireYourImageForCompression:(UIImage *)imgComing {
    NSData *dataImgBefore = [[NSData alloc] initWithData:UIImageJPEGRepresentation(imgComing, 1.0)]; // 1.0 = full quality, i.e. size BEFORE compression
    int imageSizeBefore = (int)dataImgBefore.length;
    NSLog(@"SIZE OF IMAGE: %i ", imageSizeBefore);
    NSLog(@"SIZE OF IMAGE in Kb: %i ", imageSizeBefore / 1024);
    NSData *dataCompressedImage = UIImageJPEGRepresentation(imgComing, .1); // .1 is low quality
    int sizeCompressedImage = (int)dataCompressedImage.length;
    NSLog(@"SIZE AFTER COMPRESSION OF IMAGE: %i ", sizeCompressedImage);
    NSLog(@"SIZE AFTER COMPRESSION OF IMAGE in Kb: %i ", sizeCompressedImage / 1024); // AFTER
    // now change your image from compressed data
    imgComing = [UIImage imageWithData:dataCompressedImage];
    return imgComing;
}
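
For completeness, a minimal usage sketch (the variable names are assumptions, not from the original answer):
// imgFromFeed is whatever UIImage you built from the downloaded data
UIImage *compressed = [self fireYourImageForCompression:imgFromFeed];
cell.imageView.image = compressed;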