Exception while retrieving data from Address Book - Objective-C

I am getting this exception while retrieving data from the Address Book. I have searched the internet but have not found any help for it.
Overflow allocating bitmap backing store. Cannot back bitmap with 320 bytes per row, -2147483648 height, and 1 planes
I am using the AddressBook framework to retrieve data from the Address Book. Is this a memory issue, or is it caused by fetching the avatar information that I have set on an Address Book contact?
Please help. If you have any suggestions or recommendations, please share them.

Thanks for your reply.
As you suggested, I checked all the code for drawing a large image or view, and found the function below, which I was using to resize images. Resizing will now be done on the server side instead. I still have doubts about this issue; you can see the relevant code in the block below. I am now waiting to hear back from the customer.
Thanks again for your help.
-(NSData *)getCompressedImageDataFromData:(NSData *)imData
{
    NSImage *pImage = [[[NSImage alloc] initWithData:imData] autorelease];
    NSSize orgSize = [pImage size];
    int widthInput = orgSize.width;
    int heightInput = orgSize.height;

    // Nothing to do if the image already fits in 72x72.
    if (widthInput <= 72 && heightInput <= 72)
        return imData;

    double newheight = heightInput;
    NSSize newSize;
    if (widthInput >= 72)
    {
        // Cast to double to avoid integer division truncating the ratio.
        double ratio = (double)widthInput / (double)heightInput;
        newheight = 72 / ratio;
        newSize = NSMakeSize(72, newheight);
    }
    else
        newSize = NSMakeSize(widthInput, newheight); // width already fits; keep original dimensions

    NSImage *outputImage = [[[NSImage alloc] initWithSize:newSize] autorelease];
    if (![outputImage isValid])
        return nil;

    [outputImage lockFocus];
    [[NSGraphicsContext currentContext] setImageInterpolation:NSImageInterpolationHigh];
    [pImage drawInRect:NSMakeRect(0, 0, newSize.width, newSize.height)
              fromRect:NSZeroRect
             operation:NSCompositeCopy
              fraction:1.0];
    [outputImage unlockFocus];

    NSData *imageData = [outputImage TIFFRepresentationUsingCompression:NSTIFFCompressionJPEG factor:0];
    // Autorelease the copy so the caller does not inherit an extra retain (MRC).
    return [[imageData mutableCopy] autorelease];
}

Are you creating one large view or image into which you're drawing multiple contacts from the address book? It sounds like you're trying to create too large an image/view.
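If it helps to narrow this down: the reported height of -2147483648 is INT_MIN, which points to an integer overflow or an uninitialized value in whatever computes the bitmap's dimensions, rather than a plain out-of-memory condition. A defensive check along these lines (a minimal sketch; sizeIsSaneForBitmap is a hypothetical helper, not an AddressBook API) would catch the bad value before lockFocus tries to allocate the backing store:
// Hypothetical guard (not an AddressBook API): reject sizes that cannot
// reasonably back a bitmap before lockFocus tries to allocate one.
static BOOL sizeIsSaneForBitmap(NSSize size)
{
    // Catches the negative/overflowed height reported in the exception.
    if (size.width < 1.0 || size.height < 1.0)
        return NO;
    // A contact avatar should never be anywhere near this large.
    if (size.width > 10000.0 || size.height > 10000.0)
        return NO;
    return YES;
}
Calling something like this on every computed size before [[NSImage alloc] initWithSize:] should at least pinpoint which contact record produces the bad dimensions.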

Related

Generating thumbnails causes leak (MacOS, Obj C)

I'm working on a macOS program in Objective-C that needs to produce in-memory thumbnails to send to a server. The following code is used to perform this operation. As the program runs, a leak of about 40 MB occurs each time this method is called. I suspect I'm missing something really basic, but I don't see the source of the problem.
I should add that I've also tried creating one context to use over the life of the program, and the problem, if anything, seems somewhat worse.
When I run Instruments, the allocations for the category "VM: ImageIO_JPEG_Data" grow by one 40 MB allocation each time the method is called. The responsible library is "ImageIO" and the responsible caller is "ImageIO_Malloc".
- (void) createPhotoThumbnail
{
    NSURL* fileURL = [NSURL fileURLWithPath: _imagePath];
    CGColorSpaceRef colorspace = CGColorSpaceCreateDeviceRGB();
    CGContextRef bitmapContext = CGBitmapContextCreate(NULL, MAX_THUMB_DIM, MAX_THUMB_DIM, 8, 0,
                                                       colorspace, (CGBitmapInfo)kCGImageAlphaNoneSkipLast);
    CIContext *ciContext = [CIContext contextWithCGContext: bitmapContext options: @{}];

    if (fileURL)
    {
        CIImage *image = [[CIImage alloc] initWithContentsOfURL: fileURL];
        if (image)
        {
            // scale the image
            CIFilter *scaleFilter = [CIFilter filterWithName: @"CILanczosScaleTransform"];
            [scaleFilter setValue: image forKey: @"inputImage"];
            NSNumber *scaleFactor = [[NSNumber alloc] initWithFloat: ((float) MAX_THUMB_DIM) /
                                     ((float)MAX(_processedWidth, _processedHeight))];
            [scaleFilter setValue: scaleFactor forKey: @"inputScale"];
            [scaleFilter setValue: @1.0 forKey: @"inputAspectRatio"];
            CIImage *scaledImage = [scaleFilter valueForKey: @"outputImage"];

            NSMutableData* thumbJpegData = [[NSMutableData alloc] init];
            CGImageDestinationRef dest = CGImageDestinationCreateWithData((__bridge CFMutableDataRef)thumbJpegData,
                                                                          (__bridge CFStringRef)@"public.jpeg",
                                                                          1,
                                                                          NULL);
            if (dest)
            {
                CGImageRef img = [ciContext createCGImage:scaledImage
                                                 fromRect:[scaledImage extent]];
                CGImageDestinationAddImage(dest, img, nil);
                if (CGImageDestinationFinalize(dest))
                {
                    // encode it as a string for later
                    _thumbnail = [thumbJpegData base64EncodedStringWithOptions: 0];
                }
                else
                {
                    DDLogError(@"Failed to generate photo thumbnail");
                }
                CGImageRelease(img);
                CFRelease(dest);
            }
            else
            {
                DDLogError(@"Failed to finalize photo thumbnail image");
            }
            thumbJpegData = nil;
        }
    }
    CGContextRelease(bitmapContext);
    CGColorSpaceRelease(colorspace);
    ciContext = nil;
}
UPDATE: I switched the code to use a CGAffineTransform instead of the filter with "CILanczosScaleTransform" and the symptom did not change. Next I tried a completely different method (snippet below), and yet the problem persists.
NSImage *thumbnail = [[NSImage alloc] initWithSize: newSize];
[thumbnail lockFocus];
[sourceImage setSize: newSize];
[[NSGraphicsContext currentContext] setImageInterpolation:NSImageInterpolationHigh];
[sourceImage compositeToPoint: NSZeroPoint operation: NSCompositeCopy];
[thumbnail unlockFocus];
NSData *tiff = [thumbnail TIFFRepresentation];
NSBitmapImageRep *imageRep = [NSBitmapImageRep imageRepWithData: tiff];
NSDictionary *imageProps = [NSDictionary dictionaryWithObject:[NSNumber numberWithFloat:0.9] forKey:NSImageCompressionFactor];
NSData *thumbJpegData = [imageRep representationUsingType:NSJPEGFileType properties:imageProps];
This makes me think the problem is related to something inherent in the way I'm doing this. I find it hard to believe that two different methods of image scaling would exhibit the same leak.
Thanks to this answer I was able to identify the need for an autorelease pool, something I was completely unaware of. The code in the question is one of a series of methods that are called repeatedly from inside a tight loop, which apparently prevents the OS from having a chance to do some cleanup. The block now looks like this:
@autoreleasepool {
    [self findRelevantAdjustments];
    [self adjustForStraightenCrop];
    [self moveFacesRelativeToTopLeftOrigin];
    [self createPhotoThumbnail];
    [self sendPhotoToServer];
}
Moral of the story: even with ARC there are more things to pay attention to when it comes to the memory lifecycle.
The problem is not in the CGImageDestinationRef logic, because it still leaks even if you replace that with something far simpler, such as:
NSBitmapImageRep *rep = [[NSBitmapImageRep alloc] initWithCIImage:scaledImage];
NSData *data = [rep representationUsingType:NSJPEGFileType properties:nil];
Digging a little further, the problem appears to be an issue within CILanczosScaleTransform. If you use an inputScale of @1.0, the leak disappears; but use anything less than @1.0 (even @0.5) and it leaks.
I'd suggest you consider finding a different method for resizing the image.
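For instance (a minimal sketch, not code from this answer; fileURL and maxDim are assumed inputs), ImageIO can produce the scaled-down image directly, sidestepping Core Image entirely:
#import <Foundation/Foundation.h>
#import <ImageIO/ImageIO.h>

// Scale an image file down with ImageIO instead of CILanczosScaleTransform.
static CGImageRef CreateThumbnailImage(NSURL *fileURL, CGFloat maxDim)
{
    CGImageSourceRef source = CGImageSourceCreateWithURL((__bridge CFURLRef)fileURL, NULL);
    if (!source)
        return NULL;

    NSDictionary *options = @{
        (__bridge NSString *)kCGImageSourceCreateThumbnailFromImageAlways : @YES,
        (__bridge NSString *)kCGImageSourceCreateThumbnailWithTransform   : @YES,
        (__bridge NSString *)kCGImageSourceThumbnailMaxPixelSize          : @(maxDim)
    };

    // The caller owns the returned image and must CGImageRelease() it.
    CGImageRef thumb = CGImageSourceCreateThumbnailAtIndex(source, 0,
                                                           (__bridge CFDictionaryRef)options);
    CFRelease(source);
    return thumb;
}
The resulting CGImageRef can then be handed to the existing CGImageDestination code to produce the JPEG data.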

How to take a screenshot with low quality

Is there a way to take a low-quality screenshot on OS X programmatically?
I developed a function like the one below:
CGImageRef resizeImage(CGImageRef imageRef) {
    CGRect thumRect;
    thumRect.origin = CGPointMake(0, 0);
    thumRect.size.height = 225;
    thumRect.size.width = 360;

    // "aplhaInfo" in the original was a typo for the variable declared here.
    CGImageAlphaInfo alphaInfo = CGImageGetAlphaInfo(imageRef);
    if (alphaInfo == kCGImageAlphaNone)
        alphaInfo = kCGImageAlphaNoneSkipLast;

    CGContextRef bitmap = CGBitmapContextCreate(NULL, thumRect.size.width, thumRect.size.height,
                                                CGImageGetBitsPerComponent(imageRef), 4 * thumRect.size.width,
                                                CGImageGetColorSpace(imageRef), alphaInfo);
    CGContextDrawImage(bitmap, thumRect, imageRef);
    imageRef = CGBitmapContextCreateImage(bitmap);
    CGContextRelease(bitmap);
    return imageRef;
}
When I run this function, I get an image between 150 KB and 600 KB. If I decrease the thumRect size, I can't read any of the characters in the image. But I want to make these images as small as possible. Is there any suggestion or another possible solution?
Thanks.
I found a solution, described below:
First of all, resize your image with the code in my question.
Then compress it :)
//imageRef is a CGImageRef
NSImage *image = [[NSImage alloc] initWithCGImage:imageRef size:NSZeroSize];
NSBitmapImageRep *bmpImageRep = [NSBitmapImageRep imageRepWithData:[image TIFFRepresentation]];
CGFloat compressionFactor = 1.0; // see (1); lower values give smaller files at lower quality
NSDictionary *jpgProperties = [NSDictionary dictionaryWithObjectsAndKeys:
                               [NSNumber numberWithDouble:compressionFactor], NSImageCompressionFactor,
                               [NSNumber numberWithBool:NO], NSImageProgressive,
                               nil];
NSData *jpgData = [bmpImageRep representationUsingType:NSJPEGFileType properties:jpgProperties];
(1): https://developer.apple.com/library/mac/documentation/Cocoa/Reference/ApplicationKit/Classes/NSBitmapImageRep_Class/index.html#//apple_ref/doc/constant_group/Bitmap_image_properties

NSImage lockFocus and NSString size on retina display

I'm facing a weird issue. I'm drawing inside an NSImage using the following pseudo-code:
NSString* text = @"Hello world!";
// textColor and font are defined elsewhere
NSDictionary *dict = [[[NSDictionary alloc] initWithObjectsAndKeys:
                      [NSColor colorWithCGColor:textColor], NSForegroundColorAttributeName,
                      font, NSFontAttributeName, nil] autorelease];
NSMutableAttributedString* str = [[[NSMutableAttributedString alloc] initWithString:text attributes:dict] autorelease];
NSSize stringSize = [str size];
NSImage* image = [[[NSImage alloc] initWithSize:stringSize] autorelease];
[image lockFocus];
NSRect drawRect = NSMakeRect(0, 0, stringSize.width, stringSize.height);
[str drawInRect:drawRect];
[image unlockFocus];
Now the problem is that, with a dual-monitor configuration, if I keep my Retina display open, the string is mangled (I get half of the string drawn), while by simply closing my Retina display and using only my Cinema Display, the string is drawn correctly. It's as if the NSImage is getting the default context, and some scaling factor, from the Retina display.
Do you have any hints ?
Thanks !
OK, I will keep this here for future reference, even though there is material about displaying NSImage that covers the same aspect.
It doesn't matter which display is your primary one; it seems that the NSGraphicsContext comes with an affine transformation that multiplies by 2 to address the Retina resolution.
You just need to reset the affine transformations before drawing into the NSImage:
NSAffineTransform *trans = [[[NSAffineTransform alloc] init] autorelease];
[trans set];
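In context (a sketch assuming the same image, str and stringSize as in the question's code), the reset goes between lockFocus and the drawing call:
[image lockFocus];

// Reset the context's transform so the Retina 2x scale is not applied twice.
NSAffineTransform *trans = [[[NSAffineTransform alloc] init] autorelease];
[trans set];

[str drawInRect:NSMakeRect(0, 0, stringSize.width, stringSize.height)];
[image unlockFocus];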

How do I set the pixels per inch for an exported JPEG image in a Cocoa app?

I have a Cocoa Mac image editing app which lets users export JPEG images. I'm currently using the following code to export these images as JPEG files:
//this is user specified
NSInteger resolution;
NSImage* savedImage = [[NSImage alloc] initWithSize:NSMakeSize(600, 600)];
[savedImage lockFocus];
//draw here
[savedImage unlockFocus];
NSBitmapImageRep* savedImageBitmapRep = [NSBitmapImageRep imageRepWithData:[savedImage TIFFRepresentationUsingCompression:NSTIFFCompressionNone factor:1.0]];
NSDictionary* properties = [NSDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithFloat:1.0], NSImageCompressionFactor, nil];
//holds the jpeg file
NSData * imageData = nil;
imageData = [savedImageBitmapRep representationUsingType:NSJPEGFileType properties:properties];
However, I would like for the user to be able to provide the pixels per inch for this JPEG image (like you can in Photoshop's export options). What would I need to modify in the above code to adjust this value for the exported JPEG?
I couldn't find a way to do it with the NSImage APIs, but CGImage can, by setting kCGImagePropertyDPIHeight/Width.
I also set kCGImageDestinationLossyCompressionQuality which I think is the same as NSImageCompressionFactor.
//this is user specified
NSInteger resolution = 100;

NSImage* savedImage = [[NSImage alloc] initWithSize:NSMakeSize(600, 600)];
[savedImage lockFocus];
//draw here
[savedImage unlockFocus];

NSBitmapImageRep* savedImageBitmapRep = [NSBitmapImageRep imageRepWithData:[savedImage TIFFRepresentationUsingCompression:NSTIFFCompressionNone factor:1.0]];

NSDictionary* properties = [NSDictionary dictionaryWithObjectsAndKeys:
                            [NSNumber numberWithFloat:1.0], kCGImageDestinationLossyCompressionQuality,
                            [NSNumber numberWithInteger:resolution], kCGImagePropertyDPIHeight,
                            [NSNumber numberWithInteger:resolution], kCGImagePropertyDPIWidth,
                            nil];

NSMutableData* imageData = [NSMutableData data];
CGImageDestinationRef imageDest = CGImageDestinationCreateWithData((CFMutableDataRef) imageData, kUTTypeJPEG, 1, NULL);
CGImageDestinationAddImage(imageDest, [savedImageBitmapRep CGImage], (CFDictionaryRef) properties);
CGImageDestinationFinalize(imageDest);
CFRelease(imageDest); // release the destination now that the data is written

// Do something with imageData
if (![imageData writeToFile:[@"~/Desktop/test.jpg" stringByExpandingTildeInPath] atomically:NO])
    NSLog(@"Failed to write imageData");
For NSImage or NSImageRep you do not set the resolution directly; you set the size instead.
For size, numberOfPixels and resolution the following equation holds:
size = numberOfPixels * 72.0 / resolution
size is a length, expressed in dots with the unit inch/72 (size and resolution are floats). You can see that for an image with dpi = 72, size and numberOfPixels are numerically the same (but their meanings are very different). For example, a 600-pixel-wide image saved at 300 dpi gets a width of 600 * 72 / 300 = 144.
After creating an NSBitmapImageRep the size with the desired resolution can be set:
NSBitmapImageRep* savedImageBitmapRep = . . . ; // create the new rep
NSSize newSize;
newSize.width = [savedImageBitmapRep pixelsWide] * 72.0 / resolution; // x-resolution
newSize.height = [savedImageBitmapRep pixelsHigh] * 72.0 / resolution; // y-resolution
[savedImageBitmapRep setSize:newSize];
// save the rep
Two remarks: do you really need the lockFocus / unlockFocus approach? The preferred way to build a new NSBitmapImageRep is to use NSGraphicsContext. See: http://www.mail-archive.com/cocoa-dev@lists.apple.com/msg74857.html
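That route might look like this (a minimal sketch; the 600x600 pixel dimensions are placeholders):
// Build the rep first, then draw into a graphics context backed by it.
NSBitmapImageRep *rep = [[NSBitmapImageRep alloc]
        initWithBitmapDataPlanes:NULL
                      pixelsWide:600
                      pixelsHigh:600
                   bitsPerSample:8
                 samplesPerPixel:4
                        hasAlpha:YES
                        isPlanar:NO
                  colorSpaceName:NSCalibratedRGBColorSpace
                     bytesPerRow:0
                    bitsPerPixel:0];

[NSGraphicsContext saveGraphicsState];
[NSGraphicsContext setCurrentContext:
        [NSGraphicsContext graphicsContextWithBitmapImageRep:rep]];
// draw here, exactly as you would between lockFocus and unlockFocus
[NSGraphicsContext restoreGraphicsState];
// rep now contains the rendered pixels and can be saved directly.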
And: using TIFFRepresentation for an NSBitmapImageRep is very time and space consuming. Since 10.6 another way exists that costs almost nothing, because lockFocus and unlockFocus create an object of class NSCGImageSnapshotRep, which under the hood is a CGImage. (In OS versions before 10.6 it was an NSCachedImageRep.) The following does it:
[anImage lockFocus];
// draw something
[anImage unlockFocus];
// now anImage contains an NSCGImageSnapshotRep
CGImageRef cg = [anImage CGImageForProposedRect:NULL context:nil hints:nil];
NSBitmapImageRep *newRep = [[NSBitmapImageRep alloc] initWithCGImage:cg];
// set the resolution
// here you may NSLog anImage, cg and newRep
// save the newRep
// release the newRep if needed

How to draw to a file in objective-c

How can I draw to an image in Objective-C? All I need to do is create an image with a size I set, draw a few anti-aliased lines, and save the image to a PNG file. I tried to find this in the Apple docs, but there are CGImage, NSImage, CIImage and more. Which one is easiest for my goal? I only need to support the latest Mac OS X version, so new APIs are not a problem.
Probably the easiest way is to use an NSImage and draw directly into it after calling lockFocus.
Example:
NSSize imageSize = NSMakeSize(512, 512);
NSImage *image = [[[NSImage alloc] initWithSize:imageSize] autorelease];
[image lockFocus];
//draw a line:
[NSBezierPath strokeLineFromPoint:NSMakePoint(100, 100) toPoint:NSMakePoint(200, 200)];
//...
NSBitmapImageRep *imageRep = [[[NSBitmapImageRep alloc] initWithFocusedViewRect:NSMakeRect(0, 0, imageSize.width, imageSize.height)] autorelease];
NSData *pngData = [imageRep representationUsingType:NSPNGFileType properties:nil];
[image unlockFocus];
[pngData writeToFile:@"/path/to/your/file.png" atomically:YES];
Well, your question is actually two questions in one.
The first question is about how to draw an image. You should first read the docs about drawing images; Apple has a Cocoa Drawing Guide on this topic. Start from there to draw images.
Then you need to save the image to disk. Here is a nice piece of code from over here:
NSBitmapImageRep *bits = ...; // get a rep from your image, or grab from a view
NSData *data = [bits representationUsingType:NSPNGFileType properties:nil];
[data writeToFile:@"/path/to/wherever/test.png" atomically:NO];