glReadPixels -> NSData -> save image - objective-c

I create an NSBitmapImageRep which I fill with glReadPixels data. It looks like this:
NSBitmapImageRep *imageRep = [[NSBitmapImageRep alloc] initWithBitmapDataPlanes:NULL pixelsWide:width pixelsHigh:height bitsPerSample:8 samplesPerPixel:3 hasAlpha:NO isPlanar:NO colorSpaceName:NSDeviceRGBColorSpace bytesPerRow:3 * width bitsPerPixel:0];
glReadPixels(0, 0, width, height, GL_RGB, GL_UNSIGNED_BYTE, [imageRep bitmapData]);
and later turn it into NSData and write it to a file. But the resulting image is flipped upside down. How can I fix it?

Draw into a flipped NSImage; this renders the upside-down pixels in the correct orientation. Then take the TIFF representation of the NSImage, which you can convert to other image types by creating another NSBitmapImageRep with imageRepWithData:. Use that image rep to make a new NSData object with representationUsingType:properties: and write it to a file.
NSRect rect = self.bounds;
NSSize size = rect.size;
GLsizei width = (GLsizei)size.width;
GLsizei height = (GLsizei)size.height;
NSBitmapImageRep* imgRep = [[NSBitmapImageRep alloc] initWithBitmapDataPlanes:NULL pixelsWide:width pixelsHigh:height bitsPerSample:8 samplesPerPixel:3 hasAlpha:NO isPlanar:NO colorSpaceName:NSDeviceRGBColorSpace bytesPerRow:width*3 bitsPerPixel:0];
glReadPixels(0, 0, width, height, GL_RGB, GL_UNSIGNED_BYTE, [imgRep bitmapData]);
#ifdef __MAC_10_8
NSImage* image = [NSImage imageWithSize:size flipped:YES drawingHandler:^(NSRect dstRect){
return [imgRep drawInRect:dstRect];
}];
#else
NSImage* image = [[NSImage alloc] initWithSize:size];
[image lockFocusFlipped:YES];
[imgRep drawInRect:rect];
[image unlockFocus];
#endif
NSData* tiff = [image TIFFRepresentation];
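To finish the save described above, convert the TIFF data to the type you want via another NSBitmapImageRep and write it out; a minimal sketch (the output path is just a placeholder):
NSBitmapImageRep* outRep = [NSBitmapImageRep imageRepWithData:tiff];
NSData* pngData = [outRep representationUsingType:NSPNGFileType properties:nil];
[pngData writeToFile:@"/tmp/snapshot.png" atomically:YES]; // placeholder path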

bytesPerRow should be (width*3 + 3) & ~3, because glReadPixels pads each row to a 4-byte boundary by default (GL_PACK_ALIGNMENT is 4); a tightly packed width*3 only works when it is already a multiple of 4.
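Alternatively, you can ask OpenGL to pack rows tightly so that the unpadded width*3 bytesPerRow above is valid; a small sketch of that route:
glPixelStorei(GL_PACK_ALIGNMENT, 1); // disable the default 4-byte row padding
glReadPixels(0, 0, width, height, GL_RGB, GL_UNSIGNED_BYTE, [imgRep bitmapData]);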

Related

NSImage drawAtPoint lost quality

I have a monochrome picture. When I draw this picture into another picture, it becomes a gray picture.
Here is my code:
Saving the monochrome picture:
NSImage *copyImage = [image copy]; // Copy from a monochrome picture
NSBitmapImageRep *copyrep = [[NSBitmapImageRep alloc] initWithData:[copyImage TIFFRepresentation]];
NSSize copySize;
copySize.width = 72.0 * [copyrep pixelsWide] / PRINT_DPI; //Set Printing DPI
copySize.height = 72.0 * [copyrep pixelsHigh] / PRINT_DPI; //Set Printing DPI
[copyrep setSize:copySize];
copyImage= [[NSImage alloc] initWithData:[copyrep TIFFRepresentation]];
NSData *imgData = [copyImage TIFFRepresentation];
[imgData writeToFile: @"/Users/bbmac/Desktop/1.png" atomically: NO]; // save for testing
But when I draw it into a big picture, it becomes a gray picture:
// for printing
NSImage *printImage = [[NSImage alloc] initWithSize:imageSize];
[printImage lockFocus];
// Draw a monochrome picture
[image drawAtPoint:NSMakePoint(newX, newY) fromRect:NSZeroRect operation:NSCompositeCopy fraction:1.0f];
[printImage unlockFocus];
// Save for testing
[[printImage TIFFRepresentation] writeToFile: @"/Users/bbmac/Desktop/printImage.png" atomically: NO];

Saving an image in an NSView

I have a problem with rotating and saving a JPEG NSImage. I have an NSView which is flipped:
- (BOOL)isFlipped
{
return YES;
}
Then I apply the NSImage rotation with the following function:
- (NSImage*)imageRotatedByDegrees:(CGFloat)degrees
{
// calculate the bounds for the rotated image
NSRect imageBounds = {NSZeroPoint, [image size]};
NSBezierPath* boundsPath = [NSBezierPath bezierPathWithRect:imageBounds];
NSAffineTransform* transform = [NSAffineTransform transform];
[transform rotateByDegrees:degrees];
[boundsPath transformUsingAffineTransform:transform];
NSRect rotatedBounds = {NSZeroPoint, [boundsPath bounds].size};
NSImage* rotatedImage = [[NSImage alloc] initWithSize:rotatedBounds.size];
// center the image within the rotated bounds
imageBounds.origin.x = NSMidX(rotatedBounds) - (NSWidth(imageBounds) / 2);
imageBounds.origin.y = NSMidY(rotatedBounds) - (NSHeight(imageBounds) / 2);
// set up the rotation transform
transform = [NSAffineTransform transform];
[transform translateXBy:+(NSWidth(rotatedBounds) / 2) yBy:+(NSHeight(rotatedBounds) / 2)];
[transform rotateByDegrees:degrees];
[transform translateXBy:-(NSWidth(rotatedBounds) / 2) yBy:-(NSHeight(rotatedBounds) / 2)];
// draw the original image, rotated, into the new image
[rotatedImage lockFocus];
[transform set];
[image drawInRect:imageBounds fromRect:NSZeroRect operation:NSCompositeCopy fraction:1.0];
[rotatedImage unlockFocus];
return rotatedImage;
}
The image is now successfully rotated. Later, when I try to save the JPEG with the following code:
-(void)saveDocument:(id)sender
{
NSData *imageData = [image TIFFRepresentation];
NSBitmapImageRep *imageRep = [NSBitmapImageRep imageRepWithData:imageData];
NSDictionary *imageProps =
[NSDictionary dictionaryWithObject:[NSNumber numberWithFloat:1.0]
forKey:NSImageCompressionFactor];
imageData = [imageRep representationUsingType:NSJPEGFileType
properties:imageProps];
[imageData writeToFile:[_imageURL path] atomically:YES];
}
The resulting JPEG file is incorrectly flipped... What am I doing wrong?
Thanks a lot for any ideas, Petr
-[NSImage lockFocusFlipped:] could help.
Finally, I found the correct solution.
- (BOOL)isFlipped
{
return NO;
}
Then it is important to set up the NSImage (thanks to pointum!):
[_image lockFocusFlipped:YES];
From now on, when I save the image, it is correctly rotated and flipped.
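One way to read that (an assumption on my part, since the answer only shows the single call): wherever you lock focus on an image before drawing into it, use lockFocusFlipped:YES, for example in the rotation method above:
[rotatedImage lockFocusFlipped:YES]; // draw into the image with a flipped coordinate system
[transform set];
[image drawInRect:imageBounds fromRect:NSZeroRect operation:NSCompositeCopy fraction:1.0];
[rotatedImage unlockFocus];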

How to get a base64-encoded PNG from a CGImageRef?

I have a CGImageRef object (the variable quartzImage). How do I convert this object to PNG data for the web, in the form:
"data:image/png;base64," + base64 image data
My code:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(imageBuffer, 0);
void *baseAddress = CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0);
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
size_t width = CVPixelBufferGetWidth(imageBuffer);
size_t height = CVPixelBufferGetHeight(imageBuffer);
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
CGImageRef quartzImage = CGBitmapContextCreateImage(context);
CVPixelBufferUnlockBaseAddress(imageBuffer,0);
CGContextRelease(context);
CGColorSpaceRelease(colorSpace);
NSLog(@"%@", quartzImage);
}
If you already have a CGImageRef (named quartzImage in your code) then you do not need to create an NSImage; create an NSBitmapImageRep directly. And you should in no case use the lockFocus method: it is meant for images that will be drawn to the screen, so it usually creates images at 72 dpi (144 dpi for Retina screens). Do you really want to create images for the web with the properties of your screen? Try this:
NSBitmapImageRep *bitmapRep = [[NSBitmapImageRep alloc] initWithCGImage:quartzImage];
NSData *repData = [bitmapRep representationUsingType:NSPNGFileType properties:nil];
NSString *base64String = [repData base64EncodedStringWithOptions:0];
This base64EncodedStringWithOptions: method is not available before OS X 10.9. On earlier systems you should use base64Encoding instead.
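A sketch of that fallback, plus assembling the data URI the question asks for (the respondsToSelector: check is just one way to pick the available API):
NSString *base64String = nil;
if ([repData respondsToSelector:@selector(base64EncodedStringWithOptions:)]) {
    base64String = [repData base64EncodedStringWithOptions:0]; // OS X 10.9 and later
} else {
    base64String = [repData base64Encoding]; // earlier systems (deprecated in 10.9)
}
NSString *dataURI = [@"data:image/png;base64," stringByAppendingString:base64String];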
NSImage *image = [[NSImage alloc] initWithCGImage:imageRef size:NSZeroSize];
[image lockFocus];
NSBitmapImageRep *bitmapRep = [[NSBitmapImageRep alloc] initWithFocusedViewRect:NSMakeRect(0, 0, image.size.width, image.size.height)];
[image unlockFocus];
NSData *imageData = [bitmapRep representationUsingType:NSPNGFileType properties:nil];
NSString *base64String = [imageData base64EncodedStringWithOptions:0];

Scale Up NSImage and Save

I would like to scale up an image that's 64px to make it 512px (even if it ends up blurry or pixelated).
I'm using this to get the image from my NSImageView and save it:
NSData *customimageData = [[customIcon image] TIFFRepresentation];
NSBitmapImageRep *customimageRep = [NSBitmapImageRep imageRepWithData:customimageData];
customimageData = [customimageRep representationUsingType:NSPNGFileType properties:nil];
NSString* customBundlePath = [[NSBundle mainBundle] pathForResource:@"customIcon" ofType:@"png"];
[customimageData writeToFile:customBundlePath atomically:YES];
I've tried setSize: but it still saves it at 64px.
Thanks in advance!
You can't use the NSImage's size property as it bears only an indirect relationship to the pixel dimensions of an image representation. A good way to resize pixel dimensions is to use the drawInRect method of NSImageRep:
- (BOOL)drawInRect:(NSRect)rect
Draws the entire image in the specified rectangle, scaling it as needed to fit.
Here is an image resize method (it creates a new NSImage at the pixel size you want).
- (NSImage*) resizeImage:(NSImage*)sourceImage size:(NSSize)size
{
NSRect targetFrame = NSMakeRect(0, 0, size.width, size.height);
NSImage* targetImage = nil;
NSImageRep *sourceImageRep =
[sourceImage bestRepresentationForRect:targetFrame
context:nil
hints:nil];
targetImage = [[NSImage alloc] initWithSize:size];
[targetImage lockFocus];
[sourceImageRep drawInRect: targetFrame];
[targetImage unlockFocus];
return targetImage;
}
It's from a more detailed answer I gave here: NSImage doesn't scale
Another resize method that works is NSImage's drawInRect:fromRect:operation:fraction:respectFlipped:hints: method:
- (void)drawInRect:(NSRect)dstSpacePortionRect
fromRect:(NSRect)srcSpacePortionRect
operation:(NSCompositingOperation)op
fraction:(CGFloat)requestedAlpha
respectFlipped:(BOOL)respectContextIsFlipped
hints:(NSDictionary *)hints
The main advantage of this method is the hints NSDictionary, which gives you some control over interpolation. This can yield widely differing results when enlarging an image. The value for the NSImageHintInterpolation key is an NSImageInterpolation enum that can take one of five values:
enum {
NSImageInterpolationDefault = 0,
NSImageInterpolationNone = 1,
NSImageInterpolationLow = 2,
NSImageInterpolationMedium = 4,
NSImageInterpolationHigh = 3
};
typedef NSUInteger NSImageInterpolation;
Using this method, there is no need for the intermediate step of extracting an image rep; NSImage will do the right thing:
- (NSImage*) resizeImage:(NSImage*)sourceImage size:(NSSize)size
{
NSRect targetFrame = NSMakeRect(0, 0, size.width, size.height);
NSImage* targetImage = [[NSImage alloc] initWithSize:size];
[targetImage lockFocus];
[sourceImage drawInRect:targetFrame
fromRect:NSZeroRect //portion of source image to draw
operation:NSCompositeCopy //compositing operation
fraction:1.0 //alpha (transparency) value
respectFlipped:YES //coordinate system
hints:@{NSImageHintInterpolation: [NSNumber numberWithInt:NSImageInterpolationLow]}];
[targetImage unlockFocus];
return targetImage;
}
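For the question's 64px icon, a usage sketch that scales to 512px and saves a PNG (the output path is a placeholder):
NSImage *bigIcon = [self resizeImage:[customIcon image] size:NSMakeSize(512, 512)];
NSData *tiffData = [bigIcon TIFFRepresentation];
NSBitmapImageRep *pngRep = [NSBitmapImageRep imageRepWithData:tiffData];
NSData *pngData = [pngRep representationUsingType:NSPNGFileType properties:nil];
[pngData writeToFile:@"/tmp/customIcon-512.png" atomically:YES]; // placeholder path
Note that on a Retina display the lockFocus-based resize may give you a 2x backing store, so check the resulting rep's pixelsWide if you need exactly 512 pixels.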

Resize and Save NSImage?

I have an NSImageView which I get an image for from an NSOpenPanel. That works great.
Now, how can I take that NSImage, halve its size, and save it in the same format and in the same directory as the original?
If you can help at all with anything I'd appreciate it, thanks.
Check the ImageCrop sample project from Matt Gemmell:
http://mattgemmell.com/source/
It's a nice example of how to resize / crop images.
Finally, you can use something like this to save the result (quick and dirty sample):
// Write to TIF
[[resultImg TIFFRepresentation] writeToFile:@"/Users/Anne/Desktop/Result.tif" atomically:YES];
// Write to JPG
NSData *imageData = [resultImg TIFFRepresentation];
NSBitmapImageRep *imageRep = [NSBitmapImageRep imageRepWithData:imageData];
NSDictionary *imageProps = [NSDictionary dictionaryWithObject:[NSNumber numberWithFloat:0.9] forKey:NSImageCompressionFactor];
imageData = [imageRep representationUsingType:NSJPEGFileType properties:imageProps];
[imageData writeToFile:@"/Users/Anne/Desktop/Result.jpg" atomically:NO];
Since NSImage objects are immutable you will have to:
Create a Core Graphics context the size of the new image.
Draw the NSImage into the CGContext. It should automatically scale it for you.
Create an NSImage from that context
Write out the new NSImage
Don't forget to release any temporary objects you allocated.
There are definitely other options, but this is the first one that came to mind.
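A rough sketch of those steps, assuming ARC and a half-size target; original stands in for the NSImage you got from the NSOpenPanel:
NSSize newSize = NSMakeSize(original.size.width / 2.0, original.size.height / 2.0);
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef ctx = CGBitmapContextCreate(NULL, (size_t)newSize.width, (size_t)newSize.height,
    8, 0, colorSpace, kCGImageAlphaPremultipliedLast); // bytesPerRow of 0 lets CG choose
[NSGraphicsContext saveGraphicsState];
[NSGraphicsContext setCurrentContext:
    [NSGraphicsContext graphicsContextWithGraphicsPort:ctx flipped:NO]];
[original drawInRect:NSMakeRect(0, 0, newSize.width, newSize.height)
            fromRect:NSZeroRect operation:NSCompositeCopy fraction:1.0]; // scales to fit
[NSGraphicsContext restoreGraphicsState];
CGImageRef cgImage = CGBitmapContextCreateImage(ctx);
NSImage *halfImage = [[NSImage alloc] initWithCGImage:cgImage size:newSize];
CGImageRelease(cgImage);
CGContextRelease(ctx);
CGColorSpaceRelease(colorSpace);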
+(NSImage*) resize:(NSImage*)aImage scale:(CGFloat)aScale
{
NSImageView* kView = [[NSImageView alloc] initWithFrame:NSMakeRect(0, 0, aImage.size.width * aScale, aImage.size.height* aScale)];
[kView setImageScaling:NSImageScaleProportionallyUpOrDown];
[kView setImage:aImage];
NSRect kRect = kView.frame;
NSBitmapImageRep* kRep = [kView bitmapImageRepForCachingDisplayInRect:kRect];
[kView cacheDisplayInRect:kRect toBitmapImageRep:kRep];
NSData* kData = [kRep representationUsingType:NSJPEGFileType properties:nil];
return [[NSImage alloc] initWithData:kData];
}
Here is a specific implementation
-(NSImage*)resizeImage:(NSImage*)input by:(CGFloat)factor
{
NSSize size = NSZeroSize;
size.width = input.size.width*factor;
size.height = input.size.height*factor;
NSImage *ret = [[NSImage alloc] initWithSize:size];
[ret lockFocus];
NSAffineTransform *transform = [NSAffineTransform transform];
[transform scaleBy:factor];
[transform concat];
[input drawAtPoint:NSZeroPoint fromRect:NSZeroRect operation:NSCompositeCopy fraction:1.0];
[ret unlockFocus];
return [ret autorelease];
}
Keep in mind that this is pixel based; with HiDPI the scaling must be taken into account. It is simple to obtain:
-(CGFloat)pixelScaling
{
NSRect pixelBounds = [self convertRectToBacking:self.bounds];
return pixelBounds.size.width/self.bounds.size.width;
}
Apple has sample code for downscaling and saving images, found here:
http://developer.apple.com/library/mac/#samplecode/Reducer/Introduction/Intro.html
Here is some code that makes more extensive use of Core Graphics than the other answers. It follows the hints in Mark Thalman's answer to this question.
This code downscales an NSImage based on a target image width. It's somewhat nasty, but still useful as an extra sample documenting how to draw an NSImage into a CGContext and how to write the contents of a CGBitmapContext and a CGImage to a file.
You may want to add extra error checking. I didn't need it for my use case.
- (void)generateThumbnailForImage:(NSImage*)image atPath:(NSString*)newFilePath forWidth:(int)width
{
CGSize size = CGSizeMake(width, image.size.height * (float)width / (float)image.size.width);
CGColorSpaceRef rgbColorspace = CGColorSpaceCreateDeviceRGB();
CGBitmapInfo bitmapInfo = kCGImageAlphaPremultipliedLast;
CGContextRef context = CGBitmapContextCreate(NULL, size.width, size.height, 8, size.width * 4, rgbColorspace, bitmapInfo);
NSGraphicsContext * graphicsContext = [NSGraphicsContext graphicsContextWithGraphicsPort:context flipped:NO];
[NSGraphicsContext setCurrentContext:graphicsContext];
[image drawInRect:NSMakeRect(0, 0, size.width, size.height) fromRect:NSMakeRect(0, 0, image.size.width, image.size.height) operation:NSCompositeCopy fraction:1.0];
CGImageRef outImage = CGBitmapContextCreateImage(context);
CFURLRef outURL = (CFURLRef)[NSURL fileURLWithPath:newFilePath];
CGImageDestinationRef outDestination = CGImageDestinationCreateWithURL(outURL, kUTTypeJPEG, 1, NULL);
CGImageDestinationAddImage(outDestination, outImage, NULL);
if(!CGImageDestinationFinalize(outDestination))
{
NSLog(@"Failed to write image to %@", newFilePath);
}
CFRelease(outDestination);
CGImageRelease(outImage);
CGContextRelease(context);
CGColorSpaceRelease(rgbColorspace);
}
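Usage is then a single call; the image, path, and width here are just placeholders:
[self generateThumbnailForImage:sourceImage atPath:@"/tmp/thumb.jpg" forWidth:512];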
To resize an image:
- (NSImage *)scaleImage:(NSImage *)anImage newSize:(NSSize)newSize
{
NSImage *sourceImage = anImage;
if ([sourceImage isValid])
{
if ((anImage.size.width == newSize.width && anImage.size.height == newSize.height) || newSize.width <= 0 || newSize.height <= 0) { // nothing to resize, or invalid target size
return anImage;
}
NSRect oldRect = NSMakeRect(0.0, 0.0, anImage.size.width, anImage.size.height);
NSRect newRect = NSMakeRect(0,0,newSize.width,newSize.height);
NSImage *newImage = [[NSImage alloc] initWithSize:newSize];
[newImage lockFocus];
[sourceImage drawInRect:newRect fromRect:oldRect operation:NSCompositeCopy fraction:1.0];
[newImage unlockFocus];
return newImage;
}
return nil; // reached only if sourceImage was not valid
}