I am converting an Objective-C class that uses calloc() to create a buffer to Swift 3. Here is the pertinent part of the code I'm having trouble with, in particular the rawData assignment and usage.
CGImageRef imageRef = [capturedImage CGImage];
NSUInteger width = CGImageGetWidth(imageRef);
NSUInteger height = CGImageGetHeight(imageRef);
unsigned char *rawData = (unsigned char *)calloc(height * width * 4, sizeof(unsigned char));
NSUInteger bytesPerPixel = 4;
NSUInteger bytesPerRow = bytesPerPixel * width;
NSUInteger bitsPerComponent = 8;
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB(); // created earlier in the full code
CGContextRef context = CGBitmapContextCreate(rawData, width, height,
bitsPerComponent, bytesPerRow, colorSpace,
kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
// x and y are the coordinates of the pixel being read (defined elsewhere in the original code)
NSUInteger byteIndex = (bytesPerRow * y) + x * bytesPerPixel;
CGFloat red = (CGFloat)rawData[byteIndex];
CGFloat green = (CGFloat)rawData[byteIndex + 1];
CGFloat blue = (CGFloat)rawData[byteIndex + 2];
From the CGBitmapContextCreate documentation:
data : UnsafeMutableRawPointer?
A pointer to the destination in memory where the drawing is to be rendered. The size of this memory block should be at least (bytesPerRow*height) bytes.
Pass NULL if you want this function to allocate memory for the bitmap. This frees you from managing your own memory, which reduces memory leak issues.
While you could figure out how to allocate the required memory block and obtain an UnsafeMutableRawPointer to it, unless you have a good reason to allocate your own buffer, just follow the documentation and pass NULL - there's no need to use calloc() at all.
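For example, a minimal Swift 3 sketch of that approach (assuming capturedImage is a UIImage as in the original Objective-C; the function name and the error handling are only illustrative):

import UIKit

func pixelColor(of capturedImage: UIImage, atX x: Int, y: Int) -> (red: CGFloat, green: CGFloat, blue: CGFloat)? {
    guard let imageRef = capturedImage.cgImage else { return nil }
    let width = imageRef.width
    let height = imageRef.height
    let bytesPerPixel = 4
    let bytesPerRow = bytesPerPixel * width
    let bitsPerComponent = 8
    let colorSpace = CGColorSpaceCreateDeviceRGB()
    // Passing nil for data lets Core Graphics allocate (and later free) the buffer itself.
    guard let context = CGContext(data: nil,
                                  width: width,
                                  height: height,
                                  bitsPerComponent: bitsPerComponent,
                                  bytesPerRow: bytesPerRow,
                                  space: colorSpace,
                                  bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue | CGBitmapInfo.byteOrder32Big.rawValue)
    else { return nil }
    context.draw(imageRef, in: CGRect(x: 0, y: 0, width: width, height: height))
    // context.data points at the context's own buffer; no calloc()/free() required.
    guard let rawData = context.data?.assumingMemoryBound(to: UInt8.self) else { return nil }
    let byteIndex = context.bytesPerRow * y + x * bytesPerPixel
    return (CGFloat(rawData[byteIndex]), CGFloat(rawData[byteIndex + 1]), CGFloat(rawData[byteIndex + 2]))
}

The memory behind context.data is owned by the context, so keep the context alive for as long as you read from the buffer.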
HTH
I am trying to scale an image using vImage_Buffer, and the code below works for me. My trouble is that I want to maintain the aspect ratio of the source image, so I might need to add an xOffset or a yOffset. The code below only works for a yOffset. How can I scale the image with an xOffset as well? I cannot do the scaling with CGContext since that affects performance.
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
size_t finalWidth = 1080;
size_t finalHeight = 720;
size_t sourceWidth = CVPixelBufferGetWidth(imageBuffer);
size_t sourceHeight = CVPixelBufferGetHeight(imageBuffer);
CGRect aspectRect = AVMakeRectWithAspectRatioInsideRect(CGSizeMake(sourceWidth, sourceHeight), CGRectMake(0, 0, finalWidth, finalHeight));
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
size_t startY = aspectRect.origin.y;
size_t yOffSet = (finalWidth*startY*4);
CVPixelBufferLockBaseAddress(imageBuffer, 0);
void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
void* destData = malloc(finalHeight * finalWidth * 4);
vImage_Buffer srcBuffer = { (void *)baseAddress, sourceHeight, sourceWidth, bytesPerRow};
vImage_Buffer destBuffer = { (void *)destData+yOffSet, aspectRect.size.height, aspectRect.size.width, aspectRect.size.width * 4};
vImage_Error err = vImageScale_ARGB8888(&srcBuffer, &destBuffer, NULL, 0);
No pun intended, but you should really read the Accelerate.framework documentation.
Replace malloc with calloc ...
void *destData = calloc(finalHeight * finalWidth * 4, 1);
... to zero all the bytes (or zero them some other way).
What does vImage_Buffer.rowBytes documentation say?
The distance, in bytes, between the start of one pixel row and the next in an image, including any unused space between them.
The rowBytes value must be at least the width multiplied by the pixel size, where the pixel size depends on the image format. You can provide a larger value, in which case the extra bytes will extend beyond the end of each row of pixels. You may want to do so either to improve performance, or to describe an image within a larger image without copying the data. The extra bytes aren't considered part of the image represented by the vImage buffer.
When allocating floating-point data for images, keep the data 4-byte aligned by allocating bytes as integer multiples of 4. For best performance, allocate bytes as integer multiples of 16.
Picture the destination buffer laid out row by row, finalWidth pixels per row, with the scaled image (aspectRect) placed inside it.
The top/left corner of aspectRect is offset from the start of the buffer; let's calculate that offset (assuming 4 bytes per pixel):
size_t startY = aspectRect.origin.y;
size_t startX = aspectRect.origin.x;
size_t offset = 4 * (finalWidth * startY + startX);
The distance, in bytes, between the start of one pixel row and the next, including any unused space between them, is finalWidth * 4 (a full destination row, not just aspectRect.size.width * 4).
Let's fix the destBuffer:
vImage_Buffer destBuffer = {
(void *)destData+offset,
aspectRect.size.height,
aspectRect.size.width,
finalWidth * 4
};
My code creates a TIFFRepresentation of an image and I want to re-encode it to something different. That part is not problematic.
My ImgUtils function is:
+ (CGImageRef) processImageData:(NSData*)rep {
NSBitmapImageRep *bitmapRep = [NSBitmapImageRep imageRepWithData:rep];
int width = bitmapRep.size.width;
int height = bitmapRep.size.height;
size_t pixels_size = width * height;
Byte raw_bytes[pixels_size * 3];
//
// processing, creates and stores raw byte stream
//
int bitsPerComponent = 8;
int bytesPerPixel = 3;
int bitsPerPixel = bytesPerPixel * bitsPerComponent;
int bytesPerRow = bytesPerPixel * width;
CGDataProviderRef provider = CGDataProviderCreateWithData(NULL,
raw_bytes,
pixels_size * bytesPerPixel,
NULL);
CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceRGB();
CGBitmapInfo bitmapInfo = kCGBitmapByteOrderDefault;
CGColorRenderingIntent renderingIntent = kCGRenderingIntentDefault;
CGImageRef imageRef = CGImageCreate(width,
height,
bitsPerComponent,
bitsPerPixel,
bytesPerRow,
colorSpaceRef,
bitmapInfo,
provider,
NULL,
NO,
renderingIntent);
[ImgUtils saveToPng:imageRef withSuffix:@"-ok"];
CGColorSpaceRelease(colorSpaceRef);
CGDataProviderRelease(provider);
return imageRef;
}
There is another method, that saves a CGImageRef to filesystem.
+ (BOOL) saveToPng:(CGImageRef)imageRef withSuffix:(NSString*)suffix {
CFURLRef url = (__bridge CFURLRef)[NSURL fileURLWithPath:[NSString stringWithFormat:@"~/Downloads/pic%@.png", suffix]];
CGImageDestinationRef destination = CGImageDestinationCreateWithURL(url, kUTTypePNG, 1, NULL);
CGImageDestinationAddImage(destination, imageRef, nil);
CGImageDestinationFinalize(destination);
CFRelease(destination);
return YES;
}
As you can see, immediately after processing the image, I save it to disk as pic-ok.png.
Here is the code, that calls the processing function:
CGImageRef cgImage = [ImgUtils processImageData:imageRep];
[ImgUtils saveToPng:cgImage withSuffix:@"-bad"];
The problem is that the two images differ. The second one, with the -bad suffix, is corrupted.
See the examples below. It seems like the memory area the CGImageRef points to is released and overwritten immediately after returning from the method.
I also tried return CGImageCreateCopy(imageRef);, but it changed nothing.
What am I missing?
CGDataProviderCreateWithData() does not copy the buffer you provide. Its purpose is to allow creation of a data provider that accesses that buffer directly.
Your buffer is created on the stack. It goes invalid after +processImageData: returns. However, the CGImage still refers to the provider and the provider still refers to the now-invalid buffer.
One solution would be to create the buffer on the heap and provide a callback via the releaseData parameter that frees it. Another would be to create a CFData from the buffer (which copies it) and then create the data provider using CGDataProviderCreateWithCFData(). Probably the best would be to create a CFMutableData of the desired capacity, set its length to match, and use its storage (CFDataGetMutableBytePtr()) as your buffer from the beginning. That's heap-allocated, memory-managed, and doesn't require any copying.
My goal is to find the brightness of every pixel in an image and then append it to an array. I have looked online (since this is way outside my comfort zone, but I figured that's how I'll learn) and found several examples that get the pixel colors in Swift for iOS using UIImage. Unfortunately, the same code can't be used for OS X, since NSImage doesn't seem to convert to CGImage the same way a UIImage does, so that code became next to useless for me. Next I found this site, which offers a piece of example code that finds and logs the brightness of each pixel in an image. Bingo! The only problem is that it is in Objective-C, which I only slightly understand. After a few more failed searches, I began attempting to translate the Objective-C code. This is the original code:
- (void)setupWithImage:(UIImage*)image {
UIImage * fixedImage = [image imageWithFixedOrientation];
self.workingImage = fixedImage;
self.mainImageView.image = fixedImage;
// Commence with processing!
[self logPixelsOfImage:fixedImage];
}
- (void)logPixelsOfImage:(UIImage*)image {
// 1. Get pixels of image
CGImageRef inputCGImage = [image CGImage];
NSUInteger width = CGImageGetWidth(inputCGImage);
NSUInteger height = CGImageGetHeight(inputCGImage);
NSUInteger bytesPerPixel = 4;
NSUInteger bytesPerRow = bytesPerPixel * width;
NSUInteger bitsPerComponent = 8;
UInt32 * pixels;
pixels = (UInt32 *) calloc(height * width, sizeof(UInt32));
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(pixels, width, height,
bitsPerComponent, bytesPerRow, colorSpace,
kCGImageAlphaPremultipliedLast|kCGBitmapByteOrder32Big);
CGContextDrawImage(context, CGRectMake(0, 0, width, height), inputCGImage);
CGColorSpaceRelease(colorSpace);
CGContextRelease(context);
#define Mask8(x) ( (x) & 0xFF )
#define R(x) ( Mask8(x) )
#define G(x) ( Mask8(x >> 8 ) )
#define B(x) ( Mask8(x >> 16) )
// 2. Iterate and log!
NSLog(@"Brightness of image:");
UInt32 * currentPixel = pixels;
for (NSUInteger j = 0; j < height; j++) {
for (NSUInteger i = 0; i < width; i++) {
UInt32 color = *currentPixel;
printf("%3.0f ", (R(color)+G(color)+B(color))/3.0);
currentPixel++;
}
printf("\n");
}
free(pixels);
#undef R
#undef G
#undef B
}
And this is my translated code:
func logPixelsOfImage(image: NSImage) {
var inputCGImage: CGImageRef = image
var width: UInt = CGImageGetWidth(inputCGImage)
var height: UInt = CGImageGetHeight(inputCGImage)
var bytesPerPixel: UInt = 4
var bytesPerRow: UInt = bytesPerPixel * width
var bitsPerComponent: UInt = 8
var pixels: UInt32!
//pixels = (UInt32 *) calloc(height * width, sizeof(UInt32));
var colorSpace: CGColorSpaceRef = CGColorSpaceCreateDeviceRGB()
var context: CGContextRef = CGBitmapContextCreate(pixels, width, height, bitsPerComponent, bytesPerRow, colorSpace, kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big) //Error: Use of unresolved identifier 'kCGImageAlphaPremultipliedLast' Second Error: Use of unresolved identifier 'kCGBitmapByteOrder32Big'
CGContextDrawImage(context, CGRectMake(0, 0, width, height), inputCGImage) // Error: 'Uint' is not convertible to 'CGFloat'
/*
#define Mask8(x) ( (x) & 0xFF )
#define R(x) ( Mask8(x) )
#define G(x) ( Mask8(x >> 8 ) )
#define B(x) ( Mask8(x >> 16) )
*/
println("Brightness of image:")
var currentPixel: UInt32 = pixels
for (var j : UInt = 0; j < height; j++) {
for ( var i : UInt = 0; i < width; i++) {
var color: UInt32 = currentPixel
//printf("%3.0f ", (R(color)+G(color)+B(color))/3.0);
currentPixel++
}
println("/n")
}
//free(pixels)
/*
#undef R
#undef G
#undeg B
*/
}
The comments contain (a) errors that the compiler reports or (b) code from the original that I don't know how to translate.
So what's my question?
Is there a better way of finding the brightness of a pixel than what is shown in the Objective-C code? If so, how would you do that? If not, how would I go about finishing up the translation of the original code?
Thanks -- A CodeIt that is extremely confused and ever so slightly desperate.
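For reference, here is a rough sketch of how that translation might look in later Swift (macOS, untested; the function name, the NSImage-to-CGImage conversion step, and the plain bit shifts standing in for the Mask8/R/G/B macros are illustrative assumptions):

import AppKit

func logPixels(of image: NSImage) {
    // NSImage -> CGImage on macOS
    var proposedRect = CGRect(x: 0, y: 0, width: image.size.width, height: image.size.height)
    guard let inputCGImage = image.cgImage(forProposedRect: &proposedRect, context: nil, hints: nil) else {
        print("Could not get a CGImage from the NSImage")
        return
    }

    let width = inputCGImage.width
    let height = inputCGImage.height
    let bytesPerPixel = 4
    let bytesPerRow = bytesPerPixel * width
    let bitsPerComponent = 8

    // Swift equivalent of calloc(height * width, sizeof(UInt32)): a zero-filled array of UInt32.
    var pixels = [UInt32](repeating: 0, count: width * height)

    let colorSpace = CGColorSpaceCreateDeviceRGB()
    let bitmapInfo = CGImageAlphaInfo.premultipliedLast.rawValue | CGBitmapInfo.byteOrder32Big.rawValue

    pixels.withUnsafeMutableBytes { buffer in
        guard let context = CGContext(data: buffer.baseAddress,
                                      width: width,
                                      height: height,
                                      bitsPerComponent: bitsPerComponent,
                                      bytesPerRow: bytesPerRow,
                                      space: colorSpace,
                                      bitmapInfo: bitmapInfo) else { return }
        context.draw(inputCGImage, in: CGRect(x: 0, y: 0, width: width, height: height))
    }

    // The Mask8/R/G/B macros become plain shifts and masks.
    print("Brightness of image:")
    for row in 0..<height {
        for column in 0..<width {
            let color = pixels[row * width + column]
            let red   = Double(color & 0xFF)
            let green = Double((color >> 8) & 0xFF)
            let blue  = Double((color >> 16) & 0xFF)
            print(String(format: "%3.0f ", (red + green + blue) / 3.0), terminator: "")
        }
        print("")
    }
}

As for a "better" way to measure brightness: the (R+G+B)/3 average is simply what the Objective-C code does; a weighted sum such as 0.299*R + 0.587*G + 0.114*B is another common choice.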
I'm confused about one strange thing. I have an unsigned char array that I allocate using calloc() and write some byte data into. But when I free this buffer and allocate it again, I see that it gets the same address in memory that was allocated the previous time. I understand why. What I cannot understand is why the data I try to write there the second time is not written; the buffer still contains the data that was written the first time. Can anybody explain this?
unsigned char *rawData = (unsigned char*) calloc(height * width * 4, sizeof(unsigned char));
This is how I allocate it.
Actually, my problem is that because of this allocation, which happens once every 2 seconds, I have a memory leak. But when I try to free the allocated memory, the thing described above happens. :(
Please, if anybody can help me, I would be so glad.
Here is the code...
- (unsigned char*) createBitmapContext:(UIImage*)anImage{
CGImageRef imageRef = [anImage CGImage];
NSUInteger width = CGImageGetWidth(imageRef);
NSUInteger height = CGImageGetHeight(imageRef);
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
unsigned char *rawData = (unsigned char*) calloc(height * width * 4, sizeof(unsigned char));
bytesPerPixel = 4;
bytesPerRow = bytesPerPixel * width;
NSUInteger bitsPerComponent = 8;
CGContextRef context = CGBitmapContextCreate(rawData, width, height,
bitsPerComponent, bytesPerRow, colorSpace,
kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
CGColorSpaceRelease(colorSpace);
CGContextDrawImage(context, CGRectMake(0, 0, width, height), imageRef);
CGContextRelease(context);
imageRef=nil;
return rawData;
}
In this code there is no part where I free(rawData); because I cannot free it inside this method, I tried to define rawData globally and free it after calling this method, but nothing changed.
Ok, so this method is rendering a UIImage into a freshly allocated byte buffer and returning the buffer to the caller. Since you're allocating it with calloc, it will be initialised to 0, then overwritten with the image contents.
when I free this unsigned char and allocate it again, I see that it reserves the same address in memory which was allocated previous time
Yes, there are no guarantees about the location of the buffer in memory. Assuming you call free() on the returned memory, requesting the exact same size is quite likely to give you the same buffer back. But - how are you verifying the contents are not written over a second time? What is in the buffer?
my problem is that because of this allocation, which happens once every 2 seconds, I have a memory leak. But when I try to free the allocated memory, the thing described above happens.
If there is a leak, it is likely in the code that calls this method, since there is no obvious leakage here. The semantics are obviously such that the caller is responsible for freeing the buffer. So how is that done?
Also, are you verifying that the CGBitmapContext is being correctly created? It is possible that some creation flags or parameters may result in an error. So add a check for context being valid (at least not nil). That could explain why the content is not being overwritten.
One easy way to ensure your memory is being freshly updated is to write your own data to it. You could fill the buffer with a counter, and verify this outside the method. For example, just before you return rawData:
static unsigned updateCounter = 0;
memset(rawData, updateCounter++ & 0xff, width*height*4);
This will cycle through writing 0-255 into the buffer, which you can easily verify.
Another thing - what are you trying to achieve with this code? There might be an easier way to achieve what you're trying to achieve. Returning bare buffers devoid of metadata is not necessarily the best way to manage your images.
So guys, I solved this issue. First, I changed the createBitmapContext method to this:
- (void) createBitmapContext:(UIImage*)anImage andRawData:(unsigned char *)theRawData{
CGImageRef imageRef = [anImage CGImage];
NSUInteger width = CGImageGetWidth(imageRef);
NSUInteger height = CGImageGetHeight(imageRef);
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
// unsigned char *rawData = (unsigned char*) calloc(height * width * 4, sizeof(unsigned char));
bytesPerPixel = 4;
bytesPerRow = bytesPerPixel * width;
NSUInteger bitsPerComponent = 8;
CGContextRef context = CGBitmapContextCreate(theRawData, width, height,
bitsPerComponent, bytesPerRow, colorSpace,
kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
CGColorSpaceRelease(colorSpace);
CGContextDrawImage(context, CGRectMake(0, 0, width, height), imageRef);
CGContextRelease(context);
imageRef=nil;
// return theRawData;
}
Then, besides this, I had missed the part where I assigned newRawData to oldRawData, which left me with two pointers to the same memory address. That is where the issue came from. I changed that assignment to memcpy(rawDataForOldImage, rawDataForNewImage, newCapturedImage.size.width * newCapturedImage.size.height * 4); and the problem is solved. Thanks to all.
In my app I create an unsigned char pointer using this function:
- (unsigned char*)getRawData
{
// First get the image into your data buffer
CGImageRef image = [self CGImage];
NSUInteger width = CGImageGetWidth(image);
NSUInteger height = CGImageGetHeight(image);
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
unsigned char *rawData = malloc(height * width * 4);
NSUInteger bytesPerPixel = 4;
NSUInteger bytesPerRow = bytesPerPixel * width;
NSUInteger bitsPerComponent = 8;
CGContextRef context = CGBitmapContextCreate(rawData, width, height, bitsPerComponent, bytesPerRow, colorSpace, kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
CGColorSpaceRelease(colorSpace);
CGContextSetBlendMode(context, kCGBlendModeCopy);
CGContextDrawImage(context, CGRectMake(0.0f, 0.0f, (CGFloat)width, (CGFloat)height), image);
CGContextRelease(context);
// Now your rawData contains the image data in the RGBA8888 pixel format.
return rawData;
}
And in another class I assign a property to that pointer like so: self.bitmapData = [image getRawData];
Where in this process can I free that malloc'd memory? When I try to free the property in dealloc, it gives me an EXC_BAD_ACCESS error. I feel like I'm missing a fundamental C or Objective-C concept here. All help is appreciated.
There is a good discussion about the safety of using malloc/free in Objective-C here.
As long as you correctly free() the memory that you malloc(), there should be no issue.
I personally think that using NSMutableData or NSMutableArray is just easier. If you don't need ultimate performance, I would not use the C malloc/free statements directly.
One way around this sort of issue is to use NSMutableData, so you can replace
unsigned char *rawData = malloc(height * width * 4);
with
myData = [[NSMutableData alloc] initWithCapacity:height * width * 4];
unsigned char *rawData = myData.mutableBytes;
You can then release myData in your dealloc method.
Alternatively, you can do
myData = [NSMutableData dataWithCapacity:height * width * 4];
This means your myData is kept around for the duration of the event loop. You can of course even change the return type of the getRawData method to return NSMutableData or NSData, so that the data can be retained by other parts of your code. The only time I return raw bytes in my code is when I know they will be available for the life of the object that returns them; that way, if I need to hold onto the data, I can retain the owner class.
Apple will often use the
myData = [[NSMutableData alloc] initWithCapacity:height * width * 4];
unsigned char *rawData = myData.mutableBytes;
pattern and then document that if you need the bytes beyond the current autorelease pool cycle, you have to copy them.