Huge memory usage in ARC - objective-c

I have these functions, called on a thread that draws an NSView:
+(NSFont *)customFontWithName:(NSString *)fontName AndSize:(float)fontSize
{
NSData *data = [[[NSDataAsset alloc]initWithName:fontName] data];
CGDataProviderRef fontProvider = CGDataProviderCreateWithCFData((__bridge CFDataRef)data);
CGFontRef cgFont = CGFontCreateWithDataProvider(fontProvider);
CGDataProviderRelease(fontProvider);
NSDictionary *fontsizeAttr=[NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithFloat:fontSize], NSFontSizeAttribute,
nil];
CTFontDescriptorRef fontDescriptor = CTFontDescriptorCreateWithAttributes((__bridge CFDictionaryRef)fontsizeAttr);
CTFontRef font = CTFontCreateWithGraphicsFont(cgFont, 0, NULL, fontDescriptor);
CFRelease(fontDescriptor);
CGFontRelease(cgFont);
NSFont* retval= (__bridge NSFont*)font;
CFRelease(font);
return retval;
}
and this:
+(NSAttributedString*) createCurrentTextWithString:(NSString *)string AndMaxLength:(float)maxLength AndMaxHeight:(float)maxHeight AndColor:(NSColor *)color AndFontName: (NSString*) fontName
{
float dim=0.1;
NSDictionary *dictionary=[NSDictionary dictionaryWithObjectsAndKeys:[CustomFont customFontWithName:fontName AndSize:dim], NSFontAttributeName,color, NSForegroundColorAttributeName, nil];
NSAttributedString * currentText=[[NSAttributedString alloc] initWithString:string attributes: dictionary];
while([currentText size].width<maxLength&&[currentText size].height<maxHeight)
{
dictionary=[NSDictionary dictionaryWithObjectsAndKeys:[CustomFont customFontWithName:fontName AndSize:dim], NSFontAttributeName,color, NSForegroundColorAttributeName, nil];
currentText=[[NSAttributedString alloc] initWithString:string attributes: dictionary];
dim+=0.1;
}
return currentText;
}
All the objects created in these functions appear to be correctly deallocated and I can't find any memory leaks, but this code causes huge memory usage (many gigabytes) and I can't understand why. Please help.

I found a solution. For some reason that I don't know, this code:
NSData *data = [[[NSDataAsset alloc]initWithName:fontName] data];
CGDataProviderRef fontProvider = CGDataProviderCreateWithCFData((__bridge CFDataRef)data);
CGFontRef cgFont = CGFontCreateWithDataProvider(fontProvider);
allocates a large amount of memory that is never deallocated. So I created a static CGFontRef variable for each custom font I have. This is the only way I found:
static CGFontRef font1;
....
static CGFontRef fontn;
+(CGFontRef) getFontWithValue: (int) value
{
switch (value)
{
case 1:
return font1;
break;
...
case n:
return fontn;
default:
return NULL;
}
}
And
+(NSFont*) customFontWithName:(int)fontName AndSize:(float)fontSize
{
NSDictionary *fontsizeAttr=[NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithFloat:fontSize], NSFontSizeAttribute,
nil];
CTFontDescriptorRef fontDescriptor = CTFontDescriptorCreateWithAttributes((__bridge CFDictionaryRef)fontsizeAttr);
CTFontRef font = CTFontCreateWithGraphicsFont([CustomFont getFontWithValue:fontName], 0, NULL, fontDescriptor);
CFRelease(fontDescriptor);
NSFont* retval= (__bridge NSFont*)font;
CFRelease(font);
return retval;
}
I still don't understand why this memory leak happens, and this is not a real solution, only a workaround, but it works.
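For anyone who wants the same workaround without one static variable per font, here is a minimal sketch of a name-keyed cache, assuming fonts can live for the whole process lifetime; the method name cachedCGFontNamed: is mine, not part of the original code:
+ (CGFontRef)cachedCGFontNamed:(NSString *)fontName
{
    // Hypothetical cache: one CGFontRef per asset name, created once and
    // intentionally never released, mirroring the static-variable trick above.
    static NSMutableDictionary *cache;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        cache = [NSMutableDictionary dictionary];
    });
    @synchronized (cache) {
        NSValue *boxed = cache[fontName];
        if (boxed == nil) {
            NSData *data = [[[NSDataAsset alloc] initWithName:fontName] data];
            CGDataProviderRef provider = CGDataProviderCreateWithCFData((__bridge CFDataRef)data);
            CGFontRef cgFont = CGFontCreateWithDataProvider(provider);
            CGDataProviderRelease(provider);
            boxed = [NSValue valueWithPointer:cgFont];
            cache[fontName] = boxed;
        }
        return (CGFontRef)[boxed pointerValue];
    }
}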

Related

Leveling and matching exposure using CoreImage / GPUImage

I was wondering if it's possible to level exposure across a set of images using either CoreImage or GPUImage, and how I would go about that.
Example:
Say you have 4 images, but the exposure is different on the third one. How could you level the exposure so all 4 images have the same exposure?
One idea I had was measuring and matching the exposure using AVCapture, i.e. if the input image is -2.0, then simply add 2.0 using CoreImage.
Another idea is to implement histogram equalization.
Has anyone ever dealt with the same task before? Any insights?
You can use Core Image (#import <CoreImage/CoreImage.h>) and ImageIO (CGImageSource) and extract the EXIF metadata with these EXIF dictionary keys: kCGImagePropertyExifExposureTime, kCGImagePropertyExifExposureMode, kCGImagePropertyExifExposureProgram, kCGImagePropertyExifExposureBiasValue, kCGImagePropertyExifExposureIndex, kCGImagePropertyExifWhiteBalance
- (NSDictionary*) exifData: (NSString*) path {
NSDictionary* dic = nil;
NSURL* url = [NSURL fileURLWithPath: path];
if ( url )
{
CGImageSourceRef source = CGImageSourceCreateWithURL((__bridge CFURLRef)url, NULL);
if (source != nil)
{
CFDictionaryRef metadataRef =
CGImageSourceCopyPropertiesAtIndex (source, 0, NULL);
if (metadataRef)
{
NSDictionary* immutableMetadata = (__bridge NSDictionary *)metadataRef;
if (immutableMetadata)
{
dic = [NSDictionary dictionaryWithDictionary: immutableMetadata];
}
CFRelease ( metadataRef );
}
CFRelease(source);
source = nil;
}
}
return dic;
}
Usage (note that the EXIF keys live in the Exif sub-dictionary of the image properties):
NSDictionary* dic = [self exifData: path];
if (dic)
{
NSDictionary *exif = [dic objectForKey:(__bridge NSString *)kCGImagePropertyExifDictionary];
NSNumber *exposureTime = [exif objectForKey:(__bridge NSString *)kCGImagePropertyExifExposureTime];
NSLog(@"Image : %@ - ExposureTime : %.2f", path, [exposureTime floatValue]);
}
And then changing the exposure or white balance with:
- (UIImage *) changeImageExposure:(NSString*) imagename exposure:(float) exposure {
CIImage *inputImage = [[CIImage alloc] initWithImage:[UIImage imageNamed: imagename]];
CIFilter *exposureAdjustmentFilter = [CIFilter filterWithName:@"CIExposureAdjust"];
[exposureAdjustmentFilter setValue:inputImage forKey:kCIInputImageKey];
[exposureAdjustmentFilter setValue:[NSNumber numberWithFloat:exposure] forKey:@"inputEV"];
CIImage *outputImage = exposureAdjustmentFilter.outputImage;
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgImage = [context createCGImage:outputImage fromRect:outputImage.extent]; // "create" means we own this and must release it
UIImage *result = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage);
return result;
}
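Tying the two pieces together, leveling a set of images against a reference could then look something like this sketch. It assumes the EXIF exposure bias is the quantity worth matching (an approximation, not a calibrated correction), and exposureBiasForImage: is a hypothetical helper wrapping the exifData: code above:
- (NSArray *)levelImages:(NSArray *)names toReference:(NSString *)referenceName
{
    // exposureBiasForImage: is hypothetical: it would read
    // kCGImagePropertyExifExposureBiasValue out of the EXIF dictionary
    // returned by exifData: above.
    float referenceBias = [self exposureBiasForImage:referenceName];
    NSMutableArray *leveled = [NSMutableArray array];
    for (NSString *name in names) {
        float bias = [self exposureBiasForImage:name];
        // If an image is 2 EV darker than the reference, push it up by 2 EV.
        [leveled addObject:[self changeImageExposure:name exposure:(referenceBias - bias)]];
    }
    return leveled;
}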

OS X objective-C app uses excessive memory

I am writing an OS X application that will create a video using a series of images. It was developed using code from here: Make movie file with picture Array and song file, using AVAsset, but not including the audio portion.
The code runs and creates an MPEG-4 file.
The problem is the memory pressure. It doesn't appear to free up any memory. Using XCode Instruments I found the biggest culprits are:
CVPixelBufferCreate
[image TIFFRepresentation];
CGImageSourceCreateWithData
CGImageSourceCreateImageAtIndex
I tried adding code to release, but ARC should already be doing that.
Eventually OS X will hang and/or crash.
Not sure how to handle the memory issue. There are no mallocs in the code.
I'm open to suggestions. It appears that many others have used this same code.
This is the code that is based on the link above:
- (void)ProcessImagesToVideoFile:(NSError **)error_p size:(NSSize)size videoFilePath:(NSString *)videoFilePath jpegs:(NSMutableArray *)jpegs fileLocation:(NSString *)fileLocation
{
AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:
[NSURL fileURLWithPath:videoFilePath]
fileType:AVFileTypeMPEG4
error:error_p];
NSParameterAssert(videoWriter);
NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
AVVideoCodecH264, AVVideoCodecKey,
[NSNumber numberWithInt:size.width], AVVideoWidthKey,
[NSNumber numberWithInt:size.height], AVVideoHeightKey,
nil];
AVAssetWriterInput* videoWriterInput = [AVAssetWriterInput
assetWriterInputWithMediaType:AVMediaTypeVideo
outputSettings:videoSettings];
AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
assetWriterInputPixelBufferAdaptorWithAssetWriterInput:videoWriterInput
sourcePixelBufferAttributes:nil];
NSParameterAssert(videoWriterInput);
NSParameterAssert([videoWriter canAddInput:videoWriterInput]);
videoWriterInput.expectsMediaDataInRealTime = YES;
[videoWriter addInput:videoWriterInput];
//Start a session:
[videoWriter startWriting];
[videoWriter startSessionAtSourceTime:kCMTimeZero];
CVPixelBufferRef buffer = NULL;
//Write all picture array in movie file.
int frameCount = 0;
for(int i = 0; i<[jpegs count]; i++)
{
NSString *filePath = [NSString stringWithFormat:@"%@%@", fileLocation, [jpegs objectAtIndex:i]];
NSImage *jpegImage = [[NSImage alloc ]initWithContentsOfFile:filePath];
CMTime frameTime = CMTimeMake(frameCount,(int32_t) 24);
BOOL append_ok = NO;
int j = 0;
while (!append_ok && j < 30)
{
if (adaptor.assetWriterInput.readyForMoreMediaData)
{
if ((frameCount % 25) == 0)
{
NSLog(#"appending %d to %# attemp %d\n", frameCount, videoFilePath, j);
}
buffer = [self pixelBufferFromCGImage:jpegImage andSize:size];
append_ok = [adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];
if (append_ok == NO) // fails on 3GS, but works on iPhone 4
{
NSLog(@"failed to append buffer");
NSLog(@"The error is %@", [videoWriter error]);
}
//CVPixelBufferPoolRef bufferPool = adaptor.pixelBufferPool;
//NSParameterAssert(bufferPool != NULL);
if(buffer)
{
CVPixelBufferRelease(buffer);
//CVBufferRelease(buffer);
}
}
else
{
printf("adaptor not ready %d, %d\n", frameCount, j);
[NSThread sleepForTimeInterval:0.1];
}
j++;
}
if (!append_ok)
{
printf("error appending image %d times %d\n", frameCount, j);
}
frameCount++;
//CVBufferRelease(buffer);
jpegImage = nil;
buffer = nil;
}
//Finish writing picture:
[videoWriterInput markAsFinished];
[videoWriter finishWritingWithCompletionHandler:^(){
NSLog(@"finished writing");
}];
}
- (CVPixelBufferRef) pixelBufferFromCGImage: (NSImage *) image andSize:(CGSize) size
{
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
[NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
nil];
CVPixelBufferRef pxbuffer = NULL;
CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault,
size.width,
size.height,
kCVPixelFormatType_32ARGB,
(__bridge CFDictionaryRef) options,
&pxbuffer);
NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);
CVPixelBufferLockBaseAddress(pxbuffer, 0);
void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
NSParameterAssert(pxdata != NULL);
CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(pxdata, size.width, size.height,
8, 4*size.width, rgbColorSpace,
kCGImageAlphaPremultipliedFirst);
NSParameterAssert(context);
CGContextConcatCTM(context, CGAffineTransformMakeRotation(0));
CGImageRef imageRef = [self nsImageToCGImageRef:image];
CGRect imageRect = CGRectMake(0, 0, CGImageGetWidth(imageRef), CGImageGetHeight(imageRef));
CGContextDrawImage(context, imageRect, imageRef);
CGColorSpaceRelease(rgbColorSpace);
CGContextRelease(context);
CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
imageRef = nil;
context = nil;
rgbColorSpace = nil;
return pxbuffer;
}
- (CGImageRef)nsImageToCGImageRef:(NSImage*)image
{
NSData * imageData = [image TIFFRepresentation];// memory hog
CGImageRef imageRef;
if(!imageData) return nil;
CGImageSourceRef imageSource = CGImageSourceCreateWithData((__bridge CFDataRef)imageData, NULL);
imageRef = CGImageSourceCreateImageAtIndex(imageSource, 0, NULL);
imageData = nil;
imageSource = nil;
return imageRef;
}
Your code is using ARC but the libraries you are calling might not be using ARC. They might be relying on the older autorelease pool system to free up memory.
You should read up on how it works; this is fundamental stuff that every Obj-C developer needs to memorise. Basically, any object can be added to the current "pool" of objects, which will be released when the pool is drained.
By default, the pool on the main thread is emptied each time the app enters an idle state. This usually works fine, since the main thread should never be busy for more than a few hundredths of a second, and you can't really build up much memory in that amount of time.
When you do a lengthy, memory-intensive operation you need to set up an autorelease pool manually, most commonly inside a for or while loop (although you can actually put one anywhere you want; that's just the most useful scenario):
for ( ... ) {
@autoreleasepool {
// do some stuff
}
}
Also, ARC is only for Objective-C code. It does not apply to objects created by C functions like CGColorSpaceCreateDeviceRGB() and CVPixelBufferCreate(). Make sure you are manually releasing all of those.
ARC works only for retainable object pointers. The ARC documentation defines them as:
A retainable object pointer (or "retainable pointer") is a value of a retainable object pointer type ("retainable type"). There are three kinds of retainable object pointer types:
block pointers (formed by applying the caret (^) declarator sigil to a function type)
Objective-C object pointers (id, Class, NSFoo*, etc.)
typedefs marked with __attribute__((NSObject))
Other pointer types, such as int* and CFStringRef, are not subject to ARC's semantics and restrictions.
You already explicitly call release here:
CGContextRelease(context);
You should do the same for the other Create-style objects, e.g.
CVPixelBufferRelease(pxbuffer);
for pxbuffer once you are done with it.
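Putting both answers together, the question's frame loop might look like this sketch (names taken from the question's code): each iteration gets its own @autoreleasepool so the TIFFRepresentation and CGImageSource temporaries are freed per frame, and every object from a Create/Copy-style function is explicitly released:
int frameCount = 0;
for (int i = 0; i < [jpegs count]; i++)
{
    @autoreleasepool {
        NSString *filePath = [NSString stringWithFormat:@"%@%@", fileLocation, [jpegs objectAtIndex:i]];
        NSImage *jpegImage = [[NSImage alloc] initWithContentsOfFile:filePath];
        CVPixelBufferRef buffer = [self pixelBufferFromCGImage:jpegImage andSize:size];
        if (buffer)
        {
            [adaptor appendPixelBuffer:buffer withPresentationTime:CMTimeMake(frameCount, 24)];
            CVPixelBufferRelease(buffer); // balances CVPixelBufferCreate
        }
        frameCount++;
    }
}
The same goes for nsImageToCGImageRef: — the CGImageSourceRef it creates needs a CFRelease(imageSource) before returning, and the returned CGImageRef needs a CGImageRelease once it has been drawn.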

Screen recording from a GLKView

I'm trying to record the screen as the user interacts with a GLKView; the video file is created with the correct length, but it shows only a black screen.
I've subclassed GLKView, added a pan gesture recogniser to it, and whenever the user does something I draw points on my view (more complicated than that, but you get the idea).
Here is how I initialise my video
NSError *error = nil;
NSURL *url = [NSURL fileURLWithPath:@"/Users/Dimillian/Documents/DEV/movie.mp4"];
[[NSFileManager defaultManager]removeItemAtURL:url error:nil];
self.assetWriter = [[AVAssetWriter alloc] initWithURL:url fileType:AVFileTypeAppleM4V error:&error];
if (error != nil)
{
NSLog(#"Error: %#", error);
}
NSMutableDictionary * outputSettings = [[NSMutableDictionary alloc] init];
[outputSettings setObject: AVVideoCodecH264 forKey: AVVideoCodecKey];
[outputSettings setObject: [NSNumber numberWithInt: 954] forKey: AVVideoWidthKey];
[outputSettings setObject: [NSNumber numberWithInt: 608] forKey: AVVideoHeightKey];
self.assetWriterVideoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:outputSettings];
self.assetWriterVideoInput.expectsMediaDataInRealTime = YES;
// You need to use BGRA for the video in order to get realtime encoding. I use a color-swizzling shader to line up glReadPixels' normal RGBA output with the movie input's BGRA.
NSDictionary *sourcePixelBufferAttributesDictionary = [NSDictionary dictionaryWithObjectsAndKeys: [NSNumber numberWithInt:kCVPixelFormatType_32BGRA], kCVPixelBufferPixelFormatTypeKey,
[NSNumber numberWithInt:954], kCVPixelBufferWidthKey,
[NSNumber numberWithInt:608], kCVPixelBufferHeightKey,
nil];
self.assetWriterPixelBufferInput = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:
self.assetWriterVideoInput sourcePixelBufferAttributes:sourcePixelBufferAttributesDictionary];
[self.assetWriter addInput:self.assetWriterVideoInput];
self.startTime = [NSDate date];
self.lastTime = CMTimeMakeWithSeconds([[NSDate date] timeIntervalSinceDate:self.startTime],120);
[self.assetWriter startWriting];
[self.assetWriter startSessionAtSourceTime:kCMTimeZero];
}
Now here is a short version of my recogniser
- (void)pan:(UIPanGestureRecognizer *)p {
// Prepare vertex to be added on screen according to user input
[self setNeedsDisplay];
}
Now here is my drawrect method
- (void)drawRect:(CGRect)rect
{
glClearColor(1, 1, 1, 1.0f);
glClear(GL_COLOR_BUFFER_BIT);
[effect prepareToDraw];
//removed code about vertex drawing
[self capturePixels];
}
And finally my capturePixels function
- (void)capturePixels
{
glFinish();
CVPixelBufferRef pixel_buffer = NULL;
CVReturn status = CVPixelBufferPoolCreatePixelBuffer (NULL, self.assetWriterPixelBufferInput.pixelBufferPool, &pixel_buffer);
if ((pixel_buffer == NULL) || (status != kCVReturnSuccess))
{
NSLog(#"%d", status);
NSLog(#"VIDEO FAILED");
return;
}
else
{
CVPixelBufferLockBaseAddress(pixel_buffer, 0);
glReadPixels(0, 0, 954, 608, GL_RGBA, GL_UNSIGNED_BYTE, CVPixelBufferGetBaseAddress(pixel_buffer));
}
// May need to add a check here, because if two consecutive times with the same value are added to the movie, it aborts recording
CMTime currentTime = CMTimeMakeWithSeconds([[NSDate date] timeIntervalSinceDate:self.startTime],120);
if(![self.assetWriterPixelBufferInput appendPixelBuffer:pixel_buffer withPresentationTime:currentTime])
{
NSLog(#"Problem appending pixel buffer at time: %lld", currentTime.value);
}
else
{
NSLog(#"%#", pixel_buffer);
NSLog(#"Recorded pixel buffer at time: %lld", currentTime.value);
self.lastTime = currentTime;
}
CVPixelBufferUnlockBaseAddress(pixel_buffer, 0);
CVPixelBufferRelease(pixel_buffer);
}
I have another function to close the video input.
- (void)tearDownGL
{
NSLog(#"Tear down");
[self.assetWriterVideoInput markAsFinished];
[self.assetWriter endSessionAtSourceTime:self.lastTime];
[self.assetWriter finishWritingWithCompletionHandler:^{
NSLog(#"finish video");
}];
[EAGLContext setCurrentContext:context];
glDeleteBuffers(1, &vertexBuffer);
glDeleteVertexArraysOES(1, &vertexArray);
effect = nil;
glFinish();
if ([EAGLContext currentContext] == context) {
[EAGLContext setCurrentContext:nil];
}
context = nil;
}
This seems to work, as I get no errors, and at the end the video has the correct length, but it's only black...
I'm nowhere near an expert in OpenGL; it's only a tiny part of my iOS application and I want to learn it, so I'm doing my best. Thanks to the posts from @BradLarson (OpenGL ES 2.0 to Video on iPad/iPhone) I've been able to make progress, but I'm really stuck now.

AVAssetWriter sometimes fails with status AVAssetWriterStatusFailed. Seems random

I'm writing an MP4 video file with an AVAssetWriter, using an AVAssetWriterInputPixelBufferAdaptor.
The source is a video from a UIImagePickerController, either freshly captured from the camera or taken from the asset library. Quality right now is UIImagePickerControllerQualityTypeMedium.
Sometimes the writer fails. Its status is AVAssetWriterStatusFailed and the AVAssetWriter object's error property is:
Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed"
UserInfo=0xf5d8990 {NSLocalizedFailureReason=An unknown error occurred (-536870210),
NSUnderlyingError=0x4dd8e0 "The operation couldn’t be completed. (OSStatus error -536870210.)",
NSLocalizedDescription=The operation could not be completed
The error occurs approximately 20% of the time the code is run. It seems to fail more frequently on iPhone 4 / 4S than on iPhone 5.
It also occurs more frequently if the source video quality is higher.
Using UIImagePickerControllerQualityTypeLow the error doesn't happen so often.
Using UIImagePickerControllerQualityTypeHigh, the error happens a little more frequently.
I have also noticed something else:
It seems to come in waves. When it fails, the following runs will often fail too, even though I delete the app and reinstall it. That leaves me wondering whether my program leaks some memory, and whether that memory could stay alive even after the app gets killed (is that even possible?).
Here is the code I use to render my video:
- (void)writeVideo
{
offlineRenderingInProgress = YES;
/* --- Writer Setup --- */
[locationQueue cancelAllOperations];
[self stopWithoutRewinding];
NSError *writerError = nil;
BOOL succes;
succes = [[NSFileManager defaultManager] removeItemAtURL:self.outputURL error:nil];
// DLog(#"Url: %#, succes: %i, error: %#", self.outputURL, succes, fileError);
writer = [AVAssetWriter assetWriterWithURL:self.outputURL fileType:(NSString *)kUTTypeQuickTimeMovie error:&writerError];
//writer.shouldOptimizeForNetworkUse = NO;
if (writerError) {
DLog(#"Writer error: %#", writerError);
return;
}
float bitsPerPixel;
CMVideoDimensions dimensions = CMVideoFormatDescriptionGetDimensions((__bridge CMVideoFormatDescriptionRef)([readerVideoOutput.videoTracks[0] formatDescriptions][0]));
int numPixels = dimensions.width * dimensions.height;
int bitsPerSecond;
// Assume that lower-than-SD resolutions are intended for streaming, and use a lower bitrate
if ( numPixels < (640 * 480) )
bitsPerPixel = 4.05; // This bitrate matches the quality produced by AVCaptureSessionPresetMedium or Low.
else
bitsPerPixel = 11.4; // This bitrate matches the quality produced by AVCaptureSessionPresetHigh.
bitsPerSecond = numPixels * bitsPerPixel;
NSDictionary *videoCompressionSettings = [NSDictionary dictionaryWithObjectsAndKeys:
AVVideoCodecH264, AVVideoCodecKey,
[NSNumber numberWithFloat:videoSize.width], AVVideoWidthKey,
[NSNumber numberWithInteger:videoSize.height], AVVideoHeightKey,
[NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInteger:30], AVVideoMaxKeyFrameIntervalKey,
nil], AVVideoCompressionPropertiesKey,
nil];
writerVideoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoCompressionSettings];
writerVideoInput.transform = movie.preferredTransform;
writerVideoInput.expectsMediaDataInRealTime = YES;
[writer addInput:writerVideoInput];
NSDictionary *sourcePixelBufferAttributesDictionary = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:kCVPixelFormatType_32ARGB], kCVPixelBufferPixelFormatTypeKey, nil];
writerPixelAdaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerVideoInput
sourcePixelBufferAttributes:sourcePixelBufferAttributesDictionary];
BOOL couldStart = [writer startWriting];
if (!couldStart) {
DLog(#"Could not start AVAssetWriter!");
abort = YES;
[locationQueue cancelAllOperations];
return;
}
[self configureFilters];
CIContext *offlineRenderContext = [CIContext contextWithOptions:@{kCIContextUseSoftwareRenderer : @NO}];
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
if (!self.canEdit) {
[self createVideoReaderWithAsset:movie timeRange:CMTimeRangeFromTimeToTime(kCMTimeZero, kCMTimePositiveInfinity) forOfflineRender:YES];
} else {
[self createVideoReaderWithAsset:movie timeRange:CMTimeRangeWithNOVideoRangeInDuration(self.thumbnailEditView.range, movie.duration) forOfflineRender:YES];
}
CMTime startOffset = reader.timeRange.start;
DLog(#"startOffset: %llu", startOffset.value);
[self.thumbnailEditView removeFromSuperview];
// self.thumbnailEditView = nil;
[glLayer removeFromSuperlayer];
glLayer = nil;
[playerView removeFromSuperview];
playerView = nil;
glContext = nil;
[writerVideoInput requestMediaDataWhenReadyOnQueue:dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0) usingBlock:^{
@try {
BOOL didWriteSomething = NO;
DLog(@"Preparing to write...");
while ([writerVideoInput isReadyForMoreMediaData]) {
if (abort) {
NSLog(#"Abort == YES");
[locationQueue cancelAllOperations];
[writerVideoInput markAsFinished];
videoConvertCompletionBlock(NO, writer.error.localizedDescription);
}
if (writer.status == AVAssetWriterStatusFailed) {
DLog(#"Writer.status: AVAssetWriterStatusFailed, error: %#", writer.error);
[[NSUserDefaults standardUserDefaults] setObject:[NSNumber numberWithInt:1] forKey:#"QualityOverride"];
[[NSUserDefaults standardUserDefaults] synchronize];
abort = YES;
[locationQueue cancelAllOperations];
videoConvertCompletionBlock(NO, writer.error.localizedDescription);
return;
DLog(#"Source file exists: %i", [[NSFileManager defaultManager] fileExistsAtPath:movie.URL.relativePath]);
}
DLog(#"Writing started...");
CMSampleBufferRef buffer = nil;
if (reader.status != AVAssetReaderStatusUnknown) {
if (reader.status == AVAssetReaderStatusReading) {
buffer = [readerVideoOutput copyNextSampleBuffer];
if (didWriteSomething == NO) {
DLog(#"Copying sample buffers...");
}
}
if (!buffer) {
[writerVideoInput markAsFinished];
DLog(#"Finished...");
CGColorSpaceRelease(colorSpace);
[self offlineRenderingDidFinish];
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
[writer finishWriting];
if (writer.error != nil) {
DLog(#"Error: %#", writer.error);
} else {
DLog(#"Succes!");
}
if (writer.status == AVAssetWriterStatusCompleted) {
videoConvertCompletionBlock(YES, nil);
}
else {
abort = YES;
videoConvertCompletionBlock(NO, writer.error.localizedDescription);
}
});
return;
}
didWriteSomething = YES;
}
else {
DLog(#"Still waiting...");
//Reader just needs a moment to get ready...
continue;
}
CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(buffer);
if (pixelBuffer == NULL) {
DLog(#"Pixelbuffer == NULL");
continue;
}
//DLog(#"Sample call back! Pixelbuffer: %lu", CVPixelBufferGetHeight(pixelBuffer));
//NSDictionary *options = [NSDictionary dictionaryWithObject:(__bridge id)CGColorSpaceCreateDeviceRGB() forKey:kCIImageColorSpace];
CIImage *ciimage = [CIImage imageWithCVPixelBuffer:pixelBuffer options:nil];
CIImage *outputImage = [self filteredImageWithImage:ciimage];
CVPixelBufferRef outPixelBuffer = NULL;
CVReturn status;
CFDictionaryRef empty; // empty value for attr value.
CFMutableDictionaryRef attrs;
empty = CFDictionaryCreate(kCFAllocatorDefault, // our empty IOSurface properties dictionary
NULL,
NULL,
0,
&kCFTypeDictionaryKeyCallBacks,
&kCFTypeDictionaryValueCallBacks);
attrs = CFDictionaryCreateMutable(kCFAllocatorDefault,
1,
&kCFTypeDictionaryKeyCallBacks,
&kCFTypeDictionaryValueCallBacks);
CFDictionarySetValue(attrs,
kCVPixelBufferIOSurfacePropertiesKey,
empty);
CFDictionarySetValue(attrs,
kCVPixelBufferCGImageCompatibilityKey,
(__bridge const void *)([NSNumber numberWithBool:YES]));
CFDictionarySetValue(attrs,
kCVPixelBufferCGBitmapContextCompatibilityKey,
(__bridge const void *)([NSNumber numberWithBool:YES]));
status = CVPixelBufferCreate(kCFAllocatorDefault, ciimage.extent.size.width, ciimage.extent.size.height, kCVPixelFormatType_32BGRA, attrs, &outPixelBuffer);
//DLog(#"Output image size: %f, %f, pixelbuffer height: %lu", outputImage.extent.size.width, outputImage.extent.size.height, CVPixelBufferGetHeight(outPixelBuffer));
if (status != kCVReturnSuccess) {
DLog(#"Couldn't allocate output pixelBufferRef!");
continue;
}
[offlineRenderContext render:outputImage toCVPixelBuffer:outPixelBuffer bounds:outputImage.extent colorSpace:colorSpace];
CMTime currentSourceTime = CMSampleBufferGetPresentationTimeStamp(buffer);
CMTime currentTime = CMTimeSubtract(currentSourceTime, startOffset);
CMTime duration = reader.timeRange.duration;
if (CMTIME_IS_POSITIVE_INFINITY(duration)) {
duration = movie.duration;
}
CMTime durationConverted = CMTimeConvertScale(duration, currentTime.timescale, kCMTimeRoundingMethod_Default);
float durationFloat = (float)durationConverted.value;
float progress = ((float) currentTime.value) / durationFloat;
//DLog(#"duration : %f, progress: %f", durationFloat, progress);
[self updateOfflineRenderProgress:progress];
if (pixelBuffer != NULL && writerVideoInput.readyForMoreMediaData) {
[writerPixelAdaptor appendPixelBuffer:outPixelBuffer withPresentationTime:currentTime];
} else {
continue;
}
if (writer.status == AVAssetWriterStatusWriting) {
DLog(#"Writer.status: AVAssetWriterStatusWriting");
}
CFRelease(buffer);
CVPixelBufferRelease(outPixelBuffer);
}
}
@catch (NSException *exception) {
DLog(@"Catching exception: %@", exception);
}
}];
}
OK, I think I solved it myself. The culprit was this line:
[writerVideoInput requestMediaDataWhenReadyOnQueue:dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0) usingBlock:^{ ....
The global queue I was passing is a concurrent queue, which allows a new callback to be made before the previous one has finished. The asset writer is not designed to be written to from more than one thread at a time.
Creating and using a new serial queue seems to remedy the problem:
assetWriterQueue = dispatch_queue_create("AssetWriterQueue", DISPATCH_QUEUE_SERIAL);
[writerVideoInput requestMediaDataWhenReadyOnQueue:assetWriterQueue usingBlock:^{...
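For context, here is a slightly fuller sketch of the fixed setup (names match the question's code; error handling and the filtering step are elided):
assetWriterQueue = dispatch_queue_create("AssetWriterQueue", DISPATCH_QUEUE_SERIAL);
[writerVideoInput requestMediaDataWhenReadyOnQueue:assetWriterQueue usingBlock:^{
    // On a serial queue the callbacks run one at a time, so the writer
    // is never fed from two threads at once.
    while ([writerVideoInput isReadyForMoreMediaData]) {
        CMSampleBufferRef buffer = [readerVideoOutput copyNextSampleBuffer];
        if (!buffer) {
            [writerVideoInput markAsFinished]; // reader is drained
            break;
        }
        // ... filter the frame and append it via writerPixelAdaptor,
        // as in the question's block ...
        CFRelease(buffer);
    }
}];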

Remove curled corner from qlgenerator thumbnail

How do I remove the curled corner from a thumbnail created in a Quick Look plugin?
Screenshot of current icon:
Screenshot of what I want:
GeneratePreviewForURL.m:
#include <CoreFoundation/CoreFoundation.h>
#include <CoreServices/CoreServices.h>
#include <QuickLook/QuickLook.h>
#import "GenerateIcon.h"
OSStatus GeneratePreviewForURL(void *thisInterface, QLPreviewRequestRef preview, CFURLRef url, CFStringRef contentTypeUTI, CFDictionaryRef options);
void CancelPreviewGeneration(void *thisInterface, QLPreviewRequestRef preview);
/* -----------------------------------------------------------------------------
Generate a preview for file
This function's job is to create preview for designated file
----------------------------------------------------------------------------- */
OSStatus GeneratePreviewForURL(void *thisInterface, QLPreviewRequestRef preview, CFURLRef url, CFStringRef contentTypeUTI, CFDictionaryRef options)
{
// To complete your generator please implement the function GeneratePreviewForURL in GeneratePreviewForURL.c
[GenerateIcon generatePreviewWithRef:preview URL:url];
return noErr;
}
void CancelPreviewGeneration(void *thisInterface, QLPreviewRequestRef preview)
{
// Implement only if supported
}
GenerateIcon.m:
//
// GenerateIcon.m
// Windows Binary Icon
//
// Created by Asger Hautop Drewsen on 2/5/12.
// Copyright (c) 2012 Asger Drewsen. All rights reserved.
//
#import "GenerateIcon.h"
@implementation GenerateIcon
+(void) generateThumbnailWithRef:(QLThumbnailRequestRef)requestRef URL:(CFURLRef)url
{
[GenerateIcon generateMultiWithThumbnailRef:requestRef PreviewRef:nil URL:url];
}
+(void) generatePreviewWithRef:(QLPreviewRequestRef)requestRef URL:(CFURLRef)url
{
[GenerateIcon generateMultiWithThumbnailRef:nil PreviewRef:requestRef URL:url];
}
+(void) generateMultiWithThumbnailRef:(QLThumbnailRequestRef)thumbnail PreviewRef:(QLPreviewRequestRef)preview URL:(CFURLRef)url
{
@autoreleasepool {
NSString * tempDir = NSTemporaryDirectory();
if (tempDir == nil)
tempDir = @"/tmp";
NSFileManager *fileManager = [[NSFileManager alloc] init];
NSString *directory = [tempDir stringByAppendingFormat: [NSString stringWithFormat:@"%@-%.0f", @"exe-icons", [NSDate timeIntervalSinceReferenceDate] * 1000.0]];
//NSString *directory = [tempDir stringByAppendingPathComponent:@"com.tyilo.exe-icons"];
/*for (NSString *file in [fileManager contentsOfDirectoryAtPath:directory error:nil])
{
[fileManager removeItemAtPath:file error:nil];
}*/
[fileManager createDirectoryAtPath:directory withIntermediateDirectories:YES attributes:nil error:nil];
[[NSTask launchedTaskWithLaunchPath:@"/usr/local/bin/wrestool" arguments:[NSArray arrayWithObjects:
@"-t",
@"group_icon",
@"-o",
directory,
@"-x",
[(__bridge NSURL *)url path],
nil]] waitUntilExit];
NSArray *icons = [fileManager contentsOfDirectoryAtPath:directory error:nil];
if (icons.count > 0)
{
NSImage *image = [[NSImage alloc] initWithContentsOfFile:[directory stringByAppendingPathComponent: [icons objectAtIndex:0]]];
NSData *thumbnailData = [image TIFFRepresentation];
CGSize size = image.size;
NSDictionary *properties = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:size.width],kQLPreviewPropertyWidthKey,
[NSNumber numberWithInt:size.height],kQLPreviewPropertyHeightKey,
nil];
CGContextRef CGContext;
if (thumbnail)
{
CGContext = QLThumbnailRequestCreateContext(thumbnail, size, TRUE, (__bridge CFDictionaryRef)properties);
}
else
{
CGContext = QLPreviewRequestCreateContext(preview, size, TRUE, (__bridge CFDictionaryRef)properties);
}
if(CGContext) {
NSGraphicsContext* context = [NSGraphicsContext graphicsContextWithGraphicsPort:(void *)CGContext flipped:size.width > size.height];
if(context) {
//These two lines of code are just good safe programming…
[NSGraphicsContext saveGraphicsState];
[NSGraphicsContext setCurrentContext:context];
NSBitmapImageRep *thumbnailBitmap = [NSBitmapImageRep imageRepWithData:thumbnailData];
[thumbnailBitmap draw];
//This line sets the context back to what it was when we're done
[NSGraphicsContext restoreGraphicsState];
}
// When we are done with our drawing code QLThumbnailRequestFlushContext() is called to flush the context
if (thumbnail)
{
QLThumbnailRequestFlushContext(thumbnail, CGContext);
}
else
{
QLPreviewRequestFlushContext(preview, CGContext);
}
// Release the CGContext
CFRelease(CGContext);
}
/*NSLog(#"%#", [directory stringByAppendingPathComponent: [icons objectAtIndex:0]]);
CGImageRef image = (__bridge CGImageRef) [[NSImage alloc] initByReferencingFile:[directory stringByAppendingPathComponent: [icons objectAtIndex:0]]];
QLThumbnailRequestSetImage(thumbnail, image, properties);*/
}
else
{
NSLog(#"Failed to generate thumbnail!");
}
}
}
@end
Edit: Added screenshots.
You need to add the undocumented "IconFlavor" key to the properties dictionary that you supply to QLThumbnailRequestCreateContext() or QLThumbnailRequestSetXXX(), and give it the value 1 for minimal decoration.
See here for an example. At the top of that file are some other values I've discovered for "IconFlavor".
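Applied to the question's code, that would look something like the sketch below. Since "IconFlavor" is undocumented, the value 1 is based on observed behaviour rather than any API contract:
NSDictionary *properties = [NSDictionary dictionaryWithObjectsAndKeys:
                            [NSNumber numberWithInt:size.width], kQLPreviewPropertyWidthKey,
                            [NSNumber numberWithInt:size.height], kQLPreviewPropertyHeightKey,
                            [NSNumber numberWithInt:1], @"IconFlavor", // undocumented: 1 appears to mean minimal decoration (no curl)
                            nil];
CGContext = QLThumbnailRequestCreateContext(thumbnail, size, TRUE, (__bridge CFDictionaryRef)properties);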
The appearance of your icons is chosen automatically by Quick Look, and there is no public way to customize it. What does your type conformance tree look like?
For more information on UTIs see Uniform Type Identifiers Overview. Note that your type conformance tree won't necessarily translate to what you want from Quick Look, but at least you will have a sane starting point.