I am trying to mirror the screen of my Mac to an iPhone. I have this method in the Mac app delegate to capture the screen into a base64 string:
- (NSString *)baseString {
    CGImageRef screen = CGDisplayCreateImage(displays[0]);
    CGFloat w = CGImageGetWidth(screen);
    CGFloat h = CGImageGetHeight(screen);
    NSImage *image = [[NSImage alloc] initWithCGImage:screen size:(NSSize){w, h}];
    [image lockFocus];
    NSBitmapImageRep *bitmapRep = [[NSBitmapImageRep alloc] initWithFocusedViewRect:NSMakeRect(0, 0, w, h)];
    [bitmapRep setCompression:NSTIFFCompressionJPEG factor:.3];
    [image unlockFocus];
    NSData *imageData = [bitmapRep representationUsingType:NSJPEGFileType properties:_options];
    NSString *base64String = [imageData base64EncodedStringWithOptions:0];
    image = nil;
    bitmapRep = nil;
    imageData = nil;
    return base64String;
}
After that I send it to the iPhone and present it in a UIImageView.
The delay between screenshots is 40 milliseconds. Everything works as expected as long as there is enough memory. After a minute of streaming the Mac starts swapping and uses 6GB of RAM. The iOS app's memory usage also grows linearly; by the time the iOS app reaches 90MB of RAM, the Mac is at 6GB.
Even if I stop streaming, the memory is not released.
I'm using ARC in both projects. Would it make any difference if I migrated to manual reference counting?
I also tried an @autoreleasepool {...} block, but it didn't help.
Any ideas?
EDIT
My iOS code is here:
NSString *message = [NSString stringWithFormat:@"data:image/png;base64,%@", base64];
NSURL *url = [NSURL URLWithString:message];
NSData *imageData = [NSData dataWithContentsOfURL:url];
UIImage *ret = [UIImage imageWithData:imageData];
self.image.image = ret;
You have a serious memory leak. The docs for CGDisplayCreateImage clearly state:
The caller is responsible for releasing the image created by calling CGImageRelease.
Update your code with a call to:
CGImageRelease(screen);
I'd add that just after creating the NSImage.
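For illustration, here is a sketch of the top of the method with the release added (the rest of the method is unchanged):
CGImageRef screen = CGDisplayCreateImage(displays[0]);
CGFloat w = CGImageGetWidth(screen);
CGFloat h = CGImageGetHeight(screen);
NSImage *image = [[NSImage alloc] initWithCGImage:screen size:(NSSize){w, h}];
CGImageRelease(screen); // balances CGDisplayCreateImage; without this, every frame leaks a full-screen image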
We can't help with your iOS memory leaks since you didn't post your iOS code, but I see a big memory leak in your Mac code.
You are calling a Core Foundation function, CGDisplayCreateImage. Core Foundation objects are not managed by ARC. If a Core Foundation function has "Create" (or "Copy") in its name, then it follows the "Create Rule" and you are responsible for releasing the returned CF object when you are done with it.
Some CF objects have special release calls. For those that don't, just call CFRelease. CGImageRef has a special release call, CGImageRelease().
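For example (an illustrative pairing with a different CF type, not taken from the asker's code):
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB(); // "Create" in the name: you own it
// ... use colorSpace ...
CGColorSpaceRelease(colorSpace); // special release call; CFRelease(colorSpace) would also work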
You need a corresponding call to CGImageRelease(screen), probably after the call to initWithCGImage.
Related
I'm using the profiler in Xcode 4 to determine if I have any memory leaks. I didn't have this leak before, but with Xcode 5 I have this one.
I'm trying to set an image for the tab bar item of my `UIViewController` and the profiler marks this line:
image = [[UIImage alloc] initWithContentsOfFile:imgPath]; <<=== Leak : 9.1%
This is part of my code and I don't understand why it leaks. What's the best way to resolve this issue?
NSString *imgPath;
UIImage *image;
IBNewsViewController *newsView = [[IBNewsViewController alloc] initWithURL:[tvLocal urlFlux] title:@"News" isEmission:NO];
[newsView setTitle:@"News"];
imgPath = [[NSBundle mainBundle] pathForResource:@"news" ofType:@"png"];
image = [[UIImage alloc] initWithContentsOfFile:imgPath]; <<=== Leak : 9.1%
newsView.tabBarItem.image = image;
[image release];
image = nil;
UINavigationController* navNew = [[UINavigationController alloc] initWithRootViewController:newsView];
[newsView release];
newsView = nil;
EDIT:
There is no leak on iOS 6. Why does it leak on iOS 7?
You should switch to the autoreleasing imageNamed: method. This has the added benefit of system-level caching of the image.
NSString *imgPath;
UIImage *image;
IBNewsViewController *newsView = [[IBNewsViewController alloc] initWithURL:[tvLocal urlFlux] title:@"News" isEmission:NO];
[newsView setTitle:@"News"];
image = [UIImage imageNamed:@"news"];
newsView.tabBarItem.image = image;
UINavigationController* navNew = [[UINavigationController alloc] initWithRootViewController:newsView];
[newsView release];
newsView = nil;
To make life easier on yourself, I'd switch your project to ARC so you have less to worry about with regard to memory management.
Replace this line
image = [[UIImage alloc] initWithContentsOfFile:imgPath];
With
image = [UIImage imageWithContentsOfFile:imgPath];
and check whether the leak goes away.
First, switch to ARC. There is no single thing you can do on iOS that will more improve your code and remove whole classes of memory problems with a single move.
Beyond that, the code above does not appear to have a leak itself. That suggests that the actual mistake is elsewhere. There are several ways this could happen:
You're leaking the IBNewsViewController somewhere else
IBNewsViewController messes with its tabBarItem incorrectly and leaks that
You're leaking the UINavigationController somewhere else
You're retaining the tabBarItem.image somewhere else and failing to release it
Those are the most likely that I would hunt for. If you're directly accessing ivars, that can often cause these kinds of mistakes. You should use accessors everywhere except in init and dealloc. (This is true in ARC, but is absolutely critical without ARC.)
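As a quick illustration of the ivar-versus-accessor point under manual reference counting (the property and variable names here are hypothetical):
_image = newImage;      // direct ivar access: the old value is never released and the new one is never retained
self.image = newImage;  // retain-style accessor: releases the previous value and retains the new one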
Leak detection is not perfect. There are all kinds of "abandoned" memory that may not appear as a leak. I often recommend using Heapshot (now "Generations") analysis to see what other objects may be abandoned; that may give you better insight into this leak.
Why the difference between iOS 6 and iOS 7? I suspect you have the same problem on iOS 6, but it doesn't look like a "leak", possibly because something was caching the image on iOS 6 that was removed in iOS 7. The cache pointer may make it look like it's not a leak to Instruments.
Speaking of which, do make sure to run the static analyzer. It can help you find problems.
And of course, switch to ARC.
My goal is to make a video out of a short sequence of OpenGL frames (around 200 frames). So in order to do this, I use the following code to create an array of images:
NSMutableArray *images = [NSMutableArray array];
KTEngine *engine = [KTEngine sharedInstance]; // OpenGL-based engine

for (unsigned int i = engine.animationContext.unitStart; i < engine.animationContext.unitEnd; ++i) {
    NSLog(@"Render Image %d", i);
    [engine.animationContext update:i];
    [self.view setNeedsDisplay];
    [images addObject:[view snapshot]];
}
NSLog(@"Total images rendered: %lu", (unsigned long)[images count]);
[self createVideoFileFromArray:images];
This works perfectly fine on the simulator, but not on the device (a retina iPad). My guess is that the device cannot hold that many UIImages in memory (especially at 2048×1536). The crash always happens after 38 frames or so.
Now as for the solution: I thought of creating a video for every 10 frames and then stitching them all together, but how can I know whether I have enough space (is the autorelease pool drained)?
Maybe I should use a thread that processes 10 images and fires again for the next 10 frames once it's done?
Any idea?
It seems quite likely that you're running out of memory.
To reduce memory usage, you could try storing the images as NSData in PNG or JPG format instead. Both PNGs and JPGs are quite small when represented as data, but loading them into UIImage objects consumes a lot of memory.
I would advise you to do something like the code below in your loop. The autorelease pool is needed to drain the returned snapshot on each iteration.
NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
UIImage *image = [view snapshot];
NSData *imageData = UIImagePNGRepresentation(image);
[images addObject:imageData];
[pool release];
This of course requires your createVideoFileFromArray: method to handle raw image data instead of UIImage objects, but that should be feasible to implement.
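A rough sketch of how that method might consume the data array (the AVAssetWriter plumbing is omitted; this only shows decoding one frame at a time inside a pool):
for (NSData *imageData in images) {
    NSAutoreleasePool *framePool = [[NSAutoreleasePool alloc] init];
    UIImage *frame = [UIImage imageWithData:imageData]; // decoded only for this iteration
    // ... convert frame to a pixel buffer and append it to the asset writer input ...
    [framePool release]; // the decoded frame is drained before the next one is loaded
}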
I have a memory leak which crashes my app while copying/creating images with NSFileManager.
When I profile my app with the Allocations instrument, everything looks fine: the allocated memory goes up from approx. 1.5MB to 6MB during every recursion and then drops back to 1.5MB.
But the real memory and virtual memory grow to approx. 150MB, and then the app crashes.
I receive memory warnings level 1 and 2 before that.
Here is the function I use:
- (void)processCacheItems:(NSMutableArray *)originalFiles
{
    if ([originalFiles count] == 0)
    {
        [originalFiles release];
        return;
    }
    else
    {
        NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

        NSString *curFileName = [originalFiles lastObject];
        NSString *filePath = [documentsDirectoryPath stringByAppendingPathComponent:curFileName];
        NSURL *fileURL = [NSURL fileURLWithPath:filePath];

        CGSize destinationSize = CGSizeMake(150, 150);
        CGSize previewDestinationSize = CGSizeMake(1440.0, 1440.0);

        UIImage *originalImage = [UIImage imageWithContentsOfFile:filePath]; // AUTORELEASED

        // Create thumb and copy to presentation-files directory.
        UIImage *thumb = [originalImage resizedImageWithContentMode:UIViewContentModeScaleAspectFit
                                                             bounds:destinationSize
                                               interpolationQuality:kCGInterpolationHigh]; // AUTORELEASED
        // resizedImageWithContentMode: does not seem to be the problem, because when I
        // skip it and just use the original file, the same problem occurs.
        NSString *thumbPath = [thumbsDirectoryPath stringByAppendingPathComponent:curFileName];
        [fileManager createFileAtPath:thumbPath contents:UIImageJPEGRepresentation(thumb, 0.9) attributes:NULL];

        // Create preview and copy to presentation-files directory.
        UIImage *previewImage = [originalImage resizedImageWithContentMode:UIViewContentModeScaleAspectFit
                                                                    bounds:previewDestinationSize
                                                      interpolationQuality:kCGInterpolationHigh]; // AUTORELEASED
        NSString *previewImagePath = [previewsDirectoryPath stringByAppendingPathComponent:curFileName];
        [fileManager createFileAtPath:previewImagePath contents:UIImageJPEGRepresentation(previewImage, 0.9) attributes:NULL];

        // Copy original to presentation-files directory.
        NSString *originalPath = [originalFilesDirectoryPath stringByAppendingPathComponent:curFileName];
        [fileManager copyItemAtPath:filePath toPath:originalPath error:NULL];

        [originalFiles removeLastObject];
        [pool drain];

        [self processCacheItems:originalFiles]; // recursion
    }
}
Thank you for your hint.
I found out that the problem was not a leak; the memory allocation was simply too big when scaling down big images in resizedImageWithContentMode:, and that made the app crash.
I changed the image scaling to use the Image I/O framework.
Now it works fine.
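The asker didn't post the replacement code, but a typical Image I/O downscale looks roughly like this (a sketch under manual reference counting; the function name is mine):
#import <ImageIO/ImageIO.h>

UIImage *downscaledImage(NSURL *fileURL, CGFloat maxPixelSize) {
    CGImageSourceRef source = CGImageSourceCreateWithURL((CFURLRef)fileURL, NULL);
    if (!source) return nil;

    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
        (id)kCFBooleanTrue, (id)kCGImageSourceCreateThumbnailFromImageAlways,
        (id)kCFBooleanTrue, (id)kCGImageSourceCreateThumbnailWithTransform,
        [NSNumber numberWithFloat:maxPixelSize], (id)kCGImageSourceThumbnailMaxPixelSize,
        nil];

    // Decodes directly to the target size, so the full-resolution bitmap is never held in memory.
    CGImageRef thumb = CGImageSourceCreateThumbnailAtIndex(source, 0, (CFDictionaryRef)options);
    CFRelease(source);
    if (!thumb) return nil;

    UIImage *image = [UIImage imageWithCGImage:thumb]; // autoreleased
    CGImageRelease(thumb);                             // balance the Create call
    return image;
}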
UPDATE: This answer is outdated. If you are using ARC, ignore it.
How do you allocate the NSFileManager?
I have experienced that allocating it via the +defaultManager method produces memory leaks (or at least Instruments says so; Instruments sometimes reports memory leaks where there are none).
Generally, you should allocate it with [[NSFileManager alloc] init] and release it when you no longer need it.
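That is (under manual reference counting):
NSFileManager *fileManager = [[NSFileManager alloc] init];
// ... file operations ...
[fileManager release]; // no longer needed once the project is migrated to ARC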
I am implementing an iOS app that needs to fetch a huge number of images over HTTP. I've tried several approaches, but independently of what I do, Instruments shows constantly increasing memory allocations and the app crashes sooner or later when I run it on a device. There are no leaks shown by Instruments.
So far I have tried the following approaches:
Fetch the images using a synchronous NSURLConnection within an NSOperation
Fetch the images using an asynchronous NSURLConnection within an NSOperation
Fetch the images using [NSData dataWithContentsOfURL:url] on the main thread
Fetch the images using a synchronous ASIHTTPRequest within an NSOperation
Fetch the images using an asynchronous ASIHTTPRequest and adding it to an NSOperationQueue
Fetch the images using an asynchronous ASIHTTPRequest and using a completionBlock
The Call Tree in Instruments shows that the memory is consumed while processing the HTTP response. In the case of the asynchronous NSURLConnection this is in:
- (void)connection:(NSURLConnection *)connection didReceiveData:(NSData *)data {
    [receivedData appendData:data];
}
In the case of the synchronous NSURLConnection, Instruments shows a growing CFData (store) entry.
The problem with ASIHTTPRequest seems to be the same as with the asynchronous NSURLConnection, in an analogous code position. The [NSData dataWithContentsOfURL:url] approach shows an increasing amount of total memory allocated at exactly that statement.
I am using an NSAutoreleasePool when the request is done in a separate thread, and I have tried to free up memory with [[NSURLCache sharedURLCache] removeAllCachedResponses], without success.
Any ideas/hints to solve the problem? Thanks.
Edit:
The behaviour only shows up if I persist the images using Core Data. Here is the code I run as an NSInvocationOperation:
- (void)_fetchAndSave:(NSString *)imageId {
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    NSString *url = [NSString stringWithFormat:@"%@%@", kImageUrl, imageId];
    HTTPResponse *response = [SimpleHTTPClient GET:url headerOrNil:nil];
    NSData *data = [response payload];
    if (data && [data length] > 0) {
        UIImage *thumbnailImage = [UIImage imageWithData:data];
        NSData *thumbnailData = UIImageJPEGRepresentation([thumbnailImage scaleToSize:CGSizeMake(55, 53)], 0.5); // or UIImagePNGRepresentation(thumbnail)
        [self performSelectorOnMainThread:@selector(_save:)
                               withObject:[NSArray arrayWithObjects:imageId, data, thumbnailData, nil]
                            waitUntilDone:NO];
    }
    [pool release];
}
All the Core Data related work is done on the main thread here, so there should not be any Core Data multithreading issue. However, if I persist the images, Instruments shows constantly increasing memory allocations at the positions described above.
Edit II:
Core Data related code:
- (void)_save:(NSArray *)args {
    NSString *imageId = [args objectAtIndex:0];
    NSData *data = [args objectAtIndex:1];
    NSData *thumbnailData = [args objectAtIndex:2];

    Image *image = (Image *)[[CoreDataHelper sharedSingleton] createObject:@"Image"];
    image.timestamp = [NSNumber numberWithDouble:[[NSDate date] timeIntervalSince1970]];
    image.data = data;

    Thumbnail *thumbnail = (Thumbnail *)[[CoreDataHelper sharedSingleton] createObject:@"Thumbnail"];
    thumbnail.data = thumbnailData;
    thumbnail.timestamp = image.timestamp;

    [[CoreDataHelper sharedSingleton] save];
}
From CoreDataHelper (self.managedObjectContext picks the NSManagedObjectContext usable on the current thread):
- (NSManagedObject *)createObject:(NSString *)entityName {
    return [NSEntityDescription insertNewObjectForEntityForName:entityName
                                         inManagedObjectContext:self.managedObjectContext];
}
We had a similar problem. While fetching lots of images over HTTP, there was huge growth and a sawtooth pattern in the memory allocation. We'd see the system clean up, more or less, as it went, but slowly and not predictably. Meanwhile the downloads were streaming in, piling up on whatever was holding on to the memory. Memory allocation would crest around 200MB and then we'd die.
The problem was an NSURLCache issue. You stated that you tried [[NSURLCache sharedURLCache] removeAllCachedResponses]. We tried that, too, but then tried something a little different.
Our downloads are done in groups of N images/movies, where N was typically 50 to 500. It was important that we get all of N as an atomic operation.
Before we started our group of http downloads, we did this:
NSURLCache *sharedCache = [[NSURLCache alloc] initWithMemoryCapacity:0 diskCapacity:0 diskPath:nil];
[NSURLCache setSharedURLCache:sharedCache];
We then fetch each of the N images over HTTP with a synchronous call. We do this group download in an NSOperation, so we're not blocking the UI.
NSData *movieReferenceData = [NSURLConnection sendSynchronousRequest:request returningResponse:&response error:&error];
Finally, after each individual image download, and after we're done with our NSData object for that image, we call:
[sharedCache removeAllCachedResponses];
Our memory allocation peak behavior dropped to a very comfortable handful of megabytes, and stopped growing.
In this case, you're seeing exactly what you're supposed to see. -[NSMutableData appendData:] increases the size of its internal buffer to hold the new data. Since an NSMutableData is always located in memory, this causes a corresponding increase in memory usage. What were you expecting?
If the ultimate destination for these images is on disk, try using an NSOutputStream instead of NSMutableData. If you then want to display the image, you can create a UIImage pointing to the file when you're done.
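A minimal sketch of that approach with the NSURLConnection delegate (it assumes a fileStream property and a destinationPath of your choosing; error handling is omitted):
- (void)connection:(NSURLConnection *)connection didReceiveResponse:(NSURLResponse *)response {
    self.fileStream = [NSOutputStream outputStreamToFileAtPath:self.destinationPath append:NO];
    [self.fileStream open];
}

- (void)connection:(NSURLConnection *)connection didReceiveData:(NSData *)data {
    // Write straight to disk instead of growing an in-memory NSMutableData.
    const uint8_t *bytes = [data bytes];
    NSUInteger remaining = [data length];
    while (remaining > 0) {
        NSInteger written = [self.fileStream write:bytes maxLength:remaining];
        if (written <= 0) break; // stream error; handle appropriately
        bytes += written;
        remaining -= (NSUInteger)written;
    }
}

- (void)connectionDidFinishLoading:(NSURLConnection *)connection {
    [self.fileStream close];
    // When the image is actually needed for display, load it lazily from the file:
    UIImage *image = [UIImage imageWithContentsOfFile:self.destinationPath];
    // ...
}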
As many people have reported, there seems to be a bug in the Apple SDK for the Retina display: imageWithContentsOfFile does not automatically load the @2x images.
I've stumbled onto a nice post describing how to write a function that detects the UIScreen scale factor and properly loads the low- or high-res image (http://atastypixel.com/blog/uiimage-resolution-independence-and-the-iphone-4s-retina-display/), but that solution loads a 2x image and still leaves the image's scale factor at 1.0, which results in a 2x image scaled 2 times (so, 4 times bigger than what it should look like).
imageNamed seems to load low- and high-res images correctly, but it is not an option for me.
Does anybody have a solution for loading low/high-res images without relying on the automatic loading of imageNamed or imageWithContentsOfFile? (Or, alternatively, a way to make imageWithContentsOfFile work correctly?)
OK, the actual solution was found by Michael here:
http://atastypixel.com/blog/uiimage-resolution-independence-and-the-iphone-4s-retina-display/
He figured out that UIImage has the initWithCGImage: method, which also takes a scale factor as input (I guess the only method where you can set the scale factor yourself):
-[UIImage initWithCGImage:scale:orientation:]
And this seems to work great: you can custom-load your high-res images and just set the scale factor to 2.0.
The problem with imageWithContentsOfFile is that, since it currently does not work properly, we can't trust it even once it's fixed (because some users will still have an older iOS on their devices).
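In practice that looks something like this (a sketch; the bundle resource name is a placeholder):
NSString *path = [[NSBundle mainBundle] pathForResource:@"myimage@2x" ofType:@"png"];
UIImage *raw = [UIImage imageWithContentsOfFile:path];
// Re-wrap the CGImage with an explicit scale so the image renders at its logical (1x) size.
UIImage *retinaImage = [[[UIImage alloc] initWithCGImage:[raw CGImage]
                                                   scale:2.0
                                             orientation:UIImageOrientationUp] autorelease];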
We just ran into this here at work.
Here is my work-around that seems to hold water:
NSString *imgFile = ...; // path to your file
NSData *imgData = [[NSData alloc] initWithContentsOfFile:imgFile];
UIImage *img = [[UIImage alloc] initWithData:imgData];
imageWithContentsOfFile works properly (loading @2x images with the correct scale) from iOS 4.1 onwards.
Enhancing Lisa Rossellis's answer to keep retina images at the desired size (not scaling them up):
NSString *imagePath = ...; // path to your image
UIImage *image = [UIImage imageWithData:[NSData dataWithContentsOfFile:imagePath] scale:[UIScreen mainScreen].scale];
I've developed a drop-in workaround for this problem.
It uses method swizzling to replace the behavior of UIImage's imageWithContentsOfFile: method.
It works fine on pre- and post-retina iPhones/iPods.
Not sure about the iPad.
Hope this is of help.
#import <objc/runtime.h>

@implementation NSString (LoadHighDef)

/** If self is the path to an image, returns the nominal path to the high-res variant of that image. */
- (NSString *)stringByInsertingHighResPathModifier {
    NSString *path = [self stringByDeletingPathExtension];

    // We determine whether a device modifier is present, and in case it is, where
    // the "split position" is at which the "@2x" token is to be added.
    NSArray *deviceModifiers = [NSArray arrayWithObjects:@"~iphone", @"~ipad", nil];
    NSInteger splitIdx = [path length];
    for (NSString *modifier in deviceModifiers) {
        if ([path hasSuffix:modifier]) {
            splitIdx -= [modifier length];
            break;
        }
    }

    // We insert the "@2x" token in the string at the proper position; if no
    // device modifier is present the token is added at the end of the string.
    NSString *highDefPath = [NSString stringWithFormat:@"%@@2x%@", [path substringToIndex:splitIdx], [path substringFromIndex:splitIdx]];

    // We possibly add the extension, if there is any extension at all.
    NSString *ext = [self pathExtension];
    return [ext length] > 0 ? [highDefPath stringByAppendingPathExtension:ext] : highDefPath;
}

@end

@implementation UIImage (LoadHighDef)

/* Upon loading this category, the implementation of "imageWithContentsOfFile:" is exchanged with the implementation
 * of our custom "imageWithContentsOfFile_custom:" method, whereby we replace and fix the behavior of the system selector. */
+ (void)load {
    Method originalMethod = class_getClassMethod([UIImage class], @selector(imageWithContentsOfFile:));
    Method replacementMethod = class_getClassMethod([UIImage class], @selector(imageWithContentsOfFile_custom:));
    method_exchangeImplementations(replacementMethod, originalMethod);
}

/** This method works just like the system "imageWithContentsOfFile:", but it loads the high-res version of the image
 * instead of the default one in case the device's screen is high-res and the high-res variant of the image is present.
 *
 * We assume that the original "imageWithContentsOfFile:" implementation properly sets the "scale" factor upon
 * loading a "@2x" image (this is its behavior as of iOS 4.0.1).
 *
 * Note: The "imageWithContentsOfFile_custom:" invocations in this code are not recursive calls, by virtue of
 * method swizzling. In fact, the original UIImage implementation of "imageWithContentsOfFile:" gets called.
 */
+ (UIImage *)imageWithContentsOfFile_custom:(NSString *)imgName {
    // If high-res is supported by the device...
    UIScreen *screen = [UIScreen mainScreen];
    if ([screen respondsToSelector:@selector(scale)] && [screen scale] >= 2.0) {
        // ...then we look for the high-res version of the image first.
        UIImage *hiDefImg = [UIImage imageWithContentsOfFile_custom:[imgName stringByInsertingHighResPathModifier]];
        // If such a high-res version exists, we return it.
        // The scale factor will be correctly set, because once you give imageWithContentsOfFile:
        // the full hi-res path it properly takes it into account.
        if (hiDefImg != nil)
            return hiDefImg;
    }

    // If the device does not support high-res, or it does but there is
    // no high-res variant of imgName, we return the base version.
    return [UIImage imageWithContentsOfFile_custom:imgName];
}

@end
[UIImage imageWithContentsOfFile:] doesn't load @2x graphics if you specify an absolute path.
Here is a solution:
- (UIImage *)loadRetinaImageIfAvailable:(NSString *)path {
    NSString *retinaPath = [[path stringByDeletingLastPathComponent] stringByAppendingPathComponent:
                            [NSString stringWithFormat:@"%@@2x.%@", [[path lastPathComponent] stringByDeletingPathExtension], [path pathExtension]]];
    if ([UIScreen mainScreen].scale == 2.0 && [[NSFileManager defaultManager] fileExistsAtPath:retinaPath] == YES)
        return [[[UIImage alloc] initWithCGImage:[[UIImage imageWithData:[NSData dataWithContentsOfFile:retinaPath]] CGImage]
                                           scale:2.0
                                     orientation:UIImageOrientationUp] autorelease];
    else
        return [UIImage imageWithContentsOfFile:path];
}
Credit goes to Christof Dorner for his simple solution (which I modified and pasted here).