I'm working on a Mac app. Initially, watching Xcode's memory report while I ran my app showed that memory usage was ramping up like crazy. I used Instruments and profiled my app for allocations and leaks. It turned out there wasn't much leaked memory, as you would expect from strong reference cycles etc. However, there was a lot of abandoned memory. By following the stack traces that led to my code, I have fixed about 70% of it using autorelease pools etc. Still, the remaining 30% of abandoned memory seems to point to system calls.
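For context, the autorelease-pool fix was simply draining temporaries on every pass of the heavy loops; a minimal sketch, where processItem: is a hypothetical stand-in for my actual per-item work:

    // Drain autoreleased temporaries every iteration instead of letting
    // them pile up until the enclosing run-loop pass ends.
    for (id item in items) {
        @autoreleasepool {
            [self processItem:item]; // hypothetical per-item work
        }
    }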
Based on that, I have two questions:
1) I want to fix the remaining 30%. How can I get rid of abandoned memory? I have already used Instruments and know exactly where those system calls are spawned, but I still don't know what to do to get that memory cleaned up. (I'm using ARC, no manual retain/release, and autorelease doesn't seem to make a difference.)
2) Once whatever my application was doing has completed and there is no need for any memory to be held (just like when the application first started), I want to get rid of all the memory my app has used. I plan to use this as a brute-force approach to clean up all memory, just as the system would if the user closed the app or turned off the machine.
Basically, if I knew where my app's memory is in the file system, I'd just programmatically call the purge command on it, or something similar. At this point I'm 100% sure nothing needs to be in memory for the app except the first screen you would expect the first time you launch the app.
I read this, this, this and this but they weren't helpful.
I'm working on a Cocoa app that receives and stores large amounts of data in Core Data (> 500K objects with many inverse relationships). The app has to run constantly, in cycles. The problem I encounter is that after each cycle the allocated memory grows by 20-40 megabytes (as indicated by Xcode; Activity Monitor shows the same tendency, of course).
What I have so far:
wrapped the methods that insert objects into the context in autorelease pools (see the sketch after this list);
reset the context; set undoManager to nil and stalenessInterval to 0;
recreated the persistent store coordinator on cycle completion (removing and re-adding the store);
spent many hours profiling, but could not find the causes of the leaks.
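For reference, the batching pattern from the first bullet looks roughly like this; the batch size, the entity name, and the records input are all placeholders:

    // Insert in batches inside autorelease pools; save and reset the
    // context periodically so it doesn't keep every inserted object
    // (and its row cache) registered and alive.
    NSUInteger batchSize = 500; // placeholder
    NSUInteger count = 0;
    for (NSDictionary *record in records) {
        @autoreleasepool {
            NSManagedObject *object =
                [NSEntityDescription insertNewObjectForEntityForName:@"Item" // placeholder entity
                                              inManagedObjectContext:context];
            [object setValuesForKeysWithDictionary:record];
            if (++count % batchSize == 0) {
                NSError *error = nil;
                [context save:&error];
                [context reset]; // drop strong references to the saved objects
            }
        }
    }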
Though I'd appreciate advice on whether this list of actions can be improved, my main question is how I should deal with the system memory my app will eventually eat up, because it's possible I won't be able to optimize my code any further. I must not allow the app to crash due to insufficient memory, so the way I deal with it now is to relaunch the app if its memory allocation reaches some hardcoded value (let's say, 1 GB).
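The threshold check itself is just reading the task's resident size; a minimal sketch using the Mach task_info call, with the 1 GB limit hardcoded as above:

    // Read the app's resident (physical) memory footprint in bytes.
    #include <mach/mach.h>

    mach_task_basic_info_data_t info;
    mach_msg_type_number_t count = MACH_TASK_BASIC_INFO_COUNT;
    kern_return_t kr = task_info(mach_task_self(), MACH_TASK_BASIC_INFO,
                                 (task_info_t)&info, &count);
    if (kr == KERN_SUCCESS && info.resident_size > 1024ULL * 1024 * 1024) {
        // over the 1 GB threshold; trigger the relaunch
    }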
I don't like this solution, so hopefully someone can advise me on an appropriate way to handle this, or on how experienced people handle such situations. Thanks.
UPDATE
Adding snapshots from Instruments and Xcode's memory debug gauge, taken when the first cycle has finished.
I'm trying to debug a mysterious app crash that happens after running the app for a few hours.
My hunch is that this may be memory related, as I've not done any memory optimization since starting to build my app.
I'm looking at my application with the "allocations" instrument and I see these numbers:
All allocations: 4.70 MB
Living: 51072
Transitory: 357280
Overall bytes: 100.23 MB
Overall: 408000
Which is the important number here? Does my app take 4.7 MB of memory or 100 MB? At which point should I be concerned about my app being killed for memory reasons? I want to avoid premature optimization :)
Since I'm using ARC and a UITabBarController, most of the controller-related memory seems to be out of my hands, unless I find out how to create a tab and lazily init its controller when that tab is first touched (see the sketch below).
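For what it's worth, UIKit already defers part of this: view controllers can be created cheaply up front, and viewDidLoad isn't called until a tab's view is first displayed. A minimal sketch, where HeavyViewController, bigDataSet, and loadDataSet are hypothetical names:

    // Keep -init cheap; UIKit calls -viewDidLoad lazily, the first time
    // the tab's view is actually shown.
    @implementation HeavyViewController
    - (void)viewDidLoad {
        [super viewDidLoad];
        // Expensive allocations happen here, not in -init, so tabs the
        // user never opens stay small.
        self.bigDataSet = [self loadDataSet]; // hypothetical heavy setup
    }
    @end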
Thank you!
I'm seeing strange effects in my app. I implemented my own PDF viewer. It shows ONE page at a time. Using Instruments' Activity Monitor I see that my real memory stays constant at around 50 MB.
After switching pages back and forth a couple of times, I receive a memory warning level 0.
I do my best to react to it and sacrifice the low-res background image that I render first to show something until CATiledLayer catches up.
That does not help. A few pages later I get memory warning levels 1 and 2, and after a few more pages my app gets killed with reason "9". Memory NEVER goes above 50 MB!
Why do I get those warnings in the first place? There IS enough memory available.
This is happening on an iPad running iOS 4.3.
I don't think there's anything mysterious going on here -- which I'm sure is not what you wanted to hear. There are no absolute figures for "safe" amounts of memory to use. The rule is: when the OS tells you you're using too much, use less. It will jettison background processes first, in preference to your foreground app, but there are still limits.
In the "olden days," you were lucky to get 20 MB. I'm sure you can safely get more than that on an iPad but, apparently, it's less than 50 MB.
You don't say how much memory you free by releasing the background image, but it seems that you need to be caching less data. You might also want to check Leaks (also in Instruments) to make sure you're releasing the objects you think you are.
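If you aren't doing so already, the usual reaction is to drop every cache you can rebuild when the warning arrives; a rough sketch, where backgroundImage and tileCache are hypothetical properties:

    // Free rebuildable caches on a memory warning. Under ARC, nil-ing the
    // strong references frees the memory; with manual retain/release you
    // would release them explicitly.
    - (void)didReceiveMemoryWarning {
        [super didReceiveMemoryWarning];
        self.backgroundImage = nil;        // hypothetical low-res preview
        [self.tileCache removeAllObjects]; // hypothetical cache of rendered tiles
    }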
Even the standard blank-window Cocoa app that gets built when you make a new Cocoa project in Xcode uses almost 6 MB of memory. What's the reason for this? Is it possible to make an app use less, or does OS X simply manage memory differently for Cocoa apps?
Not that I'm complaining. I know that performance "hardly matters anymore" (edit: what I mean is, it matters less than readability/maintainability/the programmer's time). I'm just curious.
OS X does a lot of magic with shared memory and copy-on-write pages, so chances are that it doesn't take that much physical RAM for every application.
You can check exactly how memory blocks are mapped by running:
sudo vmmap <PID of the process>
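For example, assuming the process is named MyApp (a hypothetical name):

    sudo vmmap $(pgrep -x MyApp)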
It depends on all the frameworks (APIs) you use, combined with the VM allocations done by low-level operations.
It's only worth trying to reduce the total heap allocation and the resident size of your code: make sure your data structures are allocated efficiently, and try compiling with the ever-so-famous "-Os" optimization flag (size optimization). There isn't much you can do about the VM eaten by Cocoa; I wouldn't really worry about it.
This is clearly a 'WTF' moment for developers in general. The question is usually: why does my trivial application use up so much memory?
The answer comes down to the underlying framework. You could argue that 6 MB is too much, but really, it is nothing.
It's not rare to see computers come with 2 GB of memory these days. The stock iMac has 4 GB. The whole point of the computer industry is to use up all the resources a machine has so that it continues to evolve.
Yes, you should avoid inefficiencies where possible (don't load up a five-million-point array at startup, for instance). But unless your beta demonstrates you fudged up, just keep it on your list of to-dos.
I'm going out on a limb here, but I guess it's because all the libraries that get loaded have to do quite a bit of setting up, and there is no need to garbage collect, so they simply get to waste memory. Plus, even if all memory were autoreleased, the pool wouldn't drain until the first idle event, which comes after the creation of the window. Delete unneeded libraries/frameworks, or force a garbage collection somewhere after loading the window from the nib, and see how much usage goes down if you're so concerned.
I am not concerned about it. Some of the memory might be returned later, and the rest is the price you pay for a powerful framework.
A factor which is not directly linked to Cocoa, but which applies to frameworks in general, is that the overhead is not linear. There is usually a fixed and a variable "price," in terms of overhead, for using the framework.
When you create a simple blank window, the fixed overhead is crushing, but when you create an application with tens of windows, dialogs, controls and all, the initial fixed overhead becomes negligible against the size of the application itself.