Why does this Objective-C exception backtrace show only the last two calls?

Using Xcode (4.3.3 and llvm), I set a breakpoint on all exceptions but often when the breakpoint is hit, the stack looks like this:
(lldb) bt
* thread #1: tid = 0x1c03, 0x31a891c4 libobjc.A.dylib`objc_exception_throw, stop reason = breakpoint 1.1
frame #0: 0x31a891c4 libobjc.A.dylib`objc_exception_throw
frame #1: 0x33a677b8 CoreFoundation`+[NSException raise:format:arguments:] + 100
There's no information about the caller of +[NSException raise:format:arguments:]; the stack just stops there.
Something must have called raise, so what happened to the stack? What should I look for to help me debug issues like this (e.g. does this indicate some specific type of stack corruption, or something else)?
Update: here's the code with the garbage.
The problem was that I accidentally wrote %s when I meant %@. It might have been a bit of a stretch to call that a properly-allocated string, but I could "po imageName" in the lldb console; it just had some garbage at the end.
NSString *imageName = [NSString stringWithFormat:@"award_%s", award];
UIImage *image = [UIImage imageNamed:imageName];
[_awardIconMapping setObject:image forKey:award];
There was no thread #0 by the time the exception was hit. Using @try/@catch, I get a sensible exception from the last line (sorry, I thought it was the image= line before):
2012-07-06 10:43:36.184 CallVille[834:707] ****** Hit exception ********
-[__NSCFDictionary setObject:forKey:]: attempt to insert nil value (key: wiseGuy)
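For reference, a minimal corrected version of the snippet above (assuming award is an NSString, which is what the intended %@ implies), with a guard so a failed image lookup can't put nil into the dictionary:

NSString *imageName = [NSString stringWithFormat:@"award_%@", award];
UIImage *image = [UIImage imageNamed:imageName];
if (image) {
    [_awardIconMapping setObject:image forKey:award];
} else {
    NSLog(@"No image found for award %@", award);
}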

I use this function to print the stack trace from an NSException:
#include <execinfo.h>
#include <stdio.h>
#include <stdlib.h>

+ (void)printStackTrace:(NSException *)e
{
    NSArray *addresses = [e callStackReturnAddresses];
    if ([addresses count])
    {
        // Copy the return addresses into a plain C array for backtrace_symbols()
        void *backtrace_frames[[addresses count]];
        int i = 0;
        for (NSNumber *address in addresses)
        {
            backtrace_frames[i] = (void *)[address unsignedLongValue];
            i++;
        }
        char **frameStrings = backtrace_symbols(&backtrace_frames[0], (int)[addresses count]);
        if (frameStrings != NULL)
        {
            int x;
            for (x = 0; x < [addresses count]; x++)
            {
                NSString *frame_description = [NSString stringWithUTF8String:frameStrings[x]];
                NSLog(@"------- %@", frame_description);
            }
            free(frameStrings);
            frameStrings = NULL;
        }
    }
}
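A possible call site for this helper, wrapping the suspect code in @try/@catch as described above (the DebugHelper class name is illustrative; use whatever class the method actually lives on):

@try {
    [_awardIconMapping setObject:image forKey:award];
}
@catch (NSException *e) {
    NSLog(@"****** Hit exception ********\n%@", e);
    [DebugHelper printStackTrace:e];
    @throw; // re-raise after logging so the failure is still visible
}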

Related

Xcode - alert that shows crash reason

Is there a way to build an alert box that shows what the crash reason is? Specifically, what line of code is causing the crash?
My boss is asking for this, and I haven't found a way to make it possible. I'm going through the analytics tab on the device and finding the crash, but he wants something that pops up on the device (it's an internal app) giving the reason for the crash.
Is this possible?
You can't show an alert at the moment the app crashes, but you can read the crash log when you open the app again after a crash.
You could create a method with the following logic; it basically reads the backtrace for the last crash the next time the app is opened.
#include <asl.h>

aslmsg q, m;
int i;
const char *key, *val;
float how_old = fTime;

q = asl_new(ASL_TYPE_QUERY);
asl_set_query(q, ASL_KEY_LEVEL, [strLoggerLevel UTF8String], ASL_QUERY_OP_LESS_EQUAL);
asl_set_query(q, ASL_KEY_FACILITY, [@"YourBundleIdOfAPP" UTF8String], ASL_QUERY_OP_EQUAL);
asl_set_query(q, ASL_KEY_TIME, [[NSString stringWithFormat:@"%.f", [[NSDate date] timeIntervalSince1970] - how_old] UTF8String], ASL_QUERY_OP_GREATER_EQUAL);

int goInside = 0;
aslresponse r = asl_search(NULL, q);
while (NULL != (m = aslresponse_next(r)))
{
    NSString *cValueToWrite;
    NSMutableDictionary *tmpDict = [NSMutableDictionary dictionary];
    for (i = 0; (NULL != (key = asl_key(m, i))); i++)
    {
        // Get only the required fields
        if (i == 12 || i == 10 || i == 11 || i == 8 || i == 9 || i == 3)
        {
            NSString *keyString = [NSString stringWithUTF8String:(char *)key];
            val = asl_get(m, key);
            NSString *string = [NSString stringWithUTF8String:val];
            [tmpDict setObject:string forKey:keyString];
        }
    }
    cValueToWrite = [[NSString alloc] initWithFormat:@"\n--------------[Debug]----------------\nDateTime: %@\nApplication: %@\nInfo: %@", [tmpDict valueForKey:@"CFLog Local Time"], [tmpDict valueForKey:@"Sender"], [tmpDict valueForKey:@"Message"]];
}
aslresponse_free(r);
asl_free(q);
strLoggerLevel is an NSString holding the ASL logger level you want; levels range up to 7 (ASL_LEVEL_EMERG is 0 and ASL_LEVEL_DEBUG is 7).
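A sketch of a possible call site, assuming the snippet above is wrapped in a hypothetical method named -lastRunLogSince: that takes the look-back interval (fTime above) and returns the collected cValueToWrite string:

- (BOOL)application:(UIApplication *)application
    didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
    // Look back 24 hours for console entries written by this app's bundle ID.
    NSString *lastLog = [self lastRunLogSince:60 * 60 * 24];
    if (lastLog.length > 0) {
        NSLog(@"Log from the previous run:\n%@", lastLog);
        // This is the point where you could present the reason to the tester instead.
    }
    return YES;
}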

How to create/end run loop to properly deallocate memory?

In my ARC iOS app I am running a for loop that ends up with a large memory allocation overhead. I want to end the loop with minimal or no extra memory still allocated. In this instance I am using the SSKeychain library, which lets me fetch things from the keychain. I usually just use autorelease pools and get my memory released properly, but here I don't know what is wrong, because I end up with 70 MB+ of memory allocated at the end of the loop. I have been told that I should start/end a run loop to deal with this properly. Thoughts?
for (int i = 0; i < 10000; ++i) {
    @autoreleasepool {
        NSError *error2 = nil;
        SSKeychainQuery *query2 = [[SSKeychainQuery alloc] init];
        query2.service = @"Eko";
        query2.account = @"loginPINForAccountID-2";
        query2.password = nil;
        [query2 fetch:&error2];
    }
}
What are you using to measure memory usage?
Results of a very simple test...
Running in the simulator, measuring only resident memory before and after.
Without autoreleasepool...
Started with 27254784, ended with 30212096, used 2957312
With autoreleasepool...
Started with 27316224, ended with 27443200, used 126976
Obviously, the autoreleasepool is preventing memory from growing out of control, and I don't see anything close to 70 MB being used in either case.
You should run Instruments and get some good readings on the behavior.
Here is the code I hacked and ran...
The memchecker
#include <mach/mach.h>

static NSUInteger available_memory(void) {
    NSUInteger result = 0;
    struct task_basic_info info;
    mach_msg_type_number_t size = sizeof(info);
    // Ask Mach for this task's basic info and report its resident size
    if (task_info(mach_task_self(), TASK_BASIC_INFO, (task_info_t)&info, &size) == KERN_SUCCESS) {
        result = info.resident_size;
    }
    return result;
}
And the code...
#define USE_AUTORELEASE_POOL 1

- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
    dispatch_async(dispatch_get_main_queue(), ^{
        NSUInteger beginMemory = available_memory();
        for (int i = 0; i < 10000; ++i) {
#ifdef USE_AUTORELEASE_POOL
            @autoreleasepool
#endif
            {
                NSError *error2 = nil;
                SSKeychainQuery *query2 = [[SSKeychainQuery alloc] init];
                query2.service = @"Eko";
                query2.account = @"loginPINForAccountID-2";
                query2.password = nil;
                [query2 fetch:&error2];
            }
        }
        NSUInteger endMemory = available_memory();
        NSLog(@"Started with %lu, ended with %lu, used %lu",
              (unsigned long)beginMemory, (unsigned long)endMemory, (unsigned long)(endMemory - beginMemory));
    });
    return YES;
}

Pointers to primitive objects not properly changing value

I am having a bit of an issue passing a reference to a primitive type through a chain of calls and having the value behind the pointer change correctly. The weird part is that if I call getBytes directly from the calling function, byteLocation is properly adjusted, but if I chain it through a convenience function, it seems to get a junk value. Even weirder, when stepping through the debugger it gets the correct value at first, but the return statement appears to execute twice: the first time location holds the correct value, the second time byteLocation is loaded with a junk value. Any ideas?
EDIT (Actual Code):
@property (strong, nonatomic, nonnull) NSData *data;
@property (assign, nonatomic) CFByteOrder byteOrder;

- (void)convertBytesToHostOrder:(nonnull void *)buffer length:(NSUInteger)length {
    if (length > 1 && self.byteOrder != CFByteOrderGetCurrent()) {
        // Swap bytes if the packet endianness differs from the host
        char *fromBytes = buffer;
        for (NSUInteger i = 0; i < length / 2; i++) {
            NSUInteger indexes[2] = {i, length - i - 0};
            char byte = fromBytes[indexes[0]];
            fromBytes[indexes[0]] = fromBytes[indexes[1]];
            fromBytes[indexes[1]] = byte;
        }
    }
}

- (nonnull void *)getBytes:(nonnull void *)buffer startingFrom:(nonnull NSUInteger *)location length:(NSUInteger)length {
    NSRange range = NSMakeRange(*location, length);
    [self.data getBytes:buffer range:range]; // self.data is an instance of NSData
    [self convertBytesToHostOrder:buffer length:length];
    NSUInteger update = range.location + range.length;
    *location = update;
    return buffer;
}

- (NSTimeInterval)readTimeIntervalStartingFrom:(nonnull NSUInteger *)byteLocation {
    uint32_t seconds;
    uint16_t milliseconds;
    // This line of code screws up the byteLocation pointer for some reason
    [self getBytes:&seconds startingFrom:byteLocation length:sizeof(seconds)];
    [self getBytes:&milliseconds startingFrom:byteLocation length:sizeof(milliseconds)];
    NSTimeInterval ti = seconds + milliseconds / ((double) 1000 * (1 << 6));
    return ti;
}

- (void)readData {
    NSUInteger byteLocation = 0;
    self.sequenceNumber = *(uint8_t *) [self getBytes:&_sequenceNumber startingFrom:&byteLocation length:sizeof(_sequenceNumber)];
    self.flags = *(uint8_t *) [self getBytes:&_flags startingFrom:&byteLocation length:sizeof(_flags)];
    // Continue to process packet data if we didn't get a goodbye message
    if (!(self.flags & LBRadarPongFlagGoodbye)) {
        // Parse accelerations
        int16_t int16;
        self.accelerationX = (*(int16_t *) [self getBytes:&int16 startingFrom:&byteLocation length:sizeof(int16)]) / kGToRaw;
        self.accelerationY = (*(int16_t *) [self getBytes:&int16 startingFrom:&byteLocation length:sizeof(int16)]) / kGToRaw;
        self.accelerationZ = (*(int16_t *) [self getBytes:&int16 startingFrom:&byteLocation length:sizeof(int16)]) / kGToRaw;
        // Parse peripheral states
        self.batteryVoltage = [self readFloat16From:&byteLocation];
        self.chargeCurrent = [self readFloat16From:&byteLocation];
        self.systemCurrent = [self readFloat16From:&byteLocation];
        // All previous lines of code work properly and as expected.
        // Buffers are read properly, and byteLocation properly reflects 14, which is the number of bytes read up to this point.
        self.pongReceivedTimeIntervalSince1970 = [self readTimeIntervalStartingFrom:&byteLocation];
    }
}
The problem seems to be with incrementing the location. In both cases you should copy from position zero, up to the size of the variable. In the following code:
[self readBytes:&seconds location:byteLocation length:sizeof(seconds)]
[self readBytes:&milliseconds location:byteLocation length:sizeof(milliseconds)]
The first call starts at position zero and reads 32 bits. The second one starts at position 32, which doesn't even fit in the variable's 16 bits. This is overflowing the buffer. Try this instead:
- (void *)readBytes:(void *)buffer location:(NSUInteger *)location length:(NSUInteger)length {
    // The difference is in the next line: zero instead of *location
    [self.data getBytes:buffer range:NSMakeRange(0, length)];
    *location = *location + length;
    return buffer; // Seems to be called twice; the first time *location has the correct byteLocation, the second time *location has a junk value
}
At a guess[*] your error is on the line:
[self.data getBytes:&buffer range:NSMakeRange(*location, length)];
You are passing a void * value by taking the address of buffer - which is already a void *. Changing this to:
[self.data getBytes:buffer range:NSMakeRange(*location, length)];
will at least produce non-garbage results.
[*] I can only guess as the code you posted did not even compile, I edited your question to correct some of the more obvious errors - but even that involved some guessing! You should post real code.
I should have just posted the actual code, sorry guys. It turns out the error was in a helper function (convertBytesToHostOrder), which was writing out of bounds of buffer. Since buffer was the variable right before byteLocation, writing one spot beyond the end of buffer ended up landing on byteLocation. Fixed now and everything is working.
- (void)convertBytesToHostOrder:(nonnull void *)buffer length:(NSUInteger)length {
    if (length > 1 && self.byteOrder != CFByteOrderGetCurrent()) {
        // Swap bytes if the packet endianness differs from the host
        char *fromBytes = buffer;
        for (NSUInteger i = 0; i < length / 2; i++) {
            NSUInteger indexes[2] = {i, length - i - 0};
            char byte = fromBytes[indexes[0]];
            fromBytes[indexes[0]] = fromBytes[indexes[1]];
            fromBytes[indexes[1]] = byte;
        }
    }
}
Should be:
NSUInteger indexes[2] = {i, length-i-1};
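For completeness, here is the helper with that one-character fix applied:

- (void)convertBytesToHostOrder:(nonnull void *)buffer length:(NSUInteger)length {
    if (length > 1 && self.byteOrder != CFByteOrderGetCurrent()) {
        // Swap bytes if the packet endianness differs from the host
        char *fromBytes = buffer;
        for (NSUInteger i = 0; i < length / 2; i++) {
            // length - i - 1 stays inside the buffer; length - i - 0 wrote one byte past the end
            NSUInteger indexes[2] = {i, length - i - 1};
            char byte = fromBytes[indexes[0]];
            fromBytes[indexes[0]] = fromBytes[indexes[1]];
            fromBytes[indexes[1]] = byte;
        }
    }
}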

EXC_BAD_INSTRUCTION when using NSRunAlertPanel

To show errors on some conditions, I am using NSRunAlertPanel on Mac OS X (in code ported from Windows, where I was using MessageBox).
Before any window is actually created, some code runs and calls this function to show a conditional error.
In the thread com.apple.libdispatch-manager, under the following call stack:
0 _dispatch_mgr_invoke
1 _dispatch_mgr_thread
it gives EXC_BAD_INSTRUCTION.
Is it because no window has been created before NSRunAlertPanel is called?
What is the reason for this runtime error? What is the exact equivalent of MessageBox on Mac OS X?
Long ShowDebugMessageBox (const wchar_t * Message, const wchar_t * Title)
{
    NSString * message;  ///< Message.
    NSString * title;    ///< Title.
    NSInteger response;  ///< Response.

    message = WideToNSString (Message);
    title = WideToNSString (Title);

    //response = NSRunAlertPanel(title, message, @"Yes", @"No", @"Cancel");
    response = NSRunCriticalAlertPanel (title, message, @"Okay", @"Cancel", nil);

    switch(response) {
    case NSAlertDefaultReturn:
        return IDYES;
    case NSAlertAlternateReturn:
        return IDNO;
    default:
        return IDCANCEL;
    }
}
NSString * WideToNSString (const wchar_t * Str)
{
    if(!Str) {
        return nil;
    }
    NSString * str;  ///< String in NSString.
#if CP_SIZEOFWCHAR == 4
    str = [[NSString alloc] initWithBytes: (CVPtr) Str
                                   length: sizeof(wchar_t) * wcslen(Str)
                                 encoding: NSUTF32LittleEndianStringEncoding];
                                 //encoding: NSUTF32StringEncoding];
#else
    str = [[NSString alloc] initWithBytes: (CVPtr) Str
                                   length: sizeof(wchar_t) * wcslen(Str)
                                 encoding: NSUTF16LittleEndianStringEncoding];
                                 //encoding: NSUTF16StringEncoding];
#endif
    return str;
}
class File {
public:
    int Open(char * fname, int mode)
    {
        fd = open(fname, mode);
        return fd;
    }
    int Close()
    {
        int result = close(fd);
        //fd = 0; //CAUSE of the PROBLEM
        return result;
    }
    ~File ()
    {
        //ALERT Display message box about the error.
        ALERT(fd != 0);
    }
private:
    int fd;
};
This is the code to show the message box.
The code to get an NSString from a wchar_t * (wide) string is fine and has been tested; it is used in many places and runs fine.
The same code runs fine in another application (which creates a window first).
The problem occurs when the destructor of File is called. Since fd is not 0, it shows the message box and causes the problem.
When fd is set to 0 in Close, no alert box is displayed from the destructor. Other alerts are still shown, and no problem occurs.
Is it due to fd?
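Based on the poster's own observation that re-enabling fd = 0 avoids the spurious alert, a minimal corrected sketch of the class might look like this (the ALERT macro and member names are taken from the code above; the rest is illustrative):

#include <fcntl.h>   // open
#include <unistd.h>  // close

class File {
public:
    File() : fd(0) {}                // start in a known "not open" state
    int Open(char * fname, int mode)
    {
        fd = open(fname, mode);
        return fd;
    }
    int Close()
    {
        int result = close(fd);
        fd = 0;                      // mark the descriptor as released
        return result;
    }
    ~File()
    {
        // Only complain if the file was never closed.
        ALERT(fd != 0);
    }
private:
    int fd;
};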
You haven't provided enough information to say what is causing the exception (please show the code).
I use NSRunCriticalAlertPanel() to display fatal errors in my app, which I am able to call pretty much any time I like:
void criticalAlertPanel(NSString *title, NSString *fmt, ...)
{
    va_list va;
    va_start(va, fmt);
    NSString *message = [[NSString alloc] initWithFormat:fmt arguments:va];
    va_end(va);
    NSRunCriticalAlertPanel(title, message, @"OK", nil, nil);
}
(this code is ARC enabled).
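A possible call site for that helper (the configPath variable and message text are illustrative):

if (fd <= 0) {
    criticalAlertPanel(@"Fatal error",
                       @"Could not open configuration file %@", configPath);
}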

NSLog not working

Strangest thing I've seen yet: NSLog is failing within a method, depending on something bizarre. Here is the code:
-(void) testBoard {
    BoardManager* bm = [BoardManager manager];
    Board* board = [bm boardForName:@"fourteen by eight"];
    NSLog(@"%@", board);
    NSLog(@"%@", board.coords);
}
In Board.m:
-(NSArray*) coords {
    if(!_coords.count && _definition) {
        NSArray* c = [Board decode:_definition];
        [self setCoords:c];
    }
    return _coords;
}

+(NSArray*) decode:(NSString*)encodedCoords {
    NSMutableArray* coords = [NSMutableArray array];
    NSArray* tokens = [encodedCoords componentsSeparatedByString:@","];
    int i = 0;
    NSString* dimStr = [tokens objectAtIndex:i++];
    int width = [dimStr substringToIndex:2].intValue;
    int height = [dimStr substringWithRange:NSMakeRange(2, 2)].intValue;
    int depth = [dimStr substringFromIndex:4].intValue;
    NSLog(@"w=%d h=%d d=%d", width, height, depth);
    NSString* b128;
    NSString* b2;
    for(int z=0; z<depth; z++) {
        for(int y=0; y<height; y++) {
            b128 = [tokens objectAtIndex:i++];
            NSLog(@"[%@]", b128);
            b2 = [Board base128to2:b128];
            NSLog(@"b2=%@", b2);
            for(int x=0; x<b2.length; x++) {
                if([b2 characterAtIndex:b2.length-1-x] == '1') {
                    Coord* coord = [Coord x:width-1-x y:height-1-y z:z];
                    [coords addObject:coord];
                }
            }
        }
    }
    return coords;
}
Now what happens is that none of the NSLog statements within decode:, or in methods called from decode:, will log to the console. However, NSLog calls before and after calling decode: work, and the rest of the code around the NSLogs executes fine. I found that I could get all the NSLog statements to work simply by commenting out [coords addObject:coord];, even though that statement appears after the NSLog statements it is affecting. I also found I could get the NSLogs to work by rearranging a few lines in testBoard like this:
-(void) testBoard
{
    BoardManager* bm = [BoardManager manager];
    Board* board = [bm boardForName:@"fourteen by eight"];
    NSLog(@"%@", board);
    NSString* def = board.definition;
    NSArray* coords = [Board decode:def];
    NSLog(@"%@", coords);
}
This seems quite bizarre... an Xcode Loch Ness monster surfacing?!
Have you tried running in the debugger? What happens when you get to this if statement:
if(!_coords.count && _definition)
Are you sure _coords.count is 0 and _definition is not nil?
In my experience, NSLog macro expansions can fail. Most of the time it's obvious why, sometimes not.
Just do something like this:
id objToLog = /* whatever */;
NSLog(@"%@", objToLog);
If this approach works, you can go ahead with your development and maybe you'll figure the problem out later.
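Applied to the code in the question, that workaround would look like this (essentially what the rearranged testBoard above already does):

NSArray *coordsToLog = board.coords;
NSLog(@"%@", coordsToLog);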