Accessing NSData bytes - Not the same value on debugger - objective-c

I'm facing a problem that I don't understand. Even though I worked on a Swift project this year that used some Objective-C, I am new to this language and its concepts.
So, here is my problem: I want to access the bytes of an NSData object. I know there are several ways to do so:
[data bytes];
data.bytes;
[data getBytes:dest length:[data length]];
But none of these methods gives me the same value that the console shows when I run po [data bytes].
Can you explain why this happens? I don't really understand what I'm missing.
Thanks.

data and data.bytes are of two totally different types. data is an instance of NSData, while data.bytes is a raw pointer (const void *). When you call po in the debugger (short for "print object"), it will call -description on things which inherit from NSObject, or just print the value if they do not.
In this case, since data is an NSData (which has -description), if you po data, it calls [data description] and prints the result of that out; since NSData knows how to nicely format its contents, it will print nicely.
However, since data.bytes is a void *, there is no way for the debugger to know how to print it (void * can point to anything; how to interpret it is totally up to you) so it just prints out the pointer itself.
If you want to print the data from the debugger directly, you can tell it how to interpret the pointer and print it out. If you know that the data blob is n bytes long, you can run the following command:
p/x *(uint8_t (*)[<n>])data.bytes
where <n> is replaced with the literal length of the data (e.g. uint8_t (*)[8]). *(uint8_t (*)[<n>])data.bytes tells the debugger to reinterpret data.bytes as an array of n bytes (giving it the length so it knows how much data to read from memory), while p/x tells it to print the hex values of the bytes it finds.
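The same distinction can be demonstrated outside the debugger. Here is a minimal sketch (the sample bytes are illustrative) contrasting printing the object, which goes through -description, with printing the raw pointer:
uint8_t raw[] = {0xCD, 0x00, 0x00, 0x00};
NSData *data = [NSData dataWithBytes:raw length:sizeof(raw)];
NSLog(@"%@", data);       // goes through -description and prints the contents: <cd000000>
NSLog(@"%p", data.bytes); // prints only the pointer value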

Related

Difference between sizeof and length of NSData

I'm confused. What is the difference between sizeof and the length of NSData? Is length a count of characters? And what does sizeof mean? Can anybody explain this more exactly, please?
sizeof() is a language keyword that returns the storage size of a type and is evaluated at compile time.
For example:
NSData *obj = [NSData data];
NSLog(#"%lu", sizeof(obj));
would print either 4 on a 32-bit platform or 8 on a 64-bit platform as obj is a pointer and that's how much space a pointer takes on those platforms.
It's the same as:
NSLog(#"%lu", 4);
or
NSLog(#"%lu", 8);
depending on the platform being compiled on.
However NSData is an object that stores data and it provides the length method so you can interrogate how much data it is currently storing. It is evaluated at runtime.
NSLog(#"%lu", obj.length);
prints 0 as that NSData object is empty.
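A short sketch of the contrast, assuming a 64-bit platform (the byte values are illustrative):
const char bytes[] = {1, 2, 3, 4, 5};
NSData *obj = [NSData dataWithBytes:bytes length:5];
NSLog(@"%lu", sizeof(obj));               // 8: the size of the pointer, fixed at compile time
NSLog(@"%lu", (unsigned long)obj.length); // 5: the number of bytes stored, determined at runtime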
I'm no expert in iOS, but I tried to look into it a bit. It seems that length is the "number of bytes contained in the receiver", while sizeof represents the "actual number of bytes that the whole NSData structure occupies in memory".
In other languages it can behave differently - e.g. C#'s string: length will be 20 for 20 characters, but since C# uses Unicode, sizeof() will return 40. However, other objects might behave in a totally different manner...
I think that in your example sizeof() might return one of two possible results - the number of bytes occupied by NSData's internal structures (like pointer(s) to the real data, counters etc.) either WITH or WITHOUT the size of the contained data.
I suppose the best way would be to store some data and compare the outputs of the two methods :) If NSData is simply a pointer, the result of sizeof will be just 4 or 8 - the size of the pointer :)

Objective-C/Cocoa equivalent of fread

I'm trying to read in the first four bytes of a file. I know that this works correctly with the following C code:
FILE *file = fopen(URL.path.UTF8String, "rb");
uint data;
fread(&data, 4, 1, file);
NSLog(#"%u", data);
This prints out: 205
I'm trying to find the equivalent way of doing this in Objective-C/with Cocoa functions. I've tried a number of things. I feel like the following is close:
NSFileHandle *fileHandle = [NSFileHandle fileHandleForReadingFromURL:URL error:nil];
NSData *data2 = [fileHandle readDataOfLength:4];
NSLog(#"%#", data2);
NSLog(#"%u", (uint)data2.bytes);
This prints out: < cd000000 >
and: 1703552
As expected, the first four bytes of the file are indeed CD000000.
I'm assuming there's one of two things causing the difference (or both):
fread is not counting the 0s following the CD. I've confirmed this by reading in only 1 byte with the fileHandle, but sometimes this number will extend beyond one byte, so I can't restrict it like this. Do I need to manually check that the bytes coming in aren't 00?
This has something to do with endianness. I have tried a number of functions such as CFSwapInt32BigToHost but have not been able to get back the right value. It would be great if anyone could enlighten me as to how endianness works and affects this.
You are not dereferencing the data.
NSLog(#"%u", (uint)data2.bytes); // wrong
The "quick hack" version is like this:
NSLog(#"%u", *(uint *) data2.bytes); // hack
A more robust solution requires copying to a variable somewhere to get the alignment right; misaligned access matters on some platforms but not on others:
uint value;
[data2 getBytes:&value length:sizeof(value)];
NSLog(@"%u", value);
Another solution is to explicitly read the data byte-by-byte, which is most portable, has no alignment issues on any platform, and has no byte-order issues on any platform:
const unsigned char *p = data2.bytes;
uint value = (unsigned) p[0] | ((unsigned) p[1] << 8) |
             ((unsigned) p[2] << 16) | ((unsigned) p[3] << 24);
NSLog(@"%u", value);
As you can see, there are good reasons why we avoid putting binary data in files ourselves, and leave it to libraries or use text formats.
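Putting this together with the original NSFileHandle code, a hedged sketch (reusing the question's URL variable; CFSwapInt32LittleToHost comes from CoreFoundation, which Foundation pulls in, and is a no-op on little-endian hosts) might look like this:
NSFileHandle *fileHandle = [NSFileHandle fileHandleForReadingFromURL:URL error:nil];
NSData *data2 = [fileHandle readDataOfLength:4];
uint32_t value = 0;
if (data2.length >= sizeof(value)) {
    // copy out to a properly aligned variable, then fix the byte order
    [data2 getBytes:&value length:sizeof(value)];
    value = CFSwapInt32LittleToHost(value);
}
NSLog(@"%u", value); // prints 205 for a file starting with CD 00 00 00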
This can't be an issue with byte order, because fread() is working correctly. The fread() function and the -readDataOfLength: method will both give you the same result: a chunk of bytes.
You attempt to reinterpret a sequence of 4 bytes as an unsigned int. This is not guaranteed to work on all platforms. It will only work if sizeof(unsigned int) equals 4, and it will only work if the byte order is the same for reading and writing.
Furthermore, you are not printing the scalars correctly with NSLog.
fread() in binary mode won't do anything to your data, you'll get the bytes as they are in the file.
It's absolutely byte ordering that is causing this, but I don't know anything about Apple's Objective-C APIs. I don't understand why you don't need pointer access to the data2 object (why isn't data2.bytes failing, with data2->bytes needed instead?).
Also, the documentation for NSData doesn't say anything about byte order that I could find.

How to get byte size from objective-c type encoding

Using NSMethodSignature I can get a method's argument types via getArgumentTypeAtIndex:, which returns a C string based on this documentation. So "i" for int and "I" for unsigned.
Is there a function somewhere that takes in this encoding and returns the type's size in bytes?
Something like this:
int paramSize = typeEncodingSize("i");
NSLog(#"%s is %d bytes", "i", paramSize);
//this would be the encoding for a struct that has three fields. An id, a pointer and an int.
paramSize = typeEncodingSize("{example=@*i}"); //two 8 byte pointers & one 4 byte int
NSLog(@"%s is %d bytes", "{example=@*i}", paramSize);
which would output:
i is 4 bytes
{example=@*i} is 20 bytes
I figure there must be an API function for this somewhere, since the docs for [NSInvocation setArgument:atIndex:] say
The number of bytes copied is determined by the argument size.
I understand that this is old, but I've hit the same wall.
The solution seems to be the function NSGetSizeAndAlignment().
https://developer.apple.com/library/mac/documentation/Cocoa/Reference/Foundation/Miscellaneous/Foundation_Functions/#//apple_ref/c/func/NSGetSizeAndAlignment
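A brief sketch of how it might be used (the expected outputs in the comments assume a 64-bit platform):
NSUInteger size = 0, align = 0;
NSGetSizeAndAlignment("i", &size, &align);
NSLog(@"%s is %lu bytes", "i", (unsigned long)size); // i is 4 bytes

// NSGetSizeAndAlignment returns a pointer past the parsed type,
// so compound encodings can be walked element by element.
NSGetSizeAndAlignment(@encode(double), &size, &align);
NSLog(@"%s is %lu bytes", @encode(double), (unsigned long)size); // d is 8 bytes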
Have you tried sizeof()? That's the usual way to determine the size of a struct or other type.
Caleb's sizeof() is correct, because basically you can pass anything to @encode only if it's also accepted by sizeof(). There's nothing like @decode. You can download the class-dump source code and look into the CDTypeParser.m file for an example of how to parse it.

Objective-C memory management problem

I'm getting an EXC_BAD_ACCESS error, and it's because of this part of the code. Basically, I take an input and do some work on it. After multiple inputs, it throws the error. Am I doing something wrong with my memory here? I'd post the rest of the code, but it's rather long -- and I think this is where my problem lies (it's where Xcode points me, at least).
-(IBAction) findShows: (id) clicked
{
char urlChars[1000];
[self getEventURL: urlChars];
NSString * theUrl = [[NSString alloc] initWithFormat:@"%s", urlChars];
NSData *data = [NSData dataWithContentsOfURL:[NSURL URLWithString:theUrl]];
int theLength = [data length];
NSString *content = [NSString stringWithUTF8String:[data bytes]];
char eventData[[data length]];
strcpy(eventData, [content UTF8String]);
[self parseEventData: eventData dataLength: theLength];
[whatIsShowing setStringValue:@"Showing events by this artist"];
}
When a crash occurs, there will be a backtrace.
Post it.
Either your program will break in the debugger, and the call stack will be in the debugger UI (or you can type 'bt' at the debugger prompt to print it).
With that, the cause of the crash is often quite obvious. Without it, we are left to critique the code.
So, here goes....
char urlChars[1000];
[self getEventURL: urlChars];
This is, at best, a security hole and, at worst, the source of your crash. Any time you are going to copy bytes into a buffer, there should be some way to (a) limit the number of bytes copied in (pass the length of the buffer) and (b) learn the number of bytes actually copied (with 0 indicating failure or nothing copied).
Given the above, what happens if getEventURL: copies 1042 bytes into urlChars? Boom.
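A hypothetical safer interface (the method name and parameters here are illustrative, not from the original code) might look like:
// The caller passes the buffer capacity; the method returns how many
// bytes it actually wrote (0 on failure).
- (NSUInteger) getEventURL: (char *) buffer maxLength: (NSUInteger) maxLength;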
NSString * theUrl = [[NSString alloc] initWithFormat:@"%s", urlChars];
This is making some assumptions about urlChars that will lead to failure. First, it assumes that urlChars is of a proper %s compatible encoding. Secondly, it assumes that urlChars is NULL terminated (and didn't overflow the buffer).
Best to use one of the various NSString methods that create strings directly from the buffer of bytes using a particular encoding. More precise and more efficient.
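For example, a sketch using an explicit-length, explicit-encoding initializer (actualLength is a hypothetical byte count that a safer getEventURL: would return):
NSString *theUrl =
    [[NSString alloc] initWithBytes: urlChars
                             length: actualLength // hypothetical count from getEventURL:
                           encoding: NSUTF8StringEncoding];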
NSData *data = [NSData dataWithContentsOfURL:[NSURL URLWithString:theUrl]];
I hope this isn't on the main thread... 'cause it'll block if it is and that'll make your app unresponsive on slow/flaky networks.
int theLength = [data length];
NSString *content = [NSString stringWithUTF8String:[data bytes]];
char eventData[[data length]];
strcpy(eventData, [content UTF8String]);
This is about the least efficient possible way of doing this. There is no need to create an NSString instance just to then turn it into a (char *). Just grab the bytes from the data directly.
Also -- are you sure that the data returned is NULL terminated? If not, that strcpy() is gonna blow right past the end of your eventData buffer, corrupting the stack.
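A hedged sketch of that approach, copying out of the NSData directly and NUL-terminating by hand so no intermediate NSString is needed and strcpy() cannot overrun the buffer:
NSUInteger theLength = [data length];
char eventData[theLength + 1];              // +1 for the terminating NUL
memcpy(eventData, [data bytes], theLength); // copy straight from the NSData
eventData[theLength] = '\0';
[self parseEventData:eventData dataLength:(int)theLength];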
[self parseEventData: eventData dataLength: theLength];
[whatIsShowing setStringValue:@"Showing events by this artist"];
What kind of data are you parsing that you really want to parse the raw bytes? In almost all cases, such data should be of some kind of structured type; XML or, even, HTML. If so, there is no need to drop down to parsing the raw bytes. (Not that raw data is unheard of -- just odd).
The bytes you get from [content UTF8String] could conceivably be different in number from the value of [data length]. Try using strncpy() instead and see if that still crashes. (It's also possible that getEventURL: sometimes fails to return a string in the format expected, but that's impossible to tell without the source to that method.)
Is it possible that the string contained in urlChars sometimes comes back non-NULL-terminated? You might want to try zeroing out the array, for example using bzero.
Additionally, there are a bunch of techniques for debugging EXC_BAD_ACCESS. Since you're doing a lot of pure C string manipulation, the usual method of turning on NSZombieEnabled may or may not help you (though I recommend turning it on regardless). Another technique you can try is recovering a previous stack frame using GDB. See my previous answer to a similar question if you're interested.
In my opinion the code is too complex. Do not resort to plain C arrays and strings unless you absolutely have to, they are harder to get right. (It’s no rocket science, but if you play with guns all the time, you will shoot yourself in the foot sooner or later.) Even if you insist on parsing plain C strings, isolate the code using the function interface:
// Callers have to mess with char*.
- (void) parseEventData: (char*) data {…}
// Callers can stay in the Objective-C land.
- (void) parseEventData: (NSString* or NSData*) data {
char *unwrappedData = …;
…
}
I’d certainly think twice before using strcpy in my code. And I think you are leaking theUrl (although that should not cause EXC_BAD_ACCESS in this case). As for the bug itself: could you be hanging on to parts of urlChars or eventData, so that when those stack-based variables disappear you get the segfault?

Objective-C Packing Data using NSMutableData?

I need to put a short and an integer at the beginning of a message that I am sending to a Java server. The server is expecting to read a short (message id) then an integer (message length). I've read on Stack Overflow that NSMutableData is similar to Java's ByteBuffer.
I am trying to pack the message into NSMutableData then send it.
So this is what I have, but it is not working!
NSMutableData *data = [NSMutableData dataWithLength:(sizeof(short) + sizeof(int))];
short msg_id = 2;
int length = 198;
[data appendBytes:&msg_id length:sizeof(short)];
[data appendBytes:&length length:sizeof(int)];
send(sock, data, 6, 0);
The server is using a Java ByteBuffer to read the received data. The bytes coming in are:
32,120,31,0,2,0
which is invalid.
The correct values, so that the ByteBuffer can read them with .getShort() and .getInt(), would be:
0,2,0,0,0,-66
You're basically putting stuff into the NSData object correctly, but you're not using it with the send function correctly. First off, as dreamlax suggests, use NSMutableData's dataWithCapacity: initializer to reserve capacity rather than prefilling with zeroed bytes.
Your data pointer is a pointer to an Objective-C (NSData) object, not the actual raw byte buffer. The send function is a classic UNIX-y C function, and doesn't know anything about Objective-C objects. It expects a pointer to the actual bytes:
send(sock, [data bytes], [data length], 0);
Also, FWIW, note that endianness matters here if you're expecting to recover the multibyte fields on the server. Consider using htonl() and htons() on the int and short values before putting them in the NSData buffer, assuming the server expects "network" byte order for its packet format (though maybe you control that).
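A hedged sketch of the whole send path under those assumptions (fixed-width types, network byte order, and the bytes/length accessors; sock is the question's socket):
#include <arpa/inet.h> // htons, htonl

uint16_t msg_id = htons(2);
uint32_t msg_len = htonl(198);
NSMutableData *data = [NSMutableData dataWithCapacity:sizeof(msg_id) + sizeof(msg_len)];
[data appendBytes:&msg_id length:sizeof(msg_id)];
[data appendBytes:&msg_len length:sizeof(msg_len)];
send(sock, [data bytes], [data length], 0); // pass the raw bytes, not the object pointer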
I think your use of dataWithLength: gives you an NSMutableData object with 6 bytes all initialised to 0, and then you append 6 more bytes with actual values (so you end up with 12 bytes all up). I'm assuming here that short is 2 bytes and int is 4. I believe you want to use dataWithCapacity:, which only hints how much memory to reserve for the data you are packing.
As quixoto has pointed out, you need to use the bytes method, which returns a pointer to the first byte of the actual data. The length method will return the number of bytes you have.
Another thing you need to watch out for is endianness. The position of the most significant byte is dependent on the underlying architecture.