What is f-ing my data? - objective-c

I'm getting an NSData * and trying to read it byte by byte, but the output is full of fs.
NSData *Data = getData();
cout << "The log way:" << endl;
NSLog(@"%@", Data);
cout << "The data way:" << endl;
char *data = (char *)[Data bytes];
for (int i = 0; i < [Data length]; i++)
{
    cout.width(2);
    cout.fill('0');
    cout << hex << (int)(data[i]) << " ";
}
cout << endl;
What I'm getting as output:
The log way:
/* something long about time and file */<1f9cb0f8>
The data way:
1f ffffff9c ffffffb0 fffffff8
How can I get this data as ints without all those fs?

The char type has implementation-defined signedness, and it seems your compiler (gcc or clang?) decided it should be signed. Therefore, when you cast a char to a larger type, sign extension is used, which fills the extra bits with the same value the most significant bit has. For bytes with a value larger than 0x7F, the most significant bit is set, so the widened value comes out negative and its upper bits are all 1s.
What you want is zero extension, which zeroes the extra bits. You can get it by using the unsigned char type.
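To see the difference concretely, here is a minimal standalone C snippet (the 0x9c byte is taken from the output above):
#include <stdio.h>

int main(void)
{
    char c = (char)0x9c;       // signed on this compiler, so the value is negative
    unsigned char u = 0x9c;    // always 156
    printf("%x\n", (int)c);    // ffffff9c  (sign extension)
    printf("%x\n", (int)u);    // 9c        (zero extension)
    return 0;
}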
This should do it:
unsigned char *data = (unsigned char *)[Data bytes];
By the way, -[NSData bytes] returns a const pointer. You should honor this and mark your pointer as const too:
const unsigned char *data = (const unsigned char *)[Data bytes];
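Putting it together, a sketch of the corrected loop (getData() is the asker's helper from above):
const unsigned char *data = (const unsigned char *)[Data bytes];
for (NSUInteger i = 0; i < [Data length]; i++)
{
    cout.width(2);
    cout.fill('0');
    cout << hex << (int)data[i] << " ";   // now prints 1f 9c b0 f8
}
cout << endl;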

That is your data as ints, sort of: the data is split into bytes and each byte is sign-extended to an int. As bytes, your data is 1f 9c b0 f8. 9c, b0, and f8 all have their top bit set, so as signed chars they are negative and sign extension fills the extra bits with 1s, which is why you are getting a bunch of fs.

Related

Troubles with casting bytes from NSInputStream?

I have an openssl server and an Objective-C client. I send a message like this:
uint32_t testD = 161;
err = SSL_write(ssl_, &testD, sizeof(uint32_t));
and read it with NSInputStream like this:
case NSStreamEventHasBytesAvailable:
{
    uint8_t buffer[4];
    int len;
    while ([inStream hasBytesAvailable])
    {
        len = [inStream read:buffer maxLength:sizeof(buffer)];
        if (len > 0)
        {
            NSString *output = [[NSString alloc] initWithBytes:buffer length:len encoding:NSASCIIStringEncoding];
            NSData *theData = [[NSData alloc] initWithBytes:buffer length:len];
            if (nil != output)
            {
                char buff;
                [theData getBytes:&buff length:1];
                uint32_t temp = (uint32_t)buffer;
            }
...
So in output I have "¡" (character 161, which is beyond ASCII; it is code point 161 in Latin-1), in buff I have '\xa1', and in temp a very big number, but I actually need 161 in temp.
I read that '\xa1' is also 161, but I can't cast it to uint32_t.
What is the problem?
ANSWER:
The problem was in casting. This works fine for me:
unsigned char buff;
int temp = buff;
or
char buff;
int b = (unsigned char) buff;
No encoding is used by SSL_write(), and \xa1 == 161 is a mathematical identity, not the result of any encoding process. As you're successfully recovering \xa1, clearly no decoding is used by NSInputStream either.
It seems to me that you're casting the address of the buffer rather than its contents, which is why you get a high value that varies with compilation.
In addition you are possibly over-running the data by reading whatever is available and then only consuming four bytes of it: less in fact because you're incorrectly testing len >= 1 rather than len >= 4.
You should:
Use a buffer of exactly four bytes. No need to allocate it dynamically: you can declare it as a local array.
Read until you have read four bytes. This requires a loop (see the sketch after this answer).
Change the casting syntax (don't ask me how, I'm no Objective-C expert, but the code that recovers buff looks like a good start), so that you get the content of the buffer instead of the address.
After that you may then have endian issues.
Nothing to do with encoding.
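A minimal sketch of that approach (illustrative only; memcpy comes from <string.h>, and error handling is kept short):
uint8_t buffer[4];
NSUInteger total = 0;
while (total < sizeof(buffer) && [inStream hasBytesAvailable])
{
    NSInteger len = [inStream read:buffer + total maxLength:sizeof(buffer) - total];
    if (len <= 0)
        break;                             // stream closed or error
    total += len;
}
uint32_t temp = 0;
if (total == sizeof(buffer))
    memcpy(&temp, buffer, sizeof(temp));   // the contents of the buffer, not its address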
What encoding is used in SSL_write and NSInputStream?
There is no encoding. It's bytes in and bytes out.
I think you are looking for network byte order/endianess.
Network byte order is big endian. So your code would become:
uint32_t testD = 161;
uint32_t be = htonl(testD);
err = SSL_write(ssl_, &be, sizeof(be));
Here's the description of htonl from the htonl(3) man pages:
The htonl() function converts the unsigned integer hostlong from host byte order to network byte order.
To convert back, you would use ntohl.
I'm not sure if Cocoa/CocoaTouch offers a replacement for htonl and ntohl. So you might have to use them in your iPhone projects, too. See, for example, Using ntohl and htonl problems on iPhone.
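On the receiving side, the mirror image would be something like this (a sketch; buffer is assumed to already hold the four bytes read from the stream, and ntohl comes from <arpa/inet.h>):
uint32_t netValue;
memcpy(&netValue, buffer, sizeof(netValue));   // raw bytes from the stream
uint32_t testD = ntohl(netValue);              // back to host byte order: 161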

nsstring to unsigned char []

I would like to convert
NSString *myString = @"0x10 0x1c 0x37 0x00"; // acquired by reading a text file using NSString methods...
to
unsigned char convertedfrommyString[] = { 0x10, 0x1c, 0x37, 0x00 };
My goal is to acquire the bytes and then swap them using this code:
unsigned char convertedfrommyString[] = { 0x10, 0x1c, 0x37, 0x00 };
int data = *((int *) convertedfrommyString);
NSLog(@"log = %08x", data);
the output should be:
log = 00371c10
Any help?
EDIT
The answers from both Jan Baptiste Younès and Sven helped me understand my problem and solve it with this code:
NSString *myString = [[@"0x10 0x1c 0x37 0x00" stringByReplacingOccurrencesOfString:@"0x" withString:@""] stringByReplacingOccurrencesOfString:@" " withString:@""];
unsigned result = 0;
NSScanner *scanner = [NSScanner scannerWithString:myString];
[scanner scanHexInt:&result];
int reverse = NSSwapInt(result);
NSLog(@"scanner: %8u", result);
NSLog(@"bytes: %08x", result);
NSLog(@"reverse: %08x (that is what I need!)", reverse);
Really OK!
But can I accept two answers?
That's more than a simple conversion; you need to actually parse the values from your string. You can use NSScanner to do this.
NSScanner *scanner = [NSScanner scannerWithString:@"0x10 0x1c 0x37 0x00"];
unsigned char convertedfrommyString[4];
unsigned index = 0;
while (![scanner isAtEnd]) {
    unsigned value = 0;
    if (![scanner scanHexInt:&value]) {
        // invalid value
        break;
    }
    convertedfrommyString[index++] = value;
}
Of course this sample is missing error handling (a value might not fit into an unsigned char, or there could be more than four values).
But this solves only half your problem. The other issue is converting this to an int. You did this by casting the unsigned char pointer to an int pointer. This is not portable and also not legal in C (it violates the aliasing rules). To always get the result you want, you should instead use bit shifts to assemble your int. So inside your loop you could do
result = result | (value << i);
i += 8;
instead of putting the values inside an unsigned char array. For this, result and i should both be initialized to zero (a complete sketch follows below).
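Combining the two steps, a sketch of the whole conversion (names follow the answer):
NSScanner *scanner = [NSScanner scannerWithString:@"0x10 0x1c 0x37 0x00"];
unsigned result = 0;
unsigned i = 0;
unsigned value = 0;
while (![scanner isAtEnd] && [scanner scanHexInt:&value]) {
    result = result | (value << i);   // the first byte lands in the low bits
    i += 8;
}
NSLog(@"%08x", result);   // 00371c10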
You may cut your original string at spaces and use the solution given here: Objective-C parse hex string to integer. You can also use scanUpToString:intoString: to parse up to space characters.

Best way to release memory allocated using malloc

I have a function to convert an integer into a byte array (for iPhone). To keep it dynamic I allocate the array using malloc, but I think this will leak memory. What's the best way to manage this memory?
+ (unsigned char *) intToByteArray:(int)num {
    unsigned char *arr = (unsigned char *)malloc(sizeof(num) * sizeof(unsigned char));
    for (int i = sizeof(num) - 1; i >= 0; i--) {
        arr[i] = num & 0xFF;
        num = num >> 8;
    }
    return arr;
}
When calling it:
int x = 500;
unsigned char *bytes = [Util intToByteArray:x];
I want to avoid the call free(bytes), since the calling function does not explicitly know that the memory was allocated and has to be freed.
A few things:
The char, signed char, and unsigned char types all have a size of 1 by definition, so sizeof(unsigned char) is unnecessary.
It looks like you just want to get the byte representation of an int object. If this is the case, it is not necessary to allocate more space for it; simply take the address of the int and cast it to unsigned char *. If the byte order is wrong you can use the NSSwapInt function to swap the order of the bytes in the int, and then take the address and cast to unsigned char *. For example:
int someInt = 0x12345678;
unsigned char *bytes = (unsigned char *) &someInt;
This cast is legal and reading from bytes is legal up until sizeof(int) bytes are read. This is accessing the “object representation”.
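For example, dumping the bytes through that cast (a sketch; the order shown depends on the host's endianness, and printf comes from <stdio.h>):
int someInt = 0x12345678;
unsigned char *bytes = (unsigned char *) &someInt;
for (size_t i = 0; i < sizeof(someInt); i++)
    printf("%02x ", bytes[i]);   // 78 56 34 12 on a little-endian host
printf("\n");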
If you insist on using malloc, then you simply need to pass the buffer to free when you are done, as in:
free(bytes);
The name of your method does not imply the correct ownership of the returned buffer. If your method returns something that the caller is responsible for freeing, it is conventional to name the method using new, copy, or sometimes create. A more suitable name would be copyBytesFromInt: or something similar. Otherwise you could have the method accept a pre-allocated buffer and call the method getBytes:fromInt:, for example:
+ (void) getBytes:(unsigned char *)bytes fromInt:(int)num
{
    for (int i = sizeof(num) - 1; i >= 0; i--) {
        bytes[i] = num & 0xFF;
        num = num >> 8;
    }
}
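Usage would then look like this (a sketch; Util is the asker's class):
unsigned char buffer[sizeof(int)];   // the caller owns the storage
[Util getBytes:buffer fromInt:500];
// buffer now holds 00 00 01 f4 (most significant byte first)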
You could wrap your bytes in an NSData instance:
NSData *data = [NSData dataWithBytesNoCopy:bytes length:sizeof(num) freeWhenDone:YES];
Make sure your method follows the usual object ownership rules.
Just call free(bytes); when you are done with the bytes (either at the end of the method or in the dealloc of the class).
Since you want to avoid the free call, you could wrap your byte array in an NSData object:
NSData *d = [NSData dataWithBytesNoCopy:bytes length:sizeof(int) freeWhenDone:YES];
The conventional way of handling this is for the caller to pass in an allocated byte buffer. That way the caller is responsible for freeing it. Something like:
int x = 500;
char *buffer = malloc(sizeof(int));
[Util int:x toByteArray:buffer];
…
free(buffer);
I would also consider creating an NSData to hold the bytes; this would take care of memory management for you while still allowing you to alter the byte buffer:
+ (NSData *) intToByteArray:(int)num {
    unsigned char *arr = (unsigned char *)malloc(sizeof(num) * sizeof(unsigned char));
    for (int i = sizeof(num) - 1; i >= 0; i--) {
        arr[i] = num & 0xFF;
        num = num >> 8;
    }
    return [NSData dataWithBytesNoCopy:arr length:sizeof(num) freeWhenDone:YES];
}
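A quick usage sketch of that variant:
NSData *data = [Util intToByteArray:500];
const unsigned char *p = [data bytes];
for (NSUInteger i = 0; i < [data length]; i++)
    printf("%02x ", p[i]);   // 00 00 01 f4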

Working SHA1 code in iOS5 not working in iOS6

The following worked just fine when built against iOS 5 but fails (SIGABRT) on iOS 6. Could it be an OS thing or an architecture thing?
It's also important to note that the accompanying MD5 hash does work.
- (NSString *)SHA1Hash {
    const char *cStr = [self UTF8String];
    unsigned char digest[16];
    CC_SHA1( cStr, strlen(cStr), digest ); // This is the sha1 call
    NSMutableString *output = [NSMutableString stringWithCapacity:CC_SHA1_DIGEST_LENGTH * 2];
    for (int i = 0; i < CC_SHA1_DIGEST_LENGTH; i++)
        [output appendFormat:@"%02x", digest[i]];
    return output;
}
Thanks for any and all help!
You were probably getting "lucky" on iOS 5. SHA-1 digests are 20 bytes, not 16:
unsigned char digest[16];
Use the macro CC_SHA1_DIGEST_LENGTH to declare your digest length. 16 is too short, so you are trashing the stack.
unsigned char digest[CC_SHA1_DIGEST_LENGTH];
From the man page for CC_SHA1:
CC_SHA1() computes the SHA-1 message digest of the len bytes at data and
places it in md (which must have space for CC_SHA1_DIGEST_LENGTH == 20
bytes of output). It returns the md pointer.
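For reference, the corrected method would look something like this (a sketch of the fix described above):
#import <CommonCrypto/CommonDigest.h>

- (NSString *)SHA1Hash {
    const char *cStr = [self UTF8String];
    unsigned char digest[CC_SHA1_DIGEST_LENGTH];   // 20 bytes, not 16
    CC_SHA1(cStr, (CC_LONG)strlen(cStr), digest);
    NSMutableString *output = [NSMutableString stringWithCapacity:CC_SHA1_DIGEST_LENGTH * 2];
    for (int i = 0; i < CC_SHA1_DIGEST_LENGTH; i++)
        [output appendFormat:@"%02x", digest[i]];
    return output;
}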

How to create a size_t and how to count the size?

I create an NSData and use the method
- (const void *)bytes;
It returns the bytes through a const void * pointer. If I read the memory manually I find this:
98 F3 00 76 84 // then a lot of zeros
Using strlen does not work because of the 00 byte, but the data will always be the same length: 10 hex digits. So, to create the size_t manually, should I use that hex-digit length:
size_t mysize = 0x0A;
or the size in bits:
size_t mysize = 0x28;
Is either of these correct?
The NSData contains the length.
const void *mybytes = [data bytes];
size_t mysize = [data length];
NSData also has -(NSUInteger)length.
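Put together, a sketch of walking the bytes safely (no strlen needed; embedded 00 bytes are included):
const unsigned char *mybytes = [data bytes];
size_t mysize = [data length];                 // size in bytes
for (size_t i = 0; i < mysize; i++)
    printf("%02x ", mybytes[i]);               // 98 f3 00 76 84 ...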