Troubles with casting bytes from NSInputStream? - ssl

I have openssl server and Objective-C client. I send message like this
uint32_t testD = 161;
err = SSL_write(ssl_, &testD, sizeof(uint32_t));
and read it by NSInputStream like
case NSStreamEventHasBytesAvailable:
{
    uint8_t buffer[4];
    int len;
    while ([inStream hasBytesAvailable])
    {
        len = [inStream read:buffer maxLength:sizeof(buffer)];
        if (len > 0)
        {
            NSString *output = [[NSString alloc] initWithBytes:buffer length:len encoding:NSASCIIStringEncoding];
            NSData *theData = [[NSData alloc] initWithBytes:buffer length:len];
            if (nil != output)
            {
                char buff;
                [theData getBytes:&buff length:1];
                uint32_t temp = (uint32_t)buffer;
            }
            ...
So in output I have "¡" (character 161 in the extended ASCII table), in buff I have '\xa1', and in temp a very big number, but I actually need 161 in temp.
I read that '\xa1' is also 161, but I can't cast this to uint32_t.
What is the problem?
ANSWER:
The problem was in casting. This works fine for me:
unsigned char buff;
int temp = buff;
or
char buff;
int b = (unsigned char) buff;
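In context, that looks something like this (a sketch based on the question's code; the unsigned type avoids sign extension from a plain char):
uint8_t buff;
[theData getBytes:&buff length:1];
uint32_t temp = buff;   // 161: the contents of the buffer, not its address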

No encoding is used by SSL_write(), and \xa1 == 161 is a mathematical identity, not the result of any encoding process. As you're successfully recovering \xa1, clearly no decoding is used by NSInputStream either.
It seems to me that you're casting the address of the buffer rather than its contents, which is why you get a high value that varies with compilation.
In addition you are possibly over-running the data by reading whatever is available and then only consuming four bytes of it: less, in fact, because you're incorrectly testing len >= 1 rather than len >= 4.
You should:
Use a buffer of exactly four bytes. No need to allocate it dynamically: you can declare it as a local array.
Read until you have read four bytes. This requires a loop.
Change the casting syntax (don't ask me how, I'm no Objective-C expert, but the code that recovers buff looks like a good start), so that you get the content of the buffer instead of the address.
After that you may then have endian issues.
Nothing to do with encoding.
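A minimal sketch of those steps, assuming the same inStream as in the question and a sender that writes exactly four bytes (error handling omitted):
uint8_t buffer[4];
NSUInteger have = 0;
// keep reading until all four bytes have arrived
while (have < sizeof(buffer)) {
    NSInteger len = [inStream read:buffer + have maxLength:sizeof(buffer) - have];
    if (len <= 0) break;   // stream closed or failed
    have += len;
}
if (have == sizeof(buffer)) {
    uint32_t temp;
    memcpy(&temp, buffer, sizeof(temp));   // copy the contents, not the address
    // interpret temp with byte order in mind; see the next question below
}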

What encoding is used in SSL_write and NSInputStream?
There is no encoding. It's bytes in and bytes out.
I think you are looking for network byte order/endianness.
Network byte order is big endian. So your code would become:
uint32_t testD = 161;
uint32_t be = htonl(testD);
err = SSL_write(ssl_, &be, sizeof(be));
Here's the description of htonl from the htonl(3) man pages:
The htonl() function converts the unsigned integer hostlong from host byte order to network byte order.
To convert back, you would use ntohl.
I'm not sure if Cocoa/CocoaTouch offers a replacement for htonl and ntohl. So you might have to use them in your iPhone projects, too. See, for example, Using ntohl and htonl problems on iPhone.
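On the receiving side the conversion mirrors this (a sketch; buffer is assumed to hold the four bytes already read from the stream):
#include <arpa/inet.h>

uint32_t be;
memcpy(&be, buffer, sizeof(be));   // raw big-endian bytes from the wire
uint32_t testD = ntohl(be);        // back to host byte order: 161 again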


Related

Convert 1 byte to int Objective-C

I have an NSData packet with data in it. I need to convert the byte at range 8, 1 to an int. To get the data at that location I do the following.
NSData *byte = [packet subdataWithRange:NSMakeRange(8, 1)];
If I NSLog byte, I get:
<01>
How do I then convert this to an int? This is probably the most basic of questions but I am just not getting it right.
Any help would be appreciated.
Update
With that data the int should be equal to 1. I am not sure if this has anything to do with endianness.
Use -[NSData bytes] to get the raw buffer and read from it:
int i = *((const char *)[byte bytes]);
or use -[NSData getBytes:length:]:
char buff;
[byte getBytes:&buff length:1];
int i = buff;
Make sure you are reading through a char * and not an int *; otherwise you access memory beyond the one-byte buffer, which is invalid and may or may not crash or produce the correct result.
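And as in the first question above, use an unsigned type if the byte can be 0x80 or higher (a sketch):
uint8_t buff;
[byte getBytes:&buff length:1];
int i = buff;   // 1 for <01>, and 161 rather than -95 for <a1>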

Best way to release memory allocated using malloc

I have a function to convert an integer into a byte array (for iPhone). To keep it dynamic I allocate the array using malloc, but I think this will leak memory. What's the best way to manage this memory?
+ (unsigned char *) intToByteArray:(int)num {
    unsigned char *arr = (unsigned char *) malloc(sizeof(num) * sizeof(unsigned char));
    for (int i = sizeof(num) - 1; i >= 0; i--) {
        arr[i] = num & 0xFF;
        num = num >> 8;
    }
    return arr;
}
When calling,
int x = 500;
unsigned char * bytes = [Util intToByteArray:x];
I want to avoid the call free(bytes), since the calling function does not explicitly know that the memory was allocated and has to be freed.
A few things:
The char type (and signed char and unsigned char) all have a size of 1 by definition, so sizeof(unsigned char) is unnecessary.
It looks like you just want to get the byte representation of an int object. If this is the case, it is not necessary to allocate more space for it; simply take the address of the int and cast it to unsigned char *. If the byte order is wrong you can use the NSSwapInt function to swap the order of the bytes in the int, then take the address and cast to unsigned char *. For example:
int someInt = 0x12345678;
unsigned char *bytes = (unsigned char *) &someInt;
This cast is legal and reading from bytes is legal up until sizeof(int) bytes are read. This is accessing the “object representation”.
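For instance, a sketch of the byte-order variant using NSSwapInt:
#import <Foundation/Foundation.h>

unsigned int someInt = 0x12345678;
unsigned int swapped = NSSwapInt(someInt);           // 0x78563412
unsigned char *bytes = (unsigned char *) &swapped;
// on a little-endian machine bytes[0] is now 0x12, i.e. big-endian order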
If you insist on using malloc, then you simply need to pass the buffer to free when you are done, as in:
free(bytes);
The name of your method does not imply the correct ownership of the returned buffer. If your method returns something that the caller is responsible for freeing, it is conventional to name the method using new, copy, or sometimes create; a more suitable name would be copyBytesFromInt: or something similar. Otherwise you could have the method accept a pre-allocated buffer and name it getBytes:fromInt:, for example:
+ (void) getBytes:(unsigned char *)bytes fromInt:(int)num
{
    for (int i = sizeof(num) - 1; i >= 0; i--) {
        bytes[i] = num & 0xFF;
        num = num >> 8;
    }
}
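Calling it would then look like this (a sketch; Util is the class from the question, and a 4-byte int is assumed):
int x = 500;
unsigned char bytes[sizeof(int)];
[Util getBytes:bytes fromInt:x];
// bytes now holds 00 00 01 f4, the big-endian representation of 500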
You could wrap your bytes into an NSData instance:
NSData *data = [NSData dataWithBytesNoCopy:bytes length:sizeof(num) freeWhenDone:YES];
Make sure your method follows the usual object ownership rules.
Just call free(bytes); when you are done with the bytes (either at the end of the method or in the dealloc of the class).
Since you want to avoid the free call, you could wrap your byte array in an NSData object:
NSData *d = [NSData dataWithBytesNoCopy:bytes length:sizeof(num) freeWhenDone:YES];
The conventional way of handling this is for the caller to pass in an allocated byte buffer. That way the caller is responsible for freeing it. Something like:
int x = 500;
char *buffer = malloc(sizeof(x));
[Util int:x toByteArray:buffer];
…
free(buffer);
I would also consider creating an NSData to hold the bytes; this would take care of memory management for you, while still allowing you to alter the byte buffer:
+ (NSData *) intToByteArray:(int)num {
    unsigned char *arr = (unsigned char *) malloc(sizeof(num));
    for (int i = sizeof(num) - 1; i >= 0; i--) {
        arr[i] = num & 0xFF;
        num = num >> 8;
    }
    return [NSData dataWithBytesNoCopy:arr length:sizeof(num) freeWhenDone:YES];
}
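The caller can then read the bytes back without any manual memory management, e.g. (a sketch):
NSData *data = [Util intToByteArray:500];
const unsigned char *p = [data bytes];
int value = 0;
for (NSUInteger i = 0; i < [data length]; i++)
    value = (value << 8) | p[i];   // reassemble the big-endian bytes; value == 500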

Arbitrary precision bit manipulation (Objective C)

I need to do bit operations on representations of arbitrary precision numbers in Objective C. So far I have been using NSData objects to hold the numbers - is there a way to bit shift the content of those? If not, is there a different way to achieve this?
Using NSMutableData you can fetch each byte into a char, shift its bits, and replace it with -replaceBytesInRange:withBytes:.
I don't see any other solution except writing your own data holder class using a char * buffer to hold the raw data.
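Per byte, that looks something like this (a sketch; data is assumed to be the NSData holding the number and i the byte index, and the carry into the neighbouring byte is ignored):
NSMutableData *number = [data mutableCopy];
uint8_t byte;
[number getBytes:&byte range:NSMakeRange(i, 1)];   // fetch the byte
byte <<= 1;                                        // shift its bits
[number replaceBytesInRange:NSMakeRange(i, 1) withBytes:&byte];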
As you'll have spotted, Apple doesn't provide arbitrary precision support. Nothing is provided larger than the 1024-bit integers in vecLib.
I also don't think NSData provides shifts and rolls. So you're going to have to roll your own. E.g. a very naive version, which may have some small errors as I'm typing it directly here:
@implementation NSData (Shifts)
- (NSData *)dataByShiftingLeft:(NSUInteger)bitCount
{
    // we'll work byte by byte, treating byte 0 as the most significant
    NSUInteger wholeBytes = bitCount >> 3;
    NSUInteger extraBits = bitCount & 7;
    NSUInteger length = self.length;
    NSMutableData *newData = [NSMutableData dataWithLength:length + wholeBytes + (extraBits ? 1 : 0)];
    const uint8_t *sourceBytes = self.bytes;
    uint8_t *destinationBytes = newData.mutableBytes;
    if (extraBits)
    {
        // the new leading byte receives the bits carried out of the old top byte
        destinationBytes[0] = sourceBytes[0] >> (8 - extraBits);
        for (NSUInteger index = 1; index < length; index++)
        {
            destinationBytes[index] = (uint8_t)((sourceBytes[index - 1] << extraBits) |
                                                (sourceBytes[index] >> (8 - extraBits)));
        }
        destinationBytes[length] = (uint8_t)(sourceBytes[length - 1] << extraBits);
        // the trailing wholeBytes bytes are already zero, completing the shift
    }
    else
    {
        // whole-byte shift: just copy all of self into the beginning of newData
        memcpy(destinationBytes, sourceBytes, length);
    }
    return newData;
}
@end
Of course, that assumes the number of bits you want to shift by is itself expressible as an NSUInteger, amongst other sins.
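Used like this, for example (matching the big-endian byte order assumed above):
NSData *one = [NSData dataWithBytes:(uint8_t[]){ 0x01 } length:1];
NSData *shifted = [one dataByShiftingLeft:12];
NSLog(@"%@", shifted);   // <001000>, i.e. 0x1000 == 1 << 12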

NSData encoding to Unicode returning nil

I am attempting to convert an HMAC (hashed data) to a string safe for URLs for authentication purposes.
I'm having problems converting data generated from SHA-256 hashing (using Apple's crypto library) to Unicode in both little and big endian; one hashed string will work in big and not in little, and vice versa for a different hashed string. For some hashed strings it works perfectly. I think it may have something to do with an out-of-range character or something. When I say it doesn't work, I mean it returns nil.
The code looks like this:
NSString *mystring = [[NSString alloc] initWithData:myHash encoding:NSUnicodeStringEncoding];
Is Unicode the best to use? I tried encoding to UTF-8 and it returns nil, and ASCII doesn't have all the characters; I get a few "?" where data is missing.
Really, my question is: how do I make a string from NSData holding a SHA-256 hash?
Solution:
https://github.com/eczarny/xmlrpc
NSData+Base64.h and NSData+Base64.m
You first need to make a hex string out of your HMAC bytes.
You can make it like this:
void bytes_to_hexstring(void *uuid, char *hex_string, size_t osize) {
    static const char hexdigits[] = "0123456789ABCDEF";
    const unsigned char *bytes = (unsigned char *)uuid;
    for (size_t i = 0; i < osize; ++i) {
        const unsigned char c = *bytes++;
        *hex_string++ = hexdigits[(c >> 4) & 0xF];
        *hex_string++ = hexdigits[c & 0xF];
    }
    *hex_string = 0;
}
char *mystring = malloc(2 * [myHash length] + 1);   // 65 bytes for a 32-byte SHA-256 digest
bytes_to_hexstring((void *)[myHash bytes], mystring, [myHash length]);
something like this.
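The result is a NUL-terminated C string, so you can wrap it and release the buffer (a sketch):
NSString *hex = [NSString stringWithUTF8String:mystring];
free(mystring);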

Casting NSString to unsigned char *

I'm trying to use a function that has the following signature to sign an HTTP request:
extern void hmac_sha1(const unsigned char *inText, int inTextLength, unsigned char* inKey, const unsigned int inKeyLength, unsigned char *outDigest);
And this is the method I wrote to use it:
- (NSString *)sign:(NSString *)stringToSign {
    NSString *secretKey = @"xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx";
    const unsigned char *inText = (unsigned char *)[stringToSign UTF8String];
    int inTextLength = [stringToSign length];
    unsigned char *inKey = (unsigned char *)[secretKey UTF8String];
    const unsigned int inKeyLength = (unsigned int)[secretKey length];
    unsigned char *outDigest;
    hmac_sha1(inText, inTextLength, inKey, inKeyLength, outDigest);
    NSString *output = [NSString stringWithUTF8String:(const char *)outDigest];
    return output;
}
The problem is I'm sure this is not the way I'm supposed to do this casting, as inside this hmac_sha1 function I get an EXC_BAD_ACCESS exception.
Since I am new to Objective-C and have close to no experience in C (surprise!), I don't really know what to search for. Any tips on how I can start solving this?
Thanks in advance!
BTW, I got the reference for this function here in stackoverflow.
It looks like the problem is not with the casting, but with outDigest. The fifth argument to hmac_sha1 should point to an already allocated buffer of size 20 bytes (I think).
If you change the line that says
unsigned char *outDigest;
to say
#define HMACSHA1_DIGEST_SIZE 20
void *outDigest = malloc(HMACSHA1_DIGEST_SIZE);
That should get you past the crash inside hmac_sha1.
Then you've got the problem of converting the data at outDigest into an NSString. It looks like hmac_sha1 will put 20 bytes of random-looking data at outDigest, and not a null terminated UTF-8 string, so stringWithUTF8String: won't work. You might want to use something like this instead if you have to return an NSString:
NSString *output = [[NSString alloc] initWithBytesNoCopy:outDigest
                                                  length:HMACSHA1_DIGEST_SIZE
                                                encoding:NSASCIIStringEncoding
                                            freeWhenDone:YES];
I don't think NSString is really the right type for the digest, so it might be worth changing your method to return an NSData if you can.
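For example (a sketch reusing the HMACSHA1_DIGEST_SIZE constant from above; freeWhenDone:YES hands the malloc'd buffer over to NSData):
NSData *digest = [NSData dataWithBytesNoCopy:outDigest
                                      length:HMACSHA1_DIGEST_SIZE
                                freeWhenDone:YES];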
This wasn't part of your question, but it's a bug nonetheless: you shouldn't use -length to get the byte count of a UTF-8 string. That method returns the number of Unicode characters in the string, not the number of bytes. What you want is -lengthOfBytesUsingEncoding:.
NSUInteger byteCount = [stringToSign lengthOfBytesUsingEncoding:NSUTF8StringEncoding];
Also be aware that the result does not account for a terminating NULL character.
Are you sure you don't need to allocate some memory for outDigest before calling hmac_sha1? Since you pass in a pointer, rather than a pointer to a pointer, there's no way that the memory can be allocated inside the routine.