Convert 1 byte to int in Objective-C

I have an NSData packet with data in it. I need to convert the byte at range 8, 1 to an int. To get the data at that location I do the following.
NSData *byte = [packet subdataWithRange:NSMakeRange(8, 1)];
If I NSLog byte, it prints:
<01>
How do I then convert this to an int? This is probably the most basic of questions, but I am just not getting it right.
Any help would be appreciated.
Update
With that data the int should be equal to 1. I am not sure if this has anything to do with endianness.

Use -[NSData bytes] to get the raw buffer and read from it:
int i = *((const char *)[byte bytes]);
or use -[NSData getBytes:length:]:
char buff;
[byte getBytes:&buff length:1];
int i = buff;
Make sure you read through a char * and not an int *; otherwise you dereference memory beyond the single valid byte, which may or may not crash and may or may not give the correct result.
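Putting the question and the answer together, here is a minimal sketch (assuming packet is the NSData from the question and is at least 9 bytes long); -getBytes:range: lets you skip the intermediate subdataWithRange: call entirely:
uint8_t raw = 0;
[packet getBytes:&raw range:NSMakeRange(8, 1)]; // copy exactly the one byte at offset 8
int i = raw;                                    // <01> -> 1
NSLog(@"value = %d", i);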

Related

Objective-C - NSData to integer does not work

I'm trying to convert 2 bytes in an NSData to an int.
Using the code
int value = *(int*)[d1 bytes];
NSLog(#"NSData: %# -> int: %d",d1, value);
i'll get
NSData: <01ac> -> int: 44033
which is the int for ac01, not 01ac.
What would be the way to convert it in the correct way?
I believe that the byte order is switched (i.e. big endian vs. little endian).
To fix this, try:
int value = CFSwapInt32BigToHost(*(int*)[d1 bytes]);
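Since d1 only contains two bytes, a slightly safer variant (a sketch, assuming the data really is a big-endian 16-bit integer) copies just those two bytes before swapping with CFSwapInt16BigToHost:
uint16_t raw = 0;
[d1 getBytes:&raw length:sizeof(raw)];   // copy only the two bytes that exist
int value = CFSwapInt16BigToHost(raw);   // <01ac> -> 428
NSLog(@"NSData: %@ -> int: %d", d1, value);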

Troubles with casting bytes from NSInputStream?

I have an OpenSSL server and an Objective-C client. I send a message like this:
uint32_t testD = 161;
err = SSL_write(ssl_, &testD, sizeof(uint32_t));
and read it with NSInputStream like this:
case NSStreamEventHasBytesAvailable:
{
    uint8_t buffer[4];
    int len;
    while ([inStream hasBytesAvailable])
    {
        len = [inStream read:buffer maxLength:sizeof(buffer)];
        if (len > 0)
        {
            NSString *output = [[NSString alloc] initWithBytes:buffer length:len encoding:NSASCIIStringEncoding];
            NSData *theData = [[NSData alloc] initWithBytes:buffer length:len];
            if (nil != output)
            {
                char buff;
                [theData getBytes:&buff length:1];
                uint32_t temp = (uint32_t)buffer;
            }
...
So in output I have "¡" (character code 161), in buff I have '\xa1', and in temp a very big number, but I actually need 161 in temp.
I read that '\xa1' is also 161, but I can't cast this to uint32_t.
What is the problem?
ANSWER:
The problem was in casting. This works fine for me:
unsigned char buff;
int temp = buff;
or
char buff;
int b = (unsigned char) buff;
No encoding is used by SSL_write(), and \xa1 == 161 is a mathematical identity, not the result of any encoding process. As you're successfully recovering \xa1, clearly no decoding is used by NSInputStream either.
It seems to me that you're casting the address of the buffer rather than its contents, which is why you get a high value that varies with compilation.
In addition you are possibly over-running the data by reading whatever is available and then only consuming four bytes of it: less in fact because you're incorrectly testing len >= 1 rather than len >= 4.
You should:
Use a buffer of exactly four bytes. No need to allocate it dynamically: you can declare it as a local array.
Read until you have read four bytes. This requires a loop.
Change the casting syntax (don't ask me how, I'm no Objective-C expert, but the code that recovers buff looks like a good start), so that you get the content of the buffer instead of the address.
After that you may then have endian issues.
Nothing to do with encoding.
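A minimal sketch along those lines (assumptions: this sits inside the NSStreamEventHasBytesAvailable case, and the server sends exactly four bytes) might look like this:
uint8_t buffer[4];                  // exactly four bytes, declared as a local array
NSUInteger have = 0;
while (have < sizeof(buffer) && [inStream hasBytesAvailable])
{
    NSInteger got = [inStream read:buffer + have maxLength:sizeof(buffer) - have];
    if (got <= 0) break;            // stream closed or error
    have += (NSUInteger)got;
}
if (have == sizeof(buffer))
{
    uint32_t temp = 0;
    memcpy(&temp, buffer, sizeof(temp)); // the contents of the buffer, not its address
    // temp may still need byte swapping here, depending on the order
    // in which the server wrote the value.
}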
What encoding is used in SSL_write and NSInputStream?
There is no encoding. Its bytes in and bytes out.
I think you are looking for network byte order/endianness.
Network byte order is big endian. So your code would become:
uint32_t testD = 161;
uint32_t be = htonl(testD);
err = SSL_write(ssl_, &be, sizeof(be));
Here's the description of htonl from the htonl(3) man pages:
The htonl() function converts the unsigned integer hostlong from host byte order to network byte order.
To convert back, you would use ntohl.
I'm not sure if Cocoa/CocoaTouch offers a replacement for htonl and ntohl. So you might have to use them in your iPhone projects, too. See, for example, Using ntohl and htonl problems on iPhone.
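On the receiving side, a short sketch of the matching conversion (assuming the four bytes read from the stream are sitting in buffer) would be:
#include <arpa/inet.h>                       // htonl / ntohl

uint32_t netValue = 0;
memcpy(&netValue, buffer, sizeof(netValue)); // the four bytes read from the stream
uint32_t testD = ntohl(netValue);            // back to host byte order -> 161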

How can I turn NSData's hex string into a normal hex string?

So, take an unsigned int, say 4286578687.
From this site: http://www.mathsisfun.com/binary-decimal-hexadecimal-converter.html
I get the hex value to be: FF7FFFFF
However, if I put that int into NSData like so:
//The unsigned int is unsignedInt and its length is unsignedLength
NSData *thisData = [NSData dataWithBytes:&unsignedInt length:unsignedLength];
And then use the description method, which supposedly returns the data's hex value as a string:
NSLog(#"data as hex: %#", [thisData description]);
The output is:
data as hex: <ffff7fff>
Which on the same website evaluates to 4294934527.
So it seems like NSData is using some non-standard hex format. Can anyone tell me how to get back to the real hex format?
You are seeing a difference between storing the bytes of the unsigned int in big-endian versus little-endian.
If you want to guarantee that the output of the NSData is in big-endian format then you should do the following:
unsigned int x = 4286578687;
unsigned int big = NSSwapHostIntToBig(x);
NSData *thisData = [NSData dataWithBytes:&big length:sizeof(big)];
NSLog(#"data as hex: %#", thisData);
This logs the expected result of data as hex: <ff7fffff>.
This code will work on any processor type and always give you the result in big-endian format.
When going back the other way you would need to use the NSSwapBigIntToHost function to ensure the big-endian data is properly converted to the local format.
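A sketch of that reverse direction (assuming thisData holds the four big-endian bytes produced above) could look like this:
unsigned int big = 0;
[thisData getBytes:&big length:sizeof(big)];
unsigned int x = NSSwapBigIntToHost(big);   // <ff7fffff> -> 4286578687 on any host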
You can try something like this :
NSInteger i = 4286578687;
NSLog(@"hex : %@", [NSString stringWithFormat:@"%X", (unsigned int)i]);
// Result
>>> hex : FF7FFFFF

Get float value from NSData in Objective-C (iOS)

I am trying to get a float value from an NSData object which contains several hex bytes, e.g. EC 51 38 41.
From these 4 byte values I want to get the float value 11.52. How do I do this in Xcode?
I have tried it with NSScanner (scanFloat, scanHexFloat), NSNumberFormatter and NSNumber, and I created a byte array and tried float myFloat = *(float*)&myByteArray. All these options I found here on Stack Overflow.
I tested it on Windows with C# and there it was no problem:
byte[] bytes = new byte[4] { 0xEC, 0x51, 0x38, 0x41 };
float myFloat = System.BitConverter.ToSingle(bytes, 0);
Does anybody know how I have to do this in Xcode?
Thanks, Benjamin
When converting binary data from a foreign protocol always make sure to include proper swapping for endianness:
uint32_t hostData = CFSwapInt32BigToHost(*(const uint32_t *)[data bytes]);
float value = *(float *)(&hostData);
You have to know the endianness of the encoded data. You might need to use CFSwapInt32LittleToHost instead.
NSData * data = ...; // loaded from bluetooth
float z;
[data getBytes:&z length:sizeof(float)];
Try this.
I have tried it with NSScanner (scanFloat, scanHexFloat), NSNumberFormatter and NSNumber
You're barking up the wrong tree here. NSScanner is for scanning strings. NSNumber is not the same as NSData, and NSNumberFormatter won't work with NSData either.
NSData is a container for plain old binary data. You've apparently got a float stored in an NSData instance; if you want to access it, you'll need to get the data's bytes and then interpret those bytes however you like, e.g. by casting to float:
float *p = (float*)[myData bytes]; // -bytes returns a void* that points to the data
float f = *p;
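For the specific bytes from the question, here is a small self-contained sketch (using memcpy instead of the pointer cast, which sidesteps alignment and strict-aliasing concerns); EC 51 38 41 is the little-endian IEEE 754 encoding of roughly 11.52:
#import <Foundation/Foundation.h>

const uint8_t raw[4] = { 0xEC, 0x51, 0x38, 0x41 };
NSData *data = [NSData dataWithBytes:raw length:sizeof(raw)];

float myFloat = 0.0f;
memcpy(&myFloat, [data bytes], sizeof(myFloat)); // copy the 4 bytes into the float
NSLog(@"%f", myFloat);                           // prints 11.520000 on little-endian hosts (iOS, macOS)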

Casting NSString to unsigned char *

I'm trying to use a function that has the following signature to sign an HTTP request:
extern void hmac_sha1(const unsigned char *inText, int inTextLength, unsigned char* inKey, const unsigned int inKeyLength, unsigned char *outDigest);
And this is the method I wrote to use it:
- (NSString *)sign:(NSString *)stringToSign {
    NSString *secretKey = @"xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx";
    const unsigned char *inText = (unsigned char *)[stringToSign UTF8String];
    int inTextLength = [stringToSign length];
    unsigned char *inKey = (unsigned char *)[secretKey UTF8String];
    const unsigned int inKeyLength = (unsigned int)[secretKey length];
    unsigned char *outDigest;
    hmac_sha1(inText, inTextLength, inKey, inKeyLength, outDigest);
    NSString *output = [NSString stringWithUTF8String:(const char *)outDigest];
    return output;
}
The problem is I'm sure this is not the way I'm supposed to do this casting, as inside this hmac_sha1 function I get an EXC_BAD_ACCESS exception.
Since I am new to Objective-C and have close to no experience in C (surprise!) I don't really know what to search for. Any tips on how I can start solving this?
Thanks in advance!
BTW, I got the reference for this function here on Stack Overflow.
It looks like the problem is not with the casting, but with outDigest. The fifth argument to hmac_sha1 should point to an already allocated buffer of size 20 bytes (I think).
If you change the line that says
unsigned char *outDigest;
to say
#define HMACSHA1_DIGEST_SIZE 20
void *outDigest = malloc(HMACSHA1_DIGEST_SIZE);
That should get you past the crash inside hmac_sha1.
Then you've got the problem of converting the data at outDigest into an NSString. It looks like hmac_sha1 will put 20 bytes of random-looking data at outDigest, and not a null terminated UTF-8 string, so stringWithUTF8String: won't work. You might want to use something like this instead if you have to return an NSString:
NSString *output = [[NSString alloc] initWithBytesNoCopy:outDigest
                                                   length:HMACSHA1_DIGEST_SIZE
                                                 encoding:NSASCIIStringEncoding
                                             freeWhenDone:YES];
I don't think NSString is really the right type for the digest, so it might be worth changing your method to return an NSData if you can.
This wasn't part of your question, but it's a bug nonetheless: you shouldn't use -length to get the byte count of a UTF-8 string. That method returns the number of Unicode characters in the string, not the number of bytes. What you want is -lengthOfBytesUsingEncoding:.
NSUInteger byteCount = [stringToSign lengthOfBytesUsingEncoding:NSUTF8StringEncoding];
Also be aware that the result does not account for a terminating NULL character.
Are you sure you don't need to allocate some memory for outDigest before calling hmac_sha1? Since you pass in a pointer, rather than a pointer to a pointer, there's no way that the memory can be allocated inside the routine.
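Pulling those suggestions together, here is a sketch of the corrected method (assumptions: the hmac_sha1 from the question, a 20-byte digest as the first answer guesses, and a hex string as the return value, since the raw digest is not valid UTF-8):
#define HMACSHA1_DIGEST_SIZE 20

- (NSString *)sign:(NSString *)stringToSign {
    NSString *secretKey = @"xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx";

    const unsigned char *inText = (const unsigned char *)[stringToSign UTF8String];
    int inTextLength = (int)[stringToSign lengthOfBytesUsingEncoding:NSUTF8StringEncoding];

    unsigned char *inKey = (unsigned char *)[secretKey UTF8String];
    const unsigned int inKeyLength =
        (unsigned int)[secretKey lengthOfBytesUsingEncoding:NSUTF8StringEncoding];

    unsigned char outDigest[HMACSHA1_DIGEST_SIZE];   // caller-allocated buffer, as hmac_sha1 expects
    hmac_sha1(inText, inTextLength, inKey, inKeyLength, outDigest);

    // The digest is raw binary, so render it as hex rather than treating it as UTF-8.
    NSMutableString *output = [NSMutableString stringWithCapacity:HMACSHA1_DIGEST_SIZE * 2];
    for (int i = 0; i < HMACSHA1_DIGEST_SIZE; i++) {
        [output appendFormat:@"%02x", outDigest[i]];
    }
    return output;
}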