Convert hex string to IEEE 754 float - Objective-C

I am trying to convert an NSString with hex values into a float value.
NSString *hexString = @"3f9d70a4";
The float value should be 1.230.
Some ways I have tried to solve this are:
1. NSScanner
-(unsigned int)strfloatvalue:(NSString *)str
{
    float outVal;
    NSString *newStr = [NSString stringWithFormat:@"0x%@", str];
    NSScanner *scanner = [NSScanner scannerWithString:newStr];
    NSLog(@"string %@", newStr);
    bool test = [scanner scanHexFloat:&outVal];
    NSLog(@"scanner result %d = %a (or %f)", test, outVal, outVal);
    return outVal;
}
results:
string 0x3f9d70a4
scanner result 1 = 0x1.fceb86p+29 (or 1067282624.000000)
2. Casting pointers
NSNumber *xPtr = [NSNumber numberWithFloat:[(NSNumber *)@"3f9d70a4" floatValue]];
result: 3.000000

What you have is not a "hexadecimal float" as produced by the %a format specifier and scanned by scanHexFloat:, but the hexadecimal representation of a 32-bit floating-point value - i.e. the actual bits.
To convert this back to a float in C you need to mess with the type system to get access to the bytes that make up a floating-point value. You can do this with a union:
typedef union { float f; uint32_t i; } FloatInt;
This type is similar to a struct, but the fields are overlaid on top of each other. You should understand that this kind of manipulation requires you to understand the storage format and be aware of endian order, etc. Do not do it lightly.
Now that you have the above type, you can scan a hexadecimal integer and interpret the resulting bytes as a floating-point number:
FloatInt fl;
NSScanner *scanner = [NSScanner scannerWithString:@"3f9d70a4"];
if ([scanner scanHexInt:&fl.i]) // scan into the i field
{
    NSLog(@"%x -> %f", fl.i, fl.f); // display the f field, interpreting the bytes of i as a float
}
else
{
    // parse error
}
This works, but again consider carefully what you are doing.
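If you would rather avoid the union, a memcpy of the scanned bits into the float is an equivalent, well-defined alternative (a minimal sketch, not part of the original answer):
uint32_t bits = 0;
NSScanner *bitScanner = [NSScanner scannerWithString:@"3f9d70a4"];
if ([bitScanner scanHexInt:&bits])
{
    float value;
    memcpy(&value, &bits, sizeof value); // copy the raw 32 bits into the float's storage
    NSLog(@"%x -> %f", bits, value);     // 3f9d70a4 -> 1.230000
}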
HTH

I think a better solution is a workaround like this:
-(float)getFloat:(NSInteger *)pIndex
{
    NSInteger index = *pIndex;
    NSData *data = [self subDataFromIndex:&index withLength:4];
    *pIndex = index;
    uint32_t hostData = CFSwapInt32BigToHost(*(const uint32_t *)[data bytes]);
    return *(float *)(&hostData);
}
Here the receiver is an NSData that represents the number in hex format, and the input parameter is a pointer to the current index into that NSData.
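A self-contained variant of the same idea, without the custom subDataFromIndex:withLength: helper, might look like this (a sketch; the method name is illustrative):
-(float)floatFromBigEndianData:(NSData *)data atOffset:(NSUInteger)offset
{
    uint32_t bigEndianBits = 0;
    [data getBytes:&bigEndianBits range:NSMakeRange(offset, sizeof(bigEndianBits))];
    uint32_t hostBits = CFSwapInt32BigToHost(bigEndianBits); // the bytes in the data are big-endian
    float value;
    memcpy(&value, &hostBits, sizeof value);                 // reinterpret the bits as a float
    return value;
}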

So basically you are trying to convert an NSString to a C float; there's an old-fashioned way to do that!
NSString *hexString = @"3f9d70a4";
const char *cHexString = [hexString UTF8String];
uint32_t bits = (uint32_t)strtoul(cHexString, NULL, 16); // parse the hex digits into a 32-bit integer
float f = *((float *)&bits);                             // reinterpret those bits as a float
// f = 1.23
For more detail please see this answer.

NSString to unsigned char[]

I would like to convert
NSString *myString = @"0x10 0x1c 0x37 0x00"; // acquired by reading a text file using NSString methods..
to
unsigned char convertedfrommyString[] = { 0x10, 0x1c, 0x37, 0x00 };
My goal is to acquire them and then swap them using this code:
unsigned char convertedfrommyString[] = { 0x10, 0x1c, 0x37, 0x00 };
int data = *((int *) convertedfrommyString);
NSLog(@"log = %08x", data);
the output should be:
log = 00371c10
Any help?
EDIT
From both Jan Baptiste Younès and Sven I found the way to understand my problem and solved it with this code:
NSString *myString = [[@"0x10 0x1c 0x37 0x00" stringByReplacingOccurrencesOfString:@"0x" withString:@""] stringByReplacingOccurrencesOfString:@" " withString:@""];
unsigned result = 0;
NSScanner *scanner = [NSScanner scannerWithString:myString];
[scanner scanHexInt:&result];
int reverse = NSSwapInt(result);
NSLog(@"scanner: %8u", result);
NSLog(@"bytes: %08x", result);
NSLog(@"reverse: %08x (that is what I need!)", reverse);
Really OK!
But can I accept two answers?
That's more than a simple conversion; you need to actually parse the values from your string. You can use NSScanner to do this.
NSScanner *scanner = [NSScanner scannerWithString:@"0x10 0x1c 0x37 0x00"];
unsigned char convertedfrommyString[4];
unsigned index = 0;
while (![scanner isAtEnd]) {
    unsigned value = 0;
    if (![scanner scanHexInt:&value]) {
        // invalid value
        break;
    }
    convertedfrommyString[index++] = value;
}
Of course this sample is missing error handling (the individual values might not fit into an unsigned char, or there could be more than four of them).
But this solves only half your problem. The other issue is converting the bytes to an int. You did this by casting the unsigned char pointer to an int pointer, which is not portable and also not legal C. To always get the result you want, you should instead use bit shifts to assemble your int. So inside your loop you could do
result = result | (value << i);
i += 8;
instead of putting the values into an unsigned char array. For this, result and i should both be initialized to zero.
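Put together, the shift-based version of the loop might look like this (a minimal sketch following the suggestion above):
NSScanner *scanner = [NSScanner scannerWithString:@"0x10 0x1c 0x37 0x00"];
unsigned result = 0;
unsigned i = 0; // bit position of the next byte
while (![scanner isAtEnd] && i < 32) {
    unsigned value = 0;
    if (![scanner scanHexInt:&value]) {
        break; // invalid value
    }
    result = result | (value << i); // the first byte lands in the lowest 8 bits
    i += 8;
}
NSLog(@"log = %08x", result); // log = 00371c10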
You may cut your original string at spaces and use the solution given here: Objective-C parse hex string to integer. You can also use scanUpToString:intoString: to parse up to the space characters.

Converting a decimal number to binary in Objective-C

Hi, I have made an iOS app that converts binary, hexadecimal and decimal values. It all works fine except for my decimal to binary conversion. Here is what I have. It returns 0s and 1s, but far too many of them. Can anyone tell me why this is, or help me with a better method?
NSString *newDec = [display text]; // takes user input from the display
NSString *string = @"";
NSUInteger x = newDec;
int i = 0;
while (x > 0) {
    string = [[NSString stringWithFormat:@"%u", x & 1] stringByAppendingString:string];
    x = x >> 1;
    ++i;
}
display.text = string; // displays the result in the iOS text box
Try this:
NSUInteger x = [newDec integerValue];
And next time don't ignore the compiler's "incompatible pointer to integer conversion" warning...
Explanation: As far as I know, assigning an object to an integer actually assigns the address of the object to that integer, not the content of the string (which is what you want).
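Applying that fix, the loop from the question might read as follows (a minimal sketch; display is the text field from the question):
NSString *newDec = [display text];    // user input from the display
NSString *string = @"";
NSUInteger x = [newDec integerValue]; // parse the digits instead of storing the pointer
while (x > 0) {
    string = [[NSString stringWithFormat:@"%lu", (unsigned long)(x & 1)] stringByAppendingString:string];
    x = x >> 1;
}
display.text = string;                // e.g. @"1010" for an input of @"10"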

Converting an octal string to decimal in Objective-C?

I am trying to do conversions between binary, octal, decimal and hexadecimal in Objective-C.
I have had problems converting octal to decimal.
I have tried the following:
NSString *decString = [NSString stringWithFormat:@"%d", 077];
It works fine, returning 63 as expected, but my octal value is an NSString. How can I tell the computer that it is octal?
I know there is a method called scanHexInt: which I used to convert hexadecimal to decimal, but it seems there is no scanOctInt...
Any help would be appreciated!
The cleanest solution is probably:
long result = strtol(input.UTF8String, NULL, 8);
or
long long result = strtoll(input.UTF8String, NULL, 8);
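For example, with the value from the question (a short sketch; input stands for your octal NSString):
NSString *input = @"77";
long result = strtol(input.UTF8String, NULL, 8);
NSLog(@"%ld", result); // prints 63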
Define a category on NSString (put this at the top of any of your source code modules or into a new .m/.h file pair; @interface goes into the .h, @implementation into the .m):
@interface NSString (NSStringWithOctal)
-(int)octalIntValue;
@end

@implementation NSString (NSStringWithOctal)
-(int)octalIntValue
{
    int iResult = 0, iBase = 1;
    char c;
    for(int i = (int)[self length] - 1; i >= 0; i--)
    {
        c = [self characterAtIndex:i];
        if((c < '0') || (c > '7')) return 0;
        iResult += (c - '0') * iBase;
        iBase *= 8;
    }
    return iResult;
}
@end
Use it like this:
NSString *s = @"77";
int i = [s octalIntValue];
NSLog(@"%d", i);
The method returns an integer representing the octal value in the string. It returns 0 if the string is not an octal number. Leading zeroes are allowed, but not necessary.
Alternatively, if you want to drop down to C, you can use sscanf:
unsigned int oct;
sscanf([yourString UTF8String], "%o", &oct); // %o expects a pointer to unsigned int

Converting NSData bytes to NSString

I am trying to create a 16-byte (and later 32-byte) initialization vector in Objective-C (Mac OS). I took some code on how to create random bytes and modified it to 16 bytes, but I am having some difficulty with this. The NSData dumps the hex, but an NSString dump gives nil, and a C string NSLog gives the wrong number of characters (not reproduced the same in the dump here).
Here is my terminal output:
2012-01-07 14:29:07.705 Test3Test[4633:80f] iv hex <48ea262d efd8f5f5 f8021126 fd74c9fd>
2012-01-07 14:29:07.710 Test3Test[4633:80f] IV string: (null)
2012-01-07 14:29:07.711 Test3Test[4633:80f] IV char string t^Q¶�^��^A
Here is the main program:
int main(int argc, const char *argv[])
{
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    //NSString *iv_string = [NSString stringWithCString:iv encoding:NSUTF8StringEncoding];
    testclass *obj = [testclass alloc];
    NSData *iv_data = [obj createRandomNSData];
    //[iv_string dataUsingEncoding:NSUTF8StringEncoding];
    NSLog(@"iv hex %@", iv_data);
    //NSString *iv_string = [[NSString alloc] initWithBytes:[iv_data bytes] length:16 encoding:NSUTF8StringE$
    NSString *iv_string = [[NSString alloc] initWithData:iv_data encoding:NSUTF8StringEncoding];
    NSLog(@"IV string: %@", iv_string);
    NSLog(@"IV char string %.*s", [iv_data bytes]);
    return 0;
}
(I left in some commented code above that I also tried and that did not work.)
Below is my random number generator, taken from a Stack Overflow example:
@implementation testclass
-(NSData *)createRandomNSData
{
    int twentyMb = 16;
    NSMutableData *theData = [NSMutableData dataWithCapacity:twentyMb];
    for(unsigned int i = 0; i < twentyMb/4; ++i)
    {
        u_int32_t randomBits = arc4random();
        [theData appendBytes:(void *)&randomBits length:4];
    }
    NSData *data = [NSData dataWithData:theData];
    // note: theData is autoreleased; do not call dealloc on it directly
    return data;
}
@end
I am really quite clueless as to what could be the problem here. If I have data as bytes, should it convert to a string, or not necessarily? I have looked over the relevant examples here on Stack Overflow, but none of them have worked in this situation.
Thanks,
Elijah
An arbitrary byte sequence may not be a legal UTF-8 encoding. As @Joachim Isaksson notes, there is seldom reason to convert to strings this way. If you need to store random data as a string, you should use an encoding scheme like Base64, serialize the NSData to a plist, or a similar approach. You cannot simply use a C string either, since a NUL byte is legal inside a random byte sequence but is not legal inside a C string.
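For instance, NSData has built-in Base64 support (a minimal sketch, assuming OS X 10.9 / iOS 7 or later for these methods):
NSString *encoded = [iv_data base64EncodedStringWithOptions:0]; // safe, printable representation
NSData *roundTripped = [[NSData alloc] initWithBase64EncodedString:encoded options:0];
NSLog(@"IV as Base64: %@", encoded);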
You do not need to build your own random byte creator on Mac or iOS. There's one built-in called SecRandomCopyBytes(). For example (from Properly encrypting with AES with CommonCrypto):
+ (NSData *)randomDataOfLength:(size_t)length {
    NSMutableData *data = [NSMutableData dataWithLength:length];
    int result = SecRandomCopyBytes(kSecRandomDefault,
                                    length,
                                    data.mutableBytes);
    NSAssert(result == 0, @"Unable to generate random bytes: %d",
             errno);
    return data;
}
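A hypothetical call site for the question's 16-byte IV (the class name is only a placeholder for wherever you declare the method above):
NSData *iv_data = [MyRandomHelper randomDataOfLength:16]; // MyRandomHelper is a placeholder name
NSLog(@"iv hex %@", iv_data);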
When converting NSData to NSString using a UTF-8 encoding, you won't necessarily end up with the same number of bytes, since not all binary values are valid encodings of characters. I'd say using a string for binary data is a recipe for problems.
What is the use of the string? NSData is exactly the datatype you want for storing binary data to begin with.

Converting NSData to float?

Initially I thought the code presented below was working; the "inBuffer" seems to be correctly getting 4 bytes of data, and the variable MDD_times is also correct.
NSData *inBuffer;
float MDD_times;
// FLOAT_002
inBuffer = [inFile readDataOfLength:sizeof(float)];
[inBuffer getBytes:&MDD_times length:sizeof(float)];
NSLog(@"Time: %f", MDD_times);
OK, let me expand on this a little (code above updated); this is what I am getting:
inBuffer = <3d2aaaab>
MDD_times = -1.209095e-12 (this will be 0.0416667 when treated as big-endian)
NSLog(@"Time: %f", MDD_times) = Time: -0.000000
It's probably NSLog that can't accommodate the float value. I flipped the bytes in the float to big-endian and the expected value "0.0416667" displays just fine. At least I know the NSData > float bit is working as intended.
gary
Here's some code I have to do this at a given offset in a buffer. This should work regardless of host endianness when the file is in big endian format.
union intToFloat
{
    uint32_t i;
    float fp;
};

+(float)floatAtOffset:(NSUInteger)offset
               inData:(NSData *)data
{
    assert([data length] >= offset + sizeof(float));
    union intToFloat convert;
    const uint32_t *bytes = [data bytes] + offset;
    convert.i = CFSwapInt32BigToHost(*bytes);
    const float value = convert.fp;
    return value;
}
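A hypothetical call site using the bytes from the question (the class name is only a placeholder for wherever floatAtOffset:inData: is declared):
const uint8_t rawBytes[] = { 0x3d, 0x2a, 0xaa, 0xab }; // big-endian bits from the question
NSData *data = [NSData dataWithBytes:rawBytes length:sizeof(rawBytes)];
float value = [DataHelper floatAtOffset:0 inData:data]; // DataHelper is a placeholder name
NSLog(@"%f", value);                                    // approximately 0.041667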
If you’re sure that the inFile returns data that was encoded with the same type of float and the same endianness, your code should work as expected.