NSString to unsigned char [] - Objective-C

I would like to convert
NSString *myString = @"0x10 0x1c 0x37 0x00"; // Acquired by reading a text file using NSString methods..
to
unsigned char convertedfrommyString[] = { 0x10, 0x1c, 0x37, 0x00 };
My goal is to acquire the bytes and then swap them using this code:
unsigned char convertedfrommyString[] = { 0x10, 0x1c, 0x37, 0x00 };
int data = *((int *) convertedfrommyString);
NSLog(@"log = %08x", data);
The output should be:
log = 00371c10
Any help?
EDIT
Thanks to both Jan Baptiste Younès and Sven I found the way to understand my problem and solved it with this code:
NSString *myString = [[@"0x10 0x1c 0x37 0x00" stringByReplacingOccurrencesOfString:@"0x" withString:@""] stringByReplacingOccurrencesOfString:@" " withString:@""];
unsigned result = 0;
NSScanner *scanner = [NSScanner scannerWithString:myString];
[scanner scanHexInt:&result];
int reverse = NSSwapInt(result);
NSLog(@"scanner: %8u", result);
NSLog(@"bytes:%08x", result);
NSLog(@"reverse:%08x (that is what I need!)", reverse);
Really OK!
But can I accept two answers?

That's more than a simple conversion; you need to actually parse the values from your string. You can use NSScanner to do this.
NSScanner *scanner = [NSScanner scannerWithString:@"0x10 0x1c 0x37 0x00"];
unsigned char convertedfrommyString[4];
unsigned index = 0;
while (![scanner isAtEnd]) {
    unsigned value = 0;
    if (![scanner scanHexInt:&value]) {
        // invalid value
        break;
    }
    convertedfrommyString[index++] = value;
}
Of course this sample is missing error handling (the individual values might not fit into an unsigned char, or there could be more than four).
But this solves only half your problem. The other issue is converting this to an int. You did this by casting the unsigned char pointer to an int pointer. This is not portable and also not legal in C. To always get the result you want you should instead use bit shifts to assemble your int. So inside your loop you could do
result = result | (value << i);
i += 8;
instead of putting the values inside an unsigned char array. For this, result and i should both be initialized to zero.
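Putting the two pieces together, a minimal sketch (assuming the input keeps the "0xNN 0xNN ..." form shown in the question; scanHexInt: skips the 0x prefix and whitespace on its own):
NSScanner *scanner = [NSScanner scannerWithString:@"0x10 0x1c 0x37 0x00"];
unsigned result = 0;
int i = 0;
while (![scanner isAtEnd] && i < 32) {
    unsigned value = 0;
    if (![scanner scanHexInt:&value]) {
        break; // invalid token
    }
    result = result | (value << i); // first byte becomes the least significant
    i += 8;
}
NSLog(@"log = %08x", result); // prints log = 00371c10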

You may cut your original string at spaces and use the solution given here: Objective-C parse hex string to integer. You can also use scanUpToString:intoString: to parse up to space characters.
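For example, a rough sketch of the split-at-spaces idea (this uses componentsSeparatedByString: and strtoul rather than the exact code from the linked answer):
// strtoul comes from <stdlib.h>
NSArray *parts = [@"0x10 0x1c 0x37 0x00" componentsSeparatedByString:@" "];
unsigned char bytes[4];
NSUInteger i = 0;
for (NSString *part in parts) {
    if (i >= sizeof(bytes)) break;                                     // ignore extra tokens
    bytes[i++] = (unsigned char)strtoul([part UTF8String], NULL, 16);  // base 16 accepts the 0x prefix
}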

Related

Convert Hex string to IEEE 754 float

I am trying to convert an NSString with hex values into a float value.
NSString *hexString = @"3f9d70a4";
The float value should be = 1.230.
Some ways I have tried to solve this are:
1. NSScanner
-(unsigned int)strfloatvalue:(NSString *)str
{
    float outVal;
    NSString *newStr = [NSString stringWithFormat:@"0x%@", str];
    NSScanner* scanner = [NSScanner scannerWithString:newStr];
    NSLog(@"string %@", newStr);
    bool test = [scanner scanHexFloat:&outVal];
    NSLog(@"scanner result %d = %a (or %f)", test, outVal, outVal);
    return outVal;
}
results:
string 0x3f9d70a4
scanner result 1 = 0x1.fceb86p+29 (or 1067282624.000000)
2. Casting pointers
NSNumber * xPtr = [NSNumber numberWithFloat:[(NSNumber *)@"3f9d70a4" floatValue]];
result: 3.000000
What you have is not a "hexadecimal float", as is produced by the %a string format and scanned by scanHexFloat:, but the hexadecimal representation of a 32-bit floating-point value - i.e. the actual bits.
To convert this back to a float in C requires messing with the type system - to give you access to the bytes that make up a floating-point value. You can do this with a union:
typedef union { float f; uint32_t i; } FloatInt;
This type is similar to a struct but the fields are overlaid on top of each other. You should understand that doing this kind of manipulation requires you understand the storage formats, are aware of endian order, etc. Do not do this lightly.
Now you have the above type you can scan a hexadecimal integer and interpret the resultant bytes as a floating-point number:
FloatInt fl;
NSScanner *scanner = [NSScanner scannerWithString:@"3f9d70a4"];
if ([scanner scanHexInt:&fl.i]) // scan into the i field
{
    NSLog(@"%x -> %f", fl.i, fl.f); // display the f field, interpreting the bytes of i as a float
}
else
{
    // parse error
}
This works, but again consider carefully what you are doing.
HTH
I think a better solution is a workaround like this:
-(float) getFloat:(NSInteger*)pIndex
{
    NSInteger index = *pIndex;
    NSData* data = [self subDataFromIndex:&index withLength:4];
    *pIndex = index;
    uint32_t hostData = CFSwapInt32BigToHost(*(const uint32_t *)[data bytes]);
    return *(float *)(&hostData);
}
Here the receiver is an NSData which represents the number in hex format, and the input parameter is a pointer to the index of the element within that NSData.
So basically you are trying to turn an NSString into a C float; there's an old-fashioned way to do that!
NSString* hexString = @"3f9d70a4";
const char* cHexString = [hexString UTF8String];
long l = strtol(cHexString, NULL, 16);
float f = *((float *) &l);
// f = 1.23
For more detail please see this answer.

Objective-C: Convert Hex Strings to Integers to Compare Which is Greater

My goal is to compare two hex strings and determine which number is higher. I assume I need to convert those hex strings to integers to be able to compare them mathematically, but the conversion to unsigned isn't working. Here's what I've tried:
NSString *firstHex = @"35c1f029684fe";
NSString *secondHex = @"35c1f029684ff";
unsigned first = 0;
unsigned second = 0;
NSScanner *firstScanner = [NSScanner scannerWithString:firstHex];
NSScanner *secondScanner = [NSScanner scannerWithString:secondHex];
[firstScanner scanHexInt:&first];
[secondScanner scanHexInt:&second];
NSLog(#"First: %d",first);
NSLog(#"Second: %d",second);
But the log output gives me:
First: -1
Second: -1
I can't figure out what I'm doing wrong. Am I using NSScanner correctly here? Thanks in advance.
Your hex numbers are 13 digits long - 52 binary bits. This is longer than 32 bits, so use unsigned long long variables and scanHexLongLong: instead.
For the sake of completeness, here's the working code using the advice from the above answer:
NSString *firstHex = @"35c1f029684fe";
NSString *secondHex = @"35c1f029684ff";
unsigned long long first = 0;
unsigned long long second = 0;
NSScanner *firstScanner = [NSScanner scannerWithString:firstHex];
NSScanner *secondScanner = [NSScanner scannerWithString:secondHex];
[firstScanner scanHexLongLong:&first];
[secondScanner scanHexLongLong:&second];
NSLog(@"First: %llu", first);
NSLog(@"Second: %llu", second);
if (first > second) {
    NSLog(@"First is greater");
} else {
    NSLog(@"Second is greater");
}
It must be faster to just find out which one is larger as a string:
the longer string is bigger (ignoring leading 0's);
if they are the same length then you can compare them character by character, repeating for each char until they differ.
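A rough sketch of that idea (assuming both strings are lowercase hex with no 0x prefix and no leading zeros):
NSString *a = @"35c1f029684fe";
NSString *b = @"35c1f029684ff";
NSComparisonResult cmp;
if (a.length != b.length) {
    // more digits means a bigger number
    cmp = (a.length > b.length) ? NSOrderedDescending : NSOrderedAscending;
} else {
    // same length: for lowercase hex digits, lexicographic order matches numeric order
    cmp = [a compare:b];
}
NSLog(@"%@", cmp == NSOrderedDescending ? @"First is greater" :
      (cmp == NSOrderedAscending ? @"Second is greater" : @"They are equal"));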

In Objective-C, how to print out N spaces? (using stringWithCharacters)

The following was tried to print out N spaces (12 in the example):
NSLog(@"hello%@world", [NSString stringWithCharacters:" " length:12]);
const unichar arrayChars[] = {' '};
NSLog(@"hello%@world", [NSString stringWithCharacters:arrayChars length:12]);
const unichar oneChar = ' ';
NSLog(@"hello%@world", [NSString stringWithCharacters:&oneChar length:12]);
But they all print out weird things such as hello ÔÅÓñüÔÅ®Óñü®ÓüÅ®ÓñüÔ®ÓüÔÅ®world... I thought a "char array" is the same as a "string" and the same as a "pointer to a character"? The API spec says it is to be a "C array of Unicode characters" (by Unicode, is it UTF8? if it is, then it should be compatible with ASCII)... How to make it work and why those 3 ways won't work?
You can use %*s to specify the width.
NSLog(#"Hello%*sWorld", 12, "");
Reference:
A field width, or precision, or both, may be indicated by an asterisk
( '*' ). In this case an argument of type int supplies the field width
or precision. Applications shall ensure that arguments specifying
field width, or precision, or both appear in that order before the
argument, if any, to be converted.
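In other words, the width can come from a variable at run time; a small sketch:
int n = 12;                       // any width computed at run time
NSLog(@"hello%*sworld", n, "");   // pads the empty C string to n spaces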
This will get you what you want:
NSLog(#"hello%#world", [#"" stringByPaddingToLength:12 withString:#" " startingAtIndex:0]);
I think the issue you have is you are misinterpreting what +(NSString *)stringWithCharacters:length: is supposed to do. It's not supposed to repeat the characters, but instead copy them from the array into a string.
So in your case you only have a single ' ' in the array, meaning the other 11 characters will be taken from whatever follows arrayChars in memory.
If you want to print out a pattern of n spaces, the easiest way to do that would be to use -(NSString *)stringByPaddingToLength:withString:startingAtIndex:, i.e. creating something like this.
NSString *formatString = @"Hello%@World";
NSString *paddingString = [[NSString string] stringByPaddingToLength: n withString: @" " startingAtIndex: 0];
NSLog(formatString, paddingString);
This is probably the fastest method:
NSString *spacesWithLength(int nSpaces)
{
    char UTF8Arr[nSpaces + 1];
    memset(UTF8Arr, ' ', nSpaces * sizeof(*UTF8Arr));
    UTF8Arr[nSpaces] = '\0';
    return [NSString stringWithUTF8String:UTF8Arr];
}
The reason your current code isn't working is because +stringWithCharacters: expects an array with a length of characters of 12, while your array is only 1 character in length {' '}. So, to fix, you must create a buffer for your array (in this case, we use a char array, not a unichar, because we can easily memset a char array, but not a unichar array).
The method I provided above is probably the fastest that is possible with a dynamic length. If you are willing to use GCC extensions, and you have a fixed size array of spaces you need, you can do this:
NSString *spacesWithLength7()
{
    unichar characters[] = { [0 ... 7] = ' ' };
    return [NSString stringWithCharacters:characters length:7];
}
Unfortunately, that extension doesn't work with variables, so it must be a constant.
Through the magic of GCC extensions and preprocessor macros, I give you.... THE REPEATENATOR! Simply pass in a string (or a char), and it will do the rest! Buy now, costs you only $19.95, operators are standing by! (Based on the idea suggested by @JeremyL)
// step 1: determine if the input is a char, a C string, or an NSString.
// step 2: repeat that char or string
// step 3: return that as an NSString
#define repeat(inp, cnt) __rep_func__(@encode(typeof(inp)), inp, cnt)
// arg list: (char *type encoding, int / char * / id input, int n)
static inline NSString *__rep_func__(const char *typ, ...)
{
    const char *str = NULL;
    char charBuf[2] = { 0, 0 }; // holds a single-char input; must outlive the block below
    int n;
    {
        va_list args;
        va_start(args, typ);
        if (typ[0] == 'i') {
            charBuf[0] = (char) va_arg(args, int);
            str = charBuf;
        } else if (typ[0] == '@') {
            str = [va_arg(args, id) UTF8String];
        } else {
            str = va_arg(args, const char *);
        }
        n = va_arg(args, int);
        va_end(args);
    }
    int len = strlen(str);
    char outbuf[(len * n) + 1];
    // now copy the content
    for (int i = 0; i < n; i++) {
        for (int j = 0; j < len; j++) {
            outbuf[(i * len) + j] = str[j];
        }
    }
    outbuf[(len * n)] = '\0';
    return [NSString stringWithUTF8String:outbuf];
}
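Usage might look like this (hypothetical calls, assuming the macro above is in scope):
NSLog(@"%@", repeat('-', 10));   // ----------
NSLog(@"%@", repeat("ab", 3));   // ababab
NSLog(@"%@", repeat(@"xy", 2));  // xyxy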
The stringWithCharacters:length: method makes an NSString (or an instance of a subclass of NSString) using the first length characters in the C array. It does not repeat the given array of characters until it reaches the length.
The output you are seeing is the area of memory 12 Unicode characters long starting at the location of your passed 1 Unicode character array.
This should work:
const unichar spaces[12] = {' ', ' ', ' ', ' ', ' ', ' ', ' ', ' ', ' ', ' ', ' ', ' '};
NSLog(@"hello%@world", [NSString stringWithCharacters:spaces length:12]);

how to extract data from cocoa iPhone sax xml parsing routine

I'm trying to read in and parse an xml document in an iPhone app. I begin parsing and then use the override method:
static void startElementSAX(void *ctx, const xmlChar *localname, const xmlChar *prefix, const xmlChar *URI,
int nb_namespaces, const xmlChar **namespaces, int nb_attributes, int nb_defaulted, const xmlChar **attributes)
I then try to convert the attributes to a string with:
NSString *str1 = [[NSString alloc] initWithCString:attributes encoding:NSUTF8StringEncoding];
Why does the attributes parameter have two ** in front of it? And why, when trying to extract the data and convert it to a string with the above code, do I get the warning:
passing argument 1 of 'initWithCString:encoding:' from incompatible pointer type.
The documentation for libxml's start element callback states that the pointer is to an array that holds 5 values for each attribute (the number of attributes is returned in nb_attributes). This means that every 5th value in the array starts a new attribute item.
The five items for each attribute are:
localname (the name of the attribute)
prefix (the namespace prefix of the attribute)
URI
value start (a pointer to the start of the xmlChar string for the value)
value end (a pointer to the end of the xmlChar string for the value)
So you need to step through the array, get each value out of the items for the first attribute, then use the start value pointer to get the xmlChar string that is length = end - start. Then start over with the next attribute till you read in nb_attributes worth.
If that makes your head ache then I strongly suggest you switch to Apple's NSXMLParser (link may require login, or use this link NSXMLParser). In which case you would get the attributes as an NSDictionary. To get all the attributes out of it you could do the following:
for (NSString *attributeName in [attributeDict allKeys]) {
    NSString *attributeValue = [attributeDict objectForKey:attributeName];
    // do something here with attributeName and attributeValue
}
If you have access to the iPhone developer site then look at the example SeismicXML.
The sample is great except for two things:
you need to bump 'i' by 5 after each loop since there are 5 items for each attribute.
doing strlen() on both begin and end is expensive; it's easier to simply subtract begin from end
for (int i = 0; i < nb_attributes*5; i += 5)
{
    const char *attr = (const char *)attributes[i];
    const char *begin = (const char *)attributes[i + 3];
    const char *end = (const char *)attributes[i + 4];
    int vlen = end - begin;
    char val[vlen + 1];
    strncpy(val, begin, vlen);
    val[vlen] = '\0';
    NSLog(@"attribute %s = '%s'", attr, val);
}
The accepted answer's explanation is correct, but it's helpful to view some example code too. Here is just one way to extract the value from the attributes; at least it works when I tested it. I'm far from being a C guru though.
for (int i = 0; i < nb_attributes; i += 5) {
    const char *attr = (const char *)attributes[i];
    const char *begin = (const char *)attributes[i + 3];
    const char *end = (const char *)attributes[i + 4];
    int vlen = strlen(begin) - strlen(end);
    char val[vlen + 1];
    strncpy(val, begin, vlen);
    val[vlen] = '\0';
    NSLog(@"attribute %s: %d = %s", attr, i, val);
}
NSXMLParser is nice, but from what I can tell, it downloads the entire XML before processing. Using libxml, it can read in chunks at a time. This allows greater flexibility, but has a higher learning curve.
The '**' notation means "pointer to a pointer." In C/C++, a "string" is represented by an array of characters. An array is actually just a pointer under the covers, so a string in C/C++ can actually be declared as either "char[]" or "char*". In a parameter list, the [] notation compiles down to a pointer.
A common example of this is the typical "main" function in C/C++:
int main(int argc, char **argv)
Which is equivalent to:
int main(int argc, char *argv[])
argv is an array of char* "strings" (the command-line arguments to the program).
I can't provide an example at the moment, but it looks like you need to iterate over attributes to access the individual strings. For example, attributes[0] would be the first attribute string (an xmlChar*). You should be able to convert each individual attribute to an NSString.
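Building on that, a small sketch of converting the first attribute's localname (the cast is needed because xmlChar is an unsigned char, while initWithCString:encoding: expects const char *):
if (nb_attributes > 0) {
    // attributes[0] is the localname of the first attribute
    NSString *first = [[NSString alloc] initWithCString:(const char *)attributes[0]
                                               encoding:NSUTF8StringEncoding];
    NSLog(@"first attribute name: %@", first);
}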
const xmlChar **namespaces is an array of C strings (int nb_namespaces tells you how many). If you want each namespace as an NSString, you could do something like the following:
NSMutableArray *namespaceStrings = [[NSMutableArray alloc] init];
int i;
for (i = 0; i < nb_namespaces; i++) {
    NSString *namespace = [[NSString alloc] initWithCString:(const char *)namespaces[i] encoding:NSUTF8StringEncoding];
    [namespaceStrings addObject:namespace];
}
The initWithCString:encoding: method is expecting a const char *, which is a pointer to the first char of a C string.
xmlChar ** means pointer to a pointer to an xmlChar (the first char in the first C string), hence the incompatible pointer warning.

Casting NSString to unsigned char *

I'm trying to use a function that has the following signature to sign a HTTP request:
extern void hmac_sha1(const unsigned char *inText, int inTextLength, unsigned char* inKey, const unsigned int inKeyLength, unsigned char *outDigest);
And this is the method I wrote to use it:
- (NSString *)sign: (NSString *)stringToSign {
    NSString *secretKey = @"xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx";
    const unsigned char *inText = (unsigned char *)[stringToSign UTF8String];
    int inTextLength = [stringToSign length];
    unsigned char *inKey = (unsigned char *)[secretKey UTF8String];
    const unsigned int inKeyLength = (unsigned int)[secretKey length];
    unsigned char *outDigest;
    hmac_sha1(inText, inTextLength, inKey, inKeyLength, outDigest);
    NSString *output = [NSString stringWithUTF8String:(const char *)outDigest];
    return output;
}
The problem is I'm sure this is not the way I'm supposed to do this casting, as inside this hmac_sha1 function I get a EXC_BAD_ACCESS exception.
Since I am new to Objective-C and have close to no experience in C (surprise!) I don't really know what to search for. Any tips on how I can start solving this?
Thanks in advance!
BTW, I got the reference for this function here in stackoverflow.
It looks like the problem is not with the casting, but with outDigest. The fifth argument to hmac_sha1 should point to an already allocated buffer of size 20 bytes (I think).
If you change the line that says
unsigned char *outDigest;
to say
#define HMACSHA1_DIGEST_SIZE 20
void *outDigest = malloc(HMACSHA1_DIGEST_SIZE);
That should get you past the crash inside hmac_sha1.
Then you've got the problem of converting the data at outDigest into an NSString. It looks like hmac_sha1 will put 20 bytes of random-looking data at outDigest, and not a null terminated UTF-8 string, so stringWithUTF8String: won't work. You might want to use something like this instead if you have to return an NSString:
NSString *output = [[NSString alloc] initWithBytesNoCopy:outDigest
                                                  length:HMACSHA1_DIGEST_SIZE
                                                encoding:NSASCIIStringEncoding
                                            freeWhenDone:YES];
I don't think NSString is really the right type for the digest, so it might be worth changing your method to return an NSData if you can.
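For instance, a sketch of returning the raw digest instead (this hands ownership of the malloc'd buffer to the NSData):
NSData *digest = [NSData dataWithBytesNoCopy:outDigest
                                      length:HMACSHA1_DIGEST_SIZE
                                freeWhenDone:YES];
return digest; // the method's return type would change to NSData *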
This wasn't part of your question but it's a bug nonetheless: you shouldn't use -length to get the byte count of a UTF-8 string. That method returns the number of Unicode characters in the string, not the number of bytes. What you want is -lengthOfBytesUsingEncoding:.
NSUInteger byteCount = [stringToSign lengthOfBytesUsingEncoding:NSUTF8StringEncoding];
Also be aware that the result does not account for a terminating NULL character.
Are you sure you don't need to allocate some memory for outDigest before calling hmac_sha1? Since you pass in a pointer, rather than a pointer to a pointer, there's no way that the memory can be allocated inside the routine.
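For example, a minimal sketch (assuming the usual 20-byte SHA-1 digest size):
unsigned char outDigest[20]; // SHA-1 digests are 20 bytes
hmac_sha1(inText, inTextLength, inKey, inKeyLength, outDigest);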