NSData encoding to Unicode returning nil - objective-c

I am attempting to convert an HMAC (hashed data) to a URL-safe string for authentication purposes.
I'm having problems converting the data generated by SHA-256 hashing (using Apple's crypto library) to Unicode in both little- and big-endian: one hashed string will work in big-endian but not in little-endian, and vice versa for a different hashed string. For some hashed strings it works perfectly. I think it may have something to do with an out-of-range character or something. When I say it doesn't work, I mean it returns nil.
The code looks like this:
NSString *mystring = [[NSString alloc] initWithData:myHash encoding:NSUnicodeStringEncoding];
Is Unicode the best encoding to use? I tried UTF-8 and it also returns nil, and ASCII doesn't have all the characters: I get a few "?" where data is missing.
Really, my question is: how do I make a string from the NSData of a SHA-256 hash?
Solution:
https://github.com/eczarny/xmlrpc
NSData+Base64.h and NSData+Base64.m
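If you can target iOS 7 / OS X 10.9 or later, NSData can do the Base64 step itself, so the category is optional. A minimal sketch, assuming myHash is the NSData digest from the question (note that standard Base64 still needs + and / replaced to be URL-safe):
NSString *b64 = [myHash base64EncodedStringWithOptions:0];
// Standard Base64 is not URL-safe; swap the two offending characters.
b64 = [b64 stringByReplacingOccurrencesOfString:@"+" withString:@"-"];
b64 = [b64 stringByReplacingOccurrencesOfString:@"/" withString:@"_"];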

You first need to make a hex string out of your HMAC bytes.
You can make it like this:
// Writes the osize bytes at `bytes_in` as uppercase hex into hex_string,
// which must have room for 2 * osize + 1 characters.
void bytes_to_hexstring(const void *bytes_in, char *hex_string, size_t osize) {
    static const char hexdigits[] = "0123456789ABCDEF";
    const unsigned char *bytes = (const unsigned char *)bytes_in;
    for (size_t i = 0; i < osize; ++i) {
        const unsigned char c = *bytes++;
        *hex_string++ = hexdigits[(c >> 4) & 0xF]; // high nibble
        *hex_string++ = hexdigits[c & 0xF];        // low nibble
    }
    *hex_string = '\0'; // terminate the C string
}
// A SHA-256 digest is 32 bytes, so the hex string needs 2 * 32 + 1 chars.
char *mystring = malloc(65);
bytes_to_hexstring([myHash bytes], mystring, [myHash length]);
something like this.
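If you'd rather stay in Objective-C, here is the same idea as a minimal sketch, again assuming myHash is the NSData holding the digest:
NSMutableString *hex = [NSMutableString stringWithCapacity:[myHash length] * 2];
const unsigned char *hashBytes = [myHash bytes];
for (NSUInteger i = 0; i < [myHash length]; i++) {
    [hex appendFormat:@"%02X", hashBytes[i]]; // two uppercase hex digits per byte
}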

Related

How can I convert an iOS (objective-c) NSString value to a byte value?

Java uses .getBytes(), but I do not know how to do the same on iOS.
uint8_t *dbytes = (uint8_t *)[Value bytes];
I expected it to be something like this, but that is not what I want.
You can call -[NSString UTF8String] to convert an NSString into a const char *.
Example: const char *bytes = [Value UTF8String];
Reference: Apple documentation
It is not clear what encoding you expect your results to be in.
Java's getBytes() returns the String's bytes in the platform's default encoding, so use getBytes(charsetName) instead, providing the charset explicitly. Both sides should agree on a specific encoding to avoid interoperability issues.
On the iOS side, you can use -[NSString dataUsingEncoding:]
e.g.
NSData *bytes = [@"Hello" dataUsingEncoding:NSUTF8StringEncoding];
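To get back at the raw bytes and their count afterwards (a usage sketch):
const uint8_t *raw = (const uint8_t *)[bytes bytes]; // pointer to the encoded bytes
NSUInteger count = [bytes length];                   // 5 for @"Hello" in UTF-8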
To get a single byte value from a hex string: first use UTF8String and strtoul to convert to an unsigned long, then mask with 0xff to get a Byte, as below:
NSString *str = @"xxx"; // a hex string, e.g. @"a1"
unsigned long red = strtoul([str UTF8String], 0, 16);
Byte bt = (Byte)(0xff & red); // for @"a1", bt == 0xA1 == 161

Can stringEncodingForData:encodingOptions:convertedString:usedLossyConversion: return NSUTF16StringEncoding or NSUTF32StringEncoding?

I'd like to know if calling stringEncodingForData:encodingOptions:convertedString:usedLossyConversion: can return NSUTF16StringEncoding, NSUTF32StringEncoding or any of their variants?
The reason I'm asking is because of this documentation note on cStringUsingEncoding::
Special Considerations
UTF-16 and UTF-32 are not considered to be C string encodings, and should not be used with this method—the results of passing NSUTF16StringEncoding, NSUTF32StringEncoding, or any of their variants are undefined.
So I understand that creating a C string with UTF-16 or UTF-32 is unsupported, but I'm not sure if attempting String Encoding Detection with stringEncodingForData:encodingOptions:convertedString:usedLossyConversion: may return UTF-16 and UTF-32 or not.
An example scenario (adapted from SSZipArchive.m) might be:
// name is a null-terminated C string built with `fread` from stdio.h:
char *name = (char *)malloc(size_name + 1);
size_t read = fread(name, 1, size_name + 1, file);
name[size_name] = '\0';
// dataName is the data object of name
NSData *dataName = [NSData dataWithBytes:(const void *)name length:sizeof(unsigned char) * size_name];
// stringName is the string object of dataName
NSString *stringName = nil;
NSStringEncoding encoding = [NSString stringEncodingForData:dataName encodingOptions:nil convertedString:&stringName usedLossyConversion:nil];
In the above code, can encoding be NSUTF16StringEncoding, NSUTF32StringEncoding or any of their variants?
Platforms: macOS 10.10+, iOS 8.0+, watchOS 2.0+, tvOS 9.0+.
Yes, if the string is encoded using one of those encodings. The notes about C strings are specific to C strings. An NSString is not a C string, and the method you're describing doesn't work on C strings; it works on arbitrary data that may be encoded in a wide variety of ways.
As an example:
#import <Foundation/Foundation.h>
int main(int argc, const char * argv[]) {
    @autoreleasepool {
        NSData *data = [@"test" dataUsingEncoding:NSUTF16StringEncoding];
        NSStringEncoding encoding = [NSString stringEncodingForData:data
                                                    encodingOptions:nil
                                                    convertedString:nil
                                               usedLossyConversion:nil];
        NSLog(@"%ld == %ld", (unsigned long)encoding,
              (unsigned long)NSUTF16StringEncoding);
    }
    return 0;
}
// Output: 10 == 10
This said, in your specific example, if name is really what it says it is, "a null-terminated C string," then it could never be UTF-16, because C strings cannot be encoded in UTF-16. C strings are \0 terminated, and \0 is a very common character in UTF-16. Without seeing more code, however, I would not gamble on whether that comment is accurate.
If your real question here is "given an arbitrary c-string-safe encoding, is it possible that stringEncodingForData: will return a not-c-string-safe encoding," then the answer is "yes, it could, and it's definitely not promised that it won't even if it doesn't today." If you need to prevent that, I recommend using NSStringEncodingDetectionSuggestedEncodingsKey and ...UseOnlySuggestedEncodingsKey to force it to be an encoding you can handle. (You could also use ...DisallowedEncodingsKey to prevent specific multi-byte encodings, but that wouldn't be as robust.)
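A hedged sketch of that options dictionary (the two keys are Foundation's; the particular suggested encodings here are just an example):
NSDictionary *opts = @{
    NSStringEncodingDetectionSuggestedEncodingsKey: @[@(NSUTF8StringEncoding), @(NSISOLatin1StringEncoding)],
    NSStringEncodingDetectionUseOnlySuggestedEncodingsKey: @YES
};
NSString *stringName = nil;
NSStringEncoding encoding = [NSString stringEncodingForData:dataName
                                            encodingOptions:opts
                                            convertedString:&stringName
                                       usedLossyConversion:NULL];
// encoding is now one of the suggested (C-string-safe) encodings, or 0 if
// the data could not be converted with either of them.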

Troubles with casting bytes from NSInputStream?

I have an OpenSSL server and an Objective-C client. I send a message like this:
uint32_t testD = 161;
err = SSL_write(ssl_, &testD, sizeof(uint32_t));
and read it by NSInputStream like
case NSStreamEventHasBytesAvailable:
{
    uint8_t buffer[4];
    int len;
    while ([inStream hasBytesAvailable])
    {
        len = [inStream read:buffer maxLength:sizeof(buffer)];
        if (len > 0)
        {
            NSString *output = [[NSString alloc] initWithBytes:buffer length:len encoding:NSASCIIStringEncoding];
            NSData *theData = [[NSData alloc] initWithBytes:buffer length:len];
            if (nil != output)
            {
                char buff;
                [theData getBytes:&buff length:1];
                uint32_t temp = (uint32_t)buffer;
            }
            ...
So in output I have "¡" (character 161), in buff I have '\xa1', and in temp a very big number, but I actually need 161 in temp.
I read that '\xa1' is also 161, but I can't cast it to uint32_t.
What is the problem?
ANSWER:
The problem was in casting. This works fine for me:
unsigned char buff;
int temp = buff;
or
char buff;
int b = (unsigned char) buff;
No encoding is used by SSL_write(), and \xa1 == 161 is a mathematical identity, not the result of any encoding process. As you're successfully recovering \xa1, clearly no decoding is used by NSInputStream either.
It seems to me that you're casting the address of the buffer rather than its contents, which is why you get a high value that varies with compilation.
In addition you are possibly over-running the data by reading whatever is available and then only consuming four bytes of it: less in fact because you're incorrectly testing len >= 1 rather than len >= 4.
You should:
Use a buffer of exactly four bytes. No need to allocate it dynamically: you can declare it as a local array.
Read until you have read four bytes. This requires a loop (see the sketch after this list).
Change the casting syntax (don't ask me how, I'm no Objective-C expert, but the code that recovers buff looks like a good start), so that you get the content of the buffer instead of the address.
After that you may then have endian issues.
Nothing to do with encoding.
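A minimal sketch of that read loop (assuming inStream is the NSInputStream from the question; the memcpy replaces the bad pointer cast):
uint8_t buffer[4];
NSUInteger total = 0;
// Keep reading until all four bytes of the uint32_t have arrived.
while (total < sizeof(buffer)) {
    NSInteger n = [inStream read:buffer + total maxLength:sizeof(buffer) - total];
    if (n <= 0) break; // stream closed or errored
    total += (NSUInteger)n;
}
if (total == sizeof(buffer)) {
    uint32_t temp;
    memcpy(&temp, buffer, sizeof(temp)); // copy the contents, not the address
}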
What encoding is used in SSL_write and NSInputStream?
There is no encoding. It's bytes in and bytes out.
I think you are looking for network byte order/endianess.
Network byte order is big endian. So your code would become:
uint32_t testD = 161;
uint32_t be = htonl(testD);
err = SSL_write(ssl_, &be, sizeof(be));
Here's the description of htonl from the htonl(3) man pages:
The htonl() function converts the unsigned integer hostlong from host byte order to network byte order.
To convert back, you would use ntohl.
I'm not sure if Cocoa/CocoaTouch offers a replacement for htonl and ntohl. So you might have to use them in your iPhone projects, too. See, for example, Using ntohl and htonl problems on iPhone.
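On the client, after reading all four bytes (as in the loop sketched earlier), the conversion back might look like this:
#include <arpa/inet.h> // ntohl
uint32_t be;
memcpy(&be, buffer, sizeof(be)); // buffer holds the 4 bytes read from the stream
uint32_t testD = ntohl(be);      // network (big-endian) to host order; testD == 161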

How to convert NSString to C string?

I know that this question is a possible duplicate, but even after looking at some Google tutorials and questions on this forum, none of them gives me a decent answer on this subject.
I have:
NSString *str = @"text";
And I would like to do something like:
char cstring [512] = str;
(this only shows what I want to do; after looking at Apple's NSString Class Reference I still didn't know what to use).
Up to now I have:
char command[512] = [[NSString stringWithFormat:@"text"] cStringUsingEncoding:NSUTF8StringEncoding];
Still, with that I get errors.
Any solution?
try const char *command = [str UTF8String];
A C string is returned as a pointer, not as an array of characters, so to use it you can change your variable to a pointer.
const char *command = [theString cStringUsingEncoding:NSUTF8StringEncoding];
Since you want the UTF8 encoding, you can use the UTF8String convenience method.
const char *command = [theString UTF8String];
If you need the data to be stored in a character array, you can use the getCString:maxLength:encoding: method, passing the array as the buffer. This will allow you to store the string directly to the buffer, and will tell you if the buffer is too small.
char command[512];
if (![theString getCString:command maxLength:sizeof(command)/sizeof(*command) encoding:NSUTF8StringEncoding]) {
    NSLog(@"Command buffer too small");
}

Casting NSString to unsigned char *

I'm trying to use a function that has the following signature to sign an HTTP request:
extern void hmac_sha1(const unsigned char *inText, int inTextLength, unsigned char* inKey, const unsigned int inKeyLength, unsigned char *outDigest);
And this is the method I wrote to use it:
- (NSString *)sign:(NSString *)stringToSign {
    NSString *secretKey = @"xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx";
    const unsigned char *inText = (unsigned char *)[stringToSign UTF8String];
    int inTextLength = [stringToSign length];
    unsigned char *inKey = (unsigned char *)[secretKey UTF8String];
    const unsigned int inKeyLength = (unsigned int)[secretKey length];
    unsigned char *outDigest;
    hmac_sha1(inText, inTextLength, inKey, inKeyLength, outDigest);
    NSString *output = [NSString stringWithUTF8String:(const char *)outDigest];
    return output;
}
The problem is I'm sure this is not the way I'm supposed to do this casting, as inside the hmac_sha1 function I get an EXC_BAD_ACCESS exception.
Since I am new to Objective-C and have close to no experience in C (surprise!) I don't really know what to search for. Any tips on how I can start solving this?
Thanks in advance!
BTW, I got the reference for this function here in stackoverflow.
It looks like the problem is not with the casting, but with outDigest. The fifth argument to hmac_sha1 should point to an already allocated buffer of size 20 bytes (I think).
If you change the line that says
unsigned char *outDigest;
to say
#define HMACSHA1_DIGEST_SIZE 20
void *outDigest = malloc(HMACSHA1_DIGEST_SIZE);
That should get you past the crash inside hmac_sha1.
Then you've got the problem of converting the data at outDigest into an NSString. It looks like hmac_sha1 will put 20 bytes of random-looking data at outDigest, and not a null-terminated UTF-8 string, so stringWithUTF8String: won't work. You might want to use something like this instead if you have to return an NSString:
NSString *output = [[NSString alloc] initWithBytesNoCopy:outDigest
                                                  length:HMACSHA1_DIGEST_SIZE
                                                encoding:NSASCIIStringEncoding
                                            freeWhenDone:YES];
I don't think NSString is really the right type for the digest, so it might be worth changing your method to return an NSData if you can.
This wasn't part of your question, but it's a bug nonetheless: you shouldn't use -length to get the byte count of a UTF-8 string. That method returns the number of Unicode characters in the string, not the number of bytes. What you want is -lengthOfBytesUsingEncoding:.
NSUInteger byteCount = [stringToSign lengthOfBytesUsingEncoding:NSUTF8StringEncoding];
Also be aware that the result does not account for a terminating NULL character.
Are you sure you don't need to allocate some memory for outDigest before calling hmac_sha1? Since you pass in a pointer, rather than a pointer to a pointer, there's no way that the memory can be allocated inside the routine.
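Putting those fixes together, here is a hedged sketch of the whole method using Apple's CommonCrypto CCHmac in place of the standalone hmac_sha1, returning the digest as hex (the key literal is a placeholder):
#import <CommonCrypto/CommonCrypto.h>

- (NSString *)sign:(NSString *)stringToSign {
    NSData *message = [stringToSign dataUsingEncoding:NSUTF8StringEncoding];
    NSData *key = [@"xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx" dataUsingEncoding:NSUTF8StringEncoding];
    unsigned char digest[CC_SHA1_DIGEST_LENGTH]; // 20 bytes, allocated up front
    CCHmac(kCCHmacAlgSHA1, [key bytes], [key length], [message bytes], [message length], digest);
    // The digest is raw bytes, not text, so hex-encode it rather than
    // treating it as a UTF-8 C string.
    NSMutableString *hex = [NSMutableString stringWithCapacity:sizeof(digest) * 2];
    for (size_t i = 0; i < sizeof(digest); i++) {
        [hex appendFormat:@"%02x", digest[i]];
    }
    return hex;
}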