SHA256 signature generation not matching: Java vs Objective-C

I am converting Java code to Objective-C and need the request signature, generated with HMAC-SHA256, to match.
Here are my inputs and outputs:
Java:
SecretKeySpec key = new SecretKeySpec(client_secret.getBytes("UTF-8"), "HmacSHA256");
Mac mac = Mac.getInstance("HmacSHA256");
mac.init(key);
byte[] bytes = mac.doFinal(baseString.getBytes("UTF-8"));
String signature = new String(Base64.encodeBase64(bytes));
log.info("signature="+signature);
Output is: signature=q4T4uuz482U+guKa8oRn8Enq9xJjSRYvQlYxF6TSAFQ=
Obj C:
- (NSString *)HMACWithSec:(NSString *)secret data:(NSString *)data {
    const char *cKey = [secret cStringUsingEncoding:NSASCIIStringEncoding];
    //Making it NSUTF8StringEncoding does not help
    const char *cData = [data cStringUsingEncoding:NSASCIIStringEncoding];
    unsigned char cHMAC[CC_SHA256_DIGEST_LENGTH];
    CCHmac(kCCHmacAlgSHA256, cKey, strlen(cKey), cData, strlen(cData), cHMAC);
    NSData *hash = [[NSData alloc] initWithBytes:cHMAC length:CC_SHA256_DIGEST_LENGTH];
    // I tried making a hex string out of the bytes and then converting it back to ASCII or UTF-8; that did not work.
    NSString *signature = [hash base64EncodedStringWithOptions:0];
    // All base64 methods return the same. I found this one the most convenient.
    NSLog(@"%@", signature);
    return signature;
}
Output is: HcqgsyTj0DRYlGT1QV9L48LcvX9Zc+eX8ShXjJQEDKc=
I tried the following too:
NSString *s8 = [NSString stringWithCharacters:(const char *)cHMAC length:sizeof(cHMAC)];
Output is arbitrary chars.
Explanation:
The secret and data inputs are the same in both; I verified this via cKey and cData. After I get the bytes into cHMAC in Objective-C, wrapping them in an NSData and base64-encoding it does not generate the same string.
Question:
What did I miss? Any insight helps.
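For comparison, here is a minimal sketch of a byte-exact variant that avoids the cString/strlen pitfalls discussed in the answers further down this page; it assumes the Java side really feeds the UTF-8 bytes of the same secret and base string (the method name HMACSHA256Base64WithSecret is my own):
#import <CommonCrypto/CommonHMAC.h>

// Sketch only: take key and message lengths from NSData, not strlen(),
// and use UTF-8 on both sides to mirror getBytes("UTF-8") in Java.
- (NSString *)HMACSHA256Base64WithSecret:(NSString *)secret data:(NSString *)data {
    NSData *keyData = [secret dataUsingEncoding:NSUTF8StringEncoding];
    NSData *msgData = [data dataUsingEncoding:NSUTF8StringEncoding];
    unsigned char cHMAC[CC_SHA256_DIGEST_LENGTH];
    CCHmac(kCCHmacAlgSHA256, keyData.bytes, keyData.length,
           msgData.bytes, msgData.length, cHMAC);
    NSData *hash = [NSData dataWithBytes:cHMAC length:CC_SHA256_DIGEST_LENGTH];
    return [hash base64EncodedStringWithOptions:0];
}
If the two implementations still disagree after this, compare the raw key and message bytes on both sides before suspecting the HMAC itself.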

Related

How does AES-128 CBC encryption with IV work in objective-C?

I am trying to encrypt a string with AES-128 CBC with an IV. Here are the input parameters and the expected output:
Key:
000102030405060708090A0B0C0D0E0F
IV:
00102030405060708090A0B0C0D0E0F0
Input data:
EA010B23CDA9B16F0001020304050607
Output:
B773C36749E87D3F8FED98FE52026A15
I have verified the output on this web site:
http://extranet.cryptomathic.com/aescalc/index?key=000102030405060708090A0B0C0D0E0F&iv=00102030405060708090A0B0C0D0E0F0&input=EA010B23CDA9B16F0001020304050607&mode=cbc&action=Encrypt&output=B773C36749E87D3F8FED98FE52026A15
How can I encrypt a string with AES-128 CBC with an IV in Objective-C, with the same result as http://extranet.cryptomathic.com/aescalc? I am trying to get the encrypted string B773C36749E87D3F8FED98FE52026A15, but no luck.
I have tried to use this library for the encryption: https://github.com/Pakhee/Cross-platform-AES-encryption/tree/master/iOS
Here is my objective c code:
NSString* data = @"EA010B23CDA9B16F0001020304050607";
NSString* key = @"000102030405060708090A0B0C0D0E0F";
NSString* iv = @"00102030405060708090A0B0C0D0E0F0";
NSData *encryptedData = [[StringEncryption alloc] encrypt:[@"EA010B23CDA9B16F0001020304050607" dataUsingEncoding:NSUTF8StringEncoding] key:key iv:iv];
NSLog(@"encryptedData %@", encryptedData);
The output of encryptedData is:
<68f8ed75 e79f2ba2 c80e67a2 f0c84b7a c4b07fd1 59e937e5 14644cba c0ddb60c 40502375 7798e7a1 58bd05a5 b3d9e7bd>
I expect the value of encryptedData to be <42373733 43333637 34394538 37443346 38464544 39384645 35323032 36413135>, which is the hex of B773C36749E87D3F8FED98FE52026A15.
I have tried another library - https://github.com/dev5tec/FBEncryptor
NSData* _data = [data dataUsingEncoding:NSUTF8StringEncoding];
NSData* _key = [key dataUsingEncoding:NSUTF8StringEncoding];
NSData* _iv = [iv dataUsingEncoding:NSUTF8StringEncoding];
NSData *encryptedData2 = [FBEncryptorAES encryptData:_data key:_key iv:_iv];
NSLog(@"encryptedData2 = %@", encryptedData2);
Output is <2beea977 aef69eb1 ed9f6dd0 7bf5f1ce d1e5df46 2cbf8465 773f122d 03267abb 2e113d9b 07189268 4fd6babe 7b1c0056>
It seems that I am using the wrong library or passing the wrong input to the encryption function. Any recommendation for an AES library for Objective-C?
Common Crypto is the correct thing to use for encryption on iOS and OS X. The issue is to provide the correct input.
The input key, IV and data appear to be in hexadecimal. Cryptomathic expects its inputs in hexadecimal and its output is in hexadecimal, so it works correctly.
But the ObjC code uses:
NSString* data = #"EA010B23CDA9B16F0001020304050607";
NSData* _data = [data dataUsingEncoding:NSUTF8StringEncoding];
which uses the hexadecimal as a character string.
Instead, use a hex-to-data conversion such as the one @Larme linked to; see the first comment.
From the sizes of the input and output, it appears you are using PKCS#7 padding, which adds a full block of padding when the input is an exact multiple of the block size; Cryptomathic does not add PKCS#7 padding.
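To see the padding difference concretely, here is a minimal sketch (assuming data, key and iv are the hex-decoded NSData values produced in the update below): with kCCOptionPKCS7Padding the 16-byte input encrypts to 32 bytes, while with options 0 it stays at 16 bytes and matches Cryptomathic.
size_t movedBytes = 0;
NSMutableData *cipher = [NSMutableData dataWithLength:data.length + kCCBlockSizeAES128];
// kCCOptionPKCS7Padding pads a full extra block when the input is
// already block-aligned; pass 0 instead to reproduce the Cryptomathic output.
CCCrypt(kCCEncrypt, kCCAlgorithmAES, kCCOptionPKCS7Padding,
        key.bytes, key.length, iv.bytes,
        data.bytes, data.length,
        cipher.mutableBytes, cipher.length, &movedBytes);
// movedBytes == 32 here; with options 0 it would be 16.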
Update
@interface Crypt : NSObject
+ (NSData *)aes128Data:(NSData *)dataIn
             operation:(CCOperation)operation
                   key:(NSData *)key
               options:(CCOptions)options
                    iv:(NSData *)iv
                 error:(NSError **)error;
+ (NSData *)dataFromHexString:(NSString *)hexString;
@end
@implementation Crypt

+ (NSData *)aes128Data:(NSData *)dataIn
             operation:(CCOperation)operation // kCCEncrypt, kCCDecrypt
                   key:(NSData *)key
               options:(CCOptions)options // kCCOptionPKCS7Padding, kCCOptionECBMode
                    iv:(NSData *)iv
                 error:(NSError **)error
{
    CCCryptorStatus ccStatus = kCCSuccess;
    size_t cryptBytes = 0;
    NSMutableData *dataOut = [NSMutableData dataWithLength:dataIn.length + kCCBlockSizeAES128];
    ccStatus = CCCrypt(operation,
                       kCCAlgorithmAES,
                       options,
                       key.bytes, key.length,
                       iv.bytes,
                       dataIn.bytes, dataIn.length,
                       dataOut.mutableBytes, dataOut.length,
                       &cryptBytes);
    if (ccStatus == kCCSuccess) {
        dataOut.length = cryptBytes;
    }
    else {
        if (error) {
            *error = [NSError errorWithDomain:@"kEncryptionError"
                                         code:ccStatus
                                     userInfo:nil];
        }
        dataOut = nil;
    }
    return dataOut;
}
+ (NSData *)dataFromHexString:(NSString *)hexString {
    char buf[3];
    buf[2] = '\0';
    unsigned char *bytes = malloc([hexString length]/2);
    unsigned char *bp = bytes;
    for (CFIndex i = 0; i < [hexString length]; i += 2) {
        buf[0] = [hexString characterAtIndex:i];
        buf[1] = [hexString characterAtIndex:i+1];
        char *b2 = NULL;
        *bp++ = strtol(buf, &b2, 16);
    }
    return [NSData dataWithBytesNoCopy:bytes length:[hexString length]/2 freeWhenDone:YES];
}

@end
NSString *dataHexString = @"EA010B23CDA9B16F0001020304050607";
NSString *keyHexString = @"000102030405060708090A0B0C0D0E0F";
NSString *ivHexString = @"00102030405060708090A0B0C0D0E0F0";
NSLog(@"dataHexString: %@", dataHexString);
NSLog(@"keyHexString: %@", keyHexString);
NSLog(@"ivHexString: %@", ivHexString);
NSData *data = [Crypt dataFromHexString:dataHexString];
NSData *key = [Crypt dataFromHexString:keyHexString];
NSData *iv = [Crypt dataFromHexString:ivHexString];
NSLog(@"data: %@", data);
NSLog(@"key: %@", key);
NSLog(@"iv: %@", iv);
NSError *error;
NSData *encryptedData = [Crypt aes128Data:data
                                operation:kCCEncrypt
                                      key:key
                                  options:0
                                       iv:iv
                                    error:&error];
NSLog(@"encryptedData %@", encryptedData);
Output:
dataHexString: EA010B23CDA9B16F0001020304050607
keyHexString: 000102030405060708090A0B0C0D0E0F
ivHexString: 00102030405060708090A0B0C0D0E0F0
data: <ea010b23 cda9b16f 00010203 04050607>
key: <00010203 04050607 08090a0b 0c0d0e0f>
iv: <00102030 40506070 8090a0b0 c0d0e0f0>
encryptedData: <b773c367 49e87d3f 8fed98fe 52026a15>
Note encryptedData matches the Cryptomathic result.

Different HMAC digests are being generated every time for the same input in Objective-C

I am trying to compute a digest using HMAC-SHA256 with the code below, but it gives a different output every time.
Here the key parameter is in Base64 string format, while the plaintext parameter has no particular encoding.
+(NSString *)hmacWithIndicies:(NSString *)plaintext withKey:(NSString *)key {
    NSLog(@"Input text::%@", plaintext);
    NSLog(@"Input Key::%@", key);
    NSData *keyData = [[NSData alloc] initWithBase64EncodedString:key options:0];
    NSLog(@"Key Data is::%@", keyData);
    const char *cKey = (char *)[keyData bytes];
    NSLog(@"Key Length is::%lu", strlen(cKey));
    NSData *keyInData = [NSData dataWithBytes:cKey length:sizeof(cKey)];
    NSLog(@"Key data = %@", keyInData);
    //Data here
    const char *cData = [plaintext cStringUsingEncoding:NSUTF8StringEncoding];
    NSLog(@"Input Length is::%lu", strlen(cData));
    NSData *dataData = [NSData dataWithBytes:cData length:sizeof(cData)];
    NSLog(@"Input data = %@", dataData);
    uint8_t cHMAC[CC_SHA256_DIGEST_LENGTH];
    CCHmac(kCCHmacAlgSHA256, cKey, strlen(cKey), cData, strlen(cData), cHMAC);
    NSData *hMacInData = [[NSData alloc] initWithBytes:cHMAC length:sizeof(cHMAC)];
    NSLog(@"Hash Mac data generated is %@", hMacInData);
    NSString *b64EncStrHmac = [hMacInData base64EncodedStringWithOptions:0];
    NSLog(@"Hash Mac generated is %@", b64EncStrHmac);
    return b64EncStrHmac;
}
Calling the above method as below:
NSString *hMacOutput = [KeyGeneration hmacWithIndicies:@"2SagarPra2983688" withKey:@"qDwki5t1SSuKER4mzSMBHXhtt+PRMCv0B2LgXaBZmgE="];
NSLog(@"Output of HMac digest::%@", hMacOutput);
The hMacOutput digest is different every time the method is called.
You cannot use strlen() on non-"C" strings. "C" strings are null-terminated and contain no 0x00 bytes; strlen() counts until it finds the first 0x00 byte, which in arbitrary data could be early or past the end of the data, possibly causing a crash.
You are trying too hard: there is no reason for "C"-style arrays here. Just use the bytes and length properties of NSData and NSMutableData.
[NSMutableData dataWithLength:] allocates memory.
Example:
+(NSString *)hmacWithIndicies:(NSString *)plaintext withKey:(NSString *)key {
    NSLog(@"Input text: %@", plaintext);
    NSLog(@"Input Key: %@", key);
    NSData *keyData = [[NSData alloc] initWithBase64EncodedString:key options:0];
    NSLog(@"keyData Length: %lu, Data: %@", keyData.length, keyData);
    NSData *inData = [plaintext dataUsingEncoding:NSUTF8StringEncoding];
    NSLog(@"inData Length: %lu, Data: %@", inData.length, inData);
    NSMutableData *HMACdata = [NSMutableData dataWithLength:CC_SHA256_DIGEST_LENGTH];
    CCHmac(kCCHmacAlgSHA256, keyData.bytes, keyData.length, inData.bytes, inData.length, HMACdata.mutableBytes);
    NSLog(@"Hash Mac data generated: %@", HMACdata);
    NSString *b64EncStrHmac = [HMACdata base64EncodedStringWithOptions:0];
    NSLog(@"Hash Mac generated: %@", b64EncStrHmac);
    return b64EncStrHmac;
}
Output:
Input text: 2SagarPra2983688
Input Key: qDwki5t1SSuKER4mzSMBHXhtt+PRMCv0B2LgXaBZmgE=
keyData Length: 32, Data: a83c248b 9b75492b 8a111e26 cd23011d 786db7e3 d1302bf4 0762e05d a0599a01
inData Length: 16, Data: 32536167 61725072 61323938 33363838
Hash Mac data generated: b681d2b1 251f1953 3716258c 8eeb9101 db3ecad2 c4a5077e 0cf76617 e45e5459
Hash Mac generated: toHSsSUfGVM3FiWMjuuRAds+ytLEpQd+DPdmF+ReVFk=
Output of HMac digest::toHSsSUfGVM3FiWMjuuRAds+ytLEpQd+DPdmF+ReVFk=
It is not possible to use strlen() on binary data. Since the key of an HMAC can be of any size, you may be reading more bytes than the key actually contains; if those extra bytes change between calls, you will get different output each time. You need to retrieve the size of the key from keyData, not from cKey.

NSString from NSData always null

I would like to sign a request with HMAC-SHA512, but I seem to mess up encoding and decoding from and to NSData and NSString. I have tried desperately to figure out what is wrong, but I just don't seem to get it right.
PSEUDOCODE:
function hmac_512(msg, sec) {
sec = Base64Decode(sec);
result = hmac(msg, sec, sha512);
return Base64Encode(result);
}
secret = "7pgj8Dm6";
message = "Test\0Message";
result = hmac_512(message, secret);
if (result == "69H45OZkKcmR9LOszbajUUPGkGT8IqasGPAWqW/1stGC2Mex2qhIB6aDbuoy7eGfMsaZiU8Y0lO3mQxlsWNPrw==")
print("Success!");
else
printf("Error: %s", result);
My implementation:
+(void)doSomeMagic{
    NSString *message = @"Test\0Message";
    NSString *signedRequest = [self signRequestForParameterString:message];
    //checking against CORRECT (from JAVA equivalent) signed request
    if ([signedRequest isEqualToString:@"69H45OZkKcmR9LOszbajUUPGkGT8IqasGPAWqW/1stGC2Mex2qhIB6aDbuoy7eGfMsaZiU8Y0lO3mQxlsWNPrw=="])
        NSLog(@"Success!");
    else
        NSLog(@"Error!");
}
Here is the signing method:
+(NSString *)signRequestForParameterString:(NSString*)paramStr{
    NSString *secret = @"7pgj8Dm6";
    // secret is base64 encoded, so I decode it
    NSData *decodedSecret = [secret base64DecodedData];
    NSString *decodedSecretString = [NSString stringWithUTF8String:[decodedSecret bytes]];
    NSData *data = [paramStr dataUsingEncoding:NSUTF8StringEncoding];
    NSString *dataString = [NSString stringWithUTF8String:[data bytes]];
    return [self generateHMACSHA512Hash:decodedSecretString data:dataString];
}
Here is the hashing function:
+(NSString *)generateHMACSHA512Hash:(NSString *)key data:(NSString *)data{
    const char *cKey = [key cStringUsingEncoding:NSASCIIStringEncoding];
    const char *cData = [data cStringUsingEncoding:NSASCIIStringEncoding];
    unsigned char cHMAC[CC_SHA512_DIGEST_LENGTH];
    CCHmac(kCCHmacAlgSHA512, cKey, strlen(cKey), cData, strlen(cData), cHMAC);
    NSData *HMAC = [[NSData alloc] initWithBytes:cHMAC length:sizeof(cHMAC)];
    NSString *hash = [HMAC base64EncodedString];
    return hash;
}
I am pretty sure it is due to the encoding of the strings (decodedSecretString and dataString). decodedSecretString (the decoded base64) is encoded in ASCII after decoding. However, when I call the hashing method, I encode it in ASCII again, which results in null. Everything is confusing me right now.
Your secret doesn't decode to a valid UTF-8 string. Also, Java allows NUL bytes in strings, but when you convert "Test\0Message" to a C string and use strlen, its length is 4.
Something like this should work:
+(NSString *)signRequestForParameterString:(NSString*)paramStr{
    NSString *secret = @"7pgj8Dm6";
    NSData *data = [paramStr dataUsingEncoding:NSUTF8StringEncoding];
    return [self generateHMACSHA512Hash:[secret base64DecodedData] data:data];
}

+(NSString *)generateHMACSHA512Hash:(NSData *)key data:(NSData *)data{
    unsigned char cHMAC[CC_SHA512_DIGEST_LENGTH];
    CCHmac(kCCHmacAlgSHA512, key.bytes, key.length, data.bytes, data.length, cHMAC);
    NSData *HMAC = [[NSData alloc] initWithBytes:cHMAC length:sizeof(cHMAC)];
    return [HMAC base64EncodedString];
}
When doing HMAC or other cryptographic functions, you should build up some fundamental methods/functions that don't deal with strings first. Then you can create wrapper methods that decode/encode string data or digests in a convenient way.
+ (NSData *)dataBySigningData:(NSData *)data withKey:(NSData *)key
{
    unsigned char cHMAC[CC_SHA512_DIGEST_LENGTH];
    CCHmac(kCCHmacAlgSHA512, [key bytes], [key length], [data bytes], [data length], cHMAC);
    return [[NSData alloc] initWithBytes:cHMAC length:CC_SHA512_DIGEST_LENGTH];
}

+ (NSData *)dataBySigningMessage:(NSString *)message withKey:(NSString *)key
{
    return [self dataBySigningData:[message dataUsingEncoding:NSUTF8StringEncoding]
                           withKey:[key dataUsingEncoding:NSUTF8StringEncoding]];
}
(Note: this code is not tested, just hacked together in a text editor)
Don't worry about the string representation of your key or data. Then you can go from there, e.g. getting the base64 encoding of the digest.
Cryptographic functions DO NOT CARE about strings or text encodings. They care about bytes. Strings (in C, since they are null-terminated) are a mere subset of what can be represented in data, so it would be severely limiting to deal with strings.
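For instance (a hedged usage sketch, reusing dataBySigningData: above and the question's inputs; the variable names are mine), the base64 encoding happens only at the very end:
// The secret is base64-encoded, so decode it to raw key bytes first.
NSData *keyData = [[NSData alloc] initWithBase64EncodedString:@"7pgj8Dm6" options:0];
// dataUsingEncoding: preserves the embedded NUL in "Test\0Message",
// unlike a C string whose strlen() would stop at the NUL.
NSData *msgData = [@"Test\0Message" dataUsingEncoding:NSUTF8StringEncoding];
NSData *digest = [self dataBySigningData:msgData withKey:keyData];
NSString *signature = [digest base64EncodedStringWithOptions:0];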

Creating iOS SHA1 Hash with UTF-8 characters

I'm using this code to create a hash from a given string in an iOS app I'm developing.
-(NSString*) sha1:(NSString*)input
{
    const char *cstr = [input cStringUsingEncoding:NSUTF8StringEncoding];
    NSData *data = [NSData dataWithBytes:cstr length:input.length];
    uint8_t digest[CC_SHA1_DIGEST_LENGTH];
    CC_SHA1(data.bytes, data.length, digest);
    NSMutableString* output = [NSMutableString stringWithCapacity:CC_SHA1_DIGEST_LENGTH * 2];
    for(int i = 0; i < CC_SHA1_DIGEST_LENGTH; i++)
        [output appendFormat:@"%02x", digest[i]];
    return output;
}
My PHP code is:
sha1(json_encode($array));
I have a string that contains Arabic text.
When I create a hash from an English string and compare it with the hash created by the PHP code under Ubuntu, the results are the same.
But when I create the hash from Arabic characters and compare it with the hash created by the PHP code, the results don't match.
So what is the main problem with this code?
Thanks
input.length is the number of characters, not the number of bytes; the difference is that many characters under UTF-8 encoding are multiple bytes in length.
Replace:
const char *cstr = [input cStringUsingEncoding:NSUTF8StringEncoding];
NSData *data = [NSData dataWithBytes:cstr length:input.length];
with the NSString method:
NSData *data = [input dataUsingEncoding: NSUTF8StringEncoding];
There is no need for the intermediate const char *cstr.
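Putting the fix together, a minimal sketch of the corrected method (the rest is unchanged from the question's code):
#import <CommonCrypto/CommonDigest.h>

// Sketch: dataUsingEncoding: yields the UTF-8 byte count, so multi-byte
// Arabic characters are hashed in full, matching PHP's sha1().
-(NSString*) sha1:(NSString*)input
{
    NSData *data = [input dataUsingEncoding:NSUTF8StringEncoding];
    uint8_t digest[CC_SHA1_DIGEST_LENGTH];
    CC_SHA1(data.bytes, (CC_LONG)data.length, digest);
    NSMutableString* output = [NSMutableString stringWithCapacity:CC_SHA1_DIGEST_LENGTH * 2];
    for(int i = 0; i < CC_SHA1_DIGEST_LENGTH; i++)
        [output appendFormat:@"%02x", digest[i]];
    return output;
}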
I found that my problem comes from my PHP json_encode($array) call.
I found this function at http://php.net/manual/en/function.json-encode.php that encodes my JSON with the UTF-8 characters preserved:
function my_json_encode($arr)
{
    // convmap since 0x80 char codes so it takes all multibyte codes (above ASCII 127).
    // So such characters are being "hidden" from normal json_encoding.
    array_walk_recursive($arr, function (&$item, $key) {
        if (is_string($item)) $item = mb_encode_numericentity($item, array(0x80, 0xffff, 0, 0xffff), 'UTF-8');
    });
    return mb_decode_numericentity(json_encode($arr), array(0x80, 0xffff, 0, 0xffff), 'UTF-8');
}
Example:
sha1(my_json_encode($newArray));

Objective-C HMAC SHA-256 gives wrong NSData output

So I finally figured out how to do HMAC-SHA256 hashing. I will be using this for a WCF service API I made. My problem is that the NSData output my method sends out has spaces.
E.g., this is what my API sends out:
"2efb00aba01a3f5b674fba3063b43fee7a9356947118......"
And this is how my iPhone app shows it:
<2efb00ab a01a3f5b 674fba30.....>
This is what my code in Objective-C looks like:
NSData *hmacSHA256(NSString *key, NSString *data)
{
    const char *cKey = [key cStringUsingEncoding:NSASCIIStringEncoding];
    const char *cData = [data cStringUsingEncoding:NSASCIIStringEncoding];
    unsigned char cHMAC[CC_SHA256_DIGEST_LENGTH];
    CCHmac(kCCHmacAlgSHA256, cKey, strlen(cKey), cData, strlen(cData), cHMAC);
    return [[NSData alloc] initWithBytes:cHMAC length:sizeof(cHMAC)];
}
This came from this answer:
https://stackoverflow.com/a/8459123/639713
Anyway, my issue is how to deal with this. How do I convert the NSData output to a string? And if it does get converted to a string, I'm guessing the output will be different from what the WCF service API sends out. Do I change how the API processes its HMAC-SHA256 output?
Thanks!
You could modify your method slightly so that instead of creating an NSData containing the digest bytes, you create a string formatting the bytes as hexadecimal.
NSString *hmacSHA256(NSString *key, NSString *data)
{
    const char *cKey = [key cStringUsingEncoding:NSASCIIStringEncoding];
    const char *cData = [data cStringUsingEncoding:NSASCIIStringEncoding];
    unsigned char cHMAC[CC_SHA256_DIGEST_LENGTH];
    CCHmac(kCCHmacAlgSHA256, cKey, strlen(cKey), cData, strlen(cData), cHMAC);
    NSMutableString *result = [NSMutableString string];
    for (int i = 0; i < sizeof cHMAC; i++)
    {
        [result appendFormat:@"%02hhx", cHMAC[i]];
    }
    return result;
}
<2efb00ab a01a3f5b 674fba30.....> looks like the result of calling -[NSData description], which is what NSLog does for any %@ format argument. The NSData itself represents a sequence of bytes. The output you're after appears to be the byte sequence as a hexadecimal string. See "Best way to serialize an NSData into a hexadecimal string" for how to serialize the NSData to that format.