How do I do a bit-wise XOR on NSData in Objective-C? - objective-c

I have two NSData objects, data1 and data2, and I'd like to do a bit-wise XOR and store the result in a third NSData object, xorData.
The first thing I tried was this:
*data1.bytes^*data2.bytes;
but it gave an error of:
Invalid operands to binary expression ('const void' and 'const void')
So I tried to write a function that would extract the bytes of the data into an array of integers, perform the xor on the integers, then save the result back into an NSData object. The function looks like this:
+(NSData *) DataXOR1:(NSData *) data1
DataXOR2:(NSData *)data2{
const int *data1Bytes = [data1 bytes];
const int *data2Bytes = [data2 bytes];
NSMutableArray *xorBytes = [[NSMutableArray alloc] init ];
for (int i = 0; i < data1.length;i++){
[xorBytes addObject:[NSNumber numberWithInt:(data1Bytes[i]^data2Bytes[i])]];
}
NSData *xorData = [NSKeyedArchiver archivedDataWithRootObject:xorBytes];
return xorData;
}
This runs, but gives the wrong answers. When I test it on two simple pieces of data (data1 = 0x7038 and data2 = 0x0038), and use NSLog to output what the values are, I get:
data1Bytes[0] = 8070450532247943280
data2Bytes[0] = 8070450532247943168
data1Bytes[0]^data2Bytes[0] = 112
data1Bytes[1] = 10376302331560798334
data2Bytes[1] = 10376302331560798334
data1Bytes[1]^data2Bytes[1] = 0
This boggles my mind a bit because the values in the dataXBytes arrays are totally wrong, but they're xor-ing to the right values! (0x70 ^ 0x00 = 0x70 = 112)
I think it might be an endian-ness problem, but when I change the initialization of data1Bytes to:
const int *data1Bytes = CFSwapInt32BigToHost([data1 bytes]);
it runs into an error when it tries to access it, saying:
Thread 1: EXC_BAD_ACCESS(code=1, address = 0xa08ad539)
Is there a much simpler way to do the xor? If not, how can I fix the endian problem?

Casting to int and then archiving an NSArray of NSNumbers will definitely not produce the result you're looking for. You'll want a mutable NSData to which you append the individual XOR'd bytes, something like:
+ (NSData *)DataXOR1:(NSData *)data1
            DataXOR2:(NSData *)data2 {
    const char *data1Bytes = [data1 bytes];
    const char *data2Bytes = [data2 bytes];
    // Mutable data that the individual XOR'd bytes will be appended to
    NSMutableData *xorData = [[NSMutableData alloc] init];
    for (int i = 0; i < data1.length; i++) {
        const char xorByte = data1Bytes[i] ^ data2Bytes[i];
        [xorData appendBytes:&xorByte length:1];
    }
    return xorData;
}

+ (NSData *)xorData:(NSData *)data1 with:(NSData *)data2
{
    // Make data1 the smaller of the two
    if (data1.length > data2.length) {
        NSData *t = data1;
        data1 = data2;
        data2 = t;
    }
    char *xor = (char *)malloc(data2.length * sizeof(char));
    char *data1Bytes = (char *)data1.bytes;
    char *data2Bytes = (char *)data2.bytes;
    for (int i = 0; i < data1.length; i++)
    {
        xor[i] = data1Bytes[i] ^ data2Bytes[i];
    }
    NSMutableData *data = [[[NSMutableData alloc] initWithBytes:xor length:data1.length] autorelease];
    [data appendData:[data2 subdataWithRange:NSMakeRange(data1.length, data2.length - data1.length)]];
    free(xor);
    return [NSData dataWithData:data];
}
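The swap-then-XOR-then-copy-the-tail logic above can be checked independently of Foundation. Here is a minimal plain-C sketch of the same idea (the function name and signature are invented for illustration):

```c
#include <assert.h>
#include <string.h>

/* XOR the overlapping prefix of two buffers and copy the tail of the
   longer one, mirroring the Objective-C method above.
   `out` must be large enough to hold max(alen, blen) bytes. */
static void xor_buffers(const unsigned char *a, size_t alen,
                        const unsigned char *b, size_t blen,
                        unsigned char *out)
{
    if (alen > blen) { /* ensure a refers to the shorter buffer */
        const unsigned char *tp = a; a = b; b = tp;
        size_t tl = alen; alen = blen; blen = tl;
    }
    for (size_t i = 0; i < alen; i++)
        out[i] = a[i] ^ b[i];              /* XOR the shared prefix */
    memcpy(out + alen, b + alen, blen - alen); /* copy the leftover tail */
}
```

With the question's test values, XOR-ing {0x70, 0x38} against {0x00, 0x38, 0xFF} gives {0x70, 0x00, 0xFF} regardless of argument order.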

Related

Equivalent Hashing in C# and Objective-C using HMAC256

I'm working with a partner and we're not able to get C# and Objective-C to produce the same hashes using what we think are the same tools in the respective languages. In C#, I'm doing this:
byte[] noncebytes=new byte[32];
//We seed the hash generator with a new 32 position array. Each position is 0.
//In prod code this would be random, but for now it's all 0s.
HMACSHA256 hmac256 = new HMACSHA256(noncebytes);
string plaintext = "hello";
string UTFString = Convert.ToBase64String(
System.Text.Encoding.UTF8.GetBytes(plaintext));
string HashString = Convert.ToBase64String(
hmac256.ComputeHash(System.Text.Encoding.UTF8.GetBytes(plaintext))); //Convert that hash to a string.
This produces the following base64string hash:
Q1KybjP+DXaaiSKmuikAQQnwFojiasyebLNH5aWvxNo=
What is the equivalent Objective-C code to do this? We need the client and the server to be able to generate matching hashes for matching data.
Here is the Objective-C code we are currently using:
...
NSData *zeroNumber = [self zeroDataWithBytes:32]; //empty byte array
NSString *nonceTest = [zeroNumber base64String]; // using MF_Base64Additions.h here
NSData *hashTest = [self hmacForKeyAndData:nonceTest withData:@"hello"]; //creating hash
NSString *hashTestText = [hashTest base64String];
NSLog(@"hello hash is %@", hashTestText);
...
//function for zeroing out the bytes. I'm sure there's a better way
- (NSData *)zeroDataWithBytes:(NSUInteger)length {
    NSMutableData *mutableData = [NSMutableData dataWithCapacity:length];
    for (unsigned int i = 0; i < length; i++) {
        NSInteger bits = 0;
        [mutableData appendBytes:(void *)&bits length:1];
    }
    return mutableData;
}
//hash function
- (NSData *)hmacForKeyAndData:(NSString *)key withData:(NSString *)data {
    const char *cKey = [key cStringUsingEncoding:NSASCIIStringEncoding];
    const char *cData = [data cStringUsingEncoding:NSASCIIStringEncoding];
    unsigned char cHMAC[CC_SHA256_DIGEST_LENGTH];
    CCHmac(kCCHmacAlgSHA256, cKey, strlen(cKey), cData, strlen(cData), cHMAC);
    return [[NSData alloc] initWithBytes:cHMAC length:sizeof(cHMAC)];
}
UPDATE:
There is a pretty good project on GitHub that seems to accomplish everything you want, plus a lot more encryption-related options, and it includes unit tests.
NSData *hmacForKeyAndData(NSString *key, NSString *data)
{
    const char *cKey = [key cStringUsingEncoding:NSASCIIStringEncoding];
    const char *cData = [data cStringUsingEncoding:NSASCIIStringEncoding];
    unsigned char cHMAC[CC_SHA256_DIGEST_LENGTH];
    CCHmac(kCCHmacAlgSHA256, cKey, strlen(cKey), cData, strlen(cData), cHMAC);
    return [[NSData alloc] initWithBytes:cHMAC length:sizeof(cHMAC)];
}
(Source)
With the above, I think you will have to import <CommonCrypto/CommonHMAC.h>. The next step is encoding to Base64:
+ (NSString *)Base64Encode:(NSData *)data
{
    //Point to the start of the data and set buffer sizes
    int inLength = [data length];
    int outLength = ((((inLength * 4) / 3) / 4) * 4) + (((inLength * 4) / 3) % 4 ? 4 : 0);
    const char *inputBuffer = [data bytes];
    char *outputBuffer = malloc(outLength + 1); // +1 for the trailing NUL
    outputBuffer[outLength] = 0;
    //64 digit code
    static char Encode[] = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";
    //start the count
    int cycle = 0;
    int inpos = 0;
    int outpos = 0;
    char temp;
    //Pad the last two bytes; the output buffer must always be a multiple of 4
    outputBuffer[outLength - 1] = '=';
    outputBuffer[outLength - 2] = '=';
    /* http://en.wikipedia.org/wiki/Base64
       Text content     M        a        n
       ASCII            77       97       110
       8-bit pattern    01001101 01100001 01101110
       6-bit pattern    010011   010110   000101   101110
       Index            19       22       5        46
       Base64-encoded   T        W        F        u
    */
    while (inpos < inLength) {
        switch (cycle) {
            case 0:
                outputBuffer[outpos++] = Encode[(inputBuffer[inpos] & 0xFC) >> 2];
                cycle = 1;
                break;
            case 1:
                temp = (inputBuffer[inpos++] & 0x03) << 4;
                outputBuffer[outpos] = Encode[temp];
                cycle = 2;
                break;
            case 2:
                outputBuffer[outpos++] = Encode[temp | (inputBuffer[inpos] & 0xF0) >> 4];
                temp = (inputBuffer[inpos++] & 0x0F) << 2;
                outputBuffer[outpos] = Encode[temp];
                cycle = 3;
                break;
            case 3:
                outputBuffer[outpos++] = Encode[temp | (inputBuffer[inpos] & 0xC0) >> 6];
                cycle = 4;
                break;
            case 4:
                outputBuffer[outpos++] = Encode[inputBuffer[inpos++] & 0x3f];
                cycle = 0;
                break;
            default:
                cycle = 0;
                break;
        }
    }
    NSString *pictemp = [NSString stringWithUTF8String:outputBuffer];
    free(outputBuffer);
    return pictemp;
}
Note the second line of code in the Objective-C portion of the question:
NSString *nonceTest = [zeroNumber base64String];
but it should be this:
NSString *nonceTest = [[NSString alloc] initWithData:zeroNumber encoding:NSASCIIStringEncoding];
It was a case of converting the string to Base64 when we didn't need to for the HMAC seeding.
We now get: Q1KybjP+DXaaiSKmuikAQQnwFojiasyebLNH5aWvxNo= as the hash on both platforms.

IOBluetoothRFCOMMChannel: How to use writeSync?

I need to write a value from "a" to "o" to a bluetooth device. The device uses SPP and I'm already connected via IOBluetoothRFCOMMChannel.
There are functions like writeSync:length:, but how do I use them? As I said, I need to send a value from "a" to "o".
I tried:
[rfcommChannel writeSync:"a" length:1];
but it isn't working.
Apple has an example code with:
[rfcommChannel writeSync:"ATZ\n" length:4];
but I'm not sure what "ATZ" means.
Found the answer:
- (void)sendData:(NSString *)string toChannel:(IOBluetoothRFCOMMChannel *)rfcommChannel
{
    int i;
    // Turn the string into data.
    NSData *data = [string dataUsingEncoding:NSASCIIStringEncoding];
    char buffer[[data length] + 4];
    char *bytes = (char *)[data bytes];
    // Add a CRLF to the start
    buffer[0] = 13;
    buffer[1] = 10;
    // Copy the data into the buffer.
    for (i = 0; i < [data length]; i++)
    {
        buffer[2 + i] = bytes[i];
    }
    // Append a CRLF
    buffer[[data length] + 2] = 13;
    buffer[[data length] + 3] = 10;
    // Synchronously write the data to the channel.
    [rfcommChannel writeSync:buffer length:[data length] + 4];
}
For my specific case:
[self sendData:@"a" toChannel:self.rfcommChannel];
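The CRLF framing that sendData: builds can be sketched in plain C, which makes the byte layout easy to verify (function name and signature are invented; the caller is responsible for sizing the output buffer to len + 4):

```c
#include <assert.h>
#include <string.h>

/* Wrap a payload in CR/LF on both sides, the same framing the
   sendData:toChannel: method above constructs before writeSync:. */
static size_t frame_crlf(const char *payload, size_t len, char *out)
{
    out[0] = '\r';                    /* leading CR */
    out[1] = '\n';                    /* leading LF */
    memcpy(out + 2, payload, len);    /* payload in the middle */
    out[len + 2] = '\r';              /* trailing CR */
    out[len + 3] = '\n';              /* trailing LF */
    return len + 4;                   /* total bytes to write */
}
```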

How to set a byte in NSMutableData

How would I set a byte in an NSMutableData object?
I tried the following:
- (void)setFirstValue:(Byte)v {
    [mValues mutableBytes][0] = v;
}
But that makes the compiler cry out loud...
But that makes the compiler cry out loud...
That is because mutableBytes returns a void *. Cast it to char * to fix the problem:
((char*)[mValues mutableBytes])[0] = v;
You could also use replaceBytesInRange:withBytes:
char buf[1];
buf[0] = v;
[mValues replaceBytesInRange:NSMakeRange(0, 1) withBytes:buf];
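The underlying issue is plain C: a void * has no element size, so it cannot be indexed or dereferenced; casting it to a byte-sized pointer type makes [0] meaningful. A minimal sketch of the same fix outside Objective-C (function name invented for illustration):

```c
#include <assert.h>

/* void * cannot be indexed directly; casting to unsigned char *
   gives the compiler an element size, so buf[0] becomes addressable. */
static void set_first_byte(void *buf, unsigned char v)
{
    ((unsigned char *)buf)[0] = v;
}
```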
I cast it to an array:
NSMutableData *rawData = [[NSMutableData alloc] initWithData:data];
NSMutableArray *networkBuffer = [[NSMutableArray alloc] init];
const uint8_t *bytes = [rawData bytes];
//cycle through the data and place it in the network buffer
for (int i = 0; i < [data length]; i++)
{
    [networkBuffer addObject:[NSString stringWithFormat:@"%02X", bytes[i]]];
}
Then of course you can just adjust objects in your networkBuffer (which is an NSMutableArray).

Convert NSString to C string, increment and come back to NSString

I'm trying to develop a simple application where i can encrypt a message. The algorithm is Caesar's algorithm and for example, for 'Hello World' it prints 'KHOOR ZRUOG' if the increment is 3 (standard).
My problem is how to take each single character and increment it...
I've tried this:
NSString *text = @"hello";
int q, increment = 3;
NSString *string;
for (q = 0; q < [text length]; q++) {
    string = [text substringWithRange:NSMakeRange(q, 1)];
    const char *c = [string UTF8String] + increment;
    NSLog(@"%@", [NSString stringWithUTF8String:c]);
}
Very simple, but it doesn't work. My theory was: take each single character, transform it into a C string, increment it, then go back to NSString and print it. But Xcode prints nothing, and even if I print the char c I can't see the result in the console. Where is the problem?
First of all, incrementing byte by byte only works for ASCII strings. If you use UTF-8, you will get garbage for glyphs that have multi-byte representations.
With that in mind, this should work (and work faster than characterAtIndex: and similar methods):
NSString *foo = @"FOOBAR";
int increment = 3;
NSUInteger bufferSize = [foo length] + 1;
char *buffer = (char *)calloc(bufferSize, sizeof(char));
if ([foo getCString:buffer maxLength:bufferSize encoding:NSASCIIStringEncoding]) {
    int bufferLen = strlen(buffer);
    for (int i = 0; i < bufferLen; i++) {
        buffer[i] += increment;
        if (buffer[i] > 'Z') {
            buffer[i] -= 26;
        }
    }
    NSString *encoded = [NSString stringWithCString:buffer
                                           encoding:NSASCIIStringEncoding];
}
free(buffer);
First of all, replace your code with this:
for (q = 0; q < [text length]; q++) {
    string = [text substringWithRange:NSMakeRange(q, 1)];
    const char *c = [string UTF8String];
    NSLog(@"Addr: %p", c);
    c = c + increment;
    NSLog(@"Addr: %p", c);
    NSLog(@"%@", [NSString stringWithUTF8String:c]);
}
Now you can figure out your problem. const char *c is a pointer. A pointer saves a memory address.
When I run this code the first log output is this:
Addr: 0x711DD10
That means the char 'h' from the NSString named string with the value @"h" is saved at address 0x711DD10 in memory.
Now we increment this address by 3. Next output is this:
Addr: 0x711DD13
In my case there is a 0x00 at this address. But it doesn't matter what is actually there, because a 'k' won't be there (unless you are very lucky).
If you are lucky there is a 0x00, because then nothing bad will happen. If you are unlucky there is something else. If there is anything other than 0x00 (the string delimiter, or "end of string"), NSString will try to convert it. It might crash while trying, or it might open a huge security hole.
So instead of manipulating pointers, you have to manipulate the values they point to.
You can do that like this:
for (q = 0; q < [text length]; q++) {
    string = [text substringWithRange:NSMakeRange(q, 1)];
    const char *c = [string UTF8String]; // get the pointer
    char character = *c;                 // get the character at this pointer address
    character = character + 3;           // add 3 to the letter
    char cString[2] = {0, 0};            // a C string of length 1; the second char is \0, the delimiter ("end marker") of the string
    cString[0] = character;              // assign the changed character to the first character of the C string
    NSLog(@"%@", [NSString stringWithUTF8String:cString]); // profit...
}
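For letters near the end of the alphabet, a plain "add 3" walks past 'z', so a Caesar shift usually wraps with modular arithmetic. A small plain-C sketch of that character-level operation (function name invented; handles ASCII letters only, as the answer above cautions):

```c
#include <assert.h>

/* Shift one ASCII letter by `shift` positions, wrapping within the
   alphabet -- manipulating the character value, not the pointer. */
static char caesar_char(char c, int shift)
{
    if (c >= 'a' && c <= 'z')
        return (char)('a' + (c - 'a' + shift) % 26);
    if (c >= 'A' && c <= 'Z')
        return (char)('A' + (c - 'A' + shift) % 26);
    return c; /* leave non-letters (spaces, digits) untouched */
}
```

Applied per character with a shift of 3, "hello" becomes "khoor", matching the question's expected output.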

Converting NSString to Hex

I've been doing a lot of reading on how to convert a string to a hex value. Here is what I found to accomplish this:
NSString *hexString = [NSString stringWithFormat:@"%x", midiValue];
This returned some "interesting" results and upon reading a little further I found a post that mentioned this
"You're passing a pointer to an object representing a numeric value, not the numeric value proper."
So I substituted 192 instead of "midiValue" and it did what I expected.
How would I pass the string value and not the pointer?
Declaration of midiValue:
NSString *dMidiInfo = [object valueForKey:@"midiInformation"];
int midiValue = dMidiInfo;
You probably need to do something like this:
NSNumberFormatter *numberFormatter = [[NSNumberFormatter alloc] init];
int anInt = [[numberFormatter numberFromString:string] intValue];
Also, I think there is some example code in the Xcode documentation for converting to and from a hex value, in the QTMetadataEditor sample, in the MyValueFormatter class.
+ (NSString *)hexStringFromData:(NSData *)dataValue {
    UInt32 byteLength = [dataValue length], byteCounter = 0;
    UInt32 stringLength = (byteLength * 2) + 1, stringCounter = 0;
    unsigned char dstBuffer[stringLength];
    unsigned char srcBuffer[byteLength];
    unsigned char *srcPtr = srcBuffer;
    [dataValue getBytes:srcBuffer];
    const unsigned char t[16] = "0123456789ABCDEF";
    for (; byteCounter < byteLength; byteCounter++) {
        unsigned src = *srcPtr;
        dstBuffer[stringCounter++] = t[src >> 4];
        dstBuffer[stringCounter++] = t[src & 15];
        srcPtr++;
    }
    dstBuffer[stringCounter] = '\0';
    return [NSString stringWithUTF8String:(char *)dstBuffer];
}
+ (NSData *)dataFromHexString:(NSString *)dataValue {
    UInt32 stringLength = [dataValue length];
    UInt32 byteLength = stringLength / 2;
    UInt32 byteCounter = 0;
    unsigned char srcBuffer[stringLength];
    [dataValue getCString:(char *)srcBuffer];
    unsigned char *srcPtr = srcBuffer;
    Byte dstBuffer[byteLength];
    Byte *dst = dstBuffer;
    for (; byteCounter < byteLength;) {
        unsigned char c = *srcPtr++;
        unsigned char d = *srcPtr++;
        unsigned hi = 0, lo = 0;
        hi = charTo4Bits(c);
        lo = charTo4Bits(d);
        if (hi == 255 || lo == 255) {
            // error case
            return nil;
        }
        dstBuffer[byteCounter++] = ((hi << 4) | lo);
    }
    return [NSData dataWithBytes:dst length:byteLength];
}
Hopefully this helps.
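The nibble arithmetic those two methods rely on is plain C and easy to check in isolation. A self-contained sketch of both directions (names and signatures invented; the Foundation buffer management is omitted):

```c
#include <assert.h>
#include <string.h>

/* Bytes -> uppercase hex string. `dst` must hold 2*n + 1 chars. */
static void bytes_to_hex(const unsigned char *src, size_t n, char *dst)
{
    static const char t[] = "0123456789ABCDEF";
    for (size_t i = 0; i < n; i++) {
        dst[2 * i]     = t[src[i] >> 4];  /* high nibble */
        dst[2 * i + 1] = t[src[i] & 15];  /* low nibble */
    }
    dst[2 * n] = '\0';
}

/* One hex digit -> its 4-bit value, or -1 on invalid input
   (the role charTo4Bits plays in the answer above). */
static int hex_nibble(char c)
{
    if (c >= '0' && c <= '9') return c - '0';
    if (c >= 'A' && c <= 'F') return c - 'A' + 10;
    if (c >= 'a' && c <= 'f') return c - 'a' + 10;
    return -1;
}

/* Hex string -> n bytes; returns 0 on success, -1 on a bad digit. */
static int hex_to_bytes(const char *src, unsigned char *dst, size_t n)
{
    for (size_t i = 0; i < n; i++) {
        int hi = hex_nibble(src[2 * i]);
        int lo = hex_nibble(src[2 * i + 1]);
        if (hi < 0 || lo < 0) return -1;
        dst[i] = (unsigned char)((hi << 4) | lo);
    }
    return 0;
}
```

A round trip of {0xDE, 0xAD} through "DEAD" and back recovers the original bytes.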
If you are messing around with a simple iPhone app in Xcode using the iPhone 5.1 Simulator, then this is useful:
//========================================================
// This action is executed whenever the hex button is
// tapped by the user.
//========================================================
- (IBAction)hexPressed:(UIButton *)sender
{
    // Change the current base to hex.
    self.currentBase = @"hex";
    // Acquire the string from the text field and store it in an NSString object
    NSString *temp = self.labelDisplay.text;
    // Convert the NSString to an int, then use NSString's stringWithFormat:
    // (similar to C's printf formatting) to render it as hexadecimal, and
    // return the hexadecimal value to the text field for display
    self.labelDisplay.text = [NSString stringWithFormat:@"%x", [temp intValue]];
}
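This is the same parse-then-format sequence the original question was missing: get a numeric value out of the string first, then format that value as hex. The step can be sketched in plain C (function and variable names invented for illustration):

```c
#include <stdio.h>
#include <stdlib.h>

/* Parse a decimal string to a number, then format the number as
   lowercase hex -- formatting the value, not the pointer. */
static void decimal_string_to_hex(const char *decimal, char *out, size_t outsz)
{
    long value = strtol(decimal, NULL, 10); /* numeric value of the text */
    snprintf(out, outsz, "%lx", value);     /* hex representation */
}
```

For the question's example, the string "192" yields "c0".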