16 byte hex and variables - objective-c

I am trying to send the following hex value to an NSOutputStream: 0x0000000e000000010000001000003014
I'm able to send a hex value that is 8 bytes long with the following code:
long myhex = 0x0000000e00000001;
NSData *data = [[NSData alloc] initWithBytes:&myhex length: sizeof(myhex)];
[outputStream write:[data bytes] maxLength:[data length]];
The problem is when I try this:
long myhex = 0x0000000e000000010000001000003014;
the compiler says "integer constant is too long for its type".
I can't seem to figure out what type of integer will accept this hex value.

Instead of trying to find an integer type long enough, you should probably just create an array of bytes and send that. Not only will you eventually be unable to find a type long enough for the data you wish to send, but there are also differences in the order of bytes in integers on different platforms.
So, to send arbitrarily long data in any order, use an array of bytes (unsigned char, or, preferably, uint8_t from stdint.h), e.g.:
uint8_t dataBytes[] = { 0x00, … , 0x0e, … 0x30, 0x14 };
NSData *data = [[NSData alloc] initWithBytes:dataBytes length:sizeof(dataBytes)];
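For the specific value in the question, a minimal sketch might look like this (the bytes are just the hex string above split into pairs, and outputStream is assumed to be an already-opened NSOutputStream):
uint8_t dataBytes[] = {
    0x00, 0x00, 0x00, 0x0e,
    0x00, 0x00, 0x00, 0x01,
    0x00, 0x00, 0x00, 0x10,
    0x00, 0x00, 0x30, 0x14
};
NSData *data = [[NSData alloc] initWithBytes:dataBytes length:sizeof(dataBytes)];
[outputStream write:[data bytes] maxLength:[data length]];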

Related

How to send bytes using NSData

I'm new to Objective-C.
My question is, how can I send a byte value like b70f using NSData?
So, basically I have to make a variable first with the value of b70f and then write it to the peripheral.
[peripheral writeValue:[NSData dataWithBytes:&value length:1] forCharacteristic:characteristic type:CBCharacteristicWriteWithResponse];
How can I do this?
Any help would be appreciated.
Assuming that b70f is a 16-bit value, something like this should work...
uint16_t value = 0xb70f;
NSData *data = [NSData dataWithBytes:&value length:2];
This takes advantage of the fact that an "array" of two bytes fits into a single 16-bit integer.
If the bytes are in the wrong order, wrap the assignment with OSSwapInt16().
uint16_t value = OSSwapInt16(0xb70f);
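Putting that together with the write call from the question, a sketch might look like this (peripheral and characteristic are assumed to already be set up; OSSwapInt16() comes from <libkern/OSByteOrder.h>):
uint16_t value = 0xb70f;    // or OSSwapInt16(0xb70f) if the device expects the other byte order
NSData *data = [NSData dataWithBytes:&value length:sizeof(value)];
[peripheral writeValue:data forCharacteristic:characteristic type:CBCharacteristicWriteWithResponse];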

AES128 in Objective-C -> Test Case fails

I'm trying to get AES128 encryption running in Objective-C. However, I don't get the correct output.
This may also be a silly Objective-C mistake because I'm just starting with this language ;-)
My test case looks like this:
CryptoUtils* crypt = [[CryptoUtils alloc] init];
NSData* plaintext = [@"6bc1bee22e409f96e93d7e117393172a" dataUsingEncoding:NSUTF8StringEncoding];
NSData* key = [@"2b7e151628aed2a6abf7158809cf4f3c" dataUsingEncoding:NSUTF8StringEncoding];
NSData* iv = [@"000102030405060708090A0B0C0D0E0F" dataUsingEncoding:NSUTF8StringEncoding];
NSString* encrypted = [crypt encryptData:plaintext withKey:key andIV:iv];
XCTAssertEqualObjects(@"7649abac8119b246cee98e9b12e9197d", encrypted, @"AES testcase1 not equal");
The encryptData method looks like this:
- (NSString *)encryptData:(NSData *)clearText withKey:(NSData *)currentKey andIV:(NSData *)currentIV {
// Buffer for Ciphertext
NSMutableData *cipherData = [NSMutableData dataWithLength:clearText.length + kCCBlockSizeAES128];
size_t cipherLength;
CCCryptorStatus cryptStatus = CCCrypt(kCCEncrypt,
kCCAlgorithmAES128,
kCCOptionPKCS7Padding,
currentKey.bytes,
currentKey.length,
currentIV.bytes,
clearText.bytes,
clearText.length,
cipherData.mutableBytes,
cipherData.length,
&cipherLength);
if(cryptStatus){
NSLog(#"Something terrible during encryption happened!");
} else {
NSLog(#"Ciphertext length: %i", [cipherData length]);
NSString *output=[cipherData description];
output = [output stringByReplacingOccurrencesOfString:#" " withString:#""];
output = [output stringByReplacingOccurrencesOfString:#"<" withString:#""];
output = [output stringByReplacingOccurrencesOfString:#">" withString:#""];
return output;
}
return nil;
}
Now, I'm getting back a wrong 'encrypted' string. In particular, it is a lot too long, and I suspect the problem is the NSData objects that I pass to the method. Does anybody have an idea of what I'm doing wrong here?
Thank you
You mention AES128 in your title, but refer to AES256 in your first paragraph.
It also appears that you are using input data, key and IV values that you do not mean to use.
It appears you want your input data to be 128 bits long (which would align with the block size), but in reality it is 256 bits long, because the 32-character string is encoded as 32 bytes of UTF-8. This is technically acceptable, since you use padding, but context clues point to this being an oversight.
For AES128 your input key must be 128 bits long, but your key is 256 bits long. This is incorrect for AES128.
It appears you want your IV to be 128 bits long, but in reality it is 256 bits long. This is incorrect for AES - the IV must have the same length as block size, i.e. 128 bits.
Now, what you want to do instead, is probably:
char bytes[] = {0x6b, 0xc1, 0xbe, 0xe2, 0x2e, 0x40, 0x9f, 0x96, 0xe9, 0x3d, 0x7e, 0x11, 0x73, 0x93, 0x17, 0x2a};
NSData *clearText = [NSData dataWithBytes:&bytes length:16];
char keyBytes[] = {0x2b,0x7e,0x15,0x16,0x28,0xae,0xd2,0xa6,0xab,0xf7,0x15,0x88,0x09,0xcf,0x4f,0x3c};
NSData* currentKey = [NSData dataWithBytes:&keyBytes length:16];
char ivBytes[16] = {0x00,0x01,0x02,0x03,0x04,0x05,0x06,0x07,0x08,0x09,0x0A,0x0B,0x0C,0x0D,0x0E,0x0F};
NSData* currentIV = [NSData dataWithBytes:&ivBytes length:16];
These changes will give you input, key and IV values with 128 bit length.
After making these changes, output will be:
7649abac8119b246cee98e9b12e9197d8964e0b149c10b7b682e6e39aaeb731c00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
The first 32 characters match those you provided; the extra output comes from the PKCS7 padding block and from converting the entire cipherData buffer rather than just the cipherLength bytes that CCCrypt actually wrote. This suggests that your reference value covers only the first block, and the code is otherwise working as intended.
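If you want a hex string without the trailing zeros, and without depending on -description (whose format is not guaranteed), one hedged sketch is to encode only the cipherLength bytes that CCCrypt reports having written, in place of the description-based conversion above:
[cipherData setLength:cipherLength];   // keep only the bytes CCCrypt actually wrote
const uint8_t *cipherBytes = cipherData.bytes;
NSMutableString *output = [NSMutableString stringWithCapacity:cipherLength * 2];
for (size_t i = 0; i < cipherLength; i++) {
    [output appendFormat:@"%02x", cipherBytes[i]];
}
return output;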

Converting NSData bytes to NSString

I am trying to create a 16-byte (and later 32-byte) initialization vector in Objective-C (Mac OS). I took some example code for creating random bytes and modified it to produce 16 bytes, but I am having some difficulty with this. Logging the NSData dumps the hex, but an NSString dump gives nil, and a C-string NSLog gives the wrong number of characters (not reproduced exactly in the dump here).
Here is my terminal output:
2012-01-07 14:29:07.705 Test3Test[4633:80f] iv hex <48ea262d efd8f5f5 f8021126 fd74c9fd>
2012-01-07 14:29:07.710 Test3Test[4633:80f] IV string: (null)
2012-01-07 14:29:07.711 Test3Test[4633:80f] IV char string t^Q¶�^��^A
Here is the main program:
int main (int argc, const char * argv[])
{
NSAutoreleasePool * pool = [[NSAutoreleasePool alloc] init];
//NSString *iv_string = [NSString stringWithCString:iv encoding:NSUTF8StringEncoding];
testclass *obj = [testclass alloc];
NSData *iv_data = [obj createRandomNSData];
//[iv_string dataUsingEncoding:NSUTF8StringEncoding];
NSLog(#"iv hex %#",iv_data);
//NSString *iv_string = [[NSString alloc] initWithBytes:[iv_data bytes] length:16 encoding:NSUTF8StringE$
NSString *iv_string = [[NSString alloc] initWithData:iv_data encoding:NSUTF8StringEncoding];
NSLog(#"IV string: %#",iv_string);
NSLog(#"IV char string %.*s",[iv_data bytes]);
return 0;
]
(I left in the above some commented code that I tried and did not work also).
Below is my random number generater, taken from a stack overflow example:
@implementation testclass
-(NSData*)createRandomNSData
{
int twentyMb = 16;
NSMutableData* theData = [NSMutableData dataWithCapacity:twentyMb];
for( unsigned int i = 0 ; i < twentyMb/4 ; ++i )
{
u_int32_t randomBits = arc4random();
[theData appendBytes:(void*)&randomBits length:4];
}
NSData *data = [NSData dataWithData:theData];
[theData dealloc];
return data;
}
@end
I am really quite clueless as to what the problem could be here. If I have data as bytes, shouldn't it convert to a string? I have looked over the relevant examples here on Stack Overflow, but none of them have worked in this situation.
Thanks,
Elijah
An arbitrary byte sequence may not be legal UTF8 encoding. As @Joachim Isaksson notes, there is seldom reason to convert to strings this way. If you need to store random data as a string, you should use an encoding scheme like Base64, serialize the NSData to a plist, or similar approach. You cannot simply use a cstring either, since NULL is legal inside of a random byte sequence, but is not legal inside of a cstring.
You do not need to build your own random byte creator on Mac or iOS. There's one built-in called SecRandomCopyBytes(). For example (from Properly encrypting with AES with CommonCrypto):
+ (NSData *)randomDataOfLength:(size_t)length {
NSMutableData *data = [NSMutableData dataWithLength:length];
int result = SecRandomCopyBytes(kSecRandomDefault,
length,
data.mutableBytes);
NSAssert(result == 0, @"Unable to generate random bytes: %d",
errno);
return data;
}
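If you then do need a string form of the random bytes, Base64 is a safer round-trippable representation than UTF-8. A sketch (assuming OS X 10.9 / iOS 7 or later for the built-in Base64 methods; RandomHelper is a placeholder for whatever class holds the method above):
NSData *iv = [RandomHelper randomDataOfLength:16];
NSString *ivString = [iv base64EncodedStringWithOptions:0];
NSLog(@"IV as Base64: %@", ivString);
// ...and back to raw bytes when you need them:
NSData *decoded = [[NSData alloc] initWithBase64EncodedString:ivString options:0];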
When converting NSData to NSString using an UTF8 encoding, you won't necessarily end up with the same number of bytes since not all binary values are valid encodings of characters. I'd say using a string for binary data is a recipe for problems.
What is the use of the string? NSData is exactly the datatype you want for storing binary data to begin with.

Sending hexadecimal data to devices (Converting NSString to hexadecimal data)

I'm trying to send hexadecimal data via WiFi.
The code is something like this:
NSString *abc = @"0x1b 0x50";
NSData *data = [[[NSData alloc] initWithData:[abc dataUsingEncoding:NSASCIIStringEncoding]] autorelease];
[outputStream write:[data bytes] maxLength:[data length]];
Instead of sending the hexadecimal data, it's sending it in text format.
I tried with NSUTF8StringEncoding, but it's the same. I'm using it with the NSStream class.
You're not getting what you expect with NSString *abc = @"0x1b 0x50". It's almost the same as having NSString *abc = @"cat dog 123 0x0x0x"; just a bunch of words separated by spaces. So when you create your NSData object, you're just initializing it with a string of characters, not a series of actual numbers.
If you can get your numbers into an NSArray, this question/answer should help you: How to convert NSArray to NSData?
The data that you probably want to send is simply 0x1b50, which is the decimal number 6992 (assuming big-endian), and fits into two bytes. This is not the same as a string (which could contain anything) even if it happens to contain some human-readable representation of those numbers.
I'm assuming you want to send this as binary data, and if so one way would be to simply send a buffer formed by a single UInt16 instead of a string. I'm not very familiar with the relevant APIs, but look to see if you can populate the NSData with an integer, perhaps something like:
UInt16 i = 0x1b50; // Or = 6992
NSData *data = [[NSData alloc] initWithBytes:&i length:sizeof(i)];
[outputStream write:[data bytes] maxLength:[data length]];
Again, I'm not fluent with Objective C, but this should be the general approach to sending the number 0x1b50 as binary data.
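If your input really does arrive as a string like @"0x1b 0x50", a hedged sketch for converting it to raw bytes first is to scan each space-separated token as a hex integer and append the low byte to an NSMutableData:
NSString *abc = @"0x1b 0x50";
NSMutableData *data = [NSMutableData data];
for (NSString *token in [abc componentsSeparatedByString:@" "]) {
    unsigned int value = 0;
    NSScanner *scanner = [NSScanner scannerWithString:token];
    if ([scanner scanHexInt:&value]) {      // scanHexInt: understands an optional 0x prefix
        uint8_t byte = (uint8_t)value;
        [data appendBytes:&byte length:1];
    }
}
[outputStream write:data.bytes maxLength:data.length];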

Append NSInteger to NSMutableData

How do you append an NSInteger to NSMutableData? Something along the lines of...
NSMutableData *myData = [[NSMutableData alloc] init];
NSInteger myInteger = 42;
[myData appendBytes:myInteger length:sizeof(myInteger)];
So that 0x0000002A will get appended to myData.
Any help appreciated.
Pass the address of the integer, not the integer itself. appendBytes:length: expects a pointer to a data buffer and the size of the data buffer. In this case, the "data buffer" is the integer.
[myData appendBytes:&myInteger length:sizeof(myInteger)];
Keep in mind, though, that this will use your computer's endianness to encode it. If you plan on writing the data to a file or sending it across the network, you should use a known endianness instead. For example, to convert from host (your machine) to network endianness, use htonl():
uint32_t theInt = htonl((uint32_t)myInteger);
[myData appendBytes:&theInt length:sizeof(theInt)];
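On the other end (or when reading the data back out), the same conversion runs in reverse with ntohl(); a minimal sketch:
uint32_t networkOrder = 0;
[myData getBytes:&networkOrder range:NSMakeRange(0, sizeof(networkOrder))];
NSInteger myInteger = (NSInteger)ntohl(networkOrder);   // back in host byte order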