OpenCL Kernel: unsigned char -> signed char (aka cl_char) - objective-c

The following kernel accepts a char* array (alphabet):
kernel void generate_cl(global char* alphabet,
                        global int* rand_buffer,
                        int len,
                        int max,
                        global bool *stop)
However, when compiled, the generated block declaration becomes:
extern void (^generate_cl_kernel)(const cl_ndrange *ndrange, cl_char* alphabet, cl_int* rand_buffer, cl_int len, cl_int max, bool* stop);
Obviously alphabet is now a cl_char* (aka signed char*).
My problem: I need an unsigned/const char array (see the code below).
My question: how do I cast unsigned char to signed char (if that is possible)? Or is there another approach?
const char* alphabet_ = ... //char array, received from [NSString UTF8String]
generate_cl_kernel(&range,alphabet,..); //throws semantic issue [!]
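One way around the mismatch (a sketch, not an authoritative fix): signed char and unsigned char have the same size, and plain ASCII values 0-127 have the same bit pattern in both, so an explicit cast at the call site is normally enough, assuming the buffer has otherwise been set up the way the OpenCL/GCD block API expects:

const char *alphabet_ = [alphabetString UTF8String];  // alphabetString is a hypothetical NSString
// Cast to cl_char* (signed char*) to match the generated block signature.
// The cast also drops const, which is fine as long as the kernel never writes to alphabet.
generate_cl_kernel(&range,
                   (cl_char *)alphabet_,
                   rand_buffer, len, max, stop);       // remaining arguments as before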

Related

Returning variable size byte array and free the memory

I am writing a proxy .c file for using the OpenSSL library for RSA encryption, usable on iPhone. Since the size of the encrypted data is not known to the calling Objective-C class, the container is not pre-allocated and I want to keep it dynamic. The char array and the resulting size are passed by reference; the memory is allocated dynamically in ssl_encrypt_public_rsa and the caller has to free it. I don't like the idea of giving that responsibility to the caller.
Is there any other robust method that you can suggest?
OpenSSL implementation in a .c file (later compiled into a static library, .a):
// calling function must free cipherText memory
void ssl_encrypt_public_rsa(RSA *rsaKey, int plainLength, unsigned char *plainText,
                            int *cipherLength, unsigned char **cipherText)
{
    int enc_len = RSA_size(rsaKey);
    unsigned char *enc_bytes = malloc(enc_len * sizeof(char));
    int encSize = RSA_public_encrypt(plainLength, plainText, enc_bytes, rsaKey, RSA_PKCS1_PADDING);
    *cipherText = enc_bytes;
    *cipherLength = encSize;
}
void ssl_encrypt_mips(int plainLength, unsigned char *plainText,
                      int *cipherLength, unsigned char **cipherText)
{
    // rsaKeyMips is defined and initialized earlier
    ssl_encrypt_public_rsa(rsaKeyMips, plainLength, plainText, cipherLength, cipherText);
}
Calling function in the Objective-C .m file:
-(NSData *) encryptMips:(NSData *)inData
{
    int encSize = 0;
    unsigned char *cipherBytes;
    ssl_encrypt_mips((int)inData.length, (unsigned char *)inData.bytes, &encSize, &cipherBytes);
    if (encSize == -1) return nil;

    NSData *d = [NSData dataWithBytes:cipherBytes length:encSize];
    free(cipherBytes);
    return d;
}
It is unclear what you are asking, as you've already wrapped the C call in a wrapper which takes care of the free(). If you wish to remove the free() from your wrapper, and serendipitously avoid a copy, you can change your:
NSData *d = [NSData dataWithBytes:cipherBytes length:encSize];
free(cipherBytes);
to:
NSData *d = [NSData dataWithBytesNoCopy:cipherBytes length:encSize freeWhenDone:YES];
This hands ownership of the malloc()'d block to the NSData instance, which will free() it later when it is no longer needed.
HTH
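If you make that change, the wrapper no longer needs its own free() on the success path. A sketch of the adjusted method, assuming ssl_encrypt_mips behaves exactly as shown above:

-(NSData *) encryptMips:(NSData *)inData
{
    int encSize = 0;
    unsigned char *cipherBytes = NULL;
    ssl_encrypt_mips((int)inData.length, (unsigned char *)inData.bytes, &encSize, &cipherBytes);
    if (encSize == -1) {
        free(cipherBytes);   // the buffer was allocated even though encryption failed
        return nil;
    }
    // NSData takes ownership of the malloc()'d buffer and will free() it itself.
    return [NSData dataWithBytesNoCopy:cipherBytes length:encSize freeWhenDone:YES];
}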

Why can't I use a struct like this?

Why can't I use a struct like this?
typedef struct { unsigned char FirstName; unsigned char LastName; unsigned int age; } User;
User UserNick = {Nick, Watson, 24};
NSLog(#"Your paint job is (R: %NSString, G: %NSString, B: %u)",
UserNick.FirstName, UserNick.LastName, UserNick.age);
I mean I have used a struct like this for sure:
typedef struct {unsigned char red; unsigned char green; unsigned char blue; } Color;
Color carColor = {255, 65, 0};
NSLog(@"Your paint job is (R: %hhu, G: %hhu, B: %hhu)",
carColor.red, carColor.green, carColor.blue);
If you want to use C strings you need the following code:
typedef struct { unsigned char *FirstName; unsigned char *LastName; unsigned int age; } User;
User UserNick = {"Nick", "Watson", 24};
NSLog(#"Your paint job is (R: %s, G: %s, B: %u)",
UserNick.FirstName, UserNick.LastName, UserNick.age);
C strings are char *. C string literals need to be in quotes. %s is the format specifier for C strings.
One other suggestion - start field names (and variables names) with lowercase letters.
And since you are working with Objective-C, you would probably be better off making User a real class instead of a struct. Then you can use properties and proper memory management, the names can be NSString instead of C strings, and it becomes easy to store the objects in collections and do other useful things that are hard with a plain old struct.
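A minimal sketch of that suggestion (class, property, and variable names are illustrative, and ARC is assumed):

@interface User : NSObject
@property (nonatomic, copy) NSString *firstName;
@property (nonatomic, copy) NSString *lastName;
@property (nonatomic, assign) NSUInteger age;
@end

@implementation User
@end

// Usage:
User *nick = [[User alloc] init];
nick.firstName = @"Nick";
nick.lastName = @"Watson";
nick.age = 24;
NSLog(@"%@ %@ is %lu", nick.firstName, nick.lastName, (unsigned long)nick.age);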
In your definition, FirstName is an unsigned char, which means it is a variable that can hold only a single char as its value. However, "Nick" is a string, namely an array of chars.
One could do
typedef struct {
unsigned char * FirstName;
unsigned char * LastName;
unsigned int age;
} User;
User Nick = {"Nick", "Watson", 24};

Comparing char with enum

I have an enum defined this way:
typedef enum : unsigned char {
    START_DELIMITER = 0xAA,
    END_DELIMITER = 0xBB,
} Delimiter;
When I compare the delimiter value with a char byte from a const char*, like so:
// data is NSData;
const char *bytes = [data bytes];
if (bytes[0] == START_DELIMITER) { }
The above test is false even though bytes[0] contains 0xAA.
If I define START_DELIMITER as const char, the comparison is true. Why does the test against the enum fail even though the enum is already defined as unsigned char?
The char is signed, and the enum is unsigned. Before the comparison, both operands are promoted to int: bytes[0] holds 0xAA, which as a signed char is negative and sign-extends to -86, while START_DELIMITER promotes to 170, so the test can never be true.
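If that is what is happening, the usual fix (a sketch; either variant should work) is to compare unsigned values on both sides:

// Option 1: read the buffer as unsigned bytes from the start
const unsigned char *bytes = [data bytes];
if (bytes[0] == START_DELIMITER) { /* matches */ }

// Option 2: keep const char * and cast at the point of comparison
const char *signedBytes = [data bytes];
if ((unsigned char)signedBytes[0] == START_DELIMITER) { /* matches */ }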

Objective C SHA512 hash of two NSData

Here is some Java code which computes the SHA-512 hash of a byte array with a salt:
private static String DIGEST_ALGORITHM = "SHA-512";

public static byte[] getHash(final byte[] data, final byte[] salt) throws NoSuchAlgorithmException
{
    final MessageDigest md = MessageDigest.getInstance(DIGEST_ALGORITHM);
    md.reset();
    if (salt != null)
    {
        md.update(salt);
    }
    return md.digest(data);
}
In Objective-C, I use this algorithm to compute the hash of an NSData:
@implementation NSData (CommonDigest)

- (NSData *)SHA512Hash {
    unsigned char hash[CC_SHA512_DIGEST_LENGTH];
    (void) CC_SHA512([self bytes], (CC_LONG)[self length], hash);
    return [NSData dataWithBytes:hash length:CC_SHA512_DIGEST_LENGTH];
}

@end
This works perfectly and computes the same hash as the Java code if I use only a single piece of data (i.e. the salt is null in the Java code). The problem is that I also want to compute the hash of two NSData objects, i.e. when there is a salt (the second parameter in the Java code is not null). You can see in the Java code that, if the salt is not null, it performs an update and then calls the digest method. Somewhere I read that this is equivalent to concatenating the two byte arrays (the salt and data arrays, with System.arraycopy) and calling digest on the resulting array.
However, if I do this in Objective-C (with NSMutableData's appendData: method), I don't get the same result. How can I fix this?
I can see that there are similar functions in CommonDigest, but I don't know how to use them. I am thinking of these:
extern int CC_SHA512_Init(CC_SHA512_CTX *c);
extern int CC_SHA512_Update(CC_SHA512_CTX *c, const void *data, CC_LONG len);
extern int CC_SHA512_Final(unsigned char *md, CC_SHA512_CTX *c);
extern unsigned char *CC_SHA512(const void *data, CC_LONG len, unsigned char *md);
So I would like to create a method like this:
@implementation NSData (CommonDigest)
- (NSData *)SHA512HashWithSalt:(NSData *)salt {...}
I haven’t run this code and compared it with a Java implementation but it should work:
@implementation NSData (CommonDigest)

- (NSData *)SHA512HashWithSalt:(NSData *)salt {
    unsigned char hash[CC_SHA512_DIGEST_LENGTH];
    CC_SHA512_CTX context;
    CC_SHA512_Init(&context);
    if ([salt length]) {
        CC_SHA512_Update(&context, [salt bytes], (CC_LONG)[salt length]);
    }
    CC_SHA512_Update(&context, [self bytes], (CC_LONG)[self length]);
    CC_SHA512_Final(hash, &context);
    return [NSData dataWithBytes:hash length:CC_SHA512_DIGEST_LENGTH];
}

@end
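For what it's worth, the incremental calls above hash exactly the bytes of the salt followed by the bytes of the data, so they are equivalent to concatenating the two and hashing once; a mismatch with an appendData-based attempt is often just the pieces being appended in the opposite order. A small usage sketch, assuming data and salt are NSData instances and both category methods above are in scope:

// Incremental version from the answer above:
NSData *digest1 = [data SHA512HashWithSalt:salt];

// Single-shot equivalent: salt first, then data, then one hash call.
NSMutableData *salted = [NSMutableData dataWithData:salt];
[salted appendData:data];
NSData *digest2 = [salted SHA512Hash];   // digest1 and digest2 should be equal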

Casting NSString to unsigned char *

I'm trying to use a function that has the following signature to sign an HTTP request:
extern void hmac_sha1(const unsigned char *inText, int inTextLength, unsigned char* inKey, const unsigned int inKeyLength, unsigned char *outDigest);
And this is the method I wrote to use it:
- (NSString *)sign:(NSString *)stringToSign {
    NSString *secretKey = @"xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx";
    const unsigned char *inText = (unsigned char *)[stringToSign UTF8String];
    int inTextLength = [stringToSign length];
    unsigned char *inKey = (unsigned char *)[secretKey UTF8String];
    const unsigned int inKeyLength = (unsigned int)[secretKey length];
    unsigned char *outDigest;
    hmac_sha1(inText, inTextLength, inKey, inKeyLength, outDigest);
    NSString *output = [NSString stringWithUTF8String:(const char *)outDigest];
    return output;
}
The problem is I'm sure this is not the way I'm supposed to do this casting, as I get an EXC_BAD_ACCESS exception inside the hmac_sha1 function.
Since I am new to Objective-C and have close to no experience in C (surprise!) I don't really know what to search for. Any tips on how I can start solving this?
Thanks in advance!
BTW, I got the reference for this function here on Stack Overflow.
It looks like the problem is not with the casting, but with outDigest. The fifth argument to hmac_sha1 should point to an already allocated buffer of size 20 bytes (I think).
If you change the line that says
unsigned char *outDigest;
to say
#define HMACSHA1_DIGEST_SIZE 20
void *outDigest = malloc(HMACSHA1_DIGEST_SIZE);
That should get you past the crash inside hmac_sha1.
Then you've got the problem of converting the data at outDigest into an NSString. It looks like hmac_sha1 will put 20 bytes of random-looking data at outDigest, and not a null terminated UTF-8 string, so stringWithUTF8String: won't work. You might want to use something like this instead if you have to return an NSString:
NSString *output = [[NSString alloc] initWithBytesNoCopy:outDigest
                                                  length:HMACSHA1_DIGEST_SIZE
                                                encoding:NSASCIIStringEncoding
                                            freeWhenDone:YES];
I don't think NSString is really the right type for the digest, so it might be worth changing your method to return an NSData if you can.
This wasn't part of your question but it's a bug nonetheless: you shouldn't use -length to get the byte count of a UTF-8 string. That method returns the number of Unicode characters in the string, not the number of bytes. What you want is -lengthOfBytesUsingEncoding:.
NSUInteger byteCount = [stringToSign lengthOfBytesUsingEncoding:NSUTF8StringEncoding];
Also be aware that the result does not account for a terminating NULL character.
Are you sure you don't need to allocate some memory for outDigest before calling hmac_sha1? Since you pass in a pointer, rather than a pointer to a pointer, there's no way that the memory can be allocated inside the routine.
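Putting those suggestions together, here is a sketch of what the method could look like if it returns the raw digest as NSData (the method name, the 20-byte digest size, and the use of lengthOfBytesUsingEncoding: come from the answers above, not from any official hmac_sha1 documentation):

#define HMACSHA1_DIGEST_SIZE 20   // assumed digest size, per the answer above

- (NSData *)signData:(NSString *)stringToSign {
    NSString *secretKey = @"xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx";

    const char *textBytes = [stringToSign UTF8String];
    NSUInteger textLength = [stringToSign lengthOfBytesUsingEncoding:NSUTF8StringEncoding];

    const char *keyBytes = [secretKey UTF8String];
    NSUInteger keyLength = [secretKey lengthOfBytesUsingEncoding:NSUTF8StringEncoding];

    unsigned char digest[HMACSHA1_DIGEST_SIZE];   // caller-allocated output buffer

    hmac_sha1((const unsigned char *)textBytes, (int)textLength,
              (unsigned char *)keyBytes, (unsigned int)keyLength,
              digest);

    return [NSData dataWithBytes:digest length:HMACSHA1_DIGEST_SIZE];
}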