Comparing char with enum - Objective-C

I have an enum defined this way:
typedef enum : unsigned char {
    START_DELIMITER = 0xAA,
    END_DELIMITER = 0xBB,
} Delimiter;
When I compare the delimiter value with a char byte from a const char*, like so:
// data is an NSData object
const char *bytes = [data bytes];
if (bytes[0] == START_DELIMITER) { }
The above test is false even though bytes[0] contains 0xAA.
If I define START_DELIMITER as a const char, the comparison is true. Why does the test against the enum fail even though the enum is already defined as unsigned char?

char is signed on most platforms, while the enum's underlying type is unsigned char. Before the comparison, both operands are promoted to int: bytes[0], which holds the bit pattern 0xAA, is sign-extended to -86, while START_DELIMITER promotes to 170. Since -86 != 170, the test fails. Cast the byte to unsigned char before comparing.
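A minimal sketch of the fix (using the same enum and data as above):
const char *bytes = [data bytes];
// Reinterpret the byte as unsigned char so 0xAA compares as 170, not -86.
if ((unsigned char)bytes[0] == START_DELIMITER) {
    // matches as expected
}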


When to use size_t vs uint32_t?

I saw a method in a project that receives a parameter called length (of type uint32_t) denoting the length of the byte data it works on; the method calculates the CRC of the bytes it receives. The parameter's type was later refactored to size_t. Is there a technical advantage to using size_t in this case?
e.g.
- (uint16_t)calculateCRC16FromBytes:(unsigned char *)bytes length:(uint32_t)length;
- (uint16_t)calculateCRC16FromBytes:(unsigned char *)bytes length:(size_t)length;
According to the C specification
size_t ... is the unsigned integer type of the result of the sizeof operator
So any variable that holds the result of a sizeof operation should be declared as size_t. Since the length parameter in the sample prototype could be the result of a sizeof operation, it is appropriate to declare it as a size_t.
e.g.
unsigned char array[2000] = { 1, 2, 3 /* ... */ };
uint16_t result = [self calculateCRC16FromBytes:array length:sizeof(array)];
You could argue that the refactoring of the length parameter was pointlessly pedantic, since you'll see no difference unless:
a) size_t is more than 32 bits, and
b) the size of the array is more than 4 GB
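For illustration, here is a minimal bitwise CRC-16 sketch matching the size_t prototype above. The CRC-16/CCITT-FALSE variant (polynomial 0x1021, initial value 0xFFFF) is an assumption; the thread does not say which CRC-16 the project uses.
- (uint16_t)calculateCRC16FromBytes:(const unsigned char *)bytes length:(size_t)length
{
    uint16_t crc = 0xFFFF; // assumed initial value (CRC-16/CCITT-FALSE)
    for (size_t i = 0; i < length; i++) {
        crc ^= (uint16_t)((uint16_t)bytes[i] << 8);
        for (int bit = 0; bit < 8; bit++) {
            // Shift out the top bit; XOR in the polynomial when it was set.
            crc = (crc & 0x8000) ? (uint16_t)((crc << 1) ^ 0x1021) : (uint16_t)(crc << 1);
        }
    }
    return crc;
}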

Why can't I use a struct like this?

Why can't I use a struct like this?
typedef struct { unsigned char FirstName; unsigned char LastName; unsigned int age; } User;
User UserNick = {Nick, Watson, 24};
NSLog(@"Your paint job is (R: %NSString, G: %NSString, B: %u)",
      UserNick.FirstName, UserNick.LastName, UserNick.age);
I mean I have used a struct like this for sure:
typedef struct { unsigned char red; unsigned char green; unsigned char blue; } Color;
Color carColor = {255, 65, 0};
NSLog(@"Your paint job is (R: %hhu, G: %hhu, B: %hhu)",
      carColor.red, carColor.green, carColor.blue);
If you want to use C strings you need the following code:
typedef struct { unsigned char *FirstName; unsigned char *LastName; unsigned int age; } User;
User UserNick = {"Nick", "Watson", 24};
NSLog(@"Name: %s %s, age: %u",
      UserNick.FirstName, UserNick.LastName, UserNick.age);
C strings are char *. C string literals need to be in quotes. %s is the format specifier for C strings.
One other suggestion - start field names (and variable names) with lowercase letters.
And since you are working with Objective-C, you would probably be better off making User a real class instead of a struct. Then you can use properties and proper memory management, and the names can be NSStrings instead of C strings. That makes it easy to store the objects in collections and do other useful things that are hard with a plain old struct.
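A minimal sketch of that approach (the class and property names are illustrative):
@interface User : NSObject
@property (nonatomic, copy) NSString *firstName;
@property (nonatomic, copy) NSString *lastName;
@property (nonatomic, assign) unsigned int age;
@end

@implementation User
@end

// Usage:
User *nick = [[User alloc] init];
nick.firstName = @"Nick";
nick.lastName = @"Watson";
nick.age = 24;
NSLog(@"%@ %@, age %u", nick.firstName, nick.lastName, nick.age);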
In your definition, FirstName is an unsigned char, which means it is a variable that can hold only a single char as its value. However, "Nick" is a string, namely an array of chars.
One could do
typedef struct {
    unsigned char *FirstName;
    unsigned char *LastName;
    unsigned int age;
} User;
User Nick = {"Nick", "Watson", 24};

NSString to unsigned char []

I would like to convert
NSString *myString = @"0x10 0x1c 0x37 0x00"; // acquired by reading a text file using NSString methods
to
unsigned char convertedfrommyString[] = { 0x10, 0x1c, 0x37, 0x00 };
My goal is to acquire the bytes and then byte-swap them using this code:
unsigned char convertedfrommyString[] = { 0x10, 0x1c, 0x37, 0x00 };
int data = *((int *) convertedfrommyString);
NSLog(@"log = %08x", data);
the output should be:
log = 00371c10
Any help?
EDIT
Thanks to both Jan Baptiste Younès and Sven I found the way to understand my problem and solve it with this code:
NSString *myString = [[@"0x10 0x1c 0x37 0x00" stringByReplacingOccurrencesOfString:@"0x" withString:@""] stringByReplacingOccurrencesOfString:@" " withString:@""];
unsigned result = 0;
NSScanner *scanner = [NSScanner scannerWithString:myString];
[scanner scanHexInt:&result];
int reverse = NSSwapInt(result);
NSLog(@"scanner: %8u", result);
NSLog(@"bytes: %08x", result);
NSLog(@"reverse: %08x (that is what I need!)", reverse);
Really OK!
But can I accept two answers?
That's more than a simple conversion; you actually need to parse the values from your string. You can use NSScanner to do this.
NSScanner *scanner = [NSScanner scannerWithString:@"0x10 0x1c 0x37 0x00"];
unsigned char convertedfrommyString[4];
unsigned index = 0;
while (![scanner isAtEnd]) {
    unsigned value = 0;
    if (![scanner scanHexInt:&value]) {
        // invalid value
        break;
    }
    convertedfrommyString[index++] = value;
}
Of course this sample is missing error handling (an individual value might not fit into an unsigned char, or there could be more than four values).
But this solves only half of your problem. The other issue is converting the bytes to an int. You did this by casting the unsigned char pointer to an int pointer, which is not portable (the result depends on byte order) and also not legal in C. To always get the result you want, you should instead use bit shifts to assemble your int. So inside your loop you could do
result = result | (value << i);
i += 8;
instead of putting the values inside an unsigned char array. For this, result and i should both be initialized to zero.
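Putting the scanning and the shifting together, a self-contained sketch (variable names are illustrative):
NSScanner *scanner = [NSScanner scannerWithString:@"0x10 0x1c 0x37 0x00"];
unsigned result = 0;
int i = 0;
while (![scanner isAtEnd] && i < 32) { // at most four bytes fit into 32 bits
    unsigned value = 0;
    if (![scanner scanHexInt:&value]) {
        break; // invalid token
    }
    result |= value << i; // the first byte lands in the low-order bits
    i += 8;
}
NSLog(@"log = %08x", result); // prints "log = 00371c10"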
You may cut your original string at spaces and use the solution given here: Objective-C parse hex string to integer. You can also use scanUpToString:intoString: to parse up to space characters.
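A sketch of that alternative, splitting on spaces and parsing each token separately (the four-byte limit mirrors the question):
NSArray *tokens = [@"0x10 0x1c 0x37 0x00" componentsSeparatedByString:@" "];
unsigned char bytes[4] = {0};
NSUInteger count = MIN(tokens.count, (NSUInteger)4);
for (NSUInteger i = 0; i < count; i++) {
    unsigned value = 0;
    if ([[NSScanner scannerWithString:tokens[i]] scanHexInt:&value]) {
        bytes[i] = (unsigned char)value; // values are assumed to fit in one byte
    }
}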

OpenCL Kernel: unsigned char -> signed char (aka cl_char)

The following kernel accepts a char* array (alphabet):
kernel void generate_cl(global char* alphabet,
                        global int* rand_buffer,
                        int len,
                        int max,
                        global bool *stop)
However, once compiled, the generated declaration becomes:
extern void (^generate_cl_kernel)(const cl_ndrange *ndrange, cl_char* alphabet, cl_int* rand_buffer, cl_int len, cl_int max, bool* stop);
Obviously alphabet is now a cl_char* (a.k.a. signed char*).
My problem: I need an unsigned/const char array (see the code below).
My question: how do I cast unsigned char to signed char (if possible)? Or is there another approach?
const char *alphabet_ = ... // char array, received from [NSString UTF8String]
generate_cl_kernel(&range, alphabet_, ..); // throws a semantic issue [!]

Casting from const void to char?

Alright, I'm hashing an image. And as you all know, hashing an image takes FOREVER. So I'm taking 100 samples of the image, evenly spaced out. Here's the code.
#define NUM_HASH_SAMPLES 100

@implementation UIImage (Powow)

- (NSString *)md5Hash
{
    NSData *data = UIImagePNGRepresentation(self);
    char *bytes = (char *)malloc(NUM_HASH_SAMPLES * sizeof(char));
    for (int i = 0; i < NUM_HASH_SAMPLES; i++)
    {
        int index = i * data.length / NUM_HASH_SAMPLES;
        bytes[i] = (char)(data.bytes[index]); // Operand of type 'const void' where arithmetic or pointer type is required
    }
    unsigned char result[CC_MD5_DIGEST_LENGTH];
    CC_MD5(bytes, NUM_HASH_SAMPLES, result);
    return [NSString stringWithFormat:
        @"%02x%02x%02x%02x%02x%02x%02x%02x%02x%02x%02x%02x%02x%02x%02x%02x",
        result[0], result[1], result[2], result[3],
        result[4], result[5], result[6], result[7],
        result[8], result[9], result[10], result[11],
        result[12], result[13], result[14], result[15]
    ];
}
The error is on the commented line.
What am I doing wrong?
data.bytes is a const void *, so it makes no sense to dereference it (or even to perform the necessary pointer arithmetic on it).
So, if you meant to take a byte out of the data, then obtain a pointer to const unsigned char and dereference that:
const unsigned char *src = data.bytes;
/* ..then, in your loop.. */
bytes[i] = src[index];
Oh, and do not cast the return value of malloc()!
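(In C, and therefore in Objective-C, the void * returned by malloc() converts implicitly, so this is enough:)
char *bytes = malloc(NUM_HASH_SAMPLES * sizeof *bytes); // no cast needed in C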
According to the documentation for NSData, data.bytes returns a const void *. Basically, you're trying to dereference a pointer to void, which makes no sense since void has no size.
Cast it to a char pointer and dereference it.
((const char *)data.bytes)[index]
or
*((const char *)data.bytes + index)
Edit: What I'd normally do is assign the pointer to a known data type straight away and use that instead.
I.e.
const char *src = data.bytes;
bytes[i] = src[index];
Edit2: You might also want to leave the const qualifier in as suggested by H2CO3. That way you won't accidentally write to a location you're not supposed to.
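Pulling the answers together, a corrected sketch of the whole method. This is only a sketch under the thread's assumptions: the stack buffer (replacing the leaked malloc() block) and the hex-formatting loop are editorial substitutions, and CommonCrypto must be imported.
#import <CommonCrypto/CommonDigest.h>

- (NSString *)md5Hash
{
    NSData *data = UIImagePNGRepresentation(self);
    const unsigned char *src = data.bytes; // typed pointer, as the answers suggest
    unsigned char samples[NUM_HASH_SAMPLES];
    for (int i = 0; i < NUM_HASH_SAMPLES; i++) {
        NSUInteger index = (NSUInteger)i * data.length / NUM_HASH_SAMPLES;
        samples[i] = src[index];
    }
    unsigned char digest[CC_MD5_DIGEST_LENGTH];
    CC_MD5(samples, NUM_HASH_SAMPLES, digest);
    NSMutableString *hash = [NSMutableString stringWithCapacity:CC_MD5_DIGEST_LENGTH * 2];
    for (int i = 0; i < CC_MD5_DIGEST_LENGTH; i++) {
        [hash appendFormat:@"%02x", digest[i]];
    }
    return hash;
}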