Why does passing an unsigned int to performSelector lose bits? - objective-c

I am trying to pass a hex value as an unsigned int to a method that I invoke dynamically via performSelector:withObject:. The value I pass as a parameter is getting corrupted somehow. What is happening?
- (void)callPerformSelector
{
    NSNumber *argument = [NSNumber numberWithUnsignedInt:(unsigned int)0xFFFFFFFF];
    SEL selector = NSSelectorFromString(@"testPerformSelector:");
    NSLog(@"testPerformSelector object %@", argument);
    [self performSelector:selector withObject:argument];
}

- (void)testPerformSelector:(unsigned int)arg1
{
    NSLog(@"testPerformSelector unsigned int %u", arg1);
    NSLog(@"testPerformSelector hex %X", arg1);
}
Output is:
testPerformSelector object 4294967295
testPerformSelector unsigned int 4294967283
testPerformSelector hex FFFFFFF3

Because it should be:
- (void)callPerformSelector
{
    NSNumber *argument = @0xFFFFFFFF;
    SEL selector = @selector(testPerformSelector:);
    NSLog(@"testPerformSelector object %@", argument);
    [self performSelector:selector withObject:argument];
}

- (void)testPerformSelector:(NSNumber *)arg1
{
    NSLog(@"testPerformSelector unsigned int %u", arg1.unsignedIntValue);
}
unsigned int and NSNumber * are two different things.

There's a simple reason and a complex reason.
The simple reason (why this doesn't work): the argument delivered to the target of performSelector:withObject: is always an object. Your method signature declares an unsigned int parameter, but you then pass an object (an NSNumber) when you call it. So instead of:
- (void)testPerformSelector:(unsigned int) arg1
you should have
- (void)testPerformSelector:(NSNumber *) arg1
You will then need to use NSNumber's unsignedIntValue to get the 0xFFFFFFFF out of the object.
The complex reason is much more interesting: why this nearly works and merely looks like it loses a few bits. An NSNumber is an object that wraps a numeric value, which is very different from the raw numeric value itself. However, a subset of NSNumber values are implemented as tagged pointers: Objective-C still knows the value is an object and treats it like one, but the real value is encoded directly into the "pointer", and the fact that this is not a normal pointer is flagged in the (otherwise always zero) bottom four bits; see Mike Ash's Friday Q&A on tagged pointers. Your method then reinterprets that tagged-pointer bit pattern as an unsigned int, which is why the low bits come out wrong.
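A quick way to see this (a minimal sketch; the exact bit layout is an implementation detail that varies by OS and architecture) is to print the NSNumber's wrapped value next to its raw pointer bits:

NSNumber *n = @0xFFFFFFFF;
NSLog(@"wrapped value : %X", n.unsignedIntValue);   // FFFFFFFF
// On runtimes that tag small NSNumbers, the "pointer" is not a real address:
// the payload and tag bits are packed into it, and that bit pattern is what
// the receiving method prints when it treats the argument as an unsigned int.
NSLog(@"pointer bits  : %p", (__bridge void *)n);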

You can only pass objects, not primitive types, through performSelector:withObject:, and therefore the method should be:
- (void)testPerformSelector:(NSNumber *)arg1
{
    NSLog(@"testPerformSelector hex %x", [arg1 unsignedIntValue]);
}
Update: As pointed out by @gnasher729, the reason the number passed appears as -13 (0xFFFFFFF3) is that it's a tagged pointer.
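If the receiving method really must keep its unsigned int parameter, one alternative (a sketch, not part of the answers above) is NSInvocation, which can pass a raw primitive argument instead of a boxed NSNumber:

SEL selector = NSSelectorFromString(@"testPerformSelector:");
NSMethodSignature *signature = [self methodSignatureForSelector:selector];
NSInvocation *invocation = [NSInvocation invocationWithMethodSignature:signature];
invocation.target = self;
invocation.selector = selector;
unsigned int value = 0xFFFFFFFF;
[invocation setArgument:&value atIndex:2];   // indices 0 and 1 are self and _cmd
[invocation invoke];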

Related

How to properly save the pointer of a local variable in NSValue for further usage

I have an Objective-C method like the one below:
- (NSValue *)foo:(NSString *)str {
    NSValue *val = [NSValue valueWithPointer:str.UTF8String];
    char *ptr = [val pointerValue];
    return val;
}
It takes an NSString parameter, str. Inside the method I get the UTF-8 representation of the string, which is a copy of the string's characters (i.e. the pointers for str and str.UTF8String are different).
I then store this pointer in an NSValue and return that NSValue for further use.
However, I've observed that the buffer returned by str.UTF8String behaves like a local of this method and is released once we return from it, leaving the stored pointer dangling.
Thus, when I try to use the pointer in the NSValue to recreate the string, I get nil or a garbage value; occasionally I get the correct value.
How can I safely save the contents of str.UTF8String for further use?
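A minimal sketch of one common fix, assuming the goal is simply to keep the bytes alive after the method returns: copy them into an object that owns its storage (NSData here) instead of storing the temporary pointer:

- (NSData *)foo:(NSString *)str {
    const char *utf8 = str.UTF8String;   // temporary buffer
    // Copy the bytes (including the trailing NUL) so they outlive the
    // autoreleased UTF-8 buffer backing str.UTF8String.
    return [NSData dataWithBytes:utf8 length:strlen(utf8) + 1];
}

The copied bytes can later be read back with data.bytes.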

How can I distinguish a boolean NSNumber from a real number?

I'm getting a value back from an Objective C library (it's Firebase, but that doesn't really matter) of type id. The documentation states that this value will be an NSNumber for both boolean and numeric results. I want to take a different action based on whether or not this result corresponds to a boolean.
I know this is possible because printing out the class of the result via NSStringFromClass([value class]); gives "__NSCFBoolean" for booleans, but I'm not really sure how to correctly structure the comparison.
The objCType method gives information about the type of the data contained in the number object:
NSNumber *n = @(1.3);
NSLog(@"%s", [n objCType]); // "d" for double

NSNumber *b = @YES;
NSLog(@"%s", [b objCType]); // "c" for char
The possible values are documented under "Type Encodings" in the "Objective-C Runtime Programming Guide".
Since BOOL is defined as a char-sized type, a boolean NSNumber is reported as "c" by this method. This means you cannot distinguish it from an NSNumber containing an arbitrary char. But it is sufficient to distinguish "boolean" from "numeric":
if (strcmp([obj objCType], @encode(BOOL)) == 0) {
    // ...
} else if (strcmp([obj objCType], @encode(double)) == 0) {
    // ...
} else {
    // ...
}
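Wrapped up as a small helper (a sketch; the name MYIsBooleanNumber and the surrounding usage are invented for illustration, building on the objCType check above):

static BOOL MYIsBooleanNumber(NSNumber *number) {
    // Compare against the encoding of a known boolean NSNumber rather than
    // @encode(BOOL), which sidesteps BOOL having a different underlying type
    // on some architectures.
    return strcmp([number objCType], [@YES objCType]) == 0;
}

// Usage with an id result like the one described in the question:
id value = @YES;
if ([value isKindOfClass:[NSNumber class]] && MYIsBooleanNumber(value)) {
    // boolean result
} else {
    // numeric result
}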

Objective-C: How to check if a variable is an object, a struct or another primitive

I want to write a function or a directive like NSLog() that takes any kind of variable, primitives and objects. In that function I want to distinguish those.
I know how it works for objects:
- (void)test:(id)object {
if ([object isKindOfClass:[NSString class]])
...
but how do I distinguish objects from structs, or even from ints or floats?
Something like:
"isKindOfStruct:CGRect" or "isInt"
for example?
Is this possible?
I thought that since you can pass anything to NSLog(@"...", objects, ints, structs), it must be possible?
Thanks for any help!
EDIT
My ultimate goal is to implement some kind of polymorphism.
I want to be able to call my function:
MY_FUNCTION(int)
MY_FUNCTION(CGRect)
MY_FUNCTION(NSString *)
...
or [self MYFUNCTION:int]...
and in MY_FUNCTION
-(void)MYFUNCTION:(???)value {
if ([value isKindOf:int])
...
else if ([value isKindOf:CGRect])
...
else if ([value isKindOfClass:[NSString class]])
...
}
I know that isKindOf doesn't exist and that you can't even call such methods on primitives. I'm also not sure about the "???" generic type of "value" in the method header.
Is that possible?
#define IS_OBJECT(T) _Generic( (T), id: YES, default: NO)
NSRect a = (NSRect){1,2,3,4};
NSString *b = @"whatAmI?";
NSInteger c = 9;

NSLog(@"%@", IS_OBJECT(a) ? @"YES" : @"NO"); // -> NO
NSLog(@"%@", IS_OBJECT(b) ? @"YES" : @"NO"); // -> YES
NSLog(@"%@", IS_OBJECT(c) ? @"YES" : @"NO"); // -> NO
Also, check out Vincent Gable's "The Most Useful Objective-C Code I've Ever Written" for some very handy stuff that uses the @encode() compiler directive, which "returns a string describing any type it's given".
LOG_EXPR(x) is a macro that prints out x, no matter what type x is, without having to worry about format strings (and the related crashes from, e.g., printing a C string the same way as an NSString). It works on Mac OS X and iOS.
A function like NSLog() can tell what types to expect in its parameter list from the format string that you pass as the first parameter. So you don't query the parameter to figure out its type; you figure out what type you expect based on the format string, and then you interpret the parameter accordingly.
You can't pass a C struct or primitive as a parameter of type id. To do so, you'll have to wrap the primitive in an NSNumber or NSValue object.
e.g.
[self test:[NSNumber numberWithInt:3]];
id is defined as a pointer to an Objective-C object.
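The same idea applies to structs, using NSValue instead of NSNumber; a minimal sketch (assuming a CGRect and the generic @encode-based constructor):

CGRect rect = CGRectMake(0, 0, 10, 20);
// Box the struct so it can travel through an id / object parameter.
NSValue *boxed = [NSValue valueWithBytes:&rect objCType:@encode(CGRect)];
[self test:boxed];

// Later, unbox it again:
CGRect unboxed;
[boxed getValue:&unboxed];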
@alex gray's answer did not work (or at least did not work on the iOS 8.0 SDK). You can use @deepax11's answer, but I want to point out how this 'magic macro' works. It relies on the type encodings provided by the system. As per the Apple documentation:
To assist the runtime system, the compiler encodes the return and argument types for each method in a character string and associates the string with the method selector. The coding scheme it uses is also useful in other contexts and so is made publicly available with the @encode() compiler directive. When given a type specification, @encode() returns a string encoding that type. The type can be a basic type such as an int, a pointer, a tagged structure or union, or a class name—any type, in fact, that can be used as an argument to the C sizeof() operator.
To break the macro apart: we first take typeof our variable, then call @encode() on that type, and finally compare the returned value to the 'object' and 'class' entries in the encoding table.
Full example should look like:
const char* myType = @encode(typeof(myVar)); // myVar declared somewhere
if( [@"@" isEqualToString:@(myType)] || [@"#" isEqualToString:@(myType)] )
{
    // myVar is an object (id) or a Class
}
else if ( NSNotFound != [[NSString stringWithFormat:@"%s", myType] rangeOfCharacterFromSet:[NSCharacterSet characterSetWithCharactersInString:@"{}"]].location )
{
    // myVar is a struct
}
else if ( [@"i" isEqualToString:@(myType)] )
{
    // myVar is an int
}
Please note that NSInteger will be reported as int on 32-bit devices and as long on 64-bit devices. The full list of encodings:
'c' - char
'i' - int
's' - short
'l' - long
'q' - long long
'C' - unsigned char
'I' - unsigned int
'S' - unsigned short
'L' - unsigned long
'Q' - unsigned long long
'f' - float
'd' - double
'B' - C++ bool or a C99 _Bool
'v' - void
'*' - character string (char *)
'@' - object (whether statically typed or typed id)
'#' - class object (Class)
':' - method selector (SEL)
'[<some-type>]' - array
'{<some-name>=<type1><type2>}' - struct (see the example below)
'b<num>' - bit field of <num> bits
'^<type>' - pointer to <type>
'?' - unknown type (may be used for function pointers)
Read more about Type Encodings at Apple
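To make the struct encoding from the list above concrete, a quick sketch (output shown for a typical 64-bit build where CGFloat is double; the exact strings are platform-dependent):

NSLog(@"%s", @encode(CGRect));   // e.g. {CGRect={CGPoint=dd}{CGSize=dd}}
NSLog(@"%s", @encode(NSRange));  // e.g. {_NSRange=QQ}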
#define IS_OBJECT(x) ( strchr("@#", @encode(typeof(x))[0]) != NULL )
This macro, which I picked up somewhere on Stack Overflow, works.
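A quick usage sketch (variable names invented for illustration); the macro just checks whether the first character of the @encode string is '@' (object) or '#' (Class):

NSString *name = @"hello";
CGRect frame = CGRectMake(0, 0, 1, 1);
int count = 3;

NSLog(@"%@", IS_OBJECT(name)  ? @"object" : @"not an object");   // object
NSLog(@"%@", IS_OBJECT(frame) ? @"object" : @"not an object");   // not an object
NSLog(@"%@", IS_OBJECT(count) ? @"object" : @"not an object");   // not an object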
It's important to note that id represents any Objective-C object, and by Objective-C object I mean one that is defined using @interface. It does not represent a struct or a primitive type (int, char, etc.).
Also, you can only send messages (the [...] syntax) to Objective-C objects, so you cannot send the isKindOf: message to a plain struct or primitive.
But you can convert an integer etc. to an NSNumber, a char * to an NSString, and wrap a structure inside an NSObject-derived class. Then they will be Objective-C objects.

Why am I getting an integer to pointer conversion error in objective-c?

I am looping through an NSString object called previouslyDefinedNSString and verifying if the integer representing the ASCII value of a letter is in an NSMutableSet called mySetOfLettersASCIIValues, which I had previously populated with NSIntegers:
NSInteger ASCIIValueOfLetter;
for (int i = 0; i < [previouslyDefinedNSString length]; i++) {
    ASCIIValueOfLetter = [previouslyDefinedNSString characterAtIndex:i];
    // if character ASCII value is in set, perform some more actions...
    if ([mySetOfLettersASCIIValues member: ASCIIValueOfLetter])
However, I am getting these errors on the condition of the if statement:
Incompatible integer to pointer conversion sending 'NSInteger' (aka 'int') to parameter of type 'id';
Implicit conversion of 'NSInteger' (aka 'int') to 'id' is disallowed with ARC
What do these errors mean? How am I converting to an object type (which id represents, right?)? Isn't NSInteger an object?
You want to make it an NSNumber, as in:
NSInteger ASCIIValueOfLetter;
for (int i = 0; i < [previouslyDefinedNSString length]; i++) {
    ASCIIValueOfLetter = [previouslyDefinedNSString characterAtIndex:i];
    // if character ASCII value is in set, perform some more actions...
    if ([mySetOfLettersASCIIValues member: [NSNumber numberWithInteger: ASCIIValueOfLetter]])
Now you're going to have the result you're looking for.
These errors mean that member: expects an object. id is a pointer to an Objective-C object, and instead of an object you're passing in a primitive type, or scalar (despite its NS prefix, NSInteger is not an object, just a typedef for a primitive value; in your case, an int). What you need to do is wrap that scalar value in an object, specifically an NSNumber, which is a class designed for exactly this.
Instead of calling member: with ASCIIValueOfLetter, you need to call it with the wrapped value, [NSNumber numberWithInteger:ASCIIValueOfLetter], as Maurício mentioned.
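A slightly more compact sketch of the same fix, using NSNumber boxed-literal syntax:

// @( ... ) boxes the scalar into an NSNumber, equivalent to
// [NSNumber numberWithInteger:ASCIIValueOfLetter].
if ([mySetOfLettersASCIIValues member:@(ASCIIValueOfLetter)]) {
    // character's value is in the set
}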

Using sizeof Correctly with Byte[]

I'm sort of out of my depth here, but I have the following code (the real code actually has a point, of course):
- (NSData *)dataTheseBytes:(Byte[])bytes {
    return [NSData dataWithBytes:bytes length:sizeof(bytes)];
}
The compiler warning is
Sizeof on array function parameter will return size of 'Byte *' (aka
'unsigned char *') instead of 'Byte []'
How can I eliminate this warning (or rather, what am I not understanding about my array of bytes)?
Additionally, why doesn't the error happen with this code? Must have something to do with the method signature...?
Byte bytes[3] = { byte1, byte2, byte3 };
NSData *retVal = [NSData dataWithBytes:bytes length:sizeof(bytes)];
When you pass a C array as a method or C function argument, it "decays" to a pointer to the underlying type (i.e. Byte[] is actually passed as Byte *.) So the called method/function has no idea how many elements are present in the array.
You must also pass the length of the array in order for the called code to know what you want. That's why +[NSData dataWithBytes:length:] has that second argument.
C arrays do not embed their element count.
This is how you would declare a method with an unspecified element count; it is not generally usable on its own:
- (NSData *)dataTheseBytes:(const Byte *)bytes;
// or
- (NSData *)dataTheseBytes:(const Byte[])bytes;
A more rigid implementation could specify the element count. This is fine if you are always using the same size. Example:
enum { MONByteBufferElementCount = 23 };
...
- (NSData *)dataTheseBytes:(const Byte[MONByteBufferElementCount])bytes
{
    return [NSData dataWithBytes:&bytes[0] length:MONByteBufferElementCount * sizeof(bytes[0])];
}
The problem with using Objective-C messaging in this case is that the compiler may not be able to determine the appropriate selector, or produce an error or warning, if you have declared a selector with the same name but different parameter types or element counts. Therefore, it's safer to use a C function:
NSData *DataTheseBytes(const Byte bytes[MONByteBufferElementCount]) {
    return [NSData dataWithBytes:&bytes[0] length:MONByteBufferElementCount * sizeof(bytes[0])];
}
Or to use a more verbose name:
- (NSData *)dataWithMONByteBuffer:(const Byte[MONByteBufferElementCount])bytes
{
    return [NSData dataWithBytes:&bytes[0] length:MONByteBufferElementCount * sizeof(bytes[0])];
}
In Objective-C, it's most common to pass the length as an argument, similar to the NSData constructor you call. Some part of your program will be able to determine this value (whether the source is an NSData, a C array, or something else).
- (NSData *)dataTheseBytes:(const Byte *)bytes length:(NSUInteger)length
{
    return [NSData dataWithBytes:bytes length:length];
}
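For completeness, a caller-side sketch; here sizeof(bytes) is meaningful because bytes is a real array in the caller's scope, not a decayed pointer:

Byte bytes[3] = { 0x01, 0x02, 0x03 };
// sizeof(bytes) is 3: the array type is still fully known at this point.
NSData *data = [self dataTheseBytes:bytes length:sizeof(bytes)];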
It's also common to see an element count rather than a byte length, like so:
- (NSData *)dataTheseFloats:(const float *)floats length:(NSUInteger)count
{
    return [NSData dataWithBytes:floats length:count * sizeof(float)];
}
Finally, there are of course a few corner cases, the obvious one being a NUL-terminated string:
- (NSData *)dataWithASCIIString:(const char *)chars
{
    return [NSData dataWithBytes:chars length:strlen(chars)];
}
You cannot pass arrays to a function; you're actually passing a pointer to the first element of the caller's array.
If you need the length of that array, you need to pass that length as a separate argument to your function and use that instead of sizeof.