Is there a way to declare an objective-C block typedef whose arguments contain that typedef? - objective-c

Here's an interesting one for the objective-C gurus out there...
Is there a way to declare an objective-C block typedef that contains an argument of that typedef?
typedef BOOL (^SSCellAction) ( UITableViewController* inTVC, SSCellAction inChainedAction );
The idea is that I wanted to use a chained menu action system that allows a chain of work/response to occur (usually 1-3 items). When the last action is invoked, it passes nil for inChainedAction. Since this seems relatively trivial to imagine, I'll be damned if I can't figure out how to declare it without LLVM saying no. :)

rmaddy's comment is correct. Just as in C, a typedef cannot use itself. Basically, a typedef does not make a real type, but just makes an alias that the compiler expands out at compile-time. It is always possible to manually expand all typedefs in your program yourself (which is sometimes an instructive exercise), so that your program is written without typedefs. However, a recursive typedef cannot be expanded.
Some possible workarounds:
Use id as the parameter type, and cast back into the right type inside the block. This loses type safety.
Or, use a struct type with one member, the block. A struct is a real type, so it can be used within its own definition. The downside is that you have to explicitly "wrap" the block into the struct type to pass it, and explicitly "unwrap" the struct into the block by accessing the field when you need to call it. This way is type-safe (see the sketch below).
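A minimal sketch of the struct workaround, assuming the types from the question (the wrapper name and the chain below are made up for illustration):

#import <UIKit/UIKit.h>

typedef struct SSCellActionWrapper SSCellActionWrapper;
struct SSCellActionWrapper {
    // The block receives the wrapper struct rather than its own typedef.
    BOOL (^action)(UITableViewController *inTVC, SSCellActionWrapper inChained);
};

static BOOL runChain(UITableViewController *tvc)
{
    SSCellActionWrapper last = { ^BOOL(UITableViewController *inTVC, SSCellActionWrapper chained) {
        // end of the chain: chained.action is nil here
        return YES;
    } };

    SSCellActionWrapper first = { ^BOOL(UITableViewController *inTVC, SSCellActionWrapper chained) {
        // do this step's work, then unwrap and invoke the next action, if any
        return chained.action ? chained.action(inTVC, (SSCellActionWrapper){ nil }) : YES;
    } };

    // first receives 'last' wrapped as its chained action
    return first.action(tvc, last);
}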

Related

Isn't pointer type checking disabled in DLL/C-Connect, and is that OK?

After this somewhat related question, Why can't I pass an UninterpretedBytes to a void* thru DLL/C-Connect?, where we saw that I could not pass a Smalltalk array of bits to a void * parameter, I further analyzed the method responsible for checking the compatibility of the formal pointer description with the actual object passed as argument, and I think I discovered another questionable piece:
CPointerType>>coerceForArgument: anObject
    ...snip...
    (anObject isKindOf: self defaultDatumClass)
        ifTrue: [
            (referentType = anObject type referentType
                or: [(referentType isVoid
                        and: [anObject type referentType isConstant not])
                    or: [anObject type isArray not
                        or: [anObject type baseArrayType = referentType]]])
                ifTrue: [^anObject asPointer]].
    ...snip...
It means the following:
It first checks whether the argument is a CDatum (a proxy to some C-formatted raw data and its associated CType).
If so, it checks whether the type is the same as the formal definition in the external method prototype (self).
If not, it could be that the formal definition is void *, in which case any kind of pointer is accepted (the code that I snipped has already checked that it is a pointer), except if it is a pointer to a const thing.
There is a first discrepancy: it should check whether the formal definition is const void * and accept any pointer to const in that case... But that does not matter much, we rarely have actual arguments declared const.
If not, it checks whether the argument is either not an array (for example, int foo[2]), or an array whose type matches (same base type and dimension).
So, if the formal definition is, for example, struct {int a; char *b} *foo, and I pass a double * bar, the types do not match, there is no const qualifier mismatch, and the argument is not an array; conclusion: we can safely pass it without any further checking!
That's a kind of pointer aliasing. We do not have an optimizing compiler making any speculation about the absence of such aliasing in Smalltalk, so that won't be the source of undefined behaviour. It could be that we deliberately want to force this sort of dirty reinterpret_cast for obscure reasons (since we can explicitly cast a CDatum, I would prefer the explicit way).
BUT, it might be that we completely messed up and passed the wrong object, with the wrong type and the wrong dimension, and then the address foo->b in my example above will contain either some re-interpreted garbage if the pointer is 32-bit aligned, or be completely undefined on a 64-bit machine (because it lies beyond sizeof(double)).
A C compiler would warn me for sure about the mismatch, and would prevent production of the artifact with -Wall -Werror.
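In C terms, the situation looks roughly like this (a minimal sketch, not taken from the actual interface):

struct Foo { int a; char *b; };

static void takesFoo(struct Foo *foo) { (void)foo; }

static void demo(void)
{
    double d = 0.0;
    double *bar = &d;
    takesFoo(bar);   /* the compiler warns here about incompatible pointer types;
                        with -Wall -Werror this refuses to build */
}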
What troubles me here is that I do not even get a warning...
Does it sound correct?
Short answer: it's not OK to correct this behavior, because some low level user interface stuff depends on it (event loop). We can't even introduce a Warning or anything.
Longer story: I tried to rewrite the whole method with double dispatching (asking anObject whether it is compatible with the formal CPointerType, rather than testing every possible Object class with repeated isKindOf:).
But when omitting the inelegant pointer aliasing tolerance, it invariably screwed up my 8.3 image on Mac OS X, with tons of blank windows opening and a blocked, uninterruptible UI...
After instrumenting, it appears that the event loop relies on it, and passes aString asNSString (which is transformed into UTF-16, but stored in a ByteArray and thus declared unsigned char *) to an Objective-C method expecting an unsigned short *.
It's a case where the pointer aliasing is benign, as long as we pass the good bytes.
If I try and fix asNSString with a proper cast to unsigned short *, then the UI blocks (I don't know why; it would require debugging at the VM level).
Conclusion: it's true that mixing types such as (unsigned char *) vs (char *) can be legitimate and is better not completely prohibited (whether char is signed or not is platform dependent, and not all libraries have cleanly defined APIs). The same goes for platform-dependent wide characters: we have conversion methods producing the right bytes, but not the right types. We could eventually make an exception for char * like we did for void * (before void * was introduced, char * was the way to do it anyway)... Right now, I have no good solution for this because of the event loop.

Objective c and struct

I have this problem. This is the external library that I must use, in a .h file:
typedef struct _IPCSSContext IPCSSContext;
IPCSSContext * ipcssnew(const IPCSSCfg *_config, const IPCSSCallbacks *_callbacks, void *_user);
How can I use it? Thanks
First you define a variable of type IPCSSCallbacks, whatever that might be; it's probably a struct of function pointers.
Then you fill in the fields of the variable with pointers to your callback functions.
Then you call ipcssnew(), passing the config, the IPCSSCallbacks and a pointer to anything you like. This pointer will be passed untouched to your callback functions when they are called, and you can do what you like with it in the callbacks (including nothing).
This is a pretty standard pattern in C for performing callbacks.
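A minimal sketch of that pattern (the header name, the IPCSSCallbacks field and the callback signature are hypothetical; the real ones are dictated by the library's header):

#include "ipcss.h"   /* hypothetical header name; must declare IPCSSCfg, IPCSSCallbacks, IPCSSContext */

typedef struct { int eventCount; } MyAppState;   /* anything you like */

/* Hypothetical callback signature -- the real one comes from IPCSSCallbacks. */
static void myOnEvent(int eventCode, void *user)
{
    MyAppState *state = (MyAppState *)user;   /* the pointer handed to ipcssnew() comes back untouched */
    state->eventCount++;
    (void)eventCode;
}

static IPCSSContext *startIPCSS(const IPCSSCfg *config)
{
    static MyAppState state;                   /* must outlive the context */
    static IPCSSCallbacks callbacks;           /* likewise, in case the library keeps the pointer */

    callbacks.onEvent = myOnEvent;             /* hypothetical field name -- use the real ones from the header */

    return ipcssnew(config, &callbacks, &state);
}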

Why are Objective-C blocks not handled through pointers as are other Objective-C objects?

In code that uses blocks, you frequently see declarations such as:
typedef void(^Thunk)(void);
Thunk block1 = ^{NSLog(@"%d %p", i, &i);};
instead of
typedef void(^Thunk)(void);
Thunk *block1 = ^{NSLog(@"%d %p", i, &i);};
Blocks seem to be the only Objective-C object that is handled directly, instead of through a pointer. Why is this? Aren't they regular objects?
If you're wondering about the missing *: In your example you typedef'd the block which is hiding the syntax. You can do this with other objects as well:
typedef NSNumber *Number;
Number foo = @42;
When assigning a block to a variable that's actually assigning a pointer.
Blocks are a bit like C arrays: Block literals are structures created by the compiler. Block variables are pointers behind the scenes.
You're not handling a block “directly”. You're handling it indirectly.
The syntax void (^)(void) declares a pointer to a block. There is no syntax for type “block”; there is only syntax for type “pointer to a block”.
From the Language Specification for Blocks:
The abstract declarator,
int (^)(char, float)
describes a reference to a Block that, when invoked, takes two parameters, the first of type char and the second of type float, and returns a value of type int. The Block referenced is of opaque data that may reside in automatic (stack) memory, global memory, or heap memory.
The relevant bit is “describes a reference to a Block”.
The specification isn't entirely consistent, since (for example) it refers to a “variable with Block type” and “Block variable declarations“. But in fact a variable always holds a reference (pointer) to a block, and never holds a block directly as its value.
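A small example of that point, assuming the Thunk typedef from the question:

typedef void (^Thunk)(void);

Thunk block1 = ^{ NSLog(@"hello"); };   // block1 holds a reference (a pointer) to the block
id asObject = block1;                    // a block is an Objective-C object, so this assignment is legal
block1();                                // invoke through the reference
NSLog(@"%@", asObject);                  // logs the block's description, like any other object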
If you look at the block implementation, you will see that blocks are ObjC objects.
If you inspect a stack block (like in your example), you'll see something like this:
(lldb) po myBlock
<__NSStackBlock__: 0xbfffc940>
And you will notice in the public header _Block_copy() and _Block_release() take void* arguments.
The non-pointeryness is just syntactic sugar the compiler provides to shield you from the dirty bits of blocks.
Blocks are a C extension. You can use blocks in C (where the extension is supported).
Apple just happens to have implemented blocks using ObjC types.
Implementation details: Blocks are one of the few ObjC types which may be allocated on the stack using clang. Normally, this is not an option because almost every API expects that an objc_object is a heap allocation which supports reference counting.
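A minimal illustration of the stack/heap distinction (the printed class names are private implementation details, and under ARC the compiler may insert the copy for you):

int captured = 42;
void (^literal)(void) = ^{ NSLog(@"%d", captured); };   // capturing a local lets the literal start out on the stack
void (^heapCopy)(void) = [literal copy];                // -copy (Block_copy) moves a stack block to the heap
NSLog(@"%@ -> %@", [literal class], [heapCopy class]);  // e.g. __NSStackBlock__ -> __NSMallocBlock__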

ObjC protocols potentially useless

In ObjC we can use protocols to restrict an id behavior, so we can declare something like
-(void)aMethod:(id<aProtocol>)aVar
which works very well as long as we provide a value or a non-id variable as aVar, but this gets completely broken since we can pass a generic id variable declared without protocol specifiers... Is this normal? Is there any workaround? Am I missing something?
Just use id less, and declare variables and parameters using the correct types, where possible. That is to say: don't pass ids around. If you are implementing a collections class (for example), then id's often useful.
My approach is to specify types, and introduce that type as local as possible in the source. So I omit id and add the type, and when (for instance) I take a reference from a collection, I create a variable:
MONType<MONProtocol>* thing = [array objectAtIndex:idx];
// now thing is correctly typed. use thing.
Similarly, if I have an id parameter, I declare a new variable:
- (IBAction)someAction:(id)sender
{
    NSButton *button = sender;
    // now use button, not sender
}
Protocols are extremely useful. Very often, better/cleaner than subclassing.
You're missing the understanding that types in Objective-C are determined at runtime, not compile time. Just because you say that an object will be of type id<aProtocol> does not mean that at runtime it is guaranteed to be so.
The idea of specifying something as id<aProtocol> is to aid you as a developer and people using your code. It aids you as a developer because the compiler will warn (or error under ARC) if you attempt to call a method on something that the compiler can determine it doesn't think exists on instances of its supposed type (excluding forwarding which could mean an instance responds to something the compiler cannot determine). It aids people using your code as it tells them the contract that they should adhere to when interfacing with your code.
So, in your question you say that:
but this gets completely broken if we pass a generic id variable declared without protocol specifiers
Well, the compiler would warn and tell you that you're trying to pass something that does not conform to that protocol, except for the case of passing id. That's why you generally should try to type things more precisely than just id.
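If you additionally want a runtime guarantee (for the plain-id case the compiler cannot check), you can assert conformance yourself; a minimal sketch:

- (void)aMethod:(id<aProtocol>)aVar
{
    NSAssert([aVar conformsToProtocol:@protocol(aProtocol)],
             @"aVar must conform to aProtocol");
    // safe to use the aProtocol methods on aVar here
}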
If you have a method defined like so:
- (void)aMethod:(id<aProtocol>)aVar
Then aVar could be of type SomeSubclass where that is defined like so:
@interface SomeSubclass : NSObject <aProtocol>
And you could then use aMethod like this:
SomeSubclass *obj = [SomeSubclass new];
[other aMethod:obj];
I (FINALLY) found out that using Objective-C++ is the way to go. Let's suppose I want to be able to pass NSString or NSNumber (instead of an overly generic id, and instead of using protocols, which become useless when passing id values): well, I can create a C++ class having two distinct constructors, one for each ObjC class, so that id values can no longer be passed (at least not directly). For example, let's take a look at
class NSStringOrNSNumber{
public:
NSStringOrNSNumber(NSString *);
NSStringOrNSNumber(NSNumber *);
};
The great advantage is that methods/functions taking an NSStringOrNSNumber parameter can get NSString/NSNumber values DIRECTLY, since the constructor acts as an implicit cast. In other words, if we have
void aFunction(NSStringOrNSNumber param);
the following calls are perfectly valid:
aFunction(@"Hello!");
aFunction(@25);
The only (little) downside is that we need the class to implement a function if we want to get back the value passed to the constructor.
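For instance, a minimal Objective-C++ sketch of such a wrapper with a getter (the member and method names are hypothetical):

class NSStringOrNSNumber {
public:
    NSStringOrNSNumber(NSString *s) : value_(s) {}
    NSStringOrNSNumber(NSNumber *n) : value_(n) {}
    id value() const { return value_; }   // hand back whatever was passed to the constructor
private:
    id value_;                            // holds either the NSString or the NSNumber
};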
Using a C++ class constructor to get something like id<NSCoding> is still better than using id<NSCoding> directly: in fact, if we do the following
@class classOne, classTwo;
class NSCodingClass {
private:
    NSCodingClass(classOne *);
    NSCodingClass(classTwo *);
public:
    NSCodingClass(id<NSCoding>);
};
we won't be able to pass a generic id as a parameter (since it would be ambiguous: the compiler cannot know which constructor to call among the two private ones)

using objc_msgSend to call a Objective C function with named arguments

I want to add scripting support to an Objective-C project using the objc runtime. Now I face the problem that I don't have a clue how I should call an Objective-C method which takes several named arguments.
So for example the following objective-c call
[object foo:bar];
could be called from C with:
objc_msgSend(object, sel_getUid("foo:"), bar);
But how would I do something similar for the method call:
[object foo:var bar:var2 err:errVar];
??
Best Markus
The accepted answer is close, but it won't work properly for certain types. For example, if the method is declared to take a float as its second argument, this won't work.
To properly use objc_msgSend, you have to cast it to the appropriate function pointer type. For example, if your method is declared as
- (void)foo:(id)foo bar:(float)bar err:(NSError **)err
then you would need to do something like this:
void (*objc_msgSendTyped)(id self, SEL _cmd, id foo, float bar, NSError **error) = (void *)objc_msgSend;
objc_msgSendTyped(self, @selector(foo:bar:err:), foo, bar, error);
Try the above case with just objc_msgSend, and log out the received arguments. You won't see the correct values in the called function. This unusual casting situation arises because objc_msgSend is not intended to be called like a normal C function. It is (and must be) implemented in assembly, and just jumps to a target C function after fiddling with a few registers. In particular, there is no consistent way to refer to any argument past the first two from within objc_msgSend.
Another case where just calling objc_msgSend straight wouldn't work is a method that returns an NSRect, say, because objc_msgSend is not used in that case, objc_msgSend_stret is. In the underlying C function for a method that returns an NSRect, the first argument is actually a pointer to an out value NSRect, and the function itself actually returns void. You must match this convention when calling because it's what the called method will assume. Further, the circumstances in which objc_msgSend_stret is used differ between architectures. There is also an objc_msgSend_fpret, which should be used for methods that return certain floating point types on certain architectures.
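For instance, the struct-return case can be handled with the same casting trick (a sketch; the method -frameForItem: and the target/item objects are hypothetical, and this only applies on architectures where NSRect goes through objc_msgSend_stret):

#import <Foundation/Foundation.h>
#import <objc/message.h>

// Hypothetical: 'target' is assumed to respond to -frameForItem: returning an NSRect.
static NSRect frameForItem(id target, id item)
{
    // Cast objc_msgSend_stret to the method's real signature so the compiler
    // sets up the hidden struct-return convention for us.
    NSRect (*sendStret)(id, SEL, id) = (NSRect (*)(id, SEL, id))objc_msgSend_stret;
    return sendStret(target, @selector(frameForItem:), item);
}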
Now, since you're trying to do a scripting bridge thing, you probably cannot explicitly cast every case you run across, you want a general solution. All in all, this is not completely trivial, and unfortunately your code has to be specialized to each architecture you wish to target (e.g. i386, x86_64, ppc). Your best bet is probably to see how PyObjC does it. You'll also want to take a look at libffi. It's probably a good idea to understand a little bit more about how parameters are passed in C, which you can read about in the Mac OS X ABI Guide. Last, Greg Parker, who works on the objc runtime, has written a bunch of very nice posts on objc internals.
objc_msgSend(object, sel_getUid("foo:bar:err:"), var, var2, errVar);
If one of the variables is a float, you need to use @Ken's method, or cheat with a reinterpret-cast:
objc_msgSend(..., *(int*)&var, ...)
Also, if the selector returns a float, you may need to use objc_msgSend_fpret, and if it returns a struct you must use objc_msgSend_stret. If it is a call to the superclass, you need to use objc_msgSendSuper2.
objc_msgSend(obj, @selector(foo:bar:err:), var, var2, &errVar);