If I am debugging a Mach-O binary using lldb, what data structures in memory can I examine to determine if any methods have been swizzled? Any steps I can follow?
Also, is there a way to determine programmatically if any methods have been swizzled?
Since you mention lldb, you can set symbolic breakpoints on the runtime entry points used for swizzling:
b method_exchangeImplementations
b method_setImplementation
b class_replaceMethod
When you hit a breakpoint for:
method_exchangeImplementations(Method _Nonnull m1, Method _Nonnull m2)
you can inspect the selector names of the m1 and m2 arguments like this:
po (SEL)method_getName($arg1)
po (SEL)method_getName($arg2)
For method_setImplementation(Method _Nonnull m, IMP _Nonnull imp):
po (SEL)method_getName($arg1)
For class_replaceMethod(Class cls, SEL name, IMP imp, const char *types), the second argument is already a SEL, so there is no method_getName call needed:
po $arg1
po (SEL)$arg2
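Putting the commands together, a hypothetical session might look like this (the selector names and output shown here are made up for illustration):

```
(lldb) b method_exchangeImplementations
(lldb) c
* thread #1: ... stop reason = breakpoint 1.1
(lldb) po (SEL)method_getName($arg1)
"viewDidLoad"
(lldb) po (SEL)method_getName($arg2)
"xxx_viewDidLoad"
(lldb) bt
```

The backtrace (bt) then tells you which code is performing the swizzle.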
Those Method values will likely have been obtained through previous calls to:
class_getInstanceMethod(Class _Nullable cls, SEL _Nonnull name)
class_getClassMethod(Class _Nullable cls, SEL _Nonnull name)
so after:
b class_getInstanceMethod
b class_getClassMethod
and hitting the respective breakpoints, inspect the class:
po $arg1
and the selector (again, $arg2 here is already a SEL):
po (SEL)$arg2
The best place to set up those symbolic breakpoints would be here:
__attribute__((constructor))
static void premain(void) {
    int i = 0;
    i++; // put an Xcode breakpoint here; when it hits, set up your lldb symbolic breakpoints
}
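For the second question (programmatic detection): the runtime sets no flag when a method is swizzled, but a common heuristic is to ask dladdr() which image an IMP points into; a method of a system class whose IMP resolves into your app binary, or into an unexpected dylib, is a strong hint that it was replaced. A sketch (the function name checkForSwizzling is mine, not a runtime API):

```
#import <objc/runtime.h>
#import <dlfcn.h>
#import <stdio.h>
#import <stdlib.h>

static void checkForSwizzling(Class cls) {
    unsigned int count = 0;
    Method *methods = class_copyMethodList(cls, &count);
    for (unsigned int i = 0; i < count; i++) {
        IMP imp = method_getImplementation(methods[i]);
        Dl_info info;
        if (dladdr((const void *)imp, &info) && info.dli_fname) {
            // For a system class, an IMP living in the main app binary
            // or an injected dylib suggests the method was replaced.
            printf("%s -> %s\n",
                   sel_getName(method_getName(methods[i])),
                   info.dli_fname);
        }
    }
    free(methods);
}
```

This only inspects the class's own method list; categories and subclass overrides will also show up, so treat the output as a lead to investigate, not proof of swizzling.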
Does the Objective-C compiler actually warn you if it thinks that you may be passing nil to a parameter marked with _Nonnull?
Or is it merely a hint to whatever tool converts between Swift and Objective-C to better deal with Swift optionals?
On its own, the compiler warns only in the most trivial case: when you pass a nil literal to a parameter marked _Nonnull.
NSObject * _Nonnull test(NSObject * _Nonnull obj) {
    test(nil); // warning
    NSObject * _Nullable f = nil;
    test(f); // no warning (passing _Nullable to _Nonnull); only caught by static analysis
    NSObject * _Nonnull g = [[NSObject alloc] init];
    g = nil; // no warning (!) (assigning nil to _Nonnull)
    if (g != nil) { // no warning (unnecessary '!= nil' comparison)
        test(g);
    }
    return nil; // no warning (returning nil from a _Nonnull function)
}
(There is a -Wnullable-to-nonnull-conversion flag, but it doesn't seem to have any effect on the code above.)
As documented, the three attributes will not change the behavior of the code:
… Note that, unlike the declaration attribute nonnull, the presence of _Nonnull does not imply that passing null is undefined behavior: fetch is free to consider null undefined behavior or (perhaps for backward-compatibility reasons) defensively handle null.
Beyond the compiler, the annotations also help the static analyzer, but again clang's static analyzer will only catch the trivial case where it knows for sure you are passing nil to a _Nonnull parameter (i.e. the test(f) example above).
Nevertheless, it is still useful to mark a pointer _Nonnull/_Nullable, as it:
1. serves as documentation;
2. allows Swift developers to use your library better;
3. keeps the compiler quiet, since it will emit warnings everywhere if you don't add those annotations 😏
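Those "warnings everywhere" come from nullability-audited regions: once a header opts in, every unannotated pointer is flagged (-Wnullability-completeness). A minimal sketch (MyService is a made-up class):

```
NS_ASSUME_NONNULL_BEGIN

@interface MyService : NSObject
// Inside the audited region, object pointers default to _Nonnull...
- (NSString *)titleForKey:(NSString *)key;
// ...and exceptions must be spelled out explicitly.
- (nullable NSString *)cachedTitleForKey:(NSString *)key;
@end

NS_ASSUME_NONNULL_END
```

A pointer left unannotated between those macros, outside the defaulting rules, is what triggers the warning.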
I'm developing an app that needs to perform some swizzling.
I'm swizzling a method -(void)m1:(CMAcceleration)a; with another one that I provide.
-(void)newM(id self, SEL _cmd, ...){
    va_list args;
    va_start(args, _cmd);
    //...
    NSInteger returnValue = ((NSInteger (*)(id, SEL, ...))origImp)(self, _cmd, args);
    va_end(args);
}
To swizzle it I use:
origImp=method_setImplementation(method, newImp);
I then call it normally like [ClassInstance m1:a];
The thing is, args seems to be filled with garbage when I expected a structure like {name=type...} as described here.
I need to pass the arguments to the original implementation after doing some operation like NSLog.
Searching the Internet suggests this is a Simulator-related problem, but I'm not sure, and I have no access to a device to confirm.
Am I doing something wrong or is there a way to fix this?
You are doing it very wrong.
The method signature should match, i.e. -(void)newM:(CMAcceleration)a;
and
Method method = class_getInstanceMethod([SomeClass class], @selector(newM:));
IMP newImp = method_getImplementation(method);
origImp = method_setImplementation(method, newImp);
A different way is to make a C function:
void newM(id self, SEL _cmd, CMAcceleration a) {
}
origImp = method_setImplementation(method, (IMP)newM);
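A fuller sketch of the C-function approach, including the typed cast needed to call through to the original implementation (the class name SomeClass is illustrative):

```
#import <CoreMotion/CoreMotion.h>
#import <objc/runtime.h>

static IMP origImp;

// The replacement has exactly the same argument list as the original
// -(void)m1:(CMAcceleration)a; — no varargs are involved.
static void newM(id self, SEL _cmd, CMAcceleration a) {
    NSLog(@"x = %f", a.x);
    // Cast the stored IMP back to the real signature before calling it.
    ((void (*)(id, SEL, CMAcceleration))origImp)(self, _cmd, a);
}

// Somewhere during setup:
// Method method = class_getInstanceMethod([SomeClass class], @selector(m1:));
// origImp = method_setImplementation(method, (IMP)newM);
```

Because the argument types are known at compile time, there is no va_list involved and the struct argument is passed through correctly.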
In Objective-C, is a Block an object? Can it be retained/released? Can it be sub-classed?
Yes, it's an object type.
However it's mostly handled by the compiler, so you can't subclass it and you have very little control over memory management. You typically can only copy it to the heap, in case it outlives the scope it's declared in.
Concretely, at runtime a block is an instance of either __NSMallocBlock__, __NSStackBlock__ or __NSGlobalBlock__. You can read more on the subject here: http://www.cocoawithlove.com/2009/10/how-blocks-are-implemented-and.html
In addition to Gabriele's answer, which is correct, you can see it by sending a class message to a block:
#import <objc/message.h>
int main( void )
{
    void ( ^ block )( void ) = ^( void )
    {};

    NSLog( @"%@", ( ( id ( * )( id, SEL ) )objc_msgSend )( ( id )block, @selector( class ) ) );

    return 0;
}
Sending a message to a non-object type would eventually crash, but with a block you'll get __NSGlobalBlock__ printed, meaning a block is a real (even if obviously special) instance of an Objective-C class.
Let's look at this, for example:
- (void)doSomeStuffWithABlock
{
    void (^myBlock)(NSUInteger, NSString *, NSString *) = ^(NSUInteger index, NSString *string1, NSString *string2) {
        // Do stuff
    };

    myBlock(1, @"two", @"three"); // Set breakpoint here to inspect myBlock.
}
In the debugger:
(lldb) po myBlock
<__NSMallocBlock__: 0x17d21f20>
(lldb) p * myBlock
(__block_literal_generic) $1 = 0x001cc801 MyApp`__68-[MyObject myMethod]_block_invoke + 1 at MYMyObject.m:218
(lldb) p (BOOL)[myBlock isKindOfClass:[NSObject class]]
(BOOL) $5 = YES
You can send it retain / release if you're not using ARC. You can also ask for its retainCount. However, Block.h declares these macros that you should use instead:
#define Block_copy(...) ((__typeof(__VA_ARGS__))_Block_copy((const void *)(__VA_ARGS__)))
#define Block_release(...) _Block_release((const void *)(__VA_ARGS__))
I don't think you can subclass it, since no public headers expose __NSMallocBlock__ or related block types (see comments) (that I could find), and I suspect down that road lies nothing but the most excruciating pain, since you'll be trying to trick the compiler. I'm not sure why you'd even want to do that. If you need to add functionality, consider creating your own object that wraps a @property holding the actual block to execute. Or consider using NSBlockOperation.
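As a quick illustration of that last suggestion, NSBlockOperation already gives you an object wrapper around one or more blocks:

```
NSBlockOperation *op = [NSBlockOperation blockOperationWithBlock:^{
    NSLog(@"running");
}];
[op addExecutionBlock:^{
    NSLog(@"also running");
}];
[[NSOperationQueue mainQueue] addOperation:op];
```

Since the operation is a plain NSObject subclass, you get KVO-able state, cancellation and queueing for free instead of fighting the block runtime.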
I am trying to write a unit test for a method which itself creates and passes a block to another object (so it can be called later). This method uses socket.io-objc to send a request to a server: it passes a callback block to socketio's sendEvent:withData:andAcknowledge: that will be invoked when it receives a response from the server. Here is the method I want to test:
typedef void(^MyResponseCallback)(NSDictionary * response);
...
-(void) sendCommand:(NSDictionary*)dict withCallback:(MyResponseCallback)callback andTimeoutAfter:(NSTimeInterval)delay
{
__block BOOL didTimeout = FALSE;
void (^timeoutBlock)() = nil;
// create the block to invoke if request times out
if (delay > 0)
{
timeoutBlock = ^
{
// set the flag so if we do get a response we can suppress it
didTimeout = TRUE;
// invoke original callback with no response argument (to indicate timeout)
callback(nil);
};
}
// create a callback/ack to be invoked when we get a response
SocketIOCallback cb = ^(id argsData)
{
// if the callback was invoked after the UI timed out, ignore the response. otherwise, invoke
// original callback
if (!didTimeout)
{
if (timeoutBlock != nil)
{
// cancel the timeout timer
[NSObject cancelPreviousPerformRequestsWithTarget:self selector:@selector(onRequestTimeout:) object:timeoutBlock];
}
// invoke original callback
NSDictionary * responseDict = argsData;
callback(responseDict);
}
};
// send the event to the server
[_socketIO sendEvent:@"message" withData:dict andAcknowledge:cb];
if (timeoutBlock != nil)
{
// if a timeout value was specified, set up timeout now
[self performSelector:@selector(onRequestTimeout:) withObject:timeoutBlock afterDelay:delay];
}
}
-(void) onRequestTimeout:(id)arg
{
if (nil != arg)
{
// arg is a block, execute it
void (^callback)() = (void (^)())arg;
callback();
}
}
This all appears to be working fine when running for real. My problem comes in when I run my unit test (which uses SenTestingKit and OCMock):
-(void)testSendRequestWithNoTimeout
{
NSDictionary * cmd = [[NSDictionary alloc] initWithObjectsAndKeys:
                      @"TheCommand", @"msg", nil];
__block BOOL callbackInvoked = FALSE;
__block SocketIOCallback socketIoCallback = nil;
MyResponseCallback requestCallback = ^(NSDictionary * response)
{
STAssertNotNil(response, @"Response dictionary is invalid");
callbackInvoked = TRUE;
};
// expect controller to emit the message
[[[_mockSocket expect] andDo:^(NSInvocation * invocation) {
SocketIOCallback temp;
[invocation getArgument:&temp atIndex:4];
// THIS ISN'T WORKING AS I'D EXPECT
socketIoCallback = [temp copy];
STAssertNotNil(socketIoCallback, @"No callback was passed to socket.io sendEvent method");
}] sendEvent:@"message" withData:cmd andAcknowledge:OCMOCK_ANY];
// send command to dio
[_ioController sendCommand:cmd withCallback:requestCallback andTimeoutAfter:0];
// make sure callback not invoked yet
STAssertFalse(callbackInvoked, @"Response callback was invoked before receiving a response");
// fake a response coming back
socketIoCallback([[NSDictionary alloc] initWithObjectsAndKeys:@"msg", @"response", nil]);
// make sure the original callback was invoked as a result
STAssertTrue(callbackInvoked, @"Original requester did not get callback when msg recvd");
}
To simulate a response, I need to capture (and retain) the block that is created by the method I'm testing and passed to sendEvent. By passing an 'andDo' argument to my expectation, I am able to access the block I'm looking for. However, it seems like it is not being retained. So, when sendEvent unwinds and I go to invoke the callback, all the values that should have been captured in the block show up as null. The result is that the test crashes when I invoke socketIoCallback and it goes to access the 'callback' value that was originally captured as part of the block (and is now nil).
I am using ARC, so I expect that "__block SocketIOCallback socketIoCallback" will retain values. I've tried to -copy the block into this variable, but it still does not seem to survive past the end of sendCommand. What can I do to force this block to be retained long enough for me to simulate a response?
Note: I've tried calling [invocation retainArguments] which works but then crashes somewhere in objc_release when cleaning up after the test is complete.
I was finally able to reproduce the problem and I suspect that the error is in this code section:
SocketIOCallback temp;
[invocation getArgument:&temp atIndex:4];
Extracting the block this way does not work correctly. I'm not exactly sure why, but it may have to do with some of the magic ARC does in the background. If you change the code to the following, it should work as you'd expect:
void *pointer;
[invocation getArgument:&pointer atIndex:4];
SocketIOCallback temp = (__bridge SocketIOCallback)pointer;
@aLevelOfIndirection is correct in that it is due to the call to getArgument:atIndex:.
To understand why, remember that SocketIOCallback is an object pointer type (it is a typedef for a block pointer type), which is implicitly __strong in ARC. So &temp is type SocketIOCallback __strong *, i.e. it is a "pointer to strong". When you pass a "pointer to strong" to a function, the contract is that if the function replaces the value pointed to by the pointer (which is __strong), it must 1) release the existing value, and 2) retain the new value.
However, NSInvocation's getArgument:atIndex: does not know anything about the type of the thing pointed to. It takes a void * parameter and simply copies the desired value bitwise into the location pointed to by the pointer. So in simple terms, it does a pre-ARC, non-retained assignment into temp; it does not retain the value assigned.
However, since temp is a strong reference, ARC will assume it was retained, and release it at the end of its scope. This is an over-release, and thus will crash.
Ideally, the compiler should disallow conversions between "pointer to strong" and void *, so this doesn't accidentally happen.
The solution is to not pass a "pointer to strong" to getArgument:atIndex:. You can pass either a void * as aLevelOfIndirection showed:
void *pointer;
[invocation getArgument:&pointer atIndex:4];
SocketIOCallback temp = (__bridge SocketIOCallback)pointer;
or a "pointer to unretained":
SocketIOCallback __unsafe_unretained pointer;
[invocation getArgument:&pointer atIndex:4];
SocketIOCallback temp = pointer;
and then assign back into a strong reference afterwards. (or in the latter case, you could use the unretained reference directly if you are careful)
I'm trying to swizzle a function and call the original implementation with the function args.
the new IMP is of the form:
static id WrapperFunction(id self, SEL _cmd, ...) {
    va_list args;
    va_start(args, _cmd);
    originalImp(self, _cmd, args);
    ...
}
This is clearly wrong since args now contains _cmd while ... did not.
How can I pass ... to originalImp?
GCC has __builtin_apply for this: http://gcc.gnu.org/onlinedocs/gcc/Constructing-Calls.html
Clang has nothing equivalent, so you must resort to assembly (basically, if you know the address of originalImp, you just want to jmp to it).