My problem is that touchesEnded (on iOS) or mouseUp (on macOS) is still delivered to released objects when the touch/click began before the object was released. This behavior is causing crashes in my project. I made a stripped-down program with SceneKit and a SpriteKit overlay, but I am not sure what I am doing wrong.
I've started with the SceneKit game template (iOS or macOS) and I've added two classes:
In GameViewController.m at the end of -(void)awakeFromNib I added this:
OverlayScene *overlay=[OverlayScene sceneWithSize:self.gameView.frame.size];// init the overlay
[overlay setupButtons];// create a small GUI menu
[self.gameView setOverlaySKScene:overlay];// add the overlay to the SceneKit view
Then the two classes
OverlayScene.h:
#import <SpriteKit/SpriteKit.h>
#import "Button.h"
@interface OverlayScene : SKScene
-(void)setupButtons;
@end
OverlayScene.m:
#import "OverlayScene.h"
@implementation OverlayScene
-(void)setupButtons{//showing a menu mockup
NSLog(@"setupButtons");
srand((unsigned)time(NULL));// seed the RNG
[self removeAllChildren];//remove all GUI elements (just one button, or nothing on the first call)
Button *b=[[Button alloc] initWithSelector:@selector(setupButtons) onObject:self];// a new button
b.position=CGPointMake(rand()%300, rand()%300);// just a random position so we can see the change
[self addChild:b];// finally attach the button to the scene
}
@end
Button.h:
#import <SpriteKit/SpriteKit.h>
@interface Button : SKSpriteNode{
SEL selector;//the method which is supposed to be called by the button
__weak id selectorObj;//the object on which the selector is called
}
-(id)initWithSelector:(SEL)s onObject:(__weak id)o;
@end
Button.m:
#import "Button.h"
@implementation Button
-(id)initWithSelector:(SEL)s onObject:(__weak id)o{
//self=[super initWithColor:[UIColor purpleColor] size:CGSizeMake(200, 60)];//this is the code for iOS
self=[super initWithColor:[NSColor purpleColor] size:CGSizeMake(200, 60)];// this is for macOS
if(self){
selector=s;//keep the selector with the button
selectorObj=o;
[self setUserInteractionEnabled:YES];//enable touch/click handling
}
return self;
}
//-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event{//called on iOS
-(void)mouseDown:(NSEvent *)event{//called on macOS
[selectorObj performSelector:selector];//we are calling the method which was given during the button initialization
}
@end
When you click on the button it gets removed (and deallocated by ARC) as expected and a new one is created. Everything is fine as long as you keep holding the mouse button down. When you release it, the app crashes because that mouseUp event is sent to the removed button.
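A minimal sketch of one way to sidestep this (not part of the original template code, just an assumption about what avoids the dangling delivery): trigger the rebuild from mouseUp:/touchesEnded: instead of mouseDown:, so the button is still alive for the entire click sequence:
// Sketch only - would replace the mouseDown:/touchesBegan: handler in Button.m
//-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event{// iOS variant
-(void)mouseUp:(NSEvent *)event{// macOS variant
[selectorObj performSelector:selector];// the button still exists while the event sequence finishes
}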
I'm going through the book "Cocoa Programming for Mac OS X" and I just started with delegates. This whole delegates thing is still a little wacky to me, but I think I just need to let it settle.
However, there was one exercise where I had to implement a delegate for the main window so that when it is resized, its height is always twice its width.
So I got 4 files:
AppDelegate.h
AppDelegate.m
WindowDelegate.h
WindowDelegate.m
The AppDelegate files are just the two standard ones that get created when you start a new Cocoa project. I had to look up the solution because I didn't quite know how to accomplish this task.
The solution was just to create a new Cocoa class, WindowDelegate.h/.m, and add this to its implementation file:
- (NSSize)windowWillResize:(NSWindow *)sender toSize:(NSSize)frameSize {
NSSize newSize = frameSize;
newSize.height = newSize.width * 2;
return newSize;
}
Then I opened Interface Builder, added a new object and set its class to WindowDelegate. I then connected the window's delegate outlet to that object, making it the window's delegate.
Clicked run and it worked. Yay! But why?
First I thought that "windowWillResize" is just one of those callback functions that gets called as soon as the window is resized, but it isn't. Normally methods get invoked because the general lifecycle of a program invokes them, or because they are an IBAction hooked up to a button or other control element.
But "windowWillResize" is none of them. So why is it called?
EDIT: Problem solved! Thanks a lot!
Now I'm trying to connect the delegate to the window programmatically, so I deleted the referencing outlet between the WindowDelegate object and the actual window in Interface Builder. It works, but I just want to verify that this is the correct way to do it:
AppDelegate.h
#import <Cocoa/Cocoa.h>
#import "WindowDelegate.h"
@interface AppDelegate : NSObject <NSApplicationDelegate>
@end
AppDelegate.m
#import "AppDelegate.h"
@interface AppDelegate ()
@property (weak) IBOutlet NSWindow *window;
@property (strong) WindowDelegate *winDeleg;
@end
@implementation AppDelegate
- (void)applicationDidFinishLaunching:(NSNotification *)aNotification {
// Insert code here to initialize your application
}
- (void)applicationWillTerminate:(NSNotification *)aNotification {
// Insert code here to tear down your application
}
- (void)awakeFromNib {
[_window setOpaque:NO];
NSColor *transparentColor = [NSColor colorWithDeviceRed:0.0 green:0.0 blue:0.0 alpha:0.5];
[_window setBackgroundColor:transparentColor];
NSSize initialSize = NSMakeSize(100, 200);
[_window setContentSize:initialSize];
_winDeleg = [[WindowDelegate alloc] init];
[_window setDelegate: _winDeleg];
}
@end
WindowDelegate.h
#import <Foundation/Foundation.h>
#import <Cocoa/Cocoa.h>
@interface WindowDelegate : NSObject <NSWindowDelegate>
@end
WindowDelegate.m
#import "WindowDelegate.h"
@implementation WindowDelegate
- (NSSize)windowWillResize:(NSWindow *)sender toSize:(NSSize)frameSize {
NSSize newSize = frameSize;
newSize.height = newSize.width * 2;
return newSize;
}
- (id)init {
self = [super init];
return self;
}
@end
Why does the @property of WindowDelegate need to be strong?
And isn't my winDeleg an object? Why do I have to access it through _winDeleg when it's an object? I thought the underscore is used to access instance variables?
Thank you for your help!
Clicked run and it worked. Yay! But why?
Because instances of NSWindow have a delegate property that can point to any object that implements the NSWindowDelegate protocol, and that protocol includes the -windowWillResize:toSize: method.
Read that a few times. The reason it's important is that you can create your own object, say that it implements NSWindowDelegate, implement -windowWillResize:toSize:, and set that object as your window's delegate. Then, whenever the user resizes the window, your method will be called and can modify the proposed new size.
Normally methods get invoked because the general lifecycle of a program invokes them, or because they are an IBAction hooked up to a button or other control element. But "windowWillResize" is none of them. So why is it called?
This really isn't so different. Think of delegates as "helper objects." They let you customize the behavior of an object without having to create a whole new subclass. The NSWindowDelegate protocol is essentially a contract that the NSWindow promises to follow: whenever certain things happen, such as the user resizing the window, the window will call certain methods on its delegate object, if the delegate exists and implements those methods. In the case of NSApplication, a lot of those delegate methods are application lifecycle events, like the app starting up or quitting or getting a message from the operating system. In the case of NSWindow, delegate methods correspond to interesting events that can happen to a window, like the user moving it, hiding it, showing it, maximizing it, moving it to a different screen, etc. Other classes, like text views or network connections or movie players, have their own sets of interesting events and their own delegate protocols to match.
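A rough sketch of that pattern (illustrative only, not AppKit's actual source; the helper name is made up):
#import <Cocoa/Cocoa.h>
// Hypothetical helper showing the "call the delegate only if it responds" pattern
// that NSWindow effectively follows while the user resizes the window.
static NSSize ConstrainedResizeSize(NSWindow *window, NSSize proposedSize) {
    id<NSWindowDelegate> delegate = [window delegate];
    if ([delegate respondsToSelector:@selector(windowWillResize:toSize:)]) {
        // the delegate may adjust the proposed size (e.g. force height = 2 * width)
        proposedSize = [delegate windowWillResize:window toSize:proposedSize];
    }
    return proposedSize;
}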
Note that methods marked IBAction really aren't delegate methods, they're just methods that get called by objects like controls that use a target/action paradigm. The IBAction keyword lets the IDE know which methods it should present as possible actions for things like buttons. You often find actions in window controllers and view controllers, and those objects frequently act as a delegate for some other object, but the actions themselves aren't part of the delegate protocol. For example, NSTableView takes a delegate object that determines how the table will act and what's displayed in it. It often makes sense for the view controller that manages the table to be the table's delegate, and that same view controller might also manage some buttons and contain the action methods that said buttons trigger, but the actions aren't part of the NSTableViewDelegate protocol and you therefore wouldn't call them delegate methods.
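A small illustration of that separation (the class and method names here are made up): the same controller can implement an NSTableViewDelegate method and, separately, an action that a button targets:
#import <Cocoa/Cocoa.h>

@interface PeopleViewController : NSViewController <NSTableViewDelegate>
@property (weak) IBOutlet NSTableView *tableView;
@end

@implementation PeopleViewController
// Delegate method: the table calls this because this controller is set as its delegate.
- (void)tableViewSelectionDidChange:(NSNotification *)notification {
    NSLog(@"selection changed to row %ld", (long)[self.tableView selectedRow]);
}
// Action method: a button calls this because this controller is its target.
// It lives in the same class, but it is not part of NSTableViewDelegate.
- (IBAction)addPerson:(id)sender {
    NSLog(@"add button pressed");
}
@end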
I am having an issue with detected phone numbers on iOS 7 on iPhone: when the user long-presses a number detected by the OS, it presents a UIActionSheet with the options "Call", "Send Message", "Add to Contacts", "Copy" and "Cancel". The problem I am facing is that when "Send Message" or "Add to Contacts" is tapped, the OS creates a modal view on top of my current modal view, which leads to the navigation bar of the second modal view not being displayed correctly.
With that in mind, I am not able to tell which button the user tapped, because it is not me who creates the UIActionSheet (iOS does it itself), so I cannot receive any delegate callbacks. The only message sent to the UIViewController is:
-(BOOL)textView:(UITextView *)textView shouldInteractWithURL:(NSURL *)URL inRange:(NSRange)characterRange
This tells me what kind of data was tapped (a single tap, not a long press) by the user. I also tried the methods:
-(void)viewWillDisappear:(BOOL)animated
-(void)viewDidDisappear:(BOOL)animated
Unfortunately, they are never invoked on iOS 7, whereas they are on iOS 8, which drives me to the conclusion that, so far, this issue is iOS 7 only. I am using an iPhone 4 with iOS 7.1.2. When I tried the same case on iOS 8, the second modal view renders correctly, being placed on top of my current view.
I hope someone has more info or other ideas.
Thanks!!
The solution that I ended up applying was to completely prevent the long-press gesture on the detected number from showing the action sheet. I did it by going through the gesture recognizers of the UITextView inside the cell, finding the one of class UILongPressGestureRecognizer, and disabling its "friend" gesture. A snippet of the code:
RestrictedTextView.h
#import <UIKit/UIKit.h>
@interface RestrictedTextView : UITextView
@end
RestrictedTextView.m
#import "RestrictedTextView.h"
NSString *const kFriendsStringInGesture = @"friends";// undocumented private key on UIGestureRecognizer
@implementation RestrictedTextView
- (BOOL)gestureRecognizerShouldBegin:(UIGestureRecognizer *)gestureRecognizer
{
if ([[gestureRecognizer class] isEqual:[UILongPressGestureRecognizer class]])
{
UILongPressGestureRecognizer *longPress = (UILongPressGestureRecognizer *)gestureRecognizer;
longPress.enabled = YES;// keep the recognizer itself enabled
if ([longPress valueForKey:kFriendsStringInGesture] != nil)
{
// grab one of the "friend" recognizers via KVC and disable it,
// so the long-press action sheet never appears (this relies on a private key)
UILongPressGestureRecognizer *friendLongPress = (UILongPressGestureRecognizer *)[[longPress valueForKey:kFriendsStringInGesture] anyObject];
friendLongPress.enabled = NO;
}
}
return YES;
}
@end
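Not part of the original answer, but for context: assuming the text view inside the cell is created in code (inside something like tableView:cellForRowAtIndexPath:, with "cell" being the cell under configuration), pointing it at the subclass could look roughly like this, with data detection left on so the number stays tappable:
// Illustrative only - frame and placement are placeholders.
RestrictedTextView *textView = [[RestrictedTextView alloc] initWithFrame:CGRectMake(0, 0, 280, 44)];
textView.editable = NO;// data detectors only work on non-editable text views
textView.dataDetectorTypes = UIDataDetectorTypePhoneNumber;// single taps still go through textView:shouldInteractWithURL:inRange:
[cell.contentView addSubview:textView];// the long-press action sheet is suppressed by the subclass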
I've created a button on one viewController that loads another view modally using the UIModalPresentationFormSheet presentation style. On this loaded view, I have two textFields, and I'm forcing the first textField to become first responder so that the keyboard appears immediately with the new view. I've set up the textFields with an action method that is hooked up to the "Did End on Exit" event. However, whenever I hit "return" on the keyboard for either textField, the keyboard fails to go away (here is my code):
// addCustomPage method that is called when button from original view is touched
- (IBAction) addCustomPage:(id)sender
{
NSLog(#"Adding Custom Page");
if (!self.customPageViewController)
{
self.customPageViewController =
[[CustomPageViewController alloc] initWithNibName:@"CustomPageViewController" bundle: nil];
}
customPageViewController.modalPresentationStyle = UIModalPresentationFormSheet;
[self presentModalViewController:customPageViewController animated:YES];
// force keyboard to appear with loaded page on the first textField
[customPageViewController.firstTextField becomeFirstResponder];
}
@interface CustomPageViewController : UIViewController
@property (strong, nonatomic) IBOutlet UITextField *firstTextField;
@property (strong, nonatomic) IBOutlet UITextField *secondTextField;
- (IBAction)keyboardEndOnExit:(id)sender; // DID END ON EXIT EVENT
@end
//in CustomPageViewController.m
-(IBAction)keyboardEndOnExit:(id)sender
{
[sender resignFirstResponder];
}
This is a fairly straightforward problem, and I normally have no trouble dismissing keyboards using this technique with basic views and textFields. I'm not sure whether using a view with this presentation style or setup makes things different. Thanks!
Have you confirmed that your keyboardEndOnExit method is actually being called?
You could also take a more direct approach by calling [yourTextField resignFirstResponder] when a specific action is taken by the user, such as a key press. I would still check whether that method is ever being called, using breakpoints or a log.
Have a look at this question. Pretty sure it is the same problem caused by UIModalPresentationFormSheet.
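For reference, the usual culprit with UIModalPresentationFormSheet is that form sheets refuse to dismiss the keyboard automatically; assuming that is what is happening here, the commonly suggested fix is to override disablesAutomaticKeyboardDismissal in the presented view controller:
// In CustomPageViewController.m - form sheets keep the keyboard up by default,
// so tell UIKit it may dismiss it when the text field resigns first responder.
- (BOOL)disablesAutomaticKeyboardDismissal
{
    return NO;
}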
I have created a Cocoa application (not document based) and have the default MyAppDelegate class and the MainMenu nib file. I have also created a new nib which contains a window called Splash, and a window controller class (an NSWindowController subclass) called SplashWindowController.
What I would like is that when the application starts, the Splash window opens instead of the window from the MainMenu nib.
I think I have to create an instance of my SplashWindowController in my AppDelegate class, then instantiate the window and bring it to the front. I have tried several things, like including a reference to SplashWindowController.h in my AppDelegate class, and also adding an object to my MainMenu nib and setting its class to SplashWindowController, but have had no luck with either.
If anybody out there could help me with this one it would be much appreciated, as I have been at this (what seems like a simple task) for the best part of a day.
Thanks in advance.
You can simply combine both windows into one .xib file.
ExampleAppDelegate.h
#import <Cocoa/Cocoa.h>
@interface ExampleAppDelegate : NSObject <NSApplicationDelegate> {
IBOutlet id splash;
IBOutlet id window;
}
- (IBAction)closeSplashButton:(id)sender;
- (void)closeSplash;
@end
ExampleAppDelegate.m
#import "ExampleAppDelegate.h"
@implementation ExampleAppDelegate
- (void)applicationDidFinishLaunching:(NSNotification *)aNotification {
[NSTimer scheduledTimerWithTimeInterval:5.0
target:self
selector:@selector(closeSplash)
userInfo:nil
repeats:NO];
}
- (IBAction)closeSplashButton:(id)sender {
[self closeSplash];
}
- (void)closeSplash {
[splash orderOut:self];
[window makeKeyAndOrderFront:self];
[NSApp activateIgnoringOtherApps:YES];
}
@end
MainMenu.xib
Add NSWindow (Title: Splash)
Add NSButton to the Splash window
Connect both IBOutlets to the corresponding windows
Connect the button to the corresponding IBAction
Enable 'Visible at Launch' for the splash window (using the Inspector)
Disable 'Visible at Launch' for the main window (using the Inspector)
Result
At launch only the splash window is visible. The splash window automatically closes after 5 seconds (the timer interval above). The user can close the splash window earlier by pressing the button. The main window shows up after the splash window closes.
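For comparison, the separate-nib approach from the question can also work; a minimal sketch (assuming the nib is named "SplashWindowController" and that SplashWindowController is an NSWindowController subclass, as described) would be:
// In AppDelegate.m - illustrative sketch of showing the splash from its own nib.
#import "SplashWindowController.h"

@interface AppDelegate ()
// keep a strong reference, otherwise the controller (and its window) is deallocated immediately
@property (strong) SplashWindowController *splashController;
@end

@implementation AppDelegate
- (void)applicationDidFinishLaunching:(NSNotification *)aNotification {
    self.splashController = [[SplashWindowController alloc] initWithWindowNibName:@"SplashWindowController"];
    [self.splashController showWindow:self];// loads the nib and puts the Splash window on screen
}
@end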
I think this should be quite simple, but I cannot make it work.
I want to detect mouse clicks on a WebView...
I've subclassed WebView; here is the code:
#import <AppKit/AppKit.h>
#import <Foundation/Foundation.h>
#import <WebKit/WebKit.h>
@interface ResultsWebView : WebView {
}
@end
and
#import "ResultsWebView.h"
@implementation ResultsWebView
- (void)mouseDown:(NSEvent *)theEvent {
NSLog(@"%@", theEvent);
}
@end
In my xib file, I added a WebView and then changed the class to ResultsWebView.
I've checked at runtime and the object is a ResultsWebView, but mouseDown is never called...
What am I missing?
WebUIDelegate comes to the rescue.
Supposing that you have a WebView instance in your NSWindowController:
WebView *aWebView;
You can set your controller as the UIDelegate, as follows:
[aWebView setUIDelegate:self];
By implementing the following method in your controller, you get a form of control over mouse click events:
- (void)webView:(WebView *)sender mouseDidMoveOverElement:(NSDictionary *)elementInformation modifierFlags:(NSUInteger)modifierFlags
{
if ([[NSApp currentEvent] type] == NSLeftMouseUp)
NSLog(@"mouseDown event occurred");
}
As a bonus, playing with the elementInformation dictionary you can get additional information about the DOM element in which the click event occurred.
The problem is that the message isn't being sent to the WebView. Instead it is being sent to the private WebHTMLView inside your WebView, which actually processes the mouse events.
http://trac.webkit.org/browser/trunk/Source/WebKit/mac/WebView/WebHTMLView.mm#L3555
You might just want to look into using JavaScript in your WebView to send the onmousedown events back to your Objective-C class.
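A rough sketch of that route (the bridge name and callback are made up, and this assumes the controller is also the WebView's frame load delegate): expose an Objective-C object to the page through windowScriptObject and have an onmousedown handler call back into it:
// Illustrative only. In the controller that owns the WebView:
- (void)webView:(WebView *)sender didFinishLoadForFrame:(WebFrame *)frame {
    // make "self" visible to JavaScript as window.nativeBridge
    [[sender windowScriptObject] setValue:self forKey:@"nativeBridge"];
    // forward every mouse-down in the page to the bridge
    [[sender windowScriptObject] evaluateWebScript:
        @"document.onmousedown = function(e) { nativeBridge.pageClicked(e.clientX, e.clientY); };"];
}
// WebScripting informal protocol: expose and name the callback for JavaScript
+ (BOOL)isSelectorExcludedFromWebScripting:(SEL)selector {
    return selector != @selector(pageClickedAtX:y:);
}
+ (NSString *)webScriptNameForSelector:(SEL)selector {
    return selector == @selector(pageClickedAtX:y:) ? @"pageClicked" : nil;
}
- (void)pageClickedAtX:(NSNumber *)x y:(NSNumber *)y {
    NSLog(@"mouse down in page at %@, %@", x, y);
}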
Maybe this helps:
[window setAcceptsMouseMovedEvents:YES];