I'm facing a strange behavior with Cocoa NSView on Mac OS X.
I have a custom NSView inside an NSView container; this custom NSView tracks mouse movements and clicks, and has a tooltip.
When I add an NSView above the described view, I can still see the tooltips, even though the view with the tooltip is underneath and not visible.
I'm pretty sure that I misunderstood something in the event handling chain. Any help is really appreciated!
The core issue is that you are not supposed to have overlapping views in Cocoa. Or at least, the behavior then becomes undefined. A view can be a subview of another view, but not simply a sibling within the bounds of the other view.
However, one way to solve your particular problem is to make the view underneath hidden, using the setHidden: method.
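For example, a minimal sketch (assuming the overlapped view is reachable through a pointer such as viewUnderneath, which is not from the original question):
[viewUnderneath setHidden:YES]; // stops drawing, hit testing and tooltips for that view
// ... later, when it should reappear:
[viewUnderneath setHidden:NO];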
If you're not using it anymore, you can call the removeFromSuperview method:
NSView *myView = [[NSView alloc] init];
// do stuff
[myView removeFromSuperview];
If you just don't want it to receive events, you can call the resignFirstResponder method:
NSView *myView = [[NSView alloc] init];
// do stuff
[myView resignFirstResponder];
My app is not document based, and its sole window is managed by a custom, xib-based NSWindowController subclass that I instantiate within the app delegate code:
- (void) applicationDidFinishLaunching:(NSNotification *) aNotification
{
    _mainWindowController = [MainWindowController new];
    // (stored in ivar just to prevent deallocation)
    //[_mainWindowController showWindow:self];
    // ↕︎ Not sure about the difference between these two... both seem to work.
    [[_mainWindowController window] makeKeyAndOrderFront:self];
}
I have subclassed NSClipView to "center content inside a scroll view" (instead of having it pegged to the lower left corner) when the content is zoomed to a size smaller than the clip view, and also to implement custom functionality on mouse drag, etc.
My window does have a title bar.
My window isn't borderless (I think), so I am not subclassing NSWindow.
I have overridden -acceptsFirstResponder, -canBecomeKeyView and -becomeFirstResponder in my NSClipView subclass (all return YES).
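For reference, those overrides amount to something like this in the clip view subclass (just a sketch of what is described above):
- (BOOL) acceptsFirstResponder { return YES; }
- (BOOL) canBecomeKeyView { return YES; }
- (BOOL) becomeFirstResponder { return YES; }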
The drag events do trigger -mouseDown: etc., and if I set a breakpoint there, the first responder at that point is the same as the window hosting my clip view: [self.window firstResponder] and [self window] give the same memory address.
Still, -keyDown: is never called in my subclass. What am I missing?
Update
I put together a minimal project reproducing my setup.
I discovered that if my custom view is the window's main view, -keyDown: is called without problems. But if I place a scroll view and replace its clip view with my custom view (to do that, I need to change the base class from NSView to NSClipView, of course!), -keyDown: is no longer triggered.
I assume it has something to do with how NSScrollView manages events (however, as I said before, -mouseDown:, -mouseDragged: etc. seem to be unaffected).
I also discovered that I can override -keyDown: in my window controller, and that seems to work, so I have decided to do just that (still open to an answer, though). Also, since I'm trying to detect the shift key alone (not as a modifier of another key), I'd rather use:
- (void) flagsChanged:(NSEvent *) event
{
    if ([event modifierFlags] & NSShiftKeyMask) {
        // Shift key is DOWN
    }
    else {
        // Shift key is UP
    }
}
...instead of -keyDown: / -keyUp: (taken from this answer).
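For completeness, the override in the window controller amounts to something like this (a minimal sketch; the log statement is only illustrative):
- (void) keyDown:(NSEvent *) event
{
    NSLog(@"keyDown in window controller: %@", [event charactersIgnoringModifiers]);
}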
I have a UIScrollView with some UIViews in it.
What I am trying to do is catch the touch events when the UIViews are touched/untouched.
The problem I am having is that the UIScrollView seems to swallow all the touch events, especially if you hold too long on a UIView.
I would prefer to keep user interaction disabled on the UIScrollView, as it scrolls automatically.
Is this possible?
I have tried subclassing the UIViews, but the touch events are never called in them.
You can attach a tap gesture to your scroll view with something along these lines:
UITapGestureRecognizer *tapGesture = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tapGestureUpdated:)];
tapGesture.delegate = self;
tapGesture.numberOfTapsRequired = 1;
tapGesture.numberOfTouchesRequired = 1;
[self addGestureRecognizer:tapGesture];
Then, in your - (void)tapGestureUpdated:(UITapGestureRecognizer *)tapGesture method, it is your responsibility to determine the location of the touch and find out whether it landed on one of your subviews. You could then call a method on a delegate to notify it that a specific view has been touched.
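A rough sketch of what that picking could look like (the touchDelegate property and its scrollView:didTapSubview: method are assumptions for illustration, not part of the original answer):
- (void)tapGestureUpdated:(UITapGestureRecognizer *)tapGesture
{
    // Tap location in the scroll view's own coordinate system
    CGPoint location = [tapGesture locationInView:self];
    for (UIView *subview in self.subviews) {
        if (CGRectContainsPoint(subview.frame, location)) {
            // Hypothetical delegate callback reporting which view was tapped
            [self.touchDelegate scrollView:self didTapSubview:subview];
            break;
        }
    }
}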
You could also try reordering your views so that the view with the gesture recognizer attached is the one the app hits first: move it to the top in the document outline, above the scroll view.
In my application window I have two NSViews. On the left the NSView ("Menu") contains a few buttons. When one of the buttons is clicked it should change the contents of the right NSView ("Content").
For each of the views on the right, I have a separate NSViewController that gets loaded and whose view gets added as a subview. When another button is pressed on the left, the previously added subviews on the right should be removed and the new view loaded as a subview.
To accomplish this I load my Menu in AppDelegate with the following:
MenuVC *menuSubView = [[MenuVC alloc] initWithNibName:@"MenuVC" bundle:nil];
menuSubView.contentView = (NSView *)[self contentView];
[[self menuView] addSubview:[menuSubView view]];
This works fine. As you can see, I have an NSView pointer in the menu VC that points to the contentView, so that I can populate it with subviews.
Now as a method for one of the button presses I do the following:
SomeContentVC *subView = [[SomeContentVC alloc] initWithNibName:@"SomeContentVC" bundle:nil];
[self.contentView addSubview:[subView view]];
This does not work.
However, if I add a subview from the awakeFromNib method of the MenuVC implementation (for the default content when the app opens), it works. But when I then try to remove that subview using
[self.contentView setSubviews:[NSArray array]];
it doesn't work. It is also interesting that if I count the subviews (even after having added one in awakeFromNib), self.contentView reports 0 subviews. Why? How can I get this to work properly?
Thanks
The fact that messaging self.contentView achieves nothing except, for some things, returning 0 probably means that self.contentView is nil.
Do you perhaps have two instances of MenuVC by accident? Perhaps one instantiated in a NIB and one instantiated in code?
When in doubt, log everything. Log self in various methods. Log menuSubView just after you create it. Log menuSubView.contentView just after you assign it. Etc. Eventually, you'll probably see that you're interacting with different objects than you thought you were.
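For example, something along these lines (the log statements are only illustrative):
NSLog(@"menuSubView = %@", menuSubView);
NSLog(@"menuSubView.contentView = %@", menuSubView.contentView);
// ... and in the button action:
NSLog(@"self = %@, self.contentView = %@", self, self.contentView);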
In my app, when the user touches the view, I show a UIImageView there, and then I drag and drop an image from another UIImageView onto that touched UIImageView.
The problem is that only the most recently touched UIImageView is activated. I mean, when I tap 3 times, 3 UIImageViews are shown, but only the last one is activated and accepts the other image.
How can I make all touched UIImageViews active? Any help on this is appreciated.
Thanks in advance.
You should read Apple's documentation on the responder chain and event handling. The key UIView method here is:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
This method traverses the view hierarchy by sending the pointInside:withEvent: message to each subview to determine which subview should receive a touch event. If pointInside:withEvent: returns YES, then the subview’s hierarchy is traversed; otherwise, its branch of the view hierarchy is ignored. You rarely need to call this method yourself, but you might override it to hide touch events from subviews.
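For example, a minimal sketch of such an override in a container view subclass (the intent here, letting touches fall through the container itself while its subviews still receive them, is an assumption for illustration, not part of the question):
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    UIView *hit = [super hitTest:point withEvent:event];
    // Ignore touches that land on this view itself, but keep those on its subviews
    return (hit == self) ? nil : hit;
}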
You should try this:
Where you have your UIView (or any view):
UITapGestureRecognizer *tapGesture = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tapped:)];
tapGesture.numberOfTapsRequired = 1; // number of taps
[view addGestureRecognizer:tapGesture];
The selector:
- (void)tapped:(UITapGestureRecognizer *)sender {
    NSLog(@"Pressed");
}
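One caveat if the tapped view is a UIImageView (imageView below stands for whichever image view you attach the recognizer to): user interaction is disabled on UIImageView by default, so the recognizer will never fire unless you enable it first:
imageView.userInteractionEnabled = YES; // UIImageView defaults to NO
[imageView addGestureRecognizer:tapGesture];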
I have an NSView object that is returned to me as the result of a function call. I know the view is valid, because I can see its contents if I do this:
NSRect rect = NSMakeRect(600,600,200,200);
NSWindow *testWindow = [[NSWindow alloc] initWithContentRect:rect styleMask:NSTexturedBackgroundWindowMask backing: NSBackingStoreBuffered defer:NO];
[[testWindow contentView] addSubview:returnedView];
[testWindow makeKeyAndOrderFront:NSApp];
In my application I have a window with a custom view (it has some text on it) that is referenced in my code via an IBOutlet. I'm trying to add the returned view as a subview of that referenced view.
[referencedView addSubview:returnedView];
[referencedView setNeedsDisplay:YES];
The referenced view is visible (I can see the text in it), but the returnedView doesn't appear on top. Am I forgetting something?
This is what my code looks like now:
[returnedView setFrame:NSMakeRect(0,0,200,200)];
[referencedView addSubview:returnedView];
[referencedView setNeedsDisplay:YES];
[referencedView drawRect:[referencedView bounds]];
Views can only be in ONE superview, as I just learned. I had both the test code and the code I wanted to work in place, so the test code was removing my view from the referenced view and putting it in the test window instead.
That custom view (with the text in it) has a custom drawRect implementation, in which the text is drawn, right? In this case, my idea is that you'd want to call super's implementation of drawRect to make sure that the subviews get drawn too.
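Something like this in the custom view's drawRect: (a minimal sketch of that suggestion):
- (void)drawRect:(NSRect)dirtyRect
{
    [super drawRect:dirtyRect]; // call super's implementation first, as suggested above
    // ... then do the custom text drawing
}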