How to cancel a drag in a Mac application - objective-c

I have an application that has a canvas (NSView) where a user can drag an element around. When the mouse leaves the edge of that view it becomes a drag operation.
What I would like is for the drag operation to be cancelled when the mouse re-enters the originating view, so the move automatically continues within the canvas again.
I can figure out the second part; I just need to figure out how to force a drag-and-drop operation to cancel. I need to do this somehow from draggingEntered:, so before the mouse is even released.

Make your canvas view respond to drags as well. Initially all its <NSDraggingDestination> protocol methods would just return “no, do nothing”, but if you start a drag from within the canvas you’d keep track of that, and once the drag leaves and comes back your NSDraggingDestination methods would return, “OK, we accept, the drag is over, don’t bother animating.”
Then you could continue tracking locally. Like, assuming your canvas had called:
- (void)dragImage:(NSImage *)anImage at:(NSPoint)viewLocation offset:(NSSize)initialOffset event:(NSEvent *)event pasteboard:(NSPasteboard *)pboard source:(id)sourceObj slideBack:(BOOL)slideFlag;
That method would then return control to your canvas.
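A minimal sketch of that destination side, assuming the canvas keeps a hypothetical _dragOriginatedHere flag that you set just before starting the drag, and registers for whatever pasteboard type you actually put on it:

// In the canvas view. Register for your drag type up front, e.g. in awakeFromNib:
// [self registerForDraggedTypes:@[NSPasteboardTypeString]];

- (NSDragOperation)draggingEntered:(id <NSDraggingInfo>)sender {
    // _dragOriginatedHere is a hypothetical BOOL ivar set just before calling
    // dragImage:at:offset:event:pasteboard:source:slideBack:.
    return _dragOriginatedHere ? NSDragOperationMove : NSDragOperationNone;
}

- (BOOL)prepareForDragOperation:(id <NSDraggingInfo>)sender {
    return _dragOriginatedHere;
}

- (BOOL)performDragOperation:(id <NSDraggingInfo>)sender {
    // Accepting the drop ends the drag session without the slide-back
    // animation; control then returns to the code after dragImage:..., where
    // the canvas can resume its own local tracking of the element.
    return YES;
}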

Related

How to forward pointer events

I was able to get this working in iOS but I am having trouble with Win 8. What I need to do is this:
I have a drawing view that is the front-most child of a ScrollViewer. When I enable drawing mode from a button in the app bar, this view becomes active. With one touch, I want the user to be able to perform the current drawing action, but with two touches I want the touches to go to the scroll view instead for scrolling and zooming. I am able to distinguish between 1 and 2 (or more) fingers, but I don't know what to do after that point.
I tried removing the manipulation mode from the drawing view so that it would not block the scroll view, but the touches continued to be swallowed by the drawing view. I also tried calling ReleasePointerCapture but that had no effect either. How can I forward touch events using the WinRT API?
For some more information: I am making use of the PointerPressed, PointerChanged, etc. events in the drawing view.

Observe global mouse location on screen in Cocoa?

I'd like to be notified when the mouse moves and get the location on the screen. I tried this:
[NSEvent addGlobalMonitorForEventsMatchingMask:NSMouseMovedMask handler:^(NSEvent *event) {
    NSPoint location = [NSEvent mouseLocation];
    NSLog(@"Position: %@", NSStringFromPoint(location));
}];
However, this only seems to work as long as the mouse is over my app's window. As soon as I leave it, I'm not notified until I enter the window again. Shouldn't this event be global?
UPDATE:
I extracted the code in question into a separate sample project (Dropbox-Link), just to be sure nothing else is interfering with it. It's a clean project with just the code above in applicationDidFinishLaunching:. I get the same results, and it's really strange. I uploaded a video to YouTube: http://www.youtube.com/watch?v=I3AKgmURaMk.
These are my observations:
Immediately after launching the app, no events at all are delivered, no matter where I move the mouse.
Clicking the app's window will somehow activate event delivery. Now I receive NSMouseMovedMask events, no matter where I move the mouse (this is what I want to achieve).
Clicking back to Xcode doesn't change anything. I keep getting the events.
However, moving the focus back to my app results in strange behavior. When I move my mouse over Xcode I only get events over some parts of the window, while other parts seem to absorb the event. More specifically, I don't receive events over the editor or the log view; I only receive them while moving the mouse over the gray split-view separator area (with the tab bars embedded in them).
It should work; I've seen weird behavior when the window is up but not in the foreground. I've noticed that if the application is in the background but not in the Dock, you only get events when the mouse transitions between windows in the foreground. Minimizing the application seems to fix this problem. I'm very new at Cocoa/Objective-C development, so I might be wrong, but I've noticed that when I minimize my app, I get all events.
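One thing that might also be worth checking: a global monitor only receives copies of events dispatched to other applications, so to also get mouse-moved events while the cursor is over your own windows you can pair it with a local monitor. A minimal sketch (the handler bodies are just illustrative):

// Global monitor: copies of mouse-moved events delivered to other applications.
[NSEvent addGlobalMonitorForEventsMatchingMask:NSMouseMovedMask handler:^(NSEvent *event) {
    NSLog(@"Global: %@", NSStringFromPoint([NSEvent mouseLocation]));
}];

// Local monitor: mouse-moved events delivered to this application.
// Return the event so it continues on to the rest of the app.
[NSEvent addLocalMonitorForEventsMatchingMask:NSMouseMovedMask handler:^NSEvent *(NSEvent *event) {
    NSLog(@"Local: %@", NSStringFromPoint([NSEvent mouseLocation]));
    return event;
}];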

UIButton events. What's the difference?

I've encountered a problem where my button should remain "pressed down" while it shows a popover presented from it. The popover is a selector for a filter, and the chosen filter is shown on the button itself. When I tap the button and it shows the popover, the button becomes deselected no matter what.
I think I have to redefine its behavior on touch events and make it respond to something other than the standard touch up inside. That made me wonder what the other events are responsible for, but I couldn't find an event list in the iOS library documentation, and on Stack Overflow there are only questions about incorrect behavior of touch up inside or touch down.
So what's the difference between the touch events?
touch cancel - when you touch the button but move your finger away and it remains deselected?
touch down - right on tap.
touch down repeat - ??
touch drag enter - ??
touch drag exit - ??
touch drag inside - ??
touch drag outside - ??
touch up inside - when you tap and release the button while staying inside its bounds. It changes the UIButton's state to Normal.
touch up outside - when you tap and release the button after leaving its bounds?
Other IBActions are not sent by UIButton, right?
Also, how do those events change the UIButton's appearance, e.g. highlighted or selected?
I'd appreciate a link to a good article about IBActions, because I couldn't find one.
From Apple's doc for UIControlEvents:
UIControlEventTouchCancel
A system event canceling the current touches for the control.
UIControlEventTouchDown
A touch-down event in the control.
UIControlEventTouchDownRepeat
A repeated touch-down event in the control; for this event the value of the UITouch tapCount method is greater than one.
UIControlEventTouchDragEnter
An event where a finger is dragged into the bounds of the control.
UIControlEventTouchDragExit
An event where a finger is dragged from within a control to outside its bounds.
UIControlEventTouchDragInside
An event where a finger is dragged inside the bounds of the control.
UIControlEventTouchDragOutside
An event where a finger is dragged just outside the bounds of the control.
UIControlEventTouchUpInside
A touch-up event in the control where the finger is inside the bounds of the control.
UIControlEventTouchUpOutside
A touch-up event in the control where the finger is outside the bounds of the control.
Listed in what I would consider the order of common use/likelihood of occurrence for a normal button:
UIControlEventTouchDown: The user tapped the button. This fires on the finger/stylus making contact.
UIControlEventTouchUpInside: The user tapped the button. This fires when the finger/stylus is pulled back away from the screen.
The following are useful for sliders and drag interactions like moving a component around; they are listed in order of occurrence:
UIControlEventTouchDragInside: Triggered continuously as the finger drags around inside the button's bounds.
UIControlEventTouchDragExit: Triggered during a drag motion. It is called only once, as the user's finger/stylus leaves the bounds of the button.
UIControlEventTouchDragOutside: Triggered during a drag motion, after UIControlEventTouchDragExit, and is called continuously for as long as the original touch continues outside the bounds.
UIControlEventTouchUpOutside: This is simply the finger/stylus being lifted, BUT only if the finger/stylus is no longer within the bounds of the button. The important (and probably obvious) thing to call out is that the touch had to have been within the button at some point for this event to be associated with the button.
Note: My understanding is that the above can be helpful for:
Sliders: as you might expect, the touch may have been intentional, but because of the quick swipe action the finger movement may be sloppy and lift up outside of the slider area.
Moving components around, as when you push things around a screen you want the movement to happen when the finger/stylus touches the border of the component/object.
Other events:
UIControlEventTouchCancel: Something out of the user's control is cancelling their touch action. Think of this as something "going wrong" on the phone side of things.
UIControlEventTouchDownRepeat: Want to detect when your user is mad and tapping a button furiously? Want to detect if they're still in Windows mode and are trying to "double click"? Or maybe you designed a button to do something different if they tap twice. This event helps with all of those!
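For completeness, here is a small sketch of wiring a few of these events to a button in code; the action selectors (buttonTouchedDown: and friends) are placeholder names:

UIButton *button = [UIButton buttonWithType:UIButtonTypeSystem];

// One target/action pair per control event of interest; the selectors
// are hypothetical methods you would implement on self.
[button addTarget:self action:@selector(buttonTouchedDown:)
 forControlEvents:UIControlEventTouchDown];
[button addTarget:self action:@selector(buttonTapped:)
 forControlEvents:UIControlEventTouchUpInside];
[button addTarget:self action:@selector(buttonDragExited:)
 forControlEvents:UIControlEventTouchDragExit];
[button addTarget:self action:@selector(buttonDoubleTapped:)
 forControlEvents:UIControlEventTouchDownRepeat];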
References:
SO 1: Difference between UIControlEventTouchDragOutside and UIControlEventTouchDragExit
SO 2: What is UIControlEventTouchCancel?

Missing mouse up event from different canvas in smartGWT

I have two canvases located next to each other. One is supposed to be a kind of workspace you can add items to, select, and rearrange them. The other one is just a property view.
What I want to do is be able to draw a rectangle on the workspace. As long as the user holds down the mouse button the rectangle is shown; when he releases the button, all items beneath the rectangle will be selected. This currently works well with the MouseDown, MouseStillDown and MouseUp events: I draw another rectangle-shaped canvas on the workspace, which is transformed on every MouseStillDown event, and the selection happens on the MouseUp event. My problem is that if the user holds down the mouse button, moves it over the property canvas and then releases the button, the MouseUp event from the workspace isn't fired. Neither is the one from the property view, since it never received a MouseDown event. So if the user releases the button there, the selection won't work and the rectangle stays on the workspace.
Is there a way to avoid this? Or is there a better way to determine the area the user selected with the mouse while holding down the left mouse button?

Mouse events for an NSSegmentedCell subclass?

I'm trying to implement some rudimentary tabs in a Cocoa editor I'm working on. I am using an NSSegmentedControl and adding segments to it as tabs. I'm using a custom NSSegmentedCell subclass for the tabs to draw a little 'x' icon next to the text for closing tabs, and so far it's been going pretty smoothly.
However, I cannot figure out how to actually process mouse events for the tabs to check if someone moused over (or clicked) the 'x' icon. I tried overriding mouseMoved: in my NSSegmentedControl subclass, but for some odd reason it stops getting called when I add a new segment (I set setAcceptsMouseMovedEvents to YES in awakeFromNib; do I also have to set it somewhere else?). NSSegmentedCells, being NSCell subclasses, don't seem to have any mouse event processing aside from mouse tracking, which only gets triggered when the control is clicked.
So the question is, how would I properly process mouse events, either in the NSSegmentedControl or in the NSSegmentedCell subclass?
Take a look at NSTrackingArea. You can add a tracking area to your NSSegmentedControl and get mouse-entered events on that to highlight the close button.
As for getting the click events, you're probably best off using a separate NSActionCell subclass for the close button and doing some hit testing there.
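A rough sketch of the tracking-area part, in the NSSegmentedControl subclass; the hit test against the 'x' rect is left as a comment since it depends on your cell's drawing metrics:

- (void)updateTrackingAreas {
    [super updateTrackingAreas];
    // Throw away any stale areas before adding one covering the whole control.
    for (NSTrackingArea *area in [self.trackingAreas copy]) {
        [self removeTrackingArea:area];
    }
    NSTrackingArea *area = [[NSTrackingArea alloc] initWithRect:self.bounds
                                                        options:(NSTrackingMouseEnteredAndExited |
                                                                 NSTrackingMouseMoved |
                                                                 NSTrackingActiveInKeyWindow)
                                                          owner:self
                                                       userInfo:nil];
    [self addTrackingArea:area];
}

- (void)mouseMoved:(NSEvent *)event {
    NSPoint point = [self convertPoint:event.locationInWindow fromView:nil];
    NSLog(@"mouse moved over control at %@", NSStringFromPoint(point));
    // From here, hit-test 'point' against the rect where your cell draws the
    // 'x' icon for each segment and redraw that segment highlighted.
    [self setNeedsDisplay:YES];
}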