Cocoa: how to prevent the mouse from moving past certain coordinates? - objective-c

The end goal is to prevent the mouse from moving to another screen (dual-display setup) unless a hotkey is held.
The best I have come up with is this:
[NSEvent addGlobalMonitorForEventsMatchingMask:NSMouseMovedMask handler:^(NSEvent *mouseMovedEvent) {
    // does nothing yet
}];
The reason I went for a global monitor is that my app does not have any windows (or views); it's a status bar app, so NSTrackingArea is out of the question.
Any help would be much appreciated. In the Java world, I would simply call preventDefault() on the event object; now I need the same functionality in Objective-C. Ideally there would be a "MouseMovedPastScreen" event, but apparently there isn't.
Thanks.
EDIT
Again, in Java I would get the bounds of both screens and stop the mouse at the edge positions, then allow the event to bubble only if the ⌘ key was held while the event fired.

See "Controlling the Mouse Cursor" in the Quartz Display Services programming guide, particularly:
CGAssociateMouseAndMouseCursorPosition(false);
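For illustration, here is a minimal, untested sketch of how those pieces could be combined for a status-bar app (InstallCursorFence and lastAllowedPosition are made-up names). A global monitor can only observe events, not swallow them like preventDefault(), so the sketch reacts after the fact: whenever the cursor leaves the main display without ⌘ held, it is warped back with CGWarpMouseCursorPosition.

#import <Cocoa/Cocoa.h>
#import <ApplicationServices/ApplicationServices.h>

static CGPoint lastAllowedPosition;

void InstallCursorFence(void) {
    // The area the cursor may roam: the main display, in global
    // (top-left origin) coordinates, which is what the CG warp call expects.
    CGRect fence = CGDisplayBounds(CGMainDisplayID());

    [NSEvent addGlobalMonitorForEventsMatchingMask:NSMouseMovedMask
                                           handler:^(NSEvent *mouseMovedEvent) {
        // Let the cursor cross to the other display while ⌘ is held.
        if ([NSEvent modifierFlags] & NSCommandKeyMask) {
            return;
        }
        CGPoint location = CGEventGetLocation([mouseMovedEvent CGEvent]);
        if (CGRectContainsPoint(fence, location)) {
            lastAllowedPosition = location;          // remember the last legal spot
        } else {
            // The cursor escaped the fence: snap it back to the last legal position.
            CGWarpMouseCursorPosition(lastAllowedPosition);
            // Re-associating is commonly needed so movement isn't briefly suppressed after a warp.
            CGAssociateMouseAndMouseCursorPosition(true);
        }
    }];
}

Calling CGAssociateMouseAndMouseCursorPosition(false) on its own disconnects the cursor from mouse movement entirely, which is usually too drastic for this use case; the warp-back approach keeps normal movement inside the fence.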

Related

Where To Handle Mouse Events When Using GKStateMachine

There just isn't a lot of information out there about using GKStateMachine, and I was wondering where mouse events should be handled. In Apple's "Dispenser" example, the mouse click events are handled in the main scene; is that where they should be handled, or can each state handle its own mouse events?
My purpose for using GKStateMachine in the first place is a macOS SpriteKit project (Objective-C) I'm working on that involves creating a leaderboard for a tournament. In the setup phase of the app, I have a visual list of competitors, and I click on them one at a time to decide on match-ups.
Mouse events are captured by visual elements, e.g. nodes in the scene or the scene itself. GKStateMachine is not a visual element, so it doesn't make sense for it to handle mouse events directly. Instead, a mouse event captured by a visual element can trigger a state change.
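To illustrate that split, here is a minimal sketch; LeaderboardScene, SetupState, MatchupState, and the "competitor" node name are made up for the example. The scene's mouseDown: decides what was clicked and only then asks its GKStateMachine to transition.

#import <SpriteKit/SpriteKit.h>
#import <GameplayKit/GameplayKit.h>

// Placeholder states for the sketch; real states would override
// didEnterWithPreviousState:, isValidNextState:, and so on.
@interface SetupState : GKState
@end
@implementation SetupState
@end

@interface MatchupState : GKState
@end
@implementation MatchupState
@end

@interface LeaderboardScene : SKScene
@property (nonatomic, strong) GKStateMachine *stateMachine;
@end

@implementation LeaderboardScene

- (void)didMoveToView:(SKView *)view {
    self.stateMachine = [[GKStateMachine alloc] initWithStates:
                             @[ [SetupState new], [MatchupState new] ]];
    [self.stateMachine enterState:[SetupState class]];
}

// The scene (a visual element) captures the click...
- (void)mouseDown:(NSEvent *)event {
    CGPoint location = [event locationInNode:self];
    SKNode *clicked = [self nodeAtPoint:location];
    if ([clicked.name isEqualToString:@"competitor"]) {
        // ...and only then asks the state machine to transition.
        [self.stateMachine enterState:[MatchupState class]];
    }
}

@end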

Observe global mouse location on screen in Cocoa?

I'd like to be notified when the mouse moves and get the location on the screen. I tried this:
[NSEvent addGlobalMonitorForEventsMatchingMask:NSMouseMovedMask handler:^(NSEvent *event) {
    CGPoint location = [NSEvent mouseLocation];
    NSLog(@"Position: %@", NSStringFromPoint(location));
}];
However, this seems to work only as long as the mouse is over my app's window. As soon as I leave it, I'm not notified until I enter the window again. Shouldn't this event be global?
UPDATE:
I extracted the code in question and made a separate sample project (Dropbox-Link), just to be sure nothing else is interfering with it. It's a clean project with just the code above in applicationDidFinishLaunching:. I get the same results, and it's really strange. I uploaded a video to YouTube: http://www.youtube.com/watch?v=I3AKgmURaMk.
These are my observations:
Immediately after launching the app, no events at all are delivered, no matter where I move the mouse.
Clicking the app's window will somehow activate event delivery. Now I receive NSMouseMovedMask events, no matter where I move the mouse (this is what I want to achieve).
Clicking back to Xcode doesn't change anything. I keep getting the events.
However, moving the focus back to my app results in strange behavior. When I move my mouse over Xcode, I only get events over some parts of the window, while other parts seem to absorb the event. More specifically, I don't receive events over the editor or the log view; I only receive them while moving the mouse over the gray split-view separator area (with the tab bars embedded in it).
It should work; I've seen weird behavior when the window is up but not in the foreground. I've noticed that if the application is in the background but not in the Dock, you only get events when the mouse transitions between windows in the foreground. Minimizing the application seems to fix this problem. I'm very new to Cocoa/Objective-C development, so I might be wrong, but I've noticed that when I minimize my app, I get all the events.
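For what it's worth, a global monitor never sees events dispatched to your own application; those are covered by the local-monitor counterpart. A minimal sketch combining both (InstallMouseLocationLogging is a made-up name):

#import <Cocoa/Cocoa.h>

void InstallMouseLocationLogging(void) {
    void (^logLocation)(void) = ^{
        NSLog(@"Position: %@", NSStringFromPoint([NSEvent mouseLocation]));
    };

    // Events dispatched to *other* applications.
    [NSEvent addGlobalMonitorForEventsMatchingMask:NSMouseMovedMask
                                           handler:^(NSEvent *event) {
        logLocation();
    }];

    // Events dispatched to *this* application (global monitors never see these).
    [NSEvent addLocalMonitorForEventsMatchingMask:NSMouseMovedMask
                                          handler:^NSEvent *(NSEvent *event) {
        logLocation();
        return event;   // pass the event on unchanged
    }];
}

Also note that your own windows only generate NSMouseMoved events if they ask for them (e.g. [window setAcceptsMouseMovedEvents:YES] or a tracking area), which may also be part of why only some regions of other apps' windows seem to produce events.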

How to be notified in OS X when a drag operation *starts* anywhere?

I'm wondering if there's any way to have my application be notified when a drag-and-drop operation starts anywhere on the screen, even if I don't have an active window there.
I've looked into the normal drag-and-drop APIs, but I haven't spotted anything that does this. The NSDraggingDestination protocol, along with the -[NSWindow/NSView registerForDraggedTypes:] method, lets you notice when something being dragged crosses over into your window, but I'd like to notice when any dragging operation starts anywhere on the screen.
Any tips on how to go about this? Is there a standard Cocoa API for it, or is there a private API / some kind of dirty hack to get this information?
Thanks in advance :)
Take a look at NSEvent’s +addGlobalMonitorForEventsMatchingMask:handler:. I’m not sure if you can track mouse dragging but it’s certainly possible to track mouse button up/down events.
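Building on that suggestion, here is a minimal sketch that watches for left-mouse-dragged events in other applications (InstallDragWatcher is a made-up name). Note that this only detects a drag gesture with the button held down, not necessarily a real drag-and-drop session with a pasteboard:

#import <Cocoa/Cocoa.h>

void InstallDragWatcher(void) {
    __block BOOL dragging = NO;

    [NSEvent addGlobalMonitorForEventsMatchingMask:(NSLeftMouseDraggedMask | NSLeftMouseUpMask)
                                           handler:^(NSEvent *event) {
        if ([event type] == NSLeftMouseDragged) {
            if (!dragging) {
                dragging = YES;
                NSLog(@"Drag gesture started at %@",
                      NSStringFromPoint([NSEvent mouseLocation]));
            }
        } else {             // NSLeftMouseUp
            dragging = NO;   // gesture ended, arm for the next one
        }
    }];
}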
I haven't done it, but I assume you need some kind of external software monitoring all mouse activity on the system and reporting it to your app (or your application doing this itself), as dragging events are usually reported to your app only when there is activity inside your app's own window.
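If you do need to see every mouse event on the system, a Quartz event tap is the usual lower-level tool for that kind of monitoring. A rough sketch, assuming the app has been granted the required accessibility/trust permissions (InstallDragEventTap is a made-up name):

#import <ApplicationServices/ApplicationServices.h>

static CGEventRef DragTapCallback(CGEventTapProxy proxy, CGEventType type,
                                  CGEventRef event, void *refcon) {
    if (type == kCGEventLeftMouseDragged) {
        CGPoint p = CGEventGetLocation(event);
        printf("dragging at (%.0f, %.0f)\n", p.x, p.y);
    }
    return event;   // listen-only tap: pass every event through unchanged
}

void InstallDragEventTap(void) {
    CFMachPortRef tap = CGEventTapCreate(kCGSessionEventTap,
                                         kCGHeadInsertEventTap,
                                         kCGEventTapOptionListenOnly,
                                         CGEventMaskBit(kCGEventLeftMouseDragged),
                                         DragTapCallback,
                                         NULL);
    if (tap == NULL) {
        return;   // creation fails without the required permissions
    }
    CFRunLoopSourceRef source = CFMachPortCreateRunLoopSource(kCFAllocatorDefault, tap, 0);
    CFRunLoopAddSource(CFRunLoopGetCurrent(), source, kCFRunLoopCommonModes);
    CGEventTapEnable(tap, true);
    CFRelease(source);
}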

How to make an NSWindow block other windows?

I'm looking for a way to have an NSWindow that is able to block other NSWindows, like the menu bar does. I mean: it is not possible to drag a window over the menu bar.
Is that kind of behavior achievable for my own NSWindow?
Thanks in advance
Bijan
NSWindow's dragging behavior automatically keeps windows from going under the menu bar — because they aren't supposed to. If you have some special case, you can override the standard dragging behavior. But think carefully before throwing away standard functionality prescribed by the HIG.
Also, it isn't possible to drag a window over the menu bar (rather than under) unless it's also over everything else, because the menu bar is normally above every other window.
I just stumbled on this question. It says it is possible to move other windows using the Accessibility API or Quartz Window Services.
Couldn't I just read out the other windows' positions and move them so that they don't collide with my window? Maybe triggered by a 0.1 s timer?
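For the "read out the positions" half, a sketch using Quartz Window Services could look like this (LogOnScreenWindowBounds is a made-up name). Keep in mind this API is read-only; actually moving another application's windows would require the Accessibility API (AXUIElement):

#import <Cocoa/Cocoa.h>
#import <CoreGraphics/CoreGraphics.h>

void LogOnScreenWindowBounds(void) {
    // Snapshot of every on-screen window, front to back.
    CFArrayRef windowList = CGWindowListCopyWindowInfo(
        kCGWindowListOptionOnScreenOnly | kCGWindowListExcludeDesktopElements,
        kCGNullWindowID);

    for (NSDictionary *info in (__bridge NSArray *)windowList) {
        CGRect bounds;
        CGRectMakeWithDictionaryRepresentation(
            (__bridge CFDictionaryRef)info[(__bridge id)kCGWindowBounds], &bounds);
        NSLog(@"%@ (pid %@): %@",
              info[(__bridge id)kCGWindowOwnerName],
              info[(__bridge id)kCGWindowOwnerPID],
              NSStringFromRect(NSRectFromCGRect(bounds)));
    }
    CFRelease(windowList);
}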

How can I check whether Exposé is being activated or not?

I'm creating an application that emulates the MacBook's multi-touch trackpad. As you may know, on a MacBook's trackpad:
if you swipe 4 fingers up, it triggers Show Desktop.
if you swipe 4 fingers down, it shows Exposé.
However, if Show Desktop is active and you swipe 4 fingers down, it comes back to normal mode. The same goes for Exposé: if Exposé is active and you swipe 4 fingers up, it also comes back to normal mode.
Here is the problem: I use the keyboard shortcut F3 to trigger Exposé and F11 to trigger Show Desktop. When Show Desktop is active and I press F3, it goes straight to Exposé; and when Exposé is active and I press F11, it goes straight to Show Desktop. But I want it to behave like the trackpad, which I guess would look roughly like this:
- (void)fourFingersDidSwipeUp {
    if (isExposeBeingActivated() || isShowDesktopBeingActivated()) {
        pressKey("Esc");
    } else {
        pressKey("F11");
    }
}
But I don't know how to implement the isExposeBeingActivated() and isShowDesktopBeingActivated() methods. I've tried creating a window and checking whether its size has changed (on the assumption that if Exposé is active, the window should get smaller), but the system always returns the same size. I also tried monitoring the background processes during Exposé, but nothing happened. Does anyone have any suggestions on this?
(I'm sorry if my English sounds weird.)
As far as I know, there's no public interface to any Exposé-related functionality beyond the ability to specify the "collection behavior" of your own application's windows.
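For reference, this is the kind of per-window tweak that "collection behavior" refers to; it controls how your own window participates in Exposé and Spaces, but it won't tell you whether Exposé is currently active (ConfigureWindowForExpose is a made-up helper):

#import <Cocoa/Cocoa.h>

void ConfigureWindowForExpose(NSWindow *window) {
    // Keep this window out of Exposé and window cycling, and show it on every Space.
    window.collectionBehavior = NSWindowCollectionBehaviorStationary
                              | NSWindowCollectionBehaviorCanJoinAllSpaces
                              | NSWindowCollectionBehaviorIgnoresCycle;
}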
I came here after reading your email. I understand the problem. After a bit of googling, I found out what you already know: there's really no official API or documentation for Exposé. A very ugly solution I've thought of would be to have your Exposé trigger start a timer equal to the total time it takes to show all windows fully (guessing this is constant). If a swipe up happened within that timer, it would mean Exposé is still active (isExposeBeingActivated()), so you would trigger a cancel instead of a Show Desktop. This wouldn't cover the "slow motion" Exposé (via the Shift key); maybe you can detect whether it's a normal or "slow motion" Exposé call?
Really sorry if this doesn't make sense within your application's scope; I'm just throwing out the first solution I thought of.
Cheers.
Pedro.