When Mission Control runs, it prevents applications from receiving keyboard and mouse events. It also leaves the previously active application thinking that it still has focus. This is a problem for me: if I start Mission Control with a mouse button or a key held down, I never receive the keyUp or mouseUp event, and my application keeps behaving as if that mouse button or key is still held down.
I would like either a way to read keyboard and mouse events even when Mission Control is active, or a way of detecting that Mission Control is active. Ideally the latter, since I effectively can't use my application while Mission Control is running.
I've tried a couple of things with no luck:
Use addGlobalMonitorForEventsMatchingMask to register a global monitor for keyboard and mouse events. This captures mouse events (but not keyboard events, although the documentation says keyDown events should be sent to the global monitor) when I switch to another application, but Mission Control doesn't seem to let events propagate to global monitors.
Check [[NSRunningApplication currentApplication] {isActive, ownsMenuBar}].
Apparently, my application is active even though it's not receiving events!
Check [NSApp keyWindow] != nil.
Apparently, one of my windows should be receiving key events. None of them are.
Check if Mission Control is one of the running applications returned by [NSWorkspace runningApplications]. Mission Control does not show up in this list when it's running.
Edit:
I've finally worked around this problem (albeit not in a very satisfactory way). For the mouse, it turns out that you can query the state of the pressed buttons with [NSEvent pressedMouseButtons]. I simply keep track of what I think the mouse state should be from NSLeftMouseDown and NSLeftMouseUp events and compare that to [NSEvent pressedMouseButtons] every so often to make sure that they're consistent. If they're not, then I know that something has hijacked my NSLeftMouseUp event and act accordingly.
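A rough sketch of that periodic check (leftButtonBelievedDown and handleMissedMouseUp are hypothetical names for my own bookkeeping):
// Called from a timer; reconcile the tracked button state with reality.
- (void)verifyMouseState:(NSTimer *)timer
{
    BOOL leftActuallyDown = ([NSEvent pressedMouseButtons] & 1) != 0; // bit 0 = left button
    if (self.leftButtonBelievedDown && !leftActuallyDown) {
        // The NSLeftMouseUp was swallowed (e.g. by Mission Control); treat the button as released.
        self.leftButtonBelievedDown = NO;
        [self handleMissedMouseUp];
    }
}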
For the keyboard, I could not find a way to query the keyboard state, so I couldn't do a similar workaround. I ended up disabling application switching using presentation options when keys are pressed.
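Roughly, the presentation-options approach looks like this (DisableProcessSwitching is combined with AutoHideDock here since valid option combinations are constrained; the responder methods are just where I happen to hook in):
- (void)keyDown:(NSEvent *)event
{
    // While a key is held, disable application switching.
    [NSApp setPresentationOptions:NSApplicationPresentationDisableProcessSwitching |
                                  NSApplicationPresentationAutoHideDock];
}

- (void)keyUp:(NSEvent *)event
{
    // Restore normal behaviour once all keys are released.
    [NSApp setPresentationOptions:NSApplicationPresentationDefault];
}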
At least in OS X 10.10, you can use this code to check if Mission Control is active or not:
func missionControlIsActive() -> Bool
{
    var result: Bool = false

    // CGWindowID(0) is equal to kCGNullWindowID; we own the returned CFArrayRef
    let windowInfosRef = CGWindowListCopyWindowInfo(CGWindowListOption(kCGWindowListOptionOnScreenOnly), CGWindowID(0))
    let windowList: NSArray = windowInfosRef.takeRetainedValue()

    for entry in windowList
    {
        // Mission Control's backdrop windows are owned by the Dock process
        if (entry.objectForKey("kCGWindowOwnerName") as! String) == "Dock"
        {
            let bounds: NSDictionary = entry.objectForKey("kCGWindowBounds") as! NSDictionary
            if (bounds.objectForKey("Y") as! NSNumber) == -1
            {
                result = true
            }
        }
    }

    return result
}
In a nutshell, the code checks whether a specific window owned by the OS X Dock process is visible on screen and sits in a specific position. If both conditions are met, Mission Control is currently active. The code works in a sandboxed app, and no assistive-device privileges are required.
Did you try working at the shell level using NSTask? Something like ps -faxU <username> should list all running processes, and you could then parse the output, or indeed use ps -faxU <username> | grep -i "mission control". (Off the top of my head I'm not sure what the process is actually called, but something like "mission control" seems plausible.) Not the most elegant solution, but if nothing else works it may be worth it.
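A rough sketch of that idea with NSTask (the exact ps arguments and the process-name string to match are guesses):
// Run `ps` and search its output for a Mission Control process.
NSTask *task = [[NSTask alloc] init];
task.launchPath = @"/bin/ps";
task.arguments = @[@"-faxU", NSUserName()];

NSPipe *pipe = [NSPipe pipe];
task.standardOutput = pipe;
[task launch];

NSData *data = [[pipe fileHandleForReading] readDataToEndOfFile];
[task waitUntilExit];

NSString *output = [[NSString alloc] initWithData:data encoding:NSUTF8StringEncoding];
BOOL found = [output rangeOfString:@"mission control"
                           options:NSCaseInsensitiveSearch].location != NSNotFound;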
Maybe I'm missing something, but have you tried using event taps instead of global monitoring?
It does appear that DTrace has some ability to see Mission Control being activated. Try running:
sudo fs_usage -filesys | grep Mission
from the command line and then launching the Mission Control app from the /Applications folder.
You should see a lot of output related to Mission Control starting up. Unfortunately, the same output does not appear when Mission Control is invoked via the keyboard shortcut or a swipe. Of course, using DTrace in production code is not something I would actually recommend.
A C++/Qt implementation that works in recent versions of OS X:
bool Window::missionControlIsActive() {
    bool result = false;

    CFArrayRef windows = CGWindowListCopyWindowInfo(
        kCGWindowListOptionOnScreenOnly | kCGWindowListExcludeDesktopElements, kCGNullWindowID);
    if (!windows)
        return result;

    for (CFIndex i = 0; i < CFArrayGetCount(windows); i++) {
        auto dict = (CFDictionaryRef)CFArrayGetValueAtIndex(windows, i);

        // Mission Control's backdrop windows are unnamed and owned by the Dock.
        auto name = (CFStringRef)CFDictionaryGetValue(dict, kCGWindowName);
        if (QString::fromCFString(name) != u"") continue;
        auto ownerName = (CFStringRef)CFDictionaryGetValue(dict, kCGWindowOwnerName);
        if (QString::fromCFString(ownerName) != u"Dock") continue;

        auto bounds = (CFDictionaryRef)CFDictionaryGetValue(dict, kCGWindowBounds);
        auto boundsY = (CFNumberRef)CFDictionaryGetValue(bounds, CFSTR("Y"));
        double y;
        CFNumberGetValue(boundsY, kCFNumberFloat64Type, &y);

        // A Y value outside the normal on-screen range marks the Mission Control backdrop.
        if (y > 1.0 && y < 1000000) continue;

        result = true;
        break;
    }

    CFRelease(windows);
    return result;
}
Related
I would like to create a system tool / application which has the capacity to aid in window management. I'm trying to find documentation about the following topics, if they are indeed possible given the security sandboxing of OSX.
Show a list of running applications with the name & icon, and allow the user to choose one
Manipulate the frame(s) of said application's windows (eg, resize, reposition) from my app (with animations -- though I assume this will be trivial once I can perform the actual change)
Hide or show these applications from task managers, etc.
Be able to launch (or terminate) instances of the given application
It seems to me that Quicksilver accomplishes many of these things, but its lack of App Store availability makes me wonder whether it is possible to do this while remaining in the OS X sandbox.
There are a lot of pieces of software out there that do window management. You can check out a tiling window manager I've been hacking on called Amethyst. The basic idea behind software like this relies on Accessibility (which you can find documentation for here). As a quick overview the APIs work by acquiring references to accessibility elements (applications, windows, buttons, text fields, etc.) which have properties (hidden, position, size, etc.), some of which are writable.
As an example, let's say you wanted to move all windows of every running application to the upper-left corner of the screen. That code might look like this:
for (NSRunningApplication *runningApplication in [[NSWorkspace sharedWorkspace] runningApplications]) {
    AXUIElementRef applicationRef = AXUIElementCreateApplication([runningApplication processIdentifier]);

    CFArrayRef applicationWindows = NULL;
    AXUIElementCopyAttributeValues(applicationRef, kAXWindowsAttribute, 0, 100, &applicationWindows);
    if (!applicationWindows) { CFRelease(applicationRef); continue; }

    for (CFIndex i = 0; i < CFArrayGetCount(applicationWindows); ++i) {
        AXUIElementRef windowRef = (AXUIElementRef)CFArrayGetValueAtIndex(applicationWindows, i);
        CGPoint upperLeft = { .x = 0, .y = 0 };
        AXValueRef positionRef = AXValueCreate(kAXValueCGPointType, &upperLeft);
        AXUIElementSetAttributeValue(windowRef, kAXPositionAttribute, positionRef);
        CFRelease(positionRef);
    }

    CFRelease(applicationWindows);
    CFRelease(applicationRef);
}
This illustrates how to get references to applications and their windows, how to copy attributes from an accessibility element, and how to set attributes on one.
There are a variety of notifications documented in NSWorkspace for the launching and termination of applications, and the accessibility framework also provides notifications for things like an application creating or destroying windows, or a window miniaturizing or deminiaturizing.
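For example, a minimal sketch of listening for application launches through NSWorkspace's own notification center (the logging is illustrative only):
[[[NSWorkspace sharedWorkspace] notificationCenter]
    addObserverForName:NSWorkspaceDidLaunchApplicationNotification
                object:nil
                 queue:[NSOperationQueue mainQueue]
            usingBlock:^(NSNotification *note) {
                // The launched application is passed in the userInfo dictionary.
                NSRunningApplication *app = note.userInfo[NSWorkspaceApplicationKey];
                NSLog(@"launched: %@", app.localizedName);
            }];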
Animating the window changes is non-trivial and I haven't figured out how to do it yet; it may be possible, but perhaps not without hitting private APIs. The other things you have listed should be possible, though. Hiding an application, for example, could be done by setting the kAXHiddenAttribute on the application accessibility element. Launching an application can be done via -[NSWorkspace launchApplication:].
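Hiding and launching might look roughly like this (applicationRef is the element from the example above; "Safari" is just a placeholder name):
// Hide an application by setting its AXHidden attribute.
AXUIElementSetAttributeValue(applicationRef, kAXHiddenAttribute, kCFBooleanTrue);

// Launch an application by name through NSWorkspace.
[[NSWorkspace sharedWorkspace] launchApplication:@"Safari"];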
Note that the use of accessibility necessitates that the user have Enable access for assistive devices turned on in System Preferences > Accessibility.
I'm using this code to capture Global Keyboard Shortcuts in my app.
This works great on almost every computer I run it on. I tried it on a brand new Retina Macbook Pro this week and addGlobalMonitorForEventsMatchingMask doesn't work at all. It doesn't even log every key like I have it set up to do here for debugging.
Is there a more reliable way to do this? Right now I load this on applicationDidFinishLaunching.
I think it might make more sense to load it as its own method in the App Delegate but I'm not sure what the syntax of that would look like.
[NSEvent addGlobalMonitorForEventsMatchingMask:NSKeyDownMask handler:^(NSEvent *event) {
    NSLog(@"sequence = %lu", (unsigned long)[event modifierFlags]);

    // Activate app when pressing cmd-c
    if ([event modifierFlags] == 1048840 && [[event charactersIgnoringModifiers] compare:@"c"] == 0) {
        [NSApp activateIgnoringOtherApps:YES];
    }
}];
In OSX 10.9 (Mavericks) the setting has moved to System Preferences > Security & Privacy > Privacy > Accessibility - make sure your app is checked.
Check "Enable access for assistive devices" under Accessibility in System Preferences and try again.
Using this code, I can register a global event handler:
[NSEvent addGlobalMonitorForEventsMatchingMask:NSKeyDownMask
                                       handler:^(NSEvent *incomingEvent) {
    NSString *chars = [[incomingEvent characters] lowercaseString];
    unichar character = [chars characterAtIndex:0];

    // do something useful
    NSLog(@"keydown globally! Which key? This key: %C", character);
}];
Unfortunately, events only get passed along to this monitor if support for assistive devices is enabled. Without it enabled, no events get passed along.
From the documentation:
Key-related events may only be monitored if accessibility is enabled or if your
application is trusted for accessibility access (see AXIsProcessTrusted).
I wonder if another method exists that passes events along without forcing the user to enable specific features of OS X.
While I didn't find a solution in Apple's docs, a solution must exist. For example, the MAS-downloaded version of Alfred lets you define a hotkey.
Interestingly, Alfred's preferences only show special keys and point out that certain special key combinations may not work.
Since I basically want to show / hide a 'global' non-activating panel, I should probably just prepare a system service. Should I?
You could try creating a Quartz Event Tap, with kCGSessionEventTap as the location.
Quartz Event Services Reference
Sample code from Mac OS X Internals
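For reference, a minimal sketch of such a tap might look like this (listen-only, key-down events; the callback name and logging are placeholders):
// Callback: a listen-only tap must return the event unchanged.
CGEventRef myTapCallback(CGEventTapProxy proxy, CGEventType type, CGEventRef event, void *refcon)
{
    if (type == kCGEventKeyDown) {
        int64_t keyCode = CGEventGetIntegerValueField(event, kCGKeyboardEventKeycode);
        NSLog(@"key down, keycode %lld", keyCode);
    }
    return event;
}

// Create the tap at the session level and attach it to the current run loop.
CFMachPortRef tap = CGEventTapCreate(kCGSessionEventTap, kCGHeadInsertEventTap,
                                     kCGEventTapOptionListenOnly,
                                     CGEventMaskBit(kCGEventKeyDown),
                                     myTapCallback, NULL);
if (tap) {
    CFRunLoopSourceRef source = CFMachPortCreateRunLoopSource(kCFAllocatorDefault, tap, 0);
    CFRunLoopAddSource(CFRunLoopGetCurrent(), source, kCFRunLoopCommonModes);
    CFRelease(source);
    CFRelease(tap);
}
Note that keyboard taps may still be subject to the accessibility and secure-input restrictions discussed elsewhere in this thread.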
I've needed to make a global hot key input box in my Cocoa App.
I know about Shortcut Recorder, but it is a very old solution. It has parts implemented using Carbon, which has been deprecated, and I can't publish my app to the Mac App Store if I use it.
Is there any ready-to-use modern solution? Can anybody show me how to build this myself? (I don't know where to start.)
There is a modern framework named MASShortcut for implementing Global Shortcuts in OS X 10.7+.
Not all of Carbon is deprecated. You can't make a pure-Carbon application anymore, but some APIs live on and some of them are still the easiest way to do certain things.
One of these is the Carbon Events hotkey API. You certainly can sift through all the events using NSEvent's event-monitor methods, but it's unnecessary work. The Carbon Events hotkey API is still supported and much simpler—you just tell it what key you want to match and what function to call when the key is pressed. And there are Cocoa wrappers such as DDHotKey that make it even simpler.
RegisterEventHotKey, the relevant Carbon Events function (see also UnregisterEventHotKey in the same doc)
DDHotKey
MASShortcut, yet another wrapper (suggested by TongG):
MASShortcut (with ARC)
MASShortcut (without ARC)
KeyboardShortcuts, written in Swift and includes a SwiftUI hotkey recorder view [added to this answer in edit by the project's author].
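For orientation, a hedged sketch of registering a global hotkey with RegisterEventHotKey might look like this (the key choice, hotkey ID, and handler are illustrative assumptions, not a prescribed setup):
#import <Carbon/Carbon.h>
#import <Foundation/Foundation.h>

static OSStatus hotKeyHandler(EventHandlerCallRef nextHandler, EventRef event, void *userData)
{
    NSLog(@"hotkey pressed");
    return noErr;
}

static void registerHotKey(void)
{
    // Handle hotkey-pressed events delivered to the application target.
    EventTypeSpec eventType = { kEventClassKeyboard, kEventHotKeyPressed };
    InstallApplicationEventHandler(&hotKeyHandler, 1, &eventType, NULL, NULL);

    // Register Cmd-Shift-P as a system-wide hotkey.
    EventHotKeyID hotKeyID = { .signature = 'htk1', .id = 1 };
    EventHotKeyRef hotKeyRef = NULL;
    RegisterEventHotKey(kVK_ANSI_P, cmdKey | shiftKey, hotKeyID,
                        GetApplicationEventTarget(), 0, &hotKeyRef);
}
Wrappers like DDHotKey and MASShortcut essentially package this registration and the recording UI for you.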
In Mac OS X 10.6 and higher, you can use the methods +addGlobalMonitorForEventsMatchingMask:handler: and +addLocalMonitorForEventsMatchingMask:handler: defined in the NSEvent class. The Monitoring Events page reports the following information:
Local and global event monitors are mutually exclusive. For example, the global monitor does not observe the event stream of the application in which it is installed. The local event monitor only observes the event stream of its application. To monitor events from all applications, including the "current" application, you must install both event monitors.
The code shown on that page is for a local event monitor, but the code for a global event monitor is similar; what changes is the NSEvent method you invoke.
_eventMonitor = [NSEvent addLocalMonitorForEventsMatchingMask:
    (NSLeftMouseDownMask | NSRightMouseDownMask | NSOtherMouseDownMask | NSKeyDownMask)
    handler:^(NSEvent *incomingEvent) {
        NSEvent *result = incomingEvent;
        NSWindow *targetWindowForEvent = [incomingEvent window];

        if (targetWindowForEvent != _window) {
            [self _closeAndSendAction:NO];
        } else if ([incomingEvent type] == NSKeyDown) {
            if ([incomingEvent keyCode] == 53) {
                // Escape
                [self _closeAndSendAction:NO];
                result = nil; // Don't process the event
            } else if ([incomingEvent keyCode] == 36) {
                // Enter
                [self _closeAndSendAction:YES];
                result = nil;
            }
        }

        return result;
    }];
Once the monitor is no longer needed, you remove it using the following code:
[NSEvent removeMonitor:_eventMonitor];
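For completeness, a hedged sketch of the global variant (the handler body is illustrative only; note that a global monitor's handler returns void and cannot modify or swallow events):
id globalMonitor = [NSEvent addGlobalMonitorForEventsMatchingMask:NSKeyDownMask
                                                          handler:^(NSEvent *incomingEvent) {
    // Observe key downs delivered to other applications.
    NSLog(@"global key down: %@", [incomingEvent charactersIgnoringModifiers]);
}];
// Later, when no longer needed:
// [NSEvent removeMonitor:globalMonitor];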
I'm using some Carbon code in my Cocoa project for handling global key events (shortcuts) from other applications. Currently I have set up a kEventHotKeyReleased event handler, and I can successfully receive hot keys when my application is not active. That triggers some operation in my application.
The problem I have with the behavior of kEventHotKeyReleased is:
Say, for example, I press the Cmd-Shift-P key combination. As soon as I release the "P" key, the hot key event is triggered. I need the event to be triggered (or to trigger it manually) only when all of the keys have been released (i.e., the Cmd and Shift keys are released too).
It is easy to monitor for hot keys, but I have seen nothing for monitoring individual keystrokes. If I could monitor the modifier key states, I would be in business.
Any hints on how to do this?
Thanks in advance!
UPDATE:
I've tried using kEventRawKeyUp and kEventRawKeyModifiersChanged, but while kEventHotKeyReleased works, those two don't, even though I set them up in exactly the same way as kEventHotKeyReleased.
EventTypeSpec eventTypes[] = {{kEventClassKeyboard, kEventHotKeyReleased}, {kEventClassKeyboard, kEventRawKeyUp}};
// Changing the order in the list does not help, nor does removing kEventHotKeyReleased
OSStatus err = InstallApplicationEventHandler(&globalHotkeyHandler, GetEventTypeCount(eventTypes), eventTypes, NULL, NULL);
// err == noErr after this line
The globalHotkeyHandler function is called for kEventHotKeyReleased, but not for kEventRawKeyUp, for some reason I can't seem to grasp. Here's what my globalHotkeyHandler function looks like:
OSStatus globalHotkeyHandler(EventHandlerCallRef nextHandler, EventRef anEvent, void *userData) {
    NSLog(@"Something happened!");
    return noErr; // the handler must return an OSStatus
}
Is there an additional call that needs to be made or something else I forgot?
N.B: At first glance, it seems like it could be that Access for Assistive Devices is disabled but it is not. So I'm pretty clueless.
UPDATE 2:
I investigated a bit on the CGEventTap Leibowitzn suggested and I came up with this setup:
CFMachPortRef keyUpEventTap = CGEventTapCreate(kCGHIDEventTap,kCGHeadInsertEventTap,kCGEventTapOptionListenOnly,kCGEventKeyUp,&keyUpCallback,NULL);
CFRunLoopSourceRef keyUpRunLoopSourceRef = CFMachPortCreateRunLoopSource(NULL, keyUpEventTap, 0);
CFRelease(keyUpEventTap);
CFRunLoopAddSource(CFRunLoopGetCurrent(), keyUpRunLoopSourceRef, kCFRunLoopDefaultMode);
CFRelease(keyUpRunLoopSourceRef);
... and the callback:
CGEventRef keyUpCallback(CGEventTapProxy proxy, CGEventType type, CGEventRef event, void *refcon) {
    NSLog(@"KeyUp event tapped!");
    return event;
}
As you can see I'm using kCGEventKeyUp as the mask for the event tap but somehow I'm receiving mouse down events ??!??
UPDATE 3:
OK, forget that; I overlooked the line in the doc that says to use CGEventMaskBit(kCGEventKeyUp) for this parameter, so the correct call is:
CGEventTapCreate(kCGHIDEventTap,kCGHeadInsertEventTap,kCGEventTapOptionListenOnly,CGEventMaskBit(kCGEventKeyUp),&keyUpCallback,NULL);
I'm still having a problem though: modifier keys do not trigger the kCGEventKeyUp...
UPDATE 4:
OK, forget that again... I'm bound to answer my own questions five minutes after asking them today, huh!
To intercept modifier keys, use kCGEventFlagsChanged:
CGEventTapCreate(kCGHIDEventTap,kCGHeadInsertEventTap,kCGEventTapOptionListenOnly,CGEventMaskBit(kCGEventFlagsChanged),&callbackFunction,NULL);
So in essence I got the key and modifier key state detection working, but I'm still interested in knowing why kEventRawKeyUp doesn't work...
N.B: Also note that I'm developing on Tiger with the goal of having support for new and older versions of the OS as much as possible. CGEventTap is 10.4+ only so I'll be using this for now but a backwards-compatible solution would be welcome.
One option is to use EventTaps. This lets you monitor all keyboard events. See:
http://developer.apple.com/mac/library/documentation/Carbon/Reference/QuartzEventServicesRef/Reference/reference.html#//apple_ref/c/func/CGEventTapCreate
Unfortunately, event taps will stop working if an application requests secure input, for example Quicken.
OSStatus err = InstallApplicationEventHandler(&globalHotkeyHandler, GetEventTypeCount(eventTypes), eventTypes, NULL, NULL);
This is not global. This installs the handler only when your own application is active, and (I believe) after the Carbon Event Manager's own event filters.
You need to use InstallEventHandler, which takes an event target as its first parameter (InstallApplicationEventHandler is a macro that passes the application event target).
For events that occur while your application is not active, the target you want is GetEventMonitorTarget(). For events that occur while your application is active, the target you want is GetEventDispatcherTarget(). To catch events no matter what application is active, install your handler on both targets.
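A hedged sketch of installing the handler from the question on both targets (reusing its globalHotkeyHandler and event types; error handling omitted):
EventTypeSpec eventTypes[] = {
    { kEventClassKeyboard, kEventRawKeyUp },
    { kEventClassKeyboard, kEventRawKeyModifiersChanged }
};

// Events that occur while another application is active.
InstallEventHandler(GetEventMonitorTarget(), NewEventHandlerUPP(globalHotkeyHandler),
                    GetEventTypeCount(eventTypes), eventTypes, NULL, NULL);

// Events that occur while your own application is active.
InstallEventHandler(GetEventDispatcherTarget(), NewEventHandlerUPP(globalHotkeyHandler),
                    GetEventTypeCount(eventTypes), eventTypes, NULL, NULL);
Note that, as mentioned elsewhere in this thread, receiving raw keyboard events on the event monitor target generally requires access for assistive devices to be enabled.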
Nowadays, though, I'd just use CGEventTaps, as Leibowitzn suggested.