I'm using this code to capture global keyboard shortcuts in my app.
This works great on almost every computer I run it on, but I tried it on a brand-new Retina MacBook Pro this week and addGlobalMonitorForEventsMatchingMask doesn't work at all. It doesn't even log every key as I have it set up to do here for debugging.
Is there a more reliable way to do this? Right now I load this on applicationDidFinishLaunching.
I think it might make more sense to load it as its own method in the App Delegate but I'm not sure what the syntax of that would look like.
[NSEvent addGlobalMonitorForEventsMatchingMask:NSKeyDownMask handler:^(NSEvent *event) {
    NSLog(@"sequence = %lu", (unsigned long)[event modifierFlags]);
    // Activate app when pressing cmd-c (1048840 is the raw modifier-flag value observed for cmd)
    if ([event modifierFlags] == 1048840 && [[event charactersIgnoringModifiers] isEqualToString:@"c"]) {
        // ... activation code elided in the original ...
    }
}];
In OS X 10.9 (Mavericks) the setting has moved to System Preferences > Security & Privacy > Privacy > Accessibility; make sure your app is checked.
Check "Enable access for assistive devices" under Accessibility in System Preferences and try again.
I'm completely stuck with logic that's been working for many years now. I'm using this to listen to keyboard events (hotkeys) while the app is in background.
CFMachPortRef eventTap = CGEventTapCreate(kCGSessionEventTap,
                                          kCGHeadInsertEventTap,
                                          kCGEventTapOptionDefault,
                                          CGEventMaskBit(kCGEventKeyDown) | CGEventMaskBit(kCGEventFlagsChanged),
                                          myCGEventCallback,
                                          NULL);
if (!eventTap) {
printf("error: event tap register failed\n");
return false;
}
The code has not changed, and it worked for a long time until the latest update to Mojave. How can I go about troubleshooting the cause? Would anyone know how I can get this working again?
Got this working after some fighting. Add this to your Info.plist:
<key>NSAppleEventsUsageDescription</key>
<string></string>
Then go to System Preferences > Security & Privacy > Privacy > Accessibility, and ensure your app is there and checked.
If it's already there and this keeps happening, remove it and add it again. I have to do this every time I rebuild my app...
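If you want to detect the missing permission in code, a minimal sketch (assuming an Objective-C app) is to ask the system whether the process is trusted for accessibility and prompt if not; on Mojave, keyboard event taps won't deliver events until this returns true:

#import <Foundation/Foundation.h>
#import <ApplicationServices/ApplicationServices.h>

// Returns whether this process is trusted for accessibility, showing the
// system prompt that points the user at the Privacy pane if it is not.
BOOL ensureAccessibilityPermission(void) {
    NSDictionary *options = @{(__bridge NSString *)kAXTrustedCheckOptionPrompt: @YES};
    return AXIsProcessTrustedWithOptions((__bridge CFDictionaryRef)options);
}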
From time to time, but not always (I have had this working for a while), the app/extension gets into a state where I can't read a flag set in my App Group shared between my companion app and my app extension. I don't know how it gets into this state or why the values differ, but it's critical to my application that these always be in sync.
Companion app viewDidLoad:
NSUserDefaults *myAppSettings = [[NSUserDefaults alloc] initWithSuiteName:@"group.myapp"];
.....
[myAppSettings setBool:YES forKey:@"myBool"];
[myAppSettings synchronize];
NSLog(@"%@", [myAppSettings boolForKey:@"myBool"] ? @"Companion app - bool TRUE" : @"Companion app - bool FALSE");
App extension viewDidLoad:
NSUserDefaults *myAppSettings = [[NSUserDefaults alloc] initWithSuiteName:@"group.myapp"];
[myAppSettings synchronize];
NSLog(@"%@", [myAppSettings boolForKey:@"myBool"] ? @"App extension - bool TRUE" : @"App extension - bool FALSE");
Console output
Companion app - bool TRUE
App extension - bool FALSE
I also synchronize before my companion app enters the background. I have my App Group set up in the portal, etc.
What am I doing wrong?
EDIT
Apparently others are having this problem too:
https://devforums.apple.com/message/977151#977151
"I think that this is currently very glitchy.
Sometimes the data sharing works, then a change and all of a sudden the widget can't see the shared data anymore (both on Simulator and device).
Annoying and hope it's a bit more reliable in next beta!"
EDIT 2
Looks like another person has reported this exact issue as well:
"I also noticed the same thing too.This not only happen to the
NSUserDefaults, but also all the files in the container folder. The
keyboard extension suddenly will lose read/write pemission to the
container folder after using the keyboard for a while."
EDIT 3
More evidence: https://devforums.apple.com/message/1028078#1028078
After I upgraded to beta 3, I noticed that sometimes the keyboard failed to open the database because it failed to access the DB file. The keyboard had been able to access the file before.
EDIT 4
Seems like this could be because the keyboard loses the RequestsOpenAccess flag. But I can't reproduce it, and there's no way for me to tell for sure.
EDIT 5
Seems like others are reporting this in the iOS 8 GM build:
This issue still persists for me in the GM. It seems related to a keyboard crash... but also there seems to be some contention between keyboard and containing app in terms of who creates the suite in what order. I think this problem is on Apple's end. Trust me, I WANT it to be my fault but I've spent countless hours with trial and error. No matter what I do in code and verify with NSLog, it will end up in this state eventually. Hoping someone finds a magic pill. :S
Has anyone solved this yet?
You must request open access in order to access shared NSUserDefaults. It's stated directly in the App Extension Programming Guide:
By default, a keyboard has no network access and cannot share a container with its containing app. To enable these things, set the value of the RequestsOpenAccess Boolean key in the Info.plist file to YES.
Be sure you change the RequestsOpenAccess field to YES; you'll find it in the keyboard's Info.plist under NSExtension > NSExtensionAttributes > RequestsOpenAccess. Then remove the keyboard in Settings, delete the app, run it again, and add the keyboard again. After you add it, tap on the keyboard name and flip the switch to enable Allow Full Access. You'll need to instruct your users to follow those same steps to grant access (and reassure them you're not evil); otherwise it simply will not work and you'll never get the data stored in your shared container.
Note that in iOS 8.3+, if the user hasn't enabled full access the keyboard can still read the shared container, but writes will not be saved, for security and privacy reasons. In 8.2 and earlier you can't access that data at all without open access granted.
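For reference, the relevant fragment of the keyboard's Info.plist would look something like this (surrounding keys omitted):

<key>NSExtension</key>
<dict>
    <key>NSExtensionAttributes</key>
    <dict>
        <key>RequestsOpenAccess</key>
        <true/>
    </dict>
</dict>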
I can confirm that the problem is related to the RequestsOpenAccess flag.
Assuming that everything is done right (NSUserDefaults uses initWithSuiteName, all the capabilities for the main application and custom keyboard are set, etc.), here are the steps:
1) Install the main application and the custom keyboard on the device
2) Set 'Allow full access' for the custom keyboard to YES
3) Add some items (in my case, simple text templates) in the main app
4) Go to the keyboard and check that all items added from the main app appear in the custom keyboard
5) Go to the main app and add a few more items
6) Go to the keyboard; now you will see that nothing has changed
7) Go to Settings and switch 'Allow full access' to NO and then back to YES
8) Go to the custom keyboard again and check that the items added in step 5 now appear.
I would like to create a system tool / application which can aid in window management. I'm trying to find documentation about the following topics, and whether they are indeed possible given the security sandboxing of OS X.
Show a list of running applications with the name & icon, and allow the user to choose one
Manipulate the frame(s) of said application's windows (e.g., resize, reposition) from my app (with animations, though I assume this will be trivial once I can perform the actual change)
Hide or show these applications from task managers, etc.
Be able to launch (or terminate) instances of the given application
It seems to me that Quicksilver accomplishes many of these things, but its lack of App Store availability makes me wonder whether it is possible to do this while remaining in the OS X sandbox.
There are a lot of pieces of software out there that do window management. You can check out a tiling window manager I've been hacking on called Amethyst. The basic idea behind software like this relies on Accessibility (which you can find documentation for here). As a quick overview the APIs work by acquiring references to accessibility elements (applications, windows, buttons, text fields, etc.) which have properties (hidden, position, size, etc.), some of which are writable.
As an example, let's say that you wanted to move all windows in every running application to the upper left corner of the screen. That code might look like this:
for (NSRunningApplication *runningApplication in [[NSWorkspace sharedWorkspace] runningApplications]) {
    AXUIElementRef applicationRef = AXUIElementCreateApplication([runningApplication processIdentifier]);
    CFArrayRef applicationWindows = NULL;
    AXUIElementCopyAttributeValues(applicationRef, kAXWindowsAttribute, 0, 100, &applicationWindows);
    CFRelease(applicationRef);
    if (!applicationWindows) continue;
    for (CFIndex i = 0; i < CFArrayGetCount(applicationWindows); ++i) {
        AXUIElementRef windowRef = (AXUIElementRef)CFArrayGetValueAtIndex(applicationWindows, i);
        CGPoint upperLeft = { .x = 0, .y = 0 };
        AXValueRef positionRef = AXValueCreate(kAXValueCGPointType, &upperLeft);
        AXUIElementSetAttributeValue(windowRef, kAXPositionAttribute, positionRef);
        CFRelease(positionRef);
    }
    CFRelease(applicationWindows);
}
This illustrates how you get references to applications and their windows, how to copy attributes from an accessibility element, and how to set attributes on an accessibility element.
There are a variety of notifications documented in NSWorkspace for the launching and termination of applications, and the accessibility framework also has its own notifications for things like an application creating or destroying windows, or a window miniaturizing or deminiaturizing.
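For example, observing launches and terminations might look like this (the selector names are illustrative):

// NSWorkspace posts to its own notification center, not the default one.
NSNotificationCenter *center = [[NSWorkspace sharedWorkspace] notificationCenter];
[center addObserver:self
           selector:@selector(applicationDidLaunch:)
               name:NSWorkspaceDidLaunchApplicationNotification
             object:nil];
[center addObserver:self
           selector:@selector(applicationDidTerminate:)
               name:NSWorkspaceDidTerminateApplicationNotification
             object:nil];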
Animating the window changes is non-trivial and I haven't figured out how to do it yet; it may not be possible at all without hitting private APIs. But the other things you have listed should be possible. Hiding an application, for example, can be done by setting the kAXHiddenAttribute on the application accessibility element, and launching an application can actually be done via -[NSWorkspace launchApplication:].
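Sketched out, assuming applicationRef was obtained as in the earlier example:

// Hide an application through the accessibility API.
AXUIElementSetAttributeValue(applicationRef, kAXHiddenAttribute, kCFBooleanTrue);

// Launch an application by name through NSWorkspace.
[[NSWorkspace sharedWorkspace] launchApplication:@"Safari"];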
Note that the use of accessibility requires that the user have "Enable access for assistive devices" turned on in System Preferences > Accessibility.
When Mission Control runs, it prevents applications from receiving keyboard and mouse events. It also leaves the last application running thinking that it still has focus. This is a problem for me because I don't receive keyUp or mouseUp events if I start Mission Control with a mouse button or a key held down and my application will behave as if that mouse button or key is held down.
I would like a way to either read both keyboard and mouse events even when Mission Control is active, or a way of detecting that Mission Control is active. Ideally, I would like to be able to do the latter since I effectively can't use my application when Mission Control is running.
I've tried a couple of things with no luck:
Use addGlobalMonitorForEventsMatchingMask to register a global monitor for keyboard and mouse events. This captures mouse events (but not keyboard events, although the documentation says keyDown events should be sent to the global monitor) when I switch to another application, but Mission Control doesn't seem to let events propagate to global monitors.
Check [[NSRunningApplication currentApplication] {isActive, ownsMenuBar}].
Apparently, my application is active even though it's not receiving events!
Check [NSApp keyWindow] != nil.
Apparently, one of my windows should be receiving key events. None of them are.
Check if Mission Control is one of the running applications returned by [NSWorkspace runningApplications]. Mission Control does not show up in this list when it's running.
Edit:
I've finally worked around this problem (albeit not in a very satisfactory way). For the mouse, it turns out that you can query the state of the pressed buttons with [NSEvent pressedMouseButtons]. I simply keep track of what I think the mouse state should be from NSLeftMouseDown and NSLeftMouseUp events and compare that to [NSEvent pressedMouseButtons] every so often to make sure that they're consistent. If they're not, then I know that something has hijacked my NSLeftMouseUp event and act accordingly.
For the keyboard, I could not find a way to query the keyboard state, so I couldn't do a similar workaround. I ended up disabling application switching using presentation options when keys are pressed.
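For reference, a condensed sketch of that mouse cross-check (the tracking flag and the recovery method are illustrative, not part of any API):

// Compare the state we inferred from NSLeftMouseDown/NSLeftMouseUp events
// against the ground truth reported by +pressedMouseButtons (bit 0 = left button).
- (void)verifyMouseState {
    BOOL systemSaysLeftDown = ([NSEvent pressedMouseButtons] & 1) != 0;
    if (self.thinksLeftButtonIsDown && !systemSaysLeftDown) {
        // A mouse-up was swallowed (e.g. by Mission Control); recover here.
        [self handleLostMouseUp];
    }
}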
At least in OS X 10.10, you can use this code to check if Mission Control is active or not:
func missionControlIsActive() -> Bool
{
var result: Bool = false
let windowInfosRef = CGWindowListCopyWindowInfo(CGWindowListOption(kCGWindowListOptionOnScreenOnly), CGWindowID(0)) // CGWindowID(0) is equal to kCGNullWindowID
let windowList: NSArray = windowInfosRef.takeRetainedValue() // We own the returned CFArrayRef
for entry in windowList
{
if (entry.objectForKey("kCGWindowOwnerName") as! String) == "Dock"
{
let bounds: NSDictionary = entry.objectForKey("kCGWindowBounds") as! NSDictionary
if (bounds.objectForKey("Y") as! NSNumber) == -1
{
result = true
}
}
}
return result
}
In a nutshell, the code checks whether a specific window owned by the OS X Dock process is visible on screen and in a specific position. If both conditions are met, Mission Control is currently active. The code works in a sandboxed app, and no assistive-device privileges are required.
Did you try at the bash level using NSTask? Something like ps -faxU <username> should list all running processes, and then you could parse the output; or indeed you could use ps -faxU <username> | grep -i "mission control". (Off the top of my head I'm not sure what the process may be called, but something like "mission control" seems legit.) Not the most elegant solution maybe, but if nothing else works it may be worth it.
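A rough sketch of that suggestion (whether Mission Control actually appears as its own process is an open assumption of this approach):

// Run `ps` and search its output for a Mission Control process.
NSTask *task = [[NSTask alloc] init];
[task setLaunchPath:@"/bin/ps"];
[task setArguments:@[@"-axco", @"command"]]; // command names only, all processes
NSPipe *pipe = [NSPipe pipe];
[task setStandardOutput:pipe];
[task launch];
NSData *data = [[pipe fileHandleForReading] readDataToEndOfFile];
NSString *output = [[NSString alloc] initWithData:data encoding:NSUTF8StringEncoding];
BOOL found = [output rangeOfString:@"Mission Control"
                           options:NSCaseInsensitiveSearch].location != NSNotFound;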
Maybe I'm missing something, but have you tried to use event taps instead of global monitoring?
It does appear that DTrace has some ability to see Mission Control being activated. Try running:
sudo fs_usage -filesys | grep Mission
from the command line and then launching the Mission Control app from the /Applications folder.
You should see a lot of output related to Mission Control starting up. Unfortunately, this same output did not appear when using the keyboard shortcut or swiping. Of course, using DTrace in production code is not something I would actually recommend.
A C++ and Qt implementation that works in the latest OS X:
bool Window::missionControlIsActive() {
    bool result = false;
    // Look for an unnamed, on-screen window owned by the Dock process whose
    // Y origin is outside the normal desktop range.
    CFArrayRef windows = CGWindowListCopyWindowInfo(
        kCGWindowListOptionOnScreenOnly | kCGWindowListExcludeDesktopElements, kCGNullWindowID);
    for (CFIndex i = 0; i < CFArrayGetCount(windows); i++) {
        auto dict = (CFDictionaryRef)CFArrayGetValueAtIndex(windows, i);
        auto name = (CFStringRef)CFDictionaryGetValue(dict, kCGWindowName);
        if (name && CFStringGetLength(name) != 0) continue; // only unnamed windows
        auto ownerName = (CFStringRef)CFDictionaryGetValue(dict, kCGWindowOwnerName);
        if (QString::fromCFString(ownerName) != u"Dock") continue;
        auto bounds = (CFDictionaryRef)CFDictionaryGetValue(dict, kCGWindowBounds);
        auto boundsY = (CFNumberRef)CFDictionaryGetValue(bounds, CFSTR("Y")); // CFSTR avoids leaking a new string each pass
        double y = 0;
        CFNumberGetValue(boundsY, kCFNumberFloat64Type, &y);
        if (y > 1.0 && y < 1000000) continue; // normal desktop position; not Mission Control
        result = true;
        break;
    }
    CFRelease(windows);
    return result;
}
I need to make a global hot key input box in my Cocoa app.
I know about Shortcut Recorder, but it is a very old solution. It has parts implemented using Carbon, which has been deprecated, and I can't publish my app to the Mac App Store if I use it.
Is there any ready-to-use modern solution? Can anybody give me the way to make this by myself (I don't know where to start from)?
There is a modern framework named MASShortcut for implementing Global Shortcuts in OS X 10.7+.
Not all of Carbon is deprecated. You can't make a pure-Carbon application anymore, but some APIs live on and some of them are still the easiest way to do certain things.
One of these is the Carbon Events hotkey API. You certainly can sift through all the events using NSEvent's event-monitor methods, but it's unnecessary work. The Carbon Events hotkey API is still supported and much simpler—you just tell it what key you want to match and what function to call when the key is pressed. And there are Cocoa wrappers such as DDHotKey that make it even simpler.
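For a sense of how little code the Carbon route takes, here is a minimal sketch registering Cmd-Shift-K (the four-character signature and ID are arbitrary, and error handling is omitted):

#import <Foundation/Foundation.h>
#import <Carbon/Carbon.h>

// Called whenever the registered hot key is pressed, even while the app is in the background.
static OSStatus hotKeyPressed(EventHandlerCallRef nextHandler, EventRef event, void *userData) {
    NSLog(@"global hot key pressed");
    return noErr;
}

static void installHotKey(void) {
    EventTypeSpec spec = { kEventClassKeyboard, kEventHotKeyPressed };
    InstallApplicationEventHandler(NewEventHandlerUPP(hotKeyPressed), 1, &spec, NULL, NULL);

    EventHotKeyID hotKeyID = { 'htk1', 1 };
    EventHotKeyRef hotKeyRef;
    RegisterEventHotKey(kVK_ANSI_K, cmdKey | shiftKey, hotKeyID,
                        GetApplicationEventTarget(), 0, &hotKeyRef);
}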
RegisterEventHotKey, the relevant Carbon Events function (see also UnregisterEventHotKey in the same doc)
DDHotKey
MASShortcut, yet another wrapper (suggested by TongG):
MASShortcut (with ARC)
MASShortcut (without ARC)
KeyboardShortcuts, written in Swift and includes a SwiftUI hotkey recorder view [added to this answer in edit by the project's author].
In Mac OS X 10.6 and higher, you can use the methods +addGlobalMonitorForEventsMatchingMask:handler: and +addLocalMonitorForEventsMatchingMask:handler: defined on the NSEvent class. Monitoring Events reports the following information:
Local and global event monitors are mutually exclusive. For example, the global monitor does not observe the event stream of the application in which it is installed. The local event monitor only observes the event stream of its application. To monitor events from all applications, including the "current" application, you must install both event monitors.
The code shown on that page is for a local event monitor, but the code for a global event monitor is similar; what changes is the NSEvent method invoked.
_eventMonitor = [NSEvent addLocalMonitorForEventsMatchingMask:
                 (NSLeftMouseDownMask | NSRightMouseDownMask | NSOtherMouseDownMask | NSKeyDownMask)
                                                       handler:^NSEvent *(NSEvent *incomingEvent) {
    NSEvent *result = incomingEvent;
    NSWindow *targetWindowForEvent = [incomingEvent window];
    if (targetWindowForEvent != _window) {
        [self _closeAndSendAction:NO];
    } else if ([incomingEvent type] == NSKeyDown) {
        if ([incomingEvent keyCode] == 53) {
            // Escape
            [self _closeAndSendAction:NO];
            result = nil; // Don't process the event
        } else if ([incomingEvent keyCode] == 36) {
            // Enter
            [self _closeAndSendAction:YES];
            result = nil;
        }
    }
    return result;
}];
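For comparison, the global variant mentioned above might look like the following sketch (_globalEventMonitor being another illustrative ivar); note that a global monitor's handler returns void, so it can observe events destined for other applications but cannot modify or swallow them:

_globalEventMonitor = [NSEvent addGlobalMonitorForEventsMatchingMask:NSKeyDownMask
                                                             handler:^(NSEvent *event) {
    // Observe-only: fires for key presses delivered to other applications.
    NSLog(@"global key down, keyCode = %d", (int)[event keyCode]);
}];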
Once a monitor is no longer necessary, remove it using the following code:
[NSEvent removeMonitor:_eventMonitor];