I'm working on a macOS Objective-C application that interacts with the trackpad and mouse. This kind of interaction requires Accessibility access. The problem is that if the user unticks the app (while it's running) in System Preferences > Security & Privacy > Privacy > Accessibility, the user can no longer work with the trackpad and mouse properly, which makes it hard even to quit the app.
The only workaround I've found is to open Terminal via Spotlight Search and run killall.
I need the app to quit itself when the Accessibility permission is revoked.
You can call AXIsProcessTrusted() to determine whether your process is trusted for accessibility. I'm not aware of any callback or notification for when the setting changes, but you could just poll every second or so and exit your app once the function starts returning false.
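A minimal sketch of that polling approach, assuming macOS 10.12+ for the block-based NSTimer API:

```objc
#import <Cocoa/Cocoa.h>
#import <ApplicationServices/ApplicationServices.h>

// Check once a second; quit cleanly as soon as Accessibility trust is revoked.
[NSTimer scheduledTimerWithTimeInterval:1.0 repeats:YES block:^(NSTimer *timer) {
    if (!AXIsProcessTrusted()) {
        [NSApp terminate:nil];  // lets the app clean up, unlike killall
    }
}];
```

On older systems, scheduledTimerWithTimeInterval:target:selector:userInfo:repeats: can be used instead of the block-based variant.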
I have a background process running on the user's macOS machine. Its job is to detect whenever any app is launched on the machine. Currently, I listen for NSWorkspace's NSWorkspaceDidLaunchApplicationNotification. This works perfectly for detecting when an app is freshly launched (i.e. the app had no instance already running at that time).
But on macOS, clicking the red close button at the top-left corner generally just closes the app's window while the app continues to run in the background; this is evident from the dot indicator below its icon in the Dock. If I then click the Dock icon to reopen the app, NSWorkspaceDidLaunchApplicationNotification is not posted.
To track such events, I tried NSWorkspaceDidActivateApplicationNotification, which let me detect all the app launch scenarios. The problem is that it also fires whenever the app merely comes into focus: switching windows with Command+Tab, clicking its Dock icon, switching between two apps, and so on.
Is there a way to filter out these triggers or identify which action led to the trigger? Or is there some other event/ method I can listen to which gives the required filtered triggers? I only want to detect scenarios where a new window of the app is created.
As was mentioned in the comments, you seem to want two different things, which should be handled separately.
To detect an app launch, i.e. when a new process is started: NSWorkspaceDidLaunchApplicationNotification is usually enough for ordinary GUI apps; if it is not, use kqueue, or the EndpointSecurity framework to catch everything.
To track the window(s) of an already running app. Visually, if the white dot is under the app's Dock icon, the app is still running.
The Accessibility framework covers this task: you can observe window creation and destruction events, and query a target process ID for its window count, visibility state, and so on.
The framework has seen little change since its release, but it will work for you in most cases.
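The window tracking described above can be sketched with the Accessibility C API; the observing process itself must be trusted for Accessibility, and the function and callback names below are illustrative:

```objc
#import <Foundation/Foundation.h>
#import <ApplicationServices/ApplicationServices.h>

// Called whenever the watched app posts the registered notification.
static void WindowCallback(AXObserverRef observer, AXUIElementRef element,
                           CFStringRef notification, void *refcon) {
    NSLog(@"Received %@ from watched app", notification);
}

// pid is the process ID of the app to watch (e.g. from NSRunningApplication).
void WatchWindows(pid_t pid) {
    AXObserverRef observer = NULL;
    if (AXObserverCreate(pid, WindowCallback, &observer) != kAXErrorSuccess) return;

    AXUIElementRef app = AXUIElementCreateApplication(pid);
    AXObserverAddNotification(observer, app, kAXWindowCreatedNotification, NULL);

    // Deliver callbacks on the current run loop.
    CFRunLoopAddSource(CFRunLoopGetCurrent(),
                       AXObserverGetRunLoopSource(observer),
                       kCFRunLoopDefaultMode);
}
```

Registering kAXUIElementDestroyedNotification on individual window elements covers the destruction side.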
I am developing a desktop application in Objective-C that captures computer activity on macOS. I know how to capture the user's key presses and the mouse position, but I don't know how to detect when the user switches tasks, like closing a window of another application or activating another application's window.
Does anyone have any experience in that?
Yes, via the Accessibility system; for example, NSAccessibilityMainWindowChangedNotification.
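For the app-switching part, a minimal sketch using NSWorkspace's notification center (window-level changes within a single app additionally require an Accessibility observer registered for kAXMainWindowChangedNotification):

```objc
#import <Cocoa/Cocoa.h>

// Log every time the user activates a different application.
[[[NSWorkspace sharedWorkspace] notificationCenter]
    addObserverForName:NSWorkspaceDidActivateApplicationNotification
                object:nil
                 queue:[NSOperationQueue mainQueue]
            usingBlock:^(NSNotification *note) {
        NSRunningApplication *app = note.userInfo[NSWorkspaceApplicationKey];
        NSLog(@"Activated: %@ (pid %d)", app.localizedName, app.processIdentifier);
    }];
```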
Is there any way I can prevent a Windows Metro app from going to the background, since it enters the suspended state when backgrounded?
If not, how can I show an alert message when the app is about to go to the background?
Not programmatically: apps cannot override the user's choice here.
Once the app's window is deactivated it is too late for it to show an alert message.
From a system configuration standpoint, the Assigned Access feature allows an admin to set up a kiosk with a single app which the user can't switch away from.
I am the developer of an app called 1Keyboard (http://www.eyalw.com/1keyboard).
My app used to capture global keystrokes and send them to iPhones/iPads/etc as if the computer were a Bluetooth HID keyboard.
This worked on 10.8, but stopped working on 10.9.
I understand it has to do with the new Accessibility menu in system preferences.
Instead of having one checkbox ("enable assistive devices") in 10.8,
now the user has to enable this for each app individually.
For some reason, my app doesn't show up in the list of apps requiring accessibility in that preference pane. http://d.pr/i/8IfP
What should I do to have it appear there, and restore the lost functionality?
See the function AXIsProcessTrustedWithOptions(). It can be made to prompt the user, which then adds the app to the list in question. Much easier for the user.
CFStringRef keys[]   = { kAXTrustedCheckOptionPrompt };
CFTypeRef   values[] = { kCFBooleanTrue };
CFDictionaryRef options = CFDictionaryCreate(NULL,
                                             (const void **)keys, (const void **)values, 1,
                                             &kCFTypeDictionaryKeyCallBacks,
                                             &kCFTypeDictionaryValueCallBacks);
Boolean isTrusted = AXIsProcessTrustedWithOptions(options);
CFRelease(options);
That being said, I'm still having trouble trapping global keystroke events.
There is no "add an app" button in the new Accessibility pane; however, you can simply drag and drop your application from the Applications folder directly onto the panel (http://tinypic.com/r/2qu2k3d/5).
I'm using your app and I can confirm that the functionality is now restored :-)
Hope it helps!
Note: In Yosemite, there are now the standard "+" and "-" buttons in the Privacy pane of Security and Privacy preferences.
With windows 8, is it possible to create an application that is always visible? For instance, in previous versions of windows, there is the task bar with quick launch icons. Can I create something similar to the quick launch icons that are always on the screen?
If you are referring to a Windows 8 Store app, then the answer is no. You can have a live tile and toast notifications that provide updates to the user, which may prompt the user to launch your application.
To understand how your Windows Store app will run on Windows 8, a good article to read is "Application lifecycle (Windows Store apps)", which explains the app execution states.
It is not possible for a Windows Store (RT) app, but it is possible for a desktop app: a desktop app can be pinned to the taskbar, while a Windows Store app cannot. What you can do instead is move the app's tile to the beginning of your Start screen, so any time you press the Windows key your app is visible right in front of you.
Do you mean always visible on the Start screen? If so, you can add tile-updating functionality to your application. As long as the user has the application pinned to the Start screen, they will see the updates. Check the link below for an introductory tutorial.
http://blogs.msdn.com/b/windowsappdev/archive/2012/04/16/creating-a-great-tile-experience-part-1.aspx
"Quick Launch" has a very specific meaning, which you may or may not have been referring to in your question.
The Quick Launch bar in Windows 8 is essentially a toolbar pointing to a location in your %AppData% directory. Prior to Windows 7 it was available by default, but the ability to pin items directly to the taskbar has rather superseded it. It can still be restored if you really want it :)
It's, of course, available only in the Desktop mode and not on the Modern UI, where pinning a tile is the best you can hope for, and it's all up to the user to pin it AND to determine where it shows up on their Start Screen.
Another option worth mentioning (although more like system tray than quick launch) is lock screen presence. If the user chooses so and your app supports that, he can add it to his lock screen:
either as a badge (up to 7 apps)
or as a tile notification (single app only)
This is not a way for the user to quickly start your app (other answers have already covered these options) but a way to stay visible and keep your user informed.