Is there a neat/easy way in Objective-C/Cocoa to track whether a user is at their computer, i.e. I assume by detecting key presses and mouse movement?
(I want to fill out my timesheet automatically by detecting when I am at work and when I am not.)
You can detect mouse events across the entire log-in session using an event tap.
I'm pretty sure there's a way to do this for key events as well, but I don't remember what it was and it requires that the user have access for assistive devices turned on. Catching key events across the session is hard on purpose, in order to make Mac OS X unattractive for key-logger authors.
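For the mouse side, a minimal listen-only tap might look something like this sketch (a plain Foundation tool that just timestamps the most recent mouse activity; how you turn that into "at work / not at work" is up to you):

    #import <Foundation/Foundation.h>
    #import <ApplicationServices/ApplicationServices.h>

    // Timestamp of the most recent mouse activity seen by the tap.
    static CFAbsoluteTime gLastActivity;

    // Called for every tapped event; we only record when it happened.
    static CGEventRef activityCallback(CGEventTapProxy proxy, CGEventType type,
                                       CGEventRef event, void *refcon) {
        gLastActivity = CFAbsoluteTimeGetCurrent();
        return event;   // listen-only: pass the event through unchanged
    }

    int main(void) {
        @autoreleasepool {
            CGEventMask mask = CGEventMaskBit(kCGEventMouseMoved)
                             | CGEventMaskBit(kCGEventLeftMouseDown)
                             | CGEventMaskBit(kCGEventRightMouseDown)
                             | CGEventMaskBit(kCGEventScrollWheel);

            // A listen-only tap at the session level sees events for the whole log-in session.
            CFMachPortRef tap = CGEventTapCreate(kCGSessionEventTap, kCGHeadInsertEventTap,
                                                 kCGEventTapOptionListenOnly, mask,
                                                 activityCallback, NULL);
            if (tap == NULL) return 1;   // creation can fail if permissions are missing

            CFRunLoopSourceRef source = CFMachPortCreateRunLoopSource(kCFAllocatorDefault, tap, 0);
            CFRunLoopAddSource(CFRunLoopGetCurrent(), source, kCFRunLoopCommonModes);
            CGEventTapEnable(tap, true);
            CFRunLoopRun();
        }
        return 0;
    }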
You should also use NSWorkspace's notifications to detect when the machine is about to go to sleep, and when it has just woken up from sleep.
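Registering for those is straightforward (a sketch, assuming this runs inside some controller object; the selector names are placeholders):

    // NSWorkspace posts to its own notification center, not the default one.
    NSNotificationCenter *center = [[NSWorkspace sharedWorkspace] notificationCenter];
    [center addObserver:self
               selector:@selector(machineWillSleep:)   // hypothetical handler
                   name:NSWorkspaceWillSleepNotification
                 object:nil];
    [center addObserver:self
               selector:@selector(machineDidWake:)     // hypothetical handler
                   name:NSWorkspaceDidWakeNotification
                 object:nil];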
Much more on Event Taps and handling global activity in Mac OS X 10.6 is here:
Mouse tracking daemon
I have a background process running on the user’s macOS machine. Its job is to detect whenever any app is launched on the machine. Currently, I am listening for NSWorkspace’s NSWorkspaceDidLaunchApplicationNotification. This works perfectly for detecting when an app is freshly launched (i.e. the app had no instance already running at that time).
But, on macOS, clicking the red close button at the top-left corner of a window generally closes just that window, and the app continues to run in the background. This is also evident from the app icon remaining in the Dock with the dot indicator below it. If I then relaunch the app by clicking its Dock icon, the NSWorkspaceDidLaunchApplicationNotification event won’t be triggered.
To track such events, I tried using the NSWorkspaceDidActivateApplicationNotification event. Using this event, I was able to detect all the app launch scenarios. The problem is that it also gets triggered whenever the app comes into focus, such as when switching windows using command+tab, clicking its Dock icon, changing between two apps, …
Is there a way to filter out these triggers or identify which action led to the trigger? Or is there some other event/method I can listen to that gives the required filtered triggers? I only want to detect scenarios where a new window of the app is created.
What you seem to want is two different things, as was mentioned in the comments, and they should be handled separately.
To detect an app launch, i.e. when a new process is started: you could use NSWorkspaceDidLaunchApplicationNotification if it is enough (it usually is for regular user-facing apps), kqueue if it is not, or even the EndpointSecurity framework to rule them all.
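For the first point, a minimal registration looks roughly like this (a sketch; the userInfo key NSWorkspaceApplicationKey gives you the NSRunningApplication that was launched, and the returned observer token is ignored here but should be kept if you ever need to unregister):

    [[[NSWorkspace sharedWorkspace] notificationCenter]
        addObserverForName:NSWorkspaceDidLaunchApplicationNotification
                    object:nil
                     queue:[NSOperationQueue mainQueue]
                usingBlock:^(NSNotification *note) {
                    NSRunningApplication *app = note.userInfo[NSWorkspaceApplicationKey];
                    NSLog(@"Launched: %@ (pid %d)", app.bundleIdentifier, app.processIdentifier);
                }];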
To track the window(s) of an already-running app. Visually, if the white dot under the app's Dock icon is there, the app is still running.
The Accessibility framework covers this task: you can track window creation and destruction events, get the number of windows for a target process id, their visibility state, and so on.
It is a bit neglected and has seen few updates since its release, but it will work for you in most cases.
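For the second point, here is a sketch of the Accessibility side, assuming you already have the target app's pid, that the user has granted accessibility access to your process, and that watchWindowsOfPid is just an illustrative helper name:

    #import <Foundation/Foundation.h>
    #import <ApplicationServices/ApplicationServices.h>

    // Called whenever the observed app creates a window.
    static void windowCreated(AXObserverRef observer, AXUIElementRef element,
                              CFStringRef notification, void *refcon) {
        NSLog(@"Window created: %@", (__bridge id)element);
    }

    static void watchWindowsOfPid(pid_t pid) {
        AXUIElementRef app = AXUIElementCreateApplication(pid);
        AXObserverRef observer = NULL;
        if (AXObserverCreate(pid, windowCreated, &observer) != kAXErrorSuccess) return;

        AXObserverAddNotification(observer, app, kAXWindowCreatedNotification, NULL);
        // kAXUIElementDestroyedNotification, added per window, catches windows closing.
        CFRunLoopAddSource(CFRunLoopGetCurrent(),
                           AXObserverGetRunLoopSource(observer),
                           kCFRunLoopDefaultMode);
    }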
I am developing a desktop application that captures computer activity on Mac OS X using Objective-C. I know it's possible to capture key presses and the mouse position. But I don't know how to detect when the user switches tasks on the computer, like closing a window (of another application) or activating another window (of another application).
Does anyone have any experience with that?
Yes, via the Accessibility system. For example the NSAccessibilityMainWindowChangedNotification.
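The AppKit constant corresponds to kAXMainWindowChangedNotification at the C level. A sketch of watching the frontmost app for main-window changes (mainWindowChanged is a placeholder for an AXObserverCallback you define, and you would re-run this whenever NSWorkspaceDidActivateApplicationNotification reports a new frontmost app):

    NSRunningApplication *front = [[NSWorkspace sharedWorkspace] frontmostApplication];
    pid_t pid = front.processIdentifier;

    AXUIElementRef app = AXUIElementCreateApplication(pid);
    AXObserverRef observer = NULL;
    if (AXObserverCreate(pid, mainWindowChanged, &observer) == kAXErrorSuccess) {
        AXObserverAddNotification(observer, app, kAXMainWindowChangedNotification, NULL);
        CFRunLoopAddSource(CFRunLoopGetCurrent(),
                           AXObserverGetRunLoopSource(observer),
                           kCFRunLoopDefaultMode);
    }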
Is there any API in Windows 8 that tells you whether there is a Keyboard connected to your device? I believe the OS should have information about this, but I am not sure that it is exposed.
I checked Windows.Devices.Input.KeyboardCapabilities(). It only returns an object with a property keyboardPresent that equals 1 on both my touch device and my non-touch device.
The problem has already been reported on Stack Overflow without a solution (another solution is proposed there, but it also doesn't seem to work).
Maybe a workaround could be to add a TextBox positioned off-screen, set the focus on it, register for InputPane.GetForCurrentView().Showing, and see whether it fires. If it does, you can deduce that there is no keyboard connected (and you might be able to reset the focus to the page inside the event handler so that the on-screen keyboard doesn't actually pop up); if it doesn't fire, there is a keyboard connected. Not a really good solution, but it might be the best available ...
We have a hardware device with an LCD display. It supports a USB interface to connect a keyboard and mouse. Using the keyboard and mouse, we can navigate to various menu items and edit entries.
We have a couple of test cases written to verify that mouse click and keyboard input events work when the respective button or key is pressed.
My task is to automate these test cases.
I do not have any control over the hardware device, as I cannot access the OS kernel or any application running there. There is one way to verify what is currently displayed on the UI, so I have to use that to verify whether the mouse/keyboard events had the intended effect.
From going through a couple of previous posts, it seems that one way to achieve this is through a virtual HID device driver rather than an actual keyboard and mouse, but I am not sure how to do it.
Please do help me with this. I am fine with any programming language.
I am mainly interested in simulating the mouse and keyboard events.
You probably don't need to write your own driver. AutoHotKey does pretty much anything you can think of, and the scripting language is quite easy to learn.
You can get it here:
http://www.autohotkey.com/
Since you're using Linux, here's a similar project that will run on Linux:
http://sikuli.org/
How do I observe keyboard input events while the application is not active?
You'll need to create a CGEventTap using Quartz Event Services. The user must have access for assistive devices turned on, which makes sense, because that's the only legitimate reason for you to do that.
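A sketch of such a tap (listen-only, key-down events only; CGEventTapCreate returns NULL if the process isn't trusted for accessibility, and keyDownCallback is a placeholder for a CGEventTapCallBack you define):

    CFMachPortRef tap = CGEventTapCreate(kCGSessionEventTap, kCGHeadInsertEventTap,
                                         kCGEventTapOptionListenOnly,
                                         CGEventMaskBit(kCGEventKeyDown),
                                         keyDownCallback, NULL);
    if (tap) {
        CFRunLoopSourceRef source = CFMachPortCreateRunLoopSource(kCFAllocatorDefault, tap, 0);
        CFRunLoopAddSource(CFRunLoopGetCurrent(), source, kCFRunLoopCommonModes);
        CGEventTapEnable(tap, true);
        // Inside keyDownCallback, CGEventGetIntegerValueField(event, kCGKeyboardEventKeycode)
        // gives you the virtual key code of the pressed key.
    }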
If you want to set up a hotkey, there's an API in Carbon Event Manager for that, and a Cocoa wrapper named SGHotKeysLib. Note that the Carbon Event Manager hotkey API is still supported in current, 64-bit Mac OS X.
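If you end up calling the Carbon API directly rather than through SGHotKeysLib, the core of it looks roughly like this (a sketch; the handler name and the 'htk1' signature are arbitrary, and kVK_Space with Command+Option is just an example combination):

    #import <Foundation/Foundation.h>
    #import <Carbon/Carbon.h>

    // Invoked whenever a hot key registered on the application target is pressed.
    static OSStatus hotKeyHandler(EventHandlerCallRef nextHandler, EventRef event, void *userData) {
        NSLog(@"Hot key pressed");
        return noErr;
    }

    static void installHotKey(void) {
        EventTypeSpec spec = { kEventClassKeyboard, kEventHotKeyPressed };
        InstallApplicationEventHandler(&hotKeyHandler, 1, &spec, NULL, NULL);

        EventHotKeyID hotKeyID = { .signature = 'htk1', .id = 1 };
        EventHotKeyRef hotKeyRef = NULL;
        RegisterEventHotKey(kVK_Space, cmdKey | optionKey, hotKeyID,
                            GetApplicationEventTarget(), 0, &hotKeyRef);
    }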