Global events for Show Desktop, Show Notification Center, etc. in Cocoa (Objective-C)

For my program, I need to be able to discriminate between a user performing an action with a trackpad gesture and performing the same action with the corresponding hotkey. Typically, I need to know when the user shows the desktop, and whether he used the associated hotkey or the associated gesture. The same goes for switching spaces, etc.
Basically, I need this for showing the notification center, application windows, the desktop, the dashboard, etc. Being able to handle hot corners would be a plus.
So far I was hoping to investigate global event monitors using NSAnyEventMask and do a bit of reverse engineering to figure out what type the "Mission Control open" event is, but this was not a success. In fact, NSAnyEventMask does not seem to work at all: my handler is never called with it, while it is called with other masks such as key down or mouse moved.
I also had a look at the Accessibility API, hoping I could add a relevant AXObserver notification, but did not find anything there either. I guess this is not surprising, since the Accessibility API describes basic graphical components such as menus, windows, etc.; virtual spaces and the notification center are not described by it.
Finally, a CGEvent tap does not seem to handle these events either: when I use the function keys to show the desktop, the only events my CGEventTap receives are the corresponding key-down and key-up events.
I suspect a few possible outcomes:
(1) I have tried everything, and this is simply not possible. I seriously doubt this: first, I am far from an amazing programmer, especially in Cocoa, and second, Apple has shown that lots of events are accessible programmatically, and I believe in the power of their APIs.
(2) I tried the right methods but failed because of side factors. This is likely.
(3) Other methods could let me handle these events globally and programmatically (a private API?).
Thanks a lot for your help,
Kind regards,

Just saw this, but it is caused by an error in Apple's implementation of NSAnyEventMask. The docs describe NSAnyEventMask as 0xffffffffU, yet the implementation of NSAnyEventMask is NSUIntegerMax, which is 0xffffffffffffffffU. This is possibly due to the transition from 32-bit to 64-bit machines, which changed NSUInteger from unsigned int to unsigned long. Replacing NSAnyEventMask with 0xffffffffU fixes the problem. I've already filed this as a bug with Apple in hopes they will fix it.
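The workaround above can be sketched like this in Swift (a sketch, assuming current AppKit names; the key point is passing the 32-bit mask 0xffffffffU explicitly instead of the broken "any" constant):

```swift
import AppKit

// The docs' 32-bit "any" mask. Per the bug described above, the built-in
// constant (NSAnyEventMask / .any) expands to the 64-bit NSUIntegerMax
// and silently matches nothing.
let rawAnyMask: UInt64 = 0xffffffff

let monitor = NSEvent.addGlobalMonitorForEvents(
    matching: NSEvent.EventTypeMask(rawValue: rawAnyMask)) { event in
        // Inspect every observable event; system gestures, when delivered
        // at all, arrive as gesture or system-defined events.
        print("event type:", event.type.rawValue)
}
// Remove it when done:
// if let monitor = monitor { NSEvent.removeMonitor(monitor) }
```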

Related

What OS X events can I access programmatically from Swift?

I'd like to find both the currently running programs (or at least the program in the foreground) and key events, programmatically, on OS X.
I found the inter-application communication guidelines, but they don't seem to say how I can find out which applications are running.
I've found key events, but they seem to imply that the task in the forefront is the one that gets the key events, and only if it doesn't handle them do they go up the event chain. I'd like to programmatically intercept them.
Seems dubious, I know. I'm trying to use key events along with screen captures to learn text on the screen - it's for research.
I'm using Swift, but I understand that an Objective-C example is still helpful since both use the same frameworks.
To get a list of running applications use:
NSWorkspace.sharedWorkspace().runningApplications
You can build a key logger (so to speak) by creating an event monitor (same document you linked, just a different section).
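Both pieces together look roughly like this (a sketch assuming current AppKit names; note a global monitor requires the app to be trusted for Accessibility, and it does not see events delivered to your own app):

```swift
import AppKit

// Hypothetical helper to format what the monitor observes.
func describeKey(code: UInt16, chars: String) -> String {
    return "keyCode \(code): \(chars)"
}

// 1. List running applications.
for app in NSWorkspace.shared.runningApplications {
    print(app.localizedName ?? "?", app.isActive ? "(active)" : "")
}

// 2. Global key-down monitor (the "key logger, so to speak").
let keyMonitor = NSEvent.addGlobalMonitorForEvents(matching: .keyDown) { event in
    print(describeKey(code: event.keyCode,
                      chars: event.charactersIgnoringModifiers ?? ""))
}
```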

How can I capture shift-command-3/4 in Cocoa [duplicate]

I have an image application and I want to release it where unregistered users can view the files but can't save until they've registered.
I'm looking for a way to prevent the user from using the built in screenshot functionality so I don't have to watermark the images. How might I accomplish this?
-- Edit Below --
I decided to watermark the images. I had been trying to avoid watermarking since the images are stereoscopic but I'm rather happy about how the watermark looks now. I put a logo in the corner and offset it enough on each image so it appears in the foreground.
Whether people agree with it in practice or not, my question is still valid. Apple's DVD Player hides the video in its screenshots, which doesn't altogether stop the user from taking screenshots but accomplishes my original goal.
I would still very much like to know how to do this. (the DVD player way)
Based on a symbols search through DVD Player, it likely uses the private API CGSSetWindowCaptureExcludeShape. Richard Heard has been kind enough to reverse engineer it and wrap it for easy use.
Being private, it may stop working (or have already stopped working) at any time.
But ultimately the answer to your question is "yes, but not in any publicly documented way". Some other takeaways from this lengthy thread are:
Asking this question inevitably excites a lot of myopic moral outrage.
Given there's no public method, reverse engineering DVD Player is a useful path to pursue.
A request to Apple DTS might be the only reliable method to find an answer.
DVD Player does this (the user can still take the screenshot, but the player window doesn't appear in it), so I'm sure there's a way. Maybe setting the window's sharing type to NSWindowSharingNone?
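That suggestion can be sketched as follows (an untested sketch; sharingType is public API that stops other processes from reading the window's contents, but whether Shift-Cmd-3/4 honors it has varied across OS X versions):

```swift
import AppKit

// Exclude a window's contents from capture by other processes.
// NSWindowSharingNone from Objective-C maps to .none in Swift.
func excludeFromCapture(_ window: NSWindow) {
    window.sharingType = .none
}
```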
One option, which is very user-hostile, is to change the folder in which screen captures are stored to a /dev/null-style directory by changing the com.apple.screencapture setting.
A huge downside of this is that you might mess up the user's settings and be unable to restore them if the exit from your application isn't clean.
Another option is to keep track of which files are created in the screen capture location, see if they match the screenshot naming pattern, and then remove them.
This method is still quite hostile, though.
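The file-watching idea can be sketched like this (the folder and the "Screen Shot" prefix are assumptions; both vary with the user's com.apple.screencapture settings, OS version, and locale):

```swift
import Foundation

// Pick out screenshot-looking file names from a directory listing.
// The default English pattern is "Screen Shot <date> at <time>.png".
func screenshotCandidates(in names: [String]) -> [String] {
    return names.filter { $0.hasPrefix("Screen Shot") && $0.hasSuffix(".png") }
}

// Poll the default capture folder (hostile, as noted above: this would
// delete the user's own screenshots).
let desktop = ("~/Desktop" as NSString).expandingTildeInPath
if let names = try? FileManager.default.contentsOfDirectory(atPath: desktop) {
    for name in screenshotCandidates(in: names) {
        // try? FileManager.default.removeItem(atPath: desktop + "/" + name)
        print("would remove:", name)
    }
}
```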
I also investigated whether it was possible to kill the process that handles the screen capture; unfortunately, that process, SystemUIServer, just relaunches after being killed.
SystemUIServer seems to refuse to take screenshots while DVD Player is playing a DVD. I have no idea how the DVD playback detection works, but it might be a lead for preventing screenshots.
Links
Technical details about Screenshots in Mac OS X
com.apple.screencapture details
ScreenCapture.strings - List of error messages from ScreenCapture
Disclaimer before people start ranting: I have a legitimate reason to solve this problem, but I won't use the com.apple.screencapture -> /dev/null method due to its downsides.
You could try to run your application fullscreen and then capture all the keystrokes. But please listen to siride.
No; that's a system feature.

What is the nature of the gestures needed in Windows 8?

Most touchpads on laptops don't support multitouch and hence can't send swipe gestures to the OS.
Would it be possible to send gestures to Windows from an external device, like a Teensy or a recent Arduino, which can already emulate a keyboard and a mouse? I could send buttons 4 and 5 (mouse wheel up and down), but I would like to send a real swipe gesture (for example, driven by a flex sensor...).
One way to work with an Arduino or similar device is to use the Microsoft .NET Micro Framework, which is open source and available at no cost from: Micro Framework
There are other frameworks available for the Arduino that you might want to use. So if you have a great idea on how to utilize the sensor hardware, the output must still meet certain specifications.
To connect to hardware that reads gestures, you will need to understand how drivers are created, so take a look at this: Info on drivers.
The link above covers sensors, which may not be quite what you are looking for since you want "gestures", but first you have to be able to make the connection to your device, and this guide MIGHT help. I have reviewed it for other reasons.
There is a bunch of stuff to dig through, but the first step, imo, is to understand how to get your software to communicate with Windows 8. Let me know if you have any other questions. I am not the best person to ask; you might want to refer to the community at the Micro Framework link shown above.
Good luck.
That's perfectly possible. What you're effectively suggesting is creating your own input peripheral, like a trackpad, and using it to send input. As long as Windows recognizes the device as an input source, it will work.

How does UIGestureRecognizer work?

How does UIGestureRecognizer work internally? Is it possible to emulate it in iOS < 3.2?
If you want a detailed explanation on how they work, it is worth watching this video from last year's WWDC.
See the video Deepak mentions for details, but yes, it is something you can build yourself if you want to.
Be sure to ask yourself a couple questions first, though: do you want to recreate the entire recognizer "framework", or just be able to recognize, say, a swipe? If the latter, there should be tons of examples on the web from pre 3.2 days of detecting swipes using the normal touch event handlers.
If you really want to recreate the framework, you can, and it's actually kind of an interesting exercise. The UIKit objects do have some hooks into the event pipeline at earlier stages, but you can get a similar result by tracking the touches and building a pipeline of recognizer objects. If you read the docs on UIGestureRecognizer, you'll see that the state management they use is pretty clearly laid out. You could copy that, and then build your own custom MyPanGestureRecognizer, MySwipeGestureRecognizer, etc., that derive from a MyGestureRecognizer base. You should have some UIView subclass (MyGestureView) that handles all the touches and runs through its list of MyGestureRecognizers, using the state machine that's implied in the docs.
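The state management described above can be sketched as a small state machine (all type names here are hypothetical; only the state names mirror the ones the UIGestureRecognizer docs lay out):

```swift
import Foundation

// States modeled on the documented UIGestureRecognizer state machine.
enum RecognizerState { case possible, began, changed, recognized, failed }

// A minimal horizontal-swipe recognizer built on raw touch callbacks,
// the kind of thing MyGestureView would feed from touchesBegan/Moved/Ended.
struct MySwipeRecognizer {
    let minDistance: CGFloat = 50   // horizontal travel needed to recognize
    let maxDrift: CGFloat = 20      // vertical tolerance before failing
    private(set) var state: RecognizerState = .possible
    private var start: CGPoint = .zero

    mutating func touchesBegan(at p: CGPoint) {
        start = p
        state = .possible
    }

    mutating func touchesMoved(to p: CGPoint) {
        guard state == .possible else { return }
        if abs(p.y - start.y) > maxDrift {
            state = .failed          // drifted vertically: not a swipe
        } else if abs(p.x - start.x) >= minDistance {
            state = .recognized      // traveled far enough horizontally
        }
    }

    mutating func touchesEnded(at p: CGPoint) {
        if state == .possible { state = .failed }  // ended too early
    }
}
```

A discrete gesture like a swipe goes straight from .possible to .recognized or .failed; a continuous one (a pan) would pass through .began and .changed, which is the main design difference between the two recognizer families.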

Window list ordered by recently used

I'm trying to create a window switching application. Is there any way of getting a list of the windows of other applications, ordered by recently used?
Start with the Accessibility framework. Many of the hooks for screen readers are also useful here. Particularly look at the UIElementInspector sample and the NSAccessiblity protocol.
There's also Quartz Window Services, which can easily give you a list of all the windows on screen. Unfortunately, it doesn't tie into concepts like window focus (just window level), and I don't know of a way to get notifications back from it when levels change. You might do something like tap into the Quartz Event framework to capture Cmd-Tab and the like, but that's complex and fragile. There is unfortunately no good way to convert a CGWindowID into an AXUIElementRef (the post is for 10.5, but I don't know of anything added in 10.6 that improves this). But hopefully you can do everything you need through the Accessibility framework.
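Getting the on-screen window list from Quartz Window Services looks roughly like this (a sketch; note the ordering is front-to-back z-order at the moment of the call, not most-recently-used):

```swift
import CoreGraphics

// Enumerate on-screen windows, front to back, skipping desktop furniture.
let options: CGWindowListOption = [.optionOnScreenOnly, .excludeDesktopElements]
if let info = CGWindowListCopyWindowInfo(options, kCGNullWindowID)
        as? [[String: Any]] {
    for window in info {
        let owner = window[kCGWindowOwnerName as String] as? String ?? "?"
        let title = window[kCGWindowName as String] as? String ?? ""
        print(owner, "-", title)
    }
}
```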
You would want to use
[NSWorkspace runningApplications]
to get a list of all the running applications, and observe the workspace's activation notification (NSWorkspaceDidActivateApplicationNotification) to know when the user switches to a new application, so you can keep track of which one was used most recently. Note that
[NSRunningApplication currentApplication]
returns your own application, so watching it won't tell you about switches.