I'm working on a game in Objective-C. The Siri remote works great via GCMicroGamepad and real MFi controllers work well via GCGamepad. However, third-party IR remotes do not work at all in-game (and neither does the Remote App on iPhone or an older Apple TV 3rd gen remote).
How can I recognize and distinguish these inputs?
Two days later... I have found that a UITapGestureRecognizer can be used to detect the Up, Down, Left, Right, and Select events correctly when the input comes from a third-party TV remote or the iPhone Remote.app. The directional events are actually unique to those remotes as well; the Siri remote does not generate directional tap events. Unfortunately, however, tapping the Select button on either the Siri remote, a third-party remote, or the iPhone Remote.app generates a Select event from my tap recognizer, so I need some way to distinguish the two.
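A minimal sketch of this kind of tap-recognizer setup (the selector names are placeholders; the press-type constants are the standard UIKit ones):

```objc
// In the game's view controller on tvOS.
- (void)setupRemoteTapRecognizers {
    UITapGestureRecognizer *selectTap =
        [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(selectTapped:)];
    selectTap.allowedPressTypes = @[ @(UIPressTypeSelect) ];
    [self.view addGestureRecognizer:selectTap];

    UITapGestureRecognizer *directionalTap =
        [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(directionTapped:)];
    directionalTap.allowedPressTypes = @[ @(UIPressTypeUpArrow), @(UIPressTypeDownArrow),
                                          @(UIPressTypeLeftArrow), @(UIPressTypeRightArrow) ];
    [self.view addGestureRecognizer:directionalTap];
}

- (void)selectTapped:(UITapGestureRecognizer *)sender {
    // Fires for Select on the Siri remote, third-party IR remotes, and the iPhone Remote.app.
}

- (void)directionTapped:(UITapGestureRecognizer *)sender {
    // Fires only for IR remotes and the Remote.app; the Siri remote does not generate directional taps.
}
```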
The only distinguishing factor I can find is that tapping the Siri remote also generates a button-A press on the GCMicroGamepad, whereas a third-party remote or the iPhone Remote.app does not affect the GCMicroGamepad at all. But it's extremely inelegant to watch the GCMicroGamepad for tap-release events and then use them to filter out the matching Select button events. It's certainly not a recommended use of the APIs, and it doesn't seem like a good long-term solution. If I could tell the Siri remote to stop generating UI events while in GCMicroGamepad mode, that would be excellent.
I cannot test this right now, but you could probably differentiate the Siri Remote from a third-party remote by using a GCEventViewController with its controllerUserInteractionEnabled property set to NO. That way, the Siri Remote's inputs shouldn't get passed to UIKit (while the GCEventViewController or one of its descendants is the first responder). A third-party remote's input events might still go through to UIKit since, unlike the Siri Remote, it is not a GCMicroGamepad.
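A minimal sketch of that idea, assuming the game content is hosted in a GCEventViewController subclass (the class name here is made up):

```objc
#import <GameController/GameController.h>

// Host the game's content in a GCEventViewController so that Siri Remote
// (GCMicroGamepad) input is not forwarded to UIKit, while presses from
// non-controller remotes still arrive as UIKit press/tap events.
@interface GameHostViewController : GCEventViewController
@end

@implementation GameHostViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    // NO means: while this controller (or a child) is first responder, events from
    // connected game controllers -- including the Siri Remote -- are delivered only
    // through the GameController framework, not through UIKit.
    self.controllerUserInteractionEnabled = NO;
}

@end
```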
So far, Apple doesn't really support multiplayer games with multiple Siri Remotes, iOS Remote apps, or IR remotes. But I think support might be coming, because the Remote app on iOS will soon support multiplayer gaming (so I guess the Apple TV will recognize multiple GCMicroGamepad controllers).
I would like to build a helper app for gamers, and to add some extra functionality I would like to observe/time certain third-party games' behavior, more specifically when a game actually launches its full-screen process.
For example: my app is a system tray app, and the game has a "launcher" app with lobby and menu screens. Once the game launches the extra process, OS X will usually switch resolutions (optionally) and my app would be notified somehow. Then I can start a timer. Once the game match finishes, either because the process closes or because the game is no longer full screen, my app gets a second notification and I can stop the timer.
Are there official Apple APIs that provide any way to observe or poll for an app going full screen and/or launching additional windows that I can reliably assume are the actual game screen?
I doubt you're going to find a completely comprehensive solution. There are many ways for apps to achieve a full-screen experience and most don't provide a notification about that fact.
A full-screen app can modify the presentationOptions of NSApplication to hide the Dock and menu bar. Another app can use key-value observing to monitor its application object's currentSystemPresentationOptions property, which will reflect the current system status.
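A rough sketch of that observation, assuming the helper watches NSApp from some controller object (the method names and the dock/menu-bar heuristic are mine):

```objc
#import <Cocoa/Cocoa.h>

static void *kPresentationOptionsContext = &kPresentationOptionsContext;

// Start watching the system-wide presentation options via KVO.
- (void)startObservingPresentationOptions {
    [NSApp addObserver:self
            forKeyPath:@"currentSystemPresentationOptions"
               options:NSKeyValueObservingOptionNew
               context:kPresentationOptionsContext];
}

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                        change:(NSDictionary *)change context:(void *)context {
    if (context != kPresentationOptionsContext) {
        [super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
        return;
    }
    NSApplicationPresentationOptions options =
        [change[NSKeyValueChangeNewKey] unsignedIntegerValue];
    // A frontmost app hiding both the Dock and the menu bar is a common full-screen tell.
    BOOL dockHidden = (options & (NSApplicationPresentationHideDock |
                                  NSApplicationPresentationAutoHideDock)) != 0;
    BOOL menuBarHidden = (options & (NSApplicationPresentationHideMenuBar |
                                     NSApplicationPresentationAutoHideMenuBar)) != 0;
    if (dockHidden && menuBarHidden) {
        // Start the timer here; stop it when the options revert.
    }
}
```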
A full-screen app can capture the displays (although Apple discourages this technique). You can try to detect this by calling CGDisplayIsCaptured(), although it's been deprecated since 10.9 with no replacement. It may be possible that, if you register a callback with CGDisplayRegisterReconfigurationCallback(), you'll get called when something captures the display. However, capturing the display is sort of about preventing other processes from noticing such changes, so maybe not. In that case, you'd have to poll. You might also poll for the current display mode; changing the mode is the primary reason why a game would capture the display in the first place.
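A sketch of registering such a reconfiguration callback; whether a capture by another process actually triggers it is exactly the part that would need testing:

```objc
#include <CoreGraphics/CoreGraphics.h>

// Called whenever a display's configuration changes (mode switch, possibly capture).
static void DisplayReconfigured(CGDirectDisplayID display,
                                CGDisplayChangeSummaryFlags flags,
                                void *userInfo) {
    if (flags & kCGDisplaySetModeFlag) {
        // The display mode changed -- a typical side effect of a game capturing the display.
    }
}

// Register once at startup; userInfo can carry a context pointer if needed.
void InstallDisplayWatcher(void) {
    CGDisplayRegisterReconfigurationCallback(DisplayReconfigured, NULL);
}
```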
A full-screen game could also just create a borderless window the size of the screen and set its window level to be in front of the Dock and menu bar (and other apps' windows). There's not really a notification about this. You could detect it using the CGWindowList API, but you would have to poll. For example, you could call CGWindowListCopyWindowInfo(kCGWindowListOptionOnScreenOnly, kCGNullWindowID) and iterate through the dictionaries looking for one the size of the screen and at a window level above kCGStatusWindowLevel.
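A polling sketch along those lines (it ignores multi-display setups and coordinate-space subtleties, and the helper function name is made up):

```objc
#import <Cocoa/Cocoa.h>
#include <CoreGraphics/CoreGraphics.h>

// Returns YES if some on-screen window is the size of the main screen and sits
// above the status-window level, which usually indicates a full-screen game.
static BOOL ScreenSizedWindowIsFrontmost(void) {
    CGRect screenRect = NSRectToCGRect([[NSScreen mainScreen] frame]);
    NSArray *windows = CFBridgingRelease(
        CGWindowListCopyWindowInfo(kCGWindowListOptionOnScreenOnly, kCGNullWindowID));

    for (NSDictionary *info in windows) {
        NSInteger layer = [info[(id)kCGWindowLayer] integerValue];
        CGRect bounds;
        CGRectMakeWithDictionaryRepresentation(
            (__bridge CFDictionaryRef)info[(id)kCGWindowBounds], &bounds);
        if (layer > kCGStatusWindowLevel &&
            CGSizeEqualToSize(bounds.size, screenRect.size)) {
            return YES;
        }
    }
    return NO;
}
```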
(You might be able to use the Accessibility API to get a notification when the frontmost window changes, so you'd only have to poll when that happens.)
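A sketch of that Accessibility-based trigger, assuming the helper app has been granted Accessibility trust (AXIsProcessTrusted()); the function names are placeholders:

```objc
#import <ApplicationServices/ApplicationServices.h>

// Invoked when the watched app's focused window changes; re-run the
// CGWindowList check from the previous sketch here instead of polling constantly.
static void FocusedWindowChanged(AXObserverRef observer, AXUIElementRef element,
                                 CFStringRef notification, void *refcon) {
    // Re-check for a screen-sized, high-level window.
}

static void WatchAppForWindowChanges(pid_t pid) {
    AXObserverRef observer = NULL;
    if (AXObserverCreate(pid, FocusedWindowChanged, &observer) != kAXErrorSuccess) {
        return;
    }
    AXUIElementRef app = AXUIElementCreateApplication(pid);
    AXObserverAddNotification(observer, app, kAXFocusedWindowChangedNotification, NULL);
    CFRunLoopAddSource(CFRunLoopGetCurrent(),
                       AXObserverGetRunLoopSource(observer),
                       kCFRunLoopDefaultMode);
}
```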
You cannot observe a notification if there is none. So first you need to know whether the app you want to observe actually posts a notification that is observable. You cannot 'hook' into other apps without their deliberate consent.
Before iOS 7 arrived, we noticed an issue:
Music remote control from the earbuds or the Springboard can hijack our audio session even if we set the category to solo-ambient or another exclusive mode.
We thus tried a few things:
We tried to take ownership of the audio session back. But this requires that our audio code knows when to take it back and from whom. We thought we could let the app become the first responder for remote-control events, do our stuff, and then pass the events on to the Music app. However, we found that the events get swallowed by the first responder and there is no way to push them back down the responder chain.
We tried to become the first responder and block remote-control events altogether while we are in solo-ambient. This worked fine on iOS 6 and still works with the earbud controls on iOS 7, but it fails with iOS 7's Control Center. Control Center seems to bypass the remote-control event handler remoteControlReceivedWithEvent: completely, which is where we put our blocking code.
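For context, a sketch of the kind of first-responder setup described above (a reconstruction, not the exact code; the blocking logic itself is omitted):

```objc
// In the view controller that should receive remote-control events.
- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
    [self becomeFirstResponder];
}

- (BOOL)canBecomeFirstResponder {
    return YES;
}

- (void)remoteControlReceivedWithEvent:(UIEvent *)event {
    if (event.type != UIEventTypeRemoteControl) {
        return;
    }
    // On iOS 6 the earbud events arrived here and could be dropped while in
    // solo-ambient; Control Center on iOS 7 does not reliably route through this handler.
}
```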
I read something elsewhere that:
You can't block the Music app. Your app can become one though (Apple won't like that) and then Control Center would control yours.
But I found no documentation whatsoever about Control Center.
And, as said above, Control Center does not go through the normal remote-control hooks even if an app is the first responder.
Another quote:
Remote Control Event handling is so your app can be controlled by Control Center, the earbuds, etc. It is not so that your app can eat said controls, preventing control of other apps from said sources. It only worked in iOS 6 because of a bug in iOS, now fixed in iOS 7.
Is it that what we had been using relied on this bug? I find it hard to believe, because we got the solution on this list and on the Xcode mailing list, so I assumed it was an accepted approach.
Now we really wonder if we are missing something from the very beginning:
Is solo-ambient really an exclusive mode for the audio session, or is the Music app an exception to that exclusivity?
How can our app live in harmony with the remote controls and Control Center?
Where can we find up-to-date documentation on remote-control events and Control Center?
The remote control has been mysteriously fixed after clean-building everything against the iOS 7 SDK. Now the app delegate can receive remote-control events from Control Center. However, the play/pause events arrive as UIEventSubtypeRemoteControlPause and UIEventSubtypeRemoteControlPlay instead of iOS 6's UIEventSubtypeRemoteControlTogglePlayPause.
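A sketch of handling both the old toggle subtype and the separate play/pause subtypes that Control Center now sends (the playback methods here are placeholders):

```objc
- (void)remoteControlReceivedWithEvent:(UIEvent *)event {
    if (event.type != UIEventTypeRemoteControl) {
        return;
    }
    switch (event.subtype) {
        case UIEventSubtypeRemoteControlTogglePlayPause:   // earbuds / iOS 6 style
            [self togglePlayback];
            break;
        case UIEventSubtypeRemoteControlPlay:              // Control Center on iOS 7
            [self startPlayback];
            break;
        case UIEventSubtypeRemoteControlPause:             // Control Center on iOS 7
            [self pausePlayback];
            break;
        default:
            break;
    }
}
```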
I know this might seem odd, but I am working on a Windows Metro app which will be displayed on touch-screen monitors at our local university.
Now I am using the simulator for debugging, but in the simulator you have to start "Touch Mode" to even use the touch interface.
So when using the touch monitors, do we have to specifically enable touch mode, or will the touch functionality work automatically?
Thank you.
Touch is a first-class citizen in Windows Store applications, so no special accommodations are needed. I would recommend you test on a touch device before deploying, though: it's a different way of interacting, and even though the simulator does a decent job of handling the mechanics, it will "feel" different to a user, especially if you're leveraging pinch-zoom, swipe, and other gestures.
On another note... is this app intended for a kiosk-type application? If so, keep in mind with Windows 8/RT, you won't be able to easily prevent the users from swiping to the charms, navigating to other programs, etc. You may want/need to take a look at Windows 8 Embedded depending on the specific deployment requirements.
I'm working on an OS X project and I want to programmatically generate touchpad gesture events like NSEventTypeSwipe or NSEventTypeRotate,
so that I can rotate/zoom etc. in other applications.
I found out how to generate mouse and keyboard events, but not touchpad gestures.
Any idea?
There is no public API for generating those events.
You can find some work on synthesizing those events in this project: calftrail/Touch.
Reference: Cocoa Event Handling Guide
The Mac Developer Library guide above does not describe any method for programmatically generating touchpad gestures.
It goes so far as to say the touchpad gestures themselves occur outside of the OS:
"The trackpad hardware includes built-in support for interpreting common gestures and for mapping movements..."
That guide also explicitly mentions that apps should not rely on trackpad gestures as their sole input mechanism, and that it's best to include keyboard and mouse support for that reason.
Now that Mac and Windows (i.e., Windows 8) support touchscreen monitors at the OS level, it's only a matter of time before touchpad and touchscreen gestures can be generated programmatically and incorporated into things like your project or remote desktop control, once the appropriate API becomes available.
I think touch gesture events cannot be easily generated, as there is no public, official Apple API for it. The NSEventTypeMagnify, NSEventTypeRotate, and NSEventTypeSwipe types are, I think, only for reading while handling existing system events. Maybe Apple doesn't want third-party developers turning the Magic Mouse into a magic touchpad. The project rob mayoff mentioned is currently not working, as Apple has probably changed something in the structure of the event data, so relying on that kind of hacking isn't future-proof.
But if you think about it a little more, you can achieve what the touch events would do by other means:
Magnification (pinch gesture) is just zoom in / zoom out; most programs have a keyboard shortcut for this, such as Cmd and + or Cmd and - (see the sketch after this list).
Rotation is mostly useful for photos, and there are shortcuts such as Cmd and L or Cmd and R in the Preview app.
Swiping changes Spaces (desktops); use Ctrl and the left or right arrow keys.
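A sketch of posting one of those shortcuts with Quartz Event Services; the key code assumes a US keyboard layout, and system permissions or secure-input restrictions may interfere:

```objc
#import <Carbon/Carbon.h>            // for kVK_ANSI_Equal
#include <CoreGraphics/CoreGraphics.h>

// Post Cmd+= (commonly bound to "zoom in"), which approximates a pinch-out gesture.
// Key codes are layout dependent; kVK_ANSI_Equal is the "=" / "+" key on a US keyboard.
static void PostZoomInShortcut(void) {
    CGEventRef keyDown = CGEventCreateKeyboardEvent(NULL, kVK_ANSI_Equal, true);
    CGEventRef keyUp   = CGEventCreateKeyboardEvent(NULL, kVK_ANSI_Equal, false);
    CGEventSetFlags(keyDown, kCGEventFlagMaskCommand);
    CGEventSetFlags(keyUp, kCGEventFlagMaskCommand);
    CGEventPost(kCGHIDEventTap, keyDown);
    CGEventPost(kCGHIDEventTap, keyUp);
    CFRelease(keyDown);
    CFRelease(keyUp);
}
```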
When I look at the Xcode simulator, I understand that touch events can be generated programmatically, and that the simulator uses such routines to translate cursor clicks into touch events; however, based on what rob mayoff said, it seems that Apple did not make that library public. Emulating the same behavior and creating those functions from scratch would be a bit challenging.
This is the NSTouch class reference:
http://developer.apple.com/library/mac/#documentation/AppKit/Reference/NSTouch_Class/Reference/Reference.html#//apple_ref/occ/cl/NSTouch
I have an installation project in mind which involves a hacked iPad. I'd like to have a background process running that records all touch events, regardless of which app is running in the foreground, and sends them out via OSC.
Note that this is using a jailbroken iPad with root access, and users will be alerted about not entering any sensitive data. But I'm not an iOS developer so I'm not sure if this is even possible. I'd appreciate any kind of input/suggestions.
[edit] Since someone questioned my motive behind this question, I'll try to explain a bit. To be specific, I'd like to build a mechanical system with an Arduino that emulates the user's touch input on the iPad, but I do not want to limit users to an app that does nothing but record touch events.
There are three options:
Use the IOHIDFamily subsystem to capture all the touch events. This will do most of the processing for you; the only thing you'll need to do is fetch the events using a HID client, get their types, and, if they are touch events, read their position, radius, and whatever else you need (a sketch of this approach follows the list).
Use the MultitouchSupport framework. This way you will have to process the digitizer data frames manually, which is tricky.
Use a MobileSubstrate hook to hook the already existing HID client inside SpringBoard.
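For option 1, a heavily hedged sketch: every declaration below is reconstructed from community headers for these private IOHIDFamily functions, so the prototypes and constants are assumptions that need to be verified against the target iOS version.

```objc
// PRIVATE API sketch -- nothing here is in the public SDK; only usable on a jailbroken device.
#include <CoreFoundation/CoreFoundation.h>

typedef struct __IOHIDEventSystemClient *IOHIDEventSystemClientRef;
typedef struct __IOHIDEvent *IOHIDEventRef;
typedef struct __IOHIDServiceClient *IOHIDServiceClientRef;

// Prototypes reconstructed from community headers; treat them as assumptions.
extern IOHIDEventSystemClientRef IOHIDEventSystemClientCreate(CFAllocatorRef allocator);
extern void IOHIDEventSystemClientScheduleWithRunLoop(IOHIDEventSystemClientRef client,
                                                      CFRunLoopRef runLoop, CFStringRef mode);
typedef void (*IOHIDEventSystemClientEventCallback)(void *target, void *refcon,
                                                    IOHIDServiceClientRef service,
                                                    IOHIDEventRef event);
extern void IOHIDEventSystemClientRegisterEventCallback(IOHIDEventSystemClientRef client,
                                                        IOHIDEventSystemClientEventCallback callback,
                                                        void *target, void *refcon);
extern int IOHIDEventGetType(IOHIDEventRef event);
#define kIOHIDEventTypeDigitizer 11   // value taken from community headers; verify on-device

static void HandleHIDEvent(void *target, void *refcon,
                           IOHIDServiceClientRef service, IOHIDEventRef event) {
    if (IOHIDEventGetType(event) == kIOHIDEventTypeDigitizer) {
        // Touch (digitizer) event: extract position/radius here and forward it over OSC.
    }
}

int main(void) {
    IOHIDEventSystemClientRef client = IOHIDEventSystemClientCreate(kCFAllocatorDefault);
    IOHIDEventSystemClientScheduleWithRunLoop(client, CFRunLoopGetCurrent(), kCFRunLoopDefaultMode);
    IOHIDEventSystemClientRegisterEventCallback(client, HandleHIDEvent, NULL, NULL);
    CFRunLoopRun();
    return 0;
}
```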