Is it possible to capture touch events in the background on a jailbroken iOS device? - objective-c

I have an installation project in mind which involves a hacked iPad - I'd like to have a background process running that records all touch events, regardless of which app is in the foreground, and sends them out via OSC.
Note that this is using a jailbroken iPad with root access, and users will be alerted about not entering any sensitive data. But I'm not an iOS developer so I'm not sure if this is even possible. I'd appreciate any kind of input/suggestions.
[edit] Since someone questioned my motive behind this question, I'll try to explain a bit: to be specific, I'd like to build a mechanical system with Arduino that emulates the user's touch input on the iPad, but I do not want to limit users to an app that does nothing but record touch events.

There are three options:
Use the IOHIDFamily subsystem to capture all the touch events. This does most of the processing for you; the only thing you'll need to do is fetch the events with a HID client, check their types and, for touch events, read the position, radius and whatever else you need (a rough sketch follows this list).
Use the MultitouchSupport framework. This way you will have to process the digitizer data frames manually, which is tricky.
Use a MobileSubstrate hook to hook the already existing HID client inside SpringBoard.
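For option 1, here is a minimal sketch of a background daemon using the IOHIDFamily HID event client. Everything here is private API: the function declarations, the callback signature and the digitizer constants are assumptions taken from reverse-engineered IOHIDFamily headers and may differ between iOS versions, so verify them against the headers you use; the OSC sending is left out.

    // hid_touch_logger.c - build and run as a root daemon on the jailbroken device.
    // All IOHIDFamily declarations below are assumed from reversed headers.
    #include <CoreFoundation/CoreFoundation.h>
    #include <stdio.h>

    typedef struct __IOHIDEventSystemClient *IOHIDEventSystemClientRef;
    typedef struct __IOHIDEvent *IOHIDEventRef;

    typedef void (*IOHIDEventSystemClientEventCallback)(void *target, void *refcon,
                                                         void *service, IOHIDEventRef event);

    // Private functions (assumed signatures).
    extern IOHIDEventSystemClientRef IOHIDEventSystemClientCreate(CFAllocatorRef allocator);
    extern void IOHIDEventSystemClientScheduleWithRunLoop(IOHIDEventSystemClientRef client,
                                                          CFRunLoopRef runLoop, CFStringRef mode);
    extern void IOHIDEventSystemClientRegisterEventCallback(IOHIDEventSystemClientRef client,
                                                            IOHIDEventSystemClientEventCallback callback,
                                                            void *target, void *refcon);
    extern int IOHIDEventGetType(IOHIDEventRef event);
    extern float IOHIDEventGetFloatValue(IOHIDEventRef event, int field);

    // Values taken from reversed headers; verify them for your iOS version.
    #define kIOHIDEventTypeDigitizer   11
    #define kIOHIDEventFieldDigitizerX ((kIOHIDEventTypeDigitizer << 16) | 0)
    #define kIOHIDEventFieldDigitizerY ((kIOHIDEventTypeDigitizer << 16) | 1)

    static void handleHIDEvent(void *target, void *refcon, void *service, IOHIDEventRef event) {
        if (IOHIDEventGetType(event) == kIOHIDEventTypeDigitizer) {
            // Coordinates are normalized (0..1); scale to screen size as needed.
            float x = IOHIDEventGetFloatValue(event, kIOHIDEventFieldDigitizerX);
            float y = IOHIDEventGetFloatValue(event, kIOHIDEventFieldDigitizerY);
            printf("touch at %.3f, %.3f\n", x, y);   // replace with your OSC send
        }
    }

    int main(void) {
        IOHIDEventSystemClientRef client = IOHIDEventSystemClientCreate(kCFAllocatorDefault);
        IOHIDEventSystemClientScheduleWithRunLoop(client, CFRunLoopGetCurrent(), kCFRunLoopDefaultMode);
        IOHIDEventSystemClientRegisterEventCallback(client, handleHIDEvent, NULL, NULL);
        CFRunLoopRun();   // keep the daemon alive; launch it via launchd
        return 0;
    }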

Related

tvOS - game control via non-Siri remote

I'm working on a game in Objective-C. The Siri remote works great via GCMicroGamepad and real MFi controllers work well via GCGamepad. However, third-party IR remotes do not work at all in-game (and neither does the Remote App on iPhone or an older Apple TV 3rd gen remote).
How can I recognize and distinguish these inputs?
Two days later... I have found that a UITapGestureRecognizer can be used to detect Up, Down, Left, Right and Select events correctly when presented with a third-party TV remote or iPhone Remote.app. The directional events are actually unique to these types of remotes as well—the Siri remote does not generate directional tap events. Unfortunately, however, tapping the Select button on either the Siri remote or the third-party or iPhone Remote.app will generate a Select event from my tap recognizer. I need some way to distinguish the two.
The only distinguishing factor I can find is that tapping the Siri remote also generates a button-A press on the GCMicroGamepad—a third-party remote or iPhone Remote.app does not affect the GCMicroGamepad at all. But it's extremely inelegant to watch the GCMicroGamepad for tap-release events and then use that event to filter out a matching Select button event. Certainly it's not a recommended use of the APIs, and it doesn't seem like a good long-term solution. If I could tell the Siri remote to stop generating UI events when in GCMicroGamepad mode, that would be excellent.
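For reference, a rough sketch of the tap-recognizer setup described above (tvOS), with one recognizer per press type so the handler knows which input fired; handleTap: is a hypothetical handler name.

    - (void)viewDidLoad {
        [super viewDidLoad];
        NSArray *pressTypes = @[ @(UIPressTypeUpArrow), @(UIPressTypeDownArrow),
                                 @(UIPressTypeLeftArrow), @(UIPressTypeRightArrow),
                                 @(UIPressTypeSelect) ];
        for (NSNumber *pressType in pressTypes) {
            UITapGestureRecognizer *tap =
                [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handleTap:)];
            tap.allowedPressTypes = @[ pressType ];
            [self.view addGestureRecognizer:tap];
        }
    }

    - (void)handleTap:(UITapGestureRecognizer *)recognizer {
        UIPressType type = [recognizer.allowedPressTypes.firstObject integerValue];
        // Directional taps only arrive from IR remotes / the iPhone Remote app;
        // Select also fires for the Siri remote, which is the ambiguity described above.
        NSLog(@"press type: %ld", (long)type);
    }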
I cannot test this right now, but you could probably differentiate the Siri Remote from a third party remote by using a GCEventViewController with the controllerUserInteractionEnabled property set to false. This way, the Siri Remote inputs shouldn't get passed to UIKit (when the GCEventViewController is the first responder). The third-party remote's input events might go through to UIKit since, unlike the Siri Remote, it's not a GCMicroGamepad.
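A minimal sketch of that suggestion, assuming the game's root view controller can be made a GCEventViewController subclass:

    #import <GameController/GameController.h>

    @interface GameViewController : GCEventViewController
    @end

    @implementation GameViewController
    - (void)viewDidLoad {
        [super viewDidLoad];
        // With this set to NO, Siri Remote presses go to the GCMicroGamepad only and are
        // not delivered to UIKit; a third-party IR remote should still reach the tap
        // recognizers, which is the distinguishing behavior suggested above.
        self.controllerUserInteractionEnabled = NO;
    }
    @end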
So far, Apple doesn't really support multiplayer games with multiple Siri Remotes, iOS Remotes or IR remotes. But I think it might be coming, because the Remote app on iOS will soon support multiplayer gaming (so I guess the Apple TV will recognize multiple GCMicroGamepad controllers).

How can I detect/observe when third party app launches a full screen process?

I would like to build a helper app for gamers, and to build some extra functionality I would like to observe/time certain third party games behaviors, more specifically when the game actually launches the full screen process.
For example: my app is a system tray app, and the game has a "launcher" app with lobby and menu screens. Once the game launches the extra process, OS X will usually (optionally) switch resolutions, and my app would be notified somehow. Then I can start a timer. Once the game match finishes, either because the process is closed or because the game is no longer full screen, my app gets a second notification and I can stop the timer.
Are there official Apple APIs that provide a way to observe or poll for an app going full screen and/or launching additional windows that I can reliably assume are the actual game screen?
I doubt you're going to find a completely comprehensive solution. There are many ways for apps to achieve a full-screen experience and most don't provide a notification about that fact.
A full-screen app can modify the presentationOptions of NSApplication to hide the Dock and menu bar. Another app can use key-value observing to monitor its application object's currentSystemPresentationOptions property, which will reflect the current system status.
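A sketch of that KVO approach (the context pointer and handler layout are just one way to structure it):

    static void *kPresentationOptionsContext = &kPresentationOptionsContext;

    // During setup, e.g. in applicationDidFinishLaunching:
    [NSApp addObserver:self
            forKeyPath:@"currentSystemPresentationOptions"
               options:NSKeyValueObservingOptionNew
               context:kPresentationOptionsContext];

    - (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                            change:(NSDictionary *)change context:(void *)context {
        if (context != kPresentationOptionsContext) {
            [super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
            return;
        }
        NSApplicationPresentationOptions opts =
            [change[NSKeyValueChangeNewKey] unsignedIntegerValue];
        // If the Dock and menu bar are (auto-)hidden, some app is presenting full screen.
        BOOL dockHidden = (opts & (NSApplicationPresentationHideDock |
                                   NSApplicationPresentationAutoHideDock)) != 0;
        BOOL menuHidden = (opts & (NSApplicationPresentationHideMenuBar |
                                   NSApplicationPresentationAutoHideMenuBar)) != 0;
        NSLog(@"presentation options changed: %lu (full screen likely: %d)",
              (unsigned long)opts, dockHidden && menuHidden);
    }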
A full-screen app can capture the displays (although Apple discourages this technique). You can try to detect this by calling CGDisplayIsCaptured(), although it's been deprecated since 10.9 with no replacement. It may be possible that, if you register a callback with CGDisplayRegisterReconfigurationCallback(), you'll get called when something captures the display. However, capturing the display is sort of about preventing other processes from noticing such changes, so maybe not. In that case, you'd have to poll. You might also poll for the current display mode; changing the mode is the primary reason why a game would capture the display in the first place.
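For the display-capture case, a small sketch of the reconfiguration callback (whether a capture actually triggers it is uncertain, as noted above, so you may still need to poll the display mode):

    #import <Foundation/Foundation.h>
    #import <ApplicationServices/ApplicationServices.h>

    static void displayChanged(CGDirectDisplayID display,
                               CGDisplayChangeSummaryFlags flags,
                               void *userInfo) {
        if (flags & kCGDisplaySetModeFlag) {
            // The display mode changed - a game may have captured it and gone full screen.
            NSLog(@"display %u changed mode", display);
        }
    }

    // During setup:
    CGDisplayRegisterReconfigurationCallback(displayChanged, NULL);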
A full-screen game could also just create a borderless window the size of the screen and set its window level to be in front of the Dock and menu bar (and other apps' windows). There's not really a notification about this. You could detect it using the CGWindowList API, but you would have to poll. For example, you could call CGWindowListCopyWindowInfo(kCGWindowListOptionOnScreenOnly, kCGNullWindowID) and iterate through the dictionaries looking for one the size of the screen and at a window level above kCGStatusWindowLevel.
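A sketch of that polling check (it compares window sizes against the main screen; a multi-display setup would need to check each screen):

    #import <Cocoa/Cocoa.h>

    static BOOL someWindowLooksFullScreen(void) {
        NSRect screenFrame = [[NSScreen mainScreen] frame];
        NSArray *windows = CFBridgingRelease(
            CGWindowListCopyWindowInfo(kCGWindowListOptionOnScreenOnly, kCGNullWindowID));
        for (NSDictionary *info in windows) {
            NSInteger layer = [info[(id)kCGWindowLayer] integerValue];
            CGRect bounds;
            CGRectMakeWithDictionaryRepresentation(
                (__bridge CFDictionaryRef)info[(id)kCGWindowBounds], &bounds);
            // A screen-sized window above the status-window level: treat it as a full-screen game.
            if (layer > kCGStatusWindowLevel &&
                CGSizeEqualToSize(bounds.size, screenFrame.size)) {
                return YES;
            }
        }
        return NO;
    }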
(You might be able to use the Accessibility API to get a notification when the frontmost window changes, so you'd only have to poll when that happens.)
You cannot observe a notification if there is none. So first you need to know whether the app you want to observe actually sends an observable notification. You cannot 'hook' into other apps without their planned consent.

iOS7: Control center, remote-control, and CoreAudio audio session solo-ambient category

Before iOS 7 came out, we noticed an issue:
Music remote-control from the earbuds or SpringBoard can hijack our audio session even if we set the category to solo-ambient or another exclusive mode.
We thus tried a few things:
We tried to take ownership of the audio session back. But this requires that our audio code knows when to take it back and from whom. We thought we could let the app code become the first responder to remote-control events, do our stuff, and then pass the events on to the Music app. However, we found that the events get consumed by the first responder and there is no way to pass them back down the responder chain.
We tried to become first responder and block remote-control events altogether when we are in solo-ambient. This worked fine in iOS 6 and still works with earbud control in iOS 7, but it fails with iOS 7's Control Center, which seems to bypass the remote-control event handler remoteControlReceivedWithEvent (where we put our blocking code) completely.
I read something elsewhere that:
You can't block the music app. Your app can become one though (Apple won't like that) and then the control center would control yours.
But I found no documentation whatsoever about control center.
And as said above, control center does not enter the normal remote control hooks even if an app is the first responder.
Another quote:
Remote Control Event handling is so your app can be controlled by Control Center, the earbuds, etc. It is not so that your app can eat said controls, preventing control of other apps from said sources. It only worked in iOS 6 because of a bug in iOS, now fixed in iOS 7.
Is it that what we had been using relied on this bug? I find that hard to believe, because we got the solution on this list and on the Xcode mailing list, so I assumed it was an accepted solution.
Now we really wonder if we are missing something from the very beginning:
Is solo-ambient really an exclusive mode for the audio session, or is the Music app an exception to that exclusivity?
How can our app live in harmony with the remote-control, and control center?
Where can we find up-to-date documentation of remote-control and control center?
The remote control has been mysteriously fixed after clean-building everything against the iOS 7 SDK. Now the app delegate can receive remote-control events from Control Center. However, the play/pause events are UIEventSubtypeRemoteControlPause and UIEventSubtypeRemoteControlPlay instead of iOS 6's UIEventSubtypeRemoteControlTogglePlayPause.
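For completeness, the plumbing involved looks roughly like this in a view controller, handling both the iOS 7 subtypes and the older toggle subtype:

    - (void)viewDidAppear:(BOOL)animated {
        [super viewDidAppear:animated];
        [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
        [self becomeFirstResponder];
    }

    - (BOOL)canBecomeFirstResponder {
        return YES;
    }

    - (void)remoteControlReceivedWithEvent:(UIEvent *)event {
        if (event.type != UIEventTypeRemoteControl) return;
        switch (event.subtype) {
            case UIEventSubtypeRemoteControlPlay:            // sent by Control Center on iOS 7
            case UIEventSubtypeRemoteControlPause:           // sent by Control Center on iOS 7
            case UIEventSubtypeRemoteControlTogglePlayPause: // still sent by the earbuds
                // toggle playback here
                break;
            default:
                break;
        }
    }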

Windows Metro Apps on touch Screen Monitors

I know this might seem odd, but I am working on a Windows Metro app which would be displayed on touch-screen monitors in our local university.
Now I am using the simulator for debugging, but in the simulator you have to start "Touch Mode" to even use the touch interface.
So when using the touch monitors, do we have to explicitly enable touch mode, or will the touch functionality work automatically?
Thank you.
Touch is a first-class citizen in Windows Store applications, so no special accommodations are needed. I would recommend you test on a touch device before deploying, though: it's a different way of interacting, and even though the simulator does a decent job of handling the mechanics, it will "feel" different to a user - especially if you're leveraging pinch-zoom, swipe and other gestures.
On another note... is this app intended for a kiosk-type application? If so, keep in mind with Windows 8/RT, you won't be able to easily prevent the users from swiping to the charms, navigating to other programs, etc. You may want/need to take a look at Windows 8 Embedded depending on the specific deployment requirements.

Save Button Pushes and Location

I am making an app where I need to know
every button that was ever pushed by the user in the app, and when it was pushed, and
where the iPhone has gone (using GPS), but there are no cell towers in the area, so I can't use the significant-location-change method everyone uses.
It seems to me like the plist method for data saving won't work, because I don't want the app to start where it left off; I want it to start from the beginning every time.
Also, if any of you have any idea how I can make my app wake up at certain specific times, and/or how I can make it impossible to exit, that would be awesome. This is for an experiment with the University of Queensland St. Lucia Psych Department and the Groote Eylandt Aborigines.
You can know everything the user does in your app if you want. You could use your own solution with an SQLite database for example, and dispatch the data to a server every once in a while.
The GPS is also easy: you could just track the user with the Core Location framework.
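A minimal Core Location sketch of that, using standard updates rather than significant-change (GPS still works without cell coverage); on newer iOS you would also need to request location authorization and the appropriate background mode:

    #import <CoreLocation/CoreLocation.h>

    @interface TrackLogger : NSObject <CLLocationManagerDelegate>
    @property (nonatomic, strong) CLLocationManager *manager;
    @end

    @implementation TrackLogger

    - (void)start {
        self.manager = [[CLLocationManager alloc] init];
        self.manager.delegate = self;
        self.manager.desiredAccuracy = kCLLocationAccuracyBest;
        [self.manager startUpdatingLocation];
    }

    - (void)locationManager:(CLLocationManager *)manager
         didUpdateLocations:(NSArray *)locations {
        CLLocation *latest = [locations lastObject];
        // Append the fix to your own log (SQLite, a file, ...) together with its timestamp.
        NSLog(@"%f, %f @ %@", latest.coordinate.latitude,
              latest.coordinate.longitude, latest.timestamp);
    }

    @end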
You can't make the app wake at specific times, the best you can do is implement push notifications but it's up to the user to open the app via the notification or by themselves by tapping the app icon on the iPhone home screen.
Otherwise you could set up a local notification just before exiting the application; this is faster and easier to implement than setting up push notifications.
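For example, a local notification scheduled as the app goes to the background (the one-hour interval and the alert text are just placeholders):

    - (void)applicationDidEnterBackground:(UIApplication *)application {
        UILocalNotification *note = [[UILocalNotification alloc] init];
        note.fireDate = [NSDate dateWithTimeIntervalSinceNow:60 * 60];   // e.g. one hour later
        note.alertBody = @"Please reopen the study app to continue recording.";
        note.soundName = UILocalNotificationDefaultSoundName;
        [application scheduleLocalNotification:note];
    }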
There is also no way to stop the app from being closed, at least not until iOS 6 comes along with its accessibility features, which will let you disable the home button. But not now.