Two USB touch displays not recognized (Linux kernel - usb)

I have two USB touch displays connected to separate ports.
When both are connected, only one touch display is active: I can monitor its touch events by reading /dev/input/eventX, whereas the other display's touches are never recognized and no events show up under /dev/input/ at all.
When only one touch display is connected (either of the two), it is recognized and touch events are fired.
I believe this is purely a kernel issue. Can anyone suggest how to debug it?
Thanks
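One way to narrow it down: watch dmesg while plugging in the second display, and read the event node directly to confirm whether the kernel ever creates one. A minimal reader, as a sketch (the device path below is a placeholder):

    // evread.c - dump raw events from an input node to see which device is live.
    // Build: cc evread.c -o evread ; run as root, e.g. ./evread /dev/input/event3
    #include <fcntl.h>
    #include <linux/input.h>
    #include <stdio.h>
    #include <unistd.h>

    int main(int argc, char *argv[]) {
        const char *path = (argc > 1) ? argv[1] : "/dev/input/event0";
        int fd = open(path, O_RDONLY);
        if (fd < 0) { perror("open"); return 1; }

        struct input_event ev;
        while (read(fd, &ev, sizeof ev) == (ssize_t)sizeof ev) {
            if (ev.type == EV_ABS || ev.type == EV_KEY)  // touch position / contact
                printf("type=%u code=%u value=%d\n", ev.type, ev.code, ev.value);
        }
        close(fd);
        return 0;
    }

If no eventX node is ever created for the second display, a usual next step is comparing dmesg and lsusb -v output for the two devices (for example, identical serial numbers can confuse device matching).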

Related

Still pin capture on Linux. Is this possible?

A. I have a general understanding question about the "still pin" (snapshot) functionality on some web cameras: how does this work? It must be one of the following possibilities:
The camera is on and video is streaming to the host. When the snap button is pressed, a signal is sent to the host's camera driver (/dev/input/event0 on Linux), the driver extracts a frame from the stream, and sends it up the stack.
The camera is on and video is streaming (or not) to the host. When the snap button is pressed, the on-board firmware sets aside the current frame and tells the host a new "still" is available.
B. I have four USB cameras attached to a Raspberry Pi (a single USB host). All cameras have a still pin. I don't care about the video and have no need for streaming; I want to take four simultaneous photos. My idea is to trigger all four cameras to capture a frame using the still pin. How can I capture those four images without streaming video (because of bandwidth issues)?
Note: I have already experimented a lot and am able to capture a frame from a video stream. My cameras are unknown brands but expose "video capture" in their device caps. When using AMCap on Windows, the snap button triggers a snapshot.
Thanks for any help.
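Not an answer to the still-pin part, but as a starting point: what the driver actually exposes beyond plain video capture can be checked with a V4L2 capability query. A minimal sketch, assuming the uvcvideo driver and the path /dev/video0 (run it once per /dev/videoN node):

    // capquery.c - print what a V4L2 device claims to support.
    #include <fcntl.h>
    #include <linux/videodev2.h>
    #include <stdio.h>
    #include <string.h>
    #include <sys/ioctl.h>
    #include <unistd.h>

    int main(void) {
        int fd = open("/dev/video0", O_RDWR);  // assumed device path
        if (fd < 0) { perror("open"); return 1; }

        struct v4l2_capability cap;
        memset(&cap, 0, sizeof cap);
        if (ioctl(fd, VIDIOC_QUERYCAP, &cap) < 0) { perror("VIDIOC_QUERYCAP"); return 1; }

        printf("driver=%s card=%s\n", cap.driver, cap.card);
        if (cap.capabilities & V4L2_CAP_VIDEO_CAPTURE)
            printf("video capture supported\n");
        close(fd);
        return 0;
    }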

tvOS - game control via non-Siri remote

I'm working on a game in Objective-C. The Siri remote works great via GCMicroGamepad and real MFi controllers work well via GCGamepad. However, third-party IR remotes do not work at all in-game (and neither does the Remote app on iPhone, nor an older Apple TV 3rd-generation remote).
How can I recognize and distinguish these inputs?
Two days later... I have found that a UITapGestureRecognizer can be used to detect Up, Down, Left, Right, and Select events correctly when presented with a third-party TV remote or the iPhone Remote.app. The directional events are actually unique to these types of remotes as well; the Siri remote does not generate directional tap events. Unfortunately, tapping the Select button on the Siri remote, a third-party remote, or the iPhone Remote.app all generate a Select event from my tap recognizer, and I need some way to distinguish them.
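For reference, a minimal sketch of that recognizer setup, assuming a tvOS UIViewController subclass (the selector name is a placeholder):

    // In viewDidLoad; #import <UIKit/UIKit.h>.
    UITapGestureRecognizer *tap =
        [[UITapGestureRecognizer alloc] initWithTarget:self
                                                action:@selector(remoteTapped:)];
    // Directional presses only arrive from IR remotes / Remote.app, per the above.
    tap.allowedPressTypes = @[ @(UIPressTypeUpArrow), @(UIPressTypeDownArrow),
                               @(UIPressTypeLeftArrow), @(UIPressTypeRightArrow),
                               @(UIPressTypeSelect) ];
    [self.view addGestureRecognizer:tap];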
The only distinguishing factor I can find is that tapping the Siri remote also generates a button-A press on the GCMicroGamepad; a third-party remote or the iPhone Remote.app does not affect the GCMicroGamepad at all. But it is extremely inelegant to watch the GCMicroGamepad for tap-release events and then use those to filter out the matching Select button events. It is certainly not a recommended use of the APIs and doesn't seem like a good long-term solution. If I could tell the Siri remote to stop generating UI events when in GCMicroGamepad mode, that would be excellent.
I cannot test this right now, but you could probably differentiate the Siri Remote from a third-party remote by using a GCEventViewController with its controllerUserInteractionEnabled property set to NO. That way, Siri Remote input shouldn't get passed to UIKit (while the GCEventViewController is the first responder). The third-party remote's input events might still go through to UIKit since, unlike the Siri Remote, it is not a GCMicroGamepad.
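A sketch of that suggestion (GCEventViewController is in GameController.framework; whether you subclass it or present it is your choice):

    // Make the in-game view controller a GCEventViewController subclass, or
    // present one; #import <GameController/GameController.h>.
    GCEventViewController *gameVC = [[GCEventViewController alloc] init];
    // While this controller is first responder, Siri Remote input should stay
    // on the GCMicroGamepad instead of surfacing as UIKit presses and taps.
    gameVC.controllerUserInteractionEnabled = NO;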
So far, Apple doesn't really support multiplayer games with multiple Siri Remotes, iOS Remote apps, or IR remotes. But I think support might be coming, because the Remote app on iOS will soon support multiplayer gaming (so I guess the Apple TV will recognize multiple GCMicroGamepad controllers).

iOS7: Control Center, remote control, and the CoreAudio audio session solo-ambient category

Before iOS7 arrived, we noticed an issue:
Music remote control from the earbuds or Springboard can hijack our audio session even if we set the category to solo-ambient or another exclusive mode.
We thus tried a few things:
We tried to take ownership of the audio session back. But this requires that our audio code know when to take it back and from whom. We thought we could let the app become the first responder to remote-control events, do our stuff, and then pass the events on to the Music app. However, we found that the events get consumed by the first responder, and there is no way to push them back up the responder chain.
We tried to become first responder and block remote-control events altogether while we are in solo-ambient. This worked fine in iOS6, and it still works for earbud control in iOS7, but it fails with iOS7's Control Center. Control Center seems to completely bypass the remote-control event handler remoteControlReceivedWithEvent:, which is where we put our blocking code.
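For context, the blocking code lives in the standard iOS6-era responder hooks, roughly like this (a sketch; this mechanism was later superseded by MPRemoteCommandCenter):

    // In a UIResponder subclass, e.g. the main view controller.
    - (void)viewDidAppear:(BOOL)animated {
        [super viewDidAppear:animated];
        [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
        [self becomeFirstResponder];
    }

    - (BOOL)canBecomeFirstResponder { return YES; }

    - (void)remoteControlReceivedWithEvent:(UIEvent *)event {
        if (event.type != UIEventTypeRemoteControl) return;
        switch (event.subtype) {
            case UIEventSubtypeRemoteControlPlay:            // sent by iOS7 Control Center
            case UIEventSubtypeRemoteControlPause:           // sent by iOS7 Control Center
            case UIEventSubtypeRemoteControlTogglePlayPause: // iOS6-style toggle
                // swallow or handle the event here
                break;
            default:
                break;
        }
    }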
I read something elsewhere that:
You can't block the Music app. Your app can become one, though (Apple
won't like that), and then Control Center would control yours.
But I found no documentation whatsoever about Control Center.
And as said above, Control Center does not enter the normal remote-control hooks even if an app is the first responder.
Another quote:
Remote-control event handling is so your app can be controlled by
Control Center, the earbuds, etc. It is not so that your app can eat
said controls, preventing control of other apps from those sources. It
only worked in iOS6 because of a bug in iOS, now fixed in iOS7.
So was what we were using merely relying on this bug? I find that hard to believe, because we got the solution on this list and on the Xcode mailing list, so I assumed it was an accepted approach.
Now we really wonder if we have been missing something from the very beginning:
Is solo-ambient really an exclusive mode for the audio session, or is the Music app an exception to that exclusivity?
How can our app live in harmony with the remote controls and Control Center?
Where can we find up-to-date documentation on remote control and Control Center?
The remote control has been mysteriously fixed after clean-building everything against the iOS7 SDK. Now the app delegate can receive remote-control events from Control Center. However, the play/pause events arrive as UIEventSubtypeRemoteControlPause and UIEventSubtypeRemoteControlPlay instead of iOS6's UIEventSubtypeRemoteControlTogglePlayPause.

Is it possible to capture touch events in the background on a jailbroken iOS device?

I have an installation project in mind which involves a hacked iPad: I'd like to have a background process that records all touch events, regardless of what app is running in the foreground, and sends them out via OSC.
Note that this is using a jailbroken iPad with root access, and users will be alerted about not entering any sensitive data. But I'm not an iOS developer, so I'm not sure if this is even possible. I'd appreciate any kind of input or suggestions.
[edit] Since someone questioned my motive behind this question, I'll try to explain a bit. To be specific, I'd like to build a mechanical system with an Arduino that emulates the user's touch input on the iPad, but I do not want to limit users to an app that does nothing other than record touch events.
There are three options:
Use the IOHIDFamily subsystem to capture all the touch events (see the sketch after this list). This will do most of the processing for you; the only thing you'll need to do is fetch the events using a HID client, get their types, and, for touch events, read their position, radius, and whatever else you need.
Use the MultitouchSupport framework. This way you will have to process the digitizer data frames manually, which is tricky.
Use a MobileSubstrate hook to hook the already existing HID client inside SpringBoard.
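A heavily hedged sketch of the first option: the IOHIDFamily client API is private with no public headers, so the declarations below are reproduced from jailbreak community write-ups and may differ between iOS versions.

    // Compile against CoreFoundation; must run as root, outside the sandbox.
    #include <CoreFoundation/CoreFoundation.h>
    #include <stdio.h>

    typedef struct __IOHIDEventSystemClient *IOHIDEventSystemClientRef;
    typedef struct __IOHIDService *IOHIDServiceRef;
    typedef struct __IOHIDEvent *IOHIDEventRef;
    typedef void (*IOHIDEventCallback)(void *target, void *refcon,
                                       IOHIDServiceRef service, IOHIDEventRef event);

    // Private symbols; prototypes copied from community headers.
    extern IOHIDEventSystemClientRef IOHIDEventSystemClientCreate(CFAllocatorRef alloc);
    extern void IOHIDEventSystemClientScheduleWithRunLoop(IOHIDEventSystemClientRef c,
                                                          CFRunLoopRef loop,
                                                          CFStringRef mode);
    extern void IOHIDEventSystemClientRegisterEventCallback(IOHIDEventSystemClientRef c,
                                                            IOHIDEventCallback cb,
                                                            void *target, void *refcon);
    extern int IOHIDEventGetType(IOHIDEventRef event);

    #define kIOHIDEventTypeDigitizer 11  // value as reported in community headers

    static void handleEvent(void *target, void *refcon,
                            IOHIDServiceRef service, IOHIDEventRef event) {
        if (IOHIDEventGetType(event) == kIOHIDEventTypeDigitizer)
            printf("digitizer (touch) event\n");  // read position/radius fields here
    }

    int main(void) {
        IOHIDEventSystemClientRef client =
            IOHIDEventSystemClientCreate(kCFAllocatorDefault);
        IOHIDEventSystemClientScheduleWithRunLoop(client, CFRunLoopGetCurrent(),
                                                  kCFRunLoopDefaultMode);
        IOHIDEventSystemClientRegisterEventCallback(client, handleEvent, NULL, NULL);
        CFRunLoopRun();  // events arrive on this run loop; forward them over OSC here
        return 0;
    }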

How do I capture keyboard/mouse events with my application in focus?

I'm building a Synergy-like app for Mac OS that captures keyboard/mouse input and sends it to a remote computer.
I wish to capture all user keyboard and mouse events while my NSWindow is in focus (capturing while not in focus would be even nicer). The catch is that I don't want system shortcuts like Cmd+Tab or Cmd+Q to interrupt me; I wish to handle them before the windowing system does, so that my app won't lose focus. Same for the mouse.
Thanks
Check this - Cocoa Event-Handling Guide
Hope this helps :)
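To add to that: Cocoa alone can't swallow Cmd+Tab before the window server acts on it. One common approach (my suggestion, not from the guide) is a Quartz event tap at the HID level; a minimal sketch, assuming the app has been granted Accessibility trust (or runs as root):

    // Build with: cc tap.c -o tap -framework ApplicationServices
    #include <ApplicationServices/ApplicationServices.h>

    static CGEventRef tapCallback(CGEventTapProxy proxy, CGEventType type,
                                  CGEventRef event, void *refcon) {
        if (type == kCGEventKeyDown) {
            int64_t key = CGEventGetIntegerValueField(event, kCGKeyboardEventKeycode);
            CGEventFlags flags = CGEventGetFlags(event);
            if ((flags & kCGEventFlagMaskCommand) && key == 48)  // 48 = Tab
                return NULL;  // swallow Cmd+Tab; forward it to the remote host instead
        }
        return event;  // pass everything else through unchanged
    }

    int main(void) {
        CGEventMask mask = CGEventMaskBit(kCGEventKeyDown) |
                           CGEventMaskBit(kCGEventKeyUp) |
                           CGEventMaskBit(kCGEventMouseMoved);
        // kCGHIDEventTap sees events before the window server dispatches them.
        CFMachPortRef tap = CGEventTapCreate(kCGHIDEventTap, kCGHeadInsertEventTap,
                                             kCGEventTapOptionDefault, mask,
                                             tapCallback, NULL);
        if (!tap) return 1;  // usually means Accessibility trust is missing
        CFRunLoopSourceRef src = CFMachPortCreateRunLoopSource(kCFAllocatorDefault, tap, 0);
        CFRunLoopAddSource(CFRunLoopGetCurrent(), src, kCFRunLoopCommonModes);
        CGEventTapEnable(tap, true);
        CFRunLoopRun();
        return 0;
    }

Because the tap sits at the HID level, it works regardless of which window has focus, which also covers the "while not in focus" case.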