I need to create a GUI running on a Raspberry Pi 4B connected to a touch screen. When the app is running I would like the mouse pointer to be hidden, since it's not needed on that type of screen.
I've read that I can add a nocursor option, i.e. xserver-command = X -nocursor, in the file /etc/lightdm/lightdm.conf. I can't verify that right now since my RPi hasn't shipped yet. Also, I wouldn't like to hide the pointer for the whole OS, but only while the app is running.
I couldn't find anything relevant in the Compose documentation or in the source classes themselves, which is why I'm asking here.
Related
I am working in a system where I do not have a reference to the application's NSStatusItem (or anything, really). Is there a way to snag a reference to a previously created NSStatusItem? I can construct a new icon with statusItemWithLength:, but the associated menu and actions are lost.
A little bit of background: this is actually a Java application, and there is currently no way to set a status icon to use the Mac's template behavior. Previously we just changed the icon color manually, but Big Sur doesn't fire an event when it decides to change the status bar's color. I was going to attempt to set statusItem.button.image.template to true using native calls, but while prototyping in Objective-C I ran into this issue.
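For context, this is roughly the native call I want to end up making; a minimal sketch that assumes I already have a valid NSStatusItem reference, which is exactly the piece I'm missing (MarkStatusItemAsTemplate is just a made-up helper name):

    #import <Cocoa/Cocoa.h>

    // Hypothetical helper: given a valid NSStatusItem, mark its image as a
    // template so AppKit recolors it automatically for light/dark menu bars.
    static void MarkStatusItemAsTemplate(NSStatusItem *statusItem) {
        NSImage *icon = statusItem.button.image;
        // Template images should be black + alpha only; the system then tints
        // them to match the current menu bar appearance, including Big Sur's.
        icon.template = YES;
        statusItem.button.image = icon;
    }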
Any help is appreciated.
I'm working on a game in Objective-C. The Siri Remote works great via GCMicroGamepad, and real MFi controllers work well via GCGamepad. However, third-party IR remotes do not work at all in-game (and neither does the Remote app on iPhone or an older Apple TV 3rd-generation remote).
How can I recognize and distinguish these inputs?
Two days later... I have found that a UITapGestureRecognizer can be used to detect Up, Down, Left, Right and Select events correctly when presented with a third-party TV remote or the iPhone Remote.app. The directional events are actually unique to these types of remotes as well: the Siri Remote does not generate directional tap events. Unfortunately, tapping the Select button on the Siri Remote, a third-party remote, or the iPhone Remote.app all generate a Select event from my tap recognizer, so I need some way to distinguish them.
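For reference, here is roughly the recognizer setup, as a sketch; the handler selectors such as handleSelectTap: are just placeholders for my own methods:

    // One tap recognizer per press type; allowedPressTypes filters which
    // remote button each recognizer responds to.
    UITapGestureRecognizer *selectTap =
        [[UITapGestureRecognizer alloc] initWithTarget:self
                                                action:@selector(handleSelectTap:)];
    selectTap.allowedPressTypes = @[ @(UIPressTypeSelect) ];
    [self.view addGestureRecognizer:selectTap];

    UITapGestureRecognizer *upTap =
        [[UITapGestureRecognizer alloc] initWithTarget:self
                                                action:@selector(handleUpTap:)];
    upTap.allowedPressTypes = @[ @(UIPressTypeUpArrow) ];
    [self.view addGestureRecognizer:upTap];

    // ...and the same pattern for UIPressTypeDownArrow, UIPressTypeLeftArrow
    // and UIPressTypeRightArrow.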
The only distinguishing factor I can find is that tapping the Siri Remote also generates a button-A press on the GCMicroGamepad; a third-party remote or the iPhone Remote.app does not affect the GCMicroGamepad at all. But it's extremely inelegant to watch the GCMicroGamepad for tap-release events and then use them to filter out the matching Select events from the tap recognizer. It's certainly not a recommended use of the APIs and doesn't seem like a good long-term solution. If I could tell the Siri Remote to stop generating UIKit events while in GCMicroGamepad mode, that would be excellent.
I cannot test this right now, but you could probably differentiate the Siri Remote from a third party remote by using a GCEventViewController with the controllerUserInteractionEnabled property set to false. This way, the Siri Remote inputs shouldn't get passed to UIKit (when the GCEventViewController is the first responder). The third-party remote's input events might go through to UIKit since, unlike the Siri Remote, it's not a GCMicroGamepad.
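Something along these lines (untested, for the reason above):

    #import <GameController/GameController.h>

    // Untested sketch: with controllerUserInteractionEnabled set to NO, Siri
    // Remote input that arrives through the connected GCMicroGamepad should
    // not also be forwarded to UIKit while this controller is active.
    GCEventViewController *gameViewController = [[GCEventViewController alloc] init];
    gameViewController.controllerUserInteractionEnabled = NO;
    // Present it (or make the game's view controller a GCEventViewController
    // subclass) so it sits in the responder chain while the game is running.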
So far, Apple doesn't really support multiplayer games with multiple Siri Remotes, iOS Remote apps or IR remotes. But I think it might be coming, because the Remote app on iOS will soon support multiplayer gaming (so I guess the Apple TV will recognize multiple GCMicroGamepad controllers).
Before iOS 7 came out, we noticed an issue:
Music remote control from the earbuds or the Springboard can hijack our audio session even if we set the category to solo-ambient or another exclusive mode.
We thus tried a few things:
We tried to take ownership of the audio session back. But this requires that our audio code knows when to take it back and from whom. We thought we could let the app become the first responder to remote-control events, do our stuff, and then pass the events on to the music app. However, we found that the events get consumed by the first responder and there is no way to push them back down the responder chain.
We tried to become first responder and block remote-control events altogether when we are in solo-ambient. This worked fine on iOS 6 and still works with the earbud controls on iOS 7, but it fails with iOS 7's Control Center. Control Center seems to bypass the remote-control event handler remoteControlReceivedWithEvent: completely, which is where we put our blocking code.
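For completeness, the first-responder setup and the blocking hook look roughly like this (a sketch of our own view-controller code):

    // Become the remote-control target and swallow the events we receive.
    // Control Center on iOS 7 bypasses this handler entirely.
    - (void)viewDidAppear:(BOOL)animated {
        [super viewDidAppear:animated];
        [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
        [self becomeFirstResponder];
    }

    - (BOOL)canBecomeFirstResponder {
        return YES;
    }

    - (void)remoteControlReceivedWithEvent:(UIEvent *)event {
        if (event.type == UIEventTypeRemoteControl) {
            // Do nothing while the session is solo-ambient; on iOS 6 this was
            // enough to keep earbud/Springboard controls from hijacking audio.
        }
    }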
I read something elsewhere that:
You can't block the music app. your app can become one though (apple
won't like that) and then the control center would control yours.
But I found no documentation whatsoever about Control Center.
And as said above, Control Center does not go through the normal remote-control hooks even if an app is the first responder.
Another quote:
Remote Control Event handling is so your app can be controlled by
Control Center, the earbuds, etc... it is not so that your app can eat
said controls, preventing control of other apps from said sources. It
only worked in iOS6 because of a bug in iOS, now fixed in iOS7
Is it that what we were using relied on this bug? I find that hard to believe, because we got the solution on this list and on the Xcode mailing list, so I assumed it was an accepted solution.
Now we really wonder if we have been missing something from the very beginning:
Is solo-ambient really an exclusive mode for the audio session, or is the Music app an exception to that exclusivity?
How can our app live in harmony with the remote controls and Control Center?
Where can we find up-to-date documentation on remote control and Control Center?
The remote control has been mysteriously fixed after clean-building everything against the iOS 7 SDK. Now the app delegate can receive remote-control events from Control Center. However, the play/pause events are UIEventSubtypeRemoteControlPause and UIEventSubtypeRemoteControlPlay instead of iOS 6's UIEventSubtypeRemoteControlTogglePlayPause.
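So the handler now has to cover both variants; roughly like this (togglePlayback, play and pause stand in for our own playback methods):

    - (void)remoteControlReceivedWithEvent:(UIEvent *)event {
        if (event.type != UIEventTypeRemoteControl) {
            return;
        }
        switch (event.subtype) {
            case UIEventSubtypeRemoteControlTogglePlayPause: // earbuds / iOS 6
                [self togglePlayback];
                break;
            case UIEventSubtypeRemoteControlPlay:            // iOS 7 Control Center
                [self play];
                break;
            case UIEventSubtypeRemoteControlPause:           // iOS 7 Control Center
                [self pause];
                break;
            default:
                break;
        }
    }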
I'm working on an OS X project and I want to programmatically generate touchpad gesture events like NSEventTypeSwipe or NSEventTypeRotate, so I can rotate/zoom, etc., in other applications.
I found out how to generate mouse and keyboard events, but not touchpad gestures.
Any ideas?
There is no public API for generating those events.
You can find some work on synthesizing those events in this project: calftrail/Touch.
Reference: Cocoa Event Handling Guide
The Mac Developer Library guide above does not describe any known way to programmatically generate touchpad gestures.
It goes so far as to say that the interpretation of the gestures happens in the trackpad hardware, outside the OS:
"The trackpad hardware includes built-in support for interpreting common gestures and for mapping movements..."
The guide also explicitly says that apps should not rely on gestures as the sole input mechanism, and that it's best to include keyboard and mouse support for that reason.
Now that Mac and Windows (i.e., Windows 8) support touchscreen monitors at the OS level, it's only a matter of time before touchpad and touchscreen gestures can be generated programmatically and incorporated into services like your project, or into remote desktop control, once the appropriate API becomes available.
I think touch gesture events cannot easily be generated, as there is no public, official Apple API for it. The NSEventTypeMagnify, NSEventTypeRotate and NSEventTypeSwipe types are, I think, only for reading while handling existing system events. Maybe Apple doesn't want third-party developers to turn the Magic Mouse into a magic touchpad. The project rob mayoff mentioned is currently not working, as Apple has probably changed something in the structure of the event data, so relying on that kind of hack isn't future-proof.
But if you think about it a little more, you can achieve what the touch events would do by other means:
Magnification (pinch gesture) -> it's just zoom in / zoom out -> most programs use a keyboard shortcut for this, like Cmd and + or Cmd and - (see the key-event sketch after this list).
Rotation -> mostly useful with photos, and there are shortcuts like Cmd and L or Cmd and R in the Preview app.
Swiping -> changing Spaces (desktops) -> use Ctrl and the left/right arrow keys.
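To drive one of those shortcuts programmatically you can post synthetic key events with the public CGEvent API. A rough sketch for zoom in (key code 24 is the '=' / '+' key on a US keyboard layout, so that part is an assumption):

    #include <ApplicationServices/ApplicationServices.h>

    // Post Cmd and + as synthetic key events, i.e. "zoom in" in most apps,
    // as a stand-in for a real pinch gesture.
    static void PostZoomInShortcut(void) {
        CGEventSourceRef source = CGEventSourceCreate(kCGEventSourceStateHIDSystemState);
        CGEventRef keyDown = CGEventCreateKeyboardEvent(source, (CGKeyCode)24, true);
        CGEventRef keyUp   = CGEventCreateKeyboardEvent(source, (CGKeyCode)24, false);
        CGEventSetFlags(keyDown, kCGEventFlagMaskCommand);
        CGEventSetFlags(keyUp, kCGEventFlagMaskCommand);
        CGEventPost(kCGHIDEventTap, keyDown);
        CGEventPost(kCGHIDEventTap, keyUp);
        CFRelease(keyDown);
        CFRelease(keyUp);
        CFRelease(source);
    }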
Looking at the Xcode simulator, I understand that touch events can be generated programmatically, and that the simulator uses such routines to translate cursor input into touch events. However, based on what rob mayoff said, it seems Apple did not make that library public. Emulating the same behavior and writing those functions from scratch would be a bit challenging.
This is the NSTouch class reference:
http://developer.apple.com/library/mac/#documentation/AppKit/Reference/NSTouch_Class/Reference/Reference.html#//apple_ref/occ/cl/NSTouch
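For what it's worth, reading those touches (as opposed to generating them) is straightforward in a custom NSView; a rough sketch:

    // Observing trackpad touches with NSTouch; this is the read-only side.
    // The view must have called [self setAcceptsTouchEvents:YES] first,
    // e.g. in initWithFrame: or awakeFromNib.
    - (void)touchesBeganWithEvent:(NSEvent *)event {
        NSSet *touches = [event touchesMatchingPhase:NSTouchPhaseBegan inView:self];
        for (NSTouch *touch in touches) {
            NSPoint pos = touch.normalizedPosition; // 0..1 across the trackpad
            NSLog(@"touch began at %.2f, %.2f", pos.x, pos.y);
        }
    }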
I have an installation project in mind that involves a hacked iPad: I'd like to have a background process running that records all touch events, regardless of which app is running in the foreground, and sends them out via OSC.
Note that this is using a jailbroken iPad with root access, and users will be alerted about not entering any sensitive data. But I'm not an iOS developer so I'm not sure if this is even possible. I'd appreciate any kind of input/suggestions.
[edit] Since someone questioned my motive behind this question, I'll try to explain a bit. To be specific, I'd like to build a mechanical system with an Arduino that emulates the user's touch input on the iPad, but I do not want to limit users to an app that does nothing but record touch events.
There are three options:
Use the IOHIDFamily subsystem to capture all the touch events. This will do most of the processing for you; the only thing you'll need to do is fetch the events using a HID client, get their types, and, if they are touch events, read their position, radius and whatever else you need (a rough sketch follows after this list).
Use the MultitouchSupport framework. This way you will have to process the digitizer data frames manually, which is tricky.
Use a MobileSubstrate hook to hook the already existing HID client inside SpringBoard.
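A very rough sketch of option 1, using the commonly circulated reverse-engineered IOHIDEventSystemClient prototypes. These are private IOHIDFamily interfaces, so the names and behavior can differ between iOS versions; treat this as a starting point, not working code:

    #include <CoreFoundation/CoreFoundation.h>

    // Private IOHIDFamily interfaces; prototypes reproduced from
    // reverse-engineered headers used on jailbroken devices, NOT public API.
    typedef struct __IOHIDEventSystemClient *IOHIDEventSystemClientRef;
    typedef struct __IOHIDService *IOHIDServiceRef;
    typedef struct __IOHIDEvent *IOHIDEventRef;

    extern IOHIDEventSystemClientRef IOHIDEventSystemClientCreate(CFAllocatorRef allocator);
    extern void IOHIDEventSystemClientScheduleWithRunLoop(IOHIDEventSystemClientRef client,
                                                          CFRunLoopRef runLoop,
                                                          CFStringRef mode);
    extern void IOHIDEventSystemClientRegisterEventCallback(IOHIDEventSystemClientRef client,
                                                            void (*callback)(void *target, void *refcon,
                                                                             IOHIDServiceRef service,
                                                                             IOHIDEventRef event),
                                                            void *target, void *refcon);
    extern int IOHIDEventGetType(IOHIDEventRef event);

    #define kIOHIDEventTypeDigitizer 11  // touch/digitizer events (IOHIDEventTypes.h)

    static void HandleHIDEvent(void *target, void *refcon,
                               IOHIDServiceRef service, IOHIDEventRef event) {
        if (IOHIDEventGetType(event) == kIOHIDEventTypeDigitizer) {
            // Pull position/radius out with IOHIDEventGetFloatValue() and
            // forward the data over OSC from here.
        }
    }

    int main(void) {
        IOHIDEventSystemClientRef client = IOHIDEventSystemClientCreate(kCFAllocatorDefault);
        IOHIDEventSystemClientScheduleWithRunLoop(client, CFRunLoopGetCurrent(), kCFRunLoopDefaultMode);
        IOHIDEventSystemClientRegisterEventCallback(client, HandleHIDEvent, NULL, NULL);
        CFRunLoopRun();  // keep the background process alive
        return 0;
    }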