Is it possible to programmatically generate touchpad gesture events? - objective-c

I'm working on an OS X project and I want to programmatically generate touchpad gesture events like NSEventTypeSwipe or NSEventTypeRotate,
so I can rotate, zoom, etc. in other applications.
I have found out how to generate mouse and keyboard events, but not touchpad gestures.
Any ideas?

There is no public API for generating those events.
You can find some work on synthesizing those events in this project: calftrail/Touch.

Reference: Cocoa Event Handling Guide
The above Mac Developer Library guide does not describe any method for programmatically generating touchpad gestures.
It goes so far as to say that the gestures are interpreted in the trackpad hardware itself, outside the OS:
"The trackpad hardware includes built-in support for interpreting common gestures and for mapping movements..."
That guide also explicitly advises that apps should not rely on gestures as the sole input mechanism, and that it's best to include keyboard and mouse support for that reason.
Now that both the Mac and Windows (i.e., Windows 8) support touchscreen monitors at the OS level, it is probably only a matter of time before touchpad and touchscreen gestures can be generated programmatically and incorporated into services like your project, or into remote desktop control, once an appropriate API becomes available.

I think touch gesture events cannot easily be generated, as there is no public, official Apple API for doing so. The event types NSEventTypeMagnify, NSEventTypeRotate, and NSEventTypeSwipe appear to be read-only, for use while handling existing system events. Maybe Apple doesn't want third-party developers turning the Magic Mouse into a magic trackpad. The project rob mayoff mentioned is currently not working, as Apple has probably changed something in the structure of the event data, so relying on that kind of hack is not future-proof.
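For the read-only side, the public monitoring API does work. Here is a minimal sketch, assuming a running AppKit application, that just logs incoming gesture events:

    #import <Cocoa/Cocoa.h>

    // Keep the returned token so the monitor can later be removed
    // with +[NSEvent removeMonitor:].
    static id gestureMonitor;

    static void StartGestureLogging(void) {
        NSEventMask mask = NSEventMaskMagnify | NSEventMaskRotate | NSEventMaskSwipe;
        // A local monitor sees gesture events delivered to this application.
        gestureMonitor = [NSEvent addLocalMonitorForEventsMatchingMask:mask
                                                               handler:^NSEvent *(NSEvent *event) {
            switch (event.type) {
                case NSEventTypeMagnify:
                    NSLog(@"magnify: %f", event.magnification);
                    break;
                case NSEventTypeRotate:
                    NSLog(@"rotate: %f degrees", event.rotation);
                    break;
                case NSEventTypeSwipe:
                    NSLog(@"swipe: dx=%f dy=%f", event.deltaX, event.deltaY);
                    break;
                default:
                    break;
            }
            return event; // pass the event through unchanged
        }];
    }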
But if you think about it a little more, you can achieve what the touch events would do by other means (see the keyboard-event sketch after this list):
magnification (pinch gesture) -> it's just zoom in / zoom out -> most programs use a shortcut for this, like Cmd and +, Cmd and -.
rotation -> mostly useful with photos; Preview has shortcuts for it, like Cmd and L, Cmd and R.
swiping -> changing Spaces (desktops) -> use Ctrl and the left/right arrow keys.
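Here is a minimal sketch of posting such a shortcut with the public CGEvent API. The choice of Cmd and = for "zoom in" is an assumption about the frontmost app's key bindings, and on recent OS versions posting events may require the Accessibility permission:

    #import <ApplicationServices/ApplicationServices.h>

    // Posts Cmd+<keyCode> to the system, as if typed on the keyboard.
    static void PostCommandShortcut(CGKeyCode keyCode) {
        CGEventSourceRef source = CGEventSourceCreate(kCGEventSourceStateHIDSystemState);
        CGEventRef keyDown = CGEventCreateKeyboardEvent(source, keyCode, true);
        CGEventRef keyUp   = CGEventCreateKeyboardEvent(source, keyCode, false);
        CGEventSetFlags(keyDown, kCGEventFlagMaskCommand); // hold Cmd
        CGEventSetFlags(keyUp, kCGEventFlagMaskCommand);
        CGEventPost(kCGHIDEventTap, keyDown);
        CGEventPost(kCGHIDEventTap, keyUp);
        CFRelease(keyDown);
        CFRelease(keyUp);
        CFRelease(source);
    }

    // Example: pinch-to-zoom-in becomes Cmd and = (ANSI virtual key
    // code 24, kVK_ANSI_Equal in <Carbon/Carbon.h>):
    //     PostCommandShortcut(24);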

Looking at the Xcode iOS Simulator, touch events evidently can be generated programmatically, and the simulator must be using such routines to translate cursor clicks into touch events. However, based on what rob mayoff said, it seems that Apple did not make that library public. Emulating the same behavior and writing those functions from scratch would be quite challenging.
This is the NSTouch class reference:
http://developer.apple.com/library/mac/#documentation/AppKit/Reference/NSTouch_Class/Reference/Reference.html#//apple_ref/occ/cl/NSTouch

Related

tvOS - game control via non-Siri remote

I'm working on a game in Objective-C. The Siri remote works great via GCMicroGamepad and real MFi controllers work well via GCGamepad. However, third-party IR remotes do not work at all in-game (and neither does the Remote App on iPhone or an older Apple TV 3rd gen remote).
How can I recognize and distinguish these inputs?
Two days later... I have found that a UITapGestureRecognizer can be used to detect Up, Down, Left, Right and Select events correctly when presented with a third-party TV remote or the iPhone Remote.app. The directional events are actually unique to these types of remotes as well; the Siri remote does not generate directional tap events. Unfortunately, tapping the Select button on either the Siri remote, a third-party remote or the iPhone Remote.app will generate a Select event from my tap recognizer, and I need some way to distinguish the two.
The only distinguishing factor I can find is that tapping the Siri remote also generates a button-A press on the GCMicroGamepad; a third-party remote or the iPhone Remote.app does not affect the GCMicroGamepad at all. But it's extremely inelegant to watch the GCMicroGamepad for tap-release events and then use them to filter out matching Select button events. It's certainly not a recommended use of the APIs, and it doesn't seem like a good long-term solution. If I could tell the Siri remote to stop generating UI events while in GCMicroGamepad mode, that would be excellent.
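For reference, the recognizer setup described above might look roughly like this (a sketch; the remoteUp:/remoteSelect:-style action selectors are hypothetical names):

    #import <UIKit/UIKit.h>

    // In a UIViewController subclass: one recognizer per press type, so
    // each remote button arrives as a distinct action.
    - (void)installRemoteTapRecognizers {
        NSDictionary<NSNumber *, NSString *> *bindings = @{
            @(UIPressTypeUpArrow)    : @"remoteUp:",
            @(UIPressTypeDownArrow)  : @"remoteDown:",
            @(UIPressTypeLeftArrow)  : @"remoteLeft:",
            @(UIPressTypeRightArrow) : @"remoteRight:",
            @(UIPressTypeSelect)     : @"remoteSelect:",
        };
        for (NSNumber *pressType in bindings) {
            UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc]
                initWithTarget:self
                        action:NSSelectorFromString(bindings[pressType])];
            tap.allowedPressTypes = @[pressType];
            [self.view addGestureRecognizer:tap];
        }
    }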
I cannot test this right now, but you could probably differentiate the Siri Remote from a third party remote by using a GCEventViewController with the controllerUserInteractionEnabled property set to false. This way, the Siri Remote inputs shouldn't get passed to UIKit (when the GCEventViewController is the first responder). The third-party remote's input events might go through to UIKit since, unlike the Siri Remote, it's not a GCMicroGamepad.
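A sketch of that approach (GameViewController is a hypothetical name):

    #import <GameController/GameController.h>

    // Subclassing GCEventViewController lets you control whether controller
    // input (including the Siri Remote's GCMicroGamepad) is also delivered
    // to UIKit while this controller is in the responder chain.
    @interface GameViewController : GCEventViewController
    @end

    @implementation GameViewController

    - (void)viewDidLoad {
        [super viewDidLoad];
        // NO: Siri Remote input stays in the GameController framework and
        // should not reach UIKit gesture recognizers; a third-party IR
        // remote is not a game controller, so its presses still arrive
        // as UIKit events.
        self.controllerUserInteractionEnabled = NO;
    }

    @end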
So far, Apple doesn't really support multiplayer games with multiple Siri Remotes, iOS Remotes or IR remotes. But I think that might be coming, because the Remote app on iOS will soon support multiplayer gaming (so I guess the Apple TV will recognize multiple GCMicroGamepad controllers).

Windows Metro Apps on touch Screen Monitors

I know this might seem odd, but I am working on a Windows Metro app which will be displayed on touch-screen monitors at our local university.
Right now I am using the simulator for debugging, but in the simulator you have to enable "Touch Mode" to even use the touch interface.
So when the app runs on the touch monitors, do we have to specifically enable touch mode, or will the touch functionality work automatically?
Thank you.
Touch is a first-class citizen in Windows Store applications, so no special accommodations are needed. I would recommend you test on a touch device before deploying, though; it's a different way of interacting, and even though the simulator does a decent job of handling the mechanics, it will "feel" different to a user, especially if you're leveraging pinch-zoom, swipe and other gestures.
On another note... is this app intended for a kiosk-type deployment? If so, keep in mind that with Windows 8/RT you won't be able to easily prevent users from swiping to the charms, navigating to other programs, etc. You may want or need to take a look at Windows 8 Embedded, depending on the specific deployment requirements.

Is it possible to test touch gestures with a "non-touchable" laptop?

While developing Windows Store apps and WP8 apps, is it possible to test the "touch" gestures while using a laptop that doesn't support touch? Are there parallel mouse or keyboard actions/combinations that will do the same thing as a "pinch" or a "flick" gesture (to imitate semantic zoom and unzoom) for example?
When you use the simulator, there are buttons at the right that let you simulate multi-touch. It's not convenient, but it mostly works.
http://msdn.microsoft.com/en-us/library/windows/apps/hh441475(v=vs.110).aspx
I haven't been able to use the WP8 SDK yet, but if it is like the WP7 SDK (and from what I hear from others, it is), use the emulator that comes with the SDK along with Multitouch Vista to emulate the touches, following this guide to using Multitouch Vista. You will just need a second USB mouse; note, though, that using the built-in trackpad as the second mouse has been hit-and-miss in my experience with Multitouch Vista.
For Windows Store apps, the easiest way to simulate gestures is by using the provided simulator. You can still use Multitouch Vista, but the dots that track the touch points won't show in the Metro environment.

Customizing Multi-Touch Gestures (OS X)

Is it possible to create custom multi-touch gestures for OS X? I know you can now enable your own gestures through System Preferences, but my question is whether it's possible for a gesture to perform an action that isn't listed among the Trackpad options in the OS defaults. For example, let's say I want to close an application by swiping with three fingers, or something along those lines.
I understand that this would most likely require an outside application to be written, but I'm just trying to get an idea of where to start reading about this (the Objective-C functions to look at, etc.).
Thanks for the help!
Yes: opt your view into touch events and override NSResponder's touchesBeganWithEvent: (and its moved/ended/cancelled siblings), then write your own code to interpret the corresponding touch events; see the sketch below.
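A minimal sketch (note this only sees touches made over your own application's views; reacting to gestures system-wide, over other apps, is a much harder problem):

    #import <Cocoa/Cocoa.h>

    @interface GestureView : NSView
    @end

    @implementation GestureView

    - (instancetype)initWithFrame:(NSRect)frame {
        if ((self = [super initWithFrame:frame])) {
            self.acceptsTouchEvents = YES; // opt in to NSTouch delivery
        }
        return self;
    }

    - (void)touchesBeganWithEvent:(NSEvent *)event {
        NSSet<NSTouch *> *touches =
            [event touchesMatchingPhase:NSTouchPhaseBegan inView:self];
        if (touches.count == 3) {
            // Three fingers down: start tracking a possible three-finger
            // swipe in touchesMovedWithEvent: / touchesEndedWithEvent:.
            NSLog(@"three-finger touch began");
        }
    }

    @end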

How to implement simple javascript touch events for webkit on WebOS while developing web content

Did anyone come across difficulties while trying to implement simple JS touch events for WebKit on webOS, and find a solution?
Basically my touch start, touch move, and touch end events are not being detected.
Thanks
Palm webOS's WebKit doesn't implement touch events at this time. Instead, the system sends gesture events. These aren't especially well documented, but they are used in the mojomatters sample code that's part of the Palm webOS SDK. Look at the gesture-assistant.js source file for gestureStart, gestureChange, and gestureEnd.