While developing Windows Store apps and WP8 apps, is it possible to test the "touch" gestures while using a laptop that doesn't support touch? Are there parallel mouse or keyboard actions/combinations that will do the same thing as a "pinch" or a "flick" gesture (to imitate semantic zoom and unzoom) for example?
When you use the simulator, there are buttons at the right that let you simulate multi-touch. It's not convenient, but it mostly works.
http://msdn.microsoft.com/en-us/library/windows/apps/hh441475(v=vs.110).aspx
I haven't been able to use the WP8 SDK yet, but if it is like the WP7 SDK (and from what I hear from others, it is), use the simulator that comes with the SDK along with Multitouch Vista to emulate the touches; see this Guide to using Multitouch Vista. You will just need any other USB mouse (note, though, that using the built-in trackpad as the second mouse has been hit and miss in my experience with Multitouch Vista).
For Windows Store apps, the easiest way to simulate gestures is by using the provided simulator. You can still use Multitouch Vista, but the dots that track the touch points won't show in the Metro environment.
I just found a great Bluetooth keyboard by Brydge for the Pixel Slate and am wondering: can current ChromeOS devices manually switch from "tablet" to "desktop" mode when you attach a Bluetooth keyboard?
It seems like this is feasible by enabling a setting in developer mode. Thanks to #Skawtnyc's response on Reddit!
You can switch back and forth on the fly. Enable chrome://flags/#ash-debug-shortcuts and then you can use CTRL+SHIFT+ALT+T to switch back and forth between desktop and tablet mode.
If you connect a mouse it will automatically switch to desktop mode, but using this method you can still switch back and forth.
#Skawtnyc also points out a bug to watch out for when using an external display such as a monitor.
Keep in mind that when connected to a dock with an external display, this setup can trigger a crash/restart error. The cause is the virtual keyboard. Anything that causes the virtual keyboard to appear on screen will trigger it. The workaround is to go into Accessibility settings and turn off the on-screen keyboard. You don't need it anyway when using a physical keyboard.
I know this might seem odd, but I am working on a Windows Metro app which would be displayed on touch-screen monitors at our local university.
Now I am using the simulator for debugging, but in the simulator you have to start "Touch Mode" to even use the touch interface.
So when using the touch monitors, do we have to specifically enable touch mode, or will the touch functionality be integrated automatically?
Thank you.
Touch is a first-class citizen in Windows Store applications, so no special accommodations are needed. I would recommend you test on a touch device before deploying, though: it's a different way of interacting, and even though the simulator does a decent job of handling the mechanics, it will "feel" different to a user, especially if you're leveraging pinch-zoom, swipe, and other gestures.
On another note... is this app intended for a kiosk-type application? If so, keep in mind with Windows 8/RT, you won't be able to easily prevent the users from swiping to the charms, navigating to other programs, etc. You may want/need to take a look at Windows 8 Embedded depending on the specific deployment requirements.
I'm working on an OS X project and I want to programmatically generate touchpad gesture events like NSEventTypeSwipe or NSEventTypeRotate,
so I can rotate/zoom, etc., in other applications.
I found out how to generate mouse/keyboard events but not touchpad.
Any idea?
There is no public API for generating those events.
You can find some work on synthesizing those events in this project: calftrail/Touch.
Reference: Cocoa Event Handling Guide
The Mac Developer Library guide above does not describe any known method for programmatically generating trackpad gestures.
It goes so far as to say the touchpad gestures themselves occur outside of the OS:
"The trackpad hardware includes built-in support for interpreting common gestures and for mapping movements..."
That guide also explicitly mentions that apps should not rely on gestures as their sole input mechanism, and for that reason it's best to include keyboard and mouse support as well.
Now that Mac and Windows (i.e., Windows 8) support touchscreen monitors at the OS level, it's only a matter of time before touchpad and touchscreen gestures can be generated programmatically and incorporated into services like your project or remote desktop control, once the appropriate API becomes available.
I think that touch gesture events cannot be easily generated, as there is no public, official Apple API. The NSEventTypeMagnify, NSEventTypeRotate, and NSEventTypeSwipe types are, I think, only for reading while handling existing system events. Maybe Apple doesn't want third-party developers turning the Magic Mouse into a magic touchpad. The project rob mayoff mentioned is currently not working, as Apple has probably changed something in the structure of the event data. So relying on such hacks isn't future-proof.
But if you think about it a little more, you can achieve what the touch events do by other means (a small sketch follows the list below).
magnification (pinch gesture) -> it's just zoom in/zoom out -> most programs use a shortcut for this, like Cmd and +, Cmd and -.
rotation -> useful with photos; there are shortcuts like Cmd and L, Cmd and R in the Preview app.
swiping -> changing Spaces (desktops) -> use Ctrl and the left/right arrow keys.
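For example, those shortcut equivalents can be synthesized with the public CGEvent API. A minimal sketch, assuming the frontmost app zooms with Cmd-+ (virtual key code 24, the '=' key on a US ANSI layout) and that the posting process has been granted Accessibility permission:

    import CoreGraphics

    // Posts Cmd-+ (zoom in) to the frontmost application by synthesizing
    // a keyboard shortcut instead of a pinch gesture.
    func postZoomInShortcut() {
        let source = CGEventSource(stateID: .hidSystemState)
        let zoomKey: CGKeyCode = 24  // kVK_ANSI_Equal ('=' / '+') on a US layout

        guard let keyDown = CGEvent(keyboardEventSource: source, virtualKey: zoomKey, keyDown: true),
              let keyUp   = CGEvent(keyboardEventSource: source, virtualKey: zoomKey, keyDown: false)
        else { return }

        keyDown.flags = .maskCommand   // hold Cmd for both key events
        keyUp.flags = .maskCommand

        keyDown.post(tap: .cghidEventTap)
        keyUp.post(tap: .cghidEventTap)
    }

The same pattern covers Cmd and - for zoom out or Ctrl plus an arrow key for switching Spaces; it won't reproduce the continuous feel of a gesture, but it triggers the same end result in apps that honor those shortcuts.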
Looking at the Xcode simulator, I understand that touch events can be generated programmatically, and that the simulator uses those routines and functions to translate cursor movements into touch events. However, based on what rob mayoff said, it seems that Apple did not make that library available to the public. Emulating the same behavior and creating such functions from scratch would be a bit challenging.
This is the NSTouch class reference:
http://developer.apple.com/library/mac/#documentation/AppKit/Reference/NSTouch_Class/Reference/Reference.html#//apple_ref/occ/cl/NSTouch
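For completeness, the read-only side documented in that class reference looks roughly like this: a minimal sketch (the TouchReadingView name is just illustrative) of an NSView subclass that opts into trackpad touches and logs them. It shows why NSTouch helps with receiving gestures but not with injecting them into other apps:

    import AppKit

    // Sketch of the read-only NSTouch usage the class reference documents.
    class TouchReadingView: NSView {
        override func viewDidMoveToWindow() {
            super.viewDidMoveToWindow()
            allowedTouchTypes = [.indirect]   // receive trackpad (indirect) touches
        }

        override func touchesBegan(with event: NSEvent) {
            // Each NSTouch reports a normalizedPosition in [0, 1] x [0, 1]
            // trackpad coordinates, independent of the on-screen cursor.
            for touch in event.touches(matching: .began, in: self) {
                print("touch began at \(touch.normalizedPosition)")
            }
        }
    }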
Are we only going to be able to create full screen Metro-style applications?
Yes:
Metro style apps are full screen apps tailored to your users' needs, tailored to the device they run on, tailored for touch interaction, and tailored to the Windows user interface.
Otherwise, as John Gardner points out, your app would not be a Metro-style app: if it exists on the desktop, it is by definition a desktop app.
But that's kind of the point of Metro.
You don't always have full screen either; depending on screen size, you can have two applications visible.
If you want to use the desktop, you fall back to the standard Windows desktop and use standard desktop applications.
Yes, but you can resize your app by using the Snap feature.
I don't know exactly what your requirements are, but take a look at this and this video.
Is there a way to programmatically mute the sound on OS X without using private APIs, in a way that is accepted by the Mac App Store?
FOR MAC OS X: This tutorial might be of assistance (a sketch of one public-API approach follows at the end of this answer).
FOR IOS: No.
Sound mute is a system-wide setting, so applications developed using the official SDK cannot change (and in most cases cannot even access) system-wide settings.
It is technically possible to change the system volume through the private AVSystemController class in Celestial.framework, but doing so will prevent your app from getting Apple's approval.
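On the Mac side, in case the tutorial link above goes stale: one public-API route is CoreAudio's kAudioDevicePropertyMute on the default output device. A minimal sketch, assuming the default output device actually exposes a mute control on its output scope (not every device does, so check the returned OSStatus):

    import CoreAudio

    // Mutes (or unmutes) the default output device using only public CoreAudio calls.
    func setDefaultOutputMuted(_ muted: Bool) -> OSStatus {
        var deviceID = AudioObjectID(kAudioObjectUnknown)
        var size = UInt32(MemoryLayout<AudioObjectID>.size)
        var address = AudioObjectPropertyAddress(
            mSelector: kAudioHardwarePropertyDefaultOutputDevice,
            mScope: kAudioObjectPropertyScopeGlobal,
            mElement: kAudioObjectPropertyElementMain)  // kAudioObjectPropertyElementMaster on older SDKs

        // Look up the current default output device.
        let status = AudioObjectGetPropertyData(
            AudioObjectID(kAudioObjectSystemObject), &address, 0, nil, &size, &deviceID)
        guard status == kAudioHardwareNoError else { return status }

        // Flip the device's mute property on its output scope.
        var mute: UInt32 = muted ? 1 : 0
        address.mSelector = kAudioDevicePropertyMute
        address.mScope = kAudioDevicePropertyScopeOutput
        return AudioObjectSetPropertyData(
            deviceID, &address, 0, nil, UInt32(MemoryLayout<UInt32>.size), &mute)
    }

Whether a given use of this passes Mac App Store review is ultimately Apple's call, but it avoids private frameworks entirely.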
On the Mac, see Srikar's answer. I believe he's got it covered.
On iOS, [MPMusicPlayerController iPodMusicPlayer].volume = 0.0 will mute the audio output. If there is an MPVolumeView present in your view hierarchy, it will do this without visual feedback; if there is not, it may present the volume change popup you see when you press the hardware volume buttons on the device.