React Native Windows: Emit keypresses to the system

I am currently evaluating the possibility of emitting keypresses (like left / right arrow or page up / page down) programmatically to the system with React Native on Windows or macOS.
The use case would be a remote clicker for PowerPoint presentations.
All I can find online is how to send keypress events to the RN app itself, not to the system.
Is that possible at all, or would I need to build this in e.g. Java?
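To make the intent concrete, what I am imagining on the JavaScript side is roughly the sketch below. The SystemKeySender module is purely hypothetical (it does not exist as a package); the actual key injection would have to live in native code, e.g. wrapping SendInput on Windows or CGEventPost on macOS:

// Hypothetical sketch: "SystemKeySender" is an assumed custom native module,
// not an existing react-native or react-native-windows API.
import { NativeModules } from 'react-native';

const { SystemKeySender } = NativeModules;

export function nextSlide(): void {
  SystemKeySender.sendKey('PageDown'); // hypothetical method
}

export function previousSlide(): void {
  SystemKeySender.sendKey('PageUp'); // hypothetical method
}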
Thanks for your ideas in advance!

Related

Trigger keyboard voice input

I use react-native with Expo (both latest) and I am trying to use speech-to-text.
So far I have found two solutions:
react-native-voice, which requires ejecting the app, which I do not want to do.
A Google API call, which is not free.
Neither of these solutions is ideal in my case.
The thing is that I can already dictate into an input using the speech-to-text built into the native keyboard, but I would like to trigger it at the press of a button, without having to open the keyboard and tap the microphone.
Is there a way to do this? The React Native Keyboard docs don't mention it.
Thanks

React Native iOS simulator only performs fetch on click

I have a weird bug in my simulator. iOS / React Native makes the fetch call, but waits until I click inside the simulator to actually display the data (and to show the resulting actions from Redux in my console.log).
Has anyone experienced this behaviour before?
If you're using remote JS debugging (in Chrome for example), try disabling it.
Update: Rather than disabling JS debugging completely, you can instead deactivate breakpoints and turn off Pause on exceptions, which usually means you don't have to click to nudge the code along.
To do so in Chrome, click the Sources tab in the inspector, then the relevant file, and on the right make sure the two options are disabled: one looks like a bullet with a line through it (you want this to be blue), and the other looks like a pause icon in a circle (you want this to be grey).

Microsoft PixelSense input simulator: finger stays pressed

I have a problem with the Input Simulator in the Microsoft Surface SDK 2.0.
Whenever I try to simulate a finger or a blob click, the first click stays pressed (like when you place it as a placeholder with right click + left click).
I don't know the reason for this behaviour, since on other computers it works without problems.
Could it be because I'm running Windows in a virtual machine? If so, is there any workaround?
I fixed this by disabling the touch pen utilities that were automatically installed with the Surface SDK.

VB.NET keyboard tracker

Hello, I have a VB.NET application and I would like to add a keyboard key-press capturing system to it, so I can track any key that is pressed in any application running on my computer that uses the keyboard.
If anyone has an idea, thanks for sharing it.
I hope my question was clear.
Thanks.
You will need to use a global keyboard hook; take a look at this CodePlex project. It will allow you to intercept global keyboard events.
From the link:
This library allows you to tap keyboard and mouse and to detect and record their activity even when an application is inactive and runs in the background.
This library attaches to Windows global hooks, tracks keyboard and mouse clicks and movement, and raises common .NET events with KeyEventArgs and MouseEventArgs, so you can easily retrieve any information you need.

Adobe AIR Keyboard Hook

I'm trying to add a feature to my AIR app that can listen for (configurable) global keyboard events even when the app is minimized. Ex: CTRL-ALT-SHIFT-F12 to grab a screenshot.
I can't find any way to register a keyboard hook, and listening for keyboard events only captures them when the app has focus. Suggestions?
I don't think that Adobe AIR programs can process keypress events unless the application is in focus.
http://forums.adobe.com/thread/420446
Even this question regarding a Global handler for keypresses states that the application must be in focus.
Try hooking onto the stage's KeyboardEvent:
stage.addEventListener(KeyboardEvent.KEY_DOWN, KeyHandler);

function KeyHandler(e:KeyboardEvent):void {
    trace("Key Code: " + e.keyCode);
    trace("Control? " + e.ctrlKey);
    trace("Shift? " + e.shiftKey);
    trace("Alt? " + e.altKey);
}
With NativeProcess, you could write an external app pretty easily to listen for global keyboard events and send them back to your AIR app. I might be going down this path now...
I'm testing my AIR application in Flash CS5 and I need to disable keyboard shortcuts so I can test my own shortcuts. I can get Ctrl-F to work, but Ctrl-C will not.
I notice that my keyboard shortcuts WILL work if it's a standard AS3 file that I'm testing.
One method I use is to monitor the clipboard in the AIR app; that only lets you react to copied data, but it's at least sort of a way to listen for input when the app does not have focus.