Setting up a cryptographic key between an Arduino and a React Native app

I am new to Arduino and working on a project to pass cryptographic keys between an Arduino (ESP8266) and a React Native app. On the ESP8266 side I am using the Arduino Crypto library, and on the React Native side I am using react-native-crypto-js. The encryption doesn't seem to work correctly and returns garbage, so as a first step I am trying to pass the key from the Arduino to the mobile app over Bluetooth (an HC-05 module). The following code is used on the Arduino side:
void getConnected() {
  // wait until someone tries to connect
  if (btSerial.available() > 0) {
    String message = btSerial.readString();
    if (message == "Hello") {
      Serial.println("user trying to connect");
      byte key[32];
      device.getKey(key);
      Serial.println((char*)key);
      btSerial.write((char*)key);
    }
  }
}
The key is generated using RNG.rand(key, sizeof(key)). I also printed the generated bytes separately, and they look like this:
137 224 186 115 0 0 0 0 172 228 254 63 131 53 32 64 208 218 255 63 13 0 0 0 172 228 254 63 48 70 32 64
As you can see, the key contains zero bytes, so the React Native app only receives the first 4 bytes; everything after the first 0 is dropped, because the key is being treated as a null-terminated string.
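I suspect the fix is to send the key as a fixed-length buffer instead of a C string, something like the sketch below (using the Stream/Print write(buffer, length) overload, and assuming the same btSerial and device objects as above), but I'm not sure:

byte key[32];
device.getKey(key);

// Send all 32 bytes explicitly so embedded 0 bytes no longer truncate the key.
btSerial.write(key, sizeof(key));

// Print the bytes as numbers for debugging; println((char*)key) would also
// stop at the first 0 byte.
for (size_t i = 0; i < sizeof(key); i++) {
  Serial.print(key[i]);
  Serial.print(' ');
}
Serial.println();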
The code used in the App is as below:
async setup(deviceId) {
  console.log('connecting');
  await BluetoothSerial.connect(deviceId);
  console.log('connected');
  await BluetoothSerial.write('Hello');
  setTimeout(async () => {
    let key = await BluetoothSerial.readFromDevice();
    console.log(key);
  }, 3000);
  // const input = await BluetoothSerial.readFromDevice();
  return true;
}
I would really appreciate some pointers. Please help.
Thanks

Related

NSMutableDictionary cannot be converted to ASAuthorizationAppleIDRequest

I’m implementing Apple Sign In in React Native using this library:
https://github.com/invertase/react-native-apple-authentication
And there’s a handler for the onPress event of the Apple Sign-In button given like so:
import { appleAuth } from '@invertase/react-native-apple-authentication';

async function onAppleButtonPress() {
  // performs login request
  const appleAuthRequestResponse = await appleAuth.performRequest({
    requestedOperation: appleAuth.Operation.LOGIN,
    requestedScopes: [appleAuth.Scope.EMAIL, appleAuth.Scope.FULL_NAME],
  });

  // get current authentication state for user
  // /!\ This method must be tested on a real device. On the iOS simulator it always throws an error.
  const credentialState = await appleAuth.getCredentialStateForUser(appleAuthRequestResponse.user);

  // use credentialState response to ensure the user is authenticated
  if (credentialState === appleAuth.State.AUTHORIZED) {
    // user is authenticated
  }
}
However, when I convert this to ClojureScript like so:
(reg-fx
 :apple-signin-fx
 (fn [navigation]
   (go
     (let [appleAuthRequestResponse
           (<p! (.performRequest appleAuth
                                 (clj->js
                                  {:requestedOperation (.. appleAuth -Operation -LOGIN)
                                   :requestedScopes [(.. appleAuth -Scope -Email)
                                                     (.. appleAuth -Scope -FULL_NAME)]})))]))))
I’m getting the following error:
JSON value '{
nonceEnabled = 1;
requestedOperation = 1;
requestedScopes = (
"<null>",
1
);
}' of type NSMutableDictionary cannot be converted to ASAuthorizationAppleIDRequest *
+[RCTConvert(ASAuthorizationAppleIDRequest) ASAuthorizationAppleIDRequest:]
RCTConvert+ASAuthorizationAppleIDRequest.m:85
__41-[RCTModuleMethod processMethodSignature]_block_invoke_16
-[RCTModuleMethod invokeWithBridge:module:arguments:]
facebook::react::invokeInner(RCTBridge*, RCTModuleData*, unsigned int, folly::dynamic const&)
facebook::react::RCTNativeModule::invoke(unsigned int, folly::dynamic&&, int)::$_0::operator()() const
invocation function for block in facebook::react::RCTNativeModule::invoke(unsigned int, folly::dynamic&&, int)
_dispatch_call_block_and_release
_dispatch_client_callout
_dispatch_main_queue_callback_4CF
__CFRUNLOOP_IS_SERVICING_THE_MAIN_DISPATCH_QUEUE__
__CFRunLoopRun
CFRunLoopRunSpecific
GSEventRunModal
-[UIApplication _run]
UIApplicationMain
main
start
How to fix this?
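One hedged observation, not a confirmed fix: the "<null>" entry in requestedScopes suggests that (.. appleAuth -Scope -Email) resolves to nil, since the JavaScript example above uses appleAuth.Scope.EMAIL in all caps. Matching that casing would give:

;; Sketch: use the same constant names as the JS example (EMAIL, FULL_NAME)
;; so that neither scope comes through as nil / "<null>".
(clj->js
 {:requestedOperation (.. appleAuth -Operation -LOGIN)
  :requestedScopes    [(.. appleAuth -Scope -EMAIL)
                       (.. appleAuth -Scope -FULL_NAME)]})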

Polidea BLE: encoding a packet to send as base64

I want to send a command to my scooter via Polidea BLE, but I don't know how to compose the packet and encode it to base64; I tried different ways but it doesn't seem to work. Here is the documentation for how I need to build the packet:
APP 🡪 Bluetooth
START_PACK, OPCODESEND, LENGTH, D0, CHECKSUM (START_PACK = 0x55)
OPCODESEND:
0x02 – Speed Limit
D0 🡪 1 is 6 km/h, 2 is 12 km/h, 3 is 20 km/h, 4 is 25 km/h, 5 is no speed limit
0x03 – Change Zero Start
D0 🡪 0 Zero Start OFF, 1 Zero Start ON
0x05 – Lock/Unlock Scooter
D0 🡪 0 Unlock, 1 Lock
0x06 – On/Off light from display
D0 🡪 0 light OFF, 1 light ON
For example, how should the package look to turn on the light?
My understanding of the documentation you have included is that, to turn on the light, the code to create the packet in base64 would be:
import { Buffer } from "buffer";
var start_pack = 0x55;
var opcode = 0x06 // light
var action = 0x01 // On
var length = 0x01 // Length of what? action?
var checksum = start_pack + opcode + action + length
var valueBytes = Buffer.alloc(5);
valueBytes[0] = start_pack;
valueBytes[1] = opcode;
valueBytes[2] = length;
valueBytes[3] = action;
valueBytes[4] = checksum;
var valueBase64 = valueBytes.toString('base64')
console.log("Data to write: " + valueBase64)
This gave me an output of:
Data to write: VQYBAV0=
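Generalizing the same guesses (LENGTH being the one-byte payload length, and the checksum being the sum of the preceding bytes truncated to one byte, neither of which the documentation excerpt confirms), a small hypothetical helper could build any of the listed commands:

import { Buffer } from "buffer";

const START_PACK = 0x55;

// Hypothetical helper: assumes LENGTH = 1 (one data byte) and
// CHECKSUM = low byte of the sum of all preceding bytes.
function buildPacket(opcode, d0) {
  const bytes = [START_PACK, opcode, 0x01, d0];
  const checksum = bytes.reduce((sum, b) => sum + b, 0) & 0xff;
  return Buffer.from([...bytes, checksum]).toString("base64");
}

console.log(buildPacket(0x06, 0x01)); // light ON -> "VQYBAV0=", same as above
console.log(buildPacket(0x05, 0x01)); // lock the scooter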

Setting an AVAudioFormat on the connect function causes a crash

I have been trying to play signed 16-bit stream data with AVAudioEngine, but passing an AVAudioFormat to the connect function always causes a crash.
The code looks like this:
let AUDIO_OUTPUT_SAMPLE_RATE = 44100
let AUDIO_OUTPUT_CHANNELS = 2
let AUDIO_OUTPUT_BITS = 16
var audioEngine: AVAudioEngine?
var audioPlayer: AVAudioPlayerNode?
...
audioEngine = AVAudioEngine()
audioPlayer = AVAudioPlayerNode()
audioEngine?.attach(audioPlayer!)
let mixer = audioEngine?.mainMixerNode
mixer!.outputVolume = 1.0
let stereoFormat = AVAudioFormat(commonFormat: .pcmFormatInt16, sampleRate: Double(AUDIO_OUTPUT_SAMPLE_RATE), channels: 2, interleaved: false)
audioEngine!.connect(audioPlayer!, to: mixer!, format: stereoFormat)
...
The audioEngine!.connect(...) line is the one that crashes.
I'm using Xcode 8 beta 6 on OS X El Capitan, and this happens on both the simulator and real devices.
This is part of crash message:
ERROR: >avae> AVAudioNode.mm:751: AUSetFormat: error -10868
*** Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: 'error -10868'
...
3 AVFAudio 0x000000011e0a5630 _Z19AVAE_RaiseExceptionP8NSStringz + 176
4 AVFAudio 0x000000011e0f270d _ZN19AVAudioNodeImplBase11AUSetFormatEP28OpaqueAudioComponentInstancejjP13AVAudioFormat + 213
5 AVFAudio 0x000000011e0f2630 _ZN19AVAudioNodeImplBase15SetOutputFormatEmP13AVAudioFormat + 46
6 AVFAudio 0x000000011e0f9663 _ZN21AVAudioPlayerNodeImpl15SetOutputFormatEmP13AVAudioFormat + 25
7 AVFAudio 0x000000011e099cfd _ZN18AVAudioEngineGraph8_ConnectEP19AVAudioNodeImplBaseS1_jjP13AVAudioFormat + 2377
8 AVFAudio 0x000000011e09d15f _ZN18AVAudioEngineGraph7ConnectEP11AVAudioNodeS1_mmP13AVAudioFormat + 355
9 AVFAudio 0x000000011e0fc80e _ZN17AVAudioEngineImpl7ConnectEP11AVAudioNodeS1_mmP13AVAudioFormat + 348
Playing a buffer with the format taken from an audio file works fine.
What mistake am I making?
Thanks.
-10868 is kAudioUnitErr_FormatNotSupported, so it looks like your .pcmFormatInt16 isn't accepted here. Changing it to .pcmFormatFloat32 does work.
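Applied to the snippet from the question, the minimal change would look like this (a sketch only; the variable names are the ones the question uses):

// Float32 matches the main mixer's native format, so AUSetFormat no longer
// fails with -10868 (kAudioUnitErr_FormatNotSupported).
let stereoFormat = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                                 sampleRate: Double(AUDIO_OUTPUT_SAMPLE_RATE),
                                 channels: 2,
                                 interleaved: false)
audioEngine!.connect(audioPlayer!, to: mixer!, format: stereoFormat)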
From Apple's AVAudioPlayerNode documentation:
When playing buffers, there's an implicit assumption that the buffers are at the same sample rate as the node's output format.
So print the player node's output format, which is engine.mainMixerNode's input format:
open func connectNodes() {
    print(engine.mainMixerNode.inputFormat(forBus: 0))
    engine.connect(playerNode, to: engine.mainMixerNode, format: readFormat)
}
The result is
<AVAudioFormat 0x6000024c18b0: 2 ch, 44100 Hz, Float32, non-inter>
So choose .pcmFormatFloat32 instead of .pcmFormatInt16.
See also the OSStatus lookup website for the error code.
You don't need to create the audio format yourself; get the audio format from the actual audio data (PCM buffer or audio file). The audio format guidance is covered in WWDC 2016, "Delivering an Exceptional Audio Experience", and the underlying theory in WWDC 2015, "What's New in Core Audio": take the actual audio format from the output and from the input, do the audio channel mapping, and keep the audio bit depth the same.
You can use a signed 16-bit audio format, but you have to convert it first.
// Set up your own format
let inputFormat = AVAudioFormat(
    commonFormat: .pcmFormatInt16,
    sampleRate: 44100,
    channels: AVAudioChannelCount(2),
    interleaved: true
)!

let engine = AVAudioEngine()

// Use the system format as the output format
let outputFormat = engine.mainMixerNode.outputFormat(forBus: 0)
self.converter = AVAudioConverter(from: inputFormat, to: outputFormat)!

self.playerNode = AVAudioPlayerNode()
engine.attach(playerNode)
engine.connect(playerNode, to: engine.mainMixerNode, format: nil)

...

// Prepare input and output buffers
let inputBuffer = AVAudioPCMBuffer(pcmFormat: inputFormat, frameCapacity: maxSamplesPerBuffer)!
let outputBuffer = AVAudioPCMBuffer(pcmFormat: outputFormat, frameCapacity: maxSamplesPerBuffer)!

// When you fill your Int16 buffer with data, send it to the converter
self.converter.convert(to: outputBuffer, error: nil) { inNumPackets, outStatus in
    outStatus.pointee = .haveData
    return inputBuffer
}

// outputBuffer now holds the sound in the system format and we can play it
self.playerNode.scheduleBuffer(outputBuffer)

Efficient Go serialization of struct to disk

I've been tasked with replacing C++ code with Go, and I'm quite new to the Go APIs. I am using gob to encode hundreds of key/value entries to disk pages, but the gob encoding has too much bloat that isn't needed.
package main

import (
    "bytes"
    "encoding/gob"
    "fmt"
)

type Entry struct {
    Key string
    Val string
}

func main() {
    var buf bytes.Buffer
    enc := gob.NewEncoder(&buf)
    e := Entry{"k1", "v1"}
    enc.Encode(e)
    fmt.Println(buf.Bytes())
}
This produces a lot of bloat that I don't need:
[35 255 129 3 1 1 5 69 110 116 114 121 1 255 130 0 1 2 1 3 75 101 121 1 12 0 1 3 86 97 108 1 12 0 0 0 11 255 130 1 2 107 49 1 2 118 49 0]
I want to serialize each string's length followed by its raw bytes, like:
[0 0 0 2 107 49 0 0 0 2 118 49]
I am saving millions of entries, so the additional bloat in the encoding increases the file size by roughly 10x.
How can I serialize it to the latter without manual coding?
If you zip a file named a.txt containing the text "hello" (which is 5 characters), the resulting zip will be around 115 bytes. Does this mean the zip format is not efficient at compressing text files? Certainly not; there is an overhead. If the file contains "hello" a hundred times (500 bytes), zipping it will result in a file of about 120 bytes. 1x"hello" => 115 bytes, 100x"hello" => 120 bytes! We added 495 bytes, and yet the compressed size only increased by 5 bytes.
Something similar is happening with the encoding/gob package:
The implementation compiles a custom codec for each data type in the stream and is most efficient when a single Encoder is used to transmit a stream of values, amortizing the cost of compilation.
When you "first" serialize a value of a type, the definition of the type also has to be included / transmitted, so the decoder can properly interpret and decode the stream:
A stream of gobs is self-describing. Each data item in the stream is preceded by a specification of its type, expressed in terms of a small set of predefined types.
Let's return to your example:
var buf bytes.Buffer
enc := gob.NewEncoder(&buf)
e := Entry{"k1", "v1"}
enc.Encode(e)
fmt.Println(buf.Len())
It prints:
48
Now let's encode a few more of the same type:
enc.Encode(e)
fmt.Println(buf.Len())
enc.Encode(e)
fmt.Println(buf.Len())
Now the output is:
60
72
Try it on the Go Playground.
Analyzing the results:
Additional values of the same Entry type only cost 12 bytes, while the first is 48 bytes because the type definition is also included (which is ~26 bytes), but that is a one-time overhead.
So basically you transmit 2 strings, "k1" and "v1", which are 4 bytes, and the lengths of the strings also have to be included; using 4 bytes each (the size of int on 32-bit architectures) gives you the 12 bytes, which is the "minimum". (Yes, you could use a smaller type for the length, but that would have its limitations. A variable-length encoding would be a better choice for small numbers; see the encoding/binary package.)
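To illustrate that last point (a minimal sketch, not something encoding/gob does in this exact form), the encoding/binary package's varint functions store small lengths in a single byte:

package main

import (
    "encoding/binary"
    "fmt"
)

func main() {
    // A varint length prefix: small values such as len("k1") == 2 take one
    // byte instead of a fixed 4 or 8 bytes.
    buf := make([]byte, binary.MaxVarintLen64)
    n := binary.PutUvarint(buf, uint64(len("k1")))
    fmt.Println(buf[:n]) // prints [2]
}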
All in all, encoding/gob does a pretty good job for your needs. Don't get fooled by initial impressions.
If this 12 bytes for one Entry is too "much" for you, you can always wrap the stream into a compress/flate or compress/gzip writer to further reduce the size (in exchange for slower encoding/decoding and slightly higher memory requirement for the process).
Demonstration:
Let's test the following 5 solutions:
Using a "naked" output (no compression)
Using compress/flate to compress the output of encoding/gob
Using compress/zlib to compress the output of encoding/gob
Using compress/gzip to compress the output of encoding/gob
Using github.com/dsnet/compress/bzip2 to compress the output of encoding/gob
We will write a thousand entries, changing keys and values of each, being "k000", "v000", "k001", "v001" etc. This means the uncompressed size of an Entry is 4 byte + 4 byte + 4 byte + 4 byte = 16 bytes (2x4 bytes text, 2x4 byte lengths).
The code looks like this:
for _, name := range []string{"Naked", "flate", "zlib", "gzip", "bzip2"} {
    buf := &bytes.Buffer{}
    var out io.Writer
    switch name {
    case "Naked":
        out = buf
    case "flate":
        out, _ = flate.NewWriter(buf, flate.DefaultCompression)
    case "zlib":
        out, _ = zlib.NewWriterLevel(buf, zlib.DefaultCompression)
    case "gzip":
        out = gzip.NewWriter(buf)
    case "bzip2":
        out, _ = bzip2.NewWriter(buf, nil)
    }
    enc := gob.NewEncoder(out)
    e := Entry{}
    for i := 0; i < 1000; i++ {
        e.Key = fmt.Sprintf("k%3d", i)
        e.Val = fmt.Sprintf("v%3d", i)
        enc.Encode(e)
    }
    if c, ok := out.(io.Closer); ok {
        c.Close()
    }
    fmt.Printf("[%5s] Length: %5d, average: %5.2f / Entry\n",
        name, buf.Len(), float64(buf.Len())/1000)
}
Output:
[Naked] Length: 16036, average: 16.04 / Entry
[flate] Length: 4120, average: 4.12 / Entry
[ zlib] Length: 4126, average: 4.13 / Entry
[ gzip] Length: 4138, average: 4.14 / Entry
[bzip2] Length: 2042, average: 2.04 / Entry
Try it on the Go Playground.
As you can see, the "naked" output is 16.04 bytes/Entry, just a little over the calculated size (due to the one-time tiny overhead discussed above).
When you use flate, zlib or gzip to compress the output, you can reduce it to about 4.13 bytes/Entry, which is roughly 26% of the theoretical size; I'm sure that satisfies you. If not, you can reach for libraries providing compression with higher efficiency, such as bzip2, which in the above example resulted in 2.04 bytes/Entry, 12.7% of the theoretical size!
(Note that with "real-life" data the compression ratio, compressed size versus original, would probably be a lot higher, as the keys and values used in the test are very similar and thus compress really well; still, the ratio should be around 50% with real-life data.)
Use protobuf to efficiently encode your data.
https://github.com/golang/protobuf
Your main would look like this:
package main

import (
    "fmt"
    "log"

    "github.com/golang/protobuf/proto"
)

func main() {
    e := &Entry{
        Key: proto.String("k1"),
        Val: proto.String("v1"),
    }
    data, err := proto.Marshal(e)
    if err != nil {
        log.Fatal("marshaling error: ", err)
    }
    fmt.Println(data)
}
You create a file, example.proto like this:
package main;

message Entry {
    required string Key = 1;
    required string Val = 2;
}
You generate the go code from the proto file by running:
$ protoc --go_out=. *.proto
You can examine the generated file, if you wish.
You can run and see the results output:
$ go run *.go
[10 2 107 49 18 2 118 49]
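For completeness (not part of the original answer, just a sketch of the symmetric operation with the same library), you could add the following to the same main to decode the bytes back into an Entry; GetKey and GetVal are the getters generated by protoc-gen-go:

    // Sketch: unmarshal the wire bytes back into an Entry.
    decoded := &Entry{}
    if err := proto.Unmarshal(data, decoded); err != nil {
        log.Fatal("unmarshaling error: ", err)
    }
    fmt.Println(decoded.GetKey(), decoded.GetVal()) // k1 v1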
"Manual coding", you're so afraid of, is trivially done in Go using the standard encoding/binary package.
You appear to store string length values as 32-bit integers in big-endian format, so you can just go on and do just that in Go:
package main

import (
    "bytes"
    "encoding/binary"
    "fmt"
    "io"
)

func encode(w io.Writer, s string) (n int, err error) {
    var hdr [4]byte
    binary.BigEndian.PutUint32(hdr[:], uint32(len(s)))
    n, err = w.Write(hdr[:])
    if err != nil {
        return
    }
    n2, err := io.WriteString(w, s)
    n += n2
    return
}

func main() {
    var buf bytes.Buffer
    for _, s := range []string{
        "ab",
        "cd",
        "de",
    } {
        _, err := encode(&buf, s)
        if err != nil {
            panic(err)
        }
    }
    fmt.Printf("%v\n", buf.Bytes())
}
Playground link.
Note that in this example I'm writing to a byte buffer, but that's for demonstration purposes only—since encode() writes to an io.Writer, you can pass it an opened file, a network socket and anything else implementing that interface.
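For completeness, here is a sketch of the matching read side (not part of the original answer; it uses the same imports as the program above):

// decode reads back one length-prefixed string written by encode.
func decode(r io.Reader) (string, error) {
    var hdr [4]byte
    if _, err := io.ReadFull(r, hdr[:]); err != nil {
        return "", err
    }
    data := make([]byte, binary.BigEndian.Uint32(hdr[:]))
    if _, err := io.ReadFull(r, data); err != nil {
        return "", err
    }
    return string(data), nil
}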

What are the ASCII values of up, down, left, right?

What are the ASCII values of the arrow keys? (up/down/left/right)
In short:
left arrow: 37
up arrow: 38
right arrow: 39
down arrow: 40
There are no real ASCII codes for these keys as such; you will need to check out the scan codes for these keys, known as Make and Break key codes, as per helppc's information. The reason the codes sound 'ASCII' is that the key codes are handled by the old BIOS interrupt 0x16 and keyboard interrupt 0x9.
              Normal Mode          Num Lock on
              Make     Break       Make           Break
Down arrow    E0 50    E0 D0       E0 2A E0 50    E0 D0 E0 AA
Left arrow    E0 4B    E0 CB       E0 2A E0 4B    E0 CB E0 AA
Right arrow   E0 4D    E0 CD       E0 2A E0 4D    E0 CD E0 AA
Up arrow      E0 48    E0 C8       E0 2A E0 48    E0 C8 E0 AA
Hence, by looking at the codes following E0 for the Make key code (0x50, 0x4B, 0x4D, 0x48 respectively), you can see where the confusion arises from looking at key codes and treating them as 'ASCII'. The answer is: don't, because the platform and the OS vary. Under Windows there are virtual-key codes corresponding to those keys, not necessarily the same as the BIOS codes: VK_UP, VK_DOWN, VK_LEFT, VK_RIGHT. These are found in the windows.h header, as I recall in the SDK's include folder.
Do not rely on the key codes having the same 'identical ASCII' values shown here, as the operating system reprograms the entire BIOS keyboard handling in whatever way it sees fit. That is to be expected: the BIOS code is 16-bit, while the OS nowadays runs in 32-bit protected mode, so those BIOS codes are no longer valid.
Hence the original keyboard interrupt 0x9 and BIOS interrupt 0x16 are effectively gone once the protected-mode OS starts loading; it overwrites that area of memory and replaces it with its own 32-bit protected-mode handlers to deal with those keyboard scan codes.
Here is a code sample from the old days of DOS programming, using Borland C v3:
#include <bios.h>

int getKey(void) {
    int key, lo, hi;
    key = bioskey(0);
    lo = key & 0x00FF;
    hi = (key & 0xFF00) >> 8;
    return (lo == 0) ? hi + 256 : lo;
}
This routine returned 328 for up and 336 for down, respectively (I do not have the codes for left and right; they are in my old cookbook!). The low byte (lo) holds the 'ASCII' code and the high byte (hi) holds the scan code; keys other than A-Z and 0-9 have a low byte of 0 via the bioskey routine. The reason 256 is added is that, when lo is 0, hi holds the scan code, and adding 256 to it keeps the result from being confused with the 'ASCII' codes.
Really, the answer to this question depends on what operating system and programming language you are using. There is no "ASCII code" per se. The operating system detects that you hit an arrow key and triggers an event that programs can capture. For example, on modern Windows machines you would get a WM_KEYDOWN or WM_KEYUP message, which usually passes a 16-bit value to determine which key was pushed.
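To make that concrete (a minimal sketch, not a complete program), handling arrow keys on Windows inspects the virtual-key code delivered with WM_KEYDOWN rather than an ASCII value:

// Sketch: inside a Win32 window procedure, arrow keys arrive as WM_KEYDOWN
// with a virtual-key code in wParam, not as ASCII characters.
#include <windows.h>

LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam) {
    if (msg == WM_KEYDOWN) {
        switch (wParam) {
            case VK_UP:    /* handle up */    break;
            case VK_DOWN:  /* handle down */  break;
            case VK_LEFT:  /* handle left */  break;
            case VK_RIGHT: /* handle right */ break;
        }
    }
    return DefWindowProc(hwnd, msg, wParam, lParam);
}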
The ASCII values reported for the arrow keys are:
Up key    - 224 72
Down key  - 224 80
Left key  - 224 75
Right key - 224 77
Each of these has two integer values, because they are special keys, as opposed to the code for $, which is simply 36. These 2-byte special keys usually have 224 or 0 as the first byte; the same scheme is used for the function keys and the Delete key on Windows.
EDIT: Looking back, these may actually be Unicode values, but they do work.
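As a sketch of how those two-byte codes are consumed in practice (assuming a Windows console and <conio.h>; the 224/0 prefix and the 72/80/75/77 values are the ones listed above):

#include <conio.h>
#include <iostream>

int main() {
    while (true) {
        int c = getch();
        if (c == 224 || c == 0) {          // prefix byte for extended keys
            switch (getch()) {             // second byte identifies the key
                case 72: std::cout << "Up\n";    break;
                case 80: std::cout << "Down\n";  break;
                case 75: std::cout << "Left\n";  break;
                case 77: std::cout << "Right\n"; break;
            }
        }
    }
}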
If you're programming in OpenGL, use GLUT. The following page should help: http://www.lighthouse3d.com/opengl/glut/index.php?5
GLUT_KEY_LEFT Left function key
GLUT_KEY_RIGHT Right function key
GLUT_KEY_UP Up function key
GLUT_KEY_DOWN Down function key
void processSpecialKeys(int key, int x, int y) {
    switch (key) {
        case GLUT_KEY_F1:
            red = 1.0;
            green = 0.0;
            blue = 0.0;
            break;
        case GLUT_KEY_F2:
            red = 0.0;
            green = 1.0;
            blue = 0.0;
            break;
        case GLUT_KEY_F3:
            red = 0.0;
            green = 0.0;
            blue = 1.0;
            break;
    }
}
You can check it by compiling and running this small C++ program.
#include <iostream>
#include <conio.h>
#include <cstdlib>

int show;

int main()
{
    while (true)
    {
        int show = getch();
        std::cout << show;
    }
    getch(); // Just to keep the console open after program execution
}
If you're working with terminals, as I was when I found this in a search, then you'll find that the arrow keys send the corresponding cursor movement escape sequences.
So in this context,
UP = ^[[A
DOWN = ^[[B
RIGHT = ^[[C
LEFT = ^[[D
with ^[ being the symbol meaning escape, but you'll use the ASCII value for escape which is 27, as well as for the bracket and letter.
In my case, using a serial connection to communicate these directions, for Up arrow, I sent the byte sequence 27,91,65 for ^[, [, A
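A small sketch of parsing those sequences from a byte stream (for example the serial connection mentioned above); it simply walks the three bytes ESC (27), '[' (91), and the final letter:

// Sketch: detect arrow-key escape sequences (ESC '[' letter) read byte by
// byte from a terminal or serial stream.
#include <iostream>

int main() {
    int state = 0;
    char c;
    while (std::cin.get(c)) {
        if (state == 0 && c == 27)       state = 1;   // ESC (27)
        else if (state == 1 && c == '[') state = 2;   // '[' (91)
        else if (state == 2) {
            if      (c == 'A') std::cout << "Up arrow\n";
            else if (c == 'B') std::cout << "Down arrow\n";
            else if (c == 'C') std::cout << "Right arrow\n";
            else if (c == 'D') std::cout << "Left arrow\n";
            state = 0;
        } else state = 0;
    }
}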
You can use GLUT's special-key callback to handle the navigation keys for your program. Below is sample code for it.
void Specialkey(int key, int x, int y)
{
    switch (key)
    {
        case GLUT_KEY_UP:
            /* Do whatever you want */
            break;
        case GLUT_KEY_DOWN:
            /* Do whatever you want */
            break;
        case GLUT_KEY_LEFT:
            /* Do whatever you want */
            break;
        case GLUT_KEY_RIGHT:
            /* Do whatever you want */
            break;
    }
    glutPostRedisplay();
}
Add this to your main function
glutSpecialFunc(Specialkey);
Hope this helps solve the problem!
The ASCII codes for the arrow characters are the following:
↑ 24
↓ 25
→ 26
← 27
I got stuck on this question and was not able to find a good solution, so I decided to do some tinkering with the MinGW compiler I have. I used C++ and the getch() function from the <conio.h> header, and pressed the arrow keys to find which values are assigned to them. As it turns out, the output shows 22472, 22477, 22480 and 22475 for the up, right, down and left keys respectively, which is really 224 followed by a second code. You have to ditch the 224 part and use only the last two digits for the software to recognize the correct key; and, as you guessed, 72, 77, 80 and 75 are also used by other characters, but this works for me and I hope it works for you as well. If you want to run the C++ code yourself and find the values, run this program and press Enter to get out of the loop:
#include <iostream>
#include <conio.h>

using namespace std;

int main()
{
    int x;
    while (1) {
        x = (int)getch();
        if (x == 13) {
            break;
        }
        else
            cout << endl << endl << x;
    }
    return getch();
}
If you came here for JavaScript and want to know which key was pressed: JavaScript has a keydown event that you can register with addEventListener (or onkeydown, which works much the same way). The event object it passes to your handler exposes a couple of useful properties:
.code returns a string describing which key was pressed, like ArrowUp, ArrowDown, KeyW, and so on.
.keyCode is deprecated, but you can still use it. It returns an integer; for letters it is effectively a case-insensitive ASCII code (both a and A give 65). For the arrows: ArrowLeft is 37, ArrowUp is 38, ArrowRight is 39 and ArrowDown is 40.
<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8" />
    <meta http-equiv="X-UA-Compatible" content="IE=edge" />
    <meta name="viewport" content="width=device-width, initial-scale=1.0" />
</head>
<body>
    <h1 id="show">Press Any Button</h1>
    <!-- JavaScript code starts here: see the magic by pressing any button -->
    <script>
        document.addEventListener('keydown', (key) => {
            let keycode = key.keyCode;
            document.getElementById('show').innerText = keycode;
            /*
            let keyString = key.code;
            switch (keyString) {
                case "ArrowLeft":
                    console.log("Left Key is Pressed");
                    break;
                case "ArrowUp":
                    console.log("Up Key is Pressed");
                    break;
                case "ArrowRight":
                    console.log("Right Key is Pressed");
                    break;
                case "ArrowDown":
                    console.log("Down Key is Pressed");
                    break;
                default:
                    console.log("Any Other Key is Pressed");
                    break;
            }
            */
        });
    </script>
</body>
</html>
Can't address every operating system/situation, but for AppleScript on a Mac, it is as follows:
LEFT: 28
RIGHT: 29
UP: 30
DOWN: 31
tell application "System Events" to keystroke (ASCII character 31) --down arrow
Gaa! Go to asciitable.com. The arrow keys are the control equivalent of the HJKL keys. I.e., in vi create a big block of text. Note you can move around in that text using the HJKL keys. The arrow keys are going to be ^H, ^J, ^K, ^L.
At asciitable.com, find "K" in the third column. Now look at the same row in the first column to find the matching control code ("VT" in this case).
This is what I get:
Left - 19
Up - 5
Right - 4
Down - 24
Works in Visual FoxPro.