React Native BLE with MiBand - react-native

I'm really new to this, but I already know (from looking at other projects online) that the Mi Bands have an authentication process.
The thing is, I have tried writing both with and without response to the only service UUID I get from the connection, and it always says the band does not have that service UUID.
I am using the react-native-ble-plx library.
As can be seen in the code below, I use the device's service UUID since I cannot get any other service, but it always says that it does not exist.
search() {
  this.manager = new BleManager();
  this.manager.startDeviceScan(null, null, (error, device) => {
    if (error) {
      console.log(error.message);
      return;
    }
    if (device.name == 'Mi Band 3') {
      this.manager.stopDeviceScan();
      this.device = device;
      this.connect();
    }
  });
}

connect() {
  console.log("CONNECTING...");
  this.device.connect()
    .then(async (device) => {
      console.log("CONNECTED!!!");
      console.log("DEVICE CONNECTED:\n");
      console.log(device);
      this.auth(device);
      // return this.manager.discoverAllServicesAndCharacteristicsForDevice(device.id)
    })
    // .then((device) => {
    //   console.log(device);
    //   this.send(device);
    // })
    // .catch((error) => {
    //   console.log("ERROR: ");
    //   console.log(error);
    // });
}

async auth(device) {
  console.log("DEVICE: \n");
  console.log(this.device);
  console.log("DEVICE'S SERVICE UUID: \n" + this.device.serviceUUIDs[0]);
  console.log("TRYING");
  this.manager.writeCharacteristicWithoutResponseForDevice(
    'D7:2D:F8:F2:24:3F',
    '0000fee0-0000-1000-8000-00805f9b34fb',
    '00000009-0000-3512-2118-0009af100700',
    0x01 + 0x00 + new Buffer("OLA MUNDO"))
    .then((device) => {
      console.log("STUFF GOING ON:\n");
      console.log(device);
    })
    .catch((error) => {
      throw error;
    });
}
I really need help with this, thanks in advance.
If there is anything I need to describe in more detail, please just tell me.

Directly after connecting, you must first discover the services and characteristics. Only after that can you start the authentication part. However, your authentication code itself is also wrong. Do a bit of searching to find out how the Mi Band authentication handshake is done properly...
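For the discovery step, here is a minimal sketch of the flow with react-native-ble-plx. The UUIDs are the ones from the question, the payload bytes are placeholders (this is not the real Mi Band handshake), and Buffer assumes the 'buffer' polyfill package:

import { Buffer } from 'buffer'; // React Native needs the 'buffer' polyfill

async function connectAndWrite(manager, device) {
  const connected = await device.connect();
  // Services and characteristics only become visible after this call.
  await connected.discoverAllServicesAndCharacteristics();

  const services = await connected.services();
  console.log('Discovered services:', services.map(s => s.uuid));

  // ble-plx expects characteristic values as base64 strings,
  // not as numbers or raw Buffers.
  const payload = Buffer.from([0x01, 0x00]).toString('base64'); // placeholder bytes
  await manager.writeCharacteristicWithoutResponseForDevice(
    connected.id,
    '0000fee0-0000-1000-8000-00805f9b34fb',  // service UUID from the question
    '00000009-0000-3512-2118-0009af100700',  // auth characteristic from the question
    payload
  );
}

Once discovery has run, the write should no longer fail with "service not found"; the remaining work is implementing the actual authentication exchange.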

Related

Node.js wait("await") for SQL database query before proceeding

I have spent a lot of time reading up on this but I simply don't get how to solve it.
I have an application that uses a token that is stored in a SQL database. I need that token before the application can proceed.
I'm trying to solve it with "await" but it doesn't work. The SQL query result is still retrieved "too late".
const pool = mysql.createPool({
  user     : 'xxxx', // e.g. 'my-db-user'
  password : "xxxx", // e.g. 'my-db-password'
  database : "xxxx", // e.g. 'my-database'
  // If connecting via localhost, specify the ip
  host     : "xxxx"
  // If connecting via unix domain socket, specify the path
  //socketPath : `/cloudsql/xxxx`,
});

const isAuthorized = async (userId) => {
  let query = "SELECT * FROM auth WHERE id = 2";
  await pool.query(query, (error, results) => {
    if (!results[0]) {
      console.log("No results");
      return;
    } else {
      tokenyay = results[0].refreshtoken;
      console.log("results: " + results[0].refreshtoken);
      return results[0].refreshtoken;
    }
  });
  await console.log("tokenyay: " + tokenyay);
  if (tokenyay != null && tokenyay != '') {
    refreshTokenStore[userId] = tokenyay;
  }
  console.log(userId);
  console.log(refreshTokenStore[userId] ? true : false);
  return refreshTokenStore[userId] ? true : false;
};
I don't know what your pool is, but judging from the callback, we can see that the pool.query() method is not await-able.
You can manually wrap it in a Promise, though, which is await-able, for example:
await new Promise((resolve, reject) => {
  pool.query(query, (error, results) => {
    if (error) reject(error);
    if (!results[0]) {
      console.log("No results");
      resolve(); // give `undefined` to the `await...` and make it stop waiting
      return;
    } else {
      tokenyay = results[0].refreshtoken;
      console.log("results: " + results[0].refreshtoken);
      resolve(results[0].refreshtoken);
    }
  });
});
Edit:
However, since the result of await is the value passed to resolve, we don't need to assign tokenyay inside the callback.
We can use tokenyay = await... instead.
tokenyay = await new Promise((resolve, reject) => {
  pool.query(query, (error, results) => {
    if (error) reject(error);
    if (!results[0]) {
      console.log("No results");
      resolve(); // give `undefined` to the `await...` and make it stop waiting
      return;
    } else {
      console.log("results: " + results[0].refreshtoken);
      resolve(results[0].refreshtoken);
    }
  });
});
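As a side note, if the pool happens to come from the mysql2 package rather than mysql, it already ships a promise wrapper, so the manual Promise is not needed. A minimal sketch, assuming mysql2 and the same auth table and refreshTokenStore as above:

// Sketch assuming the mysql2 package; pool.promise() exposes an await-able API.
const promisePool = pool.promise();

const isAuthorized = async (userId) => {
  const [rows] = await promisePool.query("SELECT * FROM auth WHERE id = 2");
  if (!rows[0]) {
    console.log("No results");
    return false;
  }
  refreshTokenStore[userId] = rows[0].refreshtoken;
  return true;
};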

WebRTC succesfully signalled offer and answer, but not getting any ICE candidates

I'm trying to establish a WebRTC connection between two browsers. I have a Node.js server for them to communicate through, which essentially just forwards the messages from one client to the other. I am running the server and two tabs all on my laptop, but I have not been able to make a connection. I have been able to send the offers and answers between the two tabs successfully, resulting in pc.signalingState = 'stable' in both tabs. I believe that once this is done the RTCPeerConnection objects should start producing icecandidate events, but this is not happening and I do not know why. Here is my code (I've omitted the server code):
'use strict';
// This is mostly copy pasted from webrtc.org/getting-started/peer-connections.
import { io } from 'socket.io-client';

const configuration = {
  'iceServers': [
    { 'urls': 'stun:stun4.l.google.com:19302' },
    { 'urls': 'stun:stunserver.stunprotocol.org:3478' },
  ]
}

// Returns a promise for an RTCDataChannel
function join() {
  const socket = io('ws://localhost:8090');
  const pc = new RTCPeerConnection(configuration);
  socket.on('error', error => {
    socket.close();
    throw error;
  });
  pc.addEventListener('signalingstatechange', event => {
    // Prints 'have-local-offer' then 'stable' in one tab,
    // 'have-remote-offer' then 'stable' in the other.
    console.log(pc.signalingState);
  })
  pc.addEventListener('icegatheringstatechange', event => {
    console.log(pc.iceGatheringState); // This line is never reached.
  })
  // Listen for local ICE candidates on the local RTCPeerConnection
  pc.addEventListener('icecandidate', event => {
    if (event.candidate) {
      console.log('Sending ICE candidate'); // This line is never reached.
      socket.emit('icecandidate', event.candidate);
    }
  });
  // Listen for remote ICE candidates and add them to the local RTCPeerConnection
  socket.on('icecandidate', async candidate => {
    try {
      await pc.addIceCandidate(candidate);
    } catch (e) {
      console.error('Error adding received ice candidate', e);
    }
  });
  // Listen for connectionstatechange on the local RTCPeerConnection
  pc.addEventListener('connectionstatechange', event => {
    if (pc.connectionState === 'connected') {
      socket.close();
    }
  });
  // When both browsers send this signal they will both receive the 'matched' signal,
  // one with the payload true and the other with false.
  socket.emit('join');
  return new Promise((res, rej) => {
    socket.on('matched', async first => {
      if (first) {
        // caller side
        socket.on('answer', async answer => {
          await pc.setRemoteDescription(new RTCSessionDescription(answer))
            .catch(console.error);
        });
        const offer = await pc.createOffer();
        await pc.setLocalDescription(offer)
          .catch(console.error);
        socket.emit('offer', offer);
        // Listen for connectionstatechange on the local RTCPeerConnection
        pc.addEventListener('connectionstatechange', event => {
          if (pc.connectionState === 'connected') {
            res(pc.createDataChannel('data'));
          }
        });
      } else {
        // recipient side
        socket.on('offer', async offer => {
          pc.setRemoteDescription(new RTCSessionDescription(offer))
            .catch(console.error);
          const answer = await pc.createAnswer();
          await pc.setLocalDescription(answer)
            .catch(console.error);
          socket.emit('answer', answer);
        });
        pc.addEventListener('datachannel', event => {
          res(event.channel);
        });
      }
    });
  });
}

join().then(dc => {
  dc.addEventListener('open', event => {
    dc.send('Hello');
  });
  dc.addEventListener('message', event => {
    console.log(event.data);
  });
});
The behavior is the same in both Firefox and Chrome. That behavior is, again, that the offers and answers are signalled successfully, but no ICE candidates are ever created. Does anyone know what I'm missing?
Okay, I found the problem. I have to create the RTCDataChannel before creating the offer. Here's a before and after comparison of the SDP offers:
# offer created before data channel:
{
  type: 'offer',
  sdp: 'v=0\r\n' +
    'o=- 9150577729961293316 2 IN IP4 127.0.0.1\r\n' +
    's=-\r\n' +
    't=0 0\r\n' +
    'a=extmap-allow-mixed\r\n' +
    'a=msid-semantic: WMS\r\n'
}

# data channel created before offer:
{
  type: 'offer',
  sdp: 'v=0\r\n' +
    'o=- 1578211649345353372 2 IN IP4 127.0.0.1\r\n' +
    's=-\r\n' +
    't=0 0\r\n' +
    'a=group:BUNDLE 0\r\n' +
    'a=extmap-allow-mixed\r\n' +
    'a=msid-semantic: WMS\r\n' +
    'm=application 9 UDP/DTLS/SCTP webrtc-datachannel\r\n' +
    'c=IN IP4 0.0.0.0\r\n' +
    'a=ice-ufrag:MZWR\r\n' +
    'a=ice-pwd:LfptE6PDVughzmQBPoOtvaU8\r\n' +
    'a=ice-options:trickle\r\n' +
    'a=fingerprint:sha-256 1B:C4:38:9A:CD:7F:34:20:B8:8D:78:CA:4A:3F:81:AE:C5:55:B3:27:6A:BD:E5:49:5A:F9:07:AE:0C:F6:6F:C8\r\n' +
    'a=setup:actpass\r\n' +
    'a=mid:0\r\n' +
    'a=sctp-port:5000\r\n' +
    'a=max-message-size:262144\r\n'
}
In both cases the answer looked similar to the offer. You can see the offer is much longer and mentions webrtc-datachannel in the second case. And sure enough, I started getting icecandidate events and everything is working now.
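In terms of the code above, a minimal sketch of the fix is simply to create the data channel on the caller side before calling createOffer (names such as pc, socket and res are taken from the question's join() function):

// Caller side: create the data channel first so the offer includes an
// m=application section, which gives ICE something to gather candidates for.
const dc = pc.createDataChannel('data');
const offer = await pc.createOffer();
await pc.setLocalDescription(offer);
socket.emit('offer', offer);
pc.addEventListener('connectionstatechange', () => {
  if (pc.connectionState === 'connected') {
    res(dc); // resolve with the channel created before the offer
  }
});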

bleManager.startDeviceScan not working on iOS react-native-ble-plx

I am trying to pair a gateway to a React Native app using react-native-ble-plx.
The source code below works fine on Android, whereas on iOS the bleManager.startDeviceScan() callback is never triggered. Nothing happens after this step.
Any help is much appreciated!
Source code:
const connectBLE = () => {
  const subscription = bleManager.onStateChange(async (state) => {
    if (state === 'PoweredOn') {
      subscription.remove();
      scanAndConnect();
    }
  });
};

const scanAndConnect = () => {
  bleManager.startDeviceScan(null, null, async (error, device) => {
    if (error) {
      showToast(error, 'error');
      console.log('Handle error - scanning will be stopped automatically');
      return;
    }
    console.log('Devices');
    console.log(device.name);
    // Check if it is a device you are looking for based on device name
    if (device.name === "BLE_0006") {
      // Stop scanning as we have found the device.
      bleManager.stopDeviceScan();
      // Establish Device connection.
      device
        .connect()
        .then((deviceData) => {
          /** Show Toast on Device disconnect */
          bleManager.onDeviceDisconnected(
            deviceData.id,
            (connectionError, connectionData) => {
              if (connectionError) {
                console.log(connectionError);
              }
              console.log('Device is disconnected');
              console.log(connectionData);
            },
          );
          /** Discover All Services and Characteristics */
          return device.discoverAllServicesAndCharacteristics();
        })
        .then(async (deviceObject) => {
          console.log('deviceObject');
          console.log(deviceObject);
          /** Subscribe for the Readable service */
          device.monitorCharacteristicForService(
            Enum.bleConnectionInfo.customServiceUUID,
            Enum.bleConnectionInfo.readCharacteristicUUID,
            (error, characteristic) => {
              if (error) {
                console.log('Error in monitorCharacteristicForService');
                console.log(error.message);
                return;
              }
              console.log(characteristic.uuid, characteristic.value);
            },
          );
        })
        .catch((error) => {
          console.warn(error);
          showToast(error, 'error');
        });
    }
  });
};
This might be helpful for someone!
The issue was resolved by moving the BleManager initialization outside the functional component.
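A minimal sketch of that arrangement, reusing the scanAndConnect from the question (the component name is just for illustration):

import React, { useEffect } from 'react';
import { BleManager } from 'react-native-ble-plx';

// Created once at module scope, outside the component, so it is not
// re-created on every render.
const bleManager = new BleManager();

const DeviceScanner = () => {
  useEffect(() => {
    const subscription = bleManager.onStateChange((state) => {
      if (state === 'PoweredOn') {
        subscription.remove();
        scanAndConnect(); // same scan logic as in the question
      }
    }, true);
    return () => subscription.remove();
  }, []);

  // render omitted
  return null;
};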

Stream Web Audio with WebRTC without asking for microphone

I want to stream audio from a web page to a local server using WebRTC. That server will process the audio and output it immediately to the user. I need real time.
My code is actually working. However, I am asking the user for the microphone with getUserMedia, and I don't actually need the microphone. This is quite annoying. What can I do in order to stream the audio without having to ask the user for the microphone?
Thank you.
Here is a minimal working example (it is highly inspired by https://github.com/aiortc/aiortc/blob/main/examples/server/client.js). Only the last part, with comments, is interesting:
let webSocket = new WebSocket('wss://0.0.0.0:8080/ws');
const config = { sdpSemantics: 'unified-plan' }
const pc = new RTCPeerConnection(config);

webSocket.onmessage = (message) => {
  const data = JSON.parse(message.data);
  switch (data.type) {
    case "answer":
      pc.setRemoteDescription(data.answer)
      break;
    default:
      break;
  }
};

function negotiate() {
  return pc.createOffer()
    .then(function(offer) {
      return pc.setLocalDescription(offer);
    })
    .then(function() {
      return new Promise(function(resolve) {
        if (pc.iceGatheringState === 'complete') {
          resolve();
        } else {
          function checkState() {
            if (pc.iceGatheringState === 'complete') {
              pc.removeEventListener('icegatheringstatechange', checkState);
              resolve();
            }
          }
          pc.addEventListener('icegatheringstatechange', checkState);
        }
      });
    })
    .then(function() {
      const offer = pc.localDescription;
      webSocket.send(
        JSON.stringify({
          type: "offer",
          offer: {
            sdp: offer.sdp,
            type: offer.type
          }
        })
      );
    })
}

// Preparing the oscillator
const audioCtx = new (window.AudioContext || window.webkitAudioContext)();
const oscillator = audioCtx.createOscillator();
const serverDestination = audioCtx.createMediaStreamDestination();
oscillator.connect(serverDestination);

// Asking for useless microphone
navigator.mediaDevices.getUserMedia({audio: true})
  .then(() => {
    return negotiate();
  });

// Actual streaming
const stream = new MediaStream();
serverDestination.stream.getTracks().forEach((track) => {
  pc.addTrack(track, stream);
})

// User pushes button to start the oscillator
function play() {
  oscillator.start();
};
Just get rid of this:
// Asking for useless microphone
navigator.mediaDevices.getUserMedia({audio: true})
  .then(() => {
    return negotiate();
  });
As you say, it's useless and not necessary. If you don't call getUserMedia(), the user won't be prompted to share their microphone. You can make WebRTC connections without this.
I suspect the problem you're running into is that your audio context is paused. If you call audioCtx.resume() when a user clicks a button, you'll be up and running. This is due to autoplay policy.
If you don't need user media, don't ask for it with getUserMedia in your code.
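Putting that together, a sketch of the adjusted tail of the example (still using the negotiate(), pc, serverDestination, oscillator and audioCtx from the question) might look like this:

// Add the oscillator's track and negotiate directly; no getUserMedia needed.
const stream = new MediaStream();
serverDestination.stream.getTracks().forEach((track) => {
  pc.addTrack(track, stream);
});
negotiate();

// User pushes a button to start the audio; resume() satisfies the autoplay policy.
function play() {
  audioCtx.resume().then(() => {
    oscillator.start();
  });
}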

Unable to read from BLE with react-native-ble-plx

I'm using React Native and the BLE package react-native-ble-plx on my Ubuntu machine.
According to the docs, I've done the following:
Scanned for devices
Connected to device
device.discoverAllServicesAndCharacteristics
writeCharacteristicWithResponseForDevice
My problem lies in reading the response I get from sending my command.
The RX characteristic is writable, but TX is only notifiable, not readable. Therefore I need to monitor the characteristic with monitorCharacteristicForDevice; however, I can't get this to work.
I never receive 'monitor success', 'Error at receiving data from device', or 'Monitor failure' in the code below.
How do I read the response?
monitorDevice = () => {
  this.setState({monitoring: true})
  console.log(this.state.servicesDiscovered)
  if (this.state.servicesDiscovered) {
    this.manager.monitorCharacteristicForDevice(
      this.state.connectedTo,
      SERVICE_UUID,
      TX_CHARACTERISTIC,
      (error, characteristic) => {
        if (error) {
          console.error("Error at receiving data from device", error);
          return
        }
        else {
          this.setState({monitoring: true})
          console.log('monitor success')
          console.log('monitor success' + characteristic.value);
          this.state.messagesRecieved.push(characteristic.value)
        }
      })
  }
  else {
    console.log("Monitor failure")
  }
}

checkBattery = () => {
  if (this.state.monitoring) {
    this.manager.writeCharacteristicWithResponseForDevice(
      this.state.connectedTo,
      SERVICE_UUID,
      RX_CHARACTERISTIC,
      "YmF0dGVyeQ==")
      .then(characteristic => {
        console.log("Successfully sent: " + characteristic.value)
        return
      })
      .catch(err => {
        console.log(err)
      })
  }
  else {
    alert("Not monitoring")
  }
}