How to read only messages from one device from Azure IoT Hub - azure-iot-hub

I have an Azure IoT solution where data from 2 devices go to the same IoT hub. From my computer I need to read the messages from only one of the devices. I implemented the ReadDeviceToCloudMessages.js from https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-node-node-getstarted
var client = EventHubClient.fromConnectionString(connectionString);
client.open()
    .then(client.getPartitionIds.bind(client))
    .then(function (partitionIds) {
        return partitionIds.map(function (partitionId) {
            return client.createReceiver('todevice', partitionId, { 'startAfterTime': Date.now() }).then(function (receiver) {
                console.log('Created partition receiver: ' + partitionId);
                receiver.on('errorReceived', printError);
                receiver.on('message', printMessage);
            });
        });
    })
    .catch(printError);
But I am getting all the messages in the IoT hub. How do I get messages from only one device?

You can route the expected device's messages to the built-in endpoint (events). Then the code above will only receive that device's messages.
Create the route:
Turn "Device messages which do not match any rules will be written to the 'Events (messages/events)' endpoint." off and make sure the route is enabled.

Related

How to handle push notifications when sent to multiple devices but some of them fail

I want to send FCM push notifications to Android and iOS apps. I am using this code to send them:
let message = {
    registration_ids: firebaseId, // this is the array of tokens
    collapse_key: 'something',
    data: {
        type: data.type,
        title: data.title,
        body: data.body,
        notificationId: something
    },
};
fcm.send(message, (err, response) => {
    if (err) {
        console.log(err);
        resolve(false);
    } else {
        console.log("Notification Android Sent Successfully");
        console.log(response);
    }
});
Now, if some notifications fail, I want to send those devices an SMS instead. But I get a response like this from the FCM server in case of success or failure:
Notification Android Sent Successfully
{"multicast_id":7535512435227354255,"success":2,"failure":1,"canonical_ids":0,"results":[{"message_id":"0:1610622370449056%d0bd483c86f03759"},{"message_id":"0:1610622370449058%d0bd483c86f03759"},{"error":"InvalidRegistration"}]}
Now how will I know which device did not get the notification? As we can see, there is 1 failure out of 3 devices, so I can send an SMS to that device.
The results array in the response contains the result for each token, in the same order in which you specified them.
So in your example, the message was successfully sent to the first two tokens, while the third token was invalid. You'll want to remove that last token from your database to prevent trying to send messages to it in the future.
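To know which device failed, walk response.results in parallel with the token array you sent. A minimal sketch, assuming firebaseId is the array passed as registration_ids and sendSmsForToken is your own (hypothetical) fallback helper:
fcm.send(message, (err, response) => {
    if (err) {
        console.log(err);
        return;
    }
    // results[i] corresponds to registration_ids[i], in the same order.
    response.results.forEach((result, index) => {
        if (result.error) {
            const failedToken = firebaseId[index];
            console.log('Delivery failed for token', failedToken, ':', result.error);
            sendSmsForToken(failedToken); // hypothetical: look up this device's phone number and send the SMS
        }
    });
});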

Why is the WebRTC remote video source generated by URL.createObjectURL?

The document I am reading uses URL.createObjectURL to set the video source (this is the code to answer a call):
var offer = getOfferFromFriend();
navigator.getUserMedia({video: true}, function(stream) {
    pc.onaddstream = e => video.src = URL.createObjectURL(e.stream);
    pc.addStream(stream);
    pc.setRemoteDescription(new RTCSessionDescription(offer), function() {
        pc.createAnswer(function(answer) {
            pc.setLocalDescription(answer, function() {
                // send the answer to a server to be forwarded back to the caller (you)
            }, error);
        }, error);
    }, error);
});
I expected video.src to be the address from which to retrieve the remote video, so it should be fixed and given by the other side of the connection (whoever initiated the call). But the value of URL.createObjectURL is generated on the answerer's side, and it even depends on when the function is called. How can it be used to get the remote video stream?
Edit:
The result of URL.createObjectURL looks like blob:http://some.site.com/xxxx-the-token-xxxx. With this string, how does the video element know where to load the remote stream from? Is there a hashmap of {url: stream} stored somewhere? If so, how does the video element access that hashmap?
A stream object does store a token string, which you can get with stream.toURL. But it is different from the result of URL.createObjectURL, whose value depends on time: if you call it twice in a row, you get different values.
URL.createObjectURL(stream) is a hack. Stop using it. Efforts are underway to remove it.
Use video.srcObject = stream directly instead. It is standard and well-implemented.
This assignment of a local resource should never have been a URL in the first place, and is a red herring to understanding how WebRTC works.
WebRTC is a transmission API, sending data directly from one peer to another. No content URLs are involved. The remote stream you get from onaddstream is a local object on the receiver side, and is the live streaming result of the transmission, ready to be played.
The documentation you read is old and outdated. Thanks for pointing it out, I'll fix it. It has other problems: you should call setRemoteDescription immediately, not wait for the receiver to share their camera, otherwise incoming candidates are missed. Instead of the code you show, do this:
pc.onaddstream = e => video.srcObject = e.stream;

function getOfferFromFriend(offer) {
    return pc.setRemoteDescription(new RTCSessionDescription(offer))
        .then(() => navigator.mediaDevices.getUserMedia({video: true})) // promise-based getUserMedia
        .then(stream => {
            pc.addStream(stream);
            return pc.createAnswer();
        })
        .then(answer => pc.setLocalDescription(answer))
        .then(() => {
            // send the answer to a server to be forwarded back to the caller (you)
        })
        .catch(error);
}
It uses srcObject, avoids the deprecated callback API, and won't cause intermittent ICE failures.
Because a WebRTC connection involves several steps, and what you get from such a connection is a stream. But the src property of the video tag does not accept a stream, only a URL, and this is the way to "convert" a stream to a URL.
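For completeness, here is a minimal sketch of attaching the stream either way (assuming video is the video element and stream is the MediaStream from onaddstream); modern browsers support srcObject, and the blob URL path is only a legacy fallback:
if ('srcObject' in video) {
    // Standard path: hand the MediaStream object to the element directly.
    video.srcObject = stream;
} else {
    // Legacy fallback: the browser keeps an internal mapping from the generated
    // blob: URL to the stream, which is why the URL itself carries no location info.
    video.src = URL.createObjectURL(stream);
}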

WebRTC: How do I stream Client A's video to Client B?

I am looking into WebRTC but I feel like I'm not understanding the full picture. I'm looking at this demo project in particular: https://github.com/oney/RCTWebRTCDemo/blob/master/main.js
I'm having trouble understanding how I can match 2 clients so that Client A can see Client B's video stream and vice versa.
This is in the demo:
function getLocalStream(isFront, callback) {
    MediaStreamTrack.getSources(sourceInfos => {
        console.log(sourceInfos);
        let videoSourceId;
        for (let i = 0; i < sourceInfos.length; i++) { // was `const i`, which cannot be incremented
            const sourceInfo = sourceInfos[i];
            if (sourceInfo.kind == "video" && sourceInfo.facing == (isFront ? "front" : "back")) {
                videoSourceId = sourceInfo.id;
            }
        }
        getUserMedia({
            audio: true,
            video: {
                mandatory: {
                    minWidth: 500, // Provide your own width, height and frame rate here
                    minHeight: 300,
                    minFrameRate: 30
                },
                facingMode: (isFront ? "user" : "environment"),
                optional: [{ sourceId: videoSourceId }] // use the id found above (was `sourceInfos.id`)
            }
        }, function (stream) {
            console.log('dddd', stream);
            callback(stream);
        }, logError);
    });
}
and then it's used like this:
socket.on('connect', function (data) {
    console.log('connect');
    getLocalStream(true, function (stream) {
        localStream = stream;
        container.setState({ selfViewSrc: stream.toURL() });
        container.setState({ status: 'ready', info: 'Please enter or create room ID' });
    });
});
Questions:
What exactly is MediaStreamTrack.getSources doing? Is this because devices can have multiple video sources (e.g. 3 webcams)?
Doesn't getUserMedia just turn on the client's camera? In the code above isn't the client just viewing a video of himself?
I'd like to know how I can pass client A's URL of some sort to client B so that client B streams the video coming from client A. How do I do this? I'm imagining something like this:
Client A enters, joins room "abc123". Waits for another client to join
Client B enters, also joins room "abc123".
Client A is signaled that Client B has entered the room, so he makes a connection with Client B
Client A and Client B start streaming from their webcam. Client A can see Client B, and Client B can see Client A.
How would I do this using the WebRTC library? (You can just assume that the backend server for room matching already exists.)
The process you are looking for is called JSEP (JavaScript Session Establishment Protocol), and it can be divided into the three steps I describe below. These steps start once both clients are in the room and can communicate through WebSockets. I will use ws as an imaginary WebSocket API for communication between each client and the server (and, through it, the other client):
1. Invite
During this step, one designated caller creates an offer and sends it through the server to the other client (the callee):
// This is only in Chrome
var pc = new webkitRTCPeerConnection(
    { iceServers: [{ url: "stun:stun.l.google.com:19302" }] },
    { optional: [{ RtpDataChannels: true }] }
);

// Someone must be chosen to be the caller
// (it can be either the latest person who joins the room or the person already in it)
ws.on('joined', function () {
    pc.createOffer(function (offer) {
        pc.setLocalDescription(offer);
        ws.send('offer', offer);
    });
});
// The callee receives the offer and returns an answer
ws.on('offer', function (offer) {
    pc.setRemoteDescription(new RTCSessionDescription(offer));
    pc.createAnswer(function (answer) {
        pc.setLocalDescription(answer);
        ws.send('answer', answer);
    }, err => console.log('error'), {});
});

// The caller receives the answer
ws.on('answer', function (answer) {
    pc.setRemoteDescription(new RTCSessionDescription(answer));
});
Now both sides have exchanged SDP descriptions and are ready to connect to each other.
2. Negotiation (ICE)
ICE candidates are created by each side to find a way to connect to the other; they are essentially addresses where each peer can be reached: localhost, the local area network address (192.168.x.x), and the external public IP address (from the ISP). They are generated automatically by the peer connection object.
// Both sides process the candidates received from the other end:
ws.on('ICE', candidate => pc.addIceCandidate(new RTCIceCandidate(candidate)));

// Both sides send the candidates they generate (onicecandidate fires with an event):
pc.onicecandidate = e => e.candidate && ws.send('ICE', e.candidate);
After the ICE negotiation, the connection gets established, unless you are trying to connect clients that are both behind restrictive firewalls: p2p communication can traverse NATs, but it won't work in some scenarios.
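In those cases, the usual fallback is to relay the media through a TURN server. A minimal sketch of the peer connection configuration, where the turn: URL and credentials are placeholders for your own server:
var pc = new webkitRTCPeerConnection({
    iceServers: [
        { url: 'stun:stun.l.google.com:19302' },  // public STUN, as used above
        { url: 'turn:turn.example.com:3478',      // hypothetical TURN relay
          username: 'user', credential: 'secret' }
    ]
}, { optional: [{ RtpDataChannels: true }] });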
3. Data streaming
// Once the connection is established we can start to transfer video,
// audio or data
navigator.getUserMedia({ audio: true, video: true }, function (stream) {
    pc.addStream(stream);
}, err => console.log('Error getting User Media'));
It is a good option to have the stream before making the call and to add it in the earlier steps: before creating the offer for the caller, and right after receiving the call for the callee, so you don't have to deal with renegotiation (see the sketch below). This was a pain a few years ago, but it is better handled in WebRTC now.
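As a sketch of that ordering, using the same callback-style getUserMedia and the imaginary ws API from above (the constraints and the room-join message are assumptions):
// Acquire the local stream first, then join the room, so the tracks are
// already attached to the peer connection when the offer/answer is created.
navigator.getUserMedia({ audio: true, video: true }, function (stream) {
    pc.addStream(stream);
    ws.send('join', 'abc123');  // hypothetical room-join message
}, err => console.log('Error getting User Media'));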
Feel free to check out my WebRTC project on GitHub, where I create p2p connections in rooms for many participants; it has a live demo.
MediaStreamTrack.getSources is used to get the video devices connected. It seems to be deprecated now; see this Stack Overflow question and the documentation. Also refer to the MediaStreamTrack.getSources demo and code.
Yes, getUserMedia just turns on the camera. You can see the demo and code here.
Please refer to this peer connection sample and code to stream audio and video between users.
Also take a look at this article on real-time communication with WebRTC.

Can't get GCM push messages being sent properly

So my GCM push message works if I use this test link
http://www.androidbegin.com/tutorial/gcm.html
Here's the response
{ "multicast_id":7724943165862866717,
"success":1,
"failure":0,
"canonical_ids":0,
"results":[{"message_id":"0:1418649384921891% 7fd2b314f9fd7ecd"}]}
However, if I push using my own Node.js service with the toothlessgear/node-gcm lib (https://github.com/ToothlessGear/node-gcm), I get a success message on the server but no message makes it to the client:
{ multicast_id: 5130374164465991000,
  success: 1,
  failure: 0,
  canonical_ids: 0,
  results: [ { message_id: '0:1418649238305331%7fd2b3145bca2e79' } ] }
I also tried the same message using Pushwoosh, and Pushwoosh doesn't work either. How come I'm getting a success message on the server, but no push is received on the client with the latter two services? Is there some sort of IP configuration that I need to do, or some sort of certificate? I've used the same Google API server key, which is open to all IPs, on all 3 of these services.
Why does the response show success on the latter but no message gets received on the client?
Node service server-side code:
var gcm = require('node-gcm');

// create a message with default values
var message = new gcm.Message();

// or with object values
var message = new gcm.Message({
    collapseKey: 'demo',
    delayWhileIdle: true,
    timeToLive: 3,
    data: {
        key1: 'message1',
        key2: 'message2'
    }
});

var sender = new gcm.Sender('insert Google Server API Key here');
var registrationIds = ['regId1'];

/**
 * Params: message-literal, registrationIds-array, No. of retries, callback-function
 **/
sender.send(message, registrationIds, 4, function (err, result) {
    console.log(result);
});
So the pushes were being sent correctly; my issue was with the Cordova plugin on the client, which requires that the Android payload have "message" or "title" set. The sample PHP just coincidentally was setting the message property, and that's why it worked.
Updating the code to add the following to the data:
data: { message: 'test' }
works correctly.
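Putting it together, a minimal sketch of the corrected message (the title value is just an illustration):
var message = new gcm.Message({
    collapseKey: 'demo',
    delayWhileIdle: true,
    timeToLive: 3,
    data: {
        message: 'test',     // required by the Cordova push plugin to surface the notification
        title: 'Demo title', // hypothetical title, also read by the plugin
        key1: 'message1',
        key2: 'message2'
    }
});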

Instant messaging with Intel XDK: receiving a notification on a new message

I would like to create a cross-platform instant messaging app. How can I keep the application listening to the server so that when a message comes in, the application receives a notification?
Maybe like a service in Android?
I've read about push messaging (push.mobi), but it doesn't seem to meet my need, since it blasts the notification to all registered IDs from an admin panel, not from one ID to another ID.
I noticed GCM, but some say it is not suitable for sending and receiving chat.
Sounds like a good scenario for WebSockets. There's a PhoneGap plugin for Android that will allow you to use them.
Take a look at the plugin demo; it looks pretty straightforward.
Client-side JavaScript:
var socket = io.connect("http://10.0.2.2:8080");
document.getElementById('log').innerHTML = "connecting";

socket.on('ping', function (data) {
    document.getElementById('log').innerHTML = data.message;
    socket.emit('pong', { message: 'Hello from client!' });
});

socket.on('connect', function () {
    document.getElementById('log').innerHTML = "connected";
});
Server-side web service in Node.js:
var io = require('socket.io').listen(8080);

io.sockets.on('connection', function (socket) {
    console.log('emit...');
    socket.emit('ping', { message: 'Hello from server ' + Date.now() });
    socket.on('pong', function (data) {
        console.log(data.message);
    });
});

console.log('listening on port 8080');
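For one-to-one chat (one ID to another, rather than a broadcast), the server can keep a map from user IDs to sockets and emit only to the recipient. A minimal sketch that would replace the connection handler above; the event names and the users map are assumptions:
var users = {}; // userId -> socket, filled in when a client identifies itself

io.sockets.on('connection', function (socket) {
    socket.on('register', function (userId) {
        users[userId] = socket; // remember which socket belongs to this user
    });
    socket.on('private message', function (data) {
        // data = { from: 'senderId', to: 'recipientId', message: '...' }
        var target = users[data.to];
        if (target) {
            target.emit('private message', { from: data.from, message: data.message });
        }
    });
});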