Meteor (React): GPS-based tracking for checkout

I created an iOS application using the Meteor-React framework for car rental parking. I want to track the driver's mobile using GPS so that I can detect his checkout status. I need to calculate the distance between the parking space and the driver's location once he has checked in, so that I can send push messages to the driver when he is about to check out. Before starting actual development of GPS-based checkout, I need a suggestion: is it really possible? What would be the pros and cons?
Right now I am going through the npm package "gps-tracking", which is a GPS listener.
I have added the code below in /server/gps.js:
var gps = require("gps-tracking");

var options = {
    'debug': true, // Enable debug output; we also log everything manually below so you can check what happens everywhere
    'port': 8000,
    'device_adapter': "TK103"
};

var server = gps.server(options, function (device, connection) {
    console.log(options);
    console.log(device);
    console.log(connection);

    device.on("connected", function (data) {
        console.log("I'm a new device connected");
        return data;
    });

    device.on("login_request", function (device_id, msg_parts) {
        console.log("Hey! I want to start transmitting my position. Please accept me. My name is " + device_id);
        this.login_authorized(true);
        console.log("Ok, " + device_id + ", you're accepted!");
    });

    device.on("ping", function (data) {
        // this = device
        console.log("I'm here: " + data.latitude + ", " + data.longitude + " (" + this.getUID() + ")");
        // Look at what information the device sends you (maybe velocity, gas level, etc.)
        //console.log(data);
        return data;
    });

    device.on("alarm", function (alarm_code, alarm_data, msg_data) {
        console.log("Help! Something happened: " + alarm_code + " (" + alarm_data.msg + ")");
    });

    // Also, you can listen on the native connection object
    connection.on('data', function (data) {
        // echo raw data package
        //console.log(data.toString());
    });
});
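For the distance check itself, a plain haversine calculation inside the "ping" handler should be enough. Here is a minimal sketch, assuming a hypothetical parking-spot coordinate and checkout radius:

var PARKING_SPOT = { lat: 12.9716, lng: 77.5946 }; // hypothetical coordinates
var CHECKOUT_RADIUS_M = 100; // hypothetical threshold in meters

// Great-circle distance between two points, in meters (haversine formula)
function distanceMeters(lat1, lng1, lat2, lng2) {
    var R = 6371000; // Earth radius in meters
    var toRad = Math.PI / 180;
    var dLat = (lat2 - lat1) * toRad;
    var dLng = (lng2 - lng1) * toRad;
    var a = Math.sin(dLat / 2) * Math.sin(dLat / 2) +
            Math.cos(lat1 * toRad) * Math.cos(lat2 * toRad) *
            Math.sin(dLng / 2) * Math.sin(dLng / 2);
    return 2 * R * Math.atan2(Math.sqrt(a), Math.sqrt(1 - a));
}

// Inside the "ping" handler above:
// var d = distanceMeters(PARKING_SPOT.lat, PARKING_SPOT.lng, data.latitude, data.longitude);
// if (d > CHECKOUT_RADIUS_M) { /* send the "about to check out" push message */ }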
Thanks in advance!

Related

Managing 2 conferences with Voximplant scenario

I am trying to make a conference with Voximplant: when a user makes a call to another user and, while that call is still going on, makes another call to a third user, there are two calls and the callees are added to a video conference.
But it seems the caller is billed twice, and the scenario doesn't look optimised. What should I do to bill once and optimise it?
Scenario:
require(Modules.Conference);

var call, conf = null;

VoxEngine.addEventListener(AppEvents.Started, handleConferenceStarted);

function handleConferenceStarted(e) {
    // Create the conference right after the session starts to manage audio in the right way
    if (conf === null) {
        conf = VoxEngine.createConference(); // create conference
    }
    conf.addEventListener(CallEvents.Connected, function () {
        Logger.write('Conference started');
    });
}

VoxEngine.addEventListener(AppEvents.CallAlerting, function (e) {
    e.call.addEventListener(CallEvents.Connected, handleCallConnected);
    let new_call = VoxEngine.callUser(e.destination, e.callerid, e.displayName, {}, true);
    new_call.addEventListener(CallEvents.Connected, handleCallConnected);
    e.call.answer();
});

function handleCallConnected(e) {
    Logger.write('caller connected');
    conf.add({
        call: e.call,
        mode: "FORWARD",
        direction: "BOTH",
        scheme: e.scheme
    });
}
You need to end the conference when there are no participants left. Refer to the following article in our documentation: https://voximplant.com/docs/guides/conferences/howto. You can find the full scenario code there.
Additionally, I recommend adding handlers for the CallEvents.Disconnected and CallEvents.Failed events right after
new_call.addEventListener(CallEvents.Connected,handleCallConnected);
because sometimes the callee may be offline or press the reject button. 🙂
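A minimal sketch of those two handlers (the cleanup logic here is an assumption, not the documented scenario):

new_call.addEventListener(CallEvents.Disconnected, function (e) {
    Logger.write('callee disconnected');
    // Assumed cleanup: end the conference/session here if nobody is left
});
new_call.addEventListener(CallEvents.Failed, function (e) {
    // Fires e.g. when the callee is offline or rejects the call
    Logger.write('call failed: ' + e.code + ' ' + e.reason);
});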

Using video.js is it possible to get current HLS timestamp?

I have an application which embeds a live stream. To cater for delays, I'd like to know the current timestamp of the stream and compare it with the time on the server.
What I have tested up till now is checking the difference between the buffered time of the video and the current time of the video:
player.bufferedEnd() - player.currentTime()
However I'd like to compare the time with the server instead and to do so I need to get the timestamp of the last requested .ts file.
So, my question is: using video.js, is there some sort of hook to get the timestamp of the last requested .ts file?
Video.js version: 7.4.1
I managed to solve this issue; however, please bear with me, as I don't remember where I found the documentation for this bit of code.
In my case I was working in an Angular application and had a video component responsible for loading a live stream with the use of video.js. Anyway, let's see some code...
Video initialisation
private videoInit() {
    this.player = videojs('video', {
        aspectRatio: this.videoStream.aspectRatio,
        controls: true,
        autoplay: false,
        muted: true,
        html5: {
            hls: {
                overrideNative: true
            }
        }
    });
    this.player.src({
        src: '://some-stream-url.com',
        type: 'application/x-mpegURL'
    });
    // on video play callback
    this.player.on('play', () => {
        this.saveHlsObject();
    });
}
Save HLS Object
private saveHlsObject() {
    if (this.player !== undefined) {
        this.playerHls = (this.player.tech() as any).hls;
        // get and sync server time...
        // make some request to get server time...
        // then calculate the difference...
        this.diff = serverTime.getTime() - this.getVideoTime().getTime();
    }
}
Get Timestamp of Video Segment
// access the player's playlists, get the last segment and extract its time
// in my case the segment URIs were, for example: 1590763989033.ts
private getVideoTime(): Date {
    const targetMedia = this.playerHls.playlists.media();
    const lastSegment = targetMedia.segments[targetMedia.segments.length - 1];
    const uri: string = lastSegment.uri;
    const segmentTimestamp: number = +uri.substring(0, uri.length - 3); // strip the ".ts" extension
    return new Date(segmentTimestamp);
}
So the main point above is the getVideoTime function. The time of a segment can be found in the segment URI, so that function extracts the time from the segment URI and then converts it to a Date object. Now, to be honest, I don't know whether this URI format is a standard for HLS or something that was set for the particular stream I was connecting to. Hope this helps, and sorry I don't have any more specific information!

WebRTC - Change device/camera in realtime

I'm having a problem trying to change my camera in real time. It works for the local video, but the remote person cannot see the new camera and still sees the old one. I tried to stop the stream and init again, but it's still not working. This is just some of my code.
I have searched everywhere and I can't find a solution. Can someone help me out?
function init() {
    getUserMedia(constraints, connect, fail);
}

$(".webcam-devices").on('change', function () {
    var deviceID = this.value;
    constraints.video = {
        optional: [{
            sourceId: deviceID
        }]
    };
    stream.getTracks().forEach(function (track) { track.stop(); });
    init();
});
You need to actually change the track you're sending in the PeerConnection. In Firefox, you can use RTCRtpSender.replaceTrack(newTrack) to change tracks without renegotiation (this is being added to the spec now). Otherwise, you need to add the new stream/track to the RTCPeerConnection, remove the old one, and then handle the onnegotiationneeded event and renegotiate.
See one of #jib's fiddles, the replaceTrack() fiddle:
function flip() {
    flipped = 1 - flipped;
    return pc1.getSenders()[0].replaceTrack(streams[flipped].getVideoTracks()[0])
        .then(() => log("Flip! (notice change in dimensions & framerate!)"))
        .catch(failed);
}
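For the renegotiation path described above, here is a rough sketch (pc, signaling, and the track variables are assumptions for illustration):

// Swap the outgoing video track without replaceTrack(): remove the old
// sender, add the new track, then renegotiate when the browser asks for it.
pc.removeTrack(oldVideoSender);
pc.addTrack(newVideoTrack, newStream);

pc.onnegotiationneeded = function () {
    pc.createOffer()
        .then(function (offer) { return pc.setLocalDescription(offer); })
        .then(function () {
            // Send the new offer over your own signaling channel (assumed here)
            signaling.send(JSON.stringify({ sdp: pc.localDescription }));
        })
        .catch(function (err) { console.log(err); });
};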

WebRTC: Switch from Video Sharing to Screen sharing during call

Initially, I had two different webpages:
One was to do Video Call and
Other was to do Screen Sharing
Now, I want to do both of them in one page.
Here is the scenario:
During Live call, a user wants to stop sharing his/her video and start sharing screen.
Afterwards, again he/she wishes to turn off screen sharing and start video sharing.
For clarity, here are some questions I want to ask:
On Caller Side:
1) How can I change my local stream from video to screen and vice versa?
2) Once it is done, how can I assign it to the local video element?
On Callee Side:
1) How do I handle if the current stream I am receiving is changed from video to screen?
2) How do I handle if the stream I am receiving has stopped? I mean, now I can receive neither video nor screen (just audio)
Kindly help me in this regard. If there is any open source code available, kindly share the links too.
Just for your reference, I was trying to handle it using the following code. (I know this is naive and won't work.)
function handleUserMedia(newStream) {
    var localvideo = document.getElementById("localvideo");
    localvideo.src = URL.createObjectURL(newStream);
    localStream = newStream;
    sendMessage('got user media');
    if (isInitiator) {
        maybeStart();
    }
}

function handleUserMediaError(error) {
    console.log(error);
}

var video_constraints = {video: true, audio: true};
var screen_constraints = {video: { mandatory: { chromeMediaSource: 'screen' } }};

getUserMedia(video_constraints, handleUserMedia, handleUserMediaError);
//getUserMedia(screen_constraints, handleUserMedia, handleUserMediaError);

$scope.btnLabel = 'Share Screen';
$scope.toggleSelected = function () {
    $scope.selected = !$scope.selected;
    if ($scope.selected) {
        getUserMedia(screen_constraints, handleUserMedia, handleUserMediaError);
        $scope.btnLabel = 'Share Video';
    } else {
        getUserMedia(video_constraints, handleUserMedia, handleUserMediaError);
        $scope.btnLabel = 'Share Screen';
    }
};
Check this demo:
https://www.webrtc-experiment.com/demos/switch-streams.html
and the relevant tutorial:
https://www.webrtc-experiment.com/docs/how-to-switch-streams.html
Simply renegotiate the peer connections on both users' sides!
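As a rough illustration, here is a sketch of switching the outgoing video to a screen capture using the modern getDisplayMedia API (pc, localVideo, and the surrounding setup are assumptions; where replaceTrack isn't available, remove/add the track and renegotiate as described in the previous answer):

// Switch the outgoing video from camera to screen.
function shareScreen(pc, localVideo) {
    navigator.mediaDevices.getDisplayMedia({ video: true })
        .then(function (screenStream) {
            var screenTrack = screenStream.getVideoTracks()[0];
            var sender = pc.getSenders().find(function (s) {
                return s.track && s.track.kind === 'video';
            });
            sender.replaceTrack(screenTrack); // or remove/add + renegotiate
            localVideo.srcObject = screenStream; // update the local preview
        })
        .catch(function (err) { console.log(err); });
}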

Using multiple USB cameras with Web RTC

I want to use multiple USB cameras with WebRTC.
For example:
https://apprtc.appspot.com/?r=93443359
This application is a WebRTC sample.
I can connect to another machine, but I have to disconnect once to change the camera.
What I want is:
1. Use two cameras at the same time on the same screen.
2. (If 1 is not possible) Switch the camera without disconnecting the current connection.
Does anyone have information about how to use two cameras with WebRTC?
Call getUserMedia twice and change the camera input in between.
You can use constraints to specify which camera to use, and you can have both of them displayed on one page as well. To specify which camera to use, take a look at the following snippet (it only works on Chrome 30+):
getUserMedia({
    video: {
        mandatory: {
            sourceId: webcamId,
            ...
        }
    }
},
successCallback,
failCallback);
You can get the webcamId like this:
MediaStreamTrack.getSources(function (sources) {
    var cams = _.filter(sources, function (e) { // only return video elements
        return e.kind === 'video';
    });
    var camIds = _.map(cams, function (e) { // return only ids
        return e.id;
    });
});
In the snippet above, I've used the underscore methods filter and map.
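Note that MediaStreamTrack.getSources has since been deprecated; on current browsers the same enumeration looks roughly like this (a sketch using the standard mediaDevices API, no underscore needed):

navigator.mediaDevices.enumerateDevices().then(function (devices) {
    var camIds = devices
        .filter(function (d) { return d.kind === 'videoinput'; })
        .map(function (d) { return d.deviceId; });
    // Pass a deviceId as a constraint, e.g. { video: { deviceId: { exact: camIds[0] } } }
});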
More information on:
WebRTC video sources
constraints