WebRTC + PeerJS - Stream from 2 cameras at the same time from one user

Hi, I currently have a web app where users can stream video from their camera to each other, but I have a question:
Is it possible to stream from 2 cameras at the same time from one user and send these two streams to another one?
I'm using JavaScript with peer.js, node.js, socket.io and express.
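One way this can work (a minimal sketch, assuming the user has two physical cameras; it relies on the standard enumerateDevices/getUserMedia APIs and on the fact that PeerJS allows several calls between the same pair of peers, each carrying its own MediaStream):

```javascript
// Sketch: capture two cameras and send each one as its own PeerJS call.
// Assumes an existing Peer instance and the remote peer's id.
async function callWithTwoCameras(peer, remotePeerId) {
  // Note: deviceIds may be empty until camera permission has been granted once.
  const devices = await navigator.mediaDevices.enumerateDevices();
  const cams = devices.filter(d => d.kind === 'videoinput').slice(0, 2);

  for (const cam of cams) {
    // One MediaStream per physical camera, selected by deviceId.
    const stream = await navigator.mediaDevices.getUserMedia({
      video: { deviceId: { exact: cam.deviceId } },
      audio: false
    });
    // Each call carries one stream; two calls to the same peer are fine.
    peer.call(remotePeerId, stream);
  }
}

// Receiving side: answer every incoming call and attach each remote stream
// to its own <video> element.
function handleIncomingCalls(peer) {
  peer.on('call', call => {
    call.answer(); // answer without sending a stream back
    call.on('stream', remoteStream => {
      const video = document.createElement('video');
      video.srcObject = remoteStream;
      video.autoplay = true;
      document.body.appendChild(video);
    });
  });
}
```

Another option that may work is adding both video tracks to a single MediaStream with addTrack() before calling, but two separate calls keep the feeds easier to tell apart on the receiving side.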

Related

Agora.io broadcast speaker video/audio to another channel(s)

We are developing the following scenario and can only get the speakers' audio to be streamed for some reason. Can someone give advice on how to implement this, or whether it's even possible with the Agora React Native SDK:
I just wanted to check that what we are doing is possible in Agora. So to clarify we have:
Channel 1: speaker 1, maybe speaker 2 - they will chat with each other as normal
Channel 2: Audience 1, Audience 2 - they video chat with each other as normal, plus they receive speaker 1's and speaker 2's video and audio from Channel 1 (but the speakers can't communicate with the Channel 2 audience)
Channel 3+: same as Channel 2 but with different audience members.
Basically, the speakers can be seen and heard by the audience, but they cannot hear the audience. At the moment we have this working, but the speakers' video doesn't show to the audience, just the sound.
cheers
Mike
What you described is possible using the multi-channel features of the SDK. The only limitation with multi-channel is that you can only publish your local video to one channel at a time; you can, however, subscribe to multiple videos (from multiple channels).
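The exact call names differ per platform, so here is a rough sketch of that pattern using Agora's Web SDK (agora-rtc-sdk-ng) rather than the React Native SDK; the same idea applies: one publishing client for the channel you talk in, plus a subscribe-only client for each channel you only watch. App ID, channel names and element ids are placeholders.

```javascript
import AgoraRTC from 'agora-rtc-sdk-ng';

// Sketch: an audience member publishes into their own channel and only
// subscribes to the speakers' channel (local video can go to one channel only).
async function joinAsAudience(appId, ownChannel, speakerChannel) {
  // Client 1: the audience member's own channel (publish + subscribe).
  const main = AgoraRTC.createClient({ mode: 'live', codec: 'vp8' });
  await main.setClientRole('host');
  await main.join(appId, ownChannel, null, null);
  const [mic, cam] = await AgoraRTC.createMicrophoneAndCameraTracks();
  await main.publish([mic, cam]);

  // Client 2: the speakers' channel, subscribe-only.
  const watchSpeakers = AgoraRTC.createClient({ mode: 'live', codec: 'vp8' });
  await watchSpeakers.setClientRole('audience');
  watchSpeakers.on('user-published', async (user, mediaType) => {
    await watchSpeakers.subscribe(user, mediaType);
    if (mediaType === 'video') user.videoTrack.play('speaker-video'); // div id
    if (mediaType === 'audio') user.audioTrack.play();
  });
  await watchSpeakers.join(appId, speakerChannel, null, null);
}
```

If the audience only gets the speakers' sound, a usual suspect is that the video of the speaker channel is never subscribed to (or never rendered), so checking the user-published handler for the 'video' media type is a good first step.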

Does video.js support MPEG2-TS/UDP streams?

I am just starting to play around with video.js and really like it. I currently have some code where I have two players showing two different HLS streams in a single browser page.
However, HLS inherently has high latency and that may not work for my project. So I am wondering if video.js can receive and play MPEG2-TS/UDP streams, which would have less latency (I can easily change the format of all of my source video streams).
My basic requirement is to have 2 players in a single browser page, one player showing the video stream sent from a particular network node, and the second showing how a different network node received that same stream. So the two video.js players on the browser page are showing 2 video streams that are actually the same video, so they are highly correlated. This is why latency is a critical requirement for this project.
Thanks,
-Andres
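For context, video.js cannot consume a raw MPEG2-TS/UDP feed directly, since a browser has no way to receive a UDP stream; such a source would first have to be repackaged server-side (for example into HLS, or a lower-latency option like LL-HLS or WebRTC). The two-player HLS setup described above looks roughly like this (a sketch; element ids and stream URLs are placeholders):

```javascript
// Sketch: two video.js players on one page, each fed a different HLS source.
// Assumes <video-js id="player-a"> and <video-js id="player-b"> elements exist
// and video.js 7+ is loaded (HLS playback is built in via videojs-http-streaming).
videojs('player-a', {
  controls: true,
  sources: [{ src: 'https://example.com/nodeA/stream.m3u8', type: 'application/x-mpegURL' }]
});
videojs('player-b', {
  controls: true,
  sources: [{ src: 'https://example.com/nodeB/stream.m3u8', type: 'application/x-mpegURL' }]
});
```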

video live streaming application in React Native

I am trying to implement video live streaming in React Native:
- live streaming,
- uploading it to a server, and
- saving the streamed video (playback).
Can anyone help me with a sample project?
This will be helpful: https://www.npmjs.com/package/react-native-video
For the point about uploading to the server, what exactly do you need to upload? Video uploading or something else?
So - you'll need a backend server that can accept a video stream, and convert it into a stream that can be consumed in React Native. You'd also like the server to save the streamed video, and encode it so it can be played back as video on demand (VOD) after the stream has stopped. None of this is React - it'll all be done on the backend.
You can build all this yourself, but there are a number of APIs that can do this for you. Disclaimer: I work for one such company: api.video. (A Google search will find others)
For livestreaming from the browser, you can use getUserMedia and stream to a server (you can see my demo using JavaScript at livestream.a.video). This stream will be visible to all your users, and then also recorded and saved as VOD for later playback.
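A rough sketch of that browser side (the /ingest endpoint and one-second chunking are assumptions for illustration, not api.video's actual API; a real service may use WebRTC or RTMP ingest instead):

```javascript
// Sketch: capture the camera and push recorded chunks to an ingest endpoint.
async function startBrowserLivestream() {
  const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
  const recorder = new MediaRecorder(stream, { mimeType: 'video/webm;codecs=vp8,opus' });

  recorder.ondataavailable = async (event) => {
    if (event.data.size === 0) return;
    // The backend re-streams each chunk to viewers and records it for VOD.
    await fetch('/ingest', { method: 'POST', body: event.data });
  };

  recorder.start(1000); // emit a chunk roughly every second
  return recorder;      // call recorder.stop() to end the stream
}
```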
To upload a recorded video, you can use file.slice() to break the video into manageable chunks and upload them to the server for transcoding into a video stream (demo at upload.a.video, and tutorial).
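A minimal sketch of that chunked upload (the /upload endpoint, the 5 MB chunk size and the Content-Range header are assumptions, not api.video's actual API):

```javascript
// Sketch: upload a recorded File in fixed-size chunks using file.slice().
async function uploadInChunks(file, chunkSize = 5 * 1024 * 1024) {
  for (let start = 0; start < file.size; start += chunkSize) {
    const end = Math.min(start + chunkSize, file.size);
    const chunk = file.slice(start, end);

    await fetch('/upload', {
      method: 'POST',
      headers: { 'Content-Range': `bytes ${start}-${end - 1}/${file.size}` },
      body: chunk
    });
  }
}
```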
For playback, these APIs will give you a player URL or the m3u8 URL that you can incorporate into a video player. This is your React Native part: there are several video players that you can add into your application to play back HLS video. At api.video, we have our own player, and we also support 3rd party players.
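On the React Native side, playing the returned m3u8 URL can be as simple as this sketch with react-native-video (the stream URL is a placeholder):

```javascript
import React from 'react';
import Video from 'react-native-video';

// Sketch: play an HLS (m3u8) URL returned by the streaming backend.
export default function LivePlayer() {
  return (
    <Video
      source={{ uri: 'https://example.com/live/stream.m3u8' }}
      style={{ width: '100%', aspectRatio: 16 / 9 }}
      controls
      resizeMode="contain"
    />
  );
}
```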

Recording a call with webRTC

I'm implementing a video chatting service and I need it to be monitored and archived.
All the conversations are done 1 on 1 and have to be written to a file with audio and video (it can be separate files for each user, but they have to be kept on the server), and I can't figure out how that could be done.
I'm using peer.js to connect the two clients together, so to my understanding there is no server in between the two users that could keep the data.
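One common way around this (a sketch, not from the thread): since PeerJS media is peer-to-peer and never passes through your server, record each side in the browser with MediaRecorder, both the local stream and the remote stream received on the call, and upload the resulting files to the server for archiving. The /archive endpoint below is an assumption.

```javascript
// Sketch: record a MediaStream (local or remote) in the browser and upload it.
function recordAndArchive(stream, label) {
  const chunks = [];
  const recorder = new MediaRecorder(stream, { mimeType: 'video/webm' });

  recorder.ondataavailable = e => { if (e.data.size > 0) chunks.push(e.data); };
  recorder.onstop = async () => {
    const blob = new Blob(chunks, { type: 'video/webm' });
    const form = new FormData();
    form.append('file', blob, `${label}.webm`);
    await fetch('/archive', { method: 'POST', body: form }); // server stores the file
  };

  recorder.start();
  return recorder; // call recorder.stop() when the call ends
}

// Usage with PeerJS: record both directions of a call.
// recordAndArchive(localStream, 'local');
// call.on('stream', remote => recordAndArchive(remote, 'remote'));
```

The alternative is to route the call through an SFU/media server instead of pure P2P, which can then record centrally, but that means moving away from plain PeerJS.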

real time gps tracking device

I am working on a product which requires a real time gps tracking.
I searched on Google to collect information, so I got an overview of how GPS basically works.
Now, what else do I need, besides the GPS receiver attached to my product, to display its location on my mobile through an app (designed by us)?
I don't want the whole navigation functionality, but just to display where it is located.
I know a GPS receiver continuously receives signals from satellites, but how do I display that in my mobile app?
Do we send some info to some kind of database and then program the app to collect the info from there?
Basically, you need some kind of connection between your receiver and your phone. Since all phones have their own GPS receiver, I assume that you are going to track the location of a remote device. So the easiest way to do that is to create a basic web service which holds just 2 pages in total:
- one to update data from the receiver (I suggest you use a hash key so bots can't pollute your database),
- one to get data to the mobile app.
Depending on your background and project needs, you can either use a database to keep the location data or just a file (like an XML file or something like that).
Also, you will need to parse the NMEA sentences to get the actual data. It is easier to do that on the web side (e.g. in PHP). So you can just pass the NMEA data as-is to the server, parse it, and record it to the database/file. Then you can fetch the data from your mobile app using simple HTTP requests, as in the sketch below.
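A minimal sketch of such a two-endpoint service in Node/Express (route names, the shared key and the in-memory store are assumptions; a real setup would parse raw NMEA and persist to a database or file):

```javascript
const express = require('express');
const app = express();
app.use(express.json());

const API_KEY = 'change-me';   // shared secret so bots can't write to the store
let lastFix = null;            // in-memory store: { lat, lon, time }

// 1) The receiver (or the gateway that parsed its NMEA) posts the position.
app.post('/update', (req, res) => {
  if (req.query.key !== API_KEY) return res.sendStatus(403);
  lastFix = { lat: req.body.lat, lon: req.body.lon, time: Date.now() };
  res.sendStatus(200);
});

// 2) The mobile app polls the latest position with a plain HTTP GET.
app.get('/location', (req, res) => {
  if (req.query.key !== API_KEY) return res.sendStatus(403);
  res.json(lastFix || {});
});

app.listen(3000);
```

The app side is then just a periodic GET to /location and a marker update on a map view.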
If the two devices will be close to each other you can easily use Bluetooth, but I don't know of any case where it makes sense to place an additional GPS receiver right next to a smartphone :)
First of all, you need some kind of communication channel to send the location from your product to a server (1) or directly to the phone (2).
Usually GPS tracking devices use a GSM modem with GPRS to connect to a remote server. Then your phone app can request location data from the server via an HTTP API. You can use an open-source GPS tracking server for your project and use some simple protocol to send location data to it.
The second popular option requires a GSM modem again, but in this case the device can send SMS messages directly to your phone. In your app you can intercept the SMS and retrieve the location data.