Record remote video stream with audio using WebRTC for Mac

I need a way to record the audio and video of remote peer connections.
We're using the native version of WebRTC for macOS.
In the current WebRTC API, there is no way to access the audio of remote connections.
I haven't tried anything in particular, because this is something I can't find any reference for.

Related

React Native - Connecting to remote WebRTC stream

We have a mobile application that has historically used RTSP streaming to allow a user to watch a live stream, which is currently published via Wowza Streaming Engine. We have had a need to lower stream latency, so we have gravitated towards WebRTC to achieve this.
The problem is that there seems to be a lack of documentation or examples regarding the implementation of a react-native WebRTC viewer which connects to a remote stream.
Does anyone out there have any documentation, or code examples for this kind of implementation?
I do note there is a react-native-webrtc library; however, all its examples demonstrate connecting two peers on mobile phones with their video cameras, i.e. like FaceTime. We are after an example demonstrating someone on a phone connecting to a remote streaming server with a video feed.
Cheers,
If you want a WebRTC client to connect to a server, you need a server doing WebRTC with the proper signaling for your needs. WebRTC doesn't care which signaling you use, so you have to choose it, or choose the platform that fits your needs.
There are a lot of different media servers and libraries that support WebRTC on the server side, each with its own specific signaling (e.g. FreeSWITCH, Kurento, etc.), or with no built-in signaling (e.g. Mediasoup). Few will have a React Native version, as media streaming is not really a JavaScript/UI concern, but you can do something with the react-native-webrtc lib.
Twilio supports a lot of platforms and could be a good start if you're looking for a ready-to-use solution.
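As a rough illustration of the viewer side, here is a minimal sketch using react-native-webrtc. The signaling is a hypothetical JSON-over-WebSocket channel; SIGNALING_URL, the message shapes, and the server pushing the offer are all assumptions that depend entirely on your media server:

// Hypothetical sketch of a receive-only viewer with react-native-webrtc.
// SIGNALING_URL and the JSON message shapes are assumptions; adapt them
// to whatever signaling your media server actually speaks.
import { RTCPeerConnection, RTCSessionDescription, RTCIceCandidate } from 'react-native-webrtc';

const SIGNALING_URL = 'wss://example.com/signal'; // placeholder
const pc = new RTCPeerConnection({ iceServers: [{ urls: 'stun:stun.l.google.com:19302' }] });
const ws = new WebSocket(SIGNALING_URL);

pc.ontrack = (event) => {
  // Feed event.streams[0].toURL() to an <RTCView streamURL={...}> component.
};
pc.onicecandidate = (event) => {
  if (event.candidate) ws.send(JSON.stringify({ type: 'candidate', candidate: event.candidate }));
};

ws.onmessage = async (msg) => {
  const data = JSON.parse(msg.data);
  if (data.type === 'offer') {
    // The server offers its stream; we answer without ever opening a camera.
    await pc.setRemoteDescription(new RTCSessionDescription(data));
    await pc.setLocalDescription(await pc.createAnswer());
    ws.send(JSON.stringify(pc.localDescription));
  } else if (data.type === 'candidate') {
    await pc.addIceCandidate(new RTCIceCandidate(data.candidate));
  }
};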

How to play IP Camera RTSP feed with WebRTC in Ant Media Server?

I have an IP camera with a built-in RTSP URL, but it doesn't have a public IP. I want to stream it with as low latency as possible.
Is there any way that I can play my IP camera with WebRTC using Ant Media Server?
There are two ways to do this with Ant Media Server.
You can add the RTSP URL of your camera to Ant Media Server as a stream source. You can do this as documented here, or you can use this REST method by providing the source URL. With this solution you will have the video/audio data on the server side, and you can record or re-stream it.
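For illustration, a stream-source registration call might look like the sketch below. The host, app name, URL path, and payload fields are assumptions based on Ant Media's v2 REST API; verify them against the documentation for your server version:

// Hypothetical sketch: registering an RTSP camera as a stream source via
// Ant Media's REST API. Path and field names are assumptions; check the
// docs for your server version before relying on them.
async function addStreamSource() {
  const resp = await fetch('https://your-ant-media-host:5443/WebRTCAppEE/rest/v2/broadcasts/create', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      name: 'ipCamera1',                                      // hypothetical stream name
      type: 'streamSource',                                   // tells the server to pull the feed itself
      streamUrl: 'rtsp://user:pass@192.168.1.10:554/stream1', // your camera's local RTSP URL
    }),
  });
  console.log(await resp.json()); // response should include the created broadcast
}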
The second way is more innovative, but it is only applicable to cameras that have their own processors (mostly ARM). In this solution you run the Embedded SDK software on the camera's processor. This software captures video/audio data from the camera and feeds it to the WebRTC peer. In this case the data is sent directly to the WebRTC peer without passing through the server.
You can find more about the Embedded SDK here.
You can find reference project executable files for different architectures here.
Lastly, if you want to modify and build this reference project yourself, you will need the SDK libraries. You can log in to antmedia.io and download them.

Is it possible to detect that a media device is removed during a WebRTC VoIP call

Hi, is it possible to detect that a media device, say a headphone, is removed during an ongoing WebRTC call? We need this so that the peers can renegotiate.
Most probably you will get an error if you're using that media device. If not, then you should poll navigator.mediaDevices.enumerateDevices().
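As a sketch, device removal can also be observed with the standard devicechange event, falling back to the polling this answer suggests where the event isn't fired reliably; the renegotiation itself is left as app-specific:

// Sketch: detect device removal during a call. 'devicechange' fires when
// hardware is added or removed; diffing enumerateDevices() snapshots tells
// us which device disappeared.
let knownIds = new Set();

async function checkDevices() {
  const devices = await navigator.mediaDevices.enumerateDevices();
  const currentIds = new Set(devices.map((d) => d.deviceId));
  for (const id of knownIds) {
    if (!currentIds.has(id)) {
      console.log('Device removed:', id); // kick off renegotiation here
    }
  }
  knownIds = currentIds;
}

navigator.mediaDevices.addEventListener('devicechange', checkDevices);
checkDevices(); // take the initial snapshot
// If 'devicechange' is unreliable in your target browsers, call
// checkDevices() on an interval instead, as the answer suggests.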

Get a stream of a remote camera

I need to start a live stream on a remote computer connected to a webcam,
then connect to that remote IP address and see the live stream, more or less like a security webcam.
On my client I want to be able to see the stream in my browser.
What I've tried so far:
VLC on the remote PC: I start the stream (MMS, HTTP or RTSP) and then I encapsulate the stream as an object in an HTML page.
This works, but I have high latency, and not all browsers support x-vlc-plugin.
WebRTC. This seemed to me the best solution. Direct stream, very low latency.
I tried all the solutions I found on the internet, including ones that integrate Node.js. I also tried to build some code myself, but the problem is this:
I start the stream on the "server", the remote PC.
When I go to the client, I type the IP address and port of the remote PC into the browser. In theory I should be able to see the REMOTE stream, but instead the browser asks for permission to use my LOCAL camera!
Do you have any hints or solutions? What am I doing wrong?
Last project I tried:
https://github.com/xat/webcam-binaryjs-demo
In this project:
https://webrtc.github.io/samples/src/content/peerconnection/multiple-relay/
the developer uses a relay of the stream.
The buttons work, but I don't know how to use this, that is, how to catch the relay and display it on the client.
Thank you for your suggestions.
WebRTC has three main APIs:
getUserMedia: for capturing and streaming the camera/mic in the browser (requests the user's permission to access the camera/mic)
https://developer.mozilla.org/en-US/docs/Web/API/Navigator/getUserMedia
RTCDataChannel: a data channel for sending/receiving any type of data over the connection
https://developer.mozilla.org/en-US/docs/Web/API/RTCDataChannel
RTCPeerConnection: for creating the peer-to-peer connection
https://developer.mozilla.org/en-US/docs/Web/API/RTCPeerConnection
On the viewing side, you don't need getUserMedia.
Find the getUserMedia() call: this method sends an access request for the camera and microphone to the user. You can set both booleans to false, or remove the call carefully:
// (legacy callback API; newer code uses navigator.mediaDevices.getUserMedia)
navigator.getUserMedia(
  { video: false, audio: true }, // set both to false if the viewer sends no media
  function (mediaStream) {
    // success: use mediaStream (e.g. attach it to the peer connection)
  },
  function (error) { console.error(error); } // e.g. permission denied
);
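To make the point concrete, here is a minimal sketch of a receive-only viewer using the modern promise-based API. With no getUserMedia call, the browser never asks for the local camera; the signaling exchange is omitted and depends on your setup:

// Sketch: a receive-only peer. Because we never call getUserMedia, no
// camera/microphone permission prompt appears on the viewing client.
async function startViewer() {
  const pc = new RTCPeerConnection();

  pc.ontrack = (event) => {
    document.querySelector('video').srcObject = event.streams[0]; // the REMOTE stream
  };

  // Declare that we only want to receive video, not send it.
  pc.addTransceiver('video', { direction: 'recvonly' });

  await pc.setLocalDescription(await pc.createOffer());
  // ...send pc.localDescription to the remote side over your signaling
  // channel, then apply its answer with pc.setRemoteDescription(...).
}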

Is it possible to use WebRTC to stream video from server to client?

In WebRTC, I always see implementations of peer-to-peer, and how to get video streaming from one client to another. What about server-to-client?
Is it possible for WebRTC to stream a video file from server to client?
(I am thinking about using the WebRTC Native C++ API to create my own server application that connects to the current implementation in a Chrome or Firefox browser client application.)
OK, if it is possible, will it be faster than many current video streaming services?
Yes, it is possible, as the server can be one of the peers in that peer-to-peer session.
If you respect the protocols and send the video in SRTP packets using VP8, the browser will play it. To help you build these components in other applications or servers, you can check this page and this project as a guide.
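As a sketch of the "server is just another peer" idea, the third-party node-webrtc ('wrtc') package can stand in for the browser APIs on the server side. RTCVideoSource below is part of its nonstandard extensions, and the frame loop is purely illustrative, not a real media source:

// Sketch: a Node.js server acting as a WebRTC peer via the 'wrtc' package.
// RTCVideoSource comes from wrtc's nonstandard API; a real server would
// produce frames by decoding a file or capture device, not blank buffers.
const { RTCPeerConnection, nonstandard } = require('wrtc');
const { RTCVideoSource } = nonstandard;

const source = new RTCVideoSource();
const pc = new RTCPeerConnection();
pc.addTrack(source.createTrack());

const width = 640, height = 480;
setInterval(() => {
  // Push one blank I420 frame (~30 fps); I420 uses 1.5 bytes per pixel.
  const data = new Uint8ClampedArray(width * height * 1.5);
  source.onFrame({ width, height, data });
}, 33);

// From here, exchange offer/answer and ICE candidates with the browser
// over your own signaling channel, exactly as between two browsers.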
Now, comparing WebRTC with other streaming services... it will depend on several variables, like the codec or the protocol. But, for instance, comparing WebRTC (SRTP over UDP with the VP8 codec) against Flash (RTMP over TCP with the H264 codec), I would say that WebRTC wins.
The player would be Flash Player against the native <video> tag.
The transport would be TCP against UDP.
But of course, everything depends on what you are sending to the client.
I have written some apps and plugins using the native WebRTC API, and there isn't a lot of information out there yet, but here are a few useful resources to get you started:
Qt example: http://research.edm.uhasselt.be/jori/qtwebrtc
Native to Browser example: http://sourcey.com/webrtc-native-to-browser-video-streaming-example/
I started with the WebRTC Native C++ to Browser Video Streaming Example, but it no longer builds with the current WebRTC native code.
Then I made modifications, merging into a standalone process:
management of the peer connection (the peerconnection_server)
access to Video4Linux capture (the peerconnection_client)
Removing the browser-to-native stream from the WebRTC Native C++ client gives a simple way to access a Video4Linux device from a WebRTC browser; it is available on GitHub as webrtc-streamer.
Live Demo
We are attempting to replace MJPEGs with WebRTC in our server software, and we have a prototype module for doing this using a smattering of components tied to the OpenWebRTC project. It has been an absolute bear to do, and we have frequent ICE negotiation errors (even over a simple LAN), but it mostly works.
We also built a prototype with the Google WebRTC module, but it had many dependencies. I find it easier to work with the OpenWebRTC modules, because Google's stuff is so tightly tied to general peer-to-peer scenarios in the browser.
I compiled the following from scratch:
libnice 0.1.14
gstreamer-sctp-1.0
usrsctp
Then I had to interact with libnice a bit directly to gather candidates, and also had to write out the SDP files by hand. But the amount of control, being able to control the source of the pipeline, makes it worthwhile. The resulting pipeline serves two clients from one server source.
Of course. I'm writing a program using the native WebRTC API which can join a conference as a peer and record both video and audio.
See: How to stream audio from browser to WebRTC native C++ application
You can definitely stream media from a native app.
I'm sure you can use dummy_audio_file to stream audio from a local file, and you can find a way to access the video streaming process on your own.
Yes it is. We have developed a load test tool to publish and play streams for Ant Media Server. This tool can broadcast a media file. We used the same native WebRTC library used in Ant Media Server.
Sure it's possible; it allows converting live streaming to WebRTC, for example:
OBS/FFmpeg ---RTMP---> Server ---WebRTC--> Chrome/Client
For this scenario, it allows ultra-low-latency live streaming, about 600~800ms, playing the live stream by WebRTC. Please take a look at this demo.