Can Janus WebRTC server implement a server-side peer?

I've been reading about Janus and looked at the examples. I'm looking for a WebRTC component that I can use in the following way:
Receive RTP video packets from some external sender
Become a WebRTC peer and connect to an external WebRTC signaling server, STUN, TURN, the usuals
Send the incoming RTP packets as a coherent video via the WebRTC peer connection to some other peer on a browser on the Internet
Is Janus the right tool? Are there other tools I should look at? I would appreciate some direction.
Thanks!

I am not sure about Janus.
You can achieve this with LM Tools (lmtools.com) with simple configuration. It can receive RTP packets from an external sender and send them on to another peer as per the WebRTC specification.
Please note that LM Tools is not free software like Janus, though you can get a free one-month trial.
Disclaimer: I work for LM Tools.

Can Janus WebRTC server implement a server-side peer?
Yes, it can. What you are looking for is RTP forwarding, and you will get more context and expert opinions from the friendly community on the Janus Google group.

It sounds like you are looking for a gateway solution
(separate RTP/RTCP streams converted to WebRTC's muxed RTP/RTCP).
For this you need to make changes in the Janus code or use a plugin that supports RTP/RTSP ingest.
Out of the box, the Janus server relays RTP/RTCP and messages between browsers.
https://janus.conf.meetecho.com/docs/
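To make that concrete: the Streaming plugin that ships with Janus is the usual way to take plain RTP in and serve it out over WebRTC. Below is a rough sketch of a mountpoint definition using the legacy .cfg syntax (newer Janus releases use an equivalent .jcfg format); the ports, payload type and codec are illustrative and must match whatever the external sender pushes.

; janus.plugin.streaming.cfg -- illustrative mountpoint, adjust to your build
[rtp-from-external-sender]
type = rtp
id = 1
description = Video fed by the external RTP sender
audio = no
video = yes
; port Janus listens on for the incoming RTP packets
videoport = 5004
; payload type and rtpmap must match what the sender pushes
videopt = 96
videortpmap = VP8/90000

The browser side then attaches to the Streaming plugin and asks to watch that mountpoint id; the Janus streaming demo shows that flow.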

Related

React Native - Connecting to remote WebRTC stream

We have mobile application that historically has used RTSP streaming to allow a user to watch a live stream, which currently is published via Wowza Streaming Engine. We have had a need to lower stream latency, so have gravitated towards WebRTC to achieve this.
The problem is that there seems to be a lack of documentation, or examples regarding the implementation of a react-native WebRTC viewer which connects to a remote stream.
Does anyone out there have any documentation, or code examples for this kind of implementation?
I do note there is a react-native-webrtc library; however, all the examples demonstrate connecting two peers on mobile phones with their video cameras, i.e. like FaceTime. We are after an example demonstrating someone on a phone connecting to a remote streaming server with a video feed.
Cheers,
If you want a WebRTC client to connect to a server, you need a server that does WebRTC with signaling that fits your needs. WebRTC doesn't care which signaling you use, so you have to choose one, or choose the platform you need.
There are a lot of different media servers and libraries that support WebRTC on the server side, each with its own signaling (e.g. FreeSWITCH, Kurento) or with no built-in signaling (e.g. mediasoup). Few of them have a React Native version, as media streaming is not really a JavaScript/UI concern, but you can build something with the react-native-webrtc library.
Twilio supports a lot of platforms and could be a good start if you are looking for a ready-to-use solution.
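For illustration, here is a rough react-native-webrtc sketch of the viewer side. It assumes a hypothetical WebSocket signaling endpoint on the media server that sends an SDP offer and exchanges ICE candidates; the URL, message format and onRemoteStream handler are placeholders, and older react-native-webrtc versions expose onaddstream instead of ontrack.

import { RTCPeerConnection, RTCSessionDescription, RTCIceCandidate } from 'react-native-webrtc';

// Placeholder: hand the stream to state and render it with <RTCView streamURL={stream.toURL()} />
const onRemoteStream = (stream) => { /* setState({ streamURL: stream.toURL() }) */ };

// Hypothetical signaling channel exposed by whatever media server you choose
const signaling = new WebSocket('wss://media.example.com/signaling');
const pc = new RTCPeerConnection({ iceServers: [{ urls: 'stun:stun.l.google.com:19302' }] });

// Remote video arrives here (older react-native-webrtc versions use pc.onaddstream instead)
pc.ontrack = (event) => onRemoteStream(event.streams[0]);

// Trickle our ICE candidates to the server
pc.onicecandidate = (event) => {
  if (event.candidate) {
    signaling.send(JSON.stringify({ type: 'candidate', candidate: event.candidate }));
  }
};

// The server offers its outbound stream; the phone answers receive-only
signaling.onmessage = async (msg) => {
  const data = JSON.parse(msg.data);
  if (data.type === 'offer') {
    await pc.setRemoteDescription(new RTCSessionDescription(data));
    const answer = await pc.createAnswer();
    await pc.setLocalDescription(answer);
    signaling.send(JSON.stringify(pc.localDescription));
  } else if (data.type === 'candidate') {
    await pc.addIceCandidate(new RTCIceCandidate(data.candidate));
  }
};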

Do I need SIP + WebRTC

I am working on a WebRTC application that can receive calls in the browser. The caller can be any phone number or an extension dialed from the WebRTC application. I am using a FreeSWITCH server for this purpose.
Can anyone help me understand whether this is achievable using only WebRTC, or whether I need SIP + WebRTC via a library like SIP.js or JsSIP?
You can create a calling application using WebRTC without SIP but you will need to create or choose some form of signalling protocol. WebRTC can transport the audio and video packets for you but it does not specify how to set up the connection between two peers.
Given you're intending to use FreeSWITCH, you may find that using SIP is the easiest option for you. FreeSWITCH plus one of the SIP JavaScript libraries you've mentioned solves your signalling requirements.
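For illustration, a rough JsSIP sketch for registering against FreeSWITCH's WebSocket endpoint and answering/placing calls; the extensions, domain and password are placeholders (7443 is FreeSWITCH's usual WSS port), and details vary by JsSIP version.

import JsSIP from 'jssip';

// User, domain and password are placeholders; point the socket at your FreeSWITCH WSS binding
const socket = new JsSIP.WebSocketInterface('wss://freeswitch.example.com:7443');
const ua = new JsSIP.UA({
  sockets: [socket],
  uri: 'sip:1000@freeswitch.example.com',
  password: 'secret'
});

// Incoming calls routed to this user (from a DID or another extension)
ua.on('newRTCSession', (data) => {
  if (data.session.direction === 'incoming') {
    data.session.answer({ mediaConstraints: { audio: true, video: false } });
  }
});

ua.start();

// Place an outbound call to another extension on the same FreeSWITCH
ua.call('sip:1001@freeswitch.example.com', {
  mediaConstraints: { audio: true, video: false }
});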

Analyzing RTP packets from browser's webRTC stream using Wireshark or similar tool

Is my observation correct that RTP/RTCP packets from a WebRTC stream cannot be analyzed with Wireshark running on the same desktop, because the browser has encrypted them using DTLS/SRTP?
I know there are some browser APIs to help but is there any other approach?
A tool written with libpcap will probably have the same problem.
Firefox has support for dumping the decrypted RTP/RTCP packets into its log files, described here. Unfortunately, Chrome does not have anything similar.
If you use a server, some of them, like Janus, have the ability to generate similar dumps; see here.

SIP-WebRTC gateway/bridge: Kurento OR openwebrtc OR Intel CS for webrtc

I am researching the implementation of a WebRTC-SIP gateway/bridge; that is, for example, making a WebRTC call to a SIP endpoint via a SIP server like Asterisk. I know that Asterisk already supports this, but I need an intermediary server for various needs like logging, recording, and integration with local auth/signalling and other app modules. I looked at Kurento, OpenWebRTC (Ericsson) and the lesser-known Intel Collaboration Suite for WebRTC.
I need a server-side solution that works with my Node application server. Specifically, the server API should be able to generate an SDP for an RTP endpoint and convert a WebRTC SDP to the more generic SDP used by legacy SIP servers, or have a way to bridge these two endpoints. I feel comfortable that this is possible with Kurento (I saw a post on this), except that I am not aware of any jsSIP/sipML5-style API for Kurento, and Kurento itself is not meant to provide signalling. For example, if the SDP generated by Kurento for an RtpEndpoint has to be used in a SIP call/INVITE, how would one implement that? For that matter, how would one initiate a SIP INVITE from Kurento? Are there third-party modules to do this?
Has anyone used the any of the servers listed above for a similar use case?
This is a programming question. I am looking for server APIs to implement a WebRTC to SIP gateway/bridge for media transcoding (if required), SDP transformation and SIP signalling.
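For reference, the Kurento part of this can be sketched with the kurento-client Node library roughly as below. The KMS URL is a placeholder, the browser-leg negotiation (processOffer, ICE handling) is omitted, and the SIP leg is only indicated in comments, since Kurento itself does no signalling; a separate SIP stack would carry the INVITE.

const kurento = require('kurento-client');

kurento('ws://kurento.example.com:8888/kurento', (error, client) => {
  if (error) throw error;
  client.create('MediaPipeline', (error, pipeline) => {
    if (error) throw error;
    pipeline.create('WebRtcEndpoint', (error, webRtcEp) => {
      if (error) throw error;
      pipeline.create('RtpEndpoint', (error, rtpEp) => {
        if (error) throw error;

        // Bridge media in both directions between the browser leg and the RTP/SIP leg
        webRtcEp.connect(rtpEp, (error) => { if (error) throw error; });
        rtpEp.connect(webRtcEp, (error) => { if (error) throw error; });

        // Plain RTP offer that a separate SIP stack could place in an INVITE body
        rtpEp.generateOffer((error, sdpOffer) => {
          if (error) throw error;
          // Send the INVITE with sdpOffer via your SIP library of choice,
          // then feed the SDP from the 200 OK back with rtpEp.processAnswer(...)
        });
      });
    });
  });
});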

WebRTC Relay Server / Broadcast multiple clients

I've got WebRTC peer-to-peer working, but when I want to broadcast a single camera to multiple clients, peer-to-peer obviously isn't suitable.
I've found solutions like
http://lynckia.com
and
http://www.medooze.com/products/mcu/webrtc-support.aspx
But the first I can't get set up (and it seems to have cross-browser issues),
and the second feels like hitting a nail with a nuclear missile.
All I need is a relay, I don't need to decode / recode streams.
I just need
The Broadcaster to connect to the server (peer to peer)
The clients to connect to the server (peer to peer)
The server to relay the stream from the broadcaster to the clients.
Is there any software out there that offers this that I've missed? Or is there another working and scalable alternative?
Thanks
Jitsi Videobridge works pretty much exactly how you describe.
On your server you can run Janus, to which your broadcaster can provide a stream via RTP.
Have a look at an example configuration file.
After writing a configuration file which defines how the server receives the stream from the broadcaster, you should be able to launch Janus in the background from the command line:
$ janus --daemon --config=config_file.conf
Also, see the streaming test demo.
Note: I have not tested this thoroughly.
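To sketch the viewer side of the Janus approach above (assuming janus.js is loaded; the server URL, mountpoint id and element id are placeholders, and newer janus.js versions expose onremotetrack instead of onremotestream):

let streamingHandle = null;

Janus.init({ debug: false, callback: function() {
  const janus = new Janus({
    server: 'wss://janus.example.com:8188/',
    success: function() {
      janus.attach({
        plugin: 'janus.plugin.streaming',
        success: function(handle) {
          streamingHandle = handle;
          // Ask to watch the mountpoint fed by the broadcaster's RTP
          handle.send({ message: { request: 'watch', id: 1 } });
        },
        onmessage: function(msg, jsep) {
          if (jsep) {
            // Janus sends the offer; the viewer answers receive-only
            streamingHandle.createAnswer({
              jsep: jsep,
              media: { audioSend: false, videoSend: false },
              success: function(answer) {
                streamingHandle.send({ message: { request: 'start' }, jsep: answer });
              }
            });
          }
        },
        onremotestream: function(stream) {
          Janus.attachMediaStream(document.getElementById('remotevideo'), stream);
        }
      });
    }
  });
}});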
Have a look at this GitHub repo, inspired by Muaz Khan's WebRTC p2p scalable broadcast. It can work great on a LAN. Over the Internet, I am not sure how well it works as of now, though we are improving it as we go.
If you just want to broadcast from a peer to a set of peers, and they don't care about latency, the best solution is to convert WebRTC to live streaming, without transcoding, just remuxing:
Peer(Publisher) ---WebRTC--> Server --RTMP/HLS/DASH--> Peers/Players
If this works for you, SRS is able to convert WebRTC to live streaming.
Live streaming lets you use a CDN or TCP to deliver the streams, but the latency is about 3-5 seconds, so this solution is only viable when the Peers/Players never need to communicate back to the Peer (Publisher).
If you want all those peers to talk to each other, it's very complex and needs a WebRTC SFU cluster, because there will be a huge number of streams. For example, allowing 100 peers to talk to each other means roughly 100 x 100 = 10k streams.
It's too complicated, so I don't think there is a good open-source solution right now (as of 2022-02).