How can I capture an RTP stream from WebRTC and convert it to HLS to broadcast to clients?
I want to receive RTP from WebRTC in the browser via a media server (e.g. Kurento ...), then convert it to an HLS stream, so users can play it via an hlsEndpoint.
WebRTC -> RTP -> HLS
What is the correct way?
My aim is to create a live-streaming app that supports pushing streams via WebRTC. I'm already working with RTMP and want WebRTC as an additional option.
Thanks, all.
Just use a media server to convert WebRTC to a live-streaming protocol like RTMP, HTTP-FLV, or HLS; please read this wiki.
WebRTC is not just RTP: you also need to transcode the audio from Opus to AAC, and handle things like the jitter buffer, NACK, and out-of-order packets.
For live streaming, RTMP is the de facto standard in the industry, so once you convert WebRTC to RTMP you get everything else: transcoding with FFmpeg, forwarding to YouTube, DVR to file, and so on.
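As a minimal sketch of that last step, the RTMP-to-HLS repackaging can be done with FFmpeg. The command below is an illustrative assumption about a typical setup: the `rtmp://localhost/live/stream` URL, output path, and segment settings are placeholders, not values from the original answer.

```python
import subprocess

def hls_command(rtmp_url: str, playlist_path: str) -> list[str]:
    """Build an FFmpeg command that repackages an RTMP input as HLS.

    WebRTC media is usually VP8 video + Opus audio, neither of which HLS
    players generally accept, so both are transcoded (H.264 + AAC).
    """
    return [
        "ffmpeg",
        "-i", rtmp_url,          # RTMP input published by the media server
        "-c:v", "libx264",       # transcode VP8 -> H.264 for HLS
        "-c:a", "aac",           # transcode Opus -> AAC for HLS
        "-f", "hls",
        "-hls_time", "4",        # ~4-second segments
        "-hls_list_size", "6",   # keep a rolling window of 6 segments
        playlist_path,
    ]

cmd = hls_command("rtmp://localhost/live/stream", "/var/www/hls/stream.m3u8")
print(" ".join(cmd))

# To actually run it (requires FFmpeg installed):
# subprocess.run(cmd, check=True)
```

Clients can then play the generated `.m3u8` playlist over plain HTTP.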
If you need to convert WebRTC to HLS or RTMP, you may check Ant Media Server.
Its Community Edition also provides this.
I'm trying to isolate video and audio. I am able to control the video feed from the caller side, but I am unable to turn off the local video stream on the remote side, since it is an audio call. Any suggestions on how to isolate the video and audio feeds? It doesn't work just by removing the streams obtained via getStream.
How can I take an audio stream captured in the browser using WebRTC and stream it live via the Icecast/SHOUTcast protocols?
Use liquidsoap + webcaster.js:
https://github.com/webcast/webcaster
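For context, a minimal liquidsoap script on the receiving side might look like the following sketch. This assumes liquidsoap's `input.harbor` can accept the audio that webcaster.js pushes; the mount names, ports, and password are illustrative assumptions, not values from the webcaster documentation.

```
# Accept the encoded audio pushed by webcaster.js via harbor input.
audio = input.harbor("webcaster", port=8080)

# Re-encode to MP3 and relay it to a local Icecast server.
output.icecast(%mp3(bitrate=128),
  host="localhost", port=8000,
  password="hackme", mount="/live",
  audio)
```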
I am using WebRTC with the Kurento media server. As far as I know, WebRTC uses VP8 for video streaming and Opus for audio streaming. If I want to compress a stream that includes both audio and video, do I need to use both (VP8 and Opus)?
If you are streaming both audio and video then it will use both an audio codec (typically Opus) and a video codec (typically VP8, VP9, or H.264).
WebRTC is implemented using PeerConnection, as in https://apprtc.appspot.com/
How does WebRTC synchronize the audio and video streams coming from the remote peer?
Normal RTP a/v sync is done using RTCP SR/RR reports and the timestamps in each SRTP packet.
See any VoIP application.
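To make the mechanism concrete, here is a small sketch (with made-up numbers) of how a receiver maps an RTP timestamp to sender wall-clock time using the NTP/RTP timestamp pair from an RTCP Sender Report. Once audio and video packets can both be mapped to the same wall clock, they can be scheduled to play out in sync.

```python
def rtp_to_wallclock(rtp_ts: int, sr_ntp_seconds: float,
                     sr_rtp_ts: int, clock_rate: int) -> float:
    """Map an RTP timestamp to sender wall-clock time.

    An RTCP Sender Report pairs an NTP (wall-clock) timestamp with the
    RTP timestamp of the same instant; any later RTP timestamp is then
    converted by adding the elapsed RTP ticks divided by the clock rate.
    (Wraparound of the 32-bit RTP timestamp is ignored in this sketch.)
    """
    return sr_ntp_seconds + (rtp_ts - sr_rtp_ts) / clock_rate

# Example: audio at 48 kHz (Opus) and video at 90 kHz share one wall clock.
audio_t = rtp_to_wallclock(rtp_ts=480_000, sr_ntp_seconds=1000.0,
                           sr_rtp_ts=0, clock_rate=48_000)
video_t = rtp_to_wallclock(rtp_ts=900_000, sr_ntp_seconds=1000.0,
                           sr_rtp_ts=0, clock_rate=90_000)
# Both map to 1010.0 s, so these samples should be rendered together.
```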