Which codecs are used for online video streaming? - webrtc

I am using WebRTC with the Kurento media server. As far as I know, WebRTC supports VP8 for video streaming and Opus for audio streaming. My question is: if I want to compress a stream that includes both audio and video, do I need to use both (VP8 and Opus)?

If you are streaming both audio and video, then WebRTC will use both an audio codec (typically Opus) and a video codec (typically VP8, VP9, or H.264).
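You can see which codecs were actually negotiated by looking at the `a=rtpmap` lines of the session's SDP. A minimal sketch (the sample SDP below is illustrative, not output from Kurento):

```python
# Minimal sketch: list the codecs a WebRTC session negotiated by parsing
# the a=rtpmap lines of its SDP. The sample SDP here is illustrative.
import re

sample_sdp = """\
m=audio 9 UDP/TLS/RTP/SAVPF 111
a=rtpmap:111 opus/48000/2
m=video 9 UDP/TLS/RTP/SAVPF 96
a=rtpmap:96 VP8/90000
"""

def negotiated_codecs(sdp: str) -> list[str]:
    """Return the codec name from every a=rtpmap line, e.g. ['opus', 'VP8']."""
    return re.findall(r"^a=rtpmap:\d+ ([^/]+)/", sdp, flags=re.MULTILINE)

print(negotiated_codecs(sample_sdp))  # ['opus', 'VP8']
```

In a real session you would pass the SDP from the offer/answer exchange instead of a hard-coded string.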

Related

How to transcode audio from an RTMP stream in real-time to an audio stream

I'm looking to transcode the audio from an RTMP stream to an audio stream in real time. I currently have the RTMP stream published to an RTMP server (https://www.npmjs.com/package/node-media-server). From there, I would like to extract the audio as MP3 or AAC in raw audio chunks and send those chunks to AWS Transcribe for transcription. The part I am not sure how to do is extracting the audio from the RTMP stream and processing the raw audio in real time.
Does anyone have any suggestions on how this could be accomplished?
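One common approach is to spawn FFmpeg as a subprocess: have it pull the RTMP stream, drop the video, and write raw PCM to stdout, then read stdout in fixed-size chunks and forward each chunk to Transcribe. A sketch of the command construction (the RTMP URL is a placeholder; this assumes FFmpeg is installed):

```python
# Sketch: build an ffmpeg command that pulls an RTMP stream and emits raw
# 16 kHz mono signed 16-bit PCM on stdout -- a format the AWS Transcribe
# streaming API accepts. The URL is a placeholder. To run it, pass the
# list to subprocess.Popen(..., stdout=subprocess.PIPE) and read stdout
# in fixed-size chunks, sending each chunk to Transcribe.

def build_ffmpeg_cmd(rtmp_url: str) -> list[str]:
    return [
        "ffmpeg",
        "-i", rtmp_url,   # input: the published RTMP stream
        "-vn",            # drop the video track
        "-f", "s16le",    # raw signed 16-bit little-endian PCM
        "-ar", "16000",   # 16 kHz sample rate
        "-ac", "1",       # mono
        "pipe:1",         # write to stdout
    ]

cmd = build_ffmpeg_cmd("rtmp://localhost/live/stream")
print(" ".join(cmd))
```

If you specifically want MP3 or AAC chunks instead of raw PCM, swap the `-f s16le` pair for `-f mp3` or `-f adts` respectively; Transcribe's streaming API, however, is simplest to feed with raw PCM.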

How to capture rtp stream from webrtc then convert it to hls to broadcast to client?

How can I capture the RTP stream from WebRTC and then convert it to HLS to broadcast to clients?
I want to receive RTP from WebRTC in the browser via a media server (e.g. Kurento), then convert it to an HLS stream that users can play through an HlsEndpoint.
WebRTC -> RTP -> HLS
What is the correct way?
My aim is to create a live-streaming app that supports push streams using WebRTC. I'm working with RTMP, and I want WebRTC as an additional option.
Thanks all.
Just use a media server to convert WebRTC to a live-streaming protocol like RTMP, HTTP-FLV, or HLS; please read this wiki.
WebRTC is not only RTP: you also need to transcode the audio from Opus to AAC, and handle things like the jitter buffer, NACK, and out-of-order packets.
For live streaming, RTMP is the de facto standard in the industry, so if you convert WebRTC to RTMP you get everything else: transcoding with FFmpeg, forwarding to YouTube, DVR to file, etc.
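Once the media server has produced an RTMP feed, the RTMP-to-HLS leg can be a single FFmpeg invocation. A sketch of building that command (the URL, paths, and segment settings are illustrative placeholders):

```python
# Sketch: once a media server has converted WebRTC to RTMP, FFmpeg can
# repackage the RTMP feed as HLS. The URL, output path, and segment
# parameters below are placeholders to adjust for your setup.

def rtmp_to_hls_cmd(rtmp_url: str, playlist: str) -> list[str]:
    return [
        "ffmpeg",
        "-i", rtmp_url,
        "-c:v", "copy",         # H.264 video can usually pass through untouched
        "-c:a", "aac",          # ensure AAC audio (often already AAC on the RTMP side)
        "-f", "hls",
        "-hls_time", "4",       # 4-second segments
        "-hls_list_size", "5",  # keep 5 segments in the live playlist
        playlist,
    ]

print(" ".join(rtmp_to_hls_cmd("rtmp://localhost/live/stream",
                               "/var/www/hls/stream.m3u8")))
```

Point an HTTP server at the output directory and any HLS player can fetch `stream.m3u8`.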
If you need to convert WebRTC to HLS or RTMP, you may check Ant Media Server.
The community edition also provides this.

Does YouTube support Live ingest of HEVC over RTMP or is this only available via HLS?

On the site https://support.google.com/youtube/answer/2853702?ref_topic=6136989 only H.264 is listed as an available ingest codec; however, the page https://developers.google.com/youtube/v3/live/guides/hls-ingestion states that "Supported video codecs are H264 and HEVC."
I've experimented a bit but have been unable to get an RTMP connection working with HEVC. When I switch back to H.264 and keep all the same codec parameters, it works fine.
I talked to YouTube chat support and they said:
"I've checked our available resources here and there is no information yet for hevc. Maybe you can check the site if they have support as well. https://developers.google.com/youtube/v3/live/support"
So here I am :)
Thanks for reading!
Short answer: HEVC is not supported by RTMP.
The RTMP protocol supports H.264 but not HEVC, because its underlying container, FLV, does not support HEVC. So unless Adobe adds it to the specification, it is unlikely to be supported by anyone. The spec was defined quite a while ago, when the most modern common codec was H.264.
The HLS protocol supports both H.264 and HEVC. It is based on MPEG-TS or fMP4, both of which support these codecs.
You can hack HEVC into FLV and then stream it over RTMP (some people do this for custom streaming pipelines or apps), but nobody except you could receive it, since it would not conform to the specification.
Links:
https://en.wikipedia.org/wiki/Comparison_of_video_container_formats#Video_coding_formats_support
https://www.adobe.com/devnet/rtmp.html
https://www.adobe.com/devnet/f4v.html
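The FLV limitation is concrete: the video tag header reserves only 4 bits for a CodecID, and the spec assigns 7 to AVC (H.264) with no value for HEVC. A sketch of decoding that byte (the unassigned value 12 in the comment is what some nonstandard HEVC-in-FLV hacks borrow, not part of the spec):

```python
# Sketch: decode the first byte of an FLV VideoTagHeader. The low 4 bits
# are the CodecID; the FLV spec assigns 7 = AVC (H.264) and defines no
# value for HEVC, which is why standard RTMP/FLV cannot carry it.
# Nonstandard hacks typically borrow an unassigned value (e.g. 12).
FLV_CODEC_IDS = {
    2: "Sorenson H.263",
    3: "Screen video",
    4: "On2 VP6",
    5: "On2 VP6 with alpha",
    6: "Screen video v2",
    7: "AVC (H.264)",
}

def flv_video_codec(first_byte: int) -> str:
    codec_id = first_byte & 0x0F  # low nibble = CodecID
    return FLV_CODEC_IDS.get(codec_id, f"unknown/nonstandard (CodecID={codec_id})")

# 0x17 = frame type 1 (keyframe) in the high nibble, CodecID 7 in the low nibble
print(flv_video_codec(0x17))  # AVC (H.264)
```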
Some enthusiasts have made changes to the RTMP protocol to support HEVC. The effort came from the streaming-camera side, but it looks like it works with just minor changes.
Take a look at our approach to this: Support for HEVC over RTMP in Softvelum products

How to stream WebRTC audio to Icecast server?

How can I make an audio stream captured in the browser using WebRTC be streamed live via the Icecast/SHOUTcast protocols?
Use liquidsoap + webcaster.js:
https://github.com/webcast/webcaster

What's the difference between HLS (HTTP Live Streaming) and DSS (Darwin Streaming Server)?

I'm a beginner developer, and I don't speak English very well, sorry.
I want to broadcast live video from the iPhone camera, like an iPhone video call.
In this case, which should I choose: HLS or DSS?
And what is the functional difference between HLS and DSS?
Can HLS broadcast live video from an iPhone camera to another iPhone?
Darwin Streaming Server is for RTSP streaming. HLS is a streaming technology based on using HTTP server for hosting the content.
iPhone-to-iPhone video isn't well served by either technology. It's possible to use an iPhone camera to capture video, upload it to a server, package it for HLS, and serve it to the client viewers, but this has very high latency (around 10-30 seconds), so it's likely not suitable for you.
If you want 1-to-1 video calling, you're probably better off using a real-time system like RTP, which is what's used by FaceTime and other video-calling programs.
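That latency figure follows from how HLS works: players typically buffer about three segments before starting playback, so latency is at least the segment duration times the number of buffered segments. A back-of-envelope sketch (the segment durations and buffer depth are illustrative defaults, not fixed values):

```python
# Back-of-envelope sketch of why HLS latency lands around 10-30 seconds:
# players commonly buffer roughly three segments before starting playback,
# so glass-to-glass latency is at least segment_duration * buffered_segments,
# plus encode/package/upload time. The numbers here are illustrative.

def hls_min_latency(segment_seconds: float, buffered_segments: int = 3) -> float:
    """Lower bound on HLS startup latency in seconds."""
    return segment_seconds * buffered_segments

for seg in (4, 6, 10):
    print(f"{seg}s segments -> at least {hls_min_latency(seg):.0f}s latency")
```

Shrinking segments reduces latency but increases playlist churn and request overhead, which is why classic HLS rarely gets anywhere near real-time.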