Does YouTube support Live ingest of HEVC over RTMP or is this only available via HLS? - youtube-livestreaming-api

On the support page https://support.google.com/youtube/answer/2853702?ref_topic=6136989, only H.264 is listed as an available ingest codec; however, the page https://developers.google.com/youtube/v3/live/guides/hls-ingestion states that "Supported video codecs are H264 and HEVC."
I've experimented a bit but have been unable to get an RTMP connection working with HEVC. When I switch back to H.264 and keep all the same codec parameters, it works fine.
I talked to YouTube chat support and was told:
"I've checked our available resources here and there is no information yet for hevc. Maybe you can check the site if they have support as well. https://developers.google.com/youtube/v3/live/support"
So here I am :)
Thanks for reading!

Short answer: HEVC is not supported over RTMP.
The RTMP protocol supports H.264 but not HEVC, because its underlying container, FLV, does not support HEVC. Unless Adobe adds it to the specification, it is unlikely to be supported widely. The spec was defined quite a while ago, when the most modern codec was H.264.
The HLS protocol supports both H.264 and HEVC. It is based on MPEG-TS or fMP4 segments, both of which support these codecs.
You can hack HEVC into an FLV container and stream it over RTMP (some people do this for custom streaming pipelines or apps), but nobody except your own software could receive it, since it would not conform to the specification.
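In practice this means that if you want to send HEVC to YouTube, you have to use the HLS ingestion endpoint rather than RTMP. Below is a rough Node/TypeScript sketch that spawns ffmpeg to do that; the stream key, input file, bitrates, and the exact ingest URL format are placeholders based on the HLS ingestion guide linked above, so check that guide before relying on them.

```typescript
import { spawn } from 'child_process';

// Hypothetical stream key; use the one from YouTube Studio or the liveStreams API.
const streamKey = 'xxxx-xxxx-xxxx-xxxx';

// HLS ingestion URL format as described in the HLS ingestion guide (verify there).
const ingestUrl =
  `https://a.upload.youtube.com/http_upload_hls?cid=${streamKey}&copy=0&file=index.m3u8`;

// Encode the input as HEVC (libx265) and deliver it as HLS via HTTP PUT.
const ffmpeg = spawn('ffmpeg', [
  '-re', '-i', 'input.mp4',          // placeholder input; a camera or test source also works
  '-c:v', 'libx265',                 // HEVC video
  '-b:v', '6000k',
  '-c:a', 'aac', '-b:a', '128k',
  '-f', 'hls',
  '-hls_time', '2',
  '-method', 'PUT',                  // playlist and segments are PUT to the ingest URL
  ingestUrl,
]);

ffmpeg.stderr.on('data', (chunk) => process.stderr.write(chunk));
ffmpeg.on('close', (code) => console.log(`ffmpeg exited with code ${code}`));
```

The equivalent RTMP pipeline would swap -c:v libx265 for libx264 and output -f flv to YouTube's usual rtmp://a.rtmp.youtube.com/live2/&lt;stream key&gt; endpoint, which works only because H.264 fits in FLV.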
Links:
https://en.wikipedia.org/wiki/Comparison_of_video_container_formats#Video_coding_formats_support
https://www.adobe.com/devnet/rtmp.html
https://www.adobe.com/devnet/f4v.html

Some enthusiasts have extended the RTMP protocol to support HEVC. The effort came from the streaming-camera side, but it appears to work with only minor changes.
Take a look at our approach to this: Support for HEVC over RTMP in Softvelum products

Related

Is it possible to deliver an RTSP stream via Kurento? WebRTC to RTSP

I want to use Kurento as a media server that takes WebRTC as input and provides an RTSP stream at a URL such as rtsp://kurento/streamName.
Is this possible?
I saw the https://github.com/lulop-k/kurento-rtsp2webrtc/ project, which does the opposite.
My final goal is to deliver a stream to mobile browsers via JSMPEG.
This is not possible; as the Kurento team says: "We can consume it, but not produce it."
A common workaround is to stream from Kurento to a Wowza media server using an RtpEndpoint, and then re-stream RTSP from Wowza. The KMS Google group has a lot of content on integrating the two.
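For reference, here is a minimal sketch of that Kurento-to-Wowza bridge using the kurento-client Node module; the KMS WebSocket URI is a placeholder, the browser-side WebRTC negotiation is omitted, and the SDP exchange with Wowza depends entirely on how its incoming RTP stream is configured.

```typescript
const kurento = require('kurento-client'); // kurento-client npm module

async function bridgeWebRtcToWowza(kurentoWsUri: string): Promise<void> {
  const client = await kurento(kurentoWsUri);
  const pipeline = await client.create('MediaPipeline');

  // Receives the WebRTC stream from the browser (SDP/ICE signalling omitted here).
  const webRtcEndpoint = await pipeline.create('WebRtcEndpoint');

  // Sends plain RTP out of Kurento, e.g. towards a Wowza incoming stream.
  const rtpEndpoint = await pipeline.create('RtpEndpoint');

  await webRtcEndpoint.connect(rtpEndpoint);

  // Generate an SDP offer describing Kurento's outgoing RTP; the answer must come
  // from (or be hand-crafted to match) the Wowza RTP configuration.
  const sdpOffer = await rtpEndpoint.generateOffer();
  console.log('SDP offer to feed into Wowza:\n', sdpOffer);
  // rtpEndpoint.processAnswer(sdpAnswerFromWowza) would complete the negotiation.
}

bridgeWebRtcToWowza('ws://localhost:8888/kurento').catch(console.error); // placeholder URI
```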

Streaming RTSP to WebRTC using Kurento

I have been testing out Kurento for a while now.
I have gone through one2many sample, and got everything working.
Now I would like to do the same, but have the "presenter" be an RTSP source.
I don't have much experience with RTSP, so I might be missing something. I have looked over several samples and they all use the PlayerEndpoint, which receives an rtsp://... address.
For my implementation, I would rather the camera access a Kurento URL in order to initiate the RTSP stream.
Since I have very limited experience with RTSP, I'm not sure if this is possible and if it's a common practice.
If not, what are the alternatives in a case where I don't know the RTSP URI in advance and don't have a UI to input it at runtime?
Follow this example project and you can make Kurento stream RTSP to WebRTC on the fly.
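The gist of such a project is a PlayerEndpoint that pulls the RTSP feed and a WebRtcEndpoint that hands it to the browser. A minimal kurento-client sketch, where the KMS URI and RTSP URI are placeholders and browser signalling/ICE handling is omitted:

```typescript
const kurento = require('kurento-client');

async function rtspToWebRtc(rtspUri: string, browserSdpOffer: string): Promise<string> {
  const client = await kurento('ws://localhost:8888/kurento'); // placeholder KMS URI
  const pipeline = await client.create('MediaPipeline');

  // Pulls the RTSP stream from the camera.
  const player = await pipeline.create('PlayerEndpoint', { uri: rtspUri });

  // Pushes the media to the browser over WebRTC.
  const webRtcEndpoint = await pipeline.create('WebRtcEndpoint');
  await player.connect(webRtcEndpoint);

  // Negotiate with the browser (ICE candidate exchange omitted for brevity).
  const sdpAnswer = await webRtcEndpoint.processOffer(browserSdpOffer);
  await webRtcEndpoint.gatherCandidates();

  await player.play(); // start pulling the RTSP feed
  return sdpAnswer;    // send this back to the browser
}
```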

Streaming webcam and mic inputs through browser

Short version:
I need an in-browser solution to deliver the webcam and mic streams to a server.
Long version:
I'm trying to create a live streaming application. So far I've only managed to figure out this workflow:
Client creates stream (some transcoder is probably required here)
Client sends(publishes?) stream to server (basically hosts an RTMP/other stream that should be accessible by my server)
Server transcodes, transrates, etc. and publishes the stream to a CDN
Viewers watch published stream
Ideally, I'd like a browser-based solution that requires minimal setup from the client's end (a Flash plugin download might be acceptable) and streams the webcam and mic inputs to the server. I'm either unaware of the precise keywords or am looking for the wrong thing, but I can't find an apt solution.
Solutions that involve using ffmpeg or vlc to publish a stream aren't really what I'm looking for, since they require additional download and setup, and aren't restricted to just webcam and mic inputs. WebRTC probably won't serve the same quality but if all else fails, I think it can get the job done, at least for some browsers.
I'm using Ubuntu for development and have just activated a trial license for Wowza streaming server and cloud.
Is ffmpeg/vlc et al. the only way out? Or is there something that can do the job in a single browser tab?
If you go the RTMP way, Adobe Flash Player supports H.264 encoding directly. Since you mentioned Wowza, you can find an example and complete source code (including the .fla) in its examples directory. There's also a demo here. There are many other open-source Flash capture plugins.
You can also use the aforementioned Flash recorder without Wowza. In that case you'll need an RTMP server; a notable example is the Nginx RTMP module, which supports recording (to FLV) and also offers callbacks that let you launch transcoding once the recording is done.
With WebRTC you can record small media chunks (getUserMedia, MediaStreamRecorder) and send them to the server, where they are concatenated, or you can use the peer-to-peer communication features of WebRTC (RTCPeerConnection). For a detailed overview, see my answer here.
In both cases you'll have issues with devices and browsers that don't support Flash or WebRTC, e.g. iPhones and Safari. Also, getUserMedia doesn't capture the same format across all browsers: Firefox records audio and video in WebM, while Chrome records audio in WAV and video in WebM.
For mobile devices you'll probably have to write apps.
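As a rough browser-side sketch of the WebRTC chunk-recording approach mentioned above, using the now-standard MediaRecorder API in place of the MediaStreamRecorder library: the upload URL is a placeholder, and the server is assumed to concatenate the WebM chunks in arrival order.

```typescript
// Capture webcam + mic and upload small WebM chunks to the server.
async function streamToServer(uploadUrl: string): Promise<void> {
  const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });

  const recorder = new MediaRecorder(stream, { mimeType: 'video/webm' });

  recorder.ondataavailable = (event: BlobEvent) => {
    if (event.data.size > 0) {
      // The server is expected to append chunks in arrival order.
      fetch(uploadUrl, { method: 'POST', body: event.data }).catch(console.error);
    }
  };

  recorder.start(2000); // emit a chunk roughly every 2 seconds
}

streamToServer('/upload/webcam').catch(console.error); // placeholder endpoint
```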

What's the difference between HLS (HTTP Live Streaming) and DSS (Darwin Streaming Server)?

I'm a beginner developer, and I don't speak English very well. Sorry.
I want to broadcast live video from an iPhone camera, like an iPhone video call.
In this case, which is the better choice: HLS or DSS?
And what's the functional difference between HLS and DSS?
Can HLS broadcast live video from one iPhone's camera to another iPhone?
Darwin Streaming Server is for RTSP streaming. HLS is a streaming technology based on using an HTTP server to host the content.
iPhone-to-iPhone video isn't well served by either technology. It's possible to use an iPhone camera to capture video, upload it to a server, package it for HLS, and serve it to the client viewers, but this has very high latency (around 10-30 seconds), so it's likely not suitable for you.
If you want 1-to-1 messaging, you're probably better off using a real-time system like RTP, which is what's used by FaceTime and video calling programs.

RTSP streaming iOS 6 using WoWza Server

Does anyone know about RTSP streaming using Wowza Server?
I want to play it in an MPMoviePlayerController in iOS 6, but it reports that there is not enough buffer to keep up. My web service URLs work fine (I have also checked them in a browser), but I can't find anything about RTSP streaming.
Does anyone have any tutorials about RTSP streaming on iPhone using Wowza Server?
RTSP streaming is not natively supported on iPhone, iPad, or iPod touch. Please refer to the following link.
http://www.wowza.com/forums/content.php?62
MPMoviePlayer only supports HTTP Live Streaming. To get RTSP working on iPhone, you must implement your own client on iOS.
The live555 library implements RTSP for you, but you must integrate it into your code. Decoding of the stream must also be implemented in software, by you or a third-party library.
Wowza supports re-streaming the content as HLS. If this is your Wowza server, there are easy instructions on the www.wowza.com site; if it's not, perhaps they are already streaming HLS.
If you have to use RTSP, there are several good players available on the App Store, or you can build your own using live555 as mentioned, or one of our open- or closed-source frameworks.
https://github.com/mooncatventures-group/AVDemoPlay2L