MPEG-2 vs AVC vs HEVC Inputs - RTMP

I am trying to work with MediaLive, and I'm supposed to send an RTMP stream from my iOS/Android devices.
I was checking the MediaLive pricing model and I can see there are multiple kinds of input. As I am new to all this (media stuff), I am not sure what they are.
If I send an RTMP stream from my iOS device, what kind of input will it be?
MPEG-2 Inputs
AVC Inputs
HEVC Inputs

These are video compression standards, listed from the oldest (MPEG-2, from the mid-1990s) to the newest (HEVC, from 2013).
They have different features and specifications. Most importantly, the bitrate they need to deliver the same quality level differs significantly: HEVC gives the biggest bitrate savings, but it is also the most complex in terms of hardware/software. As for your case, an RTMP stream pushed from an iOS or Android device normally carries H.264 video, so it would count as an AVC input.
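If you want to see the bitrate difference yourself, a rough sketch (assuming a local FFmpeg build with libx264/libx265 and a test clip called input.mp4 - both are my own placeholders, not part of MediaLive) is to encode the same source with each codec at roughly comparable quality settings and compare the output sizes:

    # Same source, roughly comparable quality targets, three codecs; compare file sizes afterwards.
    ffmpeg -i input.mp4 -an -c:v mpeg2video -q:v 5 out_mpeg2.ts
    ffmpeg -i input.mp4 -an -c:v libx264 -crf 23 out_avc.mp4
    ffmpeg -i input.mp4 -an -c:v libx265 -crf 28 out_hevc.mp4

The HEVC file will typically come out markedly smaller than the AVC one, which in turn is far smaller than the MPEG-2 one.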

Related

What video encoding format supports variable bit rate streaming (quality) and a SEEK function to skip segments?

I'm looking for a multi-platform video format that allows a variable bit rate for streaming over high- or low-quality links, and also a seek function that allows a byte-range offset (as in HTTP range requests) to fetch a segment that is missing or of lower quality than desired.
I think it is worth separating encoding from streaming to give some background.
Most streaming protocols stream video packaged in 'containers', e.g. MP4, WebM etc. The containers include video tracks which can be encoded with different codecs, e.g. H.264, H.265, VP9 etc.
The term variable bit rate is usually used to describe how the encoding is done - i.e. the encoder may compress the video so it has variable quality at a fixed bit rate, or it may try to maintain a given quality level with a variable bit rate.
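In FFmpeg terms (a hedged illustration only - libx264 and the input.mp4 name are my assumptions, not something from the question), those two encoding modes look roughly like this:

    # Constant quality, variable bit rate: quality is pinned by CRF, bitrate floats with content.
    ffmpeg -i input.mp4 -c:v libx264 -crf 23 vbr_out.mp4
    # Roughly constant bit rate: bitrate is capped, quality floats with content.
    ffmpeg -i input.mp4 -c:v libx264 -b:v 2M -maxrate 2M -bufsize 4M cbr_out.mp4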
I suspect what you may be more interested in is what is called Adaptive Bit Rate streaming - here the video is 'transcoded' into multiple copies, each with a different bit rate. The copies are all segmented at the same points, for example every two seconds.
A client can choose which bit rate to request for the next segment of the video, i.e. the next 2 second chunk, depending on its capabilities and on the network conditions at that time. Hence, the bit rate actually being streamed to the device can vary over time. See this answer for how you can view this in action in live examples: https://stackoverflow.com/a/42365034/334402
Assuming this meets your needs, then the two dominant ABR streaming formats at the moment are HLS and MPEG-DASH.
Traditionally HLS uses TS segments while DASH uses fragmented MP4. Both are now converging on CMAF, which means that the bulk of the media will be in a single multi-platform format, as you are looking for. There is a good overview of CMAF here at the time of writing: https://developer.apple.com/documentation/http_live_streaming/about_the_common_media_application_format_with_http_live_streaming
One caveat is that if your content is encrypted then, at the moment, different devices support different encryption modes so you may need to have separate HLS and DASH encrypted media for the medium term, until device support evolves over time.
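If you want to experiment with ABR yourself, here is a rough FFmpeg sketch (the input.mp4 source, rendition sizes, and bitrates are all assumptions of mine) that produces two HLS renditions segmented every two seconds, plus the master playlist a player uses to switch between them:

    # Two renditions (720p @ 2.5 Mbit/s, 360p @ 800 kbit/s), 2-second segments, VOD playlist.
    ffmpeg -i input.mp4 \
        -map 0:v -map 0:a -map 0:v -map 0:a \
        -c:v libx264 -c:a aac \
        -filter:v:0 scale=1280:720 -b:v:0 2500k \
        -filter:v:1 scale=640:360 -b:v:1 800k \
        -f hls -hls_time 2 -hls_playlist_type vod \
        -var_stream_map "v:0,a:0 v:1,a:1" \
        -master_pl_name master.m3u8 \
        -hls_segment_filename "stream_%v_%03d.ts" \
        stream_%v.m3u8

Pointing an HLS-capable player at master.m3u8 then lets it switch between the 720p and 360p renditions as network conditions change.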

Does UE4/5 Leverage GPU for Video Rendering?

I want to generate many hours of video of a 3D board game. I have a powerful RTX 3090, and I want to utilize it as much as possible.
Can UE4/5 effectively leverage GPU resources to render video? If not, is there a better engine to effectively leverage GPUs for rendering?
Unreal Engine provides very advanced cinematic tools, including the Movie Render Queue module, which allows rendering the UE viewport to an image sequence or video (in a very limited number of formats). However, UE doesn't encode video on the GPU in that specific case. I write "that specific" for a reason: UE does use the Nvidia GPU encoder (if available) for fast and efficient encoding to H.264 when the Pixel Streaming feature is used to stream video with WebRTC, but that is for interactive streaming of the engine, not for offline video encoding. So even though you can deploy UE on a remote render farm and try to encode a massive amount of video data, it won't be as fast as using a dedicated hardware encoder such as NVIDIA NVENC. Moreover, Movie Render Queue doesn't provide H.264 encoding at all; you would have to render to a JPG/PNG/BMP image sequence, then use a tool like FFmpeg to convert it to video. I recently put an MP4 encoder plugin for Unreal Engine on GitHub, which I wrote for my own needs, but that one also uses the CPU to perform the encoding.
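For the image-sequence route, a hedged sketch of the FFmpeg step (the frame naming and settings are my own assumptions, and it requires an FFmpeg build with NVENC support) that at least moves the encoding onto the RTX 3090's hardware encoder:

    # Encode a rendered PNG sequence to H.264 with NVENC; frames assumed to be frame_0001.png, frame_0002.png, ... at 30 fps.
    ffmpeg -framerate 30 -i frame_%04d.png -c:v h264_nvenc -pix_fmt yuv420p -b:v 8M out.mp4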

What technologies should I use to produce a WebM live stream from a series of in-memory bitmaps?

Boss handed me a bit of a challenge that is out of my usual ballpark, and I am having trouble identifying which technologies/projects I should use. (I don't mind, I asked for something 'new' :)
Job: Build a .NET server-side process that can pick up a bitmap from a buffer 10 times per second and produce/serve a 10fps video stream for display in a modern HTML5 enabled browser.
What Lego blocks should I be looking for here?
Dave
You'll want to use FFmpeg. Here's the basic flow:
Your App -> FFmpeg STDIN -> VP8 or VP9 video wrapped in WebM
If you're streaming these images in, probably the easiest thing to do is decode each bitmap into raw RGB or RGBA pixels and then write each frame to FFmpeg's STDIN. You will have to read the first bitmap to determine the size and color information, then execute the FFmpeg child process with the correct parameters. When you're done, close the pipe and FFmpeg will finish up your output file. If you want, you can even redirect FFmpeg's STDOUT to somewhere like blob storage on S3 or something.
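As a concrete sketch of that child-process command line (the frame size, pixel format, and bitrate here are assumptions, not requirements), raw RGBA frames go in via STDIN and VP9-in-WebM comes out via STDOUT:

    # 640x480 RGBA frames at 10 fps piped in on stdin ("-"), WebM/VP9 written to stdout ("pipe:1").
    ffmpeg -f rawvideo -pix_fmt rgba -s 640x480 -r 10 -i - \
        -c:v libvpx-vp9 -b:v 1M -f webm pipe:1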
If all the images are uploaded at once and then you create the video, it's even easier. Just make a list of the files in-order and execute FFmpeg. When FFmpeg is done, you should have a video.
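For that batch case, a minimal sketch, assuming sequentially numbered BMP files at 10 fps (the names and bitrate are placeholders):

    ffmpeg -framerate 10 -i frame_%04d.bmp -c:v libvpx -b:v 1M output.webm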
One additional bit of information that will help you understand how to build an FFmpeg command line: WebM is a container format. It doesn't do anything but keep track of how many video streams, how many audio streams, what codecs to use for those streams, subtitle streams, metadata (like thumbnail images), etc. WebM is basically Matroska (.mkv), but with some features disabled to make adopting the WebM standard easier for browser makers. Inside WebM, you'll want at least one video stream. VP8 and VP9 are very compatible codecs. If you want to add audio, Opus is a standard codec you can use.
Some resources to get you started:
FFmpeg Documentation (https://ffmpeg.org/documentation.html)
Converting raw images to video (https://superuser.com/a/469517/48624)
VP8 Encoding (http://trac.ffmpeg.org/wiki/Encode/VP8)
FFmpeg Binaries for Windows (https://ffmpeg.zeranoe.com/builds/)

Windows 8 low latency video streaming

My game is based on Flash and uses RTMP to deliver live video to players. Video should be streamed from a single location to many clients, not between clients.
It's an essential requirement that the end-to-end video stream have very low latency, less than 0.5 s.
Using many tweaks on the server and client, I was able to achieve approximately 0.2 s latency with RTMP and Adobe Flash Media Live Encoder over the loopback network interface.
Now the problem is to port the project to a Windows 8 store app. Natively, Windows 8 offers Smooth Streaming extensions for IIS, plus http://playerframework.codeplex.com/ for the player, plus a video encoder compatible with Live Smooth Streaming. As for the encoder, so far I have tested only Microsoft Expression Encoder 4, which supports Live Smooth Streaming.
Despite using the msRealTime property on the player side, the latency is huge, and I was unable to get it below 6-10 seconds by tweaking the encoder. Different sources state that [Live] Smooth Streaming is not a fit for low-latency video streaming scenarios, and it seems that with Expression Encoder 4 it's impossible to achieve low latency with any combination of settings. There are hardware video encoders which support Smooth Streaming, such as ones from Envivio or Digital Rapids; however:
They are expensive
I'm not sure at all if they can significantly improve latency on encoder side, compared to Expression Encoder
Even if they can eliminate the encoder's latency, can the rest of the Smooth Streaming chain (the IIS side) deliver the required speed?
Questions:
What technology could be used to stream to Win8 clients with subsecond latency, if any?
Do you know of players compatible with Win8, or easily portable to Win8, which support RTMP?
Addition: the live broadcast of Build 2012 uses RTMP and Smooth Streaming in Desktop mode. In Metro mode, it uses RTMP and Flash Player for Metro.
I can confirm that Smooth Streaming will not be your technology of choice here. Under the very best scenario with perfect conditions, the best you're going to get is a few seconds (absolute minimum latency would be the chunk length itself, even if everything else had 0 latency.)
I think most likely RTSP/RTMP or something similar using UDP is your best bet. I would be looking at Video Conferencing technologies more than wide audience streaming technologies. If I remember correctly there are a few .NET components out there to handle RTSP H.264 meant for video conferencing - if I can find them later I will post here.

Streaming encoded video in Adobe AIR Application

I am developing a desktop application in Adobe AIR that will be used to stream the user's camera video to a Wowza media server. I want to encode the video on the fly, i.e. transmit H.264-encoded video instead of the default Flash Player encoding, for quality purposes. Is there any way to do this?
Waiting for help from people around,
Rick
H.264 encoding is usually done in native C or C++ code because it is a CPU-intensive set of algorithms. The source code for x264 can give you an idea of the code required, but it is a tough read if you start from scratch. Here is a book to get you started, or you can read the original AVC standard if you suffer from insomnia.