I have created a sample application to demonstrate how HTTP Live Streaming works.
What I have done is this: I have a library that takes a video file as input (avi, mpeg, mov, .ts) and generates segment (.ts) and playlist (.m3u8) files for it. I store each playlist (as a string) in a linked list as and when I get playlist data from the library.
I have also written a basic web server that serves the segment and playlist files the user requests. When I request the playlist.m3u8 file from the Safari browser on an iPhone, it launches the QuickTime player, which then requests the segment.ts files listed in the received playlist. After playing every segment listed in the current playlist, it requests the playlist again, and I respond with the next playlist file, which lists the next set of segment.ts files.
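To make the setup concrete, here is a minimal sketch of the kind of web server I mean, using Python's built-in http.server; the output directory and port are placeholders for wherever my library actually writes its .m3u8/.ts files:

```python
# Minimal sketch of a web server that serves HLS playlists and segments with the
# MIME types players expect. The 'stream' directory and port 8080 are placeholders.
import http.server
import socketserver

class HLSRequestHandler(http.server.SimpleHTTPRequestHandler):
    # Serve .m3u8 and .ts with HLS-appropriate content types.
    extensions_map = {
        **http.server.SimpleHTTPRequestHandler.extensions_map,
        '.m3u8': 'application/vnd.apple.mpegurl',
        '.ts': 'video/mp2t',
    }

    def __init__(self, *args, **kwargs):
        super().__init__(*args, directory='stream', **kwargs)

with socketserver.TCPServer(('', 8080), HLSRequestHandler) as httpd:
    httpd.serve_forever()
```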
Is this what we call HTTP Live Streaming?
Is there anything else, other than this, that I need to do to implement HTTP Live Streaming?
Thanks.
Not much more. If you are taking input streams of media, encoding them, encapsulating them in a format suitable for delivery, and preparing the encapsulated media for distribution by placing it where it can be requested from an HTTP server, you are done. The idea behind HTTP Live Streaming is that it leverages existing Internet architecture that is already optimized for serving HTTP requests for reasonably sized resources.
HTTP streaming renders many existing CDN solutions, with their custom streaming protocols, custom routing, and custom content caching, largely obsolete.
You can also use the mediastreamvalidator command-line application on Mac OS X to validate the streams generated by your HTTP web server.
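For the encoding and encapsulation step, something along these lines is typical; this is only a rough sketch using ffmpeg's hls muxer driven from Python, with the file names, codecs, and segment length chosen arbitrarily:

```python
# Rough sketch: encode a source file and cut it into .ts segments plus an .m3u8
# playlist with ffmpeg's hls muxer. File names, codecs, and segment length are
# arbitrary choices, not requirements.
import os
import subprocess

os.makedirs('stream', exist_ok=True)
subprocess.run([
    'ffmpeg', '-i', 'input.mov',
    '-c:v', 'libx264', '-c:a', 'aac',   # H.264 video / AAC audio, as HLS expects
    '-f', 'hls',
    '-hls_time', '10',                  # target segment duration in seconds
    '-hls_list_size', '0',              # keep all segments in the playlist
    'stream/playlist.m3u8',
], check=True)

# Drop the resulting playlist.m3u8 and segment files behind any HTTP server,
# then point the client at playlist.m3u8 and run mediastreamvalidator against it.
```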
More or less, but there's also adaptive bit-rate streaming to take care of if you want your server to push files to iOS devices. That means your scope expands from a single "index.m3u8" file that tracks all the TS files to a master index that tracks the index files for each bit rate you want to support in your application, each of which in turn tracks the TS files encoded at its respective bit rate.
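To make that master-index idea concrete, the master playlist is just a small text file pointing at the per-bitrate index files; here is an illustrative sketch where the bandwidth values, resolutions, and paths are made-up placeholders:

```python
# Sketch of a master index that points at per-bitrate variant playlists.
# BANDWIDTH values, resolutions, and paths are made-up placeholders.
MASTER_M3U8 = """#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=400000,RESOLUTION=640x360
low/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1200000,RESOLUTION=1280x720
mid/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=3000000,RESOLUTION=1920x1080
high/index.m3u8
"""

# Each of low/mid/high/index.m3u8 is an ordinary media playlist listing the TS
# segments encoded at that bit rate; the player picks among them at runtime.
with open('master.m3u8', 'w') as f:
    f.write(MASTER_M3U8)
```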
It's a good amount of work, but mostly routine/repetitive once you've got the hang of the basics.
For more on streaming, your bible, from the iOS standpoint, should ALWAYS be TN2224. Adhering closely to the specs in the Technote is your best chance of getting through the App Store approval process vis-a-vis streaming.
Some people don't bother (I've been building a streaming app over the past couple of months and have looked at the HTTP logs of a whole bunch of video apps that don't quite stick to the rules) - sometimes Apple notices, sometimes it doesn't, and sometimes the player is just too big for Apple to interfere with.
So it's not very different there from every other aspect of the functionality of your app that undergoes Apple's scrutiny. It's just that there are ways you can be sure you're on the right track.
And of course, from a purely technical standpoint, as #psp1 mentioned the mediastreamvalidator tool can help you figure out if your streams are - at their very core, even if not in terms of their overall abilities - compatible with what's expected of HLS implementations.
Note: You can roll your own encoding solution (with ffmpeg, for example). The plus is that you have more control; the minus is that it takes time to configure and get working just RIGHT, and once you start talking about even the smallest amount of scale, you run into a whole host of other problems. And once you're done with all the technical hard work, you'll find that was the easy part: now you have to figure out which license you need for shipping a fancy H.264 encoder and jump through all the legal/procedural hoops to get one.
The easier solution for a developer without a legal/accounting team that could fill a football field is, IMO, to go third-party with services like Encoding.com, Zencoder, etc., who provide their encoding services a-la-carte or with a monthly fee. The plus is that they've taken care of all the licensing BS and just provide a simple "pay to use" service, which can also be extremely useful when you're building a project for a client. The minus is that you're now DEPENDENT on Zencoder/Encoding.com, the flip side of which you'll discover when your encoding jobs fail for a whole day because their servers are down, or when the API doesn't behave the way you expect or the way it's documented!
But anyhow, those are about all the factors you've got to grok before pushing an HLS server into production!
For the welcome screen of my app, we are trying to serve up a webpage in a webview that consists of a video and some text. (We want to go this route so that we can quickly update the welcome screen and test changes on the fly, versus having to submit and get approval each time.)
The video is only 8.6 MB and is currently being played via HTML5, hosted on S3 and served via CloudFront. However, playback still tends to be a bit choppy at times. Does anyone have any recommendations for an optimal way to host and serve up the video to make it play smoothly? Are there any specific S3 or CloudFront settings anyone would recommend that could help?
Thanks in advance for any help anyone can provide.
The most common technique currently is to use ABR in parallel with a CDN to provide smooth playback.
ABR, Adaptive Bit Rate, involves making multiple copies of the video at different bit rates, from low to high and hosting these on the server.
The client receives an index file for the videos, e.g. an m3u8 manifest file, and then chooses the best bit rate for the current conditions to allow smooth playback without buffering.
If network conditions improve, the client will 'step up' bit rates, and if they get worse, it will 'step down' bit rates.
Typically a low or medium bit rate is chosen as the first one to allow quick and smooth start up.
You can see this effect on services like Netflix as they start up, and you can also see it on YouTube if you right click the video and select 'Stats for Nerds'.
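The selection logic itself is conceptually simple. Here is an illustrative sketch of the kind of decision an ABR client makes; it is not any particular player's actual algorithm, and the bitrate ladder and safety margin are made-up values:

```python
# Illustrative sketch of the 'step up / step down' decision an ABR client makes.
# Not any real player's algorithm; the ladder and safety margin are example values.
LADDER_BPS = [400_000, 1_200_000, 3_000_000]   # bit rates advertised in the manifest
SAFETY = 0.8                                   # only budget ~80% of measured throughput

def choose_variant(measured_throughput_bps: float) -> int:
    """Return the highest bit rate that fits comfortably in the current throughput."""
    usable = measured_throughput_bps * SAFETY
    candidates = [rate for rate in LADDER_BPS if rate <= usable]
    return max(candidates) if candidates else min(LADDER_BPS)

print(choose_variant(2_500_000))   # 1200000 -> steps down from the 3 Mbit/s variant
print(choose_variant(5_000_000))   # 3000000 -> steps up to the top variant
```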
Some links for ABR with AWS Elastic Transcoder - you can set the bit rates you want; for example, see the note below from their FAQ regarding HLS jobs:
Specify that the transcoding job create a playlist that references the outputs. You should order your bit rates from lowest to highest, with the audio only stream last, since this order will be maintained in the generated playlist file. Once your transcoding job has completed, the output bucket will contain a proper arrangement of your master and individual M3U8 playlists, and MPEG-2 TS media stream fragments.
Take a look at the sample request on this page, which includes two different bit rates (video service providers will generally have more than two, but this gives you a feel for the approach):
http://docs.aws.amazon.com/elastictranscoder/latest/developerguide/create-job.html
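If you end up scripting this, the Elastic Transcoder job looks roughly like the following sketch using boto3; the pipeline ID, preset IDs, and object keys are placeholders you would replace with your own values:

```python
# Rough sketch of an Elastic Transcoder HLS job with two bit rates, via boto3.
# The pipeline ID, preset IDs, and keys are placeholders; substitute your own.
import boto3

transcoder = boto3.client('elastictranscoder', region_name='us-east-1')

response = transcoder.create_job(
    PipelineId='YOUR_PIPELINE_ID',
    Input={'Key': 'inputs/source.mp4'},
    Outputs=[
        {'Key': 'hls/0400k/video', 'PresetId': 'YOUR_HLS_400K_PRESET_ID', 'SegmentDuration': '10'},
        {'Key': 'hls/2000k/video', 'PresetId': 'YOUR_HLS_2M_PRESET_ID', 'SegmentDuration': '10'},
    ],
    Playlists=[{
        'Name': 'hls/master',
        'Format': 'HLSv3',
        # Keep the order lowest to highest bit rate, per the FAQ note above.
        'OutputKeys': ['hls/0400k/video', 'hls/2000k/video'],
    }],
)
print(response['Job']['Id'])
```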
Azure Media Services has a built-in "Adaptive Streaming" preset that is content-adaptive and can adjust the encoding settings to match the content coming in.
See the following - https://learn.microsoft.com/en-us/azure/media-services/media-services-autogen-bitrate-ladder-with-mes
I have developed a site that hosts user videos. I store the video files in AWS S3, I deliver them through AWS CloudFront, and I use video.js as the site's player, with HTML5 as the default and Flash as the fallback.
Generally the video streaming seems to work fine, but in some cases I receive complaints from users about slow or choppy video playback. I want to create some tests to measure streaming performance so I can distinguish problems on the user's side (e.g. a slow connection) from problems with my service.
Are there any best practices or tools for collecting video delivery metrics? I'm interested in open-source solutions or something I can implement myself, because it's just a personal project, but I don't want to reinvent the wheel.
Testing progressive download implies checking the transmission bandwidth and its continuity. For example, with a high transmission rate the initial client buffer fills faster and playback starts sooner. However, losing that transmission capacity at some later time can cause re-buffering. The total transmission time of your file must be lower than the video duration.
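As a quick sanity check, you can compare the transfer time against the video duration; here is a back-of-the-envelope sketch with made-up numbers:

```python
# Back-of-the-envelope check: can the download keep up with playback?
# The numbers below are made-up examples.
file_size_bytes = 50 * 1024 * 1024   # 50 MB progressive-download video
duration_seconds = 120               # length of the video
throughput_bps = 2_000_000           # measured client throughput, bits per second

avg_video_bitrate = file_size_bytes * 8 / duration_seconds
transfer_time = file_size_bytes * 8 / throughput_bps

print(f'average video bit rate: {avg_video_bitrate / 1e6:.2f} Mbit/s')
print(f'transfer time: {transfer_time:.0f} s vs. duration: {duration_seconds} s')
if transfer_time > duration_seconds:
    print('-> this client will re-buffer')
```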
To identify potential issues you can start with the S3 bucket logs and the CloudFront cache statistics and access logs.
There's a load-testing tool written in Java called Apache JMeter. It cannot execute JavaScript, so it must be configured to request the files directly.
The disadvantage of using a load-testing tool from a single location is pretty evident: different geographical areas and carriers have different characteristics, and test results will differ.
There are online, non-open-source tools that can load test from multiple locations, but they are generally paid, though some offer free trials.
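If you'd rather script the measurement yourself instead of setting up JMeter, the core of it is just timing a download and deriving throughput; here is a rough sketch where the CloudFront URL is a placeholder for one of your own video URLs:

```python
# Rough sketch of a measurement you can script yourself: time a ranged download
# of a video and derive the effective throughput. The URL is a placeholder for
# one of your own CloudFront video URLs.
import time
import urllib.request

URL = 'https://dxxxxxxxxxxxxx.cloudfront.net/videos/sample.mp4'
PROBE_BYTES = 5 * 1024 * 1024   # pull only the first 5 MB to keep the probe cheap

request = urllib.request.Request(URL, headers={'Range': f'bytes=0-{PROBE_BYTES - 1}'})
start = time.monotonic()
with urllib.request.urlopen(request) as response:
    data = response.read()
elapsed = time.monotonic() - start

print(f'downloaded {len(data)} bytes in {elapsed:.2f} s '
      f'({len(data) * 8 / elapsed / 1e6:.2f} Mbit/s)')
```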
Here's another way to look at this.
but in some cases I receive complaints from users for slow or choppy video playback.
If you're using an adaptive HLS stream, and you're using CloudFront, and the video is still choppy for some users, that's probably because of their own internet connection speeds.
In that case, you can encode your video in multiple resolutions (using just one AWS MediaConvert job, btw) - like 1080p, 720p, 360p, 240p, 144p etc.
And then Video.js has a stream-switcher plugin that will 1) automatically start playing the highest resolution that's right for the viewer's connection, and no higher, and 2) give the user the option, via a "Settings" (gear) icon in the control bar, to switch resolutions manually.
That way, even those with really poor internet connections should be able to watch your video.
Of course, the other alternative is to use progressive-download videos, where the viewer can simply click play, then immediately click pause, wait for the video to buffer, and then play it after it's fully downloaded.
Check out the Videojs Resolution Switcher demo here.
-- Ravi Jayagopal
What is required to use a SMIL file for adaptive streaming in a video.js player? I have created the SMIL file in my Wowza application, and it is creating my four separate streams and making them available. However, I cannot get my webpage, which uses video.js, to play the SMIL file correctly. Hints on that code, or where to find the correct documentation, would be greatly appreciated.
There aren't many implementations of SMIL players. I'm sure I've seen Wowza URLs that suggest it will output the SMIL as other formats, something like whatever.smil/manifest.m3u8. That's HLS, which can be played natively on mobile and in Safari, and with videojs-contrib-hls elsewhere.
I know the question is old, but I've been struggling with this recently, so I want to share my experience in case anyone is interested. My scenario is very similar: I want to deliver adaptive bitrate streaming from Wowza to clients using video.js.
There is a master link that explains how to set up and run the Wowza Transcoder for live streaming, and how to set up your adaptive bitrate streams using a SMIL file. Following the video there, you can get a stream that uses ABS, but the SMIL file is tied to the stream name, so it is not a solution if you have streams that come to Wowza from another media-server origin and need to be transcoded before being served to the clients. The article mentions a few key things (like Stream Name Groups), but somehow things didn't seem very clear, at least to me. So here is some clarification based on what I understood from all the articles I read and what I did to achieve ABS:
You can achieve ABS in Wowza either with SMIL files or with Stream Name Groups (NGRP). An NGRP refers to a block of streams defined in the Transcoder template that can be played back using multi-bitrate streaming dynamically (this is what I used). SMIL files are used to create a "static" list of streams for multi-bitrate VOD streaming. If you are using Wowza origin-edge delivery you'll need the .smil file, because NGRPs do not get forwarded to the edge. (Source for all this information: here.)
If you need the SMIL file, you probably have to generate a new one for every stream, and you probably want to do that in an automated way, so the best approach is via an HTTP request (the link above explains how to achieve this).
If you can live with NGRPs, things are a bit easier:
You need to enable the Wowza Transcoder (this is pretty easy; the steps are in the video I mentioned above).
You should create your own Transcoder template with the different stream presets you want to deliver; as an example, you can check the default ones that are already there. The more presets you add, the more work Wowza needs to do whenever a stream comes in, since it will generate a new stream for every preset you have defined.
Now is when we generate the NGRPs. In your Transcoder template, you can define as many NGRPs as you want (to clarify: these are like groups of streams that you will be able to set in your clients' video player; each NGRP contains the streams the player will be able to use when doing adaptive bitrate streaming). For instance, the default template already defines NGRPs such as "_all" and "_mobile".
If you play the NGRP "_mobile" in the client's video player, the ABS algorithm in the player will be able to adapt itself to play either the 240p or the 160p stream based on the client's capabilities.
So imagine you have these two NGRPs. In order to play them in video.js, you will need to set the source to:
http://[wowza-ip-address]:1935/<name-of-your-application>/ngrp:myStream_all/playlist.m3u8
or
http://[wowza-ip-address]:1935/<name-of-your-application>/ngrp:myStream_mobile/playlist.m3u8
... based on how many options you want to provide to the client player to use for the ABS. (For instance: if your targets are old mobile devices, you probably just want to offer a couple of low bitrate streams).
(This is for the case where you're delivering an HLS stream. For other formats the path changes; for instance, for a DASH stream you would have "/manifest.mpd" instead of "/playlist.m3u8".)
That is all. There is also a very helpful link in the video.js documentation explaining how it does the bitrate switching: here.
I hope it helps someone! At least clarifying things! :)
While looking for a video streaming server with adaptive bit rate over HTTP, I came across some proprietary servers/implementations, namely Adobe dynamic streaming for Flash, Apple HTTP adaptive streaming, and a similar one from Microsoft.
What I am looking for is ABR streaming from an Apache web server. I found out that MPEG-DASH is the standard for this, and it looks like Apache supports it, but I am not able to get started with it.
Can someone point me to an example or steps to achieve this?
Also, I understand that such streaming requires a bunch of video files acting as segments of the source video at different bit rates, plus some metadata file.
I am not able to understand how I can provide this to Apache to make it stream to the client (browser).
Appreciate help or directions on this.
Thanks.
Using MPEG-DASH, streaming becomes very simple. The video is stored at different quality levels (in terms of bitrate, resolution, etc.) on any HTTP server, each divided into segments a few seconds long. The (intelligent) client application requests the segment for a specific time and quality (depending on the current network capacity) via standard HTTP GET requests. So you can use your "standard" Apache or any other web server.
To get started, I suggest getting some DASH content, either from DASHIF, or generating content on your own the easy way, using a transcoding platform like bitcodin.
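If you prefer to generate the content yourself, ffmpeg's dash muxer is one way to produce the .mpd manifest plus segment files that Apache can then serve as plain static files. This is only a sketch with a single representation; the input file, bit rate, and output directory are placeholders:

```python
# Sketch: generate a DASH manifest (.mpd) plus segment files with ffmpeg's dash
# muxer, then let Apache serve the output directory as ordinary static files.
# Input file, bit rate, and output directory are placeholders; a single
# representation is used just to keep the example short.
import os
import subprocess

os.makedirs('dash', exist_ok=True)
subprocess.run([
    'ffmpeg', '-i', 'input.mp4',
    '-c:v', 'libx264', '-b:v', '1500k',
    '-c:a', 'aac',
    '-f', 'dash',
    'dash/manifest.mpd',
], check=True)

# Point a browser-side player (e.g. dash.js) at /dash/manifest.mpd, and make sure
# Apache serves .mpd files with the application/dash+xml MIME type.
```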
We know that the Dropbox desktop clients use a binary diff algorithm to break all files into blocks, and only upload blocks that aren't already in the cloud (https://serverfault.com/questions/52861/how-does-dropbox-version-upload-large-files).
Nevertheless, as far as I can see, the Dropbox API can only upload the whole file (/files_put, /files (POST)) when a sync is needed.
Is there any way to do differential/incremental syncing using the Dropbox API, i.e. upload only the changed portion of the file like the desktop clients do?
If this is not possible, then what are the best practices for periodically syncing large files that have small changes using the Dropbox API?
Unfortunately this isn't possible, and I suspect it may never be available.
After doing a bit of research, I found a feature request for delta-syncing to be integrated into the API[*]. Dropbox hasn't responded, nor has the community upvoted this request.
I would make an educated guess that the reason Dropbox hasn't provided this functionality, and likely never will, is that it's a dangerous feature in the hands of unknown developers.
Consider the case where you write an application that uses such a delta-change update system for updating large files. You thoroughly test your app and publish it to an app store. A couple of weeks after your initial release, and numerous downloads, you start receiving bad reviews and complaints because you managed to miss a very specific test case.
In this specific, buggy case you've miscalculated a differential offset by one byte. Oh no! You've now corrupted thousands of files for hundreds of users!
Considering such a possibility, I think I would personally request that Dropbox NEVER provide such a dev feature. If they integrated such a function into the API, they would be breaking their #1 purpose: to provide consistent, safe, and reliable cloud backups of your important files.
[*]: This was the original reference location, but it is now a dead link.
(https://www.dropbox.com/votebox/1515/delta-sync-api-for-mobile-applications)