Track Flash Player error from backend - html5-video

I have a few live streams that my video player (JWPlayer) plays. I want a mechanism that automatically loads a live stream to test whether it loads in JWPlayer or not - this needs to happen on the backend, on the server side - preferably on a Unix-flavoured machine.
For example, the live stream URL may change, or there may be a cross-domain error. Ultimately, if this happens, I want to remove the live stream from my database automatically.
Is it possible to do this automatically? Note that an m3u8 URL may play in QuickTime but not in Flash because of m3u8 errors.
I would also like a similar tracking mechanism linked to an HTML5 player (one that supports live m3u8 streams) - say QuickTime (or maybe ffplay?).
Is this possible? If so, how?
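To illustrate the kind of check I have in mind, something like the following run from a cron job might do it on the server side (a rough sketch; it assumes Node.js and FFmpeg's ffprobe are installed, the URL and timeout are placeholders, and it would not reproduce Flash-specific or cross-domain failures):
// Rough sketch: probe an m3u8 URL with ffprobe and flag it as dead on failure.
const { execFile } = require('child_process');
function checkStream(url, callback) {
  execFile(
    'ffprobe',
    ['-v', 'error', '-show_streams', url],
    { timeout: 15000 },            // give up after 15 seconds
    (err) => callback(!err)        // err is set if the stream could not be opened
  );
}
checkStream('https://example.com/live/stream.m3u8', (isAlive) => {
  if (!isAlive) {
    // here I would mark/remove the stream in my database
    console.log('stream appears to be down');
  }
});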
Thanks a lot!

Related

How can I use Adaptive streaming for VODs in Ant Media Server?

I'm using Ant Media Server for streaming. My use case requires me to record the Live Streams as VODs so the users can access the content later as well.
Like the live streams, I want to apply adaptive settings to the VODs as well, so that users get a resolution suited to their network.
I can't find any built-in solution for this yet. Can you please suggest how I can do this?
I'm using S3 to store the recordings.
Thanks.
Thank you for the question. As far as I understand, the live streams are recorded as VoD files.
I think the most efficient way is to do this through HLS. This way, the VoD files are recorded as HLS and multiple bitrates are available. There is no need to transcode again, and the recording can be played directly. Let me explain this solution step by step.
Set the HLS playlist type to event and settings.deleteHLSFilesOnEnded to false. Edit red5-web.properties for your application and set the following settings:
settings.hlsPlayListType=event
settings.deleteHLSFilesOnEnded=false
Restart the server
sudo service antmedia restart
Add adaptive bitrates on the web panel.
Start live streaming and let Ant Media Server create HLS (m3u8 and ts) files for each bitrate.
Stop live streaming.
Then you can watch the stream by pointing the player at the master m3u8 file, which is {STREAM_ID}_adaptive.m3u8. It can even be played directly by the embedded player, even when the stream is no longer live.
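If you prefer to embed it with your own player rather than the built-in embedded player, an hls.js snippet along these lines would work (a sketch; the server address, application name and stream id are placeholder assumptions):
// Minimal sketch: play the adaptive master playlist with hls.js.
var video = document.getElementById('video');
var url = 'https://YOUR_SERVER/LiveApp/streams/STREAM_ID_adaptive.m3u8'; // placeholder
if (Hls.isSupported()) {
  var hls = new Hls();
  hls.loadSource(url);
  hls.attachMedia(video);
} else if (video.canPlayType('application/vnd.apple.mpegurl')) {
  video.src = url; // Safari plays HLS natively
}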
For more information, take a look at this wiki about HLS Playing
Please let me know if this approach helps you.

Does video.js support MPEG2-TS/UDP streams?

I am just starting to play around with video.js and really like it. I currently have some code where I have two players showing two different HLS streams in a single browser page.
However, HLS inherently has high latency, and that may not work for my project. So I am wondering if video.js can receive and play MPEG2-TS/UDP streams, which would have less latency (I can easily change the format of all of my source video streams).
My basic requirement is to have 2 players in a single browser page: one player showing the video stream sent from a particular network node, and the second showing how a different network node received that same stream. So the two video.js players on the browser page show 2 video streams that are actually the same video, and therefore highly correlated. This is why latency is a critical requirement for this project.
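For reference, my current two-player setup is roughly the following (a simplified sketch; the element ids and stream URLs are placeholders):
// Two video.js players on one page, each pointed at a different HLS stream.
// The ids exist in the HTML as <video id="player1" class="video-js"> etc.
var player1 = videojs('player1');
player1.src({ src: 'https://node-a.example.com/live/stream.m3u8', type: 'application/x-mpegURL' });
var player2 = videojs('player2');
player2.src({ src: 'https://node-b.example.com/live/stream.m3u8', type: 'application/x-mpegURL' });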
Thanks,
-Andres

Parsing HLS manifest of live stream in Safari to retrieve time-based metadata

I am using the native Safari player implementation to stream video with the HLS streaming protocol.
My goal is to get time-based metadata (such as EXT-X-DATERANGE) from a live stream manifest.
As far as I know, it is not possible to retrieve this data because the streaming logic is fully controlled by the Safari player, which does not expose it.
For now, I have come up with 2 possible solutions:
Manually download the manifests and parse out the EXT-X-DATERANGE tag (see the sketch after this list). But with this approach the download timer has to be managed manually too, and, of course, the number of requests for the playlists will increase.
Desktop Safari supports MSE. This means it is possible to have full control over manifest retrieval and parsing. There are awesome libraries that already provide this functionality, such as shaka-player or hls.js. It is possible to implement a custom response filter for segments (shaka-player) or listen to the Hls.Events.FRAG_CHANGED event (hls.js) in order to get access to the playlist. The problem is that Safari on iOS still does not support MSE, so this solution cannot be applied on mobile.
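For completeness, the first option might look roughly like this (a sketch; the playlist URL and the polling interval are placeholders, and a real implementation should derive the interval from EXT-X-TARGETDURATION):
// Sketch of option 1: poll the media playlist and pull out EXT-X-DATERANGE lines.
async function fetchDateRanges(playlistUrl) {
  const res = await fetch(playlistUrl);
  const text = await res.text();
  return text.split('\n').filter(line => line.startsWith('#EXT-X-DATERANGE'));
}
setInterval(async () => {
  const tags = await fetchDateRanges('https://example.com/live/stream.m3u8'); // placeholder URL
  tags.forEach(tag => console.log('date range:', tag));
}, 6000);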
Are there any other ways to retrieve time-based metadata (such as EXT-X-DATERANGE) using native Safari player implementation?
Thanks a lot in advance!

SMIL adaptive streaming in Videojs

What is required to use a SMIL file for adaptive streaming in a video.js player? I have created the SMIL file in my Wowza application, and it is creating my 4 separate streams and making them available. However, I cannot get my webpage, which uses video.js, to correctly play the SMIL file. Hints on that coding, or where to find the correct documentation, would be greatly appreciated.
There aren't many implementations of SMIL players. I'm sure I've seen Wowza URLs that suggest it will output the SMIL as other formats, something like whatever.smil/manifest.m3u8. That's HLS, which can be played natively on mobile and Safari, and with videojs-contrib-hls elsewhere.
I know the question is old, but I've been struggling with this recently, so I want to share my experience in case anyone is interested. My scenario is very similar: I want to deliver adaptive bitrate streaming from Wowza to clients using video.js.
There is a master link that explains how to set up and run the Wowza Transcoder for live streaming, and how to set up your adaptive bitrate streams using an SMIL file. By following the video there you can get a stream that uses ABS, but the SMIL file is attached to the stream name, so it is not a solution if you have streams that come to Wowza from another media server origin and need to be transcoded before being served to the clients. A few key things are mentioned in the article (like Stream Name Groups), but somehow things didn't seem entirely clear, at least to me. So here is some clarification based on what I understood from all the articles I read and what I did to achieve ABS:
You can achieve ABS in Wowza either with SMIL files or with Stream Name Groups (NGRP). An NGRP refers to a block of streams defined in the Transcoder template that can be played back using multi-bitrate streaming (dynamically) (<- this is what I used). SMIL files are used to create a "static" list of streams for multi-bitrate VOD streaming. If you are using Wowza origin-edge delivery you'll need the .smil file, because NGRPs do not get forwarded to the edge. (Source for all this information: here.)
In case you need the SMIL file, you probably need to generate a new one for every stream, and you probably want to do that in an automated way, so the best way would be through an HTTP request (the link above explains how to achieve this).
In case you can live with NGRP, things are a bit easier:
You need to enable Wowza Transcoder (this is pretty easy and steps are in the video I mention above).
You should create your own Transcoder Template with the different stream presets you want to deliver; as an example you can check the default ones that are already there. The more presets you add, the more work Wowza will need to do whenever a stream comes in, since it will need to generate a new stream for every preset that you have defined.
Now is when we generate the NGRPs. In your Transcoder Template, you can generate as many NGRPs as you want (to clarify: these are like groups of streams that you will be able to set in your client's video player; each NGRP contains the streams the player will be able to use when doing the adaptive bitrate streaming). For instance, the default template includes NGRPs such as _all and _mobile.
If you play the NGRP "_mobile" in the client's video player, the ABS algorithm in the player will be able to adapt itself to play either the 240p or the 160p stream based on the client's capabilities.
So imagine you have these two NGRPs. In order to play them in video.js, you will need to set the source to:
http://[wowza-ip-address]:1935/<name-of-your-application>/ngrp:myStream_all/playlist.m3u8
or
http://[wowza-ip-address]:1935/<name-of-your-application>/ngrp:myStream_mobile/playlist.m3u8
... based on how many options you want to provide to the client player to use for the ABS. (For instance: if your targets are old mobile devices, you probably just want to offer a couple of low bitrate streams).
(This applies if you're delivering an HLS stream. For other formats the path changes; for instance, if you are delivering a DASH stream you would have "/manifest.mpd" instead of "/playlist.m3u8".)
That is all. There is also a very helpful link in the video.js documentation explaining how it does the bitrate switching: here.
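Wiring one of those URLs into video.js would look roughly like this (a sketch reusing the placeholder URL above):
// Point video.js at the NGRP HLS playlist; the ABS logic then lives in the player.
var player = videojs('my-player');
player.src({
  src: 'http://[wowza-ip-address]:1935/<name-of-your-application>/ngrp:myStream_all/playlist.m3u8',
  type: 'application/x-mpegURL'
});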
I hope it helps someone! At least clarifying things! :)

Streaming music on your website through custom player / application (iTunes)

I was doing some research to find ways that would allow me to stream music on my website legally. I came across the iTunes partner program, which allows streaming music on a website through their embedded players. I was wondering: is it possible to stream iTunes music through your own custom player? If that is not possible via iTunes, then what other methods are available?
You could do this with server software like Icecast; there are some good tutorials on setting this up here: http://www.icecast.org/docs.php
Depending on how many browsers you want to support, you might want to set up two streams: one in MP3/OGG and a "backup" stream in Flash. Then add some detection of what the browser supports and present the correct stream (i.e. use the HTML5 <audio> tag for playing MP3/OGG in browsers that support it, and use your Flash stream for the rest), as sketched below.
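A rough sketch of that detection (the Icecast mount URLs are placeholders, and the Flash fallback is only indicated by a comment):
// Pick a stream the browser can actually decode; fall back to Flash otherwise.
var audio = document.createElement('audio');
var src = '';
if (audio.canPlayType('audio/ogg; codecs="vorbis"')) {
  src = 'http://radio.example.com:8000/stream.ogg';
} else if (audio.canPlayType('audio/mpeg')) {
  src = 'http://radio.example.com:8000/stream.mp3';
}
if (src) {
  audio.src = src;
  document.body.appendChild(audio);
  audio.play();
} else {
  // no HTML5 audio support: embed the Flash-based backup player here instead
}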
Their program allowing playback of music from the iTunes Store is likely only for those with the intention to sell music; without providing a commerce business, you'd be breaking their partner program T&Cs.