On the HTML5 Video Player, how can I tell the type of the source being played? - html5-video

I'm providing DASH, HLS, and MP4 (as a fallback) as sources to an HTML5 Video player element. I would like to log the source types that my users end up using, but I don't know where to get that information from.
What's the best way for me to get this source type? I expected to be able to at least parse the video element's currentSrc property to figure this out, but there's nothing in the value that indicates whether it's dash/hls/mp4.
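For illustration, a minimal sketch of one possible approach, assuming the browser is choosing natively among plain <source> children (i.e. no MSE library swapping in a blob: URL): match currentSrc against the source elements and read their type attribute.

```js
// Sketch: infer the chosen source type by matching currentSrc against the
// <source> children. This assumes native <source> selection, not MSE playback.
const video = document.querySelector('video');

video.addEventListener('loadedmetadata', () => {
  const sources = video.querySelectorAll('source');
  let chosenType = 'unknown';
  for (const source of sources) {
    // source.src is resolved to an absolute URL, just like currentSrc.
    if (source.src === video.currentSrc) {
      chosenType = source.type || 'unspecified';
      break;
    }
  }
  console.log('Playing source type:', chosenType);
});
```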

Related

When does HTML5 video fetch and decode its video?

I load 5 <video autoplay> elements on my page. On some older computers, this can take a lot of CPU.
At what point does a video element fetch its source video, and decode it? Is it only when the video begins to play?
I'm not seeing any data in the "Network" tab in Chrome dev tools, so I'm not sure how to tell when the data is being fetched.
If autoplay is specified, the video automatically begins to play back as soon as it can do so without stopping to finish loading the data; autoplay has nothing to do with when the data is fetched.
You should use the preload attribute to specify the loading behaviour explicitly and remove any ambiguity.
This enumerated attribute is intended to provide a hint to the browser about what the author thinks will lead to the best user experience. It may have one of the following values:
- none: indicates that the video should not be preloaded.
- metadata: indicates that only video metadata (e.g. length) is fetched.
- auto: indicates that the whole video file could be downloaded, even if the user is not expected to use it.
- the empty string: synonym of the auto value.
Note: If not set, its default value is browser-defined (i.e. each browser may have its default value). The spec advises it to be set to metadata.
Note: Some versions of Chrome only acknowledge autostart, rather than autoplay
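A minimal sketch of how you can observe when fetching actually happens (the file name is a placeholder):

```js
// Sketch: watch the standard media events to see when the browser fetches data.
// preload="metadata" hints that only headers/metadata should be fetched before
// playback is requested.
const clip = document.createElement('video');
clip.preload = 'metadata';
clip.src = 'movie.mp4'; // placeholder URL
document.body.appendChild(clip);

// Fires when the browser begins loading the resource at all.
clip.addEventListener('loadstart', () => console.log('fetch started'));

// Fires once duration/dimensions are known (metadata fetched).
clip.addEventListener('loadedmetadata', () => console.log('metadata loaded'));

// Fires when enough data has been buffered and decoded to start playback.
clip.addEventListener('canplay', () => console.log('can start playing'));
```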

HTML5 video toggle audio tracks

Given a video file with multiple audio tracks, I'm looking for a way to toggle the audio stream that is played during playback, so I can save space and add multiple translations to the same video file. I read about the video.audioTracks property; however, according to this page it's only available in IE11+ and Safari browsers as of now. From what I've found so far, it might be possible to store the video without any audio tracks whatsoever and play the audio tracks separately (which would most likely lead to playback drifting out of sync). Are there any better ways of achieving this behaviour?
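For reference, in browsers that do expose it, toggling during playback with the audioTracks API looks roughly like this (a sketch; it only works where AudioTrackList is implemented):

```js
// Sketch: switch between embedded audio tracks via the AudioTrackList API
// (only available where HTMLMediaElement.audioTracks is implemented,
// e.g. IE11+/Safari at the time of writing).
const video = document.querySelector('video');

function enableAudioTrack(languageCode) {
  const tracks = video.audioTracks;
  if (!tracks) return; // API not supported in this browser
  for (let i = 0; i < tracks.length; i++) {
    // Exactly one track should be enabled at a time.
    tracks[i].enabled = (tracks[i].language === languageCode);
  }
}

// Example: switch to the German dub mid-playback.
enableAudioTrack('de');
```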

Projekktor: supporting multiple video sizes

I am using Projekktor to display video, but if someone is using, say, an iPhone I want to send out a smaller video than the full 1080p that might be sent to a browser.
Is there a built-in way to do this, or do I need to do a user-agent check and create a playlist based on the device manually?
You can configure Projekktor to fetch a specific video file depending on the dimensions of the video display.
To do so you need to provide multiple video files with different resolutions for each format you want to deliver and set a "quality" property on each of them.
To alter the dimensions-to-quality mapping you have to set the "playbackQualities" config option.
The whole logic is described in detail over here.
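As a rough illustration only (the exact option names and shapes below are assumptions and should be checked against the Projekktor documentation), the idea is to tag each file with a quality key and map display dimensions to those keys via playbackQualities:

```js
// Rough sketch of the idea described above; treat the exact schema as an
// assumption and verify it against the Projekktor docs. File names, quality
// keys and breakpoints are illustrative.
projekktor('#player', {
  // One entry per quality level for each format you deliver.
  playlist: [{
    0: { src: 'movie-1080.mp4', type: 'video/mp4', quality: 'high' },
    1: { src: 'movie-480.mp4',  type: 'video/mp4', quality: 'small' }
  }],
  // Map player/display dimensions to the quality keys above.
  playbackQualities: [
    { key: 'small', minHeight: 240 },
    { key: 'high',  minHeight: 720 }
  ]
});
```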

Can HTML5 track element be used for *live* subtitles?

I am planning to build a system to broadcast public events (trials, meetings, conferences).
A key request will be the insertion of live subtitles to the A/V stream.
The subtitles will be "live" since they will be produced by an operator while the event is happening.
I suppose the HTML5 "track" element is not yet implemented by any of the major browsers, but: can I expect to eventually use it for live subtitles? Will I be able to inject the subtitles into the page while the stream is playing?
Please look at the following links. Looking at them, I'm inclined to believe it should be possible, as they use JavaScript to show subtitles:
http://www.storiesinflight.com/js_videosub/
http://cuepoint.org/
You may also consider http://mozillapopcorn.org/, which is designed to show content timed to the video. So technically you could use it together with Ajax to show/stream subtitles.
There are HTML5 video JS libs that support subtitles (e.g. VideoJS supports the .srt format; there are several other easily Google-able options), but to the best of my knowledge none of them support streaming subtitles.
I think you may have to build your own solution for this. If I were to do it, I'd probably try doing something with Socket.IO's broadcast functionality that can push data out to all connected clients at once, and have your client-side JS listen for new subtitle events and render them on screen as they come in. You can use plain ol' CSS to overlay the text over the HTML5 video.
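A client-side sketch of that idea (the 'subtitle' event name and the #subtitle-overlay element are made up for illustration; the server would simply io.emit a caption line whenever the operator submits one):

```js
// Sketch: listen for subtitle lines pushed by the server and overlay them
// on the video with a plain CSS-positioned element.
const socket = io(); // assumes the Socket.IO client script is loaded
const overlay = document.getElementById('subtitle-overlay');
let clearTimer;

socket.on('subtitle', (text) => {
  overlay.textContent = text;
  // Clear the caption after a few seconds so stale lines don't linger.
  clearTimeout(clearTimer);
  clearTimer = setTimeout(() => { overlay.textContent = ''; }, 5000);
});
```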

In html5 video tag is it possible to play media resources listed in source tags sequentially?

The problem is, my videos are hosted on a video site, and longer videos are divided into several parts. I want to play these clips in an HTML5 video tag on my own site. Is it possible to play the media resources listed in the source tags sequentially?
I googled my way to the W3C HTML5 video spec page and found an attribute named startOffsetTime. I tried assigning a value for this attribute to each media resource as suggested in the spec, but it still doesn't work.
You can load the next clip when the video's ended event (the onended handler) is triggered.
Check out markcial's answer in another thread.
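A sketch of that approach, assuming you have the part URLs in an array (the clip names below are placeholders): swap the src when the ended event fires.

```js
// Sketch: play a list of clips back to back by loading the next source
// whenever the 'ended' event fires.
const parts = ['part1.mp4', 'part2.mp4', 'part3.mp4'];
const video = document.querySelector('video');
let current = 0;

video.src = parts[current];
video.play();

video.addEventListener('ended', () => {
  current += 1;
  if (current < parts.length) {
    video.src = parts[current];
    video.play();
  }
});
```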