How to change audio playback rate from the browser console?

I change the playback rate of videos all the time. Whenever I watch YouTube or BBC videos I'm able to speed them up:
document.getElementsByTagName('video')[0].playbackRate = 2.5
It makes learning and going through content a lot better.
I've been trying to do the same with audio and I can't.
I know there might be podcast apps that provide up to 3x speed, but (a) I've only found 2x, (b) not every audio file is a podcast, and most importantly (c) I want to know how to do it myself.
For example Making Sense Podcast

The same thing works with audio, but there's no guarantee that there is an <audio> tag on the page for you to reference.
The HTMLAudioElement might be instantiated with script:
const audio = new Audio('https://example.com/podcast.webm');
If that's the case, document.getElementsByTagName() has nothing to find.
You will need to modify the script doing the playing.
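If an <audio> tag does turn out to be present, the original one-liner works unchanged:
document.getElementsByTagName('audio')[0].playbackRate = 2.5
When the element only exists in script, one workaround is to intercept playback before it starts. A minimal sketch, pasted into the console before hitting play (hooking HTMLMediaElement.prototype.play is a generic technique, not anything page-specific, and the 2.5 rate is just an example value):
const originalPlay = HTMLMediaElement.prototype.play;
HTMLMediaElement.prototype.play = function (...args) {
  // Every audio/video element goes through play(), attached to the page or not,
  // so set the rate here before delegating to the real method.
  this.playbackRate = 2.5;
  return originalPlay.apply(this, args);
};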

Related

Can I sync multiple live radio streams with video?

Is it possible to sync multiple live radio streams to a pre-recorded video simultaneously and vary the volume at defined time indexes throughout? Ultimately this is for an embedded video player.
If so, what tools/programming languages would be best suited for doing this?
I've looked at GStreamer, WebChimera and ffmpeg but am unsure which route to go down.
This can be done with WebChimera, as it is open source and extremely flexible.
The best implementation would be in QML, modifying the .qml files of WebChimera Player directly with any text editor.
The second best would be in JavaScript, with the Player JS API.
The main difference between these two methods is resource consumption.
The second method, using only JavaScript, would require adding one <object> tag for the video and one more for each audio file you need to play. So for every media source you add to the page, you will need to spin up a new instance of the plugin.
The first method, made mostly in QML (knowing JavaScript is needed here too, as it handles the logic behind QML), would load all your media sources in one plugin instance, with multiple VlcVideoSurface components, each with its own Plugin QML API.
The biggest problem I can foresee for what you want to do is the buffering state, as all media sources need to be paused as soon as one video/audio starts buffering. Synchronizing them by time should not be too difficult, though.
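To make that sync logic concrete, here is a hedged sketch using plain HTML5 media elements rather than WebChimera's actual API (the cue list and its time indexes are hypothetical):
const media = Array.from(document.querySelectorAll('video, audio'));
for (const m of media) {
  // If any stream stalls, pause all of them; resume together once it recovers.
  m.addEventListener('waiting', () => media.forEach(x => x.pause()));
  m.addEventListener('canplay', () => media.forEach(x => x.play()));
}
// Vary volumes at defined time indexes, driven by the video's clock.
const cues = [{ t: 10, volumes: [1.0, 0.2] }, { t: 30, volumes: [0.2, 1.0] }];
media[0].addEventListener('timeupdate', () => {
  const cue = cues.filter(c => c.t <= media[0].currentTime).pop();
  if (cue) cue.volumes.forEach((v, i) => { if (media[i]) media[i].volume = v; });
});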
The WebChimera Wiki is a great place to start; it has lots of demos and examples. And at WebChimera Questions we've helped developers modify WebChimera Player to suit even the craziest of needs. :)

Embedded ASS subtitle track in streamed video?

We'd like to build a small specialized clone of the ill-fated popcorn-time project, that is to say a node-webkit frontend for peerflix. The videos we'd like to play are mkv files that have embedded ASS subtitle tracks, and we can't seem to get the embedded subtitles to show up: while VLC shows them nicely, HTML5 video players in WebKit-based things don't, not even in Google Chrome (so it's not a matter of Chromium's reduced codec support).
Now, I'm a bit out of my depth here, and I don't really know much about these things, but it seems to me the media engine underneath WebKit just ignores the ASS subtitle track. Is it because it's ASS? Is it a matter of codecs somehow? Or is it, after all, an HTML5 thing? The HTML5 video "living standard" mentions that "captions can be provided, either embedded in the video stream or as external files using the track element", so the feature is at least planned, but I do realize that implementation is lacking. However, given that node-webkit uses ffmpeg as the underlying engine, it seems strange to me that the subtitles are not picked up at all.
Could someone more knowledgeable please advise us on the problem? Also, is there anything we could do about it?
Extracting the subtitles beforehand is not an option, though I have been playing with the idea of extracting the subtitles on the fly, and feeding that stream back to the player - I had some modest success with this, and it looks like it could be done with some effort, but I'm really out of my depth here, and the whole idea is pretty contrived anyway.
However, I find it improbable that nobody has run into this problem before, hence this question: is there any way to show embedded (ASS) subtitle tracks in a streamed video in node-webkit?
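For what it's worth, the on-the-fly extraction mentioned above could look roughly like this; a sketch only, assuming ffmpeg is on the PATH of the node-webkit process (the file names are hypothetical, and ASS styling is lost in the conversion to WebVTT):
const { execFile } = require('child_process');
// Pull the first embedded subtitle stream out of the mkv and convert it to
// WebVTT, which the HTML5 <track> element understands.
execFile('ffmpeg', ['-i', 'movie.mkv', '-map', '0:s:0', 'subs.vtt'], (err) => {
  if (err) throw err;
  const track = document.createElement('track');
  track.kind = 'subtitles';
  track.src = 'subs.vtt';
  track.default = true;
  document.querySelector('video').appendChild(track);
});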
Not sure if this would help, but according to this page, node-webkit doesn't ship with codecs for patented media formats. They do have a few suggestions on that page, one of which is to compile your own node-webkit.
You could try using Popcorn Time's ffmpegsumo file, which is what I used when I needed mp3 support and Chrome's version didn't work. I don't know whether it supports the ASS subtitle format, though (considering its use, I would think it has to).
Note: I would have commented on this answer, but unfortunately I don't have commenting privileges yet. A couple of upvotes sure would be nice ;)

iOS Localization of large files like movies

I am creating a new iOS project where I have to include a video in my app. We need to access this video in offline mode as well, so I need to include it as a video file in the project.
The question - what is the best practice for localizing such a large file? The app will be in at least 7 languages, and I can't decide: include the video in 7 languages, which would dramatically increase the size of the app, or include it only in English and localize everything else? Perhaps someone knows: if my phone's language is, for example, Spanish and I download a localizable app, does the download include the videos for all languages or only the one for my selected language?
Any answer will be appreciated, and thank you in advance.
You could do something nifty by bundling the video and audio separately. You could then play the correct audio track for each localization. The same could be done with subtitles.
The most optimized way would be to not bundle any video in the application at all, and just let users download the video for the current localization from within the app. That's what I think.
A single video file can hold multiple audio streams, so you can simply create one video with an audio stream per language. Audio is generally small compared to video, so it should not hurt as much. The user can then select the audio language (or the program can preselect one based on some preference) to play it out.
Almost all container formats can contain more than one audio stream. (Almost all; definitely mp4, which is what I guess you are using.)
EDIT: You could possibly also ship two files: one with the half of the languages that are most common, and one with the less common half, to reduce some of the size.
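To illustrate the multi-audio-stream approach, one ffmpeg invocation along these lines could produce such a file (a sketch; the file names and language tags are hypothetical):
ffmpeg -i video.mp4 -i audio_en.m4a -i audio_es.m4a -map 0:v -map 1:a -map 2:a -c copy -metadata:s:a:0 language=eng -metadata:s:a:1 language=spa movie_multilang.mp4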

Is there a video on writing to a plist in iOS?

I'm not saying I'm lazy (I am), but I want to read what is in the plist and then, when the user does something, append that info to the existing plist and write it back, so that the info is not lost.
Now that's the type of thing I would think there'd be a video on.
Thus, my question is twofold:
a) Where's a good video (iTunes, Stanford Univ, iOS App Programming Guide) on writing to a plist using file handlers? (gulp)
b) How can I do a search and restrict the media to videos (so I can find videos on these subjects in the future)?
I'm especially looking for how to handle the notorious first time, and how to make sure I don't overwrite good data by wrongly thinking it's empty.
There are plenty of such tutorials on YouTube. Have a look at this one:
iPhone Programming - Reading/Writing .plist Files

Providing data as needed for QTMovie

I understand that if I wanted to provide a QTMovie with data from an arbitrary source as it is needed I probably have to deal with the QTDataReference class, but unfortunately my experience with anything similar is limited to an audio data callback with Audio Buffers on the iPhone.
If I initialized the QTDataReference with some NSMutableData, how would I know when it needs more data and furthermore, how would I "clear" already played movie data and provide it again when the user seeks back (I want them to be able to)?
Basically the movie data I want to provide would in the end come from a set of files (which are really just one movie file split up), which become available sequentially during playback. This part is crucial.
Anybody who gets me going in the right direction can get beta access to the first Mac OS X Usenet movie streamer ;)
You likely can't do this using QTKit. Start with the QuickTime Overview and go from there.
The first idea that occurs to me is creating a QuickTime movie that directs QuickTime to look to the files you expect to download (or even directly to their remote URLs) for its media, then siccing QuickTime Player/QTMovieView on it and hoping that QuickTime's support for progressive download patches over any rough spots.