YouTube Live API: event tagging, video clipping, pre-roll ads

I'd like to use YouTube Live to stream a live video embedded in my website. I'd also like to be able to use the API to tag events within that video based on the time, then auto-generate clips x seconds before and y seconds after the timestamp to show that event only. I'd also like to publish my own pre-roll advertisements on both the live stream and the clips (and be able to play an ad during the stream at appropriate times). Is any/all of this possible?
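For the tagging half, a minimal sketch of the client side, using getCurrentTime() from the YouTube IFrame Player API. The /api/tags endpoint, the window sizes, and the videoId are all hypothetical, and as far as I know the YouTube API does not auto-generate clips, so turning the computed (clipStart, clipEnd) window into an actual clip would have to happen in your own backend or player logic:

```typescript
// Client-side sketch: tag events against the live player's clock using
// getCurrentTime() from the YouTube IFrame Player API. The /api/tags
// endpoint, PRE/POST window sizes, and videoId are hypothetical.
declare const YT: any; // global injected by https://www.youtube.com/iframe_api

interface EventTag {
  label: string;
  timestamp: number; // seconds into playback when the event happened
  clipStart: number; // timestamp - x seconds of lead-in
  clipEnd: number;   // timestamp + y seconds of follow-through
}

const PRE_SECONDS = 10;  // x
const POST_SECONDS = 20; // y

let player: any;

// The IFrame API calls this global once it has loaded.
(window as any).onYouTubeIframeAPIReady = () => {
  player = new YT.Player('player', { videoId: 'LIVE_VIDEO_ID' /* placeholder */ });
};

// Call when an event happens (e.g. an operator clicks a "tag" button).
function tagEvent(label: string): EventTag {
  const t = player.getCurrentTime();
  const tag: EventTag = {
    label,
    timestamp: t,
    clipStart: Math.max(0, t - PRE_SECONDS),
    clipEnd: t + POST_SECONDS,
  };
  // Hypothetical backend that stores tags for later clip playback.
  fetch('/api/tags', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(tag),
  });
  return tag;
}
```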

Related

YouTube live video embedding not allowed

I want to embed a YouTube live video inside a webpage. Some years ago this worked without any problem. Over the weekend I created a new YouTube account for a sports club, and livestreaming was activated 24 hours after verification. I have now set "allow embedding" for the livestream, but every time I create an event with OBS and stream to it, the "allow embedding" option is disabled. An older installation for another sports club has this option enabled every time I stream.
Once the livestream has started, it is possible to change the option via YouTube Studio, but it should be the default.
Any ideas?
The moment you start the live stream, "allow embedding" is switched off. If you switch it back on via the YouTube website immediately after going live (during your intro or waiting video), it works fine. But it is indeed an irritating bug.
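If you want to automate that manual switch, a hedged sketch using the YouTube Live Streaming API's liveBroadcasts.update with contentDetails.enableEmbed, assuming an already-authorized OAuth2 client. This only patches the setting after the fact; whether the default can be changed before going live is exactly the bug in question:

```typescript
// Sketch: flip "allow embedding" back on programmatically via the YouTube
// Live Streaming API (liveBroadcasts.update with contentDetails.enableEmbed),
// assuming an already-authorized OAuth2 client.
import { google } from 'googleapis';

async function reEnableEmbedding(auth: any, broadcastId: string): Promise<void> {
  const youtube = google.youtube({ version: 'v3', auth });

  // Read the broadcast first: update() overwrites the parts you send, so we
  // preserve the existing contentDetails and only change enableEmbed.
  const { data } = await youtube.liveBroadcasts.list({
    part: ['id', 'contentDetails'],
    id: [broadcastId],
  });
  const current = data.items?.[0]?.contentDetails ?? {};

  await youtube.liveBroadcasts.update({
    part: ['id', 'contentDetails'],
    requestBody: {
      id: broadcastId,
      contentDetails: { ...current, enableEmbed: true },
    },
  });
}
```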

Does video.js support MPEG2-TS/UDP streams?

I am just starting to play around with video.js and really like it. I currently have some code where I have two players showing two different HLS streams in a single browser page.
However, HLS inherently has high latency, and that may not work for my project. So I am wondering whether video.js can receive and play MPEG2-TS/UDP streams, which would have lower latency (I can easily change the format of all of my source video streams).
My basic requirement is to have two players in a single browser page: one player showing the video stream sent from a particular network node, and the second showing how a different network node received that same stream. The two video.js players are therefore showing what is effectively the same video, so they are highly correlated, which is why low latency is a critical requirement for this project.
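For reference, a sketch of the two-player HLS setup described above, using video.js (the element IDs and stream URLs are placeholders). On the UDP question: browser JavaScript cannot open raw UDP sockets, so an MPEG2-TS/UDP feed would need a server-side gateway that repackages it into something a browser can consume (e.g. low-latency HLS or WebRTC) before video.js, or any other web player, can display it:

```typescript
// Two video.js players on one page, each pointed at a different HLS
// playlist. Assumes two <video id="node-a"> / <video id="node-b"> elements
// with the "video-js" class exist in the markup; URLs are placeholders.
import videojs from 'video.js';

const options = {
  autoplay: true,
  muted: true, // most browsers require muted for autoplay
  liveui: true,
};

const playerA = videojs('node-a', options);
playerA.src({ src: 'https://example.com/node-a/index.m3u8', type: 'application/x-mpegURL' });

const playerB = videojs('node-b', options);
playerB.src({ src: 'https://example.com/node-b/index.m3u8', type: 'application/x-mpegURL' });
```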
Thanks,
-Andres

Video live streaming application in React Native

I am trying to implement video live streaming in React Native:
live streaming,
uploading it to a server, and
saving the streamed video for playback.
Can anyone help me with a sample project?
This will be helpful: https://www.npmjs.com/package/react-native-video
For the "upload it to a server" point, what exactly do you need to upload? The video itself, or something else?
So you'll need a backend server that can accept a video stream and convert it into a stream that can be consumed in React Native. You'd also like the server to save the streamed video and encode it so it can be played back as video on demand (VOD) after the stream has stopped. None of this is React; it will all be done on the backend.
You can build all this yourself, but there are a number of APIs that can do it for you. Disclaimer: I work for one such company, api.video. (A Google search will find others.)
For livestreaming from the browser, you can use getUserMedia and stream to a server (you can see my JavaScript demo at livestream.a.video). This stream will be visible to all your users, and it is also recorded and saved as VOD for later playback.
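A minimal sketch of that getUserMedia approach, capturing the camera and forwarding MediaRecorder chunks. The /ingest endpoint is hypothetical; a real provider would typically ingest via RTMP or WebRTC rather than plain HTTP POSTs:

```typescript
// Capture camera + mic, then ship MediaRecorder chunks to a server.
// The /ingest endpoint is a placeholder, not any provider's real API.
async function startBrowserLivestream(): Promise<void> {
  const stream = await navigator.mediaDevices.getUserMedia({
    video: true,
    audio: true,
  });

  const recorder = new MediaRecorder(stream, { mimeType: 'video/webm' });

  // MediaRecorder hands us a chunk at each timeslice; forward it upstream.
  recorder.ondataavailable = (event: BlobEvent) => {
    if (event.data.size > 0) {
      fetch('/ingest', { method: 'POST', body: event.data }); // hypothetical
    }
  };

  recorder.start(1000); // emit a chunk every 1000 ms
}
```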
To upload a recorded video, you can use file.slice() to break the video into manageable chunks and upload them to the server for transcoding into a video stream (demo at upload.a.video, and tutorial).
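A sketch of that chunked upload, assuming a server that accepts Content-Range headers (the endpoint contract here is an assumption, not api.video's actual upload API):

```typescript
// Break a File into chunks with Blob.slice() and upload them in order,
// labelling each chunk's position with a Content-Range header.
async function uploadInChunks(file: File, url: string): Promise<void> {
  const CHUNK_SIZE = 5 * 1024 * 1024; // 5 MB per chunk

  for (let start = 0; start < file.size; start += CHUNK_SIZE) {
    const end = Math.min(start + CHUNK_SIZE, file.size);
    const chunk = file.slice(start, end); // no copy until the chunk is read

    await fetch(url, {
      method: 'POST',
      headers: {
        // Tells the server where this chunk belongs in the whole file.
        'Content-Range': `bytes ${start}-${end - 1}/${file.size}`,
      },
      body: chunk,
    });
  }
}
```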
For playback, these APIs will give you a player URL or an m3u8 URL that you can incorporate into a video player. This is the React Native part: there are several video players that you can add to your application to play back HLS video. At api.video we have our own player, and we also support third-party players.
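And a sketch of the React Native playback side with react-native-video, the package linked earlier (the m3u8 URL is a placeholder for whatever URL your provider returns):

```tsx
// HLS playback in React Native using react-native-video.
// The stream URL is a placeholder.
import React from 'react';
import { StyleSheet } from 'react-native';
import Video from 'react-native-video';

export function LivePlayer(): JSX.Element {
  return (
    <Video
      source={{ uri: 'https://example.com/live/playlist.m3u8' }} // HLS stream
      controls
      resizeMode="contain"
      style={styles.player}
    />
  );
}

const styles = StyleSheet.create({
  player: { width: '100%', aspectRatio: 16 / 9 },
});
```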

Is it possible to capture the video frames when using the Google Hangouts API?

I would like to use the Hangouts API to try to capture the video frames and process them to see if I could begin writing an app to measure average brightness over the entire frame or a sub-frame.
Is it possible to pull the frame itself and measure pixel data like this? Is there any command in the Hangouts API that allows for this, or is there other third-party software that can accomplish this?
It is not possible to get direct access to the video feed or video frame data in the Hangouts API.
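That said, the brightness measurement itself is straightforward for any <video> element you do control: draw the current frame to a canvas and average the pixels. A sketch (this requires a same-origin or CORS-enabled source, or getImageData() will throw on a tainted canvas):

```typescript
// Average brightness of the current frame of a <video> element, via a
// canvas snapshot. Cross-origin video without CORS taints the canvas and
// makes getImageData() throw.
function averageBrightness(video: HTMLVideoElement): number {
  const canvas = document.createElement('canvas');
  canvas.width = video.videoWidth;
  canvas.height = video.videoHeight;

  const ctx = canvas.getContext('2d');
  if (!ctx) throw new Error('2D canvas not supported');

  ctx.drawImage(video, 0, 0); // snapshot of the current frame
  const { data } = ctx.getImageData(0, 0, canvas.width, canvas.height);

  // data is RGBA; average a simple luma approximation over all pixels.
  let total = 0;
  for (let i = 0; i < data.length; i += 4) {
    total += 0.299 * data[i] + 0.587 * data[i + 1] + 0.114 * data[i + 2];
  }
  return total / (data.length / 4); // 0 (black) to 255 (white)
}
```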

Are there OSMF concepts for scheduled live events?

Lots of ancient, non-negotiable history due to mergers and acquisitions, so I realize there are better ways to do all of this, however... I am faced with the following:
I have an OSMF-based video player where a particular playlist item (for a live video) must do the following:
play a pre-roll prior to displaying a countdown
display a countdown until the video start time (synced with the server time)
play another pre-roll prior to video playback
play the live stream until the server time reaches the stream end time
then play a post-roll video
I've gotten this working for the most part, but I'm running into walls with the arbitrary insertion of "ads", since I don't want to trigger events associated with loading new media. If I try to inject a new ad (particularly after the stream has played), the live stream will display again. While I could figure out some horrible way to make this work, I just wanted to make sure first that I'm not missing something critical about OSMF and live events. I'm also a bit uncertain as to what is native OSMF in the architecture I'm working with and what is homegrown.
1) Does OSMF have a concept of a scheduled live event that might make this easier?
2) Does OSMF have an option to arbitrarily insert a video into playback, based on some external call, without changing the playlist index or returning to the beginning of the video?
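For what it's worth, the sequencing itself can be modeled as a small state machine driven by server time. A platform-neutral sketch of the phases listed above (OSMF is ActionScript, so this illustrates only the logic, not any OSMF API; how each phase actually plays media is left out):

```typescript
// Platform-neutral sketch of the scheduled-event phases. This is only the
// sequencing logic, driven by server time; the actual media handling
// (OSMF MediaElements, ad loading, etc.) is deliberately omitted.
type Phase = 'preRoll1' | 'countdown' | 'preRoll2' | 'live' | 'postRoll' | 'done';

interface Schedule {
  streamStart: number; // server timestamp (ms) when the live stream begins
  streamEnd: number;   // server timestamp (ms) when the live stream ends
}

// `mediaFinished` is true once the current pre/post-roll has played out;
// `serverNow` should come from the server clock, not the client clock.
function nextPhase(phase: Phase, serverNow: number, mediaFinished: boolean, s: Schedule): Phase {
  switch (phase) {
    case 'preRoll1':  return mediaFinished ? 'countdown' : 'preRoll1';
    case 'countdown': return serverNow >= s.streamStart ? 'preRoll2' : 'countdown';
    case 'preRoll2':  return mediaFinished ? 'live' : 'preRoll2';
    case 'live':      return serverNow >= s.streamEnd ? 'postRoll' : 'live';
    case 'postRoll':  return mediaFinished ? 'done' : 'postRoll';
    default:          return 'done';
  }
}
```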