Video live streaming application in React Native

I am trying to implement, in React Native:

- video live streaming,
- uploading the stream to a server, and
- saving the streamed video for later playback.

Can anyone help me with a sample project?

This will be helpful: https://www.npmjs.com/package/react-native-video
Regarding the "upload it to server" point, what exactly do you need to upload? The video itself, or something else?

So - you'll need a backend server that can accept a video stream and convert it into a stream that can be consumed in React Native. You'd also like the server to save the streamed video and encode it so it can be played back as video on demand (VOD) after the stream has stopped. None of this is React - it'll all be done on the backend.
You can build all of this yourself, but there are a number of APIs that can do it for you. Disclaimer: I work for one such company: api.video. (A Google search will find others.)
For livestreaming from the browser, you can use getUserMedia and stream to a server (you can see my demo using JavaScript at livestream.a.video). This stream will be visible to all your users, and it is also recorded and saved as VOD for later playback.
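As a rough illustration of that browser capture step (not specific to api.video; the ingest URL below is a placeholder), you could record the getUserMedia stream with MediaRecorder and POST each chunk to your backend:

// Hypothetical sketch: capture the camera, record it with MediaRecorder, and
// send each chunk to a backend ingest endpoint that forwards/transcodes it.
// Note: supported mimeTypes vary by browser (Safari may not accept webm).
async function startBrowserLivestream(): Promise<void> {
  const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
  const recorder = new MediaRecorder(stream, { mimeType: "video/webm;codecs=vp8,opus" });

  recorder.ondataavailable = async (event: BlobEvent) => {
    if (event.data.size > 0) {
      await fetch("https://example.com/ingest/my-stream", {
        method: "POST",
        body: event.data,
      });
    }
  };

  recorder.start(1000); // emit a chunk roughly every second
}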
To upload a recorded video, you can use file.slice() to break the video into manageable chunks and upload them to the server for transcoding into a video stream (demo at upload.a.video, and tutorial).
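A minimal sketch of that chunked upload, assuming a hypothetical endpoint that accepts a Content-Range header per chunk (adapt this to whatever your upload API actually expects):

// Hypothetical sketch: split a recorded video File with file.slice() and
// upload each chunk, telling the server which byte range it covers.
async function uploadInChunks(file: File, uploadUrl: string, chunkSize = 5 * 1024 * 1024): Promise<void> {
  for (let start = 0; start < file.size; start += chunkSize) {
    const end = Math.min(start + chunkSize, file.size);
    const chunk = file.slice(start, end);

    await fetch(uploadUrl, {
      method: "POST",
      headers: {
        "Content-Range": `bytes ${start}-${end - 1}/${file.size}`,
      },
      body: chunk,
    });
  }
}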
For playback, these APIs will give you a player URL or the m3u8 URL that you can incorporate into a video player. This is the React Native part - there are several video players that you can add to your application to play back HLS video. At api.video, we have our own player, and we also support 3rd party players.
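On the React Native side, a minimal playback sketch with react-native-video might look like this (the m3u8 URL is a placeholder):

// Hypothetical sketch: play an HLS (m3u8) stream with react-native-video.
import React from "react";
import Video from "react-native-video";

export function LivePlayer() {
  return (
    <Video
      source={{ uri: "https://example.com/live/stream.m3u8" }}
      controls
      resizeMode="contain"
      style={{ width: "100%", aspectRatio: 16 / 9 }}
    />
  );
}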

Related

React native live stream and save live stream video

I have to create an app to live stream and save live stream video.
The app (a simple version of a camera app such as Botslab or Mi Home) just streams live all the time, saves video (maybe in 20-minute segments) to a memory stick, and lets you watch the saved videos in the app.
I plan to use react-native-webrtc for the project.
But when I read the docs I ran into a problem: mediaDevices.getDisplayMedia() could help record the video, but my skills aren't good enough and I can't find a way to use it in React Native.
I have seen https://stackoverflow.com/a/59082227/14745811, but it can't stream in HD or Full HD quality.
So does anyone have any suggestions?
Ideally a free option, because I'm using my company's server.
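For what it's worth, a minimal sketch of requesting an HD camera stream with react-native-webrtc looks something like this; whether you actually get 1080p depends on the device camera, and older versions of the library use a different (mandatory/minWidth) constraint shape:

// Hypothetical sketch: ask react-native-webrtc for a Full HD camera stream.
import { mediaDevices } from "react-native-webrtc";

async function getHdCameraStream() {
  const stream = await mediaDevices.getUserMedia({
    audio: true,
    video: {
      width: 1920,
      height: 1080,
      frameRate: 30,
      facingMode: "environment", // back camera
    },
  });
  return stream; // attach to an RTCPeerConnection, or show it in an RTCView
}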

Parsing HLS manifest of live stream in Safari to retrieve time-based metadata

I am using the native Safari player implementation to stream video with the HLS streaming protocol.
My goal is to get time-based metadata (such as EXT-X-DATERANGE) from a live stream manifest.
As far as I know, it is not possible to retrieve this data because the streaming logic is fully controlled by the Safari player, which does not expose it.
For now, I have come up with 2 possible solutions:
Manually download the manifests and parse out the EXT-X-DATERANGE tags (see the sketch after this list). But with this approach the download timer has to be managed manually too, and of course the number of playlist requests will increase.
Desktop Safari supports MSE, which means it is possible to have full control over manifest retrieval and parsing. There are excellent libraries that already provide this functionality, such as shaka-player and hls.js: you can implement a custom response filter for segments (shaka-player) or listen to the Hls.Events.FRAG_CHANGED event (hls.js) in order to get access to the playlist. The problem is that Safari on iOS still does not support MSE, so this solution cannot be applied on mobile.
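A rough sketch of option 1, assuming you poll a known media playlist URL yourself (the URL and polling interval below are placeholders) and simply filter out the EXT-X-DATERANGE lines:

// Hypothetical sketch: download the live playlist and collect EXT-X-DATERANGE tags.
async function fetchDateRanges(playlistUrl: string): Promise<string[]> {
  const response = await fetch(playlistUrl);
  const manifest = await response.text();

  // Each EXT-X-DATERANGE tag occupies a single line of the playlist.
  return manifest
    .split("\n")
    .filter((line) => line.startsWith("#EXT-X-DATERANGE:"));
}

// Poll roughly once per target duration while the stream is live.
setInterval(async () => {
  const tags = await fetchDateRanges("https://example.com/live/playlist.m3u8");
  console.log(tags);
}, 6000);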
Are there any other ways to retrieve time-based metadata (such as EXT-X-DATERANGE) using native Safari player implementation?
Thanks a lot in advance!

Track Flash Player error from backend

I have a few live streams that my video player (JWPlayer) is used to play. I want a mechanism to automatically load a live stream and test whether it loads in JWPlayer or not - this needs to happen on the backend, on the server side, preferably on a Unix-flavoured machine.
For example, the live stream URL may change, or there may be a cross-domain error. Ultimately, if this happens, I want to remove the live stream from my database automatically.
Is it possible to do this automatically? Note that an m3u8 URL may play in QuickTime but not in Flash because of m3u8 errors.
I would also like a similar tracking mechanism linked to an HTML5 player that supports live m3u8 streams - say QuickTime (or maybe ffplay?).
Is this possible? If so, how?
Thanks a lot!
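One possible server-side check (not from the question; just a sketch that assumes ffprobe from FFmpeg is installed on the machine, and that only verifies the stream is readable, not that JWPlayer/Flash specifically can play it):

// Hypothetical sketch (Node.js): probe an m3u8 URL with ffprobe and treat a
// non-zero exit code as "the stream does not load".
import { execFile } from "child_process";

function checkStream(m3u8Url: string): Promise<boolean> {
  return new Promise((resolve) => {
    execFile("ffprobe", ["-v", "error", m3u8Url], (error) => {
      resolve(error === null);
    });
  });
}

// Example: flag a failing stream for removal from the database.
checkStream("https://example.com/live/stream.m3u8").then((ok) => {
  if (!ok) {
    console.log("Stream failed to load; flag it for removal.");
  }
});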

Streaming web video to Roku

Does anyone know how, technically, to send videos (i.e. YouTube videos) to a Roku player? There is a "Twonky Beam" app that allows streaming, and what it appears to do is send .mp4 files to the Roku for playback. See the demo here: http://gigaom.com/video/youtube-on-roku-twonky-airplay/
This is done without a "Twonky Beam" Roku app. It looks like something Roku supports natively, although I cannot find anything documented.
I want to know how they were able to accomplish this without the Roku being a UPnP or DLNA device.
Any insights here would be great!
There are discussions on how to extract the mp4 URL from YouTube here and here
In terms of how to do AirPlay-style video playback on a Roku, you would use the External Control Protocol (ECP) to launch a channel with the URLs of the video you wish to play back, or, once your channel is launched, use ECP in combination with the roInput component to send the URLs to your channel. Your channel would then pass the URLs to a video playback component, which would initiate playback from YouTube or whatever source you send it. If you want to play URLs from your device (Android/iOS), you would need to run a web server on the device to serve the videos to the Roku.
Here is an Open Source YouTube project referenced in that second thread.
Note that any unofficial project that plays videos from YouTube is subject to a DMCA takedown by YouTube should they decide your project does not fit with their goals.
roInput is not really well documented; here is an example that demonstrates both roInput and launch parameters (launch parameters are key=value pairs you include in the HTTP POST):
function main(params as object)
    ' Launch parameters arrive as the associative array passed to main()
    if params.parameter <> invalid then
        print "This channel was launched with launch parameters!"
        print params
    else
        print "Launched without launch parameters"
    end if

    ' Listen for roInput events sent via ECP while the channel runs
    port = CreateObject("roMessagePort")
    input = CreateObject("roInput")
    input.SetMessagePort(port)

    while true
        msg = wait(100, port)
        if type(msg) = "roInputEvent" then
            params = msg.GetInfo()
            print params
        end if
    end while
end function
So your parameters might be "vidurl=http://myserver.com/video300k.mp4&vidurl=http://myserver.com/video600k.mp4" if you wanted to send multiple bit-rate videos.
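For example, a hypothetical launch request from a script or companion app could look like this (the Roku IP address and the "dev" channel ID are assumptions; ECP listens on port 8060 and parameters go in the query string of an empty-body POST):

// Hypothetical sketch (Node.js 18+, global fetch): launch a channel over ECP
// with a video URL as a launch parameter.
const rokuIp = "192.168.1.50";
const channelId = "dev"; // sideloaded channel; use your channel's ID otherwise

const params = new URLSearchParams({
  vidurl: "http://myserver.com/video300k.mp4",
});

fetch(`http://${rokuIp}:8060/launch/${channelId}?${params.toString()}`, {
  method: "POST",
}).then((res) => console.log("ECP launch status:", res.status));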
There are plenty of examples of how to play video on a Roku in the Roku SDK, the simplest being the simplevideoplayer example.
As to the last part of the question re UPnP: you can find a Roku on your LAN either by brute force (connecting to port 8060 on every IP) or by using SSDP, which is also documented in the ECP guide linked above.
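If you go the SSDP route, a minimal discovery sketch sends an M-SEARCH for the roku:ecp search target and reads the responses (each responding Roku includes a LOCATION header pointing at its ECP endpoint on port 8060):

// Hypothetical sketch (Node.js): discover Roku devices on the LAN via SSDP.
import dgram from "dgram";

const socket = dgram.createSocket("udp4");
const msearch = [
  "M-SEARCH * HTTP/1.1",
  "HOST: 239.255.255.250:1900",
  'MAN: "ssdp:discover"',
  "ST: roku:ecp",
  "MX: 3",
  "",
  "",
].join("\r\n");

socket.on("message", (msg, rinfo) => {
  console.log(`Roku response from ${rinfo.address}:`);
  console.log(msg.toString());
});

socket.bind(() => {
  socket.send(msearch, 1900, "239.255.255.250");
});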

MPMoviePlayerController Play Live Stream

I am making a radio app for my website and I tried to use MPMoviePlayerController to access my streaming (.pls) file (it's actually SHOUTcast), but it doesn't work.
Any help or snippet to do that?
Apple recommends you use HTTP Live Streaming for audio/video streams. Do a little digging and I'm sure you can figure it out.
If it's a live stream, a playlist, or something like that, you can just use the API to create an m3u8 playlist that your MPMoviePlayerController can stream. Check out the docs and guides at the link below:
HTTP Live Streaming Resources - Apple Developer