I have to create an app that live streams and saves the streamed video.
It's a simple version of a camera app (like Botslab or Mi Home): it streams continuously, saves video (perhaps in 20-minute segments) to a memory stick, and lets the user watch the saved videos in the app.
I plan to use react-native-webrtc for the project.
But reading the docs I hit a problem: mediaDevices.getDisplayMedia() can help record video, but my skills aren't good enough to find a way to use it in React Native.
I have seen https://stackoverflow.com/a/59082227/14745811, but it can't stream in HD or Full HD quality.
So does anyone have any suggestions?
Ideally a free option, since I'm using my company's server.
Related
I am trying to make a React page that listens to music, just like Shazam does. Do I have to use some API or something like that, and how would I do it?
ACRCloud allows you to scan audio files and recognize the song playing. Unlike the popular approach where the song is listened to and recognized in real time, you record the audio and then scan it; it still creates the impression that you are listening in real time. I think that's what you need.
On the frontend, you can record an n-second (e.g. 10-second) audio clip and send it to your backend, which performs the recognition and sends the response back to the frontend.
Take a look at ThisSong-Backend (Python)
Shazam Clone UI Demo (frontend demo with React Native)
Also, there's an official ShazamKit
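The record-and-send flow above can be sketched in browser JavaScript; this is a minimal sketch, assuming a `MediaRecorder`-capable environment, and the `/api/recognize` endpoint and `sample` field name are hypothetical placeholders for your own backend route (ACRCloud request signing happens server-side and is omitted here).

```javascript
// Record roughly `seconds` of microphone audio and return it as a Blob.
// Assumes a browser with navigator.mediaDevices and MediaRecorder.
async function recordClip(seconds = 10) {
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const recorder = new MediaRecorder(stream);
  const chunks = [];
  recorder.ondataavailable = (e) => chunks.push(e.data);
  const stopped = new Promise((resolve) => { recorder.onstop = resolve; });
  recorder.start();
  setTimeout(() => recorder.stop(), seconds * 1000);
  await stopped;
  stream.getTracks().forEach((t) => t.stop()); // release the microphone
  return new Blob(chunks, { type: recorder.mimeType });
}

// Package the clip as multipart form data; "sample" is an assumed field name.
function buildRecognitionForm(clip) {
  const form = new FormData();
  form.append('sample', clip, 'clip.webm');
  return form;
}

// POST the clip to a hypothetical backend route that calls the recognizer.
async function recognize(clip) {
  const res = await fetch('/api/recognize', {
    method: 'POST',
    body: buildRecognitionForm(clip),
  });
  return res.json();
}
```

In React Native you would swap `MediaRecorder` for a native recording module, but the clip-then-upload shape stays the same.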
I am trying to implement video live streaming in React Native:
live streaming,
uploading it to a server, and
saving the streamed video (for playback).
Can anyone help me with a sample project?
This will be helpful: https://www.npmjs.com/package/react-native-video
For the "upload it to server" point, what exactly do you need to upload? The video itself, or something else?
So - you'll need a backend server that can accept a video stream, and convert it into a stream that can be consumed in React Native. You'd also like the server to save the streamed video, and encode it so it can be played back as video on demand (VOD) after the stream has stopped. None of this is React - it'll all be done on the backend.
You can build all this yourself, but there are a number of APIs that can do this for you. Disclaimer: I work for one such company: api.video. (A Google search will find others)
For livestreaming from the browser, you can use getUserMedia and stream to a server (you can see my demo using JavaScript at livestream.a.video). This stream will be visible to all your users, and it is also recorded and saved as VOD for later playback.
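A browser-side sketch of that step (not React Native), assuming a `MediaRecorder` environment: the constraints object asks the camera for 1080p, which also addresses the HD-quality concern; `ingestUrl`, the bitrate, and the POST-per-segment transport are illustrative assumptions, not api.video's actual ingest protocol.

```javascript
// Build a getUserMedia constraints object asking for 1080p where available.
function hdConstraints(width = 1920, height = 1080) {
  return {
    audio: true,
    video: { width: { ideal: width }, height: { ideal: height } },
  };
}

// Capture the camera and POST a media segment roughly once per second.
// `ingestUrl` is a hypothetical server endpoint that accepts raw segments.
async function startStreaming(ingestUrl) {
  const stream = await navigator.mediaDevices.getUserMedia(hdConstraints());
  const recorder = new MediaRecorder(stream, { videoBitsPerSecond: 3_000_000 });
  recorder.ondataavailable = (e) => {
    if (e.data.size > 0) {
      fetch(ingestUrl, { method: 'POST', body: e.data });
    }
  };
  recorder.start(1000); // emit a dataavailable event about every 1000 ms
  return recorder;
}
```

`ideal` constraints degrade gracefully: if the camera can't do 1080p, the browser picks the closest supported resolution instead of failing.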
To upload a recorded video, you can use file.slice() to break the video into manageable chunks and upload them to the server for transcoding into a video stream (demo at upload.a.video, and tutorial).
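The file.slice() chunking can be sketched as follows; this is a minimal sketch where the `Content-Range` header and sequential POSTs are assumptions about the server's protocol, not api.video's actual API.

```javascript
// Compute [start, end) byte ranges covering a file of `totalBytes`.
function chunkRanges(totalBytes, chunkBytes) {
  const ranges = [];
  for (let start = 0; start < totalBytes; start += chunkBytes) {
    ranges.push([start, Math.min(start + chunkBytes, totalBytes)]);
  }
  return ranges;
}

// Slice the File/Blob and upload each chunk sequentially.
// `url` is a hypothetical upload endpoint; 5 MiB chunks are an example size.
async function uploadInChunks(file, url, chunkBytes = 5 * 1024 * 1024) {
  for (const [start, end] of chunkRanges(file.size, chunkBytes)) {
    await fetch(url, {
      method: 'POST',
      headers: { 'Content-Range': `bytes ${start}-${end - 1}/${file.size}` },
      body: file.slice(start, end), // slice() copies nothing until the body is read
    });
  }
}
```

Keeping the range math in its own function makes it easy to retry or parallelize individual chunks later.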
For playback, these APIs will give you a player URL or the m3u8 URL that you can incorporate into a video player. This is your React Native part: there are several video players you can add to your application to play back HLS video. At api.video, we have our own player, and we also support third-party players.
Our company wants to use the DSC-QX10 for sports video capture.
We have to cut the videos in the Android app and send them to our server.
Can we record the video directly to the Android device? (Stream the video, or is there another way to get the video onto Android?)
What is the maximum length of a video? (We would need a 2-hour video.)
Thanks
A movie recorded using the Camera Remote API on the DSC-QX10 is stored on the camera's memory card.
Best Regards,
Prem, Developer World team
I want to create an application capable of playing the audio of YouTube videos and saving the downloaded content in a local cache, so that when the user resumes or replays a video it doesn't have to re-download the part it already fetched, only the remaining part (the user can then decide what to do with the cache and how to organize it).
Mobile is my main focus, but I'd also like to create a desktop version for experimental purposes.
So, my question is: does YouTube provide any API for this? To cache the downloaded content, my application needs to download the content itself rather than rely on an embedded player (remember that it is a native application). A third-party application on my Android system plays YouTube videos this way, so I think it's possible, unless those developers use some sort of hack; that I don't know.
Don't confuse this with the GData web API or the embed API; those are not what I want. What I want is to handle the video transfer myself.
As far as I know, there is no official API for that. However, you could use libquvi to look up the URLs of the real video data, or you could have a look at how they do it and reimplement it yourself (see here).
Is it possible to obtain a stream of audio data arriving at the system output (speakers, headphones, etc.) using CoreAudio or another framework?
Example: You're listening to a song on iTunes while watching a YouTube video, all while playing a computer game that makes sounds of its own, all of which are being played through your computer's speakers (Probably terribly annoying). My app would need to receive the entire mix as streaming data.
Thanks in advance.
Not at a user application's Core Audio or other app framework level. Some audio output capture/snoop apps may do this with a kernel extension (kext), or perhaps a replacement audio hardware driver.