I need to stream the camera and microphone feed using React Native. I need to open the camera and get the stream in real time.
You need the react-native-fetch-blob and Tailer packages. react-native-fetch-blob has a function called RNFetchBlob.fs.readStream which can be used to stream a file.
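A minimal sketch of what that looks like (the file path and chunk handling below are illustrative, not from the tutorial):

import RNFetchBlob from 'react-native-fetch-blob';

// Open a read stream on a local file and receive it in base64 chunks.
RNFetchBlob.fs.readStream(
  RNFetchBlob.fs.dirs.DocumentDir + '/recording.mp4', // illustrative path
  'base64', // encoding: 'utf8' | 'ascii' | 'base64'
  4096      // chunk size in bytes
).then((stream) => {
  stream.open();
  stream.onData((chunk) => {
    // forward each chunk to your transport here
  });
  stream.onError((err) => console.log('stream error', err));
  stream.onEnd(() => console.log('stream finished'));
});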
A step-by-step tutorial is here:
https://medium.com/react-native-training/build-youtube-alike-livestreams-with-react-native-8dde24adf543
The purpose is to play a local video on the host side and stream it to the participants. The video should pause and seek for everyone when the host pauses or seeks.
For this, I want to capture a stream (of MediaStream type) of a local video file while it is playing and pass it into WebRTC.
Just like on the web, where we have a captureStream() method to capture a stream from a video or canvas element, do we have anything similar in React Native? Or any other way to achieve the same goal?
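For clarity, on the web the idea looks roughly like this (browser-only APIs, shown just to illustrate what I am trying to reproduce in React Native):

// Browser-only sketch: capture the playing <video> element as a MediaStream
// and feed its tracks into an RTCPeerConnection.
const videoEl = document.querySelector('video');
const stream = videoEl.captureStream(); // not available in react-native
const pc = new RTCPeerConnection();
stream.getTracks().forEach((track) => pc.addTrack(track, stream));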
I could not find a related solution using the RTCView of react-native-webrtc or react-native-video. Any kind of solution or suggestion would be helpful.
Thank you in advance.
I need to stream the microphone to the speakers simultaneously in a React Native project. I've tried the LiveAudioStream and MicStream packages, but I have no idea how I can directly play the microphone stream.
Can anyone suggest the best way to do that?
Did you try the react-native-audio-player-recorder library? Maybe it lets you do both things at the same time.
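If it helps, here is a minimal sketch of pulling raw microphone chunks with react-native-live-audio-stream (one of the packages you already tried); playing those PCM chunks back through the speaker would still need a separate playback module, which this library does not provide:

import LiveAudioStream from 'react-native-live-audio-stream';

// Configure the capture; the values below are illustrative.
LiveAudioStream.init({
  sampleRate: 32000,   // Hz
  channels: 1,
  bitsPerSample: 16,
  audioSource: 6,      // Android MediaRecorder.AudioSource.VOICE_RECOGNITION
  bufferSize: 4096
});

// Each event delivers a base64-encoded PCM chunk.
LiveAudioStream.on('data', (chunk) => {
  // decode the chunk and hand it to whatever plays or transmits it
});

LiveAudioStream.start();
// ... later
LiveAudioStream.stop();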
1. RtcEngine.startAudioRecordingWithConfig({filePath: '/storage/emulated/0/'});
2. RtcEngine.stopAudioRecording();
You can use the startAudioRecording method to record the audio of the call using the React Native SDK. If you want to record video, you can use cloud recording.
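A minimal sketch, assuming react-native-agora v3.x where the recording methods are called on the engine instance after joining a channel (the app ID, channel name, and file path are placeholders):

import RtcEngine from 'react-native-agora';

async function recordCallAudio() {
  const engine = await RtcEngine.create('YOUR_AGORA_APP_ID'); // placeholder app id
  await engine.joinChannel(null, 'demo-channel', null, 0);

  // Record the call audio to a local file (the path should point to a
  // writable file, not just a directory).
  await engine.startAudioRecordingWithConfig({
    filePath: '/storage/emulated/0/call.aac',
  });
  return engine;
}

async function finishCall(engine) {
  await engine.stopAudioRecording();
  await engine.leaveChannel();
}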
I'm using the package Agora Video SDK for Unity and I have followed these two tutorials:
https://www.agora.io/en/blog/agora-video-sdk-for-unity-quick-start-programming-guide/
https://docs.agora.io/en/Video/screensharing_unity?platform=Unity
Up to here, it is working fine. The problem is that instead of sharing my screen, I want to send a texture. To do so, I'm loading a PNG picture and trying to set it to the mTexture you find in the second link. It seems to be working on my computer, but it looks like it never arrives at the target computer.
How can I send a texture properly?
Thanks
Did you copy every line of the code from the example as is? You may not want to do the ReadPixels part, since that reads the screen. You can just read the raw data from your input texture and send it with PushVideoFrame every update.
Is there a way to create an app in React Native which can record audio and video and save the same file on both Android and iOS devices?
Please help me as I am stuck.
Thank You.
I guess you can use three libraries maintained by the React Native community:
react-native-camera (for recording videos)
react-native-video (for playing video)
react-native-audio-toolkit (for recording and playing audio)
To record audio:
import {
  Recorder,
} from 'react-native-audio-toolkit';

// start the recorder, writing to filename.mp4
this.recorder = new Recorder('filename.mp4').record();
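Assuming the same library's Player class, stopping the recorder and playing the recording back looks roughly like this:

import { Player } from 'react-native-audio-toolkit';

// stop the recorder started above, then play the recorded file back
this.recorder.stop(() => {
  new Player('filename.mp4').play();
});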
To record a video and play it back, there is a really good blog post on Medium:
https://medium.com/react-native-training/uploading-videos-from-react-native-c79f520b9ae1
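If you just want a rough idea, a simplified sketch combining RNCamera and react-native-video could look like this (component names and wiring here are illustrative):

import React from 'react';
import { RNCamera } from 'react-native-camera';
import Video from 'react-native-video';

class RecordAndPlay extends React.Component {
  state = { videoUri: null };

  startRecording = async () => {
    // recordAsync resolves with { uri } once stopRecording() is called
    const { uri } = await this.camera.recordAsync();
    this.setState({ videoUri: uri });
  };

  stopRecording = () => this.camera.stopRecording();

  render() {
    return this.state.videoUri
      ? <Video source={{ uri: this.state.videoUri }} style={{ flex: 1 }} />
      : <RNCamera ref={(ref) => { this.camera = ref; }} style={{ flex: 1 }} />;
  }
}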
For more examples, please check the GitHub page of each library:
https://github.com/react-native-community/react-native-camera
https://github.com/react-native-community/react-native-video
https://github.com/react-native-community/react-native-audio-toolkit