What am I doing?
A react-native-video component is loaded on the screen with a source uri pointing at my Node.js backend.
What is the problem? (Only tried on Android)
My backend is working fine and the video does load, but extremely slowly. I checked on the backend that the video component requests a 1 MB byte range roughly every second. By contrast, if I request the same byte ranges with axios, fetching several of them takes only a couple of milliseconds.
When using a video stored locally on the android device, the loading time is almost instantaneous.
I tried the default Android player as well as ExoPlayer, with no difference in loading times.
I am trying all this with only one Video component, and when I comment it out the app runs smoothly.
Possible solution:
I don't want to load the data directly from the Video component, so I am now trying to download the byte ranges and store them locally. Later I plan to feed these byte ranges to the Video component.
import AsyncStorage from '@react-native-async-storage/async-storage';
...
await AsyncStorage.setItem(video_name, JSON.stringify(byte_range_data_from_backend))
This works fine and I can read the byte_range_data_from_backend later with:
const data_from_item = await AsyncStorage.getItem(video_name)
What is my question?
How can I get uris for these stored byte ranges so that I can pass them to the Video component?
I think I need to create some kind of file and update its content every time I receive a new byte range. Meanwhile, I would like the video to play using the byte ranges that have already been downloaded.
Is this possible, or is it not a good approach? What would be a good way to load videos fast? Ideally I would download some byte ranges before even trying to play the video, so that the waiting time for the user is minimal.
Related
I want to integrate a screen-share feature in my react-native application, in which I am using Twilio for video communication. On the web we achieve this by following these steps.
1 : We get the display media stream using
const stream = await navigator.mediaDevices.getDisplayMedia({
  video: true,
});
2 : Then we get the first stream tracks using
const newScreenTrack = first(stream.getVideoTracks());
3 : After that we set this newScreenTrack in some useState
const localScreenTrack = new TwilioVideo.LocalVideoTrack(
  newScreenTrack
);
4 : After that, we first unpublish the previous tracks and then publish the new track using
videoRoom.localParticipant.publishTrack(newScreenTrack, {
  name: "screen_share",
});
5 : And finally, we pass these tracks to our ScreenShare component and render them to view the screen share from the remote participant.
I need to do the same thing in my react-native application: the localParticipant asks another participant for screen-share permission, and that participant accepts and is then able to publish the local screen-share tracks.
If anyone knows how to do this, please help. It would be really helpful. Thank you.
I think this is an issue with the react-native-twilio-video-webrtc package. It seems, as you discovered in this issue, that screen sharing was previously a feature of the library and was removed as part of a refactor.
Sadly, the library layers its own management of video and audio tracks on top of the underlying Twilio libraries. The Twilio library is built to publish more than one track at a time; this React Native library, however, only lets you publish a single audio track and a single camera video track at a time.
In order to add screen sharing, you can either support pull requests like this one or refactor the library to separate getting access to the camera from publishing a video track, so that you can publish multiple video tracks at a time, including screen tracks.
Has anyone else run into this? I'm using AudioSource.uri to get the remote audio source, then just using await player.play(); to play the remote audio file. On Android these audio files buffer and start playing much faster than on iOS, where it takes up to 10 seconds to load and start playing (versus just 2-3 seconds on Android).
This happens because by default, iOS tries to prevent the player from having to stutter during playback when the network is slow. In effect, it waits longer for more data to be downloaded up front before allowing the audio to start.
How to override the iOS default: The AudioPlayer constructor in just_audio takes a parameter called audioLoadConfiguration where you can pass in platform specific parameters that control loading behaviour. One parameter that is relevant here is automaticallyWaitsToMinimizeStalling which you'll want to set to false. e.g.:
final player = AudioPlayer(
  audioLoadConfiguration: AudioLoadConfiguration(
    darwinLoadControl: DarwinLoadControl(
      automaticallyWaitsToMinimizeStalling: false,
    ),
  ),
);
I am currently working on a map project using React Native and Expo CLI. I get the marker data from firestore (the data set is notably small, only around 15 markers), and when I render the map it works perfectly except that it is very slow, and I always receive the PayloadTooLargeError message below. After loading the map, the whole app becomes unresponsive.
PayloadTooLargeError: request entity too large
at readStream (C:\Users\user\AppData\Roaming\npm\node_modules\expo-cli\node_modules\raw-body\index.js:155:17)
at getRawBody (C:\Users\user\AppData\Roaming\npm\node_modules\expo-cli\node_modules\raw-body\index.js:108:12)
at read (C:\Users\user\AppData\Roaming\npm\node_modules\expo-cli\node_modules\body-parser\lib\read.js:77:3)
at jsonParser (C:\Users\user\AppData\Roaming\npm\node_modules\expo-cli\node_modules\body-parser\lib\types\json.js:135:5)
at call (root\node_modules\connect\index.js:239:7)
at next (root\node_modules\connect\index.js:183:5)
at remoteDevtoolsCorsMiddleware (C:\Users\user\AppData\Roaming\npm\node_modules\expo-cli\node_modules\@expo\dev-server\src\middleware\remoteDevtoolsCorsMiddleware.ts:31:3)
at call (root\node_modules\connect\index.js:239:7)
at next (root\node_modules\connect\index.js:183:5)
at serveStatic (root\node_modules\serve-static\index.js:75:16)
I've seen several solutions, such as setting a request size limit in Express, but I did not write any backend code myself, so I am confused about how to solve this problem.
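Note that the stack trace above points at the Expo dev server's own body-parser, not at any backend you wrote: with remote JS debugging enabled, console output is posted to the dev server as JSON, and logging a very large object (for example a raw firestore snapshot) is a commonly reported way to exceed that parser's limit. A hypothetical helper like this, which logs a small summary instead of the full payload, is one way to stay under it:

```javascript
// Hypothetical helper: log a compact summary of a large dataset
// instead of the whole object, so the payload posted to the remote
// debugger stays small. `markers` stands in for the firestore data.
function summarize(markers, max = 3) {
  return {
    count: markers.length,
    sample: markers.slice(0, max), // only the first few items in full
  };
}

// Demo with 15 fake markers, matching the size mentioned above.
const markers = Array.from({ length: 15 }, (_, i) => ({ id: i }));
const summary = summarize(markers);
console.log(summary);
```

Disabling remote debugging, or removing large console.log calls, is the other obvious thing to try before touching any server configuration.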
I'm running a VueJS application that displays a full-screen story of videos. I don't create as many video tags as there are media in my story: I just change the video component's sources each time I play a new video.
But it looks like Safari (desktop & mobile) still does not cache an HTML video once it is loaded: when I play a previous media again, Safari downloads the asset again, instead of serving it from cache like Chrome does.
The same issue has already been reported here, but still has no correct answer.
Safari even stops downloading the final bytes of the video (producing a sort of timeout) when we go back and forth quickly in the story, so the story looks stuck.
Here's an example link.
Does anyone know a good alternative that avoids re-downloading the video data on each play in Safari?
Partial solution
I found a workaround that works pretty well if the video is small - all videos are less than 3Mb in my case.
The trick is to use the js fetch API to download the full video, then hand it to the video tag as an object URL.
const videoRequest = fetch("/path/to/video.mp4")
  .then(response => response.blob());

videoRequest.then(blob => {
  video.src = window.URL.createObjectURL(blob);
});
Contrary to the video src attribute, the fetch API will get the video data from cache if the same video was already fetched before.
Here a codepen demo that can be tested in Safari desktop/mobile (when NOT in private mode).
Pro : videos are now pulled from cache in Safari!
Con : you can't start the video until the full data has been downloaded. That's why this solution should only be used for small videos (say < 5Mb), otherwise your users may wait a while before being able to play the video.
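Extending the workaround, the resulting object URLs can also be memoized in the page itself, so each source is fetched at most once per session and replays reuse the same blob: url. This is a sketch, not from the original post; the fetchFn parameter only exists to make the helper easy to test outside a browser:

```javascript
// Sketch: cache one object URL per video source, so replaying a
// previous media reuses the already-downloaded blob instead of
// hitting the network again.
const objectUrlCache = new Map();

async function getCachedVideoUrl(url, fetchFn = fetch) {
  if (objectUrlCache.has(url)) return objectUrlCache.get(url);
  const response = await fetchFn(url);
  const blob = await response.blob();
  const objectUrl = URL.createObjectURL(blob); // stable blob: url
  objectUrlCache.set(url, objectUrl);
  return objectUrl;
}
```

Then video.src = await getCachedVideoUrl("/path/to/video.mp4") replaces the direct assignment; call URL.revokeObjectURL when a story is torn down to release the memory.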
I am building a video component to which I can add multiple sources, and it simply plays all the videos one after the other.
So it's quite simple: for now I've got only one video component in my custom component, and I loadAsync the next or previous video when it's needed.
My problem is that on a bad internet connection, the wait between videos can be long.
So I would like to preload and cache videos, so that upcoming videos load while another one is playing.
I tried FileSystem.downloadAsync(), but it's not really smooth: you have to wait for the download to finish to get a valid uri to pass to the video component. So if a video isn't downloaded before the end of the previous one, you can't play it.
I was thinking of having multiple video components in my custom component that preload videos and play/show at the right time, but I find that quite complex.
Is there a better way to do that ?
I just discovered that you can use downloadAsync on videos like you would on images:
async componentDidMount() {
  await Asset.fromModule(require('../assets/background.mp4')).downloadAsync();
  this.setState({ videoLoaded: true });
}
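Building on that, the downloads for upcoming sources can be kicked off while the current video plays and the resulting promises memoized, so each source is downloaded at most once. This is a hypothetical sketch: loadFn stands in for whatever actually performs the download, such as Asset.fromModule(...).downloadAsync() or FileSystem.downloadAsync():

```javascript
// Sketch: start downloads for upcoming sources early and memoize the
// in-flight promises, so by the time a video is needed its download
// is already finished (or at least already running).
function createPreloader(loadFn) {
  const cache = new Map();
  return {
    preload(uri) {
      if (!cache.has(uri)) cache.set(uri, loadFn(uri));
      return cache.get(uri); // promise of the loaded asset
    },
  };
}

// Demo with a fake loader that counts how often it is called.
let loads = 0;
const preloader = createPreloader((uri) => {
  loads += 1;
  return Promise.resolve('local:' + uri);
});

preloader.preload('intro.mp4');
preloader.preload('intro.mp4'); // second call reuses the first promise
```

When the current video starts playing, call preload on the next one or two sources; when it ends, await the cached promise for the next uri and hand the result to the Video component.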