I want to run a TensorFlow MobileNet model on a pre-recorded video. The only way I have found to extract frames is with FFmpeg, but I need to keep the app in Expo. Is there any way I could get frames from the video and run the model on them?
Maybe by grabbing the current frame from expo-av, or something else.
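For reference, this is the kind of thing I'm hoping for — an untested sketch that would stay inside the Expo managed workflow, sampling frames with expo-video-thumbnails and classifying each one with the tfjs MobileNet model. The 10-second duration and 500 ms step are placeholders, not real values:

```ts
import * as VideoThumbnails from 'expo-video-thumbnails';
import * as FileSystem from 'expo-file-system';
import * as tf from '@tensorflow/tfjs';
import { decodeJpeg } from '@tensorflow/tfjs-react-native';
import * as mobilenet from '@tensorflow-models/mobilenet';

// Sample one frame every `stepMs` milliseconds and classify it.
// durationMs is a placeholder -- read the real length from the video.
async function classifyVideoFrames(
  videoUri: string,
  durationMs = 10_000,
  stepMs = 500
) {
  await tf.ready();
  const model = await mobilenet.load();

  for (let t = 0; t < durationMs; t += stepMs) {
    // Extract the frame at time t as a JPEG thumbnail.
    const { uri } = await VideoThumbnails.getThumbnailAsync(videoUri, { time: t });

    // Read the JPEG, decode it into a tensor, and classify.
    const b64 = await FileSystem.readAsStringAsync(uri, {
      encoding: FileSystem.EncodingType.Base64,
    });
    const bytes = new Uint8Array(tf.util.encodeString(b64, 'base64').buffer);
    const frame = decodeJpeg(bytes);
    const predictions = await model.classify(frame);
    console.log(`t=${t}ms`, predictions);
    frame.dispose();
  }
}
```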
I've been working with expo-camera to record video in my React Native app, but after going through the documentation I can't find any way to set the frame rate. Is there any way to shoot video at 60+ fps with expo-camera or some other library?
VisionCamera (react-native-vision-camera) can record video at 60 fps, and it offers many other features as well.
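For example, a minimal sketch with the v3+ hooks API (prop and hook names may differ between versions; permission handling is omitted):

```tsx
import React, { useRef } from 'react';
import { Button, View } from 'react-native';
import {
  Camera,
  useCameraDevice,
  useCameraFormat,
} from 'react-native-vision-camera';

export function HighFpsRecorder() {
  const camera = useRef<Camera>(null);
  const device = useCameraDevice('back');
  // Ask for a format whose supported fps range includes 60.
  const format = useCameraFormat(device, [{ fps: 60 }]);

  if (device == null) return null;

  return (
    <View style={{ flex: 1 }}>
      <Camera
        ref={camera}
        style={{ flex: 1 }}
        device={device}
        format={format}
        fps={60}
        video
        isActive
      />
      <Button
        title="Record"
        onPress={() =>
          camera.current?.startRecording({
            onRecordingFinished: (video) => console.log(video.path),
            onRecordingError: (e) => console.error(e),
          })
        }
      />
    </View>
  );
}
```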
I've built a React Native app that records video with RNCamera. After shooting a video, I want to trim it and save the trimmed video.
FFmpegKit for React Native is the best way to go about this.
After adding FFmpegKit to your project, you can follow this standard FFmpeg trimming guide to build the command that trims the video.
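As a rough sketch, assuming ffmpeg-kit-react-native and placeholder file paths, a trim could look like this:

```ts
import { FFmpegKit, ReturnCode } from 'ffmpeg-kit-react-native';

// Keep everything from 2s in, for 8 seconds. -c copy avoids re-encoding,
// so it is fast but can only cut on keyframes; drop it for frame-accurate cuts.
async function trimVideo(inputPath: string, outputPath: string) {
  const command = `-ss 00:00:02 -i ${inputPath} -t 00:00:08 -c copy ${outputPath}`;
  const session = await FFmpegKit.execute(command);
  const returnCode = await session.getReturnCode();
  if (ReturnCode.isSuccess(returnCode)) {
    console.log('Trimmed video saved to', outputPath);
  } else {
    console.error('FFmpeg failed with return code', returnCode);
  }
}
```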
I'm working on a traffic-detection project where I get a video stream from a YouTube livestream using CamGear and then process each frame with OpenCV. I want to display the final video, with detections, in a Streamlit app. I tried passing the processed frames to st.frame, but for some reason only the first few frames are shown and then the video freezes. Now I'm looking at using the webrtc component, but I can't figure out how to change the video source from the webcam. Thanks.
I'm making an app that displays a camera feed and processes its frames in real time.
I was thinking of using native modules for the camera feed and the processing, but as far as I know, to show the frames I'd have to send them through the React Native bridge, which has too little bandwidth for the feed to appear in real time.
So I'm looking at keeping the camera feed and the processing on the React Native side, but I need a way to get hold of the individual frames for processing while also showing the feed in real time.
I know that there are barcode scanners in React Native, so it must be possible to build something that both shows a camera feed and processes its frames.
Any help or pointers would be greatly appreciated, thanks!
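One pointer I've since come across: react-native-vision-camera's frame processors run a worklet on every frame without round-tripping through the RN bridge, which seems to be how the barcode scanners mentioned above work. An untested sketch (the per-frame work is a stand-in; the library's worklets dependency must be set up):

```tsx
import React from 'react';
import {
  Camera,
  useCameraDevice,
  useFrameProcessor,
} from 'react-native-vision-camera';

export function LiveProcessedFeed() {
  const device = useCameraDevice('back');

  // Runs on a separate thread for every frame; nothing crosses the bridge.
  const frameProcessor = useFrameProcessor((frame) => {
    'worklet';
    // Real per-frame work (e.g. a native frame-processor plugin) would go
    // here; logging the dimensions is just a stand-in.
    console.log(`frame: ${frame.width}x${frame.height}`);
  }, []);

  if (device == null) return null;

  return (
    <Camera
      style={{ flex: 1 }}
      device={device}
      isActive
      frameProcessor={frameProcessor}
    />
  );
}
```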
I've got a React Native application to be developed for Android and iOS, in which I'd like to take the camera stream, do some processing on it, and render it multiple times.
Imagine an app like Instagram, where you can apply filters to the live camera and it shows the results on live previews. It would be more or less the same.
I need to get the camera stream and be able to show it on screen multiple times.
I've tried react-native-camera, but it only lets me show one instance of the camera.
With the same library, I've also tried taking pictures at intervals and rendering them via an <Image> component but, of course, that kills performance and the app ends up crashing.
How can I achieve what I'm trying to do? Do you know of any module or any approach that allows me to do so?
Thank you.
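For the live-filter half of this, the closest I've found is an untested sketch using VisionCamera v4's Skia frame processors (with @shopify/react-native-skia), which draw each frame through a color filter; mirroring the same processed stream into several views would still need extra plumbing beyond this:

```tsx
import React from 'react';
import {
  Camera,
  useCameraDevice,
  useSkiaFrameProcessor,
} from 'react-native-vision-camera';
import { Skia } from '@shopify/react-native-skia';

export function FilteredPreview() {
  const device = useCameraDevice('back');

  const frameProcessor = useSkiaFrameProcessor((frame) => {
    'worklet';
    // Draw the camera frame through a grayscale color matrix.
    const paint = Skia.Paint();
    paint.setColorFilter(
      Skia.ColorFilter.MakeMatrix([
        0.3, 0.6, 0.1, 0, 0,
        0.3, 0.6, 0.1, 0, 0,
        0.3, 0.6, 0.1, 0, 0,
        0,   0,   0,   1, 0,
      ])
    );
    frame.render(paint);
  }, []);

  if (device == null) return null;

  return (
    <Camera
      style={{ flex: 1 }}
      device={device}
      isActive
      frameProcessor={frameProcessor}
    />
  );
}
```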