I've got a React Native application to be developed for Android and iOS, in which I'd like to take the camera stream, do some processing on it, and render it multiple times.
Imagine an app like Instagram, where you can apply filters to the live camera and each applied filter is shown in its own live preview. It would be more or less the same.
I need to get the camera stream and be able to show it on screen multiple times.
I've tried react-native-camera, but it only lets me get one instance of the camera to show on screen.
With the same library, I've also tried taking pictures from the camera at short intervals and rendering them via an <Image> component, but, of course, it's a performance killer and the app ends up crashing.
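Roughly what I tried, as a simplified sketch (assuming react-native-camera's RNCamera and its takePictureAsync method; the interval and quality values are just placeholders):

```tsx
import React, { useEffect, useRef, useState } from 'react';
import { Image, View } from 'react-native';
import { RNCamera } from 'react-native-camera';

// Simplified version of the interval-capture idea: grab a still every few
// hundred milliseconds and mirror it into several <Image> views.
// This is exactly the approach that turned out to be too slow and crash-prone.
export function IntervalPreview() {
  const camera = useRef<RNCamera>(null);
  const [snapshotUri, setSnapshotUri] = useState<string | null>(null);

  useEffect(() => {
    const timer = setInterval(async () => {
      if (camera.current) {
        // Each tick writes a new photo to disk and re-renders the images.
        const photo = await camera.current.takePictureAsync({ quality: 0.3 });
        setSnapshotUri(photo.uri);
      }
    }, 300);
    return () => clearInterval(timer);
  }, []);

  return (
    <View style={{ flex: 1 }}>
      <RNCamera ref={camera} style={{ flex: 1 }} captureAudio={false} />
      {snapshotUri && (
        <>
          <Image source={{ uri: snapshotUri }} style={{ flex: 1 }} />
          <Image source={{ uri: snapshotUri }} style={{ flex: 1 }} />
        </>
      )}
    </View>
  );
}
```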
How can I achieve what I'm trying to do? Do you know of any module or any approach that allows me to do so?
Thank you.
Related
I'm making an app that displays a camera feed whose frames should be processed in real time.
I was thinking of using native modules for the camera feed and the processing, but as far as I know, in order to show the frames I'd have to send them over the React Native bridge, whose bandwidth is too low for the feed to appear in real time.
So I'm looking at having the camera feed and processing on the React Native side, but I need a way of getting the individual frames to process them while also showing the feed in real time.
I know that there are barcode scanners in React Native, so it must be possible to build something that both shows a camera feed and processes its frames.
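For example, react-native-camera's built-in barcode support keeps the per-frame work on the native side and only sends the decoded result over the bridge; a minimal sketch of that pattern (assuming the RNCamera component and its onBarCodeRead callback):

```tsx
import React from 'react';
import { RNCamera } from 'react-native-camera';

// The preview renders natively in real time; only the decoded result
// (a small string) crosses the bridge, never the raw frames.
export function ScannerPreview() {
  return (
    <RNCamera
      style={{ flex: 1 }}
      captureAudio={false}
      barCodeTypes={[RNCamera.Constants.BarCodeType.qr]}
      onBarCodeRead={({ data, type }) => {
        console.log(`Scanned ${type}: ${data}`);
      }}
    />
  );
}
```

I imagine custom frame processing would need something similar: a native module that consumes the frames and only forwards small results to JavaScript.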
Any help or pointers would be greatly appreciated, thanks!
I am currently trying to build an Android app that can do live broadcasting. May I know if Agora has the functionality to access both the rear and front cameras of the broadcaster at the same time? If yes, which part of the code do we need to modify (based on Open-Live-Android)?
Agora doesn't offer a demo that directly shows the code you are looking for, but if you can get both camera frames (which some devices may not support), you can take a look at this demo app: https://github.com/AgoraIO/Advanced-Video/tree/dev/win-screenshare/Screensharing/Agora-Screen-Sharing-Android. In this demo app, the SDK sends the camera view and the screen-share view at the same time. To achieve that, you need to make screen sharing a standalone service. Following similar logic, you can replace the screen-sharing part with one of the camera views.
I'm trying to build a React Native app that allows the user to record and publish square videos (like Instagram). Here are my attempts:
1 - Find a library that crops the video in-app => fail
2 - Find a library that shows a square camera screen => fail
3 - Record the video using the native picker and use hidden overflow to display it as a square (see the sketch below) => success
I think solution 3 is not optimal; do you have other ideas?
(Libraries I checked: react-native-image-picker, react-native-image-crop-picker, react-native-camera, react-native-video-processing)
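For reference, solution 3 is roughly this (a sketch assuming react-native-video for playback, which isn't in the list above; the recorded clip fills a square container and the excess is clipped):

```tsx
import React from 'react';
import { Dimensions, StyleSheet, View } from 'react-native';
import Video from 'react-native-video';

const SIZE = Dimensions.get('window').width;

// Shows a recorded (typically 16:9 or 4:3) video inside a square frame:
// the container is square with overflow hidden, and the video covers it.
export function SquareVideo({ uri }: { uri: string }) {
  return (
    <View style={{ width: SIZE, height: SIZE, overflow: 'hidden' }}>
      <Video
        source={{ uri }}
        resizeMode="cover"
        style={StyleSheet.absoluteFill}
      />
    </View>
  );
}
```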
I think react-native-camera and react-native-video-processing are the right combo. You capture the video with the camera and then process it with the processing lib. To show the user a square preview, it would be best to crop the camera view so it looks square. Later, in the background, crop it to an actual square.
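Something along these lines for the preview, as a rough sketch (the 4:3 ratio is just an assumption about the preview's aspect; adjust for your device):

```tsx
import React from 'react';
import { Dimensions, View } from 'react-native';
import { RNCamera } from 'react-native-camera';

const SIZE = Dimensions.get('window').width;
// Assumed 4:3 preview; change this if your camera preview ratio differs.
const PREVIEW_HEIGHT = (SIZE * 4) / 3;

// The camera preview keeps its native aspect ratio but is centered inside
// a square container; everything outside the square is clipped.
export function SquareCameraPreview() {
  return (
    <View style={{ width: SIZE, height: SIZE, overflow: 'hidden' }}>
      <RNCamera
        style={{
          width: SIZE,
          height: PREVIEW_HEIGHT,
          marginTop: -(PREVIEW_HEIGHT - SIZE) / 2, // center vertically
        }}
        captureAudio={false}
      />
    </View>
  );
}
```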
I'm messing with this currently too :)
I'm looking to build something similar to the way Spotify (or other music players) work, where there is a footer element that persists regardless of how you navigate through the app and lets you control the currently playing music. See the bottom of this image as an example.
Requirements for this are that the different screens must be able to communicate with that component so users can choose music from any screen. The component must persist across screens so that the currently playing music will be controllable even as you use the app.
I'm building in react-native and currently using react-navigation as my navigation solution, in case that makes a difference.
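What I'm picturing is roughly this shape, as a sketch (assuming react-navigation v5+ with NavigationContainer; the PlayerContext names are made up just to illustrate the idea):

```tsx
import React, { createContext, useContext, useState } from 'react';
import { Button, Text, View } from 'react-native';
import { NavigationContainer } from '@react-navigation/native';
import { createNativeStackNavigator } from '@react-navigation/native-stack';

// Made-up "now playing" state, shared app-wide through context so any
// screen can start playback and the footer can show/control it.
type PlayerState = { track: string | null; play: (track: string) => void };
const PlayerContext = createContext<PlayerState>({ track: null, play: () => {} });

function HomeScreen() {
  const { play } = useContext(PlayerContext);
  return <Button title="Play something" onPress={() => play('Some Song')} />;
}

function PlayerBar() {
  const { track } = useContext(PlayerContext);
  if (!track) return null;
  return (
    <View style={{ padding: 16, backgroundColor: '#222' }}>
      <Text style={{ color: '#fff' }}>Now playing: {track}</Text>
    </View>
  );
}

const Stack = createNativeStackNavigator();

export default function App() {
  const [track, setTrack] = useState<string | null>(null);
  return (
    <PlayerContext.Provider value={{ track, play: setTrack }}>
      <View style={{ flex: 1 }}>
        {/* Screens change inside the navigator... */}
        <View style={{ flex: 1 }}>
          <NavigationContainer>
            <Stack.Navigator>
              <Stack.Screen name="Home" component={HomeScreen} />
            </Stack.Navigator>
          </NavigationContainer>
        </View>
        {/* ...but the footer lives outside it, so it never unmounts. */}
        <PlayerBar />
      </View>
    </PlayerContext.Provider>
  );
}
```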
I used react-native-camera in my iOS app and am now trying to add focus, brightness, and zoom controls to it. So far I have been unable to come up with a solution. Any idea how to do this?
I tried to find an option in different React Native camera packages and also posted in their GitHub repos for help. Finally I tried this post: https://medium.com/react-native-development/react-native-camera-app-with-live-preview-saturation-and-brightness-filters-d34535cc6d14 where they take a photo from the camera every 5 milliseconds and adjust its brightness, which is very unstable and makes the app crash.
It is not possible to use the focus and zoom functionality with react-native-camera.
Unfortunately, the focus API has many bugs, and zoom will not render fast enough from JavaScript.
A possible solution is not to use react-native-camera at all and instead fire an intent to open the default camera application (see the sketch below).
The following app uses this solution, and all the camera functionality works perfectly.
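For example, from the React Native side the hand-off to the system camera can be done with react-native-image-picker's launchCamera (assuming v4+, where it returns a promise); on Android this fires a camera intent, so focus, zoom, and exposure come from the native camera UI:

```tsx
import { launchCamera } from 'react-native-image-picker';

// Delegates capture to the device's default camera app; on Android this is
// a camera intent, so the native focus/zoom/exposure controls are available.
export async function captureWithSystemCamera(): Promise<string | undefined> {
  const result = await launchCamera({ mediaType: 'photo', saveToPhotos: true });
  if (result.didCancel || !result.assets || result.assets.length === 0) {
    return undefined;
  }
  return result.assets[0].uri; // file path of the captured photo
}
```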
Could they re-open the issue, as it does not seem to be solved?
Developers may need to review all the open issues to estimate their project deadlines.