React Native video as GL texture

It seems it’s not possible to use a video as a texture with expo-gl (texImage2D can't take any video params; cf. the context definition and the API documentation). This feature is currently requested on the Canny.
I'm looking to convert an expo-av video to an ArrayBuffer of pixel data, or for any other way to pass a video as a texture to a shader.
I did some research but haven't found a solution so far:
gl-react: video processing: https://github.com/gre/gl-react/issues/215
react-native-gpuimage (react-native-video is not working inside GL.Node): https://github.com/CubeSugar/react-native-GPUImage/issues/1
Actually, this one could be interesting: https://github.com/shahen94/react-native-video-processing. The library's purpose is to edit, trim, and compress videos, but there's currently no way to pass custom shaders.
A lot of issues have been opened on GitHub (even on the expo org's expo-three package), but there's no answer yet. I'm looking for resources or any advice to accomplish this. For the moment, the best solution I can see is to use a sprite sheet, as sketched below.
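For reference, here's a minimal sketch of that sprite-sheet fallback, written against gl-react's `uv` varying convention: the video frames are pre-rendered into a grid image, and the fragment shader picks the current frame's sub-rectangle. The grid constants (`COLS`, `ROWS`, `FPS`) and the `time` uniform wiring are assumptions, not part of any library API.

```js
// Sketch: faking video by sampling a sprite sheet of pre-rendered frames.
// COLS, ROWS and FPS describe a hypothetical grid layout of the sheet;
// `time` (elapsed seconds) must be fed in as a uniform on every frame.
const frag = `
precision mediump float;
varying vec2 uv;
uniform sampler2D sheet; // sprite-sheet texture (a regular image upload)
uniform float time;      // elapsed seconds

const float COLS = 8.0;
const float ROWS = 8.0;
const float FPS  = 24.0;

void main() {
  float frame = mod(floor(time * FPS), COLS * ROWS);
  vec2 cell = vec2(mod(frame, COLS), floor(frame / COLS));
  // Flip the row index so frame 0 maps to the sheet's top-left cell.
  vec2 cellUv = (uv + vec2(cell.x, ROWS - 1.0 - cell.y)) / vec2(COLS, ROWS);
  gl_FragColor = texture2D(sheet, cellUv);
}
`;
```

The obvious costs are texture-size limits (64 frames per sheet in this layout) and no audio sync, which is why it's only a fallback.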

Related

Video Trimming In Expo

I am developing an app where a user can upload a video status, just like WhatsApp, but I need a video trimming library to implement this feature, where a user can trim a portion of the video they selected within a time frame. I searched and found some libraries like react-native-video-processing and others, but none seem to work with Expo. I would appreciate it if someone could give me a working one, or a guide on how I can use ffmpeg or other libraries to achieve this.
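If it helps, here is a minimal sketch of a trim using ffmpeg-kit-react-native. One caveat: it's a native module, so it won't run inside Expo Go; you'd need a bare workflow or a custom dev client. The file paths are hypothetical.

```js
import { FFmpegKit, ReturnCode } from 'ffmpeg-kit-react-native';

// Keep the segment between 2s and 10s, copying streams without re-encoding.
// `inputUri` and `outputUri` are hypothetical local file paths.
async function trimVideo(inputUri, outputUri) {
  const session = await FFmpegKit.execute(
    `-i ${inputUri} -ss 2 -to 10 -c copy ${outputUri}`
  );
  return ReturnCode.isSuccess(await session.getReturnCode());
}
```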

How to send a texture with Agora Video SDK for Unity

I'm using the package Agora Video SDK for Unity and I have followed these two tutorials:
https://www.agora.io/en/blog/agora-video-sdk-for-unity-quick-start-programming-guide/
https://docs.agora.io/en/Video/screensharing_unity?platform=Unity
Up to here, it is working fine. The problem is that instead of sharing my screen, I want to send a texture. To do so, I'm loading a PNG picture and trying to set it to the mTexture you find in the second link. It seems to be working on my computer, but it's as if it doesn't arrive at the target computer.
How can I send a texture properly?
Thanks
Did you copy every line of code from the example as-is? You may not want to do the ReadPixels part, since that reads the screen. You could instead read the raw data from your input texture and send it with PushVideoFrame every update.

Is there any library to transform a recorded video to a certain aspect ratio (cropping) in React Native?

I am using RNCamera to record video, along with the react-native-video-helper library for trimming/compressing video. I want to record, or transform the recorded video, like Instagram does: to a wide aspect ratio (not portrait mode). It's been almost a week that I've been looking for a solution, but I couldn't find anything useful yet. I have tried the react-native-video-processing library as well.
I have figured it out myself; posting the answer for anyone else struggling with the same scenario:
Go for ffmpeg. You can do almost everything with this awesome tool: cropping, merging, adding emojis, adding text to video, trimming, removing audio from video, and so on (see the sketch below).
You can find a test application for a quick start along with the package as well.
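For the cropping case specifically, ffmpeg's crop filter is the relevant tool. Below is a sketch using ffmpeg-kit-react-native; the paths and the target geometry are illustrative assumptions, not the answerer's exact command.

```js
import { FFmpegKit } from 'ffmpeg-kit-react-native';

// crop=w:h:x:y keeps a w x h window of the source, offset by (x, y).
// iw/ih are the input's width/height, so this keeps the full width
// and a vertically centered 16:9 band of a portrait recording.
async function cropToWide(inputUri, outputUri) {
  return FFmpegKit.execute(
    `-i ${inputUri} -vf "crop=iw:iw*9/16:0:(ih-iw*9/16)/2" ${outputUri}`
  );
}
```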

React Native 360/Panorama viewer

I believe there are a bunch of questions related to this, but they are all outdated.
I'm looking for a way to render a panorama/360 picture viewer in React Native. So far, all the libraries that try to use Google's VR SDK are outdated or broken, and not usable at all.
I have also tried to use a WebView (with react-360), but web views are just way too slow, double RAM usage, and, worst of all, can't be used to render 360 pictures stored on the device.
I guess that another option would be to grab an OpenGL library and try to implement it myself, but that's probably a lot of work if there's something made already.
We've recently published the panorama viewer we are using in our apps. Hope it can help you too: @lightbase/react-native-panorama-view
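Basic usage looks roughly like the sketch below; treat the exact prop names as assumptions from memory and check the package README.

```js
import React from 'react';
import { Dimensions } from 'react-native';
import { PanoramaView } from '@lightbase/react-native-panorama-view';

// Renders an equirectangular image as a 360 panorama.
const Viewer = () => (
  <PanoramaView
    style={{ height: 300 }}
    dimensions={{ height: 300, width: Dimensions.get('window').width }}
    inputType="mono"
    imageUrl="https://example.com/panorama.jpg"
  />
);

export default Viewer;
```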

DSC-HX400 RAW image data & Movie Recording

I am currently testing a DSC-HX400. While I am able to do almost everything I need to with the camera there are a couple of items that are not exposed via the API that have frustrated my efforts.
1) The camera does not seem to offer an option, via the API or on the camera itself, to capture images in RAW format. It does offer standard & fine JPEG formats, but both of those leave artifacts in the image that become extremely noticeable when you zoom in with an image editor. Is there a way to get the camera to capture RAW images? I do not need the SDK to return the data, just to save it out to the card. If getting the RAW data is impossible, has anyone found an inventive way to clean up the artifacts?
2) The camera supports both still-shoot and movie mode, but the API only exposes the mode I am currently in. That makes it impossible to transition from still to movie mode (to allow recording) via the API, yet I can make that same transition by pressing a single button on the camera. Once I am recording a movie, the API will allow me to transition back to still mode (by cancelling recording). Are there plans to support triggering a movie recording via the API while in still capture mode (seeing as the firmware already supports this functionality)?
Answers to your questions below:
If the camera cannot capture RAW images, the API will not be able to either. I do not know of a way to capture RAW images and can only comment with regard to the API, as I am not an expert on usage of the camera itself.
You can change between still and movie mode by using the "setShootMode" API.
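For reference, the Camera Remote API is JSON-RPC over HTTP, so the call looks roughly like the sketch below. The endpoint URL is an assumption; the real one comes from the device description you fetch during SSDP discovery.

```js
// Hypothetical endpoint; discover the real camera-service URL via SSDP.
const CAMERA_ENDPOINT = 'http://10.0.0.1:10000/sony/camera';

// mode is 'still' or 'movie'.
async function setShootMode(mode) {
  const res = await fetch(CAMERA_ENDPOINT, {
    method: 'POST',
    body: JSON.stringify({
      method: 'setShootMode',
      params: [mode],
      id: 1,
      version: '1.0',
    }),
  });
  return res.json(); // success looks like { result: [0], id: 1 }
}
```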