How to access current video frame in Expo Camera? - react-native

I'm trying to access the current video frame in React Native. I was able to do this with 'react-webcam' in React.js using the code below.
import React, { useRef } from "react";
import Webcam from "react-webcam";

function MyFunc() {
  const cameraRef = useRef(null);

  const processFrame = async () => {
    if (cameraRef.current.video) {
      const img = cameraRef.current.video;
      // Code to process.
    }
    setTimeout(() => processFrame(), 500);
  };

  React.useEffect(() => {
    processFrame();
  }, []);

  return (
    <Webcam
      align="center"
      audio={false}
      mirrored={false}
      id="img"
      ref={cameraRef}
      style={{ display: "none" }}
    />
  );
}
My current code in React Native using Expo Camera is:
import React, { useEffect, useRef } from "react";
import { Camera } from "expo-camera";

export function MyFunc() {
  const cameraRef = useRef(null);

  const processFrame = async () => {
    const img = cameraRef.current.video;
    console.log(img); // This prints undefined
    setTimeout(() => processFrame(), 500);
  };

  useEffect(() => {
    processFrame();
  }, []);

  return (
    <Camera
      ref={cameraRef}
      type={Camera.Constants.Type.front}
      style={{ opacity: 0, width: 1, height: 1 }}
    />
  );
}
Please let me know how I can access the current video frame without using takePictureAsync, if possible.

You need a real-time camera stream, which the TensorFlow.js camera (built from expo-camera) provides:
cameraWithTensors(CameraComponent)
A higher-order component (HOC) that augments the Expo.Camera component with the ability to yield tensors representing the camera stream.
Because the camera data will be consumed in the process, the original camera component will not render any content. This component provides options that can be used to render the camera preview.
Notably, the component allows on-the-fly resizing of the camera image to smaller dimensions, which speeds up data transfer between the native and JavaScript threads immensely.
For more info: TensorFlow React Native API
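A minimal sketch of how this fits together, modeled on the TensorCamera example in the "Determining Camera Texture" question further down this page (the texture and resize dimensions are assumptions you must tune per device):

import React from 'react';
import { Camera } from 'expo-camera';
import * as tf from '@tensorflow/tfjs';
import { cameraWithTensors } from '@tensorflow/tfjs-react-native';

// Wrap the Expo camera so it yields tensors instead of rendering frames.
const TensorCamera = cameraWithTensors(Camera);

export function FrameProcessor() {
  const handleCameraStream = (images) => {
    const loop = async () => {
      // `images` is a generator: each next() yields the current frame as a
      // tf.Tensor of shape [resizeHeight, resizeWidth, resizeDepth].
      const nextImageTensor = images.next().value;
      if (nextImageTensor) {
        // ...process the current frame tensor here...
        tf.dispose(nextImageTensor); // free the tensor once processed
      }
      requestAnimationFrame(loop);
    };
    loop();
  };

  return (
    <TensorCamera
      style={{ width: 152, height: 200 }}
      type={Camera.Constants.Type.front}
      cameraTextureHeight={1200} // assumption: determine empirically per device
      cameraTextureWidth={1600}
      resizeHeight={200}
      resizeWidth={152}
      resizeDepth={3}
      autorender={true}
      onReady={handleCameraStream}
    />
  );
}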

Related

React Native: data not displaying after async fetch

I'm developing a mobile app with React Native and the Expo managed workflow. The app is supposed to serve as a song book with lyrics to songs and hymns. All of the lyrics are stored in Firebase's Firestore database and clients can load them in the app. I started to implement offline functionality, where all of the lyrics are stored on the user's device using the community AsyncStorage package.
I want to get the data stored in AsyncStorage first, set it to the state variable holding songs, and then check whether the user has internet access. If yes, I want to check for updates in Firestore; if there were any, I set the data from Firestore to the state variable holding songs. If the user does not have internet access, the data from AsyncStorage will already be set to the state variable holding songs.
I'm trying to achieve this with an async function inside a useEffect hook with an empty dependency array. The problem I'm having is that no songs are rendered on screen even though they are successfully retrieved from AsyncStorage.
(When I console.log the output of retrieving the data from AsyncStorage I can see all songs; when I console.log the songs or allSongs state variable, I get undefined.)
Here is my simplified code:
import React, { useEffect, useState } from 'react';
import {
  StyleSheet,
  FlatList,
  SafeAreaView,
  LogBox,
  View,
  Text,
} from 'react-native';
import { StatusBar } from 'expo-status-bar';
import { filter, _ } from 'lodash';
import { doc, getDoc } from 'firebase/firestore';
import NetInfo from '@react-native-community/netinfo';
import { db } from '../../../firebase-config';
import { ThemeContext } from '../../util/ThemeManager';
import {
  getStoredData,
  getStoredObjectData,
  storeData,
  storeObjectData,
} from '../../util/LocalStorage';

const SongsList = ({ route, navigation }) => {
  // const allSongs = props.route.params.data;
  const { theme } = React.useContext(ThemeContext);
  const [loading, setLoading] = useState(false);
  const [allSongs, setAllSongs] = useState();
  const [songs, setSongs] = useState(allSongs);
  const hymnsRef = doc(db, 'index/hymns');

  useEffect(() => {
    const setup = async () => {
      setLoading(true);
      const locData = await getStoredObjectData('hymnsData');
      console.log(locData);
      setAllSongs(locData);
      setSongs(locData);
      const netInfo = await NetInfo.fetch();
      if (netInfo.isInternetReachable) {
        const data = await getDoc(hymnsRef);
        const lastChangeDb = data.get('lastChange').valueOf();
        const hymnsData = data.get('all');
        const lastChangeLocal = await getStoredData('lastChange');
        if (lastChangeLocal) {
          if (lastChangeLocal !== lastChangeDb) {
            await storeData('lastChange', lastChangeDb);
            await storeObjectData('hymnsData', hymnsData);
            setAllSongs(hymnsData);
            setSongs(hymnsData);
          }
        }
      }
      sortHymns();
      setLoading(false);
    };
    setup();
  }, []);

  return (
    <SafeAreaView style={[styles.container, styles[`container${theme}`]]}>
      {!loading ? (
        <FlatList
          data={songs}
          keyExtractor={(item) => item?.number}
          renderItem={({ item }) => {
            return <ListItem item={item} onPress={() => goToSong(item)} />;
          }}
          ItemSeparatorComponent={Separator}
          ListHeaderComponent={
            route.params.filters ? (
              <SearchFilterBar
                filters={filters}
                handleFilter={handleFilter}
                query={query}
                handleSearch={handleSearch}
                seasonQuery={seasonQuery}
                setSeasonQuery={setSeasonQuery}
              />
            ) : (
              <SearchBar handleSearch={handleSearch} query={query} />
            )
          }
        />
      ) : (
        <View>
          <Text>loading</Text>
        </View>
      )}
      <StatusBar style={theme === 'dark' ? 'light' : 'dark'} />
    </SafeAreaView>
  );
};

export default SongsList;
The functions getStoredData, getStoredObjectData, storeData and storeObjectData are just wrappers around AsyncStorage's getItem and setItem methods.
Here is my full code - GitHub.
What am I doing wrong? I've gone through many tutorials and articles and it should be working... but I guess not?
Can you check if hymnsData is undefined after the const hymnsData = data.get('all'); line?
If so, that would explain the issue - you are correctly setting the locData but then overwriting it immediately after. In that case, I would add hymnsData to the if condition: if (hymnsData && lastChangeLocal) { ... }
If you log songs and allSongs right before the return (, do you ever see that they are populated, briefly?
Another thing I'd do to debug is comment out the
setAllSongs(hymnsData);
setSongs(hymnsData);
lines and see if it works as expected with locData only.
The problem was with the sortHymns() method. I moved it from its own method into the useEffect and it's working now.
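For reference, a minimal sketch of what that fix could look like (the sort key is hypothetical - `number` is assumed from the keyExtractor above; sorting the freshly loaded array avoids reading stale state inside the effect):

// Inside the useEffect's setup(), after loading the local data:
const locData = await getStoredObjectData('hymnsData');

// Sort the local variable directly instead of calling a method that
// reads the `allSongs` state, which is still stale inside this effect.
const sorted = [...locData].sort((a, b) => a.number - b.number);

setAllSongs(sorted);
setSongs(sorted);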

Expo: How to detect when a WebBrowser instance is closed by the user?

I have an Expo app that opens a web page which redirects back to the app itself. In this case, it is used to perform 3DS callbacks. Here is a very simplified version:
import React, {
  FC, useEffect, useState,
} from 'react';
import * as Linking from 'expo-linking';
import * as WebBrowser from 'expo-web-browser';
import {
  Button,
} from '@private/apps-components';
import {
  ButtonProps,
  View,
} from 'react-native';

export const MyComponent: FC = () => {
  const [loading, setLoading] = useState<boolean>(false);

  const urlEventHandler = async (event): Promise<void> => {
    console.log('url-event', event);
    setLoading(false);
    // Stuff...
  };

  useEffect(() => {
    Linking.addEventListener('url', urlEventHandler);
    return () => Linking.removeEventListener('url', urlEventHandler);
  }, []);

  const handlePress: ButtonProps['onPress'] = () => {
    setLoading(true);
    WebBrowser.openBrowserAsync(aRandomUrlThatWillRedirectToTheApp, {
      showInRecents: true,
    });
  };

  return (
    <View>
      <Button
        title="Test"
        onPress={handlePress}
        loading={loading}
      />
    </View>
  );
};

export default null;
This is working. However, if the customer closes the browser before the web redirect has been processed, the app is stuck in the loading state.
The question is: how can I detect that a user has closed the opened WebBrowser?
Solved this using AppState:
https://reactnative.dev/docs/appstate
if (appState === "active") { // do things after closing the browser }
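A minimal sketch of that approach (an assumption of how it could be wired into the component above: clear the loading flag whenever the app returns to the foreground):

import { useEffect } from 'react';
import { AppState } from 'react-native';

// Inside MyComponent:
useEffect(() => {
  const subscription = AppState.addEventListener('change', (nextAppState) => {
    if (nextAppState === 'active') {
      // The user is back from the in-app browser (redirect or manual close),
      // so stop the spinner even if no 'url' event ever arrived.
      setLoading(false);
    }
  });
  return () => subscription.remove();
}, []);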
I haven't actually tested this - could follow up - but you could probably use react-navigation to detect whether the component is in focus or not, i.e. when you open the web browser the component loses focus, and when you close the web browser the component is back in focus.
For react-navigation version 4 you would wrap the component in withNavigationFocus to achieve this: https://reactnavigation.org/docs/4.x/function-after-focusing-screen#triggering-an-action-with-the-withnavigationfocus-higher-order-component. For 5 and above, you can use the useIsFocused hook: https://reactnavigation.org/docs/5.x/function-after-focusing-screen/#re-rendering-screen-with-the-useisfocused-hook
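A sketch of the hook variant (untested, as noted above; it assumes the component is rendered as a screen inside a react-navigation navigator):

import { useEffect } from 'react';
import { useIsFocused } from '@react-navigation/native';

// Inside the screen component:
const isFocused = useIsFocused();

useEffect(() => {
  if (isFocused) {
    // The screen regained focus, e.g. after the WebBrowser was closed.
    setLoading(false);
  }
}, [isFocused]);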

How to Make Shaka Player Compatible with React Native?

I am working on a React Native application. I want to implement Shaka Player in React Native.
Is there any solution?
Have you tried using the shaka-player npm package? You can use it within your React component and keep the instance in state.
import React, { useRef, useState, useEffect } from 'react';
import shaka from 'shaka-player';

const Component = () => {
  const playerRef = useRef();
  const [videoPlayer, setVideoPlayer] = useState();

  useEffect(() => {
    if (!videoPlayer) {
      const newPlayer = new shaka.Player(playerRef.current);
      setVideoPlayer(newPlayer);
      // you can start using shaka-player APIs with videoPlayer
    }
  }, []);

  return (
    <video ref={playerRef} />
  );
};

Determining Camera Texture

I've been stuck for a while trying to figure out an issue on Android devices. The sample code from the TensorFlow.js library says that the resolution of the camera texture has to be determined empirically. With iPhones it's been relatively consistent across versions (only changing for version 6 and above), but Android phones are so varied that I need a way to determine it automatically. When the resolution is incorrect, the waypoints used to refer to different parts of the body in the body-scan app end up in the wrong locations (e.g. the head is near the shoulder). Does anyone have tips for finding the resolution? A lot of the resources just refer to them as magic numbers. I've also linked someone with a similar issue who has had no response.
import React from 'react';
import { View, Platform } from 'react-native';
import { Camera } from 'expo-camera';
import { cameraWithTensors } from '@tensorflow/tfjs-react-native';

const TensorCamera = cameraWithTensors(Camera);

class MyComponent extends React.Component {
  handleCameraStream(images, updatePreview, gl) {
    const loop = async () => {
      const nextImageTensor = images.next().value;
      //
      // do something with tensor here
      //
      // if autorender is false you need the following two lines.
      // updatePreview();
      // gl.endFrameEXP();
      requestAnimationFrame(loop);
    };
    loop();
  }

  render() {
    // Currently expo does not support automatically determining the
    // resolution of the camera texture used. So it must be determined
    // empirically for the supported devices and preview size.
    let textureDims;
    if (Platform.OS === 'ios') {
      textureDims = {
        height: 1920,
        width: 1080,
      };
    } else {
      textureDims = {
        height: 1200,
        width: 1600,
      };
    }
    return (
      <View>
        <TensorCamera
          // Standard Camera props
          style={styles.camera}
          type={Camera.Constants.Type.front}
          // Tensor related props
          cameraTextureHeight={textureDims.height}
          cameraTextureWidth={textureDims.width}
          resizeHeight={200}
          resizeWidth={152}
          resizeDepth={3}
          onReady={this.handleCameraStream}
          autorender={true}
        />
      </View>
    );
  }
}
The textureDims also vary depending on whether the user is in portrait or landscape, so the current hard-coded values for iOS devices wouldn't work if the user changed the orientation of the device.
What you could use instead is useWindowDimensions, but this only works with functional components, not with class components.
import React from "react";
import { View, useWindowDimensions } from "react-native";

const MyCameraScreen = () => {
  // useWindowDimensions re-renders the component on rotation, so the
  // texture dimensions can be derived directly; no state is needed.
  const { width: windowWidth, height: windowHeight } = useWindowDimensions();
  const textureDims = { height: windowHeight, width: windowWidth };

  return (
    <View>
      <TensorCamera
        // Standard Camera props
        style={styles.camera}
        type={Camera.Constants.Type.front}
        // Tensor related props
        cameraTextureHeight={textureDims.height}
        cameraTextureWidth={textureDims.width}
        resizeHeight={200}
        resizeWidth={152}
        resizeDepth={3}
        onReady={handleCameraStream}
        autorender={true}
      />
    </View>
  );
};

undefined is not an object (evaluating '_expo.Permission.askAsync')

I don't know exactly what the problem is, but when I click the button to choose an image, that error fires in the console.
Here's my code:
_checkPermissions = async () => {
  try {
    const { status } = await Permission.askAsync(Permission.CAMERA);
    this.setState({ camera: status });
    const { statusRoll } = await Permission.askAsync(Permission.CAMERA_ROLL);
    this.setState({ cameraRoll: statusRoll });
  } catch (err) {
    console.log(err);
  }
};

findNewImage = async () => {
  try {
    this._checkPermissions();
    let result = await ImagePicker.launchImageLibraryAsync({
      mediaTypes: "Images",
      allowsEditing: true,
      quality: 1
    });
    if (!result.cancelled) {
      this.setState({
        image: result.uri
      });
    } else {
      console.log("cancel");
    }
  } catch (err) {
    // console.log(err);
  }
};
For me, what solved it was importing Permissions and ImagePicker like this:
import * as Permissions from 'expo-permissions';
import * as ImagePicker from 'expo-image-picker';
instead of this:
import Permissions from 'expo-permissions';
import ImagePicker from 'expo-image-picker';
And that's basically because there is no default export.
It is getAsync(), not askAsync()
https://docs.expo.io/versions/latest/sdk/permissions/
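A minimal sketch of that call, assuming the namespace import from the answer above:

import * as Permissions from 'expo-permissions';

// Inside an async function:
// getAsync() checks the current permission status instead of prompting.
const { status } = await Permissions.getAsync(Permissions.CAMERA);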
I know I'm a little late to the party, but I feel it's important to show a way that is currently working (as of 2022), and also askAsync is deprecated...
Getting image from (native) CAMERA
TL;DR: Even though we want "camera", we will actually use expo-image-picker FOR THE CAMERA (yes, you read right!)
I REPEAT: DO NOT USE expo-camera FOR THE CAMERA!
REMEMBER: USE ImagePickerExpo.requestCameraPermissionsAsync() AND ImagePickerExpo.launchCameraAsync(), NOT Camera...!
1. Install it first: expo install expo-image-picker
2. Import everything from it under one alias. I like to use ImagePickerExpo, because ImagePicker by itself is confusing since it can mean more libraries. Also import everything else needed for this code - you can replace Button with any other button/pressable that supports onPress (to use react-native-elements, you need to install it with yarn add react-native-elements).
3. Create the displaying component.
4. Create a state & setter to save the current image source.
5. Create a function that requests the permissions and opens the camera.
6. Return the component with a button binding onPress to the function from 5, and an Image that is displayed from the state from 4, but only when available.
Working & tested (so far on Android in Expo Go) code:
import React, { useState } from 'react';
import { View, Image, Alert, StyleSheet } from 'react-native';
import { Button } from 'react-native-elements';
import * as ImagePickerExpo from 'expo-image-picker';

const MyCameraComponent = () => {
  const [selectedImage, setSelectedImage] = useState(null);

  const openCameraWithPermission = async () => {
    let permissionResult = await ImagePickerExpo.requestCameraPermissionsAsync();
    if (permissionResult.granted === false) {
      Alert.alert("For this to work app needs camera roll permissions...");
      return;
    }
    let cameraResult = await ImagePickerExpo.launchCameraAsync({
      // ...
    });
    console.log(cameraResult);
    if (cameraResult.cancelled === true) {
      return;
    }
    setSelectedImage({ localUri: cameraResult.uri });
  };

  return (
    <View>
      <Button title='Take a photo' onPress={openCameraWithPermission}></Button>
      {(selectedImage !== null) && <Image
        source={{ uri: selectedImage.localUri }}
        style={styles.thumbnail}
      />}
    </View>
  );
}

const styles = StyleSheet.create({
  thumbnail: {
    width: 300,
    height: 300,
    resizeMode: "contain"
  }
});

export default MyCameraComponent;
Note that I had to style the Image for it to display; it didn't show up for me without proper styling, which I find misleading, but I guess that's the React Native way...
BTW: this also works in the Android emulator (besides Expo Go on a real Android device).
It also works on Snack on desktop, but only when you choose Android (or Web) - https://snack.expo.dev/#jave.web/expo-camera-from-expo-image-picker
Getting image from (native) gallery (not camera)
In case you're wondering how to do the same for the gallery, the code is basically the same; you just need a different callback function for the button, one that uses requestMediaLibraryPermissionsAsync / launchImageLibraryAsync instead of the camera ones.
let openImagePickerAsync = async () => {
  let permissionResult = await ImagePickerExpo.requestMediaLibraryPermissionsAsync();
  if (permissionResult.granted === false) {
    Alert.alert("For this to work app needs media library/gallery permissions...");
    return;
  }
  let pickerResult = await ImagePickerExpo.launchImageLibraryAsync({
    presentationStyle: 0, // without this iOS was crashing
  });
  console.log(pickerResult);
  if (pickerResult.cancelled === true) {
    return;
  }
  setSelectedImage({ localUri: pickerResult.uri });
}