Determining Camera Texture - react-native

I've been stuck for a while trying to figure out an issue on Android devices. The sample code from the TensorFlow.js library says that the resolution of the camera texture has to be determined empirically. With iPhones it's been relatively consistent across versions (only changing for version 6 and above), but Android phones are so varied that I need a way to determine it automatically. When the resolution is incorrect, the keypoints used to refer to different parts of the body in the body-scan app end up in the wrong locations (e.g. the head is near the shoulder). Does anyone have tips for finding the resolution? A lot of the resources just treat these values as magic numbers. I've also linked someone with a similar issue who has had no response.
import React from 'react';
import { Platform, View } from 'react-native';
import { Camera } from 'expo-camera';
import { cameraWithTensors } from '@tensorflow/tfjs-react-native';

const TensorCamera = cameraWithTensors(Camera);

class MyComponent extends React.Component {
  handleCameraStream(images, updatePreview, gl) {
    const loop = async () => {
      const nextImageTensor = images.next().value;
      //
      // do something with tensor here
      //
      // if autorender is false you need the following two lines.
      // updatePreview();
      // gl.endFrameEXP();
      requestAnimationFrame(loop);
    };
    loop();
  }

  render() {
    // Currently expo does not support automatically determining the
    // resolution of the camera texture used. So it must be determined
    // empirically for the supported devices and preview size.
    let textureDims;
    if (Platform.OS === 'ios') {
      textureDims = {
        height: 1920,
        width: 1080,
      };
    } else {
      textureDims = {
        height: 1200,
        width: 1600,
      };
    }
    return (
      <View>
        <TensorCamera
          // Standard Camera props
          style={styles.camera} // styles.camera is defined elsewhere in the app
          type={Camera.Constants.Type.front}
          // Tensor related props
          cameraTextureHeight={textureDims.height}
          cameraTextureWidth={textureDims.width}
          resizeHeight={200}
          resizeWidth={152}
          resizeDepth={3}
          onReady={this.handleCameraStream}
          autorender={true}
        />
      </View>
    );
  }
}
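One avenue worth exploring on Android, rather than guessing, is to ask expo-camera what the device supports at runtime. This is only a sketch under an assumption: the preview texture is not documented to match the reported picture sizes, so treat the result as a starting point to verify per device. getSupportedRatiosAsync is Android-only, and both methods are instance methods on a Camera ref, so they must be called after onCameraReady has fired.

// Hypothetical helper: derive candidate texture dimensions from the
// largest picture size the device reports. Call it with a Camera ref
// once onCameraReady has fired. Android-only (getSupportedRatiosAsync).
async function guessTextureDims(camera) {
  const ratios = await camera.getSupportedRatiosAsync(); // e.g. ['4:3', '16:9']
  const ratio = ratios.includes('16:9') ? '16:9' : ratios[0];
  // Sizes come back as 'WIDTHxHEIGHT' strings, e.g. ['640x480', '1920x1080']
  const sizes = await camera.getAvailablePictureSizesAsync(ratio);
  const [width, height] = sizes
    .map((s) => s.split('x').map(Number))
    .sort((a, b) => b[0] * b[1] - a[0] * a[1])[0]; // pick the largest
  // Sizes are reported landscape-first; swap width/height for portrait.
  return { width, height };
}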

The textureDims value also varies depending on whether the user is in portrait or landscape, so the current hard-coded values for iOS devices wouldn't work if the user were to change the orientation of the device.
What you could use instead is useWindowDimensions, but this only works in functional components, not in class components.
import React from "react";
import { View, useWindowDimensions } from "react-native";

export default function MyCamera() {
  // useWindowDimensions re-renders the component whenever the window
  // rotates or resizes, so textureDims can be derived directly from it.
  // (Calling a state setter unconditionally during render would loop forever.)
  const { width: windowWidth, height: windowHeight } = useWindowDimensions();
  const textureDims = { height: windowHeight, width: windowWidth };

  return (
    <View>
      <TensorCamera
        // Standard Camera props
        style={styles.camera}
        type={Camera.Constants.Type.front}
        // Tensor related props
        cameraTextureHeight={textureDims.height}
        cameraTextureWidth={textureDims.width}
        resizeHeight={200}
        resizeWidth={152}
        resizeDepth={3}
        onReady={handleCameraStream} // as defined in the question
        autorender={true}
      />
    </View>
  );
}
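If you need to stay in a class component, the Dimensions API from react-native gives the same window size without hooks. A minimal sketch, assuming React Native 0.65+ where the change listener returns a subscription with remove():

import React from 'react';
import { Dimensions } from 'react-native';

class MyCameraScreen extends React.Component {
  state = { window: Dimensions.get('window') };

  componentDidMount() {
    // Re-read the window size whenever the device rotates or resizes.
    this.subscription = Dimensions.addEventListener('change', ({ window }) =>
      this.setState({ window }),
    );
  }

  componentWillUnmount() {
    this.subscription?.remove();
  }

  render() {
    const textureDims = {
      height: this.state.window.height,
      width: this.state.window.width,
    };
    // ...pass textureDims to TensorCamera exactly as in the example above
    return null; // placeholder; render the TensorCamera here
  }
}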

Related

Warning: React has detected a change in the order of Hooks

I have run into this error in my code and don't really know how to solve it. Can anyone help me?
I get the following error message:
ERROR Warning: React has detected a change in the order of Hooks called by ScreenA. This will lead to bugs and errors if not fixed. For more information, read the Rules of Hooks: https://reactjs.org/link/rules-of-hooks
import React, { useCallback, useEffect, useState } from "react";
import { View, Text, StyleSheet, Pressable } from "react-native";
import { useNavigation } from '@react-navigation/native';
import { DancingScript_400Regular } from "@expo-google-fonts/dancing-script";
import * as SplashScreen from 'expo-splash-screen';
import * as Font from 'expo-font';

export default function ScreenA({ route }) {
  const [appIsReady, setAppIsReady] = useState(false);

  useEffect(() => {
    async function prepare() {
      try {
        // Keep the splash screen visible while we fetch resources
        await SplashScreen.preventAutoHideAsync();
        // Pre-load fonts, make any API calls you need to do here
        await Font.loadAsync({ DancingScript_400Regular });
        // Artificially delay for two seconds to simulate a slow loading
        // experience. Please remove this if you copy and paste the code!
        await new Promise(resolve => setTimeout(resolve, 2000));
      } catch (e) {
        console.warn(e);
      } finally {
        // Tell the application to render
        setAppIsReady(true);
      }
    }
    prepare();
  }, []);

  const onLayoutRootView = useCallback(async () => {
    if (appIsReady) {
      // This tells the splash screen to hide immediately! If we call this after
      // `setAppIsReady`, then we may see a blank screen while the app is
      // loading its initial state and rendering its first pixels. So instead,
      // we hide the splash screen once we know the root view has already
      // performed layout.
      await SplashScreen.hideAsync();
    }
  }, [appIsReady]);

  if (!appIsReady) {
    return null;
  }

  const navigation = useNavigation();

  const onPressHandler = () => {
    // navigation.navigate('Screen_B', { itemName: 'Item from Screen A', itemID: 12 });
  }

  return (
    <View style={styles.body} onLayout={onLayoutRootView}>
      <Text style={styles.text}>
        Screen A
      </Text>
      <Pressable
        onPress={onPressHandler}
        style={({ pressed }) => ({ backgroundColor: pressed ? '#ddd' : '#0f0' })}
      >
        <Text style={styles.text}>
          Go To Screen B
        </Text>
      </Pressable>
      <Text style={styles.text}>{route.params?.Message}</Text>
    </View>
  )
}

const styles = StyleSheet.create({
  body: {
    flex: 1,
    justifyContent: 'center',
    alignItems: 'center',
  },
  text: {
    fontSize: 40,
    margin: 10,
    fontFamily: 'DancingScript_400Regular'
  }
})
I have read the rules of hooks: https://reactjs.org/docs/hooks-rules.html
The output is correct, but I want to fix this error before I add more to the app.
You need to move the useNavigation call before any early returns.
Always call Hooks at the top level of your React function, before any early returns.
The key is that you need to call all the hooks in the exact same order on every render, which means you can't use hooks inside conditionals or loops such as:
if (customValue) useHook();
// or
for (let i = 0; i < customValue; i++) useHook();
// or
if (customValue) return;
useHook();
So moving const navigation = useNavigation(); before if (!appIsReady) { return null; } should solve your problem:
export default function ScreenA({ route }) {
  const [appIsReady, setAppIsReady] = useState(false);
  const navigation = useNavigation();
  // ...
}

In React Native, creating a canvas and calling getContext creates an error

I am trying to use a function called readpixels from this GitHub page, and in this function one needs to get the context of a canvas. Since I am using React Native, I cannot use expressions like new Image() or document.createElement('canvas'), so I am trying to do the equivalent using React Native components.
Here is a minimal version of the code:
import React, { useState, useEffect, useRef } from 'react';
import { Alert, Button, Image, View } from 'react-native';
import * as ImagePicker from 'expo-image-picker';
import Canvas from 'react-native-canvas';

export function Canva() {
  const ref = useRef(null);

  useEffect(() => {
    if (ref.current) {
      const ctx = ref.current.getContext('2d');
      if (ctx) {
        Alert.alert('Canvas is ready');
      }
    }
  }, [ref]);

  return (
    <Canvas ref={ref} />
  );
}
function readpixels(url, limit = 0) {
  const img = React.createElement(
    "img",
    {
      src: url,
    },
  )
  const canvas = Canva()
  const ctx = canvas.getContext('2d')
  return 1
}

export default function ImagePickerExample() {
  const [image, setImage] = useState(null);

  const pickImage = async () => {
    let result = await ImagePicker.launchImageLibraryAsync({
      mediaTypes: ImagePicker.MediaTypeOptions.All,
      quality: 1,
    });
    readpixels(result.uri)
    if (!result.cancelled) {
      setImage({ uri: result.uri, fileSize: result.fileSize });
    }
  };

  return (
    <View style={{ flex: 1, backgroundColor: "white", marginTop: 50 }} >
      <Button title="Pick image from camera roll" onPress={pickImage} />
      {image && <Image source={{ uri: image.uri }} style={{ width: 200, height: 200 }} />}
    </View>
  );
}
And here is the error that I get:
Error: Invalid hook call. Hooks can only be called inside of the body of a function component. This could happen for one of the following reasons:
1. You might have mismatching versions of React and the renderer (such as React DOM)
2. You might be breaking the Rules of Hooks
3. You might have more than one copy of React in the same app
I have checked the three suggestions, but none of them solved the issue.
Thank you.
P.S.: in order to reproduce the code, you would need to install the react-native-canvas package.
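For what it's worth, the invalid hook call most likely comes from const canvas = Canva() inside readpixels: Canva is a component that calls useRef and useEffect, and invoking it as a plain function runs those hooks outside of React's render. A minimal sketch of the usual react-native-canvas pattern instead; the PixelReader name and ref-callback structure are my own, and the exact shape of getImageData's promise should be verified against the package's docs:

import React, { useCallback } from 'react';
import Canvas, { Image as CanvasImage } from 'react-native-canvas';

export function PixelReader({ url }) {
  // Ref callback runs once the native canvas exists; no DOM APIs needed.
  const handleCanvas = useCallback((canvas) => {
    if (!canvas) return;
    const ctx = canvas.getContext('2d');
    const img = new CanvasImage(canvas); // stands in for `new Image()`
    img.src = url;
    img.addEventListener('load', async () => {
      ctx.drawImage(img, 0, 0);
      // In react-native-canvas, getImageData returns a Promise.
      const { data } = await ctx.getImageData(0, 0, canvas.width, canvas.height);
      // ...run the readpixels logic on `data` here
    });
  }, [url]);

  return <Canvas ref={handleCanvas} />;
}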

How to access current video frame in Expo Camera?

I'm trying to access the current video frame in React Native. I was able to do the same with 'react-webcam' while using React.js with the below code.
import React, { useRef } from "react";
import Webcam from "react-webcam";

function MyFunc() {
  const cameraRef = useRef(null);

  const processFrame = async () => {
    if (cameraRef.current.video) {
      const img = cameraRef.current.video;
      // Code to process.
    };
    setTimeout(() => processFrame(), 500)
  };

  React.useEffect(() => {
    processFrame();
  }, [])

  return (
    <Webcam
      align="center"
      audio={false}
      mirrored={false}
      id="img"
      ref={cameraRef}
      style={{ display: "none" }}
    />
  );
};
My current code in React Native using Expo Camera is:
import React, { useState, useEffect, useRef } from "react";
import { Camera } from "expo-camera";

export function MyFunc() {
  const cameraRef = useRef(null);

  const processFrame = async () => {
    const img = cameraRef.current.video;
    console.log(img); // This prints undefined
    setTimeout(() => processFrame(), 500)
  };

  useEffect(() => {
    processFrame();
  }, []);

  return (
    <Camera
      ref={cameraRef}
      type={Camera.Constants.Type.front}
      style={{ opacity: 0, width: 1, height: 1 }}
    />
  );
};
Please let me know how I can access the current video frame without using takePictureAsync, if possible.
You need a real-time camera stream, which is what the TensorFlow camera (built from expo-camera) provides:
cameraWithTensors(CameraComponent)
A higher-order component (HOC) that augments the Expo.Camera component with the ability to yield tensors representing the camera stream.
Because the camera data will be consumed in the process, the original camera component will not render any content. This component provides options that can be used to render the camera preview.
Notably, the component allows on-the-fly resizing of the camera image to smaller dimensions, which speeds up data transfer between the native and JavaScript threads immensely.
For more info: Tensorflow React Native API
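A minimal functional-component sketch of that setup, reusing the TensorCamera props from the example at the top of this page; the FrameProcessor name is my own, and the texture and resize values are the same per-device assumptions as above:

import React from 'react';
import { Camera } from 'expo-camera';
import { cameraWithTensors } from '@tensorflow/tfjs-react-native';

const TensorCamera = cameraWithTensors(Camera);

export function FrameProcessor() {
  const handleCameraStream = (images) => {
    const loop = async () => {
      // Each call yields the current camera frame as a tensor.
      const imageTensor = images.next().value;
      // ...process the frame here, then free it
      imageTensor?.dispose();
      requestAnimationFrame(loop);
    };
    loop();
  };

  return (
    <TensorCamera
      style={{ width: 152, height: 200 }}
      type={Camera.Constants.Type.front}
      cameraTextureHeight={1200}
      cameraTextureWidth={1600}
      resizeHeight={200}
      resizeWidth={152}
      resizeDepth={3}
      onReady={handleCameraStream}
      autorender={true}
    />
  );
}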

Getting a spinning blank screen and Error: Camera is not ready yet. Wait for 'onCameraReady' callback using expo Camera component

I'm new to web development and I'm trying to build an image recognition app using expo for testing. My code for the camera is below. On screen load, I get a black screen (not the camera) with my "capture" button. When I click on capture, I get the error:
Unhandled promise rejection: Error: Camera is not ready yet. Wait for 'onCameraReady' callback.
My code is below
import React from 'react';
import { Dimensions, Alert, StyleSheet, ActivityIndicator } from 'react-native';
// import { RNCamera } from 'react-native-camera';
import CaptureButton from './CaptureButton.js'
import { Camera } from 'expo-camera';

export default class AppCamera extends React.Component {
  constructor(props) {
    super(props);
    this.state = {
      identifiedAs: '',
      loading: false
    }
  }

  takePicture = async function () {
    if (this.camera) {
      // Pause the camera's preview
      this.camera.pausePreview();
      // Set the activity indicator
      this.setState((previousState, props) => ({
        loading: true
      }));
      // Set options
      const options = {
        base64: true
      };
      // Get the base64 version of the image
      const data = await this.camera.takePictureAsync(options)
      // Get the identified image
      this.identifyImage(data.base64);
    }
  }

  identifyImage(imageData) {
    // Initialise the Clarifai api
    const Clarifai = require('clarifai');
    const app = new Clarifai.App({
      apiKey: '8d5ecc284af54894a38ba9bd7e95681b'
    });
    // Identify the image
    app.models.predict(Clarifai.GENERAL_MODEL, { base64: imageData })
      .then((response) => this.displayAnswer(response.outputs[0].data.concepts[0].name)
        .catch((err) => alert(err))
      );
  }

  displayAnswer(identifiedImage) {
    // Dismiss the activity indicator
    this.setState((prevState, props) => ({
      identifiedAs: identifiedImage,
      loading: false
    }));
    // Show an alert with the answer on
    Alert.alert(
      this.state.identifiedAs,
      '',
      { cancelable: false }
    )
    // Resume the preview
    this.camera.resumePreview();
  }

  render() {
    const styles = StyleSheet.create({
      preview: {
        flex: 1,
        justifyContent: 'flex-end',
        alignItems: 'center',
        height: Dimensions.get('window').height,
        width: Dimensions.get('window').width,
      },
      loadingIndicator: {
        flex: 1,
        alignItems: 'center',
        justifyContent: 'center',
      }
    });
    return (
      <Camera ref={ref => { this.camera = ref; }} style={styles.preview}>
        <ActivityIndicator size="large" style={styles.loadingIndicator} color="#fff" animating={this.state.loading} />
        <CaptureButton buttonDisabled={this.state.loading} onClick={this.takePicture.bind(this)} />
      </Camera>
    )
  }
}
Could someone kindly point me in the right direction to fix this error?
https://docs.expo.dev/versions/latest/sdk/camera/#takepictureasyncoptions
Note: Make sure to wait for the onCameraReady callback before calling this method.
So you might resolve this by adding an onCameraReady prop to the Camera component, as that document describes.
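A minimal sketch of that fix, adapted to the component above (only the changed parts; the cameraReady state flag is my own addition). A black preview can also mean the camera permission was never granted, so requesting it with Camera.requestCameraPermissionsAsync() before rendering is worth checking too:

export default class AppCamera extends React.Component {
  state = { identifiedAs: '', loading: false, cameraReady: false };

  takePicture = async () => {
    // Guard: takePictureAsync must not run before the camera is ready.
    if (!this.camera || !this.state.cameraReady) return;
    const data = await this.camera.takePictureAsync({ base64: true });
    // ...identify the image as before
  };

  render() {
    return (
      <Camera
        ref={(ref) => { this.camera = ref; }}
        style={styles.preview} // styles as defined in the question
        onCameraReady={() => this.setState({ cameraReady: true })}
      >
        <CaptureButton
          buttonDisabled={this.state.loading || !this.state.cameraReady}
          onClick={this.takePicture}
        />
      </Camera>
    );
  }
}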
I'm facing an issue like this too, and it is not resolved yet... I hope my advice helps.

How to dynamically change React Native transform with state?

I'm building a custom view that will rotate its contents based on device orientation. This app has orientation locked to portrait and I just want to rotate a single view. It fetches the current device orientation, updates the state, then renders the new component with the updated style={{transform: [{rotate: 'xxxdeg'}]}}.
I'm using react-native-orientation-locker to detect orientation changes.
The view renders correctly rotated on the first render. For example, if the screen loads while the device is rotated, it will render the view rotated. But upon changing the orientation of the device or simulator, the view does not rotate. It stays locked at the rotate value it was initialized at.
It seems like updates to the transform rotate value do not change the rotation. I've verified that new rotate values are present during the render. I've verified that orientation changes are correctly updating the state. But the view is never rotated in the UI when orientation changes. It is as if React Native isn't picking up on changes to the rotate value during a render.
I would expect that updates to the rotate value would rotate the View accordingly but that does not seem to be the case. Is there another way to accomplish this or do I have a bug in this code?
Edit: Is it required for rotate to be an Animated value?
import React, {useState, useEffect} from 'react';
import {View} from 'react-native';
import Orientation from 'react-native-orientation-locker';

const RotateView = props => {
  const getRotation = newOrientation => {
    switch (newOrientation) {
      case 'LANDSCAPE-LEFT':
        return '90deg';
      case 'LANDSCAPE-RIGHT':
        return '-90deg';
      default:
        return '0deg';
    }
  };

  const [orientation, setOrientation] = useState(
    // set orientation to the initial device orientation
    Orientation.getInitialOrientation(),
  );
  const [rotate, setRotate] = useState(
    // set rotation to the initial rotation value (xxdeg)
    getRotation(Orientation.getInitialOrientation()),
  );

  useEffect(() => {
    // Set up listeners for device orientation changes
    Orientation.addDeviceOrientationListener(setOrientation);
    return () => Orientation.removeDeviceOrientationListener(setOrientation);
  }, []);

  useEffect(() => {
    // when orientation changes, update the rotation
    setRotate(getRotation(orientation));
  }, [orientation]);

  // render the view with the current rotation value
  return (
    <View style={{transform: [{rotate}]}}>
      {props.children}
    </View>
  );
};

export default RotateView;
I had this same problem, and solved it by using an Animated.View from react-native-reanimated. (Animated.View from the standard react-native package might also work, but I haven't checked.) I didn't need to use an Animated value; I still just used the actual value from the state, and it worked.
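A minimal sketch of that approach, assuming react-native-reanimated is installed; the plain rotate string from state is passed straight into the style, with no Animated.Value involved:

import React from 'react';
import Animated from 'react-native-reanimated';

const RotateView = ({ rotate, children }) => (
  // Animated.View re-applies transform updates that a plain View can miss.
  <Animated.View style={{ transform: [{ rotate }] }}>
    {children}
  </Animated.View>
);

export default RotateView;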
If you use Animated.Value + Animated.View directly from react-native, you'll be fine.
Had the same issue and solved it using an Animated.Value class field (in your case, since it's a functional component, you'd keep it in state via useState, plus a useEffect that sets the Animated.Value when props.rotation changes), and then passed that into the Animated.View as transform={[{ rotate: animatedRotationValue }]}.
Here's the class component form of this as a snippet:
interface Props {
  rotation: number;
}

class SomethingThatNeedsRotation extends React.PureComponent<Props> {
  rotation = new Animated.Value(0);
  rotationValue = this.rotation.interpolate({
    inputRange: [0, 2 * Math.PI],
    outputRange: ['0deg', '360deg'],
  });

  render() {
    this.rotation.setValue(this.props.rotation);
    const transform = [{ rotate: this.rotationValue }];
    return (
      <Animated.View style={{ transform }} />
    );
  }
}
Note that in my example I also have the interpolation there since my input is in radians and I wanted it to be in degrees.
Here is my completed component that handles rotation. It will rotate its children based on device orientation while the app is locked to portrait. I'm sure this could be cleaned up some but it works for my purposes.
import React, {useState, useEffect, useRef} from 'react';
import {Animated, Easing, View, StyleSheet} from 'react-native';
import {Orientation} from '../utility/constants';
import OrientationManager from '../utility/orientation';

const OrientedView = (props) => {
  const getRotation = useRef((newOrientation) => {
    switch (newOrientation) {
      case Orientation.LANDSCAPE_LEFT:
        return 90;
      case Orientation.LANDSCAPE_RIGHT:
        return -90;
      default:
        return 0;
    }
  });

  const {duration = 100, style} = props;
  const initialized = useRef(false);
  const [orientation, setOrientation] = useState();
  const [rotate, setRotate] = useState();
  const [containerStyle, setContainerStyle] = useState(styles.containerStyle);
  // Animation kept as a ref
  const rotationAnim = useRef();

  // listen for orientation changes and update state
  useEffect(() => {
    OrientationManager.getDeviceOrientation((initialOrientation) => {
      const initialRotation = getRotation.current(initialOrientation);
      // default the rotation based on initial orientation
      setRotate(initialRotation);
      rotationAnim.current = new Animated.Value(initialRotation);
      setContainerStyle([
        styles.containerStyle,
        {
          transform: [{rotate: `${initialRotation}deg`}],
        },
      ]);
      initialized.current = true;
      // set orientation and trigger the first render
      setOrientation(initialOrientation);
    });
    OrientationManager.addDeviceOrientationListener(setOrientation);
    return () =>
      OrientationManager.removeDeviceOrientationListener(setOrientation);
  }, []);

  useEffect(() => {
    if (initialized.current === true) {
      const rotation = getRotation.current(orientation);
      setRotate(
        rotationAnim.current.interpolate({
          inputRange: [-90, 0, 90],
          outputRange: ['-90deg', '0deg', '90deg'],
        }),
      );
      Animated.timing(rotationAnim.current, {
        toValue: rotation,
        duration: duration,
        easing: Easing.ease,
        useNativeDriver: true,
      }).start();
    }
  }, [duration, orientation]);

  // FIXME: This is causing unnecessary animation outside of the oriented view. Disabling removes the scale animation.
  // useEffect(() => {
  //   applyLayoutAnimation.current();
  // }, [orientation]);

  useEffect(() => {
    if (initialized.current === true) {
      setContainerStyle([
        styles.containerStyle,
        {
          transform: [{rotate}],
        },
      ]);
    }
  }, [rotate]);

  if (initialized.current === false) {
    return <View style={[containerStyle, style]} />;
  }

  return (
    <Animated.View style={[containerStyle, style]}>
      {props.children}
    </Animated.View>
  );
};

const styles = StyleSheet.create({
  containerStyle: {flex: 0, justifyContent: 'center', alignItems: 'center'},
});

export default OrientedView;
This is a bug, as the rotation is supposed to change when the value of rotate updates. A workaround is to set the View's key attribute to the rotate value as well.
For example:
return (
  <View
    key={rotate} // <~~~ fix!
    style={{transform: [{rotate}]}}
  >
    {props.children}
  </View>
)
I found this solution here.