Error using DeviceOrientationCamera in Babylon React Native - react-native

I'm trying to set up a small project for Android using Babylon React Native and use the device orientation as the camera input. When I try to use a DeviceOrientationCamera, the library throws the error below.
[TypeError: undefined is not an object (evaluating 'window.screen.orientation')]
It seems to be coming from this line.
When I use an ArcRotateCamera or a FreeCamera instead, the scene loads fine. The code I am using is reproduced below:
import React, {useState, useEffect} from 'react';
import {SafeAreaView, useWindowDimensions, View} from 'react-native';
import {EngineView, useEngine} from '@babylonjs/react-native';
import {Scene, SceneLoader, DeviceOrientationCamera, Vector3, FreeCamera} from '@babylonjs/core';
import '@babylonjs/loaders/glTF';

const EngineScreen = (props) => {
  const engine = useEngine();
  const [camera, setCamera] = useState();

  useEffect(() => {
    if (engine) {
      const scene = new Scene(engine);
      const url =
        'https://github.com/KhronosGroup/glTF-Sample-Models/blob/master/2.0/Duck/glTF-Binary/Duck.glb?raw=true';
      SceneLoader.Append("", url, scene, (sc) => {
        let cam;
        try {
          // When FreeCamera is used, the scene renders fine
          // cam = new FreeCamera('FreeCamera', new Vector3(0, 0, 0), sc);
          // When DeviceOrientationCamera is used, it throws an error
          cam = new DeviceOrientationCamera('DeviceOrientationCamera', new Vector3(0, 0, 0), sc);
        } catch (err) {
          console.log(err);
          throw err;
        }
        const canvas = engine.getRenderingCanvas();
        cam.attachControl(canvas, true);
        setCamera(sc.activeCamera);
      });
    }
  }, [engine]);

  return (
    <>
      <View style={props.style}>
        <View style={{flex: 1}}>
          <EngineView camera={camera} displayFrameRate={true} />
        </View>
      </View>
    </>
  );
};

const App = () => {
  const {width, height} = useWindowDimensions();
  return (
    <>
      <SafeAreaView
        style={{
          flex: 1,
          backgroundColor: 'red',
          height,
          width,
        }}>
        <EngineScreen style={{flex: 1}} />
      </SafeAreaView>
    </>
  );
};

export default App;
How can I fix this issue? On a broader level, I want to allow the user to 'look around' (rotate) and 'walk around' (translate) the scene by moving their mobile device. How can I achieve that?

Apparently, only touch input is currently supported in Babylon Native. More details are in the discussion below:
https://forum.babylonjs.com/t/error-using-deviceorientationcamera-in-babylon-react-native/31454/2?u=ketan_bhokray
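Until device orientation input is supported, one workaround is to read the device sensors yourself and rotate an ordinary FreeCamera from them. The following is a minimal, untested sketch that assumes expo-sensors is installed (any sensor library exposing rotation angles would do); the mapping of alpha/beta/gamma to the camera's rotation axes is an assumption and will likely need tuning for your device and screen orientation.

// Hypothetical workaround: drive a FreeCamera from device sensors.
// Assumes expo-sensors is installed; the axis mapping is a guess.
import {DeviceMotion} from 'expo-sensors';
import {FreeCamera, Vector3} from '@babylonjs/core';

const attachDeviceRotationCamera = (scene) => {
  const cam = new FreeCamera('SensorCamera', new Vector3(0, 0, -5), scene);
  DeviceMotion.setUpdateInterval(33); // roughly 30 sensor updates per second
  const subscription = DeviceMotion.addListener(({rotation}) => {
    if (!rotation) {
      return;
    }
    // rotation.alpha/beta/gamma are Euler angles in radians (assumed mapping).
    cam.rotation.y = rotation.alpha;
    cam.rotation.x = rotation.beta;
    cam.rotation.z = rotation.gamma;
  });
  return {camera: cam, detach: () => subscription.remove()};
};

You would call attachDeviceRotationCamera(sc) inside the SceneLoader callback instead of constructing the DeviceOrientationCamera, and call detach() when the component unmounts. 'Walking around' (translation) is harder: integrating accelerometer data drifts quickly, so true positional tracking generally needs an AR/XR solution rather than raw sensors.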

Related

How can I draw an image from the media library to canvas using react-native-canvas?

I have set up a basic Expo project and want to allow the user to select an image from the media library and have it drawn on a canvas (so I can do further manipulation of the image). Currently I'm getting the image URI back from the media library fine. I then create a canvas image object, set the src to the URI, and draw the image to the canvas. However, no image is visible on the canvas.
import { useState, useRef, useEffect } from "react";
import { StatusBar } from "expo-status-bar";
import { StyleSheet, View } from "react-native";
import * as ImagePicker from "expo-image-picker";
import Canvas, { Image as CanvasImage } from "react-native-canvas";
import Button from "./components/Button";

export default function App() {
  const canvasRef = useRef();
  const [selectedImageUri, setSelectedImageUri] = useState(null);

  const pickImageAsync = async () => {
    const result = await ImagePicker.launchImageLibraryAsync({
      allowsEditing: true,
      quality: 1,
    });
    if (!result.canceled) {
      setSelectedImageUri(result.assets[0].uri);
    }
  };

  useEffect(() => {
    const image = new CanvasImage(canvasRef.current);
    image.addEventListener("load", () => {
      canvasRef.current.getContext("2d").drawImage(image, 0, 0);
    });
    image.src = selectedImageUri;
  }, [selectedImageUri]);

  return (
    <View style={styles.container}>
      <Canvas
        ref={canvasRef}
        style={{
          width: "100%",
        }}
      />
      <View style={styles.footerContainer}>
        <Button
          theme="primary"
          label="Choose a photo"
          onPress={pickImageAsync}
        />
      </View>
      <StatusBar style="auto" />
    </View>
  );
}
I have placed a log in the 'load' event listener for the image and can see that it is not being called. Additionally, I created an 'error' event listener for the image and saw that the following error was being logged:
[{"type":"error","target":{"__ref__":"ukp70egumdk"}}]
I wondered if it might be a permissions issue, although I'm testing on the Expo Go Android app, which I believe doesn't need permissions to be set.
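There is no accepted answer reproduced here, but one thing worth trying (an assumption on my part, not a confirmed fix) is that react-native-canvas images may not load local file:// URIs directly, so converting the picked image to a base64 data URL first can help. The sketch below assumes expo-file-system is available and that the picked image is a JPEG:

// Hypothetical fix: read the picked file and hand the canvas a data URL.
// Assumes expo-file-system; adjust the MIME type to match the picked image.
import * as FileSystem from "expo-file-system";
import { Image as CanvasImage } from "react-native-canvas";

const drawUriOnCanvas = async (canvas, uri) => {
  if (!canvas || !uri) {
    return; // guards the initial render, when no image has been picked yet
  }
  const base64 = await FileSystem.readAsStringAsync(uri, {
    encoding: FileSystem.EncodingType.Base64,
  });
  const image = new CanvasImage(canvas);
  image.addEventListener("load", () => {
    canvas.getContext("2d").drawImage(image, 0, 0);
  });
  image.src = `data:image/jpeg;base64,${base64}`;
};

Calling drawUriOnCanvas(canvasRef.current, selectedImageUri) from the effect instead of assigning the raw URI to image.src also sidesteps the first render, where selectedImageUri is still null.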

Errors with @react-native-voice/voice in Expo

This is my code:
import { StatusBar } from 'expo-status-bar';
import { StyleSheet, Text, Button, View } from 'react-native';
import { useEffect, useState } from 'react';
import Voice from '@react-native-voice/voice';

export default function App() {
  let [started, setStarted] = useState(false);
  let [results, setResults] = useState([]);

  useEffect(() => {
    Voice.onSpeechError = onSpeechError;
    Voice.onSpeechResults = onSpeechResults;
    return () => {
      Voice.destroy().then(Voice.removeAllListeners);
    }
  }, []);

  const startSpeechToText = async () => {
    await Voice.start();
    setStarted(true);
  };

  const stopSpeechToText = async () => {
    await Voice.stop();
    setStarted(false);
  };

  const onSpeechResults = (result) => {
    setResults(result.value);
  };

  const onSpeechError = (error) => {
    console.log(error);
  };

  return (
    <View style={styles.container}>
      {!started ? <Button title='Start Speech to Text' onPress={startSpeechToText} /> : undefined}
      {started ? <Button title='Stop Speech to Text' onPress={stopSpeechToText} /> : undefined}
      {results.map((result, index) => <Text key={index}>{result}</Text>)}
      <StatusBar style="auto" />
    </View>
  );
}

const styles = StyleSheet.create({
  container: {
    flex: 1,
    backgroundColor: '#fff',
    alignItems: 'center',
    justifyContent: 'center',
  },
});
When I run it, I keep getting this error:
[Unhandled promise rejection: TypeError: null is not an object (evaluating 'Voice.startSpeech')]
I tried asking ChatGPT, but it couldn't answer it.
The package and import are correct, so I don't know what the error is or how I can fix it.
Check if it works on another version of Android. I launched some emulators, and it seems that below Android 12 there are problems with permissions.
On Android 12 and above, my code with react-native-voice works well with expo run:android, but I can't find out why, once built with expo build:android, I still have problems with mic permissions.
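For what it's worth, the original error (null is not an object (evaluating 'Voice.startSpeech')) usually means the native module isn't present at all; @react-native-voice/voice is not bundled in Expo Go, so a development build (expo run:android) is required. On the permission side, below is a hedged sketch of an Expo config using the library's config plugin; the plugin options are taken from the library's README and may differ between versions, so treat them as assumptions:

// Sketch of an app.config.js for @react-native-voice/voice (options assumed).
export default {
  expo: {
    name: 'voice-demo', // hypothetical app name
    slug: 'voice-demo',
    android: {
      // Ensure the microphone permission ends up in the Android manifest.
      permissions: ['android.permission.RECORD_AUDIO'],
    },
    plugins: [
      [
        '@react-native-voice/voice',
        {
          microphonePermission: 'Allow $(PRODUCT_NAME) to access the microphone',
          speechRecognitionPermission: 'Allow $(PRODUCT_NAME) to recognize speech',
        },
      ],
    ],
  },
};

After changing the config, the app has to be rebuilt; the plugin has no effect inside Expo Go.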

React Native app crashes when tf.ready() is called

I followed all the instructions to install @tensorflow/tfjs-react-native given at
https://www.npmjs.com/package/@tensorflow/tfjs-react-native/v/0.3.0
This is my app.js file:
import React, { useState, useEffect } from 'react';
import * as tf from '@tensorflow/tfjs';
import '@tensorflow/tfjs-react-native';
import {
  SafeAreaView,
  StatusBar,
  StyleSheet,
  Text,
  View,
} from 'react-native';

export default () => {
  const [ready, setReady] = useState(false)

  useEffect(() => {
    const load = async () => {
      await tf.ready()
      setReady(true)
    }
    load()
  })

  return (
    <SafeAreaView style={{ backgroundColor: '#fff', flex: 1 }}>
      <StatusBar barStyle={'dark-content'} />
      <View>
        <Text>hello</Text>
      </View>
    </SafeAreaView>
  );
};

const styles = StyleSheet.create({
});
The app crashes when tf.ready() is called; no error is logged in the console.
If I comment out tf.ready(), everything works fine. Am I doing something wrong?
This is my package.json file:
[image of package.json file]
How do I test whether this package is installed correctly?
Any help from your side will be appreciated.
I had this issue when running my app with Expo on an Android device. What solved it for me was setting the backend with:
await tf.setBackend('cpu');
before
tf.ready();
It might solve your problem as well.
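For clarity, this is what the question's effect looks like with that workaround applied; it is the same code plus the one extra line, and it assumes tf is the @tensorflow/tfjs import from above:

// The original effect with the suggested workaround applied:
// force the CPU backend before waiting for TensorFlow.js to be ready.
useEffect(() => {
  const load = async () => {
    await tf.setBackend('cpu'); // workaround from the answer above
    await tf.ready();
    setReady(true);
  };
  load();
}, []); // run once on mount (the original effect had no dependency array)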

stopRecorder() is not working - react-native-audio-recorder-player

I'm trying to record audio in a React Native app using the package react-native-audio-recorder-player. Audio recording starts successfully but keeps recording even after calling stopRecorder().
I tried a number of solutions from GitHub, but nothing helped. Here is my code:
import React from 'react';
import { View, TouchableOpacity, Text} from 'react-native';
import AudioRecorderPlayer from 'react-native-audio-recorder-player';

export const Recorder = () => {
  const audioRecorderPlayer = new AudioRecorderPlayer();

  const onStartRecord = async () => {
    await audioRecorderPlayer.startRecorder();
    audioRecorderPlayer.addRecordBackListener(e => {
      console.log('Recording . . . ', e.current_position);
      return;
    });
  };

  const onStopRecord = async () => {
    const audio = await audioRecorderPlayer.stopRecorder();
    audioRecorderPlayer.removeRecordBackListener();
  };

  return (
    <View style={{flex: 1, justifyContent: 'center', alignItems: 'space-between'}}>
      <TouchableOpacity onPress={onStartRecord}>
        <Text>Start</Text>
      </TouchableOpacity>
      <TouchableOpacity onPress={onStopRecord}>
        <Text>Stop</Text>
      </TouchableOpacity>
    </View>
  );
};
Even after pressing Stop, the console keeps logging Recording . . .; the following is my console output.
For this to work, you need to add const audioRecorderPlayer = new AudioRecorderPlayer(); outside the component, so the component will look like this:
import React from 'react';
import { View, TouchableOpacity, Text} from 'react-native';
import AudioRecorderPlayer from 'react-native-audio-recorder-player';

const audioRecorderPlayer = new AudioRecorderPlayer();

export const Recorder = () => {
  const onStartRecord = async () => {
    await audioRecorderPlayer.startRecorder();
    audioRecorderPlayer.addRecordBackListener(e => {
      console.log('Recording . . . ', e.current_position);
      return;
    });
  };

  const onStopRecord = async () => {
    const audio = await audioRecorderPlayer.stopRecorder();
    audioRecorderPlayer.removeRecordBackListener();
  };

  return (
    <View style={{flex: 1, justifyContent: 'center', alignItems: 'space-between'}}>
      <TouchableOpacity onPress={onStartRecord}>
        <Text>Start</Text>
      </TouchableOpacity>
      <TouchableOpacity onPress={onStopRecord}>
        <Text>Stop</Text>
      </TouchableOpacity>
    </View>
  );
};
After this change, my stopRecorder() is working like a charm.
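The reason this works is that the original component constructed a new AudioRecorderPlayer on every render, so onStopRecord could end up calling stopRecorder() on a different instance than the one that started recording. If you prefer to keep the instance inside the component, a useRef variant like the sketch below (my alternative, not the answerer's code) should behave the same way:

// Alternative sketch: keep a single AudioRecorderPlayer per component
// in a ref so it survives re-renders. Not part of the original answer.
import React, { useRef } from 'react';
import { View, TouchableOpacity, Text } from 'react-native';
import AudioRecorderPlayer from 'react-native-audio-recorder-player';

export const Recorder = () => {
  const recorderRef = useRef(new AudioRecorderPlayer()); // created once, reused

  const onStartRecord = async () => {
    await recorderRef.current.startRecorder();
    recorderRef.current.addRecordBackListener(e => {
      console.log('Recording . . . ', e.current_position);
    });
  };

  const onStopRecord = async () => {
    await recorderRef.current.stopRecorder();
    recorderRef.current.removeRecordBackListener();
  };

  return (
    <View style={{flex: 1, justifyContent: 'center'}}>
      <TouchableOpacity onPress={onStartRecord}>
        <Text>Start</Text>
      </TouchableOpacity>
      <TouchableOpacity onPress={onStopRecord}>
        <Text>Stop</Text>
      </TouchableOpacity>
    </View>
  );
};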

Render JSX in React Native

Let's say we have JSX saved in a variable; can we render it in React Native?
import {StyleSheet} from 'react-native';

const content = `<View style={styles.container}>
  <Text>TESTING</Text>
</View>`;

const App = () => {
  return {content};
};

const styles = StyleSheet.create({
  container: {
    flex: 1,
    backgroundColor: 'blue',
  },
});

export default App;
I get the above error when I run that code.
How can I implement this? Any example would be great.
Don't try to convert it into a string. You can just use it like this:
const content = <View style={styles.container}>
  <Text>TESTING</Text>
</View>;

const App = () => {
  return content;
};
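Putting the answer together with the imports and styles from the question, a complete version (my assembly, not part of the original answer) would look like this; note that styles must be declared before content, because content references it when the module is evaluated:

// Complete example assembled from the question and the answer above.
import React from 'react';
import {StyleSheet, Text, View} from 'react-native';

// Declared first so that `content` can reference it at module evaluation time.
const styles = StyleSheet.create({
  container: {
    flex: 1,
    backgroundColor: 'blue',
  },
});

// JSX stored in a variable: an element, not a string.
const content = (
  <View style={styles.container}>
    <Text>TESTING</Text>
  </View>
);

const App = () => {
  return content;
};

export default App;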