undefined is not an object (evaluating '_expoThree.AR.TrackingConfiguration') - react-native

I am trying to build an AR app with React Native and Expo, and I followed this tutorial: https://blog.expo.dev/arkit-in-react-native-tutorial-the-basics-9f839539f0b9. Everything worked great until I got to the AR camera section. When I added this line of code:
arTrackingConfiguration={AR.TrackingConfiguration.World}
my app crashed and I got this error:
TypeError: undefined is not an object (evaluating '_expoThree.AR.TrackingConfiguration')
This is my code:
import ExpoTHREE, { THREE, AR as ThreeAR } from 'expo-three';
import { View as GraphicsView } from 'expo-graphics';

onContextCreate = async ({ gl, scale: pixelRatio, width, height }) => {
  // Insert 3D universe
  this.renderer = new ExpoTHREE.Renderer({
    gl,
    pixelRatio,
    width,
    height,
  });
  this.scene = new THREE.Scene();
  this.camera = new THREE.PerspectiveCamera(75, width / height, 0.1, 1000);

  const geometry = new THREE.BoxGeometry(0.1, 0.1, 0.1);
  const material = new THREE.MeshPhongMaterial({
    color: 0xff00ff,
  });
  this.cube = new THREE.Mesh(geometry, material);
  this.cube.position.z = -0.4;
  this.scene.add(this.cube);
  this.scene.add(new THREE.AmbientLight(0xffffff));
};

onRender = () => {
  this.renderer.render(this.scene, this.camera);
};

return (
  <GraphicsView
    style={{ flex: 1 }}
    onContextCreate={this.onContextCreate}
    onRender={this.onRender}
    isArEnabled
    arTrackingConfiguration={AR.TrackingConfiguration.World}
  />
);
}
I found similar errors posted elsewhere, but the solutions did not match my situation. My guess is that the AR export from expo-three is missing, or that the AR functions in the package are out of date, but I couldn't find a suitable fix. If someone has a solution to my problem, please reach out. Thanks, everyone :)
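One thing I noticed while writing this up: my import aliases AR as ThreeAR, so maybe the JSX needs to go through the alias instead of a bare AR (just a guess on my part, untested):

// untested guess: reference the aliased import from above
arTrackingConfiguration={ThreeAR.TrackingConfiguration.World}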

Related

React Native & Expo - onContextCreate function not called when application runs

I don't know why, and there is no error shown in debugger-ui. I only see a white screen on my iPhone with no errors. I also added a console.log inside the onContextCreate function and no message appears, which means onContextCreate is never triggered. Here is my code; any help is appreciated.
import { View as GraphicsView } from 'expo-graphics';
import ExpoTHREE, { THREE } from 'expo-three';
import React from 'react';

export default class App extends React.Component {
  UNSAFE_componentWillMount() {
    THREE.suppressExpoWarnings();
  }

  render() {
    // Create an `ExpoGraphics.View` covering the whole screen, tell it to call our
    // `onContextCreate` function once it's initialized.
    return (
      <GraphicsView
        style={{ backgroundColor: 'yellow' }}
        onContextCreate={this.onContextCreate}
        onRender={this.onRender}
      />
    );
  }

  // This is called by the `ExpoGraphics.View` once it's initialized
  onContextCreate = async ({
    gl,
    canvas,
    width,
    height,
    scale: pixelRatio,
  }) => {
    console.log('onContextCreate ran...');
    this.renderer = new ExpoTHREE.Renderer({ gl, pixelRatio, width, height });
    this.renderer.setClearColor(0xffffff);
    this.scene = new THREE.Scene();
    this.camera = new THREE.PerspectiveCamera(75, width / height, 0.1, 1000);
    this.camera.position.z = 5;

    const geometry = new THREE.BoxGeometry(1, 1, 1);
    const material = new THREE.MeshPhongMaterial({
      color: 0xff0000,
    });
    this.cube = new THREE.Mesh(geometry, material);
    this.scene.add(this.cube);

    this.scene.add(new THREE.AmbientLight(0x404040));
    const light = new THREE.DirectionalLight(0xffffff, 0.5);
    light.position.set(3, 3, 3);
    this.scene.add(light);
  };

  onRender = delta => {
    this.cube.rotation.x += 3.5 * delta;
    this.cube.rotation.y += 2 * delta;
    this.renderer.render(this.scene, this.camera);
  };
}
I realized that when I close the remote debugger in Expo, my code works. Why this happens, I don't know; it would be good if someone else could explain it, but it works once I close remote debugging in Expo...

Why is a variable's value not incrementing inside the canvas function?

There is an open GitHub issue for this, and an Expo Snack that reproduces it.
For some reason, variables do not increment inside the canvas function, while outside it they work just fine. Please have a look at my code:
function home({ navigation }) {
  const [counter, setCounter] = useState(330);

  useEffect(() => {
    const timeout = setTimeout(() => {
      setCounter(counter + 1);
    }, 1000);
    return () => {
      clearTimeout(timeout);
    };
  }, [counter]);

  console.log('outside ', counter);

  const _onGLContextCreate = (gl) => {
    var ctx = new Expo2DContext(gl);
    // setInterval(() => {
    //   console.log('set interval doesnt refresh too ', counter);
    // }, 1000);
    console.log('inside ', counter);

    let circle = {
      x: counter,
      y: 100,
      radius: 30,
      color: 'black'
    };
    let circle2 = {
      x: 400,
      y: 100,
      radius: 30,
      color: 'blue'
    };

    function drawCircle() {
      ctx.beginPath();
      ctx.arc(circle.x, circle.y, circle.radius, 0, Math.PI * 2);
      ctx.fillStyle = circle.color;
      ctx.fill();
      ctx.closePath();
    }

    function drawCircle2() {
      ctx.beginPath();
      ctx.arc(circle2.x, circle2.y, circle2.radius, 0, Math.PI * 2);
      ctx.fillStyle = circle2.color;
      ctx.fill();
    }

    function update() {
      drawCircle();
      drawCircle2();
    }

    function animate() {
      ctx.clearRect(0, 0, ctx.width, ctx.height);
      requestAnimationFrame(animate);
      update();
      ctx.flush();
    }

    animate();
    ctx.stroke();
    ctx.flush();
  };

  return (
    <GLView style={{ flex: 1 }} onContextCreate={_onGLContextCreate} />
  );
}

export { home };
Here is what the logs show:
outside 330
inside 330
outside 331
outside 332
outside 333
outside 334
outside 335
outside 336
outside 337
Does anybody know why the counter is read only once inside the canvas function, and what the solution could be to make it increment there the same way it does outside?
I don't know exactly what the cause is, but I found issues in the architecture, and when I fixed them it worked.
TL;DR: see the result here: https://snack.expo.dev/#dozsolti/expogl-ball
You don't need to rerender the component, because you only update the canvas, so useState can be swapped for useRef. Also, you probably meant to update the counter every second, so I changed the setTimeout to a setInterval. (The setTimeout only worked because it sat in a useEffect with counter as a dependency, so every update retriggered it, sort of like calling itself. The correct way is a useEffect that runs once when the component mounts and sets up a setInterval.)
After that, you need to swap counter for counter.current.
Keep in mind that _onGLContextCreate runs only once, so the circle and circle2 objects never change; that's why I moved the x update into the update function.
Besides that, everything looks fine. You could optimize the code a little, e.g. create a single drawCircle function that takes x and y as parameters, and so on.
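A minimal sketch of the pattern described above (illustrative, trimmed down from the full snack):

import { useEffect, useRef } from 'react';

function home({ navigation }) {
  // useRef + setInterval: the GL loop can read the latest value
  // without rerendering the component.
  const counter = useRef(330);

  useEffect(() => {
    const interval = setInterval(() => {
      counter.current += 1; // mutating a ref does not trigger a rerender
    }, 1000);
    return () => clearInterval(interval);
  }, []);

  // ...inside _onGLContextCreate and update(), read counter.current
}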

React Native FlatList's onViewableItemsChanged error

I am trying to respond to changes when the element in view changes.
I get the following error:
Invariant Violation: Changing onViewableItemsChanged on the fly is not supported
I have the following:
<Animated.FlatList
  viewabilityConfig={carouselViewabilityConfig}
  onViewableItemsChanged={onViewableItemsChanged}
  {/* ...other properties */}
/>

and my functions are:

const onViewableItemsChanged = ({
  viewableItems,
}: {
  viewableItems: Array<number>;
}) => {
  const insightsById = savedInsights.byId;
  console.log('byId:!', insightsById);
  if (!viewableItems.length) {
    return;
  }
  const visibleInsightId =
    insightsById[viewableItems[Math.max(viewableItems.length - 2, 0)].index];
  console.log(visibleInsightId);
  Analytics.logViewItem({ insight_id: visibleInsightId });
  // const visibleInsightIndex = insightsById.indexOf(visibleInsightId);
};

const carouselViewabilityConfig = {
  waitForInteraction: false,
  minimumViewTime: 100,
  viewAreaCoveragePercentThreshold: 50,
};
I saw a method using refs, but it didn't work; my function wouldn't run. The pattern I came across is sketched below.
Any help would be appreciated. Thanks.
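Roughly, that ref-based pattern looks like this (a sketch of the approach with a placeholder handler, not my exact attempt):

import { useRef } from 'react';

// Keep the config and callback identities stable across renders, since
// FlatList does not allow changing onViewableItemsChanged on the fly.
const viewabilityConfigRef = useRef({
  waitForInteraction: false,
  minimumViewTime: 100,
  viewAreaCoveragePercentThreshold: 50,
});
const onViewableItemsChangedRef = useRef(({ viewableItems }) => {
  // placeholder: handle the viewable items here
});

<Animated.FlatList
  viewabilityConfig={viewabilityConfigRef.current}
  onViewableItemsChanged={onViewableItemsChangedRef.current}
/>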

Save Sketched lines in LocalStorage

Sorry I'm posting this here.
I want to save the info of each line I draw on the canvas (the save action would be called in onChange), so I can retrieve this data and redraw it on the canvas in case the user changes screens or something similar.
I'm using expo-pixi to draw an image and sketch over it.
onChangeAsync = async (param) => {
  // here I want the on-change code: get the line information and store it
};

onLayout = async ({
  nativeEvent: {
    layout: { width, height },
  },
}) => {
  this.setState({
    layoutWidth: width,
    layoutHeight: height,
  });
  this.onReady();
};

onReady = async () => {
  const { layoutWidth, layoutHeight, points } = this.state;
  this.sketch.graphics = new PIXI.Graphics();
  if (this.sketch.stage) {
    if (layoutWidth && layoutHeight) {
      const background = await PIXI.Sprite.fromExpoAsync(this.props.image);
      background.width = layoutWidth * scaleR;
      background.height = layoutHeight * scaleR;
      this.sketch.stage.addChild(background);
      this.sketch.renderer._update();
    }
  }
};

// The Sketch component is pretty much as in the example that comes with the lib
<Sketch
  ref={ref => (this.sketch = ref)}
  style={styles.sketch}
  strokeColor={this.state.strokeColor}
  strokeWidth={this.state.strokeWidth}
  strokeAlpha={1}
  onChange={this.onChangeAsync}
  onReady={this.onReady}
/>
Does anyone have any clue? I'm kind of desperate; what I have in mind is sketched below.
Thanks.
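Something like serializing whatever per-line data I can get and persisting it, e.g. with AsyncStorage (points here is a placeholder for the line data, not a real expo-pixi property):

import { AsyncStorage } from 'react-native'; // or @react-native-async-storage/async-storage

// Placeholder sketch: persist serialized line data on every change,
// restore it when the screen mounts again.
saveLines = async (points) => {
  await AsyncStorage.setItem('sketch-lines', JSON.stringify(points));
};

restoreLines = async () => {
  const raw = await AsyncStorage.getItem('sketch-lines');
  return raw ? JSON.parse(raw) : [];
};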

Trying to load obj & mtl file with Three.js in React Native

Main objective: load animated models exported from Maya into a React Native app.
Exported files: obj, mtl & png file.
I have set up https://github.com/react-community/react-native-webgl in my React Native project and it is working properly.
Now, when I try to load the MTL file using the MTLLoader, I get the following error:
Can't find variable: document
Apparently, the MTLLoader calls TextureLoader, which internally calls some load function that references document. So what could be the solution to this?
Here are the two files that I am using:
three.js
const THREE = require("three");
global.THREE = THREE;

if (!window.addEventListener) window.addEventListener = () => {};

// require("three/examples/js/renderers/Projector");
require("three/examples/js/loaders/MTLLoader");
require("three/examples/js/loaders/OBJLoader");

export default THREE;
ThreeView.js
import React, { Component } from "react";
import { StyleSheet, View } from "react-native";
import { WebGLView } from "react-native-webgl";
import THREE from "./three";
import { image } from "src/res/image";

export default class ThreeView extends Component {
  requestId: *;

  componentWillUnmount() {
    cancelAnimationFrame(this.requestId);
  }

  onContextCreate = (gl: WebGLRenderingContext) => {
    const rngl = gl.getExtension("RN");
    const { drawingBufferWidth: width, drawingBufferHeight: height } = gl;
    const renderer = new THREE.WebGLRenderer({
      canvas: {
        width,
        height,
        style: {},
        addEventListener: () => {},
        removeEventListener: () => {},
        clientHeight: height
      },
      context: gl
    });
    renderer.setSize(width, height);
    renderer.setClearColor(0xffffff, 1);

    let camera, scene;
    let cube;

    function init() {
      camera = new THREE.PerspectiveCamera(75, width / height, 1, 1100);
      camera.position.y = 150;
      camera.position.z = 500;
      scene = new THREE.Scene();

      var mtlLoader = new THREE.MTLLoader();
      mtlLoader.load('female-croupier-2013-03-26.mtl', function (materials) {
        materials.preload();
        var objLoader = new THREE.OBJLoader();
        objLoader.setMaterials(materials);
        objLoader.load('female-croupier-2013-03-26.obj', function (object) {
          scene.add(object);
        }, onLoading, onErrorLoading);
      }, onLoading, onErrorLoading);
    }

    const onLoading = (xhr) => {
      console.log((xhr.loaded / xhr.total * 100) + '% loaded');
    };

    const onErrorLoading = (error) => {
      console.log('An error happened', error);
    };

    const animate = () => {
      this.requestId = requestAnimationFrame(animate);
      renderer.render(scene, camera);
      // cube.rotation.y += 0.05;
      gl.flush();
      rngl.endFrame();
    };

    init();
    animate();
  };

  render() {
    return (
      <View style={styles.container}>
        <WebGLView
          style={styles.webglView}
          onContextCreate={this.onContextCreate}
        />
      </View>
    );
  }
}

const styles = StyleSheet.create({
  container: {
    flex: 1,
    backgroundColor: "#fff",
    alignItems: "center",
    justifyContent: "center"
  },
  webglView: {
    width: 300,
    height: 300
  }
});
This error is, as others have said, caused by three.js trying to use features from a browser which React Native does not have.
I've gotten as far as being able to load the textures (which is the stage your error comes from) by monkey-patching the texture loader to use the loader in react-native-webgl. Add this in your init function (preferably right near the top).
// make sure you have defined renderer and rngl
/*
const renderer = new THREE.WebGLRenderer(...)
const rngl = gl.getExtension("RN");
*/
const loadTexture = async function (url, onLoad, onProgress, onError) {
  let textureObject = new THREE.Texture();
  console.log("loading", url, "with fancy texture loader");
  let properties = renderer.properties.get(textureObject);
  var texture = await rngl.loadTexture({ yflip: false, image: url });
  /*
  rngl.loadTexture({ image: url })
    .then(({ texture }) => {
  */
  console.log("Texture [" + url + "] Loaded!");
  texture.needsUpdate = true;
  properties.__webglTexture = texture;
  properties.__webglInit = true;
  console.log(texture);
  if (onLoad !== undefined) {
    // console.warn('loaded tex', texture);
    onLoad(textureObject);
  }
  // });
  return textureObject;
};

THREE.TextureLoader.prototype.load = loadTexture;
This solves the problem of loading textures, and I can see them load in Charles, but they still don't render on the model, so I'm stuck past that point. Technically a correct answer, but you'll be stuck as soon as you've implemented it. I'm hoping you can comment back and tell me you've gotten further.
I had a similar setup and encountered the same issue. My option was to switch to JSONLoader, which doesn't need document to render in react-native. So I just loaded my model into Blender with a three.js add-on, then exported it as JSON. Just check out this process of adding a three.js add-on to Blender:
https://www.youtube.com/watch?v=mqjwgTAGQRY
All the best.
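For reference, loading the exported JSON looks roughly like this (a sketch; the file name is illustrative, and JSONLoader is the legacy loader shipped with three.js versions of that era):

// The onLoad callback receives the geometry plus the exported materials.
var loader = new THREE.JSONLoader();
loader.load('model.json', function (geometry, materials) {
  var mesh = new THREE.Mesh(geometry, materials);
  scene.add(mesh);
});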
This might get you closer:
The GLTF format supports embedding texture images (as base64). If your asset pipeline allows it, you could convert to glTF and then load that into three/react-native.
I had to provide some window polyfills for decodeURIComponent and atob, because GLTFLoader uses FileLoader to parse the base64.
I've successfully loaded embedded buffers, but you'll need more polyfills to load textures: TextureLoader uses ImageLoader, which uses document.createElementNS.
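The polyfills I mean look roughly like this (a sketch; I'm assuming the base-64 npm package as the atob source):

import { decode } from 'base-64';

// GLTFLoader's FileLoader decodes embedded data: URIs, which needs atob.
if (!global.atob) global.atob = decode;

// decodeURIComponent is standard in RN's JS engine; alias it onto window
// in case the loader looks it up there.
if (typeof window !== 'undefined' && !window.decodeURIComponent) {
  window.decodeURIComponent = decodeURIComponent;
}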
You are using the MTLLoader, which uses the TextureLoader, and the TextureLoader uses the ImageLoader.
The ImageLoader uses the document.createElementNS() function.
What I did to solve this was to call the three.js Texture constructor directly:
let texture = new THREE.Texture(
  url // URL = a base64 JPEG string in this case
);
(for the use of Texture, check the Texture documentation)
Then I used the Image class from React Native (instead of the three.js Image, which requires the DOM to be constructed) and gave that to the Texture as its image property:

import { Image } from 'react-native';

var img = new Image(128, 128);
img.src = url;
texture.image = img; // the Texture reads its pixels from the image property
texture.needsUpdate = true;

And then finally map the texture over the target material:

const mat = new THREE.MeshPhongMaterial();
mat.map = texture;

The React Native documentation explains how the Image element can be used; it supports base64-encoded JPEG.
Maybe there's a way for you to single out the part where it calls the TextureLoader and replace that part with this answer. Let me know how it works out.
Side note: I haven't tried to display this yet in my WebGLView, but in the logs it looked like normal three.js objects; it's worth a try.
Use TextureLoader from expo-three
import { TextureLoader } from "expo-three";
import { NearestFilter } from "three";

// Simple in-memory cache so each resource is only decoded once.
const textureCache = {};

export function loadTexture(resource) {
  if (textureCache[resource]) {
    return textureCache[resource].clone();
  }
  const texture = new TextureLoader().load(resource);
  texture.magFilter = NearestFilter;
  texture.minFilter = NearestFilter;
  textureCache[resource] = texture;
  return texture;
}
Source: https://github.com/EvanBacon/Expo-Crossy-Road/blob/master/src/Node/Generic.js
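Usage might then look like this (the asset path is illustrative):

// Nearest-filtered texture, cached across calls.
const material = new THREE.MeshPhongMaterial({
  map: loadTexture(require('./assets/texture.png')),
});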