I'm currently working with the ViroReact community package in React Native to display a video in AR when a specific image is found. However, the onAnchorFound callback of the ViroARImageMarker is never triggered, and the children of the ViroARImageMarker are not displayed.
When I added an onAnchorFound handler to the parent ViroARScene, that handler was triggered, but the children of the ViroARImageMarker still weren't rendered. Why is the marker's callback not triggered, and therefore its children not displayed, and how do I fix this?
The image is a 12x12 cm black card with a bright orange logo (about 3 cm wide) in the center. Neither of the targets is found by the ViroARImageMarker.
Here's my code:
Image Recognition Component
import React, { useEffect, useState } from 'react';
const {
ViroARScene,
ViroARImageMarker,
ViroARTrackingTargets,
ViroAnimations,
ViroVideo,
ViroMaterials,
ViroBox
} = require('@viro-community/react-viro');
const NewViroTracker = () => {
const videoPath = require('@assets/videos/wham.mp4');
const [videoAnimationName] = useState('showVideo');
const [playAnim, setPlayAnim] = useState(false);
function _onAnchorFound(evt: any) {
console.log('Anchor found in Marker :', evt);
setPlayAnim(true);
}
return (
<ViroARScene>
<ViroARImageMarker
target={'inviteCard'}
onAnchorFound={_onAnchorFound}>
<ViroVideo source={videoPath} />
</ViroARImageMarker>
<ViroARImageMarker
target={'logo'}>
<ViroBox position={[0, 0.25, 0]} scale={[0.5, 0.5, 0.5]} />
</ViroARImageMarker>
</ViroARScene>
);
};
ViroARTrackingTargets.createTargets({
inviteCard: {
source: require('@assets/images/invite-card.png'),
orientation: 'Up',
physicalWidth: 0.12 // real world width in meters
},
logo: {
source: require('@assets/images/logo-empty.png'),
orientation: 'Up',
physicalWidth: 0.0287 // real world width in meters
}
});
ViroMaterials.createMaterials({
chromaKeyFilteredVideo: {
chromaKeyFilteringColor: '#00FF00'
}
});
ViroAnimations.registerAnimations({
showVideo: {
properties: { scaleX: 1, scaleY: 1, scaleZ: 1 },
duration: 1000
},
closeVideo: {
properties: { scaleX: 0, scaleY: 0, scaleZ: 0 },
duration: 1
}
});
export default NewViroTracker;
App
import React from 'react';
const { ViroARSceneNavigator } = require('@viro-community/react-viro');
import styled from 'styled-components/native';
import NewViroTracker from 'components/NewViroTracker';
const App = () => {
return (
<ViroWrapper
autofocus={true}
initialScene={{
scene: NewViroTracker
}}
/>
);
};
export default App;
const ViroWrapper = styled(ViroARSceneNavigator)`
flex: 1;
`;
Dependencies:
"#viro-community/react-viro": "^2.21.1",
"react": "17.0.2",
"react-native": "0.66.3",
Related
I tried to use the GestureDetector from react-native-gesture-handler:
import React from 'react';
import { Directions, Gesture, GestureDetector } from 'react-native-gesture-handler';
import Animated, { useAnimatedStyle, useSharedValue, withTiming } from 'react-native-reanimated';
/**
* Component used as Home Page
*/
const HomePage: React.FC = () => {
const position = useSharedValue(0);
const trigger = () => {
console.log('fdfs')
}
const flingGesture = Gesture.Fling()
.direction(Directions.RIGHT)
.onStart((e) => {
position.value = withTiming(position.value + 10, { duration: 100 });
console.log(e)
// trigger()
});
const flingGestureLeft = Gesture.Fling()
.direction(Directions.LEFT)
.onStart((e) => {
position.value = withTiming(position.value - 10, { duration: 100 });
// trigger()
});
const animatedStyle = useAnimatedStyle(() => ({
transform: [{ translateX: position.value }],
}));
return (
<GestureDetector gesture={Gesture.Simultaneous(flingGestureLeft, flingGesture)}>
<Animated.View style={[{ width: 100, height: 30, backgroundColor: 'red' }, animatedStyle]} />
</GestureDetector>
);
}
export default HomePage;
This works without problems when I fling my block to the left or right, but when I try to call an external function like trigger(), my app crashes. Is this a bug in the gesture detector, or is there something I need to add?
The Reanimated and Gesture Handler hooks and callbacks run as worklets on the UI thread, while the trigger function you defined lives on the JS thread by default, so you cannot call it directly.
There are two solutions to this:
Add the 'worklet' directive to the trigger function, as below:
const trigger = () => {
'worklet'
console.log('fdfs')
}
Or wrap the call with runOnJS from Reanimated, as below:
import { runOnJS } from 'react-native-reanimated';
const flingGesture = Gesture.Fling()
.direction(Directions.RIGHT)
.onStart((e) => {
position.value = withTiming(position.value + 10, { duration: 100 });
console.log(e)
runOnJS(trigger)()
});
Note: the syntax for runOnJS is runOnJS(functionName)(params).
So if your function takes two params (e.g. the first a number and the second a string), you would call it like this: runOnJS(trigger)(1, 'dummyString').
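For instance, here is a small self-contained sketch (the function name and its parameters are made up for illustration) showing a parameterized JS-thread function being scheduled with runOnJS from the UI-thread fling callback:

```tsx
import React from 'react';
import { Directions, Gesture, GestureDetector } from 'react-native-gesture-handler';
import Animated, {
  runOnJS,
  useAnimatedStyle,
  useSharedValue,
  withTiming,
} from 'react-native-reanimated';

// Plain JS-thread function; the name and parameters are illustrative only.
const trigger = (offset: number, label: string) => {
  console.log('fling finished', offset, label);
};

const FlingBox: React.FC = () => {
  const position = useSharedValue(0);

  const flingGesture = Gesture.Fling()
    .direction(Directions.RIGHT)
    .onStart(() => {
      // This callback runs as a worklet on the UI thread...
      position.value = withTiming(position.value + 10, { duration: 100 });
      // ...so the JS-thread function must be scheduled via runOnJS.
      runOnJS(trigger)(10, 'right');
    });

  const animatedStyle = useAnimatedStyle(() => ({
    transform: [{ translateX: position.value }],
  }));

  return (
    <GestureDetector gesture={flingGesture}>
      <Animated.View
        style={[{ width: 100, height: 30, backgroundColor: 'red' }, animatedStyle]}
      />
    </GestureDetector>
  );
};

export default FlingBox;
```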
For more details, you can read the docs for Reanimated and Gesture Handler.
SDK Version: 37.0.0
Platforms (Android/iOS/web/all): Android/iOS.
I'm trying to load a 3D object in my application and have followed many tutorials.
The model is loaded successfully and attached to the scene variable, but it doesn't appear in the view.
This is part of my graduation project and I have been searching for answers for the last two months.
I'm looking for a real, working solution, as most of the solutions I found are too old and don't work.
My component is described below:
import * as React from 'react';
import { Ionicons } from '@expo/vector-icons';
import { View, StyleSheet, PixelRatio, TouchableOpacity } from 'react-native';
import { ExpoWebGLRenderingContext, GLView } from 'expo-gl';
import { Renderer, TextureLoader } from 'expo-three';
import ExpoTHREE from 'expo-three';
import * as THREE from 'three'
import {
AmbientLight,
BoxBufferGeometry,
Fog,
Mesh,
MeshStandardMaterial,
PerspectiveCamera,
PointLight,
Scene,
SpotLight,
} from 'three';
export default class ModelScreen extends React.Component {
constructor(props) {
super(props);
this.timeout = null;
this.state= {
loadingCompleted:true,
}
}
componentDidMount(){
THREE.suppressExpoWarnings(true)
clearTimeout(this.timeout)
}
componentWillUnmount(){
clearTimeout(this.timeout)
}
render() {
return (
<View style={styles.container}>
<TouchableOpacity style={[styles.backButton]} activeOpacity={0.8}
onPress= {()=>{
this.props.setViewModel(false);
}}
>
<Ionicons style={styles.backButtonIcon} name="md-arrow-back"></Ionicons>
</TouchableOpacity>
<GLView
style={styles.viewer}
onContextCreate={async (gl: ExpoWebGLRenderingContext) => {
const { drawingBufferWidth: width, drawingBufferHeight: height } = gl;
const sceneColor = '#111111';
const scale = PixelRatio.get();
// Create a WebGLRenderer without a DOM element
const renderer = new Renderer({gl,alpha:true});
renderer.capabilities.maxVertexUniforms = 52502;
renderer.setSize(width/scale, height/scale);
renderer.setPixelRatio(scale);
renderer.setClearColor(0x000000,0);
const camera = new PerspectiveCamera(45, width / height, 1, 1000);
camera.position.set(0, 2, 5);
camera.lookAt(0,0,0);
const scene = new Scene();
scene.fog = new Fog(sceneColor, 1, 1000);
const ambientLight = new AmbientLight(0x101010);
scene.add(ambientLight);
const pointLight = new PointLight(0xffffff, 2, 1000, 1);
pointLight.position.set(0, 200, 200);
scene.add(pointLight);
const spotLight = new SpotLight(0xffffff, 0.5);
spotLight.position.set(0, 500, 100);
spotLight.lookAt(scene.position);
scene.add(spotLight);
var object = null;
const model = {
'thomas.obj': require('./../assets/models/thomas/thomas.obj'),
'thomas.mtl': require('./../assets/models/thomas/thomas.mtl'),
'thomas.png': require('./../assets/models/thomas/thomas.png'),
};
// Load model!
await ExpoTHREE.loadAsync(
[model['thomas.obj'], model['thomas.mtl']],
null,
name => model[name],
).then((obj)=>{
// // Update size and position
ExpoTHREE.utils.scaleLongestSideToSize(obj, 5);
ExpoTHREE.utils.alignMesh(obj, { y: 1 });
// Smooth mesh
ExpoTHREE.utils.computeMeshNormals(obj.children[0]);
// Add the mesh to the scene
scene.add(obj.children[0]);
}).catch((error)=>{
console.log(error);
});
console.log(scene.children.length)
function update() {
if (scene.children.length == 4)
scene.children[3].rotateY(0.03);
}
// Setup an animation loop
const render = () => {
this.timeout = requestAnimationFrame(render);
update();
renderer.render(scene, camera);
gl.endFrameEXP();
};
render();
}}
/>
</View>
);
}
}
const styles = StyleSheet.create({
container: {
flex:1,
justifyContent:"center",
alignItems:"center",
backgroundColor:"#111111",
},
backButton:{
position:'absolute',
top:30,
width:50,
height:50,
alignItems:"center",
justifyContent:"center",
borderRadius:100,
backgroundColor:"rgb(1,175,250)",
left:20,
alignSelf:"flex-start",
zIndex:10,
},
backButtonIcon:{
fontSize:25,
fontWeight:'900',
color:"#111111",
},
viewer:{
width:"80%",
height:"80%",
}
});
Can anyone help me, please?
Edit
It's working now, but without the material. I changed
await ExpoTHREE.loadAsync([model['thomas.obj'], model['thomas.mtl']],null,name => model[name])
to
await ExpoTHREE.loadAsync(model['thomas.obj'],null,name => model[name])
but it doesn't render if I load the material, and I don't know what's wrong with it. I tried different objects and still cannot render the material; the object disappears.
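One thing worth trying (only a sketch, and it assumes thomas.png is the texture the MTL points at): keep loading just the OBJ as in the edit above, then load the PNG with expo-three's TextureLoader and apply it to the meshes manually inside onContextCreate:

```js
import ExpoTHREE, { TextureLoader } from 'expo-three';
import { MeshStandardMaterial } from 'three';

// ...inside onContextCreate, after the scene and lights have been set up:
const obj = await ExpoTHREE.loadAsync(
  require('./../assets/models/thomas/thomas.obj'),
);

// expo-three's TextureLoader accepts a require()'d asset module.
const texture = new TextureLoader().load(
  require('./../assets/models/thomas/thomas.png'),
);

// Apply the texture to every mesh in the loaded object, then add it to the scene.
obj.traverse(child => {
  if (child.isMesh) {
    child.material = new MeshStandardMaterial({ map: texture });
  }
});
scene.add(obj);
```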
I am trying to make a pulsing SVG heart in a React Native Expo app using an SVG icon.
The only way I managed to make the heart resize with an animated value is to bind it to the fontSize style.
This seems to change the size correctly, but the animation is really choppy.
Here is the code:
import React, { Component } from 'react';
import { Animated } from 'react-native';
import { SimpleLineIcons } from '@expo/vector-icons';
const AnimatedIcon = Animated.createAnimatedComponent(SimpleLineIcons);
const TARGET_FONT_SIZE = 16;
const GROWN_FONT_SIZE = 24;
class GrowingHeart extends Component<any, any> {
size = new Animated.Value(TARGET_FONT_SIZE);
constructor(props) {
super(props);
Animated.sequence([
Animated.timing(this.size, {
duration: 1000,
toValue: GROWN_FONT_SIZE
}),
Animated.timing(this.size, {
duration: 1000,
toValue: TARGET_FONT_SIZE // shrink back so the heart actually pulses
})
]).start();
}
render() {
return (
<AnimatedIcon
style={{ fontSize: this.size }}
size={20}
name="heart"
color="red"
/>
);
}
}
I also tried binding width and height, but they are just as choppy and they change the container size rather than the icon.
Is there a better way of doing this? Thanks.
It seems that the Animated API is simply still rubbish (please correct me if I am getting this wrong).
I rewrote this with Reanimated and it works smoothly. The code below is not complete, but it shows the heart growing with no choppiness, perfectly smoothly.
import React, { Component } from 'react';
import { TouchableOpacity } from 'react-native-gesture-handler';
import Animated, { Easing } from 'react-native-reanimated';
import { SimpleLineIcons } from '@expo/vector-icons';
const {
createAnimatedComponent,
debug,
set,
get,
block,
eq,
Value,
cond,
greaterThan,
startClock,
timing,
Clock,
Code,
clockRunning,
stopClock
} = Animated;
const AnimatedIcon = createAnimatedComponent(SimpleLineIcons);
const TARGET_FONT_SIZE = 16;
const GROWN_FONT_SIZE = 20;
class GrowingHeart extends Component<any, any> {
size = new Value(TARGET_FONT_SIZE);
clock = new Clock();
updatingValue = new Value(0);
clockState = {
finished: new Value(0),
position: new Value(5),
time: new Value(0),
frameTime: new Value(0)
};
clockConfig = {
duration: new Value(500),
toValue: new Value(GROWN_FONT_SIZE),
easing: Easing.linear
};
constructor(props) {
super(props);
}
render() {
const { color } = this.props;
const { updatingValue, size, clock, clockConfig, clockState } = this;
return (
<>
<Code>
{() =>
block([
cond(
// animation not triggered
eq(0, updatingValue),
[],
[
cond(
clockRunning(clock),
[],
[
set(clockState.finished, 0),
set(clockState.time, 0),
set(clockState.position, TARGET_FONT_SIZE),
set(clockState.frameTime, 0),
set(clockConfig.toValue, GROWN_FONT_SIZE),
startClock(clock)
]
),
cond(
greaterThan(0, updatingValue),
// is decreasing
[debug('going down', updatingValue)],
// is growing
[
timing(clock, clockState, clockConfig),
set(size, clockState.position)
]
),
cond(clockState.finished, [stopClock(clock)])
]
)
])
}
</Code>
<TouchableOpacity
onPress={() => {
this.updatingValue.setValue(1);
}}
>
<AnimatedIcon
style={{ fontSize: this.size }}
name="heart"
color={color}
/>
</TouchableOpacity>
</>
);
}
}
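For completeness, the choppiness with the original approach is largely down to fontSize not being supported by React Native's native driver, so that animation runs on the JS thread. A minimal sketch (my own, not part of the answer above) that animates a transform scale with useNativeDriver instead is also smooth with the plain Animated API:

```tsx
import React, { useEffect, useRef } from 'react';
import { Animated, Easing } from 'react-native';
import { SimpleLineIcons } from '@expo/vector-icons';

// The core Animated API can be smooth too, as long as the animated prop is
// supported by the native driver (transform/opacity are, fontSize is not).
const PulsingHeart: React.FC = () => {
  const scale = useRef(new Animated.Value(1)).current;

  useEffect(() => {
    Animated.loop(
      Animated.sequence([
        Animated.timing(scale, {
          toValue: 1.25,
          duration: 500,
          easing: Easing.inOut(Easing.ease),
          useNativeDriver: true, // runs on the UI thread, no JS-thread jank
        }),
        Animated.timing(scale, {
          toValue: 1,
          duration: 500,
          easing: Easing.inOut(Easing.ease),
          useNativeDriver: true,
        }),
      ]),
    ).start();
  }, [scale]);

  return (
    <Animated.View style={{ transform: [{ scale }] }}>
      <SimpleLineIcons name="heart" size={20} color="red" />
    </Animated.View>
  );
};

export default PulsingHeart;
```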
Is there any way I could combine a React Native component like this one https://github.com/archriss/react-native-snap-carousel/tree/master/example with Vue Native?
I have a project in vue-native and want to use a react-native component inside it, but I get an error I do not understand. The console says: Invariant Violation: expected a string (for built-in components) or a class/function (for composite components) but got: undefined
<template>
<nb-container>
<nb-content>
<carousel
:data="similarEventsData"
:renderItem="_renderItem"
:sliderWidth="sliderWidth"
:itemWidth="itemWidth"
:inactiveSlideScale="0.95"
:inactiveSlideOpacity="1"
:enableMomentum="true"
:activeSlideAlignment="'start'"
:containerCustomStyle="stylesObj.slider"
:contentContainerCustomStyle="stylesObj.sliderContentContainer"
:activeAnimationType="'spring'"
:activeAnimationOptions="{ friction: 4, tension: 40 }"
/>
</nb-content>
</nb-container>
</template>
<script>
import { Dimensions, Platform, Share } from "react-native";
import Carousel from 'react-native-snap-carousel';
import { scrollInterpolators, animatedStyles } from '../../utils/animations';
const { width: viewportWidth, height: viewportHeight } = Dimensions.get('window');
const slideHeight = viewportHeight * 0.36;
// wp() (a percentage-of-viewport-width helper) is not defined in the original snippet;
// this matches the helper used in the snap-carousel example styles.
const wp = (percentage) => Math.round((percentage * viewportWidth) / 100);
const slideWidth = wp(75);
const itemHorizontalMargin = wp(2);
export default {
components: { carousel: Carousel },
computed: {
similarEventsData () {
return [1, 2, 3]
}
},
data: function() {
return {
sliderWidth: viewportWidth,
itemWidth: slideWidth + itemHorizontalMargin * 2,
stylesObj: {
slider: {
marginTop: 15,
overflow: 'visible'
},
sliderContentContainer: {
paddingVertical: 10
},
}
};
},
methods: {
_renderItem ({item, index}) {
return <Text>fsd</Text>;
},
},
};
</script>
I expect the component to render, but no luck.
This question is about two years old, and I think in that time the devs added the functionality to do so. If anyone experiencing this issue runs into this post, here's what to do:
Example with the Entypo icon pack from Expo vector icons:
<script>
import { Entypo } from '@expo/vector-icons';
export default {
  // register the React component under the components option (not inside data)
  components: { Entypo }
}
</script>
and then in the template:
<template>
<entypo />
</template>
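Applying the same idea to the original question (a sketch only; it assumes JSX is transpiled in the .vue file, as in the question's renderItem), the snap-carousel component can be registered the same way and used as a kebab-case tag:

```vue
<script>
import { Text } from 'react-native';
import Carousel from 'react-native-snap-carousel';

export default {
  // Register the React Native component so the template can use <carousel>.
  components: { carousel: Carousel },
  data() {
    return { items: [1, 2, 3] };
  },
  methods: {
    // renderItem returns a React element; this relies on JSX support
    // in the .vue file, as in the original question.
    renderItem({ item }) {
      return <Text>{item}</Text>;
    },
  },
};
</script>

<template>
  <carousel
    :data="items"
    :renderItem="renderItem"
    :sliderWidth="320"
    :itemWidth="250"
  />
</template>
```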
I am looking for a text editor component for React Native without WebView. I found only react-native-zss-rich-text-editor, but it uses WebView, which I think is terrible.
I hope to find something that works with NSAttributedString and SpannableString for iOS and Android in a native way, like in Evernote, for example.
Evernote Android app text editor
Here is a TextArea component that wraps TextInput and resizes automatically when you press the "new line" button:
import React from 'react'
import PropTypes from 'prop-types' // PropTypes moved out of the React package
import { TextInput } from 'react-native'
export default class TextArea extends React.Component {
static propTypes = {
text: PropTypes.string.isRequired,
onChangeText: PropTypes.func.isRequired,
initialHeight: PropTypes.number,
isEditing: PropTypes.bool,
scrollViewRef: PropTypes.object,
}
static defaultProps = {
initialHeight: 40,
isEditing: true,
scrollViewRef: null,
}
state = {
height: this.props.initialHeight,
}
componentWillUnmount(){
this._isUnmounted = true // mark as unmounted so the delayed scroll is skipped
}
focus(){
this.refs.textInput.focus()
}
blur(){
this.refs.textInput.blur()
}
_contentSizeChanged = e => {
this.setState({
height: e.nativeEvent.contentSize.height,
}, () => {
if (this.props.scrollViewRef) {
setTimeout(() => {
if (!this._isUnmounted) this.props.scrollViewRef.scrollToEnd()
}, 0)
}
})
}
_onChangeText = text => {
this.props.onChangeText(text)
}
render(){
return (
<TextInput
ref = "textInput"
multiline
value = {this.props.text}
style = {{ height: this.state.height, color: 'black', flex: 1 }}
onChangeText = {this._onChangeText}
onContentSizeChange = {this._contentSizeChanged}
editable = {this.props.isEditing}
blurOnSubmit = {false}
/>
)
}
}
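For reference, a small usage sketch (the surrounding screen, state handling, and import path are assumptions, not part of the component above) showing the TextArea growing inside a ScrollView:

```jsx
import React from 'react'
import { ScrollView } from 'react-native'
import TextArea from './TextArea' // path is an assumption

export default class NoteScreen extends React.Component {
  state = { text: '' }
  scrollView = null

  render() {
    return (
      <ScrollView ref={ref => { this.scrollView = ref }}>
        <TextArea
          text={this.state.text}
          onChangeText={text => this.setState({ text })}
          initialHeight={60}
          // lets the TextArea scroll to the end as it grows
          scrollViewRef={this.scrollView}
          isEditing
        />
      </ScrollView>
    )
  }
}
```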