React Native - changing screen content while animating

I have an animated screen using the gesture-handler and reanimated libraries. My problem is that I am trying to change the screen content when the animation fires. I have made it work, but honestly I am sure this is not the best solution (or it might be, but it needs some more configuration).
Code:
const onStateChangeLogin = event([
  {
    nativeEvent: ({ state }) => block([
      cond(eq(state, State.END), set(buttonOpacity, runTiming(new Clock(), 1, 0))),
      call([], () => handleTapLogin()),
    ]),
  },
]);
In the call function, I am trying to change the content:
const handleTapLogin = (e) => {
  setContent(<Login />);
};
This is the Animated.View:
<TapGestureHandler onHandlerStateChange={onStateChangeLogin}>
  <Animated.View style={{ ...styles.button, opacity: buttonOpacity, transform: [{ translateY: buttonY }] }}>
    <Text style={{ ...styles.buttonText, color: colors.white }}>{strings.signin}</Text>
  </Animated.View>
</TapGestureHandler>
And the screen content is rendered at the bottom of my 'return' statement:
        ...
        </Animated.View>
      </TapGestureHandler>
      {content}
    </Animated.View>
  </View>
</Container>
So it's working now, but it is too slow, I think because the call function in the event is async. What happens now is that the content changes, but sometimes I need to wait 2-3 seconds to see the right screen.
I have already tried to use 'listener' in the event function:
const onStateChangeLogin = event([
  {
    nativeEvent: ({ state }) => block([
      cond(eq(state, State.END), set(buttonOpacity, runTiming(new Clock(), 1, 0))),
      call([], () => handleTapLogin()),
    ], {
      listener: () => handleTapLogin,
    }),
  },
]);
And I also tried to use onGestureEvent and onHandlerStateChange together, like:
<TapGestureHandler onGestureEvent={onStateChangeLogin} onHandlerStateChange={handleTapLogin}>
...
</TapGestureHandler>

Related

I'm trying to call a function defined on the web, but nothing happens

I'm trying to open a simple page with React Native WebView.
It's a single-page web app, and when you do a search, it prints out some information about your search.
After that, if you want to search again, you press the device's back button to return to the search box.
Because it is a single page, I cannot use goBack, so I created a function called cancel.
The problem is that when I press the device's back button, the cancel function defined on the web page is not executed.
The cancel function deletes the search results and returns to the search window.
My code is below.
Please advise.
import React, { useEffect, useRef, useState } from 'react';
import { BackHandler, KeyboardAvoidingView, StatusBar } from 'react-native';
import { WebView } from 'react-native-webview';

export default function App() {
  const webviewRef = useRef(null);
  const [backTapping, setBackTapping] = useState(0);

  const backAction = () => {
    setBackTapping((prev) => prev + 1);
    webviewRef.current.injectJavaScript('window.cancel()');
    return true;
  };

  // Reset the double-tap counter every second
  useEffect(() => {
    const timer = setInterval(() => {
      setBackTapping(0);
    }, 1000);
    return () => clearInterval(timer);
  }, []);

  useEffect(() => {
    const backHandler = BackHandler.addEventListener('hardwareBackPress', backAction);
    return () => backHandler.remove();
  }, []);

  // Exit the app on a double back-tap
  useEffect(() => {
    if (backTapping >= 2) {
      return BackHandler.exitApp();
    }
  }, [backTapping]);

  return (
    <KeyboardAvoidingView style={{ flex: 1 }}>
      <StatusBar hidden />
      <WebView
        ref={webviewRef}
        textZoom={100}
        originWhitelist={['*']}
        javaScriptEnabled
        source={{ uri: 'myhome.com' }}
        startInLoadingState={true}
      />
    </KeyboardAvoidingView>
  );
}
Expected behavior:
The cancel function is executed, all open windows are closed, and you are returned to the search window.
In my case, the way I was calling the function was wrong.
Instead of:
webviewRef.current.injectJavaScript('window.cancel()')
use:
const generateOnMessageFunction = (data) => `
  (function() {
    window.dispatchEvent(new MessageEvent('message', { data: ${JSON.stringify(data)} }));
  })()
`;

webviewRef.current.injectJavaScript(generateOnMessageFunction('cancel'));
For details, see:
https://github.com/react-native-webview/react-native-webview/issues/809
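For the dispatched event to have any effect, the page inside the WebView also has to listen for it. A minimal sketch of what that web-side listener could look like (this wiring is an assumption, not taken from the question):
// Runs in the web page, not in React Native.
// The MessageEvent injected from the app arrives here with data === 'cancel'.
window.addEventListener('message', (event) => {
  if (event.data === 'cancel' && typeof window.cancel === 'function') {
    window.cancel(); // clear the search results and return to the search box
  }
});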

useState function seems to block Animated.timing event?

I've created a "twitter style" button that when pressed opens up a sub-menu of items that can be selected/"tweeted" about.
The button is simple in that when pressed, it triggers a function with Animated events:
const toggleOpen = () => {
  if (this._open) {
    Animated.timing(animState.animation, {
      toValue: 0,
      duration: 300,
    }).start();
  } else {
    Animated.timing(animState.animation, {
      toValue: 1,
      duration: 300,
    }).start(); // putting '() => setFirstInteraction(true)' here causes RenderItems to disappear after the animation duration, until the next onPress event.
  }
  this._open = !this._open;
};
and here's the button that calls this function:
<TouchableWithoutFeedback
  onPress={() => {
    toggleOpen();
    // setFirstInteraction(true); // this works here, but the button doesn't toggleOpen until the 3rd+ attempt.
  }}>
  <Animated.View style={[
    styles.button,
    styles.buttonActiveBg,
  ]}>
    <Image
      style={styles.icon}
      source={require('./assets/snack-icon.png')}
    />
  </Animated.View>
</TouchableWithoutFeedback>
I need to call a second state setter at the same time as toggleOpen();. You can see my comments above regarding the problems I'm facing when using the setFirstInteraction(true) state setter I'm referring to.
Logically this should work, but for some reason when I add setFirstInteraction(true) it seems to block the toggleOpen() function. If I persist and press the button a few times, eventually toggleOpen() works exactly as expected. My question is: why does this blocking behaviour happen?
You can reproduce the issue in my snack: https://snack.expo.dev/#dazzerr/topicactionbutton-demo . Please use a device. The web preview presents no issues, but on both iOS and Android the issue is present. Line 191 is where you'll see the setFirstInteraction(true) instance.
Your animated value isn't stable, which causes it to be recreated on each state change. It is advised to use useRef instead (though useMemo would do the trick here as well).
const animState = useRef(new Animated.Value(0)).current;
Your toggleOpen function can also be simplified. In fact, you only need a single piece of state to handle what you want, and you can react to it in a useEffect to trigger the animations that you have implemented.
I have called this state isOpen and I have removed all other states. The toggleOpen function just toggles this state.
const [isOpen, setIsOpen] = useState(false);

const toggleOpen = () => {
  setIsOpen(prev => !prev);
};
In the useEffect we react to state changes and trigger the correct animations.
const animState = useRef(new Animated.Value(0)).current;

useEffect(() => {
  Axios.get('https://www.getfretwise.com/wp-json/buddyboss/v1/forums')
    .then(({ data }) => setData(data))
    .catch((error) => console.error(error));
}, []);

useEffect(() => {
  Animated.timing(animState, {
    toValue: isOpen ? 1 : 0,
    duration: 300,
    useNativeDriver: true,
  }).start();
}, [isOpen, animState]);
I have adapted your snack. Here is a working version.
Remarks: Of course, you still need to wait for your data to be fetched from your API. The opacity change of the button is still the same, and it remains disabled until the data has been fetched.
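For reference, here is a rough sketch of how the sub-menu items could read the same animState so they fade and slide with the toggle; the interpolation ranges and style names are assumptions, not taken from the snack:
// Hypothetical item style driven by animState (0 = closed, 1 = open).
const itemStyle = {
  opacity: animState,
  transform: [
    {
      translateY: animState.interpolate({
        inputRange: [0, 1],
        outputRange: [20, 0], // slide up as the menu opens
      }),
    },
  ],
};

// <Animated.View style={[styles.menuItem, itemStyle]}> ... </Animated.View>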

Why does header button onPress act differently than regular button onPress?

I have a method called sendResults() that makes an API call and does some array manipulation. When I call the method using a "normal" TouchableOpacity button, everything works fine. However, when I call it using a button I have placed in the App Stack header, the method does not run correctly. It feels like an async issue (?), but I'm not sure...
Here is the code for the two buttons.
<TouchableOpacity
  onPress={() => {
    sendResults(); // works fine
  }}
  style={styles.buttonStyle}
>
  <Text>Save</Text>
</TouchableOpacity>
useEffect(() => {
  navigation.setOptions({
    headerRight: () => (
      <TouchableOpacity
        onPress={() => {
          sendResults(); // doesn't work
        }}
        style={styles.buttonStyle}
      >
        <Text>Save</Text>
      </TouchableOpacity>
    ),
  });
}, []);
Edit: sendResults() code
// Shows alert confirming user wants to send results to API
const sendResults = () => {
  Alert.alert("Save Results", "Alert", [
    {
      text: "Save & Quit",
      onPress: () => postNumsAndNavigate(),
      style: "destructive",
    },
    { text: "Cancel", onPress: () => console.log("") },
  ]);
};

// Save Results button
const postNumsAndNavigate = async () => {
  if (bibNums.length == 0) {
    alert("You have not recorded any results. Please try again.");
  } else if (bibNums.filter((entry) => entry == "").length > 0) {
    alert("Blank");
  } else {
    console.log("\n" + bibNums);
    await postNums();
    AsyncStorage.setItem(`done`, "true");
    navigation.navigate("Home Screen");
  }
};
postNums() does an API call.
Edit 2: bibNums declaration
const [bibNums, setBibNums] = useState([]);
You set the handler of the navigation button only once, because your useEffect has no dependencies; it runs only when the component is mounted, so it captures a stale reference to sendResults. sendResults changes every time postNumsAndNavigate and bibNums change. Add sendResults to the dependency array to update the navigation button handler every time sendResults changes.
useEffect(() => {
  ...
}, [sendResults]);
It works correctly for the TouchableOpacity because you are assigning the handler on every render.
onPress={() => {sendResults()}}
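A related option, sketched here under the assumption that the rest of the component stays as posted, is to memoize sendResults with useCallback (imported from react) so the header effect gets a stable dependency:
// Sketch: keep sendResults referentially stable between renders.
const sendResults = useCallback(() => {
  Alert.alert("Save Results", "Alert", [
    { text: "Save & Quit", onPress: () => postNumsAndNavigate(), style: "destructive" },
    { text: "Cancel", onPress: () => console.log("") },
  ]);
}, [bibNums]); // recreate only when the data being posted changes

useEffect(() => {
  navigation.setOptions({
    headerRight: () => (
      <TouchableOpacity onPress={sendResults} style={styles.buttonStyle}>
        <Text>Save</Text>
      </TouchableOpacity>
    ),
  });
}, [navigation, sendResults]);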

Pass useAnimatedGestureHandler via forwardRef

I'm about to swap the old React Native Animated library for the new React Native Reanimated one to address performance issues, but I have encountered one problem I could not solve.
In all the examples I found online, the gesture handler created with useAnimatedGestureHandler is in the same component as the Animated.View. In reality that is sometimes not possible.
In my previous app, I just passed the gesture handler object to the component via forwardRef, but it seems React Native Reanimated is not able to do that. I don't know whether I have a syntax error or it is just a bug.
const App = () => {
  const handlerRef = useAnimatedRef();
  const y = useSharedValue(0);

  handlerRef.current = useAnimatedGestureHandler({
    onStart: (_, ctx) => {
      ctx.startY = y.value;
    },
    onActive: ({ translationX, translationY }, ctx) => {
      y.value = translationY;
    },
    onEnd: () => {},
  });

  const animatedStyles = useAnimatedStyle(() => ({
    transform: [{ translateY: withSpring(y.value) }],
  }));

  const UsingHandlerDirect = () => (
    <PanGestureHandler onGestureEvent={handlerRef.current}>
      <Animated.View style={[styles.blueBox, animatedStyles]} />
    </PanGestureHandler>
  );

  const UsingHandlerForwardRef = forwardRef(({ animatedStyles }, ref) => (
    <PanGestureHandler onGestureEvent={ref?.handlerRef?.current}>
      <Animated.View style={[styles.redBox, animatedStyles]} />
    </PanGestureHandler>
  ));

  return (
    <SafeAreaView>
      <View style={styles.container}>
        <UsingHandlerForwardRef ref={handlerRef} animatedStyles={animatedStyles} />
        <UsingHandlerDirect />
      </View>
    </SafeAreaView>
  );
};
I have saved the gesture handler in a useAnimatedRef (handlerRef.current = useAnimatedGestureHandler({})) to make things more readable. Then I pass the ref directly into the PanGestureHandler of the UsingHandlerDirect component. The result is that when I drag the blue box, the box follows the gesture. So this version works.
But as soon as I pass the handlerRef to the UsingHandlerForwardRef component, none of the gesture events get fired. I would expect that when I drag the red box it would also follow the gesture, but it doesn't.
Does someone have an idea whether it's me or it's a bug in the library?
Cheers
I have given up on the idea of passing a ref around; instead, I created a hook that connects both components with each other via context.
I created a simple hook
import { useSharedValue } from 'react-native-reanimated';

const useAppState = () => {
  const sharedXValue = useSharedValue(0);

  return {
    sharedXValue,
  };
};

export default useAppState;
that holds the shared value using useSharedValue from reanimated 2
The child component uses this value in the gestureHandler like this:
const gestureHandler = useAnimatedGestureHandler({
  onStart: (_, ctx) => {
    ctx.startX = sharedXValue.value;
  },
  onActive: (event, ctx) => {
    sharedXValue.value = ctx.startX + event.translationX;
  },
  onEnd: (_) => {
    sharedXValue.value = withSpring(0);
  },
});
and the Parent just consumes the hook value
const animatedStyle = useAnimatedStyle(() => {
  return {
    transform: [
      {
        translateX: -sharedXValue.value,
      },
    ],
  };
});
I have created a working Snack which contains the two components: a Child with a blue box and a Parent with a red box.
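Since each call to useAppState creates its own shared value, the hook result has to be provided once and read from context by both components. A minimal sketch of that wiring (the provider and context names are assumptions, not taken from the snack):
import React, { createContext, useContext } from 'react';
import useAppState from './useAppState';

// Context that carries the { sharedXValue } object from the hook.
const AppStateContext = createContext(null);

export const AppStateProvider = ({ children }) => {
  const appState = useAppState();
  return (
    <AppStateContext.Provider value={appState}>
      {children}
    </AppStateContext.Provider>
  );
};

// Parent and Child both call this to get the same sharedXValue.
export const useSharedAppState = () => useContext(AppStateContext);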

How does Animated.Event work in React Native?

Am I understanding it correctly?
Do these two sets of code mean the same thing? Is there any difference in performance or reliability?
<ScrollView
  onScroll={Animated.event(
    [{ nativeEvent: { contentOffset: { y: this.state.scrollY } } }]
  )}
>
</ScrollView>
AND
handleScroll(e) {
  this.setState({ scrollY: e.nativeEvent.contentOffset.y });
}

<ScrollView
  onScroll={(e) => this.handleScroll(e)}
>
</ScrollView>
Thanks
It's not the same. Animated.event is used to map gestures like scrolling, panning, or other events directly to Animated values. So in your first example this.state.scrollY is an Animated.Value. You would probably have code somewhere that initialized it; maybe your constructor looks something like this:
constructor(props) {
  super(props);
  this.state = {
    scrollY: new Animated.Value(0)
  };
}
In your second example this.state.scrollY is the y value (just the number) from the scroll event, completely unrelated to animation, so you couldn't use that value in an animation the way you can use an Animated.Value.
It's explained here in the documentation.
If you want to handle the scroll you can use it this way:
handleScroll = (event) => {
  // custom actions
};

<ScrollView
  onScroll={Animated.event(
    [{
      nativeEvent: {
        contentOffset: {
          y: this.state.scrollY,
        },
      },
    }],
    {
      listener: (event) => {
        this.handleScroll(event);
      },
    }
  )}
>
</ScrollView>
According to this source code, Animated.event traverses the objects passed to it as arguments until it finds an instance of AnimatedValue.
The path to that key (where the AnimatedValue was found) is then applied to the event object received by the callback (onScroll), and the value at that path is assigned to the AnimatedValue.
In the code:
const animatedValue = useRef(new Animated.Value(0)).current;
...
onScroll={Animated.event(
  [{ nativeEvent: { contentOffset: { y: animatedValue } } }]
)}
is the same as
const animatedValue = useRef(new Animated.Value(0)).current;
...
onScroll={({ nativeEvent: { contentOffset: { y } } }) => {
  animatedValue.setValue(y);
}}
If your callback accepts more than one argument, just put the mapping object at the needed index (hence the array as the argument of Animated.event).
onScroll={Animated.event(
  [
    {}, // <- disregard the first argument of the event callback
    { nativeEvent: { contentOffset: { y: animatedValue } } }, // <- apply the mapping to the second
  ]
)}
Yes, there is a difference in semantics.
<ScrollView
  onScroll={Animated.event(
    [{ nativeEvent: { contentOffset: { y: this.state.scrollY } } }]
  )}
></ScrollView>
The first one, i.e. the Animated.event above, returns a handler that maps the ScrollView's nativeEvent.contentOffset.y onto your scrollY state, which I assume is an Animated.Value.
The other code just sets the scrollY state to your ScrollView's e.nativeEvent.contentOffset.y and causes a re-render of your component.
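To make the practical difference concrete, here is a small sketch (not taken from the question) of what the Animated.Value variant buys you: scrollY can drive a style directly on every scroll frame, without the re-render that setState triggers. The interpolation ranges are just an illustration.
// A header that fades out over the first 100 px of scroll.
<Animated.View
  style={{
    opacity: this.state.scrollY.interpolate({
      inputRange: [0, 100],
      outputRange: [1, 0],
      extrapolate: 'clamp',
    }),
  }}
>
  <Text>Header</Text>
</Animated.View>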