Redux Toolkit and Class Components - React Native [duplicate]

This question already has an answer here:
How to use redux-toolkit createSlice with React class components
(1 answer)
Closed 1 year ago.
I am new to using React Native, and was recently planning on using Redux Toolkit with my project. I was previously using standard Redux, but really like how RTK keeps everything concise with slices.
My project currently uses class components for the screens, and I would like to keep it that way, but I can't find any documentation on using RTK with class components, only with functional components. I wanted to know whether it is possible to keep using class components and what changes need to be made in order to add RTK. I feel like I should be able to just import the actions from the slices and then use mapDispatchToProps, but any insight would be appreciated.
Thanks!

Redux Toolkit is purely about writing Redux logic, and is totally separate from how you write your React components and use React-Redux.
So, you can use any combination of:
Redux logic: vanilla hand-written Redux or RTK
React-Redux: connect + class components, or function components + hooks
See the React-Redux docs on using connect:
https://react-redux.js.org/using-react-redux/connect-mapstate
https://react-redux.js.org/using-react-redux/connect-mapdispatch
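For example, here is a minimal sketch of exactly that combination: an RTK slice plus a class component wired up with connect and the object form of mapDispatchToProps. The slice name, state shape, and component are made up for illustration.

// counterSlice.js - a standard RTK slice (names here are illustrative)
import { createSlice } from '@reduxjs/toolkit';

const counterSlice = createSlice({
  name: 'counter',
  initialState: { value: 0 },
  reducers: {
    // Immer lets us "mutate" state inside createSlice reducers
    increment: (state) => {
      state.value += 1;
    },
  },
});

export const { increment } = counterSlice.actions;
export default counterSlice.reducer;

// CounterScreen.js - class component connected the usual React-Redux way
import React from 'react';
import { Button, Text, View } from 'react-native';
import { connect } from 'react-redux';
import { increment } from './counterSlice';

class CounterScreen extends React.Component {
  render() {
    return (
      <View>
        <Text>{this.props.value}</Text>
        <Button title="+1" onPress={this.props.increment} />
      </View>
    );
  }
}

const mapStateToProps = (state) => ({ value: state.counter.value });
// The object shorthand for mapDispatchToProps works fine with RTK action creators
const mapDispatchToProps = { increment };

export default connect(mapStateToProps, mapDispatchToProps)(CounterScreen);

The only RTK-specific part is where the action creators come from; the connect wiring is unchanged from hand-written Redux.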

Related

Best Way To Globally Instantiate Ably Connection in React Native

I am building a React Native game with multiple screens. I am trying to instantiate Ably one time and use it throughout the application without having to call new Ably('api-key') on every screen, as this is creating multiple new connections and making it impossible to keep a single connectionId. Is there a way to architect this so that I can define
const client = new Ably('api-key')
one time in App.js and use it throughout the other screens? Either by passing it through React Navigation screen props or by using Context? The documentation on this is sparse, as most of their examples only use the client on a single page.
Redux will not work. You can create an Ably client and pass it to other components using props, or just use React context to make it available to the whole application. Using redux to store an Ably client will cause an error since Ably clients aren’t serialisable.
See https://reactjs.org/docs/context.html for more info.
Here are some examples:
https://github.com/ably-labs/react-hooks
https://ably.com/blog/ably-react-hooks-npm-package
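And here is a rough sketch of the Context approach. It assumes the ably npm package (the question's new Ably('api-key') is written as Ably.Realtime here, per that package); the hook name and channel names are made up for illustration.

// AblyContext.js - create the client exactly once and share it via React Context
import React, { createContext, useContext } from 'react';
import * as Ably from 'ably';

const client = new Ably.Realtime('api-key'); // single connection for the whole app
const AblyContext = createContext(client);

export const AblyProvider = ({ children }) => (
  <AblyContext.Provider value={client}>{children}</AblyContext.Provider>
);

// Convenience hook for function components; class components can use
// AblyContext.Consumer or static contextType instead.
export const useAbly = () => useContext(AblyContext);

// App.js - wrap the navigator once:
//   <AblyProvider><AppNavigator /></AblyProvider>
// Any screen can then do:
//   const ably = useAbly();
//   ably.channels.get('game').publish('move', data);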
Cheers,
Richie

Problem with the two ways to create React Native apps

Hello, I'm very new to react-native.
I have a problem.
There are two ways to create react-native apps (please check the examples in those image links).
The first way is the method I found in the react-native documentation.
The second way is a method I got from a YouTube video.
My question is: are those two methods the same, or what is the difference between them? Thank you very much for your reply.
The first way is a functional component and the second one is a class component. Please check the links below for explanations.
what is the different between class component and function component - React Native
or Google something like "difference between functional component and class component"
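To make the difference concrete, here is a minimal illustration (the component names are made up); both render exactly the same thing:

import React from 'react';
import { Text } from 'react-native';

// First way: a functional component (what the current react-native docs show)
const Greeting = () => {
  return <Text>Hello world</Text>;
};

// Second way: a class component (what many older tutorials and videos show)
class GreetingClass extends React.Component {
  render() {
    return <Text>Hello world</Text>;
  }
}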

React Native New Architecture - Redux & Hooks

I've been developing apps with React Native for a while now, basing my project architecture on Redux, the React.Component lifecycle, and react-native-router-flux (for navigation).
Now I need to start developing a big new project which is going to be in production and needs to handle a lot of users.
As I understand it, I can't keep using React.Component for lifecycle methods and need to adopt the new hooks system.
My main concern is that React Native with Redux and hooks isn't stable enough and I won't be able to create a stable application.
As I see it, I should use React Native with hooks, Redux, and React Navigation as my main architecture. I would be glad to get your opinion on this one.
Thanks
All components in React can be defined as classes or as functions; it's up to you.
"As I understand it, I can't keep using React.Component for lifecycle methods and need to adopt the new hooks system"
You can still use React.Component if you write your components as classes. Use hooks if you write your components as functions.
"My main concern is that React Native with Redux and hooks isn't stable enough and I won't be able to create a stable application"
Why do you think they aren't stable? I am using them in my production apps.
Basically, Redux is for state management and React Navigation is for navigation. Whether you use hooks or not doesn't matter.
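To illustrate the lifecycle point, here is a rough sketch of the same data-loading screen written both ways. The loadUsers action creator and the state shape are made up for the example.

import React, { useEffect } from 'react';
import { Text } from 'react-native';
import { connect, useDispatch, useSelector } from 'react-redux';
import { loadUsers } from './userActions'; // hypothetical action creator

// Class style: React.Component lifecycle + connect still works fine
class UsersScreenClass extends React.Component {
  componentDidMount() {
    this.props.loadUsers();
  }
  render() {
    return <Text>{this.props.count} users</Text>;
  }
}
export const ConnectedUsersScreen = connect(
  (state) => ({ count: state.users.count }),
  { loadUsers }
)(UsersScreenClass);

// Function style: useEffect replaces componentDidMount, hooks replace connect
export function UsersScreen() {
  const count = useSelector((state) => state.users.count);
  const dispatch = useDispatch();
  useEffect(() => {
    dispatch(loadUsers());
  }, [dispatch]);
  return <Text>{count} users</Text>;
}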

How to check if a point lies on polyline in react native

I am not able to use google.maps.geometry.poly.isLocationOnEdge() in my react native project. Is there any other way to check whether a point lies on a route that I can use in React Native?
I recently came across this problem while using react-native-maps.
You can use the isPointInLine(point, lineStart, lineEnd) method from the geolib JS package. The package has many other useful methods.
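A minimal sketch, assuming geolib v3: isPointInLine checks a single segment, so for a route (polyline) you walk the consecutive coordinate pairs. The route coordinates below are illustrative.

// Check whether a point lies on any segment of a polyline using geolib
import { isPointInLine } from 'geolib';

// Route coordinates, e.g. from a directions API (illustrative values)
const route = [
  { latitude: 28.6139, longitude: 77.209 },
  { latitude: 28.62, longitude: 77.215 },
  { latitude: 28.63, longitude: 77.22 },
];

function isPointOnRoute(point, coordinates) {
  // Test every consecutive pair of coordinates as a line segment
  for (let i = 0; i < coordinates.length - 1; i++) {
    if (isPointInLine(point, coordinates[i], coordinates[i + 1])) {
      return true;
    }
  }
  return false;
}

console.log(isPointOnRoute({ latitude: 28.62, longitude: 77.215 }, route));

In practice GPS points rarely sit exactly on the line, so geolib's getDistanceFromLine with a small tolerance may be more useful than an exact check.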

How can I do Signature Capture in React Native?

I'm trying to understand how I can do a signature capture in React Native. My App is created with create-react-native-app and Expo and I'd prefer to not have to eject the app to get this functionality to work.
Would it be possible to wrap something like this in a webview? https://github.com/szimek/signature_pad
I've also looked at this project, https://github.com/RepairShopr/react-native-signature-capture but it requires me to eject the app and use react-native link.
Looking for any advice or suggestions on how to implement this feature while keeping my project as straightforward as possible (ideally, using create-react-native-app, but if this isn't possible could someone please explain to me why?)
The way React Native works is that each component available in React Native maps to a native component in the underlying platform.
i.e. an <Image /> is an ImageView on Android and a UIImageView on iOS.
The JavaScript code itself runs on a JavaScript thread on each platform, and as you use components in React Native, a translation layer passes information from JS across the React Native bridge, which results in the corresponding native components being created.
By default, React Native has included the following components: https://facebook.github.io/react-native/docs/components-and-apis.html#basic-components which means that only those components come out-of-the-box in React Native. If you want other components, then you have 2 options, either create a "composite" component in which your JS component is written into other JS components or, if your feature needs a native component not yet exposed by React Native, write your own "native" component to expose certain native functionality to your React Native code.
The way Expo works is that they have wrapped React Native and a handful of 3rd party components and built it within their application. The reason why you can't use a 3rd party native component they don't support is because when that component is used, the app itself doesn't have translation code to go from JS to a native Android/iOS view.
So, to do what you're asking, you'd need to find either a "native" drawing component that Expo has included in their platform/app, or a "composite" drawing component that is built from other default React Native components (or other components Expo supports).
i.e. on Android I might build this with a Canvas view, but from what I can tell React Native doesn't support that object natively, so I would probably have to write it myself, etc.
It's hard for Expo to support every 3rd party "native" component out there because React Native is open source and it iterates so fast that most community-built components aren't always up to date or they might conflict with one another.
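To show what the WebView idea from the question could look like in practice, here is a rough, hedged sketch that wraps signature_pad in a WebView. It assumes a WebView component with onMessage support (react-native-webview here); the CDN URL, HTML, and callback names are illustrative, not from the original answer.

// Wrapping szimek/signature_pad in a WebView and posting the result back to RN
import React from 'react';
import { WebView } from 'react-native-webview';

const html = `
<!DOCTYPE html>
<html>
  <body style="margin:0">
    <canvas id="pad" width="400" height="300" style="border:1px solid #ccc"></canvas>
    <button onclick="send()">Done</button>
    <script src="https://cdn.jsdelivr.net/npm/signature_pad@4/dist/signature_pad.umd.min.js"></script>
    <script>
      var pad = new SignaturePad(document.getElementById('pad'));
      function send() {
        // Send the signature back to React Native as a base64 data URL
        window.ReactNativeWebView.postMessage(pad.toDataURL());
      }
    </script>
  </body>
</html>`;

export default function SignatureScreen({ onSignature }) {
  return (
    <WebView
      originWhitelist={['*']}
      source={{ html }}
      onMessage={(event) => onSignature(event.nativeEvent.data)}
    />
  );
}

Since WebView is a component Expo supports, this stays within the "composite" approach described above and avoids ejecting.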
I am using react-native-signature-capture.
Working properly on both Android and iOS.
I know it's been a while, but there is an interesting article here: https://blog.expo.io/drawing-signatures-with-expo-25d1629ca1ac
Wait, but how?
Using “expo-pixi”, you can add a component that lets you choose your brush’s color, thickness, and opacity. Then when your user lifts her finger, you get a callback. From there you can take a screenshot of the transparent view or get the raw point data if that’s what you’re looking for.