I need to send search events to Algolia from a React Native app. The closest I got was finding the search-insights package from Algolia, but there isn't any documentation on how to properly use it in a React Native app.
Has anybody done this before who can point me in the right direction?
Algolia just published some guides on how to use it with React Native, e.g. https://www.algolia.com/doc/guides/building-search-ui/going-further/native/react-hooks/ and https://www.algolia.com/doc/guides/building-search-ui/going-further/native/react/
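If it helps, here is a minimal sketch of sending a click event with search-insights from React Native. It assumes search-insights v2; the app ID, API key, index name, user token, and queryID below are placeholders you'd replace with your own values.

```js
// Minimal sketch: sending a click event from React Native with search-insights (v2).
// 'YOUR_APP_ID', 'YOUR_SEARCH_API_KEY', 'your_index' and the user token are placeholders.
import aa from 'search-insights';

aa('init', {
  appId: 'YOUR_APP_ID',
  apiKey: 'YOUR_SEARCH_API_KEY',
});

// There are no browser cookies in React Native, so set the user token explicitly.
aa('setUserToken', 'anonymous-user-1');

// Call this from a result item's onPress handler.
export function sendClickEvent(queryID, objectID, position) {
  aa('clickedObjectIDsAfterSearch', {
    eventName: 'Result Clicked',
    index: 'your_index',
    queryID, // returned by the search response when clickAnalytics is enabled
    objectIDs: [objectID],
    positions: [position],
  });
}
```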
I am using Bugsnag in React Native to track bugs, and I want to fall back to a custom component when an error occurs. The docs mention a fallback component for React, but I couldn't find one for React Native. How can I do this? Help me out.
You're best contacting Bugsnag support directly to get a quick response to queries about the product.
Bugsnag doesn't yet support error boundaries in React Native like we do for React, but we plan to in the near future. We're also about to start officially supporting Expo as a platform and will support error boundaries there.
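In the meantime, you can roll your own error boundary in React Native and report the caught error manually. A minimal sketch, assuming a bugsnag-react-native Client created in a separate module (the './bugsnag' import and the fallback UI below are placeholders, not official Bugsnag API guidance):

```js
// Minimal sketch of a hand-rolled error boundary for React Native.
// Assumes './bugsnag' exports a bugsnag-react-native Client instance,
// e.g. `export const bugsnag = new Client('YOUR-API-KEY');`
import React from 'react';
import { Text, View } from 'react-native';
import { bugsnag } from './bugsnag'; // placeholder module

export default class ErrorBoundary extends React.Component {
  state = { hasError: false };

  static getDerivedStateFromError() {
    // Switch to the fallback UI on the next render.
    return { hasError: true };
  }

  componentDidCatch(error, info) {
    // Report the render error to Bugsnag manually.
    bugsnag.notify(error);
  }

  render() {
    if (this.state.hasError) {
      // Replace this with whatever custom fallback component you want.
      return (
        <View>
          <Text>Something went wrong.</Text>
        </View>
      );
    }
    return this.props.children;
  }
}
```

Wrap your app root in it, e.g. <ErrorBoundary><App /></ErrorBoundary> (error boundaries need React 16+, and getDerivedStateFromError needs 16.6+).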
Thanks!
I am not able to use google.maps.geometry.poly.isLocationOnEdge() in my React Native project. Is there any other way to check whether a point lies on a route that I can use in React Native?
I recently came across this problem while using react-native-maps.
You can use the isPointInLine(point, lineStart, lineEnd) method from the geolib JS package. This package has many other useful methods.
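For a route (an array of coordinates), you can test each consecutive segment. A minimal sketch, assuming geolib v3's named exports and { latitude, longitude } points like the ones you'd pass to react-native-maps' Polyline:

```js
// Minimal sketch: check whether a point lies on a route made of coordinate pairs.
import { isPointInLine } from 'geolib';

function isPointOnRoute(point, route) {
  // Test the point against every consecutive segment of the route.
  for (let i = 0; i < route.length - 1; i++) {
    if (isPointInLine(point, route[i], route[i + 1])) {
      return true;
    }
  }
  return false;
}

// Example usage with a two-segment route.
const route = [
  { latitude: 51.5155, longitude: 7.4653 },
  { latitude: 51.5175, longitude: 7.4678 },
  { latitude: 51.5198, longitude: 7.4702 },
];
console.log(isPointOnRoute({ latitude: 51.5175, longitude: 7.4678 }, route));
```

Since GPS readings rarely fall exactly on a line, you may prefer geolib's getDistanceFromLine(point, lineStart, lineEnd) and compare the result against a tolerance in meters.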
I'm trying to understand how I can do a signature capture in React Native. My App is created with create-react-native-app and Expo and I'd prefer to not have to eject the app to get this functionality to work.
Would it be possible to wrap something like this in a webview? https://github.com/szimek/signature_pad
I've also looked at this project, https://github.com/RepairShopr/react-native-signature-capture but it requires me to eject the app and use react-native link.
Looking for any advice or suggestions on how to implement this feature while keeping my project as straightforward as possible (ideally using create-react-native-app, but if that isn't possible, could someone please explain to me why?).
The way React Native works is that each component available in React Native maps to a native component in the underlying platform.
i.e. an <Image /> is an ImageView on Android and a UIImageView on iOS.
The JavaScript code itself runs on a JavaScript thread on each platform, and as you use components in React Native, a translation layer passes information from JS over the React Native bridge, which then results in the corresponding native components being created.
By default, React Native includes the following components: https://facebook.github.io/react-native/docs/components-and-apis.html#basic-components which means that only those components come out of the box in React Native. If you want other components, you have two options: either create a "composite" component, in which your JS component is written in terms of other JS components, or, if your feature needs a native component not yet exposed by React Native, write your own "native" component to expose that native functionality to your React Native code.
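For instance, a "composite" component is nothing more than JS built from the components React Native already ships with, so it runs anywhere React Native does, including inside Expo. A minimal illustrative sketch (the Avatar component here is made up for the example):

```js
// A minimal "composite" component: plain JS built only from React Native's
// built-in components, so it needs no native code and works inside Expo.
import React from 'react';
import { Image, Text, View } from 'react-native';

export default function Avatar({ uri, name }) {
  return (
    <View style={{ flexDirection: 'row', alignItems: 'center' }}>
      <Image
        source={{ uri }}
        style={{ width: 40, height: 40, borderRadius: 20 }}
      />
      <Text style={{ marginLeft: 8 }}>{name}</Text>
    </View>
  );
}
```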
The way Expo works is that they have wrapped React Native and a handful of 3rd party components and built them into their application. The reason you can't use a 3rd party native component they don't support is that, when such a component is used, the app itself doesn't have the translation code to go from JS to a native Android/iOS view.
So, to do what you're asking, you'd need to find either a "native" drawing component that Expo has included in their platform/app, or a "composite" drawing component that is built from the default React Native components (or other components Expo supports).
e.g. on Android I might build this with a Canvas view, but from what I can tell React Native doesn't support that object natively, so I would probably have to write it myself, etc.
It's hard for Expo to support every 3rd party "native" component out there because React Native is open source and it iterates so fast that most community-built components aren't always up to date or they might conflict with one another.
I am using react-native-signature-capture.
It works properly on both Android and iOS.
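Note that, as mentioned above, this library requires ejecting and react-native link. If that's acceptable, here is a rough sketch of typical usage; the prop names follow the library's README and may differ in the version you install, so double-check them:

```js
// Rough sketch of react-native-signature-capture usage (requires ejecting /
// react-native link). Verify prop names against the installed version's README.
import React from 'react';
import { View } from 'react-native';
import SignatureCapture from 'react-native-signature-capture';

export default function SignatureScreen() {
  return (
    <View style={{ flex: 1 }}>
      <SignatureCapture
        style={{ flex: 1 }}
        showNativeButtons={true}
        showTitleLabel={true}
        viewMode="portrait"
        saveImageFileInExtStorage={false}
        onSaveEvent={(result) => {
          // result.encoded is the base64 image, result.pathName is the saved file path.
          console.log('saved signature at', result.pathName);
        }}
      />
    </View>
  );
}
```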
I know it's been a while, but there is an interesting article here: https://blog.expo.io/drawing-signatures-with-expo-25d1629ca1ac
Wait, but how?
Using “expo-pixi”, you can add a component that lets you choose your brush’s color, thickness, and opacity. Then when your user lifts her finger, you get a callback. From there you can take a screenshot of the transparent view or get the raw point data if that’s what you’re looking for.
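If you want to try that approach, here is a rough sketch based on the expo-pixi example from that article; the Sketch component, its stroke props, and the takeSnapshotAsync call are taken from the library's README at the time and may have changed in newer versions:

```js
// Rough sketch of an expo-pixi signature pad; API names may have changed since.
import React from 'react';
import { View } from 'react-native';
import { Sketch } from 'expo-pixi';

export default class SignaturePad extends React.Component {
  onChangeAsync = async () => {
    // Called when the user lifts their finger; grab an image of the drawing.
    const { uri } = await this.sketch.takeSnapshotAsync();
    console.log('signature image saved at', uri);
  };

  render() {
    return (
      <View style={{ flex: 1 }}>
        <Sketch
          ref={(ref) => (this.sketch = ref)}
          style={{ flex: 1 }}
          strokeColor={0x000000}
          strokeWidth={8}
          strokeAlpha={1}
          onChange={this.onChangeAsync}
        />
      </View>
    );
  }
}
```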
The official documentation for React Native explains how to add React Native views to existing native applications. However, it does not explain how to run React Native code written in JavaScript from a native application, without having to display any React Native component. Does the framework provide a way to do this?
I think the reason is that React Native works on top of native components, not the other way round. I haven't come across anything that does what you're looking for.