When using geolocation in react-native, or perhaps any framework that uses geolocation, is it possible to get the source of the coordinates? That is, can I test whether the coordinates came from the GPS satellites, cell tower, or WiFi?
Thanks.
This is currently not possible. See the docs: https://facebook.github.io/react-native/docs/geolocation.html
I am using react-native-maps in my React Native project and I want the map to show only Germany; the user should be able to zoom in and out up to a certain level, but always stay within Germany. I don't want the user to be able to navigate to other countries.
I checked the documentation but I didn't find anything related to my problem. Is there any way to achieve this? Thanks!
As far as I'm aware, you'll need to 'bodge' this, as it isn't really built-in functionality of the native maps.
You can limit the max zoom level to ensure that they cannot zoom past a certain point - this will help with zooming. As for scrolling outside of a certain area - you can hook into the onDrag event and check the lat and lon values. If the coordinates are outside a specific boundary, you can take the user back to a specified location, or present an error to the user. Geofencing could also be a part of this.
Apart from theoretical - I'm afraid I've never done this, so cannot show you any implementation.
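For what it's worth, here is a minimal sketch of the boundary-check idea, assuming react-native-maps' minZoomLevel/maxZoomLevel props, the onRegionChangeComplete callback and the animateToRegion ref method (the bounding-box values for Germany are rough illustrations, not exact borders):

import React, { useRef } from 'react';
import MapView, { Region } from 'react-native-maps';

// Rough bounding box around Germany -- illustrative values only.
const BOUNDS = { latMin: 47.3, latMax: 55.1, lonMin: 5.9, lonMax: 15.0 };
const GERMANY: Region = { latitude: 51.16, longitude: 10.45, latitudeDelta: 8, longitudeDelta: 10 };

export default function GermanyOnlyMap() {
  const mapRef = useRef<MapView>(null);

  // If the user pans outside the bounding box, snap the map back.
  const onRegionChangeComplete = (region: Region) => {
    const outside =
      region.latitude < BOUNDS.latMin || region.latitude > BOUNDS.latMax ||
      region.longitude < BOUNDS.lonMin || region.longitude > BOUNDS.lonMax;
    if (outside) {
      mapRef.current?.animateToRegion(GERMANY, 300);
    }
  };

  return (
    <MapView
      style={{ flex: 1 }}
      ref={mapRef}
      initialRegion={GERMANY}
      minZoomLevel={5}   // stop zooming out past roughly country level
      maxZoomLevel={15}  // optional cap on zooming in
      onRegionChangeComplete={onRegionChangeComplete}
    />
  );
}

If I remember correctly, newer versions of react-native-maps also expose a setMapBoundaries method (Google Maps provider), which may give you a cleaner way to clamp panning than snapping back manually.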
I'm working on an application for Android & iOS to show points of interest over the camera. ARKit & ARCore have poor device compatibility nowadays.
Could you recommend a library to do this? If it comes with an example, even better! I know viro-media, but I don't understand how to do this using that library.
I don't want 3D models, just markers over the camera, similar to the attachment image.
To do this with Viro React -- and in AR in general -- the trick is to recognize that there are two coordinate systems:
The local coordinate system of your device, which we'll call 'AR space'. In Viro, this is centered at the user's initial position when the application starts, and is in meters.
Geographic coordinates (latitude and longitude).
To position the overlays, you have to convert your content from geographic coordinates into AR space. This is a two-step process. First project the spherical geographic coordinates onto a 2D plane -- the Web Mercator is great for this. Then translate the projected coordinates by the device's initial projected position.
The device's initial projected position can be derived by projecting its initial geographic position. In Viro React, you can use the Geolocation module to grab this when the user starts the app.
Finally, you'll need to do a similar transformation for the user's bearing: converting from compass direction to device orientation in AR space.
For this to work well you'll likely have to figure out how to handle inaccurate geolocation lookups (e.g. what happens if the location retrieved from the device is inaccurate), and may also have to account for drift: over time the two coordinate systems may start to fall out of sync.
The last part, creating the info cards, is easy with Viro -- you either pre-bake the images with text and use ViroImage, or if the cards need to be more dynamic you can use a ViroFlexView.
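A minimal sketch of the geographic-to-AR-space conversion described above, assuming a spherical Web Mercator projection and an AR world aligned so that -Z points true north (the function names are illustrative, not Viro APIs):

const EARTH_RADIUS = 6378137; // meters (WGS84, spherical Web Mercator)

// Project lat/lon (degrees) onto the Web Mercator plane (meters).
function latLonToMercator(lat: number, lon: number) {
  const x = EARTH_RADIUS * (lon * Math.PI / 180);
  const y = EARTH_RADIUS * Math.log(Math.tan(Math.PI / 4 + (lat * Math.PI / 180) / 2));
  return { x, y };
}

// Position of a point of interest relative to the device's initial location,
// returned as an [x, y, z] array usable as a ViroNode position.
function poiToARPosition(
  deviceLat: number, deviceLon: number,
  poiLat: number, poiLon: number,
): [number, number, number] {
  const device = latLonToMercator(deviceLat, deviceLon);
  const poi = latLonToMercator(poiLat, poiLon);
  const east = poi.x - device.x;   // +X in AR space
  const north = poi.y - device.y;  // maps to -Z in AR space
  return [east, 0, -north];        // y = 0 keeps the card at eye level
}

Note that Web Mercator stretches distances by roughly 1/cos(latitude), so for locations far from the equator you may want to scale the east/north offsets by cos(latitude) to keep marker distances realistic.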
I am also interested in this one and I'm trying out ViroReact!
I find it a bit difficult to understand how to make this work once the lats and longs have been converted to x-y values. What should the z-value be?
Let's say you have the lat-lon coordinates [59, 10] as the user location, and you want to show where [59, 11] is relative to your location. How do you build that in a ViroARScene?
<ViroNode position={userLocationFromLatLonCartesian}>
  <ViroBox position={poiLocationFromLatLonToCartesian} />
</ViroNode>
So how do you calculate the scale, position and rotation, so that the object will be visible?
It seems like https://github.com/proj4js/proj4js is a library that could provide the conversions from lat/lon to x-y values, for example:
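Here is a small sketch with proj4js, using its built-in EPSG:4326 (WGS84) and EPSG:3857 (Web Mercator) definitions and the [59, 10] / [59, 11] coordinates above (note that proj4 expects [lon, lat] order):

import proj4 from 'proj4';

// [lon, lat] -> [x, y] in meters on the Web Mercator plane.
const user = proj4('EPSG:4326', 'EPSG:3857', [10, 59]);
const poi  = proj4('EPSG:4326', 'EPSG:3857', [11, 59]);

// Offset of the POI from the user, in meters east/north -- this is what would
// feed the userLocationFromLatLonCartesian / poiLocationFromLatLonToCartesian values.
const east  = poi[0] - user[0];
const north = poi[1] - user[1];
// In a ViroARScene (with the world aligned to true north) you might then use
// something like position={[east, 0, -north]}, keeping y at roughly eye level.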
I found that both the Android and iOS AR SDKs support location-based AR views. References:
https://developers.google.com/ar/develop/ios/geospatial/quickstart and https://developer.apple.com/documentation/arkit/argeoanchor
I'm working on a compass app and need to find the current direction to a particular point (I have its coordinates), or at least to north. How can I do that?
Thanks.
For those who faced the same problem:
1) Get access to the magnetometer and accelerometer of the device. For that you can either write your own react-native <-> Java/Swift/ObjC bridge or use one of the libraries like these:
https://github.com/pwmckenna/react-native-motion-manager
https://github.com/kprimice/react-native-sensor-manager
2) Convert the magnetometer data to a compass heading.
Some information is here:
https://cdn-shop.adafruit.com/datasheets/AN203_Compass_Heading_Using_Magnetometers.pdf
3) If you need to find the direction to a particular point, first get your own coordinates; combining them with the point's coordinates, you can then work out the azimuth (bearing), as in the sketch below.
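A minimal sketch of that azimuth calculation (the standard initial-bearing formula, in degrees clockwise from true north):

function bearingTo(lat1: number, lon1: number, lat2: number, lon2: number): number {
  const toRad = (d: number) => (d * Math.PI) / 180;
  const phi1 = toRad(lat1);
  const phi2 = toRad(lat2);
  const dLon = toRad(lon2 - lon1);

  const y = Math.sin(dLon) * Math.cos(phi2);
  const x = Math.cos(phi1) * Math.sin(phi2) -
            Math.sin(phi1) * Math.cos(phi2) * Math.cos(dLon);
  const bearing = (Math.atan2(y, x) * 180) / Math.PI;
  return (bearing + 360) % 360; // normalize to 0..360 degrees
}

// The direction to draw on screen is then (bearingTo(...) - compassHeading).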
This is not supported out of the box, but you can simply render a polyline annotation from your direction result.
I am trying to create a custom map for iOS. For the time being I am using OpenStreetMap tile images for the custom map app.
Now what I want is to convert a pixel point to a latitude and longitude value at a particular zoom level. I can already find the tile (pixel point) in which I clicked; I need the lat and long of that particular point. How can this be calculated? Is there a general formula to find the lat and long from a pixel point?
Thanks in advance
Do you know about route-me (https://github.com/route-me/route-me)? It is an open source iOS map library. I use a fork of this library found at https://github.com/Alpstein/route-me. These libraries provide the projections you are looking for and might even provide other functionalities you would have to implement yourself otherwise.
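If you do want the general formula rather than a library, the standard OSM "slippy map" math for 256x256 tiles converts global pixel coordinates at a given zoom level back to lat/lon. A sketch (shown in TypeScript, but it's the same arithmetic in Objective-C):

const TILE_SIZE = 256;

// Global pixel coordinates (px, py) at a zoom level -> lat/lon in degrees.
function pixelToLatLon(px: number, py: number, zoom: number) {
  const worldSize = TILE_SIZE * Math.pow(2, zoom); // map width/height in pixels at this zoom
  const lon = (px / worldSize) * 360 - 180;
  const latRad = Math.atan(Math.sinh(Math.PI * (1 - (2 * py) / worldSize)));
  return { lat: (latRad * 180) / Math.PI, lon };
}

// Example: the center of the world map, (worldSize / 2, worldSize / 2), maps to { lat: 0, lon: 0 }.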
I am trying to develop an iPhone application which needs to show a 360-degree video and rotate the video as the phone moves. How can I do this? Is it possible to do this with a normal MPMoviePlayerController?
I don't think you can do this with a normal MPMoviePlayerController, but there are several libraries out there to achieve this. Have a look here:
PanoramaGL
Panorama 360
They work with OpenGL and you can embed them in your Objective-C code.
EDIT:
As @Mangesh Vyas kindly pointed out, those are intended for use with still images only. However, they might be a suitable starting point for embedding video as well, if you modify the code accordingly. They already handle direction, the accelerometer etc., so you don't have to implement all that yourself.