Standardise taking photos with react-native-vision-camera (default wide-angle lens and a fixed 2x zoom) - react-native

I'm new to developing with React Native and couldn't make my mobile app do what I need.
I want to take a photo with 2 rules:
The camera can't switch to another lens (so only the wide-angle, not the ultra-wide-angle or the telephoto lens).
There is a fixed zoom of 2x on the wide-angle lens that can't be changed.
Step 1 worked; it now always selects the right lens (the wide-angle camera). I just don't know how to apply the zoom as a constant.
function App() {
  const devices = useCameraDevices('wide-angle-camera')
  const device = devices.back

  if (device == null) return <LoadingView />

  return (
    <Camera
      style={StyleSheet.absoluteFill}
      device={device}
    />
  )
}
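For step 2, a minimal sketch of what could work, assuming react-native-vision-camera's zoom and enableZoomGesture props (in v2 zoom is expressed in factor units, while some earlier versions used a normalized value, so verify against the docs for your installed version):

<Camera
  style={StyleSheet.absoluteFill}
  device={device}
  zoom={2}                  // fixed 2x zoom on the wide-angle lens
  enableZoomGesture={false} // no pinch gesture, so the zoom stays fixed
/>

Since zoom is a controlled prop here, nothing in the UI can change it at runtime.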

Related

Expo Three.js OrbitControls

I am trying to make a native app using Expo in which I want a plane that I can pan around and zoom in and out of, like a map. I am using Three as my 3D engine since I also need the scene to be rotatable in 3D space. As a start, I have a 3D cube rotating in my app. From what I can tell this is pretty simple in a browser using MapControls or OrbitControls, but in native I can't get either of them working, even when I import the script directly from the examples folder.
import * as THREE from 'three'
import * as ExpoTHREE from 'expo-three'
import { GLView } from 'expo-gl'
// imported directly from three's examples folder
import { OrbitControls } from 'three/examples/jsm/controls/OrbitControls'

export default function MapBuilder() {
  const onContextCreate = async gl => {
    const scene = new THREE.Scene()
    const camera = new THREE.PerspectiveCamera(
      75,
      gl.drawingBufferWidth / gl.drawingBufferHeight,
      0.1,
      1000
    )
    const renderer = new ExpoTHREE.Renderer({ gl })
    renderer.setSize(gl.drawingBufferWidth, gl.drawingBufferHeight)

    const geometry = new THREE.BoxGeometry(1, 1, 1)
    const material = new THREE.MeshNormalMaterial({ wireframe: true })
    const cube = new THREE.Mesh(geometry, material)
    scene.add(cube)

    // OrbitControls is a constructor and needs `new`; its second argument
    // expects a DOM element, which does not exist in React Native
    const controls = new OrbitControls(camera, renderer.domElement)

    camera.position.y = 0
    camera.position.x = 0
    camera.position.z = 5
    controls.update()

    const animate = () => {
      requestAnimationFrame(animate)
      cube.rotation.x += 0.02
      cube.rotation.y += 0.02
      renderer.render(scene, camera)
      controls.update()
      gl.endFrameEXP()
    }
    animate()
  }

  return (
    <GLView
      style={{ flex: 1, backgroundColor: 'black' }}
      onContextCreate={onContextCreate}
    />
  )
}
I believe the issue could be the renderer.domElement, but I don't know what to replace it with.
Any help is appreciated.
I misunderstood the question before; sorry for the wrong answer.
I also had a problem using OrbitControls in Expo today, and found that expo-three-orbit-controls works fine for me (a rough usage sketch follows after these answers). I tested it with my iPhone and an Android emulator.

Try using ExpoGraphics.View instead of GLView. I have succeeded in making a globe and running it on my iPhone by using expo-three and expo-graphics.
You can check the core of the source from here:
https://github.com/cryslub/history-expo/blob/master/ThreeScene.js
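For reference, a rough sketch of how expo-three-orbit-controls is typically wired up; the component name and props here are recalled from its README, so treat them as assumptions and double-check the package documentation:

import OrbitControlsView from 'expo-three-orbit-controls'

// Wrap the GL view so touch gestures drive the same THREE camera
// that the scene is rendered with.
<OrbitControlsView style={{ flex: 1 }} camera={camera}>
  <GLView style={{ flex: 1 }} onContextCreate={onContextCreate} />
</OrbitControlsView>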

Compass location on react-native Mapbox map

Is it possible to change the location of the compass in React-Native Mapbox GL? It currently defaults to the top-right and I have been trying to find how to move it to the top-left.
<Mapbox.MapView
  ref={component => this.map = component}
  styleURL={this.state.mapStreetView}
  zoomLevel={this.state.defaultZoom}
  centerCoordinate={[this.state.long, this.state.lat]}
  showUserLocation={true}
  onPress={this.onLocationPress}
  zoomEnabled={true}>
It's possible on iOS – since #389.
See the compassViewPosition and compassViewMargins props on MapView.
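A minimal sketch of those two props, assuming a @react-native-mapbox-gl/maps version that ships them; the exact corner-index mapping is an assumption here, so verify it against the MapView docs:

<MapboxGL.MapView
  compassEnabled={true}
  compassViewPosition={0}             // corner index (assumed: 0 = top-left)
  compassViewMargins={{ x: 8, y: 8 }} // offset from that corner
/>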
No, you can't move the compass position; you can only show or hide it with the compassEnabled property.
(https://github.com/react-native-mapbox-gl/maps/blob/master/docs/MapView.md)
There is a bug in the Android part of react-native-mapbox-gl/maps version 8.5.0: it truncates the density to an int when multiplying layout pixels by density. (For example, my devices have densities of 2.8125 and 2.75, but Mapbox counts both as 2.)
The bug is in #react-native-mapbox-gl/maps/android/rctmgl/src/main/java/com/mapbox/rctmgl/components/mapview/RCTMGLMapView.java, in the updateUISettings method, and looks like
int pixelDensity = (int)getResources().getDisplayMetrics().density;
So on the JS side you can simply compensate with a coefficient:
const androidPixelRatio = PixelRatio.get();
const androidDensityCoefficient =
  androidPixelRatio / Math.floor(androidPixelRatio);
const compassViewMargins = {
  x: layoutX * androidDensityCoefficient,
  y: layoutY * androidDensityCoefficient,
};
<MapboxGL.MapView
  {...restProps}
  compassViewMargins={compassViewMargins}
/>

Exception thrown while executing UI block: Invalid Region <center: +37.33233141, -122.03121860, span: +0.04491556, -5.73770224>

I'm building an app with react-native-maps. In testing, I've set it up with a button that toggles between two different locations (DC and Concord, NH), one of which happens to be my actual current location. I also have a "Find me!" button to go to the current location.
On the Android simulator, when I hit "Find me!" it jumps to San Francisco, which I assume is where the simulator location is set. I don't own an actual Android device.
On both the iOS simulator and my iPhone, the DC/NH buttons work, and on my iPhone when I've toggled into DC, I see my blue dot. However, on both the iOS simulator and my iPhone, when I hit "Find me!" I get this error (the coordinates below are from running on the simulator):
Exception thrown while executing UI block:
Invalid Region <center: +37.33233141, -122.03121860, span: +0.04491556, -5.73770224>
The same error happens, sensibly with different coords, on my iPhone.
It appears that Android and iOS are translating the coordinates differently, or something like that. They both read the coords I give for DC and NH the same, and appear to show the same region when I call those locations up.
By the way, I'm using Apple Maps on iOS – not ready to do the whole ejecting thing to use Google Maps.
Here's the code I use to get the current location and convert it to numbers that React Native can understand correctly:
calculateRegion(latitude, longitude, accuracy) {
  const oneDegreeOfLongitudeInMeters = 111.32;
  const circumference = 40075 / 360;
  const latitudeDelta = accuracy / oneDegreeOfLongitudeInMeters;
  const longitudeDelta = accuracy * (1 / Math.cos(latitude * circumference));
  const region = { latitude, longitude, latitudeDelta, longitudeDelta };
  this.setState({ region });
}

getLocation = () => {
  navigator.geolocation.getCurrentPosition(position => {
    const lat = position.coords.latitude;
    const long = position.coords.longitude;
    const accuracy = position.coords.accuracy;
    this.calculateRegion(lat, long, accuracy);
  });
};
Does anyone have ideas about this?
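A hedged observation, not a confirmed fix: the span in the error contains a negative longitudeDelta (-5.73770224), and the iOS map rejects regions with negative deltas. In calculateRegion above, Math.cos is fed latitude * circumference, which is thousands of "degrees" rather than an angle in radians, so its sign is effectively arbitrary. One sketch that keeps both deltas positive, assuming accuracy is in meters as the Geolocation API reports:

calculateRegion(latitude, longitude, accuracy) {
  // ~111,320 meters per degree of latitude (40,075 km circumference / 360)
  const metersPerDegree = 111320;
  const latitudeDelta = accuracy / metersPerDegree;
  // A degree of longitude shrinks by cos(latitude); convert degrees to radians first.
  const longitudeDelta =
    accuracy / (metersPerDegree * Math.cos(latitude * (Math.PI / 180)));
  // Both deltas are now positive, so the region should be accepted on iOS.
  this.setState({ region: { latitude, longitude, latitudeDelta, longitudeDelta } });
}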

Couldn't get the camera to work using expoTHREE and expoGRAPHICS to create an AR scene in a React-native project

I am developing a React Native app and I wanted to implement an AR scene using Three.js. My project was initialized with "react-native init", and then I installed the Expo modules. I created a separate component called "CamTab.js" and implemented the AR scene in it, then rendered the component from MainScreen.js. I can build and run my app without any errors, but the camera screen does not show up as intended. CamTab.js is shown below.
I added the Text "The Text" to see if it shows up, and it does.
onContextCreate = async ({ gl, scale, width, height, arSession }) => {
  // initialize renderer
  this.renderer = ExpoTHREE.createRenderer({ gl });
  this.renderer.setPixelRatio(scale);
  this.renderer.setSize(width, height);

  // initialize scene
  this.scene = new THREE.Scene();
  this.scene.background = ExpoTHREE.createARBackgroundTexture(arSession, this.renderer);

  // initialize camera
  this.camera = ExpoTHREE.createARCamera(arSession, width / scale, height / scale, 0.01, 1000);
}

onRender = (delta) => {
  this.renderer.render(this.scene, this.camera);
}

render() {
  return (
    <View>
      <Text> The Text </Text>
      <ExpoGraphics.View
        style={{ flex: 1 }}
        onContextCreate={this.onContextCreate}
        onRender={this.onRender}
        arEnabled={true}
      />
    </View>
  );
}
}
I intend to display the camera input using this component, but I get nothing, just a white screen.
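One hedged guess at the blank screen, based on React Native layout rules rather than anything AR-specific: the outer <View> has no style, and a child with flex: 1 only fills a parent that has a size of its own, so the GL view can end up laid out with zero height:

render() {
  return (
    // Give the wrapper a size; without flex: 1 here the
    // ExpoGraphics.View below can collapse to 0 pixels tall.
    <View style={{ flex: 1 }}>
      <Text> The Text </Text>
      <ExpoGraphics.View
        style={{ flex: 1 }}
        onContextCreate={this.onContextCreate}
        onRender={this.onRender}
        arEnabled={true}
      />
    </View>
  );
}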

Switch front/back camera on Android while on WebRTC call using Circuit SDK

I am able to make a direct call between a Circuit WebClient and the example SDK app at https://output.jsbin.com/posoko.
When running the SDK example on a PC with a second (USB) camera, switching between the built-in camera and the USB camera works fine. But trying the same on my Android device (Samsung Galaxy S6), the switching does not work.
My code uses navigator.mediaDevices.enumerateDevices() to get the cameras and then uses the Circuit SDK function setMediaDevices to switch to the other camera.
async function switchCam() {
  let availDevices = await navigator.mediaDevices.enumerateDevices();
  availDevices = availDevices.filter(si => si.kind === 'videoinput');
  let newDevice = availDevices[1]; // secondary camera
  await client.setMediaDevices({ video: newDevice.deviceId });
}
Can somebody explain why this doesn’t work on an Android device?
We have seen Android devices that don't allow calling navigator.getUserMedia while a video track (and therefore a stream) is still active. I tried your example above on a Pixel 2 without any issues, though.
If you remove the video track from the stream and stop the track before calling client.setMediaDevices, the switch should work.
async function switchCam() {
  const stream = await client.getLocalAudioVideoStream();
  const currTrack = stream.getVideoTracks()[0];
  console.log(`Remove and stop current track: ${currTrack.label}`);
  stream.removeTrack(currTrack);
  currTrack.stop();

  let availDevices = await navigator.mediaDevices.enumerateDevices();
  availDevices = availDevices.filter(si => si.kind === 'videoinput');
  let newDevice = availDevices[1]; // secondary camera
  await client.setMediaDevices({ video: newDevice.deviceId });
}
There is a complete switch camera example on JSBin at https://output.jsbin.com/wuniwec/