React-native-image-picker crashes in Android 11 - react-native

I have used react-native-image-picker in my project. It works fine on Android versions below 11, but on Android 11 the app crashes without showing anything in Logcat. launchImageLibrary works as expected, but launchCamera crashes the app. I have also added the permissions to the Android manifest file, i.e.
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE"/>
but still no luck.

I fixed it by adding await to the launchImageLibrary/launchCamera calls.
Here is the code for reference:
const openCamera = async () => {
  const options = {
    quality: 0.5, // quality is expected to be a value between 0 and 1
    maxWidth: 500,
    maxHeight: 500,
    includeBase64: true,
    mediaType: 'photo',
    noData: true,
  };
  await launchCamera(options, response => {
    if (response.didCancel) {
      console.log('Cancelled');
    } else if (response.errorCode) {
      console.log('Error', response.errorMessage);
    } else {
      console.log(response);
      setFilePath(response.uri);
      setBase64('data:image/png;base64,' + response.base64);
    }
  });
};

Try removing
<uses-permission android:name="android.permission.CAMERA" />
from the manifest. The image picker itself does not need the CAMERA permission, and on Android declaring it means the permission must also be granted at runtime before the camera intent can be launched; otherwise the capture call fails.
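If you want to keep the CAMERA permission declared (for example because another feature needs it), a common alternative is to request it at runtime before calling launchCamera. A minimal sketch, reusing the openCamera function from the answer above:

import { PermissionsAndroid, Platform } from 'react-native';

// Ask for the CAMERA runtime permission on Android before opening the camera.
const openCameraWithPermission = async () => {
  if (Platform.OS === 'android') {
    const result = await PermissionsAndroid.request(
      PermissionsAndroid.PERMISSIONS.CAMERA,
    );
    if (result !== PermissionsAndroid.RESULTS.GRANTED) {
      console.log('Camera permission denied');
      return;
    }
  }
  await openCamera();
};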

Related

Getting never_ask_again by default on react native when I ask for location

This is the function that asks for permission; it immediately logs never_ask_again to the console:
requestCameraPermission = async () => {
  try {
    const granted = await PermissionsAndroid.request(
      PermissionsAndroid.PERMISSIONS.ACCESS_FINE_LOCATION,
      {
        title: "Cool Photo App Camera Permission",
        message:
          "Cool Photo App needs access to your camera " +
          "so you can take awesome pictures.",
        buttonNeutral: "Ask Me Later",
        buttonNegative: "Cancel",
        buttonPositive: "OK"
      }
    );
    console.log(
      await PermissionsAndroid.request(PermissionsAndroid.PERMISSIONS.ACCESS_FINE_LOCATION)
    );
  } catch (err) {
    console.warn(err);
  }
};
This is my AndroidManifest.xml:
<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" />
<uses-permission android:name="android.permission.ACCESS_COARSE_LOCATION" />
<uses-permission android:name="android.permission.INTERNET" />

React-Native-Image-Picker: How do I restrict user to upload video more than given length?

Can I restrict a user from uploading a video longer than 300 seconds? Either longer videos should be trimmed to 300 s, or videos over 300 s should be disabled. I use the durationLimit option, but it is not working on Android. I have tried the following versions of this library:
"react-native-image-picker": "^2.3.4"
Then
"react-native-image-picker": "^3.5.0",
Then
"react-native-image-picker": "^4.0.2",
None of them worked for me.
import ImagePicker from "react-native-image-picker";

const uploadVideo = async () => {
  activateKeepAwake();
  let options = {
    title: "Select video",
    takePhotoButtonTitle: null,
    mediaType: "video",
    path: "video",
    quality: 1,
    videoQuality: "normal",
    durationLimit: 300,
    allowsEditing: true,
  };
  ImagePicker.showImagePicker(options, async (response) => {
    if (response.didCancel) {
      console.log("User cancelled image picker");
    } else if (response.error) {
      console.log("ImagePicker Error: ", response.error);
    } else if (response.customButton) {
      console.log("User tapped custom button: ", response.customButton);
    } else {
      if (response && response.uri) {
        let selectedUri;
        let videoFilePath;
        let selectedFileUri = response.uri;
        setVideoLoader(true);
        setModalAddVisible(false);
        if (
          Platform.OS === "ios" &&
          (selectedFileUri.includes(".mov") ||
            selectedFileUri.includes(".MOV"))
        ) {
          videoFilePath = await handleConvertToMP4(selectedFileUri);
          selectedUri = "file://" + videoFilePath.path;
        } else {
          selectedUri =
            Platform.OS === "ios" // note: Platform.OS, not Platform.os
              ? selectedFileUri
              : "file://" + response.path;
        }
        setVideoSource(null);
        setVideoSource(selectedUri);
        await createThumbnailForVideos(selectedUri, 1);
        var filename = selectedUri.substring(
          selectedUri.lastIndexOf("/") + 1,
          selectedUri.length
        );
        const file = {
          uri: selectedUri,
          name: selectedGroup.id + "-dev-" + filename,
        };
        uploadVideoOnS3(file, "video");
      }
    }
  });
};
Here are my Android permissions:
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE"/>
<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" />
According to the official documentation, the durationLimit option works for already recorded videos but not for recording a live video. Reference: https://github.com/react-native-image-picker/react-native-image-picker/issues/1738
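If durationLimit cannot be relied on for your case, one workaround is to validate the length of the selected video yourself and reject anything over the limit before uploading. A minimal sketch; getVideoDuration is a hypothetical helper here (newer releases of react-native-image-picker also report a duration field on the picked asset, which could replace it):

const MAX_DURATION_SECONDS = 300;

// Reject videos longer than the limit before starting the S3 upload.
// getVideoDuration is a placeholder for any metadata helper that returns
// the video length in seconds for a given file URI.
const validateAndUpload = async (selectedUri, filename) => {
  const durationSeconds = await getVideoDuration(selectedUri);
  if (durationSeconds > MAX_DURATION_SECONDS) {
    console.log("Video is longer than 300 seconds, skipping upload");
    return false;
  }
  await uploadVideoOnS3({ uri: selectedUri, name: filename }, "video");
  return true;
};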

How do I overcome "Permission Denial....obtain access using ACTION_OPEN_DOCUMENT or related APIs"?

I'm using react-native-firebase and react-native-document-picker and I'm trying to follow the face detection tutorial.
Currently getting the following error despite having read access through PermissionsAndroid:
Permission Denial: reading com.android.providers.media.MediaDocumentsProvider uri [uri] from pid=4746, uid=10135 requires that you obtain access using ACTION_OPEN_DOCUMENT or related APIs
I am able to display the user's selected image on the screen, but the react-native-firebase function does not seem to have permission. The error happens at this call: const faces = await vision().faceDetectorProcessImage(localPath);.
Any suggestions on how to give the face detection function access, or what am I doing wrong?
My AndroidManifest.xml file contains the following:
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
Here is all the code in that component for reference:
import React, { useState } from 'react';
import { Button, Text, Image, PermissionsAndroid } from 'react-native';
import vision, { VisionFaceContourType } from '@react-native-firebase/ml-vision';
import DocumentPicker from 'react-native-document-picker';

async function processFaces(localPath) {
  console.log(localPath);
  const faces = await vision().faceDetectorProcessImage(localPath);
  console.log("Got faces");
  faces.forEach(face => {
    console.log('Head rotation on Y axis: ', face.headEulerAngleY);
    console.log('Head rotation on Z axis: ', face.headEulerAngleZ);
    console.log('Left eye open probability: ', face.leftEyeOpenProbability);
    console.log('Right eye open probability: ', face.rightEyeOpenProbability);
    console.log('Smiling probability: ', face.smilingProbability);
    face.faceContours.forEach(contour => {
      if (contour.type === VisionFaceContourType.FACE) {
        console.log('Face outline points: ', contour.points);
      }
    });
  });
}

async function pickFile() {
  // Pick a single file
  try {
    const res = await DocumentPicker.pick({
      type: [DocumentPicker.types.images],
    });
    console.log(
      res.uri,
      res.type, // mime type
      res.name,
      res.size
    );
    return res;
  } catch (err) {
    if (DocumentPicker.isCancel(err)) {
      // User cancelled the picker, exit any dialogs or menus and move on
      console.log("User cancelled");
    } else {
      console.log("Error picking file or processing faces");
      throw err;
    }
  }
}

const requestPermission = async () => {
  try {
    const granted = await PermissionsAndroid.request(
      PermissionsAndroid.PERMISSIONS.READ_EXTERNAL_STORAGE,
      {
        title: "Files Permission",
        message:
          "App needs access to your files " +
          "so you can run face detection.",
        buttonNeutral: "Ask Me Later",
        buttonNegative: "Cancel",
        buttonPositive: "OK"
      }
    );
    if (granted === PermissionsAndroid.RESULTS.GRANTED) {
      console.log("We can now read files");
    } else {
      console.log("File read permission denied");
    }
    return granted;
  } catch (err) {
    console.warn(err);
  }
};

function FaceDetectionScreen({ navigation }) {
  const [image, setImage] = useState("");
  return (
    <>
      <Text>This is the Face detection screen.</Text>
      <Button
        title="Select Image to detect faces"
        onPress={async () => {
          const permission = await requestPermission();
          if (permission === PermissionsAndroid.RESULTS.GRANTED) {
            const pickedImage = await pickFile();
            const pickedImageUri = pickedImage.uri;
            setImage(pickedImageUri);
            processFaces(pickedImageUri).then(() =>
              console.log('Finished processing file.')
            );
          }
        }}
      />
      <Image style={{ flex: 1 }} source={{ uri: image }} />
    </>
  );
}

export default FaceDetectionScreen;
Thanks to a comment on a GitHub issue I was able to get this working by changing the first lines of processFaces to:
async function processFaces(contentUri) {
  const stat = await RNFetchBlob.fs.stat(contentUri);
  const faces = await vision().faceDetectorProcessImage(stat.path);
after adding import RNFetchBlob from 'rn-fetch-blob' (the rn-fetch-blob package).
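An alternative worth mentioning: depending on your version of react-native-document-picker, the pick call accepts a copyTo option that copies the selected document into your app's own storage and returns a fileCopyUri, which other native modules can read without hitting the provider's permission check. A sketch, assuming a version that supports copyTo:

import DocumentPicker from 'react-native-document-picker';

// Ask the picker to copy the selected image into the app's cache directory,
// then hand the local copy (fileCopyUri) to the face detector instead of the
// original content:// URI.
async function pickLocalImage() {
  const res = await DocumentPicker.pick({
    type: [DocumentPicker.types.images],
    copyTo: 'cachesDirectory',
  });
  return res.fileCopyUri;
}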

Image to base64 in react native

How do I convert a local image to base64 in React Native and upload it to a server? Please help me solve this. I have already tried the image-to-base64 npm package.
With the Expo API:
import { ImageManipulator } from 'expo';
const response = await ImageManipulator.manipulateAsync("file to local path", [], { base64: true })
console.log('base64res' + JSON.stringify(response));
You can get a base64 string of an image using the image picker in React Native, for example for profile pictures.
Here is a piece of code that returns a base64 string using the image picker:
selectPhotoTapped() {
  const options = {
    quality: 1.0,
    maxWidth: 500,
    maxHeight: 500,
    storageOptions: {
      skipBackup: true,
    },
  };
  ImagePicker.showImagePicker(options, response => {
    console.log('Response = ', response.data);
    if (response.didCancel) {
      console.log('User cancelled photo picker');
    } else if (response.error) {
      console.log('ImagePicker Error: ', response.error);
    } else if (response.customButton) {
      console.log('User tapped custom button: ', response.customButton);
    } else {
      // let source = { uri: response.uri }; <-- here you can get the uri of the image
      // You can also display the image using the base64 data:
      let source = 'data:image/jpeg;base64,' + response.data; // <-- image as a base64 string
      this.setState({
        avatarSource: source,
      });
    }
  });
}
After that you can take an image from your library via an onPress event, but first you have to grant permission to access local storage on Android or iOS.
See the installation instructions for react-native-image-picker for setup details.
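For newer versions of react-native-image-picker (where showImagePicker no longer exists), a similar result can be had with the includeBase64 option. The exact response shape and whether a promise is returned depend on your version, so treat this as a sketch:

import { launchImageLibrary } from 'react-native-image-picker';

// Ask the picker to return the image as base64 alongside the file URI.
const pickImageAsBase64 = async () => {
  const result = await launchImageLibrary({
    mediaType: 'photo',
    includeBase64: true,
    maxWidth: 500,
    maxHeight: 500,
  });
  if (result.didCancel || !result.assets || result.assets.length === 0) {
    return null;
  }
  const asset = result.assets[0];
  return 'data:image/jpeg;base64,' + asset.base64;
};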
Using react-native-image-base64:
import ImgToBase64 from 'react-native-image-base64';

ImgToBase64.getBase64String('file://path/to/file')
  .then(base64String => {
    // Send the base64String to the server
  })
  .catch(err => console.log(err));
You can also do this with react-native-fs:
import RNFS from 'react-native-fs';

RNFS.readFile(this.state.imagePath, 'base64')
  .then(res => {
    console.log(res);
  });

React-Native Camera error - No suitable URL request handler found for assets-library

I am creating a React Native app that lets me take a picture using the camera and upload it to AWS S3.
I am able to take the picture and save the image to my iPhone camera roll, but when I try to upload the image I get the error No suitable URL request handler found for assets-library://asset.
Here is the code snippet:
import Camera from 'react-native-camera';
import { RNS3 } from "react-native-aws3";

class NCamera extends React.Component {
  takePicture() {
    this.camera.capture()
      .then((data) => {
        const file = { uri: data.path, name: 'image.png', type: 'image/png' };
        const options = {
          keyPrefix: "images/",
          bucket: "my-bucket-name",
          region: "us-east-1",
          accessKey: "key",
          secretKey: "secret-key",
          successActionStatus: 201
        };
        RNS3.put(file, options)
          .then(response => {
            if (response.status !== 201)
              console.log('Error uploading file to S3');
            else
              console.log(response.body);
          })
          .catch(error => console.log(`Error uploading: ${error}`));
      })
      .catch(err => console.log(err));
  }

  render() {
    return (
      <Camera
        ref={(cam) => {
          this.camera = cam;
        }}
        style={styles.preview}
        aspect={Camera.constants.Aspect.fill}>
        <Text style={styles.capture} onPress={this.takePicture.bind(this)}>[CAPTURE]</Text>
      </Camera>
    );
  }
}
Solution
Adding libRCTCameraRoll.a resolved the issue. Here are the steps:
1. Open RCTCameraRoll.xcodeproj in Xcode. The project can be found under node_modules/react-native/Libraries/CameraRoll.
2. Under Build Phases, add libRCTCameraRoll.a.
If this is on iOS, I think you need to link libRCTCamera.a in Xcode so that the file URL can be resolved properly. See this Medium article for more details on that.
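Another approach that avoids assets-library URLs altogether is to ask the camera for a plain file path instead of a camera-roll asset. With the old react-native-camera API used above, that means setting the captureTarget prop to disk, so data.path becomes a regular file URL that react-native-aws3 can read. A hedged sketch based on the render method above:

// Capture to a temporary file on disk instead of the camera roll, so the
// upload library receives a file path it can actually open.
<Camera
  ref={(cam) => { this.camera = cam; }}
  style={styles.preview}
  captureTarget={Camera.constants.CaptureTarget.disk}
  aspect={Camera.constants.Aspect.fill}>
  <Text style={styles.capture} onPress={this.takePicture.bind(this)}>[CAPTURE]</Text>
</Camera>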