How to display Image from camera roll url using react-native-camera? - react-native

I use react-native-camera for capturing images and saving them to the camera roll (CaptureTarget) on iOS devices. On capture, I get an image path of the following form:
"assets-library://asset/asset.JPG?id=8B55D3E5-11CE-439A-8FC6-D55EB1A88D7E&ext=JPG"
How can I use this path to display the image in an Image component (from react-native)?
Previously I was using disk as the CaptureTarget option and could show that image URL in an Image component, but now the requirement is to save the image to the camera roll.

I have used RNFetchBlob to get base64 data from the "assets-library://.." URL; my capture function is:
this.camera.capture()
  .then((data) => {
    RNFetchBlob.fs.readFile(data.path, 'base64')
      .then((base64data) => {
        let base64Image = `data:image/jpeg;base64,${base64data}`;
        this.props.addImagesToUntagged(data.path);
      });
  });
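For the display itself, a minimal sketch building on the base64 route above (`toImageSource` is a hypothetical helper, not part of the original code): React Native's Image accepts a data URI as `source.uri`, and on iOS it can also render the assets-library:// path directly.

```javascript
// Wrap the raw base64 string in a source object for <Image source={...} />.
function toImageSource(base64data, mime = 'image/jpeg') {
  return { uri: `data:${mime};base64,${base64data}` };
}

// Usage inside render():
//   <Image style={{ width: 200, height: 200 }} source={toImageSource(base64data)} />
// or, skipping the base64 round-trip entirely (iOS):
//   <Image style={{ width: 200, height: 200 }} source={{ uri: data.path }} />
```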
After that I provide the user some in-app functionality on this base64 data, and when I finally need to send it to the S3 server, I use axios & RNFetchBlob. The following code gives me a signed URL for S3:
axios.get(ENDPOINT_TO_GET_SIGNED_URL, {
  params: {
    'file-name': file.name,
    'file-type': file.type,
    'content-type': 'evidence'
  }
})
  .then(function (result) {
    returnUrl = result.data.url;
    var signedUrl = result.data.signedRequest;
    return uploadFile(file, signedUrl);
  })
and in my uploadFile function I upload images with the following code:
RNFetchBlob.fetch('PUT', signedUrl,
  { 'Content-Type': file.type },
  RNFetchBlob.wrap(file.uri)
)
  .then(resolve);

Related

In an Expo React Native app, how to pick a photo, store it locally on the phone, and later upload it to a Node.js service?

I'm building a React Native app with Expo, and I want to include the following workflow: the user takes a picture (either with the camera or by picking one from the phone's gallery), which is stored locally on the phone until the user uploads it to a backend service at some later time.
I'm pretty stuck and would appreciate any pointers.
Here is what I have:
I use expo-image-picker to pick a photo from the phone's gallery:
const photo = await launchImageLibraryAsync({
  mediaTypes: ImagePicker.MediaTypeOptions.All,
  allowsEditing: true,
  base64: true,
  quality: 1,
});
Then I store the photo locally as a Base64 string using expo-file-system:
const location = `${FileSystem.documentDirectory}$(unknown)`;
await FileSystem.writeAsStringAsync(location, photo.base64, {
  encoding: FileSystem.EncodingType.Base64
});
I keep information about the storage location, file name, and mime type in an image object. Later, I try to upload that image to my own Node.js backend service with axios, sending the following multi-part form data:
const formdata = new FormData();
formdata.append('file', {
  path: image.location,
  name: image.filename,
  type: image.mimetype
} as any);
The backend service that receives the photo uses multer:
const multer = require('multer');
const upload = multer({ storage: multer.memoryStorage() });

router.post('/photo', upload.single('file'), async (request, response) => {
  console.log(request.file);
  ...
});
What arrives at my service is the following:
{
  fieldname: 'file',
  originalname: '1653135701413.jpg',
  encoding: '7bit',
  mimetype: 'image/jpg',
  buffer: <Buffer >,
  size: 0
}
So no data is transferred. (It seems to be properly stored on the phone, because if I use the Expo filesystem's readAsStringAsync, I do get a pretty long Base64 string.)
What am I missing? Do I need to send the image as a blob? (If I try to do so, then request.file is undefined, so I guess I'm doing something wrong there as well.)
And in general, is there a better way to achieve this workflow in a managed React Native app? (For example, is it ok to store the image as a Base64 string, or would it be better to do this differently?)
Edit:
In the form data, I changed path to uri, and I switched from axios to fetch. Now the backend finally receives the image data. 🥳
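The working combination from that edit can be sketched as follows (`buildUploadForm` is a hypothetical helper, and UPLOAD_URL stands in for the backend endpoint). The decisive detail is the `uri` field name: React Native serializes `{ uri, name, type }` file parts, while a `path` key carries no file contents, which matches the empty buffer multer received.

```javascript
// Build the multipart body with `uri` (not `path`) so React Native
// attaches the actual file contents.
function buildUploadForm(image) {
  const formdata = new FormData();
  formdata.append('file', {
    uri: image.location,  // `path:` here is what produced the empty buffer
    name: image.filename,
    type: image.mimetype,
  });
  return formdata;
}

// Then let fetch set the multipart boundary itself:
//   await fetch(UPLOAD_URL, { method: 'POST', body: buildUploadForm(image) });
```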

Upload Video URI from React-Native Picker to AWS S3 Server

I am trying to upload a video from my iOS device library to S3 using axios and a pre-signed URL. I've determined the axios/S3 part is working great, but the issue is coming from the URI I receive from 'react-native-image-picker'.
When I record a video in React Native, the video URI uploads fine to S3, but when I grab a video from my photo library, it uploads to S3 but is not a video file.
I grab the video uri from my ios device library using react-native-image-picker.
import {launchImageLibrary} from 'react-native-image-picker';
launchImageLibrary({mediaType: "video"}, ({assets}) => {
  let {uri} = assets[0]; // uri = /var/mobile/Containers/Data/Application/123/tmp/IMG_1779.mov
  uploadFile(uri);
});
and then I attempt to upload the uri to S3
// save to S3 using presignedPost
async uploadFile(uri) {
  var formData = new FormData();
  ...
  formData.append("file", { uri });
  await axios.post(presignedPost.url, formData, { 'Content-Type': 'multipart/form-data' });
}
The function is successful, but when I look in AWS the file is just a text file with a bunch of random characters.
The good news is this same exact uploadFile function works if I record a video with react-native-camera:
import { RNCamera } from 'react-native-camera';
async stopRecord() {
  let camera = cameraRef.current; // grab camera <RNCamera ref={cameraRef} />
  let {uri = null} = await camera.recordAsync(); // uri = /var/mobile/Containers/Data/Application/123/Library/Caches/Camera/123.mov
  uploadFile(uri);
}
The only difference I can see is that the URI after recording a video is stored in cache. Therefore, I attempted to use 'react-native-fs' to grab the picker URI, save it to cache, and then upload the cached file, but I got the same error (the file uploads to S3 but is not a video).
import * as RNFS from 'react-native-fs';

async uploadFileTwo(uri) {
  let base64Data = await RNFS.readFile(uri, 'base64');
  let cachePath = RNFS.CachesDirectoryPath + "/" + fileName + ".mov";
  await RNFS.writeFile(cachePath, base64Data, 'base64');
  var formData = new FormData();
  ...
  formData.append("file", { uri: cachePath });
  await axios.post(presignedPost.url, formData, { 'Content-Type': 'multipart/form-data' });
}
So now I am out of options. Why would the 'react-native-camera' uri work great, but the 'react-native-image-picker' uri doesn't?
Try following the tutorial for uploading a video file.
I think this tutorial meets all your requirements, as it uses react-native-image-picker to upload the video file(s).
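For reference, the shape that usually fixes the picker case looks like this (a hedged sketch; `buildVideoForm` and the pre-signed field handling are illustrative, not from the question): React Native's FormData needs `name` and `type` next to `uri`, otherwise the part is serialized as plain text, which matches the "text file with a bunch of random characters" symptom.

```javascript
// A pre-signed POST needs its fields appended before the file part, and the
// file part needs name/type so React Native uploads binary video data.
function buildVideoForm(uri, fileName, presignedFields = {}) {
  const formData = new FormData();
  Object.entries(presignedFields).forEach(([key, value]) => {
    formData.append(key, value); // key, policy, signature, ...
  });
  formData.append('file', {
    uri,
    name: fileName,          // e.g. 'IMG_1779.mov'
    type: 'video/quicktime', // .mov from the iOS library
  });
  return formData;
}

// Usage (presignedPost.fields is an assumption about the backend response):
//   await axios.post(presignedPost.url,
//     buildVideoForm(uri, 'IMG_1779.mov', presignedPost.fields),
//     { headers: { 'Content-Type': 'multipart/form-data' } });
```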

converting selected document from DOCUMENT PICKER in react native to base64 using RNFetchBlob

I have a problem with my React Native app. I would just like to ask if there are any ways that RNFetchBlob will accept a data URI from documentPicker instead of a web URL? I just need to convert the selected file from the document picker to base64. Could anyone help me?
RNFetchBlob.config({ fileCache: true })
  .fetch("GET", 'http://www.africau.edu/images/default/sample.pdf') // Replace the web URL with the dataURI from documentPicker
  // the file is now downloaded to the device's storage
  .then(resp => {
    // the file path can be used directly with the Image component
    return resp.readFile("base64");
  })
  .then(base64Data => {
    console.log('base64Data', base64Data);
  });
If you are not particularly looking for base64-encoded data but want to obtain the actual blob, you can use fetch without going through the base64 bridge:
const fetchResponse = await fetch(at[i].uri);
const blob = await fetchResponse.blob();
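For a local file the network stack isn't needed at all: RNFetchBlob.fs.readFile can read the picked file straight into base64. A sketch (`toFsPath` is a hypothetical helper, since the `fs` APIs expect a plain path without the `file://` scheme):

```javascript
// Strip the file:// scheme -- RNFetchBlob's fs APIs expect a plain path.
function toFsPath(uri) {
  return uri.startsWith('file://') ? uri.slice('file://'.length) : uri;
}

// Usage with rn-fetch-blob (import RNFetchBlob from 'rn-fetch-blob'):
//   const base64Data = await RNFetchBlob.fs.readFile(toFsPath(pickedFile.uri), 'base64');
```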

React Native Uploading Video to YouTube (Stuck at Processing)

I am attempting to upload video files to YouTube using their v3 API and their MediaUploader. It works in my ReactJS application, but not in my React Native application. When uploading via React Native, the upload completes, then stalls at 100%. In my YouTube account, I can see the new video file, but it is stuck at "Video is still being processed."
I believe the issue may be that I need to send a video file and not an object with a video URI, but I don't know how to get around that.
I am using the YouTube MediaUploader from the CORS example at https://github.com/youtube/api-samples/blob/master/javascript/cors_upload.js I am using an OAuth 2.0 client Id, and this setup works correctly when using the ReactJS app via my website. I am using React Native Expo with Camera, which returns me an Object with a URI, for example:
Video File: Object {
  "uri": "file:///var/mobile/Containers/Data/Application/353A7969-E2A8-4C80-B641-C80B2B029555/Library/Caches/ExponentExperienceData/%2540dj_walksalot%252Fwandereo/Camera/E971DFEC-AB3E-4B6D-892F-9027AFE47A1A.mov",
}
This file can be viewed in the application, and I can even successfully send this to my server for playback on the web app and in the React Native app. However, sending this object in the MediaUploader does not work. It will take an appropriate amount of time to upload, but then sits at 100%, while my YouTube account will show it has received the video with the correct metadata, but the video itself remains stuck at "Video is still being processed."
video_file: Object {
  "uri": "file:///var/mobile/Containers/Data/Application/353A7969-E2A8-4C80-B641-C80B2B029555/Library/Caches/ExponentExperienceData/%2540dj_walksalot%252Fwandereo/Camera/E971DFEC-AB3E-4B6D-892F-9027AFE47A1A.mov",
}
export const uploadToYouTube = (access_token, video_file, metadata) => async (dispatch) => {
  ...cors_upload...
  var uploader = new MediaUploader({
    baseUrl: `https://www.googleapis.com/upload/youtube/v3/videos?part=snippet%2Cstatus&key=API_KEY`,
    file: video_file,
    token: access_token,
    metadata: metadata,
    contentType: 'video/quicktime',
    // contentType: 'application/octet-stream', // "video/*"
    params: {
      part: Object.keys(metadata).join(',')
    },
    onError: function(data) {
      let err = JSON.parse(data);
      dispatch(returnErrors(err.message, err.code));
      console.log('Error: ', err);
    },
    onProgress: function(progressEvent) {
      let percentCompleted = Math.round((progressEvent.loaded * 100) / progressEvent.total);
      dispatch({
        type: UPLOAD_PROGRESS,
        payload: percentCompleted
      });
    },
    onComplete: function(data) {
      console.log('Complete');
      let responseData = JSON.parse(data);
      dispatch({
        type: UPLOAD_YOUTUBE_VIDEO,
        payload: responseData
      });
      dispatch({
        type: UPLOAD_PROGRESS,
        payload: 0
      });
    }
  });
  uploader.upload();
}
Similar to my currently-working web app, after completing the upload, the "onComplete" function should fire, and YouTube should process the video. This does not happen. I believe it's because I'm attaching an object with a URI and not the actual file.
I was able to solve this from a post at Expert Mill by Joe Edgar at https://www.expertmill.com/2018/10/19/using-and-uploading-dynamically-created-local-files-in-react-native-and-expo/
By using fetch and .blob() I was able to convert the URI object to a data object and upload. Additional code:
const file = await fetch(video_file.uri);
const file_blob = await file.blob();
No need to install RNFetchBlob since this is in the Expo SDK.
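The conversion step described above can be sketched as follows (assuming MediaUploader accepts a Blob, as it does in the browser; `uriToBlob` is a hypothetical helper name):

```javascript
// fetch() on a local uri resolves to a response whose .blob() holds the
// real file bytes, which the uploader can then send as the request body.
async function uriToBlob(uri) {
  const response = await fetch(uri);
  return response.blob();
}

// const file_blob = await uriToBlob(video_file.uri);
// new MediaUploader({ file: file_blob, /* ...as above... */ }).upload();
```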

Using Fetch instead of XMLHttpRequest in React Native

I was trying to upload images to S3 from my react native app by following and adapting this guide by heroku: https://devcenter.heroku.com/articles/s3-upload-node
Essentially I am using the aws-sdk on my express.js backend to generate pre-signed request for uploading images to S3 from react native.
Everything works well, so then I tried to convert the XMLHttpRequests into fetch requests, which seem to be favoured by React Native. After the conversion, the files are still being uploaded to S3, but when I click on the image links, the images would not show properly; instead, an empty square is shown:
Empty square shown instead of image
More specifically it seems to be this piece of code conversion that causes it to happen:
From:
_uploadFile(file, signedRequest, url) {
  const xhr = new XMLHttpRequest();
  xhr.open('PUT', signedRequest);
  xhr.onreadystatechange = () => {
    if (xhr.readyState === 4) {
      if (xhr.status === 200) {
        console.log("UPLOAD DONE");
      } else {
        alert('ERROR UPLOADING');
      }
    }
  };
  xhr.send(file);
}
To:
_uploadFile(file, signedRequest, url) {
  let option = {
    method: "PUT",
    headers: {
      "Content-Type": "image/jpeg",
    },
    body: JSON.stringify(file)
  };
  fetch(signedRequest, option)
    .then(res => console.log("UPLOAD DONE"))
    .catch(err => console.log("ERROR UPLOADING: ", err));
}
The file object being uploaded:
{
  name: "profileImage",
  type: "image/jpeg",
  uri: 'data:image/jpeg;base64,' + response.data, // just a base64 image string
  isStatic: true
}
Could anyone shed some light on why this could be happening, or have had similar experiences? Many thanks!
In your fetch example you put a JSON string in the body. It will be sent to S3, but it will not be interpreted as an image upload. You should be able to construct a FormData object yourself and pass it to fetch as the request body, but I think using XHR is the simpler option. According to this comment, it's what Facebook does as well (the comment is over a year old).
If at all possible, you should also try to use local URIs instead of passing Base64-encoded data. It takes quite a while to transfer a few MB of image data between JS and native.
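A sketch of the fetch-based fix along those lines (assumptions: signedRequest is the pre-signed PUT URL and `file` is the object shown above; `buildPutOptions` is a hypothetical helper): read the local or data URI into a Blob first, then PUT the blob itself. JSON.stringify(file) uploads the descriptor text, which is why the "image" rendered as an empty square.

```javascript
// Build fetch options for a pre-signed PUT: the body must be the raw bytes,
// not a JSON string describing the file.
function buildPutOptions(blob, contentType) {
  return {
    method: 'PUT',
    headers: { 'Content-Type': contentType },
    body: blob,
  };
}

// Usage:
//   const blob = await (await fetch(file.uri)).blob();
//   await fetch(signedRequest, buildPutOptions(blob, file.type));
```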