In an Expo React Native app, how to pick a photo, store it locally on the phone, and later upload it to a Node.js service?

I'm building a React Native app with Expo, and I want to include the following workflow: the user takes a picture (either with the camera or by picking one from the phone's gallery), which is stored locally on the phone until the user uploads it to a backend service at some later time.
I'm pretty stuck and would appreciate any pointers.
Here is what I have:
I use expo-image-picker to pick a photo from the phone's gallery:
import * as ImagePicker from 'expo-image-picker';

const photo = await ImagePicker.launchImageLibraryAsync({
  mediaTypes: ImagePicker.MediaTypeOptions.All,
  allowsEditing: true,
  base64: true,
  quality: 1,
});
Then I store the photo locally as a Base64 string using expo-file-system:
import * as FileSystem from 'expo-file-system';

const location = `${FileSystem.documentDirectory}${filename}`;
await FileSystem.writeAsStringAsync(location, photo.base64, {
  encoding: FileSystem.EncodingType.Base64,
});
I keep information about the storage location, file name, and MIME type in an image object, roughly like this:
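const image = {
  location,               // the file:// path written above
  filename,               // e.g. '1653135701413.jpg'
  mimetype: 'image/jpg',  // as later received by the backend (see below)
};
Later, I try to upload that image to my own Node.js backend service with axios, sending the following multipart form data: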
const formdata = new FormData();
formdata.append('file', {
  path: image.location,
  name: image.filename,
  type: image.mimetype
} as any);
The backend service that receives the photo uses multer:
const multer = require('multer');
const upload = multer({ storage: multer.memoryStorage() });

router.post('/photo', upload.single('file'), async (request, response) => {
  console.log(request.file);
  ...
});
What arrives at my service is the following:
{
  fieldname: 'file',
  originalname: '1653135701413.jpg',
  encoding: '7bit',
  mimetype: 'image/jpg',
  buffer: <Buffer >,
  size: 0
}
So no data is transferred. (The file seems to be properly stored on the phone, because if I use the Expo file system's readAsStringAsync, I do get a pretty long Base64 string.)
What am I missing? Do I need to send the image as a blob? (If I try to do so, then request.file is undefined, so I guess I'm doing something wrong there as well.)
And in general, is there a better way to achieve this workflow in a managed React Native app? (For example, is it ok to store the image as a Base64 string, or would it be better to do this differently?)
Edit:
In the form data, I changed path to uri, and I switched from axios to fetch. Now the backend finally receives the image data. 🥳
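For reference, a minimal sketch of the working version (SERVER_URL is a placeholder; the field name must match upload.single('file') on the backend):

const formdata = new FormData();
formdata.append('file', {
  uri: image.location,   // 'uri' is the key React Native's FormData expects, not 'path'
  name: image.filename,
  type: image.mimetype
} as any);

await fetch(`${SERVER_URL}/photo`, {
  method: 'POST',
  body: formdata,        // let fetch set the multipart Content-Type boundary itself
});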

Related

Uploading video recorded with Expo Camera to server

I am recording video using the Expo camera; the recorded video is saved in the cache, and I want to upload it to the server. I have the URI of the video, but to upload it to the server I need the file itself. How can I add the file to the body of the request? (I can't use rn-fetch-blob or react-native-fs, because it's an Expo project.)
OK, so you have the URI. Now we have to create a FormData object, since it is a file. To create the form data, follow these steps.
Create a function:
const CreateFormData = (uri) => {
  // Here uri is the local file URI of the video you captured
  const form = new FormData();
  form.append("File", {
    name: "SampleVideo.mp4",
    uri: uri,
    type: "video/mp4",
  });
  // Now perform a POST request with this form as the body of the request,
  // then handle the file you sent on the backend, i.e. the server
  return form;
};
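A hedged sketch of that POST (UPLOAD_URL and videoUri are placeholders):

const form = CreateFormData(videoUri);
await fetch(UPLOAD_URL, {
  method: "POST",
  body: form, // React Native fills in the multipart Content-Type boundary
});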

React Native Uploading Video to YouTube (Stuck at Processing)

I am attempting to upload video files to YouTube using their v3 API and their MediaUploader. It works in my ReactJS application, but not in my React Native application. When uploading via React Native, the upload completes, then stalls at 100%. In my YouTube account, I can see the new video file, but it is stuck at "Video is still being processed."
I believe the issue may be that I need to send a video file and not an object with a video URI, but I don't know how to get around that.
I am using the YouTube MediaUploader from the CORS example at https://github.com/youtube/api-samples/blob/master/javascript/cors_upload.js. I am using an OAuth 2.0 client ID, and this setup works correctly when using the ReactJS app via my website. I am using React Native Expo with Camera, which returns me an object with a URI, for example:
Video File: Object {
  "uri": "file:///var/mobile/Containers/Data/Application/353A7969-E2A8-4C80-B641-C80B2B029555/Library/Caches/ExponentExperienceData/%2540dj_walksalot%252Fwandereo/Camera/E971DFEC-AB3E-4B6D-892F-9027AFE47A1A.mov",
}
This file can be viewed in the application, and I can even successfully send this to my server for playback on the web app and in the React Native app. However, sending this object in the MediaUploader does not work. It will take an appropriate amount of time to upload, but then sits at 100%, while my YouTube account will show it has received the video with the correct metadata, but the video itself remains stuck at "Video is still being processed."
video_file: Object {
  "uri": "file:///var/mobile/Containers/Data/Application/353A7969-E2A8-4C80-B641-C80B2B029555/Library/Caches/ExponentExperienceData/%2540dj_walksalot%252Fwandereo/Camera/E971DFEC-AB3E-4B6D-892F-9027AFE47A1A.mov",
}
export const uploadToYouTube = (access_token, video_file, metadata) => async (dispatch) => {
  ...cors_upload...
  var uploader = new MediaUploader({
    baseUrl: `https://www.googleapis.com/upload/youtube/v3/videos?part=snippet%2Cstatus&key=API_KEY`,
    file: video_file,
    token: access_token,
    metadata: metadata,
    contentType: 'video/quicktime',
    // contentType: 'application/octet-stream', // "video/*"
    // contentType = options.contentType || this.file.type || 'application/octet-stream';
    params: {
      part: Object.keys(metadata).join(',')
    },
    onError: function (data) {
      // onError code
      let err = JSON.parse(data);
      dispatch(returnErrors(err.message, err.code));
      console.log('Error: ', err);
    },
    onProgress: function (progressEvent) {
      // onProgress code
      let percentCompleted = Math.round((progressEvent.loaded * 100) / progressEvent.total);
      dispatch({
        type: UPLOAD_PROGRESS,
        payload: percentCompleted
      });
    },
    onComplete: function (data) {
      console.log('Complete');
      // onComplete code
      let responseData = JSON.parse(data);
      dispatch({
        type: UPLOAD_YOUTUBE_VIDEO,
        payload: responseData
      });
      dispatch({
        type: UPLOAD_PROGRESS,
        payload: 0
      });
    }
  });
  uploader.upload();
}
Similar to my currently-working web app, after completing the upload, the "onComplete" function should fire, and YouTube should process the video. This does not happen. I believe it's because I'm attaching an object with a URI and not the actual file.
I was able to solve this from a post at Expert Mill by Joe Edgar at https://www.expertmill.com/2018/10/19/using-and-uploading-dynamically-created-local-files-in-react-native-and-expo/
By using fetch and .blob() I was able to convert the URI object to a data object and upload. Additional code:
const file = await fetch(video_file.uri);
const file_blob = await file.blob();
No need to install RNFetchBlob since this is in the Expo SDK.
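To make the last step concrete, a sketch of wiring the Blob into the uploader from the question (only the file argument changes; the rest of the configuration stays as above):

const file = await fetch(video_file.uri);
const file_blob = await file.blob();

var uploader = new MediaUploader({
  // ...same configuration as above...
  file: file_blob, // pass the Blob itself instead of the { uri } object
});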

Local storage solutions for large data including images on React Native

Here's the flow of how my end-product should work:
When the user opens the app for the first time, fetch all the data, i.e., images (150+) and relevant JSON objects.
On opening the app subsequently, the images and data should load from local storage, i.e., no need for internet at all.
I know it seems weird but this is my use case:
The product is a Wayfinder running on an Android box (a 55-inch touchscreen TV) which will be placed in a shopping mall. It will not have access to the internet unless I manually connect it.
Hence it should load the data when opened for the first time, i.e., when I'm configuring the application.
Solutions I have come across:
Realm: Local database management with excellent support for react-native - my option right now
Native Async Storage: Not suitable for large data
SQLite: Not comfortable with SQL queries
I'm still looking for options on how differently this problem can be tackled. Also, I'm familiar with Redux.
Thanks.
Check out react-native-fs (or expo-file-system if working with expo).
It is specifically designed for storing files on the device. In your component, it would look something like this:
const RNFS = require('react-native-fs');

RNFS
  .downloadFile({ fromUrl: myURL, toFile: myFilePath })
  .promise
  .then(res => console.log('Done'));
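For an Expo project, the expo-file-system equivalent looks roughly like this (myURL and filename are the same kind of placeholders):

import * as FileSystem from 'expo-file-system';

FileSystem
  .downloadAsync(myURL, FileSystem.documentDirectory + filename)
  .then(({ uri }) => console.log('Done, saved to', uri));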
Use the PouchDB database, which works with IndexedDB, the browser's local database:
Make an XHR request for the image, convert the response to binary data, and store it in the local database.
When you need to preview the image, get it from the database, create a blob URL, and show it in an img tag.
axios.get(url, {
  progress: false,
  responseType: 'arraybuffer',
  onDownloadProgress: (progressEvent) => {
    const percent = (100 * progressEvent.loaded) / progressEvent.total;
    console.log(percent);
  }
})
.then(resp => {
  // get db
  let db = $db.dbModel;
  // store the binary response as an attachment on the document
  db.get(doc._id).then((doc) => {
    db.putAttachment(doc._id, 'index.mp4', doc._rev, new Blob([new Uint8Array(resp.data)], { type: 'video/mp4' }), 'video/mp4')
      .then(res => {
        // console.log('success store file')
      });
  });
});
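For the preview step, a hedged sketch using PouchDB's getAttachment (the document ID and attachment name match the storing code; videoElement is a placeholder for your tag):

db.getAttachment(doc._id, 'index.mp4')
  .then(blob => {
    // Turn the stored Blob back into a URL the tag can display
    const blobUrl = URL.createObjectURL(blob);
    videoElement.src = blobUrl;
  });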
https://github.com/mohammadnazari110/pwa_offline_video_download

How to display Image from camera roll url using react-native-camera?

I use react-native-camera for capturing images and saving them in the camera roll (CaptureTarget) on iOS devices. On capture, I get the image path in the following form:
"assets-library://asset/asset.JPG?id=8B55D3E5-11CE-439A-8FC6-D55EB1A88D7E&ext=JPG"
How can I use this path to display the image in an Image component (from react-native)?
Previously I was using disk as the CaptureTarget option, and I was able to show that image URL in an Image component, but now the requirement is to save the image to the camera roll.
I have used RNFetchBlob to get Base64 data from the "assets-library://.." URL; my capture function is:
this.camera.capture()
  .then((data) => {
    // console.log(data);
    RNFetchBlob.fs.readFile(data.path, 'base64')
      .then((base64data) => {
        let base64Image = `data:image/jpeg;base64,${base64data}`;
        this.props.addImagesToUntagged(data.path);
      });
  });
After that I give the user some in-app functionality on this Base64 data, and finally, when I need to send it to the S3 server, I use axios and RNFetchBlob. The following code gives me a signed URL for S3:
axios.get(ENDPOINT_TO_GET_SIGNED_URL, {
  params: {
    'file-name': file.name,
    'file-type': file.type,
    'content-type': 'evidence'
  }
})
.then(function (result) {
  // console.log(result);
  returnUrl = result.data.url;
  var signedUrl = result.data.signedRequest;
  return uploadFile(file, signedUrl);
});
And in my uploadFile function I upload the image with the following code:
RNFetchBlob.fetch('PUT', signedUrl,
  { 'Content-Type': file.type },
  RNFetchBlob.wrap(file.uri)
)
.then(resolve);

Using Fetch instead of XMLHttpRequest in React Native

I was trying to upload images to S3 from my react native app by following and adapting this guide by heroku: https://devcenter.heroku.com/articles/s3-upload-node
Essentially I am using the aws-sdk on my express.js backend to generate pre-signed request for uploading images to S3 from react native.
Everything works well, so then I tried to convert the XMLHttpRequests into fetch requests, which seem to be favoured by React Native. After the conversion, the files are still being uploaded to S3, but when I click on the image links, the images don't show properly; instead an empty square is shown:
(Screenshot: empty square shown instead of the image.)
More specifically it seems to be this piece of code conversion that causes it to happen:
From:
_uploadFile(file, signedRequest, url) {
  const xhr = new XMLHttpRequest();
  xhr.open('PUT', signedRequest);
  xhr.onreadystatechange = () => {
    if (xhr.readyState === 4) {
      if (xhr.status === 200) {
        console.log("UPLOAD DONE");
      } else {
        alert('ERROR UPLOADING');
      }
    }
  };
  xhr.send(file);
}
To:
_uploadFile(file, signedRequest, url) {
  let option = {
    method: "PUT",
    headers: {
      "Content-Type": "image/jpeg",
    },
    body: JSON.stringify(file)
  };
  fetch(signedRequest, option)
    .then(res => console.log("UPLOAD DONE"))
    .catch(err => console.log("ERROR UPLOADING: ", err));
}
The file object being uploaded:
{
  name: "profileImage",
  type: "image/jpeg",
  uri: 'data:image/jpeg;base64,' + response.data, // just a base64 image string
  isStatic: true
}
Could anyone shed some light on why this could be happening, or have had similar experiences? Many thanks!
In your fetch example you put a JSON string in your body. It will be sent to S3, but it will not be interpreted as an image upload. You should be able to construct a FormData object yourself and pass it to fetch as the request body, but I think using XHR is the simpler option. According to this comment, it's what Facebook does as well (the comment is over a year old).
If at all possible, you should also try to use local URIs instead of passing Base64-encoded data. It takes quite a while to transfer a few MB of image data between JS and native.
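A minimal sketch of that advice applied to the working XHR uploader from the question: keep xhr.send(file), but give the descriptor a local file URI instead of a Base64 data URI (response.uri is an assumed field on the picker result):

const file = {
  name: 'profileImage',
  type: 'image/jpeg',
  uri: response.uri, // a local 'file://...' path, not 'data:image/jpeg;base64,...'
};
// React Native's XHR reads the file from disk on the native side,
// avoiding the transfer of megabytes of Base64 text between JS and native.
this._uploadFile(file, signedRequest, url);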