How to get a stream while a video is being recorded? - react-native

I'd like to upload data chunks to the server while recording a video (not after recording).
I tried react-native-camera and react-native-vision-camera,
but as far as I know, the recordAsync method only resolves with the final recorded video.
Is there any smart way to get video chunks or a stream during recording,
or should I use react-native-fs or rn-fetch-blob or something like that?
== update ==
I could probably achieve it the way it's done in the article below:
https://medium.com/react-native-training/build-youtube-alike-livestreams-with-react-native-8dde24adf543
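In that approach the camera pushes an RTMP stream to the server instead of writing a file first. A rough sketch, assuming react-native-nodemediaclient's NodeCameraView (prop names may vary between versions, and the ingest URL is a placeholder):
// Sketch of the livestream approach from the linked article.
// Assumes react-native-nodemediaclient; props may differ between versions.
import React, { useRef } from 'react';
import { Button, View } from 'react-native';
import { NodeCameraView } from 'react-native-nodemediaclient';

const LiveScreen = () => {
  const cameraRef = useRef(null);

  return (
    <View style={{ flex: 1 }}>
      <NodeCameraView
        style={{ flex: 1 }}
        ref={cameraRef}
        outputUrl="rtmp://your-server/live/stream-key" // placeholder ingest URL
        camera={{ cameraId: 1, cameraFrontMirror: true }}
        audio={{ bitrate: 32000, profile: 1, samplerate: 44100 }}
        video={{ preset: 1, bitrate: 400000, profile: 1, fps: 30 }}
        autopreview
      />
      {/* start() begins pushing encoded frames to the server while recording */}
      <Button title="Go live" onPress={() => cameraRef.current.start()} />
      <Button title="Stop" onPress={() => cameraRef.current.stop()} />
    </View>
  );
};

export default LiveScreen;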

If your problem is with uploading a large file to the server, you could go with react-native-background-upload and show a progress notification; it will upload the file even when the app is in the background.
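For illustration, a minimal sketch of a background upload with progress listeners, assuming react-native-background-upload's startUpload/addListener API; the endpoint URL and field name are placeholders:
import Upload from 'react-native-background-upload';

// Sketch: upload a recorded video file in the background with a progress notification.
const options = {
  url: 'https://example.com/upload',  // placeholder endpoint
  path: recordedVideoPath,            // local file path from the camera
  method: 'POST',
  type: 'multipart',
  field: 'video',
  notification: { enabled: true },
};

Upload.startUpload(options)
  .then((uploadId) => {
    Upload.addListener('progress', uploadId, (data) => {
      console.log(`Progress: ${data.progress}%`);
    });
    Upload.addListener('completed', uploadId, () => {
      console.log('Upload completed');
    });
    Upload.addListener('error', uploadId, (data) => {
      console.log('Upload error: ', data.error);
    });
  })
  .catch((err) => console.log('Upload failed to start', err));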
There is also a package, react-native-chunk-upload, that supports chunked uploads by breaking the file into multiple chunks:
import ChunkUpload from 'react-native-chunk-upload';

const chunk = new ChunkUpload({
  path: response.path,         // Path to the file
  size: 10095,                 // Chunk size (must be multiples of 3)
  fileName: response.fileName, // Original file name
  fileSize: response.size,     // Original file size
  // Errors
  onFetchBlobError: (e) => console.log(e),
  onWriteFileError: (e) => console.log(e),
});

chunk.digIn(this.upload.bind(this));

upload(file, next, retry, unlink) {
  const body = new FormData();
  body.append('video', file.blob); // param name

  axios.post('url', body, {
    headers: {
      "Content-Type": "multipart/form-data",
      "Accept": 'application/json',
      // Customize the headers
      "x-chunk-number": file.headers["x-chunk-number"],
      "x-chunk-total-number": file.headers["x-chunk-total-number"],
      "x-chunk-size": file.headers["x-chunk-size"],
      "x-file-name": file.headers["x-file-name"],
      "x-file-size": file.headers["x-file-size"],
      "x-file-identity": file.headers["x-file-identity"]
    }
  }).then((res) => {
    ...
  });
}
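On the receiving end, here is a rough sketch of how a server might use those x-chunk-* headers to reassemble the file. This is an illustration only (Express and multer are assumed; the /upload route and temp paths are hypothetical, and chunks are assumed to arrive in order), not part of react-native-chunk-upload:
const express = require('express');
const fs = require('fs');
const multer = require('multer');

const app = express();
const upload = multer({ storage: multer.memoryStorage() });

// Hypothetical endpoint: append each chunk to a temp file named after x-file-identity.
app.post('/upload', upload.single('video'), (req, res) => {
  const chunkNumber = Number(req.headers['x-chunk-number']);
  const totalChunks = Number(req.headers['x-chunk-total-number']);
  const identity = req.headers['x-file-identity'];

  // Append this chunk's bytes (assumes chunks arrive in order).
  fs.appendFileSync(`/tmp/${identity}.part`, req.file.buffer);

  if (chunkNumber === totalChunks) {
    // Last chunk: rename the assembled file to its original name.
    fs.renameSync(`/tmp/${identity}.part`, `/tmp/${req.headers['x-file-name']}`);
    return res.json({ done: true });
  }
  res.json({ received: chunkNumber });
});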

Related

In an Expo React Native app, how to pick a photo, store it locally on the phone, and later upload it to a Node.js service?

I'm building a React Native app with Expo, and I want to include the following workflow: the user takes a picture (either with the camera or by picking one from the phone's gallery), which is stored locally on the phone until the user uploads it to a backend service at some later time.
I'm pretty stuck and would appreciate any pointers.
Here is what I have:
I use expo-image-picker to pick a photo from the phone's gallery:
const photo = await launchImageLibraryAsync({
  mediaTypes: ImagePicker.MediaTypeOptions.All,
  allowsEditing: true,
  base64: true,
  quality: 1,
});
Then I store the photo locally as a Base64 string using expo-file-system:
const location = `${FileSystem.documentDirectory}${filename}`;
await FileSystem.writeAsStringAsync(location, photo.base64, {
  encoding: FileSystem.EncodingType.Base64
});
I keep information about the storage location, file name, and mime type in an image object. Later, I try to upload that image to my own Node.js backend service with axios, sending the following multi-part form data:
const formdata = new FormData();
formdata.append('file', {
  path: image.location,
  name: image.filename,
  type: image.mimetype
} as any);
The backend service that receives the photo uses multer:
const multer = require('multer');
const upload = multer({ storage: multer.memoryStorage() });

router.post('/photo', upload.single('file'), async (request, response) => {
  console.log(request.file);
  ...
});
What arrives at my service is the following:
{
  fieldname: 'file',
  originalname: '1653135701413.jpg',
  encoding: '7bit',
  mimetype: 'image/jpg',
  buffer: <Buffer >,
  size: 0
}
So no data is transferred. (It seems to be properly stored on the phone, because if I use the Expo filesystem's readAsStringAsync, I do get a pretty long Base64 string.)
What am I missing? Do I need to send the image as a blob? (If I try to do so, then request.file is undefined, so I guess I'm doing something wrong there as well.)
And in general, is there a better way to achieve this workflow in a managed React Native app? (For example, is it ok to store the image as a Base64 string, or would it be better to do this differently?)
Edit:
In the form data, I changed path to uri, and I switched from axios to fetch. Now the backend finally receives the image data. 🥳
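For reference, a sketch of what the working version looks like under that change (the endpoint URL is a placeholder):
// Sketch of the working version: `uri` instead of `path`, fetch instead of axios.
const formdata = new FormData();
formdata.append('file', {
  uri: image.location,   // file:// URI returned by expo-file-system
  name: image.filename,
  type: image.mimetype,
} as any);

const response = await fetch('https://example.com/photo', { // placeholder endpoint
  method: 'POST',
  body: formdata,
  // Let fetch set the multipart boundary in the Content-Type header itself.
});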

React Native Uploading Video to YouTube (Stuck at Processing)

I am attempting to upload video files to YouTube using their v3 API and their MediaUploader. It works in my ReactJS application, but not in my React Native application. When uploading via React Native, the upload completes, then stalls at 100%. In my YouTube account, I can see the new video file, but it is stuck at "Video is still being processed."
I believe the issue may be that I need to send a video file and not an object with a video URI, but I don't know how to get around that.
I am using the YouTube MediaUploader from the CORS example at https://github.com/youtube/api-samples/blob/master/javascript/cors_upload.js I am using an OAuth 2.0 client Id, and this setup works correctly when using the ReactJS app via my website. I am using React Native Expo with Camera, which returns me an Object with a URI, for example:
Video File: Object {
  "uri": "file:///var/mobile/Containers/Data/Application/353A7969-E2A8-4C80-B641-C80B2B029555/Library/Caches/ExponentExperienceData/%2540dj_walksalot%252Fwandereo/Camera/E971DFEC-AB3E-4B6D-892F-9027AFE47A1A.mov",
}
This file can be viewed in the application, and I can even successfully send this to my server for playback on the web app and in the React Native app. However, sending this object in the MediaUploader does not work. It will take an appropriate amount of time to upload, but then sits at 100%, while my YouTube account will show it has received the video with the correct metadata, but the video itself remains stuck at "Video is still being processed."
video_file: Object {
  "uri": "file:///var/mobile/Containers/Data/Application/353A7969-E2A8-4C80-B641-C80B2B029555/Library/Caches/ExponentExperienceData/%2540dj_walksalot%252Fwandereo/Camera/E971DFEC-AB3E-4B6D-892F-9027AFE47A1A.mov",
}
export const uploadToYouTube = (access_token, video_file, metadata) => async (dispatch) => {
  ...cors_upload...
  var uploader = new MediaUploader({
    baseUrl: `https://www.googleapis.com/upload/youtube/v3/videos?part=snippet%2Cstatus&key=API_KEY`,
    file: video_file,
    token: access_token,
    metadata: metadata,
    contentType: 'video/quicktime',
    // contentType: 'application/octet-stream', // "video/*",
    // contentType = options.contentType || this.file.type || 'application/octet-stream';
    params: {
      part: Object.keys(metadata).join(',')
    },
    onError: function(data) {
      // onError code
      let err = JSON.parse(data);
      dispatch(returnErrors(err.message, err.code));
      console.log('Error: ', err);
    },
    onProgress: function(progressEvent) {
      // onProgress code
      let percentCompleted = Math.round((progressEvent.loaded * 100) / progressEvent.total);
      dispatch({
        type: UPLOAD_PROGRESS,
        payload: percentCompleted
      });
    },
    onComplete: function(data) {
      console.log('Complete');
      // onComplete code
      let responseData = JSON.parse(data);
      dispatch({
        type: UPLOAD_YOUTUBE_VIDEO,
        payload: responseData
      });
      dispatch({
        type: UPLOAD_PROGRESS,
        payload: 0
      });
    }
  });

  uploader.upload();
}
Similar to my currently-working web app, after completing the upload, the "onComplete" function should fire, and YouTube should process the video. This does not happen. I believe it's because I'm attaching an object with a URI and not the actual file.
I was able to solve this from a post at Expert Mill by Joe Edgar at https://www.expertmill.com/2018/10/19/using-and-uploading-dynamically-created-local-files-in-react-native-and-expo/
By using fetch and .blob() I was able to convert the URI object to a data object and upload. Additional code:
const file = await fetch(video_file.uri);
const file_blob = await file.blob();
No need to install RNFetchBlob since this is in the Expo SDK.
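Putting it together, the idea is to hand the blob to the MediaUploader instead of the { uri } object; a sketch based on the code above (callbacks abbreviated):
// Sketch: fetch the local file URI, convert it to a blob, then upload the blob.
const file = await fetch(video_file.uri);
const file_blob = await file.blob();

var uploader = new MediaUploader({
  baseUrl: `https://www.googleapis.com/upload/youtube/v3/videos?part=snippet%2Cstatus&key=API_KEY`,
  file: file_blob,   // pass the blob, not the { uri } object
  token: access_token,
  metadata: metadata,
  contentType: 'video/quicktime',
  params: { part: Object.keys(metadata).join(',') },
  onError: (data) => console.log('Error: ', JSON.parse(data)),
  onProgress: (e) => console.log(Math.round((e.loaded * 100) / e.total)),
  onComplete: (data) => console.log('Complete', JSON.parse(data)),
});

uploader.upload();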

Uploading videos using formdata in react native

Has anyone successfully uploaded a video via React Native's FormData? The code below attempts to upload a .mov file from a camera roll URI, but in fact only the first frame of the video (a JPEG) gets uploaded. What's the issue here?
var movVideo = {
  uri: uriFromCameraRoll,
  type: 'video/quicktime',
  name: 'something.mov',
};

var body = new FormData();
body.append('video', movVideo);
body.append('title', 'A beautiful video!');

fetch('https://mysite/upload_asset', {
  method: "POST",
  headers: {
    'Accept': 'application/json',
    'Content-Type': 'multipart/form-data'
  },
  body: body,
}).then((response) => response.json())
  .then((responseJson) => {
    // only the first frame of the video got uploaded
    console.log(responseJson);
  });
Had the same issue. Looks like React Native does not return the correct stream for videos with asset library URIs. Pictures seem to work fine. I would need to dig deeper before submitting an issue though.
I suggest you take a look at react-native-fetch-blob, which provides an improved fetch polyfill with Blob support. This implementation handles videos from the camera roll just fine. Also, the changes needed to use this module are minimal (include the polyfill, wrap URI with RNFetchBlob.wrap).
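As an illustration, here is a rough sketch of the same upload using that module's multipart API with RNFetchBlob.wrap (assuming the endpoint and field names from the code above):
import RNFetchBlob from 'react-native-fetch-blob';

// Sketch: multipart upload where the video part is streamed from the camera roll URI.
RNFetchBlob.fetch('POST', 'https://mysite/upload_asset', {
  'Content-Type': 'multipart/form-data',
}, [
  {
    name: 'video',
    filename: 'something.mov',
    type: 'video/quicktime',
    // wrap() makes the native module read the file at this URI instead of a string
    data: RNFetchBlob.wrap(uriFromCameraRoll),
  },
  { name: 'title', data: 'A beautiful video!' },
])
  .then((res) => console.log(res.json()))
  .catch((err) => console.log(err));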

Download large object from AWS S3

I have an Angular web application that allows users to download files locally (installers). Some files exceed 1.5 GB in size, which causes the browser (Chrome) to crash when using 'normal' s3.getObject(opts, function(err, data){}) calls, since the entire file's binary data is apparently buffered in memory.
I have tried to use other techniques, like streaming (StreamSaver.js), but with no luck.
I am trying to chunk the file data, but in the following code the 'httpData' event does not fire until the entire file's binary data is loaded, which seems to defeat the purpose of chunking. Either I don't understand this event, or I have something misconfigured.
cache.S3.getObject({ Bucket: 'anduin-installers', Key: filePath })
  .on('httpDownloadProgress', function (progress) {
    $timeout(function () {
      pkg.Download.Progress = Math.floor((progress.loaded / progress.total) * 100.0);
    });
  })
  .on('httpData', function (chunk, response) {
    console.log('???');
  })
  .on('complete', function (response) {
    $timeout(function () {
      pkg.Download.Active = false;
      pkg.Download.Progress = 0;
    });
  })
  .send();
Any ideas on how to make the 'httpData' event fire as data chunks are received, instead of waiting for the whole file? Or should I go with another solution?
Thanks!
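If httpData can't be made to fire incrementally, one alternative (just a sketch, not tested against this setup) is to issue sequential ranged GETs yourself using getObject's standard Range parameter and hand each chunk to your writer as it arrives:
// Rough sketch: download the object in ranged chunks instead of one getObject call.
// Assumes AWS SDK v2 (cache.S3); onChunk is a placeholder for your StreamSaver writer.
async function downloadInChunks(s3, bucket, key, chunkSize, onChunk) {
  const head = await s3.headObject({ Bucket: bucket, Key: key }).promise();
  const total = head.ContentLength;

  for (let start = 0; start < total; start += chunkSize) {
    const end = Math.min(start + chunkSize, total) - 1;
    const part = await s3.getObject({
      Bucket: bucket,
      Key: key,
      Range: `bytes=${start}-${end}`, // standard HTTP byte range
    }).promise();
    onChunk(part.Body, start, total); // e.g. write this chunk to a StreamSaver stream
  }
}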

Using Fetch instead of XMLHttpRequest in React Native

I was trying to upload images to S3 from my React Native app by following and adapting this guide by Heroku: https://devcenter.heroku.com/articles/s3-upload-node
Essentially I am using the aws-sdk on my express.js backend to generate pre-signed request for uploading images to S3 from react native.
Everything works well, so then I tried to convert the XMLHttpRequests into fetch requests, which seem to be favoured by React Native. After the conversion, the files are still being uploaded to S3, but when I click on the image links, the images don't show properly; instead an empty square is shown:
Empty square shown instead of image
More specifically it seems to be this piece of code conversion that causes it to happen:
From:
_uploadFile(file, signedRequest, url) {
  const xhr = new XMLHttpRequest();
  xhr.open('PUT', signedRequest);
  xhr.onreadystatechange = () => {
    if (xhr.readyState === 4) {
      if (xhr.status === 200) {
        console.log("UPLOAD DONE");
      } else {
        alert('ERROR UPLOADING');
      }
    }
  };
  xhr.send(file);
}
To:
_uploadFile(file, signedRequest, url) {
  let option = {
    method: "PUT",
    headers: {
      "Content-Type": "image/jpeg",
    },
    body: JSON.stringify(file)
  };
  fetch(signedRequest, option)
    .then(res => console.log("UPLOAD DONE"))
    .catch(err => console.log("ERROR UPLOADING: ", err));
}
The file object being uploaded:
{
  name: "profileImage",
  type: "image/jpeg",
  uri: 'data:image/jpeg;base64,' + response.data, // just a base64 image string
  isStatic: true
}
Could anyone shed some light on why this could be happening, or have had similar experiences? Many thanks!
In your fetch example you put a JSON string in your body. It will be sent to S3 but it will not be interpreted as an image upload. You should be able to construct a FormData object yourself and pass it to fetch as the request body, but I think using XHR is the simpler option. According to this comment it's what Facebook does as well (the comment is over a year old).
If at all possible you should also try to use local URIs instead of passing Base64 encoded data. It takes quite a while to transfer a few MB of image data between JS and native.
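For example, a sketch of the original XHR version with a local file URI in place of the Base64 data URI (response.uri stands in for whatever local path your picker or camera returns):
// Sketch: same working XHR upload, but the file object carries a local file URI
// instead of a 'data:image/jpeg;base64,...' string, as suggested above.
_uploadFile(file, signedRequest, url) {
  const xhr = new XMLHttpRequest();
  xhr.open('PUT', signedRequest);
  xhr.onreadystatechange = () => {
    if (xhr.readyState === 4) {
      if (xhr.status === 200) {
        console.log("UPLOAD DONE");
      } else {
        alert('ERROR UPLOADING');
      }
    }
  };
  // file = { name: 'profileImage', type: 'image/jpeg', uri: response.uri }
  xhr.send(file);
}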