Want to share multiple images with a separate caption for each image on WhatsApp, react-native-share - react-native

I am using the React Native Share library, which is a good one, and I just need a little help.
It currently shares multiple images with the same caption, but I want to share multiple images with a separate message (caption) for each image.
For example, if there are 5 images, each of the 5 images should get a different caption, not the same one.
In the current situation, it shares all 5 images with the same message (caption).
Here is my code:
var imgs = ["base64IMAGE1...///", "base64IMAGE2..///", "base64IMAGE3..///"];

let shareImage = {
  title: "title",
  message: "this is the message that needs to be sent separately with each image",
  urls: imgs,
  subject: "Image"
};

Share.open(shareImage).catch(err => console.log(err));
I have attached screenshots of the current situation:
[screenshot: image 1 on WhatsApp]
[screenshot: image 2 on WhatsApp]
All images are sent with the same caption; I just want to send multiple images with separate messages.
Thank you.

I've created a working example of sharing multiple or single images using react-native-share.
Check out the Expo Snack here.
I've added comments before every method explaining what it does and what needs to be replaced.
// multiple images share example
const shareMultipleImages = async () => {
  const shareOptions = {
    title: 'Share multiple files example',
    // here replace the base64 data with your local file path
    // base64 with mimeType or path to local file
    urls: [base64ImagesData.image1, base64ImagesData.image2],
    failOnCancel: false,
  };

  // If you want, you can use a try/catch to parse
  // the share response (if the user cancels, etc.).
  try {
    const ShareResponse = await Share.open(shareOptions);
    setResult(JSON.stringify(ShareResponse, null, 2));
  } catch (error) {
    console.log('Error =>', error);
    setResult('error: '.concat(getErrorString(error)));
  }
};
You can add local file paths in the shareMultipleImages method like this:
urls: Array of base64 strings you want to share. Base64 with mimeType, or a path to a local file. (Array[string])
React Native Share Docs
const shareOptions = {
  title: 'Share multiple files example',
  urls: ["file..///", "file..///", "file..///"],
  failOnCancel: false,
};
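The snippet above sends all files through a single share sheet, so they end up with one shared message. To give each image its own caption, a minimal sketch of my own (not from the Expo Snack; whether the target app, e.g. WhatsApp, actually applies the message as a per-image caption is up to that app) is to call Share.open once per image with a single url and its own message:

// Sketch: open the share sheet once per image so each one carries its own message.
// `images` is a hypothetical array of { data, caption } objects.
const shareImagesWithCaptions = async (images) => {
  for (const img of images) {
    try {
      await Share.open({
        url: img.data,        // single base64 string or local file path
        message: img.caption, // caption specific to this image
        failOnCancel: false,
      });
    } catch (error) {
      console.log('Share error =>', error);
    }
  }
};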

Related

In an Expo React Native app, how to pick a photo, store it locally on the phone, and later upload it to a Node.js service?

I'm building a React Native app with Expo, and I want to include the following workflow: The user takes a picture (either with the camera or by picking one from the phone's gallery), which is stored locally on the phone until the user uploads it to a backend service at some later time.
I'm pretty stuck and would appreciate any pointers.
Here is what I have:
I use expo-image-picker to pick a photo from the phone's gallery:
const photo = await launchImageLibraryAsync({
  mediaTypes: ImagePicker.MediaTypeOptions.All,
  allowsEditing: true,
  base64: true,
  quality: 1,
});
Then I store the photo locally as a Base64 string using expo-file-system:
const location = `${FileSystem.documentDirectory}${filename}`;
await FileSystem.writeAsStringAsync(location, photo.base64, {
  encoding: FileSystem.EncodingType.Base64
});
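(As a quick sanity check, purely a sketch on my part using expo-file-system's readAsStringAsync, you can read the file back to confirm it was written:)

// Sketch: read the stored photo back as Base64 to confirm the write succeeded.
const stored = await FileSystem.readAsStringAsync(location, {
  encoding: FileSystem.EncodingType.Base64,
});
console.log('stored base64 length:', stored.length);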
I keep information about the storage location, file name, and mime type in an image object. Later, I try to upload that image to my own Node.js backend service with axios, sending the following multi-part form data:
const formdata = new FormData();
formdata.append('file', {
  path: image.location,
  name: image.filename,
  type: image.mimetype
} as any);
The backend service that receives the photo uses multer:
const multer = require('multer');
const upload = multer({ storage: multer.memoryStorage() });

router.post('/photo', upload.single('file'), async (request, response) => {
  console.log(request.file);
  ...
});
What arrives at my service is the following:
{
  fieldname: 'file',
  originalname: '1653135701413.jpg',
  encoding: '7bit',
  mimetype: 'image/jpg',
  buffer: <Buffer >,
  size: 0
}
So no data is transferred. (It seems to be properly stored on the phone, because if I use the Expo filesystem's readAsStringAsync, I do get a pretty long Base64 string.)
What am I missing? Do I need to send the image as a blob? (If I try to do so, then request.file is undefined, so I guess I'm doing something wrong there as well.)
And in general, is there a better way to achieve this workflow in a managed React Native app? (For example, is it ok to store the image as a Base64 string, or would it be better to do this differently?)
Edit:
In the form data, I changed path to uri, and I switched from axios to fetch. Now the backend finally receives the image data. 🥳
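For reference, a minimal sketch of the working version described in the edit above (assuming the same image object as before; BACKEND_URL is a placeholder for your own endpoint):

const uploadPhoto = async (image) => {
  const formdata = new FormData();
  formdata.append('file', {
    uri: image.location,   // use `uri`, not `path`, for React Native form-data parts
    name: image.filename,
    type: image.mimetype,
  });

  // Let fetch set the multipart boundary; don't set Content-Type manually.
  await fetch(`${BACKEND_URL}/photo`, {
    method: 'POST',
    body: formdata,
  });
};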

converting selected document from DOCUMENT PICKER in react native to base64 using RNFetchBlob

I have a problem with my React Native app. I would just like to ask if there is any way for RNFetchBlob to accept a data URI from DocumentPicker instead of a web URL. I just need to convert the file selected in the document picker to base64. Could anyone help me?
RNFetchBlob.config({ fileCache: true })
  .fetch("GET", 'http://www.africau.edu/images/default/sample.pdf') // Replace the web URL with the dataURI from documentPicker
  // the file is now downloaded to the device's storage
  .then(resp => {
    // read the downloaded file as base64
    return resp.readFile("base64");
  })
  .then(base64Data => {
    console.log('base64Data', base64Data);
  });
If you are not particularly looking for base64-encoded data but want to obtain the actual blob, you can use fetch without going through the base64 bridge:
const fetchResponse = await fetch(at[i].uri);
const blob = await fetchResponse.blob();
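If the document picker gives you a local file URI rather than a web URL, a sketch of reading it directly with RNFetchBlob's file-system API (assuming pickedFile.uri points to a local file path returned by the picker) could look like this:

// Sketch: read a locally picked document straight into a base64 string.
const documentToBase64 = async (pickedFile) => {
  const path = pickedFile.uri.replace('file://', ''); // fs.readFile expects a plain path
  const base64Data = await RNFetchBlob.fs.readFile(path, 'base64');
  console.log('base64 length:', base64Data.length);
  return base64Data;
};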

Where is the image captured by RNCamera saved?

I have used RNCamera in my project, but I don't know where the image is saved in the cache. How can I preview the image and delete it? The app is taking up a lot of cache memory on my Android phone.
From the documentation
uri: (string) the path to the image saved on your app's cache directory.
takePicture = async () => {
  if (this.camera) {
    const options = { quality: 0.5, base64: true };
    const data = await this.camera.takePictureAsync(options);
    console.log(data.uri); // print the uri of the saved image
  }
};
You can check for the images in the gallery, or otherwise in your predefined file location.
Please check the takePicture = async function() example in the following source; it returns data.uri, which gives you full control to move the file, display the images, and delete them.
https://github.com/react-native-community/react-native-camera/blob/master/docs/RNCamera.md
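Since data.uri points into the app's cache directory, one way to free that space (a sketch on my side, assuming rn-fetch-blob is available in the project) is to delete the cached file once you are done with it:

// Sketch: remove a cached picture taken by RNCamera.
const deleteCachedPicture = async (uri) => {
  const path = uri.replace('file://', '');
  if (await RNFetchBlob.fs.exists(path)) {
    await RNFetchBlob.fs.unlink(path); // frees the space used by the image
  }
};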

How to display Image from camera roll url using react-native-camera?

I use react-native-camera for capturing images and saving them to the cameraRoll (CaptureTarget) on iOS devices. On capture I get an image path in the following form:
"assets-library://asset/asset.JPG?id=8B55D3E5-11CE-439A-8FC6-D55EB1A88D7E&ext=JPG"
How can I use this path to display the image in the Image component (from react-native)?
Previously I was using disk as the CaptureTarget option, and I was able to show that image URL in the Image component, but now the requirement is to save the image to the camera roll.
I have used RNFetchBlob to get base64 data from the "assets-library://.." URL; my capture function is:
this.camera.capture()
  .then((data) => {
    // console.log(data);
    RNFetchBlob.fs.readFile(data.path, 'base64')
      .then((base64data) => {
        let base64Image = `data:image/jpeg;base64,${base64data}`;
        this.props.addImagesToUntagged(data.path);
      })
  })
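To answer the display part of the question: a data URI like base64Image above can be passed straight to the standard Image component (a sketch; React Native supports data-URI sources, though they are only recommended for small images):

// Sketch: render the captured picture from its base64 data URI.
<Image
  source={{ uri: base64Image }}
  style={{ width: 200, height: 200 }}
/>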
After that I give the user some in-app functionality on this base64 data, and finally, when I need to send it to the S3 server, I use axios and RNFetchBlob. The following code gives me a signed URL for S3:
axios.get(ENDPOINT_TO_GET_SIGNED_URL, { params: { 'file-name': file.name, 'file-type': file.type, "content-type": 'evidence' } })
  .then(function (result) {
    // console.log(result);
    returnUrl = result.data.url;
    var signedUrl = result.data.signedRequest;
    return uploadFile(file, signedUrl)
  })
And in my uploadFile function I upload the images with the following code:
RNFetchBlob.fetch('PUT', signedUrl,
  { 'Content-Type': file.type },
  RNFetchBlob.wrap(file.uri)
)
  .then(resolve)

Using Fetch instead of XMLHttpRequest in React Native

I was trying to upload images to S3 from my react native app by following and adapting this guide by heroku: https://devcenter.heroku.com/articles/s3-upload-node
Essentially I am using the aws-sdk on my express.js backend to generate pre-signed request for uploading images to S3 from react native.
Everything works well, so then I tried to convert the XMLHttpRequests into fetch requests, which seem to be favoured by React Native. After the conversion, the files are still being uploaded to S3, but when I click on the image links, the images don't show properly; instead an empty square is shown:
[screenshot: empty square shown instead of image]
More specifically, it seems to be this piece of the conversion that causes it:
From:
_uploadFile(file, signedRequest, url){
  const xhr = new XMLHttpRequest();
  xhr.open('PUT', signedRequest);
  xhr.onreadystatechange = () => {
    if (xhr.readyState === 4) {
      if (xhr.status === 200) {
        console.log("UPLOAD DONE");
      } else {
        alert('ERROR UPLOADING');
      }
    }
  };
  xhr.send(file);
}
To:
_uploadFile(file, signedRequest, url) {
  let option = {
    method: "PUT",
    headers: {
      "Content-Type": "image/jpeg",
    },
    body: JSON.stringify(file)
  }
  fetch(signedRequest, option)
    .then(res => console.log("UPLOAD DONE"))
    .catch(err => console.log("ERROR UPLOADING: ", err))
}
The file object being uploaded:
{
  name: "profileImage",
  type: "image/jpeg",
  uri: 'data:image/jpeg;base64,' + response.data, // just a base64 image string
  isStatic: true
}
Could anyone shed some light on why this could be happening, or have had similar experiences? Many thanks!
In your fetch example you put a JSON string in your body. It will be sent to S3 but it will not be interpreted as an image upload. You should be able to construct a FormData object yourself and pass it to fetch as the request body, but I think using XHR is the simpler option. According to this comment it's what Facebook does as well (the comment is over a year old).
If at all possible you should also try to use local URIs instead of passing Base64 encoded data. It takes quite a while to transfer a few MB of image data between JS and native.
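As a sketch of how the fetch version could send actual image bytes rather than a JSON string (this sends the raw Blob as the body instead of the FormData the answer mentions, and assumes file.uri is a local or data URI that fetch can read):

// Sketch: read the local URI into a Blob and PUT the Blob itself to the signed URL.
async function uploadWithFetch(file, signedRequest) {
  const fileResponse = await fetch(file.uri);
  const blob = await fileResponse.blob();

  const res = await fetch(signedRequest, {
    method: 'PUT',
    headers: { 'Content-Type': file.type }, // e.g. 'image/jpeg'
    body: blob,                             // raw image bytes, not a JSON string
  });

  console.log(res.ok ? 'UPLOAD DONE' : 'ERROR UPLOADING');
}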