With React Native on Android I am trying to send the user's profile image from the local cache to a Firebase Storage bucket. If I send it as a Blob or Uint8Array, when I open the image in Firebase I get the error The image "https://firebasestorage<rest of url here>" cannot be displayed because it contains errors. If I send it as a base64 data URL, it does not upload to the bucket and I get the message Firebase Storage: String does not match format 'base64': Invalid character found. I have tested the base64 data URL with a decoder and it works. How can I get this to work, either as a Blob, Uint8Array or base64? Here is the code:
As blob
let mime = 'image/jpeg';
getFile(imageUri)
.then(data => {
return new Blob([data], { type: mime });
})
.then(blob => {
return imageRef.put(blob, { contentType: mime });
})
async function getFile(imageUri) {
let bytes = await FileSystem.readAsStringAsync(imageUri);
return Promise.resolve(bytes);
}
As Uint8Array
let mime = 'image/jpeg';
getFile(imageUri)
.then(data => {
return imageRef.put(data, { contentType: mime });
})
async function getFile(imageUri) {
let bytes = await FileSystem.readAsStringAsync(imageUri);
const imageBytes = new Uint8Array(bytes.length);
for ( let i = 0; i < imageBytes.length; i++) {
imageBytes[i] = bytes.charCodeAt(i);
}
return Promise.resolve(imageBytes);
}
As base64 data url
imageBase64Url = "data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAMAAAADCAIAAADZSiLoAAAAF0lEQVQI12P8//8/AwMDAwMDEwMMIFgAVCQDA25yGkkAAAAASUVORK5CYII=";
return imageRef.putString(imageBase64Url, 'data_url');
The URI
I retrieve the uri from this object:
Object {
"cancelled": false,
"height": 60,
"type": "image",
"uri": "file:///data/user/0/host.exp.exponent/cache/ExperienceData/%2540anonymous%252FMCC_Project-ee81e7bd-82b1-4624-8c6f-8c882fb131c4/ImagePicker/6ec14b33-d2ec-4f80-8edc-2ee501bf6e92.jpg",
"width": 80,
}
We found at least two problems with the way I was trying to retrieve the picture and send it to the Firebase bucket:
1) When retrieving the image from the local cache and trying to send it to the bucket as a Blob, FileSystem.readAsStringAsync(imageUri) was, for some reason, returning a corrupted file.
2) When instead trying to save the image to the Firebase bucket as base64, the problem seems to lie with Firebase itself, since not even the examples provided at https://firebase.google.com/docs/storage/web/upload-files were working.
The solution:
We retrieved the image from the local cache with XMLHttpRequest instead of Expo's FileSystem, and saved it to the Firebase bucket as a Blob:
import firebase from './firebase';
export default async function saveImage(picture, uid) {
const storageRef = firebase
.storage('gs://*<bucket-here>*')
.ref(uid + '/' + 'profile-picture.jpeg');
// Read the local file into a Blob via XMLHttpRequest (FileSystem.readAsStringAsync returned corrupted data)
const blob = await new Promise((resolve, reject) => {
const xhr = new XMLHttpRequest();
xhr.onload = function() {
resolve(xhr.response);
};
xhr.onerror = function(e) {
console.log(e);
reject(new TypeError('Network request failed'));
};
xhr.responseType = 'blob';
xhr.open('GET', picture.uri, true);
xhr.send(null);
});
const metadata = {
contentType: 'image/jpeg',
};
// put() and getDownloadURL() both return promises, so they can be awaited directly
const snapshot = await storageRef.put(blob, metadata);
const downloadURL = await snapshot.ref.getDownloadURL();
return downloadURL;
}
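For context, a minimal sketch of how this helper might be called from the same Expo image-picker flow shown in the question (the './saveImage' path and the pickAndUpload name are assumptions):
import * as ImagePicker from 'expo-image-picker';
import saveImage from './saveImage';

// Picks an image and uploads it as the profile picture for the given uid
async function pickAndUpload(uid) {
  const result = await ImagePicker.launchImageLibraryAsync();
  // result has the same shape as the object shown above: { cancelled, uri, width, height, type }
  if (!result.cancelled) {
    const downloadURL = await saveImage(result, uid);
    console.log('Profile picture available at', downloadURL);
  }
}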
Related
I am trying to upload a lot of files from S3 to IPFS via Pinata. I haven't found anything like that in the Pinata documentation.
This is my solution, using the form-data library. I haven't tested it yet (I will do it soon; I still need to code some things).
Is this a correct approach? Has anyone done something similar?
async uploadImagesFolder(
items: ItemDocument[],
bucket?: string,
path?: string,
) {
try {
const form = new FormData();
for (const item of items) {
const file = getObjectStream(item.tokenURI, bucket, path);
form.append('file', file, {
filename: item.tokenURI,
});
}
console.log(`Uploading files to IPFS`);
const pinataOptions: PinataOptions = {
cidVersion: 1,
};
const result = await pinata.pinFileToIPFS(form, {
pinataOptions,
});
console.log(`Piñata Response:`, JSON.stringify(result, null, 2));
return result.IpfsHash;
} catch (e) {
console.error(e);
}
}
I had the same problem.
So, I found this: https://medium.com/pinata/stream-files-from-aws-s3-to-ipfs-a0e23ffb7ae5
But if I am not wrong, the article uses a different version than the JavaScript AWS SDK v3 (nowadays the most recent: https://docs.aws.amazon.com/AWSJavaScriptSDK/v3/latest/index.html).
This is for the client side with TypeScript.
If you are on v3, this code snippet works for me:
export const getStreamObjectInAwsS3 = async (data: YourParamsType) => {
try {
const BUCKET = data.bucketTarget
const KEY = data.key
const client = new S3Client({
region: 'your-region',
credentials: {
accessKeyId: 'your-access-key',
secretAccessKey: 'secret-key'
}
})
const resource = await client.send(new GetObjectCommand({
Bucket: BUCKET,
Key: KEY
}))
const response = resource.Body
if (response) {
return new Response(await response.transformToByteArray()).blob()
}
return null
} catch (error) {
return null
}
}
With the previous code you can get the Blob object; pass it to the following method to upload the file and get the resource URL via the Pinata API:
export const uploadFileToIPFS = async(file: Response) => {
const url = `https://api.pinata.cloud/pinning/pinFileToIPFS`
const data = new FormData()
data.append('file', file)
try {
const response = await axios.post(url, data, {
maxBodyLength: Infinity,
headers: {
pinata_api_key: 'your-api',
pinata_secret_api_key: 'your-secret'
},
data: data
})
return {
success: true,
pinataURL: `https://gateway.pinata.cloud/ipfs/${ response.data.IpfsHash }`
}
} catch (error) {
console.log(error)
return null
}
}
I found this solution in a nice article, where you can also explore other implementations (including the Node.js side).
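As a rough illustration, the two snippets above could be chained like this (the migrateObjectToIPFS name is made up, and the cast reflects that uploadFileToIPFS declares a Response parameter but only appends its argument to FormData):
export const migrateObjectToIPFS = async (data: YourParamsType) => {
  // Fetch the S3 object as a Blob (see getStreamObjectInAwsS3 above)
  const blob = await getStreamObjectInAwsS3(data)
  if (!blob) return null
  // Pin it through Pinata and return the gateway URL
  return uploadFileToIPFS(blob as any)
}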
I'm trying to upload files using RN Document Picker.
Once those files are selected, I need to turn them into base64 strings so I can send them to my API.
const handlePickFiles = async () => {
if (await requestExternalStoreageRead()) {
const results = await DocumentPicker.pickMultiple({
type: [
DocumentPicker.types.images,
DocumentPicker.types.pdf,
DocumentPicker.types.docx,
DocumentPicker.types.zip,
],
});
const newUploadedFile: IUploadedFile[] = [];
for (const res of results) {
console.log(JSON.stringify(res, null, 2));
newUploadedFile.push({
name: res.name,
type: res.type as string,
size: res.size as number,
extension: res.type!.split('/')[1],
blob: res.uri, // <<-- Must turn this into a base64 string
});
}
setUploadedFiles(newUploadedFile);
console.log(newUploadedFile);
}
};
The document picker returns a content URI (content://...).
Their docs list this as an example of handling blob data and base64:
let data = new FormData()
data.append('image', {uri: 'content://path/to/content', type: 'image/png', name: 'name'})
const response = await fetch(url, {
method: 'POST',
headers: {
'Content-Type': 'multipart/form-data',
},
body: data
})
They basically say that you don't need a blob or base64 when using multipart/form-data as the content type. However, my GraphQL endpoint cannot handle multipart data and I don't have time to rewrite the whole API. All I want is to turn the file into a blob and then into a base64 string, even if other ways are more performant.
Searching for other libraries, all of them are either no longer maintained or have issues with new versions of Android. RN Blob Utils is the most recent package that is no longer maintained.
I tried to use RN Blob Utils, but I either get errors, the wrong data type, or the file uploads but is corrupted.
Some other things I found are that I can use:
fetch(res.uri).then(response => {response.blob()})
const response = await ReactNativeBlobUtil.fetch('GET', res.uri);
const data = response.base64();
ReactNativeBlobUtil.wrap(decodeURIComponent(blob))
///----
const blob = ReactNativeBlobUtil.fs.readFile(res.uri, 'base64');
But I can't do anything with that blob file.
What is the simplest way to upload files from the document picker in base64 format? Is it possible to avoid using the external storage permission?
You don't need a third-party package to fetch blob data:
const blob = await new Promise((resolve, reject) => {
const xhr = new XMLHttpRequest();
xhr.onload = function () {
resolve(xhr.response);
};
xhr.onerror = function (e) {
reject(new TypeError("Network request failed"));
};
xhr.responseType = "blob";
xhr.open("GET", "[LOCAL_FILE_PATH]", true);
xhr.send(null);
});
// Code to submit blob file to server
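// For example, if a base64 string is needed rather than the blob itself (as the
// question asks), the built-in FileReader can usually convert it before blob.close()
// is called; a sketch, assuming the "data:<mime>;base64," prefix should be stripped:
const base64 = await new Promise((resolve, reject) => {
  const reader = new FileReader();
  reader.onloadend = () => {
    const dataUrl = String(reader.result); // e.g. "data:application/pdf;base64,JVBERi..."
    resolve(dataUrl.substring(dataUrl.indexOf(',') + 1));
  };
  reader.onerror = reject;
  reader.readAsDataURL(blob);
});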
// We're done with the blob, close and release it
blob.close();
I ended up using react-native-blob-util
const res = await DocumentPicker.pickSingle({
type: [
DocumentPicker.types.images,
DocumentPicker.types.pdf,
DocumentPicker.types.docx,
DocumentPicker.types.zip,
],
});
const newUploadedFile: IUploadedFile[] = [];
const fileType = res.type;
if (fileType) {
const fileExtension = fileType.substr(fileType.indexOf('/') + 1);
const realURI = Platform.select({
android: res.uri,
ios: decodeURI(res.uri),
});
if (realURI) {
const b64 = await ReactNativeBlobUtil.fs.readFile(
realURI,
'base64',
);
const filename = res.name.replace(/\s/g, '');
const path = uuid.v4();
newUploadedFile.push({
name: filename,
type: fileType,
size: res.size as number,
extension: fileExtension,
blob: b64,
path: Array.isArray(path) ? path.join() : path,
});
} else {
throw new Error('Failed to process file');
}
} else {
throw new Error('Failed to process file');
}
Using react-native-vision-camera, I saw that there is a path for the image. It seems readable by the React Native Image tag.
I attempted to upload using this path (prefixed with file:// on Android, and the same on iOS), however it failed. Each time the file was detected as "jpeg" or "jpg", but I couldn't access it.
After downloading the file from the S3 bucket I uploaded it to and converting the .jpg to .txt, all I find inside is the "file://path" string.
import { Storage } from "aws-amplify";
export default async function s3UploadBackup(file, user) {
let formatted_date = moment().format("DD-MM-YYYY");
let filePath = file.split("/");
let fileImageName = filePath[filePath.length - 1];
try {
console.log("Files contains :" + JSON.stringify(file));
// example of one of the URL I used "file:///storage/emulated/0/Android/data/com.app/files/Pictures/image-c64a66b3-489d-4af6-bf93-7adb507ceda1790666367.jpg"
const fileName = `${formatted_date}---${user.businessName}---${user.phoneNumber}---${user.location}---${fileImageName}`;
return Storage.put(uploadBackup.path + user.sub + "/" + user.phoneNumber + "/" + fileName, file, {
// contentType: "image/jpeg"
contentType: file.mime
})
} catch(error) {
console.log(error);
}
}
AWS Amplify supports uploading files as a Blob and converting them to the specified file type (JPEG, PNG, ...).
Assume that we have the local file URI file:///storage/emulated/0/Android/data/com.app/files/Pictures/image-c64a66b3-489d-4af6-bf93-7adb507ceda1790666367.jpg.
Let's refactor the s3UploadBackup function:
import { Storage } from "aws-amplify";
export default async function s3UploadBackup(file, user) {
let formatted_date = moment().format("DD-MM-YYYY");
let filePath = file.split("/");
let fileImageName = filePath[filePath.length - 1];
try {
console.log("Files contains :" + JSON.stringify(file));
// example of one of the URL I used "file:///storage/emulated/0/Android/data/com.app/files/Pictures/image-c64a66b3-489d-4af6-bf93-7adb507ceda1790666367.jpg"
const fileName = `${formatted_date}---${user.businessName}---${user.phoneNumber}---${user.location}---${fileImageName}`;
const blob = await new Promise((resolve, reject) => {
const xhr = new XMLHttpRequest();
xhr.onload = function () {
resolve(xhr.response);
};
xhr.onerror = function (e) {
reject(new TypeError("Network request failed"));
};
xhr.responseType = "blob";
xhr.open("GET", localPath, true);
xhr.send(null);
});
await Storage.put("yourKeyHere", blob, {
contentType: "image/jpeg", // contentType is optional
});
// We're done with the blob, close and release it
blob.close();
} catch(error) {
console.log(error);
}
}
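A possible call site, assuming the photo comes from react-native-vision-camera as in the question (photo and currentUser are placeholders for your own camera result and user record):
// vision-camera returns a bare path, so prefix it with file:// before handing it to XMLHttpRequest
const uri = photo.path.startsWith('file://') ? photo.path : `file://${photo.path}`;
await s3UploadBackup(uri, currentUser);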
Progress until now
I have created an app.
The image captured in the app comes with an object like this:
{
"height": 4000,
"uri": "file:///data/user/0/host.exp.exponent/cache/ExperienceData/%2540anonymous%252Fimage-upload-4ceaf845-d42a-4a1d-b1be-5a3f9cfd10ba/Camera/e3d441b6-2664-4c30-a093-35f7f0f09488.jpg",
"width": 3000,
}
I have made an endpoint in API Gateway, which connects to a Lambda, and I'm sending the above data to my backend like this:
axios
.post(uploadEndPoint, capturedImage, {
headers: {
'Content-Type': 'application/json',
},
})
.then((response) => {
console.log(response);
})
.catch((error) => {
console.log(error.response);
});
I get the captured data in my Lambda.
First I'm trying to save it to S3 without resizing, like this; if this works then I will go for resizing:
import Responses from '../utils/responses';
import * as fileType from 'file-type';
import { v4 as uuid } from 'uuid';
import * as AWS from 'aws-sdk';
const s3 = new AWS.S3();
const allowedMimes = ['image/jpeg', 'image/png', 'image/jpg'];
export const handler = async (event) => {
console.log('event>>', JSON.parse(event.body));
try {
const body = JSON.parse(event.body);
if (!body) {
return Responses._400({ message: 'incorrect body on request' });
}
let imageData = body;
console.log('imageData>>', imageData);
const buffer = Buffer.from(imageData, 'base64');
console.log('buffer>>', buffer);
const fileInfo = await fileType.fromBuffer(buffer);
console.log(fileInfo);
const detectedExt = fileInfo.ext;
const detectedMime = fileInfo.mime;
console.log('detectedExt>>', detectedExt);
console.log('detectedMime>>', detectedMime);
if (!allowedMimes.includes(detectedMime)) {
return Responses._400({ message: 'mime is not allowed ' });
}
const name = uuid();
const key = `${name}.${detectedExt}`;
console.log(`writing image to bucket called ${key}`);
await s3
.putObject({
Body: buffer,
Key: key,
ContentType: detectedMime,
Bucket: process.env.imageUploadBucket,
ACL: 'public-read',
})
.promise();
const url = `https://${process.env.imageUploadBucket}.s3-${process.env.region}.amazonaws.com/${key}`;
return Responses._200({
imageURL: url,
});
} catch (error) {
console.log('error', error);
return Responses._400({
message: error.message || 'failed to upload image',
});
}
};
fileInfo shows undefined.
Things I want:
Resize the image and upload it to S3 from that Lambda.
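For what it's worth, fileType.fromBuffer returns undefined when the buffer does not start with a recognizable image signature; here the request body is the picker object (uri, width, height) rather than the file contents, so the base64 decode cannot yield real image bytes. A hedged sketch of sending the actual base64 data from the Expo client instead, using expo-file-system (the image field name is an assumption; the Lambda would then read body.image rather than the whole body):
import * as FileSystem from 'expo-file-system';
import axios from 'axios';

async function uploadCapturedImage(capturedImage, uploadEndPoint) {
  // Read the local file's bytes as a base64 string instead of posting the picker object itself
  const base64 = await FileSystem.readAsStringAsync(capturedImage.uri, {
    encoding: FileSystem.EncodingType.Base64,
  });
  const response = await axios.post(
    uploadEndPoint,
    { image: base64 }, // the Lambda would then do Buffer.from(body.image, 'base64')
    { headers: { 'Content-Type': 'application/json' } },
  );
  return response.data;
}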
I'm having issues uploading an image from NativeScript to AWS, and I'm pretty sure it's a configuration issue.
Select an image:
const context = imagepicker.create({
mode: 'single' // use "multiple" for multiple selection
});
await context.authorize();
const selection: Array<ImageAsset> = await context.present();
const imageAsset = selection[0];
const source: ImageSource = await new ImageSource().fromAsset(imageAsset);
const fileLocation = imageAsset.android ? imageAsset.android : imageAsset.ios;
const fileType = mime.extension(mime.lookup(fileLocation));
const image = source.toBase64String(fileType);
console.log(image);
image at this point is: iVBORw0KGgoAAAANSUhEUgAAB4AAAASwCAIAAACVUsChAAAAA3NCSVQI...
image location at this point is: /storage/emulated/0/DCIM/Screenshots/Screenshot_20181106-150854.png
const fileLocation = imageAsset.android ? imageAsset.android : imageAsset.ios;
const signedUrl = await this.getSignedUrl(fileLocation);
Backend code to get the signed URL:
const getSignedUrlPromise = (operation, params) => {
return new Promise((resolve, reject) => {
s3.getSignedUrl(operation, params, (err, url) => {
err ? reject(err) : resolve(url);
});
});
}
const params = {
Bucket: BUCKET_NAME,
Key: `abc123/456/3/${fileName}`,
ContentType: contentType,
ContentEncoding: 'base64'
}
const url = await getSignedUrlPromise('putObject', params).catch(err => {
console.log('error', JSON.stringify(err))
return {
statusCode: 400,
body: JSON.stringify(err)
}
});
console.log('success', url);
return {
statusCode: 200,
body: JSON.stringify({ url: url })
}
signedUrl at this point is:
https://myproject.s3.amazonaws.com/abc123/456/3/Screenshot_20181106-150854.png?AWSAccessKeyId=xxxxxxxxxxxxxx&Content-Encoding=base64&Content-Type=image%2Fpng&Expires=1555517358&Signature=yyyyyyyy&x-amz-security-token=long_token
Then, using the signed URL, I upload the image:
const mimeType = mime.lookup(fileLocation);
this.http.put(signedUrl, image, {
headers: {
'Content-Type': mimeType,
'Content-Encoding': 'base64'
}
}).subscribe((resp) => {
console.log('resp2', resp);
});
When I open the file, the image does not display, even though the metadata on the S3 object looks correct.
When I download the file and open it in NP++, I see the base64 value.
iVBORw0KGgoAAAANSUhEUgAAB4AAAASwCAIAAACVUsChAAAAA3NCSVQI...
I also cannot open the downloaded image
ATTEMPT 2
I saw that some people were using buffers, so I changed my image code to:
const image = Buffer.from(source.toBase64String(fileType).replace(/^data:image\/\w+;base64,/, ''), 'base64');
The image is still broken, and when I download and open the file in NP++ I see:
{"type":"Buffer","data":[137,80,78,71,13,10,26,10,0,0,0,13,73,72,68,82,0,0,7,128,0,0,4,176,8,2,0,0,0