Upload file via @aws-sdk/client-s3 and graphql-upload - amazon-s3

S3 ('@aws-sdk/client-s3') upload function:
import { Upload } from '@aws-sdk/lib-storage';
import { PutObjectCommandInput } from '@aws-sdk/client-s3';

async s3UploadPhoto(fileStream, name, mimetype) {
  const fileKey = this.getFileKey(name);
  const sendParams: PutObjectCommandInput = {
    Bucket: process.env.AWS_BUCKET_NAME,
    Body: fileStream,
    Key: fileKey,
    ContentType: mimetype,
  };
  try {
    const parallelUploads3 = new Upload({
      client: this.s3,
      tags: [],
      queueSize: 4,
      leavePartsOnError: false,
      params: sendParams,
    });
    parallelUploads3.on('httpUploadProgress', (progress) => {
      console.log(progress);
    });
    // await here so a rejected upload is actually caught by the catch block below
    return await parallelUploads3.done();
  } catch (e) {
    throw new BadRequestException('');
  }
}
And the GraphQL upload code via 'graphql-upload':
const fileStream = file.createReadStream();
await this.s3Service.s3UploadPhoto(
  fileStream,
  file.filename,
  file.mimetype,
);
I get the error: ReferenceError: ReadableStream is not defined
If I upload a file to S3 without lib-storage, I get the error: Are you using a Stream of unknown length as the Body of a PutObject request? Consider using Upload instead from @aws-sdk/lib-storage.
What is wrong with this code that I get the error "ReadableStream is not defined"?
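The "ReadableStream is not defined" error usually comes from the Node runtime rather than from the upload code itself: @aws-sdk/lib-storage checks the Body against the web ReadableStream class, which only exists as a global in Node 18+. A minimal sketch of a workaround, assuming Node 16.5+ where stream/web ships with the runtime (upgrading to Node 18+ makes the polyfill unnecessary):

import { ReadableStream } from 'stream/web';

// register the web ReadableStream as a global before any S3 upload runs,
// e.g. at the very top of main.ts; on Node 18+ this is already defined
if (typeof globalThis.ReadableStream === 'undefined') {
  (globalThis as any).ReadableStream = ReadableStream;
}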

Related

Nestjs upload file with s3 works only in local

I'm really new to NestJS and I can't understand why my code works perfectly locally but doesn't work on my EC2 instance.
I have this controller:
@Post(':id/add-image')
@UseInterceptors(FileInterceptor('file'))
uploadFile(
  @Param('id') id: string,
  @UploadedFile() file: Express.Multer.File) {
  return this.invitationService.addImage(id, file);
}
And this is my service:
async addImage(id: string, file: Express.Multer.File) {
  const invitation = await this.findOne(id);
  if (!invitation) throw new Error('Invitation not found');
  const s3 = await this.s3Service.uploadFile(file, 'invitations');
  console.log(s3);
  return this.update(id, { image: s3.fileUrl });
}
And finally, my upload function is this:
import { Injectable, Req, Res } from '@nestjs/common';
import * as AWS from "aws-sdk";

@Injectable()
export class S3Service {
  AWS_S3_BUCKET = process.env.AWS_S3_BUCKET;
  s3 = new AWS.S3({
    accessKeyId: process.env.AWS_ACCESS_KEY_ID,
    secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
    signatureVersion: 'v4'
  });

  async uploadFile(file: Express.Multer.File, directory: string) {
    const originalName = file.originalname;
    console.log(file);
    return await this.s3_upload(file.buffer, this.AWS_S3_BUCKET, originalName, file.mimetype, directory);
  }

  async s3_upload(buffer: Buffer, bucket: string, name: string, mimetype: string, directory = 'images') {
    const params = {
      Bucket: `${bucket}/${directory}`,
      Key: String(name),
      Body: buffer,
      ACL: "public-read",
      ContentType: mimetype,
      ContentDisposition: "inline",
      CreateBucketConfiguration: {
        LocationConstraint: process.env.AWS_DEFAULT_REGION
      }
    };
    console.log(params);
    try {
      const s3Response = await this.s3.upload(params).promise();
      console.log(s3Response);
      return {
        fileName: name,
        fileUrl: s3Response.Location,
        key: s3Response.Key,
      };
    } catch (e) {
      console.log(e);
    }
  }
}
When I try all this on localhost it works perfectly, but in production I get a good response from S3 and the URL is returned correctly; however, if I try to use that URL I can't see the file.
All of these files are images, so when I paste the URL returned by S3 into Chrome locally it works, but in production Chrome tries to download the image, and when I try to open that file on my computer it says:
"It may be damaged or use a file format that Preview doesn’t recognize."
If anyone has any idea what might be going on, I'd really appreciate your help.
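Since Chrome downloading the file instead of rendering it usually means the object was stored with an unexpected ContentType or ContentDisposition, one way to narrow this down is to inspect the metadata S3 actually saved for the object. A minimal sketch using the same aws-sdk v2 client (the key below is only a placeholder for whatever s3.upload() returned):

// logs what S3 stored for the object; if ContentType comes back as
// application/octet-stream, file.mimetype was not what you expected in production
const head = await this.s3
  .headObject({
    Bucket: process.env.AWS_S3_BUCKET,
    Key: 'invitations/some-image.png', // placeholder: use the Key returned by upload()
  })
  .promise();
console.log(head.ContentType, head.ContentDisposition);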

Using aws-sdk to upload to DigitalOceans

I'm using aws-sdk to upload images to a DigitalOcean Spaces bucket. On localhost it works 100%, but in production the function seems to run without errors, yet the file never appears in the bucket.
I cannot figure out what is going on and can't think of a way to debug this. I also tried executing the POST request with Postman (multipart/form-data, adding the file to the body of the request) and it is the same: localhost works, production does not.
My API endpoint:
import AWS from 'aws-sdk'
import formidable from "formidable"
import fs from 'fs'

const s3Client = new AWS.S3({
  endpoint: process.env.DO_SPACES_URL,
  region: 'fra1',
  credentials: {
    accessKeyId: process.env.DO_SPACES_KEY,
    secretAccessKey: process.env.DO_SPACES_SECRET
  }
})

export const config = {
  api: {
    bodyParser: false
  }
}

export default async function uploadFile(req, res) {
  const { method } = req
  const form = formidable()
  const now = new Date()
  const fileGenericName = `${now.getTime()}`
  const allowedFileTypes = ['jpg', 'jpeg', 'png', 'webp']
  switch (method) {
    case "POST":
      try {
        form.parse(req, async (err, fields, files) => {
          const fileType = files.file?.originalFilename?.split('.').pop().toLowerCase()
          if (!files.file) {
            return res.status(400).json({
              status: 400,
              message: 'no files'
            })
          }
          if (allowedFileTypes.indexOf(fileType) === -1) {
            return res.status(400).json({
              message: 'bad file type'
            })
          }
          const fileName = `${fileGenericName}.${fileType}`
          try {
            s3Client.putObject({
              Bucket: process.env.DO_SPACES_BUCKET,
              Key: `${fileName}`,
              Body: fs.createReadStream(files.file.filepath),
              ACL: "public-read"
            }, (err, data) => {
              console.log(err)
              console.log(data)
            })
            const url = `${process.env.FILE_URL}/${fileName}`
            return res.status(200).json({ url })
          } catch (error) {
            console.log(error)
            throw new Error('Error Occured While Uploading File')
          }
        });
        return res.status(200)
      } catch (error) {
        console.log(error)
        return res.status(500).end()
      }
    default:
      return res.status(405).end('Method is not allowed')
  }
}
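A likely issue in the handler above is that res.status(200).json({ url }) is sent before the putObject callback ever fires; on a serverless production host the function can be frozen as soon as the response goes out, while a long-lived local dev server stays up and quietly finishes the upload. A minimal sketch of awaiting the upload before responding, using the same aws-sdk v2 client:

// inside form.parse(), after the file-type checks
try {
  // .promise() lets us await the aws-sdk v2 request instead of relying on a
  // callback that may never run before the serverless function is suspended
  await s3Client.putObject({
    Bucket: process.env.DO_SPACES_BUCKET,
    Key: fileName,
    Body: fs.createReadStream(files.file.filepath),
    ACL: "public-read"
  }).promise()

  const url = `${process.env.FILE_URL}/${fileName}`
  return res.status(200).json({ url })
} catch (error) {
  console.log(error)
  return res.status(500).json({ message: 'upload failed' })
}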

React native content uri to base64 string

I'm trying to upload files using RN Document Picker.
Once I get those files selected, I need to turn them into base64 strings so I can send them to my API.
const handlePickFiles = async () => {
  if (await requestExternalStoreageRead()) {
    const results = await DocumentPicker.pickMultiple({
      type: [
        DocumentPicker.types.images,
        DocumentPicker.types.pdf,
        DocumentPicker.types.docx,
        DocumentPicker.types.zip,
      ],
    });
    const newUploadedFile: IUploadedFile[] = [];
    for (const res of results) {
      console.log(JSON.stringify(res, null, 2));
      newUploadedFile.push({
        name: res.name,
        type: res.type as string,
        size: res.size as number,
        extension: res.type!.split('/')[1],
        blob: res.uri, // <-- must turn this into a base64 string
      });
    }
    setUploadedFiles(newUploadedFile);
    console.log(newUploadedFile);
  }
};
The document picker returns a content URI (content://...).
Their docs list this as an example of handling blob data and base64:
let data = new FormData()
data.append('image', {uri: 'content://path/to/content', type: 'image/png', name: 'name'})

const response = await fetch(url, {
  method: 'POST',
  headers: {
    'Content-Type': 'multipart/form-data',
  },
  body: data
})
They basically say that you don't need to use a blob or base64 when using multipart/form-data as the content type. However, my GraphQL endpoint cannot handle multipart data and I don't have time to rewrite the whole API. All I want is to turn the file into a blob and a base64 string, even if other approaches are more performant.
Searching for other libraries, all of them are either no longer maintained or have issues with newer versions of Android. RN Blob Utils is the most recent npm package, and it is no longer maintained either.
I tried to use RN Blob Utils but I either get errors, wrong data type, or the file uploads but is corrupted.
Some other things I found are that I can use:
fetch(res.uri).then(response => {response.blob()})
const response = await ReactNativeBlobUtil.fetch('GET', res.uri);
const data = response.base64();
ReactNativeBlobUtil.wrap(decodeURIComponent(blob))
///----
const blob = ReactNativeBlobUtil.fs.readFile(res.uri, 'base64');
But I can't do anything with that blob file.
What is the simplest way to upload files from the document picker in base64 format? Is it possible to avoid using the external storage permission?
You don't need a third-party package to fetch blob data:
const blob = await new Promise((resolve, reject) => {
  const xhr = new XMLHttpRequest();
  xhr.onload = function () {
    resolve(xhr.response);
  };
  xhr.onerror = function (e) {
    reject(new TypeError("Network request failed"));
  };
  xhr.responseType = "blob";
  xhr.open("GET", "[LOCAL_FILE_PATH]", true);
  xhr.send(null);
});

// Code to submit blob file to server

// We're done with the blob, close and release it
blob.close();
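If what the API ultimately needs is a base64 string rather than a blob, the blob from the XHR above can be converted with FileReader, which React Native provides; a minimal sketch (stripping the data: prefix is only needed if the endpoint expects raw base64):

// convert the blob to a base64 string with React Native's built-in FileReader
const base64 = await new Promise((resolve, reject) => {
  const reader = new FileReader();
  reader.onloadend = () => {
    // reader.result looks like "data:application/pdf;base64,JVBERi0x..."
    resolve(String(reader.result).split(',')[1]); // keep only the base64 payload
  };
  reader.onerror = reject;
  reader.readAsDataURL(blob);
});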
I ended up using react-native-blob-util
const res = await DocumentPicker.pickSingle({
  type: [
    DocumentPicker.types.images,
    DocumentPicker.types.pdf,
    DocumentPicker.types.docx,
    DocumentPicker.types.zip,
  ],
});
const newUploadedFile: IUploadedFile[] = [];
const fileType = res.type;
if (fileType) {
  const fileExtension = fileType.substr(fileType.indexOf('/') + 1);
  const realURI = Platform.select({
    android: res.uri,
    ios: decodeURI(res.uri),
  });
  if (realURI) {
    const b64 = await ReactNativeBlobUtil.fs.readFile(
      realURI,
      'base64',
    );
    const filename = res.name.replace(/\s/g, '');
    const path = uuid.v4();
    newUploadedFile.push({
      name: filename,
      type: fileType,
      size: res.size as number,
      extension: fileExtension,
      blob: b64,
      path: Array.isArray(path) ? path.join() : path,
    });
  } else {
    throw new Error('Failed to process file');
  }
} else {
  throw new Error('Failed to process file');
}

Cloudinary\Error: Missing required parameter - file - Express and Postman

This is my first time trying to upload images to Cloudinary, and I have come across an issue when using Express via Postman.
I'm using form-data in Postman with the key set to 'file' to upload an image to Cloudinary.
As of now, when I try to access req.body I get an empty object, so I guess that has to do with why cloudinary.uploader.upload cannot read the file passed as its first param, since it's req.body.file, as shown in the code below.
cloudinary.config({
  cloud_name: process.env.CLOUDINARY_CLOUD_NAME,
  api_key: process.env.CLOUDINARY_KEY,
  api_secret: process.env.CLOUDINARY_SECRET
})

exports.upload = async (req, res) => {
  try {
    console.log(req.body);
    const result = await cloudinary.uploader.upload(req.body.file, {
      public_id: `${Date.now()}`,
      resource_type: "auto"
    })
    return res.json({
      public_id: result.public_id,
      url: result.secure_url
    })
  } catch (err) {
    console.log(err)
  }
}
The error message I get:
{
  message: 'Missing required parameter - file',
  name: 'Error',
  http_code: 400
}
Any suggestions to solve this issue?
I solved it! I was not able to pass the form-data as req.body to the server, so I tried to access it through req.files, but that didn't work either. I searched a bit and found the middleware 'express-fileupload', and that did the trick. I just added it in my app.js and used
const fileupload = require('express-fileupload');
app.use(fileupload({useTempFiles: true}))
So now I can access my req.files.
exports.upload = async (req, res) => {
  const file = req.files.image
  try {
    console.log(file);
    const result = await cloudinary.uploader.upload(file.tempFilePath, {
      public_id: `${Date.now()}`,
      resource_type: "auto"
    })
    res.json({
      public_id: result.public_id,
      url: result.secure_url
    })
  } catch (err) {
    console.log("Error", err)
    return res.status(400).json({error: err})
  }
}
The response I get is:
{
  name: 'some-image.png',
  data: <Buffer >,
  size: 99770,
  encoding: '7bit',
  tempFilePath: 'C:\\filepath\some-image.png',
  truncated: false,
  mimetype: 'image/png',
  md5: 'b5f612a571442bf604952fd12c47c1bf',
  mv: [Function: mv]
}
POST /cloudinary/upload-images 200 1617.944 ms - 119
And it is uploaded successfully to my Cloudinary.
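As a side note on the express-fileupload approach: tempFilePath only exists because of useTempFiles: true. If the middleware is used without that option, the file stays in memory as req.files.image.data, and Cloudinary's upload() can be given a base64 data URI instead. A sketch of that variant:

// without useTempFiles the upload arrives as a Buffer on req.files.image.data
const file = req.files.image
const dataUri = `data:${file.mimetype};base64,${file.data.toString('base64')}`
const result = await cloudinary.uploader.upload(dataUri, {
  public_id: `${Date.now()}`,
  resource_type: "auto"
})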
const result = await cloudinary.uploader.upload(req.file, {
  public_id: `${Date.now()}`,
  resource_type: "auto"
})
and add the file in form-data, with the type set to File.
Solved!
This is how I am setting the FormData:
let myTestForm = new FormData();
myTestForm.set("name", name);
myTestForm.set("email", email);
myTestForm.set("Avatar", Avatar);
myTestForm.set("password", password);
This is how I am sending the FormData:
const config = {
  headers: {
    "Content-type": "multipart/form-data",
  },
};
const { data } = await axios.post(`/api/v1/register`, userData, { config });
Please don't pass it as { userData }; I struggled with this for a while :/
This is how I am uploading the image:
const myCloud = await cloudinary.v2.uploader.upload(req.body.Avatar, {
  folder: "Avatars",
  width: 150,
  crop: "scale",
  public_id: `${Date.now()}`,
  resource_type: "auto",
});
PS: in my case I had to upload only one image, and I have not passed any parameters to fileUpload() in the app.js file:
app.use(bodyParser.urlencoded({ extended: true }));
app.use(fileUpload());

React Native - send image from local cache to firebase storage

With React Native on Android I am trying to send the user's profile image from the local cache to a Firebase Storage bucket. If I send it as a blob or Uint8Array, when I open the image on Firebase I get the error The image "https://firebasestorage<rest of url here>" cannot be displayed because it contains errors. If I send it as a base64 data URL, it does not upload to the bucket and I get the message Firebase Storage: String does not match format 'base64': Invalid character found. I have tested the base64 data URL with a decoder and it works. How can I get this to work, either as a blob, Uint8Array or base64? Here is the code:
As blob
let mime = 'image/jpeg';
getFile(imageUri)
  .then(data => {
    return new Blob([data], { type: mime });
  })
  .then(blob => {
    return imageRef.put(blob, { contentType: mime });
  });

async function getFile(imageUri) {
  let bytes = await FileSystem.readAsStringAsync(imageUri);
  return Promise.resolve(bytes);
}
As Uint8Array
let mime = 'image/jpeg';
getFile(imageUri)
  .then(data => {
    return imageRef.put(data, { contentType: mime });
  });

async function getFile(imageUri) {
  let bytes = await FileSystem.readAsStringAsync(imageUri);
  const imageBytes = new Uint8Array(bytes.length);
  for (let i = 0; i < imageBytes.length; i++) {
    imageBytes[i] = bytes.charCodeAt(i);
  }
  return Promise.resolve(imageBytes);
}
As base64 data url
imageBase64Url = "data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAMAAAADCAIAAADZSiLoAAAAF0lEQVQI12P8//8/AwMDAwMDEwMMIFgAVCQDA25yGkkAAAAASUVORK5CYII=";
return imageRef.putString(imageBase64Url, 'data_url');
The URI
I retrieve the uri from this object:
Object {
  "cancelled": false,
  "height": 60,
  "type": "image",
  "uri": "file:///data/user/0/host.exp.exponent/cache/ExperienceData/%2540anonymous%252FMCC_Project-ee81e7bd-82b1-4624-8c6f-8c882fb131c4/ImagePicker/6ec14b33-d2ec-4f80-8edc-2ee501bf6e92.jpg",
  "width": 80,
}
We found at least two problems with the way I was trying to retrieve the picture and send it to the Firebase bucket:
1) When retrieving the image from memory and trying to send it as a blob to the bucket, FileSystem.readAsStringAsync(imageUri) was for some reason returning a corrupted file.
2) When instead trying to save the image to the Firebase bucket as base64, the problem seems to be on Firebase's side, since not even the exact examples provided at https://firebase.google.com/docs/storage/web/upload-files were working.
The solution:
We retrieved the image from the local cache with XMLHttpRequest instead of Expo's FileSystem, and saved it to the Firebase bucket as a blob:
import firebase from './firebase';

export default async function saveImage(picture, uid) {
  const storageRef = firebase
    .storage('gs://<bucket-here>')
    .ref(uid + '/' + 'profile-picture.jpeg');

  // fetch the local file as a blob via XMLHttpRequest
  const blob = await new Promise((resolve, reject) => {
    const xhr = new XMLHttpRequest();
    xhr.onload = function() {
      resolve(xhr.response);
    };
    xhr.onerror = function(e) {
      console.log(e);
      reject(new TypeError('Network request failed'));
    };
    xhr.responseType = 'blob';
    xhr.open('GET', picture.uri, true);
    xhr.send(null);
  });

  const metadata = {
    contentType: 'image/jpeg',
  };

  // upload the blob and resolve with its download URL
  return await new Promise((resolve, reject) => {
    try {
      storageRef.put(blob, metadata).then(snapshot => {
        snapshot.ref.getDownloadURL().then(downloadURL => {
          resolve(downloadURL);
        });
      });
    } catch (err) {
      reject(err);
    }
  });
}
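A minimal usage sketch, assuming the older Expo ImagePicker API that returns the { cancelled, uri } shape shown in the question and a uid from Firebase Auth (the import path for saveImage is illustrative):

import * as ImagePicker from 'expo-image-picker';
import saveImage from './saveImage'; // wherever the function above lives

async function updateProfilePicture(user) {
  // "result" has the same shape as the picker object shown in the question
  const result = await ImagePicker.launchImageLibraryAsync();
  if (!result.cancelled) {
    const downloadURL = await saveImage(result, user.uid);
    console.log('profile picture available at', downloadURL);
  }
}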