S3 to IPFS from Pinata

I am trying to upload a lot of files from S3 to IPFS via Pinata, but I haven't found anything like this in the Pinata documentation.
This is my solution, using the form-data library. I haven't tested it yet (I will soon; I still need to code a few things).
Is this a correct approach? Has anyone done something similar?
async uploadImagesFolder(
  items: ItemDocument[],
  bucket?: string,
  path?: string,
) {
  try {
    const form = new FormData();
    for (const item of items) {
      const file = getObjectStream(item.tokenURI, bucket, path);
      form.append('file', file, {
        filename: item.tokenURI,
      });
    }
    console.log(`Uploading files to IPFS`);
    const pinataOptions: PinataOptions = {
      cidVersion: 1,
    };
    const result = await pinata.pinFileToIPFS(form, {
      pinataOptions,
    });
    console.log(`Pinata Response:`, JSON.stringify(result, null, 2));
    return result.IpfsHash;
  } catch (e) {
    console.error(e);
  }
}
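Here getObjectStream is a helper of mine that returns the S3 object body as a stream; its body is not part of this post. A minimal sketch of such a helper, assuming the AWS SDK for JavaScript v2 (whose getObject(...).createReadStream() returns a stream synchronously, which matches the non-awaited call above):

import * as AWS from 'aws-sdk';
import { Readable } from 'stream';

const s3 = new AWS.S3();

// Hypothetical helper, not part of the original code: returns the S3 object
// body as a readable stream (AWS SDK v2, where createReadStream is synchronous).
function getObjectStream(key: string, bucket?: string, path?: string): Readable {
  return s3
    .getObject({
      Bucket: bucket ?? process.env.S3_BUCKET!,
      Key: path ? `${path}/${key}` : key,
    })
    .createReadStream();
}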

I had the same problem.
I found this article: https://medium.com/pinata/stream-files-from-aws-s3-to-ipfs-a0e23ffb7ae5
But if I am not wrong, the article uses a version of the AWS SDK for JavaScript other than v3 (nowadays the most recent: https://docs.aws.amazon.com/AWSJavaScriptSDK/v3/latest/index.html).
This is for the client side with TypeScript.
If you are on v3, this code snippet works for me:
import { S3Client, GetObjectCommand } from '@aws-sdk/client-s3'

export const getStreamObjectInAwsS3 = async (data: YourParamsType) => {
  try {
    const BUCKET = data.bucketTarget
    const KEY = data.key
    const client = new S3Client({
      region: 'your-region',
      credentials: {
        accessKeyId: 'your-access-key',
        secretAccessKey: 'secret-key'
      }
    })
    const resource = await client.send(new GetObjectCommand({
      Bucket: BUCKET,
      Key: KEY
    }))
    const response = resource.Body
    if (response) {
      // transformToByteArray() consumes the streamed body into a Uint8Array
      return new Response(await response.transformToByteArray()).blob()
    }
    return null
  } catch (error) {
    return null
  }
}
With the previous code you can get the Blob object, pass it into a File object, and then upload it with this method, which returns the resource URL from the API:
import axios from 'axios'

export const uploadFileToIPFS = async (file: File) => {
  const url = `https://api.pinata.cloud/pinning/pinFileToIPFS`
  const data = new FormData()
  data.append('file', file)
  try {
    // the FormData is passed once as the request body; no separate `data` option needed
    const response = await axios.post(url, data, {
      maxBodyLength: Infinity,
      headers: {
        pinata_api_key: 'your-api-key',
        pinata_secret_api_key: 'your-secret'
      }
    })
    return {
      success: true,
      pinataURL: `https://gateway.pinata.cloud/ipfs/${response.data.IpfsHash}`
    }
  } catch (error) {
    console.log(error)
    return null
  }
}
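Putting the two together, a hypothetical usage sketch (the bucket name and key below are placeholders):

// Hypothetical glue code (run inside an async function on the client):
const blob = await getStreamObjectInAwsS3({ bucketTarget: 'my-bucket', key: 'images/1.png' })
if (blob) {
  // wrap the Blob in a File so FormData gets a filename
  const file = new File([blob], '1.png', { type: blob.type })
  const result = await uploadFileToIPFS(file)
  console.log(result?.pinataURL)
}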
I found this solution in a nice article, and you can explore other implementations there (including the Node.js side).

Related

Problem generating a PDF from a blob in an Expo app using FileSystem

I get a blob and I treat it like this:
const file = response.data;
var blob = new Blob([file], {
  type: 'application/pdf',
});
const fileReaderInstance = new FileReader();
fileReaderInstance.readAsDataURL(blob);
fileReaderInstance.onload = async () => {
  const fileUri = `${FileSystem.documentDirectory}file.pdf`;
  await FileSystem.writeAsStringAsync(
    fileUri,
    fileReaderInstance.result.split(',')[1],
    {
      encoding: FileSystem.EncodingType.Base64,
    }
  );
  console.log(fileUri);
  Sharing.shareAsync(fileUri);
};
However, when I generate and share the file, I can't access it, and if I take its URI and open it on the web it returns:
I solved my problem in this way:
This is a function that gathers the other data needed for the request, makes the request (SaleService.generatePDF()), and converts the received blob into a base64 buffer (fileReaderInstance.result), which is written to a file and shared with Sharing.shareAsync():
const generatePDF = async () => {
  setAnimating(true);
  const companyResponse = await CompanyService.getCompany();
  const peopleResponse = await PeopleService.getPerson(sale.customerId);
  const company = companyResponse.response.company;
  const people = peopleResponse.response;
  const quote = false;
  const json = await SaleService.generatePDF({
    sale,
    company,
    people,
    quote,
  });
  if (json && json.success) {
    try {
      const fileReaderInstance = new FileReader();
      fileReaderInstance.readAsDataURL(json.data);
      fileReaderInstance.onload = async () => {
        const base64data = fileReaderInstance.result.split(',');
        const pdfBuffer = base64data[1];
        const path = `${FileSystem.documentDirectory}/${sale._id}.pdf`;
        await FileSystem.writeAsStringAsync(`${path}`, pdfBuffer, {
          encoding: FileSystem.EncodingType.Base64,
        });
        await Sharing.shareAsync(path, { mimeType: 'application/pdf' });
      };
    } catch (error) {
      Alert.alert('Error generating the PDF', error.message);
    }
  }
  setAnimating(false);
}
This is the function SaleService.generatePDF, which makes the request to the API with the given parameters and returns the PDF blob, using axios:
generatePDF: async ({ sale, company, people, quote }) => {
  const token = await AsyncStorage.getItem('token');
  const body = { sale, company, people, quote };
  try {
    const response = await axios(`${BASE_API}/generate-sale-pdf`, {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        Authorization: token,
      },
      responseType: 'blob',
      data: body,
    });
    return {
      success: true,
      data: response.data,
    };
  } catch (err) {
    return err.error;
  }
},
I solved this problem by passing the blob string to the writeAsStringAsync method of Expo's FileSystem library.
const blobDat = data.data[0].data; // blob data coming from an API call
const fileUri = FileSystem.documentDirectory + `testt.pdf`; // directory link of the file to be saved
await FileSystem.writeAsStringAsync(fileUri, blobDat, {
  encoding: FileSystem.EncodingType.UTF8,
}); // this step writes the blob string to the PDF fileUri
// Prompts the user with available applications to open the created PDF.
await IntentLauncher.startActivityAsync("android.intent.action.VIEW", {
  data: fileUri,
  flags: 1,
  type: "application/pdf",
});

MERN and Amazon-s3 for file upload

How do I post a file to Amazon S3 using Node and React, and save its path to MongoDB, with Mongoose and Formidable?
private async storeFile(file: { buffer: Buffer, fileId: string }): Promise<string> {
  const awsConfig = new AWS.Config(storageConfig);
  const s3 = new AWS.S3(awsConfig);
  const params = {
    Bucket: storageConfig.s3Bucket,
    Key: `${storageConfig.s3Prefix}${file.fileId}`,
    Body: file.buffer, // the file is already a Buffer in memory; no fs.readFile needed
  };
  // upload() returns a ManagedUpload; .promise() lets us await the S3 location
  const s3Data: AWS.S3.ManagedUpload.SendData = await s3.upload(params).promise();
  return s3Data.Location;
}
In your service file, wherever you want to call this function, update the record in the collection (assuming File is the Mongoose model for the collection):

const storageLink = await this.storeFile({ buffer, fileId });
// File is the Mongoose model for the files collection
const file = await File.updateOne({ _id: fileId }, {
  status: fileStatus.UPLOADED, // just a flag
  fileId: storageLink,
});
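Since the question also mentions Formidable, here is a minimal sketch of parsing the multipart request before calling storeFile, assuming Express and formidable v2; the route, the field names, and the `service` object holding storeFile are placeholders:

const express = require('express');
const formidable = require('formidable');
const fs = require('fs');

const app = express();

app.post('/upload', (req, res) => {
  const form = new formidable.IncomingForm();
  form.parse(req, async (err, fields, files) => {
    if (err) return res.status(400).json({ error: err.message });
    // Formidable writes the upload to a temp file; read it back into a buffer.
    // (files.file.filepath in formidable v2; files.file.path in v1.)
    const buffer = fs.readFileSync(files.file.filepath);
    const location = await service.storeFile({ buffer, fileId: fields.fileId });
    res.json({ location });
  });
});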

How to return file Location after uploading to s3

I am new to Node.js and Express.
I use the following function to upload an image to S3:
function defaultContentType(req, file, cb) {
  setImmediate(function () {
    var ct = file.contentType || file.mimetype || 'application/octet-stream';
    cb(null, ct);
  });
}

module.exports = async function (fileName, file) {
  aws.config.update({
    secretAccessKey: process.env.AWSSecretKey,
    accessKeyId: process.env.AWSAccessKeyId,
    contentType: defaultContentType,
  });
  var s3bucket = new aws.S3({
    params: {
      Bucket: process.env.S3_Bucket_Name,
    }
  });
  var params = {
    Key: fileName,
    Body: file
  };
  var fileData = await s3bucket.upload(params, function (err, data) {
    if (err) {
      throw err;
    } else {
      return data;
    }
  });
  return fileData;
}
Before uploading the image, I resize it using:

request(req.file.location, async function (err, response, body) {
  var fileInstance = await sharp(body);
  var resizeFile = await fileInstance.resize({
    height: 150,
    fit: 'inside'
  });
  var data = await s3Upload('mobile_' + req.file.key, resizeFile);
  req.mobile = data.Location;
  next();
});
The problem I have is this: the image does get resized and saved to S3, but the "s3Upload" function does not return the file location. It seems the operation takes some time to complete, and before it completes, undefined is returned.
Can anyone suggest a way to fix this?
Modified method:

module.exports = function (fileName, file, finishCallback) {
  // more code
  s3bucket.upload(params, function (err, data) {
    if (err) {
      throw err;
    } else {
      finishCallback(data);
    }
  });
}
and modified the upload call as:

s3Upload('mobile_' + req.file.key, resizeFile, (data) => {
  req.mobile = data.Location;
  next();
});
This seems to be working as expected.
I am not really sure this is the correct way to do things.
Is there a way to do this correctly?
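For what it's worth, a more idiomatic sketch (assuming the AWS SDK v2): upload() returns a ManagedUpload whose .promise() method resolves with the upload result, so the helper can be awaited instead of threading a callback through:

module.exports = async function (fileName, file) {
  var s3bucket = new aws.S3({
    params: { Bucket: process.env.S3_Bucket_Name }
  });
  // .promise() resolves with the SendData object (Location, Key, ETag, ...)
  var data = await s3bucket.upload({ Key: fileName, Body: file }).promise();
  return data; // callers can then read (await s3Upload(...)).Location
};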

Cannot upload image to S3 using serverless framework but it works offline (buffer issue)

I'm trying to deploy a Lambda function that lets me upload a picture to S3.
The Lambda works well offline, but when I deploy it to AWS the function doesn't work.
The first error I encountered was this one:
ERROR (node:7) [DEP0005] DeprecationWarning: Buffer() is deprecated due to security and usability issues. Please use the Buffer.alloc(), Buffer.allocUnsafe(), or Buffer.from() methods instead.
So I followed the recommendation to use the Buffer.from() method instead, but that doesn't work either: the Lambda runs until it times out.
Can someone tell me where I went wrong, or suggest another solution?
Below is my Lambda function:
const AWS = require("aws-sdk");
const Busboy = require("busboy");
const uuidv4 = require("uuid/v4");
require("dotenv").config();

AWS.config.update({
  accessKeyId: process.env.ACCESS_KEY_ID,
  secretAccessKey: process.env.SECRET_ACCESS_KEY,
  subregion: process.env.SUB_REGION
});

const s3 = new AWS.S3();

const getContentType = event => {
  // see the second block of code
};

const parser = event => {
  // see the third block of code
};

module.exports.main = (event, context, callback) => {
  context.callbackWaitsForEmptyEventLoop = false;
  const uuid = uuidv4();
  const uploadFile = async (image, uuid) =>
    new Promise(() => {
      // const bitmap = new Buffer(image, "base64"); // <====== deprecated
      const bitmap = Buffer.from(image, "base64"); // <======== problem here
      const params = {
        Bucket: "my_bucket",
        Key: `${uuid}.jpeg`,
        ACL: "public-read",
        Body: bitmap,
        ContentType: "image/jpeg"
      };
      s3.putObject(params, function(err, data) {
        if (err) {
          return callback(null, "ERROR");
        }
        return callback(null, "SUCCESS");
      });
    });
  parser(event).then(() => {
    uploadFile(event.body.file, uuid);
  });
};
getContentType():

const getContentType = event => {
  const contentType = event.headers["content-type"];
  if (!contentType) {
    return event.headers["Content-Type"];
  }
  return contentType;
};
parser():

const parser = event =>
  new Promise((resolve, reject) => {
    const busboy = new Busboy({
      headers: {
        "content-type": getContentType(event)
      }
    });
    const result = {};
    busboy.on("file", (fieldname, file, filename, encoding, mimetype) => {
      file.on("data", data => {
        result.file = data;
      });
      file.on("end", () => {
        result.filename = filename;
        result.contentType = mimetype;
      });
    });
    busboy.on("field", (fieldname, value) => {
      result[fieldname] = value;
    });
    busboy.on("error", error => reject(error));
    busboy.on("finish", () => {
      event.body = result;
      resolve(event);
    });
    busboy.write(event.body, event.isBase64Encoded ? "base64" : "binary");
    busboy.end();
  });
For reference, the recommended replacements for the deprecated Buffer constructor are:

new Buffer(number)            // Old
Buffer.alloc(number)          // New

new Buffer(string)            // Old
Buffer.from(string)           // New

new Buffer(string, encoding)  // Old
Buffer.from(string, encoding) // New

new Buffer(...arguments)      // Old
Buffer.from(...arguments)     // New
You are using callbackWaitsForEmptyEventLoop, which basically makes the Lambda function think its work is not over yet. Also, you are wrapping the upload in a promise but never resolving it. You can simplify this logic using the built-in promise support in aws-sdk; with an async handler that returns the awaited result, the invocation ends when the upload completes instead of waiting on the event loop:
module.exports.main = async event => {
  const uuid = uuidv4();
  await parser(event); // not sure if this needs to be async or not; check
  const bitmap = Buffer.from(event.body.file, "base64");
  const params = {
    Bucket: "my_bucket",
    Key: `${uuid}.jpeg`,
    ACL: "public-read",
    Body: bitmap,
    ContentType: "image/jpeg"
  };
  const response = await s3.putObject(params).promise();
  return response;
};

React Native - send image from local cache to firebase storage

With React Native on Android, I am trying to send the user's profile image from the local cache to a Firebase Storage bucket. If I send it as a blob or Uint8Array, when I open the image on Firebase I get the error The image "https://firebasestorage<rest of url here>" cannot be displayed because it contains errors. If I send it as a base64 data URL, it does not upload to the bucket and I get the message Firebase Storage: String does not match format 'base64': Invalid character found. I have tested the base64 data URL with a decoder and it works. How can I get this to work, either as blob, Uint8Array, or base64? Here is the code:
As blob

let mime = 'image/jpeg';
getFile(imageUri)
  .then(data => {
    return new Blob([data], { type: mime });
  })
  .then(blob => {
    return imageRef.put(blob, { contentType: mime });
  });

async function getFile(imageUri) {
  let bytes = await FileSystem.readAsStringAsync(imageUri);
  return Promise.resolve(bytes);
}
As Uint8Array

let mime = 'image/jpeg';
getFile(imageUri)
  .then(data => {
    return imageRef.put(data, { contentType: mime });
  });

async function getFile(imageUri) {
  let bytes = await FileSystem.readAsStringAsync(imageUri);
  const imageBytes = new Uint8Array(bytes.length);
  for (let i = 0; i < imageBytes.length; i++) {
    imageBytes[i] = bytes.charCodeAt(i);
  }
  return Promise.resolve(imageBytes);
}
As base64 data URL

imageBase64Url = "data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAMAAAADCAIAAADZSiLoAAAAF0lEQVQI12P8//8/AwMDAwMDEwMMIFgAVCQDA25yGkkAAAAASUVORK5CYII=";
return imageRef.putString(imageBase64Url, 'data_url');
The URI

I retrieve the URI from this object:

Object {
  "cancelled": false,
  "height": 60,
  "type": "image",
  "uri": "file:///data/user/0/host.exp.exponent/cache/ExperienceData/%2540anonymous%252FMCC_Project-ee81e7bd-82b1-4624-8c6f-8c882fb131c4/ImagePicker/6ec14b33-d2ec-4f80-8edc-2ee501bf6e92.jpg",
  "width": 80,
}
We found at least two problems with the way I was trying to retrieve the picture and send it to the Firebase bucket:
1) When retrieving the image from memory and trying to send it as a blob to the bucket, FileSystem.readAsStringAsync(imageUri) was, for some reason, returning a corrupted file.
2) When instead trying to save the image to the Firebase bucket as base64, the problem seems to be with Firebase, since not even the examples provided at https://firebase.google.com/docs/storage/web/upload-files were working.
The solution:
We retrieved the image from the local cache with XMLHttpRequest instead of Expo's FileSystem, and saved it to the Firebase bucket as a blob:
import firebase from './firebase';

export default async function saveImage(picture, uid) {
  const storageRef = firebase
    .storage('gs://*<bucket-here>*')
    .ref(uid + '/' + 'profile-picture.jpeg');
  // Fetch the local file as a blob via XMLHttpRequest
  const blob = await new Promise((resolve, reject) => {
    const xhr = new XMLHttpRequest();
    xhr.onload = function() {
      resolve(xhr.response);
    };
    xhr.onerror = function(e) {
      console.log(e);
      reject(new TypeError('Network request failed'));
    };
    xhr.responseType = 'blob';
    xhr.open('GET', picture.uri, true);
    xhr.send(null);
  });
  const metadata = {
    contentType: 'image/jpeg',
  };
  // Upload the blob, then resolve with its download URL
  return await new Promise((resolve, reject) => {
    try {
      storageRef.put(blob, metadata).then(snapshot => {
        snapshot.ref.getDownloadURL().then(downloadURL => {
          resolve(downloadURL);
        });
      });
    } catch (err) {
      reject(err);
    }
  });
}
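A hypothetical usage sketch, assuming Expo's ImagePicker (whose result object matches the one shown above) and a signed-in Firebase user:

import * as ImagePicker from 'expo-image-picker';

// e.g. after picking an image, inside an async function:
const result = await ImagePicker.launchImageLibraryAsync();
if (!result.cancelled) {
  // currentUser is assumed to be the signed-in Firebase user
  const downloadURL = await saveImage(result, currentUser.uid);
  console.log('Profile picture stored at:', downloadURL);
}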