How to return file Location after uploading to s3 - express

I am new to Node.js and Express.
I use the following function to upload an image to S3:
function defaultContentType(req, file, cb) {
    setImmediate(function () {
        var ct = file.contentType || file.mimetype || 'application/octet-stream';
        cb(null, ct);
    });
}

module.exports = async function (fileName, file) {
    aws.config.update({
        secretAccessKey: process.env.AWSSecretKey,
        accessKeyId: process.env.AWSAccessKeyId,
        contentType: defaultContentType,
    });
    var s3bucket = new aws.S3({
        params: {
            Bucket: process.env.S3_Bucket_Name,
        }
    });
    var params = {
        Key: fileName,
        Body: file
    };
    var fileData = await s3bucket.upload(params, function (err, data) {
        if (err) {
            throw err;
        } else {
            return data;
        }
    });
    return fileData;
}
Before uploading the image, I resize it using:
request(req.file.location, async function (err, response, body) {
    var fileInstance = await sharp(body);
    var resizeFile = await fileInstance.resize({
        height: 150,
        fit: 'inside'
    });
    var data = await s3Upload('mobile_' + req.file.key, resizeFile);
    req.mobile = data.Location;
    next();
});
The problem I have is this: the image does get resized and saved to S3, but the "s3Upload" function does not return the file location.
It seems the upload takes some time to complete, and an undefined value is returned before it finishes.
Can anyone suggest a way to fix this?
Modified method
module.exports = function (fileName, file, finishCallback) {
    // more code
    s3bucket.upload(params, function (err, data) {
        if (err) {
            throw err;
        } else {
            finishCallback(data);
        }
    });
}
and modified the call to the upload method as:
s3Upload('mobile_' + req.file.key, resizeFile, (data) => {
    req.mobile = data.Location;
    next();
});
This seems to work as expected, but I am not really sure this is the correct way to do things.
Is there a way to do this correctly?
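For what it's worth, the original version returns undefined because s3bucket.upload() returns a ManagedUpload object, not a promise; awaiting it resolves immediately, and the value returned inside the callback goes nowhere. A minimal promise-based sketch, assuming the AWS SDK v2 where upload(params).promise() is available:

module.exports = async function (fileName, file) {
    var s3bucket = new aws.S3({
        params: { Bucket: process.env.S3_Bucket_Name }
    });
    // .promise() turns the managed upload into a real promise, so await
    // yields the result object ({ Location, Key, Bucket, ... }).
    var fileData = await s3bucket.upload({ Key: fileName, Body: file }).promise();
    return fileData;
}

With this, the caller can keep the original await style and read data.Location directly, with no extra callback parameter needed.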

Related

S3 to IPFS from Pinata

I am trying to upload a lot of files from S3 to IPFS via Pinata. I haven't found anything like that in the Pinata documentation.
This is my solution, using the form-data library. I haven't tested it yet (I will do it soon; I need to code some things first).
Is it a correct approach? Has anyone done something similar?
async uploadImagesFolder(
    items: ItemDocument[],
    bucket?: string,
    path?: string,
) {
    try {
        const form = new FormData();
        for (const item of items) {
            const file = getObjectStream(item.tokenURI, bucket, path);
            form.append('file', file, {
                filename: item.tokenURI,
            });
        }
        console.log(`Uploading files to IPFS`);
        const pinataOptions: PinataOptions = {
            cidVersion: 1,
        };
        const result = await pinata.pinFileToIPFS(form, {
            pinataOptions,
        });
        console.log(`Piñata Response:`, JSON.stringify(result, null, 2));
        return result.IpfsHash;
    } catch (e) {
        console.error(e);
    }
}
I had the same problem.
So, I found this: https://medium.com/pinata/stream-files-from-aws-s3-to-ipfs-a0e23ffb7ae5
But, if I am not wrong, the article uses a version different from the JavaScript AWS SDK v3 (nowadays the most recent: https://docs.aws.amazon.com/AWSJavaScriptSDK/v3/latest/index.html).
This is for the client side with TypeScript. If you have that version, this code snippet works for me:
export const getStreamObjectInAwsS3 = async (data: YourParamsType) => {
    try {
        const BUCKET = data.bucketTarget
        const KEY = data.key
        const client = new S3Client({
            region: 'your-region',
            credentials: {
                accessKeyId: 'your-access-key',
                secretAccessKey: 'secret-key'
            }
        })
        const resource = await client.send(new GetObjectCommand({
            Bucket: BUCKET,
            Key: KEY
        }))
        const response = resource.Body
        if (response) {
            return new Response(await response.transformToByteArray()).blob()
        }
        return null
    } catch (error) {
        return null
    }
}
With the previous code you can get the Blob object, pass it to the File object, and get the URL of the resource using the API with this method:
export const uploadFileToIPFS = async (file: Response) => {
    const url = `https://api.pinata.cloud/pinning/pinFileToIPFS`
    const data = new FormData()
    data.append('file', file)
    try {
        // the body goes in as the second argument of axios.post;
        // a duplicate `data` entry in the config object is ignored
        const response = await axios.post(url, data, {
            maxBodyLength: Infinity,
            headers: {
                pinata_api_key: 'your-api',
                pinata_secret_api_key: 'your-secret'
            }
        })
        return {
            success: true,
            pinataURL: `https://gateway.pinata.cloud/ipfs/${response.data.IpfsHash}`
        }
    } catch (error) {
        console.log(error)
        return null
    }
}
I found this solution in a nice article, and you can explore other implementations there (including the Node.js side).
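For reference, a minimal sketch of the Node.js side of the same idea (streaming an S3 object straight into Pinata); it assumes @aws-sdk/client-s3 (v3) and @pinata/sdk, and the region, credential, bucket, and key values are placeholders:

const { S3Client, GetObjectCommand } = require('@aws-sdk/client-s3');
const pinataSDK = require('@pinata/sdk');

const s3 = new S3Client({ region: 'your-region' });
const pinata = new pinataSDK('your-api-key', 'your-secret-key');

async function pinS3Object(bucket, key) {
    // In SDK v3 on Node.js, Body is a readable stream.
    const { Body } = await s3.send(new GetObjectCommand({ Bucket: bucket, Key: key }));
    // @pinata/sdk wraps the stream in form-data, which needs a filename;
    // setting a path property on the stream is the usual workaround.
    Body.path = key;
    const result = await pinata.pinFileToIPFS(Body, { pinataOptions: { cidVersion: 1 } });
    return result.IpfsHash;
}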

MERN and Amazon S3 for file upload

How do I post a file to Amazon S3 using Node and React and save its path to MongoDB, with Mongoose and Formidable?
private async storeFile(file: { buffer: Buffer, fileId: string }): Promise<string> {
    const awsConfig = new AWS.Config(storageConfig);
    const s3 = new AWS.S3(awsConfig);
    const params = {
        Bucket: storageConfig.s3Bucket,
        Key: `${storageConfig.s3Prefix}${file.fileId}`,
        // The file already arrives as a Buffer, so pass it to S3 directly;
        // fs.readFile() expects a path, not a buffer.
        Body: file.buffer,
    };
    // upload(...).promise() resolves only after S3 confirms the upload, so
    // Location is available before the function returns (the original version
    // returned storageLink before the upload callback had ever run).
    const s3Data: AWS.S3.ManagedUpload.SendData = await s3.upload(params).promise();
    return s3Data.Location;
}
In your service file, where you want to call this function, update the record in the collection:
// storeFile is async, so await it; otherwise storageLink is a pending promise.
const storageLink = await this.storeFile({ buffer, fileId });
// `fileModel` stands in for your Mongoose model here; the original snippet
// referenced `file` on both sides of the assignment.
const file = await fileModel.updateOne({ _id: fileId }, {
    status: fileStatus.UPLOADED, // just a flag
    fileId: storageLink,
});

Cannot upload image to S3 using the Serverless framework, but it works offline (buffer issue)

I'm trying to deploy a Lambda function that allows me to upload a picture to S3.
The Lambda works well offline, but when I deploy it to AWS, the function doesn't work.
The first error I encountered was this one:
ERROR (node:7) [DEP0005] DeprecationWarning: Buffer() is deprecated due to security and usability issues. Please use the Buffer.alloc(), Buffer.allocUnsafe(), or Buffer.from() methods instead.
So, I followed the recommendation to use the Buffer.from() method instead, but that doesn't work either. The Lambda runs until the timeout.
Can someone tell me where I went wrong or suggest another solution?
Below is my Lambda function:
const AWS = require("aws-sdk");
const Busboy = require("busboy");
const uuidv4 = require("uuid/v4");
require("dotenv").config();

AWS.config.update({
    accessKeyId: process.env.ACCESS_KEY_ID,
    secretAccessKey: process.env.SECRET_ACCESS_KEY,
    subregion: process.env.SUB_REGION
});

const s3 = new AWS.S3();

const getContentType = event => {
    // see the second code block below
};

const parser = event => {
    // see the third code block below
};

module.exports.main = (event, context, callback) => {
    context.callbackWaitsForEmptyEventLoop = false;
    const uuid = uuidv4();
    const uploadFile = async (image, uuid) =>
        new Promise(() => {
            // const bitmap = new Buffer(image, "base64"); // <====== deprecated
            const bitmap = Buffer.from(image, "base64"); // <======== problem here
            const params = {
                Bucket: "my_bucket",
                Key: `${uuid}.jpeg`,
                ACL: "public-read",
                Body: bitmap,
                ContentType: "image/jpeg"
            };
            s3.putObject(params, function (err, data) {
                if (err) {
                    return callback(null, "ERROR");
                }
                return callback(null, "SUCCESS");
            });
        });
    parser(event).then(() => {
        uploadFile(event.body.file, uuid);
    });
};
getContentType():
const getContentType = event => {
    const contentType = event.headers["content-type"];
    if (!contentType) {
        return event.headers["Content-Type"];
    }
    return contentType;
};
parser():
const parser = event =>
    new Promise((resolve, reject) => {
        const busboy = new Busboy({
            headers: {
                "content-type": getContentType(event)
            }
        });
        const result = {};
        busboy.on("file", (fieldname, file, filename, encoding, mimetype) => {
            file.on("data", data => {
                result.file = data;
            });
            file.on("end", () => {
                result.filename = filename;
                result.contentType = mimetype;
            });
        });
        busboy.on("field", (fieldname, value) => {
            result[fieldname] = value;
        });
        busboy.on("error", error => reject(error));
        busboy.on("finish", () => {
            event.body = result;
            resolve(event);
        });
        busboy.write(event.body, event.isBase64Encoded ? "base64" : "binary");
        busboy.end();
    });
For reference, the deprecated Buffer constructor maps to the new API as follows:

new Buffer(number)            // Old
Buffer.alloc(number)          // New

new Buffer(string)            // Old
Buffer.from(string)           // New

new Buffer(string, encoding)  // Old
Buffer.from(string, encoding) // New

new Buffer(...arguments)      // Old
Buffer.from(...arguments)     // New
You are using callbackWaitsForEmptyEventLoop, which basically makes the Lambda function think its work is not over yet. Also, you are wrapping the upload in a promise but never resolving it. You can simplify this logic using the built-in promise support in the aws-sdk:
module.exports.main = async event => {
    const uuid = uuidv4();
    await parser(event); // not sure if this needs to be async or not. check
    const bitmap = Buffer.from(event.body.file, "base64");
    const params = {
        Bucket: "my_bucket",
        Key: `${uuid}.jpeg`,
        ACL: "public-read",
        Body: bitmap,
        ContentType: "image/jpeg"
    };
    const response = await s3.putObject(params).promise();
    return response;
};

Add and Retrieve Audio

I have added and retrieved images in MongoDB using Node. Can I use the same code with some adjustments? Please advise.
upload.ts
var multer = require("multer");
export let UPLOAD_PATH = "uploads";

const storage = multer.diskStorage({
    destination: function (req, file, cb) {
        cb(null, UPLOAD_PATH);
    },
    filename: function (req, file, cb) {
        cb(null, file.fieldname + "-" + Date.now() + ".jpg");
    }
});

export const upload = multer({ storage: storage }).single("avatar");
image.controller.ts
Upload
this._model.findOne(
    { ["user"]: new mongoose.Types.ObjectId(user._id) },
    img => {
        upload(req, res, err => {
            if (err) {
                res.status(500).json(null);
            } else {
                // Create a new image model and fill the properties
                let newImage = new Image();
                newImage.filename = req.file.filename;
                newImage.originalName = req.file.originalname;
                newImage.desc = req.body.desc;
                newImage.url =
                    req.protocol + "://" + req.get("host") + "/images/" + newImage._id;
                newImage.user = user._id;
                newImage.save(err => {
                    if (err) {
                        res.status(400).json(null);
                    } else {
                        res.status(201).json(img);
                    }
                });
            }
        });
    }
);
Retrieve
getImage = (req, res) => {
    const user = this.getUser(req, res);
    this._model.findOne({ ['user']: new mongoose.Types.ObjectId(user._id) }, (err, image) => {
        if (err) {
            res.status(500).json(null);
        } else if (image == null) {
            res.status(200).json(image);
        } else {
            // stream the image back by loading the file
            res.setHeader('Content-Type', 'image/jpeg');
            fs.createReadStream(path.join(UPLOAD_PATH, image.filename)).pipe(res);
        }
    })
};
Is it possible to use the same code, with some modifications, to add and retrieve audio files using Node and Express with MongoDB?
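It should carry over with small adjustments; mostly the hard-coded ".jpg" extension and the image/jpeg Content-Type need to change. A minimal sketch mirroring upload.ts above (the "track" field name and the audio MIME checks are illustrative assumptions, not part of the original code):

var multer = require("multer");
var path = require("path");
export let UPLOAD_PATH = "uploads";

const storage = multer.diskStorage({
    destination: function (req, file, cb) {
        cb(null, UPLOAD_PATH);
    },
    filename: function (req, file, cb) {
        // keep the uploaded file's real extension instead of hard-coding ".jpg"
        cb(null, file.fieldname + "-" + Date.now() + path.extname(file.originalname));
    }
});

// accept only audio uploads, on a single field named "track"
export const uploadAudio = multer({
    storage: storage,
    fileFilter: function (req, file, cb) {
        cb(null, file.mimetype.startsWith("audio/"));
    }
}).single("track");

On the retrieve side, the only change is the header set before streaming the file back, e.g. res.setHeader('Content-Type', 'audio/mpeg'); for MP3.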

Lambda File Write to S3

For the past six months I have been downloading the NASA APOD and saving it to an S3 bucket using a Lambda function. Up until 12/23/2016 all was working as expected. Now when I check my bucket, the images are there, but they are 0 bytes in size. I have included my code below. Does anyone know if there has been a change? Thanks!
var AWS = require("aws-sdk");
var https = require('https');
var http = require('http');
var fs = require('fs');

// Incoming Handler
// <><><><><><><><><><><><><><><><><><><><><><><><><><><><><><>
exports.handler = (event, context, callback) => {
    GetAPOD();
};
// <><><><><><><><><><><><><><><><><><><><><><><><><><><><><><>

// Functions
// <><><><><><><><><><><><><><><><><><><><><><><><><><><><><><>
function GetAPOD() {
    var nasa_api_key = 'MY KEY GOES HERE'
        , nasa_api_path = '/planetary/apod?api_key=' + nasa_api_key;
    var options = {
        host: 'api.nasa.gov',
        port: 443,
        path: nasa_api_path,
        method: 'GET'
    };
    // Connect to the NASA API and get the APOD.
    var req = https.request(options, function (res) {
        console.log('Open connection to NASA.');
        res.setEncoding('utf-8');
        var responseString = '';
        res.on('data', function (data) {
            responseString += data; // accumulate chunks; '=' would drop earlier ones
        });
        res.on('end', function () {
            console.log('API Response: ' + responseString);
            var responseObject = JSON.parse(responseString)
                , image_date = responseObject['date']
                , image_url = responseObject['url']
                , image_hdurl = responseObject['hdurl']
                , media_type = responseObject['media_type'];
            if (media_type == 'image') {
                var image_name = image_date + '.jpg';
                var s3 = new AWS.S3();
                var s3Bucket = new AWS.S3({ params: { Bucket: 'nasa-apod' } });
                // Check to see if the image already exists in the S3 bucket.
                // If not, we will upload the image to S3.
                var head_data = { Key: image_name };
                s3Bucket.headObject(head_data, function (err, output_head_data) {
                    if (output_head_data) {
                        console.log("Image exists on S3.");
                    }
                    else {
                        console.log("Image does not exist on S3.");
                        // Image has not been uploaded to S3; open a stream and download the image to the /tmp folder.
                        var file = fs.createWriteStream("/tmp/" + image_name);
                        var request = http.get(image_url, function (response) {
                            console.log("Opening file stream.");
                            // Pipe the data into the file stream and save to disk.
                            response.pipe(file);
                            response.on('end', function () {
                                // File is written to disk; check that it exists.
                                var fileName = "/tmp/" + image_name;
                                fs.exists(fileName, function (exists) {
                                    if (exists) {
                                        console.log("File exists in /tmp folder.");
                                        // Get the stats for the image; needed for the ContentLength.
                                        fs.stat(fileName, function (error, stats) {
                                            if (error) {
                                                console.log("Stat Error: " + error);
                                            }
                                            else {
                                                console.log("Opening file stream.");
                                                var image_stream = fs.createReadStream(fileName);
                                                // Begin the upload process to S3.
                                                var param_data = { Key: image_name, Body: image_stream, ContentType: "image/jpeg", ContentLength: stats.size, ACL: "public-read" };
                                                s3Bucket.putObject(param_data, function (err, output_data) {
                                                    if (err) {
                                                        console.log('Error uploading data to S3: ' + err);
                                                    }
                                                    else {
                                                        console.log('Image successfully uploaded.');
                                                    }
                                                });
                                            }
                                        });
                                    }
                                    else {
                                        console.log('File does not exist in the /tmp folder.');
                                    }
                                });
                            });
                        });
                    }
                });
            }
            else {
                console.log("Media Type: " + media_type);
            }
        });
    });
    req.on('error', function (e) {
        console.error('HTTP error: ' + e.message);
    });
    req.end();
}
// <><><><><><><><><><><><><><><><><><><><><><><><><><><><><><>
It turned out that the NASA APOD API now serves images over https rather than http. I had to adjust my code to use https for the image path.
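That explains the symptom: the download above uses http.get while image_url is now an https URL, which the plain http module cannot fetch, and fs.createWriteStream creates the /tmp file immediately, so the empty file still gets uploaded. A minimal sketch of the adjustment, picking the client by protocol so both cases keep working:

// Pick the client that matches the URL (APOD image URLs are https now).
var client = image_url.indexOf('https:') === 0 ? https : http;
var request = client.get(image_url, function (response) {
    console.log("Opening file stream.");
    response.pipe(file);
    // ... rest of the handler unchanged
});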