AWS S3 Signature Does Not Match

I am stuck on s3.getSignedUrl and getting the error 'Signature Does Not Match, Signature calculated does not match...'. I have checked the credentials, and so many other things, but am not making any progress. I have my access key and secret saved in a credentials file.
The URL I'm getting back is https://postcard-photo-repo-dev.s3.amazonaws.com/Screenshot7.png?AWSAccessKeyId=AKIAJWHTSREEZUZGGO3A&Expires=1534619652&Signature=MkyVjARuo3PaO6lAYV6Li%2FAaR9E%3D
upload.js file:
function getSignedRequest(file) {
  const xhr = new XMLHttpRequest();
  xhr.open('GET', `/api/feed/sign-s3?file-name=${file.name}&file-type=${file.type}`);
  xhr.onreadystatechange = () => {
    if (xhr.readyState === 4) {
      if (xhr.status === 200) {
        const response = JSON.parse(xhr.responseText);
        uploadFile(file, response.signedRequest, response.url);
      } else {
        alert('Could not get signed URL');
      }
    }
  };
  xhr.send();
}
function uploadFile(file, signedRequest, url) {
  const xhr = new XMLHttpRequest();
  xhr.open('PUT', signedRequest);
  xhr.onreadystatechange = () => {
    if (xhr.readyState === 4) {
      if (xhr.status === 200) {
        $('#preview').src = url;
        $('#avatar-url').value = url;
      } else {
        alert('Could not upload file');
      }
    }
  };
  xhr.send(file);
}
routes file:
router.get('/sign-s3', (req, res) => {
  const s3 = new aws.S3();
  const fileName = req.query['file-name'];
  const fileType = req.query['file-type'];
  const s3Params = {
    Bucket: S3_BUCKET,
    Key: fileName,
    Expires: 600,
    // ACL: 'public-read',
    // ContentType: fileType
  };
  s3.getSignedUrl('putObject', s3Params, (err, data) => {
    if (err) {
      console.log(err);
      return res.end();
    }
    console.log(data);
    const returnData = {
      signedRequest: data,
      url: `https://${S3_BUCKET}.s3.amazonaws.com/${fileName}`
    };
    res.write(JSON.stringify(returnData));
    res.end();
  });
});
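A likely culprit: SignatureDoesNotMatch on a presigned PUT usually means the request headers differ from what was signed. Here ContentType is commented out of s3Params, yet the browser still sends a Content-Type header with the PUT, which S3 includes in its signature check. A minimal sketch of keeping the two sides in sync, using the same variables as above:

// routes file: sign the Content-Type the client will actually send
const s3Params = {
  Bucket: S3_BUCKET,
  Key: fileName,
  Expires: 600,
  ContentType: fileType
};

// upload.js: send exactly the header that was signed
xhr.open('PUT', signedRequest);
xhr.setRequestHeader('Content-Type', file.type);
xhr.send(file);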

Related

How do I make Express middleware in a class?

I currently use multer middleware as shown below:
const storage = multer.diskStorage({
  destination: function (req, file, cb) {
    cb(null, "public");
  },
  filename: function (req, file, cb) {
    cb(null, req.params.id + "_" + file.originalname);
  },
});

export const multerUploadSingle = (req: Request, res: Response, next: NextFunction) => {
  const upload = multer({ storage: storage }).single("file");
  upload(req, res, (error: unknown) => {
    if (error instanceof multer.MulterError) {
      const message = `file upload fail: ${error.message}`;
      next(new HttpException(message, HttpStatus.BadRequest));
    } else if (error instanceof Error) {
      const message = `file upload fail: ${error.message}`;
      next(new HttpException(message, HttpStatus.InternalServerError));
    } else {
      // upload success
      next();
    }
  });
}
and use it in the router like this:
FileRouter.post("/upload/:id", multerUploadSingle, (req, res) => {...});
However, I wanted to refactor this middleware into a class, so I rewrote the code like this:
export class Multer {
  private readonly storage: multer.StorageEngine;

  constructor() {
    this.storage = multer.diskStorage({
      destination: function (req, file, cb) {
        cb(null, "public");
      },
      filename: function (req, file, cb) {
        cb(null, req.params.id + "_" + file.originalname);
      },
    });
  }

  uploadSingle(req: Request, res: Response, next: NextFunction) {
    const upload = multer({ storage: this.storage }).single("file");
    upload(req, res, (error: unknown) => {
      if (error instanceof multer.MulterError) {
        const message = `file upload fail: ${error.message}`;
        next(new HttpException(message, HttpStatus.BadRequest));
      } else if (error instanceof Error) {
        const message = `file upload fail: ${error.message}`;
        next(new HttpException(message, HttpStatus.InternalServerError));
      } else {
        // upload success
        next();
      }
    });
  }
}

const multer = new Multer();
FileRouter.post("/upload/:id", multer.uploadSingle, (req, res) => {...});
With my limited knowledge, I think both cases should have the same result, but the latter case, which uses the class-based middleware, doesn't work at all. It seems the method "uploadSingle" is never called, so multer never uploads the file.
Did I make a mistake with the class usage, or can Express only use function-defined middleware?
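For what it's worth, the usual culprit here is `this` binding: passing `multer.uploadSingle` detaches the method from its instance, so `this.storage` is not available when Express invokes it (and `const multer = new Multer()` also shadows the imported multer module). A minimal sketch of the common fixes, with the instance renamed to `uploader` to avoid the shadowing:

// Option 1: bind the method to its instance when registering the route
const uploader = new Multer();
FileRouter.post("/upload/:id", uploader.uploadSingle.bind(uploader), (req, res) => {...});

// Option 2: declare the method as an arrow-function property inside the class,
// which captures `this` from the instance automatically:
// uploadSingle = (req, res, next) => { ... }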
Your code should follow the MVC pattern.
You can do stuff like this:
routerFile.js
const upload = require("../../configs/multer");
const postController = require("../../controllers/postController");

const multiUploadEvent = upload.fields([
  { name: "images", maxCount: 2 },
  { name: "video", maxCount: 2 }
]);

router.post("/add-event-post", multiUploadEvent, postController.addEventPost);
module.exports = router;
multer.js
const multer = require('multer');

const multerFilter = (req, file, cb) => {
  console.log("Mime type :", file.mimetype.split('/')[0]);
  if (file.mimetype.split('/')[0] === 'image' || file.mimetype.split('/')[0] === 'video' || file.mimetype.split('/')[0] === 'audio') {
    cb(null, true);
  } else {
    cb(new Error('Please upload img, audio, or video file only.'), false);
  }
};

const storage = multer.memoryStorage();
const upload = multer({
  storage: storage,
  fileFilter: multerFilter,
  limits: {
    fileSize: 50 * 1024 * 1024 // 50 MB
  },
});

module.exports = upload;
postController.js
const addEventPost = async (request, response) => {
  try {
    let { title, ..... } = request.body;
    const images = request.files.images;
    const video = request.files.video;
    console.log(title);
    console.log(images);
    console.log(video);
    // upload to services like aws and save to database
    .
    .
    .
    return response
      .status(200)
      .json({
        message: "Event post added successfully"
      });
  } catch (error) {
    console.log(error);
    response.status(500).json({
      error: "Something went wrong",
    });
  }
}

stream s3 to dynamodb with fast-csv: not all data inserted

When a CSV file is uploaded to my S3 bucket, my Lambda is triggered to insert the data into DynamoDB.
I need a stream because the file is too large to be downloaded as a full object.
const batchWrite = async (clientDynamoDB, itemsToProcess) => {
  const ri = {};
  ri[TABLE_DYNAMO] = itemsToProcess.map((itm) => toPutRequest(itm));
  const params = { RequestItems: ri };
  await clientDynamoDB.batchWriteItem(params).promise();
};

function runStreamPromiseAsync(stream, clientDynamoDB) {
  return new Promise((resolve, reject) => {
    const sizeChunk = 25;
    let itemsToProcess = [];

    stream
      .pipe(fastCsv.parse({ headers: Object.keys(schemaGeData), trim: true }))
      .on("data", (row) => {
        stream.pause();
        itemsToProcess.push(row);
        if (itemsToProcess.length === sizeChunk) {
          batchWrite(clientDynamoDB, itemsToProcess).finally(() => {
            stream.resume();
          });
          itemsToProcess = [];
        }
      })
      .on("error", (err) => {
        console.log(err);
        reject("Error");
      })
      .on("end", () => {
        stream.pause();
        console.log("end");
        batchWrite(clientDynamoDB, itemsToProcess).finally(() => {
          resolve("OK");
        });
      });
  });
}
module.exports.main = async (event, context, callback) => {
  context.callbackWaitsForEmptyEventLoop = false;
  const AWS = require('aws-sdk');
  const https = require('https');
  const s3 = new AWS.S3();

  const object = event.Records[0].s3;
  const bucket = object.bucket.name;
  const file = object.object.key;

  const agent = new https.Agent({
    keepAlive: true
  });
  const client = new AWS.DynamoDB({
    httpOptions: {
      agent
    }
  });

  try {
    // get stream of CSV data
    const stream = s3
      .getObject({
        Bucket: bucket,
        Key: file
      })
      .createReadStream()
      .on('error', (e) => {
        console.log(e);
      });

    await runStreamPromiseAsync(stream, client);
  } catch (e) {
    console.log(e);
  }
};
When my file is 1,000 lines everything is inserted, but when it is 5,000 lines my function inserts only around 3,000 lines, and the number is random... sometimes more, sometimes less.
So I'd like to understand what I am missing here.
I also read this article, but to be honest, even if you pause the second stream, the first one is still running. If someone has any ideas on how to do this, it would be greatly appreciated!
Thanks
I found out why it was not fully processed: the callback of batchWriteItem can return UnprocessedItems. So I changed the batchWrite function, and also runStreamPromiseAsync a little, because not all of the items from itemsToProcess might be processed.
Anyway, here is the full code:
const batchWrite = async (client, itemsToProcess) => {
  const ri = {};
  ri[TABLE_DYNAMO] = itemsToProcess.map((itm) => toPutRequest(itm));
  let params = { RequestItems: ri };

  // batchWriteItem is not all-or-nothing: it can return UnprocessedItems,
  // which must be resubmitted until none remain (production code would
  // add exponential backoff between retries)
  while (params.RequestItems && Object.keys(params.RequestItems).length > 0) {
    const data = await client.batchWriteItem(params).promise();
    params = { RequestItems: data.UnprocessedItems };
  }
};
function runStreamPromiseAsync(stream, clientDynamoDB) {
  return new Promise((resolve, reject) => {
    const sizeChunk = 25;
    let itemsToProcess = [];
    let arrayPromise = [];

    stream
      .pipe(fastCsv.parse({ headers: Object.keys(schemaGeData), trim: true }))
      .on("error", (err) => {
        console.log(err);
        reject("Error");
      })
      .on("data", (data) => {
        itemsToProcess.push(data);
        if (itemsToProcess.length === sizeChunk) {
          arrayPromise.push(batchWrite(clientDynamoDB, itemsToProcess));
          itemsToProcess = [];
        }
      })
      .on("end", () => {
        if (itemsToProcess.length !== 0) {
          arrayPromise.push(batchWrite(clientDynamoDB, itemsToProcess));
        }
        resolve(Promise.all(arrayPromise).catch((e) => {
          reject(e);
        }));
      });
  });
}
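As an aside, Node readable streams are async-iterable (Node 10+), which sidesteps the pause/resume and promise bookkeeping entirely; a minimal sketch under the same assumptions (fastCsv, schemaGeData, and the promise-returning batchWrite above):

async function runStreamAsync(stream, clientDynamoDB) {
  const sizeChunk = 25;
  let itemsToProcess = [];

  // for await pulls one row at a time: the parser is not read again
  // until the awaited batch write finishes, so backpressure is automatic
  for await (const row of stream.pipe(fastCsv.parse({ headers: Object.keys(schemaGeData), trim: true }))) {
    itemsToProcess.push(row);
    if (itemsToProcess.length === sizeChunk) {
      await batchWrite(clientDynamoDB, itemsToProcess);
      itemsToProcess = [];
    }
  }
  // flush the final partial batch
  if (itemsToProcess.length > 0) {
    await batchWrite(clientDynamoDB, itemsToProcess);
  }
}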

Add and Retrieve Audio

I have added and retrieved images in MongoDB using Node. Can I use the same code with some adjustments? Please advise.
upload.ts
var multer = require("multer");

export let UPLOAD_PATH = "uploads";

const storage = multer.diskStorage({
  destination: function (req, file, cb) {
    cb(null, UPLOAD_PATH);
  },
  filename: function (req, file, cb) {
    cb(null, file.fieldname + "-" + Date.now() + ".jpg");
  }
});

export const upload = multer({ storage: storage }).single("avatar");
image.controller.ts
Upload
this._model.findOne(
  { ["user"]: new mongoose.Types.ObjectId(user._id) },
  img => {
    upload(req, res, err => {
      if (err) {
        res.status(500).json(null);
      } else {
        // Create a new image model and fill the properties
        let newImage = new Image();
        newImage.filename = req.file.filename;
        newImage.originalName = req.file.originalname;
        newImage.desc = req.body.desc;
        newImage.url =
          req.protocol + "://" + req.get("host") + "/images/" + newImage._id;
        newImage.user = user._id;
        newImage.save(err => {
          if (err) {
            res.status(400).json(null);
          } else {
            res.status(201).json(img);
          }
        });
      }
    });
  }
);
Retrieve
getImage = (req, res) => {
  const user = this.getUser(req, res);
  this._model.findOne({ ['user']: new mongoose.Types.ObjectId(user._id) }, (err, image) => {
    if (err) {
      res.status(500).json(null);
    } else if (image == null) {
      res.status(200).json(image);
    } else {
      // stream the image back by loading the file
      res.setHeader('Content-Type', 'image/jpeg');
      fs.createReadStream(path.join(UPLOAD_PATH, image.filename)).pipe(res);
    }
  });
};
Is it possible to use the same code, with some modification, to add and retrieve audio files using Node and Express with Mongo?
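The pattern should carry over; the image-specific parts are the hard-coded ".jpg" extension in upload.ts and the "image/jpeg" Content-Type in the retrieve handler. A minimal sketch of the adjustments, assuming the same multer setup (with path imported, as in the retrieve code) and a stored document named audio:

// upload.ts: keep the file's own extension instead of forcing ".jpg"
filename: function (req, file, cb) {
  const ext = path.extname(file.originalname); // e.g. ".mp3"
  cb(null, file.fieldname + "-" + Date.now() + ext);
}

// retrieve: send an audio Content-Type before piping the file back
res.setHeader('Content-Type', 'audio/mpeg'); // or persist file.mimetype at upload time and use that
fs.createReadStream(path.join(UPLOAD_PATH, audio.filename)).pipe(res);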

Image upload using react-admin

I am new to react-admin and am using it to upload a file. I have followed the steps mentioned in the tutorial.
But after I submit the request, I see the following HTTP trace: a blob link instead of a Base64 image payload.
{
  "pictures": {
    "rawFile": {
      "preview": "blob:http://127.0.0.1:3000/fedcd180-cdc4-44df-b8c9-5c7196788dc6"
    },
    "src": "blob:http://127.0.0.1:3000/fedcd180-cdc4-44df-b8c9-5c7196788dc6",
    "title": "Android_robot.png"
  }
}
Can someone please advise how to get the base64 image payload instead of the link?
Check whether you have this handler; most likely you did not change the resource name 'posts' to yours:
const addUploadCapabilities = requestHandler => (type, resource, params) => {
  if (type === 'UPDATE' && resource === 'posts') {
Create your custom dataProvider to convert the picture to base64:
import restServerProvider from 'ra-data-json-server';

const servicesHost = 'http://localhost:8080/api';
const dataProvider = restServerProvider(servicesHost);

const myDataProvider = {
  ...dataProvider,
  create: (resource, params) => {
    if (resource !== 'your-route' || !params.data.pictures) {
      // fallback to the default implementation
      return dataProvider.create(resource, params);
    }
    const myFile = params.data.pictures;
    if (!(myFile.rawFile instanceof File)) {
      return Promise.reject('Error: Not a file...'); // Didn't test this...
    }
    return Promise.resolve(convertFileToBase64(myFile))
      .then((picture64) => ({
        src: picture64,
        title: `${myFile.title}`
      }))
      .then(transformedMyFile => dataProvider.create(resource, {
        ...params,
        data: {
          ...params.data,
          myFile: transformedMyFile
        }
      }));
  }
};

const convertFileToBase64 = file => new Promise((resolve, reject) => {
  const reader = new FileReader();
  reader.readAsDataURL(file.rawFile);
  reader.onload = () => resolve(reader.result);
  reader.onerror = reject;
});

export default myDataProvider;
And get the image data in your server API:
exports.create = (req, res) => {
  if (req.body.myFile) {
    var file = req.body.myFile;
    var fs = require('fs');
    var data = file.src.replace(/^data:image\/\w+;base64,/, "");
    var buf = Buffer.from(data, 'base64');
    fs.writeFile(`upload/${file.title}`, buf, err => {
      if (err) throw err;
      console.log('Saved!');
    });
  }
};
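To wire the custom provider in, pass it to the Admin component; a minimal sketch, assuming the myDataProvider export above and a resource name matching 'your-route':

import { Admin, Resource } from 'react-admin';
import myDataProvider from './myDataProvider';

const App = () => (
  <Admin dataProvider={myDataProvider}>
    {/* e.g. <Resource name="your-route" create={YourCreate} /> */}
  </Admin>
);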

upload an image to amazon s3 in react-native

I am trying to upload an image to Amazon S3. If possible, can anyone provide links/docs for how to upload to Amazon S3? Any help is much appreciated.
S3 options:
// this.state.s3options in YourComponent
{
  "url": "https://yourapp.s3.eu-central-1.amazonaws.com",
  "fields": {
    "key": "cache/22d65141b48c5c44eaf93a0f6b0abc30.jpeg",
    "policy": "eyJleHBpcm...1VDE0Mzc1OVoifV19",
    "x-amz-credential": "AK...25/eu-central-1/s3/aws4_request",
    "x-amz-algorithm": "AWS4-HMAC-SHA256",
    "x-amz-date": "20161125T143759Z",
    "x-amz-signature": "87863c360...b9b304bfe650"
  }
}
Component:
class YourComponent extends Component {
  // ...

  // fileSource looks like: {uri: "content://media/external/images/media/13", isStatic: true}
  async uploadFileToS3(fileSource) {
    try {
      var formData = new FormData();
      // Prepare the formData by the S3 options
      Object.keys(this.state.s3options.fields).forEach((key) => {
        formData.append(key, this.state.s3options.fields[key]);
      });
      formData.append('file', {
        uri: fileSource.uri,
        type: 'image/jpeg',
      });
      formData.append('Content-Type', 'image/jpeg');

      var request = new XMLHttpRequest();
      request.onload = function (e) {
        if (e.target.status === 204) {
          // Result in e.target.responseHeaders.Location
          this.setState({ avatarSourceRemote: { uri: e.target.responseHeaders.Location } });
        }
      }.bind(this);
      request.open('POST', this.state.s3options.url, true);
      request.setRequestHeader('Content-type', 'multipart/form-data');
      request.send(formData);
    } catch (error) {
      console.error(error);
    }
  }

  // Example: display the uploaded image
  render() {
    if (this.state.avatarSourceRemote) {
      return (
        <Image source={this.state.avatarSourceRemote} style={{width: 100, height: 100}} />
      );
    } else {
      return (
        <Text>No Image</Text>
      );
    }
  }
}
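The s3options object in this answer is a presigned POST policy; for completeness, one way to generate it server-side is s3.createPresignedPost from aws-sdk v2 (a sketch, with placeholder bucket and key names):

const AWS = require('aws-sdk');
const s3 = new AWS.S3();

// data comes back as { url, fields }, the exact shape the component above expects
s3.createPresignedPost({
  Bucket: 'yourapp',
  Fields: { key: 'cache/your-file.jpeg' },
  Expires: 600,
  Conditions: [['content-length-range', 0, 10 * 1024 * 1024]], // cap uploads at 10 MB
}, (err, data) => {
  if (err) console.error(err);
  else console.log(data.url, data.fields);
});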
This works for me
import fs from 'react-native-fs';
import {decode} from 'base64-arraybuffer';
import AWS from 'aws-sdk';

export const uploadFileToS3 = async (file) => {
  const BUCKET_NAME = 'XXXXXXXXXX';
  const IAM_USER_KEY = 'XXXXXXXXXX';
  const IAM_USER_SECRET = 'XXXXXXXXXXXXXXX';

  const s3bucket = new AWS.S3({
    accessKeyId: IAM_USER_KEY,
    secretAccessKey: IAM_USER_SECRET,
    Bucket: BUCKET_NAME,
    signatureVersion: 'v4',
  });

  const contentType = file.type;
  const contentDisposition = `inline;filename="${file.name}"`;
  const fPath = file.uri;
  const base64 = await fs.readFile(fPath, 'base64');
  const arrayBuffer = decode(base64);

  return new Promise((resolve, reject) => {
    s3bucket.createBucket(() => {
      const params = {
        Bucket: BUCKET_NAME,
        Key: file.name,
        Body: arrayBuffer,
        ContentDisposition: contentDisposition,
        ContentType: contentType,
      };
      s3bucket.upload(params, (error, data) => {
        utils.stopLoader(); // app-specific helper (spinner), as is getApiError below
        if (error) {
          reject(getApiError(error));
        } else {
          console.log(JSON.stringify(data));
          resolve(data);
        }
      });
    });
  });
};
This worked for me after a significant amount of trial and error...
I am also using a Lambda function to serve me the link to upload with.
The Lambda function just uses getSignedUrl.
// Lambda Function
const AWS = require('aws-sdk');

AWS.config.update({
  accessKeyId: {bucket_access},
  secretAccessKey: {bucket_secret},
  signatureVersion: 'v4',
  region: {bucket_region}
});

const s3 = new AWS.S3();

exports.handler = async (event) => {
  const URL = s3.getSignedUrl('putObject', {
    Bucket: {bucket_name},
    // name of file being placed in the S3 bucket
    // event === metaData object
    Key: `${event.{key}}/photo00`
  });
  return URL;
};
// React Native
const imagePreview = '{image_uri}';

const handleURL = async () => {
  // metaData object
  const obj = {
    key: "meta_data"
  };
  const response = await fetch({lambda_func_endpoint}, {
    method: 'POST',
    body: JSON.stringify(obj)
  });
  const json = await response.json();
  return json;
};

const handleUpload = async () => {
  const URL = await handleURL();
  const imageExt = imagePreview.split('.').pop();
  // Fetch the local image URI first: this resolves it to a response
  // whose body can be read as a Blob (fetch cannot PUT a bare file URI).
  let image = await fetch(imagePreview);
  // A Blob is a body type fetch can send directly to the signed URL.
  image = await image.blob();
  await fetch(URL, {
    method: 'PUT',
    body: image,
    headers: {
      Accept: `image/${imageExt}`,
      'Content-Type': `image/${imageExt}`
    }
  })
    .then((res) => console.log(JSON.parse(JSON.stringify(res)).status))
    .catch((err) => console.error(err));
};
Let me know what you guys think!