How can I speed up S3 signedUrl upload from 1Mbps - amazon-s3

I am currently using S3 signed URLs so that my credentials stay hidden from users on the front end. I have it set up and working, but uploads are extremely slow, around 1.2 MB/s. A speed test shows my wifi at 11.9 Mbps, so I don't believe it is my network. The image I have been testing with is only 8 MB.
Server
const { uploadFile } = require("../services/aws");

app.post("/activity/image-upload", async (req, res) => {
  try {
    const { _projectId, name, type } = req.body;
    const key = `${_projectId}/activities/${name}`;
    const signedUrl = await uploadFile({ key, type });
    res.status(200).send(signedUrl);
  } catch (err) {
    console.log("/activity/upload-image err", err);
    res.status(422).send();
  }
});
AWS Service
const aws = require("aws-sdk");
const keys = require("../config/keys");

aws.config.update({
  accessKeyId: keys.aws.accessKeyId,
  secretAccessKey: keys.aws.secretAccessKey,
  useAccelerateEndpoint: true,
  signatureVersion: "v4",
  region: "my-region",
});

const s3 = new aws.S3();

exports.uploadFile = async ({ type, key }) => {
  try {
    const awsUrl = await s3.getSignedUrl("putObject", {
      Bucket: keys.aws.bucket,
      ContentType: type,
      Key: key,
      ACL: "public-read",
    });
    return awsUrl;
  } catch (err) {
    throw err;
  }
};
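As a side note: in the v2 SDK, getSignedUrl returns the URL synchronously when called without a callback, so the await above only wraps a plain string. A minimal sketch of the promise-based variant, assuming aws-sdk v2 and an illustrative Expires value, reusing the s3 client and keys from above:

// Sketch only: getSignedUrlPromise returns an actual Promise in aws-sdk v2.
// The Expires value (seconds) is an assumed example; the SDK defaults to 900s if omitted.
exports.uploadFile = ({ type, key }) =>
  s3.getSignedUrlPromise("putObject", {
    Bucket: keys.aws.bucket,
    ContentType: type,
    Key: key,
    ACL: "public-read",
    Expires: 300, // assumed value, not from the original code
  });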
Front End
const handleUpload = async ({ file, onSuccess, onProgress }) => {
  try {
    const res = await api.post("/activity/image-upload", {
      type: file.type,
      name: file.name,
      _projectId,
    });
    const upload = await axios.put(res.data, file, {
      headers: {
        "Content-Type": file.type,
      },
      onUploadProgress: handleProgressChange,
    });
  } catch (err) {
    console.log("err", err);
  }
};
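As an aside, the onProgress and onSuccess callbacks destructured above are never called. If this handler is meant as an antd-style customRequest (an assumption; the question doesn't say which upload widget is used), a variant like this sketch would report progress, reusing the same api, axios and _projectId as the snippet above:

// Sketch only: wire the upload-widget callbacks (assumed antd customRequest signature)
// to axios progress events.
const handleUploadWithProgress = async ({ file, onSuccess, onError, onProgress }) => {
  try {
    const res = await api.post("/activity/image-upload", {
      type: file.type,
      name: file.name,
      _projectId,
    });
    const upload = await axios.put(res.data, file, {
      headers: { "Content-Type": file.type },
      onUploadProgress: (e) =>
        onProgress({ percent: Math.round((e.loaded / e.total) * 100) }),
    });
    onSuccess(upload.data, file);
  } catch (err) {
    console.log("err", err);
    onError(err);
  }
};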
Image of Request Speeds
You can see above that the call to image-upload returns in 63 ms, so the hold-up isn't my server fetching the signed URL. The axios PUT to the S3 signed URL takes 6.37 s. Unless I am horrible at math, for the 8 MB file I am uploading that works out to roughly 1.2 MB/s. What am I missing?
Update 7/23
Here is a picture of my speed test through Google showing an upload speed of 10.8 Mbps.
I tried uploading the image through the S3 console to compare speeds, and there it took 10.11 s!!! Are there different plans that throttle speeds? I am even using Transfer Acceleration and it's still this slow.
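One thing worth double-checking here: setting useAccelerateEndpoint: true on the SDK only makes the presigned URL point at the bucket-name.s3-accelerate.amazonaws.com endpoint; the bucket itself must also have Transfer Acceleration enabled for that endpoint to work. A minimal sketch, reusing the s3 client and keys from the AWS Service file above (permissions for these calls are an assumption):

// Sketch only: verify that Transfer Acceleration is actually enabled on the bucket.
const checkAcceleration = async () => {
  const { Status } = await s3
    .getBucketAccelerateConfiguration({ Bucket: keys.aws.bucket })
    .promise();
  console.log("acceleration status:", Status); // "Enabled", "Suspended", or undefined if never set

  if (Status !== "Enabled") {
    await s3
      .putBucketAccelerateConfiguration({
        Bucket: keys.aws.bucket,
        AccelerateConfiguration: { Status: "Enabled" },
      })
      .promise();
  }
};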

Related

S3 to IPFS from Pinata

I am trying to upload a lot of files from S3 to IPFS via Pinata, but I haven't found anything like that in the Pinata documentation.
This is my attempt, using the form-data library. I haven't tested it yet (I will soon; I still need to code a few things).
Is this a correct approach? Has anyone done something similar?
async uploadImagesFolder(
  items: ItemDocument[],
  bucket?: string,
  path?: string,
) {
  try {
    const form = new FormData();
    for (const item of items) {
      const file = getObjectStream(item.tokenURI, bucket, path);
      form.append('file', file, {
        filename: item.tokenURI,
      });
    }
    console.log(`Uploading files to IPFS`);
    const pinataOptions: PinataOptions = {
      cidVersion: 1,
    };
    const result = await pinata.pinFileToIPFS(form, {
      pinataOptions,
    });
    console.log(`Piñata Response:`, JSON.stringify(result, null, 2));
    return result.IpfsHash;
  } catch (e) {
    console.error(e);
  }
}
I had the same problem.
I found this article: https://medium.com/pinata/stream-files-from-aws-s3-to-ipfs-a0e23ffb7ae5
But, if I'm not wrong, the article uses a version different from the JavaScript AWS SDK v3, which is currently the most recent one: https://docs.aws.amazon.com/AWSJavaScriptSDK/v3/latest/index.html
This is for the client side with TypeScript. If you are on that SDK version, the following snippet works for me:
import { S3Client, GetObjectCommand } from '@aws-sdk/client-s3'

export const getStreamObjectInAwsS3 = async (data: YourParamsType) => {
  try {
    const BUCKET = data.bucketTarget
    const KEY = data.key
    const client = new S3Client({
      region: 'your-region',
      credentials: {
        accessKeyId: 'your-access-key',
        secretAccessKey: 'secret-key'
      }
    })
    const resource = await client.send(new GetObjectCommand({
      Bucket: BUCKET,
      Key: KEY
    }))
    const response = resource.Body
    if (response) {
      return new Response(await response.transformToByteArray()).blob()
    }
    return null
  } catch (error) {
    return null
  }
}
With the previous code you get the Blob object; pass it (or wrap it in a File object) to the following method to pin the file and get the resource URL from the Pinata API:
import axios from 'axios'

export const uploadFileToIPFS = async (file: Response) => {
  const url = `https://api.pinata.cloud/pinning/pinFileToIPFS`
  const data = new FormData()
  data.append('file', file)
  try {
    const response = await axios.post(url, data, {
      maxBodyLength: Infinity,
      headers: {
        pinata_api_key: 'your-api',
        pinata_secret_api_key: 'your-secret'
      }
    })
    return {
      success: true,
      pinataURL: `https://gateway.pinata.cloud/ipfs/${response.data.IpfsHash}`
    }
  } catch (error) {
    console.log(error)
    return null
  }
}
I found this solution in this nice article, where you can also explore other implementations (including the Node.js side).
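For comparison, here is a minimal Node.js-side sketch of the same idea (my addition, not from the answer above), assuming aws-sdk v2 and the @pinata/sdk client, which accepts a readable stream directly; bucket, key and API keys are placeholders:

// Sketch only: stream an object from S3 straight into Pinata.
const AWS = require('aws-sdk');
const pinataSDK = require('@pinata/sdk');

const s3 = new AWS.S3();
const pinata = pinataSDK('your-pinata-api-key', 'your-pinata-secret');

const pinS3ObjectToIPFS = async (bucket, key) => {
  const stream = s3.getObject({ Bucket: bucket, Key: key }).createReadStream();
  // The SDK needs a filename hint; attaching a path to the stream is one common workaround.
  stream.path = key;
  const result = await pinata.pinFileToIPFS(stream, {
    pinataOptions: { cidVersion: 1 },
  });
  return result.IpfsHash;
};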

Multer google storage returns 'Internal Server Error'

I have just switched image uploads with Multer from local storage to Google Cloud Storage using 'multer-google-storage'. It worked fine before, but now the request returns a 500 Internal Server Error with no message. I am using Node.js and Express, with React on the front end. The FormData is formatted correctly, since everything works again if I switch back to local storage. Any ideas on how to fix this, or at least display an error message? I can't find much documentation on 'multer-google-storage'. Thanks for the help!
Here is the back-end POST route (I hid the configuration options):
const multer = require('multer');
const multerGoogleStorage = require('multer-google-storage');

const upload = multer({
  storage: multerGoogleStorage.storageEngine({
    autoRetry: true,
    bucket: '******',
    projectId: '******',
    keyFilename: '../server/config/key.json',
    filename: (req, file, callback) => {
      callback(null, file.originalname);
    },
  }),
});
//#route POST api/listings
//#description Create listing
//#access Private
router.post(
  '/',
  upload.any(),
  [
    isLoggedIn,
    [
      check('title', 'Title is required').not().isEmpty(),
      check('coordinates').not().isEmpty(),
      check('address').not().isEmpty(),
      check('price', 'Set a price').not().isEmpty(),
      check('description', 'Type a description').not().isEmpty(),
      check('condition', 'Declare the condition').not().isEmpty(),
      check('category', 'Please select a category').not().isEmpty(),
    ],
  ],
  async (req, res) => {
    const errors = validationResult(req);
    if (!errors.isEmpty()) {
      console.log('validation error');
      return res.status(400).json({ errors: errors.array() });
    }
    try {
      const files = req.files;
      let images = [];
      for (let image of files) {
        images.push(image.originalname);
      }
      const newListing = new Listing({
        title: req.body.title,
        images: images,
        coordinates: JSON.parse(req.body.coordinates),
        price: req.body.price,
        description: req.body.description,
        condition: req.body.condition,
        dimensions: req.body.dimensions,
        quantity: req.body.quantity,
        address: req.body.address,
        author: req.user.id,
        category: JSON.parse(req.body.category),
      });
      const author = await User.findById(req.user.id);
      await author.listings.push(newListing);
      await author.save();
      const listing = await newListing.save();
      res.json(listing);
    } catch (error) {
      console.log('error');
      console.error(error);
      res.json(error);
      res.status(500).send('Server Error');
    }
  }
);
I have solved the issue; it was a permissions problem. My Google Cloud Storage bucket had its access control set to 'Uniform' when it should have been 'Fine-grained'.
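If you want to check this programmatically rather than in the console, here is a minimal sketch (my addition, not from the answer) using the @google-cloud/storage client to see whether uniform bucket-level access is enabled on the bucket:

// Sketch only: report the bucket's access-control mode.
// Assumes @google-cloud/storage and the same key file used above.
const { Storage } = require('@google-cloud/storage');

const storage = new Storage({ keyFilename: '../server/config/key.json' });

const checkAccessControl = async (bucketName) => {
  const [metadata] = await storage.bucket(bucketName).getMetadata();
  const ubla = metadata.iamConfiguration?.uniformBucketLevelAccess;
  // enabled === true means "Uniform"; per-object ACLs ("Fine-grained") are then rejected.
  console.log('uniform bucket-level access enabled:', ubla?.enabled === true);
};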

Uploading a photo from a React Native (Expo) camera to AWS S3 after being resized by a Lambda

Progress until now
I have created an app.
The image captured in the app is an object like this:
{
  "height": 4000,
  "uri": "file:///data/user/0/host.exp.exponent/cache/ExperienceData/%2540anonymous%252Fimage-upload-4ceaf845-d42a-4a1d-b1be-5a3f9cfd10ba/Camera/e3d441b6-2664-4c30-a093-35f7f0f09488.jpg",
  "width": 3000
}
I have made an endpoint in API Gateway that connects to a Lambda, and I'm sending the above data to my backend like this:
axios
  .post(uploadEndPoint, capturedImage, {
    headers: {
      'Content-Type': 'application/json',
    },
  })
  .then((response) => {
    console.log(response);
  })
  .catch((error) => {
    console.log(error.response);
  });
I get the captured data in my Lambda.
First I'm trying to save it to S3 without resizing, like this; if this works, then I will move on to resizing.
import Responses from '../utils/responses';
import * as fileType from 'file-type';
import { v4 as uuid } from 'uuid';
import * as AWS from 'aws-sdk';

const s3 = new AWS.S3();
const allowedMimes = ['image/jpeg', 'image/png', 'image/jpg'];

export const handler = async (event) => {
  console.log('event>>', JSON.parse(event.body));
  try {
    const body = JSON.parse(event.body);
    if (!body) {
      return Responses._400({ message: 'incorrect body on request' });
    }
    let imageData = body;
    console.log('imageData>>', imageData);
    const buffer = Buffer.from(imageData, 'base64');
    console.log('buffer>>', buffer);
    const fileInfo = await fileType.fromBuffer(buffer);
    console.log(fileInfo);
    const detectedExt = fileInfo.ext;
    const detectedMime = fileInfo.mime;
    console.log('detectedExt>>', detectedExt);
    console.log('detectedMime>>', detectedMime);
    if (!allowedMimes.includes(detectedMime)) {
      return Responses._400({ message: 'mime is not allowed ' });
    }
    const name = uuid();
    const key = `${name}.${detectedExt}`;
    console.log(`writing image to bucket called ${key}`);
    await s3
      .putObject({
        Body: buffer,
        Key: key,
        ContentType: detectedMime,
        Bucket: process.env.imageUploadBucket,
        ACL: 'public-read',
      })
      .promise();
    const url = `https://${process.env.imageUploadBucket}.s3-${process.env.region}.amazonaws.com/${key}`;
    return Responses._200({
      imageURL: url,
    });
  } catch (error) {
    console.log('error', error);
    return Responses._400({
      message: error.message || 'failed to upload image',
    });
  }
};
fileInfo comes out undefined.
Things I want
Resize the image and upload it to S3 with that Lambda.
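Purely as an illustrative sketch (assuming the sharp library, and assuming the handler above receives actual base64 image data), a resize step could slot in just before the existing putObject call; the 1024px target width is an arbitrary example:

import sharp from 'sharp'; // assumed dependency, not in the original code

// ...inside the handler, after `buffer` and `detectedMime` have been derived:
const resized = await sharp(buffer)
  .resize({ width: 1024, withoutEnlargement: true }) // example size only
  .toBuffer();

await s3
  .putObject({
    Body: resized,
    Key: key,
    ContentType: detectedMime,
    Bucket: process.env.imageUploadBucket,
    ACL: 'public-read',
  })
  .promise();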

Unexpected end of multipart data nodejs multer s3

I am trying to upload an image to S3; this is my code:
const upload = require('../services/file_upload');

const singleUpload = upload.single('image');

module.exports.uploadImage = (req, res) => {
  singleUpload(req, res, function (err) {
    if (err) {
      console.log(err);
      return res.status(401).send({ errors: [{ title: 'File Upload Error', detail: err }] });
    }
    console.log(res);
    return res.json({ 'imageUrl': req.file.location });
  });
}
FileUpload.js
const aws = require('aws-sdk');
const multer = require('multer');
const multerS3 = require('multer-s3');

const s3 = new aws.S3();

const fileFilter = (req, file, cb) => {
  if (file.mimetype === 'image/jpeg' || file.mimetype === 'image/png') {
    cb(null, true)
  } else {
    cb(new Error('Invalid Mime Type, only JPEG and PNG'), false);
  }
}

const upload = multer({
  fileFilter,
  storage: multerS3({
    s3,
    bucket: 'image-bucket',
    acl: 'public-read',
    contentType: multerS3.AUTO_CONTENT_TYPE,
    metadata: function (req, file, cb) {
      cb(null, { fieldName: 'TESTING_META_DATA!' });
    },
    key: function (req, file, cb) {
      cb(null, "category_" + Date.now().toString() + ".png")
    }
  })
})

module.exports = upload;
I tried to test the API with Postman against serverless running locally, and it gives this error:
Error: Unexpected end of multipart data
    at D:\Flutter\aws\mishpix_web\node_modules\dicer\lib\Dicer.js:62:28
    at process._tickCallback (internal/process/next_tick.js:61:11) storageErrors: []
After deploying it online I tried the API; the file is uploaded to the server, but it is broken.
Are you using aws-serverless-express? aws-serverless-express forwards the original request body as a utf8-encoded Buffer, so multipart data gets lost or corrupted; I am not sure exactly why.
So I changed aws-serverless-express to aws-serverless-express-binary and everything worked:
yarn add aws-serverless-express-binary
Hope this helps!
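For reference (my addition, not part of the answer): the stock aws-serverless-express can also be told which MIME types to treat as binary when the proxy server is created, which is another common way to keep multipart bodies intact; depending on the setup, API Gateway's binary media types may need configuring as well. A minimal sketch, where the './app' module is a hypothetical Express app:

// Sketch only: pass binary MIME types to aws-serverless-express so multipart
// bodies are base64-encoded instead of being mangled as utf8.
const awsServerlessExpress = require('aws-serverless-express');
const app = require('./app'); // hypothetical Express app module

const binaryMimeTypes = ['multipart/form-data', 'image/jpeg', 'image/png'];
const server = awsServerlessExpress.createServer(app, null, binaryMimeTypes);

exports.handler = (event, context) =>
  awsServerlessExpress.proxy(server, event, context);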

Upload File to S3 Dynamic Storage

I'm having problems uploading files to a dynamic storage service.
AWS.config.update({
  accessKeyId: env.s3.accessKey,
  secretAccessKey: env.s3.sharedSecret,
  httpOptions: {
    agent: proxy(env.auth.proxy),
  },
});

this.s3Client = new AWS.S3({ endpoint: env.s3.accessHost, signatureVersion: 'v2' });
This is the configuration. I have to define the proxy settings since I'm behind the Swisscom corp proxy.
public upload(image: IFile): Promise<any> {
  return new Promise((resolve, reject) => {
    const key = image.originalname;
    const paramsCreateFile = { Bucket: 'test', Key: key, Body: image.buffer };
    this.s3Client.putObject(paramsCreateFile, (err, data) => {
      if (err) {
        return reject(err);
      }
      return resolve(data);
    });
  });
}
And this is my upload method.
However, when I try to upload, nothing happens; after approx. 2 minutes I get a timeout, but no error.
I followed the official documentation during the implementation.
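A minimal diagnostic sketch (my addition; the question is unanswered here): the v2 SDK can log every request and use shorter HTTP timeouts, which usually makes it clearer whether the proxy agent is actually being used or the connection is silently hanging. The timeout values below are arbitrary examples:

// Sketch only: request logging plus tighter timeouts to see where the upload stalls.
AWS.config.update({
  logger: console, // logs each SDK request/response
  httpOptions: {
    agent: proxy(env.auth.proxy),
    connectTimeout: 5000, // fail fast if the proxy/endpoint cannot be reached
    timeout: 30000,       // overall socket timeout per request
  },
});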