In which stage should I generate signed URLs to load S3 bucket objects in an app?

I am using S3 to store images in my app.
This is the function that generates the signed url that the user can use to upload an image:
const key = `images/${Date.now()}.jpeg`;

s3_config
  .getImageSignedUrl(key)
  .then((url) => {
    res.status(200).send({ key, url });
  })
  .catch((error) => {
    res.status(500).send({
      message: "There was an error generating pre-signed url.",
    });
  });
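For context, getImageSignedUrl itself isn't shown in the question; a minimal sketch of what it might look like, assuming the same AWS SDK v2 style as the read function shown further below (s3 and AWS_BUCKET_NAME are the same assumed globals):

// Hypothetical counterpart to getImageReadSignedUrl: a pre-signed PUT URL
// the client can use to upload the image directly to S3.
var getImageSignedUrl = function (key) {
  return new Promise((resolve, reject) => {
    s3.getSignedUrl(
      "putObject",
      {
        Bucket: AWS_BUCKET_NAME,
        Key: key,
        ContentType: "image/jpeg", // must match the client's upload header
        Expires: 300,              // URL is valid for 5 minutes
      },
      (err, url) => (err ? reject(err) : resolve(url))
    );
  });
};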
So after the image is uploaded, the image URL will look like this:
https://BUCKET_NAME.s3.amazonaws.com/images/1667119739573.jpeg
Now, in order to make the images accessible only within the website, I will also use a signed URL.
This way, when someone uses the direct link:
https://BUCKET_NAME.s3.amazonaws.com/images/1667119739573.jpeg
they will get an AccessDenied error,
and only users within the app will be able to access the images, using the signed URLs.
This is how I generated the signed url for loading an image:
var getImageReadSignedUrl = function (key) {
  return new Promise((resolve, reject) => {
    s3.getSignedUrl(
      "getObject",
      {
        Bucket: AWS_BUCKET_NAME,
        Key: key,
        Expires: 300,
      },
      (err, url) => {
        if (err) {
          reject(err);
        } else {
          resolve(url);
        }
      }
    );
  });
};
And if I feed it an image key:
getImageReadSignedUrl("images/1667119739573.jpeg");
It will generate a signed url that will allow the user to access the private image:
https://BUCKET_NAME.s3.eu-west-3.amazonaws.com/images/1667119739573.jpeg?X-Amz-Algorithm=xxxxxxxxxxxxxxxxx&X-Amz-Credential=xxxxxxxxxxxxxxxxxxxx9%2Feu-xxxx-3%2Fs3%2Faws4_request&X-Amz-Date=202211xxxxxxxx35Z&X-Amz-Expires=300&X-Amz-Signature=5ab0exxxxxxxxxxxxxxxxxx8dc401dc7fxxxxxxxxa5124&X-Amz-SignedHeaders=host
Now, so far so good. Everything works perfectly as intended.
My problem is when, how, or where exactly I should use the function getImageReadSignedUrl.
Since in the database, I am saving the direct link to the image:
https://BUCKET_NAME.s3.amazonaws.com/images/1667119739573.jpeg
When the user is using the app, they will receive that URL,
and it will be used inside the img HTML tag to render the image.
Now, the question is: should I use getImageReadSignedUrl every time there's an image URL in the data that's sent back to the user, and send the signed URL instead?
Even though this makes sense, it means I will have to go through the entire backend and call that function every time there's an image to be sent back to the user.
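For concreteness, that per-response signing could be wrapped in a small helper; this is only a sketch, and Post is a hypothetical Mongoose model standing in for whatever returns documents with an imageUrl field:

// Hypothetical helper: swap a stored direct link for a fresh signed URL.
async function withSignedImageUrl(doc) {
  // "https://BUCKET_NAME.s3.amazonaws.com/images/x.jpeg" -> "images/x.jpeg"
  const key = new URL(doc.imageUrl).pathname.slice(1);
  return { ...doc, imageUrl: await getImageReadSignedUrl(key) };
}

// Example use in a route handler (Post is assumed, not from the question):
// const posts = await Post.find().lean();
// res.send(await Promise.all(posts.map(withSignedImageUrl)));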
Is there another approach that makes more sense and is not as tedious as this?
FYI, I am using the MERN stack and an EC2 instance.
Thank you.

Related

In an Expo React Native app, how to pick a photo, store it locally on the phone, and later upload it to a Node.js service?

I'm building a React Native app with Expo, and I want to include the following workflow: the user takes a picture (either with the camera or by picking one from the phone's gallery), which is stored locally on the phone until the user uploads it to a backend service at some later time.
I'm pretty stuck and would appreciate any pointers.
Here is what I have:
I use expo-image-picker to pick a photo from the phone's gallery:
const photo = await launchImageLibraryAsync({
  mediaTypes: ImagePicker.MediaTypeOptions.All,
  allowsEditing: true,
  base64: true,
  quality: 1,
});
Then I store the photo locally as a Base64 string using expo-file-system:
const location = `${FileSystem.documentDirectory}${filename}`;
await FileSystem.writeAsStringAsync(location, photo.base64, {
  encoding: FileSystem.EncodingType.Base64
});
I keep information about the storage location, file name, and mime type in an image object. Later, I try to upload that image to my own Node.js backend service with axios, sending the following multi-part form data:
const formdata = new FormData();
formdata.append('file', {
  path: image.location,
  name: image.filename,
  type: image.mimetype
} as any);
The backend service that receives the photo uses multer:
const multer = require('multer');
const upload = multer({ storage: multer.memoryStorage() });

router.post('/photo', upload.single('file'), async (request, response) => {
  console.log(request.file);
  ...
});
What arrives at my service is the following:
{
  fieldname: 'file',
  originalname: '1653135701413.jpg',
  encoding: '7bit',
  mimetype: 'image/jpg',
  buffer: <Buffer >,
  size: 0
}
So no data is transferred. (It seems to be properly stored on the phone, because if I use the Expo filesystem's readAsStringAsync, I do get a pretty long Base64 string.)
What am I missing? Do I need to send the image as a blob? (If I try to do so, then request.file is undefined, so I guess I'm doing something wrong there as well.)
And in general, is there a better way to achieve this workflow in a managed React Native app? (For example, is it ok to store the image as a Base64 string, or would it be better to do this differently?)
Edit:
In the form data, I changed path to uri, and I switched from axios to fetch. Now the backend finally receives the image data. šŸ„³
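For anyone landing here, a sketch of the working version described in the edit (BACKEND_URL is a placeholder, not from the original code):

// Fixed form data: the file part uses `uri` instead of `path`.
const formdata = new FormData();
formdata.append('file', {
  uri: image.location,
  name: image.filename,
  type: image.mimetype
} as any);

// Sent with fetch instead of axios; no Content-Type header is set,
// so the multipart boundary is filled in automatically.
await fetch(`${BACKEND_URL}/photo`, {
  method: 'POST',
  body: formdata,
});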

Multiple signatures using react-native-signature-capture saved in same file

I am working on a React Native app that collects multiple signatures using react-native-signature-capture.
When I want to upload the resulting file to Amazon S3 using result.pathName,
I always get the same file as a result. Every new signature is written to the same file internally, i.e.: /Users/adnan/Library/Developer/CoreSimulator/Devicā€¦53-4B89-84A2-B2D72D004241/Documents/signature.png
When I use base64 (result.encoded), it always returns a different signature, as expected. But I want the file to be uploaded to S3, so I use result.pathName; everything works, except that I am getting the same signature for all users.
onSaveEvent = (result) => {
  this.props.dispatch(setFieldParticipantAction('current_participant_sign_uri', result.pathName));
  this.setState({ isVisible: false });
};

onDragEvent = () => {
  this.setState({ sigChanged: true });
};

saveSign() {
  this.refs["sign"].saveImage();
}
I have used the code from this pull request, and it made my component work with multiple signatures getting unique file names:
https://github.com/RepairShopr/react-native-signature-capture/pull/179
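For reference, the base64 route mentioned above (result.encoded) can also sidestep the shared-file problem, since each upload can get its own S3 key. A sketch, assuming the string is sent to a Node backend that uses aws-sdk v2:

// Sketch: upload the base64-encoded signature under a unique key,
// so every participant's signature lands in its own object.
async function uploadSignature(s3, bucket, base64Data, participantId) {
  const key = `signatures/${participantId}-${Date.now()}.png`; // unique per upload
  await s3
    .putObject({
      Bucket: bucket,
      Key: key,
      Body: Buffer.from(base64Data, 'base64'),
      ContentType: 'image/png',
    })
    .promise();
  return key;
}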

Podio POST request returns unauthorized

I'm working on a Podio integration as a Slack bot.
I'm starting by using it internally at my company for testing; later I could share it with everybody.
I've used the podio-js platform with Node.js, and started locally with a "web app" based on this example: https://github.com/podio/podio-js/tree/master/examples/password_auth
I need to do a POST request, so I kept all the code of the example in order to log in with user and password. The original code worked; then I changed the code to make a POST request. In particular, I changed the lines of index.js to this:
router.get('/user', function(req, res) {
  podio.isAuthenticated().then(function () {
    var requestData = { "title": "sample_value" };
    return podio.request('POST', '/item/app/15490175', requestData);
  })
  .then(function(responseData) {
    res.render('user', { data: responseData });
  })
  .catch(function () {
    res.send(401);
  });
});
But in the end it gives an "Unauthorized" response.
It seems like password auth doesn't allow making POST requests to add new items! Is that possible?
I've already read all the documentation, but I'm not able to explain why, or how I can solve this.
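One detail that may help with debugging: the catch handler in the snippet above discards the error object, so the 401 sent to the client hides the real cause. A sketch of the same route with the error logged (only the catch changes):

router.get('/user', function (req, res) {
  podio.isAuthenticated().then(function () {
    var requestData = { "title": "sample_value" };
    return podio.request('POST', '/item/app/15490175', requestData);
  })
  .then(function (responseData) {
    res.render('user', { data: responseData });
  })
  .catch(function (err) {
    console.error(err); // shows whether isAuthenticated() or the POST failed
    res.status(401).send();
  });
});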
Regards

Download large object from AWS S3

I have an Angular web application that allows users to download files locally (installers). Some files exceed 1.5 GB in size, which causes the browser (Chrome) to crash when using "normal" s3.getObject(opts, function(err, data){}) calls, presumably because the entire file's binary data is buffered in memory.
I have tried other techniques, like streaming (StreamSaver.js), but with no luck.
I am trying to chunk the file data, but in the following code, the 'httpData' event does not get called until the entire file's binary data is loaded, which seems to defeat the purpose of chunking. Either I am not understanding this event, or I have something misconfigured.
cache.S3.getObject({ Bucket: 'anduin-installers', Key: filePath })
  .on('httpDownloadProgress', function (progress) {
    $timeout(function () {
      pkg.Download.Progress = Math.floor((progress.loaded / progress.total) * 100.0);
    });
  })
  .on('httpData', function (chunk, response) {
    console.log('???');
  })
  .on('complete', function (response) {
    $timeout(function () {
      pkg.Download.Active = false;
      pkg.Download.Progress = 0;
    });
  })
  .send();
Any ideas on how to make the 'httpData' event fire as data chunks are received, instead of waiting for the whole file? Or should I go with another solution?
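For what it's worth, one variant of the chunking idea is to issue several GetObject calls with the Range parameter (which s3.getObject does support), so each response holds only one slice of the file; whether this helps depends on being able to write each slice out to a stream, e.g. via StreamSaver.js. A sketch:

// Fetch one byte range of the object; call repeatedly to walk the file.
function downloadRange(s3, bucket, key, start, end) {
  return s3
    .getObject({
      Bucket: bucket,
      Key: key,
      Range: 'bytes=' + start + '-' + end, // inclusive byte range
    })
    .promise()
    .then(function (data) {
      return data.Body; // a single chunk, not the whole file
    });
}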
Thanks!

Using Fetch instead of XMLHttpRequest in React Native

I was trying to upload images to S3 from my react native app by following and adapting this guide by heroku: https://devcenter.heroku.com/articles/s3-upload-node
Essentially I am using the aws-sdk on my express.js backend to generate pre-signed request for uploading images to S3 from react native.
Everything works well, so I then tried to convert the XMLHttpRequests into fetch requests, which seem to be favoured by React Native. After the conversion, the files are still being uploaded to S3, but when I click on the image links, the images don't show properly; instead, an empty square is shown:
Empty square shown instead of image
More specifically it seems to be this piece of code conversion that causes it to happen:
From:
_uploadFile(file, signedRequest, url){
  const xhr = new XMLHttpRequest();
  xhr.open('PUT', signedRequest);
  xhr.onreadystatechange = () => {
    if (xhr.readyState === 4) {
      if (xhr.status === 200) {
        console.log("UPLOAD DONE");
      } else {
        alert('ERROR UPLOADING');
      }
    }
  };
  xhr.send(file);
}
To:
_uploadFile(file, signedRequest, url) {
  let option = {
    method: "PUT",
    headers: {
      "Content-Type": "image/jpeg",
    },
    body: JSON.stringify(file)
  }
  fetch(signedRequest, option)
    .then(res => console.log("UPLOAD DONE"))
    .catch(err => console.log("ERROR UPLOADING: ", err))
}
The file object being uploaded:
{
  name: "profileImage",
  type: "image/jpeg",
  uri: 'data:image/jpeg;base64,' + response.data, // just a base64 image string
  isStatic: true
}
Could anyone shed some light on why this could be happening, or have had similar experiences? Many thanks!
In your fetch example you put a JSON string in your body. It will be sent to S3, but it will not be interpreted as an image upload. You should be able to construct a FormData object yourself and pass it to fetch as the request body, but I think using XHR is the simpler option. According to this comment, it's what Facebook does as well (the comment is over a year old).
If at all possible, you should also try to use local URIs instead of passing Base64-encoded data. It takes quite a while to transfer a few MB of image data between JS and native.
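A sketch of the FormData variant described above, assuming React Native's FormData (which accepts a { uri, name, type } file descriptor); note that whether S3 accepts a multipart body depends on how the pre-signed request was generated:

_uploadFile(file, signedRequest) {
  const formData = new FormData();
  formData.append('file', {
    uri: file.uri,   // a local file URI, not base64 data
    name: file.name,
    type: file.type,
  });
  return fetch(signedRequest, { method: 'PUT', body: formData })
    .then(() => console.log('UPLOAD DONE'))
    .catch((err) => console.log('ERROR UPLOADING: ', err));
}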