Presigned URL image upload with Google Cloud Storage does not upload image correctly (MERN + React Native)

I am able to generate signed URLs from my server and send them back to my React Native (Expo) client.
The client is also able to send a PUT request to the signed URL with an image object that includes a URI pointing to an image stored on the device.
The problem is that the image saved in Google Cloud Storage appears corrupt. I imagine there must be additional settings I need to add for the image to upload properly.
Server Code To Generate URL
const { Storage } = require("@google-cloud/storage");
const { v4 } = require("uuid");

const storage = new Storage({ keyFilename: "keys.json" });
const bucketName = "bucket";
const bucket = storage.bucket(bucketName);

async upload(req, res, next) {
  const options = {
    action: "write",
    expires: Date.now() + 15 * 60 * 1000, // 15 minutes
    contentType: "image/jpeg",
  };
  const fileName = `1234/${v4()}.jpg`;
  const [url] = await bucket.file(fileName).getSignedUrl(options);
  res.send(url);
}
Example of a presigned URL from the server
const url = "https://storage.googleapis.com/bucket/1234/00bae114-87e4-4647-94d3-31115453e9bd.jpg?GoogleAccessId=id%40num.iam.gserviceaccount.com&Expires=1587297408&Signature=signaturecode"
Example of an image object
const image = {
  name: "IMG_4831.JPG",
  type: "image/jpeg",
  uri: "assets-library://asset/asset.JPG?id=AC24211D-E728-44D2-8B00-29EF04EC74E0&ext=JPG"
};
React Native code to send image through presigned URL
import axios from "axios";

const image = { uri, type: "image/jpeg", name };
await axios.put(url, image, {
  headers: { "Content-Type": "image/jpeg" }
});
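This is likely the source of the corruption: axios serializes a plain JavaScript object body as JSON, so what lands in the bucket is the JSON text of the { uri, type, name } object, not the JPEG bytes the URI points to. A minimal plain-Node sketch of what actually gets transmitted (the image object is copied from above):

```javascript
// What axios.put(url, image, ...) actually sends for a plain object body:
// the JSON serialization of the object, not the file the uri points to.
const image = {
  name: "IMG_4831.JPG",
  type: "image/jpeg",
  uri: "assets-library://asset/asset.JPG?id=AC24211D-E728-44D2-8B00-29EF04EC74E0&ext=JPG",
};
const sentBody = JSON.stringify(image); // this text becomes the "image" object in the bucket
console.log(sentBody.startsWith("{"));  // true: JSON text, not the JPEG magic bytes ff d8
```

To upload correctly, the request body has to be the raw file bytes (or a Blob), not a descriptor object.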

I tried using Expo's FileSystem.uploadAsync(url, fileUri, options) instead of axios, and it works.
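For reference, a sketch of what that working call can look like (this assumes expo-file-system, and reuses the url and image from the snippets above; the Content-Type header has to match the contentType the URL was signed with, or GCS rejects the request):

```javascript
import * as FileSystem from "expo-file-system";

// Sketch: PUT the raw file bytes to the signed URL instead of a
// JSON-serialized descriptor object.
const result = await FileSystem.uploadAsync(url, image.uri, {
  httpMethod: "PUT",
  headers: { "Content-Type": "image/jpeg" }, // must match the signed contentType
  uploadType: FileSystem.FileSystemUploadType.BINARY_CONTENT,
});
console.log(result.status); // 200 on success
```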

Related

How can I do a POST request sending a photo in the body as binary in React Native?

I'm using React Native and I need to send an image, which I have in base64 format, as the binary body of a POST request.
const session = await Auth.currentSession();
const config = {
  headers: {
    "Content-Type": "image/jpeg",
    "x-amz-acl": "public-read",
    Authorization: session.getIdToken().getJwtToken(),
  },
};
const API = `event/${eventData.id}/photos`;
const HOST = "https://host.com";
const url = `${HOST}/${API}`;
const result = await axios.post(url, photo.uri, config);
console.log("Result: ", result);
But I'm running into this error: [AxiosError: Request failed with status code 400]
My postman:
I'm trying to get the right response data from AWS S3.
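Note that the request body above is photo.uri, a string, so the server receives the URI text rather than image bytes, which would explain a 400. If the goal is a binary body built from base64, the base64 has to be decoded first. A plain-Node demonstration (the base64 string below is just the first bytes of a JPEG header, for illustration):

```javascript
// Decode base64 into raw bytes; a valid JPEG must start with the
// magic bytes ff d8.
const base64 = "/9j/4AAQSkZJRg=="; // illustrative: start of a JPEG, base64-encoded
const bytes = Buffer.from(base64, "base64");
console.log(bytes[0].toString(16), bytes[1].toString(16)); // "ff" "d8"
```

In the app, the base64 would come from the image picker (e.g. its base64 result field), and the decoded bytes, not the uri string, would be sent as the request body.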

AWS S3 Video Upload React Native

I am using AWS S3 to upload videos from an Expo app. Most of the examples I see use mp4 files, but when I get the URI from Expo ImagePicker I receive a mov file. I'm curious how to properly upload video files from a React Native app to AWS S3.
This is my code for uploading a video to S3:
const uploadVideoToS3 = async (file: any) => {
  const date = new Date().toISOString();
  const randomString = Math.random().toString(36).substring(2, 7);
  const cleanFileName = file.toLowerCase().replace(/[^a-z0-9]/g, "-");
  const newFileName = `${randomString}-${date}-${cleanFileName}`;
  const newFile = new ReactNativeFile({
    uri: file,
    name: newFileName,
    type: `${file.split(".").pop()}`,
  });
  AWS.config.update({
    region: "eu-north-1",
    credentials: new AWS.Credentials(AWS_S3_ACCESS_KEY, AWS_S3_SECRET_KEY),
  });
  let upload_params = {
    Bucket: "bucket-name",
    Key: file,
    Body: file,
    Acl: "public-read",
  };
  let upload = new AWS.S3.ManagedUpload({ params: upload_params });
  let promise = upload.promise();
  promise.then(
    function (data) {
      // ! Log only for testing
      console.log("Successfully uploaded:", data);
      // console.log(data);
      // setVideoSource(data.Location);
    },
    function (err) {
      console.log("Failed to upload", newFile, "with error:", err.message);
    }
  );
};
I've also tried using CloudFront to serve the videos, but since I haven't gotten further than trying to upload them to S3, an answer here would be great!
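Two things stand out in the snippet above: type is set to the bare file extension (e.g. "mov") rather than a MIME type, and Key/Body are set to the URI string rather than the file contents. A small sketch (plain JavaScript; the helper name and map are my own) for deriving a proper video MIME type from the picked URI:

```javascript
// Map a file extension to a video MIME type. ImagePicker on iOS typically
// returns .mov files, whose MIME type is "video/quicktime", not "mov".
function mimeFromUri(uri) {
  const ext = uri.split(".").pop().toLowerCase();
  const map = { mp4: "video/mp4", mov: "video/quicktime", m4v: "video/x-m4v" };
  return map[ext] || "application/octet-stream";
}

console.log(mimeFromUri("file:///var/mobile/video/IMG_0001.MOV")); // video/quicktime
console.log(mimeFromUri("clip.mp4")); // video/mp4
```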

How to upload image from React Native + Expo app to Azure Blob Storage?

When I try to upload an image from a React Native + Expo app to Azure Blob Storage via Azure Functions (Node.js), the image (JPEG) is saved into storage, but it cannot be opened correctly.
Evidence
== ASK ==
・How can I upload the image correctly? (Is the current behavior, explained in the troubleshooting steps below, unexpected? If so, how can I prevent it and upload the image precisely?)
== Assumption ==
・React Native 0.63.2
・Expo 42.0.4
・Expo-Image-Picker 10.2.3
・Request Route :
React Native + Expo App -> API (Azure Functions (Node.js)) -> Azure Blob Storage
== My Code ==
= React Native + Expo (Client Side)
*I implemented this based on the discussion below:
How can I upload a photo with Expo?
*params: the value returned by expo-image-picker after an image is selected
const formData = new FormData();
const imageUri = params.uri;
const dataType = mime.getType(imageUri);
const fileName = imageUri.split("/").pop();
formData.append('image', {
  uri: imageUri,
  name: "a.jpg",
  type: dataType
} as any);
const url = 'xxxxx';
await fetch(url, {
  method: 'POST',
  headers: {
  },
  body: formData
})
expo-image-picker side code
(the value of "result" is passed to the code above.)
const pickImage = async () => {
  console.log("PICKIMAGE CALLED.");
  // Access Permission Check
  let permissionResult = await ImagePicker.requestMediaLibraryPermissionsAsync();
  if (permissionResult.granted === false) {
    alert("Permission to access camera roll is required!");
    return;
  }
  // Image Pickup
  try {
    let result = await ImagePicker.launchImageLibraryAsync({
      mediaTypes: ImagePicker.MediaTypeOptions.All,
      allowsEditing: false,
      aspect: [4, 3],
      quality: 0.5,
      base64: true,
    });
    if (!result.cancelled) {
      setImageData(result);
    }
  } catch (E) {
    console.log(E);
  }
};
= Azure Function (Server Side)
const httpTrigger: AzureFunction = async function (context: Context, req: HttpRequest): Promise<void> {
  const replace = require('buffer-replace');
  // Upload Image
  try {
    // Parse Blob Data
    var multipart = require("parse-multipart");
    var bodyBuffer = Buffer.from(req.body);
    var boundary = multipart.getBoundary(req.headers['content-type']);
    var parts = multipart.Parse(bodyBuffer, boundary);
    // Upload Blob to Azure Storage
    const { BlobServiceClient } = require("@azure/storage-blob");
    const containerName = "ContainerName";
    const blobName = parts[0].filename;
    const blobServiceClient = BlobServiceClient.fromConnectionString("connection string of storage");
    const containerClient = blobServiceClient.getContainerClient(containerName);
    const blockBlobClient = containerClient.getBlockBlobClient(blobName);
    const uploadBlobResponse = await blockBlobClient.upload(parts[0].data, parts[0].data.length);
    context.log("Blob was uploaded successfully. requestId: ", uploadBlobResponse.requestId);
    context.done();
  } catch (err) {
    context.log(err);
  }
};
== Troubleshooting ==
1. When I checked the Azure Function side log, "parts" seems to have incorrect binary (hex) data: before "ff d8" (the indicator of the JPEG format), an unexpected line break "0d 0a" appears to be added. I suspect this is the reason the file cannot be opened correctly.
var parts = multipart.Parse(bodyBuffer, boundary);
Result of context.log(parts)
2. Based on the result of 1, I also checked the same thing from the client side (React Native + Expo) with the code below, and found that the unexpected line break "0d 0a" is not there. So I am wondering why it is added during processing on the server side.
function base64ToHex(str) {
  const raw = atob(str);
  let result = '';
  for (let i = 0; i < raw.length; i++) {
    const hex = raw.charCodeAt(i).toString(16);
    result += (hex.length === 2 ? hex : '0' + hex);
  }
  return result.toUpperCase();
}
console.log(base64ToHex(params.base64));
Result of Client Side
3. Try image upload from Postman.
Request route: Postman -> API (Azure Functions) -> Azure Blob Storage.
The way Postman sends the image is a little different from the React Native + Expo app (Postman seems to add the image file (blob?) directly into the form data, while React Native adds the blob URI, MIME type, etc.), but it succeeded.
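Regarding the stray "0d 0a" described in step 1: multipart part bodies are delimited by CRLF sequences, and some parsers leave a leading CRLF attached to the part data. A defensive strip (plain-Node sketch; the function name is my own) would be:

```javascript
// Strip one leading CRLF (0x0d 0x0a) if the multipart parser left it
// in front of the actual file bytes (a JPEG should begin with ff d8).
function stripLeadingCrlf(buf) {
  if (buf.length >= 2 && buf[0] === 0x0d && buf[1] === 0x0a) {
    return buf.subarray(2);
  }
  return buf;
}

const part = Buffer.from([0x0d, 0x0a, 0xff, 0xd8, 0xff, 0xe0]);
console.log(stripLeadingCrlf(part)[0].toString(16)); // "ff"
```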
Thank you very much for your help!!

React Native: uploading a file to multer, I am getting an empty buffer

I am making an app with React Native and a back end with NodeJS. The code is exactly the same as in VueJS (except there I get the file from an input), and it works fine from an input and from Postman, but I am having trouble in React Native.
Code in the App:
const formdata = new FormData()
const file = myMainImage
console.log(file)
const fileName = file.split("/").reverse()[0];
formdata.append('media', {
  url: file,
  name: fileName,
  type: 'image/jpeg'
})
await profileApi.uploadMyMainImage(formdata)
And the request to the back end (tried with both axios and fetch):
const postFormMethodWithAuthorization = async (url, content) => {
  const headers = getHeaderWithAuthorizationForm()
  const response = await Axios.post(url, content, { headers })
  return response.data
}

const postFileusingFetch = async (url, content) => {
  const result = await fetch(url, {
    method: 'POST',
    headers: new Header(await getHeaderWithAuthorizationForm()),
    body: content
  })
  return await result.json()
}
But on the back end I am always getting this from req.file:
{
  fieldname: 'media',
  originalname: 'DADE2091-0C50-456B-8F89-408CCAD98E02.jpg',
  encoding: '7bit',
  mimetype: 'image/jpeg',
  buffer: <Buffer >,
  size: 0
}
Any ideas? I thought it could be something related to the file being uploaded, so I treated it like a stream, but the same problem happens. I am also uploading it to AWS S3, but when I get it back the file is empty and can't be opened.
The image URI is taken from the camera. I also tried removing file:// with no luck.
Any help appreciated!
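One detail worth checking in the code above: React Native's FormData expects the file part descriptor to use a uri key, but the snippet appends url: file. With no uri, no file bytes get attached, which would produce exactly the empty buffer and size 0 that multer reports. A sketch of the expected shape (plain object; the values are illustrative):

```javascript
// React Native FormData file parts are described by { uri, name, type };
// a `url` key is ignored, so the part arrives with zero bytes.
const filePart = {
  uri: "file:///var/mobile/DADE2091-0C50-456B-8F89-408CCAD98E02.jpg", // `uri`, not `url`
  name: "DADE2091-0C50-456B-8F89-408CCAD98E02.jpg",
  type: "image/jpeg",
};
console.log("uri" in filePart, "url" in filePart); // true false
```

Separately, the fetch variant constructs new Header(...); the standard constructor is Headers.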

Downloading images from AWS S3 via Lambda and API Gateway, using the fetch API

I'm trying to use the JavaScript fetch API, AWS API Gateway, AWS Lambda, and AWS S3 to create a service that allows users to upload and download media. Server is using NodeJs 8.10; browser is Google Chrome Version 69.0.3497.92 (Official Build) (64-bit).
In the long term, allowable media would include audio, video, and images. For now, I'd be happy just to get images to work.
The problem I'm having: my browser-side client, implemented using fetch, is able to upload JPEG's to S3 via API Gateway and Lambda just fine. I can use curl or the S3 Console to download the JPEG from my S3 bucket, and then view the image in an image viewer just fine.
But, if I try to download the image via the browser-side client and fetch, I get nothing that I'm able to display in the browser.
Here's the code from the browser-side client:
fetch(
  'path/to/resource',
  {
    method: 'post',
    mode: "cors",
    body: an_instance_of_file_from_an_html_file_input_tag,
    headers: {
      Authorization: user_credentials,
      'Content-Type': 'image/jpeg',
    },
  }
).then((response) => {
  return response.blob();
}).then((blob) => {
  const img = new Image();
  img.src = URL.createObjectURL(blob);
  document.body.appendChild(img);
}).catch((error) => {
  console.error('upload failed', error);
});
Here's the server-side code, using Claudia.js:
const AWS = require('aws-sdk');
const ApiBuilder = require('claudia-api-builder');
const api = new ApiBuilder();

api.corsOrigin(allowed_origin);
api.registerAuthorizer('my authorizer', {
  providerARNs: ['arn of my cognito user pool']
});

api.get(
  '/media',
  (request) => {
    'use strict';
    const s3 = new AWS.S3();
    const params = {
      Bucket: 'name of my bucket',
      Key: 'name of an object that is confirmed to exist in the bucket and to be properly encoded as and readable as a JPEG',
    };
    return s3.getObject(params).promise().then((response) => {
      return response.Body;
    });
  }
);

module.exports = api;
Here are the initial OPTION request and response headers in Chrome's Network Panel:
Here's the consequent GET request and response headers:
What's interesting to me is that the image size is reported as 699873 (with no units) in the S3 Console, but the response body of the GET transaction is reported in Chrome as roughly 2.5 MB (again, with no units).
The resulting image is a 16x16 broken-image icon. I get no errors or warnings whatsoever in the browser's console or in CloudWatch.
I've tried a lot of things; would be interested to hear what anyone out there can come up with.
Thanks in advance.
EDIT: Solved; here is what was needed (verified in Chrome).
Claudia requires that the client specify which MIME type it will accept on binary payloads. So, keep the 'Content-type' config in the headers object client-side:
fetch(
  'path/to/resource',
  {
    method: 'post',
    mode: "cors",
    body: an_instance_of_file_from_an_html_file_input_tag,
    headers: {
      Authorization: user_credentials,
      'Content-Type': 'image/jpeg', // <-- This is important.
    },
  }
).then((response) => {
  return response.blob();
}).then((blob) => {
  const img = new Image();
  img.src = URL.createObjectURL(blob);
  document.body.appendChild(img);
}).catch((error) => {
  console.error('upload failed', error);
});
Then, on the server side, you need to tell Claudia that the response should be binary and which MIME type to use:
const AWS = require('aws-sdk');
const ApiBuilder = require('claudia-api-builder');
const api = new ApiBuilder();

api.corsOrigin(allowed_origin);
api.registerAuthorizer('my authorizer', {
  providerARNs: ['arn of my cognito user pool']
});

api.get(
  '/media',
  (request) => {
    'use strict';
    const s3 = new AWS.S3();
    const params = {
      Bucket: 'name of my bucket',
      Key: 'name of an object that is confirmed to exist in the bucket and to be properly encoded as and readable as a JPEG',
    };
    return s3.getObject(params).promise().then((response) => {
      return response.Body;
    });
  },
  /** Add this. **/
  {
    success: {
      contentType: 'image/jpeg',
      contentHandling: 'CONVERT_TO_BINARY',
    },
  }
);

module.exports = api;