Axios prepending content to start of file making it unreadable - react-native

I am trying to upload a file to an S3 presigned URL using axios from an Expo managed mobile app frontend. I have found that the following code works perfectly:
const file = await fetch(fileRef.uri);
const blob = await file.blob();
await fetch(uploadUrl, { method: 'PUT', body: blob });
where fileRef is an object like:
Object {
"height": 1920,
"uri": "file:///....jpg",
"width": 1080,
}
and uploadUrl is a presigned URL.
I want to port this over to axios to take advantage of the onUploadProgress event. I've written the following:
const body = new FormData()
body.append('file', fileRef)
await axios.put(uploadUrl, body);
This uploads the file; however, it prepends additional information to the start of the file, which makes the uploaded image or video unreadable. The prepended information looks like:
--9V.XUQuQ1DIG8HFMzJO-veI4JbmI7j_WawYPxtMUG2NhK_7eGnlL.kVNSXyH_sAQ2897mg^M
content-disposition: form-data; name="file"^M
content-type: image/jpeg^M
^M
I found that if I delete these lines, the file can be opened (e.g. by QuickTime).
How can I avoid having this information added to the start of the file?
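The prepended lines are the multipart/form-data boundary and part headers that FormData always produces; a presigned URL expects the raw file bytes as the PUT body, exactly as the working fetch version sends them. A minimal sketch of the axios equivalent under that assumption (untested in this exact Expo setup):

// sketch: upload the raw blob instead of wrapping it in FormData
const file = await fetch(fileRef.uri);
const blob = await file.blob();

await axios.put(uploadUrl, blob, {
  // onUploadProgress is the reason for switching to axios
  onUploadProgress: (event) => {
    console.log(`uploaded ${event.loaded} of ${event.total} bytes`);
  },
});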

Related

When trying to download certain video it redirects to a new URL and the video starts to play. "disposable content type" is not received from server

I want to download certain videos with a click. For that, I created a button and attached a function that should trigger the associated video download.
But I am only able to download the link of the video, not the video. I can download the videos with an external downloader or by simply dragging the URL into the downloads section of the browser, but I am unable to trigger that via JavaScript. Please help me.
I tried multiple ways to tackle this problem:
Using a Simple Blob Technique without Axios:
const blob = new Blob([this.src_url], { type: 'video/mp4' })
const link = document.createElement('a')
link.href = URL.createObjectURL(blob)
link.download = this.src_url.replace(
  'https://redis-test.com/videos/',
  ''
)
link.click()
URL.revokeObjectURL(link.href)
endpoint: the video URL gets downloaded as a file of 122 bytes
Then using File Saver Package:
var FileSaver = require('file-saver')
console.log(this.src_url)
var blob = new Blob([this.src_url], { type: 'video/mp4' })
FileSaver.saveAs(blob, 'hello world.mp4')
Then using the form method:
<form method="get" action="file.doc">
<button type="submit">Download!</button>
</form>
endpoint: video starts to play in the same window
Using href download attribute:
function download(url) {
const a = document.createElement('a')
a.href = url
a.download = url.split('/').pop()
document.body.appendChild(a)
a.click()
document.body.removeChild(a)
}
endpoint: video starts to play in the same window
Using your method:
const link = document.createElement('a')
link.href = url
link.click()
endpoint: video starts to play in the same window
With Axios defaults now:
axios.defaults.withCredentials = true
window.open(
'https://cdn.pixaandom_urlrbay.com/vieo/487508532/Woman%20-%2058142.mp4?rendition=source&expiry=1666842719&hash=7dd6d178d9dbbd8adaf68dafd80c9167e91eca21&download'
)
endpoint: video starts to play in the new window
With a Content-Disposition header attached via Axios:
axios
.get(
String(nuxtConfig.axios.mediaURL) +
this.src_url.replace(
'https://redisrandom_url.com/videos/',
''
),
{
headers: {
mode: 'no-cors',
referrerPolicy: 'no-referrer',
'Content-Disposition': 'attachment; filename=Woman - 58142.mp4',
Host: 'redis-nfs',
'User-Agent': 'PostmanRuntime/7.29.2',
Accept: '*/*',
'Accept-Language': 'en-US,en;q=0.5',
'Accept-Encoding': 'gzip, deflate, br',
Connection: 'keep-alive',
Cookie:
'tk_or=%22https%3A%2F%2Fwww.google.com%2F%22; tk_lr=%22https%3A%2F%2Fwww.google.com%2F%22; _gcl_au=1.1.954672920.1660108804; _ga=GA1.2.1392122600.1660108808; _fbp=fb.1.1660108809200.1970395787',
'Upgrade-Insecure-Requests': '1',
'Sec-Fetch-Dest': 'document',
'Sec-Fetch-Mode': 'navigate',
'Sec-Fetch-Site': 'none',
'Sec-Fetch-User': '?1',
Pragma: 'no-cache',
'Cache-Control': 'no-cache',
},
}
)
.then((response) => {
console.log(response)
const url = window.URL.createObjectURL(new Blob([response.data]))
const link = document.createElement('a')
link.href = url
link.setAttribute('download', 'title')
document.body.appendChild(link)
link.click()
})
.catch((error) => {
console.log('rex')
})
endpoint: Cross-Origin Request Blocked: The Same Origin Policy disallows reading the remote resource at redis-random_url/videos/be319-72e1-2e79-8dc3-bcef1/…. (Reason: CORS header ‘Access-Control-Allow-Origin’ missing). Status code: 200
"...But I am only able to download the link of the video, not the video."
I don't use VueJS, but I suspect this.src_url is just the text of the path to the video URL.
In HTML5 you can only download files that exist on your own server. If the file is external, then you need a PHP script (on the same server as your HTML file) to read those external bytes back into your JS buffer array.
const blob = new Blob([this.src_url], { type: 'video/mp4' })
Should be:
let myBytes = [] // update this variable with the result of reading the file's bytes
let myBlob = new Blob([Uint8Array.from(myBytes)], { type: 'application/octet-stream' });
Reading the bytes can be done with the FileReader API or the Fetch API.
When you can read a file's bytes into an array using VueJS, then your problem is solved.
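To make that concrete, here is a minimal sketch using the Fetch API, assuming this.src_url is a full URL that the browser is allowed to read (i.e. CORS permits it); the downloadVideo helper is hypothetical, not part of the original code:

// hypothetical helper: fetch the actual bytes, then trigger a download of them
async function downloadVideo(srcUrl) {
  const response = await fetch(srcUrl) // must be CORS-readable
  const blob = await response.blob()   // the real bytes, not the URL text

  const link = document.createElement('a')
  link.href = URL.createObjectURL(blob)
  link.download = srcUrl.split('/').pop() || 'video.mp4'
  document.body.appendChild(link)
  link.click()
  document.body.removeChild(link)
  URL.revokeObjectURL(link.href)
}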

Axios xlsx file download issue

I am trying to download an *.xlsx file in Vue using an Axios GET request; however, the response I get is not what I expected. Here is what I am trying to do:
On the frontend, in the onClick method:
const response = await this._fileService.getFileAsBlob(fileName);
const downloadBlob = new Blob([response.data], { type: 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet;' })
virtualLink.href = URL.createObjectURL(downloadBlob);
virtualLink.download = file.fileName?? 'file';
virtualLink.click();
Next, the getFileAsBlob call:
public getFileAsBlob(fileName: string): Promise<AxiosResponse<Blob>> {
return this._http.get<Blob>(`API_URL`, {
responseType: 'arraybuffer',
headers: {
"content-type": "application/octet-stream"
}
});
}
Now my concerns:
First, the original file byte array is byte[11524],
but in the axios response.data the file is an ArrayBuffer(15370) (disclaimer: I've checked the response on the backend and everything is working fine; at the last step the backend returns the proper byte array).
Second, while debugging the response I noticed that although I set "content-type": "application/octet-stream", in the response I get "application/json, text/plain, */*". What could be the cause of that?
As a result, the downloaded file is corrupted and cannot be opened by Excel. Can somebody point out where the flaw in my logic is?
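For comparison, a hedged sketch of how such a request is often written, assuming the backend really does return the raw bytes. Two notes: Content-Type on a GET describes the (empty) request body, so it does not influence what comes back, and "application/json, text/plain, */*" is simply axios's default Accept request header. As an aside, the growth from 11524 to 15370 bytes is roughly the 4/3 ratio of base64, which may hint that the bytes are being serialized as a base64 string somewhere along the way.

// hypothetical variant of getFileAsBlob, not the asker's actual backend contract
public getFileAsBlob(fileName: string): Promise<AxiosResponse<Blob>> {
  return this._http.get<Blob>(`API_URL`, {
    responseType: 'blob', // have axios hand back a Blob directly
    headers: {
      // Accept tells the server which representation we want back
      Accept: 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
    }
  });
}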

Cannot upload files with ACL public-read to Digital Ocean spaces

I'm trying to upload images to a Digital Ocean space from the browser. These images should be public. I'm able to upload the images successfully.
However, though the ACL is set to public-read, the uploaded files are always private.
I know they're private because a) the dashboard says the permissions are "private", b) the public URLs don't work, and c) manually changing the permissions to "public" in the dashboard fixes everything.
Here's the overall process I'm using:
1. Create a pre-signed URL on the backend.
2. Send that URL to the browser.
3. Upload the image to that pre-signed URL.
Any ideas why the images aren't public?
Code
The following examples are written in TypeScript and use AWS's v3 SDK.
Backend
This generates the pre-signed url to upload a file.
import { S3Client, PutObjectCommand } from '@aws-sdk/client-s3'
import { getSignedUrl } from '@aws-sdk/s3-request-presigner'
const client = new S3Client({
region: 'nyc3',
endpoint: 'https://nyc3.digitaloceanspaces.com',
credentials: {
accessKeyId: process.env.DIGITAL_OCEAN_SPACES_KEY,
secretAccessKey: process.env.DIGITAL_OCEAN_SPACES_SECRET,
},
})
const command = new PutObjectCommand({
ACL: 'public-read',
Bucket: 'bucket-name',
Key: fileName,
ContentType: mime,
})
const url = await getSignedUrl(client, command)
The pre-signed URL is then sent to the browser.
Frontend
This is the code on the client to actually upload the file to Digital Ocean. file is a File object.
const uploadResponse = await fetch(url, {
headers: {
'Content-Type': file.type,
'Cache-Control': 'public,max-age=31536000,immutable',
},
body: file,
method: 'PUT',
})
Metadata
AWS SDK: 3.8.0
It turns out that for Digital Ocean, you also need to set the public-read ACL as a header in the PUT request.
//front-end
const uploadResponse = await fetch(url, {
headers: {
'Content-Type': file.type,
'Cache-Control': 'public,max-age=31536000,immutable',
'x-amz-acl': 'public-read', // add this line
},
body: file,
method: 'PUT',
})
I don't have the reputation to comment, hence adding a response. Thank you @Nick ... this is one of the few working examples of code I have seen for a DigitalOcean pre-signed URL. While the official DigitalOcean description mentions that Content-Type is needed for uploading with pre-signed URLs, there is no example code.
Another mistake that prevented me from uploading a file using pre-signed URLs in DigitalOcean was using 'Content-Type': 'multipart/form-data' and FormData().
After seeing this post, I followed @Nick's suggestion of using a File() object and 'Content-Type': '<relevant_mime>'. Then the file upload worked like a charm. This is also not covered in the official docs.
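For anyone following the axios route from the earlier questions, here is a sketch of the same upload written with axios instead of fetch, under the same assumptions (a File object, the pre-signed url from the backend snippet, and the x-amz-acl header from @Nick's answer):

// hypothetical axios equivalent of the fetch upload above
const uploadResponse = await axios.put(url, file, {
  headers: {
    'Content-Type': file.type,
    'Cache-Control': 'public,max-age=31536000,immutable',
    'x-amz-acl': 'public-read',
  },
})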
Try this to force ACL to Public in Digital Ocean Spaces:
s3cmd --access_key=YOUR_ACCESS_KEY --secret_key=YOUR_SECRET_KEY --host=YOUR_BUCKET_REGION.digitaloceanspaces.com --host-bucket=YOUR_BUCKET_NAME.YOUR_BUCKET_REGION.digitaloceanspaces.com --region=YOUR_BUCKET_REGION setacl s3://YOUR_BUCKET_NAME --acl-public

How to get a pre-signed URL that downloads file with http compression

Here is my code in node.js:
const downloadURL = await s3.getSignedUrlPromise('getObject', {
Bucket: BUCKET_NAME,
Key: 'key to a large json file',
});
Once I've got the URL, I want to download a very large JSON file stored in S3 from the browser. Since it is large, I would like to use HTTP compression, which would compress a 20 MB JSON file to less than 1 MB. I could not find anywhere how to do it, or whether it is at all possible with the S3 APIs.
I also tried the following when using the signed URL to download the file, and it does not seem to work.
const dataRes = await fetch(downloadURL, {
headers: {
'Accept-Encoding': 'gzip, deflate',
},
method: 'GET',
});
Hope somebody could help me out. Thanks a lot!
After doing some research, I have resolved this. Posting it here in the hope it is helpful to others.
You cannot ask S3 to compress the file on the fly when calling getObject or when using a signed URL for getObject.
You have to save the gzipped file to S3 in the first place. On Linux, use the command below to create it:
gzip -9 <file to compress>
Upload the gzipped file to S3 (a sketch of this step is shown below).
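A sketch of this upload step done programmatically, assuming the same aws-sdk v2 client used for getSignedUrlPromise and a hypothetical file name large.json.gz (the output of gzip -9 large.json); setting ContentEncoding here is an alternative to the ResponseContentEncoding override used when signing the URL below:

// hypothetical upload of the pre-gzipped file with the aws-sdk v2 client
const fs = require('fs');

await s3.putObject({
  Bucket: BUCKET_NAME,
  Key: 'key to a large zipped json file',
  Body: fs.readFileSync('large.json.gz'),
  ContentType: 'application/json',
  ContentEncoding: 'gzip', // lets clients transparently decompress
}).promise();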
Use the code below to generate the signed URL:
const downloadURL = await s3.getSignedUrlPromise('getObject', {
Bucket: BUCKET_NAME,
Key: 'key to a large zipped json file',
ResponseContentEncoding: 'gzip',
ResponseContentType: 'application/json',
});
Use the code below to download from the signed URL:
const res = await fetch(downloadurl);
const jsonData = await res.json();

S3 uploading and serving image with pre signed URL

I am trying to upload an image to my S3 bucket through a pre-signed url. Everything works well except that when I hit the public URL for that image, the browser downloads it instead of showing it. When I upload the same image from the AWS Console, everything works well and the image gets displayed in the browser.
Here is how I do it:
Generation of the pre-signed URL:
s3.getSignedUrl('putObject', {
Bucket: myBucket,
Key: myKey,
Expires: signedUrlExpireSeconds
})
Upload of the file with axios:
const response = await axios.put(url, formElement.files[0])
Should I configure headers somewhere in the process to tell S3 the MIME type of the content I'm uploading, or something like that?
Thank you for your help
There are two places you can do this.
If you know the type of the image ahead of time, you can explicitly set the ContentType in the s3.getSignedUrl params, because those params are encoded into and passed along with the signed PUT request (see the getSignedUrl and putObject docs). For example:
s3.getSignedUrl('putObject', {
Bucket: myBucket,
Key: myKey,
Expires: signedUrlExpireSeconds,
ContentType: 'image/png'
});
Alternatively, you can set the Content-Type header on the Axios request (see the S3 REST PUT docs), for example:
const response = await axios.put(
url,
formElement.files[0],
{ headers: { 'Content-Type': formElement.files[0].type } });