I am new to the aws-sdk and I want to sign the Cache-Control, Content-Type and x-amz-acl headers for a presigned URL. Is this possible with @aws-sdk/s3-request-presigner? I can't find any example.
var command = new PutObjectCommand({
  Bucket: 'mybucket',
  Key: 'file.txt',
  ACL: 'public-read',
  CacheControl: 'public, max-age=1000',
  ContentType: 'text/plain',
});
var signedUrl = await getSignedUrl(s3Client, command, {
  expiresIn: 3600,
  signableHeaders: new Set(['Cache-Control', 'Content-Type', 'x-amz-acl'])
});
The resulting URL contains X-Amz-SignedHeaders=host; that is, my headers don't get signed. What do I have to do?
Also, can I restrict the size of the upload with something like content-length-range and sign that as well?
You have probably solved this by now, but what XHR client were you using?
I had the same issue using axios because I was calling axios like this:
axios({
  url: signedRequest,
  method: 'put',
  data: Body,
  headers: {
    'x-amz-acl': 'public-read-write',
    'Content-Type': 'application/pdf'
  },
  maxContentLength: Infinity,
  maxBodyLength: Infinity
})
I needed to remove the headers property in this axios call because I was already setting the ACL and ContentType when getting the signed request. Once I removed the headers property from my axios call, it started working.
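For reference, the working call then looks roughly like this (a minimal sketch, assuming the same signedRequest and Body as above):

// Same call as above, but without the headers property;
// the ACL and Content-Type were already specified when the URL was presigned.
axios({
  url: signedRequest,
  method: 'put',
  data: Body,
  maxContentLength: Infinity,
  maxBodyLength: Infinity
})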
Related
I am trying to send a POST request to a URL on which authorization of type API Key is enabled. I am able to send the request through Postman and the API responds perfectly fine, but when I try to use fetch in Vue.js, I am unable to send the POST request.
A screenshot of Postman is attached.
The code I am using in Vue.js is:
const requestOptions = {
  method: "POST",
  withCredentials: true,
  headers: {
    "Content-Type": "application/json",
    "X-API-Key": "3C68F15FF89132BF254E5FB648FCA",
  },
  body: JSON.stringify({
    name: this.name,
    phonenumber: this.phoneNumber,
    msg: this.message,
  }),
};
let response = await fetch(
  "https://auto.toxiclabs.net/webhook/d6121492-4b9c-4dc2-908f-991001b20b61",
  requestOptions
);
The error I am getting
Can anyone tell me what is actually wrong in my code?
I have been struggling with image upload for days.
I'm using FormData like this:
let formData = new FormData();
formData.append('file', {
  uri: uri,
  name: `name`,
  type: `image/jpeg`,
});
uri on iOS is something like asset-library://asset/path; on Android it is something like content://media/external/images/media/25377.
let options = {
  method: 'POST',
  body: formData,
  headers: {
    Accept: 'application/json',
    'Authorization': 'Bearer ' + token,
  },
};
let response = await fetch("https://myserverurl", options)
I tried every trick: reading the image as a blob, removing the content type, other libraries like axios, etc.
No matter what, I always get back a 400 "bad file format" error.
Is there something I’m missing with formdata?
(On the backend we use ASP.NET)
We had a similar issue and were able to solve it the following way.
We are using a Node.js backend (with multer) to handle the file uploads.
Expo - Mobile App Code
// extract the filetype
let fileType = uri.substring(uri.lastIndexOf(".") + 1);

let formData = new FormData();
formData.append("photo", {
  uri,
  name: `photo.${fileType}`,
  type: `image/${fileType}`
});

let options = {
  method: "POST",
  body: formData,
  headers: {
    Accept: "application/json",
    "Content-Type": "multipart/form-data"
  }
};
We are executing the request with fetch(apiUrl, options).
The uri is the local file path of the photo in our case (a full URI, e.g. file:///...) and apiUrl is the server-side endpoint.
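For context, the server side receiving this upload could look roughly like the following (a minimal sketch of an Express + multer setup; the field name "photo" matches the FormData key above, everything else is assumed):

// Hypothetical Express + multer endpoint, not the poster's actual backend
const express = require('express');
const multer = require('multer');

const app = express();
const upload = multer({ dest: 'uploads/' }); // store uploaded files in ./uploads

// 'photo' must match the key used in formData.append("photo", ...)
app.post('/upload', upload.single('photo'), (req, res) => {
  // req.file holds the uploaded file's metadata (path, mimetype, size, ...)
  res.json({ filename: req.file.originalname, size: req.file.size });
});

app.listen(3000);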
I think the issue might be with the type and format of uri in the FormData. Have you tried using the uri returned by the image picker?
I have no trouble getting a bearer token returned when using Postman. However, when using Aurelia, I receive a status 200 with "OK" as the only response. I see that the Request Method is still "OPTIONS". I see this in the Chrome Console:
Failed to load https://------.auth0.com/oauth/token: Request header field Access-Control-Allow-Origin is not allowed by Access-Control-Allow-Headers in preflight response.
But from what I can see of the headers shown in the response, everything looks like it's there.
Here's what I receive from Postman:
Response: Status 200 OK
JSON:
{
  "access_token": "eyJ0eXAiOiJKV1QiLCJhbGci...{shortened for brevity}",
  "expires_in": 86400,
  "token_type": "Bearer"
}
Here's code from Aurelia:
private getToken() {
  var postData = { "client_id": API_CONFIG.clientId, "client_secret": API_CONFIG.clientSecret, "audience": API_CONFIG.audience, "grant_type": "client_credentials" };
  this.http.fetch('https://kimberlite.auth0.com/oauth/token', {
    credentials: 'omit',
    headers: {
      'Content-Type': 'application/json',
      'Access-Control-Allow-Origin': 'http://localhost:3000/'
    },
    mode: 'cors',
    method: 'post',
    body: JSON.stringify(postData)
  }).then(result => result.json())
    .then(data => {
      localStorage.setItem('api_access_token', data.access_token);
      localStorage.setItem('api_expires_at', new Date().getTime() + data.expires_in);
    });
}
I've searched and haven't found anything that has helped me get past this. What am I missing? Any help is greatly appreciated.
After reading Jesse's comment below, I removed the 'Access-Control-Allow-Origin' header and receive the same 200 OK. However, I receive an error in Google Chrome: "Origin 'localhost:3000' is therefore not allowed access. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled."
After reading other questions, I attempted removing all headers and I receive a 401 Unauthorized with the following response: {"error":"access_denied","error_description":"Unauthorized"}
private getToken() {
  var postData = { "client_id": API_CONFIG.clientId, "client_secret": API_CONFIG.clientSecret, "audience": API_CONFIG.audience, "grant_type": "client_credentials" };
  let http = new HttpClient();
  http.fetch('https://kimberlite.auth0.com/oauth/token', {
    credentials: 'omit',
    //headers: {
    //  'Content-Type': 'application/json'
    //},
    mode: 'cors',
    method: 'post',
    body: JSON.stringify(postData)
  }).then(result => result.json())
    .then(data => {
      localStorage.setItem('api_access_token', data.access_token);
      localStorage.setItem('api_expires_at', new Date().getTime() + data.expires_in);
    });
}
OK, I just tried in Firefox using only the 'Content-Type' header and received the expected response. Is there something about Chrome (which most users are going to be using) that I need to be aware of?
You shouldn't set the access-control-allow-origin header on the request. In a CORS request, the server endpoint needs to set this header on the response of your OPTIONS request.
The way Cross-Origin Resource Sharing works is that the client first makes an OPTIONS call to the server endpoint. The server endpoint should be configured to use CORS and have a list of origins that are allowed (or simply a * to allow all origins). Then, in the response to this OPTIONS request, the server sets Access-Control-Allow-Origin: https://localhost:3000 to indicate that the origin is allowed to make the request. You can see this in your response too:
The client then proceeds to make the GET or POST call to the same endpoint and actually retrieve/store the data.
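To illustrate where those headers belong, here is a minimal sketch of a server-side preflight handler (an Express example purely for illustration; it is not how Auth0's endpoint is actually implemented):

const express = require('express');
const app = express();
app.use(express.json());

// Preflight response: the server tells the browser which origins, headers and methods are allowed
app.options('/oauth/token', (req, res) => {
  res.set('Access-Control-Allow-Origin', 'http://localhost:3000');
  res.set('Access-Control-Allow-Headers', 'Content-Type');
  res.set('Access-Control-Allow-Methods', 'POST, OPTIONS');
  res.sendStatus(204);
});

// The actual POST also echoes the allowed origin on its response
app.post('/oauth/token', (req, res) => {
  res.set('Access-Control-Allow-Origin', 'http://localhost:3000');
  res.json({ access_token: '...', expires_in: 86400, token_type: 'Bearer' });
});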
In your case, if you make the request using the Aurelia fetch client, you don't need to set a header to do this. You can simply do the following:
private getToken() {
  var postData = { "client_id": API_CONFIG.clientId, "client_secret": API_CONFIG.clientSecret, "audience": API_CONFIG.audience, "grant_type": "client_credentials" };
  this.http.fetch('https://kimberlite.auth0.com/oauth/token', {
    credentials: 'omit',
    headers: {
      'Content-Type': 'application/json'
    },
    mode: 'cors',
    method: 'post',
    body: JSON.stringify(postData)
  }).then(result => result.json())
    .then(data => {
      localStorage.setItem('api_access_token', data.access_token);
      localStorage.setItem('api_expires_at', new Date().getTime() + data.expires_in);
    });
}
I got Watson Speech-to-Text working on the web. I am now trying to do it in React Native but am getting errors on the file upload part.
I am using the HTTPS Watson API. I need to set the Content-Type, otherwise Watson returns an error response. However, in React Native, for the file upload to work, we seem to need to set 'Content-Type' to 'multipart/form-data'. Is there any way to upload a file in React Native while setting Content-Type to 'audio/aac'?
The error Watson API gives me if I set 'Content-Type': 'multipart/form-data' is:
{
  type: "default",
  status: 400,
  ok: false,
  statusText: undefined,
  headers: Object,
  url: "https://stream.watsonplatform.net/speech-to-text/api/v1/recognize?continuous=true",
  _bodyInit: Blob,
  _bodyBlob: Blob
}
The response body is:
{
  "code_description": "Bad Request",
  "code": 400,
  "error": "No JSON object could be decoded"
}
Here is my code (the full code is on gist.github.com):
const ext = 'aac';
const file_path = '/storage/emulated/0/Music/enter-the-book.aac';

const data = new FormData();
data.append('file', {
  uri: `file://${file_path}`,
  name: `recording.${ext}`,
  type: `audio/${ext}`
}, `recording.${ext}`);

const response = await fetch('https://stream.watsonplatform.net/speech-to-text/api/v1/recognize?continuous=true', {
  method: 'POST',
  headers: {
    // 'Content-Type': `audio/${ext}`,
    'Content-Type': 'multipart/form-data',
    'X-Watson-Authorization-Token': token
  },
  body: data
});

console.log('watson-stt::getResults - response:', response);

if (response.status !== 200) {
  const error = await response.text();
  throw new Error(`Got bad response "status" (${response.status}) from Watson Speech to Text server, error: "${error}"`);
}
Here is a screenshot of the error I get when I set 'Content-Type': 'audio/aac':
Thanks so much to DanielBolanos and NikolayShmyrev; this is the solution I used:
This code is for iOS, so I recorded the audio as blah.ulaw, BUT the part_content_type is audio/mulaw;rate=22050. It is very important to use mulaw here even though the file extension is ulaw. An interesting note: I couldn't play the blah.ulaw file on my macOS desktop.
Also note that you MUST NOT set Content-Type to multipart/form-data; this will destroy the boundary.
Also, Bluemix requires the rate in the part_content_type for mulaw.
const body = new FormData();
let metadata = {
  part_content_type: 'audio/mulaw;rate=22050' // notice "mulaw" here, "ulaw" DOES NOT work here
};
body.append('metadata', JSON.stringify(metadata));
body.append('upload', {
  uri: `file://${file_path}`,
  name: `recording.ulaw`, // notice the use of "ulaw" here
  type: `audio/ulaw` // and here it is also "ulaw"
});

const response = await fetch('https://stream.watsonplatform.net/speech-to-text/api/v1/recognize?continuous=true', {
  method: 'POST',
  headers: {
    // 'Content-Type': 'multipart/form-data' // DO NOT SET THIS!! It destroys the boundary and messes up the request
    'Authorization': `Basic ${btoa(`${USERNAME}:${PASSWORD}`)}`
  },
  body
});
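If the request succeeds, the transcript can then be pulled out of the JSON body; a rough sketch (field names follow the shape of the v1 /recognize response):

// Extract the transcript text from a successful recognize response
const result = await response.json();
const transcript = result.results
  .map(r => r.alternatives[0].transcript)
  .join(' ');
console.log('Transcript:', transcript);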
According to the documentation for multipart requests, the request should be:
curl -X POST -u "{username}":"{password}" \
  --header "Transfer-Encoding: chunked" \
  --form metadata="{
    \"part_content_type\":\"audio/flac\",
    \"timestamps\":true,
    \"continuous\":true}" \
  --form upload="@audio-file1.flac" \
  "https://stream.watsonplatform.net/speech-to-text/api/v1/recognize"
So the Content-Type should be multipart/form-data, and you can specify aac as "part_content_type": "audio/aac".
The big problem you have is that audio/aac is not among the supported formats. You will probably need another codec.
We're using signed URLs to upload from the browser. I haven't been able to figure out how to set the Cache-Control header while uploading.
We're using the gcloud-node library to sign urls:
var bucket = gcs.bucket('mybucket');
var file = bucket.file('image.jpg');
var expireDate = new Date();
expireDate.setDate(expireDate.getDate() + 1);
file.getSignedUrl({
  action: 'write',
  expires: expireDate,
  contentType: 'image/jpeg'
}, function (err, signedUrl) {
  if (err) {
    console.error('SignedUrl error', err);
  } else {
    console.log(signedUrl);
  }
});
How do I set the Cache-Control headers while uploading a file to GCS?
The code to upload is running in the browser:
var signedUrl = ...; // get from nodejs server
var fileList = this.files;
var file = fileList[0];
jQuery.ajax({
  url: signedUrl,
  type: 'PUT',
  data: file,
  processData: false,
  contentType: 'image/jpeg'
})
This is possible, but the documentation is terrible. First you need to set up CORS on the bucket you're uploading to with:
gsutil cors set cors.json gs://bucket-name
Where cors.json contains something like:
[{
  "maxAgeSeconds": 3600,
  "method": ["GET", "PUT", "POST"],
  "origin": [
    "http://localhost:3000"
  ],
  "responseHeader": ["Content-Type", "Cache-Control"]
}]
"Cache-Control" needs to be listed in the "responseHeader" field. Then upload like you normally would, but set the Cache-Control header. Using fetch it would be:
fetch(uploadUrl, {
  method: 'PUT',
  body: blob,
  headers: {
    'Content-Type': blob.type,
    'Cache-Control': 'public, max-age=31536000',
  },
});
The snippet you have is getting a signed URL. When you upload (insert) the object into GCS, you should be able to set Cache-Control via the API:
https://cloud.google.com/storage/docs/json_api/v1/objects/insert
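For example, if you upload through the Node.js client instead of a signed URL, the Cache-Control metadata can be supplied at upload time. A minimal sketch (option names follow the current @google-cloud/storage client, which may differ from the older gcloud-node version used above):

// Upload via the Node.js client and set Cache-Control as object metadata directly
const { Storage } = require('@google-cloud/storage');
const storage = new Storage();

async function uploadWithCacheControl() {
  await storage.bucket('mybucket').upload('image.jpg', {
    destination: 'image.jpg',
    metadata: {
      contentType: 'image/jpeg',
      cacheControl: 'public, max-age=31536000',
    },
  });
}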