I'm trying to access s3://aps-external-download using the .NET SDK:
var client = new AmazonS3Client(accessKey, secretKey, RegionEndpoint.USEast1);
var response = await client.ListObjectsAsync("aps-external-download");
foreach (var x in response.S3Objects)
{
    Console.WriteLine("{0}\t{1}", x.Key, x.Size);
}
But I'm getting Access Denied
{Amazon.S3.AmazonS3Exception}
ErrorCode: AccessDenied
ErrorMessage: Access Denied
From the AWS CLI, I'm able to list the folders:
aws s3 ls s3://aps-external-download/***/
PRE ***_report/
PRE ***_report/
I've been trying to find a solution in the documentation at https://docs.aws.amazon.com/sdk-for-net/index.html, but it hasn't been helpful.
What could I be missing?
The problem is that I was using the wrong method. My credentials evidently allow reading objects but not listing the bucket, so GetObjectAsync succeeds where ListObjectsAsync is denied:
// First argument is the bucket name only; the object key goes in the second argument.
var response = await client.GetObjectAsync("aps-external-download", "path/to/the/report/1999-01-01/report_file");
var reader = new StreamReader(response.ResponseStream);
while (!reader.EndOfStream)
{
    Console.WriteLine(reader.ReadLine());
}
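For what it's worth, if listing is still needed, the CLI behavior above suggests the credentials only allow s3:ListBucket beneath a specific prefix. A minimal sketch that mirrors the CLI call by passing the prefix explicitly (the prefix shown is illustrative, not from my actual bucket):

var listResponse = await client.ListObjectsV2Async(new ListObjectsV2Request
{
    BucketName = "aps-external-download",
    Prefix = "path/to/the/report/"  // hypothetical prefix; use the one your grant covers
});

foreach (var s3Object in listResponse.S3Objects)
{
    Console.WriteLine("{0}\t{1}", s3Object.Key, s3Object.Size);
}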
Is there a way to determine if SES has a local region available using the AWS SDK?
After registering the e-mail service with the Core middleware via services.AddAWSService<IAmazonSimpleEmailService>(), I want to find the local SES region, if one is available, and assign it to the e-mail service. I'm using .NET Core 2.1 and the AWS SDK. Thank you!
I believe this link contains the information you are interested in: https://aws.amazon.com/blogs/aws/new-query-for-aws-regions-endpoints-and-more-using-aws-systems-manager-parameter-store/
Here’s how to get the list of regions where a service (Amazon Athena, in this case) is available:
$ aws ssm get-parameters-by-path \
--path /aws/service/global-infrastructure/services/athena/regions --output json | \
jq .Parameters[].Value
"ap-northeast-2"
"ap-south-1"
"ap-southeast-2"
"ca-central-1"
"eu-central-1"
"eu-west-1"
"eu-west-2"
"us-east-1"
"us-east-2"
"us-gov-west-1"
"ap-northeast-1"
"ap-southeast-1"
"us-west-2"
The AWSSDK.SimpleSystemsManagement NuGet package (https://www.nuget.org/packages/AWSSDK.SimpleSystemsManagement/) can be used to achieve the same via the .NET SDK. For your example, the code to get the set of regions where SES is available could look like this:
var client = new AmazonSimpleSystemsManagementClient(...);
var path = "/aws/service/global-infrastructure/services/ses/regions";

// First page of results.
var response = await client.GetParametersByPathAsync(new GetParametersByPathRequest
{
    Path = path
}).ConfigureAwait(false);
var regions = new HashSet<string>(response.Parameters.Select(x => x.Value));

// GetParametersByPath is paginated; keep fetching while a NextToken is returned.
while (!string.IsNullOrWhiteSpace(response.NextToken))
{
    response = await client.GetParametersByPathAsync(new GetParametersByPathRequest
    {
        Path = path,
        NextToken = response.NextToken
    }).ConfigureAwait(false);

    foreach (var parameter in response.Parameters)
    {
        regions.Add(parameter.Value);
    }
}
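To then wire the result into the registration from the question, here is a minimal sketch. The use of Amazon.Runtime.FallbackRegionFactory (which resolves the ambient region from environment variables, profile, etc.) and the us-east-1 fallback are my own assumptions, not something the question specifies:

// Resolve the region the application is currently running in, if discoverable.
var ambient = FallbackRegionFactory.GetRegionEndpoint()?.SystemName;

// Use the local region when SES is available there; us-east-1 is an arbitrary fallback.
var sesRegion = (ambient != null && regions.Contains(ambient)) ? ambient : "us-east-1";

services.AddAWSService<IAmazonSimpleEmailService>(new AWSOptions
{
    Region = RegionEndpoint.GetBySystemName(sesRegion)
});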
I am trying to use the Microsoft Azure Storage SDK for Node.js and JavaScript for Browsers (https://github.com/Azure/azure-storage-node) to display PDF contents stored in an Azure blob in the browser. So far I couldn't find any examples of how to do it.
I tried to follow the suggestion from https://github.com/Azure/azure-storage-node/issues/440, but I couldn't make it work. I am using an Azure Function.
const azure = require('azure-storage');

module.exports = async function (context, req) {
    let accessToken = await getAccessToken();
    let container = req.params.container;
    let filename = req.params.filename;
    let tokenCredential = new azure.TokenCredential(accessToken);
    let storageAccountName = process.env.StorageAccountName;
    let blobService = azure.createBlobServiceWithTokenCredential(`https://${storageAccountName}.blob.core.windows.net/`, tokenCredential);

    return new Promise((resolve, reject) => {
        let readStream = blobService.createReadStream(container, filename, function (error, result, response) {
            if (error) {
                context.log(error);
                context.log(response);
                context.res = {
                    status: 400,
                    body: response
                };
                resolve(context.res);
            }
        });

        let body = '';
        readStream.on('data', (chunk) => {
            body += chunk;
        });
        readStream.on('end', () => {
            context.res = {
                headers: {
                    'Content-Type': "application/pdf"
                },
                body: body
            };
            resolve(context.res);
        });
    });
};
But I got a "Couldn't open PDF" error message in the browser, or a timeout error.
For downloading a blob in a browser environment, using a URL with a SAS token is recommended. In the framework you are using, would an accessible URL pointing to the PDF be enough?
Please see this example from the documentation:
Download Blob
BlobService provides interfaces for downloading a blob into browser memory. Because of the browser's sandbox limitation, we cannot save the downloaded data chunks to disk until we get all the data chunks of a blob into browser memory. The browser's memory size is also limited, especially for downloading huge blobs, so it's recommended to download a blob in the browser with a SAS-token-authorized link directly.
Shared access signatures (SAS) are a secure way to provide granular access to blobs and containers without providing your storage account name or keys. Shared access signatures are often used to provide limited access to your data, such as allowing a mobile app to access blobs.
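Applied to your function, one way to follow that advice is to return (or redirect to) a short-lived SAS URL instead of streaming the PDF through the function. A rough sketch; note that generateSharedAccessSignature needs account name/key credentials rather than the TokenCredential used above, and the 30-minute expiry is just an example:

// Build a read-only access policy valid for the next 30 minutes.
var startDate = new Date();
var expiryDate = new Date(startDate);
expiryDate.setMinutes(startDate.getMinutes() + 30);

var sharedAccessPolicy = {
    AccessPolicy: {
        Permissions: azure.BlobUtilities.SharedAccessPermissions.READ,
        Start: startDate,
        Expiry: expiryDate
    }
};

// Requires a BlobService created with the account key, e.g. azure.createBlobService(name, key).
var sasToken = blobService.generateSharedAccessSignature(container, filename, sharedAccessPolicy);
var sasUrl = blobService.getUrl(container, filename, sasToken);

// The browser can then load the PDF straight from sasUrl.
context.res = { status: 302, headers: { Location: sasUrl } };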
I am trying to enable default encryption for an S3 bucket programmatically. The following is not working, and there are no errors either. Does anybody know the reason for this?
private async Task<PutBucketEncryptionResponse> EnableServerSideEncryptionAsync(string bucketName)
{
    return await S3Client.PutBucketEncryptionAsync(new PutBucketEncryptionRequest
    {
        BucketName = bucketName,
        ServerSideEncryptionConfiguration = new ServerSideEncryptionConfiguration()
        {
            ServerSideEncryptionRules = new List<ServerSideEncryptionRule>()
            {
                new ServerSideEncryptionRule()
                {
                    ServerSideEncryptionByDefault = new ServerSideEncryptionByDefault()
                    {
                        ServerSideEncryptionAlgorithm = ServerSideEncryptionMethod.AES256
                    }
                }
            }
        }
    });
}
I tried it using the AWS Command-Line Interface (CLI) to see what would happen.
I created a new bucket, and ran:
aws s3api put-bucket-encryption --bucket my-bucket --server-side-encryption-configuration '{"Rules": [{"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}]}'
I then went to the bucket in the Amazon S3 console, clicked the Properties tab and the Default Encryption box displayed: AES-256
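For completeness, the same check can be done from the .NET SDK; a small sketch (note that this call requires the s3:GetEncryptionConfiguration permission):

// Read back the bucket's default-encryption configuration.
var encryption = await S3Client.GetBucketEncryptionAsync(new GetBucketEncryptionRequest
{
    BucketName = bucketName
});

foreach (var rule in encryption.ServerSideEncryptionConfiguration.ServerSideEncryptionRules)
{
    Console.WriteLine(rule.ServerSideEncryptionByDefault.ServerSideEncryptionAlgorithm);
}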
Finally, this ended up being a permission issue. I didn't have permission to see the status of default encryption; users who have that permission could see that default encryption was enabled.
Thank you John Rotenstein for your time finding the solution to this issue.
I hope the AWS console will show a message saying "Access Denied" instead of wrongly showing that default encryption is disabled.
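For anyone who lands here, a minimal IAM policy sketch covering both writing and reading the default-encryption setting; the bucket name is a placeholder:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:PutEncryptionConfiguration",
                "s3:GetEncryptionConfiguration"
            ],
            "Resource": "arn:aws:s3:::my-bucket"
        }
    ]
}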
Good evening, I had the same problem an hour ago and I may have found the solution!
Code:
private async Task EncryptBucket(string bucketName)
{
    var encryptRequest = new PutBucketEncryptionRequest
    {
        BucketName = bucketName,
        ServerSideEncryptionConfiguration = new ServerSideEncryptionConfiguration()
        {
            ServerSideEncryptionRules = new List<ServerSideEncryptionRule>()
            {
                new ServerSideEncryptionRule()
                {
                    ServerSideEncryptionByDefault = new ServerSideEncryptionByDefault()
                    {
                        ServerSideEncryptionAlgorithm = ServerSideEncryptionMethod.AWSKMS,
                        ServerSideEncryptionKeyManagementServiceKeyId = "arn:aws:kms:us-west-2:**insert your account id**:alias/aws/s3"
                    }
                }
            }
        }
    };

    // Send the request; without this call the configuration is never applied.
    await S3Client.PutBucketEncryptionAsync(encryptRequest);
}
#!/bin/bash
# Enable AES256 default encryption on every bucket that doesn't already have it.
for bucket_name in $(aws s3api list-buckets --query "Buckets[].Name" --output text);
do
  if (aws s3api get-bucket-encryption --bucket ${bucket_name})
  then
    echo "already encrypted"
  else
    echo "doing encryption"
    aws s3api put-bucket-encryption --bucket ${bucket_name} --server-side-encryption-configuration '{"Rules": [{"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}]}'
    echo "done encryption"
  fi
done
I am trying to take a picture on my phonegap app and then use the FileTransfer plugin to upload it to my server. I am getting error code 1 but there is no other explanation - this is VERY frustrating. I have scoured every piece of documentation and blog known to man with no luck.
I am using a basic LAMP server and it continues to give me an HTTP 500 code. I am 99.9% sure this error is specific to my server because I have tested this with a different web server of mine and the code works fine. Here is the response:
{"code":1,"source":"file:///storage/emulated/0/Android/data/io.cordova.xxappxx/cache/1477607161788.jpg","target":"https://server.com/php/uploadPhoto.php","http_status":500,"body":"\t","exception":"https://server.com/php/uploadPhoto.php"}
Below is my front-end JavaScript code:
function uploadPhoto(imageURI) {
    var options = new FileUploadOptions();
    options.fileKey = "file";
    options.fileName = imageURI.substr(imageURI.lastIndexOf('/') + 1);
    alert(options.fileName);
    options.mimeType = "image/jpeg";

    var params = {};
    params.value1 = sessionStorage.getItem("token");
    options.params = params;
    options.chunkedMode = false;
    options.headers = {Connection: "close"};

    var ft = new FileTransfer();
    ft.upload(imageURI, "https://servername.com/php/uploadPhoto.php", function (result) {
        console.log(JSON.stringify(result));
    }, function (error) {
        console.log(JSON.stringify(error));
    }, options, true);
}
And here is my back-end PHP code that is being called (uploadPhoto.php):
<?php
session_start();
header('Access-Control-Allow-Origin: *');
$new_image_name = "$userId.jpg";
move_uploaded_file($_FILES["file"]["tmp_name"], "/var/img/".$new_image_name);
?>
This ended up being an issue of image size. I was working on this project for a university, and their servers have lots of security installed - one of those security configurations had a very small file upload size limit, which was blocking the uploads. I discovered this by scanning some of the log files in the /var/log/ directory.
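In case it helps anyone else on a stock LAMP server, the usual place to raise the limit is php.ini (hardened servers may enforce additional limits elsewhere, e.g. in the web server or a security module). The values below are examples only:

; php.ini - allow larger uploads; post_max_size must be >= upload_max_filesize
upload_max_filesize = 10M
post_max_size = 12M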
Just finished breakfast and already hit a snag. I'm trying to call the Salesforce REST API from my Google Sheets. I've written a working script locally in Python, but in converting it to JS, something went wrong:
function authenticateSF() {
  var url = 'https://login.salesforce.com/services/oauth2/token';
  var options = {
    grant_type: 'password',
    client_id: 'XXXXXXXXXXX',
    client_secret: '111111111111',
    username: 'ITSME#smee.com',
    password: 'smee'
  };
  var results = UrlFetchApp.fetch(url, options);
}
Here is the error response:
Request failed for https://login.salesforce.com/services/oauth2/token
returned code 400. Truncated server response:
{"error_description":"grant type not
supported","error":"unsupported_grant_type"} (use muteHttpExceptions
option to examine full response) (line 12, file "Code")
Mind you, these exact parameters work fine in my local Python script (putting the key values inside quotation marks).
Here are the relevant docs:
Google Script: Connecting to external API's
Salesforce: REST API guide
Thank you all!
Google's UrlFetchApp object automatically defaults to a GET request. To authenticate, you have to explicitly set the method to "post" in the options and pass the OAuth parameters as the payload:
function authenticateSF() {
  var url = 'https://login.salesforce.com/services/oauth2/token';
  var payload = {
    'grant_type': 'password',
    'client_id': 'XXXXXXXXXXX',
    'client_secret': '111111111111',
    'username': 'ITSME#smee.com',
    'password': 'smee'
  };
  var options = {
    'method': 'post',
    'payload': payload
  };
  var results = UrlFetchApp.fetch(url, options);
}
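Once the request goes through, the token can be pulled out of the JSON body. A short follow-on sketch (the field names are the standard Salesforce OAuth response fields; adding muteHttpExceptions to the options is optional but keeps failed responses inspectable instead of throwing):

// Parse the OAuth token response.
var data = JSON.parse(results.getContentText());
var accessToken = data.access_token;  // send as "Authorization: Bearer <token>" on later REST calls
var instanceUrl = data.instance_url;  // base URL for those calls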