Access denied when I try to upload a file from Express

I got an error something like this:
[google-id].gserviceaccount.com does not have storage.objects.create access to upload
I already made my bucket public and set the service account as owner. Did I miss something in the setup?
I was trying to upload files from my Express app to test it; I am new to this.
Can anyone tell me what's wrong and what I need to set?

One way to give permissions to your service account would be through the following command:
gsutil iam ch serviceAccount:[google-id].gserviceaccount.com:objectCreator gs://[YOUR_BUCKET]
This is fully documented at Using Cloud IAM permissions. You can also perform this action in the Cloud Console; an example of using that interface is provided in the documentation linked above.
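Once the service account has the objectCreator role, an upload route along these lines should work. This is only a minimal sketch, assuming multer for multipart parsing and a placeholder bucket name; adjust it to your own setup:
// Minimal sketch of an Express upload route using @google-cloud/storage.
// Assumes GOOGLE_APPLICATION_CREDENTIALS points at the service account key
// and that the account has roles/storage.objectCreator on the bucket.
import express from 'express';
import multer from 'multer';
import { Storage } from '@google-cloud/storage';

const app = express();
const upload = multer({ storage: multer.memoryStorage() }); // keep the file in memory
const bucket = new Storage().bucket('your-bucket'); // placeholder bucket name

app.post('/upload', upload.single('file'), async (req, res) => {
  if (!req.file) {
    res.status(400).send('No file provided');
    return;
  }
  // Write the uploaded buffer to an object named after the original file.
  await bucket.file(req.file.originalname).save(req.file.buffer, {
    contentType: req.file.mimetype,
  });
  res.status(201).send(`Uploaded ${req.file.originalname}`);
});

app.listen(3000);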

Related

AWS Backup from S3 Access Denied

I am trying to set up a simple on-demand backup of an S3 bucket in AWS, and whatever I try I always get Access Denied. See the screenshot.
I have tried creating a new bucket that is completely public, setting the access policy on the vault, and using different regions; all have the same result: Access Denied!
The message doesn't say anything beyond Access Denied, which is really helpful!
Can anyone give me some insight into what this message refers to and, moreover, how I can resolve the issue?
For AWS Backup, you need to set up a service role.
Traditionally you need two policies attached:
AWSBackupServiceRolePolicyForBackup
AWSBackupServiceRolePolicyForRestores
For S3, there are separate policies that you need to attach to your service role:
AWSBackupServiceRolePolicyForS3Backup
AWSBackupServiceRolePolicyForS3Restore
Just putting this here for those who will be looking for this answer.
To solve this with the AWS CDK (JavaScript/TypeScript), you can use the following examples:
https://github.com/SimonJang/blog-aws-backup-s3/blob/68a05f8cb443411a23f02aa0c188adfe15bab0ff/infrastructure/lib/infrastructure-stack.ts#L63-L200
or this:
https://github.com/finnishtransportagency/hassu/blob/8adc0bea3193ff016a9aaa6abe0411292714bbb8/deployment/lib/hassu-database.ts#L230-L312
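For reference, a condensed sketch of the role those examples create might look like this inside a Stack's constructor. The construct ID is a placeholder, and the managed policy names and their service-role/ path prefix are worth double-checking against the IAM console:
import * as iam from 'aws-cdk-lib/aws-iam';

// Sketch of a service role that AWS Backup can assume, with the standard
// backup/restore policies plus the S3-specific ones attached.
// 'BackupServiceRole' is a placeholder ID; verify policy names/paths in IAM.
const backupRole = new iam.Role(this, 'BackupServiceRole', {
  assumedBy: new iam.ServicePrincipal('backup.amazonaws.com'),
  managedPolicies: [
    iam.ManagedPolicy.fromAwsManagedPolicyName('service-role/AWSBackupServiceRolePolicyForBackup'),
    iam.ManagedPolicy.fromAwsManagedPolicyName('service-role/AWSBackupServiceRolePolicyForRestores'),
    iam.ManagedPolicy.fromAwsManagedPolicyName('AWSBackupServiceRolePolicyForS3Backup'),
    iam.ManagedPolicy.fromAwsManagedPolicyName('AWSBackupServiceRolePolicyForS3Restore'),
  ],
});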

Copying folders from S3 to an Azure Storage Blob and receiving "cannot list objects, access is denied" error. Anyone else have this and resolve it?

I've confirmed that my S3 credentials are set correctly and even tried granting full permissions on the bucket, and I still receive the same message. On the Azure side, I've added the Storage Blob Data Owner role to my account and can list files I manually upload through the portal with the credentials I used (signing in through AAD rather than using a token).
Any help is appreciated here.
As it turned out, there were no issues with either the Azure or the AWS configuration. Trying the same commands in a Linux VM worked perfectly. If you're experiencing this error, it may be an issue with the AWS CLI or some other local config.

Issues writing to S3 using s3-streamlogger

I'm writing in Node.js and trying to send Winston log data to an S3 bucket using s3-streamlogger, but I get access denied.
Testing from the CLI, both read and write work fine. The only reason I can think of is that we are using MFA on our AWS account.
Any ideas or workarounds?
Thanks
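For context, the setup in question looks roughly like the sketch below (the bucket name is a placeholder). If the account's IAM policies deny API calls made without MFA, the process would need temporary credentials (access key, secret, and session token from STS) in its environment rather than the long-lived keys, the same way the CLI obtains them when you authenticate with MFA.
import winston from 'winston';
import { S3StreamLogger } from 's3-streamlogger';

// Rough sketch of a Winston logger writing to S3 via s3-streamlogger.
// 'my-log-bucket' is a placeholder; credentials are assumed to come from the
// environment (AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY / AWS_SESSION_TOKEN).
const s3Stream = new S3StreamLogger({ bucket: 'my-log-bucket' });

const logger = winston.createLogger({
  transports: [new winston.transports.Stream({ stream: s3Stream })],
});

logger.info('hello from s3-streamlogger');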

How to access an S3 link I got from other people

I have this SQL dump at an S3 link:
s3://fff-cans/crm/full/production_20190214.sql.gz
How can I access this? It only gives a blank page.
You will need IAM credentials set up to access it. Once you have those, there are several ways, including the AWS Command Line Interface:
https://aws.amazon.com/cli/
You would use it like so:
aws s3 cp s3://fff-cans/crm/full/production_20190214.sql.gz .
(after configuring the IAM credentials)
You can also use the web interface at http://aws.amazon.com, for which you will need a username and password.
There are also several UI tools you can use, like http://s3browser.com/ and https://cyberduck.io/.
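If you'd rather fetch it from Node.js, a rough equivalent with the AWS SDK for JavaScript v3 would be something like this (the region is an assumption; use the bucket's actual region):
import { createWriteStream } from 'node:fs';
import { pipeline } from 'node:stream/promises';
import type { Readable } from 'node:stream';
import { S3Client, GetObjectCommand } from '@aws-sdk/client-s3';

// Download the object to a local file. Credentials come from the usual
// sources (environment variables, ~/.aws/credentials, etc.).
const s3 = new S3Client({ region: 'us-east-1' }); // assumption: adjust to the bucket's region

const { Body } = await s3.send(new GetObjectCommand({
  Bucket: 'fff-cans',
  Key: 'crm/full/production_20190214.sql.gz',
}));

await pipeline(Body as Readable, createWriteStream('production_20190214.sql.gz'));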

Access issue accessing S3 selectively for an application running under ECS, but it works from the CLI

When I access S3 through my code running under ECS, I can access only a few files; for a few others I get an access denied exception. I have access to the bucket and the objects are not restricted at the object level. I am using DefaultAWSCredentialsProviderChain.
But the same files can be downloaded using the AWS CLI under the same assumed role.
Can someone help me understand what the issue might be?