I got an error something like this:
[google-id].gserviceaccount.com does not have storage.objects.create access to upload
I already made my bucket public and set the service account as owner. Did I miss something in the setup?
I was trying to upload files from my Express app to test it. I am new to this,
so can anyone tell me what's wrong and what I should set?
One way to give permissions to your application/user would be through the following command:
gsutil iam ch serviceAccount:[google-id].gserviceaccount.com:objectCreator gs://[YOUR_BUCKET]
This is fully documented at Using Cloud IAM permissions. You can also perform this action using Cloud Console. An example of using that interface is provided in the documentation previously linked.
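To double-check that the binding took effect, you can print the bucket's IAM policy (a quick sanity check; [YOUR_BUCKET] is the same placeholder as above):
# Show the bucket's IAM bindings; the service account should appear under roles/storage.objectCreator
gsutil iam get gs://[YOUR_BUCKET]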
I am trying to set up a simple on-demand backup of an S3 bucket in AWS, and whatever I try I always get an access denied. See screenshot:
I have tried creating a new bucket that is completely public, I've tried setting the access policy on the vault, and I've tried different regions; all have the same result: Access Denied!
The message doesn't say anything other than Access Denied, which is really helpful!
Can anyone give me some insight into what this message is referring to and, moreover, how I can resolve this issue?
For AWS Backup, you need to set up a service role.
Traditionally you need two policies attached:
[AWSBackupServiceRolePolicyForBackup]
[AWSBackupServiceRolePolicyForRestore]
For S3, there are separate policies that you need to attach to your service role (see the CLI sketch below):
[AWSBackupServiceRolePolicyForS3Backup]
[AWSBackupServiceRolePolicyForS3Restore]
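As a rough sketch of attaching these from the CLI, assuming a service role named backup-service-role (a placeholder name):
# Attach the S3-specific AWS Backup managed policies to the service role
aws iam attach-role-policy --role-name backup-service-role --policy-arn arn:aws:iam::aws:policy/AWSBackupServiceRolePolicyForS3Backup
aws iam attach-role-policy --role-name backup-service-role --policy-arn arn:aws:iam::aws:policy/AWSBackupServiceRolePolicyForS3Restore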
Just putting this here for those who will be looking for this answer.
To solve this problem with the AWS CDK (JavaScript/TypeScript), you can use the following examples:
https://github.com/SimonJang/blog-aws-backup-s3/blob/68a05f8cb443411a23f02aa0c188adfe15bab0ff/infrastructure/lib/infrastructure-stack.ts#L63-L200
or this:
https://github.com/finnishtransportagency/hassu/blob/8adc0bea3193ff016a9aaa6abe0411292714bbb8/deployment/lib/hassu-database.ts#L230-L312
I've confirmed that my S3 credentials are set correctly and even tried full permissions on the bucket and still receive the same message. On the Azure side, I've added Storage Blob Data Owner permissions to my account and can list files I manually upload through the portal with the credentials I used (signing in through AAD rather than using a token).
Any help is appreciated here.
As it turned out, there were no issues with either the Azure or AWS configuration. Trying the same commands in a Linux VM worked perfectly. If you're experiencing this error, it may be an issue with the AWS CLI or some other local config.
I'm writing in NodeJS and trying to send winston log data to an S3 bucket using s3-streamlogger, but I get access denied.
Testing from the CLI, both read and write work fine. The only reason I can think of is that we are using MFA on our AWS account.
Any ideas? Workarounds?
Thanks
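If MFA enforcement does turn out to be the cause, a common workaround is to mint temporary session credentials with STS and export them before starting the Node process; the MFA serial ARN and token code below are placeholders:
# Request temporary credentials tied to your MFA device
aws sts get-session-token --serial-number arn:aws:iam::123456789012:mfa/your-user --token-code 123456
# Export the returned values so the AWS SDK in the Node app picks them up
export AWS_ACCESS_KEY_ID=<AccessKeyId from the response>
export AWS_SECRET_ACCESS_KEY=<SecretAccessKey from the response>
export AWS_SESSION_TOKEN=<SessionToken from the response>
# app.js is a placeholder for your entry point
node app.js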
I have this SQL dump at an S3 link:
s3://fff-cans/crm/full/production_20190214.sql.gz
How can I access it? It only gives a blank page.
You will need IAM credentials set up to access it. Once you have those, there are several ways, including the AWS Command Line Interface:
https://aws.amazon.com/cli/
which you would use like so:
aws s3 cp s3://fff-cans/crm/full/production_20190214.sql.gz .
(after configuring the IAM credentials)
You can also use the web console at http://aws.amazon.com, for which you will need a username and password.
There are also several UI tools you can use, like http://s3browser.com/ and https://cyberduck.io/
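Putting the CLI route together, the end-to-end flow might look like this (assuming your IAM credentials have s3:GetObject on the bucket):
# One-time setup: enter the access key ID, secret key, and default region when prompted
aws configure
# Copy the dump to the current directory
aws s3 cp s3://fff-cans/crm/full/production_20190214.sql.gz .
# Decompress to get the plain SQL file
gunzip production_20190214.sql.gz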
When I access S3 through my code running under ECS, I can access only a few files; for the others I get an access denied exception. I have access to the bucket, and the objects are not restricted at the object level. I am using DefaultAWSCredentialsProviderChain.
But the same files can be downloaded using the AWS CLI under the same assumed role.
Can anyone help me understand what the issue might be?
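One way to narrow this down might be to compare what the CLI reports for a working object versus a failing one; differing encryption or ownership metadata is a frequent cause of per-object denials. The bucket and key below are placeholders:
# Inspect object metadata; compare ServerSideEncryption/SSEKMSKeyId between working and failing objects
aws s3api head-object --bucket your-bucket --key path/to/failing-object
# Inspect the object ACL; objects written by another account may not grant the bucket owner access
aws s3api get-object-acl --bucket your-bucket --key path/to/failing-object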