AWS S3 CLI error: "An error occurred (AccessDenied) when calling the PutObject operation: Access Denied"

I have installed AWSCLIV2.MSI on my Windows desktop.
After signing in to my IAM account using the secret access key and access key ID, I try to upload a simple TXT file to my S3 bucket using the following command:
aws s3 cp 4CLItest.txt s3://bucketname
After entering the command, I receive the following error:
aws s3 cp 4CLItest.txt s3://bucketname
upload failed: .\4CLItest.txt to s3://bucketname/4CLItest.txt
An error occurred (AccessDenied) when
calling the PutObject operation: Access Denied
I'm running the following version of AWS CLI:
aws-cli/2.10.0 Python/3.9.11 Windows/10 exe/AMD64 prompt/off
I am able to upload to this bucket by dragging and dropping in the browser using the same IAM account.
However, I have a very large upload to perform and would like to use the CLI from the command prompt to do it.
Any help or advice would be greatly appreciated.
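If it helps narrow this down: the identity the CLI uses comes from the configured access keys and can differ from the account you use in the browser, and that identity needs s3:PutObject on the bucket. A minimal sketch of the checks, assuming an IAM user; bucketname, my-cli-user, and the policy name are placeholders, not details from the question:

# Confirm which IAM identity the CLI commands actually run as
aws sts get-caller-identity

# Attach a minimal inline policy allowing uploads to the bucket (allow-put.json shown below)
aws iam put-user-policy --user-name my-cli-user --policy-name AllowPutToBucket --policy-document file://allow-put.json

Where allow-put.json contains:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::bucketname/*"
    }
  ]
}

If the bucket enforces default encryption with a KMS key or has a restrictive bucket policy, those can also return AccessDenied on PutObject even when the identity policy allows the upload.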

Related

eks assume role is not permitted on user

I'm trying to run an automated build in my AWS CodeBuild project. I have the following statement in my buildspec.yaml file:
aws eks update-kubeconfig --name ticket-api --region us-west-1 --role-arn arn:aws:iam::12345:role/service-role/CodeBuild-API-Build-service-role
I get the following error:
An error occurred (AccessDenied) when calling the AssumeRole operation: User: arn:aws:sts::12345:assumed-role/CodeBuild-API-Build-service-role/AWSCodeBuild-99c25416-7046-416e-b5d9-4bff1f4992f3 is not authorized to perform: sts:AssumeRole on resource: arn:aws:iam::12345:role/service-role/CodeBuild-API-Build-service-role
I'm not sure what the role arn:aws:sts::12345:assumed-role/CodeBuild-API-Build-service-role/AWSCodeBuild-99c25416-7046-416e-b5d9-4bff1f4992f3 is, especially the AWSCodeBuild-99c25416-7046-416e-b5d9-4bff1f4992f3 part. Is it something unique to the current user?
Also, I think the current user is already assigned to the role it is trying to assume.
If I run the same command without the role (aws eks update-kubeconfig --name ticket-api --region us-west-1), it works, but when I then run kubectl version I get the following error:
error: You must be logged in to the server (the server has asked for the client to provide credentials)
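Not a definitive answer, but a couple of notes. The AWSCodeBuild-99c25416-7046-416e-b5d9-4bff1f4992f3 suffix in the error is the role session name CodeBuild generates for the assumed-role session, so it is unique to that build, not a separate user. Beyond that, two things usually matter in this setup, sketched below with the ARNs from the error and placeholder values otherwise: a role does not get sts:AssumeRole implicitly, even on itself, so the CodeBuild service role needs an explicit statement (and the target role's trust policy must allow the caller); and the identity that update-kubeconfig ends up using has to be mapped in the cluster's aws-auth ConfigMap, otherwise kubectl fails with the "must be logged in to the server" error. The eksctl command is one way to add that mapping, assuming eksctl is available.

Policy statement attached to the CodeBuild role:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "sts:AssumeRole",
      "Resource": "arn:aws:iam::12345:role/service-role/CodeBuild-API-Build-service-role"
    }
  ]
}

Mapping the role into the cluster (run as an identity that already has cluster admin access; system:masters is broad, so scope it down if you can):

eksctl create iamidentitymapping \
  --cluster ticket-api \
  --region us-west-1 \
  --arn arn:aws:iam::12345:role/service-role/CodeBuild-API-Build-service-role \
  --group system:masters \
  --username codebuild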

Query from BigQuery using the local command line

I'm trying to query BigQuery from PowerShell. I've run gcloud init and logged in to my account.
The request was this:
bq query --use_legacy_sql=false 'SELECT customer_id FROM `demo1.customers1`'
It results in this error:
BigQuery error in query operation: Error processing job
'PROJECT-ID:bqjob': Access Denied:
BigQuery BigQuery: Permission denied while getting Drive credentials.
This worked when I ran it in Cloud Shell.
I've previously created a service account and a key for the project. I tried running this command, but it doesn't solve it:
gcloud auth activate-service-account SERVICE_ACCOUNT@DOMAIN.COM --key-file=D:/folder/key.json --project=MYPROJECT_ID
The account you query as needs the OAuth scope for Drive in order to access Drive-backed tables; the command below can be used to authenticate with Drive access.
gcloud auth login --enable-gdrive-access
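As a sketch of the full sequence, re-authenticating with the Drive scope and re-running the query from the question would look like this; note that the account you authenticate as also needs access to the underlying Drive file (for example, the Sheet backing demo1.customers1, if that table is Drive-backed):

gcloud auth login --enable-gdrive-access
bq query --use_legacy_sql=false 'SELECT customer_id FROM `demo1.customers1`'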

Permission denied on S3 path - What are the minimum policies required on the data source to get athena-express to work?

I'm attempting to use the Athena-express node module to query Athena.
Per the Athena-express docs:
"This IAM role/user must have AmazonAthenaFullAccess and AmazonS3FullAccess policies attached.
Note: As an alternative to granting AmazonS3FullAccess you could granularize and limit write access to a specific bucket. Just specify this bucket name during athena-express initialization."
Providing AmazonS3FullAccess to this microservice is a non-starter. What is the minimum set of privileges I can grant to the microservice and still get around the "Permission denied on S3 path: s3://..." errors I've been getting?
Currently, I've got the following:
Output location: (I don't think the problem is here)
s3:AbortMultipartUpload, s3:CreateMultipartUpload, s3:DeleteObject, s3:Get*, s3:List*, s3:PutObject, s3:PutObjectTagging
on "arn:aws:s3:::[my-bucket-name]/tmp/athena" and "arn:aws:s3:::[my-bucket-name]/tmp/athena/*"
Data source location:
s3:GetBucketLocation
on "arn:aws:s3:::*"
s3:ListBucket
on "arn:aws:s3:::[my-bucket-name]"
s3:Get* and s3:List*
on "arn:aws:s3:::[my-bucket-name]/production/[path]/[path]" and "arn:aws:s3:::[my-bucket-name]/production/[path]/[path]/*"
The error message I get with the above is:
"Permission denied on S3 path: s3://[my-bucket-name]/production/[path]/[path]/v1/dt=2022-05-26/.hoodie_partition_metadata"
Any suggestions? Thanks!
It turned out that the bucket storing the data I needed to query was encrypted, which meant that the missing permission was kms:Decrypt.
Athena writes the results of a query to an output location (which athena-express then retrieves). That output location was in the same encrypted bucket, so I also ended up giving my cron job kms:Encrypt and kms:GenerateDataKey.
I ended up using CloudTrail to figure out which permissions were causing my queries to fail.
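For anyone hitting the same thing, a minimal sketch of the extra statement this implies, assuming the bucket is encrypted with a customer-managed KMS key; the key ARN is a placeholder, and the actions are the ones named above:

{
  "Effect": "Allow",
  "Action": [
    "kms:Decrypt",
    "kms:Encrypt",
    "kms:GenerateDataKey"
  ],
  "Resource": "arn:aws:kms:us-east-1:111122223333:key/REPLACE-WITH-KEY-ID"
}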

Unable to create BigQuery Data Transfer job using a service account

I am not able to create a data transfer job between a Google Play Store bucket and a Google Cloud Storage bucket using a service account that has permissions for both. I am able to create a transfer job using my project account, which only has access to the storage bucket, so I cannot use that in production.
By running the following:
bq mk --transfer_config --target_dataset=<my dataset> --display_name=<My Transfer Job> --params='{"bucket":<playstore bucket>,"table_suffix":<my suffix>}' --data_source=play --service_account <service account email> --service_account_credential_file $GOOGLE_APPLICATION_CREDENTIALS
I am getting this error:
Unexpected exception in GetCredentialsFromFlags operation: Credentials
appear corrupt. Please delete the credential file and try your command
again. You can delete your credential file using "bq init
--delete_credentials".
I did a bq init and reran the bq command, but I'm getting the same error.
I also activated the service account using the command below; still the same error.
gcloud auth activate-service-account <service account email> --key-file $GOOGLE_APPLICATION_CREDENTIALS
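Not sure this is the cause of the corrupt-credentials message, but it may be worth sanity-checking that the file GOOGLE_APPLICATION_CREDENTIALS points at is the service account's JSON key (not a P12 key) and that the account is the active credential. A quick diagnostic sketch, assuming a bash-style shell with Python available:

# Confirm the variable points at the file you expect
echo $GOOGLE_APPLICATION_CREDENTIALS

# A valid JSON key parses cleanly and has "type": "service_account" plus a client_email
python -c "import json, os; k = json.load(open(os.environ['GOOGLE_APPLICATION_CREDENTIALS'])); print(k['type'], k['client_email'])"

# Check which account gcloud considers active
gcloud auth list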

Access denied while loading from GCS to BigQuery

I have uploaded a CSV file to GCS and I can list/view that file using gsutil (ls and cat). I can even load that file using bq load, but when I try to load it from the Python loadTable script mentioned in the examples, it fails with the error message "Access Denied":
{u'state': u'DONE', u'errors': [{u'reason': u'accessDenied', u'message': u'Access Denied: File (file_name): Access Denied'}], u'errorResult': {u'reason': u'accessDenied', u'message': u'Access Denied: File (file_name): Access Denied'}}
Done Loading!
The authorization used is a service account key. I have tested listing the dataset and the table. It's just the load from GCS that says "Access Denied".
Please help.
Does the service account have read access to the Google Storage file? Can you send the job ID of the failed load job?
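If it turns out the service account cannot read the object, a sketch of granting read access with gsutil; the bucket name, object name, and service account email are placeholders:

# Grant the service account read on the specific object (ACL form)
gsutil acl ch -u my-sa@my-project.iam.gserviceaccount.com:R gs://my-bucket/file.csv

# Or grant read on the whole bucket via IAM
gsutil iam ch serviceAccount:my-sa@my-project.iam.gserviceaccount.com:objectViewer gs://my-bucket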