Access denied while loading from GCS to BigQuery - google-bigquery

I have uploaded a CSV file to GCS and I can list/view that file using gsutil (ls and cat). I can even load that file using "bq load", but when I try to load it from the Python loadTable script mentioned in the examples, it fails with the error message "Access Denied":
{u'state': u'DONE', u'errors': [{u'reason': u'accessDenied', u'message': u'Access Denied: File (file_name): Access Denied'}], u'errorResult': {u'reason': u'accessDenied', u'message': u'Access Denied: File (file_name): Access Denied'}}
Done Loading!
The authorization is done through a service account key. I have tested listing the dataset and the table; it's just the load from GCS that says "Access Denied".
Please help.
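For reference, here is a minimal sketch of the same load done with the current google-cloud-bigquery Python client and a service account key; the key file path, bucket, and table names below are placeholders, not the values from the failing job:

from google.cloud import bigquery
from google.oauth2 import service_account

# Placeholder key file; the credentials must belong to the same service account
credentials = service_account.Credentials.from_service_account_file("service-account.json")
client = bigquery.Client(credentials=credentials, project=credentials.project_id)

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,
)

# Placeholder GCS URI and destination table
load_job = client.load_table_from_uri(
    "gs://my-bucket/my-file.csv",
    "my_dataset.my_table",
    job_config=job_config,
)
load_job.result()  # raises an exception carrying the accessDenied reason if the load fails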

Does the service account have read access to the Google Cloud Storage file? Can you send the job ID of the failed load job?
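If the service account turns out not to have read access, a minimal sketch of granting it read access to the object with the google-cloud-storage client (the bucket, object, and service account email are placeholders):

from google.cloud import storage

client = storage.Client()
blob = client.bucket("my-bucket").blob("my-file.csv")  # placeholder bucket/object

# Grant the service account read access to this one object
blob.acl.user("my-sa@my-project.iam.gserviceaccount.com").grant_read()
blob.acl.save()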

Related

AWS S3 CLI error message: An error occurred (AccessDenied) when calling the PutObject operation: Access Denied

I have installed AWSCLIV2.MSI on my Windows desktop.
After signing in to my IAM account using the secret key and key ID, I try to upload a simple TXT document to my S3 bucket using the following command:
aws s3 cp 4CLItest.txt s3://bucketname
After entering the command, I receive the following error:
aws s3 cp 4CLItest.txt s3://bucketname
upload failed: .\4CLItest.txt to s3://bucketname/4CLItest.txt
An error occurred (AccessDenied) when
calling the PutObject operation: Access Denied
I'm running the following version of AWS CLI:
aws-cli/2.10.0 Python/3.9.11 Windows/10 exe/AMD64 prompt/off
I am able to upload to this bucket if I use the browser (click and drag) with the same IAM account.
However, I have a very large upload to perform and would like to use the CLI through the command prompt to do so.
Any help or advice would be greatly appreciated.
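For what it's worth, one way to confirm whether s3:PutObject itself is being denied for these credentials (rather than something CLI-specific) is to try the same upload from a small boto3 script; this sketch assumes boto3 picks up the same credentials configured for the CLI:

import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")  # uses the same credentials configured via `aws configure`

try:
    with open("4CLItest.txt", "rb") as f:
        s3.put_object(Bucket="bucketname", Key="4CLItest.txt", Body=f)
    print("upload succeeded")
except ClientError as err:
    # An AccessDenied here points at the IAM/bucket policy rather than the CLI
    print(err.response["Error"]["Code"], err.response["Error"]["Message"])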

Permission denied on S3 path - What are the minimum policies required on the data source to get athena-express to work?

I'm attempting to use the athena-express Node module to query Athena.
Per the Athena-express docs:
"This IAM role/user must have AmazonAthenaFullAccess and AmazonS3FullAccess policies attached.
Note: As an alternative to granting AmazonS3FullAccess you could granularize and limit write access to a specific bucket. Just specify this bucket name during athena-express initialization."
Providing AmazonS3FullAccess to this microservice is a non-starter. What is the minimum set of privileges I can grant to the microservice and still get around the "Permission denied on S3 path: s3://..." errors I've been getting?
Currently, I've got the following:
Output location: (I don't think the problem is here)
s3:AbortMultipartUpload, s3:CreateMultipartUpload, s3:DeleteObject, s3:Get*, s3:List*, s3:PutObject, s3:PutObjectTagging
on "arn:aws:s3:::[my-bucket-name]/tmp/athena" and "arn:aws:s3:::[my-bucket-name]/tmp/athena/*"
Data source location:
s3:GetBucketLocation
on "arn:aws:s3:::*"
s3:ListBucket
on "arn:aws:s3:::[my-bucket-name]"
s3:Get* and s3:List*
on "arn:aws:s3:::[my-bucket-name]/production/[path]/[path]" and "arn:aws:s3:::[my-bucket-name]/production/[path]/[path]/*"
The error message I get with the above is:
"Permission denied on S3 path: s3://[my-bucket-name]/production/[path]/[path]/v1/dt=2022-05-26/.hoodie_partition_metadata"
Any suggestions? Thanks!
It turned out that the bucket storing the data I needed to query was encrypted, which meant that the missing permission to query was kms:Decrypt.
Athena by default outputs the results of a query to a location (which athena-express then retrieves). The location of the output was in that same encrypted bucket, so I also ended up giving my cronjob kms:Encrypt and kms:GenerateDataKey.
I ended up using CloudTrail to figure out which permissions were causing my queries to fail.
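For anyone wanting to replicate this, here is a sketch of what the extra KMS statement might look like when attached as an inline role policy with boto3; the role name, policy name, and key ARN are placeholders and would need to match however the microservice's role and the bucket's KMS key are actually managed:

import json
import boto3

iam = boto3.client("iam")

kms_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["kms:Decrypt", "kms:Encrypt", "kms:GenerateDataKey"],
            "Resource": "arn:aws:kms:us-east-1:123456789012:key/placeholder-key-id",
        }
    ],
}

iam.put_role_policy(
    RoleName="athena-express-service-role",   # placeholder role name
    PolicyName="athena-express-kms-access",   # placeholder policy name
    PolicyDocument=json.dumps(kms_policy),
)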

Airflow Permission denied while getting Drive credentials

I am trying to run a bigquery query on Airflow with MWAA.
This query uses a table that is based on a Google Sheet. When I run it, I get the following error:
google.api_core.exceptions.Forbidden: 403 Access Denied: BigQuery BigQuery: Permission denied while getting Drive credentials.
I already have a working Google Cloud connection on Airflow with an admin service account.
Also:
This service account has access to the Google Sheet
I added https://www.googleapis.com/auth/drive to the scopes of the Airflow connection
I regenerated the service account JSON key file
Am I doing something wrong? Any idea what I can do to fix this problem?
Thanks a lot
I fixed my issue by creating a NEW Airflow connection. It's a new Google Cloud connection with the exact same values as the default google_cloud_default connection. Now it works perfectly.
Hope it can help!

Access Denied: Permission denied while getting Drive credentials

Since today, our Airflow service has not been able to run queries in BigQuery. All jobs fail with the following message:
[2021-03-12 10:17:28,079] {taskinstance.py:1150} ERROR - Reason: 403 GET https://bigquery.googleapis.com/bigquery/v2/projects/waipu-app-prod/queries/e62030d7-36eb-4420-b482-b5327f4f6c7e?maxResults=0&timeoutMs=900&location=EU: Access Denied: BigQuery BigQuery: Permission denied while getting Drive credentials.
We haven't changed anything in recent days, so we are quite puzzled about what the reason might be. Is there a temporary bug, or are there settings we should check?
Thanks & Best regards
Albrecht
I solved this by:
Giving the Airflow service account email access to the Google Sheet that the BigQuery table is derived from
Adding https://www.googleapis.com/auth/cloud-platform,https://www.googleapis.com/auth/bigquery,https://www.googleapis.com/auth/drive to scopes in the Airflow connection
Regenerating the service account JSON keyfile and pasting it into the Keyfile JSON field in the Airflow connection
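To check the key file and scopes independently of the Airflow connection, here is a small sketch that queries the Sheets-backed table directly with the same service account key (the key path and table name are placeholders):

from google.cloud import bigquery
from google.oauth2 import service_account

credentials = service_account.Credentials.from_service_account_file(
    "service-account.json",  # placeholder path to the regenerated key file
    scopes=[
        "https://www.googleapis.com/auth/cloud-platform",
        "https://www.googleapis.com/auth/bigquery",
        "https://www.googleapis.com/auth/drive",
    ],
)
client = bigquery.Client(credentials=credentials, project=credentials.project_id)

# Placeholder table; this fails with the same Drive-credentials error if the
# scopes or the Sheet sharing are still missing
for row in client.query("SELECT * FROM `my_dataset.sheet_backed_table` LIMIT 10").result():
    print(dict(row))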

Access Denied while globbing file pattern in data transfer from Google Cloud Platform to BigQuery

I'm quite new to the BigQuery world, so apologies if I'm asking a stupid question.
I'm trying to create a scheduled data transfer job that imports data into BigQuery from Google Cloud Storage.
Unfortunately I always get the following error message:
Failed to start job for table MyTable with error PERMISSION_DENIED: Access Denied: BigQuery BigQuery: Permission denied while globbing file pattern.
I have already verified that I have all the required permissions, but it still isn't working.
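This message typically means the credentials the transfer runs under cannot list or read the objects matched by the file pattern in the source bucket. Here is a sketch of granting bucket-level read access with the storage client, assuming the bucket name and the member email below stand in for the real ones:

from google.cloud import storage

client = storage.Client()
bucket = client.bucket("my-source-bucket")  # placeholder bucket name

policy = bucket.get_iam_policy(requested_policy_version=3)
policy.bindings.append(
    {
        "role": "roles/storage.objectViewer",
        # Placeholder member; use the account the transfer is configured with
        "members": {"serviceAccount:transfer-sa@my-project.iam.gserviceaccount.com"},
    }
)
bucket.set_iam_policy(policy)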