Airflow Permission denied while getting Drive credentials - google-bigquery

I am trying to run a BigQuery query on Airflow with MWAA.
This query uses a table that is based on a Google Sheet. When I run it, I get the following error:
google.api_core.exceptions.Forbidden: 403 Access Denied: BigQuery BigQuery: Permission denied while getting Drive credentials.
I already have a working Google Cloud connection in Airflow with an admin service account.
Also:
This service account has access to the Google Sheet
I added https://www.googleapis.com/auth/drive to the scopes of the Airflow connection
I regenerated the service account's JSON key file
Am I doing something wrong? Any idea what I can do to fix this problem?
Thanks a lot

I fixed my issue by creating a NEW Airflow connection. It's a new Google Cloud connection with the exact same values as the default google_cloud_default connection. Now it works perfectly.
Hope it can help!
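If you prefer the CLI to the UI for this, a new connection can be created with something like the sketch below. This assumes Airflow 2.x with CLI access (MWAA only exposes a subset of CLI commands); the connection id google_cloud_new, the project id, and the keyfile path are placeholders, and the extra field names follow the older extra__google_cloud_platform__ prefix convention (newer provider versions also accept the short names):

# Create a fresh Google Cloud connection with an explicit Drive scope
airflow connections add 'google_cloud_new' \
    --conn-type 'google_cloud_platform' \
    --conn-extra '{
        "extra__google_cloud_platform__project": "my-project",
        "extra__google_cloud_platform__key_path": "/path/to/keyfile.json",
        "extra__google_cloud_platform__scope": "https://www.googleapis.com/auth/cloud-platform,https://www.googleapis.com/auth/drive"
    }'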

Related

Query from BigQuery using local command line

I'm trying to query BigQuery using PowerShell. I've run gcloud init and logged in to my account.
The request was this:
bq query --use_legacy_sql=false 'SELECT customer_id FROM `demo1.customers1`'
Resulting with this error:
BigQuery error in query operation: Error processing job
'PROJECT-ID:bqjob': Access Denied:
BigQuery BigQuery: Permission denied while getting Drive credentials.
This worked when I ran it in Cloud Shell.
I've already created a service account and a key for the project. I tried running this command, but it doesn't solve it:
gcloud auth activate-service-account SERVICE_ACCOUNT@DOMAIN.COM --key-file=D:/folder/key.json --project=MYPROJECT_ID
The service account needs the Drive OAuth scope to access Drive; the command below can be used to authenticate with Drive access.
gcloud auth login --enable-gdrive-access
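Note that gcloud auth login only affects gcloud's own credential store. If the failing client reads Application Default Credentials instead, the Drive scope can be requested there as well. This is a sketch only; whether your tool uses ADC, and the exact scope list, are assumptions:

# Request ADC with the Drive scope included (scope list is illustrative)
gcloud auth application-default login \
    --scopes=https://www.googleapis.com/auth/cloud-platform,https://www.googleapis.com/auth/drive,https://www.googleapis.com/auth/bigquery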

Access Denied: Permission denied while getting Drive credentials

Since today, our Airflow service has not been able to run queries in BigQuery. All jobs fail with the following message:
[2021-03-12 10:17:28,079] {taskinstance.py:1150} ERROR - Reason: 403 GET https://bigquery.googleapis.com/bigquery/v2/projects/waipu-app-prod/queries/e62030d7-36eb-4420-b482-b5327f4f6c7e?maxResults=0&timeoutMs=900&location=EU: Access Denied: BigQuery BigQuery: Permission denied while getting Drive credentials.
We haven't changed anything in recent days, so we are quite puzzled about what the reason might be. Is there a temporary bug? Or do we have to check some settings?
Thanks & Best regards
Albrecht
I solved this by:
Giving the Airflow service account email access to the Google Sheet that the BigQuery table is derived from
Adding https://www.googleapis.com/auth/cloud-platform,https://www.googleapis.com/auth/bigquery,https://www.googleapis.com/auth/drive to the scopes in the Airflow connection
Regenerating the service account JSON keyfile and pasting it into the Keyfile JSON field of the Airflow connection (a CLI way to regenerate the key is sketched below)
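For that last step, the keyfile can be regenerated from the command line as well as from the console. A minimal sketch, assuming gcloud is already authenticated; the output path and service account email are placeholders:

# Create a new JSON key for the service account
gcloud iam service-accounts keys create /path/to/new-keyfile.json \
    --iam-account=my-airflow-sa@my-project.iam.gserviceaccount.com

The contents of the generated file are what gets pasted into the Keyfile JSON field.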

Access Denied while globbing file pattern when transferring data from Google Cloud Storage to BigQuery

I'm quite new to the BigQuery world, so apologies if I'm asking a stupid question.
I'm trying to create a scheduled data transfer job that imports data into BigQuery from Google Cloud Storage.
Unfortunately I always get the following error message:
Failed to start job for table MyTable with error PERMISSION_DENIED: Access Denied: BigQuery BigQuery: Permission denied while globbing file pattern.
I've verified that I already have all the required permissions, but it still isn't working.

Failed to create workflow job because of insufficient permissions in dataflow

I apologize in advance for asking this question. It must be something very silly that I am overlooking. I am new to GCP. When I try to create a job using the GUI and the Google Pub/Sub to BigQuery template, I get the following error:
The workflow could not be created. Causes: (717932ea69118a95): Unable to get machine type information for machine type n1-standard-4 in zone us-central1-a because of insufficient permissions. Please refer to https://cloud.google.com/dataflow/access-control#creating_jobs and make sure you have sufficient permissions.
I went to IAM and checked that I am already the owner of the project. Can someone please guide me?
Thanks
We faced a similar issue. The root cause we found was that the Dataflow service account was missing in IAM.
You should find a service account similar to the one below:
service-xxxxxxxxxxxx@dataflow-service-producer-prod.iam.gserviceaccount.com
If you don't find it, try disabling the Dataflow API and re-enabling it.
You need the roles/compute.viewer role for the service account.
We solved this by giving the user the dataflow.admin role in the IAM console. The link provided in the error message lists more granular permissions you can add if you don't want your Dataflow developers to be full admins.
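For reference, both grants can also be made from the command line instead of the IAM console. A sketch; the project id, project number, and user email are placeholders, so substitute your own:

# Grant Compute Viewer to the Dataflow service account
gcloud projects add-iam-policy-binding my-project \
    --member="serviceAccount:service-123456789012@dataflow-service-producer-prod.iam.gserviceaccount.com" \
    --role="roles/compute.viewer"

# Or make a developer a Dataflow admin
gcloud projects add-iam-policy-binding my-project \
    --member="user:developer@example.com" \
    --role="roles/dataflow.admin"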

Error: "does not have storage.buckets.get access to default” when trying to connect to Google Cloud Console

I am trying to connect Google Cloud Console and Skyvia so I can run SQL queries using Salesforce data.
In Skyvia, I am getting the error "does not have storage.buckets.get access to default" when trying to connect to Google.
I am not a developer or programmer, so a reply that would be understood by a newbie would be appreciated.
I have tried one thing, which doesn't seem to work.
Since this page
https://cloud.google.com/storage/docs/access-control/iam-roles
says:
Role: roles/storage.legacyBucketReader
Has permission: storage.buckets.get
In my Google Console, under Permissions, I set "Storage Legacy Bucket Reader" to "allUsers". Maybe I am missing the "default" part?
https://console.cloud.google.com/storage/browser?project=flowing-tooling
Thanks
Open the Cloud Storage browser in the Google Cloud Platform Console.
You will see a list of all your buckets. If you don't see any buckets, create a new one; the Cloud Storage documentation has a tutorial for that.
Then, copy the name of your bucket (it's something like your-bucket-name.appspot.com).
Finally, replace default with your bucket name :D
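If the console UI is hard to follow, the same grant can be made with gsutil. A sketch, assuming a placeholder member and bucket; use the account Skyvia authenticates as and your real bucket name:

# Grant the legacy bucket reader role (which includes storage.buckets.get) on one bucket
gsutil iam ch user:you@example.com:roles/storage.legacyBucketReader gs://your-bucket-name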