Query BigQuery from the local command line

I'm trying to query BigQuery using PowerShell. I've run gcloud init and logged in to my account.
The request was this:
bq query --use_legacy_sql=false 'SELECT customer_id FROM `demo1.customers1`'
Resulting with this error:
BigQuery error in query operation: Error processing job
'PROJECT-ID:bqjob': Access Denied:
BigQuery BigQuery: Permission denied while getting Drive credentials.
This works when I run it in Cloud Shell.
I've previously created a service account and a key for the project. I tried running this command, but it doesn't solve the problem:
gcloud auth activate-service-account SERVICE_ACCOUNT@DOMAIN.COM --key-file=D:/folder/key.json --project=MYPROJECT_ID

The service account needs the Drive OAuth scope to access Drive; the command below can be used to authenticate with Drive access:
gcloud auth login --enable-gdrive-access
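If you authenticate programmatically rather than through gcloud, the same fix applies: request the Drive scope alongside the BigQuery scope when building the credentials. Below is a minimal sketch using the Python client library, assuming a hypothetical key file key.json and project ID my-project:

from google.cloud import bigquery
from google.oauth2 import service_account

# Both scopes are needed: BigQuery for the query itself, Drive for the
# Sheets-backed table the query reads.
SCOPES = [
    "https://www.googleapis.com/auth/bigquery",
    "https://www.googleapis.com/auth/drive",
]
credentials = service_account.Credentials.from_service_account_file(
    "key.json", scopes=SCOPES)  # hypothetical key file path
client = bigquery.Client(credentials=credentials, project="my-project")
rows = client.query("SELECT customer_id FROM `demo1.customers1`").result()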

Related

Airflow Permission denied while getting Drive credentials

I am trying to run a BigQuery query on Airflow with MWAA.
The query uses a table that is based on a Google Sheet. When I run it, I get the following error:
google.api_core.exceptions.Forbidden: 403 Access Denied: BigQuery BigQuery: Permission denied while getting Drive credentials.
I already have a working Google Cloud connection on Airflow with an admin service account.
Also:
This service account has access to the Google Sheet
I added https://www.googleapis.com/auth/drive to the scopes of the Airflow connection
I regenerated the JSON file
Am I doing something wrong? Any idea what I can do to fix this problem?
Thanks a lot
I fixed my issue by creating a NEW Airflow connection: a new Google Cloud connection with exactly the same values as the default google_cloud_default one. Now it works perfectly.
Hope it can help!
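For reference, the connection can also be recreated programmatically, which makes the scopes explicit. This is a hedged sketch using Airflow's Connection model; the extra keys key_path, scope, and project are assumptions based on the Google provider's un-prefixed extras format, and the conn_id and paths are placeholders:

import json
from airflow.models import Connection
from airflow.settings import Session

# Build a fresh Google Cloud connection carrying the Drive scope.
conn = Connection(
    conn_id="google_cloud_new",  # placeholder conn_id
    conn_type="google_cloud_platform",
    extra=json.dumps({
        "key_path": "/path/to/key.json",  # placeholder key file
        "scope": "https://www.googleapis.com/auth/cloud-platform,"
                 "https://www.googleapis.com/auth/drive",
        "project": "my-project",  # placeholder project ID
    }),
)
session = Session()
session.add(conn)
session.commit()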

How to create a service account for a BigQuery dataset from the CLI

I've found instructions on how to generate credentials at the project level, but there aren't clear instructions on adding a service account to only a specific dataset using the CLI.
I tried creating the service account:
gcloud iam service-accounts create NAME
and then getting the dataset:
bq show \
--format=prettyjson \
project_id:dataset > path_to_file
and then adding a role to the access section
{
  "role": "OWNER",
  "userByEmail": "NAME@PROJECT.iam.gserviceaccount.com"
},
and then updating it. It seemed to work because I was able to create a table, but when I tried loading data into the table I got an access denied error: User does not have bigquery.jobs.create permission in project.
When I inspected the project in the Cloud Console, it seemed as if my service account had been added to the project rather than the dataset, which is not what I want, but it also does not explain why I don't have the correct permissions. In addition to owner permissions I tried assigning editor and admin permissions, neither of which solved the issue.
It is not possible for a service account to have permissions at only the dataset level and then run a query. When a query is invoked, it creates a job, and to create a job the service account must have the bigquery.jobs.create permission granted at the project level. See the documentation for the permissions required to run a job.
With this in mind, you must add bigquery.jobs.create at the project level so the service account can run queries on the shared dataset.
NOTE: You can use any of the following predefined roles, as they all include bigquery.jobs.create:
roles/bigquery.user
roles/bigquery.jobUser
roles/bigquery.admin
In my example I used roles/bigquery.jobUser. See the steps below:
Create a new service account (bq-test-sa@my-project.iam.gserviceaccount.com)
Get the permissions on my dataset using bq show --format=prettyjson my-project:mydataset > info.json
Add an OWNER entry for the service account in info.json:
{
  "role": "OWNER",
  "userByEmail": "bq-test-sa@my-project.iam.gserviceaccount.com"
},
Update the permissions using bq update --source info.json my-project:mydataset
Check BigQuery > mydataset > "SHARE DATASET" to see if the service account was added.
Add roles/bigquery.jobUser to the service account using gcloud projects add-iam-policy-binding my-project --member=serviceAccount:bq-test-sa@my-project.iam.gserviceaccount.com --role=roles/bigquery.jobUser
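The same dataset-level grant can also be made with the BigQuery Python client instead of hand-editing info.json. A minimal sketch, reusing the placeholder project, dataset, and service account names from the steps above:

from google.cloud import bigquery

client = bigquery.Client(project="my-project")
dataset = client.get_dataset("my-project.mydataset")

# Append an OWNER entry for the service account, mirroring the
# userByEmail block added to info.json above.
entries = list(dataset.access_entries)
entries.append(bigquery.AccessEntry(
    role="OWNER",
    entity_type="userByEmail",
    entity_id="bq-test-sa@my-project.iam.gserviceaccount.com",
))
dataset.access_entries = entries
client.update_dataset(dataset, ["access_entries"])

Note this only edits the dataset's ACL; the project-level roles/bigquery.jobUser binding from the last step is still required to run queries.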

Access BigQuery data from a Jupyter Notebook in AI Platform on Google Cloud

I am trying to access the data stored in BigQuery from a Jupyter Notebook in AI Platform on Google Cloud Platform.
First, I tried the following code:
from google.cloud import bigquery
from google.oauth2 import service_account
credentials = service_account.Credentials.from_service_account_file(r'\local_path\gcpcred.json')
project_id = 'my-bq'
client = bigquery.Client(credentials=credentials, project=project_id)
The authentication credentials are stored in a JSON file named gcpcred on the local machine, but this gives me an error saying:
FileNotFoundError: [Errno 2] No such file or directory:
'\local_path\gcpcred.json'
I thought that since I am running this in AI Platform (on the cloud itself), I would not have to use this API and authenticate.
So I simply wrote:
%%bigquery
SELECT * FROM `project.dataset.table` LIMIT 1000
I got an error saying
ERROR:
403 Access Denied: User does not have access to the table
How do I access the table? Please help
It seems the service account associated with the Jupyter notebook doesn't have enough privileges to access BigQuery. You can grant it the required roles in the console's IAM section.
The links below provide further clarification:
Visualizing BigQuery data in a Jupyter notebook
Getting started with authentication
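Once the notebook's service account has a suitable BigQuery role, no key file is needed inside AI Platform: the client falls back to the VM's default credentials. A minimal sketch, using the asker's placeholder project and table names:

from google.cloud import bigquery

# Inside the notebook, Application Default Credentials resolve to the
# VM's service account, so no key file argument is required.
client = bigquery.Client(project="my-bq")
query = "SELECT * FROM `project.dataset.table` LIMIT 1000"
df = client.query(query).to_dataframe()  # requires pandas
print(df.head())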

Unable to create BigQuery Data Transfer job using a service account

I am not able to create a data transfer job from a Google Play Store bucket to a Google Storage bucket using a service account that has permissions for both. I am able to create a transfer job using my project account, which only has access to the storage bucket, so I cannot use this in production.
By running the following:
bq mk --transfer_config --target_dataset=<my dataset> --display_name=<My Transfer Job> --params='{"bucket":<playstore bucket>,"table_suffix":<my suffix>}' --data_source=play --service_account <service account email> --service_account_credential_file $GOOGLE_APPLICATION_CREDENTIALS
I am getting error:
Unexpected exception in GetCredentialsFromFlags operation: Credentials
appear corrupt. Please delete the credential file and try your command
again. You can delete your credential file using "bq init
--delete_credentials".
I did a bq init and reran the bq command, but I get the same error.
I also activated the service account using the command below; still the same error.
gcloud auth activate-service-account <service account email> --key-file $GOOGLE_APPLICATION_CREDENTIALS
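If the bq credential flags keep failing, one possible workaround is to create the transfer config through the Python client, where the service account key is passed as ordinary credentials. A hedged sketch, assuming the google-cloud-bigquery-datatransfer package; the dataset, bucket, suffix, and project names are placeholders mirroring the bq command above:

from google.cloud import bigquery_datatransfer
from google.oauth2 import service_account

credentials = service_account.Credentials.from_service_account_file(
    "/path/to/key.json")  # placeholder: the service account key file
client = bigquery_datatransfer.DataTransferServiceClient(
    credentials=credentials)

# Mirrors: bq mk --transfer_config --data_source=play ...
transfer_config = bigquery_datatransfer.TransferConfig(
    destination_dataset_id="my_dataset",
    display_name="My Transfer Job",
    data_source_id="play",
    params={"bucket": "my-playstore-bucket", "table_suffix": "my_suffix"},
)
client.create_transfer_config(
    parent=client.common_project_path("my-project"),
    transfer_config=transfer_config,
)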

gcloud compute ssh fails

I'm using Windows Server 2008.
On issuing gcloud compute ssh instance-1 --zone us-central1-a,
I receive the error:
ERROR: (gcloud.compute.ssh) Could not fetch instance:
- Invalid value 'EFOnline'. Values must match the following regular expression: '(?:(?:[-a-z0-9]{1,63}\.)*(?:[a-z](?:[-a-z0-9]{0,61}[a-z0-9])?):)?(?:[0-9]{1,19}|(?:[a-z](?:[-a-z0-9]{0,61}[a-z0-9])?))'
To be clear, I did do a gcloud auth login and received a successful 'you are now authenticated' message.
My project name is EFOnline.
My instance is instance-1 in us-central1-a (cut and pasted from there).
So why the weird regex error??
Thanks
Google Cloud projects have both a name and an ID.
Your project ID is the string that uniquely identifies your project to Google. Project IDs show up in URI paths to cloud resources, and have to be "good" strings to put in URIs. You can find the project ID for your project on the "Overview" page at http://console.developers.google.com/.
The project name is a human-readable string that can, for example, contain spaces and some special characters.
GCloud (and most of the GCP tooling) uses project ID.
So... please try the following: look up your project ID at http://console.developers.google.com/, then run:
$ gcloud config set project <id>
$ gcloud compute ssh instance-1 --zone us-central1-a
Also, we're working on fixing the error message.
Also, if you have multiple Google Cloud accounts or you are not logged in, you first need to authenticate with Google Cloud:
gcloud auth login
After that, copy the given link, log in with the desired account, and you will be able to SSH into the instance with the gcloud command.
Additionally, you can log into the developer console, open your VM instance, click the SSH widget at the top of the page, and select "View gcloud command". This generates the specific command for your terminal:
gcloud compute --project "project_id" ssh --zone "us-central1-a" "vm_instance_name"