Use service account default credentials in the Google BigQuery command line tool - google-bigquery

I want to use a service account with default credentials in the bq command.
How can I make the following command read the key file from the environment variable GOOGLE_APPLICATION_CREDENTIALS:
gcloud auth activate-service-account
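A minimal sketch, assuming GOOGLE_APPLICATION_CREDENTIALS already points at a JSON key file, is to pass the variable straight to the activate command (the command and flag are standard gcloud; the key path is whatever your variable holds):
gcloud auth activate-service-account --key-file="$GOOGLE_APPLICATION_CREDENTIALS"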

Related

Execute BigQuery using the Python SDK from Jenkins

I have a Python program that executes BigQuery queries successfully using a Cloud service account.
When I try to schedule the Python program using Jenkins, I see the error below.
The gcloud service account has the BigQuery Editor, Data Owner, and Admin roles on the table and dataset.
Log:
gcloud auth activate-service-account abc --key-file=****
Activated service account credentials for: [abc273721.iam.gserviceaccount.com]
gcloud config set project p1
Updated property p1.
403 Access Denied: Table XYZ: User does not have permission to query table
I see that you have provided all the required roles (bigquery.dataOwner and bigquery.admin), as mentioned here, but it looks like you also have to grant the service account access to the dataset.
Create a service account with the BigQuery Admin role and download its JSON key file (example: data-lab.json).
Then use the command below:
gcloud auth activate-service-account "service-account" --key-file=data-lab.json --project="project-name"
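As a hedged illustration of the dataset-access step mentioned above (the dataset name and service account email are placeholders), one way is to export the dataset's metadata, append an access entry, and push it back:
bq show --format=prettyjson mydataset > dataset.json
# edit dataset.json and append an entry to the "access" array, for example:
#   {"role": "WRITER", "userByEmail": "abc@my-project.iam.gserviceaccount.com"}
bq update --source dataset.json mydataset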

How can curl authenticate to Google Cloud based on the local gcloud CLI's authentication?

My script uses a sequence of gcloud commands and of course gcloud is authenticated. I need to use curl to access GCP REST APIs that are not available to gcloud. I can do this by generating a JSON credentials file in the Cloud Console, but I'd rather not do that as a separate manual step.
[Edit: The answer is to replace the gcloud auth string in that curl command with gcloud auth print-access-token. See answer below.]
This is how I do it now, using a JSON file downloaded from the Console.
export GOOGLE_APPLICATION_CREDENTIALS=credentials.json
curl -X POST \
  -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
  -H "Content-Type: application/json; charset=utf-8" \
  "https://cloudresourcemanager.googleapis.com/v1/projects/<MY_PROJECT>:getAncestry"
Without that downloaded JSON, I get this:
ERROR: (gcloud.auth.application-default.print-access-token)
The Application Default Credentials are not available. They are available if running in Google Compute Engine.
Otherwise, the environment variable GOOGLE_APPLICATION_CREDENTIALS must be defined pointing
to a file defining the credentials. See https://developers.google.com/accounts/docs/application-default-credentials
for more information.
{ "error": {
"code": 401,
"message": "Request had invalid authentication credentials.
Expected OAuth 2 access token, login cookie or other valid authentication credential.
See https://developers.google.com/identity/sign-in/web/devconsole-project.",
"status": "UNAUTHENTICATED"
}
}
How can I leverage my gcloud authentication to use curl? I don't mind using the JSON if I can automate its generation as part of this script. For example, is there a way to call gcloud to generate this JSON, which I would then set as GOOGLE_APPLICATION_CREDENTIALS?
Once you have the Cloud SDK authenticated, you can use the gcloud auth print-access-token command to get an access token for the current active account, without needing to specify the path to the JSON credentials file.
You will still need the credentials.json file if you wish to activate a service account for the Cloud SDK instance. However, I am pretty sure you will only have to do that once.
EDIT:
I noticed you are using gcloud auth application-default print-access-token.
Keep in mind that, unlike the gcloud auth print-access-token command, it uses the current Application Default Credentials (ADC), which have to be set either by pointing the GOOGLE_APPLICATION_CREDENTIALS environment variable at the credentials.json file or by running the gcloud auth application-default login command.
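Putting that together, a minimal sketch of the same request that relies only on the Cloud SDK's own credentials (no ADC file needed; <MY_PROJECT> is still a placeholder) would be:
curl -X POST \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json; charset=utf-8" \
  "https://cloudresourcemanager.googleapis.com/v1/projects/<MY_PROJECT>:getAncestry"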

Using Airflow with BigQuery and the Cloud SDK gives the error "User must be authenticated when user project is provided"

I am trying to run Airflow locally. My DAG has a BigQueryOperator and I want to use the Cloud SDK for authentication. I ran "gcloud auth application-default login" in order to get the JSON file with the credentials. I try to test my DAG by running the command:
airflow test testdag make_tmp_table 2019-02-13
I get the error message "User must be authenticated when user project is provided".
If, instead of using the Cloud SDK, I use a service account that has admin rights to BigQuery, it works, but I need to authenticate through the Cloud SDK.
Does anyone know what this error message means, or how I can run Airflow using the Cloud SDK for authentication?
I have used the following source to try to understand how I can run Airflow with BigQueryOperators locally:
https://medium.com/@jbencina/local-testing-with-google-cloud-composer-apache-airflow-75d4213d2893
Either you are not working on the right project or you don't have permission to do this job.
What I suggest is:
Check your current configuration by running:
gcloud auth list
Make sure that you have the right project and the right account set; if not, run the following to set them (the full sequence is sketched after this answer):
gcloud auth application-default login
You will be prompted with a link; follow it and sign in with your account. After that you will see a verification code; copy it and paste it into your gcloud terminal.
The next thing to do is make sure your account has permission to do the job you are trying to do. You probably need the roles/composer.admin role; if that doesn't work, add the primitive role roles/editor from your IAM console. Use that primitive role only for testing purposes; it is not advisable for a production-level project.
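A minimal sketch of that check-and-fix sequence, where my-project and me@example.com are placeholders for your own project and account:
gcloud auth list
gcloud config list
gcloud config set project my-project
gcloud config set account me@example.com
gcloud auth application-default login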
I solved it by deleting the credentials file produced when I ran:
gcloud auth application-default login
and then recreating the file. Then it worked. So I had the right method; something was just broken in the credentials file.
As @dlbech said:
This solution was not enough for me. I solved it by deleting the "quota_project_id": "myproject" line in the application_default_credentials.json file. I don't know why Airflow doesn't like the quota project ID key, but I tested it multiple times, and this was the problem.
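A hedged one-liner version of that fix, assuming the default ADC location on Linux and that jq is installed (both are assumptions, not part of the original answer):
ADC=~/.config/gcloud/application_default_credentials.json
jq 'del(.quota_project_id)' "$ADC" > "$ADC.tmp" && mv "$ADC.tmp" "$ADC"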

How to switch to a user account to access BigQuery

Hi all, I need your help here.
On a GCP VM instance, I am writing Node.js code to access a dataset. I want to use an email (user) account, and I tried setting the account this way:
// On Linux, run a command to set the active gcloud account
var exec = require('child_process').exec;
var cmdStr = 'gcloud config set account email@mmm.com';
exec(cmdStr, function (err, stdout, stderr) {
    if (err) {
        console.log(err);
    } else {
        console.log(stdout);
    }
});
When I run it, BigQuery is still accessed with the service account.
Can anyone help me switch from the service account to an email account?
Running ‘gcloud config set account email@mmm.com’ "sets the specified property in your active configuration" and does NOT authenticate your Cloud SDK with email@mmm.com.
You can authenticate the Cloud SDK by executing ‘gcloud init’ from the command line. For more information please refer to the documentation. Note that this does not authenticate the BigQuery API, as explained below; instead it authenticates the Cloud SDK command line tools, such as the bq and gcloud commands.
To authenticate your Google Cloud API calls, you need to use a service account key. Please refer to steps 3-4 in the documentation for detailed steps on achieving this in Node.js.
You cannot authenticate API calls with an email account. The workaround is to use service account impersonation; please refer to the documentation for further explanation.
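For illustration, a minimal sketch of impersonation from the command line (the service account name is a placeholder, and your user account is assumed to hold roles/iam.serviceAccountTokenCreator on it):
gcloud auth print-access-token \
  --impersonate-service-account=sa-name@my-project.iam.gserviceaccount.com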

How to get gcloud auth activate-service-account to persist

I am using the bq command line tool to query a BigQuery table. Is there a way to get the service account authentication to persist when I log in and out of the box that the query process is running on?
Steps I took:
I logged into the Linux box.
Authenticated the service account by running:
gcloud auth activate-service-account --key-file /somekeyFile.p12 someServiceAccount.gserviceaccount.com
Queried the BigQuery table, which works fine:
bq --project_id=formal-cascade-571 query "select * from dw_test.clokTest"
But then I logged out of the box and logged back in. When I query the BigQuery table again:
bq --project_id=formal-cascade-571 query "select * from dw_test.clokTest"
It gives me the error:
Your current active account [someServiceAccount.gserviceaccount.com] does not have any valid credentials.
Even when I pass in the private key file:
bq --service_account=someServiceAccount.gserviceaccount.com --service_account_credential_file=~/clok_cred.txt --service_account_private_key_file=/somekeyFile.p12 --project_id=formal-cascade-571 query "select * from dw_test.clokTest"
It gives the same error:
Your current active account [someServiceAccount.gserviceaccount.com] does not have any valid credentials.
So every time I need to re-authenticate my service account by:
gcloud auth activate-service-account
Is there a way to have the authenticated service account credential persist?
Thank you for your help.
I asked the GCloud devs and they mentioned a known bug where service accounts don't show up unless the environment variable CLOUDSDK_PYTHON_SITEPACKAGES is set.
Hopefully this will be fixed soon, but in the meantime, when you log in again, can you try running
export CLOUDSDK_PYTHON_SITEPACKAGES=1
and seeing if it then works?
You can run
gcloud auth list
to see what accounts there are credentials for; it should list your service account.
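A hedged workaround sketch for surviving logouts, reusing the key file and account from the question plus the environment variable above (putting this in ~/.bashrc is my assumption, not part of the original answer):
# in ~/.bashrc
export CLOUDSDK_PYTHON_SITEPACKAGES=1
gcloud auth activate-service-account someServiceAccount.gserviceaccount.com \
  --key-file=/somekeyFile.p12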
I fixed it by re-running gcloud auth login. Google then asked me to open a web page, which triggered the Cloud SDK authorization, which I believe is linked to the solution shared by J. Tigani.