How to authenticate with a Google Service Account in a Jenkins pipeline

I want to use gcloud in a Jenkins pipeline, so I first have to authenticate with the Google service account. I'm using the https://wiki.jenkins.io/display/JENKINS/Google+OAuth+Plugin, which holds my Private Key Credentials. I'm stuck on loading the credentials into the pipeline:
withCredentials([[$class: 'MultiBinding', credentialsId: 'my-creds', variable: 'GCSKEY']]) {
    sh "gcloud auth activate-service-account --key-file=${GCSKEY}"
}
I also tried it with a file binding, but without luck.
withCredentials([file(credentialsId:'my-creds', variable: 'GCSKEY')]) {
The log says:
org.jenkinsci.plugins.credentialsbinding.impl.CredentialNotFoundException: Credentials 'my-creds' is of type 'Google Service Account from private key' ....

You need to upload your Service Account JSON file as a secret file.
Then:
withCredentials([file(credentialsId: 'key-sa', variable: 'GC_KEY')]) {
    sh("gcloud auth activate-service-account --key-file=${GC_KEY}")
    sh("gcloud container clusters get-credentials prod --zone northamerica-northeast1-a --project ${project}")
}

I couldn't get the 'Google Service Account from private key' credential type working, but using the 'Secret file' credential type in Jenkins and uploading my Google service account JSON works.
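
If you want to confirm inside the pipeline that the activation took effect, here is a minimal shell sketch (reusing the GC_KEY binding from the answer above):

# Activate the service account from the bound secret file, then print
# the account gcloud now treats as active.
gcloud auth activate-service-account --key-file="$GC_KEY"
gcloud auth list --filter=status:ACTIVE --format="value(account)"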

Execute BigQuery using the Python SDK from Jenkins

I have a Python program that executes BigQuery using a cloud service account successfully.
When I try to schedule the Python program using Jenkins, I see the error below.
The gcloud user has the BigQuery editor, dataOwner, and admin permissions on the table and dataset.
Log:
gcloud auth activate-service-account abc --key-file=****
Activated service account credentials for: [abc273721.iam.gserviceaccount.com]
gcloud config set project p1
Updated property p1.
403 Access Denied: Table XYZ: User does not have permission to query table
I see that you have provided all the required roles (bigquery.dataOwner and bigquery.admin), as mentioned here, but it looks like you also have to grant the service account access to the dataset.
Create a service account with the BigQuery Admin role and download the JSON key file (example: data-lab.json).
Then use the following command:
gcloud auth activate-service-account "service-account" --key-file=data-lab.json --project="project-name"
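
If the 403 persists, here is a hedged sketch of granting the missing access at the project level (the project, service account, and role below are placeholders, not taken from the question):

# Grant the service account a BigQuery role on the project, then verify
# access with a dry-run query.
gcloud projects add-iam-policy-binding my-project \
    --member="serviceAccount:abc@my-project.iam.gserviceaccount.com" \
    --role="roles/bigquery.dataEditor"
bq query --use_legacy_sql=false --dry_run 'SELECT 1'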

How can curl authenticate to Google Cloud based on local gcloud's CLI authentication?

My script uses a sequence of gcloud commands and of course gcloud is authenticated. I need to use curl to access GCP REST APIs that are not available to gcloud. I can do this by generating a JSON credentials file in the Cloud Console, but I'd rather not do that as a separate manual step.
[Edit: The answer is to replace the gcloud auth string in that curl command with gcloud auth print-access-token. See answer below.]
This is how I do it now, using a JSON file downloaded from the Console:
export GOOGLE_APPLICATION_CREDENTIALS=credentials.json
curl -X POST -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
    -H "Content-Type: application/json; charset=utf-8" \
    https://cloudresourcemanager.googleapis.com/v1/projects/<MY_PROJECT>:getAncestry
Without that downloaded JSON, I get this:
ERROR: (gcloud.auth.application-default.print-access-token)
The Application Default Credentials are not available. They are available if running in Google Compute Engine.
Otherwise, the environment variable GOOGLE_APPLICATION_CREDENTIALS must be defined pointing
to a file defining the credentials. See https://developers.google.com/accounts/docs/application-default-credentials
for more information.
{ "error": {
"code": 401,
"message": "Request had invalid authentication credentials.
Expected OAuth 2 access token, login cookie or other valid authentication credential.
See https://developers.google.com/identity/sign-in/web/devconsole-project.",
"status": "UNAUTHENTICATED"
}
}
How can I leverage my gcloud authentication to use curl? I don't mind using the JSON if I can automate its generation as part of this script. For example, is there a way to call gcloud to generate this JSON, which I would then set as GOOGLE_APPLICATION_CREDENTIALS?
Once you have the Cloud SDK authenticated, you can use the gcloud auth print-access-token command to get an access token for the currently active account, without needing to specify the path to the JSON credentials file.
You will still need the credentials.json file if you wish to activate a service account for the Cloud SDK instance. However, I am pretty sure you will only have to do that once.
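
A quick sketch of that one-time activation, using the credentials.json name from the question:

# Activate the service account once; later token requests need no key file.
gcloud auth activate-service-account --key-file=credentials.json
gcloud auth print-access-token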
EDIT:
I noticed you are using the gcloud auth application-default print-access-token command.
Keep in mind that, contrary to the gcloud auth print-access-token command, it uses the current Application Default Credentials (ADC), which have to be set by specifying the credentials.json file path in the GOOGLE_APPLICATION_CREDENTIALS env variable, or by using the gcloud auth application-default login command.
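
Putting the edit note from the question into practice, here is a sketch of the curl call that relies on the CLI's own credentials rather than ADC (MY_PROJECT is a placeholder):

# No JSON key or GOOGLE_APPLICATION_CREDENTIALS needed here.
curl -X POST \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    -H "Content-Type: application/json; charset=utf-8" \
    https://cloudresourcemanager.googleapis.com/v1/projects/MY_PROJECT:getAncestry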

How to switch to a user account to access BigQuery

Hi all, I need your help here.
On a GCP VM instance, I'm writing Node.js code that tries to access a dataset. I want to use an email account, and I tried this way to set the account:
// In Linux, run a command to set the account
var exec = require('child_process').exec;
var cmdStr = 'gcloud config set account email#mmm.com';
exec(cmdStr, function (err, stdout, stderr) {
    if (err) {
        console.log(err);
    } else {
        console.log(stdout);
    }
});
But when I run it, it still uses a service account to access BigQuery.
Can anyone help me switch from the service account to an email account?
Running 'gcloud config set account email#mmm.com' "sets the specified property in your active configuration" and does NOT authenticate your Cloud SDK with email#mmm.com.
You can authenticate the Cloud SDK by executing 'gcloud auth login' from the command line. For more information, please refer to the documentation. Note that this does not authenticate the BigQuery API, as explained below; instead it authenticates the Cloud SDK command-line tools, such as the bq and gcloud commands.
To authenticate your Google Cloud API calls, you need to use a service account key. Please refer to steps 3-4 in the documentation for the detailed steps to achieve this in Node.js.
You cannot authenticate API calls with an email account. The workaround is to use service account impersonation; please refer to the documentation for further explanation.
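
A hedged sketch of that impersonation workaround from the gcloud side (the service account name is a placeholder; the user account needs the Service Account Token Creator role on it):

# Authenticate the user account, then have gcloud impersonate a service
# account that holds the BigQuery permissions.
gcloud auth login
gcloud config set auth/impersonate_service_account \
    bq-access@my-project.iam.gserviceaccount.com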

Assume Role for IAM user to do s3 upload from Jenkins CI

I am trying to use s3Upload from Jenkins CI. I have added the IAM user S3_User's credentials in the Jenkins console and I'm using withAWS(region: s3Region, credentials: s3User). But my IAM user S3_User doesn't have the S3 RW policy; the IAM user has to assume the role S3_UserRoleWithRWpolicy. How do I do that?
I provided the S3_User access key and secret key in the Jenkins IAM credentials and added S3_UserRoleWithRWpolicy under 'IAM Role to use' in the IAM Role Support section, but I'm still not able to upload to S3 from Jenkins. How can I configure the Jenkinsfile to use the role?
Finally figured out the solution:
I was using this in the Jenkins CI file:
withAWS(region: 's3Region', credentials: 'iamUser') {
    s3Upload(file: 'jar', bucket: s3Bucket, path: s3Path)
}
It worked fine when iamUser had direct access to S3, but it failed when iamUser had to assume a role to access S3 (after adding the IAM Role to Assume in the credentials).
But the below worked:
withCredentials([[$class: 'AmazonWebServicesCredentialsBinding', credentialsId: 'iamUser']]) {
    withAWS(region: 's3Region') {
        s3Upload(file: 'jar', bucket: s3Bucket, path: s3Path)
    }
}
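
For comparison, here is a plain-shell sketch of the same role assumption with the AWS CLI (the account ID and session name are placeholders, not from the Jenkins setup above):

# Assume the role explicitly and print the temporary credentials it returns.
aws sts assume-role \
    --role-arn arn:aws:iam::123456789012:role/S3_UserRoleWithRWpolicy \
    --role-session-name jenkins-s3-upload \
    --query 'Credentials.[AccessKeyId,SecretAccessKey,SessionToken]' \
    --output text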

Use service account default credentials in the Google BigQuery command-line tool

I want to use a service account with default credentials in the bq command.
How can I make the following command read the key file from the environment variable GOOGLE_APPLICATION_CREDENTIALS?
gcloud auth activate-service-account
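
A minimal sketch of one way to do this, assuming GOOGLE_APPLICATION_CREDENTIALS holds the path to the service account key file (the path below is a placeholder):

# --key-file accepts a plain path, so the env variable can be passed straight in.
export GOOGLE_APPLICATION_CREDENTIALS=/path/to/key.json
gcloud auth activate-service-account --key-file="$GOOGLE_APPLICATION_CREDENTIALS"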