I'm running Python code on my computer that makes calls to Google Cloud Platform. I'm trying to determine whether my application is using my own credentials or a service account key to authorize its GCP calls.
On AWS, I could use aws sts get-caller-identity to know who the caller is (IAM user or IAM role).
Is there a GCP equivalent, something like gcloud whoami, that I could run from the command line or from my Python code itself to know the identity used by my application?
Run gcloud auth list from your CLI to see which credentialed account is active.
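To check from within your Python code itself, here is a minimal sketch using the google-auth library, assuming your application relies on Application Default Credentials:

    import google.auth

    # Resolve Application Default Credentials the same way the client libraries do.
    credentials, project = google.auth.default()

    # Service account credentials expose the account's email; end-user
    # credentials (from gcloud auth application-default login) do not.
    email = getattr(credentials, "service_account_email", None)
    if email:
        print(f"Service account: {email} (project: {project})")
    else:
        print(f"End-user credentials (project: {project})")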
I created a Managed Notebooks instance using the Compute Engine service account. I have some Python code that reads from a BigQuery table and does some processing. I ran gcloud auth application-default login, logged in to my Google account, and was then able to access that BQ table (which otherwise gave an access-denied error).
Now I want to run this notebook using the Executor. However, I get access-denied errors, since the Executor runs the notebook in a tenant project. This page mentions:
Also, the executor cannot use end-user credentials to authenticate access to resources, for example, the gcloud auth login command.
To resolve these issues, in your notebook file's code, authenticate access to resources through a service account.
Then when you create an execution or schedule, specify the service account.
How do I authenticate access to resources through a service account? I tried setting the Compute Engine service account as the service account to be used in the Executor settings, but it still gives me an access-denied error for that BQ table. What can I do within my code that is similar to running gcloud auth application-default login?
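Based on those docs, I assume the intended pattern inside the notebook code is something like the sketch below (the key-file path and table name are placeholders), but I'm not sure this is what's meant:

    from google.cloud import bigquery
    from google.oauth2 import service_account

    # Placeholder path to a service account key file available to the notebook.
    credentials = service_account.Credentials.from_service_account_file(
        "sa-key.json",
        scopes=["https://www.googleapis.com/auth/cloud-platform"],
    )

    # Pass explicit credentials instead of relying on the environment's defaults.
    client = bigquery.Client(credentials=credentials, project=credentials.project_id)
    df = client.query("SELECT * FROM `my-project.dataset.table`").to_dataframe()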
I want Google BigQuery authentication to work like other Google services (for example, Google Sheets).
Google Sheets auth works with scopes and shows the user a popup like "The app XXX requests access to your Google Account", where you can see which permissions the app needs.
I would like the same kind of auth with Google BigQuery, but after reading the docs and looking at the code of the official PHP client, I can't figure out how to implement it. Is this possible?
P.S. I have of course tried the flow in the Google docs with a JSON key generated from the Google Developer Console, and it works fine.
What you want to do is not possible, at least not in the way you would like.
When using, let's say, "native" GCP products, OAuth authentication is performed automatically after logging in. This is why you are not prompted to identify yourself when accessing your GCS buckets or opening the App Engine Dashboard.
When you want to grant an external user access to your project, you run the command gcloud auth login, which shows an authorization screen.
This screen is also shown by "non-native" GCP services, such as BigQuery Geo Viz, Dialogflow, etc. You are prompted to grant access since these are "external" GCP features that interact with your project's internal data.
BigQuery is an integrated GCP service and does not require OAuth authentication when used via the UI.
If you would like to interact with the BigQuery APIs, I highly recommend using the BigQuery client libraries, which make authentication much easier.
However, there is a way to grant access to external users. I found the Authorizing API requests doc, which says that you can get a temporary access token for external users. This is done by following these steps:
Run the command gcloud auth application-default print-access-token in a Cloud Shell session.
Copy the output and paste it into an HTTP request like:
https://www.googleapis.com/bigquery/v2/projects/$GOOGLE_CLOUD_PROJECT/datasets?access_token=ACCESS_TOKEN
Note that this could require even more effort than using the client libraries.
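For illustration, the same call could be made from Python, passing the token in an Authorization header rather than in the URL (a sketch; the token and project ID are placeholders):

    import requests

    # Placeholder: paste the output of
    #   gcloud auth application-default print-access-token
    ACCESS_TOKEN = "ya29...."
    PROJECT = "my-project"  # placeholder project ID

    # List the project's datasets, sending the token as a Bearer token.
    resp = requests.get(
        f"https://www.googleapis.com/bigquery/v2/projects/{PROJECT}/datasets",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    )
    resp.raise_for_status()
    print(resp.json())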
Hope this is helpful.
Authenticating with a service account using gcloud
We are using the command below to activate a service account with a .json key file:
gcloud auth activate-service-account <service_account> --key-file <file_name>
After doing this, we are able to deploy templates.
But we are not supposed to keep the JSON key file on the server for authentication purposes.
Is there any other way of authenticating for deploying templates?
Is there any way to deploy templates using a client ID and client secret, without a JSON key file?
To authorize Cloud SDK tools without storing a private key, you can use OAuth tokens instead; see OAuth:
gcloud init on your local terminal, see Run gcloud init documentation
gcloud init on Compute Engine VM Instance, see Set up gcloud compute documentation
To avoid prompts, provide parameters for gcloud init on the command line (works only when prompts are enabled: $ gcloud config set disable_prompts false)
$ gcloud init --account=[account-name] --configuration=[config-name] --project=[prj-name] --console-only
For more details, see the Managing SDK Configurations and Managing SDK Properties documentation.
There is also Google Cloud Shell, which comes with 5 GB of persistent disk storage and requires no additional authorization to use the Cloud SDK; see Starting Cloud Shell.
To provide authorization, you can also use the Cloud Identity and Access Management API. You may also find the answer to a similar question on Stack Overflow helpful.
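Relatedly, if the deployment runs on a Compute Engine VM with a service account attached, code on the VM can pick up credentials from the metadata server with no key file at all. A minimal Python sketch using Application Default Credentials (assumes the google-auth package):

    import google.auth

    # On a Compute Engine VM with an attached service account, Application
    # Default Credentials come from the metadata server; no key file is stored.
    credentials, project = google.auth.default()
    print(f"Authenticated to project: {project}")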
I'm attempting to use Apache Airflow and pygsheets to upload to various Team Drives. When using OAuth authentication, because it's an Airflow task, there's no interactive terminal for entering the authorization code returned by Google.
I know that using a service account would typically work, but unfortunately we're unable to give Google Drive access to users outside of our organization (so no xxxxxxx@gserviceaccount.com).
Is there any way to use OAuth in a non-interactive manner?
You can authorize the account locally and then copy the generated credentials file to the server. Then use that file for authorization; it won't ask for a code again.
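A sketch of that flow with pygsheets, assuming its authorize() helper and the client_secret/credentials_directory parameters (the file names are placeholders):

    import pygsheets

    # Step 1 (run once, locally): opens a browser consent screen and caches
    # the resulting token file inside credentials_directory.
    gc = pygsheets.authorize(
        client_secret="client_secret.json",  # OAuth client from the console
        credentials_directory="creds",       # token file is written here
    )

    # Step 2: copy the token file from "creds" to the same path on the
    # Airflow worker; the identical authorize() call there reuses the cached
    # token and never prompts for an authorization code.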
In order to log in to AWS, MFA is required. But if I had a program running on an EC2 instance that invoked AWS services via API calls, would such a program also need to authenticate using MFA, or would this not be required because we are already "in"?
MFA is only required when logging in to the AWS web console with a username and password. When you make API calls, you use an IAM access key or, even better (since your program is running on EC2), an IAM instance profile; neither requires MFA.
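For example, code running on such an instance can call the API with no stored keys and no MFA prompt; a minimal boto3 sketch:

    import boto3

    # On EC2, boto3 resolves temporary credentials from the instance profile
    # via the instance metadata service; no access keys or MFA are involved.
    sts = boto3.client("sts")
    print(sts.get_caller_identity()["Arn"])  # ARN of the instance role session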
API calls can be made to require MFA as well, using an IAM policy. However, if you were to deploy such a policy, you could also exclude VPC-internal subnets from the MFA requirement, so that MFA would only be required when accessing the AWS API endpoints from the outside.
Here's a link to my repository which contains an example enforcement policy (see example-mfa-policies/EnforceMFA.txt): https://github.com/vwal/awscli-mfa