I created a Managed Notebooks instance using the Compute Engine service account. I have some Python code which reads from a BigQuery table and does some processing. I ran 'gcloud auth application-default login', logged into my Google account, and was then able to access that BQ table (which otherwise gave an access denied error).
Now, I want to run this notebook using the Executor. However, I get access denied errors since the Executor runs the notebook in a tenant project. This page mentions:
Also, the executor cannot use end-user credentials to authenticate access to resources, for example, the gcloud auth login command.
To resolve these issues, in your notebook file's code, authenticate access to resources through a service account.
Then when you create an execution or schedule, specify the service account.
How do I authenticate access to resources through a service account? I tried setting the Compute Engine service account as the service account to be used in the Executor settings, but it still gives me an access denied error for that BQ table. What can I do within my code that is similar to running 'gcloud auth application-default login'?
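For context, here is a minimal sketch of the kind of in-code service-account auth I'm after, assuming a downloaded key file (the key path and project ID are placeholders):

    from google.cloud import bigquery
    from google.oauth2 import service_account

    # Hypothetical key file; the service account needs read access to the table.
    creds = service_account.Credentials.from_service_account_file(
        "/path/to/key.json")
    client = bigquery.Client(credentials=creds, project="my-project-id")
    rows = client.query("SELECT 1 AS x").result()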
Related
I'm running Python code on my computer that makes calls to Google Cloud Platform. I'm trying to find out whether my application is using my own credentials or service account keys to get authorized on GCP.
On AWS, I could use aws sts get-caller-identity to know who the caller is (IAM user or IAM role).
Is there a GCP equivalent, something like gcloud whoami, that I could run from the command line or from my Python code itself to know the identity used by my application?
Use the command gcloud auth list in your CLI to view the active credentials account.
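From Python itself, a rough equivalent (not an official "whoami" API) is to inspect the Application Default Credentials with google-auth:

    import google.auth

    creds, project = google.auth.default()
    # Service-account credentials expose service_account_email; end-user
    # credentials (from gcloud auth application-default login) do not.
    print(project)
    print(getattr(creds, "service_account_email", "end-user credentials"))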
I want to use Google BigQuery authentication like other Google services (for example, Google Sheets).
Google Sheets auth works with scopes and shows the user a popup like "The app XXX requests access to your Google Account", where you can see which permissions the app needs.
I'd like the same auth flow with Google BigQuery, but after reading the docs and looking at the code of the official PHP client, I can't figure out how to do this. Is it possible?
P.S. I did try the flow in the Google docs with a JSON key generated in the Google Developer Console, and it works fine.
What you want to do is not possible, at least not in the way you would like.
When using, let's say, "native" GCP products, the OAuth authentication is performed automatically after logging in. This is why you aren't prompted to identify yourself when accessing your GCS buckets or opening the App Engine Dashboard.
When you want to grant an external user access to your project, you run the command gcloud auth login, and an authorization screen is shown.
This screen is also shown for "non-native" GCP services, such as BigQuery Geo Viz, Dialogflow, etc. You are prompted to grant access since these are "external" GCP features which interact with your project's internal info.
BigQuery is an integrated GCP service and does not require OAuth authentication when used via the UI.
If you would like to interact with the BigQuery APIs, I highly recommend you use the BigQuery client libraries, which make authentication much easier.
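For example, a minimal sketch with the Python client library, which picks up Application Default Credentials automatically:

    from google.cloud import bigquery

    client = bigquery.Client()  # authenticates via Application Default Credentials
    for row in client.query("SELECT 1 AS x").result():
        print(row.x)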
However, there is a way to grant access to external users. I found the Authorizing API requests doc, which says you can get a temporary access token for external users. This is done by following these steps:
Run the command gcloud auth application-default print-access-token in a Cloud Shell session.
Copy the output and paste it into an HTTP request like:
https://www.googleapis.com/bigquery/v2/projects/$GOOGLE_CLOUD_PROJECT/datasets?access_token=ACCESS_TOKEN
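As a rough Python sketch (the project ID is a placeholder, and the token can equally be sent in an Authorization header):

    import requests

    token = "ACCESS_TOKEN"  # output of the print-access-token command above
    resp = requests.get(
        "https://www.googleapis.com/bigquery/v2/projects/my-project-id/datasets",
        headers={"Authorization": "Bearer " + token})
    print(resp.json())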
Note that this could involve even more effort than using the client libraries.
Hope this is helpful.
I am currently developing a client-side app where users log in with e-mail/password against MongoDB Atlas. The backend is completely serverless.
All logged-in users should be able to upload and retrieve images from a GCP Storage bucket without a visible login, which means the application should authenticate every user in the background.
I was thinking about using Google service accounts in combination with Auth0, but I don't know where to start...
If someone could tell me where to start, that would be great :)
The question is difficult to answer. However, here are some insights.
The preferred way is to have a serverless backend (App Engine standard, Cloud Run, or Cloud Functions) for this. The user authenticates, and a security token is exchanged between the frontend and the backend. When the user wants to reach a GCP resource, they ask the backend, which performs the request with its own service account.
This way, it's easy to trace each user's requests and to serve them only the resources that belong to them. And you need only one service account, for the backend.
If you grant a user access to a bucket directly, they could download all of its files (but maybe there is one bucket per user?). If you choose to limit object access with ACLs, the management is complex.
You don't need a service account per user (and in any case, there is a quota of 100 service accounts per project). You can use Cloud Identity Platform (CIP) instead of your MongoDB database for authentication (CIP doesn't perform authorization; keep MongoDB for authorization and other data related to the authenticated user). CIP is Firebase Auth rebranded.
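A minimal sketch of that backend pattern, assuming CIP/Firebase Auth ID tokens on the frontend and a Flask backend running with its own service account (the bucket name and route are hypothetical):

    from datetime import timedelta

    import firebase_admin
    from firebase_admin import auth
    from flask import Flask, abort, request
    from google.cloud import storage

    firebase_admin.initialize_app()  # uses the backend's own service account
    app = Flask(__name__)
    bucket = storage.Client().bucket("my-user-images")  # hypothetical bucket

    @app.route("/upload-url")
    def upload_url():
        # Verify the CIP/Firebase ID token sent by the frontend.
        token = request.headers.get("Authorization", "").replace("Bearer ", "", 1)
        try:
            user = auth.verify_id_token(token)
        except Exception:
            abort(401)
        # Hand back a short-lived signed URL scoped to this user's prefix,
        # so each user only ever touches their own objects. Signing requires
        # the backend to run with service-account credentials.
        blob = bucket.blob(user["uid"] + "/image.png")
        return blob.generate_signed_url(
            version="v4", expiration=timedelta(minutes=15), method="PUT")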
I'm attempting to use Apache Airflow and pygsheets to upload to various Team Drives. When using oauth authentication, because it's an Airflow task, there's no interactive terminal for inputting the authorization code returned by Google.
I know that using a service account would typically work, but unfortunately, we're unable to give Google Drive access to users outside of our organization (so no xxxxxxx@gserviceaccount.com).
Is there any way to use oauth in a non-interactive manner?
You can authorize the account locally and then copy the generated credentials file to the server. Then use that file for authorization; it won't ask for the code again.
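Roughly, with pygsheets' default OAuth flow (file names depend on your setup):

    import pygsheets

    # Run this once on a machine with a browser; pygsheets caches the
    # resulting token file in its credentials directory. Copy that cached
    # file (plus client_secret.json) to the Airflow worker and call
    # authorize() there with the same paths.
    gc = pygsheets.authorize(client_secret="client_secret.json")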
I have a Google App Engine application that uses Google Cloud SQL. It works locally, but when it's deployed to Google App Engine I get access denied. Locally, I have authorized my IP address to access Cloud SQL, and for the App Engine app the application ID is defined. Is there any other config that is missing? Why does the code not work when deployed on Google Cloud?
Are you using a password? Don't. When the app is deployed on GAE and you use a password for Cloud SQL, it will give you an access denied error. If you null out your password before you deploy, it will work. The password is only required when you are not connecting from GAE.
There might be a few reasons preventing your GAE instance from connecting to your Cloud SQL instance:
The App Engine app is not configured to have permissions to Cloud SQL
Your Cloud SQL connection string is not in the form: /cloudsql/your-project-id:your-instance-name
Your MySQL user was not created using the Cloud SQL administration console.
You are using a password to connect (you shouldn't)
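For reference, a minimal sketch of such a passwordless connection from GAE (the project, instance, and database names are placeholders):

    import MySQLdb

    # On GAE, connect over the Cloud SQL unix socket and omit the
    # password entirely, as described above.
    db = MySQLdb.connect(
        unix_socket="/cloudsql/your-project-id:your-instance-name",
        user="root",  # a user created via the Cloud SQL admin console
        db="mydb")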