I want to be able to run BigQuery queries in Python without a service account (I don't have access to one right now, and it will take some time before I am granted one).
At the moment, I am able to do this by generating the Google Client ID and secret this way:
Going to https://console.cloud.google.com/
Choosing the Project that I have access to: my_project
Activating the Cloud Shell
a) In the shell, type in "gcloud auth application-default login"
b) Answer y in the prompt message
c) Click on the link provided
d) Sign in with my Google Account
e) Allow the Google Auth Library to access my Google Account
f) A verification code is then generated, which looks something like "4/4A_something_something"
g) This creates a temporary credentials file with my client ID and secret, something like "/tmp/tmp.some_strings/application_default_credentials.json"
h) Then I run "cat /tmp/tmp.some_strings/application_default_credentials.json" to view the contents of the JSON file, which looks like this:
{
  "client_id": "whatever_code",
  "client_secret": "whatever_code",
  "refresh_token": "whatever_code",
  "type": "authorized_user"
}
i) Then I save the contents of that dictionary to a .json file in my directory
j) At the command prompt, I run "set GOOGLE_APPLICATION_CREDENTIALS=path_of_the_json_file"
Once I have all of this set up, I can run queries in a Jupyter notebook with the bigquery module. However, every time I lose access to the VDI or need to log in again, I have to reset the token to regain access. Is there a way to automate all of the steps above? Or is it only possible through a service account?
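For reference, the kind of code I run in the notebook once the JSON file is in place looks roughly like this (the file path, project name, and query below are placeholders, not my real values):

from google.cloud import bigquery
from google.oauth2.credentials import Credentials

# Load the authorized_user JSON saved in step i); placeholder path.
creds = Credentials.from_authorized_user_file("path_of_the_json_file.json")

# Placeholder project; any query works once the client is authenticated.
client = bigquery.Client(project="my_project", credentials=creds)
for row in client.query("SELECT 1 AS x").result():
    print(row.x)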
I know the process of manually logging in to github.com and going to Settings in order to generate a new personal access token, but what I want to do is generate that token from the GitHub API, and then use it to access the GitHub API.
Basically, I want a script to run and ask the user to enter a username and password, which will then generate a personal access token and automatically use it to access different things on GitHub.
Is that possible now since they got rid of the authorizations endpoint?
I am trying to run Airflow locally. My DAG has a BigQueryOperator and I want to use the Cloud SDK for authentication. I run "gcloud auth application-default login" in order to get the JSON file with the credentials. I try to test my DAG by running the command:
airflow test testdag make_tmp_table 2019-02-13
and I get the error message "User must be authenticated when user project is provided".
If, instead of using the Cloud SDK, I use a service account that has admin rights to BigQuery, it works, but I need to use authentication through the Cloud SDK.
Does anyone know what this error message means, or how I can run Airflow using the Cloud SDK for authentication?
I have used the following source to try to understand how I can run Airflow with BigQueryOperators locally:
https://medium.com/@jbencina/local-testing-with-google-cloud-composer-apache-airflow-75d4213d2893
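For reference, a stripped-down sketch of the kind of DAG I am testing (the DAG id, destination table, and SQL here are placeholders, not my actual pipeline):

from datetime import datetime

from airflow import DAG
from airflow.contrib.operators.bigquery_operator import BigQueryOperator

# Minimal placeholder DAG; the ids, destination table, and SQL are illustrative only.
with DAG(dag_id="testdag", start_date=datetime(2019, 2, 1), schedule_interval=None) as dag:
    make_tmp_table = BigQueryOperator(
        task_id="make_tmp_table",
        sql="SELECT 1 AS x",
        destination_dataset_table="my_project.my_dataset.tmp_table",
        use_legacy_sql=False,
    )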
Either you are not working on the right project or you don't have permission to do this job.
What I suggest is:
Check your current configuration by running:
gcloud auth list
Make sure that you have the right project and the right account set. If not, run this command to set them:
gcloud auth application-default login
You will be given a link; follow it and sign in with your account. After that you will see a verification code; copy it and paste it into your gcloud terminal.
The next thing to do is to make sure that your account has permission to do the job you are attempting. You probably need the roles/composer.admin role; if that doesn't work, add the primitive role roles/editor from your IAM console. But use that primitive role only for testing purposes; it is not advisable for a production-level project.
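If you also want to confirm from Python which credentials and project the Google client libraries will actually pick up, a small check along these lines can help (assuming the google-auth package is installed):

import google.auth

# Resolve Application Default Credentials the same way the client libraries do;
# this raises DefaultCredentialsError if no usable credentials are found.
credentials, project = google.auth.default()
print("project:", project)
print("credentials type:", type(credentials).__name__)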
I solved it by deleting the credentials file produced when I ran:
gcloud auth application-default login
and then recreating the file. Then it worked, so I had the right method; something was just broken in the credentials file.
As @dlbech said:
This solution was not enough for me. I solved it by deleting the "quota_project_id": "myproject" line in the application_default_credentials.json file. I don't know why Airflow doesn't like the quota project ID key, but I tested it multiple times, and this was the problem.
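If you prefer to script that fix instead of editing the file by hand, a minimal sketch along these lines should work (the path shown is the usual Linux location of the application default credentials file, so adjust it for your platform):

import json
from pathlib import Path

# Typical location of the file written by "gcloud auth application-default login" on Linux.
adc_path = Path.home() / ".config" / "gcloud" / "application_default_credentials.json"

creds = json.loads(adc_path.read_text())
# Drop the quota project entry that Airflow appears to choke on, then rewrite the file.
creds.pop("quota_project_id", None)
adc_path.write_text(json.dumps(creds, indent=2))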
I have written thousands of lines of Apps Scripts in an internal enterprise setting, but have been wracking my brain on this without any success for some time. Here's the use case:
App Script #1 is run by users in our organization with Calendar read/write permissions. In order to operate correctly, however, it requires access to certain data that can only be accessed by a separate user with different permissions.
App Script #2 is run as this second user, and serves up an API endpoint via doPost() or doGet() that returns the requested information.
For security reasons it is not possible to grant user #1 the full permissions required for user #2, hence the desire for user #2 to expose an API that provides only a very specific set of information to user #1.
The most intuitive solution would appear to be for user #2 to deploy an Apps Script with the "execute the app as" field set to "me", and the "Who has access to the app" field set to the organization's GSuite domain (for security reasons this must be restricted to within the organization).
However, if user #1 then tries to hit that endpoint using UrlFetchApp, the request will fail (I believe with an HTTP 403) since the request does not include a session token proving that it originates from within the same GSuite organization. Google does not seem to provide a way to generate such a session token; attempting to set an "Authorization" header with the value "Bearer " + ScriptApp.getOAuthToken() does not work.
I have also tried using the Apps Script API to have user #1 execute the script which was deployed by user #2, however the Apps Script API makes it clear that scripts can only be executed in the context of the calling user.
I've also read separately that Google Apps Scripts do not play well with service accounts (https://issuetracker.google.com/issues/36763096).
Hopefully I'm missing something obvious. Any ideas?
I created a repository on hub.docker.com and now want to push my image to the Dockerhub using my credentials. I am wondering whether I have to use my username and password or whether I can create some kind of access token to push the docker image.
What I want to do is using the docker-image resource from Concourse to push an image to Dockerhub. Therefore I have to configure credentials like:
type: docker-image
source:
  email: {{docker-hub-email}}
  username: {{docker-hub-username}}
  password: {{docker-hub-password}}
  repository: {{docker-hub-image-dummy-resource}}
and I don't want to use my Dockerhub password for that.
In short, you can't. There are some solutions that may appeal to you, but it may ease your mind first to know there's a structural reason for this:
Resources are configured via their source and params, which are defined at the pipeline level (in your yml file). Any authentication information has to be defined there, because there's no way to get information from an earlier step in your build into the get step (it has no inputs).
Since bearer tokens usually time out after "not that long" (i.e. hours or days), which is also true of Docker Hub tokens, the Concourse instance needs to be able to fetch a new token from the authentication service every time the build runs, if necessary. This requires some form of persistent auth to be stored on the Concourse server anyway, and currently Docker Hub does not support CI access tokens à la GitHub.
All that is to say, you will need to provide a username and password to Concourse one way or another.
If you're worried about security, there are some steps you can most likely take to reduce risk:
you can use --load-vars-from to protect your credentials from being saved in your pipeline, storing them elsewhere (LastPass, local file, etc).
you might be able to create a user on Dockerhub that only has access to the particular repo(s) you want to push, a "CI bot user" if you will.
Docker Hub supports access tokens.
Go to Account Settings > Security.
It's the same idea as a GitHub personal access token (PAT).
You can use this token instead of your actual password.
I am using the bq command line tool to query a BigQuery table. Is there a way to get the service account authentication to persist when I log in and out of the box that the query process is running on?
Steps I did:
I logged into the Linux box
Authenticated the service account by running:
gcloud auth activate-service-account --key-file /somekeyFile.p12 someServiceAccount.gserviceaccount.com
Queried the BigQuery table, which works fine:
bq --project_id=formal-cascade-571 query "select * from dw_test.clokTest"
But then I logged out of the box and logged back in. When I query the BigQuery table again:
bq --project_id=formal-cascade-571 query "select * from dw_test.clokTest"
It gives me the error:
Your current active account [someServiceAccount.gserviceaccount.com] does not have any valid credentials.
Even when I pass in the private key file:
bq --service_account=someServiceAccount.gserviceaccount.com --service_account_credential_file=~/clok_cred.txt --service_account_private_key_file=/somekeyFile.p12 --project_id=formal-cascade-571 query "select * from dw_test.clokTest"
It gives the same error:
Your current active account [someServiceAccount.gserviceaccount.com] does not have any valid credentials.
So each time I log back in, I need to re-authenticate my service account by running:
gcloud auth activate-service-account
Is there a way to have the authenticated service account credential persist?
Thank you for your help.
I asked the GCloud devs and they mention a known bug where service accounts don't show up unless the environment variable CLOUDSDK_PYTHON_SITEPACKAGES is set.
Hopefully this will be fixed soon, but in the meantime, when you log in again, can you try running
export CLOUDSDK_PYTHON_SITEPACKAGES=1
and seeing if it then works?
You can run
gcloud auth list
to see what accounts there are credentials for; it should list your service account.
I fixed it by relaunching gcloud auth login. Google then asked me to open a webpage, which triggered the CLOUDSDK authorization; I believe this is linked to the solution shared by J. Tigani.