I've created my first Jupyter Notebook in Google Cloud. I've followed the instructions here
https://cloud.google.com/ai-platform/notebooks/docs/use-r-bigquery
to use R with BQ. However, when I try to run the code I keep getting
"Error: Access Denied: Project ABC: User does not have bigquery.jobs.create permission in project ABC. [accessDenied] " where ABC is my project ID.
I've added the BigQuery User & Admin permissions, logged out, and logged back in, but I keep getting the same message.
I recommend reviewing the Manage access using IAM roles documentation, which explains how to add the required permissions to access the dataset in BigQuery.
Given the error message "Error: Access Denied: Project ABC: User does not have bigquery.jobs.create permission in project ABC. [accessDenied]", you need to grant the bigquery.jobs.create permission to the user or service account being used.
In the Compute Engine menu you can check which service account the notebook instance uses; confirm that that service account has the permissions mentioned above or the BigQuery Admin role mentioned in your question.
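For example, a minimal sketch with gcloud (assuming ABC is your project ID and my-notebook-sa is a hypothetical service account name; the BigQuery Job User role contains bigquery.jobs.create):
gcloud projects add-iam-policy-binding ABC --member="serviceAccount:my-notebook-sa@ABC.iam.gserviceaccount.com" --role="roles/bigquery.jobUser"
If the notebook runs as your own user account instead, use --member="user:your-email" with the same role.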
Related
I'm hoping to get help with the right permission settings for accessing my files from a Colab app.
Goal
I'd like to be able to access personal images in a GCS bucket from a Colab Python notebook running the "Style Transfer for Arbitrary Styles" demo of TensorFlow.
Situation
I set up a GCS bucket, made it public, and was able to retrieve files and use them in the demo.
To avoid having the GCS bucket publicly accessible, I removed allUsers and changed access to my account/email that's tied to both Colab and GCS.
That caused the following error message:
Error Messages
Exception: URL fetch failure on https://storage.googleapis.com/01_bucket-02/Portrait-Ali-02-PXL_20220105_233524809.jpg: 403 -- Forbidden
Other Approaches
I'm trying to understand how I should approach this.
Is it a URL problem?
The 'Authenticated URL' caused the above 403 error.
https://storage.cloud.google.com/01_bucket-02/Portrait_82A6118_r01.png
And the gsutil link:
gs://01_bucket-02/Portrait_82A6118_r01.png
Returned this error message:
Exception: URL fetch failure on gs://01_bucket-02/Portrait_82A6118_r01.png: None -- unknown url type: gs
Authentication setup
For IAM
I have a service account in the project, as well as my user account (email: d#arrovox.com) that's tied to both the Colab and GCP accounts.
The Service Account role is Storage Admin.
The Service Account has an inheritance from the Project.
My user account (my email) has the Storage Object Viewer role.
Assessment
Seems like the Authenticated URL is the right one, and it's a permissions issue.
Is this just about having the right permissions set in GCS, or do I need to call anything in the code before trying to return the image at the GCS URL?
I'd greatly appreciate any help or suggestions in how to troubleshoot this.
Thanks
doug
storage.objects.get is the permission required to view files in GCS, and it looks like your user account already has the right permission.
How can you tell whether your account has the right permission? There's a simple way to check:
Copy your Authenticated URL.
Paste it into a browser and open it.
If your current account doesn't have the right permission, the page will return "<your Gmail account> does not have storage.objects.get access to the Google Cloud Storage object."
Or you can open the Permissions tab in the bucket details to check that your email and service account are listed there with the right role.
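If you have the Cloud SDK installed, a quick command-line check (assuming you are authenticated as an account allowed to read the bucket's IAM policy; the bucket and object names are taken from your question) is:
gsutil iam get gs://01_bucket-02
gsutil cp gs://01_bucket-02/Portrait_82A6118_r01.png /tmp/
The first command lists who has which role on the bucket; the second tests whether your currently active account can actually fetch the object.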
We have Azure Lighthouse configured and I am trying to extract Azure AKS RBAC permission information for a managing subscription from a managed tenant:
Get-AzRoleAssignment -scope "/subscriptions/0000000-0000-0000-00000000000000/resourcegroups/testrg/providers/Microsoft.ContainerService/managedClusters/testakscluster"
Can we extract role assignments for a managing tenant's subscription while logged into a managed tenant's Cloud Shell?
Thanks for your help
When you use the Get-AzRoleAssignment command, it also calls the Azure AD Graph getObjectsByObjectIds API to validate the objects in Azure AD.
To solve the issue, make sure the user account logged into the Cloud Shell has permission to call that API. If your user account type is Member, it has the permission by default, so I suppose your user account is a Guest. If so, there are two options:
1. Navigate to Azure Active Directory in the portal -> User settings -> click Manage external collaboration settings -> select the first option.
2. Navigate to Azure Active Directory in the portal -> Roles and administrators -> search for Directory readers -> click it -> Add assignments -> assign the Directory readers role to your user account.
Either of the options above will make the command work.
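As a quick sanity check, you can confirm whether the account you are logged in with is a Guest; one way, assuming the Azure CLI is available in the Cloud Shell, is:
az ad signed-in-user show --query userType
If this returns "Guest", apply one of the two options above.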
For anyone coming to this thread after some searching: I had the same issue with this call across multiple versions of the Az.Resources module: 2.5.0, 4.1.0 and 5.6.0. All my rights were set up correctly, both for an SPN and a user; both got the same error.
Changing the call to use the Azure CLI instead just works 😠.
az role assignment list -g [resource group name]
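If you need the assignments scoped to the AKS cluster rather than the whole resource group, the same CLI command accepts a scope (the subscription ID and resource names below are the placeholders from the question):
az role assignment list --scope "/subscriptions/0000000-0000-0000-00000000000000/resourcegroups/testrg/providers/Microsoft.ContainerService/managedClusters/testakscluster"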
I have a python program executing bigquery using cloud service account successfully.
When I try to schedule the python program using Jenkins, I see the below error
The gcloud user has the BigQuery Editor, Data Owner, and Admin permissions on the table and dataset.
Log:
gcloud auth activate-service-account abc --key-file=****
Activated service account credentials for: [abc273721.iam.gserviceaccount.com]
gcloud config set project p1
Updated property p1.
403 Access Denied: Table XYZ: User does not have permission to query table
I see that you have granted all the required roles (bigquery.dataOwner & bigquery.admin), as mentioned here, but it looks like you also have to grant the service account access to the dataset.
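One way to grant dataset-level access, sketched with the bq tool (your_dataset is a placeholder), is to export the dataset's metadata, add the service account to its "access" list, and update it:
bq show --format=prettyjson your_dataset > dataset.json
Edit dataset.json and add an entry such as {"role": "READER", "userByEmail": "<service-account-email>"} to the "access" list, then run:
bq update --source dataset.json your_dataset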
Create a service account with the BigQuery Admin role and download the JSON key file (example: data-lab.json).
Then use the command below:
gcloud auth activate-service-account "service-account" --key-file=data-lab.json --project="project-name"
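If the Jenkins job runs the Python program directly rather than through gcloud, another option is to point the BigQuery client library at the key file through the standard environment variable, for example in the job's shell step (the path and script name are placeholders):
export GOOGLE_APPLICATION_CREDENTIALS=/path/to/data-lab.json
python your_program.py
The Google client libraries pick up credentials from that variable automatically.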
Below is the error that comes up while creating a cluster:
(gcloud.container.clusters.create) ResponseError: code=403, message=Request had insufficient authentication scopes
Check the IAM roles for the "Compute Engine default service account" and make sure it has enough permissions to run the command [2]. Usually it would have the Owner or Editor role.
If you are using the Google Cloud Console, when creating an instance you need to look for the 'Identity and API access' section and select 'Allow full access to all Cloud APIs' [1] (a gcloud equivalent is sketched after the links below).
[1] https://cloud.google.com/compute/docs/access/create-enable-service-accounts-for-instances?hl=en_US&_ga=2.168486115.-390700867.1538154355
[2] https://cloud.google.com/iam/docs/granting-roles-to-service-accounts
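If you prefer the command line, a rough gcloud equivalent (my-instance is a placeholder; the instance must be stopped before changing its scopes) is:
gcloud compute instances set-service-account my-instance --scopes=cloud-platform
or, when creating a new instance, request full API access up front:
gcloud compute instances create my-instance --scopes=cloud-platform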
I am using the bq command line tool to query a BigQuery table. Is there a way to get the service account authentication to persist when I log in and out of the box that the query process is running on?
Steps I did:
I logged into the Linux box
Authenticate service account by running:
gcloud auth activate-service-account --key-file /somekeyFile.p12 someServiceAccount.gserviceaccount.com
Query the BigQuery table; this works fine:
bq --project_id=formal-cascade-571 query "select * from dw_test.clokTest"
But then I logged out from the box and logged back in. When I query the BigQuery table again:
bq --project_id=formal-cascade-571 query "select * from dw_test.clokTest"
It gives me the error:
Your current active account [someServiceAccount.gserviceaccount.com] does not have any valid credentials.
Even when I pass in the private key file:
bq --service_account=someServiceAccount.gserviceaccount.com --service_account_credential_file=~/clok_cred.txt --service_account_private_key_file=/somekeyFile.p12 --project_id=formal-cascade-571 query "select * from dw_test.clokTest"
It gives the same error:
Your current active account [someServiceAccount.gserviceaccount.com] does not have any valid credentials.
So every time I need to re-authenticate my service account with:
gcloud auth activate-service-account
Is there a way to have the authenticated service account credential persist?
Thank you for your help.
I asked the GCloud devs and they mentioned a known bug where service accounts don't show up unless the environment variable CLOUDSDK_PYTHON_SITEPACKAGES is set.
Hopefully this will be fixed soon, but in the meantime, when you log in again, can you try running
export CLOUDSDK_PYTHON_SITEPACKAGES=1
and seeing if it then works?
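To make the workaround survive logging out and back in, you could also append it to your shell profile (assuming a bash login shell):
echo 'export CLOUDSDK_PYTHON_SITEPACKAGES=1' >> ~/.bashrc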
You can run
gcloud auth list
to see what accounts there are credentials for; it should list your service account.
I fixed it by re-running gcloud auth login. Google then asked me to open a web page, which triggered the CLOUDSDK authorization; I believe this is linked to the solution shared by J. Tigani.