I am trying to use GCP to train a model for a computer vision project.
I am using AI Platform.
When I submit my job, it fails with the following error:
message: ....does not have storage.objects.list access to NAME OF BUCKET REMOVED FOR SECURITY
"domain": "global",
"reason": "forbidden"
Any suggestions on where to start to fix this?
Does it matter that my project location and bucket location are different?
Thanks!
This problem is caused by insufficient IAM permissions. More information can be found in the documentation under Cloud IAM roles for Cloud Storage, in the Predefined roles section.
To solve this issue, grant your service account the Storage Admin role (roles/storage.admin), or use other IAM roles that grant the storage.objects.* permissions (in some cases the storage.buckets.* permissions may also be required), such as the Environment and Storage Object Administrator role (roles/composer.environmentAndStorageObjectAdmin) you mentioned above.
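As a concrete sketch, the grant can be made from the command line with gsutil; SA_EMAIL and BUCKET_NAME below are placeholders for your training job's service account and your bucket:
# Grant the Storage Admin role on the bucket to the job's service account.
gsutil iam ch serviceAccount:SA_EMAIL:roles/storage.admin gs://BUCKET_NAME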
I'm hoping to get help with the right permission settings for accessing my files from a Colab app.
Goal
I'd like to be able to access personal images in a GCS bucket from a Colab Python notebook running the "Style Transfer for Arbitrary Styles" demo of TensorFlow.
Situation
I set up a GCS bucket, made it public, and was able to retrieve files and use them in the demo.
To avoid having the GCS bucket publicly accessible, I removed allUsers and changed access to my account/email, which is tied to both Colab and GCS.
That caused the following error message:
Error Messages
Exception: URL fetch failure on https://storage.googleapis.com/01_bucket-02/Portrait-Ali-02-PXL_20220105_233524809.jpg: 403 -- Forbidden
Other Approaches
I'm trying to understand how I should approach this.
Is it a URL problem?
The 'Authenticated URL' caused the above 403 error.
https://storage.cloud.google.com/01_bucket-02/Portrait_82A6118_r01.png
And the gsutil link:
gs://01_bucket-02/Portrait_82A6118_r01.png
Returned this error message:
Exception: URL fetch failure on gs://01_bucket-02/Portrait_82A6118_r01.png: None -- unknown url type: gs
Authentication setup
For IAM
I have a service account in the project, as well as my user account (email: d#arrovox.com) that's tied to both the Colab and GCP accounts.
The Service Account role is Storage Admin.
The service account inherits permissions from the project.
My user account (my email) has the Storage Object Viewer role.
Assessment
Seems like the Authenticated URL is the right one, and it's a permissions issue.
Is this just about having the right permissions set in GCS, or do I need to call anything in the code before trying to return the image at the GCS URL?
I'd greatly appreciate any help or suggestions in how to troubleshoot this.
Thanks
doug
storage.objects.get is the permission required to view files in GCS, but it sounds like your user account or email already has the right permission.
How can you tell whether your account has the right permission?
There's a simple way to check:
copy your Authenticated URL
paste it into a browser and load it.
If your current account doesn't have the right permission, the page will return "#Gmail-account does not have storage.objects.get access to the Google Cloud Storage object."
Alternatively, you can open the Permissions tab in the bucket details to check whether your email and service account are listed there with the right roles.
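If the roles look right but the notebook still gets a 403, the Colab session itself may not be authenticated as your user. One workaround, sketched under the assumption that the Colab VM's bundled gcloud and gsutil are used (run each line in a Colab cell with a leading !): authenticate the session, copy the non-public object to local disk, and point the demo at the local file instead of the authenticated URL.
# Log this Colab session in as your Google account (follow the printed link).
gcloud auth login --no-launch-browser
# Copy the object locally; the bucket and object names come from the question.
gsutil cp gs://01_bucket-02/Portrait_82A6118_r01.png /tmp/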
We have Lighthouse configured, and I am trying to extract Azure AKS RBAC permissions information for a managing subscription from a managed tenant:
Get-AzRoleAssignment -Scope "/subscriptions/0000000-0000-0000-00000000000000/resourcegroups/testrg/providers/Microsoft.ContainerService/managedClusters/testakscluster"
Can we extract role assignments for a managing tenant's subscription while logged in to a managed tenant's Cloud Shell?
Thanks for your help
When you use the Get-AzRoleAssignment command, it also calls the Azure AD Graph getObjectsByObjectIds API to validate the objects in Azure AD.
To solve the issue, make sure the user account logged in to Cloud Shell has permission to call that API. If your user account's type is Member, it has the permission by default, so I suppose your user account is a Guest. If so, there are two options:
1. Navigate to Azure Active Directory in the portal -> User settings -> click Manage external collaboration settings -> select the first (least restrictive) option.
2. Navigate to Azure Active Directory in the portal -> Roles and administrators -> search for Directory readers -> click it -> Add assignments -> add your user account to the Directory readers role.
Either option will make the command work.
For anyone coming to this thread after some searching: I had the same issue with this call across multiple versions of the Az.Resources module: 2.5.0, 4.1.0, and 5.6.0. All my rights were set up correctly, both for an SPN and a user; both got the same error.
Changing the call to use the Azure CLI just works 😠.
az role assignment list -g [resource group name]
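For reference, the scoped equivalent of the original PowerShell call in the Azure CLI, using the same placeholder IDs from the question:
az role assignment list --scope "/subscriptions/0000000-0000-0000-00000000000000/resourcegroups/testrg/providers/Microsoft.ContainerService/managedClusters/testakscluster"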
Below is the error that occurs while creating a cluster:
(gcloud.container.clusters.create) ResponseError: code=403, message=Request had insufficient authentication scopes
Check the IAM roles for the Compute Engine default service account and make sure it has enough permissions to run the command [2]. Usually it would have the Owner or Editor role.
If you are in the Google Cloud Console, when creating an instance you need to look for the 'Identity and API access' section and select 'Allow full access to all Cloud APIs' [1].
[1] https://cloud.google.com/compute/docs/access/create-enable-service-accounts-for-instances
[2] https://cloud.google.com/iam/docs/granting-roles-to-service-accounts
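If you create the cluster from the command line instead, you can set the scopes explicitly. A minimal sketch, with a hypothetical cluster name and zone:
# cloud-platform gives the cluster's nodes access to all Cloud APIs.
gcloud container clusters create my-cluster --zone us-central1-a --scopes cloud-platform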
I am a Google cloud project owner but I am not able to access the files in my project buckets. I am getting the error
You need the storage.objects.list permission to list objects in this bucket. Ask a project or bucket owner to give you this permission and try again.
I am unable to copy files from the bucket as well, and get the error "The caller does not have permission".
I have verified I'm authenticated as the right user (gcloud auth list).
What is going on here?
Somehow I had lost the Storage Object permission on my bucket. The option to modify permissions wasn't visible to me either. I had to ask another project owner to add the Storage Object Admin permission for me on that bucket, and that fixed the problem.
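For anyone hitting the same wall, a sketch of the grant the other owner would run, where USER_EMAIL and BUCKET_NAME are placeholders:
# Grant Storage Object Admin on the bucket to the locked-out user.
gsutil iam ch user:USER_EMAIL:roles/storage.objectAdmin gs://BUCKET_NAME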
Is it possible to set this permission through the Cloud Console UI for Cloud Storage? Or is it only settable through the API (for example, following the guidance in this post)?
In the documentation for Google's cloud storage, one of the defined permission scopes is "domain". This allows you to specify that the read or write permission is granted to any authenticated user that is part of your Google Apps domain.
When accessing a storage container UI in the cloud console, you can set user or group permissions, but entering a naked domain with either "User" or "Group" selected results in an "Invalid Value" message when the changes are saved.
This setting is now exposed via the Cloud Console UI. You should notice 3 sections in the dropdown: user, group, and domain.
The setting is also available via the API and via the command-line utility, gsutil. To grant read access to the domain my-domain.org from gsutil, you'd do something like this:
gsutil acl ch -g my-domain.org:R gs://bucket
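The trailing letter is the permission to grant: R for read, W for write, O for owner. Granting write access to the same domain would look like:
gsutil acl ch -g my-domain.org:W gs://bucket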