I have two GCP projects, Project A and Project B. Under Project A, I have a BigQuery dataset and an IAM user, IAM-BQ-PROJ-A, with the roles BigQuery Data Viewer and BigQuery User.
Project B hosts a Kubernetes cluster. A Rails application running in Project B executes queries against the BigQuery dataset in Project A. The credentials for the IAM user IAM-BQ-PROJ-A are accessible to the Rails app. However, these queries fail with the following error:
Google::Cloud::PermissionDeniedError: accessDenied: Access Denied: Project B: The user IAM-BQ-PROJ-A does not have bigquery.jobs.create permission in Project B.
These queries run successfully when the Rails application is running in the local development environment outside of GCP.
If I add the IAM user IAM-BQ-PROJ-A to Project B with the roles BigQuery Data Viewer and BigQuery User, the queries execute successfully.
Why is this the case? Shouldn't these queries succeed without granting the IAM user anything under Project B, provided that the credentials are accessible (just as they do in the local development environment)?
This is the expected behavior, since you are using two different projects.
The IAM user created under Project A may have the right permissions on that project's dataset, but those grants say nothing about Project B. When the client runs a query it creates a BigQuery job in the project it is configured to use (here, Project B), and creating a job requires the bigquery.jobs.create permission in that project. That's why you get the error:
Google::Cloud::PermissionDeniedError: accessDenied: Access Denied: Project B: The user IAM-BQ-PROJ-A does not have bigquery.jobs.create permission in Project B.
Therefore, granting the IAM user the right roles under Project B (a role that includes bigquery.jobs.create, such as BigQuery User or BigQuery Job User) allows the queries to execute successfully.
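For example, assuming the identity is a service account (the member email and project IDs below are placeholders; use --member="user:..." instead if it is a regular user account), the grant in Project B might look like this:
# Allow the Project A identity to create query jobs in Project B.
gcloud projects add-iam-policy-binding project-b \
  --member="serviceAccount:iam-bq-proj-a@project-a.iam.gserviceaccount.com" \
  --role="roles/bigquery.jobUser"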
I have a Virtuoso server running on a remote machine and can access the Conductor UI by logging in as the dba user. I have created a graph using Linked Data -> Quad Store Upload, on which I am able to run SELECT SPARQL queries from Linked Data -> SPARQL. However, when I run an INSERT DATA query I get the following error.
Virtuoso RDF02 Error SR619: SPARUL INSERT access denied: database user 108 (SPARQL) has no write permission on graph http://localhost:8890/dummy
I have checked under System Admin -> User Accounts that the users SPARQL and dba have the SPARQL_UPDATE and SPARQL_SELECT roles. I have also checked Linked Data -> Graphs -> Roles Security and it seems fine. I have the same setup on my local machine, where I initially faced a similar permission issue, but it was resolved after granting the SPARQL_UPDATE and SPARQL_SELECT roles.
Please suggest how I can avoid this error.
A workaround is proposed in virtuoso-opensource/issues/1094:
Run the following in the ISQL console:
DB.DBA.RDF_DEFAULT_USER_PERMS_SET ('nobody', 7);
A second option is to change the User type of the SPARQL user from SQL/ODBC to SQL/ODBC and WebDAV (see here for the original quote).
Option 2 worked for me (virtuoso-opensource, version: 07.20.3235).
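Alternatively, if you want to grant write access only on the specific graph named in the error, rather than changing the default permissions for all graphs, Virtuoso also provides DB.DBA.RDF_GRAPH_USER_PERMS_SET. A sketch, assuming the default isql port and dba password, and using the graph IRI from the question:
# Grant the SPARQL account read/write/sponge (7) on just this one graph.
isql 1111 dba dba exec="DB.DBA.RDF_GRAPH_USER_PERMS_SET ('http://localhost:8890/dummy', 'SPARQL', 7);"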
I've found instructions for generating credentials at the project level, but there aren't clear instructions for adding a service account to only a specific dataset using the CLI.
I tried creating the service account:
gcloud iam service-accounts create NAME
and then getting the dataset:
bq show \
--format=prettyjson \
project_id:dataset > path_to_file
and then adding a role to the access section
{
  "role": "OWNER",
  "userByEmail": "NAME@PROJECT.iam.gserviceaccount.com"
},
and then updating it. It seemed to work because I was able to create a table, but then I got an access denied error, "User does not have bigquery.jobs.create permission in project", when I tried loading data into the table.
When I inspected the project in the Cloud Console, it seemed as if my service account had been added to the project rather than the dataset, which is not what I want, but that also does not explain why I don't have the correct permissions. In addition to owner permissions I tried assigning editor and admin permissions, neither of which solved the issue.
It is not possible for a service account to have permissions only at the dataset level and then run a query. When a query is invoked, it creates a job, and to create a job the service account must have the bigquery.jobs.create permission granted at the project level. See the documentation for the permissions required to run a job.
With this in mind, you need to grant bigquery.jobs.create at the project level so you can run queries against the shared dataset.
NOTE: You can use any of the following predefined roles, as they all include bigquery.jobs.create (you can verify this with the command shown after the list):
roles/bigquery.user
roles/bigquery.jobUser
roles/bigquery.admin
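One way to confirm this is to describe the role and grep for the permission in its includedPermissions list, e.g.:
# Check that the predefined role bundles bigquery.jobs.create.
gcloud iam roles describe roles/bigquery.user | grep bigquery.jobs.create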
In my example I used roles/bigquery.user. See the steps below:
Create a new service account (bq-test-sa@my-project.iam.gserviceaccount.com)
Get the permissions on my dataset using bq show --format=prettyjson my-project:mydataset > info.json
Add OWNER permission to service account in info.json
{
  "role": "OWNER",
  "userByEmail": "bq-test-sa@my-project.iam.gserviceaccount.com"
},
Update the permissions using bq update --source info.json my-project:mydataset
Check BigQuery > mydataset > "SHARE DATASET" to see if the service account was added.
Add the role roles/bigquery.user to the service account using gcloud projects add-iam-policy-binding my-project --member=serviceAccount:bq-test-sa@my-project.iam.gserviceaccount.com --role=roles/bigquery.user
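To sanity-check the setup, you could impersonate the service account and run a query; this is only a sketch, and the key file path and table name are placeholders:
# Export a key for the service account, impersonate it, and run a test query.
gcloud iam service-accounts keys create ./bq-test-sa-key.json --iam-account bq-test-sa@my-project.iam.gserviceaccount.com
gcloud auth activate-service-account bq-test-sa@my-project.iam.gserviceaccount.com --key-file=./bq-test-sa-key.json
bq query --use_legacy_sql=false 'SELECT COUNT(*) FROM `my-project.mydataset.mytable`'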
When I create a brand new CodeBuild project, it allows me to select an IAM service role, and when I check the box "Allow AWS CodeBuild to modify this service role so it can be used with this build project", AWS modifies that service role with a custom policy that's specific to this build project.
But if, after creating that CodeBuild project, I want to attach a different service role to it, I keep getting the message below saying "The policy was not attached to role [x]".
I'm pretty sure I'm missing a permission somewhere, but I'm not sure where.
Edit with more troubleshooting data:
If I uncheck the "Allow AWS [...]" box, it allows me to update the CodeBuild project configuration, but all subsequent builds fail at startup. This is expected.
If I try to re-add the original service role I added to this project when I created it, it lets me add it without any problems.
I had a similar issue when I tried to create a more generic role that could be used by all of my CodeBuild projects. The way I got around it was to uncheck the "Allow AWS CodeBuild to modify this service role so it can be used with this build project" checkbox.
I then had to ensure that the role I was attaching had all the necessary IAM permissions for my subsequent builds to keep running.
I had the same issue and noticed that the previous role that was assigned to the CodeBuild project also had a managed policy attached that had been added when the project was originally created. This policy was named similar to this:
CodeBuildBasePolicy-project-name-us-west-2
I attached this policy to the new Role and detached it from the old role.
After this I was able to select "Update environment" and did not receive the error message.
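If you prefer to do the same swap from the CLI, a rough sketch would be the following (the account ID, policy ARN, and role names are placeholders):
# Attach the CodeBuild base policy to the new role, then detach it from the old one.
aws iam attach-role-policy --role-name my-new-codebuild-role \
    --policy-arn arn:aws:iam::123456789012:policy/service-role/CodeBuildBasePolicy-project-name-us-west-2
aws iam detach-role-policy --role-name old-codebuild-role \
    --policy-arn arn:aws:iam::123456789012:policy/service-role/CodeBuildBasePolicy-project-name-us-west-2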
After a long time spent on this issue, I discovered the problem!
I modified my CodeBuildServiceRole-projectName base policy instead of creating a new policy and attaching it to the CodeBuildServiceRole-projectName role. You should never edit the inline policy that was created by CodePipeline! Only create and add new policies to a role.
As AWS obscurely states in their documentation:
Modifying a policy statement or attaching another policy to the role can prevent your pipelines from functioning. Be sure that you understand the implications before you modify the service role for CodePipeline in any way. Make sure you test your pipelines after you make any change to the service role.
If you delete a CodeBuild project, the policies that CodeBuild created remain attached to the existing role. When you create a new project with the same name as the deleted one - this error will occur.
My solution was to delete all of the roles and policies that were referenced in the pipeline and rebuild those roles and policies. Then rebuild the pipeline.
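A rough CLI sketch of that cleanup (the role and policy names are placeholders; check what your pipeline actually references before deleting anything):
# Inspect, detach, and delete the leftover role and policy, then recreate them.
aws iam list-attached-role-policies --role-name codebuild-my-project-service-role
aws iam detach-role-policy --role-name codebuild-my-project-service-role \
    --policy-arn arn:aws:iam::123456789012:policy/service-role/CodeBuildBasePolicy-my-project-us-west-2
aws iam delete-policy --policy-arn arn:aws:iam::123456789012:policy/service-role/CodeBuildBasePolicy-my-project-us-west-2
aws iam delete-role --role-name codebuild-my-project-service-role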
We're trying to create a very basic role that allows users to query BigQuery tables, but not delete them. The custom role we're experimenting with now has the following permissions:
- bigquery.jobs.create
- bigquery.jobs.get
- bigquery.jobs.list
- bigquery.jobs.listAll
- bigquery.readsessions.create
- bigquery.routines.get
- bigquery.routines.list
- bigquery.savedqueries.get
- bigquery.savedqueries.list
- bigquery.tables.export
- bigquery.tables.getData
- bigquery.tables.list
- bigquery.transfers.get
- resourcemanager.projects.get
We're only focusing on delete at this time, so the permissions list is a work in progress. There is only one custom role assigned to our test user, with the above permissions. However, the user can still delete tables from our BigQuery dataset. Any ideas on the correct combination of permissions to achieve our objective?
Thanks in advance!
You have listed 14 permissions and seem to be assuming that these permissions allow BQ table deletion.
This assumption looks odd (the permission bigquery.tables.delete is clearly not on the list) and is in fact incorrect: a GCP IAM identity (a user or a service account) granted only a role comprised of these 14 permissions will be unable to delete BQ tables. This in turn means the identity you are testing with has been assigned additional roles and/or permissions that are not accounted for.
To prove the assumption is incorrect, open the BQ Console as a project administrator and click the Cloud Shell icon to start a Cloud Shell VM. Then execute the following commands at the prompt, replacing <project-name> with your project:
# Prove the current user is BQ admin by creating 'ds_test1' dataset,
# 'tbl_test1' table, then deleting and recreating the table
bq mk ds_test1
bq mk -t ds_test1.tbl_test1
bq rm -f -t ds_test1.tbl_test1
bq mk -t ds_test1.tbl_test1
# Create role `role_test1`
gcloud iam roles create role_test1 --project <project-name> --title "Role role_test1" --description "My custom role role_test1" --permissions bigquery.jobs.create,bigquery.jobs.get,bigquery.jobs.list,bigquery.jobs.listAll,bigquery.readsessions.create,bigquery.routines.get,bigquery.routines.list,bigquery.savedqueries.get,bigquery.savedqueries.list,bigquery.tables.export,bigquery.tables.getData,bigquery.tables.list,bigquery.transfers.get,resourcemanager.projects.get --stage GA
# Create service account 'sa-test1'
# It is a good security practice to dispose of it when testing is finished
gcloud iam service-accounts create sa-test1 --display-name "sa-test1" --description "Test SA sa-test1, delete it when not needed anymore" --project <project-name>
# Grant the role (and its permissions) to the service account
gcloud projects add-iam-policy-binding <project-name> --member=serviceAccount:sa-test1@<project-name>.iam.gserviceaccount.com --role projects/<project-name>/roles/role_test1
# Save the credential of the service account (including the security sensitive
# private key) to a disk file
gcloud iam service-accounts keys create ~/key-sa-test1.json --iam-account sa-test1@<project-name>.iam.gserviceaccount.com
# Impersonate the service account. This replaces the current permissions with
# that of the service account
gcloud auth activate-service-account sa-test1@<project-name>.iam.gserviceaccount.com --key-file=./key-sa-test1.json
# Confirm the ability to list tables
bq ls ds_test1
# Confirm inability to delete tables
# The command fails with error: BigQuery error in rm operation: Access Denied: Table <project-name>:ds_test1.tbl_test1: User does not have bigquery.tables.delete permission for table <project-name>:ds_test1.tbl_test1.
bq rm -f -t ds_test1.tbl_test1
# Close SSH connection to the VM and logoff
exit
To see the roles granted to the service account 'sa-test1' created above open Cloud Shell and execute:
gcloud projects get-iam-policy <project-name> --flatten="bindings[].members" --filter="bindings.members:serviceAccount:sa-test1@<project-name>.iam.gserviceaccount.com"
It should list our role projects/<project-name>/roles/role_test1.
To see the roles granted to the user who can delete tables execute:
gcloud projects get-iam-policy <project-name> --flatten="bindings[].members" --filter="bindings.members:user:<email-of-the-user>"
I did some tests on my end.
When a user has only the 14 listed permissions, they are not even able to see the BigQuery datasets in the UI. To do so, the bigquery.datasets.get permission must be added to the custom role.
Even with the 15 permissions, they are unable to delete BigQuery tables, so you are on the right path.
Being able to delete tables indicates that the user either does not have the custom role assigned or has additional permissions from other roles. Please:
Check that the roles have been set correctly (my scenario used the 15 permissions). Be sure to save your changes when assigning permissions to your custom roles.
In your IAM dashboard, double check that the user has this role linked to their account.
Also check that the user does not have additional roles like Owner, Editor, BigQuery Admin, or BigQuery Data Editor. If they have any of those extra roles, those permissions are what allow them to delete BigQuery tables.
Finally, double check who is logged into the UI; you can do this by clicking the avatar at the top right corner of the GCP Console. Make sure the logged-in account is the test user (e.g. myUser@emaildomain.com) and not a different one.
Hope this is helpful!
I am unable to create a project in OpenShift. I created a project previously and deleted it. It looks like the project still exists, but I am unable to access or delete it, so I seem to be stuck. Also, logging into the console at https://console.preview.openshift.com/console/ doesn't show any existing projects.
I ran the following oc commands from the terminal.
Any suggestions on how to resolve this issue?
Thanks
XX:~ XX$ oc new-project test
Error from server: projectrequests "test" is forbidden: user XX cannot create more than 1 project(s).
XX:~ XX$ oc delete project test
Error from server: User "XX" cannot delete projects in project "test"
XX:~ XX$ oc status
Error from server: User "XX" cannot get projects in project "default"
XX:~ XX$ oc get projects
You need to give privileges/policies to your user that will allow the actions you want to perform.
If you are just in a proof-of-concept environment, I would recommend making your user cluster-admin for the whole cluster. This will give all possible privileges to your user. Of course, this isn't recommended for every user in a 'real' environment.
First you need to authenticate with the 'default admin' that is created after the installation. This default admin user doesn't work with the normal user/password authentication; it uses a client certificate.
oc login -u system:admin --config=/etc/origin/master/admin.kubeconfig
Now you will see a list of the available projects (default, openshift management, etc.), and you're able to give cluster roles to other users.
Make your user cluster-admin over the whole cluster:
oadm policy add-cluster-role-to-user cluster-admin (youruser)
Now you have cluster-admin privileges over the whole cluster. You can also grant privileges to a user in a specific project rather than the whole cluster. For that, use:
oadm policy add-role-to-user <role> <username> (in the current project)
This will give the role to a user, but only inside the project from which you've run this command.
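For example, to grant a user admin rights on only the test project (the role and names are illustrative):
# Grant the 'admin' role on the 'test' project only.
oadm policy add-role-to-user admin youruser -n test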
For more information about the available cluster roles and policies, see the official documentation.
I raised a defect with the OpenShift team, as pointed out in the support link:
https://docs.openshift.com/online/getting_started/devpreview_faq.html#devpreview-faq-support
Here is the response I received from the support team:
It seems that you have issued a bug and followed up for this already:
https://bugzilla.redhat.com/show_bug.cgi?id=1368862
After the cause is investigated, our operations team will clean up the project manually for you to allow you to continue working with the developer preview.
Latest update:
The project has now been cleaned up and you should be able to create a new project.
I am able to create Project in Openshift now.