Determine which project I'm in and which projects are available - google-bigquery

I think I'm connected to BigQuery via ODBC:
isql -v BigQuery
+---------------------------------------+
| Connected!                            |
|                                       |
| sql-statement                         |
| help [tablename]                      |
| quit                                  |
|                                       |
+---------------------------------------+
My odbc.ini file looks similar to this:
[ODBC]
Trace=yes
TraceFile=/root/odbc.error
[BigQuery]
# 0 = Service Authentication, 1 = User Authentication
Email=someemail@blah.iam.gserviceaccount.com
KeyFilePath=/service.json
Driver=/opt/simba/driver/lib/libgooglebigqueryodbc_sb64.so
OAuthMechanism=0
Catalog=someproject
With this ODBC connection, I assumed I would connect to project someproject. However, when I try to query a table I get:
SQL> select * from `someproject.ua.ua_user_daily` limit 10;
[37000][Simba][BigQuery] (100) Error interacting with REST API: Access Denied: Project someproject: User does not have bigquery.jobs.create permission in project someproject.
[ISQL]ERROR: Could not SQLPrepare
I therefore wanted to know which projects are available to me within the SQL terminal, but I could not find any command to do this.
How can I test my connection by looking at what projects I do have access to?

The problem you are facing deals with roles: the service account needs the bigquery.jobs.create permission, which must be granted through a role.
You can add the role roles/bigquery.user or roles/bigquery.jobUser; both of these roles contain the bigquery.jobs.create permission. Check all BigQuery roles that you can grant.
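For instance, a minimal sketch of granting one of those roles (assuming the project ID someproject and the service-account email from the odbc.ini above; substitute your own values):
gcloud projects add-iam-policy-binding someproject \
  --member=serviceAccount:someemail@blah.iam.gserviceaccount.com \
  --role=roles/bigquery.jobUser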
Additionally, if you want to use the same service account across multiple GCP projects, see this post for how to set that up.
You can run the following command in the Cloud SDK to see which projects your service account has access to:
gcloud projects list --impersonate-service-account=<your-service-account-email-address>
This command requires the Cloud Resource Manager API to be enabled and the resourcemanager.projects.list permission on your service account.
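If that API is not enabled yet, a sketch of enabling it (assuming you have rights to manage services on the project):
gcloud services enable cloudresourcemanager.googleapis.com --project someproject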

Related

How to create a service account for a bigquery dataset from the cli

I've found instructions on how to generate credentials at the project level, but there aren't clear instructions on adding a service account to only a specific dataset using the CLI.
I tried creating the service account:
gcloud iam service-accounts create NAME
and then getting the dataset:
bq show \
--format=prettyjson \
project_id:dataset > path_to_file
and then adding a role to the access section
{
"role": "OWNER",
"userByEmail": "NAME#PROJECT.iam.gserviceaccount.com"
},
and then updating it. It seemed to work because I was able to create a table, but then I got an access denied error, User does not have bigquery.jobs.create permission in project, when I tried loading data into the table.
When I inspected the project in the Cloud Console, it seemed as if my service account was added to the project rather than the dataset, which is not what I want, but it also does not explain why I don't have the correct permissions. In addition to owner permissions I tried assigning editor and admin permissions, neither of which solved the issue.
It is not possible for a service account to have permissions only at the dataset level and then run a query. When a query is invoked, it creates a job, and to create a job the service account must have the bigquery.jobs.create permission at the project level. See the documentation for the permissions required to run a job.
With this in mind, bigquery.jobs.create must be granted at the project level so you can run queries on the shared dataset.
NOTE: You can use any of the following predefined roles, as they all include bigquery.jobs.create:
roles/bigquery.user
roles/bigquery.jobUser
roles/bigquery.admin
In my example I used roles/bigquery.user. See the steps below:
Create a new service account (bq-test-sa@my-project.iam.gserviceaccount.com)
Get the permissions on my dataset using bq show --format=prettyjson my-project:mydataset > info.json
Add OWNER permission to service account in info.json
{
"role": "OWNER",
"userByEmail": "bq-test-sa#my-project.iam.gserviceaccount.com"
},
Update the permissions using bq update --source info.json my-project:mydataset
Check BigQuery > mydataset > "SHARE DATASET" to see if the service account was added.
Add the role roles/bigquery.user to the service account using gcloud projects add-iam-policy-binding my-project --member=serviceAccount:bq-test-sa@my-project.iam.gserviceaccount.com --role=roles/bigquery.user
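To double-check that the binding took effect, a sketch using the same project and service-account names as above:
gcloud projects get-iam-policy my-project \
  --flatten="bindings[].members" \
  --filter="bindings.members:serviceAccount:bq-test-sa@my-project.iam.gserviceaccount.com" \
  --format="table(bindings.role)"
It should list roles/bigquery.user alongside any other roles granted to the account.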

Prevent a user from deleting BigQuery tables

We're trying to create a very basic role that allows users to query BigQuery tables, but not delete them. The custom role we're experimenting with now has the following permissions:
- bigquery.jobs.create
- bigquery.jobs.get
- bigquery.jobs.list
- bigquery.jobs.listAll
- bigquery.readsessions.create
- bigquery.routines.get
- bigquery.routines.list
- bigquery.savedqueries.get
- bigquery.savedqueries.list
- bigquery.tables.export
- bigquery.tables.getData
- bigquery.tables.list
- bigquery.transfers.get
- resourcemanager.projects.get
We're only focusing on delete at this time, so the permissions list is a work in progress. There is only one custom role assigned to our test user with the above permissions. However, the user can delete tables from our BigQuery dataset. Any idea on the correct combination of permissions to achieve our objective?
Thanks in advance!
You have listed 14 permissions and seem to be assuming that these permissions allow BQ table deletion.
This assumption looks odd (the permission bigquery.tables.delete is clearly not on the list) and is in fact incorrect. That means a GCP IAM identity (a user or a service account) assigned a role comprised of these 14 permissions will be unable to delete BQ tables, which in turn means the identity you are testing with is assigned additional role(s) and/or permission(s) that are not accounted for.
To prove the assumption is incorrect, open the BQ Console as a project administrator and click the Cloud Shell icon to start a Cloud Shell VM. Then execute the following commands at the prompt, replacing <project-name> with your project ID:
# Prove the current user is BQ admin by creating 'ds_test1' dataset,
# 'tbl_test1' table, then deleting and recreating the table
bq mk ds_test1
bq mk -t ds_test1.tbl_test1
bq rm -f -t ds_test1.tbl_test1
bq mk -t ds_test1.tbl_test1
# Create role `role_test1`
gcloud iam roles create role_test1 --project <project-name> --title "Role role_test1" --description "My custom role role_test1" --permissions bigquery.jobs.create,bigquery.jobs.get,bigquery.jobs.list,bigquery.jobs.listAll,bigquery.readsessions.create,bigquery.routines.get,bigquery.routines.list,bigquery.savedqueries.get,bigquery.savedqueries.list,bigquery.tables.export,bigquery.tables.getData,bigquery.tables.list,bigquery.transfers.get,resourcemanager.projects.get --stage GA
# Create service account 'sa-test1'
# It is a good security practice to dispose of it when testing is finished
gcloud iam service-accounts create sa-test1 --display-name "sa-test1" --description "Test SA sa-test1, delete it when not needed anymore" --project <project-name>
# Grant the role (and its permissions) to the service account
gcloud projects add-iam-policy-binding <project-name> --member=serviceAccount:sa-test1@<project-name>.iam.gserviceaccount.com --role projects/<project-name>/roles/role_test1
# Save the credential of the service account (including the security sensitive
# private key) to a disk file
gcloud iam service-accounts keys create ~/key-sa-test1.json --iam-account sa-test1@<project-name>.iam.gserviceaccount.com
# Impersonate the service account. This replaces the current permissions with
# that of the service account
gcloud auth activate-service-account sa-test1@<project-name>.iam.gserviceaccount.com --key-file ~/key-sa-test1.json
# Confirm the ability to list tables
bq ls ds_test1
# Confirm inability to delete tables
# The command fails with error: BigQuery error in rm operation: Access Denied: Table <project-name>:ds_test1.tbl_test1: User does not have bigquery.tables.delete permission for table <project-name>:ds_test1.tbl_test1.
bq rm -f -t ds_test1.tbl_test1
# Close SSH connection to the VM and logoff
exit
To see the roles granted to the service account 'sa-test1' created above, open Cloud Shell and execute:
gcloud projects get-iam-policy <project-name> --flatten="bindings[].members" --filter="bindings.members:serviceAccount:sa-test1@<project-name>.iam.gserviceaccount.com"
It should list our role projects/<project-name>/roles/role_test1.
To see the roles granted to the user who can delete tables, execute:
gcloud projects get-iam-policy <project-name> --flatten="bindings[].members" --filter="bindings.members:user:<email-of-the-user>"
I did some tests on my end.
When a user has only the 14 listed permissions, they are not even able to see the BigQuery datasets in the UI. To do so, the bigquery.datasets.get permission must be added to the custom role.
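A sketch of adding that permission to an existing custom role (assuming the role_test1 role and <project-name> placeholder from the answer above):
gcloud iam roles update role_test1 --project <project-name> --add-permissions bigquery.datasets.get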
Even with those 15 permissions, they are unable to delete BigQuery tables, so you are on the right path.
Being able to delete tables indicates that the user either does not have the custom role assigned or has additional permissions from other roles. Please:
Check that the roles have been set correctly (my scenario used the 15 permissions). Be sure to save your changes when assigning permissions to custom roles.
In your IAM dashboard, double-check that the user has this role linked to their account.
Also check that the user does not have additional roles like Owner, Editor, BigQuery Admin, or BigQuery Data Editor. Any of those extra roles would give them the permissions to delete BigQuery tables.
Finally, double-check who is logged into the UI by clicking the avatar at the top-right corner of the GCP console; the account shown should be the test user (e.g. myUser@emaildomain.com) and not some other account.
Hope this is helpful!

Google Bigquery Permissions Issue

I have two GCP projects. Project A and Project B. Under project A, I have a Bigquery data set and an IAM user - IAM-BQ-PROJ-A with roles BigQuery Data Viewer and BigQuery User.
Project B hosts a Kubernetes cluster. There is a Rails application in project B that is executing queries against the Bigquery dataset in project A. I have the credentials for IAM user - IAM-BQ-PROJ-A accessible to the Rails app. However, these queries fail with the following error -
Google::Cloud::PermissionDeniedError: accessDenied: Access Denied: Project B: The user IAM-BQ-PROJ-A does not have bigquery.jobs.create permission in Project B.
These queries run successfully when the Rails application is running in the local development environment outside of GCP.
If I create an IAM user - IAM-BQ-PROJ-A under Project B with roles BigQuery Data Viewer and BigQuery User then these queries execute successfully.
Why is this the case? Shouldn't these queries run successfully without having an IAM user under project B, provided that the credentials are accessible (similar to how it works in the local dev environment)?
This is the expected behavior since you are using 2 different projects.
The user created under project A might have the right permissions, but those grants say nothing about project B, so it won't have the permissions needed to create query jobs in project B. (When the app runs inside project B's cluster, the client library typically picks up project B as the default project for jobs from the environment, whereas in local development it uses the project configured alongside the credentials; this is likely why the local runs succeeded.) That's why you get the error:
Google::Cloud::PermissionDeniedError: accessDenied: Access Denied: Project B: The user IAM-BQ-PROJ-A does not have bigquery.jobs.create permission in Project B.
Therefore creating the user under project B with the right permissions would allow the queries to be executed successfully.
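As an alternative to maintaining a second account, a sketch of granting the existing project-A identity the job-creation role on project B (hypothetical project IDs project-a and project-b, and a service-account email matching the pattern above):
gcloud projects add-iam-policy-binding project-b \
  --member=serviceAccount:iam-bq-proj-a@project-a.iam.gserviceaccount.com \
  --role=roles/bigquery.jobUser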

How to migrate a technical user to a LDAP one?

I have nearly all my users set up as local users ("technical" users, as the SonarQube docs call them) and just installed and configured LDAP plugin 2.2 to connect to my Active Directory.
The connection works fine: if a user unknown to SonarQube but existing in LDAP tries to log in, their account is automatically created.
I'd like to convert my existing SonarQube users (not linked to LDAP) to LDAP users so that their passwords and group memberships are automatically updated, but I could not find how to do this in the documentation.
I found this answer on how to change a local user to an LDAP one, but it didn't work: when I try to log in with LDAP credentials and the same login, I get "Authentication failed.".
Some background:
At some point in time (i.e. some years and SonarQube versions ago), I had configured the LDAP plugin and everything worked as expected. This configuration somehow disappeared during an update, and the LDAP users were all converted to technical users (or so I assume).
I could not find a way to delete a user (as suggested in the SO post I linked above), only to deactivate one. Semantics, perhaps, but it may be important.
I'm running SonarQube 5.6.1.
Edit:
I updated to the latest LTS version 5.6.6.
With trace logs activated:
When I try to log in with a deactivated local user (hoping that this would find it in LDAP):
TRACE web[sql] time=0ms | sql=SELECT count(`users`.id) AS count_id FROM `users` WHERE (login='tguerin' and user_local=1)
TRACE web[sql] time=1ms | sql=SELECT * FROM `users` WHERE (login='tguerin' AND active=1) LIMIT 1
TRACE web[sql] time=0ms | sql=SELECT * FROM `properties` WHERE (((`properties`.`resource_id` IS NULL AND `properties`.`user_id` IS NULL)) AND (`properties`.`prop_key` = 'sonar.allowUsersToSignUp')) LIMIT 1
DEBUG web[http] POST /sessions/login | time=224ms
Nothing more in the logs: no call to LDAP
When I try to log in with a user that doesn't exist (neither as local nor in LDAP):
TRACE web[sql] time=3ms | sql=SELECT count(`users`.id) AS count_id FROM `users` WHERE (login='notLocal' and user_local=1)
DEBUG web[o.s.p.l.LdapUsersProvider] Requesting details for user notLocal
DEBUG web[o.s.p.l.LdapSearch] Search: LdapSearch{baseDn=...), parameters=[notLocal], attributes=[mail, cn]}
DEBUG web[o.s.p.l.LdapContextFactory] Initializing LDAP context {java.naming.provider.url=ldap://x.x.x.x:389, java.naming.factory.initial=com.sun.jndi.ldap.LdapCtxFactory, java.naming.security.principal=..., com.sun.jndi.ldap.connect.pool=true, java.naming.security.authentication=simple, java.naming.referral=follow}
DEBUG web[o.s.p.l.LdapUsersProvider] User notLocal not found in <default>
TRACE web[sql] time=0ms | sql=SELECT * FROM `properties` WHERE (((`properties`.`resource_id` IS NULL AND `properties`.`user_id` IS NULL)) AND (`properties`.`prop_key` = 'sonar.allowUsersToSignUp')) LIMIT 1
DEBUG web[http] POST /sessions/login | time=66ms
The database is checked, then LDAP, as expected.
Edit2: to rule out a problem with a particular config/plugin on my server, I fired up a Docker SonarQube 5.6.6 container, added a local user, added the LDAP plugin (restarted, LDAP config OK), deactivated the user, and tried to log in: same behaviour (i.e. the LDAP server is not queried).
As nothing seemed to work, I decided to inspect the database.
Changing the field user_local in the users table from 1 to 0 did the trick (the login trace above shows SonarQube only consults LDAP when user_local is not 1). I can't imagine this being recommended by SonarQube, but so far I have not found any side effects.
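A minimal sketch of that change (assuming a MySQL-backed SonarQube, which the backtick-quoted SQL in the traces above suggests, and the tguerin login from the logs; the database name and credentials are placeholders):
mysql -u sonar -p -e "UPDATE users SET user_local = 0 WHERE login = 'tguerin';" sonar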

TeamCity Username / password

Hi,
I installed TeamCity a long time ago on my home computer.
I am trying to re-use it now, but I forgot the admin username and password.
Is there a default admin username?
And how can I recover the password?
Thanks
From TeamCity 8 you can log in as a super user and change the password that way. You just need to use an empty username and the last occurrence of the "super user authentication token" found in the logs\teamcity-server.log file as your password.
Please see the following for more information:
TeamCity 8 - http://confluence.jetbrains.com/display/TCD8/Super+User
TeamCity 9 - http://confluence.jetbrains.com/display/TCD9/Super+User
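A sketch of pulling the most recent token out of the log (assuming a Unix-like shell and the default log location under the TeamCity home directory):
grep "Super user authentication token" logs/teamcity-server.log | tail -n 1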
OK, so you've forgotten the username and password for your TeamCity instance.
How to reset the password: described here.
How to get the username:
go to the TeamCity data directory
open the config\database.properties file
the connectionUrl property there points to the database storing the TeamCity settings
take a look at the users table
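For example, with a MySQL-backed TeamCity (a sketch; the actual database name, user, and host come from your connectionUrl):
mysql -u teamcity -p -e "SELECT id, username FROM users;" teamcity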
Update
After you get the username, you can reset its password via the following (copied from the linked answer):
Open a command prompt and go to the \webapps\ROOT\WEB-INF\lib folder under your TeamCity installation
Run the following: ..\..\..\..\jre\bin\java.exe -cp server.jar;common-api.jar;commons-codec-1.3.jar;util.jar;hsqldb.jar ChangePassword username newpassword
FYI, don't follow the advice of going through the users table to recover a password. TeamCity is a quality product, so all passwords are salted and hashed (TC 9 below):
mysql [teamcity]> SELECT id, password FROM users;
+-----+---------------------------------------------------+
| id  | password                                          |
+-----+---------------------------------------------------+
|  21 | k9d9yuE13FtQm8eT:1e24ad492777f94dec0c905127d1ea48 |
|  13 | m1l79Yy03hjoxKdA:199d1ea48e28a78bafde576dd88e6de7 |
|  85 | gOBpYHipOrtEGbUx:88f234847c07085798f9a4f8726e39df |
+-----+---------------------------------------------------+