I am trying to schedule a query from the BigQuery console with a Pub/Sub notification.
The query is below.
INSERT INTO `myproject.my_ds.mytable_test`(Operator, Technology, Freq_Band, Sector)
SELECT Operator, Technology, Freq_Band, Sector FROM `myproject.my_ds.mytable` WHERE Freq_Band = '800' ;
The topic is already created. The custom service account has the following roles:
BigQuery Data Editor
BigQuery User
Logs Writer
Monitoring Metric Writer
Pub/Sub Publisher
The error is "User not authorized to perform this action".
Please help.
Regards,
Santanu
The account you are using needs more privileges. You can see more in the documentation about the privileges you need to schedule a query with BigQuery.
The privileges you need to schedule a query are:
bigquery.transfers.update, or both bigquery.jobs.create and bigquery.transfers.get, to create the transfer
bigquery.jobs.create to run the scheduled query
bigquery.datasets.update on the target dataset
To modify a scheduled query, you must be the creator of the schedule and have the following permissions:
bigquery.jobs.create
bigquery.transfers.update
You can see the predefined roles and permissions you need in the documentation for:
BigQuery ML
BigQuery Data Transfer Service
BigQuery BI Engine
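For reference, here is a minimal sketch of creating the same scheduled query with a Pub/Sub notification via the Python BigQuery Data Transfer client; the display name, topic, and service account below are placeholders based on the question:

from google.cloud import bigquery_datatransfer

client = bigquery_datatransfer.DataTransferServiceClient()

# The DML from the question; the transfer runs it on the given schedule.
query = """
INSERT INTO `myproject.my_ds.mytable_test` (Operator, Technology, Freq_Band, Sector)
SELECT Operator, Technology, Freq_Band, Sector
FROM `myproject.my_ds.mytable`
WHERE Freq_Band = '800'
"""

transfer_config = bigquery_datatransfer.TransferConfig(
    display_name="mytable_test load",  # placeholder name
    data_source_id="scheduled_query",
    params={"query": query},
    schedule="every 24 hours",
    # Publishes a message on each run; the SA needs Pub/Sub Publisher on it.
    notification_pubsub_topic="projects/myproject/topics/mytopic",
)

transfer_config = client.create_transfer_config(
    request={
        "parent": client.common_project_path("myproject"),
        "transfer_config": transfer_config,
        # The SA also needs the bigquery.transfers.* permissions listed above.
        "service_account_name": "my-sa@myproject.iam.gserviceaccount.com",
    }
)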
EDIT
Hi. Even if you have admin permissions for BigQuery, you need additional privileges for Pub/Sub notifications. You can see more in the documentation about it.
You need to have sufficient permissions on the bucket you wish to monitor:
If you own the project that contains the bucket, you most likely have the necessary permission.
If you use IAM, you should have storage.buckets.update permission.
If you use ACLs, you should have OWNER permission.
You also need sufficient permissions on the project that will receive notifications:
If you own the project that will receive notifications, you most likely have the necessary permission.
If you plan to create topics for receiving notifications, you should have pubsub.topics.create permission.
Whether you plan to use new or existing topics, you should have pubsub.topics.setIamPolicy permission. If you create a topic, you typically have pubsub.topics.setIamPolicy for it.
Finally, you need an existing Pub/Sub topic that you wish to send notifications to. Then:
Get the email address of the service agent associated with the project that contains your Cloud Storage bucket.
Use that email address to give the service agent the IAM role roles/pubsub.publisher on the desired Pub/Sub topic.
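As a rough sketch, those last two steps (plus attaching the notification itself) can be scripted with the Python client libraries; the bucket, topic, and project names below are placeholders:

from google.cloud import pubsub_v1, storage

project = "myproject"  # placeholder project ID
topic_path = f"projects/{project}/topics/mytopic"

# Step 1: look up the Cloud Storage service agent for the bucket's project.
gcs = storage.Client(project=project)
service_agent = gcs.get_service_account_email()

# Step 2: grant it roles/pubsub.publisher on the topic
# (this is what pubsub.topics.setIamPolicy is needed for).
publisher = pubsub_v1.PublisherClient()
policy = publisher.get_iam_policy(request={"resource": topic_path})
policy.bindings.add(
    role="roles/pubsub.publisher",
    members=[f"serviceAccount:{service_agent}"],
)
publisher.set_iam_policy(request={"resource": topic_path, "policy": policy})

# Attach the notification config to the bucket
# (this is what storage.buckets.update is needed for).
notification = gcs.bucket("my-bucket").notification(
    topic_name="mytopic", topic_project=project
)
notification.create()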
My GCP expert tells me that my SA only needs the Data Viewer role in the project containing the datasets I want to query, and that as long as it has the Job User role in any other project, the query job should work.
But when I run the query I get this error:
google.api_core.exceptions.Forbidden: 403 POST https://bigquery.googleapis.com/bigquery/v2/projects/... : Access Denied: Project ... : User does not have bigquery.jobs.create permission in project ....
So does the SA need the BigQuery Job User role in exactly the same project where the datasets are?
Your GCP expert is correct!
as long as it has job user role in any other project ...
You just need to make sure you are running the job from within the project where that SA has the Job User role. This project will be billed for the cost of running the job.
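For example, with the Python client (project IDs are placeholders), the project you pass to the client is where bigquery.jobs.create is checked and billed, while the datasets can live in another project:

from google.cloud import bigquery

# The SA has Job User here: jobs run (and are billed) in this project.
client = bigquery.Client(project="billing-project")

# Data Viewer in data-project is enough to read its datasets.
query = "SELECT * FROM `data-project.my_ds.mytable` LIMIT 10"
for row in client.query(query).result():
    print(row)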
In order to avoid the error you are facing, you need the bigquery.jobs.create permission, as the error itself says. You have two options:
1. Create a custom role with that permission.
2. Add the BigQuery Job User or BigQuery User role. Both of them include the bigquery.jobs.create permission you need.
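If you prefer to grant the role programmatically rather than in the console, here is a minimal sketch with the Resource Manager client; the project and service account IDs are placeholders:

from google.cloud import resourcemanager_v3

client = resourcemanager_v3.ProjectsClient()
resource = "projects/billing-project"  # placeholder project ID

# Read-modify-write the project's IAM policy to add the binding.
policy = client.get_iam_policy(request={"resource": resource})
policy.bindings.add(
    role="roles/bigquery.jobUser",
    members=["serviceAccount:my-sa@data-project.iam.gserviceaccount.com"],
)
client.set_iam_policy(request={"resource": resource, "policy": policy})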
So I have a number of datasets under the same GCP BigQuery project, and I want to allow an external user read-only or read/write access on a few of them, but the other datasets should not be visible to him. What's the best approach for this?
P.S. Probably not going to create an email account for him under our domain, so I'm thinking service accounts.
Just figured out one way to do it:
Create a service account for the external user (with the BigQuery Job User role so it can be used to run queries in this project).
In the GCP console web UI, for each dataset to share, click "SHARE DATASET", and in the pop-up panel add the service account created in step 1 with the appropriate role (BigQuery Data Viewer or BigQuery Data Editor).
Not sure if there's a cleaner way.
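For what it's worth, the same sharing can be scripted with the Python client instead of clicking through the UI; the dataset and service account names below are placeholders:

from google.cloud import bigquery

client = bigquery.Client(project="myproject")
dataset = client.get_dataset("myproject.shared_ds")  # placeholder dataset

# Append an ACL entry for the service account, keeping existing entries.
entries = list(dataset.access_entries)
entries.append(
    bigquery.AccessEntry(
        role="READER",  # use "WRITER" for the read/write datasets
        entity_type="userByEmail",
        entity_id="external-sa@myproject.iam.gserviceaccount.com",
    )
)
dataset.access_entries = entries
client.update_dataset(dataset, ["access_entries"])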
I am trying to synchronise databases in two different subscriptions using Azure SQL Data Sync on the new portal:
https://learn.microsoft.com/en-us/azure/sql-database/sql-database-get-started-sql-data-sync
On the portal, I do not get the ability to choose a subscription or a connection string to connect to an Azure database in a different subscription (this is not on-premises).
The options presented are either:
a) a database from an existing subscription + database server, or
b) an on-premises database, with a sync agent to be downloaded.
Can linking to another database via a connection string be implemented via APIs, or is there any restriction or feature limitation around this?
Based on your comment, your issue is due to having different logins for each subscription. In order to achieve what you want, you will need to cross-add the users to the subscriptions.
First, log into your Azure portal. Navigate to the subscription you are the admin for. Click on Access control (IAM) to manage the permissions for it.
Click the Add button, which will bring up a dialog to add permissions. Simply select the role you wish to grant (I believe you will need Contributor for this) and enter the email address of the user you want to grant permissions to.
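If you would rather script the assignment, here is a rough sketch with the Azure Python SDK; the subscription ID and the user's AAD object ID are placeholders (note that the API takes the object ID, not the email address), and the GUID below is the built-in Contributor role definition:

import uuid

from azure.identity import DefaultAzureCredential
from azure.mgmt.authorization import AuthorizationManagementClient
from azure.mgmt.authorization.models import RoleAssignmentCreateParameters

subscription_id = "00000000-0000-0000-0000-000000000000"  # placeholder
scope = f"/subscriptions/{subscription_id}"

client = AuthorizationManagementClient(DefaultAzureCredential(), subscription_id)

# Built-in Contributor role definition ID.
contributor = (
    f"{scope}/providers/Microsoft.Authorization/roleDefinitions/"
    "b24988ac-6180-42a0-ab88-20f7382dd24c"
)

client.role_assignments.create(
    scope,
    str(uuid.uuid4()),  # each role assignment gets a fresh GUID name
    RoleAssignmentCreateParameters(
        role_definition_id=contributor,
        # The AAD object ID of the user you are cross-adding (placeholder).
        principal_id="11111111-1111-1111-1111-111111111111",
    ),
)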
I am working on a new IoT project. I have a Telit device that communicates with an M2M cloud platform. I created a new project on Google Cloud and enabled the BigQuery API. I created a new dataset, but I did not create a table because I understand it will be created when the first data is sent. I created a trigger on the M2M cloud to send data when a condition is true. I have shared the dataset with the email address Google Cloud generated for my application, but I don't know if and where I should put this email address on the M2M cloud.
Can you help me?
Rodrigo Rocha
You will have to make sure that the email you share the dataset with is set to "Can Edit" so you can write tables.
Once that is done, to accomplish what you want, you'll have to run "append" jobs with a "CREATE_IF_NEEDED" create disposition and a "WRITE_APPEND" write disposition. For more info, check out this link:
https://cloud.google.com/bigquery/docs/reference/v2/jobs
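For example, with the modern Python client, an append load with those dispositions looks roughly like this; the table name and row contents are placeholders:

from google.cloud import bigquery

client = bigquery.Client(project="myproject")
table_id = "myproject.my_ds.telemetry"  # placeholder table

job_config = bigquery.LoadJobConfig(
    autodetect=True,  # infer the schema on the first load
    create_disposition=bigquery.CreateDisposition.CREATE_IF_NEEDED,
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

rows = [{"device_id": "telit-01", "temperature": 23.5}]  # placeholder payload
client.load_table_from_json(rows, table_id, job_config=job_config).result()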
My question is whether Cassandra enables the below described scenario out-of-the-box.
The scenario:
Data owner - a "super-user" who wants to share some data in Cassandra with other users.
End user - a regular user who requests access to some subset of data
Cloud provider - hosts the Cassandra data store on behalf of the Data owner.
I know it is possible to define some restrictions on accessing data using configuration files (see reference). My question is whether it is possible to dynamically allow the Data owner to update those access rights (e.g. add/remove users from the access list) by remotely contacting the Cassandra cluster.
As of Cassandra v1.2.2 (I think), user management is much improved.
A cloud provider could host multiple Cassandra clusters and grant super user access to data owners. Super users have access to all keyspaces on the cluster.
The data owners in turn could create keyspaces and grant access to an entire keyspace to end users.
http://www.datastax.com/docs/1.2/security/native_authentication
This can all be done with CQL; the configuration files for user access are no longer needed.
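For illustration, here is a rough sketch of a data owner managing access rights remotely with the Python driver; the host, keyspace, and credentials are placeholders, and the CQL statements are what matter:

from cassandra.auth import PlainTextAuthProvider
from cassandra.cluster import Cluster

# Connect as a superuser (the data owner).
auth = PlainTextAuthProvider(username="data_owner", password="secret")
cluster = Cluster(["cassandra.example.com"], auth_provider=auth)
session = cluster.connect()

# Grant an end user read access to a single keyspace...
session.execute("CREATE USER end_user WITH PASSWORD 'changeme' NOSUPERUSER")
session.execute("GRANT SELECT ON KEYSPACE owner_ks TO end_user")

# ...and revoke it again later, all without touching configuration files.
session.execute("REVOKE SELECT ON KEYSPACE owner_ks FROM end_user")
session.execute("DROP USER end_user")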