Best way to share an Azure Storage Container with multiple users within an organization - azure-storage

I have a requirement to allow organization-level access to one of my storage containers within Azure.
What would be the best way to go about this?
The access is being implemented via a bash script.

1. Create a group and add the users to it.
2. After creating the group, go to 'Access Control (IAM)' on your container and grant access to the group you just created.
(Make sure these users don't have access at the storage-account level; otherwise they will already be able to access the container.)
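Since the access is being scripted in bash, here is a minimal sketch of those two steps with the Azure CLI. The group name, user object ID, storage account, container, and the chosen role (Storage Blob Data Reader) are placeholders and assumptions, not taken from the question; adjust them to your environment.

```bash
#!/usr/bin/env bash
# Sketch only: names, IDs, and the chosen role are assumptions.
set -euo pipefail

SUBSCRIPTION_ID="<subscription-id>"
RESOURCE_GROUP="<resource-group>"
STORAGE_ACCOUNT="<storage-account>"
CONTAINER="<container-name>"

# 1. Create an Azure AD group and add the users to it.
GROUP_ID=$(az ad group create \
  --display-name "container-readers" \
  --mail-nickname "container-readers" \
  --query id -o tsv)          # "objectId" on older CLI versions

az ad group member add --group "$GROUP_ID" --member-id "<user-object-id>"

# 2. Grant the group a data-plane role scoped to the single container only,
#    so group membership does not imply storage-account-level access.
SCOPE="/subscriptions/$SUBSCRIPTION_ID/resourceGroups/$RESOURCE_GROUP/providers/Microsoft.Storage/storageAccounts/$STORAGE_ACCOUNT/blobServices/default/containers/$CONTAINER"

az role assignment create \
  --assignee-object-id "$GROUP_ID" \
  --assignee-principal-type Group \
  --role "Storage Blob Data Reader" \
  --scope "$SCOPE"
```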

Related

Dataset level access control in BigQuery

So I have a number of datasets under the same GCP BQ project, and I want to allow an external user to have read-only or read/write access to a few of them, while the other datasets should not be visible to him. What's the best approach for this?
P.S. Probably not going to create an email account for him under our domain, so I'm thinking service accounts.
Just figured out one way to do it:
1. Create a service account for the external user (with the BigQuery Job User role so it can be used to run queries in this project).
2. In the GCP console web UI, for each dataset to share, click "SHARE DATASET" and, in the pop-up panel, add the service account created in step 1 with the appropriate role (BigQuery Data Viewer or BigQuery Data Editor).
Not sure if there's a cleaner way.
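Where scripting is preferred over the web UI, here is a hedged sketch of the same two steps using the gcloud and bq CLIs; the project, dataset, and service-account names are placeholders, not taken from the question.

```bash
#!/usr/bin/env bash
# Sketch only: project, dataset, and service-account names are placeholders.
set -euo pipefail

PROJECT="my-project"
SA_NAME="external-reader"
SA_EMAIL="$SA_NAME@$PROJECT.iam.gserviceaccount.com"

# 1. Create the service account and let it run query jobs in this project.
gcloud iam service-accounts create "$SA_NAME" --project "$PROJECT"
gcloud projects add-iam-policy-binding "$PROJECT" \
  --member "serviceAccount:$SA_EMAIL" \
  --role roles/bigquery.jobUser

# 2. Share one dataset with the service account (the CLI equivalent of
#    "SHARE DATASET" is editing the dataset's access list).
bq show --format=prettyjson "$PROJECT:shared_dataset" > dataset.json
# Append to the "access" array in dataset.json, e.g.:
#   {"role": "READER", "userByEmail": "external-reader@my-project.iam.gserviceaccount.com"}
# (use "WRITER" instead of "READER" for read/write datasets), then apply:
bq update --source dataset.json "$PROJECT:shared_dataset"
```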

Allow non-admin users to create CouchDB databases

I'm using CouchDB 2.1.0 and for my use case I would like non-admin users to be able to create their own databases that they will then have write/read access to, and the ability to add other users with write/read access.
Note that this is not one database per user, which seems to be the common use case, but many user-created databases per user.
Users are being created right now by POSTing to the _users database. Authentication is handled by CouchDB's built-in authentication.
I could create a backend service with admin credentials that would create these databases, but I would like to avoid doing so. Reading through the docs, it seems that by default CouchDB only allows admins to create databases; is there a way to change this?
Honestly, I think the only real answer here is that you'll have to make a backend service that has admin credentials that can create new databases. Kind of a bummer since one of my goals for this project was "no backend other than CouchDB".
My backend service ended up just taking a list of users that should have access to the created database, creating the database with a unique ID, and returning that ID. I then have a document in each user's DB that lists all of the DBs they have created.
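For what it's worth, here is a rough sketch of the calls such a backend service makes against CouchDB's HTTP API; the host, admin credentials, and member names are placeholders and not part of the original setup.

```bash
#!/usr/bin/env bash
# Sketch only: admin credentials, host, and member names are placeholders.
set -euo pipefail

COUCH="http://admin:secret@localhost:5984"
DB="userdb-$(uuidgen | tr 'A-Z' 'a-z' | tr -d '-')"   # unique, lowercase DB name

# Create the database (requires admin credentials on a default CouchDB 2.x setup).
curl -sf -X PUT "$COUCH/$DB"

# Restrict read/write access to the requesting users via the _security object.
curl -sf -X PUT "$COUCH/$DB/_security" \
  -H "Content-Type: application/json" \
  -d '{
        "admins":  { "names": [], "roles": [] },
        "members": { "names": ["alice", "bob"], "roles": [] }
      }'

# Return the generated DB name so the client can record it in the user's own DB.
echo "$DB"
```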

Azure DataSync with SQL Azure databases across subscriptions

I am trying to synchronise databases in two different subscriptions using Azure SQL Data Sync on the new portal:
https://learn.microsoft.com/en-us/azure/sql-database/sql-database-get-started-sql-data-sync
On the portal, I do not get the option to choose a subscription or a connection string to connect to an Azure database in a different subscription (this is not on-premises).
The options presented are either:
a) a database from an existing subscription + database server, or
b) an on-premises database, with a sync agent to be downloaded.
Can linking to another database via a connection string be implemented via APIs, or is there any restriction or feature limitation around this?
Based on your comment, your issue is that you have different logins for each subscription. To achieve what you want, you will need to cross-add the various users to the subscriptions.
First, log into the Azure portal and navigate to the subscription you are the admin for. Click Access control (IAM) to manage its permissions.
Click the Add button, which brings up a dialog for adding permissions. Simply select the role you wish to grant (I believe you will need Contributor for this) and enter the email address of the user that you want to grant permissions to.
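A minimal Azure CLI sketch of that same role assignment, assuming a placeholder subscription ID and user e-mail:

```bash
#!/usr/bin/env bash
# Sketch only: subscription ID and user e-mail are placeholders.
set -euo pipefail

SUBSCRIPTION_ID="<subscription-id-you-administer>"
USER_EMAIL="user@otherdomain.com"

# Grant the user from the other subscription Contributor rights on this one,
# scoped to the whole subscription (narrow the --scope if you prefer).
az role assignment create \
  --assignee "$USER_EMAIL" \
  --role "Contributor" \
  --scope "/subscriptions/$SUBSCRIPTION_ID"
```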

Is it possible to turn off FT-indexing on a per-database level

I understand there is a Domino ini setting for turning off all FT-indexing for an entire server. But is there any way to do this for only some databases on the server, possibly on a per-folder basis?
A full-text index can only be created by a user with Manager access to the database.
In a well-configured environment NO USER needs Manager access to ANY database.
Even administrators don't need that (as there is Full Administration Mode).
So: give users Editor access to the databases, manage access to databases with groups (user-managed groups if you want), and then decide which databases to index.
In the end, leave the decision about which databases should have an index to the admins...

BigQuery - Grant Access to Other Google Cloud Platform Projects

I'm trying to set up customer access to some of my BigQuery data. I'll start off with my requirements, then what I think the solution needs to be, though I'm not sure how to execute it.
Requirements
Separate billing per customer for queries
I don't want to make my dataset public
Read only access to specific datasets
Accessible via Excel connector
No access rights to my main project
They manage their own access privileges; I don't want to have to add and remove individual users from direct dataset access on behalf of all our clients.
Nice to have - Web UI access
What I've Done
Created a new Google Developer Project
Added a view-only user on that project
Added a service account
Granted access to my BigQuery dataset to the service account
The documentation lists several options for granting dataset access (individual users, Google Groups, service accounts, and the special project groups).
I imagine that I need to set up some sort of special group, but I can't figure out how to do it.
Thanks in advance!
In BigQuery there are two different concepts:
The first one is billing (for queries and any other billable activity), which is linked to a Google Cloud project.
The second one is access to a dataset.
Having said that, to fulfil your requirements you'd create a separate project for each of the customers and grant access to the datasets at the granularity you want.
That way you would have the costs for each of the projects separated but billed to you. Be careful to give them only read access to the project, unless you want them to be able to create other services such as VMs or deploy GAE apps, as those would be billed to you as well.
For example, grant access to dataset [MyDatasetA] to users X and Y in projects Project1 and Project2, but access to [MyDatasetB] to users Y and Z in projects Project2 and Project3.
Thus, each project is accountable for the queries its users run, and you have your access control on each dataset without making it public.
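As a rough illustration of that per-customer setup with the gcloud CLI (the project ID, billing account, and customer group are placeholders; depending on your CLI version the billing commands may live under gcloud beta billing):

```bash
#!/usr/bin/env bash
# Sketch only: project ID, billing account, and customer group are placeholders.
set -euo pipefail

CUSTOMER_PROJECT="customer1-queries"
BILLING_ACCOUNT="<your-billing-account-id>"
CUSTOMER_GROUP="group:customer1-users@example.com"

# One project per customer; its query costs show up under this project.
gcloud projects create "$CUSTOMER_PROJECT"
gcloud billing projects link "$CUSTOMER_PROJECT" \
  --billing-account "$BILLING_ACCOUNT"

# Read-only (Viewer) on the project, so they cannot spin up VMs or GAE apps
# that would also be billed to you.
gcloud projects add-iam-policy-binding "$CUSTOMER_PROJECT" \
  --member "$CUSTOMER_GROUP" \
  --role roles/viewer
```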
Separate billing per customer for queries. Done with the independent projects.
I don't want to make my dataset public. Done with fine grained control access.
Read only access to specific datasets. Same as above.
Accessible via Excel connector. It should work without problems, as they'd be first-class BQ users.
No access rights to my main project. Again possible if they are restricted to their own projects.
They manage their own access privileges. This is trickier. I think they'd need more than read access to the datasets or more than read access to the projects to be able to add new users, if you use the project groups as access control.
Nice to have - Web UI access. Check out https://bigquery.cloud.google.com/
The project groups are groups that allow you to select members with Viewer, Developer, or Owner roles in one click, without the hassle of adding each member manually.
You already get three groups set up for you to use: Viewers, Editors and Owners of the original project.
But you may create your own Google Groups and give those groups the permissions you want.
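Granting one of those Google Groups read access to a dataset can also be done outside the UI; a minimal bq sketch, with placeholder project, dataset, and group names:

```bash
#!/usr/bin/env bash
# Sketch only: project, dataset, and group address are placeholders.
set -euo pipefail

PROJECT="my-main-project"
DATASET="MyDatasetA"

# Dump the dataset definition, append a group entry to its access list, re-apply.
bq show --format=prettyjson "$PROJECT:$DATASET" > dataset.json
# Add to the "access" array in dataset.json:
#   {"role": "READER", "groupByEmail": "customer1-users@example.com"}
bq update --source dataset.json "$PROJECT:$DATASET"
```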
The hint when doing so is that new users will usually need to Display your project so that it appears in the BQ online browser. This is done by clicking the arrow next to the project name in the BQ online browser, then Switch to project, then Display project, using the name of the project that the dataset belongs to.
Edit: Improved the explanation about Group access