Access Azure Datalake from Datafactory using Service Principal - azure-data-factory-2

We are trying to access Data Lake Storage from Data Factory using a service principal.
As part of this, I created an AD group and a service principal, and added the SP to the AD group.
I then used the AD group to create ACL entries in the Azure Data Lake, but this does not work: we get a 'This request is not authorized' error.
If I add the service principal to the 'Storage Blob Data Contributor' RBAC role, it works.
Any idea how to get this working? TIA.

Azure RBAC uses role assignments to apply sets of permissions to security principals. A security principal is an Azure Active Directory (Azure AD) object that represents a user, group, service principal, or managed identity. A permission set can grant a security principal "coarse-grained" access, for example to all of the data in a storage account or all of the data in a container, via roles such as Storage Blob Data Contributor, Storage Blob Data Owner, or Storage Blob Data Reader.
ACLs let you apply "finer-grained" access at the level of directories and files. An ACL is a permission construct containing a sequence of ACL entries, each of which associates a security principal with a certain access level. See Access control lists (ACLs) in Azure Data Lake Storage Gen2 for additional information.
Hence, you need to set up both kinds of access: the RBAC role assignment and the ACLs.
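The two kinds of access above can be sketched with the Azure CLI. This is a minimal illustration, not a definitive setup: the account, container, path, and object IDs are placeholders you would substitute with your own.

```sh
# 1. Coarse-grained access: assign an RBAC data role to the service principal
#    at the storage account scope (placeholders throughout).
az role assignment create \
  --assignee "<service-principal-object-id>" \
  --role "Storage Blob Data Contributor" \
  --scope "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<account>"

# 2. Finer-grained access: set the ACL on a directory for the AD group.
#    Note: --acl replaces the full ACL on the path, so include any
#    existing entries you want to keep.
az storage fs access set \
  --account-name "<account>" \
  --file-system "<container>" \
  --path "raw/input" \
  --acl "user::rwx,group::r-x,other::---,group:<ad-group-object-id>:r-x" \
  --auth-mode login
```

Keep in mind that for ACL-based access the principal also needs execute (x) permission on every parent directory of the path it reads.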

Related

Accessing a Postgres SQL Server from a Google cloud (gcp) account using another Google cloud (gcp) account which has a virtual machine

I have a PostgreSQL server in my Google Cloud account (account A). I can access it using the external IP address locally, or using the internal IP address from one of my virtual machines.
I have a friend who has another Google Cloud account (account B). He can't access the instance unless I allowlist his VM's IP address. Is there another way he can access my SQL server, such as adding or changing IAM permissions?
You can use Cloud SQL IAM database authentication. Note that this feature is in Pre-GA and is only available for Cloud SQL for PostgreSQL, but essentially Cloud SQL is integrated with IAM to help you better monitor and manage database access for users and service accounts.
Another thing to take into consideration is that at this time groups are not supported; only direct user and service accounts are (i.e., indirect users via groups are not supported). You will need to give your own user account the "Cloud SQL Instance User" role, as well as every user account that will use IAM-based authentication against the Postgres instances in this project. Note that you need at least one individual user account assigned this role in the project; you cannot have only service accounts.
Also, make sure that the Postgres instance you are attempting to add a user to has the cloudsql.iam_authentication flag set, as per the instructions here.
Next, you should be able to add the users and service accounts granted the "Cloud SQL Instance User" role via the 'Add User' interface, as described here.
Finally, you'll need to GRANT each user/service account appropriate permissions on the schemas it should have access to; note that the full email address of the user or service account is required, as outlined here.
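The steps above can be sketched with gcloud. This is a hedged outline under assumed names: the project, instance, and email address are placeholders, and the commands require a live, authenticated gcloud session.

```sh
# 1. Enable IAM authentication on the Postgres instance (placeholder names).
gcloud sql instances patch my-postgres-instance \
  --database-flags=cloudsql.iam_authentication=on

# 2. Grant the "Cloud SQL Instance User" role to the user account.
gcloud projects add-iam-policy-binding my-project \
  --member="user:friend@example.com" \
  --role="roles/cloudsql.instanceUser"

# 3. Add the IAM user to the instance.
gcloud sql users create friend@example.com \
  --instance=my-postgres-instance \
  --type=cloud_iam_user

# 4. Then, connected to the database with psql, grant schema permissions.
#    The full email address is the database role name:
#    GRANT USAGE ON SCHEMA public TO "friend@example.com";
#    GRANT SELECT ON ALL TABLES IN SCHEMA public TO "friend@example.com";
```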

In Azure can we create a service principal connection that can access multiple subscriptions

So I have an identity (an Azure AD email ID) that has access to multiple subscriptions in the Azure portal. When I create a service principal using that identity, it gets bound to only one subscription, and I am not able to use that service principal to access resources in other subscriptions. Do I have to create a service principal for each subscription?
I use this service principal in a DevOps pipeline, and I want to access resources across multiple subscriptions. But because the service principal is associated with only one subscription, I am not able to do that.
Should I create a service principal for each subscription? I don't want to do that. Is there some other solution to this problem?
You can achieve this by adding a role assignment for that service principal on each of the subscriptions.
Go to the Azure portal and open each subscription. Pick the Access control (IAM) tab and click "Add role assignment". Here you can pick the role to assign (for example, Contributor), and under "Assign access to" you pick the service principal that needs access to this subscription. Once you save, that service principal will have Contributor access to that subscription. Repeat for each of the other subscriptions.
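The same per-subscription assignment can be scripted instead of clicked through the portal. A minimal sketch, assuming placeholder subscription IDs and an existing service principal:

```sh
# Grant one service principal Contributor on several subscriptions.
# APP_ID and the subscription IDs are placeholders.
APP_ID="<service-principal-app-id>"

for SUB in "<sub-id-1>" "<sub-id-2>" "<sub-id-3>"; do
  az role assignment create \
    --assignee "$APP_ID" \
    --role "Contributor" \
    --scope "/subscriptions/$SUB"
done
```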

Can Keycloak be configured to set roles automatically based on a user's properties?

The goal is to get an access token with a custom set of roles.
My scenario is that I have a User Storage Provider SPI that looks into an Oracle DB for authentication. It also checks user permissions defined in other tables of that DB. I set those permissions on the UserModel object returned by the SPI.
Now I would like to define custom roles in Keycloak (using the administration application) and configure them so they are included in the user's access token, depending on some rules over the permissions that I've previously set on the UserModel. The idea is to be flexible and allow the mapping between roles and the user's permissions found in the DB to be changed.
I've read the theory about mappers and policies, but I'm not sure how to manage it, and I can't find an example that illustrates the process.

Azure Storage - Allowed Microsoft Service when Firewall is set

I am trying to connect a public Logic App (not an ISE environment) to a storage account that is restricted to a VNet.
According to the storage account documentation, access should be possible using a system-assigned managed identity.
However, I just tried in 3 different subscriptions and the result is always the same:
{
  "status": 403,
  "message": "This request is not authorized to perform this operation.\r\nclientRequestId: 2ada961e-e4c5-4dae-81a2-520397f277a6",
  "error": {
    "message": "This request is not authorized to perform this operation."
  },
  "source": "azureblob-we.azconn-we-01.p.azurewebsites.net"
}
I have already granted access with different IAM roles, including Owner. It feels like the service that should be allowed according to the documentation is not actually being allowed.
The Allow trusted Microsoft services... setting also allows a particular instance of the below services to access the storage account, if you explicitly assign an RBAC role to the system-assigned managed identity for that resource instance. In this case, the scope of access for the instance corresponds to the RBAC role assigned to the managed identity.
Azure Logic Apps | Microsoft.Logic/workflows | Enables logic apps to access storage accounts
https://learn.microsoft.com/en-us/azure/storage/common/storage-network-security#exceptions
What am I doing wrong?
Added screenshots:
https://i.stack.imgur.com/CfwJK.png
https://i.stack.imgur.com/tW7k9.png
https://i.stack.imgur.com/Lxyqd.png
https://i.stack.imgur.com/Sp7ZV.png
https://i.stack.imgur.com/Hp9JG.png
https://i.stack.imgur.com/rRbau.png
For authenticating access to Azure resources using managed identities in Azure Logic Apps, you can follow the documentation. The Logic App should be registered in the same subscription as your storage account. If you want to access blobs in an Azure Storage container, you can add the Storage Blob Data Contributor role (used to grant read/write/delete permissions on Blob storage resources) to the Logic App's system-assigned identity on the storage account.
Update
From your screenshot, I see that you have not used the system-assigned managed identity in your Create blob action; you are using an API connection instead.
To validate connecting a public Logic App to a storage account with the Allow trusted Microsoft services... setting enabled, you should design your logic using the managed identity in the trigger or action through the Azure portal. To specify the managed identity in a trigger or action's underlying JSON definition, see Managed identity authentication.
For more details, please read these steps in Authenticate access with managed identity.
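For illustration, an HTTP action in a workflow's JSON definition can call Blob storage directly with the managed identity. This is a sketch only: the account, container, and blob names are placeholders, and it assumes the Logic App's system-assigned identity already holds a Storage Blob Data role on the account.

```json
{
  "Create_blob_via_MSI": {
    "type": "Http",
    "inputs": {
      "method": "PUT",
      "uri": "https://<account>.blob.core.windows.net/<container>/<blob-name>",
      "headers": {
        "x-ms-blob-type": "BlockBlob",
        "x-ms-version": "2019-07-07"
      },
      "body": "example content",
      "authentication": {
        "type": "ManagedServiceIdentity",
        "audience": "https://storage.azure.com/"
      }
    }
  }
}
```

The "authentication" object is what makes the request go out with the system-assigned identity's token rather than an API connection's credentials.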

Can I refer to a specific Cognito User Pool as a principal in an AWS IAM policy?

I want to restrict access to an AWS S3 bucket so that new objects can only be created by users who have authenticated through a specific Cognito User Pool. Looking through the AWS documentation I can't work out how to specify the user pool as a principal in the policy. Can anyone help me with this?
If this isn't possible then I'd appreciate some pointers/guidance which might help me achieve the desired result using some other technique. Thanks.
I think a user pool alone will not allow you to do this, since you only get an access token for an authenticated entity. You'll probably need to create an identity pool, which allows users in your user pool to retrieve temporary IAM credentials. In the identity pool, you can also configure the role these logged-in users get with their temporary credentials. You can use these roles to restrict access to the S3 bucket to only those authenticated users.
In the identity pool settings, with a Cognito user pool configured as the authentication provider, the pool will either use the default authenticated role configured for the identity pool, or you can choose a custom role for the Cognito authentication provider. All of these roles can be managed using IAM.
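To restrict the bucket as asked, the policy attached to the identity pool's authenticated role can scope writes to each user's own prefix. A minimal sketch, assuming a hypothetical bucket name my-bucket and an uploads/ prefix; the ${cognito-identity.amazonaws.com:sub} policy variable resolves to the caller's identity ID:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowAuthenticatedUploadsToOwnPrefix",
      "Effect": "Allow",
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::my-bucket/uploads/${cognito-identity.amazonaws.com:sub}/*"
    }
  ]
}
```

Because only identities that authenticated through the identity pool (backed by your user pool) can assume this role, the bucket effectively accepts new objects only from those users.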