Can a system-assigned managed service identity be added to an AAD group? - azure-data-lake

I have an Azure Data Factory V2 service running with a system-assigned managed identity (MSI). This service needs to access a Data Lake Gen 1 with thousands of folders and millions of files.
For efficiency, we have a group assigned to the root of the data lake with RX permissions, and these are inherited and set as defaults throughout the tree.
I'd like to add the above ADF MSI to this group, but I cannot figure out how to do so via the portal AAD blade.
I can assign this MSI to the data lake directly, but it then has to update millions of files, which is slow and error-prone (the blade needs to be kept open while the permissions are applied, and this often fails over the hours it takes due to a network glitch).
Mark.

Yes, you can add a system-assigned managed identity to an Azure AD group. See this link for how it can be achieved via PowerShell: https://learn.microsoft.com/en-us/azure/active-directory/managed-identities-azure-resources/tutorial-windows-vm-access-sql#create-a-group-in-azure-ad-and-make-the-vms-system-assigned-managed-identity-a-member-of-the-group

This is also possible using the Azure CLI now:
az ad group member add --group <Group Object ID or Name> --member-id <Object ID of your managed identity>
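For example, a minimal end-to-end sketch, assuming an illustrative factory MyDataFactory in resource group MyRG and a group named datalake-readers:

# Look up the object ID of the data factory's system-assigned managed identity
az resource show --resource-group MyRG --name MyDataFactory --resource-type "Microsoft.DataFactory/factories" --query identity.principalId -o tsv

# Add that object ID to the AAD group
az ad group member add --group datalake-readers --member-id <object ID returned by the previous command>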

Related

Azure Data Factory fails to create Azure SQL linked service utilising Managed private endpoint

I have created and approved a managed private endpoint in Azure Data Factory, targeting my Azure SQL server (which has public network access disabled).
I have also created a database user for the System Assigned Managed Identity.
When attempting to add a new linked service in the Data Factory portal, I am able to select my Azure subscription and the Server name, as shown in the screenshot below. However, the Database name dropdown never moves beyond "Loading..."
Attempting to create the linked service via Bicep instead seems to succeed - but reviewing the linked services blade, the linked service is not "Using private endpoint" - and my data pipeline fails.
Fix: Ensure SQL server name is all lowercase.
Checking my browser console whilst the above screen was displayed, I noticed an error relating to validation of the Server name, specifically "Servername cannot be empty or null. It can only be made up of lowercase letters, the numbers 0-9 and the hyphen. The hyphen may not lead or trail in the name."
My server name contained capital letters (although the Data Factory UI was rendering it in all lower case).
After recreating my Azure SQL server, with a name complying with the requirements above, I was able to set up the linked service without issue (both through the UI and through Bicep).
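For reference, a minimal sketch of recreating the server with a compliant name via the Azure CLI, using illustrative names and a placeholder password:

# Server names may contain only lowercase letters, the digits 0-9 and hyphens; a hyphen may not lead or trail the name
az sql server create --name my-lowercase-server --resource-group MyRG --location westeurope --admin-user sqladminuser --admin-password '<password>'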
Try enabling the interactive authoring option.
https://learn.microsoft.com/en-us/azure/data-factory/tutorial-copy-data-portal-private

How to enhance Microsoft AAD Example with database access using Microsoft Best Practices (No Passwords)

Background
I started with this sample AAD C# Blazor example
I stored the AAD client secret in a key vault using this example bicep script as a guide. This bicep script grants the webapp's system assigned service principal access to the key vault.
I am attempting to add Cosmos DB access using RBAC instead of passwords, as recommended, using CDennig's example Bicep script as a guide to grant the web app's service principal access to the Cosmos database. This is not working because the RBAC functions need to know the principal in advance.
Error BCP120: This expression is being used in an assignment to the "name" property of the "Microsoft.DocumentDB/databaseAccounts/sqlRoleDefinitions" type, which requires a value that can be calculated at the start of the deployment. You are referencing a variable which cannot be calculated at the start ("roleDefId" -> "web"). Properties of web which can be calculated at the start include "apiVersion", "id", "name", "type".
Inspired by CDennig's example (previous link), I used PowerShell to create a managed identity (MI), passed it to the Bicep script, made it the user-assigned identity of the web app (instead of using the web app's system-assigned identity), granted this MI access to the key vault, and finally used this MI to create role definitions and role assignments for access to the Cosmos DB via RBAC. I'm having some trouble and questions about this approach, as detailed in this other post.
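For illustration, a minimal sketch of that pre-created identity workflow with the Azure CLI, assuming hypothetical names (myapp-identity, MyRG) and a main.bicep that accepts a principalId parameter:

# Create the user-assigned managed identity up front so its principal ID is known before deployment starts
az identity create --name myapp-identity --resource-group MyRG

# Capture the principal ID and hand it to the Bicep deployment as a parameter
principalId=$(az identity show --name myapp-identity --resource-group MyRG --query principalId -o tsv)
az deployment group create --resource-group MyRG --template-file main.bicep --parameters principalId=$principalId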
Is there another approach to accessing the cosmos DB without passwords that I should consider?
Question:
This must be extremely common! I cannot find any examples of a simple authenticating web app that accesses a database securely... Are my approaches sound and I'm just doing something wrong, or do I have the wrong approach?

How to get Azure SQL transactional log

How do I get the transaction logs for an Azure SQL DB? I'm trying to find the log in the Azure portal, but am not having any luck.
If there is no way to get the log, where is that stated in the Microsoft docs? Any help is appreciated.
You don't, as it is not exposed in the service. Please step back and describe what problem you'd like to solve. If you want a DR solution, for example, then active geo-replication can solve this for you as part of the service offering.
The log format in Azure SQL DB is constantly changing and is "ahead" of the most recent version of SQL Server. So, it is probably not useful to expose the log (the format is not documented). Your use case will likely determine the alternative question you can ask instead.
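If DR is indeed the goal, a minimal sketch of creating a geo-replica with the Azure CLI, assuming illustrative server and database names:

# Create an active geo-replication secondary of MyDatabase on a partner server
az sql db replica create --resource-group MyRG --server my-primary-server --name MyDatabase --partner-server my-secondary-server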
Azure SQL Database auditing tracks database events and writes them to an audit log in your Azure storage account, or sends them to Event Hub or Log Analytics for downstream processing and analysis.
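For example, a minimal sketch of enabling blob auditing for a single database with the Azure CLI, assuming illustrative names:

# Enable auditing for the database and send audit records to a storage account
az sql db audit-policy update --resource-group MyRG --server my-sql-server --name MyDatabase --state Enabled --blob-storage-target-state Enabled --storage-account mystorageaccount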
Blob audit
Audit logs stored in Azure Blob storage are stored in a container named sqldbauditlogs in the Azure storage account. The directory hierarchy within the container is of the form <ServerName>/<DatabaseName>/<AuditName>/<Date>/. The Blob file name format is <CreationTime>_<FileNumberInSession>.xel, where CreationTime is in UTC hh_mm_ss_ms format, and FileNumberInSession is a running index in case session logs span across multiple Blob files.
For example, for database Database1 on Server1 the following is a possible valid path:
Server1/Database1/SqlDbAuditing_ServerAudit_NoRetention/2019-02-03/12_23_30_794_0.xel
Audit logs of read-only replicas are stored in the same container. The directory hierarchy within the container is of the form <ServerName>/<DatabaseName>/<AuditName>/<Date>/RO/. The Blob file name shares the same format.
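To retrieve those files, a minimal sketch with the Azure CLI, reusing the example path above and an illustrative storage account name:

# List the audit files written for Database1 on Server1
az storage blob list --auth-mode login --account-name mystorageaccount --container-name sqldbauditlogs --prefix Server1/Database1/ --query "[].name" -o tsv

# Download a single audit file locally (it can then be opened with SSMS or read via sys.fn_get_audit_file)
az storage blob download --auth-mode login --account-name mystorageaccount --container-name sqldbauditlogs --name Server1/Database1/SqlDbAuditing_ServerAudit_NoRetention/2019-02-03/12_23_30_794_0.xel --file 12_23_30_794_0.xel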

How to query AAD Security Group Membership from Azure SQL

I'm trying to find a way from within Azure SQL to either 1) enumerate members of an Azure Active Directory security group or 2) check if a user login is a member of an SG. I've found various articles about doing it from a domain joined standalone SQL installation but not from Azure SQL. Most of the samples for the standalone installation use system sprocs like xp_cmdshell which don't exist in Azure SQL. I know I can create an Azure Function or Logic App to sync users to a table but I'd like to avoid using an external process to do this if possible.
@Kalyan Chanumolu-MSFT's comment should be very helpful to you. This scenario is not supported today.
You can try to use his suggestion.
You will have to talk to Microsoft Graph API from an intermediate like an Azure function to relay the data to Azure SQL Database.
You can also raise a support ticket to confirm it, and put forward your suggestions as feedback.
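For example, the intermediate can resolve group membership through Microsoft Graph; a minimal sketch of the equivalent lookups with the Azure CLI (which calls Graph under the hood), assuming an illustrative group name:

# Enumerate the members of an AAD security group
az ad group member list --group MySecurityGroup --query "[].displayName" -o tsv

# Check whether a given object ID is a member of the group
az ad group member check --group MySecurityGroup --member-id <object ID of the user or identity>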

Data Factory New Linked Service connection failure ACL and firewall rule

I'm trying to move data from a data lake stored in Azure Data Lake Storage Gen1 to a table in an Azure SQL database. In Data Factory, under "New Linked Service", when I test the connection I get a "connection failed" error message: "Access denied...make sure ACL and firewall rule is correctly configured in the Azure Data Lake Store account." I have tried numerous times to correct this using related Stack Overflow comments and a plethora of fragmented Azure documentation, to no avail. Am I using the correct approach, and if so, how do I fix the issue?
Please follow these steps:
First: Go to ADF and create a new linked service, then copy the Managed identity object ID.
Second: Go to Azure Data Lake Storage Gen1, navigate to Data Explorer -> Access -> click 'Select' in the 'Select user or group' field.
Finally: Paste your Managed identity object ID and then test your connection in ADF.
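Equivalently, a minimal sketch of granting that object ID read/execute at the root with the Azure CLI, assuming an illustrative account name (note this sets the ACL on the root entry only; it does not recurse over existing children):

# Grant the Data Factory managed identity r-x on the root folder of the Gen1 account
az dls fs access set-entry --account mydatalakegen1 --path / --acl-spec user:<managed identity object ID>:r-x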