Generate event to Azure Event Hub on file creation in Azure Data Lake - azure-data-lake

I have a requirement where I need to send file metadata (like file name, file path, etc.) to Azure Event Hub / Azure Event Grid whenever a new file is created in an Azure Data Lake container.
What would be the ideal way to do it? Is there any out-of-the-box solution for this use case?

You can react to Blob storage events with Azure Event Grid and filter on the event types you need; the documentation has tutorials and how-tos for this.
For example, subscribe to the Microsoft.Storage.BlobCreated event type, as covered in "Use Azure Event Grid to route Blob storage events to web endpoint (Azure portal)". A sketch of consuming such an event is shown below.
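As a rough illustration (not part of the original answer), here is a minimal Python Azure Functions sketch triggered by an Event Grid subscription filtered to Microsoft.Storage.BlobCreated; the function name and the idea of forwarding the metadata onward are assumptions for illustration.

```python
import json
import logging

import azure.functions as func


def main(event: func.EventGridEvent):
    """Triggered by an Event Grid subscription filtered to Microsoft.Storage.BlobCreated."""
    data = event.get_json()

    # The blob URL carries the container and file path of the newly created file.
    blob_url = data.get("url")   # e.g. https://<account>.blob.core.windows.net/<container>/<path>
    file_path = event.subject    # e.g. /blobServices/default/containers/<container>/blobs/<path>

    metadata = {
        "fileName": blob_url.rsplit("/", 1)[-1] if blob_url else None,
        "filePath": file_path,
        "url": blob_url,
        "eventTime": str(event.event_time),
    }
    logging.info("New file created: %s", json.dumps(metadata))
    # From here the metadata can be forwarded to an Event Hub output binding
    # or sent with the azure-eventhub SDK.
```

The same subscription could instead deliver events straight to an Event Hub as the Event Grid handler if no transformation of the metadata is needed.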

Related

change the access tier of file blob storage from hot tier to archive tier in ADF pipeline

I would like to copy my files from one container to another container using an ADF pipeline, and while copying I have to change the access tier from Hot to Archive.
I have to achieve this using an ADF pipeline. A way to do it without using a custom activity would be great.
I don't see a direct property in any activity to achieve this, but you can try one of the methods below, using the Web activity in Azure Data Factory and Azure Synapse Analytics:
Call the Copy Blob REST API, where the x-ms-access-tier header specifies the tier to be set on the target blob.
Or, after the Copy activity, call the Set Blob Tier REST API, where the x-ms-access-tier header indicates the tier to be set on the blob. A sketch of that call is shown below.
Of course, you would have to use parameters to make this dynamically executable when multiple files are involved.
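To make the Set Blob Tier option concrete, here is a minimal sketch of the REST call in Python; the account, container, blob path, and SAS token are placeholders, and in ADF the same PUT request would be issued from a Web activity with the URL and headers built from pipeline parameters.

```python
import requests

# Placeholders: substitute your storage account, container, blob path and a SAS token
# with write permission on the blob.
account = "<storage-account>"
container = "<container>"
blob_path = "<folder>/<file>.csv"
sas_token = "<sas-token>"

# Set Blob Tier: PUT <blob-url>?comp=tier with the x-ms-access-tier header.
url = f"https://{account}.blob.core.windows.net/{container}/{blob_path}?comp=tier&{sas_token}"
headers = {
    "x-ms-access-tier": "Archive",   # target tier after the copy
    "x-ms-version": "2020-10-02",
}

response = requests.put(url, headers=headers)
response.raise_for_status()          # 200/202 indicates the tier change was accepted
print(response.status_code)
```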

Is there a way to stream data change events from Azure SQL to Azure Event Hubs at scale?

We have a few Azure SQL instances which have a bunch of databases (like one for each department in an org). We want to do stream processing on the data as and when there are updates to data in these individual databases.
From a scale perspective, we are looking at 10K events/day across all the databases, with a possible SLA of a few seconds to stream-process an event.
We want to push data to Azure Event Hubs. Can any existing Azure product offering help here?
Just some suggestions for feasible solutions.
If you are looking for a solution built on Azure services, then a Logic App should be able to meet your needs. These are the official docs for the SQL Server trigger in Logic Apps:
https://learn.microsoft.com/en-us/azure/connectors/connectors-create-api-sqlazure#add-a-sql-trigger
https://learn.microsoft.com/en-us/connectors/sql/#when-an-item-is-modified-(v2)
Alternatively, if you can send a web request from SQL Server when items change, like this:
https://www.botreetechnologies.com/blog/how-to-fire-a-web-request-from-microsoft-sql-server
then you can send it to an endpoint such as an Azure Function and use the Event Hub output binding, or just use the Event Hub SDK to send the event to Event Hub (see the sketch below).
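As a rough sketch of that last option (the connection string, hub name, and the shape of the change payload are all placeholders), an Azure Function or any small forwarding service could push the change record with the azure-eventhub SDK like this:

```python
import json

from azure.eventhub import EventHubProducerClient, EventData

# Placeholders: use your Event Hubs namespace connection string and hub name.
CONNECTION_STR = "<event-hubs-namespace-connection-string>"
EVENTHUB_NAME = "<event-hub-name>"


def forward_change_event(change_record: dict) -> None:
    """Send one database change record (e.g. from a Logic App or SQL web request) to Event Hubs."""
    producer = EventHubProducerClient.from_connection_string(
        CONNECTION_STR, eventhub_name=EVENTHUB_NAME
    )
    with producer:
        batch = producer.create_batch()
        batch.add(EventData(json.dumps(change_record)))
        producer.send_batch(batch)


forward_change_event({"database": "hr", "table": "employees", "operation": "update", "id": 42})
```

At roughly 10K events/day this is well within a single Event Hub partition's throughput, so the main design choice is the trigger side (Logic App polling interval vs. a push from SQL Server).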

How to import application properties from iot hub message enrichment into azure data explorer

We are using Azure IoT Hub and Azure IoT Edge devices for several of our customers. The devices send their telemetry data using IoT Hub telemetry messages. In IoT Hub we use message routing -> enrich messages to add the customer ID to the messages' application properties. This works pretty well if we forward the messages to Service Bus topics and consume them using Azure Functions, as the application properties are easily accessible there.
Instead of using Azure Functions, we now want to store all the telemetry data directly in Azure Data Explorer databases, and we want to split the messages into one dedicated database per customer (the name of the database is the customer ID).
So far I could not figure out how to access the application properties from IoT Hub when importing data into Data Explorer. I am not even sure that it is possible, but I really hope it is.
Unfortunately, automatically routing messages to different databases is not currently supported; please add a request to Azure Data Explorer user voice. For now, you can route the messages to a dedicated event hub per customer (i.e. per database) and create one Azure Data Explorer data connection per database.
Please note that within a data connection you can dynamically route the messages to different tables by adding the table information to the event properties (see the sample in the documentation and the sketch below).
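To illustrate the per-table routing within one data connection (a sketch only; the table name, hub settings, and mapping reference are assumptions), the sender can attach Azure Data Explorer ingestion hints such as Table and Format to each event's application properties:

```python
import json

from azure.eventhub import EventHubProducerClient, EventData

producer = EventHubProducerClient.from_connection_string(
    "<event-hubs-connection-string>", eventhub_name="<customer-event-hub>"
)

event = EventData(json.dumps({"deviceId": "edge-01", "temperature": 21.5}))
# Ingestion routing hints read by the Azure Data Explorer data connection.
event.properties = {
    "Table": "Telemetry",                              # target table in the customer's database
    "Format": "json",                                  # payload format
    "IngestionMappingReference": "TelemetryMapping",   # assumed pre-created JSON mapping
}

with producer:
    batch = producer.create_batch()
    batch.add(event)
    producer.send_batch(batch)
```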

Azure Stream Analytics Output to Data Lake Storage Gen2 with System-Assigned Managed Identity

I have a Stream Analytics job with "Use system-assigned managed identity" enabled, and I would like it to output its results to Data Lake Storage Gen2.
As far as I understand, I should only need to go into the storage account's IAM settings and add the Stream Analytics identity as a Storage Blob Data Owner. However, I don't see a category for Stream Analytics jobs in the dropdown, and I can't seem to find the service principal in any of the other ones.
Am I missing something here or is this scenario just not supported yet?
When adding the role assignment, in the Select option search for the name of your Stream Analytics job; its managed identity will show up there and you can add it. A programmatic sketch follows.
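If you prefer to assign the role programmatically rather than through the portal, a rough sketch with the azure-mgmt-authorization SDK might look like the following; the subscription, resource group, storage account, role definition GUID, and the managed identity's object ID are all placeholders, and exact model fields can vary by SDK version.

```python
import uuid

from azure.identity import DefaultAzureCredential
from azure.mgmt.authorization import AuthorizationManagementClient
from azure.mgmt.authorization.models import RoleAssignmentCreateParameters

credential = DefaultAzureCredential()
client = AuthorizationManagementClient(credential, "<subscription-id>")

# Scope: the target storage account.
scope = (
    "/subscriptions/<subscription-id>/resourceGroups/<resource-group>"
    "/providers/Microsoft.Storage/storageAccounts/<storage-account>"
)
# Built-in role definition ID for Storage Blob Data Owner (look up the GUID in your tenant).
role_definition_id = (
    scope + "/providers/Microsoft.Authorization/roleDefinitions/<storage-blob-data-owner-guid>"
)

client.role_assignments.create(
    scope,
    str(uuid.uuid4()),
    RoleAssignmentCreateParameters(
        role_definition_id=role_definition_id,
        principal_id="<stream-analytics-system-assigned-identity-object-id>",
        principal_type="ServicePrincipal",
    ),
)
```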

How to query Log Analytics data into Azure Data Explorer?

I need to query my Log Analytics workspace from Azure Data Explorer, but I didn't find any guidance on how to do it.
Below are my doubts:
1. Do I need to ingest data from Log Analytics into Azure Data Explorer before utilizing it?
2. I didn't find any way to make a connection from Log Analytics to Azure Data Explorer.
3. The only option I saw to ingest data into Azure Data Explorer is through Event Hub. But now my issue is: how can I ingest my Log Analytics data into Azure Data Explorer using Event Hub? Do I need to write any process to ingest it?
If anyone has any ideas, please share them so that I can explore the topic.
Thanks,
The Log Analytics team is working on a direct solution to ingest data into Azure Data Explorer. Meanwhile, you can set up an export of Log Analytics data to Event Hub and then ingest that data into ADX using the ingest APIs or Logic Apps. A sketch of the ingest side is shown below.
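As an illustration of the ingest-API route (a sketch only; the cluster URI, database, table, mapping name, and AAD application credentials are placeholders), queued ingestion with the azure-kusto-ingest package could look like this:

```python
from azure.kusto.data import KustoConnectionStringBuilder
from azure.kusto.data.data_format import DataFormat
from azure.kusto.ingest import IngestionProperties, QueuedIngestClient

# Placeholders: ingestion endpoint of your cluster and an AAD application with ingest rights.
kcsb = KustoConnectionStringBuilder.with_aad_application_key_authentication(
    "https://ingest-<cluster>.<region>.kusto.windows.net",
    "<aad-app-id>",
    "<aad-app-key>",
    "<aad-tenant-id>",
)
client = QueuedIngestClient(kcsb)

props = IngestionProperties(
    database="<database>",
    table="<table>",
    data_format=DataFormat.JSON,
    ingestion_mapping_reference="<json-mapping-name>",  # assumed pre-created JSON mapping
)

# File previously exported from Log Analytics (e.g. pulled from the Event Hub export).
client.ingest_from_file("exported_log_analytics_records.json", ingestion_properties=props)
```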