How to import application properties from IoT Hub message enrichment into Azure Data Explorer - azure-iot-hub

We are using Azure IoT Hub and Azure IoT Edge devices for several of our customers. The devices send their telemetry data using IoT Hub telemetry messages. In the IoT Hub we use message routing -> enrich messages to add the customer id to the messages' application properties. This works pretty well when we forward the messages to Service Bus topics and consume them using Azure Functions, as the application properties are easily accessible there.
Instead of using Azure Functions, we now want to store all the telemetry data directly in Azure Data Explorer databases, and we want to split the messages into one dedicated database per customer (the name of the database is the customer id).
So far I could not figure out how to access the application properties from IoT Hub when importing data into Data Explorer. I am not even sure that it is possible, but I really hope it is.

Unfortunately, automatically routing messages to different databases is not currently supported; please add a request to Azure Data Explorer user voice. For now, you can route the messages to a dedicated event hub per customer (i.e. per database) and create one Azure Data Explorer data connection per database.
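For illustration, here is a rough sketch of the "one data connection per database" step using the azure-mgmt-kusto Python SDK. The SDK choice, resource names, table name and mapping name are all assumptions for the sketch, not from the original answer:

```python
# Sketch: create one Event Hub data connection per customer database on an
# Azure Data Explorer cluster. All resource names below are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.kusto import KustoManagementClient
from azure.mgmt.kusto.models import EventHubDataConnection

subscription_id = "<subscription-id>"
client = KustoManagementClient(DefaultAzureCredential(), subscription_id)

customers = ["customer-a", "customer-b"]  # one ADX database per customer id
for customer_id in customers:
    connection = EventHubDataConnection(
        location="westeurope",
        event_hub_resource_id=(
            "/subscriptions/<subscription-id>/resourceGroups/<rg>"
            "/providers/Microsoft.EventHub/namespaces/<namespace>"
            f"/eventhubs/telemetry-{customer_id}"
        ),
        consumer_group="$Default",
        table_name="Telemetry",              # hypothetical default target table
        data_format="JSON",
        mapping_rule_name="TelemetryMapping",  # hypothetical ingestion mapping
    )
    client.data_connections.begin_create_or_update(
        resource_group_name="<rg>",
        cluster_name="<adx-cluster>",
        database_name=customer_id,           # database named after the customer id
        data_connection_name=f"ingest-{customer_id}",
        parameters=connection,
    ).result()
```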
Please note that within a data connection you can dynamically route the messages to different tables by adding the table information to the event properties; see the sample here.
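As a concrete example, here is a minimal sketch of setting those event properties with the azure-eventhub Python SDK. The property names Table, Format and IngestionMappingReference are the ones the Data Explorer data connection recognizes; the connection string, event hub, table and mapping names are placeholders:

```python
# Sketch: set ingestion routing properties on each event so a single data
# connection can route it to a specific table.
from azure.eventhub import EventHubProducerClient, EventData

producer = EventHubProducerClient.from_connection_string(
    "<event-hub-connection-string>", eventhub_name="telemetry"
)

event = EventData(b'{"deviceId": "device-1", "temperature": 21.5}')
# Properties read by the Azure Data Explorer data connection:
event.properties = {
    "Table": "Telemetry",                             # target table in the database
    "Format": "json",                                 # payload format
    "IngestionMappingReference": "TelemetryMapping",  # pre-created JSON mapping
}

with producer:
    batch = producer.create_batch()
    batch.add(event)
    producer.send_batch(batch)
```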

Related

Generate Event to Azure Event Hub on a file creation in azure DataLake

I have a requirement where I need to send file metadata information (like file name, file path, etc.) to Azure Event Hub / Azure Event Grid whenever a new file is created in an Azure Data Lake container.
What would be the ideal way to do it? Is there an out-of-the-box solution for this use case?
You can refer to the documentation on filtering events when reacting to Blob storage events, and find some tutorials and how-tos there.
Example: subscribe to Microsoft.Storage.BlobCreated as in "Use Azure Event Grid to route Blob storage events to a web endpoint (Azure portal)".
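As a hedged sketch of that setup, the following uses the azure-mgmt-eventgrid Python SDK to subscribe the storage account's BlobCreated events and deliver them to an event hub; the BlobCreated event payload carries the blob URL, which gives you the file name and path. All resource IDs and names are placeholders:

```python
# Sketch: subscribe to BlobCreated events on a Data Lake / storage account and
# deliver them to an Event Hub.
from azure.identity import DefaultAzureCredential
from azure.mgmt.eventgrid import EventGridManagementClient
from azure.mgmt.eventgrid.models import (
    EventSubscription,
    EventSubscriptionFilter,
    EventHubEventSubscriptionDestination,
)

client = EventGridManagementClient(DefaultAzureCredential(), "<subscription-id>")

storage_account_id = (
    "/subscriptions/<subscription-id>/resourceGroups/<rg>"
    "/providers/Microsoft.Storage/storageAccounts/<datalake-account>"
)
event_hub_id = (
    "/subscriptions/<subscription-id>/resourceGroups/<rg>"
    "/providers/Microsoft.EventHub/namespaces/<namespace>/eventhubs/file-events"
)

subscription = EventSubscription(
    destination=EventHubEventSubscriptionDestination(resource_id=event_hub_id),
    filter=EventSubscriptionFilter(
        included_event_types=["Microsoft.Storage.BlobCreated"],
        subject_begins_with="/blobServices/default/containers/<container>/",
    ),
)

client.event_subscriptions.begin_create_or_update(
    scope=storage_account_id,
    event_subscription_name="blob-created-to-eventhub",
    event_subscription_info=subscription,
).result()
```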

Stream Analytics - Reference Data

I have to set up a pipeline to get data into a Stream Analytics job directly from a SQL database (I should not use other layers/components, e.g. Data Factory or Event Hubs, between the SQL DB and the Stream Analytics job). But there is no stream input in my use case. I tried the approach from this link by adding a reference data input:
https://learn.microsoft.com/en-us/azure/stream-analytics/sql-reference-data
But I received the error "Query must refer to at least one data stream input". The query always expects at least one stream input.
Please let me know how to achieve this without code, only through pipelines.
Thank you.
Setting up a process that streams data directly from SQL into Azure Stream Analytics is not available. SQL can only be used as a reference input, not as a stream input, and these are the only stream inputs you can opt for:
Azure Event Hubs
Azure IoT Hub
Azure Blob storage
Azure Data Lake Storage Gen2
You can use SQL as a reference input, i.e. a reference data set / lookup table: a finite data collection that is either static or slowly changing in nature, used to perform a lookup or to enrich your data streams.
Hence, in this scenario the only way out is to stream the SQL data using Apache Kafka and then integrate with Azure Event Hubs, which can then be used directly as a stream input for Azure Stream Analytics.
Stream data as input into Stream Analytics
Use Azure Event Hubs from Apache Kafka applications
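To make the Kafka-to-Event-Hubs leg concrete, here is a minimal sketch of a Kafka producer pointed at an Event Hubs namespace's Kafka-compatible endpoint (using confluent-kafka; the namespace, topic/event hub name and payload are placeholders, and the SQL-side change capture is assumed to happen elsewhere):

```python
# Sketch: produce records from a Kafka client to Event Hubs via its Kafka
# endpoint; Stream Analytics then reads the event hub as a stream input.
import json
from confluent_kafka import Producer

producer = Producer({
    "bootstrap.servers": "<namespace>.servicebus.windows.net:9093",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "$ConnectionString",            # literal value for Event Hubs
    "sasl.password": "<event-hubs-connection-string>",
})

# Rows read from SQL (e.g. by a polling/CDC job) would be serialized and sent here.
row = {"OrderId": 42, "Status": "shipped"}
producer.produce("orders", value=json.dumps(row).encode("utf-8"))
producer.flush()
```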

Is there a way to stream data change events from Azure SQL to Azure Event Hubs at scale?

We have a few Azure SQL instances which have a bunch of databases (like one for each department in an org). We want to do stream processing on the data as and when there are updates to data in these individual databases.
From a scale perspective, we are looking at 10K events/day across all the databases, with a possible SLA of a few seconds to stream process an event.
We want to push data to Azure Event Hubs. Any existing Azure product offering can help here?
Just some suggestions for feasible solutions.
If you are looking for a solution among Azure services, then Logic Apps should be able to meet your needs; this is the official doc of the SQL Server trigger in Logic Apps:
https://learn.microsoft.com/en-us/azure/connectors/connectors-create-api-sqlazure#add-a-sql-trigger
https://learn.microsoft.com/en-us/connectors/sql/#when-an-item-is-modified-(v2)
Or, if you can send a web request from SQL Server when items change, like this:
https://www.botreetechnologies.com/blog/how-to-fire-a-web-request-from-microsoft-sql-server
Then you can send it to an endpoint such as an Azure Function and use the Event Hub output binding, or just use the Event Hub SDK to send to the event hub.
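For the last option, here is a minimal sketch of such an endpoint: an HTTP-triggered Azure Function (Python v1 model, i.e. with a function.json httpTrigger binding) that forwards the posted change record to an event hub with the Event Hub SDK. The connection string and hub name are placeholders:

```python
# Sketch: HTTP-triggered Azure Function forwarding the posted change record
# to an event hub using the azure-eventhub SDK.
import azure.functions as func
from azure.eventhub import EventHubProducerClient, EventData

EVENT_HUB_CONN_STR = "<event-hub-connection-string>"
EVENT_HUB_NAME = "sql-change-events"


def main(req: func.HttpRequest) -> func.HttpResponse:
    payload = req.get_body()  # change record posted by SQL Server / Logic App

    producer = EventHubProducerClient.from_connection_string(
        EVENT_HUB_CONN_STR, eventhub_name=EVENT_HUB_NAME
    )
    with producer:
        batch = producer.create_batch()
        batch.add(EventData(payload))
        producer.send_batch(batch)

    return func.HttpResponse("forwarded", status_code=202)
```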

Can we use Azure Logic Apps as an alternative for SQL broker

I need to migrate an existing SQL broker to Azure SQL.
Can we use Azure Logic Apps to perform the same functionality?
Suggestions, please.
Yes, you can do it using a combination of an Azure WebJob (to read messages from the source database or application), Azure Service Bus (for queuing and messaging), and a Logic App (to process the data and store it in the final database).
[Diagram: source database/application -> WebJob -> Service Bus -> Logic App -> final database]
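As a rough sketch of the WebJob leg only, assuming a polling job written in Python with pyodbc and the azure-servicebus SDK (the table, column and queue names are made up for illustration):

```python
# Sketch: poll the source database for new rows and put them on a Service Bus
# queue for the Logic App to process. Connection strings and names are placeholders.
import json
import pyodbc
from azure.servicebus import ServiceBusClient, ServiceBusMessage

SQL_CONN_STR = "<source-sql-connection-string>"
SB_CONN_STR = "<service-bus-connection-string>"
QUEUE_NAME = "broker-messages"


def poll_and_forward():
    with pyodbc.connect(SQL_CONN_STR) as conn:
        rows = conn.execute(
            "SELECT Id, Payload FROM dbo.OutboundMessages WHERE Processed = 0"
        ).fetchall()

    with ServiceBusClient.from_connection_string(SB_CONN_STR) as sb:
        with sb.get_queue_sender(QUEUE_NAME) as sender:
            for row in rows:
                sender.send_messages(
                    ServiceBusMessage(json.dumps({"id": row.Id, "payload": row.Payload}))
                )
    # Marking rows as processed after a successful send is omitted in this sketch.


if __name__ == "__main__":
    poll_and_forward()
```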

Data architecture for an occasionally connected scientific field journal solution

We are at the onset of developing a solution to handle collection and storage of scientific field data.
The solution should handle multiple Thick Windows PC field-clients attached to vehicles (trucks, boats, etc.) connected through cellular-network to a central SQL server.
The clients provide the central server, with data collected from equipment as well as manual input. The clients consume semi-static data from the central server e.g. personnel lists, and predefined data relevant to the specific task.
Connection to the server is erratic and hence the clients should be able to operate fully without connection to the central server for up to 3 hrs.
We are looking at MSMQ and Microsoft Sync Framework as options to handle client/server communication. Any insights you can provide will be much appreciated.
Implement the sync with Sync Framework over WCF. This will allow you to (among other things) compress the data with WCF behaviors. And you won't have to expose your SQL Server to the internet.
http://code.msdn.microsoft.com/Database-Sync-SQL-Server-7e88adab
and http://code.msdn.microsoft.com/Database-SyncSQL-Server-e97d1208
If you can have collisions (update data on multiple clients or both on client and server), implement a command pattern to send data to the server from the clients. Change the data locally on the clients and at the same time create a message to send to the server that does not use sync framework, but can be processed by the server with the same results. This gives you more control and flexibility.
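A minimal, language-agnostic sketch of that command pattern (Python with SQLite standing in for the client's local store; the original answer targets Sync Framework/WCF in .NET, so this is only meant to show the "change locally + queue an idempotent command" idea):

```python
# Sketch: apply the change to the local store and, in the same transaction,
# append an idempotent command to an outbox table. A separate loop sends queued
# commands whenever a connection to the server is available.
import json
import sqlite3
import uuid

db = sqlite3.connect("field_client.db")
db.execute("CREATE TABLE IF NOT EXISTS samples (id TEXT PRIMARY KEY, value REAL)")
db.execute("CREATE TABLE IF NOT EXISTS outbox (command_id TEXT PRIMARY KEY, body TEXT)")


def record_sample(sample_id: str, value: float) -> None:
    command = {
        "command_id": str(uuid.uuid4()),   # lets the server de-duplicate replays
        "type": "RecordSample",
        "sample_id": sample_id,
        "value": value,
    }
    with db:  # local change and outgoing command commit together
        db.execute("INSERT OR REPLACE INTO samples VALUES (?, ?)", (sample_id, value))
        db.execute("INSERT INTO outbox VALUES (?, ?)",
                   (command["command_id"], json.dumps(command)))


def flush_outbox(send) -> None:
    """Call with a transport function (e.g. an HTTP/WCF client) when connected."""
    for command_id, body in db.execute("SELECT command_id, body FROM outbox").fetchall():
        send(body)  # server applies the command idempotently
        with db:
            db.execute("DELETE FROM outbox WHERE command_id = ?", (command_id,))
```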
I don't know about MSMQ. You can have reliable messaging over WCF, and as long as the messages you send from the clients are idempotent and the data you send to the clients from the server is considered the overriding truth, I don't see the need for MSMQ.
If you can use SQL Express on the clients, I very much prefer the Sync Fx 2.0 approach with SQL Server change tracking, but that's a scenario unsupported by Microsoft.
Otherwise, the Sync Fx 2.1 approach with metadata tables is OK, as long as you don't have more than, say, 50 tables.
If you have more specific questions, I might know more.