Using Dataverse Synapse Link as reference data for Azure Stream Analytics

We are trying to use our Dataverse data as reference data for our Azure Stream Analytics job. The idea is to couple customer activities with their CRM profiles to create meaningful actions for them. We are currently moving from the Data Export Service (DES) to Dataverse Synapse Link, and have created the data lake where the data gets dumped; we can see it in Synapse Studio. However, Stream Analytics does not support the CDM format out of the box. It seems it can only handle CSV (with headers) and JSON formats.
What is the best approach to get our Dataverse data in as reference data for Stream Analytics (and as close to real time as possible)? Should we create a custom deserializer, use ADF, or something else?
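One workaround, since the Stream Analytics reference input accepts headered CSV or JSON but not raw CDM, is a small conversion job that reads the CDM manifest for the column names and republishes the entity as a headered CSV blob. Below is a minimal Python sketch of that idea using azure-storage-blob; the container and entity names are placeholders, and it assumes the default model.json manifest layout that Synapse Link writes.

```python
# Sketch: convert a Dataverse Synapse Link (CDM) entity into a headered CSV
# that Azure Stream Analytics can consume as reference input.
# Container and entity names below are placeholders for this example.
import csv
import io
import json

from azure.storage.blob import BlobServiceClient

CONN_STR = "<storage-connection-string>"
SOURCE_CONTAINER = "dataverse-environment"   # container Synapse Link writes to
REFERENCE_CONTAINER = "asa-reference"        # container ASA reads reference data from
ENTITY = "account"                           # Dataverse table/entity name

service = BlobServiceClient.from_connection_string(CONN_STR)
source = service.get_container_client(SOURCE_CONTAINER)

# model.json is the CDM manifest; it lists each entity's attributes in order,
# which supplies the header row the raw CSV partitions are missing.
manifest = json.loads(source.download_blob("model.json").readall())
entity = next(e for e in manifest["entities"] if e["name"] == ENTITY)
headers = [a["name"] for a in entity["attributes"]]

# Concatenate the entity's CSV partitions under a single header row.
out = io.StringIO()
writer = csv.writer(out)
writer.writerow(headers)
for blob in source.list_blobs(name_starts_with=f"{ENTITY}/"):
    if blob.name.endswith(".csv"):
        text = source.download_blob(blob.name).readall().decode("utf-8")
        for row in csv.reader(io.StringIO(text)):
            writer.writerow(row)

service.get_blob_client(REFERENCE_CONTAINER, f"{ENTITY}.csv").upload_blob(
    out.getvalue(), overwrite=True
)
```

You would run this on a schedule (for example from an Azure Function or a Synapse pipeline trigger) and point the Stream Analytics reference input at the output path, using the {date}/{time} path-pattern tokens so the job picks up refreshes.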

Related

Azure Synapse copy from Azure SQL to Data Lake table

I want to copy data from an Azure SQL table to a Data Lake Storage account table using Synapse Analytics. In the Data Lake table I want to store the table name and max ID for the incremental load. Is this possible?
If your requirement is only to transfer the data from an Azure SQL Database to a Data Lake Storage (ADLS) account, with no big data analysis required, you can simply use the Copy activity in either an Azure Data Factory (ADF) or Synapse pipeline.
ADF also allows you to perform any required transformations on your data before storing it in the destination, using the Data Flow activity.
Refer to this official tutorial to Copy data from a SQL Server database to Azure Blob storage.
Now, coming to incremental load: ADF and Synapse pipelines both provide complete built-in support for it. You need to select a column as the watermark column in your source table.
Watermark column in the source data store, which can be used to slice the new or updated records for every run. Normally, the data in this selected column (for example, last_modify_time or ID) keeps increasing when rows are created or updated. The maximum value in this column is used as a watermark.
Microsoft provides a complete step-by-step tutorial, Incrementally load data from Azure SQL Database to Azure Blob storage using the Azure portal, which you can follow and adapt to your use case.
Apart from the watermark technique, there are other methods you can choose from to manage incremental load. Check here.
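For intuition, here is the core of the watermark pattern that the tutorial automates, as a minimal Python/pyodbc sketch; the table, watermark-store, and column names are hypothetical.

```python
# Sketch of the watermark pattern outside ADF, assuming a last-modified
# datetime column; table and column names are placeholders.
import pyodbc

conn = pyodbc.connect("<azure-sql-connection-string>")
cur = conn.cursor()

# 1. Read the watermark stored by the previous run.
cur.execute("SELECT WatermarkValue FROM dbo.WatermarkTable WHERE TableName = ?",
            "dbo.Orders")
old_watermark = cur.fetchone()[0]

# 2. Pull only rows created or updated since the last run.
cur.execute(
    "SELECT * FROM dbo.Orders WHERE last_modify_time > ? ORDER BY last_modify_time",
    old_watermark,
)
rows = cur.fetchall()  # ... write these rows to ADLS here ...

# 3. Advance the watermark to the max value just copied.
if rows:
    new_watermark = max(r.last_modify_time for r in rows)
    cur.execute(
        "UPDATE dbo.WatermarkTable SET WatermarkValue = ? WHERE TableName = ?",
        new_watermark, "dbo.Orders",
    )
    conn.commit()
```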

Azure Stream Analytics Output to Data Lake Storage Gen2 with System-Assigned Managed Identity

I have a Stream Analytics job with Use System-assigned Managed Identity enabled, and I would like to output its results to Data Lake Storage Gen2.
As far as I understand, I should only need to go into the storage account's IAM settings and add the Stream Analytics identity as a Storage Blob Data Owner. However, I don't see a category for Stream Analytics jobs in the dropdown, and I can't seem to find the service principal in any of the other ones.
Am I missing something here or is this scenario just not supported yet?
In the Add role assignment pane, choose the Storage Blob Data Owner role, then under Select, search for the name of your Stream Analytics job; it will show up and you can add it.
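If you prefer to script the assignment rather than click through the portal, here is a sketch using the azure-mgmt-authorization SDK. All resource names and IDs are placeholders, and the role GUID shown is the built-in ID for Storage Blob Data Owner at the time of writing.

```python
# Sketch: grant the job's system-assigned identity "Storage Blob Data Owner"
# on the storage account via the SDK instead of the portal.
# All names and IDs below are placeholders.
import uuid

from azure.identity import DefaultAzureCredential
from azure.mgmt.authorization import AuthorizationManagementClient
from azure.mgmt.authorization.models import RoleAssignmentCreateParameters

SUBSCRIPTION_ID = "<subscription-id>"
SCOPE = (
    f"/subscriptions/{SUBSCRIPTION_ID}/resourceGroups/<rg>"
    "/providers/Microsoft.Storage/storageAccounts/<storage-account>"
)
# Built-in role definition GUID for "Storage Blob Data Owner" (assumed current).
ROLE_ID = "b7e6dc6d-f1e8-4753-8033-0f276bb0955b"
PRINCIPAL_ID = "<object-id-of-the-stream-analytics-job's-managed-identity>"

client = AuthorizationManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)
client.role_assignments.create(
    scope=SCOPE,
    role_assignment_name=str(uuid.uuid4()),
    parameters=RoleAssignmentCreateParameters(
        role_definition_id=(
            f"{SCOPE}/providers/Microsoft.Authorization/roleDefinitions/{ROLE_ID}"
        ),
        principal_id=PRINCIPAL_ID,
        principal_type="ServicePrincipal",  # managed identities assign as service principals
    ),
)
```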

Can we access data from ADLA tables by creating an OData source using REST APIs?

Is there a way to establish OData on top of an Azure Data Lake Analytics table via REST APIs?
It seems there are REST APIs to get ADLA job information, account information, etc.
Are there any such existing APIs to get data, or is it possible to create an API to access data via the OData concept?
If you want to access data in ADLS files, there are REST APIs to get data from the lake. ADLS supports WebHDFS APIs with OAuth.
If you want to send queries and see their results or get U-SQL table data, you would have to write your own shim that translates the query you submit via your API into a U-SQL script that outputs a file, then transparently downloads the file and returns it as the result.
Note that so-called interactive support is on the roadmap and being worked on. Once that is available, you can access the data using standard query APIs (such as ODBC, JDBC, etc.).
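To make the first option (reading lake files over REST) concrete, here is a minimal sketch of reading a file from ADLS Gen1 through its WebHDFS-compatible endpoint with an OAuth bearer token; the account name and file path are placeholders.

```python
# Sketch: read a file from ADLS Gen1 via its WebHDFS-compatible REST API
# using an AAD OAuth bearer token. Account and path are placeholders.
import requests
from azure.identity import DefaultAzureCredential

ACCOUNT = "<adls-account>"
PATH = "output/result.csv"

# ADLS Gen1 accepts AAD tokens issued for the https://datalake.azure.net/ resource.
token = DefaultAzureCredential().get_token("https://datalake.azure.net/.default").token

resp = requests.get(
    f"https://{ACCOUNT}.azuredatalakestore.net/webhdfs/v1/{PATH}",
    params={"op": "OPEN", "read": "true"},   # WebHDFS OPEN operation
    headers={"Authorization": f"Bearer {token}"},
)
resp.raise_for_status()
print(resp.text)
```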

Is it possible to export data from MS Azure SQL directly Into the Azure Table Storage?

Is there any direct way within the Azure MSSQL ecosystem to export a returned SQL data set into Azure Table Storage?
Something like BCP but with the Table Storage connection string on the -Output end?
There is a service named Azure Data Factory which can directly copy data from Azure SQL Database to Azure Table Storage, and even between other supported data stores; see the section Supported data stores of the article "Data movement and the Copy Activity: migrating data to the cloud and between cloud stores". Note, however, that it is a web service, not a command-line tool like BCP.
You can refer to the tutorial Build your first Azure data factory using Azure Portal/Data Factory Editor to learn how to use it.
For reference, see the articles Move data to and from Azure SQL Database using Azure Data Factory and Move data to and from Azure Table using Azure Data Factory to understand how it works.
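If you do want something closer to BCP in spirit (a one-shot script rather than a pipeline), a small client-side copy is easy to sketch. Here is a hypothetical example with pyodbc and the azure-data-tables SDK; the table and column names are placeholders, and the partition/row key choice is arbitrary and should match your query pattern.

```python
# Sketch of a BCP-style alternative: read a result set with pyodbc and write
# entities to Azure Table Storage. Table and column names are placeholders.
import pyodbc
from azure.core.exceptions import ResourceExistsError
from azure.data.tables import TableClient

table = TableClient.from_connection_string(
    "<table-storage-connection-string>", table_name="Customers"
)
try:
    table.create_table()
except ResourceExistsError:
    pass  # table already exists

conn = pyodbc.connect("<azure-sql-connection-string>")
cur = conn.cursor()
cur.execute("SELECT CustomerId, Region, Name, Email FROM dbo.Customers")
columns = [d[0] for d in cur.description]

for row in cur.fetchall():
    record = dict(zip(columns, row))
    entity = {
        "PartitionKey": str(record.pop("Region")),    # arbitrary key choice
        "RowKey": str(record.pop("CustomerId")),
        **record,
    }
    table.upsert_entity(entity)
```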

Error trying to move data from Azure Table Storage to Data Lake Store with Data Factory

I've been building a Data Factory pipeline to move data from my Azure Table Storage to a Data Lake Store, but the tasks fail with an exception that I can't find any information on. The error is
Copy activity encountered a user error: ErrorCode=UserErrorTabularCopyBehaviorNotSupported,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=CopyBehavior property is not supported if the source is tabular data source.,Source=Microsoft.DataTransfer.ClientLibrary,'.
I don't know whether the problem lies in the datasets, the linked services, or the pipeline, and I can't seem to find any info at all on the error I'm seeing in the console.
Since copying directly from Azure Table Storage to Azure Data Lake Store is not currently supported, as a temporary workaround you could go from Azure Table Storage to Azure Blob Storage, and then to Azure Data Lake Store:
Azure Table Storage to Azure Blob Storage
Azure Blob Storage to Azure Data Lake Store
I know this is not an ideal solution, but if you are under time constraints, it is just an intermediary step to get the data into the data lake.
HTH
The 'copyBehavior' property is not supported for Table Storage (which is not a file-based store), which you are trying to use as a source in the ADF copy activity. That is the reason why you are seeing this error message; removing the copyBehavior setting from the activity should resolve it.