Azure Stream Analytics Output to Data Lake Storage Gen2 with System-Assigned Managed Identity - azure-storage

I have a Stream Analytics job with "Use System-assigned Managed Identity" enabled, and I would like it to output its results to Data Lake Storage Gen2.
As far as I understand, I should only need to go into the storage account's IAM settings and add the Stream Analytics identity as a Storage Blob Data Owner. However, I don't see a Stream Analytics Jobs category in the dropdown, and I can't seem to find the service principal in any of the other categories.
Am I missing something here or is this scenario just not supported yet?

When adding the role assignment, search for the name of your Stream Analytics job in the Select box; it will show up there and you can add it.
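If you prefer to script it, here is a minimal sketch of the same role assignment done with the Python management SDK, assuming a current version of the azure-identity and azure-mgmt-authorization packages. The subscription, resource group, storage account, and principal ID are placeholders; the GUID is the built-in Storage Blob Data Owner role definition, and the principal ID is the object ID shown for the job's system-assigned identity.

```python
# Hypothetical sketch: grant the job's managed identity "Storage Blob Data Owner"
# on the storage account via the ARM role assignment API. All angle-bracket
# values are placeholders to fill in.
import uuid

from azure.identity import DefaultAzureCredential
from azure.mgmt.authorization import AuthorizationManagementClient

subscription_id = "<subscription-id>"
client = AuthorizationManagementClient(DefaultAzureCredential(), subscription_id)

# Scope of the assignment: the Data Lake Storage Gen2 account.
scope = (
    f"/subscriptions/{subscription_id}/resourceGroups/<resource-group>"
    "/providers/Microsoft.Storage/storageAccounts/<storage-account>"
)

# Built-in "Storage Blob Data Owner" role definition.
role_definition_id = (
    f"/subscriptions/{subscription_id}/providers/Microsoft.Authorization"
    "/roleDefinitions/b7e6dc6d-f1e8-4753-8033-0f276bb0955b"
)

client.role_assignments.create(
    scope,
    str(uuid.uuid4()),  # role assignment names are GUIDs
    {
        "role_definition_id": role_definition_id,
        "principal_id": "<job-principal-id>",  # the job's system-assigned identity
        "principal_type": "ServicePrincipal",
    },
)
```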

Related

Using Dataverse Synapse Link as reference data for Azure Stream Analytics

We are trying to use our Dataverse data as reference data for our Azure Stream Analytics. The idea is to couple customer activities with their CRM profile to create meaningful actions for them. We are currently moving from DES to the Dataverse Synapse Link, have created the data lake where the data gets dumped, and can see it in Synapse Studio. However, Stream Analytics does not take the CDM format out of the box; it seems it can only handle CSV (with headers) and JSON formats.
What is the best approach to get our Dataverse data in as reference data for Stream Analytics (and as close to real time as possible)? Should we create a custom deserializer, use ADF, or something else?
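There is no accepted answer here, but as a rough, non-authoritative sketch of the "ADF or something else" preprocessing idea the question is weighing: the CDM folder that Synapse Link writes contains headerless CSV partitions plus a model.json that lists each entity's attribute names, so one option is a small job that stitches the header back on and drops the result where a Stream Analytics reference data input can read it. All entity, container, and path names below are made up.

```python
# Hypothetical preprocessing sketch, not a confirmed approach: turn one headerless
# CDM partition CSV into a headered CSV for a Stream Analytics reference data input.
# Assumes the azure-storage-blob package; names and paths are placeholders.
import csv
import io
import json

from azure.storage.blob import BlobServiceClient

blobs = BlobServiceClient.from_connection_string("<data-lake-connection-string>")
cdm = blobs.get_container_client("dataverse-myorg")

# model.json describes every exported entity and its column (attribute) names.
model = json.loads(cdm.download_blob("model.json").readall())
entity = next(e for e in model["entities"] if e["name"] == "contact")
columns = [attribute["name"] for attribute in entity["attributes"]]

# Prepend the header row to one partition file of that entity.
raw = cdm.download_blob("contact/Snapshot/part-0001.csv").readall().decode("utf-8")
out = io.StringIO()
csv.writer(out).writerow(columns)
out.write(raw)

# Write it where the Stream Analytics reference data input points.
blobs.get_blob_client("asa-reference", "contact.csv").upload_blob(
    out.getvalue(), overwrite=True
)
```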

How to query Log Analytics data into Azure Data Explorer?

I need to query my Log Analytics workspace from Azure Data Explorer, but I couldn't find any guidance on how to do it.
My questions are:
1. Do I need to ingest the data from Log Analytics into Azure Data Explorer before I can use it?
2. I couldn't find any way to connect Log Analytics to Azure Data Explorer.
3. The only option I saw for ingesting data into Azure Data Explorer is through Event Hub. My issue is how I can ingest my Log Analytics data into Azure Data Explorer using Event Hub; do I need to write a process to do the ingestion?
If anyone has any pointers, please share them so I can explore further.
Thanks,
The Log Analytics team is working on a direct solution to ingest data into Azure Data Explorer. In the meantime, set up an export of your Log Analytics data to Event Hub and ingest it into ADX using the ingest APIs or a Logic App.
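As a minimal, non-authoritative sketch of the "ingest APIs" route mentioned above: once the exported records land somewhere you can read them (for example, a file produced by whatever consumes the Event Hub), they can be queued for ingestion into an ADX table. This assumes the azure-kusto-data and azure-kusto-ingest Python packages; the cluster URI, database, table, and file name are placeholders.

```python
# Hypothetical sketch of queued ingestion into Azure Data Explorer.
# Assumes azure-kusto-data / azure-kusto-ingest; all names are placeholders.
from azure.kusto.data import KustoConnectionStringBuilder
from azure.kusto.data.data_format import DataFormat
from azure.kusto.ingest import IngestionProperties, QueuedIngestClient

# Note the "ingest-" endpoint of the cluster.
kcsb = KustoConnectionStringBuilder.with_aad_device_authentication(
    "https://ingest-<cluster>.<region>.kusto.windows.net"
)
client = QueuedIngestClient(kcsb)

props = IngestionProperties(
    database="LogAnalyticsArchive",
    table="ExportedLogs",
    data_format=DataFormat.MULTIJSON,  # exported records as JSON documents
)

# One exported file per call; in practice this would run on a schedule or be
# triggered by whatever reads the Event Hub (a Function, Logic App, etc.).
client.ingest_from_file("exported_records.json", ingestion_properties=props)
```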

Write to data lake in stream analytics

Is there a way to have an output to Data Lake from Stream Analytics and use an AAD app, or something other than my own account, to write to the data lake? It is not practical to have a user account be the one that writes to the data lake.
Based on your description, I checked and tested the Azure Data Lake Store output for Azure Stream Analytics, and I found that this output does use my current account for authorization, as you mentioned.
Moreover, the Renew Data Lake Store authorization section mentions the following:
Currently, there is a limitation where the authentication token needs to be manually refreshed every 90 days for all jobs with Data Lake Store output.
For your requirement, I would suggest that you add feedback here. Alternatively, you could choose another output type for temporarily storing your results, then use a background task, triggered by the temporary output store, to retrieve the records and write them to your data lake. For that approach, you could leverage Service-to-service authentication with Data Lake Store.
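A minimal sketch of that service-to-service authentication piece, assuming the azure-datalake-store Python package and an AAD app registration that has been granted access to the Data Lake Store account (tenant, client ID/secret, store name, and output path are placeholders):

```python
# Hypothetical sketch: write to Azure Data Lake Store (Gen1) with an AAD app
# (service principal) instead of a user account. All values are placeholders.
from azure.datalake.store import core, lib

token = lib.auth(
    tenant_id="<tenant-id>",
    client_id="<aad-app-client-id>",
    client_secret="<aad-app-client-secret>",
)
adls = core.AzureDLFileSystem(token, store_name="<datalake-store-name>")

# The background task would read records from the temporary output (e.g. Blob
# storage) and write them here.
with adls.open("/output/streamanalytics/results.json", "wb") as f:
    f.write(b'{"example": "record"}\n')
```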
For now the answer is no, but it is planned to be available by the end of 2018:
https://feedback.azure.com/forums/270577-stream-analytics/suggestions/15367185-please-provide-support-for-azure-data-lake-store-o

Azure Data Lake Analytics and querying data from linked data source (blob)

I would like to access/read/store files from a linked (external) Blob storage account, not from the primary Data Lake Store account. I added a new data source (external blob) in the Azure portal.
How can I specify that I need to read from the external blob?
I found the syntax in the documentation (example below), but I do not understand what BlobContainerName means:
wasb://<BlobContainerName>#<StorageAccountName>.blob.core.windows.net/Samples/Data/SearchLog.tsv
Thank you in advance
Peter
It's the name of your container. See Blob Service Concepts, and also WASB_Path_URI (Windows Azure Blob Storage path URI).

Error trying to move data from Azure table to DataLake store with DataFactory

I've been building a Data Factory pipeline to move data from my Azure Table storage to a Data Lake Store, but the tasks fail with an exception that I can't find any information on. The error is:
Copy activity encountered a user error: ErrorCode=UserErrorTabularCopyBehaviorNotSupported,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=CopyBehavior property is not supported if the source is tabular data source.,Source=Microsoft.DataTransfer.ClientLibrary,'.
I don't know where the problem lies, whether in the datasets, the linked services, or the pipeline, and I can't seem to find any info at all on the error I'm seeing in the console.
Since this copy behavior from Azure Table Storage to Azure Data Lake Store is not currently supported, as a temporary workaround you could go from Azure Table Storage to Azure Blob Storage, and then from Azure Blob Storage to Azure Data Lake Store:
Azure Table Storage to Azure Blob Storage
Azure Blob Storage to Azure Data Lake Store
I know this is not an ideal solution, but if you are under time constraints it is just an intermediary step to get the data into the data lake (a rough sketch of the first hop, done outside Data Factory, follows after this answer).
HTH
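Purely as a hypothetical one-off sketch of that first hop (Azure Table Storage to Blob storage) done outside Data Factory, in case scripting it is quicker than building the extra pipeline. It assumes the azure-data-tables and azure-storage-blob packages; the table name, container, and connection strings are placeholders, and in Data Factory itself this hop would simply be a regular copy activity.

```python
# Hypothetical sketch: dump an Azure Storage table to a CSV blob as the staging
# step before copying it on to the Data Lake Store. All names are placeholders.
import csv
import io

from azure.data.tables import TableServiceClient
from azure.storage.blob import BlobServiceClient

tables = TableServiceClient.from_connection_string("<table-storage-connection-string>")
blobs = BlobServiceClient.from_connection_string("<blob-storage-connection-string>")

entities = list(tables.get_table_client("mytable").list_entities())

# Union of all property names across entities, so every row fits the header.
fieldnames = sorted({key for entity in entities for key in entity})

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=fieldnames, restval="")
writer.writeheader()
writer.writerows(entities)

blobs.get_blob_client("staging", "mytable.csv").upload_blob(
    buf.getvalue(), overwrite=True
)
```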
The 'CopyBehaviour' property is not supported when the source is a tabular data source, such as the Table storage (which is not a file-based store) that you are using as the source of the ADF copy activity. That is why you are seeing this error message.