How to push data from Azure Data Lake to SSAS (Azure Analysis Services)? Is it possible? - azure-data-lake

Azure Data Lake is my data source. I want to push data from Azure Data Lake to Azure Analysis Services (SSAS). How can I do that?

I think it is not supported. In the following documentation for Azure Analysis Services, Azure Data Lake is not listed in the list of data source providers:
https://opbuildstorageprod.blob.core.windows.net/output-pdf-files/en-us/Azure.azure-documents/live/analysis-services.pdf

This is supported.
You need to use compatibility level 1400. I have the latest Azure Data Lake plugin for VS2015; you would then need to add Data Lake Store as a data source.
There is a good series of blog posts here that gives you insight into building an Azure Analysis Services model on top of Blob storage. The same principle can be applied to Data Lake Store as well:
Part 1:
https://blogs.msdn.microsoft.com/analysisservices/2017/05/15/building-an-azure-analysis-services-model-on-top-of-azure-blob-storage-part-1/
Part 2:
https://blogs.msdn.microsoft.com/analysisservices/2017/05/30/building-an-azure-analysis-services-model-on-top-of-azure-blob-storage-part-2/
Part 3:
https://blogs.msdn.microsoft.com/analysisservices/2017/06/22/building-an-azure-analysis-services-model-on-top-of-azure-blob-storage-part-3/
Update
This blog post goes into detail on how to do this. The basic premise is the same as the Blob-storage-backed SSAS process; however, they introduced a Data Lake Store connector.
https://blogs.msdn.microsoft.com/analysisservices/2017/09/05/using-azure-analysis-services-on-top-of-azure-data-lake-storage/
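As a side note, before wiring up the 1400 model it can help to sanity-check that the files and credentials are reachable. Here is a minimal sketch of my own (not from the posts above), using the azure-datalake-store Python package for Data Lake Store Gen1; the tenant/app IDs, secret, store name and folder path are all placeholders:

    # Sketch: list the folder the tabular model's Data Lake Store data source will read.
    # All IDs, secrets and names below are placeholders.
    from azure.datalake.store import core, lib

    # Service-principal sign-in; the same app registration would typically also be
    # granted read access for the Analysis Services connection.
    token = lib.auth(tenant_id="<tenant-id>",
                     client_id="<app-id>",
                     client_secret="<app-secret>",
                     resource="https://datalake.azure.net/")

    adls = core.AzureDLFileSystem(token, store_name="<adls-account-name>")

    # Print the files the model will be built over.
    for path in adls.ls("/modeldata", detail=False):
        print(path)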

Related

Tibco Spotfire, Tibco Analyst/ TDV connectivity to Azure Data Lake

TIBCO Spotfire is good for dashboards, but I can't see any Azure data-source adapters in TDV. Is there any way to seamlessly connect Azure with Spotfire for real-time dashboards, perhaps without Synapse?
You can refer to this tutorial: Visualize Azure Data Lake Storage Data in TIBCO Spotfire through ADO.NET:
This article walks you through using the CData ADO.NET Provider for
Azure Data Lake Storage in TIBCO Spotfire. You will establish a
connection and create a simple dashboard.
It shows an example and may give you some guidance.

Visual Studio Tabular Model to Data Lake

How can I create a tabular model over Azure Data Lake?
I am able to create a Power BI model with Azure Data Lake and upload it to Azure Analysis Services. But I need to create a tabular model over Azure Data Lake.
When I check the data sources, I don't see Azure Data Lake. I see it in Power BI Get Data.
You need to select the compatibility level SQL Server 2017 / Azure Analysis Services (1400), and then you will have ADLS in the data sources.

Azure Data Factory Copy Activity to Copy to Azure Data Lake Table

I need to copy data incrementally from an on-prem SQL Server into a table in Azure Data Lake Store.
But when creating the Copy Activity in the Azure portal, in the destination I only see folders (no option for tables).
How can I do scheduled syncs from an on-prem table to a Data Lake table?
Data Lake Store does not have a notion of tables. It is a file storage system (like HDFS). You can, however, use capabilities such as Hive or Data Lake Analytics on top of your data stored in Data Lake Store to conform your data to a schema. In Hive, you can do that using external tables, while in Data Lake Analytics you can run a simple extract script.
I hope this helps!
Azure Data Lake Analytics (ADLA) does have the concept of databases, which have tables. However, they are not currently exposed as a target in Data Factory. I believe it's on the backlog, although I can't find the reference right now.
What you could do is use Data Factory to copy data into Data Lake Store and then run a U-SQL script that imports it into the ADLA database.
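As an illustration of that second step, here is a rough sketch: the U-SQL conforms a CSV that Data Factory landed in the store to a schema and loads it into an ADLA database table, and the job is submitted with the older azure-mgmt-datalake-analytics Python package. The database, table, column and path names are made up for the example, and the exact SDK calls may differ between versions (you could just as well paste the script into the portal or Visual Studio):

    # Sketch only: import a landed CSV into an ADLA database table.
    # Names/paths are invented; SDK calls follow the older azure-mgmt-datalake-analytics package.
    import uuid
    from azure.common.credentials import ServicePrincipalCredentials
    from azure.mgmt.datalake.analytics.job import DataLakeAnalyticsJobManagementClient
    from azure.mgmt.datalake.analytics.job.models import JobInformation, USqlJobProperties

    USQL_SCRIPT = r"""
    CREATE DATABASE IF NOT EXISTS Staging;

    CREATE TABLE IF NOT EXISTS Staging.dbo.Customers
    (
        CustomerId int,
        Name string,
        ModifiedDate DateTime,
        INDEX idx_Customers CLUSTERED (CustomerId ASC)
            DISTRIBUTED BY HASH (CustomerId)
    );

    // Schema-on-read over the file Data Factory copied into the store.
    @rows =
        EXTRACT CustomerId int,
                Name string,
                ModifiedDate DateTime
        FROM "/landing/customers.csv"
        USING Extractors.Csv(skipFirstNRows : 1);

    INSERT INTO Staging.dbo.Customers
    SELECT * FROM @rows;
    """

    credentials = ServicePrincipalCredentials(client_id="<app-id>",
                                              secret="<app-secret>",
                                              tenant="<tenant-id>")
    job_client = DataLakeAnalyticsJobManagementClient(credentials, "azuredatalakeanalytics.net")

    job_id = str(uuid.uuid4())
    job_client.job.create("<adla-account-name>", job_id,
                          JobInformation(name="Import customers",
                                         type="USql",
                                         properties=USqlJobProperties(script=USQL_SCRIPT)))
    print("Submitted U-SQL job", job_id)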
If you do feel this is an important feature, you can create a request here and vote for it:
https://feedback.azure.com/forums/327234-data-lake
ADLA Databases and tables:

How to move a SharePoint list or Excel file to Azure SQL DW?

I want to copy data from SharePoint to Microsoft Azure SQL DW using Azure Data Factory or an alternative service. Can I do this? Can anyone please help me with this?
You can do this by setting up a data pipeline using Azure Data Factory to Azure Blob storage. Afterwards, you can use Azure's fast PolyBase technology to load the data from Blob storage into your SQL Data Warehouse instance.
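To make the PolyBase step concrete, here is a rough sketch run against the Data Warehouse with pyodbc. The connection string, storage key, container, schema and column list are placeholders (and it assumes a database master key and the ext schema already exist); the columns would need to match whatever files the Data Factory pipeline dropped into Blob storage:

    # Sketch: load CSVs staged in Blob storage into SQL Data Warehouse via PolyBase.
    # Connection string, storage key, container and columns are placeholders.
    import pyodbc

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=<server>.database.windows.net;DATABASE=<dw-database>;"
        "UID=<user>;PWD=<password>",
        autocommit=True)
    cur = conn.cursor()

    statements = [
        # Credential + external data source pointing at the staging container.
        """CREATE DATABASE SCOPED CREDENTIAL StagingBlobCred
           WITH IDENTITY = 'user', SECRET = '<storage-account-key>'""",
        """CREATE EXTERNAL DATA SOURCE StagingBlob
           WITH (TYPE = HADOOP,
                 LOCATION = 'wasbs://<container>@<storageaccount>.blob.core.windows.net',
                 CREDENTIAL = StagingBlobCred)""",
        # File format for the exported CSVs (skip the header row).
        """CREATE EXTERNAL FILE FORMAT CsvFormat
           WITH (FORMAT_TYPE = DELIMITEDTEXT,
                 FORMAT_OPTIONS (FIELD_TERMINATOR = ',', FIRST_ROW = 2))""",
        # External table over the staged files, then a distributed copy via CTAS.
        """CREATE EXTERNAL TABLE ext.SharePointItems
           (Id INT, Title NVARCHAR(255), Modified DATETIME2)
           WITH (LOCATION = '/sharepoint/', DATA_SOURCE = StagingBlob, FILE_FORMAT = CsvFormat)""",
        """CREATE TABLE dbo.SharePointItems
           WITH (DISTRIBUTION = ROUND_ROBIN)
           AS SELECT * FROM ext.SharePointItems""",
    ]
    for statement in statements:
        cur.execute(statement)
    conn.close()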
Can I ask how much data you intend on loading into the DW? Azure SQL Data Warehouse is intended for at least terabyte-level data, scaling up to petabyte-level compute and storage. I only ask because each SharePoint list or Excel file has a maximum of 2 GB per file.

Error trying to move data from Azure table to DataLake store with DataFactory

I've been building a Data Factory pipeline to move data from my Azure Table storage to a Data Lake Store, but the tasks fail with an exception that I can't find any information on. The error is:
Copy activity encountered a user error: ErrorCode=UserErrorTabularCopyBehaviorNotSupported,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=CopyBehavior property is not supported if the source is tabular data source.,Source=Microsoft.DataTransfer.ClientLibrary,'.
I don't know where the problem lies, whether in the datasets, the linked services, or the pipeline, and I can't seem to find any information at all on the error I'm seeing in the console.
Since the copy behavior from Azure Table storage to Azure Data Lake Store is not currently supported, as a temporary workaround you could go from Azure Table storage to Azure Blob storage, and then from Azure Blob storage to Azure Data Lake Store:
Azure Table Storage to Azure Blob Storage
Azure Blob Storage to Azure Data Lake Store
I know this is not an ideal solution, but if you are under time constraints, it is just an intermediate step to get the data into the data lake.
HTH
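If the table is small and you just need the data in the lake quickly, a hand-rolled script is another stopgap that skips Data Factory altogether. This is purely a sketch of mine (not the workaround described above), assuming the older azure-storage TableService and azure-datalake-store packages; the account names, keys, table name and target path are placeholders:

    # Sketch: dump an Azure Table to CSV and write it straight into Data Lake Store.
    # Uses the older azure-storage TableService and azure-datalake-store (Gen1) packages;
    # every name, key and path below is a placeholder.
    import csv
    import io
    from azure.storage.table import TableService
    from azure.datalake.store import core, lib

    table_service = TableService(account_name="<storage-account>", account_key="<storage-key>")
    entities = list(table_service.query_entities("MyTable"))

    # Flatten the entities into CSV (each entity behaves like a dict).
    columns = sorted({key for entity in entities for key in entity.keys()})
    buffer = io.StringIO()
    writer = csv.DictWriter(buffer, fieldnames=columns, extrasaction="ignore")
    writer.writeheader()
    for entity in entities:
        writer.writerow(dict(entity))

    # Push the CSV into the Data Lake Store account.
    token = lib.auth(tenant_id="<tenant-id>", client_id="<app-id>",
                     client_secret="<app-secret>", resource="https://datalake.azure.net/")
    adls = core.AzureDLFileSystem(token, store_name="<adls-account>")
    with adls.open("/staging/mytable.csv", "wb") as f:
        f.write(buffer.getvalue().encode("utf-8"))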
The 'CopyBehavior' property is not supported when Table storage (which is not a file-based store) is used as the source of an ADF copy activity. That is why you are seeing this error message.
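In other words, it is the copyBehavior setting on the sink that triggers the error, not the Data Lake Store destination itself. Below is an illustrative fragment only (written as a Python dict for readability, with invented names); leaving copyBehavior off the sink should clear UserErrorTabularCopyBehaviorNotSupported, since that property only applies to file-based sources:

    # Illustrative copy-activity typeProperties when reading from Azure Table storage.
    # Note there is no "copyBehavior" on the sink: it only applies to file-based sources.
    copy_activity_type_properties = {
        "source": {
            "type": "AzureTableSource",
            # Optionally restrict what is read from the table, e.g.:
            # "azureTableSourceQuery": "PartitionKey eq 'p1'",
        },
        "sink": {
            "type": "AzureDataLakeStoreSink",
        },
    }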