Function app to load data from .csv in Azure Data Lake Store to Azure SQL Server

I have a .csv file in Azure Data Lake Store. Is there a way to load the data from that file into Azure SQL Server using a Function App? Currently I am using ADF to load the data.
Thanks,
Rav

There is an Azure Data Lake Store binding for Azure Functions. It supports input and output bindings, and samples are available with the binding's documentation.
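If the binding does not fit your scenario, you can also read the file and write to SQL yourself inside the function. Below is a minimal sketch in Python, assuming the azure-datalake-store and pyodbc packages; the account name, file path, target table, and connection-string setting are all hypothetical placeholders.

```python
import csv
import io
import os

import pyodbc
from azure.datalake.store import core, lib

ADLS_ACCOUNT = "mydatalakestore"            # assumption: ADLS Gen1 account name
CSV_PATH = "/input/people.csv"              # assumption: path of the source .csv
SQL_CONN_STR = os.environ["SQL_CONN_STR"]   # assumption: ODBC string in app settings

def load_csv_to_sql():
    # Authenticate against Azure AD and open the Data Lake Store filesystem.
    token = lib.auth()  # in a deployed app, pass tenant_id/client_id/client_secret
    adls = core.AzureDLFileSystem(token, store_name=ADLS_ACCOUNT)

    # Read the whole CSV file from the lake and parse it.
    with adls.open(CSV_PATH, "rb") as f:
        text = f.read().decode("utf-8")
    rows = list(csv.reader(io.StringIO(text)))
    header, data = rows[0], rows[1:]

    # Bulk-insert the rows into Azure SQL; dbo.People is a hypothetical table
    # whose column order matches the CSV header.
    with pyodbc.connect(SQL_CONN_STR) as conn:
        cursor = conn.cursor()
        cursor.fast_executemany = True
        placeholders = ", ".join("?" for _ in header)
        cursor.executemany(f"INSERT INTO dbo.People VALUES ({placeholders})", data)
        conn.commit()
```

You would wire load_csv_to_sql into whatever trigger suits you (a timer, a queue message, or an Event Grid notification when the file lands).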

Related

How to store data either in Parquet or JSON files in "Azure Datalake Gen2 blob storage" before purging data from an Azure SQL table

I want to store data from an Azure SQL table as either Parquet or JSON files in Azure Data Lake Gen2 blob storage before purging the data from the table. What are the steps and services required to achieve this?
You can use the Copy activity in Azure Data Factory to store data from a SQL table as Parquet files. Rather simplified, but here are the steps, just to give you an idea.
If the source SQL table is on-premises, install a self-hosted integration runtime on the on-premises server.
Connect to the source table with a linked service and a dataset, using the integration runtime from the step above.
Connect to Azure Data Lake Storage Gen2 from ADF, again using a linked service.
Configure the Copy activity with that source and sink, as sketched below.
Read more details in the Copy activity documentation.
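To make the last step concrete, here is a minimal sketch using the azure-mgmt-datafactory Python SDK; the subscription, resource group, factory, dataset, and pipeline names are hypothetical, and the linked services and datasets from the earlier steps are presumed to already exist.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    CopyActivity, DatasetReference, ParquetSink, PipelineResource, SqlServerSource,
)

SUBSCRIPTION_ID = "<subscription-id>"   # hypothetical
RESOURCE_GROUP = "my-rg"                # hypothetical
FACTORY = "my-adf"                      # hypothetical

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Copy activity: on-premises SQL table (reached through the self-hosted IR
# configured on the source dataset's linked service) to Parquet files in
# ADLS Gen2 (behind the sink dataset).
copy = CopyActivity(
    name="SqlToParquet",
    inputs=[DatasetReference(reference_name="SourceSqlTableDataset")],
    outputs=[DatasetReference(reference_name="SinkParquetDataset")],
    source=SqlServerSource(),
    sink=ParquetSink(),
)

client.pipelines.create_or_update(
    RESOURCE_GROUP, FACTORY, "CopySqlToParquetPipeline",
    PipelineResource(activities=[copy]),
)
```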

How to push data from Azure Data Lake to SSAS (Azure Analysis Services)? Is it possible?

Azure Data Lake is my data source. I want to push data from Azure Data Lake to Azure Analysis Services (SSAS). How can I do that?
I think it is not supported. In the following documentation for Azure Analysis Services, Azure Data Lake is not listed in the data source providers list:
https://opbuildstorageprod.blob.core.windows.net/output-pdf-files/en-us/Azure.azure-documents/live/analysis-services.pdf
This is supported.
You need to use compatibility level 1400. I have the latest Azure Data Lake plugin for VS2015; you would need to add Data Lake Store as a data source.
There is a good series of blog posts which gives you insight into building an Azure Analysis Services model on top of Blob storage. The same principle can be applied to Data Lake Store as well:
Part 1: https://blogs.msdn.microsoft.com/analysisservices/2017/05/15/building-an-azure-analysis-services-model-on-top-of-azure-blob-storage-part-1/
Part 2: https://blogs.msdn.microsoft.com/analysisservices/2017/05/30/building-an-azure-analysis-services-model-on-top-of-azure-blob-storage-part-2/
Part 3: https://blogs.msdn.microsoft.com/analysisservices/2017/06/22/building-an-azure-analysis-services-model-on-top-of-azure-blob-storage-part-3/
Update
This blog post goes into detail on how to do this. The basic premise is the same as the Blob-storage-backed SSAS process, but a dedicated Data Lake Store connector has since been introduced:
https://blogs.msdn.microsoft.com/analysisservices/2017/09/05/using-azure-analysis-services-on-top-of-azure-data-lake-storage/

How to move a SharePoint list or Excel file to Azure SQL DW?

I want to copy data from SharePoint to Microsoft Azure SQL DW using Azure Data Factory or an alternative service. Can I do this? Please, can anyone help me with this?
You can do this by setting up a data pipeline using Azure Data Factory to copy the data to Azure Blob storage. Afterwards you can use PolyBase to quickly load the data from Blob storage into your SQL Data Warehouse instance, as sketched below.
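For the PolyBase half, the sequence is external data source, file format, external table, then CTAS. Here is a minimal sketch run from Python with pyodbc, assuming a database master key already exists; the storage account, container, credential secret, and table and column names are all hypothetical.

```python
import os

import pyodbc

# Assumption: ODBC connection string for the SQL Data Warehouse instance.
conn = pyodbc.connect(os.environ["DW_CONN_STR"])
conn.autocommit = True  # the DDL below cannot run inside a transaction
cur = conn.cursor()

# 1. Credential and external data source pointing at the blob container
#    where Data Factory landed the exported files (names are hypothetical).
cur.execute("""
CREATE DATABASE SCOPED CREDENTIAL BlobCred
WITH IDENTITY = 'user', SECRET = '<storage-account-key>';
""")
cur.execute("""
CREATE EXTERNAL DATA SOURCE LandingZone
WITH (TYPE = HADOOP,
      LOCATION = 'wasbs://landing@mystorageacct.blob.core.windows.net',
      CREDENTIAL = BlobCred);
""")

# 2. File format and an external table over the exported CSV files; the
#    column list must match what you exported from SharePoint/Excel.
cur.execute("""
CREATE EXTERNAL FILE FORMAT CsvFormat
WITH (FORMAT_TYPE = DELIMITEDTEXT,
      FORMAT_OPTIONS (FIELD_TERMINATOR = ',', FIRST_ROW = 2));
""")
cur.execute("""
CREATE EXTERNAL TABLE dbo.SharePointList_ext (Id INT, Title NVARCHAR(255))
WITH (LOCATION = '/sharepoint/', DATA_SOURCE = LandingZone, FILE_FORMAT = CsvFormat);
""")

# 3. Fast parallel load into a regular DW table via CTAS.
cur.execute("""
CREATE TABLE dbo.SharePointList
WITH (DISTRIBUTION = ROUND_ROBIN)
AS SELECT * FROM dbo.SharePointList_ext;
""")
```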
Can I ask how much data you intend to load into the DW? Azure SQL Data Warehouse is intended for at least terabyte-scale data, with compute and storage scaling up to petabytes. I only ask because each SharePoint list or Excel file has a maximum size of 2 GB.

Is it possible to export data from MS Azure SQL directly into Azure Table Storage?

Is there any direct way within the Azure MSSQL ecosystem to export a SQL result set into Azure Table Storage?
Something like BCP, but with a Table Storage connection string on the output end?
There is a service named Azure Data Factory which can copy data directly from Azure SQL Database to Azure Table Storage, and between other supported data stores as well; see the section "Supported data stores" of the article "Data movement and the Copy Activity: migrating data to the cloud and between cloud stores". It is a web service, though, not a command-line tool like BCP.
You can refer to the tutorial "Build your first Azure data factory using the Azure Portal/Data Factory Editor" to learn how to use it.
For reference, see the articles "Move data to and from Azure SQL Database using Azure Data Factory" and "Move data to and from Azure Table using Azure Data Factory" to understand how it works.
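If you want something closer to BCP, i.e. a one-shot tool rather than a pipeline service, a short script can stream a result set into Table Storage. A minimal sketch, assuming the pyodbc and azure-data-tables packages; the query, the target table, and the PartitionKey/RowKey mapping are all hypothetical choices.

```python
import os

import pyodbc
from azure.data.tables import TableServiceClient

# Assumptions: both connection strings are supplied via environment variables.
sql_conn = pyodbc.connect(os.environ["SQL_CONN_STR"])
tables = TableServiceClient.from_connection_string(os.environ["STORAGE_CONN_STR"])
table = tables.create_table_if_not_exists("Orders")  # hypothetical target table

cursor = sql_conn.cursor()
cursor.execute("SELECT OrderId, CustomerId, Total FROM dbo.Orders")  # hypothetical

for order_id, customer_id, total in cursor:
    # Every entity needs a PartitionKey and RowKey; mapping customer to
    # partition and order id to row is just one reasonable choice.
    table.upsert_entity({
        "PartitionKey": str(customer_id),
        "RowKey": str(order_id),
        "Total": float(total),  # Table Storage has no Decimal type
    })
```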

Error trying to move data from Azure Table storage to Data Lake Store with Data Factory

I've been building a Data Factory pipeline to move data from my Azure Table storage to a Data Lake Store, but the tasks fail with an exception that I can't find any information on. The error is:
Copy activity encountered a user error: ErrorCode=UserErrorTabularCopyBehaviorNotSupported,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=CopyBehavior property is not supported if the source is tabular data source.,Source=Microsoft.DataTransfer.ClientLibrary,'.
I don't know where the problem lies: in the datasets, the linked services, or the pipeline. And I can't seem to find any information at all on the error I'm seeing in the console.
Since the copy behavior from Azure Table Storage to Azure Data Lake Store is not currently supported, as a temporary workaround you could go from Azure Table Storage to Azure Blob Storage, and then from Azure Blob Storage to Azure Data Lake Store:
Azure Table Storage to Azure Blob Storage
Azure Blob Storage to Azure Data Lake Store
I know this is not an ideal solution, but if you are under time constraints it is just an intermediate step to get the data into the data lake.
HTH
The 'copyBehavior' property is not supported when Table storage (which is not a file-based store) is used as the source of an ADF Copy activity. That is why you are seeing this error message.
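Putting both answers together: the sketch below chains the two hops in one pipeline and simply omits copyBehavior on the first hop, because its source is tabular. It uses the ADF v2 Python SDK (azure-mgmt-datafactory) as an assumption; the dataset and pipeline names are hypothetical and the linked services are presumed to already exist.

```python
from azure.mgmt.datafactory.models import (
    ActivityDependency, AzureDataLakeStoreSink, AzureTableSource,
    BlobSink, BlobSource, CopyActivity, DatasetReference, PipelineResource,
)

# Hop 1: Table storage -> Blob storage. The source is tabular, so the sink
# must NOT set copy_behavior -- setting it is exactly what raises
# UserErrorTabularCopyBehaviorNotSupported.
hop1 = CopyActivity(
    name="TableToBlob",
    inputs=[DatasetReference(reference_name="AzureTableDataset")],
    outputs=[DatasetReference(reference_name="StagingBlobDataset")],
    source=AzureTableSource(),
    sink=BlobSink(),
)

# Hop 2: Blob storage -> Data Lake Store. The source is file-based here,
# so copy_behavior is allowed if you need it.
hop2 = CopyActivity(
    name="BlobToDataLake",
    inputs=[DatasetReference(reference_name="StagingBlobDataset")],
    outputs=[DatasetReference(reference_name="DataLakeDataset")],
    source=BlobSource(),
    sink=AzureDataLakeStoreSink(copy_behavior="PreserveHierarchy"),
    depends_on=[ActivityDependency(activity="TableToBlob",
                                   dependency_conditions=["Succeeded"])],
)

pipeline = PipelineResource(activities=[hop1, hop2])
# Deploy with DataFactoryManagementClient.pipelines.create_or_update(...)
```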