Synapse pipeline to copy from Data Lake to DW error - azure-sql-database

I'm trying to import data into an Azure Dedicated SQL Pool from files in Data Lake that were exported from our on-premises SQL Servers. Some of the on-premises tables are multiple gigabytes.
Importing the data from Data Lake only works if I select the "Enable staging" option and point it at our existing Data Lake container.
My question is: if the import file is already in Data Lake, why do we need to select the "Enable staging" option to import it into the Azure Dedicated SQL Pool (DW)?

To copy data from Data Lake to a Synapse dedicated SQL pool, I followed the procedure below:
I created a linked service for the Synapse dedicated SQL pool.
I created a linked service for Azure Data Lake Storage Gen2.
I created a pipeline with a copy activity, using an Azure Data Lake Storage Gen2 delimited text dataset as the source.
I used a Synapse dedicated SQL pool dataset as the sink and enabled auto create table.
On the Settings page, I enabled staging and selected the Azure Data Lake Storage Gen2 linked service.
As per this, the Enable staging option applies if your source data is not compatible with PolyBase.
The storage is used for staging the data before it loads into Azure Synapse Analytics by using PolyBase. After the copy is complete, the interim data in Azure Data Lake Storage Gen2 is automatically cleaned up.
I debugged the pipeline and it ran successfully.
It worked for me; kindly check from your end.
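For context on why the staging option exists at all: the copy activity stages data so it can be bulk-loaded with PolyBase when the source isn't PolyBase-compatible. If your delimited files are already sitting in ADLS Gen2 in a compatible layout, you can also load them in place with the dedicated pool's COPY statement and skip the extra hop entirely. Below is a minimal sketch of that approach; the server, database, table, and storage paths are placeholders, and it assumes the pool's managed identity has read access on the storage account.

```python
# A minimal sketch (not the pipeline itself) of loading a delimited file that
# already sits in ADLS Gen2 directly into a dedicated SQL pool table with the
# COPY statement. Server, database, table, and storage paths are placeholders.
import pyodbc

conn_str = (
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=tcp:<your-workspace>.sql.azuresynapse.net,1433;"
    "Database=<your-dedicated-pool>;"
    "UID=<sql-user>;PWD=<password>;Encrypt=yes;"
)

copy_sql = """
COPY INTO dbo.SalesStaging
FROM 'https://<storage-account>.dfs.core.windows.net/<container>/exports/*.csv'
WITH (
    FILE_TYPE = 'CSV',
    FIRSTROW = 2,                                 -- skip the header row
    FIELDTERMINATOR = ',',
    CREDENTIAL = (IDENTITY = 'Managed Identity')  -- pool identity needs read access on the account
)
"""

conn = pyodbc.connect(conn_str, autocommit=True)
conn.cursor().execute(copy_sql)
conn.close()
```

The copy activity's staging remains the right choice when the source is something PolyBase or COPY cannot read directly, such as an on-premises SQL Server reached over a self-hosted integration runtime.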

Related

Data transfer from on-prem to Azure Synapse database without using ADF

Can we move data from an on-prem SQL Server to Azure Synapse without using Azure Data Factory?
My main intention is to avoid intermediate storage of the data in a blob or storage container. Bulk insert works in ADF, but do we have any other option here?
If this is a one-time activity, you can use the Data Migration Assistant tool.
Otherwise, you can also leverage SSIS to transfer data between on-prem SQL Server and Synapse.
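If the goal is specifically to keep the data out of any blob or staging container, a plain script can also stream rows straight from the on-prem server into the dedicated pool. The sketch below is only an illustration of that idea, with placeholder server, database, and table names; for multi-GB tables the SSIS or migration-tool routes above will scale much better.

```python
# A rough sketch of a direct on-prem -> Synapse transfer with no intermediate
# storage, using two pyodbc connections. Server, database, and table names are
# placeholders; for very large tables, SSIS or a migration tool will scale better.
import pyodbc

src = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};Server=onprem-sql01;"
    "Database=SalesDB;Trusted_Connection=yes;Encrypt=no;"
)
dst = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=tcp:<workspace>.sql.azuresynapse.net,1433;Database=<pool>;"
    "UID=<user>;PWD=<password>;Encrypt=yes;"
)

read_cur = src.cursor()
write_cur = dst.cursor()
write_cur.fast_executemany = True  # send parameterized inserts in batches

read_cur.execute("SELECT Id, CustomerName, Amount FROM dbo.Orders")
while True:
    rows = read_cur.fetchmany(10_000)  # stream in chunks; nothing lands on disk
    if not rows:
        break
    write_cur.executemany(
        "INSERT INTO dbo.Orders (Id, CustomerName, Amount) VALUES (?, ?, ?)",
        [tuple(r) for r in rows],
    )
    dst.commit()

src.close()
dst.close()
```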

How to store data either in Parquet or JSON files in "Azure Datalake Gen2 blob storage" before purging data in an Azure SQL table

How can I store data as either Parquet or JSON files in "Azure Datalake Gen2 blob storage" before purging that data from an Azure SQL table? What are the steps & services required to achieve this?
You can use the Copy activity in Azure Data Factory to store data from a SQL table as Parquet files. Rather simplified, but here are the steps just to give you an idea.
If the source SQL table is on-premises, install a self-hosted integration runtime on the on-premises server.
Connect to the source table using a linked service and dataset that use the integration runtime from the step above.
Connect to Azure Data Lake Storage Gen2 from ADF, again using a linked service.
Configure the copy activity with the source and sink.
Read more details here
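Outside of ADF, the same archive-then-purge idea can also be scripted directly. A rough sketch follows, assuming pandas, pyarrow, pyodbc, and the azure-storage-file-datalake package; the connection strings, container, table, and cut-off date are all placeholders.

```python
# A minimal sketch of the same archive-then-purge flow done in a script instead
# of ADF: snapshot old rows to a Parquet file in ADLS Gen2, then delete them.
# Connection strings, container, table, and the cut-off date are placeholders.
import io

import pandas as pd
import pyodbc
from azure.storage.filedatalake import DataLakeServiceClient

sql_conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};Server=tcp:<server>.database.windows.net,1433;"
    "Database=<db>;UID=<user>;PWD=<password>;Encrypt=yes;"
)

# Pull the rows to archive and serialize them as Parquet in memory (needs pyarrow).
df = pd.read_sql("SELECT * FROM dbo.AuditLog WHERE CreatedOn < '2023-01-01'", sql_conn)
buffer = io.BytesIO()
df.to_parquet(buffer, index=False)

# Write the Parquet bytes into the Gen2 container.
lake = DataLakeServiceClient.from_connection_string("<adls-gen2-connection-string>")
file_client = lake.get_file_system_client("archive").get_file_client("auditlog/2022.parquet")
file_client.upload_data(buffer.getvalue(), overwrite=True)

# Only purge once the archive file has been written successfully.
cur = sql_conn.cursor()
cur.execute("DELETE FROM dbo.AuditLog WHERE CreatedOn < '2023-01-01'")
sql_conn.commit()
```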

Azure Data Factory Copy Activity to Copy to Azure Data Lake Table

I need to Copy data incrementally from On-Prem SQL server into Table in Azure Data Lake Store.
But when creating the Copy Activity using the Azure Portal, in the Destination I only see folders (no option for tables).
How can I do scheduled On-prem table to Data Lake Table Syncs?
Data Lake Store does not have a notion of tables. It is a file storage system (like HDFS). You can, however, use capabilities such as Hive or Data Lake Analytics on top of your data stored in Data Lake Store to conform your data to a schema. In Hive, you can do that using external tables, while in Data Lake Analytics you can run a simple extract script.
I hope this helps!
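To make the Hive option concrete: an external table only registers a schema over the files the copy activity has already landed in Data Lake Store, so no data is moved again. The sketch below assumes a Hive endpoint (for example on HDInsight) that can reach the adl:// path, plus the pyhive package; the host, account, folder, and column names are placeholders.

```python
# A sketch of the Hive external-table option: register a schema over files that
# the copy activity has already landed in Data Lake Store, without moving them
# again. Assumes a Hive endpoint (e.g. HDInsight) that can reach the adl:// path
# and the pyhive package; host, account, folder, and columns are placeholders.
from pyhive import hive

ddl = """
CREATE EXTERNAL TABLE IF NOT EXISTS sales_orders (
    order_id  INT,
    customer  STRING,
    amount    DECIMAL(18, 2)
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
STORED AS TEXTFILE
LOCATION 'adl://<datalake-account>.azuredatalakestore.net/copied/orders/'
"""

conn = hive.connect(host="<hive-server>", port=10000, username="<user>")
cur = conn.cursor()
cur.execute(ddl)                                   # only metadata; the files stay where they are
cur.execute("SELECT COUNT(*) FROM sales_orders")   # the copied files now query like a table
print(cur.fetchone())
```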
Azure Data Lake Analytics (ADLA) does have the concept of databases which have tables. However they are not currently exposed as a target in Data Factory. I believe it's on the backlog, although I can't find the reference right now.
What you could do is use Data Factory to copy data into Data Lake Store then run a U-SQL script which imports it into the ADLA database.
If you do feel this is an important feature, you can create a request here and vote for it:
https://feedback.azure.com/forums/327234-data-lake
(Screenshot: ADLA databases and tables.)

Is it possible to export data from MS Azure SQL directly into Azure Table Storage?

Is there any direct way within the Azure MSSQL ecosystem to export SQL returned data set into the Azure table storage?
Something like BCP but with the Table Storage connection string on the -Output end?
There is a service named Azure Data Factory that can directly copy data from Azure SQL Database to Azure Table Storage, and between other supported data stores as well; please see the section "Supported data stores" of the article "Data movement and the Copy Activity: migrating data to the cloud and between cloud stores". Note that it is a web-based service, not a command tool like BCP.
You can refer to the tutorial "Build your first Azure data factory using the Azure Portal/Data Factory Editor" to learn how to use it.
As further references, you can refer to the articles "Move data to and from Azure SQL Database using Azure Data Factory" & "Move data to and from Azure Table using Azure Data Factory" to understand how it works.
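If you do want something closer to a BCP-style script rather than a Data Factory pipeline, the same copy can be written in a few lines with pyodbc and the azure-data-tables SDK. This is only a hypothetical sketch; the connection strings, table, and columns are placeholders, and every Table Storage entity must carry a string PartitionKey and RowKey.

```python
# A hypothetical, BCP-style script: read a result set from Azure SQL and write
# each row as an entity in Table Storage. Connection strings, the table, and the
# columns are placeholders; azure-data-tables and pyodbc are assumed installed.
import pyodbc
from azure.data.tables import TableServiceClient

sql_conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};Server=tcp:<server>.database.windows.net,1433;"
    "Database=<db>;UID=<user>;PWD=<password>;Encrypt=yes;"
)

tables = TableServiceClient.from_connection_string("<storage-account-connection-string>")
table_client = tables.create_table_if_not_exists(table_name="CustomersExport")

cur = sql_conn.cursor()
cur.execute("SELECT Region, CustomerId, Name, Balance FROM dbo.Customers")
for region, customer_id, name, balance in cur.fetchall():
    # Every Table Storage entity needs a string PartitionKey and RowKey.
    table_client.upsert_entity({
        "PartitionKey": region,
        "RowKey": str(customer_id),
        "Name": name,
        "Balance": float(balance),
    })
```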

Error trying to move data from Azure table to DataLake store with DataFactory

I've been building a Data Factory pipeline to move data from my Azure Table storage to a Data Lake store, but the tasks fail with an exception that I can't find any information on. The error is
Copy activity encountered a user error: ErrorCode=UserErrorTabularCopyBehaviorNotSupported,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=CopyBehavior property is not supported if the source is tabular data source.,Source=Microsoft.DataTransfer.ClientLibrary,'.
I don't know where the problem lies, whether in the datasets, the linked services, or the pipeline, and I can't seem to find any info at all on the error I'm seeing in the console.
Since the copy behavior from Azure Table Storage to Azure Data Lake Store is not currently supported, as a temporary workaround you could go from Azure Table Storage to Azure Blob Storage, and then from Azure Blob Storage to Azure Data Lake Store.
Azure Table Storage to Azure Blob Storage
Azure Blob Storage to Azure Data Lake Store
I know this is not an ideal solution, but if you are under time constraints, it is just an intermediary step to get the data into the data lake.
HTH
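For the first hop, a small script can also do the Table Storage to Blob copy if you'd rather not build another pipeline for it; the Blob to Data Lake Store leg can then stay a regular ADF copy, which is supported. The sketch below uses the azure-data-tables and azure-storage-blob SDKs, with placeholder account, container, and table names.

```python
# A sketch of the first hop (Table Storage -> Blob) done as a script; the
# Blob -> Data Lake Store leg can then be a regular ADF copy, which is supported.
# Account, container, and table names are placeholders.
import json

from azure.data.tables import TableServiceClient
from azure.storage.blob import BlobServiceClient

STORAGE_CONN = "<storage-account-connection-string>"

# Read every entity from the source table.
table_client = TableServiceClient.from_connection_string(STORAGE_CONN).get_table_client("SourceTable")
entities = list(table_client.list_entities())

# Serialize the entities as JSON lines and drop them into a blob container.
payload = "\n".join(json.dumps(dict(e), default=str) for e in entities)
blob_client = BlobServiceClient.from_connection_string(STORAGE_CONN).get_blob_client(
    container="staging", blob="sourcetable/export.jsonl"
)
blob_client.upload_blob(payload, overwrite=True)
```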
The 'CopyBehavior' property is not supported when Table storage (which is not a file-based store) is used as the source in an ADF copy activity. That is why you are seeing this error message.