Is it possible to export data from MS Azure SQL directly into Azure Table Storage?

Is there any direct way within the Azure MSSQL ecosystem to export a SQL query result set into Azure Table Storage?
Something like BCP, but with a Table Storage connection string on the output end?

There is a service named Azure Data Factory which can copy data directly from Azure SQL Database to Azure Table Storage, and between many other supported data stores; see the section "Supported data stores" of the article "Data movement and the Copy Activity: migrating data to the cloud and between cloud stores". Note that it is a web-based service, not a command-line tool like BCP.
You can refer to the tutorial Build your first Azure data factory using Azure Portal/Data Factory Editor to learn how to use it.
For further reference, the articles Move data to and from Azure SQL Database using Azure Data Factory and Move data to and from Azure Table using Azure Data Factory explain how the source and sink sides work.
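If a lightweight, script-style copy is acceptable instead of a Data Factory pipeline, the same result can be reached with a short program. Below is a minimal sketch, assuming the pyodbc and azure-data-tables packages; the server, database, table and column names are placeholders, not values from the question.

# Minimal sketch: copy rows from Azure SQL Database into Azure Table Storage.
# All connection details, table and column names below are placeholders.
import pyodbc
from azure.data.tables import TableServiceClient

sql_conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=tcp:myserver.database.windows.net,1433;"
    "Database=mydb;Uid=myuser;Pwd=mypassword;Encrypt=yes;"
)

table_service = TableServiceClient.from_connection_string("<storage-connection-string>")
table_client = table_service.create_table_if_not_exists("Orders")

cursor = sql_conn.cursor()
cursor.execute("SELECT CustomerId, OrderId, Amount FROM dbo.Orders")

for customer_id, order_id, amount in cursor.fetchall():
    # Every Table Storage entity needs a PartitionKey and a RowKey.
    table_client.upsert_entity({
        "PartitionKey": str(customer_id),
        "RowKey": str(order_id),
        "Amount": float(amount),
    })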

Related

Access Azure Table Storage in SQL Server

I'm trying to access Azure Table Storage in a Gen 2 data lake from Azure SQL Server, but I can't find any documentation. There is plenty on how to get to CSVs in blob storage, but nothing on Azure tables.
Any ideas?
John
Your requirement isn't feasible.
Azure Table storage is a service that stores non-relational
structured data (also known as structured NoSQL data) in the cloud,
providing a key/attribute store with a schemaless design.
Since Table storage can't be queried with T-SQL, there is no built-in way to access it from SQL Server.
I recommend first going through the Table storage concepts
before looking at how to query it.
Once you understand the Table Storage structure, you can query the tables either through the REST API or the Cosmos DB Table API, depending on your application. Refer to Querying tables and entities.
You can also follow the complete tutorial Quickstart: Build a Table API app with .NET SDK and Azure Cosmos DB to create a basic application using Table Storage for learning purposes.
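To give a flavour of what querying Table Storage looks like outside SQL Server, here is a minimal sketch using the azure-data-tables SDK; the table name and the filter's property names are hypothetical.

# Minimal sketch: query Azure Table Storage with an OData filter.
# Table name and property names are hypothetical examples.
from azure.data.tables import TableClient

table = TableClient.from_connection_string(
    "<storage-connection-string>", table_name="Customers"
)

# Filters use OData syntax rather than T-SQL.
entities = table.query_entities(query_filter="PartitionKey eq 'UK' and Age gt 30")

for entity in entities:
    print(entity["RowKey"], dict(entity))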

Can we migrate a Database from Azure Sql Database directly to Azure postgreSql Database

Is there a way to directly migrate a database from Azure SQL Database to Azure Database for PostgreSQL (Hyperscale (Citus))?
I have looked into the Azure migration services, but they do not support this particular migration route.
I have an approach in mind but don't know if it will work:
we could make a backup of the Azure SQL database in the cloud itself
and then load that backup into the Azure PostgreSQL database.
But I do not know where to put the backup. In Azure Blob Storage or somewhere else?
First, you could try the tutorial @ffffff01 provided for you.
There is another way that can help you achieve that: Data Factory can migrate the database/data from Azure SQL Database to Azure Database for PostgreSQL directly.
Refer to the tutorials below:
Copy data to and from Azure Database for PostgreSQL by using Azure Data Factory
Copy and transform data in Azure SQL Database by using Azure Data Factory
Create an Azure SQL Database dataset as the source and an Azure Database for PostgreSQL dataset as the sink.
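If Data Factory feels like overkill for a one-off migration, a table-at-a-time copy script is another option. Below is a minimal sketch using pyodbc and psycopg2; the connection details, table and column names are placeholders.

# Minimal sketch: copy one table from Azure SQL Database to Azure Database for PostgreSQL.
# All connection details, table and column names below are placeholders.
import pyodbc
import psycopg2

src = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=tcp:mysqlserver.database.windows.net,1433;"
    "Database=mydb;Uid=myuser;Pwd=mypassword;Encrypt=yes;"
)
dst = psycopg2.connect(
    host="mypg.postgres.database.azure.com",
    dbname="mydb", user="myuser", password="mypassword", sslmode="require",
)

src_cur = src.cursor()
dst_cur = dst.cursor()

src_cur.execute("SELECT id, name, amount FROM dbo.orders")
for row in src_cur.fetchall():
    dst_cur.execute(
        "INSERT INTO orders (id, name, amount) VALUES (%s, %s, %s)",
        tuple(row),
    )

dst.commit()
dst.close()
src.close()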
Hope this helps.

SQL Server 2019 - Connecting to Databricks Cluster using SSMS or Azure Data Studio

Has anyone used SSMS v18.2 or Azure Data Studio to connect to a Databricks cluster and query Databricks tables and/or the Databricks File System (DBFS)?
I'd like to know how to set this up so that a Databricks server shows up in the connections pane and PolyBase can be used to connect to DBFS.
I can connect to ADLS using PolyBase commands like the following:
-- Scoped Credential
CREATE DATABASE SCOPED CREDENTIAL myScopedCredential
WITH
IDENTITY = '<MyId>@https://login.microsoftonline.com/<Id2>/oauth2/token',
SECRET = '<MySecret>';
-- External Data Source
CREATE EXTERNAL DATA SOURCE myDataSource
WITH
(
TYPE = HADOOP,
LOCATION = 'adl://mydatalakeserver.azuredatalakestore.net',
CREDENTIAL = myScopedCredential
)
-- Something similar to setup for dbfs?
-- What IDENTITY used for Scoped Credential?
As far as I know, Azure Databricks cannot be connected to from SQL Server 2019 using SSMS or Azure Data Studio.
Azure provides a number of data sources that you can use with Azure Databricks; for the complete list, see Data sources for Azure Databricks.
The Spark connector for Microsoft SQL Server and Azure SQL Database enables Microsoft SQL Server and Azure SQL Database to act as input data sources and output data sinks for Spark jobs. It allows you to use real-time transactional data in big data analytics and persist results for ad-hoc queries or reporting.
For more details, refer "Connecting to Microsoft SQL Server and Azure SQL database with Spark connector".
Hope this helps.
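For completeness, here is what reaching Azure SQL Database from the Databricks side looks like. This is a minimal sketch run in a Databricks notebook using the generic JDBC reader (the dedicated Spark connector mentioned above has its own format name); the server, database, table and credentials are placeholders.

# Minimal sketch: read an Azure SQL Database table from a Databricks notebook (PySpark).
# All connection details and names below are placeholders.
from pyspark.sql import SparkSession

# In a Databricks notebook this returns the already-running session.
spark = SparkSession.builder.getOrCreate()

jdbc_url = (
    "jdbc:sqlserver://myserver.database.windows.net:1433;"
    "database=mydb;encrypt=true;trustServerCertificate=false;"
)

df = (
    spark.read.format("jdbc")
    .option("url", jdbc_url)
    .option("dbtable", "dbo.Orders")
    .option("user", "myuser")
    .option("password", "mypassword")
    .load()
)

df.show(10)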
This doesn't seem possible without the use of 3rd party tools or custom applications. Databricks SQL just doesn't expose the protocols necessary.
There are 3rd party tools (e.g. from CData) that can help you here. See this article: https://www.cdata.com/kb/tech/databricks-odbc-linked-server.rst

U SQL: direct output to SQL DB

Is there a way to output U-SQL results directly to a SQL DB such as Azure SQL DB? Couldn't find much about that.
Thanks!
U-SQL currently only outputs to files or internal tables (i.e. tables within ADLA databases), but you have a couple of options. Azure SQL Database has recently gained the ability to load files from Azure Blob Storage using either BULK INSERT or OPENROWSET, so you could try that. This article shows the syntax and gives a reminder that:
Azure Blob storage containers with public blobs or public containers
access permissions are not currently supported.
wasb://<BlobContainerName>@<StorageAccountName>.blob.core.windows.net/yourFolder/yourFile.txt
BULK INSERT and OPENROWSET with Azure Blob Storage is shown here:
https://blogs.msdn.microsoft.com/sqlserverstorageengine/2017/02/23/loading-files-from-azure-blob-storage-into-azure-sql-database/
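To make the blob-load route concrete, here is a minimal sketch that issues the BULK INSERT against Azure SQL Database from a small Python script with pyodbc. The external data source (created with TYPE = BLOB_STORAGE as in the linked post), the target table, and the container and file names are placeholders that must already exist.

# Minimal sketch: load a file that U-SQL wrote to Azure Blob Storage into Azure SQL Database.
# Assumes a database scoped credential, an external data source (TYPE = BLOB_STORAGE)
# and the target table already exist; all names below are placeholders.
import pyodbc

conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=tcp:myserver.database.windows.net,1433;"
    "Database=mydb;Uid=myuser;Pwd=mypassword;Encrypt=yes;",
    autocommit=True,
)

bulk_insert = """
BULK INSERT dbo.MyStagingTable
FROM 'yourFolder/yourFile.txt'
WITH (
    DATA_SOURCE = 'MyAzureBlobStorage',  -- external data source pointing at the blob container
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '0x0a',
    FIRSTROW = 2
);
"""

conn.cursor().execute(bulk_insert)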
You could also use Azure Data Factory (ADF). Its Copy Activity could load the data from Azure Data Lake Storage (ADLS) to an Azure SQL Database in two steps:
execute a U-SQL script which creates output files in ADLS (internal tables are not currently supported as a source in ADF)
move the data from ADLS to Azure SQL Database
As a final option, if your data is likely to reach larger volumes (i.e. terabytes), then you could use Azure SQL Data Warehouse, which supports PolyBase. PolyBase now supports both Azure Blob Storage and ADLS as sources.
Perhaps if you can tell us a bit more about your process we can refine which of these options is most suitable for you.

How to move sharepoint list or excel file to azure sql dw?

I want to copy data from SharePoint to Microsoft Azure SQL DW using Azure Data Factory or an alternative service. Can I do this? Can anyone please help me with this?
You can do this by setting up a pipeline with Azure Data Factory that copies the data to Azure Blob Storage. Afterwards you can use Azure's fast PolyBase technology to load the data from blob storage into your SQL Data Warehouse instance.
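For the PolyBase step, the load itself is plain T-SQL; here is a minimal sketch that issues it from Python with pyodbc once Data Factory has landed the exported list as CSV in blob storage. The external data source, external file format, column list and all object names are placeholders that would need to exist already.

# Minimal sketch: PolyBase load into Azure SQL Data Warehouse after Data Factory
# has copied the SharePoint/Excel data to Azure Blob Storage as CSV.
# The external data source, file format, columns and object names are placeholders.
import pyodbc

conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=tcp:mydw.database.windows.net,1433;"
    "Database=mydw;Uid=myuser;Pwd=mypassword;Encrypt=yes;",
    autocommit=True,
)
cur = conn.cursor()

# External table over the CSV files written by the Data Factory copy.
cur.execute("""
CREATE EXTERNAL TABLE dbo.SharePointList_ext (
    Id INT,
    Title NVARCHAR(255),
    Modified DATETIME2
)
WITH (
    LOCATION = '/sharepoint-export/',
    DATA_SOURCE = MyBlobDataSource,   -- pre-created EXTERNAL DATA SOURCE
    FILE_FORMAT = MyCsvFileFormat     -- pre-created EXTERNAL FILE FORMAT
);
""")

# CTAS pulls the data into the warehouse in parallel via PolyBase.
cur.execute("""
CREATE TABLE dbo.SharePointList
WITH (DISTRIBUTION = ROUND_ROBIN)
AS SELECT * FROM dbo.SharePointList_ext;
""")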
Can I ask how much data you intend on loading into the DW? Azure SQL Data Warehouse is intended for use with at least terabyte-scale data, up to petabyte-scale compute and storage. I only ask because each SharePoint list or Excel file has a maximum of 2 GB per file.