CETAS equivalent in Azure SQL Database

It is possible to read data from Azure Blob storage in Azure SQL Database via OPENROWSET or BULK INSERT.
But is it possible to upload a file to blob storage through any SQL command in Azure SQL DB, similar to CETAS in Azure Synapse?

Unfortunately, that seems to be a current limitation of Azure SQL Database. The link below has the details: PolyBase features and limitations
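For comparison, a minimal sketch of what CETAS looks like in an Azure Synapse dedicated SQL pool (not runnable in Azure SQL Database; the table, data source, and file format names are placeholders):
-- CETAS: write query results out as files and register them as an external table
CREATE EXTERNAL TABLE ext.SalesExport
WITH
(
    LOCATION = '/exports/sales/',      -- folder written to in the external data source
    DATA_SOURCE = myBlobDataSource,    -- hypothetical, from CREATE EXTERNAL DATA SOURCE
    FILE_FORMAT = myParquetFormat      -- hypothetical, from CREATE EXTERNAL FILE FORMAT
)
AS
SELECT *
FROM dbo.Sales;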

Related

Can we migrate a database from Azure SQL Database directly to Azure PostgreSQL Database

Is there a way to directly migrate your database in Azure SQL Database to Azure PostgreSQL Database (Hyperscale (Citus))?
I have looked into the Azure migration services, but they do not support this particular migration route.
I have an approach in mind but don't know if it will work:
we could make a backup of the Azure SQL database in the cloud itself
and then load that backup into the Azure PostgreSQL database.
But I don't know where to make the backup. In Azure Blob storage or something else?
First way: you could try the tutorial ffffff01 provided for you.
There is another way to achieve this: Azure Data Factory can migrate the database/data from Azure SQL Database to Azure PostgreSQL Database directly.
Refer to the tutorials below:
Copy data to and from Azure Database for PostgreSQL by using Azure Data Factory
Copy and transform data in Azure SQL Database by using Azure Data Factory
Create the Azure SQL database as the source dataset and the Azure PostgreSQL database as the sink.
Hope this helps.

SQL Server 2019 - Connecting to Databricks Cluster using SSMS or Azure Data Studio

Has anyone used SSMS v18.2 or Azure Data Studio to connect to a Databricks cluster, so as to query Databricks tables and/or the Databricks File System (DBFS)?
I would like to know how to set this up to show a Databricks server in the connections and use PolyBase to connect to DBFS.
I can connect to ADLS using PolyBase commands such as the following:
-- Database scoped credential (service principal client id @ OAuth 2.0 token endpoint)
CREATE DATABASE SCOPED CREDENTIAL myScopedCredential
WITH
    IDENTITY = '<MyId>@https://login.microsoftonline.com/<Id2>/oauth2/token',
    SECRET = '<MySecret>';

-- External data source pointing at the Data Lake Store account
CREATE EXTERNAL DATA SOURCE myDataSource
WITH
(
    TYPE = HADOOP,
    LOCATION = 'adl://mydatalakeserver.azuredatalakestore.net',
    CREDENTIAL = myScopedCredential
);
-- Something similar to setup for dbfs?
-- What IDENTITY used for Scoped Credential?
As far as I know, you cannot connect to an Azure Databricks cluster from SSMS or Azure Data Studio.
Azure Databricks can work with many Azure data sources; for a complete list, see Data sources for Azure Databricks.
The Spark connector for Microsoft SQL Server and Azure SQL Database enables Microsoft SQL Server and Azure SQL Database to act as input data sources and output data sinks for Spark jobs. It lets you use real-time transactional data in big data analytics and persist results for ad-hoc queries or reporting.
For more details, refer to "Connecting to Microsoft SQL Server and Azure SQL Database with the Spark connector".
Hope this helps.
This doesn't seem possible without third-party tools or custom applications; Databricks SQL just doesn't expose the necessary protocols.
There are third-party tools (e.g. from CData) that can help you here. See this article: https://www.cdata.com/kb/tech/databricks-odbc-linked-server.rst
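For what it's worth, a minimal sketch of the linked-server route those tools enable, assuming a system ODBC DSN for the Databricks cluster already exists (the server and DSN names here are hypothetical):
-- Register the ODBC DSN as a linked server through the generic MSDASQL provider
EXEC master.dbo.sp_addlinkedserver
    @server = N'DATABRICKS',          -- name used in linked-server queries
    @srvproduct = N'',
    @provider = N'MSDASQL',
    @datasrc = N'DatabricksDSN';      -- hypothetical ODBC DSN name

-- Pass a query through to Databricks and return the result set
SELECT * FROM OPENQUERY(DATABRICKS, 'SELECT * FROM default.my_table');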

U-SQL: direct output to SQL DB

Is there a way to output U-SQL results directly to a SQL DB such as Azure SQL DB? Couldn't find much about that.
Thanks!
U-SQL only currently outputs to files or internal tables (ie tables within ADLA databases), but you have a couple of options. Azure SQL Database has recently gained the ability to load files from Azure Blob Storage using either BULK INSERT or OPENROWSET, so you could try that. This article shows the syntax and gives a reminder that:
Azure Blob storage containers with public blobs or public containers
access permissions are not currently supported.
wasb://<BlobContainerName>@<StorageAccountName>.blob.core.windows.net/yourFolder/yourFile.txt
BULK INSERT and OPENROWSET with Azure Blob Storage is shown here:
https://blogs.msdn.microsoft.com/sqlserverstorageengine/2017/02/23/loading-files-from-azure-blob-storage-into-azure-sql-database/
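As a concrete illustration, a minimal sketch of that loading pattern (all names are placeholders, and the scoped credential is assumed to hold a Shared Access Signature for the container):
-- External data source of TYPE = BLOB_STORAGE, required for BULK INSERT from a container
CREATE EXTERNAL DATA SOURCE MyAzureBlobStorage
WITH
(
    TYPE = BLOB_STORAGE,
    LOCATION = 'https://mystorageaccount.blob.core.windows.net/mycontainer',
    CREDENTIAL = MyAzureBlobStorageCredential   -- hypothetical SAS-based credential
);

-- Load a delimited file from the container into an existing table
BULK INSERT dbo.MyTable
FROM 'yourFolder/yourFile.txt'
WITH
(
    DATA_SOURCE = 'MyAzureBlobStorage',
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
);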
You could also use Azure Data Factory (ADF). Its Copy Activity could load the data from Azure Data Lake Storage (ADLS) to an Azure SQL Database in two steps:
execute U-SQL script which creates output files in ADLS (internal tables are not currently supported as a source in ADF)
move the data from ADLS to Azure SQL Database
As a final option, if your data is likely to grow to larger volumes (i.e. terabytes), you could use Azure SQL Data Warehouse, which supports PolyBase. PolyBase now supports both Azure Blob Storage and ADLS as a source.
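To give a flavour of the PolyBase route, a sketch of an external table over delimited files (object names are placeholders; myBlobSource would be a HADOOP-type external data source like the one shown earlier):
-- File format describing the delimited source files
CREATE EXTERNAL FILE FORMAT TextFileFormat
WITH
(
    FORMAT_TYPE = DELIMITEDTEXT,
    FORMAT_OPTIONS (FIELD_TERMINATOR = ',')
);

-- External table whose rows are read from the files under /input/
CREATE EXTERNAL TABLE ext.MyInput
(
    id INT,
    payload NVARCHAR(200)
)
WITH
(
    LOCATION = '/input/',
    DATA_SOURCE = myBlobSource,
    FILE_FORMAT = TextFileFormat
);

SELECT * FROM ext.MyInput;   -- query the files as if they were a table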
Perhaps if you can tell us a bit more about your process we can refine which of these options is most suitable for you.

Azure SQL Database: load json file from Azure storage

I was wondering if it's possible to load a JSON file stored in an Azure storage account (blob or file share) directly from Azure SQL Database, to then leverage the new OPENJSON syntax.
I tried the following command in my Azure SQL Database:
Select * from openrowset(bulk '\\mystorage.file.core.windows.net\myshare\myfile.json', single_clob) as jsondata
but it throws me an error "You do not have permission to use the bulk load statement".
After googling around I couldn't find anything which could help me on the way.
Is it even possible ?
Thank you
OPENROWSET(BULK) and BULK INSERT T-SQL statements are not yet enabled in Azure SQL Database. You can use them in on-premises SQL Server to load files from Azure Blob Storage: https://msdn.microsoft.com/en-us/library/mt805207.aspx
I cannot confirm the exact date when these statements will be enabled in Azure SQL Database (but trust me, they will be available very soon :) ) and you will then be able to load files from Azure Blob Storage.
Stay tuned and keep monitoring Azure service updates :)
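For reference, once the statements are enabled, the pattern would look roughly like this (a sketch with placeholder names, assuming a BLOB_STORAGE external data source as in SQL Server 2017):
-- Read the whole JSON file as a single value, then shred it with OPENJSON
SELECT j.id, j.name
FROM OPENROWSET(
         BULK 'myfolder/myfile.json',
         DATA_SOURCE = 'MyAzureBlobStorage',   -- hypothetical external data source
         SINGLE_CLOB
     ) AS doc
CROSS APPLY OPENJSON(doc.BulkColumn)
    WITH (id INT '$.id', name NVARCHAR(50) '$.name') AS j;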

Is it possible to export data from MS Azure SQL directly Into the Azure Table Storage?

Is there any direct way within the Azure MSSQL ecosystem to export a SQL result set into Azure Table storage?
Something like BCP, but with a Table storage connection string on the output end?
There is a service named Azure Data Factory which can copy data directly from Azure SQL Database to Azure Table Storage, and even between other supported data stores; please see the Supported data stores section of the article "Data movement and the Copy Activity: migrating data to the cloud and between cloud stores". Note that it is a web service, not a command-line tool like BCP.
You can refer to the tutorial Build your first Azure data factory using Azure Portal/Data Factory Editor to learn how to use it.
For reference, see the articles Move data to and from Azure SQL Database using Azure Data Factory and Move data to and from Azure Table using Azure Data Factory to understand how it works.