I am trying to get access to my Azure Blob storage via SQL. This article (https://learn.microsoft.com/en-us/sql/t-sql/statements/create-external-table-transact-sql) describes how it works.
I have tried the following SQL command:
-- PolyBase only: Azure Storage Blob as data source
-- (on SQL Server 2016 and Azure SQL Data Warehouse)
CREATE EXTERNAL DATA SOURCE dataSourceNameTestBlob
WITH (
TYPE = HADOOP,
LOCATION = 'wasb[s]://container@account_name.blob.core.windows.net'
[, CREDENTIAL = credential_name ]
)
[;]
Which results in the following error:
Msg 102, Level 15, State 1, Line 5 Incorrect syntax near 'HADOOP'.
When I google this error, I find that I need to use SQL DW (.dsql) queries instead of .sql queries. However, the article mentions that I can use an Azure SQL Database.
What am I doing wrong? I just want to access my blob storage from SQL.
The PolyBase scenario with Hadoop (TYPE = HADOOP) is only supported on SQL Server 2016 (or higher), Azure SQL Data Warehouse, and Parallel Data Warehouse.
Below is the T-SQL to create an external data source for Azure Blob storage in Azure SQL Database.
CREATE EXTERNAL DATA SOURCE data_source_name
WITH (
TYPE = BLOB_STORAGE,
LOCATION = 'https://storage_account_name.blob.core.windows.net/container_name'
[, CREDENTIAL = credential_name ]
)
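For completeness, here is a minimal end-to-end sketch (the object names, password, and SAS token are placeholders) of how a BLOB_STORAGE data source is typically set up and then consumed from Azure SQL Database; the master key and database scoped credential must exist before the data source is created:
-- Hedged sketch: names, password, and SAS token below are placeholders.
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password>';
CREATE DATABASE SCOPED CREDENTIAL BlobCredential
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
     SECRET = '<SAS token without the leading ?>';
CREATE EXTERNAL DATA SOURCE MyBlobStorage
WITH (
    TYPE = BLOB_STORAGE,
    LOCATION = 'https://storage_account_name.blob.core.windows.net/container_name',
    CREDENTIAL = BlobCredential
);
-- The data source can then be referenced from BULK INSERT or OPENROWSET:
BULK INSERT dbo.SomeTable
FROM 'folder/file.csv'
WITH (DATA_SOURCE = 'MyBlobStorage', FORMAT = 'CSV', FIRSTROW = 2);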
We have created a SQL database from our Azure Synapse serverless SQL pool. We have a table that has over 450 fields.
Whenever we try to extract the table with all the fields the query times out and produces the following error:
Msg 15884, Level 16, State 1, Line 2
Query timeout expired.
However, when we try to extract just a few fields, it successfully gives us all the rows.
Therefore, can someone let me know if there are any limitations on the number of fields when extracting tables from an Azure Synapse serverless SQL pool?
Msg 15884, Level 16, State 1, Line 2
Query timeout expired.
This error occurs because the SQL query takes a long time to execute. Unfortunately, the timeout setting cannot be modified in Synapse serverless SQL pool. The solution is either to optimize the query or to optimize the data stored in external storage.
Below are some points for better performance (see the sketch after these points).
Try to store data in Parquet format rather than CSV or JSON. Parquet is a columnar format, and the files will be smaller than the same data stored as CSV or JSON.
Do not use the storage account for other workloads during query execution.
To query a large amount of data, use Azure Data Studio or SQL Server Management Studio rather than Azure Synapse Studio.
Make sure the serverless SQL pool and the storage account are in the same region.
Refer to the Microsoft documentation Best practices for serverless SQL pool - Azure Synapse Analytics.
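Given that the table has over 450 fields, reading only the columns you actually need also cuts the amount of data the serverless pool scans. A minimal sketch (hypothetical storage path and column names) against Parquet files:
-- Hedged sketch: the path and columns are placeholders; only the listed
-- columns are read from the Parquet files, not all 450 fields.
SELECT TOP 100 r.CustomerId, r.OrderDate
FROM OPENROWSET(
    BULK 'https://storageaccount.dfs.core.windows.net/container/folder/*.parquet',
    FORMAT = 'PARQUET'
) WITH (
    CustomerId INT,
    OrderDate  DATE
) AS r;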
I'm just finding my way around Azure, trying to build a modern data warehouse. One thing I haven't been able to figure out is how to query my data lake from an Azure SQL database.
Something similar to the following works in Azure Synapse (note the long-term plan is to remove Synapse due to cost reasons):
SELECT top 100 *
FROM
OPENROWSET( BULK
'https://storageaccount.blob.core.windows.net/container/folder/2022/09/03/filename.parquet'
,SINGLE_BLOB
) AS [result]
But I get the following error running this from an Azure SQL database (in the Azure portal, using the query editor):
Failed to execute the query. Error: Cannot bulk load because the file "https://storageaccount.blob.core.windows.net/container/folder/2022/09/03/filename.parquet" could not be opened. Operating system error code 6(The handle is invalid.).
I also tried the code below after searching on the Internet:
CREATE EXTERNAL DATA SOURCE pocBlobStorage
WITH ( TYPE = BLOB_STORAGE,
LOCATION = 'https://storageaccount.blob.core.windows.net/container/folder/2022/09/03',
CREDENTIAL= sqlblob);
-- Query remote file
SELECT *
FROM OPENROWSET(BULK 'filename.parquet',
DATA_SOURCE = 'pocBlobStorage',
SINGLE_CLOB
--FORMATFILE='currency.fmt',
--FIRSTROW=2
--, FORMATFILE_DATA_SOURCE = 'pocBlobStorage'
) as D
I tried various combinations of the formatting options, but couldn't get anything to work.
The current error I'm getting is: Failed to execute query. Error: Referenced external data source "pocBlobStorage" not found.
I'm wondering if I need to do something to enable the Azure SQL database to access my data lake. For example, I haven't configured any credential called 'sqlblob' as per my last code segment, but I'm not sure where to do this (for example, something similar to creating a linked service in Azure Data Factory).
So how do I query my data lake directly from my Azure SQL database? Is the issue in my query, or do I need to configure access first, and if so, how?
I'm trying to create an external data source to access Azure Blob Storage. However, I'm having issues with creating the actual data source.
I've followed the instructions located here:
Examples of bulk access to data in Azure Blob Storage and
Create external data source - Transact-SQL. I'm using SQL Server 2016 on a VM, accessing via SSMS on a client machine using Windows Authentication, with no issues. The instructions say creating this external data source works for SQL Server 2016 and Azure Blob Storage.
I have created the Master Key:
CREATE MASTER KEY ENCRYPTION BY PASSWORD = <password>
and, the database scoped credential
CREATE DATABASE SCOPED CREDENTIAL UploadCountries
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = <key>;
I have verified both of these exist in the database by querying sys.symmetric_keys and sys.database_scoped_credentials.
However, when I try executing the following code, it says "Incorrect syntax near 'EXTERNAL'":
CREATE EXTERNAL DATA SOURCE BlobCountries
WITH (
TYPE = BLOB_STORAGE,
LOCATION = 'https://<somewhere>.table.core.windows.net/<somewhere>',
CREDENTIAL = UploadCountries
);
Your thoughts and help are appreciated!
Steve.
In “Examples of Bulk Access to Data in Azure Blob Storage”, we can find:
Bulk access to Azure blob storage from SQL Server, requires at least SQL Server 2017 CTP 1.1.
And in Arguments section of “CREATE EXTERNAL DATA SOURCE (Transact-SQL)”, we can find similar information:
Use BLOB_STORAGE when performing bulk operations using BULK INSERT or OPENROWSET with SQL Server 2017
You are using SQL Server 2016, so you get the Incorrect syntax near 'EXTERNAL' error when you create an external data source for Azure Blob Storage.
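If in doubt about whether the instance you are connected to is new enough, a quick check of the engine version (standard T-SQL, nothing specific to this scenario) can confirm it before retrying the BLOB_STORAGE data source:
-- Hedged sketch: BLOB_STORAGE external data sources need SQL Server 2017+
-- (or Azure SQL Database); this simply reports what you are running.
SELECT SERVERPROPERTY('ProductVersion') AS ProductVersion,
       SERVERPROPERTY('ProductLevel')   AS ProductLevel,
       SERVERPROPERTY('Edition')        AS Edition;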
Here is my script which I am trying to run in Azure SQL Database:
CREATE DATABASE SCOPED CREDENTIAL some_cred WITH IDENTITY = user1,
SECRET = '<Key of Blob Storage container>';
CREATE EXTERNAL DATA SOURCE TEST
WITH
(
TYPE=BLOB_STORAGE,
LOCATION='wasbs://<containername>@accountname.blob.core.windows.net',
CREDENTIAL = <somecred>
);
CREATE EXTERNAL TABLE dbo.test
(
val VARCHAR(255)
)
WITH
(
DATA_SOURCE = TEST
)
I am getting the following error:
External tables are not supported with the provided data source type.
My goal is to create an external table over blob storage so that a Hive query in HDInsight references the same blob. The table needs to be managed through Azure SQL. What's wrong with this script?
Azure SQL Database does have a feature to load files stored in Blob Storage, but only via the BULK INSERT and OPENROWSET language features. See the documentation for more information.
BULK INSERT dbo.test
FROM 'data/yourFile.txt'
WITH ( DATA_SOURCE = 'YourAzureBlobStorageAccount');
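Since OPENROWSET is the other supported route, here is a brief sketch (hypothetical file name, reusing the same data source name as above) of reading a file's raw contents:
-- Hedged sketch: returns the file contents as a single CLOB value (BulkColumn).
SELECT BulkColumn
FROM OPENROWSET(
    BULK 'data/yourFile.txt',
    DATA_SOURCE = 'YourAzureBlobStorageAccount',
    SINGLE_CLOB
) AS FileContents;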
The way you have scripted it is more like an external table using PolyBase, which is only available in SQL Server 2016 and Azure SQL Data Warehouse at this time.
I'm thinking external tables in Azure SQL Database can only be used for cross-database querying (elastic queries), so they cannot use an external data source of type BLOB_STORAGE.
Unable to create an EXTERNAL TABLE in Azure SQL Data Warehouse pointing to an Azure SQL Server database.
CREATE EXTERNAL DATA SOURCE EX_SOURCE
WITH (
TYPE = RDBMS,
LOCATION = 'SERVER.database.windows.net',
DATABASE_NAME = 'DB_NAME',
CREDENTIAL = "CREDENTIAL"
)
;
Responds with:
Incorrect syntax near 'RDBMS'
Does anyone know a workaround for this?
I also have tried to do it the other way around. It allows me to create an EXTERNAL DATA SOURCE pointing to Azure SQL Data Warehouse from Azure SQL Server and to create an EXTERNAL TABLE, but when I try to query it I get:
Error retrieving data from one or more shards. The underlying error message received was: 'Parse error at line: 1, column: 36: Incorrect syntax near '='.'.
Azure SQL Data Warehouse, today, only supports creating an external data source against Azure Blob Storage and Hadoop ('hdfs://') targets. The current external (elastic) query syntax for Azure SQL Database does not support Azure SQL Data Warehouse as a target.
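To illustrate what the data warehouse side does accept, here is a rough sketch (the credential, container, and file layout are placeholders) of a HADOOP-type data source over blob storage with a PolyBase external table on top:
-- Hedged sketch: PolyBase external table in Azure SQL Data Warehouse
-- reading delimited text files from blob storage.
CREATE EXTERNAL DATA SOURCE BlobSource
WITH (
    TYPE = HADOOP,
    LOCATION = 'wasbs://container@account_name.blob.core.windows.net',
    CREDENTIAL = BlobCredential
);
CREATE EXTERNAL FILE FORMAT TextFileFormat
WITH (
    FORMAT_TYPE = DELIMITEDTEXT,
    FORMAT_OPTIONS (FIELD_TERMINATOR = ',')
);
CREATE EXTERNAL TABLE dbo.BlobTable
(
    val VARCHAR(255)
)
WITH (
    LOCATION = '/folder/',
    DATA_SOURCE = BlobSource,
    FILE_FORMAT = TextFileFormat
);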