I was wondering if it's possible to load a JSON file stored in an Azure storage account (blob or file share) directly from Azure SQL Database, to then leverage the new OPENJSON syntax.
I tried the following command in my Azure SQL Database:
SELECT * FROM OPENROWSET(BULK '\\mystorage.file.core.windows.net\myshare\myfile.json', SINGLE_CLOB) AS jsondata
but it throws the error "You do not have permission to use the bulk load statement".
After googling around, I couldn't find anything that helped.
Is it even possible?
Thank you
The OPENROWSET(BULK) and BULK INSERT T-SQL statements are not yet enabled in Azure SQL Database. You can use them in SQL Server on-premises to load files from Azure Blob Storage: https://msdn.microsoft.com/en-us/library/mt805207.aspx
I cannot confirm the exact date when these statements will be enabled in Azure SQL Database (but trust me, they will be available very soon :) ), and then you will be able to load files from Azure Blob Storage.
Stay tuned and keep monitoring Azure service updates :)
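For reference, the documented pattern on SQL Server (the one that should carry over once the feature lights up in Azure SQL Database) looks roughly like this. The credential name, storage account, container, and SAS token below are all placeholders, and blob storage (not file shares) is what the feature targets:

```sql
-- A database master key must already exist before creating the credential.
-- The SECRET is a SAS token for the container, without the leading '?'.
CREATE DATABASE SCOPED CREDENTIAL MyBlobCredential
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
     SECRET = '<your-sas-token>';

-- External data source pointing at the blob container.
CREATE EXTERNAL DATA SOURCE MyBlobStorage
WITH (TYPE = BLOB_STORAGE,
      LOCATION = 'https://mystorage.blob.core.windows.net/mycontainer',
      CREDENTIAL = MyBlobCredential);

-- Read the whole file as a single CLOB, then shred it with OPENJSON.
SELECT j.[key], j.[value]
FROM OPENROWSET(BULK 'myfile.json',
                DATA_SOURCE = 'MyBlobStorage',
                SINGLE_CLOB) AS raw
CROSS APPLY OPENJSON(raw.BulkColumn) AS j;
```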
Related
I need to migrate the data from one table in Azure DB to another table in a different AWS DB. What is the best way to copy all the information from a table in one database to a table that resides within a different database?
I am using SQL Server Management Studio, and the option to script the table results in the error "Invalid version 16 (Microsoft.SqlServer.Smo)".
I could copy all the data in the table and add it into an INSERT statement. The problem is that I would have to format the data manually, which is error-prone. I do not have any formal training in SQL. What is the best way to migrate the data? If anyone can assist, it would be greatly appreciated.
AFAIK, your SSMS version is not compatible with this operation.
When I tried with version 16, I got the error below:
I upgraded to version 18.12.1 and it worked fine for me.
Image for reference:
The file was created successfully.
To migrate the Azure SQL database table to the AWS database table, create a .bacpac file using the Export option, saving it to Azure Blob Storage.
Image for reference:
Copy the .bacpac file from Azure Storage to Amazon EC2 EBS storage and import the .bacpac file into Amazon RDS for SQL Server. In this way you can migrate the database data from Azure SQL to AWS.
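If you prefer the command line over the SSMS export wizard, the same export/import can be done with the SqlPackage utility. A sketch, with server names, database names, and credentials as placeholders:

```shell
# Export the Azure SQL database to a .bacpac file (placeholder connection details)
SqlPackage /Action:Export \
  /SourceConnectionString:"Server=tcp:myserver.database.windows.net;Database=mydb;User Id=myadmin;Password=<password>" \
  /TargetFile:"mydb.bacpac"

# Import the .bacpac into Amazon RDS for SQL Server
SqlPackage /Action:Import \
  /SourceFile:"mydb.bacpac" \
  /TargetConnectionString:"Server=myinstance.abc123.us-east-1.rds.amazonaws.com;Database=mydb;User Id=myadmin;Password=<password>"
```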
All other questions and answers I have found on this topic reference accessing a CSV/Excel file (e.g. via OPENROWSET or BULK INSERT) rather than as a blob.
Is it possible, from within a Stored Procedure, to access Azure Blob Storage (for a particular, known file URL), and output the actual file's data as a varbinary(max) column from a stored procedure? Similarly, in reverse, is it possible to accept a varbinary(max) as a stored procedure parameter and subsequently write that file to Blob Storage from within Azure SQL Database?
My first reaction would be: why would you?
If there's a reason to combine data from SQL with data from a file, your logic should probably do that. Use tools and services for what they're there for, and what they're good at. Reading (information from) files from within (Azure) SQL is not one of them.
With that being said: it does not seem possible without OPENROWSET or BULK INSERT.
The closest thing out there is SQL Server backup to URL for Microsoft Azure Blob Storage, but even that is only available for SQL Server and Managed Instance. The fact that even that is not possible with Azure SQL Database probably tells us something like this is not going to be possible.
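For the read direction specifically, OPENROWSET(BULK ... SINGLE_BLOB) against a BLOB_STORAGE external data source is the documented route where the feature is enabled. A sketch of wrapping it in a stored procedure; MyBlobStorage is a hypothetical, pre-created external data source, and dynamic SQL is needed because OPENROWSET only accepts a literal file path:

```sql
CREATE PROCEDURE dbo.GetBlobFile
    @FileName nvarchar(260)
AS
BEGIN
    -- Assumes an existing EXTERNAL DATA SOURCE (TYPE = BLOB_STORAGE)
    -- named MyBlobStorage. OPENROWSET(BULK ...) does not take a variable
    -- for the file path, so the statement is built dynamically.
    DECLARE @sql nvarchar(max) = N'
        SELECT f.BulkColumn AS FileData   -- varbinary(max)
        FROM OPENROWSET(BULK ''' + REPLACE(@FileName, '''', '''''') + N''',
                        DATA_SOURCE = ''MyBlobStorage'',
                        SINGLE_BLOB) AS f;';
    EXEC sp_executesql @sql;
END
```

The reverse direction (writing a varbinary(max) parameter out to blob storage from T-SQL) has no equivalent statement, which matches the conclusion above.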
It is possible to read data from Azure Blob Storage into Azure SQL Database via OPENROWSET or BULK INSERT.
But is it possible to upload a file to blob storage through any SQL command in Azure SQL DB?
Similar to CETAS in Azure Synapse.
Unfortunately, that seems to be a current limitation of SQL DB. The link below has the details: PolyBase features and limitations
I'm looking for a solution to store a file in an Azure Storage Account File Share from a stored procedure. I'm not using the file content in my tables; it's all photo references. Could you please tell me whether any of these approaches would help, or suggest any alternatives?
Calling Azure Serverless Functions from the Stored Procedure
Accessing the Physical Path from Stored Procedure, like using “CREATE EXTERNAL DATA SOURCE” command.
Calling xp_cmd to store the files.
Thanks in advance.
Azure SQL Database doesn't support accessing local files, and only supports Blob Storage. That's why those approaches don't work.
For now, there is no support for File Storage. You could submit new feedback so the Azure SQL Database product team can see it.
Is there a way to output U-SQL results directly to a SQL DB such as Azure SQL DB? Couldn't find much about that.
Thanks!
U-SQL currently only outputs to files or internal tables (i.e. tables within ADLA databases), but you have a couple of options. Azure SQL Database has recently gained the ability to load files from Azure Blob Storage using either BULK INSERT or OPENROWSET, so you could try that. This article shows the syntax and gives a reminder that:
Azure Blob storage containers with public blobs or public containers
access permissions are not currently supported.
wasb://&lt;BlobContainerName&gt;@&lt;StorageAccountName&gt;.blob.core.windows.net/yourFolder/yourFile.txt
BULK INSERT and OPENROWSET with Azure Blob Storage is shown here:
https://blogs.msdn.microsoft.com/sqlserverstorageengine/2017/02/23/loading-files-from-azure-blob-storage-into-azure-sql-database/
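A minimal sketch of the BULK INSERT variant from that article; the credential secret, data source name, target table, and file path are placeholders:

```sql
-- SAS-based credential and external data source for the output container.
CREATE DATABASE SCOPED CREDENTIAL AdlaOutputCredential
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
     SECRET = '<your-sas-token>';

CREATE EXTERNAL DATA SOURCE AdlaOutput
WITH (TYPE = BLOB_STORAGE,
      LOCATION = 'https://yourstorageaccount.blob.core.windows.net/yourcontainer',
      CREDENTIAL = AdlaOutputCredential);

-- Load the U-SQL output file into an existing table.
BULK INSERT dbo.TargetTable
FROM 'yourFolder/yourFile.txt'
WITH (DATA_SOURCE = 'AdlaOutput',
      FORMAT = 'CSV',     -- CSV parsing support is new in this release
      FIRSTROW = 2);      -- skip a header row, if your file has one
```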
You could also use Azure Data Factory (ADF). Its Copy Activity could load the data from Azure Data Lake Storage (ADLS) to an Azure SQL Database in two steps:
execute U-SQL script which creates output files in ADLS (internal tables are not currently supported as a source in ADF)
move the data from ADLS to Azure SQL Database
As a final option, if your data is likely to grow to larger volumes (i.e. terabytes), then you could use Azure SQL Data Warehouse, which supports PolyBase. PolyBase now supports both Azure Blob Storage and ADLS as a source.
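The PolyBase route boils down to defining an external table over the files and then loading it with CTAS. A sketch, with all object names, the storage account, and the schema as placeholders (non-public storage would also need a CREDENTIAL on the data source):

```sql
-- External data source over the blob container (HADOOP type for PolyBase).
CREATE EXTERNAL DATA SOURCE AzureBlob
WITH (TYPE = HADOOP,
      LOCATION = 'wasbs://yourcontainer@yourstorageaccount.blob.core.windows.net');

-- How the files are laid out.
CREATE EXTERNAL FILE FORMAT CsvFormat
WITH (FORMAT_TYPE = DELIMITEDTEXT,
      FORMAT_OPTIONS (FIELD_TERMINATOR = ','));

-- External table over every file under the folder.
CREATE EXTERNAL TABLE dbo.StagedData (
    Id int,
    Payload nvarchar(4000)
)
WITH (LOCATION = '/yourFolder/',
      DATA_SOURCE = AzureBlob,
      FILE_FORMAT = CsvFormat);

-- Materialize into an internal table with CTAS.
CREATE TABLE dbo.FinalData
WITH (DISTRIBUTION = ROUND_ROBIN)
AS SELECT * FROM dbo.StagedData;
```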
Perhaps if you can tell us a bit more about your process we can refine which of these options is most suitable for you.