I'm looking for a solution to store a file in an Azure Storage Account File Share from a stored procedure. I'm not using the file content in my tables; it's all photo references. Could you please suggest whether any of these approaches would help, or any other alternatives?
Calling Azure serverless functions from the stored procedure
Accessing the physical path from the stored procedure, e.g. using the "CREATE EXTERNAL DATA SOURCE" command.
Calling xp_cmdshell to store the files.
Thanks in advance.
Azure SQL Database doesn't support accessing local files, and only supports Blob Storage. That's why those approaches won't work.
For now, there is no support for Azure Files (file share) storage. You could submit new feedback so that the Azure SQL Database product team can see it.
Related
All other questions and answers I have found on this topic reference accessing a CSV/Excel file (e.g. via OPENROWSET or BULK INSERT) rather than as a blob.
Is it possible, from within a Stored Procedure, to access Azure Blob Storage (for a particular, known file URL), and output the actual file's data as a varbinary(max) column from a stored procedure? Similarly, in reverse, is it possible to accept a varbinary(max) as a stored procedure parameter and subsequently write that file to Blob Storage from within Azure SQL Database?
My first reaction would be: why would you?
If there's a reason to combine data from SQL with data from a file, your logic should probably do that. Use tools and services for what they're there for, and what they're good at. Reading (information from) files from within (Azure) SQL is not one of them.
With that being said: it does not seem possible without OPENROWSET or BULK INSERT.
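For the read direction specifically, Azure SQL Database can return a single blob's contents as varbinary(max) via OPENROWSET with the SINGLE_BLOB option over a BLOB_STORAGE external data source; writing a varbinary(max) back out to Blob Storage has no equivalent. A minimal sketch, assuming a hypothetical storage account mystorageacct, container photos and blob cat.jpg:

-- A database master key must already exist before the credential can be created.
CREATE DATABASE SCOPED CREDENTIAL PhotoBlobCredential
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
     SECRET = '<SAS token without the leading ?>';

CREATE EXTERNAL DATA SOURCE PhotoBlobStore
WITH ( TYPE = BLOB_STORAGE,
       LOCATION = 'https://mystorageacct.blob.core.windows.net/photos',
       CREDENTIAL = PhotoBlobCredential );

-- SINGLE_BLOB returns the whole file as one varbinary(max) value (column name BulkColumn).
SELECT BulkColumn AS file_data
FROM OPENROWSET(
       BULK 'cat.jpg',
       DATA_SOURCE = 'PhotoBlobStore',
       SINGLE_BLOB) AS blob_file;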
The closest option out there is SQL Server backup to URL for Microsoft Azure Blob Storage, but even that is only available for SQL Server and Managed Instance. The fact that even this is not possible with Azure SQL Database probably tells us something like this is not going to be possible.
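For completeness, the backup-to-URL route mentioned above looks roughly like this on SQL Server or Managed Instance (hypothetical account and container names; with a SAS token the credential name must be the container URL itself):

CREATE CREDENTIAL [https://mystorageacct.blob.core.windows.net/backups]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
     SECRET = '<SAS token>';

-- Writes the backup file straight into the blob container.
-- On Managed Instance, add WITH COPY_ONLY.
BACKUP DATABASE MyDatabase
TO URL = 'https://mystorageacct.blob.core.windows.net/backups/MyDatabase.bak';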
I need expert opinion on a project I am working on. We currently get data files that we load into our Azure SQL database using a local script that calls stored procedures. I am planning on replacing the script with SSIS jobs to load the data into Azure SQL, but I'm wondering if that's a good option given our needs. I am open to different suggestions too. The process we go through is to load the data file into staging tables and validate it before making updates to the live tables. The validation and updates are done by calling stored procedures, so the SSIS package would just load the data and make calls to those stored procedures. I have looked at ADF IR and Databricks, but they seem like overkill; I am open to hearing from people with experience using those as well. I am currently running the SSIS package locally as well. Any suggestion on a better architecture or tools for this scenario? Thanks!
I would definitely have a look at Azure Data Factory Data Flows. With these you can easily build your ETL pipelines in the Azure Data Factory GUI.
As an example, two text files from Blob Storage can be read and joined, a surrogate key added, and the data finally loaded into Azure Synapse Analytics (it would be the same for Azure SQL).
You finally put this Mapping Data Flow into a pipeline and can trigger it, e.g. when new data arrives.
You can just BULK INSERT data from Azure Blob Storage:
https://learn.microsoft.com/en-us/sql/relational-databases/import-export/examples-of-bulk-access-to-data-in-azure-blob-storage?view=sql-server-ver15#accessing-data-in-a-csv-file-referencing-an-azure-blob-storage-location
Then you can use ADF (no IR required), Databricks, Azure Batch, or Azure Elastic Jobs to schedule the execution.
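A minimal sketch of that pattern, using hypothetical names for the external data source, blob path, staging table, and follow-up procedure (the BLOB_STORAGE external data source and its credential are created as shown in the linked article):

-- Loads a CSV from the blob container behind the 'StagingBlobStore' external data source.
BULK INSERT dbo.SalesStaging
FROM 'incoming/sales-2021-01.csv'
WITH (
    DATA_SOURCE     = 'StagingBlobStore',
    FORMAT          = 'CSV',
    FIRSTROW        = 2,            -- skip the header row
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '0x0a'
);

-- Then run the existing validation/merge stored procedures against the staging table.
EXEC dbo.usp_ValidateAndMergeSales;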
I am new to Data Lake Analytics and am using U-SQL.
I am currently setting up a Data Factory pipeline which would replace an existing SSIS workflow. The Data Factory pipeline would essentially:
Extract data from the transactional database into ADLS
Transform the raw entities using U-SQL
Load the data into SSAS using a custom activity
Question
I have a U-SQL project set up and wanted to know if there is a standard way of deploying the scripts to ADLA other than just uploading them to a folder in the store.
Great question!
I'm not sure about a standard way, or even a way that might be considered best practice yet. But I use all of the tools you mention to perform very similar tasks.
To try and answer your question: what I do is create the U-SQL scripts as stored procedures within the logical ADLA database. In the VS U-SQL project I have one script per stored proc. The ADF activities then call the proc name. This gives you the right level of disconnection between services and also means you don't need additional blob storage for U-SQL files.
In my VS solution I often also have a PowerShell project to help manage things. Specifically, one that takes all my 'usp_' U-SQL scripts and creates one big DDL-style script that can be deployed to the logical ADLA database.
The PowerShell then does the deployment for me using the submit job cmdlet. Example below.
Submit-AzureRmDataLakeAnalyticsJob `
-Name $JobName `
-AccountName $DLAnalytics `
-Script $USQLProcDeployAll `
-DegreeOfParallelism $DLAnalyticsDoP
Hope this gives you a steer. I also accept that these tools are still fairly new. So open to other suggestions.
Cheers
I'm extremely confused. I've created a SQL database in Windows Azure and created a "video" table with a "video_file" column of type varbinary(max), because I want to upload a video file into that field. However, Azure offers no "Upload" option like, say, phpMyAdmin does, where you can hit "browse" and upload a video directly into the field. Can anyone guide me as to how to actually upload a file into a Windows Azure SQL database so it can be read as a varbinary type? Can it be done within the Azure management portal? Or does it require some sort of external program/service?
To answer your question, the functionality to upload files directly into SQL Azure Database does not exist. This is something you have to do on your own.
Can anyone guide me as to how to actually upload a file into a Windows Azure SQL Database so it can be read as a varbinary type?
Do a search for uploading files in SQL Server and you will find plenty of examples on how to do that. Take a look at this link for example: http://www.codeproject.com/Articles/225446/Uploading-and-downloading-files-to-from-a-SQL-Serv
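In practice this boils down to a varbinary(max) column plus a parameterized INSERT executed from your application code; the portal itself has no file picker. A rough sketch with made-up table and column names:

CREATE TABLE dbo.video
(
    video_id   int IDENTITY(1,1) PRIMARY KEY,
    video_name nvarchar(200)  NOT NULL,
    video_file varbinary(max) NULL      -- the raw file bytes
);

-- Executed from client code (ADO.NET, JDBC, etc.) with the file bytes
-- bound to the @file parameter; the portal cannot supply them for you.
INSERT INTO dbo.video (video_name, video_file)
VALUES (@name, @file);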
Can it be done within the Azure management portal? Or does it require some sort of external program/service?
No. This functionality does not exist in Azure Management Portal. As mentioned above, you would need to write some code to do so.
A little bit off-topic comment:
May I suggest that instead of saving the image files in the database, you save them in Blob Storage and store the URL of the blob in your table (a small sketch of this follows below). There are some advantages I can see in this:
Compared to SQL Database, Azure Blob Storage is much cheaper. If you store video files (or, in other words, large files) in the database, you will end up with a large database and thus end up paying more money.
You will be choking the database when reading this large data from it, which will impact the performance of your application.
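A minimal sketch of that alternative design (again with made-up names): the table keeps only metadata and the blob URL, while the video bytes themselves live in Blob Storage.

CREATE TABLE dbo.video
(
    video_id   int IDENTITY(1,1) PRIMARY KEY,
    video_name nvarchar(200) NOT NULL,
    -- e.g. https://<account>.blob.core.windows.net/videos/intro.mp4
    blob_url   nvarchar(400) NOT NULL
);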
Can anybody help me understand how to upload large files to SQL Azure using blocks?
I am not able to find any good implementation of blocks for uploading files to SQL Azure.
Thanks,
Ashwani
You may want to look at storing large files in Azure Blob Storage. You will end up running out of space in your SQL Azure database, or push yourself into a more expensive SQL Azure pricing tier, by storing files in your relational database. You can always store the pointer to your blob in your relational database.