I have a Ruby app running in an Azure container, with Azure storage mounted. The app uploads a few files to the mounted drive, which then need to be picked up by Azure SQL for bulk insert and processing. According to this article https://github.com/Azure/app-service-linux-docs/blob/master/BringYourOwnStorage/mounting_azure_blob.md, mounting Blob storage is read-only. I could mount Azure Files instead, but Azure SQL doesn't offer any option to bulk insert directly from Azure Files. So I'm stuck between Azure Files and Blob storage; please help me out.
Azure SQL Database only supports reading from Azure Blob Storage.
File Storage is not supported.
Ref: BULK INSERT (Transact-SQL)
I would suggest you choose Blob Storage.
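For reference, the blob-side wiring is a database scoped credential plus an external data source that BULK INSERT can point at. Below is a minimal sketch, wrapped in a small C# helper just to show it end to end; the connection string, container URL, SAS token, table and file names are placeholders I've assumed, not values from your setup:

// Minimal sketch: wire Azure SQL to the blob container, then run BULK INSERT.
using System.Data.SqlClient;

class BulkLoadFromBlob
{
    // One-time setup: credential + external data source for the container.
    // (If the database has no master key yet, first run
    //  CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password>';)
    const string Setup = @"
CREATE DATABASE SCOPED CREDENTIAL UploadBlobCredential
    WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
         SECRET   = '<sas-token-without-the-leading-question-mark>';

CREATE EXTERNAL DATA SOURCE UploadContainer
    WITH (TYPE = BLOB_STORAGE,
          LOCATION = 'https://youraccount.blob.core.windows.net/uploads',
          CREDENTIAL = UploadBlobCredential);";

    // Per-load statement: read the uploaded file straight from the container.
    const string Load = @"
BULK INSERT dbo.StagingTable
    FROM 'incoming/data.csv'
    WITH (DATA_SOURCE = 'UploadContainer', FORMAT = 'CSV', FIRSTROW = 2);";

    static void Main()
    {
        const string connectionString =
            "Server=tcp:yourserver.database.windows.net;Database=yourdb;User ID=youruser;Password=<password>;Encrypt=True;";

        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();
            using (var setup = new SqlCommand(Setup, connection)) setup.ExecuteNonQuery(); // run once
            using (var load = new SqlCommand(Load, connection)) load.ExecuteNonQuery();    // run per upload
        }
    }
}

The BULK INSERT statement is the part that matters; your Ruby app only has to drop the file into the container and then trigger that statement however you prefer.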
HTH.
I'm trying to figure out if there is a way to back up Azure SQL Databases (not SQL Server on Azure VMs) to a Recovery Services vault or to blob storage outside of the resource group the database is located in. So far I have not found any resources on the topic. Can anyone confirm that this is not possible?
Yes, it's possible.
The easiest way is to use Export in the portal to back up the database to blob storage outside of the resource group the database is located in.
For example:
1. Open the database in the portal and click Export.
2. On the Export database blade, choose a storage account and blob container outside the resource group the database is located in.
For more details, please reference: Export an Azure SQL database to a BACPAC file.
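If you'd rather script it than click through the portal, one alternative I'd sketch is exporting a BACPAC with the DacFx library and then uploading it to a storage account that lives in a different resource group. Server, database, local path, connection strings, and container names below are placeholders, and this route goes through a local file rather than a service-side export:

// Rough sketch: export a BACPAC, then push it to storage in another resource group.
using System.IO;
using Microsoft.SqlServer.Dac;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class ExportToExternalStorage
{
    static void Main()
    {
        // 1. Export the database to a local .bacpac file with DacFx.
        var dacServices = new DacServices(
            "Server=tcp:yourserver.database.windows.net;Database=yourdb;User ID=youruser;Password=<password>;Encrypt=True;");
        dacServices.ExportBacpac(@"C:\temp\yourdb.bacpac", "yourdb");

        // 2. Upload the .bacpac to a storage account outside the database's resource group.
        CloudBlobContainer container = CloudStorageAccount
            .Parse("<connection-string-of-storage-account-in-another-resource-group>")
            .CreateCloudBlobClient()
            .GetContainerReference("backups");
        container.CreateIfNotExists();

        using (var stream = File.OpenRead(@"C:\temp\yourdb.bacpac"))
        {
            container.GetBlockBlobReference("yourdb.bacpac").UploadFromStream(stream);
        }
    }
}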
Hope this helps.
I'm investigating whether there is a feature to copy multiple folders (Exports from Collections) from an Azure file share to an on-premises Accelerate file share (Windows share).
Azure file share is indeed supported in the Import/Export process:
"Azure Import/Export service is used to securely import large amounts of data to Azure Blob storage and Azure Files by shipping disk drives to an Azure datacenter"
You can read more about the feature and when it's best used here.
Current story:
Moving the overall BI solution fully to Azure cloud services. Building a new Azure DW and loading data from an Azure DB. Currently, Azure DW doesn't support linked servers and/or elastic query (those are only supported in Azure DB). Due to price, we cannot use Data Factory or an instance of SSIS. We can't use bcp because we don't have a local directory to hold the file in between loads.
Is it possible to use Azure PowerShell with sqlcmd to write the results of a query directly to Azure Storage, without having to write to a file in a local directory in between?
Are there other options that aren't mentioned above?
Thank you for any input.
The current Azure Storage PowerShell cmdlet (Set-AzureStorageBlobContent) only supports uploading a blob from a local file.
The Azure Storage Client Library (https://github.com/Azure/azure-storage-net) supports uploading a blob from a stream, so you could develop your own small application with the client library; there is a sketch below.
If your data is big, you can also try https://github.com/Azure/azure-storage-net-data-movement/, which has better performance when uploading large blobs.
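To make the stream idea concrete, here is a rough sketch with the .NET client library that writes query results straight into a block blob without touching the local disk. The connection strings, container, blob name, query, and column types are placeholders I've assumed:

// Sketch only: stream query results directly into a block blob, no local file.
using System.Data.SqlClient;
using System.IO;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class QueryToBlob
{
    static void Main()
    {
        CloudBlobContainer container = CloudStorageAccount
            .Parse("<storage-connection-string>")
            .CreateCloudBlobClient()
            .GetContainerReference("exports");
        container.CreateIfNotExists();

        CloudBlockBlob blob = container.GetBlockBlobReference("query-results.csv");

        using (var sqlConnection = new SqlConnection("<azure-sql-connection-string>"))
        using (var command = new SqlCommand("SELECT Id, Name FROM dbo.SourceTable", sqlConnection))
        {
            sqlConnection.Open();
            using (SqlDataReader reader = command.ExecuteReader())
            using (var blobStream = blob.OpenWrite())          // write straight to the blob
            using (var writer = new StreamWriter(blobStream))
            {
                writer.WriteLine("Id,Name");
                while (reader.Read())
                {
                    writer.WriteLine($"{reader.GetInt32(0)},{reader.GetString(1)}");
                }
            }
        }
    }
}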
I have a large number of XML files that I transfer via FTP to an Azure website folder on a daily basis. I currently use C# to transfer the data to Azure SQL Server tables. However, it is extremely slow.
Is there a way I can run an Azure SQL job to bulk import these files, and if so, how do I access the files in the web app's folder?
I know how to do this on a standard SQL Server with XML files residing on a share drive, but am unsure how to do this in Azure.
Currently, we do not support any T-SQL interface to read files from blob storage or a container, so you have to push the data from outside of SQL Server (one way to do that is sketched after the link below).
One option is to use Azure Automation to run your code periodically or on a schedule. See the article below on how to use Azure Automation:
http://azure.microsoft.com/en-us/documentation/articles/automation-manage-sql-database/
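As a concrete example of pushing the data from outside, a small C# job (run on whatever schedule suits you) can load each XML file into a DataTable and write it in one shot with SqlBulkCopy, which is much faster than inserting row by row. The file path, inferred schema, connection string, and table name below are placeholders, and the column order is assumed to match the destination table:

// Sketch: bulk-copy an uploaded XML file into an Azure SQL table.
using System.Data;
using System.Data.SqlClient;

class XmlBulkLoader
{
    static void Main()
    {
        // Assumed web-app content path; adjust to wherever the FTP drop lands.
        var dataSet = new DataSet();
        dataSet.ReadXml(@"D:\home\site\wwwroot\incoming\orders.xml"); // infers columns from the XML
        DataTable table = dataSet.Tables[0];

        using (var connection = new SqlConnection("<azure-sql-connection-string>"))
        {
            connection.Open();
            using (var bulkCopy = new SqlBulkCopy(connection))
            {
                bulkCopy.DestinationTableName = "dbo.Orders";
                bulkCopy.BatchSize = 5000;
                bulkCopy.WriteToServer(table);
            }
        }
    }
}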
Are there any good tools to take a snapshot of my Azure tables and blob containers and copy them into local development storage?
Developers sometimes need to work in an isolated environment but would like a copy of some "real" application data. Right now we have data creation scripts that we can run to populate local storage, but it would be helpful to be able to grab a snapshot and move it into development storage.
I generally use Cloud Storage Studio for all handling of Azure Storage. Using that, you can easily download from your live blob storage and then upload to your local storage.
You can also use the Azure Storage Synctool to upload the local storage to a live blob container on Azure, or download in the other direction.
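If you'd rather script the blob part of the snapshot than use a GUI tool, a rough sketch with the .NET storage client library can copy every blob in a live container down into the local storage emulator (development storage). The container name and connection string are placeholders, and each blob is streamed through the machine running the code:

// Sketch: mirror a live blob container into development storage.
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class SnapshotToDevStorage
{
    static void Main()
    {
        CloudBlobContainer liveContainer = CloudStorageAccount
            .Parse("<live-storage-connection-string>")
            .CreateCloudBlobClient()
            .GetContainerReference("appdata");

        CloudBlobContainer devContainer = CloudStorageAccount
            .DevelopmentStorageAccount          // points at the local emulator
            .CreateCloudBlobClient()
            .GetContainerReference("appdata");
        devContainer.CreateIfNotExists();

        foreach (IListBlobItem item in liveContainer.ListBlobs(null, useFlatBlobListing: true))
        {
            var sourceBlob = (CloudBlockBlob)item;
            CloudBlockBlob targetBlob = devContainer.GetBlockBlobReference(sourceBlob.Name);

            using (var stream = sourceBlob.OpenRead())
            {
                targetBlob.UploadFromStream(stream);
            }
        }
    }
}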