How to write sqlcmd results directly to Azure Storage using Azure PowerShell? - azure-storage

Current story:
We are moving our overall BI solution fully to Azure cloud services, building a new Azure DW and loading data from an Azure DB. Currently, Azure DW doesn't support linked servers or elastic queries (these are only supported in Azure DB). Due to price, we cannot use Data Factory or an instance of SSIS, and we can't use bcp because we don't have a local directory to hold the file between loads.
Is it possible to use Azure PowerShell with sqlcmd to write the results of a query directly to Azure Storage, without writing to a file in a local directory in between?
Are there other options that aren't mentioned above?
Thank you for any input.

The current Azure Storage PowerShell cmdlet (Set-AzureStorageBlobContent) only supports uploading a blob from a local file.
The Azure Storage Client Library (https://github.com/Azure/azure-storage-net) supports uploading a blob from a stream, so you could develop your own application with it.
If your data is big, you can also try https://github.com/Azure/azure-storage-net-data-movement/, which has better performance when uploading large blobs.
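
For illustration, here is a minimal PowerShell sketch of that approach: it loads the .NET client library and uploads a blob from an in-memory stream built from sqlcmd output, so no local file is involved. The assembly path, connection string, server, and query variables are placeholders, not part of the original answer.

# Sketch: stream sqlcmd output straight to a block blob, no local file.
# Assumes the Azure Storage Client Library DLL is available locally;
# every $-variable and path below is a placeholder.
Add-Type -Path "C:\libs\WindowsAzure.Storage.dll"

$account   = [Microsoft.WindowsAzure.Storage.CloudStorageAccount]::Parse($connectionString)
$container = $account.CreateCloudBlobClient().GetContainerReference("exports")
$blob      = $container.GetBlockBlobReference("query-results.csv")

# Capture the query output in memory instead of writing it to disk.
$rows   = sqlcmd -S $server -d $database -U $user -P $password -Q $query -s "," -W
$bytes  = [System.Text.Encoding]::UTF8.GetBytes(($rows -join "`r`n"))
$stream = New-Object System.IO.MemoryStream(,$bytes)
$blob.UploadFromStream($stream)   # stream upload from the client library
$stream.Dispose()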

Related

What is the best method to sync medical images between my client PCs and my Azure Blob storage through a cloud-based web application?

I tried to use the MS Azure Blob SDK v18, but it is not that fast. I'm looking for something like Dropbox: fast, resumable, and with efficient parallel uploading.
Solution 1:
AzCopy is a command-line tool for copying data to or from Azure Blob Storage, Azure Files, and Azure Table Storage using simple commands designed for optimal performance. With AzCopy, you can copy data between a file system and a storage account, or between two storage accounts, so it can move local (on-premises) data up to a storage account.
You can also create a scheduled task or cron job that runs an AzCopy script; the script identifies and uploads new on-premises data to cloud storage at a set interval, as in the sketch below.
For more details, refer to this document.
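
As a rough example, a scheduled job could run an AzCopy v10 command like the following; the source folder, account name, container, and SAS token are placeholders:

# Sketch: recursively upload local images to a blob container with AzCopy.
# Replace the path, <account>, and <SAS-token> with your own values.
azcopy copy "C:\medical-images\*" `
    "https://<account>.blob.core.windows.net/images?<SAS-token>" `
    --recursive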
Solution 2:
Azure Data Factory is a fully managed, cloud-based, data-integration ETL service that automates the movement and transformation of data.
By using Azure Data Factory, you can create data-driven workflows to move data between on-premises and cloud data stores. And you can process and transform data with Data Flows. ADF also supports external compute engines for hand-coded transformations by using compute services such as Azure HDInsight, Azure Databricks, and the SQL Server Integration Services (SSIS) integration runtime.
Create an Azure Data Factory pipeline to transfer files between an on-premises machine and Azure Blob Storage.
For more details, refer to this thread.
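
As a small sketch, once a factory and pipeline exist you could trigger a run on a schedule with the Az.DataFactory PowerShell cmdlets; the resource group, factory, and pipeline names below are assumptions:

# Sketch: start a run of an existing Data Factory pipeline.
# All three names are placeholders for resources created beforehand.
Invoke-AzDataFactoryV2Pipeline `
    -ResourceGroupName "my-rg" `
    -DataFactoryName "my-adf" `
    -PipelineName "CopyFilesToBlob"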

Azure SQL - bulk insert from Azure files and not blobs

I have a Ruby app in an Azure container, for which I have mounted Azure storage. The app uploads a few files to the mounted drive, which need to be picked up by Azure SQL for bulk insert and processing. However, according to this article https://github.com/Azure/app-service-linux-docs/blob/master/BringYourOwnStorage/mounting_azure_blob.md, mounting blob storage is read-only. I could mount Azure Files instead, but Azure SQL doesn't give any option to bulk insert directly from Azure Files. So I am stuck between Azure Files and blobs; please help me out.
Azure SQL Database only supports reading from Azure Blob Storage.
File Storage is not supported.
Ref: BULK INSERT (Transact-SQL)
I would suggest you choose Blob Storage.
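
To illustrate the Blob Storage route, here is a hedged PowerShell sketch of the usual T-SQL pattern: a BLOB_STORAGE external data source backed by a SAS credential, followed by a BULK INSERT that reads from it. The URLs, secret, table, and file names are placeholders.

# Sketch: bulk insert into Azure SQL directly from a blob container.
# The SAS secret, storage URL, table, and file names are placeholders;
# a database master key must already exist for the scoped credential.
$sql = @"
CREATE DATABASE SCOPED CREDENTIAL BlobCred
    WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
    SECRET = '<SAS-token-without-leading-?>';
CREATE EXTERNAL DATA SOURCE MyBlobStore
    WITH (TYPE = BLOB_STORAGE,
          LOCATION = 'https://<account>.blob.core.windows.net/uploads',
          CREDENTIAL = BlobCred);
BULK INSERT dbo.Staging
    FROM 'incoming/data.csv'
    WITH (DATA_SOURCE = 'MyBlobStore', FORMAT = 'CSV', FIRSTROW = 2);
"@
Invoke-Sqlcmd -ServerInstance $server -Database $db -Username $user -Password $pass -Query $sql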
HTH.

Remote SQL Server backups using Azure

I've got a handful of databases running on a SQL Server instance. I don't have the access needed to install the Azure Backup agent, but I do have connection details and credentials to access the databases and perform backups in SQL Server Management Studio.
What I want to do is perform and schedule these backups and save them to Azure Blob Storage. I could run this schedule on my local computer, but that's not an ideal solution.
I've got a PowerShell script that will perform this action for me, but it relies on SQL Server assemblies to run. I've tried running it as a DevOps build task but am unable to do so without the assemblies it requires.
Does anybody know a way of setting this up, in Azure for example? Is there a resource that will allow me to connect to a SQL instance via a connection string and back it up to Blob Storage? Or an Azure Function, perhaps?
Is there a resource that will allow me to connect to a SQL instance via a connection string and back it up to Blob Storage?
I'm afraid the answer is no; we can't find any API in Azure that supports this for you directly.
I think SQL Server Management Studio together with a PowerShell script is more suitable for you.
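
As a sketch of the PowerShell-script route, and assuming the instance allows it, T-SQL's BACKUP TO URL can write a backup straight to Blob Storage over an ordinary connection, with no SMO assemblies or agents involved. The storage account, SAS token, and database name are placeholders.

# Sketch: back up a database directly to Blob Storage with BACKUP TO URL.
# Requires rights to create a server credential; names and secrets below
# are placeholders.
$sql = @"
CREATE CREDENTIAL [https://<account>.blob.core.windows.net/backups]
    WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
    SECRET = '<SAS-token-without-leading-?>';
BACKUP DATABASE [MyDb]
    TO URL = 'https://<account>.blob.core.windows.net/backups/MyDb.bak';
"@
Invoke-Sqlcmd -ServerInstance $server -Database master -Username $user -Password $pass -Query $sql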
Alternatively, you could consider the third-party tool SQLBackupAndFTP, which can schedule SQL Server backups to Azure Blob Storage.
Hope this helps.

Importing XML files to Azure SQL Database

I have a large number of XML files that I transfer via FTP to an Azure website folder on a daily basis. I currently use C# to transfer the data to Azure SQL tables; however, it is extremely slow.
Is there a way I can run an Azure SQL job to bulk import these files, and if so, how do I access the files in the web app's folder?
I know how to do this on a standard SQL Server with XML files residing on a shared drive, but I am unsure how to do this in Azure.
Currently, we do not support any T-SQL interface to read files from a blob store or container, so you have to push the data from outside of SQL Server.
One option is to use Azure Automation to run your code periodically or on a schedule; see the post below on how to use Azure Automation:
http://azure.microsoft.com/en-us/documentation/articles/automation-manage-sql-database/
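
For example, a runbook along these lines could do the push; the folder, server, table, and credential asset names are all assumptions, and the SqlServer module must be available to the Automation account.

# Sketch of an Automation runbook that loads XML files into a staging table.
# $xmlFolder, $server, $db, the table, and the credential asset are placeholders.
$cred = Get-AutomationPSCredential -Name "SqlCredential"
$user = $cred.UserName
$pass = $cred.GetNetworkCredential().Password

foreach ($file in Get-ChildItem -Path $xmlFolder -Filter *.xml) {
    # Double up single quotes so the XML survives inside a T-SQL literal.
    $payload = (Get-Content -Path $file.FullName -Raw).Replace("'", "''")
    Invoke-Sqlcmd -ServerInstance $server -Database $db -Username $user -Password $pass `
        -Query "INSERT INTO dbo.XmlStaging (Payload) VALUES (N'$payload');"
}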

Tool to migrate Azure storage to local development storage

Are there any good tools to take a snapshot of my Azure tables and blob containers and copy it into local development storage?
Developers sometimes need to work in an isolated environment but would like a copy of some "real" application data. Right now we have data-creation scripts that we can run to populate local storage, but it would be helpful to be able to grab a snapshot and move it into development storage.
I generally use Cloud Storage Studio for all my Azure Storage handling. With it you can easily download from your live blob storage and then upload to your local storage.
You can also use the Azure Storage Synctool to upload local storage to a live storage blob on Azure, or to download in the other direction.
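
If you'd rather script it, a rough sketch with the classic Azure.Storage cmdlets (the same family as Set-AzureStorageBlobContent, which also understand the local emulator via -Local) might look like this; the account name, key, and container name are placeholders.

# Sketch: copy every blob in a live container into local development storage.
# <account>, <key>, and the container name are placeholders.
$live  = New-AzureStorageContext -StorageAccountName "<account>" -StorageAccountKey "<key>"
$local = New-AzureStorageContext -Local

New-AzureStorageContainer -Name "appdata" -Context $local -ErrorAction SilentlyContinue

foreach ($blob in Get-AzureStorageBlob -Container "appdata" -Context $live) {
    # Round-trip through a temp file, since these cmdlets copy file-to-blob.
    $tmp = Join-Path $env:TEMP $blob.Name
    Get-AzureStorageBlobContent -Blob $blob.Name -Container "appdata" `
        -Destination $tmp -Context $live -Force | Out-Null
    Set-AzureStorageBlobContent -File $tmp -Container "appdata" `
        -Blob $blob.Name -Context $local -Force | Out-Null
}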