What is the best method to sync medical images between my client PCs and my Azure Blob storage through a cloud-based web application? - file-upload

What is the best method to sync medical images between my client PCs and my Azure Blob storage through a cloud-based web application? I tried the MS Azure Blob SDK v18, but it is not that fast. I'm looking for something like Dropbox: fast, resumable, and with efficient parallel uploading.

Solution 1:
AzCopy is a command-line tool for copying data to or from Azure Blob storage, Azure Files, and Azure Table storage using simple commands. The commands are designed for optimal performance. With AzCopy you can copy data between a file system and a storage account, or between storage accounts, so it can be used to move local (on-premises) data into a storage account.
You can also create a scheduled task or cron job that runs an AzCopy script. The script identifies and uploads new on-premises data to cloud storage at a specific time interval.
For more details, refer to this document.
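As a rough illustration, a one-way sync with AzCopy v10 might look like the sketch below; the account, container, and SAS token are placeholders, and the exact flags depend on your AzCopy version.

    # Mirror a local image folder into a blob container (placeholders in angle brackets).
    azcopy sync "C:\MedicalImages" "https://<account>.blob.core.windows.net/<container>?<SAS-token>" --recursive=true

Wrapping this command in a script that a scheduled task or cron job runs gives you the periodic upload described above.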
Solution 2:
Azure Data Factory is a fully managed, cloud-based, data-integration ETL service that automates the movement and transformation of data.
By using Azure Data Factory, you can create data-driven workflows to move data between on-premises and cloud data stores. And you can process and transform data with Data Flows. ADF also supports external compute engines for hand-coded transformations by using compute services such as Azure HDInsight, Azure Databricks, and the SQL Server Integration Services (SSIS) integration runtime.
Create an Azure Data Factory pipeline to transfer files between an on-premises machine and Azure Blob Storage.
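As a rough sketch (not a complete definition), a Copy-activity pipeline for that transfer could look something like the JSON below; the pipeline, activity, and dataset names are placeholders, and the on-premises dataset still needs a linked service bound to a self-hosted integration runtime.

    {
      "name": "CopyOnPremImagesToBlob",
      "properties": {
        "activities": [
          {
            "name": "CopyFilesToBlob",
            "type": "Copy",
            "inputs":  [ { "referenceName": "OnPremFileDataset", "type": "DatasetReference" } ],
            "outputs": [ { "referenceName": "BlobImageDataset", "type": "DatasetReference" } ],
            "typeProperties": {
              "source": { "type": "FileSystemSource", "recursive": true },
              "sink":   { "type": "BlobSink" }
            }
          }
        ]
      }
    }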
For more details, refer to this thread.

Related

How to migrate .trn files into Azure databases?

I am receiving multiple .trn files on a daily basis and I am restoring those files into an on-premises SQL database. Now, how can we migrate those .trn files to Azure daily?
Connect the on-premises SQL database to Azure Data Factory using a self-hosted IR. A self-hosted integration runtime can run copy activities between a cloud data store and a data store in a private network.
Create a pipeline and use a Copy activity.
Select the on-premises SQL database as the Source.
Select the Azure service of your choice as the Sink.
Now use a trigger to copy data from on-premises to Azure periodically (a sample trigger definition is sketched after the links below).
Refer – Copy activity link
Also refer – Trigger link
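For illustration, a daily schedule trigger attached to such a pipeline could be sketched as below; the trigger and pipeline names, start time, and schedule are placeholders.

    {
      "name": "DailyTrnCopyTrigger",
      "properties": {
        "type": "ScheduleTrigger",
        "typeProperties": {
          "recurrence": {
            "frequency": "Day",
            "interval": 1,
            "startTime": "2021-01-01T02:00:00Z",
            "timeZone": "UTC"
          }
        },
        "pipelines": [
          { "pipelineReference": { "referenceName": "CopyTrnFilesPipeline", "type": "PipelineReference" } }
        ]
      }
    }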

Why does external I.P. need access to on-prem sql database when moving data with ADF to Azure SQL?

Why does an external IP need access to the on-prem SQL database when copying data with ADF to Azure SQL?
It looks like the on-prem SQL Server makes a direct connection to Azure SQL (bypassing ADF). Is this by design, or am I following the wrong workflow?
Data Factory uses an integration runtime to create the connection to the source/sink datasets: the Azure integration runtime for cloud datasets, and the self-hosted integration runtime for on-premises source/sink datasets.
The integration runtime (IR) is the compute infrastructure that Azure Data Factory uses to provide data-integration capabilities across different network environments. For details about IR, see Integration runtime overview.
A self-hosted integration runtime can run copy activities between a cloud data store and a data store in a private network. It can also dispatch transform activities against compute resources in an on-premises network or an Azure virtual network. The installation of a self-hosted integration runtime needs an on-premises machine or a virtual machine inside a private network.
The Azure integration runtime is provided by ADF by default; the self-hosted integration runtime must be created manually.
That means Data Factory cannot access the on-prem SQL database directly. It needs the self-hosted integration runtime to connect to the on-prem SQL database.
This also means the on-prem SQL Server does not make a direct connection to Azure SQL (bypassing ADF), and it is why an external IP needs access to the on-prem SQL database when copying data with ADF to Azure SQL.
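To make that concrete, the linked service that points at the on-prem database is where the self-hosted IR is referenced, via connectVia. A rough sketch with placeholder names and connection details:

    {
      "name": "OnPremSqlServerLinkedService",
      "properties": {
        "type": "SqlServer",
        "typeProperties": {
          "connectionString": "Data Source=<on-prem-server>;Initial Catalog=<database>;User ID=<user>;Password=<password>"
        },
        "connectVia": {
          "referenceName": "MySelfHostedIR",
          "type": "IntegrationRuntimeReference"
        }
      }
    }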
HTH.

Azure Gov Cloud and Azure Functions trigger on Storage

I am having a hard time with Azure Functions on Azure Government. I need to create a C# trigger-based process on Azure Storage. The goal is to automate loading files into Azure SQL DB when a file is dropped into Azure Storage.
Since Azure Functions in Azure Government are not fully comparable to Azure Functions in regular Azure and not all UIs are the same, I can't deploy the function to trigger on a storage file.
I was able to build the process in regular Azure following the instructions from https://github.com/yorek/AzureFunctionUploadToSQL, but since Azure Government is missing the UI for Azure Functions, I'm having a hard time replicating the process there.
Portal UI support is not yet available in Azure Government, but it is coming soon. Additionally, Azure Government currently supports "App Service plan" ("Consumption plan" coming soon).
In the meantime, you can do everything you need. First, provision your Azure Function in Azure Gov via the Azure CLI by following this Quickstart example for Functions on Azure Gov. That same link also shows you how you can use Visual Studio to set up your triggers (in your case, a Blob trigger).
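As a rough sketch of that CLI route (resource names, region, and SKU are placeholders, and the commands in the Quickstart itself are authoritative):

    # Point the CLI at the Azure Government cloud, then sign in.
    az cloud set --name AzureUSGovernment
    az login

    # Resource group, storage account, App Service plan, and function app (placeholder names).
    az group create --name func-demo-rg --location usgovvirginia
    az storage account create --name funcdemostorage --resource-group func-demo-rg --sku Standard_LRS
    az appservice plan create --name func-demo-plan --resource-group func-demo-rg --sku B1
    az functionapp create --name blob-to-sql-func --resource-group func-demo-rg --plan func-demo-plan --storage-account funcdemostorage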
Once complete, deploy your Function to Azure Gov with Visual Studio.
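For reference, a blob-triggered C# function of the kind described above might look roughly like the sketch below; the function name, container name, and connection setting are placeholders, and the load into Azure SQL DB is left as a stub.

    using System.IO;
    using Microsoft.Azure.WebJobs;
    using Microsoft.Extensions.Logging;

    public static class LoadBlobToSql
    {
        // Fires whenever a new blob lands in the "incoming-files" container.
        // Container name and connection setting name are placeholders.
        [FunctionName("LoadBlobToSql")]
        public static void Run(
            [BlobTrigger("incoming-files/{name}", Connection = "AzureWebJobsStorage")] Stream blob,
            string name,
            ILogger log)
        {
            log.LogInformation($"Processing blob {name}, {blob.Length} bytes");
            // TODO: bulk-load the file contents into Azure SQL DB here.
        }
    }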

How to write sqlcmd results directly to Azure Storage using Azure PowerShell?

Current story:
We are moving the overall BI solution fully to Azure cloud services, building a new Azure DW and loading data from an Azure DB. Currently, Azure DW doesn't support linked servers and/or elastic query (this is only supported in Azure DB). Due to price, we cannot use Data Factory or an instance of SSIS. We can't use bcp because we don't have a local directory to hold the file between loads.
Is it possible to use Azure PowerShell with sqlcmd to write results of a query directly to Azure Storage, without having to write to a file on a local directory in between?
Are there other options that aren't mentioned above?
Thank you for any input.
The current Azure Storage PowerShell cmdlet (Set-AzureStorageBlobContent) only supports uploading a blob from a local file.
The Azure Storage Client Library (https://github.com/Azure/azure-storage-net) supports uploading a blob from a stream; you could develop your own application with the Azure Storage Client Library.
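A rough sketch of that approach with the Microsoft.WindowsAzure.Storage package from that repository; the connection string, container, and blob names are placeholders, and on .NET Core you would use the *Async variants of these calls.

    using System.IO;
    using Microsoft.WindowsAzure.Storage;       // Azure Storage Client Library (azure-storage-net)
    using Microsoft.WindowsAzure.Storage.Blob;

    class StreamToBlob
    {
        static void Main()
        {
            // Placeholder connection string and names.
            var account = CloudStorageAccount.Parse("<storage-connection-string>");
            var container = account.CreateCloudBlobClient()
                                   .GetContainerReference("query-results");
            container.CreateIfNotExists();

            var blob = container.GetBlockBlobReference("results.csv");

            // Any readable stream works here, e.g. query output written to a
            // MemoryStream, so nothing has to be staged on a local directory.
            using (var stream = new MemoryStream())
            {
                // ... write the query results into the stream, then rewind ...
                stream.Position = 0;
                blob.UploadFromStream(stream);
            }
        }
    }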
If your data is big, you can also try https://github.com/Azure/azure-storage-net-data-movement/, which has better performance when uploading large blobs.

Tool to migrate Azure storage to local development storage

Are there any good tools to take a snapshot of my Azure tables and blob containers and copy it into local development storage?
Developers sometimes need to work in an isolated environment but would like a copy of some "real" application data. Right now we have data-creation scripts that we can run to populate local storage, but it would be helpful to be able to grab a snapshot and move it into development storage.
I generally use Cloud Storage Studio for all handling of Azure Storage. Using that, you can easily download from your live blob storage and then upload to your local storage.
You can also use the Azure Storage Synctool to upload the local storage to a live storage blob on Azure, or download in the other direction.