How to replicate a snapshot of an Azure SQL database from a Dev subscription to a Test subscription? - azure-sql-database

I have a Dev Azure SQL server in a Dev subscription and a Test Azure SQL server in a Test subscription.
I would like to copy a snapshot of an Azure SQL database from "dev" to "test" with a script, and eventually automate it with an Azure DevOps pipeline.
I have successfully copied a database inside the "Dev" subscription with New-AzSqlDatabaseCopy.
However, that cmdlet does not have source and destination subscription parameters.
What is the best method to replicate a snapshot of a database from one subscription to another?

This can be done with T-SQL. If the login is a member of the dbmanager role or is a server administrator on both the source and target servers/subscriptions, the command below copies Database1 on server1 to a new database named Database2 on server2. Depending on the size of your database, the copy operation might take some time to complete.
Execute this on the master database of the target server (server2) to start copying from server1 to server2:
CREATE DATABASE Database2 AS COPY OF server1.Database1;
Note: The Azure portal, PowerShell, and the Azure CLI do not support database copy to a different subscription.
Reference Link for more details: https://learn.microsoft.com/en-us/azure/azure-sql/database/database-copy?view=azuresql&tabs=azure-powershell#copy-to-a-different-subscription
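Since the question also asks for a script that can run in an Azure DevOps pipeline, here is a minimal PowerShell sketch that simply drives the T-SQL above with Invoke-Sqlcmd (it is the dedicated copy cmdlets that lack cross-subscription support, not T-SQL run from PowerShell). It assumes the SqlServer module; every server, database, and login name is a placeholder, and the login must be a server administrator or a dbmanager member on both servers.

# Run from a pipeline agent; all names below are placeholders.
$targetServer = "test-sqlserver.database.windows.net"
$cred = Get-Credential   # a login that exists on BOTH servers

# Start the copy from the target server's master database.
Invoke-Sqlcmd -ServerInstance $targetServer -Database "master" -Credential $cred `
    -Query "CREATE DATABASE Database2 AS COPY OF [dev-sqlserver].Database1;"

# The copy is asynchronous; sys.dm_database_copies lists it while in flight.
do {
    Start-Sleep -Seconds 30
    $inFlight = Invoke-Sqlcmd -ServerInstance $targetServer -Database "master" `
        -Credential $cred -Query "SELECT percent_complete FROM sys.dm_database_copies;"
} while ($inFlight)
# Verify the new database afterwards (the row also disappears if the copy fails).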

Related

Dev Environment creation from Prod for Azure SQL

I have a production server in Azure SQL. I have created another empty server (dev) for development purposes. I need a copy of the tables, views, and stored procedures on the dev server as well. Please suggest a way to transfer the data to the dev server database.
#John11: You can take a backup of your production database and then simply restore it on your dev server; a sketch of that route follows below.
Ideally, restoring production data to dev is not advisable if the data is highly confidential.
If you just need to move the schema without data, you can use DevOps / CI-CD to deploy the artifacts to dev.
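For Azure SQL, the backup/restore suggestion translates to a .bacpac export/import. A rough sketch with the Az.Sql module; all resource groups, server names, URIs, keys, and service objectives below are placeholders:

# All names and secrets are placeholders.
$adminPwd   = Read-Host -Prompt "SQL admin password" -AsSecureString
$storageKey = "<storage-account-key>"
$bacpacUri  = "https://mystorage.blob.core.windows.net/backups/AppDb.bacpac"

# Export the production database to blob storage as a .bacpac.
$export = New-AzSqlDatabaseExport -ResourceGroupName "prod-rg" -ServerName "prod-server" `
    -DatabaseName "AppDb" -StorageKeyType "StorageAccessKey" -StorageKey $storageKey `
    -StorageUri $bacpacUri -AdministratorLogin "sqladmin" -AdministratorLoginPassword $adminPwd

# Wait for the export to finish before importing.
while ((Get-AzSqlDatabaseImportExportStatus -OperationStatusLink $export.OperationStatusLink).Status -eq "InProgress") {
    Start-Sleep -Seconds 30
}

# Import the .bacpac into the dev server as a new database.
New-AzSqlDatabaseImport -ResourceGroupName "dev-rg" -ServerName "dev-server" `
    -DatabaseName "AppDb" -StorageKeyType "StorageAccessKey" -StorageKey $storageKey `
    -StorageUri $bacpacUri -AdministratorLogin "sqladmin" -AdministratorLoginPassword $adminPwd `
    -Edition "Standard" -ServiceObjectiveName "S2" -DatabaseMaxSizeBytes 268435456000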

Copy Data From On-Premise SQL Server To Azure SQL - Azure Private Network

Requirement: I want to copy data from a specific table/view residing on an on-premises SQL Server to an Azure SQL DB.
Infrastructure: As depicted in the picture below, the Azure network is directly connected to the corporate network over ExpressRoute. Thus it's a purely private network connection, as good as the corporate network itself.
Issue/Question: I know there are multiple approaches to get this done, and I am not restricted to the ADF Copy Data tool only. But for all of them I see some caveats or extra steps needed, as below:
ADF Copy Data tool: Needs a self-hosted integration runtime (SHIR), and a small MSI package needs to be installed on the on-premises machine that hosts the SQL Server for registration purposes.
Logic Apps: Needs a virtual network gateway (or) an ASE.
App Service: If the operation is wrapped in a C# application and I choose to deploy it to an Azure Web App, then in order to connect to the on-premises SQL Server we need to set up the Hybrid Connection Manager, and as in #1 we need to install something on the on-premises machine.
In my case, none of these extra steps can be done. Essentially, the on-premises SQL Server belongs to a different BU, so I don't have any permissions there, except that they have granted access to a table/view.
Moreover, since it's connected over ExpressRoute as a direct connection, as can be seen in the picture above, both the on-premises and Azure SQL servers are essentially inside the same corporate network. Thus, I should be able to access them directly without configuring any of the extra steps mentioned above.
Please confirm and provide a suggestion.
Thank You.
You can still go with the ADF scenario without a SHIR by creating the data factory in a managed VNet and using a private endpoint. As you already have an ExpressRoute circuit and have the flexibility to configure the Azure side, you can do this with the Azure IR: Access on-premises SQL Server from Data Factory Managed VNet using Private Endpoint - Azure Data Factory | Microsoft Docs
There are two solutions which could work for your scenario, but even for them to work you would need some access to the on-prem SQL Server machine, at least for one-time configuration, and the Azure SQL DB should be accessible via SSMS installed on the on-prem machine.
Using a linked server
You can create a linked server (the process is explained here: https://www.sqlshack.com/create-linked-server-azure-sql-database/) on the on-prem server and create a SQL Server Agent job to insert data into the Azure SQL DB table; a rough sketch follows below.
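A loose sketch of that option, run from the on-prem server; the provider, server, login, database, and table names below are all assumptions for illustration:

# One-time setup on the on-prem server; every name and login is a placeholder.
$setupSql = @"
EXEC master.dbo.sp_addlinkedserver
    @server = N'AzureSqlDb', @srvproduct = N'',
    @provider = N'MSOLEDBSQL',
    @datasrc = N'test-sqlserver.database.windows.net',
    @catalog = N'TargetDb';
EXEC master.dbo.sp_addlinkedsrvlogin
    @rmtsrvname = N'AzureSqlDb', @useself = 'FALSE',
    @rmtuser = N'sqladmin', @rmtpassword = N'<password>';
"@
Invoke-Sqlcmd -ServerInstance "onprem-sql" -Query $setupSql

# What the recurring Agent job step would run: push rows from the granted
# view across the linked server into the Azure SQL table.
$copySql = @"
INSERT INTO [AzureSqlDb].[TargetDb].dbo.StagingTable (Id, Payload)
SELECT Id, Payload FROM dbo.GrantedView;
"@
Invoke-Sqlcmd -ServerInstance "onprem-sql" -Database "SourceDb" -Query $copySql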
Via a Python script
This would need a Python installation on the on-prem machine. Once installed, you can write a script to transfer data between the on-prem SQL Server and the Azure SQL DB. You can schedule this script, again using a SQL Server Agent job.

How to migrate .trn files into Azure databases?

I am receiving multiple .trn files on a daily basis and I am restoring those files into an on-premises SQL database. Now, how can we migrate those .trn files to Azure daily?
Connect the on-premises SQL database to Azure Data Factory using a self-hosted IR. A self-hosted integration runtime can run copy activities between a cloud data store and a data store in a private network.
Create a pipeline and use a copy activity.
Select the on-premises SQL database as the source.
Select the Azure service of your choice as the sink.
Now use a trigger to copy data from on-premises to Azure periodically.
Refer – Copy activity link
Also refer – Trigger link
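As a quick illustration, once the pipeline and trigger exist they can also be driven from PowerShell with the Az.DataFactory module; the factory, pipeline, and trigger names here are made up:

# Hypothetical names; runs the copy pipeline once, then starts the schedule.
Invoke-AzDataFactoryV2Pipeline -ResourceGroupName "rg-data" `
    -DataFactoryName "adf-trn-migration" -PipelineName "CopyOnPremToAzure"
Start-AzDataFactoryV2Trigger -ResourceGroupName "rg-data" `
    -DataFactoryName "adf-trn-migration" -Name "DailyTrigger"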

SQL server to Azure process workflow migration

We are supporting a legacy system for our organisation. In the current scenario, we receive a SQL Server backup (.bak files) from the application vendor on an FTP location. Every Sunday it is a full backup, and on every other day it is a differential one.
On our side, we have a SQL Server instance running with custom stored procedures written and scheduled to check the location every morning and restore the backups each day. These restored backups are then used by the organisation for internal reporting purposes. There are hundreds of other stored procedures written for different reports in different DBs on the same instance.
Since SQL Server 2008 is now out of support, and to save the cost of running an on-premises system, my team has been tasked with looking into migrating this whole system to Azure SQL Database.
My question is: what is the most effective way to move this workflow to the cloud? I have an Azure trial account set up to experiment with, but I haven't been successful in restoring the .bak files on an Azure SQL instance.
Thanks.
You essentially have two options in Azure: either perform a fairly linear lift-and-shift to SQL Server on an Azure VM, or go with a more advanced Azure PaaS offering, Azure SQL Database Managed Instance. The specific deployment option Azure SQL Database (single instance) will not support what your current solution requires with regard to .bak files; I have detailed that below. For further details on the differences between Azure SQL Database single instance and Managed Instance, please see: Features comparison: Azure SQL Database and Azure SQL Managed Instance
The second option is to leverage the Azure Enterprise Ready Analytics Architecture (AERAA) (link) of Azure (PaaS) analytics services. With Azure SQL Database (PaaS) services, as opposed to on-premises SQL Server or SQL Server on an Azure VM, there is no Integration Services or Analysis Services bundled as a service component. These services are separate PaaS offerings, and with the help of the linked AERAA blog you can gain a better understanding of the Azure analytics services.
The .bak versus .bacpac support dilemma:
Since the main requirement for your solution is support for .bak files, you need to understand where .bak and where .bacpac files are supported. The term Azure SQL Database applies both to a specific deployment option of the Azure SQL (PaaS) database service and as a general term for Azure SQL cloud databases. As for the specific deployment option, Azure SQL Database (neither single database nor elastic pools) will support your scenario with .bak files. This deployment option supports export/import functionality via the .bacpac file format; it does not support full/partial restore functionality. The backup/restore functionality, although configurable, is only in scope for the specific databases hosted by an Azure SQL (logical) server instance. Basically, you cannot restore an external file; you can import, which is always a full copy. So, for that reason, for an Azure PaaS database service you will need Azure SQL Database Managed Instance for .bak file support, or you can deploy a SQL Server image to an Azure VM and migrate your objects via the Azure Database Migration Service.
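To make the Managed Instance path concrete, here is a hedged sketch of a native restore from blob storage; the storage account, SAS secret, instance address, and database name are all placeholders:

# Placeholders throughout; assumes the vendor's .bak has been uploaded to a
# blob container and you hold a SAS token for it. Managed Instance can
# restore a full .bak natively from URL, which single database cannot.
$restoreSql = @"
CREATE CREDENTIAL [https://mystorage.blob.core.windows.net/backups]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE', SECRET = '<sas-token>';
RESTORE DATABASE VendorDb
FROM URL = 'https://mystorage.blob.core.windows.net/backups/VendorDb_full.bak';
"@
Invoke-Sqlcmd -ServerInstance "mi-name.dnszone.database.windows.net" `
    -Database "master" -Credential (Get-Credential) -Query $restoreSql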
Regards,
Mike

Migrating SQL Server database from AWS to Azure

I have a large database on an AWS instance running SQL Server 2008 on Windows Server 2008 R2.
The database is constantly changing and writing information, and its size is about 100 GB.
I wish to migrate from our Amazon services to Microsoft Azure.
But I cannot afford to lose more than 20-30 minutes of data.
I don't mind using Azure SQL or running SQL Server in a VM in the Azure cloud, but I must keep the databases live and updated; there are a few main tables that information is constantly being added to.
What would be the best way to do so?
If you are using an AWS instance (not RDS) and you are going to an Azure instance (not Azure SQL Database), you can use log shipping or something similar to get the downtime down to a few seconds: http://msdn.microsoft.com/en-us/library/ms187103.aspx
The steps you need to take:
1. Take a full backup on AWS.
2. Restore the full backup without recovery on Azure.
3. Take a log backup on AWS.
4. Restore the log backup without recovery on Azure.
5. Repeat steps 3 and 4 until an iteration is short enough (you probably want to script this out).
6. Take the app offline.
7. Take another log backup on AWS.
8. Restore that log backup WITH recovery on Azure.
9. Repoint the app to Azure.
10. Bring the app online again.
Steps 3, 4, and 5 are what log shipping would automate, but you could just write a PowerShell script too; a rough sketch of that loop follows.
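A rough sketch of steps 3-4 with the SqlServer module; every server name and path is a placeholder, and the share must be reachable from both machines:

# Steps 3-4 as a script; all names and paths are placeholders.
$share = "\\transfer\logship"   # assumption: a file share both servers can reach

# 3. Take a log backup on the AWS instance.
$file = Join-Path $share ("MyDb_{0:yyyyMMddHHmmss}.trn" -f (Get-Date))
Backup-SqlDatabase -ServerInstance "aws-sql" -Database "MyDb" `
    -BackupAction Log -BackupFile $file

# 4. Restore it on the Azure VM without recovery so further logs can follow.
Restore-SqlDatabase -ServerInstance "azure-sql-vm" -Database "MyDb" `
    -RestoreAction Log -BackupFile $file -NoRecovery

# 5. Schedule the two calls above to repeat; for the cutover, run one last
#    log backup/restore omitting -NoRecovery to bring the database online.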