SQL Azure test database transfer every night - azure-sql-database

I have a test database in SQL Azure, and I'd like to restore the live database (SQL Azure) onto it at 3am each day, overwriting all the data so that the test database becomes a copy of the previous day's data.
What is the best practice for doing this inside of Azure?

Azure SQL Database always creates a new database when restoring. Because of this, you cannot restore into an existing database; however, you can create a new database using restore and later delete it when you are ready to create the next day's restored database.
One approach for accomplishing this would be to leverage Azure Automation.
If you go this route you will first need to configure an automation account and set up a runbook to run a workflow. Here are the steps:
Set up an Azure Automation account and runbook. See the guide here: http://azure.microsoft.com/blog/2014/08/27/azure-automation-authenticating-to-azure-using-azure-active-directory/
After you set up the Automation account, create a runbook and a PowerShell workflow to run the task. Here is a sample workflow that restores a database to a copy: https://gallery.technet.microsoft.com/Azure-SQL-Database-Daily-cbd4f15d
After you save and publish this runbook on your account, set up a schedule for it. This can be done by selecting the runbook's Schedule tab in the portal.
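For illustration, here is a minimal sketch of what such a runbook could look like with the current Az.Sql module (the linked sample uses the older Azure module). All resource names are placeholders, and the Automation identity is assumed to already be authenticated:

```powershell
# Sketch of a nightly restore runbook, assuming the Az.Sql module and an
# already-authenticated Automation identity. All names are placeholders.
$rg     = "MyResourceGroup"
$server = "myserver"
$liveDb = "LiveDb"
$testDb = "TestDb"

# Restore always creates a new database, so drop yesterday's copy first.
$existing = Get-AzSqlDatabase -ResourceGroupName $rg -ServerName $server `
    -DatabaseName $testDb -ErrorAction SilentlyContinue
if ($existing) {
    Remove-AzSqlDatabase -ResourceGroupName $rg -ServerName $server `
        -DatabaseName $testDb -Force
}

# Restore the live database under the test name, from a point in time a few
# minutes ago (safely inside the backup retention window).
$live = Get-AzSqlDatabase -ResourceGroupName $rg -ServerName $server -DatabaseName $liveDb
Restore-AzSqlDatabase -FromPointInTimeBackup `
    -PointInTime (Get-Date).ToUniversalTime().AddMinutes(-10) `
    -ResourceGroupName $rg -ServerName $server `
    -TargetDatabaseName $testDb -ResourceId $live.ResourceId
```

Scheduling this runbook for 3am daily then gives you a fresh copy of the previous day's data every morning.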
Hope this helps!

Related

How to set up a recurring migration through queries/scripts from SQL Server (on-premises) to an Azure SQL database

I need help syncing data to Azure SQL from SQL Server (on-premises).
Available resources:
2 databases (SQL Server) on-premises, available on different servers
Azure SQL database in the cloud
Migration scripts/queries ready to fetch data from the on-premises SQL Server
Requirements:
Set up a scheduler that runs every 12 hours, i.e. twice a day, on Azure SQL.
On each run, the migration scripts fetch data from the on-premises SQL Server and insert it into the Azure SQL database.
One of the most commonly used Azure SQL DB migration/replication features is Azure Data Sync.
There are two kinds of schedules: automatic and manual. Automatic schedules run on time intervals, with 5 minutes being the minimum. Manual schedules let the user run the sync whenever they want, using the Azure portal or PowerShell. The goal of the PowerShell workflow script is to run a one-time sync manually, then switch to automatic sync and configure the sync interval to your preference.
You can refer to these links to set up Data Sync: Set up Data Sync in the Azure portal, Set up Data Sync with PowerShell.
Reference: Schedule Data Sync with Azure SQL Database using an Automation account
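For reference, the manual-then-automatic pattern described above might look roughly like this with the Az.Sql cmdlets; the sync group is assumed to already exist, and all names below are placeholders:

```powershell
# Sketch: run one manual sync, then switch the sync group to an automatic
# 12-hour interval. Assumes Az.Sql and an existing sync group; all names
# are placeholders.
$params = @{
    ResourceGroupName = "MyResourceGroup"
    ServerName        = "myazureserver"
    DatabaseName      = "HubDb"          # Azure SQL hub database of the sync group
    SyncGroupName     = "MySyncGroup"
}

Start-AzSqlSyncGroupSync @params                        # one-time manual sync
Update-AzSqlSyncGroup @params -IntervalInSeconds 43200  # then every 12 hours
```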

What is the fastest way to load data into Azure Hyperscale?

I have a need to load data into Azure Hyperscale incrementally.
The source data is in an Azure VM that has SQL Server installed on it.
The source database is about 6 TB in size and has about 370 tables.
We need a way to get the incremental changes from the last X hours and sync them into the same database in Hyperscale.
Ideally, we would extend our database with an availability group setup, but since Hyperscale does not support that, we need to find another way to keep these in sync.
The source database does have change data capture (CDC) enabled.
The best online migration option is to use the Azure Database Migration Service (link), which supports the Online (continuous sync) migration scenario (link) you need:
The sync essentially runs in the background until completed, while the data that has already been migrated remains accessible. I believe this is a continuous-copy scenario, not an incremental one. With PaaS database services, you do not have access to perform snapshot replication operations from external data sources. The Hyperscale instance is built upon snapshot replication, but it currently serves only the hosted database's functionality.
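As an aside, if you did want to roll your own incremental pull using the CDC you already have enabled (rather than DMS), a rough sketch with Invoke-Sqlcmd from the SqlServer module could look like the following; the server, database, and capture instance name (dbo_MyTable) are placeholders:

```powershell
# Sketch: read the last 12 hours of CDC changes from the source SQL Server.
# Assumes the SqlServer module; server, database, and capture instance
# (dbo_MyTable) are placeholders.
$query = @"
DECLARE @from_lsn binary(10), @to_lsn binary(10);
SET @from_lsn = sys.fn_cdc_map_time_to_lsn(
    'smallest greater than or equal', DATEADD(HOUR, -12, GETDATE()));
SET @to_lsn = sys.fn_cdc_get_max_lsn();
SELECT * FROM cdc.fn_cdc_get_all_changes_dbo_MyTable(@from_lsn, @to_lsn, N'all');
"@

$changes = Invoke-Sqlcmd -ServerInstance "source-vm" -Database "SourceDb" -Query $query
```

You would then have to apply $changes to the Hyperscale side yourself, per table, which is exactly the work DMS does for you.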

My copy of Azure database seems to be synced with the original

I'm new to the subject, but I needed a copy of an Azure database to try some improvements on views.
I managed to make the copy, but now I notice that the data is synced, i.e. new data in the original database also turns up in the copy.
I used the restore function of the Azure portal; I just gave the database a name and executed.
Now I'm confused and reluctant to make any changes. Is this really a copy? I don't mind the sync, but can it be switched off?
The Copy option in the portal does not keep the copy of the database in sync with the original database.
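To see the difference, such a copy can also be created explicitly from PowerShell; this is a one-off, transactionally consistent copy with no ongoing sync (a sketch assuming the Az.Sql module; names are placeholders):

```powershell
# Sketch: make a one-off copy of a database on the same server. The copy is
# transactionally consistent as of the copy time and is never synced afterwards.
New-AzSqlDatabaseCopy -ResourceGroupName "MyResourceGroup" `
    -ServerName "myserver" -DatabaseName "MyDb" `
    -CopyDatabaseName "MyDb_Test"
```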
Another option is to choose Export on the Azure portal menu for that database. It will export the database as a bacpac to an Azure storage account. After that you can import the bacpac to create a new Azure SQL database, or you can download the bacpac to a local computer and import it into a local SQL Server instance.
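A hedged sketch of the export step with Az.Sql, assuming an existing storage account (all values below are placeholders):

```powershell
# Sketch: export a database to a bacpac in blob storage. Storage key, URI,
# and credentials are placeholders.
$cred = Get-Credential   # SQL server admin login

New-AzSqlDatabaseExport -ResourceGroupName "MyResourceGroup" `
    -ServerName "myserver" -DatabaseName "MyDb" `
    -StorageKeyType "StorageAccessKey" -StorageKey "<storage-account-key>" `
    -StorageUri "https://mystorage.blob.core.windows.net/bacpacs/MyDb.bacpac" `
    -AdministratorLogin $cred.UserName `
    -AdministratorLoginPassword $cred.Password
```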
Although I was assured within my company that there is no test environment, there was a server running, and I happened to give the copy database the wrong (or right) name.
For years this server had been logging errors, until a database popped up with the right name.
Sorry for confusing you all, and myself.

Push data to Azure SQL database

I am pretty new to Azure SQL Database. I have been given a task to push a 100-million-record text file to an Azure SQL database. I'm looking for suggestions on how to do it in an efficient manner.
You have several options for uploading on-premises data to your Azure SQL database:
SSIS - As Randy mentioned, you can create an SSIS package (using SSMS) and schedule a SQL Agent job to run this package periodically.
Azure Data Factory - You can define an ADF pipeline that periodically uploads data from your on-premises file to your Azure SQL database. Depending on your requirements, you might need just the initial 'connect and collect' part of the pipeline, or you might want to add further processing to it.
bcp - The 'bulk copy program' utility can be used to copy data between SQL Server and a data file (see the sketch after this list). Similar to the SSIS package, you can use a SQL Agent job to schedule periodic uploads using bcp.
SqlBulkCopy - I doubt you would need this, but in case you have to integrate the upload into your application programmatically, this .NET class achieves the same as the bcp utility (bcp is faster).
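As a concrete illustration of the bcp option, loading a comma-delimited text file might look like this when run from PowerShell; the server, credentials, table, and file names are placeholders:

```powershell
# Sketch: bulk-load a delimited text file into an Azure SQL table with bcp.
# -c = character mode, -t = field terminator, -b = rows per batch (keeps
# individual transactions small). All names are placeholders.
bcp dbo.TargetTable in "C:\data\records.txt" `
    -S "myserver.database.windows.net" -d "MyDb" `
    -U "loaduser" -P "<password>" `
    -c -t "," -b 50000
```

Batching with -b matters at 100 million rows: each batch commits separately, so a failure doesn't roll back the entire load and the transaction log stays manageable.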
I would do this via SSIS using SQL Server Management Studio (if it's a one-time operation). If you plan to do this repeatedly, you could schedule the SSIS job to execute on a schedule. SSIS does bulk inserts in small batches, so you shouldn't have transaction log issues, and it should be efficient (because of the bulk inserting). Before you do this insert, though, you will probably want to consider your performance tier so you don't get major throttling by Azure and possible timeouts.

In SQL, can I copy one database to another?

I have two databases for my customers, a LIVE database and a TEST database. Periodically my customers would like the LIVE data copied over to the TEST database so it has current data that they can mess around with.
I am looking for a simple way, perhaps a command I could run from my application, to copy all the data from one into the other.
Currently I have to remote in with their IT department or consultant and restore a backup of LIVE over TEST. I would prefer to have a button in my application that says RESTORE TEST and have it do that for me.
Any suggestions on how I could do this? Is it possible in SQL? Is there a tool out there in .NET that could help?
Thanks
If you have a backup plan, which I hope you do, you could simply restore the latest full .bak, if it is accessible to your application. However, this would require some tunneling for your application to reach the latest backup file, and that is generally a no-no for zones containing database servers.
Perhaps you could set up a scheduled delivery of a backup from machine to machine. Does the LIVE server have access to your TEST server? I wouldn't think a DBA would be inclined to set up backup delivery unless it was to a remote location for disaster recovery, and that is usually to dedicated locations, not a testing server. Maybe you could work out a scenario where your TEST server doubles as an extra remote backup location for the production environment.
It would be better to be shipped a backup and then periodically, or manually, start a job that restores from the local backup. This would take the burden off your application. Then you would only need to kick off the SQL job from within your app as needed (see the sketch below).
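For the "kick off the job from the app" part, one simple approach, sketched here with Invoke-Sqlcmd from the SqlServer module, is to start a pre-built SQL Agent restore job via msdb; the server and job names are placeholders:

```powershell
# Sketch: start a pre-built SQL Agent job that restores TEST from the latest
# shipped LIVE backup. Server and job names are placeholders.
Invoke-Sqlcmd -ServerInstance "test-server" -Database "msdb" `
    -Query "EXEC dbo.sp_start_job @job_name = N'Restore TEST from LIVE backup';"
```

The RESTORE TEST button in the application would then only need permission to execute sp_start_job, not direct access to the backup files themselves.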