How to create a BACPAC file from a point-in-time restore in Azure SQL

I have a requirement to be able to 'tag' point-in-time (PIT) restores so they are held for longer (and then later use the restore on demand). The only way I can think to do this is to export the requested point-in-time restore to a BACPAC and save it in Azure Storage.

"PIT restores so they are held for longer."
The Azure SQL Database service protects all databases with an automated backup system. These backups are retained for 7 days for Basic and 35 days for Standard and Premium.
"The only way I can think to do this is to export the requested point-in-time restore to a BACPAC and save it in Azure Storage."
These are two different approaches: you either restore a database from a point in time, or from a BACPAC file that is stored in an Azure Storage account or on a local machine.
You cannot perform a point-in-time restore from a BACPAC file.
Refer to Azure SQL Database Point in Time Restore and Export to a BACPAC file - Azure SQL Database to know more.
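One way to combine the two approaches is to first restore the point in time into a new database, then export that new database to a BACPAC. A sketch using the Azure CLI, where all resource names, credentials, and the storage key are placeholders:

```shell
#!/usr/bin/env sh
# Sketch: restore a point-in-time copy, then export it to a BACPAC.
# All names, credentials, and the storage key below are assumptions.
RG="my-resource-group"
SERVER="my-sql-server"

# 1. Restore the source database as of a UTC timestamp into a new database.
az sql db restore \
  --resource-group "$RG" --server "$SERVER" \
  --name "livedb" \
  --dest-name "livedb-pit-2024-01-15" \
  --time "2024-01-15T03:00:00Z"

# 2. Export the restored copy to a BACPAC in Blob Storage.
az sql db export \
  --resource-group "$RG" --server "$SERVER" \
  --name "livedb-pit-2024-01-15" \
  --admin-user "sqladmin" --admin-password "<password>" \
  --storage-key-type StorageAccessKey --storage-key "<storage-account-key>" \
  --storage-uri "https://mystorage.blob.core.windows.net/backups/livedb-pit-2024-01-15.bacpac"

# 3. Optionally drop the restored copy once the export completes,
#    so you only pay for the BACPAC sitting in storage.
az sql db delete --resource-group "$RG" --server "$SERVER" \
  --name "livedb-pit-2024-01-15" --yes
```

This keeps the PIT restore within the retention window workable: the BACPAC in Blob Storage outlives the 7/35-day automated backup retention.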

Related

Azure SQL database data archive solution

We have an Azure SQL database and need to build a data archival solution to manage data more than 2 years old.
Requirements:
Archive and delete data more than 2 years old from certain transaction tables.
Archive the data in low-cost storage.
Be able to quickly restore the data if required.
A recurring job should execute every week.
Please recommend an Azure solution to achieve this.
Here are the approaches you can try:
If you want to archive the complete database, you can export a BACPAC directly from the Azure portal. The BACPAC file will be stored in an existing Azure Storage account. Once done, you can delete the data from the database. Refer to Export to a BACPAC file - Azure SQL Database.
If you only need to archive the data older than 2 years, you can create a stored procedure for each table in your database and run it using the Stored Procedure activity in Azure Data Factory.
The cheapest (free) option for storing archive data is to create a BACPAC file and keep it on a local machine. Alternatively, you can use the Blob Storage cool tier for archiving data.
To restore data from a BACPAC, simply import the BACPAC file into your database. To restore from Blob Storage, move the blob from the cool tier back to hot and use ADF to copy from that file into the destination database.
If you are using the Stored Procedure activity in ADF as mentioned in point 2, you can trigger the ADF pipeline to run your stored procedure weekly/monthly or on whatever schedule your requirement calls for. Refer to Schedule Trigger in ADF.
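The per-table stored procedure in point 2 could follow a delete-with-OUTPUT pattern, so rows are moved into an archive table and removed from the live table in one statement. A minimal sketch run through sqlcmd, where the server, credentials, and the Orders/OrdersArchive tables (with an OrderDate column) are all assumptions for illustration:

```shell
# Sketch: create and run an archival stored procedure with sqlcmd.
# Server, database, credentials, and table names are placeholders.
sqlcmd -S my-sql-server.database.windows.net -d livedb \
       -U sqladmin -P '<password>' -Q "
CREATE OR ALTER PROCEDURE dbo.ArchiveOldOrders
AS
BEGIN
    SET NOCOUNT ON;
    -- DELETE ... OUTPUT writes each deleted row into OrdersArchive,
    -- so archive and delete happen in a single atomic statement.
    DELETE FROM dbo.Orders
    OUTPUT DELETED.* INTO dbo.OrdersArchive
    WHERE OrderDate < DATEADD(year, -2, GETUTCDATE());
END;
EXEC dbo.ArchiveOldOrders;
"
```

For tables with millions of qualifying rows, deleting in batches (DELETE TOP (n) in a loop) is the usual variation to keep transaction log growth under control.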

My copy of Azure database seems to be synced with the original

I'm new to the subject but I needed a copy of an Azure database to try some improvements on views.
I managed to make the copy but now I notice that the data is synced, i.e. new data in the original database is also found in the copy.
I used the restore function of the Azure Portal. Just gave the database a name and executed.
Now I'm confused and reluctant to make any changes. Is this really a copy? I don't mind the sync but can it be switched off?
The Copy option of the portal does not keep the copy of the database in sync with the original database.
Another option is to choose Export on the Azure portal menu for that database. It will export the database as a BACPAC to an Azure Storage account. After that you can import the BACPAC to create a new Azure SQL Database, or you can download the BACPAC to a local computer and import it into a local SQL Server instance.
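For reference, a portal-style copy can also be made from the CLI; the result is an independent database, not a synced replica. A sketch with placeholder resource names:

```shell
# Sketch: make a one-time, independent copy of an Azure SQL database.
# Resource group, server, and database names are placeholders.
az sql db copy \
  --resource-group "my-resource-group" \
  --server "my-sql-server" \
  --name "livedb" \
  --dest-name "livedb-dev-copy"
# The copy is transactionally consistent as of the moment it completes
# and does NOT receive subsequent writes made to livedb.
```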
Although I was assured within my company that there was no test environment, there was in fact a server running, and I happened to give the copy database the wrong (or rather, right) name.
For years this server had been logging errors, until a database popped up with the right name.
Sorry to have confused you all, and myself.

Azure SQL: Archive a sql database before deleting

I have a number of SQL databases in Azure SQL which I believe are no longer in use.
I'm planning on deleting them; however, as a precaution, I would like to take some kind of backup or archive copy that I can quickly use to restore each database if necessary.
I've googled around but haven't found anything concrete. I found one mention of making a copy in a storage account so that it can be recovered, but haven't been able to find out how to do it; the copy command makes a copy of the database to another database, and the "Restore" option disappears after you delete a database.
The databases in question are all less than 10 MB in size.
Please consider exporting the database as a BACPAC to a cheap Blob Storage account.
In the Storage field of the Export blade you can select an existing Storage Account or create a new one.
If you need to recover one of those databases, you just need to import it, specifying the location of the BACPAC you want to import.
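The recovery side can be scripted with the Azure CLI as well: create an empty database, then import the archived BACPAC into it. A sketch, with all names, credentials, the service objective, and the storage key as placeholders:

```shell
# Sketch: bring a deleted database back from its archived BACPAC.
# All names, credentials, and the storage key are placeholders.
RG="my-resource-group"
SERVER="my-sql-server"

# 1. Create an empty target database (service objective is an assumption).
az sql db create --resource-group "$RG" --server "$SERVER" \
  --name "restored-db" --service-objective S0

# 2. Import the BACPAC from Blob Storage into it.
az sql db import --resource-group "$RG" --server "$SERVER" \
  --name "restored-db" \
  --admin-user "sqladmin" --admin-password "<password>" \
  --storage-key-type StorageAccessKey --storage-key "<storage-account-key>" \
  --storage-uri "https://mystorage.blob.core.windows.net/backups/old-db.bacpac"
```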
You can export your Azure SQL database to BACPAC backup files and store these backup files in Azure Blob Storage or on your on-premises computer.
You can restore your Azure SQL database from the BACPAC files whenever you need to.
Alberto Morllo described how to export the database to Blob Storage from the portal.
Besides this, there are several other ways to do it:
Export to a BACPAC file using the SqlPackage utility
Export to a BACPAC file using SQL Server Management Studio (SSMS)
Export to a BACPAC file using PowerShell
For more details, please reference:
Quickstart: Import a BACPAC file to a database in Azure SQL Database
Export an Azure SQL database to a BACPAC file
You can choose whichever way you like best.
Hope this helps.
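As an illustration of the SqlPackage option above, a sketch covering both export before deletion and import if recovery is needed (server, database names, and credentials are placeholders):

```shell
# Sketch: export a small database to a local BACPAC with SqlPackage,
# then re-import it later. Names and credentials are placeholders.
SqlPackage /Action:Export \
  /SourceServerName:"my-sql-server.database.windows.net" \
  /SourceDatabaseName:"old-db" \
  /SourceUser:"sqladmin" /SourcePassword:"<password>" \
  /TargetFile:"old-db.bacpac"

# To restore, import the BACPAC into a new database:
SqlPackage /Action:Import \
  /TargetServerName:"my-sql-server.database.windows.net" \
  /TargetDatabaseName:"old-db-restored" \
  /TargetUser:"sqladmin" /TargetPassword:"<password>" \
  /SourceFile:"old-db.bacpac"
```

For 10 MB databases these exports finish quickly and local BACPAC files cost nothing to keep.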

Data movement from SQL on-premise to SQL Azure

I have migrated my database schema to SQL Azure, but I have a huge number (millions) of data records to be migrated. Please suggest an approach to move the data.
Approaches I have tried:
The SQLAzureMW tool (but it takes 14 hours, which is not feasible for me)
Import/export on SQL Server (this is also taking a long time)
Are there any other approaches? Need help!
For large datasets you usually have to take a more imaginative approach to migration!
One possible approach is:
Take a full data backup, ensuring that transaction logs are committed and cleared at the same time.
Upload the backup, or use Azure Import/Export, to get it into Azure Blob storage.
Synchronize your transaction logs with Azure Blob storage.
Create an Azure SQL database and import the backup.
Replay the transaction logs.
Keep in sync with the transaction logs until you are ready to switch over.
If the SQL Azure Migration Wizard takes 14 hours and your database is Azure-compatible, you have four other choices:
Export locally to a BACPAC, upload the BACPAC to Azure, and import the BACPAC into Azure SQL.
Export the BACPAC directly to Azure and then import it into Azure SQL.
Use the SSMS migration wizard with the most recent version of SSMS (it includes a number of functional and performance enhancements).
Use SQL Server transactional replication (see the additional requirements for this option). This last option lets you migrate to SQL DB incrementally, and then, once SQL DB is current with your on-premises database, just cut your application(s) over to SQL DB with minimal downtime.
For more information, see https://azure.microsoft.com/en-us/documentation/articles/sql-database-cloud-migrate/#options-to-migrate-a-compatible-database-to-azure-sql-database
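For raw data movement of millions of rows, the bcp bulk-copy utility is another commonly used option alongside the choices above. A per-table sketch, where the servers, database names, table, and credentials are placeholders:

```shell
# Sketch: bulk-copy one table from on-premises SQL Server to Azure SQL.
# Servers, databases, table, and credentials are placeholders.
# 1. Export the table to a native-format data file (-n).
bcp dbo.Orders out orders.dat -n \
  -S onprem-sql-server -d SourceDb -U sa -P '<password>'

# 2. Bulk-insert the file into the Azure SQL database.
#    -b sets the batch size so each batch commits separately.
bcp dbo.Orders in orders.dat -n -b 10000 \
  -S my-sql-server.database.windows.net -d TargetDb \
  -U sqladmin -P '<password>'
```

Because bcp moves only data (the schema must already exist in the target, as it does here), it pairs well with a schema-only migration done up front.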

SQL Azure, frequency of DB copies and backup strategy

I am trying to design our DB backup strategy for SQL Azure. In the first instance, transactions will be about 200/day.
The scenarios I will be protecting against are:
1) Complete DB loss, failure, or corruption, which is essentially covered by SQL Azure's replication policy: it keeps one primary and two secondary copies.
2) Corruption of records by buggy code or user error. I would not want to restore a whole DB for this; my current thought is to use a DB copy from a previous period (maybe the previous night) and do a data compare. The tool I have in mind is SQL Server Data Tools, as used in Visual Studio.
My current plan is to take a DB copy once a day, overnight, i.e.
CREATE DATABASE liveDB_copy AS COPY OF liveDB
I think MS talks about a rolling 3-copy procedure; in my case 3 days of backups would be kept, then copy 1 would be overwritten by copy 4.
Also do a DB export, as recommended by MS.
Thoughts?
Since the question was "thoughts?", here are some:
Be advised, a DB export via the Azure management pages produces a BACPAC, which is not transactionally consistent (see http://msdn.microsoft.com/en-us/library/hh335292.aspx).
A DB copy as you described above (Create Database as copy of liveDB) is transactionally consistent, but when it is finished it is billable, as it will have the same edition and database size as the source database (see http://msdn.microsoft.com/en-us/library/azure/ff951631.aspx).
As JuneT mentions, using an Automated Export is transactionally consistent, because it first creates a DB copy and then does a DB export. Because databases are billed in increments of a day no matter how long the copy is online, a daily backup means paying double the cost of your source database, since the copy comes online before the BACPAC is produced. Once the BACPAC is produced, the copy is no longer needed by Automated Export, but you'll still be charged for a prorated day of use. Your retention settings will also impact billing as it relates to storage accounts, since you'll be paying for the stored BACPACs in terms of their size.
See http://blogs.msdn.com/b/sql-bi-sap-cloud-crm_all_in_one_place/archive/2013/07/24/sql-azure-automated-database-export.aspx and http://msdn.microsoft.com/library/azure/ee621788.aspx
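The rolling three-copy procedure described in the question could be scripted along these lines; a sketch using the Azure CLI and date-stamped copy names, where all names are placeholders and the GNU date syntax is an assumption:

```shell
#!/usr/bin/env sh
# Sketch: nightly rolling copy, keeping 3 days of copies.
# Names are placeholders; schedule with cron or an Automation job.
# Note: 'date -d' as used here is GNU date syntax.
RG="my-resource-group"
SERVER="my-sql-server"
TODAY=$(date -u +%F)
OLD=$(date -u -d '3 days ago' +%F)

# Create tonight's transactionally consistent copy.
az sql db copy --resource-group "$RG" --server "$SERVER" \
  --name "liveDB" --dest-name "liveDB-copy-$TODAY"

# Drop the copy from 3 days ago, if it exists.
az sql db delete --resource-group "$RG" --server "$SERVER" \
  --name "liveDB-copy-$OLD" --yes || true
```

As the answer above notes, each copy is billed as a full database for at least a prorated day, so the number of copies you retain directly drives cost.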
SQL Azure has a built-in feature for restoring from this sort of issue:
see http://msdn.microsoft.com/en-us/library/hh852669.aspx
All Azure SQL Databases are automatically backed up, and the recovery options vary based on the edition of the database. Basic databases allow you to restore the database to its state as of the last backup (taken once per 24 hours).
Standard and Premium edition databases allow restore to any point in time.