I am new to Azure SQL Server and am trying to understand how to back up Azure SQL databases to Azure Blob Storage a few times a day.
My company currently uses Cherry Safe to back up its Azure SQL databases, but Cherry Safe is shutting down in 2 weeks.
From what I have read, it seems I can configure an export to Azure Blob Storage, but I do not see that option. I can see a history of exports, but I cannot find where to schedule or change the configuration.
For long-term retention, I see an option to configure a retention vault.
Are there replacement services out there that Cherry Safe users are using?
Do I need an external service, or can I configure the backups myself?
Thanks.
It seems that backing up to Azure Blob Storage is a feature of on-premises SQL Server, not of Azure SQL Database.
You could use SQLBackupAndFTP to back up an Azure SQL database to a local machine:
1. Connect SQLBackupAndFTP to the logical SQL server in Azure.
2. Create a job for regular Azure SQL Database backups.
Also, as you said, you could use long-term backup retention.
It lets you preserve weekly, monthly, and yearly backups for an extended period, up to 10 years.
Restore a database from a specific long-term backup if the database has been configured with a long-term retention policy. This allows you to restore an old version of the database to satisfy a compliance request or to run an old version of the application. See Long-term retention.
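If you prefer scripting to the portal, the same long-term retention policy can be configured with the Az PowerShell module. A minimal sketch, assuming the Az.Sql module is installed; the resource group, server, database name, and retention periods below are all placeholders:

# Minimal sketch: configure a long-term retention policy (Az.Sql module).
# All names and retention periods below are placeholders.
Connect-AzAccount

$policy = @{
    ResourceGroupName = "MyResourceGroup"
    ServerName        = "myserver"
    DatabaseName      = "MyDatabase"
    WeeklyRetention   = "P12W"  # keep a weekly backup for 12 weeks
    MonthlyRetention  = "P12M"  # keep a monthly backup for 12 months
    YearlyRetention   = "P5Y"   # keep a yearly backup for 5 years
    WeekOfYear        = 1       # which week's backup is kept as the yearly one
}
Set-AzSqlDatabaseBackupLongTermRetentionPolicy @policy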
You don't need any third-party tool to back up an Azure SQL database; the portal has a much better backup and retention tool built in. Read this: https://learn.microsoft.com/en-us/azure/sql-database/sql-database-long-term-backup-retention-configure
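As for the original goal of exporting to Azure Blob Storage a few times a day: the export that the portal runs can also be started from PowerShell and triggered on a schedule (for example from Azure Automation). A minimal sketch, assuming the Az.Sql module and an existing sign-in; every name, key, and URI below is a placeholder:

# Minimal sketch: start a .bacpac export to blob storage (Az.Sql module).
# All names, credentials, and URIs are placeholders.
$creds = Get-Credential  # SQL server admin login

New-AzSqlDatabaseExport `
    -ResourceGroupName "MyResourceGroup" `
    -ServerName "myserver" `
    -DatabaseName "MyDatabase" `
    -StorageKeyType "StorageAccessKey" `
    -StorageKey "<storage-account-key>" `
    -StorageUri "https://mystorage.blob.core.windows.net/backups/MyDatabase-$(Get-Date -Format yyyyMMddHHmm).bacpac" `
    -AdministratorLogin $creds.UserName `
    -AdministratorLoginPassword $creds.Password

The cmdlet returns an operation handle you can poll with Get-AzSqlDatabaseImportExportStatus.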
A little late for AWS hosting, but I guess it's never too late :)
I have a .NET MVC-based website hosted on a traditional Windows server with SQL Server. I get about 5,000 hits per day, the database is roughly 500 GB, and monthly traffic is about 50 GB.
I have to migrate this to AWS; what are the steps? I have a simple .NET C# MVC web app that connects to SQL Server. Also, how much would it cost to host a website with the above requirements on AWS?
Thanks in Advance
Mandy
It will be around $5xx USD per month based on the AWS calculator, and $4xx per month for an Azure VM + SQL managed database. If you want long-term hosting, I suggest purchasing your own SQL Server license; it will save more!
For reference: one EC2 instance + one SQL Server.
You should consider Amazon Lightsail, which has a single price for compute (including data transfer) and a single price for the database (including storage). However, Microsoft SQL Server is not available with Lightsail.
The alternative is to launch:
An Amazon EC2 instance for your application
An Amazon RDS database for your SQL Server
The price will vary based upon the size of instance and database you choose. Storage is an extra cost, based on how much you allocate.
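To make the EC2 + RDS option concrete, here is a rough sketch using the AWS CLI from PowerShell; the AMI ID, instance classes, key pair, and credentials are all placeholders, and the right sizes depend on your actual load:

# Rough sketch only; all IDs, names, and sizes are placeholders.
# 1. An EC2 instance to host the .NET MVC application (Windows AMI).
aws ec2 run-instances `
    --image-id ami-0123456789abcdef0 `
    --instance-type t3.large `
    --key-name my-keypair

# 2. An RDS SQL Server instance for the ~500 GB database.
aws rds create-db-instance `
    --db-instance-identifier my-sqlserver-db `
    --db-instance-class db.m5.xlarge `
    --engine sqlserver-se `
    --license-model license-included `
    --allocated-storage 600 `
    --master-username myadmin `
    --master-user-password "<password>"

Run both choices through the AWS pricing calculator before committing; storage and data transfer are billed separately.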
Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about a specific programming problem, a software algorithm, or software tools primarily used by programmers. If you believe the question would be on-topic on another Stack Exchange site, you can leave a comment to explain where the question may be able to be answered.
Closed 1 year ago.
Improve this question
I know Azure does its own backups in the cloud. However, due to company policy I need to generate a local, date-time-stamped backup copy of the database.
I've read this, and it has allowed me to create a .bacpac file and import it into our on-prem SQL Server (2019). What I want is a way to save the .bacpac file to a network folder on a regular basis.
UPDATE: no, I don't have to store the .bacpac file in an on-prem database; I only mentioned that to show I can do the extra step. What I really want is simply to save the .bacpac file, date-stamped in the filename, to an on-prem network folder.
If you don't mind using a third-party tool to regularly back up an Azure SQL database to a local machine, please see this blog: How to backup Azure SQL Database to Local Machine. It covers the available ways to back up an Azure database locally, including scheduled backups.
The blog suggests the tool SqlBackupAndFtp for scheduled backups. The output .bacpac file name follows a databasename+datetime schema, for example: Mydatabase202103250956.
It also gives a tutorial on backing up an Azure SQL database using the BCP utility:
bcp sqlftpbackupdb.SalesLT.CustomerAddress out c:\sqlfile\cust.dat -c -U daniel -S tcp:sqlftpbackupserver.database.windows.net
You could also refer to the official SqlPackage export documentation: https://learn.microsoft.com/en-us/sql/tools/sqlpackage/sqlpackage-export?WT.mc_id=DP-MVP-5001259&view=sql-server-ver15
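For exactly what the question asks (a date-stamped .bacpac saved to a network folder on a schedule), a small PowerShell wrapper around SqlPackage is enough. A minimal sketch; the SqlPackage install path, server, database, credentials, and share are all placeholders, and you would run it from Windows Task Scheduler:

# Minimal sketch: export a date-stamped .bacpac to a network share via SqlPackage.
# The SqlPackage path, connection string, and share path are placeholders.
$stamp  = Get-Date -Format "yyyyMMddHHmm"
$target = "\\fileserver\sqlbackups\MyDatabase$stamp.bacpac"

& "C:\Program Files\Microsoft SQL Server\160\DAC\bin\SqlPackage.exe" `
    /Action:Export `
    /SourceConnectionString:"Server=tcp:myserver.database.windows.net;Database=MyDatabase;User ID=myadmin;Password=<password>;Encrypt=True;" `
    /TargetFile:"$target"

Scheduled a few times a day, this produces file names matching the databasename+datetime schema mentioned above.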
HTH.
I have the following set up:
Azure service
Azure SQL database
Azure Table Storage
Azure Blob Storage
I am trying to develop a backup strategy for this service.
The catch is that the SQL data, the tables, and the blobs must be in sync: in a backup, all three have to be the same version (taken at the same moment). The main problem is that I can only afford a few minutes of downtime, no more.
What should I do? Maybe an existing solution already covers this?
Windows Azure Storage supports geo-replication for blobs, tables, and queues. Data in a storage account is made durable by replicating transactions across different storage nodes in the same region (LRS) or to a secondary region (GRS). GRS is the default redundancy option when creating a storage account. Refer to http://blogs.msdn.com/b/windowsazurestorage/archive/2013/12/11/introducing-read-access-geo-replicated-storage-ra-grs-for-windows-azure-storage.aspx for more details.
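For completeness, the redundancy option can also be set from a script rather than the portal. A one-step sketch using the current Az.Storage module; the account and resource group names are placeholders:

# Minimal sketch: switch a storage account to geo-redundant storage.
# Account and resource group names are placeholders.
Set-AzStorageAccount `
    -ResourceGroupName "MyResourceGroup" `
    -Name "mystorageaccount" `
    -SkuName "Standard_GRS"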
If you want to build a custom backup solution, you could use the techniques suggested in the two blog posts below:
1) http://blogs.msdn.com/b/windowsazurestorage/archive/2010/04/30/protecting-your-blobs-against-application-errors.aspx
2) http://blogs.msdn.com/b/windowsazurestorage/archive/2010/05/03/protecting-your-tables-against-application-errors.aspx
I am not sure of the exact use case for backing up Azure tables and blobs. You can back up all of the above services without downtime; at most there may be a slight glitch or a performance bottleneck on the SQL database during the backup.
The short answer is to write a custom script that reads the data from Azure Table Storage (or the SQL database, or whichever service you need), packages it into an archive, and stores it back; a minimal sketch follows.
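Here is a minimal sketch of that idea for the blob side, assuming the Az.Storage module and placeholder account and container names (table data would need a similar loop with a table-storage client):

# Minimal sketch: copy every blob into a date-stamped backup container.
# Account name, key, and container names are placeholders.
$ctx = New-AzStorageContext -StorageAccountName "mystorage" -StorageAccountKey "<key>"

$backup = "backup-$(Get-Date -Format yyyyMMddHHmm)"
New-AzStorageContainer -Name $backup -Context $ctx | Out-Null

Get-AzStorageBlob -Container "production" -Context $ctx | ForEach-Object {
    Start-AzStorageBlobCopy -SrcContainer "production" -SrcBlob $_.Name `
        -DestContainer $backup -DestBlob $_.Name -Context $ctx | Out-Null
}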
The important thing to decide is where the backups will be stored; broadly speaking, archives are generally stored in blob storage. If you store them on-premises instead, you need to account for local storage capacity, outbound bandwidth cost, and the latency of transferring the data out of Azure.
PS: cloud storage by itself has a good level of availability and durability, and you can further improve both by enabling geo-replication.
With SQL Server Transactional Replication, is it possible to also replicate users and roles from the source to the target database? I have not been able to do this.
No, with Transactional Replication you cannot replicate users and roles from source to target. The database objects that can be published with Transactional Replication are listed in Publish Data and Database Objects.
You will need to deploy the users and roles from the source to the target database using a pre- or post-snapshot script (a minimal sketch follows the references below). For information on pre- and post-snapshot scripts, refer to:
Execute Scripts Before and After the Snapshot is Applied
Execute Scripts Before and After the Snapshot is Applied (SQL Server Management Studio)
Configure Snapshot Properties (Replication Transact-SQL Programming)
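A minimal sketch of attaching a post-snapshot script to an existing publication, using Invoke-Sqlcmd from the SqlServer PowerShell module. The publication name and script path are placeholders, and whether @force_invalidate_snapshot is required in your case is an assumption to verify against the documents above:

# Minimal sketch: point a publication at a post-snapshot script that contains
# the CREATE USER / role statements. Names and paths are placeholders.
Invoke-Sqlcmd -ServerInstance "MyPublisher" -Database "MyPublishedDb" -Query @"
EXEC sp_changepublication
    @publication = N'MyPublication',
    @property = N'post_snapshot_script',
    @value = N'\\fileserver\repl\create_users_and_roles.sql',
    @force_invalidate_snapshot = 1;
"@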
I have developed and managed my dev database in SQL Azure, but I can't figure out how to copy the schema of my dev SQL Azure database so I can apply it to my production SQL Azure database. I have heard of third-party tools, but are they the best way? If so, what is a good tool for this?
http://sqlazuremw.codeplex.com/ is also quite a simple and efficient tool.
If your production database is empty and you just need to clone your current dev database, you can export your dev DB to a .bacpac file and then import it on the production server.
The old portal has that functionality under Database -> subscription -> server -> pick a database; Import/Export at the top.
New portal: DB, click the Servers tab, select a server -> Databases tab; Import/Export buttons on the bottom panel.
If you just need to migrate your schema to a production database that is already filled with data and shouldn't be interrupted, you can create a SQL project (if I remember correctly, you need SQL Server Data Tools, SSDT; it also ships with the SQL 2012 tools/Studio), compare your Azure DB schema against the empty project to script the schema out as SQL constructs, and then publish the newly created scripts to your production database.
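The same compare-and-publish workflow can also be driven from the command line with SqlPackage: extract a .dacpac from dev, then publish it against production, which applies only the schema differences. A minimal sketch; server, database, and credential values are placeholders, and you may want /Action:Script first to review the change script before touching production:

# Minimal sketch: schema-only migration via a dacpac. All names are placeholders.
& SqlPackage.exe /Action:Extract `
    /SourceConnectionString:"Server=tcp:devserver.database.windows.net;Database=DevDb;User ID=myadmin;Password=<password>;" `
    /TargetFile:"DevDb.dacpac"

& SqlPackage.exe /Action:Publish `
    /SourceFile:"DevDb.dacpac" `
    /TargetConnectionString:"Server=tcp:prodserver.database.windows.net;Database=ProdDb;User ID=myadmin;Password=<password>;"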
Two tools you should never be without when working with SQL are SQL Compare and SQL Data Compare by Redgate. They have saved me countless hours of work and will streamline what you want to do. I've used them with Azure and they work well and do the job for us.
See http://www.red-gate.com/products/sql-development/sql-developer-bundle/
The first product will compare and create your schema, and the second will let you sync your data. Sometimes it's just better to pay for the right tools, and this is one of those cases.
SQL Azure MW (http://sqlazuremw.codeplex.com/) works OK, although it did fall apart on me a couple of times when working with a lot of tables in the schema.
Another option is Azure Data Sync, but as far as I know, it also refuses to work if you have more than 500 tables in your database.
What I did in the end was run sqlcmd with a SQL dump of the database:
http://msdn.microsoft.com/en-us/library/windowsazure/ee336280.aspx
That took ages, but it worked fine. The big problem is how to get it back locally :-)