I'm trying to figure out if there is a way to back up Azure SQL databases (not SQL Server on Azure VMs) to a Recovery Services vault or to blob storage outside of the resource group the databases are located in. So far I have not found any resources on the topic. Can anyone confirm that this is not possible?
Yes, it's possible.
The easiest way is to use Export in the portal to back up the database to a blob storage account outside of the resource group the database is located in.
For example, in the portal: open the database, select Export, and on the Export database blade choose a blob storage account outside of the resource group the database is located in.
For more details, please see Export an Azure SQL database to a BACPAC file.
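If you prefer to script it, here is a minimal PowerShell sketch using New-AzureRmSqlDatabaseExport; the resource group, server, database, login, storage key, and URI below are all placeholder values, and the storage account can sit in a different resource group than the database:
# A minimal sketch, assuming the AzureRm SQL module is installed and you are signed in.
# Every name, the key, and the URI are placeholders.
$exportRequest = New-AzureRmSqlDatabaseExport `
    -ResourceGroupName "DatabaseResourceGroup" `
    -ServerName "myserver" `
    -DatabaseName "mydatabase" `
    -AdministratorLogin "sqladmin" `
    -AdministratorLoginPassword (ConvertTo-SecureString "password" -AsPlainText -Force) `
    -StorageKeyType "StorageAccessKey" `
    -StorageKey "<key of the storage account in the other resource group>" `
    -StorageUri "https://otherrgstorage.blob.core.windows.net/backups/mydatabase.bacpac"

# Poll the export until it finishes.
Get-AzureRmSqlDatabaseImportExportStatus -OperationStatusLink $exportRequest.OperationStatusLink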
Hope this helps.
I've got a handful of databases running on a SQL Server instance. I don't have access to be able to install the Azure Backup agent but I do have connection details and credentials to access the database and perform backups in SQL Server Management Studio.
What I want to do is be able to perform and schedule these backups and save them into Azure Blob Storage. I could have this schedule running on my local computer, but that's not an ideal solution.
I've got a PowerShell script that will perform this action for me, but it relies on SQL Server assemblies to run. I've tried running this as a DevOps build task but am unable to do so without the assemblies it requires.
Does anybody know a way of setting this up, in Azure for example? Is there a resource that will allow me to connect and back up a SQL instance via a connection string and save it down to blob storage? Or an Azure Function perhaps?
Is there a resource that will allow me to connect and back up a SQL instance via a connection string and save it down to blob storage?
I'm afraid the answer is no.
We can't find any API support in Azure to help you achieve that.
I think SQL Server Management Studio and a PowerShell script are more suitable for you.
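As a rough sketch of such a script (assuming the SqlServer PowerShell module can be installed wherever the schedule runs, and that your SQL login is allowed to create a credential and run BACKUP), SQL Server can write the backup straight to blob storage with BACKUP TO URL; the server, login, storage account, container, and SAS token below are placeholders:
# A minimal sketch, not a finished script: every name and the SAS token are placeholders.
$server   = "myserver.example.com"
$database = "MyDatabase"

# One-time setup: register a SAS-based credential for the target container.
$createCredential = @"
IF NOT EXISTS (SELECT 1 FROM sys.credentials
               WHERE name = 'https://mystorageacct.blob.core.windows.net/sqlbackups')
    CREATE CREDENTIAL [https://mystorageacct.blob.core.windows.net/sqlbackups]
    WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
         SECRET   = '<container SAS token without the leading ?>';
"@

# Write the backup directly to blob storage.
$backup = "BACKUP DATABASE [$database] TO URL = N'https://mystorageacct.blob.core.windows.net/sqlbackups/$database-$(Get-Date -Format yyyyMMdd).bak';"

Invoke-Sqlcmd -ServerInstance $server -Database master -Username "sqluser" -Password "sqlpassword" -Query $createCredential
Invoke-Sqlcmd -ServerInstance $server -Database master -Username "sqluser" -Password "sqlpassword" -Query $backup -QueryTimeout 3600
Anything that can run PowerShell on a schedule (Task Scheduler, an Azure Automation runbook, a pipeline) could then trigger it.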
Maybe you can also think about using the third-party tool SQL Backup and FTP, which can help you schedule backups of the SQL Server to Azure Blob Storage.
Hope this helps.
I've been given some credentials for a database hosted through Azure. I need to download the contents of the database but am not sure where I go from here.
Here are the details provided.
define('DB_NAME', '****');
define('DB_USER', '****');
define('DB_PASSWORD', '****');
define('DB_HOST', 'au-cdbr-azure-southeast-a.cloudapp.net');
How do I use this to download what I need?
Thanks
You have the Azure SQL database credentials, but if the server's firewall does not allow your client IP address, you can't do anything with them.
That means you cannot connect to the Azure SQL database or download its contents.
For more details, please see: Azure SQL Database and SQL Data Warehouse IP firewall rules.
If your IP address is added to the firewall rules, you can connect to the Azure SQL database and download it.
Please see: Export an Azure SQL database to a BACPAC file.
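As a rough sketch of that export from PowerShell with the SqlPackage command-line tool (assuming SqlPackage.exe, which ships with SSMS and the DAC framework, is on the PATH, and that your client IP has been allowed through the firewall; the server, database, login, and path stand in for the redacted values above):
# A minimal sketch; every value below is a placeholder for the redacted credentials above.
& SqlPackage /Action:Export `
    /SourceServerName:"yourserver.database.windows.net" `
    /SourceDatabaseName:"DB_NAME" `
    /SourceUser:"DB_USER" `
    /SourcePassword:"DB_PASSWORD" `
    /TargetFile:"C:\backups\DB_NAME.bacpac"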
Hope this helps.
Using New-AzureRmSqlDatabaseExport I'm able to export a database to a blob storage account within the same subscription. However, I would like to export a database from subscription A into blob storage in subscription B. For security reasons it's not acceptable to expose subscription A's Azure account credentials.
This is possible by creating a new server in subscription A, creating a copy of the DB, and then moving the new server to subscription B. That seems overly complicated and affects subscription A.
The code below works if I provide Connect-AzureRmAccount credentials for subscription A, but that's not an option.
# Placeholder values throughout: each "SubscriptionA"/"SubscriptionB" stands in for the real name or key from that subscription.
New-AzureRmSqlDatabaseExport `
    -ResourceGroupName "SubscriptionA" `
    -ServerName "SubscriptionA" `
    -DatabaseName "SubscriptionA" `
    -AdministratorLogin "SubscriptionA" `
    -AdministratorLoginPassword (ConvertTo-SecureString "SubscriptionA" -AsPlainText -Force) `
    -StorageKeyType "SubscriptionB" `
    -StorageKey "SubscriptionB" `
    -StorageUri "SubscriptionB"
How can this be achieved using New-AzureRmSqlDatabaseExport while providing only the database user/pass and not the account credentials?
Running Azure PowerShell commands requires you to be logged in to Azure. Database users have rights only on the database; they don't have any rights on the Azure fabric, which is what you are trying to use here.
If you must use New-AzureRmSqlDatabaseExport then you will need to provide credentials that can log on to Azure. You can limit the scope of these credentials to only have rights on this server, and only to backup databases (look at custom roles for RBAC), but you will need to use an Azure user.
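A minimal sketch of that scoping, assuming a dedicated Azure AD user already exists in subscription A (the user, role, and scope below are placeholders; a custom role with narrower actions would tighten it further):
# Grant the dedicated user rights only at the scope of the one SQL server.
New-AzureRmRoleAssignment `
    -SignInName "export-only-user@contoso.com" `
    -RoleDefinitionName "SQL DB Contributor" `
    -Scope "/subscriptions/<subscription-A-id>/resourceGroups/<resource-group>/providers/Microsoft.Sql/servers/<server-name>"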
Alternatively, you can look at using other tools that work at the database layer to do your export. One example is exporting a BacPac file - https://learn.microsoft.com/en-us/azure/sql-database/sql-database-export
The storage account credentials are required parameters of New-AzureRmSqlDatabaseExport.
In other words, we cannot export an Azure SQL database to a storage account without account credentials.
For more details, see New-AzureRmSqlDatabaseExport and its required parameters.
For security, you can export your Azure SQL database to a BACPAC file and then upload the file to your Blob Storage in subscription B.
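A rough sketch of that approach (all names, paths, and keys are placeholders): export the BACPAC locally with SqlPackage using only the database login from subscription A, then upload it with subscription B's storage key, so no Azure account credentials for subscription A are exposed.
# A minimal sketch, assuming SqlPackage.exe and the Azure storage cmdlets are available.
& SqlPackage /Action:Export `
    /SourceServerName:"servera.database.windows.net" `
    /SourceDatabaseName:"MyDatabase" `
    /SourceUser:"dbuser" `
    /SourcePassword:"dbpassword" `
    /TargetFile:"C:\temp\MyDatabase.bacpac"

# Upload to the storage account in subscription B using only its storage key.
$ctx = New-AzureStorageContext -StorageAccountName "subscriptionbstorage" -StorageAccountKey "<subscription-B-storage-key>"
Set-AzureStorageBlobContent -File "C:\temp\MyDatabase.bacpac" -Container "backups" -Blob "MyDatabase.bacpac" -Context $ctx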
Hope this helps.
I am using an Azure server for a SQL database.
I want to enable daily database backups.
I also need to dump a SQL file for the current database and back up other images uploaded to the server.
Any suggestions please?
You can install backup software on your Azure server and back up your SQL Server to Azure cloud storage. There is plenty of such software (Duplicati, CloudBerry, Acronis, etc.).
Some of them have special features to back up SQL Server properly, and there are free versions among them.
You can do this in different ways: use third-party applications and schedule backup jobs, or use the native tools and configure everything yourself. Hope this will be useful for you.
Since you're going down the Azure services route, for the images you ought to look at Azure Blob Storage.
And to back it up, look at this answer.
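For the images, a rough sketch of copying a local folder up to a blob container on whatever schedule you like (the storage account, key, container, and path are placeholders):
# A minimal sketch, assuming the Azure storage PowerShell cmdlets are installed.
$ctx = New-AzureStorageContext -StorageAccountName "mystorageacct" -StorageAccountKey "<storage-account-key>"
Get-ChildItem "C:\inetpub\wwwroot\uploads" -File | ForEach-Object {
    Set-AzureStorageBlobContent -File $_.FullName -Container "images" -Blob $_.Name -Context $ctx -Force
}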
I know that CloudBerry works with Azure. You can try this software for doing daily backups, either full image or incremental. The price is affordable and the tool is simple. I see the person above has already mentioned CloudBerry; it seems to be a good option.
How do I load my SQL database, created in MySQL Workbench, onto the Azure cloud?
I created a database which consists of some tables; for now there is no data in them, it's just a small script created by MySQL Workbench. I also created a database on the Azure cloud and created a login and password, but when I try to use the 'automated export' option (I have a storage account and I enter a valid login and password) I get this error:
'Could not find any bacpac files in the specified storage account.'
I tried googling this phrase but I completely do not understand the idea behind these BACPAC files, and I do not know what to do with them. Can anyone describe to me step by step how to put my database on the Azure cloud?
I want to connect to this DB on Azure in the future because I would like to build a web application and an Android app which will use a remote DB available online.
Azure SQL Database is based on SQL Server, so if you want to use MySQL you should create a ClearDB database (ClearDB is a Microsoft partner that offers MySQL on Azure). As another option, you can create a virtual machine and install MySQL yourself.
After that, you can import your tables / records.
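For example, a rough sketch of loading the Workbench-generated script into the MySQL endpoint from PowerShell (the host, credentials, database name, and file path are placeholders, and the mysql command-line client is assumed to be installed):
# A minimal sketch: pipe the Workbench script into the mysql client.
Get-Content "C:\projects\schema.sql" -Raw |
    & mysql --host "your-mysql-host.cleardb.net" --user "youruser" --password="yourpassword" "yourdatabase"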