SQL Azure backups.
Is there a way to get email alerts when the backups are processed?
I have a client that needs this for an ISO accreditation.
You can configure Azure Storage blob events to observe new backup blobs as they are created, and filter for what you need: full backups, differential backups, or transaction log backups.
You can then handle those events with an event handler and send the email.
More information on Azure Storage Blob Events
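For example, if your backups land as blobs in a storage account (backup to URL), a minimal sketch of such an event subscription could look like the following, assuming hypothetical resource names and a Logic App HTTP trigger URL as the webhook that sends the email:

    # Subscribe to BlobCreated events on the storage account that receives the backup files
    $storageId = (Get-AzureRmStorageAccount -ResourceGroupName "backup-rg" -Name "backupstorage").Id

    New-AzureRmEventGridSubscription `
        -ResourceId $storageId `
        -EventSubscriptionName "backup-blob-created" `
        -Endpoint "https://<logic-app-http-trigger-url>" `
        -IncludedEventType "Microsoft.Storage.BlobCreated" `
        -SubjectEndsWith ".bak"   # e.g. ".trn" instead for transaction log backups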
A built-in email alert for Azure SQL backups is not available, but you can achieve it as follows.
If you have SQL Server running in an Azure VM, use a Recovery Services vault:
Search for Recovery Services vaults in the Azure portal.
Create a new Recovery Services vault and add the required details (see the PowerShell sketch after these steps).
Open the created Recovery Services vault and select Backup Alerts.
Select Configure notification.
Set Email notification to On, add the recipient email addresses that should receive the alerts, and save.
After successful configuration you will receive the alert emails.
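If you prefer to script the vault creation step, a minimal sketch with assumed resource names would be the following; the email notification itself is still switched on in the portal under Backup Alerts > Configure notification as described above:

    New-AzureRmResourceGroup -Name "backup-rg" -Location "West Europe"

    New-AzureRmRecoveryServicesVault `
        -Name "sql-backup-vault" `
        -ResourceGroupName "backup-rg" `
        -Location "West Europe"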
For an alternative solution (if you have an Azure SQL Database without a VM), create an alert in Azure Monitor.
I am experiencing a problem configuring the backup of an SQL database using Azure.
I have a web application and an associated Azure SQL database. The app connects to the DB with no problem. I have pasted the connection string provided to me by the Azure UI (Home -> SQL Databases -> My SQL Database) into the connection strings section of the configuration for the App Service (Home -> App Services -> My App Service -> Configuration). I created a backup of the App Service (Home -> App Services -> My App Service -> Backups -> Configuration) and ticked my connection string so that my database would be backed up.
After about 20 minutes, the backup fails with the error:
A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: TCP Provider, error: 0 - No such host is known.)
I can connect to the database from SQL Server Management Studio running on my laptop, and from code running on my laptop, using the server, username and password from the connection string. Why can the backup not connect to the database?
Many thanks for any advice.
Linking the same question asked on MSDN: Azure SQL Database Backup Fails, cannot connect to the database.
Please see the Requirements and Restrictions details for where this functionality is not supported; I have listed below the applicable items that apply to your scenario:
The Backup and Restore feature requires the App Service plan to be in the Standard or Premium tier. For more information about scaling your App Service plan to use a higher tier, see Scale up an app in Azure. The Premium tier allows a greater number of daily backups than the Standard tier.
You need an Azure storage account and container in the same subscription as the app that you want to back up. For more information on Azure storage accounts, see Azure storage account overview.
Backups can be up to 10 GB of app and database content. If the backup size exceeds this limit, you get an error.
Using a firewall-enabled storage account as the destination for your backups is not supported. If such a backup is configured, the backups will fail.
If none of the above apply to you, then it is an IP address issue: you need to enable "Allow access to Azure services" in the firewall for your Azure SQL (logical) server.
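A minimal sketch of creating that firewall rule with Azure PowerShell, assuming hypothetical resource group and server names:

    New-AzureRmSqlServerFirewallRule `
        -ResourceGroupName "my-rg" `
        -ServerName "my-sql-server" `
        -AllowAllAzureIPs   # creates the special 0.0.0.0 "Allow access to Azure services" rule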
Additional troubleshooting can be performed by leveraging Application Insights to capture the backup failure event and then drill into the collected log detail to see what the specific error is.
Using New-AzureRmSqlDatabaseExport I'm able to export a database to blob storage within the same subscription. However, I would like to export a database from subscription A into blob storage in subscription B. For security reasons it's not acceptable to expose subscription A Azure account credentials.
This could be done by creating a new server in subscription A, creating a copy of the database there, and then moving the new server to subscription B, but that seems overly complicated and affects subscription A.
The code below works if I provide Connect-AzureRmAccount credentials for subscription A, but that's not an option.
New-AzureRmSqlDatabaseExport `
    -ResourceGroupName "SubscriptionA" `
    -ServerName "SubscriptionA" `
    -DatabaseName "SubscriptionA" `
    -AdministratorLogin "SubscriptionA" `
    -AdministratorLoginPassword "SubscriptionA" `
    -StorageKeyType "SubscriptionB" `
    -StorageKey "SubscriptionB" `
    -StorageUri "SubscriptionB"
How can this be achieved using New-AzureRmSqlDatabaseExport while providing only the database user/password and not the Azure account credentials?
Running Azure PowerShell commands requires you to be logged in to Azure. Database users have rights only on the database; they don't have any rights on the Azure fabric, which is what you are trying to use here.
If you must use New-AzureRmSqlDatabaseExport then you will need to provide credentials that can log on to Azure. You can limit the scope of these credentials to only have rights on this server, and only to backup databases (look at custom roles for RBAC), but you will need to use an Azure user.
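As a sketch of that scoping, assuming a hypothetical dedicated user and the built-in SQL DB Contributor role rather than a custom one, the role assignment could be limited to the logical server like this:

    New-AzureRmRoleAssignment `
        -SignInName "export-user@contoso.com" `
        -RoleDefinitionName "SQL DB Contributor" `
        -Scope "/subscriptions/<subscription-A-id>/resourceGroups/my-rg/providers/Microsoft.Sql/servers/my-sql-server"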
Alternatively, you can look at using other tools that work at the database layer to do your export. One example is exporting a BACPAC file: https://learn.microsoft.com/en-us/azure/sql-database/sql-database-export
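A minimal sketch of that export with SqlPackage, assuming hypothetical server, database and path names (only the database credentials are needed, no Azure login):

    # SqlPackage.exe ships with SSMS / SSDT; the path below is an assumption for this sketch
    & "C:\Program Files\Microsoft SQL Server\140\DAC\bin\SqlPackage.exe" `
        /Action:Export `
        /SourceServerName:subscriptiona-server.database.windows.net `
        /SourceDatabaseName:MyDatabase `
        /SourceUser:dbuser `
        /SourcePassword:dbpassword `
        /TargetFile:C:\exports\MyDatabase.bacpac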
The storage account credentials are required parameters of New-AzureRmSqlDatabaseExport.
In other words, we cannot export an Azure SQL database to a storage account without the Azure account credentials.
For more details, see New-AzureRmSqlDatabaseExport and its required parameters.
For security, you can export your Azure SQL database to a BACPAC file and then upload the file to your Blob Storage in subscription B.
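A minimal sketch of that upload, assuming hypothetical storage account, container and file names in subscription B (only its storage key is needed, not subscription A credentials):

    $ctx = New-AzureStorageContext `
        -StorageAccountName "subbstorage" `
        -StorageAccountKey "<subscription-B-storage-key>"

    Set-AzureStorageBlobContent `
        -File "C:\exports\MyDatabase.bacpac" `
        -Container "sql-exports" `
        -Blob "MyDatabase.bacpac" `
        -Context $ctx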
Hope this helps you.
I started to study Azure Log Analytics and I'm wondering about a very simple question: where is the data stored?
Is there a kind of database behind this resource? How can I access it?
If not, is there a way to "redirect" the logs into a particular storage account?
I didn't find any info in the documentation.
Thanks
Azure Diagnostics is an Azure extension that enables you to collect diagnostic data from a worker role, web role, or virtual machine running in Azure. The data is stored in an Azure storage account (you have to assign a diagnostic storage account to store log data) and can then be collected by Log Analytics.
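If you want to route a resource's logs to a storage account of your choice yourself, a minimal sketch with assumed resource IDs (using the AzureRM diagnostic settings cmdlet) would be:

    Set-AzureRmDiagnosticSetting `
        -ResourceId "/subscriptions/<sub-id>/resourceGroups/my-rg/providers/Microsoft.Sql/servers/my-server/databases/my-db" `
        -StorageAccountId "/subscriptions/<sub-id>/resourceGroups/my-rg/providers/Microsoft.Storage/storageAccounts/mylogstorage" `
        -Enabled $true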
I've created a publication that uses the FTP snapshot delivery method using SSMS.
However, when I tried to configure the subscriber, I couldn't find an option to use FTP credentials.
I read the "To create a pull subscription to a snapshot or transactional publication that uses FTP snapshot delivery" section in How to: Deliver a Snapshot Through FTP (Replication Transact-SQL Programming), but the steps provided are for a script.
I was wondering if this can be achieved from SSMS rather than running it as a script.
Thanks in advance.
I have a merge replication set up, with a publication on our dev server (SQL Server 2008 R2 SP2) and a subscription on my local machine. The publisher is acting as its own distributor. The publisher and distributor connections in the subscription properties on my local machine are using a login (from the dev server) that is in the publication access list (PAL) of the publication. If I add this login to the sysadmin server role on the publisher, everything works fine when I sync the subscription. But if I remove the login from the sysadmin server role, the sync does not work -- I get a timeout ("The operation timed out").
My understanding was that I just needed to add the login to the PAL, but I must be missing something with the necessary permissions.
Another question I have is whether it's possible to create a database role whose members would automatically be added to the PAL. I read somewhere that this is possible with SQL Server 2008 SP3; I'm wondering if it's possible with SP2. Or is there a way to create a database role that would have all the necessary permissions, so that its members (users) could be used in a subscription (as the publisher and distributor connections)?
The background of all this is that we have users who will subscribe to our publication, but we only want to replicate data based on their login. So we have filter rows on our articles that use SUSER_SNAME().
Thanks in advance.
Brad
If this is a push subscription then the Merge Agent process account used to make connections to the Publisher and Distributor must be db_owner in the distribution database, be a member of the PAL, a login associated with a user in the publication database, and have read permissions on the snapshot share.
If this is a pull subscription then the Merge Agent process account used to make connections to the Subscriber must be db_owner in the subscription database. The account used to connect to the Publisher and Distributor must be a member of the PAL, a login associated with a user in the publication and distribution databases, and have read permissions on the snapshot share.
This is all covered in the section Permissions That Are Required by Agents in Replication Agent Security Model.
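For the pull case described in the question, a minimal sketch of granting those pieces to a non-sysadmin login (hypothetical server, database, publication and login names) could be:

    $login = "DEVSERVER\merge_agent_login"   # the login used for the Publisher/Distributor connections

    # User in the publication database plus PAL membership (run at the Publisher)
    Invoke-Sqlcmd -ServerInstance "DEVSERVER" -Database "PublicationDB" -Query "
        CREATE USER [merge_agent_login] FOR LOGIN [$login];
        EXEC sp_grant_publication_access @publication = N'MyMergePub', @login = N'$login';"

    # User in the distribution database (the Publisher is its own Distributor here)
    Invoke-Sqlcmd -ServerInstance "DEVSERVER" -Database "distribution" -Query "
        CREATE USER [merge_agent_login] FOR LOGIN [$login];"

    # The same login also needs read permission on the snapshot share (a file system ACL, not T-SQL)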