Azure database backup to blob using Powershell - backup

We need to back up the Azure database and store it in blob storage so that it can be restored. I've seen this blog post, but it uses third-party cmdlets:
http://weblogs.thinktecture.com/cweyer/2011/01/automating-backup-of-a-sql-azure-database-to-azure-blob-storage-with-the-help-of-powershell-and-task-scheduler.html
Could someone please guide me on how the above can be achieved using PowerShell?

Backing up to Windows Azure Blob storage is not supported from Azure SQL Database; instead, the service takes automatic backups for you with point-in-time restore (PITR) capability. You'll find the following documentation useful:
http://msdn.microsoft.com/en-us/library/azure/hh852669.aspx
http://msdn.microsoft.com/en-us/library/azure/jj650016.aspx
Hope this helps.
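For example, a point-in-time restore can be kicked off from PowerShell with the classic Azure module. The sketch below is only an illustration: the server/database names are placeholders, and the cmdlet and parameter names should be verified against the documentation linked above.
# Restore the database as it existed at the given time into a new database on the same server
Start-AzureSqlDatabaseRestore -SourceServerName "myServerName" `
    -SourceDatabaseName "myDatabase" `
    -TargetDatabaseName "myDatabase_restored" `
    -PointInTime "2014-06-01 10:00:00"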

Here is my PowerShell script:
https://gist.github.com/voxon2/be29a3fd6dabbb9155ca
Here is an article describing many different approaches other than PowerShell:
http://blogs.msdn.com/b/mast/archive/2013/03/04/different-ways-to-backup-your-windows-azure-sql-database.aspx

1. First, get your Azure Automation settings done (see here).
2. Edit the below script and save it as a .ps1 file. When you run it for the first time, it will ask you for both your Azure Automation account and your database credentials. During this process, it saves your credentials securely in a local file (see here for how it is done). From then on, it uses the saved credentials.
3. The .ps1 file and the encrypted credential files should be stored in the same directory.
4. Once you are happy, you can schedule it to run in Task Scheduler.
function Get-MyCredential
{
    param(
        $CredPath,
        [switch]$Help
    )
    $HelpText = @"
Get-MyCredential
Usage:
Get-MyCredential -CredPath `$CredPath
If a credential is stored in $CredPath, it will be used.
If no credential is found, Export-Credential will start and offer to
store a credential at the location specified.
"@
    if ($Help -or (!($CredPath))) { Write-Host $HelpText; break }
    if (!(Test-Path -Path $CredPath -PathType Leaf)) {
        Export-Credential (Get-Credential) $CredPath
    }
    $cred = Import-Clixml $CredPath
    $cred.Password = $cred.Password | ConvertTo-SecureString
    $Credential = New-Object System.Management.Automation.PSCredential($cred.UserName, $cred.Password)
    return $Credential
}
function Export-Credential($cred, $path) {
    $cred = $cred | Select-Object *
    $cred.Password = $cred.Password | ConvertFrom-SecureString
    $cred | Export-Clixml $path
}
# Create a directory with your Azure server name to isolate configurations
$FileRootPath = "C:\PowerShellScripts\AzureServerName\"   # trailing backslash so the credential file names below append correctly
Write-Host "Getting Azure credentials"
$AzureCred = Get-MyCredential ($FileRootPath + "AzureSyncred.txt")
# Use your Azure Automation account
# (if you do not have one, this will not work with other account types)
Add-AzureAccount -Credential $AzureCred
Select-AzureSubscription -SubscriptionId "myAzureSubscriptionId"
#DO NOT use tcp:myServerName.database.windows.net,1433 but only myServerName
$ServerName = "myServerName"
$Date = Get-Date -format "yyyy-MM-dd-HH-mm"
$DatabaseName = "myTargetDatabaseName"
$BlobName = $Date + "-" + $DatabaseName + ".bacpac"
$StorageName = "myStorageAccountName"
$ContainerName = "myContainerNameToStoreBacpacFiles"
$StorageKey = "myStorageAccountKey"
Write-Host "Getting database user credential"
# DO NOT use myDatabaseUsername@myServerName but only myDatabaseUsername
$credential = Get-MyCredential ($FileRootPath + "DbSyncred.xml")
Write-Host "Connecting to Azure database"
$SqlCtx = New-AzureSqlDatabaseServerContext -ServerName $ServerName -Credential $credential
Write-Host "Connecting to Blob storage"
$StorageCtx = New-AzureStorageContext -StorageAccountName $StorageName -StorageAccountKey $StorageKey
$Container = Get-AzureStorageContainer -Name $ContainerName -Context $StorageCtx
Write-Host "Exporting data to blob"
$exportRequest = Start-AzureSqlDatabaseExport -SqlConnectionContext $SqlCtx -StorageContainer $Container -DatabaseName $DatabaseName -BlobName $BlobName
Get-AzureSqlDatabaseImportExportStatus -Request $exportRequest
# Use the command below to run the saved script (e.g. from a command prompt or Task Scheduler)
# powershell -ExecutionPolicy ByPass -File C:\PowerShellScripts\AzureServerName\mySavedScript.ps1 -NoExit
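Start-AzureSqlDatabaseExport only queues the export, so the last call above returns the current status immediately. If the scheduled script should wait for the export to finish, a rough polling sketch could be added (the Status property and its values are assumptions based on the classic cmdlets and worth verifying):
# Poll the export request until it completes or fails
$status = Get-AzureSqlDatabaseImportExportStatus -Request $exportRequest
while ($status.Status -ne "Completed" -and $status.Status -notlike "Failed*") {
    Start-Sleep -Seconds 30
    $status = Get-AzureSqlDatabaseImportExportStatus -Request $exportRequest
    Write-Host ("Export status: " + $status.Status)
}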

Related

How to enable auto scaling for SQL app in Azure Portal

I am looking to enable auto-scaling for my SQL DB app in the Azure portal, so that it can increase to 200 DTUs on a particular day and then automatically scale back down to 20. I'm getting confused as to how to go about it, as I'm aware I may also need to use the Azure CLI. Any help would be much appreciated.
If you want to scale SQL in Azure on a schedule, I would recommend setting up an Azure Automation account with a PowerShell runbook and schedules connected to that runbook (a sketch for creating the schedules follows the runbook below). You would also need to configure the Run As options.
I personally use this script to scale the DB up and down on a daily basis.
## Authentication
Write-Output ""
Write-Output "------------------------ Authentication ------------------------"
Write-Output "Logging into Azure ..."
$connectionName = "AzureRunAsConnection"
try
{
# Get the connection "AzureRunAsConnection "
$servicePrincipalConnection=Get-AutomationConnection -Name $connectionName
"Logging in to Azure..."
Add-AzureRmAccount `
-ServicePrincipal `
-TenantId $servicePrincipalConnection.TenantId `
-ApplicationId $servicePrincipalConnection.ApplicationId `
-CertificateThumbprint $servicePrincipalConnection.CertificateThumbprint
}
catch {
if (!$servicePrincipalConnection)
{
$ErrorMessage = "Connection $connectionName not found."
throw $ErrorMessage
} else{
Write-Error -Message $_.Exception
throw $_.Exception
}
}
##DB Part
$vCores = 18
$currentTier = "GP_Gen5"
$size = 200
$resourceGroup = ""
$serverName = ""
$databaseName = ""
$db_size = "GP_Gen5_4"
Write-Output "Changing DB type to GP_Gen5_4"
Set-AzureRmSqlDatabase `
-ServerName $serverName `
-ResourceGroupName $resourceGroup `
-DatabaseName $databaseName `
-RequestedServiceObjectiveName $db_size
# -RequestedServiceObjectiveName "$currentTier" + "_" + "$vCores"
Write-Output "Writing current DB parameters"
Get-AzureRmSqlDatabase `
-ServerName $serverName `
-ResourceGroupName $resourceGroup `
-DatabaseName $databaseName
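To run this on a schedule (for example, scale up in the morning and back down in the evening), the runbook can be linked to Automation schedules. A minimal sketch, assuming the AzureRM.Automation module and placeholder resource group, account, runbook, and schedule names:
# Create a daily schedule and link it to the scale-up runbook
New-AzureRmAutomationSchedule -ResourceGroupName "myRG" -AutomationAccountName "myAutomationAccount" `
    -Name "ScaleUp-0700" -StartTime (Get-Date "07:00").AddDays(1) -DayInterval 1
Register-AzureRmAutomationScheduledRunbook -ResourceGroupName "myRG" -AutomationAccountName "myAutomationAccount" `
    -RunbookName "ScaleUpDatabase" -ScheduleName "ScaleUp-0700"
# A second schedule/runbook pair (e.g. "ScaleDown-1900") would set the service objective back to the smaller size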

Get All blob names from Storage account from Runbook

I have a very simple script which works absolutely fine when run from remote PowerShell ISE (not using the Run As credentials from the Automation runbook), but when we try to run it from the Automation runbook it returns 0. Following is the code:
$connectionName = "AzureRunAsConnection"
$SubId = "XXXXXXXXXXXXXXXXXX"
try
{
$servicePrincipalConnection = Get-AutomationConnection -Name $connectionName
Write-Verbose "Logging in to Azure..." -Verbose
Connect-AzAccount -CertificateThumbprint $servicePrincipalConnection.CertificateThumbprint -ApplicationId $servicePrincipalConnection.ApplicationId -Tenant $servicePrincipalConnection.TenantId -ServicePrincipal
Write-Verbose "Setting Subscription......" -Verbose
Set-AzContext -SubscriptionId $SubId | Write-Verbose
}
catch {
if (!$servicePrincipalConnection)
{
$ErrorMessage = "Connection $connectionName not found."
throw $ErrorMessage
} else{
Write-Error -Message $_.Exception
throw $_.Exception
}
}
# Resource group name for the storage account
$ResourceGroup = "MYDEV01-RG"
# Storage account name
$StorageAccountName = "myDev01StrgName"
# Container name for analytics logs
$ContainerName = "`$logs"
# Get the storage account and its context (variables above must be defined first)
$storageAccount = Get-AzStorageAccount -ResourceGroupName $ResourceGroup -Name $StorageAccountName -ErrorAction SilentlyContinue
$storageContext = $storageAccount.Context
$containers = New-Object System.Collections.ArrayList
$container = Get-AzStorageContainer -Context $storageContext -Name $ContainerName -ErrorAction SilentlyContinue |
ForEach-Object { $containers.Add($_) } | Out-Null
Write-Output("> Container count: {0}" -f $containers.Count)
Not sure if we are missing something like permissions or something else with the Automation account (runbook) that makes it not work. Any help?
Thank you,
After spending 24 hours on this one, staring at it and trying everything, it turned out that the above script is correct and nothing is wrong with it; the storage account's Firewall and Networks setting was set to "Selected networks". You can either add the network IP addresses you want to permit or select "All networks", and that is what resolved my issue. I am in no way suggesting selecting all networks permanently, but for testing you can, and then add only the selected networks afterwards, and that should work.
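If you prefer to keep the firewall restricted rather than selecting all networks, the "Selected networks" rules can also be managed from PowerShell. A small sketch using the Az.Storage cmdlets and the account names from the question (the IP range is a placeholder):
# Permit a specific public IP range through the storage account firewall
Add-AzStorageAccountNetworkRule -ResourceGroupName "MYDEV01-RG" -Name "myDev01StrgName" -IPAddressOrRange "203.0.113.0/24"
# Or, for testing only, open the account to all networks again
Update-AzStorageAccountNetworkRuleSet -ResourceGroupName "MYDEV01-RG" -Name "myDev01StrgName" -DefaultAction Allow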

Create point-to-site connection for all web apps slots using powershell

I'm following the tutorial from here https://azure.microsoft.com/pl-pl/documentation/articles/app-service-vnet-integration-powershell/
where I have a script which allows me to connect multiple web apps to a VNet.
The issue is that our web apps have a few deployment slots, and when this script is run it only updates the currently used slot.
I wasn't able to get the web app of a different slot by name, and I don't see any parameter which would apply my configuration to all slots.
Script for reference:
function ConnectWebAppWithVNet()
{
param(
$SubscriptionId,
$VNetResourceGroupName,
$AppResourceGroupName,
$WebAppName,
$VNetName,
$GatewayName,
$P2SRootCertName2,
$MyP2SCertPubKeyBase64_2
)
$webApp = Get-AzureRmResource -ResourceName $WebAppName -ResourceType "Microsoft.Web/sites" -ApiVersion 2015-08-01 -ResourceGroupName $AppResourceGroupName
$location = $webApp.Location
$vnet = Get-AzureRmVirtualNetwork -name $VNetName -ResourceGroupName $VNetResourceGroupName
$gateway = Get-AzureRmVirtualNetworkGateway -ResourceGroupName $vnet.ResourceGroupName -Name $GatewayName
# validate gateway types, etc.
if($gateway.GatewayType -ne "Vpn")
{
Write-Error "This gateway is not of the Vpn type. It cannot be joined to an App."
return
}
if($gateway.VpnType -ne "RouteBased")
{
Write-Error "This gateways Vpn type is not RouteBased. It cannot be joined to an App."
return
}
if($gateway.VpnClientConfiguration -eq $null -or $gateway.VpnClientConfiguration.VpnClientAddressPool -eq $null)
{
Write-Host "This gateway does not have a Point-to-site Address Range. Please specify one in CIDR notation, e.g. 10.0.0.0/8"
return
}
Write-Host "Creating App association to VNET"
$propertiesObject = @{
    "vnetResourceId" = "/subscriptions/$($subscriptionId)/resourceGroups/$($vnet.ResourceGroupName)/providers/Microsoft.Network/virtualNetworks/$($vnetName)"
}
$virtualNetwork = New-AzureRmResource -Location $location -Properties $propertiesObject -ResourceName "$($webAppName)/$($vnet.Name)" -ResourceType "Microsoft.Web/sites/virtualNetworkConnections" -ApiVersion 2015-08-01 -ResourceGroupName $AppResourceGroupName -Force
# We need to check if the certificate here exists in the gateway.
$certificates = $gateway.VpnClientConfiguration.VpnClientRootCertificates
$certFound = $false
foreach($certificate in $certificates)
{
if($certificate.PublicCertData -eq $virtualNetwork.Properties.CertBlob)
{
$certFound = $true
break
}
}
if(-not $certFound)
{
Write-Host "Adding certificate"
Add-AzureRmVpnClientRootCertificate -ResourceGroupName $VNetResourceGroupName -VpnClientRootCertificateName "AppServiceCertificate.cer" -PublicCertData $virtualNetwork.Properties.CertBlob -VirtualNetworkGatewayName $gateway.Name
}
# Now finish joining by getting the VPN package and giving it to the App
Write-Host "Retrieving VPN Package and supplying to App"
$packageUri = Get-AzureRmVpnClientPackage -ResourceGroupName $vnet.ResourceGroupName -VirtualNetworkGatewayName $gateway.Name -ProcessorArchitecture Amd64
# Put the VPN client configuration package onto the App
$PropertiesObject = @{
    "vnetName" = $vnet.Name; "vpnPackageUri" = $packageUri
}
New-AzureRmResource -Location $location -Properties $propertiesObject -ResourceName "$($webAppName)/$($vnet.Name)/primary" -ResourceType "Microsoft.Web/sites/virtualNetworkConnections/gateways" -ApiVersion 2015-08-01 -ResourceGroupName $AppResourceGroupName -Force
Write-Host "Finished!"
}
If your web app is already connected to the VPN, there is a way to also connect its slot.
$webAppName = "name_of_app_service"
$resourceGroup = "name_of_resource_group"
$vnetName = "name_of_vnet"
$slotName = "staging"
$resName = $webAppName + "/" + $slotName
$WebAppConfig = Get-AzureRmResource -ResourceGroupName $resourceGroup -ResourceType Microsoft.Web/sites/slots/config -ResourceName $resName -ApiVersion 2016-08-01
$WebAppConfig.Properties.vnetName = $vnetName
Set-AzureRmResource -ResourceId $WebAppConfig.ResourceId -Properties $WebAppConfig.Properties -ApiVersion 2016-08-01 # -Force
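If you want to apply this to every slot instead of naming them one by one, one possible approach (a sketch, assuming the AzureRM.Websites module; the slot name parsing may need adjusting) is to enumerate the slots and repeat the same config update for each:
# Enumerate all deployment slots of the web app and set the VNet name on each slot's config
$slots = Get-AzureRmWebAppSlot -ResourceGroupName $resourceGroup -Name $webAppName
foreach ($slot in $slots) {
    $slotName = $slot.Name.Split('/')[1]   # slot names come back as "appname/slotname"
    $resName = $webAppName + "/" + $slotName
    $cfg = Get-AzureRmResource -ResourceGroupName $resourceGroup -ResourceType Microsoft.Web/sites/slots/config -ResourceName $resName -ApiVersion 2016-08-01
    $cfg.Properties.vnetName = $vnetName
    Set-AzureRmResource -ResourceId $cfg.ResourceId -Properties $cfg.Properties -ApiVersion 2016-08-01 -Force   # -Force suppresses the confirmation prompt
}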
I've managed to get help from Azure support and we've found the problem:
I created the secondary slot without cloning the configuration
settings from the main slot. If that setting had been selected, the
secondary slot would have been connected automatically. In my case I couldn't recreate the slot, so I had to manually
connect the secondary slot after swapping.

How to run a SQL batch within an Azure runbook?

I need to execute a batch to perform some maintenance tasks in my database, but all the examples on Azure Automation I see deal with a single SQL command.
How do I do it if creating a stored procedure is not an option? I think I need to either somehow embed my script.sql file into a runbook script or reference it (like here, for example).
You could store the .sql file in Azure Blob Storage, and within the runbook download the .sql file, read its contents, and pass that to the SqlCommand object.
Something like:
try {
# Connect to Azure using service principal auth
$ServicePrincipalConnection = Get-AutomationConnection -Name $AzureConnectionAssetName
Write-Output "Logging in to Azure..."
$Null = Add-AzureRmAccount `
-ServicePrincipal `
-TenantId $ServicePrincipalConnection.TenantId `
-ApplicationId $ServicePrincipalConnection.ApplicationId `
-CertificateThumbprint $ServicePrincipalConnection.CertificateThumbprint
}
catch {
if(!$ServicePrincipalConnection) {
throw "Connection $AzureConnectionAssetName not found."
}
else {
throw $_.Exception
}
}
$Path = "C:\abc.sql"
Set-AzureRmCurrentStorageAccount -ResourceGroupName $ResourceGroupName -Name $StorageAccountName
Get-AzureStorageBlobContent -Container $Container -Blob $Blob -Destination $Path
# -Raw reads the whole file as a single string, so the batch is not split into an array of lines
$Content = Get-Content $Path -Raw
$Cmd = New-Object System.Data.SqlClient.SqlCommand($Content, $Conn)
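The snippet above stops at building the SqlCommand and assumes $Conn is an open SqlConnection. A minimal sketch of creating the connection and executing the downloaded batch (server, database, and credential values are placeholders; note that System.Data.SqlClient does not understand the GO batch separator, so remove GO lines or split the script on them first):
$ConnString = "Server=tcp:<yourServer>.database.windows.net,1433;Database=<yourDatabase>;User ID=<user>;Password=<password>;Encrypt=True"
$Conn = New-Object System.Data.SqlClient.SqlConnection($ConnString)
$Conn.Open()
$Cmd = New-Object System.Data.SqlClient.SqlCommand($Content, $Conn)
$Cmd.CommandTimeout = 0          # maintenance batches can run long
$Cmd.ExecuteNonQuery() | Out-Null
$Conn.Close()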

Powershell BitsTransfer https basic authentication syntax

I'm new to PowerShell scripting. I'm struggling with the MS documentation and finding few examples to work with.
I'm trying to automate the weekly download of a large txt file from ntis.gov with a BitsTransfer script. I'm using a .ps1 script because apparently SSIS can't do this without writing .NET code.
Access to this text file is via HTTPS with an NTIS-issued username and password. How can I specify (hard-code) the password in the authentication string? I know this is bad practice. Is there a better way to do this?
My script looks like this-
$date= Get-Date -format yyMMdd
Import-Module BitsTransfer
$Job = Start-BitsTransfer `
-DisplayName DMFweeklytrans `
-ProxyUsage AutoDetect `
-Source https://dmf.ntis.gov/dmldata/weekly/WA$date `
-Destination D:\Test.txt `
-Authentication Basic `
-Credential "myIssuedUsername" `
-Asynchronous
While (($Job.JobState -eq "Transferring") -or ($Job.JobState -eq "Connecting")) {sleep 5}
Switch($Job.JobState)
{
"Transfer Completed" {Complete-BitsTransfer -BitsJobs $Jobs}
default {$Job | Format-List}
}
When you have to provide credentials in non-interactive mode, you can create a PSCredential object in the following way.
$secpasswd = ConvertTo-SecureString "PlainTextPassword" -AsPlainText -Force
$yourcreds = New-Object System.Management.Automation.PSCredential ("username", $secpasswd)
$Job = Start-BitsTransfer `
-DisplayName DMFweeklytrans `
-ProxyUsage AutoDetect `
-Source https://dmf.ntis.gov/dmldata/weekly/WA$date `
-Destination D:\Test.txt `
-Authentication Basic `
-Credential $yourcreds `
-Asynchronous
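If you'd rather not hard-code the password at all, one option is to save the credential once with Export-Clixml (the SecureString is encrypted with DPAPI, so only the same user on the same machine can read it back) and import it in the scheduled run; the path below is just an example:
# One-time, interactive: save the NTIS credential encrypted to disk
Get-Credential | Export-Clixml -Path "D:\ntis-cred.xml"
# In the scheduled script: load it back instead of embedding the plain-text password
$yourcreds = Import-Clixml -Path "D:\ntis-cred.xml"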