Backup Azure SQL DB during VSTS Release - azure-sql-database

I am exploring VSTS Release Management and I want to back up my production database, hosted on Azure SQL DB, before I apply any migration scripts to it. I cannot find a task or a preferred way of waiting until the Azure SQL DB is fully backed up, so that deployment proceeds only after the database has been backed up correctly.
I have looked at using either a PowerShell task or an Azure SQL CMD task, but I am not sure how to make the rest of the tasks wait for the backup to complete.
I would appreciate it if anyone could point me in the right direction. Thanks.

You can back up the Azure SQL database by exporting it and checking the export status in a loop:
# Start the export of the database to a BACPAC file in blob storage
$exportRequest = New-AzureRmSqlDatabaseExport -ResourceGroupName $ResourceGroupName -ServerName $ServerName `
    -DatabaseName $DatabaseName -StorageKeytype $StorageKeytype -StorageKey $StorageKey -StorageUri $BacpacUri `
    -AdministratorLogin $creds.UserName -AdministratorLoginPassword $creds.Password

# Poll the export operation until it is no longer in progress
$exportStatus = Get-AzureRmSqlDatabaseImportExportStatus -OperationStatusLink $exportRequest.OperationStatusLink
[Console]::Write("Exporting")
while ($exportStatus.Status -eq "InProgress")
{
    $exportStatus = Get-AzureRmSqlDatabaseImportExportStatus -OperationStatusLink $exportRequest.OperationStatusLink
    [Console]::Write(".")
    Start-Sleep -s 10
}
[Console]::WriteLine("")
$exportStatus
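To make sure the rest of the release only runs after a good backup, you can fail the same PowerShell task when the export does not complete successfully. A minimal sketch, assuming the variables from the script above and that a finished export reports the status Succeeded:
# Stop the release here unless the export finished successfully
if ($exportStatus.Status -ne "Succeeded")
{
    throw "Database export did not finish successfully (status: $($exportStatus.Status))."
}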
For more information, you can refer to Export an Azure SQL database to a BACPAC file.
Another way is to back up the Azure SQL database by calling the Microsoft.SqlServer.Dac.DacServices.ExportBacpac method from PowerShell:
# $s is the sources directory on the build agent; the DacFx assembly is checked in under it
param([string]$ConnectionString, [string]$DatabaseName, [string]$OutputFile, [string]$s)
Add-Type -Path "$s\AzureDatabaseSolution\SQLDatabase\lib\Microsoft.SqlServer.Dac.dll"
$now = $(Get-Date).ToString("HH:mm:ss")
$Services = New-Object Microsoft.SqlServer.Dac.DacServices $ConnectionString
Write-Host "Starting at $now"
# Export the database to a BACPAC file and time the operation
$Watch = New-Object System.Diagnostics.StopWatch
$Watch.Start()
$Services.ExportBacpac($OutputFile, $DatabaseName)
$Watch.Stop()
Write-Host "Backup completed in" $Watch.Elapsed.ToString()
Note: this uses the assembly from the Microsoft.SqlServer.Dac 1.0.3 NuGet package (I added it to source control and mapped it to the build agent).
On the other hand, to add a firewall rule, you can refer to this thread: Deploy Dacpac packages via PowerShell script to Azure SQL Server.
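As a rough sketch, the firewall rule could also be created from the release script itself; the rule name and the way the agent's public IP is looked up here are assumptions:
# Allow the build/release agent's current public IP through the Azure SQL server firewall
$agentIp = Invoke-RestMethod -Uri "https://api.ipify.org"
New-AzureRmSqlServerFirewallRule -ResourceGroupName $ResourceGroupName -ServerName $ServerName `
    -FirewallRuleName "vsts-agent" -StartIpAddress $agentIp -EndIpAddress $agentIp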
BTW, you can build a custom build/release task from these PowerShell scripts. See Add a build task.

Azure SQL Databases are continually backed up automatically. If you are trying to create a copy of the database or archive the database to a BACPAC file, you can do either.
https://learn.microsoft.com/en-us/azure/sql-database/sql-database-automated-backups
https://learn.microsoft.com/en-us/azure/sql-database/sql-database-copy
https://learn.microsoft.com/en-us/azure/sql-database/sql-database-export
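If the goal is simply a pre-deployment safety copy rather than a BACPAC, a minimal sketch with the newer Az module (resource names below are placeholders):
# Create a transactionally consistent copy of the database on the same server before deploying
New-AzSqlDatabaseCopy -ResourceGroupName "MyResourceGroup" -ServerName "myserver" -DatabaseName "MyDatabase" `
    -CopyResourceGroupName "MyResourceGroup" -CopyServerName "myserver" -CopyDatabaseName "MyDatabase_predeploy"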

Related

List / discover all Azure SQL Database backup retention policies

I have a large number of Azure SQL Databases and I would like to create a list or report of some kind that shows what backup retention policies are in place for each one.
All I can find is how to check on a per-database or per-server basis. That would take me a long time, is error-prone, and is not something I can check regularly or easily hand to an auditor/manager who wants confirmation that everything is being backed up and retained properly.
Is there a way to obtain all this information in one place? A PowerShell solution would be acceptable.
You can use PowerShell commands to get the long-term retention (LTR) policies for every database on a server, or for a single database, using the commands below:
# get all LTR policies within a server
$ltrPolicies = Get-AzSqlDatabase -ResourceGroupName $resourceGroup -ServerName $serverName | `
    Get-AzSqlDatabaseBackupLongTermRetentionPolicy
# get the LTR policy of a specific database
$ltrPolicies = Get-AzSqlDatabaseBackupLongTermRetentionPolicy -ServerName $serverName -DatabaseName $dbName `
    -ResourceGroupName $resourceGroup
You can also use an Azure CLI command to get the LTR policy for each database:
az sql db ltr-policy show \
--resource-group mygroup \
--server myserver \
--name mydb
Building on the commands above, you can loop over each server and database to collect the LTR policies in one place.
Refer to Manage Azure SQL Database long-term backup retention.
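A minimal sketch of such a subscription-wide report; the Select-Object property names are assumptions about the policy object returned by Az.Sql:
# Collect the LTR policy of every database on every SQL server in the current subscription
$report = foreach ($server in Get-AzSqlServer) {
    $databases = Get-AzSqlDatabase -ResourceGroupName $server.ResourceGroupName -ServerName $server.ServerName |
        Where-Object { $_.DatabaseName -ne "master" }
    foreach ($db in $databases) {
        Get-AzSqlDatabaseBackupLongTermRetentionPolicy -ResourceGroupName $server.ResourceGroupName `
            -ServerName $server.ServerName -DatabaseName $db.DatabaseName |
            Select-Object ServerName, DatabaseName, WeeklyRetention, MonthlyRetention, YearlyRetention, WeekOfYear
    }
}
$report | Export-Csv -Path .\ltr-policies.csv -NoTypeInformation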

How do I know what user my ADO pipeline is using?

I'm using Azure DevOps pipelines, and have a PowerShell task that runs stuff with Invoke-Sqlcmd. The PowerShell works fine when run from my computer, but when it runs through the pipeline it says it can't find or doesn't have access to the server. I don't see anything in the failed connection logs on my sql servers...
I assume whatever account the pipeline is attempting to connect under does not have access. How can I find out what that account is?
If you're curious, here's the simple PS, it just updates a table:
Invoke-Sqlcmd -ServerInstance "myremoteserver" -Query "--update the table"
You can add a PowerShell task that runs the script below to see which account your pipeline is currently running as:
[System.Security.Principal.WindowsIdentity]::GetCurrent().Name
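A small sketch of such a task body; whoami and the environment variables are just alternative ways of reporting the same identity:
# Print the Windows identity of the agent process (the account Invoke-Sqlcmd connects as with integrated security)
[System.Security.Principal.WindowsIdentity]::GetCurrent().Name
whoami
Write-Host "User: $env:USERDOMAIN\$env:USERNAME on agent $env:COMPUTERNAME"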

How can I run this script automatically on Azure for DTU?

$ServiceObjective = Get-AzureSqlDatabaseServiceObjective -ServerName exampledb -ServiceObjectiveName S0
Set-AzureSqlDatabase -DatabaseName exampledb -ServerName exampledb -ServiceObjective $ServiceObjective
I can run the script above to set my SQL database to the S0 DTU level.
Can I make this happen automatically?
BTW, I searched the forums and Stack Overflow, and they recommend an Automation account and runbook.
But I don't have a Run As account. I don't have admin privileges and I can't create a Run As account, so I couldn't use a runbook.
Can you recommend another way?
Thanks :)
But I don't have a Run As account. I don't have admin privileges and I can't create a Run As account, so I couldn't use a runbook.
If you want to create an Automation account with a Run As account, you need the Owner role on your subscription, because creating the Run As account (a service principal) automatically adds that service principal to the subscription as a Contributor, and only an Owner can do that.
But if you just want to use a runbook in an Automation account, you don't need the Owner role. Ask an Owner of your subscription to create an Automation account with a Run As account for you; then you will be able to create a PowerShell runbook and run your commands above with, for example, the Contributor role.
After the Owner has created the Automation account for you, follow the steps below.
1. Navigate to the Automation account -> Runbooks -> Create a runbook -> create a PowerShell runbook.
2. The two commands you are using, Get-AzureSqlDatabaseServiceObjective and Set-AzureSqlDatabase, belong to the old Azure (ASM) PowerShell module; to use them you would need an Azure Classic Run As account, which is not supported in CSP subscriptions. So I recommend the newer Az PowerShell module instead. In your Automation account -> Modules, check whether the Az.Accounts and Az.Sql modules are present; if not, go to Modules -> Browse Gallery, search for them and import them.
After the modules have been imported successfully, use the script below to log in and set the SQL database to Standard S0.
$connectionName = "AzureRunAsConnection"
try
{
    # Get the connection "AzureRunAsConnection"
    $servicePrincipalConnection = Get-AutomationConnection -Name $connectionName
    "Logging in to Azure..."
    Connect-AzAccount `
        -ServicePrincipal `
        -TenantId $servicePrincipalConnection.TenantId `
        -ApplicationId $servicePrincipalConnection.ApplicationId `
        -CertificateThumbprint $servicePrincipalConnection.CertificateThumbprint
}
catch {
    if (!$servicePrincipalConnection)
    {
        $ErrorMessage = "Connection $connectionName not found."
        throw $ErrorMessage
    } else {
        Write-Error -Message $_.Exception
        throw $_.Exception
    }
}
# Scale the database to the Standard S0 service objective
Set-AzSqlDatabase -ResourceGroupName "<resource-group-name>" -DatabaseName "<database-name>" -ServerName "<server-name>" -Edition "Standard" -RequestedServiceObjectiveName "S0"
3. If you want to run the script automatically on a schedule, you can follow Scheduling a runbook in Azure Automation to do that.
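As a sketch, the schedule can also be created and linked from PowerShell instead of the portal; the schedule, account and runbook names below are placeholders:
# Create a daily schedule and attach it to the runbook that scales the database
$schedule = New-AzAutomationSchedule -ResourceGroupName "<resource-group-name>" -AutomationAccountName "<automation-account>" `
    -Name "ScaleToS0Daily" -StartTime (Get-Date).AddHours(1) -DayInterval 1
Register-AzAutomationScheduledRunbook -ResourceGroupName "<resource-group-name>" -AutomationAccountName "<automation-account>" `
    -RunbookName "<runbook-name>" -ScheduleName $schedule.Name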

Failed to connect to database server. How do I connect to a database that is not on my localhost using PowerShell and integrated security?

Background to my question
At any moment I am expecting the security people in black suits and black sunglasses to come and take me away because of all my SQL Server login attempts...
I used and adapted Iris Classon's example to connect to a database via PowerShell. The adapted code uses Integrated Security=True:
$dataSource = "my_enterprise_db_server"
$database = "my_db"
$connectionString = "Server=$dataSource;Database=$database;Integrated Security=True;"
$connection = New-Object System.Data.SqlClient.SqlConnection
$connection.ConnectionString = $connectionString
$table = New-Object "System.Data.DataTable"
$query = "..."
$connection.Open()
$command = $connection.CreateCommand()
$command.CommandText = $query
...
Hot diggity dog, that worked. Thanks, Iris.
I read about the snap-in versus the Import-Module sqlps way of executing a SQL command. I also read all the links that Michael Sorens provided in his answer. I can mount a SQL Server drive with mount mydb SQLSERVER SQLSERVER:\SQL, use ls or dir, walk the path down the objects, etc. I also revised the main part of what Iris provided to
$table = Invoke-Sqlcmd -ServerInstance $dataSource -Database $database -Query $query
This version of Invoke-Sqlcmd allows me to connect to an "enterprise" database. The problem with all the references provided is that they expect you to work with a localhost SQL Express database. The moment I try to use
Set-Location SQLSERVER:\SQL\my_enterprise_db_server\my_db
or similar constructs, I receive a message that ends with
...WARNING: Could not obtain SQL Server Service information. An attempt to connect to WMI on 'my_enterprise_db_server' failed with the following error: Access is denied. (Exception from HRESULT: 0x80070005 (E_ACCESSDENIED))
I also saw mention of the SQLCMDSERVER and SQLCMDDBNAME environment variables. I set these to
$env:SQLCMDDBNAME = "my_db"
$env:SQLCMDSERVER = "my_enterprise_db_server"
set-location sqlserver:\sql
ls
produces
MachineName
-----------
localhost
Question
How do I correctly use Set-Location or New-PSDrive -Name for a database that does not reside on my local computer?
I found the answer by a serendipitous route. I right-clicked a database object in SQL Server Management Studio and noticed an option to start PowerShell. Even though this looks like the older sqlps option, SSMS gave me the right way to set the location.
Option 1. If the server does not have instances, then add DEFAULT after the server_name in the slashy path.
Set-Location SQLSERVER:\SQL\server_name\DEFAULT\Databases\database_name\Tables\dbo.table_name
Option 2. If you have a server with an instance, then set the instance name after the server_name in the slashy path.
Set-Location SQLSERVER:\SQL\server_name\instance_name\Databases\database_name\Tables\dbo.table_name
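If you are not sure whether a server uses the default instance or a named instance, listing the children of the server node shows the instance names to use in the slashy path (assuming the SqlServer provider is loaded):
# Lists the instances on the server; DEFAULT appears for an unnamed instance
Get-ChildItem SQLSERVER:\SQL\server_name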
I am a mere mortal as far as database security goes. Many of the features of SSMS are turned off to me because of how my security settings compare with how the DBA security settings are configured, and I receive errors in SSMS all the time. Well, that is no different with PowerShell and Set-Location. I did not realize that the two error messages were related to the security policy configuration rather than pilot error. If I set the location to a table, then I only get two access-denied warnings. If I set the location to the database level, then PowerShell blows chunks for a bit, but I end up with my slashy path setting. I do not see the errors if I use Invoke-Sqlcmd. I see now that the way the security errors were presented in PowerShell is why I thought there was a problem with how I was attempting to connect to the database. Now I can do this:
mount rb SQLSERVER SQLSERVER:\SQL\server_name\DEFAULT\Databases\database_name\Tables
# Look at a list of tables.
ls
# Go to a traditional file system
cd F:\
# Go to the Linux Style mounted file system
cd rb:\
# Go to a table like a directory
cd dbo.my_table_name
# Look at the column names
ls
# Use relative navigation
cd ..\dbo.my_other_table_name
ls
# Compare column names with another table using relative navigation after I have just
# listed the current directory/table that I am in.
ls ..\dbo.my_table_name
That just rocks! Now all I need to do is come up with an array of server names and databases to create mount points for all the databases that I can connect to. An array like that is just begging for an iteration to create all the mount points.
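A minimal sketch of that iteration, assuming an array of server/database entries and that each drive gets a unique name:
# Hypothetical list of servers and databases to expose as drives
$targets = @(
    @{ Server = "server_one"; Instance = "DEFAULT"; Database = "db_one" },
    @{ Server = "server_two"; Instance = "DEFAULT"; Database = "db_two" }
)
foreach ($t in $targets) {
    $root = "SQLSERVER:\SQL\$($t.Server)\$($t.Instance)\Databases\$($t.Database)\Tables"
    # mount is an alias for New-PSDrive; the SqlServer provider comes from the sqlps module
    New-PSDrive -Name "$($t.Server)_$($t.Database)" -PSProvider SqlServer -Root $root -Scope Global | Out-Null
}
Get-PSDrive -PSProvider SqlServer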

Powershell script to execute DDL statements on linked servers - not working when run using SSIS

I have a PowerShell script that loops through a list of SQL Servers and creates server logins and database users.
The script runs on a separate server, under the administrator credentials on that server, and connects to the other SQL Servers via linked servers.
#Get administrator credentials
# Note: without a -Key, ConvertTo-SecureString uses DPAPI, so this file can only be decrypted
# by the same Windows account on the same machine that created it.
$password = Get-Content C:\Powershell\General\password.txt | ConvertTo-SecureString;
$cred = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList "DOMAIN\administrator",$password;
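For reference, a password file like that is normally created once, by the same account on the same machine, with something along these lines (the path is reused from the script above):
# Prompt for the password and store it as a DPAPI-encrypted standard string
Read-Host -AsSecureString -Prompt "Password" | ConvertFrom-SecureString | Set-Content C:\Powershell\General\password.txt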
When this script is run manually (either directly through a PowerShell window or using a batch file through a command prompt) it works perfectly well. I am logged onto the executing server as administrator when running the script manually.
I then tried to run this PowerShell script using an SSIS package on the executing server, using the Execute Process Task to run a batch file. The package was executed from a SQL Agent job. Although both the job and the package seemed to execute successfully, the DDL statements were not executed against the linked servers.
SQL Agent on the executing server is run under a designated Service Account. SSIS runs under the Network Service account.
Does anybody have any thoughts on what I might be doing wrong? I am happy to provide details of the script or anything else that is required.
Thanks
Ash
UPDATE: ok we have a little more information.
I took out the lines I posted above as I have discovered I don't actually need the administrator credentials I was retrieving.
I logged onto the server with the script on it using the service account. As per @ElecticLlama's suggestion I set a Profiler trace on the destination server. When running the script manually (or running a batch file manually that runs the PowerShell script) everything works well and Profiler shows the DDL actions, under the service account login.
When running a job through SQL Agent (either a CmdExec job or an SSIS package) that runs the same batch file, I get the following error:
'Login failed for user 'DOMAIN\ServiceAccount'. Reason: Token-based server access validation failed with an infrastructure error.'
Anybody have any further thoughts?
Thanks to everyone for their help. Once I got that last error, a quick search revealed I just had to restart SQL Agent, and now everything works as it should. Thanks in particular to @ElecticLlama for pointing me in the right direction.
Ash