I need to query a database in an Azure pipeline to find out when the last login to an environment happened, and if it was more than two weeks ago, destroy the environment. To do that I used the task below, but I don't know how to store the result in a variable for use in the next task. Can someone please help me?
- task: SqlAzureDacpacDeployment@1
  inputs:
    azureSubscription: 'A Service Connection'
    AuthenticationType: 'servicePrincipal'
    ServerName: 'myserver.database.windows.net'
    DatabaseName: 'mydb'
    deployType: 'InlineSqlTask'
    SqlInline: |
      DECLARE @LastLoginDate AS NVARCHAR(50)
      SELECT @LastLoginDate = [LastLoginDate]
      FROM [dbo].[AspNetUsers]
      WHERE UserName <> 'system'
      PRINT @LastLoginDate
    IpDetectionMethod: 'AutoDetect'
The InlineSqlTask deploy type is meant for executing a SQL script against the database, not for returning results to the pipeline. Setting variables in pipeline tasks is available with either Bash or PowerShell.
PowerShell Script Task
$query = "DECLARE #LastLoginDate AS NVARCHAR(50)
SELECT #LastLoginDate = [LastLoginDate]
FROM [dbo].[AspNetUsers]
WHERE UserName <> 'system'
PRINT #LAstLoginDate"
# If the ARM service connection is used to authenticate, skip getting the access token
$clientid = "<client id>" # Store in Variable Groups
$tenantid = "<tenant id>" # Store in Variable Groups
$secret = "<client secret>" # Store in Variable Groups
$request = Invoke-RestMethod -Method POST -Uri "https://login.microsoftonline.com/$tenantid/oauth2/token" `
    -Body @{ resource = "https://database.windows.net/"; grant_type = "client_credentials"; client_id = $clientid; client_secret = $secret } `
    -ContentType "application/x-www-form-urlencoded"
$access_token = $request.access_token
# If the ARM service connection is used, skip the -AccessToken parameter
$sqlOutput = Invoke-Sqlcmd -ServerInstance "<server>.database.windows.net" -Database "<database>" -AccessToken $access_token -Query $query
Write-Host "##vso[task.setvariable variable=<variable name>;]$($sqlOutput.LastLoginDate)"
Bash
echo "##vso[task.setvariable variable=<variable name>;isOutput=true]<variable value>"
Within the same job, later tasks can access it using $(<variable name>).
In a different job, access it using $[ dependencies.<first job name>.outputs['mystep.<variable name>'] ], where mystep is the name given to the step that set the variable (this requires isOutput=true, as in the Bash example above).
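A minimal YAML sketch of the cross-job case (the job, step, and variable names QueryJob, CleanupJob, sqlstep, and LastLoginDate are placeholders chosen here for illustration):
jobs:
- job: QueryJob
  steps:
  - bash: |
      # In practice this value would come from the SQL query above
      echo "##vso[task.setvariable variable=LastLoginDate;isOutput=true]2024-01-01"
    name: sqlstep
- job: CleanupJob
  dependsOn: QueryJob
  variables:
    LastLoginDate: $[ dependencies.QueryJob.outputs['sqlstep.LastLoginDate'] ]
  steps:
  - bash: echo "Last login was $(LastLoginDate)"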
References:
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/set-variables-scripts?view=azure-devops&tabs=bash
https://medium.com/microsoftazure/how-to-pass-variables-in-azure-pipelines-yaml-tasks-5c81c5d31763
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      $LoginDate = (sqlcmd -S myserver -U $(Username) -P $(Password) -d mydatabase -Q "SET NOCOUNT ON; SELECT LastLoginDate=min(LastLoginDate) FROM mytable WHERE UserName<>'system';")
      $Last = $LoginDate[2].TrimStart()
      Write-Host $Last
      Write-Host "##vso[task.setvariable variable=LastLoginDate]$Last"
I have a situation where an Azure Automation runbook ends in the 'Completed' state but does not run the actual code. I have pasted the code below. It creates an Event Hub inside a Namespace. The code works perfectly when executed on my local machine, but it does not execute in the runbook.
I added write-output "Declaring local variables for use in script" to check whether printing works; however, the runbook does not get beyond that line. I am sure I am missing something. Kindly help me.
Param(
    [Parameter(Mandatory=$true)]
    [string] $NameSpaceNameName,
    [Parameter(Mandatory=$true)]
    [string[]] $EventhubNames,
    [Parameter(Mandatory=$true)]
    [string] $ProjectId,
    [Parameter(Mandatory=$true)]
    [int] $PartitionCount,
    [Parameter(Mandatory=$true)]
    [string] $Requested_for,
    [Parameter(Mandatory=$true)]
    [string] $Comments
)
## Login to Azure using RunAsAccount
$servicePrincipalConnection = Get-AutomationConnection -Name 'AzureRunAsConnection'
Write-Output ("Logging in to Az Account...")
Login-AzAccount `
-ServicePrincipal `
-TenantId $servicePrincipalConnection.TenantId `
-ApplicationId $servicePrincipalConnection.ApplicationId `
-CertificateThumbprint $servicePrincipalConnection.CertificateThumbprint
write-output "Declaring local variables for use in script"
## Declaring local variables for use in script
$Creation_date = [System.Collections.ArrayList]@()
$ResourceGroups = Get-AzResourceGroup
$provided_name_space_exists = $false
## Change context to Platform subscription
select-azsubscription -subscription "GC302_Sub-platform_Dev"
## Create Event Hub
foreach($ResourceGroup in $ResourceGroups){
    Write-Host ("Processing the Resource Group: {0}" -f $ResourceGroup.ResourceGroupName)
    $EventhubNameSpaces = Get-AzEventHubNamespace -ResourceGroupName $ResourceGroup.ResourceGroupName
    # Iterate over each Namespace. Fetch the Resource Group that contains the provided Namespace
    foreach($EHNameSpace in $EventhubNameSpaces){
        if($EHNameSpace.Name -eq $NameSpaceName){
            $provided_name_space_exists = $true
            Write-Host ("Found the provided Namespace in resource group: {0}" -f $ResourceGroup.ResourceGroupName)
            Write-Output ("Found the provided Namespace in resource group: {0}" -f $ResourceGroup.ResourceGroupName)
            $nameSpace_resource_group = $ResourceGroup.ResourceGroupName
            # Fetch the existing Event Hubs in the Namespace
            $existing_event_hubs_list = Get-AzEventHub -Namespace $EHNameSpace.Name -ResourceGroupName $nameSpace_resource_group
            ## Check provided EH for uniqueness
            if($existing_event_hubs_list.Count -le 1000){
                for($i = 0; $i -lt $EventhubNames.Count; $i++){
                    if($existing_event_hubs_list.Name -notcontains $EventhubNames[$i]){
                        $EventHub = New-AzEventHub -ResourceGroupName $nameSpace_resource_group -Namespace $EHNameSpace.Name -Name $EventhubNames[$i] -PartitionCount $PartitionCount
                        $date = $EventHub.CreatedAt
                        $Creation_date += $date.GetDateTimeFormats()[46]
                    }else{
                        Write-Host ("Event hub: '{0}' already exists in the NameSpace: {1}. Skipping this Event hub creation." -f $EventhubNames[$i], $EHNameSpace.Name)
                    }
                }
            }else{
                Write-Host ("The Namespace - {0} has an Event Hubs count greater than or equal to 1000." -f $EHNameSpace.Name)
                Write-Host ("Please refer to the link for Event Hubs quotas/limits: 'https://learn.microsoft.com/en-us/azure/event-hubs/event-hubs-quotas#event-hubs-dedicated---quotas-and-limits'")
                exit
            }
        }
    }
}
# Print a message that the Namespace does not exist
if($provided_name_space_exists -eq $false){
    Write-Host ("Provided NameSpace: {0} does not exist." -f $NameSpaceName)
    exit
}
You have $NameSpaceNameName in the parameters section of the runbook, but later in the runbook (around the 50th line) you use $NameSpaceName, which is not the same name as declared in the parameters section. Correct it and the runbook should work as expected. One suggestion: always pair an if block with an else block so that such issues surface sooner.
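For example, renaming the parameter so it matches the name used in the script body:
Param(
    [Parameter(Mandatory=$true)]
    [string] $NameSpaceName,   # was $NameSpaceNameName; the body only ever references $NameSpaceName
    # ... remaining parameters unchanged ...
)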
I'm trying to import a database from a .bacpac file using New-AzureRmSqlDatabaseImport in PowerShell. The database is about 8 GB in size. When I set the required parameter DatabaseMaxSizeBytes to any amount greater than 5 GB, I receive the error:
Get-AzureRmSqlDatabaseImportExportStatus : BadRequest: The ImportExport operation with
Request Id 'a824a510-xxxxx' failed due to 'Error encountered during the service operation.
Could not import package.
Warning SQL0: A project which specifies SQL Server 2008 as the target platform may
experience compatibility issues with Microsoft Azure SQL Database v12.
Error SQL72014: .Net SqlClient Data Provider: Msg 40619, Level 16, State 1,
Line 1 The edition 'Premium' does not
support the database data max size '9663676416'.
Error SQL72045: Script execution error. The executed script:
CREATE DATABASE [$(DatabaseName)] COLLATE SQL_Latin1_General_CP1_CI_AS
(EDITION = 'Premium', SERVICE_OBJECTIVE = 'P1', MAXSIZE = 9 GB)
The command accepted a DatabaseMaxSizeBytes of 5GB but of course, the process failed when the import hit the 5GB ceiling.
The database import succeeds if I use the Azure portal.
Install the latest version of the SQL PowerShell module from here.
The following script works:
$password = ConvertTo-SecureString "MyPassword" -AsPlainText -Force
$userId = "MyEmail@MyEmail"
$cred = New-Object -TypeName System.Management.Automation.PSCredential($userId, $password)
Login-AzureRmAccount -Credential $cred -TenantId "MyTenantID"
$sourceserver = "DBServer"
$sourceresourcegroupname = "ResourceGroupName"
$sourcedatabasename = "DBName"
$copyDatabaseName = "CopyDatabaseName"
$DBImport = New-AzureRmSqlDatabaseImport -ResourceGroupName $sourceresourcegroupname `
-ServerName $sourceserver `
-DatabaseName $sourcedatabasename `
-DatabaseMaxSizeBytes "21474836480" `
-StorageKeyType "StorageAccessKey" `
-StorageKey $(Get-AzureRmStorageAccountKey -ResourceGroupName $sourceresourcegroupname -StorageAccountName devtestdatabase).Value[0] `
-StorageUri "https://mystorageAccount.blob.core.windows.net/testing/test.bacpac" `
-Edition "Standard" `
-ServiceObjectiveName "S9" `
-AdministratorLogin "adminUserName" `
-AdministratorLoginPassword $(ConvertTo-SecureString -String "AdminPassword" -AsPlainText -Force)
While ((Get-AzureRmSqlDatabaseImportExportStatus -OperationStatusLink $DBImport.OperationStatusLink).Status -eq "InProgress")
{
    Sleep -Seconds 10
}
I need to execute a batch of SQL statements to perform some maintenance tasks in my database, but all the Azure Automation examples I see deal with a single SQL command.
How do I do this if creating a stored procedure is not an option? I think I need to either somehow embed my script.sql file into the runbook script or reference it (like here, for example).
You could store the .sql file in Azure Blob Storage, and within the runbook download the .sql file, read its contents, and pass that to the SqlCommand object.
Something like:
try {
    # Connect to Azure using service principal auth
    $ServicePrincipalConnection = Get-AutomationConnection -Name $AzureConnectionAssetName
    Write-Output "Logging in to Azure..."
    $Null = Add-AzureRmAccount `
        -ServicePrincipal `
        -TenantId $ServicePrincipalConnection.TenantId `
        -ApplicationId $ServicePrincipalConnection.ApplicationId `
        -CertificateThumbprint $ServicePrincipalConnection.CertificateThumbprint
}
catch {
    if(!$ServicePrincipalConnection) {
        throw "Connection $AzureConnectionAssetName not found."
    }
    else {
        throw $_.Exception
    }
}
$Path = "C:\abc.sql"
Set-AzureRmCurrentStorageAccount -ResourceGroupName $ResourceGroupName -Name $StorageAccountName
Get-AzureStorageBlobContent -Container $Container -Blob $Blob -Destination $Path
$Content = Get-Content $Path -Raw    # read the whole script as a single string
$Cmd = New-Object System.Data.SqlClient.SqlCommand($Content, $Conn)
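The $Conn referenced above still needs to be created, and the command executed. A minimal continuation might look like the following (the connection-string values are placeholders; a script containing GO batch separators would need to be split first, since SqlCommand cannot execute GO):
$ConnectionString = "Server=tcp:<server>.database.windows.net,1433;Database=<database>;User ID=<user>;Password=<password>;Encrypt=True;"
$Conn = New-Object System.Data.SqlClient.SqlConnection($ConnectionString)
$Conn.Open()
$Cmd = New-Object System.Data.SqlClient.SqlCommand($Content, $Conn)   # as above, once $Conn exists
$Cmd.CommandTimeout = 300   # maintenance scripts can run long
$Null = $Cmd.ExecuteNonQuery()
$Conn.Close()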
I've written a PowerShell script that creates a new SQL Server database and login, and then sets the database owner to the newly created user. This is successful. However, I get a login-failed exception when attempting to log in within the same script. If I use SQL Server Management Studio, the login works.
Here's the script:
$server = new-Object Microsoft.SqlServer.Management.Smo.Server("(local)")
$db = New-Object Microsoft.SqlServer.Management.Smo.Database($server, 'TestDB')
$db.Create()
$login = new-object Microsoft.SqlServer.Management.Smo.Login("(local)", 'TestUser')
$login.LoginType = 'SqlLogin'
$login.PasswordPolicyEnforced = $false
$login.PasswordExpirationEnabled = $false
$login.Create('Password1')
$server = new-Object Microsoft.SqlServer.Management.Smo.Server("(local)")
$db = New-Object Microsoft.SqlServer.Management.Smo.Database
$db = $server.Databases.Item('TestDB')
$db.SetOwner('TestUser', $TRUE)
$db.Alter()
Invoke-Sqlcmd -ServerInstance localhost -Database 'TestDB' -Username 'TestUser' -Password 'Password1' -Query "SELECT * FROM sysusers"
I've tried adding a Start-Sleep (up to 5mins) to no avail, and I've tried Restart-Service mssqlserver -Force, also to no avail.
Any ideas?
This isn't an answer to the problem I was encountering, just a workaround. The script is run as part of an automated deployment, and the overall scripts run under the "NT AUTHORITY\SYSTEM" account, so to get around my login issue I'm simply using Integrated Security=true.
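That workaround boils down to dropping the SQL credentials from the final call and relying on Windows authentication, roughly:
# Runs under the calling identity (here NT AUTHORITY\SYSTEM) instead of the new SQL login
Invoke-Sqlcmd -ServerInstance '(local)' -Database 'TestDB' -Query "SELECT * FROM sysusers"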
I think your final line should read:
Invoke-Sqlcmd -ServerInstance '(local)' -Database 'TestDB' -Username 'TestUser' -Password 'Password1' -Query "SELECT * FROM sysusers"
Notice the use of '(local)' rather than 'localhost'.
Follow the code below:
$SqlServer = "server.site.com or server IP with port"
$SqlDBName = "dbName"
$sqlConnection = New-Object Microsoft.SqlServer.Management.Common.ServerConnection
$sqlConnection.ServerInstance = $SqlServer
$sqlConnection.LoginSecure = $false
$sqlConnection.Login = "user id, if you have one"
$sqlConnection.Password = "password, if needed to connect to SQL Server"
Add-Type -Path "C:\Program Files\Microsoft SQL Server\140\SDK\Assemblies\Microsoft.SqlServer.Smo.dll"
$server = New-Object Microsoft.SqlServer.Management.Smo.Server($sqlConnection)
# get all of the current logins and their types
$server.Logins |
Select-Object Name, LoginType, Parent
# create a new login by prompting for new credentials
$NewLoginCredentials = Get-Credential -Message "Enter credentials for the new login"
$NewLogin = New-Object Microsoft.SqlServer.Management.Smo.Login($server, $NewLoginCredentials.UserName)
$NewLogin.LoginType = [Microsoft.SqlServer.Management.Smo.LoginType]::SqlLogin
$NewLogin.Create($NewLoginCredentials.Password)
# create a new database user for the newly created login
$NewUser = New-Object Microsoft.SqlServer.Management.Smo.User($server.Databases[$SqlDBName], $NewLoginCredentials.UserName)
$NewUser.Login = $NewLoginCredentials.UserName
$NewUser.Create()
$NewUser.AddToRole("db_datareader")
We need to back up the Azure database and store the backup in blob storage so that it can be restored later. I've seen this blog, but it uses third-party cmdlets.
http://weblogs.thinktecture.com/cweyer/2011/01/automating-backup-of-a-sql-azure-database-to-azure-blob-storage-with-the-help-of-powershell-and-task-scheduler.html
Could someone please explain how the above can be achieved using PowerShell?
Backing up to Windows Azure Blob Storage is not supported from Azure SQL Database; instead, the service takes automatic backups for you with point-in-time restore (PITR) capability. You'll find the following documentation useful:
http://msdn.microsoft.com/en-us/library/azure/hh852669.aspx
http://msdn.microsoft.com/en-us/library/azure/jj650016.aspx
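As an illustration of the PITR capability, a restore to an earlier point in time looks roughly like this with the current Az module (the resource names and timestamp are placeholders; this answer predates the Az cmdlets):
# Restore the database as a new copy from its automatic backups (point-in-time restore)
$db = Get-AzSqlDatabase -ResourceGroupName "myResourceGroup" -ServerName "myserver" -DatabaseName "mydb"
Restore-AzSqlDatabase -FromPointInTimeBackup `
    -PointInTime "2023-01-01T12:00:00Z" `
    -ResourceGroupName $db.ResourceGroupName `
    -ServerName $db.ServerName `
    -TargetDatabaseName "mydb-restored" `
    -ResourceId $db.ResourceId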
Hope this helps.
Here is my PowerShell script:
https://gist.github.com/voxon2/be29a3fd6dabbb9155ca
Here is an article describing many different approaches other than PowerShell:
http://blogs.msdn.com/b/mast/archive/2013/03/04/different-ways-to-backup-your-windows-azure-sql-database.aspx
First get your Azure Automation settings done (see here).
Edit the script below and save it as a .ps1 file. When you run it for the first time, it will ask for both your Azure automation account and your database credentials. During this process it saves your credentials securely in a local file (see here for how that is done). From then on it uses the saved credentials.
The .ps1 file and the encrypted credential files should be stored in the same directory.
Once you are happy with it, you can schedule it to run in Task Scheduler.
function Get-MyCredential
{
    param(
        $CredPath,
        [switch]$Help
    )
    $HelpText = @"
Get-MyCredential
Usage:
Get-MyCredential -CredPath `$CredPath
If a credential is stored in $CredPath, it will be used.
If no credential is found, Export-Credential will start and offer to
store a credential at the location specified.
"@
    if($Help -or (!($CredPath))){ Write-Host $HelpText; Break }
    if (!(Test-Path -Path $CredPath -PathType Leaf)) {
        Export-Credential (Get-Credential) $CredPath
    }
    $cred = Import-Clixml $CredPath
    $cred.Password = $cred.Password | ConvertTo-SecureString
    $Credential = New-Object System.Management.Automation.PSCredential($cred.UserName, $cred.Password)
    Return $Credential
}
function Export-Credential($cred, $path) {
    $cred = $cred | Select-Object *
    $cred.password = $cred.Password | ConvertFrom-SecureString
    $cred | Export-Clixml $path
}
#Create a directory with your Azure server name to isolate configurations
$FileRootPath = "C:\PowerShellScripts\AzureServerName"
Write-Host "Getting Azure credentials"
$AzureCred = Get-MyCredential ($FileRootPath + "\AzureSyncred.txt")
#Use Azure Automation Account
#(If you do not have one, this will not work with other accounts)
Add-AzureAccount -Credential $AzureCred
Select-AzureSubscription -SubscriptionId "myAzureSubscriptionId"
#DO NOT use tcp:myServerName.database.windows.net,1433 but only myServerName
$ServerName = "myServerName"
$Date = Get-Date -format "yyyy-MM-dd-HH-mm"
$DatabaseName = "myTargetDatabaseName"
$BlobName = $Date + "-" + $DatabaseName + ".bacpac"
$StorageName = "myStorageAccountName"
$ContainerName = "myContainerNameToStoreBacpacFiles"
$StorageKey = "myStorageAccountKey"
Write-Host "Getting database user credential"
#DO NOT use myDatabaseUsername@myServerName but only myDatabaseUsername
$credential = Get-MyCredential ($FileRootPath + "\DbSyncred.xml")
Write-Host "Connecting to Azure database"
$SqlCtx = New-AzureSqlDatabaseServerContext -ServerName $ServerName -Credential $credential
Write-Host "Connecting to Blob storage"
$StorageCtx = New-AzureStorageContext -StorageAccountName $StorageName -StorageAccountKey $StorageKey
$Container = Get-AzureStorageContainer -Name $ContainerName -Context $StorageCtx
Write-Host "Exporting data to blob"
$exportRequest = Start-AzureSqlDatabaseExport -SqlConnectionContext $SqlCtx -StorageContainer $Container -DatabaseName $DatabaseName -BlobName $BlobName
Get-AzureSqlDatabaseImportExportStatus -Request $exportRequest
# Use the command below in PowerShell to execute the saved script
# powershell -ExecutionPolicy ByPass -File C:\PowerShellScripts\AzureServerName\mySavedScript.ps1 -NoExit