Create keyvault secret - Operation returned an invalid status code 'Conflict' - azure-powershell

I want to create multiple secrets in Key Vault and assign them dynamic values from a Blob Storage account and a Batch account.
I tried the code below to create the secrets:
Function CreateKeyvaultSecrets
{
    Param
    (
        [Parameter(Mandatory=$true, Position=0)]
        [string] $keyvaultName,
        [Parameter(Mandatory=$true, Position=1)]
        [string] $blobStorageAccountName,
        [Parameter(Mandatory=$true, Position=2)]
        [string] $batchaccountName,
        [Parameter(Mandatory=$true, Position=3)]
        [string] $logRgName
    )

    # Get the storage account key (key1)
    $blobStorageKeyObject = (Get-AzStorageAccountKey -ResourceGroupName $logRgName -AccountName $blobStorageAccountName) | Where-Object {$_.KeyName -eq "key1"}
    $blobStorageKey = $blobStorageKeyObject.Value
    $blobStorageConnectionString = "DefaultEndpointsProtocol=https;AccountName=$blobStorageAccountName;AccountKey=$blobStorageKey;EndpointSuffix=core.windows.net"

    # Create the blob storage access key secret
    $blobSecretkey = ConvertTo-SecureString -String $blobStorageKey -AsPlainText -Force
    Set-AzKeyVaultSecret -VaultName $keyvaultName -Name 'blobstorageaccesskey' -SecretValue $blobSecretkey

    # Create the blob storage connection string secret
    $blobconnectionstringSecret = ConvertTo-SecureString -String $blobStorageConnectionString -AsPlainText -Force
    Set-AzKeyVaultSecret -VaultName $keyvaultName -Name 'blobstorageconnectionstring' -SecretValue $blobconnectionstringSecret
    Write-Host "Blob Storage Account connection string added to Keyvault secret"
}

CreateKeyvaultSecrets 'kvtevalmock' 'steval' 'abtaeval' 'rg-eval'
I am trying to execute the above code from an Azure DevOps PowerShell task; the Azure PowerShell version is 5.
The secrets are not getting created. The error below is thrown:
WARNING: Upcoming breaking changes in the cmdlet 'Set-AzKeyVaultSecret' :
- The output type 'Microsoft.Azure.Commands.KeyVault.Models.PSKeyVaultSecret' is changing
- The following properties in the output type are being deprecated : 'SecretValueText'
- The change is expected to take effect from the version : '3.0.0'
Note : Go to https://aka.ms/azps-changewarnings for steps to suppress this breaking change warning, and other
information on breaking changes in Azure PowerShell.
##[error]Operation returned an invalid status code 'Conflict'
##[error]PowerShell exited with code '1'.

I tested your script on my side and it works fine.
From the error message, it looks like your Az.KeyVault PowerShell module version is too old; my version is 3.4.0. Try updating it with the command below.
Update-Module -Name Az.KeyVault -Force
After the update, close all PowerShell sessions and open a new one to try again; it should work.
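If you want to confirm which Az.KeyVault version the agent actually ends up with, a minimal check with plain Get-Module calls (nothing specific to this pipeline) looks like this:
# List every Az.KeyVault version installed on the machine
Get-Module -Name Az.KeyVault -ListAvailable | Select-Object Name, Version
# Show the version loaded in the current session, if any
Get-Module -Name Az.KeyVault | Select-Object Name, Version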

Related

Error MSB3325: Cannot import the following key file

I have a project hosted in Azure DevOps, and the build is failing with the error message:
Error MSB3325: Cannot import the following key file: xxxx.pfx. The key
file may be password protected. To correct this, try to import the
certificate again or manually install the certificate to the Strong
Name CSP with the following key container name: VS_KEY_xxxx
This happens after the project was changed to sign the assembly with a newly generated, password-protected PFX signing certificate.
I have tried various fixes given in other SO posts and nothing seems to work.
Can anyone with Azure DevOps expertise help me with this situation?
You can use SnInstallPfx.exe and add it to your pipeline as a PowerShell task:
- task: PowerShell@2
  env:
    SN_INSTALL_PFX: $(snInstallPfx.secureFilePath)
    MYCERTIFICATE_PFX: $(myCertificatePfx.secureFilePath)
    MYCERTIFICATE_PFX_PASSWORD: $(myCertificatePfxPassword)
  inputs:
    targetType: 'inline'
    script: '&"$($ENV:SN_INSTALL_PFX)" "$($ENV:MYCERTIFICATE_PFX)" "$($ENV:MYCERTIFICATE_PFX_PASSWORD)"'
The pfx, exe and password are stored in the Pipeline library as secure files and variables.
For more information, see the following blog article.
Error MSB3325: Cannot import the following key file
You can create a PowerShell script and add a PowerShell Script step in your build definition to import the new certificate file before the VSBuild step:
The PowerShell script I used:
$pfxpath = 'pathtoees.pfx'
$password = 'password'

Add-Type -AssemblyName System.Security
$cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2
$cert.Import($pfxpath, $password, [System.Security.Cryptography.X509Certificates.X509KeyStorageFlags]"PersistKeySet")
$store = New-Object System.Security.Cryptography.X509Certificates.X509Store -ArgumentList "MY", "CurrentUser"
$store.Open([System.Security.Cryptography.X509Certificates.OpenFlags]"ReadWrite")
$store.Add($cert)
$store.Close()
And it works fine on my side.
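If you want a quick sanity check that the import worked before the VSBuild step runs, a small snippet like this (reusing the $cert variable from the script above) lists the matching entry in the CurrentUser personal store:
# Confirm the certificate now exists in the CurrentUser\My store
Get-ChildItem Cert:\CurrentUser\My | Where-Object { $_.Thumbprint -eq $cert.Thumbprint }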
You can check this similar thread for more details.
Hope this helps.

Azure DevOps Server pipeline build fails when using self-signed SSL certificate with "unable to get local issuer certificate" during NuGet restore

After upgrading to Azure DevOps Server 2019, automated pipeline builds are failing at the NuGet restore step showing:
Error: Error: unable to get local issuer certificate
Packages failed to restore
Microsoft's documentation states that the build agent running on Windows uses the Windows certificate store, so I have checked that the required certificates are installed correctly on the build server; however, it is still failing.
There are many questions with similar symptoms but not the same cause. After investigation, I have found the solution, and since I didn't spot anything on this exact issue, I will post an answer that will hopefully save somebody else some time!
It turns out that the Azure DevOps build agent is using a version of Node.js that doesn't use the Windows Certificate Store.
The solution is to point Node.js at an exported copy (a *.cer file) of your self-signed SSL certificate's root CA certificate, using either a system environment variable called NODE_EXTRA_CA_CERTS or a task variable called NODE.EXTRA.CA.CERTS, with the value pointing to the certificate.
Developer Community Issue Link
I use a PowerShell agent job with the following script. This effectively gives a "use the Windows machine certificate store" option to Node.js for the pipeline.
Some notes:
Monitoring node.exe with ProcMon suggests that the file referenced in NODE_EXTRA_CA_CERTS is read every time the pipeline is run. However, others have suggested that running Restart-Service vstsagent* -Force is required for the change to be picked up. This isn't my experience, but perhaps something different between environments causes this behaviour.
This adds roughly one second of pipeline execution time. Probably an acceptable price for set-and-forget certificate management for Node in pipelines on Windows, but worth noting nonetheless.
# If running in a pipeline then use the Agent Home directory,
# otherwise use the machine temp folder, which is useful for testing
if ($env:AGENT_HOMEDIRECTORY -ne $null) { $TargetFolder = $env:AGENT_HOMEDIRECTORY }
else { $TargetFolder = [System.Environment]::GetEnvironmentVariable('TEMP','Machine') }

# Loop through each CA cert in the machine store
Get-ChildItem -Path Cert:\LocalMachine\CA | ForEach-Object {
    # Convert the cert's bytes to Base64-encoded text and add begin/end markers
    $Cert = "-----BEGIN CERTIFICATE-----`n"
    $Cert += [System.Convert]::ToBase64String($_.Export([System.Security.Cryptography.X509Certificates.X509ContentType]::Cert), 'InsertLineBreaks')
    $Cert += "`n-----END CERTIFICATE-----`n"
    # Append the cert to the chain
    $Chain += $Cert
}

# Build the target path
$CertFile = "$TargetFolder\TrustedRootCAs.pem"

# Write the chain to the file system
$Chain | Out-File $CertFile -Force -Encoding ASCII

# Clean up
$Chain = $null

# Let Node (running later in the pipeline) know from where to read certs
Write-Host "##vso[task.setvariable variable=NODE.EXTRA.CA.CERTS]$CertFile"
I formatted the PowerShell script from @alifen. The script below can be executed on the build agent itself. It takes a parameter for the target path and sets the environment variable on the server.
Credit to @alifen.
[CmdletBinding()]
param (
    [Parameter()]
    [string]
    $TargetFolder = "$env:SystemDrive\Certs"
)

# Create the target folder if it does not exist yet
if (-not (Test-Path $TargetFolder))
{
    $null = New-Item -ItemType Directory -Path $TargetFolder -Force
}

# Loop through each CA cert in the machine store
Get-ChildItem -Path Cert:\LocalMachine\CA | ForEach-Object {
    # Convert the cert's bytes to Base64-encoded text and add begin/end markers
    $Cert = "-----BEGIN CERTIFICATE-----`n"
    $Cert += [System.Convert]::ToBase64String($_.Export([System.Security.Cryptography.X509Certificates.X509ContentType]::Cert), 'InsertLineBreaks')
    $Cert += "`n-----END CERTIFICATE-----`n"
    # Append the cert to the chain
    $Chain += $Cert
}

# Build the target path
$CertFile = "$TargetFolder\TrustedRootCAs.pem"

# Write the chain to the file system
Write-Host "[$($MyInvocation.MyCommand.Name)]: Exporting certs to: [$CertFile]"
$Chain | Out-File $CertFile -Force -Encoding ASCII

# Set the machine-level environment variable
Write-Host "[$($MyInvocation.MyCommand.Name)]: Setting environment variable [NODE_EXTRA_CA_CERTS] to [$CertFile]"
[Environment]::SetEnvironmentVariable("NODE_EXTRA_CA_CERTS", "$CertFile", "Machine")
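For reference, running it on the agent might look like the following; the script file name is just a placeholder, and the Restart-Service line is only needed if your agent behaves like the environments mentioned above where the change is not picked up immediately:
# Hypothetical file name for the script above
.\Export-TrustedRootCAs.ps1 -TargetFolder 'D:\Certs'
# Optionally restart the agent service so the new machine-level variable is picked up
Restart-Service vstsagent* -Force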

Remove-AzureRmSqlDatabase keeps asking me to run LoginAzureRmAccount

I have an Azure SQL database which I want to delete. The command should be:
Remove-AzureRmSqlDatabase -ResourceGroupName $dbResourceGroup -ServerName $dbServerName -DatabaseName $dbToDelete -Whatif -Force
The error I keep getting back is
Remove-AzureRmSqlDatabase : Run Login-AzureRmAccount to login.
I tried running Login-AzureRmAccount as myself, then as a service principal I use for unattended scripts, and nothing worked.
I am able to log into the Azure RM portal and delete databases. I am also able to run Invoke-SqlCmd against this database to query and manipulate data.
How can I make this work?
According to the error message, it seems that you have not logged in to Azure with the right subscription.
We can use this command to get the SQL database's information and check the subscription:
(Get-AzureRmSqlDatabase -DatabaseName jasontest1 -ServerName jasontest -ResourceGroupName jasontest).resourceid
Then we can find the subscription ID in the PowerShell output:
PS C:\Users> (Get-AzureRmSqlDatabase -DatabaseName jasontest1 -ServerName jasontest -ResourceGroupName jasontest).resourceid
/subscriptions/5384xxxx-xxxx-xxxx-xxxx-xxxxe29axxxx/resourceGroups/jasontest/providers/Microsoft.Sql/servers/jasontest/databases/jasontest1
To find which subscription that is, we can use Get-AzureRmSubscription to list the subscriptions:
PS C:\Users> Get-AzureRmSubscription
Name : Visual Studio Ultimate with MSDN
Id : 5384xxxx-xxxx-xxxx-xxxx-xxxxe29axxxx
TenantId : 1fcfxxx-xxxx-xxxx-xxxx9-xxxx8bf8xxxx
State : Enabled
We can then use this command to select that subscription:
Select-AzureRmSubscription -SubscriptionId 5384xxxx-xxxx-xxxx-xxx-xxxxe29axxxx
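If you are running this unattended with the service principal mentioned in the question, a rough sketch of logging in and pinning the right subscription (the app ID, secret, and tenant ID below are placeholders) would be:
# Service principal credentials; the app ID and client secret are placeholders
$securePassword = ConvertTo-SecureString 'client-secret-placeholder' -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential ('app-id-placeholder', $securePassword)
# Log in as the service principal and select the subscription that owns the database
Login-AzureRmAccount -ServicePrincipal -Credential $credential -TenantId 'tenant-id-placeholder'
Select-AzureRmSubscription -SubscriptionId '5384xxxx-xxxx-xxxx-xxxx-xxxxe29axxxx'
Remove-AzureRmSqlDatabase -ResourceGroupName $dbResourceGroup -ServerName $dbServerName -DatabaseName $dbToDelete -Force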
My PowerShell Azure modules had dependency errors. To fix it, I ran (at the behest of Microsoft tech support):
PS C:\> Install-Module AzureRM -Force
This re-installed it and fixed the dependency problems.
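If you want to see what ended up installed after the reinstall, a quick check is:
# List the installed AzureRM module versions
Get-Module -Name AzureRM* -ListAvailable | Select-Object Name, Version | Sort-Object Name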

Recompose VM using vmware 5.5 power cli

How can I recompose a VM v1 using an existing parent VM v2's snapshot s2?
After referring to a few documents, I found the Send-LinkedCloneRecompose command for recomposing a VM.
I am trying this command as follows:
$myVM = 'v1'
$clone = Get-DesktopVM -Name $myVM
$pool = Get-Pool -pool_id $clone.pool_id
$date = Get-Date
$date = $date.AddSeconds(10)
Write-Host "Recomposing" $clone.name
Send-LinkedCloneRecompose -machine_id $clone.machine_id -parentVMPath $pool.parentVMPath -parentSnapshotPath $pool.parentVMSnapshotPath -schedule $date -forcelogoff $true | tee-object -variable vmState
Here I am getting the following error:
Get-DesktopVM : The term 'Get-DesktopVM' is not recognized as the name of a
cmdlet, function, script file, or operable program. Check the spelling of the
name, or if a path was included, verify that the path is correct and try again.
I am getting the same error for the PowerCLI cmdlets Get-Pool and Send-LinkedCloneRecompose as well.
I am using VMware vSphere PowerCLI 5.5 Release 2 Patch 1.
Can anyone please help me understand the problem here?
I had missed adding
& "C:\Program Files\VMware\VMware View\Server\extras\PowerShell\add-snapin.ps1"
before executing the above commands, so the cmdlets were not recognized.
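A small guard like this avoids re-adding it on repeated runs; the snap-in name VMware.View.Broker is my assumption about what add-snapin.ps1 registers, and the script path is the one from above:
# Load the View PowerCLI snap-in only if it is not already in the session
# (VMware.View.Broker is an assumed snap-in name)
if (-not (Get-PSSnapin -Name VMware.View.Broker -ErrorAction SilentlyContinue)) {
    & "C:\Program Files\VMware\VMware View\Server\extras\PowerShell\add-snapin.ps1"
}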

How to use Cloudberry Powershell Snap-in (for Amazon S3) from within a scheduled SQL Agent Job?

I am trying to automate my SQL database backup process. My goal is to use the CloudBerry PowerShell cmdlets to give me direct control and access over my S3 buckets. I am able to do this manually but cannot get my SQL jobs to work with it.
According to CloudBerry's installation instructions, I shouldn't have to register the CloudBerry PowerShell snap-in if PowerShell is already installed. I have found that to be false. I have tried to register it, both 64-bit and 32-bit, with no luck.
This works when executed manually/explicitly from the ISE:
Add-PSSnapin CloudBerryLab.Explorer.PSSnapIn
$today = Get-Date -format "yyyy.MM.dd.HH.mm.ss"
$key = "mykeygoeshere"
$secret = "mysecretgoeshere"
$s3 = Get-CloudS3Connection -Key $key -Secret $secret
$destination = $s3 | Select-CloudFolder -path "ProductionBackups/MyClient/log/" | Add-CloudFolder $today
$src = Get-CloudFilesystemConnection | Select-CloudFolder "X:\backups\MyClient\current\"
$src | Copy-CloudItem $destination -filter "log.trn"
When this script is executed in a SQL Agent job, it fails with this message:
Executed as user: DB-MAIN\SYSTEM. A job step received an error at line 1 in a PowerShell script. The corresponding line is 'Add-PSSnapin CloudBerryLab.Explorer.PSSnapIn'. Correct the script and reschedule the job. The error information returned by PowerShell is: 'The term 'Add-PSSnapin' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again. '. Process Exit Code -1. The step failed.
I read in this blog post that SQLPS.exe cannot execute Add-PSSnapin commands. Is that true? I cannot find any clarification on the subject.
How can I automate copying my SQL backup files to Amazon S3? I have tried everything. TNT Drive was a huge waste of time. I am hoping CloudBerry can do it; any tips?
You could use the Amazon AWS .NET SDK. You can download it from here:
http://aws.amazon.com/sdkfornet/
Here's an example function to download a file from S3:
function DownloadS3File([string]$bucket, [string]$file, [string]$localFile)
{
    # Load the AWS SDK for .NET assembly from its default install location
    if (Test-Path "C:\Program Files (x86)")
    {
        Add-Type -Path "C:\Program Files (x86)\AWS SDK for .NET\bin\AWSSDK.dll"
    }
    else
    {
        Add-Type -Path "C:\Program Files\AWS SDK for .NET\bin\AWSSDK.dll"
    }

    # Read the credentials from environment variables
    $secretKeyID = $env:AWS_ACCESS_KEY_ID
    $secretAccessKeyID = $env:AWS_SECRET_ACCESS_KEY

    # Create the S3 client and request the object
    $client = [Amazon.AWSClientFactory]::CreateAmazonS3Client($secretKeyID, $secretAccessKeyID)
    $request = New-Object -TypeName Amazon.S3.Model.GetObjectRequest
    $request.BucketName = $bucket
    $request.Key = $file
    $response = $client.GetObject($request)

    # Stream the response body to the local file in 4 KB chunks
    $writer = New-Object System.IO.FileStream ($localFile, [System.IO.FileMode]::Create)
    [byte[]]$buffer = New-Object byte[] 4096
    [int]$total = [int]$count = 0
    do
    {
        $count = $response.ResponseStream.Read($buffer, 0, $buffer.Length)
        $writer.Write($buffer, 0, $count)
    }
    while ($count -gt 0)
    $response.ResponseStream.Close()
    $writer.Close()
    echo "File downloaded: $localFile"
}
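A call would look something like this (the bucket, key, and local path are placeholders):
# Hypothetical values; adjust the bucket, key, and destination for your environment
DownloadS3File -bucket 'my-backup-bucket' -file 'ProductionBackups/MyClient/log/log.trn' -localFile 'X:\backups\MyClient\restore\log.trn'
As for the SQL Agent side, if the PowerShell (SQLPS) step type keeps rejecting snap-in or SDK loading, a common workaround is an Operating system (CmdExec) job step that launches the full powershell.exe host against a saved script, for example:
powershell.exe -NoProfile -ExecutionPolicy Bypass -File "C:\Scripts\CopyBackupToS3.ps1"
The script path here is, again, just a placeholder.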