Removing a self-signed certificate from my store - ssl

Is there a way to remove/uninstall a self-signed certificate from my store using PowerShell?
I tried
Remove-Item cert:\LocalMachine\My\$thumb
It did not work; I got an exception saying "Provider does not support this operation".
I also tried
certmgr.msc /del /n "MyTestServer" /s MY
It did not work either.
How can I uninstall a certificate from the store?
Thanks in advance
Jeez

This approach seems to apply to PowerShell 2 only, so it is outdated.
Remove-Item does not work with certificates because the certificate provider is read-only in that version of PowerShell. Found that information here.
$store = New-Object System.Security.Cryptography.X509Certificates.X509Store 'My','CurrentUser'
$store.Open('ReadWrite')
# Find the certificate(s) to delete by subject
$certs = @(dir Cert:\CurrentUser\My | ? { $_.Subject -like '*MyTestServer*' })
foreach ($cert in $certs) { $store.Remove($cert) }
$store.Close()
I found the solution here in the comments, so it is untested.
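To confirm the removal took effect, a minimal check (assuming the same *MyTestServer* subject filter as above) is to list the store again and expect no output:
# Should return nothing once the certificate has been removed
Get-ChildItem Cert:\CurrentUser\My | Where-Object { $_.Subject -like '*MyTestServer*' }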

Found this article because remove-item wasn't working.
This is not exactly 'true' powershell, but I use this method:
certutil -delstore my "5314bdfa0255be36e53e749d033"
You can get the thumbprint via cert:\LocalMachine\My or through certutil. In my case, I have multiple certs with exactly the same name, so I prefer the method above because it gives me a specific target when I delete a cert.
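For example, here is a quick sketch of both ways to look up the thumbprint before deleting; the 'MyTestServer' subject filter is only an illustration:
# Via the certificate provider: shows the Thumbprint and Subject of matching certs
Get-ChildItem Cert:\LocalMachine\My | Where-Object { $_.Subject -like '*MyTestServer*' } | Format-List Thumbprint, Subject
# Via certutil: dumps the machine's personal store, including the "Cert Hash(sha1)" line
certutil -store my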

With PS 3.0, if you want to remove by subjectName
Get-ChildItem -Path Cert:\CurrentUser\My | where { $_.subject -eq "CN=MysubjectName" } | Remove-Item

With PS 3.0 there is a more concise and idiomatic approach:
Remove-Item -Path cert:\LocalMachine\My\{Thumbprint} -DeleteKey
See TechNet for all the details.
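As a concrete sketch of that (assuming a single matching certificate and an elevated session, since it touches the LocalMachine store):
# Look up the thumbprint by subject, then remove the certificate together with its private key
$thumb = (Get-ChildItem Cert:\LocalMachine\My | Where-Object { $_.Subject -like '*MyTestServer*' }).Thumbprint
Remove-Item -Path "Cert:\LocalMachine\My\$thumb" -DeleteKey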

This will work as well in PowerShell.
To get the thumbprint:
dir cert:\localmachine\my
To delete the certificate by thumbprint:
del cert:\localmachine\my\thumbprint

Realise this is an old thread, but since I'm looking at doing the same right now I thought I'd post. I need to remove certificates from all cert stores by friendly name.
Realise this isn't the answer for the OP, but it may help someone.
If anyone needs that, this works for me:
dir cert: -Recurse | Where-Object { $_.FriendlyName -like "*SOMENAME*" } | Remove-Item
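If you want to see what would be removed before committing, a -WhatIf pass is a cheap safety net (same *SOMENAME* filter as above):
# Dry run: lists the certificates the pipeline would delete without touching them
dir cert: -Recurse | Where-Object { $_.FriendlyName -like "*SOMENAME*" } | Remove-Item -WhatIf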

You are set on the wrong cert store.
Use $cert = Get-ChildItem -Path "Cert:\CurrentUser\My\THUMBPRINT" instead of cert:\LocalMachine\My\$thumb. You say the certificates are yours, so they are stored under "Cert:\CurrentUser\My\THUMBPRINT". CurrentUser is your user account; you don't need to change it to your account name.
br.

Related

Create keyvault secret - Operation returned an invalid status code 'Conflict'

I want to create multiple secrets in Key Vault and assign them the dynamic values of a Blob storage account and a Batch account.
I tried the code below to create the secrets:
Function CreateKeyvaultSecrets
{
    Param
    (
        [Parameter(Mandatory=$true, Position=0)]
        [string] $keyvaultName,
        [Parameter(Mandatory=$true, Position=1)]
        [string] $blobStorageAccountName,
        [Parameter(Mandatory=$true, Position=2)]
        [string] $batchaccountName,
        [Parameter(Mandatory=$true, Position=3)]
        [string] $logRgName
    )
    # Get storage key
    $blobStorageKeyObject = (Get-AzStorageAccountKey -ResourceGroupName $logRgName -AccountName $blobStorageAccountName) | Where-Object {$_.KeyName -eq "key1"}
    $blobStorageKey = $blobStorageKeyObject.Value
    $blobStorageConnectionString = "DefaultEndpointsProtocol=https;AccountName=$blobStorageAccountName;AccountKey=$blobStorageKey;EndpointSuffix=core.windows.net"
    # Create blob storage key secret
    $blobSecretkey = ConvertTo-SecureString -String $blobStorageKey -AsPlainText -Force
    Set-AzKeyVaultSecret -VaultName $keyvaultName -Name 'blobstorageaccesskey' -SecretValue $blobSecretkey
    # Create blob storage connection string secret
    $blobconnectionstringSecret = ConvertTo-SecureString -String $blobStorageConnectionString -AsPlainText -Force
    Set-AzKeyVaultSecret -VaultName $keyvaultName -Name 'blobstorageconnectionstring' -SecretValue $blobconnectionstringSecret
    Write-Host "Blob Storage Account connection string added to Keyvault secret"
}
CreateKeyvaultSecrets 'kvtevalmock' 'steval' 'abtaeval' 'rg-eval'
I am trying to execute the above code from an Azure DevOps PowerShell task. The Azure PowerShell version is 5.
The secrets are not getting created. The error below is thrown:
WARNING: Upcoming breaking changes in the cmdlet 'Set-AzKeyVaultSecret' :
- The output type 'Microsoft.Azure.Commands.KeyVault.Models.PSKeyVaultSecret' is changing
- The following properties in the output type are being deprecated : 'SecretValueText'
- The change is expected to take effect from the version : '3.0.0'
Note : Go to https://aka.ms/azps-changewarnings for steps to suppress this breaking change warning, and other
information on breaking changes in Azure PowerShell.
##[error]Operation returned an invalid status code 'Conflict'
##[error]PowerShell exited with code '1'.
I tested your script on my side and it works fine.
From the error message, it looks like your Az.KeyVault PowerShell module version is too old; my version is 3.4.0. Try to update it with the command below.
Update-Module -Name Az.KeyVault -Force
After the update, close all PowerShell sessions and open a new one to try again; it should work.
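If you want to check what you currently have before and after the update, here is a quick sketch using standard module cmdlets:
# Show every installed version of the Az.KeyVault module
Get-Module -ListAvailable -Name Az.KeyVault | Select-Object Name, Version
# In a fresh session after updating, confirm the version that actually loads
Import-Module Az.KeyVault
(Get-Module Az.KeyVault).Version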

Error MSB3325: Cannot import the following key file

I have a project hosted in Azure DevOps and there the build is failing with the error message:
Error MSB3325: Cannot import the following key file: xxxx.pfx. The key file may be password protected. To correct this, try to import the certificate again or manually install the certificate to the Strong Name CSP with the following key container name: VS_KEY_xxxx
This happens after a project has been changed to sign the assembly with a newly generated, password-protected .pfx signing certificate.
I have tried various fixes given in other SO posts and nothing seems to work.
Can anyone with azure-devops expertise help me with this situation?
You can use SnInstallPfx.exe and add it to your pipeline as a PowerShell task:
- task: PowerShell@2
  env:
    SN_INSTALL_PFX: $(snInstallPfx.secureFilePath)
    MYCERTIFICATE_PFX: $(myCertificatePfx.secureFilePath)
    MYCERTIFICATE_PFX_PASSWORD: $(myCertificatePfxPassword)
  inputs:
    targetType: 'inline'
    script: '&"$($ENV:SN_INSTALL_PFX)" "$($ENV:MYCERTIFICATE_PFX)" "$($ENV:MYCERTIFICATE_PFX_PASSWORD)"'
The pfx, exe and password are stored in the Pipeline library as secure files and variables.
For more information, see the following blog article.
Error MSB3325: Cannot import the following key file
You can create a PowerShell script and add a PowerShell Script step in your build definition to import the new certificate file before the VSBuild step:
The PowerShell script I used to use:
$pfxpath = 'pathtoees.pfx'
$password = 'password'
Add-Type -AssemblyName System.Security
# Load the pfx and keep its private key persisted
$cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2
$cert.Import($pfxpath, $password, [System.Security.Cryptography.X509Certificates.X509KeyStorageFlags]"PersistKeySet")
# Add it to the current user's personal store
$store = New-Object System.Security.Cryptography.X509Certificates.X509Store -ArgumentList "MY", "CurrentUser"
$store.Open([System.Security.Cryptography.X509Certificates.OpenFlags]"ReadWrite")
$store.Add($cert)
$store.Close()
And it works fine on my side.
You can check this similar thread for some more details.
Hope this helps.

Azure DevOps Server pipeline build fails when using self-signed SSL certificate with "unable to get local issuer certificate" during NuGet restore

After upgrading to Azure DevOps Server 2019, automated pipeline builds are failing at the NuGet restore step showing:
Error: Error: unable to get local issuer certificate
Packages failed to restore
Microsoft's documentation states that the build agent running on Windows uses the Windows certificate store, so I have checked that the required certificates are installed correctly on the build server; however, it is still failing.
There are many questions with similar symptoms but not the same cause. After investigation, I have found the solution to this, but I didn't spot anything on this exact issue, so I will post an answer that will hopefully save somebody else some time!
It turns out that the Azure DevOps build agent is using a version of Node.js that doesn't use the Windows Certificate Store.
The solution is to point Node.js at an exported copy (a *.cer file) of your self-signed SSL certificate's root CA certificate, using either a system environment variable called NODE_EXTRA_CA_CERTS or a task variable called NODE.EXTRA.CA.CERTS, with the value pointing to the certificate.
Developer Community Issue Link
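As a minimal sketch of that idea (the subject filter and the C:\certs path are placeholders; it assumes your root CA certificate sits in Cert:\LocalMachine\Root and that you run this elevated so the machine-level variable can be set), export the certificate as Base64/PEM and point NODE_EXTRA_CA_CERTS at it. The scripts in the answers below automate this for every CA in the store:
# Pick the root CA certificate by subject (placeholder filter)
$ca = Get-ChildItem Cert:\LocalMachine\Root | Where-Object { $_.Subject -like '*My Root CA*' } | Select-Object -First 1
# Build a PEM representation of the certificate
$pem = "-----BEGIN CERTIFICATE-----`n" +
       [System.Convert]::ToBase64String($ca.Export([System.Security.Cryptography.X509Certificates.X509ContentType]::Cert), 'InsertLineBreaks') +
       "`n-----END CERTIFICATE-----"
New-Item -ItemType Directory -Path 'C:\certs' -Force | Out-Null
Set-Content -Path 'C:\certs\RootCA.pem' -Value $pem -Encoding Ascii
# Make the variable visible to Node.js processes started by the build agent
[Environment]::SetEnvironmentVariable('NODE_EXTRA_CA_CERTS', 'C:\certs\RootCA.pem', 'Machine')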
I use a PowerShell agent job with the following script. This effectively gives Node.js a "use the Windows machine certificate store" option for the pipeline.
Some notes:
Monitoring node.exe with ProcMon suggests that the file referenced in NODE_EXTRA_CA_CERTS is read every time the pipeline is run. However, others have suggested that running Restart-Service vstsagent* -Force is required for the change to be picked up. This isn't my experience, but perhaps something different between environments causes this behaviour.
This adds roughly one second of pipeline execution time, which is probably an acceptable price for set-and-forget certificate management for Node in pipelines on Windows, but it is worth noting nonetheless.
# If running in a pipeline then use the Agent Home directory,
# otherwise use the machine temp folder which is useful for testing
if ($env:AGENT_HOMEDIRECTORY -ne $null) { $TargetFolder = $env:AGENT_HOMEDIRECTORY }
else { $TargetFolder = [System.Environment]::GetEnvironmentVariable('TEMP','Machine') }
# Loop through each CA in the machine store
Get-ChildItem -Path Cert:\LocalMachine\CA | ForEach-Object {
    # Convert cert's bytes to Base64-encoded text and add begin/end markers
    $Cert = "-----BEGIN CERTIFICATE-----`n"
    $Cert += $([System.Convert]::ToBase64String($_.Export([System.Security.Cryptography.X509Certificates.X509ContentType]::Cert),'InsertLineBreaks'))
    $Cert += "`n-----END CERTIFICATE-----`n"
    # Append cert to chain
    $Chain += $Cert
}
# Build target path
$CertFile = "$TargetFolder\TrustedRootCAs.pem"
# Write to file system
$Chain | Out-File $CertFile -Force -Encoding ASCII
# Clean-up
$Chain = $null
# Let Node (running later in the pipeline) know from where to read certs
Write-Host "##vso[task.setvariable variable=NODE.EXTRA.CA.CERTS]$CertFile"
I formatted the PowerShell script from @alifen. The script below can be executed on the build agent itself. It takes a parameter for the target path and sets the environment variable on the server.
Credit to @alifen.
[CmdletBinding()]
param (
    [Parameter()]
    [string]
    $TargetFolder = "$env:SystemDrive\Certs"
)
If (-not (Test-Path $TargetFolder))
{
    $null = New-Item -ItemType Directory -Path $TargetFolder -Force
}
# Loop through each CA in the machine store
Get-ChildItem -Path Cert:\LocalMachine\CA | ForEach-Object {
    # Convert cert's bytes to Base64-encoded text and add begin/end markers
    $Cert = "-----BEGIN CERTIFICATE-----`n"
    $Cert += $([System.Convert]::ToBase64String($_.Export([System.Security.Cryptography.X509Certificates.X509ContentType]::Cert), 'InsertLineBreaks'))
    $Cert += "`n-----END CERTIFICATE-----`n"
    # Append cert to chain
    $Chain += $Cert
}
# Build target path
$CertFile = "$TargetFolder\TrustedRootCAs.pem"
# Write to file system
Write-Host "[$($MyInvocation.MyCommand.Name)]: Exporting certs to: [$CertFile]"
$Chain | Out-File $CertFile -Force -Encoding ASCII
# Set Environment variable
Write-Host "[$($MyInvocation.MyCommand.Name)]: Setting environment variable [NODE_EXTRA_CA_CERTS] to [$CertFile]"
[Environment]::SetEnvironmentVariable("NODE_EXTRA_CA_CERTS", "$CertFile", "Machine")
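Usage on the agent is straightforward. The script file name below is just an example, and as noted above it may be necessary to restart the agent service so the machine-level variable is inherited:
# Run from an elevated PowerShell session on the build agent (example file name)
.\Export-TrustedRootCAs.ps1 -TargetFolder 'D:\Certs'
# Restart the agent service(s) so processes they spawn see NODE_EXTRA_CA_CERTS
Restart-Service vstsagent* -Force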

Remove-AzureRmSqlDatabase keeps asking me to run Login-AzureRmAccount

I have an Azure SQL database which I want to delete. The command should be:
Remove-AzureRmSqlDatabase -ResourceGroupName $dbResourceGroup -ServerName $dbServerName -DatabaseName $dbToDelete -Whatif -Force
The error I keep getting back is
Remove-AzureRmSqlDatabase : Run Login-AzureRmAccount to login.
I tried running Login-AzureRmAccount as myself, then as a service principal I use for unattended scripts, and nothing worked.
I am able to log into the Azure RM portal and delete databases. I am also able to run Invoke-SqlCmd against this database to query and manipulate data.
How can I make this work?
According to this error message, it seems that you have not logged in to Azure with the right subscription.
We can use this command to get the SQL database's information and check the subscription:
(Get-AzureRmSqlDatabase -DatabaseName jasontest1 -ServerName jasontest -ResourceGroupName jasontest).resourceid
Then we can find the subscription in the PowerShell output:
PS C:\Users> (Get-AzureRmSqlDatabase -DatabaseName jasontest1 -ServerName jasontest -ResourceGroupName jasontest).resourceid
/subscriptions/5384xxxx-xxxx-xxxx-xxxx-xxxxe29axxxx/resourceGroups/jasontest/providers/Microsoft.Sql/servers/jasontest/databases/jasontest1
To find which subscription that is, we can use the command Get-AzureRmSubscription to list it:
PS C:\Users> Get-AzureRmSubscription
Name : Visual Studio Ultimate with MSDN
Id : 5384xxxx-xxxx-xxxx-xxxx-xxxxe29axxxx
TenantId : 1fcfxxx-xxxx-xxxx-xxxx9-xxxx8bf8xxxx
State : Enabled
We can also use this command to select the subscription:
Get-AzureRmSubscription -SubscriptionId 5384xxxx-xxxx-xxxx-xxx-xxxxe29axxxx
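Putting it together, here is a sketch of switching the context to that subscription and retrying the delete (Select-AzureRmSubscription comes with the AzureRM.Profile module; the ID is the one from the output above):
# Switch the current context to the subscription that owns the SQL server
Select-AzureRmSubscription -SubscriptionId '5384xxxx-xxxx-xxxx-xxxx-xxxxe29axxxx'
# Retry the delete; drop -WhatIf once the preview looks right
Remove-AzureRmSqlDatabase -ResourceGroupName $dbResourceGroup -ServerName $dbServerName -DatabaseName $dbToDelete -WhatIf -Force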
My PowerShell Azure modules had dependency errors for something. To fix it, I ran (at the behest of Microsoft tech support):
PS C:\> Install-Module AzureRM -Force
This re-installed it and fixed the dependency problems.

scheduling automated file name change

I'm trying to schedule a task to edit the names of images, from 123-1234.jpg to 123_1234.jpg for example.
This is what I have now:
powershell.exe -noexit -command "cd'C:\path\i\want" Get-ChildItem -Filter "*.jpg" -Recurse | Rename-Item -NewName {$_.name -replace "-", "_" }
Appreciate the help.
My reason for needing this is that I have one piece of software that produces images in various extensions; however, it names the files like this: 12345-123456.jpg.
The other software I have imports names like this: 12345_123456.jpg.
This would remove the need for third-party software if I can create an automated task to simply change all the .jpg files in a directory to the name format needed for the automated import.
powershell.exe -noexit -command Get-ChildItem 'C:\my\file\path\' -Filter "*.jpg" -Recurse | Rename-Item -NewName {$_.name -replace '-','_' }
I'm proud of myself, as this is my first time using PowerShell. This worked fine.
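To actually schedule it, one option is to register a scheduled task from PowerShell; this is only a sketch, and the task name, time, and folder path are placeholders to adjust:
# Action: run the rename one-liner with PowerShell
$action = New-ScheduledTaskAction -Execute 'powershell.exe' `
    -Argument '-NoProfile -Command "Get-ChildItem ''C:\my\file\path\'' -Filter *.jpg -Recurse | Rename-Item -NewName { $_.Name -replace ''-'', ''_'' }"'
# Trigger: run every day at 02:00
$trigger = New-ScheduledTaskTrigger -Daily -At 2am
# Register the task (run from an elevated session)
Register-ScheduledTask -TaskName 'RenameJpgImages' -Action $action -Trigger $trigger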