I'm trying to use PowerShell to create some backups, and then to copy these to a web folder (or, in other words, upload them to a WebDAV share).
At first I thought I'd do the WebDAV stuff from within PowerShell, but it seems this still requires a fair amount of "manual labour", i.e. constructing HTTP requests. I then settled for creating a web folder from the script and letting Windows handle the WebDAV stuff. It seems that all it takes to create a web folder is to create a standard shortcut, as described here.
What I can't figure out is how to actually copy files to the shortcut's target. Maybe I'm going about this the wrong way.
It would be ideal if I could somehow encrypt the credentials for the WebDAV in the script, then have it create the web folder, shunt over the files, and delete the web folder again. Or even better, not use a web folder at all. Third option would be to just create the web folder manually and leave it there, though I'd rather not.
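For reference, the kind of direct upload I was hoping for would presumably look something like this, since a WebDAV upload is just an HTTP PUT (an untested sketch; the URL, file path and credentials are placeholders):
# sketch: upload one file straight to a WebDAV share via HTTP PUT (no web folder involved)
$webdavUrl = "https://example.com/webdav/backups/backup.7z"
$localFile = "C:\Backups\backup.7z"
$client = New-Object System.Net.WebClient
$client.Credentials = New-Object System.Net.NetworkCredential("username", "password")
$client.UploadFile($webdavUrl, "PUT", $localFile) | Out-Null
$client.Dispose()
(For the credentials, Get-Credential plus Export-Clixml/Import-Clixml would at least avoid keeping a plain-text password in the script.)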
Any ideas/pointers/tips? :)
If you are using PowerShell to back up your SVN repositories using svnadmin dump, be aware that piping the output to a file will silently corrupt your backups.
PowerShell re-encodes piped output as UTF-16 and converts Unix line breaks to Windows ones. This will come back to haunt you when you try to restore.
Problem well described here:
http://thoughtfulcode.wordpress.com/2010/01/29/powershells-object-pipeline-corrupts-piped-binary-data/
Solution here:
http://www.microsoft.com/communities/newsgroups/en-us/default.aspx?dg=microsoft.public.windows.powershell&tid=e4cd89e9-427b-407d-a94f-c24be3f1e36f&cat=&lang=&cr=&sloc=&p=1
In summary, use cmd.exe instead of powershell:
cmd.exe /c svnadmin dump ... `> dumpfile.dump
Note that the backtick on the output redirection is required to stop powershell parsing it.
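An alternative that skips cmd.exe is to start svnadmin through System.Diagnostics.Process and copy its raw stdout stream to a file yourself, so PowerShell's pipeline never touches the bytes (a sketch; paths are examples and Stream.CopyTo needs .NET 4 or later):
# sketch: run svnadmin dump and stream its raw stdout to a file, bypassing PowerShell's pipeline
$psi = New-Object System.Diagnostics.ProcessStartInfo
$psi.FileName = "C:\Program Files (x86)\CollabNet Subversion Server\svnadmin.exe"
$psi.Arguments = 'dump "C:\Repo"'
$psi.RedirectStandardOutput = $true
$psi.UseShellExecute = $false
$proc = [System.Diagnostics.Process]::Start($psi)
$out = [System.IO.File]::Create("C:\Backups\svn.dmp")
$proc.StandardOutput.BaseStream.CopyTo($out)  # copy raw bytes, no re-encoding
$out.Close()
$proc.WaitForExit()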
Well, in the meantime I cobbled together another solution. Maybe it will be of use to someone.
[Run.cmd] --May need to change powershell path
C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe -nologo -noninteractive -command "C:\Scripts\RunBackup.ps1"
[RunBackup.ps1] --Out-File not having the desired effect, maybe someone can figure out why?
C:\Scripts\SqlBackup.ps1 | Out-File "C:\Scripts\log.txt"
C:\Scripts\SVNBackup.ps1 | Out-File "C:\Scripts\log.txt"
C:\Scripts\Zip.ps1 | Out-File "C:\Scripts\log.txt"
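One likely reason Out-File isn't having the desired effect: without -Append, each call overwrites C:\Scripts\log.txt, so only the last script's output survives. Something like this should keep all three:
C:\Scripts\SqlBackup.ps1 | Out-File "C:\Scripts\log.txt"
C:\Scripts\SVNBackup.ps1 | Out-File "C:\Scripts\log.txt" -Append
C:\Scripts\Zip.ps1 | Out-File "C:\Scripts\log.txt" -Append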
[SqlBackup.ps1] --You may need to modify which SMO assemblies are loaded, depending on your version of SQL Server. Don't forget to set $instance and $bkdir.
#http://www.mssqltips.com/tip.asp?tip=1862&home
$instance = ".\SQLEXPRESS"
[System.Reflection.Assembly]::LoadFrom("C:\Program Files\Microsoft SQL Server\100\SDK\Assemblies\Microsoft.SqlServer.SMO.dll") | out-null
[System.Reflection.Assembly]::LoadFrom("C:\Program Files\Microsoft SQL Server\100\SDK\Assemblies\Microsoft.SqlServer.SMOExtended.dll") | out-null
$s = new-object ("Microsoft.SqlServer.Management.Smo.Server") $instance
$bkdir = "c:\Backups" #We define the folder path as a variable
$dbs = $s.Databases
foreach ($db in $dbs)
{
    if ($db.Name -ne "tempdb") #We don't want to backup the tempdb database
    {
        $dbname = $db.Name
        $dt = get-date -format yyyyMMddHHmm #We use this to create a file name based on the timestamp
        $dbBackup = new-object ("Microsoft.SqlServer.Management.Smo.Backup")
        $dbBackup.Action = "Database"
        $dbBackup.Database = $dbname
        $dbBackup.Devices.AddDevice($bkdir + "\" + $dbname + "_db_" + $dt + ".bak", "File")
        $dbBackup.SqlBackup($s)
    }
    if ($db.RecoveryModel -ne 3) #Skip log backups for databases using the SIMPLE recovery model (RecoveryModel = 3)
    {
        $dbname = $db.Name
        $dt = get-date -format yyyyMMddHHmm #Create a file name based on the timestamp
        $dbBackup = new-object ("Microsoft.SqlServer.Management.Smo.Backup")
        $dbBackup.Action = "Log"
        $dbBackup.Database = $dbname
        $dbBackup.Devices.AddDevice($bkdir + "\" + $dbname + "_log_" + $dt + ".trn", "File")
        $dbBackup.SqlBackup($s)
    }
}
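If the newer SqlServer module is available, roughly the same full backups can be taken with Backup-SqlDatabase instead of raw SMO (a sketch, assuming the module is installed and the same instance and backup folder):
# sketch: full backups via the Backup-SqlDatabase cmdlet (SqlServer module assumed)
Import-Module SqlServer
$dt = Get-Date -Format yyyyMMddHHmm
Get-SqlDatabase -ServerInstance ".\SQLEXPRESS" |
    Where-Object { $_.Name -ne "tempdb" } |
    ForEach-Object {
        Backup-SqlDatabase -ServerInstance ".\SQLEXPRESS" -Database $_.Name `
            -BackupFile ("c:\Backups\" + $_.Name + "_db_" + $dt + ".bak")
    }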
[SVNBackup.ps1] --Modify repo and backup paths
#alias for svnadmin.exe (only visible inside PowerShell; cmd.exe below resolves svnadmin from the system PATH, so make sure it is on the PATH)
set-alias svnadmin "C:\Program Files (x86)\CollabNet Subversion Server\svnadmin.exe"
#create the dump via cmd.exe so PowerShell's pipeline does not re-encode the binary output (see the UTF-16 issue above)
cmd.exe /c svnadmin dump "C:\Repo" `> "C:\Backups\svn.dmp"
#remove alias
remove-item alias:svnadmin
[Zip.ps1] --Need to have 7zip installed, modify 7z.exe path if necessary
#set alias to command line version of 7zip
set-alias sevenz "c:\program files\7-zip\7z.exe"
#Backups location
cd 'C:\Backups'
#archive the contents of the directory with 7-Zip
$dt = get-date -format yyyyMMddHHmm #We use this to create a file name based on the timestamp
$outputFileName = "SQLSVNBackup$dt.7z"
$exclude1 = "-x!*.rar"
$exclude2 = "-x!*.7z"
sevenz a -t7z "$outputFileName" *.* "$exclude1" "$exclude2"
#find all .bak files in the immediate directory
dir '*.bak' | foreach-object{
#remove the bak file
remove-item $_.name
}
#find all .dmp files in the immediate directory
dir '*.dmp' | foreach-object{
#remove the dmp file
remove-item $_.name
}
#find all .trn files in the immediate directory
dir '*.trn' | foreach-object{
#remove the trn file
remove-item $_.name
}
#remove 7zip alias
remove-item alias:sevenz
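On PowerShell 5 or later the 7-Zip dependency could also be dropped in favour of the built-in Compress-Archive, at the cost of getting a .zip instead of a .7z (a sketch, assuming the same backup folder):
# sketch: archive and clean up the backups with the built-in Compress-Archive (PowerShell 5+)
$dt = Get-Date -Format yyyyMMddHHmm
Compress-Archive -Path "C:\Backups\*.bak", "C:\Backups\*.dmp", "C:\Backups\*.trn" `
    -DestinationPath "C:\Backups\SQLSVNBackup$dt.zip"
Remove-Item "C:\Backups\*.bak", "C:\Backups\*.dmp", "C:\Backups\*.trn"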
I used GoodSync to back up to WebDAV and scheduled two tasks: one to run the .cmd file and one to sync/back up offsite.
After upgrading to Azure DevOps Server 2019, automated pipeline builds are failing at the NuGet restore step showing:
Error: Error: unable to get local issuer certificate
Packages failed to restore
Microsoft's documentation states that the build agent running on Windows uses the Windows certificate store, so I have checked that the required certificates are installed correctly on the build server; however, it is still failing.
There are many questions with similar symptoms but not the same cause. After investigation I have found the solution, but I didn't spot anything on this exact issue, so I will post an answer that will hopefully save somebody else some time!
It turns out that the Azure DevOps build agent is using a version of Node.js that doesn't use the Windows Certificate Store.
The solution is to point Node.js at an exported copy (a *.cer file) of your self-signed SSL certificate's root CA certificate, either via a system environment variable called NODE_EXTRA_CA_CERTS or via a Task Variable called NODE.EXTRA.CA.CERTS, with the value pointing to the certificate.
Developer Community Issue Link
I use a PowerShell agent job with the following script. This effectively gives a "Use the Windows Machine Certificate Store" option to Node.js for the pipeline.
Some notes:
Monitoring node.exe with ProcMon suggests that the file referenced in NODE_EXTRA_CA_CERTS is read every time the pipeline is run. However, others have suggested that running Restart-Service vstsagent* -Force is required for the change to be picked up. This isn't my experience, but perhaps something different between environments causes this behaviour.
This adds an additional ~1s pipeline execution time. Probably an acceptable price for a "set and forget certificate management for Node in Pipelines on Windows" but worth noting nonetheless.
# If running in a pipeline then use the Agent Home directory,
# otherwise use the machine temp folder which is useful for testing
if ($env:AGENT_HOMEDIRECTORY -ne $null) { $TargetFolder = $env:AGENT_HOMEDIRECTORY }
else { $TargetFolder = [System.Environment]::GetEnvironmentVariable('TEMP','Machine') }
# Loop through each CA in the machine store
# Loop through each CA in the machine store
Get-ChildItem -Path Cert:\LocalMachine\CA | ForEach-Object {
    # Convert cert's bytes to Base64-encoded text and add begin/end markers
    $Cert = "-----BEGIN CERTIFICATE-----`n"
    $Cert += $([System.Convert]::ToBase64String($_.Export([System.Security.Cryptography.X509Certificates.X509ContentType]::Cert),'InsertLineBreaks'))
    $Cert += "`n-----END CERTIFICATE-----`n"
    # Append cert to chain
    $Chain += $Cert
}
# Build target path
$CertFile = "$TargetFolder\TrustedRootCAs.pem"
# Write to file system
$Chain | Out-File $CertFile -Force -Encoding ASCII
# Clean-up
$Chain = $null
# Let Node (running later in the pipeline) know from where to read certs
Write-Host "##vso[task.setvariable variable=NODE.EXTRA.CA.CERTS]$CertFile"
I formatted the PowerShell script from #alifen. The script below can be executed on the build agent itself. It takes a parameter for the target path and sets the environment variable on the server.
Credit to #alifen
[CmdletBinding()]
param (
[Parameter()]
[string]
$TargetFolder = "$env:SystemDrive\Certs"
)
If (-not(Test-Path $TargetFolder))
{
$null = New-Item -ItemType Directory -Path $TargetFolder -Force
}
# Loop through each CA in the machine store
Get-ChildItem -Path Cert:\LocalMachine\CA | ForEach-Object {
    # Convert cert's bytes to Base64-encoded text and add begin/end markers
    $Cert = "-----BEGIN CERTIFICATE-----`n"
    $Cert += $([System.Convert]::ToBase64String($_.Export([System.Security.Cryptography.X509Certificates.X509ContentType]::Cert), 'InsertLineBreaks'))
    $Cert += "`n-----END CERTIFICATE-----`n"
    # Append cert to chain
    $Chain += $Cert
}
# Build target path
$CertFile = "$TargetFolder\TrustedRootCAs.pem"
# Write to file system
Write-Host "[$($MyInvocation.MyCommand.Name)]: Exporting certs to: [$CertFile]"
$Chain | Out-File $CertFile -Force -Encoding ASCII
# Set Environment variable
Write-Host "[$($MyInvocation.MyCommand.Name)]: Setting environment variable [NODE_EXTRA_CA_CERTS] to [$CertFile]"
[Environment]::SetEnvironmentVariable("NODE_EXTRA_CA_CERTS", "$CertFile", "Machine")
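If the script above were saved as, say, Export-TrustedRootCAs.ps1 (name and folder are assumptions), it could be run once on the build agent and the agent service restarted so the new machine-level variable is picked up:
# example invocation (script name, folder and service wildcard are assumptions/examples)
.\Export-TrustedRootCAs.ps1 -TargetFolder "D:\Certs"
Restart-Service vstsagent* -Force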
I am maintaining my PowerShell script file and SQL file in a TFS repository. I am trying to read my SQL file from the PowerShell script (which also resides in TFS). I am calling this PowerShell script in my build, and I get an error when it executes.
$ServerInstance = "ABCServer"
$Database = "MyDB"
$ConnectionTimeout = 30
$Query = get-content "$/MYPROJECT/Queries/GetProjects.sql"
$QueryTimeout = 120
$conn=new-object System.Data.SqlClient.SQLConnection
$ConnectionString = "Server={0};Database={1};Integrated Security=True;Connect Timeout={2}" -f $ServerInstance,$Database,$ConnectionTimeout
$conn.ConnectionString=$ConnectionString
$conn.Open()
$cmd=new-object system.Data.SqlClient.SqlCommand($Query,$conn)
$cmd.CommandTimeout=$QueryTimeout
$ds=New-Object system.Data.DataSet
$da=New-Object system.Data.SqlClient.SqlDataAdapter($cmd)
[void]$da.fill($ds)
$ds.Tables[0] | foreach {
write-host 'Name value is : ' + $_.Title
}
$conn.Close()
#$ds.Tables
My PowerShell script is saved in "$/MYPROJECT/PowershellScripts/QueryDB.ps1".
I have this PowerShell script added as a TFS task in my build steps. I am getting the following error:
PathNotFound,Microsoft.PowerShell.Commands.GetContentCommand
Exception calling "Open" with "0" argument(s): "Cannot open database "MyDB" requested by the login
UPDATE:
In the build process, the source (the .sql file) is first downloaded to the agent machine. The script works with the path "$env:BUILD_REPOSITORY_LOCALPATH/Queries/GetProjects.sql" when it is run on the agent machine.
If you need to run the PS script on other machines (not the agent machine), you have to copy the script file to that machine first and specify the actual file path in the script.
I can reproduce your issue; please follow the steps below to fix it:
Modify the GetProjects.sql file path in your PowerShell script based on your project structure, like this:
$Query = get-content "$env:BUILD_REPOSITORY_LOCALPATH/Queries/GetProjects.sql"
So, the complete PS script should be:
$ServerInstance = "ABCServer"
$Database = "MyDB"
$ConnectionTimeout = 30
$Query = get-content "$env:BUILD_REPOSITORY_LOCALPATH/Queries/GetProjects.sql"
$QueryTimeout = 120
$conn=new-object System.Data.SqlClient.SQLConnection
$ConnectionString = "Server={0};Database={1};Integrated Security=True;Connect Timeout={2}" -f $ServerInstance,$Database,$ConnectionTimeout
$conn.ConnectionString=$ConnectionString
$conn.Open()
$cmd=new-object system.Data.SqlClient.SqlCommand($Query,$conn)
$cmd.CommandTimeout=$QueryTimeout
$ds=New-Object system.Data.DataSet
$da=New-Object system.Data.SqlClient.SqlDataAdapter($cmd)
[void]$da.fill($ds)
$ds.Tables[0] | foreach {
    write-host ('Name value is : ' + $_.Title)
}
$conn.Close()
#$ds.Tables
Add the build agent service account (the default service account is NT AUTHORITY\NETWORK SERVICE if you didn't change it) as a user of the database ("MyDB") with login and query permissions:
Go to Database >> Security >> Users, right-click NT AUTHORITY\NETWORK SERVICE and select Properties.
In the newly opened Login Properties screen, go to the "Membership" tab. In the lower pane, check the db_owner role and click OK. (A scripted equivalent is sketched below.)
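A scripted equivalent of the steps above could look roughly like this (a sketch, assuming Invoke-Sqlcmd is available and it is run with sysadmin rights; it grants read/query access rather than db_owner):
# sketch: grant the agent service account login and query access to MyDB
Invoke-Sqlcmd -ServerInstance "ABCServer" -Query @"
IF NOT EXISTS (SELECT 1 FROM sys.server_principals WHERE name = N'NT AUTHORITY\NETWORK SERVICE')
    CREATE LOGIN [NT AUTHORITY\NETWORK SERVICE] FROM WINDOWS;
"@
Invoke-Sqlcmd -ServerInstance "ABCServer" -Database "MyDB" -Query @"
CREATE USER [NT AUTHORITY\NETWORK SERVICE] FOR LOGIN [NT AUTHORITY\NETWORK SERVICE];
ALTER ROLE db_datareader ADD MEMBER [NT AUTHORITY\NETWORK SERVICE];
"@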
Then try building again; it should work now.
I'm trying to schedule a task to edit the names of images, from 123-1234.jpg to 123_1234.jpg for example.
This is what I have now:
powershell.exe -noexit -command "cd'C:\path\i\want" Get-ChildItem -Filter "*.jpg" -Recurse | Rename-Item -NewName {$_.name -replace "-", "_" }
Appreciate the help.
My reason for needing this is that I have one piece of software that produces images in various extensions, but it names the files like this: 12345-123456.jpg.
The other software I have imports names like this: 12345_123456.jpg.
If I can create an automated task that simply renames all the .jpg files in a directory to the needed format for the automated import, it would save me from needing third-party software.
powershell.exe -noexit -command Get-ChildItem 'C:\my\file\path\' -Filter "*.jpg" -Recurse | Rename-Item -NewName {$_.name -replace '-','_' }
I'm proud of myself, as this is my first time using PowerShell. This worked fine.
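To run the rename on a schedule, the task itself can also be registered from PowerShell instead of being set up by hand in Task Scheduler (a sketch; the script path, task name and time are examples, and the .ps1 would contain the one-liner above):
# sketch: register a daily scheduled task that runs the rename script (path/name/time are examples)
$action = New-ScheduledTaskAction -Execute "powershell.exe" -Argument '-NoProfile -File "C:\Scripts\Rename-Jpgs.ps1"'
$trigger = New-ScheduledTaskTrigger -Daily -At 2am
Register-ScheduledTask -TaskName "RenameJpgs" -Action $action -Trigger $trigger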
It seems like I get the same UTF-8 error every time I submit a Windows 8 app.
Is there a faster way to convert a batch of files to be UTF-8 formatted?
You can use the PowerShell script below to convert all files in a directory to UTF-8.
# get every file in the folder (top level only)
$files = [IO.Directory]::GetFiles("C:\temp\files")
foreach ($file in $files)
{
    # read the file's contents and rewrite the file as UTF-8
    $content = Get-Content -Path $file
    $content | Out-File $file -Encoding utf8
}
You should be able to run the script above using the PowerShell ISE or by following these instructions.
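If the files sit in subfolders as well, a recursive variant might look like this (a sketch; note that Out-File -Encoding utf8 in Windows PowerShell writes UTF-8 with a BOM):
# sketch: recursively rewrite every file under the folder as UTF-8
Get-ChildItem -Path "C:\temp\files" -Recurse -File | ForEach-Object {
    $content = Get-Content -Path $_.FullName
    $content | Out-File -FilePath $_.FullName -Encoding utf8
}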
I am trying to automate my SQL database backup process. My goal is to use the Cloudberry Powershell cmdlet to give me direct control and access over my S3 buckets. I am able to do this manually but cannot get my SQL jobs to work with this.
According to Cloudberry's installation instructions, I shouldn't have to register the Cloudberry Powershell snap-in if Powershell is already installed. I have found that to be false. I have tried to register it, both 64-bit and 32-bit with no luck.
This works when executed manually/explicitly from the ISE:
Add-PSSnapin CloudBerryLab.Explorer.PSSnapIn
$today = Get-Date -format "yyyy.MM.dd.HH.mm.ss"
$key = "mykeygoeshere"
$secret = "mysecretgoeshere"
$s3 = Get-CloudS3Connection -Key $key -Secret $secret
$destination = $s3 | Select-CloudFolder -path "ProductionBackups/MyClient/log/" | Add-CloudFolder $today
$src = Get-CloudFilesystemConnection | Select-CloudFolder "X:\backups\MyClient\current\"
$src | Copy-CloudItem $destination -filter "log.trn"
When this script is executed in a SQL Agent job, it fails with this message:
Executed as user: DB-MAIN\SYSTEM. A job step received an error at line 1 in a PowerShell script. The corresponding line is 'Add-PSSnapin CloudBerryLab.Explorer.PSSnapIn'. Correct the script and reschedule the job. The error information returned by PowerShell is: 'The term 'Add-PSSnapin' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again. '. Process Exit Code -1. The step failed.
I read in this blog post that SQLPS.exe cannot execute 'Add-PSSnapin' commands. Is that true? I cannot find any clarification on the subject...
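I'm wondering whether a workaround would be to make the job step an Operating system (CmdExec) step that launches the full powershell.exe host, which should be able to load snap-ins, e.g. (the script path is an example):
powershell.exe -NoProfile -ExecutionPolicy Bypass -File "C:\Scripts\UploadToS3.ps1"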
How can I automate uploading my SQL backup files to Amazon S3? I have tried everything; TNT Drive was a huge waste of time. I am hoping Cloudberry can do it. Any tips?
You could use the Amazon AWS SDK for .NET. You can download it from here:
http://aws.amazon.com/sdkfornet/
Here's an example function to download a file from S3:
function DownloadS3File([string]$bucket, [string]$file, [string]$localFile)
{
    # load the AWS SDK assembly from whichever Program Files folder it was installed to
    if (Test-Path "C:\Program Files (x86)")
    {
        Add-Type -Path "C:\Program Files (x86)\AWS SDK for .NET\bin\AWSSDK.dll"
    }
    else
    {
        Add-Type -Path "C:\Program Files\AWS SDK for .NET\bin\AWSSDK.dll"
    }
    # credentials are read from environment variables
    $accessKeyID = $env:AWS_ACCESS_KEY_ID
    $secretAccessKey = $env:AWS_SECRET_ACCESS_KEY
    $client = [Amazon.AWSClientFactory]::CreateAmazonS3Client($accessKeyID, $secretAccessKey)
    # build and send the GetObject request
    $request = New-Object -TypeName Amazon.S3.Model.GetObjectRequest
    $request.BucketName = $bucket
    $request.Key = $file
    $response = $client.GetObject($request)
    # stream the response body to the local file in 4 KB chunks
    $writer = New-Object System.IO.FileStream ($localFile, [System.IO.FileMode]::Create)
    [byte[]]$buffer = New-Object byte[] 4096
    [int]$total = [int]$count = 0
    do
    {
        $count = $response.ResponseStream.Read($buffer, 0, $buffer.Length)
        $writer.Write($buffer, 0, $count)
    }
    while ($count -gt 0)
    $response.ResponseStream.Close()
    $writer.Close()
    echo "File downloaded: $localFile"
}
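Since the original goal was uploading backups, the upload direction with the same SDK would presumably look roughly like this (a sketch, assuming the v1-style property names on PutObjectRequest; adjust for the SDK version actually installed):
function UploadS3File([string]$bucket, [string]$key, [string]$localFile)
{
    # sketch: upload a local file to S3 with the same AWSSDK.dll as above (x86 path assumed)
    Add-Type -Path "C:\Program Files (x86)\AWS SDK for .NET\bin\AWSSDK.dll"
    $accessKeyID = $env:AWS_ACCESS_KEY_ID
    $secretAccessKey = $env:AWS_SECRET_ACCESS_KEY
    $client = [Amazon.AWSClientFactory]::CreateAmazonS3Client($accessKeyID, $secretAccessKey)
    # build and send the PutObject request for the local file
    $request = New-Object -TypeName Amazon.S3.Model.PutObjectRequest
    $request.BucketName = $bucket
    $request.Key = $key
    $request.FilePath = $localFile
    $response = $client.PutObject($request)
    echo "File uploaded: $key"
}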