How to use the CloudBerry PowerShell snap-in (for Amazon S3) from within a scheduled SQL Agent job?

I am trying to automate my SQL database backup process. My goal is to use the Cloudberry Powershell cmdlet to give me direct control and access over my S3 buckets. I am able to do this manually but cannot get my SQL jobs to work with this.
According to Cloudberry's installation instructions, I shouldn't have to register the Cloudberry Powershell snap-in if Powershell is already installed. I have found that to be false. I have tried to register it, both 64-bit and 32-bit with no luck.
This works when executed manually/explicitly from the ISE:
# Load the CloudBerry snap-in and build a timestamped folder name
Add-PSSnapin CloudBerryLab.Explorer.PSSnapIn
$today = Get-Date -format "yyyy.MM.dd.HH.mm.ss"
$key = "mykeygoeshere"
$secret = "mysecretgoeshere"
# Connect to S3 and create a dated subfolder under the client's log path
$s3 = Get-CloudS3Connection -Key $key -Secret $secret
$destination = $s3 | Select-CloudFolder -path "ProductionBackups/MyClient/log/" | Add-CloudFolder $today
# Copy the transaction log backup from the local backup folder into the new S3 folder
$src = Get-CloudFilesystemConnection | Select-CloudFolder "X:\backups\MyClient\current\"
$src | Copy-CloudItem $destination -filter "log.trn"
^ When this command is executed in a SQL Agent job, it fails with this message:
Executed as user: DB-MAIN\SYSTEM. A job step received an error at line 1 in a PowerShell script. The corresponding line is 'Add-PSSnapin CloudBerryLab.Explorer.PSSnapIn'. Correct the script and reschedule the job. The error information returned by PowerShell is: 'The term 'Add-PSSnapin' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again. '. Process Exit Code -1. The step failed.
I read in this blog post that SQLPS.exe cannot execute Add-PSSnapin commands. Is that true? I cannot find any clarification on the subject.
How can I automate sending my SQL backup files to Amazon S3? I have tried everything; TNT Drive was a huge waste of time. I am hoping CloudBerry can do it. Any tips?
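One workaround I have seen suggested (a sketch, untested; the script path is hypothetical): since a SQL Agent PowerShell step runs the closed SQLPS mini-shell, make the job step an Operating system (CmdExec) step instead and launch the full powershell.exe against a script containing the commands above:
powershell.exe -NoProfile -ExecutionPolicy Bypass -File "C:\Scripts\UploadLogToS3.ps1"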

You could use the Amazon AWS .NET SDK. You can download it from here:
http://aws.amazon.com/sdkfornet/
Here's an example function to download a file from S3:
function DownloadS3File([string]$bucket, [string]$file, [string]$localFile)
{
    # Load the AWS SDK assembly from whichever Program Files directory exists
    if (Test-Path "C:\Program Files (x86)")
    {
        Add-Type -Path "C:\Program Files (x86)\AWS SDK for .NET\bin\AWSSDK.dll"
    }
    else
    {
        Add-Type -Path "C:\Program Files\AWS SDK for .NET\bin\AWSSDK.dll"
    }

    # Credentials come from the standard AWS environment variables
    $accessKeyID = $env:AWS_ACCESS_KEY_ID
    $secretAccessKey = $env:AWS_SECRET_ACCESS_KEY
    $client = [Amazon.AWSClientFactory]::CreateAmazonS3Client($accessKeyID, $secretAccessKey)

    # Request the object and stream it to the local file in 4 KB chunks
    $request = New-Object -TypeName Amazon.S3.Model.GetObjectRequest
    $request.BucketName = $bucket
    $request.Key = $file
    $response = $client.GetObject($request)
    $writer = New-Object System.IO.FileStream($localFile, [System.IO.FileMode]::Create)
    [byte[]]$buffer = New-Object byte[] 4096
    [int]$total = [int]$count = 0
    do
    {
        $count = $response.ResponseStream.Read($buffer, 0, $buffer.Length)
        $writer.Write($buffer, 0, $count)
    }
    while ($count -gt 0)
    $response.ResponseStream.Close()
    $writer.Close()
    echo "File downloaded: $localFile"
}
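A hypothetical call, assuming AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY are set in the environment (bucket, key, and local path are placeholders):
DownloadS3File -bucket "ProductionBackups" -file "MyClient/log/log.trn" -localFile "X:\restore\log.trn"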

Related

Create keyvault secret - Operation returned an invalid status code 'Conflict'

I want to create multiple secrets in Key Vault, assigning dynamic values from a Blob Storage account and a Batch account.
I tried below code to create secrets:
Function CreateKeyvaultSecrets
{
    Param
    (
        [Parameter(Mandatory=$true, Position=0)]
        [string] $keyvaultName,
        [Parameter(Mandatory=$true, Position=1)]
        [string] $blobStorageAccountName,
        [Parameter(Mandatory=$true, Position=2)]
        [string] $batchaccountName,
        [Parameter(Mandatory=$true, Position=3)]
        [string] $logRgName
    )

    #Get Storagekey
    $blobStorageKeyObject = (Get-AzStorageAccountKey -ResourceGroupName $logRgName -AccountName $blobStorageAccountName) | Where-Object {$_.KeyName -eq "key1"}
    $blobStorageKey = $blobStorageKeyObject.Value
    $blobStorageConnectionString = "DefaultEndpointsProtocol=https;AccountName=$blobStorageAccountName;AccountKey=$blobStorageKey;EndpointSuffix=core.windows.net"

    #Create blobstorage key secret
    $blobSecretkey = ConvertTo-SecureString -String $blobStorageKey -AsPlainText -Force
    Set-AzKeyVaultSecret -VaultName $keyvaultName -Name 'blobstorageaccesskey' -SecretValue $blobSecretkey

    #Create blobstorage connectionstring key secret
    $blobconnectionstringSecret = ConvertTo-SecureString -String $blobStorageConnectionString -AsPlainText -Force
    Set-AzKeyVaultSecret -VaultName $keyvaultName -Name 'blobstorageconnectionstring' -SecretValue $blobconnectionstringSecret
    Write-Host "Blob Storage Account connection string added to Keyvault secret"
}
CreateKeyvaultSecrets 'kvtevalmock' 'steval' 'abtaeval' 'rg-eval'
I am trying to execute the above code from an Azure DevOps PowerShell task. The Azure PowerShell version is 5.
Secrets are not getting created; the error below is thrown:
WARNING: Upcoming breaking changes in the cmdlet 'Set-AzKeyVaultSecret' :
- The output type 'Microsoft.Azure.Commands.KeyVault.Models.PSKeyVaultSecret' is changing
- The following properties in the output type are being deprecated : 'SecretValueText'
- The change is expected to take effect from the version : '3.0.0'
Note : Go to https://aka.ms/azps-changewarnings for steps to suppress this breaking change warning, and other
information on breaking changes in Azure PowerShell.
##[error]Operation returned an invalid status code 'Conflict'
##[error]PowerShell exited with code '1'.
I tested your script on my side and it works fine.
From the error message, it looks like your Az.KeyVault PowerShell module version is too old; mine is 3.4.0. Try updating it with the command below.
Update-Module -Name Az.KeyVault -Force
After the update, close all PowerShell sessions and open a new one to try again; it should work.
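You can confirm which version the agent actually loads with:
Get-Module -Name Az.KeyVault -ListAvailable | Select-Object Name, Version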

In a PowerShell script file residing in TFS, how do I read a SQL file which also resides in TFS?

I am maintaining my PowerShell script file and SQL file in a TFS repository. I am trying to read my SQL file from the PowerShell script (which also resides in TFS). I call this PowerShell script in my build, and I get an error when it executes.
$ServerInstance = "ABCServer"
$Database = "MyDB"
$ConnectionTimeout = 30
$Query = get-content "$/MYPROJECT/Queries/GetProjects.sql"
$QueryTimeout = 120
$conn = new-object System.Data.SqlClient.SQLConnection
$ConnectionString = "Server={0};Database={1};Integrated Security=True;Connect Timeout={2}" -f $ServerInstance,$Database,$ConnectionTimeout
$conn.ConnectionString = $ConnectionString
$conn.Open()
$cmd = new-object system.Data.SqlClient.SqlCommand($Query,$conn)
$cmd.CommandTimeout = $QueryTimeout
$ds = New-Object system.Data.DataSet
$da = New-Object system.Data.SqlClient.SqlDataAdapter($cmd)
[void]$da.fill($ds)
$ds.Tables[0] | foreach {
    write-host ('Name value is : ' + $_.Title)
}
$conn.Close()
#$ds.Tables
My PowerShell script is saved in "$/MYPROJECT/PowershellScripts/QueryDB.ps1".
I have this PowerShell script added as a TFS task in my build steps. I am getting the following error:
PathNotFound,Microsoft.PowerShell.Commands.GetContentCommand
Exception calling "Open" with "0" argument(s): "Cannot open database "MyDB" requested by the login"
UPDATE:
During the build, the source (including the .sql file) is first copied to the agent machine. It works when the path uses the variable "$env:BUILD_REPOSITORY_LOCALPATH/Queries/GetProjects.sql" and the PS script runs on the agent machine.
If you need to run the PS script on other machines (not the agent machine), you have to copy the script file to that machine first and specify the actual file path in the script.
I can reproduce your issue; please follow the steps below to fix it:
Modify the GetProjects.sql file path in your PowerShell script based on your project structure, like this:
$Query = get-content "$env:BUILD_REPOSITORY_LOCALPATH/Queries/GetProjects.sql"
So, the complete PS script should be:
$ServerInstance = "ABCServer"
$Database = "MyDB"
$ConnectionTimeout = 30
$Query = get-content "$env:BUILD_REPOSITORY_LOCALPATH/Queries/GetProjects.sql"
$QueryTimeout = 120
$conn = new-object System.Data.SqlClient.SQLConnection
$ConnectionString = "Server={0};Database={1};Integrated Security=True;Connect Timeout={2}" -f $ServerInstance,$Database,$ConnectionTimeout
$conn.ConnectionString = $ConnectionString
$conn.Open()
$cmd = new-object system.Data.SqlClient.SqlCommand($Query,$conn)
$cmd.CommandTimeout = $QueryTimeout
$ds = New-Object system.Data.DataSet
$da = New-Object system.Data.SqlClient.SqlDataAdapter($cmd)
[void]$da.fill($ds)
$ds.Tables[0] | foreach {
    write-host ('Name value is : ' + $_.Title)
}
$conn.Close()
#$ds.Tables
Add the build agent service account (the default service account should be NT AUTHORITY\NETWORK SERVICE if you didn't change it) as a user of the database ("MyDB") with login and query permissions:
Go to Database >> Security >> Users, right-click NT AUTHORITY\NETWORK SERVICE, and select Properties.
In the newly opened Login Properties screen, go to the "Membership" tab. On the lower screen, check the role db_owner. Click OK.
Then try building again; it should work now.
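If you prefer to script that permission grant rather than click through SSMS, here is a sketch (it assumes Invoke-Sqlcmd is available, that the NT AUTHORITY\NETWORK SERVICE login already exists at the server level, and SQL Server 2012+ syntax; names match the question):
$grantSql = @"
USE [MyDB];
CREATE USER [NT AUTHORITY\NETWORK SERVICE] FOR LOGIN [NT AUTHORITY\NETWORK SERVICE];
ALTER ROLE db_owner ADD MEMBER [NT AUTHORITY\NETWORK SERVICE];
"@
Invoke-Sqlcmd -ServerInstance "ABCServer" -Query $grantSql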

Recompose a VM using VMware 5.5 PowerCLI

How can I recompose a VM v1 with an existing parent VM v2's snapshot s2?
After referring to a few documents, I found the command Send-LinkedCloneRecompose for recomposing a VM.
I am trying this command as follows:
$myVM = 'v1'
$clone = Get-DesktopVM -Name $myVM
$pool = Get-Pool -pool_id $clone.pool_id
$date = Get-Date
$date = $date.AddSeconds(10)
Write-Host "Recomposing" $clone.name
Send-LinkedCloneRecompose -machine_id $clone.machine_id -parentVMPath $pool.parentVMPath -parentSnapshotPath $pool.parentVMSnapshotPath -schedule $date -forcelogoff $true | tee-object -variable vmState
Here I am getting the error:
Get-DesktopVM : The term 'Get-DesktopVM' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
I am getting the same error for the PowerCLI cmdlets Get-Pool and Send-LinkedCloneRecompose as well.
I am using VMware vSphere PowerCLI 5.5 Release 2 Patch 1.
Can anyone please help me understand the problem here?
I had missed adding
& "C:\Program Files\VMware\VMware View\Server\extras\PowerShell\add-snapin.ps1"
before executing the above commands, so the cmdlets were not recognized.
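To make that dependency explicit, the snap-in load can be guarded (a sketch; the path assumes a default View Connection Server install):
$snapinScript = "C:\Program Files\VMware\VMware View\Server\extras\PowerShell\add-snapin.ps1"
if (Test-Path $snapinScript) {
    & $snapinScript
} else {
    throw "View snap-in script not found: $snapinScript"
}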

How to execute .sql file using powershell?

I have a .sql file. I am trying to pass connection string details through a PowerShell script and invoke the .sql file.
I searched and came across the Invoke-Sqlcmd cmdlet, but when I looked for a corresponding SQL module, I did not find one on my machine.
Should I install anything on my machine (it already has SQL Server Management Studio 2008 R2) to get the modules, or is there an easy way to execute .sql files using PowerShell?
Try to see if SQL snap-ins are present:
get-pssnapin -Registered
Name : SqlServerCmdletSnapin100
PSVersion : 2.0
Description : This is a PowerShell snap-in that includes various SQL Server cmdlets.
Name : SqlServerProviderSnapin100
PSVersion : 2.0
Description : SQL Server Provider
If so
Add-PSSnapin SqlServerCmdletSnapin100 # here lives Invoke-SqlCmd
Add-PSSnapin SqlServerProviderSnapin100
then you can do something like this:
invoke-sqlcmd -inputfile "c:\mysqlfile.sql" -serverinstance "servername\serverinstance" -database "mydatabase" # the parameter -database can be omitted based on what your sql script does.
Quoting from Import the SQLPS Module on MSDN,
The recommended way to manage SQL Server from PowerShell is to import
the sqlps module into a Windows PowerShell 2.0 environment.
So, yes, you could use the Add-PSSnapin approach detailed by Christian, but it is also useful to appreciate the recommended sqlps module approach.
The simplest case assumes you have SQL Server 2012: sqlps is included in the installation so you simply load the module like any other (typically in your profile) via Import-Module sqlps. You can check if the module is available on your system with Get-Module -ListAvailable.
If you do not have SQL Server 2012, then all you need do is download the sqlps module into your modules directory so Get-Module/Import-Module will find it. Curiously, Microsoft does not make this module available for download! However, Chad Miller has kindly packaged up the requisite pieces and provided this module download. Unzip it under your ...Documents\WindowsPowerShell\Modules directory and proceed with the import.
It is interesting to note that the module approach and the snapin approach are not identical. If you load the snapins then run Get-PSSnapin (without the -Registered parameter, to show only what you have loaded) you will see the SQL snapins. If, on the other hand, you load the sqlps module Get-PSSnapin will not show the snapins loaded, so the various blog entries that test for the Invoke-Sqlcmd cmdlet by only examining snapins could be giving a false negative result.
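A test that works under either loading mechanism is to probe for the cmdlet itself rather than the snap-in, for example:
if (Get-Command Invoke-Sqlcmd -ErrorAction SilentlyContinue) { "Invoke-Sqlcmd is available" }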
2012.10.06 Update
For the complete story on the sqlps module vs. the sqlps mini-shell vs. SQL Server snap-ins, take a look at my two-part mini-series Practical PowerShell for SQL Server Developers and DBAs recently published on Simple-Talk.com where I have, according to one reader's comment, successfully "de-confused" the issue. :-)
if (Test-Path "C:\Program Files\Microsoft SQL Server\MSSQL11.SQLEXPRESS") { # SQL Server 2012
    Import-Module SqlPs -DisableNameChecking
    C: # Switch back from the SqlServer provider drive
} else { # SQL Server 2008
    Add-PSSnapin SqlServerCmdletSnapin100 # here lives Invoke-SqlCmd
}
Invoke-Sqlcmd -InputFile "MySqlScript.sql" -ServerInstance "ServerName\InstanceName" -ErrorAction 'Stop' -Verbose -QueryTimeout 1800 # 30 min
Here is a function that I have in my PowerShell profile for loading SQL snapins:
function Load-SQL-Server-Snap-Ins
{
    try
    {
        $sqlpsreg = "HKLM:\SOFTWARE\Microsoft\PowerShell\1\ShellIds\Microsoft.SqlServer.Management.PowerShell.sqlps"
        if (!(Test-Path $sqlpsreg -ErrorAction "SilentlyContinue"))
        {
            throw "SQL Server Powershell is not installed yet (part of SQLServer installation)."
        }
        $item = Get-ItemProperty $sqlpsreg
        $sqlpsPath = [System.IO.Path]::GetDirectoryName($item.Path)
        $assemblyList = @(
            "Microsoft.SqlServer.Smo",
            "Microsoft.SqlServer.SmoExtended",
            "Microsoft.SqlServer.Dmf",
            "Microsoft.SqlServer.WmiEnum",
            "Microsoft.SqlServer.SqlWmiManagement",
            "Microsoft.SqlServer.ConnectionInfo",
            "Microsoft.SqlServer.Management.RegisteredServers",
            "Microsoft.SqlServer.Management.Sdk.Sfc",
            "Microsoft.SqlServer.SqlEnum",
            "Microsoft.SqlServer.RegSvrEnum",
            "Microsoft.SqlServer.ServiceBrokerEnum",
            "Microsoft.SqlServer.ConnectionInfoExtended",
            "Microsoft.SqlServer.Management.Collector",
            "Microsoft.SqlServer.Management.CollectorEnum"
        )
        foreach ($assemblyName in $assemblyList)
        {
            $assembly = [System.Reflection.Assembly]::LoadWithPartialName($assemblyName)
            if ($assembly -eq $null)
            { Write-Host "`t`t$($MyInvocation.InvocationName): Could not load $assemblyName" }
        }
        Set-Variable -Scope Global -Name SqlServerMaximumChildItems -Value 0
        Set-Variable -Scope Global -Name SqlServerConnectionTimeout -Value 30
        Set-Variable -Scope Global -Name SqlServerIncludeSystemObjects -Value $false
        Set-Variable -Scope Global -Name SqlServerMaximumTabCompletion -Value 1000
        Push-Location
        if ((Get-PSSnapin -Name SqlServerProviderSnapin100 -ErrorAction SilentlyContinue) -eq $null)
        {
            cd $sqlpsPath
            Add-PSSnapin SqlServerProviderSnapin100 -ErrorAction Stop
            Add-PSSnapin SqlServerCmdletSnapin100 -ErrorAction Stop
            Update-TypeData -PrependPath SQLProvider.Types.ps1xml
            Update-FormatData -PrependPath SQLProvider.Format.ps1xml
        }
    }
    catch
    {
        Write-Host "`t`t$($MyInvocation.InvocationName): $_"
    }
    finally
    {
        Pop-Location
    }
}
Here's a lightweight approach for simple scripts that requires no additional tools / setup / PowerShell add-ons.
$conn = New-Object System.Data.SqlClient.SqlConnection
$conn.ConnectionString = $connectionStringGoesHere
$conn.Open()
$content = Get-Content $scriptFileNameGoesHere
# Split the script into batches on GO separator lines
$cmds = New-Object System.Collections.ArrayList
$cmd = ""
$content | foreach {
    if ($_.Trim() -eq "GO") { [void]$cmds.Add($cmd); $cmd = "" }
    else { $cmd = $cmd + $_ + "`r`n" }
}
if ($cmd.Trim() -ne "") { [void]$cmds.Add($cmd) } # keep the final batch if the file doesn't end with GO
$cmds | foreach {
    $sc = New-Object System.Data.SqlClient.SqlCommand
    $sc.CommandText = $_
    $sc.Connection = $conn
    $sc.ExecuteNonQuery()
}
$conn.Close()
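To use the snippet, define the two placeholder variables first; the values below are illustrative assumptions:
$connectionStringGoesHere = "Server=.\SQLEXPRESS;Database=MyDatabase;Integrated Security=True"
$scriptFileNameGoesHere = "C:\scripts\MySqlScript.sql"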
With SQL Server 2008 and 2008 R2:
Add-PSSnapin -Name SqlServerCmdletSnapin100, SqlServerProviderSnapin100
With SQL Server 2012 and 2014:
Push-Location
Import-Module -Name SQLPS -DisableNameChecking
Pop-Location
Invoke-Sqlcmd -Database MyDatabase -Query "exec dbo.sp_executesql N'$(Get-Content "c:\my.sql")'"

Powershell scripts to backup SQL, SVN

I'm trying to use PowerShell to create some backups, and then to copy these to a web folder (or, in other words, upload them to a WebDAV share).
At first I thought I'd do the WebDAV stuff from within PowerShell, but it seems this still requires a fair amount of "manual labour", i.e. constructing HTTP requests. I then settled for creating a web folder from the script and letting Windows handle the WebDAV stuff. It seems that all it takes to create a web folder is to create a standard shortcut, as described here.
What I can't figure out is how to actually copy files to the shortcut's target..? Maybe I'm going about this the wrong way.
It would be ideal if I could somehow encrypt the credentials for the WebDAV in the script, then have it create the web folder, shunt over the files, and delete the web folder again. Or even better, not use a web folder at all. Third option would be to just create the web folder manually and leave it there, though I'd rather not.
Any ideas/pointers/tips? :)
If you are using PowerShell to back up your SVN repositories using svnadmin dump, be aware that piping the output to a file will silently corrupt your backups.
PowerShell likes to change things to UTF-16 when piping, and it also changes Unix line breaks to Windows ones. This will come back to haunt you when you try to restore.
Problem well described here:
http://thoughtfulcode.wordpress.com/2010/01/29/powershells-object-pipeline-corrupts-piped-binary-data/
Solution here:
http://www.microsoft.com/communities/newsgroups/en-us/default.aspx?dg=microsoft.public.windows.powershell&tid=e4cd89e9-427b-407d-a94f-c24be3f1e36f&cat=&lang=&cr=&sloc=&p=1
In summary, use cmd.exe instead of powershell:
cmd.exe /c svnadmin dump ... `> dumpfile.dump
Note that the backtick on the output redirection is required to stop PowerShell from parsing it.
Well, in the meantime I cobbled together another solution. Maybe it will be of use to someone.
[Run.cmd] --May need to change powershell path
C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe -nologo -noninteractive -command "C:\Scripts\RunBackup.ps1"
[RunBackup.ps1] --Out-File not having the desired effect, maybe someone can figure out why?
C:\Scripts\SqlBackup.ps1 | Out-File "C:\Scripts\log.txt"
C:\Scripts\SVNBackup.ps1 | Out-File "C:\Scripts\log.txt"
C:\Scripts\Zip.ps1 | Out-File "C:\Scripts\log.txt"
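A likely cause of the Out-File problem: each call overwrites log.txt, so only the last script's output survives. Appending on the later calls should preserve all three (a sketch):
C:\Scripts\SqlBackup.ps1 | Out-File "C:\Scripts\log.txt"
C:\Scripts\SVNBackup.ps1 | Out-File "C:\Scripts\log.txt" -Append
C:\Scripts\Zip.ps1 | Out-File "C:\Scripts\log.txt" -Append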
[SqlBackup.ps1] --You may need to modify which SMO assemblies are loaded, depending on your
version of SQL server. Don't forget to set $instance and $bkdir.
#http://www.mssqltips.com/tip.asp?tip=1862&home
$instance = ".\SQLEXPRESS"
[System.Reflection.Assembly]::LoadFrom("C:\Program Files\Microsoft SQL Server\100\SDK\Assemblies\Microsoft.SqlServer.SMO.dll") | out-null
[System.Reflection.Assembly]::LoadFrom("C:\Program Files\Microsoft SQL Server\100\SDK\Assemblies\Microsoft.SqlServer.SMOExtended.dll") | out-null
$s = new-object ("Microsoft.SqlServer.Management.Smo.Server") $instance
$bkdir = "c:\Backups" #We define the folder path as a variable
$dbs = $s.Databases
foreach ($db in $dbs)
{
    if ($db.Name -ne "tempdb") #We don't want to backup the tempdb database
    {
        $dbname = $db.Name
        $dt = get-date -format yyyyMMddHHmm #We use this to create a file name based on the timestamp
        $dbBackup = new-object ("Microsoft.SqlServer.Management.Smo.Backup")
        $dbBackup.Action = "Database"
        $dbBackup.Database = $dbname
        $dbBackup.Devices.AddDevice($bkdir + "\" + $dbname + "_db_" + $dt + ".bak", "File")
        $dbBackup.SqlBackup($s)
    }
    if ($db.RecoveryModel -ne 3) #Don't issue log backups for DBs with RecoveryModel = 3 (SIMPLE)
    {
        $dbname = $db.Name
        $dt = get-date -format yyyyMMddHHmm #Create a file name based on the timestamp
        $dbBackup = new-object ("Microsoft.SqlServer.Management.Smo.Backup")
        $dbBackup.Action = "Log"
        $dbBackup.Database = $dbname
        $dbBackup.Devices.AddDevice($bkdir + "\" + $dbname + "_log_" + $dt + ".trn", "File")
        $dbBackup.SqlBackup($s)
    }
}
[SVNBackup.ps1] --Modify repo and backup paths
#set alias to svnadmin exe (note: the alias applies only inside PowerShell; the cmd.exe call below still needs svnadmin.exe on the system PATH)
set-alias svnadmin "C:\Program Files (x86)\CollabNet Subversion Server\svnadmin.exe"
#create dump via cmd.exe so PowerShell's pipeline cannot corrupt the binary output
cmd.exe /c svnadmin dump "C:\Repo" `> "C:\Backups\svn.dmp"
#remove alias
remove-item alias:svnadmin
[Zip.ps1] --Need to have 7zip installed, modify 7z.exe path if necessary
#set alias to command line version of 7zip
set-alias sevenz "c:\program files\7-zip\7z.exe"
#Backups location
cd 'C:\Backups'
#7-zip the contents of the directory
$dt = get-date -format yyyyMMddHHmm #We use this to create a file name based on the timestamp
$outputFileName = "SQLSVNBackup$dt.7z"
$exclude1 = "-x!*.rar"
$exclude2 = "-x!*.7z"
sevenz a -t7z "$outputFileName" *.* "$exclude1" "$exclude2"
#find and remove all .bak files in the immediate directory
dir '*.bak' | foreach-object {
    remove-item $_.name
}
#find and remove all .dmp files in the immediate directory
dir '*.dmp' | foreach-object {
    remove-item $_.name
}
#find and remove all .trn files in the immediate directory
dir '*.trn' | foreach-object {
    remove-item $_.name
}
#remove 7zip alias
remove-item alias:sevenz
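The three cleanup loops could be collapsed into a single pipeline with the same effect:
Get-ChildItem 'C:\Backups\*' -Include *.bak, *.dmp, *.trn | Remove-Item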
I used GoodSync to backup to WebDAV and scheduled two tasks to run the .cmd file and then sync/backup offsite.