Scheduling an automated file name change - automation

I'm trying to schedule a task that renames images, for example from 123-1234.jpg to 123_1234.jpg.
This is what I have now:
powershell.exe -noexit -command "cd'C:\path\i\want" Get-ChildItem -Filter "*.jpg" -Recurse | Rename-Item -NewName {$_.name -replace "-", "_" }
Appreciate the help.
I need this because one piece of software produces images in various extensions but names the files like 12345-123456.jpg.
The other software imports names like 12345_123456.jpg.
An automated task that simply renames every .jpg in a directory to the format the import expects would save me from needing third-party software.

powershell.exe -noexit -command "Get-ChildItem 'C:\my\file\path\' -Filter '*.jpg' -Recurse | Rename-Item -NewName { $_.name -replace '-', '_' }"
I'm proud of myself, as this is my first time using PowerShell. This worked fine.
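To run it on a schedule, here is a hedged sketch using the ScheduledTasks cmdlets (Windows 8 / Server 2012 or later; the task name and trigger time are made up):
#run the rename one-liner daily at 02:00; -noexit is dropped so the console closes when done
$action  = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument '-NoProfile -Command "Get-ChildItem ''C:\my\file\path\'' -Filter *.jpg -Recurse | Rename-Item -NewName { $_.Name -replace ''-'', ''_'' }"'
$trigger = New-ScheduledTaskTrigger -Daily -At 2am
Register-ScheduledTask -TaskName 'RenameJpgImages' -Action $action -Trigger $trigger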

Related

Converting web app files to UTF-8 for Windows 8 Apps

It seems like I get the same UTF-8 error every time I submit a Windows 8 app.
Is there a faster way to convert a batch of files to be UTF-8 formatted?
You can use the PowerShell script below to convert all the files in a directory to UTF-8.
# read each file and write it back out with UTF-8 encoding
$files = [IO.Directory]::GetFiles("C:\temp\files")
foreach ($file in $files)
{
    $content = Get-Content -Path $file
    $content | Out-File $file -Encoding utf8
}
You should be able to run the script above using PowerShell ISE.
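If the files are spread across subfolders, a variant of the same idea recurses instead (a hedged sketch; the -File switch needs PowerShell 3.0 or later):
# convert every file under the tree, not just the top-level directory
Get-ChildItem "C:\temp\files" -Recurse -File | ForEach-Object {
    $content = Get-Content -Path $_.FullName
    $content | Out-File $_.FullName -Encoding utf8
}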

Using CurrentDomain.SetData("APP_CONFIG_FILE") doesn't work in PowerShell ISE

I'm attempting to use a .NET 4.0 assembly in PowerShell ISE, and I'm trying to change the config file it uses via:
[System.AppDomain]::CurrentDomain.SetData("APP_CONFIG_FILE", $PathToConfig);
[Configuration.ConfigurationManager]::ConnectionStrings.Count always returns "1", and [Configuration.ConfigurationManager]::ConnectionStrings[0].Name always returns "LocalSqlServer", even though that connection string name is not in my .config file.
Note that the script functions as expected when executed from a PowerShell command prompt; it's only when I run it from within PowerShell ISE that it doesn't.
It's because the path to app.config for PowerShell ISE has already been loaded and cached, so changing the app.config path afterwards won't make a difference:
stackoverflow.com/q/6150644/222748
Here is an example script that will clear the cached path so it will work under PowerShell ISE:
# point the current AppDomain at the new config file
[System.AppDomain]::CurrentDomain.SetData("APP_CONFIG_FILE", $PathToConfig)
Add-Type -AssemblyName System.Configuration
# reset ConfigurationManager's cached init state and config system
[Configuration.ConfigurationManager].GetField("s_initState", "NonPublic, Static").SetValue($null, 0)
[Configuration.ConfigurationManager].GetField("s_configSystem", "NonPublic, Static").SetValue($null, $null)
# clear the cached config paths held by the internal ClientConfigPaths type
([Configuration.ConfigurationManager].Assembly.GetTypes() | where {$_.FullName -eq "System.Configuration.ClientConfigPaths"})[0].GetField("s_current", "NonPublic, Static").SetValue($null, $null)
[Configuration.ConfigurationManager]::ConnectionStrings[0].Name
Removing the [0] works for me:
([Configuration.ConfigurationManager].Assembly.GetTypes() | where {$_.FullName -eq "System.Configuration.ClientConfigPaths"}).GetField("s_current", "NonPublic, Static").SetValue($null, $null)
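Once the cache is cleared, lookups resolve against the new file; for example (MyDb is a hypothetical connection-string name):
# hypothetical name; adjust to whatever is defined in the target .config
[Configuration.ConfigurationManager]::ConnectionStrings["MyDb"].ConnectionString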

How to use the CloudBerry PowerShell snap-in (for Amazon S3) from within a scheduled SQL Agent job?

I am trying to automate my SQL database backup process. My goal is to use the CloudBerry PowerShell cmdlets to give me direct control and access over my S3 buckets. I can do this manually, but I cannot get my SQL jobs to work with it.
According to CloudBerry's installation instructions, I shouldn't have to register the CloudBerry PowerShell snap-in if PowerShell is already installed. I have found that not to be the case. I have tried to register it, both 64-bit and 32-bit, with no luck.
This works when executed manually/explicitly from the ISE:
Add-PSSnapin CloudBerryLab.Explorer.PSSnapIn
$today = Get-Date -format "yyyy.MM.dd.HH.mm.ss"
$key = "mykeygoeshere"
$secret = "mysecretgoeshere"
$s3 = Get-CloudS3Connection -Key $key -Secret $secret
$destination = $s3 | Select-CloudFolder -path "ProductionBackups/MyClient/log/" | Add-CloudFolder $today
$src = Get-CloudFilesystemConnection | Select-CloudFolder "X:\backups\MyClient\current\"
$src | Copy-CloudItem $destination -filter "log.trn"
When this script is executed in a SQL Agent job, it fails with this message:
Executed as user: DB-MAIN\SYSTEM. A job step received an error at line 1 in a PowerShell script. The corresponding line is 'Add-PSSnapin CloudBerryLab.Explorer.PSSnapIn'. Correct the script and reschedule the job. The error information returned by PowerShell is: 'The term 'Add-PSSnapin' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again. '. Process Exit Code -1. The step failed.
I read in a blog post that SQLPS.exe cannot execute 'Add-PSSnapin' commands. Is that true? I cannot find any clarification on the subject.
How can I automate uploading my SQL backup files to Amazon S3? I have tried everything. TNT Drive was a huge waste of time. I am hoping CloudBerry can do it; any tips?
You could use the Amazon AWS .NET SDK; it is loaded with Add-Type rather than a snap-in, so it sidesteps the Add-PSSnapin restriction. You can download it from here:
http://aws.amazon.com/sdkfornet/
Here's an example function to download a file from S3:
function DownloadS3File([string]$bucket, [string]$file, [string]$localFile)
{
    # load the AWS SDK assembly from whichever Program Files directory exists
    if (Test-Path "C:\Program Files (x86)")
    {
        Add-Type -Path "C:\Program Files (x86)\AWS SDK for .NET\bin\AWSSDK.dll"
    }
    else
    {
        Add-Type -Path "C:\Program Files\AWS SDK for .NET\bin\AWSSDK.dll"
    }
    # credentials come from the environment
    $secretKeyID = $env:AWS_ACCESS_KEY_ID
    $secretAccessKeyID = $env:AWS_SECRET_ACCESS_KEY
    $client = [Amazon.AWSClientFactory]::CreateAmazonS3Client($secretKeyID, $secretAccessKeyID)
    # request the object and stream it to disk in 4 KB chunks
    $request = New-Object -TypeName Amazon.S3.Model.GetObjectRequest
    $request.BucketName = $bucket
    $request.Key = $file
    $response = $client.GetObject($request)
    $writer = New-Object System.IO.FileStream ($localFile, [System.IO.FileMode]::Create)
    [byte[]]$buffer = New-Object byte[] 4096
    [int]$count = 0
    do
    {
        $count = $response.ResponseStream.Read($buffer, 0, $buffer.Length)
        $writer.Write($buffer, 0, $count)
    }
    while ($count -gt 0)
    $response.ResponseStream.Close()
    $writer.Close()
    echo "File downloaded: $localFile"
}
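For example, a hypothetical call (the bucket, key, and local path are placeholders):
# assumes AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY are set in the environment
DownloadS3File -bucket "my-backup-bucket" -file "backups/log.trn" -localFile "C:\Backups\log.trn"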

Removing a self-signed certificate from my store

Is there a way to remove/uninstall a self-signed certificate from my store using PowerShell?
I tried
Remove-Item cert:\LocalMachine\My\$thumb
It did not work; I got an exception saying "Provider does not support this operation".
I also tried
certmgr.msc /del /n "MyTestServer" /s MY
It did not work either.
How can I uninstall a certificate from the store?
Thanks in advance
Jeez
This approach seems to apply to PowerShell 2 only and is thus outdated.
Remove-Item does not work with certificates because the cert: provider is read-only in PowerShell. I found that information here.
# open the CurrentUser\My store for writing and remove the matching certs
$store = New-Object System.Security.Cryptography.X509Certificates.X509Store 'My','CurrentUser'
$store.Open('ReadWrite')
$certs = @(dir cert:\currentuser\my | ? { $_.Subject -like '*MyTestServer*' })
foreach ($cert in $certs) { $store.Remove($cert) }
$store.Close()
I found this solution in the comments here, so it is untested.
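To verify the store afterwards, a quick hedged check with the same subject filter:
# should return nothing once the MyTestServer certificate is gone
dir cert:\currentuser\my | ? { $_.Subject -like '*MyTestServer*' }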
I found this question because Remove-Item wasn't working.
This is not exactly 'true' PowerShell, but I use this method:
certutil -delstore my "5314bdfa0255be36e53e749d033"
You can get the thumbprint via cert:\LocalMachine\my or through certutil. In my case I have multiple certs with the exact same name, so I like the method above because it gives me a specific target when I delete a cert.
With PS 3.0, if you want to remove by subject name:
Get-ChildItem -Path Cert:\CurrentUser\My | where { $_.subject -eq "CN=MysubjectName" } | Remove-Item
With PS 3.0 there is a more concise and idiomatic approach:
Remove-Item -Path cert:\LocalMachine\My\{Thumbprint} -DeleteKey
See TechNet for all the details.
This will work as well in PowerShell.
To get the thumbprint:
dir cert:\localmachine\my
To delete by thumbprint:
del cert:\localmachine\my\thumbprint
I realise this is an old thread, but since I'm looking at doing the same right now I thought I'd post. I need to remove a certificate from all cert stores by friendly name. I realise this isn't the answer for the OP, but it may help someone. If anyone needs that, this works for me:
dir cert: -Recurse | Where-Object { $_.FriendlyName -like "*SOMENAME*" } | Remove-Item
You are looking at the wrong cert store. You say the certificates are yours, so they are stored under your user account's store, CurrentUser, not LocalMachine; you don't need to change "CurrentUser" to your account name. Instead of cert:\LocalMachine\My\$thumb, use:
$cert = Get-ChildItem -Path "Cert:\CurrentUser\My\THUMBPRINT"
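To actually delete it, a hedged sketch (PowerShell 3.0 or later, where the certificate provider supports removal; THUMBPRINT is a placeholder):
# remove the certificate located above
$cert | Remove-Item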
br.

PowerShell scripts to back up SQL, SVN

I'm trying to use PowerShell to create some backups, and then to copy these to a web folder (or, in other words, upload them to a WebDAV share).
At first I thought I'd do the WebDAV stuff from within PowerShell, but it seems this still requires a fair amount of "manual labour", i.e. constructing HTTP requests. I then settled for creating a web folder from the script and letting Windows handle the WebDAV stuff. It seems that all it takes to create a web folder is to create a standard shortcut, as described here.
What I can't figure out is how to actually copy files to the shortcut's target. Maybe I'm going about this the wrong way.
It would be ideal if I could somehow encrypt the credentials for the WebDAV in the script, then have it create the web folder, shunt over the files, and delete the web folder again. Or even better, not use a web folder at all. Third option would be to just create the web folder manually and leave it there, though I'd rather not.
Any ideas/pointers/tips? :)
If you are using PowerShell to back up your SVN repositories using svnadmin dump, be aware that piping the output to a file will silently corrupt your backups.
PowerShell re-encodes piped output as UTF-16 and also changes Unix line breaks to Windows ones. This will come back to haunt you when you try to restore.
Problem well described here:
http://thoughtfulcode.wordpress.com/2010/01/29/powershells-object-pipeline-corrupts-piped-binary-data/
Solution here:
http://www.microsoft.com/communities/newsgroups/en-us/default.aspx?dg=microsoft.public.windows.powershell&tid=e4cd89e9-427b-407d-a94f-c24be3f1e36f&cat=&lang=&cr=&sloc=&p=1
In summary, use cmd.exe instead of PowerShell:
cmd.exe /c svnadmin dump ... `> dumpfile.dump
Note that the backtick on the output redirection is required to stop PowerShell from parsing it.
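If you'd rather avoid stdout redirection entirely, svnadmin also has a hotcopy subcommand that copies the repository directly on disk (the destination path here is hypothetical):
# no pipeline or redirection involved, so PowerShell cannot re-encode anything
svnadmin hotcopy "C:\Repo" "C:\Backups\RepoCopy"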
Well, in the meantime I cobbled together another solution. Maybe it will be of use to someone.
[Run.cmd] --You may need to change the PowerShell path
C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe -nologo -noninteractive -command "C:\Scripts\RunBackup.ps1"
[RunBackup.ps1] --Out-File isn't having the desired effect; maybe someone can figure out why?
C:\Scripts\SqlBackup.ps1 | Out-File "C:\Scripts\log.txt"
C:\Scripts\SVNBackup.ps1 | Out-File "C:\Scripts\log.txt"
C:\Scripts\Zip.ps1 | Out-File "C:\Scripts\log.txt"
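A likely reason for the Out-File problem: each call overwrites C:\Scripts\log.txt, so only the last script's output survives. An untested sketch that appends instead:
C:\Scripts\SqlBackup.ps1 | Out-File "C:\Scripts\log.txt"
C:\Scripts\SVNBackup.ps1 | Out-File "C:\Scripts\log.txt" -Append
C:\Scripts\Zip.ps1 | Out-File "C:\Scripts\log.txt" -Append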
[SqlBackup.ps1] --You may need to modify which SMO assemblies are loaded, depending on your version of SQL Server. Don't forget to set $instance and $bkdir.
#http://www.mssqltips.com/tip.asp?tip=1862&home
$instance = ".\SQLEXPRESS"
[System.Reflection.Assembly]::LoadFrom("C:\Program Files\Microsoft SQL Server\100\SDK\Assemblies\Microsoft.SqlServer.SMO.dll") | Out-Null
[System.Reflection.Assembly]::LoadFrom("C:\Program Files\Microsoft SQL Server\100\SDK\Assemblies\Microsoft.SqlServer.SMOExtended.dll") | Out-Null
$s = New-Object ("Microsoft.SqlServer.Management.Smo.Server") $instance
$bkdir = "c:\Backups" #We define the folder path as a variable
$dbs = $s.Databases
foreach ($db in $dbs)
{
    if ($db.Name -ne "tempdb") #We don't want to backup the tempdb database
    {
        $dbname = $db.Name
        $dt = Get-Date -Format yyyyMMddHHmm #We use this to create a file name based on the timestamp
        $dbBackup = New-Object ("Microsoft.SqlServer.Management.Smo.Backup")
        $dbBackup.Action = "Database"
        $dbBackup.Database = $dbname
        $dbBackup.Devices.AddDevice($bkdir + "\" + $dbname + "_db_" + $dt + ".bak", "File")
        $dbBackup.SqlBackup($s)
    }
    if ($db.RecoveryModel -ne 3) #Don't issue log backups for DBs whose RecoveryModel is 3 (SIMPLE)
    {
        $dbname = $db.Name
        $dt = Get-Date -Format yyyyMMddHHmm #Create a file name based on the timestamp
        $dbBackup = New-Object ("Microsoft.SqlServer.Management.Smo.Backup")
        $dbBackup.Action = "Log"
        $dbBackup.Database = $dbname
        $dbBackup.Devices.AddDevice($bkdir + "\" + $dbname + "_log_" + $dt + ".trn", "File")
        $dbBackup.SqlBackup($s)
    }
}
[SVNBackup.ps1] --Modify repo and backup paths
#set alias to the svnadmin exe
set-alias svnadmin "C:\Program Files (x86)\CollabNet Subversion Server\svnadmin.exe"
#create dump; note that cmd.exe resolves svnadmin from PATH, not from the PowerShell alias,
#and the backtick stops PowerShell from interpreting the redirection itself
cmd.exe /c svnadmin dump "C:\Repo" `> "C:\Backups\svn.dmp"
#remove alias
remove-item alias:svnadmin
[Zip.ps1] --Requires 7-Zip; modify the 7z.exe path if necessary
#set alias to the command-line version of 7-Zip
set-alias sevenz "c:\program files\7-zip\7z.exe"
#Backups location
cd 'C:\Backups'
#archive the contents of the directory
$dt = get-date -format yyyyMMddHHmm #We use this to create a file name based on the timestamp
$outputFileName = "SQLSVNBackup$dt.7z"
$exclude1 = "-x!*.rar"
$exclude2 = "-x!*.7z"
sevenz a -t7z "$outputFileName" *.* "$exclude1" "$exclude2"
#remove the source .bak, .dmp and .trn files now that they are archived
dir '*.bak' | foreach-object { remove-item $_.name }
dir '*.dmp' | foreach-object { remove-item $_.name }
dir '*.trn' | foreach-object { remove-item $_.name }
#remove 7-Zip alias
remove-item alias:sevenz
In the end I used GoodSync to back up to WebDAV, and scheduled two tasks: one to run the .cmd file, and one to sync/back up offsite.