dbatools - Remove timestamp from Export-DbaScript -Path

Is it possible to remove the timestamp when Export-DbaScript -Path is used? I would just like to use the server name and date.
For example, this is what I want to export:
Get-DbaAgentJob -SqlInstance $ServerList | Export-DbaScript -Path $FILENAME -Append
This is what my output file is named (I would like it without the timestamp):
Is this possible, and if so, how would I go about achieving it?
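One hedged approach: build the file name yourself from the server name and date, and give the cmdlet that exact file name. Newer dbatools releases expose a -FilePath parameter for this (older ones take the full file name via -Path), and with an explicit file name the automatic timestamp suffix should not be added. The instance name, export folder, and date format below are placeholders:
# A sketch, assuming a dbatools version that supports -FilePath
$server   = "SQL01"                                              # hypothetical instance name
$fileName = "C:\Exports\$server-$(Get-Date -Format yyyyMMdd).sql"
Get-DbaAgentJob -SqlInstance $server |
    Export-DbaScript -FilePath $fileName -Append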

Related

How to pull a SQL row with multiple comma-separated values into a PowerShell array to be used with Get-ChildItem -Exclude

PowerShell newbie here, trying to figure out how to pull a SQL row with multiple comma-separated values into a PowerShell array to be used with Get-ChildItem -Exclude.
Please note I have the SQL portion working, so I can pull the values into PowerShell, but no matter what I have tried, when I pass the result to the -Exclude option it is treated as a single name rather than as comma-separated names. I need help understanding how to pass the SQL row values into an array so it can be used properly with -Exclude.
View of hardcoded array that works
https://i.stack.imgur.com/9Fnsh.png
View of SQL table
https://i.stack.imgur.com/bNuKy.png
Code that gets it from SQL
https://i.stack.imgur.com/hCP2P.png
View of PS code to remove folders using the -ExcludeFolder option
https://i.stack.imgur.com/dfvlK.png
$ExcludeFolder = $row.ExcludeFolder
Get-ChildItem -Path $Source -Recurse -Force -Exclude $ExcludeFolder |
    Where-Object { $_.PSIsContainer -and
        (Get-ChildItem -Path $_.FullName -Recurse -Force | Where-Object { !$_.PSIsContainer }) -eq $null -and
        $_.LastWriteTime -lt $CutDay } |
    Remove-Item -Force -Recurse
Write-Host "End Time:" (Get-Date)
}
Thanks for any help you can provide.
SK
If you have a string in your $row.ExcludeFolder variable that reliably includes commas, you can create an array from it and then pass that array as the -Exclude value, for example:
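A minimal sketch of that idea, assuming the column value is a single string such as "Temp,Logs,Archive":
$excludeFolder = $row.ExcludeFolder -split ',' | ForEach-Object { $_.Trim() }   # string -> string[]
Get-ChildItem -Path $Source -Recurse -Force -Exclude $excludeFolder             # -Exclude now sees several patterns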

Inline yaml pipeline scripts work separately, but not together

The following two inline YAML-pipeline PowerShell scripts:
- pwsh: (get-content -path $(versionHeader)) | foreach-object {$_ -replace "2.0.0.0", '$(major).$(minor).$(patch).0'} | set-content -path $(versionHeader)
  displayName: 'Update build number in header file'
- pwsh: (get-content -path $(versionHeader)) | foreach-object {$_ -replace "20200101000000", (get-date -f 'yyyyMMddhhmmss')} | set-content -path $(versionHeader)
  displayName: 'Update date in header file'
are meant to take these two lines
[assembly: MyApp.Net.Attributes.AssemblyVersion("2.0.0.0")]
[assembly: MyApp.Net.Attributes.BuildDateAttribute("20200101000000")]
and turn them into these two lines (i.e. put new values in the quotes)
[assembly: MyApp.Net.Attributes.AssemblyVersion("2.0.185.0")]
[assembly: MyApp.Net.Attributes.BuildDateAttribute("20200724013502")]
(The replacement values vary)
Either script works fine by itself, but when I try to use both scripts, one after the other, the second value comes out messed up:
[assembly: MyApp.Net.Attributes.AssemblyVersion("2.0.209.0")] // correct
[assembly: MyApp.Net.Attributes.BuildDateAttribute("202.0.209.000000")] // ?????
Obviously they are somehow interfering with each other but I don't know how. Can someone tell me what I'm doing wrong?
The problem is PowerShell's -replace operator: its search pattern is a regular expression, so each dot in 2.0.0.0 matches any character, and it happens that 20200101000000 matches 2.0.0.0:
PS> "20200101000000" -replace "2.0.0.0", "zzzzzz"
20zzzzzz00000
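One hedged way around it, assuming the version placeholder itself needs no regex behaviour: escape the search string with [regex]::Escape so the dots are matched literally (or simply run the date replacement before the version replacement):
# A sketch: with the dots escaped, the date string no longer matches the version pattern.
$pattern = [regex]::Escape("2.0.0.0")          # becomes 2\.0\.0\.0
"20200101000000" -replace $pattern, "zzzzzz"   # 20200101000000 (unchanged)
"2.0.0.0"        -replace $pattern, "zzzzzz"   # zzzzzz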

Powershell - using wildcards to search for filename

I am trying to make a PowerShell script that will search a folder for file names that match a certain file mask. All files in the folder have names of the form *yyyyMd*.txt.
I have made a script:
[String]$date = $(get-date -format yyyyMd)
$date1 = $date.ToString
Get-ChildItem C:\Users\pelam\Desktop\DOM | Where-Object {$_.Name -like '*$date1*'}
But this does not seem to work.
Can anyone help? It seems the problem is that the date variable is not correct, because when I hard-code something like the following, it works:
Get-ChildItem C:\Users\pelam\Desktop\DOM | Where-Object {$_.Name -like '*20141013*'}
You can simplify this by just using regex with the -match operator:
Get-ChildItem C:\Users\pelam\Desktop\DOM | Where-Object {$_ -match (Get-Date -format yyyyMMdd)}
And if you are on V3 or higher, you can further simplify to:
Get-ChildItem C:\Users\pelam\Desktop\DOM | Where Name -match (Get-Date -format yyyyMMdd)
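For what it is worth, the original -like approach also works once two things are fixed: ToString needs parentheses to actually be invoked, and the wildcard pattern must be in double quotes, because single quotes do not expand variables. A sketch:
$date = Get-Date -Format yyyyMMdd              # already a string, no ToString needed
Get-ChildItem C:\Users\pelam\Desktop\DOM |
    Where-Object { $_.Name -like "*$date*" }   # double quotes so $date is expanded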

Import-CSV Powershell, Exclude Records Containing String?

I have a section of a script where I am importing a CSV and trying to select only the records where AppName does not contain "Security".
Here is what I have, but it doesn't seem to be working; the records I am trying to omit still appear.
$file2 = import-csv -Path "$UpdatePath\$($todaydate).csv" | where {$_.AppName -notcontains "Security"} | Select-Object AppName
Any suggestions greatly appreciated
Thanks
$file2 = import-csv -Path "$UpdatePath\$($todaydate).csv" | where {$_.AppName -notlike "*Security*"} | Select-Object AppName
-notlike will do this; note that I also added asterisks, which act as wildcard characters in the -like operators.
-contains is actually for arrays (think collections of values).
about_Comparison_Operators
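A quick illustration of the difference, using made-up values:
"Security Update" -notlike "*Security*"            # False - the pattern matches the string
"Security Update" -notcontains "Security"          # True  - treated as a one-element collection with no exact match
@("Security","Office") -notcontains "Security"     # False - the collection holds that exact element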

Powershell get files older than x days and move them

I am sure this has been asked a million times before. I am very new to PowerShell and would like to ask if I am doing this right.
The directory contains many file types. What I am trying to accomplish is to move only the PDF files that are older than one month and not touch the other file extensions. The extensions in the folder are:
pdf, xml, csv
I have searched the forums prior to asking. This is what I have so far.
get-childitem -path \\server\folder -include "*.pdf" -exclude "*.xml,*.csv" | where-object {$_.LastWriteTime -gt (get-date).AddDays(-31)} | move-item -destination \\server\folder\folder2
One question though, how would you handle an exclude if there is no file extension?
Thanks for your time and patience with this noob!
There is no need to use Where-Object to test the extension, as Get-ChildItem does this for you, although I would use the -Filter parameter (the second positional parameter) when there is a single extension to search for, e.g.:
$date = (get-date).AddDays(-31)
get-childitem \\server\folder *.pdf | where-object {$_.LastWriteTime -gt $date} |
move-item -destination \\server\folder\folder2
Btw, using the -Filter parameter is also faster, which may be important when searching a network share.
One way is to add $_.Extension -eq ".pdf" to your where-object block so that you only grab those extensions.
get-childitem -path \\server\folder | where-object {
    $_.Extension -eq ".pdf" -and ($_.LastWriteTime -gt (get-date).AddDays(-31)) } |
    move-item -destination C:\test\test
Also, if you want files older than one month, your date comparison needs to be -lt, not -gt:
get-childitem -path \\server\folder | where-object {
    $_.Extension -eq ".pdf" -and ($_.LastWriteTime -lt (get-date).AddDays(-31)) } |
    move-item -destination C:\test\test
Another way to do this is to pass a wildcard for the file name. Note that the -Name switch would make Get-ChildItem return plain name strings, which breaks the LastWriteTime check, so use -Filter (or a wildcard path) to keep the file objects:
get-childitem -path \\server\folder -filter "*.pdf" | where-object {$_.LastWriteTime -gt (get-date).AddDays(-31)} | move-item -destination \\server\folder\folder2
I am not sure I understand your last question, but if you have PDF files with no extension, see this discussion.
I think something like this can help you: the file.exe utility (download) shows the file type, which is something like "PDF document, version 1.6", depending on the PDF version.
gci . | ? { (file.exe -b $_) -match "pdf" }
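If installing file.exe is not an option, a rough alternative is to check extensionless files for the "%PDF" magic bytes; a sketch, assuming Windows PowerShell 5.1 (PowerShell 7 uses -AsByteStream instead of -Encoding Byte):
Get-ChildItem -Path \\server\folder -File |
    Where-Object { $_.Extension -eq "" } |
    Where-Object {
        # Read the first four bytes and compare against the PDF signature
        $bytes = Get-Content -LiteralPath $_.FullName -Encoding Byte -TotalCount 4
        [System.Text.Encoding]::ASCII.GetString($bytes) -eq "%PDF"
    } |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-31) } |
    Move-Item -Destination \\server\folder\folder2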