Suppose I have two PowerShell scripts, test1.ps1 and test2.ps1:
#test1.ps1
$a="test1"
##################
#test2.ps1
echo $a
How do I reference $a, set in test1.ps1, from test2.ps1?
In test1.ps1, set the variable in the global scope:
Set-Variable -Name a -Value "test1" -Scope Global
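For example, a minimal sketch of the two scripts together (it assumes both run in the same PowerShell session, e.g. .\test1.ps1 and then .\test2.ps1):
# test1.ps1
Set-Variable -Name a -Value "test1" -Scope Global
# test2.ps1 -- reads the variable from the global scope
echo $global:a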
PowerShell newbie trying to figure out how to pull a SQL row with multiple comma-separated values into a PowerShell array to be used with Get-ChildItem -Exclude.
Please note I have the SQL portion working, so I can pull the values into PowerShell. But no matter what I have tried, when I pass the result to the -Exclude option it is treated as just one name rather than comma-separated names. I need help understanding how to pass the SQL row values into an array so it can be used properly with -Exclude.
View of hardcoded array that works:
https://i.stack.imgur.com/9Fnsh.png
View of SQL table:
https://i.stack.imgur.com/bNuKy.png
Code that gets it from SQL:
https://i.stack.imgur.com/hCP2P.png
View of PS code to remove folders using the -ExcludeFolder option:
https://i.stack.imgur.com/dfvlK.png
$ExcludeFolder = $row.ExcludeFolder
Get-ChildItem -Path $Source -Recurse -Force -Exclude $ExcludeFolder |
    Where-Object { $_.PSIsContainer -and
        (Get-ChildItem -Path $_.FullName -Recurse -Force | Where-Object { !$_.PSIsContainer }) -eq $null -and
        $_.LastWriteTime -lt $CutDay } |
    Remove-Item -Force -Recurse
Write-Host "End Time:" (Get-Date)
}
Thanks for any help you can provide.
SK
If the string in your $row.ExcludeFolder variable reliably includes commas, you can use something like the following to create an array:
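(A minimal sketch, assuming $row.ExcludeFolder holds a single comma-separated string:)
$excludeArray = $row.ExcludeFolder -split ',' | ForEach-Object { $_.Trim() }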
You might then be able to pass the array as the -Exclude value
Is it possible to remove the timestamp when Export-DbaScript -Path is used? I would just like to use the server name and date.
For example this is what I want to export:
Get-DbaAgentJob -SqlInstance $ServerList | Export-DbaScript -Path $FileName -Append
My output file is named with a timestamp appended, which I would like to avoid.
Is this possible, and if so, how would I go about achieving it?
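One possibility, assuming your dbatools version exposes the -FilePath parameter (which writes to exactly the file you name instead of generating a timestamped one); the folder and date format below are only for illustration:
# Build the file name yourself from the server name and date
$FileName = "C:\Export\$($ServerList)-$(Get-Date -Format yyyyMMdd).sql"
Get-DbaAgentJob -SqlInstance $ServerList | Export-DbaScript -FilePath $FileName -Append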
I have T-SQL insert statements in multiple text files in a folder. I need to run them against SQL Server using a foreach loop in PowerShell: it should go to each file in the folder, read the query, execute it, and terminate the connection, then read the next file and do the same operation.
This is my attempt, which doesn't work:
invoke-sqlcmd -ServerInstance KUMSUSHI7 -Query (Get-Content | For-each "C:\Drill\Task\SAMS Automation\Query.txt")
Please help me with this.
Invoke-Sqlcmd can take a script file as a parameter, so no -Query is needed. There's an example on MSDN:
Invoke-Sqlcmd -InputFile "C:\TestSQLCmd.sql" | Out-File -FilePath "C:\TestSQLCmd.rpt"
Thus, get a list of your SQL files and pass the file names to Invoke-Sqlcmd like so:
gci "c:\some\path\*" -include *.sql | % {
Invoke-Sqlcmd -InputFile $_.FullName
}
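Adapted to the question's setup, something like this should work (the instance name and folder come from the question; the *.txt pattern is an assumption, since the files there are text files):
Get-ChildItem "C:\Drill\Task\SAMS Automation\*" -Include *.txt | ForEach-Object {
    # Each Invoke-Sqlcmd call opens its own connection and closes it when the batch finishes
    Invoke-Sqlcmd -ServerInstance KUMSUSHI7 -InputFile $_.FullName
}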
I am sure this might have been asked a million times before. I am very new to PowerShell and would like to ask if I am doing this right.
In the directory, we have many file types. What I am trying to accomplish is to move only PDF files that are older than one month, and to not even touch the other file extensions. The extensions in the folder are:
pdf, xml, csv
I have searched the forums prior to asking. This is what I have so far.
get-childitem -path \\server\folder -include "*.pdf" -exclude "*.xml,*.csv" | where-object {$_.LastWriteTime -gt (get-date).AddDays(-31)} | move-item -destination \\server\folder\folder2
One question though, how would you handle an exclude if there is no file extension?
Thanks for your time and patience with this noob!
There is no need to use Where-Object to test the extension, as Get-ChildItem does this for you. For a single extension to search for, I would use the -Filter parameter (the second positional parameter), e.g. (note -lt rather than -gt, since you want files older than the cutoff):
$date = (Get-Date).AddDays(-31)
Get-ChildItem \\server\folder *.pdf | Where-Object { $_.LastWriteTime -lt $date } |
    Move-Item -Destination \\server\folder\folder2
By the way, using the -Filter parameter is also faster, which may be important when searching a network share.
One way is to add $_.Extension -eq ".pdf" to your where-object block so that you only grab those extensions.
Get-ChildItem -Path \\server\folder | Where-Object {
    $_.Extension -eq ".pdf" -and ($_.LastWriteTime -gt (Get-Date).AddDays(-31)) } |
    Move-Item -Destination C:\test\test
Also, if you want files older than one month, your date comparison needs to be -lt and not -gt
Get-ChildItem -Path \\server\folder | Where-Object {
    $_.Extension -eq ".pdf" -and ($_.LastWriteTime -lt (Get-Date).AddDays(-31)) } |
    Move-Item -Destination C:\test\test
Another way to do this is to pass a wildcard file-name pattern to the -Filter parameter (the -Name switch would return plain strings, which have no LastWriteTime to test):
Get-ChildItem -Path \\server\folder -Filter "*.pdf" | Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-31) } | Move-Item -Destination \\server\folder\folder2
I am not sure I understand your last question, but if you have PDF files with no extension, see this discussion.
I think something like this can help you. file.exe (download) reports the file type, which is something like "PDF document, version 1.6", depending on the PDF file version.
gci . | ? { (file.exe -b $_.FullName) -match "pdf" }
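If you would rather not depend on the external file.exe, here is a minimal PowerShell sketch that checks for the %PDF magic bytes at the start of each file (-Encoding Byte is Windows PowerShell syntax; PowerShell 7+ uses -AsByteStream instead):
Get-ChildItem \\server\folder | Where-Object { !$_.PSIsContainer } | Where-Object {
    # Every PDF begins with the four ASCII bytes "%PDF", regardless of extension
    $bytes = Get-Content -Path $_.FullName -Encoding Byte -TotalCount 4 -ErrorAction SilentlyContinue
    $bytes.Count -eq 4 -and [System.Text.Encoding]::ASCII.GetString([byte[]]$bytes) -eq '%PDF'
}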
How can I check the return value of the "find" command in a shell script?
I use find in my script, and if find doesn't find any file, the script should exit.
I want to check the return value of "find" to see whether it found any files or not.
Since find returns exit status 0 even when it matches nothing, you can instead redirect the output of the find command to a file called, say, output.txt, and then check whether that file is empty by using the -s test:
find . -name "*.txt" > output.txt   # the pattern is an example; use your own
if [[ -s "output.txt" ]]
then
    echo "File is not empty!"
else
    echo "File is empty!"
fi
You can count the number of files found by find using the wc -l command:
result=$(find . -name "*.txt" | wc -l)
Note the quotes around "*.txt": without them the shell would expand the pattern before find sees it. You can now check result to see how many files were found:
if [ "$result" -eq 0 ]; then echo "zero found"; fi