As a newbie in PowerShell, I'm trying to read through a folder that contains multiple .sql files, iterate over them in a PowerShell script, read the data from Oracle, and export it to CSV.
If a .sql file contains a single-line statement, the code works fine. If the file contains a multi-line statement (as it almost always does),
PowerShell errors out saying
"Get-DataTable : Cannot process argument transformation on parameter 'sql'. Cannot convert value to type System.String."
Could you please help me resolve this issue? Below is a snapshot of my code.
function Get-DataTable {
    [CmdletBinding()]
    Param(
        [Parameter(Mandatory=$true)]
        [Oracle.DataAccess.Client.OracleConnection]$conn,
        [Parameter(Mandatory=$true)]
        [string]$sql
    )
    $cmd = New-Object Oracle.DataAccess.Client.OracleCommand($sql, $conn)
    $da = New-Object Oracle.DataAccess.Client.OracleDataAdapter($cmd)
    $dt = New-Object System.Data.DataTable
    [void]$da.Fill($dt)
    return ,$dt
}
foreach ($file in Get-ChildItem -Path $ScriptsDirectory -Filter *.sql | Sort-Object -Descending)
{
    $SQLquery = Get-Content "$ScriptsDirectory\$file"
    echo $SQLquery
    $fileName = $file.Name.Split(".")[0]
    $dt = Get-DataTable $conn $SQLquery
    Write-Host "Retrieved records:" $dt.Rows.Count -ForegroundColor Green
    $dt | Export-Csv -NoTypeInformation -LiteralPath $WorkingDirectory\$fileName.csv
    Write-Host "Output Written to :" $WorkingDirectory\$fileName.csv -ForegroundColor Green
}
Get-Content returns an array of lines. If you're using PowerShell v3 or higher you can use the -Raw parameter to read the file as one big string:
$SQLquery = get-content "$ScriptsDirectory\$file" -Raw
Alternatively you could re-join the array with line endings:
$SQLquery = $SQLquery -join "`r`n"
Or you can read the file all at once with .NET classes:
$SQLquery = [System.IO.File]::ReadAllText("$ScriptsDirectory\$file")
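Applied to the original loop, a minimal sketch (assuming PowerShell v3+ and the same $ScriptsDirectory, $WorkingDirectory and $conn variables as in the question) would look like this:
foreach ($file in Get-ChildItem -Path $ScriptsDirectory -Filter *.sql | Sort-Object Name -Descending) {
    # Read the whole file as one string so it binds cleanly to the [string]$sql parameter
    $SQLquery = Get-Content -LiteralPath $file.FullName -Raw
    $dt = Get-DataTable $conn $SQLquery
    $dt | Export-Csv -NoTypeInformation -LiteralPath "$WorkingDirectory\$($file.BaseName).csv"
}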
Hi, I am running the following query in PowerShell:
Import-Module Hall.psm1
$Database = 'Report'
$Server = '192.168.1.2'
$Query = 'SELECT all * FROM [Report].[dbo].[TestView]'
$LogLocation = "\\Report\LogFile.csv"
$DynamicYear = (Get-Date).Year
$DynamicMonth = (Get-Culture).DateTimeFormat.GetMonthName((Get-Date).Month)
$FileDestination = "\\Report\MONTHLY REPORTS\"+$DynamicYear+"\"+$DynamicMonth+"\"
$Outputfilename='TestView-'+(Get-Date).ToString('MM-dd-yyyy')+'.csv'
$LocalCreate = 'C:\Scripts\LocalCreate\'
$FolderPathExtension = "Microsoft.PowerShell.Core\FileSystem::"
$CodeDestination = $FolderPathExtension+$FileDestination
$filedest=$LocalCreate+$outputfilename
$Logfile = $FolderPathExtension+$LogLocation
Invoke-sqlcmd -querytimeout 120 -query "
$Query
" -database $database -serverinstance $server |
ConvertTo-Csv -NoTypeInformation | # Convert to CSV string data without the type metadata
Select-Object -Skip 0 | # Skip 0 lines (the header row is kept)
% {$_ -replace '"',''} | # Remove all quote marks
Set-Content -Path $filedest
(gc $filedest) | ? {$_.trim() -ne "" } | set-content $filedest
if(Test-Path ($filedest)) {
Move-Item -Path $filedest -Destination $CodeDestination -Force
$LogType = 'INFO'
$LogEntry = "$filedest MovedTo $CodeDestination"
Write-Log -Message $LogEntry -Level $LogType -Logfile $Logfile
}
This works fine without any issue when the query returns data.
However, if the query does not return any data, it does not create a .csv. How can I get it to create a blank .csv, or a .csv with headers only?
Use New-Item -ItemType File -Path $filedest before your Invoke-Sqlcmd / ConvertTo-Csv pipeline.
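For example, a minimal sketch based on that suggestion, reusing the $filedest variable from the question:
# Pre-create an empty file so $filedest exists even when the query returns no rows
New-Item -ItemType File -Path $filedest -Force | Out-Null
# ...then run the existing Invoke-Sqlcmd | ConvertTo-Csv | Set-Content pipeline unchanged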
I am not an expert in PowerShell. I need a script/approach that handles the requirement below.
I have a list of files in a folder, with file names like these:
001_File.sql
002_File.sql
003_File.sql
004_File.sql
Also, I have a table in SQL Server which holds the file name information.
TableName: Executedfile, with a column FileName containing:
002_File.sql
004_File.sql
My requirement is to read the files that are available in the folder but not in the table.
So I have to read only these files:
001_File.sql
003_File.sql
Now, I need to execute these two files in sequential order under the same transaction on SQL Server, because I need to roll back everything if any error occurs.
As of now I have written the PowerShell below.
$QueryResult = Invoke-Sqlcmd -ServerInstance 'MyServer' -Database 'MyDb' -Query "SELECT DISTINCT FNames from TableName"
Get-ChildItem "E:\Testing\" -Filter *.sql | Sort-Object $_.Name|
Foreach-Object {
$FileFullpath= $_.FullName
Write-Host $FileFullpath
$FileName = $_.Name
Write-Host $FileName
if(!$QueryResult.FName.Contains($FileName))
{
invoke-sqlcmd -inputfile $FileFullpath -serverinstance "servername\serverinstance" -database "mydatabase"
}
}
Please suggest a script.
Challenges:
How to read the files in sequential order, given the leading zeros in the names. Will the Sort-Object $_.Name above sort them correctly?
How to execute the whole list of files under one transaction.
Thanks
Finally I did something like this.
$QueryResult = Invoke-Sqlcmd -ServerInstance 'MyServer' -Database 'MyDb' -Query "SELECT DISTINCT FNames from TableName"
$FullScript = @()
$FullScript += "BEGIN TRANSACTION;"
Get-ChildItem "E:\Testing\" -Filter *.sql | Sort-Object $_.Name|
Foreach-Object {
if(!$QueryResult.FName.Contains($_.Name))
{
$FullScript += Get-Content $_.FullName
}
}
$FullScript += "COMMIT TRANSACTION;"
sqlcmd -S localhost -d test -Q "$FullScript"
Try this...
#get list of filenames from database...
$QueryResult = Invoke-Sqlcmd -ServerInstance 'MyServer' -Database 'MyDB' -Query "SELECT DISTINCT FileName from TableName" | Select-Object -ExpandProperty FileName
#get files from folder whose names are not in $queryresult...
$files = Get-ChildItem -Path E:\Testing -Filter *.sql | ? {(!($QueryResult.Contains($_.Name)))} | Sort-Object Name
#get the content of each $file and replace "GO" with empty string, etc...
$queries = @()
foreach ($file in $files) {
$queries += (Get-Content $file.FullName).replace("GO","")
}
#join each query into a single T-SQL statement...
$singleTransaction = $queries -join ";"
#execute statement...
Invoke-Sqlcmd -ServerInstance 'SERVER' -Database 'DB' -Query $singleTransaction
To really achieve a 'single transaction', you may need a consistent input format that you can modify and combine into one statement. I am not sure exactly how you will need to do that for your scripts.
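If it helps, here is a minimal sketch of one possible way to get a true single transaction, reusing the $files list from above and an explicit SqlTransaction from System.Data.SqlClient; the server and database names are placeholders, and it assumes the individual scripts contain no GO batch separators:
$connection = New-Object System.Data.SqlClient.SqlConnection("Server=MyServer;Database=MyDB;Integrated Security=SSPI")
$connection.Open()
$transaction = $connection.BeginTransaction()
try {
    foreach ($file in $files) {
        # Each file runs as one batch inside the same transaction
        $cmd = $connection.CreateCommand()
        $cmd.Transaction = $transaction
        $cmd.CommandText = Get-Content $file.FullName -Raw
        [void]$cmd.ExecuteNonQuery()
    }
    $transaction.Commit()
}
catch {
    $transaction.Rollback()
    throw
}
finally {
    $connection.Close()
}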
Finally, I wrote the script using SMO objects to handle the GO statements and transactions.
$SqlFilePath = "D:\Roshan\Testing\SQL\"
$serverName = "MyServer"
$databaseName = "MyDB"
$QueryResult = Invoke-Sqlcmd -ServerInstance $serverName -Database $databaseName -Query "SELECT DISTINCT FName from dbo.TableName" -AS DataRows
$connection = new-object system.data.SqlClient.SQLConnection("Data Source=$serverName;Integrated Security=SSPI;Initial Catalog=$databaseName;Connection Timeout=600;Max Pool Size=10");
$Server = new-Object Microsoft.SqlServer.Management.Smo.Server(New-Object Microsoft.SqlServer.Management.Common.ServerConnection($connection))
$script_contents ="SET XACT_ABORT ON
GO
BEGIN TRANSACTION
GO"
Get-ChildItem $SqlFilePath -Filter *.sql| Sort-Object $_.Name|
ForEach-Object {
if(!$QueryResult.FName.Contains($_.Name))
{
Write-Host $_.Name -ForegroundColor Magenta
#[string]$script_contents = Get-Content $_.FullName
$script_contents += [IO.File]::ReadAllText($_.FullName)
#Write-Host $script_contents
#$Server.ConnectionContext.ExecuteNonQuery($script_contents)
}
}
$script_contents+= " COMMIT TRANSACTION;"
$Server.ConnectionContext.ExecuteNonQuery($script_contents)
You can write something like this in your shell script:
select filename from tablename; >> file.out
--->002_File.sql
ls *.sql | grep -v -f file.out >> excludedfile.out
I have a script that executes a stored procedure on a SQL Server which returns XML. I then have a function to format the XML in PowerShell so it is readable. When I open the XML in Chrome I get this error:
This page contains the following errors:
error on line 149 at column 27: Encoding error
Below is a rendering of the page up to the first error.
I think I may need to encode it in UTF-8, but I am unsure where to do that in my code. Any help to rectify the error, or on how to do the encoding, is appreciated.
Here is the PowerShell that I run to get the XML file:
function Format-XML {
    [CmdletBinding()]
    Param ([Parameter(ValueFromPipeline=$true,Mandatory=$true)][string]$xmlcontent)
    $xmldoc = New-Object -TypeName System.Xml.XmlDocument
    $xmldoc.LoadXml($xmlcontent)
    $sw = New-Object System.IO.StringWriter
    $writer = New-Object System.Xml.XmlTextWriter($sw)
    $writer.Formatting = [System.Xml.Formatting]::Indented
    $xmldoc.WriteContentTo($writer)
    $sw.ToString()
}
$Date = Get-Date -format "yyyyMMdd_HHmm"
$File = "C:\Temp\MyFile"+$Date+".xml"
$Query = "EXEC dbo.usp_MyProc"
$resultRow = Invoke-Sqlcmd -Query $Query -database MyDatabase -ServerInstance MyServer
Format-xml $resultRow['results'] | Set-Content -Path $File -Force
Comment "Try appending -Encoding UTF8 to your last line" from Martin Brandi worked
I need a way to execute SQL (by importing a .sql script) on a remote Oracle DB using PowerShell. In addition, I am also trying to output the results in .xls format to a desired folder location. To add to the fun, I would also want to run this task on an automatic schedule. Please help!
This is how far I have gotten:
[System.Reflection.Assembly]::LoadWithPartialName ("System.Data.OracleClient") | Out-Null
$connection = "my TNS entry"
$queryString = "my SQL query"
$command = new-Object System.Data.OracleClient.OracleCommand($queryString, $connection)
$connection.Open()
$reader = $command.ExecuteReader()
$tempArr = @()
#read all rows into a hash table
while ($reader.Read())
{
$row = #{}
for ($i = 0; $i -lt $reader.FieldCount; $i++)
{
$row[$reader.GetName($i)] = $reader.GetValue($i)
}
#convert hashtable into an array of PSObjects
$tempArr+= new-object psobject -property $row
}
$connection.Close()
write-host "Conn State--> " $connection.State
$tempArr | Export-Csv "my File Path" -NoTypeInformation
$Error[0] | fl -Force
The easiest way is to drive sqlplus.exe from PowerShell. To execute the SQL and get the output you do this:
$result = sqlplus.exe "@file.sql" [credentials/server]
#parse result into CSV here which can be loaded into excel
You can schedule this script with something like:
schtasks.exe /create /TN sqlplus /TR "Powershell -File script.ps1" /ST 10 ...
For this you need to have sqlplus installed (it comes with Oracle Express, and you can also install it on its own). This obviously introduces a dependency that is not strictly needed, but sqlplus can also be used to examine the database and do all kinds of other things, which might be a good thing to have around.
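As a rough sketch of the parsing step: newer SQL*Plus versions (12.2 and later) can emit CSV directly via SET MARKUP CSV, which avoids hand-parsing; the file names, credentials and TNS alias below are placeholders:
# wrapper.sql sets the CSV options and then runs the real query script (hypothetical paths)
$wrapper = @"
SET MARKUP CSV ON
SET FEEDBACK OFF
SPOOL C:\Temp\result.csv
@C:\Scripts\myquery.sql
SPOOL OFF
EXIT
"@
Set-Content -Path C:\Temp\wrapper.sql -Value $wrapper
# -S runs silently; user/password@tnsalias stands in for real credentials
sqlplus.exe -S user/password@tnsalias "@C:\Temp\wrapper.sql"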
I want to read a file that contains a line-separated list of *.sql file names (all located in the same directory) and execute the following:
$reader = [System.IO.File]::OpenText("_Scripts.txt")
try {
for(;;) {
$line = $reader.ReadLine()
if ($line -eq $null) { break }
#output
$out = $line.split(".")[0] + ".txt" ;
# change the -serverinstance value according to your environment
invoke-sqlcmd -inputfile $line -serverinstance "." | format-table | out-file -filePath $out
$line
}
}
finally {
$reader.Close()
}
I'm trying to execute this script file by using a batch file containing the command:
powershell.exe -ExecutionPolicy Bypass -Command "_scripts.ps1"
but I get the error shown below:
Can anyone help me fix my ps1 script please?
This works for me:
$lines = Get-Content C:\Temp\TEST\_Scripts.txt
ForEach ($line in $lines)
{
$out = $line.split(".")[0] + ".txt" ;
Invoke-Sqlcmd -InputFile $line -ServerInstance "localhost" -Database "master" | Format-Table | Out-File -FilePath $out
}
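If the batch file call itself is still a problem, one common variant (the path is a placeholder, not from the question) is to launch the script with -File and its full path instead of -Command:
powershell.exe -ExecutionPolicy Bypass -File "C:\Temp\TEST\_scripts.ps1"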