I've built a PowerShell script that runs and emails the results of SQL queries read from a SQL file:
function global:readscript($path)
{
    # using UTF7 encoding to allow selects with accented/French/Russian/Chinese/... chars
    $inenc = [System.Text.Encoding]::UTF7
    $reader = New-Object System.IO.StreamReader($path, $inenc)
    $finalquery = ""
    while ($line = $reader.ReadLine())
    {
        $finalquery += $line
    }
    $reader.Close()
    return $finalquery
}
function global:get-result($query)
{
    $oracleconnection = New-Object Oracle.ManagedDataAccess.Client.OracleConnection
    $oracleconnection.ConnectionString = $connectionstring
    $oracleconnection.Open()
    $oraclecommand = $oracleconnection.CreateCommand()
    $oraclecommand.CommandText = $query
    $reader = $oraclecommand.ExecuteReader()
    # ...etc
}
$scriptquery = readscript "d:\mysqlquery.sql"
get-result($scriptquery)
Everything is working fine so far, except for this one SQL script that contains the "+" sign for the purpose of a calculation.
Let's say the file mysqlquery.sql contains a line such as:
(SELECT COUNT(a.ID)) + (SELECT COUNT(b.ID))
I can see in the console it's being translated to
(SELECT COUNT(a.ID)) (SELECT COUNT(b.ID))
and of course Oracle throws the annoying "missing right parenthesis" exception.
How do I escape this plus sign when reading it from a text file?
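The plus sign isn't being stripped by PowerShell at all; it's being eaten by the UTF-7 decoder. In UTF-7, "+" marks the start of a modified-Base64 sequence (a literal plus is encoded as "+-"), so reading a file that isn't really UTF-7 silently consumes the "+" and the characters that follow it. If the file is actually saved as UTF-8, which also covers accented/French/Russian/Chinese characters, read it as UTF-8 instead. A minimal sketch, assuming the file is UTF-8 and PowerShell 3+ for -Raw:
function global:readscript($path)
{
    # UTF-8 covers the same accented/Cyrillic/CJK characters but leaves "+" alone.
    # -Raw returns the whole file as one string, which also preserves the line
    # breaks that the original += loop was dropping.
    return Get-Content -Path $path -Raw -Encoding UTF8
}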
I need to load a data table over an ODBC driver connection with PowerShell.
With OLE DB and SQL Server we can use bulk copy to insert data quickly.
Is there such a possibility with ODBC?
I'm using PowerShell because it should have the best support for this kind of operation,
but my current code doesn't utilize any of the DLLs.
So my code first has to create insert statements with two for loops, iterating over every row and holding it in memory,
and then construct an INSERT INTO with 1000 rows, and then repeat the same thing.
Am I doomed to something like this?
$Datatable = New-Object System.Data.DataTable
$tabledump = $src_cmd.ExecuteReader()
$Datatable.Load($tabledump)
foreach ($item in $Datatable.Rows) {
    $f += 1
    $val = ""   # reset per row, or values accumulate across rows
    for ($i = 0; $i -lt $item.ItemArray.Length; $i++) {
        $items = $item[$i] -replace "'" , "''"   # escape embedded single quotes
        $val += "'" + $items + "',"
    }
    $vals += "(" + $val.TrimEnd(",") + "),"   # each row becomes one (...) value group
    if ($f % 1000 -eq 0 -or $f -eq $row_cnt) {
        $values = [System.String]::Join(" ", $vals)
        $values = $values.TrimEnd(",")
        $cols = [System.String]::Join(",", $columns)
        $postgresCommand = "INSERT INTO $dst_schema.$dst_table ($cols) VALUES $values"
        $dest_cmd_.CommandText = $postgresCommand
        $dest_cmd_.ExecuteNonQuery()
        $vals = @()   # start the next 1000-row batch empty
    }
}
Bad code, I admit; any advice on code composition is welcome.
You can use the Get-OdbcDsn cmdlet to retrieve the values of the ODBC connections and use one with a query:
$conn = New-Object System.Data.Odbc.OdbcConnection   # the connection object must be created first
$conn.ConnectionString = "DSN=$dsn;"
$cmd = New-Object System.Data.Odbc.OdbcCommand($query, $conn)
$conn.Open()
$cmd.ExecuteNonQuery()
$conn.Close()
https://www.andersrodland.com/working-with-odbc-connections-in-powershell/
But the ODBC provider doesn't do bulk copy:
https://learn.microsoft.com/en-us/sql/relational-databases/native-client-odbc-bulk-copy-operations/performing-bulk-copy-operations-odbc?view=sql-server-ver15
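When bulk copy isn't available, the usual fallback is a single transaction around a prepared, parameterized command; committing once per batch instead of once per row is typically where most of the time goes. A minimal sketch, reusing $Datatable from the question (the table and column names here are placeholders, not from the original post):
$conn = New-Object System.Data.Odbc.OdbcConnection("DSN=$dsn;")
$conn.Open()
$tx  = $conn.BeginTransaction()   # one transaction per batch
$cmd = $conn.CreateCommand()
$cmd.Transaction = $tx
$cmd.CommandText = "INSERT INTO dst_table (col1, col2) VALUES (?, ?)"   # ODBC uses positional ? markers
[void]$cmd.Parameters.Add("p1", [System.Data.Odbc.OdbcType]::VarChar)
[void]$cmd.Parameters.Add("p2", [System.Data.Odbc.OdbcType]::VarChar)
foreach ($row in $Datatable.Rows) {
    $cmd.Parameters[0].Value = [string]$row[0]
    $cmd.Parameters[1].Value = [string]$row[1]
    [void]$cmd.ExecuteNonQuery()
}
$tx.Commit()
$conn.Close()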
I know this post is not new, but I've been fiddling around looking for a solution and also found nothing; however, this post gave me a couple of insights.
First: there is no such thing as 'bad code'. If it works, it's not bad; heck, even if it didn't work but helped with something...
Alright, what I did is not the best solution, but I'm trying to import Active Directory data into PostgreSQL, so...
I noticed that you're trying with pgsql as well, so you can use the COPY statement.
https://www.postgresql.org/docs/9.2/sql-copy.html
https://www.postgresqltutorial.com/import-csv-file-into-posgresql-table/
In my case I used it with a CSV file:
*Assuming you have the pgsql ODBC driver installed
$DBConn = New-Object System.Data.Odbc.OdbcConnection
# Note: ConvertFrom-SecureString returns an encrypted blob, not the plain-text
# password, so it can't go into a connection string; convert it back first.
$PlainPassword = [System.Net.NetworkCredential]::new('', $Password).Password
$DBConnectionString = "Driver={PostgreSQL UNICODE(x64)};Server=$ServerInstance;Port=$Port;Database=$Database;Uid=$Username;Pwd=$PlainPassword;"
$DBConn.ConnectionString = $DBConnectionString
try
{
$ADFObject = @()
$ADComputers = Get-ADComputer -Filter * -SearchBase "OU=Some,OU=OrgU,OU=On,DC=Domain,DC=com" -Properties Description,DistinguishedName,Enabled,LastLogonTimestamp,modifyTimestamp,Name,ObjectGUID | Select-Object Description,DistinguishedName,Enabled,LastLogonTimestamp,modifyTimestamp,Name,ObjectGUID
foreach ($ADComputer in $ADComputers) {
switch ($ADComputer.Enabled) {
$true {
$ADEnabled = 1
}
$false {
$ADEnabled = 0
}
}
$ADFObject += [PSCustomObject] @{
ADName = $ADComputer.Name
ADInsert_Time = Get-Date
ADEnabled = $ADEnabled
ADDistinguishedName = $ADComputer.DistinguishedName
ADObjectGUID = $ADComputer.ObjectGUID
ADLastLogonTimestamp = [datetime]::FromFileTime($ADComputer.LastLogonTimestamp)
ADModifyTimestamp = $ADComputer.modifyTimestamp
ADDescription = $ADComputer.Description
}
}
$ADFObject | Export-Csv $Env:TEMP\TempPsAd.csv -Delimiter ',' -NoTypeInformation
docker cp $Env:TEMP\TempPsAd.csv postgres_docker:/media/TempPsAd.csv
$DBConn.Open()
$DBCmd = $DBConn.CreateCommand()
$DBCmd.CommandText = @"
COPY AD_Devices (ADName,ADInsert_Time,ADEnabled,ADDistinguishedName,ADObjectGUID,ADLastLogonTimestamp,ADModifyTimestamp,ADDescription)
FROM '/media/TempPsAd.csv'
DELIMITER ','
CSV HEADER
"@
[void]$DBCmd.ExecuteNonQuery()   # COPY returns no result set, so a reader isn't needed
$DBConn.Close()
docker exec postgres_docker rm -rf /media/TempPsAd.csv
Remove-Item $Env:TEMP\TempPsAd.csv -Force
}
catch
{
Write-Error "$($_.Exception.Message)"
continue
}
Hope it helps!
Cheers!
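One note on the docker cp step: COPY ... FROM '/path' reads the file on the server, which is why the CSV has to be copied into the container first. psql's client-side \copy variant reads the file from wherever psql runs, so if the psql client is installed locally, the copy can be skipped entirely. A hedged one-liner sketch, reusing the variables above:
# \copy is executed by the psql client, so the CSV can stay on the local machine
psql -h $ServerInstance -p $Port -U $Username -d $Database `
    -c "\copy AD_Devices FROM '$Env:TEMP\TempPsAd.csv' DELIMITER ',' CSV HEADER"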
I have a .xlsx file which was made into a data table by the OLE DB provider. Now I want to add values to that .xlsx file based on the SQL table data I have
(which is also converted into a CSV file, Book1.csv).
The SQL table consists of name and notes,
where the name column is the same in both the .xlsx file and the SQL variable $sql.
I want to add the Clo_notes value to the F column of the .xlsx file whenever the value of Name matches the value in column "A" of the SQL table. What I wrote below is very slow and not effective.
Any help would be highly appreciated.
$Excel = New-Object -ComObject Excel.Application
$Workbook = $Excel.Workbooks.Open('C:\Users\VIKRAM\Documents\Sample - Superstore.xlsx')
$workSheet = $Workbook.Sheets.Item(1)
$WorkSheet.Name
$Found = $WorkSheet.Cells.Find('$Data.number')
$Found.row
$Found.text
$Excel1 = New-Object -ComObject Excel.Application
$file = $Excel1.Workbooks.Open('C:\Users\VIKRAM\Documents\Book1.xlsx')
$ff=$file.Sheets.Item(1)
$ff.Name
$ff1=$ff.Range("A1").entirecolumn
$ff1.Value2
foreach ($line in $ff1.value2){
if( $found.text -eq $line)
{
Write-Host "success"
$fff=$ff1.Row
$WorkSheet.Cells.item($fff,20) =$ff.cells.item($fff,2)
}
}
Data in .xlsx file
Number Priority Comment
612721 4 - High
Data in Book1.csv
Number Clo_notes
612721 Order has been closed
I need to update the Clo_notes value into the Comment column of the .xlsx file: if the "Number" column in the two files matches, write Clo_notes into the corresponding Comment cell.
Make sure to release any COM objects, or you'll have orphaned Excel processes.
You might try something like this. I was able to write the Clo_notes value into column 6 as you were requesting:
## function to close all com objects
function Release-Ref ($ref) {
([System.Runtime.InteropServices.Marshal]::ReleaseComObject([System.__ComObject]$ref) -gt 0)
[System.GC]::Collect()
[System.GC]::WaitForPendingFinalizers()
}
## open Excel data
$Excel = New-Object -ComObject Excel.Application
$Workbook = $Excel.Workbooks.Open('C:\Users\51290\Documents\_temp\StackOverflowAnswers\Excel.xlsx')
$workSheet = $Workbook.Sheets.Item(1)
$WorkSheet.Name
## open SQL data
$Excel1 = New-Object -ComObject Excel.Application
$file = $Excel1.Workbooks.Open('C:\Users\51290\Documents\_temp\StackOverflowAnswers\SQL.xlsx')
$sheetSQL = $file.Sheets.Item(1)
$dataSQL = $sheetSQL.Range("A1").currentregion
$foundNumber = 0
$row_idx = 1
foreach ($row in $WorkSheet.Rows) {
"row_idx = " + $row_idx
if ($row_idx -gt 1) {
$foundNumber = $row.Cells.Item(1,1).Value2
"foundNumber = " + $foundNumber
if ($foundNumber -eq "" -or $foundNumber -eq $null) {
Break
}
foreach ($cell in $dataSQL.Cells) {
if ($cell.Row -gt 1) {
if ($cell.Column -eq 1 -and $cell.Value2 -eq $foundNumber) {
$clo_notes = $sheetSQL.Cells.Item($cell.Row, 2).Value2
Write-Host "success"
$WorkSheet.Cells.item($row_idx, 6).Value2 = $clo_notes
}
}
}
}
$row_idx++
}
$Excel.Quit()
$Excel1.Quit()
## close all object references
Release-Ref($WorkSheet)
Release-Ref($WorkBook)
Release-Ref($Excel)
Release-Ref($Excel1)
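If the sheets are large, the per-cell COM calls dominate the runtime. A faster variant, as a sketch (untested; it assumes headers in row 1, Number in column A of both sheets, and Clo_notes in column B of the SQL sheet), reads each used range into memory once via Value2, which returns a 2-D array for multi-cell ranges, and matches through a hashtable:
$dataExcel = $WorkSheet.UsedRange.Value2   # object[,] with 1-based indices
$dataSQL   = $sheetSQL.UsedRange.Value2

# Build a Number -> Clo_notes lookup from the SQL sheet
$notes = @{}
for ($r = 2; $r -le $dataSQL.GetLength(0); $r++) {
    $notes[[string]$dataSQL[$r, 1]] = $dataSQL[$r, 2]
}

# Write matches back into column 6 of the target sheet
for ($r = 2; $r -le $dataExcel.GetLength(0); $r++) {
    $key = [string]$dataExcel[$r, 1]
    if ($notes.ContainsKey($key)) {
        $WorkSheet.Cells.Item($r, 6).Value2 = $notes[$key]
    }
}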
I am trying to load a 160 GB CSV file into SQL Server using a PowerShell script I got from GitHub, and I get this error:
Exception calling "Add" with "1" argument(s): "Input array is longer than the number of columns in this table."
At C:\b.ps1:54 char:26
+ [void]$datatable.Rows.Add <<<< ($line.Split($delimiter))
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : DotNetMethodException
So I checked the same code with a small three-line CSV: all of the columns match, there is a header in the first row, and there are no extra delimiters, so I'm not sure why I am getting this error.
The code is below:
<# 8-faster-runspaces.ps1 #>
# Set CSV attributes
$csv = "M:\d\s.txt"
$delimiter = "`t"
# Set connstring
$connstring = "Data Source=.;Integrated Security=true;Initial Catalog=PresentationOptimized;PACKET SIZE=32767;"
# Set batchsize to 2000
$batchsize = 2000
# Create the datatable
$datatable = New-Object System.Data.DataTable
# Add generic columns
$columns = (Get-Content $csv -First 1).Split($delimiter)
foreach ($column in $columns) {
[void]$datatable.Columns.Add()
}
# Setup runspace pool and the scriptblock that runs inside each runspace
$pool = [RunspaceFactory]::CreateRunspacePool(1,5)
$pool.ApartmentState = "MTA"
$pool.Open()
$runspaces = @()
# Setup scriptblock. This is the workhorse. Think of it as a function.
$scriptblock = {
Param (
[string]$connstring,
[object]$dtbatch,
[int]$batchsize
)
$bulkcopy = New-Object Data.SqlClient.SqlBulkCopy($connstring,"TableLock")
$bulkcopy.DestinationTableName = "abc"
$bulkcopy.BatchSize = $batchsize
$bulkcopy.WriteToServer($dtbatch)
$bulkcopy.Close()
$dtbatch.Clear()
$bulkcopy.Dispose()
$dtbatch.Dispose()
}
# Start timer
$time = [System.Diagnostics.Stopwatch]::StartNew()
# Open the text file from disk and process.
$reader = New-Object System.IO.StreamReader($csv)
Write-Output "Starting insert.."
while ((($line = $reader.ReadLine()) -ne $null))
{
[void]$datatable.Rows.Add($line.Split($delimiter))
if ($datatable.rows.count % $batchsize -eq 0)
{
$runspace = [PowerShell]::Create()
[void]$runspace.AddScript($scriptblock)
[void]$runspace.AddArgument($connstring)
[void]$runspace.AddArgument($datatable) # <-- Send datatable
[void]$runspace.AddArgument($batchsize)
$runspace.RunspacePool = $pool
$runspaces += [PSCustomObject]@{ Pipe = $runspace; Status = $runspace.BeginInvoke() }
# Overwrite object with a shell of itself
$datatable = $datatable.Clone() # <-- Create new datatable object
}
}
# Close the file
$reader.Close()
# Wait for runspaces to complete
while ($runspaces.Status.IsCompleted -contains $false) {} # loop until every runspace reports completed
# End timer
$secs = $time.Elapsed.TotalSeconds
# Cleanup runspaces
foreach ($runspace in $runspaces ) {
[void]$runspace.Pipe.EndInvoke($runspace.Status) # EndInvoke method retrieves the results of the asynchronous call
$runspace.Pipe.Dispose()
}
# Cleanup runspace pool
$pool.Close()
$pool.Dispose()
# Cleanup SQL Connections
[System.Data.SqlClient.SqlConnection]::ClearAllPools()
# Done! Format output then display
$totalrows = 1000000
$rs = "{0:N0}" -f [int]($totalrows / $secs)
$rm = "{0:N0}" -f [int]($totalrows / $secs * 60)
$mill = "{0:N0}" -f $totalrows
Write-Output "$mill rows imported in $([math]::round($secs,2)) seconds ($rs rows/sec and $rm rows/min)"
Working with a 160 GB input file is going to be a pain. You can't really load it into any kind of editor, or at least you can't analyze such a mass of data without some serious automation.
As per the comments, it seems that the data has some quality issues. To find the offending data, you can try a binary search. This approach shrinks the data fast. Like so:
1) Split the file in about two equal chunks.
2) Try and load first chunk.
3) If successful, process the second chunk. If not, see 6).
4) Try and load second chunk.
5) If successful, the files are valid, but you have another data quality issue. Start looking into other causes. If not, see 6).
6) If either load failed, repeat from the beginning, using the failed chunk as the input file.
7) Repeat until you narrow down the offending row(s).
Another method would be using an ETL tool like SSIS. Configure the package to redirect invalid rows into an error log to see what data is not working properly.
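If the fields are never quoted (so the delimiter can't legitimately appear inside a value), a single streaming pass can point at the offending lines directly instead of bisecting 160 GB of data. A sketch, reusing $csv and $delimiter from the script above:
$reader = New-Object System.IO.StreamReader($csv)
$expected = $reader.ReadLine().Split($delimiter).Count   # column count from the header row
$lineno = 1
while (($line = $reader.ReadLine()) -ne $null) {
    $lineno++
    $count = $line.Split($delimiter).Count
    if ($count -ne $expected) {
        Write-Output "Line ${lineno}: $count fields, expected $expected"
    }
}
$reader.Close()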
Good morning Stack Overflow. I have a PowerShell script that executes a SQL query against an Oracle database and passes the results to a local shell command. It works, mostly. What is happening is that some of the results are being dropped, and the only significance I can see about those rows is that a couple of their columns have null values (but only 2 of the 8 columns being returned). When the query is executed in SQL Developer, I get all expected results. This issue applies to the $eventcheck switch; the $statuscheck switch works fine. The PowerShell script is below:
param(
[parameter(mandatory=$True)]$username,
[parameter(mandatory=$True)]$password,
$paramreport,
$paramsite,
[switch]$eventcheck,
[switch]$statuscheck
)
$qry1 = Get-Content .\vantageplus_processing.sql
$qry2 = @"
select max(TO_CHAR(VP_ACTUAL_RPT_DETAILS.ETLLOADER_OUT,'YYYYMMDDHH24MISS')) Completed
from MONITOR.VP_ACTUAL_RPT_DETAILS
where VP_ACTUAL_RPT_DETAILS.REPORTNUMBER = '$($paramreport)' and VP_ACTUAL_RPT_DETAILS.SITE_NAME = '$($paramsite)'
order by completed desc
"@
$connString = @"
Provider=OraOLEDB.Oracle;Data Source=(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST="HOST")(PORT="1521"))
(CONNECT_DATA=(SERVICE_NAME="SERVICE")));User ID="$username";Password="$password"
"@
function Get-OLEDBData ($connectstring, $sql) {
$OLEDBConn = New-Object System.Data.OleDb.OleDbConnection($connectstring)
$OLEDBConn.open()
$readcmd = New-Object system.Data.OleDb.OleDbCommand($sql,$OLEDBConn)
$readcmd.CommandTimeout = '300'
$da = New-Object system.Data.OleDb.OleDbDataAdapter($readcmd)
$dt = New-Object System.Data.DataTable
[void]$da.fill($dt)
$OLEDBConn.close()
return $dt
}
if ($eventcheck)
{
$output = Get-OLEDBData $connString $qry1
ForEach ($lines in $output)
{
start-process -NoNewWindow -FilePath msend.exe -ArgumentList @"
-n bem_snmp01 -r CRITICAL -a CSG_VANTAGE_PLUS -m "The report $($lines.RPT) for site $($lines.SITE) has not been loaded by $($lines.EXPDTE)" -b "vp_reportnumber='$($lines.RPT)'; vp_sitename='$($lines.SITE)'; vp_expectedcomplete='$($lines.SIMEXPECTED)'; csg_environment='Production';"
"@ # KEEP THIS TERMINATOR AT THE BEGINNING OF THE LINE
}
}
if ($statuscheck)
{
$output = Get-OLEDBData $connString $qry2
# $output | foreach {$_.completed}
write-host -nonewline $output.completed
}
So that you can see the data, below is a CSV output from Oracle SQL Developer with the ACTUAL results of the query referenced by my script. Of these results, lines 4, 5, 6, 7, 8, and 10 are the only ones being passed along in the ForEach loop; the others are not even captured in the $output array. If anyone can advise a method for getting all of the results passed along, I would appreciate it.
"SITE","RPT","LSDTE","EXPDTE","CIMEXPECTED","EXPECTED_FREQUENCY","DATE_TIMING","ETME"
"chrcse","CPHM-054","","2014/09/21 12:00:00","20140921120000","MONTHLY","1",
"chrcse","CPSM-226","","2014/09/21 12:00:00","20140921120000","MONTHLY","1",
"dsh","CPSD-176","2014/09/28 23:20:04","2014/09/30 04:00:00","20140930040000","DAILY","1",1.41637731481481481481481481481481481481
"dsh","CPSD-178","2014/09/28 23:20:11","2014/09/30 04:00:00","20140930040000","DAILY","1",1.4162962962962962962962962962962962963
"exp","CPSM-610","2014/08/22 06:42:10","2014/09/21 09:00:00","20140921090000","MONTHLY","1",39.10936342592592592592592592592592592593
"mdc","CPKD-264","2014/09/24 00:44:32","2014/09/30 04:00:00","20140930040000","DAILY","1",6.35771990740740740740740740740740740741
"nea","CPKD-264","2014/09/24 01:00:31","2014/09/30 03:00:00","20140930030000","DAILY","1",6.34662037037037037037037037037037037037
"twtla","CPOD-034","","2014/09/29 23:00:00","20140929230000","DAILY","0",
"twtla","CPPE-002","2014/09/29 02:40:35","2014/09/30 06:00:00","20140930060000","DAILY","1",1.27712962962962962962962962962962962963
"twtla","CPXX-004","","2014/09/29 23:00:00","20140929230000","DAILY","0",
It appears, actually, that a comment in the query was somehow causing this issue. I removed it and the results started coming back normally. I have no idea why this would be the case, unless it has to do with the way the import works (does it import everything as one line?). Either way, the results are normal now.
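That guess is probably right. Get-Content without -Raw returns an array of lines, and when that array is coerced to a single string (which happens when it's passed to the OleDbCommand constructor), the lines are joined with spaces, i.e. onto one line. Everything after a -- comment marker then becomes part of the comment. Reading the file as one string with its newlines intact avoids this; a sketch, assuming PowerShell 3+ for -Raw:
# -Raw keeps the newlines, so "--" comments end at the end of their own line
$qry1 = Get-Content .\vantageplus_processing.sql -Raw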
Currently I'm working on a PowerShell script which extracts info about the OS, hotfixes, and installed software.
Everything works well when I export everything to a txt file.
But my new task is to upload this information to a SQL server.
So I'm creating a foreach loop to go over the installed software and put everything into the server. PowerShell isn't showing any errors, but I can't get this data into SQL Server.
$conn = New-Object System.Data.SqlClient.SqlConnection
$conn.ConnectionString = "Data Source=myserver; Initial Catalog=table; Integrated Security=SSPI;"
$conn.Open()
$soft = Get-WmiObject Win32_Product
$cmd = New-Object System.Data.SqlClient.SqlCommand
$cmd.Connection = $conn
$cmd = $conn.CreateCommand()
foreach ($Software in $soft){
$query = "INSERT INTO dbo.mytable (SoftName, SoftVersion) VALUES ('$($Software.Name)', $($Software.Version))" }
$cmd.CommandText = $query
$result = $cmd.ExecuteNonQuery
$conn.close()
So the idea is that when I run this script, all software installed on the PC gets listed in SQL Server:
SoftName: SoftVersion:
Office 14.202
Sql 15
Thanks!
What do you get in the $result variable when you run your script?
I believe you get the signatures of the overloaded methods named ExecuteNonQuery. The proper method call looks like this:
$result = $cmd.ExecuteNonQuery()
If you see errors, I would recommend using the code below to get full information about them:
$ErrorActionPreference = "Stop"
$Error.Clear()
try {
# Your code
}
catch {
$Error | Format-List * -Force
}
You are overwriting the insert command instead of appending to it. In addition, as Ievgen points out, you are missing the method-call syntax:
foreach ($Software in $soft){
$query = "INSERT INTO dbo.mytable (SoftName, SoftVersion) VALUES ('$($Software.Name)', $($Software.Version))"
} # Oops!
# Now $query contains only the last insert statement.
$cmd.CommandText = $query
$result = $cmd.ExecuteNonQuery # Should be .ExecuteNonQuery()
$conn.close()
Try something like this:
foreach ($Software in $soft){
$query += "INSERT INTO dbo.mytable (SoftName, SoftVersion) VALUES ('$($Software.Name)', $($Software.Version));"
} # Appending insert statements
# Now $query should contain lots of insert statements
$cmd.CommandText = $query
$result = $cmd.ExecuteNonQuery()
$conn.close()
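One caveat on both versions: the interpolated INSERT breaks as soon as a product name contains a single quote, and $($Software.Version) is not quoted at all, so a version like 14.0.7015.1000 produces invalid SQL. A parameterized command sidesteps both problems; a sketch (the column sizes are assumptions):
$cmd.CommandText = "INSERT INTO dbo.mytable (SoftName, SoftVersion) VALUES (@name, @version)"
[void]$cmd.Parameters.Add("@name",    [System.Data.SqlDbType]::NVarChar, 256)
[void]$cmd.Parameters.Add("@version", [System.Data.SqlDbType]::NVarChar, 64)
foreach ($Software in $soft) {
    $cmd.Parameters["@name"].Value    = [string]$Software.Name      # [string] turns nulls into ""
    $cmd.Parameters["@version"].Value = [string]$Software.Version
    [void]$cmd.ExecuteNonQuery()
}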