Error "Input array is longer than the number of columns in this table" - PowerShell / SQL

I am trying to load a 160 GB CSV file into SQL Server using a PowerShell script I got from GitHub, and I get this error:
Exception calling "Add" with "1" argument(s): "Input array is longer than the number of columns in this table."
At C:\b.ps1:54 char:26
+ [void]$datatable.Rows.Add <<<< ($line.Split($delimiter))
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : DotNetMethodException
I checked the same code with a small three-line CSV: all of the columns match, the first row is a header, and there are no extra delimiters, so I am not sure why I am getting this error.
The code is below:
<# 8-faster-runspaces.ps1 #>
# Set CSV attributes
$csv = "M:\d\s.txt"
$delimiter = "`t"
# Set connstring
$connstring = "Data Source=.;Integrated Security=true;Initial Catalog=PresentationOptimized;PACKET SIZE=32767;"
# Set batchsize to 2000
$batchsize = 2000
# Create the datatable
$datatable = New-Object System.Data.DataTable
# Add generic columns
$columns = (Get-Content $csv -First 1).Split($delimiter)
foreach ($column in $columns) {
[void]$datatable.Columns.Add()
}
# Setup runspace pool and the scriptblock that runs inside each runspace
$pool = [RunspaceFactory]::CreateRunspacePool(1,5)
$pool.ApartmentState = "MTA"
$pool.Open()
$runspaces = @()
# Setup scriptblock. This is the workhorse. Think of it as a function.
$scriptblock = {
Param (
[string]$connstring,
[object]$dtbatch,
[int]$batchsize
)
$bulkcopy = New-Object Data.SqlClient.SqlBulkCopy($connstring,"TableLock")
$bulkcopy.DestinationTableName = "abc"
$bulkcopy.BatchSize = $batchsize
$bulkcopy.WriteToServer($dtbatch)
$bulkcopy.Close()
$dtbatch.Clear()
$bulkcopy.Dispose()
$dtbatch.Dispose()
}
# Start timer
$time = [System.Diagnostics.Stopwatch]::StartNew()
# Open the text file from disk and process.
$reader = New-Object System.IO.StreamReader($csv)
Write-Output "Starting insert.."
while ((($line = $reader.ReadLine()) -ne $null))
{
[void]$datatable.Rows.Add($line.Split($delimiter))
if ($datatable.rows.count % $batchsize -eq 0)
{
$runspace = [PowerShell]::Create()
[void]$runspace.AddScript($scriptblock)
[void]$runspace.AddArgument($connstring)
[void]$runspace.AddArgument($datatable) # <-- Send datatable
[void]$runspace.AddArgument($batchsize)
$runspace.RunspacePool = $pool
$runspaces += [PSCustomObject]@{ Pipe = $runspace; Status = $runspace.BeginInvoke() }
# Overwrite object with a shell of itself
$datatable = $datatable.Clone() # <-- Create new datatable object
}
}
# Close the file
$reader.Close()
# Wait for runspaces to complete
while ($runspaces.Status.IsCompleted -notcontains $true) {}
# End timer
$secs = $time.Elapsed.TotalSeconds
# Cleanup runspaces
foreach ($runspace in $runspaces ) {
[void]$runspace.Pipe.EndInvoke($runspace.Status) # EndInvoke method retrieves the results of the asynchronous call
$runspace.Pipe.Dispose()
}
# Cleanup runspace pool
$pool.Close()
$pool.Dispose()
# Cleanup SQL Connections
[System.Data.SqlClient.SqlConnection]::ClearAllPools()
# Done! Format output then display
$totalrows = 1000000
$rs = "{0:N0}" -f [int]($totalrows / $secs)
$rm = "{0:N0}" -f [int]($totalrows / $secs * 60)
$mill = "{0:N0}" -f $totalrows
Write-Output "$mill rows imported in $([math]::round($secs,2)) seconds ($rs rows/sec and $rm rows/min)"

Working with a 160 GB input file is going to be a pain. You can't really load it into any kind of editor, and you can't really analyze that much data without some serious automation.
As per the comments, the data seems to have some quality issues. To find the offending rows, you could try a binary search; this approach shrinks the search space quickly. Like so:
1) Split the file into two roughly equal chunks.
2) Try to load the first chunk.
3) If it loads, move on to the second chunk. If it fails, see 6).
4) Try to load the second chunk.
5) If it loads, both files are valid and you have some other data quality issue; start looking into other causes. If it fails, see 6).
6) Whichever chunk failed to load, start again from 1) using that chunk as the input file.
7) Repeat until you have narrowed down the offending row(s).
Another method would be to use an ETL tool such as SSIS. Configure the package to redirect invalid rows to an error output so you can see exactly which data is not loading properly.
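If you would rather stay in PowerShell, here is a minimal sketch, assuming a tab-delimited file with a header row as in the question (it reuses the question's $csv path and delimiter). It streams the file once and reports rows whose field count does not match the header, so it is safe to point at the full 160 GB file:
# Row-validation sketch - adjust the path and delimiter to match your file
$csv = "M:\d\s.txt"
$delimiter = "`t"
$reader = New-Object System.IO.StreamReader($csv)
$expected = $reader.ReadLine().Split($delimiter).Count   # field count taken from the header row
$lineNumber = 1
while (($line = $reader.ReadLine()) -ne $null) {
    $lineNumber++
    $fields = $line.Split($delimiter).Count
    if ($fields -ne $expected) {
        Write-Output "Line ${lineNumber}: expected $expected fields, found $fields"
    }
}
$reader.Close()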

Related

Error trapping of a redundant event in a 4 minute window with PowerShell

I have a script that successfully traps an application event using a SQL query against the database. If an event is found, it writes to the event log and sends an email to the support team. Now the team wants a double check within a four-minute window: you may get ErrorA and ErrorD twice, while ErrorB and ErrorC do not reoccur.
How would you do an internal check within the first loop, so that the first pass records ErrorA, a second check two minutes later sees that the same ErrorA has occurred again, and only then sends off an email?
while($true) ### Endless loop - continually looking for ERRORS to trap for the two actions below (write event log and send email)
{
$connString = "data source=sqlservername,1433;Initial catalog=HugsDB;Integrated Security=True;"
$date= $((get-date).AddSeconds(-120).ToString("MM-dd-yyyy HH:mm:ss"))
$QueryText = "select statement that graps all errors $date"
#SETUP SQL VALUES####
$SqlConnection = new-object System.Data.SqlClient.SqlConnection
$SqlConnection.ConnectionString = $connString
$SqlCommand = $SqlConnection.CreateCommand()
$SqlCommand.CommandText = $QueryText
#### query the database
$DataAdapter = new-object System.Data.SqlClient.SqlDataAdapter $SqlCommand
$dataset = new-object System.Data.Dataset
$rowCount = $DataAdapter.Fill($dataset)
$sqlConnection.Close()
$sqlConnection.Dispose()
#### IF QUERY FINDS ERRORS # exact time of query ( could be 1 or more devices reporting an error ) write an event log message & send team an email
if($rowCount -gt 0) {
### assign a unique variable to each unique error on 1st find.
ForEach ($row in $dataset.Tables[0].Rows) {
[int]$incre = 0
$row.exciter_name = $incre.$row.exciter_name
}
## sleep 2 minutes before checking to see if we have a repeat of any of the finds in second query
Start-Sleep -Seconds 120
## Another query used to see if a reoccurance of same error.
## If the same error occurs send email and write error to event log.
if($rowCount -gt 0) {
ForEach ($row in $dataset.Tables[0].Rows) {
## NOT SURE HOW TO DO A CHECK TO LOOK FOR REOCCURANCE TO GENERATE EVENT.
write-Eventlog -LogName Exciter_Log -Source Exciter_Health -EventID 108 -Message "PACE Exciter Health Alert"
Send-MailMessage -smtpserver "$SMTPServer" -from "$EmailFrom" -to "$EmailTo" -subject "$Subject" -bodyAsHtml "$Body" -credential $anonCredentials
#################################################################################################################################################
#Second BREAK
Start-Sleep -Seconds 120
}
Sounds like you need to keep track of your previous messages and compare them with the new ones. I'm assuming from the code that it does not matter which error message is actually repeated, only that one of them is. In the example below I capture all unique messages in a variable called $currentErrors. Once the messages are captured, check whether any of the previous errors appear in the current set. If even one repeat error is found, the $alert flag is set to true and your error handling runs. Last, it moves all current error messages into previous error messages to be checked on the next run.
$currentErrors = @()
$alert = $false
if ($rowCount -gt 0)
{
foreach ($row in $dataset.Tables[0].Rows)
{
# not sure what these 2 lines do?
[int]$incre = 0
$row.exciter_name = $incre.$row.exciter_name
if ($currentErrors -notcontains $row.exciter_name) { $currentErrors += $row.exciter_name } # add error message to $currentErrors if not already added
}
# on first run previousErrors will be null so this foreach will do nothing
foreach ($error in $previousErrors)
{
if ($currentErrors -contains $error)
{
# previous error found in current errors
$alert = $true
break # exit loop once duplicate error is found
}
}
if ($alert)
{
Write-EventLog -LogName Exciter_Log -Source Exciter_Health -EventID 108 -Message "PACE Exciter Health Alert"
Send-MailMessage -smtpserver "$SMTPServer" -from "$EmailFrom" -to "$EmailTo" -subject "$Subject" -bodyAsHtml "$Body" -credential $anonCredentials
}
# move currentErrors into previousErrors for next loop
$previousErrors = $currentErrors
Start-Sleep -Seconds 120
}
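One hedged note on wiring this into the original script, assuming the endless loop from the question: initialize $previousErrors once before entering the loop so the first pass has an empty collection to compare against, for example:
# Assumed setup before the endless loop; the comparison block above then runs inside it
$previousErrors = @()
while ($true)
{
    # ... query the database and fill $dataset / $rowCount as in the question ...
    # ... then run the duplicate-detection block shown above (it already sleeps 120 seconds) ...
}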

PowerShell Excel - using a new range for a search gives errors for the old range

I am a complete beginner with Microsoft PowerShell, but I have worked with C++, Java, and C#. I was working on this small script for my job and came across a strange issue that is likely due to me not properly understanding how ranges work. The script is supposed to open an Excel workbook, search each sheet for a name, and output the matching names along with their information. The problem is that when I set the range again to find the starting column index of the information (in this case, the column to the right of the column labeled "Description"), it breaks the range when it searches for more than one match of the same last name.
I have a do-while loop that uses worksheet.range.entirerow.findnext() so that I can find multiple rows with the same last name. This worked until I used a new range, worksheet.range.entirecolumn.find(). Below is my latest attempt; I have already tried hardcoding $Range to 5 (which worked, but I want it to be dynamic) and using a new variable $RowRange (which didn't fix the issue). If I understand correctly, a range is like the current selection of two or more cells, so why can I not reset it or use a new variable? The loop does not repeat, so I only ever find the first name in each sheet.
P.S. As a side question, I had an issue shutting down the process of the Excel document I open in the background without shutting down other Excel workbooks. For some reason, Get-Process EXCEL | Stop-Process -Force; shuts down ALL of my open workbooks. I commented it out, but I'm worried about the process not quite ending when the script is done executing.
# Prepare output file for results
$FileName = "TEST";
$OutputFile = "Results.txt";
Remove-Item $OutputFile;
New-Item $OutputFile -ItemType file;
$Writer = [System.IO.StreamWriter] $OutputFile;
Clear-Host
Write-Host Starting...
# Start up Excel
$Excel = New-Object -ComObject Excel.Application;
$File = $FileName + ".xlsx";
# Prompt user for last name of person to search for (and write to the Results.txt output file)
Clear-Host
Write-Host Search for users in each region by their last name.
$SearchLastName = Read-Host -Prompt "Please input the person's last name";
Write-Host Searching for person...;
$Writer.WriteLine("Name Search: " + $SearchLastName);
$Writer.WriteLine("");
# Then open it without it being made visible to the user
$Excel.Visible = $false;
$Excel.DisplayAlerts = $true;
$Excel.Workbooks.Open($File);
# For each worksheet, or tab, search for the name in the first column (last names)
$Excel.Worksheets | ForEach-Object{
$_.activate();
$Range = $_.Range("A1").EntireColumn;
# Note: To search for text in the ENTIRETY of a cell, need to use the find method's lookat
# parameter (use 1). Otherwise, if searching for Smith, Nesmith also gets detected.
$SearchLast = $Range.find($SearchLastName,[Type]::Missing,[Type]::Missing,1);
$Writer.WriteLine($_.name + ": ");
if ($SearchLast -ne $null) {
$FirstRow = $SearchLast.Row;
do {
# If a first name was found, get the first name too
$FirstName = $_.Cells.Item($SearchLast.Row, $SearchLast.Column + 1).value();
# Then display in proper order
$Writer.WriteLine(" " + $SearchLast.value() + "," + $FirstName);
# From here, find the relevant information on that person
# Search for the column labeled "Description", the starting column is the next one, ending column is the number of used columns
$BeginCol = $_.Range("A1").EntireRow.find("Description",[Type]::Missing,[Type]::Missing,1).Column + 1;
$MaxColumns = $_.UsedRange.Columns.Count;
# Check each column for relevant information. If there are no extra rows after "Description" just skip
for ($i = $BeginCol; $i -le $MaxColumns; $i++) {
# The information of the current cell, found by the row of the name and the current row
$CurrentCell = $_.Cells.Item($SearchLast.Row, $i);
# Only add the description if it exists.
if (!([string]::IsNullOrEmpty($CurrentCell.value2))) {
$Description = $_.Cells.Item(1,$i).text();
# Concatenate the description with its information.
$Display = " - (" + $Description + ": " + $CurrentCell.text() + ")";
# Display the information
$Writer.WriteLine($Display);
}
}
$Writer.WriteLine("");
# Keep searching that name in the current workbook until it finds no more
$SearchLast = $Range.FindNext($SearchLast);
} while (($SearchLast -ne $null) -and ($SearchLast.Row -ne $FirstRow));
} else {
$Writer.WriteLine("Not Found");
}
$Writer.WriteLine("");
};
# Cleaning up the environment
$Writer.close();
$Excel.Workbooks.Item($FileName).close();
$Excel.Quit();
# Force quit the Excel process after quitting
# Get-Process EXCEL | Stop-Process -Force;
# Then remove the $Excel com object to ready it for garbage collection
[System.Runtime.Interopservices.Marshal]::ReleaseComObject($Excel);
# Then, open up the Results.txt file
Invoke-Item Results.txt;

StreamWriter cannot call a method on a null-valued expression

First time user, looking for help with a script that's been driving me crazy.
Basically, I need to create a set number of files of an exact size (512 KB, 2 MB, 1 GB) to test a SAN. These files need to be filled with random text so that the SAN doesn't catch the nulls and actually allocates the blocks - that's also the reason I couldn't just use fsutil.
Now, I've been messing with the new-bigrandomfile script by Verboon and tweaking it to my needs.
However I'm getting the error:
You cannot call a method on a null-valued expression.
At L:\random5.ps1:34 char:9
+ $stream.Write($longstring)
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidOperation: (:) [], RuntimeException
+ FullyQualifiedErrorId : InvokeMethodOnNull
This is the bit of code I've come up with so far; I'll add a loop at the end to copy the file I just created N times, so as to fill up the LUN.
Set-Strictmode -Version 2.0
#temp file
$file = "c:\temp\temp.rnd"
#charset size
$charset = 64
#Block Size
$blocksize = 512
#page size
$Pagesize = 512KB
#Number of blocks in a page
$blocknum = $Pagesize / $blocksize
#Resulting/desired test file size
$filesize = 1GB
#number of pages in a file
$pagenum = $filesize / $Pagesize
# create the stream writer
$stream = System.IO.StreamWriter $file
# get a 64 element Char[]; I added the - and _ to have 64 chars
[char[]]$chars = 'azertyuiopqsdfghjklmwxcvbnAZERTYUIOPQSDFGHJKLMWXCVBN0123456789-_'
1..$Pagenum | ForEach-Object {
# get a page's worth of blocks
1..$blocknum| ForEach-Object {
# randomize all chars and...
$rndChars = $chars | Get-Random -Count $chars.Count
# ...join them in a string
$string = -join $rndChars
# repeat random string N times to get a full block string length
$longstring = $string * ($blocksize / $charset)
# write 1 block to file
$stream.Write($longstring)
# release resources by clearing string variables
Clear-Variable string, longstring
}
}
$stream.Close()
$stream.Dispose()
# release resources through garbage collection
[GC]::Collect()
$file.Close()
I've tried a gazillion variants like:
$stream = [System.IO.StreamWriter] $file
$stream = System.IO.StreamWriter $file
$stream = NewObject System.IO.StreamWriter $file
Of course, being a total noob at PowerShell, I've tried using quotes, brackets, providing the full path instead of the variable, etc. All (or most) of them seem to be valid syntax variants according to the examples I found online, but the output is still the same.
In case you have any improvement to suggest or alternative way to perform this task I'm all ears.
Edit: I updated the script above; just a couple of quotation marks around $file made the error disappear - thanks LinuxDisciple. However, the file gets created but stays at 0 bytes, and the script gets stuck in a loop.
Fix your instantiation of StreamWriter to any of these correct variants:
$stream = [System.IO.StreamWriter]::new($file)
$stream = [IO.StreamWriter]::new($file) # the default namespace may be omitted
$stream = New-Object System.IO.StreamWriter $file
You can specify encoding:
$stream = [IO.StreamWriter]::new(
$file,
$false, # don't append
[Text.Encoding]::ASCII
)
See StreamWriter on MSDN for available constructors and parameters.
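For completeness, a minimal usage sketch of the constructor above (the path and content are placeholders), showing construct, write, and a guaranteed close:
# Usage sketch: construct the writer, write, and always close it
$path = 'C:\temp\example.txt'
$stream = [IO.StreamWriter]::new($path, $false, [Text.Encoding]::ASCII)
try {
    $stream.Write('one block of text')   # Write() does not append a newline
}
finally {
    $stream.Close()                      # flushes the buffer and releases the file handle
}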
PowerShell ISE offers autocomplete with tooltips:
type [streamw and press Ctrl-Space to autocomplete the full .NET class name
type ]:: to see the available methods and properties
type new and press Ctrl-Space to see the constructor overloads
whenever needed, put the caret at the method name and press Ctrl-Space for the tooltip
I know nothing about PowerShell, but a few things:
Are you sure $longstring has a value before you call $stream.Write()? It sounds like it's null, and that's why you get the error. If you can output the value of $longstring to the console, it would help you make sure that it has a value.
Also, troubleshoot with a simplified version of your code so that you can pinpoint what's going on, for example:
$file = 'c:\temp\temp.rnd'
$stream = New-Object System.IO.StreamWriter $file
$longstring = 'whatever'
$stream.Write($longstring)

How to increase the efficiency of a Perl script that uses sqlplus

I have this Perl script that pulls data from an Oracle database via sqlplus. The database adds a new entry every time the value of state changes for a particular serial number. We need to pick up the entries at every state change and prepare a CSV file with the old state, the new state, and other fields. A sample of the db table:
SERIALNUMBER STATE AT OPERATORID SUBSCRIBERID TRANSACTIONID
51223344558899 Available 20081008T10:15:47 vsuser
51223344558857 Available 20081008T10:15:49 vsowner
51223344558899 Used 20081008T10:20:25 vsuser
51223344558860 Stolen 20081008T10:15:49 vsanyone
51223344558857 Damaged 20081008T10:50:49 vsowner
51223344558899 Damaged 20081008T10:50:25 vsuser
51343253335355 Available 20081008T11:15:47 vsindian
my script:
#! /usr/bin/perl
#use warnings;
use strict;
#my $circle =
#my $schema =
my $basePath = "/scripts/Voucher-State-Change";
#my ($sec, $min, $hr, $day, $month, $years) = localtime(time);
#$years_+=1900;$mont_+=1;
#my $timestamp=sprintf("%d%02d%02d",$years,$mont,$moday);
sub getDate {
my $daysago=shift;
$daysago=0 unless ($daysago);
#my @months=qw(Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec);
my ($sec,$min,$hour,$mday,$mon,$year,$wday,$yday,$isdst) = localtime(time-(86400*$daysago));
# YYYYMMDD, e.g. 20060126
return sprintf("%d%02d%02d",$year+1900,$mon+1,$mday);
}
my $filedate=getDate(1);
#my $startdate="${filedate}T__:__:__";
my $startdate="20081008T__:__:__";
print "$startdate\n";
##### Generating output file---
my $outputFile = "${basePath}/VoucherStateChangeReport.$filedate.csv";
open (WFH, ">", "$outputFile") or die "Can't open output file $outputFile for writing: $!\n";
print WFH "VoucherSerialNumber,Date,Time,OldState,NewState,UserId\n";
##### Generating log file---
my $logfile = "${basePath}/VoucherStateChange.$filedate.log";
open (STDOUT, ">>", "$logfile") or die "Can't open logfile $logfile for writing: $!\n";
open (STDERR, ">>", "$logfile") or die "Can't open logfile $logfile for writing: $!\n";
print "$logfile\n";
##### Now login to sqlplus-----
my $SQLPLUS='/opt/oracle/product/11g/db_1/bin/sqlplus -S system/coolman7@vsdb';
`$SQLPLUS \@${basePath}/VoucherQuery1.sql $startdate> ${basePath}/QueryResult1.txt`;
open (FH1, "${basePath}/QueryResult1.txt");
while (my $serial = <FH1>) {
chomp ($serial);
my $count = `$SQLPLUS \@${basePath}/VoucherQuery2.sql $serial $startdate`;
chomp ($count);
$count =~ s/\s+//g;
#print "$count\n";
next if $count == 1;
`$SQLPLUS \@${basePath}/VoucherQuery3.sql $serial $startdate> ${basePath}/QueryResult3.txt`;
# print "select * from sample where SERIALNUMBER = $serial----\n";
open (FH3, "${basePath}/QueryResult3.txt");
my ($serial_number, $state, $at, $operator_id);
my $count1 = 0;
my $old_state;
while (my $data = <FH3>) {
chomp ($data);
#print $data."\n";
my @data = split (/\s+/, $data);
my ($serial_number, $state, $at, $operator_id) = @data[0..3];
#my $serial_number = $data[0];
#my $state = $data[1];
#my $at = $data[2];
#my $operator_id = $data[3];
$count1++;
if ($count1 == 1) {
$old_state = $data[1];
next;
}
my ($date, $time) = split (/T/, $at);
$date =~ s/(\d{4})(\d{2})(\d{2})/$1-$2-$3/;
print WFH "$serial_number,$date,$time,$old_state,$state,$operator_id\n";
$old_state = $data[1];
}
}
close(WFH);
query in VoucherQuery1.sql:
select distinct SERIALNUMBER from sample where AT like '&1';
query in VoucherQuery2.sql:
select count(*) from sample where SERIALNUMBER = '&1' and AT like '&2';
query in VoucherQuery3.sql:
select * from sample where SERIALNUMBER = '&1' and AT like '&2';
and my sample output:
VoucherSerialNumber,Date,Time,OldState,NewState,UserId
51223344558857,2008-10-08,10:50:49,Available,Damaged,vsowner
51223344558899,2008-10-08,10:20:25,Available,Used,vsuser
51223344558899,2008-10-08,10:50:25,Used,Damaged,vsuser
The script works pretty well. The problem is that the actual db table has millions of records for a given day, which raises performance issues. Could you please advise how I can improve the efficiency of this script in terms of time and load? The only restriction is that I can't use the DBI module for this.
Also, when a SQL query fails, the error message ends up in the QueryResult?.txt files. I want to capture these errors in my log file instead. How can this be accomplished? Thanks.
I think you need to tune your query. A good starting point is to use the EXPLAIN PLAN, if it is an Oracle database.

PowerShell Script DB Query not passing all results

Good morning Stack Overflow. I have a PowerShell script that executes a SQL query against an Oracle database and then passes the results to a local shell command. It works, mostly. The problem is that some of the results are being dropped, and the only significant thing I can see about them is that a couple of their columns (but only 2 of the 8 returned) have null values. When the query is executed in SQL Developer I get all of the expected results. The issue applies to the $eventcheck switch; the $statuscheck switch works fine. The PowerShell script is below:
param(
[parameter(mandatory=$True)]$username,
[parameter(mandatory=$True)]$password,
$paramreport,
$paramsite,
[switch]$eventcheck,
[switch]$statuscheck
)
$qry1 = Get-Content .\vantageplus_processing.sql
$qry2 = #"
select max(TO_CHAR(VP_ACTUAL_RPT_DETAILS.ETLLOADER_OUT,'YYYYMMDDHH24MISS')) Completed
from MONITOR.VP_ACTUAL_RPT_DETAILS
where VP_ACTUAL_RPT_DETAILS.REPORTNUMBER = '$($paramreport)' and VP_ACTUAL_RPT_DETAILS.SITE_NAME = '$($paramsite)'
order by completed desc
"#
$connString = #"
Provider=OraOLEDB.Oracle;Data Source=(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST="HOST")(PORT="1521"))
(CONNECT_DATA=(SERVICE_NAME="SERVICE")));User ID="$username";Password="$password"
"#
function Get-OLEDBData ($connectstring, $sql) {
$OLEDBConn = New-Object System.Data.OleDb.OleDbConnection($connectstring)
$OLEDBConn.open()
$readcmd = New-Object system.Data.OleDb.OleDbCommand($sql,$OLEDBConn)
$readcmd.CommandTimeout = '300'
$da = New-Object system.Data.OleDb.OleDbDataAdapter($readcmd)
$dt = New-Object System.Data.DataTable
[void]$da.fill($dt)
$OLEDBConn.close()
return $dt
}
if ($eventcheck)
{
$output = Get-OLEDBData $connString $qry1
ForEach ($lines in $output)
{
start-process -NoNewWindow -FilePath msend.exe -ArgumentList @"
-n bem_snmp01 -r CRITICAL -a CSG_VANTAGE_PLUS -m "The report $($lines.RPT) for site $($lines.SITE) has not been loaded by $($lines.EXPDTE)" -b "vp_reportnumber='$($lines.RPT)'; vp_sitename='$($lines.SITE)'; vp_expectedcomplete='$($lines.SIMEXPECTED)'; csg_environment='Production';"
"# # KEEP THIS TERMINATOR AT THE BEGINNING OF LINE
}
}
if ($statuscheck)
{
$output = Get-OLEDBData $connString $qry2
# $output | foreach {$_.completed}
write-host -nonewline $output.completed
}
So that you can see the data, below is a CSV export from Oracle SQL Developer with the ACTUAL results of the query referenced by my script. Of these results, lines 4, 5, 6, 7, 8, and 10 are the only ones being passed along in the ForEach loop; the others are not even captured in the $output array. If anyone can advise on a method for getting all of the results passed along, I would appreciate it.
"SITE","RPT","LSDTE","EXPDTE","CIMEXPECTED","EXPECTED_FREQUENCY","DATE_TIMING","ETME"
"chrcse","CPHM-054","","2014/09/21 12:00:00","20140921120000","MONTHLY","1",
"chrcse","CPSM-226","","2014/09/21 12:00:00","20140921120000","MONTHLY","1",
"dsh","CPSD-176","2014/09/28 23:20:04","2014/09/30 04:00:00","20140930040000","DAILY","1",1.41637731481481481481481481481481481481
"dsh","CPSD-178","2014/09/28 23:20:11","2014/09/30 04:00:00","20140930040000","DAILY","1",1.4162962962962962962962962962962962963
"exp","CPSM-610","2014/08/22 06:42:10","2014/09/21 09:00:00","20140921090000","MONTHLY","1",39.10936342592592592592592592592592592593
"mdc","CPKD-264","2014/09/24 00:44:32","2014/09/30 04:00:00","20140930040000","DAILY","1",6.35771990740740740740740740740740740741
"nea","CPKD-264","2014/09/24 01:00:31","2014/09/30 03:00:00","20140930030000","DAILY","1",6.34662037037037037037037037037037037037
"twtla","CPOD-034","","2014/09/29 23:00:00","20140929230000","DAILY","0",
"twtla","CPPE-002","2014/09/29 02:40:35","2014/09/30 06:00:00","20140930060000","DAILY","1",1.27712962962962962962962962962962962963
"twtla","CPXX-004","","2014/09/29 23:00:00","20140929230000","DAILY","0",
It appears, actually, that a comment in the query was somehow causing the issue. I removed it and the results started coming back normally. I have no idea why this would be the case, unless it has to do with how the query file is imported (does it import everything as one line?). Either way, the results are normal now.
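That guess is plausible given how the query file is read at the top of the script: Get-Content without -Raw returns an array of lines, and when that array is coerced into the single string the OleDbCommand expects, PowerShell joins the lines with spaces, so a -- line comment can swallow the rest of the statement. A minimal sketch of a safer read, assuming PowerShell 3.0+ for -Raw (the file name comes from the question):
# Read the query as a single string with line breaks preserved, so -- comments stay on their own lines
$qry1 = Get-Content .\vantageplus_processing.sql -Raw
# On older PowerShell versions, join the lines explicitly instead:
# $qry1 = (Get-Content .\vantageplus_processing.sql) -join "`n"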