PowerShell Excel - using a new range for a search gives errors for the old range - VBA

I am a complete beginner with Microsoft PowerShell, but I have worked with C++, Java, and C#. I was working on this small script for my job and came across a strange issue that is probably due to me not properly understanding how ranges work. The script is supposed to open an Excel workbook, search each sheet for a name, and output the names that match along with their information. The problem is that when I set the range again, to find the starting column of the information (in this case, the column to the right of the one labeled "Description"), it breaks the original range as soon as it searches for more than one match on the same last name.
I have a do-while loop that uses worksheet.range.entirerow.findnext() so that I can find multiple people with the same last name. This works until I use a new range, worksheet.range.entirecolumn.find(). Below is my latest attempt; I have already tried hardcoding $Range to 5 (which worked, but I want it to be dynamic) and using a new variable $RowRange (which didn't fix the issue). If I understand correctly, a range is like the current selection of two or more cells, so why can't I reset it or use a new variable? The loop never repeats, so I only ever find the first name in each sheet.
P.S. As a side question, I had trouble shutting down the process of the Excel document I open in the background without also shutting down other Excel workbooks. For some reason, Get-Process EXCEL | Stop-Process -Force; shuts down ALL of my open workbooks. I commented it out, but I'm worried the process won't quite end when this code finishes executing.
# Prepare output file for results
$FileName = "TEST";
$OutputFile = "Results.txt";
Remove-Item $OutputFile;
New-Item $OutputFile -ItemType file;
$Writer = [System.IO.StreamWriter] $OutputFile;
Clear-Host
Write-Host Starting...
# Start up Excel
$Excel = New-Object -ComObject Excel.Application;
$File = $FileName + ".xlsx";
# Prompt user for last name of person to search for (and write to the Results.txt output file)
Clear-Host
Write-Host Search for users in each region by their last name.
$SearchLastName = Read-Host -Prompt "Please input the person's last name";
Write-Host Searching for person...;
$Writer.WriteLine("Name Search: " + $SearchLastName);
$Writer.WriteLine("");
# Then open it without it being made visible to the user
$Excel.Visible = $false;
$Excel.DisplayAlerts = $true;
$Excel.Workbooks.Open($File);
# For each worksheet, or tab, search for the name in the first column (last names)
$Excel.Worksheets | ForEach-Object{
    $_.activate();
    $Range = $_.Range("A1").EntireColumn;
    # Note: To search for text in the ENTIRETY of a cell, need to use the find method's lookat
    # parameter (use 1). Otherwise, if searching for Smith, Nesmith also gets detected.
    $SearchLast = $Range.find($SearchLastName,[Type]::Missing,[Type]::Missing,1);
    $Writer.WriteLine($_.name + ": ");
    if ($SearchLast -ne $null) {
        $FirstRow = $SearchLast.Row;
        do {
            # If a first name was found, get the first name too
            $FirstName = $_.Cells.Item($SearchLast.Row, $SearchLast.Column + 1).value();
            # Then display in proper order
            $Writer.WriteLine(" " + $SearchLast.value() + "," + $FirstName);
            # From here, find the relevant information on that person
            # Search for the column labeled "Description"; the starting column is the next one, the ending column is the number of used columns
            $BeginCol = $_.Range("A1").EntireRow.find("Description",[Type]::Missing,[Type]::Missing,1).Column + 1;
            $MaxColumns = $_.UsedRange.Columns.Count;
            # Check each column for relevant information. If there are no extra columns after "Description", just skip
            for ($i = $BeginCol; $i -le $MaxColumns; $i++) {
                # The information of the current cell, found by the row of the name and the current column
                $CurrentCell = $_.Cells.Item($SearchLast.Row, $i);
                # Only add the description if it exists.
                if (!([string]::IsNullOrEmpty($CurrentCell.value2))) {
                    $Description = $_.Cells.Item(1,$i).text();
                    # Concatenate the description with its information.
                    $Display = " - (" + $Description + ": " + $CurrentCell.text() + ")";
                    # Display the information
                    $Writer.WriteLine($Display);
                }
            }
            $Writer.WriteLine("");
            # Keep searching for that name in the current worksheet until no more matches are found
            $SearchLast = $Range.FindNext($SearchLast);
        } while (($SearchLast -ne $null) -and ($SearchLast.Row -ne $FirstRow));
    } else {
        $Writer.WriteLine("Not Found");
    }
    $Writer.WriteLine("");
};
# Cleaning up the environment
$Writer.close();
$Excel.Workbooks.Item($FileName).close();
$Excel.Quit();
# Force quit the Excel process after quitting
# Get-Process EXCEL | Stop-Process -Force;
# Then remove the $Excel com object to ready it for garbage collection
[System.Runtime.Interopservices.Marshal]::ReleaseComObject($Excel);
# Then, open up the Results.txt file
Invoke-Item Results.txt;
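The most likely cause is that Excel's Find and FindNext share one set of search settings per application: the EntireRow.find("Description", ...) call inside the loop replaces the settings of the original last-name search, so the FindNext that follows no longer continues it. A possible workaround (a rough sketch, not tested against this workbook) is to look up the "Description" column once per sheet before the do-loop, or to repeat the full Find and pass the current hit as the After argument instead of relying on FindNext:

# Sketch: resolve the "Description" column once per sheet, before the do-loop
$BeginCol = $_.Range("A1").EntireRow.Find("Description",[Type]::Missing,[Type]::Missing,1).Column + 1;
$MaxColumns = $_.UsedRange.Columns.Count;
do {
    # ... write out the name and its information as before ...
    # Alternative: repeat the full Find with the current cell as the After argument,
    # so the loop does not depend on the shared FindNext state
    $SearchLast = $Range.Find($SearchLastName, $SearchLast, [Type]::Missing, 1);
} while (($SearchLast -ne $null) -and ($SearchLast.Row -ne $FirstRow));

For the P.S., one common approach (again only a sketch, using a helper type added via Add-Type) is to map this Excel instance's window handle to its process ID and stop only that process, instead of every EXCEL process:

# Sketch: find the process ID that owns this Excel instance's main window handle.
# Capture the PID while Excel is still running (before $Excel.Quit()).
Add-Type -Namespace Win32 -Name NativeMethods -MemberDefinition @'
[DllImport("user32.dll")]
public static extern int GetWindowThreadProcessId(IntPtr hWnd, out int processId);
'@
$excelPid = 0
[void][Win32.NativeMethods]::GetWindowThreadProcessId([IntPtr]$Excel.Hwnd, [ref]$excelPid)
# ... after $Excel.Quit() and ReleaseComObject, stop only that instance if it is still around
Stop-Process -Id $excelPid -ErrorAction SilentlyContinue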

Related

Error: Input array is longer than number of columns in this table powershell

I am trying to load a 160 GB CSV file into SQL Server using a PowerShell script I got from GitHub, and I get this error:
Exception calling "Add" with "1" argument(s): "Input array is longer than the number of columns in this table."
At C:\b.ps1:54 char:26
+ [void]$datatable.Rows.Add <<<< ($line.Split($delimiter))
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : DotNetMethodException
So I checked the same code with a small 3-line CSV: all of the columns match, there is a header in the first row, and there are no extra delimiters, so I'm not sure why I am getting this error.
The code is below
<# 8-faster-runspaces.ps1 #>
# Set CSV attributes
$csv = "M:\d\s.txt"
$delimiter = "`t"
# Set connstring
$connstring = "Data Source=.;Integrated Security=true;Initial Catalog=PresentationOptimized;PACKET SIZE=32767;"
# Set batchsize to 2000
$batchsize = 2000
# Create the datatable
$datatable = New-Object System.Data.DataTable
# Add generic columns
$columns = (Get-Content $csv -First 1).Split($delimiter)
foreach ($column in $columns) {
    [void]$datatable.Columns.Add()
}
# Setup runspace pool and the scriptblock that runs inside each runspace
$pool = [RunspaceFactory]::CreateRunspacePool(1,5)
$pool.ApartmentState = "MTA"
$pool.Open()
$runspaces = @()
# Setup scriptblock. This is the workhorse. Think of it as a function.
$scriptblock = {
    Param (
        [string]$connstring,
        [object]$dtbatch,
        [int]$batchsize
    )
    $bulkcopy = New-Object Data.SqlClient.SqlBulkCopy($connstring,"TableLock")
    $bulkcopy.DestinationTableName = "abc"
    $bulkcopy.BatchSize = $batchsize
    $bulkcopy.WriteToServer($dtbatch)
    $bulkcopy.Close()
    $dtbatch.Clear()
    $bulkcopy.Dispose()
    $dtbatch.Dispose()
}
# Start timer
$time = [System.Diagnostics.Stopwatch]::StartNew()
# Open the text file from disk and process.
$reader = New-Object System.IO.StreamReader($csv)
Write-Output "Starting insert.."
while ((($line = $reader.ReadLine()) -ne $null))
{
    [void]$datatable.Rows.Add($line.Split($delimiter))
    if ($datatable.rows.count % $batchsize -eq 0)
    {
        $runspace = [PowerShell]::Create()
        [void]$runspace.AddScript($scriptblock)
        [void]$runspace.AddArgument($connstring)
        [void]$runspace.AddArgument($datatable) # <-- Send datatable
        [void]$runspace.AddArgument($batchsize)
        $runspace.RunspacePool = $pool
        $runspaces += [PSCustomObject]@{ Pipe = $runspace; Status = $runspace.BeginInvoke() }
        # Overwrite object with a shell of itself
        $datatable = $datatable.Clone() # <-- Create new datatable object
    }
}
# Close the file
$reader.Close()
# Wait for runspaces to complete
while ($runspaces.Status.IsCompleted -notcontains $true) {}
# End timer
$secs = $time.Elapsed.TotalSeconds
# Cleanup runspaces
foreach ($runspace in $runspaces) {
    [void]$runspace.Pipe.EndInvoke($runspace.Status) # EndInvoke method retrieves the results of the asynchronous call
    $runspace.Pipe.Dispose()
}
# Cleanup runspace pool
$pool.Close()
$pool.Dispose()
# Cleanup SQL Connections
[System.Data.SqlClient.SqlConnection]::ClearAllPools()
# Done! Format output then display
$totalrows = 1000000
$rs = "{0:N0}" -f [int]($totalrows / $secs)
$rm = "{0:N0}" -f [int]($totalrows / $secs * 60)
$mill = "{0:N0}" -f $totalrows
Write-Output "$mill rows imported in $([math]::round($secs,2)) seconds ($rs rows/sec and $rm rows/min)"
Working with a 160 GB input file is going to be a pain. You can't really load it into any kind of editor - or at least you can't analyze such a mass of data without some serious automation.
As per the comments, it seems that the data has some quality issues. To find the offending data, you could try a binary search; this approach shrinks the data fast (a rough sketch of the split step follows the list). Like so:
1) Split the file in about two equal chunks.
2) Try and load first chunk.
3) If successful, process the second chunk. If not, see 6).
4) Try and load second chunk.
5) If successful, the files are valid, but you have some other data quality issue. Start looking into other causes. If not, see 6).
6) If either load failed, start from the beginning and use the failed file as the input file.
7) Repeat until you narrow down the offending row(s).
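A rough sketch of the split step (step 1), assuming a header row that both halves need and streaming the file so the 160 GB is never held in memory; the output paths are just examples:

# Sketch: split a huge delimited file into two halves without loading it into memory
$source = "M:\d\s.txt"
$half1  = "M:\d\s_half1.txt"
$half2  = "M:\d\s_half2.txt"

# First pass: count the data lines
$total  = 0
$reader = New-Object System.IO.StreamReader($source)
$null   = $reader.ReadLine()   # skip the header
while ($reader.ReadLine() -ne $null) { $total++ }
$reader.Close()

# Second pass: copy the header to both halves, then split the data lines down the middle
$reader  = New-Object System.IO.StreamReader($source)
$writer1 = New-Object System.IO.StreamWriter($half1)
$writer2 = New-Object System.IO.StreamWriter($half2)
$header  = $reader.ReadLine()
$writer1.WriteLine($header)
$writer2.WriteLine($header)
$i   = 0
$mid = [math]::Floor($total / 2)
while (($line = $reader.ReadLine()) -ne $null) {
    if ($i -lt $mid) { $writer1.WriteLine($line) } else { $writer2.WriteLine($line) }
    $i++
}
$reader.Close(); $writer1.Close(); $writer2.Close()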
Another method would be to use an ETL tool like SSIS. Configure the package to redirect invalid rows to an error log so you can see which data is causing the problem.

StreamWriter cannot call a method on a null-valued expression

First time user, looking for help with a script that's been driving me crazy.
Basically, I need to create a set number of files of an exact size (512KB, 2MB, 1GB) to test a SAN. These files need to be filled with random text so that the SAN doesn't catch the nuls and does actually allocate the blocks - that's also the reason I couldn't just use fsutils.
Now, I've been messing with the new-bigrandomfile by Verboon and tweaking it to my needs.
However I'm getting the error:
You cannot call a method on a null-valued expression.
At L:\random5.ps1:34 char:9
+ $stream.Write($longstring)
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidOperation: (:) [], RuntimeException
+ FullyQualifiedErrorId : InvokeMethodOnNull
This is the bit of code I've come up with so far; I'll add a loop at the end to copy the file I just created N times to fill up the LUN.
Set-Strictmode -Version 2.0
#temp file
$file = "c:\temp\temp.rnd"
#charset size
$charset = 64
#Block Size
$blocksize = 512
#page size
$Pagesize = 512KB
#Number of blocks in a page
$blocknum = $Pagesize / $blocksize
#Resulting/desired test file size
$filesize = 1GB
#number of pages in a file
$pagenum = $filesize / $Pagesize
# create the stream writer
$stream = System.IO.StreamWriter $file
# get a 64 element Char[]; I added the - and _ to have 64 chars
[char[]]$chars = 'azertyuiopqsdfghjklmwxcvbnAZERTYUIOPQSDFGHJKLMWXCVBN0123456789-_'
1..$Pagenum | ForEach-Object {
    # get a page's worth of blocks
    1..$blocknum | ForEach-Object {
        # randomize all chars and...
        $rndChars = $chars | Get-Random -Count $chars.Count
        # ...join them in a string
        $string = -join $rndChars
        # repeat random string N times to get a full block string length
        $longstring = $string * ($blocksize / $charset)
        # write 1 block to file
        $stream.Write($longstring)
        # release resources by clearing string variables
        Clear-Variable string, longstring
    }
}
$stream.Close()
$stream.Dispose()
# release resources through garbage collection
[GC]::Collect()
$file.Close()
I've tried a gazillion variants like:
$stream = [System.IO.StreamWriter] $file
$stream = System.IO.StreamWriter $file
$stream = NewObject System.IO.StreamWriter $file
Of course, being a total noob at powershell, I've tried using quotes, brackets, provided the full path instead of the variable, etc. All (or most) seem to be valid syntax variants, according to a ton of examples I found online, but the output is still the same.
In case you have any improvement to suggest or alternative way to perform this task I'm all ears.
Edited the script above: just a couple of quotes for $file made the error disappear - thanks LinuxDisciple; however, the file gets created but stays at 0 bytes and the script gets stuck in a loop.
Fix your instantiation of StreamWriter to any of these correct variants:
$stream = [System.IO.StreamWriter]::new($file)
$stream = [IO.StreamWriter]::new($file) # the default namespace may be omitted
$stream = New-Object System.IO.StreamWriter $file
You can specify encoding:
$stream = [IO.StreamWriter]::new(
$file,
$false, # don't append
[Text.Encoding]::ASCII
)
See StreamWriter on MSDN for available constructors and parameters.
PowerShell ISE offers autocomplete with tooltips:
type [streamw and press Ctrl-Space to autocomplete the full .NET class name
type ]:: to see the available methods and properties
type new and press Ctrl-Space to see the constructor overrides
whenever needed, put the caret at the method name and press Ctrl-Space for the tooltip
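A minimal usage sketch built on the first variant (the path and content are just examples); the finally block makes sure the file handle is released even if a write throws:

$file = 'c:\temp\temp.rnd'
$stream = [System.IO.StreamWriter]::new($file)
try {
    $stream.Write('some block of text')
}
finally {
    $stream.Close()
}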
I know nothing about PowerShell, but a few things:
Are you sure $longstring has a value before you call stream.Write()? It sounds like it's null, and that's why you get the error. If you can somehow output the value of $longstring to the console, it would help you make sure that it has a value.
Also, troubleshoot with a simplified version of your code so that you can pinpoint what's going on, for example:
$file = 'c:\temp\temp.rnd'
$stream = New-Object System.IO.StreamWriter $file
$longstring = 'whatever'
$stream.Write($longstring)

PowerPoint to PDF: Minimum size option

I was attempting to convert PowerPoint files to PDF using PowerShell and was able to do so. However, I am trying to take this one step further and select the 'Minimum Size (publishing online)' option through the script.
Is there a property that needs to be set for this to happen? I'm guessing it is the $ppQualityStandard variable, but I'm not exactly sure.
EDIT: This is what I am using currently:
function ppt_to_pdf ($folderpath, $pptname) {
    Add-Type -AssemblyName office
    $ppFormatPDF = 2
    $ppQualityStandard = 0
    $p = New-Object -ComObject PowerPoint.Application
    $p.Visible = [Microsoft.Office.Core.MsoTriState]::msoTrue
    $ppt = $p.Presentations.Open("$folderpath\$pptname")
    $ppt.SaveCopyAs("$folderpath\$pptname", 32)
    $ppt.Close()
    $p.Quit()
    $p = $null
    [gc]::collect()
    [gc]::WaitForPendingFinalizers()
}
I suspect you need to use .ExportAsFixedFormat rather than .SaveCopyAs.
It takes, among other parameters, Intent as type ppFixedFormatIntent, which can be either:
ppFixedFormatIntentScreen (=1)
or
ppFixedFormatIntentPrint (=2)
There's a host of other parameters. To learn more, start PowerPoint, go into the VBA IDE, press F2 for the Object Browser, and search for ExportAsFixedFormat.
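An untested sketch of how that could look inside the function from the question; the constants come from the ppFixedFormatType and ppFixedFormatIntent enumerations, and the output path and exact argument list are assumptions worth verifying in the Object Browser:

# Sketch (untested): export to PDF with the "Minimum size (publishing online)" intent
$ppFixedFormatTypePDF      = 2   # ppFixedFormatType.ppFixedFormatTypePDF
$ppFixedFormatIntentScreen = 1   # ppFixedFormatIntent.ppFixedFormatIntentScreen = minimum size
$ppt = $p.Presentations.Open("$folderpath\$pptname")
$ppt.ExportAsFixedFormat("$folderpath\$pptname.pdf", $ppFixedFormatTypePDF, $ppFixedFormatIntentScreen)
$ppt.Close()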

Copying Text File Data into existing Excel Workbook Using PowerShell

So I'm having a problem exporting data from a text file into an Excel workbook that contains a month's worth of data. What ends up happening is that the code opens a new workbook with the name and title of the sheet, 'Geraldine3-16-2016'. I don't mind that; however, it never gets added or copied to the main workbook. So eventually, the only thing that changes in the main workbook is that a new sheet gets created called 'Sheet 1', but there is no data from the text file. Any help is greatly appreciated. Thank you in advance!
function Release-Ref ($ref) {
    ([System.Runtime.InteropServices.Marshal]::ReleaseComObject(
        [System.__ComObject]$ref) -gt 0)
    [System.GC]::Collect()
    [System.GC]::WaitForPendingFinalizers()
}
$File='C:\users\cesar.sanchez\downloads\Returns data 2-16-15.xlsx'
$TextFile='C:\Users\cesar.sanchez\downloads\Geraldine3-16-2016.txt'
$Excel = New-Object -C Excel.Application
$Excel.Visible=$true #For troubleshooting purposes only.
# $Excel.DisplayAlerts = $false
$TextData = $Excel.Workbooks.Opentext($TextFile,$null,$true)
$ExcelData = $Excel.Workbooks.Open($File) # Open Template
$NewS_ExcelData=$ExcelData.sheets.add()
$TexttoCopy=$TextData.Sheets.item(1)
$TexttoCopy.copy($NewS_ExcelData)
I believe it has to do with this part of the code, but I'm not completely sure.
$NewS_ExcelData=$ExcelData.sheets.add()
$TexttoCopy=$TextData.Sheets.item(1)
$TexttoCopy.copy($NewS_ExcelData)
.OpenText is not the same as .Open. It does not return an object. (found out the hard way!)
$TexttoCopy=$TextData.Sheets.item(1) throws the error:
You cannot call a method on a null-valued expression.
Alternative code:
$File='C:\users\cesar.sanchez\downloads\Returns data 2-16-15.xlsx'
$TextFile='C:\Users\cesar.sanchez\downloads\Geraldine3-16-2016.txt'
$Excel = New-Object -C Excel.Application
$Excel.Visible=$true #For troubleshooting purposes only.
# $Excel.DisplayAlerts = $false
$Excel.Workbooks.Opentext($TextFile,$null,$true) # Open Text file
$TextData = $Excel.ActiveWorkbook # Assign active workbook
$ExcelData = $Excel.Workbooks.Open($File) # Open Template
$NewS_ExcelData = $ExcelData.sheets.add()
$TexttoCopy = $TextData.Sheets.item(1)
$TexttoCopy.copy($NewS_ExcelData)
You may also find $TexttoCopy.move($NewS_ExcelData) useful.

How to set global variable and

I have this script running every half hour. It sends an email if the value is greater than 99, but I don't want it to send an email every 30 minutes if the value is still above 99.
I would only like it to send the email once. If the value drops back below 99 and then goes above 99 again, it should send an email once more.
Here is the script
$main = 100
IF ($main -gt 99) {
    Write-Host $main
    ##Send Email
}
else
{
    Write-Host "Not Greater than 99."
}
Here is what I tried; it doesn't work.
I tried using a global variable, initially set to nothing: if $var1 -eq "NOTSENT" -and $main -gt 99, it sends the email and sets $var1 = "SENT". I need help here.
$main = 100
$var1
IF (($main -gt 99) -and ($var1 -eq "NOTSENT")) {
    Write-Host $main
    ##Send Email
    $var1 = "SENT"
}
else
{
    Write-Host "Not Greater than 99."
    $var1 = "NOTSENT"
}
"Global" only means it's global to the PS session it was created in. When the session ends, the variable is gone. If you need to persist that value from one session to another, you need to write it out to disk, and then read it back the next time the script runs.
mjolinor is correct about the reason it's not working; I won't restate that.
But rather than write to disk, you could set a user-level or machine-level environment variable instead:
[Environment]::SetEnvironmentVariable('Variable', 'Value' , 'User')
[Environment]::SetEnvironmentVariable('Variable', 'Value' , 'Machine') # requires elevation
See http://technet.microsoft.com/en-us/library/ff730964.aspx
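Putting the two answers together, a sketch of the send-once logic with a user-level environment variable as the persisted flag (the variable name AlertState is just an example):

# Sketch: persist the "already sent" flag between scheduled runs
$main  = 100
$state = [Environment]::GetEnvironmentVariable('AlertState', 'User')   # $null on the very first run

if ($main -gt 99) {
    if ($state -ne 'SENT') {
        Write-Host $main
        ##Send Email
        [Environment]::SetEnvironmentVariable('AlertState', 'SENT', 'User')
    }
}
else {
    Write-Host "Not Greater than 99."
    [Environment]::SetEnvironmentVariable('AlertState', 'NOTSENT', 'User')
}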