tst10 telnet scripting continuously - vb.net

I am using the tool from this website (http://npr.me.uk/scripting.html) to connect over telnet and run a command. It returns some information, and I need to get this info every 4 seconds. How do I do that? Right now it works, but it reconnects every time, so I have to wait while it opens a new connection and each cycle takes much more than 4 seconds. Bat file:
@echo off
cls
if exist r1.txt del r1.txt
if exist r2.txt del r2.txt
tst10.exe /r:stats.txt /o:r1.txt /m
for /f "skip=30 tokens=*" %%A in (r1.txt) do echo %%A >> r2.txt
del r1.txt
start r2.txt
And the stats.txt script:
192.168.xxx.xxx
WAIT "login:"
SEND "myuser\m"
WAIT "Password:"
SEND "mypass\m"
WAIT ">"
SEND "mycommand\m"
WAIT ">"

Use PowerShell to program devices using a CSV file with the connections; I am using it for re-programming MFDs
I have a CSV file of MFDs (mfds.csv in the script below) and a script that reads it in.
I have a telnet script template to change the settings on the MFD, and the PowerShell script creates a custom script for each MFD and sets the DNS and hostname parameters. When run, a log file is written to a directory for checking later.
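The script below pulls the name, ipaddress, dns1, dns2 and serial columns from each row, so mfds.csv is expected to look something like this (header names taken from the script; the values are invented examples):
name,ipaddress,dns1,dns2,serial
MFD-Reception,192.168.10.21,192.168.10.2,192.168.10.3,ABC1234567
MFD-Warehouse,192.168.10.22,192.168.10.2,192.168.10.3,DEF7654321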
Script is as follows:
#Process for updating devices quickly using telnet
#Check file exists
c:
cd 'C:\Resources\Telnet'
cls
# Paths assumed here: the posted script uses $logfile and $tstfolder without defining them
$logfile = 'C:\Resources\Telnet\update.log'
$tstfolder = 'C:\Resources\Telnet\Processed'
$fileisthere = Test-Path 'C:\Resources\Telnet\mfds.csv'
if ($fileisthere -ne $true)
{
""
Write-Host ("There is no MFD import list C:\Resources\telnet\mfds.csv") | out-file -filepath $logfile -force
""
exit
}
Write-Host ("MFD import List is present")
# for each device in devices:
$mfds = import-csv 'C:\Resources\Telnet\mfds.csv'
foreach ($mfd in $mfds)
{
# ping device and check for response
$mfdname = $mfd.name
$mfdip = $mfd.ipaddress
$mfddns1 = $mfd.dns1
$mfddns2 = $mfd.dns2
$mfdhostname = $mfd.serial
""
Write-Host ("Updating device $($mfdname) on IP address $($Mfdip) ")
""
("Updating device $($mfdname) on IP address $($Mfdip) ") | out-file -filepath $logfile -Append -force
if(!(Test-Connection -Cn $mfdip -BufferSize 16 -Count 1 -ea 0 -quiet))
{
Write-Host ""
Write-Host ("MFD $($mfdname) is offline or not at this address")
Write-Host ""
"" | out-file $logfile -Append -force
("MFD $($mfdname) is offline or not at this address") | out-file $logfile -Append -force
"" | out-file $logfile -Append -force
}
else
{
#find replace script
# Device is present and add to script header
$tststring = "$($mfdip) 23"
$tstfile = "$($mfdname)-$($mfdip).txt"
$tstlogfile = "$($mfdname)-$($mfdip).log"
$tststring | out-file $tstfile -force
type dns.txt >> $tstfile
$location1 = "C:\Resources\telnet\$($tstfile)"
$change1 = get-content $location1
$change1 | ForEach-Object { $_ -replace "dns 1 server", "dns 1 server $($mfddns1)"} | Set-Content $location1
$location2 = "C:\Resources\telnet\$($tstfile)"
$change2 = get-content $location2
$change2 | ForEach-Object { $_ -replace "dns 2 server", "dns 2 server $($mfddns2)"} | Set-Content $location2
$location3 = "C:\Resources\telnet\$($tstfile)"
$change3 = get-content $location3
$change3 | ForEach-Object { $_ -replace "hostname ether name", "hostname ether name $($mfdhostname)"} | Set-Content $location3
$location4 = "C:\Resources\telnet\$($tstfile)"
$change4 = get-content $location4
$change4 | ForEach-Object { $_ -replace "devicename name", "devicename name $($mfdhostname)"} | Set-Content $location4
# Create variables for update
Write-Host ("Updating $($Mfdname) on IP Address $($mfdIP) ")
$parameter1 = "/r:$($tstfile)"
$parameter2 = "/o:$($tstlogfile)"
#& cmd tst10 $parameter1 $paremeter2
write-host ("$($tstfile) $($tstlogfile)")
new-item $tstfolder -Type directory -Force
move-item $tstfile $tstfolder
move-item $tstlogfile $tstfolder -ErrorAction SilentlyContinue
}
}
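For context, the find-and-replace steps above assume dns.txt is a tst10 telnet script that contains the bare placeholder strings the loop completes with per-device values, along these lines (a sketch only; the login prompts and command names depend on the actual device):
WAIT "login:"
SEND "admin\m"
WAIT "Password:"
SEND "mypass\m"
WAIT ">"
SEND "dns 1 server\m"
WAIT ">"
SEND "dns 2 server\m"
WAIT ">"
SEND "hostname ether name\m"
WAIT ">"
SEND "devicename name\m"
WAIT ">"
The script prepends the "$($mfdip) 23" connect line itself, so the template starts at the login prompt.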


PowerShell 7. ForEach-Object -Parallel Does Not Authenticate Against Azure PowerShell

We wrote a script that is supposed to execute Azure PowerShell commands in parallel. The problem is that when we increase -ThrottleLimit above one, some of the commands are not performed properly. The script is:
# Writing IPs for whitelisting into file.
Add-Content -Path IPs.txt -Value ((Get-AzWebApp -ResourceGroupName "ResourceGroup1" -Name "WebApp1").OutboundIpAddresses).Split(",")
Add-Content -Path IPs.txt -Value ((Get-AzWebApp -ResourceGroupName "ResourceGroup1" -Name "WebApp1").PossibleOutboundIpAddresses).Split(",")
# Writing a new file with unique IPs.
Get-Content IPs.txt | Sort-Object -Unique | Set-Content UniqueIPs.txt
# Referencing the file.
$IPsForWhitelisting = Get-Content UniqueIPs.txt
# Assigning a priority number to each IP
$Count = 100
$List = foreach ($IP in $IPsForWhitelisting) {
$IP | Select-Object @{l='IP';e={$_}}, @{l='Count';e={$Count}}
$Count++
}
# Whitelisting all the IPs from the list.
$List | ForEach-Object -Parallel {
$IP = $_.IP
$Priority = $_.Count
$azureApplicationId ="***"
$azureTenantId= "***"
$azureApplicationSecret = "***"
$azureSecurePassword = ConvertTo-SecureString $azureApplicationSecret -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential($azureApplicationId , $azureSecurePassword)
Connect-AzAccount -Credential $credential -TenantId $azureTenantId -ServicePrincipal | Out-null
echo "IP-$Priority"
echo "$IP/24"
echo $Priority
Add-AzWebAppAccessRestrictionRule -ResourceGroupName "ResourceGroup1" -WebAppName "WebApp1" -Name "IP-$Priority" -Priority $Priority -Action Allow -IpAddress "$IP/24"
} -ThrottleLimit 1
With -ThrottleLimit set to 1, 8 rules are created; with 2, 7 rules; with 3, 4 rules; with 10, only 1 rule. So some rules are being skipped.
What is the reason for such behavior?
In short, the -Parallel parameter does not (yet, perhaps) magically import every dependent variable that is in scope of the ForEach-Object block. PowerShell runs the script block in separate runspaces, and only the pipeline input being looped over is passed implicitly; every other variable needs an explicit designation.
Use the $using: prefix to denote which variables are to be imported (made visible) in the parallel script block.
Example:
$avar = [Int]10
$bvar = [Int]20
$list = @('here', 'it', 'eees')
$list | ForEach-Object -Parallel {
Write-Output "(a, b) is here ($($using:avar), $($using:bvar))"
Write-Output "(a, b) missing ($($avar), $($bvar))"
Write-Output "Current element is $_"
}
Thus the described behavior is likely due to the fact that the configuration variables are not imported at all, so the operations silently fail.
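Applied to the question's script, the answer's advice might look roughly like this (a sketch: the service-principal values are placeholders, as in the question, and the throttle limit is arbitrary):
# Service-principal details defined once, outside the parallel block
$azureApplicationId     = "***"
$azureTenantId          = "***"
$azureApplicationSecret = "***"

$List | ForEach-Object -Parallel {
    # Bring the outer variables into this runspace with $using:
    $securePassword = ConvertTo-SecureString $using:azureApplicationSecret -AsPlainText -Force
    $credential = New-Object System.Management.Automation.PSCredential($using:azureApplicationId, $securePassword)
    Connect-AzAccount -Credential $credential -TenantId $using:azureTenantId -ServicePrincipal | Out-Null

    # $_ is passed implicitly; its IP and Count properties come from $List
    $rule = @{
        ResourceGroupName = "ResourceGroup1"
        WebAppName        = "WebApp1"
        Name              = "IP-$($_.Count)"
        Priority          = $_.Count
        Action            = "Allow"
        IpAddress         = "$($_.IP)/24"
    }
    Add-AzWebAppAccessRestrictionRule @rule
} -ThrottleLimit 5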

Pass a Variable from a Powershell Script to SQLPlus

I'm trying to pass a variable from my Powershell Script to SQLPlus.
I've defined my variable as $csvStorage (the file path to the folder "csv_files"):
Powershell Script:
# Set file path to C:\xxxx\xxxx\xxxx\xxxx\csv_files
$filePath = Split-Path -Path $directory -Parent
$csvStorage = Join-Path $filePath -ChildPath "csv_files"
I have then passed it as an argument to the SQL script:
Powershell Script:
# Run sql_queries.sql whilst passing C:\xxxx\xxxx\xxxx\xxxx\csv_files\ into the SQL script
$queryTables = "/c sqlplus $dbUsername/$dbPassword#$dbServiceName #$filePath $csvStorage"
&$sqlPlus $queryTables
Then finally, referenced the variable in my SQL using '&1':
SQL Script:
set null null
set heading on
set pagesize 50000
set termout on
spool &1\hc_actual_vs_shadow_inv.csv
SELECT *
FROM hc_actual_vs_shadow_inv
/
spool off
/
exit
However, the query is not executed and no .csv file is produced, and I can't see what I'm doing wrong. Any assistance would be much appreciated.
Thanks
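For reference, SQL*Plus substitutes &1 with the first argument that follows the script name on the command line, so the general pattern being aimed for is (credentials and paths here are placeholders):
sqlplus scott/tiger@ORCL @C:\scripts\export.sql C:\data\csv_files
Inside export.sql, &1 then expands to C:\data\csv_files, so spool &1\hc_actual_vs_shadow_inv.csv writes into that folder. One thing worth double-checking in the invocation above is that the argument after the credentials (@$filePath) points at the .sql file itself rather than at its parent directory.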
<#
.SYNOPSIS
This script executes all sql files from the specified directory and creates separate csv files in a specified directory.
Author: Dmitry Demin dmitrydemin1973@gmail.com
.DESCRIPTION
In the script, the format for displaying the date and decimal separator is configured.
.PARAMETER username
Specify the username for example SCOTT
.PARAMETER password
Specify the password for example TIGER
.PARAMETER connect_string
Specify the connect_string(TNS alias) for connect to database from $ORACLE_HOME/network/admin/tnsnames.ora.
.PARAMETER sql_path
Specify the directory for executing sql scripts.
.PARAMETER csv_path
Specify the directory for output csv.
.PARAMETER log_path
Specify the log file.
.EXAMPLE
This script executes all sql files from the specified directory and creates separate csv files in a specified directory.
.\run_export_all_tables.ps1 -username SCOTT -password tiger -connect_string ORCL -sql_path C:\export\sql\ -csv_path C:\export\csv\
#>
param(
[string]$username = "scott",
[string]$password = "tiger",
[string]$connect_string = "192.168.0.166:1521/TEST",
[string]$sql_path="C:\upwork\powershell_sqlplus_export_csv\sql\",
[string]$csv_path="C:\upwork\powershell_sqlplus_export_csv\csv\",
[string]$log_path="C:\upwork\powershell_sqlplus_export_csv\log_file.log"
)
# Column separator for csv file
$COLSEP=";"
# NLS_NUMERIC_CHARACTERS
$NLS_NUMERIC_CHARACTERS=".,"
$NLS_DATE_FORMAT="DD.MM.YYYY HH24:MI:SS"
#[string]$connect_string = "server2003ora10:1521/ORCL"
# Log file
$full_sql_path=$sql_path
$full_csv_path=$csv_path
$full_log_path=$log_path
#csv file extension
$csv_ext=".csv"
#Set NLS_LANG for session sqlplus
#"RUSSIAN_CIS.UTF8"
#"RUSSIAN_CIS.CL8MSWIN1251"
#"AMERICAN_AMERICA.UTF8"
#$NLS_LANG="RUSSIAN_CIS.CL8MSWIN1251"
$NLS_LANG="AMERICAN_AMERICA.CL8MSWIN1251"
#$NLS_LANG="AMERICAN_AMERICA.UTF8"
#Set NLS_LANG for session sqlplus
[Environment]::SetEnvironmentVariable("NLS_LANG",$NLS_LANG , [System.EnvironmentVariableTarget]::PROCESS)
$env_path_NLS=[Environment]::GetEnvironmentVariable("NLS_LANG", [EnvironmentVariableTarget]::PROCESS)
echo "SET session NLS_LANG: $env_path_NLS" | tee-object -Append -filepath $full_log_path
$SqlQueryExportTable1 =
#"
set heading off
set termout OFF
SET FEEDBACK OFF
SET TAB OFF
set pause off
set verify off
SET UNDERLINE OFF
set trimspool on
set timing off
set echo off
set numwidth 30
set linesize 10000
set pagesize 0
SET COLSEP '$COLSEP'
ALTER SESSION SET NLS_NUMERIC_CHARACTERS='$NLS_NUMERIC_CHARACTERS';
ALTER SESSION SET NLS_DATE_FORMAT='$NLS_DATE_FORMAT';
"#
$SqlQueryExportTable2 =
#"
exit
"#
function Check_File
{
param (
[string]$pathfile
)
try {
$A=Get-Content -Path $pathfile -ErrorAction Stop
}
catch [System.UnauthorizedAccessException]
{
#Write-Host "File $pathfile is not accessible."
echo "File $pathfile is not accessible." | tee-object -Append -filepath $full_log_path
exit
}
catch [System.Management.Automation.ItemNotFoundException]
{
#Write-Host "File $pathfile is not found."
echo "File $pathfile is not found." | tee-object -Append -filepath $full_log_path
exit
}
catch {
Write-Host "File $pathfile. Other type of error was found:"
#Write-Host "Exception type is $($_.Exception.GetType().Name)"
echo "Exception type is $($_.Exception.GetType().Name)" | tee-object -Append -filepath $full_log_path
exit
}
}
echo "===========================================================================================" | tee-object -Append -filepath $full_log_path
$date_time_start = Get-Date -Format "yyyy-MM-dd HH:mm:ss"
$date_time_log = Get-Date -Format "yyyyMMddHHmmss"
Write-host "Script start time : $date_time_start "
try
{
echo "Script start time : $date_time_start ">>$full_log_path
}
catch {
Write-Host "Log File $full_log_path. Other type of error was found:"
Write-Host "Exception type is $($_.Exception.GetType().Name)"
exit
}
#chcp 1251
$files_input = Get-Childitem -File $full_sql_path
foreach ($file_input in $files_input)
{
echo "Found SQL file $file_input " | tee-object -Append -filepath $full_log_path
$full_sql_path_file=$full_sql_path+$file_input
$user_tab= get-content -Path $full_sql_path_file | out-string
echo "Found SQL : $user_tab " | tee-object -Append -filepath $full_log_path
$sqlQuery_show_table_all=""
$sqlQuery_show_table_all=$SqlQueryExportTable1+ $user_tab+ $SqlQueryExportTable2
$full_csv_path_file=$full_csv_path + $file_input + "_" + $date_time_log + $csv_ext
echo "-------------------------------------------------------------------------------------------" | tee-object -Append -filepath $full_log_path
echo "For SQL file : $full_sql_path_file will be created new csv file: $full_csv_path_file" | tee-object -Append -filepath $full_log_path
echo "Script will run for SQL: $user_tab " | tee-object -Append -filepath $full_log_path
$sqlOutput_tab = $sqlQuery_show_table_all | sqlplus -s $username/$password@$connect_string
$sqlOutput_count = $sqlOutput_tab.count
if ($sqlOutput_tab.count -gt 0)
{
Out-File -filepath $full_csv_path_file -append -inputobject $sqlOutput_tab -encoding default
echo "Exported rows: $sqlOutput_count " | tee-object -Append -filepath $full_log_path
}
else
{
echo "No exported rows: 0 row" | tee-object -Append -filepath $full_log_path
echo "$full_csv_path_file file not created " | tee-object -Append -filepath $full_log_path
}
echo "-------------------------------------------------------------------------------------------" | tee-object -Append -filepath $full_log_path
}

Disable Downloading Cached FTP File

I've got a PowerShell script that I call from VBA using Excel. The script uses WinSCP to download some datetime-named FTP and SFTP files and saves them with a static filename, overwriting the old file, on a network drive location.
The script works on first run, but after that it loads the same cached version of the file. The workaround is to change the cache settings in IE to check for newer versions of stored webpages 'every time I visit the webpage'.
The macro is used by several people and is accessed using a variety of computers. Is there a way around this that I can incorporate in my code, either in VBA or PS so they don't have to remember to go into IE to change their settings?
Script is called from VBA:
Call Shell("powershell -executionpolicy bypass & ""H:\FTP\FTP.ps1""", vbHide)
Script:
try
{
# Load WinSCP .NET assembly
Add-Type -Path "C:\Program Files (x86)\WinSCP\WinSCPnet.dll"
$localPath = "H:\Worksheets\FTP"
$remotePath = "/outgoing/data/LatestData/"
# Setup session options
$sessionOptions = New-Object WinSCP.SessionOptions
$sessionOptions.Protocol = [WinSCP.Protocol]::ftp
$sessionOptions.HostName =
$sessionOptions.UserName =
$sessionOptions.Password =
$session = New-Object WinSCP.Session
try
{
# Connect
$session.Open($sessionOptions)
# Get list of files in the directory
$directoryInfo = $session.ListDirectory($remotePath)
# Select the most recent file
$latest = $directoryInfo.Files |
Where-Object { -Not $_.IsDirectory} |
Where-Object {
[System.IO.Path]::GetExtension($_.Name) -eq ".nc1" -or
[System.IO.Path]::GetExtension($_.Name) -eq ".ky1" -or
[System.IO.Path]::GetExtension($_.Name) -like ".tn*" } |
Group-Object { [System.IO.Path]::GetExtension($_.Name) } |
ForEach-Object{
$_.Group | Sort-Object LastWriteTime -Descending | Select -First 1
}
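# $latest now holds the newest file for each matched extension (.nc1, .ky1, .tn*)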
$extension = [System.IO.Path]::GetExtension($latest.Name)
"GetExtension('{0}') returns '{1}'" -f $fileName, $extension
if ($latest -eq $Null)
{
Write-Host "No file found"
exit 1
}
$latest | ForEach-Object{
$extension = ([System.IO.Path]::GetExtension($_.Name)).Trim(".")
$session.GetFiles($session.EscapeFileMask($remotePath + $_.Name), "$localPath\$extension.txt" ).Check()
}
$stamp = $(Get-Date -f "yyyy-MM-dd-HHmm")
$filename = $stamp.subString(0,$stamp.length-6)
$session.GetFiles(
($remotePath + $fileName),
($localPath + $fileName + "." + $stamp)).Check()
}
finally
{
# Disconnect, clean up
$session.Dispose()
}
exit 0
}
catch [Exception]
{
Write-Host $_.Exception.Message
exit 1
}

Powershell script to run list of sql files

I want to read a file that contains a newline-separated list of *.sql file names (all located in the same directory) and execute the following:
$reader = [System.IO.File]::OpenText("_Scripts.txt")
try {
for(;;) {
$line = $reader.ReadLine()
if ($line -eq $null) { break }
#output
$out = $line.split(".")[0] + ".txt" ;
# -ServerInstance: change this value to match your environment
invoke-sqlcmd -inputfile $line -serverinstance "." | format-table | out-file -filePath $out
$line
}
}
finally {
$reader.Close()
}
I'm trying to execute this script file by using a batch file containing the command:
powershell.exe -ExecutionPolicy Bypass -Command "_scripts.ps1"
but I get an error.
Can anyone help me fix my ps1 script please?
This works for me:
$lines = Get-Content C:\Temp\TEST\_Scripts.txt
ForEach ($line in $lines)
{
$out = $line.split(".")[0] + ".txt" ;
Invoke-Sqlcmd -InputFile $line -ServerInstance "localhost" -Database "master" | Format-Table | Out-File -FilePath $out
}
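As a side note, when launching a .ps1 from a batch file, passing it with -File rather than -Command is the more conventional form, e.g. (path assumed):
powershell.exe -ExecutionPolicy Bypass -File "C:\Temp\TEST\_scripts.ps1"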

Sql script runner

Get-ChildItem ".\Stored Procedures\*.sql" | ForEach-Object { sqlcmd -S ServerName -d DatabaseName -E -i $_.FullName }
When I run a batch of scripts from a folder with the above command, if a problem occurs in one of the intermediate scripts (for example a CREATE/ALTER/DROP statement fails), execution should stop right there and give me an error message.
You'll need to do a few things:
Set ErrorActionPreference to stop
Use the -b parameter of the sqlcmd.exe utility
Capture and log or display the output of sqlcmd.exe
I answered a similar question on another forum and have re-posted the answer here:
echo "select 'Good 1'" > C:\temp\scripts\1.sql
echo "select * from missingTable" > C:\temp\scripts\2.sql
echo "Select 'Good 3'" > C:\temp\scripts\3.sql
$ErrorActionPreference = "Stop"
ForEach ($S In Gci -Path "C:\Temp\Scripts\" -Filter *.sql | Sort-Object Name) {
try {
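# -b makes sqlcmd return a non-zero exit code when a SQL error occurs,
# which is what the $LASTEXITCODE check below relies on.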
$result = SqlCmd -b -S $env:computername\sql1 -i $S.FullName
$result = $result -join "`n"
if ($LASTEXITCODE -ne 0) {
throw "$S.FullName : $lastexitcode : $result"
}
else {
write-output "Success: $($s.fullname) : $result" | Out-File C:\Temp\Scripts\sqllogging.txt -Append
}
}
catch {
write-output "Failed: $_ " | Out-File C:\Temp\Scripts\sqllogging.txt -Append
throw
}
}