I'm trying to create a Tcl proc which is passed a shell command as its argument, opens a temporary file, writes a formatted string to it, and then runs the shell command in the background, appending the command's output to the same temp file.
Running the command in the background means the proc can be called again immediately afterwards
with another argument, writing to another file. That way, running a hundred such commands should not take as long as running them serially would. The multiple temp files can finally be concatenated into a single file.
This is pseudocode for what I'm trying to do:
proc runthis { args } {
    set date_str [exec date {+%Y%m%d-%H%M%S}]
    set tempFile ${date_str}.txt
    set output [open $tempFile a+]
    set command [concat exec $args]
    puts $output "### Running $args ... ###"
    # << Run the command in background and store output to tempFile >>
}
But how do I ensure the backgrounding of the task is done properly? And what needs to be done to ensure that the multiple temp files get closed properly?
Any help would be welcome. I'm new to Tcl and finding it hard to get my mind around this. I read about using threads in Tcl, but I'm working with an older version of Tcl which doesn't support threading.
How about:
proc runthis { args } {
    set date_str [clock format [clock seconds] -format {%Y%m%d-%H%M%S}]
    set tempFile ${date_str}.txt
    set output [open $tempFile a+]
    puts $output "### Running $args ... ###"
    close $output
    exec {*}$args >> $tempFile &
}
See http://tcl.tk/man/tcl8.5/TclCmd/exec.htm
Since you seem to have an older Tcl, replace
exec {*}$args >> $tempFile &
with
eval [linsert $args 0 exec] >> $tempFile &
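To illustrate how this might be used end-to-end, here's a minimal sketch (the commands and the combined.log filename are only placeholders): each call writes its own timestamped file, and once the background jobs have finished the temp files can be concatenated as you described.
runthis sleep 5
runthis ls -l
# Note: the names only have one-second resolution, so two calls within
# the same second will append to the same file.
# Later, once the background jobs have finished:
set all [open combined.log w]
foreach f [lsort [glob -nocomplain *.txt]] {
    set in [open $f r]
    puts -nonewline $all [read $in]
    close $in
}
close $all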
Related
I am trying to read multiple inputs from the user using a while loop.
I tried the following:
while
do
read -p "Enter Server Names : " multi_act
done
The 'read -p' command is not working inside the while loop. How can I solve this?
I got it fixed as below:
exec 5< file1            # open file1 on file descriptor 5
while read <&5 a; do     # the loop reads from fd 5, leaving stdin free
echo before
read var                 # so this read (or read -p) still gets terminal input
echo after
done
exec 5<&-                # close fd 5
It's working like a charm.
I am unfamiliar with the Linux/Unix environment, so do pardon me if I make any mistakes; do comment to clarify.
I have created a simple Perl script. This script creates a SQL file and, as shown, executes the file's statements so that the rows are inserted into the database.
#!/usr/bin/perl
use strict;
use warnings;
use POSIX 'strftime';
my $SQL_COMMAND;
my $HOST = "i";
my $USERNAME = "need";
my $PASSWORD = "help";
my $NOW_TIMESTAMP = strftime '%Y-%m-%d_%H-%M-%S', localtime;
open my $out_fh, '>>', "$NOW_TIMESTAMP.sql" or die 'Unable to create sql file';
printf {$out_fh} "INSERT INTO BOL_LOCK.test(name) VALUES ('wow');";
sub insert()
{
    my $SQL_COMMAND = "mysql -u $USERNAME -p'$PASSWORD' ";
    while ( my $sql_file = glob '*.sql' )
    {
        my $status = system( "$SQL_COMMAND < $sql_file" );
        if ( $status == 0 )
        {
            print "pass";
        }
        else
        {
            print "fail";
        }
    }
}
insert();
This works if I execute it while I am logged in as a user (I do not have access to Admin). However, when I set a cron job to run this file, let's say at 10.08 am, by using the line (in crontab -e):
08 10 * * * perl /opt/lampp/htdocs/otpms/Data_Tsunami/scripts/test.pl > /dev/null 2>&1
I know the script is being executed as the sql file is created. However no new rows are inserted into the database after 10.08am. I've searched for solutions and some have suggested using the DBI module but it's not available on the server.
EDIT: Didn't manage to solve it in the end. A root/admin account was used to execute the script, so that "solved" the problem.
First things first, get rid of the > /dev/null 2>&1 at the end of your crontab entry (at least temporarily) so you can actually see any errors that may be occurring.
In other words, change it temporarily to something like:
08 10 * * * perl /opt/lampp/htdocs/otpms/Data_Tsunami/scripts/test.pl >/tmp/myfile 2>&1
Then you can examine the /tmp/myfile file to see what's being output.
The most likely case is that mysql is not actually on the path in your cron job, because cron itself gives a rather minimal environment.
To fix that problem (assuming that's what it is), see this answer, which gives some guidelines on how best to expand the cron environment to give you what you need. That will probably just involve adding the MySQL executable directory to your PATH variable.
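For example, a sketch of such a crontab (the /usr/local/mysql/bin directory is an assumption; adjust it to wherever mysql actually lives on your server):
PATH=/usr/local/mysql/bin:/usr/bin:/bin
08 10 * * * perl /opt/lampp/htdocs/otpms/Data_Tsunami/scripts/test.pl >/tmp/myfile 2>&1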
The other thing you may want to consider is closing the out_fh file before trying to pass it to mysql - if the buffers haven't been flushed, it may still be an empty file as far as other processes are concerned.
The expression glob(".* *") matches all files in the current working
directory.
- http://perldoc.perl.org/functions/glob.html
You should not rely on the working directory in a cron job. If you want to use a glob (or any file operation) with a relative path, set the working directory with chdir first.
source: http://www.perlmonks.org/bare/?node_id=395387
So if your working directory is, for example /home/user, you should insert
chdir('/home/user/');
before the while loop, i.e.:
sub insert()
{
    my $SQL_COMMAND = "mysql -u $USERNAME -p'$PASSWORD' ";
    chdir('/home/user/');
    while ( my $sql_file = glob '*.sql' )
    {
        ...
Replace /home/user with wherever your SQL files are being created.
It's better to do as much processing within Perl as possible. It avoids the overhead of generating a separate shell process and leaves everything under the control of the program, so that you can handle any errors much more simply.
Database access from Perl is done using the DBI module. This program demonstrates how to achieve what you have written with the mysql utility by using DBI instead. As you can see, it's also much more concise:
#!/usr/bin/perl
use strict;
use warnings;
use DBI;
my $host = "i";
my $username = "need";
my $password = "help";
my $dbh = DBI->connect("DBI:mysql:database=test;host=$host", $username, $password);
my $insert = $dbh->prepare('INSERT INTO BOL_LOCK.test(name) VALUES (?)');
my $rv = $insert->execute('wow');
print $rv ? "pass\n" : "fail\n";
I'm launching a single EXE from a Tcl script, and would like to get the output from the EXE and display it using a simple puts command to provide user feedback. At the moment, I am launching the EXE in a CMD window where the user can see the progress, and waiting for the EXE to create a file. The first script here works whenever the output file luc.csv is created.
file delete -force luc.csv
set cmdStatus [open "| cmd.exe /c start /wait uc.exe"]
while {![file exists "luc.csv"]} {
}
# continue after output file is created
However, sometimes the file is not created, so I can't rely on this method.
I've been trying to get my head around the use of fileevent and pipes, and have tried several incarnations of the script below, but I'm obviously either missing the point or just not getting the syntax right.
puts "starting"
set fifo [open "| cmd.exe /c start uc.exe" r]
fconfigure $fifo -blocking 0
proc read_fifo {fifo} {
puts "calling read_fifo"
if {[gets $fifo x] < 0} {
if {[eof $fifo]} {
close $fifo
}
}
puts "x is $x"
}
fileevent $fifo readable [list read_fifo $fifo]
vwait forever
puts"finished"
Any help would be greatly appreciated!
If you just want to launch a subprocess and do nothing else until it finishes, Tcl's exec command is perfect.
exec cmd.exe /c start /wait uc.exe
(Since you're launching a GUI application via start, there won't be any meaningful result unless there's an error in launching. And in that case you'll get a catchable error.) Things only get complicated when you want to do several things at once.
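For instance, here's a minimal sketch of trapping that launch error (same uc.exe invocation as above):
if {[catch {exec cmd.exe /c start /wait uc.exe} err]} {
    puts "launch failed: $err"
}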
To make your original code work, you need a way to detect that the subprocess has finished; Tcl is just vwaiting forever because your code says to do that. We need to put something in to make the wait finish. A good way is to make the wait be for something to happen to the fifo variable, which can be unset after the pipe is closed, as it no longer contains anything useful. (vwait will become eligible to return once the variable it is told about is either written to or destroyed; it uses a variable trace under the covers. It also won't actually return until the event handlers it is currently processing return.)
puts "starting"
# ***Assume*** that this code is in the global scope
set fifo [open "| cmd.exe /c start uc.exe" r]
fconfigure $fifo -blocking 0
proc read_fifo {} {
global fifo
puts "calling read_fifo"
if {[gets $fifo x] < 0} {
if {[eof $fifo]} {
close $fifo
unset fifo
}
}
puts "x is $x"
}
fileevent $fifo readable read_fifo
vwait fifo
puts "finished"
That ought to work. The lines that were changed were the declaration of read_fifo (no variable passed in), the adding of global fifo just below (because we want to work with that instead), the adding of unset fifo just after close $fifo, the setting up of the fileevent (don't pass an extra argument to the callback), and the vwait (because we want to wait for fifo, not forever).
I'm using powershell to run another powershell script, at some point the other script asks for some input, I would like to be able to read the output from the other script and based on that supply input to it. Similar to what you can do with expect on bash.
Any ideas?
Thanks
Just posting my solution so that it can help someone. I faced the same problem while running some other script that will ask for answers. First create a file "inputFileLocation.txt" with answers to each question in each line in sequence. Then run the script in below syntax. And it will do the work.
`cmd.exe /c "script.bat < inputFileLocation.txt"`
You can just use the Expect program from your PowerShell. It works. PowerShell is a shell too: you can run code written in PowerShell, which calls bash code, which calls PowerShell again.
Below is a test; it passed.
It "can work with tcl expect" {
$bs = @'
echo "Do you wish to install this program?"
select yn in "Yes" "No"; do
case $yn in
Yes ) echo "install"; break;;
No ) exit;;
esac
done
'@
$bsf = New-TemporaryFile
$bs | Set-Content -Path $bsf
$tcls = @'
#!/bin/sh
# exp.tcl \
exec tclsh "$0" ${1+"$@"}
package require Expect
set timeout 100000
spawn {spawn-command}
expect {
"Enter password: $" {
exp_send "$password\r"
exp_continue
}
"#\? $" {
exp_send "1"
}
eof {}
timeout {}
}
'@
$tclf = New-TemporaryFile
$tcls -replace "{spawn-command}",("bash",$bsf -join " ") | Set-Content -Path $tclf
"bash", $tclf -join " " | Invoke-Expression
Remove-Item $bsf
Remove-Item $tclf
}
Let me explain the test:
1. Create a bash file which expects an input.
2. Create a Tcl file which calls the bash script created in step one.
3. Invoke the Tcl program from PowerShell; it works and will not wait for input.
A sample to solve part of the problem:
[void] [System.Reflection.Assembly]::LoadWithPartialName("System.Windows.Forms")
Start-Process -FilePath C:\myexecbatchfile.bat
# Wait 2 seconds for the application to start
Start-Sleep -m 2000
# Send keys
[System.Windows.Forms.SendKeys]::SendWait("input1")
[System.Windows.Forms.SendKeys]::SendWait("{ENTER}")
Start-Sleep -m 3000
[System.Windows.Forms.SendKeys]::SendWait("input2")
[System.Windows.Forms.SendKeys]::SendWait("{ENTER}")
I am not aware of any native capability to duplicate expect exactly. This question has an answer that claims to be able to pass content to/from a process, so it might work for what you want.
How to run interactive commands in another application window from powershell
Good Luck!
Lee Holmes put out an "Expect for PowerShell" on the PowerShell Gallery in 2014, called Await. It turns out that emulating expect is a lot more complicated than you'd imagine, involving Win32 calls.
Package
https://www.powershellgallery.com/packages/Await/0.8
Demo
https://www.youtube.com/watch?v=tKyAVm7bXcQ
I have an expect script which I need to run every 3 minutes on my management node to collect tx/rx values for each port attached to a DCX Brocade SAN switch, using the portperfshow command.
Each time I try to use crontab to execute the script every 3 minutes, the script does not work!
My expect script starts with #!/usr/bin/expect -f and I am calling the script using the following syntax under cron:
*/3 * * * * /usr/bin/expect -f /root/portsperfDCX1/collect-all.exp sanswitchhostname
However, when I execute the script (not under cron) it works as expected:
root# ./collect-all.exp sanswitchhostname
works just fine.
Please can someone help! Thanks.
The script collect-all.exp is:
#!/usr/bin/expect -f
#Time and Date
set day [timestamp -format %d%m%y]
set time [timestamp -format %H%M]
#logging
set LogDir1 "/FPerf/PortsLogs"
set timeout 5
set ipaddr [lrange $argv 0 0]
set passw "XXXXXXX"
if { $ipaddr == "" } {
puts "Usage: <script.exp> <ip address>\n"
exit 1
}
spawn ssh admin@$ipaddr
expect -re "password"
send "$passw\r"
expect -re "admin"
log_file "$LogDir1/$day-portsperfshow-$time"
send "portperfshow -tx -rx -t 10\r"
expect timeout "\n"
send \003
log_file
send -- "exit\r"
close
I had the same issue, except that my script was ending with
interact
Finally I got it working by replacing it with these two lines:
expect eof
exit
Changing interact to expect eof worked for me!
I needed to remove the exit part, because I had more statements in the bash script after the expect line (I was calling expect inside a bash script).
There are two key differences between a program that is run normally from a shell and a program that is run from cron:
1. Cron does not populate (many) environment variables. Notably absent are TERM, SHELL and HOME, but that's just a small proportion of the long list that will not be defined.
2. Cron does not set up a current terminal, so /dev/tty doesn't resolve to anything. (Note that programs spawned by Expect will have a current terminal.)
With high probability, any difficulties will come from these, especially the first. To fix, you need to save all your environment variables in an interactive session and use these in your expect script to repopulate the environment. The easiest way is to use this little expect script:
unset -nocomplain ::env(SSH_AUTH_SOCK) ;# This one is session-bound anyway
puts [list array set ::env [array get ::env]]
That will write out a single very long line which you want to put near the top of your script (or at least before the first spawn). Then see if that works.
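For illustration, the generated line looks something like this (the values here are made up); paste it near the top of collect-all.exp so that runs under cron see the same environment:
array set ::env {HOME /root LOGNAME root PATH /usr/local/bin:/usr/bin:/bin SHELL /bin/bash TERM xterm}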
Jobs run by cron are not considered login shells, and thus don't source your .bashrc, .bash_profile, etc.
If you want that behavior, you need to add it explicitly to the crontab entry like so:
$ crontab -l
0 13 * * * bash -c '. .bash_profile; etc ...'
$