I've searched on stackoverflow and haven't really found an answer to this.
I'm pretty new to scripting and I have created a fully functional Expect script, but I would like to improve it a bit. Currently I take three lindex values from argv (0, 1 and 2) for the host address, username and password.
I would like to fall back to a default username and password if argv 1 and 2 are NOT specified. I tried solving this with some if statements, but after searching through Stack Overflow it seems that Tcl/Expect does not support NULL or empty values; instead you have to query for them. Currently my code looks like this:
#!/usr/bin/expect
#Variables
set HOSTADDRESS [lindex $argv 0]
set USER [lindex $argv 1]
set PASSWORD [lindex $argv 2]
spawn ssh $USER@$HOSTADDRESS
set timeout 100
expect {
    "(yes/no)?" {send "yes\n"; exp_continue}
    "assword:" {send "$PASSWORD\n"}
}
expect {
    "%" {send "cli\r"; exp_continue}
    ">" {sleep 1}
}
send "show interfaces st0 terse | match st0. | count \r"
expect "Count:???"
puts [open $HOSTADDRESS.op5.vpn.results w] $expect_out(0,string)
expect ">"
send "exit\r"
expect {
    "%" {send "exit\r"; exp_continue}
    "closed." {exit}
}
exit
Can you guys please help me create default values for $USER and $PASSWORD when argv 1 or argv 2 is not specified?
Tcl doesn't support NULL at all. Or rather, it actually maps it to the variable being unset (that's exactly what happens with local variables under the hood; global variables are different). To query whether a variable exists, you use info exists (and yes, that's actually a NULL check in its implementation).
However, for handling defaulting of values from users on the command line, it is better to do it like this:
proc parseArgv {hostAddress {user "TheDefaultUser"} {pass "TheDefaultPassword"}} {
    variable ::HOSTADDRESS $hostAddress
    variable ::USER $user
    variable ::PASSWORD $pass
}
parseArgv {*}$argv
If you're using 8.4 (upgrade, man!) then replace that last line with:
eval parseArgv $argv
You could also do it by looking at the llength of $argv (or the value in $argc) and doing conditional stuff based on that, but leveraging Tcl's proc default argument value stuff is easier (and you even get a reasonable error when someone gives too few or too many arguments).
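For comparison, the llength-based alternative mentioned above might look like this sketch (the defaults and the sample host address are made up; the block simulates a command line where only the host was supplied):

```tcl
# simulate a command line where only the host was supplied
set argv {203.0.113.7}

# fall back to a default when fewer than three arguments were given
set HOSTADDRESS [lindex $argv 0]
set USER [expr {[llength $argv] > 1 ? [lindex $argv 1] : "TheDefaultUser"}]
set PASSWORD [expr {[llength $argv] > 2 ? [lindex $argv 2] : "TheDefaultPassword"}]
puts "$HOSTADDRESS $USER $PASSWORD"
```

Note that lindex simply returns an empty string for an out-of-range index, so you could also test `$USER eq ""`; checking llength instead distinguishes "not given" from "deliberately given as empty".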
I have an expect script that I would like to do unit-tests for, but I'm unsure how to go about it.
My initial thought was to override keychain, lpass and bw somehow, but I have no idea how to do this without modifying the original script. In my other tests I have overridden functions with shell function stubs, and set PATH='' in some cases. I guess I could test all three executed commands manually, but that doesn't really test the project as a whole, and it leaves untested some code that is vital to the functionality.
#!/usr/bin/expect --
set manager [lindex $argv 0]
# strip manager part
set argv [lrange $argv 1 end]
spawn -noecho keychain --quick --quiet --agents ssh {*}$argv
foreach key $argv {
    if {$manager == "lastpass"} {
        set pass [exec lpass show --name $key --field=Passphrase | tr -d '\n']
    }
    if {$manager == "bitwarden"} {
        set pass [exec bw get password $key | tr -d '\n']
    }
    expect ":"
    send "$pass\r"
}
interact
Any suggestions would be highly appreciated!
For unit testing, you can just put a directory at the start of the PATH with mock scripts for the keychain, lpass and bw commands. After all, in a unit test you're really just checking that the code in the script itself is plausible and doesn't contain stupid errors. Yes, there are other ways of doing that, but mocking the commands via a PATH tweak is definitely the easiest and most effective way.
However, this is definitely a case where the useful testing is integration testing, where you run against the real commands. Of course, you might do that in some sort of testing environment; a VM, or a comparatively lightweight container such as Docker, might help here.
You do not need to test whether exec and spawn obey the PATH. That's someone else's job and definitely is tested!
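As a sketch of that PATH tweak (the mock directory name and the fixed passphrase are made up for illustration), a test harness could generate the stubs on the fly before spawning the script under test:

```tcl
# build a directory of mock commands and put it first on PATH, so that
# exec/spawn inside the script under test pick up the stubs
set mockDir [file join [pwd] mocks]
file mkdir $mockDir

# a stub 'lpass' that always prints a known passphrase
set f [open [file join $mockDir lpass] w]
puts $f "#!/bin/sh"
puts $f "echo fake-passphrase"
close $f
file attributes [file join $mockDir lpass] -permissions 0755

set env(PATH) "$mockDir:$env(PATH)"

# anything this process runs or spawns now sees the mock first
puts [exec lpass show --name somekey --field=Passphrase]
```

The same pattern works for keychain and bw stubs; the harness can then spawn the real script and assert on what the stubs were asked for.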
For starters, I'm a complete novice with expect scripts. I have written a few ssh scripts, but I can't seem to figure out how to get the latest 3 log files after running a set of tests for a new build. My main goal is to find the latest log files and copy them to my local machine. PLEASE DON'T tell me that it's bad practice to hard-code the login and password; I'm doing so temporarily to get the script working. My code currently...
#!/usr/bin/expect -f
set timeout 15
set prompt {\]\$ ?#}
spawn ssh -o "StrictHostKeyChecking no" "root@remote_ip"
expect {
    "RSA key fingerprint" {send "yes\r"; exp_continue}
    "assword:" {send "password\r"; exp_continue}
}
sleep 15
send -- "export DISPLAY=<display_ip>\r"
sleep 5
send "cd /path/to/test/\r"
sleep 5
set timeout -1
send "bash run.sh acceptance.test\r"
#Everything above all works. The tests has finished, about to cp log files
send "cd Log\r"
sleep 5
send -- "pwd\r"
sleep 5
set newestFile [send "ls -t | head -3"]
#tried [eval exec `ls -t | head -3`]
#No matter what I try, my code always gets stuck here. Either it won't close the session,
#or I get "ls: invalid option -- '|'", or just nothing and it closes the session.
#It usually never makes it beyond here :(
expect $prompt
sleep 5
puts $newestFile
sleep 5
send -- "exit\r"
sleep 5
set timeout 120
spawn rsync -azP root@remote_ip:'ls -t /logs/path/ | head -3' /local/path/
expect {
    "fingerprint" {send "yes\r"; exp_continue}
    "assword:" {send "password\r"; exp_continue}
}
Thanks in advance
When writing an expect script, you need to follow the pattern of expecting the remote side to write some output (e.g., a prompt) and then sending something to it in reply. The overall pattern is spawn, expect, send, expect, send, …, close, wait. If you don't expect from time to time, there are some buffers that fill up, which is probably what's happening to you.
Let's fix the section with the problems (though you should be expecting the prompt before this too):
send "cd Log\r"
expect -ex $prompt
send -- "pwd\r"
expect -ex $prompt
send "ls -t | head -3\r"
# Initialise a variable to hold the list of files produced
set newestFiles {}
# Skip over the line "typed in" just above
expect \n
expect {
    -re {^([^\r\n]*)\r\n} {
        lappend newestFiles $expect_out(1,string)
        exp_continue
    }
    -ex $prompt
}
# Prove what we've found for demonstration purposes
send_user "Found these files: \[[join $newestFiles ,]\]\n"
I've also made a few other corrections. In particular, send has no useful result itself, so we need an expect with a regular expression (use the -re flag) to pick out the filenames. I like to use the other form of the expect command for this, as that lets me match against several things at once. (I'm using the -ex option for exact matching with the prompts because that works better in my testing; you might need it, or might not.)
Also, make sure you use \r at the end of a line sent with send, or the other side will still be waiting “for you to press Return”, which is what the \r simulates. And don't forget to use:
exp_internal 1
when debugging your code, as that tells you exactly what expect is up to.
Using cygwin 32bit, expect v5.45 (what comes with the latest cygwin, it seems).
COMMAND="$WIN_BUILD_ROOT\\scripts\\signBinaries.bat $BUS $NET_DRIVE $WIN_SIGNING_ROOT"
$CLIENT_BUILD_ROOT/scripts/runCommand.sh $arg1 $arg2 $arg3 $arg4 $COMMAND
runCommand.sh:
#!C:\cygwin\bin\expect.exe -f
set timeout 9
set arg1 [lindex $argv 0]
set arg2 [lindex $argv 1]
set arg3 [lindex $argv 2]
set arg4 [lindex $argv 3]
set COMMAND [lrange $argv 4 end]
send -- "$COMMAND\r"
Gives me:
{s:\git\builds\scripts\signBinaries.bat} 64 s {s:\git\builds\}
The filename, directory name, or volume label syntax is incorrect.
The scenario is that the first four arguments are fixed. A variable number of words can follow them, which I want to capture as one command for expect to execute. If I just use [lindex $argv 4], I only get the signBinaries script name, whether or not it's enclosed in quotes. Using lrange (found via googling) instead, the string arguments get enclosed in braces, as shown. Why is it modifying my arguments in this fashion, and how do I fix it so that $COMMAND contains the command as I intended it to be?
In Tcl, which expect extends, you have to be aware of the datatype of your variables: is it a string or a list? When a list gets stringified, it may quote some of its elements if they contain "metacharacters" (like backslash and braces).
In most cases when you want to use the contents of a list as a single string, it's best to stringify it yourself:
set COMMAND [list "s:\\git\\builds\\scripts\\signBinaries.bat" 64 s "s:\\git\\builds\\"]
puts $COMMAND
# => {s:\git\builds\scripts\signBinaries.bat} 64 s s:\\git\\builds\\
puts [join $COMMAND " "]
# => s:\git\builds\scripts\signBinaries.bat 64 s s:\git\builds\
Aha! Figured it out:
set COMMAND [join [lrange $argv 4 end]]
Gives me:
s:\git\builds\scripts\signBinaries.bat 64 s s:\git\builds
"BUS: 64"
"DRIVE: s"
"WORKSPACE: s:\git\builds"
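For reference, the quoting behaviour is easy to reproduce in isolation (a sketch with a shortened argument list taken from the example above):

```tcl
# a list whose first element contains backslashes: Tcl's canonical
# string form brace-quotes it, which the Windows shell then rejects
set cmdWords [list {s:\git\builds\scripts\signBinaries.bat} 64 s]
puts $cmdWords
# => {s:\git\builds\scripts\signBinaries.bat} 64 s

# join flattens it back to a plain space-separated command string
puts [join $cmdWords]
# => s:\git\builds\scripts\signBinaries.bat 64 s
```

The braces are only part of the list's string representation, not of the data; join discards them because it produces a plain string rather than a well-formed list.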
I'm trying to drive an sh session from an expect script on an embedded system (I don't want to change the firmware to include this script). So I have the following script, which doesn't work because of the wrong handling of val in the if-block:
#!/usr/bin/env expect
set timeout 20
set ipaddr [lindex $argv 0]
spawn telnet $ipaddr
expect "soc1 login: "
send "root\n"
expect "prompt # "
send "val=`some_command`\n"
expect "prompt # "
send "if [ \$val -eq 0 ]; then echo Good; fi\n"
# its here ^^^^^
expect "prompt # "
send "exit\n"
interact
I've tried to use $$ and it doesn't help.
How to fix this script to allow usage of variables inside sh script?
The problem with the script is Tcl's interpretation of [ and ]: inside double quotes they trigger command substitution, so these brackets must be escaped:
send "if \[ \$val -eq 0 \]; then echo Good; fi\n"
# its here ^^^^^
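An alternative that avoids the escaping entirely is to brace-quote the command, since Tcl performs no substitution inside braces (the newline then has to be sent separately). A minimal sketch, using puts to stand in for send so you can see exactly what would be transmitted:

```tcl
# braces suppress all Tcl substitution, so [, ] and $ pass through
# untouched; in the expect script this would be: send $cmd; send "\n"
set cmd {if [ $val -eq 0 ]; then echo Good; fi}
puts $cmd
# => if [ $val -eq 0 ]; then echo Good; fi
```

Either way the shell on the far end receives a literal $val, which is what you want here because val is a shell variable, not a Tcl one.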
I have an expect script which I need to run every 3 minutes on my management node to collect the tx/rx values for each port attached to a DCX Brocade SAN switch, using the command portperfshow.
Each time I try to use crontab to execute the script every 3 minutes, the script does not work.
My expect script starts with #!/usr/bin/expect -f and I am calling the script using the following syntax under cron:
*/3 * * * * /usr/bin/expect -f /root/portsperfDCX1/collect-all.exp sanswitchhostname
However, when I execute the script (not under cron) it works as expected:
root# ./collect-all.exp sanswitchhostname
works just fine.
Please can someone help! Thanks.
The script collect-all.exp is:
#!/usr/bin/expect -f
#Time and Date
set day [timestamp -format %d%m%y]
set time [timestamp -format %H%M]
#logging
set LogDir1 "/FPerf/PortsLogs"
set timeout 5
set ipaddr [lrange $argv 0 0]
set passw "XXXXXXX"
if { $ipaddr == "" } {
    puts "Usage: <script.exp> <ip address>\n"
    exit 1
}
spawn ssh admin@$ipaddr
expect -re "password"
send "$passw\r"
expect -re "admin"
log_file "$LogDir1/$day-portsperfshow-$time"
send "portperfshow -tx -rx -t 10\r"
expect timeout "\n"
send \003
log_file
send -- "exit\r"
close
I had the same issue, except that my script was ending with
interact
Finally I got it working by replacing it with these two lines:
expect eof
exit
Changing interact to expect eof worked for me!
I needed to remove the exit part, because I had more statements in the bash script after the expect line (I was calling expect inside a bash script).
There are two key differences between a program that is run normally from a shell and a program that is run from cron:
Cron does not populate (many) environment variables. Notably absent are TERM, SHELL and HOME, but that's just a small proportion of the long list that will not be defined.
Cron does not set up a current terminal, so /dev/tty doesn't resolve to anything. (Note, programs spawned by Expect will have a current terminal.)
With high probability, any difficulties will come from these, especially the first. To fix it, save all your environment variables from an interactive session and use them in your expect script to repopulate the environment. The easiest way is to run this little Tcl snippet interactively:
unset -nocomplain ::env(SSH_AUTH_SOCK) ;# This one is session-bound anyway
puts [list array set ::env [array get ::env]]
That will write out a single very long line which you want to put near the top of your script (or at least before the first spawn). Then see if that works.
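For illustration, the emitted line has this shape (the variable names and values below are made up; the real line will contain your whole interactive environment):

```tcl
# evaluated at the top of the cron-run script, this merges the saved
# interactive environment into the (nearly empty) cron environment
array set ::env {
    HOME  /home/someuser
    TERM  xterm
    SHELL /bin/bash
}
```

Because array set merges rather than replaces, anything cron did define is left alone unless the saved session also had a value for it.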
Jobs run by cron are not considered login shells, and thus don't source your .bashrc, .bash_profile, etc.
If you want that behavior, you need to add it explicitly to the crontab entry like so:
$ crontab -l
0 13 * * * bash -c '. .bash_profile; etc ...'
$