ssh to remote server with arguments to run scripts - ssh

I have lots of data that needs to be processed, and I have access to 3 separate remote servers. The logic is to split up the data crunching among the 3 different servers instead of running on a single one. Note that all three remote servers are able to point to a single directory, which is where I have the master scripts to process all of the data. The problem I am having is carrying over my arguments when I call different bash scripts.
For example, I have the master script which looks something like:
processing stuff
more stuff
# call the first script
$scriptdir/step1.csh $date $time $name
Within step1.csh, if I have something very simple where I am able to connect to one of the remote servers and output the hostname to a text file, such as:
#!/bin/bash
ssh name@hostname bash -c '
echo `hostname` > host.txt
'
I get the desired outcome, where host.txt will contain the hostname of the remote server I connected to. However, if step1.csh looks like:
#!/bin/bash
mydate=$1
mytime=$2
myname=$3
ssh name@hostname bash '
echo `hostname` > host.txt
echo ${mydate} > host.txt
'
I get an error saying 'mydate: undefined variable'.
Furthermore, if I do something along the lines of:
#!/bin/bash
mydate=$1
mytime=$2
myname=$3
ssh name@hostname "python /path/to/somewhere/to/run/${mydate}/and/${mytime}"
It still runs on the local server, not the remote one. What am I missing here?

So the first part:
#!/bin/bash
mydate=$1
mytime=$2
myname=$3
ssh name@hostname bash '
echo `hostname` > host.txt
echo ${mydate} > host.txt
'
The solution is:
#!/bin/bash
mydate=$1
mytime=$2
myname=$3
ssh -T name@hostname << EOF
echo \`hostname\` > host.txt   # backticks escaped, so hostname runs on the remote side
echo ${mydate} >> host.txt     # unescaped on purpose: expanded locally before the text is sent
EOF
However, I am still having an issue: when I try to run a python script on the remote server, it is always run on the local server.
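The core of the remaining problem is that everything in an unquoted here-document is expanded by the local shell before ssh ever sends it. You can see the effect without a remote host by using plain `bash -s` as a stand-in for `ssh -T host bash -s` (a sketch; the date value is made up):

```shell
#!/bin/bash
mydate="20240101"

# 'bash -s' here stands in for: ssh -T name@hostname bash -s
out=$(bash -s << EOF
echo local:${mydate}
echo run-side:\$(hostname)
EOF
)
echo "$out"
# ${mydate} was substituted before the text was sent down the pipe;
# the escaped \$(hostname) survived and ran inside the receiving shell.
```

With a quoted delimiter (`<< 'EOF'`) nothing is expanded locally, so local variables like $mydate would no longer be carried over; escaping only the parts that must run remotely, as above, gives you both.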

Related

Use findstr to find input from file a within file b, and send the output to a different file

I'm trying to write a batch script in Windows to take a list of IP addresses and ping them. Once a site doesn't respond, I want Windows to take all of the unresponsive IP addresses, and parse them through a comparison file that has the IP and physical street addresses of these systems. Once the unresponsive sites are parsed through the comparison file, I want the end result to be the matching info from the compare file for only the sites that are unresponsive. I already have script written for Linux that does this same thing, but I wanted a Windows version for some of the customers I work with who aren't Linux savvy.
Here is my script:
@Echo Off
Set "ServerList=C:\Users\<mylogin>\ip.txt"
Set "LogFile=C:\Users\<mylogin>\PingResults.txt"
If Not Exist "%ServerList%" Exit /B
>"%LogFile%" (For /F UseBackQ %%A In ("%ServerList%"
) Do Ping -n 1 %%A|Find "TTL=">Nul&&(Echo Yes [%%A] > Nul)||Echo No [%%A])
findstr /f:%LogFile sites.txt > Down.txt
The script itself seems to execute just fine, but it doesn't put anything in the final output file, Down.txt. I'm positive I have something wrong in my findstr command.
Below is my Linux script that does this exact same thing. Yeah it's clunky but it gets the job done:
#!/bin/bash
# Script to test ssh connectivity using expect script
rm -f results.txt
clear
echo "Please be patient while the script runs..."
while read user ip port pass; do
${PWD}/test_ssh_edge_device.sh $user $ip $port $pass >> results.txt
done < rekor_edge_device_list.txt
#This will boil down the results of the ping script so only the IP address is left
cat results.txt | grep -B1 time > refined.txt
cat refined.txt | grep -Eo "\b([0-9]{1,3}\.){3}[0-9]{1,3}\b" > refinedip.txt
#This will take the boiled down IP addresses and check them against the compare file, final output will be all system info for only downed systems
list=refinedip.txt
rm -f iplist.txt
exec 3<&0
exec 0<$list
while read line
do
cat compare.txt | grep $line >> iplist.txt
done
exec 0<&3
cat iplist.txt
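As an aside, the exec/while loop at the end of the Linux script can be collapsed into a single grep call using `-f` (read patterns from a file). A sketch with made-up sample data standing in for refinedip.txt and compare.txt:

```shell
#!/bin/bash
cd "$(mktemp -d)"

# Made-up sample data in place of the real files.
printf '%s\n' 10.0.0.1 10.0.0.3 > refinedip.txt
printf '%s\n' '10.0.0.1 1 Main St' '10.0.0.2 2 Oak Ave' '10.0.0.3 3 Elm Rd' > compare.txt

# -F: treat patterns as fixed strings (dots are not regex wildcards)
# -w: whole-word match (so 10.0.0.1 cannot match 10.0.0.10)
# -f: read one pattern per line from refinedip.txt
grep -Fwf refinedip.txt compare.txt > iplist.txt
cat iplist.txt
```

This does the same join as the loop, in one pass over compare.txt instead of one grep per IP.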

Unable to run a postgresql script from bash

I am learning shell scripting. I have created a shell script whose function is to log into the DB and run a .sql file. Following are the contents of the script -
#!/bin/bash
set -x
echo "Login to postgres user for autoqa_rpt_production"
$DB_PATH -U $POSTGRESS_USER $Auto_rpt_production$TARGET_DB -p $TARGET_PORT
echo "Running SQL Dump - auto_qa_db_sync"
\\i auto_qa_db_sync.sql
After running the above script, I get the following error
./autoqa_script.sh: 39: ./autoqa_script.sh: /i: not found
Following one article, I tried reversing the slash, but it didn't work.
I don't understand why this is happening. Because when I try manually running the sql file, it works properly. Can anyone help?
#!/bin/bash
set -x
echo "Login to postgres user for autoqa_rpt_production and run script"
$DB_PATH -U $POSTGRESS_USER $Auto_rpt_production$TARGET_DB -p $TARGET_PORT -f auto_qa_db_sync.sql
The lines you put in a shell script are (more or less; let's say so for now) equivalent to what you would type at the Bash prompt (the one ending with '$', or '#' if you're root). When you execute a script (a list of commands), each command runs after the previous one terminates.
What you wanted to do was run the client and issue a "\i auto_qa_db_sync.sql" command inside it.
What you did was run the client and then, after the client terminated, issue that command in Bash.
You should read about Bash pipelines - they are the way to run a program and feed text into it. Following your original idea for solving the problem, you'd write something like:
echo '\i auto_qa_db_sync.sql' | $DB_PATH -U $POSTGRESS_USER $Auto_rpt_production$TARGET_DB -p $TARGET_PORT
Hope that helps.
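The general pattern - feed the client its commands on standard input - can be tried with any program that reads stdin. In this sketch, `bash` plays the role of the SQL client (psql would accept the same kind of pipeline; the command string is made up):

```shell
#!/bin/bash
# 'bash' stands in for "$DB_PATH -U $POSTGRESS_USER ...": the string on
# the left of the pipe travels into the client's standard input, exactly
# as '\i auto_qa_db_sync.sql' would travel into psql.
out=$(echo 'echo inside-the-client' | bash)
echo "$out"
```

The key point is that the command after the pipe keeps running while it consumes the text, instead of the text being handed to Bash after the client exits.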

Transfer files over SSH, then appended to another file

I'm trying to automate a script that copies a file from my local server to a remote server on the command line. I've done the research on scp and know how to copy the file to the remote server, but then I want to append that file to another.
This is my code:
scp ~/file.txt user@host:
ssh user@host cat file.txt >> other_file.txt
When I enter everything into the command line manually as such, everything works fine:
scp ~/file.txt user@host:
ssh user@host
cat file.txt >> other_file.txt
But when I run the script, only the file is copied, not appended to the end of other_file.txt. Help?
The second line of your code should be
ssh user@host "cat file.txt >> other_file.txt"
Three important points:
You don't want your local shell to interpret >> in any way (which it does if the command is unquoted).
There is a remote shell which will interpret >> in the command correctly.
Final arguments to ssh are "joined" to form a command, not passed along as an argv array as-is. That may be convenient, but it can also lead to confusion or bugs: ssh host cat "$MYFILE" and ssh host "cat '$MYFILE'" both work in a common use case, but they break for different values of $MYFILE.
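The quoting point is easy to demonstrate without a remote host: below, `bash -c` stands in for the remote shell that ssh would start, so you can see which side performs the redirection in each case (a sketch using throwaway files in a temp directory):

```shell
#!/bin/bash
cd "$(mktemp -d)"
echo hello > file.txt

# Unquoted: the local shell consumes >>, so only 'cat file.txt' reaches
# the stand-in "remote" command; the append happens on the local side.
bash -c 'cat file.txt' >> appended_locally.txt

# Quoted: the whole string, >> included, is executed by the stand-in
# "remote" shell, so the append happens on that side.
bash -c 'cat file.txt >> appended_remotely.txt'

grep . appended_locally.txt appended_remotely.txt
```

With a real ssh the unquoted form would append to a file on your local machine, which is exactly the bug described above.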
You need to enclose the command to be run on the remote host in quotes. Otherwise, the redirection is being done locally rather than remotely. Try this instead:
scp ~/file.txt user@host:
ssh user@host 'cat file.txt >> other_file.txt'
Try this:
$ cat file.txt | ssh hostname 'cat >> other_file.txt'

Using expect to login to amazon server with .PEM file

I need to do the following:
Log into my amazon server
Change to a specific directory and run a script
The script executes an svn up; I need to be able to pass my username and password to this script.
I've read I might be able to do this with expect? Can I do the login via a shell script and then invoke expect to run the custom script?
Basically, I'm just looking for a good way to do this and would appreciate a pointer in the right direction.
You can use ssh to pass shell commands to be run on a remote instance.
For example, here's how I check logs on multiple Servers:
#!/bin/bash
nas_servers=(
"ec2-xx-xx-xxx-xxx.ap-xxxx.compute.amazonaws.com"
"ec2-xx-xx-xxx-xxx.ap-xxxx.compute.amazonaws.com"
"ec2-xx-xx-xxx-xxx.ap-xxxx.compute.amazonaws.com"
"ec2-xx-xx-xxx-xxx.ap-xxxx.compute.amazonaws.com"
)
for s in "${nas_servers[@]}"
do
echo "Checking $s:"
ret=$(ssh -i ~/pem/Key.pem "user@$s" bash << 'EOF'
files=/var/log/syslog*
for f in $files
do
if [[ ${f##*.} = 'gz' ]]; then
cmd=zcat
else
cmd=cat
fi
$cmd $f | egrep -wi 'error|warn|crit|fail'
done
EOF
)
if [[ -z $ret ]]; then
echo "No errors found."
else
echo "$ret"
fi
done

How to inject commands at the start of an interactive SSH session?

I want to be able to just ssh to a server where I cannot modify profiles and set up the environment with several commands before getting the usual interactive session.
Any ideas?
I've been using an expect script with an "interact" command at the end - which works for most things but is clumsy and breaks some console apps. I've also been experimenting with empty-expect and socat. Any other suggestions?
If you're able to write somewhere on the filesystem, you may be able to invoke bash with a custom rc file like this:
ssh me@example.com -t bash --rcfile /home/user/my_private_profile -i
Note that this appears to only work for interactive shell, not login shells. The -t option to ssh makes it allocate a pty even though you're specifying a command.
If you can't write to the filesystem anywhere, you could use a subshell to supply a named pipe as the rcfile:
$ ssh ares -t "bash --rcfile <(echo 'FOO=foo';echo 'BAR=bar') -i"
axa@ares:~$ echo $FOO
foo
axa@ares:~$ echo $BAR
bar
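The same --rcfile trick can be checked entirely locally: start bash with a process-substitution rc file and confirm the variables are set. A sketch; -i forces the shell to count as interactive (which is what makes it read the rc file), and stderr is silenced because an interactive shell without a tty prints job-control warnings:

```shell
#!/bin/bash
# --rcfile replaces ~/.bashrc; <(...) supplies it as a named pipe,
# so nothing needs to be written to the remote filesystem.
out=$(bash --rcfile <(echo 'FOO=foo'; echo 'BAR=bar') -i -c 'echo $FOO $BAR' 2>/dev/null)
echo "$out"
```

Once this works locally, wrapping it in `ssh host -t "..."` as shown above moves the same behavior to the remote end.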