How to use the ssh command in a shell script?

I know that we should do
ssh user@target
but where do we specify the password?
Hmm, thanks for all your replies.
My requirement is that I have to start up some servers on different machines, and all of them should be started with one shell script. Entering the password every time seems a little bad, but I guess I will have to resort to that option. One reason I don't want to save the public keys is that I may not connect to the same machines every time. It is easy enough to go back and modify the script to change the target addresses, though.

The best way to do this is by generating a private/public key pair and storing your public key on the remote server. This is a secure way to log in without typing a password each time.
Read more here
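For example, a minimal sketch of the setup (user@target is a placeholder for your own remote account):
ssh-keygen -t rsa        # generate the key pair; accept the defaults
ssh-copy-id user@target  # append your public key to the remote authorized_keys
ssh user@target          # should now log in without a password prompt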

This cannot be done with a simple ssh command, for security reasons. If you insist on the password route with ssh, the following link shows some scripts to get around this:
Scripts to automate password entry

The ssh command will prompt for your password. It is unsafe to specify passwords on the command line, as the full command that is executed is typically world-visible (e.g. via ps aux) and also gets saved in plain text in your command history file. Any well-written program (including ssh) will prompt for the password when necessary, and will disable teletype echoing so that it isn't visible on the terminal.
If you are attempting to execute ssh from cron or from the background, use ssh-agent.
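A rough sketch of that workflow (the key path ~/.ssh/id_rsa is an assumption; use whichever key you have):
eval "$(ssh-agent -s)"     # start the agent and export its environment variables
ssh-add ~/.ssh/id_rsa      # asks for the passphrase once, then caches the key
ssh user@target 'uptime'   # later connections reuse the cached key silently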

The way I have done this in the past is to set up a pair of authentication keys.
That way you can log in without ever having to specify a password, and it works in shell scripts. There is a good tutorial here:
http://linuxproblem.org/art_9.html

SSH keys are the standard/suggested solution. The keys must be set up for the user that the script will run as.
For that script user, see if you have any keys set up in ~/.ssh/ (key files end with a .pub extension).
If you don't have any keys setup you can run:
ssh-keygen -t rsa
which will generate ~/.ssh/id_rsa and ~/.ssh/id_rsa.pub (the -t option supports other key types as well).
You can then copy the contents of this file into ~/.ssh/authorized_keys in the remote user's home directory on the remote machine.
As the script user, you can test that it works by:
ssh remote-user@remote-machine
You should be logged in without a password prompt.
Along the same lines, when your script runs as that user, it can now SSH to the remote machine automatically, as the sketch below shows.
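For instance, a hypothetical version of the start-up script from the question (the host names, user, and start command are placeholders):
#!/bin/sh
# start a server on each target machine over password-less SSH
for host in host1 host2 host3; do
    ssh -n remote-user@"$host" '/path/to/start-server.sh' &   # -n: don't read stdin
done
wait    # wait for all the background ssh sessions to finish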

If you really want to use password authentication, you can try expect. See here for an example.
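For instance, a minimal sketch driving expect from a shell script (user@target, uptime, and the password are placeholders; key-based login remains the safer route):
#!/bin/sh
expect <<'EOF'
spawn ssh user@target uptime
expect "password:"
send "yourpassword\r"
expect eof
EOF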

Related

Calling SSH command from Jenkins

Jenkins keeps using the default "jenkins" user when executing builds. My build requires a number of SSH calls. However, these SSH calls fail with host verification exceptions, because I haven't been able to place the public key for this user on the target server.
I don't know where the default "jenkins" user is configured and therefore can't generate the required public key to place on the target server.
Any suggestions for any of the following:
A way to force Jenkins to use a user I define
A way to enable SSH for the default Jenkins user
A way to fetch the password for the default 'jenkins' user
Ideally I would like to be able to do all of the above; any help greatly appreciated.
Solution: I was able to access the default Jenkins user with an SSH request from the target server. Once I was logged in as the jenkins user, I was able to generate the public/private RSA keys, which then allowed password-free access between the servers.
Because, when you have numerous slave machines, it can be hard to anticipate which of them a build will execute on, rather than explicitly calling ssh I highly suggest using the existing Jenkins plug-ins for executing remote commands over SSH:
Publish Over SSH - execute SSH commands or transfer files over SCP/SFTP.
SSH - execute SSH commands.
The default 'jenkins' user is the system user running your Jenkins instance (master or slave). Depending on your installation, this user may have been created either by the install scripts (deb/rpm/pkg etc.) or manually by your administrator. It may or may not be called 'jenkins'.
To find out which user your Jenkins instance is running as, open http://$JENKINS_SERVER/systemInfo, available from the Manage Jenkins menu.
There you will find your user.home and user.name. E.g. in my case on a Mac OS X master:
user.home /Users/Shared/Jenkins/Home/
user.name jenkins
Once you have that information, you will need to log onto that Jenkins server as the user running Jenkins and ssh into those remote servers to accept the SSH fingerprints.
An alternative (which I've never tried) would be to use a custom Jenkins job to accept those fingerprints by, for example, running the following command in an SSH build task:
ssh -o "StrictHostKeyChecking no" your_remote_server
This last tip is of course completely unacceptable from a pure security point of view :)
So one might make a "job" which writes the host keys as a constant, like:
echo "....." > ~/.ssh/known_hosts
just fill in the dots with the output of ssh-keyscan -t rsa {ip}, after you have verified it.
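For example, a sketch of that verify-then-trust step ({ip} is the same placeholder as above):
ssh-keyscan -t rsa {ip} > /tmp/scanned_key
# compare /tmp/scanned_key against a fingerprint you trust before continuing
cat /tmp/scanned_key >> ~/.ssh/known_hosts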
That's correct: pipeline jobs will normally run as the user jenkins, which means that SSH access needs to be set up for this account for it to work in pipeline jobs. People have all sorts of complex build environments, so it seems like a fair requirement.
As stated in one of the answers, each individual configuration could be different, so check under "System Information" or similar, in "Manage Jenkins" on the web UI. There should be a user.home and a user.name for the home directory and the username respectively. On my CentOS installation these are "/var/lib/jenkins/" and "jenkins".
The first thing to do is to get shell access as the user, jenkins in our case. Because this is an auto-generated service account, a login shell is not enabled by default. Assuming you can log in as root, or preferably as some other user (in which case you'll need to prepend sudo), switch to jenkins as follows:
su -s /bin/bash jenkins
Now you can verify that it's really jenkins and that you entered the right home directory:
whoami
echo $HOME
If these don't match what you see in the configuration, do not proceed.
All is good so far, let's check what keys we already have:
ls -lah ~/.ssh
There may already be keys there, created with the hostname. See if you can use them:
ssh-copy-id user@host_ip_address
If there's an error, you may need to generate new keys:
ssh-keygen
Accept the default values and use no passphrase; it will put the new keys in the home directory without overwriting anything else. Now you can run ssh-copy-id again.
It's a good idea to test it with something like
ssh user@host_ip_address ls
If it works, so should ssh, scp, rsync etc. in the Jenkins jobs. Otherwise, check the console output to see the error messages and try those exact commands on the shell as done above.
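If the job still fails, a hypothetical pair of troubleshooting commands to run in that same jenkins shell (user@host_ip_address is a placeholder, as above):
ssh -v user@host_ip_address ls 2>&1 | tail -20   # verbose mode shows which keys ssh offered
ls -la ~/.ssh                                    # confirm the key files exist and check their permissions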

Script to ssh to remote server and overwrite file

Been toying around with my Raspberry Pi running Raspbian.
I'd like to update a webpage with a shell script that requires no input, such as a password.
I just tried creating the keys and putting them in the .ssh directory on the remote server, but when I run my simple shell script of ssh user@domain.net 'ls', it still prompts me for a password.
I also looked into paramiko slightly, but didn't get very far with it.
All I need is to update/replace an html file with text/information that I have.
Thanks
I think your public/private key authentication is not configured correctly on your server.
Can you check this link, which explains the authentication steps: SSH Authentication
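One thing worth double-checking, since it is a common reason for keys being silently ignored, is the permissions on the remote side; sshd refuses keys kept in group- or world-writable locations:
chmod 700 ~/.ssh                   # on the remote server
chmod 600 ~/.ssh/authorized_keys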

still asking for password even after setting up the machine for Password-less SSH Login

I need to copy a file from a remote machine to my local machine and I need to automate it.
I've tried the scp command and it's working; however, I could not automate the part where it asks for the passwords of the users on the local machine and the remote machine.
Based on this article I can perform SSH login without a password using ssh-keygen & ssh-copy-id.
After following all the instructions written there, I tried to access the remote machine using this
ssh lalala@XXX.XXX.XXX.XXX
It works; it doesn't ask for the password anymore. But when I tried copying a file from that machine using the command below,
scp lalala@XXX.XXX.XXX.XXX:'/a/b/c.txt' lelele@XXX.XXX.XXX.YYY:'/b/c/'
it still asks for the password of the local machine, which is lelele@XXX.XXX.XXX.YYY.
I wonder if I did something wrong. What could it be? Is there something wrong with the format of the command?
BTW, I'm using CentOS, and I'm planning to script this in Python.
If you are copying to your local machine, why don't you just do
scp lalala@XXX.XXX.XXX.XXX:'/a/b/c.txt' /b/c/
? When both the source and the destination are remote, the transfer runs between the two remote hosts, so the key you set up from your local machine does not cover that hop.
I tried your line on some machine with a similar setup and didn't get asked for a password; I got an error instead, but this is probably due to differences in our configurations. I tried mine and it worked.
Regarding whether your connection succeeds on the remote machine, you could tail this file there:
tail -f /var/log/secure
If you see no error there, you can be sure (well, never say always) that your layout with the generated keys is working.
In this case I bet you'll see no error there.
I think you may have multiple SSH keys and have set IdentitiesOnly to yes. If so, please check this answer: https://askubuntu.com/a/999306/398861

Prevent interactive ssh prompts

I have an ssh script that uses a local key to log in to the remote host; nothing too exciting there. The key has a passphrase, and I usually add it to an agent to avoid prompting.
Occasionally I run the program before the agent is running, and it hangs waiting for the unlock phrase. In such cases, rather than prompting interactively, I want the command to simply fail.
Anyone know if there's an option for this?
Sure is.
ssh REMOTE_HOST -o "BatchMode yes"
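In a script that might look like this sketch (REMOTE_HOST as above; true is just a cheap no-op command):
# with BatchMode, ssh exits with an error instead of prompting
ssh -o "BatchMode yes" REMOTE_HOST true || {
    echo "ssh failed; is the agent running?" >&2
    exit 1
}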

Using expect to pass a password to ssh

How can I use expect to send a password to an ssh connection?
Say the password was p@ssword
and the ssh command was
ssh me@127.0.0.1
What would I do with expect to make it input the password when it says
me@127.0.0.1's password:
?
The proper action of using an SSH key pair isn't an option because I would have to use ssh (scp) to put the key on the server, which would ask for a password.
I have always used the "proper" solution myself, but I have used expect in other situations. Here I found the following suggestion:
#!/usr/local/bin/expect
# spawn sftp, wait for the password prompt, then send the password
spawn sftp -b cmdFile user@yourserver.com
expect "password:"
send "shhh!\n"
interact
Would it not be easier to use public-key authentication with a key that has no passphrase?
As the user on the source machine, do this to make an RSA key:
ssh-keygen -t rsa
Now copy ~/.ssh/id_rsa.pub to the target machine and append it to the authorized_keys file of the target user, for example as sketched below.
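A hedged one-liner for the append step (user@target is a placeholder; this copy itself still asks for the password once):
cat ~/.ssh/id_rsa.pub | ssh user@target 'mkdir -p ~/.ssh && cat >> ~/.ssh/authorized_keys'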
Your quickest way forward (unless you want to become a Tcl expert, which would be... unusual... in 2009) is probably to use autoexpect. Here's the man page:
http://expect.nist.gov/example/autoexpect.man.html
In short, fire up autoexpect, run your ssh session, finish up what you need to do, stop autoexpect, and then beat your keyboard over the resulting mess until it works :) I'm assuming you don't need anything more than a quick hack to get your keys sorted out, and then, well, it sounds like you know the score already with that.
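The recorded workflow looks roughly like this (autoexpect writes script.exp by default; user@target is a placeholder):
autoexpect ssh user@target    # records the interactive session into script.exp
# ...log in, do what you need, then exit...
expect script.exp             # replays the recorded session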
And there's this question which already contains an example close to what you seek.
Cygwin has autoexpect, just not in the bin package. Run setup.exe, search for expect, and check the source checkbox. You will see the resulting tree in /usr/src, and in there is an expect/expect/examples directory. In there lives a copy of the autoexpect script.
The key solution will not work... because the keys have to be readable only by the person running ssh. On XP you cannot create the key structure with the correct permissions, so ssh will not read them. This may have changed, but last I checked it still did not work.
I'm pretty sure it is not possible to do what you're trying to do. Most *nix applications that prompt for a password read from the TTY directly, not stdin, so you can't pipe the password in. You can, as others have mentioned, configure SSH to not prompt for a password, as explained here.
After I was downvoted for no apparent reason, I went and did a little more research on the expect command and discovered that it has a send_tty command that sends to /dev/tty instead of stdin, which might actually do what you want... I was previously unaware of this feature. I still recommend putting the key on the server, however.