I did ssh-add. During the session it works fine, but when I exit and reconnect to the server with ssh it does not work anymore; ssh-add -l says: no identities.
I do start the ssh-agent in my .bash_profile using eval $(ssh-agent).
Any ideas what I can do to keep the identities?
// EDIT
So here is my scenario. I am connecting to my web space using ssh. I want to pull data from GitHub using git pull. I added the connection to GitHub and git pull works fine, but it asks for the passphrase. Doing an ssh-add and entering the passphrase stops that, but only for the current session.
I have to start the ssh-agent using something like eval $(ssh-agent) because it does not autostart on the server.
The main problem I am having is that I need a script to do the git pull, which is invoked by a request from GitHub, so I cannot give it the passphrase.
If you're spawning the agent when your session starts, then it'll die when you disconnect. You could connect to a screen or byobu/tmux server, which keeps the agent alive (you can skip an instance of bash if you connect like ssh user@hostname -t 'byobu').
Otherwise, have the agent come up when the machine boots so your session comings and goings don't affect it.
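If you go the always-running route without touching system boot scripts, one common trick (a minimal sketch; ~/.ssh/agent.env is just an example path) is to save the agent's environment to a file from .bash_profile and have later logins reuse it instead of spawning a fresh agent each time:

    # in ~/.bash_profile: reuse an existing ssh-agent across logins
    AGENT_ENV="$HOME/.ssh/agent.env"
    [ -f "$AGENT_ENV" ] && . "$AGENT_ENV" > /dev/null
    ssh-add -l > /dev/null 2>&1
    if [ $? -eq 2 ]; then               # exit status 2 means "cannot contact any agent"
        ssh-agent > "$AGENT_ENV"
        . "$AGENT_ENV" > /dev/null
    fi

You would still need one ssh-add (with the passphrase) after a reboot, since the agent's key cache lives only in memory.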
Edit: you can also forward your agent from your local machine. This works very well if you happen to have the same keys available on both machines. Try something like this in your ~/.ssh/config:
Host whatever
    Hostname whatever.com
    User username
    ForwardAgent yes
You invoke this with ssh whatever, in case you are not familiar with that config file.
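A quick sanity check that the forwarding actually works, assuming your local agent already holds the key:

    ssh whatever
    ssh-add -l        # run on the remote side; it should list the keys from your local agent

If it says the agent has no identities, run ssh-add locally first and reconnect.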
Related
I have set up an Ansible environment with a control machine (CentOS) and 3 other remote hosts (CentOS). Everything is fine with regard to the actual functioning, but I want it to work a little more seamlessly, I guess.
I have set up SSH authentication using ssh-keygen (with a passphrase) on my master server and then used ssh-copy-id to all my 3 hosts, and it works.
Now each time I run my Ansible command against these servers it asks me for the passphrase, and only then does the command complete. I don't want that to happen. I tried defining that in my hosts file as you see below, but that hasn't worked. I even tried with vars and it doesn't work with that either. When I run the command ansible servers -m ping it asks me for the SSH passphrase and then it runs...
[servers]
10.0.0.1
ansible_ssh_user=root ansible_ssh_private_key_file=/home/ansible/.ssh/id_rsa
Thanks
Now each time I run my Ansible command against these servers it asks me for the passphrase, and only then does the command complete. I don't want that to happen.
Generate your SSH key without a passphrase,
or
set up the SSH key agent (ssh-agent); a sketch follows below.
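A minimal sketch of the agent approach, assuming the key is the one from your inventory at /home/ansible/.ssh/id_rsa (adjust paths to your setup):

    eval $(ssh-agent)                   # start an agent for this shell
    ssh-add /home/ansible/.ssh/id_rsa   # enter the passphrase once
    ansible servers -m ping             # no passphrase prompt while the agent is running

If you prefer the no-passphrase route instead, ssh-keygen -p -f /home/ansible/.ssh/id_rsa -N "" strips the passphrase from the existing key (it will ask for the old passphrase once).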
I often use an ssh tunnel. I open up one terminal to create the tunnel (e.g. ssh -L 1111:servera:2222 user@serverb). Then I open a new terminal to do my work. Is there a way to establish the tunnel in a terminal and somehow put it in the background so I don't need to open up a new terminal? I tried putting "&" at the end, but that didn't do the trick. The tunnel went into the background before I could enter the password. Then I did fg, entered the password and I was stuck in the ssh session.
I know one possible solution would be to use screen or tmux or something like that. Is there a simple solution I'm missing?
There are the -f and -N options exactly for that:
-f      Requests ssh to go to background just before command execution. This is
        useful if ssh is going to ask for passwords or passphrases, but the user
        wants it in the background. This implies -n. The recommended way to start
        X11 programs at a remote site is with something like ssh -f host xterm.

        If the ExitOnForwardFailure configuration option is set to ``yes'', then a
        client started with -f will wait for all remote port forwards to be
        successfully established before placing itself in the background.

-N      Do not execute a remote command. This is useful for just forwarding ports
        (protocol version 2 only).
So the full command would be ssh -fNL 1111:servera:2222 user@serverb.
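Once it is in the background you can confirm the local listener is up and find the process again later; just an illustration, any port-check tool will do:

    nc -vz localhost 1111        # should report the port as open
    pgrep -af 'ssh -fNL 1111'    # shows the PID of the backgrounded tunnel so you can kill it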
A way to prevent ssh from asking for the password would also be to use SSH public keys for authentication, with an agent that either saves the passphrase or prompts for it using an external graphical program such as pinentry.
It might also be useful for you to look into autossh, which will reconnect your SSH automatically if the connection drops.
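For example, a sketch of the same tunnel under autossh (assuming key-based auth so nothing has to prompt; -M 0 turns off autossh's monitoring port in favour of ssh's own keep-alives):

    autossh -M 0 -fN -o ServerAliveInterval=30 -o ServerAliveCountMax=3 -L 1111:servera:2222 user@serverb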
I connect to a server that runs Xubuntu and start ssh-agent there. Then I execute ssh-add on the remote server and run rsync commands that would otherwise require entering the password multiple times.
With my solution I only have to enter it once. But how can I start the ssh-agent permanently? I want to reuse it across multiple ssh sessions.
My solution so far:
ssh myhost 'eval $(ssh-agent); ssh-add;'
You can use agent forwarding in ssh: the -A switch. Basically, it will use the agent on your local host, and when you connect to myhost you will have your agent available in all your sessions and you will not be prompted for the password again.
Basically, the agent should be started automatically with your local session, so all you need to do is add your keys locally and then connect to remote hosts with the -A switch, as in the sketch below.
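A minimal sketch from your local machine (the host name is a placeholder):

    ssh-add             # load your key into the local agent, entering the passphrase once
    ssh -A myhost       # forward that agent to myhost
    # on myhost, rsync/ssh commands now authenticate through the forwarded agent, no further prompts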
It is not possible to keep ssh-agent running permanently, since it runs under your session. If you don't close the first session there is a way to connect back to it, but that is certainly not what you want to do.
I'm sorry to have to ask this question, but I feel like I've tried every answer so far on SO with no luck.
I have my local machine and my remote server. Jenkins is up and running on my server.
If I open up a terminal and do something like scp /path/to/file user@server:/path/to/wherever, then my ssh works fine without requiring a password.
If I run this command inside my Jenkins job, I get 'Host Key Verification Failed'.
So I know my SSH is working correctly the way I want, but why can't I get Jenkins to use this SSH key?
The interesting thing is, it did work fine when I first set up Jenkins and the key; then I think I restarted my local machine, or restarted Jenkins, and it stopped working. It's hard to say exactly what caused it.
I've also tried several options regarding ssh-agent and ssh-add but those don't seem to work.
I verified the local machine's .pub key is on the server in the /user/.ssh folder and is also in the authorized_keys file. The folder is owned by user.
Any thoughts would be much appreciated and I can provide more info about my problem. Thanks!
Update:
Per Kenster's suggestion I did su - jenkins, then ssh server, and it asked me to add the host to known_hosts. So I thought this was a step in the right direction, but the same problem persisted afterward.
Something I did not notice before: I can ssh server without a password when using my myUsername account. But if I switch to the jenkins user, then it asks me for my password when I do ssh server.
I also tried ssh-keygen -R server as suggested to no avail.
Try
su jenkins
ssh-keyscan YOUR-HOSTNAME >> ~/.ssh/known_hosts
The SSH Slaves Plugin doesn't support ECDSA. The command above should add an RSA key for the ssh-slave.
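If you want to make sure only the RSA host key lands in the file, ssh-keyscan accepts a -t option restricting the key type; a sketch:

    ssh-keyscan -t rsa YOUR-HOSTNAME >> ~/.ssh/known_hosts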
Host Key Verification Failed
ssh is complaining about the remote host key, not the local key that you're trying to use for authentication.
Every SSH server has a host key which is used to identify the server to the client. This helps prevent clients from connecting to servers which are impersonating the intended server. The first time you use ssh to connect to a particular host, ssh will normally prompt you to accept the remote host's host key, then store the key locally so that ssh will recognize the key in the future. The widely used OpenSSH ssh program stores known host keys in a file .ssh/known_hosts within each user's home directory.
In this case, one of two things is happening:
The user ID that Jenkins is using to run these jobs has never connected to this particular remote host before, and doesn't have the remote host's host key in its known_hosts file.
The remote host key has changed for some reason, and it no longer matches the key which is stored in the Jenkins user's known_hosts file.
You need to update the known_hosts file for the user which jenkins is using to run these ssh operations. You need to remove any old host key for this host from the file, then add the host's new host key to the file. The simplest way is to use su or sudo to become the Jenkins user, then run ssh interactively to connect to the remote server:
$ ssh server
If ssh prompts you to accept a host key, say yes, and you're done. You don't even have to finish logging in. If it prints a big scary warning that the host key has changed, run this to remove the existing host from known_hosts:
$ ssh-keygen -R server
Then rerun the ssh command.
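If you would rather do it non-interactively for the Jenkins account (assuming that account has a usable shell, and 'server' is the remote host as above), something like this should work:

    su - jenkins -c 'ssh-keygen -R server && ssh-keyscan server >> ~/.ssh/known_hosts'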
One thing to be aware of: you can't use a passphrase when you generate a key that you're going to use with Jenkins, because it gives you no opportunity to enter such a thing (seeing as it runs automated jobs with no human intervention).
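If you still need to generate such a key for the Jenkins user, a sketch (the /var/lib/jenkins home directory is the common default, but check your install; -N "" sets an empty passphrase):

    sudo -u jenkins mkdir -p /var/lib/jenkins/.ssh
    sudo -u jenkins ssh-keygen -t rsa -N "" -f /var/lib/jenkins/.ssh/id_rsa
    sudo -u jenkins ssh-copy-id -i /var/lib/jenkins/.ssh/id_rsa.pub user@server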
On Ubuntu 14.04 I have a private key in:
~/.ssh/id_rsa
I have installed the public key on the server I wish to connect to and indeed when I run the following, I do connect as expected:
ssh me#my-server-ip.com
I then deleted the private key on the client, but running the above command still connects me. This leads me to believe that the SSH binary is running in some kind of daemon mode wherein it is caching the private key in memory. Is that correct? Short of a reboot, how do I 'flush' SSH to stop it using the private key? Thanks
Run the following command after removing ~/.ssh/id_rsa
ssh-add -D
This command removes all cached SSH identities from the ssh-agent.
If you type ssh me#my-server-ip.com now, the password prompt will show.
You can check which identities the ssh-agent has cached with ssh-add -L.
I know I'm a little late to this party, but for the enlightenment of others...
It sounds like you have your private SSH key (identity) cached in ssh-agent. It is worth noting that ssh-agent does not retain the key cache over a reboot or logout/login cycle, although some systems, depending on configuration, may re-add your key during either of those processes. However, in your instance a reboot or possibly a logout/login cycle would remove the private key from the agent's cache, because you have already removed the ~/.ssh/id_rsa file and it therefore cannot be re-initialized into the agent.
For everyone else who may not have yet deleted their ~/.ssh/id_rsa file(s) or if you don't want to reboot or logout/in right now the following should prove useful.
First, you will want to remove any ~user/.ssh/id_rsa files which you no longer wish to be cached by ssh-agent.
Next, verify that there are, in fact, identities still being held open in 'ssh-agent' by running the following command:
ssh-add -L
This will list the public key parameters of all identities that the agent has actively cached. (Note: ssh-add -l will instead list the fingerprints of all keys/identities that are actively cached.) For each that you would like to remove you should run the following:
ssh-add -d /path/to/matching/public/key/file
If you just want to clear out ALL keys/identities from the agent then run this instead:
ssh-add -D
At this point, the key(s) you wanted to remove will no longer be accessible to the agent, and with the actual identity file removed there shouldn't be any way for a remote SSH connection attempt by that user to succeed without using a different authentication method, if one is configured/allowed.