Managing SSH keys

Suppose that I have a remote server and a private GitLab repository.
I do not want to add a key from my server to the SSH keys in my GitLab account, so I use SSH agent forwarding.
However, when I try to run my initial installation scripts with sudo, I cannot pull the repository, because under sudo the forwarded SSH keys are not accessible on the server.
How can I handle this?

First check, as described here, whether you can allow the other user (sudo -u otheruser) to access the $SSH_AUTH_SOCK file and its directory.
That way, you would still benefit from the original SSH agent forwarding: simply granting more permissions on $(dirname $SSH_AUTH_SOCK) can be enough.
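For example, a minimal sketch, assuming setfacl is available and sudo preserves SSH_AUTH_SOCK (e.g. via Defaults env_keep += SSH_AUTH_SOCK in sudoers); "otheruser" stands for whatever account your script switches to:
setfacl -m u:otheruser:x "$(dirname "$SSH_AUTH_SOCK")"   # let otheruser traverse the socket's directory
setfacl -m u:otheruser:rw "$SSH_AUTH_SOCK"               # let otheruser read/write the agent socket
sudo -u otheruser SSH_AUTH_SOCK="$SSH_AUTH_SOCK" git pull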

Related

Access to jumpbox as normal user and change to root user in ansible

Here is my situation: I want to access a server through a jumpbox/bastion host.
I log in as a normal user on the jumpbox, then change to the root user, and after that log in to the remote server as root. I don't have direct root access on the jumpbox.
$ ssh user@jumpbox
user@jumpbox:~$ su - root
Enter Password:
root@jumpbox:~# ssh root@remoteserver
Enter Password:
root@remoteserver:~#
The above is the manual workflow. I want to achieve this in Ansible.
I have seen something like this:
ansible_ssh_common_args: '-o ProxyCommand="ssh -W %h:%p -q user@jumpbox"'
This does not work when we need to switch to root and then log in to the remote server.
There are a few things to unpack here:
General Design / Issue:
This isn't an Ansible issue, it's an ssh issue/proxy misconfiguration.
A bastion host/SSH proxy isn't meant to be logged into so that commands can be run on it interactively (like su - root, enter a password, then ssh ...). That's not really a bastion; it's just a server you're logging into and running commands on, not an actual SSH proxy/bastion/jump role. At that point you might as well just run Ansible on that host.
That's why things like ProxyJump and ProxyCommand aren't working. They are designed to work with ssh proxies that are configured as ssh proxies (bastions).
Running Ansible Tasks as Root:
Ansible can run tasks with sudo during execution (this is called "become" in Ansible lingo), so you should never need to SSH as the literal root user with Ansible (you really shouldn't ever SSH as root). For example:
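A minimal playbook sketch (the host and user names are placeholders):
- hosts: remoteserver
  remote_user: user          # the unprivileged login account
  become: true               # escalate to root (sudo) for the tasks
  tasks:
    - name: confirm tasks run as root
      ansible.builtin.command: whoami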
Answering the question:
There are a lot of workarounds for this, but the straightforward answer here is to configure the jump host as a proper bastion and your issue will go away. An example...
1. As the bastion "user", create an SSH key pair, or use an existing one.
2. On the bastion, edit the user's ~/.ssh/config file to access the target server with the private key and desired user.
   Example user@bastion's ~/.ssh/config (I cringe seeing root here)...
   Host remote-server
       User root
       IdentityFile ~/.ssh/my-private-key
3. Add the public key created in step 1 to the target server's ~/.ssh/authorized_keys file for the user you're logging in as.
With that configuration in place, your jump host works as a regular SSH proxy. You can then use ProxyCommand or ProxyJump as you originally tried, without issue. For example:
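In your inventory or group_vars (reusing the names from the question):
ansible_ssh_common_args: '-o ProxyJump=user@jumpbox'
The ProxyCommand form you already had should also work once the bastion is set up this way.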

Jenkins won't use SSH key

I'm sorry to have to ask this question, but I feel like I've tried every answer so far on SO with no luck.
I have my local machine and my remote server. Jenkins is up and running on my server.
If I open up a terminal and do something like scp /path/to/file user@server:/path/to/wherever, then my SSH works fine without requiring a password.
If I run this same command inside my Jenkins job, I get 'Host Key Verification Failed'.
So I know my SSH is working correctly the way I want, but why can't I get Jenkins to use this SSH key?
Interesting thing is, it did work fine when I first set up Jenkins and the key, then I think I restarted my local machine, or restarted Jenkins, then it stopped working. It's hard to say exactly what caused it.
I've also tried several options regarding ssh-agent and ssh-add but those don't seem to work.
I verified the local machine's .pub key is on the server in the /user/.ssh folder and is also in the authorized_keys file. The folder is owned by user.
Any thoughts would be much appreciated and I can provide more info about my problem. Thanks!
Update:
Per Kenster's suggestion, I did su - jenkins, then ssh server, and it asked me to add the host to known_hosts. So I thought this was a step in the right direction, but the same problem persisted afterward.
Something I did not notice before: I can ssh server without a password when using my myUsername account. But if I switch to the jenkins user, it asks me for my password when I do ssh server.
I also tried ssh-keygen -R server as suggested, to no avail.
Try
su jenkins
ssh-keyscan -t rsa YOUR-HOSTNAME >> ~/.ssh/known_hosts
The SSH Slaves plugin doesn't support ECDSA host keys; the command above adds the host's RSA key for the ssh-slave connection.
Host Key Verification Failed
ssh is complaining about the remote host key, not the local key that you're trying to use for authentication.
Every SSH server has a host key which is used to identify the server to the client. This helps prevent clients from connecting to servers which are impersonating the intended server. The first time you use ssh to connect to a particular host, ssh will normally prompt you to accept the remote host's host key, then store the key locally so that ssh will recognize the key in the future. The widely used OpenSSH ssh program stores known host keys in a file .ssh/known_hosts within each user's home directory.
In this case, one of two things is happening:
The user ID that Jenkins is using to run these jobs has never connected to this particular remote host before, and doesn't have the remote host's host key in its known_hosts file.
The remote host key has changed for some reason, and it no longer matches the key which is stored in the Jenkins user's known_hosts file.
You need to update the known_hosts file for the user that Jenkins is using to run these SSH operations: remove any old host key for this host from the file, then add the host's new host key to it. The simplest way is to use su or sudo to become the Jenkins user, then run ssh interactively to connect to the remote server:
$ ssh server
If ssh prompts you to accept a host key, say yes, and you're done. You don't even have to finish logging in. If it prints a big scary warning that the host key has changed, run this to remove the existing host from known_hosts:
$ ssh-keygen -R server
Then rerun the ssh command.
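If you'd rather not log in interactively, a hedged sketch (assuming Jenkins' home directory is the usual /var/lib/jenkins and your sudo supports -H to set HOME):
sudo -u jenkins -H ssh-keygen -R server                              # drop any stale key for "server"
sudo -u jenkins -H sh -c 'ssh-keyscan server >> ~/.ssh/known_hosts'  # record the current host key
Note that ssh-keyscan records whatever key the host presents, so verify the fingerprint out of band if that matters in your environment.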
One thing to be aware of: you can't use a passphrase on a key that you're going to use with Jenkins this way, because Jenkins gives you no opportunity to enter one (it runs automated jobs with no human intervention).

Pass ssh options to ssh-copy-id

I'm stuck in Permission denied (publickey) hell, trying to copy a public key to a remote server so Jenkins can rsync files during builds.
Running:
sudo ssh-copy-id -i id_rsa.pub ubuntu@xx.xx.xx.xx
I have done this for another server, but that one has a separate key pair for SSH assigned by EC2, and my current guess is that ssh-copy-id is trying to use the wrong private key for this connection. Is there a way to pass -vv to ssh-copy-id so I can see what key it's trying to use? I've looked into the -o switch but can't seem to get it right.
Thank you.
So here's what I've done:
I added the following to /etc/ssh/ssh_config:
Host xx.xx.xx.xx
    User ubuntu
    IdentityFile ~/.ssh/key-name-for-that-machine.pem
Then I copied key-name-for-that-machine.pem into /var/lib/jenkins/.ssh.
I didn't run ssh-copy-id again; I simply have rsync use that key file when moving files. Here's the rsync command:
rsync -rvh -e 'ssh -v' "/tmp/project-DEV-${BUILD_ID}/" ubuntu@xx.xx.xx.xx:"/www/www.project-dir.net/"
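To answer the literal question: if your ssh-copy-id supports the -o switch, options are passed through to the underlying ssh, so (a hedged sketch) you can get -vv-level detail with LogLevel, or point it at a specific private key with IdentityFile:
ssh-copy-id -i id_rsa.pub -o LogLevel=DEBUG2 ubuntu@xx.xx.xx.xx
ssh-copy-id -i id_rsa.pub -o IdentityFile=~/.ssh/key-name-for-that-machine.pem ubuntu@xx.xx.xx.xx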
My guess would be running it without sudo. But that depends on how you normally log in to the server.
If you normally log in using ssh ubuntu@xx.xx.xx.xx, then drop the sudo.
If not, try to log in with sudo ssh ubuntu@xx.xx.xx.xx.
Reading your question, at least one of these should fail.

SSH keys setup but still asking for password (but not for 2nd, 3rd, etc. sessions)

The target server is a relatively clean install of Ubuntu 14.04. I generated a new ssh key using ssh-keygen and added it to my server using ssh-copy-id. I also checked that the public key was in the ~/.ssh/authorized_keys file on the server.
Even still, I am prompted for a password every time I try to ssh into the server.
I noticed something weird, however. After I log into my first session using my password, the next concurrent sessions don't ask for a password; they seem to be using the SSH key properly. I've noticed this behaviour on two different clients (Mint and OS X).
Are you sure your SSH key isn't protected by a passphrase? Try the following:
How do I remove the passphrase for the SSH key without having to create a new key?
If that's not the case, it may just be that ssh is having trouble locating your private key. Try using the -i flag to explicitly point out its location.
ssh -i /path/to/private_key username@yourhost.com
Thank you, Samuel Jun, for the link to help.ubuntu.com - SSH Public Key Login Troubleshooting!
Just a little caveat:
If you copy your authorized_keys file outside your encrypted home directory, please make sure your root install is encrypted as well (IMHO Ubuntu still allows an unencrypted root install coupled with encryption of the home directory).
Otherwise this defeats the whole purpose of using encryption in the first place ;)
If this is happening to you on Windows (I'm on Windows 10):
Try running the program you're using to ssh to the server as administrator.
In my case I was using PowerShell with scoop to install a couple of things so that I could ssh straight from it. Anyway, I ran PowerShell as admin, tried connecting again, and it didn't ask for my password.
For SELinux
Check the SELinux context with
% ls -dZ ~user/.ssh
It must contain unconfined_u:object_r:ssh_home_t:s0.
If not, that is the problem; as root, run
# for i in ~user/.ssh ~user/.ssh/*; do semanage fcontext -a -t ssh_home_t "$i"; done
# restorecon -v -R ~user/.ssh
It looks like it's related to encryption of your home directory: sshd cannot read the authorized_keys file until the encrypted home is mounted, which only happens once you log in, so the first session falls back to a password.
https://unix.stackexchange.com/a/238570
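A hedged sketch of that workaround, assuming a standard OpenSSH sshd_config ("yourusername" and the directory name are just examples): keep authorized_keys somewhere sshd can read before the encrypted home is mounted.
sudo mkdir -p /etc/ssh/keys/yourusername
sudo cp ~/.ssh/authorized_keys /etc/ssh/keys/yourusername/
# then in /etc/ssh/sshd_config:
#   AuthorizedKeysFile /etc/ssh/keys/%u/authorized_keys
sudo service ssh restart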
Make sure your SSH public key was copied to the remote host in the right format. If you open the key file in an editor, it should read as a single line.
Basically, just do ssh-copy-id username@remote. It will take care of the rest.

Jenkins configuration with ssh passphrase

I am able to run a Jenkins build with a local git repository, but only with a no-passphrase SSH key. When I have a passphrase, I start getting permission issues in the build.
How can I configure Jenkins to use a key with a passphrase?
--
I am also new to SSH. Here is how I configured my Jenkins (on Ubuntu):
su jenkins
ssh-keygen ....
cat key.pub
su user_with_github_repo
cd ~/.ssh/
cat >> authorized_keys   # paste the jenkins key.pub contents, then Ctrl-D
The issue you are having is likely due to the fact that ssh asks interactively for the passphrase. I recommend against trying to enter the passphrase non-interactively in your script, as that adds very little in terms of security.
Rather, you could use ssh-agent and ssh-add to unlock the key and keep it in memory. ssh-add adds the key to ssh-agent, which is a daemon process. You would unlock the key when the server starts, and Jenkins would then be able to authenticate using the key stored in memory.
To do this, run ssh-agent on server boot as the jenkins user and capture its output (two exports, SSH_AUTH_SOCK and SSH_AGENT_PID) to a file. Use ssh-add to unlock the key. Then source the output file whenever you want to authorise using that key, in your Jenkins build script for example. Et voila!
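A minimal sketch of that setup, assuming Jenkins runs as the jenkins user and the key is ~/.ssh/id_rsa (adjust names and paths to your installation):
# as the jenkins user, at server boot:
ssh-agent -s > ~/.ssh/agent.env      # writes the SSH_AUTH_SOCK / SSH_AGENT_PID exports
. ~/.ssh/agent.env
ssh-add ~/.ssh/id_rsa                # asks for the passphrase once, then caches the key

# later, at the top of the Jenkins build script:
. "$HOME/.ssh/agent.env"             # reattach to the running agent
git fetch origin                     # authenticates via the unlocked key in the agent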