Ansible issue with SSH authentication

I have searched around this problem for a while now but didn't find anything that helps.
We are using Ansible to automate our Juniper devices and therefore use the Ansible Juniper modules. When I try to use "junos_facts", for example, I can execute it without problems on host1, but on host2 I get either a PasswordRequiredException, or an AuthenticationException when I add -k on the CLI:
TASK [proact-junos-test : Gather JunOS facts] ***************************************************************************************************************************************************
fatal: [host2]: FAILED! => {"changed": false, "msg": "PasswordRequiredException('Private key file is encrypted')"}
ok: [host1]
I tried every possible combination of parameters on the CLI, in ansible.cfg, and in the playbook itself. For some reason it works on one host but not the other. I have deployed the same key on both hosts and have it stored in my ssh-agent. I can SSH to both hosts without a problem.
Can anyone help me with this? Thanks

For anyone having the same issue: the problem was that the remote host didn't accept my SSH key algorithm, because it has been deprecated since Paramiko 2.9.
So I installed Paramiko 2.8.1 and it worked.
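If anyone needs it, a minimal sketch of that downgrade, assuming pip manages the Python environment Ansible and Paramiko run in:
pip install 'paramiko<2.9'   # e.g. 2.8.1, which still offers the older ssh-rsa (SHA-1) algorithm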

As far as I can understand, the problem is that the SSH key is encrypted (passphrase-protected). Try adding the SSH key to your ssh-agent (if you have one).
If you don't have one, there is a simple trick:
eval $(ssh-agent)
ssh-add path/to/private/ssh/key
ansible ...
If you are running this in a CI/CD environment you'll need to fight with ssh-add over how it asks for the passphrase, but that's a different story.
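As a rough sketch of that CI/CD fight, assuming the passphrase is exported to the job as SSH_KEY_PASSPHRASE and OpenSSH is 8.4 or newer (for SSH_ASKPASS_REQUIRE):
eval "$(ssh-agent -s)"
# write a tiny askpass helper that echoes the passphrase from the environment
printf '#!/bin/sh\necho "$SSH_KEY_PASSPHRASE"\n' > /tmp/askpass.sh
chmod +x /tmp/askpass.sh
SSH_ASKPASS=/tmp/askpass.sh SSH_ASKPASS_REQUIRE=force ssh-add path/to/private/ssh/key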

Related

VirtualBox connection to GitLab by using SSH keys

I'm stuck with a little SSH problem. I'm working on a Windows 10 machine which has its pair of SSH keys generated via PuTTYgen (RSA) using my domain email. I use this pair to connect via SSH to my GitLab repository and everything works fine.
I decided to create an Ubuntu VM via VirtualBox on the same machine, then I generated a new SSH key pair inside the VM using
ssh-keygen -t rsa -C "my.email@example.com" -b 4096
with the same email as my Windows 10 domain account. After that I added this new public key to my GitLab account. However, when I test this new pair of keys via
ssh -Tv git@gitlab.com
where gitlab.com hosts my GitLab repository, I receive, along with some debug messages (which don't contain any useful information),
Permission denied (publickey)
Now, my question is as follows:
is there something I should do differently to set up a new pair of SSH keys in a virtual machine which uses the same network as the host machine? Or, theoretically, should it work fine just as I did it?
Thank you
EDIT: I've also tried copying the VM's SSH keys to my Windows machine, replacing the old ones, and they work. So it's not a key-generation problem; I think it's really a problem with VirtualBox or virtualization in general. Any help?
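(As a hedged aside, the test above assumes the VM's ssh is offering the key from its default location; pinning the key explicitly in ~/.ssh/config inside the VM would rule that out. The path below is just an example:
Host gitlab.com
    User git
    IdentityFile ~/.ssh/id_rsa
    IdentitiesOnly yes
)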

SSH Key pairs with Ansible

I have set up an Ansible environment with a control machine (CentOS) and 3 other remote hosts (CentOS). Everything is fine with regards to the actual functioning, but I want it to work a little more seamlessly, I guess.
I have set up SSH authentication using ssh-keygen (with a passphrase) on my master server and then used ssh-copy-id to all my 3 hosts, and it works.
Now each time I run an Ansible command against these servers it asks me for the passphrase, and only then does the command complete. I don't want that to happen. I tried defining it in my hosts file as you see below, but that hasn't worked. I even tried with vars, and it doesn't work with that either. When I run the command ansible servers -m ping it asks me for the SSH passphrase and then it runs...
[servers]
10.0.0.1 ansible_ssh_user=root ansible_ssh_private_key_file=/home/ansible/.ssh/id_rsa
Thanks
A
Now each time I run my Ansible command to these servers it asks me for the passphrase, and only then does the command complete. I don't want that to happen.
Generate your SSH key without a passphrase,
or
set up an SSH key agent.
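A hedged sketch of both options, assuming the key lives at /home/ansible/.ssh/id_rsa:
# Option 1: remove the passphrase from the existing key (keep a backup first)
ssh-keygen -p -f /home/ansible/.ssh/id_rsa -N ""
# Option 2: load the key into an agent once per session
eval "$(ssh-agent -s)"
ssh-add /home/ansible/.ssh/id_rsa
ansible servers -m ping   # no passphrase prompt afterwards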
This is a bit off-topic for SO

How to setup SSH connection with Ansible?

I am brand new to learning Ansible. Here is a pretty easy example.
I have computer A, where I will be running playbooks from.
And 10 other host machines that need to be configured. My question is: do I just need to put the public SSH key of my machine (computer A) into ~/.ssh/authorized_keys on the 10 hosts?
I guess my understanding of how to efficiently set up SSH connections between my main computer and all the clients is a little fuzzy. Any help would be appreciated here.
You create a file called hosts with this content
[test-vms]
10.0.0.100 ansible_ssh_pass='password' ansible_ssh_user='username'
In the above hosts file, leave off ansible_ssh_pass='password' if you are using SSH keys. Then you can create a playbook with the commands and call the playbook as shown below. The first line of the playbook needs to have the hosts declaration.
---
- hosts: test-vms
  tasks:
    - name: "This is a test task"
      command: /bin/hostname
Finally, you call the playbook like this
ansible-playbook -i <hosts-file> <playbook.yaml>
Ansible simply uses SSH so you can either copy the public key as you describe or use password authentication using the --user and --ask-pass flags.
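For example, as a sketch (the user name here is a placeholder):
ansible test-vms -i hosts --user centos --ask-pass -m ping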
Yes. As far as connections to hosts go, Ansible sets up an SSH connection between the control machine and the host machines. You have to accept the SSH fingerprints of the end machines. You can always skip the Are you sure you want to continue connecting (yes/no/[fingerprint]) step, i.e. adding the fingerprints to ~/.ssh/known_hosts, by setting host_key_checking=False.
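For example, in ansible.cfg on the control machine:
[defaults]
host_key_checking = False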
I found this great video for initial Ansible Setup - https://youtu.be/-Q4T9wLsvOQ - maybe this can help!

Ansible ask to verify the ssh fingerprint and fails to ssh into newly created ec2 instance

I am creating EC2 instances and configuring them using Ansible scripts. I have used
[ssh_connection]
pipelining=true
in my ansible.cfg file, but it still asks to verify the SSH fingerprint; when I type yes and press enter, it fails to log in to the instance.
Just to let you know, I am using an Ansible dynamic inventory and hence am not storing IPs or DNS names in a hosts file.
Any help will be much appreciated.
TIA
Pipelining doesn't have any effect on authentication; it bundles up individual module calls into one bigger file to transfer once a connection has been established.
In order not to stop execution and prompt you to accept the SSH key, you need to disable strict host key checking, not enable pipelining.
You can set that by exporting ANSIBLE_HOST_KEY_CHECKING=False or set it in ansible.cfg with:
[defaults]
host_key_checking=False
The latter is probably better for your use case, because it's persistent.
Note that even though this is a setting that deals with ssh connections, it is in the [defaults] section, not the [ssh_connection] one.
==
The fact that when you type yes you fail to log in makes it seem like this might not be your only problem, but you haven't given enough information to solve the rest.
If you're still having connection issues after disabling host key checking, edit the question to add the output of SSHing into the instance manually, alongside the output of an Ansible play with -vvv for verbose output.
First steps to look through when troubleshooting:
What are the differences between when I connect and when Ansible does?
Is the ansible_ssh_user set to the right user for the ec2 instance?
Is the ansible_ssh_private_key_file the same as the private part of the keypair you assigned the instance on creation?
Is ansible_ssh_host set correctly by whatever is generating your dynamic inventory?
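As an illustration of the second and third points, these connection variables can be pinned for a group (a sketch; the group name, user, and key path are placeholders for whatever your dynamic inventory actually provides):
[ec2]
ec2-203-0-113-10.compute-1.amazonaws.com

[ec2:vars]
ansible_ssh_user=ec2-user
ansible_ssh_private_key_file=~/.ssh/my-ec2-keypair.pem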
I think you can find the answer here: ansible ssh prompt known_hosts issue
Basically, when you run ansible-playbook, you will need to set the environment variable:
ANSIBLE_HOST_KEY_CHECKING=False
Make sure you have your private key added (ssh-add your_private_key).
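For example, as a sketch (the inventory script and playbook names are placeholders):
ANSIBLE_HOST_KEY_CHECKING=False ansible-playbook -i ec2.py site.yml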

Jenkins won't use SSH key

I'm sorry to have to ask this question, but I feel like I've tried every answer so far on SO with no luck.
I have my local machine and my remote server. Jenkins is up and running on my server.
If I open up a terminal and do something like scp /path/to/file user@server:/path/to/wherever then my SSH works fine without requiring a password.
If I run this command inside of my Jenkins job I get 'Host Key Verification Failed'
So I know my SSH is working correctly the way I want, but why can't I get Jenkins to use this SSH key?
Interesting thing is, it did work fine when I first set up Jenkins and the key, then I think I restarted my local machine, or restarted Jenkins, then it stopped working. It's hard to say exactly what caused it.
I've also tried several options regarding ssh-agent and ssh-add but those don't seem to work.
I verified the local machine's .pub key is on the server in the user's ~/.ssh folder and is also in the authorized_keys file. The folder is owned by that user.
Any thoughts would be much appreciated and I can provide more info about my problem. Thanks!
Update:
Per Kenster's suggestion I did su - jenkins, then ssh server, and it asked me to add the host to known_hosts. So I thought this was a step in the right direction. But the same problem persisted afterward.
Something I did not notice before: I can ssh server without a password when using my myUsername account. But if I switch to the jenkins user, then it asks me for my password when I do ssh server.
I also tried ssh-keygen -R server as suggested to no avail.
Try
su jenkins
ssh-keyscan YOUR-HOSTNAME >> ~/.ssh/known_hosts
The SSH Slaves plugin doesn't support ECDSA. The command above should add the RSA key for the SSH slave.
Host Key Verification Failed
ssh is complaining about the remote host key, not the local key that you're trying to use for authentication.
Every SSH server has a host key which is used to identify the server to the client. This helps prevent clients from connecting to servers which are impersonating the intended server. The first time you use ssh to connect to a particular host, ssh will normally prompt you to accept the remote host's host key, then store the key locally so that ssh will recognize the key in the future. The widely used OpenSSH ssh program stores known host keys in a file .ssh/known_hosts within each user's home directory.
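For illustration, a known_hosts entry is a single line per host of the following form (the host name is a placeholder and the key material is truncated):
server.example.com ssh-rsa AAAAB3NzaC1yc2E...<rest of the base64-encoded public host key>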
In this case, one of two things is happening:
The user ID that Jenkins is using to run these jobs has never connected to this particular remote host before, and doesn't have the remote host's host key in its known_hosts file.
The remote host key has changed for some reason, and it no longer matches the key which is stored in the Jenkins user's known_hosts file.
You need to update the known_hosts file for the user which jenkins is using to run these ssh operations. You need to remove any old host key for this host from the file, then add the host's new host key to the file. The simplest way is to use su or sudo to become the Jenkins user, then run ssh interactively to connect to the remote server:
$ ssh server
If ssh prompts you to accept a host key, say yes, and you're done. You don't even have to finish logging in. If it prints a big scary warning that the host key has changed, run this to remove the existing host from known_hosts:
$ ssh-keygen -R server
Then rerun the ssh command.
One thing to be aware of: you can't use a passphrase when you generate a key that you're going to use with Jenkins, because it gives you no opportunity to enter such a thing (seeing as it runs automated jobs with no human intervention).
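A minimal sketch of generating and deploying such a passphrase-less key as the jenkins user (the file name and remote host are placeholders):
$ ssh-keygen -t rsa -b 4096 -f ~/.ssh/jenkins_id_rsa -N ""
$ ssh-copy-id -i ~/.ssh/jenkins_id_rsa.pub user@server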