Ansible SSH forwarding doesn't seem to work with Vagrant - ssh

OK, strange question. I have SSH forwarding working with Vagrant, but I'm trying to get it working when using Ansible as a Vagrant provisioner.
I found out exactly what Ansible is executing and tried it myself from the command line; sure enough, it fails there too.
[/common/picsolve-ansible/u12.04%]ssh -o HostName=127.0.0.1 \
-o User=vagrant -o Port=2222 -o UserKnownHostsFile=/dev/null \
-o StrictHostKeyChecking=no -o PasswordAuthentication=no \
-o IdentityFile=/Users/bryanhunt/.vagrant.d/insecure_private_key \
-o IdentitiesOnly=yes -o LogLevel=FATAL \
-o ForwardAgent=yes "/bin/sh \
-c 'git clone git@bitbucket.org:bryan_picsolve/poc_docker.git /home/vagrant/poc_docker' "
Permission denied (publickey,password).
But when I just run vagrant ssh, agent forwarding works correctly, and I can check out my project with read/write access.
[/common/picsolve-ansible/u12.04%]vagrant ssh
vagrant@vagrant-ubuntu-precise-64:~$ /bin/sh -c 'git clone git@bitbucket.org:bryan_picsolve/poc_docker.git /home/vagrant/poc_docker'
Cloning into '/home/vagrant/poc_docker'...
remote: Counting objects: 18, done.
remote: Compressing objects: 100% (14/14), done.
remote: Total 18 (delta 4), reused 0 (delta 0)
Receiving objects: 100% (18/18), done.
Resolving deltas: 100% (4/4), done.
vagrant@vagrant-ubuntu-precise-64:~$
Has anyone got any idea how to get this working?
Update:
By means of ps awux I determined the exact command being executed by Vagrant.
I replicated it and git checkout worked.
ssh vagrant@127.0.0.1 -p 2222 \
-o Compression=yes \
-o StrictHostKeyChecking=no \
-o LogLevel=FATAL \
-o StrictHostKeyChecking=no \
-o UserKnownHostsFile=/dev/null \
-o IdentitiesOnly=yes \
-i /Users/bryanhunt/.vagrant.d/insecure_private_key \
-o ForwardAgent=yes \
-o LogLevel=DEBUG \
"/bin/sh -c 'git clone git#bitbucket.org:bryan_picsolve/poc_docker.git /home/vagrant/poc_docker' "

As of Ansible 1.5 (devel aa2d6e47f0, last updated 2014/03/24 14:23:18 GMT +100) and Vagrant 1.5.1, this now works.
My Vagrant configuration contains the following:
config.vm.provision "ansible" do |ansible|
  ansible.playbook = "../playbooks/basho_bench.yml"
  ansible.sudo = true
  ansible.host_key_checking = false
  ansible.verbose = 'vvvv'
  ansible.extra_vars = { ansible_ssh_user: 'vagrant',
                         ansible_connection: 'ssh',
                         ansible_ssh_args: '-o ForwardAgent=yes' }
end
It is also a good idea to explicitly disable sudo for tasks that rely on the forwarded agent. For example, when using the Ansible git module, I do this:
- name: checkout basho_bench repository
  sudo: no
  action: git repo=git@github.com:basho/basho_bench.git dest=basho_bench

The key difference appears to be the UserKnownHostsFile setting. Even with StrictHostKeyChecking turned off, ssh quietly disables certain features, including agent forwarding, when there is a conflicting entry in the known hosts file (these conflicts are common for Vagrant, since multiple VMs may have the same address at different times). It works for me if I point UserKnownHostsFile at /dev/null:
config.vm.provision "ansible" do |ansible|
  ansible.playbook = "playbook.yml"
  ansible.raw_ssh_args = ['-o UserKnownHostsFile=/dev/null']
end
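A quick way to confirm that forwarding survives Ansible's own SSH connection (not just vagrant ssh) is an ad-hoc command that lists the forwarded agent's keys. A minimal check, assuming Vagrant's auto-generated inventory path:
ansible all -i .vagrant/provisioners/ansible/inventory/vagrant_ansible_inventory -a 'ssh-add -l'
If this prints your host's key fingerprints, forwarding is reaching the provisioner; "Could not open a connection to your authentication agent" means it is not.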

Here's a workaround:
Create an ansible.cfg file in the same directory as your Vagrantfile with the following lines:
[ssh_connection]
ssh_args = -o ForwardAgent=yes

You can simply add this line to your Vagrantfile to enable SSH agent forwarding:
config.ssh.forward_agent = true
Note: Don't forget to execute the task with become: false
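For example, a task that relies on the forwarded agent might look like this (a minimal sketch; the repository URL and destination are placeholders):
- name: clone the application repository as the unprivileged user
  become: false
  git:
    repo: git@github.com:example/app.git
    dest: /home/vagrant/app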
Hope this helps.

I've found that I need to do two separate things (on Ubuntu 12.04) to get it working:
the -o ForwardAgent thing that @Lorin mentions
adding /etc/sudoers.d/01-make_SSH_AUTH_SOCK_AVAILABLE with these contents (a playbook version of this is sketched below):
Defaults env_keep += "SSH_AUTH_SOCK"
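If the box is provisioned by Ansible anyway, that sudoers drop-in can be laid down by a task instead of by hand. A hedged sketch using the copy module (the validate step keeps a typo from breaking sudo):
- name: keep SSH_AUTH_SOCK in the sudo environment
  become: true
  copy:
    dest: /etc/sudoers.d/01-make_SSH_AUTH_SOCK_AVAILABLE
    content: |
      Defaults env_keep += "SSH_AUTH_SOCK"
    mode: '0440'
    validate: 'visudo -cf %s'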

I struggled with a very similar problem for a few hours.
Vagrant 1.7.2
ansible 1.9.4
My symptoms:
failed: [vagrant1] => {"cmd": "/usr/bin/git ls-remote '' -h refs/heads/HEAD", "failed": true, "rc": 128}
stderr: Permission denied (publickey).
fatal: Could not read from remote repository.
Please make sure you have the correct access rights
and the repository exists.
msg: Permission denied (publickey).
fatal: Could not read from remote repository.
Please make sure you have the correct access rights
and the repository exists.
FATAL: all hosts have already failed -- aborting
SSH'ing into the guest, I found that my ssh-agent was forwarding as expected:
vagrant@vagrant-ubuntu-trusty-64:~$ ssh -T git@github.com
Hi baxline! You've successfully authenticated, but GitHub does not provide shell access.
However, from the host machine, I could not open the connection:
$ ansible web -a "ssh-add -L"
vagrant1 | FAILED | rc=2 >>
Could not open a connection to your authentication agent.
After confirming that my ansible.cfg file was set up as @Lorin noted, and that my Vagrantfile set config.ssh.forward_agent = true, I still came up short.
The solution was to delete all lines in my host's ~/.ssh/known_hosts file that were associated with my guest. For me, they were the lines that started with:
[127.0.0.1]:2201 ssh-rsa
[127.0.0.1]:2222 ssh-rsa
[127.0.01]:2222 ssh-rsa
[127.0.0.1]:2200 ssh-rsa
Note that the third line has a malformed IP address. I'm not certain, but I believe that line was the culprit. These lines are created as I destroy and create Vagrant VMs.
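Rather than editing ~/.ssh/known_hosts by hand, the stale entries can be removed with ssh-keygen (the ports here match the lines above; adjust to whatever Vagrant assigned):
ssh-keygen -R '[127.0.0.1]:2222'
ssh-keygen -R '[127.0.0.1]:2201'
ssh-keygen -R '[127.0.0.1]:2200'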

Related

Forcing Ansible to use SSH password

I'm trying to use Ansible to set up hosts that will initially only be accessible via SSH with a password, not a key file (yes, my first playbook is to set up key-based access).
I can access the hosts using SSH passwords from the command line.
Running Ansible in verbose mode gives the following output
EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="debian"' -o ConnectTimeout=30 -o ControlPath=/home/home/.ansible/cp/2d22e058dc 192.168.122.11 '/bin/sh -c '"'"'echo ~debian && sleep 0'"'"''
<192.168.122.11> (255, b'', b'OpenSSH_8.9p1 Ubuntu-3, OpenSSL 3.0.2 15 Mar 2022
debug1: Reading configuration data /home/home/.ssh/config
...
debug3: no such identity: /home/home/.ssh/id_dsa: No such file or directory
debug2: we did not send a packet, disable method
debug1: No more authentication methods to try.
debian@192.168.122.11: Permission denied (publickey,password).
It looks like the SSH client is being forced not to use passwords (PasswordAuthentication=no), and nothing in the rest of the output indicates it is even trying them.
This is my hosts file (no, those are not the real passwords):
all:
  children:
    init:
      hosts:
        bullseye-apps:
        bullseye-backup:
      vars:
        ansible_ssh_pass: 'password'
        ansible_become_pass: 'password'
So I think I should be giving Ansible the option to use passwords.
I run my playbook as follows
ansible-playbook -i test-hosts.yml playbook.yml
I've recently upgraded my OS (to Pop!_OS 22.04) and haven't run these playbooks in a while, so possibly a change in Ansible?
$ ansible --version
ansible 2.10.8
config file = /home/home/Projects/federated-agency/ansible.cfg
configured module search path = ['/home/home/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python3/dist-packages/ansible
executable location = /usr/bin/ansible
python version = 3.10.4 (main, Apr 2 2022, 09:04:19) [GCC 11.2.0]
Any thoughts?
So it turns out I misspelled the user name on one of the hosts, damn it!
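Pinning the remote user per host in the inventory makes that kind of typo easier to spot. A hedged sketch of the same inventory, assuming the intended user is debian as in the verbose output above:
all:
  children:
    init:
      hosts:
        bullseye-apps:
          ansible_user: debian
        bullseye-backup:
          ansible_user: debian
      vars:
        ansible_ssh_pass: 'password'
        ansible_become_pass: 'password'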

Why isn't ssh-agent authenticating the private key's passphrase when I use Ansible?

In short,
ssh-agent will authenticate the passphrase when I ssh into the remote server from the command line, but whenever I execute an ansible playbook it asks for the passphrase. My question is, why won't ssh-agent authenticate the passphrase for Ansible? How can I get it to work?
In detail,
I created a password protected private key and corresponding public key and uploaded the public key to the server.
I invoked the ssh-agent using eval $(ssh-agent) and then ssh-add /etc/ansible/ssh/private-key.pem
Typing ssh-add -l shows that the key has been added.
I can successfully ssh into the machine from the command line using ssh username@ipaddress without being asked for the passphrase.
But if I execute a playbook or do something like sudo ansible -m ping server, it will say
Enter passphrase for key '/etc/ansible/ssh/private-key.pem':
I tried it again in verbose mode and it gives me the following information
ansible 2.4.2.0
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/etc/ansible/library']
  ansible python module location = /usr/lib/python2.7/dist-packages/ansible
  executable location = /usr/bin/ansible
  python version = 2.7.12 (default, Nov 20 2017, 18:23:56) [GCC 5.4.0 20160609]
Using /etc/ansible/ansible.cfg as config file
Parsed /etc/ansible/hosts inventory source with ini plugin
META: ran handlers
Using module file /usr/lib/python2.7/dist-packages/ansible/modules/system/ping.py
<35.230.127.195> ESTABLISH SSH CONNECTION FOR USER: user6
<35.230.127.195> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o Port=22 -o 'IdentityFile="/etc/ansible/ssh/private-key.pem"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o User=user6 -o ConnectTimeout=10 -o ControlPath=/home/user6/.ansible/cp/e26536be01 35.230.127.195 '/bin/sh -c '"'"'echo ~ && sleep 0'"'"''
Enter passphrase for key '/etc/ansible/ssh/private-key.pem':
My Environment
Ansible version is 2.4.2.0
Python version is 2.7.12
OpenSSH_7.2p2 Ubuntu-4ubuntu2.2, OpenSSL 1.0.2g
The ssh keys were created using RSA (not SSH-1 RSA)
and 4096 bits.
In ansible.cfg transport is set to smart.
The key is encrypted using ansible-vault, but I've tried doing it
without encryption and it makes no difference.
Please help, I don't have much hair left.
UPDATE: Using transport = local executes everything locally (i.e. it doesn't run the playbook on the remote server, even though it looks like it does).
Go to the ansible.cfg file at the location below:
/etc/ansible/ansible.cfg
and set transport = local:
transport = local
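For completeness, that setting belongs in the [defaults] section of ansible.cfg; a minimal sketch (keep in mind the UPDATE above: this makes everything run locally rather than on the remote host):
[defaults]
transport = local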
Thanks

I can ssh just fine, but ansible says "no route to host"

I wrote a script to spin up several VMs using Vagrant, which I then have to provision with Ansible. Unfortunately my host is a Windows machine, so I thought I could solve the issue by putting all the VMs on a VPN and provisioning them from another machine on the same VPN.
In theory, it works... I can ssh into the other machines without trouble. But when I run my ansible playbook, ansible fails.
At first I got the message "ssh: connect to host 10.1.2.100 [10.1.2.100] port 22: No route to host" when running ansible with -vvvv
This was in the evening and I was very tired, and this error didn't recur the following morning. I'm not sure if that has something to do with the VM I'm deploying from having been rebooted in the meantime, or with the receiving machine having been destroyed and brought up again since then. In any case, the overall problem has not gone away.
results now, after recreating both vms:
# ansible-playbook -i vms -k -u vagrant vms.yml -vvvv
result:
<10.1.2.100> ESTABLISH SSH CONNECTION FOR USER: vagrant
<10.1.2.100> SSH: EXEC sshpass -d14 ssh -C -vvv -o ServerAliveInterval=50 -o User=vagrant -o ConnectTimeout=10 -tt 10.1.2.100 '( umask 22 && mkdir -p "$( echo $HOME/.ansible/tmp/ansible-tmp-1455781388.36-25193904947084 )" && echo "$( echo $HOME/.ansible/tmp/ansible-tmp-1455781388.36-25193904947084 )" )'
fatal: [10.1.2.100]: FAILED! => {"failed": true, "msg": "ERROR! Using a SSH password instead of a key is not possible because Host Key checking is enabled and sshpass does not support this. Please add this host's fingerprint to your known_hosts file to manage this host."}
So far so clear. I ssh into the other instance to add it to the known hosts. This works without any trouble.
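The same fingerprint can also be added non-interactively, which helps when the VM gets destroyed and recreated. A hedged one-liner, assuming 10.1.2.100 is the target box:
ssh-keyscan -H 10.1.2.100 >> ~/.ssh/known_hosts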
Back to ansible, I try the same command again. The result now is:
<10.1.2.100> ESTABLISH SSH CONNECTION FOR USER: vagrant
<10.1.2.100> SSH: EXEC sshpass -d14 ssh -C -vvv -o ServerAliveInterval=50 -o StrictHostKeyChecking=no -o User=vagrant -o ConnectTimeout=10 -tt 10.1.2.100 '( umask 22 && mkdir -p "$( echo $HOME/.ansible/tmp/ansible-tmp-1455782149.99-271768166468916 )" && echo "$( echo $HOME/.ansible/tmp/ansible-tmp-1455782149.99-271768166468916 )" )'
<10.1.2.100> PUT /tmp/tmpXQKa8Z TO /home/vagrant/.ansible/tmp/ansible-tmp-1455782149.99-271768166468916/setup
<10.1.2.100> SSH: EXEC sshpass -d14 sftp -b - -C -vvv -o ServerAliveInterval=50 -o StrictHostKeyChecking=no -o User=vagrant -o ConnectTimeout=10 '[10.1.2.100]'
fatal: [10.1.2.100]: UNREACHABLE! => {"changed": false, "msg": "ERROR! SSH Error: data could not be sent to the remote host. Make sure this host can be reached over ssh", "unreachable": true}
Well, I made sure the host was reachable by ssh, thank you very much! Ansible still can't get through, and I'm about to get a brain tumor from thinking of things that might be the problem.
Any suggestions what might be the problem?
This issue was reported here, with some workarounds:
https://github.com/ansible/ansible/issues/15321
The consensus seems to be either to (a) use ansible_password or (b) pass -u username in the connection parameters. However, any number of things can disrupt an SSH connection in ways that make it look "unreachable" to higher-level apps, so I recommend going through each of the steps outlined in that ticket.
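A hedged sketch of what that looks like in an inventory (the host is taken from the question; user and password are placeholders):
[vms]
10.1.2.100 ansible_user=vagrant ansible_password=vagrant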

Ansible & Vagrant development environment

I have just discovered Ansible, and it is great! I have written some cool playbooks to manage zero-downtime Docker deployments on my servers, but I waste quite a bit of time waiting for things to happen because I sometimes have to work with a poor internet connection. So I thought I might be able to run Ansible against boot2docker, but had no success, and after a bit of research I realized it would be too hacky and would never behave like my actual Ubuntu server. So here I am, trying to make it work with Vagrant.
I want to achieve something like Laptop > Ansible > Vagrant Box; I don't want to run the playbooks from the Vagrant Box!
Vagrantfile
Vagrant.configure(2) do |config|
  config.vm.box = "ubuntu/trusty64"
  config.ssh.forward_agent = true
end
vagrant ssh-config
Host default
HostName 127.0.0.1
User vagrant
Port 2222
UserKnownHostsFile /dev/null
StrictHostKeyChecking no
PasswordAuthentication no
IdentityFile "/Users/cesco/Code/vagrant/.vagrant/machines/default/virtualbox/private_key"
IdentitiesOnly yes
LogLevel FATAL
ForwardAgent yes
Thanks to some SO question I was able to do this:
$ vagrant ssh-config > vagrant-ssh
$ ssh -F vagrant-ssh default
vagrant@vagrant-ubuntu-trusty-64:~$
But I keep getting localhost | FAILED => SSH Error: Permission denied (publickey,password). every time I try to run the Ansible ping on the Vagrant box.
Ansible inventory
[staging]
vagrant@localhost
Ansible config
[ssh_connection]
ssh_args = -o UserKnownHostsFile=/dev/null \
-o StrictHostKeyChecking=no \
-o PasswordAuthentication=no \
-o IdentityFile=/Users/cesco/.vagrant.d/insecure_private_key \
-o IdentitiesOnly=yes \
-o LogLevel=FATAL \
-p 2222
How do I translate the ssh config file into Ansible configuration?
It does not work from the command line either!
ssh -vvv vagrant@localhost -p 2222 -i /Users/cesco/.vagrant.d/insecure_private_key -o IdentitiesOnly=yes -o LogLevel=FATAL -o PasswordAuthentication=no -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null
To use Vagrant with a classic ssh connection, first add a private network IP to your Vagrantfile.
config.vm.network "private_network", ip: "192.168.1.2"
Reload your instance
vagrant reload
Then you can connect by ssh using the private key.
ssh -vvv vagrant@192.168.1.2 -p 2222 -i /Users/cesco/.vagrant.d/insecure_private_key
That is the best way.
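Translated into inventory variables, that connection would look roughly like this (a hedged sketch; it assumes sshd on the private-network interface listens on the default port 22, so no port is set):
[staging]
192.168.1.2 ansible_ssh_user=vagrant ansible_ssh_private_key_file=/Users/cesco/.vagrant.d/insecure_private_key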
You misunderstand. The Vagrant ansible provisioner does not run Ansible from inside the Vagrant box; it SSHes into the box from your local machine. That's the way to go, since it means that with a few small changes you can target a remote host instead.
To get it working you need to add something like this to your Vagrantfile:
config.vm.provision "ansible" do |ansible|
  ansible.playbook = "ansible/vagrant.yml"
  ansible.sudo = true
  ansible.ask_vault_pass = true # comment out if you don't need it
  ansible.verbose = 'vv'        # comment out if you don't want it
  ansible.groups = {
    "tag_Role_myrole" => ["myrole"]
  }
  ansible.extra_vars = {
    role: "myrole"
  }
end
# Set the name of the VM.
config.vm.define "myrole" do |myrole|
  myrole.vm.hostname = "myrole"
end
Create/update your ansible.cfg file with:
[defaults]
hostfile = ../.vagrant/provisioners/ansible/inventory/vagrant_ansible_inventory
Create a hosts inventory file containing:
localhost ansible_host=127.0.0.1 ansible_connection=local
Now vagrant up will bring up and provision the instance, or run vagrant provision to (re)provision a running vagrant.
To run a playbook directly against your vagrant use:
ansible-playbook -u vagrant --private-key=~/.vagrant.d/insecure_private_key yourplaybook.yml

ssh connection to Vagrant virtual machine using Ansible fails

I'm new to Ansible. I set up an Ubuntu virtual machine using Vagrant. I'm able to ssh into the machine using ssh vagrant@172.16.23.228. I created an ssh key with the same password as the VM, added it to the agent, and specified the path in my hosts file.
After following the instructions here I started to receive the following errors when running this command (ansible all --inventory-file=hosts.ini --module-name ping -u vagrant -vvvv):
Not sure what I'm missing from my setup; what else do I need to check?
<172.16.23.228> ESTABLISH CONNECTION FOR USER: vagrant
<172.16.23.228> REMOTE_MODULE ping
<172.16.23.228> EXEC ssh -C -tt -vvv -o ControlMaster=auto -o ControlPersist=60s -o ControlPath="/Users/user/.ansible/cp/ansible-ssh-%h-%p-%r" -o Port=22 -o IdentityFile="~Users/user/.ssh/onemachine_rsa" -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o User=vagrant -o ConnectTimeout=10 172.16.23.228 /bin/sh -c 'mkdir -p $HOME/.ansible/tmp/ansible-tmp-1451080871.59-247915080664557 && chmod a+rx $HOME/.ansible/tmp/ansible-tmp-1451080871.59-247915080664557 && echo $HOME/.ansible/tmp/ansible-tmp-1451080871.59-247915080664557'
172.16.23.228 | FAILED => SSH Error: tilde_expand_filename: No such user Users
while connecting to 172.16.23.228:22
It is sometimes useful to re-run the command using -vvvv, which prints SSH debug output to help diagnose the issue.
My hosts file looks like:
[testserver]
172.16.23.228 ansible_ssh_port=22 ansible_ssh_user=vagrant ansible_ssh_private_key_file=~Users/user/.ssh/onemachine_rsa
What you're doing can work, but I highly recommend using the built-in Ansible provisioner in Vagrant. It will make your life easier and improve your Vagrant skills at the same time. And if you need to execute any shell scripts, use the shell provisioner.
Providing this answer for the benefit of those, like me, who arrive late to the party. Recent Vagrant versions generate a per-machine private key in a local directory instead of using the admittedly insecure shared key for every VM. You'll have to create an ansible_hosts file like this one:
[vagrantboxes]
jessie ansible_ssh_port=2222 ansible_ssh_host=127.0.0.1
[vagrantboxes:vars]
ansible_ssh_user=vagrant
ansible_ssh_private_key_file=.vagrant/machines/default/virtualbox/private_key
Where the key is the last line, which provides a path to the actual private key used in the virtual machine that has been started up from this particular directory.
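If you are unsure of the exact path on your machine, Vagrant will print it for you:
vagrant ssh-config | grep IdentityFile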
The path to your ansible_ssh_private_key_file is incorrect. Try ansible_ssh_private_key_file=~/.ssh/onemachine_rsa instead. The tilde in this case expands to the home directory of your user on the local machine you're running ansible from.
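Applied to the hosts file from the question, the corrected entry would look like this (same host and key name, only the path fixed):
[testserver]
172.16.23.228 ansible_ssh_port=22 ansible_ssh_user=vagrant ansible_ssh_private_key_file=~/.ssh/onemachine_rsa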