Host is unreachable by SSH while running an Ansible playbook. I ran the playbook on 15 hosts; 13 were provisioned successfully, but one was unreachable even though they were all configured the same. Here is the actual error received. Can anyone help? Thanks
fatal: [Host]: UNREACHABLE! => {"changed": false, "msg": "Failed to connect to the host via ssh: FIPS mode initialized\r\nDisabling GSSAPIKeyExchange. Not usable in FIPS mode\r\n
…
\nPermission denied (publickey,gssapi-keyex,gssapi-with-mic,password,keyboard-interactive).", "unreachable": true}
Permission denied (publickey,gssapi-keyex,gssapi-with-mic,password,keyboard-interactive)
This looks like an issue with the public key on the destination server.
Check the permissions of .ssh/authorized_keys there.
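For example, on the unreachable host you could reset the permissions to what OpenSSH expects (assuming the default layout in the connecting user's home directory):
chmod 700 ~/.ssh
chmod 600 ~/.ssh/authorized_keys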
Related
While connecting to a managed host (a NetApp device) using the command module, I get the below error.
TASK [Gathering Facts] *********************************************************
fatal: [10.20.30.40]: UNREACHABLE! => {"changed": false, "msg": "Failed to connect to the host via ssh: X11 forwarding request failed", "unreachable": true}
How can I set the SSH option "ForwardX11 no" via the Ansible configuration or an ansible-playbook command-line option?
I don't want to change the SSH settings in my user directory.
Try passing the SSH arguments on the command line:
ansible-playbook --ssh-common-args='-o ForwardX11=no' <rest_of_the_commands>
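If you'd rather persist this than pass it on every run, the same option can go in ansible.cfg. A minimal sketch, assuming you still want Ansible's default connection-sharing flags (setting ssh_args replaces them entirely):
[ssh_connection]
ssh_args = -C -o ControlMaster=auto -o ControlPersist=60s -o ForwardX11=no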
When I execute an Ansible playbook from one server against another remote server, I get this error:
"msg": "Failed to connect to the host via ssh: ssh_askpass: exec(/usr/bin/ssh-askpass): No such file or directory\r\nHost key verification failed.", "unreachable": true"
Below is my playbook:
- hosts: igwcluster_AM:igwcluster_IS
  become: true
  become_method: sudo
  gather_facts: True
  tasks:
  - name: Install Oracle Java 8
    script:/data2/jenkins/workspace/PreReq_Install_To_Servers/IGW/IGW_Cluster/prereqs_Products/Java.sh
I'm using two host groups and each group has 2 servers.
Error log:
UNREACHABLE! => {"changed": false, "msg": "Failed to connect to the host via ssh: ssh_askpass: exec(/usr/bin/ssh-askpass): No such file or directory\r\nHost key verification failed.", "unreachable": true}
Note: I have already tried
host_key_checking = False
ssh_args = -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no
but it still fails. Please advise.
First of all, you have to put a space after "script:" and indent it so it sits exactly under "name:", so it looks like this:
tasks:
  - name: Install Oracle Java 8
    script: /data2/jenkins/workspace/PreReq_Install_To_Servers/IGW/IGW_Cluster/prereqs_Products/Java.sh
Also try using an SSH key for SSH authorization.
On the server you execute the Ansible playbook from, generate an SSH key if you haven't already; you can do it with a simple command:
ssh-keygen
(press Enter at each prompt until the command exits)
Next, copy it to the remote server with the ssh-copy-id command:
ssh-copy-id <remote server IP/FQDN>
After this, your Ansible server will be able to connect to the remote server without a password prompt, and this error should not appear.
If this method doesn't work for you, please share this information:
your hosts file
the become user you are using to run this playbook
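Since your error also mentions "Host key verification failed", another thing worth trying (assuming you trust these hosts) is to pre-populate known_hosts on the server you run Ansible from, so SSH never needs an interactive prompt:
ssh-keyscan -H <remote server IP/FQDN> >> ~/.ssh/known_hosts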
I'm running this Ansible ad-hoc command on Ubuntu 16.x (Ansible versions 2.2.1.0 and 2.2.2.0):
ansible host_alias -a "df -h" -u USER
where host_alias is defined in the Ansible hosts file (it defines an EC2 instance and its .pem file).
The hosts file looks like this:
[host_alias]
my_host.compute.amazonaws.com
private_key_file=/path/to/key/my_key.pem
I get this error:
private_key_file=/path/to/key/my_key.pem | UNREACHABLE! => {
"changed": false,
"msg": "Failed to connect to the host via ssh: ssh: Could not resolve hostname private_key_file=/path/to/key/my_key.pem: Name or service not known\r\n",
"unreachable": true
}
my_host.compute.amazonaws.com | UNREACHABLE! => {
"changed": false,
"msg": "Failed to connect to the host via ssh: Permission denied (publickey).\r\n",
"unreachable": true
The same host and key work fine when I SSH directly (as configured in ~/.ssh/config).
I have triple-checked that the key is there and has read permissions. I also tried setting ansible_user in the Ansible hosts file.
Any ideas?
Please check the format of the Ansible inventory file in the documentation.
You have defined two hosts in a host group named host_alias:
the first host is: my_host.compute.amazonaws.com,
the second host is: private_key_file=/path/to/key/my_key.pem.
Ansible complains it cannot connect to the second host:
Could not resolve hostname private_key_file=/path/to/key/my_key.pem
It also cannot connect to the first host, because the SSH key is not defined:
Failed to connect to the host via ssh: Permission denied (publickey).
On top of the mistake of splitting the hostname and the parameter into separate lines, you also got the name of the parameter wrong -- it should be ansible_ssh_private_key_file.
The parameters are listed in a later section of the same document.
Your inventory file should look like this:
[host_group_name]
my_host.compute.amazonaws.com ansible_ssh_private_key_file=/path/to/key/my_key.pem
and your command:
ansible host_group_name -a "df -h" -u USER
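If you'd rather not pass -u on every invocation, the remote user can also live in the inventory; a sketch with the same host and key (ansible_user is the current name; older releases use ansible_ssh_user):
[host_group_name]
my_host.compute.amazonaws.com ansible_ssh_private_key_file=/path/to/key/my_key.pem ansible_user=USER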
The second line needs to be dropped from the
[host_alias] section.
That section is meant for hosts only.
Once you have done that, try
ansible all -m ping
to check if you can ping the host.
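If the host is now reachable, you should see output along these lines (a sketch of the ping module's success output):
my_host.compute.amazonaws.com | SUCCESS => {
    "changed": false,
    "ping": "pong"
}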
I have two Vagrant instances running with different IPs:
192.168.33.17 [Ansible installed here]
192.168.33.19 [Another server where I am trying to connect]
My Ansible hosts file is in /etc/ansible/hosts and it looks like:
[example]
192.168.33.19:2222
I can easily connect via SSH to the second server with the command:
ssh vagrant@192.168.33.19
without a password.
But running the Ansible command yields error:
[root@centos72x64 vagrant]# ansible example -m ping -u vagrant
192.168.33.19 | UNREACHABLE! => {
"changed": false,
"msg": "Failed to connect to the host via ssh.",
"unreachable": true
}
How can I solve this error?
My Ansible hosts file is in /etc/ansible/hosts and it looks like
[example]
192.168.33.19:2222
You don't put the port number in the Ansible inventory file this way. To learn how to do it, consult the docs.
But you also mentioned:
I can easily connect via SSH to the second server with the command
ssh vagrant@192.168.33.19
So you don't use port 2222 at all.
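Here the fix is simply to drop the port, since your working ssh command uses the default port 22. For completeness, a sketch of how a genuinely non-standard port would be declared per host (ansible_port on Ansible 2.0+, ansible_ssh_port on older releases):
[example]
192.168.33.19 ansible_port=2222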
I am trying to copy an SSH public key to a newly created VM:
- hosts: vm1
  remote_user: root
  tasks:
    - name: deploy ssh key to account
      authorized_key: user='root' key="{{lookup('file','/root/.ssh/id_rsa.pub')}}"
But I get this error:
fatal: [jenkins]: UNREACHABLE! => {"changed": false, "msg": "Failed to connect to the host via ssh: Permission denied (publickey,gssapi-keyex,gssapi-with-mic,password).\r\n", "unreachable": true}
So to establish SSH, I first need to establish SSH?
How can I establish SSH to a newly created KVM guest automatically, without copying the key manually?
(host_key_checking = False is set in ansible.cfg)
Assuming the target machine allows root login with a password (from the error message it seems it does), you must provide the credentials to your playbook:
ansible-playbook playbook.yml --extra-vars "ansible_ssh_user=root ansible_ssh_pass=password"
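Note that password-based SSH in Ansible also requires the sshpass utility on the control machine (e.g. apt-get install sshpass on Debian/Ubuntu); without it the connection fails with a message asking you to install it. On Ansible 2.0 and later the shorter variable names should work as well:
ansible-playbook playbook.yml --extra-vars "ansible_user=root ansible_password=password"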
Something I tried (and it worked) when I had this same issue:
ansible target-server-name -m command -a "whatever command" -k
The -k flag prompts you for the SSH password of the target server (it likewise needs sshpass on the control machine).
Also add the following to the /etc/ansible/hosts file:
[target-server-name]
target_server_ip
Example:
ansible target-server-name -m ping -k