Incorrect sudo password with Ansible over SSH

In my Ansible run I am getting the following error:
PLAY [test hashi vault] ******************************************************************************************************
TASK [Gathering Facts] *******************************************************************************************************
/usr/lib/python2.7/site-packages/urllib3/connectionpool.py:988: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vault.domain'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
InsecureRequestWarning,
ok: [192.168.1.200]
TASK [show bar] **************************************************************************************************************
/usr/lib/python2.7/site-packages/urllib3/connectionpool.py:988: InsecureRequestWarning: Unverified HTTPS request is being made to host 'vault.domain'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
InsecureRequestWarning,
fatal: [192.168.1.200]: FAILED! => {"msg": "Incorrect sudo password"}
PLAY RECAP *******************************************************************************************************************
192.168.1.200 : ok=1 changed=0 unreachable=0 failed=1 skipped=0 rescued=0 ignored=0
I know the password is correct: I have checked it with a debug task, and the same password works when extracted from Vault using curl. This is the new code where I get the error:
---
- name: test hashi vault
  hosts: all
  remote_user: ec2-user
  tasks:
    - name: show bar
      systemd:
        state: restarted
        name: sssd.service
      async: 45
      become: yes
      become_method: sudo
This is what I'm running:
ansible-playbook -l 192.168.1.200 test.yml --private-key=/home/rehna/.ssh/testKeyPair.pem --vault-password-file /etc/ansible/ansible.vault -e @credentials
Contents of credentials:
ansible_user: ec2-user
ansible_become_pass: "{{ lookup('hashi_vault', 'secret=secret/test/ec2_password auth_method=userpass username={{vault_user}} password={{vault_password}} url={{vault_url}}:{{vault_port}} validate_certs=false') }}"
hosts file:
[ec2]
192.168.1.200
[test_env]
192.168.1.200 remote_user=ec2-user
From /var/log/secure:
unix_chkpwd[30174]: password check failed for user (ec2-user)
sudo: pam_unix(sudo:auth): authentication failure; logname=ec2-user uid=1000 euid=0 tty=/dev/pts/4 ruser=ec2-user rhost= user=ec2-user
sudo: pam_unix(sudo:auth): conversation failed
sudo: pam_unix(sudo:auth): auth could not identify password for [ec2-user]
It should look like this:
sudo: ec2-user : TTY=pts/4 ; PWD=/home/ec2-user ; USER=root ; COMMAND=/bin/passwd --stdin ec2-user
sudo: pam_unix(sudo:session): session opened for user root by ec2-user(uid=0)
sudo: pam_unix(sudo:session): session closed for user root

The data returned by the lookup is a dict of key/value pairs.
You need to extract the password value from the return data provided by the lookup:
ec2_pass: "{{ lookup('hashi_vault', 'secret=secret/test/ec2_password auth_method=userpass username={{vault_user}} password={{vault_password}} url={{vault_url}}:{{vault_port}} validate_certs=false') }}"
ansible_become_pass: "{{ec2_pass.value}}"
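If you are not sure which keys the secret contains, a quick way to check is to print the raw lookup result with a debug task before wiring it into ansible_become_pass. A minimal sketch, reusing the vault_user, vault_password and vault_url variables from the credentials file above:
- name: inspect the secret returned by hashi_vault
  hosts: all
  gather_facts: no
  tasks:
    - name: show the raw dict returned by the lookup
      debug:
        msg: "{{ lookup('hashi_vault', 'secret=secret/test/ec2_password auth_method=userpass username={{vault_user}} password={{vault_password}} url={{vault_url}}:{{vault_port}} validate_certs=false') }}"
Which key holds the password depends on how the secret was written to Vault; if it was stored as value=<password>, then ec2_pass.value is correct.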

Related

Error in ssh-rsa key when running an Ansible playbook

I have the following playbook:
---
- name: Get Nokia Info
  hosts: LAB9ERIP008
  connection: local
  gather_facts: no
  tasks:
    - name: run show version command
      sros_command:
        commands: show version
      register: config
    - name: create backup of configuration
      copy:
        content: "{{ config.stdout[0] }}"
        dest: "/home/dafe/scripts/ansible/backups/show_version_{{ inventory_hostname }}.txt"
And when I run the playbook, it gives me the following error:
[dafe@CETPMGIP001 ansible]$ ansible-playbook nokia.yml -i myhostsfile
PLAY [Get Cisco Info] **************************************************************************************************************
TASK [run show version command] ****************************************************************************************************
fatal: [LAB9ERIP008]: FAILED! => {"msg": "paramiko: The authenticity of host '10.150.16.129' can't be established.\nThe ssh-rsa key fingerprint is fca0d4eb97414dc5b5a13fa552e5dd69."}
to retry, use: --limit @/home/dafe/scripts/ansible/nokia.retry
PLAY RECAP *************************************************************************************************************************
LAB9ERIP008 : ok=0 changed=0 unreachable=0 failed=1
I tried to put this var in myhostsfile:
ansible_ssh_private_key_file=/home/dafe/.ssh/known_hosts
But it continues to give the same error.
If I do ssh manually to the host and add the key:
[dafe@CETPMGIP001 ansible]$ ssh dafernandes@10.150.16.129
The authenticity of host '10.150.16.129 (10.150.16.129)' can't be established.
RSA key fingerprint is SHA256:0YQYfLnRCQDZzpZ1+8ekW/Gks6mTxpI4xA56siaQUsM.
RSA key fingerprint is MD5:fc:a0:d4:eb:97:41:4d:c5:b5:a1:3f:a5:52:e5:dd:69.
Are you sure you want to continue connecting (yes/no)? yes
Warning: Permanently added '10.150.16.129' (RSA) to the list of known hosts.
TiMOS-C-16.0.R6 cpm/hops64 Nokia 7750 SR Copyright (c) 2000-2019 Nokia.
All rights reserved. All use subject to applicable license agreements.
Built on Wed Feb 27 14:42:05 PST 2019 by builder in /builds/c/160B/R6/panos/main
dafernandes@10.150.16.129's password:
And then running the playbook no longer gives the error:
[dafe@CETPMGIP001 ansible]$ ansible-playbook nokia.yml -i myhostsfile
PLAY [Get Cisco Info] **************************************************************************************************************
TASK [run show version command] ****************************************************************************************************
ok: [LAB9ERIP008]
TASK [create backup of configuration] **********************************************************************************************
ok: [LAB9ERIP008]
PLAY RECAP *************************************************************************************************************************
LAB9ERIP008 : ok=2 changed=0 unreachable=0 failed=0
How can I solve this?
Thanks.
David
In the [defaults] section of your ansible.cfg file, try setting host_key_checking = false.
This is obviously less secure.
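A minimal sketch of the relevant section:
[defaults]
host_key_checking = False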
Since SSH is the primary mechanism Ansible uses to communicate with target hosts, it is important that SSH is configured properly in your environment before attempting to execute Ansible playbooks.
The underlying problem in this case is likely that the host key of the SSH host you are trying to connect to has changed and no longer matches what is in ~/.ssh/known_hosts. More information about what SSH host keys are for can be found here.
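If the host key has genuinely changed (for example, the device was reinstalled), a more targeted fix than disabling host key checking is to remove the stale entry and re-learn the current key with the standard OpenSSH tools. A sketch, substituting your host's address:
# remove the stale entry for the host from known_hosts
ssh-keygen -R 10.150.16.129
# pre-populate known_hosts with the current key so Ansible can connect non-interactively
ssh-keyscan 10.150.16.129 >> ~/.ssh/known_hosts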

Ansible ssh giving permission denied error

I am trying to execute an Ansible playbook against server1.xxx.com and I am getting a permission denied error.
I created an SSH key (with a passphrase) using:
ssh-keygen -f t11pkey
and copied the public key to the server:
ssh-copy-id -i /home/user.name/t11pkey.pub user.name@server1.xxx.com
my ~/.ssh/config:
Host server?.xxx.com
    User user.name
    Port 22
    IdentityFile /home/user.name/.ssh/t11pkey.pub
Permissions of my keys:
-rw------- 1 user.name Domain Users 1766 Dec 5 10:55 t11pkey
-rw------- 1 user.name Domain Users 412 Dec 5 10:55 t11pkey.pub
ansible.cfg
[defaults]
filter_plugins = ./filter_plugins
roles_path = ./roles
sudo_user = root
host_key_checking = False
retry_files_enabled = False
[ssh_connection]
ssh_args = -F /home/user.name/.ssh/config -o ControlMaster=auto -o ControlPersist=30m
control_path = ~/.ssh/ansible-%%r@%%h:%%p
inventory file:
[new]
server1.xxx.com
my playbook:
- hosts: new
  remote_user: user.name
  become: true
  vars_files:
    - xx.yml
    - xx.yml
    - xx.yml
  roles:
    - role: ~/path/to/the/role
Ansible error:
TASK [Gathering Facts] *****************************************************************************************************************************************************
Enter passphrase for key '/home/user.name/.ssh/t11pkey.pub':
Enter passphrase for key '/home/user.name/.ssh/t11pkey.pub':
Enter passphrase for key '/home/user.name/.ssh/t11pkey.pub':
fatal: [server1.xxx.com]: UNREACHABLE! => {"changed": false, "msg": "Failed to connect to the host via ssh: Permission denied (publickey,gssapi-keyex,gssapi-with-mic,password).\r\n", "unreachable": true}
ansible --version: ansible 2.3.1.0 (stable-2.3 5512c94017) last updated 2017/06/21 22:56:43 (GMT -400)
The IdentityFile parameter in the config file should point to the private key (t11pkey), not the public one (t11pkey.pub).
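The corrected ~/.ssh/config entry would look like this:
Host server?.xxx.com
    User user.name
    Port 22
    IdentityFile /home/user.name/.ssh/t11pkey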

How to run an Ansible playbook against multiple servers the right way?

Ansible uses SSH to set up software on remote hosts.
If some machines have just been freshly installed, running an Ansible playbook against them from one host will not connect, because there are no authorized_keys on the remote hosts yet.
To copy the Ansible host's public key to those target hosts, like:
$ ssh user@server "echo \"`cat .ssh/id_rsa.pub`\" >> .ssh/authorized_keys"
you would first have to log in via SSH and create the file on every remote host:
$ mkdir .ssh
$ touch .ssh/authorized_keys
Is this the common way to run an Ansible playbook against remote servers, or is there a better way?
I think it's better to do that using Ansible as well, with the authorized_key module. For example, to authorize your key for user root:
ansible <hosts> -m authorized_key -a "user=root state=present key=\"$(cat ~/.ssh/id_rsa.pub)\"" --ask-pass
This can be done in a playbook also, with the target user as a variable that defaults to root:
- hosts: <NEW_HOSTS>
  vars:
    - username: root
  tasks:
    - name: Add authorized key
      authorized_key:
        user: "{{ username }}"
        state: present
        key: "{{ lookup('file', '/home/<YOUR_USER>/.ssh/id_rsa.pub') }}"
And executed with:
ansible-playbook auth.yml --ask-pass -e username=<TARGET_USER>
Your user needs the appropriate privileges; if not, use become.
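If writing to the target user's authorized_keys requires privilege escalation, the same run with become enabled might look like this (a sketch; -b enables become and -K prompts for the become password):
ansible-playbook auth.yml --ask-pass -b -K -e username=<TARGET_USER>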

Failed to connect to host via SSH on Vagrant with Ansible Playbook

I was not able to find where the actual problem is. I executed the playbook below with my private key:
---
- hosts: localhost
  gather_facts: false
  sudo: yes
  tasks:
    - name: Install package libpcre3-dev
      apt: name=libpcre3-dev state=latest
But I am getting the error below on the Vagrant Ubuntu machine:
PLAY [localhost]
*********************************************************************
TASK [Install package ]
***************************************************
fatal: [vagrant]: UNREACHABLE! => {"changed": false, "msg": "Failed to
connect to the host via ssh: Permission denied (publickey,password).\r\n",
"unreachable": true}
to retry, use: --limit @/home/vagrant/playbooks/p1.retry
PLAY RECAP
*********************************************************************
vagrant : ok=0 changed=0 unreachable=1 failed=0
What could be the problem?
You are running a playbook against localhost with an SSH connection (the default in Ansible) and this fails, most likely because you never configured the account on your machine to accept a key from itself. Using the defaults, you'd need to add ~/.ssh/id_rsa.pub to ~/.ssh/authorized_keys.
Instead, to run locally, add connection: local to the play:
---
- hosts: localhost
  connection: local
  tasks:
    - debug:
And it will give you a proper response:
TASK [debug] *******************************************************************
ok: [localhost] => {
"msg": "Hello world!"
}
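Alternatively, you can leave the play untouched and mark localhost as a local connection in the inventory; this single inventory line is equivalent to connection: local in the play:
localhost ansible_connection=local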

Ansible - establishing initial SSH connection

I am trying to copy an SSH public key to a newly created VM:
- hosts: vm1
  remote_user: root
  tasks:
    - name: deploy ssh key to account
      authorized_key: user='root' key="{{ lookup('file','/root/.ssh/id_rsa.pub') }}"
But I am getting this error:
fatal: [jenkins]: UNREACHABLE! => {"changed": false, "msg": "Failed to connect to the host via ssh: Permission denied (publickey,gssapi-keyex,gssapi-with-mic,password).\r\n", "unreachable": true}
So to establish SSH, I first need to establish SSH?
How can I establish SSH to a newly created KVM guest automatically, without manually copying the key?
(host_key_checking = False is set in ansible.cfg.)
Assuming the target machine allows root login with a password (from the error message it seems it does), you must provide the credentials to your playbook:
ansible-playbook playbook.yml --extra-vars "ansible_ssh_user=root ansible_ssh_pass=password"
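The same credentials can also live in the inventory instead of on the command line. A sketch with placeholder values (in current Ansible releases these variables are spelled ansible_user and ansible_password):
[vm1]
jenkins ansible_host=<VM_IP> ansible_ssh_user=root ansible_ssh_pass=password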
Something I tried (and it worked) when I had this same issue:
ansible target-server-name -m command -a "whatever command" -k
The -k flag prompts you for the SSH password of the target server.
Add the target server to your /etc/ansible/hosts file:
[target-server-name]
target_server_ip
Example:
ansible target-server-name -m ping -k
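Note that password-based SSH authentication (the -k flag) with Ansible's default ssh connection plugin requires the sshpass program to be installed on the control node; without it, Ansible will abort and ask you to install sshpass.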