In a shell, I use the approach below to become the root user without any password, and it works fine.
ssh-agent bash
ssh-add /repository/ansible/.ssh/id_rsa_ansible
ssh -A ansible@e8-df1
[ansible@e8-df1 ~]$ sudo -i
[root@e8-df1 ~]#
However, in Ansible I cannot achieve the same thing and get an error. Below are my Ansible inventory and playbook.
Inventory:
[qv]
e8-df1
e8-df2
[qv:vars]
ansible_ssh_user=ansible
ansible_ssh_private_key_file=/repository/ansible/.ssh/id_rsa_ansible
Playbook:
---
- hosts: qv
  become: yes
  roles:
    - abc
Error:
fatal: [e8-df1]: FAILED! => {
    "changed": false,
    "failed": true,
    "invocation": {
        "module_name": "setup"
    },
    "module_stderr": "Shared connection to e8-df1 closed.\r\n",
    "module_stdout": "sudo: a password is required\r\n",
    "msg": "MODULE FAILURE"
}
fatal: [e8-df2]: FAILED! => {
    "changed": false,
    "failed": true,
    "invocation": {
        "module_name": "setup"
    },
    "module_stderr": "Shared connection to e8-df2 closed.\r\n",
    "module_stdout": "sudo: a password is required\r\n",
    "msg": "MODULE FAILURE"
}
I have gone through some documents and Q&As, and they suggest adding the line below to the sudoers file.
ansible ALL=(ALL) NOPASSWD: ALL
Now I cannot understand why the shell procedure works without the sudoers configuration. And is there any other way to achieve the same thing in Ansible?
The problem is that when you connect via the shell, you are forwarding the agent over the SSH connection with the -A parameter. In Ansible, you need to configure this behavior explicitly if you want the agent forwarded on the SSH connection.
Here is a related question with a solution: SSH Agent Forwarding with Ansible
Basically, you need to set the SSH parameters you want in ansible.cfg; alternatively, you can set the parameters for the hosts you connect to in the SSH client configuration at ~/.ssh/config.
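For example, to forward the agent on every connection, something like this in ansible.cfg should do it (a sketch; overriding ssh_args replaces Ansible's default control-socket options, so they are restated here):
[ssh_connection]
ssh_args = -o ControlMaster=auto -o ControlPersist=60s -o ForwardAgent=yes
Or per host in ~/.ssh/config, with the host pattern taken from the inventory above:
Host e8-df*
    ForwardAgent yes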
You need to set private_key_file = /path/to/file in the configuration file /etc/ansible/ansible.cfg.
As per your question, it should look like this:
private_key_file = /repository/ansible/.ssh/id_rsa_ansible
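For reference, that option lives in the [defaults] section, so the file would contain something like:
[defaults]
private_key_file = /repository/ansible/.ssh/id_rsa_ansible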
Hope this helps.
Related
I have an Ansible playbook that is meant to create an Apache server locally on my machine. However, every time I try running the playbook I get the following error messages:
[WARNING]: Could not match supplied host pattern, ignoring: apache
skipping: no hosts matched
This is what my Ansible Playbook looks like:
- hosts: apache
  tasks:
    - name: install apache2
      apt: name=apache2 update_cache=yes state=latest
And this is what my ansible.cfg and hosts files look like:
ansible.cfg:
[defaults]
hostfile = hosts
inventory = /etc/ansible/hosts
hosts:
myserver ansible_host=127.0.0.1 ansible_user=ubuntu ansible_connection=local
This is all on an Ubuntu VM, if that matters. What am I doing wrong?
Edit: Alright, I no longer get that error after doing Shraddheya's fix, but now I am getting this error:
fatal: [myserver]: FAILED! => {"ansible_facts": {}, "changed": false, "failed_modules": {"ansible.legacy.setup": {"ansible_facts": {"discovered_interpreter_python": "/usr/bin/python3"}, "failed": true, "module_stderr": "sudo: a password is required\n", "module_stdout": "", "msg": "MODULE FAILURE\nSee stdout/stderr for the exact error", "rc": 1}}, "msg": "The following modules failed to execute: ansible.legacy.setup\n"}
Alter your hosts file a little to include myserver in the apache host group.
The hosts file must read:
[apache]
myserver ansible_host=127.0.0.1 ansible_user=ubuntu ansible_connection=local
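To verify that the group now resolves, something like this should print the host (assuming the inventory file is named hosts and sits in the current directory):
ansible apache -i hosts --list-hosts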
While connecting to a managed host (a NetApp device) using the command module, I get the below error.
TASK [Gathering Facts] *********************************************************
fatal: [10.20.30.40]: UNREACHABLE! => {"changed": false, "msg": "Failed to connect to the host via ssh: X11 forwarding request failed", "unreachable": true}
How can I set the SSH option "ForwardX11 no" via the Ansible configuration or an ansible-playbook command-line option?
I don't want to change the SSH settings in my user directory.
Try passing SSH arguments on the command line:
ansible-playbook --ssh-common-args='-o ForwardX11=no' <rest_of_the_commands>
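To apply this on every run without retyping the flag, the same option can go into ansible.cfg instead (a sketch; overriding ssh_args also replaces Ansible's default control-socket options, so they are restated here):
[ssh_connection]
ssh_args = -C -o ControlMaster=auto -o ControlPersist=60s -o ForwardX11=no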
While running this...
$ ansible all -m ping
I am getting this...
[WARNING]: sftp transfer mechanism failed on [test@172.31.48.154]. Use ANSIBLE_DEBUG=1 to see detailed information
[WARNING]: scp transfer mechanism failed on [test@172.31.48.154]. Use ANSIBLE_DEBUG=1 to see detailed information
test@172.31.48.154 | FAILED! => {
    "failed": true,
    "msg": "failed to transfer file to /home/test/.ansible/tmp/ansible-tmp-148610479.7-240708330710714/ping.py:\n\nssh: Could not resolve hostname 172.31.48.154]: Name or service not known\r\nlost connection\n"
}
localhost | SUCCESS => {
    "changed": false,
    "ping": "pong"
}
Using username@host is not supported in the inventory. See issue #14255.
Instead, you can write your inventory file like:
host.example.com ansible_connection=ssh ansible_user=test
Or run your command like:
$ ansible all -m ping -u test
I am trying to copy an SSH public key to a newly created VM:
- hosts: vm1
  remote_user: root
  tasks:
    - name: deploy ssh key to account
      authorized_key: user='root' key="{{ lookup('file','/root/.ssh/id_rsa.pub') }}"
But I am getting this error:
fatal: [jenkins]: UNREACHABLE! => {"changed": false, "msg": "Failed to connect to the host via ssh: Permission denied (publickey,gssapi-keyex,gssapi-with-mic,password).\r\n", "unreachable": true}
So to establish SSH, I first need to establish SSH?
How can I establish SSH access to a newly created KVM guest automatically, without copying the key manually?
(host_key_checking = False is set in ansible.cfg)
Assuming the target machine allows root login with a password (from the error message it seems it does), you must provide the credentials to your playbook:
ansible-playbook playbook.yml --extra-vars "ansible_ssh_user=root ansible_ssh_pass=password"
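If you would rather not leave the password in your shell history, the interactive equivalent prompts for it instead:
ansible-playbook playbook.yml -u root --ask-pass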
Something I tried (and it worked) when I had this same issue:
ansible target-server-name -m command -a "whatever command" -k
The -k flag prompts you for the SSH password of the target server.
Add the following to the /etc/ansible/hosts file:
[target-server-name]
target_server_ip
Example:
ansible target-server-name -m ping -k
I am trying to use Ansible to set up infrastructure for SSH connections:
- name: Copy ssh key to each server
  copy: src=static_folder_key dest=/home/ec2-user/.ssh/ mode=0600
- name: Enable ssh agent
  shell: eval $(ssh-agent -s)
- name: Adding ssh key for static folder project
  shell: ssh-add /home/ec2-user/.ssh/static_folder_key
  sudo: True
I create a new SSH key and copy it to my servers. Then I start the agent and add the new key to allow the connection. But when I run the playbook I get this error:
TASK: [git | Adding ssh key for static folder project] ***********************
failed: [admin_vehicles] => {"changed": true, "cmd": "ssh-add /home/ec2-user/.ssh/static_folder_key", "delta": "0:00:00.004346", "end": "2015-08-12 15:05:00.878208", "rc": 2, "start": "2015-08-12 15:05:00.873862", "warnings": []}
stderr: Could not open a connection to your authentication agent.
failed: [leads_messages] => {"changed": true, "cmd": "ssh-add /home/ec2-user/.ssh/static_folder_key", "delta": "0:00:00.004508", "end": "2015-08-12 15:05:01.286031", "rc": 2, "start": "2015-08-12 15:05:01.281523", "warnings": []}
stderr: Could not open a connection to your authentication agent.
FATAL: all hosts have already failed -- aborting
If I execute these actions manually, everything goes fine.
ssh-add /home/ec2-user/.ssh/static_folder_key
Identity added: /home/ec2-user/.ssh/static_folder_key (/home/ec2-user/.ssh/static_folder_key)
So any tips? Maybe I am missing something in my playbook task?
The solution is to invoke eval "$(ssh-agent)" before the ssh-add. Initially I tried this with two Ansible tasks, but it failed the same way, since tasks are atomic and cannot persist state between them. The solution I ended up with is to invoke both commands in a single task, like this:
- name: Evaluating the authentication agent & adding the key...
  shell: |
    eval "$(ssh-agent)"
    ssh-add ~/.ssh/id_rsa_svn_ssh
The environment of each task is independent, so ssh-agent settings made in one task do not carry over to other tasks.
I strongly recommend using SSH agent forwarding. Put the following in ~/.ssh/config, then run ssh-agent and ssh-add static_folder_key locally before running ansible-playbook. That's all.
Host *
    ForwardAgent yes
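The local sequence would then look something like this (the playbook name is hypothetical):
eval "$(ssh-agent -s)"
ssh-add static_folder_key
ansible-playbook playbook.yml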
Even when agent forwarding is not an option, you don't have to run ssh-agent at all for a private key file that has no passphrase. Copy the following configuration into ~/.ssh/config on the remote hosts and ssh to static-folder-host.
Host static-folder-host
    Hostname static-folder-host.static-folder-domain
    User static-folder-user
    IdentityFile ~/.ssh/static_folder_key
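With that stanza in place, a plain invocation picks up the key automatically, with no agent involved:
ssh static-folder-host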