Keep "ansible_ssh_common_args" variable secret - ssh

My Ansible playbook runs through a proxy, but I want to keep the variable below secret from the users who run my playbook:
ansible_ssh_common_args: '-o ProxyCommand="ssh -W %h:%p -q user@hostname"'
Note: I don't want to keep the above variable in the inventory file, the playbook, or ansible.cfg.
Thanks in advance
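One approach that should satisfy all three constraints: keep the variable in a separate vars file, encrypt that file with ansible-vault, and pass it only at runtime. A minimal sketch (the file name and proxy host are placeholders):
secret_args.yml, before encrypting it with ansible-vault encrypt secret_args.yml:
ansible_ssh_common_args: '-o ProxyCommand="ssh -W %h:%p -q user@hostname"'
Then supply the vaulted file as extra vars when running:
ansible-playbook playbook.yml -i inventory -e @secret_args.yml --ask-vault-pass
This keeps the value out of the inventory, the playbook, and ansible.cfg; users without the vault password see only ciphertext.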

Related

Use ansible vault passwords for ask-become-pass and ssh password

I would like to use Ansible Vault passwords for the SSH and become passwords when running ansible-playbook, so that I don't need to type them in via --ask-become-pass or the SSH password prompt.
Problem:
Every time I run my ansible-playbook command, I am prompted for an SSH and become password.
My original command where I need to type the SSH and become password:
ansible-playbook playbook.yaml --ask-become-pass -e ansible_python_interpreter='/usr/bin/python3' -i inventory -k --ask-vault-pass -T 40
The command I tried, to make ansible-playbook use my vault passwords instead of my typing them in:
ansible-playbook playbook.yaml -e ansible_python_interpreter='/usr/bin/python3' -i inventory -k -T 40 --extra-vars @group_vars/all/main.yaml
I tried creating the directory structure group_vars/all/main.yaml relative to where the command is run, where main.yaml holds my vault-encrypted values for "ansible_ssh_user", "ansible_ssh_pass", and "ansible_become_pass".
I even tried putting my password in the command:
ansible-playbook playbook.yaml -e ansible_python_interpreter='/usr/bin/python3' -i inventory -k -T 40 --extra-vars ansible_ssh_pass=$'"MyP455word"'
ansible-playbook playbook.yaml -e ansible_python_interpreter='/usr/bin/python3' -i inventory -k -T 40 --extra-vars ansible_ssh_pass='MyP455word'
Every time I run my playbook command, I keep getting prompted for an SSH pass and a become pass. What am I missing here?
I have already read these two posts, neither of which was clear to me on the exact process, so neither helped:
https://serverfault.com/questions/686347/ansible-command-line-retriving-ssh-password-from-vault
Ansible vault password in group_vars not detected
Any recommendations?
EDIT: Including my playbook, role, settings.yaml, and inventory file as well.
Here is my playbook:
- name: Enable NFS server
  hosts: nfs_server
  gather_facts: False
  become: yes
  roles:
    - { role: nfs_enable }
Here is the role located in roles/nfs_enable/tasks/main.yaml
- name: Include vars
  include_vars:
    file: ../../../settings.yaml
    name: settings

- name: Start NFS service on server
  systemd:
    state: restarted
    name: nfs-kernel-server.service
Here is my settings file
#nfs share directory
nfs_ssh_user: admin
nfs_share_dir: "/nfs-share/logs/"
ansible_become_pass: !vault |
  $ANSIBLE_VAULT;1.1;AES256
  55543131373731393764333932626261383765326432613239356638616234643335643438326165
  3332363366623937386635653463656537353663326139360a316436356634386135653038643238
  61313123656332663232633833366133373630396434346165336337623364383261356234653461
  3335386135553835610a303666346561376161366330353935363937663233353064653938646263
  6539
ansible_ssh_pass: !vault |
  $ANSIBLE_VAULT;1.1;AES256
  55543131373731393764333932626261383765326432613239356638616234643335643438326165
  3332363366623937386635653463656537353663326139360a316436356634386135653038643238
  61313123656332663232633833366133373630396434346165336337623364383261356234653461
  3335386135553835610a303666346561376161366330353935363937663233353064653938646263
  6539
Here is my inventory
[nfs_server]
10.10.10.10 ansible_ssh_user=admin ansible_ssh_private_key_file=~/.ssh/id_ed25519
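One detail that may explain the prompts: -k (--ask-pass) and --ask-become-pass always prompt, by design; they don't consult your vaulted variables. If the vaulted ansible_ssh_pass and ansible_become_pass live in group_vars/all/main.yaml next to the inventory (or next to the playbook), Ansible should load them automatically, so both flags can simply be dropped. A sketch of the invocation under that assumption:
ansible-playbook playbook.yaml -i inventory -e ansible_python_interpreter='/usr/bin/python3' -T 40 --ask-vault-pass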

ANSIBLE: using variables on a hosts file (ssh)

I'm a newbie with this fantastic automation engine, and I have a little issue with the vars file:
For the moment, I must connect via SSH without keypairs, using specific users and passwords.
hosts file
[all:vars]
connection_mode1=ssh
ssh_user1=user1
ssh_pass1=pass1
[serverstest]
host1 ansible_connection=connection_mode1 ansible_ssh_user=ssh_user1 ansible_ssh_pass=ssh_pass1
I have also tried wrapping the values with "" and {}, but it doesn't work.
How can I use variables in these parameters?
ansible_ssh_user has been deprecated since v. 2.0. It becomes ansible_user. See here.
Never store ansible_ssh_pass variable in plain text; always use a vault. See Variables and Vaults.
Anyway, having a mytest.inventory file as follows
[all:vars]
ssh_user1=user1
[serverstest]
host1 ansible_user="{{ ssh_user1 }}"
it works, e.g.
ansible -i mytest.inventory serverstest -m ping -k
Option -k asks for the password.
If you still want to write the password in the inventory, you can keep the password variable definition and add ansible_ssh_pass="{{ ssh_pass1 }}":
[serverstest]
192.168.15.201 ansible_user="{{ ssh_user1 }}" ansible_ssh_pass="{{ ssh_pass1 }}"
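Following the "always use a vault" advice above, here is a sketch of how the password could be vaulted rather than written in plain text (assumes Ansible 2.3+ for encrypt_string; names are illustrative). Encrypt the value once on the control machine:
ansible-vault encrypt_string 'pass1' --name 'ssh_pass1'
Paste the output into group_vars/all.yml next to the inventory, since vaulted values cannot sit directly in an INI inventory file:
ssh_user1: user1
ssh_pass1: !vault |
  $ANSIBLE_VAULT;1.1;AES256
  ...
The inventory keeps referencing {{ ssh_user1 }} and {{ ssh_pass1 }} as above, and you add --ask-vault-pass to the command line.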

Ansible percent expand

I have an ansible playbook which connects to a virtual machine via a non-standard ssh port (forwarded to localhost) and a different user than the host user (vagrant).
The ssh port is specified in the ansible inventory:
[vms]
localhost:2222
The username given on the command line to ansible-playbook:
ansible-playbook -i <inventory from above> <some playbook> -u vagrant
The communication with the VM works correctly; however, %p always expands to 22 and %r to the host username.
Consequently, I cannot flush the SSH connection (for the user's changed group membership to take effect) like this:
- name: flush the ssh connection
  command: ssh -o ControlPath="~/.ansible/cp/ansible-ssh-%h-%p-%r" -O stop {{ inventory_hostname }}
  delegate_to: 127.0.0.1
Am I making a silly mistake somewhere? Alternatively, is there a different way to flush the SSH connection?
The percent tokens are not expanded by Ansible, but by ssh later on.
Sorry, forgot to add the most important part
Using
command: ssh -o ControlPath=[...] -O stop {{inventory_hostname}}
will use the default port, because you didn't specify one on the command line. You would also have to specify the port to "flush" the connection this way:
command: ssh -o ControlPath=[...] -O stop -p {{ inventory_port }} {{ inventory_hostname }}
But I don't think it is needed. Ansible should clean up the connections when the playbook ends, and I don't see any other reason to do that.
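If flushing mid-play is genuinely needed (for example, so a changed group membership takes effect), Ansible 2.3 and later have a built-in way to do it that avoids guessing the ControlPath format altogether:
- name: flush the ssh connection so new group membership takes effect
  meta: reset_connection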

Call ssh-copy-id in an Ansible playbook - How to handle password prompt?

I have two servers. I manage serverA with Ansible. serverB is not managed with Ansible. I want serverA to be able to access serverB by copying the ssh_pub_key of serverA to serverB.
This can be done manually by calling ssh-copy-id user@serverB on serverA.
I want to do this with Ansible on serverA automatically.
- name: Register ssh key at serverB
  command: ssh-copy-id -i /home/{{ user }}/.ssh/id_rsa.pub -o StrictHostKeyChecking=no user@serverB
Calling ssh-copy-id requires me to enter my SSH password for user@serverB, so the key can be copied.
How can I do this via Ansible? I want it to ask for the user@serverB password interactively while executing the playbook. Storing the password in Ansible Vault would also be an option, but even then I don't know how to avoid ssh-copy-id's interactive password prompt.
I also added -o StrictHostKeyChecking=no to the call because that is another step that normally requires user interaction when calling ssh-copy-id.
If using the ssh-copy-id command is not a restriction, you might as well try out the Ansible authorized_key module.
Then your code could look something like this:
authorized_key:
  user: <user>
  key: "{{ lookup('file', '/home/' + lookup('env', 'USER') + '/.ssh/id_rsa.pub') }}"
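One caveat: the file lookup reads from the control machine, so if the public key in question lives on serverA (as in the question), the lookup won't see it. A sketch of one way to handle that, assuming serverB is resolvable from the controller and its login password is supplied, e.g. with -k (user names are placeholders):
- hosts: serverA
  tasks:
    - name: Read serverA's public key
      slurp:
        src: /home/{{ user }}/.ssh/id_rsa.pub
      register: pubkey

    - name: Authorize serverA's key on serverB
      authorized_key:
        user: someuser
        key: "{{ pubkey.content | b64decode }}"
      delegate_to: serverB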
You can try the sshpass tool. It would require modifying your command like this:
command: sshpass -p password ssh-copy-id -i /home/{{ user }}/.ssh/id_rsa.pub -o StrictHostKeyChecking=no user@serverB
but there are other ways to provide the password; see the sshpass(1) manual page.
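To get the interactive prompt the question asks for, vars_prompt can collect the password once at playbook start and hand it to sshpass; a sketch, assuming sshpass is installed on serverA (variable names are illustrative):
- hosts: serverA
  vars_prompt:
    - name: serverb_password
      prompt: "Password for user@serverB"
      private: yes
  tasks:
    - name: Register ssh key at serverB
      command: >
        sshpass -p {{ serverb_password }}
        ssh-copy-id -i /home/{{ user }}/.ssh/id_rsa.pub
        -o StrictHostKeyChecking=no user@serverB
      no_log: true
no_log: true keeps the password out of the task output, and private: yes hides it while typing.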

Ansible prompts password when using synchronize

I'm using ansible in the following way:
ansible-playbook -f 1 my-play-book.yaml --ask-pass --ask-sudo-pass
After this I'm asked to enter the ssh & sudo passwords (same password for both).
Inside my playbook file I'm using synchronize task:
synchronize: mode=push src=rel/path/myfolder/ dest=/abs/path/myfolder/
For each host, I'm prompted to enter the SSH password of the remote host (the same one that I entered at the beginning of the playbook run).
How can I avoid entering the password when executing the synchronize task?
If you have set up the SSH keys correctly on the <host>, the following should work.
ansible all -m synchronize -a "mode=push src=rel/path/myfolder/ dest=/abs/path/myfolder/" -i <host>, -vvv
I was able to get the above working without any password prompt.
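For context: synchronize shells out to rsync, which opens its own ssh connection, and the password collected by --ask-pass is not reliably passed through to it; that is why the extra prompts appear. Setting up key-based authentication, as the answer assumes, removes the prompt. A sketch, with user@remote-host as a placeholder:
ssh-keygen -t ed25519 -N "" -f ~/.ssh/id_ed25519    # create a key pair if you don't have one
ssh-copy-id user@remote-host                        # install the public key on each managed host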