How to send Public Key while running ansible-playbook? - ssh

I am working on the Vault SSH secrets engine approach, which uses a CA to sign keys so that you can authenticate to the client with the signed certificate.
You can check this link: https://www.vaultproject.io/docs/secrets/ssh/signed-ssh-certificates
I am able to log in to the client machine using the command:
ssh -i id_rsa -i signed-vault.crt test@client-ip
I need to do the same thing with Ansible, but I find that there is no way to pass the signed public key on the go while running the ansible-playbook command; you can only pass the private key, using the --private-key option.
So I need help: is there any way to pass the public key on the go, or any workaround for this?
If you need more clarity on the vault-ssh setup, you can check out this blog:
https://brian-candler.medium.com/using-hashicorp-vault-as-an-ssh-certificate-authority-14d713673c9a

I was searching on Google and found this GitHub gist: https://gist.github.com/nehrman/3951a9f61083e462c60aeffcd942acb8
We can use set_fact to get this done:
- set_fact:
    ansible_ssh_private_key_file: "{{ r_tempfile.path }}/id_rsa"
- set_fact:
    ansible_ssh_extra_args: "-i {{ r_tempfile.path }}/signed"
- set_fact:
    ansible_user: ansible
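For context, r_tempfile in that gist presumably comes from a tempfile task registering a temporary directory on the control node that holds the key material. A minimal sketch of how it might be prepared before those set_fact tasks, assuming the private key and the Vault-signed certificate are already available in variables (my_private_key and my_signed_cert are hypothetical names):

- tempfile:
    state: directory
  register: r_tempfile
  delegate_to: localhost

- copy:
    content: "{{ my_private_key }}"   # hypothetical variable holding the private key
    dest: "{{ r_tempfile.path }}/id_rsa"
    mode: "0600"
  delegate_to: localhost

- copy:
    content: "{{ my_signed_cert }}"   # hypothetical variable holding the Vault-signed certificate
    dest: "{{ r_tempfile.path }}/signed"
    mode: "0600"
  delegate_to: localhost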
Thanks to nehrman for writing the original gist.
https://github.com/nehrman
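Alternatively, the same thing can be done directly on the ansible-playbook command line, since it accepts --ssh-extra-args alongside --private-key. A minimal sketch, mirroring the working ssh command above (the playbook name and user are just examples):

ansible-playbook site.yml \
  -u test \
  --private-key id_rsa \
  --ssh-extra-args "-i signed-vault.crt"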

Related

Is it possible to set Ansible Vars from existing facts?

I find it easiest to describe what I want to do by showing how I tried to implement it, as below.
In playbook1.yml, I have:
- name: set facts with ssh connection details
  set_fact:
    ssh_user: "user01"
    ssh_pass: "passw0rd"
In my playbook2.yml, I want to do something similar to this:
- import_playbook: playbook1.yml
  vars:
    ansible_user: "{{ ssh_user }}"
    ansible_ssh_pass: "{{ ssh_pass }}"
After this, my tasks in playbook2.yml should attempt to use sshpass when connecting to my remote hosts, taking the user and password from the "ansible_*" vars above.
Can this be done? Clearly the above is not working, and I am unable to find a solution for this.
Setting Ansible facts for the ssh values does work, but apparently it needs vars to trigger sshpass and use the password for remote access.
I know ssh keys are the way to go, and that is being covered; however, I also need a solution for this specific use case.
Thanks in advance for any help.
The only solution I found so far was:
In playbook1:
  Set the facts, and save them to a file.
In playbook2:
  vars:
    ansible_user: "{{ lookup('file', 'userfactsfile') }}"
    ansible_ssh_pass: "{{ lookup('file', 'passfactsfile') }}"
  tasks:
    - import_playbook: playbook1.yml
    ...
    my actions here
    ...
    delete userfactsfile
    delete passfactsfile
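For concreteness, here is a minimal sketch of that workaround; the file names, the write location, and the host group serverstest are all hypothetical:

# playbook1.yml - a sketch; writes the connection details to files on the control node
- hosts: localhost
  gather_facts: false
  tasks:
    - name: Save the ssh user to a local file
      copy:
        content: "user01"
        dest: "{{ playbook_dir }}/userfactsfile"
        mode: "0600"

    - name: Save the ssh password to a local file
      copy:
        content: "passw0rd"
        dest: "{{ playbook_dir }}/passfactsfile"
        mode: "0600"

# playbook2.yml - reads the details back via file lookups, then cleans up
- import_playbook: playbook1.yml

- hosts: serverstest
  vars:
    ansible_user: "{{ lookup('file', 'userfactsfile') }}"
    ansible_ssh_pass: "{{ lookup('file', 'passfactsfile') }}"
  tasks:
    - name: My actions here
      ping:

    - name: Remove the temporary credential files
      file:
        path: "{{ playbook_dir }}/{{ item }}"
        state: absent
      loop:
        - userfactsfile
        - passfactsfile
      delegate_to: localhost
      run_once: true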
There is still a problem: the password is saved in clear text for the duration of the playbook run. If there is an unexpected interruption in the process, the password file might be left stored on the server (which is the primary concern being addressed in this very same work).
An acceptable solution would be:
Encrypt the password before saving it to a file in playbook1. I faced some technical challenges here (I am pretty new to Ansible), but it is still a viable solution if I can get it working.
The encryption in playbook1 would use the actual password as the passphrase. The passphrase is persisted across both playbooks, in a fact.
In playbook2, this password would be used to decrypt the password file in the vars lookup.

Is it possible to add an ssh key to the agent for a private repo in an ansible playbook?

I am using Ansible to provision a Vagrant environment. As part of the provisioning process, I need to connect from the currently-provisioning VM to a private external repository using an ssh key in order to use composer to pull in modules for an application. I've done a lot of reading on this before asking this question, but still can't seem to comprehend what's going on.
What I want to happen is:
As part of the playbook, on the Vagrant VM, I add the ssh key for the private repo to the ssh-agent
Using that private key, I am then able to use composer to require modules from the external source
I've read articles which highlight specifying the key in playbook execution (e.g. ansible-playbook -u username --private-key play.yml). As far as I understand, this isn't for me, as I'm calling the playbook via the Vagrantfile. I've also read articles which mention ssh agent forwarding (SSH Agent Forwarding with Ansible). Based on what I have read, this is what I've done:
On the VM being provisioned, I insert a known_hosts file which consists of the host entries of the machines which house the repos I need.
On the VM being provisioned, I have the following in ~/.ssh/config:
Host <VM IP>
ForwardAgent yes
I have the following entries in my ansible.cfg to support ssh forwarding:
[defaults]
transport = ssh
[ssh_connection]
ssh_args=-o ForwardAgent=yes -o ControlMaster=auto -o ControlPersist=60s -o ControlPath=/tmp/ansible-ssh-%h-%p-%r
[privilege_escalation]
pipelining = False
I have also added the following task to the playbook which tries to use composer:
- name: Add ssh agent line to sudoers
  become: true
  lineinfile:
    dest: /etc/sudoers
    state: present
    regexp: SSH_AUTH_SOCK
    line: Defaults env_keep += "SSH_AUTH_SOCK"
I exit the ansible provisioner and add the private key on the provisioned VM to the agent via a shell provisioner (This is where I suspect I'm going wrong)
Then, I attempt to use composer, or call git via the command module. Like this, for example, to test:
- name: Test connection
  command: ssh -T git@github.com
Finally, just in case I wasn't understanding ssh connection forwarding correctly, I assumed that what was supposed to happen was that I needed to first add the key to my local machine's agent, then forward that through to the provisioned VM to use to grab the repositories via composer. So I used ssh-add on my local machine before executing vagrant up and running the provisioner.
No matter what, though, I always get permission denied when I do this. I'd greatly appreciate some understanding as to what I may be missing in my understanding of how ssh forwarding should be working here, as well as any guidance for making this connection happen.
I'm not certain I understand your question correctly, but I often set up machines that connect to a private Bitbucket repository in order to clone it. You don't need to (and shouldn't) use agent forwarding for that ("ssh forwarding" is unclear; there's "authentication agent forwarding" and "port forwarding", but you need neither in this case).
Just to be clear with terminology: you are running Ansible on your local machine, you are provisioning the controlled machine, and you want to ssh from the controlled machine to a third-party server.
What I do is upload the ssh key to the controlled machine, in /root/.ssh (more generally $HOME/.ssh, where $HOME is the home directory of the controlled machine user who will connect to the third-party server; in my case that's root). I don't use the names id_rsa and id_rsa.pub, because I don't want to touch the default keys of that user (these might have a different purpose; for example, I use them to back up the controlled machine). So this is the code:
- name: Install bitbucket aptiko_ro ssh key
  copy:
    dest: /root/.ssh/aptiko_ro_id_rsa
    mode: 0600
    content: "{{ aptiko_ro_ssh_key }}"

- name: Install bitbucket aptiko_ro ssh public key
  copy:
    dest: /root/.ssh/aptiko_ro_id_rsa.pub
    content: "{{ aptiko_ro_ssh_pub_key }}"
Next, you need to tell the controlled machine's ssh: "When you connect to the third-party server, use key X instead of the default key, and log on as user Y". You tell it in this way:
- name: Install ssh config that uses aptiko_ro keys on bitbucket
  copy:
    dest: /root/.ssh/config
    content: |
      Host bitbucket.org
        IdentityFile ~/.ssh/aptiko_ro_id_rsa
        User aptiko_ro
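With the key and the ssh config in place, tasks on the controlled machine can reach the third-party server directly. A minimal sketch of a follow-up task (the repository path and destination are hypothetical):

- name: Clone the private repository using the dedicated key
  git:
    repo: git@bitbucket.org:myteam/myrepo.git   # hypothetical repository
    dest: /srv/myrepo
    accept_hostkey: true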

ANSIBLE: using variables on a hosts file (ssh)

I'm a newbie with this fantastic automation engine, and I have a little issue with the vars file.
For the moment, I must connect via SSH without keypairs, using specific users and passwords.
hosts file
[all:vars]
connection_mode1=ssh
ssh_user1=user1
ssh_pass1=pass1
[serverstest]
host1 ansible_connection=connection_mode1 ansible_ssh_user=ssh_user1 ansible_ssh_pass=ssh_pass1
I have also tried wrapping them with "" and {}, but it doesn't work.
How can I use variables in these parameters?
ansible_ssh_user has been deprecated since version 2.0; it is now ansible_user. See here.
Never store the ansible_ssh_pass variable in plain text; always use a vault. See Variables and Vaults.
Anyway, having a mytest.inventory file as follows
[all:vars]
ssh_user1=user1
[serverstest]
host1 ansible_user="{{ ssh_user1 }}"
it works, e.g.
ansible -i mytest.inventory serverstest -m ping -k
Option -k asks for the password.
If you still want to write the password in the inventory, you can keep the password variable definition and add ansible_ssh_pass="{{ ssh_pass1 }}":
[serverstest]
192.168.15.201 ansible_user="{{ ssh_user1 }}" ansible_ssh_pass="{{ ssh_pass1 }}"
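To keep the password out of plain text entirely (as recommended above), the variables can also live in an encrypted group_vars file next to the inventory. A minimal sketch, assuming the group is serverstest and the file is created with ansible-vault (the layout is just one option):

# group_vars/serverstest.yml - created with: ansible-vault create group_vars/serverstest.yml
ssh_user1: user1
ssh_pass1: pass1

Then run with the vault password prompt instead of -k:

ansible -i mytest.inventory serverstest -m ping --ask-vault-pass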

Ansible `authorized_key` copies the key to remote user but not working when trying to ssh

I have the following task in my ansible playbook that adds my ssh public key for a remote user pranjal that was already created by a previous task.
- authorized_key:
    user: pranjal
    key: "{{ lookup('file', 'pranjal.pub') }}"
When I run the ansible playbook, it runs successfully. However, when I try logging in to the server using ssh pranjal@<server_ip>, I get a Permission denied (publickey) error.
To be sure, I logged into the server as another user and double-checked that the key listed in /home/pranjal/.ssh/authorized_keys matches the local public key I am using to log in.
My guess is that this is a permissions issue, and I understood the solution from a related question.
But how do we change the permissions of authorized_keys from within the Ansible task itself (so that I don't have to separately log into the instance to modify the permissions of .ssh/authorized_keys)?
- file: path=/home/pranjal/.ssh state=directory owner=pranjal mode=0700
- file: path=/home/pranjal/.ssh/authorized_keys state=file owner=pranjal mode=0600
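If you prefer the full YAML syntax, the same fix could look like the sketch below; it additionally sets the group, assuming the user's primary group is also pranjal:

- name: Ensure the .ssh directory is owned by the user and locked down
  file:
    path: /home/pranjal/.ssh
    state: directory
    owner: pranjal
    group: pranjal
    mode: "0700"

- name: Ensure authorized_keys is owned by the user and locked down
  file:
    path: /home/pranjal/.ssh/authorized_keys
    state: file
    owner: pranjal
    group: pranjal
    mode: "0600"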
You may also want to check/verify /etc/ssh/sshd_config has the following:
PubkeyAuthentication yes
You can debug further with ssh -vvv pranjal@<server_ip>.

Command to send public key to remote host

I remember there is a command to send public key to the remote host that I want. I want to use that feature to send one of my public keys to the other host. How can I do that?
You are looking for ssh-copy-id. All this command does is create .ssh and .ssh/authorized_keys and set their permissions appropriately if they don't exist. Then it appends your public key to the end of .ssh/authorized_keys.
You might be looking for this command:
cat ~/.ssh/id_rsa.pub | ssh user@hostname 'cat >> .ssh/authorized_keys'
It appends your public key to the server's authorized keys.
Source
If your server is already set up to not accept password-based login, you might get a Permission denied (publickey) error.
This is another method to send the key, using netcat, so you don't have to authenticate. It will only work over a local network, but you can use port forwarding to do this over the internet.
On the server:
$ nc -l 55555 >> ~/.ssh/authorized_keys
On the client (replace HOSTNAME with the hostname or IP of the server):
$ nc HOSTNAME 55555 < ~/.ssh/id_rsa.pub
You can replace 55555 with an open port of your choice.
source: chat over lan from linux to linux?
Appendix for total newbies: I don't think anyone's mentioned this yet, but if you get ERROR: failed to open ID file '/home/username/.pub': No such file, you need to generate a key first. The Ubuntu help pages have a great guide on Generating RSA Keys.
The other answers have no example for ssh-copy-id, so here it is (first you need to generate a key):
ssh-copy-id user@url
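A slightly fuller sketch, in case you first need to generate a dedicated key and then copy that specific key (the file name and host are placeholders):

ssh-keygen -t ed25519 -f ~/.ssh/my_key          # generate a new keypair: my_key and my_key.pub
ssh-copy-id -i ~/.ssh/my_key.pub user@hostname  # append the public key to the remote authorized_keys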