I created an SSH key pair using the following command:
ssh-keygen -t rsa -C "remote-user" -b 4096
I saved the key pair inside the ~/.ssh directory on my local computer:
-rw------- 1 steve.rogers INTRA\users 3381 Jan 18 16:52 remote-user
-rw------- 1 steve.rogers INTRA\users 742 Jan 18 16:52 remote-user.pub
I have a GCP instance, and I have added the above public key to it for the user remote-user.
After that I tried to SSH into the instance using the following command:
ssh -i ~/.ssh/remote-user remote-user@<gcp-instance-ip>
I was able to SSH into the machine successfully.
After that I tried to execute my playbook:
ansible-playbook setup.yaml --tags "mytag" --extra-vars "env=stg" -i /environments/stg/hosts
The execution did not succeed and ended up with the following error:
<gcp-server102> ESTABLISH SSH CONNECTION FOR USER: None
<gcp-server102> SSH: EXEC ssh -o ControlPersist=15m -F ssh.config -q -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o ConnectTimeout=10 -o ControlPath=/Users/steve.rogers/.ansible/cp/7be59d33ab gcp-server102 '/bin/sh -c '"'"'echo ~ && sleep 0'"'"''
Read vars_file './environments/{{env}}/group_vars/all.yaml'
<gcp-server101> ESTABLISH SSH CONNECTION FOR USER: None
<gcp-server101> SSH: EXEC ssh -o ControlPersist=15m -F ssh.config -q -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o ConnectTimeout=10 -o ControlPath=/Users/steve.rogers/.ansible/cp/6a68673873 gcp-server101 '/bin/sh -c '"'"'echo ~ && sleep 0'"'"''
Read vars_file './environments/{{env}}/group_vars/all.yaml'
<gcp-server201> ESTABLISH SSH CONNECTION FOR USER: None
<gcp-server201> SSH: EXEC ssh -o ControlPersist=15m -F ssh.config -q -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o ConnectTimeout=10 -o ControlPath=/Users/steve.rogers/.ansible/cp/e330878269 gcp-server201 '/bin/sh -c '"'"'echo ~ && sleep 0'"'"''
<gcp-server202> ESTABLISH SSH CONNECTION FOR USER: None
<gcp-server202> SSH: EXEC ssh -o ControlPersist=15m -F ssh.config -q -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o ConnectTimeout=10 -o ControlPath=/Users/steve.rogers/.ansible/cp/6f7ebc0471 gcp-server202 '/bin/sh -c '"'"'echo ~ && sleep 0'"'"''
<gcp-server102> (255, b'', b'remote-user@<gcp-instance-ip>: Permission denied (publickey).\r\n')
<gcp-server101> (255, b'', b'remote-user@<gcp-instance-ip>: Permission denied (publickey).\r\n')
fatal: [gcp-server102]: UNREACHABLE! => {
"changed": false,
"msg": "Failed to connect to the host via ssh: remote-user@<gcp-instance-ip>: Permission denied (publickey).",
"unreachable": true
}
fatal: [gcp-server101]: UNREACHABLE! => {
"changed": false,
"msg": "Failed to connect to the host via ssh: remote-user@<gcp-instance-ip>: Permission denied (publickey).",
"unreachable": true
}
<gcp-server202> (255, b'', b'remote-user@<gcp-instance-ip>: Permission denied (publickey).\r\n')
<gcp-server201> (255, b'', b'remote-user@<gcp-instance-ip>: Permission denied (publickey).\r\n')
fatal: [gcp-server202]: UNREACHABLE! => {
"changed": false,
"msg": "Failed to connect to the host via ssh: remote-user@<gcp-instance-ip>: Permission denied (publickey).",
"unreachable": true
}
fatal: [gcp-server201]: UNREACHABLE! => {
"changed": false,
"msg": "Failed to connect to the host via ssh: remote-user@<gcp-instance-ip>: Permission denied (publickey).",
"unreachable": true
}
I SSHed into the GCP instance and checked the authorized_keys file; it contains the correct public key for remote-user.
These are my ssh.config, ansible.cfg, and hosts files:
ssh.config:
Host bastion-host
User remote-user
HostName <gcp-instance-ip>
ProxyCommand none
IdentityFile ~/.ssh/remote-user
BatchMode yes
PasswordAuthentication no
Host gcp-server*
ServerAliveInterval 60
TCPKeepAlive yes
ProxyCommand ssh -q -A remote-user@<gcp-instance-ip> nc %h %p
ControlMaster auto
ControlPersist 8h
User remote-user
IdentityFile ~/.ssh/remote-user
ansible.cfg:
[ssh_connection]
ssh_args = -o ControlPersist=15m -F ssh.config -q
scp_if_ssh = True
[defaults]
host_key_checking = False
hosts:
[gcp_instance]
gcp_instance01
[gcp_instance:children]
gcp_instance_level_01
gcp_instance_level_02
[gcp_instance_level_01]
gcp-server102
gcp-server101
[gcp_instance_level_02]
gcp-server201
gcp-server202
What could be the issue that is preventing my playbook from executing?
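One hedged reading of the log above: the lines ESTABLISH SSH CONNECTION FOR USER: None show that Ansible itself has no remote user configured and is relying entirely on ssh.config. A sketch of making the user and key explicit as inventory variables (ansible_user and ansible_ssh_private_key_file are standard Ansible inventory variables; the host names are the ones from the hosts file above):

```shell
# Write an example inventory fragment that pins the SSH user and key
# per host, then confirm that both hosts carry the variable.
cat > /tmp/hosts_example <<'EOF'
[gcp_instance_level_01]
gcp-server102 ansible_user=remote-user ansible_ssh_private_key_file=~/.ssh/remote-user
gcp-server101 ansible_user=remote-user ansible_ssh_private_key_file=~/.ssh/remote-user
EOF
grep -c 'ansible_user=remote-user' /tmp/hosts_example   # prints 2
```

Also note that -F ssh.config is resolved relative to the directory ansible-playbook runs from, so the file (and its Host gcp-server* block) must actually be found for the ProxyCommand and IdentityFile settings to apply.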
I'm trying to synchronize a remote VM with a playbook, but it hangs on the SSH connection; I would be grateful for your help. This is the detail of the error:
root@pve:~# ansible-playbook ansible/playbook/timezone.yml -vvv
ansible-playbook 2.7.7
config file = /etc/ansible/ansible.cfg
configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python3/dist-packages/ansible
executable location = /usr/bin/ansible-playbook
python version = 3.7.3 (default, Jan 22 2021, 20:04:44) [GCC 8.3.0]
Using /etc/ansible/ansible.cfg as config file
/etc/ansible/hosts did not meet host_list requirements, check plugin documentation if this is unexpected
/etc/ansible/hosts did not meet script requirements, check plugin documentation if this is unexpected
Parsed /etc/ansible/hosts inventory source with ini plugin
PLAYBOOK: timezone.yml *************************************************************************************************
1 plays in ansible/playbook/timezone.yml
PLAY [Set timezone and configure timesyncd] ****************************************************************************
META: ran handlers
TASK [set timezone] ****************************************************************************************************
task path: /root/ansible/playbook/timezone.yml:6
<10.10.2.56> ESTABLISH SSH CONNECTION FOR USER: ibrahimy
<10.10.2.56> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o User=ibrahimy -o ConnectTimeout=10 -o ControlPath=/root/.ansible/cp/b542c5a8f2 10.10.2.56 '/bin/sh -c '"'"'echo ~ibrahimy && sleep 0'"'"''
<10.10.2.56> (0, b'/home/ibrahimy\n', b'')
<10.10.2.56> ESTABLISH SSH CONNECTION FOR USER: ibrahimy
<10.10.2.56> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o User=ibrahimy -o ConnectTimeout=10 -o ControlPath=/root/.ansible/cp/b542c5a8f2 10.10.2.56 '/bin/sh -c '"'"'( umask 77 && mkdir -p "` echo /home/ibrahimy/.ansible/tmp `"&& mkdir /home/ibrahimy/.ansible/tmp/ansible-tmp-1640189176.0268686-25363-16402981562839 && echo ansible-tmp-1640189176.0268686-25363-16402981562839="` echo /home/ibrahimy/.ansible/tmp/ansible-tmp-1640189176.0268686-25363-16402981562839 `" ) && sleep 0'"'"''
<10.10.2.56> (0, b'ansible-tmp-1640189176.0268686-25363-16402981562839=/home/ibrahimy/.ansible/tmp/ansible-tmp-1640189176.0268686-25363-16402981562839\n', b'')
Using module file /usr/lib/python3/dist-packages/ansible/modules/commands/command.py
<10.10.2.56> PUT /root/.ansible/tmp/ansible-local-25357npsegr7h/tmpzsxo66mr TO /home/ibrahimy/.ansible/tmp/ansible-tmp-1640189176.0268686-25363-16402981562839/AnsiballZ_command.py
<10.10.2.56> SSH: EXEC sftp -b - -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o User=ibrahimy -o ConnectTimeout=10 -o ControlPath=/root/.ansible/cp/b542c5a8f2 '[10.10.2.56]'
<10.10.2.56> (0, b'sftp> put /root/.ansible/tmp/ansible-local-25357npsegr7h/tmpzsxo66mr /home/ibrahimy/.ansible/tmp/ansible-tmp-1640189176.0268686-25363-16402981562839/AnsiballZ_command.py\n', b'')
<10.10.2.56> ESTABLISH SSH CONNECTION FOR USER: ibrahimy
<10.10.2.56> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o User=ibrahimy -o ConnectTimeout=10 -o ControlPath=/root/.ansible/cp/b542c5a8f2 10.10.2.56 '/bin/sh -c '"'"'chmod u+x /home/ibrahimy/.ansible/tmp/ansible-tmp-1640189176.0268686-25363-16402981562839/ /home/ibrahimy/.ansible/tmp/ansible-tmp-1640189176.0268686-25363-16402981562839/AnsiballZ_command.py && sleep 0'"'"''
<10.10.2.56> (0, b'', b'')
<10.10.2.56> ESTABLISH SSH CONNECTION FOR USER: ibrahimy
<10.10.2.56> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o User=ibrahimy -o ConnectTimeout=10 -o ControlPath=/root/.ansible/cp/b542c5a8f2 -tt 10.10.2.56 '/bin/sh -c '"'"'/usr/bin/python3 /home/ibrahimy/.ansible/tmp/ansible-tmp-1640189176.0268686-25363-16402981562839/AnsiballZ_command.py && sleep 0'"'"''
While gathering information it connects successfully, but while performing the task it fails, and the error is as follows:
fatal: [10.10.2.56]: FAILED! => {
"changed": true,
"cmd": "timedatectl set-timezone Africa/Casablanca",
"delta": "0:00:25.834940",
"end": "2021-12-22 02:44:26.708523",
"invocation": {
"module_args": {
"_raw_params": "timedatectl set-timezone Africa/Casablanca",
"_uses_shell": true,
"argv": null,
"chdir": null,
"creates": null,
"executable": null,
"removes": null,
"stdin": null,
"warn": true
}
},
"msg": "non-zero return code",
"rc": 1,
"start": "2021-12-22 02:44:00.873583",
"stderr": "\u001b[0;1;31mFailed to set time zone: Connection timed out\u001b[0m",
"stderr_lines": [
"\u001b[0;1;31mFailed to set time zone: Connection timed out\u001b[0m"
],
"stdout": "",
"stdout_lines": []
}
The playbook file:
- name: Set timezone and configure timesyncd
  hosts: "*"
  gather_facts: yes
  become: no
  tasks:
    - name: set timezone
      shell: timedatectl set-timezone Africa/Casablanca
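The stderr "Failed to set time zone: Connection timed out" comes from timedatectl talking to systemd over D-Bus, which commonly fails for an unprivileged user rather than from SSH itself. A hedged sketch of an alternative, assuming sudo works on the target: escalate with become and use Ansible's timezone module instead of shelling out:

```shell
# Write the reworked playbook to a scratch path and confirm the
# escalation flag is present; the play mirrors the one above.
cat > /tmp/timezone_fixed.yml <<'EOF'
- name: Set timezone and configure timesyncd
  hosts: "*"
  gather_facts: yes
  become: yes
  tasks:
    - name: set timezone
      timezone:
        name: Africa/Casablanca
EOF
grep -c 'become: yes' /tmp/timezone_fixed.yml   # prints 1
```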
So I am trying to run an Ansible playbook with become: yes, because when I run it as my normal user the user has no permission and the playbook fails. That user does have sudo access on the server when I run commands manually. I can reach the other server, and playbooks run fine without become: yes as long as I stay inside my own home directory on the slave server, but that's it. When I use become: yes I get the error below, and I don't know how to fix it. Can someone please help me? This is the error:
PLAY [install ansible] ************************************************************************************************************************************************************************************
TASK [Gathering Facts] ************************************************************************************************************************************************************************************
fatal: [h0011146.associatesys.local]: FAILED! => {"ansible_facts": {}, "changed": false, "failed_modules": {"setup": {"failed": true, "module_stderr": "Shared connection to h0011146.associatesys.local closed.\r\n", "module_stdout": "", "msg": "MODULE FAILURE\nSee stdout/stderr for the exact error", "rc": 1}}, "msg": "The following modules failed to execute: setup\n"}
PLAY RECAP ************************************************************************************************************************************************************************************************
h0011146.associatesys.local : ok=0 changed=0 unreachable=0 failed=1 skipped=0 rescued=0 ignored=0
koebra@h0011145: /etc/ansible/roles>
THIS IS MY HOSTS FILE
#
# It should live in /etc/ansible/hosts
#
# - Comments begin with the '#' character
# - Blank lines are ignored
# - Groups of hosts are delimited by [header] elements
# - You can enter hostnames or ip addresses
# - A hostname/ip can be a member of multiple groups
[slave]
h0011146.associatesys.local ansible_connection=ssh ansible_python_interpreter=/usr/bin/python # ansible_user=root
This is the playbook that fails
---
- name: install ansible
  hosts: slave
  become: yes
  tasks:
    - name: install
      yum:
        name: ansible
        state: latest
THIS IS THE FULL OUTPUT OF -VVV
koebra@h0011145: /etc/ansible/roles> ansible-playbook ansible.yml
PLAY [install ansible] ************************************************************************************************************************************************************************************
TASK [Gathering Facts] ************************************************************************************************************************************************************************************
^C [ERROR]: User interrupted execution
koebra@h0011145: /etc/ansible/roles> ansible-playbook ansible.yml -vvv
ansible-playbook 2.9.10
config file = /etc/ansible/ansible.cfg
configured module search path = [u'/home/koebra/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python2.7/site-packages/ansible
executable location = /usr/bin/ansible-playbook
python version = 2.7.5 (default, Jun 11 2019, 14:33:56) [GCC 4.8.5 20150623 (Red Hat 4.8.5-39)]
Using /etc/ansible/ansible.cfg as config file
host_list declined parsing /etc/ansible/hosts as it did not pass its verify_file() method
script declined parsing /etc/ansible/hosts as it did not pass its verify_file() method
auto declined parsing /etc/ansible/hosts as it did not pass its verify_file() method
Parsed /etc/ansible/hosts inventory source with ini plugin
PLAYBOOK: ansible.yml *************************************************************************************************************************************************************************************
1 plays in ansible.yml
PLAY [install ansible] ************************************************************************************************************************************************************************************
TASK [Gathering Facts] ************************************************************************************************************************************************************************************
task path: /etc/ansible/roles/ansible.yml:3
<h0011146.associatesys.local> ESTABLISH SSH CONNECTION FOR USER: None
<h0011146.associatesys.local> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o ConnectTimeout=10 -o ControlPath=/home/koebra/.ansible/cp/8a6e5420a0 h0011146.associatesys.local '/bin/sh -c '"'"'echo ~ && sleep 0'"'"''
<h0011146.associatesys.local> (0, '/home/koebra\n', '')
<h0011146.associatesys.local> ESTABLISH SSH CONNECTION FOR USER: None
<h0011146.associatesys.local> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o ConnectTimeout=10 -o ControlPath=/home/koebra/.ansible/cp/8a6e5420a0 h0011146.associatesys.local '/bin/sh -c '"'"'( umask 77 && mkdir -p "` echo /home/koebra/.ansible/tmp `"&& mkdir /home/koebra/.ansible/tmp/ansible-tmp-1606933213.23-55559-199169178631287 && echo ansible-tmp-1606933213.23-55559-199169178631287="` echo /home/koebra/.ansible/tmp/ansible-tmp-1606933213.23-55559-199169178631287 `" ) && sleep 0'"'"''
<h0011146.associatesys.local> (0, 'ansible-tmp-1606933213.23-55559-199169178631287=/home/koebra/.ansible/tmp/ansible-tmp-1606933213.23-55559-199169178631287\n', '')
Using module file /usr/lib/python2.7/site-packages/ansible/modules/system/setup.py
<h0011146.associatesys.local> PUT /home/koebra/.ansible/tmp/ansible-local-55549z92f94/tmpO76wSg TO /home/koebra/.ansible/tmp/ansible-tmp-1606933213.23-55559-199169178631287/AnsiballZ_setup.py
<h0011146.associatesys.local> SSH: EXEC sftp -b - -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o ConnectTimeout=10 -o ControlPath=/home/koebra/.ansible/cp/8a6e5420a0 '[h0011146.associatesys.local]'
<h0011146.associatesys.local> (0, 'sftp> put /home/koebra/.ansible/tmp/ansible-local-55549z92f94/tmpO76wSg /home/koebra/.ansible/tmp/ansible-tmp-1606933213.23-55559-199169178631287/AnsiballZ_setup.py\n', '')
<h0011146.associatesys.local> ESTABLISH SSH CONNECTION FOR USER: None
<h0011146.associatesys.local> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o ConnectTimeout=10 -o ControlPath=/home/koebra/.ansible/cp/8a6e5420a0 h0011146.associatesys.local '/bin/sh -c '"'"'chmod u+x /home/koebra/.ansible/tmp/ansible-tmp-1606933213.23-55559-199169178631287/ /home/koebra/.ansible/tmp/ansible-tmp-1606933213.23-55559-199169178631287/AnsiballZ_setup.py && sleep 0'"'"''
<h0011146.associatesys.local> (0, '', '')
<h0011146.associatesys.local> ESTABLISH SSH CONNECTION FOR USER: None
<h0011146.associatesys.local> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o ConnectTimeout=10 -o ControlPath=/home/koebra/.ansible/cp/8a6e5420a0 -tt h0011146.associatesys.local '/bin/sh -c '"'"'sudo -H -S -n -u root /bin/sh -c '"'"'"'"'"'"'"'"'echo BECOME-SUCCESS-xlbmctdergsnsmfzmvctpkiayaendarz ; /usr/bin/python /home/koebra/.ansible/tmp/ansible-tmp-1606933213.23-55559-199169178631287/AnsiballZ_setup.py'"'"'"'"'"'"'"'"' && sleep 0'"'"''
Escalation succeeded
<h0011146.associatesys.local> (1, '', 'Shared connection to h0011146.associatesys.local closed.\r\n')
<h0011146.associatesys.local> Failed to connect to the host via ssh: Shared connection to h0011146.associatesys.local closed.
<h0011146.associatesys.local> ESTABLISH SSH CONNECTION FOR USER: None
<h0011146.associatesys.local> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o ConnectTimeout=10 -o ControlPath=/home/koebra/.ansible/cp/8a6e5420a0 h0011146.associatesys.local '/bin/sh -c '"'"'rm -f -r /home/koebra/.ansible/tmp/ansible-tmp-1606933213.23-55559-199169178631287/ > /dev/null 2>&1 && sleep 0'"'"''
<h0011146.associatesys.local> (0, '', '')
fatal: [h0011146.associatesys.local]: FAILED! => {
"ansible_facts": {},
"changed": false,
"failed_modules": {
"setup": {
"failed": true,
"module_stderr": "Shared connection to h0011146.associatesys.local closed.\r\n",
"module_stdout": "",
"msg": "MODULE FAILURE\nSee stdout/stderr for the exact error",
"rc": 1
}
},
"msg": "The following modules failed to execute: setup\n"
}
PLAY RECAP ************************************************************************************************************************************************************************************************
h0011146.associatesys.local : ok=0 changed=0 unreachable=0 failed=1 skipped=0 rescued=0 ignored=0
THIS WAS THE OUTPUT IN /VAR/LOG/MESSAGES OF MASTER SERVER
Dec 2 12:33:40 h0011145 dzdo[56701]: WARN dz.common Username not found for given run as user cas. Error: No such file or directory
Dec 2 12:33:40 h0011145 adclient[2410]: INFO AUDIT_TRAIL|Centrify Suite|dzdo|1.0|4|dzdo granted|5|user=koebra(type:ad,koebra@PROD-AM.AMERITRADE.COM) pid=56701 utc=1606934020062 centrifyEventID=30004 DASessID=df052d84-b898-d44b-81ff-6eeced715fc4 DAInst=N/A status=GRANTED service=dzdo command=/usr/bin/tail runas=root role=ad.role.unix.admin/Unix env=(none) MfaRequired=false EntityName=prod-am.ameritrade.com\\h0011145
koebra@h0011145: /etc/ansible/roles>
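The /var/log/messages excerpt shows Centrify's dzdo granting the command, not plain sudo, so the sudo -H -S -n -u root call Ansible issues may be hitting a different policy path than the manual commands. One hedged possibility (assuming the dzdo become method is available in this Ansible version) is to escalate through dzdo instead; a sketch of the relevant ansible.cfg section:

```shell
# Write an example ansible.cfg privilege-escalation section and echo
# the method line back; "dzdo" here is an assumption, not a verified fix.
cat > /tmp/ansible_dzdo.cfg <<'EOF'
[privilege_escalation]
become = True
become_method = dzdo
EOF
grep '^become_method' /tmp/ansible_dzdo.cfg   # prints: become_method = dzdo
```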
I have an issue when updating a Vagrant box (Vagrant 2.2.3 on Windows 10).
The cause of the error is rsync: it can't synchronize (so my shared folders are not working, I think):
Command: "rsync" "--verbose" "--archive" "--delete" "-z" "--copy-links" "--chmod=ugo=rwX" "--no-perms" "--no-owner" "--no-group" "--rsync-path" "sudo rsync" "-e" "ssh -p 2222 -o LogLevel=FATAL -o IdentitiesOnly=yes -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null -i 'C:/Users/my_user/boxes-puphpet/debian/.vagrant/machines/default/virtualbox/private_key'" "--exclude" ".vagrant/" "/cygdrive/c/Users/my_user/boxes-puphpet/debian/" "vagrant@127.0.0.1:/vagrant"
Error: rsync: pipe: Connection timed out (116)
rsync error: error in IPC code (code 14) at pipe.c(59) [sender=3.1.3]
INFO interface: Machine: error-exit ["Vagrant::Errors::RSyncError", "There was an error when attempting to rsync a synced folder.\nPlease inspect the error message below for more info.\n\nHost path: /cygdrive/c/Users/my_user/boxes-puphpet/debian/\nGuest path: /vagrant\nCommand: \"rsync\" \"--verbose\" \"--archive\" \"--delete\" \"-z\" \"--copy-links\" \"--chmod=ugo=rwX\" \"--no-perms\" \"--no-owner\" \"--no-group\" \"--rsync-path\" \"sudo rsync\" \"-e\" \"ssh -p 2222 -o LogLevel=FATAL -o IdentitiesOnly=yes -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null -i 'C:/Users/my_user/boxes-puphpet/debian/.vagrant/machines/default/virtualbox/private_key'\" \"--exclude\" \".vagrant/\" \"/cygdrive/c/Users/my_user/boxes-puphpet/debian/\" \"vagrant@127.0.0.1:/vagrant\"\nError: rsync: pipe: Connection timed out (116)\nrsync error: error in IPC code (code 14) at pipe.c(59) [sender=3.1.3]\n"]
Here is my Vagrantfile:
# -*- mode: ruby -*-
# vi: set ft=ruby :
Vagrant.configure("2") do |config|
config.vm.box = "debian/jessie64"
config.vm.box_version = "8.10.0"
config.vm.network "private_network", ip: "192.168.56.222"
config.vm.synced_folder "C:/Users/f.pestre/www/debian.vm/www/", "/var/www"
config.vm.provider "virtualbox" do |vb|
vb.memory = "4048"
end
#config.vm.provision :shell, path: "bootstrap.sh"
end
I can log in with vagrant ssh, but the synced folder doesn't work at all.
Thanks.
F.
Add the line below to your Vagrantfile; it disables the default /vagrant synced folder, so Vagrant no longer tries to rsync it:
config.vm.synced_folder '.', '/vagrant', disabled: true
I am trying to provision an EC2 instance and to install a LAMP server on it using Ansible from localhost. I have successfully provisioned the instance, but I was not able to install Apache, PHP, and MySQL due to this error: "Failed to connect to the host via ssh.".
OS: El Capitan 10.11.6
Ansible: 2.0.2.0
Here is the playbook:
---
- hosts: localhost
  connection: local
  gather_facts: no
  vars_files:
    - "vars/{{ project_name }}.yml"
    - "vars/vpc_info.yml"
  tasks:
    - name: Provision
      local_action:
        module: ec2
        region: "xxxxxx"
        vpc_subnet_id: "xxxxxx"
        assign_public_ip: yes
        key_name: "xxxxxxx"
        instance_type: "t2.nano"
        image: "xxxxxxxx"
        wait: yes
        instance_tags:
          Name: "LAMP"
          class: "test"
          environment: "dev"
          project: "{{ project_name }}"
          az: a
        exact_count: 1
        count_tag:
          Name: "LAMP"
        monitoring: yes
      register: ec2a

- hosts: lamp
  roles:
    - lamp_server
The content of the ansible.cfg file:
[defaults]
private_key_file=/Users/nico/.ssh/xxxxx.pem
The inventory:
lamp ansible_ssh_host=<EC2 IP> ansible_user=ubuntu
The command used for running the playbook:
ansible-playbook -i inventory ec2_up.yml -e project_name="lamp_server" -vvvv
Output:
ESTABLISH SSH CONNECTION FOR USER: ubuntu
<xxxxxxxxxx> SSH: EXEC ssh -C -vvv -o ControlMaster=auto -o ControlPersist=60s -o 'IdentityFile="/Users/nico/.ssh/xxxxxxx.pem"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o User=ubuntu -o ConnectTimeout=10 -o ControlPath=/Users/nico/.ansible/cp/ansible-ssh-%h-%p-%r xxxxxxx '/bin/sh -c '"'"'( umask 22 && mkdir -p "` echo $HOME/.ansible/tmp/ansible-tmp-1475186461.08-93383010782630 `" && echo "` echo $HOME/.ansible/tmp/ansible-tmp-1475186461.08-93383010782630 `" )'"'"''
52.28.251.117 | UNREACHABLE! => {
"changed": false,
"msg": "Failed to connect to the host via ssh.",
"unreachable": true
}
I have read a lot of threads regarding this error, but nothing helped me. :(
ansible-playbook -i inventory ec2_up.yml -e project_name="lamp_server" -vvvv -c paramiko
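Another hedged guess for this setup: right after the ec2 task returns, the instance may not be accepting SSH connections yet, so the second play fails even when the key and user are correct. A sketch of a wait task to append to the first play (it assumes the registered ec2a result exposes instances with a public_ip field, as older ec2 module versions did):

```shell
# Example task fragment, written out so the field names are visible;
# indentation matches a task list inside the localhost play above.
cat > /tmp/wait_for_ssh.yml <<'EOF'
- name: Wait for SSH to come up
  wait_for:
    host: "{{ item.public_ip }}"
    port: 22
    delay: 30
    timeout: 320
  with_items: "{{ ec2a.instances }}"
EOF
grep -c 'wait_for' /tmp/wait_for_ssh.yml   # prints 1
```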
I wrote the command below, which copies the id_dsa.pub file to another server as part of my auto-login feature. But every time, the message below appears on the console:
spawn scp -o StrictHostKeyChecking=no /opt/mgtservices/.ssh/id_dsa.pub root@12.43.22.47:/root/.ssh/id_dsa.pub
Password:
Password:
This is the script I wrote for it:
function sshkeygenerate()
{
    if ! [ -f $HOME/.ssh/id_dsa.pub ]; then
        expect -c" spawn ssh-keygen -t dsa -f $HOME/.ssh/id_dsa
            expect y/n { send y\r ; exp_continue }
            expect passphrase): { send \r ; exp_continue }
            expect again: { send \r ; exp_continue }
            spawn chmod 700 $HOME/.ssh && chmod 700 $HOME/.ssh/*
            exit "
    fi
    expect -c"spawn scp -o StrictHostKeyChecking=no $HOME/.ssh/id_dsa.pub root@12.43.22.47:/root/.ssh/id_dsa.pub
        expect *assword: { send $ROOTPWD\r }
        expect yes/no { send yes\r ; exp_continue }
        spawn ssh -o StrictHostKeyChecking=no root@12.43.22.47 \"chmod 755 /root/.ssh/authorized_keys\"
        expect *assword: { send $ROOTPWD\r }
        expect yes/no { send yes\r ; exp_continue }
        spawn ssh -o StrictHostKeyChecking=no root@12.43.22.47 \"cat /root/.ssh/id_dsa.pub >> /root/.ssh/authorized_keys\"
        expect *assword: { send $ROOTPWD\r }
        expect yes/no { send yes\r ; exp_continue }
        sleep 1
        exit"
}
You should first set up passwordless SSH to the destination server; then you won't need to enter the password when you do the scp.
Assuming 192.168.0.11 is the destination machine:
1) ssh-keygen -t rsa
2) ssh sheena@192.168.0.11 mkdir -p .ssh
3) cat .ssh/id_rsa.pub | ssh sheena@192.168.0.11 'cat >> .ssh/authorized_keys'
4) ssh sheena@192.168.0.11 "chmod 700 .ssh; chmod 640 .ssh/authorized_keys"
Link for reference:
http://www.tecmint.com/ssh-passwordless-login-using-ssh-keygen-in-5-easy-steps/
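For illustration, the remote-side effect of steps 2-4 can be reproduced locally (the directory is a scratch path and the key line is a fake placeholder):

```shell
# Simulate what the remote host ends up with: a locked-down ~/.ssh
# containing an authorized_keys file with the copied public key.
demo=$(mktemp -d)
mkdir -p "$demo/.ssh" && chmod 700 "$demo/.ssh"
printf 'ssh-rsa AAAAB3...example sheena@laptop\n' >> "$demo/.ssh/authorized_keys"
chmod 640 "$demo/.ssh/authorized_keys"
ls "$demo/.ssh"   # prints: authorized_keys
```

In practice, ssh-copy-id sheena@192.168.0.11 performs steps 2-4 in a single command on systems where it is installed.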