How to use local ssh key in vagrant ubuntu 14.04 - ssh

How do I set Vagrant to use my local SSH keys for GitHub? I have working SSH keys on my local machine and have created a Vagrant machine, and I want to use those local SSH keys in my Vagrant Ubuntu machine. How can I achieve this?
Vagrant::Config.run do |config|
  # stuff
  config.ssh.forward_agent = true
end
This is not working for me. Any suggestions?
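(For reference, Vagrant::Config.run is the old Vagrant 1.0 configuration style; on current versions the same settings go in a Vagrant.configure("2") block, as the answers below show. A minimal sketch, with the box name as a placeholder:)
Vagrant.configure("2") do |config|
  config.vm.box = "ubuntu/trusty64"  # placeholder box for Ubuntu 14.04
  # Forward the host's ssh-agent into the guest so git on the guest can use local keys
  config.ssh.forward_agent = true
end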

Related

"ssh_exchange_identification: read: Connection reset by peer" happens while tryng to log in vagrant box over ssh

I was trying to deploy four Vagrant boxes with two different OSes: two of them are Ubuntu and the other two are CentOS. My Vagrantfile configuration is below:
config.vm.define "ubuntu" do |ubuntu|
ubuntu.vm.hostname="ubuntu"
ubuntu.vm.box="bento/ubuntu-17.10"
ubuntu.vm.network "private_network", ip:"192.168.33.10"
end
config.vm.define "centos" do |centos|
centos.vm.hostname="centos"
centos.vm.box="bento/centos-7.4"
centos.vm.network "private_network", ip:"192.168.33.20"
end
config.vm.define "server1" do |server1|
server1.vm.hostname="server1"
server1.vm.box="bento/ubuntu-17.10"
server1.vm.network "private_network", ip:"192.168.33.30"
end
config.vm.define "server2" do |server2|
server2.vm.hostname="server2"
server2.vm.box="bento/centos-7.4"
server2.vm.network "private_network", ip:"192.168.33.40"
end
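(For completeness: these define blocks sit inside the standard outer wrapper, which the excerpt omits:)
Vagrant.configure("2") do |config|
  # ... the four config.vm.define blocks above ...
end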
After successfully executing vagrant up, I checked vagrant status and found all boxes were running OK:
ubuntu running (virtualbox)
centos running (virtualbox)
server1 running (virtualbox)
server2 running (virtualbox)
However, when I tried to log in to each machine using the vagrant ssh ubuntu, vagrant ssh centos, vagrant ssh server1, and vagrant ssh server2 commands, every machine could be logged in to except server1. When I tried to access server1 using vagrant ssh server1, this error showed:
"ssh_exchange_identification: read: Connection reset by peer"
I am using Vagrant 2.2.4 on my elementary OS Loki system. Seeking help from the experienced. TIA.
The issue has been solved. I don't know what was actually wrong! However, I tried the following simple steps and fortunately it worked: 1. vagrant destroy 2. vagrant up
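(If only one machine is broken, the same recovery can presumably be scoped to it by name, using the machine names from the Vagrantfile above; -f just skips the confirmation prompt:)
vagrant destroy server1 -f
vagrant up server1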

Unable to login to vagrant with vagrant up after setting up ssh keys

I have a new user in my Vagrant box (trusty64) and I am trying to SSH into it. Instead of logging in as the vagrant user after vagrant up, I want to log in with my own username.
What I have done so far
Created a user in my guest machine.
Created ssh key in my host using ssh-keygen
Copied the ssh key to the guest using ssh-copy-id -p 2222 -i shash@127.0.0.1
and the part of the Vagrantfile looks like this
config.vm.box = "ubuntu/trusty64"
config.ssh.username = "shash"
config.ssh.forward_agent = true
config.ssh.private_key_path = "~/.ssh/authorized_keys"
I can use ssh -p 2222 shash@127.0.0.1 to log in directly, but when I run vagrant up I keep getting the following error:
default: Warning: Connection timeout. Retrying...
default: Warning: Authentication failure. Retrying...
default: Warning: Authentication failure. Retrying...
Any help in sorting this out is really appreciated. Thanks!
A complete set-up guide would be really helpful.
The Vagrantfile will access that user's home directory when you specify '~'. Spell the path out explicitly:
config.ssh.private_key_path = "/home/shash/.ssh/authorized_keys"
Give that a go!
Add it to the Vagrantfile:
Vagrant.configure("2") do |config|
config.ssh.private_key_path = "~/.ssh/id_rsa"
config.ssh.forward_agent = true
end
config.ssh.private_key_path is your local private key
Your private key must be available to the local ssh-agent. You can check with ssh-add -L; if it's not listed, add it with ssh-add ~/.ssh/id_rsa.
Don't forget to add your public key to ~/.ssh/authorized_keys on the Vagrant VM. You can do it by copy-and-pasting or by using a tool like ssh-copy-id.
https://stackoverflow.com/a/23554973/3563993

Can't ssh to vagrant VMs using the insecure private key (vagrant 1.7.2)

I have a cluster of 3 VMs. Here is the Vagrantfile:
# -*- mode: ruby -*-
# vi: set ft=ruby :
hosts = {
  "host0" => "192.168.33.10",
  "host1" => "192.168.33.11",
  "host2" => "192.168.33.12"
}
Vagrant.configure("2") do |config|
  config.vm.box = "precise64"
  config.vm.box_url = "http://files.vagrantup.com/precise64.box"
  config.ssh.private_key_path = File.expand_path('~/.vagrant.d/insecure_private_key')
  hosts.each do |name, ip|
    config.vm.define name do |machine|
      machine.vm.hostname = "%s.example.org" % name
      machine.vm.network :private_network, ip: ip
      machine.vm.provider "virtualbox" do |v|
        v.name = name
        # v.customize ["modifyvm", :id, "--memory", 200]
      end
    end
  end
end
This used to work until I upgraded recently:
ssh -i ~/.vagrant.d/insecure_private_key vagrant@192.168.33.10
Instead, vagrant asks for a password.
It seems that recent versions of Vagrant (I'm on 1.7.2) create a secure private key for each machine. I discovered this by running
vagrant ssh-config
The output shows different keys for each host. I verified the keys are different by diffing them.
I tried to force the insecure key by setting config.ssh.private_key_path in the Vagrantfile, but it doesn't work.
The reason I want to use the insecure key for all machines is that I want to provision them from the outside using ansible. I don't want to use the Ansible provisioner, but treat the VMs as remote servers. So, the Vagrantfile is just used to specify the machines in the cluster and then provisioning will be done externally.
The documentation still says that by default machines will use the insecure private key.
How can I make my VMs use the insecure private key?
Vagrant changed this behaviour between versions 1.6 and 1.7 and now inserts an auto-generated key in place of the default insecure one.
You can cancel this behaviour by setting config.ssh.insert_key = false in your Vagrantfile.
Vagrant shouldn't replace the insecure key if you specify private_key_path as you did; however, the internal logic checks whether private_key_path points to the default insecure_private_key, and if it does, Vagrant replaces it.
More info can be found here.
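Putting the two settings together, a minimal Vagrantfile sketch that keeps the shared insecure key (box name and key path taken from the question above):
Vagrant.configure("2") do |config|
  config.vm.box = "precise64"
  # Keep using the shared insecure key instead of a per-machine generated key
  config.ssh.insert_key = false
  config.ssh.private_key_path = File.expand_path('~/.vagrant.d/insecure_private_key')
end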
When Vagrant creates a new SSH key, it is saved with the default configuration below the Vagrantfile directory, at .vagrant/machines/default/virtualbox/private_key.
Using the autogenerated key, you can log in from the same directory as the Vagrantfile like this:
ssh -i .vagrant/machines/default/virtualbox/private_key -p 2222 vagrant@localhost
To learn all the details about the actual SSH configuration of a Vagrant box, use the vagrant ssh-config command:
# vagrant ssh-config
Host default
  HostName 127.0.0.1
  User vagrant
  Port 2222
  UserKnownHostsFile /dev/null
  StrictHostKeyChecking no
  PasswordAuthentication no
  IdentityFile /Users/babo/src/centos/.vagrant/machines/default/virtualbox/private_key
  IdentitiesOnly yes
  LogLevel FATAL
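A handy way to reuse this output with plain ssh (rather than vagrant ssh) is to dump it to a file and point ssh at it with -F, for example:
vagrant ssh-config > vagrant-ssh.conf
ssh -F vagrant-ssh.conf default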
Adding config.ssh.insert_key = false to the Vagrantfile and removing the new VM private key .vagrant/machines/default/virtualbox/private_key makes Vagrant automatically update vagrant ssh-config with the correct private key, ~/.vagrant.d/insecure_private_key. The last thing I had to do was SSH into the VM and update the authorized_keys file on the VM: curl https://raw.githubusercontent.com/mitchellh/vagrant/master/keys/vagrant.pub > ~/.ssh/authorized_keys
tl;dr:
ssh vagrant@127.0.0.1 -p 2222 -i ~/www/vw/vw-environment/.vagrant/machines/default/virtualbox/private_key
I couldn't get this to work, so in the end I added the following to the ssh.rb Ruby script (/opt/vagrant/embedded/gems/gems/vagrant-1.7.1/lib/vagrant/util/ssh.rb):
print(*command_options)
just before this line that executes the ssh call
SafeExec.exec("ssh", *command_options)
That prints out all the command options passed to the ssh call; from there you can work out something that works for you, based on what Vagrant calculates to be the correct ssh parameters.
If you are specifically using Ansible (not the Vagrant Ansible provisioner), you might want to consider using the vagrant dynamic inventory script from Ansible's repo:
https://github.com/ansible/ansible/blob/devel/contrib/inventory/vagrant.py
Alternatively, you can handcraft your own script and dynamically build your own Vagrant inventory file:
SYSTEMS=$(vagrant status | grep running | cut -d ' ' -f1)
echo '[vagrant_systems]' > vagrant.ini
for SYSTEM in ${SYSTEMS}; do
SSHCONFIG=$(vagrant ssh-config ${SYSTEM})
IDENTITY_FILE=$(echo "${SSHCONFIG}" | grep -o "\/.*${SYSTEM}.*")
PORT=$(echo "${SSHCONFIG}" | grep -oE '[0-9]{4,5}')
echo "${SYSTEM} ansible_ssh_host=127.0.0.1 ansible_ssh_port=${PORT} ansible_ssh_private_key_file=${IDENTITY_FILE}" >> vagrant.ini
done
Then use ansible-playbook -i vagrant.ini
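For illustration, under these assumptions the generated vagrant.ini would look roughly like this (ports and key paths are hypothetical; Vagrant forwards guest port 22 to host port 2222 by default and picks other ports on collision):
[vagrant_systems]
ubuntu ansible_ssh_host=127.0.0.1 ansible_ssh_port=2222 ansible_ssh_private_key_file=/path/to/project/.vagrant/machines/ubuntu/virtualbox/private_key
centos ansible_ssh_host=127.0.0.1 ansible_ssh_port=2200 ansible_ssh_private_key_file=/path/to/project/.vagrant/machines/centos/virtualbox/private_key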
If you try to use ~/.ssh/config instead, you'll have to dynamically create or edit existing entries, as the SSH ports can change (due to Vagrant's collision detection).

vagrant ssh keys chicken and egg

I use vagrant with a 3rd party linux box.
The box has the default vagrant/vagrant credentials.
In my Vagrantfile I want it to use SSH keys, so I have this:
config.vm.provision :shell, :path => "bootstrap.sh"
config.ssh.private_key_path = "~/.ssh/id_rsa"
config.ssh.forward_agent = true
In my bootstrap script I want to add my public key to authorized_keys. This works if I do it post VM creation.
But when I re-provision the VM from scratch, the VM has not yet received the public key through my bootstrap shell script.
How can I have vagrant install my public key in authorized_keys and authenticate with vagrant/vagrant until this has happened? Or is there a better way?
Found something that works
Based on this: "Vagrant insecure by default?", where we have:
config.ssh.private_key_path = ["#{ENV['HOME']}/.ssh/id_rsa", \
"#{ENV['HOME']}/.vagrant.d/insecure_private_key"]
This seems to have the effect that Vagrant tries the keys in turn until it finds one that works (the example enumerates host file system paths too, which is very nice indeed).
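To close the chicken-and-egg loop, the bootstrap step itself can install the public key. A minimal sketch, assuming you copy your public key into the project directory as id_rsa.pub so it is visible inside the guest under the default /vagrant synced folder:
config.vm.provision :shell, inline: <<-SHELL
  # Append the host's public key (shared into the guest at /vagrant) to authorized_keys
  mkdir -p /home/vagrant/.ssh
  cat /vagrant/id_rsa.pub >> /home/vagrant/.ssh/authorized_keys
  chown -R vagrant:vagrant /home/vagrant/.ssh
  chmod 600 /home/vagrant/.ssh/authorized_keys
SHELL
With the two-element private_key_path above, Vagrant can still authenticate with the insecure key on first boot, and your own key takes over on subsequent runs.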

How to use ssh agent forwarding with "vagrant ssh"?

Rather than create a new SSH key pair on a vagrant box, I would like to re-use the key pair I have on my host machine, using agent forwarding. I've tried setting config.ssh.forward_agent to true in the Vagrantfile, then rebooted the VM and tried using:
vagrant ssh -- -A
...but I'm still getting prompted for a password when I try to do a git checkout. Any idea what I'm missing?
I'm using vagrant 2 on OS X Mountain Lion.
Vagrant.configure("2") do |config|
config.ssh.private_key_path = "~/.ssh/id_rsa"
config.ssh.forward_agent = true
end
config.ssh.private_key_path is your local private key
Your private key must be available to the local ssh-agent. You can check with ssh-add -L; if it's not listed, add it with ssh-add ~/.ssh/id_rsa.
Don't forget to add your public key to ~/.ssh/authorized_keys on the Vagrant VM. You can do it by copy-and-pasting or by using a tool like ssh-copy-id.
Add it to the Vagrantfile:
Vagrant::Config.run do |config|
  # stuff
  config.ssh.forward_agent = true
end
See the docs
In addition to adding config.ssh.forward_agent = true to the Vagrantfile, make sure the host computer is set up for agent forwarding. GitHub provides a good guide for this (check out the troubleshooting section).
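On the host side, that mostly amounts to having ssh-agent running with your key loaded (ssh-add -L to check, as noted above). If you also SSH to machines directly rather than through vagrant ssh, forwarding can be enabled per host in ~/.ssh/config, along these lines (the host name is just an example):
Host example.com
  ForwardAgent yes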
I had this working with the above replies on 1.4.3, but it stopped working on 1.5. I now have to run ssh-add for everything to work fully with 1.5.
For now I add the following line to my ansible provisioning script.
- name: Make sure ssh keys are passed to guest.
  local_action: command ssh-add
I've also created a gist of my setup: https://gist.github.com/KyleJamesWalker/9538912
If you are on Windows, SSH Forwarding in Vagrant does not work properly by default (because of a bug in net-ssh). See this particular Vagrant bug report: https://github.com/mitchellh/vagrant/issues/1735
However, there is a workaround! Simply auto-copy your local SSH key to the Vagrant VM via a simple provisioning script in your Vagrantfile. Here's an example:
https://github.com/mitchellh/vagrant/issues/1735#issuecomment-25640783
When we recently tried out the vagrant-aws plugin with Vagrant 1.1.5, we ran into an issue with SSH agent forwarding. It turned out that Vagrant was forcing IdentitiesOnly=yes without an option to change it to no. This forced Vagrant to only look at the private key we listed in the Vagrantfile for the AWS provider.
I wrote up our experiences in a blog post. It may turn into a pull request at some point.
Make sure that the VM does not launch its own SSH agent. I had this line in my ~/.profile:
eval `ssh-agent`
After removing it, SSH agent forwarding worked.
The real problem is that Vagrant uses 127.0.0.1:2222 as the default port forward.
You can add another one (not 2222; 2222 is already occupied by default):
config.vm.network "forwarded_port", guest: 22, host: 2333, host_ip: "0.0.0.0"
"0.0.0.0" means the forward accepts requests from external connections.
Then
ssh -p 2333 vagrant@192.168.2.101 (change to your own host IP address)
will work just fine.
Don't thank me, just call me Leifeng!
On Windows, the problem is that Vagrant doesn't know how to communicate with git-bash's ssh-agent. It does, however, know how to use PuTTY's Pageant. So, as long as Pageant is running and has loaded your SSH key, and as long as you've set config.ssh.forward_agent, this should work.
See this comment for details.
If you use Pageant, then the workaround of updating the Vagrantfile to copy SSH keys on Windows is no longer necessary.