How to use ssh agent forwarding with "vagrant ssh"? - ssh

Rather than create a new SSH key pair on a vagrant box, I would like to re-use the key pair I have on my host machine, using agent forwarding. I've tried setting config.ssh.forward_agent to TRUE in the Vagrantfile, then rebooted the VM, and tried using:
vagrant ssh -- -A
...but I'm still getting prompted for a password when I try to do a git checkout. Any idea what I'm missing?

I'm using Vagrant 2 on OS X Mountain Lion.

Vagrant.configure("2") do |config|
  config.ssh.private_key_path = "~/.ssh/id_rsa"
  config.ssh.forward_agent = true
end
config.ssh.private_key_path is your local private key.
Your private key must be available to the local ssh-agent. You can check with ssh-add -L; if it's not listed, add it with ssh-add ~/.ssh/id_rsa.
Don't forget to add your public key to ~/.ssh/authorized_keys on the Vagrant VM. You can do it by copy-and-pasting or by using a tool like ssh-copy-id.
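As a quick sanity check, here is a minimal host-side sketch of the steps above (assuming your key pair is ~/.ssh/id_rsa):
# On the host: make sure an agent is running and holds your key.
eval "$(ssh-agent -s)"     # start an agent if one is not already running
ssh-add -L                 # list the keys the agent currently holds
ssh-add ~/.ssh/id_rsa      # load the key if it was not listed
# Apply the Vagrantfile change and connect with forwarding enabled.
vagrant reload
vagrant ssh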

Add it to the Vagrantfile
Vagrant::Config.run do |config|
  # stuff
  config.ssh.forward_agent = true
end
See the docs

In addition to adding config.ssh.forward_agent = true to the Vagrantfile, make sure the host computer is set up for agent forwarding. GitHub provides a good guide for this (check out the troubleshooting section).
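For example, a quick end-to-end check of the forwarding (assuming GitHub is the remote you care about) could look like this:
# Inside the guest (vagrant ssh), confirm the forwarded agent is usable.
echo "$SSH_AUTH_SOCK"     # should print a socket path, not an empty line
ssh-add -L                # should list the key loaded on the host
ssh -T git@github.com     # should greet you without asking for a password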

I had this working with the above replies on 1.4.3, but it stopped working on 1.5. I now have to run ssh-add for it to work fully with 1.5.
For now I add the following task to my Ansible provisioning script:
- name: Make sure ssh keys are passed to guest.
  local_action: command ssh-add
I've also created a gist of my setup: https://gist.github.com/KyleJamesWalker/9538912

If you are on Windows, SSH Forwarding in Vagrant does not work properly by default (because of a bug in net-ssh). See this particular Vagrant bug report: https://github.com/mitchellh/vagrant/issues/1735
However, there is a workaround! Simply auto-copy your local SSH key to the Vagrant VM via a simple provisioning script in your Vagrantfile. Here's an example:
https://github.com/mitchellh/vagrant/issues/1735#issuecomment-25640783
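The linked script isn't reproduced here; as a rough illustration only, a provisioning block of that kind might look like the sketch below (it assumes your public key lives at ~/.ssh/id_rsa.pub on the host):
# Sketch: append the host's public key to the guest's authorized_keys.
Vagrant.configure("2") do |config|
  ssh_pub_key = File.read(File.expand_path("~/.ssh/id_rsa.pub")).strip
  config.vm.provision "shell", inline: <<-SHELL
    mkdir -p /home/vagrant/.ssh
    grep -qF "#{ssh_pub_key}" /home/vagrant/.ssh/authorized_keys 2>/dev/null || echo "#{ssh_pub_key}" >> /home/vagrant/.ssh/authorized_keys
    chmod 700 /home/vagrant/.ssh && chmod 600 /home/vagrant/.ssh/authorized_keys
    chown -R vagrant:vagrant /home/vagrant/.ssh
  SHELL
end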

When we recently tried out the vagrant-aws plugin with Vagrant 1.1.5, we ran into an issue with SSH agent forwarding. It turned out that Vagrant was forcing IdentitiesOnly=yes without an option to change it to no. This forced Vagrant to only look at the private key we listed in the Vagrantfile for the AWS provider.
I wrote up our experiences in a blog post. It may turn into a pull request at some point.

Make sure that the VM does not launch its own SSH agent. I had this line in my ~/.profile
eval `ssh-agent`
After removing it, SSH agent forwarding worked.

The real problem is that Vagrant uses 127.0.0.1:2222 as the default port forward.
You can add another one (not 2222, since 2222 is already occupied by default):
config.vm.network "forwarded_port", guest: 22, host: 2333, host_ip: "0.0.0.0"
"0.0.0.0" means the forward also accepts requests from external connections.
Then
ssh -p 2333 vagrant@192.168.2.101 (change to your own host IP address)
will work just fine.
Do thank me. Just call me Leifeng!

On Windows, the problem is that Vagrant doesn't know how to communicate with git-bash's ssh-agent. It does, however, know how to use PuTTY's Pageant. So, as long as Pageant is running and has loaded your SSH key, and as long as you've set config.ssh.forward_agent, this should work.
See this comment for details.
If you use Pageant, then the workaround of updating the Vagrantfile to copy SSH keys on Windows is no longer necessary.

Related

Vagrant ssh forward_agent without entering vagrant user password repeatedly

I currently use Vagrant and Chef to provision a VM and setup my PHP based project. This includes running composer install which essentially does a git clone of a number of private repositories.
After setting up ssh agent forwarding as outlined in the docs and the answers here: How to use ssh agent forwarding with "vagrant ssh"? I have successfully got it working.
The problem I'm having is that whenever I boot, provision, or SSH into a VM, I'm now asked for vagrant's default password. See the examples below:
==> web: Waiting for machine to boot. This may take a few minutes...
web: SSH address: 192.168.77.185:22
web: SSH username: vagrant
web: SSH auth method: private key
Text will be echoed in the clear. Please install the HighLine or Termios libraries to suppress echoed text.
vagrant@192.168.77.185's password:
Example 2
➜ vagrant git:(master) ✗ vagrant ssh
vagrant@192.168.77.185's password:
This is pretty inconvenient, as I work across a number of projects, destroying and creating some of them several times a day (Chef Test Kitchen). Is there any way to automatically use my public key as well, so I don't need to continually enter a password?
I ran into a similar issue recently after creating a new Vagrant box from scratch. The problem turned out to be old entries in ~/.ssh/known_hosts (on OS X).
Try the following (assumes OS X or linux):
ssh into your Vagrant machine
type ip addr or ifconfig or the like (depending on your OS)
take note of the IP addresses listed, including 127.0.0.1
on your host machine, run ssh-keygen -R {vm-ip-address} for each of the addresses from step 3 (make sure to include 127.0.0.1 and [127.0.0.1])
confirm the relevant entries have been removed from ~/.ssh/known_hosts
vagrant reload
vagrant ssh
Alternatively, you can just delete/move/rename the ~/.ssh/known_hosts file, though this will require reconfirming authenticity again for multiple machines you've already ssh'd to.
I hope that helps.
Reference: http://www.geekride.com/ssh-warning-remote-host-identification-has-changed/
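Condensed into commands, the cleanup might look like this (a sketch, assuming the default forwarded port 2222 and an example private-network address of 192.168.77.185):
# Remove stale host keys for the addresses the VM answers on.
ssh-keygen -R 127.0.0.1
ssh-keygen -R "[127.0.0.1]:2222"
ssh-keygen -R 192.168.77.185    # replace with the addresses noted in step 3
vagrant reload && vagrant ssh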

Vagrant ssh connect to host 127.0.0.1:2222 port 22: Bad file number

Whenever I try to connect to my local Vagrant, I get this error when I run ssh vagrant@127.0.0.1:2222 from the Windows git bash:
ssh: connect to host 127.0.0.1:2222 port 22: Bad file number
It was working previously, so I'm not sure what could have caused this. When I try to do an SFTP connection in PHPStorm 8, I get this error:
Connection to '127.0.0.1' failed.
SSH_MSG_DISCONNECT: 2 Too many authentication failures for vagrant
I've tried vagrant destroy with vagrant box remove laravel/homestead and then recreating the box from a backup I had that previously worked using vagrant box add laravel/homestead homestead.box but I still get the same errors.
I'm on Windows 7.
What can I do to get access to my vagrant box commandline again?
Try command:
ssh -p 2222 vagrant@127.0.0.1
The answer by outboundexplorer above is the correct one, I believe. Here is my step-by-step approach on how I did this:
Step 1: Find out exactly what SSH settings to use
Ensure the vagrant box is running (you've done vagrant up that is)
From the command line, go to your project directory (the one where the Vagrantfile is located) and run vagrant ssh-config.
You'll get an output like this:
Host default
HostName 127.0.0.1
User ubuntu
Port 2222
UserKnownHostsFile /dev/null
StrictHostKeyChecking no
PasswordAuthentication no
IdentityFile C:/Projects/my-test-project/.vagrant/machines/default/virtualbox/private_key
IdentitiesOnly yes
LogLevel FATAL
Step 2: Setting up PHPStorm to SFTP to the Vagrant box
Based on the config settings shown above, I set up the following SFTP remote deployment server:
SFTP host: 127.0.0.1
Port: 2222
Root path: /home/ubuntu/my-test-project (this is the folder inside the Vagrant box where the files will be uploaded to, change to whatever suits your needs)
User name: ubuntu
Auth type: Select "Key pair (OpenSSH or PuTTY)"
Private key file: Point to the IdentityFile path shown (C:/Projects/....)
... and that was it.
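If the SFTP connection still fails, it can help to test the same settings from a plain terminal first (a sketch, using the example path and user from the ssh-config output above):
ssh -p 2222 -i C:/Projects/my-test-project/.vagrant/machines/default/virtualbox/private_key ubuntu@127.0.0.1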
I got this same failure when using PhpStorm to SSH into the VirtualBox guest machine that I had set up with Vagrant. Everything worked fine before I upgraded to Windows 10. After upgrading, first of all I had to upgrade to the latest VirtualBox and Vagrant versions to get everything working on Windows 10.
But then I couldn't ssh into the guest machine using the PhpStorm ssh client. After much reading, everything seemed to suggest that I had too many ssh keys installed on my Windows machine, but checking regedit showed that I only had a couple of keys, which should be less than the suggested maximum of 5 keys (the default). In the end I ran vagrant ssh, which didn't let me ssh into the guest machine, but it did reconfirm the ssh details for me. I then realized that after all the new installs it didn't want me to use the C:\Users\Andy\.vagrant.d\insecure_private_key key, but instead a key that it had placed within the project itself at C:/Users/Andy/CodeLab5/vagrant/.vagrant/machines/default/virtualbox/private_key.
Everything is working as it should again now :)
Make sure your Vagrant box is up and running with vagrant up,
and then do vagrant ssh. It will connect to the Vagrant box on localhost.

Can't ssh to vagrant VMs using the insecure private key (vagrant 1.7.2)

I have a cluster of 3 VMs. Here is the Vagrantfile:
# -*- mode: ruby -*-
# vi: set ft=ruby :
hosts = {
  "host0" => "192.168.33.10",
  "host1" => "192.168.33.11",
  "host2" => "192.168.33.12"
}

Vagrant.configure("2") do |config|
  config.vm.box = "precise64"
  config.vm.box_url = "http://files.vagrantup.com/precise64.box"
  config.ssh.private_key_path = File.expand_path('~/.vagrant.d/insecure_private_key')

  hosts.each do |name, ip|
    config.vm.define name do |machine|
      machine.vm.hostname = "%s.example.org" % name
      machine.vm.network :private_network, ip: ip
      machine.vm.provider "virtualbox" do |v|
        v.name = name
        # v.customize ["modifyvm", :id, "--memory", 200]
      end
    end
  end
end
This used to work until I upgraded recently:
ssh -i ~/.vagrant.d/insecure_private_key vagrant@192.168.33.10
Instead, vagrant asks for a password.
It seems that recent versions of vagrant (I'm on 1.7.2) create a secure private key for each machine. I discovered it by running
vagrant ssh-config
The output shows different keys for each host. I verified the keys are different by diffing them.
I tried to force the insecure key by setting config.ssh.private_key_path in the Vagrantfile, but it doesn't work.
The reason I want to use the insecure key for all machines is that I want to provision them from the outside using ansible. I don't want to use the Ansible provisioner, but treat the VMs as remote servers. So, the Vagrantfile is just used to specify the machines in the cluster and then provisioning will be done externally.
The documentation still says that by default machines will use the insecure private key.
How can I make my VMs use the insecure private key?
Vagrant changed this behaviour between versions 1.6 and 1.7 and now inserts an auto-generated key pair instead of the default insecure one.
You can cancel this behaviour by setting config.ssh.insert_key = false in your Vagrantfile.
Vagrant shouldn't replace the insecure key if you specify private_key_path like you did; however, the internal logic checks whether private_key_path points to the default insecure_private_key, and if it does, Vagrant will replace it.
More info can be found here.
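A minimal Vagrantfile sketch of that setting, assuming you want every machine in the cluster to keep using the shared insecure key:
Vagrant.configure("2") do |config|
  # Keep the classic shared key instead of per-machine generated keys.
  config.ssh.insert_key = false
  config.ssh.private_key_path = File.expand_path("~/.vagrant.d/insecure_private_key")
end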
When Vagrant creates a new ssh key, it's saved with the default configuration below the Vagrantfile directory at .vagrant/machines/default/virtualbox/private_key.
Using the auto-generated key, you can log in from the same directory as the Vagrantfile like this:
ssh -i .vagrant/machines/default/virtualbox/private_key -p 2222 vagrant@localhost
To learn about all details about the actual ssh configuration of a vagrant box use the vagrant ssh-config command.
# vagrant ssh-config
Host default
HostName 127.0.0.1
User vagrant
Port 2222
UserKnownHostsFile /dev/null
StrictHostKeyChecking no
PasswordAuthentication no
IdentityFile /Users/babo/src/centos/.vagrant/machines/default/virtualbox/private_key
IdentitiesOnly yes
LogLevel FATAL
Adding config.ssh.insert_key = false to the Vagrantfile and removing the new VM private key .vagrant/machines/default/virtualbox/private_key makes vagrant ssh-config automatically pick up the correct private key, ~/.vagrant.d/insecure_private_key, again. The last thing I had to do was ssh into the VM and update its authorized_keys file: curl https://raw.githubusercontent.com/mitchellh/vagrant/master/keys/vagrant.pub > ~/.ssh/authorized_keys
tldr;
ssh vagrant@127.0.0.1 -p 2222 -i ~/www/vw/vw-environment/.vagrant/machines/default/virtualbox/private_key
I couldn't get this to work, so in the end I added the following to the ssh.rb Ruby script (/opt/vagrant/embedded/gems/gems/vagrant-1.7.1/lib/vagrant/util/ssh.rb)
print(*command_options)
just before this line that executes the ssh call
SafeExec.exec("ssh", *command_options)
That prints out all the command options passed to the ssh call; from there you can work out something that works for you, based on what Vagrant calculates to be the correct ssh parameters.
If you are specifically using Ansible (not the Vagrant Ansible provisioner), you might want to consider using the vagrant dynamic inventory script from Ansible's repo:
https://github.com/ansible/ansible/blob/devel/contrib/inventory/vagrant.py
Alternatively, you can handcraft your own script and dynamically build your own Vagrant inventory file:
SYSTEMS=$(vagrant status | grep running | cut -d ' ' -f1)
echo '[vagrant_systems]' > vagrant.ini
for SYSTEM in ${SYSTEMS}; do
  SSHCONFIG=$(vagrant ssh-config ${SYSTEM})
  IDENTITY_FILE=$(echo "${SSHCONFIG}" | grep -o "\/.*${SYSTEM}.*")
  PORT=$(echo "${SSHCONFIG}" | grep -oE '[0-9]{4,5}')
  echo "${SYSTEM} ansible_ssh_host=127.0.0.1 ansible_ssh_port=${PORT} ansible_ssh_private_key_file=${IDENTITY_FILE}" >> vagrant.ini
done
Then use ansible-playbook -i vagrant.ini
If you try to use the ~/.ssh/config, you'll have to dynamically create or edit existing entries, as the ssh ports can change (due to the collision detection in Vagrant).
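Putting the snippet above to use might look like this (the script name and playbook are placeholders):
# Rebuild the inventory once the boxes are up, then run a playbook
# limited to the generated group.
./build_vagrant_inventory.sh          # the loop above, saved as a script
ansible-playbook -i vagrant.ini -l vagrant_systems site.yml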

Vagrant ssh authentication failure

The problem with ssh authentication:
==> default: Clearing any previously set forwarded ports...
==> default: Clearing any previously set network interfaces...
==> default: Preparing network interfaces based on configuration...
default: Adapter 1: nat
default: Adapter 2: bridged
==> default: Forwarding ports...
default: 22 => 2222 (adapter 1)
==> default: Booting VM...
==> default: Waiting for machine to boot. This may take a few minutes...
default: SSH address: 127.0.0.1:2222
default: SSH username: vagrant
default: SSH auth method: private key
default: Error: Connection timeout. Retrying...
default: Error: Connection timeout. Retrying...
default: Error: Connection timeout. Retrying...
default: Error: Connection timeout. Retrying...
default: Error: Authentication failure. Retrying...
default: Error: Authentication failure. Retrying...
default: Error: Authentication failure. Retrying...
default: Error: Authentication failure. Retrying...
default: Error: Authentication failure. Retrying...
I can Ctrl+C out of the authentication loop and then successfully ssh in manually.
I performed the following steps on the guest box:
Enabled Remote Login for All Users.
Created the ~/.ssh directory with 0700 permissions.
Created the ~/.ssh/authorized_keys file with 0600 permissions.
Pasted this public key
into ~/.ssh/authorized_keys
I've also tried using a private (hostonly) network instead of the public (bridged) network, using this line in the Vagrantfile:
config.vm.network "private_network", ip: "172.16.177.7"
I get the same output (except Adapter 2: hostonly) but then cannot ssh in manually.
I also tried config.vm.network "private_network", ip: "10.0.0.100".
I also tried setting config.ssh.password in the Vagrantfile. This does output SSH auth method: password but still doesn't authenticate.
And I also tried rebuilding the box and rechecking all the above.
It looks like others have had success with this configuration, so there must be something I'm doing wrong.
I found this thread and enabled the GUI, but that doesn't help.
For general information: by default you can simply connect over ssh with
user: vagrant, password: vagrant
https://www.vagrantup.com/docs/boxes/base.html#quot-vagrant-quot-user
First, check which insecure_private_key is in your machine's config:
$ vagrant ssh-config
Example:
Host default
HostName 127.0.0.1
User vagrant
Port 2222
UserKnownHostsFile /dev/null
StrictHostKeyChecking no
PasswordAuthentication no
IdentityFile C:/Users/konst/.vagrant.d/insecure_private_key
IdentitiesOnly yes
LogLevel FATAL
http://docs.vagrantup.com/v2/cli/ssh_config.html
Second, do one of the following:
Replace the contents of the insecure_private_key file with the contents of your personal private key.
Or add it to the Vagrantfile:
Vagrant.configure("2") do |config|
  config.ssh.private_key_path = "~/.ssh/id_rsa"
  config.ssh.forward_agent = true
end
config.ssh.private_key_path is your local private key.
Your private key must be available to the local ssh-agent. You can check with ssh-add -L. If it's not listed, add it with ssh-add ~/.ssh/id_rsa.
Don't forget to add your public key to ~/.ssh/authorized_keys on the Vagrant VM. You can do it by copy-and-pasting or using a tool like ssh-copy-id (user: root, password: vagrant, port: 2222): ssh-copy-id '-p 2222 root@127.0.0.1'
If it still does not work, try this:
Remove the insecure_private_key file from c:\Users\USERNAME\.vagrant.d\insecure_private_key
Run vagrant up (Vagrant will generate a new insecure_private_key file)
In other cases, it is helpful to just set forward_agent in the Vagrantfile:
Vagrant::Config.run do |config|
  config.ssh.forward_agent = true
end
Useful:
Git can be set up with git-scm.com.
After installing it and creating a personal key pair, your public key will be in your profile path: c:\users\USERNAME\.ssh\id_rsa.pub
PS: Finally, I suggest you look at Ubuntu on Windows 10.
None of the above worked for me. Somehow the box had the wrong public key added to the vagrant user's authorized_keys file.
If you can still ssh into the box with the vagrant password (the password is vagrant), i.e.
ssh vagrant@localhost -p 2222
then copy the public key content from https://raw.githubusercontent.com/mitchellh/vagrant/master/keys/vagrant.pub into the authorized_keys file with the following command:
echo "ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA6NF8iallvQVp22WDkTkyrtvp9eWW6A8YVr+kz4TjGYe7gHzIw+niNltGEFHzD8+v1I2YJ6oXevct1YeS0o9HZyN1Q9qgCgzUFtdOKLv6IedplqoPkcmF0aYet2PkEDo3MlTBckFXPITAMzF8dJSIFo9D8HfdOV0IAdx4O7PtixWKn5y2hMNG0zQPyUecp4pzC6kivAIhyfHilFR61RGL+GPXQ2MWZWFYbAGjyiYJnAmCP3NOTd0jMZEnDkbUvxhMmBYSdETk1rRgm+R4LOzFUGaHqHDLKLX+FIPKcF96hrucXzcWyLbIbEgE98OHlnVYCzRdK8jlqm8tehUc9c9WhQ== vagrant insecure public key" > .ssh/authorized_keys
When done exit the VM and try vagrant ssh again. It should work now.
If you experience this issue on vagrant 1.8.5, then check out this thread on github:
https://github.com/mitchellh/vagrant/issues/7610
It's basically caused by a permission issue; the workaround is just:
vagrant ssh
password: vagrant
chmod 0600 ~/.ssh/authorized_keys
exit
then
vagrant reload
FYI: this issue only affects CentOS, Ubuntu works fine.
Run the following commands in guest machine/VM:
wget https://raw.githubusercontent.com/mitchellh/vagrant/master/keys/vagrant.pub -O ~/.ssh/authorized_keys
chmod 700 ~/.ssh
chmod 600 ~/.ssh/authorized_keys
chown -R vagrant:vagrant ~/.ssh
Then do vagrant halt. This will remove and regenerate your private keys.
(These steps assume you have already created, or already have, the ~/.ssh/ directory and the ~/.ssh/authorized_keys file under your home folder.)
In my experience, this has been a surprisingly frequent problem with new vagrant machines. By far the easiest way to solve it, instead of altering the configuration itself, has been creating the required ssh keys manually on the client, then using the private key on the host.
Log in to vagrant machine: vagrant ssh, use default password vagrant.
Create ssh keys: for example, ssh-keygen -t rsa -b 4096 -C "vagrant" (as advised by GitHub's relevant guide).
Rename the public key file (by default id_rsa.pub), overriding the old one: mv .ssh/id_rsa.pub .ssh/authorized_keys.
Reload ssh service in case needed: sudo service ssh reload.
Copy the private key file (by default id_rsa) to the host machine: for instance, use a fine combination of cat and the clipboard, cat .ssh/id_rsa, then select and copy (better ways must exist, go invent one!).
Logout from the vagrant machine: logout.
Find the current private key used by Vagrant by looking at its configuration: vagrant ssh-config (look for the IdentityFile line, for instance "/[...]/private_key").
Replace the current private key with the one you created at the host machine: for example, nano /[...]/private_key and paste from the clipboard, if all else fails. (Note, however, that if your private_key is not project specific but shared by multiple vagrant machines, you better configure the path yourself in order to not break other perfectly working machines! Changing the path is as simple as adding a line config.ssh.private_key_path = "path/to/private_key" into the Vagrantfile.) Furthermore, if you are using PuPHPet generated machine, you can store your private key to file puphpet/files/dot/ssh/id_rsa and it will be added to Vagrantfile's ssh config automatically.
Test the setup: vagrant ssh should now work.
Should that be the case, congratulate yourself, logout, run vagrant provision if needed and carry on with the meaningful task at hand.
If you still face problems, it may come handy to add verbose flag to ssh command to ease debugging. You can pass that (or any other option, for that matter) after double dash. For example, typing vagrant ssh -- -v. Feel free to add as many v's as you need, each will give you more information.
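Condensed, the guest-side part of the procedure above might look like this (a sketch; paths are the usual defaults):
# Inside the guest (log in with the password "vagrant"):
ssh-keygen -t rsa -b 4096 -C "vagrant" -N "" -f ~/.ssh/id_rsa
mv ~/.ssh/id_rsa.pub ~/.ssh/authorized_keys
sudo service ssh reload
cat ~/.ssh/id_rsa              # copy this output
# Back on the host: find which key file Vagrant uses and paste the
# copied private key into it.
vagrant ssh-config | grep IdentityFile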
Unable to run vagrant up because it gets stuck and times out? I recently had a "water in laptop" incident and had to migrate to a new one (a Mac, by the way). I successfully got all my projects up and running except the one that was using Vagrant.
$ vagrant up
Bringing machine 'default' up with 'virtualbox' provider...
==> default: Clearing any previously set forwarded ports...
==> default: Clearing any previously set network interfaces...
==> default: Preparing network interfaces based on configuration...
default: Adapter 1: nat
default: Adapter 2: hostonly
==> default: Forwarding ports...
default: 8000 (guest) => 8877 (host) (adapter 1)
default: 8001 (guest) => 8878 (host) (adapter 1)
default: 8080 (guest) => 7777 (host) (adapter 1)
default: 5432 (guest) => 2345 (host) (adapter 1)
default: 5000 (guest) => 8855 (host) (adapter 1)
default: 22 (guest) => 2222 (host) (adapter 1)
==> default: Running 'pre-boot' VM customizations...
==> default: Booting VM...
==> default: Waiting for machine to boot. This may take a few minutes...
default: SSH address: 127.0.0.1:2222
default: SSH username: vagrant
default: SSH auth method: private key
default: Warning: Authentication failure. Retrying...
default: Warning: Authentication failure. Retrying...
default: Warning: Authentication failure. Retrying...
It couldn't authenticate, retried again and again and eventually gave up.
This is how I got it back in shape in 3 steps:
1 - Find the IdentityFile used by Vagrant:
$ vagrant ssh-config
Host default
HostName 127.0.0.1
User vagrant
Port 2222
UserKnownHostsFile /dev/null
StrictHostKeyChecking no
PasswordAuthentication no
IdentityFile /Users/ned/.vagrant.d/insecure_private_key
IdentitiesOnly yes
LogLevel FATAL
2 - Check the public key in the IdentityFile:
$ ssh-keygen -y -f <path-to-insecure_private_key>
It'd output something like this:
ssh-rsa AAAAB3Nyc2EAAA...9gE98OHlnVYCzRdK8jlqm8hQ==
3 - Log in to the Vagrant guest with the password vagrant:
ssh -p 2222 -o UserKnownHostsFile=/dev/null vagrant@127.0.0.1
The authenticity of host '[127.0.0.1]:2222 ([127.0.0.1]:2222)' can't be established.
RSA key fingerprint is dc:48:73:c3:18:e4:9d:34:a2:7d:4b:20:6a:e7:3d:3e.
Are you sure you want to continue connecting (yes/no)? yes
Warning: Permanently added '[127.0.0.1]:2222' (RSA) to the list of known hosts.
vagrant@127.0.0.1's password: vagrant
Welcome to Ubuntu 16.04.1 LTS (GNU/Linux 4.4.0-31-generic x86_64)
...
NOTE: if the Vagrant guest is configured to disallow password authentication, you need to open VirtualBox's GUI, double-click the guest name, log in as vagrant/vagrant, then sudo -s and edit /etc/ssh/sshd_config: look for the PasswordAuthentication no line (usually at the end of the file), replace no with yes, and restart sshd (i.e. systemctl reload sshd or /etc/init.d/sshd restart).
4 - Add the public key to the /home/vagrant/authorized_keys file.
$ echo "ssh-rsa AA2EAAA...9gEdK8jlqm8hQ== vagrant" > /home/vagrant/.ssh/authorized_keys
5 - Exit (CTRL+d) and stop the Vagrant guest and then bring it back up.
IMPORTANT: if you use any provisioning tools (e.g. Ansible), disable them before restarting your guest, as Vagrant will think your guest is not provisioned because the insecure private key is in use. It will reinstall the key and then run your provisioner!
$ vagrant halt
$ vagrant up
Hopefully you will have your arms in the air now...
I got this, with just a minor amendment, from Ned Batchelder's article. Ned, you are a champ!
This can also happen if you're trying to force your VM to use a root user by default for SSH....
For example, a config like so in your Vagrantfile may cause this failure:
config.ssh.username = 'root'
config.ssh.password = 'vagrant'
config.ssh.insert_key = 'true'
Solution: Comment out those lines and try again!
Problem: I was getting ssh authentication errors on a box I provisioned. The original was working OK.
The problem for me was that I was missing a private key in .vagrant/machines/default/virtualbox/private_key. I copied the private key from the same relative location in the original box and voilà!
I have found a way around the mess with the keys on Win 8.2, where I did not succeed with any of the methods mentioned here. It may be interesting that exactly the same combination of VirtualBox, Vagrant, and the box runs on Win 7 Ultimate without any problems.
I switched to the password authentication by adding the following commands in Vagrantfile:
config.ssh.password = "vagrant"
config.ssh.insert_key = false
Note that I'm not sure these are the only changes required, because I had already done the following:
I generated a new RSA key pair and changed authorized_keys file accordingly (all in the virtual machine, see the suggestions above and elsewhere)
I copied the private key to the same directory where Vagrantfile resides and added
config.ssh.private_key_path = "./id_rsa"
But I believe these changes were irrelevant. I spent plenty of time trying, so I did not change the working configuration, for obvious reasons :)
For me, this was resolved by changing the permissions on the .ssh folder in the vagrant home directory (i.e. ~vagrant/.ssh). I think I messed up the permissions when I was setting up ssh keys for my application.
It seems that the authorized_keys file must be 'rw' only for the 'vagrant' user, so chmod 600 authorized_keys; the same goes for the directory itself and its parent:
so:
chmod 600 authorized_keys
chmod 700 .
chmod 700 ..
It was only after I had all these permissions restored that vagrant ssh started to work again.
I think it's something to do with ssh security. It refuses to recognise keys that are in any way accessible beyond the current user, so Vagrant's attempts to log in are rejected.
If you are using the default SSH setup in your Vagrantfile and started seeing SSH authentication errors after re-associating your VM box due to a crash, try replacing the public key in your Vagrant machine.
Vagrant replaces the public key associated with the insecure private key pair at each logout, for security reasons. If you didn't properly shut down your machine, the public/private key pair can go out of sync, causing an SSH authentication error.
To resolve this issue, simply load up the current insecure private key and then copy the matching public key into your VM's authorized_keys file.
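For example, one way to do that (a sketch, assuming the shared key at ~/.vagrant.d/insecure_private_key and the default forwarded port 2222; log in with the password vagrant if key authentication is currently broken):
# Derive the public key from the insecure private key...
ssh-keygen -y -f ~/.vagrant.d/insecure_private_key > vagrant_insecure.pub
# ...and append it to the guest's authorized_keys.
ssh -p 2222 vagrant@127.0.0.1 "cat >> ~/.ssh/authorized_keys" < vagrant_insecure.pub
rm vagrant_insecure.pub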
This might be the last answer in the list, but this worked for me and I did not find it anywhere else; I found it myself after two days of research, so you'd better try this if nothing else has worked for you so far.
In my case the problem came from my VirtualBox. For some reason an option was disabled when it should have been enabled.
There were network problems with my VirtualBox. To fix them, I had to select my machine, open Settings, go to the Network tab, and make sure that the option Cable Connected was checked. In my case it was not checked, and the boot failed at this step:
default: SSH address: 127.0.0.1:2222
default: SSH username: vagrant
default: SSH auth method: private key
At first I thought that the port was already in use; after that I reinstalled Vagrant and also tried other things, but none of them worked for me.
This has happened to me several times, and the way I solved it was:
Check and make sure your Vagrantfile has the correct private key path:
config.ssh.private_key_path = "/home/razvan/.ssh/id_rsa"
Execute the vagrant ssh command in a Linux terminal
On your vagrant machine go to
cd /home/vagrant/.ssh
and check whether the ssh key in the authorized_keys file is the same as the one you have on your local machine in ~/.ssh/id_rsa.pub. If not, replace the one in your Vagrant authorized_keys with the one on your local machine found in ~/.ssh/id_rsa.pub.
Reload Vagrant :
vagrant reload
Hope this helps someone else. Cheers!
1. Locate the private key in the host:
vagrant ssh-config
Output:
Host default
...
Port 2222
...
IdentityFile /home/me/.vagrant.d/[...]/virtualbox/vagrant_private_key
...
2. Store the private key path and the port number in variables:
Use these two commands with the output from above:
pk="/home/me/.vagrant.d/.../virtualbox/vagrant_private_key"
port=2222
3. Generate a public key and upload it to the guest machine:
Copy/pasta, no changes needed:
ssh-keygen -y -f $pk > authorized_keys
scp -P $port authorized_keys vagrant#localhost:~/.ssh/
vagrant ssh -c "chmod 600 ~/.ssh/authorized_keys"
rm authorized_keys
If you are using Windows and this issue comes up unexpectedly, please try the following in your configuration.
config.ssh.username = 'vagrant'
config.ssh.password = 'vagrant'
config.ssh.insert_key = 'true'
This basically uses the default vagrant configuration.
Mac Solution:
Added the local ssh id_rsa key to the Vagrant private key:
vi /Users//.vagrant/machines/default/virtualbox/private_key
/Users//.ssh/id_rsa
Copied the public key /Users//.ssh/id_rsa.pub into the Vagrant box's authorized_keys:
ssh vagrant@localhost -p 2222 (password: vagrant)
ls -la
cd .ssh
chmod 0600 ~/.ssh/authorized_keys
vagrant reload
Problem resolved.
Thanks to
Make sure your first network interface is NAT. The second network interface can be anything you want when you're building the box. Don't forget the vagrant user, as discussed in the Google thread.
Good luck.
I also could not get beyond:
default: SSH auth method: private key
When I used the VirtualBox GUI, it told me there was an OS processor mismatch.
To get vagrant up progressing further, in the BIOS settings I had to counter-intuitively:
Disable: Virtualisation
Enable: VT-X
Try toggling these setting in your BIOS.
First of all, you should remove the auto-generated insecure_private_key file, then regenerate it by typing
vagrant ssh-config
then
vagrant halt
vagrant up
It should work
I resolved the issue in the following manner.
1. Create new SSH key using Git Bash
$ ssh-keygen -t rsa -b 4096 -C "vagrant@localhost"
# Creates a new ssh key, using the provided email as a label
Generating public/private rsa key pair.
When you're prompted to "Enter a file in which to save the key," press Enter. This accepts the default file location.
Enter a file in which to save the key (/Users/[you]/.ssh/id_rsa): [Press enter]
At the prompt, type a secure passphrase. You can leave empty and press enter if you do not need a passphrase.
Enter passphrase (empty for no passphrase): [Press enter]
To connect to your Vagrant VM type following command
ssh vagrant@localhost -p 2222
When you get the following message, type "yes" and press Enter.
The authenticity of host 'github.com (192.30.252.1)' can't be established.
RSA key fingerprint is 16:27:ac:a5:76:28:2d:36:63:1b:56:4d:eb:df:a6:48.
Are you sure you want to continue connecting (yes/no)?
Now, to establish an SSH connection, type: $ vagrant ssh
Copy the host's public key into the authorized_keys file in the Vagrant VM. To do that, go to the Users/[you]/.ssh folder, copy the content of the id_rsa.pub file on the host machine, and paste it into the ~/.ssh/authorized_keys file in the Vagrant VM.
Change the permissions on the .ssh folder and the authorized_keys file in the Vagrant VM.
Restart Vagrant with: $ vagrant reload
Another simple solution: on Windows, go to the file Homestead/Vagrantfile and add these lines to connect with a username/password instead of a private key:
config.ssh.username = "vagrant"
config.ssh.password = "vagrant"
config.ssh.insert_key = false
So, finally, part of the file will look like this:
if File.exists? homesteadYamlPath then
  settings = YAML::load(File.read(homesteadYamlPath))
elsif File.exists? homesteadJsonPath then
  settings = JSON.parse(File.read(homesteadJsonPath))
end
config.ssh.username = "vagrant"
config.ssh.password = "vagrant"
config.ssh.insert_key = false
Homestead.configure(config, settings)
if File.exists? afterScriptPath then
  config.vm.provision "shell", path: afterScriptPath, privileged: false
end
Hope this helps.
Just adding my solution:
rm /Users/myusername/.ssh/config
vagrant ssh-config >> /Users/myusername/.ssh/config
Somewhat similar to other proposed solutions here.
Between all of the responses here, there are lots of good things to try. For completeness, if you
ssh vagrant#localhost -p 2222
as @Bizmate suggests, and it fails, be sure you have
AllowUsers vagrant
in the /etc/ssh/sshd_config of your guest/vagrant machine.
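For instance, a quick check and fix inside the guest might look like this (a sketch; on some distributions the service is called sshd rather than ssh):
# See whether an AllowUsers directive exists and whether it lists vagrant.
sudo grep -n "^AllowUsers" /etc/ssh/sshd_config
# If it exists but omits vagrant, append vagrant to that line: when
# AllowUsers is present, sshd only accepts the users it names.
sudo sed -i '/^AllowUsers/ s/$/ vagrant/' /etc/ssh/sshd_config
sudo service ssh reload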
I am using Vagrant with a Puphpet setup from May 2015 and had this problem. It appears that the configuration that was generated didn't handle Vagrant 1.7.4 (or maybe a bit earlier?) behavior of regenerating ssh keys if it detects an insecure key.
I solved it by adding the following in my Puphpet generated Vagrantfile (local setup) inside the "if File.file?(customKey)" clause:
config.ssh.insert_key = false
Reference commit
These are the correct steps I followed to fix the issue below, which occurred when running vagrant up.
These are the steps that I followed:
Create a folder, e.g. F:\projects
Open this folder in Git Bash and run this command:
ssh-keygen -t rsa -b 4096 -C "your_email@example.com" (put a valid email address)
This generates a key pair in two separate files in the project folder, e.g. project (private key file) and project.pub (public key file).
Go to C:\Users\acer\.vagrant.d and find the file
insecure_private_key
Back up that file, then copy the content of the newly created private key and paste it into the insecure_private_key file. Then copy insecure_private_key and paste it into this location too.
Now run vagrant up in your project location. If the above issue appears, type vagrant ssh and log in with the username and password (by default both are set to vagrant).
Go to cd /home/vagrant/.ssh and type mv authorized_keys authorized_keys_bk
Then type ls -al, and type vi authorized_keys to open the authorized_keys file in the vi editor.
Open the generated public key (project.pub) in Notepad++ and copy its content.
Then press i in Git Bash to enable insert mode in the vi editor, right-click, and paste. Afterwards press Escape to leave insert mode.
Type :wq! to save the file, then type ls -al.
The permissions should then be set as below; no need to change them:
drwx------. 2 vagrant vagrant 4096 Feb 13 15:33 .
drwx------. 4 vagrant vagrant 4096 Feb 13 14:04 ..
-rw-------. 1 vagrant vagrant 743 Feb 13 14:26 authorized_keys
-rw-------. 1 root root 409 Feb 13 13:57 authorized_keys_bk
-rw-------. 1 vagrant vagrant 409 Jan 2 23:09 authorized_keys_originial
Otherwise, type chmod 600 authorized_keys, and also run chown vagrant:vagrant authorized_keys.
Finally, run vagrant halt and vagrant up again.
This works fine for me.
Just for those people that have been idiots like me, or have had something odd happen to their vagrant machine. This error can also occur when you changed the permissions of the vagrant user's home directory (deliberately or by accident).
You can log in instead (as described in other posts) using the password ('vagrant') and then run the following command to fix the permissions.
sudo chown -R vagrant:vagrant /home/vagrant
Then you should be able to log in again without entering the password.
TL;DR: The permissions on your vagrant home folder are wrong.
Simple:
homestead destroy
homestead up
Edit (Not as simple as first thought):
The issue was that new versions of Homestead use PHP 7.0 and some other stuff. To avoid this mess, make sure you set the version in Homestead.yml:
version: "0"
I solved this problem by running the following commands in the Windows 7 CMD, as given in the last post on this thread:
https://github.com/mitchellh/vagrant/issues/6744
Some commands that will reinitialize various network states:
Reset WINSOCK entries to installation defaults : netsh winsock reset catalog
Reset TCP/IP stack to installation defaults : netsh int ip reset reset.log
Flush DNS resolver cache : ipconfig /flushdns
Renew DNS client registration and refresh DHCP leases : ipconfig /registerdns
Flush routing table : route /f
Been beating my head on this for the last couple of days on a repackaged base box. (Mac OS X, El Capitan)
Following @Radek's procedure, I did vagrant ssh-config on the source box and got:
...
/Users/Shared/dev/<source-box-name>/.vagrant/machines/default/virtualbox/private_key
...
On the new copy, that command gave me:
...
IdentityFile /Users/<username>/.vagrant.d/insecure_private_key
...
So, I just added this line in the new copy:
...
config.ssh.private_key_path = "/Users/Shared/dev/<source-box-name>/.vagrant/machines/default/virtualbox/private_key"
...
Not perfect, but I can get on with my life.
Not sure your case is the same as mine though.
In my case vagrant ssh failed in key authentication and asked for password.
I found my old setting below in my ~/.ssh/config (at the top of the file).
PubkeyAcceptedKeyTypes ssh-dss,ssh-rsa
After removing this, key authentication started working. No more password asked.

How to forward local keypair in a SSH session?

I manually deploy websites through SSH; I manage source code on GitHub/Bitbucket. For every new site I'm currently generating a new keypair on the server and adding it to GitHub/Bitbucket, so that I can pull changes on the server.
I came across a feature in Capistrano that uses the local machine's key pair for pulling updates to the server: ssh_options[:forward_agent] = true
How can I do something like this and forward my local machine's keypair to the server I'm SSH-ing into, so that I can avoid adding keys to GitHub/Bitbucket for every new site?
This turned out to be very simple; the complete guide is here: Using SSH Forwarding
In essence, you need to create a ~/.ssh/config file, if it doesn't exist.
Then, add the hosts (either a domain name or an IP address) to the file and set ForwardAgent yes.
Sample Code:
Host example.com
  ForwardAgent yes
Makes SSH life a lot easier.
Create ~/.ssh/config
Fill it with (host address is the address of the host you want to allow creds to be forwarded to):
Host [host address]
  ForwardAgent yes
If you haven't already run ssh-agent, run it:
ssh-agent
Take the output from that command and paste it into the terminal. This will set the environment variables that need to be set for agent forwarding to work. Optionally, you can replace this and step 3 with:
eval "$(ssh-agent)"
Add the key you want forwarded to the ssh agent:
ssh-add [path to key if there is one]/[key_name].pem
Log into the remote host:
ssh -A [user]@[hostname]
From here, if you log into another host that accepts that key, it will just work:
ssh [user]@[hostname]
To use it simply with the default identity (id_rsa), you can use the following couple of commands:
ssh-add
ssh -A [username]@[server-address]
The configuration file is very helpful, but the trick for agent forwarding is really the ssh-add command. It seems that this has to be triggered once before any remote connections, and again after a restart of the computer. To permanently add the key, try the following solution from the user daminetreg:
Add private key permanently with ssh-add on Ubuntu
It is very useful:
ssh -i [private-key] -A [user]@[host]
You can set this up as one command in bash_aliases or other command routines.
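For example, a hypothetical ~/.bash_aliases entry (the alias name, key path, and host are placeholders):
# Load the default identity into the agent, then connect with forwarding.
alias sshfwd='ssh-add ~/.ssh/id_rsa && ssh -A deploy@example.com'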