Vagrant ssh forward_agent without entering vagrant user password repeatedly - ssh

I currently use Vagrant and Chef to provision a VM and set up my PHP-based project. This includes running composer install, which essentially does a git clone of a number of private repositories.
After setting up ssh agent forwarding as outlined in the docs and the answers here: How to use ssh agent forwarding with "vagrant ssh"? I have successfully got it working.
The problem I'm having is that whenever I boot a VM, provision a VM, or SSH into a VM, I'm now asked for the vagrant user's default password; see the examples below:
==> web: Waiting for machine to boot. This may take a few minutes...
web: SSH address: 192.168.77.185:22
web: SSH username: vagrant
web: SSH auth method: private key
Text will be echoed in the clear. Please install the HighLine or Termios libraries to suppress echoed text.
vagrant@192.168.77.185's password:
Example 2
➜ vagrant git:(master) ✗ vagrant ssh
vagrant@192.168.77.185's password:
This is pretty inconvenient, as I work across a number of projects, destroying and recreating some of them several times a day (Chef Test Kitchen). Is there any way to automatically use my public key as well, so I don't need to continually enter a password?

I ran into a similar issue recently after creating a new Vagrant box from scratch. The problem turned out to be old entries in ~/.ssh/known_hosts (on OS X).
Try the following (assumes OS X or Linux):
ssh into your Vagrant machine
type ip addr or ifconfig or the like (depending on your OS)
take note of the IP addresses listed, including 127.0.0.1
on your host machine, run ssh-keygen -R {vm-ip-address} for each address from step 3 (make sure to include the 127.0.0.1 and [127.0.0.1] variants)
confirm the relevant entries have been removed from ~/.ssh/known_hosts
vagrant reload
vagrant ssh
Alternatively, you can just delete/move/rename the ~/.ssh/known_hosts file, though this will require reconfirming the authenticity of every machine you've already ssh'd to.
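Concretely, steps 4 and 5 might look like this on the host; the IP below is the example address from the question, so substitute the ones you noted in step 3:
# remove any stale host keys recorded for the VM
ssh-keygen -R 192.168.77.185
ssh-keygen -R 127.0.0.1
ssh-keygen -R '[127.0.0.1]:2222'
# confirm nothing is left behind
grep 192.168.77.185 ~/.ssh/known_hosts || echo "no stale entries left"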
I hope that helps.
Reference: http://www.geekride.com/ssh-warning-remote-host-identification-has-changed/

Related

SSH to Github not working

SSH has been working fine for the last few weeks since I got my new PC. I've had no problems but today I started getting:
ssh: connect to host github.com port 22: resource temporarily unavailable
I did some googling and found that there is a common issue with WSL which sometimes causes this, but I'm unable to SSH from my bash shell, or from cmd/powershell.
This is the part that confuses me: if I do ssh -T git@192.30.253.113 I am prompted for the password to my key, it successfully authenticates and responds with "Hi alexmk92! You've successfully authenticated".
Great, that at least proves that my firewall isn't blocking SSH on port 22. But why does git@github.com throw the "resource temporarily unavailable" error? My initial thought was that this could be a DNS problem.
So I tried to configure my network adapter to use Google's DNS servers (8.8.8.8 and 8.8.4.4); I even configured the IPv6 DNS servers just in case. Following this I did an ipconfig /flushdns, attempted to connect via git@github.com again, and BAM, the same result; however, git@192.30.253.113 still works.
I'm guessing another potential cause is that github.com is behind a load balancer and one of the IPs in the cluster could be blacklisted somewhere on my machine? I'm just pulling guesses out of thin air now; any help would be greatly appreciated, this is driving me insane.
After some further Googling it turned out that my machine did not have a hosts entry for github.com and it was unable to automatically resolve it.
In Windows Subsystem for Linux I created an ssh config file:
touch ~/.ssh/config
(for some reason the base Ubuntu 18.04 distro from the Windows store didn't have one). I then had to make sure the file permissions were correct:
chmod 755 ~/.ssh/config
Once the file was created, I edited it with
sudo nano ~/.ssh/config
and added github.com as a Host.
Host github.com
    Hostname ssh.github.com
    Port 22
Upon saving, I ran
sudo /etc/init.d/ssh restart
and attempted
ssh -T git@github.com
Everything now seems to be working.
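If you want to confirm that ssh is actually picking up the new Host stanza, OpenSSH (6.8 and later) can print the configuration it resolves for a given host; a quick check might look like this:
# show the effective hostname and port ssh will use for github.com
ssh -G github.com | grep -iE '^(hostname|port)'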
In my case my ISP did not allow SSH, so it was not working from both cmd and WSL. I got around it using a VPN.
To have a successful SSH connection to GitHub, your SSH key has to be imported into GitHub:
Open Git Bash or a terminal
Run the command ssh-keygen
Accept all the default options
A private and a public key get generated in the folder <user_home>/.ssh/
Log in to Github.com
Navigate to account settings
Choose "SSH and GPG keys" from the side navigation bar
Click "New SSH key"
Copy in the public key content from <user_home>/.ssh/id_rsa.pub and save
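Put together, the whole flow is roughly the following (assuming the default id_rsa file name; the ssh -T call is GitHub's standard connectivity test):
ssh-keygen                # accept the defaults at every prompt
cat ~/.ssh/id_rsa.pub     # copy this output into GitHub -> Settings -> SSH and GPG keys -> New SSH key
ssh -T git@github.com     # should reply: "Hi <username>! You've successfully authenticated..."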

Is it possible to add an ssh key to the agent for a private repo in an ansible playbook?

I am using Ansible to provision a Vagrant environment. As part of the provisioning process, I need to connect from the currently-provisioning VM to a private external repository using an ssh key in order to use composer to pull in modules for an application. I've done a lot of reading on this before asking this question, but still can't seem to comprehend what's going on.
What I want to happen is:
As part of the playbook, on the Vagrant VM, I add the ssh key to the private repo to the ssh-agent
Using that private key, I am then able to use composer to require modules from the external source
I've read articles which highlight specifying the key in playbook execution (e.g. ansible-playbook -u username --private-key play.yml). As far as I understand, this isn't for me, as I'm calling the playbook via the Vagrantfile. I've also read articles which mention ssh forwarding (SSH Agent Forwarding with Ansible). Based on what I have read, this is what I've done:
On the VM being provisioned, I insert a known_hosts file which consists of the host entries of the machines which house the repos I need:
On the VM being provisioned, I have the following in ~/.ssh/config:
Host <VM IP>
    ForwardAgent yes
I have the following entries in my ansible.cfg to support ssh forwarding:
[defaults]
transport = ssh
[ssh_connection]
ssh_args=-o ForwardAgent=yes -o ControlMaster=auto -o ControlPersist=60s -o ControlPath=/tmp/ansible-ssh-%h-%p-%r
[privilege_escalation]
pipelining = False
I have also added the following task to the playbook which tries to
use composer:
- name: Add ssh agent line to sudoers
  become: true
  lineinfile:
    dest: /etc/sudoers
    state: present
    regexp: SSH_AUTH_SOCK
    line: Defaults env_keep += "SSH_AUTH_SOCK"
I exit the ansible provisioner and add the private key on the provisioned VM to the agent via a shell provisioner (This is where I suspect I'm going wrong)
Then, I attempt to use composer, or call git via the command module. Like this, for example, to test:
- name: Test connection
  command: ssh -T git@github.com
Finally, just in case I wasn't understanding ssh connection forwarding correctly, I assumed that what was supposed to happen was that I needed to first add the key to my local machine's agent, then forward that through to the provisioned VM to use to grab the repositories via composer. So I used ssh-add on my local machine before executing vagrant up and running the provisioner.
No matter what, though, I always get permission denied when I do this. I'd greatly appreciate some understanding as to what I may be missing in my understanding of how ssh forwarding should be working here, as well as any guidance for making this connection happen.
I'm not certain I understand your question correctly, but I often setup machines that connect to a private bitbucket repository in order to clone it. You don't need to (and shouldn't) use agent forwarding for that ("ssh forwarding" is unclear; there's "authentication agent forwarding" and "port forwarding", but you need neither in this case).
Just to be clear with terminology: you are running Ansible on your local machine, you are provisioning the controlled machine, and you want to ssh from the controlled machine to a third-party server.
What I do is I upload the ssh key to the controlled machine, in /root/.ssh (more generally $HOME/.ssh where $HOME is the home directory of the controlled machine user who will connect to the third-party server—in my case that's root). I don't use the names id_rsa and id_rsa.pub, because I don't want to touch the default keys of that user (these might have a different purpose; for example, I use them to backup the controlled machine). So this is the code:
- name: Install bitbucket aptiko_ro ssh key
  copy:
    dest: /root/.ssh/aptiko_ro_id_rsa
    mode: 0600
    content: "{{ aptiko_ro_ssh_key }}"

- name: Install bitbucket aptiko_ro ssh public key
  copy:
    dest: /root/.ssh/aptiko_ro_id_rsa.pub
    content: "{{ aptiko_ro_ssh_pub_key }}"
Next, you need to tell the controlled machine ssh this: "When you connect to the third-party server, use key X instead of the default key, and logon as user Y". You tell it in this way:
- name: Install ssh config that uses aptiko_ro keys on bitbucket
  copy:
    dest: /root/.ssh/config
    content: |
      Host bitbucket.org
        IdentityFile ~/.ssh/aptiko_ro_id_rsa
        User aptiko_ro
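One way to supply {{ aptiko_ro_ssh_key }} without committing the key in plain text is an inline vaulted variable; this is a sketch of that idea (the ansible-vault usage is my assumption, not part of the original answer):
# encrypt the key material; paste the printed block into your vars file
ansible-vault encrypt_string --name aptiko_ro_ssh_key "$(cat ~/.ssh/aptiko_ro_id_rsa)"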

Vagrant ssh connect to host 127.0.0.1:2222 port 22: Bad file number

Whenever I try to connect to my local Vagrant, I get this error when I run ssh vagrant@127.0.0.1:2222 from the Windows git bash:
ssh: connect to host 127.0.0.1:2222 port 22: Bad file number
It was working previously, so I'm not sure what could have caused this. When I try to do an SFTP connection in PHPStorm 8, I get this error:
Connection to '127.0.0.1' failed.
SSH_MSG_DISCONNECT: 2 Too many authentication failures for vagrant
I've tried vagrant destroy along with vagrant box remove laravel/homestead, and then recreated the box from a previously working backup using vagrant box add laravel/homestead homestead.box, but I still get the same errors.
I'm on Windows 7.
What can I do to get access to my vagrant box commandline again?
Try the command:
ssh -p 2222 vagrant@127.0.0.1
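The original command fails because OpenSSH does not accept a user@host:port form; the port has to be passed with -p:
ssh vagrant@127.0.0.1:2222      # wrong: ssh treats "127.0.0.1:2222" as a hostname
ssh -p 2222 vagrant@127.0.0.1   # right: the forwarded port is given explicitly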
I believe the answer by outboundexplorer above is the correct one. Here is my step-by-step approach:
Step 1: Find out exactly what SSH settings to use
Ensure the vagrant box is running (i.e., you've done vagrant up)
From the command line, go to your project directory (the one where the Vagrantfile is located) and run vagrant ssh-config.
You'll get an output like this:
Host default
  HostName 127.0.0.1
  User ubuntu
  Port 2222
  UserKnownHostsFile /dev/null
  StrictHostKeyChecking no
  PasswordAuthentication no
  IdentityFile C:/Projects/my-test-project/.vagrant/machines/default/virtualbox/private_key
  IdentitiesOnly yes
  LogLevel FATAL
Step 2: Setting up PHPStorm to SFTP to the Vagrant box
Based on the config settings shown above, I set up the following SFTP remote deployment server:
SFTP host: 127.0.0.1
Port: 2222
Root path: /home/ubuntu/my-test-project (this is the folder inside the Vagrant box where the files will be uploaded to, change to whatever suits your needs)
User name: ubuntu
Auth type: Select "Key pair (OpenSSH or PuTTY)"
Private key file: Point to the IdentityFile path shown (C:/Projects/....)
... and that was it.
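If you prefer not to read the values off the screen, a one-liner like this (my convenience addition, not part of the original answer) pulls out just the fields PhpStorm needs:
# extract host, port, user, and key path from vagrant's own ssh config
vagrant ssh-config | grep -E 'HostName|Port|User |IdentityFile'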
I got this same failure when using PhpStorm to SSH into the VirtualBox guest machine that I had set up with Vagrant. Everything worked fine before I upgraded to Windows 10. After upgrading, first of all I had to upgrade to the latest VirtualBox and Vagrant versions to get everything working on Windows 10.
But then I couldn't ssh into the guest machine using the PhpStorm ssh client. After much reading, everything seemed to suggest that I had too many ssh keys installed on my Windows machine, but checking regedit showed that I only had a couple of keys, which should be fewer than the suggested maximum of 5 keys (the default). In the end I ran vagrant ssh, which didn't let me ssh into the guest machine, but it did reconfirm the ssh details for me. I then realized that after all the new installs it didn't want me to use the C:\Users\Andy\.vagrant.d\insecure_private_key key, but instead a key that it had placed within the project itself at C:/Users/Andy/CodeLab5/vagrant/.vagrant/machines/default/virtualbox/private_key.
Everything is working as it should again now :)
Make sure your Vagrant box is up and running with the command vagrant up,
and then do vagrant ssh. It will connect to the Vagrant box on localhost.

Vagrant ssh authentication failure

The problem with ssh authentication:
==> default: Clearing any previously set forwarded ports...
==> default: Clearing any previously set network interfaces...
==> default: Preparing network interfaces based on configuration...
default: Adapter 1: nat
default: Adapter 2: bridged
==> default: Forwarding ports...
default: 22 => 2222 (adapter 1)
==> default: Booting VM...
==> default: Waiting for machine to boot. This may take a few minutes...
default: SSH address: 127.0.0.1:2222
default: SSH username: vagrant
default: SSH auth method: private key
default: Error: Connection timeout. Retrying...
default: Error: Connection timeout. Retrying...
default: Error: Connection timeout. Retrying...
default: Error: Connection timeout. Retrying...
default: Error: Authentication failure. Retrying...
default: Error: Authentication failure. Retrying...
default: Error: Authentication failure. Retrying...
default: Error: Authentication failure. Retrying...
default: Error: Authentication failure. Retrying...
I can Ctrl+C out of the authentication loop and then successfully ssh in manually.
I performed the following steps on the guest box:
Enabled Remote Login for All Users.
Created the ~/.ssh directory with 0700 permissions.
Created the ~/.ssh/authorized_keys file with 0600 permissions.
Pasted this public key
into ~/.ssh/authorized_keys
I've also tried using a private (hostonly) network instead of the public (bridged) network, using this line in the Vagrantfile:
config.vm.network "private_network", ip: "172.16.177.7"
I get the same output (except Adapter 2: hostonly) but then cannot ssh in manually.
I also tried config.vm.network "private_network", ip: "10.0.0.100".
I also tried setting config.ssh.password in the Vagrantfile. This does output SSH auth method: password but still doesn't authenticate.
And I also tried rebuilding the box and rechecking all the above.
It looks like others have had success with this configuration, so there must be something I'm doing wrong.
I found this thread and enabled the GUI, but that doesn't help.
For general information: by default, you can simply ssh in with
user: vagrant password: vagrant
https://www.vagrantup.com/docs/boxes/base.html#quot-vagrant-quot-user
First, check which insecure_private_key is configured for your machine:
$ vagrant ssh-config
Example:
$ vagrant ssh-config
Host default
  HostName 127.0.0.1
  User vagrant
  Port 2222
  UserKnownHostsFile /dev/null
  StrictHostKeyChecking no
  PasswordAuthentication no
  IdentityFile C:/Users/konst/.vagrant.d/insecure_private_key
  IdentitiesOnly yes
  LogLevel FATAL
http://docs.vagrantup.com/v2/cli/ssh_config.html
Second, either:
Replace the contents of the insecure_private_key file with the contents of your personal private key
Or:
Add it to the Vagrantfile:
Vagrant.configure("2") do |config|
config.ssh.private_key_path = "~/.ssh/id_rsa"
config.ssh.forward_agent = true
end
config.ssh.private_key_path is your local private key
Your private key must be available to the local ssh-agent. You can check with ssh-add -L. If it's not listed, add it with ssh-add ~/.ssh/id_rsa
Don't forget to add your public key to ~/.ssh/authorized_keys on the Vagrant VM. You can do it by copy-and-pasting or using a tool like ssh-copy-id (user: root, password: vagrant, port: 2222): ssh-copy-id '-p 2222 root@127.0.0.1'
If it still does not work, try this:
Remove the insecure_private_key file from c:\Users\USERNAME\.vagrant.d\insecure_private_key
Run vagrant up (Vagrant will generate a new insecure_private_key file)
In other cases, it is helpful to just set forward_agent in Vagrantfile:
Vagrant::Config.run do |config|
  config.ssh.forward_agent = true
end
Useful:
Git can be set up with git-scm.com.
After installing it and creating a personal key pair, your public key will be in your profile path: c:\users\USERNAME\.ssh\id_rsa.pub
PS: Finally, I suggest you look at Ubuntu on Windows 10
None of the above worked for me. Somehow my box had the wrong public key added to the vagrant user's authorized_keys file.
If you can still ssh into the box with the vagrant password (the password is vagrant), i.e.
ssh vagrant@localhost -p 2222
then copy the public key content from https://raw.githubusercontent.com/mitchellh/vagrant/master/keys/vagrant.pub into the authorized_keys file with the following command:
echo "ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA6NF8iallvQVp22WDkTkyrtvp9eWW6A8YVr+kz4TjGYe7gHzIw+niNltGEFHzD8+v1I2YJ6oXevct1YeS0o9HZyN1Q9qgCgzUFtdOKLv6IedplqoPkcmF0aYet2PkEDo3MlTBckFXPITAMzF8dJSIFo9D8HfdOV0IAdx4O7PtixWKn5y2hMNG0zQPyUecp4pzC6kivAIhyfHilFR61RGL+GPXQ2MWZWFYbAGjyiYJnAmCP3NOTd0jMZEnDkbUvxhMmBYSdETk1rRgm+R4LOzFUGaHqHDLKLX+FIPKcF96hrucXzcWyLbIbEgE98OHlnVYCzRdK8jlqm8tehUc9c9WhQ== vagrant insecure public key" > .ssh/authorized_keys
When done exit the VM and try vagrant ssh again. It should work now.
If you experience this issue on vagrant 1.8.5, then check out this thread on github:
https://github.com/mitchellh/vagrant/issues/7610
It's basically caused by a permission issue; the workaround is just:
vagrant ssh
password: vagrant
chmod 0600 ~/.ssh/authorized_keys
exit
then
vagrant reload
FYI: this issue only affects CentOS; Ubuntu works fine.
Run the following commands in guest machine/VM:
wget https://raw.githubusercontent.com/mitchellh/vagrant/master/keys/vagrant.pub -O ~/.ssh/authorized_keys
chmod 700 ~/.ssh
chmod 600 ~/.ssh/authorized_keys
chown -R vagrant:vagrant ~/.ssh
Then do vagrant halt. This will remove and regenerate your private keys.
(These steps assume you have already created the ~/.ssh/ directory and the ~/.ssh/authorized_keys file under your home folder.)
In my experience, this has been a surprisingly frequent problem with new vagrant machines. By far the easiest way to solve it, instead of altering the configuration itself, has been creating the required ssh keys manually on the client, then using the private key on the host.
Log in to vagrant machine: vagrant ssh, use default password vagrant.
Create ssh keys: for example, ssh-keygen -t rsa -b 4096 -C "vagrant" (as advised by GitHub's relevant guide).
Rename the public key file (by default id_rsa.pub), overriding the old one: mv .ssh/id_rsa.pub .ssh/authorized_keys.
Reload ssh service in case needed: sudo service ssh reload.
Copy the private key file (by default id_rsa) to the host machine: for instance, use a fine combination of cat and clipboard, cat .ssh/id_rsa, then select and copy (better ways must exist, go invent one!).
Logout from the vagrant machine: logout.
Find the current private key used by vagrant by looking at its configuration: vagrant ssh-config (look for, e.g., IdentityFile "/[...]/private_key").
Replace the current private key with the one you created at the host machine: for example, nano /[...]/private_key and paste from the clipboard, if all else fails. (Note, however, that if your private_key is not project specific but shared by multiple vagrant machines, you better configure the path yourself in order to not break other perfectly working machines! Changing the path is as simple as adding a line config.ssh.private_key_path = "path/to/private_key" into the Vagrantfile.) Furthermore, if you are using PuPHPet generated machine, you can store your private key to file puphpet/files/dot/ssh/id_rsa and it will be added to Vagrantfile's ssh config automatically.
Test the setup: vagrant ssh should now work.
Should that be the case, congratulate yourself, logout, run vagrant provision if needed and carry on with the meaningful task at hand.
If you still face problems, it may come in handy to add the verbose flag to the ssh command to ease debugging. You can pass that (or any other option, for that matter) after a double dash; for example, type vagrant ssh -- -v. Feel free to add as many v's as you need; each will give you more information.
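For reference, the guest-side part of the procedure condenses to roughly the following sketch (no passphrase, default file names; password auth with "vagrant" assumed to still work):
# inside the guest
ssh-keygen -t rsa -b 4096 -C "vagrant" -N "" -f ~/.ssh/id_rsa   # new key pair, no passphrase
mv ~/.ssh/id_rsa.pub ~/.ssh/authorized_keys                     # public half becomes authorized_keys
cat ~/.ssh/id_rsa                                               # copy this out to the host
# on the host: paste the copied key over the file reported as IdentityFile by `vagrant ssh-config`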
Unable to run vagrant up because it gets stuck and times out? I recently had a "water in laptop" incident and had to migrate to a new one (a Mac, by the way). I successfully got all my projects up and running besides the one that was using Vagrant.
$ vagrant up
Bringing machine 'default' up with 'virtualbox' provider...
==> default: Clearing any previously set forwarded ports...
==> default: Clearing any previously set network interfaces...
==> default: Preparing network interfaces based on configuration...
default: Adapter 1: nat
default: Adapter 2: hostonly
==> default: Forwarding ports...
default: 8000 (guest) => 8877 (host) (adapter 1)
default: 8001 (guest) => 8878 (host) (adapter 1)
default: 8080 (guest) => 7777 (host) (adapter 1)
default: 5432 (guest) => 2345 (host) (adapter 1)
default: 5000 (guest) => 8855 (host) (adapter 1)
default: 22 (guest) => 2222 (host) (adapter 1)
==> default: Running 'pre-boot' VM customizations...
==> default: Booting VM...
==> default: Waiting for machine to boot. This may take a few minutes...
default: SSH address: 127.0.0.1:2222
default: SSH username: vagrant
default: SSH auth method: private key
default: Warning: Authentication failure. Retrying...
default: Warning: Authentication failure. Retrying...
default: Warning: Authentication failure. Retrying...
It couldn't authenticate, retried again and again and eventually gave up.
This is how I got it back in shape in 3 steps:
1 - Find the IdentityFile used by Vagrant:
$ vagrant ssh-config
Host default
  HostName 127.0.0.1
  User vagrant
  Port 2222
  UserKnownHostsFile /dev/null
  StrictHostKeyChecking no
  PasswordAuthentication no
  IdentityFile /Users/ned/.vagrant.d/insecure_private_key
  IdentitiesOnly yes
  LogLevel FATAL
2 - Check the public key in the IdentityFile:
$ ssh-keygen -y -f <path-to-insecure_private_key>
It'd output something like this:
ssh-rsa AAAAB3Nyc2EAAA...9gE98OHlnVYCzRdK8jlqm8hQ==
3 - Log in to the Vagrant guest with the password vagrant:
ssh -p 2222 -o UserKnownHostsFile=/dev/null vagrant@127.0.0.1
The authenticity of host '[127.0.0.1]:2222 ([127.0.0.1]:2222)' can't be established.
RSA key fingerprint is dc:48:73:c3:18:e4:9d:34:a2:7d:4b:20:6a:e7:3d:3e.
Are you sure you want to continue connecting (yes/no)? yes
Warning: Permanently added '[127.0.0.1]:2222' (RSA) to the list of known hosts.
vagrant@127.0.0.1's password: vagrant
Welcome to Ubuntu 16.04.1 LTS (GNU/Linux 4.4.0-31-generic x86_64)
...
NOTE: if the vagrant guest is configured to disallow password authentication, you need to open the VirtualBox GUI, double-click the guest name, log in as vagrant/vagrant, then sudo -s and edit /etc/ssh/sshd_config; look for the PasswordAuthentication no line (usually at the end of the file), replace no with yes, and restart sshd (e.g. systemctl reload sshd or /etc/init.d/sshd restart).
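On most guests, that sequence of edits boils down to something like the following (run as root via sudo -s; the sed pattern assumes the line reads exactly "PasswordAuthentication no"):
# flip password authentication on and reload sshd
sed -i 's/^PasswordAuthentication no/PasswordAuthentication yes/' /etc/ssh/sshd_config
systemctl reload sshd || /etc/init.d/sshd restart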
4 - Add the public key to the /home/vagrant/.ssh/authorized_keys file.
$ echo "ssh-rsa AA2EAAA...9gEdK8jlqm8hQ== vagrant" > /home/vagrant/.ssh/authorized_keys
5 - Exit (CTRL+d) and stop the Vagrant guest and then bring it back up.
IMPORTANT: if you use any provisioning tools (e.g. Ansible), disable them before restarting your guest, as Vagrant will think your guest is not provisioned because the insecure private key was used. It will reinstall the key and then run your provisioner!
$ vagrant halt
$ vagrant up
Hopefully you will have your arms in the air now...
I got this, with just a minor amendment, from Ned Batchelder's article - Ned, you are a champ!
This can also happen if you're trying to force your VM to use the root user by default for SSH...
For example, a config like so in your Vagrantfile may cause this failure:
config.ssh.username = 'root'
config.ssh.password = 'vagrant'
config.ssh.insert_key = 'true'
Solution: Comment out those lines and try again!
Problem: I was getting ssh authentication errors on a box I provisioned; the original was working OK.
The problem for me was a missing private key in .vagrant/machines/default/virtualbox/private_key. I copied the private key from the same relative location in the original box, and voilà!
I found a way around the mess with the keys on Win 8.2, where I did not succeed with any of the methods mentioned here. It may be interesting that exactly the same combination of VirtualBox, Vagrant, and the box ran on Win 7 Ultimate without any problems.
I switched to password authentication by adding the following commands to the Vagrantfile:
config.ssh.password = "vagrant"
config.ssh.insert_key = false
Note that I'm not sure these are the only changes required, because I had already:
generated a new RSA key pair and changed the authorized_keys file accordingly (all in the virtual machine; see the suggestions above and elsewhere)
copied the private key to the same directory as the Vagrantfile and added
config.ssh.private_key_path = "./id_rsa"
But I believe these changes were irrelevant. I spent plenty of time trying, so I did not change the working configuration, for obvious reasons :)
For me, this was resolved by changing the permissions on the .ssh folder in the vagrant home directory (i.e. ~vagrant/.ssh). I think I messed up the permissions when I was setting up ssh keys for my application.
It seems that the authorized_keys file must be 'rw' for the 'vagrant' user only, so chmod 600 authorized_keys; the same goes for the directory itself and its parent:
so:
chmod 600 authorized_keys
chmod 700 .
chmod 700 ..
It was only after I had restored all these permissions that vagrant ssh started to work again.
I think it's something to do with ssh security: it refuses to recognise key files that are in any way accessible beyond the current user, so vagrant's attempts to log in were rejected.
If you are using the default SSH setup in your Vagrantfile and started seeing SSH authentication errors after re-associating your VM box due to a crash, try replacing the public key in your vagrant machine.
Vagrant replaces the public key associated with the insecure private key pair at each logout, for security reasons. If you didn't properly shut down your machine, the public/private key pair can go out of sync, causing an SSH authentication error.
To resolve this issue, simply load the current insecure private key and then copy the public key pair into your VM's authorized_keys file.
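Assuming password authentication still works on port 2222, that resync can be scripted from the host; this is a sketch, not the answerer's exact commands:
# derive the public key from whatever private key vagrant currently uses,
# then install it in the guest (enter password "vagrant" when prompted)
pk=$(vagrant ssh-config | awk '/IdentityFile/ {print $2}')
ssh-keygen -y -f "$pk" | ssh -p 2222 vagrant@127.0.0.1 'cat > ~/.ssh/authorized_keys && chmod 600 ~/.ssh/authorized_keys'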
This might be the last answer in the list, but this worked for me and I did not find it anywhere else; I found it myself after two days of research, so you'd better try this if nothing else has worked for you so far.
In my case the problem came from my VirtualBox: for some reason an option was disabled when it should have been enabled.
There were network problems with my VirtualBox. To fix them, I had to select my machine, open Settings, go to the Network tab, and make sure the option Cable Connected was selected. In my case this option was not selected, and it failed at this step:
default: SSH address: 127.0.0.1:2222
default: SSH username: vagrant
default: SSH auth method: private key
At first I thought the port was already in use; after that I reinstalled Vagrant and tried other things too, but none of them worked for me.
This has happened to me several times, and this is how I solved it:
Check and make sure your Vagrantfile has the correct private key path :
config.ssh.private_key_path = "/home/razvan/.ssh/id_rsa"
Execute the vagrant ssh command in a linux terminal
On your vagrant machine, go to
cd /home/vagrant/.ssh
and check whether the ssh key in the authorized_keys file is the same as the one on your local machine in ~/.ssh/id_rsa.pub. If not, replace the one in your vagrant authorized_keys file with the one from your local machine's ~/.ssh/id_rsa.pub.
Reload Vagrant :
vagrant reload
Hope this helps someone else. Cheers!
1. Locate the private key in the host:
vagrant ssh-config
Output:
Host default
  ...
  Port 2222
  ...
  IdentityFile /home/me/.vagrant.d/[...]/virtualbox/vagrant_private_key
  ...
2. Store the private key path and the port number in variables:
Use these two commands with the output from above:
pk="/home/me/.vagrant.d/.../virtualbox/vagrant_private_key"
port=2222
3. Generate a public key and upload it to the guest machine:
Copy/paste, no changes needed:
ssh-keygen -y -f $pk > authorized_keys
scp -P $port authorized_keys vagrant@localhost:~/.ssh/
vagrant ssh -c "chmod 600 ~/.ssh/authorized_keys"
rm authorized_keys
If you are using Windows and this issue came up unexpectedly, please try the following configuration.
config.ssh.username = 'vagrant'
config.ssh.password = 'vagrant'
config.ssh.insert_key = 'true'
This basically uses the default vagrant configuration.
Mac solution:
I added my local ssh id_rsa key to the vagrant private key
vi /Users//.vagrant/machines/default/virtualbox/private_key
/Users//.ssh/id_rsa
and copied the public key /Users//.ssh/id_rsa.pub into the vagrant box's authorized_keys:
ssh vagrant@localhost -p 2222 (password: vagrant)
ls -la
cd .ssh
chmod 0600 ~/.ssh/authorized_keys
vagrant reload
Problem resolved.
Make sure your first network interface is NAT. The second network interface can be anything you want when you're building a box. Don't forget the Vagrant user, as discussed in the Google thread.
Good luck.
I also could not get beyond:
default: SSH auth method: private key
When I used the VirtualBox GUI, it told me there was an OS/processor mismatch.
To get vagrant up to progress further, in the BIOS settings I had to, counter-intuitively:
Disable: Virtualisation
Enable: VT-x
Try toggling these settings in your BIOS.
First of all, you should remove the autogenerated insecure_private_key file, then regenerate it by typing
vagrant ssh-config
then
vagrant halt
vagrant up
It should work
I resolved the issue in the following manner.
1. Create new SSH key using Git Bash
$ ssh-keygen -t rsa -b 4096 -C "vagrant@localhost"
# Creates a new ssh key, using the provided email as a label
Generating public/private rsa key pair.
When you're prompted to "Enter a file in which to save the key," press Enter. This accepts the default file location.
Enter a file in which to save the key (/Users/[you]/.ssh/id_rsa): [Press enter]
At the prompt, type a secure passphrase. You can leave empty and press enter if you do not need a passphrase.
Enter passphrase (empty for no passphrase): [Press enter]
To connect to your Vagrant VM, type the following command:
ssh vagrant@localhost -p 2222
When you get the following message, type "yes" and press enter.
The authenticity of host 'github.com (192.30.252.1)' can't be established.
RSA key fingerprint is 16:27:ac:a5:76:28:2d:36:63:1b:56:4d:eb:df:a6:48.
Are you sure you want to continue connecting (yes/no)?
Now, to establish an SSH connection, type: $ vagrant ssh
Copy the host's public key into the authorized_keys file in the Vagrant VM. To do that, go to the Users/[you]/.ssh folder and copy the content of the id_rsa.pub file on the host machine, then paste it into the ~/.ssh/authorized_keys file in the Vagrant VM.
Change the permissions on the SSH folder and the authorized_keys file in the Vagrant VM
Restart vagrant with: $ vagrant reload
Another simple solution: on Windows, go to the file Homestead/Vagrantfile and add these lines to connect with a username/password instead of a private key:
config.ssh.username = "vagrant"
config.ssh.password = "vagrant"
config.ssh.insert_key = false
So, finally, part of the file will look like this:
if File.exists? homesteadYamlPath then
  settings = YAML::load(File.read(homesteadYamlPath))
elsif File.exists? homesteadJsonPath then
  settings = JSON.parse(File.read(homesteadJsonPath))
end

config.ssh.username = "vagrant"
config.ssh.password = "vagrant"
config.ssh.insert_key = false

Homestead.configure(config, settings)

if File.exists? afterScriptPath then
  config.vm.provision "shell", path: afterScriptPath, privileged: false
end
Hope this helps.
Just adding my solution:
rm /Users/myusername/.ssh/config
vagrant ssh-config >> /Users/myusername/.ssh/config
Somewhat similar to other proposed solutions here.
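With the stanza appended, plain ssh and scp can then address the box by the Host name from vagrant ssh-config's output (default, unless you've renamed the machine):
ssh default                           # instead of vagrant ssh
scp default:/home/vagrant/file.txt .  # the file path is just an example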
Among all of the responses here, there are lots of good things to try. For completeness, if you
ssh vagrant@localhost -p 2222
as @Bizmate suggests, and it fails, be sure you have
AllowUsers vagrant
in the /etc/ssh/sshd_config of your guest/vagrant machine.
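A quick way to check for that directive from inside the guest (if no AllowUsers line exists, all users are permitted and this isn't your problem):
sudo grep -i '^AllowUsers' /etc/ssh/sshd_config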
I am using Vagrant with a Puphpet setup from May 2015 and had this problem. It appears that the generated configuration didn't handle the Vagrant 1.7.4 (or maybe a bit earlier?) behavior of regenerating ssh keys if it detects an insecure key.
I solved it by adding the following in my Puphpet generated Vagrantfile (local setup) inside the "if File.file?(customKey)" clause:
config.ssh.insert_key = false
Reference commit
These are all the steps I followed to fix the authentication failure that occurred when I ran vagrant up:
Create a folder, e.g. F:\projects
Open this folder in Git Bash and run this command (with a valid email address):
ssh-keygen -t rsa -b 4096 -C "your_email@example.com"
This generates a key pair in two separate files in the project folder, e.g. project (private key file) and project.pub (public key file).
Go to the location C:\Users\acer\.vagrant.d and find the file insecure_private_key.
Take a backup of the file, then copy the content of the newly created private key and paste it into the insecure_private_key file. Copy insecure_private_key and paste it in this location too.
Now run vagrant up in your project location. When the issue above occurs, type vagrant ssh and log in with the username and password (by default both are set to vagrant).
Go to cd /home/vagrant/.ssh and type mv authorized_keys authorized_keys_bk
Then type ls -al, and type vi authorized_keys to open the authorized_keys file in the vi editor.
Open the generated public key (project.pub) in Notepad++ and copy its content.
Then press i in Git Bash to enable insert mode in the vi editor, right-click, and paste. Afterwards press Escape to leave insert mode.
Type :wq! to save the file, and type ls -al.
The permissions should then be set like below; no need to change them:
drwx------. 2 vagrant vagrant 4096 Feb 13 15:33 .
drwx------. 4 vagrant vagrant 4096 Feb 13 14:04 ..
-rw-------. 1 vagrant vagrant 743 Feb 13 14:26 authorized_keys
-rw-------. 1 root root 409 Feb 13 13:57 authorized_keys_bk
-rw-------. 1 vagrant vagrant 409 Jan 2 23:09 authorized_keys_originial
Otherwise, type chmod 600 authorized_keys, and also chown vagrant:vagrant authorized_keys.
Finally, run vagrant halt and vagrant up again.
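The guest-side part of the walkthrough above condenses to roughly this (assuming project.pub holds the newly generated public key):
mv ~/.ssh/authorized_keys ~/.ssh/authorized_keys_bk
vi ~/.ssh/authorized_keys              # paste the content of project.pub here
chmod 600 ~/.ssh/authorized_keys
chown vagrant:vagrant ~/.ssh/authorized_keys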
This worked fine for me.
Just for those people who have been idiots like me, or have had something odd happen to their vagrant machine: this error can also occur when you have changed the permissions of the vagrant user's home directory (deliberately or by accident).
You can log in instead (as described in other posts) using the password ('vagrant') and then run the following command to fix the permissions:
sudo chown -R vagrant:vagrant /home/vagrant
Then you should be able to log in again without entering the password.
TL;DR: The permissions on your vagrant home folder are wrong.
Simple:
homestead destroy
homestead up
Edit (not as simple as first thought):
The issue was that new versions of Homestead use PHP 7.0 and some other stuff. To avoid this mess, make sure you set the version in Homestead.yml:
version: "0"
I solved this problem by running the commands on Windows 7 CMD as given in the last post of this thread:
https://github.com/mitchellh/vagrant/issues/6744
Some commands that will reinitialize various network states:
Reset WINSOCK entries to installation defaults : netsh winsock reset catalog
Reset TCP/IP stack to installation defaults : netsh int ip reset reset.log
Flush DNS resolver cache : ipconfig /flushdns
Renew DNS client registration and refresh DHCP leases : ipconfig /registerdns
Flush routing table : route /f
I'd been beating my head against this for the last couple of days on a repackaged base box (Mac OS X, El Capitan).
Following @Radek's procedure, I ran vagrant ssh-config on the source box and got:
...
IdentityFile /Users/Shared/dev/<source-box-name>/.vagrant/machines/default/virtualbox/private_key
...
On the new copy, that command gave me:
...
IdentityFile /Users/<username>/.vagrant.d/insecure_private_key
...
So, I just added this line in the new copy:
...
config.ssh.private_key_path = "/Users/Shared/dev/<source-box-name>/.vagrant/machines/default/virtualbox/private_key"
...
Not perfect, but I can get on with my life.
Not sure your case is the same as mine though.
In my case, vagrant ssh failed key authentication and asked for a password.
I found this old setting in my ~/.ssh/config (at the top of the file):
PubkeyAcceptedKeyTypes ssh-dss,ssh-rsa
After removing it, key authentication started working. No more password prompts.
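If some other legacy host still needs those key types, you can scope the directive to that host instead of deleting it outright (the host name below is made up for illustration):
cat >> ~/.ssh/config <<'EOF'
Host legacy-server.example.com
    PubkeyAcceptedKeyTypes ssh-dss,ssh-rsa
EOF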

How to use ssh agent forwarding with "vagrant ssh"?

Rather than create a new SSH key pair on a vagrant box, I would like to re-use the key pair I have on my host machine, using agent forwarding. I've tried setting config.ssh.forward_agent to true in the Vagrantfile, then rebooted the VM, and tried using:
vagrant ssh -- -A
...but I'm still getting prompted for a password when I try to do a git checkout. Any idea what I'm missing?
I'm using vagrant 2 on OS X Mountain Lion.
Vagrant.configure("2") do |config|
config.ssh.private_key_path = "~/.ssh/id_rsa"
config.ssh.forward_agent = true
end
config.ssh.private_key_path is your local private key
Your private key must be available to the local ssh-agent. You can check with ssh-add -L; if it's not listed, add it with ssh-add ~/.ssh/id_rsa
Don't forget to add your public key to ~/.ssh/authorized_keys on the Vagrant VM. You can do it by copy-and-pasting or using a tool like ssh-copy-id
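A quick sanity check for the whole chain, assuming forward_agent is enabled as above:
ssh-add -L                     # host: your key must be listed here
vagrant ssh -c 'ssh-add -L'    # guest: should print the same key if forwarding works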
Add it to the Vagrantfile
Vagrant::Config.run do |config|
  # stuff
  config.ssh.forward_agent = true
end
See the docs
In addition to adding config.ssh.forward_agent = true to the Vagrantfile, make sure the host computer is set up for agent forwarding. GitHub provides a good guide for this (check out the troubleshooting section).
I had this working with the above replies on 1.4.3, but it stopped working on 1.5. I now have to run ssh-add for it to work fully with 1.5.
For now I add the following line to my ansible provisioning script.
- name: Make sure ssh keys are passed to guest.
  local_action: command ssh-add
I've also created a gist of my setup: https://gist.github.com/KyleJamesWalker/9538912
If you are on Windows, SSH Forwarding in Vagrant does not work properly by default (because of a bug in net-ssh). See this particular Vagrant bug report: https://github.com/mitchellh/vagrant/issues/1735
However, there is a workaround! Simply auto-copy your local SSH key to the Vagrant VM via a simple provisioning script in your VagrantFile. Here's an example:
https://github.com/mitchellh/vagrant/issues/1735#issuecomment-25640783
When we recently tried out the vagrant-aws plugin with Vagrant 1.1.5, we ran into an issue with SSH agent forwarding. It turned out that Vagrant was forcing IdentitiesOnly=yes without an option to change it to no. This forced Vagrant to only look at the private key we listed in the Vagrantfile for the AWS provider.
I wrote up our experiences in a blog post. It may turn into a pull request at some point.
Make sure that the VM does not launch its own SSH agent. I had this line in my ~/.profile
eval `ssh-agent`
After removing it, SSH agent forwarding worked.
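An easy way to spot this situation from inside the guest:
echo $SSH_AUTH_SOCK   # which agent socket the shell is using
pgrep -a ssh-agent    # a locally spawned agent here may be masking the forwarded one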
The real problem is that Vagrant uses 127.0.0.1:2222 as the default port-forward.
You can add another one (not 2222; 2222 is already occupied by the default):
config.vm.network "forwarded_port", guest: 22, host: 2333, host_ip: "0.0.0.0"
"0.0.0.0" means the forward accepts requests from external connections.
Then
ssh -p 2333 vagrant@192.168.2.101 (change to your own host IP address)
will work just fine.
Do thank me. Just call me Leifeng!
On Windows, the problem is that Vagrant doesn't know how to communicate with git-bash's ssh-agent. It does, however, know how to use PuTTY's Pageant. So, as long as Pageant is running and has loaded your SSH key, and as long as you've set config.ssh.forward_agent, this should work.
See this comment for details.
If you use Pageant, then the workaround of updating the Vagrantfile to copy SSH keys on Windows is no longer necessary.