I'm following a few tutorials to learn Vagrant and Ansible. I get to a point in a tutorial where I have an inventory file of boxes that it will supposedly provision for me:
[loadbalancer]
lb01
[webserver]
app01
app02
[database]
db01
[control]
control ansible_connection=local
Please correct me where I'm wrong, but I think I should have set up the authorized_keys file for each of these machines manually by running "vagrant up", followed by "vagrant ssh lb01", and placing my public key manually in authorized_keys. Or is there a quicker way to do this part? I certainly hope so.
Thanks!
Mike
If you are using Vagrant, you can use the ansible provisioner.
config.vm.provision "ansible" do |ansible|
ansible.playbook = "playbook.yml"
end
Vagrant takes care of setting up the inventory file and the related SSH private keys for you.
If you do want to see what inventory file has been generated, you can find that at
.vagrant/provisioners/ansible/inventory/vagrant_ansible_inventory
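For a multi-machine setup like the inventory above, the provisioner can also generate the group sections for you. Here's a rough sketch, assuming your Vagrantfile defines machines named lb01, app01, app02, db01 and control:
config.vm.provision "ansible" do |ansible|
  ansible.playbook = "playbook.yml"
  ansible.groups = {
    "loadbalancer" => ["lb01"],
    "webserver"    => ["app01", "app02"],
    "database"     => ["db01"],
    "control"      => ["control"]
  }
  # equivalent of "control ansible_connection=local" in the hand-written inventory
  ansible.host_vars = { "control" => { "ansible_connection" => "local" } }
end
The generated inventory will then contain those groups, so no manual editing of authorized_keys is needed.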
I am trying to follow this Vagrant tutorial. I get an error after my first two commands. I ran these two commands from the command line:
$ vagrant init hashicorp/precise64
$ vagrant up
After I ran the vagrant up command I got this message:
The private key to connect to the machine via SSH must be owned
by the user running Vagrant. This is a strict requirement from
SSH itself. Please fix the following key to be owned by the user
running Vagrant:
/media/bcc/Other/Linux/vagrant3/.vagrant/machines/default/virtualbox/private_key
And then if I run any command I get the same error. Even if I run vagrant ssh I get the same error message. Please help me fix the problem.
I am on Linux Mint and using VirtualBox as well.
Exactly as the error message tells you:
The private key to connect to the machine via SSH must be owned
by the user running Vagrant.
Therefore check the ownership of the file using
stat /media/bcc/Other/Linux/vagrant3/.vagrant/machines/default/virtualbox/private_key
check which user you are running as using
id
or
whoami
and then change the owner of the file:
chown `whoami` /media/bcc/Other/Linux/vagrant3/.vagrant/machines/default/virtualbox/private_key
Note that this might not be possible if your /media/bcc/ is some non-Linux filesystem that does not support Linux permissions. In that case you should choose a more suitable location for your private key.
Jakuje has the correct answer - if the file system you are working on supports changing the owner.
If you are trying to mount the vagrant box off of NTFS, it is not possible to change the owner of the key file.
If you want to keep the files on NTFS and you are running a local instance, you can try the following, which worked for me:
vagrant halt
[remove the vagrant box]
[add the following line to your Vagrantfile]
config.ssh.insert_key = false
[** you may need to remove and clone your project again]
vagrant provision
This solution may not be suitable for a live instance - it uses the default insecure SSH key. If you require more security you might be able to find a more palatable solution here https://www.vagrantup.com/docs/vagrantfile/ssh_settings.html
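In context, the relevant part of the Vagrantfile would look roughly like this (a sketch; the box name is just the one from the question above):
Vagrant.configure("2") do |config|
  config.vm.box = "hashicorp/precise64"
  # don't generate a per-machine key on the NTFS mount; keep the default insecure key
  config.ssh.insert_key = false
end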
If you put Vagrant data on NTFS you can use this trick to bypass the keyfile ownership/permissions check.
Copy your key file to $HOME/.ssh/ or anywhere else on a suitable filesystem where you can set the correct ownership and permissions. Then simply create a symlink (!) to it inside the NTFS directory (where you have set $VAGRANT_HOME, for example) like this:
ln -sr $HOME/.ssh/your_key_file your_key_file
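Putting the whole trick together, a sketch might look like this (vagrant_default_key is just an arbitrary name, and the path assumes the default VirtualBox machine key location):
# move the key off NTFS so its ownership/permissions can be fixed
mv .vagrant/machines/default/virtualbox/private_key ~/.ssh/vagrant_default_key
chmod 600 ~/.ssh/vagrant_default_key
# leave a relative symlink behind in the NTFS directory
ln -sr ~/.ssh/vagrant_default_key .vagrant/machines/default/virtualbox/private_key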
I am brand new to learning Ansible. Here is a pretty easy example.
I have computer A, where I will be running playbooks from.
And 10 other host machines that need to be configured. My question is: do I just need to put the public SSH key of my host machine on the 10 hosts in ~/.ssh/authorized_keys?
I guess my understanding of how to efficiently set up SSH connections between my main computer and all the clients is a little fuzzy. Any help would be appreciated here.
You create a file called hosts with this content:
[test-vms]
10.0.0.100 ansible_ssh_pass='password' ansible_ssh_user='username'
In the above hosts file, leave off ansible_ssh_pass='password' if you are using SSH keys. Then you can create a playbook with the commands and call the playbook as shown below. The first line of the playbook needs to have the hosts declaration:
---
- hosts: test-vms
  tasks:
    - name: "This is a test task"
      command: /bin/hostname
Finally, you call the playbook like this:
ansible-playbook -i <hosts-file> <playbook.yaml>
Ansible simply uses SSH, so you can either copy the public key as you describe or use password authentication with the --user and --ask-pass flags.
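For example, to push your public key to all 10 hosts in one go, something like this should work (hosts.txt is just a hypothetical file listing one host per line):
# copy the public key to every host; you'll be prompted for each host's password once
for host in $(cat hosts.txt); do
  ssh-copy-id -i ~/.ssh/id_rsa.pub "$host"
done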
Yes. As far as connections to hosts go, Ansible sets up SSH connections between the master machine and the host machines. You have to get the SSH host key fingerprints of the end machines accepted on the master. You can always skip the "Are you sure you want to continue connecting (yes/no/[fingerprint])?" step, i.e. adding the fingerprints to ~/.ssh/known_hosts, by setting host_key_checking=false.
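For reference, that setting goes in ansible.cfg (it can also be set with the ANSIBLE_HOST_KEY_CHECKING environment variable):
[defaults]
host_key_checking = False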
I found this great video for initial Ansible Setup - https://youtu.be/-Q4T9wLsvOQ - maybe this can help!
I currently use Vagrant and Chef to provision a VM and set up my PHP-based project. This includes running composer install, which essentially does a git clone of a number of private repositories.
After setting up SSH agent forwarding as outlined in the docs and the answers here (How to use ssh agent forwarding with "vagrant ssh"?), I have successfully got it working.
The problem I'm having is that whenever I boot a VM, provision a VM, or SSH into a VM I'm now asked for Vagrant's default password; see the examples below:
==> web: Waiting for machine to boot. This may take a few minutes...
web: SSH address: 192.168.77.185:22
web: SSH username: vagrant
web: SSH auth method: private key
Text will be echoed in the clear. Please install the HighLine or Termios libraries to suppress echoed text.
vagrant@192.168.77.185's password:
Example 2
➜ vagrant git:(master) ✗ vagrant ssh
vagrant@192.168.77.185's password:
This is pretty inconvenient as I work across a number of projects, including destroying and creating some of them a number of times a day (Chef Test Kitchen). Is there any way to automatically use my public key as well so I don't need to continually enter a password?
I ran into a similar issue recently after creating a new Vagrant box from scratch. The problem turned out to be old entries in ~/.ssh/known_hosts (on OS X).
Try the following (assumes OS X or linux):
ssh into your Vagrant machine
type ip addr or ifconfig or the like (depending on your OS)
take note of the IP addresses listed, including 127.0.0.1
on your host machine, run ssh-keygen -R {vm-ip-address} for each of the addresses from step 3 (make sure to include 127.0.0.1 and [127.0.0.1]); a concrete sketch follows these steps
confirm the relevant entries have been removed from ~/.ssh/known_hosts
vagrant reload
vagrant ssh
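As a concrete sketch of step 4, using the IP address from the question; the [127.0.0.1]:2222 entry is only relevant if you also connect through Vagrant's usual forwarded SSH port:
ssh-keygen -R 192.168.77.185
ssh-keygen -R 127.0.0.1
ssh-keygen -R "[127.0.0.1]:2222"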
Alternatively, you can just delete/move/rename the ~/.ssh/known_hosts file, though this will require reconfirming the authenticity of every machine you've already SSH'd to.
I hope that helps.
Reference: http://www.geekride.com/ssh-warning-remote-host-identification-has-changed/
I use Vagrant with a third-party Linux box.
The box has the default vagrant/vagrant credentials.
In my Vagrantfile I want it to use SSH, so I have this:
config.vm.provision :shell, :path => "bootstrap.sh"
config.ssh.private_key_path = "~/.ssh/id_rsa"
config.ssh.forward_agent = true
In my bootstrap script I want to add my public key to authorized_keys. This works if I do it post VM creation.
But when I re-provision the VM from scratch, the VM has not yet received the public key through my bootstrap shell script.
How can I have vagrant install my public key in authorized_keys and authenticate with vagrant/vagrant until this has happened? Or is there a better way?
Found something that works
Based on this: Vagrant insecure by default?
Where we have
config.ssh.private_key_path = ["#{ENV['HOME']}/.ssh/id_rsa",
                               "#{ENV['HOME']}/.vagrant.d/insecure_private_key"]
This seems to have the effect that Vagrant tries the keys in turn until it finds one that works (the example enumerates host filesystem paths too - very nice indeed).
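To close the loop with the bootstrap script from the question: while Vagrant falls back to the insecure key on a fresh VM, bootstrap.sh can install your own public key. A minimal sketch of that part of the script, assuming you've dropped id_rsa.pub into the project folder (synced to /vagrant by default):
# bootstrap.sh (shell provisioners run as root by default)
mkdir -p /home/vagrant/.ssh
cat /vagrant/id_rsa.pub >> /home/vagrant/.ssh/authorized_keys
chmod 700 /home/vagrant/.ssh
chmod 600 /home/vagrant/.ssh/authorized_keys
chown -R vagrant:vagrant /home/vagrant/.ssh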
Rather than create a new SSH key pair on a Vagrant box, I would like to reuse the key pair I have on my host machine, using agent forwarding. I've tried setting config.ssh.forward_agent to TRUE in the Vagrantfile, then rebooted the VM, and tried using:
vagrant ssh -- -A
...but I'm still getting prompted for a password when I try to do a git checkout. Any idea what I'm missing?
I'm using Vagrant 2 on OS X Mountain Lion.
Vagrant.configure("2") do |config|
  config.ssh.private_key_path = "~/.ssh/id_rsa"
  config.ssh.forward_agent = true
end
config.ssh.private_key_path is your local private key
Your private key must be available to the local ssh-agent. You can check with ssh-add -L; if it's not listed, add it with ssh-add ~/.ssh/id_rsa
Don't forget to add your public key to ~/.ssh/authorized_keys on the Vagrant VM. You can do it by copy-and-pasting or by using a tool like ssh-copy-id
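For example, from the host (assuming your key is ~/.ssh/id_rsa and the VM is reachable on Vagrant's usual forwarded port 2222; adjust if yours differs):
# check which keys the local agent holds
ssh-add -L
# add your key if it isn't listed
ssh-add ~/.ssh/id_rsa
# install the matching public key into the VM's authorized_keys
ssh-copy-id -i ~/.ssh/id_rsa.pub -p 2222 vagrant@127.0.0.1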
Add it to the Vagrantfile
Vagrant::Config.run do |config|
  # stuff
  config.ssh.forward_agent = true
end
See the docs
In addition to adding "config.ssh.forward_agent = true" to the Vagrantfile, make sure the host computer is set up for agent forwarding. GitHub provides a good guide for this (check out the troubleshooting section).
I had this working with the above replies on 1.4.3, but it stopped working on 1.5. I now have to run ssh-add for it to work fully with 1.5.
For now I add the following task to my Ansible provisioning script:
- name: Make sure ssh keys are passed to the guest.
  local_action: command ssh-add
I've also created a gist of my setup: https://gist.github.com/KyleJamesWalker/9538912
If you are on Windows, SSH Forwarding in Vagrant does not work properly by default (because of a bug in net-ssh). See this particular Vagrant bug report: https://github.com/mitchellh/vagrant/issues/1735
However, there is a workaround! Simply auto-copy your local SSH key to the Vagrant VM via a simple provisioning script in your Vagrantfile. Here's an example:
https://github.com/mitchellh/vagrant/issues/1735#issuecomment-25640783
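I haven't reproduced the linked comment here, but the idea is roughly this (a sketch, assuming your public key is ~/.ssh/id_rsa.pub and that this sits inside your Vagrant.configure block):
ssh_pub_key = File.read(File.expand_path("~/.ssh/id_rsa.pub")).strip
config.vm.provision "shell", inline: <<-SHELL
  mkdir -p /home/vagrant/.ssh
  # append the host's public key if it isn't already present
  grep -qF "#{ssh_pub_key}" /home/vagrant/.ssh/authorized_keys 2>/dev/null || \
    echo "#{ssh_pub_key}" >> /home/vagrant/.ssh/authorized_keys
  chown -R vagrant:vagrant /home/vagrant/.ssh
SHELL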
When we recently tried out the vagrant-aws plugin with Vagrant 1.1.5, we ran into an issue with SSH agent forwarding. It turned out that Vagrant was forcing IdentitiesOnly=yes without an option to change it to no. This forced Vagrant to only look at the private key we listed in the Vagrantfile for the AWS provider.
I wrote up our experiences in a blog post. It may turn into a pull request at some point.
Make sure that the VM does not launch its own SSH agent. I had this line in my ~/.profile
eval `ssh-agent`
After removing it, SSH agent forwarding worked.
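You can verify which agent the VM is seeing from inside the guest; if ssh-add -L lists the key from your host machine, forwarding is working:
# run inside the VM after `vagrant ssh`
echo $SSH_AUTH_SOCK   # should point at a forwarded socket, e.g. under /tmp/ssh-*
ssh-add -L            # should list your host machine's key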
The real problem is that Vagrant uses 127.0.0.1:2222 as the default port forward.
You can add another one (not 2222; 2222 is already occupied by default):
config.vm.network "forwarded_port", guest: 22, host: 2333, host_ip: "0.0.0.0"
"0.0.0.0" is way take request from external connection.
then
ssh -p 2333 vagrant@192.168.2.101 (change to your own host IP address)
will work just fine.
Don't thank me, just call me Lei Feng!
On Windows, the problem is that Vagrant doesn't know how to communicate with git-bash's ssh-agent. It does, however, know how to use PuTTY's Pageant. So, as long as Pageant is running and has loaded your SSH key, and as long as you've set config.ssh.forward_agent, this should work.
See this comment for details.
If you use Pageant, then the workaround of updating the Vagrantfile to copy SSH keys on Windows is no longer necessary.