Existing VirtualBox machine exported using vagrant but I can't use it

I had an existing openSUSE 64-bit machine which I exported using
vagrant package --base opensuse64 --output opensuse.box
After creating the box I created another folder, 'package-test', and copied the box file there. Then I used
vagrant init opensuse opensuse.box
and then
vagrant up
but I am unable to connect to it via ssh.
Am I doing something wrong?
Thanks

To make vagrant ssh work, your OpenSUSE VM has to be configured for Public Key Authentication using Vagrant's key pair.
If you want to use password authentication, you'll have to specify the ssh port and use username/password known to you.
NOTE: If this is a vagrant base box, by default you can login as vagrant/vagrant with sudo privilege, as per the packaging guide.
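For example, with the usual forwarded port this is roughly (a sketch; 2222 is the host-side port Vagrant typically forwards to the guest's port 22, so adjust it if yours differs):
# connect to the forwarded SSH port with the well-known vagrant/vagrant credentials
ssh -p 2222 vagrant@127.0.0.1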
If you want to use your own key pair, you can copy the public key and add it to the VM's ~/.ssh/authorized_keys.
Examples
Manual (1 liner)
cat /path/to/vagrant.pub | ssh user@host 'cat >> ~/.ssh/authorized_keys'
Use ssh-copy-id
# -i defaults to ~/.ssh/id_rsa.pub
ssh-copy-id user@host
# custom pub key
ssh-copy-id -i vagrant.pub user@host
NOTE: make sure ~/.ssh and ~/.ssh/authorized_keys in the VM have the proper permissions.
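If in doubt, the usual permissions can be restored inside the VM with something like the following (a sketch; run it as the user you log in as):
# ~/.ssh must not be group- or world-writable, and authorized_keys should be private
chmod 700 ~/.ssh
chmod 600 ~/.ssh/authorized_keys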

Related

GitLab SSH key | must the SSH key have the same username as the GitLab account?

Problem
Does the 'user@host' comment in id_rsa.pub need to match the actual machine username and hostname, and then the username in GitLab?
Example:
GitLab username: @john.doe
Ubuntu machine hostname (output of the hostname command): JOHNDOE
Ubuntu username: mark
Comment in id_rsa.pub: ...fsdfsdfsdfsd mark@JOHNDOE
So as you can see, my user in Ubuntu is mark and the generated RSA key ends with mark@JOHNDOE. Should it be john.doe@JOHNDOE instead (both for the Ubuntu user and in the SSH public key)?
And let's say that for some reason I cannot change the user in my Ubuntu machine.
I honestly think the answer is no and the real issue is either missing permissions on my GitLab user or some network-related problem, and I'm just being paranoid, but I want to make sure it is not related to the SSH keys.
I think the SSH key just needs to match the one registered in GitLab and that the username in the key has nothing to do with it (you can change it with the -C "john.doe@JOHNDOE" option, which is only a comment, and it still gives me errors either way). But I still have the doubt: must the Ubuntu username ALSO be john.doe?
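As a side note, the comment can also be rewritten in place without generating a new key, roughly like this (a sketch; it prompts for the passphrase if the key has one):
# change only the comment embedded in the existing key pair
ssh-keygen -c -C "john.doe@JOHNDOE" -f ~/.ssh/id_rsa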
Commands run / Troubleshooting
OS: Ubuntu 18.04.6 LTS on Windows 10 x86_64 (WSL)
I need to connect via VPN (all other HTTPS services work via the browser, so it should be fine)
Creating the SSH key
ssh-keygen -t rsa -b 2048
Add the SSH key
cat ~/.ssh/id_rsa.pub
# Then copy the key into the GitLab SSH keys settings, etc.
Also tried:
eval $(ssh-agent -s)
ssh-add -D
# note: ssh-add takes the private key, not the .pub
ssh-add ~/.ssh/id_rsa
Error
Doing a git clone gives:
Please make sure you have the correct access rights and the repository exists.
Connecting
ssh -T git@gitlab.example.com
banner exchange: Connection to [here the IP but removed] port 22: Connection timed out
OK, so the problem was the VPN software that I used.
As @Raya pointed out, the answer to my question is:
No, the user in the ssh public key does not matter
As soon as I changed the VPN it started to work, so the problem was network related.
I will self-post the answer and mark it as accepted, but won't close the question, so anyone with better information can add it.
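A quick way to separate key problems from network problems is to test whether port 22 is reachable at all, for example (a sketch; substitute your actual GitLab host):
# if this times out, the issue is routing/VPN/firewall, not SSH keys
nc -zv gitlab.example.com 22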

ESXi keeps prompting for password after adding ssh public key to authorized_keys

I want to add my SSH public key to the ESXi 7 host, so that I can log in via SSH without using a password.
But the ESXi host keeps prompting me for the password.
I have tried the following:
Scenario A
When using the "normal" way of adding ssh keys to a host.
Make an SSH key pair with ssh-keygen -t rsa
Push the SSH public key to the ESXi host with ssh-copy-id root@esx.host
Now try logging in to the ESXi host using ssh root@esx.host
This will prompt you for a password again.
Reason for failing
The SSH key is added to the ESXi host's ~/.ssh/authorized_keys, but the SSH service expects to find the keys in /etc/ssh/keys-root/authorized_keys.
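You can confirm where the SSH service looks for keys by checking its configuration on the host, for example (a sketch; assumes the usual location of sshd_config on ESXi):
# show the AuthorizedKeysFile setting the ESXi SSH daemon actually uses
grep AuthorizedKeysFile /etc/ssh/sshd_config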
Scenario B
Adding the key in the right place
Copy the key onto the ESXi host with cat ~/.ssh/id_rsa.pub | ssh root@esx.host 'cat >> /etc/ssh/keys-root/authorized_keys'
Try logging in again with ssh root@esx.host
Still asking for a password.
Reason for failing
The SSH key is generated with 2048 bits by default, but it needs to be 4096 bits.
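If you want to check the size of an existing key before regenerating it, something like this works (the first number printed is the bit length):
# print bit length, fingerprint and comment of the public key
ssh-keygen -lf ~/.ssh/id_rsa.pub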
Final Solution
# Generate a 4096-bit SSH key
ssh-keygen -t rsa -b 4096
# Copy the public key to the right place on the ESXi host
cat ~/.ssh/id_rsa.pub | ssh root@esx.host 'cat >> /etc/ssh/keys-root/authorized_keys'
# Then log in
ssh root@esx.host
Tada, now logged in without using a password:
Password:
The time and date of this login have been sent to the system logs.
WARNING:
All commands run on the ESXi shell are logged and may be included in
support bundles. Do not provide passwords directly on the command line.
Most tools can prompt for secrets or accept them from standard input.
VMware offers supported, powerful system administration tools. Please
see www.vmware.com/go/sysadmintools for details.
The ESXi Shell can be disabled by an administrative user. See the
vSphere Security documentation for more information.
[root@esx.host:~]

Error Public Key when trying to ssh into Google Cloud Platform VM

I had been using VSCode's Remote-SSH to access my virtual machines running on Google Cloud. This had been working perfectly fine until I made a snapshot of my most recent instance and created a new instance out of it on a larger VM. Now when I try to connect (through any method) I get: "Permission denied (publickey).". I have spent countless hours deleting, re-adding, and recreating my SSH keys to no avail. Before, I simply ran "gcloud compute config-ssh" and this created a working config file, but now this no longer works. Please help, I have tried everything and there is simply no way for me to SSH in. On the website I can click the SSH button to open up their shell, but I cannot do it from my terminal.
The problem may be that VSCode is not offering your SSH private key during the connection. You can point to your private key by adding an IdentityFile option to the host entry in your SSH configuration file:
Host vm_name
HostName external_ip
IdentityFile /path/to/ssh_private_key
Port port_number
Here is the long story, if you or someone else needs more information.
You can start from scratch to ensure that your SSH keys are not compromised and that they are not the origin of the problem.
Create SSH Key
First, create new SSH keys. On the computer that you will use to access your remote host (the Google VM instance), open your terminal or cmd and go to the .ssh folder to generate the keys.
My ssh config and keys are under my user directory, /home/my_user/.ssh on Linux or C:\Users\my_user\.ssh on Windows.
Then I cd to one of these paths, depending on which system I am using at the moment.
Linux:
cd /home/my_user/.ssh
Windows:
cd C:\Users\my_user\.ssh
Command to generate SSH key
ssh-keygen -t rsa -f my_ssh_key -C user
my_ssh_key: the name of your key; you can put whatever you want to better identify it.
user: must be the user that you want to use to connect to your Google VM instance.
This will generate a private key named my_ssh_key and a public key named my_ssh_key.pub.
Alternatively, stay in any location on the operating system and pass the absolute path where the keys should be generated:
Linux:
ssh-keygen -t rsa -f /home/my_user/.ssh/my_ssh_key -C user
Windows:
ssh-keygen -t rsa -f C:\Users\my_user\.ssh\my_ssh_key -C user
Copy the public key into your Google Cloud VM's authorized_keys file:
/home/my_user/.ssh/authorized_keys
** Do not overwrite any public key that already exists; just append to the authorized_keys file.
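One way to append it without overwriting anything, assuming you can still reach the VM some other way (for example the browser SSH button or an existing working key), is along these lines (replace user and external_ip with your own values):
# append (never overwrite) the new public key on the VM
cat ~/.ssh/my_ssh_key.pub | ssh user@external_ip 'cat >> ~/.ssh/authorized_keys'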
Add new ssh Host entry for remote connection
Click on the Remote SSH manager icon at the bottom right of VS Code, click on the Remote SSH: Open Configuration File option, and choose your SSH configuration file to add another SSH host entry for the remote connection.
The config file must be under the .ssh directory, the same path used in the SSH key generation step.
Linux:
/home/my_user/.ssh/config
Windows:
C:\Users\my_user\.ssh\config
To add another Host, write the following and make the appropriate changes:
Host vm_name
HostName external_ip
IdentityFile /path/to/ssh_private_key
Port port_number
vm_name: an alias used to connect with the ssh command in a practical way; it can be whatever you want.
external_ip: the external IP of your Google VM instance; you can get it from the VM instances panel at https://console.cloud.google.com/
IdentityFile: the path to your private SSH key, the generated file that does not have the .pub extension.
Linux:
/home/my_user/.ssh/my_ssh_key
Windows:
C:\Users\my_user\.ssh\my_ssh_key
Port: the SSH port number of your Google VM instance; 22 is the default port.
Now just choose this host to connect to your Google VM instance.
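Once the entry is saved, you can sanity-check it from a plain terminal before involving VSCode, assuming the entry went into your default config file:
# uses the Host alias from the config entry above
ssh vm_name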
For more details about SSH settings on Google Cloud Platform: https://cloud.google.com/compute/docs/instances/adding-removing-ssh-keys#linux-and-macos_1

Is there a default password to connect to vagrant when using `homestead ssh` for the first time?

I'm trying to connect to vagrant via homestead ssh:
vagrant@127.0.0.1's password:
But my public key password doesn't work.
My Homestead.yaml looks like this:
authorize: ~/.ssh/id_rsa.pub
keys:
- ~/.ssh/id_rsa
I'm using "Laravel Homestead version 2.0.14" with "Vagrant 1.7.2".
After trying a lot of passwords and becoming totally confused about why my public key password was not working, I found out that I have to use vagrant as the password.
Maybe this info helps someone else too; that's why I've written it down here.
Edit:
According to the Vagrant documentation, there is usually a default password for the user vagrant which is vagrant.
Read more here: official website
In recent versions however, they have moved to generating keypairs for each machine. If you would like to find out where that key is, you can run vagrant ssh -- -v. This will show the verbose output of the ssh login process. You should see a line like
debug1: Trying private key: /home/aaron/Documents/VMs/.vagrant/machines/default/virtualbox/private_key
I had the same problem after moving the machine from a Time Machine restore onto another host. The problem is that the SSH key for Vagrant is not your key; it's a key in the Homestead directory.
Solution for me:
Use vagrant / vagrant to access the Homestead VM
Run vagrant ssh-config to see the SSH config
Run in the terminal:
vagrant ssh-config
Host default
HostName 127.0.0.1
User vagrant
Port 2222
UserKnownHostsFile /dev/null
StrictHostKeyChecking no
PasswordAuthentication no
IdentityFile "/Users/MYUSER/.vagrant.d/insecure_private_key"
IdentitiesOnly yes
LogLevel FATAL
ForwardAgent yes
Create a new pair of SSH keys
ssh-keygen -f /Users/MYUSER/.vagrant.d/insecure_private_key
Copy the content of the public key:
cat /Users/MYUSER/.vagrant.d/insecure_private_key.pub
In another shell, inside the Homestead VM, copy it into authorized_keys:
vagrant@homestead:~$ echo 'CONTENT_OF_PUBLIC_KEY' >> ~/.ssh/authorized_keys
Now you can access it with vagrant ssh
By default Vagrant uses a generated private key to log in; you can try this:
ssh -l ubuntu -p 2222 -i .vagrant/machines/default/virtualbox/private_key 127.0.0.1
This is the default working setup https://www.youtube.com/watch?v=XiD7JTCBdpI
Use Connection Method: Standard TCP/IP over SSH
SSH Hostname: 127.0.0.1:2222
SSH Username: vagrant, password: vagrant
MySQL Hostname: localhost
Username: homestead, password: secret
On a Windows machine I was able to log in to SSH from Git Bash with
ssh vagrant@VAGRANT_SERVER_IP
without providing a password.
Using the Bitvise SSH client on Windows:
Server host: VAGRANT_SERVER_IP
Server port: 22
Username: vagrant
Password: vagrant
In my case I learned through the output from:
vagrant ssh -- -v
The problem was that my private key generated by Vagrant was being ignored because its permissions were too open (on Windows 10).
The log lines were:
Permissions for 'C:/My Folder/.vagrant/machines/default/virtualbox/private_key'
are too open. It is required that your private key files are NOT
accessible by others. This private key will be ignored.
So in Windows Explorer, navigate to the private key for the VM on the path shown in your log, right-click it and select Properties. Then go to the Security tab and click the Advanced button. Next, add your specific user with Full Control, then click the Disable inheritance button at the bottom of the dialog and choose to remove all inherited permissions. You should be left with just your own user account having permissions on the private_key file. Click Apply and close the Properties dialog, then try vagrant ssh again. It should now let you in without asking for a password.
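If you prefer the command line, roughly the same result can be had with something like the following sketch (icacls is assumed here; adjust the path and YOUR_USER to your own values):
# remove inherited permissions so only explicit entries remain
icacls "C:\My Folder\.vagrant\machines\default\virtualbox\private_key" /inheritance:r
# grant read access to just your own account
icacls "C:\My Folder\.vagrant\machines\default\virtualbox\private_key" /grant:r YOUR_USER:R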

How do I set up passwordless ssh on AWS

How do I set up passwordless SSH between nodes on an AWS cluster?
The following steps to set up passwordless authentication have been tested thoroughly on CentOS and Ubuntu.
Assumptions:
You already have access to your EC2 machine, either using the pem key or with credentials for a Unix user that has root permissions.
You have already set up RSA keys on your local machine. The private key and public key are available at ~/.ssh/id_rsa and ~/.ssh/id_rsa.pub respectively.
Steps:
Log in to your EC2 machine as the root user.
Create a new user:
useradd -m <yourname>
sudo su <yourname>
cd
mkdir -p ~/.ssh
touch ~/.ssh/authorized_keys
Append the contents of the file ~/.ssh/id_rsa.pub on your local machine to ~/.ssh/authorized_keys on the EC2 machine.
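One way to do that, while still in the shell on the EC2 machine as <yourname>, is to paste the key contents in directly (a sketch; <paste-public-key> is a placeholder for the single line from your local id_rsa.pub):
echo "<paste-public-key>" >> ~/.ssh/authorized_keys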
chmod -R 700 ~/.ssh
chmod 600 ~/.ssh/*
Make sure SSH logins are permitted by the machine. In the file /etc/ssh/sshd_config, make sure the line containing "PasswordAuthentication yes" is uncommented. Restart the sshd service if you make any change to this file:
service sshd restart # On Centos
service ssh restart # On Ubuntu
Your passwordless login should work now. Try the following from your local machine:
ssh -A <yourname>@ec2-xx-xx-xxx-xxx.ap-southeast-1.compute.amazonaws.com
To make yourself a superuser, open /etc/sudoers (preferably with visudo). Make sure the following two lines are uncommented:
## Allows people in group wheel to run all commands
%wheel ALL=(ALL) ALL
## Same thing without a password
%wheel ALL=(ALL) NOPASSWD: ALL
Add yourself to the wheel group:
usermod -aG wheel <yourname>
This may help someone
Copy the pem file onto the machine, then copy the content of the pem file into the ~/.ssh/id_rsa file. You can use the command below or your own:
cat my.pem > ~/.ssh/id_rsa
Try ssh localhost; it should work, and the same goes for the other machines in the cluster.
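One caveat worth adding: ssh refuses private keys that are readable by other users, so after copying the pem contents it is usually necessary to tighten the permissions:
# ssh ignores private keys with permissions looser than this
chmod 600 ~/.ssh/id_rsa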
How I made passwordless SSH work between two instances is the following:
Create EC2 instances; they should be in the same subnet and have the same security group.
Open ports between them: make sure the instances can communicate with each other. Use the default security group, which has one rule relevant for this case:
Type: All Traffic
Source: Custom – id of the security group
Log in to the instance you want to connect from.
Run:
ssh-keygen -t rsa -N "" -f /home/ubuntu/.ssh/id_rsa
to generate a new RSA key.
Copy your private AWS key as ~/.ssh/my.key (or whatever name you want to use)
Make sure you change the permissions to 600:
chmod 600 ~/.ssh/my.key
Copy the public key to the instance you wish to connect to without a password:
cat ~/.ssh/id_rsa.pub | ssh -i ~/.ssh/my.key ubuntu@10.0.0.X "cat >> ~/.ssh/authorized_keys"
If you test passwordless SSH to the other machine, it should work:
ssh 10.0.0.X
You can use SSH keys as described here:
http://pkeck.myweb.uga.edu/ssh/