Avoid inserting the path of the SSH key pair when connecting through passwordless login - ssh

I've set up a passwordless connection through ssh using an SSH key pair.
So if I run the command:
ssh -i /root/.ssh/root_master master@ip
I'm able to connect to master@ip without typing the password.
However, I would like to connect without typing
-i /root/.ssh/root_master
but just by typing
ssh master@ip
Can anyone help me?

localHost $ ssh remoteUser@remoteHostname
If you want to connect to the remote server just by typing the above command, you must create SSH trust between your local host and the remote host.
Step 1: Create the SSH setup on both hosts (the .ssh directory is usually located in the home (~) directory).
Step 2: Generate an RSA key pair on both hosts. To generate the RSA key pair:
cd ~; mkdir -p .ssh; cd .ssh
ssh-keygen -t rsa -f "id_rsa" -N "" -P ""; chmod 400 id_rsa
touch authorized_keys; touch known_hosts
Step 3: Append the local host's id_rsa.pub file to the remote host's authorized_keys file, and vice versa (in case you want to build trust in both directions).
Step 4: Also make an entry in the known_hosts file, or it will be created automatically when you connect for the first time.
This way you can create SSH trust between the hosts and make the login passwordless.
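Since the key in the question has a non-default name, one more option worth sketching is a Host entry in root's ~/.ssh/config on the local machine; the alias and address below are placeholders for the master host:
Host master
HostName <master-ip>
User root
IdentityFile /root/.ssh/root_master
With that entry, ssh master picks up the key automatically, with no -i flag needed.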
Another way to do this is to use the newer SSH module for Perl.

VS Code jump-box setup with SSH keys

Establishing an SSH connection via a jump box
Hi everyone, I have been trying to set up my environment in VS Code to run my code from my Windows laptop on a remote Linux server (through my University department's proxy), by following this tutorial. It is working fine, but every time I connect to the host I need to enter my password; I would like to avoid this by configuring my SSH keys, but it seems I haven't found the proper way to do so.
Generating the keys
Let's call my local Windows machine local, the proxy host1, and the final endpoint host2. I created a private/public key pair on local and transferred the public key to host1, so that it is now in ~/.ssh/authorized_keys, then repeated the process by generating a new key pair on host1 and transferring its public key to host2. I followed the instructions here for generating and transferring the keys:
Generate key on local:
ssh-keygen -t rsa -b 4096
Transfer public key to host1:
$USER_AT_HOST="your-user-name-on-host@hostname"
$PUBKEYPATH="$HOME\.ssh\id_rsa.pub"
$pubKey=(Get-Content "$PUBKEYPATH" | Out-String); ssh "$USER_AT_HOST" "mkdir -p ~/.ssh && chmod 700 ~/.ssh && echo '${pubKey}' >> ~/.ssh/authorized_keys && chmod 600 ~/.ssh/authorized_keys"
Generate key on host1:
ssh-keygen -t rsa -b 4096
Transfer public key to host2:
export USER_AT_HOST="your-user-name-on-host@hostname"
export PUBKEYPATH="$HOME/.ssh/id_rsa.pub"
ssh-copy-id -i "$PUBKEYPATH" "$USER_AT_HOST"
VS Code config
I then edited my config file according to this, which now looks as follows:
Host host1
HostName host1
User me
ForwardX11 yes
IdentityFile C:\Users\Me\.ssh\id_rsa
Host host2
HostName host2
ProxyCommand C:\Windows\System32\OpenSSH\ssh.exe -q -W %h:%p host1
ForwardX11Trusted yes
User me
IdentityFile ~/.ssh/id_rsa
It seems that the first jump works fine (I don't need to enter my password twice) but I am still asked for it when establishing the connection. My guess is that I haven't configured the IdentityFile properly? When connecting through PowerShell in two steps (i.e. SSH into host1 and then host2), I don't need to enter my password. I would really appreciate any advice!
I was stuck in the same situation. I tried a lot, and finally managed to connect without password prompts. Below is how I did it; I hope it helps.
Suppose I (machine A) want to connect to machine C via machine B (the jump server). Generate keys using ssh-keygen on machine A, then copy the content of the public key file (id_rsa.pub by default) to the authorized keys file (authorized_keys in the .ssh folder by default) of both machine B and machine C (or use ssh-copy-id if available). Finally, in the IdentityFile field of both hosts in the config file (host1 and host2 in your case), fill in ~/.ssh/id_rsa or C:\Users\your_user_name\.ssh\id_rsa (the private key you generated on machine A).
After that it connects as expected. (I guess, though I'm not sure, that in this situation the identity file on the local machine A is always the one used to connect, so machine B and machine C both need to know machine A's identity for all authorizations.)
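As a sketch of what that means for the config from the question (same placeholders as in the question; the only change is host2's IdentityFile pointing at the key generated on the local machine):
Host host1
HostName host1
User me
IdentityFile C:\Users\Me\.ssh\id_rsa
Host host2
HostName host2
ProxyCommand C:\Windows\System32\OpenSSH\ssh.exe -q -W %h:%p host1
User me
IdentityFile C:\Users\Me\.ssh\id_rsa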
I ran into exactly the same situation, that is, making this SSH connection: local (Windows) -> host1 (Linux) -> host2 (Linux).
The problem here is that for the second jump to host2, the ProxyCommand "ssh.exe -q -W %h:%p host1" actually looks for host2's IdentityFile "~/.ssh/id_rsa" on local. Because the key you generated on host1 is different from the one on local, using the key on local fails to make the second jump.
Solutions:
Simply use the same key for both jumps: copy the id_rsa.pub from local into host2's authorized_keys (a sketch follows below this list).
Copy the key files from host1 to local, rename them, and fill host2's IdentityFile with the path of that key file on local.
Referring to this question, modifying the ProxyCommand may enable ssh to use the key on host1 during the second jump. However, I haven't been able to make it work on my Windows local machine.
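For the first solution, a minimal sketch of that copy, reusing the PowerShell append pattern from the question; the host2 alias comes from the config, so the command goes through the existing ProxyCommand and will ask for host2's password once:
$pubKey = (Get-Content "$HOME\.ssh\id_rsa.pub" | Out-String)
ssh host2 "mkdir -p ~/.ssh && chmod 700 ~/.ssh && echo '${pubKey}' >> ~/.ssh/authorized_keys && chmod 600 ~/.ssh/authorized_keys"
Afterwards both jumps authenticate with the key generated on local.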

scp is still requesting password

I want to copy big files from one Linux server (SLES 11) to another (SunOS) via a bash script. I don't want a password prompt, so I used ssh-keygen to generate a key for this connection. These are the steps I followed:
ssh-keygen -t rsa -b 2048
ssh-copy-id -i /home/username/.ssh/id_rsa.pub swtrans@111.111.111.111
ssh -i id_rsa.pub swtrans@111.111.111.111
After this, the scp command still requests a password.
I am not the 'root' user on either server.
I changed the permissions of the .ssh directory to 700 and of the authorized_keys file to 640 on the remote server.
ssh -i id_rsa.pub swtrans@111.111.111.111
The -i argument expects the private key, not the public one. You should use:
ssh -i id_rsa swtrans@111.111.111.111
If that does not help, please provide the errors you see in the server log and on the client.
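As a sketch of the corrected commands with the private key (the remote path and file name below are placeholders):
# test the key-based login using the private key, not the .pub file
ssh -i /home/username/.ssh/id_rsa swtrans@111.111.111.111
# scp accepts the same -i option, so the copy should run without a password prompt
scp -i /home/username/.ssh/id_rsa bigfile.tar.gz swtrans@111.111.111.111:/some/remote/path/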

SSH "Failed to add the host to the list of known hosts" Openshift

I tried to use the ssh command to connect to another remote host.
ssh -p 21098 -i $OPENSHIFT_DATA_DIR/.ssh/host_key user@domain.com
The authenticity of host '[domain.com]:21098 ([124.219.148.93]:21098)' can't be established.
RSA key fingerprint is 12:15:79:55:c6:2a:66:1e:82:94:da:19:e1:ca:21:3d.
Are you sure you want to continue connecting (yes/no)? yes
Failed to add the host to the list of known hosts (/var/lib/openshift/541b685c5973cae7bbf006f4/.ssh/known_hosts).
Connection closed by 124.219.148.93
I suppose we do not have access to the home .ssh directory. So how can I solve this problem?
One can pass options to SSH on the command line, like this:
ssh -o UserKnownHostsFile=/tmp/known_host_file -p 21098 -i $OPENSHIFT_DATA_DIR/.ssh/host_key user@domain.com
Here is a related answer: ssh use known_hosts other than $HOME/.ssh/known_hosts
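If you would rather not repeat the option on every call, here is a sketch that keeps it in a small client config under $OPENSHIFT_DATA_DIR (assuming that directory is writable, as it is for the key in the question) and passes it to ssh with -F:
# one-time: store the option in a config file in the writable data directory
echo "UserKnownHostsFile /tmp/known_host_file" > $OPENSHIFT_DATA_DIR/.ssh/config
# connect using that config file
ssh -F $OPENSHIFT_DATA_DIR/.ssh/config -p 21098 -i $OPENSHIFT_DATA_DIR/.ssh/host_key user@domain.com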

Connecting with ssh from OSX to VHM centos server

I have the following situation:
VHM cpanel server (using centos)
pc mac: OS X Mavericks
I'm trying to set up an SSH connection from my PC to the VHM cPanel server.
I've made the following steps:
In OS X:
I've generated a public/private key pair like this:
$ cd ~/.ssh
$ ssh-keygen -t rsa
I've successfully generated the following, protected with a passphrase:
id_rsa
id_rsa.pub
IN VHM:
SSH Password Authorization Tweak OFF
Manage root’s SSH Keys > Import key
I've pasted my id_rsa.pub key into the Public Key box.
I authorised the key
IN OS X Terminal:
$ ssh 111.111.111.11
(where 111.111.111.11 is the server address)
enter password: xxxxxx
Permission denied, please try again.
I've also tried
$ ssh root@111.111.111.11
but with the same result.
Am I doing something wrong?
Are these the correct steps to give SSH access?
Try putting your key in a .txt file and logging in with this command:
ssh -i key.txt -l root <IP>
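One detail worth adding: ssh ignores private key files that are readable by others, so tighten the permissions first. A sketch, using the placeholder address from the question:
# the private key must not be world-readable, or ssh will refuse to use it
chmod 600 key.txt
# -i selects the key file, -l sets the login user
ssh -i key.txt -l root 111.111.111.11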

How do I set up passwordless ssh on AWS

How do I set up passwordless SSH between nodes on an AWS cluster?
The following steps to set up passwordless authentication have been tested thoroughly on CentOS and Ubuntu.
Assumptions:
You already have access to your EC2 machine, either using the .pem key or with credentials for a Unix user that has root permissions.
You have already set up RSA keys on your local machine. The private key and public key are available at "~/.ssh/id_rsa" and "~/.ssh/id_rsa.pub" respectively.
Steps:
Log in to your EC2 machine as the root user.
Create a new user:
useradd -m <yourname>
sudo su <yourname>
cd
mkdir -p ~/.ssh
touch ~/.ssh/authorized_keys
Append the contents of the file ~/.ssh/id_rsa.pub on your local machine to ~/.ssh/authorized_keys on the EC2 machine (a sketch of one way to do this follows these steps).
chmod -R 700 ~/.ssh
chmod 600 ~/.ssh/*
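A sketch of that append, run in the same EC2 session as the new user; the key string below is a placeholder for the single line in your local ~/.ssh/id_rsa.pub:
# paste your local public key as one line into the new user's authorized_keys
echo "ssh-rsa AAAA...your-public-key... you@your-laptop" >> ~/.ssh/authorized_keys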
Make sure SSH login is permitted by the machine. In the file /etc/ssh/sshd_config, make sure the line containing "PasswordAuthentication yes" is uncommented. Restart the sshd service if you make any change in this file:
service sshd restart # On Centos
service ssh restart # On Ubuntu
Your passwordless login should work now. Try the following on your local machine:
ssh -A <yourname>@ec2-xx-xx-xxx-xxx.ap-southeast-1.compute.amazonaws.com
Making yourself a superuser: open /etc/sudoers and make sure the following two lines are uncommented:
## Allows people in group wheel to run all commands
%wheel ALL=(ALL) ALL
## Same thing without a password
%wheel ALL=(ALL) NOPASSWD: ALL
Add yourself to the wheel group:
usermod -aG wheel <yourname>
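As a quick check (run as the new user after logging in again, a sketch only), sudo should now work without asking for a password:
# should print "root" with no password prompt
sudo whoami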
This may help someone.
Copy the .pem file onto the machine, then copy the content of the .pem file into the ~/.ssh/id_rsa file. You can use the command below, or your own:
cat my.pem > ~/.ssh/id_rsa
Try ssh localhost; it should work, and the same goes for the other machines in the cluster.
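One caveat worth sketching here: ssh refuses private keys that are group- or world-readable, so after copying the .pem contents you will likely also need:
# restrict the copied key, otherwise ssh rejects it with an "UNPROTECTED PRIVATE KEY FILE" warning
chmod 600 ~/.ssh/id_rsa
ssh localhost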
How I made passwordless SSH work between two instances is the following:
Create EC2 instances; they should be in the same subnet and have the same security group.
Open ports between them; make sure the instances can communicate with each other. Use the default security group, which has one rule relevant for this case:
Type: All Traffic
Source: Custom – id of the security group
Log in to the instance from which you want to connect to the other instance.
Run:
ssh-keygen -t rsa -N "" -f /home/ubuntu/.ssh/id_rsa
to generate a new rsa key.
Copy your private AWS key as ~/.ssh/my.key (or whatever name you want to use)
Make sure you change its permissions to 600:
chmod 600 ~/.ssh/my.key
Copy the public key to the instance you wish to connect to without a password:
cat ~/.ssh/id_rsa.pub | ssh -i ~/.ssh/my.key ubuntu@10.0.0.X "cat >> ~/.ssh/authorized_keys"
If you now test passwordless SSH to the other machine, it should work:
ssh 10.0.0.X
You can use SSH keys as described here:
http://pkeck.myweb.uga.edu/ssh/