SSH to GitHub not working - ssh

SSH has been working fine for the last few weeks since I got my new PC. I've had no problems but today I started getting:
ssh: connect to host github.com port 22: resource temporarily unavailable
I did some googling and found that there is a known issue with WSL that sometimes causes this, but the failure isn't limited to WSL: I'm unable to SSH from my bash shell, or from cmd/PowerShell.
This is the part that confuses me: if I run ssh -T git@192.30.253.113, I am prompted for the passphrase to my key, it successfully authenticates, and it responds with "Hi alexmk92! You've successfully authenticated".
Great, that at least proves that my firewall isn't blocking SSH on port 22. But why does git@github.com throw the "resource temporarily unavailable" error? My initial thought was that this could be a DNS problem.
So I configured my network adapter to use Google's DNS servers (8.8.8.8 and 8.8.4.4); I even configured the IPv6 DNS servers just in case. After this I ran ipconfig /flushdns and attempted to connect via git@github.com again, and BAM, the same result. However, git@192.30.253.113 still works.
I'm guessing another potential cause is that github.com is behind a load balancer and one of the IPs in the cluster could be blacklisted somewhere on my machine? I'm just pulling guesses out of thin air now; any help would be greatly appreciated, this is driving me insane.
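One way to test the DNS theory directly is to resolve the name explicitly and then let ssh report which address it actually tries; a quick sketch (querying 8.8.8.8 directly bypasses whatever resolver the adapter is configured with):
nslookup github.com
nslookup github.com 8.8.8.8
ssh -v -T git@github.com
If the two nslookup calls disagree, or ssh -v never gets past its "Connecting to github.com" line, the problem is name resolution rather than the SSH layer.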

After some further Googling it turned out that my machine had no hosts entry for github.com and was unable to resolve it automatically.
In Windows Subsystem for Linux I created an ssh config file:
touch ~/.ssh/config
(for some reason the base Ubuntu 18.04 distro from the Windows marketplace didn't have one). I then had to make sure the file permissions were correct:
chmod 755 ~/.ssh/config
Once the file was created, I edited it with
sudo nano ~/.ssh/config
and added github.com as a Host.
Host github.com
Hostname ssh.github.com
Port 22
Upon saving, I ran
sudo /etc/init.d/ssh restart
and attempted
ssh -T git@github.com
Everything now seems to be working.
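For what it's worth, GitHub also documents an SSH-over-HTTPS fallback for networks that block port 22 outright (see GitHub's "Using SSH over the HTTPS port" page); if a config like the one above doesn't help, this variant is worth trying:
Host github.com
Hostname ssh.github.com
Port 443
User git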

In my case my ISP did not allow SSH, so it was not working from both cmd and WSL. I got around it using a VPN.

To make a successful SSH connection to GitHub, your SSH public key has to be imported into GitHub first (a command-line sketch of these steps follows below):
Open Git Bash or a terminal
Run the command ssh-keygen
Accept all the default options
A private and a public key get generated in the folder <user_home>/.ssh/
Log in to GitHub.com
Navigate to your account settings
Choose "SSH and GPG keys" from the side navigation bar
Click "New SSH key"
Copy in the public key content from <user_home>/.ssh/id_rsa.pub and save
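As promised, the same steps as commands; the -C comment is just a label and the email address is a placeholder (newer clients may default to ed25519 rather than RSA):
ssh-keygen -t rsa -b 4096 -C "you@example.com"
Print the public key so it can be pasted into the GitHub form:
cat ~/.ssh/id_rsa.pub
Then verify the connection:
ssh -T git@github.com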

Related

kex_exchange_identification: Connection closed by remote host

I want to connect to my shared hosting over SSH. So I generated an SSH key in the SSH section of cPanel and authorized it. Then I downloaded the private key and dropped it in the ~/.ssh folder of my MacBook. I used this command to connect to my host:
ssh -p 2083 username@<host IP>
but I got this error:
kex_exchange_identification: Connection closed by remote host
How can I solve my problem?
I ran into a similar case with a small computer I have on my desk. What I did to debug the issue was to run sshd -t, which checks the validity of the sshd configuration and the sanity of the keys without starting the daemon. This command reported that the permissions of my keys were invalid. All I had to do then was go to the folder where the keys are stored and run chmod 0600 <your_ssh_keys>.
Maybe the action you ran generated things with the wrong permissions too.
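For reference, these are the modes ssh and sshd typically expect; the paths assume the default locations:
chmod 700 ~/.ssh
chmod 600 ~/.ssh/id_rsa
chmod 644 ~/.ssh/id_rsa.pub
chmod 600 ~/.ssh/authorized_keys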
I got this error when using the docker command with a remote host:
docker -H ssh://user@server compose up
After some digging I found this in the auth logs on my remote server (/var/log/auth.log):
Aug 8 14:51:46 user sshd[1341]: error: beginning MaxStartups throttling
Aug 8 14:51:46 user sshd[1341]: drop connection #10 from [some_ip]:32992 on [some_ip]:22 past MaxStartups
This led me to change the MaxStartups setting in /etc/ssh/sshd_config. After restarting the ssh service, everything worked like a charm.
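If you hit the same throttling, the line to adjust in /etc/ssh/sshd_config looks like the following; the format is start:rate:full (the default is 10:30:100), and the values below are only illustrative:
MaxStartups 30:30:100
followed by a restart of the service:
sudo systemctl restart ssh
(the unit is called sshd rather than ssh on some distributions).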
I had the same problem, and it happened because I use ProxyCommand in my ssh config file. In my case the Host was not defined correctly, which then caused the same error! A sketch of a correct setup follows.
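For anyone checking their own config, a working ProxyCommand setup usually looks something like this; all host names and users here are placeholders, and the key point is that the jump host referenced by ProxyCommand must itself be a correctly defined Host entry:
Host jumphost
HostName jump.example.com
User me
Host target
HostName 10.0.0.5
User me
ProxyCommand ssh -W %h:%p jumphost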

How to specify RemoteForward in the ssh config file?

I'm trying to set up an ssh tunnel with remote port forwarding. The idea is to have a VPS act as a means to ssh into remotely deployed systems (which currently include a Raspberry Pi). Everything seems to work, but I run into issues when trying to move all the arguments into the ~/.ssh/config file.
What does work is setting the HostName, User, Port and IdentityFile. However, setting the RemoteForward parameter does not seem to work.
The following works:
ssh -R 5555:localhost:22 ssh-tunnel
However, when using the following lines in the config file:
Host ssh-tunnel
...
RemoteForward 5555 localhost:22
the following command returns the message "Bad remote forwarding specification 'ssh-tunnel'":
ssh -R ssh-tunnel
Obviously, I found the answer almost immediately after posting the question. Using the -R flag requires you to give the remote forwarding specification on the command line; since the forwarding is already set in the config file, you shouldn't add it to the command at all. However, something confusing occurs: aside from setting up the tunnel, you also ssh into the remote server. To avoid this, add the -f and -N flags. This results in the following command:
ssh -f -N ssh-tunnel
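Putting it all together, a complete config entry might look like this; the host name, user, and key path are placeholders:
Host ssh-tunnel
HostName vps.example.com
User pi
Port 22
IdentityFile ~/.ssh/id_rsa
RemoteForward 5555 localhost:22
After which the tunnel is brought up in the background, without opening a remote shell, by:
ssh -f -N ssh-tunnel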

Host key verification failed on first ssh connection

I'm trying to log in with ssh to my EC2 instance from a newly dual-booted Ubuntu 16.04. It's the first time I'm logging in from this client, so there is nothing in .ssh/known_hosts to be deleted, as is suggested in many other posts like this one.
When I run :
ssh -i "my_key.pem" ubuntu#servername.amazonaws.com
I get:
The authenticity of host 'servername.amazonaws.com (serverip)' can't be established.
ECDSA key fingerprint is SHA256:***************************.
Are you sure you want to continue connecting (yes/no)?
Host key verification failed.
Since I can log in with the exact same key from PuTTY on my Windows computer, and also from a Mac with the same key, this doesn't seem to be key-related.
Anyone out there to help? Thanks in advance!
EDIT: I installed PuTTY on Linux, since it was working on Windows. That doesn't work either.
nmap localhost shows port 22 open.
nmap my.ip doesn't.
I tried to ssh to another address, and got the same results with ssh and PuTTY :(
EDIT2: not a duplicate of BitBucket: Host key authentication failed
Problem solved: it was just me, who only pressed Enter at "Are you sure you want to continue connecting (yes/no)?" instead of typing yes. Thanks @Kenster
If you do not get any option to continue connecting and it fails permanently, you could use the command with the StrictHostKeyChecking=no option, like the following:
ssh -i "my_key.pem" ubuntu@servername.amazonaws.com -o StrictHostKeyChecking=no

Vagrant ssh connect to host 127.0.0.1:2222 port 22: Bad file number

Whenever I try to connect to my local Vagrant box, I get this error when I run ssh vagrant@127.0.0.1:2222 from the Windows Git Bash:
ssh: connect to host 127.0.0.1:2222 port 22: Bad file number
It was working previously, so I'm not sure what could have caused this. When I try to do an SFTP connection in PHPStorm 8, I get this error:
Connection to '127.0.0.1' failed.
SSH_MSG_DISCONNECT: 2 Too many authentication failures for vagrant
I've tried vagrant destroy together with vagrant box remove laravel/homestead, and then recreating the box from a previously working backup using vagrant box add laravel/homestead homestead.box, but I still get the same errors.
I'm on Windows 7.
What can I do to get access to my vagrant box commandline again?
ssh does not understand the host:port syntax used above; the port has to be given with the -p flag. Try this command:
ssh -p 2222 vagrant@127.0.0.1
The answer by outboundexplorer above is the correct one, I believe. Here is my step-by-step approach to how I did this:
Step 1: Find out exactly what SSH settings to use
Ensure the vagrant box is running (that is, you've done vagrant up)
From the command line, go to your project directory (the one where the Vagrantfile is located) and run vagrant ssh-config.
You'll get an output like this:
Host default
HostName 127.0.0.1
User ubuntu
Port 2222
UserKnownHostsFile /dev/null
StrictHostKeyChecking no
PasswordAuthentication no
IdentityFile C:/Projects/my-test-project/.vagrant/machines/default/virtualbox/private_key
IdentitiesOnly yes
LogLevel FATAL
Step 2: Setting up PHPStorm to SFTP to the Vagrant box
Based on the config settings shown above, I set up the following SFTP remote deployment server:
SFTP host: 127.0.0.1
Port: 2222
Root path: /home/ubuntu/my-test-project (this is the folder inside the Vagrant box where the files will be uploaded to, change to whatever suits your needs)
User name: ubuntu
Auth type: Select "Key pair (OpenSSH or PuTTY)"
Private key file: Point to the IdentityFile path shown (C:/Projects/....)
... and that was it.
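If you want plain ssh (rather than the IDE) to pick up the same settings, the values from vagrant ssh-config can be dropped into ~/.ssh/config; the alias vagrant-default below is just an illustrative name:
Host vagrant-default
HostName 127.0.0.1
User ubuntu
Port 2222
UserKnownHostsFile /dev/null
StrictHostKeyChecking no
IdentityFile C:/Projects/my-test-project/.vagrant/machines/default/virtualbox/private_key
After which ssh vagrant-default connects directly.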
I got this same failure when using PhpStorm to SSH into the VirtualBox guest machine that I had set up with Vagrant. Everything worked fine before I upgraded to Windows 10. After upgrading, I first had to upgrade VirtualBox and Vagrant to their latest versions to get everything to work on Windows 10.
But then I couldn't ssh into the guest machine using the PhpStorm ssh client. After much reading, everything seemed to suggest that I had too many ssh keys installed on my Windows machine, but checking regedit showed that I only had a couple of keys, which should be fewer than the suggested maximum of 5 keys (the default). In the end I ran vagrant ssh, which didn't let me ssh into the guest machine, but it did reconfirm the ssh details for me. I then realized that after all the new installs it didn't want me to use the C:\Users\Andy\.vagrant.d\insecure_private_key key, but instead a key that it had placed within the project itself at C:/Users/Andy/CodeLab5/vagrant/.vagrant/machines/default/virtualbox/private_key.
Everything is working as it should again now :)
Make sure your vagrant box is up and running with the command vagrant up, and then do vagrant ssh. It will connect to the vagrant box on localhost.

Open MPI can't launch remote nodes via SSH

I am trying to set up Open MPI between a few machines on our network.
Open MPI works fine locally, but I just can't get it to work on a remote node.
I can ssh into the remote machine (without password) just fine, but if I try something like
mpiexec -n 4 --host remote.host hello_c
then the ssh connection just times out.
I checked several tutorials, but the only configuration instruction they give is "make sure you can ssh into the remote machine without a password". I did, and I still can't launch processes on remote machines. What's the problem?
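Two things worth checking that the tutorials tend to skip (remote.host is the placeholder from the question): Open MPI starts its helper daemon over a non-interactive ssh session, so the remote side must be able to find the Open MPI binaries without a login shell, and the daemon needs to open TCP connections back to the launching machine. A quick sketch, assuming an Open MPI 4.x install whose daemon is named orted:
ssh remote.host which orted
If that prints nothing, add the Open MPI bin directory to the PATH set in the remote ~/.bashrc (before any early exit for non-interactive shells), or pass the install location to mpiexec with its --prefix option. Verbose output from the process launcher can also show where the startup hangs:
mpiexec -n 4 --host remote.host --mca plm_base_verbose 10 hello_c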
I have the same issue. Try connecting over ssh with RSA certificates.
Edit 03/24: This does not work, sorry.