I'm running a Google Cloud instance, and I'm able to connect to it via SSH successfully.
But I'm not able to set up port forwarding to my localhost.
Here's the command I used:
ssh -L 16006:127.0.0.1:8080 username@instance_external_ip
When I run the above command, I get the following error:
The authenticity of the host cannot be determined.
username@instance_external_ip: Permission denied (publickey)
How do I solve this problem?
I found the answer to this question. The problem I had was that the server did not know my SSH keys. So I did the following and it worked.
I deleted all the SSH keys on my local machine and connected to my gcloud instance using the following command. The gcloud command creates the SSH keys automatically and transfers them to the cloud, so there is no need to copy and paste the keys manually.
gcloud compute --project "project_name" ssh --zone "zone_name" "instance_name"
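After this command runs, the key pair that gcloud generates should be visible in your local .ssh directory (a quick check; google_compute_engine is the default name gcloud uses):
ls ~/.ssh/google_compute_engine*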
After this I connected to my instance using ssh. If you try to open an SSH tunnel before doing this, the server won't recognize your key, and running ssh -L ... will fail with permission denied.
Therefore, instead of connecting directly with ssh -L ..., also pass the SSH key file stored in the .ssh directory. Use the following command.
ssh -i ~/.ssh/google_compute_engine -L <local_port>:127.0.0.1:<remote_host_port> username@server_ip
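For example, with the ports from the question above (16006 locally, 8080 on the instance):
ssh -i ~/.ssh/google_compute_engine -L 16006:127.0.0.1:8080 username@instance_external_ip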
I had been using VSCode's Remote-SSH to access my virtual machines running on Google Cloud. This had been working perfectly fine until I made a snapshot of my most recent instance and created a new instance from it on a larger VM. Now when I try to connect (through any method) I get: "Permission denied (publickey).". I have spent countless hours deleting, re-adding, and recreating my SSH keys to no avail. Before, I simply ran "gcloud compute config-ssh" and this created a working config file, but now this no longer works. Please help, I have tried everything and there is simply no way for me to ssh. On the website I can click the SSH button to open up their shell, but I cannot do it from my terminal.
The problem may be that your SSH private key is not being presented during the connection in VSCode. You can point to your private key by adding an IdentityFile option to the Host entry for this connection in your SSH configuration file:
Host vm_name
HostName external_ip
IdentityFile /path/to/ssh_private_key
Port port_number
Here is the long story, if you or someone else needs more information.
You can start from scratch to ensure that your SSH keys have not been compromised and that they are not the origin of the problem.
Create SSH Key
First, create new SSH keys. On the computer that you will use to access your remote host, that is, your Google VM instance, open your terminal or cmd and go to the .ssh folder to generate the keys.
My ssh config and keys are under my user directory, /home/my_user/.ssh on Linux or C:\Users\my_user\.ssh on Windows.
Then cd into one of these paths, depending on which operating system you are using at the moment.
Linux:
cd /home/my_user/.ssh
Windows:
cd C:\Users\my_user\.ssh
Command to generate SSH key
ssh-keygen -t rsa -f my_ssh_key -C user
my_ssh_key: the name of your key; use whatever name identifies it best.
user: must be the username that you want to use to connect to your Google VM instance.
This will generate a private key named my_ssh_key and a public key named my_ssh_key.pub.
Alternatively, run the command from any location and pass the absolute path where the keys should be generated:
Linux:
ssh-keygen -t rsa -f /home/my_user/.ssh/my_ssh_key -C user
Windows:
ssh-keygen -t rsa -f C:\Users\my_user\.ssh\my_ssh_key -C user
Copy the public key into your Google Cloud VM's authorized_keys file:
/home/my_user/.ssh/authorized_keys
Do not overwrite any public key that already exists; just append yours to the authorized_keys file.
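One way to append safely, assuming you have already copied my_ssh_key.pub onto the VM (the paths match the ones used above):
cat my_ssh_key.pub >> /home/my_user/.ssh/authorized_keys
The >> operator appends to the file rather than replacing its contents, so existing keys are preserved.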
Add a new SSH Host entry for the remote connection
Click on the Remote SSH manager icon at the bottom of the VS Code window, click on the Remote SSH: Open Configuration File option, and choose your SSH configuration file to add another SSH entry for the remote connection.
The config file must be under the .ssh directory, the same path used in the key-generation step.
Linux:
/home/my_user/.ssh/config
Windows:
C:\Users\my_user\.ssh\config
To add another Host, write the following, making the appropriate changes:
Host vm_name
HostName external_ip
IdentityFile /path/to/ssh_private_key
Port port_number
vm_name: an alias used to connect conveniently with the ssh command; it can be whatever you want.
external_ip: the external IP of your Google VM instance; you can get it from the VM instances panel at https://console.cloud.google.com/
IdentityFile: the path to your private SSH key, the generated file that does not have the .pub extension.
Linux:
/home/my_user/.ssh/my_ssh_key
Windows:
C:\Users\my_user\.ssh\my_ssh_key
Port: the SSH port number of your Google VM instance; 22 is the default port.
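For example, a completed entry might look like this (all values here are illustrative placeholders):
Host my-gcp-vm
HostName 35.200.10.20
IdentityFile /home/my_user/.ssh/my_ssh_key
Port 22
With this entry in place, ssh my-gcp-vm on the command line reaches the same machine.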
Now just choose this host to connect to your Google VM instance.
For more details about SSH settings on Google Cloud Platform: https://cloud.google.com/compute/docs/instances/adding-removing-ssh-keys#linux-and-macos_1
I am using the ssh command to connect to one of the nodes in a cluster from the master node (Google Cloud Shell):
$ kubectl get nodes
$ kubectl describe node gke-hello-server-default-pool-03b44665-ng8w
I selected the external IP and then tried using
ssh 35.247.97.140
Permission denied (publickey).
Then I tried again as below, where hostname is the node name:
ssh -i ~/.ssh/id_rsa.pub hostname@35.247.97.140
Permission denied (publickey).
But in both cases I am getting permission denied.
The easiest way would be to use
gcloud compute ssh --project [PROJECT_ID] --zone [ZONE] [INSTANCE_NAME]
to connect to a Linux instance.
There is documentation available on Google Cloud about Connecting to instances.
And if for some reason that does not work, you should navigate to GCP > Compute Engine > VM instances, choose the instance you wish to connect to, and click on the SSH button.
This will connect you as your home user.
Then you can add a new user to the system and generate an SSH key for it.
Here is documentation about Managing SSH keys in metadata.
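A sketch of what adding a key through instance metadata can look like with gcloud (the instance name, zone, and file name are placeholders; note that this replaces the instance's existing ssh-keys metadata value, so keep any entries you still need in the file):
# keys.txt contains one entry per line, in the form USERNAME:PUBLIC_KEY
gcloud compute instances add-metadata my-instance --zone us-central1-a --metadata-from-file ssh-keys=keys.txt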
I am attempting to connect (via SSH) one GCE VM instance to another GCE VM instance (which will be referred to as Machine 1 and Machine 2 from now on).
So far I have generated (via ssh-keygen -t rsa -f ~/.ssh/ssh_key) a public and private key on Machine 1, and have added the contents of ssh_key.pub to the ~/.ssh/authorized_keys file on Machine 2.
However, whenever I try to connect from Machine 1 using the following command: gcloud compute ssh --project [PROJECT_ID] --zone [ZONE] [Machine_2_Name], it simply times out (Connection timed out. ERROR: (gcloud.compute.ssh) [/usr/bin/ssh] exited with return code [255].)
I have double-checked that each VM instance has plenty of disk space, their firewall settings are permissive, and OS Login is not enabled. I have read through the answer here but nothing is working.
What am I doing wrong? How do I properly SSH from one GCE VM instance to another?
The problem I was having was that each VM was using a different network/sub-network with different firewall configurations. After recreating one on the same network/sub-network, I was able to easily ssh from one into the other via
username@machine1:~$ ssh machine2
I tested the same scenario on my side and got the same result you describe. Then I ran this command inside the machine to debug the SSH process and try to narrow down the issue:
gcloud compute ssh YOUR_INSTANCE_NAME --zone ZONE --ssh-flag="-vvv"
Then I got this result:
debug1: connect to address 35.x.x.x port 22: Connection timed out
ssh: connect to host 35.x.x.x port 22: Connection timed out
So that means instance 1 is unable to reach the external IP address of instance 2. I only added a new firewall rule and it worked.
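A minimal sketch of such a rule with gcloud (the rule name is arbitrary, and 0.0.0.0/0 opens port 22 to everyone, so narrow --source-ranges for anything beyond a quick test):
gcloud compute firewall-rules create allow-ssh --allow tcp:22 --source-ranges 0.0.0.0/0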
If, after running the debug command above, you see a permission denied message instead of a timeout, it means the public key was not copied to the target machine properly.
I am trying to ssh into a VM from another VM in Google Cloud using the gcloud compute ssh command. It fails with the below message:
/usr/local/bin/../share/google/google-cloud-sdk/./lib/googlecloudsdk/compute/lib/base_classes.py:9: DeprecationWarning: the sets module is deprecated
import sets
Connection timed out
ERROR: (gcloud.compute.ssh) [/usr/bin/ssh] exited with return code [255]. See https://cloud.google.com/compute/docs/troubleshooting#ssherrors for troubleshooting hints.
I made sure the ssh keys are in place but still it doesn't work. What am I missing here?
This assumes that you have already connected to the externally visible instance via SSH with gcloud beforehand.
From your local machine, start ssh-agent with the following command to manage your keys for you:
me@local:~$ eval `ssh-agent`
Call ssh-add to load the gcloud compute keys from your local computer into the agent, so that they are used for authentication in all SSH commands:
me@local:~$ ssh-add ~/.ssh/google_compute_engine
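You can confirm the key is loaded by listing the agent's key fingerprints (the same check used further down in this thread):
me@local:~$ ssh-add -l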
Log into an instance with an external IP address while supplying the -A argument to enable authentication agent forwarding.
gcloud compute ssh --ssh-flag="-A" INSTANCE
source: https://cloud.google.com/compute/docs/instances/connecting-to-instance#sshbetweeninstances.
I am not sure about the 'flags', because they are not working for me, but maybe I have a different OS or gcloud version and they will work for you.
Here are the steps I ran on my Mac to connect to the Google Dataproc master VM and then hop onto a worker VM from the master VM. I ssh'd into the master VM to get the IP.
$ gcloud compute ssh cluster-for-cameron-m
Warning: Permanently added '104.197.45.35' (ECDSA) to the list of known hosts.
I then exited. I enabled forwarding for that host.
$ nano ~/.ssh/config
Host 104.197.45.35
ForwardAgent yes
I added the gcloud key.
$ ssh-add ~/.ssh/google_compute_engine
I then verified that it was added by listing the key fingerprints with ssh-add -l. I reconnected to the master VM and ran ssh-add -l again to verify that the keys were indeed forwarded. After that, connecting to the worker node worked just fine.
ssh cluster-for-cameron-w-0
About using SSH Agent Forwarding...
Because instances are frequently created and destroyed in the cloud, the (recreated) host fingerprint keeps changing. If the new fingerprint doesn't match the one in ~/.ssh/known_hosts, SSH automatically disables agent forwarding. The solution is:
$ ssh -A -o UserKnownHostsFile=/dev/null ...
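A fuller invocation might look like this (the user and host are placeholders); adding StrictHostKeyChecking=no as well suppresses the interactive prompt for the changed key, at the cost of skipping host verification entirely:
ssh -A -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no username@instance_external_ip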
I'd like to run a rsync command from a vagrant vm to a remote server (to push files) without the need for a password.
So, the involved machines are: host, guest vm and remote
The host is authorized on the remote via authorized_keys; however, when I run the rsync command from the VM I am asked for a password.
Is there a way to get passwordless rsync from the vm using the keys on the already-authorized host?
I'd like to avoid copying a new authorized key to the remote every time I create a vm.
Also, adding my server's password in the vagrant file is not an option.
Use ssh key forwarding via ssh-agent. Follow these steps:
On your host machine:
ssh-add PATH_TO_KEY (use Tab completion if unsure of the path)
vagrant ssh
In the vagrant box edit your ~/.ssh/config:
Host name_or_ip_of_remote
ForwardAgent yes
Now try to connect to the remote from the vagrant box:
ssh name_or_ip_of_remote
It should work without a password. As rsync uses ssh under the hood, it will work without a password too.
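For instance, a push from the VM might then look like this (the paths are placeholders):
rsync -avz /path/to/local/files user@name_or_ip_of_remote:/path/to/destination
rsync invokes ssh for the transport, so the forwarded agent supplies the key exactly as it does for a plain ssh login.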