Unable to connect to worker node using ssh in google cloud - ssh

I am using the ssh command to connect to one of the nodes in my cluster from the master node (Google Cloud Shell):
$ kubectl get nodes
$ kubectl describe node gke-hello-server-default-pool-03b44665-ng8w
I selected the external IP and then tried using
ssh 35.247.97.140
Permission denied (publickey).
and then tried again with my key file and the node name:
ssh -i ~/.ssh/id_rsa.pub hostname@35.247.97.140
Permission denied (publickey).
But in both cases I am getting permission denied.

The easiest way would be to use
gcloud compute ssh --project [PROJECT_ID] --zone [ZONE] [INSTANCE_NAME]
to connect to a Linux instance.
There is documentation available about Connecting to instances.
If for some reason that does not work, you can navigate to GCP > Compute Engine > VM instances, choose the instance you wish to connect to, and click the SSH button.
This will connect you as your default user.
From there you can add a new user to the system and generate an SSH key for it.
Here is the documentation about Managing SSH keys in metadata.
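As a rough sketch, adding a new user's key through instance metadata could look like this (the user name, key file, instance, and zone below are placeholders, not values from the question):
# Generate a key pair for the new user
ssh-keygen -t rsa -f ~/.ssh/newuser-key -C newuser
# Write it in the USERNAME:KEY format that Compute Engine metadata expects
echo "newuser:$(cat ~/.ssh/newuser-key.pub)" > /tmp/newuser-ssh-key.txt
# Attach the key to the instance metadata (note: this replaces any existing ssh-keys metadata on the instance)
gcloud compute instances add-metadata [INSTANCE_NAME] --zone [ZONE] --metadata-from-file ssh-keys=/tmp/newuser-ssh-key.txt
# Connect as the new user
ssh -i ~/.ssh/newuser-key newuser@[EXTERNAL_IP]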

Related

How do I SSH into gcloud compute instance as ubuntu user?

We have a VM instance in Google Cloud that is used by two users. We want to create another user that can be accessed by both. We chose the user ubuntu for that, like in AWS. But the problem is we cannot log in when running the following command.
gcloud compute --project "project" ssh --zone "us-east1-b" "gpunew3"
It's showing the following error.
ubuntu@35.196.254.72: Permission denied (publickey).
ERROR: (gcloud.compute.ssh) [/usr/bin/ssh] exited with return code [255].
What is a possible way to achieve this?
I suspect a key pair mismatch. Please run the following command to remove the private key file:
rm ~/.ssh/google_compute_engine
Then run your ssh command again, which will recreate a new key pair.
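A rough sketch of the whole sequence (the project, zone, and instance names are placeholders):
# Remove the stale key pair
rm ~/.ssh/google_compute_engine ~/.ssh/google_compute_engine.pub
# Reconnect; gcloud generates and propagates a fresh key pair
gcloud compute ssh --project [PROJECT] --zone [ZONE] [INSTANCE]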
I had the same problem before, and I fixed it by adding my public key to the VM instance and then connecting to it over ssh like this:
ssh [VM_EXTERNAL_IP]
https://cloud.google.com/compute/docs/instances/adding-removing-ssh-keys

Permission Denied (public key)

I'm running a google cloud instance. I'm able to successfully connect to the instance via ssh.
But I'm not able to do the port forwarding to my localhost.
Here's the command I used:
ssh -L 16006:127.0.0.1:8080 username@instance_external_ip
When I run the above command , I get the following error
The authenticity of the host cannot be determined.
username@instance_external_ip: Permission Denied (public key)
How to solve this problem?
I found the answer to this question. The problem I had was that the server did not know my SSH keys. So I did the following and it worked.
I deleted all the SSH keys on my local machine and connected to my gcloud instance using the following command. The gcloud command creates the SSH keys automatically and transfers them to the cloud, so there is no need to manually copy and paste the keys.
gcloud compute --project "project_name" ssh --zone "zone_name" "instance_name"
After this I connected to my instance using ssh. If you try to set up the SSH tunnel before doing this, the server won't know your local machine's key and will report permission denied when you run ssh -L ....
Therefore, instead of connecting directly with ssh -L ..., pass the SSH key file stored in the .ssh directory. Use the following command:
ssh -i ~/.ssh/google_compute_engine -L <local_port>:127.0.0.1:<remote_port> username@server_ip
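For example, with the ports from the question above (the username and IP are placeholders):
ssh -i ~/.ssh/google_compute_engine -L 16006:127.0.0.1:8080 username@instance_external_ip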

Google DataProc Spark - getting "permission denied (publickey)" error when trying to SSH to a worker node

I have a small cluster: 1 master, 2 workers. I can access all nodes (master and workers) just fine using the gcloud SDK. However, once I am on the master node and try to ssh to a worker node, I get a "permission denied (publickey)" error. Note that I can ping the node successfully, but SSH does not work.
Dataproc does not install SSH keys between the master and worker nodes, so that is working as intended.
You may be able to use SSH agent forwarding. With something like:
# Add Compute Engine private key to SSH agent
ssh-add ~/.ssh/google_compute_engine
# Forward key to SSH agent of master
gcloud compute ssh --ssh-flag="-A" [CLUSTER]-m
# SSH into worker
ssh [CLUSTER]-w-0
You could also configure SSH keys using an initialization action, or use gcloud compute ssh from the master node (if you gave the cluster the compute-rw scope).
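For example, from the master node (assuming the cluster was created with the compute-rw scope; the cluster name and zone are placeholders):
gcloud compute ssh [CLUSTER]-w-0 --zone [ZONE]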

Can I use autossh for gcloud compute ssh?

For some reason ssh doesn't work to set up a tunnel to my Google Compute Engine instance. I have to use gcloud compute ssh. I'd really like to set up a persistent/resilient tunnel, like one gets with autossh. Is there any way I can do so using gcloud compute ssh?
gcloud compute ssh simply copies your ssh key to the project sshKeys metadata (see Cloud Console > Compute Engine > Metadata > SSH Keys) and runs standalone SSH with the ~/.ssh/google_compute_engine key. To see the exact command line invoked, run gcloud compute ssh --dry-run .... Anything that's possible with typical SSH is possible with gcloud compute ssh.
Another option to investigate is gcloud compute config-ssh, which syncs your ~/.ssh/google_compute_engine SSH key to the project and sets up your ~/.ssh/config file so that you can run ssh without gcloud.
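As a rough sketch, after running gcloud compute config-ssh you could point autossh at the host alias it creates (the INSTANCE.ZONE.PROJECT alias and the ports below are illustrative):
# Keep a tunnel from local port 16006 to the instance's port 8080 alive
autossh -M 0 -o "ServerAliveInterval 30" -o "ServerAliveCountMax 3" -N -L 16006:127.0.0.1:8080 my-instance.us-central1-a.my-project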

gcloud compute ssh from one VM to another VM on Google Cloud

I am trying to ssh into a VM from another VM in Google Cloud using the gcloud compute ssh command. It fails with the below message:
/usr/local/bin/../share/google/google-cloud-sdk/./lib/googlecloudsdk/compute/lib/base_classes.py:9: DeprecationWarning: the sets module is deprecated
import sets
Connection timed out
ERROR: (gcloud.compute.ssh) [/usr/bin/ssh] exited with return code [255]. See https://cloud.google.com/compute/docs/troubleshooting#ssherrors for troubleshooting hints.
I made sure the ssh keys are in place but still it doesn't work. What am I missing here?
This assumes that you have already connected to the externally-visible instance with gcloud compute ssh beforehand.
From your local machine, start ssh-agent with the following command to manage your keys for you:
me@local:~$ eval `ssh-agent`
Call ssh-add to load the gcloud compute keys from your local computer into the agent, so they are used for all SSH commands for authentication:
me@local:~$ ssh-add ~/.ssh/google_compute_engine
Log into an instance with an external IP address while supplying the -A argument to enable authentication agent forwarding.
gcloud compute ssh --ssh-flag="-A" INSTANCE
source: https://cloud.google.com/compute/docs/instances/connecting-to-instance#sshbetweeninstances.
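Once you are on that instance, the forwarded agent lets you hop to the second VM by name (the instance names below are placeholders):
me@instance-1:~$ ssh instance-2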
I am not sure about the flags because it's not working for me, but maybe I have a different OS or gcloud version and it will work for you.
Here are the steps I ran on my Mac to connect to the Google Dataproc master VM and then hop onto a worker VM from the master VM. I ssh'd to the master VM to get its IP.
$ gcloud compute ssh cluster-for-cameron-m
Warning: Permanently added '104.197.45.35' (ECDSA) to the list of known hosts.
I then exited. I enabled forwarding for that host.
$ nano ~/.ssh/config
Host 104.197.45.35
ForwardAgent yes
I added the gcloud key.
$ ssh-add ~/.ssh/google_compute_engine
I then verified that it was added by listing the key fingerprints with ssh-add -l. I reconnected to the master VM and ran ssh-add -l again to verify that the keys were indeed forwarded. After that, connecting to the worker node worked just fine.
ssh cluster-for-cameron-w-0
About using SSH Agent Forwarding...
Because instances are frequently created and destroyed on the cloud, the (recreated) host fingerprint keeps changing. If the new fingerprint doesn't match the one in ~/.ssh/known_hosts, SSH automatically disables agent forwarding. The solution is:
$ ssh -A -o UserKnownHostsFile=/dev/null ...
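For example, to hop from the Dataproc master above to a worker without the stale host-key entry getting in the way (the worker name is the one from the earlier answer; adding StrictHostKeyChecking=no to skip the prompt is an extra assumption):
$ ssh -A -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no cluster-for-cameron-w-0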