We have a VM instance in Google Cloud that is being used by two users. We want to create another user that is accessible to both, and we chose the user ubuntu for that, as in AWS. But the problem is we cannot log in when running the following command.
gcloud compute --project "project" ssh --zone "us-east1-b" "gpunew3"
It's showing the following error.
ubuntu@35.196.254.72: Permission denied (publickey).
ERROR: (gcloud.compute.ssh) [/usr/bin/ssh] exited with return code [255].
What is a possible way to achieve this?
I suspect a key pair mismatch. Please run the following command to remove the private key file:
rm .ssh/google_compute_engine
Then run your command to ssh again, which will recreate a new key pair.
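For completeness, a minimal sketch of that reset, assuming the default key location (removing the .pub file as well keeps the regenerated pair in sync):
rm ~/.ssh/google_compute_engine ~/.ssh/google_compute_engine.pub
gcloud compute --project "project" ssh --zone "us-east1-b" "gpunew3"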
I had the same problem before, and I fixed it by adding my public key to the VM instance and then connecting to the VM instance by ssh like this:
ssh <VM external IP>
https://cloud.google.com/compute/docs/instances/adding-removing-ssh-keys
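A hedged sketch of that approach with gcloud, assuming the key is in ~/.ssh/id_rsa.pub and the remote username is ubuntu (both placeholders; the ssh-keys metadata value format is username:key, and add-metadata replaces any existing instance-level ssh-keys value, so include every key you need in the file):
echo "ubuntu:$(cat ~/.ssh/id_rsa.pub)" > /tmp/keys.txt
gcloud compute instances add-metadata gpunew3 --zone us-east1-b --metadata-from-file ssh-keys=/tmp/keys.txt
ssh -i ~/.ssh/id_rsa ubuntu@35.196.254.72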
I'd like to solve the following problem using the command line:
I'm trying to run the following PoC script from a GCE VM in project-a.
gcloud config set project project-b
gcloud compute instances create gce-vm-b --zone=us-west1-a
gcloud compute ssh --zone=us-west1-a gce-vm-b -- hostname
The VM is created successfully:
NAME ZONE MACHINE_TYPE PREEMPTIBLE INTERNAL_IP EXTERNAL_IP STATUS
gce-vm-b us-west1-a n1-standard-16 10.12.34.56 12.34.56.78 RUNNING
But I get the following error when trying to SSH:
WARNING: The public SSH key file for gcloud does not exist.
WARNING: The private SSH key file for gcloud does not exist.
WARNING: You do not have an SSH key for gcloud.
WARNING: SSH keygen will be executed to generate a key.
Generating public/private rsa key pair.
Your identification has been saved in /root/.ssh/google_compute_engine.
Your public key has been saved in /root/.ssh/google_compute_engine.pub.
The key fingerprint is:
...
Updating project ssh metadata...
.....................Updated [https://www.googleapis.com/compute/v1/projects/project-b].
done.
Waiting for SSH key to propagate.
ssh: connect to host 12.34.56.78 port 22: Connection timed out
ERROR: (gcloud.compute.ssh) Could not SSH into the instance. It is possible that your SSH key has not propagated to the instance yet. Try running this command again. If you still cannot connect, verify that the firewall and instance are set to accept ssh traffic.
Running gcloud compute config-ssh hasn't changed anything in the error message. It's still ssh: connect to host 12.34.56.78 port 22: Connection timed out.
I've tried adding a firewall rule to the project:
gcloud compute firewall-rules create default-allow-ssh --allow tcp:22
Creating firewall...
...........Created [https://www.googleapis.com/compute/v1/projects/project-b/global/firewalls/default-allow-ssh].
done.
NAME NETWORK DIRECTION PRIORITY ALLOW DENY
default-allow-ssh default INGRESS 1000 tcp:22
The error is now Permission denied (publickey).
gcloud compute ssh --zone=us-west1-a gce-vm-b -- hostname
Pseudo-terminal will not be allocated because stdin is not a terminal.
Warning: Permanently added 'compute.4123124124324242' (ECDSA) to the list of known hosts.
Permission denied (publickey).
ERROR: (gcloud.compute.ssh) [/usr/bin/ssh] exited with return code [255].
P.S. The project-a "VM" is a container run by a Prow cluster (which itself runs on GKE).
"Permission denied (publickey)" means it is unable to validate the public key for the username.
You haven't specified the user in your command, so the user from the environment is selected and it may not be allowed into the instance gce-vm-b. Specify a valid user for the instance in your command according to the public SSH key metadata.
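A sketch of what that looks like, with myuser as a placeholder for a username that actually appears in the ssh-keys metadata:
gcloud compute ssh myuser@gce-vm-b --zone=us-west1-a -- hostname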
I'm running a Google Cloud instance and am able to successfully connect to it via ssh.
But I'm not able to forward a port to my localhost.
Here's the command I used:
ssh -L 16006:127.0.0.1:8080 username@instance_external_ip
When I run the above command, I get the following error:
The authenticity of the host cannot be determined.
username@instance_external_ip: Permission denied (publickey)
How to solve this problem?
I found the answer to this question. The problem I had was that the server did not know my ssh keys. So, I did the following and it worked.
I deleted all the ssh keys on my local machine and connected to my gcloud instance using the following command. The gcloud command creates the ssh keys automatically and transfers them to the cloud, so there is no need to manually copy and paste the keys.
gcloud compute --project "project_name" ssh --zone "zone_name" "instance_name"
After this I connected to my instance using ssh. If you try to open the ssh tunnel before doing this, the server won't know your key and will report permission denied when you run ssh -L ....
Therefore, instead of connecting directly through ssh -L ..., pass the key file stored in the .ssh directory. Use the following command:
ssh -i ~/.ssh/google_compute_engine -L <local_port>:127.0.0.1:<remote_port> username@server_ip
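As a quick check that the tunnel works, using the ports from the question (local 16006 forwarded to 8080 on the instance), open the tunnel and then, in a second terminal, hit the local port:
ssh -i ~/.ssh/google_compute_engine -L 16006:127.0.0.1:8080 username@server_ip
curl http://127.0.0.1:16006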
I have two Google accounts and one Compute Engine instance. I ssh to it from the browser using the two different accounts. Then I run the following command:
sudo gcloud compute ssh myinstance
It succeeds on account1 and fails on account2 with the following error:
Permission denied (publickey).
ERROR: (gcloud.compute.ssh) [/usr/bin/ssh] exited with return code [255]. See https://cloud.google.com/compute/docs/troubleshooting#ssherrors for troubleshooting hints.
I can even run
sudo gcloud compute ssh account2@myinstance when logged in as account1, and it succeeds.
How to make gcloud compute ssh command on account2 executable?
A similar thing happens when I try to configure ssh keys using the config-ssh command. I can use ssh myinstance on account1 but get the following error on account2:
ssh: Could not resolve hostname myinstance: Name or service not known
Maybe the problem is that account1 is the creator of the instance.
Perhaps you forgot to add account2's SSH key to the Compute Engine metadata.
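A hedged sketch of what fixing that could look like while logged in as account2 (the username and key path are assumptions; note that add-metadata replaces the existing project-wide ssh-keys value, so the file should contain every key you want to keep):
echo "account2_user:$(cat ~/.ssh/google_compute_engine.pub)" > /tmp/keys.txt
gcloud compute project-info add-metadata --metadata-from-file ssh-keys=/tmp/keys.txt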
I am trying to ssh into a VM from another VM in Google Cloud using the gcloud compute ssh command. It fails with the below message:
/usr/local/bin/../share/google/google-cloud-sdk/./lib/googlecloudsdk/compute/lib/base_classes.py:9: DeprecationWarning: the sets module is deprecated
import sets
Connection timed out
ERROR: (gcloud.compute.ssh) [/usr/bin/ssh] exited with return code [255]. See https://cloud.google.com/compute/docs/troubleshooting#ssherrors for troubleshooting hints.
I made sure the ssh keys are in place but still it doesn't work. What am I missing here?
This assumes that you have already connected to the externally visible instance over SSH with gcloud.
From your local machine, start ssh-agent with the following command to manage your keys for you:
me@local:~$ eval `ssh-agent`
Call ssh-add to load the gcloud compute keys from your local computer into the agent, and use them for all SSH commands for authentication:
me@local:~$ ssh-add ~/.ssh/google_compute_engine
Log into an instance with an external IP address while supplying the -A argument to enable authentication agent forwarding.
gcloud compute ssh --ssh-flag="-A" INSTANCE
source: https://cloud.google.com/compute/docs/instances/connecting-to-instance#sshbetweeninstances.
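Putting the steps together, a minimal end-to-end sketch (instance names are placeholders):
eval `ssh-agent`
ssh-add ~/.ssh/google_compute_engine
gcloud compute ssh --ssh-flag="-A" instance-1
# then, from the shell on instance-1:
ssh instance-2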
I am not sure about the flags, because it's not working for me, but maybe I have a different OS or gcloud version and it will work for you.
Here are the steps I ran on my Mac to connect to the Google Dataproc master VM and then hop onto a worker VM from the master VM. I ssh'd to the master VM to get its IP.
$ gcloud compute ssh cluster-for-cameron-m
Warning: Permanently added '104.197.45.35' (ECDSA) to the list of known hosts.
I then exited. I enabled forwarding for that host.
$ nano ~/.ssh/config
Host 104.197.45.35
ForwardAgent yes
I added the gcloud key.
$ ssh-add ~/.ssh/google_compute_engine
I then verified that it was added by listing the key fingerprints with ssh-add -l. I reconnected to the master VM and ran ssh-add -l again to verify that the keys were indeed forwarded. After that, connecting to the worker node worked just fine.
ssh cluster-for-cameron-w-0
About using SSH Agent Forwarding...
Because instances are frequently created and destroyed in the cloud, the (recreated) host fingerprint keeps changing. If the new fingerprint doesn't match the one in ~/.ssh/known_hosts, SSH automatically disables agent forwarding. The solution is:
$ ssh -A -o UserKnownHostsFile=/dev/null ...
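The same options can be passed through gcloud via repeated --ssh-flag arguments; a sketch (INSTANCE is a placeholder, and StrictHostKeyChecking=no suppresses the interactive host-key prompt that an empty known_hosts file would otherwise trigger):
$ gcloud compute ssh --ssh-flag="-A" --ssh-flag="-o UserKnownHostsFile=/dev/null" --ssh-flag="-o StrictHostKeyChecking=no" INSTANCE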
I cannot ssh from my computer into the server hosted on Google Cloud.
I tried the normal ssh-keygen with user@domain.com and uploaded the public key, which worked last time, but this time it didn't. The issue started after I changed the password for the account. After that I could no longer ssh or sftp into the account, although my existing session stayed connected until I disconnected it.
I then tried gcloud ssh user@instance; it ran fine and told me the key just hadn't propagated yet.
I added AllowUsers user to the server's sshd config file and restarted ssh on the server, but still the same result.
Here's the error:
Permission denied (publickey).
ERROR: (gcloud.compute.ssh) [/usr/bin/ssh] exited with return code [255].
Update:
I've been working with Google tech support and this issue is still unresolved. The ownership and permissions of the authorized_keys file keep getting changed on boot to another user, whom I also cannot log in as.
So I change it to:
thisUser:www-data 755
but on boot it changes it to:
otherUser:otherUser 600
There are a couple of things you can do to fix this. You can take advantage of the metadata feature in GCE and add a startup script that automatically changes the permissions.
From the Developers Console, go to your instance > Metadata and add a key/value pair:
key : startup-script
value: chmod 755 /home/your_user/.ssh/authorized_keys OR chmod 755 ~/.ssh
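The same metadata can also be set from the command line; a sketch mirroring the chmod above (instance name and user are placeholders):
gcloud compute instances add-metadata YOUR_INSTANCE --metadata startup-script='chmod 755 /home/your_user/.ssh/authorized_keys'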
After rebooting, you should check the Serial Output option further down that page to see if the script ran on startup. It should show you something along these lines:
startup script found in metadata.
startupscript: Running startup script /var/run/google.startup.script
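The serial output can also be fetched from the command line; a sketch with placeholder instance and zone names:
gcloud compute instances get-serial-port-output YOUR_INSTANCE --zone YOUR_ZONE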
Further information can be found HERE
Hope that helps!
I solved this by deleting the existing ssh key under Custom metadata in the VM settings. I could then log in over ssh.
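A command-line equivalent sketch (instance name is a placeholder; this removes the instance-level ssh-keys metadata entry entirely):
gcloud compute instances remove-metadata YOUR_INSTANCE --keys ssh-keys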