When I create a new instance in GCE I'm able to ssh to that new instance without an issue. gcutil checks to see if I have the keys (google_compute_engine) and if not, it will create them for me. It will then push the keys to the instance and will pause for five minutes to ensure the keys are placed there. Again, this all works smoothly on a new instance that I create. This also tells me that my ssh works.
However, when trying to connect to another instance that already exists using "gcutil --project= ssh ", it produces "Permission denied (publickey)". I removed the keys and re-ran the command and got the same error. The expected result would be as above, i.e. gcutil creates keys and pushes them to the instance. But this doesn't happen.
ssh -vvv has no useful info. /var/log/auth.log doesn't even show an attempted connection.
Anyone in the GCE/ssh world have any idea why gcutil works so smoothly for one instance but not for the other? What should I check to debug this?
Thanks in advance.
You might want to look at the answer to this question that explains how gcutil works. It covers a number of different scenarios at the end.
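As a starting point, you can also compare the two instances directly. A sketch only: gcutil getproject is the gcutil-era command I recall for inspecting project metadata (pushed SSH keys live under sshKeys there), and the project name, username and key path below are placeholders:

    # Inspect project-level metadata; pushed keys appear under sshKeys
    gcutil getproject --project=my-project
    # Bypass gcutil and try its generated key against the failing instance directly
    ssh -i ~/.ssh/google_compute_engine -vvv myuser@<instance-external-ip>

If the manual ssh works but gcutil doesn't, the key material is fine and the problem is in how gcutil resolves the instance or user.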
I have never had this issue before and was hoping someone could help explain why this is happening, as I can't find anything about it online.
I was just given access to a new Bitbucket project and account. When I first add my SSH key to my new account, I am able to clone things down easily and everything works normally. For some reason though, after a few hours I suddenly start getting "Permission denied (publickey)" errors, and the only way I've found to solve it is to delete my previous SSH key and create a brand new one!
I am on a Mac and I use ssh-keygen to create my key, and I always run ssh-add to add it to the agent. Then I delete the old public key from Bitbucket and add a new one with the new key's info, and it starts working again, but only for a few hours!!
Is this a setting on the Bitbucket project I've been given access to? It doesn't seem to be, as everyone I've spoken to who also has access does not have this issue. This has always been the way I've added SSH keys and I have no clue what I'm doing wrong. For reference, my exact steps are below.
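These are roughly the commands I run each time (the email is a placeholder; I accept the default key path):

    # Generate a fresh key pair at the default ~/.ssh/id_rsa path
    ssh-keygen -t rsa -C "me@example.com"
    # Load the new private key into the agent
    ssh-add ~/.ssh/id_rsa
    # Quick check that Bitbucket accepts the key
    ssh -T git@bitbucket.org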
Thanks!
I'm running into a weird issue with the Google Cloud VM interface. I'm working with my team on the same Google Cloud VM project, each with our own instances.
The problem: I am unable to SSH into my instance, yet I am able to SSH into my teammates' instances. Whenever I SSH using the Google Cloud online interface, the SSH keys never transfer properly. Despite deleting and recreating keys for my computer, I always get "Permission denied (publickey)" (I'm even getting this in the Google Cloud Shell). Even stranger: my teammates are able to SSH into my instance. This is a new phenomenon I hadn't encountered a month ago when I first used the VM successfully.
Can anyone provide me with insight as to how to diagnose the issue, and even better, a solution? I can provide debug information if you'd find it useful.
Here is the output when using the verbosity flag: [screenshot: ssh -vvv output]
Here is the output from Armando's recommendation of running systemctl status google-guest-agent: [screenshot: guest agent ownership/status check]
Here is the output from Anthony's recommendation of recreating the keys all in one line: [screenshot: key recreation in the gcloud shell]
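For anyone comparing notes, the key-recreation step I ran was along these lines (a sketch; the instance name and zone are placeholders):

    # Remove the stale gcloud-managed key pair so a fresh one is generated
    rm ~/.ssh/google_compute_engine ~/.ssh/google_compute_engine.pub
    # Reconnect; gcloud creates a new key and pushes it to instance/project metadata
    gcloud compute ssh my-instance --zone=us-central1-a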
I'm pretty new to the gcloud environment, but getting the hang of it.
With our first project live on an instance, I've been shuffling some static IPs, instances and snapshots around for an optimal deployment workflow. But I can't understand what's going on now:
I have two instances, say live-1 and dev-2.
Now I can connect to live-1 using gcloud compute ssh live-1 and it's okay.
When I try to connect to dev-2 using gcloud compute ssh dev-2, it logs me in to live-1.
The first time I tried to ssh to dev-2 it took longer than usual. After that it just connects me to the wrong instance immediately.
The goal was (as you might've guessed) to copy the live environment to a testing one. I did create an image of live-1 and cloned it to set up dev-2. When I tried this earlier, it was possible and worked as expected.
Whenever I use the Compute Console in the browser and use the online SSH tool from the instance list, it does connect to dev-2 properly. But on my local machine, the aforementioned command connects me to live-1.
I already removed the IP for dev-2 from my known hosts, figuring it's cached somewhere, but no luck. What am I missing here?
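For completeness, this is how I cleared the cached entry (the IP is a placeholder for dev-2's external address):

    # Drop the cached host key for dev-2 from ~/.ssh/known_hosts
    ssh-keygen -R 203.0.113.7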
Edit: I found out just now that the instances are separate even though they're 'named' the same; if I log in to dev-2, I do see myuser@live-1: in the shell, but it is actually a separate instance. I created a dummy file on the supposed dev-2, and it doesn't show up on the actual live-1 machine.
So this is very confusing; I rely on the user@host prefix in front of every shell line to know where and on what I'm actually working; having two instances with the same hostname but different environments is confusing.
Ok, it was dead simple. Just run sudo hostname [desiredhostname] in the terminal, and restart it.
So in my case I logged in to dev-2 and ran sudo hostname dev-2.
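Note that plain sudo hostname only lasts until the next reboot. On systemd-based images, something like this should make it stick (a sketch, assuming hostnamectl is available on the instance):

    # Set the hostname persistently so it survives a reboot
    sudo hostnamectl set-hostname dev-2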
I am trying to access an EC2 instance from a different (Mac) computer. In order to do so, I created a new keypair, used chmod 600 to set the permissions, and then used ssh-add. When I try to SSH into my EC2 instance, I get "Permission denied (publickey)". I'm sure my error is something idiotic and simple, but I can't seem to find it. Can anyone help me out?
You need to ensure the ~/.ssh/id_rsa.pub (if it's an RSA key) from your Mac is appended to the ~/.ssh/authorized_keys file on the target machine. Normally, if this is a default Amazon AMI, the user is "ec2-user" -- ~ec2-user/.ssh/authorized_keys
REMEMBER TO APPEND and not remove other entries in that file -- else, you risk locking yourself out of that machine ...
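A minimal sketch of the append, assuming you can still log in with a key that already works; the old key filename and the hostname are placeholders:

    # Append (never overwrite) the new Mac's public key to authorized_keys
    cat ~/.ssh/id_rsa.pub | ssh -i ~/.ssh/old-key.pem ec2-user@ec2-198-51-100-1.compute-1.amazonaws.com 'cat >> ~/.ssh/authorized_keys'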
Is your private key on the new computer?
You need to put it on the computer you SSH in with. I usually keep mine on a flash drive. I'm not running Linux at the moment, but the default directory ssh checks is ~/.ssh/ (e.g. ~/.ssh/id_rsa). Maybe this jogs your memory some.
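Something like this, assuming the key sits on a flash drive mounted on a Mac (the volume name is a placeholder):

    # Copy the private key to where ssh looks by default, then load it into the agent
    mkdir -p ~/.ssh
    cp /Volumes/USBDRIVE/id_rsa ~/.ssh/id_rsa
    chmod 600 ~/.ssh/id_rsa
    ssh-add ~/.ssh/id_rsa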
I've followed a couple of tutorials for creating an Amazon EC2 instance using the command line tools
http://www.zabada.com/tutorials/deploying-a-rails-application-to-production-on-amazon-ec2.php
http://www.smartfrog.org/wiki/display/sf/Starting+an+EC2+Image+by+Hand
and all is well. I:
ec2-add-keypair (directing the output of ec2-add-keypair directly to a file in ~/.ssh)
chmod 600 the keypair
ec2-run-instances
ec2-describe-instances
then, when the new instance is running, try to SSH on:
ssh -i ~/.ssh/ec2-keypair ec2-user@foo.bar.amazon.com
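Spelled out, the sequence I'm running is roughly this (the key name and AMI ID are placeholders):

    # Create the keypair, redirecting output straight to a file in ~/.ssh
    # (caution: ec2-add-keypair also prints a leading "KEYPAIR <name> <fingerprint>"
    # summary line; if that line lands in the file, ssh may reject the key)
    ec2-add-keypair ec2-keypair > ~/.ssh/ec2-keypair
    chmod 600 ~/.ssh/ec2-keypair
    # Launch an instance with that keypair (AMI ID is made up)
    ec2-run-instances ami-12345678 -k ec2-keypair
    ec2-describe-instances
    # Once the instance is running, connect
    ssh -i ~/.ssh/ec2-keypair ec2-user@foo.bar.amazon.com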
At this point I'm ALWAYS prompted for a password. Obviously there's no password, so it always refuses me access.
My question is: what am I doing wrong here? Why am I being prompted for a password, and how can I put this right so I can SSH onto the machine I've just started?
I'm guessing this is something to do with my local setup, but as far as I know this machine hasn't had anything custom done with .ssh (there's certainly no config file or anything like that lying around that might be screwing with things).
Anyone have any ideas or suggestions?
ec2-user@? Why not root@?