SSH connection to gcloud VM instance not working

I am using a VM instance with a 30 GB disk and clone my projects into it. While I was cloning one project, it reported "disk is out of space". So I stopped all processes running on the instance, closed all SSH connections, and stopped the instance. I then went to Disks and increased the size from 30 GB to 35 GB. Now, when starting a new SSH connection to this instance, it does not connect. Sometimes it keeps trying but never connects; sometimes it shows this error:
Connection via Cloud Identity-Aware Proxy Failed
Code: 4003
Reason: failed to connect to backend
You may be able to connect without using the Cloud Identity-Aware Proxy.
I tried to connect from Cloud Shell; it tells me that no SSH keys are present, and ~/.ssh/authorized_keys is empty. I tried copying the key from google_compute_engine.pub into authorized_keys, but that did not work. I also tried creating a new key and copying that into authorized_keys; that did not work either. While copying these keys, I got a "Permission denied (publickey)" error, and connecting from Cloud Shell shows the same error.
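For context, the supported way to restore key-based access is usually to push a public key through instance metadata rather than editing authorized_keys on the VM directly. A minimal sketch from Cloud Shell, where INSTANCE_NAME, ZONE, and USERNAME are placeholders:

# generate a fresh key pair in Cloud Shell (file name is arbitrary)
ssh-keygen -t rsa -f ~/.ssh/gce-key -C USERNAME

# push the public key through instance metadata; the guest environment
# writes it into the user's authorized_keys, no manual copying needed
# (note: this overwrites any existing instance-level ssh-keys value)
gcloud compute instances add-metadata INSTANCE_NAME --zone=ZONE \
    --metadata ssh-keys="USERNAME:$(cat ~/.ssh/gce-key.pub)"

# then retry the connection with that key
gcloud compute ssh USERNAME@INSTANCE_NAME --zone=ZONE --ssh-key-file ~/.ssh/gce-key

That said, if the backend itself is unreachable (which the 4003 error suggests), no key will help until the VM boots far enough for sshd to start, so the serial console output is also worth checking.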

Related

Not able to log in after migrating a libvirt on-prem boot disk to Google Cloud Platform using the CloudEndure migration service

I migrated a VM from libvirt to Google Cloud Platform using CloudEndure. The initial sync is complete, and it has been in the Data Replication stage for over a week. Once the VM is launched in test mode and I try to connect with PuTTY over SSH, it throws "Connection refused", exited with error code 255.
I tried to log in using my on-premises local machine username and SSH key with PuTTY, since the CloudEndure documentation says I can log in to the replicated server with the same credentials.
The firewall rules in GCP and on the machine allow port 22 for incoming connections. The SSH key is also set properly in the metadata section, yet it appears the key is not propagated to the VM.
I thought there might be a problem with my local machine's ufw rules, so I turned off the firewall and replicated again, but it made no difference. I also added a ufw rule to allow SSH connections from 0.0.0.0/0; I am still not able to connect to the VM that was replicated and launched in test mode.
Steps tried:
I tried the interactive serial console method, where I attempted to log in via the serial port, but it asks for an ID and password, and I don't have a password; I log in with SSH keys only (see the sketch after this list).
Tried a static IP for the instance: before replicating the boot disk, I added a firewall rule allowing SSH from that static IP, then replicated and tried to log in (assuming the connection was being blocked for that IP).
Followed this article to install the Linux guest environment.
Generated an SSH key using ssh-keygen -t rsa -C "" in Cloud Shell.
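On the serial-console problem in the first step: a password can be set through a startup script, after which the serial console accepts the login. A hedged sketch, where INSTANCE_NAME, ZONE, USERNAME, and NEW_PASSWORD are placeholders (remove the script and password afterwards):

# enable the interactive serial console on the instance
gcloud compute instances add-metadata INSTANCE_NAME --zone=ZONE \
    --metadata serial-port-enable=TRUE

# set a temporary password for USERNAME at next boot
gcloud compute instances add-metadata INSTANCE_NAME --zone=ZONE \
    --metadata startup-script='#! /bin/bash
echo "USERNAME:NEW_PASSWORD" | chpasswd'

# reboot so the script runs, then connect to the serial port
gcloud compute instances reset INSTANCE_NAME --zone=ZONE
gcloud compute connect-to-serial-port INSTANCE_NAME --zone=ZONE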
I cannot SSH into the Linux environment.
Operating System: Ubuntu 18.04 LTS x64
Any help would be greatly appreciated.

Connecting to a gcloud VM instance via SSH

Problem:
I cannot connect via SSH to my VM instance on gcloud.
Description:
I am using gcloud with my own domain, as the user userid@mydomain.com.
I created a VM instance on Google Compute Engine.
I installed "WordPress Multisite Certified by Bitnami" via Cloud Launcher.
The VM is up and running; I can even access a WordPress page.
In the "VM instances" section, I click the SSH button, but it cannot connect; the window prompts:
Transferring SSH keys to the VM.
Could not connect, retrying (1/3)...
The log does not contain any errors.
Attempt 1: gcloud CLI
In the Windows CMD, I can successfully connect by calling
gcloud compute ssh userid_mydomain_com@MY_INSTANCE_NAME
It generates files in C:\Users\ACCOUNT_NAME\.ssh, but the file google_compute_engine.pub contains my Windows account name at the end, in the format HOST\account_name@host, not my gcloud user name (userid@mydomain.com)!
I retried the SSH button, but it still failed.
I then pasted the content of google_compute_engine.pub into the SSH key field, but it still doesn't work, perhaps due to the wrong user name? I changed the name, but that didn't help.
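For what it's worth, Compute Engine parses that key field as USERNAME:KEY and derives the Linux login name from the entry, so a key ending in HOST\account_name@host maps to the wrong user. A sketch of the expected format, with the username and key body as placeholders:

userid_mydomain_com:ssh-rsa AAAAB3NzaC1yc2E... userid_mydomain_com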
Attempt 2: PuTTYgen
I also tried generating an SSH key with PuTTYgen as described here, using userid@mydomain.com as the key comment. With or without the newly generated public key in the VM instance configuration, I cannot SSH into the VM.
Question:
How can I access my VM on gcloud via the SSH button and via PuTTY?
Update:
I can connect to my VM instance using PuTTY and WinSCP.
I entered the key file provided in the Bitnami Launchpad and the user name 'bitnami'. The SSH button, however, still doesn't let me in.
You might not be using the right credentials. However, it's pretty weird that the web browser terminal Google offers through the SSH button doesn't work.
Please take a look at the documentation below and try to follow the steps; it might help you:
https://docs.bitnami.com/google/faq/#how-to-connect-to-the-server-through-ssh
You can connect to a VM instance in several ways:
SSH from the Google Cloud console
Connect via Cloud Shell
Connect from your local terminal with the gcloud command through IAP, using the --tunnel-through-iap parameter:
gcloud compute ssh userid_mydomain_com@MY_INSTANCE_NAME --tunnel-through-iap
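One caveat with the IAP route: the VPC firewall must allow ingress on port 22 from Google's IAP range (35.235.240.0/20), otherwise you get the 4003 "failed to connect to backend" error. A sketch of such a rule (the rule name is arbitrary):

gcloud compute firewall-rules create allow-ssh-from-iap \
    --direction=INGRESS --action=ALLOW --rules=tcp:22 \
    --source-ranges=35.235.240.0/20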

Unable to connect to instance through SSH in Google Compute Engine; another instance on the same account works fine

I am trying to connect to my instance using gcloud compute ssh new-instance. It gives the following error:
ssh: connect to host 107.167.180.68 port 22: Connection refused
ERROR: (gcloud.compute.ssh) [/usr/bin/ssh] exited with return code [255].
See https://cloud.google.com/compute/docs/troubleshooting#ssherrors for troubleshooting hints.
I have already tried all the possible solutions mentioned in the Google documentation.
Any suggestions on how to get a backup of the database and files? The site has been down for the last two days.
Thanks in advance.
I'd recommend looking at the serial console output of the VM instance using gcloud compute instances get-serial-port-output, or the "View serial port" button on the instance page in the Cloud Console. That output should tell you what is wrong with the VM, such as whether it ran out of memory or disk space. Also, make sure you didn't change the VM's network firewall rules in a way that accidentally disallows incoming traffic on port 22.
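For example (the zone is a placeholder):

gcloud compute instances get-serial-port-output new-instance --zone=ZONE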
The documentation page for SSH from the Browser also has some additional tips on how to explore this kind of issue - see here and here.
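On the backup question: even with SSH broken, the data can usually be recovered by attaching the boot disk to a healthy VM. A sketch with hypothetical disk, rescue-instance, and device names:

# stop the broken VM and detach its boot disk
gcloud compute instances stop new-instance --zone=ZONE
gcloud compute instances detach-disk new-instance --disk=DISK_NAME --zone=ZONE

# attach the disk to a working rescue VM as a secondary disk
gcloud compute instances attach-disk rescue-vm --disk=DISK_NAME --zone=ZONE

# on rescue-vm: mount it (the device path may differ) and copy the
# database and files out
sudo mkdir -p /mnt/rescue
sudo mount /dev/sdb1 /mnt/rescue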
You can use your SSH keys with other instances in your account if you update the SSH keys in your metadata by running
gcloud compute config-ssh
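Once config-ssh has written host aliases into ~/.ssh/config, each instance can be reached with plain ssh using the instance.zone.project naming it generates, for example (all three parts are placeholders):

ssh INSTANCE_NAME.ZONE.PROJECT_ID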

Google Cloud server (GCE), custom image, SSH login issue

I'm playing with Google Compute Engine (GCE), as I'm planning to migrate from my current cloud service provider, Rackspace (reason: GCE has good upgrade plans at the best discounted prices).
I have a few issues with GCE, and one of them is that the Ubuntu OS/image is not supported by default. There is, however, an alternate method to run any Linux distro in GCE, called Building an image from scratch, for uploading custom images and creating instances (servers) from the uploaded image.
I was able to create and run instances from the Ubuntu image I uploaded to GCE by following the link hagikuratakeshi.hatenablog.com. This is simply running Ubuntu in general. I didn't face any problems, but Google's gcutil tool prompts for an SSH passphrase and adds the key to the GCE metadata, yet the instance accepts only password logins (so why does it prompt for a passphrase?).
I want to strictly follow Building an image from scratch, as recommended by Google. But after following all the steps, I cannot log in to my server instance via SSH. I suspect this happens when I install the Google Compute Engine image packages: google-startup-scripts_1.1.2-1_all.deb, google-compute-daemon_1.1.2-1_all.deb and python-gcimagebundle_1.1.2-1_all.deb. These packages/scripts make some changes to the instance at startup and also to the SSH configuration, which are strongly recommended. Once I strictly follow the link, or once I install these packages, I can no longer establish an SSH connection after the instance is rebooted. An error message similar to the one below is shown while trying to connect:
test@machine1:~$ gcutil --service_version="v1" --project="mypro-555" ssh --zone="asia-east1-a" "server-instance-1"
INFO: Running command line: ssh -o UserKnownHostsFile=/dev/null -o CheckHostIP=no -o StrictHostKeyChecking=no -i /home/test/.ssh/google_compute_engine -A -p 22 test@101.167.xxx.xxx -
ssh: connect to host 101.167.xxx.xxx port 22: Connection refused
NOTE: The user account test exists and is the same on both the local machine and the GCE server.
My main problem is the SSH connection when I strictly follow the steps. If I upload the fresh image and then follow the recommended steps, I cannot SSH in again once I restart the instance; alternatively, if I set everything up in the image before uploading, the created instance runs, but I cannot connect even once, and the error is the same.
Is anybody using GCE with a custom image? Are you able to connect even after applying the recommended settings? Has anyone already fixed this SSH issue? Please post your comments!
EDIT 1
I could not figure it out from the logs; here is the output of gcutil getserialportoutput server-instance-1.
The key here is that your ssh client says "connection refused". This indicates that there is indeed a machine at that IP address, but it's not accepting SSH connections. There are a few possible explanations:
The ssh daemon isn't running, or is listening on the wrong interface
Your instance is configured with a firewall that's denying SSH traffic
The GCE firewall rule to allow SSH traffic has been removed
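If a shell is available through the serial console, a few quick checks (a sketch for an Ubuntu-family guest) can distinguish between these cases:

# is the SSH daemon running, and is it listening on port 22?
sudo service ssh status
sudo netstat -tlnp | grep ':22'

# is a local firewall (iptables/ufw) dropping port 22?
sudo iptables -L -n | grep -w 22

# from your workstation: does a GCE rule still allow tcp:22?
gcloud compute firewall-rules list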

How to make Amazon EC2 instances authenticate each other automatically?

I am using the AWS Java SDK to launch EC2 instances (running Ubuntu 12.04) and run a distributed tool on them. The tool uses Open MPI for message passing between the nodes, and Open MPI uses SSH to connect the nodes to each other.
The problem is that the EC2 instances don't authenticate each other for SSH connections by default. This tutorial shows how to set up SSH by generating keys and adding them to the nodes. However, when I tried to add the generated key to the slaves using the command
$ scp /home/mpiuser/.ssh/id_dsa.pub mpiuser@slave1:.ssh/authorized_keys
I still got permission denied. Also, after generating the new keys, I was no longer able to log in using the ".pem" key that I got from Amazon.
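A plausible explanation for both symptoms: scp writes the remote file wholesale, so copying onto .ssh/authorized_keys replaces the Amazon-provisioned key that the .pem file matches, locking you out. Appending instead of overwriting avoids this; a sketch, with the .pem path as a placeholder:

# append the MPI public key rather than replacing authorized_keys
cat /home/mpiuser/.ssh/id_dsa.pub | \
    ssh -i /path/to/amazon-key.pem mpiuser@slave1 'cat >> ~/.ssh/authorized_keys'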
I am not experienced with SSH keys, but I would like some way of configuring each EC2 instance (when it is first created) to authenticate the others, for example by copying a key into each of them. Is this possible, and how could it be done?
P.S.: I can connect to each instance once it is launched and can execute any commands on them over SSH.
I found the solution: I added the Amazon private key (.pem) to the image (AMI) that I use to create the EC2 instances, and I changed the /etc/ssh/ssh_config file by adding a new identity file:
IdentityFile /path/to/the/key/file
This makes SSH use the .pem private key when connecting to any other EC2 instance created with the same key.
I also changed StrictHostKeyChecking to no, which stops the "authenticity of host xxx can't be established" prompt that otherwise requires user interaction before connecting to a new host.
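For reference, the resulting client configuration baked into the AMI would look something like this (the IdentityFile path is the placeholder from above):

# /etc/ssh/ssh_config - applied to all outgoing SSH connections from the node
Host *
    IdentityFile /path/to/the/key/file
    StrictHostKeyChecking no

Note that disabling StrictHostKeyChecking trades man-in-the-middle protection for unattended automation; that is usually acceptable inside a private cluster, but it is worth flagging.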