Unable to connect to the Google Cloud instance (port 22: Operation timed out) - ssh

I created the keys for the Google Cloud instance, but when I try to SSH into it I'm unable to access it and get the error: port 22: Operation timed out. This happens through both my desktop SSH client and the Google Cloud Shell. I added project-wide keys in the metadata section to access it.
Any insight into why I'm unable to connect would be very helpful. The instance was created a day ago.
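For reference, the project-wide keys were added through the metadata section in the Console; a rough gcloud equivalent (the file name and key contents below are placeholders) would be:
$ gcloud compute project-info add-metadata --metadata-from-file ssh-keys=ssh-keys.txt
where ssh-keys.txt contains one line per key in the form username:ssh-rsa AAAA... username. Note that this replaces the existing ssh-keys metadata value, so the file needs to include any keys that should be kept.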

I couldn't figure out why I was unable to SSH into the server previously, but there is now a single-click option to SSH into Google Cloud instances. You may close this thread.

Related

gcloud compute ssh can't connect from my office's network

I'm running an instance on Google Cloud and I can access it normally using:
$ gcloud compute ssh instance-2
But I get the following error when I try to access it from my office. I've tested routing through my mobile's internet connection and it works fine. I tested all my instances and get the same response. Is it possible that some rule blocks my network? I have a dynamic IP on my office's network.
$ gcloud compute ssh instance-2
No zone specified. Using zone [us-central1-c] for instance: [instance-2].
ssh: connect to host xx.xxx.xxx.xxx port 22: Connection timed out
ERROR: (gcloud.compute.ssh) [/usr/bin/ssh] exited with return code [255].
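For what it's worth, one way to confirm the block is on the network path rather than on the instance is a quick reachability and firewall check from both connections (xx.xxx.xxx.xxx is the instance's external IP from the error above):
$ nc -vz xx.xxx.xxx.xxx 22
$ gcloud compute firewall-rules list
The second command shows which source ranges the project's firewall rules allow on port 22.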
After some days of trying different solutions, I finally found the problem.
The office's internet service provider had a setting where one IP address serves multiple clients (carrier-grade NAT); I opened a ticket about it and they changed it. There are also some limitations in the modem, so I changed its mode to bridge and connected another router to serve Wi-Fi.

SSH Google Compute Engine down?

Is there an issue with SSH access today? I don't know why I can't access my instances today, either from MobaXterm or the SSH web interface in Google Cloud (impossible to connect on port 22).
From Google Cloud Shell => ssh: connect to host XXXX port 22: Connection timed out
Is this a global issue, or only my account?
I would advise you to try the following and check whether you are able to connect:
Can you ping or SSH into the VM instance using the gcloud command?
Could you also try running Cloud Shell in safe mode?
If you are still running into the issue after trying those two methods, please try restarting Cloud Shell (this can be done by selecting the restart option from the hamburger menu at the top right of your Cloud Shell) and see if that fixes it.
You can also interact with the serial console, which makes it easier to troubleshoot instances that are not booting properly or are otherwise inaccessible; a sketch of the relevant gcloud commands follows below.
Please let me know the results.
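A minimal sketch of enabling and connecting to the interactive serial console with gcloud (the instance name and zone below are placeholders):
$ gcloud compute instances add-metadata my-instance --zone us-central1-a --metadata serial-port-enable=TRUE
$ gcloud compute connect-to-serial-port my-instance --zone us-central1-a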

Apache Airflow unable to establish a connection to a remote host via FTP/SFTP

I am new to Apache Airflow and so far, I have been able to work my way through problems I have encountered.
I have hit a wall now. I need to transfer files to a remote server via sftp. I have not had any luck doing this. So far, I have gotten S3 and Postgres/Redshift connections via their respective hooks to work in various DAGs. I have been able to use the FTPHook with success testing on my local FTP server, but have not been able to figure out how to use SFTP to connect to a remote host.
I am able to connect to the remote host via SFTP with FileZilla, so I know my credentials are correct.
Through Google searching I have found the SFTPOperator, but am not able to figure out how to use it. I have also found FTPSHook, but still I have not been able to get it to work.
I keep getting the error nodename nor servname provided, or not known or a general Operation timed out in my Airflow logs.
Can someone point me in the right direction? Should I be using the FTPSHook with SSH or FTP Airflow Conn Type? Or do I need to utilize the SFTPOperator? I am also confused as to how I am supposed to setup the credentials in my Airflow connections. Do I use the SSH profile or FTP?
If I can provide any more additional info that may help, please let me know.
Cheers!
SFTPOperator uses ssh_hook under the hood to open an SFTP transport channel that serves as the basis for the file transfer. You can either configure the ssh_hook yourself or provide a connection ID via ssh_conn_id.
# Import path for Airflow 1.10; on Airflow 2+ the operator lives in
# airflow.providers.sftp.operators.sftp instead.
from airflow.contrib.operators.sftp_operator import SFTPOperator, SFTPOperation

op = SFTPOperator(
    task_id="test_sftp",
    ssh_conn_id="my_ssh_connection",  # Airflow connection of type SSH
    local_filepath="",                # path of the file on the Airflow worker
    remote_filepath="",               # destination path on the remote host
    operation=SFTPOperation.PUT,      # PUT uploads; use GET to download
    dag=dag,
)
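If you would rather configure the hook yourself instead of referencing a stored connection, a minimal sketch could look like the following (the host, username, and key path are placeholders, and the import path again depends on the Airflow version):

from airflow.contrib.hooks.ssh_hook import SSHHook

sftp_hook = SSHHook(
    remote_host="sftp.example.com",  # placeholder hostname
    username="transfer_user",        # placeholder username
    key_file="/path/to/id_rsa",      # private key used to authenticate
    port=22,
)

op = SFTPOperator(
    task_id="test_sftp_with_hook",
    ssh_hook=sftp_hook,              # pass the hook instead of ssh_conn_id
    local_filepath="",
    remote_filepath="",
    operation=SFTPOperation.PUT,
    dag=dag,
)

As for the connection setup question: ssh_conn_id should point to an Airflow connection of type SSH (host, username, and either a password or a private key), not an FTP connection.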

Google Cloud: Cannot connect to server via SSH

The port is up and the firewall is disabled, but the connection is rejected with the message:
"Read from socket failed: Connection reset by peer".
Other services on the same host are responding fine.
SSH through Google Cloud Console gets the same error.
Is there any other way to get a shell on Google Compute Engine?
Yes, there is: through the serial port, a really useful feature that Google Cloud provides.
There, I saw that the error was about key file permissions:
Sep 30 10:51:02 localhost sshd: Permissions 0775 for '/etc/ssh/ssh_host_rsa_key' are too open.
By assigning 0600 permissions to this file, everything went back to normal.
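As a sketch, the fix from the serial console looked roughly like this (assuming a systemd-based image; on Debian/Ubuntu the service is named ssh rather than sshd):
$ sudo chmod 0600 /etc/ssh/ssh_host_rsa_key
$ sudo systemctl restart sshd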

Unable to connect to instance through SSH in Google Compute Engine, another instance of the same account works fine

I am trying to connect to my instance using gcloud compute ssh new-instance, but it gives the following error:
ssh: connect to host 107.167.180.68 port 22: Connection refused
ERROR: (gcloud.compute.ssh) [/usr/bin/ssh] exited with return code [255].
See https://cloud.google.com/compute/docs/troubleshooting#ssherrors for troubleshooting hints.
I have already tried all the possible solutions mentioned in the Google documentation.
Any suggestions on how to get a backup of the database and files? The site has been down for the last two days.
Thanks in advance
I'd recommend looking at the serial console output of the VM instance using gcloud compute instances get-serial-port-output or using the "View serial port" button on the instance page in Cloud Console. That output should give you information about what is wrong with the VM, such as whether it ran out of memory or disk space. Also, make sure you didn't change the VM's network firewall rules to accidentally disallow incoming traffic on port 22.
The documentation page for SSH from the Browser also has some additional tips on how to explore these kinds of issues - see here and here.
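As a quick sketch of the checks above (the zone below is a placeholder; use the instance's actual zone):
$ gcloud compute instances get-serial-port-output new-instance --zone us-central1-c
$ gcloud compute firewall-rules list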
You can use the SSH keys with other instances in your account if you update the SSH keys in your metadata with:
sudo gcloud compute config-ssh
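gcloud compute config-ssh also writes host aliases of the form INSTANCE.ZONE.PROJECT into ~/.ssh/config, so afterwards a plain SSH client can connect with something like (the zone and project below are placeholders):
$ ssh new-instance.us-central1-c.my-project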