I am looking to share/sync a folder on my shared web hosting to a bucket I have created in Google Cloud Storage. My host has enabled both SSH and rsync on my shared hosting. Unfortunately, I cannot install rclone, which would fit this job nicely. I have set up a Google Cloud f1-micro VM and will be using it as the link, as well as for writing the syncs to GCS.
How can I connect the two?
Here is the standard command to rsync to a remote server via SSH. How could this be adapted to send to Google Cloud Storage? At the moment it returns the error "Unexpected remote arg":
rsync gcs1 ssh cpanelusername@hostingip:public_html/r1/ -p 26
There is gsutil rsync, but I'm unsure whether it would work in this setup:
https://cloud.google.com/storage/docs/gsutil/commands/rsync
Does this need to run in the opposite direction, i.e. should my Google VM connect to my shared host over SSH and run the rsync from there?
NOTE: our shared hosting SSH port is 26.
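For reference, this is the two-step flow I had in mind on the VM; the bucket name gs://my-bucket and the staging folder ~/gcs1 are placeholders, and the -e "ssh -p 26" form is, I believe, what the broken command above was missing:
# Step 1: pull the folder from the shared host to the VM over SSH on port 26
rsync -av -e "ssh -p 26" cpanelusername@hostingip:public_html/r1/ ~/gcs1/
# Step 2: push the staged copy from the VM to the bucket with gsutil rsync
gsutil rsync -r ~/gcs1 gs://my-bucket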
I migrated the VM from libvirt to Google Cloud Platform using CloudEndure. The initial sync is complete, and it has been in the Data Replication stage for over a week. Once the VM is launched in test mode and I try to connect with PuTTY over SSH, it throws Connection refused, exited with error code 255.
I tried to log in using my on-premise local machine's username and SSH key with PuTTY, as the CloudEndure documentation says I can log in to the replicated server using the same credentials.
The firewall rules in GCP and on the machine allow port 22 for incoming connections. The SSH key is also set properly in the metadata section, yet it seems the key is not being propagated to the instance.
I thought there was a problem with my local machine's ufw rules, so I tried turning off the firewall and replicating again, but with no luck. I also tried adding a ufw rule to allow SSH connections from 0.0.0.0/0; still, I'm not able to connect to the VM that was replicated and launched in test mode.
Steps tried:
I tried the interactive console method, where I attempted to log in via the serial port, but it asks for a username and password. I don't have a password; I log in using SSH keys only.
Tried using a static IP for the instance: before replicating the boot disk, I added a firewall rule to allow SSH from that static IP, then replicated and tried to log in (assuming the connection was being blocked otherwise).
Followed this article to install the Linux Guest Environment.
Generated an SSH key using ssh-keygen -t rsa -C "" in Cloud Shell.
I cannot SSH into the Linux environment. Any help would be greatly appreciated.
Operating system: Ubuntu 18.04 LTS x64
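For completeness, these are the gcloud commands I used to re-add the key to metadata and to read the boot log; the instance name and zone are placeholders:
# ssh-keys.txt holds lines of the form "username:ssh-rsa AAAA... comment"
gcloud compute instances add-metadata my-replicated-vm --zone=us-central1-a --metadata-from-file ssh-keys=ssh-keys.txt
# Check the serial console output to see whether sshd came up during boot
gcloud compute instances get-serial-port-output my-replicated-vm --zone=us-central1-a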
I have some Cloud Foundry Node.js apps in IBM Cloud (Bluemix) and am facing some issues with temp folders. It would be easier if I could directly access the app folders (like .tmp) to debug what's being saved there. I can only think of SSH, but I'd prefer a visual tool or a connected service. Any ideas?
You can use graphical tools to browse the files by utilizing scp or sftp. I use FileZilla to browse the files. The instructions for accessing the files are similar for all Cloud Foundry providers, including IBM Cloud:
1. Log in to IBM Cloud using the CLI: ibmcloud login
2. Set the org and space: ibmcloud target --cf
3. Obtain the GUID for the app: ibmcloud cf app YOURAPP --guid
4. Look for the SSH endpoint: ibmcloud cf curl /v2/info
5. Issue a one-time password for SSH access: ibmcloud cf ssh-code
With that, use a username like cf:<GUID from step 3>/0 (the 0 could be another number, depending on how many instances you have) and the one-time password from step 5 to log in. The host is the one listed as app_ssh_endpoint, at the port shown there. You will likely need to prefix it with the protocol, e.g. sftp://.
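Put together, a session looks roughly like this; the app name, GUID, and endpoint are illustrative, and the real values come from steps 3 and 4:
ibmcloud cf app myapp --guid     # e.g. prints 0ab1c2d3-...
ibmcloud cf curl /v2/info        # note the app_ssh_endpoint value, e.g. ssh.us-south.cf.cloud.ibm.com:2222
ibmcloud cf ssh-code             # prints the one-time password
# In FileZilla: host sftp://ssh.us-south.cf.cloud.ibm.com, port 2222,
# username cf:0ab1c2d3-.../0, password = the output of ssh-code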
I have a microservice, let's call it RdsConnector, that I want to test locally; it is normally deployed on a machine in AWS. It connects to a MySQL instance, which is also in AWS, without any SSH tunnelling, as they are in the same VPC. To connect to that MySQL instance from my local machine, I can use SSH tunnelling to get into the VPC I have set up in AWS. This is what that configuration looks like:
I could set up my microservice to also connect through SSH (optionally, perhaps), but I don't want to do that: then I would have a different configuration when running it locally vs in the cloud. What I want instead is to set up some kind of proxy server on my local machine that takes the SSH credentials and does the SSH tunnelling, exposing the VPC's MySQL endpoint locally. RdsConnector would then just use that local endpoint, and I wouldn't need a separate RdsConnector config for local testing.
I'm not very familiar with the networking technologies in use here. I just know that there are no public IPs in my VPC, so I have to SSH in. I imagine that what I want is possible, but I have no idea what the moving parts would be.
OK, this turned out to be quite simple actually! The ssh program can do this for you; this is how I configure it with macOS ssh:
ssh -N -i "/Users/foo/aws_ssh_key.pem" \
-L "localhost:5990:stack-name-vpc-db.asdfqwerty.us-east-1.rds.amazonaws.com:3306" \
    foo@12.34.567.890
With the -L flag, ssh forwards connections made to the given local endpoint over the SSH connection to the specified endpoint on the remote side. The -N flag is optional; it just disables the regular remote shell, since we only want to run a proxy. The microservice can then treat localhost:5990 as if it were the regular MySQL endpoint.
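As a quick check that the tunnel works, any MySQL client can be pointed at the forwarded port; using 127.0.0.1 forces a TCP connection rather than the local socket, and the credentials are whatever your RDS instance uses:
mysql --host=127.0.0.1 --port=5990 --user=dbuser -p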
Before asking this question I looked through Google and tried different alternatives, none of which were successful for me, sadly. I'm a little above the noob level. What I want is basically to host a WordPress site on a Google Cloud Debian machine.
I was doing fine installing services through their SSH access until I got to the point where I installed an FTP service and wanted to access it from a remote computer (my own). I only got as far as:
Status: Waiting to retry...
Status: Connecting to 104.197.183.19...
Response: fzSftp started
Command: open "root#104.197.183.19" 22
Error: Connection timed out
Error: Could not connect to server
I kept looking and trying new ways until I found the gcloud documentation for FTP, but it is not aimed at newcomers, so my questions are:
Where do I input the gcloud commands: on my computer, or in the SSH console (the Google Cloud machine)?
Do I need to use gcloud for remote FTP access, or can I do it entirely from my computer and their SSH machine?
Do I really need to add an SSH authorization file to FileZilla, or is there a way to disable that check on my VPS so it lets me sign in with just a username and a password?
What I already tried that didn't work for me:
The gcloud documentation for SSH and FTP
The Google Cloud documentation for setting up a WordPress site
Many others
Basically, what I need, in short, is to access the VPS through FTP so I can continue with my learning. I've been stuck there for two days.
To get access to a user's public area, i.e. public_html:
Go to the account's cPanel area, and under Security > SSH Access you can import a key file.
You can use PuTTYgen to make one; you will need both a private and a public key.
Paste the keys into the boxes.
You may get a warning message about the private key; this is OK.
Go to Manage under the public key and authorize it.
Or
Make one using the interface in cPanel and download both keys.
Then in FileZilla:
Host: IP of the server
Protocol: SFTP
Logon Type: Key File
Key File: the PPK you made
(If you asked cPanel to make the file, select the one that does not end in .pub and FileZilla will convert it to a .ppk file for you.)
After clicking Connect you should be in.
If you still get an error, make sure the SSH port (22) is open in your firewalls, both in Google Cloud (cloud.google.com > Networking) and in WHM (the LFD/CSF plugin).
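On the Google Cloud side, a rule like the following (run from a machine with the Cloud SDK installed) opens port 22 to everyone; the rule name allow-ssh is arbitrary:
gcloud compute firewall-rules create allow-ssh --allow=tcp:22 --source-ranges=0.0.0.0/0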
Use the SSH File Transfer Protocol (SFTP).
There is no need to install an FTP service.
Use WinSCP to connect with SFTP.
The recommended way of transferring files to a Unix-based Google Compute Engine VM is the gcloud compute copy-files command. For this, please install the Google Cloud SDK. Then run a command such as the following:
gcloud compute copy-files --zone=<Compute Engine zone> /path/to/local/file.txt <Compute Engine instance name>:/path/to/destination/file.txt
If you'd like to use FileZilla, you'll have to configure it for access. The SSH daemon on Compute Engine VMs is set up for key-based authentication. This forum post shows how to set this up in FileZilla. The catch is that you need to put your public key on the VM, which can be a little tricky. gcloud compute copy-files and gcloud compute ssh take care of this for you, which is why they are the recommended methods.
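For example, assuming an instance named my-instance in zone us-central1-a, the first run of the command below generates a key pair and propagates the public key to the VM for you:
gcloud compute ssh my-instance --zone=us-central1-a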
I'm playing with Google Compute Engine (GCE), as I'm planning to migrate away from Rackspace (reason: GCE has good upgrade plans with the best discounted prices).
I have a few issues with GCE, one of which is that Ubuntu is not a supported OS/image by default. But there is an alternate method to run any Linux distro in GCE, called Building an image from scratch, for uploading custom images and creating instances (servers) from the uploaded image.
I was able to create and run instances from the Ubuntu image I uploaded to GCE following the link hagikuratakeshi.hatenablog.com. This is simply running Ubuntu in general. I didn't face any problem, but Google's gcutil tool prompts for an SSH passphrase and adds the key to the GCE metadata, yet the instance accepts only password logins (then why does it prompt for a passphrase?).
I want to strictly follow Building an image from scratch, as recommended by Google. But after following all the steps, I cannot log in to my server instance via SSH. I suspect this happens when I install the Google Compute Engine image packages: google-startup-scripts_1.1.2-1_all.deb, google-compute-daemon_1.1.2-1_all.deb and python-gcimagebundle_1.1.2-1_all.deb. These packages/scripts make some changes to the instance at startup, and also to the SSH configuration, which are strongly recommended. Once I strictly follow the link, or once I install these packages, I can no longer establish an SSH connection after the instance is rebooted. An error message similar to the one below is shown while trying to connect:
test#machine1:~$ gcutil --service_version="v1" --project="mypro-555" ssh --zone="asia-east1-a" "server-instance-1"
INFO: Running command line: ssh -o UserKnownHostsFile=/dev/null -o CheckHostIP=no -o StrictHostKeyChecking=no -i /home/test/.ssh/google_compute_engine -A -p 22 test@101.167.xxx.xxx --
ssh: connect to host 101.167.xxx.xxx port 22: Connection refused
NOTE: The user account test exists and is the same on both the local machine and the GCE server.
My main problem is the SSH connection when I strictly follow the steps. If I upload the fresh image and then follow the recommended steps, I cannot SSH again once I restart the instance; alternatively, if I set everything up in the image before uploading it, the created instance runs, but I cannot connect even once, and the error is the same.
Is anybody using GCE with a custom image? Are you able to connect even after following the recommended settings? Has anyone already fixed this SSH issue? Please post your comments!
EDIT 1
I could not figure anything out from the logs; here is the output of gcutil getserialportoutput server-instance-1.
The key here is that your ssh client says "connection refused". This indicates that there is indeed a machine at that IP address, but it's not accepting SSH connections. There are a few possible explanations:
The ssh daemon isn't running, or is listening on the wrong interface
Your instance is configured with a firewall that's denying SSH traffic
The GCE firewall rule to allow SSH traffic has been removed
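To narrow down which of these it is, you can check from both sides; the project and instance names below match the ones in your gcutil output, and if your gcutil version lacks listfirewalls, the newer equivalent is gcloud compute firewall-rules list:
# Is there still a firewall rule allowing tcp:22?
gcutil --project="mypro-555" listfirewalls
# Does the boot log show the ssh daemon starting?
gcutil --project="mypro-555" getserialportoutput server-instance-1
# On the instance itself (e.g. via the serial console): is sshd listening?
sudo netstat -tlnp | grep :22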