Access to Cloud Foundry filesystem (like FTP)? - ssh

I have some Cloud Foundry Node.js apps in IBM Cloud (Bluemix) and am facing some issues with temp folders. It would be easier if I could access the app folders directly (like my .tmp) to debug what's being saved there. I can only think of SSH, but I'd prefer a visual tool or a connected service. Any idea?

You can use graphical tools to browse the files via scp or sftp. I use FileZilla to browse the files. The instructions for accessing the files are similar for all Cloud Foundry providers, including IBM Cloud:
1. Log in to IBM Cloud using the CLI: ibmcloud login
2. Set the org and space: ibmcloud target --cf
3. Obtain the GUID for the app: ibmcloud cf app YOURAPP --guid
4. Look up the SSH endpoint: ibmcloud cf curl /v2/info
5. Issue a one-time password for SSH access: ibmcloud cf ssh-code

With that, use a username like cf:GUID-from-step-3/0 (the 0 could be another index depending on how many instances you have) and the one-time password to log in. The host is the one listed as app_ssh_endpoint, at the port shown there. You will likely need to prefix it with the protocol, e.g., sftp://.
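For example, a command-line connection might look like this (the GUID, endpoint, and port below are placeholders; use the values returned by steps 3 and 4, and enter the output of ibmcloud cf ssh-code as the password):

sftp -P 2222 cf:0ab1c2d3-e4f5-6789-abcd-ef0123456789/0@ssh.us-south.cf.cloud.ibm.com

In FileZilla the same values go into the host field (prefixed with sftp://), the username field (the cf:GUID/0 string), and the password field (the ssh-code output).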

Related

How can I log into a GCP Vertex VM via Identity-Aware Proxy using a different username

I have a GCP Vertex VM with IAP enabled. There is no external IP address (we cannot use external IPs in our organisation). Our team would like to be able to use JupyterLab via the browser and also log in with PyCharm's SSH remote development tools under the same user account, to avoid any permission conflicts. The reason for this is that some people want JupyterLab some of the time and PyCharm some of the time, without having to switch (we are data scientists XD). Since I cannot log in to JupyterLab with my GCP email user account, I would like to be able to log in via SSH using the "jupyter" username (which is the default for Vertex). However, whenever I log in via SSH (via the gcloud ssh command), I am logged in with the GCP email user account no matter what I put in the connection string.
One thing I could do is add "sudo su jupyter" to the .bashrc file, but this seems like a bad hack.
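For reference, an IAP-tunneled SSH attempt of the kind the question describes might look like this (instance name and zone are hypothetical; the jupyter@ user prefix is the part that gets ignored):

gcloud compute ssh jupyter@my-vertex-vm --zone=europe-west2-a --tunnel-through-iap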

Using the Cloud Code plugin for GCP (Google Cloud Platform)

I have a local cluster (minikube) that works perfectly well on my laptop (Mint 19.3, IntelliJ 2019.3 with the Cloud Code plugin, Java 11 backend, MongoDB, front end, ... all OK). But I can't find any useful information (on the Google Cloud Platform site or from IntelliJ) on how to configure a new Google Cloud cluster. I can only see my minikube config in the cluster explorer... even when minikube is stopped!
It seems that the configuration lives in kubectl!? But how can I force the plugin to connect to GCP? I have a GCP account and have created a cluster and an image repo.
The GCP documentation is really unclear.
I solved the problem. You need to install the Cloud SDK (is there another solution?) and use gcloud to link kubectl with the new Kubernetes context and credentials. A new kubectl configuration is generated, and you have to switch to that configuration (kubectl config use-context your-new-cluster).
One more thing: to use Google storage for Docker images, you should specify where to find or push them in the run/edit configuration, under the image options line: gcr.io/your-project-id. I couldn't use the bucket I created before pushing; a new one was created. Is there a way to connect to an existing bucket?
If you want to manage your clusters from an on-prem machine, you will need to install the Cloud SDK and configure your cluster access; this will allow you to use kubectl commands to create and administer clusters on GKE. The Cloud Code plugin should install this SDK automatically; you can take a look at this guide to learn how to use it.
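A minimal sketch of the gcloud steps both answers describe (cluster name, zone, and project ID are assumptions):

# Fetch credentials for the GKE cluster; this writes a new kubectl context
gcloud container clusters get-credentials my-cluster --zone us-central1-a --project my-project-id
# Switch to the generated context explicitly if it is not already active
kubectl config use-context gke_my-project-id_us-central1-a_my-cluster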

Shared Hosting SSH (rsync enabled) to Google Cloud Storage

I am looking to share/sync a folder on my shared web hosting with a bucket I have created in Google Cloud Storage. My host has enabled both SSH and rsync on my shared hosting. Unfortunately, I cannot install rclone, which would fit this job nicely. I have set up a Google Cloud f1-micro VM and will be using this as the link, as well as for writing the syncs to GCS.
How can I connect the two?
Here is the command I'm using to rsync to a remote server via SSH. How could this be adapted to send to Google Cloud Storage? At the moment it returns the error Unexpected remote arg.
rsync gcs1 ssh cpanelusername@hostingip:public_html/r1/ -p 26
There is gsutil rsync, but I'm unsure whether it would work in this setup:
https://cloud.google.com/storage/docs/gsutil/commands/rsync
Does this need to run in the opposite direction, i.e., from my Google VM, which would connect to my shared hosting over SSH and run the rsync?
NOTE: Our shared hosting's SSH port is 26.
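One way to wire this up, assuming the sync runs on the f1-micro VM (host, paths, and bucket name below are placeholders): pull from the shared hosting over SSH on port 26 into a staging folder, then push that folder to GCS with gsutil rsync.

# Pull from shared hosting into a local staging folder on the VM
rsync -avz -e "ssh -p 26" cpanelusername@hostingip:public_html/r1/ ~/gcs1/
# Mirror the staging folder into the GCS bucket
gsutil rsync -r ~/gcs1 gs://your-bucket/r1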

FTP through FileZilla to a Google Cloud machine, can't achieve it

Before asking this question I looked through Google and tried different alternatives, none of which were successful for me, sadly. I'm a little above the noob level. What I want is basically to host a WordPress site on a Google Cloud Debian machine.
I was doing fine installing services through their SSH access until I got to the point where I installed an FTP service and wanted to access it from a remote computer (my own). I only got as far as:
Status: Waiting to retry...
Status: Connecting to 104.197.183.19...
Response: fzSftp started
Command: open "root@104.197.183.19" 22
Error: Connection timed out
Error: Could not connect to server
I kept on looking and trying new ways until I found the gcloud documentation for FTP, but it is not aimed at newcomers, so my questions are:
Where do I input the gcloud commands, on my computer or in the SSH console (the Google Cloud machine)?
Do I need to use gcloud for FTP remote access, or can I do it entirely through my computer and their SSH machine?
Do I really need to add an SSH authorization file to FileZilla, or is there a way I can disable that check on my VPS so it lets me sign in with just a username and a password?
What I already tried that didn't work for me:
gcloud documentation for SSH and FTP
Google Cloud documentation for setting up a WordPress site
Many others
Basically, what I need in short is to manage to access the VPS through FTP so I can continue with my learning. I've been stuck there for two days.
To get access to a user's public area, i.e. public_html:
Go to the account's cPanel area and, under Security > SSH Access, import a key file.
You can use PuTTYgen to make one; you will need both a private and a public key.
Paste the keys into the boxes.
You may get a warning message about the private key; this is OK.
Go to Manage under the public key and authorize it.
Or
Make one using the interface in cPanel and download both keys.
Then in FileZilla:
Host: IP of the server
Protocol: SFTP
Logon Type: Key File
Key File: the .ppk you made
(If you asked cPanel to make the file, select the one that does not end in .pub and FileZilla will convert it to a .ppk file for you.)
After clicking Connect you should be in.
If you still get an error, make sure the SSH port (22) is open in your firewalls, both in Google Cloud (cloud.google.com > Networks) and in WHM (the LFD/CSF plugin).
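If you are on Linux or macOS rather than Windows, you can generate the key pair with ssh-keygen instead of PuTTYgen (the file name here is an assumption; paste the contents of the .pub file into cPanel):

ssh-keygen -t rsa -b 4096 -f ~/.ssh/cpanel_key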
Use the SSH File Transfer Protocol.
There is no need to install an FTP service.
Use WinSCP to connect via SFTP.
The recommended way of transferring files to a Unix-based Google Compute Engine VM is via the gcloud compute copy-files command. For this, please install the Google Cloud SDK. Then, run a command such as the following:
gcloud compute copy-files --zone=<Compute Engine zone> /path/to/local/file.txt <Compute Engine instance name>:/path/to/destination/file.txt
If you'd like to use FileZilla, you'll have to configure it for access. The SSH daemon on Compute Engine VMs is set up for key-based authentication. This forum post indicates how this is possible in FileZilla. The catch is that you need to put your public key on the VM, which can be a little tricky. gcloud compute copy-files and gcloud compute ssh take care of this for you, which is why they are the recommended method.
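For example, running gcloud compute ssh once generates a key pair at ~/.ssh/google_compute_engine and propagates the public key to the VM for you (instance name and zone are hypothetical):

gcloud compute ssh my-instance --zone=us-central1-a

Afterwards you can point FileZilla at the VM's external IP over SFTP using that private key.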

Gsutil - Installing and configuring on a remote server. How to automate it?

I have installed gsutil on a server to access my GCS buckets. I followed the instructions in the section 'How to convert gsutil to use OAuth 2.0' at https://cloud.google.com/storage/docs/gsutil_install
The intermediate steps require copy-pasting a URL into the browser to generate a code that you then have to enter back on the terminal. You also need to enter proxy server details (if any).
I am looking for ways to automate this setup and configuration process for gsutil.
Any ideas/references/suggestions/comments are welcome.
Thanks.
Can you say more about what you're trying to do? Are you looking to create distinct credentials for each of a set of users, or are you trying to set up gsutil running on multiple machines all as part of an application that authenticates as that application to Google Cloud Storage?
For the former, you need users to set up their own credentials. The web-based dialog for OK'ing the creation of OAuth2 credentials was designed to make it unlikely that a customer could grant long-lasting credentials without being aware that they are doing so (for security reasons).
For the latter, you should use a service account (see https://cloud.google.com/storage/docs/authentication#service_accounts). You create those credentials once and then deploy them on your production machines along with gsutil, which is a valid security approach because all instances of those machines authenticate on behalf of an application, not distinct users.
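For the service-account case, a minimal non-interactive setup sketch is to deploy a .boto config alongside gsutil (the key file path is an assumption):

# Write a boto config pointing at the service account's key file
cat > ~/.boto <<'EOF'
[Credentials]
gs_service_key_file = /path/to/service-account-key.json
EOF

With that file in place, gsutil authenticates as the service account without any browser step.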