ZyXEL Config via SSH

I've been assigned the task of creating a script to back up ZyWALL configs.
I can connect to the firewall with the PuTTY SSH client. I've looked at all the documentation, but I could not find a way to output the config file from the shell.
I want one of our servers to connect to the firewall every day and capture the output of that command into a file as a backup.
Can you help me with displaying the config file in PuTTY?
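As a minimal sketch of what that daily job could look like: this assumes a ZLD-based ZyWALL whose CLI offers show running-config to print the active configuration (check your firmware's CLI reference), that the appliance accepts a command argument over SSH, and that key-based login is already set up; the host name is a placeholder:
# capture the config into a date-stamped file (host and CLI command are assumptions)
ssh admin@zywall.example.com "show running-config" > zywall-$(date +%F).conf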

Related

Connection to Hive via SSH in JetBrains DataGrip

I'm able to connect to and interact with a Hive database in the PuTTY terminal, but I'm clueless about how to do the same in DataGrip.
These are the steps I'm using in PuTTY:
Starting a session with connection type SSH, on the host (HostName) and port (22), which opens a terminal where I enter my login details.
Then I invoke a batch script on the remote server in that SSH session, which in turn calls other .sh scripts; this step sets various environment variables and defines a lot of Hive configuration. When the batch file completes I see a "hive>" prompt on the terminal, indicating I can run SQL queries.
Is there any way I can get DataGrip working in this environment: set up the driver location, working directory, home directory, everything on the remote server, and call this batch script from DataGrip itself?
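One possible approach (a sketch, not something from this thread): if HiveServer2 is running on the remote machine, you can forward its default port 10000 through SSH and point DataGrip's Hive JDBC driver at the tunnel; the user and host below are placeholders:
# forward local port 10000 to HiveServer2 on the remote side
ssh -L 10000:localhost:10000 user@hive.example.com
# then connect from DataGrip with a JDBC URL like:
# jdbc:hive2://localhost:10000/default
This assumes your environment-setup batch script matters only for the interactive hive> client, not for the server side.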

Is it possible to edit code on my own machine and save it to an account I've ssh'd into?

Scenario:
I'm using ssh to connect to a remote machine. From the command line I run ssh <hostname>, which connects me to that machine. I want to edit and run code on that remote machine. So far the only way I know is to create, edit, and run the files in vi inside the terminal window, because my only connection to that machine is that terminal window.
My Question is:
I'd love to be able to edit my code in VSCode on my own machine, and then use the command line to save that file to the remote machine. Does anyone know if this is possible? I'm using OS X and ssh'ing into a Linux (Fedora) machine.
Thanks!
Sounds like you're looking for a command like scp. SCP stands for secure copy protocol, and it builds on top of SSH to copy files from one machine to another. So to upload your code to your server, all you'd have to do is:
scp path/to/source.file username@host:path/to/destination.file
EDIT: As @Pam Stums mentioned in a comment below the question, rsync is also a valid solution, and is definitely less tedious if you would like to automatically keep client and server directories in sync.
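For example, a one-liner that keeps a remote copy of a project directory up to date (the trailing slash on the source matters: it means "the contents of this directory"):
rsync -avz path/to/project/ username@host:path/to/project/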
You could export the directory on the remote machine using NFS or Samba, mount it as a share on your local machine, and then edit the files locally.
If you're happy using vim, check out netrw (it comes with most vim distributions; :help netrw for details), which lets you use MacVim locally to edit the remote files.
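With netrw you can open remote files directly; one slash after the host means a path relative to your remote home directory, two slashes an absolute path:
vim scp://username@host/relative/path/to/file
vim scp://username@host//absolute/path/to/file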

How to copy data from a remote server to the local agent? (Using Bamboo)

I'm trying to import an existing folder from a remote server onto my local Bamboo agent.
I tried to create an SCP task in Bamboo, but it does not work, since Bamboo's SCP task does the opposite (copies from local to remote).
I'm now trying to create a Script task in Bamboo to import the folder.
I await your help.
Thank you in advance.
Using a Script task is the right way (or at least a working one) to implement a "reverse" SCP copy from a remote host to the Bamboo CI server.
In the script configuration, define your scp command as usual; note that the destination is simply a local path on the agent, and -P (uppercase) supplies the remote SSH port:
/usr/bin/scp -P <port> user@remote.host:/path/to/source_file /path/to/target_file
For this to work, don't forget to set up SSH public key authentication for password-less login.
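That is a one-time setup on the Bamboo agent (a sketch; user and host are placeholders):
# generate a key pair with no passphrase so the build can run unattended
ssh-keygen -t ed25519 -N "" -f ~/.ssh/id_ed25519
# install the public key on the remote host
ssh-copy-id -i ~/.ssh/id_ed25519.pub user@remote.host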
Another option is to use the custom bamboo-ssh-plugin, which provides an additional Reverse SCP task that can be configured and used in exactly the same way as the default SCP task.

Automatically run script/commands after connecting to OpenShift SSH

I set up an OpenShift application and set up my local PuTTY to connect to the server via SSH. Everything works fine, but I don't know how to automatically run a few commands (mainly aliases) after I connect to the server (I don't want to copy and paste the same commands every time I connect).
In my local Linux shell I can use .bashrc, but this doesn't seem to work on OpenShift. I can't write a file in my home directory (/var/lib/openshift/[some letters and numbers]/), and I don't know the right place to put this file. Does anybody know where I have to put a file that will be run every time I log in?
I'd prefer a solution that doesn't involve my local SSH software, as I'm connecting to this OpenShift application from different machines.
You can use your .bash_profile located in your $OPENSHIFT_DATA_DIR.
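For example (a sketch; this relies on the gear sourcing that file at login, as the answer states, and $OPENSHIFT_LOG_DIR is an OpenShift v2 environment variable):
# contents of $OPENSHIFT_DATA_DIR/.bash_profile
alias ll='ls -la'
alias logs='tail -f $OPENSHIFT_LOG_DIR/*.log'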
This would normally be done in .bashrc, .profile, or .bash_profile. Since you say those don't work, you can instead put the commands in a script file, scp that file to the remote server, and then run it as part of the ssh invocation, all in a single command.
I have not used OpenShift, but I have used AWS EC2 instances a lot with Ruby scripts:
ssh ubuntu@ec2-address ruby basic-auto.rb
The above command executes the Ruby file after the SSH connection is made. The script can be in any language, or may be a bash file (.sh) that executes after ssh.
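You can also combine the copy and the run into one line; bash's --rcfile flag makes the remote interactive shell read your script instead of ~/.bashrc, and -t gives ssh a terminal (file name and host are placeholders):
scp my-aliases.sh user@remote.host:/tmp/ && ssh -t user@remote.host 'bash --rcfile /tmp/my-aliases.sh'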

rsync remote to local automatic backup

I would like to automatically back up my server monthly and weekly. My server is running CentOS 5.5, and while searching the web I found a tool named rsync. I did my first backup manually by running this command in a terminal:
sudo rsync -chavzP --stats USERNAME@IPADDRESS:PATH_TO_BACKUP LOCAL_PATH_TO_BACKUP
I'm then prompted for that user's password, and Bob's your uncle.
This backs up the necessary files from my remote server to my local device, but does somebody know how I can automate this, e.g. run this command automatically every Sunday?
EDIT
I forgot to mention that I let DirectAdmin back up the files I need, and then copy those files from the remote server to a local server.
This command worked for me; combine it with a cron job:
rsync -avz username@ipaddress:/path/to/backup /path/to/save
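To run it on the schedule you describe, put it in the local user's crontab (crontab -e); cron cannot answer a password prompt, so this assumes SSH key authentication is set up first. The times and target paths below are just examples:
# min hour day-of-month month day-of-week  command
0 3 * * 0 rsync -avz username@ipaddress:/path/to/backup /path/to/save/weekly
0 3 1 * * rsync -avz username@ipaddress:/path/to/backup /path/to/save/monthly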