Run a program after ssh session terminates - ssh

I am trying to run a long Java program on a server via ssh. I tried using:
command &
which works fine on my local machine, but when I terminate my local ssh connection, the program also terminates. Is there a way I can run this and have the output go to some text file, so I can log in and check on its completion at a later date?

You can use nohup
nohup command &
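By default nohup sends the command's output to a file called nohup.out in the current directory. Since you want the output in a specific text file, you can redirect it yourself (the file name output.log below is just an example):
nohup command > output.log 2>&1 &
Later you can log back in and check output.log, for instance with tail -f output.log, to see how far the program has got or whether it has finished.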

Related

How does running ssh with a command string work behind the scenes?

I have some difficulties understanding how ssh works behind the scenes when I run it with a command string.
Normally, I type ssh rick@1.2.3.4, I am logged into the remote machine, and I run some commands. If I don't use nohup or disown, then once I close the session, all the running processes I started will stop. That's the ordinary case.
But if I run ssh with a command string, things are different: the process started by ssh does not stop when I close the session.
For example:
Run from the local command line: ssh rick@1.2.3.4 "while true; do echo \"123test\" >> sshtest.txt; done"
Run a remote script: ssh rick@1.2.3.4 './remoteScript_whichDoTheSameAsAbove.sh'
After closing the session with Ctrl+C or kill <pid> on the local machine, I can still see the process running on the remote machine with ps -ef.
So my question is: could someone give a short introduction to how ssh works when I run it with a command string like the above?
Also, I get very confused when I see these 2 related questions while searching:
Q1: Getting ssh to execute a command in the background on target machine. What is this question asking for? Isn't ssh rick@1.2.3.4 'some command' already run in a separate shell/pts? I don't understand what "in the background" is for.
Q2: Keep processes running after SSH session disconnects. Doesn't simply running a remote script, ssh rick@1.2.3.4 "./remoteScript.sh", meet his requirement? I get very confused when I see so many "magic" answers under that question.
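One small experiment that shows part of what is going on (a rough check, not a full explanation; the host is a placeholder): ssh allocates a pseudo-terminal for an interactive login, but by default it does not allocate one when you pass a command string, which changes how hangup reaches the remote processes.
ssh rick@1.2.3.4 'tty'      # usually prints "not a tty"
ssh -t rick@1.2.3.4 'tty'   # forces a pseudo-terminal and prints something like /dev/pts/0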

PuTTY output triggers VBA script action

Some context first: I'm using VBA to automate PuTTY into logging into many hosts over ssh, retrieving info and saving it.
I can do that currently; however, each time I execute a command, I have to set some waiting time just so the command finishes before I send the next one. Is there any way to feed PuTTY's output back into VBA, so VBA knows when to send the next command? That would reduce execution time.
Sub RetrievingInfo()
    ' lines that open PuTTY and ssh to host X
    SendKeys "first command for putty session"
    Application.Wait Now + TimeSerial(0, 0, estimatedSeconds) ' <---- here is the problem
    SendKeys "second command for putty session"
End Sub
Do not use PuTTY. That's a GUI application, not intended for automation.
Use Plink from the PuTTY package. Plink is a console application intended for automation. It uses standard input/output, so you do not have to hack command sending by simulating keystrokes.
Why do you even execute commands this way? Why don't you specify a script file using the -m switch?
See Automating command/script execution using PuTTY.
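As a minimal sketch of that approach (the host, credentials and file names are placeholders, not tested against your environment), put the commands you currently send with SendKeys into a local text file and run:
plink -ssh -batch user@host -m commands.txt > host_output.txt
Plink exits when the remote commands finish, so your VBA code can launch it synchronously, wait for it to return, and then read host_output.txt, instead of guessing at waiting times.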

How to run SQL queries in the background instead of using nohup on Linux

I am looking for a solution to the following problem. I am trying to execute a procedure in sqlplus after logging into a Linux server using PuTTY.
Is there any possibility to run the procedure in the background, so it keeps running even if I close the PuTTY session while the procedure is still executing, instead of using nohup?
I heard it is possible: apparently one procedure ran for six days even though it was not started under nohup. They executed the procedure directly at the sqlplus prompt, closed the PuTTY session, and it kept executing.
Will the server take care of the procedure?
Thanks in advance
Siva
There are 2 ways to execute any script on Oracle in the background. First create a script named test.sql.
You can use '&', which will put it in the background:
sqlplus username/password @test.sql &
Or execute it using the nohup command:
nohup sqlplus username/password @test.sql &
Then you can close the PuTTY session.
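If you also want a log you can inspect later, you can combine nohup with output redirection (the log file name test.log is just an illustration):
nohup sqlplus username/password @test.sql > test.log 2>&1 &
After logging back in, tail test.log shows how far the procedure has got.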

AWS process launched from SSH terminates when SSH hangs up

I use SSH to connect to my AWS EC2 instances and run code that takes a long time to complete. I find that if my local computer sleeps (or even if I leave it unattended for a bit), the SSH connection hangs up (which is not fatal in itself), but this seems to terminate the code on the EC2 instance that I launched via SSH.
Also, I use SSH to locally monitor the execution of my remote code, so even if there's a way to tell the remote process to stay alive after SSH has gone, I still need a way to locally see the output of the process as it continues to run (without SSH).
How do I keep code running on my AWS EC2 instance after SSH has hung up, and how can I monitor the output of such a process?
When you close your tty (when the ssh connection closes, in your case) your process receives a SIGHUP, and the default action on SIGHUP is to terminate. To avoid that, you can start the command with nohup, so that the SIGHUP is ignored rather than delivered to your command, or trap SIGHUP in your code and ignore it.
There are a bunch of ways to track a background process, but perhaps the easiest is to have it write to a file and read that file from another ssh session. If your process is really a command on the command line, you can redirect its standard output and standard error to a file. When such a file keeps getting new content, it is annoying to keep re-opening it, which is where tail -f comes in handy.
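Putting those two pieces together, a minimal sketch might look like this (the script and log file names are placeholders):
nohup python my_long_job.py > job.log 2>&1 &
Then, from any later ssh session:
tail -f job.log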
Here is how you can configure your ssh connection to stay alive:
vi ~/.ssh/config # on your client side
and add this line to send a keepalive ("null") packet every 120 seconds:
ServerAliveInterval 120
If you control the server side, make a similar change:
vi /etc/ssh/sshd_config
and add these lines at the bottom of the config file:
ClientAliveInterval 120
ClientAliveCountMax 720
This is for Linux; YMMV with the equivalent settings on other OSes.
Use screen
local> ssh ...
remote> screen
remote+screen> python long_running.py ...
You can then detach from screen (Ctrl-A, then d) and even disconnect from SSH, and when you return by SSHing back in again, you can run
remote> screen -r
to reconnect to your running code.

Tornado stopped running on AWS immediately after I terminated my remote session

I'm using SSH to remotely launch Tornado on Amazon Web Service. It works fine when I launch it by:
python startTornado.py
However, after my SSH session times out or is terminated, the Tornado server also stops immediately, so I can't access the webpage anymore. I did quite a bit of searching but couldn't find an answer on Google.
How can I keep Tornado and the site running after my SSH session is terminated?
The process will shut down when you log out if it's running in the foreground, or if it tries to write to stdout and the terminal it was writing to no longer exists. Try starting the server with
nohup python startTornado.py &
The nohup command makes the process ignore the hangup signal (SIGHUP) and redirects its output to a file (nohup.out by default), and the & at the end runs the command in the background. Alternatively, you can use the screen utility, which allows you to detach a terminal and reattach it in a different ssh session (see the screen man page for details).
While all the above solutions solve the immediate problem, what you might really need in order to run such processes in production and control them (start/restart/stop) is supervisor. It is Python-based and is most useful when you have to run multiple instances of Tornado behind nginx.
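As a rough illustration only (the program name, paths and log location are assumptions you would adapt to your setup), a supervisor program section could look like:
[program:tornado]
command=python /home/ubuntu/startTornado.py
directory=/home/ubuntu
autostart=true
autorestart=true
redirect_stderr=true
stdout_logfile=/var/log/tornado.log
With that in place, supervisorctl start tornado, supervisorctl restart tornado and supervisorctl stop tornado control the process, and it keeps running after your SSH session ends.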
In addition to nohup, as Kevin mentioned, you can also use the disown command if you are using bash:
disown <job-id>
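A typical sequence, assuming bash job control (the command and log file name are placeholders):
long_running_command > output.log 2>&1 &
disown %1
Here %1 refers to the first background job of the current shell; jobs -l lists the job IDs if you have more than one. Once a job is disowned, the shell no longer forwards SIGHUP to it when you log out, and the redirection keeps its output available in output.log.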