SSH session dies on remote machine after detaching using tmux

I have a few scripts running on a remote machine that I have ssh access to. I connect from my local machine (a Mac) in a terminal using tmux: I create a new session with $ tmux new and ssh into the machine. I start my scripts and everything is great. I detach using Ctrl-b d and the scripts continue running. However, after my local machine gets rebooted or goes to sleep, the scripts on the remote machine stop. When I reattach the session using tmux attach -t <session name> I see the following error:
packet_write_wait: Connection to <remote ip> port 22: Broken pipe
That suggests that the session ended. What can I do to ensure that the scripts continue to run no matter what happens on my local machine?
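For reference, the workflow described above looks roughly like this (the session name, host, and script name are placeholders, not taken from the question):
# on the Mac: start a local tmux session
tmux new -s work
# inside that session: ssh to the remote machine and start the scripts
ssh user@remote-host
./run_scripts.sh
# detach from tmux locally with Ctrl-b d; later reattach with:
tmux attach -t work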

Related

Spyder -- Connect to remote kernel via proxy

I'm trying to connect to a remote kernel in Spyder; however, the machine on which it is running is not directly accessible. Rather, to connect to it I must go through a bastion host / jumpbox as follows:
ssh -i ~/.ssh/id_rsa -J me@jumpbox me@remote
which logs me directly into remote, automatically sending the connection through jumpbox.
I have python -m spyder_kernels.console running on remote, where I want to do my computing, but there is no way to connect to it directly since it's only accessible from jumpbox. I've tried setting up my ssh config with a ProxyJump entry, which works for logging into the machine through ssh on the command line, but it appears that Spyder ignores the config file when setting up the remote kernel connection.
Is there a way to connect to this remote kernel? It appears there's a way to do this with IPython and I know I can do it with Jupyter Notebook, but I'm wondering if I can do this in Spyder.
(Related: Connect spyder to a remote kernel via ssh tunnel)
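For context, the kind of ~/.ssh/config ProxyJump entry mentioned in the question might look like the sketch below (the HostName values are placeholders; only the user name, key path, and host aliases come from the question):
Host jumpbox
    HostName jumpbox.example.com
    User me
    IdentityFile ~/.ssh/id_rsa

Host remote
    HostName remote.example.com
    User me
    ProxyJump jumpbox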
I don't know if you're still looking for an answer to this, but for future people arriving here, and for my own reference:
Yes, you can. You have to create an ssh-tunnel and connect Spyder to the kernel via localhost. For you that would look something like this:
ssh -J me@jumpbox -L 3336:localhost:22 me@remote
Here 22 is the port the ssh server on remote listens on; it is usually 22, unless the administrator changed it. 3336 is the port on localhost to connect to; you can choose any number you like above 1024 (ports below that are privileged).
Then proceed as explained in the Spyder docs, i.e., launch the Spyder kernel (in the environment you want) on remote:
python -m spyder_kernels.console
copy the connection file (kernel-pid.json) to your local computer:
scp -oProxyJump=me@jumpbox remote:/path/to/json-file/kernel-pid.json ~/Desktop
Change /path/to/json-file to the path of the connection file (which you can find by running jupyter --runtime-dir on remote, in the same environment the Spyder kernel is running in) and kernel-pid.json to the real file name. ~/Desktop copies it to your Desktop folder; you can change that to wherever you want.
Connect Spyder to the kernel via "Connect to existing kernel", point it to the connection file you just copied, check the This is a remote kernel (via SSH) box and enter localhost as the Hostname, and 3336 as the port (or whichever port you changed it to).
That should do it.
Note that, as was the case for me, your jumpbox may drop the ssh connection over which you launched the Spyder kernel, which will cause the kernel to die. So you might want to use
python -m spyder_kernels.console &
to have it run in the background, or launch it in a screen session. However, note that you cannot shut down a remote kernel with exit, and it will keep running (see here), so you have to kill it in a different way.
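If you go the screen route, a minimal sketch might look like this (the session name spyder-kernel is arbitrary):
# on remote: start a named screen session and launch the kernel inside it
screen -S spyder-kernel
python -m spyder_kernels.console
# detach from screen with Ctrl-a d; reattach later with:
screen -r spyder-kernel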

ssh tunnel VNC server connection closed unexpectedly

I have to tunnel the VNC server (tightvnc-server) running on my local machine to a remote server, so that the remote server can access my local machine without port forwarding on the router of the local machine's network.
Right now I am using the following command:
ssh -R 5950:localhost:5900 user@remote.ddns.net
Here 5900 is the VNC server port on my local machine, and I access my machine from the remote server by connecting to localhost:5950. When I try to connect using a VNC viewer I get the error "connection closed unexpectedly". Normally, if no connection exists, I get a "connection refused" error, so something seems to be missing in the tunneling. Can anyone please tell me what the reason could be?
You need to activate remote desktop (macOS Remote Management) by running:
sudo /System/Library/CoreServices/RemoteManagement/ARDAgent.app/Contents/Resources/kickstart \
    -activate -configure -access -on \
    -restart -agent -privs -all
Then you can connect using VNC Viewer.
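If the viewer still reports "connection closed unexpectedly", it is worth confirming that the reverse tunnel is actually listening on the remote server before blaming VNC. A minimal check, assuming a Linux remote server with ss available (the ports are the ones from the question):
# on the local machine: open the reverse tunnel (remote 5950 -> local 5900)
ssh -R 5950:localhost:5900 user@remote.ddns.net

# on the remote server: verify something is listening on port 5950
ss -tln | grep 5950

# then point the VNC viewer on the remote server at localhost:5950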

SCP times out, but ssh connection works fine. Am I doing something wrong?

I'm trying to copy task1.zip from my desktop (/Users/myname/desktop, according to pwd) to a remote server. I'm connected to the remote server via ssh. I would like to copy the file to /its/home/jt463/task1 (the pwd path of that directory) on the remote server.
I have used the command below in the terminal that is connected to the server via ssh, and I have also tried it in the terminal on my own machine:
scp Users/myname/desktop/task1.zip username@inf900179.inf.susx.ac.uk:its/home/username/task1
Error that I get when I try to use the terminal that's connected to the remote server:
Users/jonatantibarovsky/desktop/task1.zip: No such file or directory
Error that I get when I try to use my local terminal:
ssh: connect to host inf900179.inf.susx.ac.uk port 22: Operation timed out lost connection
First scp to the intermediate server, using your credentials. Then, you should be able to scp from that server to the target.
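A minimal sketch of that two-step copy, assuming an intermediate host you can reach directly (gateway.example.com here is a placeholder for whatever server you normally ssh through):
# step 1: from the Mac, copy the file to the intermediate server
scp /Users/myname/desktop/task1.zip username@gateway.example.com:~/

# step 2: log in to the intermediate server and copy the file on to the target
ssh username@gateway.example.com
scp ~/task1.zip username@inf900179.inf.susx.ac.uk:/its/home/username/task1/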

start ssh-agent in remote ssh session

I connect to a server that runs Xubuntu and start ssh-agent there. Then I execute ssh-add on the remote server and run rsync commands that would otherwise require entering the password multiple times.
With my solution I only have to enter it one time. But how can I keep the ssh-agent running permanently? I want to reuse it over multiple ssh sessions.
My solution so far:
ssh myhost 'eval $(ssh-agent); ssh-add;'
You can use agent forwarding in ssh (the -A switch). Basically, the agent runs on your local host, and when you connect to myhost with forwarding enabled, that agent is available in all your sessions and you will not be prompted for the password again.
The agent should be started automatically with your local session, so all you need to do is add your keys locally and then connect to remote hosts with the -A switch.
It is not possible to keep an ssh-agent running permanently on the remote host, since it runs under your session. If you never close the first session there is some way to connect to its agent, but that is certainly not what you want to do.
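A minimal sketch of the forwarding approach (the key path is an assumption; use whichever key your rsync target needs):
# on your local machine: make sure an agent is running and holds your key
eval "$(ssh-agent)"
ssh-add ~/.ssh/id_rsa

# connect with agent forwarding enabled; the remote session can now use the local agent
ssh -A myhost
# ...run your rsync commands there without being prompted again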

Remote debugging using gdbserver over ssh

I want to debug a process running on a remote box from my host box (I built the code on the host machine).
Both run Linux-type operating systems.
It seems I can only communicate with the remote box from the host box via ssh (I tested this using telnet).
I have followed the following steps to set this up:
On the Remote box:
Stop the firewall service:
service firewall_service stop
Attach the process to gdbserver
gdbserver --attach :remote_port process_id
On the Host box:
Set up port forwarding via ssh
sudo ssh remote_username@remote_ip -L host_port:localhost:remote_port -f sleep 60m
Set up gdb to attach to a remote process:
gdb file.debug
(gdb) target remote remote_ip:remote_port
When I try to start debugging by running 'target remote remote_ip:remote_port' on the host box, I get a 'Connection timed out' error.
If you can see anything I am doing wrong, anything to check, or an alternative way to debug remotely over ssh, I would be grateful.
Thanks
This command:
sudo ssh remote_username@remote_ip -L host_port:localhost:remote_port ...
forwards local host_port to remote_port on remote_ip's localhost. This is useful only if you could not just connect to remote_ip:remote_port directly (for example, if that port is blocked by a firewall).
This command:
(gdb) target remote remote_ip:remote_port
asks GDB to connect to remote_port on remote_ip. But you said that you can only reach remote_ip via ssh, so it's not surprising that GDB times out.
What you want:
ssh remote_username@remote_ip -L host_port:localhost:remote_port ...
(gdb) target remote :host_port
In other words, you connect to local host_port, and ssh forwards that local connection to remote_ip:remote_port, where gdbserver is listening for it.
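Putting the pieces together with concrete example values (the PID 1234 and port 9999 are placeholders):
# on the remote box: attach gdbserver to the running process, listening on port 9999
gdbserver --attach :9999 1234

# on the host box: forward local port 9999 to the remote gdbserver through ssh
ssh remote_username@remote_ip -L 9999:localhost:9999 -f sleep 60m

# still on the host box: point gdb at the local end of the tunnel
gdb file.debug
(gdb) target remote :9999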