I am working with gnome-terminal and trying to set up a bash script to do the following:
Open up a new gnome-terminal - got that working
in the new window, ssh to another computer (let's call it XYZ) - got that working
This is what is NOT working:
In that (XYZ) computer, I want to ssh to another computer (call it ABC)
Point 1 is the only way we can get to computer ABC, so don't ask. :)
I can open the window and ssh to XYZ, but how do I fire off another ssh command from there? I cannot use a script on computer XYZ to fire off the SSH, because we have multiple shells open (up to 10-12) that ssh to XYZ and then ssh to several different computers.
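One approach that may work (a minimal sketch; it assumes a reasonably recent gnome-terminal that accepts -- to separate its own options from the command, and "me", XYZ, and ABC are placeholder user/host names) is to chain the two ssh commands and force a TTY with -t so the nested ssh gets an interactive shell:

#!/bin/bash
# Open a new terminal window, ssh to XYZ, and from there ssh straight on to ABC.
# -t forces a pseudo-terminal so the nested ssh behaves like an interactive login;
# "exec bash" keeps the window open after the remote session ends.
gnome-terminal -- bash -c 'ssh -t me@XYZ ssh -t me@ABC; exec bash'

Older gnome-terminal versions take -x (or -e) instead of --, so adjust to whatever your version accepts.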
Scenario:
I'm using ssh to connect to a remote machine. I use the command line and run ssh <hostname>, which connects me to that machine. I want to edit and run code on that remote machine. So far the only way I know is to create, edit, and run the files in vi in the command window, because my only connection to that machine is that command window.
My Question is:
I'd love to be able to edit my code in VSCode on my own machine, and then use the command line to save that file to the remote machine. Does anyone know if this is possible? I'm using OS X and ssh'ing into a Linux Fedora machine.
Thanks!
Sounds like you're looking for a command like scp. SCP stands for secure copy protocol, and it builds on top of SSH to copy files from one machine to another. So to upload your code to your server, all you'd have to do is run:
scp path/to/source.file username@host:path/to/destination.file
EDIT: As @Pam Stums mentioned in a comment below the question, rsync is also a valid solution, and is definitely less tedious if you would like to automatically sync client and server directories.
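For example, a minimal rsync invocation to push a whole project directory (the paths and host below are placeholders) might look like:

rsync -avz path/to/project/ username@host:path/to/project/

The trailing slashes tell rsync to sync the directory contents rather than nesting the directory inside the destination.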
You could export the directory on the remote machine using NFS or Samba, mount it as a share on your local machine, and then edit the files locally.
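As a rough sketch of the NFS route (assuming nfs-utils is installed on the Fedora side; the export path, client subnet, and mount point are placeholders, and macOS typically needs -o resvport when mounting from a Linux NFS server):

# on the Fedora machine: export the code directory
sudo systemctl enable --now nfs-server
echo '/home/me/code 192.168.1.0/24(rw,sync)' | sudo tee -a /etc/exports
sudo exportfs -ra

# on the OS X machine: mount the export locally and edit the files there
mkdir -p ~/remote-code
sudo mount -o resvport -t nfs the-remote-host.xyz:/home/me/code ~/remote-code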
If you're happy using vim, check out netrw (it comes with most vim distributions; :help netrw for details), which lets you use macvim locally to edit the remote files.
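For instance, netrw can open a remote file over scp directly (the user, host, and path here are placeholders; the double slash marks an absolute path on the remote machine):

vim scp://me@the-remote-host.xyz//home/me/code/source.file

Saving the buffer then copies the file back to the remote machine over scp automatically.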
How would you, in Ansible, make one remote node connect to another remote node?
My goal is to copy a file from remote node a to remote node b and untar it on the target, however one of the files is extremely large.
So doing it the normal way (fetch to the controller, copy from the controller to remote_b, then unarchive) is unacceptable. Ideally, from remote_a I would do something like:
ssh remote_b cat filename | tar -x
It is to speed things up. I can use the shell module to do this, but my main problem is that this way I lose Ansible's handling of SSH connection parameters. I have to manually pass an SSH private key (if any), or a password in a non-interactive way, or whatever else, to remote_b. Is there any better way to do this without copying the file multiple times?
Also, doing it over SSH is a requirement in this case.
Update/clarification: Actually, I know how to do this from the shell, and I could do the same in Ansible. I was just wondering if there is a better, more Ansible-like way to do it. The file in question is really large. The main problem is that when Ansible executes commands on remote hosts, I can configure everything in the inventory. But in this case, if I wanted a similar level of configurability/flexibility for the parameters of that manually established ssh connection, I would have to write it from scratch (maybe even as an Ansible module) or something similar. Otherwise, for example, just running ssh hostname command would require passwordless login or a default private key; I wouldn't be able to change the private key path set in the inventory without adding it manually, and for the ssh connection plugin there are actually two possible variables that may be used to set a private key.
This looks more like a shell question than an Ansible one.
If the two nodes cannot talk to each other, you can do
ssh remote_a cat file | ssh remote_b tar xf -
If they can talk to each other (one of the nodes can connect to the other), you can tell one remote node to connect to the other, like
ssh remote_b 'ssh remote_a cat file | tar xf -'
(Maybe the quoting is wrong; launching ssh under ssh is sometimes confusing.)
In this last case you will probably need to enter a password or set up public/private ssh keys properly.
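If you also need to pass connection parameters explicitly (say, a specific private key) rather than relying on defaults, a hedged sketch might look like the following; the key paths, users, and file locations are placeholders, and note that the inner -i path refers to a key stored on remote_b:

ssh -i ~/.ssh/remote_b_key me@remote_b \
  'ssh -i ~/.ssh/remote_a_key me@remote_a cat /path/to/file | tar xf - -C /destination/dir'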
I have to upload, compile and run some code on a remote system. It turned out that the following mechanism works fine:
rsync -avz /my/code me@the-remote-host.xyz:/my/code
ssh me@the-remote-host.xyz 'cd /my/code; make; ./my_program'
While it's maybe not the best looking solution, it has the advantage that it's completely self-contained.
Now, the problem is: I need to do the same thing on another remote system which is not directly accessible from the outside via ssh, but only via a proxy node. On that system, if I just want to execute a plain ssh command, I need to do the following:
[my local computer]$ ssh me@the-login-node.xyz
[the login node]$ ssh me@the-actual-system.xyz
[the actual system]$ make
How do I need to modify the above script in order to "tunnel" rsync and ssh via the-login-node to the-actual-system? I would also prefer a solution that is completely contained in the script.
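One way to keep everything in the script (assuming a reasonably recent OpenSSH with ProxyJump support; older versions would need -o ProxyCommand instead) might be:

#!/bin/bash
# -J hops through the login node to reach the actual system,
# both for the rsync transport and for the remote build command
rsync -avz -e 'ssh -J me@the-login-node.xyz' /my/code me@the-actual-system.xyz:/my/code
ssh -J me@the-login-node.xyz me@the-actual-system.xyz 'cd /my/code; make; ./my_program'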
I have the following scenario:
In my work computer (A) I open a byobu (tmux) session.
Inside byobu, I open several terminals. Some of them are local to (A), but in others I ssh to a different computer (B).
I go home, and from my home computer (C) I ssh to (A), run "byobu", and find all my sessions in (A) or (B).
This works perfectly, except for running X11 applications. I don't leave any X11 application running when I change computers, but just running "xclock" sometimes works and sometimes doesn't ("cannot connect to X server localhost:n.0").
I understand this depends on the DISPLAY variable, and that it would be set up such that X11 would connect to the computer where I ran "byobu" last before creating the session inside byobu, and that could be (A) or (C). My problem is that often I don't know how to fix a session that's not working any more. Sometimes I can just open another session (another tab in byobu) and use the value of $DISPLAY in other sessions, but that only works as long as the new session is open, and not always. In other cases I've been able to detach byobu (F6), re-attach it (run "byobu") and open a new ssh connection to (B), and then that one works, but not the already existing sessions.
I have read some documents like SSH, X11 Forwarding, and Terminal Multiplexers or How to get tmux X display to come back?, but it is unclear to me how they apply (if they do) to my situation. For instance, the .bashrc code of the former, should it be in (A), (B), or (C)?
UPDATE/EDIT: I have found the correct way to do this. Simply type this in any of the byobu shells:
. byobu-reconnect-sockets
and the DISPLAY environment variable for your new ssh connection, as well as SSH_AUTH_SOCK and several other variables that may be useful and that depend on the primary login shell (the one in which you run byobu attach-session -t session_name, or for the screen backend byobu -D -R session_name, or however you prefer to attach), will be updated in all the open shells.
This is all supposed to happen simply by pressing CTRL-F5, but I suspect that, like mine, your computer is intercepting CTRL-F5 (for me, it's iTerm on a Mac) and either doing its own thing with it or sending the wrong control character sequence, so byobu doesn't receive it properly. It's a bit more typing, but sourcing the shell script as indicated above will do the same thing CTRL-F5 is supposed to do, and will do it for ALL open byobu shells in the session. You can probably now ignore the rest of my original answer below, but I'll leave it there in case it is in some way useful to someone, perhaps for some other purpose.
Also, you can edit the byobu-reconnect-sockets script (it is just a shell script); there are places in it to add additional environment variables you want updated, so really none of the below is necessary.
(original answer follows)
When you ssh in again and reattach your byobu sessions, it is likely that the X11 display forwarded for your new ssh connection is not the same as the proxy display your initial ssh session created when you launched byobu. Suppose you ssh in for the first time and start a new byobu session with many shells and perhaps many forwarded X11 windows. This will all work fine, because that first ssh shell set the DISPLAY environment variable to what it is listening on for X11 connections. This might be something like
[~/]$ printenv DISPLAY
localhost:11.0
All shells started by byobu (and tmux or screen on the backend) are going to inherit the environment variables that were set when byobu was initially launched, i.e., the X11 display that was forwarded for your user by your first ssh connection.
Then you detach your byobu session and go home, and ssh back in. But this time you get a different X11 display, because some other user has localhost:11.0. In the new ssh session that you started from home, the value of DISPLAY might be, say, localhost:14.0. For X11 forwarding through this ssh connection, X11 clients need to connect to the ssh X11 proxy at display localhost:14.0, not localhost:11.0. You will likely not have the authorization keys for localhost:11.0 at that point (someone else will), or worse, if they have disabled X authentication, the X11 windows you are trying to open will start showing up on their screen. All you need to do to make it work is this:
detach byobu
you should now be in the current ssh shell. Do printenv DISPLAY and note the value shown (or copy it)
reattach byobu
In any shell you want to use X11 in, do export DISPLAY=localhost:14.0 (that's the value in this example; use whatever value you got in step 2)
X11 will now forward through ssh to your screen as you expect
The catch - you have to do this separately in every open byobu shell where you want to use X. To my knowledge there is no way to set it in all shells at once, although I think there may be a way to run an arbitrary command in all shells at the same time; I don't know the key sequence for that off the top of my head (there is a rough sketch of this at the end of this answer).
The annoying part - you have to do this every time you detach, disconnect your ssh connection, and then reconnect with ssh and reattach your byobu, since the DISPLAY environment variable in the new ssh shell has likely changed, while your byobu shells still have whatever was set for DISPLAY when byobu was initially started, or whatever you last set it to.
Even if you open new shells in byobu in some later ssh connection, those shells will still inherit the DISPLAY environment variable setting that was set when byobu was first started, all the way back to your first ssh connection. You have to do this with new shells too.
This annoys me constantly, and I'd love to take the time to develop some kind of hack to at least make it less tedious. Best of all would be to have it done along with CTRL-F5, which effectively does exactly all of this, as well as reconnecting some other things you often want tied to your new ssh session, especially SSH_AUTH_SOCK for ssh-agent.
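If updating each shell by hand gets too tedious, a rough sketch like the following might help (it assumes the tmux backend and is run from inside any byobu shell; localhost:14.0 is just the example value from above, and be aware that send-keys types into whatever is running in each pane, so panes not sitting at a shell prompt will receive the text as input):

# push the new DISPLAY value into every pane of the current session
for pane in $(tmux list-panes -s -F '#{pane_id}'); do
  tmux send-keys -t "$pane" 'export DISPLAY=localhost:14.0' Enter
done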
I have a 12-computer cluster with the same Java program on each machine, and I want to run these programs at the same time. How can I do this?
I can already copy (scp) files from one computer to another via ssh, like this:
#!/bin/bash
scp /route1/file1 user@computerX:/route2
scp /route1/file1 user@computerY:/route2
so I was wondering if something like this can be done to run the programs that I have on each computer.
You can run commands via
#!/bin/bash
ssh user@host1 <command>
ssh user@host2 <command>
You will need to use key-based authentication to avoid entering your password when the script runs.
Alternatively, take a look at Fabric for a neat way of controlling multiple hosts.
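For example, a simple loop (the hostnames, user, working directory, and java invocation below are placeholders) could start the program on every node roughly in parallel:

#!/bin/bash
# Backgrounding each ssh with & launches the remote programs at roughly
# the same time instead of one after another.
for h in computerX computerY computerZ; do   # list all 12 nodes here
  ssh user@"$h" 'cd /route2 && java MyProgram' &
done
wait   # block until every remote program has finished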
I recommend typing:
man ssh
and see what it says. That command will run commands remotely for you.