Using mpirun with a provided SSH config file

I am trying to run MPI on an external server. As part of my goal, I'm attempting to run something in parallel across multiple nodes.
However, this external server has a broken default SSH configuration file that is read-only, so when I try to ssh to another external server without using ssh <server> -F ~/.ssh/config, it simply returns four different "Bad configuration option" errors. The problem is that -F is not an option I can use with mpirun, and I don't know whether there is any way to manually change which ssh configuration file mpirun uses.
What should I do?
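One workaround I'm considering, assuming mpirun launches remote processes with whatever ssh it finds first in PATH, is a small wrapper script, e.g. ~/bin/ssh:
#!/bin/sh
# forward everything to the real ssh, forcing the working config file
exec /usr/bin/ssh -F "$HOME/.ssh/config" "$@"
followed by chmod +x ~/bin/ssh and putting ~/bin at the front of PATH before running mpirun. If the implementation is Open MPI, pointing the launcher at the wrapper explicitly, e.g. mpirun --mca plm_rsh_agent "$HOME/bin/ssh" ..., might also work (the MCA parameter name seems to vary between versions). Is something along these lines the right approach?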

how to test ReadDir for nfs server

I've modified some of the NFS server functions, using another server that sits between the client and the server.
I would like to test the ReadDir function for NFS, but whenever I try to test it, the command sent is ReadDirPlus (ls, ls -l, etc.).
Is there a specific command via the terminal (bash) to request a ReadDir command from the NFS server?
Use NFS version 2, instead of 3. There's no READDIRPLUS in version 2. The client has to issue the READDIR, then do individual GETATTRs to retrieve the attributes for an ls -l.
If you're using Linux, simply issue the mount command with nfsvers=2.
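For example, a minimal mount (the export path and mount point are placeholders, and both client and server must still support NFSv2):
mount -t nfs -o nfsvers=2 server:/export /mnt/nfs
ls -l /mnt/nfs
The ls -l on that mount should now generate READDIR plus per-entry GETATTR calls instead of READDIRPLUS.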

cp: cannot create regular file ‘Users/James/Desktop’: No such file or directory

I'm trying to copy a file from a remote server to my desktop and I'm getting the above error. I've SSH'd to the server.
Here is what I'm doing:
deploy@ip-10-91-135-76 /data/project/current/lib/data $ scp customer_record.ods /Users/James/Desktop
I have very limited experience and don't understand what is going on.
Thanks a lot
man scp tells you how to use scp. In particular, most usages look like:
scp [[user1@]host1:]file1 [[user2@]host2:]file2
You can omit the user if it's the same as your current user, and likewise for the host. Since you've SSH'd onto the server already, the start of your command is okay to be scp customer_record.ods, but the next argument has to include the user name and host of the target machine that you want to copy the file to, namely your home computer. Chances are you actually want to go the other way, since your home computer may not have a publicly accessible IP.
End the SSH session, go back to your home machine.
Do:
scp <user-you-sshd-as>@<server-you-sshd-to>:/data/project/current/lib/data/customer_record.ods /Users/James/Desktop
If you need to specify a private key, you can use the -i option: scp -i <path-to-key> ...
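For example, run from your home machine (the key path here is just a placeholder, and the server has to be reachable from your machine under some public name or address):
scp -i ~/.ssh/my_key.pem deploy@<server-you-sshd-to>:/data/project/current/lib/data/customer_record.ods /Users/James/Desktop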

Smart way to copy multiple files from different paths using scp [duplicate]

I would like to know an easy way to use scp to copy files and folders that are present in different paths on my file system. The SSH destination server requests a password and I cannot put this in configuration files. I know that scp doesn't have a password parameter that I could supply from a script, so for now I must copy each file or directory one by one, writing my password every time.
In addition to the already mentioned glob:
You can use {,} to define alternative paths/path parts in a single statement,
e.g.: scp user@host:/{PATH1,PATH2} DESTINATION
From this site:
Open the master
SSHSOCKET=~/.ssh/myUsername@targetServerName
ssh -M -f -N -o ControlPath=$SSHSOCKET myUsername@targetServerName
Open and close other connections without re-authenticating as you like
scp -o ControlPath=$SSHSOCKET myUsername@targetServerName:remoteFile.txt ./
Close the master connection
ssh -S $SSHSOCKET -O exit myUsername@targetServerName
It's intuitive, safer than creating a key pair, faster than creating a compressed file and worked for me!
If you can express all the names of the files you want to copy from the remote system using a single glob pattern, then you can do this in a single scp command. This usage will only support a single destination folder on the local system for all files though. For example:
scp 'RemoteHost:/tmp/[abc]*/*.tar.gz' .
copies all of the files from the remote system which are named (something).tar.gz and which are located in subdirectories of /tmp whose names begin with a, b, or c. The single quotes protect the glob pattern from being interpreted by the shell on the local system.
If you cannot express all the files you want to copy as a single glob pattern and you still want the copy to be done using a single command (and a single SSH connection which will ask for your password only once) then you can either:
Use a different command than scp, like sftp or rsync, or
Open an SSH master connection to the remote host and run several scp commands as slaves of that master. The slaves will piggyback on the master connection which stays open throughout and won't ask you for a password. Read up on master & slave connections in the ssh manpage.
Create a key pair and copy the public key to the server side.
ssh-keygen -t rsa
Append the contents of ~/.ssh/id_rsa.pub to the server-side user's ~/.ssh/authorized_keys file. You will no longer need to type a password.
However, be careful: anybody who can access your local account can ssh to the server without a password as well.
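For reference, a minimal sketch of those two steps (ssh-copy-id ships with most OpenSSH installations; user@server is a placeholder):
ssh-keygen -t rsa
ssh-copy-id user@server
If ssh-copy-id isn't available, appending manually also works: cat ~/.ssh/id_rsa.pub | ssh user@server 'cat >> ~/.ssh/authorized_keys'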
Alternatively, if you cannot use public key authentication, you may add the following configuration to SSH (either to ~/.ssh/config or as the appropriate command-line arguments):
ControlMaster auto
ControlPath /tmp/ssh_mux_%h_%p_%r
ControlPersist 2m
With this config, the SSH connection will be kept open for 2 minutes so you'll only need to type the password the first time.
This post has more details on this feature.
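With that in place, a sequence like the following (file names and host are made up) asks for the password only once; the later commands reuse the cached master connection as long as they start within the ControlPersist window:
scp file1.txt user@remotehost:/tmp/
scp file2.txt user@remotehost:/tmp/
ssh user@remotehost uptime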

Can I forward env variables over ssh?

I work with several different servers, and it would be useful to be able to set some environment variables such that they are active on all of them when I SSH in. The problem is, the contents of some of the variables contain sensitive information (hashed passwords), and so I don't want to leave it lying around in a .bashrc file -- I'd like to keep it only in memory.
I know that you can use SSH to forward the DISPLAY variable (via ForwardX11) or an SSH Agent process (via ForwardAgent), so I'm wondering if there's a way to automatically forward the contents of arbitrary environment variables across SSH connections. Ideally, something I could set in a .ssh/config file so that it would run automatically when I need it to. Any ideas?
You can, but it requires changing the server configuration.
Read the entries for AcceptEnv in sshd_config(5) and SendEnv in ssh_config(5).
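For example, to forward a variable named MYSECRET (a placeholder name), put this on the client side in ~/.ssh/config:
Host *
    SendEnv MYSECRET
and this on the server side in sshd_config, then reload sshd:
AcceptEnv MYSECRET
After that, export MYSECRET locally before running ssh and it will appear in the remote session's environment.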
Update:
You can also pass them on the command line:
ssh foo@host "FOO=foo BAR=bar doz"
Regarding security, note that anybody with access to the remote machine will be able to see the environment variables passed to any running process.
If you want to keep that information secret it is better to pass it through stdin:
cat secret_info | ssh foo@host remote_program
You can't do it automatically (except for $DISPLAY which you can forward with -X along with your Xauth info so remote programs can actually connect to your display) but you can use a script with a "here document":
ssh ... <<EOF
export FOO="$FOO" BAR="$BAR" PATH="\$HOME/bin:\$PATH"
runRemoteCommand
EOF
The unescaped variables will be expanded locally and the result transmitted to the remote side. So the PATH will be set with the remote value of $HOME.
THIS IS A SECURITY RISK: don't transmit sensitive information like passwords this way, because anyone can see the environment variables of every process on the same computer.
Something like:
ssh user@host bash -c "set -e; $(env); . thescript.sh"
...might work (untested)
It's a bit of a hack, but if you cannot change the server config for some reason it might work.

Automatically cd to a given remote path when connecting via ssh

I have a bunch of remote servers that I regularly connect to via ssh, which I've set up in my ~/.ssh/config file. I was wondering if it is possible to specify a remote path to cd to when I connect to some of these servers?
For example, I may have something like this in my config file:
Host testbox
HostName 192.123.456.789
User root
And when I ssh in to testbox, I'd like to also cd to /var/www/apps/myapp.
I've had a look around but cannot see an option that would do that via the .ssh/config file.
Cheers,
Diego
You can do this with a tool I've open-sourced that allows you to SSH and cd, aptly named sshcd. For the example you've given, you'd simply use:
sshcd root@testbox:/var/www/apps/myapp
Hope this helps!
There's an option in the authorized_keys file.
Do a man on sshd, look under the heading "AUTHORIZED_KEYS FILE FORMAT". You can add various options to each authorized key - one is command="command". As the manpage says, "Specifies that the command is executed whenever this key is used for authentication."
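A hedged sketch of what such an entry could look like, as a single line in the server-side ~/.ssh/authorized_keys (the key material is truncated and the path is just the one from your question); note that a forced command applies to every connection made with that key, so plain scp over the same key would stop working:
command="cd /var/www/apps/myapp && exec bash -l" ssh-rsa AAAA... your-key-comment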