Apache Airflow command not found with SSHOperator - ssh

I am trying to use the SSHOperator to SSH into a remote machine and run an external application through the command line. I have set up the SSH connection via the admin page.
This section of code is used to define the commands and the SSH connection to the external machine.
sshHook = SSHHook(ssh_conn_id='remote_comp')
command_1 = """
cd /files/232-065/Rans
bash run.sh
"""
Where 'run.sh' is the following shell script:
#!/bin/sh
starccm+ -batch run_export.java Rans_Model.sim
Which simply runs the commercial software starccm+ with some options I have specified.
This section defines the task:
inlet_profile = SSHOperator(
    task_id='inlet_profile',
    ssh_hook=sshHook,
    command=command_1
)
I have confirmed the SSH connection works by giving a simple 'ls' command and checking the output.
The error that I get is:
bash run.sh, error: run.sh: line 2: starccm+: command not found
The command in 'run.sh' works when I am logged into the machine (it does not require a GUI). This makes me think that the SSH session Apache Airflow opens is not the same as the one I log into interactively, but I am not sure how to solve this problem.
Does anyone have any experience with this?

There is no issue with the SSH connection (at least judging from the error message). The issue is with the starccm+ installation path.
Please check the installation path of starccm+.
Check if the installation path is part of the $PATH environment variable:
$ echo $PATH
If not, either install it in a standard location like /bin or /usr/bin (provided those are included in $PATH), or export the installation directory into the PATH variable like this:
$ export PATH=$PATH:/<absolute_path>
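Because the SSHOperator runs the command in a non-interactive session, an export placed only in your interactive shell's startup files may not be picked up. One option is to set PATH inside run.sh itself; a minimal sketch, assuming a hypothetical installation directory /opt/starccm/bin (substitute whatever the real location is on the remote machine):
#!/bin/sh
# run.sh: put the (hypothetical) starccm+ install directory on PATH before calling it
export PATH="$PATH:/opt/starccm/bin"
starccm+ -batch run_export.java Rans_Model.sim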

It is not ideal, but if you struggle with setting the PATH variable you can run starccm+ using its full path, like:
/directory/where/star/is/installed/starccm+ -batch run_export.java Rans_Model.sim
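If you are not sure where it is installed, you can look it up from an interactive session on the remote machine before hard-coding the path, for example:
$ which starccm+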

Related

One-liner ssh gets different environment variables than normal ssh

I am using AWS Elastic Beanstalk, in case it may be relevant to the question.
The issue that I have is that when I do from my local terminal:
ssh mozart-api printenv
I am missing most of the environment variables, whereas if I do:
ssh mozart-api
...wait for the session to open...
printenv
I get all the environment variables as I was expecting.
At first I thought it could be an SSH configuration issue on the server, but I can't find anything strange.
If I do:
ssh mozart-api "export hello=123 && echo $hello"
then it outputs 123, which means that variables can be set and queried; however, I just cannot get the existing variables from the server.
This is causing an issue because I am preparing a script that will run a command over SSH on this server, but because the variables are not loaded the project fails to open the database.
I tried re-importing them in a one-liner:
ssh mozart-api "sudo chmod +777 /etc/profile.d/sh.local && (/opt/elasticbeanstalk/bin/get-config environment | jq -r 'keys[] as \$k | \"echo export \(\$k)=\(.[\$k])\"') > /etc/profile.d/sh.local && printenv"
But I still can't see the newly added variables.
ssh mozart-api executes a login shell, which probably sources one or more files that define your environment variables.
ssh mozart-api printenv executes printenv instead of a login shell, so the only variables you see are the ones you inherit from the parent process, not any of the variables defined in your shell configuration files.
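If you need those variables in a single remote command, one common workaround is to ask for a login shell explicitly so that the profile files get sourced; a minimal sketch using the host from the question, assuming the variables are defined in files a login shell reads (e.g. /etc/profile or ~/.bash_profile):
ssh mozart-api 'bash -lc "printenv"'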

Why is $PATH different when executing commands via SSH and libssh?

I'm trying to run a command on a remote host via libssh2 as wrapped by the ssh2 Rust crate.
So I would like to run the command cargo build, but when I try to run it via libssh, I get the error:
cargo: command not found
However, when I ssh into the server manually from the command line everything works fine.
I have also noticed that $PATH is different when running ssh from the command line versus via libssh:
for instance when I echo $PATH
ssh gives me:
/home/<user>/.cargo/bin:/usr/share/swift/usr/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bi
while libssh gives me:
/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games
So it looks like what's happening is that the modifications made to $PATH inside .bashrc and .profile are not making it in when running via libssh.
I also get the same behavior if I run /bin/bash -c "echo ${PATH}"
Why would this be the case, and is there any way to get the same behavior in both these cases?
Please take a look at that question.
TL;DR A login shell first reads /etc/profile and then ~/.bash_profile. A non-login shell reads from /etc/bash.bashrc and then ~/.bashrc.
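In practice, that means the command string you pass to the libssh2 exec channel can itself request a login shell, or can call cargo by the full path that appears in the interactive $PATH above; a hedged sketch of both forms (assuming bash is the remote user's shell):
# force a login shell so ~/.profile / ~/.bash_profile get sourced
bash -lc 'cargo build'
# or bypass PATH entirely, using the .cargo/bin location shown in the question's interactive $PATH
$HOME/.cargo/bin/cargo build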

ssh: execute a command remotely that does not exist locally

Something like
ssh root@host "ls -l"
works fine
But when I'm trying
ssh root@host "showrgst"
I'm getting "command not found". And yes, I don't have the showrgst command on the host I'm connecting from.
How to solve this?
You need to install showrgst on the remote server and make sure the PATH environment variable there includes the directory containing showrgst.
First, locate the executable for this command on your local machine:
$ which showrgst
Say, for example, it is an executable script at $HOME/bin/showrgst.
Then copy this file to the server with scp:
$ scp ~/bin/showrgst yourserver.com:/home/username/bin/
If the command comes from a package available in your Linux distribution's repositories, you can instead install that package on the server.
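Either way, you can verify that the remote shell can now find it before wiring it into your scripts; a quick check using the question's placeholder host:
$ ssh root@host 'command -v showrgst || echo "showrgst not found on PATH"'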

I am getting error: ssh exit status 1

In a Jenkins post-build action I configured "Execute shell script on remote host using SSH".
SSH site: 10.32.25.66, command:
cd $HOME/appsadm/bin; ./ims-carte-stop
and then I modified it to:
cd /HOME/appsadm/bin; ./ims-carte-stop.*
I tried both of these commands and the build is successful, but the Jenkins console output afterwards shows that it is not executing my script; I am getting an ssh exit status 1 error.
In WinSCP, my script (ims-carte-stop) is in this location: home/appsadm/bin.
Please tell me if I am doing anything wrong.
My intention is to stop my server from Jenkins automatically whenever the build succeeds.
This may be a typo in your question, but:
You said your ims-carte-stop script is in:
/home/appsadm/bin
whereas your script is doing:
cd $HOME/appsadm/bin
or
cd /HOME/appsadm/bin
Looking at the paths, I am going to assume you are using a UNIX-flavoured OS (Linux, BSD, OSX).
UNIX paths are case sensitive. Your script should be calling:
cd /home/appsadm/bin
Note that the word "home" is all lowercase, not capitals. Also, prefixing it with $ makes it a variable, which I don't think you want.
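With the case fixed and the variable dropped, the remote command in the Jenkins step would look something like this (the && is my addition so the stop script only runs if the cd succeeds):
cd /home/appsadm/bin && ./ims-carte-stop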

How to run a script file remotely using SSH

I want to run a script remotely, but the system doesn't recognize the path. It complains: "no such file or directory". Am I using it right?
ssh kev@server1 `./test/foo.sh`
You can do:
ssh user@host 'bash -s' < /path/script.sh
Backticks will run the command in the local shell and put the results on the command line. What you're saying is 'execute ./test/foo.sh and then pass the output as if I'd typed it on the command line here'.
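If the script takes arguments, they can be passed after -s; a small sketch with placeholder arguments:
ssh user@host 'bash -s -- arg1 arg2' < /path/script.sh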
Try the following command, and make sure that that's the path from your home directory on the remote computer to your script.
ssh kev@server1 './test/foo.sh'
Also, the script has to be on the remote computer. What this does is essentially log you into the remote computer with the listed command as your shell. You can't run a local script on a remote computer like this (unless there's some fun trick I don't know).
If you want to execute a local script remotely without saving that script remotely you can do it like this:
cat local_script.sh | ssh user@remotehost 'bash -'
It works like a charm for me.
I even do that from Windows to Linux, given that you have MSYS installed on your Windows computer.
I don't know if it's possible to run it just like that.
I usually first copy it with scp and then log in to run it.
scp foo.sh user@host:~
ssh user@host
./foo.sh
I was able to invoke a shell script using this command:
ssh ${serverhost} "./sh/checkScript.ksh"
Of course, checkScript.ksh must exist in the $HOME/sh directory.
Make the script executable by the user "kev" and then try running it through the command:
ssh kev@server1 /test/foo.sh
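The "make it executable" part would be done on the remote machine, for example (using the script path from the question):
ssh kev@server1 'chmod +x ./test/foo.sh'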