How to run a shell script on another server from a Kettle job on one server? - pentaho

Using Pentaho, I want to execute a shell script that resides on another server from a Kettle job (.kjb) file, using the "Execute a shell script" component. I was able to run a script locally by giving the script file name in that component, but I can't figure out how to run a remote script.
Here is what my simple kjb looks like:
Any ideas?

That is pretty straightforward using the Shell step from a Pentaho job:
1) On the General tab, check "Insert script".
2) On the Script tab, you can now add this inline script:
ssh user@remotemachine 'ls -l'
This will execute ls -l on the remote server via SSH.
Make sure the SSH user is allowed to log in with SSH keys, so it is not asked for a password.
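For reference, a minimal sketch of that key setup, assuming the Kettle job runs as the local user shown and user@remotemachine is the account from the example above (the key file name and options are just the OpenSSH defaults):
# On the server running the Kettle job, as the user Kettle runs under:
ssh-keygen -t rsa -b 4096 -N "" -f ~/.ssh/id_rsa    # key with no passphrase
ssh-copy-id user@remotemachine                      # install the public key
# Verify that no password prompt appears; the inline script in the
# Shell step can then simply be:
ssh user@remotemachine 'ls -l'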

Related

SSH with command Bat file [duplicate]

I have a scenario where I need to run a Linux shell command frequently (with different filenames) from Windows. I am using PuTTY and WinSCP to do that (requires login name and password). The file is copied to a predefined folder on the Linux machine through WinSCP, and then the command is run from PuTTY. Is there a way I can automate this through a program? Ideally I would like to right-click the file in Windows and issue a command which would copy the file to the remote machine and run the predefined command (in PuTTY) with the filename as an argument.
PuTTY usually comes with the "plink" utility.
This is essentially the "ssh" command-line command implemented as a Windows .exe.
It is pretty well documented in the PuTTY manual under "Using the command line tool plink".
You just need to wrap a command like:
plink root@myserver /etc/backups/do-backup.sh
in a .bat script.
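For example, a minimal sketch of such a wrapper, assuming plink.exe is on the PATH and key-based login is already set up (the file name do-backup.bat is hypothetical):
@echo off
rem do-backup.bat - run the remote backup script over SSH
plink root@myserver /etc/backups/do-backup.sh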
You can also use common shell constructs, like semicolons, to execute multiple commands. E.g.:
plink read@myhost ls -lrt /home/read/files;/etc/backups/do-backup.sh
There could be security issues with common methods for auto-login.
One of the easiest ways is documented below:
Running Putty from the Windows Command Line
And as for the part that executes the command:
In the PuTTY UI, under Connection > SSH, there's a field for a remote command.
4.17 The SSH panel
The SSH panel allows you to configure options that only apply to SSH sessions.
4.17.1 Executing a specific command on the server
In SSH, you don't have to run a general shell session on the server. Instead, you can choose to run a single specific command (such as a mail user agent, for example). If you want to do this, enter the command in the "Remote command" box.
http://the.earth.li/~sgtatham/putty/0.53/htmldoc/Chapter4.html
In short, your answer might be as simple as the text below:
Let PuTTY run a command on the remote server
You can write a Tcl script that establishes an SSH session to that Linux machine and issues commands automatically. See http://wiki.tcl.tk/11542 for a short tutorial.
You can create a PuTTY session and auto-load the script on the server when starting the session:
putty -load "sessionName"
Under "Remote command", point to the remote script.
You can do both tasks (the upload and the command execution) using WinSCP. Use a WinSCP script like:
option batch abort
option confirm off
open your_session
put %1%
call script.sh
exit
Reference for the call command:
https://winscp.net/eng/docs/scriptcommand_call
Reference for the %1% syntax:
https://winscp.net/eng/docs/scripting#syntax
You can then run the script like:
winscp.exe /console /script=script_path\upload.txt /parameter file_to_upload.dat
Actually, you can put a shortcut to the above command into Windows Explorer's Send To menu, so that you can then just right-click any file and go to Send To > Upload using WinSCP and Execute Remote Command (= the name of the shortcut).
For that, go to the folder %USERPROFILE%\SendTo and create a shortcut with the following target:
winscp_path\winscp.exe /console /script=script_path\upload.txt /parameter %1
See Creating entry in Explorer's "Send To" menu.
Here is a totally out of the box solution.
Install AutoHotkey (AHK)
Map the script to a key (e.g. F9)
In the AHK script,
a) FTP the commands (.ksh) file to the Linux machine
b) Use plink like below (see the batch sketch after this list). plink should be installed if you have PuTTY.
plink sessionname -l username -pw password test.ksh
or
plink -ssh example.com -l username -pw password test.ksh
All the steps will be performed in sequence whenever you press F9 in windows.
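For steps a) and b), a rough batch sketch of what the hotkey could invoke, assuming psftp (which also ships with PuTTY) is used instead of plain FTP for the upload; upload.txt is a hypothetical psftp batch file containing a "put test.ksh" line:
@echo off
rem a) upload the command file
psftp -pw password -b upload.txt username@example.com
rem b) run it remotely
plink -ssh example.com -l username -pw password test.ksh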
Code:
using System;
using System.Diagnostics;

namespace playSound
{
    class Program
    {
        public static void Main(string[] args)
        {
            // args[0] is the remote host name
            Console.WriteLine(args[0]);

            // Launch plink and have it run the commands in commands.txt remotely
            Process amixerMediaProcess = new Process();
            amixerMediaProcess.StartInfo.CreateNoWindow = false;
            amixerMediaProcess.StartInfo.UseShellExecute = false;
            amixerMediaProcess.StartInfo.ErrorDialog = false;
            amixerMediaProcess.StartInfo.RedirectStandardOutput = false;
            amixerMediaProcess.StartInfo.RedirectStandardInput = false;
            amixerMediaProcess.StartInfo.RedirectStandardError = false;
            amixerMediaProcess.EnableRaisingEvents = true;
            amixerMediaProcess.StartInfo.Arguments = "-ssh username@" + args[0] + " -pw password -m commands.txt";
            amixerMediaProcess.StartInfo.FileName = "plink.exe";
            amixerMediaProcess.Start();
            Console.Write("Press any key to continue . . . ");
            Console.ReadKey(true);
        }
    }
}
Sample commands.txt:
ps
Link: https://huseyincakir.wordpress.com/2015/08/27/send-commands-to-a-remote-device-over-puttyssh-putty-send-command-from-command-line/
Try MtPuTTY;
you can automate the SSH login in it. It's a great tool, especially if you need to log in to multiple servers many times.
Another tool worth trying is Tera Term. It's really easy to use for the SSH automation stuff. But my favorite one is always MtPuTTY.
In case you are using key-based authentication, using a saved PuTTY session seems to work great, for example to run a shell script on a remote server (in my case an EC2 instance). The saved configuration will take care of authentication.
C:\Users> plink saved_putty_session_name path_to_shell_file/filename.sh
Please remember that if you save your session with a name like user@hostname, this command will not work, as the session name will be treated as part of the remote command.

Execute External commands Fitnesse/DBFit from within a Test page

I would like to execute external commands in my Fitnesse/DBFit test, from within a Test page.
For example, as part of a Unit Test:
run a remote ssh command towards host ABC and copy a file with win cp command
run a DB2 load command to import the data in the database from the file
run a stored procedure
verify results
Can anyone advise regarding step 1)?
There have been a few implementations of a CommandLineFixture; google "commandlinefixture" and you'll find them.

Jenkins SSH remote process is getting killed as soon as the Jenkins SSH plugin returns back

Jenkins version: 1.574
I created a simple job which performs the following:
Using "Execute shell script on remote host using SSH" as one of the BUILD steps, I'm just calling a shell script. This shell script performs stop and start operations on Tomcat to restart an application on the target machine.
I have a valid username, password, port defined for the target SSH server in Jenkins Global settings.
I saw this behavior: when I run the Jenkins job and call the restart script (which gets the application name as parameter $1), it works fine, but as soon as the "Execute shell script on remote host using SSH" step completes, the new process dies on the remote/target application server.
If I run the script from the target/remote server itself, everything works fine and the new process/PID stays alive. But when the same script runs from Jenkins, though I see no errors and everything appears to work, the new process dies as soon as the above-mentioned SSH step completes and control returns to the next BUILD step, or the Jenkins job finishes.
I saw a few posts/blogs and tried setting BUILD_ID=dontKillMe in the Jenkins job (in various places, i.e. Prepare Environment variables, and also using Inject Environment variables). When the job's build is complete, I can see that the Environment Variables for that build do say BUILD_ID=dontKillMe (instead of the default timestamp value).
I tried putting nohup before calling the restart script, i.e.,
nohup restart_tomcat.sh "${app}"
I also tried:
BUILD_ID=dontKillMe nohup restart_tomcat.sh "${app}"
This doesn't give any error, and it creates a nohup.out file on the remote server (which I'm not worried about, since restart_tomcat.sh creates its own LOG file, which I cat after the script completes; the cat is done in another "Execute shell script on remote host using SSH" build step, and it successfully shows the log file created by the restart script).
I don't know what I'm missing at this point, but as soon as the restart_tomcat.sh step is complete, the new PID/process on the remote/target server dies.
How can I fix this?
I've been through this myself.
On my first iteration, before I knew about Jenkins ProcessTreeKiller, I ended up just daemonizing Tomcat. The Apache Tomcat documentation includes a section on running as a daemon.
You can also try disabling the ProcessTreeKiller for your whole Jenkins instance, if it's relatively small (read the first link for information).
BUILD_ID=dontKillMe should be passed to the shell, and therefore it should be on your command line, not in the Jenkins global configuration or job parameters.
BUILD_ID=dontKillMe restart_tomcat.sh "${app}" should have worked without problems.
You can also try nohup restart_tomcat.sh "${app}" & with the & at the end.
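For instance, a sketch combining both hints in the remote SSH build step (restart_tomcat.sh and ${app} are the names from the question):
BUILD_ID=dontKillMe nohup restart_tomcat.sh "${app}" > /dev/null 2>&1 &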
My solution (it worked after trying everything else) in Ubuntu 14.04 (Trusty Tahr) (Amazon AWS - Amazon EC2), Jenkins 1.601:
Exec command: (setsid COMMAND < /dev/null > /dev/null 2>&1 &);
Exec in PTY: DISABLED
// Example COMMAND=socat TCP4-LISTEN:1337,fork TCP4:127.0.0.1:1338
I created this Transfer as my last one:
#!/bin/ksh
export BUILD_ID=dontKillMe
I added the above line to the start of my script and the issue was resolved.
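Putting the pieces from the answers above together, a sketch of how the top of such a restart script could look so the launched process survives the SSH session (the Tomcat path is hypothetical):
#!/bin/ksh
# stop Jenkins' ProcessTreeKiller from reaping the new process
export BUILD_ID=dontKillMe
# detach the new process from the SSH session entirely
(setsid /opt/tomcat/bin/startup.sh < /dev/null > /dev/null 2>&1 &)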

How to pass argument to another jenkins build step

In a Jenkins job I need to execute a shell on the Jenkins machine. I use the Execute shell step, and from there I get a value which has to be used in another step, Execute shell script on remote host using SSH. I can't find a way to pass that value...
Here is my jenkins job configuration:
I appreciate any help.
Content of the file which contains the script:
echo "VERSION=`/data/...`" > env.properties
Path of the properties file:
env.properties
In the shell script using SSH:
echo "... $VERSION"
Something like this works for the build, so maybe it works here too.
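A sketch of the whole pattern, assuming the EnvInject plugin provides the "Inject environment variables" step (the /data/... command is elided in the original and left as-is):
# Build step 1 - "Execute shell" on the Jenkins machine:
echo "VERSION=`/data/...`" > env.properties
# Build step 2 - "Inject environment variables", with
# Properties File Path set to env.properties.
# Build step 3 - "Execute shell script on remote host using SSH";
# the injected value is now visible:
echo "Deploying version $VERSION"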

Teradata client on Unix Solaris

I deploy some .bteq and .sql scripts on a Teradata database. For doing this, I use a client on my desktop called BTEQWin, version 13.10.0.03.
I get the .bteq/.sql files from a version control system like PVCS/SVN etc., and once the files are in my workspace folder (from the version control tool), I just drag and drop them from the Windows browser into the BTEQWin client (which I connect to a database prior to the drag/drop, for running those scripts).
Now, I have to automate this whole process in UNIX.
I have written a shell (ksh/bash) script which gets all the .bteq/.sql files from a TAG/LABEL in the version control tool into a given UNIX folder. Now, all I need to do is pass these files one by one (I'll take care of the order) to the Teradata client.
My question: what client do I need to tell the Unix admin team to install on the Unix server, so that I can run something like the below:
someTeraDataCommand -u username -p password -h hostname -d database -f filenametoexectue | tee output_filename.log
Where someTeraDataCommand is the client/executable which will let me run Teradata scripts (like I was doing using BTEQWin on my desktop in a GUI session). Other parameters can be the username, password, which database to connect to on which server, and which file to run, or the file can be passed to the command using the "<" operator on the command line.
Any idea what client?
Assuming the complete Teradata Tools and Utilities package is installed on your UNIX server (which will have the connectivity tools to talk to Teradata), you should have access to bteq from the command line. Something like this:
bteq < script_file > output_file
Your script file should contain a .LOGON statement to establish the connection:
.LOGON yourTDPID/your_account,your_pw
You might also need to use other commands to set your default database or non-default session values.
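For example, a sketch of the top of such a script file (the TDPID tdprod, user jdoe, password secret, and database my_reporting_db are all hypothetical):
/* tdprod, jdoe, secret and my_reporting_db are hypothetical */
.LOGON tdprod/jdoe,secret
.SET WIDTH 200
DATABASE my_reporting_db;
SELECT CURRENT_DATE;
.QUIT 0;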
Another option would be to combine the SQL and the call to BTEQ in a Korn shell script:
#!/usr/bin/ksh
##############
SHELL_NAME=`basename $0`
PRG_NAME=`basename ${SHELL_NAME} .ksh`
LOG_FILE=${PRG_NAME}.log
OUT_FILE=${PRG_NAME}.out
#
bteq <<EOBTQ > ${LOG_FILE} 2>&1
.LOGON {TDPID}/{USERID},{PWD};
--.RUN file=${LOGON}
/* Add your SQL/BTEQ commands here */
.QUIT 0;
EOBTQ
Edit
The double hyphen indicates a single-line comment. Typically in a UNIX script you do not leave your password in the plain text of a ksh script. The .RUN command would reference a text file, in a barely sufficient secure location, containing the .LOGON {TDPID}/{USERID},{PWD}; command.
The .RUN command in BTEQ allows you to reference another text file containing a series of valid BTEQ commands that you want to run in the current BTEQ script.
The easiest way to set up the Solaris TTU is to request root sudo and run an interactive installation with the defaults, as root. That would cure all client issues.