Can you run jar files in the Pentaho Spoon tool?

I want to run a jar file containing Java code inside the Pentaho Spoon tool. The run would be a single step: accept an input, run the Java class on that data, and then provide the output somewhere. Is this possible with Pentaho Spoon?

This is very easy. Just pop the jar file into the lib directory with the other jars and call it from the Modified Java Script step (or maybe even the Java class step if that's more suitable for what you're trying to do).
I've done this several times, most recently for some custom decoding which had to be done in Java and couldn't be done in the Java class step because Janino didn't support it.
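For illustration, here is a minimal sketch of the kind of helper class you might package into that jar; the package and method names are made up, not anything Pentaho-specific. Once the jar sits in Spoon's lib directory, the Modified Java Script step should be able to call it with something like var decoded = Packages.com.example.CustomDecoder.decode(fieldValue);

package com.example;

// Hypothetical helper compiled into the jar that is dropped into Spoon's lib directory.
public class CustomDecoder {

    // Static entry point so the Modified Java Script step (or a Java class step)
    // can call it without any setup code.
    public static String decode(String encoded) {
        if (encoded == null) {
            return null;
        }
        // Placeholder logic; the real custom decoding would go here.
        return new StringBuilder(encoded).reverse().toString();
    }
}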

A different approach:
1. Leave the jar directly in your project; no copying necessary.
2. Create a simple bash script "run.sh" in your project folder with "java -jar yourjar.jar" in it. (You can also add that line directly within Spoon and skip this step.)
3. In your job, add an "Execute a shell script" entry and point it to run.sh.
4. Done; the exit code of the script determines success or failure (see the sketch below).
Why this way? I hate the way the API is documented, and there is no code completion in Spoon for Java or JavaScript; this way you can write your code outside and rely on working systems.
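Since the "Execute a shell script" job entry only looks at the exit code, the jar's main method just has to exit non-zero on failure. A minimal sketch, with a made-up class name:

package com.example;

// Hypothetical entry point for yourjar.jar; whatever this process exits with
// becomes the result code that run.sh (and therefore the job entry) reports.
public class Main {
    public static void main(String[] args) {
        try {
            // ... accept the input, run the processing, write the output somewhere ...
            System.exit(0); // success: the job continues
        } catch (Exception e) {
            e.printStackTrace();
            System.exit(1); // failure: the job entry is marked as failed
        }
    }
}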

Related

Exploring options to deliver Selenium test to non-tech business team who wish to run the test on their own

Our non-technical business team needs to run only one automation test script, which fetches data from several hundred rows of an Excel file. I've created that automation script as a Maven project (with a POM framework and Extent Report), using Selenium WebDriver, Java, TestNG, and Eclipse, but I'm not sure how to deliver the test/script to the business team. I would appreciate suggestions for a few ways to deliver this script to the business team so that they can change a few parameters in the Excel file and run the script on their own. I'm getting an Extent report at the end of the test; it would be best if they can get the Extent report as well.
Note: since they are non-technical, it is preferable to avoid installing and configuring Java, Eclipse, and similar tools on their machines, but that's not mandatory, so I'm open to checking several options.
I would like to suggest a few options here:
Jenkins: install it on any server and share the login and URL with them. They can execute the build, and no technical knowledge is required.
Executable JAR: export your project as an executable JAR and deliver that JAR file. All they have to do is double-click the JAR file, or run it via a batch file, and it will run the script and get the job done. Make sure all data-related files (i.e. the Excel file) are kept alongside the JAR. A minimal launcher sketch follows below.
Note: they will always need Java on the machine, regardless of which of the above options they prefer to use.
I usually deliver it as a batch file script; if you have Jenkins, integrate your .bat file into it, otherwise share the batch file as it is:
Create a packaged JAR,
Create a .bat file to execute the JAR.
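As a sketch of the executable-JAR option: the JAR's manifest needs a Main-Class that starts the TestNG run itself. The class and test names below are placeholders; TestNG's programmatic API (org.testng.TestNG) is used to launch the suite.

package com.company.tests;

import org.testng.TestNG;

// Hypothetical launcher declared as Main-Class in the executable JAR's manifest,
// so a double click (or a one-line .bat file) runs the automation test.
public class RunTests {
    public static void main(String[] args) {
        TestNG testng = new TestNG();
        // Point TestNG at the test class that reads the Excel file shipped next to the JAR.
        testng.setTestClasses(new Class[] { ExcelDrivenTest.class });
        testng.run();
        // Propagate failures through the exit code so a batch file can detect them.
        System.exit(testng.hasFailure() ? 1 : 0);
    }
}

The accompanying .bat file can then be as simple as java -jar automation.jar (again, the JAR name is just an example).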

Jade Agent using command lines

I am new to the Java Agent DEvelopment Framework (JADE) for developing agents.
I have been working with JADE in Eclipse; I created some agents and converted my project to .jar format. But now I want to test my .jar file by creating multiple agents.
How can I create one or multiple JADE agents using the command line?
Please check the tutorials before asking such questions. Visit the Help Center and read about how to ask good questions.
Nevertheless, to create an agent from the command line use:
java -cp lib\jade.jar;classes jade.Boot -gui -agents ping1:examples.PingAgent.PingAgent
(If jade.jar is already on the classpath, the shorter form java jade.Boot -agents "a:agents.AgentClass;b:agents.AgentClass" also works.)
Note that the classpath includes JADE classes (lib\jade.jar) and the previously compiled classes of the examples (classes). Note also that the value of the -agents option takes the form:
<agent-local-name>:<fully-qualified-agent-class>
Using JADE terminology, this is called an "Agent Specifier". More than one agent can be started by just typing several agent specifiers separated by a semicolon (';') as in the example below:
java -cp lib\jade.jar;classes jade.Boot -gui -agents ping1:examples.PingAgent.PingAgent;ping2:examples.PingAgent.PingAgent
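For reference, the fully qualified class named in an agent specifier is simply a subclass of jade.core.Agent. A minimal sketch matching the specifier above (the package layout follows the JADE examples):

package examples.PingAgent;

import jade.core.Agent;

// Minimal agent that jade.Boot can start via -agents ping1:examples.PingAgent.PingAgent
public class PingAgent extends Agent {

    @Override
    protected void setup() {
        // Runs once when the agent is created; real behaviours would be added here.
        System.out.println("Agent " + getLocalName() + " is ready.");
    }
}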
From the perspective of your personal workflow it is probably not very convenient to build the jar file each time you want to test your agents. The better way for debugging is to use the Eclipse debug capabilities.
For this, place jade.jar (and the other required libraries) in your Eclipse project and configure the Java build path so that these libraries are included. After that you should be able to set up a debug configuration where jade.Boot is the main class, while the remaining JADE options (for agents or services) go in the program arguments tab (hope this rough description is enough).
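Roughly, such a debug configuration would contain something like the following (paths and agent names are only examples):

Main class:        jade.Boot
Program arguments: -gui -agents ping1:examples.PingAgent.PingAgent
Classpath/libraries: the project's own classes plus lib\jade.jar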

Running deeplearning4j intellij program from command line interface

I want to run my deeplearning4j program, which I created as an IntelliJ project, on a server from the command line, and I don't have any clue how to do it. Any suggestions?
In our examples you'll see the maven-shade-plugin:
https://github.com/deeplearning4j/dl4j-examples/blob/master/dl4j-examples/pom.xml
Use this as a base.
That creates an uber jar already. Just use java -cp on the uber jar and you're set.
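For example, if the shade build produces something like target/my-dl4j-app-1.0-bin.jar (the actual name depends on your pom), the invocation on the server would look roughly like:

java -cp target/my-dl4j-app-1.0-bin.jar com.example.TrainingMain arg1 arg2

where com.example.TrainingMain is whatever class holds your main method and the trailing arguments are your command-line input; all of these names are placeholders.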

Possible to make an IntelliJ IDEA Run/Debug Config where the contents of the active editor window/tab are passed as a script parameter?

Does anyone know if there is a way to pass the contents of the Active editor window/tab in IntelliJ IDEA CE 11.x as a parameter to a Groovy script being executed as a Run/Debug configuration? I was hoping IDEA would have some concept of internal environment variables that might allow this (such as $_ACTIVE_EDITOR), but I have been unable to find anything that might help.
Essentially my use case is to take the contents of current window/tab and run a custom tool against it - the custom tool is a groovy script that accepts a String as an argument.
You can write a small wrapper that reads a file into a string and then calls your script. IDEA's External Tools can pass the current file name to this wrapper. There is also a macro for the selected text.
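A rough sketch of such a wrapper in Java; the script name is made up, and the assumption is that the External Tool passes the current file's path (e.g. via the $FilePath$ macro) as the first argument:

import java.nio.file.Files;
import java.nio.file.Paths;

// Hypothetical wrapper registered as an IDEA External Tool.
public class RunScriptOnFile {
    public static void main(String[] args) throws Exception {
        // args[0] is the path of the file open in the active editor.
        String contents = new String(Files.readAllBytes(Paths.get(args[0])), "UTF-8");
        // Hand the file contents to the Groovy script as its single String argument.
        Process p = new ProcessBuilder("groovy", "MyTool.groovy", contents)
                .inheritIO()
                .start();
        System.exit(p.waitFor());
    }
}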
Instead of the external tool you can install and use the Batch Scripts Support plug-in or Bash Support. They provide run configurations for command-line tools.
Also check the Shell Process plug-in; it claims to run external apps with the editor selection.

Using a variable obtained using a pre-build shell command to set an option for the Maven build in Hudson

I have a Hudson job that runs a Maven goal. Before this Maven goal is executed, I have added a step that runs before the build starts: a shell script that obtains the version number I want to use in the 'Goals and options' field.
So in my job configuration, under Build Environment I have checked the Configure M2 Extra Build Steps box and added a shell script before the build. The script looks like this:
export RELEASE={command to extract release version}
echo $RELEASE
And then under the Build section I point to my 'root pom'. In the Goals and options I then want to be able to do something like this:
-Dbuild.release.version=${RELEASE} deploy
Where build.release.version is a Maven property referenced in the POM. However, since the shell doesn't seem to make its variables global, it doesn't work. Any ideas?
The only one I have is to install the Envfile plugin and get the shell script to write out the RELEASE property to a file and then get the plugin to read the file, but the order in which everything is run may cause problems, and it seems like there must be a simpler way... is there?
Thanks in advance.
I recently wanted to do the same, but AFAIK it's not possible to export values from a pre-build shell to the job environment. If there is a Hudson Plugin for this I've missed it.
What did work, however, was a setup similar to what you were suggesting: having the pre-build shell script write the desired value(s) to a property-file in the workspace, and then using the Parametrized Trigger Plugin to trigger another job that actually does the work (in your case, invoke the Maven job). The plugin can be configured to read the parameters it passes from the property file. So the first job has just the shell script and the post-build triggers, and the second one does the actual work, having the correct parameters available as environment variables.
General idea of the shell script:
echo "foo=bar
baz=`somecmd`" > build.properties
And for your Goals and options, something like:
-Dbuild.release.version=${foo} deploy
Granted, this isn't as elegant as one might want but worked really well for us, since our build was broken into several jobs to begin with, and we can actually reuse the other jobs that the first one triggers (that is, invoke them with different parameters).
When you say it doesn't work, do you mean that your RELEASE variable is not passed to the Maven command? I believe the problem is that, by default, each line of the shell script is executed separately, so environment variables get lost.
If you want the entire shell script to execute as if it was one script file, make the first line:
#!/bin/sh
I think this is described in the Help information alongside the shell script build step (and if I'm wrong, that's a good place to look for the right syntax).