Adding an external JAR to Pentaho Kettle

I am working with Pentaho Kettle version 5.0.1. In one of my transformations I use a JavaScript component that calls a method located in a JAR which I copied to the lib folder of data-integration, and everything works fine locally. But in my dev environment (where I run it using Kitchen) I don't have permission to copy my JAR into the lib folder because of restrictions on the server. Is there any other way to supply the path to my custom JAR at run time so that the Kettle job/transformation can use it while executing? Is there a way Kettle can pick up the JAR from a location other than data-integration/lib? Any help will be appreciated.

Take a look at kitchen.sh (and pan.sh). At some point the script starts adding entries to the classpath, and you can add more folders there.
You still need permission to edit the kitchen.sh file, though. If you can't get that, I suggest creating a writable copy of kitchen.sh in a separate location and changing its $BASEDIR to point at the actual PDI installation, so that the copy can live somewhere you control.
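For example, the relevant lines in the copied kitchen.sh would end up looking roughly like this (a sketch only; the variable names differ between PDI versions, so match them to what your script actually uses, and the /export/home path is just an example):
# point BASEDIR at the real PDI installation so the relocated copy still works
BASEDIR=/opt/pentaho/data-integration
# append your own folder of JARs to the classpath the script builds up
CLASSPATH=$CLASSPATH:/export/home/custom-jars/*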

If you have permission, you can put your JAR in another directory and then list that directory in launcher.properties, which you will find in data-integration/launcher.
For example, if you put your JAR in /export/home,
add that path to the libraries entry in launcher.properties: libraries=../test:../lib:../libswt:/export/home
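The edited data-integration/launcher/launcher.properties would then read something like this (the first three entries are the defaults quoted in the answer; relative entries are resolved against the launcher directory, which is why the extra folder is given as an absolute path):
# before
libraries=../test:../lib:../libswt
# after: absolute path appended for the custom JAR folder
libraries=../test:../lib:../libswt:/export/home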

Related

Maven project is not picking up the local repository after copying

I had a working workspace set up with Maven. For unavoidable reasons I have to change systems, and hence I need to set up a new workspace, so I copied the .m2/repository folder from the working system to the new one.
Unfortunately Maven is not picking up JARs from the local repository and is throwing compiler errors. I copied the folder to .m2/repository under my home directory.
Can anyone please help me here?
Thanks,
Rengasami R
The local repository path is defined in your settings.xml file (found either in M2_HOME/conf or USER_HOME/.m2). Check that the value of its <localRepository> element matches the path you copied the repository to.
Another option is to run mvn help:effective-settings, which prints the settings.xml content Maven is actually using, so you can find this information easily.
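For reference, the element in question looks like this; ${user.home} expands to your home directory, so this value matches the default location:
<settings>
  <!-- must point at the directory you copied the repository into -->
  <localRepository>${user.home}/.m2/repository</localRepository>
</settings>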

How can I write to a file within an Eclipse bundle?

I have an XML configuration file located in my plugin's resources. I want to update this file whenever some event happens in the plugin. I have found methods to locate and read the contents of a file on my plugin's classpath, but I'm looking for a way to write to such a file.
Is there any way?
Many thanks.
That location (the install directory) is intended to be read-only since it may be shared in a network install scenario. I suggest you instead write the XML file to your plugin's state location which is intended for just this purpose:
String path = Activator.getDefault().getStateLocation().toString();
I should add that this gives you a fully qualified path to the directory created by Eclipse for any files your plugin wants to store. This directory is unique to your plugin.
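As a minimal sketch (assuming Activator is your plugin's activator class and xmlContent holds the serialized document; neither name comes from the answer above):
import java.io.File;
import java.io.FileWriter;
import java.io.IOException;
import org.eclipse.core.runtime.IPath;

// resolve a file inside the plugin's private state directory
IPath stateLocation = Activator.getDefault().getStateLocation();
File configFile = stateLocation.append("config.xml").toFile();
// write (or overwrite) the configuration there instead of in the install dir
try (FileWriter writer = new FileWriter(configFile)) {
    writer.write(xmlContent);
} catch (IOException e) {
    // handle or log the failure as appropriate for your plugin
}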

Bamboo artifacts

I am very new to Bamboo. I have an HTML file generated using log4j, and I wish to publish it as a user-defined artifact but don't know how.
It is in the surefire-reports folder, so I tried setting the Source directory to "**/target/surefire-reports/" and the Artifact Copy Pattern to "**/*.html", but it doesn't seem to work.
Any idea how to configure it?
Try changing the copy pattern to
*.html
and verify that your complete path is correct.
I wanted to get all surefire reports from each module, so I created a new Artifact definition with:
Name = Surefire Reports
Location =
Copy Pattern = **/target/surefire-reports/*.*
This was using Bamboo 3.2.2.
The Location field does not support Ant-style copy patterns; it only accepts a fixed path relative to the working directory.
Set the Location to target/surefire-reports
and the Copy pattern to **/*.html
Also make sure that the Shared checkbox is set, otherwise other jobs will not be able to download the artifact.
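Putting that together, the artifact definition would look something like this (field labels as they appear in Bamboo's UI; exact wording may differ between versions):
Name: Surefire Reports
Location: target/surefire-reports
Copy pattern: **/*.html
Shared: checked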

TeamCity: merge two configurations and deploy

I have two TeamCity configurations: one holds my common helpers and reusable components, and the other is a website which uses the common project.
I use a third configuration to publish to a test environment.
When the third configuration is run, I would like it to get the artifacts from the common project, merge them with the website output, and deploy. Am I asking for too much?
This ought to be pretty straightforward.
On ThirdConfig add two artifact dependencies: one whose source is CommonProject, and another whose source is WebProject. When configuring an artifact dependency, you can specify which artifact files are actually pulled from CommonProject and WebProject into ThirdConfig via the 'Artifact paths' option. The artifact files can then be placed into a new folder hierarchy specific to ThirdConfig by using the 'Destination path' option. These two options ought to be enough to create a directory structure that is the merging of CommonProject and WebProject. That takes care of the merge part.
The deploy is a bit more tricky. To my knowledge, TeamCity does not support any sort of 'copy or upload to external location' step out of the box. For this bit you'll need to create an MSBuild script (or batch file, or anything that can be run from the command line). That script can expect the file/directory structure you've created via the artifact dependencies, with the root of the structure being the script's initial working directory, and it need only push those files out to your deploy location. That 'push' is of course specific to your environment: FTP, UNC share, etc.
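As a sketch of the batch-file route (the deploy folder name and the \\deployserver\website target are made-up examples, assuming a Windows agent):
rem Runs from the build working directory, where TeamCity has already
rem placed the merged artifact dependencies under .\deploy
robocopy "%CD%\deploy" "\\deployserver\website" /MIR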

How can I make deployed resources editable with Maven 2?

I have a project where I create a JAR which contains a bunch of classes with main() plus a set of scripts which set the environment to invoke them. Most of those are long running processes which log a lot (~10-20GB).
This means I have a pretty complex log4j.xml file which, being in src/main/resources/, goes into the JAR. When something breaks in the production system, I'd like to modify the logging on the fly for a single run.
So I came up with the idea of having a conf/ directory in production and putting it first on the classpath. Then I thought it would be great if M2 put the config files in there (instead of in the JAR). But an automated deployment would then overwrite any manual changes, which I strongly dislike. I'm also not fond of timestamps and things like that.
So my next idea was this: M2 should leave the config files in the JAR but create copies of them with the name *.tpl in the conf/ directory. The admin could then copy a template to its basename to override the files in the JARs. The .tpl files would be overwritten on deployment, but that wouldn't hurt. Admins would have full control over which version of the logging config was active, and they could run a diff to see whether any important changes had been made.
Now the question: has anyone seen a plugin which automates this process? That is, one which creates a conf/ directory with all (or a selected subset) of everything in src/main/resources/ and renames the files?
Best practice in Maven for handling config files is to place them in a separate conf directory and pack them into a binary assembly using the assembly plugin. Placing a configuration file like log4j.xml in src/main/resources doesn't make sense, since it is not a true application resource but a configuration file.
We cope with the overwriting by packing the configuration files with the suffix .def. For example, myapp.properties is packed into the assembly as myapp.properties.def. When the person who uses the assembly unpacks it, it will not overwrite their original files. After unpacking, they simply merge the two with an external tool (we use meld on Fedora Core).
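A sketch of how that renaming can be expressed in an assembly descriptor (the source path, output directory, and assembly id are assumptions for illustration):
<assembly>
  <id>bin</id>
  <formats>
    <format>zip</format>
  </formats>
  <files>
    <file>
      <!-- packed under a .def suffix so unpacking never clobbers a live config -->
      <source>src/main/conf/myapp.properties</source>
      <outputDirectory>conf</outputDirectory>
      <destName>myapp.properties.def</destName>
    </file>
  </files>
</assembly>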
I may be missing something, and this doesn't directly answer the question, but did you consider producing a zip assembly of the exploded content of the required artifacts, to be unzipped on the target environment?
Sounds like you're attacking the problem from the wrong end. Why not just run the application with -Dlog4j.configuration=file:/some/where/my-log4j.properties? (log4j treats a value without a URL scheme as a classpath resource, hence the file: prefix.) If you want, you can instead add a command-line flag to main() which invokes the PropertyConfigurator directly.
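A minimal sketch of that second suggestion (the --log-config flag name is invented for illustration):
import org.apache.log4j.PropertyConfigurator;

public class Main {
    public static void main(String[] args) {
        // Hypothetical flag: --log-config /some/where/my-log4j.properties
        for (int i = 0; i < args.length - 1; i++) {
            if ("--log-config".equals(args[i])) {
                // Re-read the logging setup from an external file, overriding
                // whatever configuration was bundled inside the JAR.
                PropertyConfigurator.configure(args[i + 1]);
            }
        }
        // ... rest of the long-running job ...
    }
}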