Specifying jar file in maven build argument - maven-2

We build our project with Maven, and we want to run our unit tests as part of the Maven build itself. To do that we currently need to add the DB2 driver jar to the dependencies of all the sub-projects.
Instead of doing that, we are looking for a way to specify the absolute path of the jar file as an mvn command line argument, so that it can be used when running the unit tests.
This is because the driver jar is available in our app server's lib folder and we don't want to declare it in the dependencies of our projects.
We couldn't find a suitable solution by googling, hence asking for an expert solution here.
Any workaround would be of great help.
Thanks in advance.

The usual way would be to add a dependency on the database driver and limit it to testing (test scope). The library is then available for the unit tests, but is not deployed or packaged into the jar.
Practically speaking, I'd create a Maven artifact for this driver (just a basic POM file) and place it in the build server's Maven repository (or in Nexus, if you use it for your projects).
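For instance, a test-scoped dependency could look roughly like this (the DB2 driver coordinates below are placeholders; use whatever coordinates you publish the driver under):
<dependency>
  <groupId>com.ibm.db2</groupId>
  <artifactId>db2jcc</artifactId>
  <version>9.7</version>
  <scope>test</scope>
</dependency>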

I'm using a dependency with scope set to 'system' to reference a jar that is available in the container but not in any Maven repository. In this case the jar is put in a folder named 'lib' in the project, like this:
<dependency>
  <groupId>groupId</groupId>
  <artifactId>artifactId</artifactId>
  <version>version</version>
  <scope>system</scope>
  <systemPath>${project.basedir}/lib/library.jar</systemPath>
</dependency>
The groupId, artifactId and version can be set to any value you want. The trick is that system dependencies have to be given with an absolute path, which is worked around here by using the project.basedir property. It should also be possible to specify the complete path as a property.
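As a sketch of that last idea (the property name db2.driver.path and the DB2 coordinates are made up for illustration), the systemPath could reference a property that you supply on the command line:
<dependency>
  <groupId>com.ibm.db2</groupId>
  <artifactId>db2jcc</artifactId>
  <version>9.7</version>
  <scope>system</scope>
  <systemPath>${db2.driver.path}</systemPath>
</dependency>
and then run the build with something like:
mvn test -Ddb2.driver.path=/opt/appserver/lib/db2jcc.jar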

We build our project with Maven. We want to run our unit tests in the Maven build itself, and for that we need to add the DB2 driver jar to the dependencies of all the sub-projects.
Well, the Maven way would be to declare the DB2 driver as a dependency with test scope in a parent project.
Instead of doing that, we need a way to specify the absolute path of the jar file as an mvn command line argument to use when running the unit tests.
You could use an additionalClasspathElement in the Surefire plugin configuration to pass the path to the driver:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <additionalClasspathElements>
      <additionalClasspathElement>path/to/additional/resources</additionalClasspathElement>
    </additionalClasspathElements>
  </configuration>
</plugin>
If you turn that path into a property, you can pass the value on the command line.
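A sketch of what that could look like (again, the property name db2.driver.path is just an example):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <additionalClasspathElements>
      <additionalClasspathElement>${db2.driver.path}</additionalClasspathElement>
    </additionalClasspathElements>
  </configuration>
</plugin>
invoked with, for example:
mvn test -Ddb2.driver.path=/opt/appserver/lib/db2jcc.jar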
But to be honest, I can't understand why you don't install the driver in a corporate repository and declare it as a dependency. And if you don't have a corporate repository, use a file-based repo as described in this previous answer (please don't use the system scope bad practice). There is no good reason to go the hacky way.

Related

Running unit tests in Tycho fails: resolves google-collections instead of Guava

I am having an issue running tests with Tycho due to an incorrect dependency resolution that, somehow, is placing the old Google Collections jar on the classpath instead of the Guava one, despite the fact that at no point in any of my POMs do I specify a dependency on google-collections (only guava).
My unit tests fail with things like NoSuchMethodError (ImmutableList.copyOf) and NoClassDefFoundError (Joiner), which I have pretty much narrowed down to 'finding the wrong jar'. The same tests pass when run manually in Eclipse.
Here is the relevant part of the pom:
<dependencies>
  <dependency>
    <groupId>com.google.guava</groupId>
    <artifactId>guava</artifactId>
    <version>14.0.1</version>
  </dependency>
  ...
</dependencies>
The phrase 'google collections' appears nowhere. The only other repository I specify is:
<repositories>
  <repository>
    <id>helios</id>
    <layout>p2</layout>
    <url>http://download.eclipse.org/releases/helios</url>
  </repository>
</repositories>
My plugin imports 'com.google.common.base' and 'com.google.common.collect' as imported packages. I have my own bundled version of Guava 14 in my workspace for debugging, but in the POM I elect to not use my local module.
I followed Sean Patrick Floyd's answer on this question (JUnit throws java.lang.NoSuchMethodError For com.google.common.collect.Iterables.tryFind), and had my test throw an exception with the location of the .jar that the Iterables class was loaded from. It spat back out:
java.lang.IllegalArgumentException: file:/C:/Documents and Settings/Erika Redmark/.m2/repository/p2/osgi/bundle/com.google.collect/0.8.0.v201102150722/com.google.collect-0.8.0.v201102150722.jar
This is where I am now stuck. This google-collections jar is seemingly coming out of nowhere, and I don't know how to stop it. As long as it is being resolved, my unit tests will fail. How can I stop Tycho from trying to get the old Google Collections?
Just to clarify, this has not stopped building and deployment; the plugin update site is on a CI platform and we have been able to install the plugin in different Eclipse IDEs, so this issue only affects the tests.
Please let me know if additional information is needed.
The plug-in com.google.collect 0.8.0.v201102150722 is part of the Helios p2 repository that you have configured in your POM. This means that this plug-in is part of the target platform and so may be used to resolve dependencies.
If you want to ensure that the bundle is not used, make sure that it is not part of the target platform. In your case, the easiest way to do this is to explicitly remove the plug-in from the target platform:
<plugin>
  <groupId>org.eclipse.tycho</groupId>
  <artifactId>target-platform-configuration</artifactId>
  <version>${tycho-version}</version>
  <configuration>
    <filters>
      <filter>
        <type>eclipse-plugin</type>
        <id>com.google.collect</id>
        <removeAll />
      </filter>
    </filters>
  </configuration>
</plugin>
Next, you need to make sure that the guava plug-in is part of the target platform. You can add an artifact from a Maven repository to the target platform in the following way:
Declare a Maven dependency to the artifact in the dependencies section of the POM. You already have done this correctly.
Set the configuration parameter <pomDependencies> to consider on Tycho's target-platform-configuration plug-in.
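A sketch of that configuration, reusing the same plug-in declaration as in the filter example above:
<plugin>
  <groupId>org.eclipse.tycho</groupId>
  <artifactId>target-platform-configuration</artifactId>
  <version>${tycho-version}</version>
  <configuration>
    <pomDependencies>consider</pomDependencies>
  </configuration>
</plugin>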
Note that this will generally only work if the referenced artifact is already an OSGi bundle. This is the case here: com.google.guava:guava:14.0.1 seems to have all manifest headers needed by OSGi.
This should give you the result you wanted: In the test runtime, guava should now be used to match your com.google.common.* package imports.
And another general remark on declaring dependencies in Tycho: you can only declare dependencies in the PDE source files (META-INF/MANIFEST.MF, feature.xml, etc.).
The normal Maven-style dependencies declared in the POM do not add dependencies to the project. As explained above, the POM dependencies may only add artifacts to the target platform, i.e. the set of artifacts that may be used by Tycho to resolve the dependencies declared in the PDE source files. So in the end, the POM dependency may become part of the resolved dependencies, but only if the dependency resolver picks it for matching one of the declared dependencies.
By default, Tycho will add any p2 artifacts you installed in your local Maven repo to the target platform. If the bundle com.google.collect exports the package which you import, it may be wired.
To stop Tycho from including locally installed artifacts, you can use -Dtycho.localArtifacts=ignore (or remove the unwanted bundle from your local Maven repo).
See http://wiki.eclipse.org/Tycho/Release_Notes/0.16#Improvements_and_Fixes

how to configure maven to use jar files present on the system to satisfy dependency?

I need to configure the jars in the pom.xml file of my web application in such a way that I don't have to use the lib folder to store all the jar files.
Please help.
If you really have dependencies which are stored in a lib folder (I assume those jars don't exist in Central) you can use a system dependency:
<dependency>
  <groupId>...</groupId>
  <artifactId>..</artifactId>
  <scope>system</scope>
  <systemPath>PathOnYourSystem</systemPath>
</dependency>
But I assume you mean something different, because the above will produce a warning on Maven 3. If you have a dependency which is provided by the container (for example Tomcat) you can declare it with the provided scope.
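For example, a container-provided dependency such as the Servlet API is typically declared like this (the Servlet 2.5 coordinates are just an illustration; pick the version your container ships):
<dependency>
  <groupId>javax.servlet</groupId>
  <artifactId>servlet-api</artifactId>
  <version>2.5</version>
  <scope>provided</scope>
</dependency>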
But the best option is to put such dependencies into a repository manager, which I hope you are using (Artifactory, Nexus, Archiva).
You can manually add them to your local repository (since it seems that they are not in Central).
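For example (the file name and coordinates are placeholders):
mvn install:install-file -Dfile=lib/some-library.jar -DgroupId=com.example -DartifactId=some-library -Dversion=1.0 -Dpackaging=jar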
But the best would be to set up your own (or company) repository to hold them for you.

Using Maven ant task to install jar to local repository

At the end of my Ant build I'd like it to call the equivalent of the command line call
mvn install:install-file -Dfile=my.jar -DgroupId=com.company.project -DartifactId=my_project -Dversion=1.0 -Dpackaging=jar -DgeneratePom=true
so that it will add the newly built jar to a Maven repository which another project will rely on.
I've tried using the maven-ant-task: I added the maven-ant-task jar to the Ant build project and the following code to the build.xml:
<target name="minstall" depends="jar">
  <artifact:pom id="maven_install" file="maven_install.xml" />
  <artifact:install file="${out.dir}/my_project.jar">
    <pom refid="maven_install"/>
  </artifact:install>
</target>
but I seem to be missing something, as it won't work for me. To begin with, I get an error in the build.xml (Ant build file) saying
The prefix "artifact" for element "artifact:pom" is not bound.
What am I doing wrong? I am fairly new to Ant.
On a related question, what is the purpose of the associated POM file? I would not normally have a POM in this project as it is an Ant build.
Perhaps the maven-ant-task jar is not installed, i.e. it is not on your Ant classpath. You can follow these instructions for that.
As mentioned previously, you need to make sure the tasks are defined in your Ant script and the artifact namespace is understood.
The POM file is used (in this case) to tell the Maven repositories the dependencies of the JAR you are putting in the repository. The POM should also specify the JAR's identification information (groupId, artifactId, version number, license, etc.).
Strictly speaking, you do not need an external POM, you could define the information in your build.xml file as follows:
<!-- Assuming tasks defined, and 'artifact' namespace exists -->
<artifact:pom id="maven_install" groupId="com.whatever" artifactId="some-jar"
              version="1.0" packaging="jar">
  <dependency groupId="..." artifactId="..." version="..."/>
  <dependency groupId="..." artifactId="..." version="..."/>
  <license name="apache" url="http://www.apache.org"/> <!-- can be omitted -->
</artifact:pom>

<target name="minstall" depends="jar">
  <artifact:install file="${out.dir}/my_project.jar" pomRefId="maven_install"/>
</target>
When you install the JAR in the 'minstall' task, the POM should be generated with the appropriate dependencies in the local Repository.
That message means you are missing an xmlns:artifact attribute in your build.xml. Have a look at the installation page in the docs for an example.
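A sketch of what that looks like (the maven-ant-tasks jar version and its location under lib/ are assumptions; adjust to your setup):
<project name="my_project" xmlns:artifact="antlib:org.apache.maven.artifact.ant">
  <!-- make the artifact:* tasks available; the classpath points at the Maven Ant Tasks jar -->
  <typedef resource="org/apache/maven/artifact/ant/antlib.xml"
           uri="antlib:org.apache.maven.artifact.ant"
           classpath="lib/maven-ant-tasks-2.1.3.jar"/>
  ...
</project>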
As to the purpose of the POM file, it's mostly metadata so that Maven can figure out dependencies properly. In a real Maven build it also describes how to build, test and package, but in your case all that is done by Ant instead.
I think it makes no sense to put such commands in Ant's build.xml. If you want your jar file installed in your Maven repo, just use the mvn install command.
Besides that, I guess that you are somewhat confusing the purposes of the Maven and Ant tools in your project. What I'd suggest is to use Maven as your main build tool. You can configure the invocation of Ant targets in your POM file if you really need that. Personally, I think the best solution is to have Ant called by Maven. Maven goals (such as clean, test, package, install and so on) are very simple to use and powerful (you can read about them in every Maven tutorial).
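If you do need Ant from within the Maven build, the usual vehicle is the maven-antrun-plugin; here is a minimal sketch (the phase and the 'jar' target name are just examples):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-antrun-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>run</goal>
      </goals>
      <configuration>
        <target>
          <!-- call an existing target from the Ant build file -->
          <ant antfile="build.xml" target="jar"/>
        </target>
      </configuration>
    </execution>
  </executions>
</plugin>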

Maven, how to add additional libs not available in repo

I have a Maven project that has a set of library dependencies that are not available via any Maven repository. How can I add those libraries to the POM? I want to do this so that when I run 'mvn eclipse:eclipse' it doesn't remove those libraries from the Eclipse classpath.
You can declare it as a dependency with system scope.
<project>
  ...
  <dependencies>
    <dependency>
      <groupId>sun.jdk</groupId>
      <artifactId>tools</artifactId>
      <version>1.5.0</version>
      <scope>system</scope>
      <systemPath>${java.home}/../lib/tools.jar</systemPath>
    </dependency>
  </dependencies>
  ...
</project>
You have 3 options:
Add your libraries to your local repository via install:install-file (obviously, this is not portable, you won't be able to build the project on another machine without doing the same).
Install and run an "enterprise repository" like Nexus, Archiva, or Artifactory and add your libraries via deploy:deploy-file.
Set up a file-based repository as described in this previous answer and put your libraries in there (a sketch follows below).
Then, declare your libraries in your pom like any other dependency.
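For the third option, the repository declaration in the POM could look roughly like this (the id and folder name are arbitrary); the jars then need to be laid out in the usual groupId/artifactId/version structure inside that folder, for example via install:install-file with -DlocalRepositoryPath:
<repositories>
  <repository>
    <id>project-local-repo</id>
    <url>file://${project.basedir}/libs-repo</url>
  </repository>
</repositories>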
You can include them with your project in a sub-directory (perhaps lib/). You can also provide .bat and/or .sh files containing all the appropriate calls to the maven-install-plugin necessary for each project member (or server env) to add these jars to the local repo.
This approach allows new project members to get up & running quickly, without having to invest several hours in setting up a new public repo for your project or team.
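Such a script could be as simple as this sketch (file names and coordinates are placeholders):
#!/bin/sh
# install the bundled jars into the local repository
mvn install:install-file -Dfile=lib/foo.jar -DgroupId=com.example -DartifactId=foo -Dversion=1.0 -Dpackaging=jar
mvn install:install-file -Dfile=lib/bar.jar -DgroupId=com.example -DartifactId=bar -Dversion=2.3 -Dpackaging=jar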
You can't 'add them to the pom'. You have to put them in some repo. You can put them in the local repo with the maven-install-plugin, as suggested by the error message. Or you can deploy them in a local copy of Nexus or something like it.
Recently I created a small UI util to install libraries into your local repository.
It works the same way as install:install-file.
https://github.com/escv/maven-install-ui

using maven to manage java dependencies in a jruby rails app

I'm trying to write a pom.xml that will allow me to run a command locally and fetch all the dependencies that my JRuby Rails app has. I'm seeing two different configs though, and I'm not totally sure which to use (as I'm not a Java person whatsoever).
First, many POMs I'm seeing just have a <dependencies> tag under the root of the pom.xml that lists all dependencies. This doesn't, however, have any information about where these are stored etc., so I feel like this isn't what I want (I need to copy them to my Rails lib dir).
Second option, which I'm seeing in the mvn docs, is to use the maven-dependency-plugin, which seems more like what I'm looking for. I assume then that my outputDirectory would be something like lib.
So I don't fully understand what the purpose of the first option's dependency list is. All I want is for mvn to copy my jars locally (and then eventually when my CI server does a deploy). Can someone point me in the right direction?
First Option
<project>
  <dependencies>
    <dependency>
      <groupId>commons-lang</groupId>
      <artifactId>commons-lang</artifactId>
      <version>2.4</version>
    </dependency>
  </dependencies>
</project>
Second Option
<project>
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-dependency-plugin</artifactId>
        <configuration>
          <artifactItems>
            <artifactItem>
              <groupId>[ groupId ]</groupId>
              <artifactId>[ artifactId ]</artifactId>
              <version>[ version ]</version>
              <type>[ packaging ]</type>
              <classifier>[ classifier - optional ]</classifier>
              <overWrite>[ true or false ]</overWrite>
              <outputDirectory>[ output directory ]</outputDirectory>
              <destFileName>[ filename ]</destFileName>
            </artifactItem>
          </artifactItems>
          <!-- other configurations here -->
        </configuration>
      </plugin>
    </plugins>
  </build>
</project>
First, many POMs I'm seeing just have a <dependencies> tag under the root of the pom.xml that lists all dependencies. This doesn't, however, have any information about where these are stored etc., so I feel like this isn't what I want (I need to copy them to my Rails lib dir).
This is the traditional way to declare and use dependencies on a Java project. Dependencies declared under the <dependencies> element are downloaded from a "remote repository" and installed to your "local repository" (in ~/.m2/repository by default) and artifacts are then handled from there. Maven projects (at least the Java ones) don't use a local lib/ folder for their dependencies.
Second option, which I'm seeing in the mvn docs, is to use the maven-dependency-plugin, which seems more like what I'm looking for. I assume then that my outputDirectory would be something like lib.
The Maven dependency plugin lets you interact with artifacts and copy/unpack them from the local or remote repositories to a specified location. So it can indeed be used to get some dependencies and copy them into, let's say, a lib/ directory. Actually, it has several goals for doing this:
dependency:copy takes a list of artifacts defined in the plugin configuration section and copies them to a specified location, renaming them or stripping the version if desired. This goal can resolve the artifacts from remote repositories if they don't exist locally.
dependency:copy-dependencies takes the list of project direct dependencies and optionally transitive dependencies and copies them to a specified location, stripping the version if desired. This goal can also be run from the command line.
The first goal would use the setup you described in your second option. The second goal would use the standard project dependencies that you described in your first option. Both approaches would work.
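For instance, with the standard dependencies from your first option declared, something along these lines should copy the jars into a lib/ folder (the output directory is just an example, and the exact property name can vary between plugin versions):
mvn dependency:copy-dependencies -DoutputDirectory=lib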
The problem here is that I don't know exactly what a JRuby Rails app is, what the development workflow is, how to build such an app, etc., so I don't know exactly what you need to do and, consequently, what would be the best way to implement that with Maven.
So I googled a bit and found this post that shows another approach based on OS commands (using the maven exec plugin) and has a complete pom.xml doing some other things. Maybe you should look at it and use it as a starting point instead of reinventing everything. That is actually my suggestion.