Using Maven to manage Java dependencies in a JRuby Rails app - maven-2

I'm trying to write a pom.xml that will allow me to run a command locally and fetch all the dependencies my JRuby Rails app has. I'm seeing two different configs, though, and I'm not totally sure which to use (as I'm not a Java person whatsoever).
First, many POMs I'm seeing just have a <dependencies> tag under the root of the pom.xml that lists all the dependencies. This doesn't, however, have any information about where these are stored, etc., so I feel like this isn't what I want (I need to copy them to my Rails lib dir).
The second option I'm seeing in the mvn docs is to use the maven-dependency-plugin, which seems more like what I'm looking for. I assume then that my outputDirectory would be something like lib.
So I don't fully understand what the purpose of the first option's dependency list is. All I want is for mvn to copy my jars locally (and eventually when my CI server does a deploy). Can someone point me in the right direction?
First Option
<project>
<dependencies>
<dependency>
<groupId>commons-lang</groupId>
<artifactId>commons-lang</artifactId>
<version>2.4</version>
</dependency>
</dependencies>
</project>
Second Option
<project>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-dependency-plugin</artifactId>
<configuration>
<artifactItems>
<artifactItem>
<groupId>[ groupId ]</groupId>
<artifactId>[ artifactId ]</artifactId>
<version>[ version ]</version>
<type>[ packaging ]</type>
<classifier> [classifier - optional] </classifier>
<overWrite>[ true or false ]</overWrite>
<outputDirectory>[ output directory ]</outputDirectory>
<destFileName>[ filename ]</destFileName>
</artifactItem>
</artifactItems>
<!-- other configurations here -->
</configuration>
</plugin>
</plugins>
</build>
</project>

First, many POMs I'm seeing just have a <dependencies> tag under the root of the pom.xml that lists all the dependencies. This doesn't, however, have any information about where these are stored, etc., so I feel like this isn't what I want (I need to copy them to my Rails lib dir).
This is the traditional way to declare and use dependencies on a Java project. Dependencies declared under the <dependencies> element are downloaded from a "remote repository" and installed to your "local repository" (in ~/.m2/repository by default) and artifacts are then handled from there. Maven projects (at least the Java ones) don't use a local lib/ folder for their dependencies.
The second option I'm seeing in the mvn docs is to use the maven-dependency-plugin, which seems more like what I'm looking for. I assume then that my outputDirectory would be something like lib.
The Maven Dependency Plugin lets you interact with artifacts and copy/unpack them from local or remote repositories to a specified location, so it can indeed be used to fetch some dependencies and copy them into, say, a lib/ directory. It has several goals that allow this:
dependency:copy takes a list of artifacts defined in the plugin configuration section and copies them to a specified location, renaming them or stripping the version if desired. This goal can resolve the artifacts from remote repositories if they don't exist locally.
dependency:copy-dependencies takes the list of project direct dependencies and optionally transitive dependencies and copies them to a specified location, stripping the version if desired. This goal can also be run from the command line.
The first goal would use the setup you described in your second option. The second goal would use the standard project dependencies that you described in your first option. Both approaches would work.
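For instance, a minimal sketch of the copy-dependencies route, assuming the Rails app's lib/ directory and a binding to the package phase (both are assumptions, adjust to your workflow):
<project>
  <dependencies>
    <!-- the plain dependency list from the first option -->
    <dependency>
      <groupId>commons-lang</groupId>
      <artifactId>commons-lang</artifactId>
      <version>2.4</version>
    </dependency>
  </dependencies>
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-dependency-plugin</artifactId>
        <configuration>
          <!-- copy every declared jar (plus transitives) into the Rails lib dir -->
          <outputDirectory>${project.basedir}/lib</outputDirectory>
        </configuration>
        <executions>
          <execution>
            <id>copy-jars-to-lib</id>
            <!-- assumed binding; you can also just run the goal by hand -->
            <phase>package</phase>
            <goals>
              <goal>copy-dependencies</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</project>
With something like that in place, mvn package (or mvn dependency:copy-dependencies on its own) would resolve everything from the repositories and copy the jars into lib/.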
The problem here is that I don't know exactly what a JRuby Rails app is, what the development workflow is, how to build such an app, and so on, so I don't know exactly what you need to do and, consequently, what the best way to implement it with Maven would be.
So I googled a bit and found this post, which shows another approach based on OS commands (using the maven exec plugin) and has a complete pom.xml doing some other things. Maybe you should look at it and use it as a starting point instead of reinventing everything; that would actually be my suggestion.

Related

Running unit tests in Tycho fails: resolves google-collections instead of Guava

I am having an issue running tests using Tycho due to an incorrect dependency resolution that, somehow, is placing the old Google Collections .jar on the classpath and not the Guava one, despite the fact that at no point in any of my poms do I specify a dependency on collections (only guava).
My unit tests fail due to things like NoSuchMethodError (ImmutableList.copyOf) and NoClassDefFoundError (Joiner), which I pretty much narrowed down to 'finding the wrong jar'. These same tests pass when run manually in Eclipse.
Here is the relevant part of the pom:
<dependencies>
<dependency>
<groupId>com.google.guava</groupId>
<artifactId>guava</artifactId>
<version>14.0.1</version>
</dependency>
...
</dependencies>
The phrase 'google collections' appears nowhere. The only other repository I specify is:
<repositories>
<repository>
<id>helios</id>
<layout>p2</layout>
<url>http://download.eclipse.org/releases/helios</url>
</repository>
</repositories>
My plugin imports 'com.google.common.base' and 'com.google.common.collect' as imported packages. I have my own bundled version of Guava 14 in my workspace for debugging, but in the POM I elect to not use my local module.
I followed Sean Patrick Floyd's answer on this question (JUnit throws java.lang.NoSuchMethodError For com.google.common.collect.Iterables.tryFind), and had my test throw an exception with the location of the .jar that the Iterables class was loaded from. It spat back out:
java.lang.IllegalArgumentException: file:/C:/Documents and Settings/Erika Redmark/.m2/repository/p2/osgi/bundle/com.google.collect/0.8.0.v201102150722/com.google.collect-0.8.0.v201102150722.jar
This is where I am now stuck. This google-collections jar is seemingly coming out of nowhere, and I don't know how to stop it. As long as it is being resolved, my unit tests will fail. How can I stop Tycho from trying to get the old Google Collections?
Just to clarify, this has not stopped building and deployment; the plugin update site is on a CI platform and we have been able to install the plugin on different Eclipse IDEs, so this issue is only affecting the tests.
Please let me know if additional information is needed.
The plug-in com.google.collect 0.8.0.v201102150722 is part of the Helios p2 repository that you have configured in your POM. This means that this plug-in is part of the target platform and so may be used to resolve dependencies.
If you want to ensure that the bundle is not used, make sure that it is not part of the target platform. In your case, the easiest way to do this is to explicitly remove the plug-in from the target platform:
<plugin>
<groupId>org.eclipse.tycho</groupId>
<artifactId>target-platform-configuration</artifactId>
<version>${tycho-version}</version>
<configuration>
<filters>
<filter>
<type>eclipse-plugin</type>
<id>com.google.collect</id>
<removeAll />
</filter>
</filters>
</configuration>
</plugin>
Next, you need to make sure that the guava plug-in is part of the target platform. You can add an artifact from a Maven repository to the target platform in the following way:
Declare a Maven dependency to the artifact in the dependencies section of the POM. You already have done this correctly.
Set the configuration parameter <pomDependencies> to consider on Tycho's target-platform-configuration plug-in.
Note that this will generally only work if the referenced artifact is already an OSGi bundle. This is the case here: com.google.guava:guava:14.0.1 seems to have all manifest headers needed by OSGi.
This should give you the result you wanted: In the test runtime, guava should now be used to match your com.google.common.* package imports.
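A sketch of what step 2 might look like, assuming the same ${tycho-version} property used in the snippet above:
<plugin>
  <groupId>org.eclipse.tycho</groupId>
  <artifactId>target-platform-configuration</artifactId>
  <version>${tycho-version}</version>
  <configuration>
    <!-- let POM dependencies such as com.google.guava:guava:14.0.1 join the target platform -->
    <pomDependencies>consider</pomDependencies>
    <!-- the com.google.collect filter shown earlier can live in this same configuration block -->
  </configuration>
</plugin>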
And another general remark on declaring dependencies in Tycho: In Tycho, you can only declare dependencies in the PDE source files META-INF/MANIFEST.MF, feature.xml, etc.
The normal Maven-style dependencies declared in the POM do not add dependencies to the project. As explained above, the POM dependencies may only add artifacts to the target platform, i.e. the set of artifacts that may be used by Tycho to resolve the dependencies declared in the PDE source files. So in the end, the POM dependency may become part of the resolved dependencies, but only if the dependency resolver picks it for matching one of the declared dependencies.
By default, Tycho will add any p2 artifacts you installed in your local Maven repo to the target platform. If the bundle com.google.collect exports the package which you import, it may be wired.
To stop Tycho from including any locally installed artifacts, you can use -Dtycho.localArtifacts=ignore (or remove the unwanted bundle from your local Maven repo).
See http://wiki.eclipse.org/Tycho/Release_Notes/0.16#Improvements_and_Fixes

Attaching Build Number for binaries in Maven

I am running a Maven build and storing files in Artifactory. One issue I am facing is that whenever I try a -SNAPSHOT version, it overwrites the binary in Artifactory. I tried using the Maven build number plugin, but I am running into issues. I referred to this:
http://blog.codehangover.com/track-every-build-number-with-maven/
Here is what I did.
I updated the master pom.xml with the following:
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>buildnumber-maven-plugin</artifactId>
<version>1.0-beta-3</version>
<executions>
<execution>
<phase>validate</phase>
<goals>
<goal>create</goal>
</goals>
</execution>
</executions>
<configuration>
<doCheck>false</doCheck>
<doUpdate>false</doUpdate>
<format>${version}.{0,number}</format>
<items>
<item>buildNumber</item>
</items>
</configuration>
</plugin>
Then I updated the poms of the EAR and web projects as below:
<build>
<finalName>${project.artifactId}-${project.version}.${buildNumber}</finalName>
</build>
When I ran mvn clean install, the EAR and WAR got generated, but when I checked the WAR inside the EAR I found it named something like war-1.0-SNAPSHOT-null.war. I believe the WAR and EAR couldn't get the buildNumber parameter. I was able to successfully generate the buildNumber.properties file and to increment the number by running the buildnumber:create goal. Here are my questions:
What am I doing wrong here, and why is the buildNumber parameter not picked up?
Also, I want to generate all the binaries, including jars, in the format binary-version-SNAPSHOT.${buildNumber}. Do I need to update the pom of each module, or is there another way to do this?
Also, we are using Hudson builds for continuous integration and we want to distinguish developer builds from Hudson builds by the Hudson build number. How can we achieve this if we don't want to check in buildNumber.properties after the Hudson build?
To get unique snapshots, use the uniqueVersion flag (see James Lorenzen's blog). If you use the Maven goal deploy:deploy-file, the uniqueVersion flag is true by default. At my company we have the following policy: only "official" snapshots go to the repository. An "official" snapshot is one that was built on our reference system (our Jenkins CI server). We don't need the unique feature for snapshots, since we let Jenkins archive the artifacts. This way we can always go back to a certain version if we would like to, by using Jenkins. If the build breaks, the snapshot will not be deployed to the repo.
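For reference, a rough sketch of where that flag lives in a Maven 2 POM; the repository id and URL below are placeholders for your Artifactory setup:
<distributionManagement>
  <snapshotRepository>
    <id>snapshots</id>
    <url>http://your-artifactory/libs-snapshot-local</url>
    <!-- true (the Maven 2 default) deploys timestamped snapshots instead of overwriting the -SNAPSHOT file -->
    <uniqueVersion>true</uniqueVersion>
  </snapshotRepository>
</distributionManagement>
Note that Artifactory itself may also be configured to force non-unique snapshot behavior for a repository, which would override this on the server side.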
Regarding your 2nd question: my understanding is that you need to update every pom file. But since it is a one-time change, it shouldn't be too much of a burden.
I don't completely understand your 3rd question ("... separate developers builds with Hudson Build number..."). In case you want to add the build number to every build done by Hudson, you have several options:
You can add a string as a classifier while deploying. Maven will add that classifier to the filename (artifactID-version-classifier.jar, e.g. my.company.calendar-0.0.1-Snapshot-Hudson.jar). The artifact is then retrieved by adding the classifier to the dependency.
Add another parameter to your Maven call - the output filename (${project.build.finalName}, see the Maven docs).
Change your version string to something like

Maven - Best way to refer to a directory on the system path

I am trying to build an RPM from my Maven project. I have 5 different modules and each one has its own pom.xml; in the root I have one pom.xml which builds all the modules (a typical Maven setup). When I build the RPM, I want to include a directory that is not part of the Maven directories. It sits above the root folder that contains my Maven modules. What is the best way to include that in my RPM? Or rather, what is the best way to refer to a directory without hardcoding the path? I am confused about ${baseDir} and what it refers to.
Thank you.
${project.basedir} refers to the root of the project, i.e. where the pom.xml is, so you could use that in <systemPath>${project.basedir}/../../dirYouWant</systemPath>.
In general, though, Maven best practices would frown upon relying on relative paths around your project being there. Instead, I suggest deploying those files as their own project to your Maven repository (as a zip, jar, whatever), and then getting them as part of your RPM build. Depending on what plugin you are using to build your RPM, you can unpack those files automatically.
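For example, if the extra directory were deployed separately as a zip artifact, a sketch of pulling it back in before the RPM is assembled might look like this (the coordinates, phase and output path are made up for illustration):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>unpack-extra-files</id>
      <phase>prepare-package</phase>
      <goals>
        <goal>unpack</goal>
      </goals>
      <configuration>
        <artifactItems>
          <artifactItem>
            <!-- hypothetical coordinates of the separately deployed zip -->
            <groupId>com.example</groupId>
            <artifactId>extra-files</artifactId>
            <version>1.0</version>
            <type>zip</type>
            <outputDirectory>${project.build.directory}/extra-files</outputDirectory>
          </artifactItem>
        </artifactItems>
      </configuration>
    </execution>
  </executions>
</plugin>
Your RPM plugin's file mappings could then point at ${project.build.directory}/extra-files instead of a relative path outside the project.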
Try this
<dependency>
...groupid,artifactid etc..
<scope>system</scope>
<systemPath>path/to/your/jar</systemPath>
</dependency>
Did you mean that you want to add another project, one level above, to your Maven build?
You can do it like this:
In your parent pom:
<modules>
<module>../projectdirectory</module>
</modules>
In your projectdirectory pom:
<parent>
<groupId>...</groupId>
<artifactId>...parent...</artifactId>
<version>...</version>
<relativePath>../parentProject/pom.xml</relativePath>
</parent>

Specifying jar file in maven build argument

We have our project built using Maven. We try to run our unit test cases in the Maven build itself, and to do that we need to add the DB2 driver jar as a dependency of all the sub-projects.
Instead of doing that, we need a solution to specify the absolute path of the jar file as a mvn command line argument so it can be used when running the unit test cases.
This is because the driver jar is available in our app server lib folder and we don't want to specify it in the dependencies of our projects.
I couldn't find a suitable solution by googling, hence requesting an expert solution here.
Any workaround would be of great help.
Thanks in advance.
The usual way would be to add a dependency on the database driver and limit it to testing (test scope). That way the library is available for unit tests but will not be deployed or packaged into the jar.
Practically speaking, I'd create a Maven artifact for this driver (just a basic POM file) and place it in the build server's Maven repository (or in Nexus, if you use it for your projects).
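A sketch of such a test-scoped dependency; the coordinates below are made up and should match whatever you choose when installing the driver artifact:
<dependency>
  <!-- hypothetical coordinates for the DB2 driver artifact you installed -->
  <groupId>com.ibm.db2</groupId>
  <artifactId>db2jcc</artifactId>
  <version>9.7</version>
  <!-- on the classpath for compiling and running tests only; never packaged or deployed -->
  <scope>test</scope>
</dependency>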
I'm using a dependency with scope set to 'system' to reference a jar that is available in the container but not in any Maven repository. In this case the jar is put in a folder named 'lib' in the project, like this:
<dependency>
<groupId>groupId</groupId>
<artifactId>artifactId</artifactId>
<version>version</version>
<scope>system</scope>
<systemPath>${project.basedir}/lib/library.jar</systemPath>
</dependency>
The groupId, artifactId and version can be set to any value you want; the trick is that system dependencies have to be given with an absolute path, which is worked around by using the project.basedir property. It should also be possible to specify the complete path as a property.
We have our project built using Maven. We try to run our unit test cases in the Maven build itself, and to do that we need to add the DB2 driver jar as a dependency of all the sub-projects.
Well, the Maven way would be to declare the DB2 driver as a dependency with test scope in a parent project.
Instead of doing that, we need a solution to specify the absolute path of the jar file as a mvn command line argument so it can be used when running the unit test cases.
You could use an additionalClasspathElement in the Surefire plugin configuration to pass the path to the driver:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<configuration>
<additionalClasspathElements>
<additionalClasspathElement>path/to/additional/resources</additionalClasspathElement>
</additionalClasspathElements>
</configuration>
</plugin>
If you turn that element into a property, you could pass the value on the command line.
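For instance, a sketch using a property (the name db2.driver.path is just an assumption):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <additionalClasspathElements>
      <!-- resolved from a property so the jar's location can be supplied per machine -->
      <additionalClasspathElement>${db2.driver.path}</additionalClasspathElement>
    </additionalClasspathElements>
  </configuration>
</plugin>
You would then run the tests with something like mvn test -Ddb2.driver.path=/path/to/db2jcc.jar.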
But to be honest, I can't understand why you don't install the driver in a corporate repository and declare it as a dependency. And if you don't have a corporate repository, use a file-based repo as described in this previous answer (please don't use the system scope bad practice). There is no good reason to go the hacky way.

How do you use the maven-simian-plugin in Maven2?

I'm looking for a Maven 2 reporting plugin for Simian, and the closest thing to such a report I've found is this. The problem is, the documentation for it appears to be for Maven 1 instead. Why is a Maven 1 plugin stored in a Maven 2 repository? I suppose that means I can use it... but how? The site mentions reporting, but if I don't have a src/main/site, does that mean I can't use it? I was kind of hoping for something like mvn simian:simian, similar to mvn checkstyle:checkstyle and mvn pmd:pmd. I don't want to generate the site just for the reports. Sites take too long to generate when all I want is a quick XML report.
The Simian plugin listed on Central is actually for Maven 1 (if you inspect the contents you'll see a project.xml and a plugin.jelly), so that explains why it doesn't work. This is rubbish and should be removed, in my opinion.
As far as I can make out there isn't a publicly available Maven 2 plugin; this may have something to do with the licence (Simian isn't open source).
As an alternative, have a look at PMD's CPD plugin. It may not be as fully featured as Simian, but I know it works in a Maven 2 build and detects copypasta pretty well.
To configure PMD, add something like the following to your POM:
<reporting>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-pmd-plugin</artifactId>
<version>2.4</version>
</plugin>
</plugins>
</reporting>