I have a multi-module maven build where one of the child modules requires an extra goal to be executed as part of a release. But it looks as though any configuration of the maven-release-plugin in the child module is ignored in favour of the default configuration in the parent module.
This is the snippet from the child module. The plugin configuration is the same in the pluginManagement section of the parent POM, but without the custom <goals> element.
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-release-plugin</artifactId>
  <version>2.1</version>
  <configuration>
    <tagBase>http://mycompany.com/svn/repos/myproject/tags</tagBase>
    <goals>deploy myCustomPlugin:myCustomGoal</goals>
  </configuration>
</plugin>
So is it possible for a child module to override the parent's configuration and add extra goals?
Maven version 2.2.1
Use combine.children="append" and combine.self="override" on the configuration elements whose merge behavior you want to change:
Parent POM
<configuration>
  <items>
    <item>parent-1</item>
    <item>parent-2</item>
  </items>
  <properties>
    <parentKey>parent</parentKey>
  </properties>
</configuration>
Child POM
<configuration>
  <items combine.children="append">
    <!-- combine.children="merge" is the default -->
    <item>child-1</item>
  </items>
  <properties combine.self="override">
    <!-- combine.self="merge" is the default -->
    <childKey>child</childKey>
  </properties>
</configuration>
Result
<configuration>
  <items combine.children="append">
    <item>parent-1</item>
    <item>parent-2</item>
    <item>child-1</item>
  </items>
  <properties combine.self="override">
    <childKey>child</childKey>
  </properties>
</configuration>
See this blog for further details
Yes and no. Certainly a child pom can override the configuration of a plugin specified by its parent, and I have to assume you've done so correctly because there's nothing really hard about it. If you check the output of mvn help:effective-pom, you should be able to see plainly that this module has different settings for the release plugin.
The problem you're having is with the behavior of the release plugin. Typically, if you run a goal or phase (mvn compile, for example) from the root module of your project, it first runs that goal/phase on the root module, then on all the modules in reactor order, almost as if you'd run it in each module yourself. Any customizations you've added to child modules take effect as expected. The release plugin, however, runs only at the root module: it doesn't run in any of the child modules. Instead, it forks a new build from the root module using the root module's settings, and that forked build processes all the other modules in nearly the same way, except that it applies the root module's configuration to every module. I don't know the exact semantics, but I believe this is analogous to manually running the release goals in each child while specifying the configuration options as system properties on the command line: regardless of how a child module configures the release plugin, the command-line arguments win.
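To make that concrete, here is a sketch (untested) of the kind of command-line override that wins over any per-module configuration; it assumes the release plugin's goals parameter is the one you need to set for the forked build:
mvn release:prepare
mvn release:perform -Dgoals="deploy myCustomPlugin:myCustomGoal"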
I've never dealt with this problem myself, and it's hard to say without knowing exactly what you're trying to accomplish. Perhaps, if you can express what you want this special module to do as a profile, you could activate that profile through your goals and/or preparationGoals. Alternatively, there's an arguments option on both the prepare and perform goals that you might be able to pull some tricks with.
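For example (a sketch only, untested; the profile id, the custom plugin's coordinates, and the phase binding are hypothetical), the child module could bind the extra goal inside a profile:
<profiles>
  <profile>
    <id>release-extras</id>
    <build>
      <plugins>
        <plugin>
          <!-- hypothetical coordinates for your custom plugin -->
          <groupId>com.mycompany.plugins</groupId>
          <artifactId>myCustomPlugin</artifactId>
          <executions>
            <execution>
              <id>extra-release-work</id>
              <phase>deploy</phase>
              <goals>
                <goal>myCustomGoal</goal>
              </goals>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </build>
  </profile>
</profiles>
and the parent's release-plugin configuration could then activate that profile for the forked build via the arguments option:
<configuration>
  <arguments>-Prelease-extras</arguments>
</configuration>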
Related
I need to add some flags for unit tests and want to share them with all team members. IntelliJ has a solution for sharing run configurations, but the default configurations don't have a Share checkbox.
Of course, these settings are stored in the workspace file (.idea/workspace.xml), but I don't want to store all my local stuff, like recent searches, in the repository. Is there any solution to store default run configurations in the repository?
There is a workaround for passing common params to any project's JUnit run configuration, which relies on IntelliJ IDEA picking up the Maven Surefire plugin's settings.
So it's sufficient to add the common params to the main POM:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <!-- force 7-bit default encoding to ensure that nothing depends on it -->
    <!-- take JFR profiling snapshot on each run -->
    <argLine>-Dfile.encoding=ASCII -Xmx512M -XX:+HeapDumpOnOutOfMemoryError -XX:+UnlockCommercialFeatures -XX:+FlightRecorder -XX:StartFlightRecording=name=EcpTest,duration=999s,filename=target/ecp.jfr,settings=profile</argLine>
  </configuration>
</plugin>
I want to put the hibernate3-maven-plugin in my parent pom and have execution skipped in child modules if a given file does not exist in that module.
Is there any way to do this?
Up to now, I have had to do this:
<plugin>
  ...
  <configuration>
    <skip>true</skip>
    <propertyfile>target/test-classes/jdbc.properties</propertyfile>
  </configuration>
</plugin>
In the parent POM, and:
<plugin>
  ...
  <configuration>
    <skip>${maven.test.skip}</skip>
  </configuration>
</plugin>
In all child POMs where I want it to execute, i.e. those actually having a jdbc.properties file.
You may be able to do this with profiles, though I suppose you probably don't want it to run in the parent project itself, which may be problematic.
Here are some links on profiles:
http://maven.apache.org/guides/introduction/introduction-to-profiles.html
http://www.sonatype.com/books/mvnref-book/reference/profiles.html
http://mindthegab.com/2008/12/02/howto-give-your-multimodule-maven-build-subprojectenvironment-specific-behavior/
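The most direct profile-based take on this would be file-based activation, declared in each module that might have the file. This is only a sketch, untested; the profile id and the hibernate3.skip property are made up, and the check is against the source location because activation is evaluated before resources are copied to target/:
<profiles>
  <profile>
    <id>run-hibernate3</id>
    <activation>
      <file>
        <!-- activation runs before the build, so check the source file, not target/ -->
        <exists>src/test/resources/jdbc.properties</exists>
      </file>
    </activation>
    <properties>
      <!-- hibernate3.skip is a made-up property read by the parent's plugin configuration -->
      <hibernate3.skip>false</hibernate3.skip>
    </properties>
  </profile>
</profiles>
The parent's plugin configuration would then use <skip>${hibernate3.skip}</skip>, with a default of <hibernate3.skip>true</hibernate3.skip> in its properties section.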
This question had a similar issue and was not able to solve it with profiles:
activate-different-maven-profiles-depending-on-current-module
I'm not 100% sure of the logistics, but you could possibly use the Maven exec plugin in combination with a shell script. The shell script would check for the presence of the file and then invoke the Maven plugin, using the module's POM directory, which can be obtained and passed to the shell script via Maven's built-in properties (e.g. ${basedir}).
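A rough sketch of that idea (untested; the script name, its contents, and the chosen phase are all hypothetical):
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>exec-maven-plugin</artifactId>
  <executions>
    <execution>
      <id>conditional-hibernate3</id>
      <!-- hypothetical phase; bind it wherever your hibernate3 work needs to happen -->
      <phase>process-test-resources</phase>
      <goals>
        <goal>exec</goal>
      </goals>
      <configuration>
        <!-- check-and-run.sh is a hypothetical script that tests for jdbc.properties
             and decides whether to do the extra work -->
        <executable>${basedir}/check-and-run.sh</executable>
        <arguments>
          <argument>${basedir}</argument>
        </arguments>
      </configuration>
    </execution>
  </executions>
</plugin>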
Is there a way to trigger a maven install command from another maven install command?
In other words, I would like to be able to execute a Maven install command on one Maven project (in Eclipse) and have that automatically trigger an install on another Maven project.
Is that possible?
The Maven way to "trigger" another build is to define a multi-module build. A parent pom project can specify modules, which will all be built using the standard lifecycle. So running mvn install on the parent would mean that each module is built in turn.
The parent is defined with pom packaging and would have a modules declaration like this:
<modules>
  <module>module-a</module>
  <module>module-b</module>
</modules>
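In full, a minimal aggregator POM might look like this (the groupId/artifactId and module names are placeholders):
<project>
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.mycompany</groupId>
  <artifactId>my-parent</artifactId>
  <version>1.0-SNAPSHOT</version>
  <packaging>pom</packaging>
  <modules>
    <module>module-a</module>
    <module>module-b</module>
  </modules>
</project>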
Alternatively, it is possible to attach additional artifacts to a build so they are deployed alongside the primary artifact. Assuming they've already been packaged, you can use the build-helper-maven-plugin to attach an arbitrary file to your POM so that it is deployed with the specified classifier. The following configuration will attach the specified file as my-artifact-1.0-extra.jar:
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>build-helper-maven-plugin</artifactId>
  <version>1.3</version>
  <executions>
    <execution>
      <id>attach-artifacts</id>
      <phase>package</phase>
      <goals>
        <goal>attach-artifact</goal>
      </goals>
      <configuration>
        <artifacts>
          <artifact>
            <file>/path/to/extra/file.jar</file>
            <type>jar</type><!-- or specify your required extension -->
            <classifier>extra</classifier>
          </artifact>
        </artifacts>
      </configuration>
    </execution>
  </executions>
</plugin>
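Other projects can then pull in the attached artifact by adding the classifier to their dependency declaration (the coordinates below are placeholders matching the my-artifact-1.0-extra.jar example):
<dependency>
  <groupId>com.mycompany</groupId>
  <artifactId>my-artifact</artifactId>
  <version>1.0</version>
  <type>jar</type>
  <classifier>extra</classifier>
</dependency>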
As pointed out, the Maven way to launch a goal (let's say mvn install) on a set of modules is to organize them as a multi-module project and to launch the goal on the parent pom. Behind the scenes, Maven will use the "Maven reactor" for this work. The reactor calculates the build order by doing a topological sort of the nodes of the directed graph constructed from the dependency relations between modules; this graph is built by looking at the <modules> and <dependencies> tags in the poms.
But launching Maven from the parent is not the only option, and Maven offers more possibilities for playing with the reactor (e.g. building a project together with its dependencies, or with the projects that depend on it):
With Maven 2.0.x you have to use the reactor plugin: http://maven.apache.org/plugins/maven-reactor-plugin/ (see also Reactor: My New Favourite Maven Plugin)
With Maven 2.1+ you can use the native command line options: http://www.sonatype.com/people/2009/03/maven-210-released/ (see the new build mode options -amd, -rf, -am, -pl)
Check it out, it might help you to achieve your goal.
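For example, with Maven 2.1+ (module names are placeholders): -pl selects the projects to build, -am also builds the projects they depend on, -amd also builds the projects that depend on them, and -rf resumes the reactor from a given module.
mvn install -pl module-a -am
mvn install -pl module-a -amd
mvn install -rf module-b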
I have a set of web apps that I manage that I am trying to move to maven.
/pom.xml // parent pom
webapp1/pom.xml // configured to point to parent
webapp2/pom.xml // peer of webapp1 and points to parent.
each of the webapps refers to the parent pom, and they both currently have a jetty maven plugin that works.
My question is how do I mount each of the webapps from the parent pom such that mvn jetty:run works in the parent directory?
Edit to answer Pascal T:
The issue is not so much that I'm getting an error when I try and run the command from the root pom, but that I'm not sure how to configure it.
For example, the webapp1/pom.xml looks like:
<project>
  ...
  <plugins>
    <plugin>
      <groupId>org.mortbay.jetty</groupId>
      <artifactId>maven-jetty-plugin</artifactId>
    </plugin>
  </plugins>
  ...
</project>
Changing to this directory and typing mvn jetty:run works just fine and lets me hit http://localhost:8080/webapp1.
However, what I would like is to be in the parent of webapp1 and run all 'n' webapps from the parent directory, thus having http://localhost:8080/webapp1 and http://localhost:8080/webapp2 available with a single command.
BTW, if the answer involved a Tomcat plugin, that would be fine.
EDIT: I've totally edited my first answer now that I have a better understanding of the OP's expectations.
Check out Cargo, a thin wrapper that allows you to manipulate Java EE containers in a standard way.
Actually, there is a tutorial on Cargo's website that demonstrates how to use the Cargo Maven2 plugin to automatically start/stop a container (possibly deploying some deployables to it as it starts), which is what you're looking for from what I've understood.
I'm just not sure whether doing this from the parent directory is feasible, and whether it's a requirement or if it would be OK to do it from another directory. I'll come back to this later; let's first take a look at the Cargo Maven2 plugin setup.
In your case, you can start with the minimal configuration (which uses Jetty 5.x, Cargo's default container):
[...]
<build>
  <plugins>
    <plugin>
      <groupId>org.codehaus.cargo</groupId>
      <artifactId>cargo-maven2-plugin</artifactId>
    </plugin>
  </plugins>
</build>
[...]
If you want to use Jetty 6.x, you'll have to specify <containerId> and <type> in the <container> element:
[...]
<plugin>
  <groupId>org.codehaus.cargo</groupId>
  <artifactId>cargo-maven2-plugin</artifactId>
  <configuration>
    <container>
      <containerId>jetty6x</containerId>
      <type>embedded</type>
    </container>
  </configuration>
</plugin>
[...]
Then, add the modules you want to deploy by defining deployables explicitly inside the plugin configuration (refer to the Maven2 Plugin Reference Guide for the details of the configuration):
<deployables>
  <deployable>
    <groupId>com.mycompany.myproject</groupId>
    <artifactId>myproject-alpha</artifactId>
    <type>war</type>
    <properties>
      <context>optional alpha root context</context>
    </properties>
  </deployable>
  <deployable>
    <groupId>com.mycompany.myproject</groupId>
    <artifactId>myproject-beta</artifactId>
    <type>war</type>
    <properties>
      <context>optional beta root context</context>
    </properties>
  </deployable>
  [...]
</deployables>
With this, you should be able to start Jetty and have your webapps deployed on it with a single command (run from the project containing the Cargo plugin configuration):
$ mvn cargo:start
I'm just not sure that this can work from the parent pom (I wonder if it could lead to cyclic dependency issues) and I didn't test it. But personally, I'd put all this stuff in the pom of a dedicated project, e.g. a sibling project of your webapps, and not in the parent pom. I don't think that's really a big deal, and this is IMHO a better setup, especially if you plan to use Cargo for integration testing.
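That dedicated project could simply sit next to the webapps, for example (names are placeholders; I believe Cargo resolves such deployables from the runner project's own <dependencies>, so the runner POM would also list the two WARs as dependencies):
/pom.xml           // parent pom, listing webapp1, webapp2 and runner as modules
webapp1/pom.xml
webapp2/pom.xml
runner/pom.xml     // war dependencies + the cargo-maven2-plugin configuration above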
We are in the process of converting our main build process from ant to maven. We use TeamCity for our Continuous Integration server (CI).
We'd like to use the CI server to kick off (nightly) builds whose versions contain a build number, as in 1.0.0.build#. These builds would be installed in our local maven repository to be used by other projects. So the CI server would manage the versions, maven would build the project, and the maven repository would make the builds accessible to other projects.
I intended to initiate the build from the CI server using the following command:
mvn -Dversion=1.0.0.25 install
The project's pom would have a bogus version number, and the -D flag would override it, as in:
<version>0.0.0.0</version>
The problem with this method is that the maven install plugin only uses the version in the pom file, not the version passed in on the command line. This is noted in this maven issue.
So since this issue has existed since 08/2006 and has not been fixed, I assume that this is somehow not 'the maven way'. So my question is, how can maven be used in a continuous integration situation to install versioned artifacts in the repository?
Sounds like you want to build SNAPSHOT versions with unique versions.
So, in your POM declare the version as:
<version>#.#.#-SNAPSHOT</version>
Then, in the distributionManagement section of your POM, enable unique versions for the snapshotRepository via (see Maven's POM reference on this):
<snapshotRepository>
  <uniqueVersion>true</uniqueVersion>
  <id>your-snapshot-repo-id</id>
  <name>Your Snapshots</name>
  <url>http://your-snapshot-repo-url/maven</url>
</snapshotRepository>
FYI, note that Maven conventions recommend versions be declared as major.minor.revision. So, 1.0.25 instead of 1.0.0.25. If you're able to use this versioning scheme, things will work more smoothly in a Maven world.
Matthew's answer provides a solution where the artifacts get uploaded into the local and remote repository with the desired version number, i.e. the paths inside the repository contain the correct version numbers. However, Maven always installs and deploys the source POM file, which would still contain ${ciVersion} in its version element.
If you have a multi-module build with a common parent like this:
<project xmlns="..." xmlns:xsi="..." xsi:schemaLocation="...">
  <modelVersion>4.0.0</modelVersion>
  <parent>
    <artifactId>myParent</artifactId>
    <groupId>com.stackoverflow</groupId>
    <version>${ciVersion}</version>
  </parent>
  <artifactId>myChild</artifactId>
  ...
</project>
you won't be able to reference a dedicated version of the myChild module, as the dependency resolution will fail with an error saying that it cannot find the myParent module with version ${ciVersion}.
However, you could use the resolve-pom-maven-plugin, which uploads a POM into the local and remote repository in which all variables inside the POM have been substituted with their actual values. To do this, add the following snippet to your (parent) POM:
...
<build>
  <plugins>
    <plugin>
      <groupId>com.sap.prd.mobile.ios.maven.plugins</groupId>
      <artifactId>resolve-pom-maven-plugin</artifactId>
      <version>1.0</version>
      <executions>
        <execution>
          <id>resolve-pom-props</id>
          <goals>
            <goal>resolve-pom-props</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
...
Shek's answer is probably 'the maven way', so I'll accept it as the correct answer. However, we are not ready to change our conventions, so here is the workaround that we are using.
By using a level of indirection, you can pass a version number into the POM at build time and have the install and deploy plugins use it. For example:
<project xmlns="..." xmlns:xsi="..." xsi:schemaLocation="...">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.stackoverflow</groupId>
  <artifactId>stackoverflow</artifactId>
  <version>${ciVersion}</version>
  <packaging>jar</packaging>
  <name>StackOverflow</name>
  <properties>
    <ciVersion>0.0.0.0</ciVersion>
  </properties>
  ...
</project>
We cannot override ${project.version} directly. So instead, we add a second property called 'ciVersion' and give it a default value of '0.0.0.0' in the properties section. Now the CI server can specify a version number by overriding the ciVersion property on the command line. As in:
mvn -DciVersion=1.0.0.25 install
The install and deploy plugins will use the value of the ciVersion property that was passed in whenever ${project.version} is referenced, as expected, and the default value will be used when no version is provided on the command line. This allows us to switch to maven with minimal impact on our process. In addition, this workaround is unobtrusive, allowing for an easy switch to the SNAPSHOT functionality when desired.
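Downstream projects can then depend on the concrete version the CI server produced, e.g. (using the coordinates and version from the example above):
<dependency>
  <groupId>com.stackoverflow</groupId>
  <artifactId>stackoverflow</artifactId>
  <version>1.0.0.25</version>
</dependency>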