Did I understand correctly from this page that the Cargo Maven plugin doesn't support hot remote deployment to GlassFish 3.x? If I'm wrong, how can I configure it to support that kind of operation?
Or should I use some other plugin? I'd like to deploy to a remote GlassFish installation, over HTTP, in "hot" mode.
This is what I've done so far:
<plugin>
<artifactId>maven-antrun-plugin</artifactId>
<version>1.4</version>
<executions>
<execution>
<phase>package</phase>
<configuration>
<tasks>
<tempfile property="ant.temp-ear" deleteonexit="true" destdir="/tmp" />
<copy
file="${project.build.directory}/${project.build.finalName}.${project.packaging}"
tofile="${ant.temp-ear}" verbose="true" />
<exec executable="${glassfish.home}/glassfish/bin/asadmin"
failonerror="true">
<arg value="--user=${glassfish.username}"/>
<arg value="--passwordfile=${glassfish.passwordfile}"/>
<arg value="--interactive=false"/>
<arg value="--host=${glassfish.host}"/>
<arg value="--port=${glassfish.adminport}"/>
<arg value="deploy"/>
<arg value="--force"/>
<arg value="--name=${project.artifactId}"/>
<arg value="${ant.temp-ear}"/>
</exec>
</tasks>
</configuration>
<goals>
<goal>run</goal>
</goals>
</execution>
</executions>
</plugin>
This works perfectly, but asadmin (and, I assume, the entire GlassFish installation) has to be present on the same machine where mvn is executed.
Is it possible to perform the same task with the Cargo plugin?
Does this answer your question?
<build>
<plugins>
<plugin>
<groupId>org.codehaus.cargo</groupId>
<artifactId>cargo-maven2-plugin</artifactId>
<configuration>
<container>
<containerId>glassfish3x</containerId>
<type>remote</type>
</container>
<configuration>
<type>runtime</type>
<properties>
<cargo.hostname>dev-server-01</cargo.hostname>
<cargo.servlet.port>8080</cargo.servlet.port>
<cargo.remote.username>user</cargo.remote.username>
<cargo.remote.password>pass</cargo.remote.password>
<cargo.glassfish.domain.name>domain-name</cargo.glassfish.domain.name>
<cargo.glassfish.adminPort>4848</cargo.glassfish.adminPort>
</properties>
</configuration>
<deployables>
<deployable>
<groupId>${project.groupId}</groupId>
<artifactId>${project.artifactId}</artifactId>
<type>war</type>
</deployable>
</deployables>
</configuration>
<dependencies>
<dependency>
<groupId>org.glassfish.main.deployment</groupId>
<artifactId>deployment-client</artifactId>
<version>3.1.2.2</version>
</dependency>
</dependencies>
</plugin>
</plugins>
</build>
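With a remote container and a runtime configuration like the one above, Cargo deploys over the admin interface of the remote server, so nothing needs to be installed on the build machine. As a minimal sketch (the phase binding is just one option; you can equally run mvn cargo:redeploy by hand), the deployment could be bound to a lifecycle phase by adding an executions block to the cargo-maven2-plugin declaration shown above:
<executions>
<execution>
<id>deploy-to-remote-glassfish</id>
<phase>pre-integration-test</phase>
<goals>
<goal>redeploy</goal>
</goals>
</execution>
</executions>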
I am using the Thorntail framework and have developed integration test cases using Arquillian. When I try to generate the test coverage report, it shows 0% coverage. I have already tried many solutions, including the thorntail-example project and the Arquillian JaCoCo extension, but couldn't resolve this.
Below is my pom.xml:
<profile>
<id>jacoco</id>
<dependencies>
<dependency>
<groupId>org.jacoco</groupId>
<artifactId>org.jacoco.core</artifactId>
<version>${version.jacoco}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.jboss.arquillian.extension</groupId>
<artifactId>arquillian-jacoco</artifactId>
<version>${version.arquillian_jacoco}</version>
<scope>test</scope>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-failsafe-plugin</artifactId>
<version>2.15</version>
<executions>
<!-- Ensures that both integration-test and verify goals of the Failsafe
Maven plugin are executed. -->
<execution>
<id>integration-tests</id>
<goals>
<goal>integration-test</goal>
<goal>verify</goal>
</goals>
<configuration>
<!-- Sets the VM argument line used when integration tests are run. -->
<argLine>${failsafeArgLine}</argLine>
<!-- Skips integration tests if the value of skip.integration.tests
property is true -->
<skipTests>false</skipTests>
<systemPropertyVariables>
<thorntail.arquillian.jvm.args>${failsafeArgLine}</thorntail.arquillian.jvm.args>
</systemPropertyVariables>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.jacoco</groupId>
<artifactId>jacoco-maven-plugin</artifactId>
<version>${version.jacoco}</version>
<executions>
<execution>
<id>pre-integration-test</id>
<phase>pre-integration-test</phase>
<goals>
<goal>prepare-agent</goal>
</goals>
<configuration>
<!-- Sets the path to the file which contains the execution data. -->
<destFile>${jacoco.it.execution.data.file}</destFile>
<!-- Sets the name of the property containing the settings for JaCoCo
runtime agent. -->
<propertyName>failsafeArgLine</propertyName>
</configuration>
</execution>
<execution>
<id>post-integration-test</id>
<phase>post-integration-test</phase>
<goals>
<goal>report</goal>
</goals>
<configuration>
<!-- Sets the path to the file which contains the execution data. -->
<dataFile>${jacoco.it.execution.data.file}</dataFile>
<!-- Sets the output directory for the code coverage report. -->
<outputDirectory>${project.reporting.outputDirectory}/jacoco-it</outputDirectory>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
</profile>
And this is my arquillian.xml:
<container default="true" qualifier="daemon">
<configuration>
<property name="host">localhost</property>
<property name="port">${thorntail.arquillian.daemon.port:8085}</property>
<property name="javaVmArguments">${thorntail.arquillian.jvm.args:}</property>
</configuration>
</container>
<extension qualifier="jacoco">
<property name="includes">com.*</property>
</extension>
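For reference (the paths and version below are illustrative, not taken from this build), the failsafeArgLine property produced by the prepare-agent goal typically expands to a JVM argument along the lines of
-javaagent:/home/user/.m2/repository/org/jacoco/org.jacoco.agent/0.8.x/org.jacoco.agent-0.8.x-runtime.jar=destfile=target/jacoco-it.exec
and coverage stays at 0% whenever that argument never reaches the JVM that actually runs the deployed code (here, the forked Thorntail process started by the Arquillian daemon container).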
I am using the maven-exec-plugin to generate Java sources from Thrift. It invokes the external Thrift compiler and uses -o to specify the output directory, "target/generated-sources/thrift".
The problem is that neither the maven-exec-plugin nor the Thrift compiler automatically creates the output directory; I have to create it manually.
Is there a decent/portable way to create missing directories when needed? I don't want to define a mkdir command in the pom.xml, since my project needs to be system-independent.
Instead of the exec plugin, use the antrun plugin to first create the directory and then invoke the thrift compiler.
<plugin>
<artifactId>maven-antrun-plugin</artifactId>
<executions>
<execution>
<id>generate-sources</id>
<phase>generate-sources</phase>
<configuration>
<tasks>
<mkdir dir="target/generated-sources/thrift"/>
<exec executable="${thrift.executable}">
<arg value="--gen"/>
<arg value="java:beans"/>
<arg value="-o"/>
<arg value="target/generated-sources/thrift"/>
<arg value="src/main/resources/MyThriftMessages.thrift"/>
</exec>
</tasks>
</configuration>
<goals>
<goal>run</goal>
</goals>
</execution>
</executions>
</plugin>
You may also want to take a look at the maven-thrift-plugin.
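For reference, a minimal sketch of the maven-thrift-plugin alternative (coordinates and version are from memory, so double-check them); it creates the output directory itself and registers it as a source root:
<plugin>
<groupId>org.apache.thrift.tools</groupId>
<artifactId>maven-thrift-plugin</artifactId>
<version>0.1.11</version>
<configuration>
<thriftExecutable>${thrift.executable}</thriftExecutable>
</configuration>
<executions>
<execution>
<id>thrift-sources</id>
<phase>generate-sources</phase>
<goals>
<goal>compile</goal>
</goals>
</execution>
</executions>
</plugin>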
You can define an ant task to do the job. Put the plugin declaration into your project's pom.xml. This will keep your project system-independent:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-antrun-plugin</artifactId>
<executions>
<execution>
<id>createThriftDir</id>
<phase>process-resources</phase>
<configuration>
<tasks>
<delete dir="${thrift.dir}"/>
<mkdir dir="${thrift.dir}"/>
</tasks>
</configuration>
<goals>
<goal>run</goal>
</goals>
</execution>
</executions>
</plugin>
If you would like to prepare such a folder structure somewhere in your project and then copy it to the place you want, use the maven-resources-plugin to do that:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-resources-plugin</artifactId>
<executions>
<execution>
<id>copy-folder</id>
<phase>package</phase>
<goals>
<goal>copy-resources</goal>
</goals>
<configuration>
<outputDirectory>${project.build.directory}</outputDirectory>
<resources>
<resource>
<filtering>false</filtering>
<directory>${project.basedir}/src/main/resources/folders</directory>
</resource>
</resources>
</configuration>
</execution>
</executions>
</plugin>
I've embedded the following code within my POM:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-antrun-plugin</artifactId>
<executions>
<execution>
<phase>validate</phase>
<configuration>
<tasks>
<pathconvert targetos="unix" property="project.build.directory.portable">
<path location="${project.build.directory}"/>
</pathconvert>
</tasks>
</configuration>
<goals>
<goal>run</goal>
</goals>
</execution>
</executions>
</plugin>
I then reference ${project.build.directory.portable} from the run project action but it comes back as null. Executing <echo> within the Ant block shows the correct value. What am I doing wrong?
For completeness, the mentioned feature was implemented in the maven-antrun-plugin in October 2010.
The configuration parameter you are looking for is exportAntProperties.
Example of use:
<plugin>
<artifactId>maven-antrun-plugin</artifactId>
<version>1.7-SNAPSHOT</version>
<executions>
<execution>
<phase>process-resources</phase>
<goals>
<goal>run</goal>
</goals>
<configuration>
<target>
<exec outputproperty="svnversion"
executable="svnversion">
<arg value=".." />
</exec>
</target>
<exportAntProperties>true</exportAntProperties>
</configuration>
</execution>
</executions>
</plugin>
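Once exported, the Ant property can be consumed by any plugin that runs later in the build. A small sketch (the manifest entry name is arbitrary), assuming the exec above has already run:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<configuration>
<archive>
<manifestEntries>
<SCM-Revision>${svnversion}</SCM-Revision>
</manifestEntries>
</archive>
</configuration>
</plugin>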
As a side note, at the time of this post (2011-10-20), the official plugin documentation didn't have this option documented. To get the help for 'versionXYZ' of the plugin:
mvn help:describe -Dplugin=org.apache.maven.plugins:maven-antrun-plugin:versionXYZ -Ddetail
Version 1.7 of the maven-antrun-plugin worked for me for passing a property from Ant to Maven (and from Maven to Ant). Here is some sample code that calculates an MD5 checksum of a file and stores it in a property that Maven can access later:
<plugin>
<artifactId>maven-antrun-plugin</artifactId>
<version>1.7</version>
<executions>
<execution>
<id>ant-md5</id>
<phase>initialize</phase>
<goals>
<goal>run</goal>
</goals>
<configuration>
<target>
<property name="compile_classpath" refid="maven.compile.classpath"/>
<property name="outputDir" value="${project.build.outputDirectory}"/>
<property name="sourceDir" value="${project.build.sourceDirectory}"/>
<checksum file="${sourceDir}/com/blah/db/blah.java" property="blah.md5db"/>
</target>
<exportAntProperties>true</exportAntProperties>
</configuration>
</execution>
</executions>
</plugin>
The property is then accessible later in the build as ${blah.md5db} (for example, filtered into a resource that the Java code reads).
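One hedged way to make the exported checksum visible to Java code (assuming default Maven resource filtering; the file and key names are just examples) is to put a line like this in src/main/resources/build.properties:
blah.db.checksum=${blah.md5db}
and enable filtering for that directory:
<resources>
<resource>
<directory>src/main/resources</directory>
<filtering>true</filtering>
</resource>
</resources>
Since the checksum execution is bound to the initialize phase, the property is already set when process-resources performs the filtering.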
Going by the plugin documentation: try adding the maven prefix, so you have
<path location="${maven.project.build.directory}"/> instead
If that doesn't work, you may need to explicitly redefine the property yourself:
<property name="maven.project.build.directory" value="${project.build.directory}"/>
<path location="${maven.project.build.directory}"/>
I don't think you can set a property from Ant that will be visible from Maven. You should write a Mojo.
I have a Maven POM file with a plugin that runs in the test phase. What command-line arguments do I have to pass to mvn in order to execute just that plugin rather than all of the plugins for that phase? I am also trying to execute a specific antrun plugin execution, which looks like the following:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-antrun-plugin</artifactId>
<version>1.3</version>
<dependencies>
<dependency>
<groupId>com.googlecode.jslint4java</groupId>
<artifactId>jslint4java-ant</artifactId>
<version>1.3.3</version>
</dependency>
</dependencies>
<executions>
<execution>
<id>jslint</id>
<phase>test</phase>
<goals>
<goal>run</goal>
</goals>
<configuration>
<tasks>
<ant antfile="${basedir}/jslint.xml">
<property name="root" location="${basedir}" />
<target name="jslint" />
</ant>
</tasks>
</configuration>
</execution>
</executions>
</plugin>
Thanks.
Specify the fully-qualified goal in the form of:
mvn groupID:artifactID:version:goal
For example:
mvn sample.plugin:maven-hello-plugin:1.0-SNAPSHOT:sayhi
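Applied to the antrun plugin from the question, the fully-qualified form would be, for example:
mvn org.apache.maven.plugins:maven-antrun-plugin:1.3:run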
EDIT: I'm modifying my answer to cover the update of the initial question and a comment from the OP.
I won't cover all the details but, in the particular case of the antrun plugin, you could just run:
mvn antrun:run
But now that you've updated the question, I understand that things are a bit more complicated than what I thought initially and I don't think that this will actually work. I mean, invoking mvn antrun:run won't fail but it won't pick up the configuration of the execution bound to the test phase.
The only (ugly) solution I can think of would be to add another maven-antrun-plugin configuration in a specific profile, something like this:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-antrun-plugin</artifactId>
<version>1.3</version>
<dependencies>
<dependency>
<groupId>com.googlecode.jslint4java</groupId>
<artifactId>jslint4java-ant</artifactId>
<version>1.3.3</version>
</dependency>
</dependencies>
<configuration>
<tasks>
<ant antfile="${basedir}/jslint.xml">
<property name="root" location="${basedir}" />
<target name="jslint" />
</ant>
</tasks>
</configuration>
</plugin>
And to use this profile when calling antrun:run:
mvn antrun:run -Pmyprofile-for-antrun
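For reference, a minimal sketch of the profile wrapper that the -P flag refers to (the profile id is just an example name):
<profiles>
<profile>
<id>myprofile-for-antrun</id>
<build>
<plugins>
<!-- the maven-antrun-plugin configuration shown above -->
</plugins>
</build>
</profile>
</profiles>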
I have a persistence project with Spring and Hibernate, built with Maven. I'm running the tests using JUnit against an HSQLDB test database; before running a test I first initialize the HSQLDB database in server mode. Is there some way to have Hudson, or Maven, initialize the database?
I'd use DbUnit and the DbUnit Maven Plugin for that. You can use it to clear the database and insert a dataset before the test phase and/or to set up data for each test case (see the Getting Started guide for JUnit 3, or this blog post for a JUnit 4 example).
Another option would be to use the SQL Maven Plugin. The examples section has a configuration that shows how to drop/create a database and schema, populate it before the test phase, and drop the database after the test phase.
The latter approach gives you less control over data setup between tests, which is why I prefer DbUnit.
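As a hedged sketch of the DbUnit route (the plugin version, connection URL and dataset file name are assumptions to adapt), a CLEAN_INSERT before the tests could look like this:
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>dbunit-maven-plugin</artifactId>
<version>1.0-beta-3</version>
<configuration>
<driver>org.hsqldb.jdbcDriver</driver>
<url>jdbc:hsqldb:hsql://localhost/testdb</url>
<username>sa</username>
<password></password>
</configuration>
<executions>
<execution>
<id>load-test-data</id>
<phase>process-test-resources</phase>
<goals>
<goal>operation</goal>
</goals>
<configuration>
<type>CLEAN_INSERT</type>
<src>src/test/resources/dataset.xml</src>
</configuration>
</execution>
</executions>
</plugin>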
I added the following to my pom.xml:
<build>
<extensions>
<extension>
<groupId>hsqldb</groupId>
<artifactId>hsqldb</artifactId>
<version>1.8.0.7</version>
</extension>
<extension>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>1.2.14</version>
</extension>
</extensions>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<source>1.5</source>
<target>1.5</target>
</configuration>
</plugin>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>hibernate3-maven-plugin</artifactId>
<version>2.0</version>
<configuration>
<components>
<component>
<name>hbm2java</name>
<implementation>annotationconfiguration</implementation>
<outputDirectory>/src/main/java</outputDirectory>
</component>
</components>
<componentProperties>
<jdk5>true</jdk5>
<export>false</export>
<drop>true</drop>
<outputfilename>schema.sql</outputfilename>
</componentProperties>
</configuration>
<executions>
<execution>
<id>generate-ddl</id>
<phase>process-classes</phase>
<goals>
<!-- Generate schema -->
<goal>hbm2ddl</goal>
<!-- Generate classes -->
<!-- <goal>hbm2java</goal> -->
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>sql-maven-plugin</artifactId>
<version>1.0</version>
<executions>
<execution>
<id>create-schema</id>
<phase>process-test-resources</phase>
<goals>
<goal>execute</goal>
</goals>
<configuration>
<autocommit>true</autocommit>
<srcFiles>
<srcFile>target/hibernate3/sql/schema.sql</srcFile>
</srcFiles>
</configuration>
</execution>
<execution>
<id>drop-db-after-test</id>
<phase>test</phase>
<goals>
<goal>execute</goal>
</goals>
<configuration>
<autocommit>true</autocommit>
<sqlCommand>DROP SCHEMA public CASCADE</sqlCommand>
</configuration>
</execution>
</executions>
<dependencies>
<dependency>
<groupId>hsqldb</groupId>
<artifactId>hsqldb</artifactId>
<version>1.8.0.7</version>
</dependency>
</dependencies>
<configuration>
<driver>org.hsqldb.jdbcDriver</driver>
<username>sa</username>
<password></password>
<url>jdbc:hsqldb:file:etc/out/test.db;shutdown=true</url>
<autocommit>true</autocommit>
<skip>${maven.test.skip}</skip>
</configuration>
</plugin>
</plugins>
</build>
and the following to my datasource:
<bean id="dataSource" class="org.apache.commons.dbcp.BasicDataSource" lazy-init="true"
destroy-method="close">
<property name="driverClassName" value="org.hsqldb.jdbcDriver" />
<property name="url" value="jdbc:hsqldb:file:etc/out/test.db;shutdown=true" />
<property name="username" value="sa" />
<property name="password" value="" />
</bean>
We do that with Maven under Hudson, with a profile that triggers the maven-antrun-plugin in the process-test-resources phase. In our case the maven-antrun-plugin runs a Java class that generates the schema, but of course there are other options. It looks like this:
<profile>
<id>dev</id>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-antrun-plugin</artifactId>
<executions>
<execution>
<id>generate-database-schema-new</id>
<phase>process-test-resources</phase>
<configuration>
<tasks>
...
</tasks>
</configuration>
<goals>
<goal>run</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
</profile>
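The tasks themselves are elided above; purely as an illustration (the class name is hypothetical), running a schema-generating class from antrun could look like this, using the maven.test.classpath reference that the plugin exposes to Ant:
<tasks>
<java classname="com.example.SchemaGenerator" fork="true" failonerror="true">
<classpath refid="maven.test.classpath"/>
</java>
</tasks>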