Publishing wsdl java M2E plugin execution not covered - maven-2

I have written a WSDL and I want to generate Java classes.
I use the cxf-codegen-plugin for Maven 2, but I get the following validation error in Eclipse:
Plugin execution not covered by lifecycle configuration:
org.apache.cxf:cxf-codegen-plugin:2.2.7:wsdl2java (execution:
generate-sources, phase: generate-sources).
Can anybody help me to resolve this? Or propose another solution?

The error message sounds like it is coming from the Maven Eclipse integration (m2e). If the build works from the command line but not from within Eclipse, then maybe this article on the Eclipse wiki can help you configure Eclipse.
This answer shows the solution for a similar problem with another maven plugin.

You can add this snippet to your pom.xml (from the link published by @Jörn Horstmann):
<pluginManagement>
  <plugins>
    <!-- This plugin's configuration is used to store Eclipse m2e settings
         only. It has no influence on the Maven build itself. -->
    <plugin>
      <groupId>org.eclipse.m2e</groupId>
      <artifactId>lifecycle-mapping</artifactId>
      <version>1.0.0</version>
      <configuration>
        <lifecycleMappingMetadata>
          <pluginExecutions>
            <pluginExecution>
              <pluginExecutionFilter>
                <groupId>org.apache.cxf</groupId>
                <artifactId>cxf-codegen-plugin</artifactId>
                <versionRange>[2.3.3,)</versionRange>
                <goals>
                  <goal>wsdl2java</goal>
                </goals>
              </pluginExecutionFilter>
              <action>
                <execute />
              </action>
            </pluginExecution>
          </pluginExecutions>
        </lifecycleMappingMetadata>
      </configuration>
    </plugin>
  </plugins>
</pluginManagement>
The most important lines are:
  <groupId>org.apache.cxf</groupId>
  <artifactId>cxf-codegen-plugin</artifactId>
  <versionRange>[2.3.3,)</versionRange>
  <goals>
    <goal>wsdl2java</goal>
  </goals>
So make sure the versionRange actually covers the version of CXF you are using.
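For example, with the CXF version reported in the question's error (2.2.7), the range would have to be widened to include it; only the versionRange line changes, along these lines:
<versionRange>[2.2.7,)</versionRange>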
Hope this can help...

Change the plugin version to this specific one. Then run Maven Update Project, and finally run Maven generate-sources:
<groupId>org.apache.cxf</groupId>
<artifactId>cxf-codegen-plugin</artifactId>
<version>2.7.10</version>
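From the command line, the equivalent of that last step is simply:
mvn clean generate-sources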

Related

Is Mapstruct Java11 compatible?

I'm a bit confused. There is some documentation that says Java 9 support is "experimental":
https://mapstruct.org/documentation/stable/reference/html/#_using_mapstruct_on_java_9
And I found a post where someone was having trouble on Java 10. We are heading to Java 11, and I want to know whether MapStruct will work in that environment. Specifically, will it generate the code at compile time, AND does the generated code work there (I suppose the latter does)?
Yes, it works: we use MapStruct without issues on a Java 11 / Spring Boot 2 project at work.
Yes, it is possible, although I struggled a bit with it while migrating a DropWizard project (1.3.7) to Java 11. The configuration proposed in the documentation (through the maven-compiler-plugin) didn't work for me (no error was shown, but the mapper class was not generated), so I had to use maven-processor-plugin v3.3.3.
Here is how I managed to do that:
Add the dependencies using <org.mapstruct.version>1.3.1.Final</org.mapstruct.version>
<dependency>
  <groupId>org.mapstruct</groupId>
  <artifactId>mapstruct</artifactId>
  <version>${org.mapstruct.version}</version>
</dependency>
<dependency>
  <groupId>org.mapstruct</groupId>
  <artifactId>mapstruct-processor</artifactId>
  <version>${org.mapstruct.version}</version>
  <scope>provided</scope>
</dependency>
Then configure the plugin in the submodule as follows
<plugin>
  <groupId>org.bsc.maven</groupId>
  <artifactId>maven-processor-plugin</artifactId>
  <version>3.3.3</version>
  <executions>
    <execution>
      <id>process</id>
      <goals>
        <goal>process</goal>
      </goals>
      <phase>generate-sources</phase>
      <configuration>
        <processors>
          <!-- list of processors to use -->
          <processor>org.mapstruct.ap.MappingProcessor</processor>
        </processors>
        <outputDirectory>${basedir}/target/generated-sources-mappers</outputDirectory>
        <compilerArguments>-source 11 -target 11</compilerArguments>
      </configuration>
    </execution>
  </executions>
</plugin>
The outputDirectory is something specific to our project, but I leave it there to highlight that this XML tag changed from version 2.x of the plugin, in case you are migrating from that.
The compilerArguments portion was required because the plugin runs javac with Java 1.6 as the default source/target, which won't work if you are using lambda expressions or other newer language features.
When compiling, pay attention to the output of the plugin: it should only show warnings. Otherwise it won't generate your classes, and you will get a generic ClassNotFound exception whose real cause is whatever is preventing the processor from compiling cleanly.
[INFO] --- maven-processor-plugin:3.3.3:process
...
7 warnings
Also make sure you don't have any version of the MapStruct library older than 1.3.0.Final on your classpath; that will also cause issues that prevent the classes from being generated.
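One quick way to spot a stray older version (assuming all your MapStruct artifacts use the org.mapstruct groupId) is the dependency plugin's tree goal:
mvn dependency:tree -Dincludes=org.mapstruct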
I used the following configuration for JDK11
<properties>
  <mapstruct.version>1.3.1.Final</mapstruct.version>
  <maven.compiler.version>3.6.1</maven.compiler.version>
</properties>

<dependency>
  <groupId>org.mapstruct</groupId>
  <artifactId>mapstruct-processor</artifactId>
  <version>${mapstruct.version}</version>
  <scope>provided</scope>
</dependency>

<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-compiler-plugin</artifactId>
      <version>${maven.compiler.version}</version>
      <configuration>
        <annotationProcessorPaths>
          <path>
            <groupId>org.mapstruct</groupId>
            <artifactId>mapstruct-processor</artifactId>
            <version>${mapstruct.version}</version>
          </path>
        </annotationProcessorPaths>
      </configuration>
    </plugin>
  </plugins>
</build>
Then mvn clean install will generate the impl classes in target\generated-sources\annotations
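As a quick sanity check that the processor actually ran, a minimal mapper along the lines below should produce a CarMapperImpl class in that directory. Car and CarDto are placeholder types used only for illustration, not anything from the question:
import org.mapstruct.Mapper;
import org.mapstruct.factory.Mappers;

// Car and CarDto are hypothetical domain/DTO classes for this example.
@Mapper
public interface CarMapper {

    CarMapper INSTANCE = Mappers.getMapper(CarMapper.class);

    // MapStruct generates CarMapperImpl with this mapping at compile time.
    CarDto carToCarDto(Car car);
}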

maven-jar-plugin addClasspath scoping

Is there any way to get the maven-jar-plugin to use scope when adding a classpath to a jar manifest? I have a project where I want to create two jars: runtime and test. The runtime jar should have a classpath of only the runtime dependencies. The test jar should have a classpath of the test dependencies. I have not been able to figure out how to do this. Any ideas?
I am aware of MJAR-117, but this bug is over a year old - perhaps it has been resolved in a different JIRA?
I don't think this is supported by the Maven Archiver (and MJAR-117 doesn't seem to get much traction). A possible workaround would be to provide (hard-coded) additional classpath entries when building the test-jar:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-jar-plugin</artifactId>
  <version>2.2</version>
  <executions>
    <execution>
      <id>default-jar</id>
      <configuration>
        <archive>
          <manifest>
            <addClasspath>true</addClasspath>
          </manifest>
        </archive>
      </configuration>
    </execution>
    <execution>
      <id>default-test-jar</id>
      <phase>package</phase>
      <goals>
        <goal>test-jar</goal>
      </goals>
      <configuration>
        <archive>
          <manifest>
            <addClasspath>true</addClasspath>
          </manifest>
          <manifestEntries>
            <Class-Path>foo-1.0.jar bar-2.1.jar</Class-Path>
          </manifestEntries>
        </archive>
      </configuration>
    </execution>
  </executions>
</plugin>
I agree this is not ideal: you have to add things manually, and that is error-prone. But it works.
You could maybe do something more dynamic with filtering and some antrun or groovy magic, but this would definitely require more work; a rough sketch follows the related link below.
Related question
Maven - how can I add an arbitrary classpath entry to a jar?
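As a rough, untested sketch of that more dynamic idea: the maven-dependency-plugin's build-classpath goal can write the resolved test classpath into a property, which the test-jar execution could then reference in its manifestEntries instead of the hard-coded names. The property name test.classpath is made up for this example, and the computed entries are absolute repository paths by default, so they would still need massaging (for example via the goal's prefix parameter) before they make sense in a manifest Class-Path:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>build-test-classpath</id>
      <phase>prepare-package</phase>
      <goals>
        <goal>build-classpath</goal>
      </goals>
      <configuration>
        <!-- include test-scoped dependencies in the computed classpath -->
        <includeScope>test</includeScope>
        <!-- manifest Class-Path entries are space separated -->
        <pathSeparator> </pathSeparator>
        <!-- expose the result as a property for later plugin executions -->
        <outputProperty>test.classpath</outputProperty>
      </configuration>
    </execution>
  </executions>
</plugin>
The default-test-jar execution above would then use <Class-Path>${test.classpath}</Class-Path> in its manifestEntries.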
I am not aware if the bug is fixed. To be frank, I wasn't even aware that this bug existed. But I think your problem can be solved using Maven profiles.
You can have a separate profile for your testing:
<profile>
  <id>test</id>
  <activation>
    <property>
      <name>test</name>
      <value>true</value>
    </property>
  </activation>
  <dependencies>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>3.8</version>
      <!--
        we don't need the test scope since we want the jar plugin to include
        it in the classpath
      -->
      <!-- scope>test</scope -->
    </dependency>
  </dependencies>
</profile>
To use this profile:
$>mvn -Dtest=true package
I would like to create the app jar and the test jar in one execution of mvn. By using profiles, I would have to execute mvn twice. The reason I need a single execution is that I want to use the maven-assembly-plugin to package these two jars together in a zip file.
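For reference, the zip part could be described with a minimal assembly descriptor roughly like the one below; this is an untested sketch that assumes the test jar is attached with the default tests classifier:
<assembly>
  <id>bundle</id>
  <formats>
    <format>zip</format>
  </formats>
  <includeBaseDirectory>false</includeBaseDirectory>
  <files>
    <!-- the main jar and the test jar produced by the jar plugin -->
    <file>
      <source>${project.build.directory}/${project.build.finalName}.jar</source>
    </file>
    <file>
      <source>${project.build.directory}/${project.build.finalName}-tests.jar</source>
    </file>
  </files>
</assembly>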
I went into the source code for the Maven Archiver (here). It looks like this is not possible. I think the only way I can do this is by marking the scopes of all my test dependencies as "runtime" and just deploying these jars to PROD even though they will not be used. Not pretty.
if ( config.isAddClasspath() )
{
    StringBuffer classpath = new StringBuffer();

    List artifacts = project.getRuntimeClasspathElements();
I just had another thought: could this be accomplished with the Maven Groovy plugin?

Is there a maven plugin that verifies that all dependencies are releases?

The 'maven-release-plugin' has this feature, but it is not available as a separate goal.
I think I have seen this functionality somewhere, but I can't find it again. Would be great if somebody knows where to find such a plugin.
The Maven Enforcer plugin has a requireReleaseDeps rule that lets you enforce that no snapshots are included as dependencies. It may be what you're looking for.
If you configure the plugin like this (check the rule documentation for more options):
<project>
  [...]
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-enforcer-plugin</artifactId>
        <version>1.0-beta-1</version>
        <configuration>
          <rules>
            <requireReleaseDeps>
              <message>No Snapshots Allowed!</message>
            </requireReleaseDeps>
          </rules>
        </configuration>
      </plugin>
    </plugins>
  </build>
  [...]
</project>
Then calling mvn enforcer:enforce will do the job.
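If you would rather have the check run on every build instead of calling enforcer:enforce by hand, the enforce goal can be bound through an execution (a common setup; it binds to the validate phase by default):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-enforcer-plugin</artifactId>
  <executions>
    <execution>
      <id>no-snapshot-deps</id>
      <goals>
        <goal>enforce</goal>
      </goals>
      <configuration>
        <rules>
          <requireReleaseDeps>
            <message>No Snapshots Allowed!</message>
          </requireReleaseDeps>
        </rules>
      </configuration>
    </execution>
  </executions>
</plugin>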
Using release:prepare together with dryRun=true should do what you want.
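Something along these lines; dryRun performs all the checks without committing anything, and release:clean removes the files it generates:
mvn release:prepare -DdryRun=true
mvn release:clean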

Maven2: How to stage JXR plugin result when using mvn site?

I have a multi-module project and I want to deploy on the project's site an HTML version of my source code using the JXR maven plugin.
The problem is that the JXR plugin runs well and the xref folder is properly generated for each of my modules, but when I use the mvn site:stage command to gather all of the project's site content and have all links properly generated, it does not pick up the xref folders.
Here is an extract of my POM file where the JXR plugin is configured:
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>jxr-maven-plugin</artifactId>
  <configuration>
    <aggregate>true</aggregate>
  </configuration>
</plugin>
Here is the command I use to create and stage my site:
mvn site site:stage
Do you guys have any idea?
Not sure this is relevant, but your command is running the site twice: mvn site will generate the site, and site:stage will also run it. Perhaps this is causing problems, but I honestly can't see why.
Looking at the JXR documentation, it only mentions the site:site goal. I can't see why it wouldn't run properly for the site:stage goal, as that extends it. If you run the site goal, copy the output to another directory, then run the site:stage goal and compare the output, it might give some insight into the problem.
Update: I tried this myself and the xref was included and aggregated nicely in c:\test\stage with the cross references correctly managed. I've included the configuration I used.
In my parent pom I defined the site configuration like this:
<build>
  <plugins>
    <plugin>
      <artifactId>maven-site-plugin</artifactId>
      <executions>
        <execution>
          <phase>prepare-package</phase>
          <goals>
            <goal>stage</goal>
          </goals>
        </execution>
      </executions>
      <configuration>
        <stagingDirectory>c:\test\stage</stagingDirectory>
      </configuration>
    </plugin>
  </plugins>
</build>
The distributionManagement section was configured with the site information (not really needed as I set the stagingDirectory above, but the goal won't run without it).
<distributionManagement>
  <site>
    <id>mojo.website</id>
    <name>Mojo Website</name>
    <url>scp://test/</url>
  </site>
</distributionManagement>
My JXR configuration in the parent pom was as follows:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-jxr-plugin</artifactId>
  <reportSets>
    <reportSet>
      <id>src-xref</id>
      <reports>
        <report>jxr</report>
      </reports>
    </reportSet>
    <reportSet>
      <id>test-xref</id>
      <reports>
        <report>test-jxr</report>
      </reports>
    </reportSet>
  </reportSets>
  <configuration>
    <aggregate>true</aggregate>
  </configuration>
</plugin>
The commandline run was mvn clean site:stage
Edit: Per the comments, there is a codehaus jxr plugin with slightly different semantics. Be sure to use the org.apache.maven.plugins version rather than the org.codehaus.mojo version.

Integrate Protocol Buffers into Maven2 build

I'm experimenting with Protocol Buffers in an existing, fairly vanilla Maven 2 project. Currently, I invoke a shell script every time I need to update my generated sources. This is obviously a hassle, as I would like the sources to be generated automatically before each build. Hopefully without resorting to shameful hackery.
So, my question is two-fold:
Long shot: is there a "Protocol Buffers plugin" for Maven 2 that can achieve the above in an automagic way? There's a branch on Google Code whose author appears to have taken a shot at implementing such a plugin. Unfortunately, it hasn't passed code review or been merged into protobuf trunk. The status of that plugin is thus unknown.
Probably more realistic: lacking an actual plugin, how else might I go about invoking protoc from my Maven 2 build? I suppose I may be able to wire up my existing shell script into an antrun invocation or something similar.
Personal experiences are most appreciated.
You'll find some information about the plugin available in the Protocol Buffers repository in the Protocol Buffers Compiler Maven Plug-In thread on the Protocol Buffers discussion group. My understanding is that it's usable but lacking tests. I'd give it a try.
Or you could just use the antrun plugin (snippet pasted from the thread mentioned above):
<build>
  <plugins>
    <plugin>
      <artifactId>maven-antrun-plugin</artifactId>
      <executions>
        <execution>
          <id>generate-sources</id>
          <phase>generate-sources</phase>
          <configuration>
            <tasks>
              <mkdir dir="target/generated-sources"/>
              <exec executable="protoc">
                <arg value="--java_out=target/generated-sources"/>
                <arg value="src/main/protobuf/test.proto"/>
              </exec>
            </tasks>
            <sourceRoot>target/generated-sources</sourceRoot>
          </configuration>
          <goals>
            <goal>run</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>

<dependencies>
  <dependency>
    <groupId>com.google.protobuf</groupId>
    <artifactId>protobuf-java</artifactId>
    <version>2.0.3</version>
  </dependency>
</dependencies>
The accepted answer encouraged me to get the Google-provided plugin to work. I merged the branch mentioned in my question into a checkout of 2.2.0 source code, built and installed/deployed the plugin, and was able to use it in my project as follows:
<build>
  <plugins>
    <plugin>
      <groupId>com.google.protobuf.tools</groupId>
      <artifactId>maven-protoc-plugin</artifactId>
      <version>0.0.1</version>
      <executions>
        <execution>
          <id>generate-sources</id>
          <goals>
            <goal>compile</goal>
          </goals>
          <phase>generate-sources</phase>
          <configuration>
            <protoSourceRoot>${basedir}/src/main/protobuf/</protoSourceRoot>
            <includes>
              <param>**/*.proto</param>
            </includes>
          </configuration>
        </execution>
      </executions>
      <configuration>
        <protocExecutable>/usr/local/bin/protoc</protocExecutable>
      </configuration>
    </plugin>
  </plugins>
</build>
Note that I changed the plugin's version to 0.0.1 (no -SNAPSHOT) in order to make it go into my non-snapshot thirdparty Nexus repository. YMMV. The takeaway is that this plugin will be usable once it's no longer necessary to jump through hoops in order to get it going.
The accepted solution does not scale for multiple proto files. I had to come up with my own:
<build>
  <plugins>
    <plugin>
      <artifactId>maven-antrun-plugin</artifactId>
      <executions>
        <execution>
          <id>compile-protoc</id>
          <phase>generate-sources</phase>
          <configuration>
            <tasks>
              <mkdir dir="${generated.sourceDirectory}" />
              <path id="proto.path">
                <fileset dir="src/main/proto">
                  <include name="**/*.proto" />
                </fileset>
              </path>
              <pathconvert pathsep=" " property="proto.files" refid="proto.path" />
              <exec executable="protoc" failonerror="true">
                <arg value="--java_out=${generated.sourceDirectory}" />
                <arg value="-I${project.basedir}/src/main/proto" />
                <arg line="${proto.files}" />
              </exec>
            </tasks>
          </configuration>
          <goals>
            <goal>run</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
There's also a great plugin by Igor Petruk named protobuf-maven-plugin. It's in the central repo now and plays nicely with Eclipse (m2e 1.1 is recommended).
I just updated the Maven plugin to work with 2.2.0; the updated pom is attached to the code review bug.
Here are the instructions to build the plugin yourself:
svn co http://protobuf.googlecode.com/svn/branches/maven-plugin/tools/maven-plugin
cd maven-plugin
wget -O pom.xml 'http://protobuf.googlecode.com/issues/attachment?aid=8860476605163151855&name=pom.xml'
mvn install
You can then use the maven config above.
I just tried a less official but very recent (v 0.1.7) fork from https://github.com/dtrott/maven-protoc-plugin and it worked very well, courtesy of David Trott. I tested it with a couple of Maven modules one of which contained DTO-style messages and the other a service depending on them. I borrowed the plugin configuration MaxA posted on Oct 16 '09, I had protoc on my PATH and I added
<temporaryProtoFileDirectory>${basedir}/target/temp</temporaryProtoFileDirectory>
right after
<protocExecutable>protoc</protocExecutable>.
What is really nice is that all I had to do was declare a normal dependency from the service module on the DTO module. The plugin was able to resolve proto file dependencies by finding the proto files packaged with the DTO module, extracting them to a temporary directory, and using them while generating code for the service. And it was smart enough not to package a second copy of the generated DTO classes with the service module.
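In other words, the service module's POM just needs an ordinary dependency on the DTO module, something like this (the coordinates are hypothetical):
<dependency>
  <groupId>com.example</groupId>
  <artifactId>my-protobuf-dtos</artifactId>
  <version>1.0-SNAPSHOT</version>
</dependency>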
There is a maven plugin for protobuf. https://www.xolstice.org/protobuf-maven-plugin/usage.html
The minimal config
<plugin>
  <groupId>org.xolstice.maven.plugins</groupId>
  <artifactId>protobuf-maven-plugin</artifactId>
  <version>0.5.0</version>
  <configuration>
    <protocExecutable>/usr/local/bin/protoc</protocExecutable>
  </configuration>
  <executions>
    <execution>
      <goals>
        <goal>compile</goal>
        <goal>test-compile</goal>
      </goals>
    </execution>
  </executions>
</plugin>
I think that using antrun to invoke non-Maven steps is the generally accepted solution.
You could also try the maven-exec-plugin.
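A rough sketch of what the exec-maven-plugin variant could look like is below. This is untested; the paths and output directory are illustrative, protoc will not create the output directory for you, and you would still need something like the build-helper-maven-plugin's add-source goal to register the generated directory as a source root:
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>exec-maven-plugin</artifactId>
  <executions>
    <execution>
      <id>run-protoc</id>
      <phase>generate-sources</phase>
      <goals>
        <goal>exec</goal>
      </goals>
      <configuration>
        <executable>protoc</executable>
        <arguments>
          <!-- the --java_out directory must already exist when protoc runs -->
          <argument>--java_out=${project.build.directory}/generated-sources/protobuf</argument>
          <argument>-I${basedir}/src/main/protobuf</argument>
          <argument>${basedir}/src/main/protobuf/test.proto</argument>
        </arguments>
      </configuration>
    </execution>
  </executions>
</plugin>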
I forked the plugin from David Trott and have it compiling multiple languages, which makes it a lot more useful. See the github project here and a tutorial on integrating it with a maven build here.