I want to use one entity class with two JPA persistence units, so that I can store its data in different tables (or even databases) with different definitions.
According to the JPA 2.2 specification this should be possible, but I am seeing weird behaviour. I'm using Payara, which uses EclipseLink.
For a complete description and a reproducer, see this GitHub project.
I hope someone can help me.
The reason this does not work is EclipseLink weaving.
Weaving manipulates the bytecode of entity classes to extend them with functionality that provides all kinds of optimizations.
The problem, however, is that the resulting 'woven' classes depend on the definitions in persistence.xml and orm.xml.
In my case, because I have two different persistence.xml/orm.xml combinations, I would need two versions of Foo.class and Bar.class, each reflecting different functionality. Of course that won't work.
The solution is to turn off weaving, which can be done with a property in persistence.xml:
<property name="eclipselink.weaving" value="off"/>
If you want to inspect the actual 'woven' classes, you can use static weaving. This can be done with this property
<property name="eclipselink.weaving" value="static"/>
and this Maven plugin:
<plugin>
  <groupId>de.empulse.eclipselink</groupId>
  <artifactId>staticweave-maven-plugin</artifactId>
  <version>1.0.0</version>
  <executions>
    <execution>
      <phase>process-classes</phase>
      <goals>
        <goal>weave</goal>
      </goals>
      <configuration>
        <persistenceXMLLocation>META-INF/persistence.xml</persistenceXMLLocation>
        <logLevel>FINE</logLevel>
      </configuration>
    </execution>
  </executions>
  <dependencies>
    <dependency>
      <groupId>org.eclipse.persistence</groupId>
      <artifactId>org.eclipse.persistence.jpa</artifactId>
      <version>2.7.7.payara-p3</version>
    </dependency>
  </dependencies>
</plugin>
When you then build the project, you can decompile the classes in the JAR to see what is happening.
Thank you Cris for pointing me to EclipseLink weaving.
I'm a bit confused. There is some documentation that says Java 9 support is "experimental":
https://mapstruct.org/documentation/stable/reference/html/#_using_mapstruct_on_java_9
And I found a post where someone was having trouble on Java 10. So we are heading to Java 11 and I want to know whether MapStruct will work in that environment. Specifically, will it generate the code at compile time, AND does the generated code work there (I suppose the latter does)?
Yes, we use MapStruct on a Java 11 / Spring Boot 2 project at work without issues.
Yes, it is possible, although I struggled a bit with it while migrating a DropWizard project (1.3.7) to Java 11. The configuration proposed in the documentation (through the maven-compiler-plugin) didn't work for me (no error was shown, but the mapper class was not generated), so I had to use maven-processor-plugin v3.3.3.
Here is how I managed to do that:
Add the dependencies using <org.mapstruct.version>1.3.1.Final</org.mapstruct.version>
<dependency>
  <groupId>org.mapstruct</groupId>
  <artifactId>mapstruct</artifactId>
  <version>${org.mapstruct.version}</version>
</dependency>
<dependency>
  <groupId>org.mapstruct</groupId>
  <artifactId>mapstruct-processor</artifactId>
  <version>${org.mapstruct.version}</version>
  <scope>provided</scope>
</dependency>
Then configure the plugin in the submodule as follows
<plugin>
  <groupId>org.bsc.maven</groupId>
  <artifactId>maven-processor-plugin</artifactId>
  <version>3.3.3</version>
  <executions>
    <execution>
      <id>process</id>
      <goals>
        <goal>process</goal>
      </goals>
      <phase>generate-sources</phase>
      <configuration>
        <processors>
          <!-- list of processors to use -->
          <processor>org.mapstruct.ap.MappingProcessor</processor>
        </processors>
        <outputDirectory>${basedir}/target/generated-sources-mappers</outputDirectory>
        <compilerArguments>-source 11 -target 11</compilerArguments>
      </configuration>
    </execution>
  </executions>
</plugin>
The outputDirectory is specific to our project, but I leave it there to highlight that the XML tag changed from version 2.x of the plugin, in case you are migrating from that.
The compilerArguments portion was required because the plugin runs javac with Java version 1.6 as the default, which won't work if you are using lambda expressions or other newer language features.
When compiling, pay attention to the output of the plugin: it should only show warnings. Otherwise your classes won't be generated and you will get a generic ClassNotFoundException, even though the real cause is something preventing the mappers from compiling cleanly.
[INFO] --- maven-processor-plugin:3.3.3:process
...
7 warnings
Also make sure you don't have any version of the MapStruct library older than 1.3.0.Final in your classpath; that will also prevent the classes from being generated.
I used the following configuration for JDK 11:
<properties>
  <mapstruct.version>1.3.1.Final</mapstruct.version>
  <maven.compiler.version>3.6.1</maven.compiler.version>
</properties>

<dependency>
  <groupId>org.mapstruct</groupId>
  <artifactId>mapstruct-processor</artifactId>
  <version>${mapstruct.version}</version>
  <scope>provided</scope>
</dependency>

<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-compiler-plugin</artifactId>
      <version>${maven.compiler.version}</version>
      <configuration>
        <annotationProcessorPaths>
          <path>
            <groupId>org.mapstruct</groupId>
            <artifactId>mapstruct-processor</artifactId>
            <version>${mapstruct.version}</version>
          </path>
        </annotationProcessorPaths>
      </configuration>
    </plugin>
  </plugins>
</build>
Then mvn clean install will generate the impl classes in target\generated-sources\annotations
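If the Java level is not configured elsewhere in the POM, you will probably also need to tell the compiler plugin to target Java 11, for example via the standard maven-compiler-plugin properties (a sketch; adjust to your build):
<properties>
  <maven.compiler.source>11</maven.compiler.source>
  <maven.compiler.target>11</maven.compiler.target>
</properties>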
I have a Java EE web application, and to use my project with the OC4J application server it must be patched during my build lifecycle to avoid several issues. Currently I do this via the maven-antrun-plugin, which works great. I have to remove and copy some special libraries into WEB-INF/lib and edit the web.xml to avoid clashes with EL functions and classloading issues.
According to the Maven lifecycle phases I chose the phase prepare-package: this phase is executed before the WAR file is packaged, but unfortunately also before the (re)sources are copied into the temporary working dir. I dislike working on the source folders because they're under version control and I don't want my coworkers to accidentally commit files just because the build tool modified them.
Maven copies all the (re)source stuff to target/__finalName__, which is where I want to fix the project for use with OC4J, because this folder is temporary and will be packaged into the WAR file. Unfortunately the copying and the packaging are both done within the package phase.
So how can I get in between the copying of the sources and resources and the actual packaging?
Example with prepare-package
This example doesn't work because ${project.build.directory}/${build.finalName} doesn't exist yet and ojdbc14.jar hasn't been copied there in this phase.
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-antrun-plugin</artifactId>
  <version>1.4</version>
  <executions>
    <execution>
      <id>patch-oc4j</id>
      <phase>prepare-package</phase>
      <goals>
        <goal>run</goal>
      </goals>
      <configuration>
        <tasks>
          <echo>Patching distribution for OC4J</echo>
          <echo>Deleting the obsolete OJDBC library</echo>
          <delete file="${project.build.directory}/${build.finalName}/WEB-INF/lib/ojdbc14.jar" />
          [... more patching ...]
        </tasks>
      </configuration>
    </execution>
  </executions>
</plugin>
Couldn't you use a profile for this? Maybe something like this:
<profiles>
  <profile>
    <id>oc4j</id>
    <dependencies>
      <dependency>
        <groupId>com.oracle</groupId>
        <artifactId>ojdbc14</artifactId>
        <version>10.2.0.4.0</version>
        <scope>provided</scope>
      </dependency>
    </dependencies>
  </profile>
</profiles>
I have to remove and copy some special libraries into WEB-INF/lib and edit the web.xml to avoid clashes with EL functions and classloading issues.
Sounds like you could, at least in part, do this with Build Profiles instead. Your motivation for the problem above is a bit brief, but if you elaborate we can judge this better.
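For example, a profile could swap in an OC4J-specific web.xml and drop the offending JAR at packaging time. This is only a sketch: the descriptor path is hypothetical and you should verify the maven-war-plugin parameters against the version you use.
<profiles>
  <profile>
    <id>oc4j</id>
    <build>
      <plugins>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-war-plugin</artifactId>
          <configuration>
            <!-- OC4J-specific deployment descriptor (hypothetical location) -->
            <webXml>src/main/oc4j/web.xml</webXml>
            <!-- keep the obsolete driver out of WEB-INF/lib -->
            <packagingExcludes>WEB-INF/lib/ojdbc14.jar</packagingExcludes>
          </configuration>
        </plugin>
      </plugins>
    </build>
  </profile>
</profiles>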
We have an aggregation POM set up to include several individual modules, similar to the Maven documentation:
<project>
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.mycompany.app</groupId>
  <artifactId>my-app</artifactId>
  <version>1</version>
  <packaging>pom</packaging>
  <modules>
    <module>my-module</module>
    <module>my-module-2</module>
  </modules>
</project>
Is there a way to get the build artifacts (.jar files) from these two modules into a common 'dist' directory after building? I did not want to change the output directory of the individual modules from "my-module/target", since they can also be built separately.
I'm a Maven newcomer, so I'm sure there's an easy way to do this that I'm missing.
Is there a way to get the build artifacts (.jar files) from these two modules into a common 'dist' directory after building?
The Maven Assembly Plugin can do that; it is very powerful and flexible. But power and flexibility also mean that this is not the most trivial plugin to use. In your case, the idea would be to generate a dir distribution from a moduleSet, and you'll have to create a custom assembly descriptor for that.
I suggest starting with chapter 8.2, Assembly Basics, of the Maven book and paying special attention to chapter 8.5.5, moduleSets Sections.
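As a rough sketch (the module coordinates are taken from the aggregator POM above; the element names should be checked against the plugin version you use), a descriptor that collects the module JARs into a single directory could look like this:
<assembly>
  <id>dist</id>
  <formats>
    <format>dir</format>
  </formats>
  <includeBaseDirectory>false</includeBaseDirectory>
  <moduleSets>
    <moduleSet>
      <!-- on newer plugin versions this is needed so module artifacts are resolved from the reactor -->
      <useAllReactorProjects>true</useAllReactorProjects>
      <includes>
        <include>com.mycompany.app:my-module</include>
        <include>com.mycompany.app:my-module-2</include>
      </includes>
      <binaries>
        <outputDirectory>/</outputDirectory>
        <unpack>false</unpack>
      </binaries>
    </moduleSet>
  </moduleSets>
</assembly>
You would then reference this descriptor from the maven-assembly-plugin configuration in the aggregator POM and run the assembly from the root.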
After reading more at the links from the other answers, here is what I'm going to try for now:
<plugin>
  <artifactId>maven-resources-plugin</artifactId>
  <executions>
    <execution>
      <id>copy-jars</id>
      <phase>package</phase>
      <goals>
        <goal>copy-resources</goal>
      </goals>
      <configuration>
        <!-- copy-resources requires a target directory; collect everything in a common 'dist' folder -->
        <outputDirectory>${project.build.directory}/dist</outputDirectory>
        <resources>
          <resource>
            <directory>../src/my-module/target</directory>
            <includes>
              <include>**/my-module*.jar</include>
            </includes>
          </resource>
        </resources>
      </configuration>
    </execution>
  </executions>
</plugin>
Not exactly pretty, but while I research the Assembly plugin for a possible longer-term solution, this will do.
I guess the Maven Assembly Plugin can do this.
As @Pangea said, the Assembly Plugin will do it. Just run the assembly:assembly goal with an appropriately set outputDirectory parameter.
More info at http://maven.apache.org/plugins/maven-assembly-plugin/assembly-mojo.html
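A minimal sketch of that configuration in the aggregator POM (descriptor path and output location are assumptions; adjust them to your layout):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <configuration>
    <descriptors>
      <!-- the custom descriptor, e.g. the moduleSets one sketched above -->
      <descriptor>src/assembly/dist.xml</descriptor>
    </descriptors>
    <!-- where the assembled 'dist' directory should end up -->
    <outputDirectory>${project.build.directory}/dist</outputDirectory>
  </configuration>
</plugin>
Then running the assembly:assembly goal from the root should put the module JARs into that directory.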
I only need to use the class org.apache.commons.io.FileUtils, yet I'm downloading all the Commons classes which I don't actually need. Is there a way to tell Maven to download just the FileUtils class, not the whole Commons artifact from the dependency below?
<dependency>
  <groupId>commons-io</groupId>
  <artifactId>commons-io</artifactId>
  <version>1.4</version>
</dependency>
Is there a way to tell Maven to download just the FileUtils class?
No. But depending on your exact use case, you could maybe use the Maven Shade Plugin to create an uber-jar and filter the content of the included dependencies:
Selecting Contents for Uber JAR
...
For fine-grained control of which classes from the selected dependencies are included, artifact filters can be used:
<project>
  ...
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-shade-plugin</artifactId>
        <version>1.3.3</version>
        <executions>
          <execution>
            <phase>package</phase>
            <goals>
              <goal>shade</goal>
            </goals>
            <configuration>
              <filters>
                <filter>
                  <artifact>junit:junit</artifact>
                  <includes>
                    <include>junit/framework/**</include>
                    <include>org/junit/**</include>
                  </includes>
                  <excludes>
                    <exclude>org/junit/experimental/**</exclude>
                    <exclude>org/junit/runners/**</exclude>
                  </excludes>
                </filter>
                <filter>
                  <artifact>*:*</artifact>
                  <excludes>
                    <exclude>META-INF/*.SF</exclude>
                    <exclude>META-INF/*.DSA</exclude>
                    <exclude>META-INF/*.RSA</exclude>
                  </excludes>
                </filter>
              </filters>
            </configuration>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
  ...
</project>
Here, Ant-like patterns are used to specify that from the dependency junit:junit only certain classes/resources should be included in the uber JAR. The second filter demonstrates the use of wildcards for the artifact identity, which was introduced in plugin version 1.3. It excludes all signature-related files from every artifact, regardless of its group or artifact id.
But note that FileUtils depends on other classes:
import org.apache.commons.io.filefilter.DirectoryFileFilter;
import org.apache.commons.io.filefilter.FalseFileFilter;
import org.apache.commons.io.filefilter.FileFilterUtils;
import org.apache.commons.io.filefilter.IOFileFilter;
import org.apache.commons.io.filefilter.SuffixFileFilter; // depends on org.apache.commons.io.IOCase
import org.apache.commons.io.filefilter.TrueFileFilter;
import org.apache.commons.io.output.NullOutputStream;
You'll obviously need to include those classes too.
Apache Commons IO has no dependencies on other Apache Commons projects. You get only Commons IO, no other Commons libraries. That is one JAR with about 100 classes, which is not very much.
You cannot get only one class into your project; this would probably also violate the license!
A look at the FileUtils source also shows a lot of imports of other Commons IO classes. It will not work without the rest of the JAR!
Use the dependency <exclusion> element
<dependency>
  <groupId>sample.ProjectA</groupId>
  <artifactId>Project-A</artifactId>
  <version>1.0</version>
  <scope>compile</scope>
  <exclusions>
    <exclusion> <!-- declare the exclusion here -->
      <groupId>sample.ProjectB</groupId>
      <artifactId>Project-B</artifactId>
    </exclusion>
  </exclusions>
</dependency>
to exclude those transitive dependencies that you don't need.
It is also good practice to use mvn dependency:analyze-only and mvn dependency:tree to understand how your dependency graph is actually structured and which dependencies you are really using but not declaring, and/or declaring but not using.
I don't think so. The artifacts are packaged as JAR files; you can't fetch their contents individually (that is, on a file-by-file basis). At least, not to my knowledge.
Also, think about it a little more: it is entirely possible that the FileUtils class has dependencies on other classes, but you can't really tell what they are without examining the source. That is information the user of the package does not need to know. You wouldn't want to figure out every other class that FileUtils uses (or what other classes the dependencies of FileUtils use, and so on). This is why the entire artifact is distributed as a discrete, self-contained entity. The artifact as a whole, if it is mavenized, knows what dependencies it needs, and Maven will grab those for you as well.
I have Maven configured to run gUnit (an ANTLR grammar unit-testing tool) through the maven-gunit-plugin. gUnit, however, has two different modes. The first mode causes gUnit to act as an interpreter, reading through the *.gunit (or *.testsuite) file, interpreting it, and displaying the results. It can be configured like this:
<plugin>
  <groupId>org.antlr</groupId>
  <artifactId>maven-gunit-plugin</artifactId>
  <version>3.1.3</version>
  <executions>
    <execution>
      <id>maven-gunit-plugin</id>
      <phase>test</phase>
      <goals>
        <goal>gunit</goal>
      </goals>
    </execution>
  </executions>
</plugin>
The second mode causes gUnit to generate source code that can be run by JUnit. How can I instruct the maven-gunit-plugin to generate JUnit sources instead of acting as an interpreter?
A few notes:
I can change the test phase to "generate-test-sources" to cause the Maven plugin to run at the correct time.
I couldn't find any useful documentation on the maven-gunit-plugin.
I've seen people use the exec-maven-plugin to run gUnit with a specific command-line option, but I'm not looking to do that.
EDIT / RESOLUTION:
After reading the various responses, I downloaded the ANTLR source code, which includes the maven-gunit-plugin. The plugin does not support JUnit generation. It turns out that the Codehaus snapshot of the gunit-maven-plugin and the exec plugin are currently the only options.
I found a discussion through MNG-4039 that is illustrated with a gunit-maven-plugin sample. I'll let you read the whole discussion, but according to the author you should end up with something like this:
<dependencies>
  <dependency>
    <groupId>org.antlr</groupId>
    <artifactId>antlr-runtime</artifactId>
    <version>3.1.1</version>
  </dependency>
  <!-- Here is the 'extra' dep -->
  <dependency>
    <groupId>org.antlr</groupId>
    <artifactId>antlr</artifactId>
    <version>3.1.1</version>
    <!-- we try to use scope to hide it from transitivity -->
    <scope>test</scope> <!-- or perhaps 'provided' (see later discussion) or 'import' (maven >= 2.0.9) -->
  </dependency>
</dependencies>

<build>
  <plugins>
    <plugin>
      <groupId>org.codehaus.mojo</groupId>
      <artifactId>gunit-maven-plugin</artifactId>
      <version>1.0.0-SNAPSHOT</version>
      <executions>
        <execution>
          <goals>
            <goal>generate</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
I didn't test this configuration myself and thus can't confirm that everything works out of the box. I don't even know if the plugin has been released in a non-SNAPSHOT version. The only thing I can confirm is that it does indeed seem very hard to find "real" documentation about the maven-gunit-plugin.
There is sad news here:
I found out so far there is no gUnit functionality (be it JUnit test generation or direct invocation of gUnit) for Maven right now. I already mailed with Jim Idle concerning the state of gUnit in the antlr3-maven-plugin and learned that there is a patch to the old version of the Maven plugin waiting in the queue.
I think this workaround is the only option.