I am writing a Maven application using OpenJDK 1.8 and running tests using TestNG.
When I run Maven from the command line everything works fine, but when I try to run the tests inside IntelliJ, the Make process displays the following error:
java: javacTask: source release 8 requires target release 1.8
I have the project settings pointing to the 1.8 JDK and Project Language Level 8.
In my pom.xml I have the following block (which I am guessing is not even being invoked yet, since it seems to be IntelliJ's Make step that is failing):
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-compiler-plugin</artifactId>
    <version>2.3.2</version>
    <configuration>
        <source>1.8</source>
        <target>1.8</target>
    </configuration>
</plugin>
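As an aside, the same source/target pair can apparently also be expressed as properties that the compiler plugin picks up; a minimal sketch, assuming the standard maven.compiler.* property names (not part of my actual pom):
<properties>
    <maven.compiler.source>1.8</maven.compiler.source>
    <maven.compiler.target>1.8</maven.compiler.target>
</properties>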
I have even configured the Maven Runner JRE to point to the 1.8 JDK.
I just don't seem to be able to get IntelliJ 12.0.4 to run the tests properly.
Am I missing something?
After the hint from CrazyCoder, it turned out that .idea/compiler.xml had the following section in it:
<bytecodeTargetLevel>
    <module name="game" target="1.7" />
</bytecodeTargetLevel>
I changed this to:
<bytecodeTargetLevel>
    <module name="game" target="1.8" />
</bytecodeTargetLevel>
and it worked.
There is another way to fix it: go to File -> Other Settings, type "javac" in the search bar, and change the default JDK target to "1.8" (or whichever version you are targeting).
Xetius' answer didn't work for me in IntelliJ 14.1.2.
In the end I found the offending section in .idea/misc.xml:
<component name="ProjectRootManager" version="2" languageLevel="JDK_1_6" assert-keyword="true" jdk-15="true" project-jdk-name="1.7" project-jdk-type="JavaSDK">
<output url="file://$PROJECT_DIR$/out" />
</component>
I ended up with:
<component name="ProjectRootManager" version="2" languageLevel="JDK_1_8" default="false" assert-keyword="true" jdk-15="true" project-jdk-name="1.8" project-jdk-type="JavaSDK">
<output url="file://$PROJECT_DIR$/out" />
</component>
For me, I needed to go to File > Project Structure > Modules. Half of my modules had their "Module SDK" field set to the wrong thing (not "Project Default" as they should have been).
Related
I'm familiar with Tomcat/TomEE and testing applications using Arquillian. Now we are switching to Open Liberty. I see there is a module for Arquillian using embedded Open Liberty, but it seems to require an existing Open Liberty installation whose path is provided in the configuration. This makes it non-portable and therefore unsuitable for automated testing, since the installation has to be present at the exact same path. Arquillian and TomEE are self-contained, no installation required. Therefore my question is: why isn't this also possible with Open Liberty? And is this planned for the future?
For reference this is how you use Arquillian with TomEE/Tomcat:
<arquillian xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
            xmlns="http://jboss.org/schema/arquillian"
            xsi:schemaLocation="http://jboss.org/schema/arquillian http://www.jboss.org/schema/arquillian/arquillian_1_0.xsd">
    <container qualifier="tomee" default="true">
        <configuration>
            <property name="httpPort">-1</property>
            <property name="stopPort">-1</property>
            <property name="users">user=pass</property>
        </configuration>
    </container>
</arquillian>
As you can see, no path to a local installation is required to run the tests. The only thing you need to do is add a couple of Maven dependencies in test scope that pull in TomEE (embedded). If the same worked for Open Liberty, that would be great.
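For reference, the kind of test-scope dependency meant here looks roughly like this; the coordinates below are those of the TomEE embedded Arquillian adapter, and the version is purely illustrative (neither appears in the question above):
<dependency>
    <groupId>org.apache.tomee</groupId>
    <artifactId>arquillian-tomee-embedded</artifactId>
    <version>7.0.5</version> <!-- illustrative version -->
    <scope>test</scope>
</dependency>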
Going further: so the above is how we do automated testing, but it still uses the location.
I see, regarding not needing any location specified at all, you say:
"The only thing you need to do is a add a couple of Maven dependencies in test scope that pull in TomEE (embedded). If the same would work for Open Liberty that would be great."
So, thinking about it: Maven will put a bunch of classes on the classpath via the TomEE dependencies, and then the test run will find the appropriate container to run the tests on.
I will raise an issue over on
https://github.com/OpenLiberty/liberty-arquillian/issues/39
to cover the requirement; please feel free to add remarks etc.
If you have a look at https://github.com/OpenLiberty/open-liberty/blob/integration/dev/com.ibm.ws.microprofile.config.1.2_fat_tck/publish/tckRunner/tck/src/test/resources/arquillian.xml
you will see an example arquillian.xml that sets $wlpHome
<property name="wlpHome">${wlp}</property>
from an environment variable $wlp.
('wlp' is short for WebSphere Liberty Profile.)
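Put in context, that property sits inside a managed-container block in arquillian.xml roughly like this (the qualifier name here is illustrative; wlpHome and serverName are the container's configuration properties):
<container qualifier="liberty-managed" default="true">
    <configuration>
        <!-- ${wlp} resolves to the Liberty install directory, as described above -->
        <property name="wlpHome">${wlp}</property>
        <property name="serverName">defaultServer</property>
    </configuration>
</container>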
The wlpHome variable is used in the managed/local container here:
https://github.com/OpenLiberty/liberty-arquillian/blob/42cb523b8ae6596a00f2e1793e460a910d863625/liberty-managed/src/main/java/io/openliberty/arquillian/managed/WLPManagedContainer.java#L224
An example that does this dynamically is the setting of the
system property ${wlp} here:
https://github.com/OpenLiberty/open-liberty/blob/95c266d4d6aa57cf32b589e7c9d8b39888176e91/dev/fattest.simplicity/src/componenttest/topology/utils/MvnUtils.java#L161
If you have any more queries please post them...
and hope you love OpenLiberty - it rocks!
Gordon
The result you seem to be trying to achieve is an embedded runtime for Liberty using Arquillian. However, as far as I can see, the Open Liberty team only provides a remote container adapter and a managed container adapter at the moment.
We had a similar need: we wanted to run automated integration tests in environments where an Open Liberty server would not necessarily be in place. We managed to work around this using liberty-maven-plugin.
The build/testing process then works like this: when running mvn verify, liberty-maven-plugin generates the specified Open Liberty server that we want to run our tests against.
<plugin>
    <groupId>net.wasdev.wlp.maven.plugins</groupId>
    <artifactId>liberty-maven-plugin</artifactId>
    <version>${version.liberty-maven-plugin}</version> <!-- plugin version -->
    <configuration>
        <assemblyArtifact> <!-- Liberty server to run tests against -->
            <groupId>io.openliberty</groupId>
            <artifactId>openliberty-runtime</artifactId>
            <version>18.0.0.4</version>
            <type>zip</type>
        </assemblyArtifact>
        <configDirectory>src/liberty/${env}/</configDirectory>
        <configFile>src/liberty/server.xml</configFile>
        <serverName>defaultServer</serverName>
    </configuration>
    <executions>
        <execution>
            <phase>prepare-package</phase>
            <goals>
                <goal>create-server</goal>
            </goals>
        </execution>
    </executions>
</plugin>
Since liberty-maven-plugin by default installs the Liberty server under the target/ folder, arquillian.xml can simply point wlpHome there:
<arquillian xmlns="http://jboss.org/schema/arquillian">
    <container qualifier="liberty-managed" default="true">
        <configuration>
            <property name="wlpHome">target/liberty/wlp</property>
            <property name="serverName">defaultServer</property>
        </configuration>
    </container>
</arquillian>
This way we can ensure that a runnable Liberty server configured to our liking always exists, whether in our local environment or e.g. on our Jenkins CI/CD server, essentially giving us the same effect as having an embedded container.
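For completeness, a minimal sketch of how the Arquillian tests themselves would get executed during mvn verify, assuming maven-failsafe-plugin is what drives them (the plugin version is illustrative; this wiring is not shown in the answer above):
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-failsafe-plugin</artifactId>
    <version>2.22.1</version> <!-- illustrative version -->
    <executions>
        <execution>
            <goals>
                <goal>integration-test</goal>
                <goal>verify</goal>
            </goals>
        </execution>
    </executions>
</plugin>
With that in place, the liberty-managed container starts the server from target/liberty/wlp and the integration tests run against it.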
With a fresh install of Eclipse 4.9.0 (2018-09), and m2e installed in Eclipse, I downloaded Optaplanner release 7.12.0, extracted the zip, and followed the first steps of the documentation instructions toward building the examples ("Open the file examples/sources/pom.xml as a new project, the maven integration will take care of the rest.") via:
1) File / Open Projects from File System... / Show other specialized import wizards / Maven / Existing Maven Projects
2) Selected the optaplanner-distribution-7.12.0.Final / examples / sources as the root directory
(pom.xml for 7.12.0.final appeared as expected as a selectable project)
However, when selecting the pom and clicking Finish, an error occurs:
Plugin execution not covered by lifecycle configuration: org.apache.maven.plugins:maven-antrun-plugin:1.8:run (execution: create-default-i18n-resource, phase: process-resources)
This appears to be related to the ancestor pom org.kie:kie-parent:7.12.0.Final.pom, which contains:
<plugin>
    <artifactId>maven-antrun-plugin</artifactId>
    <executions>
        <execution>
            <!-- Temporary workaround for https://issues.jboss.org/browse/ERRAI-1101. Needs to stay here until
                 we find a general solution (e.g. moving all localized code to Errai TranslationService). -->
            <id>create-default-i18n-resource</id>
            <phase>process-resources</phase>
            <configuration>
                <target>
                    <copy todir="${project.build.directory}/classes"
                          includeemptydirs="false" failonerror="false" quiet="true">
                        <fileset dir="${project.build.directory}/classes"/>
                        <globmapper from="*Constants.properties" to="*Constants_default.properties"/>
                    </copy>
                </target>
            </configuration>
            <goals>
                <goal>run</goal>
            </goals>
        </execution>
    </executions>
</plugin>
However, I do not know how to resolve the error as "discover m2e connectors" reports "No marketplace entries".
My goal is to extend an example to create a reproducible example of an issue reported with Optaplanner. As this first step is an out-of-the-box build of a recent release, I'm thinking it should work - what am I missing, please?
This is a "feature" (deficiency?) of the M2Eclipse plugin. In the latest version of m2e, Antrun plugin can be ignored using lifecycle mapping stored in the Eclipse workspace. To do that, open the Problems tab, right-click the maven-antrun-plugin execution error and select Quick Fix. Then select "Mark goal run as ignored in eclipse preferences" and click Finish. The error will disappear and it's possible to work with the project.
The following video illustrates the process: https://youtu.be/TVRAtMx2XyE.
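If you would rather keep the mapping in the project instead of the workspace, the classic alternative is the org.eclipse.m2e:lifecycle-mapping pseudo-plugin inside build/pluginManagement; a sketch for the antrun execution above (m2e only interprets this section inside Eclipse, it never executes as a real plugin):
<pluginManagement>
    <plugins>
        <plugin>
            <groupId>org.eclipse.m2e</groupId>
            <artifactId>lifecycle-mapping</artifactId>
            <version>1.0.0</version>
            <configuration>
                <lifecycleMappingMetadata>
                    <pluginExecutions>
                        <pluginExecution>
                            <pluginExecutionFilter>
                                <groupId>org.apache.maven.plugins</groupId>
                                <artifactId>maven-antrun-plugin</artifactId>
                                <versionRange>[1.8,)</versionRange>
                                <goals>
                                    <goal>run</goal>
                                </goals>
                            </pluginExecutionFilter>
                            <action>
                                <!-- tell m2e to skip this execution inside Eclipse -->
                                <ignore />
                            </action>
                        </pluginExecution>
                    </pluginExecutions>
                </lifecycleMappingMetadata>
            </configuration>
        </plugin>
    </plugins>
</pluginManagement>
Note that this only helps in poms you control; for the Optaplanner example above, the workspace preference is the easier route.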
I am trying to compile an Eclipse Indigo RCP application with Maven and Tycho. It works fine if I just build it for one platform, but if I try to build it for more than one, the build stops working.
The problem is that I have platform-specific plugins in the product file I want to build: dependencies like org.eclipse.swt.win32.win32.x86, which are fragment plugins for org.eclipse.swt.
When I add no platform-specific fragments to my product, the application won't start, because platform libraries like org.eclipse.swt.win32.win32.x86 are missing.
As the Tycho repository we use a clone of the Eclipse Indigo update site hosted on our own server. It includes the delta pack.
And when I add all fragments for all platforms, the build crashes and Maven tells me that the platform filters did not match, for the Linux build for example.
Does anyone know how to fix this?
Should I add this platform-dependent stuff to my product at all? I would prefer to keep the platform-specific dependencies out of my product; am I right?
It sounds like you have a plug-in based product. In this case you will need to manually edit your .product file and add in platform filters for these plug-ins. Unfortunately the built-in product editor in eclipse does not expose these values. See http://wiki.eclipse.org/Tycho/FAQ#How_to_build_plugin-based_products_with_platform-specific_fragments.3F
For each plugin, e.g. org.eclipse.swt.win32.win32.x86, you will need to add something like:
<plugin id="org.eclipse.swt.win32.win32.x86" fragment="true" ws="win32" os="win32" arch="x86"/>
Note, if you use the product editor it will remove these values.
It is better however to use a feature based product. The feature editor permits these fields to be edited.
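For illustration, in a feature-based product the platform-specific fragment is listed in the feature.xml with the same kind of filters; roughly like this (the size and version values are the placeholders the feature editor typically generates):
<plugin
      id="org.eclipse.swt.win32.win32.x86"
      os="win32"
      ws="win32"
      arch="x86"
      download-size="0"
      install-size="0"
      version="0.0.0"
      fragment="true"
      unpack="false"/>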
There is an easier solution I found in the blog: http://blog.sdruskat.net/building-a-cross-platform-feature-based-eclipse-rcp-product-with-tycho-the-umpteenth/
In the parent/master pom.xml, to use all the plugins from p2, specify the following:
<build>
    <plugins>
        <plugin>
            <groupId>org.eclipse.tycho</groupId>
            <artifactId>tycho-maven-plugin</artifactId>
            <version>${tycho-version}</version>
            <extensions>true</extensions>
        </plugin>
        <plugin>
            <groupId>org.eclipse.tycho</groupId>
            <artifactId>target-platform-configuration</artifactId>
            <version>${tycho-version}</version>
            <configuration>
                <resolver>p2</resolver>
                <environments>
                    <environment>
                        <os>linux</os>
                        <ws>gtk</ws>
                        <arch>x86_64</arch>
                    </environment>
                    <environment>
                        <os>win32</os>
                        <ws>win32</ws>
                        <arch>x86_64</arch>
                    </environment>
                </environments>
            </configuration>
        </plugin>
    </plugins>
</build>
My Tycho version is 0.21.0.
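For completeness, the ${tycho-version} placeholder above is just a property defined once in the same pom:
<properties>
    <tycho-version>0.21.0</tycho-version>
</properties>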
I am working on an application where we need to use Java Mail functionality. We have started using Maven 3.x as our build tool.
Everything was working fine until the Java Mail API was introduced. We are using Eclipse with the M2Eclipse plugin, but most of our deployment work is done from the Maven command line.
We have introduced the following dependency in our pom.xml, and I have verified that both mail.jar and activation.jar are in their respective folders in the repository.
<dependency>
    <groupId>javax.mail</groupId>
    <artifactId>mail</artifactId>
    <version>1.4</version>
</dependency>
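If activation.jar is also meant to come from Maven rather than from the JDK or the server, the matching dependency would be roughly this (version illustrative; it does not appear in the question):
<dependency>
    <groupId>javax.activation</groupId>
    <artifactId>activation</artifactId>
    <version>1.1</version>
</dependency>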
We tried the following command
> mvn clean
and then
> mvn tomcat:deploy
Maven reports that it has successfully deployed the WAR to Tomcat, but the Tomcat console shows that the application fails to deploy. In the other, successful cases we face a strange issue: we use Hibernate for the persistence layer, and on examining the folder structure it turned out that the .hbm mapping files are missing, because of which the SessionFactory is not created and the server cannot start up.
Here is a snapshot of the plugin entries:
<plugin>
    <artifactId>maven-war-plugin</artifactId>
    <version>2.1.1</version>
    <configuration>
        <packagingExcludes>WEB-INF/web.xml</packagingExcludes>
    </configuration>
</plugin>
<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>tomcat-maven-plugin</artifactId>
    <configuration>
        <warFile>${project.build.directory}/${project.build.finalName}.war</warFile>
        <url>http://localhost:8080/manager/html</url>
        <server>localhost</server>
        <path>/blood_donor</path>
    </configuration>
</plugin>
From your procedure I see:
mvn clean - deletes the target directory with your build
mvn tomcat:deploy - should take the build (which was just deleted by mvn clean) and deploy it to Tomcat
There is no build phase in between. So use mvn clean package tomcat:deploy instead. If your application is already deployed in Tomcat, try mvn clean package tomcat:redeploy. For details check the plugin documentation.
Which Tomcat version do you use?
According to http://repo1.maven.org/maven2/org/codehaus/mojo/tomcat-maven-plugin/ you use version 1.1. The plugin documentation states that Tomcat 7 is not supported in this version; for that you must upgrade to version 2.0. See http://tomcat.apache.org/maven-plugin-2.0-SNAPSHOT/
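For reference, after the move to Apache the 2.x plugin lives under new coordinates, with one artifact per Tomcat version; a sketch for Tomcat 7 (note the manager URL changes to the text interface, and the goal becomes tomcat7:deploy):
<plugin>
    <groupId>org.apache.tomcat.maven</groupId>
    <artifactId>tomcat7-maven-plugin</artifactId>
    <version>2.0</version>
    <configuration>
        <url>http://localhost:8080/manager/text</url>
        <server>localhost</server>
        <path>/blood_donor</path>
    </configuration>
</plugin>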
I have a multi-module maven build where one of the child modules requires an extra goal to be executed as part of a release. But it looks as though any configuration of the maven-release-plugin in the child module is ignored in favour of the default configuration in the parent module.
This is the snippet from the child module. The plugin configuration is the same in the pluginManagement section of the parent pom, but without the custom element.
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-release-plugin</artifactId>
    <version>2.1</version>
    <configuration>
        <tagBase>http://mycompany.com/svn/repos/myproject/tags</tagBase>
        <goals>deploy myCustomPlugin:myCustomGoal</goals>
    </configuration>
</plugin>
So is it possible for a child module to override the parent's configuration and add extra goals?
Maven version 2.2.1
Use combine.children="append" combine.self="override"
Parent POM
<configuration>
    <items>
        <item>parent-1</item>
        <item>parent-2</item>
    </items>
    <properties>
        <parentKey>parent</parentKey>
    </properties>
</configuration>
Child POM
<configuration>
    <items combine.children="append">
        <!-- combine.children="merge" is the default -->
        <item>child-1</item>
    </items>
    <properties combine.self="override">
        <!-- combine.self="merge" is the default -->
        <childKey>child</childKey>
    </properties>
</configuration>
Result
<configuration>
    <items combine.children="append">
        <item>parent-1</item>
        <item>parent-2</item>
        <item>child-1</item>
    </items>
    <properties combine.self="override">
        <childKey>child</childKey>
    </properties>
</configuration>
See this blog for further details
Yes and no. Certainly a child pom can override the configuration of a plugin specified by its parent, and I have to assume you've done so correctly because there's nothing really hard about it. If you check the output of mvn help:effective-pom, you should be able to see plainly that this module has different settings for the release plugin.
The problem you're having is with the behavior of the release plugin. Typically, if you run a goal or phase (mvn compile, for example) from the root module of your project, it first runs that goal/phase on the root module, then on all the modules in reactor order, almost as if you'd run it in each module yourself. Any customizations you've added to child modules take effect as expected.
When you run the release plugin, it runs only at the root module. It doesn't run in any of the child modules. Instead, running it at the root module forks a new build using the same settings as the root module, which runs for all the other modules in nearly the same way, except that it uses the root module's configuration for all the modules. I don't know the exact semantics, but I believe this is analogous to you manually running the release goals in each child and specifying the configuration options as system properties at the command line: regardless of how a child module configures the release plugin, the command-line args win.
I've never dealt with this problem myself, and it's hard to say without knowing exactly what you're trying to accomplish. Perhaps if you can express what you want to do in this special module as a profile, then you could activate that profile via your goals and/or preparationGoals. Alternately, there's an arguments option to both the prepare and perform goals that you might be able to pull some tricks with.
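As a sketch of the profile idea (the profile name here is made up): keep the extra goal in the child module behind a profile, and let the forked release build activate it through the arguments option, which is passed down to every module:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-release-plugin</artifactId>
    <version>2.1</version>
    <configuration>
        <tagBase>http://mycompany.com/svn/repos/myproject/tags</tagBase>
        <!-- extra arguments for the forked build, so they reach the child modules too -->
        <arguments>-Prelease-extras</arguments>
    </configuration>
</plugin>
The child module would then bind myCustomPlugin:myCustomGoal to the deploy phase inside a profile with id release-extras, so the goal only runs during a release.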