I have an IDEA project that uses auto-generated JAXB classes from .xsd files. I have “client” and “server” modules that include a “common” module that contains, among other things, the JAXB classes.
I do not want to keep generated code under source control, but if the generated Java classes do not exist, the “client” and “server” modules do not compile. How can I make IntelliJ run JAXB automatically before building?
There is no direct way to do it with IntelliJ IDEA alone; you will need to use Ant, Maven, or some other external process to perform the code generation.
Check out jaxb2-maven-plugin.
In IntelliJ IDEA you can execute Maven or Ant before compilation.
In the build system's tool window you can bind a phase or a plugin goal to IDEA's build process.
For example, the jaxb2-maven-plugin goal can be set to run Before Build or Before Rebuild by right-clicking the goal in the tool window.
Another option is to bind the goal to a lifecycle phase and execute that phase (for example generate-sources) before the build. In the case of the jaxb2-maven-plugin, the xjc goal is bound to Maven's generate-sources phase by default.
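For reference, a minimal pom.xml sketch for the “common” module might look like this (the plugin version and schema directory are assumptions; adjust them to your project):

<build>
  <plugins>
    <plugin>
      <groupId>org.codehaus.mojo</groupId>
      <artifactId>jaxb2-maven-plugin</artifactId>
      <version>2.5.0</version>
      <executions>
        <execution>
          <id>xjc</id>
          <goals>
            <!-- xjc is bound to the generate-sources phase by default -->
            <goal>xjc</goal>
          </goals>
        </execution>
      </executions>
      <configuration>
        <!-- assumed location of the .xsd files in the common module -->
        <sources>
          <source>src/main/xsd</source>
        </sources>
      </configuration>
    </plugin>
  </plugins>
</build>

With this in place, binding the xjc goal (or the generate-sources phase) to Before Build regenerates the JAXB classes whenever IDEA compiles the modules.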
For almost every Java project I have, I define a new Gradle task to build a jar with the javadocs. Specifically, I add the following to almost every build.gradle:
task jarJavadoc(type: Jar, dependsOn: ['javadoc']) {
classifier = 'javadoc'
from javadoc.destinationDir
}
artifacts {
archives jarJavadoc
}
Is there a way to configure IntelliJ so that it automatically adds these lines to every new Gradle Java project?
I think you could explore a couple of options:
The first is creating a Gradle init script (e.g. init.gradle) in your GRADLE_HOME directory (e.g. ~/.gradle/) and defining the common parts there. Gradle always applies those files first when processing your build scripts. Note that everything you configure there is going to be available in every Gradle project on your machine, which means that if you depend on the Java plugin (like you do in the example you provided) and you create another project which doesn't use Java, this approach may produce configuration errors, so use it with caution. (A minimal sketch of this option follows the other options below.)
Second, you could write a simple Gradle plugin which adds the common tasks you require to a project. With this approach you will still need to duplicate the apply plugin: 'your plugin' line in each build script.
Third, you could leverage File and Code Templates and update the Gradle build script template to include the common code.
Finally, you can also mix the last two options and write a plugin which configures the common tasks, then modify the Gradle build script template to include your plugin.
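To illustrate the init script option, here is a minimal sketch; the guard on the Java plugin and the use of tasks.create instead of the task shorthand are assumptions about what works reliably in an init script:

// ~/.gradle/init.gradle
import org.gradle.api.tasks.bundling.Jar

allprojects {
    // only configure projects that actually apply the Java plugin
    plugins.withId('java') {
        def jarJavadoc = tasks.create('jarJavadoc', Jar) {
            classifier = 'javadoc'
            from javadoc.destinationDir
        }
        artifacts {
            archives jarJavadoc
        }
    }
}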
You could apply the nebula.javadoc-jar plugin.
For example:
plugins {
id 'nebula.javadoc-jar' version '5.1.0'
}
I am thinking of using Gradle to manipulate a MySQL database. It will read some files from the filesystem, analyse them, and populate the database accordingly.
Such a project will not produce any project code, because all output goes to database tables. On the other hand, the Gradle script should have access to some custom Java or Groovy classes to facilitate working with the source data.
Is this a possible Gradle usage? Where do I put the Gradle-accessible classes then? I don't want a separate project producing a JAR for this project. I want a single project, so that Gradle first compiles the classes and then utilizes them in the script.
Is this possible?
Gradle is extensible, so you can utilize buildSrc for such scenarios. It works in the following way:
alongside build.gradle in the project there is a buildSrc dir with its own build.gradle
in buildSrc/build.gradle you can define the build script's own dependencies and implement plugins and tasks
finally you can apply a plugin from buildSrc to build.gradle.
It's quite handy since, for instance, IntelliJ can import such a project and provide code completion.
Another way is to put all the necessary stuff in build.gradle itself.
Such a buildSrc project can be compiled to a jar, published and provided as a plugin, or it can be a separate project on GitHub to be downloaded and used to manipulate the data. Also, there is no need to implement Plugin; you can simply use static methods. Have a look at the demo.
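A minimal sketch of the buildSrc approach, assuming a hypothetical helper class and data layout (by default Gradle compiles buildSrc/src/main/groovy with no extra configuration):

// buildSrc/src/main/groovy/DataLoader.groovy
class DataLoader {
    static void load(File source) {
        // parse the file and write its rows to the MySQL database here
        println "Loading ${source.name}"
    }
}

// build.gradle
task populateDatabase {
    doLast {
        // classes from buildSrc are on the build script classpath automatically
        fileTree(dir: 'data', include: '**/*.csv').each { file ->
            DataLoader.load(file)
        }
    }
}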
I'm using IntelliJ and I want my unit tests to run or be debugged with static weaving for lazy loading and the like. I know that, unlike Eclipse, IntelliJ does not have a static weaving step, but I imagine someone must have set up IntelliJ to statically weave before running or debugging tests.
So far I have tried dynamic weaving with the JVM argument of -javaagent:./path/eclipselink-2.5.0.jar but that doesn't seem to work. I still get these warnings:
[EL Warning]: metadata: 2013-08-28 11:00:51.091--ServerSession(1610028911)--Reverting the lazy setting on the OneToOne or ManyToOne attribute [owner] for the entity class [class com.my.Contact] since weaving was not enabled or did not occur.
Do my IntelliJ brothers and sisters just punt on this and skip weaving in the IDE? Do we just not use EclipseLink, or have we figured out how to handle static weaving and still use IntelliJ?
Thanks!
You can create additional build steps before launching a run configuration.
Run > Edit Configurations…
Select the desired run configuration
Add the static weaver as an additional build step under Before Launch:
For example, add a Maven goal like eclipselink:weave
Or execute a java process like java org.eclipse.persistence.tools.weaving.jpa.StaticWeave…
The position should be between Build and Build artifact.
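For the plain java option, the invocation looks roughly like this; the paths are placeholders and the exact arguments should be checked against the EclipseLink StaticWeave documentation:

java -cp eclipselink.jar:target/classes \
  org.eclipse.persistence.tools.weaving.jpa.StaticWeave \
  -loglevel FINE \
  -persistenceinfo src/main/resources \
  target/classes target/woven-classes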
Another solution is to define the Maven goal as a hook for After Build in the Maven tool window. Just right-click the appropriate Maven goal and select Execute After Build. This will execute the EclipseLink weaver via Maven every time Build is run. You will see the hook in parentheses after the Maven goal.
IntelliJ IDEA can run additional targets as part of building the application.
If you have a Maven project with a static weaving plugin configured, you can add the Maven process-classes phase as such a goal so that static weaving is performed automatically whenever IDEA builds the project.
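For illustration, a sketch of such a configuration using a third-party EclipseLink weaving plugin; the plugin coordinates, goal name, and version are assumptions to verify against the plugin you actually use:

<!-- pom.xml excerpt: binds weaving to the process-classes phase -->
<plugin>
  <groupId>com.ethlo.persistence.tools</groupId>
  <artifactId>eclipselink-maven-plugin</artifactId>
  <!-- pin a version that matches your EclipseLink release -->
  <executions>
    <execution>
      <phase>process-classes</phase>
      <goals>
        <goal>weave</goal>
      </goals>
    </execution>
  </executions>
</plugin>

With this in place, the process-classes phase can be hooked into the IDE build as described in the previous answer.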
Is it possible to specify the classes location like you can in the FindBugs Ant task?
Or is there another way to exclude a directory of class files?
(We compile our test classes to a different directory and don't want to run FindBugs on those.)
P.V. Goddijn
After looking through the source of the Eclipse FindBugs plug-in, I found it is currently impossible to do this (without modifying the FindBugs plug-in).
The plug-in either does a FindBugs run over a single class file after a change has been made, or a complete run over all the classes defined by the Eclipse project.
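For reference, the classes location the question refers to is set with a nested class element in the FindBugs Ant task; a minimal sketch, with placeholder paths and properties:

<findbugs home="${findbugs.home}" output="xml" outputFile="findbugs-report.xml">
  <!-- analyse only the main classes, not the separately compiled test classes -->
  <class location="build/classes/main" />
  <auxClasspath path="${lib.dir}" />
  <sourcePath path="src/java" />
</findbugs>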
I'm trying to create a new plugin to package my latest project, and I want this plugin to depend on the maven-dependency-plugin to copy all my project's dependencies.
I've added this dependency to my plugin's pom, but I can't get it to execute.
I have this annotation in my plugin's main Mojo:
@execute goal="org.apache.maven.plugins:maven-dependency-plugin:copy"
I've tried a few other names for the goal, like dependency:copy and just copy, but they all end with a message saying that the required goal was not found in my plugin. What am I doing wrong?
Secondary to this, where do I provide configuration info for the dependency plugin?
Use the Maven Mojo Executor by Don Brown of Atlassian fame to run any other arbitrary plugin.
The Mojo Executor provides a way to execute other Mojos (plugins) within a Maven 2 plugin, allowing you to easily create Maven 2 plugins that are composed of other plugins.
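A sketch following the Mojo Executor usage pattern; the output directory and the injected mavenProject, mavenSession and pluginManager parameters are assumptions about your Mojo, and the configuration(...) block is where the dependency plugin's configuration goes (note that copy-dependencies, rather than copy, is the goal that copies the project's dependencies):

import static org.twdata.maven.mojoexecutor.MojoExecutor.*;

// inside your Mojo's execute() method
executeMojo(
    plugin(
        groupId("org.apache.maven.plugins"),
        artifactId("maven-dependency-plugin"),
        version("2.0")
    ),
    goal("copy-dependencies"),
    configuration(
        element(name("outputDirectory"), "${project.build.directory}/dependencies")
    ),
    executionEnvironment(
        mavenProject,
        mavenSession,
        pluginManager
    )
);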
Have you tried creating your own packaging type? Then you can define your own lifecycle mapping, i.e. bind goals to phases. In this case you can bind the dependency:copy-dependencies goal to your packaging's package phase, and you don't have to wrap the goal in your own Mojo.
See also: How do I create a new packaging type for Maven?
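A sketch of what such a lifecycle mapping could look like in the packaging plugin's META-INF/plexus/components.xml; the packaging name and the set of phase bindings are assumptions, and the linked question covers the full setup:

<component-set>
  <components>
    <component>
      <role>org.apache.maven.lifecycle.mapping.LifecycleMapping</role>
      <!-- role-hint is the name projects put in their <packaging> element -->
      <role-hint>my-packaging</role-hint>
      <implementation>org.apache.maven.lifecycle.mapping.DefaultLifecycleMapping</implementation>
      <configuration>
        <phases>
          <compile>org.apache.maven.plugins:maven-compiler-plugin:compile</compile>
          <package>org.apache.maven.plugins:maven-dependency-plugin:copy-dependencies</package>
        </phases>
      </configuration>
    </component>
  </components>
</component-set>

Projects using the new packaging then declare the packaging plugin with extensions set to true.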