Kotlin doesn't see Java Lombok accessors?

Using Kotlin 1.0.0 release (compiling in IntelliJ 15).
println(myPojoInstance.foo)
When it tries to compile code (in IntelliJ or Gradle) that references Lombok-based POJOs, it gives the error "Cannot access 'foo': it is 'private' in 'MyPojo'". Which is true: they're all private, and my object carries the @Value and @Builder Lombok annotations.
I've tried specifically calling getFoo(), but it says "unresolved reference for getFoo". Is there perhaps some trick to make Kotlin aware of how to handle the Lombok annotations?
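For illustration, the situation looks roughly like this (MyPojo here is a stand-in for the actual Lombok class):
// Java side, with Lombok generating the accessors at compile time:
// @Value
// @Builder
// public class MyPojo { String foo; }

// Kotlin side:
fun printFoo(myPojoInstance: MyPojo) {
    println(myPojoInstance.foo)      // error: Cannot access 'foo': it is 'private' in 'MyPojo'
    println(myPojoInstance.getFoo()) // error: unresolved reference: getFoo
}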

Generally, no, it doesn't. The reason for this behavior is that Lombok is an annotation processor for javac, but when the Kotlin compiler runs it uses javac as well, only with no annotation processing, which is why Kotlin doesn't see declarations that haven't been generated yet.
The only workaround for now is to define a strict compilation order: Java first, and Kotlin after that. Unfortunately this approach has a big disadvantage: you can't use Kotlin code from Java in this case. To work around that you may need a multi-module project, which may cause a lot of pain.

There is a Kotlin compiler plugin for Lombok. It is still experimental, and it can be used with Gradle or Maven.
It only supports a handful of annotations, including:
@Getter, @Setter
@NoArgsConstructor, @RequiredArgsConstructor, and @AllArgsConstructor
@Data
@With
@Value
These seem to work as expected. Unfortunately, the plugin does not support the @Builder annotation, but you can request that it be added in YouTrack.
See the Lombok compiler plugin page in the Kotlin documentation for more information.
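As a rough illustration with the Gradle Kotlin DSL (the version numbers are only examples, and the io.freefair.lombok plugin is just one common way to put Lombok itself on the Java classpath):
// build.gradle.kts
plugins {
    kotlin("jvm") version "1.7.20"            // example Kotlin version
    kotlin("plugin.lombok") version "1.7.20"  // the experimental Kotlin Lombok compiler plugin
    id("io.freefair.lombok") version "6.5.1"  // adds Lombok and its annotation processing for the Java sources
}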
Update 1
The @Builder annotation ticket mentioned above has been fixed! The target version for the fix is 1.8.0-Beta.
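Once that lands, calling a Lombok @Builder from Kotlin should presumably look just like the Java usage (MyPojo as in the question):
val pojo = MyPojo.builder()
    .foo("bar")
    .build()
println(pojo.foo)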

As mentioned in the comments above, delombok helps.
In the case of a Maven build it would be:
<plugin>
  <groupId>org.projectlombok</groupId>
  <artifactId>lombok-maven-plugin</artifactId>
  <version>${lombok.version}.0</version>
  <executions>
    <execution>
      <id>delombok</id>
      <phase>generate-sources</phase>
      <goals>
        <goal>delombok</goal>
      </goals>
      <configuration>
        <formatPreferences>
          <javaLangAsFQN>skip</javaLangAsFQN>
        </formatPreferences>
        <verbose>true</verbose>
      </configuration>
    </execution>
    <execution>
      <id>test-delombok</id>
      <phase>generate-test-sources</phase>
      <goals>
        <goal>testDelombok</goal>
      </goals>
      <configuration>
        <verbose>true</verbose>
      </configuration>
    </execution>
  </executions>
</plugin>

To add to Sergey Mashkov's response (adding it here as I don't have enough rep points to comment), here's an example app with a Gradle multi-project setup where Kotlin can see the Lombok-generated code (without kapt or delomboking). Caveats do apply - namely, Kotlin can call the Java code, but Java can't call the Kotlin code in that particular module (as this would create a circular dependency). This kind of build might be suitable if you have an existing Java codebase and all new code is written in Kotlin, though.
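A minimal sketch of such a layout, assuming a pure-Java module named java-lib holding the Lombok POJOs and a Kotlin module named kotlin-app consuming it (both names are made up), using the Gradle Kotlin DSL:
// settings.gradle.kts
include(":java-lib", ":kotlin-app")

// java-lib/build.gradle.kts: a plain Java module with the Lombok dependency and annotation processor.

// kotlin-app/build.gradle.kts: the Kotlin module depends on the already-compiled Java module,
// so the Lombok-generated accessors exist in the class files by the time Kotlin compiles.
dependencies {
    implementation(project(":java-lib"))
}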
I would love to see full Lombok/Kotlin support, however. While Kotlin is fully interoperable with Java, the reality is that Lombok is very widely used, and this issue may prevent a large number of developers who would like to switch to Kotlin from doing so.

It looks like it works if you use delombok as described on the Lombok site and add the target/generated-sources/delombok folder as a source directory in the pom.xml under build > plugins > plugin > kotlin-maven-plugin.

Related

Spring scanner not detecting component

I'm trying to write an event listener plugin for Jira. When I go the old way (which the latest Atlassian SDK 6.2.9 does) and put these two lines
<component key="eventListener" class="jira.plugins.listeners.MyEventListener"/>
<component-import key="eventPublisher" class="com.atlassian.event.api.EventPublisher"/>
and try to package the plugin, I get a warning saying that I cannot use component/component-import statements inside the plugin descriptor file when the Atlassian plugin key is set. The latest SDK uses Spring Scanner, which is added to the pom.xml file automatically during skeleton creation and which the documentation strongly recommends. So I remove those two lines from the atlassian-plugin.xml file and try to substitute them with the corresponding annotations:
@Component
public class MyEventListener {
    @Inject
    public MyEventListener(@ComponentImport EventPublisher eventPublisher) {
        eventPublisher.register(this);
    }
}
I can compile and package it this way, but when I install it on a running Jira instance, the description of the plugin says This plugin has no modules. I've tried changing @Component to @Named and adding @ExportAsService to the class, all to no avail. It seems Spring Scanner does not detect my class as a component. Has anyone been able to overcome this issue? I've written to the Atlassian community but haven't gotten any news so far.
Configure the Spring Scanner Maven plugin to execute in verbose mode, and make sure that your class is processed during the build using the inclusion patterns.
<plugin>
  <groupId>com.atlassian.plugin</groupId>
  <artifactId>atlassian-spring-scanner-maven-plugin</artifactId>
  <executions>
    <execution>
      <goals>
        <goal>atlassian-spring-scanner</goal>
      </goals>
      <phase>process-classes</phase>
    </execution>
  </executions>
  <configuration>
    <includeExclude>+your.package.goes.here.*</includeExclude>
    <verbose>true</verbose>
  </configuration>
</plugin>
If everything is fine, after the build your component will be listed in the file target/classes/META-INF/plugin-components/component.
In case the @Component is defined in a library module (as a dependency of the hosting plugin), you can also generate the component metadata using this configuration element:
<scannedDependencies>
  <dependency>
    <groupId>your.library.group.id</groupId>
    <artifactId>your-library</artifactId>
  </dependency>
</scannedDependencies>
Note: there is a difference between the V1 and V2 Spring Scanner; make sure you use the right version. See the reference.

Detect and weave dependencies automatically with AspectJ

We have a Maven project with multiple compile dependencies, and every time a new <dependency> is added, we need to create an equivalent <weaveDependency> entry in:
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>aspectj-maven-plugin</artifactId>
  <version>1.7</version>
  <configuration>
    <weaveDependencies>
      <weaveDependency>
        <groupId>a-group</groupId>
        <artifactId>new-dependency</artifactId>
      </weaveDependency>
    </weaveDependencies>
    <weaveDirectories>
      <weaveDirectory>${project.build.directory}/classes/</weaveDirectory>
    </weaveDirectories>
    <complianceLevel>${java.version}</complianceLevel>
    <showWeaveInfo>true</showWeaveInfo>
    <source>${java.version}</source>
    <target>${java.version}</target>
  </configuration>
  <executions>
    <execution>
      <goals>
        <goal>compile</goal>
      </goals>
    </execution>
  </executions>
</plugin>
This is being done exactly as described in
http://mojo.codehaus.org/aspectj-maven-plugin/examples/weaveJars.html
But this could easily lead to problems if everything needs to be woven, because someone could forget to add the <weaveDependency> after adding a new <dependency>. So is there a way of detecting and weaving all compile dependencies automatically? Maybe with another plugin?
AFAIK there is no such option or plugin, unless you decide to write one or open a ticket for AspectJ Maven.
One question before we continue: Are you really sure you want to weave all dependencies? What about libraries such as JUnit or Log4J in your aspect POM?
The way I usually go about weaving my aspects into the code - if they are production and not just development, debugging or profiling aspects, that is - is that I do it the other way around from you: I use the AspectJ Maven Plugin in each of my modules to directly compile the aspects into my code from source. So in my case each Java module depends on an aspect module, using it as an aspect library. Because I usually have far fewer aspect libs than Java modules, I cannot so easily forget to include them. Okay, I have to do it in each module, but this is a no-brainer with a good IDE (global search and replace on all pom.xml files in my project).
If you really want to do it even more cleanly and nicely, you can use the approach explained in Strategies for using AspectJ in a Maven multi-module reactor, i.e. you create a normal root POM and an aspect root POM which has the root POM as its parent. Then each Java module which needs the aspects can use the aspect root POM as its parent, and other Java modules use the root POM directly.
The advantage of compiling the aspects into your artifacts right away is that there are no two class-file versions of each artifact: an original without aspects and a woven version with aspects in the aspect module's target directory. The only reason why you would not do it the way I explained is if for some reason you also need artifact versions without aspect code. But then, as I said, you probably use development, debugging or profiling aspects. Be that as it may, you can still use my approach for production aspects and your old approach for the development stuff.

Maven2 & Groovy compilation error but not within Eclipse

Hey,
I have a mixed Java/Groovy Eclipse project.
Inside Eclipse, using the Groovy plugin, everything compiles just fine. In addition I have set up my project to use Maven2, and still everything compiles and runs (tests) just fine within Eclipse.
However, compiling the project outside Eclipse, i.e. using Maven2 standalone, gives me compiler errors! The project is divided into several sub-projects (parent / module). The Maven2 configuration seems to be OK because some of the modules compile, but one actually gives me a compiler error, like:
[ERROR] \Projects\X\rules\src\main\groovy\x\Normalizer.java:[18,25] normalize(java.util.List<java.util.Map<java.lang.String,java.lang.Object>>) in x.
x.util.RuleUtil cannot be applied to (java.util.List<java.util.Map<java.lang.String,?>>)
[ERROR] \Projects\X\rules\src\main\groovy\x\Statistics.java:[70,67] inconvertible types
found : capture#683 of ?
required: java.lang.String
Why is this code compiling within Eclipse but not using standalone Maven2?
Thanks in advance,
/nm
The problem that you are facing is a stub generation problem. GMaven creates Java stubs for your Groovy files so that the remaining Java files can be compiled against them. If your application is completely in Groovy, or there is no referencing from Java classes to Groovy classes, you can remove the <goal>generateStubs</goal> goal.
The Groovy-Eclipse compiler does not require stubs and so you are not seeing this issue inside of Eclipse.
If you do require cross-referencing between Groovy and Java, I'd recommend using the groovy-eclipse-compiler plugin for Maven. More information is here:
http://contraptionsforprogramming.blogspot.com/2010/09/where-are-all-my-stubs.html
With this, you will be sure that your compilation inside Eclipse and outside works exactly the same.
The Groovy Eclipse plugin uses the version of Groovy present within the plugin folder of Eclipse (groovy-1.7.5).
Most probably, the version of Groovy referenced in your Maven file is different. You can, though, specify it in the gmaven-runtime:
<plugin>
  <groupId>org.codehaus.gmaven</groupId>
  <artifactId>gmaven-plugin</artifactId>
  <version>1.3</version>
  <configuration>
    <providerSelection>1.7</providerSelection>
  </configuration>
  <dependencies>
    <dependency>
      <groupId>org.codehaus.gmaven.runtime</groupId>
      <artifactId>gmaven-runtime-1.7</artifactId>
      <version>1.3</version>
      <exclusions>
        <exclusion>
          <groupId>org.codehaus.groovy</groupId>
          <artifactId>groovy-all</artifactId>
        </exclusion>
      </exclusions>
    </dependency>
    <dependency>
      <groupId>org.codehaus.groovy</groupId>
      <artifactId>groovy-all</artifactId>
      <version>1.7.5</version>
    </dependency>
  </dependencies>
  <executions>
    <execution>
      <goals>
        <goal>generateStubs</goal>
        <goal>compile</goal>
        <!-- <goal>generateTestStubs</goal> -->
        <goal>testCompile</goal>
      </goals>
    </execution>
  </executions>
</plugin>

How to use javadoc:aggregate properly in multi-module maven project?

This is my parent pom.xml file (part of it) in a multi-module project:
...
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-javadoc-plugin</artifactId>
    </plugin>
  </plugins>
</build>
...
This plugin is inherited in all sub-modules, and I don't think that's the correct approach. When I run mvn javadoc:aggregate the documentation is generated in target/site/apidoc, but the log is full of warnings:
...
[WARNING] Removing: aggregate from forked lifecycle, to prevent recursive invocation.
...
What am I doing wrong?
You need to enable aggregation for this plugin:
<plugin>
  <artifactId>maven-javadoc-plugin</artifactId>
  <configuration>
    <aggregate>true</aggregate> <!-- this enables aggregation -->
  </configuration>
</plugin>
On the command line, type:
mvn javadoc:aggregate
Edit:
Okay, I did some digging into the Maven plugin's JIRA and found that all the javadoc plugin mojos have been annotated with @aggregator. But there seem to be problems with Maven's aggregator; the issue for it has been filed here.
There are also related bugs here and here and some more.
This seems to be a blocker issue with Maven's aggregator, since some plugins, e.g. Clover, won't run.
So to summarize, you are doing nothing wrong.
Just switch back to an earlier version of maven-javadoc-plugin that does not use the @aggregator mojo annotation and you will not get the warnings (unless you are using a certain feature of the javadoc plugin that's not available in the earlier version).
On a side note, if you run the javadoc plugin as a report then @aggregator is ignored.

Compile scala classes with debug info through Maven

I have a Scala project that I compile with Maven and the maven-scala-plugin. I need to include debug information in the compiled classes, and I was wondering whether there is a way to ask Maven or the Scala plugin to do this. I found this page that makes it sound possible, but it's not clear where to put the params in the pom.xml.
If possible I'd like this option to be something specified in the pom.xml rather than on the command line.
Compiling .class files with debugging information needs to be done at the maven-scala-plugin level. Doing it at the maven-compiler-plugin level - which is, by the way, the default, as we can see in the documentation of the debug option that defaults to true - is useless, as that plugin is not compiling your Scala sources.
Now, if we look at the scalac man page, the scalac compiler has a -g option that can take the following values:
"none" generates no debugging info,
"source" generates only the source file attribute,
"line" generates source and line number information,
"vars" generates source, line number and local variable information,
"notc" generates all of the above and will not perform tail call optimization.
The good news is that scala:compile has a nice optional args parameter that can be used to pass additional compiler arguments. So, to use it and pass the -g option to the Scala compiler, you just need to configure the Maven plugin as follows:
<plugin>
  <groupId>org.scala-tools</groupId>
  <artifactId>maven-scala-plugin</artifactId>
  <version>2.9.1</version>
  <executions>
    <execution>
      <goals>
        <goal>compile</goal>
        <goal>testCompile</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <args>
      <arg>-g:notc</arg>
    </args>
    ...
  </configuration>
</plugin>
I'm skipping other parts of the configuration (such as repositories, pluginRepositories, etc.) as that is not what you're asking for :)