In the docs there are three values for aptMode.
Is there any detailed information about these values?
What is the meaning of "stubs"?
See https://blog.jetbrains.com/kotlin/2015/06/better-annotation-processing-supporting-stubs-in-kapt/ (stubs are described in the second paragraph, but the first one provides context):
The initial version of kapt worked by intercepting communication between annotation processors (e.g. Dagger 2) and javac, and added already-compiled Kotlin classes on top of the Java classes that javac saw itself in the sources. The problem with this approach was that, since Kotlin classes had to be already compiled, there was no way for them to refer to any code generated by the processor (e.g. Dagger’s module classes). Thus we had to write Dagger application classes in Java.
As discussed in the previous blog post, the problem can be overcome by generating stubs of Kotlin classes before running javac and then running real compilation after javac has finished. Stubs contain only declarations and no bodies of methods. The Kotlin compiler used to create such stubs in memory anyways (they are used for Java interop, when Java code refers back to Kotlin), so all we had to do was serialize them to files on disk.
And also this answer.
But now stubs are generated by default; you can explicitly disable this generation by using aptMode=apt, or generate only stubs by using aptMode=stubs. I think these modes are primarily for internal use by build systems (e.g. Gradle), as described in https://www.bountysource.com/issues/38443087-support-for-kapt-for-improved-kotlin-support (a sketch of how the option is passed to the compiler follows the list of steps below):
There are 4 steps.
kaptGenerateStubsKotlin:
run kotlinc with plugin:org.jetbrains.kotlin.kapt3:aptMode=stubs
kaptKotlin:
run kotlinc with plugin:org.jetbrains.kotlin.kapt3:aptMode=apt
compileKotlin:
run kotlinc regularly
compileJava:
run javac with -proc:none and pass the generated sources from step 2.
These steps are slightly different with each minor version of Kotlin, so this will be interesting.
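Normally the Kotlin Gradle plugin wires these options up itself in the tasks above, so you rarely set them by hand. As a rough sketch only (assuming the kapt compiler plugin is already on the compiler plugin classpath), the aptMode value is just a -P plugin argument to kotlinc, and could be forced through free compiler args in a build.gradle.kts:

tasks.withType<org.jetbrains.kotlin.gradle.tasks.KotlinCompile>().configureEach {
    kotlinOptions {
        // Force stub-only generation; the plugin id and option name are taken
        // from the task listing above. Use aptMode=apt to run processors
        // without generating stubs.
        freeCompilerArgs = freeCompilerArgs + listOf(
            "-P", "plugin:org.jetbrains.kotlin.kapt3:aptMode=stubs"
        )
    }
}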
Currently I am studying KSP (Kotlin Symbol Processing), and I am curious what "Symbol" means in KSP.
When comparing it with KAPT, the documentation says: "To run Java annotation processors unmodified, KAPT compiles Kotlin code into Java stubs that retain information that Java annotation processors care about. To create these stubs, KAPT needs to resolve all symbols in the Kotlin program."
I don't know what "all symbols in the Kotlin program" exactly means.
I understand "symbols" as declarations of interfaces, classes, functions, properties, etc. They don't include the body or the code itself, only the API: the items that are visible to others.
This term is not specific to Kotlin. I can't find any definition of "symbols" on Wikipedia, but for example native libraries also contain symbol tables.
In this specific context it means that KAPT has to create a full list of all such symbols in Kotlin code and generate their equivalents in Java, so that annotation processors can work on them. This is pretty wasteful, as we recreate the Kotlin code structure in Java just to throw it away seconds later and replace it with the true compiled code.
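As a rough illustration (the class and annotation names below are made up), everything a processor can see is a declaration; the bodies never make it into the stubs:

// Hypothetical example of which parts of a Kotlin file are "symbols".
@Deprecated("example")                 // annotation attached to a symbol
interface Repository {                 // symbol: interface declaration
    fun findById(id: Long): String     // symbol: function signature only
}

class UserRepository : Repository {    // symbol: class declaration and its supertype
    val cacheSize: Int = 16            // symbol: property declaration (the initializer is dropped)
    override fun findById(id: Long): String {
        return "user-$id"              // not a symbol: bodies are invisible to annotation processors
    }
}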
When we use coroutines, we can use either the normal Kotlin coroutines artifact or the native-mt version.
i.e.
implementation 'org.jetbrains.kotlinx:kotlinx-coroutines-core:1.5.0'
or
implementation 'org.jetbrains.kotlinx:kotlinx-coroutines-core:1.5.0-native-mt'
Is there any difference between them? When should we use which?
It basically provides the capability to use multiple threads in Kotlin/Native code (typically as part of a Kotlin Multiplatform (KMP) project). Some more info at https://kotlinlang.org/docs/mobile/concurrency-and-coroutines.html#multithreaded-coroutines. This is also the version now used by many KMP libraries (e.g. Ktor) and is generally a requirement when developing KMP apps.
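As a sketch (assuming a typical multiplatform source-set layout in build.gradle.kts), the only build change is which artifact the common source set depends on:

kotlin {
    sourceSets {
        val commonMain by getting {
            dependencies {
                // The -native-mt build is what allows coroutines to cross threads
                // on Kotlin/Native targets (see the link above).
                implementation("org.jetbrains.kotlinx:kotlinx-coroutines-core:1.5.0-native-mt")
            }
        }
    }
}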
This is a follow-up question to this answer.
But when the application hasn’t used lambda expressions before¹, even the framework for generating the lambda classes has to be loaded (Oracle’s current implementation uses ASM under the hood). This is the actual cause of the slowdown, loading and initialization of a dozen internally used classes, not the lambda expression itself.
OK, Java uses ASM to generate the classes at runtime. I found this, and if I understood correctly, it basically says that Kotlin lambdas are compiled to pre-existing anonymous classes that are loaded at runtime (instead of generated at runtime).
If I'm correct, Kotlin lambdas aren't the same thing as Java's and shouldn't have the same performance impact. Can someone confirm?
Of course, Kotlin has built-in support for inlining lambdas, where Java doesn't. So many lambdas in Kotlin code don't correspond to any objects at runtime at all.
But for those that can't be inlined, yes, according to https://medium.com/@christian.c.carroll/exploring-kotlin-lambda-bytecode-8c2d15afd490 the anonymous class translation always seems to be used. Unfortunately the post doesn't specify the Kotlin version (1.3.30 was the latest available at that time).
I would also consider this an implementation detail that could change depending on the Kotlin version, at least when jvmTarget is set to "1.8" or greater; so there is no substitute for actually checking your own bytecode.
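A minimal sketch of the two cases (the function names are made up): the inline one leaves no lambda object behind, while the non-inline one has to materialize the lambda as an object in some form.

// Inlined: the compiler copies the lambda body into each call site,
// so no class and no object exist for it at runtime.
inline fun timed(block: () -> Unit) {
    val start = System.nanoTime()
    block()
    println("took ${System.nanoTime() - start} ns")
}

// Not inlined: the lambda must become an object (an anonymous class, or an
// invokedynamic-produced one, depending on compiler version and jvmTarget).
fun deferred(block: () -> Unit): Runnable = Runnable { block() }

fun main() {
    timed { println("runs inline") }
    deferred { println("runs via a lambda object") }.run()
}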
Can I write some function like kotlin("jvm") in my custom plugin?
plugins {
    java
    kotlin("jvm") version "1.3.71"
}
I want to write a function myplugin("foo") in my custom plugin and then use it like this:
plugins {
    java
    kotlin("jvm") version "1.3.71"
    custom.plugin
    myplugin("foo")
}
How can I do it?
I think the plugins block is some kind of macro expression. It is parsed and precompiled using a very limited context. Probably, the magic happens somewhere in kotlin-dsl. This is probably the only way to get static accessors and extension functions from plugins to work in Kotlin. I've never seen a mention of this process in Gradle's documentation, but let me explain my reasoning. Probably some smart guys from Gradle will correct me.
Let's take a look at some third-party plugin, like Liquibase. It allows you to write something like this in your build.gradle.kts:
liquibase {
    activities {
        register("name") {
            // Configure the activity here
        }
    }
}
Think about it: in a statically compiled language like Kotlin, in order for this syntax to work, there should be an extension named liquibase on the Project type (as it is the type of this object in every build.gradle.kts) available in the classpath of the Gradle VM that executes the build script.
Indeed, if you click on it, you'll see something like:
fun org.gradle.api.Project.`liquibase`(configure: org.liquibase.gradle.LiquibaseExtension.() -> Unit): Unit =
    (this as org.gradle.api.plugins.ExtensionAware).extensions.configure("liquibase", configure)
But take a look at the file where it is defined. In my case it is ~/.gradle/caches/6.3/gradle-kotlin-dsl-accessors/cmljl3ridzazieb8fzn553oa8/cache/src/org/gradle/kotlin/dsl/Accessors39qcxru7gldpadn6lvh8lqs7b.kt. It is definitely an auto-generated file. A few levels up in the file tree, at ~/.gradle/caches/6.3/gradle-kotlin-dsl-accessors/ in my case, there are dozens of similar directories. I guess one for every plugin/version I've ever used with Gradle 6.3. Here is another one, for the Detekt plugin:
fun org.gradle.api.Project.`detekt`(configure: io.gitlab.arturbosch.detekt.extensions.DetektExtension.() -> Unit): Unit =
    (this as org.gradle.api.plugins.ExtensionAware).extensions.configure("detekt", configure)
So, we have a bunch of .kt files defining all those extensions for the different plugins applied to the project. Those files are obviously pre-generated and precompiled, and their content is available in build.gradle.kts. Indeed, you can find classes directories beside those sources.
The sources are generated based on the content of the applied plugins. It is probably a tricky task that involves some magic, reflection and introspection. Sometimes this magic doesn't work (due to Groovy's chaotic nature), and then you need to use some crappy DSL from this package.
How are they generated? I see no other way but to:
Parse the build.gradle.kts with an embedded Kotlin compiler / lexer
Extract all the plugins sections
Compile them, probably against some mocks (remember that Project is not yet available: we're not executing the build.gradle.kts itself yet!)
Resolve the declared plugins from the Gradle Plugin repository (with some nuances coming from settings.gradle.kts)
Introspect plugin's artifacts
Generate the sources
Compile the sources
Add the resulting classes to the script's classpath
And here is the gotcha: there is a very limited context (classpath, classes, methods, call it whatever) available when compiling the plugins block. Actually, no plugins are applied yet! Because, you know, you're parsing the block that applies plugins. Chickens, eggs, and their problems, huh…
So, and we're getting closer to the answer to your question, to provide a custom DSL in the plugins block, you need to modify that classpath. It's not the classpath of your build.gradle.kts, it's the classpath of the VM that parses build.gradle.kts. Basically, it's Gradle's own classpath: all the classes bundled in a Gradle distribution.
So, probably the only way to provide really custom DSLs in the plugins block is to create a custom Gradle distribution.
EDIT:
Indeed, I totally forgot to test buildSrc. I've created a file PluginExtensions.kt in it, with the following content:
inline val org.gradle.plugin.use.PluginDependenciesSpec.`jawa`: org.gradle.plugin.use.PluginDependencySpec
    get() = id("org.gradle.war") // Randomly picked

inline fun org.gradle.plugin.use.PluginDependenciesSpec.`jawa`(): org.gradle.plugin.use.PluginDependencySpec {
    return id("org.gradle.cunit") // Randomly picked
}
And it seems to be working:
plugins {
    jawa
    jawa()
}
However, this only works when PluginExtensions.kt is in the default package. Whenever I put it into a sub-package, the extensions are not recognized, even with an import.
Magic!
The kotlin function is just a simple extension function wrapping the traditional id method, not hard to define:
fun PluginDependenciesSpec.kotlin(module: String): PluginDependencySpec =
    id("org.jetbrains.kotlin.$module")
However, this extension function is part of the standard Gradle Kotlin DSL API, which means it's available without any plugin. If you want to make a custom function like this available, you would need a plugin. A plugin to load your plugin. Not very practical.
I also tried using the buildSrc module to make an extension function like the one above. But it turns out that buildSrc definitions aren't even available from the plugins DSL block, which has a very constrained syntax. That wouldn't have been very practical anyway; you would have needed to make a buildSrc folder for every project in which you wanted to use the extension.
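For reference, the helper you're after would have exactly the same shape as the kotlin(...) extension above; the plugin id prefix here is invented, and the real problem is only where to declare it so the plugins block can see it:

fun PluginDependenciesSpec.myplugin(module: String): PluginDependencySpec =
    id("com.example.$module") // hypothetical plugin id prefix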
I'm not sure if this is possible at all. Try asking on https://discuss.gradle.org/.
I'm trying to use JavaFX on my Android device, with the help of javafxports.
I use XStream to parse some XML files in my program.
When I compile, javafxports outputs the following warnings:
Note: there were 9 classes trying to access annotations using reflection.
You should consider keeping the annotation attributes
(using '-keepattributes *Annotation*').
(http://proguard.sourceforge.net/manual/troubleshooting.html#attributes)
Note: there were 32 classes trying to access generic signatures using reflection.
You should consider keeping the signature attributes
(using '-keepattributes Signature').
(http://proguard.sourceforge.net/manual/troubleshooting.html#attributes)
Note: there were 56 unresolved dynamic references to classes or interfaces.
You should check if you need to specify additional program jars.
(http://proguard.sourceforge.net/manual/troubleshooting.html#dynamicalclass)
Note: there were 3 class casts of dynamically created class instances.
You might consider explicitly keeping the mentioned classes and/or
their implementations (using '-keep').
(http://proguard.sourceforge.net/manual/troubleshooting.html#dynamicalclasscast)
Note: there were 39 accesses to class members by means of introspection.
You should consider explicitly keeping the mentioned class members
(using '-keep' or '-keepclassmembers').
(http://proguard.sourceforge.net/manual/troubleshooting.html#dynamicalclassmember)
Note: you're ignoring all warnings!
The output .apk can be installed and runs until it calls the XStream classes to read annotations in my classes. The reason is actually described in the warnings.
So my question is: how can I disable ProGuard when generating the .apk, or pass it a custom proguard.pro configuration?
And my build.gradle is almost the same as that in the helloworld example.
Thanks.