How to implement custom platform logic using the Kotlin Multiplatform feature?

Kotlin Multiplatform is a great way to build multiplatform applications, but it currently seems restricted to the targets built into the Kotlin Multiplatform ecosystem. Can I implement custom build logic that extends the resolution strategy of expect, actual, and the like? In other words, can I treat these features as a general concept of "multiplatform", but give them different behavior during the build process? Gradle-based solutions are welcome.
For example, if the relevant extension points were available, one could write a Kotlin compiler plugin that resolves those expect/actual endpoints and composes them into actual platform-specific runtime logic, and then write a Gradle plugin to process the resulting artifacts.
So if there were two "multiplatform" scenarios that both use the JVM as the "backend" but expose different APIs with the same or similar logic as the "frontend", one could follow the approach above to get the benefit Kotlin Multiplatform provides: write once, run anywhere.
I'd prefer to call this "API-layer multiplatform", to distinguish it from Kotlin Multiplatform, which is "system-layer multiplatform". "Platform" here is the more abstract notion.
So here is what the producer does, just like Kotlin Multiplatform:
build.gradle.kts:
plugins {
    kotlin("jvm")
    id("<multiplatform-plugin-id>") // Comes with a Kotlin compiler plugin too
}
dependencies {
    api("<common-dependency-notation>") // Another multiplatform library
}
common module:
fun hello() {
    val logger = serviceLogger // Using an API from that other multiplatform library
    logger.info("Hello")
}
expect fun hookOnStart(block: () -> Unit) // Needs platform-specific implementations
platform module:
actual fun hookOnStart(block: () -> Unit) { // Imaginary
    ClientEvents.START.register(block)
}
anotherPlatform module:
actual fun hookOnStart(block: () -> Unit) { // Imaginary
    val event = EventFactory.once(ClientStartEvent::class.java, block)
    GlobalEventHandler.register(event)
}
As said before, after the build each platform has its own artifact, ready for runtime or to be published as a library. The author also benefits from that other multiplatform library, since shared code gives every platform the same features.
And the following is what the consumer does (let's say they're on platform):
build.gradle.kts:
plugins {
    kotlin("jvm")
}
dependencies {
    implementation("<previous-common-dependency-notation>") // From the previous author, mapped to the `platform` version
}
Business logic:
fun runBusiness() {
    hello()
    hookOnStart { serviceLogger.info("world!") }
}

This is pretty uncharted territory, and there is no documentation for it.
I'd investigate the source code of the kotlin-multiplatform Gradle plugin in more depth and see whether you can extend the existing target palette and expect/actual behaviour.
I'd guess that the plugin isn't really built for this kind of extension, but if you have solid reasons, you could probably submit feature requests and work on a local fork in the meantime.
Update:
If I understood your use-case correctly, you'd like to extend the expect/actual mechanism, which is currently a target/platform based abstraction?
I believe a more general way of making abstractions, such as using interfaces, could serve you. However, I can see the added compile-time safety benefits you seek 🤔, not sure what changes that'd need in the kotlin-multiplatform plugin and if JetBrains team would like that direction. Maybe something Artyom Degtyarev or someone from the JetBrains team could answer?
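For illustration, here is a minimal sketch of that interface-based approach (all names below are made up): common code depends on an abstraction, and each "platform" flavor supplies an implementation. You lose the compile-time expect/actual checks, but the wiring stays ordinary Kotlin.
interface StartHook {
    fun hookOnStart(block: () -> Unit)
}

// Common logic, written once against the abstraction.
fun startBusiness(hook: StartHook) {
    hook.hookOnStart { println("world!") }
}

// One flavor's implementation; in the question's terms this would call
// ClientEvents.START.register(block) instead of running the block directly.
class ImmediateStartHook : StartHook {
    override fun hookOnStart(block: () -> Unit) = block()
}

fun main() {
    startBusiness(ImmediateStartHook())
}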

Related

Is Sequence considered an internal DSL in Kotlin?

In the book 'Kotlin in Action', it says Kotlin DSL structure is most commonly created through chained method calls. Also, a typical library consists of many methods, and no context is maintained between one call and the next.
I'm confused about which side Sequence is closer to. Before I read this, I thought Sequence was just a library API, but it really fits the description of a DSL.
I'm not 100% sure this answers your question, but I would not think of Sequence pipelines as a "DSL" per se, in particular because they are quite general, which is the opposite of "domain-specific" - the heart of the definition of a DSL.
If you build your own builder API based on chained method calls for a specific domain, you could consider that as a DSL, but I would say Kotlin DSLs are mostly made of nested lambdas with declarative property assignments, rather than chained method calls.
This is because lambdas in Kotlin give the illusion of blocks and structure more than actual functions and function calls, which is why nested structures like this look like their own "language" (the L of DSL). Chained method calls don't look like another "language" - they just look like function calls, but of course that's my subjective take.
For example, here is a Gradle build script using the Gradle Kotlin DSL:
plugins {
    `java-library`
}
dependencies {
    api("junit:junit:4.13")
    implementation("junit:junit:4.13")
    testImplementation("junit:junit:4.13")
}
java {
    sourceCompatibility = JavaVersion.VERSION_11
    targetCompatibility = JavaVersion.VERSION_11
}
It does look like its own language; you don't immediately think of Kotlin when reading such code.
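To make the contrast concrete, here is a small self-contained sketch: the Sequence pipeline is plain chained calls, while the html/body/p builder below is a made-up toy, not a real library.
// Chained method calls on Sequence: general-purpose, reads as ordinary function calls.
val doubled = sequenceOf(1, 2, 3)
    .map { it * 2 }
    .filter { it > 2 }
    .toList()

// Nested lambdas with declarative assignments: reads like its own small "language".
class Paragraph { var text = "" }
class Body {
    val paragraphs = mutableListOf<Paragraph>()
    fun p(configure: Paragraph.() -> Unit) { paragraphs += Paragraph().apply(configure) }
}
class Html {
    val bodies = mutableListOf<Body>()
    fun body(configure: Body.() -> Unit) { bodies += Body().apply(configure) }
}
fun html(configure: Html.() -> Unit): Html = Html().apply(configure)

val page = html {
    body {
        p { text = "Hello" }
    }
}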

How can I refer to libraries defined in a shared Gradle build plugin from another build script?

I'm trying to define libraries in a common location. So in an our.libraries.gradle.kts script in a shared build plugin, I have this:
inner class Libraries {
    val junit get() = ...
    val junitParams get() = ...
}
val libraries = Libraries()
project.extra["libraries"] = libraries
In one of the Groovy build scripts elsewhere in the project, this is referred to like this:
allprojects {
    dependencies {
        testImplementation libraries.junit
    }
}
This works fine.
So I try converting that script to Kotlin:
allprojects {
    dependencies {
        "testImplementation"(libraries.junit)
    }
}
And now this fails, because it can't see the libraries property on the project, so I try explicitly pulling it out at the start:
val libraries: Libraries by project.extra
allprojects {
    dependencies {
        "testImplementation"(libraries.junit)
    }
}
But this doesn't work either, because the script can't find the Libraries class.
I also tried putting Libraries in Libraries.kt, but then I can't seem to call methods like exclude using named parameters because for whatever reason Gradle doesn't support using the DSL when it's moved to a top-level class.
This is sort of similar to this question, but when only simple types are involved, everything works fine. Which is to say, I can put the libraries in as a Map, but then any time I want to reference one, I have to write this:
"testImplementation"(libraries["junit"]!!)
This is obviously ugly, so I have been trying to avoid it.
So I'm stumped again.
This is part of a long saga trying to get this to work in many different ways, so the essential question is still the same: how can we define all our libraries in one location, and then refer to those in a type-safe way from other build scripts?
Recently, Gradle added shared dependency declarations via a TOML file (version catalogs), but that method only supports version numbers, whereas our library definitions also include excluded dependencies.
It was hard to put a completely self-contained example in the question because multiple files are involved, so here's a test repo.
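For reference, here is roughly what that TOML-backed mechanism (version catalogs) looks like when declared programmatically in settings.gradle.kts; this assumes a recent Gradle version (7.4+), and it still offers no way to attach excludes:
// settings.gradle.kts
dependencyResolutionManagement {
    versionCatalogs {
        create("libs") {
            library("junit", "junit:junit:4.13")
        }
    }
}

// build.gradle.kts — consuming the generated type-safe accessor:
dependencies {
    testImplementation(libs.junit)
}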

Gradle: custom function in the plugins {} block

Can I write a function like kotlin("jvm") in my custom plugin?
plugins {
    java
    kotlin("jvm") version "1.3.71"
}
I want to write a function myplugin("foo") in my custom plugin and then use it like this:
plugins {
    java
    kotlin("jvm") version "1.3.71"
    custom.plugin
    myplugin("foo")
}
How can I do it?
I think the plugins block is some kind of macro expression. It is parsed and precompiled using a very limited context. Probably, the magic happens somewhere in kotlin-dsl. This is probably the only way to get static accessors and extension functions from plugins to work in Kotlin. I've never seen a mention of this process in Gradle's documentation, but let me explain my thinking. Probably, some smart guys from Gradle will correct me.
Let's take a look at some third-party plugin, like Liquibase. It allows you to write something like this in your build.gradle.kts:
liquibase {
    activities {
        register("name") {
            // Configure the activity here
        }
    }
}
Think about it: in a statically compiled language like Kotlin, in order for this syntax to work, there should be an extension named liquibase on the Project type (as it is the type of the this object in every build.gradle.kts) available in the classpath of the Gradle VM that executes the build script.
Indeed, if you click on it, you'll see something like:
fun org.gradle.api.Project.`liquibase`(configure: org.liquibase.gradle.LiquibaseExtension.() -> Unit): Unit =
    (this as org.gradle.api.plugins.ExtensionAware).extensions.configure("liquibase", configure)
But take a look at the file where it is defined. In my case it is ~/.gradle/caches/6.3/gradle-kotlin-dsl-accessors/cmljl3ridzazieb8fzn553oa8/cache/src/org/gradle/kotlin/dsl/Accessors39qcxru7gldpadn6lvh8lqs7b.kt. It is definitely an auto-generated file. A few levels up in the file tree — at ~/.gradle/caches/6.3/gradle-kotlin-dsl-accessors/ in my case — there are dozens of similar directories. I guess, one for every plugin/version I've ever used with Gradle 6.3. Here is another one for the Detekt plugin:
fun org.gradle.api.Project.`detekt`(configure: io.gitlab.arturbosch.detekt.extensions.DetektExtension.() -> Unit): Unit =
    (this as org.gradle.api.plugins.ExtensionAware).extensions.configure("detekt", configure)
So, we have a bunch of .kt files defining all those extensions for the different plugins applied to the project. Those files are obviously pre-cached and precompiled, and their content is available in build.gradle.kts. Indeed, you can find classes directories beside those sources.
The sources are generated based on the content of the applied plugins. It is probably a tricky task that includes some magic, reflection, and introspection. Sometimes this magic doesn't work (due to Groovy's chaotic nature), and then you need to use some crappy DSL from this package.
How are they generated? I see no other way but to:
1. Parse the build.gradle.kts with an embedded Kotlin compiler / lexer
2. Extract all the plugins sections
3. Compile them, probably against some mocks (remember that Project is not yet available: we're not executing the build.gradle.kts itself yet!)
4. Resolve the declared plugins from the Gradle Plugin repository (with some nuances coming from settings.gradle.kts)
5. Introspect the plugins' artifacts
6. Generate the sources
7. Compile the sources
8. Add the resulting classes to the script's classpath
And here is the gotcha: there is a very limited context (classpath, classes, methods — call it whatever) available when compiling the plugins block. Actually, no plugins are yet applied! Because, you know, you're parsing the block that applies plugins. Chickens, eggs, and their problems, huh…
So, and we're getting closer to the answer to your question: to provide a custom DSL in the plugins block, you need to modify that classpath. It's not the classpath of your build.gradle.kts; it's the classpath of the VM that parses build.gradle.kts. Basically, it's Gradle's own classpath — all the classes bundled in a Gradle distribution.
So, probably the only way to provide really custom DSLs in plugins block is to create a custom Gradle distribution.
EDIT:
Indeed, I totally forgot to test buildSrc. I've created a file PluginExtensions.kt in it, with the following content:
inline val org.gradle.plugin.use.PluginDependenciesSpec.`jawa`: org.gradle.plugin.use.PluginDependencySpec
    get() = id("org.gradle.war") // Randomly picked

inline fun org.gradle.plugin.use.PluginDependenciesSpec.`jawa`(): org.gradle.plugin.use.PluginDependencySpec {
    return id("org.gradle.cunit") // Randomly picked
}
And it seems to be working:
plugins {
    jawa
    jawa()
}
However, this only works when PluginExtensions.kt is in the default package. Whenever I put it into a sub-package, the extensions are not recognized, even with an import. Magic!
The kotlin function is just a simple extension function wrapping the traditional id method, not hard to define:
fun PluginDependenciesSpec.kotlin(module: String): PluginDependencySpec =
    id("org.jetbrains.kotlin.$module")
However, this extension function is part of the standard Gradle Kotlin DSL API, which means it's available without any plugin. If you want to make a custom function like this available, you would need a plugin. A plugin to load your plugin. Not very practical.
I also tried using the buildSrc module to make an extension function like the one above. But it turns out that buildSrc definitions aren't even available from the plugins DSL block, which has a very constrained syntax. That wouldn't have been very practical anyway: you would have needed a buildSrc folder in every project where you wanted to use the extension.
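For what it's worth, the analogous extension for the hypothetical myplugin from the question is just as easy to write; the hard part, as described above, is getting it onto the classpath of the plugins block:
import org.gradle.plugin.use.PluginDependenciesSpec
import org.gradle.plugin.use.PluginDependencySpec

// "com.example" is a made-up plugin-id prefix for illustration.
fun PluginDependenciesSpec.myplugin(module: String): PluginDependencySpec =
    id("com.example.$module")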
I'm not sure if this is possible at all. Try asking on https://discuss.gradle.org/.

How to do method instrumentation in Kotlin - but keep the method testable

I have a method that I need to instrument to call New Relic: set up a segment, run the business logic, and end the segment. Is there a way to do it in Kotlin (as in Spring AOP)?
fun saveCustomer() {
    val segment = NewRelic.getAgent().transaction.startSegment("save customer")
    // business logic here
    segment.end()
}
I experimented with isolating the newRelic dependency and can now reuse it across my whole app:
fun saveCustomer() {
    newRelic.executeWithSegment { // this starts/ends the segment and calls the function block
        // business logic here
    }
}
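(For context, a minimal sketch of what such a wrapper might look like, assuming the New Relic Java agent API from the first snippet:)
import com.newrelic.api.agent.NewRelic

// Illustrative only: starts a segment, runs the block, always ends the segment.
fun <T> executeWithSegment(name: String, block: () -> T): T {
    val segment = NewRelic.getAgent().transaction.startSegment(name)
    try {
        return block()
    } finally {
        segment.end()
    }
}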
However, this makes unit testing of saveCustomer harder: after mocking newRelic.executeWithSegment (which I must do, otherwise New Relic is contacted in the tests), the code block (the business logic) is no longer executed, so the test fails.
Is there a way to fulfill those requirements? (Perhaps with an annotation or using Kotlin delegation pattern or even some lightweight library; not sure.)
You can use AspectJ.
It is independent of Spring, more powerful and more efficient (no proxies) than Spring AOP.
It should work with any JVM language, although I never tried with Kotlin as a target for my aspects.
If you do compile-time weaving, the AspectJ runtime is the only dependency you need. Its size is 120K.
For load-time weaving you need the AspectJ weaving agent instead. Its size is 1.9M.
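A sketch of what an annotation-style aspect could look like here; the @Instrumented annotation is made up, and this assumes compile-time weaving with the AspectJ compiler is configured in the build. In unit tests nothing is woven, so saveCustomer runs as plain code.
import com.newrelic.api.agent.NewRelic
import org.aspectj.lang.ProceedingJoinPoint
import org.aspectj.lang.annotation.Around
import org.aspectj.lang.annotation.Aspect

// Marker for methods that should run inside a New Relic segment.
@Target(AnnotationTarget.FUNCTION)
annotation class Instrumented

@Aspect
class SegmentAspect {
    // Wraps every @Instrumented method; the business logic runs via proceed(),
    // so the method itself stays free of instrumentation code and unit-testable.
    @Around("@annotation(Instrumented) && execution(* *(..))")
    fun aroundInstrumented(pjp: ProceedingJoinPoint): Any? {
        val segment = NewRelic.getAgent().transaction.startSegment(pjp.signature.name)
        try {
            return pjp.proceed()
        } finally {
            segment.end()
        }
    }
}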

Multi-platform InputStream Alternative in Kotlin?

I’m looking for a multi-platform alternative to input streams. My concrete task is to fetch an encrypted file from a remote server via https and decrypt it on demand.
In Java land I would implement an InputStream that proxies reads to the input stream from the HTTPS library. How can I do the same in Kotlin, targeting multiple platforms?
I see that ktor returns a ByteReadChannel, but I don't know which functions to use.
I’m lost and don’t know where to start. Thanks for your help in advance.
If the framework you are using does not provide you with a full-fledged InputStream implementation, the only chance left is to write your own. Much like what the ktor developers did: ByteReadChannel is just an abstraction of "reading bytes from a channel".
This abstraction lives in the common part and allows you to write application and business logic around it.
The key to making this work in a Kotlin Multiplatform project is that the actual implementations are provided in the platform-specific parts. The JVM-specific code of the ktor project actually has an implementation that uses InputStream: InputStream.toByteReadChannel.
You certainly don't have to do it like your example from the ktor project and model everything down from byte channels up to file representations. If you want to leverage Kotlin framework classes, Sequences might be handy. This could look something like this:
// in common
interface FileFetcher {
    fun fetch(): Sequence<Byte>
}

expect fun fileFetcher(source: String): FileFetcher

// in jvm
class JvmFileFetcher(val input: java.io.InputStream) : FileFetcher {
    override fun fetch(): Sequence<Byte> = input.readBytes().asSequence()
}

actual fun fileFetcher(source: String): FileFetcher {
    val input = java.net.URL(source).openStream()
    return JvmFileFetcher(input)
}
You would define an interface FileFetcher along with a factory function fileFetcher in the common part. By using the expect keyword on the fileFetcher function you need to provide platform-specific implementations for all target platforms you define. Use the FileFetcher interface in the common part to implement your logic (decrypting file contents etc.). See the documentation for Sequence for how to work with it.
Then implement the factory function for all platforms, using the actual keyword. You will then need to write platform-specific implementations of FileFetcher. My example shows a JVM version of the FileFetcher interface.
The example is of course very basic, and you probably would not want to do it exactly like this (at least some buffering would be needed, I guess). Also, within the JVM part you could easily leverage your favorite networking/HTTP library.
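To illustrate how common code could consume this abstraction, here is a sketch (decrypt is just a stand-in for the real decryption routine):
// In common code: logic written purely against the FileFetcher abstraction.
fun decrypt(b: Byte): Byte = b // placeholder for the real decryption

fun decryptedText(source: String): String =
    fileFetcher(source).fetch()
        .map { decrypt(it) } // per-byte decryption, illustrative only
        .toList()
        .toByteArray()
        .decodeToString()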