In a multiproject build, some projects depend on others, and the latter provide not only compile/runtime libraries but also useful test libs (such as "mock" implementations of the components they provide).
I know of a couple of ways to make the test sources of one project available to another. They are discussed, for instance, in the answers to this question.
What I am looking for is some magical way to make this happen automatically, so that if a subproject adds a dependency on another subproject, it automatically gets that project's test sources added to the testCompile config.
I tried the naive approach:
configure(rootProject.subprojects.findAll {
    it.configurations.getByName("compile")
            .dependencies.any {
                it.name == project.name
            }
}) {
    dependencies {
        testCompile project.sourceSets.test.output
    }
}
But this does not work ... presumably because this code is evaluated "at the wrong stage" (or whatever the correct lingo is), and the other projects don't yet "know" that they depend on this one.
I also tried putting (an equivalent of) this at the end of the root build file (hoping that everything else would already be configured by then), but that did not work either.
Is there a way to do what I am looking for here?
I also tried putting (an equivalent of) this at the end of root build file
The order of declaration in a build.gradle does not matter. The Gradle build lifecycle has two main phases, configuration and execution. During configuration, the various build.gradle files are read, and an execution-order graph is created based on the implicit and explicit dependencies among the various tasks.
Normally the root build.gradle is evaluated first, but it is possible to force the child projects to be evaluated first using evaluationDependsOnChildren().
Instead of relying on position in the build scripts, you can listen for events of the build lifecycle to run something at certain points. In your case, you want to run your snippet once the projects are evaluated, using an afterEvaluate() block. See example here.
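A minimal sketch of that timing, placed in the root build.gradle: an afterEvaluate block only fires once the given project's own build script has been evaluated, so its declared dependencies are visible by then (this assumes the java plugin is applied, so the compile configuration from the question's snippet exists).

```groovy
// Sketch only: demonstrates when afterEvaluate fires, not the full wiring.
allprojects {
    afterEvaluate { project ->
        // By this point the project's build script has been evaluated,
        // so its declared "compile" dependencies can be inspected.
        println "${project.name} depends on: " +
                project.configurations.compile.dependencies*.name
    }
}
```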
Some possible alternatives to your overall approach:
Add the testCompile dependency from the downstream project instead of injecting it from the upstream project. Add this to the root build.gradle:
subprojects {
    afterEvaluate { p ->
        p.configurations.compile.dependencies
                .withType(ProjectDependency)
                .each { dep ->
                    // inherit the upstream project's test output on testCompile
                    p.dependencies.add('testCompile',
                            dep.dependencyProject.sourceSets.test.output)
                }
    }
}
(sketch, untested; the filter could be narrowed to specific project names, e.g. if dep.name is in some collection)
Separate out the shareable test/mock components into a separate mock-project that you can add as testCompile dependency to both upstream and downstream projects
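That alternative could look like the following; the project names foo, foo-mocks, and bar are assumptions for illustration only:

```groovy
// settings.gradle — carve the shareable test/mock code out into its own project
include 'foo', 'foo-mocks', 'bar'
```

```groovy
// bar/build.gradle — downstream projects depend on the mock project explicitly
dependencies {
    compile project(':foo')
    testCompile project(':foo-mocks')
}
```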
Related
I'm trying to set up a Kotlin project that wants some runtime-only dependencies, but those dependencies use a classifier.
I'm using a libs.versions.toml catalog file to store all the versions, but Gradle decided that the catalog file can't store classifier info, so now I'm trying to find a way to specify the dependency that doesn't use the catalog.
kotlin {
    // ...
    sourceSets {
        val jvmMain by getting {
            dependencies {
                // ...
                implementation(libs.lwjgl)
                implementation(libs.lwjgl.openvr)
                runtimeOnly(libs.lwjgl.openvr) // but want classifier = "native-windows"
            }
        }
        val jvmTest by getting
    }
}
Tried so far
On a Gradle ticket, they say it would be this:
runtimeOnly(libs.lwjgl.openvr) {
    artifact {
        classifier = "native-windows"
    }
}
However, this does not work, as there is no overload of runtimeOnly that takes both an Any and a configuration action. If I provide the dependency as a string, it works, but then I'm not using the version catalog anymore.
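For reference, the string fallback mentioned above can at least keep the version in the catalog. The group/artifact coordinates and the version alias name here are assumptions for illustration:

```kotlin
dependencies {
    // Untested sketch: plain string notation accepts the classifier suffix,
    // while the version still comes from the catalog's versions table
    // (assuming an alias "lwjgl" exists in libs.versions.toml).
    runtimeOnly("org.lwjgl:lwjgl-openvr:${libs.versions.lwjgl.get()}:native-windows")
}
```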
The similar question answered here would have me use this:
implementation(variantOf(libs.lwjgl.openvr) {
    classifier("native-windows")
})
However, this doesn't appear to be valid either: no variantOf function exists, I can't find anywhere to import it from, and nobody thus far has said where they got it from.
I then tried the usual hack of breaking things out with .apply {} and the like, but it doesn't look like the object I get at any point has the right type to be able to set the classifier.
So I'm out of ideas, short of discarding the catalogs feature in yet another project because it's blocking me from doing anything useful.
Versions in play:
Gradle 7.5.1
Kotlin Multiplatform Plugin 1.6.10
I'm trying to define libraries in a common location. So in an our.libraries.gradle.kts script in a shared build plugin, I have this:
inner class Libraries {
    val junit get() = ...
    val junitParams get() = ...
}

val libraries = Libraries()
project.extra["libraries"] = libraries
In one of the Groovy build scripts elsewhere in the project, this is referred to like this:
allprojects {
    dependencies {
        testImplementation libraries.junit
    }
}
This works fine.
So I try converting that script to Kotlin:
allprojects {
    dependencies {
        "testImplementation"(libraries.junit)
    }
}
And now this fails, because it can't see the libraries property on the project, so I try explicitly pulling it out at the start:
val libraries: Libraries by project.extra

allprojects {
    dependencies {
        "testImplementation"(libraries.junit)
    }
}
But this doesn't work either, because the script can't find the Libraries class.
I also tried putting Libraries in Libraries.kt, but then I can't seem to call methods like exclude using named parameters because for whatever reason Gradle doesn't support using the DSL when it's moved to a top-level class.
This is sort of similar to this question, but in that case the values being stored are simple types, and everything works fine. Which is to say, I can put the libraries in as a Map, but then any time I want to reference one, I have to write this:
"testImplementation"(libraries["junit"]!!)
This is obviously ugly, so I have been trying to avoid it.
So I'm stumped again.
This is part of a long saga trying to get this to work in many different ways, so the essential question is still the same: how can we define all our libraries in one location, and then refer to those in a type-safe way from other build scripts?
Recently, Gradle added shared dependencies via a TOML file, but that method only supports version numbers, whereas our library definitions also include excluded dependencies.
It was hard to put a completely self-contained example in the question because multiple files are involved, so here's a test repo.
Can I write in my custom plugin some function like kotlin("jvm")?
plugins {
    java
    kotlin("jvm") version "1.3.71"
}
I want to write a function myplugin("foo") in my custom plugin and then use it like
plugins {
    java
    kotlin("jvm") version "1.3.71"
    custom.plugin
    myplugin("foo")
}
How can I do it?
I think that the plugins block is some kind of macro expression. It is parsed and precompiled using a very limited context. Probably, the magic happens somewhere in kotlin-dsl. This is probably the only way to get static accessors and extension functions from plugins to work in Kotlin. I've never seen a mention of this process in Gradle's documentation, but let me explain my thinking. Probably, some smart guys from Gradle will correct me.
Let's take a look at some third-party plugin, like Liquibase. It allows you to write something like this in your build.gradle.kts:
liquibase {
    activities {
        register("name") {
            // Configure the activity here
        }
    }
}
Think about it: in a statically compiled language like Kotlin, in order for this syntax to work, there should be an extension named liquibase on the Project type (as it is the type of this in every build.gradle.kts) available in the classpath of the Gradle VM that executes the build script.
Indeed, if you click on it, you'll see something like:
fun org.gradle.api.Project.`liquibase`(configure: org.liquibase.gradle.LiquibaseExtension.() -> Unit): Unit =
    (this as org.gradle.api.plugins.ExtensionAware).extensions.configure("liquibase", configure)
But take a look at the file where it is defined. In my case it is ~/.gradle/caches/6.3/gradle-kotlin-dsl-accessors/cmljl3ridzazieb8fzn553oa8/cache/src/org/gradle/kotlin/dsl/Accessors39qcxru7gldpadn6lvh8lqs7b.kt. It is definitely an auto-generated file. A few levels up in the file tree (at ~/.gradle/caches/6.3/gradle-kotlin-dsl-accessors/ in my case) there are dozens of similar directories: one per plugin/version I've ever used with Gradle 6.3, I guess. Here is another one, for the Detekt plugin:
fun org.gradle.api.Project.`detekt`(configure: io.gitlab.arturbosch.detekt.extensions.DetektExtension.() -> Unit): Unit =
    (this as org.gradle.api.plugins.ExtensionAware).extensions.configure("detekt", configure)
So, we have a bunch of .kt files defining all those extensions for the different plugins applied to the project. The files are obviously pre-cached and precompiled, and their content is available in build.gradle.kts. Indeed, you can find classes directories beside those sources.
The sources are generated based on the content of the applied plugins. It is probably a tricky task that involves some magic, reflection, and introspection. Sometimes this magic doesn't work (due to Groovy's chaotic nature), and then you need to use some crappy DSL from this package.
How are they generated? I see no other way but to:
1. Parse the build.gradle.kts with an embedded Kotlin compiler / lexer
2. Extract all the plugins sections
3. Compile them, probably against some mocks (remember that Project is not yet available: we're not executing the build.gradle.kts itself yet!)
4. Resolve the declared plugins from the Gradle Plugin repository (with some nuances coming from settings.gradle.kts)
5. Introspect each plugin's artifacts
6. Generate the sources
7. Compile the sources
8. Add the resulting classes to the script's classpath
And here is the gotcha: there is a very limited context (classpath, classes, methods, call it whatever) available when compiling the plugins block. Actually, no plugins have been applied yet! Because, you know, you're parsing the block that applies plugins. Chickens, eggs, and their problems, huh…
So, and we're getting closer to the answer on your question, to provide custom DSL in plugins block, you need to modify that classpath. It's not a classpath of your build.gradle.kts, it's the classpath of the VM that parses build.gradle.kts. Basically, it's Gradle's own classpath — all the classes bundled in a Gradle distribution.
So, probably the only way to provide really custom DSLs in plugins block is to create a custom Gradle distribution.
EDIT:
Indeed, I totally forgot to test buildSrc. I've created a file PluginExtensions.kt in it, with this content:
inline val org.gradle.plugin.use.PluginDependenciesSpec.`jawa`: org.gradle.plugin.use.PluginDependencySpec
    get() = id("org.gradle.war") // Randomly picked

inline fun org.gradle.plugin.use.PluginDependenciesSpec.`jawa`(): org.gradle.plugin.use.PluginDependencySpec {
    return id("org.gradle.cunit") // Randomly picked
}
And it seems to be working:
plugins {
    jawa
    jawa()
}
However, this only works when PluginExtensions.kt is in the default package. Whenever I put it into a sub-package, the extensions are not recognized, even with an import.
Magic!
The kotlin function is just a simple extension function wrapping the traditional id method, not hard to define:
fun PluginDependenciesSpec.kotlin(module: String): PluginDependencySpec =
    id("org.jetbrains.kotlin.$module")
However, this extension function is part of the standard Gradle Kotlin DSL API, which means it's available without any plugin. If you want to make a custom function like this available, you would need a plugin. A plugin to load your plugin. Not very practical.
I also tried using the buildSrc module to make an extension function like the one above. But it turns out that buildSrc definitions aren't even available from the plugins DSL block, which has a very constrained syntax. That wouldn't have been very practical anyway: you would have needed a buildSrc folder in every project where you wanted to use the extension.
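For completeness, the analogous declaration for the function asked about would look something like this; the "custom.plugin" id prefix is a hypothetical placeholder, and as explained above there is no practical place to put such a declaration:

```kotlin
import org.gradle.plugin.use.PluginDependenciesSpec
import org.gradle.plugin.use.PluginDependencySpec

// Hypothetical: wraps id() exactly like the built-in kotlin() helper does;
// the plugin id prefix is an assumption for illustration only.
fun PluginDependenciesSpec.myplugin(module: String): PluginDependencySpec =
    id("custom.plugin.$module")
```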
I'm not sure if this is possible at all. Try asking on https://discuss.gradle.org/.
Background
Lint is a Static Code Analysis Tool that scans Android project sources for potential bugs.
It uses one or more options inside a lintOptions block. One of these options is baseline, which records a snapshot of the current set of warnings in Lint. It takes a file path, and this can be done in one of two ways:
One way uses a method called file() after it, which records all current issues in the file provided:
android {
    lintOptions {
        baseline file("lint-baseline.xml") // your choice of filename/path here
    }
}
Another way uses a method called configFile():
android {
    ...
    lintOptions {
        baseline configFile('quality/lint/lint-baseline.xml')
    }
    ...
}
which seems to have a similar use, but I have found no documentation confirming or denying this.
Question
What is configFile() and how does it differ from file() in Lint?
I'm trying to run instrumentation test cases but am getting the below error during dex conversion:
UNEXPECTED TOP-LEVEL EXCEPTION:
com.android.dex.DexException: Too many classes in --main-dex-list, main dex capacity exceeded
    at com.android.dx.command.dexer.Main.processAllFiles(Main.java:494)
    at com.android.dx.command.dexer.Main.runMultiDex(Main.java:334)
    at com.android.dx.command.dexer.Main.run(Main.java:244)
    at com.android.dx.command.dexer.Main.main(Main.java:215)
    at com.android.dx.command.Main.main(Main.java:106)
:App:dexDebug FAILED
How do I resolve this issue in Gradle?
Let's first understand the problem:
On pre-Lollipop devices, only the main dex is loaded by the framework. To support multidex applications you have to explicitly patch the application class loader with all the secondary dex files (this is why your Application class has to extend MultiDexApplication, or call MultiDex#install).
This means that your application's main dex should contain all the classes that are potentially accessible before the class loader is patched.
You will receive a java.lang.ClassNotFoundException if your application code tries to reference a class that was packaged in one of the secondary dex files before the application class loader has been successfully patched.
I've documented here how the plugin decides which classes should be packaged in the main dex.
If the total number of methods that those classes reference exceeds the 65,536 limit, the build will fail with the Too many classes in --main-dex-list, main dex capacity exceeded error.
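For context, the standard multidex setup this answer assumes looks like the following sketch; the support library version is only an example:

```groovy
android {
    defaultConfig {
        // split the app across multiple dex files instead of failing the build
        multiDexEnabled true
    }
}

dependencies {
    // needed on pre-Lollipop so MultiDex.install can patch the class loader
    compile 'com.android.support:multidex:1.0.1'
}
```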
I can think of three possible solutions for this issue:
(The easiest solution, but not suitable for most applications) Change your minSdkVersion to 21.
Shrink your application code. This was discussed many times previously (see here and here).
If none of the above solutions works for you, you can try my workaround for this issue: I'm patching the Android Gradle plugin to not include Activity classes in the main dex. It's a bit hacky, but it works well for me.
There's an issue in Android bug tracker regarding this error. Hopefully the Tools team will provide a better solution soon.
Update (4/27/2016)
Version 2.1.0 of the Gradle plugin allows filtering of the main-dex list classes.
Warning: this uses an unsupported API that will be replaced in the future.
For example, to exclude all activity classes you can do:
afterEvaluate {
    project.tasks.each { task ->
        if (task.name.startsWith('collect') && task.name.endsWith('MultiDexComponents')) {
            println "main-dex-filter: found task $task.name"
            task.filter { name, attrs ->
                def componentName = attrs.get('android:name')
                if ('activity'.equals(name)) {
                    println "main-dex-filter: skipping, detected activity [$componentName]"
                    return false
                } else {
                    println "main-dex-filter: keeping, detected $name [$componentName]"
                    return true
                }
            }
        }
    }
}
You can also check my example project that demonstrates this issue (and applies the above filtering).
Update 2 (7/1/2016)
Version 2.2.0-alpha4 of the Gradle plugin (with build-tools v24) finally solves this issue by reducing the multidex keep list to a minimum.
The unsupported (and undocumented) filter from 2.1.0 should not be used anymore. I've updated my sample project, demonstrating that the build now succeeds without any custom build logic.
One other way to resolve this issue is to remove classes with Runtime annotations from the main DEX file:
android {
    dexOptions {
        keepRuntimeAnnotatedClasses false
    }
}
This is particularly helpful for applications that use dependency injection frameworks, since frameworks like Dagger usually keep their annotations at runtime.
You have two choices:
use ProGuard to strip down the number of methods
use the multidex feature
My advice: go with ProGuard, as it may require as little as zero changes to your source code.
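If you go the ProGuard route, enabling shrinking is a small build.gradle change; this is a sketch only, and your project-specific keep rules would go in proguard-rules.pro:

```groovy
android {
    buildTypes {
        release {
            // shrink unused classes/methods to get under the dex method limit
            minifyEnabled true
            proguardFiles getDefaultProguardFile('proguard-android.txt'),
                          'proguard-rules.pro'
        }
    }
}
```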