Gradle: Copy all test dependencies to a zip file
I created this task:
task zipDeps(type: Zip) {
    from configurations.testCompile.allArtifacts.files
    from configurations.testCompile
    exclude { details -> details.file.name.contains('servlet-api') }
    exclude { details -> details.file.name.contains('el-api') }
    exclude { details -> details.file.name.contains('jsp-api') }
    exclude { it.file in configurations.providedCompile.files }
    archiveName "${rootProjectName}-runtime-dependencies_full.zip"
    doLast {
        ant.copy(toDir: "$buildDir/libs/") {
            fileset(file: "$buildDir/distributions/${rootProjectName}-runtime-dependencies_full.zip")
        }
    }
}
This worked fine until I migrated to Gradle 2.0. If I leave the code as it was, the task is executed at the beginning and nothing happens at all. If I append << to the task and make it depend on my war build task, it claims to be up-to-date at the end of the war build, but nothing has happened.
One of my problems seems to be that the fileset to be copied is not created at all.
What can I do to get that stuff working again?
The task won't be executed at the beginning, but calling .files resolves the configurations too early, at configuration time. The first from line needs to go: it is redundant and also calls .files when it shouldn't. The doLast block is suspicious and should probably be turned into a separate Copy task. Instead of the second from and the last exclude, try from(configurations.testCompile - configurations.providedCompile).
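A minimal sketch of how that might look, assuming the war plugin (which provides providedCompile) is applied and keeping the original names; copyDepsZip is a name I made up:

task zipDeps(type: Zip) {
    // subtracting providedCompile replaces the original's last exclude
    from(configurations.testCompile - configurations.providedCompile)
    exclude { details -> details.file.name.contains('servlet-api') }
    exclude { details -> details.file.name.contains('el-api') }
    exclude { details -> details.file.name.contains('jsp-api') }
    archiveName "${rootProjectName}-runtime-dependencies_full.zip"
}

// separate Copy task replacing the ant.copy inside doLast
task copyDepsZip(type: Copy) {
    from zipDeps  // uses the zip as input and adds the task dependency
    into "$buildDir/libs"
}

Passing the zipDeps task itself to from wires up the dependency, so running copyDepsZip builds the zip first.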
Related
I have a Kotlin project in Gradle with 3 main folders: myorg.folder1, myorg.folder2, and myorg.folder3. Due to project constraints I cannot change this structure. As part of the project, I would like to have two different jar distributions: One containing all 3 folders, and one that only contains folder1 and folder2. The project has been set up so that these folders can operate independently of folder3 without errors. How can I do this?
In order to do this (following broot's advice, thanks), I found that I needed to use a sourceSet to make this work:
sourceSets {
    noFolder3 {
        kotlin {
            exclude "myorg/folder3/**"
        }
    }
}
Then, I can build with all 3 folders using build, and without folder3 using build noFolder3.
One extra note is that since I was using Intellij IDEA, all of my code was stored in the directory src/main/kotlin, so I also added the following line inside the kotlin {} block:
srcDirs += "src/main/kotlin"
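Putting it together, a sketch of the full block, plus a hypothetical Jar task for the folder3-free distribution (the noFolder3Jar task and its archive name are my own addition, not part of broot's advice):

sourceSets {
    noFolder3 {
        kotlin {
            // IntelliJ IDEA layout: sources live under src/main/kotlin
            srcDirs += "src/main/kotlin"
            exclude "myorg/folder3/**"
        }
    }
}

// hypothetical task that packages the classes compiled without folder3
task noFolder3Jar(type: Jar) {
    archiveBaseName = "myproject-noFolder3" // placeholder name
    from sourceSets.noFolder3.output
}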
I'm trying to include a single source file for the Main-Class of a jar -- actually I have a toplevel directory of such files, demo/, but I don't want them all in a jar. I want separate jars, each using only one of these.
This seems like sort of an anti-pattern in gradle, as the fundamental mechanism infers or prefers that I should instead place each in a distinct sourceSet. Ugh.
A casual reading of the docs implies Jar.from() might be useful this way: "Specifies the source files or directories..."
As it turns out, "source" is perhaps a bit of a misnomer. Here's an example, a typical kotlin fat jar with the added from("demo/LockingBufferDemo.kt"):
val jar by tasks.getting(Jar::class) {
    manifest { attributes["Main-Class"] = "LockingBufferDemoKt" }
    from(sourceSets.main.get().output)
    from("demo/LockingBufferDemo.kt")
    dependsOn(configurations.runtimeClasspath)
    from({
        configurations.runtimeClasspath.get()
            .filter { it.name.endsWith("jar") }
            .map { zipTree(it) }
    })
}
Forgive my naivety: Guess what does not end up in the jar? LockingBufferDemo.class. Guess what does? LockingBufferDemo.kt. In other words, this is treated more like a resource, not a source, and what would have been the simplest answer is a dead end.
Another way to approach this would be to add the demo directory as an independent sourceSet and then use from(sourceSets["demo"].get()), except I can't find a way to complete that; according to IntelliJ, get() returns a rather opaque "Provider" which I can't find mentioned in the actual javadoc (1, 2), and I really feel like I'm heading down the garden path at this point, with the woods rapidly growing darker around me.
This should not be this complicated.
How can I add a single file (or class derived from such) into a jar in gradle without having to put it alone in a directory and create a sourceSet for every such directory?
Regarding your explanations at the start of your post, you should consider creating multiple tasks of type Jar on your own, as every task of type Jar will only create a single JAR file, and you "want separate jars". I do not think you should use different source sets, as all of the files are Kotlin source files in the end and are processed in the same way (compilation, tests, docs, ...). Multiple source sets would complicate this common pipeline.
"Specifies the source files or directories..." As it turns out, "source" is perhaps a bit of a misnomer.
Well, the documentation does not stop there, but it says "for a copy and creates a child CopySpec". So it is not the source as in source code, but the source of a copy operation. In Gradle, tasks that create an archive (ZIP, JAR) share their API with tasks that copy files, as the creation of an archive can be seen as copying files from their source location to their target location (inside the archive).
So the from method can be used to specify the files that are copied / archived. It takes not only a sourcePath parameter, but also a closure or action for configuration. Using this second parameter, you can narrow your source files or directories down to the one file you need, for example using the method include:
val jar by tasks.getting(Jar::class) {
    manifest { attributes["Main-Class"] = "LockingBufferDemoKt" }
    from(sourceSets.main.get().output) {
        include("**/LockingBufferDemo.class")
    }
    dependsOn(configurations.runtimeClasspath)
    from({
        configurations.runtimeClasspath.get()
            .filter { it.name.endsWith("jar") }
            .map { zipTree(it) }
    })
}
I migrated a 3rd-party tool's build.gradle configs, so it now uses Android Gradle Plugin 3.5.3 and Gradle 5.4.1.
The build goes smoothly, but when I try to make an .aab archive, things break because the toolchain expects the output .aab file to be named MyApplicationId.aab, while the new Gradle defaults to outputting MyApplicationId-release.aab, with a buildType suffix that wasn't there before.
I tried to search for a solution, but documentation about product flavors is mostly about adding suffixes. How do I prevent the default "-release" suffix from being added? There weren't any product flavor blocks in the toolchain's Gradle config files.
I realized that I had to create custom tasks after reading other questions and answers:
How to change the generated filename for App Bundles with Gradle?
Renaming applicationVariants.outputs' outputFileName does not work because those are for .apks.
I'm using Gradle 5.4.1 so my Copy task syntax reference is here.
I don't quite understand where the "app.aab" name string came from, so I defined my own aabFile name string to match my toolchain's output.
I don't care about the source file so it's not deleted by another delete task.
Also, my toolchain seems to remove unknown variables surrounded by "${}", so I had to work around ${buildDir} and ${flavor} by omitting the brackets and using concatenation for proper delimiting.
tasks.whenTaskAdded { task ->
    if (task.name.startsWith("bundle")) { // e.g. bundleRelease
        def renameTaskName = "rename${task.name.capitalize()}Aab" // renameBundleReleaseAab
        def flavorSuffix = task.name.substring("bundle".length()).uncapitalize() // "release"
        tasks.create(renameTaskName, Copy) {
            def path = "$buildDir/outputs/bundle/" + "$flavorSuffix/"
            def aabFile = "${android.defaultConfig.applicationId}-" + "$flavorSuffix" + ".aab"
            from(path) {
                include aabFile
                rename aabFile, "${android.defaultConfig.applicationId}.aab"
            }
            into path
        }
        task.finalizedBy(renameTaskName)
    }
}
As the original answer said: This will add more tasks than necessary, but those tasks will be skipped since they don't match any folder.
e.g.
Task :app:renameBundleReleaseResourcesAab NO-SOURCE
In Gradle, I can run a single test from the command line as follows:
gradle -Dtest.single=VeryCriticalTestX test
VeryCriticalTestX is frequently executed alone, and I'd like to provide a more readable and flexible API to my fellow developers. Ideally, they would only need to run
gradle testCritical
without worrying about the test's name. This would also allow me to change the name over time without breaking Jenkins builds.
How do I achieve this?
Gradle's Test tasks can be configured to include only tests matching a given name pattern. You can create a new task testCritical as follows:
task testCritical(type: Test) {
    group = 'verification'
    description = 'Runs a very critical test'
    outputs.upToDateWhen { false }
    include('**/VeryCriticalTestX.class')
}
With this, renaming VeryCriticalTestX to something else doesn't break other people's commands or Jenkins jobs. However, there is a risk that someone accidentally disables this task by renaming VeryCriticalTestX without adapting the task configuration. This can be prevented with the following TaskExecutionListener:
// verify that testCritical is not skipped unexpectedly due to a renamed class file
// we detect this using Gradle's NO-SOURCE TaskState
gradle.addListener(new TaskExecutionListener() {
    void beforeExecute(Task task) {}

    void afterExecute(Task task, TaskState state) {
        if (testCritical.equals(task) && state.getNoSource()) {
            throw new GradleException("testCritical did not run because it couldn't find VeryCriticalTestX")
        }
    }
})
I have a Maven project that uses the jaxb2-maven-plugin to compile some xsd files. It uses the staleFile to determine whether or not any of the referenced schemaFiles have been changed. Unfortunately, the xsd files in question use <xs:include schemaLocation="../relative/path.xsd"/> tags to include other schema files that are not listed in the schemaFile argument so the staleFile calculation in the plugin doesn't accurately detect when things need to be actually recompiled. This winds up breaking incremental builds as the included schemas evolve.
Obviously, one solution would be to list all the recursively referenced files in the execution's schemaFile. However, there are going to be cases where developers don't do this and break the build. I'd like instead to automate the generation of this list in some way.
One approach that comes to mind would be to somehow parse the top-level XSD files and then either set a property or output a file that I can then pass into the schemaFile parameter or schemaFiles parameter. The Groovy gmaven plugin seems like it might be a natural way to embed that functionality right into the POM. But I'm not familiar enough with Groovy to get started.
Can anyone provide some sample code? Or offer an alternative implementation/solution?
Thanks!
Not sure how you'd integrate it into your Maven build -- Maven isn't really my thing :-(
However, if you have the path to an xsd file, you should be able to get the files it references by doing something like:
def rootXsd = new File( 'path/to/xsd' )
def refs = new XmlSlurper().parse( rootXsd ).depthFirst().findAll { it.name()=='include' }*.@schemaLocation*.text()
println "$rootXsd references $refs"
So refs is a list of Strings which should be the paths to the included xsds
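Not part of Tim's answer, but here is a self-contained sanity check of the XmlSlurper approach, using made-up schema content and the spread-operator form that the refined version below also relies on:

// write a hypothetical root.xsd to a temp file just to exercise the parsing
def rootXsd = File.createTempFile('root', '.xsd')
rootXsd.text = '''<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
    <xs:include schemaLocation="../common/types.xsd"/>
    <xs:include schemaLocation="elements.xsd"/>
</xs:schema>'''

def refs = new XmlSlurper().parse(rootXsd)
        .depthFirst()
        .findAll { it.name() == 'include' }*.@schemaLocation*.text()

assert refs == ['../common/types.xsd', 'elements.xsd']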
Based on tim_yates's answer, the following is a workable solution, which you may have to customize based on how you are configuring the jaxb2 plugin.
Configure a gmaven-plugin execution early in the lifecycle (e.g., in the initialize phase) that runs with the following configuration...
Start with a function to collect File objects of referenced schemas (this is a refinement of Tim's answer):
def findRefs = { f ->
    def relPaths = new XmlSlurper().parse(f).depthFirst().findAll {
        it.name() == 'include'
    }*.@schemaLocation*.text()
    relPaths.collect { new File(f.absoluteFile.parent + "/" + it).canonicalFile }
}
Wrap that in a function that iterates on the results until all children are found:
def recursiveFindRefs = { schemaFiles ->
    def outputs = [] as Set
    def inputs = schemaFiles as Queue
    // Breadth-first examine all refs in all schema files
    while (xsd = inputs.poll()) {
        outputs << xsd
        findRefs(xsd).each {
            if (!outputs.contains(it)) inputs.add(it)
        }
    }
    outputs
}
The real magic then comes in when you parse the Maven project to determine what to do.
First, find the JAXB plugin:
jaxb = project.build.plugins.find { it.artifactId == 'jaxb2-maven-plugin' }
Then, parse each execution of that plugin (if you have multiple). The code assumes that each execution sets schemaDirectory, schemaFiles and staleFile (i.e., does not use the defaults!) and that you are not using schemaListFileName:
jaxb.executions.each { ex ->
    log.info("Processing jaxb execution $ex")
    // Extract the schema locations; the configuration is an Xpp3Dom
    ex.configuration.children.each { conf ->
        switch (conf.name) {
            case "schemaDirectory":
                schemaDirectory = conf.value
                break
            case "schemaFiles":
                schemaFiles = conf.value.split(/,\s*/)
                break
            case "staleFile":
                staleFile = conf.value
                break
        }
    }
Finally, we can open the schemaFiles and parse them using the functions we've defined earlier:
    def schemaHandles = schemaFiles.collect { new File("${project.basedir}/${schemaDirectory}", it) }
    def allSchemaHandles = recursiveFindRefs(schemaHandles)
...and compare their last modified times against the stale file's modification time, unlinking the stale file if necessary.
    def maxLastModified = allSchemaHandles.collect {
        it.lastModified()
    }.max()
    def staleHandle = new File(staleFile)
    if (staleHandle.lastModified() < maxLastModified) {
        log.info("  New schemas detected; unlinking $staleFile.")
        staleHandle.delete()
    }
}