Swagger-codegen-maven: Change where files are generated

I'm currently using swagger-codegen through the maven plugin: https://github.com/swagger-api/swagger-codegen/tree/master/modules/swagger-codegen-maven-plugin.
I've set the place where I want my api files to be generated with the <apiPackage></apiPackage> field in my pom.xml.
But when I generate my APIs, they get put in src/main/java/<apiPackage>. I would like to know where src/main/java is declared as part of the path, and whether it can be changed.

src/main/java is a Maven convention.
This folder structure is hard-coded in Swagger Codegen's code and cannot be changed.

You can extend AbstractJavaCodegen and override the value of the sourceFolder variable in the constructor. This extended class can then be passed to codegen as the --lang parameter.
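A minimal sketch of such a subclass, in Kotlin (the class name, folder and generator name are illustrative; extending the concrete JavaClientCodegen rather than AbstractJavaCodegen itself saves you from implementing the remaining abstract methods, and as far as I know the Maven plugin's <language> element also accepts a fully qualified class name):

import io.swagger.codegen.languages.JavaClientCodegen

// Hypothetical subclass: only the source output folder and the generator name change.
class CustomFolderCodegen : JavaClientCodegen() {
    init {
        // AbstractJavaCodegen declares this protected field; its default is "src/main/java"
        sourceFolder = "src"
    }

    // The name used to select this generator via --lang
    override fun getName(): String = "custom-folder-java"
}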

Related

How does import find the file path in Kotlin?

I've read the Kotlin docs (https://kotlinlang.org/docs/packages.html), and I understood that, when importing a package, the package name does not need to match the folder path that stores the package (unlike in Java).
I don't have issues creating a package and importing it into other classes.
What I'd like to understand is how the compiler can find the file to import.
For example:
if a file imports animals.mammals.cats.*:
import animals.mammals.cats.*
...
the entities to import do not need to be stored in the file /animals/mammals/cats.kt, as long as the package name is "animals.mammals.cats":
package animals.mammals.cats
...
This Kotlin file could be stored in src/animals/kittens for example.
In other words, how can import locate the file(s) to load, since the package name does not help?
Thanks!
TL;DR: the compiler is given the paths to all files to compile and to all dependencies, and therefore knows about all available packages and the declarations they contain.
First, note that the import statement itself is not really the important part. It's just a convenient syntax to avoid having to specify the package everywhere throughout the file. Technically, you don't need to import anything to be able to use declarations from outside the current file: you can just use their fully qualified name (a.k.a. FQN), which is the package name + . + the name of the declaration.
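For example, with a hypothetical Cat class, the following two files compile together even though no import appears anywhere:

// cats.kt: can be stored anywhere on disk, e.g. src/animals/kittens/cats.kt
package animals.mammals.cats

class Cat

// main.kt: uses the class through its FQN, no import needed
fun main() {
    val cat = animals.mammals.cats.Cat()  // FQN = package name + "." + declaration name
}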
Now, on to your question. When you run the compiler, you provide the paths to the complete set of files to be compiled at the same time: you compile a module, not a single file. Therefore, it has access to all declarations in all those files and maintains its own data structures about the available classes and top-level functions, and all symbols in general. So it can store and find declarations using just their FQN. (DISCLAIMER: I'm no expert and I don't actually know how it's done internally, but I'm just guessing that conceptually it's like storing a big mapping between FQN and the information about the corresponding declaration.)
If the declaration you use is not in the set of files being compiled, it must be in one of your dependencies. You can tell the compiler about the available dependencies by specifying the list of jars containing their already compiled classes. This list of all classes available at compile time is called the compile classpath. This is why the tool you use to build your project (for instance, Gradle or your IDE) needs to know about those dependencies, so it can put their declarations on the compile classpath when calling the compiler for you. Then, just like declarations that are being compiled, the ones from the compile classpath can be easily looked up by the compiler (the path has been given to the compiler as an argument).
Now, when you actually run your compiled program, at least on the JVM, the classes required must be placed on the runtime classpath - a set of classes given to the java program. Finding those declarations while the program runs is done by classloaders. There are multiple classloaders, organized in a hierarchy, but there is no need to go into details here. Basically, each time a class is used for the first time while your program is running, one classloader will be asked to load that class into memory. There are different implementations of classloaders, but one of the most common is the URLClassLoader which is given the URLs of some jars that contain classes, and knows how to read classes from these jars into memory on-demand.
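As a rough illustration (the jar path and class name are made up), this is what such an on-demand lookup looks like when driven by hand:

import java.io.File
import java.net.URLClassLoader

fun main() {
    // The loader maps FQNs to .class entries inside the jar, so the original
    // source folder layout plays no role at runtime.
    val loader = URLClassLoader(arrayOf(File("libs/animals.jar").toURI().toURL()))
    val catClass = loader.loadClass("animals.mammals.cats.Cat")
    println(catClass.name)
}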

Gradle: custom function in the plugins {} block

Can I write in my custom plugin some function like kotlin("jvm")?
plugins {
java
kotlin("jvm") version "1.3.71"
}
I want to write a function myplugin("foo") in my custom plugin and then use it like
plugins {
java
kotlin("jvm") version "1.3.71"
custom.plugin
myplugin("foo")
}
How can I do it?
I think the plugins block is some kind of macro expression. It is parsed and precompiled using a very limited context. Probably, the magic happens somewhere in kotlin-dsl. This is probably the only way to get static accessors and extension functions from plugins to work in Kotlin. I've never seen this process mentioned in Gradle's documentation, but let me explain my thinking. Probably, some smart guys from Gradle will correct me.
Let's take a look at some third-party plugin, like Liquibase. It allows you to write something like this in your build.gradle.kts:
liquibase {
activities {
register("name") {
// Configure the activity here
}
}
}
Think about it: in a statically compiled language like Kotlin, in order for this syntax to work, there has to be an extension named liquibase on the Project type (as that is the type of this in every build.gradle.kts), available on the classpath of the Gradle VM that executes the build script.
Indeed, if you click on it, you'll see something like:
fun org.gradle.api.Project.`liquibase`(configure: org.liquibase.gradle.LiquibaseExtension.() -> Unit): Unit =
(this as org.gradle.api.plugins.ExtensionAware).extensions.configure("liquibase", configure)
But take a look at the file where it is defined. In my case it is ~/.gradle/caches/6.3/gradle-kotlin-dsl-accessors/cmljl3ridzazieb8fzn553oa8/cache/src/org/gradle/kotlin/dsl/Accessors39qcxru7gldpadn6lvh8lqs7b.kt. It is definitely an auto-generated file. A few levels up in the file tree (at ~/.gradle/caches/6.3/gradle-kotlin-dsl-accessors/ in my case) there are dozens of similar directories. I guess, one for every plugin/version I've ever used with Gradle 6.3. Here is another one, for the Detekt plugin:
fun org.gradle.api.Project.`detekt`(configure: io.gitlab.arturbosch.detekt.extensions.DetektExtension.() -> Unit): Unit =
(this as org.gradle.api.plugins.ExtensionAware).extensions.configure("detekt", configure)
So, we have a bunch of .kt files defining all those extensions for different plugins applied to the project. Those files are obviously pre-generated and precompiled, and their content is available in build.gradle.kts. Indeed, you can find classes directories beside those sources.
The sources are generated based on the content of the applied plugins. It is probably a tricky task that involves some magic, reflection and introspection. Sometimes this magic doesn't work (due to Groovy's chaotic nature), and then you need to use some crappy DSL from this package.
How are they generated? I see no other way but to:
Parse the build.gradle.kts with an embedded Kotlin compiler / lexer
Extract all the plugins sections
Compile them, probably against some mocks (remember that Project is not yet available: we're not executing the build.gradle.kts itself yet!)
Resolve the declared plugins from the Gradle Plugin repository (with some nuances coming from settings.gradle.kts)
Introspect the plugins' artifacts
Generate the sources
Compile the sources
Add the resulting classes to the script's classpath
And here is the gotcha: there is a very limited context (classpath, classes, methods — call it whatever) available when compiling the plugins block. Actually, no plugins are yet applied! Because, you know, you're parsing the block that applies plugins. Chickens, eggs, and their problems, huh…
So, and we're getting closer to the answer on your question, to provide custom DSL in plugins block, you need to modify that classpath. It's not a classpath of your build.gradle.kts, it's the classpath of the VM that parses build.gradle.kts. Basically, it's Gradle's own classpath — all the classes bundled in a Gradle distribution.
So, probably the only way to provide really custom DSLs in plugins block is to create a custom Gradle distribution.
EDIT:
Indeed, I totally forgot to test buildSrc. I've created a file PluginExtensions.kt in it, with the content:
inline val org.gradle.plugin.use.PluginDependenciesSpec.`jawa`: org.gradle.plugin.use.PluginDependencySpec
get() = id("org.gradle.war") // Randomly picked
inline fun org.gradle.plugin.use.PluginDependenciesSpec.`jawa`(): org.gradle.plugin.use.PluginDependencySpec {
return id("org.gradle.cunit") // Randomly picked
}
And it seems to be working:
plugins {
jawa
jawa()
}
However, this only works when PluginExtensions.kt is in the default package. Whenever I put it into a sub-package, the extensions are not recognized, even with an import. Magic!
The kotlin function is just a simple extension function wrapping the traditional id method, not hard to define:
fun PluginDependenciesSpec.kotlin(module: String): PluginDependencySpec =
id("org.jetbrains.kotlin.$module")
However, this extension function is part of the standard Gradle Kotlin DSL API, which means it's available without any plugin. If you want to make a custom function like this available, you would need a plugin. A plugin to load your plugin. Not very practical.
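For illustration, the analogous helper for the question's myplugin("foo") is just as easy to write (the plugin id here is made up); the hard part is not writing it but getting it onto the classpath that the plugins block is compiled against:

import org.gradle.plugin.use.PluginDependenciesSpec
import org.gradle.plugin.use.PluginDependencySpec

// Hypothetical: maps myplugin("foo") to the plugin id "com.example.myplugin.foo"
fun PluginDependenciesSpec.myplugin(module: String): PluginDependencySpec =
    id("com.example.myplugin.$module")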
I also tried using the buildSrc module to make an extension function like the one above. But it turns out that buildSrc definitions aren't even available in the plugins DSL block, which has a very constrained syntax. That wouldn't have been very practical anyway: you would have needed a buildSrc folder in every project where you wanted to use the extension.
I'm not sure if this is possible at all. Try asking on https://discuss.gradle.org/.

How can I suppress Dokka documentation for specific classes?

I'm trying to generate nice API docs for a library I'm working on, and I haven't found a way to keep Dokka from generating empty documentation pages for generated code like the R class in my package.
I'm already using
packageOptions {
prefix = "android"
suppress = true
}
to suppress documentation at a package level, but is there a way to prevent generation of documentation for specific classes in the package I do want to generate documentation for? Or build a whitelist of classes so Dokka only generates docs for those classes?
Or is there another doc generator for Kotlin that I should look into?
It looks like you can specify class names as well as prefixes in the packageOptions section. You just specify them the same as you would a prefix.
packageOptions {
prefix = "com.fqdn.MyPackage.R"
suppress = true
}
One option is to mark the desired classes as internal.
Doing so, the documentation will not be generated for the internal ones.
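For example (hypothetical class name), with Dokka's default visibility settings, which document only the public API:

// No documentation page is generated for this class, since it is not public API
internal class GeneratedResourceHolder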
It is also useful to configure Dokka in the Gradle file like this:
named("main"){
//sourceRoots.setFrom(file("src/main/java/com/cujo/sb/util"))
includeNonPublic.set(false)
skipEmptyPackages.set(true)
reportUndocumented.set(true)
skipDeprecated.set(false)
}

Accessing all system properties from a maven2 mojo

I've written a maven mojo that makes use of all of the properties defined for the project using an injected MavenProject object like this:
project.getProperties()
I then realized that property values specified at the command line, like -Dfoo=bar, are not reflected in my mojo; I only get the property's value as it's defined in the pom, like:
<foo>super</foo>
How can my mojo access the value of foo as it's defined from the command line versus the pom? Any ideas on this would be greatly appreciated.
Thanks!
The properties defined from the command line should be available through the System.getProperties() method.
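A small sketch of that lookup order (resolveFoo is a made-up helper, written in Kotlin to match the other examples here; the same two calls exist in Java):

import org.apache.maven.project.MavenProject

// -Dfoo=bar ends up in the JVM's system properties, while <foo>super</foo>
// lives in the project's properties, so check the system properties first.
fun resolveFoo(project: MavenProject): String? =
    System.getProperty("foo") ?: project.properties.getProperty("foo")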

Google Guice: Singleton with XML deserialization support?

In my project I need a class which contains the project configuration.
The configuration must be loaded from an XML file and must be a singleton.
In Guice there is a singleton scope. Now I have to "overwrite" the singleton with the deserialized configuration.
Is this somehow possible?
Important: please do note that Guice was originally created to get rid of all those huge and ugly XML files used by another DI library for managing dependencies. In general, when using Guice, you should be able to (almost) completely remove any XML from your project.
But if you must, perhaps because the XML file is generated by something outside your control, then consider these options:
Keep your whole configuration object: create a Provider for it and bind it in Singleton scope. But you'll have to perform the deserialization yourself.
Or, if your configuration is simply made of (name, value) pairs, you can use java.util.Properties, which can be loaded from an XML file, and then use Guice's Names.bindProperties() API in one of your Modules.
Then you can directly inject each single property by using @Inject @Named, as sketched below.
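A minimal sketch of the second option (the file name and property key are illustrative):

import com.google.inject.AbstractModule
import com.google.inject.Guice
import com.google.inject.Inject
import com.google.inject.name.Named
import com.google.inject.name.Names
import java.io.FileInputStream
import java.util.Properties

class ConfigModule : AbstractModule() {
    override fun configure() {
        val props = Properties()
        // loadFromXML reads the java.util.Properties XML format
        FileInputStream("config.xml").use { props.loadFromXML(it) }
        // Binds every (name, value) pair as a @Named String in this module
        Names.bindProperties(binder(), props)
    }
}

// Each property can then be injected individually
class Service @Inject constructor(@Named("database.url") val dbUrl: String)

fun main() {
    val injector = Guice.createInjector(ConfigModule())
    println(injector.getInstance(Service::class.java).dbUrl)
}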