I've learned that Dojo's loader can load non-AMD scripts, which is awesome. We have a script provided by a vendor that requires certain global variables to be set before the script can be loaded. If, in my AMD module, I set the global variables (I know, yuck, right?) and then call require(["/vendor/script.js"]), everything works great.
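For illustration, roughly what I'm doing (vendorConfig is a stand-in for whatever globals the vendor script actually expects):

// myModule.js - a sketch of the approach described above
define(["require"], function (require) {
    window.vendorConfig = { apiKey: "..." }; // set the globals first
    require(["/vendor/script.js"], function () {
        // the vendor script has now run with the globals in place
    });
});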
The one thing I'm having trouble finding an answer to is whether the build process will see this require call, add "/vendor/script.js" to the dependency list, and build it. Which, if I understand correctly, would cause the script to run before my module has had an opportunity to set the required global variables.
Can anyone tell me?
The default behavior is that the build will traverse the dependencies and include them in the build. However, you have options:
1) in the profile, you can specify excludes. These will not be included into the build.
// profile.js used by the build
layers: {
    "myApp/myApp": {
        include: [...],
        exclude: ["vendor/script.js"]
    }
}
2) do not include the dependency in the define statement and use require later in the module. dojo/fx does this with dojo/fx/toggler
// myCustomWidget.js
define([], function() {
    // the require call won't be traced by the build,
    // so vendorScript won't be pulled into the build
    require(["vendor/script"], function(vendorScript) {
    });
});
What I would do is create another script that sets your globals, and include that before including your compiled .js file.
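Something like this, for example (the file names and the vendorConfig global here are made up):

<!-- index.html -->
<script src="globals.js"></script>            <!-- sets window.vendorConfig, etc. -->
<script src="dist/myApp/myApp.js"></script>   <!-- the compiled layer -->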
If the values you set the globals to require modules themselves, though, I guess you could edit the compiled .js file after it's been built.
I'm trying to define libraries in a common location. So in an our.libraries.gradle.kts script in a shared build plugin, I have this:
inner class Libraries {
    val junit get() = ...
    val junitParams get() = ...
}

val libraries = Libraries()
project.extra["libraries"] = libraries
In one of the Groovy build scripts elsewhere in the project, this is referred to like this:
allprojects {
    dependencies {
        testImplementation libraries.junit
    }
}
This works fine.
So I try converting that script to Kotlin:
allprojects {
    dependencies {
        "testImplementation"(libraries.junit)
    }
}
And now this fails, because it can't see the libraries property on the project, so I try explicitly pulling it out at the start:
val libraries: Libraries by project.extra

allprojects {
    dependencies {
        "testImplementation"(libraries.junit)
    }
}
But this doesn't work either, because the script can't find the Libraries class.
I also tried putting Libraries in a Libraries.kt file, but then I can't seem to call methods like exclude using named parameters, because for whatever reason Gradle doesn't support using the DSL when it's moved to a top-level class.
This is sort of similar to this question, but in the case of simple types everything works fine. Which is to say, I can put the libraries in as a Map, but then any time I want to reference one, I have to write this:
"testImplementation"(libraries["junit"]!!)
This is obviously ugly, so I have been trying to avoid it.
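(For completeness, the Map variant I mean is roughly this on the producing side; the coordinates are just examples:)

// our.libraries.gradle.kts
project.extra["libraries"] = mapOf(
    "junit" to "junit:junit:4.13.2",
    "junitParams" to "pl.pragmatists:JUnitParams:1.1.1"
)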
So I'm stumped again.
This is part of a long saga of trying to get this to work in many different ways, so the essential question is still the same: how can we define all our libraries in one location, and then refer to them in a type-safe way from other build scripts?
Recently, Gradle added shared dependency declarations via a TOML file (version catalogs), but that method only supports version numbers, whereas our library definitions also include the excluded dependencies.
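For reference, a catalog entry looks roughly like this (coordinates are examples); there is no field for excludes:

# gradle/libs.versions.toml
[libraries]
junit = { module = "junit:junit", version = "4.13.2" }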
It was hard to put a completely self-contained example in the question because multiple files are involved, so here's a test repo.
Can I write some function like kotlin("jvm") in my custom plugin?
plugins {
    java
    kotlin("jvm") version "1.3.71"
}
I want to write a function myplugin("foo") in my custom plugin and then use it like this:
plugins {
    java
    kotlin("jvm") version "1.3.71"
    custom.plugin
    myplugin("foo")
}
How can I do it?
I think the plugins block is some kind of macro expression. It is parsed and precompiled using a very limited context. Probably, the magic happens somewhere in kotlin-dsl. This is likely the only way to get static accessors and extension functions from plugins to work in Kotlin. I've never seen this process mentioned in Gradle's documentation, but let me explain my thinking. Probably, some smart guys from Gradle will correct me.
Let's take a look at a third-party plugin, like Liquibase. It allows you to write something like this in your build.gradle.kts:
liquibase {
    activities {
        register("name") {
            // Configure the activity here
        }
    }
}
Think about it: in a statically compiled language like Kotlin, in order for this syntax to work, there has to be an extension named liquibase on the Project type (as that is the type of this in every build.gradle.kts) available on the classpath of the Gradle VM that executes the build script.
Indeed, if you click on it, you'll see something like:
fun org.gradle.api.Project.`liquibase`(configure: org.liquibase.gradle.LiquibaseExtension.() -> Unit): Unit =
    (this as org.gradle.api.plugins.ExtensionAware).extensions.configure("liquibase", configure)
But take a look at the file where it is defined. In my case it is ~/.gradle/caches/6.3/gradle-kotlin-dsl-accessors/cmljl3ridzazieb8fzn553oa8/cache/src/org/gradle/kotlin/dsl/Accessors39qcxru7gldpadn6lvh8lqs7b.kt. It is definitely an auto-generated file. A few levels up in the file tree (at ~/.gradle/caches/6.3/gradle-kotlin-dsl-accessors/ in my case) there are dozens of similar directories, I guess one for every plugin/version I've ever used with Gradle 6.3. Here is another one, for the Detekt plugin:
fun org.gradle.api.Project.`detekt`(configure: io.gitlab.arturbosch.detekt.extensions.DetektExtension.() -> Unit): Unit =
    (this as org.gradle.api.plugins.ExtensionAware).extensions.configure("detekt", configure)
So, we have a bunch of .kt files defining all those extensions for the different plugins applied to the project. Those files are obviously pre-generated and precompiled, and their content is available in build.gradle.kts. Indeed, you can find classes directories beside those sources.
The sources are generated based on the content of the applied plugins. It is probably a tricky task involving some magic, reflection, and introspection. Sometimes this magic doesn't work (due to Groovy's chaotic nature), and then you need to use some crappy DSL from this package.
How are they generated? I see no other way but to:
1) Parse the build.gradle.kts with an embedded Kotlin compiler/lexer
2) Extract all the plugins blocks
3) Compile them, probably against some mocks (remember that Project is not yet available: we're not executing the build.gradle.kts itself yet!)
4) Resolve the declared plugins from the Gradle Plugin repository (with some nuances coming from settings.gradle.kts)
5) Introspect the plugins' artifacts
6) Generate the sources
7) Compile the sources
8) Add the resulting classes to the script's classpath
And here is the gotcha: there is a very limited context (classpath, classes, methods, call it what you will) available when compiling the plugins block. Actually, no plugins have been applied yet! Because, you know, you're parsing the very block that applies plugins. Chickens, eggs, and their problems, huh…
So, and we're getting closer to the answer to your question: to provide a custom DSL in the plugins block, you need to modify that classpath. It's not the classpath of your build.gradle.kts; it's the classpath of the VM that parses build.gradle.kts. Basically, it's Gradle's own classpath: all the classes bundled in a Gradle distribution.
Thus, probably the only way to provide really custom DSLs in the plugins block is to create a custom Gradle distribution.
EDIT:
Indeed, I totally forgot to test buildSrc. I've created a file PluginExtensions.kt in it, with the following content:
inline val org.gradle.plugin.use.PluginDependenciesSpec.`jawa`: org.gradle.plugin.use.PluginDependencySpec
    get() = id("org.gradle.war") // Randomly picked

inline fun org.gradle.plugin.use.PluginDependenciesSpec.`jawa`(): org.gradle.plugin.use.PluginDependencySpec {
    return id("org.gradle.cunit") // Randomly picked
}
And it seems to be working:
plugins {
    jawa
    jawa()
}
However, this only works when PluginExtensions.kt is in the default package. Whenever I put it into a sub-package, the extensions are not recognized, even with an import.
Magic!
The kotlin function is just a simple extension function wrapping the traditional id method, not hard to define:
fun PluginDependenciesSpec.kotlin(module: String): PluginDependencySpec =
    id("org.jetbrains.kotlin.$module")
However, this extension function is part of the standard Gradle Kotlin DSL API, which means it's available without any plugin. If you wanted to make a custom function like this available, you would need a plugin. A plugin to load your plugin. Not very practical.
I also tried using the buildSrc module to make an extension function like the one above. But it turns out that buildSrc definitions aren't even available in the plugins DSL block, which has a very constrained syntax. That wouldn't have been very practical anyway: you would have needed a buildSrc folder in every project where you wanted to use the extension.
I'm not sure if this is possible at all. Try asking on https://discuss.gradle.org/.
In a multiproject build, some projects depend on others, and the latter provide not only compile/runtime libraries, but also useful test libs (such as, for instance, "mock" implementations of the components they provide).
I know of a couple of ways to make the test sources of one project available to another. They are discussed, for instance as answers to this question.
What I am looking for is some magical way to make this happen automatically, so that if a subproject adds a dependency on another subproject, it automatically gets that project's test sources added to its testCompile config.
I tried the naive approach:
configure(rootProject.subprojects.findAll {
    it.configurations.getByName("compile")
        .dependencies.any {
            it.name == project.name
        }
}) {
    dependencies {
        testCompile project.sourceSets.test.output
    }
}
But this does not work, presumably because this code is evaluated "at the wrong stage" (or whatever the correct lingo is), and the other projects don't "know" yet that they depend on this one.
I also tried putting (an equivalent of) this at the end of the root build file (hoping that everything else would already be configured by then), but that did not work either.
Is there a way to do what I am looking for here?
I also tried putting (an equivalent of) this at the end of root build file
The order of declaration in a build.gradle does not matter. Gradle's build lifecycle has distinct configuration and execution phases. During configuration, the various build.gradle files are read and a graph of execution order is created based on implicit and explicit dependencies among the various tasks.
Normally the root build.gradle is evaluated first, but it is possible to force the child projects to be evaluated first using evaluationDependsOnChildren().
Instead of relying on position in the build scripts, you can listen for various events of the build lifecycle to run something at certain points. In your case, you want to run your snippet once all projects have been evaluated, using an afterEvaluate() block; a minimal skeleton follows.
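For example (untested; gradle.projectsEvaluated is the root-level relative of afterEvaluate that fires once every project has been evaluated):

// root build.gradle
gradle.projectsEvaluated {
    // every project's dependency declarations are known at this point,
    // so cross-project logic like the snippet in the question can run here
}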
Some possible alternatives to your overall approach:
Add the testCompile dependency from the downstream project instead of injecting it from the upstream project. Add this to the root build.gradle:
subprojects { project ->
    project.afterEvaluate {
        project.configurations.getByName("compile").dependencies
            .withType(ProjectDependency)
            .each { dep ->
                if (dep.name == "foo") { // could be expanded to: dep.name in someCollection
                    // inherit testCompile from the upstream project's test output
                    project.dependencies.add("testCompile",
                        dep.dependencyProject.sourceSets.test.output)
                }
            }
    }
}
(pseudo-code, haven't tested)
Separate out the shareable test/mock components into a dedicated mock project that you can add as a testCompile dependency to both upstream and downstream projects.
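That is, something along these lines (the project names are made up):

// settings.gradle
include ':foo', ':foo-mocks', ':bar'

// bar/build.gradle - bar uses foo and its shared mocks
dependencies {
    compile project(':foo')
    testCompile project(':foo-mocks')
}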
I am in the process of transitioning my 'regular' Backbone projects into a combination of Backbone and RequireJS. While this process works pretty flawless, I still have one question.
Previously I declared a global namespace for my app, to which I then bound all my models, views, and collections. This is a tip I actually got from the Backbone TodoMVC project.
So for example, the initialize method of a view could look like this:
initialize: function () {
    app.employees = new app.EmployeeCollection();
    app.employees.fetch();
}
This works because at the beginning of every file, I've done this:
var app = app || {};
Now when defining my files as AMD modules, the app namespace doesn't exist anymore, which means everything is much more encapsulated:
initialize: function () {
    var employees = new EmployeeCollection();
    employees.fetch();
}
The EmployeeCollection is loaded with RequireJS:
var EmployeeCollection = require('collections/EmployeeCollection');
Unfortunately I am still very new to Backbone and MVC in general, so I am unsure if this is a good or a bad thing.
What impact will this have on my project – is it okay to use an app namespace like I did previously or does this break any MVC/OOP 'rule'? Are there any Backbone specific consequences I need to be aware of?
Yes, loading the EmployeeCollection via requirejs is a good thing. This explicitly lists each module's dependencies and lets requirejs help you with loading modules in the proper order.
The app namespace approach and the requirejs approach are both valid. Backbone won't care which approach you take, since with either one you have the necessary View/Collection/Model constructor available to use. Personally I like the requirejs benefits I mentioned above, but it's a personal preference you'll have to decide on.
However, you shouldn't use requirejs and an all-knowing app namespace together. If you're committed to requirejs, then you should use the app namespace only sparingly, for top-level data that most of your app will need, rather than attaching all of your requirejs modules to it.
For example, you might use it for a global UserModel that contains information about the current user. To do this you'd create an app object as a requirejs module, just like you did with your EmployeeCollection, and then whatever module constructs the UserModel would require 'app' and do a simple assignment: app.user = user.
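A sketch of that (the module paths and UserModel are placeholders):

// app.js - the shared namespace, itself a requirejs module
define([], function () {
    return {};
});

// wherever the current user gets constructed
define(['app', 'models/UserModel'], function (app, UserModel) {
    app.user = new UserModel();
});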
I said do this sparingly because using a global app namespace for all your modules would sacrifice much of the benefit of requirejs and would cause you some sequencing pain. Namely:
You can no longer see the actual dependencies for each module declaratively and visualize easily how all your modules fit together. Instead of having the initialize function of your view (or whatever that is) require in 'collections/EmployeeCollection' you'd be requiring 'app'; not a lot of context there.
Requirejs will take care of loading required modules first before allowing your defining function to run. But if everything just requires 'app' then requirejs will only ensure 'app' is defined first and you're on your own for everything else. If app.Bar requires app.Foo, you have to do something to make sure app.Foo gets loaded and defined first.
On a similar note, if requirejs can't figure out all your dependencies because everything just requires 'app', then requirejs's JavaScript concatenator and optimizer tool (called r.js) will either be useless to you or require a lot of maintenance, since you'd have to add all your modules by hand to a list for it to compile.
If you decide to use requirejs, embrace what it can do for you and just require in the modules you want instead of relying heavily on a global namespace. But there's not a right or wrong way choosing between these two approaches; each is used by lots of smart people.
I'm using SpineJS (which exports a CommonJS module), and it needs to be available globally because I use it everywhere, but it seems like I have to do Spine = require('spine') in every file that uses Spine for things to work.
Is there any way to define Spine once to make it globally available?
PS: I'm using Spine as an example, but in general I'm wondering how to do this with any other library.
Writing Spine = require('spine') in each file is the right way to do it.
Yet, there are several possibilities by using the global or window object (browserify sets the global object to window, which is the global namespace):
in spine.js: global.Spine = module.exports
in any other .js file bundled by browserify: global.Spine = require('spine')
in a script tag or an .js file referenced by the .html file, after the spine.js file: window.Spine = require('spine')
First of all, for your example David is correct. Include all dependencies in every module you need them in. It's very verbose, but there is no compile-time magic going on, which avoids all sorts of anti-patterns and potential future problems.
The real answer.
This isn't always practical. Browserify accepts an option called insertGlobalVars. At build time, each streamed file is scanned for identifiers matching the provided key names, and the module is wrapped in an IIFE whose arguments resolve each matching identifier that is not assigned within the module. This all happens before the dependency tree is finalized, which allows you to use require to resolve a dependency.
TLDR
Use the insertGlobalVars option in Browserify.
var browserify = require('browserify');

// 'app.js' stands in for your entry file
browserify('app.js', {
    insertGlobalVars: {
        spine: function(file, dir) {
            return 'require("spine")';
        }
    }
}).bundle().pipe(process.stdout);
For every file scanned, if an identifier spine exists that is not assigned, it resolves to require("spine").