IntelliJ GDSL: Define method with optional arguments - intellij-idea

I have a Jenkins pipeline shared library that specifies a global variable foo which provides two methods.
One of them has no arguments, the other has one optional argument:
/vars/foo.groovy
def getBarOne() {
    //...
}

def getBarTwo(String value = '') {
    //...
}
Now I want to provide an IntelliJ GDSL file which supports useful code completion for both of these methods.
(The GDSL provided by my Jenkins contains only the definition for the global variable, but not for its methods, so I'm trying to add that.)
pipeline.gdsl (by Jenkins)
//The global script scope
def ctx = context(scope: scriptScope())
contributor(ctx) {
    //...
    property(name: 'foo', type: 'org.jenkinsci.plugins.workflow.cps.global.UserDefinedGlobalVariable')
}
//..
pipeline.gdsl (pimped by me)
//The global script scope
def ctx = context(scope: scriptScope())
contributor(ctx) {
    //...
    property(name: 'foo', type: 'org.jenkinsci.plugins.workflow.cps.global.UserDefinedGlobalVariable')
}

def uservarCtx = context(ctype: 'org.jenkinsci.plugins.workflow.cps.global.UserDefinedGlobalVariable')
contributor(uservarCtx) {
    method name: 'getBarOne', type: 'java.lang.String', params: [:]
    method name: 'getBarTwo', params: [value: 'java.lang.String'], type: 'List<String>'
}
//..
So far so good.
However, code completion in my Jenkinsfile is not fully satisfactory:
For getBarOne() it suggests both .barOne and .getBarOne(); for getBarTwo(..) only .getBarTwo(String value) is suggested, although the argument is optional.
How can I specify in the GDSL file that the argument is optional, so that all three (valid Groovy) options get suggested: barTwo, getBarTwo() and getBarTwo(String value)?
(Unfortunately the “GDSL AWESOMENESS” Series was of no help.)

To have all three options provided, one has to specify two method signatures in the GDSL file: one with the (optional) argument and one without it:
pipeline.gdsl
//...
def uservarCtx = context(ctype: 'org.jenkinsci.plugins.workflow.cps.global.UserDefinedGlobalVariable')
contributor(uservarCtx) {
    method name: 'getBarOne', type: 'java.lang.String', params: [:]
    method name: 'getBarTwo', params: [:], type: 'List<String>' //<---
    method name: 'getBarTwo', params: [value: 'java.lang.String'], type: 'List<String>'
}
Autocomplete suggestions now include all three variants.
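For illustration, all three forms are then suggested and are valid in a Jenkinsfile (hypothetical usage; 'someValue' is just a placeholder argument):

foo.barTwo                    // property-style access
foo.getBarTwo()               // explicit call without the optional argument
foo.getBarTwo('someValue')    // explicit call with the optional String argument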
Bonus track: Multiple global variables
As I have not only one global variable but two, I wanted auto-completion to support that as well.
The trick is to specify different types for your global variables:
pipeline.gdsl
//The global script scope
def ctx = context(scope: scriptScope())
contributor(ctx) {
    //...
    property(name: 'foo', type: 'org.jenkinsci.plugins.workflow.cps.global.UserDefinedGlobalVariable.Foo')
    property(name: 'bar', type: 'org.jenkinsci.plugins.workflow.cps.global.UserDefinedGlobalVariable.Bar')
}

def varCtxFoo = context(ctype: 'org.jenkinsci.plugins.workflow.cps.global.UserDefinedGlobalVariable.Foo')
contributor(varCtxFoo) {
    //...
}

def varCtxBar = context(ctype: 'org.jenkinsci.plugins.workflow.cps.global.UserDefinedGlobalVariable.Bar')
contributor(varCtxBar) {
    //...
}
//..
Note the .Foo and .Bar suffixes on the UserDefinedGlobalVariable type in the type definitions.
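For illustration, each per-variable contributor then declares that variable's own methods; the method names below are hypothetical, following the same pattern as the getBarOne/getBarTwo definitions above:

def varCtxFoo = context(ctype: 'org.jenkinsci.plugins.workflow.cps.global.UserDefinedGlobalVariable.Foo')
contributor(varCtxFoo) {
    // methods of the 'foo' global variable (hypothetical)
    method name: 'getBarOne', type: 'java.lang.String', params: [:]
    method name: 'getBarTwo', params: [:], type: 'List<String>'
    method name: 'getBarTwo', params: [value: 'java.lang.String'], type: 'List<String>'
}

def varCtxBar = context(ctype: 'org.jenkinsci.plugins.workflow.cps.global.UserDefinedGlobalVariable.Bar')
contributor(varCtxBar) {
    // methods of the 'bar' global variable (hypothetical)
    method name: 'getSomethingElse', type: 'java.lang.String', params: [:]
}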

Related

IntelliJ GDSL jenkinsfile nested properties not working

contributor(ctx) {
    method(name: 'pipeline', type: Object, params: [body: Closure], doc: 'Declarative pipeline')
    property(name: 'env', type: 'org.jenkinsci.plugins.workflow.cps.EnvActionImpl.Binder', doc: 'Environment variable')
}

def envVars = context(ctype: 'org.jenkinsci.plugins.workflow.cps.EnvActionImpl.Binder')
contributor(envVars) {
    property(name: 'GIT_BRANCH', type: String, doc: 'Branch name')
}
After plenty of googling, the example above should be the way it is written; it seems to match all the examples I found.
However, in the Jenkinsfile's env.GIT_BRANCH, env is recognized, but for the other half it's "No candidates found for method call env.GIT_BRANCH".
The same happens for everything that uses the property.property style.
Is this a known issue/limitation (I could not find it mentioned anywhere)?

Rust deserialize JSON into custom HashMap<String, google_firestore1::Value>

I just started with Rust and I have some trouble with deserialization.
I'm actually trying to use the function ProjectDatabaseDocumentCreateDocumentCall from the google_firestore1 crate. I want to populate the fields field of the Document struct. The documentation of the struct is clear: it expects a HashMap<String, google_firestore1::Value> as a value.
The question is, how can I deserialize a JSON string to a HashMap<String, google_firestore1::Value>?
Here is the code I wrote for the moment:
extern crate google_firestore1 as firestore1;

use google_firestore1::Document;
use std::collections::HashMap;
use serde_json;

pub fn go() {
    let _my_doc = Document::default();
    let test = "{\"test\":\"test\", \"myarray\": [1]}";

    // Working perfectly fine
    let _working: HashMap<String, serde_json::Value> = serde_json::from_str(test).unwrap();

    // Not working
    let _not_working: HashMap<String, firestore1::Value> = serde_json::from_str(test).unwrap();

    // Later I want to do the following
    // _my_doc.fields = _not_working
}
Obviously this is not working, and it crashes with the following error:
thread 'main' panicked at 'called `Result::unwrap()` on an `Err` value: Error("invalid type: string \"test\", expected struct Value", line: 1, column: 14)', src/firestore.rs:17:85
stack backtrace:
Of course, I noticed that serde_json::Value and firestore1::Value are not the same struct.
But I had a look at the source code and it seems that firestore1::Value implements the Deserialize trait.
So why is it not working? In this case, do I need to iterate over the first HashMap and deserialize each serde_json::Value into firestore1::Value again? Is there a cleaner way to do what I want?
Thanks for your answer!
The definition of the firestore1::Value is:
/// A message that can hold any of the supported value types.
///
/// This type is not used in any activity, and only used as *part* of another schema.
///
#[derive(Default, Clone, Debug, Serialize, Deserialize)]
pub struct Value {
    /// A bytes value.
    ///
    /// Must not exceed 1 MiB - 89 bytes.
    /// Only the first 1,500 bytes are considered by queries.
    #[serde(rename="bytesValue")]
    pub bytes_value: Option<String>,
    /// A timestamp value.
    ///
    /// Precise only to microseconds. When stored, any additional precision is
    /// rounded down.
    #[serde(rename="timestampValue")]
    pub timestamp_value: Option<String>,
    ...
}
This means each entry for a firestore1::Value must be an object.
I suspect that only one of the fields would actually be set, corresponding
to the actual type of the value (as they're all optional).
So your JSON would need to be something like:
let test = r#"{
    "test": {"stringValue": "test"},
    "myarray": {
        "arrayValue": {"values": [{"integerValue": 1}]}
    }
}"#;
This is pretty ugly, so if you're doing a lot of your own JSON to Firestore conversions, I'd probably write some helpers to convert from serde_json::Value to firestore1::Value.
It would probably look something like this:
fn my_firestore_from_json(v: serde_json::Value) -> firestore1::Value {
    match v {
        serde_json::Value::Null => firestore1::Value {
            // I don't know why this is an Option<String>
            null_value: Some("".to_string()),
            ..Default::default()
        },
        serde_json::Value::Bool(b) => firestore1::Value {
            bool_value: Some(b),
            ..Default::default()
        },
        // Implement this
        serde_json::Value::Number(n) => my_firestore_number(n),
        serde_json::Value::String(s) => firestore1::Value {
            string_value: Some(s),
            ..Default::default()
        },
        serde_json::Value::Array(v) => firestore1::Value {
            array_value: Some(firestore1::ArrayValue {
                values: Some(v.into_iter().map(my_firestore_from_json).collect()),
            }),
            ..Default::default()
        },
        // Implement this
        serde_json::Value::Object(d) => my_firestore_object(d),
    }
}
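For the object case marked "Implement this", a sketch could look like the following. It assumes the crate exposes firestore1::MapValue with an optional fields map, mirroring the Firestore REST API; double-check the exact field types against your crate version:

// Sketch only: assumes firestore1::MapValue { fields: Option<HashMap<String, Value>> }.
fn my_firestore_object(d: serde_json::Map<String, serde_json::Value>) -> firestore1::Value {
    firestore1::Value {
        map_value: Some(firestore1::MapValue {
            fields: Some(
                d.into_iter()
                    .map(|(k, v)| (k, my_firestore_from_json(v)))
                    .collect(),
            ),
        }),
        ..Default::default()
    }
}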
This would be a bit neater if there were various implementations of From<T> for the firestore1::Value, but using the implementation of
Default makes this not too ugly.
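Since both serde_json::Value and firestore1::Value are foreign types, a From impl can't be added from your own crate (the orphan rule), but a small local extension trait gives similar call-site ergonomics. A minimal sketch, assuming the helper above; the trait and method names are made up:

// Local extension trait wrapping the helper from above.
trait IntoFirestoreValue {
    fn into_firestore(self) -> firestore1::Value;
}

impl IntoFirestoreValue for serde_json::Value {
    fn into_firestore(self) -> firestore1::Value {
        my_firestore_from_json(self)
    }
}

// e.g. serde_json::json!({"a": 1}).into_firestore()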
It is also worth noting that not all firebase types are created here,
since the types expressed in serde_json are different from those supported by firebase.
Anyway this allows you to use your JSON as written by doing something like:
let test = "{\"test\":\"test\", \"myarray\": [1]}";
let working: HashMap<String, serde_json::Value> = serde_json::from_str(test).unwrap();
let value_map: HashMap<String, firestore1::Value> = working
    .into_iter()
    .map(|(k, v)| (k, my_firestore_from_json(v)))
    .collect();

More concise way to build a configuration class using environment variables?

I have a class Configuration that reads in environment variables:
class Configuration {
    has $.config_string_a;
    has $.config_string_b;
    has Bool $.config_flag_c;

    method new() {
        sub assertHasEnv(Str $envVar) {
            die "environment variable $envVar must exist" unless %*ENV{$envVar}:exists;
        }

        assertHasEnv('CONFIG_STRING_A');
        assertHasEnv('CONFIG_STRING_B');
        assertHasEnv('CONFIG_FLAG_C');

        return self.bless(
            config_string_a => %*ENV{'CONFIG_STRING_A'},
            config_string_b => %*ENV{'CONFIG_STRING_B'},
            config_flag_c => Bool(%*ENV{'CONFIG_FLAG_C'}),
        );
    }
}
my $config = Configuration.new;
say $config.config_string_a;
say $config.config_string_b;
say $config.config_flag_c;
Is there a more concise way to express this? For example, I am repeating the environment variable name in the check and the return value of the constructor.
I could easily see writing another, more generic class that encapsulates the necessary info for a config parameter:
class ConfigurationParameter {
    has $.name;
    has $.envVarName;
    has Bool $.required;

    method new(:$name, :$envVarName, :$required = True) {
        return self.bless(:$name, :$envVarName, :$required);
    }
}
Then rolling these into a List in the Configuration class. However, I don't know how to refactor the constructor in Configuration to accommodate this.
The most immediate change that comes to mind is to change new to be:
method new() {
    sub env(Str $envVar) {
        %*ENV{$envVar} // die "environment variable $envVar must exist"
    }
    return self.bless(
        config_string_a => env('CONFIG_STRING_A'),
        config_string_b => env('CONFIG_STRING_B'),
        config_flag_c => Bool(env('CONFIG_FLAG_C')),
    );
}
While // is a definedness check rather than an existence one, the only way an environment variable will be undefined is if it isn't set. That gets down to one mention of %*ENV and also of each environment variable.
If there are only a few, I'd likely stop there. But the next bit of repetition that strikes me is that the attribute names are just lowercased versions of the environment variable names, so we could eliminate that duplication too, at the cost of a little more complexity:
method new() {
    multi env(Str $envVar) {
        $envVar.lc => %*ENV{$envVar} // die "environment variable $envVar must exist"
    }
    multi env(Str $envVar, $type) {
        .key => $type(.value) given env($envVar)
    }
    return self.bless(
        |env('CONFIG_STRING_A'),
        |env('CONFIG_STRING_B'),
        |env('CONFIG_FLAG_C', Bool),
    );
}
Now env returns a Pair, and | flattens it into the argument list as if it's a named argument.
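A minimal illustration of that flattening behavior, with made-up names:

sub greet(:$name, :$greeting = 'Hello') {
    say "$greeting, $name!";
}

my $pair = name => 'World';   # a Pair
greet(|$pair);                # flattened in as the named argument :name<World>
# OUTPUT: Hello, World!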
Finally, the "power tool" approach is to write a trait like this outside of the class:
multi trait_mod:<is>(Attribute $attr, :$from-env!) {
    my $env-name = $attr.name.substr(2).uc;
    $attr.set_build(-> | {
        with %*ENV{$env-name} -> $value {
            Any ~~ $attr.type ?? $value !! $attr.type()($value)
        }
        else {
            die "environment variable $env-name must exist"
        }
    });
}
And then write the class as:
class Configuration {
    has $.config_string_a is from-env;
    has $.config_string_b is from-env;
    has Bool $.config_flag_c is from-env;
}
Traits run at compile time, and can manipulate a declaration in various ways. This trait calculates the name of the environment variable based on the attribute name (attribute names are always like $!config_string_a, thus the substr). The set_build sets the code that will be run to initialize the attribute when the class is created. That gets passed various things that in our situation aren't important, so we ignore the arguments with |. The with is just like if defined, so this is the same approach as the // earlier. Finally, the Any ~~ $attr.type check asks if the parameter is constrained in some way, and if it is, performs a coercion (done by invoking the type with the value).
So I mentioned this in a comment, but I figured it would be good as an actual answer. I figured this would be useful functionality for anyone building a Docker-based system, so I took Jonathan's example code, added some functionality for exporting traits that Elizabeth showed me, and made Trait::Env.
Usage is:
use Trait::Env;

class Configuration {
    has $.config_string_a is env;
    has $.config-string-b is env(:required);
    has Bool $.config-flag-c is env is default(True);
}
The :required flag turns on "die if not found", and it plays nicely with the is default trait. Attribute names are upper-cased and - is replaced with _ before checking %*ENV.
I have a couple of planned changes: make it throw a named Exception rather than just die, and handle Booleans a bit better. As %*ENV holds Strings, having a Boolean False is a bit of a pain.

Dojo this.inherit throws 'Uncaught TypeError: Cannot read property 'callee' of undefined'

To facilitate a JsonRest store with a non-standard URL scheme, I am trying to inherit JsonRest and override the _getTarget(id) function. Here is what my inherited JavaScript class looks like:
define([
    "dojo/_base/declare",
    "dojo/store/JsonRest",
],
function (declare, JsonRest) {
    return declare(JsonRest, {
        _getTarget: function (id) {
            var target = this.target;
            if (typeof id != "undefined") {
                if (target.indexOf("{id}") != -1) {
                    // use template
                    target = target.replace("{id}", id);
                } else {
                    target = this.inherited(id);
                }
            }
            return target;
        },
    });
});
However, the line target = this.inherited(id); throws an error: Uncaught TypeError: Cannot read property 'callee' of undefined.
I looked at the docs, and I think I am doing it right:
http://dojotoolkit.org/reference-guide/1.10/dojo/_base/declare.html#calling-superclass-methods
What is the proper way to call the base class's _getTarget(id) function?
If you look closely at the part of the documentation you linked, you are supposed to literally pass the arguments object to this.inherited - that's what contains the callee property it is looking for (and it will include the id and any other arguments anyway, to be passed along to the superclass).
A few paragraphs in, the documentation also explains how to call this.inherited with arguments other than the same ones passed, if necessary: you can pass custom arguments in an array after arguments, i.e. this.inherited(arguments, [ ... ]). arguments always has to be first.
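Applied to the snippet from the question, the fix would look something like this; only the inherited call changes:

_getTarget: function (id) {
    var target = this.target;
    if (typeof id != "undefined") {
        if (target.indexOf("{id}") != -1) {
            // use template
            target = target.replace("{id}", id);
        } else {
            // pass the arguments object so declare can locate the superclass method
            target = this.inherited(arguments);
        }
    }
    return target;
}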

How do I check if Geb Module "content" is present?

I'm a bit new to the whole Selenium/Geb thing, so I'm probably going about this a bit wrong, but I'm trying to get the exists() method in the following code to work properly.
class Question extends Module {
    static base = { $("fieldset") }
    static content = {
        input { $("input[type=text]") }
        name { input.getAttribute("name").toString() }
    }

    boolean exists() {
        return input.isPresent()
    }
}
Frustratingly, when I try to execute that code (from a Spock test, "go"ing to a page object including this module), I get the following:
The required page content 'input - SimplePageContent (owner: question - Question (owner: MyPage, args: [], value: null), args: [], value: null)' is not present
I've tried a number of other things, including:
if (input) return true; return false
... input.size() == 0
Using static at = {...} (doesn't seem to be supported for modules)
Any ideas?
By default Geb ensures that your content definitions return non-empty elements. If your content is optional, you can tell it so using the required content option:
name(required: false) { input.getAttribute("name").toString() }
Because Geb utilizes Groovy Truth to redefine how navigators are coerced to boolean values (empty navigators are falsey and non-empty ones are truthy), you can simplify your exists() method to:
boolean exists() {
    input
}
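Putting both suggestions together, the module from the question might look something like this sketch (input is also marked required: false, since exists() accesses it):

class Question extends Module {
    static base = { $("fieldset") }
    static content = {
        input(required: false) { $("input[type=text]") }
        name(required: false) { input.getAttribute("name").toString() }
    }

    boolean exists() {
        input   // empty navigator coerces to false, non-empty to true
    }
}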