For loop in JJB templates - jenkins-job-builder

Is there a way to use for loops in YAML templates for Jenkins Job Builder, like in Ansible with Jinja2?
Something like:
jobs: job1, job2, job3

- trigger-builds:
    - project:
        {% for j in jobs %}
        project_{{ j }}
        {% endfor %}
So it would expand to:
- trigger-builds:
    - project: project_job1 project_job2 project_job3

From the documentation: https://docs.openstack.org/infra/jenkins-job-builder/definition.html
- project:
    name: project-name
    axe1:
      - axe1val1
      - axe1val2
    axe2:
      - axe2val1
      - axe2val2
    axe3:
      - axe3val1
      - axe3val2
    exclude:
      - axe1: axe1val1
        axe2: axe2val1
        axe3: axe3val2
      - axe2: axe2val2
        axe3: axe3val1
    jobs:
      - build-{axe1}-{axe2}-{axe3}

- job-template:
    name: build-{axe1}-{axe2}-{axe3}
    builders:
      - shell: "echo Combination {axe1}:{axe2}:{axe3}"
The above example will omit the jobs:
build-axe1val1-axe2val1-axe3val2
build-axe1val1-axe2val2-axe3val1
build-axe1val2-axe2val2-axe3val1
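If it helps to see the expansion concretely, here is a plain-Python sketch (not JJB itself) of how the axes multiply out and which combinations the exclude rules drop:

```python
from itertools import product

# Plain-Python sketch of how the axes above expand into job names,
# and which combinations the exclude rules remove.
axes = {
    "axe1": ["axe1val1", "axe1val2"],
    "axe2": ["axe2val1", "axe2val2"],
    "axe3": ["axe3val1", "axe3val2"],
}
exclude = [
    {"axe1": "axe1val1", "axe2": "axe2val1", "axe3": "axe3val2"},
    {"axe2": "axe2val2", "axe3": "axe3val1"},
]

def is_excluded(combo):
    # A combination is excluded if all key/value pairs of any rule match it.
    return any(all(combo[k] == v for k, v in rule.items()) for rule in exclude)

combos = [dict(zip(axes, values)) for values in product(*axes.values())]
omitted = ["build-{axe1}-{axe2}-{axe3}".format(**c) for c in combos if is_excluded(c)]
kept = ["build-{axe1}-{axe2}-{axe3}".format(**c) for c in combos if not is_excluded(c)]
print(omitted)  # the three omitted job names listed above
```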

You can use a Jinja2 template (the !j2 tag) to achieve this.
Something like this:
- builder:
    name: test-builder
    builders:
      - shell:
          !j2: |
            {{ var }}
            {% for item in test_list -%}
            {{ item }}
            {% endfor %}

- job:
    name: test-job
    builders:
      - test-builder:
          var: "test variable"
          test_list:
            - a
            - b
            - c
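For reference, here is a plain-Python sketch of the string the !j2 template above should render, assuming Jinja2's usual whitespace handling with the trailing -%}:

```python
# Plain-Python sketch of what the !j2 template above renders to,
# given var="test variable" and test_list=["a", "b", "c"].
def render_sketch(var, test_list):
    # one line for {{ var }}, then one line per item from the for-loop
    return "\n".join([var] + list(test_list)) + "\n"

print(render_sketch("test variable", ["a", "b", "c"]))
```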

Related

Call a variable inside the nested expression in github actions

I am trying to call the variable like this:
with:
  tags: ${{ inputs.push_tag_to_release && 'op/post:'[env.tag] || 'op/post:${{ env.tag }}' }}
where env.tag = 0.13, but both forms, [env.tag] and the nested ${{ env.tag }}, are reported as incorrect/not supported.
I also tried the format function:
tags: ${{ inputs.push_tag_to_release && 'op/post:format({0},env.tag)' || 'op/post:${{ env.tag }}' }}
but neither way works.
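Note that GitHub Actions expressions cannot nest ${{ }} or index a string with [env.tag]; string building is normally done by calling the built-in format() function inside the expression. An untested sketch, keeping the question's names:

```yaml
with:
  # format() is evaluated inside the expression; no nested ${{ }} needed
  tags: ${{ format('op/post:{0}', env.tag) }}
```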

DBT - how can i add model configuration (using a macro on {{this}}) in dbt_project.yml

I want to add node_color to all of my dbt models based on my filename prefix, to make it easier to navigate my dbt documentation:
fact_ => red
base__ => black
To do so I have a macro that works well:
{% macro get_model_color(model) %}
  {% set default_color = 'blue' %}
  {% set ns = namespace(model_color=default_color) %}
  {% set dict_patterns = {"base__[a-z0-9_]+": "black", "ref_[a-z0-9_]+": "yellow", "fact_[a-z0-9_]+": "red"} %}
  {% set re = modules.re %}
  {% for pattern, color in dict_patterns.items() %}
    {% set is_match = re.match(pattern, model.identifier, re.IGNORECASE) %}
    {% if is_match %}
      {% set ns.model_color = color %}
    {% endif %}
  {% endfor %}
  {{ return({'node_color': ns.model_color}) }}
{% endmacro %}
And I call it in my model .sql:
{{ config(
    materialized = 'table',
    tags=['daily'],
    docs=get_model_color(this),
) }}
This works well, but it forces me to add this line of code to all my models (and all new ones).
Is there a way I can define it in my dbt_project.yml so it applies to all my models automatically?
I have tried many things, like the config jinja function, or this kind of code in dbt_project.yml:
+docs:
  node_color: "{{ get_model_color(this) }}"
which returns: Could not render {{ get_model_color(this) }}: 'get_model_color' is undefined
But nothing seems to work.
Any idea? Thanks
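As a side note, the macro's matching logic is easy to check outside dbt. A plain-Python sketch, with the same patterns and hypothetical model identifiers:

```python
import re

# Plain-Python sketch of the macro's pattern-to-color logic above.
# Patterns and colors are copied from the macro; 'blue' is the default.
DICT_PATTERNS = {
    "base__[a-z0-9_]+": "black",
    "ref_[a-z0-9_]+": "yellow",
    "fact_[a-z0-9_]+": "red",
}

def get_model_color(identifier, default_color="blue"):
    model_color = default_color
    for pattern, color in DICT_PATTERNS.items():
        if re.match(pattern, identifier, re.IGNORECASE):
            model_color = color
    return model_color

print(get_model_color("fact_orders"))    # red
print(get_model_color("stg_something"))  # blue (default)
```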

Using one YAML definition for the same column as it moves through models

I have some model YAML:
version: 2
models:
  - name: my_model
    columns:
      - name: foo
        description: My bestest column
...If I make other models which inherit from this one, is there any way to refer back to this column definition when documentation is generated, or do I need to copy-paste the column definition for each model in which the column appears?
In other words, is there a way of defining a column only once to make edits and updates easier.
Cheers,
Graham
I think there are two ways to do it.
1. using a macro
Create a file that contains all those reusable descriptions. You could even use params to customize the description.
e.g. doc_library.sql
{% macro column_bestest_doc(col_name) %}
My bestest column {{ col_name }}
{% endmacro %}
then use it in your model schema yml (note the quotes around the Jinja expression, so the YAML stays valid):
version: 2
models:
  - name: my_model
    columns:
      - name: foo_1
        description: "{{ column_bestest_doc('foo_1') }}"
  - name: my_model_another
    columns:
      - name: foo_2
        description: "{{ column_bestest_doc('foo_2') }}"
2. using a YAML anchor
You could use YAML anchors, as in any other yml file. Anchor the description value once and alias it wherever the column reappears:
version: 2
models:
  - name: my_model
    columns:
      - name: foo
        description: &foo_desc My bestest column
  - name: my_model_another
    columns:
      - name: foo
        description: *foo_desc
ref:
https://support.atlassian.com/bitbucket-cloud/docs/yaml-anchors/
https://medium.com/@kinghuang/docker-compose-anchors-aliases-extensions-a1e4105d70bd
We have been thinking about this problem extensively as well... our current solution is to modify the generate_model_yaml macro:
{% macro generate_model_yaml(model_name) %}
  {% set model_yaml = [] %}
  {% set existing_descriptions = fetch_existing_descriptions(model_name) %}
  -- # TO DO: pass model to fetch()
  -- if column not blank on current model, use description in returned dict
  -- otherwise, use global
  -- also extract tests on column anywhere in global scope
  {% do model_yaml.append('version: 2') %}
  {% do model_yaml.append('') %}
  {% do model_yaml.append('models:') %}
  {% do model_yaml.append('  - name: ' ~ model_name | lower) %}
  {% do model_yaml.append('    description: ""') %}
  {% do model_yaml.append('    columns:') %}
  {% set relation = ref(model_name) %}
  {%- set columns = adapter.get_columns_in_relation(relation) -%}
  {% for column in columns %}
    {%- set column = column.name | lower -%}
    {%- set col_description = existing_descriptions.get(column, '') %}
    {% do model_yaml.append('      - name: ' ~ column) %}
    {% do model_yaml.append('        description: "' ~ col_description ~ '"') %}
    {% do model_yaml.append('') %}
  {% endfor %}
  {% if execute %}
    {% set joined = model_yaml | join('\n') %}
    {{ log(joined, info=True) }}
    {% do return(joined) %}
  {% endif %}
{% endmacro %}
And then get the first description found that matches the column, with fetch_existing_descriptions():
{% macro fetch_existing_descriptions(current_model) %}
  {% set description_dict = {} %}
  {% set current_model_dict = {} %}
  {% for node in graph.nodes.values() | selectattr("resource_type", "equalto", "model") %}
    {% for col_dict in node.columns.values() %}
      {% if node.name == current_model %}
        -- Add current model description to separate dict to overwrite with later
        {% set col_description = {col_dict.name: col_dict.description} %}
        {% do current_model_dict.update(col_description) %}
      {% elif description_dict.get(col_dict.name, '') == '' %}
        {% set col_description = {col_dict.name: col_dict.description} %}
        {% do description_dict.update(col_description) %}
      {% endif %}
    {% endfor %}
  {% endfor %}
  -- Overwrite description_dict with current descriptions
  {% do description_dict.update(current_model_dict) %}
  {% if var('DEBUG', False) %}
    {{ log(tojson(description_dict), info=True) }}
  {% else %}
    {{ return(description_dict) }}
  {% endif %}
{% endmacro %}
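The merge logic in fetch_existing_descriptions() boils down to a two-dict pass. A plain-Python sketch with a hypothetical graph:

```python
# Plain-Python sketch of the merge logic in fetch_existing_descriptions():
# take the first non-empty description seen for each column across all
# models, then let the current model's own descriptions override.
def fetch_existing_descriptions(current_model, nodes):
    description_dict = {}
    current_model_dict = {}
    for node in nodes:  # nodes: [{"name": ..., "columns": {col: desc, ...}}]
        for col, desc in node["columns"].items():
            if node["name"] == current_model:
                current_model_dict[col] = desc
            elif description_dict.get(col, "") == "":
                description_dict[col] = desc
    description_dict.update(current_model_dict)  # current model wins
    return description_dict

# Hypothetical example graph
nodes = [
    {"name": "upstream", "columns": {"id": "surrogate key", "foo": "upstream foo"}},
    {"name": "my_model", "columns": {"foo": "my own foo", "bar": ""}},
]
print(fetch_existing_descriptions("my_model", nodes))
```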
Finally, we use a bash script with the path to the model to write/overwrite a yaml file using the modified generate_model_yaml above:
#!/bin/bash
# Generates documentation for dbt models.
# Usage:
#   Run from within the gold folder.
#   Run with no args for all models. Provide optional relative model path to generate docs for just one model:
#   Eg. `$ bash ./scripts/generate_docs.sh [./models/path/to_your_model.sql]`
if [[ -n $1 ]]; then # Build array of one model if optional arg provided
    yml="${1%.sql}.yml" # Create model yml filename
    touch "$yml" && rm "$yml" || exit 1 # Ensure filepath works by testing creation
    array=("$yml")
else # Create array of all model yml files
    array=()
    while IFS= read -r -d $'\0'; do
        if [[ ${REPLY} != *"src"* ]]; then # Only proceed for model yml files (don't contain "src")
            array+=("$REPLY")
        fi
    done < <(find "./models/" -name "*.yml" -print0)
fi
# Create copy model yml with prescribed yml containing docs standard.
for i in "${array[@]}"; do
    model=$(basename "$i" | sed -E 's|[.]yml||g')
    generated_yml=$(dbt run-operation generate_model_yaml --args "model_name: $model" | sed '1d')
    echo "$generated_yml" > "${i}_copy" # Create non-yml copy file to allow script to complete
done
# Once all copies are created, replace originals
for i in "${array[@]}"; do
    cat "${i}_copy" > "$i"
    rm "${i}_copy"
done

jinja for loop in salt file.blockreplace for /etc/hosts

I have some issues with my Jinja code inside my Salt state, which should update the /etc/hosts file from an LDAP pillar.
{% set CID = grains['CID'] %}
{% set ldap_pillar = 'ldap-hosts-{{CID}}' %}
ldap-hosts:
  file.blockreplace:
    - name: /tmp/hosts
    - marker_start: "# BEGIN SALT MANAGED CONTENT - DO NOT EDIT BETWEEN THIS - #"
    - marker_end: "# END SALT MANAGED CONTENT - DO NOT EDIT BETWEEN THIS - #"
    - content:
      {% for entry in {{ salt.pillar.get('ldap_pillar') }} %}
      {% for hostname, ip in entry.items %}
      {{ip}} {{hostname}}
      {% endfor %}
      {% endfor %}
    - show_changes: True
    - append_if_not_found: True
The LDAP Pillar serves the following Format:
local:
|_
----------
cn:
host1.domain.tld
ipHostNumber:
4.4.4.4
|_
----------
cn:
host2
ipHostNumber:
8.8.8.8
Now I would like to collect all the IPs and hostnames and build a valid hosts file.
Here is my Error:
local:
Data failed to compile:
----------
Rendering SLS 'base:ldap_hosts' failed: Jinja syntax error: expected token ':', got '}'; line 10
---
[...]
file.blockreplace:
- name: /tmp/hosts
- marker_start: "# BEGIN SALT MANAGED CONTENT - DO NOT EDIT BETWEEN THIS - #"
- marker_end: "# END SALT MANAGED CONTENT - DO NOT EDIT BETWEEN THIS - #"
- content:
{% for entry in {{ salt.pillar.get('ldap_pillar') }} %} <======================
{% for hostname, ip in entry.items %}
{{ip}} {{hostname}}
{% endfor %}
{% endfor %}
- show_changes: True
[...]
---
I just fixed it. It was quite easy.
{% set CID = grains['CID'] %}
{% set ldap_pillar = 'ldap-hosts-' + CID %}
ldap-hosts:
  file.blockreplace:
    - name: /etc/hosts
    - marker_start: "# BEGIN SALT MANAGED CONTENT - DO NOT EDIT BETWEEN THIS - #"
    - marker_end: "# END SALT MANAGED CONTENT - DO NOT EDIT BETWEEN THIS - #"
    - content: |
        {% for entry in salt['pillar.get'](ldap_pillar) -%}
        {{ entry.ipHostNumber }} {{ entry.cn }}
        {% endfor %}
    - show_changes: True
    - append_if_not_found: True
Now everything works.
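The loop's output is easy to sanity-check outside Salt. A plain-Python sketch, using the two pillar entries shown in the question:

```python
# Plain-Python sketch of what the corrected for-loop renders, using the
# two pillar entries from the question's LDAP pillar.
pillar = [
    {"cn": "host1.domain.tld", "ipHostNumber": "4.4.4.4"},
    {"cn": "host2", "ipHostNumber": "8.8.8.8"},
]
content = "\n".join("{} {}".format(e["ipHostNumber"], e["cn"]) for e in pillar)
print(content)
```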

On Jekyll (Liquid) can't get where working

I have a people.yml like:
- name: Foo
  activities:
    - title: bar1
    - title: bar2
And an assign like:
{% assign person = site.data.people | where: "activities.title", "bar1" %}
When I previously had only one activity per person (without the title attribute) I could easily make it work, but now I'm struggling with it.
You cannot pass an Array to the where filter. It will not try to find the desired value by looping over Hashes like {"title"=>"bar1"}; it simply evaluates the property against the passed string, so those Hashes will never be equal to bar1.
My two cents:
Simplify people.yml by removing the activities title key.
Note: the two activities array presentations below are equivalent.
- name: Foo
  activities: [ bar1, bar2 ]
- name: Bar
  activities:
    - bar3
    - bar4
You can now use the contains filter to retrieve people that have bar1 as an activity. contains filters strings or arrays like ["bar1", "bar2"]:
{% comment %} create an empty array {% endcomment %}
{% assign selected = "" | split: "/" %}
{% for people in site.data.people %}
  {% if people.activities contains 'bar1' %}
    {% comment %} add people to the array if a match is found {% endcomment %}
    {% assign selected = selected | push: people %}
  {% endif %}
{% endfor %}
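The same filtering, sketched in plain Python for clarity, with data as in the simplified people.yml:

```python
# Plain-Python sketch of the Liquid loop above: keep people whose
# activities array contains the wanted activity.
people = [
    {"name": "Foo", "activities": ["bar1", "bar2"]},
    {"name": "Bar", "activities": ["bar3", "bar4"]},
]
selected = [p for p in people if "bar1" in p["activities"]]
print([p["name"] for p in selected])
```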