Condition (if, else, else if) in YAML - conditional-statements

I use hautelook/AliceBundle to create faker data in YAML, but I would like more consistency in the data.
I would like something like this:
gender: <randomElement(['Homme', 'Femme'])>
if $gender == 'Homme'
    title: 'Monsieur'
else if $gender == 'Femme'
    title: 'Madame'
I know it's not directly possible in YAML, but I don't know which plugin to use, or how to use it.
Tools/languages used in my project: Symfony, hautelook/AliceBundle, PHP, YAML.

You cannot do that. YAML is a data serialization language, not a programming language.
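That said, with Alice itself you can often get consistent pairs without any conditional: define one fixture batch per gender and hard-code the matching title. A minimal sketch using Alice's range syntax, where the `App\Entity\User` class and the counts are assumptions, not taken from your project:

```yaml
App\Entity\User:
    user_homme_{1..5}:
        gender: 'Homme'
        title: 'Monsieur'
    user_femme_{1..5}:
        gender: 'Femme'
        title: 'Madame'
```

Each generated object then carries a gender/title pair that is consistent by construction, at the cost of fixing how many of each variant you create.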

- task: TerraformTaskV1@0
  ${{ if eq(parameters.destroy, false) }}:
    displayName: Terraform Apply
  ${{ if eq(parameters.destroy, true) }}:
    displayName: Terraform Destroy
  inputs:
    provider: 'azurerm'
    ${{ if eq(parameters.destroy, false) }}:
      command: 'apply'
    ${{ if eq(parameters.destroy, true) }}:
      command: 'destroy'
    workingDirectory: "$(System.ArtifactsDirectory)/${{ parameters.environment_name }}${{ parameters.root_directory }}"
    ${{ if eq(parameters.destroy, false) }}:
      commandOptions: "$(System.ArtifactsDirectory)/${{ parameters.environment_name }}${{ parameters.root_directory }}/plan.tfplan"
    ${{ if eq(parameters.destroy, true) }}:
      commandOptions: "--var-file=$(System.ArtifactsDirectory)/${{ parameters.environment_name }}${{ parameters.tfvarFile }}"
    environmentServiceNameAzureRM: ${{ parameters.service_connection_name }}

YAML files don't include any conditional logic: YAML is a data serialisation language, so it does not contain if/else-style executable statements. Tools such as Azure Pipelines layer their own template expressions (the ${{ if }} syntax above) on top of YAML.
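If the repeated `${{ if }}` pairs bother you, newer Azure Pipelines agents also support `${{ else }}` in template expressions, which at least halves the number of conditions. A sketch of the same task under that assumption (same parameters as above, not verified against your pipeline):

```yaml
- task: TerraformTaskV1@0
  ${{ if eq(parameters.destroy, true) }}:
    displayName: Terraform Destroy
  ${{ else }}:
    displayName: Terraform Apply
  inputs:
    provider: 'azurerm'
    ${{ if eq(parameters.destroy, true) }}:
      command: 'destroy'
      commandOptions: "--var-file=$(System.ArtifactsDirectory)/${{ parameters.environment_name }}${{ parameters.tfvarFile }}"
    ${{ else }}:
      command: 'apply'
      commandOptions: "$(System.ArtifactsDirectory)/${{ parameters.environment_name }}${{ parameters.root_directory }}/plan.tfplan"
    workingDirectory: "$(System.ArtifactsDirectory)/${{ parameters.environment_name }}${{ parameters.root_directory }}"
    environmentServiceNameAzureRM: ${{ parameters.service_connection_name }}
```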

Related

GitHub Actions: Use variables in matrix definition?

I have the following code in a GitHub Action config:
name: Build & Tests
on:
  pull_request:
env:
  CARGO_TERM_COLOR: always
  ZEROCOPY_MSRV: 1.61.0
  ZEROCOPY_CURRENT_STABLE: 1.64.0
  ZEROCOPY_CURRENT_NIGHTLY: nightly-2022-09-26
jobs:
  build_test:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        # See `INTERNAL.md` for an explanation of these pinned toolchain
        # versions.
        channel: [ ${{ env.ZEROCOPY_MSRV }}, ${{ env.ZEROCOPY_CURRENT_STABLE }}, ${{ env.ZEROCOPY_CURRENT_NIGHTLY }} ]
        target: [ "i686-unknown-linux-gnu", "x86_64-unknown-linux-gnu", "arm-unknown-linux-gnueabi", "aarch64-unknown-linux-gnu", "powerpc-unknown-linux-gnu", "powerpc64-unknown-linux-gnu", "wasm32-wasi" ]
        features: [ "", "alloc,simd", "alloc,simd,simd-nightly" ]
        exclude:
          # Exclude any combination which uses a non-nightly toolchain but
          # enables nightly features.
          - channel: ${{ env.ZEROCOPY_MSRV }}
            features: "alloc,simd,simd-nightly"
          - channel: ${{ env.ZEROCOPY_CURRENT_STABLE }}
            features: "alloc,simd,simd-nightly"
I'm getting the following parsing error on this file:
Invalid workflow file: .github/workflows/ci.yml#L19
You have an error in your yaml syntax on line 19
It appears to be referring to this line (it's actually off by one, but maybe it zero-indexes its line numbers?):
channel: [ ${{ env.ZEROCOPY_MSRV }}, ${{ env.ZEROCOPY_CURRENT_STABLE }}, ${{ env.ZEROCOPY_CURRENT_NIGHTLY }} ]
Is there any way to use variables in the matrix definition like this? Or do I just need to hard-code everything?
According to the documentation (reference 1 and reference 2):
Environment variables (at the workflow level) are available to the steps of all jobs in the workflow.
In your example, however, the environment variables are used at the job level (inside the job's strategy / matrix definition), not inside the job steps.
At that level, environment variables aren't interpolated by the GitHub interpreter.
First alternative
Hardcode the values inside the channel field inside your matrix strategy:
Example:
channel: [ "1.61.0", "1.64.0", "nightly-2022-09-26" ]
However, you'll have to do this for each job (bad for maintenance as duplicated code).
Second alternative
Use inputs (with the reusable-workflow workflow_call trigger, or with the workflow_dispatch trigger).
Example:
on:
  workflow_dispatch: # or workflow_call
    inputs:
      test1:
        description: "Test1"
        required: false
        default: "test1"
      test2:
        description: "Test2"
        required: false
        default: "test2"
      test3:
        description: "Test3"
        required: false
        default: "test3"
jobs:
  build_test:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        channel: [ "${{ inputs.test1 }}", "${{ inputs.test2 }}", "${{ inputs.test3 }}" ]
In that case, inputs will be interpolated by the GitHub interpreter.
However, you'll need to trigger the workflow from another workflow, or through the GitHub API, to send the inputs (in some ways this gives you more flexibility with the values, but it increases the complexity).
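If you go the workflow_call route, the caller workflow supplies the inputs. A minimal sketch, where the file name build-test.yml and the input values are hypothetical:

```yaml
name: CI entrypoint
on:
  pull_request:
jobs:
  call_build_test:
    # Calls the reusable workflow defined with the workflow_call trigger
    uses: ./.github/workflows/build-test.yml
    with:
      test1: "1.61.0"
      test2: "1.64.0"
      test3: "nightly-2022-09-26"
```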

azure devops yaml 'if' not evaluating as expected

Oh man, I'm having a really hard time with the YAML learning curve. Something that seems so simple, and I just can't get it to work.
I have this nested parameter group (not sure if that's the correct term) -
build: {
  configuration: '',
  nugetSource: '',
  dotnetVersion: '',
  projectsToPublish: '',
  releaseCandidate: 'False',
  tag: ''
}
I'm then trying to use an 'if' to set a variable for pool, depending on the parameters.build.dotnetVersion passed in.
- job: build
  displayName: Build & test
  variables:
    - ${{ if eq(parameters.build.dotnetVersion, 6.0.200) }}:
      - name: buildpool
        value: build-dotnet6
    - ${{ if ne(parameters.build.dotnetVersion, 6.0.200) }}:
      - name: buildpool
        value: build-default
  pool: $(buildpool)
So even though I have confirmed with an echo command that the value of parameters.build.dotnetVersion is 6.0.200, buildpool always gets set to build-default.
I also tried wrapping 6.0.200 in single quotes, but that didn't work either.
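For what it's worth, two things often trip such comparisons up. A sketch, not a verified fix for this pipeline: compare the parameter against a quoted string literal, and pick the pool with conditional insertion directly rather than going through a runtime macro variable like $(buildpool), whose expansion timing relative to pool resolution is easy to get wrong:

```yaml
- job: build
  displayName: Build & test
  pool:
    ${{ if eq(parameters.build.dotnetVersion, '6.0.200') }}:
      name: build-dotnet6
    ${{ if ne(parameters.build.dotnetVersion, '6.0.200') }}:
      name: build-default
```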

Job as a variable and use variable inside this job | github action & yaml

I'm currently deploying a security tool for my cluster. It worked well but I want to reduce the length of the code and avoid repeating code inside the file.
Here is the situation:
on:
  pull_request:
    paths:
      - 'ionos/terraform/dev/*.tf'
      - 'ionos/terraform/prod/*/*/*.tf'
jobs:
  # JOB to run change detection
  changes:
    runs-on: ubuntu-latest
    # Set job outputs to values from filter step
    outputs:
      Ionos_dev: ${{ steps.filter.outputs.Ionos_dev }}
      Ionos_prod: ${{ steps.filter.outputs.Ionos_prod }}
    steps:
      # For pull requests it's not necessary to checkout the code
      - uses: dorny/paths-filter@v2
        id: filter
        with:
          filters: |
            Ionos_dev:
              - 'ionos/terraform/dev/*.tf'
            Ionos_prod:
              - 'ionos/terraform/prod/*/*/*.tf'
Duplicated part
Ionos_prod:
  name: tfsec sarif report ionos_prod
  needs: changes
  if: ${{ needs.changes.outputs.Ionos_prod == 'true' }}
  runs-on: ubuntu-latest
  steps:
    - name: Clone repo
      uses: actions/checkout@master
    - name: tfsec sarif ionos_prod
      uses: aquasecurity/tfsec-sarif-action@v0.1.0
      with:
        working_directory: ionos/terraform/prod/
        sarif_file: tfsec.sarif
    - name: Upload SARIF file
      uses: github/codeql-action/upload-sarif@v1
      with:
        sarif_file: tfsec.sarif
Ionos_dev:
  name: tfsec sarif report ionos_dev
  needs: changes
  if: ${{ needs.changes.outputs.Ionos_dev == 'true' }}
  runs-on: ubuntu-latest
  steps:
    - name: Clone repo
      uses: actions/checkout@master
    - name: tfsec sarif ionos_dev
      uses: aquasecurity/tfsec-sarif-action@v0.1.0
      with:
        working_directory: ionos/terraform/dev/
        sarif_file: tfsec.sarif
    - name: Upload SARIF file
      uses: github/codeql-action/upload-sarif@v1
      with:
        sarif_file: tfsec.sarif
I have more than two duplicated jobs; that's why I want to turn the job into a variable.
My problem is that I don't see how I can create the job as a variable and pass these two values into the newly created job:
if: ${{ needs.changes.outputs.Ionos_prod == 'true' }}
&
working_directory: ionos/terraform/prod/
Any suggestions?
After a few days of research, based on this documentation (which I hadn't found before):
https://docs.github.com/pt/actions/using-jobs/using-a-matrix-for-your-jobs
I finally solved my problem.
Here is the code, with some explanations at the end.
on:
  pull_request:
    types: [synchronize, reopened, labeled]
    paths:
      - 'aws/dns/domains/**'
      - 'ionos/terraform/prod/**'
      - 'ionos/terraform/dev/**'
      - 'azure/terraform/**'
jobs:
  changes:
    runs-on: ubuntu-latest
    # Outputs give a bool variable: if a file in the path has been changed -- true
    outputs:
      Ionos_dev: ${{ steps.filter.outputs.Ionos_dev }}
      Ionos_prod: ${{ steps.filter.outputs.Ionos_prod }}
      aws: ${{ steps.filter.outputs.aws }}
      azure: ${{ steps.filter.outputs.azure }}
    steps:
      # Use an action which checks whether a file in a path has been changed.
      - uses: dorny/paths-filter@v2
        id: filter
        with:
          filters: |
            Ionos_dev:
              - 'ionos/terraform/dev/**/*.tf'
            Ionos_prod:
              - 'ionos/terraform/prod/**/*.tf'
            aws:
              - 'aws/dns/domains/**/*.tf'
            azure:
              - 'azure/terraform/prod/**/*.tf'
  tfsec_scan_matrix:
    name: tfsec_sarif_report_all_directory
    runs-on: ubuntu-latest
    # The 'changes' job is required by this job
    needs: changes
    # We create a matrix to store the output of each filter (true or false).
    # Each filter is linked to its directory (the directory tells the scan what to scan)
    strategy:
      matrix:
        include:
          - filters: ${{ needs.changes.outputs.Ionos_dev }}
            working_directory: ionos/terraform/dev/
          - filters: ${{ needs.changes.outputs.Ionos_prod }}
            working_directory: ionos/terraform/prod/
          - filters: ${{ needs.changes.outputs.aws }}
            working_directory: aws/dns/domains/
          - filters: ${{ needs.changes.outputs.azure }}
            working_directory: azure/terraform/prod/
    steps:
      # If the path has been modified, then clone the repo; same for the other steps
      - if: ${{ matrix.filters == 'true' }}
        name: Clone repo
        uses: actions/checkout@master
      - if: ${{ matrix.filters == 'true' }}
        name: tfsec sarif scan
        uses: aquasecurity/tfsec-sarif-action@v0.1.0
        with:
          working_directory: ${{ matrix.working_directory }}
          sarif_file: tfsec.sarif
      - if: ${{ matrix.filters == 'true' }}
        name: Upload SARIF file
        uses: github/codeql-action/upload-sarif@v2
        with:
          sarif_file: tfsec.sarif
What does it do?
If a .tf file has been changed in a particular path on a pull request, then it runs a tfsec scan on that specific path.
To solve my problem:
I implemented a matrix inside a job:
strategy:
  matrix:
    include:
      - filters: ${{ needs.changes.outputs.Ionos_dev }}
        working_directory: ionos/terraform/dev/
      - filters: ${{ needs.changes.outputs.Ionos_prod }}
        working_directory: ionos/terraform/prod/
      - filters: ${{ needs.changes.outputs.aws }}
        working_directory: aws/dns/domains/
      - filters: ${{ needs.changes.outputs.azure }}
        working_directory: azure/terraform/prod/
EXTRA: The include parameter, in my case, assigns each output to its specific path.
However, if I wanted to combine all the possibilities, I would have done it this way:
strategy:
  matrix:
    filter: [Ionos_dev, Ionos_prod, aws, azure]
    working_directory: [ionos/terraform/dev/, ionos/terraform/prod/, aws/dns/domains/, azure/terraform/prod/]
In that case, it will run all 16 combinations (4 filters x 4 directories).
steps:
  # If the path has been modified, then clone the repo; same for the other steps
  - if: ${{ matrix.filters == 'true' }}
    name: Clone repo
    uses: actions/checkout@master
  - if: ${{ matrix.filters == 'true' }}
    name: tfsec sarif scan
    uses: aquasecurity/tfsec-sarif-action@v0.1.0
    with:
      working_directory: ${{ matrix.working_directory }}
      sarif_file: tfsec.sarif
  - if: ${{ matrix.filters == 'true' }}
    name: Upload SARIF file
    uses: github/codeql-action/upload-sarif@v2
    with:
      sarif_file: tfsec.sarif
For this part, I am still working on it. I'm trying to improve it by simplifying it down to a single 'if'.

Ansible: Lookup variables dynamically in v2.3

I have a set of variables and a task as follows. My intent is to dynamically do a healthcheck based on the URL the user chose.
vars:
  current_hostname: "{{ ansible_hostname }}"
  hc_url1: "https://blah1.com/healthcheck"
  hc_url2: "https://blah2.com/healthcheck"
tasks:
  - name: Notification Msg For Healthcheck
    shell: "echo 'Performing healthcheck at the URL {{ lookup('vars', component) }} on host {{ current_hostname }}'"
Run playbook in Ansible 2.3
ansible-playbook ansible_playbook.yml -i inventory -k -v --extra-vars "component=hc_url1"
Error
fatal: [hostname]: FAILED! => {"failed": true, "msg": "lookup plugin (vars) not found"}
I know this happens because the lookup plugin 'vars' was introduced in Ansible v2.5. Is there a way to do this in Ansible 2.3? I want to get the value of {{ component }}, and then the value of {{ hc_url1 }}.
PS - upgrading to 2.5 is not an option because of org restrictions
Alternatively, maybe you can do this using a dictionary.
For example,
vars:
  current_hostname: "{{ ansible_hostname }}"
  urls:
    hc_url1: "https://blah1.com/healthcheck"
    hc_url2: "https://blah2.com/healthcheck"
tasks:
  - name: Notification Msg For Healthcheck
    shell: "echo 'Performing healthcheck at the URL {{ urls[component] }} on host {{ current_hostname }}'"
That way, the user provided value of component will just be looked up as a key in the dictionary.
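The dictionary indirection this relies on is just a plain mapping lookup; a minimal Python sketch of what `urls[component]` resolves to, with the URLs from the playbook above:

```python
# Sketch of the dictionary indirection used in the playbook: the
# user-supplied `component` value is used as a key into the `urls` mapping,
# which is exactly what Jinja2's `urls[component]` expression does.
urls = {
    "hc_url1": "https://blah1.com/healthcheck",
    "hc_url2": "https://blah2.com/healthcheck",
}

component = "hc_url1"  # as passed via --extra-vars "component=hc_url1"
message = f"Performing healthcheck at the URL {urls[component]}"
print(message)
```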

Ansible: make output from a command become a key-value item/variable for the next command

I want to use this output (from a previous command) as an array of key-values, or as an inventory, for the next command in the same playbook.
stdout:
hot-01: 10.100.0.101
hot-02: 10.100.0.102
hot-03: 10.100.0.103
....
hot-32: 10.100.0.132
like this:
- shell: "echo {{ item.key }} has value {{ item.value }}"
  with_items: "{{ output.stdout_lines }}"
or:
- add_host: name={{ item.key }} ansible_ssh_host={{ item.value }}
  with_items: "{{ output.stdout_lines }}"
Desired output of the echo command:
hot-01 has value 10.100.0.101
I also tried with with_dict: "{{ output.stdout }}" but still no luck
"fatal: [ANSIBLE] => with_dict expects a dict"
AFAIK there are no Jinja2 filters to convert strings to dictionaries.
But in your specific case, you can use Python's string split function to separate the key from the value:
- shell: "echo {{ item.split(': ')[0] }} has value {{ item.split(': ')[1] }}"
  with_items: "{{ output.stdout_lines }}"
I know, having to use split twice is a bit sloppy.
As in this case your output is a valid YAML, you can also do the following:
- shell: "echo {{ item.key }} has value {{ item.value }}"
  with_dict: "{{ output.stdout | from_yaml }}"
As a last resort, you can also create your own Jinja2 filter plugin to cover your case. There is a string-split filter that you can use as inspiration here: https://github.com/timraasveld/ansible-string-split-filter
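Both approaches (splitting each line, or parsing the whole block as YAML) produce the same mapping. A small Python sketch of the line-splitting variant, using the sample stdout from the question:

```python
# The stdout from the question: one "name: address" pair per line.
stdout = """hot-01: 10.100.0.101
hot-02: 10.100.0.102
hot-03: 10.100.0.103"""

# Equivalent of looping over `output.stdout_lines` and calling
# item.split(': ') on each line, but collected into a dict up front.
# maxsplit=1 keeps any later colons inside the value intact.
hosts = dict(line.split(": ", 1) for line in stdout.splitlines())

for name, address in hosts.items():
    print(f"{name} has value {address}")
```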