Gitlab-CI: variable in variable in downstream trigger job

(How) Can I use a variable's value as the name of another variable?
I have a job with matrix and dotenv artifacts as follows:
build-names:
  stage: build
  ...
  script:
    ...
    <lines omitted>
    ...
    - echo "${NAME}_DEB_PACKAGE_VERSION=${DEB_PACKAGE_VERSION}" >> build.env
  artifacts:
    reports:
      dotenv: build.env
  parallel:
    matrix:
      - NAME:
          - name1
          - name2
        TYPE:
          - 0
          - 1
  environment: $NAME/$TYPE
Then I have a downstream trigger job, again using a matrix, and I want to pass the appropriate package version based on ${NAME}:
build-images:
  stage: .post
  needs:
    - job: build-names
      artifacts: true
  variables:
    PACKAGE_VERSION_VARIABLE_NAME: ${NAME}_DEB_PACKAGE_VERSION
    PACKAGE_VERSION: ${${PACKAGE_VERSION_VARIABLE_NAME}}
    # OR
    PACKAGE_VERSION_VARIABLE_NAME: ${NAME}_DEB_PACKAGE_VERSION
    PACKAGE_VERSION: ${!PACKAGE_VERSION_VARIABLE_NAME}
  trigger:
    project: group/project-${NAME}
  parallel:
    matrix:
      - NAME:
          - name1
          - name2
Neither of the two approaches above (the nested ${${}} or the ! indirection) works in the variables section.
I could 'generate' the variable within the script section, but AFAIK you cannot have both trigger and script in the same job.
Is there a workaround for similar use cases?
(Using self-hosted gitlab 15.4)
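One possible workaround (a sketch, not from the thread): since bash-style indirect expansion such as ${!VAR_NAME} does work inside a script: section, a generator job can resolve the per-name version there and emit a dynamic child pipeline whose trigger jobs already carry literal values. Job and file names below (generate-triggers, triggers.yml, build-image-*) are placeholders:

generate-triggers:
  stage: build
  needs:
    - job: build-names
      artifacts: true
  script:
    # ${!VAR} is resolved by the shell, which the variables: section cannot do;
    # emit one trigger job per name into a generated child pipeline file
    - |
      for NAME in name1 name2; do
        VAR="${NAME}_DEB_PACKAGE_VERSION"
        echo "build-image-${NAME}:" >> triggers.yml
        echo "  variables:" >> triggers.yml
        echo "    PACKAGE_VERSION: \"${!VAR}\"" >> triggers.yml
        echo "  trigger:" >> triggers.yml
        echo "    project: group/project-${NAME}" >> triggers.yml
      done
  artifacts:
    paths:
      - triggers.yml

build-images:
  stage: .post
  trigger:
    include:
      - artifact: triggers.yml
        job: generate-triggers

The generated trigger jobs then pass PACKAGE_VERSION as a plain, already-resolved value to each downstream project, sidestepping the nested-expansion limitation.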

Related

How to reference a variable within a variable YAML GitLab pipeline

I am using env as a (lowercase) input variable for my pipeline, and I want this stage to use the correct AWS account based on the environment I input. Right now I have it set to just AWS_ACCOUNT_DEV, so I need to have separate stages.
I just want this one stage to be usable for all environments based on my input - how can I achieve this?
variables:
  AWS_ACCOUNT_DEV: 000000000
  AWS_ACCOUNT_NONPROD: 000000000

Import Alertmanager Endpoint:
  stage: import
  dependencies:
    - "Validate Credentials"
  tags:
    - ${runner}
  rules:
    - if: $CI_PROJECT_ID != $MONITORING_PROJECT_ID
      variables:
        AWS_IAM_DEPLOYER_ROLE: "arn:aws:iam::${AWS_ACCOUNT_DEV}:role/${runner_role}"
  ...
...
figured it out by adding:
rules:
  - if: $CI_PROJECT_ID != $MONITORING_PROJECT_ID
  - if: $ecs_env == "dev"
    variables:
      AWS_IAM_DEPLOYER_ROLE: "arn:aws:iam::${AWS_ACCOUNT_DEV}:role/${runner_role}"
  - if: $ecs_env == "nonprod"
    variables:
      AWS_IAM_DEPLOYER_ROLE: "arn:aws:iam::${AWS_ACCOUNT_NONPROD}:role/${runner_role}"

How to skip job in Gitlab pipeline and still run dependencies

I have two jobs in the same stage with a dependency specified via the "needs" keyword, for example JobA -> JobB (needs: [JobA]).
When I try to skip JobA with a rule (to speed up the build process), JobB throws an 'invalid yaml' error for the "needs" keyword, because the referenced JobA no longer exists.
What is the correct syntax/construct to enable such a dependency? Is the use of "rules" in JobA the right approach?
The simplified version of what I have is:
image1:
  stage: build-images
  script:
    - etc...
  rules:
    - changes:
        - values.env

image2:
  stage: build-images
  script:
    - ...
  needs: [image1]
Use needs:optional:
image1:
  stage: build-images
  script:
    - etc...
  rules:
    - changes:
        - values.env

image2:
  stage: build-images
  script:
    - ...
  needs:
    - job: image1
      optional: true

gitlab only runs one job in child-pipeline

I have a gitlab-ci.yml that generates and triggers a child pipeline .yml:
stages:
  - child-pipeline-generator
  - child-pipeline-trigger

generate-child-pipeline:
  stage: child-pipeline-generator
  tags:
    - GroupRunner
  script:
    - $(./generate-build.ps1) *>&1 > child-pipeline-gitlab-ci.yml
    - (Get-Content child-pipeline-gitlab-ci.yml) | Set-Content child-pipeline-gitlab-ci.yml -Encoding UTF8
  artifacts:
    paths:
      - child-pipeline-gitlab-ci.yml

trigger-child-pipeline:
  stage: child-pipeline-trigger
  trigger:
    include:
      - artifact: child-pipeline-gitlab-ci.yml
        job: generate-child-pipeline
    strategy: depend
The resulting yml looks like
build_1:
  tags:
    - GroupRunner
  script:
    - echo 'build_1'

build_2:
  tags:
    - GroupRunner
  script:
    - echo 'build_2'
But when executed, only the first job (build_1) shows up in the Downstream list.
Turned out the problem was the encoding of the PowerShell output. The default encoding in PowerShell 5 is UTF-16 with BOM, and my re-encoding to UTF8 resulted in UTF-8 with BOM, which GitLab can't handle properly. My solution was to encode in ASCII.
What I can't explain is why GitLab was able to interpret the first job correctly; I thought the encoding issue would lead to an all-or-nothing outcome. Maybe the CRLF-CRLF after the first job caused the error.
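For reference, a sketch of the generator job with the fix the answer describes (only the -Encoding switch changes; Windows PowerShell 5's UTF8 option always writes a BOM, while ASCII does not):

generate-child-pipeline:
  stage: child-pipeline-generator
  tags:
    - GroupRunner
  script:
    - $(./generate-build.ps1) *>&1 > child-pipeline-gitlab-ci.yml
    # re-write the generated file as ASCII so no BOM ends up in the child pipeline YAML
    - (Get-Content child-pipeline-gitlab-ci.yml) | Set-Content child-pipeline-gitlab-ci.yml -Encoding ASCII
  artifacts:
    paths:
      - child-pipeline-gitlab-ci.yml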

Multiple extends or multiple stages?

I want to have a CI to deploy two commands ("bash X" and "bash Y") on different production servers (server 1, server 2, server 3, etc.).
I looked into multiple stages, but that doesn't seem to answer my question.
I don't really care if it runs in parallel or B after A. (The manual section is for debugging.)
I don't know how to do it: I tried multiple extends, but only the last one (bashB) is taken into account in my pipeline.
stages:
  - get_password
  - bashA
  - bashB

get_password:
  stage: get_password
  # Steps

.bashA:
  stage: bashA
  script:
    - lorem ipsum
  when: manual
  only:
    changes:
      - script/bashA.sh

.bashB:
  stage: bashB
  script:
    - ipsum loreem
  when: manual
  only:
    changes:
      - script/bashB.sh

# SRV1
deploy-srv1:
  extends:
    - .bashA
    - .bashB
  variables:
    SRV_1: urlsrv1

# SRV2
deploy-srv2:
  extends:
    - .bashA
    - .bashB
  variables:
    SRV_1: urlsrv2
I just want to be able to deploy bashA and bash B on X servers (I just took 2 servers for example).
When using multiple extends in GitLab, some of the values will not be merged, but overwritten. If you check the documentation here:
https://docs.gitlab.com/ee/ci/yaml/#extends
They write:
The algorithm used for merge is “closest scope wins”, so keys from the last member will always shadow anything defined on other levels
You are not alone in wanting a feature to be able to merge scripts instead of overwriting them. Here's an open issue on GitLab to do what you described:
https://gitlab.com/gitlab-org/gitlab/issues/16376
In the meantime, and only looking at the example you provided, you can get something like what you want by manually merging bashA and bashB into one job:
stages:
  - get_password
  - bash

get_password:
  stage: get_password
  # Steps

.bash_both:
  stage: bash
  script:
    - lorem ipsum
    - ipsum loreem
  when: manual
  only:
    changes:
      - script/bashA.sh
      - script/bashB.sh

# SRV1
deploy-srv1:
  extends:
    - .bash_both
  variables:
    SRV_1: urlsrv1

# SRV2
deploy-srv2:
  extends:
    - .bash_both
  variables:
    SRV_1: urlsrv2
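Not part of the original answer, but on GitLab 13.9 and later the !reference custom YAML tag can splice the script lists of several hidden jobs into one job, so the two templates can stay separate (a sketch under that assumption):

.bashA:
  script:
    - lorem ipsum

.bashB:
  script:
    - ipsum loreem

deploy-srv1:
  stage: bash
  variables:
    SRV_1: urlsrv1
  when: manual
  script:
    # both lists are inlined here; nothing is shadowed by extends
    - !reference [.bashA, script]
    - !reference [.bashB, script]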

Way to use anchors/references in job.<spec>.script for DRYness

I'm fairly new to using gitlab-ci and as such, I've run into a problem where the following fails ci-lint because of my use of anchors/references:
image: docker:latest
services:
  - docker:dind

variables:
  DOCKER_DRIVER: overlay2
  DOCKER_HOST: tcp://localhost:2375

.install_thing1: &install_thing1
  - do things
  - to install
  - thing1

.install_thing2: &install_thing2
  - do things to
  - install thing2
.setup_thing1: &setup_thing1
  variables:
    VAR: var
    FOO: bar
  script:
    - all
    - the
    - things

before_script:
  ...
stages:
  - deploy-test
  - deploy-stage
  - deploy-prod

test:
  stage: deploy-test
  variables:
    RUN_ENV: "test"
    ...
  only:
    - tags
    - branches
  script:
    - *install_thing1
    - *install_thing2
    - *setup_thing1
    - other stuff
  ...

test:
  stage: deploy-stage
  variables:
    RUN_ENV: "stage"
    ...
  only:
    - master
  script:
    - *install_thing1
    - *install_thing2
    - *setup_thing1
    - other stuff
When I attempt to lint the gitlab-ci.yml, I get the following error:
Status: syntax is incorrect
Error: jobs:test:script config should be a string or an array of strings
The error alludes to the script section just needing to be an array of strings, which I believe I have. Use of the <<: *anchor pragma causes an error as well.
So, how can one accomplish what I'm trying to do here, without having to repeat the code in every script block?
You can fix it and even make it more DRY; take a look at the Auto DevOps template GitLab created.
It can fix your issue and further improve your CI file: have a template job like their auto_devops job, include it in a before_script, and then you can combine and call multiple functions in a script block.
The anchors only give you limited flexibility.
(This concept made it possible for me to have one CI file for 20+ projects and a centralized functions file I wget and load in my before_script.)
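A minimal sketch of that pattern (the job name .functions and the function names are placeholders, not taken from the Auto DevOps template): shell functions defined in before_script run in the same shell session as script, so any job that extends the template can call them.

.functions:
  before_script:
    # functions defined here are available to every line of the job's script
    - |
      function install_thing1() {
        echo "do things to install thing1"
      }
      function install_thing2() {
        echo "do things to install thing2"
      }

test:
  extends: .functions
  stage: deploy-test
  variables:
    RUN_ENV: "test"
  script:
    - install_thing1
    - install_thing2
    - other stuff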