Why are submodules not updated when using GIT_SUBMODULE_STRATEGY: recursive? - gitlab-ci

I created a .gitmodules file in the root of the MASTER project:
[submodule "SLAVE"]
    path = SLAVE
    url = ../../my-group/SLAVE.git
Added to MASTER's .gitlab-ci.yml:
variables:
  GIT_SUBMODULE_STRATEGY: recursive
Triggered MASTER's CI pipeline.
As a result, no changes made in the SLAVE project were applied when MASTER's CI ran.

You need to add GIT_SUBMODULE_UPDATE_FLAGS to the .gitlab-ci.yml. By default, submodules are checked out at the commit recorded in the superproject; --remote makes the runner update each submodule to the latest commit of its tracked branch instead.
variables:
  GIT_SUBMODULE_STRATEGY: recursive
  GIT_SUBMODULE_UPDATE_FLAGS: --remote --merge
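Put together, a minimal .gitlab-ci.yml for the MASTER project could look like this (the job name and script are placeholders):

```yaml
variables:
  GIT_SUBMODULE_STRATEGY: recursive             # clone/update submodules before every job
  GIT_SUBMODULE_UPDATE_FLAGS: --remote --merge  # follow the tracked branch tip, not the recorded commit

build:
  script:
    - ls SLAVE   # SLAVE/ now contains the latest commit of its tracked branch
```

Note that --remote makes pipelines non-reproducible over time; committing an updated submodule pointer in MASTER whenever SLAVE changes is the reproducible alternative.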

Related

GitLab CI - forced start of job during manual start of another job

I have a dependency problem. My pipeline first runs jobs that fetch the dependencies the other jobs require, and finally runs a cleanup stage that removes them all. The problem is that I have one manually launched stage which also needs these dependencies, but by then they have been cleaned up.
Question: can I somehow trigger the jobs that produce the dependencies when starting the manual stage? Or is there another way to solve this problem?
The normal behaviour of GitLab-CI is to clone the git repository at each job because the jobs can be run on different runners and thus need to be independent.
The automatic clone can be disabled by adding:
job-with-no-git-clone:
  variables:
    GIT_STRATEGY: none
If a job needs files/directories created in a previous stage, you must pass them along as GitLab artifacts:
stages:
  - one
  - two

job-with-git-clone:
  stage: one
  script:
    # this script creates something in the folder data
    # (which means $CI_PROJECT_DIR/data)
    - do_something
  artifacts:
    paths:
      - data/

job2-with-git-clone:
  stage: two
  script:
    # here you can use the files created in data
    - ls data/

job2-with-no-git-clone:
  stage: two
  variables:
    GIT_STRATEGY: none
  script:
    # here you can use the files created in data
    - ls data/
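For the original question (a manual job that still needs the artifacts), one option, sketched here with hypothetical job and stage names, is to have the manual job declare needs: on the producing job, so it downloads that job's artifacts whenever it is started:

```yaml
deploy-manual:
  stage: deploy
  when: manual
  needs:
    - job: job-with-git-clone   # hypothetical producer job
      artifacts: true           # fetch its artifacts even when started late
  script:
    - ls data/
```

Making the cleanup job itself manual, run only once no manual job still needs the dependencies, is another workaround.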

GitLab CI/CD run step only if on correct branch AND it has changes

I want a step in the build process to run only on the master branch, and only if there were changes to the src folder.
My .gitlab-ci.yml file thus contains:
build:php:
  stage: build
  image: alpine
  interruptible: true
  needs: [ "test:php" ]
  script:
    - do stuff # abbreviated for simplicity
  rules:
    - if: $LANGUAGE_RELEASE
      when: never
    - if: '$CI_COMMIT_REF_SLUG == "master"' # run for production test branch
    - changes:
        - src/*
However, the issue here is that it also runs on the dev branch when I change anything.
Question: Is there a way to have this step run only when both conditions (the branch and the changes) are met?
When using the rules keyword, the rules:if clause may be used, with the variable $CI_COMMIT_BRANCH.
Thus, something like below to specify master as the only branch to run the job:
build:php:
  stage: build
  # ...
  rules:
    - if: '$CI_COMMIT_BRANCH == "master"'
  # ...
(Rules are applied in order)
The documentation reference for common if clauses available is here.
Now, to combine an if and a changes condition so that both must match, put them in the same rule item:
build:php:
  stage: build
  # ...
  rules:
    - if: '$CI_COMMIT_BRANCH == "master"'
      changes:
        - file1 # single file
        - folder/**/* # folder including all files and subfolders
  # ...
You can read more about changes used in rules here and read the full changes specification here.
With the initial setup there were two issues:
- the changes were parsed as a separate rule (since they had a - in front), so either condition alone was enough to run the job
- src/* only matches files directly in the src folder, not in subfolders; for those you'll need src/**/*
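Applied to the original job, keeping the $LANGUAGE_RELEASE guard from the question, the corrected rules section would be:

```yaml
build:php:
  stage: build
  # ...
  rules:
    - if: $LANGUAGE_RELEASE
      when: never
    - if: '$CI_COMMIT_BRANCH == "master"'
      changes:
        - src/**/*   # any file under src/, recursively
```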
It is also possible to use the only keyword to specify master as the only branch to run the job. Despite being simple, this is no longer encouraged and it cannot be used together with rules (only/except reference).
Example:
build:php:
  stage: build
  # ...
  only:
    - master
  # ...
One caveat: rules: changes is not reliable outside branch and merge request pipelines. The GitLab docs say:
You should use rules: changes only with branch pipelines or merge
request pipelines. You can use rules: changes with other pipeline
types, but rules: changes always evaluates to true when there is no
Git push event. Tag pipelines, scheduled pipelines, manual pipelines,
and so on do not have a Git push event associated with them.
In other words, changes always evaluates to true for pipeline types other than branch or merge request pipelines.
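To avoid that always-true evaluation, the whole pipeline can be restricted to branch and merge request pipelines with workflow rules; a sketch:

```yaml
workflow:
  rules:
    - if: '$CI_PIPELINE_SOURCE == "push"'                 # branch pipelines
    - if: '$CI_PIPELINE_SOURCE == "merge_request_event"'  # merge request pipelines
```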

Configure Gitlab to fetch MR branch instead of commit hash

I have a CI configuration for merge requests like this:
# build artifacts and run tests
build-and-test:
  stage: build
  script:
    - mvn $MAVEN_CLI_OPTS clean verify
  only:
    - master
    - merge_requests
When I push to a MR, gitlab checks out a specific commit, instead of the branch being used in the MR. This makes my gitver configuration ignore the branch name.
Can I make Gitlab fetch the branch instead of the commit? Every MR has a specific branch, right?
MRs are actually built at a detached commit; it seems to be an implementation detail of GitLab. Maybe it's meant to protect against accidental pushes?
If you have git on your machine or in the build container, you can manually check out the $CI_MERGE_REQUEST_SOURCE_BRANCH_NAME branch, or use that variable somehow in your gitver configuration.
$CI_COMMIT_REF_NAME also seems to be set to source branch name in my tests, it might be a bit more convenient in case of having single job for both master and MR branches.
Edit: example pipeline for merge checks only:
build-and-test:
  stage: build
  script:
    - git checkout $CI_MERGE_REQUEST_SOURCE_BRANCH_NAME
    - mvn $MAVEN_CLI_OPTS clean verify
  only:
    - merge_requests
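If the checkout fails because the branch ref is not available in the runner's detached (and possibly shallow) clone, an explicit fetch first should help; a sketch, assuming the default origin remote and a reasonably recent Git:

```yaml
script:
  - git fetch origin $CI_MERGE_REQUEST_SOURCE_BRANCH_NAME  # make the branch ref available locally
  - git checkout $CI_MERGE_REQUEST_SOURCE_BRANCH_NAME
  - mvn $MAVEN_CLI_OPTS clean verify
```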

How to set gitlab-ci variables dynamically?

How can I set gitlab-ci variables from a script, not just in the "variables" section of .gitlab-ci.yml? I want to set variables in one job and use them in a different job.
There is currently no built-in way in GitLab to pass environment variables between stages or jobs.
But there is a feature request for that: https://gitlab.com/gitlab-org/gitlab/-/issues/22638
The current workaround is to use artifacts - basically, to pass files.
We had a similar use case - get Java app version from pom.xml and pass it to various jobs later in the pipeline.
How we did it in .gitlab-ci.yml:
stages:
  - prepare
  - package

variables:
  VARIABLES_FILE: ./variables.txt # the "." is required for images that have sh, not bash

get-version:
  stage: prepare
  script:
    - APP_VERSION=...
    - echo "export APP_VERSION=$APP_VERSION" > $VARIABLES_FILE
  artifacts:
    paths:
      - $VARIABLES_FILE

package:
  stage: package
  script:
    - source $VARIABLES_FILE
    - echo "Use env var APP_VERSION here as you like ..."
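On recent GitLab versions, the same hand-off can be done without manual sourcing via a dotenv report artifact, which automatically exposes the variables to jobs in later stages; a sketch (the version value is a placeholder):

```yaml
get-version:
  stage: prepare
  script:
    - echo "APP_VERSION=1.2.3" >> build.env   # placeholder version
  artifacts:
    reports:
      dotenv: build.env   # entries become job variables in later stages

package:
  stage: package
  script:
    - echo "APP_VERSION is $APP_VERSION"
```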
If you run a script you can set an environment variable:
export MY_VAR=the-value
Once set, the variable persists for the rest of that job's shell session - but not into other jobs.
Now for why you do not want to do that.
A tool like Gitlab CI is meant to achieve repeatability in your
artifacts. Consistency is the matter here. What happens if a second job
has to pick up a variable from the first? Then you have multiple paths!
# CI is a sequence
first -> second -> third -> fourth -> ...

# not a graph
first -> second A -> third -> ...
     \-> second B -/
How did you get to third? If you had to debug third, which path do you test? If the build in third is broken, who is responsible: second A or second B?
If you need a variable, use it now, not later in another job/script. Whenever you want to write a longer sequence of commands, make it a script and execute the script!
You can use either artifacts or cache to achieve this; see the official documentation for how they differ:
https://docs.gitlab.com/ee/ci/caching/#how-cache-is-different-from-artifacts

How to import the environment variables from the parent branch in a forked repo (GitLab)?

I'm setting up a Gitlab runner to SSH into a remote server so I can run tests on physical hardware, however the jobs fail when launched from my forked branch. I save the SSH keys as environment variables in the parent and they are not picked up by the jobs running on the forked runners. How can I import the environment variables from the parent?
The jobs are successful when I manually add the SSH key as an environment variable to my forked repo, however this is not scalable. I have tried adding the project and all people involved to a common group and set the same variables in there, as well as initiate Group Runners. It seems that if you kickoff a runner from your personal account then you cannot access the necessary variables.
In the .gitlab-ci.yml file I added some print out statements to help debug. I set the SSH_PRIVATE_KEY and RUNNER_ID to their required values in the parent repo and left unassigned in my forked branch. I got blank outputs when run from my personal account.
gitlab-ci.yml
hardware-1:
  image: ubuntu
  before_script:
    - echo "$SSH_PRIVATE_KEY"
    - echo "$RUNNER_ID"
  tags:
    - hardware
  script:
    - ssh pi@raspberry "./test-hardware.sh"
Runner console output on forked repo.
$ ...
$ Updating certificates in /etc/ssl/certs...
$ 0 added, 0 removed; done.
$ Running hooks in /etc/ca-certificates/update.d...
$ echo "$SSH_PRIVATE_KEY"
$ echo "$RUNNER_ID"
On the parent branch, the console outputs the actual SSH_PRIVATE_KEY and RUNNER_ID. How do I force the runner to always run from the parent repo?
It might be because of this:
Variables can be protected. When a variable is protected, it is only passed to pipelines running on protected branches or protected tags; other pipelines do not get any protected variables.
Protected variables can be added by going to your project's Settings > CI/CD, finding the section called Variables, and checking "Protected".
Once you set them, they will be available for all subsequent pipelines on protected refs.
To protect a branch or a tag:
Settings -> Repository -> Protected branches/tags
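A job can also guard against the variables being absent (e.g. in a fork or on an unprotected branch) and fail fast with a clear message; a sketch using the variable names from the question:

```yaml
hardware-1:
  image: ubuntu
  before_script:
    # fail early if protected variables were not injected into this pipeline
    - test -n "$SSH_PRIVATE_KEY" || { echo "SSH_PRIVATE_KEY is empty - unprotected branch or fork?"; exit 1; }
  tags:
    - hardware
```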