GitLab CI: conditional variables?

In our gitlab-ci file we have some variables like this:
variables:
...
LOCAL_IMAGE_PHP: $CI_REGISTRY_IMAGE/php:$CI_COMMIT_BRANCH
...
Which we use as the image file for another stage:
.build-php:
...
image: $LOCAL_IMAGE_PHP
...
We then have Docker images with the tags master, staging, and develop, so the correct image is automatically chosen based on which branch is currently being built.
We also want to use this setup to build and run automated tests on our feature branches; however, no php:feature/something images exist.
Is it possible to conditionally assign another variable based on the value of $CI_COMMIT_BRANCH? Something like DOCKER_TAG: $CI_COMMIT_BRANCH if ($CI_COMMIT_BRANCH in ['master', 'staging', 'develop']) else 'develop', so DOCKER_TAG would fall back to develop on a feature branch.
In other words, it should select the corresponding Docker image on the branches master, staging, and develop, but fall back to develop on any other branch.
Is something like that possible? Or is there a different way to handle this kind of scenario?
Thanks!
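A hedged sketch of one possible approach, assuming a GitLab version that supports workflow:rules:variables (13.11+): set DOCKER_TAG per rule, with a catch-all rule that falls back to develop. The regex and job names below mirror the question and are not verified against a real pipeline.

```yaml
# Sketch: DOCKER_TAG follows the branch on master/staging/develop
# and falls back to develop on any other branch.
workflow:
  rules:
    - if: '$CI_COMMIT_BRANCH =~ /^(master|staging|develop)$/'
      variables:
        DOCKER_TAG: $CI_COMMIT_BRANCH
    - when: always
      variables:
        DOCKER_TAG: develop

.build-php:
  image: $CI_REGISTRY_IMAGE/php:$DOCKER_TAG
```

Rules are evaluated in order, so the first match wins and the final `when: always` rule only applies to branches that missed the regex.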

Related

Can I default every dbt command to cautious indirect selection?

I have a dbt project with several singular tests that ref() several models, many of them testing if a downstream model matches some expected data in the upstream. Whenever I build only another downstream model that uses the same upstream models, dbt will try to execute the tests with an out-of-date model. Sample visualization:
Model "a" is an upstream view that only makes simple transformations on a source table
Model "x" is a downstream reporting table that uses ref("a")
Model "y" is another downstream reporting table that also uses ref("a")
There is a test "t1" making sure every a.some_id exists in x.some_key
Now if I run dbt build -s +y, "t1" will be picked up and executed; however, "x" is out of date compared to "a" since new data has been pushed into the source table, so the test will fail
If I run dbt build -s +y --indirect-selection=cautious the problem will not happen, since "t1" will not be picked up in the graph.
I want every single dbt command in my project to use --indirect-selection=cautious by default. Looking at the documentation, I've been unable to find any sort of environment variable or YML key in dbt_project that I could use to change this behavior. Setting a new default selector also doesn't help because it is overridden by the usage of the -s flag. Setting some form of alias does work, but it only affects me and not other developers.
Can I make every dbt build in my project use cautious selection by default, unless the flag --indirect-selection=eager is given?
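One hedged possibility, assuming a dbt version that honors the documented DBT_INDIRECT_SELECTION environment variable: commit a small script to the repo that every developer sources before running dbt, so cautious becomes the shared default while an explicit --indirect-selection=eager on the command line still takes precedence. The filename is an arbitrary choice.

```shell
# dbt_env.sh -- source this before running dbt.
# Assumption: your dbt version reads DBT_INDIRECT_SELECTION;
# an explicit --indirect-selection CLI flag still overrides it.
export DBT_INDIRECT_SELECTION=cautious
echo "indirect selection default: $DBT_INDIRECT_SELECTION"
```

This only helps if developers actually source the script, so wiring it into a shared Makefile or task runner target may be more reliable than documentation alone.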

gitlab-ci: Is there a way to visualize or simulate pipelines for different branch/tag names?

We have a fairly complex .gitlab-ci.yml configuration. I know about the new gitlab pipeline editor, but I can't find a way to 'simulate' what jobs get picked by my rules depending on the branch name, tag, etc.
On our jobs, we have a $PIPELINE custom variable to allow us to have different pipeline 'types' by using schedules to define this var to different values, like this:
rules:
- if: '$PIPELINE == "regular" && ($CI_COMMIT_BRANCH == "master" || $CI_COMMIT_TAG != null)'
or like this:
rules:
- if: '$CI_COMMIT_TAG != null'
Is there a way to 'simulate' a pipeline with different branch names, tags, and variables so I can see what jobs get picked in each case, without actually running the pipelines (e.g. with a test tag, etc.)? Or is there a better way to do this?
Thanks in advance.
Not quite, but this is close, with GitLab 15.3 (August 2022):
Simulate default branch pipeline in the Pipeline Editor
The pipeline editor helps prevent syntax errors in your pipeline before you commit. But pipeline logic issues are harder to spot. For example, incorrect rules and needs job dependencies might not be noticed until after you commit and try running a pipeline.
In this release, we’ve brought the ability to simulate a pipeline to the pipeline editor.
This was previously available in limited form in the CI Lint tool, but now you can use it directly in the pipeline editor. Use it to simulate a new pipeline creation on the default branch with your changes, and detect logic problems before you actually commit!
See Documentation and Issue.

Pass Value Between Pipelines

I have a Drone file containing multiple pipelines that run in a sequence via dependencies.
In the first pipeline a value is generated that I would like to store as a variable and use in one of the other pipelines.
How would I go about doing this? I've seen that variables can be passed between steps via a file, but from what I've seen and tried this isn't possible between pipelines.
Thanks
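Each Drone pipeline runs in its own workspace, which is why the file trick doesn't cross pipeline boundaries. One hedged workaround, assuming Drone 1.x syntax and a trusted repository whose pipelines land on the same agent: persist the value on a host volume. The pipeline names, image, and paths below are placeholders.

```yaml
kind: pipeline
type: docker
name: first
steps:
  - name: generate
    image: alpine
    volumes:
      - name: shared
        path: /shared
    commands:
      - echo "1.2.3" > /shared/version.txt   # the value to pass along
volumes:
  - name: shared
    host:
      path: /tmp/drone-shared

---
kind: pipeline
type: docker
name: second
depends_on:
  - first
steps:
  - name: consume
    image: alpine
    volumes:
      - name: shared
        path: /shared
    commands:
      - echo "building version $(cat /shared/version.txt)"
volumes:
  - name: shared
    host:
      path: /tmp/drone-shared
```

The caveat is real: host volumes require the repository to be marked trusted, and both pipelines must run on the same machine, so an external store (object storage, a registry tag) is the more robust option on multi-agent setups.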

Is there a way to trigger a child plan in Bamboo and pass it information like a version number?

We're using Go.Cd and transitioning to Bamboo.
One of the features we use in Go.Cd is value stream maps. This enables triggering another pipeline and passing information (and build artifacts) to the downstream pipeline.
This is valuable when an upstream build has a particular version number, and you want to pass that version number to the downstream build.
I want to replicate this setup in Bamboo (without a plugin).
My question is: Is there a way to trigger a child plan in Bamboo and pass it information like a version number?
This has three steps.
1. Use a parent plan/child plan to set up the relationship.
2. Using the Artifacts tab, set up shared artifacts to transfer files from one plan to another.
3a. At the end of the parent build, dump the environment variables to a file:
env > env.txt
3b. Set up (using the Artifacts tab) an artifact selector that picks this file up.
3c. Set up a fetch for this artifact from the shared artifacts in the child plan.
3d. Using the Inject Variables task, read the env.txt file you have transferred over. The build number from the original pipeline is now available in this downstream pipeline (just like Go.Cd).
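Instead of dumping the whole environment, step 3a can write only the values the child plan needs, since the Inject Variables task reads a key=value properties file. A minimal sketch; bamboo_buildNumber is the shell form of Bamboo's bamboo.buildNumber variable, and the fallback value is purely for illustration outside Bamboo:

```shell
# Outside a Bamboo agent this variable is unset, so default it for the demo.
: "${bamboo_buildNumber:=42}"

# Write a properties file the child plan's Inject Variables task can read.
echo "version=${bamboo_buildNumber}" > build.properties
cat build.properties
```

A narrow properties file also avoids leaking unrelated (possibly sensitive) environment variables into the child plan.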

Is there a way to make Gitlab CI run only when I commit an actual file?

New to Gitlab CI/CD.
What is the proper construct to use in my .gitlab-ci.yml file to ensure that my validation job runs only when a "real" checkin happens?
What I mean is, I observe that the moment I create a merge request, say—which of course creates a new branch—the CI/CD process runs. That is, the branch creation itself, despite the fact that no files have changed, causes the .gitlab-ci.yml file to be processed and pipelines to be kicked off.
Ideally I'd only want this sort of thing to happen when there is actually a change to a file, or a file addition, etc.—in common-sense terms, I don't want CI/CD running on silly operations that don't actually really change the state of the software under development.
I'm passably familiar with except and only, but these don't seem to be able to limit things the way I want. Am I missing a fundamental category or recipe?
I'm afraid what you ask is not possible within Gitlab CI.
There could be a way to use the CI_COMMIT_SHA predefined variable since that will be the same in your new branch compared to your source branch.
Still, the pipeline will run before it can determine or compare SHA's in a custom script or condition.
GitLab runs pipelines for branches or tags, not commits. Pushing to a repo triggers a pipeline, and branching is in fact pushing a change to the repo.
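That said, on newer GitLab versions one hedged mitigation is rules:changes with compare_to (14.7+), which lets a job skip when the branch has no file changes relative to a base branch. A sketch, not a drop-in config; the job name, script, and base branch are assumptions:

```yaml
validate:
  script: ./run-validation.sh   # placeholder for your validation step
  rules:
    - if: '$CI_PIPELINE_SOURCE == "push"'
      changes:
        paths:
          - '**/*'
        compare_to: 'refs/heads/develop'   # assumed base branch
```

Without compare_to, rules:changes evaluates to true on new-branch pipelines, which is exactly the behavior the question is trying to avoid.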