How can I disable a trigger from a linked repository in Bamboo YAML specs? - bamboo

We've been using Bamboo YAML specs to run our build plans. We use the default repository + a linked repository in that plan. The build plan now triggers if a commit/new branch has been created in the default repository (=desired behavior), but also when the linked repository has an update (=undesired behavior). How can I disable this via YAML specs?
The Bamboo documentation does not help me, and looking at a 'normal' (non-YAML-specs) build plan does not help either, since this option is not converted when selecting 'view as YAML specs': the generated YAML does not show whether the trigger for the linked repository is on or off.
Help would be much appreciated!

Insert a script task that compares bamboo.planRepository.<position>.revision to bamboo.planRepository.<position>.previousRevision (find the correct <position> for your repository). Skip the plan build (exit 0) if the two are the same.
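A rough, untested sketch of that script task in YAML specs. The job name, the repository position (1), and the build command are placeholders, and the script-task keys (interpreter, scripts) follow the Bamboo Specs YAML reference linked further down, so double-check them against your Bamboo version; only the bamboo.planRepository variables come from the suggestion above:

Build Job:
  tasks:
    - script:
        interpreter: SHELL
        scripts:
          - |
            # "1" is a placeholder: use the position of the repository whose
            # commits should trigger builds (usually the default repository).
            # If its revision did not change since the last build, the trigger
            # came from the linked repository, so stop here without building.
            if [ "${bamboo.planRepository.1.revision}" = "${bamboo.planRepository.1.previousRevision}" ]; then
              echo "Default repository unchanged - skipping build steps"
              exit 0
            fi
            ./build.sh   # placeholder for the real build steps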
Move away from YAML specs. They are still very limited compared to Java specs.

This will disable all triggers:
---
version: 2
triggers: []
https://docs.atlassian.com/bamboo-specs-docs/8.2.0/specs.html?yaml#plan-branches
This will enable the trigger for a specific repository only:
---
version: 2
...
triggers:
  - bitbucket-server-trigger:
      repositories:
        - your_repository_name_here
https://docs.atlassian.com/bamboo-specs-docs/8.2.0/specs.html?yaml#triggering-selected-repositories

Related

Pass services to child pipeline in GitLab

I am trying to generalize the CI/CD of our GitLab projects.
I am planning to create a cicd-templates repo containing general jobs that I run in multiple projects.
For example, I have a Terraform template that accepts input variables and runs init, validate, plan and apply jobs.
I am now trying to create a similar template for our python-nox sessions. The issue is that, for our integration tests, we need two services.
I would prefer not to include the services in the template, since they are not needed for the integration tests of other projects (but other services might be).
So I was wondering how I could include a CI template (from another project) and pass the needed images from the parent pipeline.
What is not working:
Parent/project pipeline:
trigger-nox-template:
  variables:
    IMAGE: "registry.gitlab.com/path/to/my/image:latest"
  trigger:
    include:
      - project: cicd-templates
        file: /nox_tests.yml
    strategy: depend
  services:
    - name: mcr.microsoft.com/mssql/server:2017-latest
      alias: db
    - name: mcr.microsoft.com/azure-storage/azurite:3.13.1
      alias: storage
cicd-templates/nox_tests.yml:
variables:
  IMAGE: "registry.gitlab.com/path/to/a/default/image:latest"

integration:
  image: '$IMAGE'
  script:
    - python -m nox -s integration
As I said, I could hardcode the services in the template as well, but they might vary based on the parent pipeline, so I'm looking for a more dynamic solution.
P.S. The way I pass the image does work, but if there is a more elegant way, that would be appreciated as well.
Thanks in advance!
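One untested direction, assuming GitLab expands CI/CD variables in services:name (listed as supported in the "where variables can be used" docs, but verify on your GitLab version): declare the services in the template with variable image names and defaults, and let the parent override them through the trigger job's variables, which are forwarded to the downstream pipeline. The DB_SERVICE and STORAGE_SERVICE variable names below are made up for this sketch:

# cicd-templates/nox_tests.yml (sketch)
variables:
  IMAGE: "registry.gitlab.com/path/to/a/default/image:latest"
  DB_SERVICE: "mcr.microsoft.com/mssql/server:2017-latest"            # placeholder default
  STORAGE_SERVICE: "mcr.microsoft.com/azure-storage/azurite:3.13.1"   # placeholder default

integration:
  image: '$IMAGE'
  services:
    - name: '$DB_SERVICE'        # relies on variable expansion in services:name
      alias: db
    - name: '$STORAGE_SERVICE'
      alias: storage
  script:
    - python -m nox -s integration

# parent pipeline (sketch) - variables on the trigger job are passed to the child pipeline
trigger-nox-template:
  variables:
    IMAGE: "registry.gitlab.com/path/to/my/image:latest"
    DB_SERVICE: "mcr.microsoft.com/mssql/server:2017-latest"
    STORAGE_SERVICE: "mcr.microsoft.com/azure-storage/azurite:3.13.1"
  trigger:
    include:
      - project: cicd-templates
        file: /nox_tests.yml
    strategy: depend

Projects that need different (or no) services then only change the variables in their trigger job instead of the template itself.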

How can I write a commit hash to a file using .gitlab-ci before build and deploy

I want to add an endpoint to my server to retrieve the current commit hash in production. I am using .gitlab-ci. I want the commit hash to be written to a file in the pipeline, before "build and deploy", so that I can read that file on request and return the latest deployed version. Can anyone help me with the steps and examples? Thanks in advance!
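A minimal sketch of the file-based approach described above, using the predefined CI_COMMIT_SHA variable and an artifact to hand the file to the deploy step (the stage names, job names, and the version.txt path are placeholders):

stages:
  - prepare
  - build_and_deploy        # placeholder for your existing stage(s)

write-version:
  stage: prepare
  script:
    # CI_COMMIT_SHA is a predefined GitLab CI/CD variable holding the commit being built
    - echo "$CI_COMMIT_SHA" > version.txt
  artifacts:
    paths:
      - version.txt         # later jobs in the pipeline download this artifact

build-and-deploy:
  stage: build_and_deploy
  script:
    - echo "build and deploy here, shipping version.txt with the application"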
I would offer an alternative to this: use GitLab's environments and deployments features, which, in part, consider this exact use case.
In your CI/CD configuration (.gitlab-ci.yml), you can specify an environment: key that will record deployments to your environment(s).
For example:
deploy:
  script:
    - echo "your deployment script here"
  environment:
    name: "production"
Now, when this job runs, GitLab will record it as a deployment that can be queried later.
Then you can use the deployments API or the environments API to get the latest deployment information which will include, among other information, the commit hash of the deployment.
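As a hedged example (the token variable and environment name are placeholders; check the parameters against the current deployments API docs), the newest production deployment can be fetched from any script with an API token. Sketched here as a CI job for concreteness; the commit hash is in the sha field of the response:

# GITLAB_API_TOKEN is a placeholder CI/CD variable you would define yourself.
latest-deployment:
  script:
    - >
      curl --header "PRIVATE-TOKEN: $GITLAB_API_TOKEN"
      "https://gitlab.example.com/api/v4/projects/$CI_PROJECT_ID/deployments?environment=production&order_by=created_at&sort=desc&per_page=1"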

Cannot find Additional Behaviours parameters in JJB

I am trying to create my jobs through JJB and almost all parameters are present.
But I can't find the parameters for the repositories that are in the Additional Behaviours category.
How can I configure them using JJB?
Maybe additional plugins are needed? (screenshot: Options screen)
The list of options in the web UI is for the git plugin. You can find them in JJB under the git scm module. If you wanted to add "Prune stale remote-tracking branches" (`git remote prune origin`) you would use:
- job:
    name: git-prune-remote
    scm:
      - git:
          prune: true
Full online documentation: https://jenkins-job-builder.readthedocs.io/en/latest/scm.html#scm.git

In GitlabCI - How can we trigger a build/pipeline if a specific build/pipeline is completed successfully?

We are using GitLab Enterprise Edition 10.8.7-ee 075705a and trying to use GitLab CI.
Here is my scenario:
I've two repositories, repo1 and repo2, and I'm setting up two pipelines, pipeline1 and pipeline2.
Now I'm looking for an option where I can configure pipeline2 to trigger a build if the pipeline1 build is successful. One more thing: I need to get the version number of pipeline1 in pipeline2.
Note: I know we can trigger pipeline2 from pipeline1, but I need it the other way around.
Please suggest.
A couple of options:
Use the GitLab API to do this (pipeline triggers); a sketch follows below.
Use webhooks to do this.
GitLab webhooks docs
GitLab triggers docs
With these you can get any data/metadata for your stack and call or set it automatically on any condition.
This can also be done if your stack uses AWS (CLI) and/or Jenkins.
Some sections of the GitLab triggers docs that may interest you:
When used with multi-project pipelines
When a pipeline depends on the artifacts of another pipeline
Triggering a pipeline from a webhook
Using cron to trigger nightly (or pretty much *ly) pipelines
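For the trigger-API option, a rough sketch under these assumptions: project 123 stands in for repo2, PIPELINE2_TRIGGER_TOKEN is a pipeline trigger token created in repo2 and stored as a secret variable in repo1, and CI_COMMIT_SHA stands in for whatever version number pipeline1 produces. A final job in pipeline1 calls pipeline2's pipeline-trigger endpoint and passes the version along as a variable:

# repo1/.gitlab-ci.yml (pipeline1) - assumes a last stage named "trigger"
trigger-pipeline2:
  stage: trigger
  script:
    - >
      curl --request POST
      --form "token=$PIPELINE2_TRIGGER_TOKEN"
      --form "ref=master"
      --form "variables[UPSTREAM_VERSION]=$CI_COMMIT_SHA"
      "https://gitlab.example.com/api/v4/projects/123/trigger/pipeline"

# repo2/.gitlab-ci.yml (pipeline2) - the passed variable is available to every job
print-upstream-version:
  script:
    - echo "Triggered by upstream version $UPSTREAM_VERSION"

Because the job sits in the last stage, it only runs when everything before it succeeded, and each variables[...] form field becomes a CI/CD variable in the triggered pipeline.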

Delete or reset Gitlab CI builds

Is it possible to delete old builds in Gitlab CI?
I tested a few things and now have about 20 builds that are useless (most of them failed anyway).
It also shows stages that I don't have anymore, which clutters the Pipelines page, and some of the uploaded artifacts are a bit big.
I wasn't able to find any documentation on this, only that disabling CI in the settings doesn't remove the builds.
Using Gitlab 8.10 Community (hosted by Gitlab.com)
There is currently no option in the GUI to completely get rid of a build, other than erasing the related data from the build (the Erase option on the build page).
If you had a local installation you could modify the database directly, but I would advise caution. (I'll put the guide here for completeness' sake.)
1. Log in to the GitLab database. If you use the default PostgreSQL:
   sudo -u gitlab-psql /opt/gitlab/embedded/bin/psql -h /var/opt/gitlab/postgresql -d gitlabhq_production
2. Check that the table ci_builds exists. In psql: \dt
3. Delete the builds with plain SQL. For example: DELETE FROM ci_builds WHERE id = 2;
4. (Optional) If you want to clean up the list of commits which triggered a build, you need to modify the table ci_commits.