Failed on startup: ExpectedArtifact matches multiple artifacts - spinnaker

I created a pipeline (Kubernetes V2 provider) with a GitHub trigger that expects multiple artifacts using a regex. The first stage is a Bake stage that uses that artifact as the "overrides" artifact.
If a push event is received containing multiple artifacts, the pipeline does not start, with the reason:
"Failed on startup: Expected artifact ExpectedArtifact(matchArtifact=Artifact(type=github/file, name=charts/values-.*.yml... matches multiple artifacts
I would like to execute a pipeline instance for each of the artifacts. For now, it seems to me that this cannot be done using Spinnaker alone. I could invoke a Jenkins job that, for each of the artifacts, triggers a pipeline, e.g. via webhook.
Could you please comment on this?
Thanks!

Does the override artifact need to have that same naming convention? I wonder if the workaround for this is to name the override artifact something like override-blah-blah.yml, which would make the Spinnaker trigger find only one matching artifact.
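As a sketch of that idea, the expected artifact in the pipeline's JSON configuration could match a literal file name instead of a regex, so at most one artifact in the push event can satisfy it (the file name below is hypothetical):

```json
{
  "expectedArtifacts": [
    {
      "matchArtifact": {
        "type": "github/file",
        "name": "charts/values-override.yml"
      }
    }
  ]
}
```

With a literal name, a push containing several changed files would still only ever bind one artifact to the trigger.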

Related

Gitlab CI - Trigger daily pipeline only if new changes have been committed

The company I work for has a self-hosted GitLab CE server, v13.2.1.
For a specific project I've set up the CI jobs to build according to the following workflow:
If a commit has been pushed to the main branch
If a merge request has been created
If a tag has been pushed
Every day at midnight to build the main branch (using scheduled pipelines)
Everything works fine. The only thing I would like to improve is that the nightly builds are performed even if the main branch has not been modified (no new commit).
I had a look at the GitLab documentation to change the workflow: rules in the .gitlab-ci.yml file, but I didn't find anything relevant.
The GitLab runner is installed in a VM and is set up as a shell executor. I was thinking of creating a file in the home directory to store the last commit ID. I'm not a big fan of that solution, because:
it's an ugly fix.
The pipeline will be triggered by GitLab even if it does nothing. This will pollute the pipeline list.
Is there any way to set up the workflow: section so that the pipeline list won't contain unnecessary pipelines?
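For reference, the file-based workaround described above (storing the last commit ID in the runner's home directory) might be sketched as follows; the file path is hypothetical, and CI_COMMIT_SHA is the commit checked out by the scheduled pipeline:

```shell
# Keep the SHA of the last nightly build in a file on the runner and
# compare it with the commit the scheduled pipeline checked out.
LAST_SHA_FILE="${LAST_SHA_FILE:-$HOME/.last_nightly_sha}"  # hypothetical path
CURRENT_SHA="${CI_COMMIT_SHA:-unknown}"                    # set by GitLab CI

LAST_SHA=""
[ -f "$LAST_SHA_FILE" ] && LAST_SHA="$(cat "$LAST_SHA_FILE")"

if [ "$CURRENT_SHA" = "$LAST_SHA" ]; then
  # Nothing new on the branch since the last nightly run.
  echo "No new commits since the last nightly build; skipping."
else
  # Record the new SHA and proceed with the actual build steps.
  echo "$CURRENT_SHA" > "$LAST_SHA_FILE"
  echo "Building $CURRENT_SHA"
fi
```

As the question notes, this still produces a pipeline entry even when nothing is built, so it does not keep the pipeline list clean.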

What is the use of custom-artifact in Spinnaker? It always gives the error "Custom references are passed on to cloud platforms to handle or process" (500)

I am trying to use a custom-artifact account in Spinnaker.
I have a pipeline where I want to pull an HTTP file (a deployment manifest) as an artifact and use it in a deployment.
I use custom-artifact and put the URL (https://raw.githubusercontent.com/sdputurn/flask-k8s-inspector/master/Deployment.yaml) in the reference.
I have tried running this pipeline multiple times, but it always fails with the error ("Internal Server Error", "message": "Custom references are passed on to cloud platforms to handle or process", "status": 500).
I saw some tutorials where they just use a custom artifact and put some HTTP URL to get files for the Deploy stage.
Steps to reproduce:
1. Create a new pipeline.
2. In the configuration stage, add an artifact and choose "custom-artifact".
3. Update the reference with (https://raw.githubusercontent.com/sdputurn/flask-k8s-inspector/master/Deployment.yaml).
4. Check "use default artifact" and fill in the same details.
5. Add a Deploy stage and use the artifact from the configuration stage.
6. Run the pipeline.
Spinnaker version: 1.16.1
As of Spinnaker 1.17.1, the custom-artifact type is deprecated. If possible, use an embedded artifact instead: produce an artifact in one execution and use that artifact in another execution.

GitLab: Is there a way to access an artifact over HTTP during a job, rather than after?

I'm trying to run a pipeline with two stages. The first stage creates a zip file, and the second stage executes an HTTP curl POST for that file. If the curl succeeds, the pipeline is complete.
The problem is that GitLab only exposes the zip file AFTER the pipeline has completed, which means the zip file from the previous pipeline gets sent instead.
I've tried using artifacts and dependencies, but it seems the HTTP URL is only exposed for completed pipelines. I tried using the URL of the specific job that executed the build stage, but that didn't work either.
Does anyone know how to access an artifact by URL before pipeline completion?
I was unable to find a way to access the artifact remotely before the pipeline had completed. Sad face.
I have a workaround though: I moved the deploy stage to a separate pipeline. The first pipeline just executes the build (generating the artifacts) and then triggers the second pipeline. The second pipeline is then able to access the artifacts of the first pipeline, and it just executes a deploy stage. Happy face.
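A minimal sketch of that workaround in the upstream .gitlab-ci.yml, assuming a GitLab version that supports the trigger keyword; the downstream project path is hypothetical:

```yaml
stages:
  - build
  - trigger

build:
  stage: build
  script:
    - zip -r release.zip dist/        # produce the artifact
  artifacts:
    paths:
      - release.zip

trigger_deploy:
  stage: trigger
  trigger:
    project: my-group/deploy-project  # hypothetical downstream project
```

Because the upstream pipeline finishes once the trigger job has run, the downstream deploy pipeline can then fetch the completed artifact, e.g. via the upstream project's job artifacts API.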

In GitLab CI, how can we trigger a build/pipeline if a specific build/pipeline completes successfully?

We are using GitLab Enterprise Edition 10.8.7-ee 075705a and trying to use GitLab CI.
Here is my scenario:
I have two repositories, repo1 and repo2, and I'm setting up two pipelines, pipeline1 and pipeline2.
Now I'm looking for an option to configure pipeline2 to trigger a build if a pipeline1 build is successful. One more thing: I need to get the version number of pipeline1 in pipeline2.
Note: I know we can trigger pipeline2 from pipeline1, but I need the other way around.
Please suggest.
A couple of options:
Use the GitLab API to do this (triggers).
Use webhooks to do this.
GitLab webhooks docs
GitLab triggers docs
With these you can get any data/metadata for your stack and automagically call it/set it on any condition.
This can also be done if your stack is using AWS (CLI) and/or Jenkins.
Some sections that may interest you in the GitLab triggers docs:
When used with multi-project pipelines
When a pipeline depends on the artifacts of another pipeline
Triggering a pipeline from a webhook
Using cron to trigger nightly (or pretty much *ly) pipelines
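As a sketch of the triggers approach, repo1's .gitlab-ci.yml could end with a job that calls repo2's pipeline trigger endpoint and forwards a version value as a variable. The token variable name, project ID placeholder, and host are hypothetical; the endpoint shape follows the GitLab pipeline triggers API:

```yaml
notify_repo2:
  stage: deploy        # runs only after the earlier stages succeed
  script:
    - >
      curl --request POST
      --form "token=$REPO2_TRIGGER_TOKEN"
      --form "ref=master"
      --form "variables[UPSTREAM_VERSION]=$CI_PIPELINE_ID"
      "https://gitlab.example.com/api/v4/projects/<repo2-project-id>/trigger/pipeline"
```

In repo2's pipeline, the forwarded value would then be available as $UPSTREAM_VERSION. Placing this job in the last stage means it only fires when the rest of pipeline1 has completed successfully.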

Global variable in Jenkins Repository URL

I am trying to use a global Jenkins variable in the Repository URL field:
Repository URL: ${BUILD-PEND-SRC}
BUILD-PEND-SRC is defined in Configure System, and its value is set to a proper URL. If I do a shell-execution job with echo ${BUILD-PEND-SRC}, it does display the correct value.
However, when I run the job, I get
ERROR: Failed to check out ${BUILD-PEND-SRC}
org.tmatesoft.svn.core.SVNException: svn: E125002: Malformed URL '${BUILD-PEND-SRC}'
Which tells me that Jenkins did not resolve ${BUILD-PEND-SRC}.
I am summarizing the SO answer that solved it for Git-based Jenkins pipeline jobs, but it also applies to SVN-based jobs: https://stackoverflow.com/a/57065165/1994888 (credits go to #rupesh).
Summary
Edit your job config.
Go to the Pipeline section.
Go to the definition "Pipeline script from SCM".
Uncheck "Lightweight checkout".
The issue seems to be with the scm-api-plugin (see the bug report in the Jenkins issue tracker); hence, it is not specific to a version control system.