Passing variables between jobs in an Azure Pipeline yields an empty result

I am writing an Azure Pipelines YAML file that needs to pass variables between jobs, but the variables are not passing through: the second job receives an empty value.
here is my pipeline:
jobs:
- job: UpdateVersion
  variables:
    terraformRepo: ${{ parameters.terraformRepo }}
  pool:
    vmImage: ubuntu-latest
  steps:
  - checkout: self
    persistCredentials: true
  - checkout: ${{ parameters.terraformRepo }}
  - task: AzureCLI@2
    displayName: PerformVerUpdate
    inputs:
      azureSubscription: ${{ parameters.azureSubscriptionName }}
      scriptType: bash
      scriptLocation: inlineScript
      inlineScript: |
        echo Step 3 result
        echo "Reponame $Reponame"
        echo "notify $notify"
        echo "pullRequestId $pullRequestId"
        echo "##vso[task.setvariable variable=pullRequestId;isOutput=true;]$pullRequestId"
        echo "##vso[task.setvariable variable=Reponame;isOutput=true;]$Reponame"
        echo "##vso[task.setvariable variable=notify;isOutput=true;]true"
    Name: PerformVerUpdate
- job: SlackSuccessNotification
  dependsOn: UpdateVersion
  condition: and(succeeded(), eq(dependencies.UpdateVersion.outputs['PerformVerUpdate.notify'], 'true'))
  pool:
    vmImage: 'ubuntu-latest'
  variables:
  - group: platform-alerts-webhooks
  - name: notify_J1
    value: $[ dependencies.UpdateVersion.outputs['PerformVerUpdate.notify'] ]
  - name: pullRequestId_J1
    value: $[ dependencies.UpdateVersion.outputs['PerformVerUpdate.pullRequestId'] ]
  - name: Reponame_J1
    value: $[ dependencies.UpdateVersion.outputs['PerformVerUpdate.Reponame'] ]
  steps:
  - task: AzurePowerShell@5
    displayName: Slack Notification
    inputs:
      pwsh: true
      azureSubscription: ${{ parameters.azureSubscriptionName }}
      ScriptType: 'InlineScript'
      TargetAzurePs: LatestVersion
      inline: |
        write-host "Reponame $(Reponame_J1)"
        write-host "pullRequest $(pullRequestId_J1)"
I've tried many different syntaxes, but the variables still do not pass between the two jobs - e.g. the condition receives a null result in the second job: (Expanded: and(True, eq(Null, 'true'))). Could anyone help with this?

Firstly, 'Name' should be lowercase 'name':
name: PerformVerUpdate
The rest of the syntax seems fine (I tested it with a Bash task, because I do not have an Azure subscription).
If renaming 'Name' does not help, I suspect the problem may be that your script is running inside the AzureCLI@2 task.
Maybe as a workaround you could add a new Bash task right after AzureCLI@2 and set the output variables for the next job there.
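A minimal sketch of that workaround, assuming the values are handed over to the Bash task via files written by the AzureCLI@2 task (the file handoff is an assumption; the variable and step names are taken from the question):

```yaml
- task: AzureCLI@2
  displayName: PerformVerUpdate
  inputs:
    azureSubscription: ${{ parameters.azureSubscriptionName }}
    scriptType: bash
    scriptLocation: inlineScript
    inlineScript: |
      # hand the computed values over to the next step via files
      echo "$pullRequestId" > pullRequestId.txt
      echo "$Reponame" > Reponame.txt
- bash: |
    # publish the outputs from a plain Bash task so the next job can read them
    pr_id=`cat pullRequestId.txt`
    repo=`cat Reponame.txt`
    echo "##vso[task.setvariable variable=pullRequestId;isOutput=true]$pr_id"
    echo "##vso[task.setvariable variable=Reponame;isOutput=true]$repo"
    echo "##vso[task.setvariable variable=notify;isOutput=true]true"
  name: PerformVerUpdate
```

Because the Bash step carries the `name: PerformVerUpdate`, the downstream `dependencies.UpdateVersion.outputs['PerformVerUpdate.*']` references from the question stay unchanged.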

Related

How to pass variable to REST API that is defined in Environment for a yaml pipeline and how to get the REST API to translate the variable

In Azure DevOps I have a REST API defined in an Environment, and I have a YAML pipeline that sets a variable. I want to pass this variable to the REST API, but it does not work. I can get it to work fine in a Release Pipeline.
The REST API is used as a gate on the Environment.
This is the URL and parameters of the REST API: servicenowchecker/api/ServiceNow/GetServiceNowChangeStatus/$(PipelineChangeNumber)?onlyDate=true. My problem is getting the content of $(PipelineChangeNumber) to resolve; I cannot find out how it has to be written in the URL and Parameters section.
My YAML pipeline looks like this - can someone please help me out - thanks
name: Test_serviceNowCheker-$(Date:yyyyMMdd)$(Rev:.r)
trigger: none
variables:
- name: PipelineChangeNumber
  value: CHG0070534
stages:
- stage: Stage1
  jobs:
  - deployment:
    pool:
      name: dotNET
    environment:
      name: P1_Urgent_CRB_Environment
  - job: Say_Hello
    pool:
      name: dotNEt
    steps:
    - task: PowerShell@2
      displayName: Hello from stage1
      inputs:
        targetType: 'inline'
        script:
          write-host "Hello from stage 1";
    - task: PowerShell@2
      displayName: Set Outputvar OutPutCase
      inputs:
        targetType: 'inline'
        script: Write-Host "##vso[task.setvariable variable=ChangeNumber;isoutput=true]CHG0070534";
      name: OutPutCase
    - powershell: |
        write-host "OutPutCase.ChangeNumber = $(OutPutCase.ChangeNumber)";
        write-host "PipelineChangeNumber = $(PipelineChangeNumber)";
- stage: Stage2
  dependsOn: Stage1
  variables:
    Stage2_ChangeNumber: $[ stageDependencies.Stage1.Say_Hello.outputs['OutPutCase.ChangeNumber'] ]
  jobs:
  - deployment:
    pool:
      name: DotNet
    environment:
      name: P1_Urgent
  - job: Say_Hello_again
    pool:
      name: DotNet
    steps:
    - task: PowerShell@2
      displayName: Hello from stage2
      inputs:
        targetType: 'inline'
        script:
          write-host "Hello from stage 2";
          write-host "Stage2_ChangeNumber = $(Stage2_ChangeNumber)";
          write-host "PipelineChangeNumber = $(PipelineChangeNumber)";
I have tried writing it as $(PipelineChangeNumber) and as $[ $(PipelineChangeNumber) ], and then the REST API does not work at all.
When it is written as $[ $PipelineChangeNumber ] the REST API starts up, but it sends https://XXXXXX-xx/servicenowchecker/api/ServiceNow/GetServiceNowChangeStatus/$[ $PipelineChangeNumber ]?onlyDate=true, and $PipelineChangeNumber is null.

GitHub Actions: Use variables in matrix definition?

I have the following code in a GitHub Action config:
name: Build & Tests
on:
  pull_request:
env:
  CARGO_TERM_COLOR: always
  ZEROCOPY_MSRV: 1.61.0
  ZEROCOPY_CURRENT_STABLE: 1.64.0
  ZEROCOPY_CURRENT_NIGHTLY: nightly-2022-09-26
jobs:
  build_test:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        # See `INTERNAL.md` for an explanation of these pinned toolchain
        # versions.
        channel: [ ${{ env.ZEROCOPY_MSRV }}, ${{ env.ZEROCOPY_CURRENT_STABLE }}, ${{ env.ZEROCOPY_CURRENT_NIGHTLY }} ]
        target: [ "i686-unknown-linux-gnu", "x86_64-unknown-linux-gnu", "arm-unknown-linux-gnueabi", "aarch64-unknown-linux-gnu", "powerpc-unknown-linux-gnu", "powerpc64-unknown-linux-gnu", "wasm32-wasi" ]
        features: [ "" , "alloc,simd", "alloc,simd,simd-nightly" ]
        exclude:
          # Exclude any combination which uses a non-nightly toolchain but
          # enables nightly features.
          - channel: ${{ env.ZEROCOPY_MSRV }}
            features: "alloc,simd,simd-nightly"
          - channel: ${{ env.ZEROCOPY_CURRENT_STABLE }}
            features: "alloc,simd,simd-nightly"
I'm getting the following parsing error on this file:
Invalid workflow file: .github/workflows/ci.yml#L19
You have an error in your yaml syntax on line 19
It appears to be referring to this line (it's actually one off, but maybe it's zero-indexing its line numbers?):
channel: [ ${{ env.ZEROCOPY_MSRV }}, ${{ env.ZEROCOPY_CURRENT_STABLE }}, ${{ env.ZEROCOPY_CURRENT_NIGHTLY }} ]
Is there any way to use variables in the matrix definition like this? Or do I just need to hard-code everything?
According to the documentation (reference 1 and reference 2)
Environment variables (at the workflow level) are available to the steps of all jobs in the workflow.
In your example, the environment variables are used at the job level (inside the job strategy / matrix definition), not inside the job steps.
At that level, environment variables aren't interpolated by the GitHub interpreter.
First alternative
Hardcode the values inside the channel field inside your matrix strategy:
Example:
channel: [ "1.61.0", "1.64.0", "nightly-2022-09-26" ]
However, you'll have to do this for each job (bad for maintenance as duplicated code).
Second alternative
Use inputs (with a reusable workflow's workflow_call trigger, or with the workflow_dispatch trigger).
Example:
on:
  workflow_dispatch: # or workflow_call
    inputs:
      test1:
        description: "Test1"
        required: false
        default: "test1"
      test2:
        description: "Test2"
        required: false
        default: "test2"
      test3:
        description: "Test3"
        required: false
        default: "test3"
jobs:
  build_test:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        channel: [ "${{ inputs.test1 }}", "${{ inputs.test2 }}", "${{ inputs.test3 }}" ]
In that case, inputs will be interpolated by the GitHub interpreter.
However, you'll need to trigger the workflow from another workflow, or through the GitHub API, to send the inputs (in some ways this gives you more flexibility with the values, but it increases the complexity).
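For the workflow_call route, the caller might look like the sketch below (the file name build_test.yml is an assumption, and the reusable workflow is assumed to declare an on: workflow_call trigger with string-typed test1/test2/test3 inputs):

```yaml
# Hypothetical caller workflow; forwards the pinned toolchain versions
# as inputs to the reusable workflow shown above.
name: CI
on:
  pull_request:
jobs:
  call_build_test:
    uses: ./.github/workflows/build_test.yml
    with:
      test1: "1.61.0"
      test2: "1.64.0"
      test3: "nightly-2022-09-26"
```

This keeps the version pins in one place (the caller), so the duplication problem of the first alternative is reduced to a single file.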

Unexpected value 'steps' in azure-pipelines.yml

Pipeline validation fails with Unexpected value 'steps' in acr-login.yaml. Even though I triple-checked the docs and Stack Overflow answers, I can't find the issue in my pipeline:
pipeline.yaml
trigger: none
pool:
  name: MyPool
variables:
- template: vars/global.yaml
- template: vars/stage.yaml
stages:
- stage: Import
  jobs:
  - template: steps/acr-login.yaml
    parameters:
      registry_name: ${{variables.registry_name}}
acr-login.yaml
parameters:
- name: registry_name
  type: string
steps:
- bash: |
    az login --identity --output none
    az acr login --name ${{ parameters.registry_name }} --output none
Your acr-login.yaml must contain one or multiple jobs, since you are using it as a job template:
parameters:
- name: registry_name
  type: string
jobs:
- job: myStep
  steps:
  - bash: |
      az login --identity --output none
      az acr login --name ${{ parameters.registry_name }} --output none
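Alternatively, if you prefer to keep acr-login.yaml as a steps template, you can reference it from a job defined in the main pipeline instead (a sketch; the job name AcrLogin is arbitrary):

```yaml
stages:
- stage: Import
  jobs:
  - job: AcrLogin
    steps:
    - template: steps/acr-login.yaml
      parameters:
        registry_name: ${{ variables.registry_name }}
```

Either way the rule is the same: a template referenced under `jobs:` must define jobs, and one referenced under `steps:` must define steps.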

Drone template not triggering build

Following is how our .drone.yml looks (the templates are also listed below); this is an example configuration very close to what we want in production. The reason we are using a template is that our staging and production configurations are similar, with only some values differing (hence the circuit template), and we wanted to remove the duplication using the template circuit.yaml.
But currently we are unable to do so. If I don't define test.yaml as a template and instead import the test step without a template (keeping the circuit template to avoid the duplicate declaration of the staging and production builds), the drone build fails with:
"template converter: template name given not found"
If I define the test step as a template, the test step works, but on creating a tag I see the following error:
{"commit":"28ac7ad3a01728bd1e9ec2992fee36fae4b7c117","event":"tag","level":"info","msg":"trigger: skipping build, no matching pipelines","pipeline":"test","ref":"refs/tags/v1.4.0","repo":"meetme2meat/drone-example","time":"2022-01-07T19:16:15+05:30"}
---
kind: template
load: test.yaml
data:
  commands:
  - echo "machine github.com login $${GITHUB_LOGIN} password $${GITHUB_PASSWORD}" > /root/.netrc
  - chmod 600 /root/.netrc
  - go clean -testcache
  - echo "Running test"
  - go test -race ./...
---
kind: template
load: circuit.yaml
data:
  deploy: deploy
  create_tags:
    commands:
    - echo "Deploying version $DRONE_SEMVER"
    - echo -n "$DRONE_SEMVER,latest" > .tags
  backend_image:
    version: ${DRONE_SEMVER}
    tags:
    - '${DRONE_SEMVER}'
    - latest
And the template is below
test.yaml
kind: pipeline
type: docker
name: test
steps:
- name: test
  image: golang:latest
  environment:
    GITHUB_LOGIN:
      from_secret: github_username
    GITHUB_PASSWORD:
      from_secret: github_token
  commands:
  {{range .input.commands }}
  - {{ . }}
  {{end}}
  volumes:
  - name: deps
    path: /go
- name: build
  image: golang:alpine
  commands:
  - go build -v -o out .
  volumes:
  - name: deps
    path: /go
volumes:
- name: deps
  temp: {}
trigger:
  branch:
  - main
  event:
  - push
  - pull_request
circuit.yaml
kind: pipeline
type: docker
name: {{ .input.deploy }}
steps:
- name: create-tags
  image: alpine
  commands:
  {{range .input.create_tags.commands }}
  - {{ . }}
  {{end}}
- name: build
  image: plugins/docker
  environment:
    GITHUB_LOGIN:
      from_secret: github_username
    GITHUB_PASSWORD:
      from_secret: github_token
    VERSION: {{ .input.backend_image.version }}
    SERVICE: circuits
  settings:
    auto_tag: false
    repo: ghcr.io/meetme2meat/drone-ci-example
    registry: ghcr.io

Why is this rule preventing my GitLab stage from running?

In my .gitlab-ci.yml file I have this stage, which uses environment variables in artifacts from a previous stage:
build_dev_containers:
  stage: build_dev_containers
  variables:
    CI_DEBUG_TRACE: "true"
  script:
    - whoami
…and it outputs the following debug information:
++ DEV_CONTAINERS=true
If I change it by adding the following rule, the stage no longer runs:
rules:
  - if: '$DEV_CONTAINERS == "true"'
Any idea what I could be doing wrong?
Not sure if this information adds any value, but just in case:
My previous stage outputs a .env file in its artifacts, and it contains the value
DEV_CONTAINERS=true
Here is the complete file. The PowerShell script creates package.env in the root path:
image: microsoft/dotnet:latest
variables:
  GIT_RUNNER_PATH: 'C:\GitLab'
  SCRIPTS_PATH: '.\Lava-Tools\BuildAndDeploy\BuildServer'
stages:
  - dev_deploy
  - build_dev_containers
dev_deploy:
  stage: dev_deploy
  tags:
    - lava
  variables:
    GIT_CLONE_PATH: '$GIT_RUNNER_PATH/builds/d/$CI_COMMIT_SHORT_SHA/$CI_PROJECT_NAME'
  script:
    - 'powershell -noprofile -noninteractive -executionpolicy Bypass -command ${SCRIPTS_PATH}\createdevdeployvars.ps1 -Branch "${CI_COMMIT_REF_NAME}" -ShortCommitHash "${CI_COMMIT_SHORT_SHA}"'
  artifacts:
    reports:
      dotenv: package.env
build_dev_containers:
  stage: build_dev_containers
  image: docker.repo.ihsmarkit.com/octo/alpine/build/dotnet:latest
  tags:
    - lava-linux-containers
  variables:
    CI_DEBUG_TRACE: "true"
  script:
    - whoami
  rules:
    - if: '$DEV_CONTAINERS == "true"'
Rules are evaluated before any job starts, so a rule cannot evaluate a variable produced by another job's artifacts.
As a workaround I used if statements in the script: sections instead:
build_dev_containers:
  stage: build_dev_containers
  image: docker.repo.ihsmarkit.com/octo/alpine/build/dotnet:latest
  tags:
    - lava-linux-containers
  script:
    - if [ "$DEV_CONTAINERS" == "true" ]; then echo "DEV_CONTAINERS is true - running"; else echo "DEV_CONTAINERS is not true - skipping"; exit 0; fi
    - whoami
deploy_dev_containers:
  stage: deploy_dev_containers
  tags:
    - lava
  script:
    - |
      if ( "$DEV_CONTAINERS" -eq "true" ) {
        Write-Output "DEV_CONTAINERS is true - running"
      }
      else {
        Write-Output "DEV_CONTAINERS is not true - skipping"
        exit 0
      }
    - ls