Something happened with my Drone configuration: since today it's not finding the environment variables. Until a few days ago I could run the pipeline, but today I can't.
This is the step in the pipeline:
pipeline:
  [...]
  sdk:
    image: mycompany/swagger-codegen:latest
    environment:
      - API_SWAGGER_JSON_URL=http://api.mycompany.biz:9000/v1/swagger.json
      - API_PACKAGE=com.mycompany.api
      - API_GROUP_ID=com.mycompany.api
      - API_ARTIFACT_ID=sdk
      - API_VERSION=0.1-SNAPSHOT
    when:
      branch: master
    commands:
      - java -jar /usr/lib/swagger/swagger-codegen-cli.jar generate
        -i ${API_SWAGGER_JSON_URL}
        --api-package ${API_PACKAGE}
        --invoker-package ${API_PACKAGE}.client
        --model-package ${API_PACKAGE}.client.model
        --group-id ${API_GROUP_ID}
        --artifact-id ${API_ARTIFACT_ID}
        --artifact-version ${API_VERSION}
        -l java
        -o ./swagger-codegen-source
      - etc.
And this is what I get:
+ java -jar /usr/lib/swagger/swagger-codegen-cli.jar generate -i --api-package --invoker-package .client --model-package .client.model --group-id --artifact-id --artifact-version -l java -o ./swagger-codegen-source
Exception in thread "main" io.airlift.airline.ParseArgumentsUnexpectedException: Found unexpected parameters: [java]
at io.airlift.airline.Cli.validate(Cli.java:148)
at io.airlift.airline.Cli.parse(Cli.java:116)
at io.airlift.airline.Cli.parse(Cli.java:97)
at io.swagger.codegen.SwaggerCodegen.main(SwaggerCodegen.java:36)
Look at the command: every environment variable was substituted with an empty string. Am I doing something wrong?
You should use $variable or $${variable} instead of ${variable}.
This is because Drone interpolates runtime variables [1] into the YAML using the ${variable} syntax. This behavior is similar to docker-compose, which Drone uses as a baseline for functionality and syntax.
[1] http://docs.drone.io/environment/
[2] http://docs.drone.io/secrets-not-working/#variable-expansion
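For illustration, here is how the commands block from the question could be rewritten so that Drone leaves expansion to the shell, using the $${...} escape from the answer above (a sketch, not a verified configuration):
commands:
  - java -jar /usr/lib/swagger/swagger-codegen-cli.jar generate
    -i $${API_SWAGGER_JSON_URL}
    --api-package $${API_PACKAGE}
    --invoker-package $${API_PACKAGE}.client
    --model-package $${API_PACKAGE}.client.model
    --group-id $${API_GROUP_ID}
    --artifact-id $${API_ARTIFACT_ID}
    --artifact-version $${API_VERSION}
    -l java
    -o ./swagger-codegen-source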
Related
I recognize that within a pipeline step I can run a simple export, like:
commands:
  - export MY_ENV_VAR=$(my-command)
...but if I want to use this env var throughout the whole pipeline, is it possible to do something like this:
environment:
  MY_ENV_VAR: $(my-command)
When I do this, I get yaml: unmarshal errors: line 23: cannot unmarshal !!seq into map[string]*yaml.Variable, which suggests this isn't possible. My end goal is to write a Drone plugin that accepts the output of $(...) as one of its settings. I'd prefer the plugin not run the command itself, but just use the output.
I've also attempted to use step dependencies to export an env var, however its state doesn't carry over between steps:
- name: export
  image: bash
  commands:
    - export MY_VAR=$(my-command)

- name: echo
  image: bash
  depends_on:
    - export
  commands:
    - echo $MY_VAR   # empty
Writing the command output to a script file might be a better way to do what you want, since filesystem changes are persisted between individual steps.
---
kind: pipeline
type: docker

steps:
  - name: generate-script
    image: bash
    commands:
      # - my-command > plugin-script.sh
      - printf "echo Fetching Google;\n\ncurl -I https://google.com/" > plugin-script.sh

  - name: test-script-1
    image: curlimages/curl
    commands:
      - sh plugin-script.sh

  - name: test-script-2
    image: curlimages/curl
    commands:
      - sh plugin-script.sh
From Drone's Docker pipeline documentation:
Workspace
Drone automatically creates a temporary volume, known as your workspace, where it clones your repository. The workspace is the current working directory for each step in your pipeline.
Because the workspace is a volume, filesystem changes are persisted between pipeline steps. In other words, individual steps can communicate and share state using the filesystem.
⚠ Workspace volumes are ephemeral. They are created when the pipeline starts and destroyed after the pipeline completes.
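The same trick works for plain variables: an earlier step can write them into a file in the workspace and a later step can source it. A minimal sketch, assuming both images provide a POSIX shell and reusing the my-command placeholder from the question:
steps:
  - name: export
    image: bash
    commands:
      # write the value into the shared workspace
      - echo "MY_VAR=$(my-command)" > .build-env

  - name: echo
    image: bash
    commands:
      # load it back in a later step
      - . ./.build-env
      - echo $MY_VAR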
You can't execute a command in the environment block, period.
But maybe you can define a "command string" in the environment block, like:
environment:
  MY_ENV_VAR: 'echo "this is command to execute"'   # note the single quotes
then, in the commands block:
commands:
  - eval $MY_ENV_VAR
Worth a try.
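Put together in a single Drone step, that suggestion might look like the following (a sketch only; the step name and image are made up):
- name: eval-example
  image: bash
  environment:
    MY_ENV_VAR: 'echo "this is command to execute"'
  commands:
    - eval "$MY_ENV_VAR"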
I have a CI pipeline in GitLab (relevant part only):
default:
  image: docker:latest

variables:
  DOCKER_APP_TAG: ${REGISTRY_URL}/${APP_NAME}:${CI_COMMIT_SHORT_SHA}

stages:
  - build

.config:
  only:
    - branches
    - merge_requests
    - tags
  except:
    - triggers
  tags:
    - prod

build-app:
  extends: .config
  stage: build
  script:
    - docker build --target production -t ${DOCKER_APP_TAG} -f ${CI_PROJECT_DIR}/etc/node/Dockerfile .
When I build from a branch (i.e. push to the main branch), everything works well: the docker build command is run with the proper value available in ${DOCKER_APP_TAG}.
However, after I create a tag in GitLab (and a release), the build on this tag fails at the docker build ... line, complaining that the Docker tag is not valid:
invalid argument "/:e5dc27fd" for "-t, --tag" flag: invalid reference format
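A quick way to see what actually reaches docker would be to echo the pieces of the tag right before the build (a debugging sketch, not part of the original pipeline):
build-app:
  extends: .config
  stage: build
  script:
    - echo "REGISTRY_URL=${REGISTRY_URL} APP_NAME=${APP_NAME} SHA=${CI_COMMIT_SHORT_SHA}"
    - docker build --target production -t ${DOCKER_APP_TAG} -f ${CI_PROJECT_DIR}/etc/node/Dockerfile .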
The variables ${REGISTRY_URL} and ${APP_NAME} are not expanded. I have checked the GitLab docs, and the only limitation I see is if I were running in a service, which is not the case.
What am I missing to get the variables to expand properly on tag builds as well?
Following the GitLab documentation for !reference tags, even this simplest example doesn't work.
.gitlab-ci.yml:
include:
  - local: shared.yml

this-doesnt-work:
  script:
    - !reference [.test, script]
shared.yml:
.test:
  script:
    - echo from shared
But GitLab doesn't seem to resolve the reference; it tries to execute the literal ".test":
Executing "step_script" stage of the job script
$ .test
/bin/bash: line 106: .test: command not found
Your code works perfectly fine. Which GitLab version are you using? The !reference feature was introduced in GitLab 13.9. Maybe you are running an older version?
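If you are not sure which version the instance runs, one way to check is the version endpoint of the GitLab API (a sketch; the host and token are placeholders):
curl --header "PRIVATE-TOKEN: <your_access_token>" "https://gitlab.example.com/api/v4/version"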
I'm trying to include a file in which I declare some repetitive jobs, and I'm using extends.
I always get this error: did not find expected key while parsing a block
This is the template file:
.deploy_dev:
stage: deploy
  image: nexus
  script:
    - ssh -i ~/.ssh/id_rsa -o "StrictHostKeyChecking=no" sellerbot@sb-dev -p 10290 'sudo systemctl restart mail.service'
  only:
    - dev
This is the main file:
include:
  - project: 'sellerbot/gitlab-ci'
    ref: master
    file: 'deploy.yml'

deploy_dev:
  extends: .deploy_dev
Can anyone help me, please?
It looks like just stage: deploy has to be indented. In cases like this it's a good idea to use the GitLab CI lint tool to check whether the pipeline code is valid, or just a YAML validator. When I checked the section from the template file in a YAML linter, I got:
(<unknown>): mapping values are not allowed in this context at line 3 column 8
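For illustration, the template job with stage: indented to the same level as the other keys, which is what the linter is complaining about:
.deploy_dev:
  stage: deploy
  image: nexus
  script:
    - ssh -i ~/.ssh/id_rsa -o "StrictHostKeyChecking=no" sellerbot@sb-dev -p 10290 'sudo systemctl restart mail.service'
  only:
    - dev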
I am trying to make use of the variables: keyword documented in the GitLab CI documentation here:
FROM: https://docs.gitlab.com/ce/ci/yaml/README.html
variables
This feature requires gitlab-runner with version equal or greater than
0.5.0.
GitLab CI allows you to add to .gitlab-ci.yml variables that are set
in build environment. The variables are stored in repository and are
meant to store non-sensitive project configuration, ie. RAILS_ENV or
DATABASE_URL.
variables:
  DATABASE_URL: "postgres://postgres@postgres/my_database"
These variables can be later used in all executed commands and
scripts.
The YAML-defined variables are also set to all created service
containers, thus allowing to fine tune them.
When I attempt to use it, my builds do not run any stages and are marked successful anyway, a good sign of bad YAML. I pasted my gitlab-ci.yml contents into the LINT tool in the settings area and the output error is:
Status: syntax is incorrect
Error: variables job: unknown parameter PACKAGE_NAME
I'm using the same YAML syntax as the docs, but it will not work. I'm unable to find any open bugs related to this. Below are my current versions and a sanitized version of my gitlab-ci.yml.
Gitlab Version: 7.13.2 Omnibus
Gitlab Runner Version: 0.5.2
gitlab-ci.yml (Sanitized)
types:
  - test
  - build

variables:
  PACKAGE_NAME: "awesome-django-app"
  PACKAGE_SUMMARY: "Awesome webapp backend."
  MAJOR_RELEASE: "1"
  MINOR_RELEASE: "0"
  PATCH_LEVEL: "0dev"
  DEV_DB_URL: "db"
  DEV_SERVER: "pydev.example.com"
  PROD_SERVER: "pyprod.example.com"
  TEST_SERVER: "pytest.example.com"

envtest:
  type: test
  script:
    - ". ./testbuild.sh"
  tags:
    - python2.7
    - postgres
    - linux
  except:
    - tags

buildrpm:
  type: build
  script:
    - mkdir -p ~/rpmbuild/SOURCES
    - mkdir -p ~/rpmbuild/SPECS
    - mkdir -p ~/tarbuild/$PACKAGE_NAME-$MAJOR_RELEASE.$MINOR_RELEASE.$PATCH_LEVEL
    - cp $PACKAGE_NAME.spec ~/rpmbuild/SPECS/.
    - cp -r * ~/tarbuild/$PACKAGE_NAME-$MAJOR_RELEASE.$MINOR_RELEASE.$PATCH_LEVEL/.
    - cd ~/tarbuild
    - tar -zcf ~/rpmbuild/SOURCES/$PACKAGE_NAME-$MAJOR_RELEASE.$MINOR_RELEASE.$PATCH_LEVEL.tar.gz *
    - cd ~
    - rm -Rf ~/tarbuild
    - rpmlint -i ~/rpmbuild/SPECS/$PACKAGE_NAME.spec
    - echo $CI_BUILD_ID
    - 'rpmbuild -ba ~/rpmbuild/SPECS/$PACKAGE_NAME.spec \
      --define="_build_number $CI_BUILD_ID" \
      --define="_python_version_min 2.7" \
      --define="_version $MAJOR_RELEASE.$MINOR_RELEASE.$PATCH_LEVEL" \
      --define="_package_name $PACKAGE_NAME" \
      --define="_summary $SUMMARY"'
    - scp rpmbuild/RPMS/noarch/$PACKAGE_NAME-$MAJOR_RELEASE.$MINOR_RELEASE.$PATCH_LEVEL-$CI_BUILD_ID.noarch.rpm $DEV_SERVER:~/.
  tags:
    - python2.7
    - postgres
    - linux
    - rpm
  except:
    - tags
Question:
How do I use this value properly?
Additional Info:
Removing this section from the YAML file makes everything work, so the rest of the file is in working order. (Of course, undefined variables then lead to script errors...)
Even reducing the variables block down to just PACKAGE_NAME for testing causes the same break.
The original answer is no longer correct.
The original documentation now stands, and there are now more ways as well: variables can be created from the GUI, the API, or by being defined in .gitlab-ci.yml.
https://docs.gitlab.com/ce/ci/variables/README.html
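For illustration, on current GitLab versions YAML-defined variables work the way the question attempts, both globally and per job. A minimal sketch reusing names from the question:
stages:
  - build

variables:
  PACKAGE_NAME: "awesome-django-app"   # available in every job

buildrpm:
  stage: build
  variables:
    PATCH_LEVEL: "0dev"                # job-level variable
  script:
    - echo "Building $PACKAGE_NAME at patch level $PATCH_LEVEL"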
While it is in the documentation, I do not believe that variables were included in the latest version of GitLab (7.13). The functionality to read variables out of the YAML files was brought in by a commit by ayufan 9 days ago.
Looking at the parser on the 7.13 stable branch, you can see that his contribution did not make it in. So, assuming you're on 7.13 or earlier, I'm afraid we are out of luck. Since it is on master, I am fairly certain that we'll see it in the next release. Until then, we can either monkey-patch, do a git pull if you're using the source directly, or just rely on project variables.
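In the meantime, jobs can keep referencing the values with the same $VARIABLE syntax once they are defined as project variables in the GitLab UI instead of in the YAML. For example (a sketch reusing the job from the question):
buildrpm:
  type: build
  script:
    - echo "Building $PACKAGE_NAME $MAJOR_RELEASE.$MINOR_RELEASE.$PATCH_LEVEL"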