Gitlab-CI custom variable

I declare a custom variable in gitlab-ci.yml like so:
variables:
  APP_NAME: moodleadmin
Then I try to use it in a script:
script:
  - ssh root@devsb01 'service $APP_NAME stop'
But it's not replaced; here is the CI log:
$ ssh root@devsb01 'service $APP_NAME stop'
Which leads to the following error:
stop: unrecognized service
What is the correct way to use the variable?

You don't say which image you are using, but I assume it provides a default bash shell, where everything inside single quotes is preserved literally, without exception.
You have to use double quotes:
script:
  - ssh root@devsb01 "service $APP_NAME stop"
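To see the difference locally (a quick illustration, not from the original answer):
APP_NAME=moodleadmin
echo 'service $APP_NAME stop'   # prints: service $APP_NAME stop
echo "service $APP_NAME stop"   # prints: service moodleadmin stop
With double quotes the local shell expands $APP_NAME before the string is handed to ssh, so the remote side receives the literal service name.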

Related

Drone CI - How to set pipeline env var to result of CLI output

I recognize that within a pipeline step I can run a simple export, like:
commands:
  - export MY_ENV_VAR=$(my-command)
...but if I want to use this env var throughout the whole pipeline, is it possible to do something like this:
environment:
  MY_ENV_VAR: $(my-command)
When I do this, I get yaml: unmarshal errors: line 23: cannot unmarshal !!seq into map[string]*yaml.Variable, which suggests this isn't possible. My end goal is to write a Drone plugin that accepts the output of $(...) as one of its settings. I'd prefer the plugin not run the command itself, but just use the output.
I've also attempted to use step dependencies to export an env var, however its state doesn't carry over between steps:
- name: export
  image: bash
  commands:
    - export MY_VAR=$(my-command)
- name: echo
  image: bash
  depends_on:
    - export
  commands:
    - echo $MY_VAR # empty
Writing the command output to a script file might be a better way to do what you want, since filesystem changes are persisted between individual steps.
---
kind: pipeline
type: docker
steps:
  - name: generate-script
    image: bash
    commands:
      # - my-command > plugin-script.sh
      - printf "echo Fetching Google;\n\ncurl -I https://google.com/" > plugin-script.sh
  - name: test-script-1
    image: curlimages/curl
    commands:
      - sh plugin-script.sh
  - name: test-script-2
    image: curlimages/curl
    commands:
      - sh plugin-script.sh
From Drone's Docker pipeline documentation:
Workspace
Drone automatically creates a temporary volume, known as your workspace, where it clones your repository. The workspace is the current working directory for each step in your pipeline.
Because the workspace is a volume, filesystem changes are persisted between pipeline steps. In other words, individual steps can communicate and share state using the filesystem.
⚠ Workspace volumes are ephemeral. They are created when the pipeline starts and destroyed after the pipeline completes.
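Applied to the question's own two-step example, the file-based approach would look roughly like this (my-command stands for whatever produces the value, as in the question):
- name: export
  image: bash
  commands:
    - my-command > .my_var        # write the value into the shared workspace
- name: echo
  image: bash
  depends_on:
    - export
  commands:
    - echo "$(cat .my_var)"       # read it back; the file persists in the workspace volume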
If you can't execute a command in the environment block at all, maybe you can define a "command string" in the environment block instead, like:
environment:
  MY_ENV_VAR: 'echo "this is command to execute"' # note the single quotes
then in the commands block:
commands:
  - eval $MY_ENV_VAR
Worth a try.
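A minimal sketch of that idea as a complete step (the step name and command string are made up):
- name: run-command-from-env
  image: bash
  environment:
    MY_ENV_VAR: 'echo "this is command to execute"'
  commands:
    - eval "$MY_ENV_VAR" # re-parses the string and runs it as a command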

Variable inside variable in GitLab CI

Is there a way to use a predefined variable inside a custom variable in GitLab CI, like this:
before_script:
  - cat "${$CI_COMMIT_REF_NAME}" >> .env
to extract the name of the branch from $CI_COMMIT_REF_NAME and use it as the name of a custom variable?
Update:
Check out GitLab 14.3 (September 2021)
Use variables in other variables
CI/CD pipeline execution scenarios can depend on expanding variables declared in a pipeline or using GitLab predefined variables within another variable declaration.
In 14.3, we are enabling the “variables inside other variables” feature on GitLab SaaS.
Now you can define a variable and use it in another variable definition within the same pipeline.
You can also use GitLab predefined variables inside of another variable declaration.
This feature simplifies your pipeline definition and eliminates pipeline management issues caused by the duplicating of variable data.
Note - for GitLab self-managed users the feature is disabled by default.
To use this feature, your GitLab administrator will need to enable the feature flag.
(demo -- video)
See Documentation and Issue.
dba asks in the comments:
Does this include or exclude using globally defined variables?
dba's own answer:
Global variables can be reused, but they need the local_var: ${global_var} syntax with recursive expansion (independent of the shell).
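A minimal sketch of that syntax (variable names are hypothetical):
variables:
  global_var: "production"
deploy:
  variables:
    local_var: ${global_var} # expanded by GitLab itself, independent of the shell
  script:
    - echo $local_var        # prints: production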
Check if this matches gitlab-org/gitlab-runner issue 1809:
Description
In the .gitlab-ci.yml file, a user can define a variable and use it in another variable definition within the same .gitlab-ci.yml file.
A user can also use a GitLab pre-defined variable in a variable declaration.
Example
variables:
  variable_1: "foo" # here, variable_1 is assigned the value foo
  variable_2: "${variable_1}" # variable_2 is assigned the value of variable_1.
  # The expectation is that the value in variable_2 = value set for variable_1
If it is, it should be completed/implemented for GitLab 14.1 (July 2021)
Lots of options, but you could just pass the predefined var into the .env:
image: busybox:latest
variables:
  MY_CUSTOM_VARIABLE: $CI_JOB_STAGE
  ANIMAL_TESTING: "cats"
before_script:
  - echo "Before script section"
  - echo $CI_JOB_STAGE
  - echo $MY_CUSTOM_VARIABLE
  - echo $MY_CUSTOM_VARIABLE >> .env
  - echo $CI_COMMIT_BRANCH >> .env
  - cat .env
Example pipeline output:
$ echo "Before script section"
Before script section
$ echo $CI_JOB_STAGE
build
$ echo $MY_CUSTOM_VARIABLE
build
$ echo $MY_CUSTOM_VARIABLE >> .env
$ echo $CI_COMMIT_BRANCH >> .env
$ cat .env
build
exper/ci-var-into-env
$ echo "Do your build here"
Do your build here
Or pass it in earlier:
image: busybox:latest
variables:
  MY_CUSTOM_VARIABLE: "${CI_JOB_STAGE}"
  ANIMAL_TESTING: "cats"
before_script:
  - echo "Before script section"
  - echo $CI_JOB_STAGE
  - echo $MY_CUSTOM_VARIABLE
  - echo $MY_CUSTOM_VARIABLE >> .env
  - cat .env
example: https://gitlab.com/codeangler/make-ci-var-custom-var-in-script/-/blob/master/.gitlab-ci.yml

BitBucket deployment using SSH keys to remote server

I am trying to write a YAML pipeline script to deploy files that have been altered in my Bitbucket repository to my remote server using SSH keys. The document I have in place at the moment was copied from Bitbucket itself and has errors:
pipelines:
  default:
    - step:
        name: Deploy to test
        deployment: test
        script:
          - pipe: atlassian/sftp-deploy:0.3.1
          - variables:
              USER: $USER
              SERVER: $SERVER
              REMOTE_PATH: $REMOTE_PATH
              LOCAL_PATH: $LOCAL_PATH
I am getting the following error:
Configuration error
There is an error in your bitbucket-pipelines.yml at [pipelines > default > 0 > step > script > 1]. To be precise: Missing or empty command string. Each item in this list should either be a single command string or a map defining a pipe invocation.
My SSH public and private keys are set up in Bitbucket along with the fingerprint and host. The variables have also been set up.
How do I go about setting up my YAML deploy script to connect to my remote server via ssh and transfer the files?
Try updating the variables section to become:
- variables:
    - USER: $USER
    - SERVER: $SERVER
    - REMOTE_PATH: $REMOTE_PATH
    - LOCAL_PATH: $LOCAL_PATH
Here is an example of how to set variables: https://confluence.atlassian.com/bitbucket/configure-bitbucket-pipelines-yml-792298910.html#Configurebitbucket-pipelines.yml-ci_variablesvariables
Your - step directive has to be indented.
I have a bitbucket-pipelines.yml like this (using rsync instead of ssh):
# This is a sample build configuration for PHP.
# Check our guides at https://confluence.atlassian.com/x/e8YWN for more examples.
# Only use spaces to indent your .yml configuration.
# -----
# You can specify a custom docker image from Docker Hub as your build environment.
image: php:7.2.1-fpm
pipelines:
  default:
    - step:
        script:
          - apt-get update
          - apt-get install zip -y
          - apt-get install unzip -y
          - apt-get install libgmp3-dev -y
          - curl -sS https://getcomposer.org/installer | php -- --install-dir=/usr/local/bin --filename=composer
          - composer install
          - cp .env.example .env
          #- vendor/bin/phpunit
          - pipe: atlassian/rsync-deploy:0.2.0
            variables:
              USER: $DEPLOY_USER
              SERVER: $DEPLOY_SERVER
              REMOTE_PATH: $DEPLOY_PATH
              LOCAL_PATH: '.'
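Note the structural detail that resolves the question's error: here variables: is nested under the pipe: entry, so both belong to the same list item in script. The question's snippet made - variables: a separate list item, which Bitbucket then tries (and fails) to parse as its own command or pipe invocation, hence "Missing or empty command string".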
I suggest using their online editor in the repository for editing bitbucket-pipelines.yml; it checks the whole formal YAML structure and you can't commit an invalid file.
Even if you check the file in some other YAML editor, it may look fine, but not necessarily according to the Bitbucket specification. Their online editor does a fine job.
Also, I suggest visiting their community at Atlassian Community, as it's very active; sometimes their staff members provide answers.
However, I struggle with the many dependencies needed to run tests properly (the actual bitbucket-pipelines.yml keeps getting bigger and bigger).
Maybe there is some nicely prepared Docker image for this job.

Drone replacing environment variables with empty strings

Something happened with my Drone configuration: since today, it's not finding the environment variables. Until a few days ago I could run a pipeline, but today I can't.
This is the step in the pipeline:
pipeline:
  [...]
  sdk:
    image: mycompany/swagger-codegen:latest
    environment:
      - API_SWAGGER_JSON_URL=http://api.mycompany.biz:9000/v1/swagger.json
      - API_PACKAGE=com.mycompany.api
      - API_GROUP_ID=com.mycompany.api
      - API_ARTIFACT_ID=sdk
      - API_VERSION=0.1-SNAPSHOT
    when:
      branch: master
    commands:
      - java -jar /usr/lib/swagger/swagger-codegen-cli.jar generate
        -i ${API_SWAGGER_JSON_URL}
        --api-package ${API_PACKAGE}
        --invoker-package ${API_PACKAGE}.client
        --model-package ${API_PACKAGE}.client.model
        --group-id ${API_GROUP_ID}
        --artifact-id ${API_ARTIFACT_ID}
        --artifact-version ${API_VERSION}
        -l java
        -o ./swagger-codegen-source
      - etc.
And this is what I get
+ java -jar /usr/lib/swagger/swagger-codegen-cli.jar generate -i --api-package --invoker-package .client --model-package .client.model --group-id --artifact-id --artifact-version -l java -o ./swagger-codegen-source
Exception in thread "main" io.airlift.airline.ParseArgumentsUnexpectedException: Found unexpected parameters: [java]
at io.airlift.airline.Cli.validate(Cli.java:148)
at io.airlift.airline.Cli.parse(Cli.java:116)
at io.airlift.airline.Cli.parse(Cli.java:97)
at io.swagger.codegen.SwaggerCodegen.main(SwaggerCodegen.java:36)
Look at the command. Every environment variable was substituted by an empty string. Am I doing something wrong?
You should use $variable or $${variable} instead of ${variable}.
This is because Drone interpolates runtime variables [1] into the YAML using ${variable} syntax. This behavior is similar to docker-compose, which Drone uses as a baseline for functionality and syntax.
[1] http://docs.drone.io/environment/
[2] http://docs.drone.io/secrets-not-working/#variable-expansion
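Applied to the question's commands block, that would look roughly like this (showing only some of the flags; the remaining --group-id/--artifact-id/--artifact-version flags follow the same pattern):
commands:
  - java -jar /usr/lib/swagger/swagger-codegen-cli.jar generate
    -i $${API_SWAGGER_JSON_URL}
    --api-package $${API_PACKAGE}
    -l java
    -o ./swagger-codegen-source
Drone rewrites $${...} to ${...} before the step runs, so the shell, not Drone, performs the expansion.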

How can I set a local variable in ssh?

I would like to set a local variable in an ssh command chain that is only used in that environment:
#!/bin/sh
my_var='/tmp/wrong_file'
ssh user@server "my_var='/tmp/a_file'; cat $my_var;my_var=123;echo $my_var"
echo $my_var
In this example the "outer" $my_var is used. How do I fix this and use variables "in" the current ssh connection as if locally defined? There is no need to change or access the external value '/tmp/wrong_file' in $my_var, as asked in Assign directory listing to variable in bash script over ssh.
You're using the wrong quotes. Parameter expansion is performed inside double quotes, but not inside single quotes.
#!/bin/sh
my_var=/tmp/wrong_file
ssh user@server 'my_var=/tmp/a_file; cat $my_var;my_var=123;echo $my_var'
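If you need both, you can concatenate quoting styles so that some variables expand locally and others remotely (a minimal sketch; the paths are made up):
local_file=/tmp/local_file
ssh user@server "cat $local_file; "'my_var=/tmp/a_file; cat $my_var'
The double-quoted part is expanded by your local shell; the single-quoted part by the remote one.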
First of all: The SSH shell and your local shell are completely different and do not exchange any environment variables. This is a good thing - consider environment variables such as LD_LIBRARY_PATH when using SSH between machines of different OS architecture.
IMHO the best solution for your problem is to encapsulate your commands into a shell script on the remote side, then maybe start it with parameters. E.g.:
Remote:
myscript.sh contains:
#!/bin/sh
MY_FILE="$1";
echo "Contents of $MY_FILE:"
cat "$MY_FILE"
Local:
Run something like:
export REMOTE_FILE='/path/to/it'
ssh user@server "/path/to/myscript.sh '$REMOTE_FILE'"