Jenkins variable not working with sed command in pipeline

The sed command is giving me issues with incorporating the $tag variable, which is equal to "latest${GIT_COMMIT:0:7}". Here is the sed command:
sh "sed -i 's/{BUILD_NUMBER}/$tag/' /var/lib/jenkins/workspace/${JOB_NAME}/em-api/dev-nics-emapi-svc-param.json"
I obviously want to put the commit information into my .json file, but it doesn't pull the actual commit SHA. When I take a look at the .json file, it has inserted the literal definition of the variable, which is "latest${GIT_COMMIT:0:4}". I am trying to do this in a declarative pipeline on my Jenkins server running on Linux.
I would like it to insert "latestxxxx". Any suggestions on how I can get around this?

GIT_COMMIT is an environment variable available to you; tag is a Groovy variable you have set to 'latest${GIT_COMMIT:0:4}'. Because the sh command is in double quotes ("), Groovy interpolates $tag, but since tag was assigned with single quotes it still holds the literal text latest${GIT_COMMIT:0:4}. The sed expression itself is in single quotes ('), so the shell does not expand ${GIT_COMMIT:0:4} either. So you basically have two options:
Use double quotes (") for the sed expression, if you feel safe about the content that gets replaced (you can use triple quotes """ for the whole command so you don't have to escape the " for Groovy)
Resolve the variable from the environment yourself in Groovy (e.g. something like System.env['GIT_COMMIT'].substring(0,4))
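For illustration, a minimal sketch of the second option (not your exact Jenkinsfile; it assumes the checkout step has populated env.GIT_COMMIT, which is the usual pipeline accessor, and picks a 7-character prefix):
script {
    // Resolve the commit in Groovy, so the sed expression can stay single-quoted
    // and nothing is left for the shell to expand. Adjust the prefix length to taste.
    def tag = "latest${env.GIT_COMMIT.substring(0, 7)}"
    sh "sed -i 's/{BUILD_NUMBER}/${tag}/' /var/lib/jenkins/workspace/${JOB_NAME}/em-api/dev-nics-emapi-svc-param.json"
}
With the first option you would instead keep tag literal and double-quote the sed expression, but note that the ${GIT_COMMIT:0:7} substring syntax is a bashism, so it only expands if the sh step actually runs bash.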

Related

Using sed over ssh to add item to list

I need to modify a Python file on a remote server, and I'm stuck formatting a sed command inside an ssh.
The file to be modified has this line
my_list = ["item1"]
and I need to change it to include another item:
my_list = ["item1", "item2"]
Here's what I have:
ssh user@host 'sed -i \'s/my_list = \[\\"item1\\"]/my_list = \[\\"item1\\", \\"item2\\"]/\' path/to/file'
The number of escapes required for quotes and open brackets is throwing me off since it's within an ssh.
I'd appreciate a hand if anyone can help!
You can't nest single quotes, and you can't escape single quotes inside single quotes. The simplest solution by far in this particular case is to just quote less; there is nothing in sed or -i which requires quoting. But because both your local shell and the remote shell process the command line, you still need two layers of quoting around the sed expression.
ssh user@host sed -i "'s/my_list = \\[\"item1\"]/my_list = [\"item1\", \"item2\"]/'" path/to/file
Perhaps notice also that the replacement string is just a string, so there is no need to escape the [ there.
For debugging these things, try
ssh user@host printf '%s\\n' sed -i "'s/my_list = \\[\"item1\"]/my_list = [\"item1\", \"item2\"]/'" path/to/file
to see the command line split up into one token per line on the remote host.
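If the nested quoting still feels fragile, a hedged alternative (my sketch, not part of the original answer; it assumes bash on the local side) is to build the sed expression in a variable and let printf %q add the one layer of quoting the remote shell will strip back off:
# user@host and path/to/file are the question's placeholders
expr='s/my_list = \["item1"]/my_list = ["item1", "item2"]/'
ssh user@host "sed -i $(printf '%q' "$expr") path/to/file"
Whatever characters the expression contains, the %q round trip guarantees the remote shell reconstructs it as a single word.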
Fundamentally, you should probably change the remote Python script to read its input in a standard format like JSON or YAML. Programs which write programs are a powerful tool, but unsophisticated programs which modify existing programs are often going to end up brittle and hard to debug.

How to use a KSH variable in a string and have Robot Framework not interpret it as a Robot variable

I tried to use Robot to run the following KSH operation, which removes the ".auto" suffix from files in a directory:
Write    for file in *.auto; do mv $file ${file%.*}; done
The ${file%.*} is KSH syntax, but Robot always treats it as a Robot variable and gives the error message: "Resolving variable '${file%.*}' failed: Variable '${file}' not found."
Is there any way to tell Robot that the ${file%.*} is not a Robot variable?
If a string contains something the framework may interpret as inline variable usage, escape it with the \ character.
In your case, put it in front of the ${:
Write    for file in *.auto; do mv $file \${file%.*}; done

snakemake: correct quoting when using singularity

I want to run the following shell command
shell:
"""
Rscript -e "rmarkdown::render('{input.markdown}', output_dir = 'output/{wildcards.version}', params = list(datapath = '../data/{wildcards.version}', max_lab_days = {config[max_lab_days]}, seed = {config[seed]}))"
"""
Everything is fine in normal mode but breaks down when setting --use-singularity. I guess this is some quoting-related issue, since singularity exec adds another layer of quotes here, right?
So, I guess my question is how to avoid this quotation hell - any ideas?
Okay, it turns out the single quotes (') are the problem: never use them in a Snakemake shell command or it will not be portable to Singularity execution. Fortunately, for the Rscript -e command you can avoid them by replacing each ' with an escaped double quote, \".
Is that really necessary?
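A sketch of the rule rewritten along those lines (my illustration, not the poster's exact Snakefile; the r prefix is an addition so that Python itself does not swallow the backslashes before the shell sees the \" escapes):
shell:
    r"""
    Rscript -e "rmarkdown::render(\"{input.markdown}\", output_dir = \"output/{wildcards.version}\", params = list(datapath = \"../data/{wildcards.version}\", max_lab_days = {config[max_lab_days]}, seed = {config[seed]}))"
    """
Whether the single quotes are truly forbidden depends on how your Snakemake version wraps the command for singularity exec, so treat this as a workaround sketch rather than a hard rule.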

How do I escape a "$" in bitbake/yocto?

One of my recipes in Yocto needs to create a file containing a very specific line, something like:
${libdir}/something
To do this, I have the recipe task:
do_install() {
echo '${libdir}/something' >/path/to/my/file
}
Keeping in mind that I want that string exactly as shown, I can't figure out how to escape it to prevent bitbake from substituting in its own value of libdir.
I originally thought the echo command with single quotes would do the trick (as it does in the bash shell) but bitbake must be interpreting the line before passing it to the shell. I've also tried escaping it both with $$ and \$ to no avail.
I can find nothing in the bitbake doco about preventing variable expansion, just stuff to do with immediate, deferred and Python expansions.
What do I need to do to get that string into the file as is?
BitBake makes it surprisingly hard to prevent expansion from taking place: regardless of whether you use single or double quotes, the variables will be expanded before the line is passed to the shell.
Hence, if you want them not to be expanded, you need to effectively hide them from BitBake, which can be done with something like:
echo -e '\x24{libdir}/something' >/path/to/my/file
This uses the hexadecimal version of $ so that BitBake does not recognise it as a variable to be expanded.
You do need to ensure you're running the correct echo command however. Under some distros (like Ubuntu), it might run the sh-internal echo which does not recognise the -e option. In order to get around that, you may have to run the variant of echo that lives on the file system (and that does recognise that option):
/bin/echo -e '\x24{libdir}/something' >/path/to/my/file
By default this task will be executed as a shell function via /bin/sh, but what that actually is depends on your system, since /bin/sh may be a symlink pointing to bash. The BitBake manual warns against relying on bashisms, though.
You can consider just adding this task to your recipe as a Python function instead:
python do_install () {
    with open('/path/to/your/file', 'a') as file:
        file.write('${libdir}/something')
}
'a' stands for append.
This should eliminate the problem with variable expansion.
There is no standard way to escape these sorts of expressions that I am aware of, other than to break up the expression; accordingly, this should work:
do_install() {
echo '$''{libdir}/something' >/path/to/my/file
}
The best solution is simply this:
bitbake_function() {
command $libdir/whatever
}
Bitbake will only expand ${libdir}; $libdir is passed through verbatim.
We don't have to worry about dollar signs that are not followed by {, and in this case, there is no need for libdir to be wrapped in braces.
The only time we run into a problem with just $foo is if we have something like ${foo}bar, where the braces are required as delimiters so that bar isn't included in the variable name. In that situation there are other solutions, such as generating the shell syntax "$foo"bar. This is less cryptic than resorting to \x24.
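To make that concrete, here is a small sketch (the name foo, the 64 suffix, and the output path are made up for illustration):
do_install() {
    foo=/usr/lib
    # BitBake leaves the brace-less $foo alone; the quotes stop the shell
    # from reading foo64 as the variable name.
    echo "$foo"64 > /path/to/my/file
}
At run time the file contains /usr/lib64, exactly as if ${foo}64 had been written in plain shell.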
If you need to use $ in a variable assignment, remember that BitBake won't evaluate $whatever for you, so you have to escape it for the underlying shell.
For instance I set gcc/ld Rpath option to use $ORIGIN keyword this way:
TARGET_LDFLAGS_append = " -Wl,-rpath-link=\\$$ORIGIN"
https://lists.yoctoproject.org/pipermail/yocto/2017-September/037820.html
You can define a variable to be a literal dollar sign.
DOLLAR = "$"
do_install() {
echo '${DOLLAR}{libdir}/something' >/path/to/my/file
}
no extra quoting required.

Using an environment variable in a PSQL script

Is it possible to use a Linux environment variable inside a .sql file? I'm using the copy/select query to write to an output file, and I'd like to put that directory in a variable. So I want to do something like:
COPY (SELECT * FROM a)
TO $outputdir/a.csv
Outputdir would be set in my environment. Is this possible?
You can store the result of a shell command inside a psql variable like this:
\set afile `echo "$outputdir/a.csv"`
COPY (SELECT * FROM a) TO :'afile';
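Note that the backtick command runs in a shell spawned by psql, so this only works if outputdir is actually present in psql's environment; a minimal sketch of the calling side (the database name and paths are placeholders):
export outputdir=/tmp/reports
psql -d mydb -f yourscript.sql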
Another (better in my opinion) solution is to use only psql variables; see this answer of mine about psql variables, which is similar to your example. An example for your case would be:
\set outputdir '/path/to/output'
\set afile :outputdir '/a.csv'
COPY (SELECT * FROM a) TO :'afile';
Note that, in the example, you need to set the variable inside the script file, but you can skip the first line if you set it when you call psql:
psql --set=outputdir="$outputdir" <conn parameters> -f /path/to/yourscript.sql
This appears to work for your use case, provided you single-quote the output file name as I mentioned. It will also escape any double quotes contained within the SQL.
psql -c "$(eval echo '"' $(sed 's/"/\\"/g' envvars.sql) '"')"
Of course, note that if your file contains any other dollar signs (for example in dollar-quoted strings), the shell is going to try to interpret them as variables and your script will break, so you will need to backslash-escape any dollar signs you need preserved literally.
See also the second snippet in the accepted answer to this question for a possibly more robust answer.
The accepted answer is correct for PostgreSQL running on Unix. Under Windows a different incantation is required for obtaining the value of the environment variable from the CMD shell and for avoiding the carriage return returned by the echo command.
\set afile `set /p=%outputdir%/a.csv`
COPY (SELECT * FROM a) TO :'afile';