gitlab-ci include *.gitlab-ci.yml does not exist

When using a wildcard in the include path for YAML configuration files, an error is reported indicating that the file does not exist:
include:
- local: .gitlab/staging/*.gitlab-ci.yml
Find the error in your .gitlab-ci.yml :
Local file .gitlab/staging/*.gitlab-ci.yml does not exist!
However, including a single file works fine:
include:
- local: .gitlab/staging/oa.gitlab-ci.yml

Use quotes around the path string, otherwise it will not be interpreted properly.
# This matches all `.yml` files in `configs` and any subfolder in it.
include: 'configs/**.yml'
# This matches all `.yml` files only in subfolders of `configs`.
include: 'configs/**/*.yml'
See Use include:local with wildcard file paths in the GitLab documentation.
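Applied to the include from the question, the quoted wildcard path would look roughly like this (a sketch; it assumes your GitLab version supports wildcards in include:local):
include:
- local: '.gitlab/staging/*.gitlab-ci.yml'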

Related

How to define a Bamboo artifact so it's NOT published in a sub folder

The artifact definition and the file structure in the Bitbucket repository are as below.
The build creates application-dev.properties and safeguard-dev.properties under the classes folder. When I click the artifact, it takes me to the classes folder, inside which the property files are present. I want them to be published directly, like the jar, so that clicking the file downloads it. But if I give the full path, it errors out. How should I define this?
error 19-Dec-2022 16:34:57 Failing as no matching files has been found and empty artifacts are not allowed.
error 19-Dec-2022 16:34:57 Unable to publish artifact [application-dev.properties]:
error 19-Dec-2022 16:34:57 The artifact is required, build will now fail.
Bamboo won't put the artifacts in a subfolder if the "Location" points to exactly where the files are, i.e. don't use the "**" wildcard in the "Copy pattern". I'd even suggest a separate artifact per properties file.
Your location is pointing to "workdir/src", but you need "target/classes". Get rid of $bamboo.build.working.directory: that's not the Maven build dir and you don't need it at all (the current working directory is already set to your project dir). If I get the paths right, this should work:
Name: application-dev.properties
Location: target/classes
Copy pattern: application-dev.properties
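Following the suggestion to have one artifact per file, the second properties file from the question could get its own definition in the same way (a sketch, assuming it is also produced in target/classes):
Name: safeguard-dev.properties
Location: target/classes
Copy pattern: safeguard-dev.properties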

How to use include directives with specific config files in snakemake?

How can we define specific config files for includes in snakemake workflows?
I have a pipeline with the following structure.
The main entrypoint of the pipeline:
workflow/Snakefile
Other smk files:
workflow/rules/Snakefile_0.smk
workflow/rules/Snakefile_1.smk
workflow/rules/Snakefile_2.smk
Inside the Snakefile, I include them using:
include: "rules/Snakefile_0.smk"
include: "rules/Snakefile_1.smk"
include: "rules/Snakefile_2.smk"
I also access and use the config file within these smk files. How can I specify a separate config file for each smk file I include?
Snakemake does not support different config files for different rules or rule files within the same main Snakefile: all rules pulled in via include: *.smk share the same configfile and config dict.
If you want to use a different config file for each .smk file, you can use the following workaround:
Instead of using the include directive, import the rules of each .smk file as a module. You can specify a different config for each imported module.
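A minimal sketch of that workaround, assuming Snakemake 6.0 or later (which introduced the module directive); the config file path and module name below are hypothetical:
# In the main Snakefile: load a separate config dict for this module
# (config/config_0.yaml is a hypothetical per-module config file)
import yaml

with open("config/config_0.yaml") as fh:
    config_0 = yaml.safe_load(fh)

# Import the rules of the .smk file as a module with its own config
module rules_0:
    snakefile: "rules/Snakefile_0.smk"
    config: config_0

# Bring the module's rules into the main workflow, prefixed to avoid name clashes
use rule * from rules_0 as rules_0_*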

Working directory when using include in snakemake for rules that use the report() function

I am using snakemake to program my workflows. In order to reuse code, the simplest way is to use the statement
include: "path/to/other/Snakefile"
This works fine for most cases but fails when creating the reports via the report() function. The problem is that it does not find the .rst file that is specified for the caption.
Thus it seems that report() resolves the caption relative to the directory in which the included Snakefile is located, not the directory of the main Snakefile.
Is there a flexible workaround for this, so that the included file behaves as if it had simply been loaded into and executed from the main Snakefile?
This is an example rule in another Snakemake file:
rule evaluation:
    input:
        "data/final_feature.model"
    output:
        report("data/results.txt", caption="report/evaluation.rst", category="Evaluation")
    shell:
        "Rscript {scripts}/evaluation.R {input}"
This is included in the main Snakefile via:
include: "../General/subworkflows/evaluation.snakemake"
This is the error message showing that the file is not present:
WorkflowError:
Error loading caption file of output marked for report.
FileNotFoundError: [Errno 2] No such file or directory: '.../workflows/General/subworkflows/report/evaluation.rst'
Thank you for any help in advance!
One option may be to expand relative paths to absolute paths using os.path.abspath(). If the paths are relative to the directory where the Snakefile is, you may instead need to use workflow.basedir, which contains the directory of the Snakefile. For example:
caption= os.path.join(workflow.basedir, "report/evaluation.rst")
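Applied to the rule from the question, the included file would then look roughly like this (a sketch; os must be imported, and {scripts} is taken from the original rule):
import os

rule evaluation:
    input:
        "data/final_feature.model"
    output:
        # Absolute caption path, so it no longer depends on which directory
        # relative paths are resolved against when the file is included
        report("data/results.txt",
               caption=os.path.join(workflow.basedir, "report/evaluation.rst"),
               category="Evaluation")
    shell:
        "Rscript {scripts}/evaluation.R {input}"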

Is there a way to get the config file name specified by --configfile?

I need to get the config file name (along with its path, if any) specified by --configfile on the command line, from within the Snakefile. Is there a way to get it without scanning sys.argv? Thanks.

Compiling haml from a different directory

I use this to launch my haml compile:
/install-location/haml /myproject/index.haml /myproject/index.html
It runs fine when I'm in the directory with the haml file but when I change to a different directory I get:
Exception on line 3: No such file or directory # rb_sysopen - assets/page/structure/_head.haml
Use --trace for backtrace.
What can I do to fix this?
The path you are trying to read, assets/page/structure/_head.haml, is relative to the working directory, not to the source file's directory. When you're in the same directory it works because the two are the same.
To be able to run the code from a different directory you need to use absolute paths. You can convert the relative path to an absolute one with File::expand_path, File::dirname and __FILE__:
= Haml::Engine.new(File.read(File.expand_path 'assets/page/structure/_head.haml', File.dirname(__FILE__))).render