How to cache all directories of my repository using Gitlab CI - gitlab-ci

I have a problem: how can I cache multiple directories in GitLab CI? When the GitLab runner downloads the repository, it contains about 10 folders (different projects), and some projects depend on others, so I would like to make them all available to the next jobs.
I thought of doing something like this, without specifying all the folders manually:
cache:
  paths:
    - "./"
Would this work, or do I need something else?
Thanks in advance

If what you want is to cache all the untracked files and directories of the repository, you can use untracked: true:
job_name:
  script: test
  cache:
    untracked: true
You can also combine this with the path keyword like this:
rspec:
  script: test
  cache:
    untracked: true
    paths:
      - your_path/
You can also use wildcards in the paths, like your_path/*.jar, to cache every .jar file in the directory.
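For instance, a sketch of a job combining untracked: true with a wildcard path (the job name and path are illustrative, not from the original question):

```yaml
build_job:
  script: test
  cache:
    untracked: true
    paths:
      # Cache every .jar file directly under your_path/
      - your_path/*.jar
```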
For more info, see the official GitLab documentation: GitLab CI documentation

Related

How to speed up Gitlab CI build of a c/c++ project with caching

I am building a C/C++ project in GitLab CI. The compilation produces 360 .lo files plus the binary (executable) file. It's a slow process, so I want to speed it up, and I thought about caching. How do I do that? I have used caching before for npm and Python modules/packages, but now it's 360 C/C++ object files. How can I set this up in the CI YAML file? All these .lo object files sit together with the source files in the src/ directory.
Caching works the same no matter what objects you're using, so the syntax is the same as for npm or Python. In the job that builds your .lo files, add the cache configuration following the suggestions in the docs (https://docs.gitlab.com/ee/ci/caching/). Then, in any later pipeline for that branch (or however you key the cache), jobs that depend on the .lo files will download them from the cache instead of recreating them.
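As a minimal sketch (the job name, cache key, and make invocation are assumptions, not taken from the question), caching the src/ directory so the .lo object files survive between pipelines could look like:

```yaml
build:
  stage: build
  script:
    - make
  cache:
    # One cache per branch; adjust the key to your invalidation strategy.
    key: $CI_COMMIT_REF_SLUG
    paths:
      - src/
```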
You could do the following:
- Use Ccache.
- Use the GitLab cache to cache the Ccache cache folder.
variables:
  CCACHE_BASEDIR: $CI_PROJECT_DIR
  CCACHE_DIR: $CI_PROJECT_DIR/ccache
before_script:
  - ccache --zero-stats
script:
  - build your files here
after_script:
  - ccache --show-stats
cache:
  - key: ccache-$CI_JOB_NAME
    paths:
      - $CCACHE_DIR
Here is an example .gitlab-ci.yml file using it, and the MR diff where the functionality was added.

In the gitlab-runner, How can I keep the build/?

When I start a new pipeline, I don't want to rebuild the build/ directory from scratch, but the GitLab runner always removes build/. How can I keep it?
Thanks :)
Use a cache: https://docs.gitlab.com/ee/ci/caching/
From the docs:
Use cache for dependencies, like packages you download from the internet. Cache is stored where GitLab Runner is installed and uploaded to S3 if distributed cache is enabled.
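A minimal sketch of such a cache, assuming the build output lands in build/ and you want one cache per branch (the job name and make invocation are illustrative):

```yaml
build:
  script:
    - make
  cache:
    # Keyed per branch so each branch keeps its own build/ directory.
    key: $CI_COMMIT_REF_SLUG
    paths:
      - build/
```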

How to list the modified files?

I am using gitlab-ci to run scripts defined in .gitlab-ci.yml whenever a PR is raised.
I want to get the list of modified files since the last commit.
The use case is to run file-specific integration tests in a large codebases.
If you do not need to know the paths, but simply need to run a specific job only when specific files change, use the only/changes configuration in .gitlab-ci.yml, e.g.
docker build:
  script: docker build -t my-image:$CI_COMMIT_REF_SLUG .
  only:
    changes:
      - Dockerfile
      - docker/scripts/*
Alternatively, if you need the paths of the modified files, you can use GitLab CI's CI_COMMIT_BEFORE_SHA and CI_COMMIT_SHA environment variables, e.g.
> git diff --name-only $CI_COMMIT_BEFORE_SHA $CI_COMMIT_SHA
src/countries/gb/homemcr.js
src/countries/gb/kinodigital.js
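One caveat: on the first push of a new branch, CI_COMMIT_BEFORE_SHA is all zeros, so there is nothing to diff against. A sketch that falls back to the default branch in that case (the job name is illustrative, not from the answer):

```yaml
changed_files:
  script:
    - |
      # CI_COMMIT_BEFORE_SHA is all zeros on the first push of a branch;
      # fall back to diffing against the default branch in that case.
      if [ "$CI_COMMIT_BEFORE_SHA" = "0000000000000000000000000000000000000000" ]; then
        BASE="origin/$CI_DEFAULT_BRANCH"
      else
        BASE="$CI_COMMIT_BEFORE_SHA"
      fi
      git diff --name-only "$BASE" "$CI_COMMIT_SHA"
```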
A common use case that some people will find useful: run the job when a merge request is raised, and specifically run lint on all the files that changed with respect to the target branch.
As one of the answers suggests, we can get the target branch name through the CI_MERGE_REQUEST_TARGET_BRANCH_NAME variable (see the list of predefined variables). We can use git diff --name-only origin/$CI_MERGE_REQUEST_TARGET_BRANCH_NAME to get the list of changed files and then pass them to the linter via xargs. The configuration may look like this:
code_quality:
  only:
    - merge_requests
  script:
    - git diff --name-only origin/$CI_MERGE_REQUEST_TARGET_BRANCH_NAME | xargs <LINT_COMMAND_FOR_FILE>

How to transfer a value from one build container to another in drone.io CI pipeline

I know I can write it to the mounted host file system which will be shared amongst the multiple build containers. But how can I make use of that file in a drone plugin container like docker-plugin?
Or, is there any other way to pass arbitrary data between build steps? Maybe through environment variables?
This is drone 0.5
It is only possible to share information between build steps via the filesystem. Environment variables are not an option because there is no clean way to share environment variables between sibling Unix processes.
It is the responsibility of the plugin to decide how it wants to accept configuration parameters. Usually parameters are passed to the plugin as environment variables defined in the YAML configuration file. Some plugins, notably the docker plugin [1], can also read parameters from a file. For example, the docker plugin will read docker tags from a .tags file in the root of your repository, which can be generated on the fly:
pipeline:
  build:
    image: golang
    commands:
      - go build
      - echo ${DRONE_COMMIT:0:8} > .tags
  publish:
    image: plugins/docker
    repo: octocat/hello-world
Not all plugins provide the option to read parameters from a file; it is up to the plugin author to include this capability. If the plugin does not have it, and it is not something the author plans to implement, you can always fork the plugin and adjust it to your exact needs.
[1] https://github.com/drone-plugins/drone-docker

GitLab CI use untracked files in repository

I'm evaluating GitLab CI/CD at the moment and trying to get the pipelines working, but I'm running into a simple(?) problem.
I'm trying to automate the build process by automatically updating ISO images. However, these ISO images are not tracked by the repository and are ignored via a .gitignore file. This leads to the issue that when I try to run make, it can't find the ISO images...
I'm just using a simple .gitlab-ci.yml file:
stages:
  - build

build:
  stage: build
  script:
    - make
When I run this in the GitLab CI interface, it clones the repository (without the ISO images) and then fails, as there is no rule for that make target (because the ISO images are missing). I have tried moving the files into the "build" directory which GitLab creates, but that gives an error saying it has failed to remove ...
How do I use the local repository rather than having GitLab clone the repository to a "build" location?
You can't use GitLab CI with files that exist only on your computer, or at least you shouldn't. You could use an SSH executor that logs in to a machine storing the ISO files (and that should be a server rather than your development machine), or have the Docker executor pull them from FTP or an object store. For testing purposes you can run the builds locally on your machine.
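If the ISO images are downloadable from somewhere the runner can reach, one option is to fetch them in a before_script step so make finds them. A sketch, assuming a hypothetical URL and target path (neither appears in the original question):

```yaml
build:
  stage: build
  before_script:
    # Hypothetical location; replace with wherever your ISOs are hosted.
    - mkdir -p images
    - curl -fSL -o images/base.iso "https://example.com/artifacts/base.iso"
  script:
    - make
```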