I'm evaluating GitLab CI/CD at the moment and trying to get the pipelines working, but I'm running into a simple(?) problem.
I'm trying to automate the build process by automatically updating ISO images. However, these ISO images are not tracked by the repository and are ignored via a .gitignore file. This leads to the issue that when I try to run make, it can't find the ISO images...
I'm just using a simple .gitlab-ci.yml file:
stages:
  - build

build:
  stage: build
  script:
    - make
And when I run this in the GitLab CI interface, it clones the repository (without the ISO images) and then fails, because there is no rule to make that target (the ISO images are missing). I have tried moving the files into the "build" directory which GitLab creates, however that gives an error saying it has failed to remove ...
How do I use the local repository rather than having GitLab clone the repository to a "build" location?
You can't use GitLab CI with the files that are on your computer, or at least you shouldn't. You could use an SSH executor that logs in to a machine that stores the ISO files (and it should be a server rather than your development machine), or you could have the Docker executor pull them from FTP or an object store. For testing purposes you can run the builds locally on your machine.
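For the fetch-from-storage approach, a minimal sketch might look like the following. The URL, file name, and images/ directory here are placeholders, not part of the original setup, and it assumes curl is available in the job image:

build:
  stage: build
  before_script:
    # Fetch the untracked ISO into the workspace before make runs
    - mkdir -p images
    - curl -fSL -o images/base.iso "https://artifacts.example.com/isos/base.iso"
  script:
    - make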
Related
Is it possible to create a local dbt deps repository so that the "dbt deps" command downloads libraries from that local repository?
N.B.: Our client does not want to connect to any external network.
Yes, this is possible, provided that the repositories have already been cloned or copied locally.
The dbt docs page on Packages tells you exactly how to do this:
Packages that you have stored locally can be installed by specifying the path to the project, like so:
packages:
  - local: /opt/dbt/redshift # use a local path
Local packages should only be used for specific situations, for example, when testing local changes to a package.
Note: I think it is worth re-iterating the caveat given in the docs. You will now own cloning the correct versions of the packages, along with the ongoing work of keeping them up-to-date.
As for how this works in practice, consider the following example:
/Users/michelle/repos/my_dbt_project is where my dbt project lives (it contains dbt_project.yml and packages.yml)
/Users/michelle/repos/dbt_utils the location where I previously cloned the dbt-utils repo
In this example, my packages.yml should look like:
packages:
  - local: /Users/michelle/repos/dbt_utils # use a local path
Please note that the external package does not live within my dbt project directory, but outside of it. While it should work to keep it inside the repo, this is not best practice. This external package development article goes even further in-depth.
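With that entry in packages.yml, running dbt deps from the project root should install the package from the local path just as it would for a hub or git package; worth verifying against your dbt version, since packages install into dbt_modules/ or dbt_packages/ depending on the version.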
How do I edit the .gitlab-ci.yml file to correctly execute the CI/CD pipeline for a Vue.js application?
A .gitlab-ci.yml file should be part of your repository, at its root folder.
See for example "How to use GitLab CI/CD for Vue.js" (2017, so some details may have changed since then),
or the 2020 "How to auto deploy a Vue application using GitLab CI/CD on Ubuntu 18.04".
In both cases, you can directly edit the .gitlab-ci.yml, then add, commit and push.
Create a .gitlab-ci.yml file at the root of your repo.
GitLab will check for this file when new code is pushed.
If the file is present, it will define a pipeline, executed by a GitLab Runner.
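As a starting point, a minimal sketch for a Vue.js project might look like this; the Node image tag and script names are assumptions based on a standard Vue CLI setup, so adjust them to match your package.json:

image: node:18

stages:
  - build

build:
  stage: build
  script:
    # Install exact dependencies from the lockfile, then build the production bundle
    - npm ci
    - npm run build
  artifacts:
    paths:
      - dist/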
I am building a C/C++ project in GitLab CI. Its compilation produces 360 .lo files plus the binary (executable) file. It's a slow process, so I want to speed it up. I thought about caching. How do I do it? I have used caching before for npm and Python modules/packages, but now it's 360 C/C++ object files. How do I do it in the CI YAML file? All these .lo object files are located together with the source files in the src/ directory.
Caching works the same no matter what objects you're caching, so the syntax is the same as for npm and Python. In the job that builds your .lo files, add the cache configuration following the suggestions in the docs (https://docs.gitlab.com/ee/ci/caching/). Then, for other pipelines on that branch (or however you key the cache), jobs that depend on the .lo files will download them from the cache instead of recreating them.
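A minimal sketch of that idea, assuming the .lo files end up next to the sources in src/ as described; the job name and cache key are illustrative:

build:
  stage: build
  script:
    - make
  cache:
    key: $CI_COMMIT_REF_SLUG
    paths:
      # Keep the compiled object files between pipelines on the same branch
      - src/*.lo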
You could do the following:
Use Ccache.
Use the GitLab cache to cache Ccache's cache folder.
variables:
  # CCACHE_BASEDIR lets ccache rewrite absolute paths, so cache hits survive checkouts in different directories
  CCACHE_BASEDIR: $CI_PROJECT_DIR
  # Keep the ccache directory inside the project so GitLab's cache can pick it up
  CCACHE_DIR: $CI_PROJECT_DIR/ccache
before_script:
  # Reset the hit/miss counters so the after_script stats cover only this job
  - ccache --zero-stats
script:
  - build your files here
after_script:
  - ccache --show-stats
cache:
  - key: ccache-$CI_JOB_NAME
    paths:
      - $CCACHE_DIR
Here is an example .gitlab-ci.yml file using it, and the MR diff where the functionality was added.
My Environment
When I develop a Vue project, I keep track of the files in the dist folder.
My laptop
I develop the Vue project -> npm (yarn) run build -> git add everything in the dist folder -> git commit -> push
Production server
git pull -> everything is good to go (since I already built the dist folder on my laptop).
I have the URLs of the production server and the localhost server in my .env file:
VUE_APP_PRODUCTION_API=https://myprodserver.com/
VUE_APP_LOCAL_API=http://localhost:3000/
When I use:
vue-cli-service serve, it uses the localhost server URL.
vue-cli-service build, it uses the production server URL.
(so the URL is determined at build time, not run time)
The Problem
But I ran into a problem when I added a qa server.
Now there are three environments:
1. My laptop
2. qa server
3. production server
I found that the Vue project on the qa server tries to talk to the PRODUCTION_API (since it pulled the dist folder I had built for production on my laptop and committed).
What I tried
I created a .env.local on the qa server and overwrote VUE_APP_PRODUCTION_API:
# qa server .env.local
VUE_APP_PRODUCTION_API=http://qaserver.com/
and installed npm and yarn, and ran the build.
It worked! But... the problem is that it makes the git status output dirty.
When I run git status:
changes:
file removed: dist/file...
file removed: dist/file...
file removed: dist/file...
file removed: dist/file...
I know I can .gitignore the dist folder and build every time I git pull in each environment... but that would be bothersome. It would mean installing npm, yarn, and the whole dependency tree on the production server too. (That wasn't necessary before, since I used to build on my laptop.)
Question
Do I have to build the project every time just to change the base URL?
Was it right to keep track of the dist folder in the first place?
Is there any best practice you could share with me?
Ideally, you would want to deploy your project using a tool like GitHub Actions or GitLab CI.
This would allow you to build your project with the environment variables needed for each target environment.
In general, the dist files aren't included in version control, as each developer working on your project would have different versions of them and you would run into merge conflicts really often.
Here's an example using GitHub Actions, deploying on GitHub Pages.
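For GitLab CI, a sketch of the same idea might look like this, assuming a Vue CLI project where a .env.qa file defines the QA API URL; the branch name, file names, and job layout here are assumptions:

stages:
  - build

build-qa:
  stage: build
  script:
    - npm ci
    # --mode qa makes Vue CLI load .env.qa instead of .env.production
    - npm run build -- --mode qa
  artifacts:
    paths:
      - dist/
  only:
    - qa

A matching build-production job would run a plain npm run build on the main branch, so each environment gets a bundle built with its own URL and dist never needs to be committed.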
Right now, anyone who creates a branch in my project and adds a .gitlab-ci.yml file to it can execute commands on my server using the runner. How can I make it so that only Masters or Owners can upload CI config files and make changes to them?
I'm using https://gitlab.com/gitlab-org/gitlab-ci-multi-runner running on bash.
The GitLab Runner wasn't really designed for this scenario, so you can't do this. What you could do instead is create a new project containing just your .gitlab-ci.yml file, and configure it to pull the original repository. From there you can do all the other things you want to do with your repository.
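A sketch of that wrapper project's .gitlab-ci.yml, assuming the runner has permission to clone the original project; the URL and build command are placeholders:

build:
  stage: build
  script:
    # Clone the real repository into this job's workspace
    - git clone https://gitlab.example.com/group/original-project.git
    - cd original-project
    - make

Because only members of the wrapper project can edit its .gitlab-ci.yml, branches in the original project no longer control what runs on the runner.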