Can't get drone.io CI to share files between pipeline steps - drone.io

here is my .drone.yml:
pipeline:
  test:
    image: node:10
    commands:
      - npm install --production
      - npm run build --production
  publish:
    image: plugins/docker
    repo: myhub/image_name
    when:
      event: push
      branch: master
The command npm run build creates a folder named build with static files inside. However, the publish step fails when building the Docker image. Here is my Dockerfile:
FROM node:10-alpine
RUN mkdir -p /app
WORKDIR /app
COPY build build
The error is: COPY failed: stat /var/lib/docker/tmp/docker-builder090186817/build: no such file or directory time="2018-05-28T21:19:25Z" level=fatal msg="exit status 1"
So I don't quite understand how to build some files in one step and then copy them in the Docker publish step...
Thanks for your help!

So anything in the workspace will get shared with the next step ;)
Are you able to build the Docker image without Drone, with just docker build .?
i.e. you might want to try changing COPY build build to COPY ./build /app/build or something like that.
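For reference, here is a sketch of the Dockerfile with that change applied (assuming the build folder is produced in the repository root by the test step, and that plugins/docker uses the shared workspace as its build context, which should be its default):

FROM node:10-alpine
WORKDIR /app
# the build folder produced by "npm run build" in the test step
# should already be present in the shared workspace
COPY ./build /app/build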

Related

Self-hosted GitLab Runner with shell executor can’t find npm

I'm configuring a very simple CI job. GitLab Runner is running on my own server; the specific runner for this project has been registered with the shell executor, as I want to simply run shell commands.
stages:
  - build

build:
  stage: build
  script:
    - npm install
    - npm run build
  artifacts:
    paths:
      - "public/dist/main.js"
  only:
    - master
The job fails at the first command, npm install, with npm: command not found. I had just installed node and npm via nvm. If I SSH into my server and run npm -v, I can see version 8.5.5 is installed. If I sudo su gitlab-runner, which I suppose is the user GitLab Runner runs as, npm -v works just as well.
I installed npm while gitlab-runner was already running, so I ran service gitlab-runner restart, thinking that it had to re-evaluate its PATH, but that didn't fix the issue.
I fixed it by simply adding this command before npm install: . ~/.bashrc
I'm not sure why gitlab-runner didn't properly read .bashrc before, even though I restarted it. Maybe it's not supposed to? That would be contrary to what's said in the GitLab CI runner docs.
N.B.: A key element in me being able to debug this was to clone the repo on a folder on my server, cd into it, and run gitlab-runner exec shell build after any (local) change to .gitlab-ci.yml. Skipping the whole commit + push + wait was a huge time (and sanity) saver.
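To make the fix concrete, here is a sketch of the job with the workaround applied (assuming npm's PATH setup lives in ~/.bashrc, e.g. because it was installed via nvm):

build:
  stage: build
  script:
    - . ~/.bashrc   # load the PATH entries the non-interactive shell misses
    - npm install
    - npm run build
  artifacts:
    paths:
      - "public/dist/main.js"
  only:
    - master

And the local debugging loop mentioned above, run from a clone of the repo on the server:
gitlab-runner exec shell build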

vue.js project build in Docker corrupts base64 image

I have a weird issue where base64 images are corrupted after I deploy to my server with GitLab CI.
I started my boilerplate from this recently updated GitHub repository:
https://github.com/Jamie-Yang/vue3-boilerplate
A local yarn dev works.
Then I made the Dockerfile:
FROM node:16 as build-stage
WORKDIR /app
COPY package*.json ./
RUN yarn install
COPY /. .
RUN yarn build
# production stage
FROM nginx:stable-alpine as production-stage
COPY --from=build-stage /app/dist /usr/share/nginx/html
RUN rm /etc/nginx/conf.d/default.conf
COPY nginx.conf /etc/nginx/conf.d
EXPOSE 80
CMD ["nginx", "-g", "daemon off;"]
The image builds fine and runs on Windows Docker Desktop without a problem.
Next I set up the GitLab deploy:
image: docker

services:
  - docker:dind

stages:
  - deploy

step-deploy-prod:
  cache: []
  stage: deploy
  script:
    - docker build -t app/myapp .
    - docker stop myapp || true && docker rm myapp || true
    - docker run -d --restart always -p 11580:80 --name myapp app/myapp
The app is deployed, but somehow the base64 image is corrupt.
I wanted to be sure it wasn't the nginx reverse proxy, but visiting mysite.com:11580 gives the same result.
When I transfer the dist folder via FTP to a public_html folder (so no Docker), it works.
So maybe some caching is going on in the GitLab runner?
When I inspect the working webpage in public_html, my base64 src is
data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAMgAA...
and on the non working docker environment
data:image/png;base64,iVBORwoaCgAAAA1JSERSAAAAyAAAAM...
So what can be happening here?
[UPDATE]
When I replace the logo with a known-good image, it still renders corrupted.
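One thing worth ruling out (purely a debugging sketch, not a confirmed fix) is stale layer caching on the runner; forcing a clean build in the deploy script would eliminate that variable:

- docker build --pull --no-cache -t app/myapp .

--no-cache rebuilds every layer instead of reusing old ones, and --pull refreshes the node and nginx base images.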

How should I use JFrog Artifactory in a Dockerfile for npm install

We are using JFrog Artifactory for npm packages in our Jenkins CI pipeline. I would like to download npm packages from Artifactory during the npm ci/npm install step while building Docker images in our CI, to decrease the duration of the Docker build process.
I tried the approach below: copy the .npmrc file that contains our private registry information into the Docker container. When I checked the logs of npm install, it was able to download the dependencies from our JFrog Artifactory. But this is not a secure approach, because I do not want to keep .npmrc in the local repository and commit it to the VCS.
What might be the best approach for doing this?
Dockerfile
FROM node:12.21.0-alpine3.12 AS builder
WORKDIR /usr/src/app
ARG NPM_TOKEN
ARG NODE_ENVIRONMENT=development
ENV NODE_ENV=$NODE_ENVIRONMENT
COPY package.json /usr/src/app/package.json
COPY package-lock.json* .
COPY .npmrc /usr/src/app/.npmrc
RUN npm ci --loglevel verbose
RUN rm -f .npmrc
FROM node:12.21.0-alpine3.12
WORKDIR /usr/src/app
RUN apk update && apk add curl
COPY --from=builder /usr/src/app /usr/src/app
COPY . .
EXPOSE 50005 9183
CMD [ "npm", "run", "start:docker" ]
.npmrc
registry=https://artifacts.[company].com/artifactory/api/npm/team-npm-development-virtual
_auth = xxxxxxxxxx
always-auth = true
email = firstname.lastname@company.com
You can store your .npmrc in your VCS as long as it doesn't contain the "_auth" entry.
On Azure DevOps we use service connections.
With "build containers" that wouldn't be possible, AFAIK.
So my approach would be to store the credential in a protected build variable and inject it at build time: right before npm install, set the "_auth" value in .npmrc.
You can achieve this in many different ways, but that is the general idea.
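For example, a rough sketch of that idea against the builder stage above (the echo line and the docker build invocation are illustrative, not a confirmed recipe; NPM_TOKEN is the protected variable holding the _auth value):

FROM node:12.21.0-alpine3.12 AS builder
WORKDIR /usr/src/app
ARG NPM_TOKEN
# .npmrc is committed without _auth; the credential is appended only at build time
COPY package.json package-lock.json* .npmrc ./
RUN echo "_auth = ${NPM_TOKEN}" >> .npmrc \
 && npm ci --loglevel verbose \
 && rm -f .npmrc

and pass the secret in from the CI job:
docker build --build-arg NPM_TOKEN="$NPM_TOKEN" -t myapp .

Note that build args remain visible in the history of the stage where they are used; the multi-stage build already in place is what keeps the token out of the final image.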

Vue.js application does not run on GitLab Pages

I built a Vue.js + Vuex user interface. It works perfectly (on my laptop). I want to deploy it to GitLab Pages.
I used the file described here (except that I upgraded the Node.js version):
build site:
  image: node:10.8
  stage: build
  script:
    - npm install --progress=false
    - npm run build
  artifacts:
    expire_in: 1 week
    paths:
      - dist

unit test:
  image: node:10.8
  stage: test
  script:
    - npm install --progress=false
    - npm run unit

deploy:
  image: alpine
  stage: deploy
  script:
    - apk add --no-cache rsync openssh
    - mkdir -p ~/.ssh
    - echo "$SSH_PRIVATE_KEY" >> ~/.ssh/id_dsa
    - chmod 600 ~/.ssh/id_dsa
    - echo -e "Host *\n\tStrictHostKeyChecking no\n\n" > ~/.ssh/config
    - rsync -rav --delete dist/ user@server.com:/your/project/path/
The job is marked as having run successfully in the pipeline. However, when I click on the Pages URL I get a 404 HTTP error code.
What am I missing?
I was facing a similar issue when I was trying to deploy my Vue.js application to GitLab Pages. After weeks of trial and error, I got it to work.
Looking at your script above, you're building the app, unit testing it, and trying to deploy it to an external server. If you also need it on GitLab Pages, you have to use the pages job.
Here is my pages job for deploying a vue.js app to Gitlab pages:
pages:
  image: node:latest
  stage: deploy
  script:
    - npm install --progress=false
    - npm run build
    - rm -rf public
    - mkdir public
    - cp -r dist/* public
  artifacts:
    expire_in: 1 week
    paths:
      - public
  only:
    - master
Hope this is what you're looking for.
You can also deploy without the pipeline. For this to work you first have to build your application for production. If you used Vue CLI, this is done by invoking the build command, e.g. npm run build.
This will generate a dist folder containing your assets. This is what you have to push to your repository. For example, look at my repository.
https://github.com/DanijelH/danijelh.github.io
And this is the page
https://danijelh.github.io/
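A rough sketch of that flow, assuming a Vue CLI project and that the built files should be committed (which usually means force-adding dist, since it is typically gitignored):

npm run build                        # generates the dist folder
git add -f dist                      # force-add in case dist is ignored
git commit -m "Add production build"
git push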

How to run build in local machine with drone.io

Does the build have to run on the drone.io server? Can I run the build locally? Since developers need to pass the build before pushing code to GitHub, I am looking for a way to run the build on a developer's local machine. Below is my .drone.yml file:
pipeline:
  build:
    image: node:latest
    commands:
      - npm install
      - npm test
      - npm run eslint
  integration:
    image: mongo-test
    commands:
      - mvn test
It includes two Docker containers. How do I run the build against this file with Drone? I looked at the Drone CLI, but it doesn't work the way I expected.
@BradRydzewski's comment is the right answer.
To run builds locally you use drone exec. You can check the docs.
Extending his answer: you must execute the command in the root of your local repo, exactly where your .drone.yml file is. If your build relies on secrets, you need to feed these secrets through the command line using the --secret or --secrets-file option.
When running a local build, there is no cloning step. Drone will use your local git workspace and mount it in the step containers. So, if you check out some other commit/branch/whatever during the execution of the local build, you will mess things up because Drone will see those changes. So don't update your local repo while the build is running.
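For example, a minimal local run from the repository root (the secrets file name is illustrative):

cd /path/to/repo                     # the directory containing .drone.yml
drone exec                           # runs the pipeline in local Docker containers
drone exec --secrets-file .secrets   # only needed if the build relies on secrets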