copy artifacts from gitlab runners to windows shared drive - gitlab-ci

I am new to GitLab runners. I need help with a pipeline script to copy the artifact produced by a GitLab pipeline (which runs on a Linux runner and on Windows) to a Windows share. Any help is appreciated.
The pipeline script is a .gitlab-ci.yml file which in turn calls a .sh script.
Thanks

You can use Samba and cifs-utils to mount the Windows share on your runner, and then use the GitLab CI artifacts API (or a plain copy step in the job) to get the artifact onto the mounted path.
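For example, a deploy job along these lines mounts the share and copies the artifact onto it. This is only a sketch: it assumes a privileged Linux runner with cifs-utils installed, a preceding "build" job that uploaded dist/ as an artifact (fetched here via dependencies rather than the API), and placeholder share paths and credential variables.

deploy_to_share:
  stage: deploy
  dependencies:
    - build                # fetches the artifact produced by the hypothetical "build" job
  script:
    - mkdir -p /mnt/winshare
    # mounting needs CAP_SYS_ADMIN, i.e. a privileged runner/container
    - mount -t cifs //fileserver/share /mnt/winshare -o username=$SHARE_USER,password=$SHARE_PASS
    - cp -r dist/ /mnt/winshare/
    - umount /mnt/winshare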

Related

GitLab CI / CD: do I need docker for ASP.NET Core - Angular application

I have a self-hosted GitLab EE instance and want to enable CI/CD. I already have gitlab-runner installed on Windows, and it is activated for my project.
I also have a custom server for hosting my ASP.NET Core 2.0 with Angular 5.0 application.
My goal is that when a commit is pushed to GitLab, a build and a deploy (to the custom server) are executed. The deploy path would differ depending on whether the commit was made to master or to any other branch (from a merge request), conditioned by Git tag.
Almost all tutorials use Docker, but I couldn't find out why. What are the cons of using Docker?
I thought I only needed msbuild (I could also install Visual Studio) on the machine where gitlab-runner is running; it would build and deploy the application. I found a configuration file that doesn't use Docker, but the question remains: do I need Docker, and why?
A Docker image would contain all the msbuild dependencies and build tools needed to build your application, without you going through the trouble of manually installing them on your server.
So basically Docker helps you manage the dependencies of your application more efficiently.
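For illustration, a minimal .gitlab-ci.yml sketch using an SDK image. The image tag is an example (pick one that matches your target framework), the publish path is a placeholder, and the runner must use the Docker executor.

build:
  image: mcr.microsoft.com/dotnet/sdk:6.0   # example tag; the image ships the SDK and msbuild
  script:
    - dotnet restore
    - dotnet publish -c Release -o publish
  artifacts:
    paths:
      - publish/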

Gitlab CE Continuous Integration build node/angular app and deploy dist folder to server

I've spent countless hours trying to understand how to do this correctly.
I have a Node.js application with an Angular front-end, both contained in the same project.
I would simply like to have GitLab CE CI build the project, then copy the resulting dist folder and package.json file to the production server and restart the app.
I have a shared gitlab-runner set up and was able to successfully configure the SSH runner.
Using the GitLab SSH runner I was able to copy the entire project to the production server, but I cannot get it to build (plus I really don't want all the files on the server, just the production-required files).
What am I missing? Do you use a Docker runner with a Node image to build the project and then scp the files to the production server?
Any guidance would greatly be appreciated.
Yes, I think you already stated the solution.
I would recommend looking into Job Artifacts: build your application in a Docker-based build job and upload the production files as an artifact.
The next job can then deploy the artifact files (by scp, or ssh plus wget/curl) to your production server and restart the webserver.
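A sketch of such a pipeline, assuming an SSH key stored in an SSH_PRIVATE_KEY CI variable; the image tags, server address, paths, and pm2 (just one example of a process manager) are all placeholders:

build:
  stage: build
  image: node:lts              # any Node image matching your version
  script:
    - npm ci
    - npm run build            # assumed to produce dist/
  artifacts:
    paths:
      - dist/
      - package.json

deploy:
  stage: deploy
  image: alpine
  only:
    - master
  script:
    - apk add --no-cache openssh-client
    - eval $(ssh-agent -s)
    - echo "$SSH_PRIVATE_KEY" | ssh-add -
    # host key checking is disabled here only to keep the sketch short
    - scp -o StrictHostKeyChecking=no -r dist/ package.json user@prod.example.com:/var/www/app/
    - ssh -o StrictHostKeyChecking=no user@prod.example.com 'cd /var/www/app && npm install --production && pm2 restart app'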

Bamboo builder task

I am a newbie to CI and have just started using Bamboo server for continuous integration. I have the Bamboo server running and have set up my first plan, with a task for source checkout. Now I am trying to add an automated build for my app.
For now the app is just a simple console-based example that runs on both Windows and Linux. I have a makefile associated with the app to build it, and then I run the .exe (Windows) or ./ (Linux) manually.
Now I want to set up the "Builder task" (Script task). How do I automate the build with Bamboo?
Yes, you have to add a "Script task" and call your makefile in it to build your application. You would also need to define an Artifact, providing the relative path to your .exe file (on the Artifacts tab within your job configuration), so you can download the .exe file when the build is finished.
It depends on where your Bamboo is running.
If Bamboo is on a Linux machine, just a Script task would be enough. To build it in a Windows environment as well, you need to have a Windows agent or use winexe to call the build remotely.
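If you manage the plan as code, a Bamboo Specs YAML sketch might look roughly like this. The project/plan keys and the artifact pattern are placeholders, and you should check the Bamboo Specs reference for the exact schema of your Bamboo version:

version: 2
plan:
  project-key: PROJ            # placeholder
  key: APP                     # placeholder
  name: Console app build
stages:
  - Build stage:
      - Build
Build:
  tasks:
    - script:
        - make                 # call the makefile, as with a manual build
  artifacts:
    - name: executable
      pattern: '*.exe'         # relative path to the built binary
      shared: true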

Does drone.io support creating Docker containers during the build process?

I am using the maven-docker-plugin in my project. This plugin creates Docker containers during integration tests. Since drone.io puts the build process inside a Docker container, can I still use the maven-docker-plugin during the Maven build? And how do I control those Docker containers at build time?
If you want to interact directly with the Docker daemon to create and start containers, you need to mount the host machine's Docker socket into your build container.
Since you mentioned using the docker-maven-plugin, you may want a configuration similar to the following:
pipeline:
  build:
    image: maven
    environment:
      - DOCKER_API_VERSION=1.20
      - DOCKER_HOST=unix:///var/run/docker.sock
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
    commands:
      - mvn clean package docker:build
Please note that exposing the Docker daemon to your build environment is essentially giving your build root access to your server. This approach is therefore not recommended for public repositories.
Please also note that volumes are restricted for security reasons. To use volumes, you need to have a Drone administrator mark your repository as trusted in your repository settings screen.
So it is possible to launch containers from inside the build environment for the purpose of running your tests. The recommended approach, however, is to run your tests directly inside your build environment. This is the use case for which Drone is optimized, and it eliminates the security issues mentioned above.
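For instance, the dependencies your integration tests need can be declared as Drone service containers instead of being launched by the Maven build itself. A sketch, where the service name, image, and credential are examples:

pipeline:
  build:
    image: maven
    commands:
      # tests reach each service by its name as the hostname, e.g. "database:5432"
      - mvn clean verify

services:
  database:
    image: postgres:9.6          # example integration-test dependency
    environment:
      - POSTGRES_PASSWORD=secret # example credential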

How to install a jar file from target to a specific location using maven?

Currently I have a simple Maven project that builds a jar file and puts it in target/some-1.0.jar when I run mvn install.
I want to copy this file to another location when I run mvn deploy.
Currently the location is on the same machine, but it would also be great if the solution could be applied to multiple targets, some of them on other machines (scp deployment).
What is the easiest solution for doing this? It would be nice if you could include an example too.
Details: I have a few JIRA plugins that are compiled as jar files, and I just want to be able to run a single Maven command that copies the files to the server and eventually restarts the server.
mvn deploy is intended for deployment to a remote Maven repository. mvn install is used for copying to the local Maven repo (so the jar actually ends up in $HOME/.m2/repository as well as in target).
I'm not sure what you're intending to do, but I suggest you look at deploying something like http://www.sonatype.org/nexus/ if you want Maven artifacts to be available to multiple machines. This will integrate nicely with the rest of Maven.
Edit: based on your updated question, it's probably best to investigate the Wagon ssh plugin, or see if there's an Ant plugin. A suitable phase would be pre-integration-test: install and deploy should be run after you've run your integration tests to check the artefact works as expected. Use profiles to distinguish the local vs. remote cases.
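A hedged example using the codehaus wagon-maven-plugin's upload-single goal from the command line. The URL, path, and serverId are placeholders, credentials for the serverId come from a server entry in settings.xml, and you should verify the property names against the plugin's documentation:

mvn org.codehaus.mojo:wagon-maven-plugin:upload-single \
    -Dwagon.fromFile=target/some-1.0.jar \
    -Dwagon.url=scp://yourserver/opt/jira/plugins \
    -Dwagon.serverId=jira-server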