I'm trying to get an existing Jenkins job working in GitLab CI.
On Jenkins my project gets built via mvn clean package and the resulting war file is then moved to a Tomcat container. Once finished, another job gets triggered to call a specific URL of this project to do some stuff which is unrelated to my question. When this is done, Tomcat stops and the two jobs are finished.
How can I do that with GitLab CI?
I started with something like
image: cdornbusch/tomcat-maven
stages:
  - build

build:
  script:
    - "mvn clean tomcat:run"
    - "echo 'TEST'"
  stage: build
but I never see the echo 'TEST', which makes sense since mvn tomcat:run never stops... but how can I build and deploy the project and then call a specific URL of it? Once the build is done, I don't need the Tomcat instance anymore.
Just a side note: I use my own Docker image, which is based on Ubuntu with Maven, Java and Tomcat, to fulfill my project requirements.
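One possible shape for this in .gitlab-ci.yml, as an untested sketch: run the question's own mvn clean tomcat:run goal in the background within the same job, wait for it to come up, call the URL with curl, and let the job end so the Tomcat instance is thrown away. The wait time, context path and endpoint below are placeholders, not taken from the question.

image: cdornbusch/tomcat-maven

stages:
  - build

build:
  stage: build
  script:
    # run the original goal in the background so the job is not blocked by it
    - mvn clean tomcat:run &
    # crude readiness wait; polling the URL in a loop would be more robust
    - sleep 120
    # hypothetical context path and endpoint - replace with the real ones
    - curl -f http://localhost:8080/my-app/do-stuff
    # when the script ends, the job (and the background Tomcat) is discarded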
Related
How do I run several build steps one after another in IntelliJ? I think I want a mini CI/CD build system inside the editor.
For example, the project I work on now is a Spring Boot and JavaScript website. I need to build it with Maven using mvn clean package -Pdockerimage. This copies the files for building the Docker image to target/dockerimgbuild.
Then I want to build the Docker image using docker build -t scheduling-ui-dev . and after that run it with Docker Compose (docker-compose up --build) from src/main/resources/docker-compose.
I have built one run configuration for each of these steps, but how do I run them one after another? I have found that you can add Before Launch tasks, but the system is clunky and complains if target/dockerimgbuild doesn't exist even before it has run the Maven step that creates it. The latest problem I stumbled on was that a file prevented Maven from removing target/dockerimgbuild, and all run steps were automatically removed from the run configurations.
There is a run configuration called compound, but it runs everything in parallel and you cannot specify the order, which is a problem.
I wonder if it is feasible to start TeamCity in a container; does anyone have a clue about that (is TeamCity easy to configure, how do I make it launch a docker-compose container on my host machine, etc.)?
My solution right now is to have several terminals (if this gets more permanent I will replace it with a script) where I just press up and enter to execute the steps manually. That seems stupid, as I guess Maven itself can do all of this... but I don't know how, or how much work it is.
There is a compound Run/Debug configuration: https://www.jetbrains.com/help/idea/run-debug-configuration-compound-run-configuration.html
Also, there is a multi-run plugin: https://plugins.jetbrains.com/plugin/7248-multirun
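If the plain script mentioned in the question is acceptable as a stop-gap, a minimal sketch chaining the three commands from above might look like this (the docker build context may need to be target/dockerimgbuild instead of ".", depending on where the Dockerfile ends up):

#!/usr/bin/env bash
set -e                                        # stop at the first failing step
mvn clean package -Pdockerimage               # copies the Docker build files to target/dockerimgbuild
docker build -t scheduling-ui-dev .           # or: docker build -t scheduling-ui-dev target/dockerimgbuild
(cd src/main/resources/docker-compose && docker-compose up --build)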
I've adapted the following example from the Jenkins Kubernetes plugin documentation.
I've created a simple Jenkins pipeline job and added the Groovy script inline. Maven pulls all the dependencies, but weirdly fails at the compilation step.
Here's the console output
How does one even go about debugging this, since the failure happens in a temporary container?
The issue was with the settings.xml. It had a Windows path, which messed up Maven's classpath. I added a "sleep 99m" line to keep the container on hold and then SSH'd into the container to identify the root cause.
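For reference, a rough sketch of that debugging trick; the commands and pod name below are assumptions (the answer above simply slept and got a shell into the container), not something prescribed by the plugin:

# inside the failing pipeline step, keep the agent container alive on failure:
mvn -B clean package || sleep 99m
# then get a shell in the agent pod (the name is whatever the plugin generated):
kubectl get pods
kubectl exec -it <agent-pod-name> -- sh   # inspect settings.xml, paths, environment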
I have a question about the workflow with Docker and GitLab CI, or automated builds in general.
This is how I imagine a build should look ↓
How do I do this with GitLab CI?
I know how to do each of these tasks on its own, but I don't know how to chain them together.
In my imagination I would need more than one base image.
Maybe I am misunderstanding the whole thing.
How should this process be done in general?
Thanks for your help 😀
Since your question is very general, I will answer it with an example.
Consider an imaginary C++ project, which contains the code, a Makefile that creates the executable "app", and this Dockerfile:
FROM ubuntu:16.04
ADD ./app /app
CMD ["/app"]
To build the application and the docker image as you said, you could use a GitLab CI config like this:
stages:
  - test
  - build
  - docker

test:
  stage: test
  script:
    - make test

build:
  stage: build
  script:
    - make
  artifacts:
    paths:
      - ./app

docker:
  stage: docker
  dependencies:
    - build
  script:
    - docker build -t your-repo/image-name:latest .
    - docker push your-repo/image-name:latest
Explanation
This CI file creates three jobs: "test", "build" and "docker". "test" runs "make test" to execute any imaginary tests our codebase might have. If they succeed, the GitLab runner will execute the next job, "build".
"build" builds the application by calling "make". We expect make to create a file "app" in the current directory, which is our compiled application that will run in the container. The section "artifacts" states that we want to keep this resulting file, since we need it for the next job.
The next job "docker" has a section "dependencies"; in this section we state that this job depends on the output of the job called "build", which created our file "app" earlier. We then build the Docker image using docker build and push it as usual.
As said before, these are just examples, and especially the script sections will greatly differ based on your projects and your runner config. See the official CI documentation for all possibilities.
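One runner-related detail worth calling out as an assumption about your setup: the "docker" job can only run docker build if the runner can reach a Docker daemon. A common (but not the only) way to do that is Docker-in-Docker, sketched below; depending on your GitLab and runner versions you may need additional variables (for example for TLS), so treat this purely as a starting point.

docker:
  stage: docker
  image: docker:latest          # provides the docker CLI inside the job
  services:
    - docker:dind               # runs a Docker daemon alongside the job
  dependencies:
    - build
  script:
    - docker build -t your-repo/image-name:latest .
    - docker push your-repo/image-name:latest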
In Maven, what does "-e" stand for in the following command?
mvn -e clean install
Moreover, what is the difference between
mvn clean install
and
mvn clean compile
As Satish stated, the "-e" switch will display execution errors in the maven output.
As to the difference in "install" vs "compile", those are different Maven lifecycle stages. See the Introduction to the Build Lifecycle documentation for help with that. The key to remember is that Maven will execute all lifecycle stages up to and including the one you specify, and then stop.
Specifically in your case, "mvn clean compile" will run Maven with two lifecycle targets, the first being "clean" and the second being "compile". The "compile" lifecycle phase will run the build up to and including the compilation of project source code. The "install" lifecycle phase will run all the way through packaging your project into its container (jar, war, etc.) and will install it to your local Maven repository, which resides on your local machine. When a project is installed to your local repository, other projects you build on your machine can reference it without needing to know where the source code or project build artifacts actually reside.
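To make the difference concrete, here is roughly how far each invocation from the question runs (the repository path is the Maven default):

mvn clean compile      # clean, then run the lifecycle up to compiling sources into target/classes
mvn clean install      # clean, then compile, test, package the jar/war and copy it into ~/.m2/repository
mvn -e clean install   # same as mvn clean install, but print full stack traces when something fails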
The -e flag (e = errors) prints out more detailed error messages.
mvn clean install does compilation, packaging and installation (it copies the built artifact into your local repository so other builds can use it).
For more Maven options, look at this ref card:
http://www.scribd.com/doc/15778516/DZone-Refcard-55-Apache-Maven-2
or this Maven command list:
http://cvs.peopleware.be/training/maven/maven2/mvnCommand.html
mvn clean install - First, it cleans the already compiled class files (typically in the target/ directory). Then it compiles the classes, generates the jar, and installs the created jar into your local m2 repository (typically located at ~/.m2/repository/).
mvn clean compile - The clean does the same thing as above. Then it compiles the Java files in the project and stops there. It doesn't create the jar or install anything into the local Maven repository.
The -e switch will display the stack traces that occur when your build fails. It's the normal stack trace that Java programs produce when exceptions occur. Note that Maven itself is a Java program.
I've created a Hudson job for our Maven multi-module project with 5 modules to deploy the SNAPSHOT artifacts to the Maven repository. That's fine, as long as it builds successfully without test failures. However, now I'd like to fulfill the following requirements:
When a module has a test failure, the build should continue building and testing the other modules, but turn yellow. Using -Dmaven.test.failure.ignore=true accomplishes this, but fails at the next requirement.
When a module has a test failure, none of the artifacts should be deployed to the Maven repository. Other projects depend on the snapshots of this project, and those projects only want to use the latest snapshots that don't have any failing tests.
Preferably, use the Hudson Maven integration instead of a free-form script, so we get the Hudson report pages (red/yellow/blue status per module, build log error coloring, ...). Specifically, running the Maven build twice (first mvn test -Dmaven.test.failure.ignore=true, then mvn deploy -DskipTests) is not a solution because it's a performance loss, it confuses the Hudson report pages, and it's not atomic (it updates from the repositories again in the second build).
Is there any way to accomplish this?
There is a post-build option called Deploy artifacts to Maven repository. If you do not select Deploy even if the build is unstable, then when a test fails, nothing will be deployed. Together with the -fae flag in the command, things should work the way you want.
Maybe you can try the mvn -fae option with your jobs on Hudson - it makes Maven fail only after the full build.
If build time isn't a problem for you, I think the better option is to create another job, just for deploying. Something like this:
Configure your original job (let's call it "build job") with "mvn -fae clean install"
Create a new job ("deploy job") with "mvn deploy", and don't configure any Build triggers for it
In the "build job", enable the Build other projects option, under Post-build actions and set it to run your "deploy job".
Maybe you can try to configure both jobs to use the same workspace, saving some time on the whole build/deploy process.
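In plain Maven terms, and assuming the shared workspace suggested above, the two jobs boil down to something like:

# "build job": keep going past test failures so every module gets built and tested
mvn -fae clean install
# "deploy job": triggered afterwards by the build job, deploys the artifacts built above
mvn deploy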
If you happen to use Artifactory as a repository manager, you can use the Hudson/Jenkins Artifactory plugin to deploy your artifacts. This plugin will only deploy your artifacts if all tests pass for all modules of a Maven build.