I am struggling to get coverage for classes deployed into a WebLogic server. I am certain they are instrumented. The setup is that some classes are packaged in JARs and some are in unpacked form. When tests are executed against the running application (some automated clicking), I can see some coverage for the classes packaged in JARs, but no coverage at all for the unpacked classes. Clover does know about those classes (I can see them listed in the Clover database), but they show no coverage.
I started a project and have about 7 tests in it so far, and it already takes more than a minute to execute the whole test suite using gradle test.
From the additional output (--info flag) I can see that the whole Quarkus application, and also dependencies like the MongoDB instance, are restarted for every test class and method.
This is the exact opposite of what the Quarkus documentation says on the testing guide page:
So far in all our examples we only start Quarkus once for all tests. Before the first test is run Quarkus will boot, then all tests will run, then Quarkus will shutdown at the end. This makes for a very fast testing experience however it is a bit limited as you can’t test different configurations.
All the tests are annotated with @QuarkusTest and every test just tests a single endpoint.
I use "pure" Kotlin (1.5.21), Quarkus 2.2.2.Final and Gradle 6.9.
Installed features: cdi, config-yaml, jacoco, kotlin, mongodb-client, mongodb-panache-kotlin, narayana-jta, rest-client, rest-client-jackson, resteasy, resteasy-jackson, smallrye-context-propagation, smallrye-health, smallrye-openapi, swagger-ui
Is that normal behaviour? If so, an application with several hundred tests could easily take ~20 minutes or more to run the entire test suite.
I haven't tried Maven yet, so I can't verify that it's not a Gradle-related issue.
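For reference, each of my test classes looks roughly like this (the endpoint and class name here are illustrative, not my actual code):

    // Minimal sketch of one test class; rest-assured is on the test
    // classpath of a default Quarkus project, /q/health comes from
    // the smallrye-health extension.
    import io.quarkus.test.junit.QuarkusTest
    import io.restassured.RestAssured.given
    import org.junit.jupiter.api.Test

    @QuarkusTest
    class HealthCheckTest {

        @Test
        fun `health endpoint responds with 200`() {
            given()
                .`when`().get("/q/health")
                .then()
                .statusCode(200)
        }
    }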
While trying to reproduce it with a fresh project, I think I found the issue with my code:
I also used @QuarkusTestResource with restrictToAnnotatedClass = true on my tests.
This means the configuration and test profiles must be reloaded for each annotated class, and therefore the Quarkus application must be restarted as well.
Apparently all the DevServices get restarted, too (in my case a MongoDB, since I'm using the Panache extension), which explains the long runtimes of the tests.
I reorganized my tests a little bit so they work with "global" test resources (a WireMockServer in my case); see the sketch below.
Now Quarkus only starts once before the tests, and the total runtime of the gradle test task is acceptable.
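For reference, a minimal sketch of the "global" variant (class names and the config key are made up; WireMock is assumed on the test classpath). Without restrictToAnnotatedClass = true, the resource, and therefore the single Quarkus boot, is shared by all tests:

    // Shared test resource: started once for the whole test run.
    import com.github.tomakehurst.wiremock.WireMockServer
    import io.quarkus.test.common.QuarkusTestResource
    import io.quarkus.test.common.QuarkusTestResourceLifecycleManager
    import io.quarkus.test.junit.QuarkusTest

    class WireMockResource : QuarkusTestResourceLifecycleManager {
        private var server: WireMockServer? = null

        override fun start(): Map<String, String> {
            val wireMock = WireMockServer(0) // 0 = pick a random free port
            wireMock.start()
            server = wireMock
            // Hypothetical config key: point a REST client at the mock.
            return mapOf("my-client/mp-rest/url" to wireMock.baseUrl())
        }

        override fun stop() {
            server?.stop()
        }
    }

    // Omitting restrictToAnnotatedClass means the resource applies globally,
    // so Quarkus boots only once for the entire suite.
    @QuarkusTest
    @QuarkusTestResource(WireMockResource::class)
    class CustomerEndpointTest { /* ... */ }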
Here is my scenario.
I have a code base, which is built and deployed as an EAR on a JBoss server.
I have a separate testing framework.
Now I want to run the classes of that EAR using my testing framework.
The test cases are written in TestNG.
Also I want to know the code coverage of the EAR.
I have used EclEmma to do code coverage for JUnit tests; that was simple, as the code and the tests were in the same place.
How can I use Emma in the case of a remote code base? Please help.
EclEmma is an Eclipse plugin based on JaCoCo, the Java Code Coverage Library. JaCoCo provides various ways of collecting code coverage. In particular, you can attach it to the server as a Java agent and request coverage information remotely, and even import it into Eclipse and view it using EclEmma.
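As a hedged sketch (paths, host, port and version are placeholders to adapt): start the server JVM with the JaCoCo agent in TCP-server mode, then dump the coverage data over the network once the TestNG run has finished:

    # Attach the agent to the JBoss JVM, e.g. via JAVA_OPTS in the start script:
    JAVA_OPTS="$JAVA_OPTS -javaagent:/opt/jacoco/jacocoagent.jar=output=tcpserver,address=*,port=6300"

    # After the tests have run, fetch an execution-data dump from the remote JVM:
    java -jar org.jacoco.cli-0.8.8-nodeps.jar dump \
        --address server-host --port 6300 --destfile jacoco-it.exec

The resulting jacoco-it.exec file can then be imported as a coverage session in EclEmma and browsed against your sources in Eclipse.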
We have a separate integration test project which fires the integration test cases against different modules. At present we do not have unit test cases within each module. We would like to ensure that the integration tests cover most of the domain functionality.
Since the integration tests live in a different project, Sonar always reports the test coverage as zero for the modules under test.
Is there any way to have test coverage reported on a project when the actual tests are run from a different project?
Thanks
You should be able to achieve what you want by reading the Code Coverage documentation page on the wiki. Most notably, you'll be able to use the following sample project to see how it works:
IT JaCoCo Sonar Runner sample project
Basically, you have to run your integration tests first using JaCoCo to generate the coverage report (jacoco.exec) and then you reuse this report during the SonarQube analysis.
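As a rough sketch (these are the property names from the older JaCoCo/Sonar integration; the path is a placeholder, and newer SonarQube versions use sonar.coverage.jacoco.xmlReportPaths instead), the analysis step just points at the exec file that the integration-test run produced:

    # Analysis properties (older-style setup; adapt the path to your build)
    sonar.dynamicAnalysis=reuseReports
    sonar.jacoco.itReportPath=target/jacoco-it.exec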
Is it possible to run unit tests when a non-Maven project is analyzed with Sonar, in Sonar light mode?
Sonar doesn't run unit tests itself, but it can analyze existing unit test reports. From Reuse in Sonar unit test reports generated by other systems:
2. Using Sonar in its full capability in an ANT environment
If you are using ANT to build your applications, the main weakness so far in Sonar was that it did not allow to display Unit tests results nor Code coverage. I am sure that now you have read the first use case, you know that by using the “-Dsonar.dynamicAnalysis=reuseReports” parameter, this limitation does not exist anymore. You simply need to specify where those reports to reuse are going to be found, by using the following properties: sonar.cobertura.reportPath, sonar.clover.reportPath, sonar.surefire.reportsPath...
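For instance, a hedged sketch of a sonar-project.properties for such a light-mode analysis (the report paths are made up and depend on where your build writes them):

    # sonar-project.properties (illustrative paths)
    sonar.dynamicAnalysis=reuseReports
    sonar.surefire.reportsPath=build/test-reports
    sonar.cobertura.reportPath=build/coverage/cobertura.xml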
I recently discovered that Hudson was not the problem. In actuality it was Maven itself: the multi-module build was causing the build failure, not Hudson. I just hadn't noticed where the issue actually existed.
Leaving the original question here.
I'm using the failsafe-maven-plugin to run some integration tests. The difference between failsafe and surefire is that failsafe allows failures and does not fail the build.
On my nightly builds there are occasions that a service the integration tests use might be down. In normal builds, the failsafe plugin would let the build continue since the integration tests are allowed to fail. However, Hudson does not seem to respect this and stops the build and produces rain.
I tried to turn the failsafe tests off on nightly builds using -DskipITs. This appears to fail since I'm in a multi-module build.
Any ideas on how to get Maven to respect that these tests can fail even though they're part of a specific module?
The project structure is as follows:
-parent
\-jar
\-jar (where integration tests run)
\-war
\-ear
You can use profiles to make builds a bit different for different environments (nightly builds, releases, normal developer builds and so on).
I'd also try updating the Maven version; there have recently been a few fixes related to multi-module builds.
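For example, a minimal sketch (the profile id is made up): a nightly profile in the parent POM that sets failsafe's skipITs user property, activated with mvn -Pnightly:

    <!-- parent pom.xml: hypothetical "nightly" profile -->
    <profiles>
      <profile>
        <id>nightly</id>
        <properties>
          <!-- user property honoured by the failsafe plugin -->
          <skipITs>true</skipITs>
        </properties>
      </profile>
    </profiles>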
I don't believe your original assumption that failsafe-maven doesn't fail the build is correct. A failed test does not stop the integration-test phase from completing, which is different from the surefire plugin that runs unit tests. This allows the post-integration-test phase to run, so the test environment can be torn down (app server shut down, etc.).
After this, the verify phase runs, which looks at the results of the integration tests. If one of those tests has failed, Maven returns a build failure, which Hudson will rightly pick up so your build can be flagged as broken.
Use a Maven profile to turn the verify goal of the Maven Failsafe plugin on or off.
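One hedged way to sketch this (the profile wiring is illustrative and assumes a plain failsafe setup): keep integration-test and verify bound as usual, and let a nightly profile tell verify to tolerate failures:

    <!-- Failsafe runs the ITs and, in normal builds, fails on broken tests -->
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-failsafe-plugin</artifactId>
      <executions>
        <execution>
          <goals>
            <goal>integration-test</goal>
            <goal>verify</goal>
          </goals>
        </execution>
      </executions>
    </plugin>

    <!-- Hypothetical nightly profile: ITs still run, but failures don't break the build -->
    <profiles>
      <profile>
        <id>nightly</id>
        <properties>
          <maven.test.failure.ignore>true</maven.test.failure.ignore>
        </properties>
      </profile>
    </profiles>

Note that maven.test.failure.ignore also affects Surefire unit tests, so scoping it to a dedicated profile keeps regular builds strict.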