Generate Serenity aggregate report when tests are executed with cucumber-jvm-parallel-plugin - serenity-bdd

Since Serenity doesn't support parallel execution out of the box, I'm using the cucumber-jvm-parallel-plugin. After the tests are executed successfully, I get the following files in my target/failsafe-reports directory:
failsafe-summary.xml
Parallel01IT.txt
Parallel02IT.txt
Parallel03IT.txt
TEST-Parallel01IT.xml
TEST-Parallel02IT.xml
TEST-Parallel03IT.xml
After I run mvn serenity:aggregate I get this:
[INFO] Generating test results for 0 tests
[INFO] 2 requirements loaded after 80 ms
[INFO] 2 related requirements found after 80 ms
[INFO] Generating test outcome reports: false
[INFO] Starting generating reports: 92 ms
[INFO] Configured report threads: 40
[INFO] Test results for 0 tests generated in 352 ms
For some reason the report aggregator does not seem to find the result files. If I run the tests sequentially, the report works just fine, even though the results are stored in the same directory.
I also tried setting the sourceDirectory in the report plugin, but to no avail.
Is there some configuration option I am missing? Or is it simply not possible to generate the report when using the parallel plugin?

The actual runners generated by the plugin did not use CucumberWithSerenity. I created a custom template based on https://github.com/temyers/cucumber-jvm-parallel-plugin/blob/master/src/main/resources/cucumber-junit-runner.java.vm and set the path to it with <customVmTemplate>src/test/resources/cucumber-custom-runner.vm</customVmTemplate>.
Afterwards, the report is generated successfully.
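For illustration, a runner generated from such a template should end up looking roughly like this (a sketch: the class name, feature path, and glue package are placeholders, not taken from the question; the key point is running with CucumberWithSerenity so that Serenity records the outcomes serenity:aggregate later reads):
import cucumber.api.CucumberOptions;
import net.serenitybdd.cucumber.CucumberWithSerenity;
import org.junit.runner.RunWith;

// Generated per feature by the parallel plugin; CucumberWithSerenity replaces
// the plain Cucumber JUnit runner so Serenity test outcomes are written.
@RunWith(CucumberWithSerenity.class)
@CucumberOptions(
        features = {"classpath:features/parallel01.feature"}, // injected by the plugin
        glue = {"com.example.steps"}                          // placeholder glue package
)
public class Parallel01IT {
}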

How to generate two separate Serenity reports for two separate Cucumber tests using the same runner class?

I have two Cucumber feature files:
1) Feature 1
2) Feature 2
I have one CucumberWithSerenity runner class.
When I run both feature files together using the same runner class, report generation fails with the error below.
FAILURE: Build failed with an exception.
What went wrong:
Execution failed for task ':runTests'.
Multiple build operations failed.
Could not write XML test results
It's a Gradle-based project, and the task fails.
My observations:
Since Serenity saves the results/screenshots for both feature files in the same directory (/target/site/), it fails to generate two separate reports at run time. Please can I get some help on how to generate two separate reports for both feature files using the same runner class?
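One direction worth sketching (not a confirmed fix; the task names, runner class name, and paths below are all placeholders): invoke the shared runner once per feature and give each run its own Serenity output directory, so the two reports no longer overwrite each other:
// build.gradle sketch: same runner class, filtered to one feature per task via
// the cucumber.options override, each writing to its own output directory.
task runFeature1(type: Test) {
    include '**/MyRunner.class'
    systemProperty 'cucumber.options', 'classpath:features/feature1.feature'
    systemProperty 'serenity.outputDirectory', "$buildDir/serenity/feature1"
}
task runFeature2(type: Test) {
    include '**/MyRunner.class'
    systemProperty 'cucumber.options', 'classpath:features/feature2.feature'
    systemProperty 'serenity.outputDirectory', "$buildDir/serenity/feature2"
}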

JMeter plugin for TeamCity - TeamCity shows all JMeter tests failed, but in the log they all passed

I added the JMeter plugin for TeamCity according to https://www.blazemeter.com/blog/how-run-jmeter-tests-teamcity-continuous-integration
After running the tests, TeamCity reports "Tests failed: 13, passed: 0", but according to the log all the tests passed: "Generate Summary Results = 13 in 00:00:03 = 4.4/s Avg: 205 Min: 23 Max: 1377 Err: 0 (0.00%)".
How can I configure TeamCity to show the correct results?
Thanks!
I also had this problem and found a solution after reading these threads:
https://stackoverflow.com/a/52935009/5210267 and
https://github.com/jtorgan/jmeter_plugin/issues/24#issuecomment-421016226
The plugin expects the "success" column to be at an exact position in the output file. For me it worked when "success" was the 4th column (more details in the threads linked above).
You can achieve this by turning off columns in the report file (in jmeter.properties or user.properties), for example:
jmeter.save.saveservice.response_message=false
jmeter.save.saveservice.thread_name=false
jmeter.save.saveservice.data_type=false
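Equivalently, the same overrides can be passed on the JMeter command line with -J (a usage sketch; the test plan and result file names are placeholders):
jmeter -n -t testplan.jmx -l results.jtl \
  -Jjmeter.save.saveservice.response_message=false \
  -Jjmeter.save.saveservice.thread_name=false \
  -Jjmeter.save.saveservice.data_type=false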
But once I reached a working configuration, generating the HTML report with the -e -o Report options stopped working.
So I just went to the "Build Features" settings, turned off the "assertions" checkbox, and added my own build failure conditions.

SonarQube - integrationTest.exec - sonarRunner (Gradle) or "sonar-runner" command - showing 0.0% coverage

I'm successfully generating two .exec files with JaCoCo in the build/jacoco folder after running a Gradle-based build and integration tests.
Gradle command:
"gradle clean build integrationTest"
Once done, it generates the following .exec files under the build/jacoco folder.
test.exec
integrationTest.exec
Following is my sonar-project.properties file. When I run "sonar-runner" from the Linux prompt it completes, but on the SonarQube dashboard for this project the unit test widget shows some 34.5% while integration tests show 0.0%. Both .exec files have a valid size. I also ran "cat" on the .exec files and piped the output to the "strings" command in Linux, and saw that integrationTest.exec did hit the test functions - I have only one .java file.
When I run "gradle clean build integrationTest sonarRunner -Dxxx.xxx=yyy -Dyyy.xx=zzz", i.e. passing all the Sonar variables mentioned in the sonar-project.properties file using the -D option, it works but gives the same result on the project's SonarQube dashboard. The project's Sonar dashboard has both widgets configured for unit / integration tests, and I'm including IT tests in the overall coverage, which shows 34.5% (the unit test value). Sonar does see test.exec and integrationTest.exec, and also auto-generates an overall-xxx.exec file during this operation.
NOTE: Nowhere - neither while starting Tomcat on a separate putty/Linux console nor within the Gradle build script - am I providing any value or setting a Java agent for JaCoCo. I'm already getting integrationTest.exec and test.exec, so I'm not sure whether the JVM needs to be stopped once the IT tests finish running. I don't think I need that, as the .exec files have a valid size.
My question:
- Why is Sonar not showing IT coverage on the dashboard, even though I'm setting/passing the following variable correctly:
sonar.jacoco.itReportPath=build/jacoco/integrationTest.exec
-bash-3.2$ cat sonar-project.properties
# Root project information
sonar.projectKey=com:company:product:ProjectA
sonar.projectName=ProjectA
sonar.projectVersion=1.0
# optional description
sonar.projectDescription=ProjectA Service
#Tells SonarQube that the code coverage tool by unit tests is JaCoCo
sonar.java.coveragePlugin=jacoco
#Tells SonarQube to reuse existing reports for unit tests execution and coverage reports
sonar.dynamicAnalysis=reuseReports
# Some properties that will be inherited by the modules
sonar.sources=src/java,test/java,src/java-test
# Sonar Unit Test Report path
sonar.jacoco.reportPath=build/jacoco/test.exec
# Sonar Integration Test Report Path
sonar.jacoco.itReportPath=build/jacoco/integrationTest.exec
sonar.junit.reportsPath=build/UT/results
# Sonar Binaries
sonar.binaries=build/classes/main
Narrowing down the cause: I think it's due to the .exec file for the integration tests. To prove it, I passed the UT .exec file to both report-path variables, as below, and SonarQube picked up both UT and IT coverage. This proves that if the .exec file for the IT tests is good (which I think it is, but I need to double-check), Sonar will pick it up and show a valid coverage percentage instead of 0.0%. Note: the following is just to prove whether Sonar picks up the values or not; the itReportPath variable should use the .exec file generated for the integration tests by JaCoCo.
sonar.jacoco.reportPath=build/jacoco/test.exec
# Sonar Integration Test Report Path
#sonar.jacoco.itReportPath=build/jacoco/testintegrationTest.exec
sonar.jacoco.itReportPath=build/jacoco/test.exec
OK, found the issue. I was running the integrationTest task in Gradle and was NOT attaching jacocoagent.jar (as per the JaCoCo documentation) to the target JVM (the Tomcat instance). Once I did that, I removed the jacoco { ... } section from the integrationTest task in Gradle (in build.gradle or a GRADLE_HOME/init.d/some.common.gradle file), since that attaches the JaCoCo agent to the JVM in which Gradle itself runs. With jacocoagent.jar attached to Tomcat's JVM (via the line below, which I added to Tomcat's startup.sh script and to the command that starts Tomcat), I then ran the Gradle integrationTest task to run the IT tests.
PROJ_EXTRA_JVM_OPTS=-javaagent:tomcat/jacocoagent.jar=destfile=build/jacoco/IT/jacocoIT.exec,append=false
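For illustration, this variable then has to reach the java command that launches Tomcat. In a stock startup script that could look like the sketch below (JAVA_OPTS as the carrier is an assumption; your script may use CATALINA_OPTS or its own variable):
# Sketch: hand the JaCoCo agent options to Tomcat's JVM at startup.
export JAVA_OPTS="$JAVA_OPTS $PROJ_EXTRA_JVM_OPTS"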
Then, while Gradle was in progress, the tests ran and I got a file (jacocoIT.exec at the given location) with some file size, BUT this was not yet the final one. I had to stop the Tomcat session/JVM instance by running Tomcat's stop.sh script. Once Tomcat was stopped, the jacocoIT.exec file size increased significantly; this was the valid, final jacocoIT.exec file (which the sonarRunner Gradle task or the sonar-runner executable needs in order to pick up and push IT code coverage data to the project's Sonar dashboard). Once done, I got both UT + IT and the combined code coverage.
sonar.jacoco.reportPath=build/jacoco/UT/jacocoUT.exec
sonar.jacoco.itReportPath=build/jacoco/IT/jacocoIT.exec
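For reference, the Gradle side after this change might look roughly like the following sketch (assuming the jacoco plugin is applied; the task definition and include pattern are illustrative, not from the original build):
task integrationTest(type: Test) {
    include '**/*IT.class'
    // No in-process agent: coverage is collected by the agent attached to
    // Tomcat's JVM, which finalizes build/jacoco/IT/jacocoIT.exec on shutdown.
    jacoco {
        enabled = false
    }
}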

Configuration for "SeleniumHQ htmlSuite Run" in Jenkins to run Selenium HTML TestSuite

I want to run my Selenium HTML test suite through Jenkins (continuous integration). The following shows how the build is configured for the current project:
[screenshot of the build step configuration]
And here's the console output after committing a new test, for example:
ERROR: The suiteFile is not a file or an url ! Check your build configuration.
Build step 'SeleniumHQ htmlSuite Run' changed build result to FAILURE
Build step 'SeleniumHQ htmlSuite Run' marked build as failure
Publishing Selenium report...
Finished: FAILURE
In fact, I get these errors even after committing both extensionless test files AND .html files.
The SeleniumHQ Jenkins plugin supports only ONE suite file per build step. Try out Selunit to run Selenese suites in batch and across multiple browsers. This tutorial shows how to set up the test execution in Jenkins/Hudson.
Your suiteFile is written with a wildcard, as tests/selenium/*.html. I think that is wrong.
You need to provide the exact/absolute path to your suite, without the wildcard, as below:
tests/selenium/suite.html

How can I speed up my maven2 build?

I'm using a local Artifactory to proxy requests, but the build and test phases are still a bit slow. It's not the actual compilation and tests that are slow; it's the "warmup" of the maven2 framework. Any ideas?
There are some possibilities for optimizing some of the build tasks. For example, the 'clean' task can be cut from minutes to just milliseconds with a simple trick: rename the 'target' folder instead of deleting it.
For details on how to do it, refer to Speed up Maven build.
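As a sketch of that trick (illustrative shell commands, not taken from the linked article): the rename is a single filesystem operation, and the old folder can be deleted in the background while the build runs:
mv target target.to-delete && (rm -rf target.to-delete &)
mvn install   # proceeds against a fresh target/ without running 'clean'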
I don't know what version of Maven you are using - I assume 2 - but I'll share what I use with Maven 1.x to speed things up and make builds a tiny bit quicker.
These fork the JUnit tests into a new process (which also helps when you use environment variables in tests, etc.) and give the tests a little more memory:
-Dmaven.junit.fork=true
-Dmaven.junit.jvmargs=-Xmx512m
This forks the compilation, which might speed things up for you:
-Dmaven.compile.fork=true
I hope this helps a little; try it out.
Also refer to "get more speed with your maven2 build".
If you are using Maven 3 (check with mvn -version), you can also follow this guide. In my case, the results are:
Normal execution:
$ mvn clean install
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 03:05 min
[INFO] Finished at: 2015-07-15T11:47:02+02:00
[INFO] Final Memory: 88M/384M
With Parallel Processing (4 threads):
$ mvn -T 4 clean install
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:22 min (Wall Clock)
[INFO] Finished at: 2015-07-15T11:50:57+02:00
[INFO] Final Memory: 80M/533M
With Parallel Processing (2 threads per core):
$ mvn -T 2C clean install
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:12 min (Wall Clock)
[INFO] Finished at: 2015-07-15T12:00:29+02:00
[INFO] Final Memory: 87M/519M
[INFO] ------------------------------------------------------------------------
As we can see, the difference is almost a minute - roughly a 20-30% speed improvement.
Adjust memory configuration to the optimum, e.g. add this line to mvn.bat:
set MAVEN_OPTS=-Xmx512m -XX:MaxPermSize=256m
The clean phase of mvn normally deletes the target folder. If we rename the target folder instead, the clean phase becomes much faster.
-Dmaven.test.skip=true skips compiling and running the tests.
Add -Denforcer.skip=true to the mvn command line (the enforcer checks versions of Maven, the JDK, etc.; we can skip it after the initial runs).
Disable non-critical operations during the build: analysis, javadoc generation, source packaging. This can save a lot of time; see the example command after this list.
Spawning new processes also helps:
-Dmaven.junit.fork=true (forks the JUnit tests into a new process)
-Dmaven.compile.fork=true (forks the compilation)
Hope it helps.
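As a concrete illustration of skipping non-critical plugins (a sketch: which flags apply depends on the plugins your build actually binds, but these are the standard skip properties of the javadoc, checkstyle, and PMD plugins):
mvn install -Dmaven.javadoc.skip=true -Dcheckstyle.skip=true -Dpmd.skip=true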
You can use -DskipTests=true to skip running the unit tests, which speeds up builds.
I've found that parsing reactor projects is significantly slower than single-POM projects. If your build is a reactor (multi-module) build and your developers are not working on all modules at the same time, you can remove the parent POM and build the modules separately, resolving dependencies via the local repository. The disadvantage is that you need to install or deploy a module for its dependents to see the changes.
Also, you might want to look at the new Maven 2.1 M1, which contains some significant speed improvements.
If none of this helps, post more details about your project configuration (module structure and plugins), command-line parameters, and hardware (memory and disk). Running Maven with -X might also show where it is spending its time.
I'd use a locally installed Nexus.
Initially, you should get a finer-grained analysis of your build times, using something like this, and identify the candidates that take the most time.
Are the tests spinning up an H2 database per test? Is downloading external jar files taking the time? This will guide where to focus your investigation. Just applying go-fast flags doesn't usually work, as they would already have been included by default, and you don't want to sacrifice your tests with skip flags.