log4j properties files based on leiningen test metadata?

How can I use different log4j properties files based on Leiningen test metadata? I have functions that write debug logging output to a file. Often there is a lot of data being written to this debug log file, slowing down the function. Normal runs of the application will not write the debug file, so I want to benchmark the normal running of the function without that file writing. For benchmarking, I am using criterium. Let's assume that the metadata for benchmarking deftest definitions is :benchmark.
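For context, a :benchmark-tagged test might look like the following sketch; quick-bench is criterium's entry point, while my-fn and test-input are placeholders standing in for the real function and arguments:
(ns myproject.core-bench-test
  (:require [clojure.test :refer [deftest]]
            [criterium.core :refer [quick-bench]]))

;; The ^:benchmark metadata lets a test selector include or exclude this test
(deftest ^:benchmark my-fn-benchmark
  (quick-bench (my-fn test-input)))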

The trouble with doing this based on test metadata is that all tests are run in a single JVM instance, and modifying the Log4j configuration on the fly within a JVM is not exactly easy. Instead, I would set up profiles in your project.clj to disable the :benchmark tests by default, and a separate profile for running benchmarks. Assuming that your :resource-paths include a debug-level log4j.properties file, your benchmark profile can then set up the classpath or system properties as appropriate to use a different file. For example:
(defproject myproject
  ...
  :test-selectors {:default (complement :benchmark)}
  :profiles {:benchmark {:test-selectors {:default :benchmark}
                         :jvm-opts ["-Dlog4j.configuration=log4j-benchmark.properties"]}})
You could then run the benchmarks with:
> lein with-profile +benchmark test
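For completeness, the log4j-benchmark.properties file could simply raise the log threshold and omit the debug file appender so benchmarked runs never touch disk. A minimal sketch (the appender name and pattern are illustrative, not from the question):
# Console only - no debug file appender for benchmark runs
log4j.rootLogger=WARN, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d %-5p %c - %m%n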

Related

Spring Profiles in combination with ConfigServer

I have a very basic Spring Boot Config Server (just added the dependency and annotated the main class with @EnableConfigServer).
In general I would like to support multiple environments with different property sources for each of my applications. Here is the example of the ConfigServer itself:
Profile: default (application.yml on classpath)
Profile: docker (application-docker.yml on classpath)
Profile: default (application.yml in repository of ConfigServer)
So in my case, all of the properties from all three of these sources should be active; I'd expect the order/priority as follows:
application.yml from classpath
application-ANY_PROFILE.yml from classpath
application.yml from config repo
APP-NAME.yml from config repo (does not exist in this case)
So far this works flawlessly, except for the issue that my application-docker.yml on the classpath is being ignored when I start the application with the following command (inside the container, of course):
java -jar -Dspring-boot.run.profiles=docker *.jar
My question is: even though I provide the profile as a command line argument, it's not being picked up. Why is that?
UPDATE: here is the Dockerfile and entrypoint.sh:
To activate one or more profiles, do one of the following:
Activate using the VM parameter -Dspring.profiles.active=<profiles>
Activate using program arguments --spring.profiles.active=<profiles>
(The -Dspring-boot.run.profiles property in your command belongs to the Spring Boot Maven plugin, so the JVM simply ignores it here.) Following your example, the following should work:
java -jar -Dspring.profiles.active=docker *.jar
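Equivalently, the profile can be passed as a program argument, in which case it must come after the jar reference so Spring Boot receives it as an application argument:
java -jar *.jar --spring.profiles.active=docker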

JMeter Test Results Monitoring / Analysis

I want to start load testing by running JMeter from the command line for more accurate test results, but how can I monitor the run and then analyze the results after the test finishes?
You can generate a JTL (JMeter results) file while executing the JMX (JMeter script) file from the command line. A sample command for generating the JTL file looks like this:
jmeter -n -t path-to-jmeterScript.jmx -l path-to-jtlFile.jtl
After the script execution completes, you can open the JMeter GUI and simply open the JTL file in any listener (as per your requirement).
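Recent JMeter versions (3.0 and later) can also build an HTML dashboard report directly from a saved JTL file instead of loading it into the GUI:
jmeter -g path-to-jtlFile.jtl -o path-to-report-folder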
Most of the listeners in JMeter have an option to save the results into a file. This file usually contains not the report itself, but the samples generated by the tests. If you define this filename, you can generate the reports from these saved files. For example, see http://jmeter.apache.org/usermanual/component_reference.html#Summary_Report .
If you run JMeter in command-line non-GUI mode, passing a results file name via the -l parameter, it will output results there. After the test finishes you will be able to open the file with the Listener of your choice and perform the analysis.
By default JMeter writes results in chunks; if you need to monitor them in real time, add the following line to the user.properties file (which lives under the /bin folder of your JMeter installation):
jmeter.save.saveservice.autoflush=true
You can use other properties whose names start with jmeter.save.saveservice.* to control which metrics you need to store. The list with default values can be seen in the jmeter.properties file. See the Apache JMeter Properties Customization Guide for more information on the various JMeter property types and ways of working with them.
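As an illustration, a user.properties fragment combining autoflush with a couple of the saveservice switches might look like this (the property names are documented in jmeter.properties; the values shown are just an example):
# Flush results to disk as they arrive
jmeter.save.saveservice.autoflush=true
# Omit response bodies to keep the JTL file small
jmeter.save.saveservice.response_data=false
# Record the number of active threads with each sample
jmeter.save.saveservice.thread_counts=true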
You can also consider running your JMeter test via the Taurus tool - it provides live statistics as the test goes, either in console mode or via a web interface.
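Assuming Taurus is installed (its CLI is called bzt), an existing JMX script can be executed with live console statistics and an optional online report:
bzt path-to-jmeterScript.jmx -report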

Arquillian files 'in classpath' even though not defined in WebArchive when running tests

Testing with Arquillian 1.9.final and the TOMCAT-EMBED-7 container, I'm getting questionable results when creating a WebArchive for testing.
In /src/main/resources, I have several configuration files that I do not want to use when running the integration tests; instead I want to provide named ones stored in /src/embed-itest/resources.
org.jboss.shrinkwrap.api.Filter x = Filters.exclude(".*Test.*|.*xml|.*properties");
WebArchive webArchive = ShrinkWrap
    .create(WebArchive.class, "mytest.war")
    .addPackages(true, x, "com.myapp")
    // ...and some other additions
Then at the end of the ShrinkWrap process, I add the specific test files I want to use:
File n = new File("src/embed-itest/resources/test-log4j.properties");
webArchive.addAsResource(n,"log4j.properties");
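As a sanity check, a ShrinkWrap archive can print its own content tree; toString(true) is part of the standard Archive API:
System.out.println(webArchive.toString(true)); // verbose listing of every entry in the war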
However, the behavior is still as though it were using /src/main/resources/log4j.properties. I've verified that the _DEFAULT_DEFAULT_mytest.war really does have the test-log4j.properties content as log4j.properties, but when running the tests the behavior is that of /src/main/resources/log4j.properties (and this is true for other configuration files, such as camelContext.xml, which I've tried to override).
Does anyone have some insight, please? I was hoping to leverage the ability to create a custom WebArchive with specific files in the archive to test more precisely, but the actual behavior seems to be as if the 'standard' war were used, limiting what I thought was a great capability of Arquillian.
I think the problem is that you are using the Tomcat embedded approach, which means your tests share a JVM with your Tomcat instance, so the original test classpath (including /src/main/resources) remains visible to the deployed application. I suggest you try managed or remote mode.
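For illustration, moving to the managed adapter means pointing Arquillian at an external Tomcat installation in arquillian.xml. A sketch, assuming the tomcat-managed-7 adapter with its catalinaHome and bindHttpPort configuration properties:
<arquillian xmlns="http://jboss.org/schema/arquillian">
  <container qualifier="tomcat-managed" default="true">
    <configuration>
      <!-- Local Tomcat 7 install; Arquillian starts and stops it in a separate JVM -->
      <property name="catalinaHome">/path/to/apache-tomcat-7</property>
      <property name="bindHttpPort">8888</property>
    </configuration>
  </container>
</arquillian>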

SonarQube - integrationTest.exec - sonarRunner (Gradle) or "sonar-runner" command - showing 0.0% coverage

I'm successfully generating two .exec files with JaCoCo under the "build/jacoco" folder after running a Gradle-based build and integration tests.
Gradle command:
"gradle clean build integrationTest"
Once done, it generates the following .exec files under build/jacoco folder.
test.exec
integrationTest.exec
Following is my sonar-project.properties file. When I run "sonar-runner" from the Linux prompt it completes, but on the SonarQube dashboard for this project I see that unit tests show some 34.5% while integration tests show 0.0%. Both .exec files have a valid size. I also ran "cat" on the .exec files and piped the output to the "strings" command in Linux, and saw that integrationTest.exec did hit the test functions - I have only one .java file.
When I run "gradle clean build integrationTest sonarRunner -Dxxx.xxx=yyy -Dyyy.xx=zzz" i.e. by passing all the sonar variable as mentioned in the sonar-project.properties file using -D option, it works but same result on SonarQube project's dashboard. Project's sonar dashboard has both widgets configured for Unit / Integration Tests and I'm including IT tests for showing Overall coverage. Overall coverage is showing 34.5% (which is Unit test % value). Sonar does see test.exec, integrationTest.exec and also auto generates overall-xxx.exec file as well during this operation.
NOTE: Nowhere - neither while starting Tomcat in a separate PuTTY / Linux console, nor within the Gradle build script - am I providing any value or setting a Java agent for JaCoCo. I'm already getting the integrationTest.exec and test.exec files, so I'm not sure whether the JVM needs to be stopped once the IT tests finish running. I don't think I need that, as the .exec files have a valid size.
My question:
Why is Sonar not getting IT coverage on the dashboard even though I'm setting / passing the following variable correctly:
sonar.jacoco.itReportPath=build/jacoco/integrationTest.exec
-bash-3.2$ cat sonar-project.properties
# Root project information
sonar.projectKey=com:company:product:ProjectA
sonar.projectName=ProjectA
sonar.projectVersion=1.0
# optional description
sonar.projectDescription=ProjectA Service
#Tells SonarQube that the code coverage tool by unit tests is JaCoCo
sonar.java.coveragePlugin=jacoco
#Tells SonarQube to reuse existing reports for unit tests execution and coverage reports
sonar.dynamicAnalysis=reuseReports
# Some properties that will be inherited by the modules
sonar.sources=src/java,test/java,src/java-test
# Sonar Unit Test Report path
sonar.jacoco.reportPath=build/jacoco/test.exec
# Sonar Integration Test Report Path
sonar.jacoco.itReportPath=build/jacoco/integrationTest.exec
sonar.junit.reportsPath=build/UT/results
# Sonar Binaries
sonar.binaries=build/classes/main
Narrowing down the cause: I think it's due to the .exec file for the integration tests. To prove it, I passed the UT exec file to both report paths in the Sonar variables, i.e. the following, and SonarQube picked up both UT and IT test coverage. This proves that if the .exec file for the IT tests is good (which I think it is, but I need to double-check), then Sonar will pick up the .exec file and show a valid coverage % instead of 0.0%. Note: the following is just to prove whether Sonar picks up the values or not; the itReportPath variable should use the .exec file generated for the integration tests by JaCoCo.
sonar.jacoco.reportPath=build/jacoco/test.exec
# Sonar Integration Test Report Path
#sonar.jacoco.itReportPath=build/jacoco/testintegrationTest.exec
sonar.jacoco.itReportPath=build/jacoco/test.exec
OK, found the issue. I was running the integrationTest task in Gradle and was NOT attaching jacocoagent.jar (as per the JaCoCo documentation) to the target JVM (the Tomcat instance). Once I did that, I removed the jacoco { ... } section from the integrationTest task in Gradle (in build.gradle or a GRADLE_HOME/init.d/some.common.gradle file), as that attaches the JaCoCo agent to the JVM in which Gradle itself is running. With jacocoagent.jar attached to Tomcat's JVM (per the line below, which I added to Tomcat's startup.sh script and appended to the command that starts Tomcat), I then ran the Gradle integrationTest task to run the IT tests.
PROJ_EXTRA_JVM_OPTS=-javaagent:tomcat/jacocoagent.jar=destfile=build/jacoco/IT/jacocoIT.exec,append=false
While Gradle was in progress the tests ran, and I got a file (jacocoIT.exec at the given location) with some file size, BUT this was not yet the final one. I had to stop the Tomcat session/JVM instance by running Tomcat's stop.sh script. Once Tomcat was stopped, I saw the jacocoIT.exec file size increase significantly; this was the valid, final jacocoIT.exec file (which I needed for the sonarRunner Gradle task or the sonar-runner executable to pick up and successfully push IT code coverage data to the project's Sonar dashboard). Once done, I got both UT + IT and the combined code coverage.
sonar.jacoco.reportPath=build/jacoco/UT/jacocoUT.exec
sonar.jacoco.itReportPath=build/jacoco/IT/jacocoIT.exec
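For reference, the startup.sh wiring amounts to appending that variable to the options Tomcat's JVM is launched with; a sketch using JAVA_OPTS, Tomcat's standard hook for extra JVM arguments:
# Attach the JaCoCo agent to the Tomcat JVM (variable from the answer above)
PROJ_EXTRA_JVM_OPTS=-javaagent:tomcat/jacocoagent.jar=destfile=build/jacoco/IT/jacocoIT.exec,append=false
export JAVA_OPTS="$JAVA_OPTS $PROJ_EXTRA_JVM_OPTS"
Remember that the agent writes the complete coverage data only on JVM shutdown, which is why stopping Tomcat was needed before the file was final.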

Griffon resource loading differences between run-app and test-app

I am fairly new to Griffon and have some experience with Grails.
I have a problem loading a file from the resources directory.
I am using Griffon version 1.4.0.
When I run griffon run-app the following code (inside a Service) to get the location of an XML file works fine:
URL resource = getResourceAsURL('schema.xsd')
assert resource != null : "schema cannot be located"
When I run griffon test-app, however, the same code produces an assertion error because the returned URL is null. The same behaviour occurs with getResourceAsStream().
This happens in the unit test of said service.
I put the file in ./griffon-app/resources.
What am I doing wrong? Do I have to copy all resources from production to some test resources folder, do I have to edit the build-configuration?
Thanks in advance!
Edit: as suggested below, I filed a bug report in the griffon-projects issue tracker.
araxn1d is correct: running the tests in integration mode will give you the right answer, because the full application gets bootstrapped before the tests are run. Running this kind of test (a unit test that depends on resources being available in the classpath) hits a problem because the classpath is not set up correctly. Executing the following command
griffon -Dgriffon.cli.verbose=true test-app --unit --compileTrace=true
will output all classpaths. There you can see that the resources directory points to $USER_HOME/.griffon/1.4.0/projects/<project_name>/resources. If you inspect that directory you'll find the file you're looking for inside griffon-app/resources. This means the test classpath is not accurately configured, as it should be $USER_HOME/.griffon/1.4.0/projects/<project_name>/resources/griffon-app/resources instead. This is clearly a bug, most likely in the $GRIFFON_HOME/scripts/_GriffonClasspath.groovy script. Could you please file a ticket in JIRA (http://jira.codehaus.org/browse/griffon)? Thanks!
You should run test-app to run your unit tests. In that case you should mock any references to real files; otherwise you should implement integration tests. Please see Griffon Testing. Integration tests differ from unit tests in that you have full access to the Griffon application within the test.
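As an illustration of the mocking suggestion, a unit test could stub the service's resource lookup through Groovy's metaClass so it never hits the real classpath; a sketch, where the method name comes from the question and the test resource path is an assumption:
// Stub resource resolution on this service instance only
service.metaClass.getResourceAsURL = { String name ->
    new File("src/test/resources/$name").toURI().toURL()
}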