When running a custom TestEngine, can execution of JUnitTestEngine be suppressed? - junit5

I have created a custom TestEngine using the JUnit 5 (junit-platform-engine) framework.
The custom TestEngine registers using the ServiceLoader mechanism with an entry in META-INF/services/org.junit.platform.engine.TestEngine.
When I run my tests, this works well, but the tests get run a second time by the built-in JUnitTestEngine.
Is it possible to replace the default TestEngine in this circumstance instead of supplementing it?

After checking the JUnit 5 user guide and the documentation for the Maven Surefire plugin, it seems there's currently no way to filter out specific test engines with Maven :-(.
Using the console launcher, however, does let you choose test engines: https://junit.org/junit5/docs/current/user-guide/#running-tests-console-launcher-options. And so does Gradle: https://junit.org/junit5/docs/current/user-guide/#running-tests-build-gradle.
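For example, a rough sketch of both options, assuming the custom engine reports the ID my-custom-engine and the engine to suppress is junit-jupiter (adjust the IDs to whatever your engines actually return from getId(); the standalone console jar name is abbreviated here):
$ java -jar junit-platform-console-standalone.jar -cp target/test-classes --scan-classpath --include-engine=my-custom-engine
And the equivalent in build.gradle:
test {
    useJUnitPlatform {
        includeEngines 'my-custom-engine'
        // alternatively: excludeEngines 'junit-jupiter'
    }
}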

Related

Running a single feature file before and after a test run

I was looking for a solution to run a feature file at the end of the suite.
My workflow (in a parallel run):
1. karate.callSingle('Login.feature'), so at the beginning I do one login and then use the cookies/token for the whole suite
2. Run the tests in parallel
3. Run the Logout.feature file
There is no direct support for this currently. By the way, no one has ever requested this. If this is so important, kindly open a feature request.
One workaround is to set a singleton / Java static variable from callSingle and then in your JUnit / Java parallel runner, call the feature to logout using the Java API (search the docs for this) and you can pass arguments / access the static variable.
EDIT: just realized that the @AfterClass JUnit annotation may be more than sufficient for your needs.
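A rough sketch of that workaround, assuming the JUnit 4 runner and Karate's Runner.runFeature Java API (the class and method names and the ParallelRunner skeleton are written from memory, so double-check them against the docs for your Karate version):
import java.util.HashMap;
import java.util.Map;

import org.junit.AfterClass;

import com.intuit.karate.Runner;

public class ParallelRunner {

    // hypothetical static holder populated from callSingle / karate-config.js
    public static Map<String, Object> session = new HashMap<>();

    // ... the @Test method that triggers the parallel run goes here ...

    @AfterClass
    public static void logout() {
        Map<String, Object> args = new HashMap<>();
        args.put("token", session.get("token"));
        // run Logout.feature once, after every parallel scenario has finished
        Runner.runFeature("classpath:Logout.feature", args, true);
    }
}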

serenity jbehave multiple browsers

I am trying to set up a test project that uses Serenity and JBehave.
I am noticing that all the examples use a serenity.properties file that defines a browser in it.
I would like to structure my tests so that the same test can be executed in IE/Firefox/Chrome, etc.
How do I do this?
You can pass properties on the command line, so you can rerun the same tests with different browsers by passing in different settings for webdriver.driver, e.g.
$ mvn verify -Dwebdriver.driver=firefox
$ mvn verify -Dwebdriver.driver=chrome
etc.
I think you can get this to work by creating multiple JUnit test classes, each with its own driver, and executing them all in a single run.
Every test class can be assigned a specific 'managed' driver (e.g. PhantomJS, Chrome, Firefox). This is documented here: http://www.thucydides.info/docs/serenity/#_serenity_webdriver_support_in_junit
I don't know what impact this would have on the generated report; hopefully you are still able to identify the feature/driver combination.
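For reference, a minimal sketch of one such test class (the runner and annotation names are the current Serenity ones and may differ in older Thucydides releases; the driver value is just an example):
import net.serenitybdd.junit.runners.SerenityRunner;
import net.thucydides.core.annotations.Managed;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.openqa.selenium.WebDriver;

@RunWith(SerenityRunner.class)
public class SearchInChromeTest {

    // Serenity opens and closes this browser for you
    @Managed(driver = "chrome")
    WebDriver browser;

    @Test
    public void userCanSearch() {
        // steps / page objects using the managed driver go here
    }
}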

Griffon resource loading differences between run-app and test-app

I am fairly new to Griffon and have some experience with Grails.
I have a problem loading a file from the resources directory.
I am using Griffon version 1.4.0.
When I run griffon run-app the following code (inside a Service) to get the location of an XML file works fine:
URL resource = getResourceAsURL('schema.xsd')
assert resource != null : "schema cannot be located"
When I run griffon test-app, however, the same code produces an assertion error because the returned URL is null. Same behaviour with getResourceAsStream().
This happens in the unit test of said service.
I put the file in ./griffon-app/resources.
What am I doing wrong? Do I have to copy all resources from production to some test resources folder, or do I have to edit the build configuration?
Thanks in advance!
Edit: as suggested below, I filed a bug report in the griffon-projects issue tracker.
araxn1d is correct: running the tests in integration mode will give you the right answer, because the full application gets bootstrapped before tests are run. Now, running this kind of test (a unit test that depends on resources being available in the classpath) encounters a problem because the classpath is not set up correctly. Executing the following command
griffon -Dgriffon.cli.verbose=true test-app --unit --compileTrace=true
will output all classpaths. There you can see that the resources directory points to $USER_HOME/.griffon/1.4.0/projects/<project_name>/resources. If you inspect that directory you'll find the file you're looking for inside griffon-app/resources. This means the test classpath is not configured accurately: it should be $USER_HOME/.griffon/1.4.0/projects/<project_name>/resources/griffon-app/resources instead. This is clearly a bug, most likely in the $GRIFFON_HOME/scripts/_GriffonClasspath.groovy script. Could you please file a JIRA ticket for it at http://jira.codehaus.org/browse/griffon? Thanks!
You should run test-app to run your unit tests. In this case you should mock any references to real files; otherwise you should implement integration tests. Please see Griffon Testing. Integration tests differ from unit tests in that you have full access to the Griffon application within the test.
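As a rough illustration of the mocking approach in a plain unit test (MyService and the stubbed path are hypothetical; metaClass is ordinary Groovy metaprogramming, not a Griffon API):
void testSchemaCanBeLocated() {
    def service = new MyService()
    // stub the resource lookup so the unit test does not depend on the test classpath
    service.metaClass.getResourceAsURL = { String name ->
        new File('griffon-app/resources/schema.xsd').toURI().toURL()
    }
    URL resource = service.getResourceAsURL('schema.xsd')
    assert resource != null : "schema cannot be located"
}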

Integration Tests on Geb / Spock + Selenium Grid do not run in parallel

I have the following set-up: an integration tests project with a suite of tests written in Groovy/Geb + Spock, which run perfectly both with Selenium WebDriver and with Selenium Grid (RemoteWebDriver).
The problem is that no matter how much I try to tweak the "system", I can't get the tests to run in parallel (i.e. although I have 3 slaves [nodes] registered to the hub, only one of the slaves actually receives the requests). I've enforced maxSession=1 on the Selenium nodes and tried different combinations of parallel=classes|methods, threadCount and fork settings in the failsafe plugin configuration (pom.xml file).
I have the feeling that the problem lies somewhere between the maven configuration and selenium grid, probably in relation to Geb/Spock config.
Does any of you have any insight on this issue?
PS: someone suggested that running tests in parallel using Geb / Spock is not possible, because for some reason Geb locks the JUnitRunner (not sure what this means).
Add the following configuration to your build.gradle file:
tasks.withType(Test) {
    maxParallelForks = 3           // run up to three test JVMs in parallel
    forkEvery = 1                  // start a fresh JVM for every test class
    include '**/*TestName*.class'  // adjust to match your test class names
}
There are test frameworks, TestNG for example, that support parallel testing on the method level out of the box.
Spock, as an example to the contrary, does not support it.
But you do not have to have multithreading implemented by your test framework to make this work.
You can use your build tool to run test classes in parallel, both Maven and Gradle support this.
If you are using Maven, this documentation page and examples might help you:
https://maven.apache.org/surefire/maven-surefire-plugin/examples/fork-options-and-parallel-execution.html
Specifically have a look at "Forked Test Execution".
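For illustration, the forked execution described there might look roughly like this in the failsafe (or surefire) plugin configuration; the forkCount value is arbitrary:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-failsafe-plugin</artifactId>
  <configuration>
    <!-- run up to three forked JVMs at once, one fresh JVM per test class -->
    <forkCount>3</forkCount>
    <reuseForks>false</reuseForks>
  </configuration>
</plugin>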
You can run it for sure; the point is you have to put them (your tests) in threads. Here is the link.

Maven reporting plugins do not execute if a unit test failure occurs

None of the plugins in the reporting section of Maven execute if there is a unit test failure.
I found that I can set maven.test.failure.ignore=true (see http://jira.codehaus.org/browse/SUREFIRE-247). The problem with this approach is that now our Hudson builds are successful even if there are unit test failures.
What I would really like to do is set the reporting plugin maven-surefire-report-plugin to run with the build plugins on a phase, but I can't get this to work.
Any idea on how to get the Maven reporting plugins to execute if a unit test failure occurs?
First, run mvn test or mvn install. Then, if the tests fail, run the following goal to generate the reports for the test results executed above: mvn -Dmaven.test.skip=true surefire-report:report
In the link you posted:
With the latest version (2.1.2), I get a message saying that "There are some test failure," but I get no reports anywhere whether or not I specify that variable, or whether or not I specify "testFailureIgnore" in the plugin config. I got the reports fine with 2.0, but not with 2.1.2.
Do you need version 2.1 or can you work with a 2.0 version of Maven?
The error you see with 2.1.2 is because of the forkMode settings, which you need to configure in the plugin.
Set forkMode=never and try it (I suspect there might be a problem in your useSystemClassLoader property).
Otherwise, use maven-surefire-plugin version 2.5, which should definitely work and generate Surefire reports even if a few tests fail.
Use the surefire-report:report-only goal if the reports have already been generated after execution.
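A sketch of the forkMode settings described above (forkMode is the old 2.x parameter name; the version and values here are only examples):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <version>2.5</version>
  <configuration>
    <!-- run the tests in the Maven process instead of a forked JVM -->
    <forkMode>never</forkMode>
    <!-- the property suspected above; false is only an example value -->
    <useSystemClassLoader>false</useSystemClassLoader>
  </configuration>
</plugin>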
I had the same issue and it is due to a wrong call to the report plugin.
The correct Maven command to run is: mvn surefire-report:report
This will run the test phase by itself, and if it fails it will generate the report anyway.
Check the documentation:
http://maven.apache.org/surefire/maven-surefire-report-plugin/report-mojo.html
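If you also want the same report generated as part of mvn site, the plugin can additionally be declared in the pom's reporting section (standard Maven configuration, shown here only for completeness):
<reporting>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-surefire-report-plugin</artifactId>
    </plugin>
  </plugins>
</reporting>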
Hope this helps!! :D