ScalaTest and Maven: getting started

I have a Maven/Java project I've been working on for years, and I wanted to take the Java Posse's advice and start writing my tests in Scala. I've written a few tests following ScalaTest's JUnit 4 quick start, and now I want these tests to be executed when running "mvn test". How should I do this? What should I put into my pom.xml so that the tests in src/test/scala run side by side with my old JUnit 4 tests?
Cheers
Nik
PS, yes, I've been Googling, but all I could find on the topic were some pre-v1.0 suggestions that I couldn't get working
PPS, bonus question: how can I run these tests one at a time by right-clicking them in Eclipse/STS and selecting "Debug As... ScalaTest" or something similar, the way I've so far used "Debug As... JUnit Test"?
PPPS, I expect the answer has changed since July '09?

The second answer in one of the questions you linked to SHOULD work:
Is there a Scala unit test tool that integrates well with Maven?
You annotate your tests with a JUnit @RunWith annotation and give it the ScalaTest JUnitRunner: http://www.artima.com/docs-scalatest-2.0.RC3/#org.scalatest.junit.JUnitRunner
If your tests also follow the naming conventions Maven's Surefire plugin enforces (by default it picks up classes named Test*, *Test, or *TestCase), this should work fine.
Note: it doesn't matter which ScalaTest trait you use; all of them should work. If they don't, and Bill Venners doesn't answer this question, contact him on the ScalaTest mailing list.
Other note: you can run such test suites in Eclipse using the normal JUnit plugin, but you can't run single tests, since the plugin expects to deduce a method name from the test name, which doesn't work with all types of ScalaTest tests.
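For example, a minimal suite wired up this way might look like the sketch below (class and test names are made up; it assumes ScalaTest and JUnit 4 are on the test classpath and that your POM already compiles src/test/scala, e.g. via the scala-maven-plugin's testCompile goal):

import org.junit.runner.RunWith
import org.scalatest.FunSuite
import org.scalatest.junit.JUnitRunner

// Runs as a plain JUnit 4 test, so maven-surefire executes it
// during "mvn test" right next to the existing Java tests.
@RunWith(classOf[JUnitRunner])
class AccountTest extends FunSuite {

  test("deposit increases the balance") {
    val balance = 10 + 5
    assert(balance === 15)
  }
}

Because the class name ends in Test, Surefire's default includes pick it up without any extra configuration.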

Related

How do I separate gradle unit tests from integration tests in the same source set?

The problem is that we are an IntelliJ shop, and this issue has been a thorn in our side. It basically means that everything has to be in ./test/ in order to work; ./it/ isn't acceptable because IntelliJ picks it up as the wrong kind of source every time you try to do anything. So: how do I separate integration tests from unit tests so that they can be run separately in Gradle if they are in the same source set? Does anyone have an example?
We use the *Test*.java and *ITCase*.java naming conventions, if that helps. Another thing we were considering is some kind of use of JUnit's @Category annotation.
P.S. Please vote for this issue. It will be a thorn in the side of any IntelliJ shop considering Gradle that has integration tests in a different directory from unit tests.
I can't speak to IntelliJ configuration, but in build.gradle, if you have:
test {
    // Skip integration tests unless the ITCASE property is set
    if (!project.hasProperty("ITCASE")) {
        exclude "**/*ITCase*"
    }
}
then the following command-line would include the integration tests:
gradle test -PITCASE=true
and the standard invocation would exclude them:
gradle test
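The question also mentions JUnit's @Category annotation; here is a rough sketch of the annotation side, written in Scala (the trait, class, and test names are all hypothetical):

import org.junit.Test
import org.junit.experimental.categories.Category

// Marker type used as the JUnit category; a plain Java
// interface works just as well.
trait IntegrationTest

class PaymentITCase {

  // Tag this test method as an integration test.
  @Category(Array(classOf[IntegrationTest]))
  @Test
  def chargesTheCard(): Unit = {
    // slow, externally-dependent assertions go here
  }
}

Gradle's JUnit support can then filter on the category rather than the class name, e.g. test { useJUnit { excludeCategories 'your.pkg.IntegrationTest' } }.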

Does Codeception have an equivalent to PHPUnit's "strict coverage"?

When using PHPUnit, you can annotate a test case with @covers SomeClass::someMethod to ensure that only code inside of that method is recorded as covered when running the test. I like this feature because it helps me separate code that was incidentally executed during a test from code that was actually tested.
After using Codeception to implement some acceptance tests for my project, I decided I would rather use it than PHPUnit to run my unit tests. I would like to remove PHPUnit from the project if possible.
I am using Codeception's Cest format for my unit tests, and the @covers and @codeCoverageIgnore annotations no longer work. Code coverage reports show executed code outside of the methods specified with @covers as covered. Is there any way to mimic that "strict coverage" functionality using Codeception?
Edit: I have submitted an enhancement request to the Codeception project's GitHub.
It turns out that strict coverage was not possible using Cest-format tests when I asked the question. I have implemented it and the pull request has been merged.
For anyone migrating tests from PHPUnit and looking for this feature as I was, this means that a later release of Codeception should support @covers, @uses, @codeCoverageIgnore, and other related test annotations.
The current version (2.2.4 at the time of writing) doesn't support it, but 2.2.x-dev should.

How to display a short test report/counters in travis-ci?

I mean, it would be very useful if I could see how many tests passed/failed in a single line, without reading the build logs.
I use Karma as my test runner. It has a lot of reporters, but which one should I use?
Example from TeamCity: [screenshot of TeamCity's one-line passed/failed counter]
This seems like a useful feature, but Travis CI's current user interface doesn't support it.
You can file it as a feature request on Travis CI's GitHub page using the link below:
https://github.com/travis-ci/travis-ci/issues
Although Travis CI doesn't have its own interface for counting the number of tests passed, it does integrate with CodeClimate, which has its own interface and metrics for test coverage. It shows overall test coverage for the whole project and coverage for each file. There's some more info on that here, though it looks like their free version allows local testing only.
There are other tools out there for tracking and analyzing coverage as well, including Coveralls, which is also quite good. Like Travis CI, they have a free version for open source, so that can be a plus. They also show coverage as a percentage and file by file.

All the tests of the same category show up as only one test result with TestNG in IntelliJ, and I'd like that not to happen. How?

I have been developing a project which contains a TestLauncher class that reads a given directory and, for each file it contains, runs it against my tool and yields the results.
When coding in Eclipse, this would show one result for each test (as expected). Today I've been toying with IntelliJ and decided to try running and coding a bit of this project there.
When running the tests, though, IntelliJ shows only 2 results instead of the 100+ it should. Although I am sure it is running the full suite, it seems to fold all the results of a given category into a single result. That means that if I have at least one failing test in a category, the whole category shows up as one "failed test".
I guess this must not be a bug, but rather some configuration that I am not aware of and that is on by default in IntelliJ but not in Eclipse. Could anyone explain what might be going on?
Edit: I am using the latest IntelliJ (downloaded just the other day).
Thanks
What you're seeing is simply a difference in the way the Eclipse and IDEA plug-ins are implemented. I implemented the Eclipse plug-in to be pretty clever in its display, so it will show different things depending on various factors such as the presence of a toString() method in your test class or whether your test class implements org.testng.ITest.
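For illustration, here is a rough sketch (in Scala, with made-up names) of the ITest approach mentioned above: each generated test instance reports its own name, which a display-name-aware runner can show as a separate node:

import org.testng.ITest
import org.testng.annotations.{Factory, Test}

// One instance per input file; getTestName labels each
// instance individually in name-aware reports.
class FileCase(fileName: String) extends ITest {
  override def getTestName: String = fileName

  @Test
  def toolHandlesFile(): Unit = {
    // run the tool against fileName and assert on the output
  }
}

class FileCaseFactory {
  // TestNG creates one FileCase per file name via this factory.
  @Factory
  def cases(): Array[AnyRef] =
    Array("alpha.txt", "beta.txt").map(name => new FileCase(name): AnyRef)
}

Whether a given plug-in actually renders one node per instance is exactly the implementation difference described above.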
I suggest you ask this question on the IDEA forums and if you don't get any response, feel free to email the testng-users list and I can put you in touch with the JetBrains engineer in charge of the TestNG plug-in.
The IntelliJ IDEA TestNG plugin has a filter icon called "Hide Passed" above the test results output. You can toggle it to display all tests, including the ones that passed.

Is there a tool for creating a historical report out of JUnit/NUnit results?

Looking for a way to get a visual report about:
overall test success percentage over time (whether, and how quickly, the tests are getting greener)
visualised single-test results over time (to easily notice a long-green test that has gone red, or, vice versa, to pay attention to a test that has just gone green)
any other visual statistics that would benefit testers and the project as a whole
Basically, a tool that would generate results from the whole test-results directory, not just from a single (daily) run.
Generally it seems this could be done using XSLT, but XSLT doesn't offer much flexibility for working with multiple files at the same time.
Does such a tool exist already?
I feel fairly confident claiming that most continuous integration engines, such as Hudson (for Java), provide this capability either natively or through plugins. In Hudson's case there are already a few code coverage plugins available, and I think it draws basic graphs from unit test results automatically out of the box.
Oh, and remember to configure the CI properly: for example, our Hudson polls CVS every 10 minutes, and if it sees any changes it does all the associated tricks (fetch updated .java files, compile, run tests, verify dependencies, etc.) to see whether the build is still OK.
Hudson will do this, and it will work with NUnit (here), JUnit (natively), and MSTest.exe tests using the steps I outline here. It does all that you require and more. Even if you want it to ONLY run tests and give you feedback on those, it can.
There's a newer reporting tool called Allure that supports both NUnit and JUnit. To retrieve information from NUnit you need to use the NUnit adapter; for JUnit, read the following wiki page. You can use it with Jenkins via the respective plugin.