Exclude tests from a certain category - ncrunch

"Bottom" line: How do you set up NCrunch to exclude all tests in a certain category, e.g. "LongRunning"?
Stack Overflow search provided zero results, and the relevant NCrunch wiki page merely informs me that "NUnit's Category attribute is just as powerful as NCrunch", which is nice but doesn't tell me how to use those categories.
To further clarify, I was expecting a checkbox-list somewhere to turn categories on/off, similar to the /exclude option in the NUnit Test Runner to exclude all tests in certain categories.
PS: I've tagged the question only with NCrunch; it didn't seem relevant that I'm using NUnit as my testing framework. Should the answers indicate otherwise, I'm happy to add the tag.

After some more searching on the wiki I found this page describing it: you need to create a new Engine Mode. Who would've thought?
Here are the two relevant steps/screenshots that got this working for me. First, go to the NCrunch menu, choose "Set Engine Mode" and then "Customise Engine Modes...":
Then, for step 2: choose "Run all tests automatically + CATEGORY FILTER" and click "Add Engine Mode". Give it a name, add a condition "does not have category", and choose the category to ignore, e.g. "LongRunning". Then hit "OK":
Finally, choose the new engine mode and your tests are filtered!

Related

karate - How to change display name of a Feature file in the report [duplicate]

I want to have an option on the cucumber report to mute/hide scenarios with a given tag from the results and numbers.
We have a bamboo build that runs our karate repository of features and scenarios. At the end it produces nice cucumber html reports. On the "overview-features.html" page, next to the existing top-right options ("Features", "Tags", "Steps" and "Failures"), I would like an added option that says "Excluded Fails" or something like that. When clicked, it would show exactly the same information as overview-features.html, except that any scenario tagged with a special tag, for example #bug=abc-12345, is removed from the report and excluded from the numbers.
Why I need this: we have some existing scenarios that fail. They fail due to defects in our own software that might not get fixed for six months to a year. We've tagged them with a specific tag, "#bug=abc-12345". I want them muted/excluded from the cucumber report produced at the end of the bamboo build, so I can quickly look at the number of passed features/scenarios and see whether it's 100%. If it is, great, that build is good. If not, I need to look into it further, as we appear to have some regression. With these expected failures mixed into the numbers, it is very tedious and time-consuming to go through all the individual feature file reports, look at the failing scenarios, and then look into why each one failed. I don't want them removed completely, because when they start to pass I need to know, so I can go back and remove the tag from the scenario.
Any ideas on how to accomplish this?
Karate 1.0 has overhauled the reporting system with the following key changes.
- after the Runner completes you can massage the results and even re-try some tests
- you can inject a custom HTML report renderer
This will require you to get into the details (some of this is not documented yet) and write some Java code. If that is not an option, you have to consider that what you are asking for is not supported by Karate.
If you are willing to go down that path, here are the links you need to get started.
a) Example of how to "post process" result-data before rendering a report: RetryTest.java and also see https://stackoverflow.com/a/67971681/143475
b) The code responsible for "pluggable" reports, where you can implement a new SuiteReports in theory. And in the Runner, there is a suiteReports() method you can call to provide your implementation.
Also note that there is an experimental "doc" keyword, by which you can inject custom HTML into a test-report: https://twitter.com/getkarate/status/1338892932691070976
Also see: https://twitter.com/KarateDSL/status/1427638609578967047
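As a rough sketch of a simpler alternative (excluding the tagged scenarios from the run itself rather than post-processing the report), assuming the Karate 1.0 Runner builder API; the classpath and tag name are placeholders:

    import com.intuit.karate.Results;
    import com.intuit.karate.Runner;
    import org.junit.jupiter.api.Test;
    import static org.junit.jupiter.api.Assertions.assertEquals;

    class KnownBugsTest {

        @Test
        void testAllExceptKnownBugs() {
            // "~@bug" skips scenarios tagged @bug, so known failures stay out
            // of the pass/fail numbers; if your version doesn't match the
            // valued form @bug=abc-12345 by the bare name, use the full tag
            Results results = Runner.path("classpath:features") // placeholder path
                    .tags("~@bug")
                    .outputCucumberJson(true) // keep feeding the cucumber html report
                    .parallel(5);
            assertEquals(0, results.getFailCount(), results.getErrorMessages());
        }
    }

Note this drops the tagged scenarios from the run entirely; keeping them running but out of the report numbers is exactly the part that needs the post-processing / custom SuiteReports route described above.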

Intellij find all test coverage for a class

I found a really nice set of functionality in IntelliJ but it is very manual...
1) Use Analyze / Analyze backward dependencies on an object to find all the tests that eventually reference that class.
2) Create a run configuration using Test Kind "Pattern" and manually enter each of the test classes found into the "Pattern" field.
3) Run the tests with code coverage.
4) Navigate to the original class to view its total test coverage.
This whole process is fairly slow and user-intensive, but it could easily be automated with a single "Find test coverage for class" key-press (it would still be pretty slow, but I could go on and do something else). Does anyone know if this is in a key binding or plug-in I haven't found yet? It seems like a pretty obviously useful and easy-to-implement piece of functionality.
If not, can anyone suggest how I might do this with the IDE scripting console or a custom Intention? (I've had no success finding really good, usable documentation/examples of the IDE scripting console, and haven't looked into intentions too much.)
How about the following two flows/options, based on the Windows shortcuts (don't mind the failing stuff, it's just a quick copy-paste for the sake of brevity):
1) With the cursor placed on your class name:
CTRL+SHIFT+T (choose test for launch)
SHIFT+END (select all)
SHIFT+UP (unselect "Create new test...")
CTRL+SHIFT+F10 (execute selected tests)
2) With the "Group by test/production" option selected in the find window and the cursor placed on your class name:
ALT+F7 (find usages)
choose the tests from the list
CTRL+SHIFT+F10 (execute selected tests)

What is the correct (Most effective) Cucumber Project Layout when Considering Page Object Modelling?

What is the correct (Most effective) Cucumber Project Layout when Considering Page Object Modelling?
After a lot of research I have come up with the following design:
Maven Project
NEW PROJECT SETUP:
I agree with the main idea you present; however, the page object model also covers the utilities. One of the desired goals of the page object model is to keep the selenium code out of the test itself, so most of those references will go to the pages, whose locators and actions would access the driver class, preferably through the utilities. That does not mean that the test program cannot make direct references to the utilities, but it should do so for non-selenium reasons only. There are exceptions, of course. In the case of Cucumber or any other BDD-based framework, you would refer to what you are now calling "main" as "steps", and each test would have its own story file, accessing one or more steps files. The rest remains the same. The idea behind that is that it allows you to create and maintain a library of related steps that existing and future story files can refer to.
Hope this has helped you and/or others to better understand the flow. Also, disclaimer - most of this is my opinion - there are likely many ways to diagram the relationships, but what I described is what I use.
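To make that layering concrete, here is a minimal hedged sketch (all class names, locators and credentials are hypothetical, not from the answer above): the step definition touches only the page, the page owns its locators and actions, and only the utility class talks to the WebDriver:

    import io.cucumber.java.en.When;
    import org.openqa.selenium.By;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.chrome.ChromeDriver;

    // Utility layer: the only place that creates and holds the driver.
    class DriverManager {
        private static WebDriver driver;
        static WebDriver get() {
            if (driver == null) {
                driver = new ChromeDriver();
            }
            return driver;
        }
    }

    // Page layer: locators and actions live here, not in the test.
    class LoginPage {
        private final By user = By.id("username");
        private final By pass = By.id("password");
        private final By submit = By.id("login");

        void logIn(String name, String password) {
            WebDriver d = DriverManager.get();
            d.findElement(user).sendKeys(name);
            d.findElement(pass).sendKeys(password);
            d.findElement(submit).click();
        }
    }

    // Step layer: refers only to the page, no selenium code of its own.
    public class LoginSteps {
        private final LoginPage loginPage = new LoginPage();

        @When("the user logs in as {string}")
        public void theUserLogsInAs(String name) {
            loginPage.logIn(name, "secret"); // placeholder credentials
        }
    }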
Upon further examination, I see that I missed the lower half (I am visually impaired). The test runner is typically at the very top of the chain in this environment. It runs as a single JUnit or TestNG test to run ALL your stories.
I've drawn a crude layout of what I was trying to describe. I hope it explains my answer to your question more clearly.
Here's the basic project tree, with src/test/java and src/test/resources expanded (shown in the original screenshots). The only thing in \src\main\resources is some extra stuff that JBehave uses to allow some customization of their reports, called FTL.
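As a rough, hypothetical reconstruction of that layout (folder and file names are illustrative, not taken from the screenshots):

    pom.xml
    src/
      main/
        resources/
          ftl/                    <- JBehave report customization (FTL templates)
      test/
        java/
          runner/
            TestRunner.java       <- single JUnit/TestNG entry point
          steps/
            LoginSteps.java
          pages/
            LoginPage.java
          utilities/
            DriverManager.java
        resources/
          stories/
            login.story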

All the tests of the same category show up as only one test result w/ TestNG in Intellij and I'd like it not to happen. How?

I have been developing a project which contains a TestLauncher class that'll read a given directory and for each file it contains, run it against my tool and yield the results.
So, when coding in Eclipse, it would show one result for each test (as expected). Today I've been toying with IntelliJ, and I've decided to try to run and code a bit of this project in IntelliJ.
When trying to run the tests, though, it seems to show only 2 results instead of the 100+ it should. Although I am sure it is running the full suite, it seems to be folding all the results of a given category into a single result. That means that if I have at least one failing test in each category, the whole category shows up as a single "failed test".
I guess this must not be a bug, but rather some configuration that I am not aware of and that is on by default in IntelliJ but not in Eclipse. Could anyone explain what might be going on?
Edit: I am using the latest IntelliJ (downloaded just the other day).
Thanks
What you're seeing is simply a difference in the way the Eclipse and IDEA plug-ins are implemented. I implemented the Eclipse plug-in to be pretty clever in its display, so it will show different things depending on various factors such as the presence of a toString() method in your test class or whether your test class implements org.testng.ITest.
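For reference, a minimal hedged sketch of the org.testng.ITest route mentioned above (class and file names are hypothetical): one test instance per input file, each reporting the file name so runners and plug-ins that honor ITest can display a result per file:

    import org.testng.ITest;
    import org.testng.annotations.DataProvider;
    import org.testng.annotations.Factory;
    import org.testng.annotations.Test;

    public class FileCaseTest implements ITest {

        private final String fileName;

        @Factory(dataProvider = "inputFiles")
        public FileCaseTest(String fileName) {
            this.fileName = fileName;
        }

        // In a real TestLauncher this would list the directory contents.
        @DataProvider(name = "inputFiles")
        public static Object[][] inputFiles() {
            return new Object[][] { { "case-001.txt" }, { "case-002.txt" } };
        }

        @Override
        public String getTestName() {
            return fileName; // used as the display name by ITest-aware runners
        }

        @Test
        public void runsCleanlyAgainstTheTool() {
            // run the tool against fileName and assert on its output here
        }
    }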
I suggest you ask this question on the IDEA forums and if you don't get any response, feel free to email the testng-users list and I can put you in touch with the JetBrains engineer in charge of the TestNG plug-in.
The IntelliJ IDEA TestNG plugin has a filter symbol called "Hide Passed" above the test results output. You can toggle it to display all tests, including the passed ones.