How to print the number of failures for every test case using SoftAssert in TestNG

I have a test case that performs the following:
@Test
public void testNumber() {
    SoftAssert softAssert = new SoftAssert();
    List<Integer> nums = Arrays.asList(1, 2, 3, 4, 5, 6, 7, 8, 9, 10);
    for (Integer num : nums) {
        softAssert.assertTrue(num % 2 == 0, String.format("\n Old num : %d", num));
    }
    softAssert.assertAll();
}
The above test will fail for the numbers 1, 3, 5, 7 and 9, so five failure statements will be printed in the test report.
If I run the test for a larger data set, I find it difficult to get the count of test data that failed the test case.
Is there any easier way to get the number of test data items that failed for a test case, using SoftAssert itself?

There is no easy way to get the count of failed test data using SoftAssert itself.
Alternatively, you can extend the Assertion class. Below is a sample that gives the count of test data that failed the test case and prints each failure on a new line.
package yourpackage;

import java.util.Map;

import org.testng.asserts.Assertion;
import org.testng.asserts.IAssert;
import org.testng.collections.Maps;

public class AssertionExtn extends Assertion {

    private final Map<AssertionError, IAssert> m_errors;

    public AssertionExtn() {
        this.m_errors = Maps.newLinkedHashMap();
    }

    protected void doAssert(IAssert a) {
        onBeforeAssert(a);
        try {
            a.doAssert();
            onAssertSuccess(a);
        } catch (AssertionError ex) {
            onAssertFailure(a, ex);
            this.m_errors.put(ex, a);
        } finally {
            onAfterAssert(a);
        }
    }

    public void assertAll() {
        if (!this.m_errors.isEmpty()) {
            StringBuilder sb = new StringBuilder("Total assertion failed: " + m_errors.size());
            sb.append("\n\t");
            sb.append("The following asserts failed:");
            boolean first = true;
            for (Map.Entry<AssertionError, IAssert> ae : this.m_errors.entrySet()) {
                if (first) {
                    first = false;
                } else {
                    sb.append(",");
                }
                sb.append("\n\t");
                sb.append(ae.getKey().getMessage());
            }
            throw new AssertionError(sb.toString());
        }
    }
}
You can use this as:
@Test
public void test() {
    AssertionExtn assertion = new AssertionExtn();
    assertion.assertEquals(false, true, "failed");
    assertion.assertEquals(false, false, "passed");
    assertion.assertEquals(0, 1, "brokedown");
    assertion.assertAll();
}
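Applied to the test from the question, usage would look roughly like this (a sketch, assuming the AssertionExtn class shown above is on the classpath; the class name EvenNumberTest is made up):
import java.util.Arrays;
import java.util.List;

import org.testng.annotations.Test;

public class EvenNumberTest {

    @Test
    public void testNumber() {
        AssertionExtn softAssert = new AssertionExtn();
        List<Integer> nums = Arrays.asList(1, 2, 3, 4, 5, 6, 7, 8, 9, 10);
        for (Integer num : nums) {
            softAssert.assertTrue(num % 2 == 0, String.format("Odd num : %d", num));
        }
        // Should throw a single AssertionError whose message begins with
        // "Total assertion failed: 5", followed by one line per failing number.
        softAssert.assertAll();
    }
}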

Related

JUnit 5 Parameterized test @ArgumentsSource parameters not loading

I have created the JUnit 5 parameterized test below, with an ArgumentsSource for loading arguments for the test:
public class DemoModelValidationTest {

    public ParamsProvider paramsProvider;

    public DemoModelValidationTest() {
        try {
            paramsProvider = new ParamsProvider();
        }
        catch (Exception iaex) {
        }
    }

    @ParameterizedTest
    @ArgumentsSource(ParamsProvider.class)
    void testAllConfigurations(int configIndex, String a) throws Exception {
        paramsProvider.executeSimulation(configIndex);
    }
}
and the ParamsProvider class looks like this:
public class ParamsProvider implements ArgumentsProvider {

    public static final String modelPath = System.getProperty("user.dir") + File.separator + "demoModels";

    YAMLDeserializer deserializedYAML;
    MetaModelToValidationModel converter;
    ValidationRunner runner;
    List<Configuration> configurationList;
    List<Arguments> listOfArguments;

    public ParamsProvider() throws Exception {
        configurationList = new ArrayList<>();
        listOfArguments = new LinkedList<>();

        deserializedYAML = new YAMLDeserializer(modelPath);
        deserializedYAML.load();

        converter = new MetaModelToValidationModel(deserializedYAML);
        runner = converter.convert();
        configurationList = runner.getConfigurations();

        for (int i = 0; i < configurationList.size(); i++) {
            listOfArguments.add(Arguments.of(i, configurationList.get(i).getName()));
        }
    }

    public void executeSimulation(int configListIndex) throws Exception {
        final Configuration config = runner.getConfigurations().get(configListIndex);
        runner.run(config);
        runner.getReporter().consolePrintReport();
    }

    @Override
    public Stream<? extends Arguments> provideArguments(ExtensionContext context) {
        return listOfArguments.stream().map(Arguments::of);
        // return Stream.of(Arguments.of(0, "Actuator Power"), Arguments.of(1, "Error Logging"));
    }
}
In the provideArguments() method, the commented-out code works fine, but the first line,
listOfArguments.stream().map(Arguments::of)
returns the following error:
org.junit.platform.commons.PreconditionViolationException: Configuration error: You must configure at least one set of arguments for this @ParameterizedTest
I am not sure whether I have a casting problem with the stream in the provideArguments() method, but I guess it somehow cannot map the elements of listOfArguments to a stream of the form below:
Stream.of(Arguments.of(0, "Actuator Power"), Arguments.of(1, "Error Logging"))
Am I missing a proper stream mapping of listOfArguments?
provideArguments(…) is called before your test is invoked.
Your ParamsProvider class is instantiated by JUnit, so whatever you are doing in desiralizeAndCreateValidationRunnerInstance should be done in the ParamsProvider constructor.
Also, you are already wrapping the values from the deserialised configurations in Arguments, and then double-wrapping them in provideArguments.
Do this:
@Override
public Stream<? extends Arguments> provideArguments(ExtensionContext context) {
    return listOfArguments.stream();
}
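For reference, a minimal self-contained sketch of this pattern (the class name MinimalParamsProvider and the two argument values are invented for illustration):
import java.util.LinkedList;
import java.util.List;
import java.util.stream.Stream;

import org.junit.jupiter.api.extension.ExtensionContext;
import org.junit.jupiter.params.provider.Arguments;
import org.junit.jupiter.params.provider.ArgumentsProvider;

public class MinimalParamsProvider implements ArgumentsProvider {

    private final List<Arguments> listOfArguments = new LinkedList<>();

    public MinimalParamsProvider() {
        // Build the argument list in the constructor so it is already populated
        // when JUnit instantiates the provider and calls provideArguments().
        listOfArguments.add(Arguments.of(0, "Actuator Power"));
        listOfArguments.add(Arguments.of(1, "Error Logging"));
    }

    @Override
    public Stream<? extends Arguments> provideArguments(ExtensionContext context) {
        // The elements are already Arguments instances, so return the stream as-is;
        // mapping with Arguments::of would wrap each Arguments inside another Arguments.
        return listOfArguments.stream();
    }
}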

IRetryAnalyzer produces incorrect results for test cases defined as SoftAssert

I have an implementation of retry logic for failed test cases using IRetryAnalyzer, and there are two types of asserts, Assert and SoftAssert, defined in my test cases. IRetryAnalyzer works fine for a normal Assert, but it does not work as expected in the case of SoftAssert. Below are the scenario details about the issue:
If a test case defined with SoftAssert fails on the first attempt and passes on the next attempt, it keeps retrying until the maximum retry attempt even though the test case is passing. In this case, if the next test case, which is defined with a normal (non-soft) assert, passes, it also gets marked as failed even though it is passing, and is retried for the maximum number of retry attempts.
If all test cases are defined with normal asserts, it works as expected, i.e. if a test fails on the first attempt and passes on the next, it moves on and does not get stuck in a retry loop.
If a test case defined with SoftAssert passes on the first attempt, it does not retry and moves on to the next test cases, i.e. it works as expected.
I need to keep a few test cases as SoftAssert because I have to continue with the test run. For example:
@Test(retryAnalyzer = RetryAnalyzer.class, groups = { "group1" }, priority = 1)
public void TestSection1() {
    Class1.verifyingAppLaunch(); // Defined as Assert
    Class1.Test1(); // Defined as softAssert
    Class1.Test2(); // Defined as softAssert
    Class1.Test3(); // Defined as softAssert
    Class1.Test4(); // Defined as softAssert
    Class1.Test5(); // Defined as softAssert
    softAssert.assertAll();
}
Below is a sample of the IRetryAnalyzer and TestListenerAdapter implementation. The TestListenerAdapter is implemented to remove duplicate test case executions that were marked as skipped as part of the retry implementation. In the sample code below, if samplecondition1 fails on the first attempt, the test retries up to the maximum retry count even if it passes on the second attempt, and samplecondition2 is also marked as failed even though it passes:
MyTestListenerAdapter.class
import java.util.Iterator;

import org.testng.ITestContext;
import org.testng.ITestNGMethod;
import org.testng.ITestResult;
import org.testng.TestListenerAdapter;

public class MyTestListenerAdapter extends TestListenerAdapter {

    @Override
    public void onFinish(ITestContext context) {
        Iterator<ITestResult> skippedTestCases = context.getSkippedTests().getAllResults().iterator();
        while (skippedTestCases.hasNext()) {
            ITestResult skippedTestCase = skippedTestCases.next();
            ITestNGMethod method = skippedTestCase.getMethod();
            if (context.getSkippedTests().getResults(method).size() > 0) {
                System.out.println("Removing:" + skippedTestCase.getTestClass().toString());
                skippedTestCases.remove();
            }
        }
    }
}
TestRetryAnalyzer.class
import org.testng.IRetryAnalyzer;
import org.testng.ITestResult;

public class TestRetryAnalyzer implements IRetryAnalyzer {

    int counter = 1;
    int retryMaxLimit = 3;

    public boolean retry(ITestResult result) {
        if (counter < retryMaxLimit) {
            counter++;
            return true;
        }
        return false;
    }
}
TestRetryTestCases.class
import org.testng.Assert;
import org.testng.annotations.Listeners;
import org.testng.annotations.Test;
import org.testng.asserts.SoftAssert;

@Listeners(MyTestListenerAdapter.class)
public class TestRetryTestCases {

    SoftAssert softAssert = new SoftAssert();

    // Placeholder flags standing in for the real conditions being verified
    boolean samplecondition1;
    boolean samplecondition2;

    @Test(retryAnalyzer = TestRetryAnalyzer.class)
    public void firstTestMethod() {
        System.out.println("First test method");
        if (samplecondition1 == true)
            softAssert.assertTrue(true);
        else
            softAssert.assertTrue(false);
        softAssert.assertAll();
    }

    @Test(retryAnalyzer = TestRetryAnalyzer.class)
    public void secondTestMethod() {
        System.out.println("Second test method");
        if (samplecondition2 == true)
            Assert.assertTrue(true);
        else
            Assert.assertTrue(false);
    }
}
I am not sure what version of TestNG you are working with, but I can't seem to reproduce this issue using TestNG 7.0.0 (the latest released version as of today).
You have an additional problem in your code: you are keeping the SoftAssert as a global variable. SoftAssert, by its very implementation, remembers all the failures, so on every retry it continues to carry all the failures from the first attempt onwards. That means a @Test method which involves a retry analyzer and which uses SoftAssert, wherein there's a possibility of something failing, would never pass.
When using SoftAssert, you should always declare and use the SoftAssert object within the @Test method, so that it gets instantiated (and thus reset) for every retry.
Here's the same sample that you shared (I tweaked it just a little bit), which demonstrates that this works fine in 7.0.0.
As you can see from the output, only firstTestMethod (which uses SoftAssert) is being retried, while secondTestMethod (which has a hard assert and hasn't failed) is not retried.
Test class (I have only altered this, everything else is borrowed from your original post)
import org.testng.Assert;
import org.testng.annotations.Listeners;
import org.testng.annotations.Test;
import org.testng.asserts.SoftAssert;

@Listeners(MyTestListenerAdapter.class)
public class TestRetryTestCases {

    int softAssertCounter = 0;
    int hardAssertCounter = 0;

    @Test(retryAnalyzer = TestRetryAnalyzer.class)
    public void firstTestMethod() {
        SoftAssert softAssert = new SoftAssert();
        System.out.println("First test method");
        if (softAssertCounter++ > 2) {
            softAssert.assertTrue(true);
        } else {
            softAssert.assertTrue(false);
        }
        softAssert.assertAll();
    }

    @Test(retryAnalyzer = TestRetryAnalyzer.class)
    public void secondTestMethod() {
        System.out.println("Second test method");
        if (hardAssertCounter++ < 2) {
            Assert.assertTrue(true);
        } else {
            Assert.assertTrue(false);
        }
    }
}
Retry analyzer with some additional logging
import org.testng.IRetryAnalyzer;
import org.testng.ITestResult;

public class TestRetryAnalyzer implements IRetryAnalyzer {

    int counter = 1;
    int retryMaxLimit = 3;

    public boolean retry(ITestResult result) {
        if (counter < retryMaxLimit) {
            counter++;
            System.err.println("Retrying the test method " + result.getMethod().getMethodName());
            return true;
        }
        return false;
    }
}
Console output
First test method
Retrying the test method firstTestMethod
Retrying the test method firstTestMethod
Test ignored.
First test method
Test ignored.
First test method
java.lang.AssertionError: The following asserts failed:
did not expect to find [true] but found [false]
at org.testng.asserts.SoftAssert.assertAll(SoftAssert.java:47)
at org.testng.asserts.SoftAssert.assertAll(SoftAssert.java:31)
at com.rationaleemotions.stackoverflow.qn58072880.TestRetryTestCases.firstTestMethod(TestRetryTestCases.java:22)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.testng.internal.MethodInvocationHelper.invokeMethod(MethodInvocationHelper.java:133)
at org.testng.internal.TestInvoker.invokeMethod(TestInvoker.java:584)
at org.testng.internal.TestInvoker.retryFailed(TestInvoker.java:204)
at org.testng.internal.MethodRunner.runInSequence(MethodRunner.java:58)
at org.testng.internal.TestInvoker$MethodInvocationAgent.invoke(TestInvoker.java:804)
at org.testng.internal.TestInvoker.invokeTestMethods(TestInvoker.java:145)
at org.testng.internal.TestMethodWorker.invokeTestMethods(TestMethodWorker.java:146)
at org.testng.internal.TestMethodWorker.run(TestMethodWorker.java:128)
at java.util.ArrayList.forEach(ArrayList.java:1257)
at org.testng.TestRunner.privateRun(TestRunner.java:770)
at org.testng.TestRunner.run(TestRunner.java:591)
at org.testng.SuiteRunner.runTest(SuiteRunner.java:402)
at org.testng.SuiteRunner.runSequentially(SuiteRunner.java:396)
at org.testng.SuiteRunner.privateRun(SuiteRunner.java:355)
at org.testng.SuiteRunner.run(SuiteRunner.java:304)
at org.testng.SuiteRunnerWorker.runSuite(SuiteRunnerWorker.java:53)
at org.testng.SuiteRunnerWorker.run(SuiteRunnerWorker.java:96)
at org.testng.TestNG.runSuitesSequentially(TestNG.java:1180)
at org.testng.TestNG.runSuitesLocally(TestNG.java:1102)
at org.testng.TestNG.runSuites(TestNG.java:1032)
at org.testng.TestNG.run(TestNG.java:1000)
at org.testng.IDEARemoteTestNG.run(IDEARemoteTestNG.java:73)
at org.testng.RemoteTestNGStarter.main(RemoteTestNGStarter.java:123)
Second test method
Removing:[TestClass name=class com.rationaleemotions.stackoverflow.qn58072880.TestRetryTestCases]
Removing:[TestClass name=class com.rationaleemotions.stackoverflow.qn58072880.TestRetryTestCases]
===============================================
Default Suite
Total tests run: 2, Passes: 1, Failures: 1, Skips: 0
===============================================
Process finished with exit code 0

Robotium test case not running at all

I am only a day old to Robotium and am trying to run some apps with it.
I have written a simple calculator app and am trying to test it using Robotium, but the Robotium test is not responding at all and the tests are never executed.
I have included the permissions in the manifest file, but the program still never runs.
My source code for the Robotium test looks like this:
package com.example.demo.project.test;

import android.test.ActivityInstrumentationTestCase2;
import android.widget.EditText;
import android.widget.TextView;

import com.example.demo.project.MainActivity;
import com.example.demo.project.R;
import com.jayway.android.robotium.solo.Solo;

public class SampleQA extends ActivityInstrumentationTestCase2<MainActivity> {

    private Solo solo;

    public SampleQA(Class<MainActivity> activityClass) {
        super(activityClass);
        // TODO Auto-generated constructor stub
    }

    /*public TestMain()
    {
        super(MainActivity.class);
    }*/

    @Override
    protected void setUp() throws Exception {
        super.setUp();
        solo = new Solo(getInstrumentation(), getActivity());
    }

    public void testDisplayBlackBox() {
        // Enter 10 in the first edit field
        solo.enterText(0, "10");
        // Enter 20 in the second edit field
        solo.enterText(1, "20");
        // Click on the Multiply button
        solo.clickOnButton("Multiply");
        // Verify the result of 10 x 20
        assertTrue(solo.searchText("200"));
    }

    public void testDisplayWhiteBox() {
        // Define our own values to multiply
        float firstNumber = 10;
        float secondNumber = 20;
        float result = firstNumber * secondNumber;
        // Access the first edit field and put firstNumber into it
        EditText FirsteditText = (EditText) solo.getView(R.id.EditText01);
        solo.enterText(FirsteditText, String.valueOf(firstNumber));
        // Access the second edit field and put secondNumber into it
        EditText SecondeditText = (EditText) solo.getView(R.id.EditText02);
        solo.enterText(SecondeditText, String.valueOf(secondNumber));
        // Click on the Multiply button
        solo.clickOnButton("Multiply");
        assertTrue(solo.searchText(String.valueOf(result)));
        TextView outputField = (TextView) solo.getView(R.id.TextView01);
        // Assert to verify the result against the visible value
        assertEquals(String.valueOf(result), outputField.getText().toString());
    }

    @Override
    protected void tearDown() throws Exception {
        try {
            solo.finalize();
        } catch (Throwable e) {
            e.printStackTrace();
        }
        getActivity().finish();
        super.tearDown();
    }
}
The test is not getting executed at all.
Please help me out, folks!
Thanks.
Can you be more specific about the errors that you are encountering? Maybe post some examples?
Is the main activity of the application under test named MainActivity?
The test runner has to be able to instantiate your test class, typically through a public no-argument constructor, so maybe you should change the constructor from
public SampleQA(Class<MainActivity> activityClass) {
    super(activityClass);
    // TODO Auto-generated constructor stub
}
to
public SampleQA() {
    super(MainActivity.class);
}

How to catch test failure in Selenium Webdriver using JUnit4?

I need to write the test case Pass/Fail status into an Excel report. How do I catch the result in a simple way? The status (Success/Failed) is displayed in the Ant XML/HTML reports, so I believe there must be some way to catch it, maybe in the @AfterClass method.
Can anyone help me out with this?
I am using Selenium WebDriver with JUnit 4 and Apache POI (oh, it's screwing with my mind too!) for the Excel handling. Let me know in case you need more info.
Thanks :)
P.S.: As I am asking a lot of questions, it would be great if someone could suggest changes to make these questions and threads more helpful for others.
I got your problem, and this is what I do in all my test cases now.
In your test case, if you want to check whether two Strings are equal, you might be using this code:
Assert.assertTrue(value1.equals(value2));
If these values are not equal, an AssertionError is generated.
Instead, you can change your code like this:
String testCaseStatus = "";
if (value1.equals(value2)) {
    testCaseStatus = "success";
} else {
    testCaseStatus = "fail";
}
Now you store this result in your Excel sheet by passing testCaseStatus to the code, implemented with Apache POI, that adds a line to your Excel file. You can also handle an "error" status if you use a try/catch block.
For example, you can catch an exception and record the status as error in your Excel sheet.
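A minimal sketch of that idea, assuming a hypothetical writeStatusToExcel() helper that would append a row to the report workbook via Apache POI (class and method names are made up):
import org.junit.Test;

public class ReportingTest {

    @Test
    public void verifyValuesMatch() {
        String testCaseStatus;
        try {
            String value1 = "expected";   // in practice these values would come from the page under test
            String value2 = "expected";
            testCaseStatus = value1.equals(value2) ? "success" : "fail";
        } catch (Exception e) {
            // Any unexpected exception is recorded as "error" instead of a plain pass/fail.
            testCaseStatus = "error";
        }
        writeStatusToExcel("verifyValuesMatch", testCaseStatus);
    }

    // Hypothetical helper: append a (testName, status) row to the Excel report using Apache POI.
    private void writeStatusToExcel(String testName, String status) {
        System.out.println(testName + " -> " + status);
    }
}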
Edited part of the answer:
I just figured out how to use the TestResult class. Here is some sample code.
This is the test case class, called ExampleTest:
import junit.framework.AssertionFailedError;
import junit.framework.TestResult;

import org.junit.After;
import org.junit.Assert;
import org.junit.Before;
import org.junit.Test;

public class ExampleTest implements junit.framework.Test {

    @Before
    public void setUp() throws Exception {
    }

    @After
    public void tearDown() throws Exception {
    }

    @Test
    public void test(TestResult result) {
        try {
            Assert.assertEquals("hari", "");
        } catch (AssertionFailedError e) {
            result.addFailure(this, e);
        } catch (AssertionError e) {
            result.addError(this, e);
        }
    }

    @Override
    public int countTestCases() {
        // TODO Auto-generated method stub
        return 0;
    }

    @Override
    public void run(TestResult result) {
        test(result);
    }
}
I call the above test case from this code:
import junit.framework.TestResult;
import junit.framework.TestSuite;

public class Test {
    public static void main(String[] args) {
        TestResult result = new TestResult();
        TestSuite suite = new TestSuite();
        suite.addTest(new ExampleTest());
        suite.run(result);
        System.out.println(result.errorCount());
    }
}
You can call many test cases by just adding them to this suite, and then get the entire result using the TestResult class's failures() and errors() methods, as sketched below.
You can read more on this from here.
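For instance, a rough sketch of reading the collected results back out of the TestResult (the ResultReporter class name is made up):
import java.util.Enumeration;

import junit.framework.TestFailure;
import junit.framework.TestResult;

public class ResultReporter {

    // Prints every failure and error recorded in a completed TestResult.
    public static void report(TestResult result) {
        System.out.println("Failures: " + result.failureCount() + ", Errors: " + result.errorCount());
        for (Enumeration<TestFailure> failures = result.failures(); failures.hasMoreElements(); ) {
            System.out.println("FAILURE: " + failures.nextElement().toString());
        }
        for (Enumeration<TestFailure> errors = result.errors(); errors.hasMoreElements(); ) {
            System.out.println("ERROR: " + errors.nextElement().toString());
        }
    }
}
You could call ResultReporter.report(result) at the end of the main method above and feed the same information into your Apache POI report.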

TestNG Test Case failing with JMockit "Invalid context for the recording of expectations"

The following TestNG (6.3) test case generates the error "Invalid context for the recording of expectations"
@Listeners({ Initializer.class })
public final class ClassUnderTestTest {

    private ClassUnderTest cut;

    @SuppressWarnings("unused")
    @BeforeMethod
    private void initialise() {
        cut = new ClassUnderTest();
    }

    @Test
    public void doSomething() {
        new Expectations() {
            MockedClass tmc;
            {
                tmc.doMethod("Hello"); result = "Hello";
            }
        };
        String result = cut.doSomething();
        assertEquals(result, "Hello");
    }
}
The class under test is below.
public class ClassUnderTest {

    MockedClass service = new MockedClass();
    MockedInterface ifce = new MockedInterfaceImpl();

    public String doSomething() {
        return (String) service.doMethod("Hello");
    }

    public String doSomethingElse() {
        return (String) ifce.testMethod("Hello again");
    }
}
I am making the assumption that, because I am using the @Listeners annotation, I do not require the -javaagent command-line argument. This assumption may be wrong.
Can anyone point out what I have missed?
The JMockit-TestNG Initializer must run once for the whole test run, so using @Listeners on individual test classes won't work.
Instead, simply upgrade to JMockit 0.999.11, which works transparently with TestNG 6.2+, without any need to specify a listener or the -javaagent parameter (unless running on JDK 1.5).