How to generate a report using SOAtest - Parasoft

I am generating a SOAtest report after running Parasoft SOAtest from the command line. I am seeing results at the test case level, where for one test there are multiple validations from the data source. How can I generate a report that covers every validation I am performing, rather than just the tests in the suite?

(SOAtest 9.8) The following gives you a complete validation report, in the test suite hierarchy structure, in HTML or XML.
Right-click on the .tst file, select Structure Report, and choose the configuration and file format you need the report in.
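If you are generating the report from the command line, a soatestcli invocation along the following lines should also produce it; the workspace path, .tst path, and configuration name below are placeholders, and the available flags may vary between SOAtest versions.

# Illustrative only: adjust the workspace, .tst path, and test configuration to your setup
soatestcli -data /path/to/workspace \
  -resource MyProject/MyTests.tst \
  -config "user://My Test Configuration" \
  -report reports/

Depending on the version, the report format (HTML vs. XML) is typically controlled through a settings/localsettings file rather than a dedicated flag, so check the reporting properties documented for your SOAtest release.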

Related

Polarion: xUnitFileImport creates duplicate testcases instead of referencing existing ones

I have the xUnitFileImport scheduled job configured in my Polarion project (as described in the Polarion documentation) to import e2e test results (formatted as JUnit test results):
<job cronExpression="0 0/5 * * * ? *" id="xUnitFileImport" name="Import e2e Tests Results" scope="system">
<path>D:\myProject\data\import-test-results\e2e-gitlab</path>
<project>myProject</project>
<userAccountVaultKey>myKey</userAccountVaultKey>
<maxCreatedDefects>10</maxCreatedDefects>
<maxCreatedDefectsPercent>5</maxCreatedDefectsPercent>
<templateTestRunId>xUnit Build Test</templateTestRunId>
<idRegex>(.*).xml</idRegex>
<groupIdRegex>(.*)_.*.xml</groupIdRegex>
</job>
This works and I get my test results imported into a new test run, with new test cases created. But if I run the import job multiple times (once for each test run), it creates duplicate test case work items even though they have the same name.
Is there some way to tell the import job to reference the existing test cases from the newly created test run, instead of creating new ones?
What I have done so far:
Yes, I checked that the "custom field for test case id" in "testing > configuration" is configured.
Yes, I checked that the field value is really set in the created test case.
The current value in this field is e.g. ".Login", as I don't want the class names in the report.
Yes, I still get the same behaviour with the class name set.
In the scheduler I have changed the job parameter for the group ID because it wasn't filled. The new value is: <groupIdRegex>e2e-results-(.*).xml</groupIdRegex>
I checked that no other custom fields are interfering; only the standard fields are set.
I checked that no read-only fields are present.
I do use a template for the test cases, as supported by the xUnitFileImport. The test cases are successfully created and I don't see anything that would interfere.
However, I do have a hyperlink set in the template (I'll try removing this soon™).
I changed the test run template from "xUnit Build test" to "xUnit Manual Test Upload"; this however did not lead to any visible change.
I changed the template status from draft to active. This had no effect on the behaviour.
I triple-checked all the fields in the created test cases. They are literally the same, which leads to the conclusion that no fields in the test cases interfere with referencing them.
After all the time I have now invested, researching on my own and asking on different forums, I am ready to call this a Polarion bug unless someone proves to me that this functionality is working.
I believe you have to set a custom field that ties the test case to the xUnit file you're importing, so that the importer can identify the test case.
Try adding a custom field to the TestCase work item and selecting it here:
[Image: the "Custom Field for Test Case ID" option in the settings]
If you're planning on creating test cases beforehand, note that the ID is formed from {classname}.{name} for a given case.
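For illustration, assume a JUnit-style result file such as the (made-up) snippet below; the importer would then look for a test case whose ID field matches Login.testValidCredentials, i.e. {classname}.{name}.

<testsuite name="e2e-results-login" tests="1" failures="0">
  <!-- hypothetical example: the test case ID is derived as {classname}.{name} -->
  <testcase classname="Login" name="testValidCredentials" time="1.234"/>
</testsuite>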

How to get disabled test cases count in jenkins result?

Suppose I have 10 test cases in a test suite, of which 2 test cases are disabled. I want those two test cases reflected in the test result of the Jenkins job, like pass = 7, fail = 1 and disabled/not run = 2.
By default, TestNG generates a report for your test suite; you can refer to the index.html file under the test-output folder. If you click on the "Ignored Methods" hyperlink, it will show you all the ignored test cases, their class names, and the count of ignored methods.
All test cases annotated with @Test(enabled = false) will be shown in the "Ignored Methods" link.
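As a minimal sketch, a test disabled this way ends up under the ignored methods (the class and method names below are made up):

import org.testng.annotations.Test;

public class LoginTests {

    @Test
    public void validLogin() {
        // runs normally and is counted as passed or failed
    }

    @Test(enabled = false) // skipped by TestNG and listed under "Ignored Methods"
    public void legacyLoginFlow() {
        // never executed
    }
}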
If your test generates JUnit XML reports, you can use the JUnit plugin to parse these reports after the build (as a post-build action). Then, you can go into your build and click 'Test Result'. You should see a breakdown of how the execution went (including passed, failed, and skipped tests).
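In a Pipeline job, the rough equivalent of that post-build action is the junit step; the test command and report path below are only placeholders.

// Declarative Pipeline sketch: adjust the path to wherever your runner writes JUnit XML
pipeline {
    agent any
    stages {
        stage('Test') {
            steps {
                sh './run-tests.sh' // hypothetical test command
            }
        }
    }
    post {
        always {
            junit 'reports/**/*.xml' // skipped/disabled tests show up in the result breakdown
        }
    }
}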

Integrating RFT Test framework to work with RQM

I designed a framework in RFT where the test cases are written in a spreadsheet specifying the data source, object and keyword, plus a driver script which processes all this data and routes it to the appropriate method for each test step, all within the spreadsheet. Now I want to integrate this with RQM so that each of my test cases in the spreadsheet is shown as passed/failed in RQM. Any ideas?
You could now implement an algorithm that reads those test cases from the spreadsheet and passes the results to RQM as attachments with logTestResult.
For example:
logTestResult( <your attachment> , true );
If you are already connected to RQM, the adapter will automatically attach the files you indicate to RQM. So, at the end you will see the results step by step, and if the script ends correctly RQM will show the script as "passed".
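As a rough sketch of that idea (the TestStep type is a placeholder for whatever your spreadsheet driver actually produces, and the exact logTestResult overloads should be checked against your RFT version), the driver could report one verdict per spreadsheet row:

import com.rational.test.ft.script.RationalTestScript;

// Hypothetical reporter: iterates the verdicts produced by the spreadsheet driver
public class ResultReporter extends RationalTestScript {

    // Placeholder for a row read from the spreadsheet
    static class TestStep {
        String name;
        boolean passed;
        TestStep(String name, boolean passed) { this.name = name; this.passed = passed; }
    }

    public void reportResults(java.util.List<TestStep> steps) {
        for (TestStep step : steps) {
            // One pass/fail entry per test step; the RQM adapter picks these up from the log
            logTestResult(step.name, step.passed);
        }
    }
}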
Thanks for the answer, Juan. I solved this by passing the test case name from the Script Argument part of RQM and fetching the arguments in my starter script as shown below:
public void testMain(Object[] args) throws Exception
{
    // The test case name is passed in via the Script Argument field in RQM
    String n = args[0].toString();
    logInfo("Parameter from RQM: " + n);

    // Hand the test case name over to the driver that processes the spreadsheet
    ModuleDriver d = new ModuleDriver();
    d.execute_main(n);
}
Since I have verification points set up for each of the steps in my test cases, the results get reported based on each of those verification points in RQM, which is what I needed.

How to get the result after each test case instead of logging the results into a file?

I'm using PHPUnit Selenium.
Is it possible to get the results of the test case in a JSON format or something and store them in a variable instead of logging them to a file? I don't want to read from a file.
For example, I run the test case testAsdf2. After the test case, I would like to get results similar to the ones below, store them in a variable, and insert the results into a database.
Is it possible to do that?
{"event":"test","suite":"JsonTest","test":"JsonTest::testAsdf2","status":"error","time":10.839267015457,"trace":[],"message":"\nInvalid response while accessing the Selenium Server at 'http:\/\/localhost:4444\/selenium-server\/driver\/': ERROR: Element id=asdfasfe not found","output":""}

Logging additional custom information in PHPUnit for reports

I am using PHPUnit Selenium for functional testing of my project.
I am using the junit log type for logging and using the log file to generate the report. The following is the log tag in phpunit.xml:
<phpunit>
<logging>
<log type="junit" target="reports/logfile.xml" logIncompleteSkipped="false" />
</logging>
</phpunit>
Then I use the logfile.xml to generate the report.
What I am looking for is the ability to log additional information (information telling what exactly is being tested by an assertion, in both cases, i.e. whether the assertion passes or fails).
Basically, in the reports I want to state what is being asserted, and that information will be written manually by the test writer in the test case along with the assertion.
The assert functions come with an optional third parameter for a message, but that is shown only on failure.
Eg:
<?php
// $accountExists is a dummy variable that would probably be set by checking the database for the existence of the record
$this->assertEquals(true, $accountExists, 'Expecting for accountExists to be true');
?>
The above will show the message on failure but not when the test passes.
You must use the
--printer command line argument to point to a custom printer class:
http://www.phpunit.de/manual/3.6/en/extending-phpunit.html#extending-phpunit.PHPUnit_Framework_TestListener
In your endTest function, whatever you put in printf will show up in your log file.
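As a rough sketch against that PHPUnit 3.6-era API (the class name and output format here are made up, and newer PHPUnit versions use different class names), a custom printer could look like this:

<?php
// Hypothetical printer: extends the standard result printer and emits a line
// for every finished test, whether it passed or failed.
class MyResultPrinter extends PHPUnit_Framework_ResultPrinter
{
    public function endTest(PHPUnit_Framework_Test $test, $time)
    {
        if ($test instanceof PHPUnit_Framework_TestCase) {
            // This output ends up in the run output alongside the normal results
            printf("Finished %s (%.3fs)\n", $test->getName(), $time);
        }
        parent::endTest($test, $time);
    }
}
?>

You would then run something like phpunit --printer MyResultPrinter MyTest.php, making sure the class can be found (for example by including its file via --bootstrap); the message passed to printf is up to the test writer, so it could carry the assertion description you want in the report.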