Integrating an RFT test framework with RQM

I designed a keyword-driven framework in RFT where the test cases are written in a spreadsheet specifying the data source, object, and keyword for each test step, and a driver script processes this data and routes each step to the appropriate method. Now I want to integrate this with RQM so that each of my test cases in the spreadsheet is shown as passed/failed in RQM. Any ideas?

You could have your driver read the test cases from the spreadsheet and report each result to RQM with logTestResult, attaching any supporting files.
For example:
logTestResult( <your attachment> , true );
If you are already connected to RQM, the adapter automatically attaches the files you indicate to RQM. At the end you will see the results step by step, and if the script finishes without failures RQM will show the script as "passed".
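Here is a minimal sketch of that driver loop, assuming the steps have already been read out of the spreadsheet into memory; the String[][] layout and the runStep() dispatch are hypothetical stand-ins for your own framework:

import com.rational.test.ft.script.RationalTestScript;

public class SpreadsheetDriver extends RationalTestScript {
    public void testMain(Object[] args) {
        // {step name, keyword, data} -- hypothetical spreadsheet rows
        String[][] steps = {
            {"Open login page", "click", "loginButton"},
            {"Verify title", "verifyText", "Welcome"},
        };
        for (String[] step : steps) {
            boolean passed = runStep(step[1], step[2]);
            // logTestResult writes a pass/fail verdict into the RFT log;
            // when the script is launched from RQM, the adapter forwards
            // each verdict into the RQM execution result.
            logTestResult(step[0], passed);
        }
    }

    private boolean runStep(String keyword, String data) {
        // hypothetical dispatch to the method implementing the keyword
        return true;
    }
}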

Thanks for the answer Juan. I solved this by passing the test case name from the Script Argument section of RQM and fetching the arguments in my starter script as shown below:
public void testMain(Object[] args) throws Exception
{
    // the value entered under "Script Argument" in RQM arrives in args
    String n = args[0].toString();
    logInfo("Parameter from RQM: " + n);
    ModuleDriver d = new ModuleDriver();
    d.execute_main(n);
}
Since I have verification points set up for each of the steps in my test cases, the results get reported in RQM based on each of those verification points, which is what I needed.
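For reference, a minimal sketch of one such per-step verification point, assuming RFT's vpManual API; the helper and the values are hypothetical placeholders, and the code would live inside a script extending RationalTestScript:

// inside a method of a class extending RationalTestScript
String expected = "Welcome";
String actual = getActualTitle();   // hypothetical helper for this step
// performTest() logs its own pass/fail entry, which the RQM adapter
// surfaces as a per-step result in the execution record.
vpManual("Step 1 - title check", expected, actual).performTest();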

Related

Polarion: xUnitFileImport creates duplicate testcases instead of referencing existing ones

I have the xUnitFileImport scheduled job configured in my Polarion project (as described in the Polarion documentation) to import e2e test results (formatted as JUnit test results):
<job cronExpression="0 0/5 * * * ? *" id="xUnitFileImport" name="Import e2e Tests Results" scope="system">
    <path>D:\myProject\data\import-test-results\e2e-gitlab</path>
    <project>myProject</project>
    <userAccountVaultKey>myKey</userAccountVaultKey>
    <maxCreatedDefects>10</maxCreatedDefects>
    <maxCreatedDefectsPercent>5</maxCreatedDefectsPercent>
    <templateTestRunId>xUnit Build Test</templateTestRunId>
    <idRegex>(.*).xml</idRegex>
    <groupIdRegex>(.*)_.*.xml</groupIdRegex>
</job>
This works and I get my test results imported into a new test run, and new test cases are created. But if I run the import job multiple times (once for each test run), it creates duplicate test case work items even though they have the same name.
Is there some way to tell the import job to reference the existing test cases from the newly created test run, instead of creating new ones?
What I have done so far:
Yes, I checked that the "custom field for test case id" in Testing > Configuration is configured.
Yes, I checked that the field value is really set in the created test case. The current value in this field is e.g. ".Login", as I don't want the class names in the report.
Yes, I still get the same behaviour with the class name set.
In the scheduler I changed the job parameter for the group id because it wasn't filled. The new value is: <groupIdRegex>e2e-results-(.*).xml</groupIdRegex>
I checked that no other custom fields are interfering; only the standard fields are set.
I checked that no read-only fields are present.
I do use a template for the test cases, as supported by the xUnitFileImport. The test cases are successfully created and I don't see anything that would interfere. However, I do have a hyperlink set in the template (I'll try removing this soon™).
I changed the test run template from "xUnit Build Test" to "xUnit Manual Test Upload"; this did not lead to any visible change.
I changed the template status from draft to active. No change in behaviour.
I triple-checked all the fields in the created test cases. They are literally the same, which leads me to conclude that no fields in the test cases interfere with referencing them.
After all the time I have invested researching on my own and asking on different forums, I am ready to call this a Polarion bug unless someone can show this functionality working.
I believe you have to set a custom field that carries the test case ID from the xUnit file you're importing, so the importer can identify the existing test case.
Try adding a custom field to the TestCase work item and selecting it under the "Custom Field for Test Case ID" option in Testing > Configuration.
If you're planning on creating test cases beforehand, note that the ID is formed from {classname}.{name} for a given case.
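A small illustration of how that ID is put together, assuming a typical JUnit result element; the values are hypothetical, and note how an empty classname degenerates to an ID like ".Login", matching the field value observed above:

// from a JUnit result like <testcase classname="com.example.LoginTests" name="Login"/>
String classname = "com.example.LoginTests";
String name = "Login";
String testCaseId = classname + "." + name;  // "com.example.LoginTests.Login"
// with the classname stripped out, the same formula yields ".Login"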

RobotFramework / Selenium: How to set screenshot-name to testcase-name on failure

I was wondering if the following is possible. Let's say I have 3 test cases with the following results in RIDE:
Testcase Easter -- PASS
Testcase Christmas -- FAIL
Testcase Foo -- PASS
I want to take a screenshot named testcase_christmas.png (or with ' ' instead of '_', that does not matter). Is there a way to do this dynamically, something like:
${testcase}=    Get Testcase Name
Capture Page Screenshot    ${testcase}
or anything like that? I am using:
Python 2.7.x (latest) 32 bit
wxPython 2.8 32 bit
geckodriver latest 64 bit
Robot Framework automatically sets the variable ${TEST NAME} to the name of the currently executing test (see Automatic Variables in the user guide).
The documentation for SeleniumLibrary's Capture Page Screenshot shows that you can give it a filename as the first argument.
Putting those two together, you can do this:
Capture Page Screenshot    ${TEST NAME}.png
The way I would go about this is to create a test teardown that uses Robot Framework's automatic variables, documented here: http://robotframework.org/robotframework/latest/RobotFrameworkUserGuide.html#automatic-variables
Your keywords page / resource file should have a Load Test Data keyword that reads data named after the test, setting a test variable you can use to name the screenshot.
*** Keywords ***
Load Test Data
    ${data}=    Get File    ${TEST NAME}.txt
    Set Test Variable    ${data}    ${data}

Common Test Teardown
    # take the screenshot only when the test has failed
    Run Keyword If Test Failed    Capture Page Screenshot    ${data}.png
Your test should call whatever test teardown you decide to use.
*** Settings ***
Library       OperatingSystem    # provides Get File
Library       SeleniumLibrary    # provides Capture Page Screenshot
Test Setup    Load Test Data

*** Test Cases ***
Test Case A
    My keywords
    [Teardown]    Common Test Teardown
Calling the test setup loads data named after each test in your file, and if the test fails, the teardown takes a screenshot named after what was loaded in the setup.
If you want to save the screenshots per test case, i.e. a separate folder for all screenshots related to each test case, you can use:
Set Screenshot Directory    ./Screenshots/${SUITE NAME}/${TEST NAME}
Capture Page Screenshot    ABC.png
A Screenshots directory will be created in the project root folder, with the screenshots stored in separate folders per test suite and test case.
For a single suite you can use:
Set Screenshot Directory    ./Screenshots/${TEST NAME}

How to write a custom failure message for the failed step in Cucumber-Java ExtentReports

I want to write custom failure message in my Cucumber ExtentReports.
Tools used:
Cucumber
Java
Selenium
JUnit
ExtentReports
What's happening now:
I have a Cucumber scenario:
Given something
When I do something
Then this step fails
The failed step fails with:
Assert.assertTrue("CUSTOM_FAIL_MSG", some_condition);
In the ExtentReport, I see the default assertion error with its stack trace logged against the failed step.
What I want to achieve: show my custom message ("CUSTOM_FAIL_MSG") for the failed step instead.
What I have researched so far:
There is a scenario.write("") function, but this creates a new info log entry in the report (whereas I am looking for a custom failure message rather than a new log entry).
scenario.stepResults holds the string which is displayed in the report; however, I could not find a way to set a value on it.
Any ideas on this?
Have you tried using the createLabel markup helper?
Here is how to do it for the FAILED test:
test.log(Status.FAIL, MarkupHelper.createLabel(result.getName()+" Your MSG here!", ExtentColor.RED));
and the PASSED test:
test.log(Status.PASS, MarkupHelper.createLabel(result.getName()+" Test Case PASSED", ExtentColor.GREEN));
You can easily adapt the string part (e.g. with variable interpolation) to your needs.
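A minimal sketch of wiring this into a Cucumber hook, assuming the com.aventstack.extentreports API and the cucumber.api hook annotations; the ExtentTest field is assumed to be initialised by your own reporter setup:

import com.aventstack.extentreports.ExtentTest;
import com.aventstack.extentreports.Status;
import com.aventstack.extentreports.markuputils.ExtentColor;
import com.aventstack.extentreports.markuputils.MarkupHelper;
import cucumber.api.Scenario;
import cucumber.api.java.After;

public class Hooks {
    private ExtentTest test;   // assumed: created for the scenario by your reporter setup

    @After
    public void afterScenario(Scenario scenario) {
        if (scenario.isFailed()) {
            // log a red label carrying the custom message for the failed scenario
            test.log(Status.FAIL, MarkupHelper.createLabel(
                    scenario.getName() + " FAILED: your custom message here",
                    ExtentColor.RED));
        }
    }
}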
Does this help?
Try replacing the JUnit assertion library with the TestNG one. Also, with Cucumber you can already see the failed step, so why change this "narrative" log? If you want a better report, try a third-party library.
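For what it's worth, the two assertion APIs differ in where the custom message goes; a minimal comparison (someCondition is a placeholder):

// JUnit: the message is the first argument
org.junit.Assert.assertTrue("CUSTOM_FAIL_MSG", someCondition);
// TestNG: the message is the last argument
org.testng.Assert.assertTrue(someCondition, "CUSTOM_FAIL_MSG");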

Retrieve response from a "Run Test Step", using SoapUI/ Groovy?

In SoapUI, I have a host test case which executes another, external test case (with several test steps) using a "Run Test Case" test step. I need to access a response from the external TC from within my host TC, since I need to assert on some values.
I cannot transfer the properties since they are in XML. Could I get some pointers on how to leverage Groovy/SoapUI for this?
For the response you can use the code below:
testRunner.testCase.getTestStepByName("test step").testRequest.response.responseContent
In your external TC, create another property and at the end of the TC use a Property Transfer step to transfer your XML node to it. In your host TC, just read that property as you would any other.
I also had a look around to see if this can be done from Groovy. The SoapUI documentation says that you need to refer to the external test suite / test case by name:
def tc = testRunner.testCase.testSuite.project.testSuites["external TestSuite"].testCases["external TestCase"]
def ts = tc.testSteps["test step"]
But I could not find how to get at the response after that.
In addition to Guest's and SiKing's answers, I'll share a solution to a problem I ran into:
If your step is not of type 'request' but 'calltestcase', you cannot use Guest's answer.
I have a lot of requests, each contained in its own test case, and my other test cases call these test cases each time I need to launch a request.
I configured my request test cases to return the response as a custom property that I call "testResponse", so I can easily access it from my other test cases.
I ran into a problem in the following configuration:
I have a 'calltestcase' step that gives me a request result. Further in the test, a Groovy script needs to call this step and get the response value.
If I use this solution:
testRunner.runTestStepByName("test step")
followed by:
testRunner.testCase.getTestStepByName("test step").testRequest.response.responseContent
I'm stuck, as there is no testRequest property on that step class.
The solution that works is:
testRunner.runTestStepByName("test step")
def response_value = context.expand( '${test step#testResponse#$[\'value\']}' )
Another solution is:
testRunner.runTestStepByName("test step")
tStep = testRunner.testCase.getTestStepByName("test step")
response = tStep.getPropertyValue("testResponse")
Then I extract the relevant value from 'response' (in my case it is JSON that I have to parse).
Of course, this only works because I store the request response as a custom property of my request test case.
I hope this is clear enough.

MsTest, DataSourceAttribute - how to get it working with a runtime generated file?

For some tests I need to run a data-driven test with a configuration that is generated via reflection in the ClassInitialize method. I have tried everything, but I just cannot get the data source properly set up.
The test takes a list of classes in a CSV file (one line per class) and then checks that the mappings to the database work, i.e. it tries to get one item from the database for every entity, which throws an exception when the table structure does not match.
The test method is:
[DataSource(
    "Microsoft.VisualStudio.TestTools.DataSource.CSV",
    "|DataDirectory|\\EntityMappingsTests.Types.csv",
    "EntityMappingsTests.Types#csv",
    DataAccessMethod.Sequential)]
[TestMethod()]
public void TestMappings () {
Obviously the file is EntityMappingsTests.Types.csv, and it should be in the DataDirectory.
Now, in the initialize method (marked with ClassInitialize) I put the CSV content together and then try to write it.
Where should I write it to? Where is the DataDirectory?
I tried:
File.WriteAllText(context.TestDeploymentDir + "\\EntityMappingsTests.Types.csv", types.ToString());
File.WriteAllText("EntityMappingsTests.Types.csv", types.ToString());
Both result in "the unit test adapter failed to connect to the data source or read the data". More precisely:
Error details: The Microsoft Jet database engine could not find the object 'EntityMappingsTests.Types.csv'. Make sure the object exists and that you spell its name and the path name correctly.
So where should I put that file?
I also tried just writing it to the current directory and taking out the DataDirectory part - same result. Sadly, there is limited debugging support here.
Use the Process Monitor tool from technet.microsoft.com/en-us/sysinternals/bb896645. Put a filter on MSTest.exe or the associated qtagent32.exe and find out which locations it tries to load from, and at what point in the test loading process. Then update your question here with those details.
After you add the CSV file to your VS project, open its properties and set "Copy to Output Directory" to "Copy always". DataDirectory defaults to the location of the compiled executable, which runs from the output directory, so the file will be found there.