Allure report integrated with Karate shows zero test cases even though 1 test case ran successfully - karate

I integrated Allure reporting with Karate and can successfully generate and view the Allure report, but the test-case count it shows is zero, even though one test case ran and passed.
I'm not sure why it reports zero test cases.
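For what it's worth, a common cause of a zero count is that the directory Allure reads from (typically target/allure-results) contains no result files, because no Allure listener was attached to the JUnit run that Karate drives; the report then renders but shows zero test cases. A minimal sketch of the Maven wiring, assuming JUnit 4 with the io.qameta.allure:allure-junit4 adapter on the test classpath (plugin and AspectJ versions are illustrative):

```xml
<!-- Sketch: register the Allure JUnit 4 listener with Surefire so result JSON
     files are written to target/allure-results (versions are illustrative) -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <argLine>
      -javaagent:"${settings.localRepository}/org/aspectj/aspectjweaver/1.9.4/aspectjweaver-1.9.4.jar"
    </argLine>
    <properties>
      <property>
        <name>listener</name>
        <value>io.qameta.allure.junit4.AllureJunit4</value>
      </property>
    </properties>
  </configuration>
</plugin>
```

If target/allure-results holds no *-result.json files after mvn test, the listener was not picked up, which would match a report that renders but shows zero test cases.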

Related

Azure DevOps - Connect Test Case and Test Run in Report

I am trying to create a report that shows the connection between a test case and its latest test run, as well as any linked defects.
At the moment I can only see the test run connected to the result in the test plan, and the defect connection is missing entirely.
How can I set up a query that displays the test case, the test-run status, and the status of linked defects?
Best Regards
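For context: work-item queries can surface test-case-to-defect links, but test-run outcomes are not exposed to work-item queries at all; they live in the test-results store and surface through the Test Plans progress views or the Analytics (OData) feed. A sketch of the defect side as a direct-links query, assuming defects are linked to test cases with the standard Tested By link type (reference names as in the default process templates):

```sql
SELECT [System.Id], [System.Title], [System.State]
FROM workitemLinks
WHERE [Source].[System.WorkItemType] = 'Test Case'
  AND [System.Links.LinkType] = 'Microsoft.VSTS.Common.TestedBy-Reverse'
  AND [Target].[System.WorkItemType] = 'Bug'
MODE (MustContain)
```

Run status would still have to be joined in from Analytics or read off the test-plan views, since WIQL has no notion of test results.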

How can reports be generated in Cucumber while running multiple test cases?

I am working on automating test cases with the Cucumber 4.0.0 framework and Selenium. Each test case corresponds to a feature file.
When I run multiple test cases, the report is generated only at the end of the entire run. What I would like is for a report to be produced at the end of each test case, rather than only once all cases have finished.
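One workaround, sketched below as a dry run: invoke Cucumber once per feature file, giving each invocation its own report path, so a report exists as soon as each feature finishes. The feature names, paths and the Maven call are illustrative assumptions; with Cucumber 4 the options can be passed via the cucumber.options system property.

```shell
# Dry run: print one Cucumber invocation per feature file, each writing its own
# JSON report. Drop the "echo" to actually execute the commands.
for f in login.feature checkout.feature; do
  name="${f%.feature}"   # login.feature -> target/cucumber-login.json
  echo mvn test -Dcucumber.options="classpath:features/$f --plugin json:target/cucumber-$name.json"
done
```

Running each feature in its own JVM costs some startup time per feature, but it guarantees a report is on disk the moment that feature completes.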

Problem running automated Tests using TFS, in correct order

I am using the TFS Web Portal to create test suites for automated testing, and from a Visual Studio test project I can associate any [TestMethod] with any TFS test case. The tests can then be run from TFS.
As you can see below, my test suite has some setup tests followed by the actual tests. I have a test method that sets the folder where my test suite will then save its screenshots; I then launch the application under test and log in. Next I run a test, set the screenshot prefix to requirement number 1581 (as that is the requirement being tested), and repeat this for requirements 2062 and 2061.
The problem is that these tests run in the wrong order.
I know about ordered tests, and I know I could simply put all of this in an ordered test and run it. In fact I have used ordered tests; I and one other tester on the team have Visual Studio installations and the skills to use them. The requirement from the rest of the team (manual testers and managers), however, is that I split my tests up (as shown below).
When I created the test suite below, it all seemed to work; I could run the first 5 tests. Encouraged by this, I added the last 4 tests to make a 9-test suite, and then it all started to fail.
Now, when I run all 9 tests, the 4th test (which had worked fine) fails because TFS somehow runs it in the wrong order. I know this because my tests each create one or more screenshots; the screenshots are produced, but in the wrong order.
I know many people will tell me to use ordered tests and that each test should be self-contained. The reason I am deviating from that is that I have 4,000 tests verifying 3,000 requirements. If I want to test that a member record can have attributes X, Y and Z, that is 3 requirements. I want one setup test (launch the application, log in, create a basic record), then one test that X works, one that Y works and one that Z works; with 3 requirements I need at least 3 tests. If every test repeated the full setup (launch the app, log in, create a basic record), the suite would simply take too long to run.
I also want the tests split up because manual testers with no programming skills or Visual Studio experience will be expected to look at my test suites, find the failed tests and re-test manually. If each of my test suites had only one test in TFS, they would not know what went wrong. If a suite has 9 tests and 1 fails, they know exactly what to re-test manually.
Finally, I have seen a similar question relating to Microsoft Test Manager. This is not a duplicate: MTM is no longer supported, so I am using the Web Portal (as pictured). I have also seen a question relating to Visual Studio 2010; I am using Visual Studio 2019, and VS has changed a lot since then.

Nunit3: how to save the test results while tests are running via Console Runner

The NUnit3 console runner saves test results as XML at the end of the test run. In my case, however, I want to save partial results on each test failure, so that if the console runner crashes or is stopped, I can still get results for the tests that completed.
Is there a way to do that?
The result XML is written for every test run, regardless of whether tests pass or fail, but only once the run completes. If you need data that survives a crash or an abort, you could additionally write each test result to a log file as it happens, so that partial results exist even when the final XML is never produced.
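As a stopgap, sketched below as a dry run: the console runner's --labels option prints each test name to standard output as it runs, so piping that through tee leaves a live log of which tests completed even if the final XML is never written. The assembly and file names are placeholders; a fuller solution would be an NUnit engine extension that records results incrementally.

```shell
# Dry run: build the command that streams per-test labels to a log alongside
# the final XML result file. Remove the echo to actually run it.
cmd='nunit3-console MyTests.dll --labels=All --result=TestResult.xml'
echo "$cmd | tee partial-run.log"
```

The log does not contain pass/fail verdicts in XML form, but after a crash it tells you exactly which tests had already run.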

No link to the test results in the Build Results reports in TFS 2008

Is there a way to display the test results on the build results page returned by Build.aspx? Right now you only see the total number of tests and the counts of passed and failed tests, as in this sample:
Result details for Any CPU/Debug: 0 errors, 90 warnings, 12 tests total, 4 tests passed, 8 tests failed

Errors and Warnings: 0 errors, 90 warnings
Test Results: 1 test runs completed, 12 tests total, 4 passed, 8 failed

Test Run                                        Run By             Total  Passed  Failed
TFSBUILD#TFB 2008-01-15 15:23:42_Any CPU_Debug  PROJECTA\TFSBUILD  12     4       8
Furthermore, in the Visual Studio 2008 GUI the test run can be opened, because it is a link pointing to the .trx file in the TestResults subfolder of the \Build\ folder; the Build.aspx page, however, does not render it as a link. Better still would be to see the list of tests and, for each one, whether it passed.
I mention this because we use the Team Foundation Build Notification tool from the TFS 2008 Power Tools, and right now you can see that a build failed or partially succeeded, but you cannot see the list of tests.
It has been one of my constant frustrations with VSTS that, out of the box, test results are not visible to anyone who does not have the VS Test SKU. See this question on how to at least convert .trx files into HTML.
As for the build summary itself, last I heard there is no way to customize it, nor are there plans to allow it in the future. You could, however, transform the .trx test results file yourself and post the HTML results to a known location.
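The transform-and-post idea can be sketched as a dry run like this; msxsl.exe is Microsoft's command-line XSLT tool, while the stylesheet name and the target share are placeholders you would substitute:

```shell
# Dry run: print the commands that would convert the .trx XML to HTML and copy
# it to a shared location. Remove the echoes to actually execute them.
transform='msxsl.exe TestResults.trx trx-to-html.xslt -o TestResults.html'
publish='copy TestResults.html \\buildserver\drops\reports\'
echo "$transform"
echo "$publish"
```

A post-build step in the build script could run both commands after each build, so the notification tool's audience always has a readable report to open.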