CTest: pull test names from cmocka source

I'm using CTest to run tests written with cmocka. I'd like to know whether it's possible to have CTest read the test names from my cmocka source and report them individually in its output. For example, suppose my test source contains three tests: test_order_correct, test_order_received and test_customer_happy. If I build these tests into an executable called tests and run it with CTest, the only output I get is:
Test project .......
Start 1: tests
1/1 Test #1: tests ......................... Passed 0.00 sec
100% tests passed, 0 tests failed out of 1
Total Test time (real) = 0.01 sec
I'd like to see:
Test project .......
Start 1: test_order_correct
1/3 Test #1: test_order_correct .......................... Passed 0.00 sec
Start 2: test_order_received
2/3 Test #2: test_order_received ......................... Passed 0.00 sec
Start 3: test_customer_happy
3/3 Test #3: test_customer_happy ......................... Passed 0.00 sec
100% tests passed, 0 tests failed out of 3
Total Test time (real) = 0.01 sec
Is this possible, or is CTest not capable of delving into the source like that? As I type this, it seems less and less possible by the word.

If you call 'make test', it only gives you reduced output. For more verbose output, call 'ctest -V' in the build directory.
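CTest itself cannot parse cmocka sources, but you can get per-test output lines by registering each cmocka test individually in CMake. A sketch, assuming the `tests` executable can be told to run only the test named in its first argument (cmocka does not parse argv by itself; you would wire this up in your own main(), e.g. with cmocka_set_test_filter(), available in cmocka 1.1 and later):

```cmake
# Hypothetical CMakeLists.txt fragment: one CTest entry per cmocka test.
# Assumes the `tests` executable runs only the test named in argv[1].
set(CMOCKA_TEST_NAMES
    test_order_correct
    test_order_received
    test_customer_happy)
foreach(name IN LISTS CMOCKA_TEST_NAMES)
  add_test(NAME ${name} COMMAND tests ${name})
endforeach()
```

With this, `ctest` would report three separate tests, as in the desired output above, at the cost of maintaining the name list (or generating it from the source) yourself.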


Re-run failed tests and merge results (RobotFramework)

I have 3 test suites: test1.robot (10 test cases), test2.robot (3 test cases) and test3.robot (2 test cases).
I run all the test suites from a shell script: robot --variable:ABC --name Testing --outputdir /perf-logs/Testing test1.robot test2.robot test3.robot
I found that there are 2 ways to re-run:
--rerunfailed (for tests) and --rerunfailedsuites (for test suites)
I have some questions:
1/ What is the difference between them (--rerunfailed vs --rerunfailedsuites)?
2/ Assuming I have 2 test cases failed in test suite test1.robot and 1 test case failed in test suite test2.robot, which re-run should I use?
3/ Assuming the first run of the 3 test suites gives me output.xml, and re-running the failed test cases (for the 2 test suites) gives me another output2.xml, can I merge them?
4/ If I only re-run the 1 failed test case (in test1.robot) and get the result in output3.xml, can I merge output3.xml with the first output.xml?
Many thanks
Difference:
https://robotframework.org/robotframework/latest/RobotFrameworkUserGuide.html
-R, --rerunfailed <file>
Selects failed tests from an earlier output file to be re-executed.
-S, --rerunfailedsuites <file>
Selects failed test suites from an earlier output file to be re-executed.
Which to use:
If you want to re-run an entire suite, use --rerunfailedsuites; if you want to re-run only the failed test cases, not the passed tests in the suite, use --rerunfailed (assuming the tests are independent).
To combine the files:
rebot --merge --outputdir . --output final_output.xml output.xml output2.xml
4) Same as above: merge output.xml with output3.xml.
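Putting it together, the rerun-and-merge workflow looks roughly like this (a sketch using the file names from the question; rebot's --merge option is what makes the rerun results replace the original failed results rather than being listed as a separate run):

```
# First run of all three suites; keep its output.xml.
robot --outputdir . --output output.xml test1.robot test2.robot test3.robot

# Re-execute only the tests that failed, writing a second output file.
robot --rerunfailed output.xml --output output2.xml test1.robot test2.robot test3.robot

# Merge the rerun results back over the originals.
rebot --merge --output final_output.xml output.xml output2.xml
```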

Running unit test target on Xcode 9 returns "Early unexpected exit" error

I'm learning how to add unit tests to an Objective-C project using Xcode 9. I've created a command-line project from scratch called Foo and added a new target to it called FooTests. I then edited Foo's scheme to add FooTests. However, whenever I run the tests (i.e., menu "Product" -> "Test"), Xcode 9 throws the following error:
Showing All Messages
Test target FooTests encountered an error (Early unexpected exit, operation never finished bootstrapping - no restart will be attempted)
However, when I run the tests by calling xcodebuild from the command line, it seems that all unit tests are executed correctly. Here's the output:
a483e79a7057:foo ram$ xcodebuild test -project foo.xcodeproj -scheme foo
2020-05-15 17:39:30.496 xcodebuild[53179:948485] IDETestOperationsObserverDebug: Writing diagnostic log for test session to:
/var/folders/_z/q35r6n050jz5fw662ckc_kqxbywcq0/T/com.apple.dt.XCTest/IDETestRunSession-E7DD2270-C6C2-43ED-84A9-6EBFB9A4E853/FooTests-8FE46058-FC4A-47A2-8E97-8D229C5678E1/Session-FooTests-2020-05-15_173930-Mq0Z8N.log
2020-05-15 17:39:30.496 xcodebuild[53179:948484] [MT] IDETestOperationsObserverDebug: (324DB265-AD89-49B6-9216-22A6F75B2EDF) Beginning test session FooTests-324DB265-AD89-49B6-9216-22A6F75B2EDF at 2020-05-15 17:39:30.497 with Xcode 9F2000 on target <DVTLocalComputer: 0x7f90b2302ef0 (My Mac | x86_64h)> (10.14.6 (18G4032))
=== BUILD TARGET foo OF PROJECT foo WITH CONFIGURATION Debug ===
Check dependencies
=== BUILD TARGET FooTests OF PROJECT foo WITH CONFIGURATION Debug ===
Check dependencies
Test Suite 'All tests' started at 2020-05-15 17:39:30.845
Test Suite 'FooTests.xctest' started at 2020-05-15 17:39:30.846
Test Suite 'FooTests' started at 2020-05-15 17:39:30.846
Test Case '-[FooTests testExample]' started.
Test Case '-[FooTests testExample]' passed (0.082 seconds).
Test Case '-[FooTests testPerformanceExample]' started.
/Users/ram/development/objective-c/foo/FooTests/FooTests.m:36: Test Case '-[FooTests testPerformanceExample]' measured [Time, seconds] average: 0.000, relative standard deviation: 84.183%, values: [0.000006, 0.000002, 0.000001, 0.000002, 0.000001, 0.000001, 0.000001, 0.000001, 0.000001, 0.000001], performanceMetricID:com.apple.XCTPerformanceMetric_WallClockTime, baselineName: "", baselineAverage: , maxPercentRegression: 10.000%, maxPercentRelativeStandardDeviation: 10.000%, maxRegression: 0.100, maxStandardDeviation: 0.100
Test Case '-[FooTests testPerformanceExample]' passed (0.660 seconds).
Test Suite 'FooTests' passed at 2020-05-15 17:39:31.589.
Executed 2 tests, with 0 failures (0 unexpected) in 0.742 (0.743) seconds
Test Suite 'FooTests.xctest' passed at 2020-05-15 17:39:31.589.
Executed 2 tests, with 0 failures (0 unexpected) in 0.742 (0.744) seconds
Test Suite 'All tests' passed at 2020-05-15 17:39:31.590.
Executed 2 tests, with 0 failures (0 unexpected) in 0.742 (0.745) seconds
** TEST SUCCEEDED **
Does anyone know how to add unit tests to an Xcode 9 project for a command-line application? If you happen to know, what's the right way of doing this, and what am I doing wrong?

xunit from TFS2015 - where to put ParallelizeAssemblies

I'm trying to speed up our unit tests. By setting xunit.parallelizeAssembly to true in the app.config files, I get tests from different assemblies to run in parallel when run from Visual Studio. But when running on the build server it makes no difference in execution time, and I can see that only one core is used.
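For reference, the app.config change described above looks roughly like this (a sketch; xunit.parallelizeAssembly is the key named above, as read by the xUnit 2.x desktop runners):

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <appSettings>
    <!-- Let the runner execute this assembly in parallel with others. -->
    <add key="xunit.parallelizeAssembly" value="true" />
  </appSettings>
</configuration>
```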
In the paragraph on the MSBuild runner on this page, it is suggested that the ParallelizeAssemblies setting would solve this problem. I'm currently running the tests with the "Visual Studio Test" build step (see image for configuration). Where do I put this setting?
I can't share all of the log but I believe the first and last part might contain good clues.
2017-04-20T16:51:10.5496891Z Executing the powershell script: C:\Tfs_Agent5\tasks\VSTest\1.0.32\VSTest.ps1
2017-04-20T16:51:12.9402898Z ##[debug]Calling Invoke-VSTest for all test assemblies
2017-04-20T16:51:12.9559206Z ##[warning]Install Visual Studio 2015 Update 1 or higher on your build agent machine to run the tests in parallel.
2017-04-20T16:51:13.0027923Z Working folder: C:\Tfs_Agent5\_work\1
2017-04-20T16:51:13.0027923Z Executing C:\Program Files (x86)\Microsoft Visual Studio 14.0\Common7\IDE\CommonExtensions\Microsoft\TestWindow\vstest.console.exe "C:\Tfs_Agent5\_work\1\s\src\AwtSG.CompareToAros\bin\Release\AwtSG.Domain.MetOceanData.UnitTests.dll" "C:\Tfs_Agent5\_work\1\s\src\AwtSG.Domain.Dynamics.UnitTests\bin\Release\AwtSG.Domain.Dynamics.UnitTests.dll" "C:\Tfs_Agent5\_work\1\s\src\AwtSG.Domain.EcdisRouteFiles.UnitTests\bin\Release\AwtSG.Domain.EcdisRouteFiles.UnitTests.dll" "C:\Tfs_Agent5\_work\1\s\src\AwtSG.Domain.EcdisRouteFiles.UnitTests\bin\Release\AwtSG.Domain.NavigationUtilities.UnitTests.dll" "C:\Tfs_Agent5\_work\1\s\src\AwtSG.Domain.GeoSpatial.UnitTests\bin\Release\AwtSG.Domain.Geospatial.UnitTests.dll" "C:\Tfs_Agent5\_work\1\s\src\AwtSG.Domain.GeoSpatial.UnitTests\bin\Release\AwtSG.Domain.NavigationUtilities.UnitTests.dll" "C:\Tfs_Agent5\_work\1\s\src\AwtSG.Domain.MetOceanData.GribApi.UnitTests\bin\Release\AwtSG.Domain.MetOceanData.GribApi.UnitTests.dll" "C:\Tfs_Agent5\_work\1\s\src\AwtSG.Domain.MetOceanData.UnitTests\bin\Release\AwtSG.Domain.MetOceanData.UnitTests.dll" "C:\Tfs_Agent5\_work\1\s\src\AwtSG.Domain.NavigationUtilities.UnitTests\bin\Release\AwtSG.Domain.NavigationUtilities.UnitTests.dll" "C:\Tfs_Agent5\_work\1\s\src\AwtSG.Domain.Optimization.Evolutionary.IntegrationTests\bin\Release\AwtSG.Domain.MetOceanData.UnitTests.dll" "C:\Tfs_Agent5\_work\1\s\src\AwtSG.Domain.Optimization.Evolutionary.IntegrationTests\bin\Release\AwtSG.Domain.Optimization.Evolutionary.UnitTests.dll" "C:\Tfs_Agent5\_work\1\s\src\AwtSG.Domain.Optimization.Evolutionary.IntegrationTests\bin\Release\AwtSG.Domain.TechnicalPerformance.UnitTests.dll" "C:\Tfs_Agent5\_work\1\s\src\AwtSG.Domain.Optimization.Evolutionary.UnitTests\bin\Release\AwtSG.Domain.MetOceanData.UnitTests.dll" "C:\Tfs_Agent5\_work\1\s\src\AwtSG.Domain.Optimization.Evolutionary.UnitTests\bin\Release\AwtSG.Domain.Optimization.Evolutionary.UnitTests.dll" 
"C:\Tfs_Agent5\_work\1\s\src\AwtSG.Domain.Optimization.Evolutionary.UnitTests\bin\Release\AwtSG.Domain.TechnicalPerformance.UnitTests.dll" "C:\Tfs_Agent5\_work\1\s\src\AwtSG.Domain.Optimization.IntegrationTests\bin\Release\AwtSG.Domain.MetOceanData.UnitTests.dll" "C:\Tfs_Agent5\_work\1\s\src\AwtSG.Domain.Optimization.Mesh\bin\Release\AwtSG.Domain.NavigationUtilities.UnitTests.dll" "C:\Tfs_Agent5\_work\1\s\src\AwtSG.Domain.Simulation.IntegrationTests\bin\Release\AwtSG.Domain.MetOceanData.UnitTests.dll" "C:\Tfs_Agent5\_work\1\s\src\AwtSG.Domain.Simulation.UnitTests\bin\Release\AwtSG.Domain.NavigationUtilities.UnitTests.dll" "C:\Tfs_Agent5\_work\1\s\src\AwtSG.Domain.Simulation.UnitTests\bin\Release\AwtSG.Domain.Simulation.UnitTests.dll" "C:\Tfs_Agent5\_work\1\s\src\AwtSG.Domain.TechnicalPerformance.UnitTests\bin\Release\AwtSG.Domain.TechnicalPerformance.UnitTests.dll" "C:\Tfs_Agent5\_work\1\s\src\AwtSG.Domain.Units.UnitTests\bin\Release\AwtSG.Domain.Units.UnitTests.dll" "C:\Tfs_Agent5\_work\1\s\src\AwtSG.Dto.Mapping.UnitTests\bin\Release\AwtSG.Domain.NavigationUtilities.UnitTests.dll" "C:\Tfs_Agent5\_work\1\s\src\AwtSG.Dto.Mapping.UnitTests\bin\Release\AwtSG.Domain.Simulation.UnitTests.dll" "C:\Tfs_Agent5\_work\1\s\src\AwtSG.Dto.Mapping.UnitTests\bin\Release\AwtSG.Dto.Mapping.UnitTests.dll" "C:\Tfs_Agent5\_work\1\s\src\AwtSG.Dto.Simulation.Mapping.UnitTests\bin\Release\AwtSG.Domain.NavigationUtilities.UnitTests.dll" "C:\Tfs_Agent5\_work\1\s\src\AwtSG.Dto.Simulation.Mapping.UnitTests\bin\Release\AwtSG.Domain.Simulation.UnitTests.dll" "C:\Tfs_Agent5\_work\1\s\src\AwtSG.Dto.Simulation.Mapping.UnitTests\bin\Release\AwtSG.Dto.Simulation.Mapping.UnitTests.dll" "C:\Tfs_Agent5\_work\1\s\src\AwtSG.Dto.Simulation.UnitTests\bin\Release\AwtSG.Dto.Simulation.UnitTests.dll" "C:\Tfs_Agent5\_work\1\s\src\AwtSG.Dto.UnitTests\bin\Release\AwtSG.Dto.UnitTests.dll" "C:\Tfs_Agent5\_work\1\s\src\AwtSG.Infrastructure.UnitTests\bin\Release\AwtSG.Infrastructure.UnitTests.dll" 
"C:\Tfs_Agent5\_work\1\s\src\AwtSG.Numerics.UnitTests\bin\Release\AwtSG.Numerics.UnitTests.dll" "C:\Tfs_Agent5\_work\1\s\src\AwtSG.VoyageSimulationValidation\bin\Release\AwtSG.Numerics.UnitTests.dll" "C:\Tfs_Agent5\_work\1\s\src\AwtSG.WindowsService.VoyageSimulation.UnitTests\bin\Release\AwtSG.WindowsService.VoyageSimulation.UnitTests.dll" /Settings:"C:\Tfs_Agent5\_work\1\s\src\all.runsettings" /EnableCodeCoverage /InIsolation /logger:trx /TestAdapterPath:"C:\Tfs_Agent5\_work\1\s\src\packages"
2017-04-20T16:51:13.4090709Z Microsoft (R) Test Execution Command Line Tool Version 14.0.25420.1
2017-04-20T16:51:13.4090709Z Copyright (c) Microsoft Corporation. All rights reserved.
2017-04-20T16:51:15.7373080Z Starting test execution, please wait...
2017-04-20T16:51:19.4718867Z Warning: Diagnostic data adapter message: Could not find diagnostic data adapter 'Code Coverage'. Make sure diagnostic data adapter is installed and try again.
2017-04-20T16:51:33.2378718Z Information: [xUnit.net 00:00:01.2430136] Discovering:
2017-04-20T17:17:09.1501081Z Warning: System.AppDomainUnloadedException: Attempted to access an unloaded AppDomain. This can happen if the test(s) started a thread but did not stop it. Make sure that all the threads started by the test(s) are stopped before completion.
2017-04-20T17:17:10.3845539Z Total tests: 17704. Passed: 17679. Failed: 0. Skipped: 25.
2017-04-20T17:17:10.3845539Z Test Run Successful.
2017-04-20T17:17:10.3845539Z Test execution time: 25.8603 Minutes
2017-04-20T17:17:28.5726606Z Results File: C:\Tfs_Agent5\_work\1\TestResults\tfsservice_US-SUN-TFSBUILD 2017-04-20 09_57_25.trx
2017-04-20T17:17:29.3539333Z Publishing Test Results...
2017-04-20T17:17:44.9950924Z Test results remaining: 17704
2017-04-20T17:17:47.0264093Z Test results remaining: 16704
2017-04-20T17:17:49.0421061Z Test results remaining: 15704
2017-04-20T17:17:53.1985047Z Test results remaining: 14704
2017-04-20T17:17:54.9329389Z Test results remaining: 13704
2017-04-20T17:17:56.5579944Z Test results remaining: 12704
2017-04-20T17:17:58.2299179Z Test results remaining: 11704
2017-04-20T17:17:59.9331076Z Test results remaining: 10704
2017-04-20T17:18:01.5894343Z Test results remaining: 9704
2017-04-20T17:18:03.0113618Z Test results remaining: 8704
2017-04-20T17:18:04.3395079Z Test results remaining: 7704
2017-04-20T17:18:05.6052151Z Test results remaining: 6704
2017-04-20T17:18:06.8083476Z Test results remaining: 5704
2017-04-20T17:18:08.5896555Z Test results remaining: 4704
2017-04-20T17:18:09.9178475Z Test results remaining: 3704
2017-04-20T17:18:11.2304148Z Test results remaining: 2704
2017-04-20T17:18:12.5429604Z Test results remaining: 1704
2017-04-20T17:18:13.8867197Z Test results remaining: 704
2017-04-20T17:18:25.5277535Z Published Test Run :
Note the warning that VS2015 Update 1 must be installed. This is what the About dialog looks like on the build agent (Update 3 is installed):
Parallel runs are available in VS2015 Update 1 and later; make sure that you are using the right VS version to run the tests in parallel.
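For completeness: the ParallelizeAssemblies setting referenced in the question applies to xUnit's MSBuild runner, not to the "Visual Studio Test" build step. If the build were switched to the xunit.runner.msbuild package, it would look roughly like this (a sketch; the @(TestAssemblies) item group is a placeholder for your test DLLs):

```xml
<!-- Hypothetical MSBuild target using the xunit.runner.msbuild task. -->
<Target Name="Test">
  <xunit Assemblies="@(TestAssemblies)" ParallelizeAssemblies="true" />
</Target>
```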

Bamboo vs CxxTest

When I create a plan in Bamboo and add a task that runs CxxTest test code (e.g., a test asserting TS_ASSERT(1==1) or similar), it passes. When I change it to check failure handling (TS_ASSERT(1==2)), the test case fails and Bamboo outputs a log like this:
12-Mar-2014 15:12:07 Failed 1 and Skipped 0 of 2 tests
12-Mar-2014 15:12:07 Success rate: 50%
12-Mar-2014 15:12:07 Failing task since return code was 1 while expected 0
So, does anyone here know how Bamboo can understand the test result, and what the return code is here ("return code was 1 while expected 0")?
From my observation, on Windows Bamboo considers the value of %ERRORLEVEL% to be the result of the task. So "return code was 1 while expected 0" means your task is returning 1 and Bamboo is expecting 0. Yes, it's expecting 0, since it considers any value other than 0 a failure.

How to pass quoted parameters to add_test in cmake?

I'm trying to pass parameters to a gtest test suite from cmake:
add_test(NAME craft_test
COMMAND craft --gtest_output='xml:report.xml')
The issue is that these parameters are being passed surrounded by quotes. Why? It looks like a bug; is there a good way to avoid it?
$ ctest -V
UpdateCTestConfiguration from :/usr/local/src/craft/build-analyze/DartConfiguration.tcl
UpdateCTestConfiguration from :/usr/local/src/craft/build-analyze/DartConfiguration.tcl
Test project /usr/local/src/craft/build-analyze
Constructing a list of tests
Done constructing a list of tests
Checking test dependency graph...
Checking test dependency graph end
test 1
Start 1: craft_test
1: Test command: /usr/local/src/craft/build-analyze/craft "--gtest_output='xml:report.xml'"
1: Test timeout computed to be: 9.99988e+06
1: WARNING: unrecognized output format "'xml" ignored.
1: [==========] Running 1 test from 1 test case.
1: [----------] Global test environment set-up.
1: [----------] 1 test from best_answer_test
1: [ RUN ] best_answer_test.test_sample
1: [ OK ] best_answer_test.test_sample (0 ms)
1: [----------] 1 test from best_answer_test (0 ms total)
1:
1: [----------] Global test environment tear-down
1: [==========] 1 test from 1 test case ran. (0 ms total)
1: [ PASSED ] 1 test.
1/1 Test #1: craft_test ....................... Passed 0.00 sec
100% tests passed, 0 tests failed out of 1
Total Test time (real) = 0.00 sec
It's not the quotes that CMake adds that are the problem here; it's the single quotes in 'xml:report.xml' that are at fault.
You should do:
add_test(NAME craft_test
COMMAND craft --gtest_output=xml:report.xml)
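Note that quotes in CMake are only argument delimiters; they are never passed to the program, and add_test runs the command without a shell. So if an argument genuinely needs to contain spaces, you can quote it in the CMakeLists and CTest will still pass it through as a single argument (a sketch; the file name is made up):

```cmake
add_test(NAME craft_test
         COMMAND craft "--gtest_output=xml:my report.xml")
```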