I'm writing some integration tests with Karate, run by the Maven Failsafe plugin.
There are two features like this:
Feature: example feature 1
Scenario:
Given url 'http://httpbin.org/'
When method get
Then status 200
My "suite" is:
import com.intuit.karate.Results;
import com.intuit.karate.Runner;
import org.junit.Test;
import static org.junit.Assert.assertEquals;

public class ApiIT {
    @Test
    public void testParallel() {
        Results results = Runner.path("classpath:.").tags("~@ignore").parallel(5);
        assertEquals(results.getErrorMessages(), 0, results.getFailCount());
    }
}
When I run the integration tests with Maven (mvn clean install) I get:
Karate version: 0.9.6.RC4
======================================================
elapsed: 1.41 | threads: 5 | thread time: 1.39
features: 2 | ignored: 0 | efficiency: 0.20
scenarios: 2 | passed: 2 | failed: 0
======================================================
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.558 sec - in ApiIT
Is there any way to count the real tests so I can get this in the logs:
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.558 sec - in ApiIT
I uploaded an example project here: https://github.com/italktothewind/test-count
Nope. Ignore the last line, that is the JUnit report, because you have a single @Test annotation. What matters here is the Karate output. JUnit just makes it simpler to call Karate. But if it bothers you so much, call the Runner using a Java main method.
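If you go that route, here is a minimal sketch of a plain Java entry point (the class name and exit handling are just illustrative; it uses the same Runner API as the test above):

import com.intuit.karate.Results;
import com.intuit.karate.Runner;

public class ApiRunner {
    public static void main(String[] args) {
        // run all features on the classpath, skip @ignore, use 5 threads
        Results results = Runner.path("classpath:.").tags("~@ignore").parallel(5);
        System.out.println(results.getErrorMessages());
        // non-zero exit code if any scenario failed
        System.exit(results.getFailCount() == 0 ? 0 : 1);
    }
}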
I'm using Karate Netty 0.9.6 on Windows 10 with OpenJDK 14.0.2.
I'm trying to read data from a json file in a feature file.
The following code fails:
Scenario: Get the credit balance
* def data = read('classpath:examples1/user_credit_balance_get.json')
My console output looks as follows:
Karate version: 0.9.6
======================================================
elapsed: 2.31 | threads: 1 | thread time: 0.02
features: 1 | ignored: 0 | efficiency: 0.01
scenarios: 1 | passed: 0 | failed: 1
======================================================
failed features:
features.protect_a_prospect: protect_a_prospect.feature:4 - evaluation (js) failed: read('classpath:examples1/user_credit_balance_get.json'), java.lang.RuntimeException: evaluation (js) failed: ?{
"session_data": {
"user_id": "101",
"session_id": "dslkdaskljd",
"token": "02389poasklj"
},
"call_data": {
"user_id": "101"
}
}, javax.script.ScriptException: <eval>:2:18 Expected ; but found :
"session_data": {
^ in <eval> at line number 2 at column number 18
stack trace: jdk.scripting.nashorn/jdk.nashorn.api.scripting.NashornScriptEngine.throwAsScriptException(NashornScriptEngine.java:477)
stack trace: com.intuit.karate.ScriptBindings.eval(ScriptBindings.java:155)
com.intuit.karate.exception.KarateException: there are test failures !
at ...(.)
This leads me to believe that Karate is trying to read my json file as if it were JavaScript.
What could be the reason for this behaviour?
-- Edit --
Using karate.readAsString instead of read works as a workaround for me:
Scenario: Get the credit balance
* def data = karate.readAsString('classpath:examples1/user_credit_balance_get.json')
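If you need the data as JSON rather than a raw string, converting it back may also work; this is an untested sketch using karate.fromString (check the docs for your Karate version):

Scenario: Get the credit balance
* def raw = karate.readAsString('classpath:examples1/user_credit_balance_get.json')
* def data = karate.fromString(raw)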
This is mighty confusing: you say Karate mocks, but then you show the log for a Karate test. Karate should never try to evaluate a *.json file.
I think the best thing to do is to follow this process: https://github.com/intuit/karate/wiki/How-to-Submit-an-Issue
I'm learning how to add unit tests to an Objective-C project using Xcode 9. I've created a command-line project from scratch called Foo and then added a new target to the project called FooTests. Afterwards I edited Foo's scheme to add FooTests. However, whenever I run the tests (i.e., menu Product -> Test), Xcode 9 throws the following error:
Test target FooTests encountered an error (Early unexpected exit, operation never finished bootstrapping - no restart will be attempted)
However, when I try to run the tests by calling xcodebuild from the command line, it seems that all unit tests are executed correctly. Here's the output:
a483e79a7057:foo ram$ xcodebuild test -project foo.xcodeproj -scheme foo
2020-05-15 17:39:30.496 xcodebuild[53179:948485] IDETestOperationsObserverDebug: Writing diagnostic log for test session to:
/var/folders/_z/q35r6n050jz5fw662ckc_kqxbywcq0/T/com.apple.dt.XCTest/IDETestRunSession-E7DD2270-C6C2-43ED-84A9-6EBFB9A4E853/FooTests-8FE46058-FC4A-47A2-8E97-8D229C5678E1/Session-FooTests-2020-05-15_173930-Mq0Z8N.log
2020-05-15 17:39:30.496 xcodebuild[53179:948484] [MT] IDETestOperationsObserverDebug: (324DB265-AD89-49B6-9216-22A6F75B2EDF) Beginning test session FooTests-324DB265-AD89-49B6-9216-22A6F75B2EDF at 2020-05-15 17:39:30.497 with Xcode 9F2000 on target <DVTLocalComputer: 0x7f90b2302ef0 (My Mac | x86_64h)> (10.14.6 (18G4032))
=== BUILD TARGET foo OF PROJECT foo WITH CONFIGURATION Debug ===
Check dependencies
=== BUILD TARGET FooTests OF PROJECT foo WITH CONFIGURATION Debug ===
Check dependencies
Test Suite 'All tests' started at 2020-05-15 17:39:30.845
Test Suite 'FooTests.xctest' started at 2020-05-15 17:39:30.846
Test Suite 'FooTests' started at 2020-05-15 17:39:30.846
Test Case '-[FooTests testExample]' started.
Test Case '-[FooTests testExample]' passed (0.082 seconds).
Test Case '-[FooTests testPerformanceExample]' started.
/Users/ram/development/objective-c/foo/FooTests/FooTests.m:36: Test Case '-[FooTests testPerformanceExample]' measured [Time, seconds] average: 0.000, relative standard deviation: 84.183%, values: [0.000006, 0.000002, 0.000001, 0.000002, 0.000001, 0.000001, 0.000001, 0.000001, 0.000001, 0.000001], performanceMetricID:com.apple.XCTPerformanceMetric_WallClockTime, baselineName: "", baselineAverage: , maxPercentRegression: 10.000%, maxPercentRelativeStandardDeviation: 10.000%, maxRegression: 0.100, maxStandardDeviation: 0.100
Test Case '-[FooTests testPerformanceExample]' passed (0.660 seconds).
Test Suite 'FooTests' passed at 2020-05-15 17:39:31.589.
Executed 2 tests, with 0 failures (0 unexpected) in 0.742 (0.743) seconds
Test Suite 'FooTests.xctest' passed at 2020-05-15 17:39:31.589.
Executed 2 tests, with 0 failures (0 unexpected) in 0.742 (0.744) seconds
Test Suite 'All tests' passed at 2020-05-15 17:39:31.590.
Executed 2 tests, with 0 failures (0 unexpected) in 0.742 (0.745) seconds
** TEST SUCCEEDED **
Does anyone know how to add unit tests to an Xcode 9 project for a command-line application? What's the right way of doing this, and what am I doing wrong?
I'm using CTest to run tests written with cmocka. I'd like to know if it's possible to have CTest read the test names from my cmocka source and show them in its output. For example, my test source contains 3 tests: test_order_correct, test_order_received and test_customer_happy. If I build these into an executable called tests and run it with CTest, the only output that I get is:
Test project .......
Start 1: tests
1/1 Test #1: tests ......................... Passed 0.00 sec
100% tests passed, 0 tests failed out of 1
Total Test time (real) = 0.01 sec
I'd like to see:
Test project .......
Start 1: test_order_correct
1/3 Test #1: test_order_correct .......................... Passed 0.00 sec
Start 2: test_order_received
2/3 Test #2: test_order_received ......................... Passed 0.00 sec
Start 3: test_customer_happy
3/3 Test #3: test_customer_happy ......................... Passed 0.00 sec
100% tests passed, 0 tests failed out of 3
Total Test time (real) = 0.01 sec
Is this possible, or is CTest not capable of delving into the source like that? As I type this, it seems less and less possible by the word.
If you call 'make test' it only gives you reduced output. To be more verbose just call 'ctest -V' in the build directory.
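If you really want each cmocka test to show up as a separate line in the CTest output, another option is to register every test individually in CMake. This is only a sketch: it assumes your tests executable accepts a test name as its first argument and runs just that test, which cmocka does not do out of the box (you would have to wire it up yourself, e.g. with cmocka_set_test_filter()):

set(cmocka_tests
    test_order_correct
    test_order_received
    test_customer_happy)

foreach(test_name IN LISTS cmocka_tests)
  # each cmocka test becomes its own CTest entry
  add_test(NAME ${test_name} COMMAND tests ${test_name})
endforeach()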
I am getting the following information in the console when the test is executed by Arquillian.
Apr 15, 2014 7:41:56 PM org.jboss.arquillian.protocol.jmx.JMXMethodExecutor invoke
SEVERE:Failed:com.bidis.bridge.systemlog.server.facade.SystemLogTest.testInsertSystemLog1
Apr 15, 2014 7:41:56 PM org.jboss.arquillian.protocol.jmx.JMXMethodExecutor invoke
SEVERE:Failed:com.bidis.bridge.systemlog.server.facade.SystemLogTest.testInsertSystemLog
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.272 sec - in com.bidis.bridge.systemlog.server.facade.SystemLogTest
Results :
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0
The tests are green. But one of the tests, which is supposed to insert a record into the DB, is also green even though no records are inserted.
I am not able to figure out what is happening here. Why is the SEVERE: Failed message there after the JMX invoke?
Any input on this would be appreciated.
Thank you
Sanjeev.
We had the same issue. It was a bug in Arquillian 1.1.4 and it's fixed in 1.1.5. Simply updating helped us.
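For reference, updating typically just means bumping the Arquillian BOM version in your pom.xml; the coordinates below are the standard Arquillian BOM, so adjust them to match however your project imports Arquillian:

<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.jboss.arquillian</groupId>
      <artifactId>arquillian-bom</artifactId>
      <version>1.1.5.Final</version>
      <type>pom</type>
      <scope>import</scope>
    </dependency>
  </dependencies>
</dependencyManagement>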
I'm trying to pass parameters to a gtest test suite from cmake:
add_test(NAME craft_test
COMMAND craft --gtest_output='xml:report.xml')
The issue is that these parameters are being passed surrounded by quotes. Why? It looks like a bug; is there a good way of avoiding it?
$ ctest -V
UpdateCTestConfiguration from :/usr/local/src/craft/build-analyze/DartConfiguration.tcl
UpdateCTestConfiguration from :/usr/local/src/craft/build-analyze/DartConfiguration.tcl
Test project /usr/local/src/craft/build-analyze
Constructing a list of tests
Done constructing a list of tests
Checking test dependency graph...
Checking test dependency graph end
test 1
Start 1: craft_test
1: Test command: /usr/local/src/craft/build-analyze/craft "--gtest_output='xml:report.xml'"
1: Test timeout computed to be: 9.99988e+06
1: WARNING: unrecognized output format "'xml" ignored.
1: [==========] Running 1 test from 1 test case.
1: [----------] Global test environment set-up.
1: [----------] 1 test from best_answer_test
1: [ RUN ] best_answer_test.test_sample
1: [ OK ] best_answer_test.test_sample (0 ms)
1: [----------] 1 test from best_answer_test (0 ms total)
1:
1: [----------] Global test environment tear-down
1: [==========] 1 test from 1 test case ran. (0 ms total)
1: [ PASSED ] 1 test.
1/1 Test #1: craft_test ....................... Passed 0.00 sec
100% tests passed, 0 tests failed out of 1
Total Test time (real) = 0.00 sec
It's not the quotes that CMake adds that are the problem here; it's the single quotes in 'xml:report.xml' that are at fault.
You should do:
add_test(NAME craft_test
COMMAND craft --gtest_output=xml:report.xml)
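If you also want the report written to a predictable location no matter which directory ctest runs the test from, you can give gtest an absolute path, for example using the build directory:

add_test(NAME craft_test
         COMMAND craft --gtest_output=xml:${CMAKE_BINARY_DIR}/report.xml)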