Is it possible to call a test feature from a MockServer feature (when a request matches) in Karate DSL?

I'm trying to do something that I don't know is even remotely possible.
I have a mock server, and I'd like it to "start another test" when it receives a given request, by calling a test feature. I tried a few things, including the one below, but it turns out that this MockServer scenario does not respond.
Scenario: pathMatches('/ideas')
* def xx = call read('SimpleStart.feature')
* def response = $ideas.*
Is there an elegant way to make this work? A workaround or a suggestion you can give me?
The use case is:
Run some tests, have some external services invoke the mock server, and whenever the mock server receives a request, have it trigger other tests.
Thanks in advance.

Yeah, Karate certainly isn't designed to do that. The recommended pattern is to set up your mocks and tests from a Java "runner" for maximum control, and that's what most teams do.
In short, "orchestrate" things from Java code.
That said, see if this gives you some other creative ideas: https://twitter.com/getkarate/status/1417023536082812935

Related

Mock a client to a dependency when doing integration tests

I have a service composed of several microservices working in a workflow. Some of these microservices call external dependencies.
Now I'd like to mock these calls, so that when I send a message into the entrance of the workflow, the dependencies' answers are mocked but my system works as normal.
The challenge is that I want to configure the mock from the outside: configure on the fly the answer I want to get from the mock, run my test, and verify that my system behaves properly.
I worked for a company (that shall remain unnamed) with an internal tool doing just that, but I'm wondering if there is a public tool that does this? I can't believe it doesn't exist; I suspect it's more a gap in my knowledge at this point ;)
Cheers
So far the best I've found is https://wiremock.org/docs/
I don't want to advertise anything here; it's just the only one I've seen that lets its mocks be configured remotely.
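For what it's worth, the "configured from a distance" part of WireMock works over a plain HTTP admin API, so a test can register a stub remotely before it runs. A rough sketch of such a remote stub registration, shown here as a Karate call and assuming a WireMock instance listening on localhost:8080 (host, port, URL and payload are only illustrative):
Scenario: register a stub on a remote WireMock instance
Given url 'http://localhost:8080/__admin/mappings'
And request { request: { method: 'GET', url: '/orders/42' }, response: { status: 200, jsonBody: { id: 42, state: 'SHIPPED' } } }
When method post
Then status 201
After this call, a GET to /orders/42 on the WireMock port returns the stubbed JSON, so the system under test can be pointed at it without a restart.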

Best practices to test Hessian RPC requests with Karate

We're successfully using Karate to automate tests for REST and SOAP web services. In addition, we have some legacy web services which are based on the Hessian web service protocol (http://hessian.caucho.com/).
Hessian calls are HTTP requests as well, so we would like to add them to our Karate test suites.
My first attempt was to use the Java interop feature, so the tests are implemented as Java code and the Java classes are called from the feature files.
Example:
Scenario: Test offer purchase order
* def OfferPurchaseClient = Java.type('com.xyz.OfferPurchaseClient')
* def orderId = OfferPurchaseClient.createOrder('12345', 'xyz', 'max.mustermann@test.de')
* match orderId == '#number'
This approach works, but I'm wondering if there is a more elegant way that would also use more features of the Karate DSL.
I'm thinking about something like this (dummy code):
Scenario: Test offer purchase order
Given url orderManagementEndpoint
And path offerPurchase
And request serializeHessian(offerPurchase.json)
When method post
Then status 200
And match deserializeHessian(response).orderId == '#number'
Any recommendations/tips about how to implement such an approach?
I actually think you are already on the right track with the Java interop approach. The HTTP client is, in my honest opinion, a very small subset of Karate's capabilities; there are so many other aspects such as assertions, reports, parallel execution, even load testing, etc. So I wouldn't worry about not using request, method, status, etc.
Maybe OfferPurchaseClient.createOrder() can return a JSON.
Or what I would do is consider an approach like OfferPurchaseClient.request(Map<String, Object> payload) - where payload can include all the parameters including the path and expected response code etc.
To put this another way, I'm suggesting that you write your own mini-framework and custom DSL around some Java code. If you want inspiration and ideas, please look at this discussion, it is an example of building a CLI testing tool around Karate: https://stackoverflow.com/a/62911366/143475
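To make that concrete, here is a hedged sketch of how a feature could read once such a wrapper exists. OfferPurchaseClient.request() is hypothetical here, a method you would write yourself that accepts a Map (Karate passes JSON to Java as a Map automatically) and returns a Map that Karate then treats as JSON; the payload keys are illustrative:
Scenario: Test offer purchase order via the wrapper
* def OfferPurchaseClient = Java.type('com.xyz.OfferPurchaseClient')
* def payload = { path: 'offerPurchase', offerId: '12345', customer: 'xyz', email: 'max.mustermann@test.de', expectedStatus: 200 }
* def result = OfferPurchaseClient.request(payload)
* match result.orderId == '#number'
That keeps the Hessian plumbing inside one Java class while the feature files stay readable and can still use match, reports and parallel execution.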

Karate Standalone as Mock Server with multiple Feature Files

I'm trying to set up an integration/API test suite with Karate and am considering Karate Netty for mocking the required services. For the test setup, the system under test A (a Spring Boot app) is started up completely. The Karate tests are then executed by a Maven test run against this instance.
Service A depends on multiple other services; these need to be mocked away for the tests. To do so, my idea was to configure a running Karate Netty standalone instance as an HTTP proxy (done via JVM args of service A).
Now my idea was to create one test feature file: xyz-test.feature
And the required mocks for this file are defined in an associated mock feature file: xyz-mock.feature
(The test scenarios are rather complex and the responses of the external services could vary)
This means for a full test run I need to load up a couple of mock feature files. So:
What is the matching strategy for multiple mock feature files? Which scenario wins, so to speak?
Is there any way to ensure, that the right mock file is used for the associated test file?
(Clearly I can reconfigure the running standalone instance and advise it to use xyz-mock.feature next.
But this would stop me from using parallel execution for my API tests, right?)
I already thought about reusing the Correlation-Id which I can send in for each test and then match against this in the mock file (it is also sent to all called services). But:
Is there a way to define a global matcher per mock file?
It sounds like you need only one mock file. You could boot 2 on different ports if you wanted, but there is no way to "merge" them into one port - if that is what you were looking for.
In my experience, you will be able to have a single mock take care of all your edge cases. This is because Karate's approach is unconventional: you pretty much write a stateful server. But by keeping variables in memory and some clever JSON-path, you can simulate CRUD with very few lines of code: https://github.com/intuit/karate/tree/master/karate-netty#background
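For illustration, a minimal stateful mock in that style could look something like the sketch below (adapted from the idea behind that link; 'cats' is just a placeholder resource, scenarios are evaluated top-down with the first match winning, and for simplicity the client is assumed to send an id in the POST body):
Feature: minimal stateful mock for a 'cats' resource
Background:
* def cats = {}
Scenario: pathMatches('/cats') && methodIs('post')
* def cat = request
* cats[cat.id] = cat
* def response = cat
Scenario: pathMatches('/cats/{id}')
* def response = cats[pathParams.id]
Scenario: pathMatches('/cats')
* def response = $cats.*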
You can use only one at a time, by design
Given the above limitation, here's an interesting idea: add something like an extra pathMatches('/__test/reset') scenario that cleans up your state and sets the Background variables to things like * def cats = []. Now in each feature, just call the special "reset" URL at the start. The good thing is that Karate is thread-safe. Another idea, as you said, is to maintain two or three different variables and use some logic to "route" based on a header, which is again very easy IMO. Use a map of maps, e.g.:
* def data = { cats1: {}, cats2: {}, cats3: {} }
Then you can read the header, e.g. if it is mode: cats1:
* def mode = karate.get('requestHeaders.mode[0]')
* def cats = data[mode]
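A minimal sketch of the "reset" scenario mentioned above, assuming it lives in the same mock feature so that every test can start from a clean slate (the URL and response body are arbitrary):
Scenario: pathMatches('/__test/reset')
* def data = { cats1: {}, cats2: {}, cats3: {} }
* def response = { reset: true }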
Not sure if this answers your question, but if the last Scenario has an "empty" description, it is a "catch all" and can in theory delegate to another server (or mock): https://github.com/intuit/karate/tree/develop/karate-netty#proxy-mode
Your question is a little confusing, so you may have to edit and re-word it if I haven't understood.
EDIT: using multiple mock files should be possible from 1.1.0 onwards: https://github.com/intuit/karate/issues/1566

Karate Listener support

Does Karate provide any listener support so that I can intercept specific things like REST calls?
This is about added customization we want to perform beyond what Karate provides out of the box; there will always be something we need to customize based on our needs.
Say I have 10,000 test cases running in parallel, and with the Karate parallel runner I get a nice report with the time taken for each step and test case. One of my services is called multiple times, and I want to know the average time the service takes across all those calls, as well as the maximum and minimum.
I think Karate Hooks will get you what you need - if you write a function to aggregate the responseTime.
I'm willing to look at introducing this feature if needed, but you'll have to make a proposal on what the syntax should look like. Feel free to open a feature request. Today we do have configure headers, which is like a "before" for all requests. Maybe something along those lines.
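As a rough sketch of the aggregation idea: responseTime is available as a variable after every request, so you can push it into a small Java class of your own via Java interop. ResponseTimeStats below is hypothetical, a class you would write yourself with thread-safe static record() and summary() methods, and serviceUnderTest is assumed to be defined in karate-config.js:
Scenario: one timed call
Given url serviceUnderTest
When method get
Then status 200
* def Stats = Java.type('com.example.ResponseTimeStats')
* eval Stats.record(responseTime)
Because the aggregated state is static, it survives across scenarios and threads, and the Java runner can log Stats.summary() (count, average, min, max) once the parallel run has finished.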

Modify an http response in a protractor test

I'm trying to write some end-to-end tests for our application's login process, but am having trouble getting my head around the best way to set up the scenario where the user needs to change his password.
When our server responds to a successful login, a user object is returned with a changePassword field. The client then inspects the response and redirects accordingly.
My problem is getting the test set up so that the changePassword field is set - what is the best approach to use?
I see my options as:
1. Have a test setup and tear-down script for the server that creates a brand new user specifically for the test run, with the changePassword flag set in the database.
This seems like the most end-to-end approach, but is probably also the most effort and code.
2. Somehow intercept the HTTP response in the test and modify the changePassword flag so that it is set for this test only.
3. Mock the HTTP response completely. This approach is the furthest removed from an end-to-end test, but is perhaps the simplest?
Which is the best or most common approach? Also, any general pointers on how to actually implement the above (particularly 1 and 2) with Protractor would be great. I'm finding it hard to get this conceptually straight in my head, and hence hard to know what to search for.
I'm using Protractor as the test framework, with AngularJS powering the client side, and a Node server utilising (among other things) express.js and MongoDB.
Having thought about this further, option 1 is the best solution, but is not always possible.
Option 2 is also possible, and option 3 should be avoided.
For option two, a mock module can be created like so (CoffeeScript):
e2eInterceptors = ->
  angular.module('e2eInterceptors', [])
  .factory('loginInterceptor', ->
    response: (response) ->
      # only modify responses we are interested in (match on the request URL)
      return response unless response.config.url.match(/login/)
      # do the modification
      response.data.changePassword = true
      # return the (modified) response
      return response
  )
  .config(($httpProvider) ->
    $httpProvider.interceptors.push('loginInterceptor')
  )
You can then inject this module into your tests using
browser.addMockModule('e2eInterceptors', e2eInterceptors)
If you want to do this globally, you can put this in the onPrepare function in your protractor file, otherwise just call it when needed in tests.
I think your first approach is the most appropriate.
It would be useful anyway to test the new user creation, so it is not a waste.
And this article, for example, seems to do something similar: http://product.moveline.com/testing-angular-apps-end-to-end-with-protractor.html