Use case:
A REST API returns a response payload with 10 fields. We store the expected JSON in a DB. When we run a test, we compare the expected JSON with the actual JSON returned by the API to assert each field.
Let's say we have 500 such tests.
Now, when we change the REST API response to return 15 fields (5 additional fields) - for the existing 500 tests, should we update the expected JSON to validate the additional 5 fields? If yes, what would be the most effective way to do it?
The existing approach is to manually update the expected JSON for all 500 tests, which is time-consuming and error-prone.
For example, here is a sample response from an API:
{
  "field1" : "val1",
  "field2" : "val2",
  "field3" : {
    "field3.a" : "val3",
    "field3.b" : "val4"
  },
  "field4" : "val5"
}
For all 500 tests we have an expected JSON with the same schema as above, asserting all 4 fields.
Now the REST API has evolved and started returning 7 fields:
{
  "field1" : "val1",
  "field2" : "val2",
  "field3" : {
    "field3.a" : "val3",
    "field3.b" : "val4"
  },
  "field4" : "val5",
  "field5" : "val6",
  "field6" : "val7",
  "field7" : "val8"
}
In such a case, the existing 500 tests with the old expected JSON only assert 5 fields.
To assert all 7 fields, we would have to manually update the expected JSON for each of the 500 tests, which is a lot of maintenance when the REST API response keeps changing.
How can we handle this effectively?
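One way to cut the manual work (a sketch, assuming the stored expected JSONs can be loaded and saved programmatically - the helper below is hypothetical, not part of any existing tooling): capture one known-good response per endpoint, then script a merge that copies only the keys missing from each stored expected JSON, leaving hand-tuned expectations untouched.

```python
import copy

def merge_new_fields(expected, reference):
    """Return a copy of `expected` with any keys present in `reference`
    but missing from `expected` copied over. Existing values are kept,
    so hand-tuned expectations are not overwritten."""
    merged = copy.deepcopy(expected)
    for key, ref_val in reference.items():
        if key not in merged:
            merged[key] = copy.deepcopy(ref_val)
        elif isinstance(merged[key], dict) and isinstance(ref_val, dict):
            merged[key] = merge_new_fields(merged[key], ref_val)
    return merged

old = {"field1": "val1", "field3": {"field3.a": "val3"}}
ref = {"field1": "x",
       "field3": {"field3.a": "y", "field3.b": "val4"},
       "field5": "val6"}
print(merge_new_fields(old, ref))
```

Running this once over all 500 stored JSONs adds the new fields in one pass; any field whose expected value differs from the captured reference still needs a human look.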
Thanks,
Related
I am designing an API where the following three cases are possible:
All the inputs in the array are correct, so the API returns a 200 status code.
Sample output: 200 status code
[{ "status" : "success", "value" : "some response" }, { "status" : "success", "value" : "some response" }]
Some inputs in the array are correct, so the API returns a 207 Multi-Status code.
Sample output: 207 Multi-Status code
[{ "status" : "success", "value" : "some response" }, { "status" : "fail", "value" : "reason for failure" }, { "status" : "success", "value" : "some response" }]
All the inputs in the array are wrong. In this case, should I send a 400 Bad Request or a 207 status code?
Because if I send a 400 Bad Request, the response format will not be consistent, as below:
Sample output: 400 status code
{"errorcode" : "XXXXXX", "message" : "It failed due to invalid inputs. Input must contain XXXX"}
In case 3, should I send a 400 response, or a 207 response with every status set to 'fail'? Which is correct and consistent?
There is a big difference between 2xx and 4xx codes.
A 2xx code like 207 means the request was valid and returns values (successes or failures).
4xx codes like 400, however, mean that the client sent an invalid request.
For instance, if you have to provide at least one input in the array and the client sends an empty array, that is a bad request.
In your case the request looks valid, and you have behavior mixing 2xx and 4xx.
If the client can use part of your results, it's a 207; if they can use the successes without the failures, the result is a 2xx.
If they can't, and need every input to be valid, then it's a 4xx, since the request should contain only valid inputs (resulting in successes only).
For instance, in case 3 a 400 makes sense, as the user can't continue their process with only failures.
You can format this error by adding a "details" property containing your array of input failures, like:
400
{
  "errorcode" : "XXXXXX",
  "message" : "Only invalid inputs. Input must contain XXXX",
  "details" : [{
    "status" : "fail",
    "value" : "reason for failure"
  }, {
    "status" : "fail",
    "value" : "reason for failure"
  }, {
    "status" : "fail",
    "value" : "reason for failure"
  }]
}
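The decision rule above can be sketched as a small helper (illustrative Python, not from the original answer; the empty-array case is treated as a 400, matching the empty-input example above):

```python
def choose_status(results):
    """Pick an HTTP status for a batch response:
    200 if every item succeeded, 207 for a mix, 400 if all failed."""
    if not results:
        return 400  # empty input array: nothing valid was sent
    statuses = [r["status"] for r in results]
    if all(s == "success" for s in statuses):
        return 200
    if any(s == "success" for s in statuses):
        return 207
    return 400
```

The key design point is that 207 is reserved for genuinely mixed outcomes, so the client always knows a 2xx means there is at least one usable result.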
I am wondering how I can do conditional schema validation. The API response is dynamic based on the customerType key: if customerType is person, then person details are included, and if customerType is org, organization details are included in the JSON response. So the response can take either of the following forms:
{
  "customerType" : "person",
  "person" : {
    "firstName" : "A",
    "lastName" : "B"
  },
  "id" : 1,
  "requestDate" : "2021-11-11"
}
{
  "customerType" : "org",
  "organization" : {
    "orgName" : "A",
    "orgAddress" : "B"
  },
  "id" : 2,
  "requestDate" : "2021-11-11"
}
The schema I created to validate the above two scenarios is as follows:
{
  "customerType" : "#string",
  "organization" : "#? response.customerType=='org' ? karate.match(_,orgSchema) : karate.match(_,null)",
  "person" : "#? response.customerType=='person' ? karate.match(_,personSchema) : karate.match(_,null)",
  "id" : "#number",
  "requestDate" : "#string"
}
but the schema fails to match the actual response. What changes should I make to the schema to make it work?
Note: I am planning to reuse the schema in multiple tests, so I will keep the schema in separate files, independent of the feature file.
Can you refer to this answer, which I think is the better approach: https://stackoverflow.com/a/47336682/143475
That said, I think you missed that the JS karate.match() API doesn't return a boolean, but a JSON object that contains a pass boolean property.
So you have to do things like this:
* def someVar = karate.match(actual, expected).pass ? {} : {}
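Following the approach in that linked answer, a sketch that avoids embedded karate.match() calls altogether is to pick the expected shape up front (personSchema and orgSchema below are inline stand-ins for your separate schema files):

```gherkin
* def personSchema = { firstName: '#string', lastName: '#string' }
* def orgSchema = { orgName: '#string', orgAddress: '#string' }
* def isPerson = response.customerType == 'person'
* def expected = isPerson ? { customerType: '#string', person: '#(personSchema)', id: '#number', requestDate: '#string' } : { customerType: '#string', organization: '#(orgSchema)', id: '#number', requestDate: '#string' }
* match response == expected
```

This keeps the schemas reusable across tests while moving the conditional logic out of the schema itself.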
This question already has an answer here:
How to navigate and validate through all the pages of a api response
(1 answer)
Closed 1 year ago.
I am trying to fetch a list of dynamic IDs one at a time, using the application ID as a parameter in the GET URL.
Example: below is the response of a POST call:
{
  "Car": 1,
  "content": [{
    "type" : "A",
    "Id" : "1"
  },
  {
    "type" : "B",
    "Id" : "2"
  }]
}
Now, for the above POST response, I am trying to fetch the data using the dynamic Id as a parameter in the GET URL, e.g.:
* def ID = karate.jsonPath(response, '$.content[*].Id')
Given url 'https://localhost:8080'
And path '/' + ID + '/id'
When method get
Then status 200
In the GET URL I am getting the whole list of Ids instead of a single Id, as shown below:
This is the output: http://localhost:8080/1,2/id
As the Ids are generated dynamically, instead of calling each ID manually I want to pass them as a parameter.
Can anyone suggest how I can fetch one ID at a time in the GET URL?
Try this example, observe the output, and then try to understand how it works:
* def response = {"Car": 1, content:[{ "type" : "A", "Id" : "1" }, { "type" : "B", "Id" : "2" } ]}
* def ids = $response.content[*].Id
* match ids == ['1', '2']
* def data = karate.mapWithKey(ids, 'id')
* call read('called.feature') data
And called.feature looks like this:
@ignore
Feature:
Scenario:
* url 'https://httpbin.org/anything'
* param id = id
* method get
Please try to read the docs, it is worth it: https://github.com/karatelabs/karate#json-transforms
Here I would like to ask about creating a dynamic Examples table for a JSON array of dynamic size.
My JSON looks like:
Env - Dev - 2 servers
{
  "response": {
    "abc": [{
      "status": "pass"
      ...
    },
    {
      "status": "pass"
      ...
    }]
  }
}
Env - Uat - 3 servers
{
  "response": {
    "abc": [{
      "status": "pass"
    },
    {
      "status": "pass"
    },
    {
      "status": "pass"
    }]
  }
}
My scenario outline looks like:
Scenario Outline: validating .....
When def result = callonce read('featurefilename@tagname')
Then print result
And print <status>
And print ...
And match ....
Examples:
| result.response.abc |
Errors for the above:
1) * dynamic expression evaluation failed: result.response.abc
2) com.intuit.karate.KarateException: ---- javascript evaluation failed: result.response.abc, ReferenceError: "result" is not defined in <eval> at line number 1
Note: if I move the step 'When def result = callonce read('featurefilename@tagname')' into the Background it works as expected, but I can't use a Background in my feature file due to other factors.
Thanks in advance
Instead of providing an index in a table, you can leverage the Dynamic Scenario Outline feature in Karate.
In this case you can pass the variable as input to Examples. If the JSON provided above comes from the variable result, then:
Examples:
| result.response.abc |
Refer to the docs for more insights.
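A sketch of what that can look like (assuming result is available before the outline expands - Karate evaluates the Examples expression after the Background, so the usual pattern is a Background with callonce, or karate.callSingle in karate-config.js if a Background is truly not an option):

```gherkin
Feature: dynamic scenario outline sketch

Background:
  # callonce caches the result, so this runs only once per feature
  * def result = callonce read('featurefilename@tagname')

Scenario Outline: validate each server entry
  # each element of result.response.abc becomes one generated scenario,
  # and __row holds the current element
  * match __row.status == 'pass'

  Examples:
    | result.response.abc |
```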
I have a smallish (~50,00) array of JSON dictionaries that I want to store/index in ES. My preference is to use Python, since the data I want to index comes from a CSV file, loaded and converted to JSON via Python. Alternatively, I would like to skip the conversion to JSON and simply use the array of Python dictionaries I have. Anyway, a quick search revealed the bulk-indexing functionality of ES. I want to do something like this:
post_url = 'http://localhost:9202/_bulk'
requests.post(post_url, data=acc)  # acc is a Python list of dictionaries
or
post_url = 'http://localhost:9202/_bulk'
requests.post(post_url, params=acc)  # acc is a Python list of dictionaries
Both requests give an HTTP 500 error.
My understanding is that you have to have one "command" per line (index, create, delete, ...), and some of them (like index) take a row of data on the next line, like so:
{'index': ''}\n
{'your': 'data'}\n
{'index': ''}\n
{'other': 'data'}\n
Note the newlines, even after the last row.
Empty index objects like the above work if you POST to ../index/type/_bulk; otherwise you need to specify the index and type in each action line, I think (I have not tried that).
The following function will do it:
import requests

def post_request(endpoint, data):
    # endpoint should be the full bulk URL, e.g. 'http://localhost:9200/_bulk'
    response = requests.post(endpoint, data=data,
                             headers={'Content-Type': 'application/json; charset=UTF-8'})
    return response
As data you need to pass a string such as:
{ "index" : { "_index" : "test-index", "_type" : "_doc", "_id" : "1681", "routing" : 0 }}
{ "field1" : ... , ..., "fieldN" : ... }
{ "index" : { "_index" : "test-index", "_type" : "_doc", "_id" : "1684", "routing" : 1 }}
{ "field1" : ... , ..., "fieldN" : ... }
Make sure you add a "\n" at the end of each line.
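For reference, a minimal sketch of building that bulk body from a list of dicts (illustrative Python; the index name is a placeholder, and newer Elasticsearch versions expect the application/x-ndjson content type):

```python
import json

def build_bulk_body(docs, index="test-index"):
    """Serialize a list of dicts into the newline-delimited bulk format:
    one action line followed by one document line per doc, with the
    trailing newline that Elasticsearch requires."""
    lines = []
    for doc in docs:
        lines.append(json.dumps({"index": {"_index": index}}))
        lines.append(json.dumps(doc))
    return "\n".join(lines) + "\n"

# body = build_bulk_body(acc)
# requests.post("http://localhost:9200/_bulk", data=body,
#               headers={"Content-Type": "application/x-ndjson"})
```

This avoids the original 500 error, which comes from posting the raw Python list instead of a newline-delimited string.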
I don't know much about Python, but did you look at Pyes?
Bulk is supported in Pyes.