I have the following response:
[
{
"id": 53,
"fileUri": "abc",
"filename": "abc.jpg",
"fileSizeBytes": 578466,
"createdDate": "2018-10-15",
"updatedDate": "2018-10-15"
},
{
"id": 54,
"fileUri": "xyz",
"filename": "xyz.pdf",
"fileSizeBytes": 88170994,
"createdDate": "2018-10-15",
"updatedDate": "2018-10-15"
}
]
and I am trying to match the id value to the object in JUnit like so:
RestAssured.given() //
.expect() //
.statusCode(HttpStatus.SC_OK) //
.when() //
.get(String.format("%s/%s/file", URL_BASE, id)) //
.then() //
.log().all() //
.body("", hasSize(2)) //
.body("id", hasItems(file1.getId(), file2.getId()));
But when the match occurs it tries to match an int to a long. Instead I get this output:
java.lang.AssertionError: 1 expectation failed.
JSON path id doesn't match.
Expected: (a collection containing <53L> and a collection containing <54L>)
Actual: [53, 54]
How does one tell Rest Assured that the value is indeed a long even though it might be short enough to fit in an int? I can cast the file's id to an int and it works, but that seems sloppy.
The problem is that when the JSON is converted to Java types, the values are small enough to fit in an int, so int is selected. One solution is to compare int values: instead of
.body("id", hasItems(file1.getId(), file2.getId()));
use
.body("id", hasItems(new Long(file1.getId()).intValue(), new Long(file2.getId()).intValue()));
I consume an HTTP request and I need to save and accumulate the JSON responses, without any transformation, in a variable. I do that, but it does not accumulate correctly. Could you please tell me how I can solve this problem?
Json Response By Iteration:
Iteration 1:
{
"orderId": "11111",
"status": "false",
"receivedAt": "2022-07-28T22:45:12.175Z",
"createdAt": "2022-07-28T22:45:12.175Z",
}
Iteration 2:
{
"orderId": "22222",
"status": "false",
"receivedAt": "2022-07-28T22:45:27.907Z",
"createdAt": "2022-07-28T22:45:27.907Z"
}
DataWeave (csvPayload is the name of the variable where the values are accumulated):
%dw 2.0
output application/json
---
if ( vars.counter == 1)
( payload )
else
( vars.csvPayload ) ++ payload
Variable Result:
{
"orderId": "11111",
"status": "false",
"receivedAt": "2022-07-28T22:45:12.175Z",
"createdAt": "2022-07-28T22:45:12.175Z",
"orderId": "22222",
"status": "false",
"receivedAt": "2022-07-28T22:45:27.907Z",
"createdAt": "2022-07-28T22:45:27.907Z"
}
Variable Expected:
[
{
"orderId": "11111",
"status": "false",
"receivedAt": "2022-07-28T22:45:12.175Z",
"createdAt": "2022-07-28T22:45:12.175Z",
},
{"orderId": "22222",
"status": "false",
"receivedAt": "2022-07-28T22:45:27.907Z",
"createdAt": "2022-07-28T22:45:27.907Z"
}
]
NOTE: I don't know why the JSON responses of each iteration are joined into the same object rather than added as separate objects in an array.
Explaining the incorrect output is easy. You are concatenating an object to another object. In that case the ++ operator "extracts all the key-value pairs from each object, then combines them together into one result object," according to the documentation. The parentheses are totally unneeded.
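A quick standalone DataWeave illustration of that difference (not part of the original flow):
%dw 2.0
output application/json
---
{
    // objects: key-value pairs are merged into one object
    objectConcat: { "a": 1 } ++ { "b": 2 },
    // arrays: elements are concatenated, each object stays separate
    arrayConcat: [ { "a": 1 } ] ++ [ { "b": 2 } ]
}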
Setting the output to JSON in each iteration (I'm assuming this is inside a foreach) is inefficient, since it requires formatting the output to JSON in each iteration only to parse it again in the following one. I recommend using application/java (which doesn't require parsing/formatting) inside the loop, and converting the entire output to JSON in one go after the loop.
You should use an array to hold the values, so assign an empty array ([]) to the variable before the foreach loop to initialize it. Then the counter is unneeded since you can just add elements to the array:
%dw 2.0
output application/java
---
vars.allOrders ++ [ payload ]
Then after the foreach just transform the array to JSON:
%dw 2.0
output application/json
---
vars.allOrders
Your requirement is a very common scenario in integrations.
Assuming you are using an HTTP Requester inside a for-each loop and have initialized a variable to hold the final payload (let's say finalPayload = []) before the loop, you have to keep updating that same variable with the response payload received from the HTTP Requester on each iteration, appending the new data to the same array variable, for example:
vars.finalPayload ++ [ payload ]
Finally, check finalPayload outside the for-each loop; that will give you your desired result.
I am using Elasticsearch Spring Data. I have a custom repository that uses ElasticsearchOperations, based on the examples in the docs. I need some aggregation query results, and I successfully get the intended results, but I need to map those results to a model. Currently I'm unable to access the contents of AggregationsContainer.
override fun getStats(startTime: Long, endTime: Long, pageable: Pageable): AggregationsContainer<*>?
{
val query: Query = NativeSearchQueryBuilder()
.withQuery(QueryBuilders.rangeQuery("time").from(startTime).to(endTime))
.withAggregations(AggregationBuilders.sum("discount").field("discount"))
.withAggregations(AggregationBuilders.sum("price").field("price"))
.withPageable(pageable)
.build()
val searchHits: SearchHits<Product> = operations.search(query, Product::class.java)
return searchHits.aggregations
}
I return the result of the following code:
val stats = repository.getTotalStats(before, currentTime, pageable)?.aggregations()
The result is:
{
"asMap": {
"discount": {
"name": "discount",
"metadata": null,
"value": 8000.0,
"valueAsString": "8000.0",
"type": "sum",
"fragment": true
},
"price": {
"name": "price",
"metadata": null,
"value": 9000.0,
"valueAsString": "9000.0",
"type": "sum",
"fragment": true
}
},
"fragment": true
}
How can I convert the above output to an intended output model like the following? As I tested, the contents of aggregations() are inaccessible and the type is Any:
{
"priceSum":9000.0,
"discountSum":8000
}
There is no data model for aggregations in the Elasticsearch RestHighLevelClient classes, and there is none in Spring Data Elasticsearch. Therefore the original Aggregations object is returned to the caller (contained in that AggregationsContainer, because this will change with the new client implementation, and then the container will hold a different object).
You have to parse this yourself. I had something similar in the answer to another question (https://stackoverflow.com/a/63105356/4393565). The interesting thing for you is the last code block where the aggregations are passed. You basically have to iterate over the elements, cast them to the appropriate type and evaluate them.
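For example, something along these lines (a rough sketch only: the class names assume the Elasticsearch 7.x client, searchHits is the result of operations.search(...) from the question, StatsModel is a hypothetical result type of your own, and the same calls work from Kotlin):
import org.elasticsearch.search.aggregations.Aggregations;
import org.elasticsearch.search.aggregations.metrics.ParsedSum;

// Unwrap the container to get the underlying Elasticsearch Aggregations object.
Aggregations aggregations = (Aggregations) searchHits.getAggregations().aggregations();

// Look up each named aggregation; the generic get() returns the concrete parsed type.
ParsedSum priceSum = aggregations.get("price");
ParsedSum discountSum = aggregations.get("discount");

// Map the numeric values into your own model.
StatsModel stats = new StatsModel(priceSum.getValue(), discountSum.getValue());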
I am trying to create my request body dynamically from an external JSON file.
I want to update a few values and keep the remaining ones the same as received from the JSON.
The idea here is to keep one maintainable json file and manipulate it at run time to execute various scenarios.
Here's my feature file:
* def myJson = read('testFile.json')
* def requestBody = { "product": "#(myJson.product)", "properties": { "make": "#(brand)", "color": "#(myJson.color)" } }
When request requestBody
And method post
Then status 200
Examples:
| brand |
| honda |
Contents of testFile.json are -
{
"product": "car",
"properties": {
"make": "brand",
"color": "red"
}
}
The problem is that whenever there is a nested JSON object, those fields won't keep the value from the JSON. If the value is passed from the feature file (as in the Examples table), then it gets evaluated correctly. Here's how the request body gets passed in the service call:
{
"product": "car",
"properties": {
"make": "honda",
"color": null
}
}
I need the color key's value to be taken from myJson (i.e. red), but it gets evaluated as null.
Shouldn't it be:
"color": "#(myJson.properties.color)"
In testFile.json, color is nested under properties, so myJson.color resolves to null.
From a form submission I receive two objects: the original values and the dirty values. I'd like to figure out how to create a diff to send to the server using the following rules:
id field of the root object should always be included
all changed primitive values should be included
all nested changes should be included as well.
if a nested value other than id changed, it should include id as well.
Original values:
{
"id":10,
"name": "tkvw"
"locale": "nl",
"address":{
"id":2,
"street": "Somewhere",
"zipcode": "8965",
},
"subscriptions":[8,9,10],
"category":{
"id":6
}
}
Example expected diff objects:
1) User changes field name to "Foo"
{
"id":10,
"name":"foo"
}
2) User changes field street on address node and category
{
"id":10,
"address":{
"id": 2,
"street":"Changed"
},
"category":{
"id":5
}
}
I do understand the basics of functional programming, but I just need a hint in the right direction (some meta code maybe).
Take a look at JSON Patch (RFC 6902). JSON Patch is a format for describing changes to a JSON document. For example:
[
{ "op": "replace", "path": "/baz", "value": "boo" },
{ "op": "add", "path": "/hello", "value": ["world"] },
{ "op": "remove", "path": "/foo"}
]
You generate a patch by comparing two JS objects/arrays, and then you can apply the patch to the original object (on the server side, for example) to reflect the changes.
You can create a patch using the fast-json-patch lib.
const obj1 = {"id":10,"name":"tkvw","locale":"nl","address":{"id":2,"street":"Somewhere","zipcode":"8965"},"subscriptions":[8,9,10],"category":{"id":6}};
const obj2 = {"id":10,"name":"cats","locale":"nl","address":{"id":2,"street":"Somewhere","zipcode":"8965"},"subscriptions":[8,9,10,11],"category":{"id":7}};
const delta = jsonpatch.compare(obj1, obj2);
console.log('delta:\n', delta);
const doc = jsonpatch.applyPatch(obj1, delta).newDocument;
console.log('patched obj1:\n', doc);
<script src="https://cdnjs.cloudflare.com/ajax/libs/fast-json-patch/2.0.6/fast-json-patch.min.js"></script>
I am trying to take output from Salesforce and transform it to JSON. Here is my code:
%dw 1.0
%output application/json
---
payload map {
headerandlines:{ id : $.Id,
agreementLineID : $.LineItems__r.Id,
netPrice : $.LineItems__r.Price__c,
volume : $.Volume__c,
name : $.Name,
StartDate : $.Start_Date__c,
EndDate : $.End_Date__c,
poField : $.PO_Field__c,
ConsoleNumber : $.Console_Number__c,
Term : $.Term__c,
ownerID : $.OwnerId,
Unit : $.Unit__c,
siteNumber : $.Site_Num__c,
customerNumber : $.Customer_Num__c
}
}
The input payload looks like this; it is a collection of objects. Somehow, after the transformation, only the first object is sent and the rest is clobbered.
[
{
"id": "DA0YAAW",
"LineID": [
"jGEAU",
"jBEAU",
"j6EAE"
],
"Price": [
"50000.0",
"12000.0",
"45000.0"
],
"netPrice": null,
"volume": null,
"name": " Test 2.24",
"StartDate": "2017-02-17",
"EndDate": "2018-02-17",
"poField": "123456",
"ConsoleNumber": "8888888",
"PaymentTerm": "thirty (30)",
"ownerID": “abcd”,
"OperatingUnit": " International Company",
"siteNumber": null,
"customerNumber": null
},
{
"id": "a37n0000000DAMAAA4",
"LineID": [
"JunEAE",
"JuiEAE",
"KdMEAU",
"JuYEAU"
],
"Price": [
"5000.0",
"8000.0",
"5000.0",
"5000.0"
],
"netPrice": null,
"volume": null,
"name": " Test 3.6",
"StartDate": "2017-03-06",
"EndDate": "2018-03-16",
"poField": "12345",
"ConsoleNumber": "123456-",
"PaymentTerm": "30 NET",
"ownerID": “dfgh”,
"OperatingUnit": ", inc.",
"siteNumber": null,
"customerNumber": null
},
….
]
When I call this code from the browser (using API testing) I get the complete payload with multiple objects. When I call it from another API I get only one object, indicating it is not looping through. I can confirm that the payload has multiple objects. Is there anything I am missing in terms of looping through this code to extract multiple objects? I assume the '$' notation is good enough for iteration.
#insaneyogi, either your input is incorrect or your DataWeave is incorrect.
In the input you have specified id in lowercase, but in the DataWeave it is referenced with a capital Id.
I think the problem here is with your LineItem and Price type elements. They are collections within an element. In your data mapping, $. will take care of the outer object. However, I think a mapping like LineItems__r.Price__c is not correct. It should have a proper index, probably LineItems__r.Price__c[0]. Please try that and it should work. First change the input to a single element for price or line-item and test.
It looks like the agreementLineID and netPrice are arrays and you need to loop through them with a map operator within the bigger outer map to get all the line items. That should work.
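For illustration, here is a rough DataWeave 1.0 sketch of that nested map, reusing the field names from the original script. Treat it as a starting point under the assumption that LineItems__r.Id and LineItems__r.Price__c arrive as parallel arrays, not as a verified mapping:
%dw 1.0
%output application/json
---
payload map ((record, recordIndex) -> {
    headerandlines: {
        id: record.Id,
        name: record.Name,
        // one entry per line item, pairing each Id with the Price at the same position
        lines: record.LineItems__r.Id map ((lineId, lineIndex) -> {
            agreementLineID: lineId,
            netPrice: record.LineItems__r.Price__c[lineIndex]
        })
    }
})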