Dataweave to remove json elements from values after groupBy | Mule 3.9

I am trying to transform a payload using groupBy in DataWeave; however, I also need to remove a few JSON attributes from the output.
Input JSON:
[
  {
    "m": {
      "a": "a",
      "b": "b"
    },
    "tag1": "A",
    "tag2": "v1",
    "tag3": "v1"
  },
  {
    "m": {
      "a": "a",
      "b": "b"
    },
    "tag1": "A",
    "tag2": "v2",
    "tag3": "v2"
  },
  {
    "m": {
      "a": "a",
      "b": "b"
    },
    "tag1": "C",
    "tag2": "v3",
    "tag3": "v3"
  }
]
Output JSON:
{
"A": [
{
"tag2": "v1",
"tag3": "v1"
},
{
"tag2": "v2",
"tag3": "v2"
}
],
"C": {
"tag2": "v3",
"tag3": "v3"
}
}
I have tried the following transformation (Mule 3.9); however, it could not remove the extra attributes from the JSON.
payload groupBy (item) -> item.tag1
I would appreciate any suggestion on this, and possibly an explanation of how this can be achieved.

The way to iterate over an object is to use mapObject, and then you can filter the values to remove the elements that are not wanted:
{a: [1,2,3], b: [2,3]} mapObject ((value,key) ->
{
(key): value filter ((value, index) -> value > 2)
}
)
This will output
{
"a": [
3
],
"b": [
3
]
}
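Applying the same idea to the payload from the question, here is a minimal sketch in DataWeave 1.0 syntax (Mule 3.9); it assumes every group stays an array and that only the m and tag1 fields should be dropped from each element:
%dw 1.0
%output application/json
---
// group the records by tag1, then drop the unwanted keys from every element
(payload groupBy $.tag1) mapObject ((items, tag) ->
{
    (tag): items map ($ - "m" - "tag1")
}
)
This keeps every group as an array; producing the single-object form shown for "C" in the expected output would need an extra check on the size of each group.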

Related

Kotlin - switching object detail based on group by

For example, I have a class with the below JSON format:
[
{
"name": "a",
"detail": [
"1",
"2",
"3"
]
},
{
"name": "b",
"detail": [
"2",
"3",
"4"
]
}
]
How can I change it to be grouped based on the detail?
[
{
"detail": "1",
"name": [
"a"
]
},
{
"detail": "2",
"name": [
"a",
"b"
]
},
{
"detail": "3",
"name": [
"a",
"b"
]
},
{
"detail": "4",
"name": [
"b"
]
}
]
Below is my class structure:
data class funName(
    @field:JsonProperty("name")
    val name: String = "",
    @field:JsonProperty("detail")
    val detail: Array<String> = arrayOf(""),
)
and my object is based on an array of funName:
val data: Array<funName> = ...
I really have no idea how to do it.
val convert = data.groupBy { x -> x.detail } ??
Is this doable in kotlin/java?
Since the original data is grouped by name, you can think of the original data as a list of pairs:
name detail
a 1
a 2
a 3
b 2
b 3
b 4
Mapping it to this format first would make it very easy to group by the second thing (detail) in the pair.
Since each funName corresponds to multiple pairs like this, you should use flatMap on data.
val result = data.flatMap { funName ->
funName.detail.map { funName.name to it }
}
.groupBy(keySelector = { (name, detail) -> detail }, valueTransform = { (name, detail) -> name })
// or more concisely, but less readable
// .groupBy({ it.second }) { it.first }
This will get you a Map<String, List<String>>.
If you want a List<Result>, where Result is something like
data class Result(
val detail: String = "",
val names: List<String> = listOf(),
)
You can add an additional map:
.map { (k, v) -> Result(k, v) }

Compare Object value with array in dataweave?

Sample input 1
{
"data": [
{
"a": [
{
"id": 123
}
],
"a1": [],
"a3": [],
"a4": []
},
{
"b": [
{
"bid": 133
}
],
"b1": [],
"b2": []
},
{
"c": [],
"c1": [],
"d": []
}
]
}
Sample input 2 (based on which sample input 1 will be filtered):
[
"d",
"b",
"b1",
"a4"
]
Scenario: by comparing the values of both inputs, the entries whose key names appear in input 2 need to be filtered out of the objects in input 1.
Expected final output:
{
"data": [{
"a": [{
"id": 123
}],
"a1": [],
"a3": []
},
{
"b2": []
},
{
"c": [],
"c1": []
}]
}
sample code:
%dw 2.0
output application/json
---
payload.data map ((item, index) -> item - "d" - "b" - "b1" - "a4") //
Note: This sample is working but but the values should be taken dynamically from the 2 input
Any help would be appreciated. Thank you.
Try with this:
Payload is the sample input 1 that you have typed in.
%dw 2.0
output application/json
var filterList = [ "d", "b", "b1", "a4" ]
---
data: payload.data map {
($ -- filterList)
}
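To take the key list dynamically from sample input 2 instead of hardcoding it, one option is a sketch like the following; it assumes the second input was stored earlier in the flow as a variable, here hypothetically named filterList:
%dw 2.0
output application/json
// assumption: sample input 2 is available as a flow variable named filterList;
// adjust vars.filterList to wherever the list actually lives in your flow
var filterList = vars.filterList default []
---
data: payload.data map ($ -- filterList)
Each object in data loses exactly the keys listed in the second input, and the default [] keeps the script from failing if the variable is absent.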

Transformation of Json array in Dataweave

How do I write a DataWeave transformation in Anypoint Studio for the given input and output of a JSON array?
Input:
{
"result": [{
"Labels": [{
"value": [{
"fieldName": "firstName",
"value": "John"
},
{
"fieldName": "lastName",
"value": "Doe"
},
{
"fieldName": "fullName",
"value": "John Doe"
}
]
}]
}]
}
Output:
{
"result": [{
"Labels": [{
"value": [{
"firstName": "John",
"lastName": "Doe",
"fullName": "John Doe"
}]
}]
}]
}
The reduce function (https://docs.mulesoft.com/dataweave/2.4/dw-core-functions-reduce) might be the one that should be used.
Thank you in advance
You can just use map to map all the arrays to the required format. For the value part, you can map the values to a fieldName: value array and deconstruct it into an object by wrapping the mapped array in parentheses:
%dw 2.0
output application/json
---
{
    result: payload.result map ((item) -> {
        Labels: item.Labels map ((label) -> {
            value: [
                {
                    // wrap the mapped array, i.e. label.value map ..., in
                    // parentheses so that it gives you individual key-value pairs
                    (label.value map ((field) ->
                        {(field.fieldName): field.value}
                    ))
                }
            ]
        })
    })
}
You can try the below if you are aware that the key names will not change:
%dw 2.0
output application/json
---
payload update {
    case res at .result -> res map (res, resIndex) -> (res update {
        case lbl at .Labels -> lbl map (lbl, lblIndex) -> (lbl update {
            case val at .value -> [
                (val reduce ((item, acc = {}) -> acc ++ {
                    (item.fieldName): (item.value)
                }))
            ]
        })
    })
}
Here are two caveats and a solution: your input and output files are both not valid JSON.
Input file: in your "result" object, the "Labels" entries need curly braces {} since they are objects. Key-value pairs should look like this: {key: value}, not like this: key: value.
Output file: inside your "value" arrays, key-value pairs need to have the curly braces: {key: value}.
So here's a valid JSON version of your input
{
"result": [
{"Labels": [
{
"value": [
{"fieldName": "firstName","value": "John"},
{"fieldName": "lastName","value": "Doe"},
{"fieldName": "fullName","value": "John Doe"}
]
}
]},
{"Labels": [
{
"value": [
{"fieldName": "firstName","value": "John"}
]
}
]}
]}
Here's a solution
%dw 2.0
import keySet from dw::core::Objects
// this is "result"
var layer1key = keySet(payload)[0]
// this is "Labels" and grabs the first Labels, so assumes Labels doesn't change
var layer2 = payload[layer1key]
var layer2key = keySet(layer2[0])[0]
// this is "value"
var layer3 = layer2[layer2key]
var layer3key = keySet(layer3[0][0])[0]
// this is "fieldName" and "value"
var layer4 = layer3 map (x) -> x['value']
var data1 = ((layer1key) : layer4 map (x) -> {
(layer2key): x map (y) -> {
(layer3key): y map (z) -> {
(z['fieldName']):z['value']
}
}
})
output application/json
---
data1
And a valid JSON version of your output
{
"result": [
{
"Labels": [
{
"value": [
{
"firstName": "John"
},
{
"lastName": "Doe"
},
{
"fullName": "John Doe"
}
]
}
]
},
{
"Labels": [
{
"value": [
{
"firstName": "John"
}
]
}
]
}
]
}

JSON Schema: Conditionally require property depending on several properties' values

I want to validate a JSON file using JSON Schema; several properties should be required depending on what values some other properties have.
Example
Having properties "A", "B", "C", and "D"
if "A" has value "foo", C is required
if "B" has value "foo", D is required
if both "A" and "B" each have value "foo", both C and D are required
else, nothing is required
I have seen a very helpful answer here: https://stackoverflow.com/a/38781027/5201771 --> there, the author addresses how to solve this problem for the case of a single property (e.g., only "A" has value "foo", so require "C").
However, I currently don't see how to extend that answer to my case, where several properties determine the outcome.
Example Files
For illustration, I supply some files that should pass or fail the validation:
should pass:
{
"A": "bar",
"B": "baz"
}
{
"A": "foo",
"C": "some value"
}
{
"A": "bar",
"B": "foo",
"D": "some value"
}
should fail:
{
"A": "foo",
"B": "foo",
"D": "some value"
}
You can combine conditionals a number of ways, but combining them with allOf is usually the best way.
{
"type": "object",
"properties": {
"A": {},
"B": {},
"C": {},
"D": {}
},
"allOf": [
{ "$ref": "#/definitions/if-A-then-C-is-required" },
{ "$ref": "#/definitions/if-B-then-D-is-required" }
],
"definitions": {
"if-A-then-C-is-required": {
"if": {
"type": "object",
"properties": {
"A": { "const": "foo" }
},
"required": ["A"]
},
"then": { "required": ["C"] }
},
"if-B-then-D-is-required": {
"if": {
"type": "object",
"properties": {
"B": { "const": "foo" }
},
"required": ["B"]
},
"then": { "required": ["D"] }
}
}
}

How do I write a SQL query for the shown Azure Cosmos DB documents?

I have the following documents in an Azure Cosmos DB collection.
// Document 1
{
"c": {
"firstName": "Robert"
},
"elements": [
{
"a": "x2",
"b": {
"name": "yadda2",
"id": 1
}
}
]
}
// Document 2
{
"c": {
"firstName": "Steve"
},
"elements": [
{
"a": "x5",
"b": {
"name": "yadda2",
"id": 4
}
},
{
"a": "x3",
"b": {
"name": "yadda8",
"id": 5
}
}
]
}
// Document 3
{
"c": {
"firstName": "Johnson"
},
"elements": [
{
"a": "x4",
"b": {
"name": "yadda28",
"id": 25
}
},
{
"a": "x5",
"b": {
"name": "yadda30",
"id": 37
}
}
]
}
I need to write a query that returns all documents which have a "b" object whose name is "yadda2" (i.e. /elements/*/b/name=yadda2). In other words, this query should return documents 1 and 2 but NOT 3.
I tried the following, but it did not work:
SELECT * FROM x where ARRAY_CONTAINS(x.elements, {b: { name: "yadda2"}})
What am I doing wrong?
Just modify your SQL to:
SELECT * FROM x where ARRAY_CONTAINS(x.elements, {b: { name: "yadda2"}},true)
Based on the official doc, the boolean argument specifies whether the match is full or partial.
Hope it helps you.