In Mule ESB DataWeave I am having trouble ignoring empty objects, {}.
I am trying to check whether a particular table exists in my input. If it exists, I apply some business logic; if it doesn't, it should not be included in the output. However, I am getting {} instead of nothing.
This is my input file:
{
"srcTable": {
"srcList": [
{
"tableNames": "table1",
"src": [
{
"srcKey": [
{
"key": "date",
"value": "01/01/2016"
},
{
"key": "withinYearTotalMaxSection",
"value": "2500"
},
{
"key": "previousClaimsTotalMaxSection",
"value": "25000"
},
{
"key": "previousClaimsTotalMax",
"value": "50000"
}
]
}
]
},
{
"tableNames": "table2",
"src": [
{
"srcKey": [
{
"key": "date",
"value": "01/01/2016"
},
{
"key": "type",
"value": "A"
},
{
"key": "garden",
"value": "1000"
},
{
"key": "risk",
"value": "50000"
}
]
},
{
"srcKey": [
{
"key": "date",
"value": "01/01/2016"
},
{
"key": "type",
"value": "B"
},
{
"key": "garden",
"value": "0"
},
{
"key": "risk",
"value": "50000"
}
]
}
]
},
{
"tableNames": "table3",
"src": [
{
"srcKey": [
{
"key": "date",
"value": "01/01/2016"
},
{
"key": "type",
"value": "GLD"
},
{
"key": "plants",
"value": "1500"
},
{
"key": "theft",
"value": "3000"
}
]
},
{
"srcKey": [
{
"key": "date",
"value": "01/01/2016"
},
{
"key": "type",
"value": "SVR"
},
{
"key": "plants",
"value": "0"
},
{
"key": "theft",
"value": "1000"
}
]
}
]
}
]
}
}
This is my DataWeave:
%dw 1.0
%output application/json skipNullOn="everything"
---
{
singlevalue: [
{
(payload.srcTable.srcList filter ($.tableNames == 'table1') map (r,pos)-> {
(r.src map {
($.srcKey filter ($.key == 'date') map {
name: 'date',
value: $.value
})
})
})
},
{
(payload.srcTable.srcList filter ($.tableNames != null and $.tableNames == 'xxx') map (r,pos)-> {
(r.src map {
($.srcKey filter ($.key == 'date') map {
name: 'date' when $.value != null otherwise null,
value: $.value
})
})
})
}
]
}
This is my output file:
{
"singlevalue": [
{
"name": "date",
"value": "01/01/2016"
},
{}
]
}
Can anyone suggest how to get rid of the empty objects, {}, please?
Thank you and regards
NK
The easiest thing to do would be to remove all the empty elements at the end like this:
%dw 1.0
%output application/json skipNullOn="everything"
%var transformation = [
{
(payload.srcTable.srcList filter ($.tableNames == 'table1') map (r,pos)-> {
(r.src map {
($.srcKey filter ($.key == 'date') map {
name: 'date',
value: $.value
})
})
})
},
{
(payload.srcTable.srcList filter ($.tableNames != null and $.tableNames == 'xxx') map (r,pos)-> {
(r.src map {
($.srcKey filter ($.key == 'date') map {
name: 'date' when $.value != null otherwise null,
value: $.value
})
})
})
}
]
%function removeEmptyObjects(e)
e filter $ != {}
---
{ singleValue: removeEmptyObjects(transformation) }
This outputs:
{
"singleValue": [
{
"name": "date",
"value": "01/01/2016"
}
]
}
With Josh's help, this is the solution, in case anyone is interested.
It uses sizeOf combined with filter:
%dw 1.0
%output application/json skipNullOn="everything"
{
singlevalue: [
({
(payload.srcTable.srcList filter ($.tableNames == 'table1') map (r,pos)-> {
(r.src map {
($.srcKey filter ($.key == 'date') map {
name: 'date',
value: $.value
})
})
})
}) when (sizeOf (payload.srcTable.srcList filter $.tableNames == 'table1')) != 0,
({
(payload.srcTable.srcList filter ($.tableNames != null and $.tableNames == 'xxx') map (r,pos)-> {
(r.src map {
($.srcKey filter ($.key == 'date') map {
name: 'date' when $.value != null otherwise null,
value: $.value
})
})
})
}) when (sizeOf (payload.srcTable.srcList filter $.tableNames == 'xxx')) != 0
]
}
I am trying to filter the JSON payload below: from the characters[] array I want the entries whose result is 'valid', and from their data[] array the items with name 'WBB' and priority 1.
I tried the code below, but it is not working. Can someone help me?
flatten(payload.offers.category.characters) filter ((item, index) -> item.result=='valid' and flatten(item.data) filter ((item, index) -> item.name=='WBB' and item.priority==1))
Json payload
{
"offers": [
{
"id": 100,
"name": "Test1",
"category": {
"characters": [
{
"result": "valid",
"data": [
{
"name": "WBB",
"priority": 1
},
{
"name": "ILL",
"priority": 2
}
]
}
]
}
},
{
"id": 200,
"name": "Test2",
"category": {
"characters": [
{
"data": [
{
"name": "ISS",
"priority": 1
},
{
"name": "ILL",
"priority": 2
}
]
}
]
}
},
{
"id": 300,
"name": "Test3",
"category": {
"characters": [
{
"data": [
{
"name": "WSS",
"priority": 1
},
{
"name": "ILL",
"priority": 2
}
]
}
]
}
}
]
}
Expected payload
[
{
"name": "WBB",
"priority": 1
}
]
This works by first filtering the characters entries whose result is "valid", flattening their data arrays, and then filtering on name and priority:
flatten((flatten(payload.offers..characters) filter $.result == "valid").data) filter ($.name=="WBB" and $.priority == 1)
Alternatively, if matching on the name alone is enough (note this skips the result and priority checks):
flatten(payload..data) filter ((item, index) -> item.name == 'WBB')
I'm trying to do some testing with MongoDB, and I have figured out how to translate some of the simpler MySQL queries to MongoDB. Now I have this slightly more complex query.
I have this query that tells me whether there was a message in a certain period from a given user:
SELECT CASE WHEN count(1) = 0 THEN false ELSE true END AS 'value'
FROM messages m
WHERE m.time BETWEEN 0 AND 1652471890 AND m.user_id = '256f5280-fb49-4ad6-b7f5-65c4329d46e0';
Currently I am trying this aggregation to count the matches; I want it to emit a custom 0/1 value instead:
Current MongoDB aggregate:
db.messages.aggregate([
{ $match:
{
$and: [
{user_id: "256f5280-fb49-4ad6-b7f5-65c4329d46e0"},
{time: {$gt: 1622471890, $lt: 1822471890}}
]
}
},
{ $count: "value"}
])
Dataset:
[
{
"time": 1422471890,
"user_id": "256f5280-fb49-4ad6-b7f5-65c4329d46e0",
"message": "This is an example of my db"
},
{
"time": 1622471890,
"user_id": "256f5280-fb49-4ad6-b7f5-65c4329d46e0",
"message": "This is an example of my db (1)"
},
{
"time": 1622471890,
"user_id": "256f5280-fb49-4ad6-b7f5-65c4329d46e0",
"message": "This is an example of my db (2)"
},
{
"time": 1622471890,
"user_id": "e194d667-d79f-4262-94b1-ecf4561c9418",
"message": "This is an example of my db (3)"
},
{
"time": 1922471890,
"user_id": "256f5280-fb49-4ad6-b7f5-65c4329d46e0",
"message": "This is an example of my db (4)"
}
]
Return:
With this dataset it's returning:
{ "value" : 2 }
I'm trying to make it return:
If count > 0:
{ "value": 1 }
If count <= 0:
{ "value": 0 }
You just need one more $addFields stage to apply $cond to your value. Note that $count emits no document at all when nothing matches, which is why the pipeline also uses the $unionWith/$ifNull trick at the end to still produce { "value": 0 } in that case:
db.collection.aggregate([
{
$match: {
$and: [
{
user_id: "256f5280-fb49-4ad6-b7f5-65c4329d46e0"
},
{
time: {
$gte: 1622471890,
$lt: 1822471890
}
}
]
}
},
{
"$count": "value"
},
{
"$addFields": {
"value": {
"$cond": {
"if": {
"$gt": [
"$value",
0
]
},
"then": 1,
"else": 0
}
}
}
},
{
"$unionWith": {
"coll": "collection",
"pipeline": [
{
$limit: 1
}
]
}
},
{
"$sort": {
value: -1
}
},
{
$limit: 1
},
{
"$project": {
_id: 0,
value: {
"$ifNull": [
"$value",
0
]
}
}
}
])
I am having trouble using the mapObject function properly.
I am trying to retain the existing array structure, but calculate the number of vehicles and properties and update the existing array that contains those values.
GENERAL data comes from one source, VEHICLE data from another, and PROPERTY data from a third. So when merging, I have to update the GENERAL data with the counts from the other sources.
Also, GENERAL is an array that will always have exactly one element, so using GENERAL[0] is safe.
Original Payload
[
{
"commId": "1",
"GENERAL": [
{
"ID": "G1",
"VEHICLE_COUNT": "TODO",
"PROPERTY_COUNT": "TODO"
}
],
"VEHICLE": [
{
"ID": "V1-1"
},
{
"ID": "V1-2"
}
],
"PROPERTY": [
{
"ID": "P1-1"
}
]
},
{
"commId": "2",
"GENERAL": [
{
"ID": "G2",
"VEHICLE_COUNT": "TODO",
"PROPERTY_COUNT": "TODO"
}
],
"VEHICLE": [
{
"ID": "V2-1"
}
],
"PROPERTY": [
{
"ID": "P2-1"
},
{
"ID": "P2-2"
}
]
},
{
"commId": "3",
"GENERAL": [
{
"ID": "G3",
"VEHICLE_COUNT": "TODO",
"PROPERTY_COUNT": "TODO"
}
],
"VEHICLE": [
{
"ID": "V3-1"
},
{
"ID": "V3-2"
},
{
"ID": "V3-3"
}
]
}
]
I tried using map to loop through the payload and modify the two attributes, but I only managed to map one, and even that shows the wrong output.
test map (item, index) -> {
(item.GENERAL[0] mapObject (value, key) -> {
(key): (value == sizeOf (item.VEHICLE)
when (key as :string) == "VEHICLE_COUNT"
otherwise value)
})
}
Expected output:
[
{
"commId": "1",
"GENERAL": [
{
"ID": "G1",
"VEHICLE_COUNT": "2",
"PROPERTY_COUNT": "1"
}
],
"VEHICLE": [
{
"ID": "V1-1"
},
{
"ID": "V1-2"
}
],
"PROPERTY": [
{
"ID": "P1-1"
}
]
},
{
"commId": "2",
"GENERAL": [
{
"ID": "G2",
"VEHICLE_COUNT": "1",
"PROPERTY_COUNT": "2"
}
],
"VEHICLE": [
{
"ID": "V2-1"
}
],
"PROPERTY": [
{
"ID": "P2-1"
},
{
"ID": "P2-2"
}
]
},
{
"commId": "3",
"GENERAL": [
{
"ID": "G3",
"VEHICLE_COUNT": "3",
"PROPERTY_COUNT": "0"
}
],
"VEHICLE": [
{
"ID": "V3-1"
},
{
"ID": "V3-2"
},
{
"ID": "V3-3"
}
]
}
]
Getting totally wrong output so far:
[
{
"ID": "G1",
"VEHICLE_COUNT": false,
"PROPERTY_COUNT": "TODO"
},
{
"ID": "G2",
"VEHICLE_COUNT": false,
"PROPERTY_COUNT": "TODO"
},
{
"ID": "G3",
"VEHICLE_COUNT": false,
"PROPERTY_COUNT": "TODO"
}
]
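For what it's worth, the false values in that output come from value == sizeOf (item.VEHICLE): the == there is a comparison, not an assignment, so the whole expression evaluates to a boolean. A minimal sketch of that same approach with the comparison removed (still only handling VEHICLE_COUNT, and assuming the payload is bound to a test variable as in the snippet above):

```dataweave
%dw 1.0
%output application/json
%var test = payload
---
test map (item, index) -> {
    (item.GENERAL[0] mapObject (value, key) -> {
        // emit the vehicle count for VEHICLE_COUNT, pass everything else through
        (key): (sizeOf (item.VEHICLE default [])
            when (key as :string) == "VEHICLE_COUNT"
            otherwise value)
    })
}
```

This still loses the surrounding commId/VEHICLE/PROPERTY structure, which the dynamic answer below addresses.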
Edited: update for a dynamic transform.
The DataWeave transform below is not particularly attractive, but it might work for you.
Thanks to Christian Chibana for helping me find a dynamic answer by answering this question: Why does Mule DataWeave array map strip top level objects?
%dw 1.0
%output application/json
---
payload map ((item) ->
(item - "GENERAL") ++
GENERAL: item.GENERAL map (
$ - "VEHICLE_COUNT"
- "PROPERTY_COUNT"
++ { VEHICLE_COUNT: sizeOf (item.VEHICLE default []) }
++ { PROPERTY_COUNT: sizeOf (item.PROPERTY default []) }
)
)
It is dynamic, so everything should be copied across as it comes in, with only the two count fields being updated.
The output of this transform for the input you supplied is below. The only difference from your desired output is that the counts are numbers rather than strings. If you really need them as strings, you can cast them, e.g. (sizeOf (item.VEHICLE default [])) as :string.
[
{
"commId": "1",
"VEHICLE": [
{
"ID": "V1-1"
},
{
"ID": "V1-2"
}
],
"PROPERTY": [
{
"ID": "P1-1"
}
],
"GENERAL": [
{
"ID": "G1",
"VEHICLE_COUNT": 2,
"PROPERTY_COUNT": 1
}
]
},
{
"commId": "2",
"VEHICLE": [
{
"ID": "V2-1"
}
],
"PROPERTY": [
{
"ID": "P2-1"
},
{
"ID": "P2-2"
}
],
"GENERAL": [
{
"ID": "G2",
"VEHICLE_COUNT": 1,
"PROPERTY_COUNT": 2
}
]
},
{
"commId": "3",
"VEHICLE": [
{
"ID": "V3-1"
},
{
"ID": "V3-2"
},
{
"ID": "V3-3"
}
],
"GENERAL": [
{
"ID": "G3",
"VEHICLE_COUNT": 3,
"PROPERTY_COUNT": 0
}
]
}
]
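If string counts are required, the two count lines in the transform above can be adjusted with a cast, e.g.:

```dataweave
++ { VEHICLE_COUNT: (sizeOf (item.VEHICLE default [])) as :string }
++ { PROPERTY_COUNT: (sizeOf (item.PROPERTY default [])) as :string }
```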
I need to extract keys and values from an input JSON object to form a different JSON output.
I went through the documentation and other similar questions, where I found that $$ gives the key, but in my case it is giving me the index, not the key name.
The input JSON looks like this:
{
"key2": "val2",
"key3": "val3",
"key4": "val4",
"key5": "val5",
"key6": "val6"
}
The DataWeave code I have written is:
{
"someOtherKey": "val",
properties: {
entry: payload map
{
key:$$,
value:$
}
}
}
After the transformation I am getting:
{
"someOtherKey": "val",
"properties": {
"entry": [
{
"key": 0,
"value": "val2"
},
{
"key": 1,
"value": "val3"
},
{
"key": 2,
"value": "val4"
},
{
"key": 3,
"value": "val5"
},
{
"key": 4,
"value": "val6"
}
]
}
}
Here I am expecting the key name as the value for key.
Expected output:
{
"someOtherKey": "val",
"properties": {
"entry": [{
"key": "key2",
"value": "val2"
},
{
"key": "key3",
"value": "val3"
},
{
"key": "key4",
"value": "val4"
},
{
"key": "key5",
"value": "val5"
},
{
"key": "key6",
"value": "val6"
}
]
}
}
The pluck operator worked for me. Here is an example:
{
"someOtherKey": "val",
properties: {
entry: payload pluck
{
key:$$,
value:$
}
}
}
Use mapObject instead of map
%dw 1.0
%output application/json
---
{
key: "val",
key1: "val1",
properties: {
entry: payload mapObject {
key:$$,
value:$
}
}
}
Hope this helps.
I am trying to filter same-level object values in the payload in DataWeave. I was able to loop through them, but it does not produce the expected output.
Sample Payload:
{
"root": {
"createItem": {
"itemInfo": {
"lines": [{
"lineIdentifier": "4",
"Attributes": "Test1",
"partNumber": "QZRB"
}, {
"lineIdentifier": "10",
"Attributes": "Test3",
"partNumber": "QPR1"
}, {
"lineIdentifier": "12",
"Attributes": "Test4",
"partNumber": "QHT2"
}]
}
},
"ItemResponse": {
"lines": [{
"lineIdentifier": 4,
"itemName": "QZRB",
"status": "FAILED"
}, {
"lineIdentifier": 10,
"itemName": "QPR1",
"status": "COMPLETE"
}, {
"lineIdentifier": 12,
"itemName": "QHT2",
"status": "COMPLETE"
}]
}
}
}
Expected Output:
{
"root": {
"createItem": {
"itemInfo": {
"lines": [ {
"lineIdentifier": "10",
"Attributes": "Test3",
"partNumber": "QPR1"
}, {
"lineIdentifier": "12",
"Attributes": "Test4",
"partNumber": "QHT2"
}]
}
}
}
}
Here's what I am doing:
{
root: {
(payload.root.createItem.itemInfo.lines map ((respLines, indexOfRespLines) -> {
items:payload.root.ItemResponse.lines filter ($.itemName == respLines.partNumber and $.status =='COMPLETE') map
{
item: $.itemName,
attributes: respLines.Attributes
}
}
)
)
}
}
How do I achieve this?
Thanks,
ROA
Try this:
%dw 1.0
%output application/json
%var completedLines = payload.root.ItemResponse.lines filter $.status == 'COMPLETE' map $.lineIdentifier as :string
---
{
root: {
createItem: {
itemInfo: {
lines: payload.root.createItem.itemInfo.lines filter (completedLines contains $.lineIdentifier)
}
}
}
}
Pay attention to the as :string in completedLines: lineIdentifier in ItemResponse is a number, while in itemInfo it is a string, so without the coercion the contains check would never match.
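As a minimal illustration of the type mismatch (hypothetical values, not taken from the payload above): in DataWeave 1.0 a string and a number do not compare equal, so the cast is what makes the contains check succeed:

```dataweave
%dw 1.0
%output application/json
---
{
  withoutCast: ["4", "10"] contains 10,
  withCast: ["4", "10"] contains (10 as :string)
}
```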