Remove Object Using Conditional Filtering - Mule 3

I need to do conditional filtering using Mule 3 (DataWeave 1.0). A sample payload is given below:
{
"MainNode": {
"ID": "123",
"MainNodekey1": "1",
"sysDetail": [
{
"sysDetail11": "localhost1",
"sysDetail21": "443",
"country": [
{
"k1": "country",
"value": [
"IN"
]
}
]
},
{
"sysDetail21": "localhost2",
"sysDetail22": "443",
"country": [
{
"k2": "country",
"value": [
"RSA",
"UK",
"SL"
]
}
]
}
],
"sysDetai2": {},
"sysDetai3": {}
},
"MainCode": "AAA",
"MainTyoe": "BBB",
"MyArray": [
{
"ObjId": "100",
"Obj1": {
"SubObj11": {},
"SubObj12": {},
"ObjCountry": "IN",
"ObjId": "A100"
},
"Obj1Source": {}
},
{
"ObjId": "200",
"Obj2": {
"SubObj21": {},
"SubObj22": {},
"ObjCountry": "IN",
"ObjId": "B100"
},
"Obj2Source": {}
}
]
}
The output I need is the main payload filtered by a condition like the one below, keeping only the sysDetail entries whose country values match the ObjCountry:
payload.MainNode.sysDetail.country..value == payload.MyArray.Obj1.ObjCountry (ObjCountry will be constant for all MyArray elements)
Expected output is below:
{
"MainNode": {
"ID": "123",
"MainNodekey1": "1",
"sysDetail": [
{
"sysDetail11": "localhost1",
"sysDetail21": "443",
"country": [
{
"k1": "country",
"value": [
"IN"
]
}
]
}
],
"sysDetai2": {},
"sysDetai3": {}
},
"MainCode": "AAA",
"MainTyoe": "BBB",
"MyArray": [
{
"ObjId": "100",
"Obj1": {
"SubObj11": {},
"SubObj12": {},
"ObjCountry": "IN",
"ObjId": "A100"
},
"Obj1Source": {}
},
{
"ObjId": "200",
"Obj2": {
"SubObj21": {},
"SubObj22": {},
"ObjCountry": "IN",
"ObjId": "B100"
},
"Obj2Source": {}
}
]
}
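A minimal DataWeave 1.0 (Mule 3) sketch for this kind of filter might look like the one below. It assumes the constant country can be read from the first MyArray element (payload.MyArray[0].Obj1.ObjCountry) and that each sysDetail entry carries a country array; adjust those paths to wherever the value actually lives:
%dw 1.0
%output application/json
%var targetCountry = payload.MyArray[0].Obj1.ObjCountry
---
// targetCountry is assumed to be constant across MyArray elements, as stated in the question
payload mapObject ((value, key) -> {
    (key): ((value mapObject ((v, k) -> {
        // keep only the sysDetail entries whose country values contain the target country
        (k): ((v filter (flatten($.country.value default []) contains targetCountry))
                 when (k as :string) == "sysDetail"
                 otherwise v)
    }))
        when (key as :string) == "MainNode"
        otherwise value)
})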

Related

DataWeave filtering nested arrays and displaying in descending order

I'm trying to filter an array based on some values nested in objects.
My data pertains to offers (array) and customers (array) with tickets (array) and other child arrays.
I want to use orderBy to get all the customer information ordered by the latest timestamp (an attribute in the tickets array).
In the example, offer 1 has customer 50001 with tickets 1001 and 1002, and customer 50002 with tickets 1003 and 1004. I want the customer with the latest timestamp across all of its tickets to be displayed first (descending order), with all the other customers ordered accordingly.
Request Payload:
{
"count": 1,
"offers": [{
"offerInfo": {
"orderNumber": "1",
"orderCreationDtTime": "2023-01-10 00:00:00"
},
"customers": [{
"customerInfo": {
"name": {
"frstNm": "JOHN",
"lstNm": "DOE"
}
},
"customerNum": "50001",
"tickets": [{
"timestamp": "2023-01-07 00:38:00.167000",
"ticketService": {
"ticketNum": "1001",
"ticketType": "3"
},
"ticketReps": [{
"seq": "1",
"comment": "1st",
"location": "US"
},
{
"seq": "2",
"comment": "2nd",
"location": "US"
}
]
},
{
"timestamp": "2023-01-11 00:38:00.167000",
"ticketService": {
"ticketNum": "1002",
"ticketType": "3"
},
"ticketReps": [{
"seq": "3",
"comment": "1st",
"location": "US"
},
{
"seq": "4",
"comment": "2nd",
"location": "US"
}
]
}
]
},
{
"customerInfo": {
"name": {
"frstNm": "FAN",
"lstNm": "SING"
}
},
"customerNum": "50002",
"tickets": [{
"timestamp": "2023-01-10 00:38:00.167000",
"ticketService": {
"ticketNum": "1003",
"ticketType": "3"
},
"ticketReps": [{
"seq": "1",
"comment": "1st",
"location": "US"
},
{
"seq": "2",
"comment": "2nd",
"location": "US"
}
]
},
{
"timestamp": "2023-01-19 00:38:00.167000",
"ticketService": {
"ticketNum": "1004",
"ticketType": "3"
},
"ticketReps": [{
"seq": "1",
"comment": "1st",
"location": "US"
},
{
"seq": "2",
"comment": "2nd",
"location": "US"
}
]
}
]
}
]
}]
}
Expected payload after the Transform Message:
{
"count": 1,
"offers": [{
"offerInfo": {
"orderNumber": "1",
"orderCreationDtTime": "2023-01-10 00:00:00"
},
"customers": [{
"customerInfo": {
"name": {
"frstNm": "FAN",
"lstNm": "SING"
}
},
"customerNum": "50002",
"tickets": [{
"timestamp": "2023-01-19 00:38:00.167000",
"ticketService": {
"ticketNum": "1004",
"ticketType": "3"
},
"ticketReps": [{
"seq": "1",
"comment": "1st",
"location": "US"
},
{
"seq": "2",
"comment": "2nd",
"location": "US"
}
]
},
{
"timestamp": "2023-01-10 00:38:00.167000",
"ticketService": {
"ticketNum": "1003",
"ticketType": "3"
},
"ticketReps": [{
"seq": "1",
"comment": "1st",
"location": "US"
},
{
"seq": "2",
"comment": "2nd",
"location": "US"
}
]
}
]
},
{
"customerInfo": {
"name": {
"frstNm": "JOHN",
"lstNm": "DOE"
}
},
"customerNum": "50001",
"tickets": [{
"timestamp": "2023-01-11 00:38:00.167000",
"ticketService": {
"ticketNum": "1002",
"ticketType": "3"
},
"ticketReps": [{
"seq": "3",
"comment": "1st",
"location": "US"
},
{
"seq": "4",
"comment": "2nd",
"location": "US"
}
]
},
{
"timestamp": "2023-01-07 00:38:00.167000",
"ticketService": {
"ticketNum": "1001",
"ticketType": "3"
},
"ticketReps": [{
"seq": "1",
"comment": "1st",
"location": "US"
},
{
"seq": "2",
"comment": "2nd",
"location": "US"
}
]
}
]
}
]
}]
}
The script below will help you.
%dw 2.0
output application/json
import * from dw::util::Values
---
payload update ["offers", "customers"] with (
    (($ map (
        $ update {
            case .tickets -> ($ orderBy $.timestamp as LocalDateTime {format: "yyyy-MM-dd HH:mm:ss.SSSSSS"}) [-1 to 0]
        }
    )) orderBy $.tickets[0].timestamp as LocalDateTime {format: "yyyy-MM-dd HH:mm:ss.SSSSSS"}) [-1 to 0]
)
An alternative solution, using the update operator at each step and auxiliary functions for clarity:
%dw 2.0
output application/json
fun convertTimestampToNumber(t)=t as LocalDateTime {format: "yyyy-MM-dd HH:mm:ss.SSSSSS"} as String {format: "yyyyMMddHHmmssSSSSSS"} as Number
fun getMaxTimestamp(t)=max(t map convertTimestampToNumber($.timestamp))
---
payload update {
    case offers at .offers -> offers map (
        $ update {
            case customers at .customers -> customers map (
                $ update {
                    case tickets at .tickets -> tickets orderBy ( -convertTimestampToNumber($.timestamp) )
                }
            ) orderBy ( -getMaxTimestamp($.tickets) )
        }
    )
}
Output:
{
"count": 1,
"offers": [
{
"offerInfo": {
"orderNumber": "1",
"orderCreationDtTime": "2023-01-10 00:00:00"
},
"customers": [
{
"customerInfo": {
"name": {
"frstNm": "FAN",
"lstNm": "SING"
}
},
"customerNum": "50002",
"tickets": [
{
"timestamp": "2023-01-19 00:38:00.167000",
"ticketService": {
"ticketNum": "1004",
"ticketType": "3"
},
"ticketReps": [
{
"seq": "1",
"comment": "1st",
"location": "US"
},
{
"seq": "2",
"comment": "2nd",
"location": "US"
}
]
},
{
"timestamp": "2023-01-10 00:38:00.167000",
"ticketService": {
"ticketNum": "1003",
"ticketType": "3"
},
"ticketReps": [
{
"seq": "1",
"comment": "1st",
"location": "US"
},
{
"seq": "2",
"comment": "2nd",
"location": "US"
}
]
}
]
},
{
"customerInfo": {
"name": {
"frstNm": "JOHN",
"lstNm": "DOE"
}
},
"customerNum": "50001",
"tickets": [
{
"timestamp": "2023-01-11 00:38:00.167000",
"ticketService": {
"ticketNum": "1002",
"ticketType": "3"
},
"ticketReps": [
{
"seq": "3",
"comment": "1st",
"location": "US"
},
{
"seq": "4",
"comment": "2nd",
"location": "US"
}
]
},
{
"timestamp": "2023-01-07 00:38:00.167000",
"ticketService": {
"ticketNum": "1001",
"ticketType": "3"
},
"ticketReps": [
{
"seq": "1",
"comment": "1st",
"location": "US"
},
{
"seq": "2",
"comment": "2nd",
"location": "US"
}
]
}
]
}
]
}
]
}
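Both answers rely on the fact that orderBy only sorts in ascending order: the first reverses the sorted array with the [-1 to 0] slice, while the second sorts on a negated numeric key. A tiny standalone illustration of the two tricks (the nums variable exists only for this example):
%dw 2.0
output application/json
var nums = [3, 1, 2]
---
{
    // slice the ascending result from its last index back to 0 to reverse it
    reversedAfterSort: (nums orderBy $)[-1 to 0],
    // or sort by a negated numeric key to get descending order directly
    negatedKey: nums orderBy ($ * -1)
}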

How to solve this error in MongoDB aggregation: TypeError: ({$unwind:{path:"$tags", preserveNullAndEmptyArrays:true}}) is not iterable

I have a collection:
[{
"_id": {"$oid": "63873b95c7e956270e734f3c"},
"employeeId": "emp1",
"users": [{"name": "Allen","registered": true},
{"name": "Henry","registered": true},
{"name": "Adam","registered": false}],
"tags": ["good","excellent","average"]
},{
"_id": {"$oid": "63873c05c7e956270e734f3d"},
"employeeId": "emp2",
"users": [{"name": "Federick","registered": false},
{"name": "Mary","registered": true},
{"name": "Sam","registered": false} ],
"tags": ["poor","excellent", "good"]}
,{
"_id": {"$oid": "63873c1fc7e956270e734f3e"},
"employeeId": "emp3",
"users": [ {"name": "john", "registered": true},
{"name": "jack","registered": true},
{"name": "elle", "registered": false } ],
"tags": ["very good", "excellent", "good" ]}]
I am trying to group by tags, with the list of employees and a count of employees under each tag. I get the correct output with the following Python code:
pipeline = [
    {"$unwind": "$users"},
    {"$match": {"users.registered": True}},
    {"$unwind": "$tags"},
    {
        "$group": {
            "_id": "$tags",
            "employees": {"$addToSet": "$employeeId"},
            "count": {"$sum": 1}
        }
    },
    {"$project": {"_id": 1, "employees": 1, "size": {"$size": "$employees"}}},
    {"$sort": {"size": -1}},
]
rec = db.ratings.aggregate(pipeline)
but it gives the error shown in the title when run in mongosh.
Is this what you want?
db.collection.aggregate([
{
"$unwind": "$tags"
},
{
"$group": {
"_id": "$tags",
"employees": {
"$push": "$employeeId"
}
}
},
{
"$project": {
"_id": 1,
"employees": 1,
"size": {
"$size": "$employees"
}
}
},
{
"$sort": {
"size": -1
}
}
])
See how it works on the playground example

How to apply a filter to nested arrays in MuleSoft DataWeave

I am trying to filter the JSON payload below: within the characters[] array, entries whose result is 'valid', and within their data[] array, entries with name 'WBB' and priority 1.
I tried the code below, but it is not working. Can someone help me?
flatten(payload.offers.category.characters) filter ((item, index) -> item.result=='valid' and flatten(item.data) filter ((item, index) -> item.name=='WBB' and item.priority==1))
Json payload
{
"offers": [
{
"id": 100,
"name": "Test1",
"category": {
"characters": [
{
"result": "valid",
"data": [
{
"name": "WBB",
"priority": 1
},
{
"name": "ILL",
"priority": 2
}
]
}
]
}
},
{
"id": 200,
"name": "Test2",
"category": {
"characters": [
{
"data": [
{
"name": "ISS",
"priority": 1
},
{
"name": "ILL",
"priority": 2
}
]
}
]
}
},
{
"id": 300,
"name": "Test3",
"category": {
"characters": [
{
"data": [
{
"name": "WSS",
"priority": 1
},
{
"name": "ILL",
"priority": 2
}
]
}
]
}
}
]
}
Expected payload
[
{
"name": "WBB",
"priority": 1
}
]
flatten((flatten(payload.offers..characters) filter $.result == "valid").data) filter ($.name=="WBB" and $.priority == 1)
flatten(payload..data) filter ((item, index) -> item.name == 'WBB' )
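For reference, the first expression above dropped into a complete transform might look like this, assuming a Mule 4 / DataWeave 2.0 runtime and the payload shown above:
%dw 2.0
output application/json
---
// keep the characters whose result is "valid", then keep only their
// data entries named "WBB" with priority 1
flatten(
    (flatten(payload.offers..characters) filter ($.result == "valid")).data
) filter ($.name == "WBB" and $.priority == 1)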

aggregate in mongodb left join with $lookup

I have three collections
posts=[
{
"id": "p1",
"title": "title 1"
},
{
"id": "p2",
"title": "title 2"
}]
users = [
{
"id": "u1",
"name": "name1"
},
{
"id": "u2",
"name": "name2"
}]
comments = [
{
"userId": "u1",
"postId": "p1",
"comment": "comment 1"
}]
I want to get all posts from the posts collection, with the comments in each post filtered by userId ("u1"), like this:
posts=[
{
"id": "p1",
"title": "title 1",
"comments":[
"userId": "u1",
"comment": "comment 1"
]
},
{
"id": "p2",
"title": "title 2",
"comments":[]
}]
I used the aggregate function with the $lookup operator, but I don't know how to use the $match operator to filter by userId. My aggregation is below:
self.db.posts.aggregate([
{
"$lookup":{
"from": "comments",
"localField": "id",
"foreignField": "postId",
"as": "comments",
}
},
{
"$match":{
"comments.userId": {"$eq": param.objectUserId}
},
},
{"$skip": (param.page - 1) * param.pageSize},
{"$limit": param.pageSize},
{"$sort": {"unixDate": pymongo.DESCENDING}}
])
It only returns the one post whose comments match userId = "u1"; the post with no matching comments is dropped entirely.
Please help me!
Thanks, all!
You have to make use of the pipeline option of the $lookup stage and pass the additional conditions that you want to apply:
db.posts.aggregate([
{
"$lookup": {
"from": "comments",
"let": {
"pId": "$id"
},
"pipeline": [
{
"$match": {
"$expr": {
"$eq": [
"$postId",
"$$pId"
],
},
"userId": "u1",
},
},
{
"$project": {
"_id": 0,
"userId": 1,
"comment": 1,
},
},
],
"as": "comments"
}
}
])
Mongo Playground sample execution. The equivalent pymongo call with your parameters:
self.db.posts.aggregate([
{
"$lookup": {
"from": "comments",
"let": {
"pId": "$id"
},
"pipeline": [
{
"$match": {
"$expr": {
"$eq": [
"$postId",
"$$pId"
],
},
"userId": param.objectUserId,
},
},
{
"$project": {
"_id": 0,
"userId": 1,
"comment": 1,
},
},
],
"as": "comments"
}
},
{"$skip": (param.page - 1) * param.pageSize},
{"$limit": param.pageSize},
{"$sort": {"unixDate": pymongo.DESCENDING}}
])

Replace specific values in the array using DWL 1.0

I am having a problem using the mapObject function properly.
I am trying to retain the existing array structure, but calculate the number of vehicles and properties and update the existing array that contains those values.
GENERAL data comes from one source, VEHICLE data from another, and PROPERTY data from a third, so when merging I have to update the GENERAL data with the counts from the other sources.
Also, GENERAL is an array that will always have exactly one element, so using GENERAL[0] is safe and fine.
Original Payload
[
{
"commId": "1",
"GENERAL": [
{
"ID": "G1",
"VEHICLE_COUNT": "TODO",
"PROPERTY_COUNT": "TODO"
}
],
"VEHICLE": [
{
"ID": "V1-1"
},
{
"ID": "V1-2"
}
],
"PROPERTY": [
{
"ID": "P1-1"
}
]
},
{
"commId": "2",
"GENERAL": [
{
"ID": "G2",
"VEHICLE_COUNT": "TODO",
"PROPERTY_COUNT": "TODO"
}
],
"VEHICLE": [
{
"ID": "V2-1"
}
],
"PROPERTY": [
{
"ID": "P2-1"
},
{
"ID": "P2-2"
}
]
},
{
"commId": "3",
"GENERAL": [
{
"ID": "G3",
"VEHICLE_COUNT": "TODO",
"PROPERTY_COUNT": "TODO"
}
],
"VEHICLE": [
{
"ID": "V3-1"
},
{
"ID": "V3-2"
},
{
"ID": "V3-3"
}
]
}
]
I tried using map to loop through the payload and modify the two attributes, but I only managed to map one of them, and even that shows the wrong output.
test map (item, index) -> {
(item.GENERAL[0] mapObject (value, key) -> {
(key): (value == sizeOf (item.VEHICLE)
when (key as :string) == "VEHICLE_COUNT"
otherwise value)
})
}
Expected output:
[
{
"commId": "1",
"GENERAL": [
{
"ID": "G1",
"VEHICLE_COUNT": "2",
"PROPERTY_COUNT": "1"
}
],
"VEHICLE": [
{
"ID": "V1-1"
},
{
"ID": "V1-2"
}
],
"PROPERTY": [
{
"ID": "P1-1"
}
]
},
{
"commId": "2",
"GENERAL": [
{
"ID": "G2",
"VEHICLE_COUNT": "1",
"PROPERTY_COUNT": "2"
}
],
"VEHICLE": [
{
"ID": "V2-1"
}
],
"PROPERTY": [
{
"ID": "P2-1"
},
{
"ID": "P2-2"
}
]
},
{
"commId": "3",
"GENERAL": [
{
"ID": "G3",
"VEHICLE_COUNT": "3",
"PROPERTY_COUNT": "0"
}
],
"VEHICLE": [
{
"ID": "V3-1"
},
{
"ID": "V3-2"
},
{
"ID": "V3-3"
}
]
}
]
Getting totally wrong output so far:
[
{
"ID": "G1",
"VEHICLE_COUNT": false,
"PROPERTY_COUNT": "TODO"
},
{
"ID": "G2",
"VEHICLE_COUNT": false,
"PROPERTY_COUNT": "TODO"
},
{
"ID": "G3",
"VEHICLE_COUNT": false,
"PROPERTY_COUNT": "TODO"
}
]
Edited: updated for a dynamic transform.
The DataWeave transform below is not particularly attractive, but it might work for you.
Thanks to Christian Chibana for helping me find a dynamic answer by answering this question: Why does Mule DataWeave array map strip top level objects?
%dw 1.0
%output application/json
---
payload map ((item) ->
    (item - "GENERAL") ++ {
        GENERAL: item.GENERAL map (
            $ - "VEHICLE_COUNT"
              - "PROPERTY_COUNT"
              ++ { VEHICLE_COUNT: sizeOf (item.VEHICLE default []) }
              ++ { PROPERTY_COUNT: sizeOf (item.PROPERTY default []) }
        )
    }
)
It is dynamic, so everything should be copied across as it comes in, with only the two fields you want being updated.
The output of this transform with the input you supplied is below. The only difference from your desired output is that the counts are shown as numbers rather than strings. If you really need them as strings, you can cast them, e.g. (sizeOf (item.VEHICLE default [])) as :string.
[
{
"commId": "1",
"VEHICLE": [
{
"ID": "V1-1"
},
{
"ID": "V1-2"
}
],
"PROPERTY": [
{
"ID": "P1-1"
}
],
"GENERAL": [
{
"ID": "G1",
"VEHICLE_COUNT": 2,
"PROPERTY_COUNT": 1
}
]
},
{
"commId": "2",
"VEHICLE": [
{
"ID": "V2-1"
}
],
"PROPERTY": [
{
"ID": "P2-1"
},
{
"ID": "P2-2"
}
],
"GENERAL": [
{
"ID": "G2",
"VEHICLE_COUNT": 1,
"PROPERTY_COUNT": 2
}
]
},
{
"commId": "3",
"VEHICLE": [
{
"ID": "V3-1"
},
{
"ID": "V3-2"
},
{
"ID": "V3-3"
}
],
"GENERAL": [
{
"ID": "G3",
"VEHICLE_COUNT": 3,
"PROPERTY_COUNT": 0
}
]
}
]
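If you would rather stay closer to the mapObject approach from the question and keep the original key order, a sketch like the one below might also work in DWL 1.0. The key change from the attempt in the question is that the count fields are replaced with sizeOf(...) rather than compared to it with ==; like the answer above, it produces numeric counts:
%dw 1.0
%output application/json
---
payload map ((item) -> item mapObject ((value, key) -> {
    // only the GENERAL array is rewritten; every other key is copied as-is
    (key): ((value map ($ mapObject ((v, k) -> {
        (k): ((sizeOf (item.VEHICLE default []))
                 when (k as :string) == "VEHICLE_COUNT"
                 otherwise ((sizeOf (item.PROPERTY default []))
                     when (k as :string) == "PROPERTY_COUNT"
                     otherwise v))
    })))
        when (key as :string) == "GENERAL"
        otherwise value)
}))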