I have two questions.
Suppose I declare two variables, flowVars.ABC and flowVars.DEF, inside a foreach. How can I access those two variables outside the foreach block?
Each variable holds a JSON payload. How can I combine the data from those two variables into a single JSON payload?
Can anyone assist? I am unable to access the variables set inside the foreach or to combine the two JSON payloads.
This is my sample code:
<flow name="test">
<foreach doc:name="For Each">
<scatter-gather doc:name="Scatter-Gather">
<set-variable variableName="ABC" value="#[payload]" mimeType="application/json" doc:name="ABC"/>
<set-variable variableName="DEF" value="#[payload]" mimeType="application/json" doc:name="DEF"/>
</scatter-gather>
</foreach>
<set-payload value="#[flowVars.ABC + flowVars.DEF]" mimeType="application/json" doc:name="adding 2 vars"/>
</flow>
You need to understand how scoping works with foreach. Any variables set inside the foreach scope will NOT be available outside of that scope. However, variables set outside of the foreach scope (e.g. a set-variable before the foreach) will be available inside the foreach scope. This should help you get around your issue. I'm taking out the scatter-gather because it really doesn't serve any purpose in your example:
<flow name="test">
<set-variable variableName="ABC value="#[payload] mimeType="application/json" doc:name="ABC"/>
<set-variable variableName="DEF value="#[payload] mimeType="application/json" doc:name="DEF"/>
<foreach doc:name="For Each">
<set-variable variableName="ABC" value="#[payload]" mimeType="application/json" doc:name="ABC"/>
<set-variable variableName="DEF" value="#[payload]" mimeType="application/json" doc:name="DEF"/>
</foreach>
<set-payload value="#[flowVars.ABC ++ flowVars.DEF]" mimeType="application/json" doc:name="adding 2 vars"/>
</flow>
Beyond this, I'm not sure if your code is a simplification or not, but as it stands now a couple of things are questionable:
Why are you using a scatter-gather? If you don't really need to do multiple things asynchronously (like making calls to multiple services), it's just a complication in your code. Setting two vars doesn't qualify, in my opinion.
What is your code supposed to do? From my perspective it looks like you're just setting the payload to a duplicate of the last element in the original payload. If so, you could just do this in a transformer:
%dw 2.0
output application/json
---
if (not isEmpty(payload))
payload[-1] ++ payload[-1]
else
[]
Related
I have output from a web service in Mule that returns a LinkedHashMap, and I need to get the individual values so they can be dynamically inserted into a template. The template is used to send email through the SMTP connector. I can get all the values using MEL #[payload], but I can't get them one by one. I've tried #[payload.get(0)] and #[payload[0]], but they all return null.
The Mule XML looks like this:
<flow name="MW_Flow">
<file:inbound-endpoint path="C:\....\1" connector-ref="File" responseTimeout="10000" doc:name="File" pollingFrequency="60000"/>
<ws:consumer config-ref="File_Read_WS" operation="all3" doc:name="FileRead DBWriter WS"/>
<dw:transform-message metadata:id="6ee92ba8-9f67-40d6-bfa3-3e237da20822" doc:name="Transform Message">
<!-- transformation body omitted -->
</dw:transform-message>
<foreach doc:name="For Each">
<logger message="#[payload]" level="INFO" doc:name="Logger"/>
<parse-template location="C:\.....\Templates\Mail.txt" metadata:id="b7d894eb-465b-47f7-a542-b49fc4fb53d9" doc:name="Parse Template"/>
<logger message="2: #[message.exception] #[message.dataType] #[payload]" level="INFO" doc:name="Logger"/>
</foreach>
</flow>
The template (plain text file) looks a bit like this:
Hello [name].
This is email from [name2]. The following event [event].....
All I get are null values except when using #[payload] which returns the whole row (4 values).
Any help greatly appreciated!
/Johan
If your payload is a Map, then payload.get(0) or payload[0] behaves as if you were trying to get a value from the map with 0 as the key, which I guess doesn't exist in your map.
Try accessing it by key name instead: #[payload.name], #[payload.name2], or #[payload['name']].
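If the goal is to get those values into the email body, note that parse-template evaluates Mule expressions embedded in the template, so the same key-based access works there as well. A rough sketch, assuming the map keys really are name, name2 and event (adjust to whatever keys your web service returns):
Hello #[payload.name].
This is email from #[payload.name2]. The following event #[payload.event].....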
Scenario: I am converting a CSV file to JSON format, then taking each JSON element and making a GET request API call. I am doing this in a for-each loop. From each call I get a JSON response (extracting eventId and cost). Now I wish to club all these responses together under the main header listings and make a bigger JSON payload.
For example:
{
  "listings": [
    {
      "eventId": "8993478",
      "cost": 34
    },
    {
      "eventId": "xxxxxyyyy",
      "cost": zz
    }
  ]
}
How would I do this for all iteration entries? I can do it for a single entry (using a Groovy script).
You could define a variable before the for-each loop as an empty list with:
<set-variable variableName="listings" value="#[[]]" />
Then, on each iteration inside the for-each loop, append an element to that variable with:
<expression-transformer expression="#[flowVars.listings.add(flowVars.iterationMap)]" />
In the previous code fragment I used the variable flowVars.iterationMap to denote the map generated on each iteration.
Finally, if needed, you can add a set-payload transformer after the for-each loop:
<set-payload value="#[flowVars.listings]" />
HTH, Marcos
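Putting those fragments together, a complete flow might look roughly like this. This is only a sketch: the flow name is made up, the API call inside the foreach is omitted, flowVars.iterationMap stands for however each response entry is built, and the final dw:transform-message wraps the accumulated list under the listings key from the question:
<flow name="aggregate-listings-flow">
    <set-variable variableName="listings" value="#[[]]" doc:name="init empty listings"/>
    <foreach doc:name="For Each">
        <!-- call the API for the current entry and build flowVars.iterationMap from the response -->
        <expression-transformer expression="#[flowVars.listings.add(flowVars.iterationMap)]" doc:name="append to listings"/>
    </foreach>
    <dw:transform-message doc:name="Wrap under listings">
        <dw:set-payload><![CDATA[%dw 1.0
%output application/json
---
{ listings: flowVars.listings }]]></dw:set-payload>
    </dw:transform-message>
</flow>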
You can use the Batch module, but you would have to rewrite this logic a little differently. For example, you would no longer be able to use an aggregation flowVar like Marcos suggested. Instead, you would need to use a fixed-size batch:commit block (which would actually be better in many ways; for example, you could start sending bulks to the remote API while still processing some of the other records in the background).
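For reference, a fixed-size commit block in the Mule 3 batch module looks roughly like this (a sketch with made-up job and step names; the input phase and the actual bulk call depend on your flow):
<batch:job name="listingsBatchJob">
    <batch:input>
        <!-- produce the collection of records to process, e.g. the parsed CSV rows -->
    </batch:input>
    <batch:process-records>
        <batch:step name="sendBulksStep">
            <batch:commit size="20" doc:name="Batch Commit">
                <!-- here the payload is a list of up to 20 records; send the bulk to the remote API -->
            </batch:commit>
        </batch:step>
    </batch:process-records>
</batch:job>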
I like Marcos' answer, and it worked perfectly for my use case.
Simply creating an array in a flow variable and using the add() method on that array inside a foreach scope did the trick.
The OP's follow-up question was a good one; it prompted me to run an alternate test using the approach suggested. See both of my flows here:
<flow name="sampleAggregatorFlow" doc:description="this is a simple demo that shows how to aggregate results into an accumulator array">
<http:listener config-ref="manage-s3-api-httpListenerConfig" path="/aggregate" allowedMethods="GET" doc:name="HTTP"/>
<set-payload value="#[['red','blue','gold']]" doc:name="Set Payload"/>
<set-variable variableName="accumulator" value="#[[]]" doc:name="accumulator"/>
<foreach doc:name="For Each">
<expression-transformer expression="#[flowVars.accumulator.add(payload)]" doc:name="addEm"/>
</foreach>
<set-payload value="#[flowVars.accumulator]" doc:name="Set Payload"/>
<json:object-to-json-transformer doc:name="Object to JSON"/>
</flow>
<flow name="Copy_of_sampleAggregatorFlow" doc:description="this is a simple demo that shows how to aggregate results into an accumulator array">
<http:listener config-ref="manage-s3-api-httpListenerConfig" path="/aggregate2" allowedMethods="GET" doc:name="Copy_of_HTTP"/>
<set-payload value="#[['red','blue','gold']]" doc:name="Copy_of_Set Payload"/>
<set-variable variableName="accumulator" value="#[new java.util.ArrayList()]" doc:name="Copy_of_accumulator"/>
<foreach doc:name="Copy_of_For Each">
<expression-transformer expression="#[flowVars.accumulator.add(payload)]" doc:name="Copy_of_addEm"/>
</foreach>
<set-payload value="#[flowVars.accumulator]" doc:name="Copy_of_Set Payload"/>
<json:object-to-json-transformer doc:name="Copy_of_Object to JSON"/>
</flow>
Both flows produced the same outcome:
[
"red",
"blue",
"gold"
]
Tests conducted 12/26/2017 with Anypoint Studio 6.4.1 and with Mule Runtime 3.9.
In order to set multiple values in a set-payload transformer in Mule, we use:
<set-payload value="#[{1000,1,1,1}]" doc:name="Set Payload"/>
Can we assign multiple flow variables to a set-payload transformer, like this?
<set-payload value="#[{flowVars['principal'],flowVars['years'],flowVars['rate'],flowVars['appid']}]" doc:name="Set Payload"/>
Or is there another, more correct way to do it?
Thank you in advance.
Yes, this will work just fine. Whether you can do something that looks prettier depends on your full flow configuration.
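For instance, if named fields would read better than a positional array, a MEL map literal is one alternative (a sketch; the key names are only illustrative):
<set-payload value="#[['principal': flowVars.principal, 'years': flowVars.years, 'rate': flowVars.rate, 'appid': flowVars.appid]]" doc:name="Set Payload"/>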
You can also use the following:
<set-payload value="#[flowVars['principal']] #[flowVars['years']] ..." doc:name="Set Payload"/>
I get a list of files from Amazon S3 and iterate over the list, processing one file at a time. The corresponding flow is as follows:
<flow name="process-from-s3" doc:name="process-from-s3"
processingStrategy="synchronous">
<poll doc:name="Poll" frequency="${s3-poll-interval}">
<s3:list-objects config-ref="Amazon_S3" doc:name="Get List of files"
accessKey="${s3-access-key}" secretKey="${s3-secret-key}"
bucketName="${s3-read-bucket}" />
</poll>
<choice doc:name="Choice">
<foreach doc:name="For Each">
<set-session-variable variableName="s3_file_name" value="#[payload.getKey()]" doc:name="Session Variable"/>
<logger message="From bucket ( ${s3-read-bucket} ), received the file #[s3_file_name]" level="INFO" doc:name="Logger"/>
<flow-ref name="process_s3_file" doc:name="Flow Reference"/>
</foreach>
</choice>
</flow>
The flow works well; however, it keeps emitting the following log statement when no files are found:
[03-06 21:52:05] WARN Foreach$CollectionMapSplitter
[[myapp].connector.polling.mule.default.receiver.01]: Splitter returned no results.
If this is not expected, please check your split expression
How can I avoid this annoying log message? Should I wrap the foreach within a choice router that runs the foreach only if there is at least one element in the list? Any suggestions are welcome.
I would rather set the log level for org.mule.routing.Foreach$CollectionMapSplitter to ERROR than configure any additional logic for this warning. See Mule docs for configuring logger/log4j if you need to.
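For example, adding an entry like this to the application's log4j2.xml, inside the Loggers section, should silence the warning (a sketch; leave the rest of the file as Studio generated it):
<AsyncLogger name="org.mule.routing.Foreach$CollectionMapSplitter" level="ERROR"/>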
I have a list of JSON objects containing about 200 objects. I want to split that list into smaller lists, each containing at most 20 objects, and POST each sublist to an HTTP-based endpoint.
<flow name="send-to-next-step" doc:name="send-to-vm-flow">
<vm:inbound-endpoint exchange-pattern="one-way"
path="send-to-next-step-vm" doc:name="VM" />
<!-- received the JSON List payload with 200 objects-->
<!-- TODO: do processing here to split the list into sub-lists and call the sub-flow for each sub-list -->
<flow-ref name="send-to-aggregator-sf" doc:name="Flow Reference" />
</flow>
One possible way is to write a Java component that iterates over the list and calls the sub-flow after every 20 objects. Is there any better way of accomplishing this?
If your payload is a Java Collection, the Mule foreach scope has batching built in: http://www.mulesoft.org/documentation/display/current/Foreach
Example:
<foreach batchSize="20">
<json:object-to-json-transformer/>
<http:outbound-endpoint ... />
</foreach>
You could use the Groovy collate method for the batching, and then foreach or collection-splitter, depending on your needs:
<json:json-to-object-transformer returnClass="java.util.List"/>
<set-payload value="#[groovy:payload.collate(20)]"/>
<foreach>
<json:object-to-json-transformer/>
<http:outbound-endpoint exchange-pattern="request-response" host="0.0.0.0" port="8082" path="xx"/>
</foreach>
<set-payload value="#[groovy:payload.flatten()]"/>
This will send each batch of 20 objects to the HTTP endpoint and then flatten the payload back to the original list.