Handling many if/else conditions in a single Mule flow

I have to call another team's System API (SAPI) with different query parameters based on different conditions.
I am adding the different conditions (15 for now; the count may increase in the future) in a Choice router, and based on the matching condition I call the same SAPI with different query parameters.
But the Choice router is becoming very cumbersome and my flow is looking ugly.
Please suggest a better way to handle this scenario.

One approach I can think of is to abstract the System API call into its own flow which expects a JSON flow variable (sapiParameters) that holds all the expected parameters, for example:
{
param1: "hello",
param2: "world",
...
}
Then you can move the conditional logic into transformers that create this sapiParameters variable.
These transformers could be DataWeave transformers, but they could also become Java classes or Groovy scripts if the logic gets too complex.
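A hedged sketch of what this could look like in Mule 4 (the flow name call-sapi, the field payload.orderType, and the parameter values are placeholders, not from the original question):
<ee:transform doc:name="Build sapiParameters" >
    <ee:variables >
        <ee:set-variable variableName="sapiParameters" ><![CDATA[%dw 2.0
output application/java
---
// map each business condition to the query parameters the SAPI expects
if (payload.orderType == "domestic")      { region: "US", expedited: "false" }
else if (payload.orderType == "priority") { region: "US", expedited: "true" }
else                                      { region: "INTL", expedited: "true" }]]></ee:set-variable>
    </ee:variables>
</ee:transform>
<flow-ref name="call-sapi" doc:name="Call SAPI" />
Inside the call-sapi flow the HTTP Request can read #[vars.sapiParameters] for its query parameters, so all the conditional logic stays in one transformer and the Choice router disappears.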

Try using the DataWeave match/case operator to remove the multiple if/else branches. Based on your conditions, you can build the different query parameters to pass to the API you are calling.
https://docs.mulesoft.com/mule-runtime/4.3/dataweave-pattern-matching
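A hedged sketch of what that could look like (the field payload.country and the parameter values are placeholders):
%dw 2.0
output application/java
---
// build the SAPI query parameters from a single discriminating field
payload.country match {
    case "US" -> { region: "NA", currency: "USD" }
    case "DE" -> { region: "EU", currency: "EUR" }
    else -> { region: "OTHER", currency: "USD" }
}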

Another approach I can think of is DataWeave, using if/else if/else or pattern matching with the match operator:
<ee:transform doc:name="Choice alternative" doc:id="230-3-kke9" >
<ee:message >
</ee:message>
<ee:variables >
<ee:set-variable variableName="choiceFlow" ><![CDATA[%dw 2.0
output application/java
---
if(<condition1>)
Mule::lookup('flow1', {test:'hello '})
else if (<conditional2>)
Mule::lookup('flow2', {test:'hello '})
else
Mule::lookup('flow3', {test:'hello '})]]></ee:set-variable>
</ee:variables>
</ee:transform>
Note that the internal workings of the DataWeave engine might cause a lookup function to be invoked in parallel with other lookup functions, or not to be invoked at all, so MuleSoft recommends invoking flows with the Flow Reference (flow-ref) component instead.
Please note: you can add more else if branches to the above script.
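If you follow that recommendation, a hedged alternative to the lookup() calls above is to compute only the target flow name in DataWeave and then use a single dynamic Flow Reference (the flow names and the payload.type condition are placeholders):
<set-variable variableName="targetFlow" value="#[if (payload.type == 'A') 'flow1' else if (payload.type == 'B') 'flow2' else 'flow3']" doc:name="Pick target flow" />
<flow-ref name="#[vars.targetFlow]" doc:name="Dynamic flow reference" />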

Related

if else statement mule 3

I want to set a variable based on the output in Mule 3.
For example, the check I want to do is: if there is any payload, set the variable's value to ${http.path.one}, else to ${http.path.two}.
In Mule 4 this can be done in multiple ways, but in Mule 3 it seems a little tricky. Anyone have an idea?
Thanks
In Mule 3 DataWeave you can use when/otherwise instead of Mule 4's if/else. To access the properties use the p() function. Depending on the exact payload and the condition you need, you may have to tweak the expression.
Example:
p('http.path.one') when (payload != null) otherwise p('http.path.two')
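For completeness, a hedged sketch of how that expression could be used in a Mule 3 flow to set a flow variable (the variable name targetPath is just a placeholder):
<dw:transform-message doc:name="Set targetPath">
    <dw:set-variable variableName="targetPath"><![CDATA[%dw 1.0
%output application/java
---
p('http.path.one') when (payload != null) otherwise p('http.path.two')]]></dw:set-variable>
</dw:transform-message>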

mule3 to mule 4 expression to dataweave 2.0

I'm new to migrating Mule 3 apps to Mule 4. I have done almost all of the conversion, but one expression stopped my flow and I am not able to reproduce its logic. If anyone has an idea about how to transform this expression, please help me.
Expression:
if(flowVars.maindata.keySet().contains(payload.idCaseNumber))
{
flowVars.temporary=[];
flowVars.maindata.get(payload.idCaseNumber).add(map);
}
else
{
flowVars.temporary.add(previousdata);
vars.maindata.put(payload.idCaseNumber,temporary);
}
I have tried to the best of my knowledge, but I'm still getting a problem with this line:
flowVars.maindata.get(payload.idCaseNumber).add(map);
In Mule 3 the expression language is MEL. In Mule 4 it is DataWeave 2.0. You can't just translate directly: MEL is an imperative scripting language, similar to a subset of Java, and it is easy to call Java methods from it, while DataWeave 2.0 is a functional language. Furthermore, Mule 4 operations (for example a Transform Message or a Set Variable) can only return one value, which can be assigned to the payload or to one variable.
For your snippet I'll assume that maindata is a map. You can use two Set Variable operations to assign each variable:
<set-variable variableName="temporary" value="#[ if( namesOf(vars.maindata) contains payload.idCaseNumber ) [] else vars.temporary ++ previousdata ]" />
I don't know exactly what you use for previousdata.
To update the variable maindata, the update operator is probably a good match, in a separate Set Variable or Transform Message operation, with the same condition as for vars.temporary.
Update:
I'll assume vars.maindata is a map, which DataWeave will consider an object, and that each element is a list. As an example of doing an 'upsert' operation with a dynamic selector:
%dw 2.0
output application/java
var temporary=[5]
var maindata={ a:[1,2,3,4] }
var myKey="a"
---
maindata update {
case data at ."$(myKey)"! -> if (data != null) data ++ temporary else temporary
}
You could replace the DataWeave var temporary in the above script with the expression from my example above, and the other DataWeave variables with the corresponding Mule variables (vars.name) or the payload. If you change myKey in the example to the value "b", you will see that key being added.
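Following that suggestion, a hedged sketch of the combined script with the Mule variables substituted in (it assumes previousdata is a single item held in vars.previousdata, which the original snippet does not show):
%dw 2.0
output application/java
import namesOf from dw::core::Objects
// same condition as the temporary set-variable, inlined as a DataWeave var
var temporary = if (namesOf(vars.maindata) contains payload.idCaseNumber) []
                else vars.temporary ++ [vars.previousdata]
---
// upsert the list stored under the dynamic key; the "!" creates the key if it is missing
vars.maindata update {
    case data at ."$(payload.idCaseNumber)"! ->
        if (data != null) data ++ temporary else temporary
}
The result would replace vars.maindata via a Set Variable or Transform Message; vars.temporary itself would still need its own Set Variable if it is used later in the flow.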

Apache Flink Error Handling and Conditional Processing

I am new to Flink and have gone through the site, examples, and blogs to get started. I am struggling with the correct use of operators. Basically I have two questions.
Question 1: Does Flink support declarative exception handling? I need to handle parse/validate/... errors.
Can I use org.apache.flink.runtime.operators.sort.ExceptionHandler or similar to handle errors, or is a Rich/FlatMap function my best option?
If a Rich/FlatMap function is the only option, is there a way to get a handle to the stream inside the Rich/FlatMap function so that sink(s) could be attached for error processing?
Question 2: Can I conditionally attach different sink(s)?
Based on certain field(s) in the keyed split streams I need to select different sink(s); do I split the stream again or use a Rich/FlatMap to handle that?
I am using Flink 1.3.2. Here is the relevant portion of my job
.....
.....
DataStream<String> eventTextStream = env.addSource(messageSource);
KeyedStream<EventPojo, Tuple> eventPojoStream = eventTextStream
// parse, transform or enrich
.flatMap(new MyParseTransformEnrichFunction())
.assignTimestampsAndWatermarks(new EventAscendingTimestampExtractor())
.keyBy("eventId");
// split stream based on eventType as different reduce and windowing functions need to be applied
SplitStream<EventPojo> splitStream = eventPojoStream
.split(new EventStreamSplitFunction());
// need to apply reduce function
DataStream<EventPojo> event1TypeStream = splitStream.select("event1Type");
// need to apply reduce function
DataStream<EventPojo> event2TypeStream = splitStream.select("event2Type");
// need to apply time based windowing function
DataStream<EventPojo> event3TypeStream = splitStream.select("event3Type");
....
....
env.execute("Event Processing");
Am I using the correct operators here?
Update 1:
I tried using the ProcessFunction as suggested by @alpinegizmo, but that didn't work because it depends on a keyed stream, which I don't have until I parse/validate the input. I get "InvalidProgramException: Field expression must be equal to '*' or '_' for non-composite types.".
It's such a common use case where you first parse/validate the input and don't yet have a keyed stream, so how do you solve it?
Thanks for your patience and help.
There's one key building block that you've overlooked. Take a look at side outputs.
This mechanism provides a typesafe way to produce any number of additional output streams. This can be a clean way to report errors, among other uses. In Flink 1.3 side outputs can only be used with ProcessFunction, but 1.4 will add side outputs to ProcessWindowFunction.

How to refer to a payload which is a ResultSet iterator in Mule DataWeave?

The SQL query returns a streamed output as a ResultSet iterator object from the Database component.
I want to convert this to XML in DataWeave, but I don't know how to refer to the incoming object.
If it were a map I could access it simply by using the . operator, like payload.student.
I tried using payload.next() but it gives an error. I also tried the following, but it still won't work:
%var input1 payload as :iterator
Here are the steps:
Drag and drop the Transform Message (DataWeave) component after your configured DB connector. You will see that the input payload for the DataWeave script is filled with the DB result (a List<Map>).
Then you can access the fields using the map function in DataWeave.
dw script
%dw 1.0
%output application/xml
---
{
"Results":{
(payload map {
"key1":$."db_field1",
"key2":$."db_field2"
})
}
}
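To illustrate, with two made-up rows the script above would produce XML along these lines:
<?xml version='1.0' encoding='UTF-8'?>
<Results>
  <key1>row1 field1</key1>
  <key2>row1 field2</key2>
  <key1>row2 field1</key1>
  <key2>row2 field2</key2>
</Results>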
Can you post your code (XML) and a screenshot of the DataWeave transformer or the debugger?
My first guess would be to use a standard transformer to convert that object to a list or map before the DataWeave transformer.

How to concatenate 2 values in mule?

Can someone please let me know how to concatenate multiple values in Mule?
Something like:
#[payload.getPayload()].concat(#[getSubject()])
I assume you are using Mule 3.3.x or above. If so, you can use the Mule Expression Language (MEL).
One example using MEL is:
#['Hello' + 'World']
Or MEL also allows you to use standard Java method invocation:
#[message.payload.concat(' Another String')]
Cheat sheet on MEL
MULE 4 Update
For Mule 4, DataWeave 2.0 is the main expression language.
Simple concatenation:
#['Hello' ++ ' World']
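DataWeave 2.0 also supports string interpolation, which can read better than chaining ++ (the field names below are made up):
#["$(payload.firstName) $(payload.lastName)"]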
Another alternative is to use the Mule design palette:
drop an "Append String" operation as many times as you need.
This operation takes the message payload of the previous step and concatenates a specified string to it.
Not sure about the performance details, but it will surely be easier to maintain.
Append to String - MuleSoft
You can declare a StringBuffer using an Expression component:
<expression-component doc:name="Expression"><![CDATA[StringBuffer sb = new StringBuffer();
flowVars.stBuffer = sb;
]]></expression-component>
Then append to the string buffer anywhere in the flow:
flowVars.stBuffer.append("string to append")
Once done, use #[flowVars.stBuffer] to access the concatenated string.
If you want to concatenate two different values received through the payload in the Mule flow, you can use the String concat() method.
For example, below the values are received in a list of maps and I am joining two different fields, firstname and lastname:
#[payload[0].firstname.concat(' ').concat(payload[0].lastname)]