Synchronous and asynchronous strategy - Mule

I want to run one thread at a time in my Mule flow, and I also want to take inputs one by one: only after the first input has completed the full flow should Mule pick up the second input. Which strategy should I use?
If I use the synchronous strategy and there are two or more files in the folder the flow is watching, it picks up all of them at once.
And if I use the asynchronous strategy with one thread at a time, the flow does not complete fully before the next input is picked up.
<flow name="Catalog_command_Execution" doc:name="Catalog_command_Execution" processingStrategy="synchronous">
<file:inbound-endpoint path="${inputCAT.path}" responseTimeout="10000" connector-ref="File" doc:name="Catalog File"/>
<object-to-string-transformer doc:name="File Mapping"/>
<custom-transformer class="com.tcs.sdm.kcm.cmdExecution.CmdCAT" doc:name="CAT cmd Execution"/>
<logger message="******************Entered file #[message.inboundProperties.originalFilename] for command execution has been Processed*********" level="INFO" category="Audit_LogCAT" doc:name="Logger"/>
<catch-exception-strategy doc:name="Catch Exception Strategy">
<logger message="*******************************Entered Catalog file for command execution is having error: #[exception.causeException]****************" level="INFO" category="Audit_LOgCAT" doc:name="Logger"/>
</catch-exception-strategy>
</flow>
<flow name="CatalogueFlow_AB" doc:name="CatalogueFlow_AB" processingStrategy="allowOneThread">
<wmq:inbound-endpoint queue="${wmq.queue.nameCT_AB}" doc:name="WMQ" connector-ref="WMQ"/>
<object-to-string-transformer doc:name="File Mapping"/>
<logger level="INFO" doc:name="CAT Logger" category="Audit_LogCAT" message="******************Entered Catalogue SOAP File with Province Name AB is Processed from queue*********"/>
<custom-transformer class="com.tcs.sdm.kcm.catalog.ServiceController_AB" doc:name="Java"/>
<catch-exception-strategy doc:name="Catch Exception Strategy">
<logger level="INFO" doc:name="CAT Exception Logger" category="Audit_LogCAT" message="*******************************Entered Catalogue SOAP File with Province Name AB is having error: #[exception.causeException]****************"/>
</catch-exception-strategy>
</flow>

For the kind of scenario you describe, processing one file after another, Mule's synchronous processing strategy should serve the purpose.
If you see Mule picking up more than one file at a time, the flow needs to be looked at to understand why that is happening.
Update:
The processing strategy on your WMQ inbound flow is not synchronous. Change it to synchronous and it should work as expected:
<flow name="CatalogueFlow_AB" doc:name="CatalogueFlow_AB" processingStrategy="synchronous">
Hope this helps.

Old thread, but have you tried setting the WMQ consumer count to 1?
A flow can be synchronous, but that doesn't mean the inbound connector will consume messages in a synchronous manner. For the file-based connector you can configure the dispatcher to be non-threaded, and for WMQ you should try setting the number of consumers to 1.
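For reference, a minimal sketch of what that might look like. The WMQ connection attributes below are placeholders and the exact attribute set depends on your connector version, so treat this as an untested illustration rather than a drop-in config:
<!-- Sketch: WMQ connector limited to a single consumer; hostName/port/queueManager are placeholders -->
<wmq:connector name="WMQ"
    hostName="mq.example.com"
    port="1414"
    queueManager="QM1"
    numberOfConsumers="1"
    validateConnections="true"
    doc:name="WMQ"/>

<!-- Sketch: file connector with a non-threaded dispatcher -->
<file:connector name="FileNonThreaded" validateConnections="true" doc:name="File">
    <dispatcher-threading-profile doThreading="false"/>
</file:connector>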

Related

Mule file inbound connector with poll scope

I'm trying to use the Mule file inbound connector inside a poll scope and got an error saying the endpoint couldn't be started. If I remove the poll scope and use the file connector with its default polling, it works fine without any file path changes.
I was wondering why the poll scope gives an error. If a file inbound connector is not allowed to be wrapped in a poll scope, why does Anypoint Studio show the poll scope in the "Wrap in" option?
I found a similar question, but it didn't have a detailed explanation:
Mule won't allow POLL message processor to read file using file Inbound?
Thanks in advance for your responses.
Use mule-module-requester (https://github.com/mulesoft/mule-module-requester) together with the poll scheduler.
Relevant post: http://blogs.mulesoft.com/dev/mule-dev/introducing-the-mule-requester-module/
Another way is:
Set the FTP flow to initialState="stopped" and let the poll scheduler start the flow. After the FTP processing, stop the flow again.
See the sample code:
<ftp:connector name="FTP" pollingFrequency="1000"
validateConnections="true" moveToDirectory="/work/ftp/processed"
doc:name="FTP" />
<flow name="scheduleStartFTPFlow">
<poll doc:name="Poll">
<fixed-frequency-scheduler frequency="1"
timeUnit="MINUTES" />
<expression-component doc:name="START FTP FLOW"><![CDATA[if(app.registry.processFTPFlow.isStopped()){
app.registry.processFTPFlow.start();
}]]></expression-component>
</poll>
<logger message="Poll Logging: #[payload]" level="INFO"
doc:name="Logger" />
</flow>
<flow name="processFTPFlow" initialState="stopped">
<ftp:inbound-endpoint host="localhost" port="21"
path="/data/ftp" user="Sanjeet" password="sanjeet123" responseTimeout="10000"
doc:name="FTP" connector-ref="FTP" />
<logger message="Logging FTP #[payload]" level="INFO" doc:name="Logger" />
<expression-component doc:name="STOP FTP FLOW"><![CDATA[app.registry.processFTPFlow.stop();]]></expression-component>
</flow>
Please provide an SSCCE (short, self-contained, correct example).
Based on your question, you do not need Poll at all. The file connector already has this capability to check for files periodically. Here is an example which polls the directory every 123 milliseconds:
<file:inbound-endpoint path="/tmp" responseTimeout="10000" doc:name="File" pollingFrequency="123"/>
My suggestion is to use the Quartz connector alongside the file connector and set the interval on the Quartz connector, or use the file connector itself with a polling frequency so there is no need to wrap the file endpoint in a poll scope.
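If you go the Quartz route, a rough, untested sketch (the job name, interval, and file path are placeholders) could look like this, using an endpoint-polling-job to request from the file endpoint on a fixed schedule:
<!-- Sketch only: poll file:///tmp/input every 60 seconds via Quartz -->
<quartz:connector name="quartzConnector" validateConnections="true" doc:name="Quartz"/>

<flow name="quartzFilePollFlow">
    <quartz:inbound-endpoint jobName="filePollJob" repeatInterval="60000"
        connector-ref="quartzConnector" doc:name="Quartz">
        <quartz:endpoint-polling-job>
            <quartz:job-endpoint address="file:///tmp/input"/>
        </quartz:endpoint-polling-job>
    </quartz:inbound-endpoint>
    <logger message="Picked up payload: #[payload]" level="INFO" doc:name="Logger"/>
</flow>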
You can create a file endpoint in the global elements section and then use the Mule Requester module to invoke that endpoint inside a poll scope:
<file:connector name="File1" autoDelete="true" streaming="true" validateConnections="true" doc:name="File"/>
<file:endpoint connector-ref="File1" name="File" responseTimeout="10000" doc:name="File" path="/"/>
<flow name="pocforloggingFlow1">
<poll doc:name="Poll">
<mulerequester:request resource="File" doc:name="Mule Requester"/>
</poll>
</flow>

How to acknowledge an ActiveMQ message in Mule using client acknowledge?

Below is my Mule configuration. I want to acknowledge the message using client acknowledge; how can I do it?
<mule>
    <jms:activemq-connector name="Active_MQ" brokerURL="tcp://localhost:61616" validateConnections="true" doc:name="Active MQ" maxRedelivery="2" persistentDelivery="true"/>

    <flow name="activemqFlow">
        <file:inbound-endpoint path="D:\mule\input" responseTimeout="10000" doc:name="File"/>
        <object-to-string-transformer doc:name="Object to String"/>
        <set-property propertyName="fileName" value="#[message.inboundProperties.originalFilename]" doc:name="Property"/>
        <jms:outbound-endpoint queue="logfilequeue" connector-ref="Active_MQ" doc:name="JMS">
            <jms:transaction action="NONE"/>
        </jms:outbound-endpoint>
    </flow>

    <flow name="JmsInboundFlow">
        <jms:inbound-endpoint queue="logfilequeue" connector-ref="Active_MQ" doc:name="JMS">
            <jms:client-ack-transaction action="ALWAYS_BEGIN"/>
        </jms:inbound-endpoint>
        <logger message="#[payload.toString()]" level="INFO" doc:name="Logger"/>
        <file:outbound-endpoint path="D:\mule\output" responseTimeout="10000" doc:name="File" outputPattern="#[message.inboundProperties.fileName]"/>
    </flow>
</mule>
Note: be REALLY sure you want to use CLIENT_ACKNOWLEDGE; it doesn't work the way most people think. It acknowledges the current message AND all previous messages within the session. If you have parallel/threaded consumers, this setting will inadvertently acknowledge messages that aren't ready to be acknowledged yet. ActiveMQ has an INDIVIDUAL_ACKNOWLEDGE mode which acknowledges just the single message.
The JMS 2.0 spec has feature requests to make this additional acknowledge mode a standard.
Try adding acknowledgementMode="CLIENT_ACKNOWLEDGE" to your JMS connector.
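A minimal sketch of the connector from the question with that attribute added (otherwise unchanged):
<!-- Same ActiveMQ connector as above, with client acknowledgement enabled -->
<jms:activemq-connector name="Active_MQ"
    brokerURL="tcp://localhost:61616"
    acknowledgementMode="CLIENT_ACKNOWLEDGE"
    validateConnections="true"
    maxRedelivery="2"
    persistentDelivery="true"
    doc:name="Active MQ"/>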
You can refer to this question for more details:
Mule JMS with CLIENT_ACKNOWLEDGE mode? Message automatically consumed even though I didn't acknowledge it

Mule flow with JMS connector, threads blocking in dynamic outbound endpoint

I have a JMS connector. I receive messages from a queue and process each message in a flow: I call the database to get data based on some IDs in the message and write the response output to files, using dynamic outbound endpoints to decide the output location.
<jms:connector name="tibco" numberOfConsumers="20" ..... >
.....
</jms:connector>
<flow name="realtime" doc:name="ServiceId-8">
<jms:inbound-endpoint queue="${some.queue}" connector-ref="tibco" doc:name="JMS">
<jms:transaction action="ALWAYS_BEGIN"/>
</jms:inbound-endpoint>
<processor ref="proc1"></processor>
<processor ref="proc2"></processor>
<component doc:name="Java">
<spring-object bean="comp1"/>
</component>
<processor ref="proc3"></processor>
<collection-splitter doc:name="Collection Splitter"/>
<processor ref="endpointprocessor"></processor>
<foreach collection="#[message.payload.consumerEndpoints]" counterVariableName="endpoints" doc:name="Foreach">
<choice doc:name="Choice">
<when expression="#[consumerEndpoint.getOutputType().equals('txt') and consumerEndpoint.getChannel().equals('file')]">
<processor-chain>
<file:outbound-endpoint path="#[consumerEndpoint.getPath()]" outputPattern="#[consumerEndpoint.getClientId()]-#[attributes['eventId']]%#[consumerEndpoint.getTicSeedCount()]-#[attributes['dateTime']].tic" responseTimeout="10000" doc:name="File"/>
</processor-chain>
</when>
<when expression="#[consumerEndpoint.getOutputType().equals('txt') and consumerEndpoint.getChannel().equals('ftp')]">
<processor-chain>
<ftp:outbound-endpoint path="#[consumerEndpoint.getPath()]" outputPattern="#[consumerEndpoint.getClientId()]-#[attributes['eventId']]%#[consumerEndpoint.getTicSeedCount()]-#[attributes['dateTime']].tic" host="#[consumerEndpoint.getHost()]" port="#[consumerEndpoint.getPort()]" user="#[consumerEndpoint.getChannelUser()]" password="#[consumerEndpoint.getChannelPass()]" responseTimeout="10000" doc:name="FTP"/>
</processor-chain>
</when>
</choice>
</foreach>
<rollback-exception-strategy doc:name="Rollback Exception Strategy">
<processor ref="catchExceptionCustomHandling"></processor>
</rollback-exception-strategy>
</flow>
The above is not the complete flow; I pasted the important parts for understanding.
Question 1: As I have not defined any threading strategy at any level, and the connector has numberOfConsumers="20", if I drop 20 messages in the queue, how many threads will start?
The prefetch size on the JMS queue is set to 20.
Question 2: Do I need to configure a threading strategy at the receiver end and/or at the flow level?
Sometimes when the load is very high (say 15k messages in the queue in a minute), I see message processing getting slow, and a thread dump shows something like this:
"TIBCO EMS Session Dispatcher (7905958)" prio=10 tid=0x00002aaadd4cf000 nid=0x3714 waiting for monitor entry [0x000000004af1e000]
java.lang.Thread.State: BLOCKED (on object monitor)
at org.mule.endpoint.DynamicOutboundEndpoint.createStaticEndpoint(DynamicOutboundEndpoint.java:153)
- waiting to lock <0x00002aaab711c0e0> (a org.mule.endpoint.DynamicOutboundEndpoint)
Any help and pointers will be appreciated.
Thanks.
Message processing is getting slow because of the dynamic endpoint; I see thread congestion when the dynamic outbound endpoint is created and used. I was using Mule 3.3.x, and after looking at the Mule 3.4.x code I realized that dynamic outbound endpoint creation is handled more appropriately there. I upgraded to 3.4 and the issue is almost gone.

Mule flow execution unexpectedly splits on error in SMTP sendout

I would like to catch errors from the SMTP endpoint (for example when it is misconfigured or the server is down) and, when this happens, prevent messages from proceeding along the normal path and instead route them into an exception flow. An exception handler works and messages are routed into the exception flow. What is unexpected is that the message is duplicated and also proceeds along the "normal" flow. I would expect it to go in only one direction: if the email was sent successfully, go to the normal endpoint; if sendout failed, go to the exception endpoint.
In the example below, SMTP fails with an UnknownHostException and the message goes into failureEndpoint, but for some reason it also ends up in outboundEndpoint:
<mule><!-- namespaces omitted for readability -->
    <flow name="sample-flowFlow1" doc:name="sample-flowFlow1">
        <inbound-endpoint ref="inboundEndpoint" doc:name="AMQP Consumer"/>
        <smtp:outbound-endpoint host="foobaz" to="test@example.com" from="test@example.com" subject="test" responseTimeout="10000" doc:name="SMTP"/>
        <outbound-endpoint ref="outboundEndpoint" doc:name="AMQP Publisher"/>
        <exception-strategy ref="FailureNotification" doc:name="Publish failure notification"/>
    </flow>

    <catch-exception-strategy name="FailureNotification">
        <flow-ref name="FailureNotificationFlow" doc:name="Flow Reference"/>
    </catch-exception-strategy>

    <sub-flow name="FailureNotificationFlow" doc:name="FailureNotificationFlow">
        <outbound-endpoint ref="failureEndpoint" doc:name="Failure Endpoint"/>
    </sub-flow>
</mule>
When a message is published on inboundEndpoint and the SMTP connector is misconfigured the way it is in the example, I would like to see the message exclusively in failureEndpoint, not in both outboundEndpoint and failureEndpoint. How do I accomplish this?
Mule version: 3.4.0
In this flow you are using multiple outbound endpoints. The flow doesn't wait for a response from SMTP and continues with the next outbound endpoint.
A condition can be added to check whether the SMTP sendout was successful before proceeding with the outbound endpoint.
The modified flow looks like this; try it:
<flow name="sample-flowFlow1" doc:name="sample-flowFlow1">
<inbound-endpoint ref="inboundEndpoint" doc:name="AMQP Consumer"/>
<flow-ref name="mailingFlow" ></flow-ref>
<choice>
<when expression="#[flowVars['mailingSuccess'] == 'failure']">
<logger level="INFO" message="Mailing failed"></logger>
</when>
<otherwise>
<outbound-endpoint ref="outboundEndpoint" doc:name="AMQP Publisher"/>
</otherwise>
</choice>
</flow>
<flow name="mailingFlow" processingStrategy="synchronous" >
<smtp:outbound-endpoint host="foobaz" to="test#example.com" from="test#example.com" subject="test" responseTimeout="10000" doc:name="SMTP"/>
<catch-exception-strategy name="FailureNotification">
<set-variable variableName="mailingSuccess" value="failure" ></set-variable>
<flow-ref name="FailureNotificationFlow" doc:name="Flow Reference" />
</catch-exception-strategy>
</flow>
<sub-flow name="FailureNotificationFlow" doc:name="FailureNotificationFlow">
<outbound-endpoint ref="failureEndpoint" doc:name="Failure Endpoint"/>
</sub-flow>
Hope this helps
Even if the flow is synchronous, it will make no difference: the SMTP transport is asynchronous/one-way in Mule. Therefore you cannot get a status from the transport to ascertain whether it was successful and route the flow based on that. If you need to route based on status, you are better off writing an email component and embedding it in the flow. If a MessagingException is thrown by the email component, it will be handled automatically by the error handler flow, and the outbound endpoint will not be executed.
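As a rough illustration of that suggestion (the class name com.example.EmailSenderComponent is hypothetical; the idea is that the component sends the mail itself, e.g. via JavaMail, and throws an exception on failure), the flow from the question could be rewired along these lines:
<flow name="sample-flowFlow1" doc:name="sample-flowFlow1">
    <inbound-endpoint ref="inboundEndpoint" doc:name="AMQP Consumer"/>
    <!-- Hypothetical custom component that sends the email synchronously and
         throws on failure, so the exception strategy fires before the publisher -->
    <component class="com.example.EmailSenderComponent" doc:name="Send email"/>
    <outbound-endpoint ref="outboundEndpoint" doc:name="AMQP Publisher"/>
    <exception-strategy ref="FailureNotification" doc:name="Publish failure notification"/>
</flow>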

How to rectify the issue with the below flow?

I have a Mule flow as shown below:
<flow name="flow1" doc:name="f1">
<file:inbound-endpoint path="C:\input" responseTimeout="10000"
doc:name="File" />
</flow>
<flow name="flow2" doc:name="f2">
<http:inbound-endpoint address="http://localhost:8080"
doc:name="HTTP" exchange-pattern="request-response" />
<flow-ref name="flow1" doc:name="Flow Reference" />
<file:outbound-endpoint path="C:\outputfile"
responseTimeout="10000" doc:name="File" />
</flow>
I am trying to move/upload multiple files from a source to a destination (which could be anything, e.g. an FTP or file outbound endpoint) using this flow.
The reason for doing it this way is that I want to invoke the job from the CLI (command line interface) using curl.
But it is not working.
Edited
I need to pick up some files (multiple files) from a particular folder on my hard drive and then move them to some outbound destination, which can be an FTP site or another hard drive location.
But this flow needs to be invoked from the CLI.
Edited (based on David's answer)
I now have the flow below:
<flow name="filePickupFlow" doc:name="flow1" initialState="stopped">
<file:inbound-endpoint path="C:\Input" responseTimeout="10000" doc:name="File"/>
<logger message="#[message.payloadAs(java.lang.String)]" level="ERROR" />
</flow>
<flow name="flow2" doc:name="flow2">
<http:inbound-endpoint address="http://localhost:8080/file-pickup/start" doc:name="HTTP" exchange-pattern="request-response"/>
<expression-component>
app.registry.filePickupFlow.start();
</expression-component>
<file:outbound-endpoint path="C:\outputfile" responseTimeout="10000" doc:name="File"/>
</flow>
I am getting a couple of problems:
a) I am getting an error that the attribute initialState is not defined as a valid property of flow.
However, if I remove that attribute, the flow runs without waiting for "http://localhost:8080/file-pickup/start" to fire.
b) The files are not moved to the destination folder.
How can I fix this?
You can't reference a flow that has an inbound endpoint in it, because such a flow is already active and consuming events from its inbound endpoint, so you can't invoke it on demand.
The following, tested on Mule 3.3.1, shows how to start a "file pickup flow" on demand from an HTTP request:
<flow name="filePickupFlow" initialState="stopped">
<file:inbound-endpoint path="///tmp/mule/input" />
<!-- Do something with the file: here we just log its content -->
<logger message="#[message.payloadAs(java.lang.String)]" level="ERROR" />
</flow>
<flow name="filePickupStarterFlow">
<http:inbound-endpoint address="http://localhost:8080/file-pickup/start"
exchange-pattern="request-response" />
<expression-component>
app.registry.filePickupFlow.start();
</expression-component>
<set-payload value="File Pickup successfully started" />
</flow>
An HTTP GET on http://localhost:8080/file-pickup/start would then start the filePickupFlow, which in turn processes the files in /tmp/mule/input.
Note that it is up to you to configure the file:connector for the behavior it must have for the files it processes; deleting them or moving them to another directory are the two main options.
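For example, a connector that moves processed files to an archive directory might look roughly like this (the directory name is a placeholder; use autoDelete="true" instead if you simply want the files deleted):
<!-- Sketch: move processed files out of the pickup directory after reading -->
<file:connector name="filePickupConnector"
    moveToDirectory="/tmp/mule/processed"
    streaming="false"
    validateConnections="true"
    doc:name="File"/>
If this is the only file connector in the application, the file:inbound-endpoint above will use it automatically; otherwise add connector-ref="filePickupConnector" to the endpoint.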
I guess in this case a file inbound endpoint reading a file on demand will not be helpful.
Please try the following way:
<flow name="flow1" doc:name="f2">
<http:inbound-endpoint address="http://localhost:8080"
doc:name="HTTP" exchange-pattern="request-response" />
<component>
<spring-object bean="fileLoader"></spring-object>
</component>
<file:outbound-endpoint path="C:\outputfile"
responseTimeout="10000" doc:name="File" />
</flow>
Here the custom component will be a class which reads the file from your specified location.
Hope this helps.
You can use Mule Requester for a clean solution. See the details in the blog entry Introducing the Mule Requester.