I have two directories from which I want to pick files and send them to an SFTP server, but I want to send the files from directory1 first; only if that succeeds should I send the files from directory2. How do I achieve this processing sequence?
In the Batch Processing documentation, you'll see an entry for error handling, which states the following (my emphasis):

From time to time, when processing a batch job, a Mule message processor in a batch step may find itself unable to process a record. When this occurs – perhaps because of corrupted or incomplete record data – Mule has three options for handling a record-level error:

1) stop processing the entire batch, skip any remaining batch steps and push all records to the On Complete phase (where, ideally, you have designed a report to notify you of failed records)
2) continue processing the batch regardless of any failed records, using filters to instruct subsequent batch steps how to handle failed records
3) continue processing the batch regardless of any failed records (using filters to instruct subsequent batch steps how to handle failed records), until the batch job accumulates a maximum number of failed records, at which point Mule pushes all records to the On Complete phase (where, ideally, you have designed a report to notify you of failed records)

By default, Mule's batch jobs follow the first error handling option, which halts processing as soon as Mule encounters a single record-level error.
So, to do what you require, create a batch job that processes directory1 first, with max-failed-records set to 0 so that any failure halts processing before directory2 is handled.
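A minimal sketch of that arrangement. All names here are placeholders, not from the question: the directory paths, the SFTP settings, and the `processDirectory2` flow (which would handle the second directory) are assumptions. In the On Complete phase the payload is a `BatchJobResult`, so the failed-record count can gate the second directory:

```xml
<batch:job name="directory1Job" max-failed-records="0">
    <batch:input>
        <!-- Poll directory1 for files to upload -->
        <file:inbound-endpoint path="/data/directory1" responseTimeout="10000"/>
    </batch:input>
    <batch:process-records>
        <batch:step name="sendToSftp">
            <sftp:outbound-endpoint host="${sftp.host}" port="22" path="/upload"
                user="${sftp.user}" password="${sftp.password}"/>
        </batch:step>
    </batch:process-records>
    <batch:on-complete>
        <!-- Only move on to directory2 if no record failed -->
        <choice>
            <when expression="#[payload.failedRecords == 0]">
                <flow-ref name="processDirectory2"/>
            </when>
            <otherwise>
                <logger level="ERROR" message="directory1 upload failed; skipping directory2"/>
            </otherwise>
        </choice>
    </batch:on-complete>
</batch:job>
```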
A cleaner solution is to use the Mule Requester module to read the second set of files and process them later. Download the jar for Studio or standalone Mule ESB from the link and write a config like this:
<mule xmlns:mulerequester="http://www.mulesoft.org/schema/mule/mulerequester"
      xmlns:file="http://www.mulesoft.org/schema/mule/file"
      xmlns="http://www.mulesoft.org/schema/mule/core"
      xmlns:doc="http://www.mulesoft.org/schema/mule/documentation"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
      xsi:schemaLocation="http://www.mulesoft.org/schema/mule/core http://www.mulesoft.org/schema/mule/core/current/mule.xsd
          http://www.mulesoft.org/schema/mule/file http://www.mulesoft.org/schema/mule/file/current/mule-file.xsd
          http://www.mulesoft.org/schema/mule/mulerequester http://www.mulesoft.org/schema/mule/mulerequester/current/mule-mulerequester.xsd">

    <file:connector name="File1" autoDelete="true" streaming="false" validateConnections="true" doc:name="File"/>
    <file:endpoint path="/temp/in2" name="File2" responseTimeout="10000" doc:name="File"/>

    <flow name="main" doc:name="main">
        <file:inbound-endpoint path="/temp/in1" responseTimeout="10000" doc:name="File" connector-ref="File1"/>
        <mulerequester:request resource="File2"/>
        <logger level="ERROR"/>
    </flow>
</mule>
1) Create two flows: Flow1 and Flow2.
2) Configure both flows with a file inbound endpoint, each pointing at a different folder.
3) Since Flow1 needs to execute first and Flow2 only after it, set the initialState property of Flow2 to stopped (so its file inbound endpoint will not pick up files from the directory).
4) Put a flow reference in Flow1 after your processing so that it calls Flow2; this way the two run sequentially.
Example:
<flow name="flow1" doc:name="f1">
    <file:inbound-endpoint path="C:\folder1" responseTimeout="10000" doc:name="File" />
    <!-- Do your business processing -->
    <!-- Start flow2 -->
    <scripting:component doc:name="Script">
        <scripting:script engine="groovy">
            muleContext.registry.lookupFlowConstruct('flow2').start()
        </scripting:script>
    </scripting:component>
    <!-- Now call the second flow using a flow reference -->
    <flow-ref name="flow2" doc:name="Flow Reference"/>
</flow>

<flow name="flow2" doc:name="f2" initialState="stopped">
    <file:inbound-endpoint path="C:\folder2" responseTimeout="10000" doc:name="File" />
    <!-- Do your business processing -->
</flow>
I am loading a request file to FTP (in the flow below, the loading part works perfectly fine).
I am now trying to read the file that results from that request file load back from the FTP server, and I cannot find any solution. Any suggestions?
<flow name="FTP_FLOW">
    <file:inbound-endpoint responseTimeout="10000" doc:name="File"
        path="D:\AnypointStudio\workspace\ftppoc\src\test\resources\in"
        pollingFrequency="10000"
        moveToDirectory="D:\AnypointStudio\workspace\ftppoc\src\test\resources\backup"/>
    <ftp:outbound-endpoint host="hostname" port="21" responseTimeout="10000" doc:name="FTP"
        password="password" path="/path" user="username"/>
</flow>
The FTP connector provided by MuleSoft can write a file from within a flow, or it can poll a directory (as an inbound connector).
When you want to read a file with a given name from within a flow, you can use the open-source connector from here: https://github.com/rbutenuth/ftp-client-connector/
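Alternatively, the Mule Requester module (used elsewhere in this thread) can perform an FTP request mid-flow. A sketch under assumptions: the host, credentials, path, and poll frequency below are placeholders, and the poll is only one possible trigger; adjust to your server and flow:

```xml
<mulerequester:config name="Mule_Requester" doc:name="Mule Requester"/>

<flow name="readResultFlow">
    <poll frequency="10000" doc:name="Poll">
        <!-- Pull the result file from the FTP server -->
        <mulerequester:request config-ref="Mule_Requester"
            resource="ftp://username:password@hostname:21/path"/>
    </poll>
    <object-to-string-transformer doc:name="Object to String"/>
    <logger level="INFO" message="#[payload]"/>
</flow>
```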
I have another scenario where a flow needs to run independently of any scheduled flows. The flow checks for a system property on startup and does a task, after which it stops. However, it fails to start unless I add a poller at the start of the flow. It seems it is prevented from starting because the other flow has an inbound Quartz scheduler.
<flow name="system-prop-flow">
    <choice>
        <when expression="#[System.getProperty('dingbert') != empty]">
            <flow-ref name=""/>
        </when>
        <otherwise>
            <!-- ..log something.. -->
        </otherwise>
    </choice>
</flow>
<!-- This seems to prevent the top flow from running -->
<flow name="scheduledJob">
    <quartz:inbound-endpoint responseTimeout="10000" doc:name="Task"
        cronExpression="0 */2 * * * ?" jobName="mailJob" repeatInterval="0">
        <quartz:event-generator-job/>
    </quartz:inbound-endpoint>
    <component.....
</flow>
Is it possible to have the top flow not run in a quartz scheduler?
I have a quartz-scheduled flow which should only run once an initial flow has completed. The initial flow sets up data which must be present in a file for the quartz-scheduled process to succeed. However, the quartz process starts and the initial process never runs. I only want the initial flow to run once, so I don't want to put it inside the quartz flow.
<!-- Needs to run only once -->
<flow name="InitialJob">
    <component ....
</flow>

<!-- Depends on InitialJob -->
<flow name="ScheduledProcess">
    <quartz:inbound-endpoint responseTimeout="10000" doc:name="Schd"
        cronExpression="0 */5 * * * ?" jobName="doIt" repeatInterval="0">
        <quartz:event-generator-job/>
    </quartz:inbound-endpoint>
    <!-- I don't want to put InitialJob here,
         I only want it to run once -->
    <flow-ref name="PerformJob"/>
</flow>
Is there a way to achieve this? How can I arrange the flows to accomplish my goal?
You can create two flows: one that will be triggered periodically but is disabled on start-up, and one that sets up your data and then activates the periodic flow. Something like:
<!-- Will run periodically once started -->
<flow name="PeriodicJob" initialState="stopped">
    <quartz:inbound-endpoint jobName="PeriodicJob" cronExpression="* * * * * ?" repeatInterval="0" responseTimeout="10000" doc:name="Quartz">
        <quartz:event-generator-job/>
    </quartz:inbound-endpoint>
    <flow-ref name="PerformJob"/>
</flow>

<!-- Will run once on start-up and activate PeriodicJob -->
<flow name="InitialJobRunOnce">
    <quartz:inbound-endpoint jobName="InitialJobRunOnce" repeatInterval="0" repeatCount="0" startDelay="0" responseTimeout="10000" doc:name="Quartz">
        <quartz:event-generator-job/>
    </quartz:inbound-endpoint>
    <expression-component doc:name="Activate periodic job"><![CDATA[app.registry.PeriodicJob.start();]]></expression-component>
</flow>
Your initial flow will run once on start-up, but this "run a flow once" approach has some limits. If your application restarts, the initial flow will run again, though this can be somewhat mitigated by adding some guard logic to your initial flow.
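One possible guard, sketched under assumptions (the marker file name, its location in Mule's working directory, and the `PerformSetup` flow are all hypothetical, not from the answer above): record that setup ran by creating a marker file, and skip the setup step when the marker already exists after a restart.

```xml
<flow name="InitialJobRunOnce">
    <quartz:inbound-endpoint jobName="InitialJobRunOnce" repeatInterval="0" repeatCount="0" startDelay="0" responseTimeout="10000" doc:name="Quartz">
        <quartz:event-generator-job/>
    </quartz:inbound-endpoint>
    <choice>
        <!-- Marker file records that setup already ran on a previous start-up -->
        <when expression="#[new java.io.File(muleContext.configuration.workingDirectory, 'initial-job.done').exists() == false]">
            <flow-ref name="PerformSetup"/>
            <expression-component><![CDATA[
                new java.io.File(muleContext.configuration.workingDirectory, 'initial-job.done').createNewFile();
            ]]></expression-component>
        </when>
        <otherwise>
            <logger level="INFO" message="Setup already done; skipping"/>
        </otherwise>
    </choice>
    <expression-component doc:name="Activate periodic job"><![CDATA[app.registry.PeriodicJob.start();]]></expression-component>
</flow>
```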
In your initial flow, start the quartz flow like this:

<expression-component>
    app.registry.yourFlowName.start();
</expression-component>

Then, after the quartz flow has finished, stop the initial flow with this script:

<expression-component>
    app.registry.yourFlowName.stop();
</expression-component>
Thanks!
I have a simple Mule configuration that takes HTTP query parameters, builds a URL, and downloads a file from that URL. It does nothing much and does not use any File (or FTP/SFTP) protocols. What kind of exception handling mechanism do I need to think about for this?
Here is the snippet of code:
<http:listener-config name="HTTP_Listener_Configuration" host="0.0.0.0" port="8048" doc:name="HTTP Listener Configuration"/>
<http:request-config name="HTTP_Request_Configuration" host="${sync.host}" port="${sync.port}" doc:name="HTTP Request Configuration"/>
<file:connector name="output" doc:name="File"/>

<flow name="syncFlow">
    <http:listener config-ref="HTTP_Listener_Configuration" path="/og" allowedMethods="GET" doc:name="HTTP"/>
    <set-variable variableName="year" value="#[message.inboundProperties.'http.query.params'.year]" doc:name="Variable"/>
    <set-variable variableName="month" value="#[message.inboundProperties.'http.query.params'.month]" doc:name="Variable"/>
    <http:request config-ref="HTTP_Request_Configuration" path="/year/{year}/month/{month}/monthly.csv" method="GET" doc:name="HTTP">
        <http:request-builder>
            <http:uri-param paramName="year" value="#[flowVars.year]"/>
            <http:uri-param paramName="month" value="#[flowVars.month]"/>
        </http:request-builder>
    </http:request>
</flow>
Also, this is the end of the flow and I simply dump the file I receive; any suggestions on elegant ways of handling this part of the code?
What kind of Exception handling mechanism do I need to think about for this?
I would use a choice exception strategy to catch, at a minimum, exceptions related to file permissions, disk space, and an already existing file.
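A sketch of such a choice exception strategy. The `when` conditions and log messages are illustrative assumptions, not exhaustive; map them to the exceptions your file connector actually throws:

```xml
<flow name="syncFlow">
    <!-- ... listener, http:request, file write ... -->
    <choice-exception-strategy doc:name="Choice Exception Strategy">
        <catch-exception-strategy when="#[exception.causedBy(java.io.FileNotFoundException)]">
            <logger level="ERROR" message="Target path missing or not writable: #[exception.message]"/>
        </catch-exception-strategy>
        <catch-exception-strategy when="#[exception.causedBy(java.io.IOException)]">
            <logger level="ERROR" message="I/O failure (permissions, disk space, existing file): #[exception.message]"/>
        </catch-exception-strategy>
        <catch-exception-strategy>
            <logger level="ERROR" message="Unexpected failure: #[exception.message]"/>
        </catch-exception-strategy>
    </choice-exception-strategy>
</flow>
```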
any suggestions on elegant ways of handling this part of the code?
I would probably remove the variable declarations if you're not going to be re-using them and just put the values directly in the request builder. You can also collapse those into one "Message Properties" transformer using the invocation scope.
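For example, the two set-variable elements could be collapsed into a single message-properties-transformer with the invocation scope (equivalent to flow variables). This is a sketch of that suggestion, not the asker's code:

```xml
<message-properties-transformer scope="invocation" doc:name="Message Properties">
    <add-message-property key="year" value="#[message.inboundProperties.'http.query.params'.year]"/>
    <add-message-property key="month" value="#[message.inboundProperties.'http.query.params'.month]"/>
</message-properties-transformer>
```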
What kind of Exception handling mechanism do I need to think about for this?
You can have a Rollback Exception Strategy in case the service you are sending the request to is down or fails to receive your request properly.
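A minimal sketch of a rollback exception strategy with redelivery, assuming the failure surfaces as a `java.net.ConnectException` and that three attempts are acceptable (both are placeholders; adjust to your transport and needs):

```xml
<flow name="syncFlowWithRetry">
    <!-- ... inbound endpoint and http:request ... -->
    <rollback-exception-strategy maxRedeliveryAttempts="3"
            when="#[exception.causedBy(java.net.ConnectException)]">
        <logger level="WARN" message="Remote service down, will retry: #[exception.message]"/>
        <on-redelivery-attempts-exceeded>
            <logger level="ERROR" message="Giving up after 3 attempts"/>
        </on-redelivery-attempts-exceeded>
    </rollback-exception-strategy>
</flow>
```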
I am designing an application flow where I have to read an XML file and put its data onto an IBM MQ queue. Do I need to create an HTTP request that will trigger the file read and update the queue, or how else do I perform this task?
Currently I am creating an HTTP request and connecting it to WMQ, but I am getting NULL data in the queue. Basically the payload is NULL.
This is the data I read when I browse the Queue:
sr.org.mule.transport.NullPayload1.L5U���...xp
Try it like this:
Whenever you use the File connector anywhere other than at the beginning of a flow, it becomes an outbound endpoint (it writes files rather than reading them).
To use File as an inbound endpoint that retrieves a file, you must place it at the beginning of a flow and keep that flow in the stopped state initially:
<flow name="filePickupFlow" initialState="stopped">
    <file:inbound-endpoint path="src/main/resources/input" responseTimeout="10000" doc:name="File"/>
    <wmq:outbound-endpoint queue="gh" doc:name="WMQ" connector-ref="WMQ"/>
</flow>
For your case, just change the path to your required file location.
Then, to trigger it via HTTP, create another flow with an HTTP endpoint and use an Expression component to start the flow containing the file inbound endpoint:
<expression-component doc:name="Expression">
app.registry.filePickupFlow.start();
</expression-component>
Then, if you wish to stop it after it completes processing, you can again use an Expression component:
<expression-component doc:name="Expression">
Thread.sleep(5000);
app.registry.filePickupFlow.stop();
</expression-component>
Thread.sleep() is used here just to give some time gap between the flow start and stop so the flow's processing can complete. You can use some other mechanism to maintain this gap, or set the time according to your own usage.
I guess this is what you were looking for.
Regards,
JJ
If you want to access the file component on demand (the file should be read only when the HTTP endpoint is hit), use the Mule Requester in place of the file component; it does the same work.
<mulerequester:config name="Mule_Requester" doc:name="Mule Requester"/>
<http:listener-config name="HTTP_Listener_Configuration" host="0.0.0.0" port="8081" basePath="test2" doc:name="HTTP Listener Configuration"/>

<flow name="httpFlow">
    <http:listener config-ref="HTTP_Listener_Configuration" path="/" doc:name="HTTP"/>
    <mulerequester:request config-ref="Mule_Requester" resource="file://C:/in" doc:name="Mule Requester"/>
    <object-to-string-transformer doc:name="Object to String"/>
    <logger message="**payload:*#[payload]" level="INFO" doc:name="Use WMQ here instead of Logger"/>
</flow>
Link for reference: https://github.com/mulesoft/mule-module-requester/blob/master/mulerequesterdemo/src/main/app/MuleRequesterDemo.xml
Instead of HTTP, you can also schedule the job trigger using a Poll component, based on your requirement. Hope this helps.
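A sketch of that Poll alternative, reusing the same Mule Requester resource; the 60-second frequency is an arbitrary example, not a recommendation:

```xml
<flow name="pollFlow">
    <poll frequency="60000" doc:name="Poll">
        <!-- Request the file every 60 seconds instead of waiting for an HTTP call -->
        <mulerequester:request config-ref="Mule_Requester" resource="file://C:/in" doc:name="Mule Requester"/>
    </poll>
    <object-to-string-transformer doc:name="Object to String"/>
    <logger message="#[payload]" level="INFO" doc:name="Use WMQ here instead of Logger"/>
</flow>
```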
You want to use the file connector as an inbound endpoint but are actually using it as an outbound endpoint. Check the file connector in your configuration XML.
There are many ways to read a file as inbound, such as the file connector, the Poll scope, or the Quartz connector. You can use any of these as per your requirement. The simplest flow is:
<flow name="testfixedFlow">
    <file:inbound-endpoint path="tmp" connector-ref="File" responseTimeout="10000" doc:name="File"/>
    <wmq:outbound-endpoint queue="xyz" connector-ref="WMQ" doc:name="WMQ"/>
</flow>
But if you want to fetch a resource in the middle of a flow, you can use the Mule Requester.
Hope this helps.