I'm using batch in Mule for the first time and I'm not sure how to handle exceptions for batch records.
Records are failing in the input phase, but I am not able to catch the failure exception either in the input-phase logger or in the batch step (failure flow) logger. Perhaps the MEL expression #[inputPhaseException] is itself throwing an exception.
<batch:job name="Batch1" max-failed-records="-1">
<batch:threading-profile poolExhaustedAction="WAIT"/>
<batch:input>
<file:inbound-endpoint path="C:\IN" responseTimeout="10000" doc:name="File"/>
<component class="com.General" doc:name="Java"/>
<logger message="InputPhase: #[inputPhaseException]" level="INFO" doc:name="Logger"/>
</batch:input>
<batch:process-records>
<batch:step name="Batch_Step" accept-policy="ALL" ">
<data-mapper:transform config-ref="Pojo_To_CSV" doc:name="Pojo To CSV"/>
<file:outbound-endpoint path="C:\Users\OUT" outputPattern="#[function:dateStamp]_product.csv" responseTimeout="10000" doc:name="File"/>
</batch:step>
<batch:step name="FailureFlow" accept-policy="ONLY_FAILURES">
<logger message="Inside Failure: #[getStepExceptions()], Loading Phase: #[failureExceptionForStep],#[inputPhaseException] " level="ERROR" doc:name="Logger"/>
</batch:step>
</batch:process-records>
<batch:on-complete>
<logger level="INFO" doc:name="Logger" message=" On Complete: #[payload.loadedRecords] Loaded Records #[payload.failedRecords] Failed Records"/>
</batch:on-complete>
</batch:job>
Is there any restriction that certain batch MEL expressions can only be used in the input phase, and others only in the process-records and on-complete phases? When I tried keeping most of the get...Exception() expressions in the failure flow, it threw an error.
Please suggest. Thanks in advance.
Yes, you are right: #[inputPhaseException] is causing all the issues.
I have modified your Mule flow; you can try the following:
<batch:job name="Batch1" max-failed-records="-1">
<batch:threading-profile poolExhaustedAction="WAIT"/>
<batch:input>
<file:inbound-endpoint path="C:\IN" responseTimeout="10000" doc:name="File"/>
<component class="com.General" doc:name="Java"/>
</batch:input>
<batch:process-records>
<batch:step name="Batch_Step" accept-policy="ALL" ">
<data-mapper:transform config-ref="Pojo_To_CSV" doc:name="Pojo To CSV"/>
<file:outbound-endpoint path="C:\Users\OUT" outputPattern="#[function:dateStamp]_product.csv" responseTimeout="10000" doc:name="File"/>
</batch:step>
<batch:step name="Batch_Failed">
<logger doc:name="Logger" level="ERROR" message="Record with the following payload has failed. Payload:: #[message.payload], Loading Phase: #[failureExceptionForStep], Inside Failure the exception is :- #[getStepExceptions()]" />
</batch:step>
</batch:process-records>
<batch:on-complete>
<logger message="Number of failed Records: #[payload.failedRecords] " level="INFO" doc:name="Failed Records" />
<logger level="INFO" doc:name="Logger" message=" Number of loadedRecord: #[payload.loadedRecords]"/>
<logger message="Number of sucessfull Records: #[payload.successfulRecords]" level="INFO" doc:name="Sucessfull Records" />
<logger message="ElapsedTime #[payload.getElapsedTimeInMillis()]" level="INFO" doc:name="Elapsed Time" />
</batch:on-complete>
</batch:job>
Yes, you need to use the on-complete phase, as it acts like a finally block that collects the successful and unsuccessful batch results.
https://dzone.com/articles/handle-errors-your-batch-job%E2%80%A6
I am reading multiple files from different folders and merging them into one, but I am not able to merge them into one file.
I am using a composite source with two file connectors added, and then I log the payload. I am getting the payloads one by one. How can I get a single payload that combines the two (or more) file inputs?
<flow name="file2Flow">
<composite-source doc:name="Copy_of_Composite Source">
<file:inbound-endpoint path="src/main/resources/input1" responseTimeout="10000" doc:name="File"/>
<file:inbound-endpoint path="src/main/resources/input2" responseTimeout="10000" doc:name="File"/>
</composite-source>
<file:file-to-string-transformer doc:name="File to String"/>
<logger message="#[payload]" level="INFO" doc:name="Logger"/>
</flow>
I am also trying this, but I am not getting any output:
<flow name="file2file2Flow">
<http:listener config-ref="HTTP_Listener_Configuration" path="/files" doc:name="HTTP"/>
<scatter-gather doc:name="Scatter-Gather">
<file:outbound-endpoint path="src/main/resources/input1" responseTimeout="10000" doc:name="File"/>
<file:outbound-endpoint path="src/main/resources/input1" responseTimeout="10000" doc:name="File"/>
</scatter-gather>
<dw:transform-message doc:name="Transform Message">
<dw:set-payload><![CDATA[%dw 1.0
%output application/json
---
{
post1: payload[0],
post2: payload[1]
}]]>
</dw:set-payload>
</dw:transform-message>
<logger message="#[payload]" level="INFO" doc:name="Logger"/>
</flow>
file:inbound-endpoint polls one directory, so if you need different directories that won't work on its own.
composite-source allows multiple directories, but the files won't be available in the same payload.
file:outbound-endpoint is for writing files only.
In Mule 3 you can achieve this through a combination of a poll to trigger the flow, a scatter-gather to route to multiple processors, and the Mule Requester module to read files mid-flow.
Mule Requester Module: https://www.mulesoft.com/exchange/68ef9520-24e9-4cf2-b2f5-620025690913/requester-module/
Rough example:
<flow name="dw-testFlow">
<poll doc:name="Poll" frequency="10000">
<logger level="INFO" doc:name="Logger" />
</poll>
<scatter-gather doc:name="Scatter-Gather">
<mulerequester:request config-ref="muleRequesterConfig" resource="myFileEndpoint" doc:name="Mule Requester" />
<mulerequester:request config-ref="muleRequesterConfig" resource="myFileEndpoint" doc:name="Mule Requester" />
</scatter-gather>
</flow>
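The scatter-gather result is a list of the route results, so to get both files in one payload you can follow it with the same Transform Message from your second flow (a sketch; it assumes the two requester resources point at your two input directories):
<dw:transform-message doc:name="Transform Message">
<dw:set-payload><![CDATA[%dw 1.0
%output application/json
---
{
post1: payload[0],
post2: payload[1]
}]]>
</dw:set-payload>
</dw:transform-message>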
I am using Groovy inside a Poll to check for a file's existence at a given location.
My flow works fine when the file is there, but if I delete that file the flow is not triggered. Below is my code:
<flow name="monitor-dst-file-flow">
<poll doc:name="Poll">
<schedulers:cron-scheduler expression="0 0 23 ? * TUE-SAT"/>
<scripting:transformer doc:name="Groovy">
<scripting:script engine="Groovy"><![CDATA[def endpointBuilder = muleContext.endpointFactory.getEndpointBuilder(
"sftp://${user}:${pwForGroovy}#${host}:${port}${inputpath}/?connector=SFTP")
endpointBuilder.addMessageProcessor(new org.mule.routing.MessageFilter(new org.mule.transport.file.filters.FilenameWildcardFilter('test.txt')))
def inboundEndpoint = endpointBuilder.buildInboundEndpoint()
inboundEndpoint.request(30000L)]]>
</scripting:script>
</scripting:transformer>
</poll>
<choice doc:name="Choice">
<when expression="#[message.inboundProperties.originalFilename == 'test.txt']">
<logger level="INFO" doc:name="Logger" message="File Exists..."/>
</when>
<otherwise>
<logger message="FILE EXISTS" level="ERROR" doc:name="Logger"/>
<flow-ref name="email-notification-sub-flow" doc:name="Flow Reference"/>
</otherwise>
</choice>
</flow>
Here, if there is no test.txt file, I am not able to debug the Choice component.
It says: "Polling of monitor-dst-file-flow returned null, the flow will not be invoked."
I have not been able to find a solution; I need to handle the condition where the given file is not there.
You need to return something besides null from the poller's target for the flow to be invoked. I'd recommend doing this in a sub-flow:
<flow name="monitor-dst-file-flow">
<poll doc:name="Poll">
<schedulers:cron-scheduler expression="0 0 23 ? * TUE-SAT"/>
<flow-ref name="pollerProcessor" doc:name="pollerProcessor"/>
</poll>
<choice doc:name="Choice">
<when expression="#[payload == 'file not found']">
<logger level="INFO" doc:name="Logger" message="File Exists..."/>
</when>
<otherwise>
<logger message="FILE EXISTS" level="ERROR" doc:name="Logger"/>
<flow-ref name="email-notification-sub-flow" doc:name="Flow Reference"/>
</otherwise>
</choice>
</flow>
<sub-flow name="pollerProcessor">
<scripting:transformer doc:name="Groovy">
<scripting:script engine="Groovy"><![CDATA[def endpointBuilder = muleContext.endpointFactory.getEndpointBuilder(
"sftp://${user}:${pwForGroovy}#${host}:${port}${inputpath}/?connector=SFTP")
endpointBuilder.addMessageProcessor(new org.mule.routing.MessageFilter(new org.mule.transport.file.filters.FilenameWildcardFilter('test.txt')))
def inboundEndpoint = endpointBuilder.buildInboundEndpoint()
inboundEndpoint.request(30000L)]]>
</scripting:script>
</scripting:transformer>
<set-payload value="#[payload == null ? 'file not found' : payload]" doc:name="Set Payload"/>
</sub-flow>
I am trying to enrich the data to create an XML file.
The first query does a Group By to obtain the transaction header.
The second query gets all records (details) that match the header from the same file, to enrich the message.
The problem is that it takes about a second to run the query that enriches the data. I will need to run this process for 184,764 headers. At one second per header this job will take too long. Is there a way to accomplish the same thing without having to query the database for details? Can all the records be loaded first and obtain the details from memory instead? Here's the code:
<db:generic-config name="Generic_Database_Configuration" url="${db.url}"
driverClassName="${driver.class.name}" doc:name="Generic Database Configuration"/>
<data-mapper:config name="List_Map__To_List_Map_"
transformationGraphPath="list_map__to_list_map_.grf"
doc:name="List_Map__To_List_Map_"/>
<data-mapper:config name="List_Map__To_XML_1"
transformationGraphPath="list_map__to_xml_1.grf"
doc:name="List_Map__To_XML_1"/>
<batch:job name="OrceTransactionImportBatch">
<batch:input>
<db:select config-ref="Generic_Database_Configuration"
doc:name="Database">
<db:parameterized-query><![CDATA[SELECT TRANDATED, STORED, REG#D
AS REG_D, TRAN#D AS TRAN_D, VIP#D AS VIP_D, VIP#D AS VIPNO, SUM(RETAIL*QTY)
AS TOTAL,
CONCAT(SUBSTRING(TRANDATED,1,4),
CONCAT('-',CONCAT(SUBSTRING(TRANDATED,5,2),
CONCAT('-',CONCAT(SUBSTRING(TRANDATED,7,2),'T00:00:00'))))) AS
BusinessDayDate
FROM ORCTEXDTLP
WHERE DGROUPID IN (SELECT HGROUPID FROM ORCTEXHDRP WHERE HPRCFLAG = 'P')
GROUP BY STORED, TRANDATED, REG#D, TRAN#D, VIP#D
FETCH FIRST 60 ROWS ONLY]]></db:parameterized-query>
</db:select>
<logger message="before mapper..." level="INFO" doc:name="before
mapper..."/>
</batch:input>
<batch:process-records>
<batch:step name="Batch_Step">
<data-mapper:transform config-ref="List_Map__To_List_Map_"
doc:name="List<Map> To List<Map>"/>
<logger message="before enricher..." level="INFO"
doc:name="before enricher..."/>
</batch:step>
<batch:step name="Batch_Step1">
<logger message="BEFORE FOR EACH..." level="INFO"
doc:name="Logger"/>
<enricher target="#[variable:LineItem]" doc:name="Message Enricher">
<db:select config-ref="Generic_Database_Configuration"
doc:name="Database">
<db:parameterized-query><![CDATA[SELECT TRANCODED,
CONCAT(SUBSTRING(TRANDATED,1,4),
CONCAT('-',CONCAT(SUBSTRING(TRANDATED,5,2),
CONCAT('-',CONCAT(SUBSTRING(TRANDATED,7,2),'T00:00:00'))))) AS
BusinessDayDate, STORED AS RetailStoreID, TRAN#D AS TransactionNumber, REG#D
AS WorkstationID, RETAIL AS TransactionGrandAmount, VIP#D AS AlternateID,
DISCOUNT, VOUCHER#D AS VOUCHER_D, TRIM(SKU#) AS ItemID, A03K2 AS
UnitCostPrice, RETAIL AS RegularSalesUnitPrice, (RETAIL*QTY) AS
ExtendedAmount, QTY AS Quantity, ROW_NUMBER() OVER () rownumber,
(RETAIL*QTY) AS ActualRetail,
VOUCHERCD AS VoucherCode, VOUCHER#D AS VoucherNumber
FROM FBF02P
LEFT OUTER JOIN KSK2P ON SKUK2 = SKU#
WHERE TRANDATED = #[payload[0]['TRANDATED']] AND STORED = #[payload[0]['STORED']]
AND REG#D = #[payload[0]['REG_D']] AND TRAN#D = #[payload[0]['TRAN_D']]]]></db:parameterized-query>
</db:select>
</enricher>
<expression-component doc:name="Expression"><![CDATA[#[payload[0].LineItem=flowVars.LineItem]]]></expression-component>
<logger message="#[payload[0]['TRAN_D']]" level="INFO"
doc:name="Logger"/>
</batch:step>
<batch:step name="Batch_Step2">
<batch:commit streaming="true" doc:name="Batch Commit">
<data-mapper:transform config-ref="List_Map__To_XML_1"
doc:name="List<Map> To XML"/>
<file:outbound-endpoint path="${output.path}"
outputPattern="TranImport#[server.dateTime.format('yyyyMMdd_HHmmss')].xml"
responseTimeout="10000" doc:name="File"/>
</batch:commit>
</batch:step>
</batch:process-records>
<batch:on-complete>
<logger message="DONE..." level="INFO" doc:name="Logger"/>
</batch:on-complete>
</batch:job>
<flow name="OrceTransactionImportFlow">
<poll doc:name="Poll">
<fixed-frequency-scheduler frequency="1" timeUnit="DAYS"/>
<db:update config-ref="Generic_Database_Configuration"
doc:name="Database">
<db:parameterized-query><![CDATA[UPDATE ORCTEXHDRP
SET HPRCFLAG = 'P'
WHERE HPRCFLAG = '' OR HPRCFLAG = 'P']]></db:parameterized-query>
</db:update>
</poll>
<choice doc:name="Choice">
<when expression="#[payload == 0]">
<logger message="Zero payload..." level="INFO"
doc:name="Logger"/>
</when>
<otherwise>
<batch:execute name="OrceTransactionImportBatch"
doc:name="OrceTransactionImportBatch"/>
</otherwise>
</choice>
</flow>
Inside your database connector configuration you should set up a connection pooling profile.
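For example, on your generic config (a sketch; db:pooling-profile is the Mule 3 element for this, and the pool sizes are placeholder values you would tune for your load):
<db:generic-config name="Generic_Database_Configuration" url="${db.url}"
driverClassName="${driver.class.name}" doc:name="Generic Database Configuration">
<db:pooling-profile minPoolSize="5" maxPoolSize="10"/>
</db:generic-config>
This keeps connections open between the per-record enrichment queries instead of opening a new one each time.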
How do I convert type=java.lang.String to type=java.lang.Iterable? The batch step (process records) is expecting a java.lang.Iterable. Note: the input is an XML file and the Mule flow is a batch job.
When the XML has only one 'Report_Entry' record, the error below is received. For multiple 'Report_Entry' entries the flow works fine.
<object-to-string-transformer doc:name="Object to String"/>
<logger message="#[payload]" level="INFO" doc:name="Logger"/>
<set-payload
value="#[xpath3('/*:Report_Data/*:Report_Entry', payload, 'NODESET')]" doc:name="Set Payload"/>
<logger message="XML Record - #[payload]" level="INFO" doc:name="Logger"/>
</batch:input>
<batch:process-records>
<batch:step name="Batch_Step1">
<json:object-to-json-transformer doc:name="Object to JSON"/>
<logger message="XML Record - #[payload]" level="INFO" doc:name="Logger"/>
<amqp:outbound-endpoint exchangeName="${amqp.exchangeName}" queueName="${amqp.queueName}" responseTimeout="10000" encoding="UTF-8" mimeType="application/xml" connector-ref="AMQP_Connector" doc:name="AMQP"/>
</batch:step>
</batch:process-records>
After the set-payload, the logger prints 'org.mule.api.processor.LoggerMessageProcessor: XML Record - net.sf.saxon.dom.DOMNodeList#57d263b4'. Our requirement is to convert each XML record to JSON and write it to AMQP.
That's because of the splitter. If you just want a collection/iterable before the batch job, just use set-payload:
<set-payload
value="#[xpath3('/*:Report_Data/*:Report_Entry', payload, 'NODESET')]" />
<batch:execute name="test" />
This should work regardless of the number of nodes.
I'm using Mule version 3.5.1 and trying to run batch records. The input phase (file inbound) completes successfully, but the processing phase errors out, even though I only have a DataMapper inside the process-records phase (I have also validated the XSD against the XML, and it looks correct).
<data-mapper:config name="XML_To_CSV" transformationGraphPath="xml_to_csv.grf" doc:name="XML_To_CSV"/>
<batch:job name="businesslogicflowBatch1">
<batch:threading-profile poolExhaustedAction="WAIT"/>
<batch:input>
<file:inbound-endpoint path="C:\Users\Desktop\IN" responseTimeout="10000" doc:name="File"/>
<logger message="*******inputPhase:#[payload]******" level="INFO" doc:name="Logger"/>
</batch:input>
<batch:process-records>
<batch:step name="Batch_Step">
<data-mapper:transform config-ref="XML_To_CSV" doc:name="XML To CSV"/>
</batch:step>
</batch:process-records>
<batch:on-complete>
<logger level="INFO" doc:name="Logger"/>
</batch:on-complete>
</batch:job>
Please find my error:
.............
com.mulesoft.module.batch.engine.DefaultBatchEngine: Input phase completed
ERROR 2014-09-12 14:26:04,219 [[businesslogicflow].connector.file.mule.default.receiver.01] org.mule.exception.DefaultMessagingExceptionStrategy:
Message:Object"org.mule.transport.file.ReceiverFileInputStream" not of correct type. It must be of type "{interface java.lang.Iterable,interface java.util.Iterator,interface org.mule.routing.MessageSequence,interface java.util.Collection}" (java.lang.IllegalArgumentException)
I'm not sure what it is looking for in order to produce a java.lang.Iterable or java.util.Iterator.
Please let me know your suggestions. Thanks in advance.
I think you need to change it from a file stream to an object the DataMapper can use. I don't have a sample configured to test with, but I would start by adding a File to String transformer in front of the DataMapper.
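Something like this in the input phase (just a sketch of the placement, untested):
<batch:input>
<file:inbound-endpoint path="C:\Users\Desktop\IN" responseTimeout="10000" doc:name="File"/>
<file:file-to-string-transformer doc:name="File to String"/>
</batch:input>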
I have resolved it in the following way: since my input is XML, I converted the XML to a JAXB object. Process records expects the records as a collection or list, so a Java component converts them into an ArrayList. Then, as usual, DataMapper (POJO to CSV). Please find the config below:
<mulexml:jaxb-context name="JAXB_Context" packageNames="com.to" doc:name="JAXB Context"/>
<spring:beans>
<spring:bean name="NoFactsBean" class="java.util.ArrayList" />
</spring:beans>
<data-mapper:config name="Pojo_To_CSV" transformationGraphPath="pojo_to_csv.grf" doc:name="Pojo_To_CSV"/>
<batch:job name="businesslogicflowBatch1">
<batch:threading-profile poolExhaustedAction="WAIT"/>
<batch:input>
<file:inbound-endpoint path="C:\Users\Desktop\IN" responseTimeout="10000" doc:name="File"/>
<mulexml:jaxb-xml-to-object-transformer returnClass="com.to.envelop" jaxbContext-ref="JAXB_Context" doc:name="XML to JAXB Object"/>
<component class="com.GenerateList" doc:name="Java"/>
</batch:input>
<batch:process-records>
<batch:step name="Batch_Step" accept-expression="#[getFirstException()]" accept-policy="ALL">
<data-mapper:transform config-ref="Pojo_To_CSV" doc:name="Pojo To CSV"/>
<file:outbound-endpoint path="C:\Users\Desktop\OUT" outputPattern="#[function:dateStamp]_convert.csv" responseTimeout="10000" doc:name="File"/>
</batch:step>
</batch:process-records>
<batch:on-complete>
<logger level="INFO" doc:name="Logger"/>
</batch:on-complete>
</batch:job>