Validate fields in DataMapper (CSV to JSON) - Mule

I am converting from CSV to JSON using Mule DataMapper. I want to check whether a required field is empty and, if it is, log that field and discard the row from further processing.
I know that in the script option we have if (input.data.length > 0).
But how do I discard the whole row if this check fails?

You can do this within Mule DataMapper simply by encapsulating the whole mapping within the if statement's opening and closing braces. Something like this:
if ( input.Quantity > 0 ) {
    output.id = input.id;
    output.Customer = input.Customer;
    output.Quantity = input.Quantity;
    output.Price = input.Price;
}
However, a different and perhaps better approach would be to let DataMapper transform every row into JSON and then split and filter as separate steps in the flow:
<flow name="filterindatamapperFlow2" doc:name="filterindatamapperFlow2">
<file:inbound-endpoint path="/tmp/inbox" doc:name="Inbound file"/>
<data-mapper:transform config-ref="CSV_To_UnfilteredJSON" doc:name="CSV To Unfiltered JSON"/>
<request-reply>
<vm:outbound-endpoint path="splittandprocess" exchange-pattern="one-way"/>
<vm:inbound-endpoint path="result"/>
</request-reply>
<json:object-to-json-transformer doc:name="Object to JSON"/>
<file:outbound-endpoint path="/tmp/outbox" doc:name="Outbound file"/>
</flow>
<flow name="splittandprocess">
<vm:inbound-endpoint path="splittandprocess" exchange-pattern="one-way"/>
<json:json-to-object-transformer returnClass="java.util.List" doc:name="JSON to Object"/>
<splitter expression="#[payload]" doc:name="Splitter"/>
<json:json-to-object-transformer returnClass="java.util.Map" doc:name="JSON to Object"/>
<message-filter doc:name="Filter Out Orders With No Quantity" onUnaccepted="handleFilteredMessages">
<expression-filter expression="#[payload['Quantity'] > 0]" />
</message-filter>
<collection-aggregator failOnTimeout="false" timeout="1000"/>
<vm:outbound-endpoint path="result" exchange-pattern="one-way"/>
</flow>
<flow name="handleFilteredMessages">
<logger message="Payload filtered #[payload]" level="ERROR" doc:name="Logger"/>
</flow>

"Read multiple file from different location simultaneously and merge them into one payload"

I am reading multiple files from different folders and merging them into one, but I am not able to merge them into one file.
I am using a composite source to which I added two file connectors, and then I log the payload with a logger. I am getting the payloads one by one. How can I get one payload that is a combination of the two different payloads, or of multiple file inputs?
<flow name="file2Flow">
<composite-source doc:name="Copy_of_Composite Source">
<file:inbound-endpoint path="src/main/resources/input1" responseTimeout="10000" doc:name="File"/>
<file:inbound-endpoint path="src/main/resources/input2" responseTimeout="10000" doc:name="File"/>
</composite-source>
<file:file-to-string-transformer doc:name="File to String"/>
<logger message="#[payload]" level="INFO" doc:name="Logger"/>
</flow>
I am also trying this, but not getting any output:
<flow name="file2file2Flow">
<http:listener config-ref="HTTP_Listener_Configuration" path="/files" doc:name="HTTP"/>
<scatter-gather doc:name="Scatter-Gather">
<file:outbound-endpoint path="src/main/resources/input1" responseTimeout="10000" doc:name="File"/>
<file:outbound-endpoint path="src/main/resources/input1" responseTimeout="10000" doc:name="File"/>
</scatter-gather>
<dw:transform-message doc:name="Transform Message">
<dw:set-payload><![CDATA[%dw 1.0
%output application/json
---
{
post1: payload[0],
post2: payload[1]
}]]>
</dw:set-payload>
</dw:transform-message>
<logger message="#[payload]" level="INFO" doc:name="Logger"/>
</flow>
file:inbound-endpoint will poll one directory, so if you need different directories that won't work.
composite-source allows it, but the files won't be available in the same payload.
file:outbound-endpoint is for writing files only.
In Mule 3 you can achieve this through a combination of a poll to trigger the flow, a scatter-gather to route to multiple processors, and the Mule Requester module to read files mid-flow.
Mule Requester Module: https://www.mulesoft.com/exchange/68ef9520-24e9-4cf2-b2f5-620025690913/requester-module/
Rough example:
<flow name="dw-testFlow">
<poll doc:name="Poll" frequency="10000">
<logger level="INFO" doc:name="Logger" />
</poll>
<scatter-gather doc:name="Scatter-Gather">
<mulerequester:request config-ref="muleRequesterConfig" resource="myFileEndpoint" doc:name="Mule Requester" />
<mulerequester:request config-ref="muleRequesterConfig" resource="myFileEndpoint" doc:name="Mule Requester" />
</scatter-gather>
</flow>

Until-Successful process for a list of objects in a long-running query not working

I want to develop a flow that allows me to make queries to an external system that can take a long time to return, and I may have to make queries for multiple values in a list. I am using an until-successful scope to solve the problem. Unfortunately, even though the request is run several times, the failed records never get put in the dead letter queue. Here is my attempt at solving the problem:
<!-- Dead Letter Queue for exhausted attempts-->
<vm:endpoint name="DLQ" path="DLQ_VM" doc:name="VM"/>
<flow name="StartFlow" processingStrategy="synchronous">
<!--Place a list of String errors to query for on this vm -->
<vm:inbound-endpoint path="request-processing-queue" "
exchange-pattern="one-way" doc:name="VM"/>
<vm:outbound-endpoint path="reprocessing-queue"
exchange-pattern="request-response" doc:name="VM"/>
<logger level="INFO" message="Data returned is #[payload]"/>
<catch-exception-strategy>
<logger level="ERROR" message="Failure During Processing"/>
</catch-exception-strategy>
</flow>
<flow name="RetryingProcess">
<vm:inbound-endpoint name="reprocessing-vm" exchange-
pattern="request-response"
path="reprocessing-queue" doc:name="VM"/>
<foreach collection="#[payload]" doc:name="For Each">
<vm:outbound-endpoint path="by-singles-vm" exchange-
pattern="request-response"/>
</foreach>
</flow>
<flow name="query-retry">
<vm:inbound-endpoint path="by-singles-vm" exchange-
pattern="request-response" doc:name="VM"/>
<until-successful objectStore-ref="objectStore"
failureExpression="#[groovy:(exception &&
exception in com.trion.CustomException)
||!(payload instanceof
com.trion.QueryResult])]"
maxRetries="5"
millisBetweenRetries="300000"
deadLetterQueue-ref="DLQ_VM" doc:name="Until
Successful">
<vm:outbound-endpoint path="try-again-vm" exchange-
pattern="request-response" doc:name="VM"/>
</until-successful>
</flow>
<flow name="GetQueryValue" >
<vm:inbound-endpoint path="try-again-vm" exchange-
pattern="request-response" doc:name="VM"/>
<flow-ref name="QueryRequest" />
</flow>
<!-- This never happens, i.e. the results are not put here... after retying
-->
<flow name="AttemptsExceededProcessing">
<inbound-endpoint ref="DLQ_VM" doc:name="Generic"/>
<logger level="DEBUG" message="Entering Final Destination Queue with
payload is #[payload]"/>
</flow>
<!-- Here I have a query to the external system... >
<flow name="QueryRequest">
...... Makes the long running query Here..
//returns com.trion.QueryResult
</flow>
</mule>
Please help!
There was no problem with the configuration. I had the millisBetweenRetries value set so high that I wasn't seeing the log messages within the time I waited, and assumed it wasn't working.
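For anyone testing a similar setup, a sketch with a much shorter retry interval makes the dead-letter behaviour visible quickly (the values here are illustrative, for testing only):
<until-successful objectStore-ref="objectStore"
                  maxRetries="3"
                  millisBetweenRetries="5000"
                  deadLetterQueue-ref="DLQ_VM"
                  doc:name="Until Successful">
    <vm:outbound-endpoint path="try-again-vm" exchange-pattern="request-response" doc:name="VM"/>
</until-successful>
With these values an exhausted message reaches the DLQ after roughly 15 seconds, instead of up to 25 minutes with the original 5 retries at 5-minute intervals.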

Mule - Send SFTP outbound message after Collection Split

I am using a collection-splitter to split my List. Now how should I set the payload on the SFTP outbound-endpoint?
<sftp:inbound-endpoint connector-ref="sftp-inbound" host="${SFTP_HOST}" port="${SFTP_PORT}"
path="/files/" user="${SFTP_USER}" password="${SFTP_PASS}"
responseTimeout="10000" pollingFrequency="30000" fileAge="20000" sizeCheckWaitTime="5000"
archiveDir="/files/archive/" doc:name="SFTP" >
<file:filename-regex-filter pattern="Test(.*).zip" caseSensitive="true"/>
</sftp:inbound-endpoint>
<set-variable variableName="regexVal" value="${REGEX}" doc:name="Variable"/>
<set-variable variableName="sourceFileName" value="#[flowVars.originalFilename]" doc:name="Variable"/>
<custom-transformer name="zipTxt" class="com.mst.transform.UnzipTransformer" doc:name="Java" mimeType="image/gif">
<spring:property name="filenamePattern" value="*.csv,*.txt" />
</custom-transformer>
<set-variable variableName="fileContents" value="#[payload]" />
<collection-splitter enableCorrelation="IF_NOT_SET" />
<logger message="#[payload]" level="INFO" doc:name="Logger"/>
<sftp:outbound-endpoint connector-ref="sftp-inbound"
host="${SFTP_HOST}" port="${SFTP_PORT}"
path="/files/" user="${SFTP_USER}" password="${SFTP_PASS}"
responseTimeout="10000" doc:name="SFTP"
exchange-pattern="one-way"/>
</flow>
If your payload before the collection-splitter is a list of objects that the SFTP outbound endpoint can consume, such as InputStreams, then after the splitter you can wrap the logger and SFTP inside a processor-chain. The splitter will send each object one by one to the processor chain, and SFTP should be able to write it if it is an InputStream.
<collection-splitter enableCorrelation="IF_NOT_SET" />
<processor-chain doc:name="Processor Chain">
<logger message="#[payload]" level="INFO" doc:name="Logger"/>
<sftp:outbound-endpoint connector-ref="sftp-inbound"
host="${SFTP_HOST}" port="${SFTP_PORT}"
path="/files/" user="${SFTP_USER}" password="${SFTP_PASS}"
responseTimeout="10000" doc:name="SFTP"
exchange-pattern="one-way"/>
</processor-chain>
You wouldn't need the processor-chain if you just want to put one processor (e.g. SFTP) after the splitter.
If this doesn't work, please add the error details to the question.
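If each split element needs its own file name on the remote side, the SFTP outbound endpoint's outputPattern attribute can build one from flow variables; a sketch, assuming sourceFileName was set as in the question (the pattern itself is illustrative):
<sftp:outbound-endpoint connector-ref="sftp-inbound"
        host="${SFTP_HOST}" port="${SFTP_PORT}"
        path="/files/" user="${SFTP_USER}" password="${SFTP_PASS}"
        outputPattern="#[flowVars.sourceFileName]_#[message.correlationSequence].txt"
        responseTimeout="10000" exchange-pattern="one-way" doc:name="SFTP"/>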

Expected return type java.lang.Iterable

How do I convert type=java.lang.String to type=java.lang.Iterable, given that the batch step (Process Records) expects java.lang.Iterable? Note: the input is an XML file and the Mule flow is a batch job.
When the XML has only one 'Report_Entry' record, the 'Expected return type java.lang.Iterable' error is received; for multiple 'Report_Entry' entries the flow works fine.
<object-to-string-transformer doc:name="Object to String"/>
<logger message="#[payload]" level="INFO" doc:name="Logger"/>
<set-payload
value="#[xpath3('/*:Report_Data/*:Report_Entry', payload, 'NODESET')]" doc:name="Set Payload"/>
<logger message="XML Record - #[payload]" level="INFO" doc:name="Logger"/>
</batch:input>
<batch:process-records>
<batch:step name="Batch_Step1">
<json:object-to-json-transformer doc:name="Object to JSON"/>
<logger message="XML Record - #[payload]" level="INFO" doc:name="Logger"/>
<amqp:outbound-endpoint exchangeName="${amqp.exchangeName}" queueName="${amqp.queueName}" responseTimeout="10000" encoding="UTF-8" mimeType="application/xml" connector-ref="AMQP_Connector" doc:name="AMQP"/>
</batch:step>
</batch:process-records>
After the set-payload, the logger prints 'org.mule.api.processor.LoggerMessageProcessor: XML Record - net.sf.saxon.dom.DOMNodeList#57d263b4'. Our requirement is to convert the XML records to JSON and write them to AMQP.
That's because of the splitter. If you just want a collection/iterable before the batch job, just use set-payload:
<set-payload
value="#[xpath3('/*:Report_Data/*:Report_Entry', payload, 'NODESET')]" />
<batch:execute name="test" />
This should work regardless of the number of nodes.
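Put together, the triggering flow would look roughly like this (the flow name is illustrative; 'test' is the batch job containing the process-records step above):
<flow name="prepareAndRunBatch">
    <set-payload value="#[xpath3('/*:Report_Data/*:Report_Entry', payload, 'NODESET')]" doc:name="Set Payload"/>
    <batch:execute name="test" doc:name="Batch Execute"/>
</flow>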

Flow variable not working correctly for DB select query

I am facing one strange issue. My Mule flow is as follows:
<jdbc-ee:connector name="Database_Global" dataSource-ref="DB_Source" validateConnections="true" queryTimeout="-1" pollingFrequency="0" doc:name="Database">
<jdbc-ee:query key="InsertQuery" value="INSERT INTO getData(ID,NAME,AGE,DESIGNATION)VALUES(#[flowVars['id']],#[flowVars['name']],#[flowVars['age']],#[flowVars['designation']])"/>
<jdbc-ee:query key="RetriveQuery" value="Select * from getData where ID=#[flowVars['id']] "/>
</jdbc-ee:connector>
<flow name="MuleDbInsertFlow1" doc:name="MuleDbInsertFlow1">
<http:inbound-endpoint exchange-pattern="request-response" host="localhost" port="8082" path="mainData" doc:name="HTTP"/>
<cxf:jaxws-service service="MainData" serviceClass="com.vertu.services.schema.maindata.v1.MainData" doc:name="SOAPWithHeader" />
<component class="com.vertu.services.schema.maindata.v1.Impl.MainDataImpl" doc:name="JavaMain_ServiceImpl"/>
<mulexml:object-to-xml-transformer doc:name="Object to XML"/>
<choice doc:name="Choice">
<when expression="#[message.inboundProperties['SOAPAction'] contains 'retrieveDataOperation']">
<processor-chain doc:name="Processor Chain">
<set-variable variableName="id" value="#[xpath('//id').text]" doc:name="Variable"/>
<logger message="ID from req #[flowVars['id']]" level="INFO" doc:name="Logger"/>
<jdbc-ee:outbound-endpoint exchange-pattern="request-response" queryKey="RetriveQuery" queryTimeout="-1" connector-ref="Database_Global" doc:name="Database (JDBC)"/>
<choice doc:name="Choice">
<when expression="#[message.payload.isEmpty()]">
<processor-chain>
<!-- Data not exists .. We cannot display -->
<logger message="No records found in Database !!!" level="INFO" doc:name="Logger"/>
</processor-chain>
</when>
<otherwise>
<processor-chain>
<!-- Data exists .. We cannotdisplay -->
<logger message="The Data retrieved from the Database" level="INFO" doc:name="Logger"/>
</processor-chain>
</otherwise>
</choice>
Now, the issue is that whenever I use RetriveQuery (Select * from getData where ID=#[flowVars['id']]),
it goes to the choice block where the logger shows 'No records found in Database !!!'. But as you can see, I placed a logger before the SQL query is called by the DB outbound endpoint:
<logger message="ID from req #[flowVars['id']]" level="INFO" doc:name="Logger"/>
<jdbc-ee:outbound-endpoint exchange-pattern="request-response" queryKey="RetriveQuery" queryTimeout="-1" connector-ref="Database_Global" doc:name="Database (JDBC)"/>
which prints #[flowVars['id']], and I am successfully getting the value.
But I don't know why it is going into the <when expression="#[message.payload.isEmpty()]"> block.
If I instead use Select * from getData where ID=22 in RetriveQuery, then it successfully gets the ID value in the query.
Please let me know why it is not getting the value in the SQL query when I use a flowVar. It executes successfully for insert and update queries, but not for select.
Please note: the value of ID in 'Select * from getData where ID' is an integer.
This is strange.
Try cleaning and re-building the project: the version that's running may not be using the latest config.
Yes, this was strange, and I found that cleaning and rebuilding the project in Studio, as David suggested, worked. Maybe there was an issue with picking up the latest config and reflecting it in the project.
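As an aside, on later Mule 3 versions the deprecated jdbc-ee transport can be replaced with the db connector, whose parameterized queries accept MEL expressions directly; a minimal sketch, assuming a db:generic-config named Database_Config (the config name is illustrative):
<db:select config-ref="Database_Config" doc:name="Database">
    <db:parameterized-query><![CDATA[SELECT * FROM getData WHERE ID = #[flowVars.id]]]></db:parameterized-query>
</db:select>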