I am reading a zip file using the File inbound connector in Mule. The file should be auto-deleted, since autoDelete is true, but it isn't.
The flow I have is:
<file:connector name="File" writeToDirectory="D:\FileProcessed\ringmoved\" readFromDirectory="D:\FileProcessed\" autoDelete="true" streaming="true" validateConnections="true" doc:name="File"/>
<flow name="filFlow">
<file:inbound-endpoint path="D:\FileProcessed\" moveToDirectory="D:\FileProcessed\moved\" connector-ref="File" responseTimeout="10000" doc:name="File"/>
<logger message="hi" level="INFO" doc:name="Logger"/>
</flow>
It's because you are not consuming the file. With streaming="true", the file is only deleted once the stream has actually been read. Try adding a transformer such as
<object-to-string-transformer />
after the file endpoint.
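A minimal sketch of the corrected flow, assuming the same connector and directories as above:
<flow name="filFlow">
<file:inbound-endpoint path="D:\FileProcessed\" moveToDirectory="D:\FileProcessed\moved\" connector-ref="File" responseTimeout="10000" doc:name="File"/>
<!-- Consuming the stream lets the connector delete (or move) the original file -->
<object-to-string-transformer doc:name="Object to String"/>
<logger message="hi" level="INFO" doc:name="Logger"/>
</flow>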
I want to use an HTTP Listener in my flows, import CSV files in Anypoint Studio as input, and convert them into JSON. Please help me.
You can just use a Transform Message component and convert the payload to JSON. For example, when reading a file called address.csv, the Transform Message only needs to change the output type to application/json, and the logger will then show the contents of the file converted to JSON.
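A minimal sketch of such a flow, assuming address.csv sits in src/main/resources (the flow name is hypothetical):
<flow name="csvToJsonFlow">
<file:inbound-endpoint path="src/main/resources" responseTimeout="10000" doc:name="File">
<file:filename-regex-filter pattern="address\.csv" caseSensitive="false"/>
</file:inbound-endpoint>
<dw:transform-message doc:name="Transform Message">
<!-- Tell DataWeave the incoming payload is CSV so it can be parsed -->
<dw:input-payload mimeType="application/csv"/>
<dw:set-payload><![CDATA[%dw 1.0
%output application/json
---
payload]]></dw:set-payload>
</dw:transform-message>
<logger message="#[payload]" level="INFO" doc:name="Logger"/>
</flow>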
Note:
If you want to pick up a file in the middle of a flow triggered by an HTTP listener, you can always use the Mule Requester module. Here is how the code will look:
<file:connector name="file-connector-config" autoDelete="false" streaming="true" validateConnections="true" doc:name="File" />
<http:listener-config name="HTTP_Listener_Configuration" host="0.0.0.0" port="8081" basePath="/requester" doc:name="HTTP Listener Configuration" />
<flow name="muleRequester">
<http:listener config-ref="HTTP_Listener_Configuration" path="/requester" doc:name="HTTP" />
<logger message="Invoking Mule Requester" level="INFO" doc:name="Logger" />
<mulerequester:request resource="file://src/main/resources/in/ReadME.txt?connector=file-connector-config" doc:name="Retrieve File" returnClass="java.lang.String" />
<logger message="Payload after file requester #[payload]" level="INFO" doc:name="Logger" />
</flow>
Reference: https://dzone.com/articles/mule-reading-file-in-the-middle-of-a-flow-using-mu
Maybe I'm misunderstanding the question, but if you'd like the HTTP listener to kick off the flow and then load the file, you'll need a Groovy script.
<scripting:component doc:name="Groovy">
<scripting:script engine="Groovy"><![CDATA[// Escape backslashes, otherwise Groovy turns \t into a tab character
return new File("C:\\test.csv").getText("UTF-8");]]></scripting:script>
</scripting:component>
I am reading multiple files from different folders and merging them into one, but I am not able to merge them into one file.
I am using a composite source with two file endpoints, and then logging the payload. I am getting the payloads one by one. How can I get a single payload that combines the two different payloads (or multiple file inputs)?
<flow name="file2Flow">
<composite-source doc:name="Copy_of_Composite Source">
<file:inbound-endpoint path="src/main/resources/input1" responseTimeout="10000" doc:name="File"/>
<file:inbound-endpoint path="src/main/resources/input2" responseTimeout="10000" doc:name="File"/>
</composite-source>
<file:file-to-string-transformer doc:name="File to String"/>
<logger message="#[payload]" level="INFO" doc:name="Logger"/>
</flow>
I am also trying this, but I am not getting any output:
<flow name="file2file2Flow">
<http:listener config-ref="HTTP_Listener_Configuration" path="/files" doc:name="HTTP"/>
<scatter-gather doc:name="Scatter-Gather">
<file:outbound-endpoint path="src/main/resources/input1" responseTimeout="10000" doc:name="File"/>
<file:outbound-endpoint path="src/main/resources/input1" responseTimeout="10000" doc:name="File"/>
</scatter-gather>
<dw:transform-message doc:name="Transform Message">
<dw:set-payload><![CDATA[%dw 1.0
%output application/json
---
{
post1: payload[0],
post2: payload[1]
}]]>
</dw:set-payload>
</dw:transform-message>
<logger message="#[payload]" level="INFO" doc:name="Logger"/>
</flow>
file:inbound-endpoint will poll one directory, so if you need different directories that won't work.
composite-source allows it, but the files won't be available in the same payload.
file:outbound-endpoint is for writing files only.
In Mule 3, you can achieve this through a combination of a poll to trigger the flow, a scatter-gather to route to multiple processors, and the Mule Requester module to read files mid-flow.
Mule Requester Module: https://www.mulesoft.com/exchange/68ef9520-24e9-4cf2-b2f5-620025690913/requester-module/
Rough example:
<flow name="dw-testFlow">
<poll doc:name="Poll" frequency="10000">
<logger level="INFO" doc:name="Logger" />
</poll>
<scatter-gather doc:name="Scatter-Gather">
<mulerequester:request config-ref="muleRequesterConfig" resource="myFileEndpoint" doc:name="Mule Requester" />
<mulerequester:request config-ref="muleRequesterConfig" resource="myFileEndpoint" doc:name="Mule Requester" />
</scatter-gather>
</flow>
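The scatter-gather output can then be merged with DataWeave, just like in your second flow. A sketch under the assumption that the requester reads from your two input folders (the config and flow names here are hypothetical):
<mulerequester:config name="muleRequesterConfig" doc:name="Mule Requester"/>
<flow name="mergeFilesFlow">
<poll doc:name="Poll" frequency="10000">
<logger level="INFO" doc:name="Logger"/>
</poll>
<scatter-gather doc:name="Scatter-Gather">
<mulerequester:request config-ref="muleRequesterConfig" resource="file://src/main/resources/input1" returnClass="java.lang.String" doc:name="Read input1"/>
<mulerequester:request config-ref="muleRequesterConfig" resource="file://src/main/resources/input2" returnClass="java.lang.String" doc:name="Read input2"/>
</scatter-gather>
<dw:transform-message doc:name="Transform Message">
<dw:set-payload><![CDATA[%dw 1.0
%output application/json
---
{
file1: payload[0],
file2: payload[1]
}]]></dw:set-payload>
</dw:transform-message>
<logger message="#[payload]" level="INFO" doc:name="Logger"/>
</flow>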
I have a flow with the following steps:
1) Pick a file from source SFTP sever
2) Copy it into local storage
3) Process file using the copy in the local storage
4) Place the processed file (which will be transformed) into a destination SFTP server
5) Move file present in the source SFTP into a different folder on the source SFTP server (I could not find a way to do this and hence I'm copying from the temp location back into the SFTP processed folder)
This seems to be a standard workflow; however, I could not find any advice on how to implement it specifically in Mule.
My current implementation is described below:
<file:connector name="tempFile" workDirectory="${temp.file.location}/work"
workFileNamePattern="#[message.inboundProperties.originalFilename]"
autoDelete="true" streaming="false" validateConnections="true"
doc:name="File" />
<sftp:connector name="InputSFTP" validateConnections="true" keepFileOnError="true" doc:name="SFTP" >
<reconnect frequency="${reconnectfrequency}" count="5"/>
</sftp:connector>
<sftp:connector name="DestinationSFTP" validateConnections="true" pollingFrequency="30000" doc:name="SFTP">
<reconnect frequency="${reconnectfrequency}" count="5"/>
</sftp:connector>
<smtp:gmail-connector name="Gmail" contentType="text/plain" validateConnections="true" doc:name="Gmail"/>
<flow name="DownloadFTPFileIntoLocalFlow" processingStrategy="synchronous" tracking:enable-default-events="true">
<sftp:inbound-endpoint connector-ref="InputSFTP" host="${source.host}" port="22" path="${source.path}" user="${source.username}"
password="${source.password}" responseTimeout="90000" pollingFrequency="120000" sizeCheckWaitTime="1000" doc:name="InputSFTP" autoDelete="true">
<file:filename-regex-filter pattern="[Z].*\.csv" caseSensitive="false" />
</sftp:inbound-endpoint>
<file:outbound-endpoint path="${temp.file.location}" responseTimeout="10000" doc:name="Templocation" outputPattern="#[message.inboundProperties.originalFilename]" connector-ref="tempFile" />
<exception-strategy ref="Default_Exception_Strategy" doc:name="Reference Exception Strategy"/>
</flow>
<flow name="ProcessCSVFlow" processingStrategy="synchronous" tracking:enable-default-events="true">
<file:inbound-endpoint path="${temp.file.location}" connector-ref="tempFile" pollingFrequency="180000" fileAge="10000" responseTimeout="10000" doc:name="TempFileLocation"/>
<transformer ref="enrichWithHeaderAndEndOfFileTransformer" doc:name="headerAndEOFEnricher" />
<set-variable variableName="outputfilename" value="#['Mercury'+server.dateTime.year+server.dateTime.month+server.dateTime.dayOfMonth+server.dateTime.hours +server.dateTime.minutes+server.dateTime.seconds+'.csv']" doc:name="outputfilename"/>
<sftp:outbound-endpoint exchange-pattern="one-way" connector-ref="DestinationSFTP" host="${destination.host}" port="22" responseTimeout="10000" doc:name="DestinationSFTP"
outputPattern="#[outputfilename]" path="${destination.path}" user="${destination.username}" password="${destination.password}"/>
<gzip-compress-transformer/>
<sftp:outbound-endpoint exchange-pattern="one-way" connector-ref="InputSFTP" host="${source.host}" port="22" responseTimeout="10000" doc:name="SourceArchiveSFTP"
outputPattern="#[outputfilename].gzip" path="Archive" user="${source.username}" password="${source.password}"/>
<set-payload value="Hello world" doc:name="Set Payload"/>
<smtp:outbound-endpoint host="${smtp.host}" port="${smtp.port}" user="${smtp.from.address}" password="${smtp.from.password}"
to="${smtp.to.address}" from="${smtp.from.address}" subject="${mail.success.subject}" responseTimeout="10000"
doc:name="SuccessEmail" connector-ref="Gmail"/>
<logger message="Process completed successfully" level="INFO" doc:name="Logger"/>
<tracking:transaction id="#[server.dateTime]"/>
<exception-strategy ref="Default_Exception_Strategy" doc:name="Reference Exception Strategy"/>
</flow>
<catch-exception-strategy name="Default_Exception_Strategy">
<logger message="Exception has occured Payload is #[payload] and Message is #[message]" level="ERROR" doc:name="Logger"/>
<!-- <smtp:outbound-endpoint host="localhost" responseTimeout="10000" doc:name="Failure Email"/> -->
</catch-exception-strategy>
Have you tried enabling autoDelete="true" on the SFTP connector to force deletion?
Also, is it not possible to split this into flow1: SFTP-in -> transform -> file-out, and flow2: file-in -> SFTP-out?
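A minimal sketch of that two-flow split, reusing the connectors and properties defined above (the flow names are hypothetical):
<flow name="sftpToLocalFlow" processingStrategy="synchronous">
<sftp:inbound-endpoint connector-ref="InputSFTP" host="${source.host}" port="22" path="${source.path}" user="${source.username}" password="${source.password}" autoDelete="true" responseTimeout="90000" doc:name="InputSFTP"/>
<transformer ref="enrichWithHeaderAndEndOfFileTransformer" doc:name="headerAndEOFEnricher"/>
<file:outbound-endpoint path="${temp.file.location}" outputPattern="#[message.inboundProperties.originalFilename]" connector-ref="tempFile" doc:name="Templocation"/>
</flow>
<flow name="localToSftpFlow" processingStrategy="synchronous">
<file:inbound-endpoint path="${temp.file.location}" connector-ref="tempFile" responseTimeout="10000" doc:name="TempFileLocation"/>
<sftp:outbound-endpoint exchange-pattern="one-way" connector-ref="DestinationSFTP" host="${destination.host}" port="22" path="${destination.path}" user="${destination.username}" password="${destination.password}" outputPattern="#[message.inboundProperties.originalFilename]" doc:name="DestinationSFTP"/>
</flow>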
HTH
Below is my Mule flow. I want to move the corresponding file from the JDBC query result set.
----------------------------------------------------
<foreach doc:name="Foreach" counterVariableName="#[message.payload.size()]">
<logger message="#[payload.filepath] - #[payload.name] - #[payload.filename]" level="INFO" doc:name="Logger" />
</foreach>
---------------------------------------------------------------------
<jdbc-ee:postgresql-data-source name="PostgreSQL_Data_Source"
user="postgres" password="postgres" url="jdbc:postgresql://localhost:5432/postgres"
transactionIsolation="UNSPECIFIED" doc:name="PostgreSQL Data Source">
</jdbc-ee:postgresql-data-source>
<jdbc-ee:connector name="JDBCConnector"
dataSource-ref="PostgreSQL_Data_Source" validateConnections="true"
doc:name="JDBCConnector">
<jdbc-ee:query key="emprec" value="select * from emp where salary > 50000">
</jdbc-ee:query>
</jdbc-ee:connector>
<flow name="empflow" >
<quartz:inbound-endpoint responseTimeout="10000"
doc:name="Quartz" jobName="CronJobSchedule" repeatInterval="0"
cronExpression="0 0/1 * ? * MON-FRI" repeatCount="1">
<quartz:event-generator-job>
<quartz:payload>quartzSchedular started</quartz:payload>
</quartz:event-generator-job>
</quartz:inbound-endpoint>
<jdbc-ee:outbound-endpoint queryKey="emprec"
queryTimeout="-1" connector-ref="JDBCConnector" exchange-pattern="request-response"
doc:name="Database" />
<logger message="Size of payload is ::: #[message.payload.size()]" level="INFO" doc:name="Logger"/>
<foreach doc:name="Foreach" counterVariableName="#[message.payload.size()]">
<logger message="#[payload.filepath] - #[payload.name] - #[payload.filename]" level="INFO" doc:name="Logger" />
</foreach>
</flow>
Please suggest a way to move the file whose name comes back in the query result to another location.
Inside the foreach loop I tried file inbound and outbound endpoints, but that did not work.
1 - You need to load the file; for that I suggest using the Mule Requester module. You can find more on it in this blog post.
2 - Right after that, you can move it using a file outbound endpoint.
Here's an example:
<mulerequester:request config-ref="Mule_Requester" resource="file:///Users/anafelisatti/test.txt" returnClass="java.lang.String" doc:name="Mule Requester"/>
<file:outbound-endpoint responseTimeout="10000" doc:name="File" outputPattern="test.txt" path="/Users/anafelisatti/Documents"/>
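Since the file name comes from your query result, the requester's resource URI can presumably be built dynamically inside the foreach. A sketch, assuming each row exposes the filepath and filename columns shown in your logger:
<foreach doc:name="Foreach">
<!-- Remember the row values before the requester overwrites the payload -->
<set-variable variableName="sourcePath" value="#[payload.filepath]" doc:name="Source path"/>
<set-variable variableName="targetName" value="#[payload.filename]" doc:name="Target name"/>
<mulerequester:request config-ref="Mule_Requester" resource="file://#[flowVars.sourcePath]" returnClass="java.lang.String" doc:name="Mule Requester"/>
<file:outbound-endpoint responseTimeout="10000" doc:name="File" outputPattern="#[flowVars.targetName]" path="/Users/anafelisatti/Documents"/>
</foreach>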
Hope that helps.
I have a case where I need to poll a directory on my file system. Each file that is added to that directory needs to be posted to an HTTP endpoint.
The HTTP endpoint is available on "/rest/latest/file". Using postman, I've verified that the REST call works with the following settings:
POST
basic auth
form-data:
key = file
value = Selected a file from my file system (using a dialog)
My mule flow currently looks like this:
<file:connector name="File" autoDelete="true" streaming="true" validateConnections="true" moveToPattern="#[message.inboundProperties['originalFilename']].backup" moveToDirectory="src/main/resources/output" doc:name="File"/>
<flow name="importdataqualityresultsFlow1" doc:name="importdataqualityresultsFlow1">
<file:inbound-endpoint path="src/main/resources/input" responseTimeout="10000" doc:name="File"/>
<set-payload value="#[['file' :#[message.inboundAttachments['text.txt']]]]" doc:name="Set Payload"/>
<http:outbound-endpoint exchange-pattern="request-response" host="localhost" port="80" path="rest/latest/file" method="POST" user="Admin" password="admin" contentType="application/form-data" doc:name="HTTP"/>
</flow>
I can tell in my application logs that the user logs in using basic auth, after which I get a stack trace.
Any help / pointers would be greatly appreciated.
You need to create a map payload with the form fields:
<flow name="importdataqualityresultsFlow1">
<file:inbound-endpoint path="src/main/resources/input" />
<object-to-string-transformer />
<set-payload value="#[['file': message.payload]]" />
<http:outbound-endpoint exchange-pattern="request-response" host="localhost" port="80" path="rest/latest/file" method="POST" user="Admin" password="admin" contentType="application/form-data" />
</flow>
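The object-to-string-transformer consumes the file stream first, so the value posted under the 'file' form field is a plain String rather than an open stream.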