I want to read a file from a configured Amazon S3 bucket on a certain event (for example, a message arriving on a JMS queue). The Mule Requester module seems to help in these kinds of situations for the file, FTP, and similar connectors. However, its scope appears to be limited to transport connectors rather than cloud connectors.
Can I use S3 as a resource for the Mule Requester?
<flow name="process_s3_file" doc:name="process_s3_file"
processingStrategy="synchronous">
<mulerequester:request config-ref="" resource="need-to-use-s3-get-object"
doc:name="Mule Requester">
</mulerequester:request>
<logger level="INFO" doc:name="Logger" />
<!-- do something here -->
<s3:delete-object config-ref="Amazon_S3"
bucketName="${s3-read-bucket}" key="#[s3_file_name]" doc:name="Delete File"
accessKey="${s3-access-key}" secretKey="${s3-secret-key}" />
</flow>
Here is the S3 get-object-content call whose result I want to request as the resource:
<s3:get-object-content config-ref="Amazon_S3" bucketName="${s3-read-bucket}"
key="#[s3_file_name]" accessKey="${s3-access-key}"
secretKey="${s3-secret-key}"
doc:name="Read File" />
It turns out you don't need the Mule Requester for the S3 connector at all. Because the S3 connector is operation-based rather than a transport, you can call it anywhere in the flow. The following flow worked for me:
<flow name="process_s3_file" doc:name="process_s3_file"
processingStrategy="synchronous">
<s3:get-object-content config-ref="Amazon_S3" bucketName="${s3-read-bucket}"
key="#[s3_file_name]" accessKey="${s3-access-key}"
secretKey="${s3-secret-key}"
doc:name="Read File" />
<logger level="INFO" doc:name="Logger" />
<!-- do something here -->
<s3:delete-object config-ref="Amazon_S3"
bucketName="${s3-read-bucket}" key="#[s3_file_name]" doc:name="Delete File"
accessKey="${s3-access-key}" secretKey="${s3-secret-key}" /> </flow>
Related
I want to use an HTTP Listener in my flows, read CSV files in Anypoint Studio as input, and convert them into JSON. Please help me.
You can just use a Transform Message component and convert the payload to JSON. In my example I am reading a file called address.csv. In the Transform Message you simply write a DataWeave script that sets the output to application/json and returns the payload, and in the logger you can see the contents of the file converted to JSON.
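For reference, the XML behind that setup looks roughly like this. The flow name, the input folder path, and polling the folder with a file inbound endpoint are assumptions; the DataWeave script is the one-liner described above:

<flow name="csvToJsonFlow">
    <!-- poll the folder that contains address.csv (path is an assumption) -->
    <file:inbound-endpoint path="src/main/resources/in" responseTimeout="10000" doc:name="File" />
    <dw:transform-message doc:name="Transform Message">
        <dw:input-payload mimeType="application/csv" />
        <dw:set-payload><![CDATA[%dw 1.0
%output application/json
---
payload]]></dw:set-payload>
    </dw:transform-message>
    <logger message="File contents as JSON: #[payload]" level="INFO" doc:name="Logger" />
</flow>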
Note:
If you want to pick up a file in the middle of a flow that is triggered by an HTTP Listener, you can always use the Mule Requester module. Here is how the code looks:
<file:connector name="file-connector-config" autoDelete="false" streaming="true" validateConnections="true" doc:name="File" />
<http:listener-config name="HTTP_Listener_Configuration" host="0.0.0.0" port="8081" basePath="/requester" doc:name="HTTP Listener Configuration" />
<flow name="muleRequester">
<http:listener config-ref="HTTP_Listener_Configuration" path="/requester" doc:name="HTTP" />
<logger message="Invoking Mule Requester" level="INFO" doc:name="Logger" />
<mulerequester:request resource="file://src/main/resources/in/ReadME.txt?connector=file-connector-config" doc:name="Retrieve File" returnClass="java.lang.String" />
<logger message="Payload after file requester #[payload]" level="INFO" doc:name="Logger" />
</flow>
For reference, see https://dzone.com/articles/mule-reading-file-in-the-middle-of-a-flow-using-mu
Maybe I'm misunderstanding the question, but if you'd like the HTTP listener to kick off the flow, then to load the file you'll need a Groovy script.
<scripting:component doc:name="Groovy">
    <scripting:script engine="Groovy"><![CDATA[return new File("C:\\test.csv").getText("UTF-8");]]></scripting:script>
</scripting:component>
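For example, wired into an HTTP-triggered flow it could look like this (the listener configuration name, path, and flow name are assumptions):

<flow name="loadCsvOnRequestFlow">
    <http:listener config-ref="HTTP_Listener_Configuration" path="/load-csv" doc:name="HTTP" />
    <scripting:component doc:name="Groovy">
        <scripting:script engine="Groovy"><![CDATA[return new File("C:\\test.csv").getText("UTF-8");]]></scripting:script>
    </scripting:component>
    <logger message="File contents: #[payload]" level="INFO" doc:name="Logger" />
</flow>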
I have a flow with the following steps:
1) Pick a file from source SFTP sever
2) Copy it into local storage
3) Process file using the copy in the local storage
4) Place the processed file (which will be transformed) into a destination SFTP server
5) Move the file on the source SFTP server into a different folder on that server (I could not find a way to do this, so I'm copying from the temp location back into the SFTP processed folder)
This seems to be a standard workflow, yet I could not find any advice on how to implement it in Mule specifically.
My current implementation is described below:
<file:connector name="tempFile" workDirectory="${temp.file.location}/work"
workFileNamePattern="#[message.inboundProperties.originalFilename]"
autoDelete="true" streaming="false" validateConnections="true"
doc:name="File" />
<sftp:connector name="InputSFTP" validateConnections="true" keepFileOnError="true" doc:name="SFTP" >
<reconnect frequency="${reconnectfrequency}" count="5"/>
</sftp:connector>
<sftp:connector name="DestinationSFTP" validateConnections="true" pollingFrequency="30000" doc:name="SFTP">
<reconnect frequency="${reconnectfrequency}" count="5"/>
</sftp:connector>
<smtp:gmail-connector name="Gmail" contentType="text/plain" validateConnections="true" doc:name="Gmail"/>
<flow name="DownloadFTPFileIntoLocalFlow" processingStrategy="synchronous" tracking:enable-default-events="true">
<sftp:inbound-endpoint connector-ref="InputSFTP" host="${source.host}" port="22" path="${source.path}" user="${source.username}"
password="${source.password}" responseTimeout="90000" pollingFrequency="120000" sizeCheckWaitTime="1000" doc:name="InputSFTP" autoDelete="true">
<file:filename-regex-filter pattern="[Z].*\.csv" caseSensitive="false" />
</sftp:inbound-endpoint>
<file:outbound-endpoint path="${temp.file.location}" responseTimeout="10000" doc:name="Templocation" outputPattern="#[message.inboundProperties.originalFilename]" connector-ref="tempFile" />
<exception-strategy ref="Default_Exception_Strategy" doc:name="Reference Exception Strategy"/>
</flow>
<flow name="ProcessCSVFlow" processingStrategy="synchronous" tracking:enable-default-events="true">
<file:inbound-endpoint path="${temp.file.location}" connector-ref="tempFile" pollingFrequency="180000" fileAge="10000" responseTimeout="10000" doc:name="TempFileLocation"/>
<transformer ref="enrichWithHeaderAndEndOfFileTransformer" doc:name="headerAndEOFEnricher" />
<set-variable variableName="outputfilename" value="#['Mercury'+server.dateTime.year+server.dateTime.month+server.dateTime.dayOfMonth+server.dateTime.hours +server.dateTime.minutes+server.dateTime.seconds+'.csv']" doc:name="outputfilename"/>
<sftp:outbound-endpoint exchange-pattern="one-way" connector-ref="DestinationSFTP" host="${destination.host}" port="22" responseTimeout="10000" doc:name="DestinationSFTP"
outputPattern="#[outputfilename]" path="${destination.path}" user="${destination.username}" password="${destination.password}"/>
<gzip-compress-transformer/>
<sftp:outbound-endpoint exchange-pattern="one-way" connector-ref="InputSFTP" host="${source.host}" port="22" responseTimeout="10000" doc:name="SourceArchiveSFTP"
outputPattern="#[outputfilename].gzip" path="Archive" user="${source.username}" password="${source.password}"/>
<set-payload value="Hello world" doc:name="Set Payload"/>
<smtp:outbound-endpoint host="${smtp.host}" port="${smtp.port}" user="${smtp.from.address}" password="${smtp.from.password}"
to="${smtp.to.address}" from="${smtp.from.address}" subject="${mail.success.subject}" responseTimeout="10000"
doc:name="SuccessEmail" connector-ref="Gmail"/>
<logger message="Process completed successfully" level="INFO" doc:name="Logger"/>
<tracking:transaction id="#[server.dateTime]"/>
<exception-strategy ref="Default_Exception_Strategy" doc:name="Reference Exception Strategy"/>
</flow>
<catch-exception-strategy name="Default_Exception_Strategy">
<logger message="Exception has occured Payload is #[payload] and Message is #[message]" level="ERROR" doc:name="Logger"/>
<!-- <smtp:outbound-endpoint host="localhost" responseTimeout="10000" doc:name="Failure Email"/> -->
</catch-exception-strategy>
Have you tried enabling autoDelete="true" on the SFTP connector to force deleting?
Also, would it not be possible to split this into two flows: flow 1: SFTP-in -> transform -> file-out, and flow 2: file-in -> SFTP-out?
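A minimal sketch of that two-flow split, reusing the connector names and properties from your configuration (the flow names are made up, and only the core endpoints are shown):

<flow name="sftpToLocalFileFlow" processingStrategy="synchronous">
    <sftp:inbound-endpoint connector-ref="InputSFTP" host="${source.host}" port="22" path="${source.path}"
        user="${source.username}" password="${source.password}" doc:name="InputSFTP" />
    <transformer ref="enrichWithHeaderAndEndOfFileTransformer" doc:name="Transform" />
    <file:outbound-endpoint path="${temp.file.location}" connector-ref="tempFile"
        outputPattern="#[message.inboundProperties.originalFilename]" doc:name="Temp location" />
</flow>
<flow name="localFileToSftpFlow" processingStrategy="synchronous">
    <file:inbound-endpoint path="${temp.file.location}" connector-ref="tempFile" doc:name="Temp location" />
    <sftp:outbound-endpoint connector-ref="DestinationSFTP" host="${destination.host}" port="22"
        path="${destination.path}" user="${destination.username}" password="${destination.password}"
        doc:name="DestinationSFTP" />
</flow>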
HTH
I have a case where I need to poll a directory on my file system. Each file that is added to that directory needs to be posted to an HTTP endpoint.
The HTTP endpoint is available at "/rest/latest/file". Using Postman, I've verified that the REST call works with the following settings:
POST
basic auth
form-data:
key = file
value = Selected a file from my file system (using a dialog)
My mule flow currently looks like this:
<file:connector name="File" autoDelete="true" streaming="true" validateConnections="true" moveToPattern="#[message.inboundProperties['originalFilename']].backup" moveToDirectory="src/main/resources/output" doc:name="File"/>
<flow name="importdataqualityresultsFlow1" doc:name="importdataqualityresultsFlow1">
<file:inbound-endpoint path="src/main/resources/input" responseTimeout="10000" doc:name="File"/>
<set-payload value="#[['file' :#[message.inboundAttachments['text.txt']]]]" doc:name="Set Payload"/>
<http:outbound-endpoint exchange-pattern="request-response" host="localhost" port="80" path="rest/latest/file" method="POST" user="Admin" password="admin" contentType="application/form-data" doc:name="HTTP"/>
</flow>
I can tell in my application logs that the user logs in using basic auth, after which I get a stack trace.
Any help / pointers would be greatly appreciated.
You need to create a map payload with the form fields:
<flow name="importdataqualityresultsFlow1">
<file:inbound-endpoint path="src/main/resources/input" />
<object-to-string-transformer />
<set-payload value="#[['file': message.payload]]" />
<http:outbound-endpoint exchange-pattern="request-response" host="localhost" port="80" path="rest/latest/file" method="POST" user="Admin" password="admin" contentType="application/form-data" />
</flow>
<flow name="receive-files-from-client">
<file:inbound-endpoint connector-ref="ibFileConnector"
path="/client-data/accounts/client/ToTest">
<file:filename-wildcard-filter pattern="ABC_123*.txt, XYZ_987*.txt" />
<object-to-byte-array-transformer /> <!-- need to convert from an input stream to a byte array to avoid having the wire-tap close it -->
<wire-tap>
<file:outbound-endpoint path="${workingDdir}/Dir1/archive/inbound/#[function:datestamp-yyyy-MM-dd_HH-mm-ss.SSS]" />
</wire-tap>
</file:inbound-endpoint>
....
...
</flow>
I have configured everything properly, but Mule is not picking up the file from that inbound path location.
Use the following modified flow. In your version the file outbound endpoint (inside the wire-tap) sits inside the file inbound endpoint; it needs to go in the flow body instead.
<flow name="receive-files-from-client">
<file:inbound-endpoint connector-ref="inboundFileConnector"
path="/client-ftpdata/ftpaccounts/client/To_CC-Test">
<file:filename-wildcard-filter pattern="CYC53_810*.txt,CYC53_855*.txt,CYC53_856*.txt,CYC53_997*.txt" />
<object-to-byte-array-transformer /> <!-- need to convert from an input stream to a byte array to avoid having the wire-tap close it -->
<wire-tap>
<file:outbound-endpoint path="${global.workdir}/suppliers/S000590/archive/inbound/#[function:datestamp-yyyy-MM-dd_HH-mm-ss.SSS]" />
</wire-tap>
</file:inbound-endpoint>
<wire-tap>
<file:outbound-endpoint path="${global.workdir}/suppliers/S000590/archive/inbound/#[function:datestamp-yyyy-MM-dd_HH-mm-ss.SSS]" />
</wire-tap>
....
...
</flow>
Try executing your flow without the filter and see whether it picks up the files. If it does, adjust your filename wildcard pattern.
Hope this helps.
In a flow like the one below:
<flow name="fileFlow">
<file:inbound-endpoint path="/inbound/ftp/sbc" pollingFrequency="30000" fileAge="30000" moveToDirectory="/inbound/ftp/sbc/archive">
<file:filename-wildcard-filter pattern="*.xml" caseSensitive="false"/>
</file:inbound-endpoint>
<logger message="Entering #[flow.name] flow" level="INFO"/>
<component class="com.abc.RequestFile"/>
<logger message="Payload after transformation is: #[payload] flow" level="INFO"/>
<vm:outbound-endpoint path="merge" />
<logger message="Exiting #[flow.name] flow" level="INFO"/>
</flow>
I get an InputStream from the file:inbound-endpoint, which is passed to the RequestFile component. This component is required to return a list of files, one of which is the original file that was read and passed in. I am looking for a solution other than manually copying the InputStream to a File in the Java component.
As explained in this answer, https://stackoverflow.com/a/12397775/387927, you can get the java.io.File object instead of its content with this configuration:
<file:connector name="fileConnector" streaming="false" autoDelete="false">
<service-overrides messageFactory="org.mule.transport.file.FileMuleMessageFactory" />
</file:connector>
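A minimal usage sketch with the endpoint and component from your flow (polling and filter settings copied from the question; with this connector the component receives a java.io.File payload instead of an InputStream):

<flow name="fileObjectFlow">
    <file:inbound-endpoint connector-ref="fileConnector" path="/inbound/ftp/sbc"
        pollingFrequency="30000" fileAge="30000">
        <file:filename-wildcard-filter pattern="*.xml" caseSensitive="false"/>
    </file:inbound-endpoint>
    <!-- payload here is a java.io.File, not an InputStream -->
    <component class="com.abc.RequestFile"/>
</flow>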
Note that it is up to you to move or delete the file, either before you start or once you've finished processing it; otherwise Mule will poll it again and again.