Mule: upload multiple files of various types into an Amazon S3 bucket - amazon-s3

<flow name="flow1">
<file:inbound-endpoint path="C:\temp" moveToPattern="abc.txt" responseTimeout="10000" doc:name="File"/>
<s3:create-object config-ref="Amazon_S3" bucketName="mulebucket" key="img" doc:name="Amazon S3"/>
<logger message="s3 upload done...:" level="INFO" />
</flow>
I want to upload multiple files into my S3 bucket, but the above code uploads only one file.
Any suggestions are welcome.

The file inbound-endpoint will keep picking up files from the source directory and creating them in S3. I think the problem is that your S3 object key is static, so it keeps overwriting the same object. You can change the key to be more dynamic by using the filename of the picked-up file, something like so:
<s3:create-object config-ref="Amazon_S3" bucketName="mulebucket" key="#[message.inboundProperties.originalFilename]" doc:name="Amazon S3"/>
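Putting it together, a version of your flow with a dynamic key would look roughly like this (paths and bucket name taken from your snippet; the moveToPattern attribute is left out for brevity):
<flow name="flow1">
    <file:inbound-endpoint path="C:\temp" responseTimeout="10000" doc:name="File"/>
    <!-- the key is derived from the original filename, so each picked-up file becomes its own S3 object -->
    <s3:create-object config-ref="Amazon_S3" bucketName="mulebucket" key="#[message.inboundProperties.originalFilename]" doc:name="Amazon S3"/>
    <logger message="s3 upload done...:" level="INFO"/>
</flow>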

Related

Mule file inbound connector with poll scope

I'm trying to use the Mule inbound file connector within a poll scope and got an error saying the endpoint couldn't be started. If I remove the poll scope and use the file connector with its default polling, it works fine without any file path changes.
I was wondering why the poll scope is giving an error. If the file inbound connector is not allowed to be wrapped in a poll scope, why does Anypoint Studio show the poll scope in the 'Wrap in' option?
I found a similar question, but I didn't see detailed explanations.
Mule won't allow POLL message processor to read file using file Inbound?
Thanks in advance for your response.
Use mule-module-requester https://github.com/mulesoft/mule-module-requester together with the poll scheduler.
Relevant post: http://blogs.mulesoft.com/dev/mule-dev/introducing-the-mule-requester-module/
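A minimal sketch of that combination, assuming a global file endpoint named myFileEndpoint and the requester module on the classpath (the endpoint name and path are placeholders):
<file:endpoint name="myFileEndpoint" path="/data/in" responseTimeout="10000" doc:name="File"/>
<flow name="pollRequesterFlow">
    <poll doc:name="Poll">
        <fixed-frequency-scheduler frequency="1" timeUnit="MINUTES"/>
        <!-- the requester reads one file from the global endpoint on each poll tick -->
        <mulerequester:request resource="myFileEndpoint" doc:name="Mule Requester"/>
    </poll>
    <logger message="Requested payload: #[payload]" level="INFO" doc:name="Logger"/>
</flow>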
Another way is to set the FTP flow initialState="stopped" and let the poll scheduler start the flow. After the FTP processing, stop the flow again.
See sample code:
<ftp:connector name="FTP" pollingFrequency="1000"
validateConnections="true" moveToDirectory="/work/ftp/processed"
doc:name="FTP" />
<flow name="scheduleStartFTPFlow">
<poll doc:name="Poll">
<fixed-frequency-scheduler frequency="1"
timeUnit="MINUTES" />
<expression-component doc:name="START FTP FLOW"><![CDATA[if(app.registry.processFTPFlow.isStopped()){
app.registry.processFTPFlow.start();
}]]></expression-component>
</poll>
<logger message="Poll Logging: #[payload]" level="INFO"
doc:name="Logger" />
</flow>
<flow name="processFTPFlow" initialState="stopped">
<ftp:inbound-endpoint host="localhost" port="21"
path="/data/ftp" user="Sanjeet" password="sanjeet123" responseTimeout="10000"
doc:name="FTP" connector-ref="FTP" />
<logger message="Logging FTP #[payload]" level="INFO" doc:name="Logger" />
<expression-component doc:name="STOP FTP FLOW"><![CDATA[app.registry.processFTPFlow.stop();]]></expression-component>
</flow>
Please provide an SSCCE.
Based on your question, you do not need Poll at all. The File connector already has built-in polling to check for files periodically. Here is an example which polls every 123 milliseconds (pollingFrequency is in milliseconds):
<file:inbound-endpoint path="/tmp" responseTimeout="10000" doc:name="File" pollingFrequency="123"/>
My suggestion is to use the Quartz connector alongside the file connector and set the interval in the Quartz connector. Or use the file connector itself with its pollingFrequency attribute, so there is no need to wrap the file endpoint in a poll scope.
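If you go the Quartz route, a minimal sketch could look like this (job name, interval and path are placeholders, and it assumes the Mule Requester module is available to actually read the file, since the Quartz event itself carries no file payload):
<flow name="quartzFilePollFlow">
    <quartz:inbound-endpoint jobName="filePollJob" repeatInterval="60000" doc:name="Quartz">
        <quartz:event-generator-job/>
    </quartz:inbound-endpoint>
    <!-- the Quartz trigger only starts the flow; the requester does the actual file read -->
    <mulerequester:request resource="file:///tmp/input" doc:name="Mule Requester"/>
    <logger message="Quartz-triggered read: #[payload]" level="INFO" doc:name="Logger"/>
</flow>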
You can also create a file endpoint in the global elements section and then use the Mule Requester to invoke that endpoint inside a poll scope.
<file:connector name="File1" autoDelete="true" streaming="true" validateConnections="true" doc:name="File"/>
<file:endpoint connector-ref="File1" name="File" responseTimeout="10000" doc:name="File" path="/"/>
<flow name="pocforloggingFlow1">
<poll doc:name="Poll">
<mulerequester:request resource="File" doc:name="Mule Requester"/>
</poll>
</flow>

Mule - process file only when another is present

I have a Mule flow which processes files in an inbound folder that are named AAA_[id_number].dat. However, I need to configure Mule to only process this file when a corresponding file named [id_number].dat is also available. The second file indicates that the first is ready for processing.
Is there a way I can configure an inbound endpoint in Mule to only start processing the AAA_ file when its counterpart is present? The [id_number].dat file is purely for notification purposes; it should not be processed by Mule. The inbound endpoint has a regex filter to look for a file in the format AAA...
<!-- Mule Requester Config -->
<mulerequester:config name="muleRequesterConfig" doc:name="Mule Requester"/>
<!-- File Connectors -->
<file:connector name="inputTriggerConnector" pollingFrequency="100" doc:name="File"/>
<file:connector name="inputFileConnector" doc:name="File"/>
<file:connector name="outputFileConnector" doc:name="File"/>
<!-- File Endpoints -->
<file:endpoint name="inputFileEndpoint" path="src/test/input" responseTimeout="10000" doc:name="File">
<file:filename-regex-filter pattern="\d{6}.dat" caseSensitive="true"/>
</file:endpoint>
<!-- Trigger Flow -->
<flow name="triggerFlow" doc:name="triggerFlow">
<file:inbound-endpoint ref="inputFileEndpoint" connector-ref="inputTriggerConnector" pollingFrequency="1000" doc:name="Input Trigger"/>
<flow-ref name="mainFlow_StockB2C" doc:name="Flow Reference"/>
</flow>
<!-- Main Flow -->
<flow name="mainFlow" doc:name="mainFlow">
<mulerequester:request config-ref="muleRequesterConfig" resource="file://.../AAA_#[message.inboundProperties.originalFilename]?connector=inputFileConnector" timeout="6000" doc:name="Mule Requester"/>
<DO SOMETHING WITH AAA_ FILE>
<file:outbound-endpoint connector-ref="outputFileConnector" path="src/test/output" outputPattern="#[function:dateStamp].csv" responseTimeout="6000" doc:name="Output File"/>
</flow>
Why not set a file inbound filter for the [id_number].dat files (or one that excludes the AAA_ files), if those are only used for notification? That would make more sense in my opinion. You can then grab the file to be processed with the requester module inside the flow, based on the originalFilename property.
In case this helps someone who needs it, you can also create a custom filter and include your own filtering logic in there. More details in the blog linked here. A minimal wiring sketch follows below.
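For example, a custom filter could be plugged into the trigger endpoint like this (com.example.CounterpartPresentFilter is a hypothetical class implementing org.mule.api.routing.filter.Filter that accepts an AAA_ file only when its [id_number].dat counterpart exists):
<file:inbound-endpoint path="src/test/input" pollingFrequency="1000" doc:name="Input Trigger">
    <!-- hypothetical filter: accept AAA_[id].dat only when the matching [id].dat is present in the same directory -->
    <custom-filter class="com.example.CounterpartPresentFilter"/>
</file:inbound-endpoint>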

Mule: How to pass File from FTP to Java class in Mule ESB?

In Mule, I am downloading files from an FTP server. I want to pass all the files in this directory to my Java class, which should perform its actions after Download_ZIP_File in my flow. I need to perform actions like reading text files and unzipping the zipped files using Java.
There should be a Java class in my flow, and a function call should be raised on it when the download is complete. An object of this class must know all the information about the downloaded files.
Can someone please help with this? Here is my current flow:
My XML for this flow is like this:
<?xml version="1.0" encoding="UTF-8"?>
<mule xmlns:ftp="http://www.mulesoft.org/schema/mule/ee/ftp"
xmlns:tracking="http://www.mulesoft.org/schema/mule/ee/tracking"
... >
<file:endpoint name="Download_File_KBB" responseTimeout="10000" doc:name="File" path="E:\csv\output"/>
<file:connector name="Global_File_Connector" autoDelete="false" streaming="false" validateConnections="true" doc:name="File"/>
<flow name="ftp_kbb_download_fileFlow1" doc:name="ftp_kbb_download_fileFlow1">
<ftp:inbound-endpoint host="${ftp.host}" port="${ftp.port}" path="${ftp.pathInbound}" user="${ftp.user}" password="${ftp.password}" responseTimeout="10000" doc:name="KBB_FTP">
</ftp:inbound-endpoint>
<logger message="KBBUsedVehiclesNoSpecTabFormat-#[server.dateTime.year]-W#[server.dateTime.weekOfYear]" level="INFO" doc:name="Logger"/>
<file:outbound-endpoint path="${file.inboundEndpoint}" outputPattern="#[header:originalFilename]" responseTimeout="10000" doc:name="Download_ZIP_FILE" connector-ref="Global_File_Connector"/>
</flow>
</mule>
One option is to create a class that implements org.mule.api.lifecycle.Callable and then configure it with a component element in your config.
Then you will have full access to the MuleEventContext in the onCall method of this Callable class.
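As a rough sketch, assuming a hypothetical class com.example.ZipFileProcessor that implements org.mule.api.lifecycle.Callable, the component would go after the Download_ZIP_FILE outbound endpoint in the flow:
<!-- com.example.ZipFileProcessor is hypothetical; its onCall(MuleEventContext) receives the event and can read/unzip the downloaded files -->
<component class="com.example.ZipFileProcessor" doc:name="Java"/>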

Mule Esb File output not received

I tried to create a DataMapper example in Mule in which both the inbound and outbound endpoints are File. It looks something like this.
When I execute this program, the output folder remains empty. Logically, I assume that I need to put a HashMap-to-XML transformer between the DataMapper and the output File endpoint. Moreover, I created the CSV-to-XML mapping by selecting from the example option in DataMapper.
Initially I tried to use an FTP endpoint, but it started resulting in errors, so I replaced FTP with a File endpoint.
Here I am sharing the configuration XML file:
<mule xmlns:file="....>
<data-mapper:config name="sample_mapper_grf" transformationGraphPath="sample_mapper.grf" doc:name="DataMapper"/>
<flow name="CSV_to_XML_Data_MapperFlow1" doc:name="CSV_to_XML_Data_MapperFlow1">
<file:inbound-endpoint path="/home/jay/CSV_XML_/input" responseTimeout="10000" doc:name="Input File"/>
<data-mapper:transform config-ref="sample_mapper_grf" doc:name="DataMapper"/>
<file:outbound-endpoint path="/home/jay/CSV_XML_/output/" responseTimeout="10000" doc:name="Output File"/>
</flow>
</mule>
Data-Mapper configuration image is here
Add a Groovy component after the DataMapper and try to dump the contents:
println "post mapping payload " + payload
return payload
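For reference, that script sits in a scripting component roughly like this (assuming the scripting namespace is declared on the mule element):
<scripting:component doc:name="Groovy">
    <scripting:script engine="Groovy"><![CDATA[
        // dump the post-mapping payload to the console, then pass it along unchanged
        println "post mapping payload " + payload
        return payload
    ]]></scripting:script>
</scripting:component>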
I got it resolved by adding an object-to-string-transformer before the file outbound endpoint.
Here is the configuration.xml:
<mule ....>
<data-mapper:config name="sample_mapper_grf" transformationGraphPath="sample_mapper.grf" doc:name="DataMapper"/>
<flow name="CSV_to_XML_Data_MapperFlow1" doc:name="CSV_to_XML_Data_MapperFlow1">
<file:inbound-endpoint path="/home/jay/CSV_XML_/input" responseTimeout="10000" doc:name="Input File"/>
<data-mapper:transform config-ref="sample_mapper_grf" doc:name="DataMapper"/>
<object-to-string-transformer doc:name="Object to String"/>
<file:outbound-endpoint path="/home/jay/Output" responseTimeout="10000" doc:name="File" outputPattern="#[function:dateStamp].xml"/>
</flow>
</mule>

How to upload multiple files via REST over HTTP using Mule?

I have a folder, say "MyFiles", where I have lots of files. Now I need to upload those files via REST over HTTP. What would be the approach?
I tried the below, but it is wrong:
<flow name="testFlow1" doc:name="testFlow1">
<http:inbound-endpoint exchange-pattern="request-response" host="localhost" port="8081" doc:name="HTTP"/>
<http:rest-service-component
serviceUrl="http://localhost:8280/rest/xyz"
httpMethod="POST">
</http:rest-service-component>
<http:endpoint host="localhost" port="5430" encoding="UTF-8"
method="POST" connector-ref="fileHttp" path="fileuploader" name="muleFileUploader">
</http:endpoint>
</flow>
Please help. Since the input folder will have multiple files, how can we handle that as well?
Thanks
Your flow doesn't use a file inbound endpoint and uses a generic (neither inbound nor outbound) HTTP endpoint, so there's no way this can work.
Below is a configuration that successfully uploads files to an HTTP endpoint. I cannot make it work without the object-to-byte-array-transformer (the same file gets polled over and over again - bug?), so I hope your files are not huge...
<flow name="fileUploader">
<file:inbound-endpoint path="/tmp/mule/in"
pollingFrequency="5000" moveToDirectory="/tmp/mule/done" />
<object-to-byte-array-transformer />
<http:outbound-endpoint address="http://..."
method="POST" exchange-pattern="request-response" />
</flow>