Rename SFTP file after reading it - mule

I want to read a file using the SFTP connector and then change its name.
This is my SFTP connector -
<sftp:connector preferredAuthenticationMethods="publickey,password,keyboard-interactive" name="SFTP" validateConnections="true" doc:name="SFTP" outputPattern="file1Temp.txt"/>
My SFTP inbound-endpoint is -
<sftp:inbound-endpoint connector-ref="SFTP" host="${a.host}" port="${a.port}" path="${a.inputpath}" user="${a.user}" password="${a.pw}" responseTimeout="10000" pollingFrequency="600000" doc:name="SFTP">
<file:filename-regex-filter pattern="file1.txt" caseSensitive="true"/>
</sftp:inbound-endpoint>
Here, my file name is "file1.txt" and I want to change its name to "file1Temp.txt".

The SFTP connector in Mule 3.x doesn't support file operations such as rename out of the box. You can follow the method in this KB article to implement it: https://support.mulesoft.com/s/article/How-to-rename-file-with-the-SFTP-connector
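If a true server-side rename (the KB article's method) is more than you need, one workaround is to let the inbound endpoint consume the file and write the payload back under the new name. A rough sketch, reusing the connector and endpoint from the question (note this re-downloads and re-uploads the content rather than renaming in place, which may matter for large files):
<flow name="renameViaRewriteFlow">
    <!-- Consumes file1.txt; with autoDelete left at its default of true,
         the original file is removed after it is read -->
    <sftp:inbound-endpoint connector-ref="SFTP" host="${a.host}" port="${a.port}" path="${a.inputpath}" user="${a.user}" password="${a.pw}" responseTimeout="10000" pollingFrequency="600000" doc:name="SFTP">
        <file:filename-regex-filter pattern="file1.txt" caseSensitive="true"/>
    </sftp:inbound-endpoint>
    <!-- Writes the payload back to the same directory as file1Temp.txt -->
    <sftp:outbound-endpoint connector-ref="SFTP" host="${a.host}" port="${a.port}" path="${a.inputpath}" user="${a.user}" password="${a.pw}" outputPattern="file1Temp.txt" doc:name="SFTP"/>
</flow>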

Mule 4 sftp read file - reconnection strategy not working

I am trying to read a file from an SFTP location.
I am using Mule 4.4 Community Edition.
If there is an error while connecting to the SFTP server, or the file is not present, I would like Mule to retry 2 times.
So I configured 'Reconnection strategy' as 'Standard' on the SFTP connector.
However, the logs do not show any retry occurring.
NOTE - ideally this will be kicked off at a scheduled time (Scheduler), but for testing purposes I am using an HTTP Listener to invoke the flow.
Here is the code:
<sftp:config name="SFTP_Config" doc:name="SFTP Config">
<sftp:connection host="abcd" username="xyz" password="pwd" />
</sftp:config>
<flow name="get:employee">
<logger level="INFO" doc:name="Logger" message="starting search" category="get-employee"/>
<sftp:read doc:name="Read" config-ref="SFTP_Config" path="/a/employees.unl">
<repeatable-in-memory-stream />
<reconnect />
</sftp:read>
<error-handler ></error-handler>
</flow>
I am wondering if I am doing something wrong? I would want the flow, or at least the file read, to be attempted twice before erroring out.
Presently, when the file does not exist in the SFTP location, it simply throws an error:
Message : Path '/a/employees.unl' doesn't exist
Error type : SFTP:ILLEGAL_PATH
When does the reconnection strategy kick in?
Thanks
Reconnection strategies are for connections, not for operations that fail. In your example the connection is working fine; the operation fails because the path doesn't exist.
For operations you should put the operation inside an <until-successful> scope. You can use the maxRetries attribute to specify the number of retries.
Documentation: https://docs.mulesoft.com/mule-runtime/4.4/until-successful-scope
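A minimal sketch of that change, based on the flow from the question (the millisBetweenRetries value is an assumption; tune it to your needs):
<flow name="get:employee">
    <logger level="INFO" doc:name="Logger" message="starting search" category="get-employee"/>
    <!-- Retries the read operation itself: up to 2 retries after the
         first failure, pausing 1 second between attempts -->
    <until-successful maxRetries="2" millisBetweenRetries="1000" doc:name="Until Successful">
        <sftp:read doc:name="Read" config-ref="SFTP_Config" path="/a/employees.unl"/>
    </until-successful>
    <error-handler></error-handler>
</flow>
Once the retries are exhausted, the scope raises a MULE:RETRY_EXHAUSTED error that you can catch in the error handler.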

How to setup Mule ESB SFTP listener

How do I set up an SFTP (SSH) listener on Mule ESB (CE)?
I could only find the HTTP(S) listener.
Thank you for any hints
There is no separate connector; the same connector can be used for both username/password and public-key authentication. Please refer to the preferredAuthenticationMethods attribute for more details. The configuration will look like this:
<sftp:connector name="SFTP" identityFile="ppkOrpemfile_path" preferredAuthenticationMethods="publickey" validateConnections="true" doc:name="SFTP"/>
<flow name="testSFTP_flow">
<sftp:inbound-endpoint connector-ref="SFTP" host="host" port="22" responseTimeout="10000" doc:name="SFTP"/>
</flow>
Hope this helps.
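For completeness, the username/password variant would look something like this (the connector name and credentials below are placeholders):
<sftp:connector name="SFTP_Password" preferredAuthenticationMethods="password" validateConnections="true" doc:name="SFTP"/>
<flow name="testSFTP_password_flow">
    <!-- Same endpoint shape as above, but authenticating with user/password -->
    <sftp:inbound-endpoint connector-ref="SFTP_Password" host="host" port="22" user="username" password="password" responseTimeout="10000" doc:name="SFTP"/>
</flow>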
Try to find the wrapper.conf file in your server's conf folder and modify the port number:
wrapper.java.additional.<n>=-Dmule.mmc.bind.port=7779
(where <n> is the next unused index in the file)
Mule ESB / Mule Runtime is not an SFTP server. All you can do is use the Mule SFTP connector to pull and push files (acting as an SFTP client). If you are looking for an SFTP server, you need to host one. If your partner company has the option to send data over HTTP(S), you could use the Mule HTTP listener.
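If you do take the HTTP(S) route, the receiving side could be as simple as the sketch below (host, port, path, and the target directory are placeholders; this uses the HTTP connector introduced in Mule 3.6):
<http:listener-config name="HTTP_Listener_Configuration" host="0.0.0.0" port="8081" doc:name="HTTP Listener Configuration"/>
<flow name="receiveOverHttpFlow">
    <!-- Partners POST their data to http://<your-host>:8081/inbound -->
    <http:listener config-ref="HTTP_Listener_Configuration" path="/inbound" doc:name="HTTP"/>
    <!-- Persist the received payload to disk -->
    <file:outbound-endpoint path="/data/incoming" outputPattern="#[function:dateStamp].dat" doc:name="File"/>
</flow>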

Mule: upload multiple files of various types into an Amazon S3 bucket

<flow name="flow1">
<file:inbound-endpoint path="C:\temp" moveToPattern="abc.txt" responseTimeout="10000" doc:name="File"/>
<s3:create-object config-ref="Amazon_S3" bucketName="mulebucket" key="img" doc:name="Amazon S3"/>
<logger message="s3 upload done...:" level="INFO" />
</flow>
I want to upload multiple files into my S3 bucket,
but the above code uploads only one file.
Any suggestions are welcome.
The file inbound-endpoint will keep picking up files from the source directory and creating them in S3. I think the problem is that your S3 object key is static, so it keeps overwriting the same file. You can change the key to be more dynamic by using the filename of the loaded file, something like this:
<s3:create-object config-ref="Amazon_S3" bucketName="mulebucket" key="#[message.inboundProperties.originalFilename]" doc:name="Amazon S3"/>
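Put together, the flow could look like the sketch below. Note that the original moveToPattern="abc.txt" has the same static-name problem on the file side, so it is left out here:
<flow name="flow1">
    <file:inbound-endpoint path="C:\temp" responseTimeout="10000" doc:name="File"/>
    <!-- Key each object by the name of the file that was picked up -->
    <s3:create-object config-ref="Amazon_S3" bucketName="mulebucket" key="#[message.inboundProperties.originalFilename]" doc:name="Amazon S3"/>
    <logger message="s3 upload done for #[message.inboundProperties.originalFilename]" level="INFO"/>
</flow>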

Mule - process file only when another is present

I have a Mule flow which processes files in an inbound folder that are named AAA_[id_number].dat. However, I need to configure Mule to only process this file when a corresponding file named [id_number].dat is also available. The second file indicates that the first is ready for processing.
Is there a way I can configure an inbound endpoint in Mule to only start processing the AAA_ file when its counterpart is present? The [id_number].dat file is purely for notification purposes; it should not be processed by Mule. The inbound endpoint has a regex filter to look for a file in the format AAA...
<!-- Mule Requester Config -->
<mulerequester:config name="muleRequesterConfig" doc:name="Mule Requester"/>
<!-- File Connectors -->
<file:connector name="inputTriggerConnector" pollingFrequency="100" doc:name="File"/>
<file:connector name="inputFileConnector" doc:name="File"/>
<file:connector name="outputFileConnector" doc:name="File"/>
<!-- File Endpoints -->
<file:endpoint name="inputFileEndpoint" path="src/test/input" responseTimeout="10000" doc:name="File">
<file:filename-regex-filter pattern="\d{6}.dat" caseSensitive="true"/>
</file:endpoint>
<!-- Trigger Flow -->
<flow name="triggerFlow" doc:name="triggerFlow">
<file:inbound-endpoint ref="inputFileEndpoint" connector-ref="inputTriggerConnector" pollingFrequency="1000" doc:name="Input Trigger"/>
<flow-ref name="mainFlow_StockB2C" doc:name="Flow Reference"/>
</flow>
<!-- Main Flow -->
<flow name="mainFlow" doc:name="mainFlow">
<mulerequester:request config-ref="muleRequesterConfig" resource="file://.../AAA_#[message.inboundProperties.originalFilename]?connector=inputFileConnector" timeout="6000" doc:name="Mule Requester"/>
<DO SOMETHING WITH AAA_ FILE>
<file:outbound-endpoint connector-ref="outputFileConnector" path="src/test/output" outputPattern="#[function:dateStamp].csv" responseTimeout="6000" doc:name="Output File"/>
</flow>
Why not set a file inbound filter for the [id_number].dat files (or one that excludes the AAA_ files), if those are only used for notification? That would make more sense in my opinion. You can then grab the file to be processed with the requester module inside the flow, based on the originalFilename property.
Just in case this might help someone who needs it: you can create a custom filter and include your own filtering logic in there. More details are in the blog post here.
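On the Mule configuration side, plugging in a custom filter is a one-liner. In the sketch below, com.example.CounterpartFileFilter is a hypothetical class implementing org.mule.api.routing.filter.Filter whose accept() method would return true for AAA_[id_number].dat only when the matching [id_number].dat is present in the same directory:
<file:inbound-endpoint path="src/test/input" connector-ref="inputFileConnector" doc:name="Filtered Input">
    <!-- com.example.CounterpartFileFilter is hypothetical; it would check
         for the counterpart [id_number].dat before accepting the AAA_ file -->
    <custom-filter class="com.example.CounterpartFileFilter"/>
</file:inbound-endpoint>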

How to trigger a Mule job from CLI (Command Line Interface)?

I have the below flow that will make a REST request on a periodic basis and then store the data into the PostgreSQL database.
<jdbc:postgresql-data-source name="PostgreSQL_Data_Source" user="postgres" password="*******" url="jdbc:postgresql://localhost:5432/TestDB" transactionIsolation="UNSPECIFIED" doc:name="PostgreSQL Data Source"/>
<jdbc:connector name="PostgreSQL_JDBC_Connector" dataSource-ref="PostgreSQL_Data_Source" validateConnections="true" queryTimeout="-1" pollingFrequency="0" doc:name="Database">
<jdbc:query key="InsertRecord" value="INSERT INTO "tblJSON"("StoreHere") VALUES (CAST(#[message.payload] AS json))"/>
</jdbc:connector>
<flow name="RESTServiceScheduling" doc:name="RESTServiceScheduling">
<!-- Step 1: Generates events at a given interval of time -->
<quartz:inbound-endpoint jobName="RESTServiceJobSchedule" repeatInterval="0" doc:name="Quartz" responseTimeout="10000" cronExpression="0 0 10 ? * *">
<quartz:event-generator-job/>
</quartz:inbound-endpoint>
<!-- Step 2: This will read the REST service data -->
<http:rest-service-component httpMethod="GET" serviceUrl="http://localhost:12186/RestServiceImpl.svc/StorageUsage" />
<!-- Step 3: Transform the HTTP-streamed payload into a java.lang.String -->
<object-to-string-transformer doc:name="Object to String"/>
<!-- Step 4: Dump into the destination Database -->
<jdbc:outbound-endpoint exchange-pattern="one-way" queryKey="InsertRecord" queryTimeout="-1" connector-ref="PostgreSQL_JDBC_Connector" doc:name="Destination"/>
</flow>
This works fine, but I need a way to trigger the job from the CLI (Command Line Interface).
How can I do so?
Thanks in advance.
Use an HTTP inbound endpoint to trigger the flow and call it with curl from the command line.
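Something like the sketch below would do it (host, port, and path are placeholders). One clean way is to extract steps 2-4 into a sub-flow so both the Quartz flow (changed to call it via flow-ref) and the manual HTTP trigger can reference the same logic:
<sub-flow name="fetchAndStore">
    <!-- Steps 2-4 from the question, unchanged -->
    <http:rest-service-component httpMethod="GET" serviceUrl="http://localhost:12186/RestServiceImpl.svc/StorageUsage"/>
    <object-to-string-transformer doc:name="Object to String"/>
    <jdbc:outbound-endpoint exchange-pattern="one-way" queryKey="InsertRecord" queryTimeout="-1" connector-ref="PostgreSQL_JDBC_Connector" doc:name="Destination"/>
</sub-flow>
<flow name="RESTServiceManualTrigger">
    <!-- Trigger from the command line with: curl http://localhost:8081/trigger -->
    <http:inbound-endpoint exchange-pattern="request-response" host="localhost" port="8081" path="trigger" doc:name="HTTP"/>
    <flow-ref name="fetchAndStore" doc:name="Flow Reference"/>
</flow>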
I know this is an old question with an accepted answer, but an alternative is to use a file endpoint that deletes the file. Set the file (or its endpoint) to have a file age of 1 and a polling frequency of 10 seconds. To trigger the flow, create a file with the right name. I found when doing this, though, that Mule would not delete the file until the flow was done, and it would then pick up the same file multiple times if the flow took longer than the polling period. To get around that, I have one flow containing just a file inbound endpoint, a logger, and a VM outbound endpoint with a specific path, and then a VM inbound endpoint with the same path where you would use the HTTP inbound endpoint above.
Edit: you can then use touch or something similar in your CLI to create the file. I found this question while looking for an alternative to the approach described above.
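A sketch of that two-flow arrangement, reusing the fetchAndStore sub-flow from the sketch above (the trigger directory and file name are placeholders; the timing values mirror the answer: file age 1, polling every 10 seconds):
<!-- Trigger flow: picks up the marker file (auto-deleted after the hand-off)
     and immediately hands control to the VM queue -->
<flow name="cliTriggerFlow">
    <file:inbound-endpoint path="/opt/mule/triggers" fileAge="1" pollingFrequency="10000" doc:name="Trigger File">
        <file:filename-regex-filter pattern="run\.trigger" caseSensitive="true"/>
    </file:inbound-endpoint>
    <logger message="CLI trigger file received" level="INFO" doc:name="Logger"/>
    <vm:outbound-endpoint path="startJob" doc:name="VM"/>
</flow>
<!-- Worker flow: does the actual processing, decoupled from the file poll -->
<flow name="jobFlow">
    <vm:inbound-endpoint path="startJob" doc:name="VM"/>
    <flow-ref name="fetchAndStore" doc:name="Flow Reference"/>
</flow>
From the command line, touch /opt/mule/triggers/run.trigger then kicks off the job.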