Call flows from a batch job sequentially in Mule ESB

I have created a batch flow. From the first batch job I call a second batch job, and from the second batch job I call a simple third flow.
The problem is that these do not run sequentially, and I need all flows to run in a sequential manner.
For example, 5 records come into the first batch, 10 records come into the second batch, and from the second batch I call the third (simple) flow.
The second batch does not wait for the third flow's execution to finish; it keeps executing through all 10 records.
I need the third flow's execution to complete first, and only then should the next record be processed.
How can I solve this scenario? Please help.

According to the MuleSoft documentation,
Batch Processing at a Glance
https://docs.mulesoft.com/mule-user-guide/v/3.8/batch-processing
A batch job is a top-level element in Mule which exists outside all
Mule flows. Batch jobs split large messages into records which Mule
processes asynchronously in a batch job; just as flows process
messages, batch jobs process records.
So the answer is that you cannot run batch synchronously. After the input phase, Mule performs a load-and-dispatch and transforms the collection into a queue of individual records that are processed asynchronously.
Is there any reason why you are using batch instead of a normal flow? If strict ordering is the requirement, a plain synchronous flow is usually the better fit, as sketched below.
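Below is a minimal sketch (Mule 3.x; the flow names and the logger are illustrative, not from the original post) of the three-level call chain rebuilt with synchronous flows, so each record fully completes the whole chain before the next one starts:

<flow name="mainFlow" processingStrategy="synchronous">
    <foreach>
        <!-- hand each record to the next flow and wait for it to finish -->
        <flow-ref name="secondFlow"/>
    </foreach>
</flow>

<flow name="secondFlow" processingStrategy="synchronous">
    <!-- per-record transformation logic would go here -->
    <flow-ref name="thirdFlow"/>
</flow>

<flow name="thirdFlow" processingStrategy="synchronous">
    <logger level="INFO" message="Processing record: #[payload]"/>
</flow>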

Related

mule : Batch Processing vs For-each

I have a scenario where I have a list of IDs; for each ID I fetch data from multiple APIs and aggregate it (this is the loading phase), then write it to a DB. I know that we can use batch processing for writing to the DB, but what about the loading phase?
You should be able to use a foreach scope for this.
Your list of IDs will be in your payload before it reaches the foreach. You can use HTTP components set to request-response; this way all the data you need will be fetched before you reach your DB component for saving the data. For example:
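A rough sketch of that approach (Mule 3.x; the host, path, and variable names are placeholders, not from the original post):

<http:request-config name="api_config" host="api.example.com" port="80"/>

<flow name="loadAndSaveFlow">
    <!-- payload is assumed to be the list of IDs at this point -->
    <set-variable variableName="results" value="#[[]]"/>
    <foreach>
        <!-- http:request is request-response: the flow waits for each reply -->
        <http:request config-ref="api_config" path="/records/#[payload]" method="GET"/>
        <expression-component>flowVars.results.add(payload)</expression-component>
    </foreach>
    <!-- foreach restores the original payload, so pick up the collected results -->
    <set-payload value="#[flowVars.results]"/>
    <!-- a single db:insert (or similar) would persist everything here -->
</flow>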
Fetching data from multiple APIs takes time and should be kept inside a batch step. For each record, after fetching the data, move it to a VM queue. In the on-complete phase, use a Mule requester to fetch the details from the VM queue and insert them into the DB. Inserting into the DB is a single step and does not require batch processing. Something along these lines:
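A rough sketch of that layout (Mule 3.x; assumes the community mule-module-requester is on the classpath, and the queue name is illustrative):

<batch:job name="loadJob">
    <batch:process-records>
        <batch:step name="fetchStep">
            <!-- calls to the external APIs would go here -->
            <!-- park the fetched result on a VM queue -->
            <vm:outbound-endpoint path="fetched"/>
        </batch:step>
    </batch:process-records>
    <batch:on-complete>
        <!-- pull the parked records back; in practice, loop until the
             queue is empty before doing the insert -->
        <mulerequester:request resource="vm://fetched"/>
        <!-- a single db:insert with the collected records would go here -->
    </batch:on-complete>
</batch:job>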
You can use scatter-gather for each ID to fetch data from multiple APIs. Scatter-Gather sends a request message to multiple targets concurrently; based on the responses, you can implement an aggregation strategy.
The same can be done using Mule batch as well.
Reference: https://docs.mulesoft.com/mule-user-guide/v/3.9/scatter-gather
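A minimal sketch of the scatter-gather variant (Mule 3.x; the two request configs and paths are placeholders):

<flow name="fetchPerIdFlow">
    <foreach>
        <!-- both routes run concurrently for the current ID -->
        <scatter-gather>
            <http:request config-ref="api_one" path="/a/#[payload]" method="GET"/>
            <http:request config-ref="api_two" path="/b/#[payload]" method="GET"/>
        </scatter-gather>
        <!-- a custom aggregation strategy or a DataWeave merge of the
             two route results would go here -->
    </foreach>
</flow>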

Does synchronous processing work with batch processing?

I have hundreds of XML files in my source directory. I have made my flow's processing strategy synchronous so that only one XML file is executed at a time, since performance
is not a high priority for me. But I also have batch processing in my flow. What I understand is that the flow thread creates a child thread to execute my batch processing, and control moves forward. My whole transformation logic lies in the batch processing, which takes about 30 seconds per XML file; there is not much logic in my main flow apart from the file inbound endpoint and the batch execute component (to trigger the batch job). So the file inbound endpoint keeps polling for files, the whole bunch of XMLs gets picked up in very little time, Mule runs out of memory, and unexpected behavior occurs.
I came across the fork-join pattern very late, and it may or may not fit my requirement.
Is there any configuration that makes my batch job complete fully before the next file is picked up? Please help me out; I have already made the processing strategy synchronous!
Shouldn't you, in this case, just adjust the polling frequency on the file inbound endpoint?
https://docs.mulesoft.com/mule-user-guide/v/3.7/file-connector
Polling Frequency
(Applies to inbound File endpoints only.)
Specify how often the endpoint should check for incoming messages. The default value is 1000 ms.
Also set maxThreadsActive and maxBufferSize:
https://docs.mulesoft.com/mule-user-guide/v/3.6/tuning-performance#calculating-threads
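A minimal sketch combining both knobs (Mule 3.x; the path, job name, and values are illustrative):

<file:connector name="throttledFileConnector">
    <!-- one receiver thread, minimal buffering -->
    <receiver-threading-profile maxThreadsActive="1" maxBufferSize="1"/>
</file:connector>

<flow name="xmlFileFlow" processingStrategy="synchronous">
    <!-- poll every 30 s, roughly matching the per-file processing time -->
    <file:inbound-endpoint path="/data/in" connector-ref="throttledFileConnector"
                           pollingFrequency="30000"/>
    <batch:execute name="transformJob"/>
</flow>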

Salesforce Monitor Bulk Data Load Jobs: for processing jobs progress is not showing up

We are not able to see any progress while a Salesforce job (Bulk API) is processing. We are exporting 300,000 tasks, and the job has been there for 4 days, but we cannot see any progress on it. Is there a way to see the progress? We need to know when it is going to finish.
A job by itself doesn't do any work; it is the batches queued to a job that actually carry out the data modifications. An open job stays open until it is closed or until it times out (after 1 week, if I remember correctly). The open status therefore does not signify progress; it only means you can queue more batches to the job.
As you can see in your second screenshot, no batches were queued to this job. Check the code that actually queues the batches; the API probably returns some kind of error there.

Monitor a mule process from another mule flow

I have a process which reads a file and uploads it to a database. The flow goes as below:
File connector
Processing within a for-each loop (Update to database)
The problem with the above approach was that whenever an exception occurred, processing stopped at that record and the rest of the records were not processed. As a workaround, I changed the flow as below:
File connector
For each - Within the for-each a flow-ref is placed to call a separate flow which does the processing.
What I noticed is that, at the point of calling the new flow, separate threads are used for processing, so an exception does not cause all the records to fail. Now I am facing another difficulty: after the processing completes, I need to produce a report with the complete processing details (number of records processed, rejected, etc.). Since all the records are processed asynchronously in different threads, I cannot figure out when the processing is complete. Is there a way to monitor from another Mule flow whether the processing has completed, so that I can generate the report when it is done?
at the point of calling the new flow, separate threads are used for processing
A flow-ref doesn't necessarily imply a new thread: you can tune the processing strategy of the ref-ed flow to force synchronous processing.
With this, you remain synchronous, can have a custom exception strategy in the ref-ed flow, and still achieve your goal of not breaking processing when an error occurs. For example:
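A minimal sketch of that setup (Mule 3.x; flow names and messages are illustrative): the ref-ed flow is synchronous and owns a catch exception strategy, so a bad record is logged without stopping the loop, and the caller knows processing is done as soon as the foreach returns:

<flow name="mainFileFlow">
    <file:inbound-endpoint path="/data/in" pollingFrequency="10000"/>
    <foreach>
        <!-- runs in the caller's thread because the target is synchronous -->
        <flow-ref name="processRecordFlow"/>
    </foreach>
    <!-- reaching this point means every record has been attempted:
         build the summary report here -->
    <logger level="INFO" message="All records processed"/>
</flow>

<flow name="processRecordFlow" processingStrategy="synchronous">
    <!-- the per-record database update would go here -->
    <catch-exception-strategy>
        <logger level="WARN" message="Record failed: #[exception.message]"/>
    </catch-exception-strategy>
</flow>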

How to run multiple data flow tasks in a single control flow in an SSIS project

I have an issue: I run multiple data flow tasks in a single control flow. If all 5 sources are alive, it works fine, but if any one source is dead, the remaining 4 data flows are not executed. How can I make whichever flows have live sources run smoothly whenever the job executes?
I am assuming that the 5 data flows are all connected together in the control flow, and that the desire is to have all 5 data flows execute regardless of the success or failure of the previous data flow.
To accomplish this, change the Precedence Constraint from its current value of Success (green) to Completion (blue). To access the Precedence Constraint Editor, double-click the connector line between the tasks.
Another option is to place all 5 data flow tasks in a sequence container and have them run in parallel, by not connecting them to each other.