Is there any way to set the skip expression of a task using Java code in Flowable, or any other dynamic way to set a task's skip expression? - flowable

I have checked the Flowable REST API but have not found any REST call that I can use to set the skip expression of a task.
I have also checked the Flowable API documentation but was unable to find a way to set the skip expression dynamically using Java code.
If anyone has an idea of how to set the skip expression of a Flowable task dynamically using Java code, or any other dynamic way that does not involve setting the skip expression in the Flowable Modeler, please share.
The Flowable version is 6.4.1.

In order for the flowable:skipExpression from the BPMN XML to be used, the process instance needs to have a variable named _FLOWABLE_SKIP_EXPRESSION_ENABLED with the value true.
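For illustration, here is a minimal Java sketch of that, assuming a default process engine configuration; the process definition key ("myProcess") and the skipReview variable referenced by the skip expression in the model are placeholders:

import java.util.HashMap;
import java.util.Map;

import org.flowable.engine.ProcessEngine;
import org.flowable.engine.ProcessEngines;
import org.flowable.engine.RuntimeService;

public class SkipExpressionExample {

    public static void main(String[] args) {
        // Obtain the process engine (assumes a flowable.cfg.xml on the classpath).
        ProcessEngine processEngine = ProcessEngines.getDefaultProcessEngine();
        RuntimeService runtimeService = processEngine.getRuntimeService();

        // Enable evaluation of flowable:skipExpression for this process instance and
        // provide a variable that the expression itself can reference, e.g.
        // flowable:skipExpression="${skipReview}" on the user task in the BPMN XML.
        Map<String, Object> variables = new HashMap<>();
        variables.put("_FLOWABLE_SKIP_EXPRESSION_ENABLED", true);
        variables.put("skipReview", true); // hypothetical variable used by the expression

        runtimeService.startProcessInstanceByKey("myProcess", variables);

        // For an already running instance, the same variables can be set later:
        // runtimeService.setVariable(executionId, "_FLOWABLE_SKIP_EXPRESSION_ENABLED", true);
    }
}

Note that the skip expression itself still lives in the BPMN XML; what you control dynamically from Java are the variables (the enable flag and whatever the expression references), which is usually enough to decide at runtime whether a task is skipped.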

Related

How do I set a dynamic default value of a REST data source in Oracle APEX?

I have just started learning Oracle APEX.
I imported an external API into Oracle APEX (https://localhost/report/?dateFrom=2022-09-30&dateTo=2022-10-01). This API has two parameters, dateFrom and dateTo. I want to dynamically set the default value of the dateFrom parameter to 5 days ago and dateTo to today. How can I do this?
One option is to define the parameter value for the component which is using the REST data source. If that is, for instance, a report, you can configure it within the Parameters node in Page Designer. In the tree on the left-hand side, click the node for your parameter, and then, in the Attributes pane on the right, choose Expression or Function Body and enter the expression to compute the value. For a REST synchronization (downloading data to a local table), you would configure the same on the Synchronization Step.
These options will configure the dynamic value at the component level (the component which uses the REST Data Source).
However, your question reads like you're after a default to be applied to all components using that REST data source. The Expression or Function Body options are not available at the REST source level. I see two approaches:
Define an application item (APP_REST_DATEFROM) and an Application Computation to set the value for this item. In the REST source, then simply use &APP_REST_DATEFROM. as the value for your parameter. At the component level, it's important to switch the parameter value to REST Source Default.
Another option allows you to take these parameters out of the REST source configuration completely, but implies some PL/SQL coding effort: you could author a REST Source Plug-In, where the plug-in code implements custom handling for all aspects of the HTTP requests being made for the REST data source. That typically includes pagination, but can also include all parameter handling (HTTP headers, URL parameters, etc.). An example of such a plug-in is here: https://github.com/oracle/apex/tree/22.1/plugins/rest-source/fixed-page-size
I hope this helps.

How to set log4j2 log levels and categories in DataWeave 2.x?

When using the log() function in DataWeave I have a few questions:
How can I set a log level and log category so my logs are handled by log4j2 the same way as log messages from the Logger components in the Mule flow?
How can I suppress logging the expression? If the expression result is very large (what if it is streaming data?), I might only want to log the first argument to log() and skip the actual DataWeave expression evaluation.
There is no way to set the logging level with the log() function in DataWeave. As an alternative, you could implement a custom function, in a custom module, that logs in a way that allows setting levels.
You could use the same custom function to implement some logic, however there is the generic problem of determining whether a payload is big without fully consuming it. In any case, DataWeave logging is meant to be used as a debugging tool and should not be used in production or for big payloads. The best practice is to avoid logging at all unless you need to debug an issue, and then to remove the logging afterwards.
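If you do go the custom route, one possible shape for it, assuming you are willing to add a small Java helper and call it from DataWeave through the java! binding (the class name, package, and DataWeave call below are made up for the example, not an official API), is a static method that delegates to log4j2:

import org.apache.logging.log4j.Level;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;

public class DataWeaveLogger {

    // Logs a message under an arbitrary category and level so it is handled by the
    // same log4j2 configuration as the Mule Logger components.
    public static String log(String category, String level, String message) {
        Logger logger = LogManager.getLogger(category);
        logger.log(Level.toLevel(level, Level.INFO), message);
        // Return the message so the call can be embedded in a DataWeave expression.
        return message;
    }
}

After an import java!com::example::DataWeaveLogger in the script, you could then call something like DataWeaveLogger::log("my.category", "DEBUG", "before transform"), which only logs the message you pass in and never serializes the payload, sidestepping the large/streaming payload concern.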

Mule flow terminate

I have a small question. Do we have any option to stop/terminate a flow wherever we want in Mule 4? Example: after executing a Transform Message or a Logger processor, we want to stop/terminate the flow based on our business requirement.
Two different ways to achieve that are described in the KB article https://help.mulesoft.com/s/article/How-To-Stop-Or-Start-Flows-In-Mule-4-x-Programmatically
Basically you need to get an instance of the Mule registry, then look up the flow and stop or start it.
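A rough Java sketch of that approach (the class name and flow names are placeholders; in practice the registry can be injected into objects that Mule manages):

import javax.inject.Inject;

import org.mule.runtime.api.artifact.Registry;
import org.mule.runtime.api.lifecycle.Startable;
import org.mule.runtime.api.lifecycle.Stoppable;

public class FlowSwitcher {

    // Mule injects its registry into components it manages.
    @Inject
    private Registry registry;

    // Stop the flow with the given name if it is present in the registry.
    public void stopFlow(String flowName) throws Exception {
        Object flow = registry.lookupByName(flowName)
                .orElseThrow(() -> new IllegalArgumentException("Flow not found: " + flowName));
        ((Stoppable) flow).stop();
    }

    // Start it again later.
    public void startFlow(String flowName) throws Exception {
        Object flow = registry.lookupByName(flowName)
                .orElseThrow(() -> new IllegalArgumentException("Flow not found: " + flowName));
        ((Startable) flow).start();
    }
}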
Yes, you can make use of a Groovy script or a Java class to start/stop a Mule flow at runtime. Drag and drop the Scripting component from the palette, choose Groovy as your engine, and use the script below.
// Look up the flow by its name in the Mule registry.
def flow = registry.lookupByName('flowName').get()
// Toggle the flow's state: stop it if it is running, otherwise start it.
if (flow.isStarted())
    flow.stop()
else
    flow.start()

NiFi API - Update parameter context

We've created a parameter context within NiFi which is allocated to several process groups. We would like to update the value of one parameter within the parameter context. Is there any option to do this via the API?
The NiFi CLI from nifi-toolkit has commands for interacting with parameters; there is one called set-param:
https://github.com/apache/nifi/tree/master/nifi-toolkit/nifi-toolkit-cli/src/main/java/org/apache/nifi/toolkit/cli/impl/command/nifi/params
You could use that, or look at the code to see how it uses the API.
Also, anything you can do from the NiFi UI has to go through the REST API. So you can always open Chrome Dev Tools, take some action in the UI like updating a parameter, and then look at which calls are made.
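If you prefer to script against the REST API directly, a minimal Java sketch of the first step, fetching the parameter context, could look like the following; the /nifi-api/parameter-contexts/{id} path and the placeholder values should be verified against your NiFi version (for example with the dev-tools approach above), and the actual update is an additional call that you can observe the same way:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class ParameterContextClient {

    public static void main(String[] args) throws Exception {
        // Placeholders: adjust the base URL and the parameter context id.
        String nifiApi = "http://localhost:8080/nifi-api";
        String contextId = "your-parameter-context-id";

        HttpClient client = HttpClient.newHttpClient();

        // Fetch the current parameter context, including its revision, which the
        // update call observed in the browser dev tools needs to echo back.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(nifiApi + "/parameter-contexts/" + contextId))
                .header("Accept", "application/json")
                .GET()
                .build();

        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body());
    }
}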

How to hit an API in parallel with different input parameters

I am getting doctorCodes as (Dr1124914, Dr1074955).
My clinic API gives the above response taking one doctorCode at a time, and I have to extract a value from the response.
But I want to make parallel calls to my API with all the values of doctorCodes shown above in one go, extracting the required field from each response and finally accumulating the results into my resultant payload.
You can use the Scatter-Gather component to perform parallel calls and aggregate the results using DataWeave. See the documentation at https://docs.mulesoft.com/mule-runtime/4.1/scatter-gather-concept
Note that it works for a fixed number of parallel routes, not for a dynamic number of routes. I don't think there is a way to do a dynamic number of routes in Mule 4. If you are interested in that, you would have to implement it yourself in custom Java or scripting code.