How to modify the Java delegate class of an activity in a Camunda BPMN at run time

I have an existing Camunda BPMN flow that I want to reuse.
For example, the flow goes:
activity1 -> activity2 -> activity3 -> activity4
I have to reuse this flow, but instead of activity3 I have to call my activity3.1 version of the Java class.
I tried using the Camunda BPMN model API to get the service task and modify it as below, but it didn't work:
// Look up the service task in the parsed BPMN model
ServiceTask serviceTask = modelInstance.getModelElementById(s.getId());
ExtensionElements extensionElements = serviceTask.getExtensionElements();
// Grab the camunda:inputOutput extension and overwrite the first input parameter
CamundaInputOutput inputOutput = extensionElements.getElementsQuery().filterByType(CamundaInputOutput.class).singleResult();
System.out.println(inputOutput.getCamundaInputParameters().iterator().next().getTextContent());
inputOutput.getCamundaInputParameters().iterator().next().setTextContent("com.test.activity3.1");
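
If the intent is to swap the delegate class itself (the camunda:class attribute) rather than an input parameter, the model API exposes that attribute directly on ServiceTask. A minimal sketch, assuming access to the RepositoryService and that the modified model is redeployed (editing the parsed model does not change the already-deployed definition); "Activity31" is an illustrative name, since "com.test.activity3.1" is not a valid Java class name (a package segment cannot start with a digit):

import org.camunda.bpm.model.bpmn.BpmnModelInstance;
import org.camunda.bpm.model.bpmn.instance.ServiceTask;

// Work on a copy of the cached model, swap the delegate class, and redeploy
BpmnModelInstance modified = (BpmnModelInstance) modelInstance.clone();
ServiceTask serviceTask = modified.getModelElementById(s.getId());
serviceTask.setCamundaClass("com.test.Activity31"); // illustrative class name
repositoryService.createDeployment()
    .addModelInstance("process.bpmn", modified)
    .deploy();

Alternatively, model the task once with camunda:delegateExpression="${activity3Delegate}" and bind that expression to a different bean per deployment; then no model surgery is needed at run time.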

Related

Camunda: Set Assignee to all UserTasks of the process instance

I have a requirement where I need to set assignees on all the "user-tasks" in a process instance as soon as the instance is created, based on the candidate group set on each user task.
I tried getting the user tasks using this:
Collection<UserTask> userTasks = execution.getBpmnModelInstance().getModelElementsByType(UserTask.class);
which is correct in a way, but I am not able to set the assignees. Also, it looks like this would apply to the process definition itself and not the process instance.
Secondly, I tried getting them from the TaskQuery, which gives me only the next task and not all the user tasks inside the process.
Please help!
It does not work that way. A process flow can be simplified to "a token moves through the BPMN diagram" ... only the current position of the token is relevant. So naturally, the task list only gives you the current task, not what could happen afterwards ... which you cannot know, because what if a gateway continues differently based on the task outcome? So drop playing with the BPMN meta model. Focus on the runtime.
You have two choices to dynamically assign user tasks:
1.) In the modeler, instead of hard-assigning the task to "a-user", use an expression like ${taskAssignment.assignTask(task)}, where "taskAssignment" is a bean with a method that returns the user as a String.
2.) Add a TaskListener on the "create" event to the task and set the assignee in the listener (see the sketch below).
For option 2 you can use the Camunda Spring Boot events (or the (outdated) camunda-bpm-reactor extension) to register one central component rather than adding a listener to every task.
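
A minimal sketch of option 2, assuming a plain TaskListener registered on the "create" event; CandidateGroupAssigner and resolveUserForGroup are illustrative names, and the group-to-user lookup is whatever your requirement dictates:

import org.camunda.bpm.engine.delegate.DelegateTask;
import org.camunda.bpm.engine.delegate.TaskListener;
import org.camunda.bpm.engine.task.IdentityLink;

public class CandidateGroupAssigner implements TaskListener {

    @Override
    public void notify(DelegateTask delegateTask) {
        // Pick the first candidate group configured on the task in the model
        for (IdentityLink link : delegateTask.getCandidates()) {
            if (link.getGroupId() != null) {
                delegateTask.setAssignee(resolveUserForGroup(link.getGroupId()));
                return;
            }
        }
    }

    private String resolveUserForGroup(String groupId) {
        // Hypothetical lookup: map the candidate group to a concrete user
        // (LDAP, database, round-robin, ...)
        return "demo";
    }
}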

Parallel Tasks in Data Factory Custom Activity (ADF V2)

I am running a Custom activity in ADF v2 using Batch Service. Whenever it runs it only creates one CloudTask within my Batch job, although I have more than two dozen Parallel.Invoke calls running. Is there a way I can create multiple tasks from one Custom Activity in ADF so that the processing can spread across all nodes in the Batch pool?
I have a fixed pool with two nodes. Max tasks per node is also set to 8, and the scheduling policy is set to "Spread". I have only one Custom activity in my pipeline, with multiple Parallel.Invoke calls (almost two dozen). I was hoping this would create multiple CloudTasks spread across both of my nodes, as both nodes are single-core. It looks like each Custom Activity run in ADF creates only one task (CloudTask) for the Batch service.
My other hope was to use
https://learn.microsoft.com/en-us/azure/batch/tutorial-parallel-dotnet
and manually create CloudTasks in a console application (creating multiple tasks programmatically), then run that console application from the ADF Custom Activity. But CloudTask takes a job ID and a command line. I wanted to do something like the following, but instead of passing taskCommandLine I wanted to pass a C# method name and parameters to execute:
string taskId = "task" + i.ToString().PadLeft(3, '0');
string taskCommandLine = "ping -n " + rand.Next(minPings, maxPings + 1).ToString() + " localhost";
CloudTask task = new CloudTask(taskId, taskCommandLine);
// Wanted to do: CloudTask task = new CloudTask(taskId, SomeMethod(args));
tasks.Add(task);
Also, it looks like we can't create CloudTasks by using the .NET API for Batch within a Custom Activity of ADF.
What did I want to achieve?
I have data in a SQL Server table and I want to run different transformations on it by slicing it horizontally or vertically (by picking rows or columns). I want to run those transformations in parallel (I want multiple CloudTask instances so that each one can operate on a specific column independently and, after the transformation, load it into a different table). But the issue is that it looks like we can't use the .NET Batch Service API within ADF, and the only way seems to be having multiple Custom Activities in the Data Factory pipeline.
The application needs to be deployed on each and every node within the Batch pool, and CloudTasks need to be created by calling the application with a command line:
CloudTask task = new CloudTask(
    "MyTask",
    "cmd /c %AZ_BATCH_APP_PACKAGE_MyTask%\\myTask.exe -args -here");

Nexus 3 Repository Manager Create (Or Run Pre-generated) Task Without Using User Interface

This question arose when I was trying to reboot my Nexus 3 container on a weekly schedule and connect it to an S3 bucket I have. I have the container set up to connect to the S3 bucket just fine (it creates a new [A-Z,0-9]-metrics.properties file each time), but the previous artifacts are not found when looking through the UI.
I used the Repair - Reconcile component database from blob store task from the UI settings and it works great!
But... all the previous steps are done automatically through scripts, and I would like the same for the final step of reconciling the blob store.
Connecting to the S3 blob store is done with reference to examples from nexus-book-examples, as below:
Map<String, String> config = new HashMap<>()
config.put("bucket", "nexus-artifact-storage")
blobStore.createS3BlobStore('nexus-artifact-storage', config)
AWS credentials are provided during the docker run step, so the above is all that is needed for the blob store setup. It is called by a modified version of provision.sh, a script from the nexus-book-examples git page.
Is there a way to either:
Create a task with a Groovy script? Or,
Reference one of the task types and run the task that way with a POST?
Depending on the specific version of Repository Manager that you are using, there may be REST endpoints for listing and running scheduled tasks. This was introduced in 3.6.0 according to this ticket: https://issues.sonatype.org/browse/NEXUS-11935. For more information about the REST integration in 3.x, check out the following: https://help.sonatype.com/display/NXRM3/Tasks+API
For creating a scheduled task, you will have to add some Groovy code. Perhaps the following would be a good start:
import org.sonatype.nexus.scheduling.TaskConfiguration
import org.sonatype.nexus.scheduling.TaskInfo
import org.sonatype.nexus.scheduling.TaskScheduler
import groovy.json.JsonOutput
import groovy.json.JsonSlurper

// Value object describing the task to create; populated from the JSON args
class TaskXO {
    String typeId
    Boolean enabled
    String name
    String alertEmail
    Map<String, String> properties
}

// 'args' is the JSON payload passed to the script
TaskXO task = new JsonSlurper().parseText(args)

TaskScheduler scheduler = container.lookup(TaskScheduler.class.name)
TaskConfiguration config = scheduler.createTaskConfigurationInstance(task.typeId)
config.enabled = task.enabled
config.name = task.name
config.alertEmail = task.alertEmail
task.properties?.each { key, value -> config.setString(key, value) }

// Schedule with a manual trigger; the task can then be run on demand
TaskInfo taskInfo = scheduler.scheduleTask(config, scheduler.scheduleFactory.manual())
JsonOutput.toJson(taskInfo)
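
For the reconcile use case specifically, the script would then be invoked with a JSON payload along the lines of {"typeId": "blobstore.rebuildComponentDB", "enabled": true, "name": "reconcile-s3", "properties": {...}}. The typeId and property names here are a guess; check an existing task of that type in the UI (or via the Tasks API above) for the exact values in your version.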

Mule ESB: How to call a flow inside a DataMapper script

I have a DataMapper (source: POJO, target: CSV), and I need to call another flow (or Groovy) inside the DataMapper. I am stuck on passing parameters to the flow. For example, I don't want the entire payload to go to the flow for validation; I need to pass only two values. I used
flowRef(String,Object)
output.Item = flowRef("sampletestFlow",input.Model);
It works fine for a single payload. But I have to pass one more parameter (called input.Policy). I know we have to use
flowRef(String,Object,Map)
but I don't know the format for two input parameters.
Could anyone please help me with this?
I have handled the scenario in the following way: I created a Java class and called it via the DataMapper script. Below is the code inside the DataMapper script that calls the Java code.
stringUtil = new com.test.util.StringUtil();
output.style = stringUtil.formatValue(input.RuleStyle);
Hope this helps.
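
Note that com.test.util.StringUtil above is user code, not a Mule API. A minimal sketch of such a helper, with an extra overload so both values (input.RuleStyle and input.Policy) can be handled together without sending the whole payload to a flow; the formatting logic is purely illustrative:

package com.test.util;

// Plain helper invoked from the DataMapper script; names mirror the snippet
// above, but the validation/formatting logic is only an example.
public class StringUtil {

    // Single-value variant, as used in the script above
    public String formatValue(String ruleStyle) {
        return ruleStyle == null ? "" : ruleStyle.trim();
    }

    // Two-value variant: pass input.RuleStyle and input.Policy together
    public String formatValue(String ruleStyle, String policy) {
        if (policy == null || policy.isEmpty()) {
            return formatValue(ruleStyle);
        }
        return formatValue(ruleStyle) + "-" + policy.trim();
    }
}

From the DataMapper script this would be called as output.style = stringUtil.formatValue(input.RuleStyle, input.Policy);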

SoapUI Service mocking - How to transfer values between mock responses?

Is there a way to transfer values generated by a Groovy script in the mockResponse1 context to a project-scoped property? Then I would like to use this value in another scripted mockResponse2.
I can transfer values from Request1 to Request2 (client side), but I can't seem to figure out how to do it for mock responses (server side).
Model:
mockResponse1.someVar -> project.Property -> mockResponse2.someOtherVar
I found the solution. This works from a mockResponse script:
// get project scoped property
def a = mockResponse.mockOperation.mockService.project.getPropertyValue("someProjectProperty")
// set project scoped property
mockResponse.mockOperation.mockService.project.setPropertyValue("someProjectProperty", someVar)
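As a usage note: once the value sits in a project-scoped property, it can also be consumed without a script, since SoapUI expands properties such as ${#Project#someProjectProperty} inside mock response content as well.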