Asynchronously start only one Task to process a static Queue, stopping when it's done - .net-4.0

Basically I have a static custom queue of objects I want to process. From multiple threads, I need to kick off a singular Task that will process the queued objects, stopping the task when all items are dequeued.
Some pseudo code:
static CustomQueue _customqueue;
static Task _processQueuedItems;

public static void EnqueueSomething(object something) {
    _customqueue.Enqueue(something);
    StartProcessingQueue();
}

static void StartProcessingQueue() {
    if (_processQueuedItems == null) {
        _processQueuedItems = new Task(() => {
            while (_customqueue.Any()) {
                var stuffToDequeue = _customqueue.Dequeue();
                /* do stuff */
            }
        });
        _processQueuedItems.Start();
    }

    if (_processQueuedItems.Status != TaskStatus.Running) {
        _processQueuedItems.Start();
    }
}
If it makes a difference, my custom queue essentially holds items until they reach a certain age and only then allows them to be dequeued. Every time an item is touched, its timer starts again. I know this piece works fine.
The part I'm struggling with is the parallelism. (Clearly, I don't know what I'm doing here.) What I want is to have one thread process the queue until it's complete, then go away. If another call comes in, it shouldn't start a new thread unless it has to.
I hope that explains my issue okay.

You might want to consider using BlockingCollection<T> here. You could make your custom queue implement IProducerConsumerCollection<T>, in which case BC could use it directly.
You'd then just need to start a long-running Task that calls blockingCollection.GetConsumingEnumerable() and processes the items in a foreach. The task will automatically block when the collection is empty and resume when a new item is added.
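A minimal sketch of that approach, assuming a plain BlockingCollection<object> rather than the custom age-based queue from the question:

using System.Collections.Concurrent;
using System.Threading.Tasks;

static class QueueProcessor
{
    static readonly BlockingCollection<object> _items = new BlockingCollection<object>();

    // Start once at application startup; the foreach blocks whenever the
    // collection is empty and resumes as soon as an item is added.
    public static Task StartProcessing()
    {
        return Task.Factory.StartNew(() =>
        {
            foreach (var item in _items.GetConsumingEnumerable())
            {
                /* do stuff with item */
            }
        }, TaskCreationOptions.LongRunning);
    }

    // Safe to call from any number of threads.
    public static void EnqueueSomething(object something)
    {
        _items.Add(something);
    }
}

The consumer loop only ends when CompleteAdding() is called on the collection, so there is no need to start and stop tasks as the queue empties and refills.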

Custom command to go back in a process instance (execution)

I have a process with 3 sequential user tasks (something like Task 1 -> Task 2 -> Task 3). So, to validate Task 3, I first have to validate Task 1 and then Task 2.
My goal is to implement a workaround to go back in the execution of a process instance using a Command, as suggested in this link. The problem is that I started to implement the command but it does not work as I want. The algorithm should be something like:
Retrieve the task with the passed id
Get the process instance of this task
Get the historic tasks of the process instance
From the list of the historic tasks, deduce the previous one
Create a new task from the previous historic task
Make the execution point to this new task
Maybe clean up the task that was pointed to before the update
So, the code of my command looks like this:
public class MoveTokenCmd implements Command<Void> {

    protected String fromTaskId = "20918";

    public MoveTokenCmd() {
    }

    public Void execute(CommandContext commandContext) {
        HistoricTaskInstanceEntity currentUserTaskEntity = commandContext.getHistoricTaskInstanceEntityManager()
                .findHistoricTaskInstanceById(fromTaskId);
        ExecutionEntity currentExecution = commandContext.getExecutionEntityManager()
                .findExecutionById(currentUserTaskEntity.getExecutionId());

        // Get process instance
        HistoricProcessInstanceEntity historicProcessInstanceEntity = commandContext
                .getHistoricProcessInstanceEntityManager()
                .findHistoricProcessInstance(currentUserTaskEntity.getProcessInstanceId());

        HistoricTaskInstanceQueryImpl historicTaskInstanceQuery = new HistoricTaskInstanceQueryImpl();
        historicTaskInstanceQuery.processInstanceId(historicProcessInstanceEntity.getId()).orderByExecutionId().desc();
        List<HistoricTaskInstance> historicTaskInstances = commandContext.getHistoricTaskInstanceEntityManager()
                .findHistoricTaskInstancesByQueryCriteria(historicTaskInstanceQuery);

        int index = 0;
        for (HistoricTaskInstance historicTaskInstance : historicTaskInstances) {
            if (historicTaskInstance.getId().equals(currentUserTaskEntity.getId())) {
                break;
            }
            index++;
        }

        if (index > 0) {
            HistoricTaskInstance previousTask = historicTaskInstances.get(index - 1);
            TaskEntity newTaskEntity = createTaskFromHistoricTask(previousTask, commandContext);
            currentExecution.addTask(newTaskEntity);
            commandContext.getTaskEntityManager().insert(newTaskEntity);
            AtomicOperation.TRANSITION_CREATE_SCOPE.execute(currentExecution);
        } else {
            // TODO: find the last task of the previous process instance
        }

        // To overcome the "Task cannot be deleted because is part of a running process"
        TaskEntity currentUserTask = commandContext.getTaskEntityManager().findTaskById(fromTaskId);
        if (currentUserTask != null) {
            currentUserTask.setExecutionId(null);
            commandContext.getTaskEntityManager().deleteTask(currentUserTask, "jumped to another task", true);
        }

        return null;
    }

    private TaskEntity createTaskFromHistoricTask(HistoricTaskInstance historicTaskInstance,
            CommandContext commandContext) {
        TaskEntity newTaskEntity = new TaskEntity();
        newTaskEntity.setProcessDefinitionId(historicTaskInstance.getProcessDefinitionId());
        newTaskEntity.setName(historicTaskInstance.getName());
        newTaskEntity.setTaskDefinitionKey(historicTaskInstance.getTaskDefinitionKey());
        newTaskEntity.setProcessInstanceId(historicTaskInstance.getExecutionId());
        newTaskEntity.setExecutionId(historicTaskInstance.getExecutionId());
        return newTaskEntity;
    }
}
But the problem is that I can see my task is created, yet the execution does not point to it but to the current one.
I had the idea of using the activity (via the ActivityImpl object) and setting it on the execution, but I don't know how to retrieve the activity of my new task.
Can someone help me, please?
Unless something has changed significantly in the engine, the code in the link you reference should still work (I have used it on a number of projects).
That said, when scanning your code I don't see the most important command.
Once you have the current execution, you can move the token by setting the current activity.
Like I said, the code in the referenced article used to work and still should.
Greg
Referring to the same link in your question, I would personally recommend working with the design of your process: use an exclusive gateway to decide whether the process should end or be returned to the previous task. If the generation of the task is dynamic, you can point to the same task and delete the local variable. Activiti has constructs that save you the time of implementing this yourself :).

Is it better to use the Bus Start method or a class constructor to instantiate objects used by a service

I'm using nServiceBus 5 and have created a number of host endpoints, two of which listen for database changes. (The specifics of how to do this can be found here). The intention is to have a service running in the background which publishes an event message using the Bus when notified to do so by the database listener.
The code which creates the database listener object and handles events is in the Start method, implemented as part of IWantToRunWhenBusStartsAndStops.
So: is putting the code here likely to cause problems later on, for example if an exception is thrown (yes, I do have try/catch blocks, but I removed them from the sample code for clarity)? What happens when the Start method finishes executing?
Would I be better off with a constructor on my RequestNewQuoteSender class to instantiate the database listener as a class property and not use the Start method at all?
namespace MySample.QuoteRequest
{
    public partial class RequestNewQuoteSender : IWantToRunWhenBusStartsAndStops
    {
        public void Start()
        {
            var changeListener = new DatabaseChangeListener(_ConnectionString);

            // Assign the code within the braces to the DBListener's OnChange event
            changeListener.OnChange += () =>
            {
                // code to handle database change event
                changeListener.Start(_SQLStatement);
            };

            // Now everything has been set up... start it running.
            changeListener.Start(_SQLStatement);
        }

        public void Stop() { LogInfo("Service Bus has stopped"); }
    }
}
Your code seems fine to me.
Just a few small things:
Make changeListener a class field, so that it won't be garbage collected (not 100% sure that it would be, but just to make sure);
Unsubscribe from OnChange in the Stop() method;
You may also want a lock around changeListener.Start(_SQLStatement) and the Stop, so that there are no race conditions (I leave it up to you to figure out whether you need it);
Does this make sense?
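Putting those suggestions together, a minimal sketch (DatabaseChangeListener, _ConnectionString, _SQLStatement and LogInfo are assumed to exist as in the question, and OnChange is assumed to take a parameterless delegate; whether you really need the lock is, as above, up to you):

public partial class RequestNewQuoteSender : IWantToRunWhenBusStartsAndStops
{
    private readonly object _sync = new object();
    private DatabaseChangeListener _changeListener; // field, so it can't be collected while the endpoint runs

    public void Start()
    {
        lock (_sync)
        {
            _changeListener = new DatabaseChangeListener(_ConnectionString);
            _changeListener.OnChange += HandleDatabaseChange;
            _changeListener.Start(_SQLStatement);
        }
    }

    public void Stop()
    {
        lock (_sync)
        {
            if (_changeListener != null)
            {
                _changeListener.OnChange -= HandleDatabaseChange; // unsubscribe on Stop
                _changeListener = null;
            }
        }
        LogInfo("Service Bus has stopped");
    }

    private void HandleDatabaseChange()
    {
        // handle the database change, publish the event on the bus,
        // then re-arm the listener as the original code did
        _changeListener.Start(_SQLStatement);
    }
}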

Async WCF Service with multiple async calls inside

I have a WCF web service that consumes some external web services. What I want to do is make this service asynchronous in order to release the thread, wait for the completion of all the external services, and then return the result to the client.
With Framework 4.0
public class MyService : IMyService
{
    public IAsyncResult BeginDoWork(int count, AsyncCallback callback, object serviceState)
    {
        var proxyOne = new Gateway.BackendOperation.BackendOperationOneSoapClient();
        var proxyTwo = new Gateway.BackendOperationTwo.OperationTwoSoapClient();

        var taskOne = Task<int>.Factory.FromAsync(proxyOne.BeginGetNumber, proxyOne.EndGetNumber, 10, serviceState);
        var taskTwo = Task<int>.Factory.FromAsync(proxyTwo.BeginGetNumber, proxyTwo.EndGetNumber, 10, serviceState);

        var tasks = new Queue<Task<int>>();
        tasks.Enqueue(taskOne);
        tasks.Enqueue(taskTwo);

        return Task.Factory.ContinueWhenAll(tasks.ToArray(), innerTasks =>
        {
            var tcs = new TaskCompletionSource<int>(serviceState);
            int sum = 0;

            foreach (var innerTask in innerTasks)
            {
                if (innerTask.IsFaulted)
                {
                    tcs.SetException(innerTask.Exception);
                    callback(tcs.Task);
                    return;
                }

                if (innerTask.IsCompleted)
                {
                    sum = innerTask.Result;
                }
            }

            tcs.SetResult(sum);
            callback(tcs.Task);
        });
    }

    public int EndDoWork(IAsyncResult result)
    {
        try
        {
            return ((Task<int>)result).Result;
        }
        catch (AggregateException ex)
        {
            throw ex.InnerException;
        }
    }
}
My questions here are:
1. This code is using three threads: one that is instanced in BeginDoWork, another that is instanced when the code enters the anonymous method in ContinueWhenAll, and the last one when the callback is executed, in this case EndDoWork. Is that correct, or am I doing something wrong in the calls? Should I use any synchronization context? Note: the synchronization context is null on the main thread.
2. What happens if I "share" information between threads, for instance in the callback function? Will that cause a performance issue, or is the anonymous method like a closure where I can share data?
With Framework 4.5 and Async and Await
Now with Framework 4.5, the code looks much simpler than before:
public async Task<int> DoWorkAsync(int count)
{
    var proxyOne = new Backend.ServiceOne.ServiceOneClient();
    var proxyTwo = new Backend.ServiceTwo.ServiceTwoClient();

    var doWorkOne = proxyOne.DoWorkAsync(count);
    var doWorkTwo = proxyTwo.DoWorkAsync(count);

    var result = await Task.WhenAll(doWorkOne, doWorkTwo);

    return doWorkOne.Result + doWorkTwo.Result;
}
But in this case, when I debug the application I always see that the code is executed on the same thread. So my questions here are:
3. When I'm awaiting the "awaitable" code, is that thread released so it goes back to the thread pool to execute more requests?
3.1. If so, I suppose that when I get a result from the awaited Task, the execution completes on the same thread as before. Is that possible? What happens if that thread is processing another request?
3.2. If not, how can I release the thread and send it back to the thread pool with the async and await pattern?
Thank you!
1. This code is using three threads: one that is instanced in BeginDoWork, another that is instanced when the code enters the anonymous method in ContinueWhenAll, and the last one when the callback is executed, in this case EndDoWork. Is that correct, or am I doing something wrong in the calls? Should I use any synchronization context?
It's better to think in terms of "tasks" rather than "threads". You do have three tasks here, each of which will run on the thread pool, one at a time.
2. What happens if I "share" information between threads, for instance in the callback function? Will that cause a performance issue, or is the anonymous method like a closure where I can share data?
You don't have to worry about synchronization because these tasks can't run concurrently with each other. BeginDoWork registers the continuation just before returning, so it's already practically done by the time the continuation can run. EndDoWork will probably not be called until the continuation is complete; but even if it is, it will block until the continuation is complete.
(Technically, the continuation can start running before BeginDoWork completes, but BeginDoWork just returns at that point, so it doesn't matter).
3. When I'm awaiting the "awaitable" code, is that thread released so it goes back to the thread pool to execute more requests?
Yes.
3.1. If so, I suppose that when I get a result from the awaited Task, the execution completes on the same thread as before. Is that possible? What happens if that thread is processing another request?
No. Your host (in this case, ASP.NET) may continue the async methods on any thread it happens to have available.
This is perfectly safe because only one thread is executing at a time.
P.S. I recommend
var result = await Task.WhenAll(doWorkOne, doWorkTwo);
return result[0] + result[1];
instead of
var result = await Task.WhenAll(doWorkOne, doWorkTwo);
return doWorkOne.Result + doWorkTwo.Result;
because Task.Result should be avoided in async programming.

Why isn't this transaction isolated?

I have a few methods - a couple of calls to SQL Server and some business logic to generate a unique value. These methods are all contained inside a parent method:
GenerateUniqueValue()
{
    //1. Call to db for last value
    //2. Business logic to create new value
    //3. Update db with new value created
}
I want the call to GenerateUniqueValue to be isolated, i.e. when two clients call it simultaneously, the second client must wait for the first one to finish.
Originally, I made my service a singleton; however, I have to anticipate future changes that may include load balancing, so I believe a singleton approach is out. Next I decided to try the transaction approach by decorating my service:
[ServiceBehavior(TransactionIsolationLevel = IsolationLevel.Serializable, TransactionTimeout = "00:00:30")]
And my GenerateUniqueValue with:
[OperationBehavior(TransactionScopeRequired = true)]
The problem is that a test of simultaneous hits to the service method results in an error:
"System.ServiceModel.ProtocolException: The transaction under which this method call was executing was asynchronously aborted."
Here is my client test code:
private static void Main(string[] args)
{
    List<Client> clients = new List<Client>();

    for (int i = 1; i < 20; i++)
    {
        clients.Add(new Client());
    }

    foreach (var client in clients)
    {
        Thread thread = new Thread(new ThreadStart(client.GenerateUniqueValue));
        thread.Start();
    }

    Console.ReadLine();
}
If the transaction is supposed to be isolated, why are multiple threads calling out to the method clashing?
A transaction is for treating multiple actions as a single atomic action. So if you want to make the second thread wait for the first thread's completion, you have to deal with concurrency, not transactions.
Try using the System.ServiceModel.ServiceBehaviorAttribute.ConcurrencyMode property with the Single or Reentrant concurrency modes. I guess that's what you are expecting.
[ServiceBehavior(ConcurrencyMode=ConcurrencyMode.Reentrant)]
I guess you got the exception because IsolationLevel.Serializable lets the second thread read the volatile data but does not let it change it. You are perhaps doing some change operation which is not permitted with this isolation level.
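For illustration, a minimal sketch of that suggestion applied to the service from the question (the interface name, return type and method body are assumptions). Note that ConcurrencyMode only governs calls within a single service instance, so the sketch also sets InstanceContextMode.Single so that calls from all clients queue up on one host:

using System.ServiceModel;
using System.Transactions;

[ServiceBehavior(ConcurrencyMode = ConcurrencyMode.Single,
                 InstanceContextMode = InstanceContextMode.Single,
                 TransactionIsolationLevel = IsolationLevel.Serializable,
                 TransactionTimeout = "00:00:30")]
public class UniqueValueService : IUniqueValueService
{
    [OperationBehavior(TransactionScopeRequired = true)]
    public string GenerateUniqueValue()
    {
        // 1. Call to db for last value
        // 2. Business logic to create new value
        // 3. Update db with new value created
        // With ConcurrencyMode.Single, only one call executes at a time,
        // so this read-modify-write sequence cannot interleave.
        return null;
    }
}

Keep in mind this only serializes callers on a single host; once the service is load balanced across machines, uniqueness still has to be enforced in the database itself.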

.NET 4.0 Threading.Tasks

I've recently started working on a new application which will utilize task parallelism. I have just begun writing a tasking framework, but have recently seen a number of posts on SO regarding the new System.Threading.Tasks namespace which may be useful to me (and I would rather use an existing framework than roll my own).
However, looking over MSDN I haven't seen how / if I can implement the functionality I'm looking for:
Dependency on other tasks completing.
Able to wait on an unknown number of tasks performing the same action (maybe wrapped in the same task object which is invoked multiple times)
Set a maximum number of concurrent instances of a task; since they use a shared resource, there is no point running more than one at once
Hint at priority, or have the scheduler place tasks with lower maximum concurrent instances at a higher priority (so as to keep said resource in use as much as possible)
Edit: ability to vary the priority of tasks which are performing the same action (pretty poor example, but PredictWeather(Tomorrow) would have a higher priority than PredictWeather(NextWeek))
Can someone point me towards an example / tell me how I can achieve this? Cheers.
C# Use Case (typed in SO, so please forgive any syntax errors / typos):
Note: Do() / DoAfter() shouldn't block the calling thread.
class Application
{
    Task LeafTask = new Task(LeafWork) { PriorityHint = High, MaxConcurrent = 1 };
    var Tree = new TaskTree(LeafTask);
    Task TraverseTask = new Task(Tree.Traverse);
    Task WorkTask = new Task(MoreWork);
    Task RunTask = new Task(Run);
    Object SharedLeafWorkObject = new Object();

    void Entry()
    {
        RunTask.Do();
        RunTask.Join(); // Use this thread for task processing until all invocations of RunTask are complete
    }

    void Run()
    {
        TraverseTask.Do();
        // Wait for TraverseTask to make sure all leaf tasks are invoked before waiting on them
        WorkTask.DoAfter(new[] { TraverseTask, LeafTask });

        if (running)
        {
            RunTask.DoAfter(WorkTask); // Keep at least one RunTask alive to prevent Join from 'unblocking'
        }
        else
        {
            TraverseTask.Join();
            WorkTask.Join();
        }
    }

    void LeafWork(Object leaf)
    {
        lock (SharedLeafWorkObject) // Fake a shared resource
        {
            Thread.Sleep(200); // 'work'
        }
    }

    void MoreWork()
    {
        Thread.Sleep(2000); // this one takes a while
    }
}

class TaskTreeNode<TItem>
{
    Task LeafTask; // = Application::LeafTask
    TItem Item;

    void Traverse()
    {
        if (isLeaf)
        {
            // LeafTask set in C-Tor or elsewhere
            LeafTask.Do(this.Item);
            //Edit
            //LeafTask.Do(this.Item, this.Depth); // Deeper items get higher priority
            return;
        }

        foreach (var child in this.children)
        {
            child.Traverse();
        }
    }
}
There are numerous examples here:
http://code.msdn.microsoft.com/ParExtSamples
There's a great white paper which covers a lot of the details you mention above:
"Patterns for Parallel Programming: Understanding and Applying Parallel Patterns with the .NET Framework 4"
http://www.microsoft.com/downloads/details.aspx?FamilyID=86b3d32b-ad26-4bb8-a3ae-c1637026c3ee&displaylang=en
Off the top of my head I think you can do all the things you list in your question.
Dependencies etc: Task.WaitAll(Task[] tasks)
Scheduler: The library supports numerous options for limiting the number of threads in use, and supports providing your own scheduler. I would avoid altering the priority of threads if at all possible; this is likely to have a negative impact on the scheduler unless you provide your own.
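For illustration, a minimal sketch (plain Task.Factory calls, not the Do()/DoAfter() API from the question) of the dependency and "wait on an unknown number of tasks" requirements on .NET 4.0:

using System;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

class Sketch
{
    static void Main()
    {
        // Wait on an unknown number of tasks performing the same action:
        // start however many leaf tasks you end up with, collect them, then wait on the array.
        Task[] leafTasks = Enumerable.Range(0, 5)
            .Select(i => Task.Factory.StartNew(() => LeafWork(i)))
            .ToArray();

        // Dependency on other tasks completing: MoreWork runs only after
        // every leaf task has finished.
        Task workTask = Task.Factory.ContinueWhenAll(leafTasks, _ => MoreWork());

        // Block the calling thread until the whole chain is done.
        workTask.Wait();
    }

    static void LeafWork(int item)
    {
        Thread.Sleep(200); // 'work' against the shared resource
    }

    static void MoreWork()
    {
        Thread.Sleep(2000); // this one takes a while
    }
}

Capping the number of concurrent leaf tasks is the part that needs a custom TaskScheduler; the ParExtSamples linked above include a LimitedConcurrencyLevelTaskScheduler for exactly that purpose.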