Dividing work into multiple threads - vb.net

I've read a lot of different questions on SO about multithreaded applications and how to split work up between them, but none really seem to fit what I need for this. Here's how my program currently basically works:
Module Module1
    'String X is declared out here.

    Sub Main()
        'Start a given number of threads running Main2().
    End Sub

    Sub Main2()
        'Loops forever.
        'Call X = nextvalue(X), display info as needed.
    End Sub

    Function nextvalue(ByVal Y As String) As String
        'Determines the next Y in the sequence.
    End Function
End Module
This is only a rough outline of what actually happens in my code by the way.
My problem is that when multiple threads are running Main2(), they're all working with the same X value as the other threads. The loop inside Main2 executes multiple times per millisecond, so I can't just stagger the loops. Work often ends up being duplicated.
How can I properly divide up the work so that the two threads running simultaneously never have the same work to run?

You should synchronize the generation and storage of X so that the composite operation appears atomic to all threads.
Module Module1

    Private X As String
    Private LockObj As New Object()

    Private Sub Main2()
        Do While True
            'This will hold a snapshot of X that can be used safely by the current thread.
            Dim copy As String

            'Generate and store the next value atomically.
            SyncLock LockObj
                X = nextValue(X)
                copy = X
            End SyncLock

            'Now you can perform operations against the local copy.
            'Do not access X outside of the lock above.
            Console.WriteLine(copy)
        Loop
    End Sub

End Module

A thread manager is required to manage the threads and the work that they do. Say it is desirable to split up the work into 10 threads.
Start the manager
Manager creates 10 threads
Assign work to the manager (queue up the work, let's say it queues up 10000 work items)
Manager assigns a work item to complete for each of the 10 threads.
As threads finish their work, they report back to the manager that they are done and receive another work item. The queue of work should be thread-safe so that items can be enqueued and dequeued safely. The manager handles the management of work items; the threads just execute the work.
Once this is in place, work items should never be duplicated amongst threads.
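A minimal sketch of that pattern, assuming .NET 4 or later, with a ConcurrentQueue(Of Integer) standing in for the thread-safe work queue and plain integers standing in for the work items:
Imports System.Collections.Concurrent
Imports System.Collections.Generic
Imports System.Linq
Imports System.Threading

Module WorkManagerSketch

    'Thread-safe queue holding the work items (plain integers here, standing in for real work).
    Private ReadOnly WorkQueue As New ConcurrentQueue(Of Integer)(Enumerable.Range(1, 10000))

    Sub Main()
        Dim workers As New List(Of Thread)

        'The manager creates 10 worker threads.
        For i = 1 To 10
            Dim t As New Thread(AddressOf Worker)
            t.Start()
            workers.Add(t)
        Next

        'Wait for the workers to drain the queue.
        For Each worker As Thread In workers
            worker.Join()
        Next
    End Sub

    Private Sub Worker()
        Dim item As Integer

        'Each thread pulls its next item from the shared queue; a given
        'item can only ever be dequeued by one thread.
        Do While WorkQueue.TryDequeue(item)
            'Do the actual work with the item here.
        Loop
    End Sub

End Module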

Use a lock so that only one thread can access X at a time. Once one thread is done with it, another thread is able to use it. This will prevent two threads from calling nextvalue(X) with the same value.

Related

VB.net - 5 threads reading/writing to the same variable

I have an app that uses 5 concurrent threads to perform tasks. The threads will need to read from a list of items and pick the next available one. As soon as they've done so, they need to add one to the counter so that the next thread is able to pick the next one. I understand I will need to use something like BlockingCollection to do this so that two threads don't end up picking the same number and then both incrementing it by one.
I'm a little stuck as to how this will work. I have declared my new BlockingCollection object but I'm not sure where to proceed from here. Any help is much appreciated. Thanks.
It sounds to me like you should be using a ConcurrentQueue(Of T). The whole point of a queue is that you pick the next item off the front, so if you use a queue data structure then there's no incrementing of any counter required. On top of the functionality provided by the Queue(Of T) class, the ConcurrentQueue(Of T) class is also thread-safe, which sounds like exactly what you need. Just call TryDequeue each time you want an item and it will return False when there are no more.
Try the following in a new Console Application project to see the principle in action:
Imports System.Collections.Concurrent
Imports System.Threading

Module Module1

    'A thread-safe queue containing numbers from 1 to 100.
    Private numbers As New ConcurrentQueue(Of Integer)(Enumerable.Range(1, 100))

    'Random number generator.
    Private rng As New Random

    Sub Main()
        'Read the queued numbers using five threads.
        For i = 1 To 5
            Call New Thread(AddressOf DisplayNumbers).Start()
        Next

        Console.ReadLine()
    End Sub

    Private Sub DisplayNumbers()
        Dim number As Integer

        'Try to get the next number until there are no more.
        Do While numbers.TryDequeue(number)
            'Display the number and the thread that read it.
            Console.WriteLine($"Thread: {Thread.CurrentThread.ManagedThreadId}; Number: {number}")

            'Wait a random time period.
            Thread.Sleep(rng.Next(500, 1000))
        Loop
    End Sub

End Module

How to limit the number of processes being spawned at a time?

I am working on a VB.NET Windows Forms application where the user is supposed to be able to determine how many processes the application is allowed to launch at a time.
My current method mostly works but I've noticed that occasionally the application goes over the set amount. I use two global variables for this, _ConcurrentRuns which is 0 at the start of the application, and _MaxConcurrentRuns which is set by the user.
Private _sync As New Object()

'This is called synchronously.
Private Function RunModel() As Boolean
    If CancelExectuion Then Return CancelCleanup()

    Do While True
        SyncLock _sync
            If _ConcurrentRuns < _MaxConcurrentRuns Then
                Interlocked.Increment(_ConcurrentRuns)
                Exit Do
            End If
        End SyncLock
        Threading.Thread.Sleep(50)
    Loop

    'This is what will launch an individual process and close it when finished.
    Dim ret = RunApplication(arg)

    'The process has been closed so we decrement the concurrent runs.
    Interlocked.Decrement(_ConcurrentRuns)

    Return ret
End Function
The goal is to let only one thread exit the while loop at a time. I'm not able to catch it in debug mode; however, in Task Manager it will occasionally go 1-3 processes over what it's supposed to use. This makes me assume that multiple threads are somehow getting inside the SyncLock, but I have no clue how that could be happening.
I will be very grateful for any and all help that can be provided, thanks for taking the time to read my question.
So it appears that my solution works after all. I don't want to delete this question because it might be helpful to somebody else in the future.
Answer: Use better process monitoring software / set priority to high in task manager.
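For anyone with the same problem, a semaphore initialised to the maximum count is a simpler way to cap concurrency than the SyncLock-and-Sleep polling loop, and it is harder to over-count with. A minimal sketch under the same assumptions as the question (RunApplication and arg are the original, unshown members; the semaphore count replaces _ConcurrentRuns/_MaxConcurrentRuns):
Private _runSlots As Threading.Semaphore

'Call once at startup, after the user has chosen the limit.
Private Sub InitRunSlots(ByVal maxConcurrentRuns As Integer)
    _runSlots = New Threading.Semaphore(maxConcurrentRuns, maxConcurrentRuns)
End Sub

Private Function RunModel() As Boolean
    'Blocks until one of the slots is free; no polling loop needed.
    _runSlots.WaitOne()
    Try
        'RunApplication launches the process and waits for it to exit.
        Return RunApplication(arg)
    Finally
        'Always return the slot, even if RunApplication throws.
        _runSlots.Release()
    End Try
End Function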

I need help creating a TaskScheduler to prevent threading overload

I want to add workers into a queue, but only have the first N workers processing in parallel. All samples I find are in C#.
This is probably simple for a programmer, but I'm not one. I know enough about VB to write simple programs.
But my first application runs fine until it suddenly hits 100% CPU and then crashes. Help, please (Yes, I've wasted 5 hours of work time searching before posting this...)
More Context: Performing a recursive inventory of directory structures, files, and permissions across file servers with over 1 million directories/subdirectories.
The process runs serially, but it will take months to complete, and management is already breathing down my neck. When I try using Tasks, it spins up about 1000 threads, hits 100% CPU, stops responding, and then crashes. This is on a 16-core server with 112 GB RAM.
--Added
So, with the sample provided on using Semaphores, this is what I've put in:
Public Class InvDir

    Private mSm As Semaphore

    Public Sub New(ByVal maxPrc As Integer)
        mSm = New Semaphore(maxPrc, maxPrc)
    End Sub

    Public Sub GetInventory(ByVal Path As String, ByRef Totals As Object, ByRef MyData As Object)
        mSm.WaitOne()
        Task.Factory.StartNew(Sub()
                                  Dim CurDir As New IO.DirectoryInfo(Path)
                                  Totals.SubDirectoryCount += CurDir.GetDirectories().Count
                                  Totals.FilesCount += CurDir.GetFiles().Count
                                  For Each CurFile As IO.FileInfo In CurDir.EnumerateFiles()
                                      MyData.AddFile(CurFile.Name, CurFile.Extension, CurFile.FullName, CurFile.Length)
                                  Next
                              End Sub).ContinueWith(Function(x) mSm.Release())
    End Sub

End Class
You're attempting multithreading with disk I/O. It might be getting slower because you're throwing more threads at it. No matter how many threads there are, the disk can physically only seek one position at a time. (In fact, you mentioned that it works serially.)
If you did want to limit the number of concurrent threads, you could use a Semaphore. A semaphore is like a SyncLock, except you can specify how many threads are allowed to execute the code at a time. In the example below, the semaphore allows three threads to execute; any more than that have to wait until one finishes. Some modified code from the MSDN page:
Imports System.Threading

Public Class Example

    'A semaphore that simulates a limited resource pool.
    Private Shared _pool As Semaphore

    <MTAThread> _
    Public Shared Sub Main()
        'Create a semaphore that can satisfy up to three
        'concurrent requests. The full count is available up front,
        'so the first three callers of WaitOne() pass straight through.
        _pool = New Semaphore(3, 3)
    End Sub

    Private Sub SomeWorkerMethod()
        'This is the method that would be called using a Task.
        _pool.WaitOne()
        Try
            'Do whatever.
        Finally
            _pool.Release()
        End Try
    End Sub

End Class
Every new thread must call _pool.WaitOne(). That tells it to wait its turn until there are fewer than three threads executing. Every thread blocks until the semaphore allows it to pass.
Every thread must also call _pool.Release() to let the semaphore know that it can allow the next waiting thread to begin. That's important, even if there's an exception. If threads don't call Release() then the semaphore will just block them forever.
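A sketch of how the workers could be started, e.g. at the end of Sub Main after the semaphore has been created (the instance name, the loop count, and the use of Task.Factory with Imports System.Threading.Tasks are assumptions, not part of the MSDN sample):
'...at the end of Sub Main, after _pool has been created:
Dim worker As New Example()

'Start ten tasks; the semaphore lets at most three of them
'run the protected section at any one time.
For i = 1 To 10
    Task.Factory.StartNew(AddressOf worker.SomeWorkerMethod)
Next

Console.ReadLine() 'Keep the process alive while the tasks run.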
If it's really going to take five months, what about cloning the drive and running the check on multiple instances of the same drive, each looking at different sections?

How to use the same class across multiple threads and return a variable

My apologies in advance if this has already been answered, but every search I have done does not come close to what I need. Also, this is all pseudo code.
Here is the situation: I created a form (targeting .NET 3.5) that loops over a gridview, creating a new instance of a class each time and running its code. After the code runs, a local variable on the class gets updated so I can use it, and the process repeats. Something like this:
For x As Integer = 0 To Me.txtTextBox.Lines.Count - 1 'Can be in the hundreds
    Dim objMyClass As MyClass = New MyClass(Me.DatagridView1.Rows(x).Cells(0).Value)
    If objMyClass.Start() = True Then
        'Do my thing with objMyClass.LocalLongVariable
    End If
Next
This works just fine, but takes literally days to complete. The last time I ran this it took like 6 days, 7 hours and 40 something minutes to complete and barely bumped the CPU usage.
So, now I want to use multithreading to run several of these instances at the same time. I have not been able to get this to work. Everything I try returns different values every time I run it (and it should not). I believe the threads are accessing the local variable across other threads and incrementing it at will. SyncLock, meanwhile, locks up the entire program. I have also tried adding a custom event that fires when the process is completed and executes a delegate on the main form, but that has not worked either.
Now, my question is simple: how can I run multiple threads using the same base class (passing a unique string variable) and have the local class variable produce the correct results back to the UI? (And, from what I have been reading, the BackgroundWorker class is not suitable for this many threads (like hundreds); correct me if I read that incorrectly, please.)
I am looking for something like:
Dim thrd(Me.txtTextBox.Lines.Count) As Thread
Dim objMyClass(Me.txtTextBox.Lines.Count) As MyClass

For x As Integer = 0 To Me.txtTextBox.Lines.Count - 1
    thrd(x) = New Thread(Sub()
                             objMyClass(x) = New MyClass(Me.GridView1.Rows(x).Cells(0).Value)
                             If objMyClass(x).Start() = True Then
                                 'Do my stuff here (maybe call a delegate??)
                             End If
                         End Sub)
    thrd(x).IsBackground = True
    thrd(x).Start()
Next
Any help/advice on how to proceed will be greatly appreciated. And, if you know of any examples of your suggestion, please post the code/link.
The solution was, in fact, SyncLock. My issue was that I was locking the wrong object, objMyClass, instead of the current Me, AND I was failing to use Monitor.PulseAll(). Also, I switched to using ThreadPool.QueueUserWorkItem(AddressOf objMyClass, args) and used SyncLock in my custom event raised when the thread completes. It's a whole lot easier!! Thanks!!
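For anyone trying to reproduce that, here is a rough sketch of the ThreadPool pattern described above. The Worker class, its fields, and the shared results list are all placeholders for the poster's real class (which cannot literally be named MyClass, since that is a reserved keyword); the completion event and Monitor.PulseAll are left out for brevity:
Imports System.Collections.Generic
Imports System.Threading

Public Class Worker

    Private ReadOnly _input As String
    Public LocalLongVariable As Long

    'Shared state is protected by a single shared lock object.
    Public Shared ReadOnly ResultsLock As New Object()
    Public Shared ReadOnly Results As New List(Of Long)

    Public Sub New(ByVal input As String)
        _input = input
    End Sub

    'Matches the WaitCallback signature expected by QueueUserWorkItem.
    Public Sub Run(ByVal state As Object)
        'Each thread works on its own Worker instance, so the instance
        'fields need no locking; only the shared results list does.
        LocalLongVariable = _input.Length 'Placeholder for the real work.

        SyncLock ResultsLock
            Results.Add(LocalLongVariable)
        End SyncLock
    End Sub

End Class
The loop on the form would then create one Worker per grid row and queue it with ThreadPool.QueueUserWorkItem(AddressOf w.Run), so every thread gets its own instance and its own copy of LocalLongVariable.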

VB.NET multithreading, block thread until notification received

Before I begin, I have to apologize for two things. One is that it is very difficult for me to explain things in a concise manner. Two is that I need to be somewhat vague due to the nature of the company I work for.
I am working on enhancing the functionality of an application that I've inherited. It is a very intensive application that runs a good portion of my company's day to day business. Because of this I am limited to the scope of what I can change--otherwise I'd probably rewrite it from scratch. Anyways, here is what I need to do:
I have several threads that all perform the same task but on different data input streams. Each thread interacts, through an API from another software system we pay licensing on, to write out to what are called channels. Unfortunately we have only licensed a certain number of concurrently running channels, so this application is supposed to turn them on and off as needed.
Each thread should wait until there is an available channel, lock the channel for itself and perform its processing and then release the channel. Unfortunately, I don't know how to do this, especially across multiple threads. I also don't really know what to search Google or this site for, or I'd probably have my answer. This was my thought:
A class that handles the distribution of channel numbers. Each thread makes a call to a member of this class. When it does this it would enter a queue and block until the channel-handling class recognizes that we have a channel, signals the waiting thread that a channel is available, and passes it the channel id. I have no idea where to begin even looking this up. Below is some horribly written pseudocode of how, in my mind, I think it would work.
Public Class ChannelHandler

    Private Shared WaitQueue As New Queue(Of Thread)

    '// calling thread adds itself to the queue
    Public Shared Sub WaitForChannel(ByRef t As Thread)
        WaitQueue.Enqueue(t)
    End Sub

    Public Shared Sub ReleaseChannel(chanNum As Integer)
        '// my own processing to make the chan num available again
    End Sub

    '// this would be running on a separate thread, polling my database
    '// for an available channel, when it finds one, somehow signal
    '// the first thread in the queue that its got a channel and here's the id
    Public Shared Sub ChannelLoop()
        While True
            If WaitQueue.Count > 0 Then
                If thereIsAChannelAvailable Then '// i can figure this out on my own
                    Dim t As Thread = CType(WaitQueue.Dequeue(), Thread)
                    lockTheChannel(TheAvailableChannelNumber) '// performed by me
                    '// signal the thread, passing it the channel number
                    t => SignalReady(theAvailableChannelNumber) '// how to signal?
                End If
            End If
        End While
    End Sub

End Class
and then
'// this inside the function that is doing the processing:
ChannelHandler.requestChannel(CurrentThread)

While waitingForSignal '// how?
    block '// how?
End While

Dim channelNumber As Integer = getChannelNumberThatWasSignaledBack
'// perform processing with channelNumber
ChannelHandler.ReleaseChannel(channelNumber)
I am working with the .NET Framework 3.5 in VB.NET. I am sure there has got to be some sort of mechanism already built for this, but as I said I have no idea exactly what keywords I should be searching for. Any input pointing me in the right direction (ie specific .NET framework classes to use or code samples) would be greatly appreciated. If I need to elaborate on anything, please let me know and I will to the best of my ability.
Edit: The other problem that I have is that these channels can be turned on/off from outside of this application, manually by the user (or as a result of a user-initiated event). I am not concerned with a channel being shut down while a thread is using it (it would throw an exception and then pick back up the next time it came through). But the issue is that there is not a constant number of threads fighting over a constant number of channels (if a user turns one on manually, the count is reduced, etc.). Both numbers are variable, so I can't rely on there being no external forces (i.e., something outside this set of threads, which is why I do some processing via my DB to determine an available channel number).
What I would do:
Replace the System.Threading.Thread class with the System.Threading.Tasks.Task class.
If a new Task needs to be created, but the List(Of Task) (or, in your example, Queue(Of Task)) count has reached the maximum permitted, use the Task.WaitAny method.
EDIT:
As I answered the previous block on my phone (which is pretty challenging for writing code), let me now write an example of how I would do it:
Imports System.Threading.Tasks
Imports System.Collections.Generic

Public Class Sample

    Private Const MAXIMUM_PERMITTED As Integer = 3
    Private _waitQueue As New Queue(Of Task)

    Public Sub AssignChannel()
        Static queueManagerCreated As Boolean

        If Not queueManagerCreated Then
            Task.Factory.StartNew(Sub() ManageQueue())
            queueManagerCreated = True
        End If

        Dim newTask As New Task(Sub()
                                    'Connect to 3rd party software.
                                End Sub)

        SyncLock _waitQueue
            _waitQueue.Enqueue(newTask)
        End SyncLock
    End Sub

    Private Sub ManageQueue()
        Dim tasksRunning As New List(Of Task)

        While True
            If _waitQueue.Count <= 0 Then
                Threading.Thread.Sleep(10)
                Continue While
            End If

            'Block until a slot frees up once the limit has been reached.
            If tasksRunning.Count >= MAXIMUM_PERMITTED Then
                Dim endedTaskPos As Integer = Task.WaitAny(tasksRunning.ToArray())

                If endedTaskPos > -1 AndAlso endedTaskPos < tasksRunning.Count Then
                    tasksRunning.RemoveAt(endedTaskPos)
                Else
                    Continue While
                End If
            End If

            Dim taskToStart As Task

            SyncLock _waitQueue
                taskToStart = _waitQueue.Dequeue()
            End SyncLock

            tasksRunning.Add(taskToStart)
            taskToStart.Start()
        End While
    End Sub

End Class
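Calling code might then look something like this (the handler name and where the Sample instance lives are assumptions, not part of the answer above):
'Somewhere in the form or service that receives the data streams:
Private ReadOnly _channels As New Sample()

Private Sub OnNewDataStream()
    'Each call queues one unit of work; ManageQueue ensures that no
    'more than MAXIMUM_PERMITTED of them run at the same time.
    _channels.AssignChannel()
End Sub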