Speeding up simple loop program by multithreading or alternative - vb.net

I have a very simple program that converts .HEIC images to .JPEG images; this is a batch operation. The task is a simple loop that runs the following code. The code takes quite a long time when processing large batches of 1,000+ images. (I am using the NuGet package FileOnQ.Imaging.Heif.)
How can I speed up the process? Can I perform the loop execution on different threads? Or is there another alternative?
For Each File As String In System.IO.Directory.GetFiles(oBasePath, "*.*", SearchOption.AllDirectories)
    Dim oHEICimage As New HeifImage(File)
    Dim oNewPath As String = System.IO.Path.Combine(Directory, Name & ".jpeg")
    oHEICimage.PrimaryImage.Write(oNewPath)
Next
Note: I removed some code that determines the file naming etc., because it is not relevant to the question.

If you are looking for a multithreaded example for a loop, you can use System.Threading.Tasks.Parallel.ForEach:
https://learn.microsoft.com/en-us/dotnet/standard/parallel-programming/how-to-write-a-simple-parallel-foreach-loop
However, be aware of creating too many resources in the loop, as that may not speed up the processing. You can also limit the number of threads running the loop by specifying MaxDegreeOfParallelism in ParallelOptions.
For example:
Dim files As String() = System.IO.Directory.GetFiles(oBasePath, "*.*", IO.SearchOption.AllDirectories)
Threading.Tasks.Parallel.ForEach(files, Sub(imgFile)
    Dim oHEICimage As New HeifImage(imgFile)
    Dim oNewPath As String = System.IO.Path.Combine(outDirectory, IO.Path.GetFileNameWithoutExtension(imgFile) & ".jpeg")
    oHEICimage.PrimaryImage.Write(oNewPath)
End Sub)
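To cap the concurrency, pass a ParallelOptions instance. A minimal sketch (the value 4 is an arbitrary assumption; tune it to your machine and disk):
Dim options As New Threading.Tasks.ParallelOptions With {.MaxDegreeOfParallelism = 4}
Threading.Tasks.Parallel.ForEach(files, options, Sub(imgFile)
    ' same conversion body as in the example above
End Sub)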

Related

For each loop keeps stopping on second cycle VB

The software I'm writing runs as a service installed on a computer.
I want to read a text file, process it, and copy it to a different path.
The software is doing exactly what it's supposed to do, but it only processes two files and then stops. I believe it's something to do with the For Each loop. I found some information online saying that it's related to the amount of memory being allocated to each cycle of the For Each loop.
Any help is appreciated.
My code goes like this:
For Each foundFile As String In My.Computer.FileSystem.GetFiles("C:\Commsin\", FileIO.SearchOption.SearchTopLevelOnly, "ORDER-*.TXT")
    Dim filenum As Integer
    filenum = FreeFile()
    FileOpen(filenum, foundFile, OpenMode.Input)
    While Not EOF(filenum)
        <do a bunch of stuff>
    End While
    <more code>
    Dim arrayFileName() As String = GetFileName.Split("\")
    Dim FileName As String = arrayFileName(2)
    My.Computer.FileSystem.CopyFile(foundFile, "C:\Commsin\Done\" & FileName)
    If IO.File.Exists("C:\Commsin\Done\" & FileName) Then
        My.Computer.FileSystem.DeleteFile(foundFile, Microsoft.VisualBasic.FileIO.UIOption.AllDialogs, Microsoft.VisualBasic.FileIO.RecycleOption.SendToRecycleBin)
        NoOfOrders -= NoOfOrders
    End If
Next
Fundamental mistake: don't modify the collection you are iterating over, i.e. avoid this pattern (pseudocode):
For Each thing In BunchOfThings
    SomeOperation()
    BunchOfThings.Delete(thing)
Next thing
It's better to follow this pattern here (pseudocode again):
While Not BunchOfThings.IsEmpty()
    thing = BunchOfThings.NextThing()
    SomeOperation()
    BunchOfThings.Delete(thing)
End While
I'll leave it as an exercise for you to convert your code from the first approach to the second.
It looks like you're trying to extract the filename from the full path using Split().
Why not just use:
Dim fileName As String = IO.Path.GetFileName(foundFile)
Instead of:
Dim arrayFileName() As String = GetFileName.Split("\")
Dim FileName As String = arrayFileName(2)
Thank you, everyone, for your suggestions; I have successfully implemented the recommended changes. It turned out that the issue wasn't with the code itself.
It was with one of the files I was using: it had a text row that, once split into an array, wasn't the required length, giving the error "Index was outside the bounds of the array."
It was a mistake in the file, and I also added some checks to prevent this error in the future.
Thank you.

Loading Large file into String Variable VB.NET

Question, if anyone could help, please: the file I am reading from inPath is very large, 300 MB to 1 GB+. I need to load the file into the variable wholeFile as shown in the program below. Files of approximately 200 MB work fine, but larger files fail with an OutOfMemoryException. The purpose is that once the file is loaded into the variable, I need to run a regex, pick certain sections of the file, and save them somewhere else. Thanks once again for your kind attention.
Dim inPath As String = "C:\temp\300MB-File.txt"
Dim outPath As String = "C:\temp\myFileNew2.txt"
Dim wholeFile As String = ""
Using sw As StreamWriter = File.CreateText(outPath)
    For Each oneLine As String In File.ReadLines(inPath)
        sw.WriteLine(oneLine)
        wholeFile = wholeFile & vbCrLf & oneLine
    Next
End Using
The way you're doing that is abominable. Why would you read a file line by line if your purpose is to store the entire contents in a single variable? Why wouldn't you load the whole file in one go?
Dim fileContents = File.ReadAllText(filePath)
That may still have memory issues with large files, but the way you're doing it will use vastly more memory. Each time you concatenate to the String, you create a new String object and copy the previous contents into it along with the new text. That means that, for a file with N lines, you are going to create N Strings, so the total copying grows quadratically with the number of lines. The first will contain the first line, the second will contain the first two lines, the third will contain the first three lines, and so on.
If you really want to read the file line by line, then you could use a StringBuilder, which avoids most of that memory reallocation. Even better would be to get the size of the file first and create the StringBuilder with the appropriate capacity from the get-go, so no reallocation would be needed at all.
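A minimal sketch of that approach, using inPath from the question (sizing the capacity from the file length is only an estimate, since it assumes a single-byte encoding):
Dim capacity As Integer = CInt(New IO.FileInfo(inPath).Length) ' rough estimate; fine for files well under 2 GB
Dim sb As New System.Text.StringBuilder(capacity)
For Each oneLine As String In IO.File.ReadLines(inPath)
    sb.AppendLine(oneLine)
Next
Dim wholeFile As String = sb.ToString()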
When you get right down to it though, files of that size are going to be an issue no matter what. You will either need to ensure that enough memory is allocated to your app to handle it or else you'll have to break the file up into chunks and process each chunk separately. If your regex won't match very large portions of the file then you can simply make each chunk overlap by a line or two and then handle the special cases where you get duplicate matches in the overlapping section.
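A rough sketch of that chunked idea (ProcessChunk is a hypothetical routine that runs your regex over one chunk; the chunk size and overlap are arbitrary values to tune):
Dim chunkLines As Integer = 100000 ' arbitrary number of lines per chunk
Dim overlap As Integer = 2         ' lines shared between consecutive chunks
Dim buffer As New List(Of String)
For Each oneLine As String In IO.File.ReadLines(inPath)
    buffer.Add(oneLine)
    If buffer.Count >= chunkLines Then
        ProcessChunk(String.Join(vbCrLf, buffer)) ' hypothetical: run the regex over this chunk
        buffer = buffer.GetRange(buffer.Count - overlap, overlap) ' keep the overlap for the next chunk
    End If
Next
If buffer.Count > 0 Then ProcessChunk(String.Join(vbCrLf, buffer))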

VB.net - Adding multiple processes as String

I tried to write a code block which displays all of the pre-declared processes in a MessageBox:
Dim pro As String = "chrome" & "firefox"
Dim prox() As Process
Try
    prox = Process.GetProcesses()
    For Each process As Process In prox
        If (pro = process.ProcessName) Then
            MsgBox("Process Found: " & pro & " ,")
        End If
    Next process
Catch ex As Exception
    ' handle any errors here
End Try
But whenever I try to match from a list with more than one program, it fails to match any of them. How can I rewrite the code so it can match from a list of processes?
In a generic way, and with future maintenance and ease of reading in mind, you could use this:
'use ; as separator, keep one at the beginning and one at the end
Dim pro As String = ";chrome;firefox;iexplorer;safari;etc;"
Then, instead of doing a straight equality test if pro = processName, do:
If pro.Contains(";" & process.ProcessName & ";") Then
To read, you only need to look at two lines (instead of several for an array-based solution).
To maintain (add/remove from the list), just update the first line.
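Put together with the loop from the question, it looks like this (a sketch; note that Contains is case-sensitive, so the names in the list must match the actual process names):
Dim pro As String = ";chrome;firefox;iexplorer;safari;"
For Each proc As Process In Process.GetProcesses()
    If pro.Contains(";" & proc.ProcessName & ";") Then
        MsgBox("Process Found: " & proc.ProcessName)
    End If
Next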

Unable to read and process large txt files using threading, background worker or string builder

In .NET I'm coding an app that needs to read large txt files, around 10 MB. My problem is reading the files using StreamReader, doing some string manipulation, and then adding the results to a ListBox.
If I use threading or a BackgroundWorker, processing becomes very slow. I also tried using StringBuilder, but with the same result.
Any solutions for this?
You can read a 10 MB text file very quickly using ReadAllLines:
Dim ss() As String
ss = System.IO.File.ReadAllLines(filename)
Then you can manipulate the strings in the array, ss in this case.
When you update the ListBox, you should use .BeginUpdate and .EndUpdate to make that part faster.
You can put Application.DoEvents in the loop to allow Windows messages to be processed. This may keep it from looking so much like the system is locked up.
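For example (a sketch assuming a ListBox named ListBox1; replace the Add call with your own string manipulation):
Dim ss() As String = System.IO.File.ReadAllLines(filename)
ListBox1.BeginUpdate() ' suspend redrawing while items are added
For Each oneLine As String In ss
    ListBox1.Items.Add(oneLine)
Next
ListBox1.EndUpdate() ' redraw once at the end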

What is a superfast way to read large files line-by-line in VBA?

I believe I have come up with a very efficient way to read very, very large files line-by-line. Please tell me if you know of a better/faster way or see room for improvement. I am trying to get better at coding, so any sort of advice you have would be nice. Hopefully this is something that other people might find useful, too.
From my tests, it appears to be something like 8 times faster than using Line Input.
'This function reads a file into a string.
'I found this in the book Programming Excel with VBA and .NET.
Public Function QuickRead(FName As String) As String
    Dim I As Integer
    Dim res As String
    Dim l As Long
    I = FreeFile
    l = FileLen(FName)
    res = Space(l)
    Open FName For Binary Access Read As #I
    Get #I, , res
    Close I
    QuickRead = res
End Function
'This function works like the Line Input statement.
Public Sub QRLineInput( _
    ByRef strFileData As String, _
    ByRef lngFilePosition As Long, _
    ByRef strOutputString As String, _
    ByRef blnEOF As Boolean _
)
    On Error GoTo LastLine
    strOutputString = Mid$(strFileData, lngFilePosition, _
        InStr(lngFilePosition, strFileData, vbNewLine) - lngFilePosition)
    lngFilePosition = InStr(lngFilePosition, strFileData, vbNewLine) + 2
    Exit Sub
LastLine:
    blnEOF = True
End Sub
Sub Test()
    Dim strFilePathName As String: strFilePathName = "C:\Fld\File.txt"
    Dim strFile As String
    Dim lngPos As Long
    Dim blnEOF As Boolean
    Dim strFileLine As String
    strFile = QuickRead(strFilePathName) & vbNewLine
    lngPos = 1
    Do Until blnEOF
        Call QRLineInput(strFile, lngPos, strFileLine, blnEOF)
    Loop
End Sub
Thanks for the advice!
My two cents…
Not long ago I needed to read large files using VBA and noticed this question. I tested three approaches for reading data from a file to compare their speed and reliability across a wide range of file sizes and line lengths. The approaches are:
The Line Input VBA statement
Using the File System Object (FSO)
Using the Get VBA statement for the whole file and then parsing the string read, as described in the posts here
Each test case consists of three steps:
Test case setup that writes a text file containing a given number of lines of the same given length, filled with a known character pattern.
Integrity test. Read each file line and verify its length and contents.
File read speed test. Read each line of the file, repeated 10 times.
As you can see, Step #3 measures the true file read speed (as asked in the question), while Step #2 verifies the file read integrity and therefore simulates real conditions where string parsing is needed.
The following chart shows the test results for the file read speed test. The file size is 64 MB for all tests, and the tests differ in line length, which varies from 2 bytes (not including CRLF) to 8 MB.
CONCLUSION:
All three methods are reliable for large files with normal and abnormal line lengths (please compare to Graeme Howard's answer)
All three methods produce almost equivalent file reading speeds for normal line lengths
The "superfast way" (Method #3) works fine for extremely long lines, while the other two don't
All of this applies across different Office versions and different PCs, for both VBA and VB6
You can use Scripting.FileSystemObject to do that thing.
From the Reference:
The ReadLine method allows a script to read individual lines in a text file. To use this method, open the text file, and then set up a Do Loop that continues until the AtEndOfStream property is True. (This simply means that you have reached the end of the file.) Within the Do Loop, call the ReadLine method, store the contents of the first line in a variable, and then perform some action. When the script loops around, it will automatically drop down a line and read the second line of the file into the variable. This will continue until each line has been read (or until the script specifically exits the loop).
And a quick example:
Set objFSO = CreateObject("Scripting.FileSystemObject")
Set objFile = objFSO.OpenTextFile("C:\FSO\ServerList.txt", 1)
Do Until objFile.AtEndOfStream
    strLine = objFile.ReadLine
    MsgBox strLine
Loop
objFile.Close
Line Input works fine for small files. However, when file sizes reach around 90 KB, Line Input jumps all over the place and reads data in the wrong order from the source file.
I tested it with different file sizes:
49k = ok
60k = ok
78k = ok
85k = ok
93k = error
101k = error
127k = error
156k = error
Lesson learned - use Scripting.FileSystemObject
With that code you load the file into memory (as a big string) and then read that string line by line.
By using Mid$() and InStr() you effectively read the "file" twice, but since it's in memory, that's not a problem.
I don't know if VB's String type has a length limit (probably not), but if the text files are hundreds of megabytes in size, you're likely to see a performance drop due to virtual memory usage.
I would think that in a large-file scenario, using a stream would be far more efficient, because memory consumption would be very small.
But your algorithm could alternate between using a stream and loading the entire file into memory based on the file size. I wouldn't be surprised if each is better only under certain conditions.
You can modify the above to read the full file in one go and then display each line, as shown below:
Option Explicit

Public Function QuickRead(FName As String) As Variant
    Dim i As Integer
    Dim res As String
    Dim l As Long
    i = FreeFile
    l = FileLen(FName)
    res = Space(l)
    Open FName For Binary Access Read As #i
    Get #i, , res
    Close i
    'split the file on vbCrLf
    QuickRead = Split(res, vbCrLf)
End Function
Sub Test()
    'you can replace "C:\writename.txt" with any file name you desire
    Dim strFilePathName As String: strFilePathName = "C:\writename.txt"
    Dim v As Variant
    Dim i As Long
    v = QuickRead(strFilePathName)
    For i = 0 To UBound(v)
        MsgBox v(i)
    Next
End Sub
My take on it... obviously, you've got to do something with the data you read in. If it involves writing it to the sheet, that'll be deadly slow with a normal For loop. I came up with the following, based on a rehash of some of the items here, plus some help from the Chip Pearson website.
Reading in the text file (assuming you don't know the length of the range it will create, so only the starting cell is given):
Public Sub ReadInPlainText(startCell As Range, Optional textfilename As Variant)
    If IsMissing(textfilename) Then textfilename = Application.GetOpenFilename("All Files (*.*), *.*", , "Select Text File to Read")
    If textfilename = "" Then Exit Sub

    Dim filelength As Long
    Dim filenumber As Integer
    filenumber = FreeFile
    filelength = FileLen(textfilename)

    Dim text As String
    Dim textlines As Variant

    Open textfilename For Binary Access Read As filenumber
    text = Space(filelength)
    Get #filenumber, , text

    'split the file on vbCrLf
    textlines = Split(text, vbCrLf)

    'output to range
    Dim outputRange As Range
    Set outputRange = startCell
    Set outputRange = outputRange.Resize(UBound(textlines), 1)
    outputRange.Value = Application.Transpose(textlines)

    Close filenumber
End Sub
Conversely, if you need to write out a range to a text file, this does it quickly in one Print statement (note: the file is opened in text mode here, not binary, unlike the read routine above).
Public Sub WriteRangeAsPlainText(ExportRange As Range, Optional textfilename As Variant)
    If IsMissing(textfilename) Then textfilename = Application.GetSaveAsFilename(FileFilter:="Text Files (*.txt), *.txt")
    If textfilename = "" Then Exit Sub

    Dim filenumber As Integer
    filenumber = FreeFile
    Open textfilename For Output As filenumber

    Dim textlines() As Variant, outputvar As Variant
    textlines = Application.Transpose(ExportRange.Value)
    outputvar = Join(textlines, vbCrLf)
    Print #filenumber, outputvar

    Close filenumber
End Sub
Be careful when using Application.Transpose with a huge number of values. If you transpose values into a column, Excel assumes you transposed them from a row. Since the maximum column count is smaller than the maximum row count, it will only handle the first (max column count) values, and anything after that will be #N/A.
I just wanted to share some of my results...
I have text files, which apparently came from a Linux system, so I only have a vbLF/Chr(10) at the end of each line and not vbCR/Chr(13).
Note 1:
This meant that the Line Input method would read in the entire file, instead of just one line at a time.
From my research testing small (152 KB) and large (2778 KB) files, both on and off the network, I found the following:
Open FileName For Input: Line Input was the slowest (See Note 1 above)
Open FileName For Binary Access Read: Input was the fastest for reading the whole file
FSO.OpenTextFile: ReadLine was fast, but a bit slower than Binary Input
Note 2:
If I just needed to check the file header (first 1-2 lines) to check if I had the proper file/format, then FSO.OpenTextFile was the fastest, followed very closely by Binary Input. The drawback with Binary Input is that you have to know how many characters you want to read. On normal files, Line Input would also be a good option, but I couldn't test it due to Note 1.

Note 3:
Obviously, the files on the network showed the largest difference in read speed. They also showed the greatest benefit from reading the file a second time (although there are certainly memory buffers that come into play here).