Best way to optimise For Loops and Do Until Loops [closed] - vb.net

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about programming within the scope defined in the help center.
Closed 4 years ago.
I have the following code that searches through the folder directories listed in a DataGridView table and puts every file of the wanted format into a list; it also gathers a list of their last-modified dates for later use in the application.
The code works, but it is sore on the eyes. I want to tidy up the following loops to improve efficiency: I have a For loop within a For loop that builds the list of file names, and then two separate Do Until loops that each walk the list from start to finish to pick out file names that need adjustment.
I would be very interested to learn a better way of achieving the same result, as my knowledge of efficient coding is quite elementary. Basically, can this be done in one or two loops? Looping through the lists twice seems inefficient.
Public Class
Private Sub btnDirectory_Click(sender As Object, e As EventArgs) Handles btnDirectory.Click
Dim FileNames As New List(Of String)
Dim FileDates As New List(Of Date)
Dim DocNo As String
Dim rowCheck As String
Dim ProjectNo As String = "1111"
Dim FileNameCheck As String
Dim str As String
Dim k As Integer = 0
Dim i As Integer
Dim j As Integer
Dim CorrectType As Boolean = False
'The first loop grabs all files of the wanted format from a datagridview table containing all directories to be checked
For Each rw In Background.Table1.Rows
rowCheck = Background.Table1(0, k).Value
If Not String.IsNullOrEmpty(rowCheck) Then
For Each file As String In My.Computer.FileSystem.GetFiles(Background.Table1(0, k).Value)
CorrectType = False
FileNameCheck = IO.Path.GetFileNameWithoutExtension(file)
If FileNameCheck.Contains(ProjectNo) AndAlso FileNameCheck.Contains("-") AndAlso Not String.IsNullOrEmpty(FileNameCheck) AndAlso FileNameCheck.Contains(" ") Then
DocNo = FileNameCheck.Substring(0, FileNameCheck.IndexOf(" "))
If FileNameCheck.Substring(0, FileNameCheck.IndexOf("-")) = ProjectNo AndAlso CountLetters(DocNo) = 3 Then
CorrectType = True
End If
End If
If CorrectType = True Then
FileNames.Add(FileNameCheck)
FileDates.Add(IO.File.GetLastWriteTime(file))
End If
Next
End If
k += 1
Next
'The next loop tidies up the file formats that contain a "-00-" in their names
j = FileNames.Count
i = 0
Do
str = FileNames(i)
If str.Contains("-00-") Then
FileNames(i) = RemoveChar(str, "-00-") ' RemoveChar is a function that replaces "-00-" with a "-"
End If
i += 1
Loop Until i = j
i = 0
j = FileNames.Count
'Finally, this loop checks that no two files have the exact same name, and gets rid of one of them if that is the case
Do
Dim st1 As String = FileNames(j - 1)
Dim st2 As String = FileNames(j - 2)
If st1 = st2 Then
FileNames.RemoveAt(j - 1)
FileDates.RemoveAt(j - 1)
End If
j -= 1
Loop Until j = 1
End Sub
End Class

The code is certainly hard on the eyes.
The For Each rw loop never uses rw. You could replace it with an indexed loop such as:
For k = 0 To Background.Table1.Rows.Count - 1
' Do things here
Next k
You assign rowCheck and use it once, but you missed the opportunity to reuse it in the For Each file line.
Where you have CorrectType = True you can simply place the corresponding code instead and drop the flag entirely.
If FileNameCheck.Substring(0, FileNameCheck.IndexOf("-")) = ProjectNo AndAlso CountLetters(DocNo) = 3 Then
CorrectType = True
End If
End If
If CorrectType = True Then
FileNames.Add(FileNameCheck)
FileDates.Add(IO.File.GetLastWriteTime(file))
End If
becomes:
If FileNameCheck.Substring(0, FileNameCheck.IndexOf("-")) = ProjectNo AndAlso CountLetters(DocNo) = 3 Then
FileNames.Add(FileNameCheck)
FileDates.Add(IO.File.GetLastWriteTime(file))
End If
I must admit, the next two loops made my eyes bleed (figuratively, not literally).
j = FileNames.Count
i = 0
Do
str = FileNames(i)
If str.Contains("-00-") Then
FileNames(i) = RemoveChar(str, "-00-") ' RemoveChar is a function that replaces "-00-" with a "-"
End If
i += 1
Loop Until i = j
becomes
For i = 0 To FileNames.Count - 1
str = FileNames(i)
If str.Contains("-00-") Then
FileNames(i) = RemoveChar(str, "-00-") ' RemoveChar is a function that replaces "-00-" with a "-"
End If
Next i
And
i = 0
j = FileNames.Count
'Finally, this loop checks that no two files have the exact same name, and gets rid of one of them if that is the case
Do
Dim st1 As String = FileNames(j - 1)
Dim st2 As String = FileNames(j - 2)
If st1 = st2 Then
FileNames.RemoveAt(j - 1)
FileDates.RemoveAt(j - 1)
End If
j -= 1
Loop Until j = 1
becomes
'Finally, this loop checks that no two files have the exact same name, and gets rid of one of them if that is the case
For j = FileNames.Count - 1 To 1 Step -1 ' stop at 1 so that FileNames(j - 1) is still a valid index
Dim st1 As String = FileNames(j)
Dim st2 As String = FileNames(j - 1)
If st1 = st2 Then
FileNames.RemoveAt(j)
FileDates.RemoveAt(j)
End If
Next j
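Putting those pieces together, the whole routine collapses into a single pass. The sketch below is only one possible shape, not your exact code: it assumes the same Background.Table1 grid and CountLetters helper, swaps RemoveChar for a plain String.Replace, keeps each name and its date together in one list of tuples so the two can never drift out of sync, and uses a HashSet to drop duplicate names in the same pass instead of a second loop:
Private Sub btnDirectory_Click(sender As Object, e As EventArgs) Handles btnDirectory.Click
    Dim projectNo As String = "1111"
    Dim seen As New HashSet(Of String)
    Dim files As New List(Of Tuple(Of String, Date))

    For Each row As DataGridViewRow In Background.Table1.Rows
        Dim folder As String = CStr(row.Cells(0).Value)
        If String.IsNullOrEmpty(folder) Then Continue For

        For Each file As String In My.Computer.FileSystem.GetFiles(folder)
            Dim name As String = IO.Path.GetFileNameWithoutExtension(file)
            ' Same filter as before: project prefix before the "-", then a three-letter doc code before the space.
            If Not (name.Contains("-") AndAlso name.Contains(" ")) Then Continue For
            If name.Substring(0, name.IndexOf("-")) <> projectNo Then Continue For
            If CountLetters(name.Substring(0, name.IndexOf(" "))) <> 3 Then Continue For

            ' Tidy the name, then keep it only if we have not seen it before.
            name = name.Replace("-00-", "-")
            If seen.Add(name) Then
                files.Add(Tuple.Create(name, IO.File.GetLastWriteTime(file)))
            End If
        Next
    Next
End Sub
If you later need the names and dates as separate lists, files.Select(Function(f) f.Item1).ToList() and files.Select(Function(f) f.Item2).ToList() give them back in matching order.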

Related

vb.net efficiently finding byte sequence in byte array

So I am creating a piece of software that, in short, has a list of original byte sequences and the new sequences those bytes need to be changed into, roughly like this in text form: "original location (currently irrelevant, as the sequence can be in different places) $ 56,69,71,73,75,77 : 56,69,71,80,50,54".
I already have code that works fine, but there can be 600+ of these sequences to find and change, and in some cases it takes a really, really long time (15 minutes or more). I think it comes down to how long it takes to find the sequences and then change them, so I am trying to find a better way to do this, as it is currently unusable because of how long it takes.
I have copied the whole code for this function below in the hope that one of you kind souls can have a look and help =)
Dim originalbytes() As Byte
Dim fd As OpenFileDialog = New OpenFileDialog()
fd.Title = "Select the file"
fd.Filter = "All files (*.*)|*.*|All files (*.*)|*.*"
fd.FilterIndex = 2
If fd.ShowDialog() = DialogResult.OK Then
TextBox2.Text = fd.FileName
originalbytes = File.ReadAllBytes(fd.FileName)
End If
Dim x As Integer = 0
Dim y As Integer = 0
Dim textbox1array() = TextBox1.Lines
Dim changedbytes() = originalbytes
Dim startvalue As Integer = 0
Dim databoxarray() As String
Dim databoxarray2() As String
While x < textbox1array.Length - 1
'for each change to make
databoxarray = textbox1array(x).Replace(" $ ", vbCr).Replace(" : ", vbCr).Split
databoxarray2 = databoxarray(1).Replace(",", vbCr).Split
Dim databox2bytes() As String = databoxarray2
'copy original bytes line to databox2 lines
y = 0
While y < (originalbytes.Length - databox2bytes.Length)
'repeat for all bytes in ori file - size of data to find
If originalbytes(y) = databox2bytes(0) Then
startvalue = y
Dim z As String = 1
Dim samebytecounter As Integer = 1
While z < databox2bytes.Length
'repeat for all ori bytes
If originalbytes(y + z) = databox2bytes(z) Then
samebytecounter = samebytecounter + 1
End If
z = z + 1
End While
If samebytecounter = databox2bytes.Length Then
'same original data found, make changes
Dim bytestoinsert() As String = databoxarray(2).Replace(",", vbCr).Split
Dim t As Integer = 0
While t < bytestoinsert.Length
changedbytes(startvalue + t) = bytestoinsert(t)
t = t + 1
End While
End If
End If
y = y + 1
End While
x = x + 1
End While
File.WriteAllBytes(TextBox2.Text & " modified", changedbytes)
Let's take a look at that inner While loop in your code; there are some things that can be optimized:
There is no need to check the total length all the time
Dim length as Integer = originalbytes.Length - databox2bytes.Length
While y < length
'repeat for all bytes in ori file - size of data to find
If originalbytes(y) = databox2bytes(0) Then
startvalue = y
z is not necessary; samebytecounter does exactly the same thing.
Dim samebytecounter As Integer = 1
This While loop is a real bottleneck: you always check the full length of databox2bytes, when you should instead quit the loop as soon as the bytes stop matching
While samebytecounter < databox2bytes.Length AndAlso originalbytes(y + samebytecounter) = databox2bytes(samebytecounter)
samebytecounter = samebytecounter + 1
End While
This part seems fine, but you already split the data at the top of your While loop, so there is no need to create another array by repeating the same operation
If samebytecounter = databox2bytes.Length Then
'same original data found, make changes
Dim t As Integer = 0
While t < databoxarray2.Length
changedbytes(startvalue + t) = databoxarray2(t)
t = t + 1
End While
End If
End If
y = y + 1
End While
For the rest, I would agree that the algorithm you created is hugely inefficient. In theory your code could be rewritten as something like this (I didn't really test this code):
Dim text = System.Text.Encoding.UTF8.GetString(originalbytes, 0, originalbytes.Length)
dim findText = System.Text.Encoding.UTF8.GetString(stringToFind, 0, stringToFind.Length)
dim replaceWith = System.Text.Encoding.UTF8.GetString(stringToSet, 0, stringToSet.Length)
text = text.Replace( findText, replaceWith )
dim outbytes = System.Text.Encoding.UTF8.GetBytes(text)
which would probably be a huge time saver.
Apart from that, your code is written in such a way that nobody will really understand it once it has been lying around for a month or so, and I would say that includes yourself.
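If the data is truly binary, note that a UTF8 round-trip can be lossy (byte values that do not form valid UTF-8 will not survive the conversion). An alternative that keeps everything as byte arrays is a helper along the lines of the sketch below; the name ReplaceSequence is made up for illustration, and it assumes the old and new patterns have the same length, as they do in the question's examples:
' Replaces every occurrence of one byte pattern with another pattern of the same
' length, scanning the buffer once and bailing out on the first mismatching byte.
Private Sub ReplaceSequence(buffer() As Byte, find() As Byte, replaceWith() As Byte)
    If find.Length = 0 OrElse replaceWith.Length <> find.Length Then Return

    For i As Integer = 0 To buffer.Length - find.Length
        Dim match As Boolean = True
        For j As Integer = 0 To find.Length - 1
            If buffer(i + j) <> find(j) Then
                match = False
                Exit For            ' stop comparing as soon as one byte differs
            End If
        Next
        If match Then
            Array.Copy(replaceWith, 0, buffer, i, replaceWith.Length)
        End If
    Next
End Sub
Parsing each line of TextBox1 into Byte arrays once, up front, and then calling a routine like this per sequence also avoids re-splitting the comma-separated strings inside the scan.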

how to execute for loop for items checked in 3 checkedlistbox

I made a report for showing employee output in a company, and I wrote code for it, but it only shows the first items checked. How do I implement the For loops so that all checked items are included?
Dim i As Integer
Dim j As Integer
Dim k As Integer
For i = 0 To Employee_Bank_dtl.CheckedListBox1.Items.Count - 1 Step i + 1
If Employee_Bank_dtl.CheckedListBox1.GetItemCheckState(i) = CheckState.Checked Then
Dim xx As String = (CType(Employee_Bank_dtl.CheckedListBox1.Items(i), DataRowView))("VC_BRNAME")
For j = 0 To Employee_Bank_dtl.CheckedListBox2.Items.Count - 1 Step j + 1
If Employee_Bank_dtl.CheckedListBox2.GetItemCheckState(j) = CheckState.Checked Then
Dim yy As String = (CType(Employee_Bank_dtl.CheckedListBox2.Items(j), DataRowView))("vc_empstatus")
For k = 0 To Employee_Bank_dtl.CheckedListBox3.Items.Count - 1 Step k + 1
If Employee_Bank_dtl.CheckedListBox3.GetItemCheckState(k) = CheckState.Checked Then
Dim zz As String = (CType(Employee_Bank_dtl.CheckedListBox3.Items(k), DataRowView))("vc_value")
Dim str = "xxxxxxxxxxxxxx"
conobj.readdata(str)
conobj._adpt.Fill(Me.DataSet10.BRANCH_MAST)
Me.ReportViewer1.RefreshReport()
End If
Next
End If
Next
End If
Next i
You should iterate through the collection of items like this rather than the way you are doing it. You should be able to find the properties you are looking for: set a breakpoint inside the inner loop, right-click the item and choose QuickWatch, and you will see all of that item's properties, which will contain what you're looking for.
For Each item In CheckedListBox1.Items
'set the item property to the right property that holds "VC_BRNAME"
If item.property = "VC_BRNAME" Then
End If
If item.checkstate.checked = True Then
End If
Next
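Alternatively, if the items really are DataRowView objects as in the question's code, the CheckedItems collection lets you skip the index and CheckState bookkeeping altogether. A rough sketch along those lines (column names copied from the question):
For Each branch As DataRowView In Employee_Bank_dtl.CheckedListBox1.CheckedItems
    Dim xx As String = CStr(branch("VC_BRNAME"))
    For Each empStatus As DataRowView In Employee_Bank_dtl.CheckedListBox2.CheckedItems
        Dim yy As String = CStr(empStatus("vc_empstatus"))
        For Each item As DataRowView In Employee_Bank_dtl.CheckedListBox3.CheckedItems
            Dim zz As String = CStr(item("vc_value"))
            ' Build and run the report query for this xx/yy/zz combination here.
        Next
    Next
Next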
It was my mistake; replacing k + 1 with +1 in the Step of the third loop gave the right answer.

Need Help to Reduce Processing Time for Large CSV Data

I've read through some of the previous questions on speeding up the processing of large CSV data. I've implemented some of the ideas and got some improvement in processing time, but I still need to cut the processing time down further, and hopefully someone can help me.
I think my code is too long, so I'll try to simplify. Here is what my code is supposed to do:
1. Read through a CSV file.
2. Group the data by the first column, calculate the total sum of each column per group, and return the result.
Example (raw data):
A B C
1 2 3
1 2 3
2 4 4
2 4 4
Result:
A B C
1 4 6
2 8 8
Note: my actual data will be a 100 MB file with 630 columns and 29,000 rows, about 18.27 million values in total.
Here is how I achieve it.
Method 1:
1. Read a csv file through Filestream.
2. Use Split to split the returned string and process line by line, field by field.
3. Store the result in an array and save the result to a text file.
Note on Method 1: processing the data with this method takes ~1 min 20 s.
Method 2:
1. Read a CSV file through a FileStream.
2. Feed the data into different threads before starting to process. (For now I feed 100 lines of data into each thread, fixed at 5 threads because of CPU resource constraints.)
3. Use Split to split the returned string and process line by line, field by field, in each thread.
4. Join the results from all threads and store them in an array. Save the result in a text file.
Note on Method 2: processing the data with this method takes ~50 s.
So I got a ~30 s improvement migrating from Method 1 to Method 2. I was wondering what I can do to improve the processing time further. I've tried cutting the data into smaller sections, such as 100 lines x 100 columns, and processing those, but the processing time became longer instead. Hopefully someone can help me with this. Thank you in advance.
Edit: Here is my code for Method 2 (I'll skip Method 1 as I'm not using it any more). I have a subroutine that manages the assignment of threads for every 100 lines read from the FileStream, executes each thread and returns the result, and finally merges all the results into a single array before writing the result to a file. I tried to make the code as simple as possible; hopefully this gives a better idea of how I process my data.
'Subroutine that assigns smaller sections of raw data to different threads
Sub process_control(ByVal filename As String)
Dim sread As New FileStream(filename, FileMode.Open, FileAccess.Read, FileShare.Read)
Dim read As New StreamReader(sread)
Dim t1 As System.Threading.Thread
Dim value, data1(), data2(), data3(), data4(), data5(), threadid(), result1(0), result2(0), result3(0), result4(0), result5(0) As String
Dim row As Integer
Dim rowlimit As Integer = 99
Dim check1 As Boolean = True
row = 0
check1 = False
ReDim data1(rowlimit), data2(rowlimit), data3(rowlimit), data4(rowlimit), data5(rowlimit), threadid(4)
Do
value = read.ReadLine
If row < rowlimit + 1 then
If data1(rowlimit) = "" Then
data1(row) = value
ElseIf data2(rowlimit) = "" Then
data2(row) = value
ElseIf data3(rowlimit) = "" Then
data3(row) = value
ElseIf data4(rowlimit) = "" Then
data4(row) = value
ElseIf data5(rowlimit) = "" Then
data5(row) = value
End If
Else
If data1(rowlimit) <> "" And data2(rowlimit) = "" And data3(rowlimit) = "" And data4(rowlimit) = "" And data5(rowlimit) = "" Then
threadid(0) = ""
t1 = New Threading.Thread(Sub()
result1 = process(data1).Clone
threadid(0) = System.Threading.Thread.CurrentThread.ManagedThreadId
End Sub)
t1.Start()
row = 0
data2(row) = value
ElseIf data1(rowlimit) <> "" And data2(rowlimit) <> "" And data3(rowlimit) = "" And data4(rowlimit) = "" And data5(rowlimit) = "" Then
threadid(1) = ""
t1 = New Threading.Thread(Sub()
result2 = process(data2).Clone
threadid(1) = System.Threading.Thread.CurrentThread.ManagedThreadId
End Sub)
t1.Start()
row = 0
data3(row) = value
ElseIf data1(rowlimit) <> "" And data2(rowlimit) <> "" And data3(rowlimit) <> "" And data4(rowlimit) = "" And data5(rowlimit) = "" Then
threadid(2) = ""
t1 = New Threading.Thread(Sub()
result3 = process(data3).Clone
threadid(2) = System.Threading.Thread.CurrentThread.ManagedThreadId
End Sub)
t1.Start()
row = 0
data4(row) = value
ElseIf data1(rowlimit) <> "" And data2(rowlimit) <> "" And data3(rowlimit) <> "" And data4(rowlimit) <> "" And data5(rowlimit) = "" Then
threadid(3) = ""
t1 = New Threading.Thread(Sub()
result4 = process(data4).Clone
threadid(3) = System.Threading.Thread.CurrentThread.ManagedThreadId
End Sub)
t1.Start()
row = 0
data5(row) = value
ElseIf data1(rowlimit) <> "" And data2(rowlimit) <> "" And data3(rowlimit) <> "" And data4(rowlimit) <> "" And data5(rowlimit) <> "" Then
threadid(4) = ""
t1 = New Threading.Thread(Sub()
result5 = process(data5).Clone
threadid(4) = System.Threading.Thread.CurrentThread.ManagedThreadId
End Sub)
t1.Start()
row = 0
check1 = True
End If
row += 1
End If
If check1 = True Then
Do
System.Threading.Thread.Sleep(100)
Loop Until threadid(0) <> "" And threadid(1) <> "" And threadid(2) <> "" And threadid(3) <> "" And threadid(4) <> ""
row = 0
ReDim data1(rowlimit)
data1(row) = value
row += 1
result1_update(result1) ' consolidate result into a single array
result2_update(result2) ' consolidate result into a single array
result3_update(result3) ' consolidate result into a single array
result4_update(result4) ' consolidate result into a single array
result5_update(result5) ' consolidate result into a single array
check1 = False
ReDim data2(rowlimit), data3(rowlimit), data4(rowlimit), data5(rowlimit)
End If
Loop Until read.EndOfStream
End Sub
' Function that calculates the sum of each row and column
Function process(ByVal data() As String) As String()
Dim line(), line1(), result() As String
Dim check As Boolean
redim result(0)
For n = 0 To (data.Count - 1)
if result(0) = "" and result.count = 1 then
result(result.count-1) = data(n)
else
check = true
line1 = Split(data(n), ",", -1, CompareMethod.Text)
For m = 0 to (result.count-1)
line = split(result(m),",",-1, CompareMethod.Text)
if line1(0) = line(0) then
check = false
for o = 1 to (line1.count-1)
line(o) = val(line1(o)) + val(line(o))
next o
result(m) = join(line,",")
exit for
end if
Next m
if check = true then
redim preserve result(result.count)
result(result.count-1) = join(line1,",")
end if
end if
Next n
redim preserve result(result.count-2)
process = result.clone
End Function
Looking at your code I noticed a couple of things:
you're using Val which is very easy to use but has a high overhead. Integer.Parse would work much more efficiently.
You're converting from string to number back to string way more than you should have to. Since your summary will only be a fraction of the size of your complete data, you shouldn't have any trouble storing the results in memory. A Dictionary(Of Integer, Integer()) would work well for this.
Consider this code, which reads the data, summarizes it, and puts it in a format that is easy to write to a file, all in less than 10 seconds using random integers up to 1000:
Function SummarizeData(filename As String, delimiter As Char) As Dictionary(Of Integer, Integer())
Dim limit As Integer = 0
SummarizeData = New Dictionary(Of Integer, Integer())
Using sr As New IO.StreamReader(filename)
'Since we don't need the first line for the summary, we can read it to get
'the upper bound for the array, and discard the line.
If Not sr.EndOfStream Then
limit = sr.ReadLine.Split(delimiter).Length - 1
Else : Throw New Exception("Empty File")
End If
Do Until sr.EndOfStream
'This creates an array of integers representing the data in one line.
Dim line = sr.ReadLine.Split(delimiter).Select(Function(x) Integer.Parse(x)).ToArray
'If the key is already in the dictionary we increment the values
If SummarizeData.ContainsKey(line(0)) Then
For I = 1 To limit
SummarizeData.Item(line(0))(I) += line(I)
Next
Else
'If not we create a new element using the line as the initial values
SummarizeData.Add(line(0), New Integer(limit) {})
SummarizeData.Item(line(0)) = line
End If
Loop
End Using
End Function
To use the function and write the data, this would work:
Dim results = SummarizeData("data.txt", ","c)
'If you don't need the results sorted you can gain a few fractions of a second by _
'removing the Order By clause
IO.File.WriteAllLines("results.txt", (From kvp In results
Order By kvp.Key
Select String.Join(",", kvp.Value)).ToArray)

Speed up large string data parser function

I currently have a file with 1 million characters; the file is 1 MB in size. I am trying to parse data with this old function, which still works but is very slow.
start0end
start1end
start2end
start3end
start4end
start5end
start6end
The code takes about 5 painful minutes to process the whole data set.
Any pointers and suggestions are appreciated.
Private Sub Button1_Click(sender As Object, e As EventArgs) Handles Button1.Click
Dim sFinal = ""
Dim strData = textbox.Text
Dim strFirst = "start"
Dim strSec = "end"
Dim strID As String, Pos1 As Long, Pos2 As Long, strCur As String = ""
Do While InStr(strData, strFirst) > 0
Pos1 = InStr(strData, strFirst)
strID = Mid(strData, Pos1 + Len(strFirst))
Pos2 = InStr(strID, strSec)
If Pos2 > 0 Then
strID = Microsoft.VisualBasic.Left(strID, Pos2 - 1)
End If
If strID <> strCur Then
strCur = strID
sFinal += strID & ","
End If
strData = Mid(strData, Pos1 + Len(strFirst) + 3 + Len(strID))
Loop
End Sub
The reason it is so slow is that you keep destroying and recreating a 1 MB string over and over. Strings are immutable, so strData = Mid(strData... creates a new string and copies the remainder of the 1 MB of string data into a new strData variable over and over and over. Interestingly, even VB6 allowed for a progressive index.
I would have processed the disk file line by line and plucked out the info as it was read (see StreamReader.ReadLine) to avoid working with a 1 MB string; pretty much the same method could be used there.
' 1 MB textbox data (!?)
Dim sData As String = TextBox1.Text
' start/stop - probably fake
Dim sStart As String = "start"
Dim sStop As String = "end"
' result
Dim sbResult As New StringBuilder
' progressive index
Dim nNDX As Integer = 0
' shortcut at least as far as typing and readability
Dim MagicNumber As Integer = sStart.Length
' NEXT index of start/stop after nNDX
Dim i As Integer = 0
Dim j As Integer = 0
' loop as long as string remains
Do While (nNDX < sData.Length) AndAlso (i >= 0)
i = sData.IndexOf(sStart, nNDX) ' start index
j = sData.IndexOf(sStop, i) ' stop index
' Extract and append bracketed substring
sbResult.Append(sData.Substring(i + MagicNumber, j - (i + MagicNumber)))
' add a cute comma
sbResult.Append(",")
nNDX = j ' where we start next time
i = sData.IndexOf(sStart, nNDX)
Loop
' remove last comma
sbResult.Remove(sbResult.ToString.Length - 1, 1)
' show my work
Console.WriteLine(sbResult.ToString)
EDIT: Small mod for the ad hoc test data
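For reference, the line-by-line variant mentioned at the top of this answer might look roughly like the sketch below. It assumes each start...end pair sits on its own line of the file, as in the sample data, and the file path is only a placeholder for whatever the OpenFileDialog returns:
Dim sbResult As New System.Text.StringBuilder
Dim last As String = ""
Using reader As New IO.StreamReader("C:\data\input.txt")   ' hypothetical path
    Do Until reader.EndOfStream
        Dim line As String = reader.ReadLine()
        Dim i As Integer = line.IndexOf("start")
        If i < 0 Then Continue Do
        Dim j As Integer = line.IndexOf("end", i + 5)        ' 5 = Len("start")
        If j < 0 Then Continue Do
        Dim id As String = line.Substring(i + 5, j - (i + 5))
        If id <> last Then                                    ' skip consecutive duplicates, as the original does
            last = id
            sbResult.Append(id).Append(","c)
        End If
    Loop
End Using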

How can I list all the combinations that meet certain criteria using Excel VBA?

Which numbers from 1 to 88,888,888 have digits that sum to 8 or less?
For example,
70000001 = 7+0+0+0+0+0+0+1 = 8 Should be on the list
00000021 = 0+0+0+0+0+0+2+1 = 3 Should be on the list.
20005002 = 2+0+0+0+5+0+0+2 = 9 Should not be on the list.
Sub Comb()
Dim r As Integer 'Row (to store the number)
Dim i As Integer 'Range
r = 1
For i = 0 To 88888888
If i = 8
'How can I get the sum of the digits on vba?
ActiveSheet.Cells(r, 1) = i
r = r + 1
End If
Else
End Sub
... Is this what you're looking for?
Function AddDigits(sNum As String) As Integer
Dim i As Integer
AddDigits = 0
For i = 1 To Len(sNum)
AddDigits = AddDigits + CInt(Mid(sNum, i, 1))
Next i
End Function
(Just remember to use CStr() on the number you pass into the function.)
If not, can you explain what it is you want in a bit more detail?
Hope this helps
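For example, dropped into the Sub from the question it could be used like this (just a sketch; note the counters are declared Long, since 88,888,888 overflows VBA's Integer type):
Sub Comb()
    Dim r As Long   ' row to write the next match to
    Dim i As Long   ' candidate number
    r = 1
    For i = 0 To 88888888
        If AddDigits(CStr(i)) <= 8 Then
            ActiveSheet.Cells(r, 1) = i
            r = r + 1
        End If
    Next i
End Sub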
The method you suggest is pretty much brute force; on my machine it took 6.5 minutes to calculate all the numbers. So, as a challenge, I tried to find a more efficient algorithm.
This one takes about 0.5s:
Private Const cIntNumberOfDigits As Integer = 9
Private mStrNum As String
Private mRng As Range
Private Sub GetNumbers()
Dim dblStart As Double
Set mRng = Range("a1")
dblStart = Timer
mStrNum = Replace(Space(cIntNumberOfDigits), " ", "0")
subGetNumbers 8
Debug.Print (Timer - dblStart) / 10000000, (Timer - dblStart)
End Sub
Private Sub subGetNumbers(intMaxSum As Integer, Optional intStartPos As Integer = 1)
Dim i As Integer
If intStartPos = cIntNumberOfDigits Then
Mid(mStrNum, intStartPos, 1) = intMaxSum
mRng.Value = Val(mStrNum)
Set mRng = mRng.Offset(1)
Mid(mStrNum, intStartPos, 1) = 0
Exit Sub
End If
For i = 0 To intMaxSum
Mid(mStrNum, intStartPos, 1) = CStr(i)
subGetNumbers intMaxSum - i, intStartPos + 1
Next i
Mid(mStrNum, intStartPos, 1) = 0
End Sub
It can be sped up further by about a factor of 10 by using arrays instead of writing directly to the range and offsetting it, but that should suffice for now! :-)
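A rough illustration of that array trick in isolation, with placeholder values rather than the recursive routine above: build the results in a 2-D array and hand it to a resized range in a single assignment instead of touching cells one at a time.
Sub WriteBufferedExample()
    ' Illustrative only: write 1,000 values with one range assignment
    ' instead of 1,000 separate cell writes.
    Dim buf(1 To 1000, 1 To 1) As Long
    Dim i As Long
    For i = 1 To 1000
        buf(i, 1) = i
    Next i
    Range("A1").Resize(1000, 1).Value = buf
End Sub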
As an alternative, You can use a function like this:
Function isInnerLowr8(x As Long) As Boolean
Dim strX As String, Sum As Long, i As Long
isInnerLowr8 = False
strX = Replace(CStr(x), "0", "")
For i = 1 To Len(strX)
Sum = Sum + Val(Mid(strX, i, 1))
If Sum > 8 Then Exit Function
Next i
isInnerLowr8 = True
End Function
Now change If i = 8 to If isInnerLowr8(i) Then.