Excel specifications and limits says:
Total number of characters that a cell can contain: 32,767 characters
Is there a way to get this number programmatically?
I'm asking because hardcoding constants should, in general, be avoided if and when feasible. This number may conceivably change with the Office version (it hasn't changed between 2003 and 2013, but who knows what Microsoft has in store for us).
It's pretty easy to get the maximum number of rows in a worksheet:
Sheet1.Rows.Count ' returns 65,536 in Office 2003 and 1,048,576 in Office 2007-2013
but apparently, getting the maximum number of characters that a cell can contain isn't as straightforward.
Note that writing too many characters to a cell will not result in an error; it will silently fail and truncate the string — so proper error handling isn't an option here.
In a loop, append characters one by one to the cell contents. Each time, read cell contents, check if the last character added is present. If it isn't then that's the limit.
Upside: Works and is 100% reliable.
Downside: Really slow. It takes 10-15 seconds to complete, due to the many read-writes to/from sheet.
Obviously, this could be optimised by using a good guess (e.g. 32,767) as the initial condition, and using a hunt & bisect search algorithm rather than incrementing by 1. However, if the answer is far enough away from the initial guess, this might still take ~1 second to run — not something you would want to call repeatedly.
Function MaximumNumberOfCharactersACellCanContain(r As Range) As Long
    'NB: Range r will be overwritten.
    Dim sIn As String
    Dim sOut As String
    Dim i As Long
    Application.ScreenUpdating = False
    Do
        i = i + 1
        sIn = sIn & Chr(97 + (i - 1) Mod 26)
        r.Cells(1, 1).Value = sIn
        sOut = r.Cells(1, 1).Value
        If Right(sOut, 1) <> Right(sIn, 1) Then Exit Do
        'If Len(sOut) <> Len(sIn) Then Exit Do
    Loop
    Application.ScreenUpdating = True
    MaximumNumberOfCharactersACellCanContain = i - 1
End Function
Example usage:
MsgBox MaximumNumberOfCharactersACellCanContain(Range("A1"))
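For the record, here is a rough sketch of the hunt & bisect idea mentioned above. This is not code from the answers: the helper names, the Space$-based write and the default guess of 32,767 are my own choices, and it relies on the silent-truncation behaviour described earlier.
'Sketch only: exponential "hunt" upward from an initial guess, then bisect.
'Range r is overwritten, exactly as in the functions above.
Function MaxCellLengthBisect(r As Range, Optional ByVal guess As Long = 32767) As Long
    Dim lo As Long, hi As Long, m As Long
    'Hunt: double the trial length until a write comes back truncated.
    hi = guess
    Do While Len(WriteAndReadBack(r, hi)) = hi
        lo = hi
        hi = hi * 2
    Loop
    'Bisect between the last accepted length and the first truncated one.
    Do While hi - lo > 1
        m = lo + (hi - lo) \ 2
        If Len(WriteAndReadBack(r, m)) = m Then lo = m Else hi = m
    Loop
    MaxCellLengthBisect = lo
End Function

Private Function WriteAndReadBack(r As Range, ByVal n As Long) As String
    r.Cells(1, 1).Value = Space$(n)
    WriteAndReadBack = r.Cells(1, 1).Value
End Function
With the true limit at 32,767 and the default guess, the hunt succeeds on the first write, fails on the second, and the bisection then converges in a handful of writes.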
Alternative: loop appending a chunk until the assigned length is not what's expected
Function MaxLen() As Long
    Const INT_MAX As Integer = 32767
    Dim i As Long
    ActiveCell.Value = ""
    Dim buff As String: buff = Space$(INT_MAX)
    Do
        i = i + 1
        ActiveCell.Value = ActiveCell.Value & buff
        If Len(ActiveCell.Value) <> (i * INT_MAX) Then
            MaxLen = Len(ActiveCell.Value)
            Exit Function
        End If
    Loop
End Function
Or even
ActiveCell.Value = Space$(A_BIG_NUMBER)
MaxLen = Len(ActiveCell.Value)
Here's a variant where we take exponential steps (larger and larger steps whose size increases by a factor stepFactor each time).
Function MaximumNumberOfCharactersACellCanContain(r As Range, _
        Optional ByVal stepFactor As Double = 2) As Long
    Dim n As Double
    Dim nActual As Long
    Dim l As Long
    n = 1
    Do
        n = n * stepFactor
        nActual = CLng(n)
        r.Cells(1, 1).Value = Space$(nActual)
        l = Len(r.Cells(1, 1).Value)
        If l <> nActual Then
            MaximumNumberOfCharactersACellCanContain = l
            Exit Function
        End If
    Loop
End Function
Example usage:
Debug.Print MaximumNumberOfCharactersACellCanContain(Range("A1"), 8)
The choice of stepFactor is a compromise between:
Reducing the number of iterations (larger factor is better), and
Limiting the cost of the last iteration (the one that fails). If stepFactor is too large, then you're writing a very long string to the cell and this is quite slow.
Making sure the last iteration will never hit the out-of-memory ceiling (~130 million characters on my system). (Error handling could be added to deal with this eventuality; a sketch is shown after this list.)
stepFactor somewhere between 2 and 8 should be robust and quick.
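Here is a minimal sketch of that error handling — my own addition, not part of the answer. The exact error number raised by an out-of-memory failure isn't guaranteed, so any error during the write is treated as hitting the ceiling:
'Sketch: same exponential search as above, but fail gracefully instead of
'crashing if the trial string becomes too large to build or assign.
Function MaxCharsPerCellSafe(r As Range, Optional ByVal stepFactor As Double = 2) As Long
    Dim n As Double, nActual As Long, l As Long
    n = 1
    Do
        n = n * stepFactor
        nActual = CLng(n)
        On Error Resume Next
        r.Cells(1, 1).Value = Space$(nActual)
        If Err.Number <> 0 Then
            'Hit the memory ceiling before the cell limit (shouldn't happen in practice,
            'since the cell limit is far below ~130 million characters).
            Err.Clear
            On Error GoTo 0
            Err.Raise vbObjectError + 513, , _
                      "Out of memory at " & nActual & " characters; limit is at least " & l
        End If
        On Error GoTo 0
        l = Len(r.Cells(1, 1).Value)
        If l <> nActual Then
            MaxCharsPerCellSafe = l
            Exit Function
        End If
    Loop
End Function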
Context
I have a big macro where I have declared lots of global constants in a dedicated module (i.e., a module which contains only Public Const declarations).
About 100 of those global constants are used to assign column names to each column of my main Data worksheet:
Public Const columnName = "A"
Public Const columnCity = "B"
Public Const columnPhone = "C"
...
Public Const columnColor = "CX"
This lets me reference the columns (of my Data worksheet, from the 10 other modules) using .Range(columnColor & l) instead of .Range("CX" & l) (where l is obviously the row number). This is much easier to code (I don't need to search for the right column) and to update if I decide to insert a column before "F" (I only have to update my const module and not the 10 other code modules).
However, it looks like using .Range(columnCity & l) is notably slower than using .Range("A" & l). (SEE EDIT BELOW)
The most processor-intensive tasks are done using big 2D arrays. But I'm still probably calling those global column variables 100,000 times in some subs, since I'm not only checking/updating values/formulas (which I could do on a 2D array) but also dealing with cells' .Interior.Color, .Comment.Text ...
Question
How bad an idea is it to use such global variables (Public Const columnName...) to reference columns?
Is there some standard way of doing so?
Edit
As pointed out by Tim, I think I indeed spent time changing every .Cells(l, 1) to .Range(columnName & l) when I refactored my code to use the column variables. That means that:
My problem probably comes from using .Range vs .Cells rather than from the global variables.
I should probably refactor back to .Cells(l, colIndexName).
There is no "standard" way to do this, so you should run some performance tests and figure out the most efficient method if performance is an issue for you.
For example, using a numeric constant and Cells() seems to be about twice as fast as using Range():
Option Explicit
Public Const columnName As String = "A"
Public Const colIndexName As Long = 1
Sub Tester()
    Dim l As Long, v, t
    t = Timer
    With Sheet1
        For l = 1 To 300000#
            v = .Range(columnName & 1).Value
        Next l
    End With
    Debug.Print Timer - t '>> approx. 1.3 sec
    t = Timer
    With Sheet1
        For l = 1 To 300000#
            v = .Cells(1, colIndexName).Value
        Next l
    End With
    Debug.Print Timer - t '>> approx. 0.6 sec
End Sub
However, it's likely only twice as fast if that's all you're doing - as soon as you add in other tasks that difference may wash out.
Need an Answer to provide formatting and results, though this is more of a comment.
I have found no significant difference between .Range(columnCity & l) and .Range("A" & l). Can you provide more insight on how you came to this conclusion?
Here is my code for speed comparison:
Public Const p_sCol As String = "A"
Sub tgr()
    Dim ws As Worksheet
    Dim i As Long, l As Long
    Dim sTemp As String
    Dim dTimer As Double
    Dim aResults(1 To 1, 1 To 2) As Double
    Set ws = ActiveWorkbook.ActiveSheet
    l = 1
    dTimer = Timer
    For i = 1 To 100000
        sTemp = vbNullString
        sTemp = ws.Range(p_sCol & l).Value
    Next i
    aResults(1, 1) = Timer - dTimer
    dTimer = Timer
    For i = 1 To 100000
        sTemp = vbNullString
        sTemp = ws.Range("A" & l).Value
    Next i
    aResults(1, 2) = Timer - dTimer
    ws.Range("C1:D1").Value = aResults
End Sub
I ran the test 10 times, and the average result for the public variable concatenation over 100,000 iterations was 0.4375 seconds while the average result for hard coding the column letter was 0.429688 seconds, which is a difference of 0.007813 seconds. Sure, the hard coded method was slightly faster, but not noticeably and certainly not significantly.
I'm trying to make the MATCH function work like the FIND function. First of all, I generate the dummy data to be used for testing. Here is the routine I use:
Sub Data_Generator()
    Randomize
    Dim Data(1 To 100000, 1 To 1)
    Dim p As Single
    For i = 1 To 100000
        p = Rnd()
        If p < 0.4 Then
            Data(i, 1) = "A"
        ElseIf p >= 0.4 And p <= 0.7 Then
            Data(i, 1) = "B"
        Else
            Data(i, 1) = "C"
        End If
    Next i
    Range("A1:A100000") = Data
End Sub
Now, I create a sub-routine to find the string A in the range Data. There are two methods I use here that employ MATCH function. The first method is to reset the range of lookup array like the following code:
Sub Find_Match_1()
    T0 = Timer
    Dim i As Long, j As Long, k As Long, Data As Range
    Dim Output(1 To 100000, 1 To 1)
    On Error GoTo Finish
    Do
        Set Data = Range(Cells(j + 1, 1), "A100000") 'Reset the range of lookup array
        i = WorksheetFunction.Match("A", Data, 0)
        j = j + i
        Output(j, 1) = j 'Label the position of A
        k = k + 1 'Counting the number of [A] found
    Loop
Finish:
    Range("B1:B100000") = Output
    InputBox "The number of [A] found are " & k & " in", "Process is complete", Timer - T0
End Sub
For the second method, instead of resetting the lookup range, I assign vbNullString to the cell where A was found. The idea is to delete the string A after it is found, expecting the MATCH function to then find the next A in Range("A1:A100000"). Here is the code implementing the second method:
Sub Find_Match_2()
    T0 = Timer
    Dim n As Long, i As Long, j As Long
    Dim Data_Store()
    Dim Output(1 To 100000, 1 To 1)
    Data_Store = Range("A1:A100000")
    On Error GoTo Finish
    Do
        j = WorksheetFunction.Match("A", Range("A1:A100000"), 0)
        Output(j, 1) = j
        Cells(j, 1) = vbNullString
        n = n + 1
    Loop
Finish:
    Range("A1:A100000") = Data_Store
    Range("B1:B100000") = Output
    InputBox "The number of [A] found are " & n & " in", "Process is complete", Timer - T0
End Sub
The goal is to determine which method employs the MATCH function better in terms of performance. It turns out the first method completes in less than 0.4 seconds, while the second method takes about a minute on my PC. So my questions are:
Why does the second method take so long to complete?
How does one improve the performance of the second method?
Can the MATCH function be used on an array?
I agree that this is more of a Code Review question, but I chose to look into it for my own curiosity, so I'll share what I found.
I think you're hitting a very classic case of N vs N^2 computational complexity. Look at your two methods, which seem remarkably similar, and consider what they're actually doing, keeping in mind that the MATCH function is probably just a linear search when you use Match_type=0 (because your data is unsorted, whereas other match types could do a binary search on your sorted data).
Method 1:
Start at A1
Continue down the range until an "A" is found
Restart at the cell below the MATCH
Method 2:
Start at A1
Continue down the range until an "A" is found
Clear the "A"
Restart at A1
It should be instantly apparent that while one method is continually shrinking the range it searches, the other is always starting at the first cell and searching the whole range. This will account for some of the speedup, and already boosts Method 1 to a nice lead, but it's not even nearly the full story.
The real key lies in the amount of work Match has to do for each situation. Because its range constantly shrinks and moves its start further down the list, whichever cell Method 1's Match starts from, it only has to search a small number of cells before it hits an A and resumes the outer loop. Meanwhile, Method 2 is continually destroying A's, making them less and less dense and forcing itself to search more and more of the range before getting any hits. By the end, Method 2 is looping through almost 100,000 empty cells/B's/C's before finding its next A.
So on average, the Match for Method 1 is only looking through a couple of cells each time, while the Match for Method 2 is taking longer and longer as time goes on, until the end when it is forced to loop through the entire range. On top of that, Method 2 is doing a bunch of writes to cell values, which is slower than you might think when you have to do it tens of thousands of times.
In all honesty, your best bet would be to just loop through the cells yourself once, looking for A's and handling them as you go. MATCH brings no advantage to the table, and Method 1 is basically just a more complicated version of the loop I described.
I'd write this something like:
Sub Find_Match_3()
    T0 = Timer
    Dim k As Long, r As Range
    Dim Output(1 To 100000, 1 To 1)
    For Each r In Range("A1:A100000").Cells
        If r.Value = "A" Then
            Output(r.Row, 1) = r.Row 'Label the position of A
            k = k + 1 'Counting the number of [A] found
        End If
    Next
    Range("B1:B100000") = Output
    InputBox "The number of [A] found are " & k & " in", "Process is complete", Timer - T0
End Sub
Which is about 30% faster on my machine.
I have a macro that changes single quotes in front of a number to an apostrophe (or close single curly quote). Typically when you type something like "the '80s" in Word, the apostrophe in front of the "8" faces the wrong way. The macro below works, but it is incredibly slow (like 10 seconds per page). In a regular language (even an interpreted one), this would be a fast procedure. Any insight into why it takes so long in VBA on Word 2007? Or if someone has some find+replace skills that can do this without iterating, please let me know.
Sub FixNumericalReverseQuotes()
    Dim char As Range
    Debug.Print "starting " + CStr(Now)
    With Selection
        total = .Characters.Count
        ' Will be looking ahead one character, so we need at least 2 in the selection
        If total < 2 Then
            Exit Sub
        End If
        For x = 1 To total - 1
            a_code = Asc(.Characters(x))
            b_code = Asc(.Characters(x + 1))
            ' We want to convert a single quote in front of a number to an apostrophe
            ' Trying to use all numerical comparisons to speed this up
            If (a_code = 145 Or a_code = 39) And b_code >= 48 And b_code <= 57 Then
                .Characters(x) = Chr(146)
            End If
        Next x
    End With
    Debug.Print "ending " + CStr(Now)
End Sub
Besides the two questions asked explicitly (Why...? and How to do without...?) there is an implied one – how to iterate properly through a Word object collection.
The answer is to use the obj.Next property rather than accessing by index.
That is, instead of:
For i = 1 To ActiveDocument.Characters.Count
    'Do something with ActiveDocument.Characters(i), e.g.:
    Debug.Print ActiveDocument.Characters(i).Text
Next
one should use:
Dim ch As Range: Set ch = ActiveDocument.Characters(1)
Do
    'Do something with ch, e.g.:
    Debug.Print ch.Text
    Set ch = ch.Next 'Note iterating
Loop Until ch Is Nothing
Timing: 00:03:30 vs. 00:00:06, more than 3 minutes vs. 6 seconds.
Found on Google, link lost, sorry. Confirmed by personal exploration.
Modified version of #Comintern's "Array method":
Sub FixNumericalReverseQuotes()
    Dim chars() As Byte
    chars = StrConv(Selection.Text, vbFromUnicode)
    Dim pos As Long
    For pos = 0 To UBound(chars) - 1
        If (chars(pos) = 145 Or chars(pos) = 39) _
           And (chars(pos + 1) >= 48 And chars(pos + 1) <= 57) Then
            ' Make the change directly in the selection so track changes is sensible.
            ' I have to use 213 instead of 146 for reasons I don't understand--
            ' probably has to do with encoding on Mac, but anyway, this shows the change.
            Selection.Characters(pos + 1) = Chr(213)
        End If
    Next pos
End Sub
Maybe this?
Sub FixNumQuotes()
    Dim MyArr As Variant, MyString As String, X As Long, Z As Long
    Debug.Print "starting " + CStr(Now)
    For Z = 145 To 146
        MyArr = Split(Selection.Text, Chr(Z))
        For X = LBound(MyArr) To UBound(MyArr)
            If IsNumeric(Left(MyArr(X), 1)) Then MyArr(X) = "'" & MyArr(X)
        Next
        MyString = Join(MyArr, Chr(Z))
        Selection.Text = MyString
    Next
    Selection.Text = Replace(Replace(Selection.Text, Chr(146) & "'", "'"), Chr(145) & "'", "'")
    Debug.Print "ending " + CStr(Now)
End Sub
I am not 100% sure of your criteria; I have made both an open and a close single quote a ' but you can change that quite easily if you want.
It splits the string into an array on Chr(145), checks the first character of each element for a numeric and prefixes it with a single quote if found.
Then it joins the array back into a string on Chr(145), then repeats the whole thing for Chr(146). Finally it looks through the string for an occurrence of a single quote AND either of those curled quotes next to each other (because that has to be something we just created) and replaces them with just the single quote we want. This leaves any occurrence not next to a number intact.
This final replacement part is the bit you would change if you want something other than ' as the character.
I have been struggling with this for days now. My attempted solution was to use a regular expression on document.text. Then, using the matches in a document.range(start,end), replace the text. This preserves formatting.
The problem is that the start and end in the range do not match the index into text. I think I have found the discrepancy - hidden in the range are field codes (in my case they were hyperlinks). In addition, document.text has a bunch of BEL codes that are easy to strip out. If you loop through a range using the character method, append the characters to a string and print it you will see the field codes that don't show up if you use the .text method.
Amazingly you can get the field codes in document.text if you turn on "show field codes" in one of a number of ways. Unfortunately, that version is not exactly the same as what the range/characters shows - the document.text has just the field code, the range/characters has the field code and the field value. Therefore you can never get the character indices to match.
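A quick way to see this for yourself — a minimal sketch of my own, not the poster's code — is to build the text both ways on a small test document and compare the lengths:
Sub CompareTextVsCharacters()
    'Sketch: the two strings differ in length when the document contains
    'field codes (hyperlinks etc.), which is the discrepancy described above.
    '(Character-by-character concatenation is slow; use a small test document.)
    Dim ch As Range, s As String
    For Each ch In ActiveDocument.Range.Characters
        s = s & ch.Text
    Next ch
    Debug.Print Len(ActiveDocument.Range.Text), Len(s)
End Sub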
I have a working version where instead of using range(start,end), I do something like:
Set matchRange = doc.Range.Characters(myMatches(j).FirstIndex + 1)
matchRange.Collapse (wdCollapseStart)
Call matchRange.MoveEnd(WdUnits.wdCharacter, myMatches(j).Length)
matchRange.text = Replacement
As I say, this works but the first statement is dreadfully slow - it appears that Word is iterating through all of the characters to get to the correct point. In doing so, it doesn't seem to count the field codes, so we get to the correct point.
Bottom line, I have not been able to come up with a good way to match the indexing of the document.text string to an equivalent range(start,end) that is not a performance disaster.
Ideas welcome, and thanks.
This is a problem begging for regular expressions. Resolving the .Characters calls that many times is probably what is killing you in performance.
I'd do something like this:
Public Sub FixNumericalReverseQuotesFast()
    Dim expression As RegExp
    Set expression = New RegExp
    Dim buffer As String
    buffer = Selection.Range.Text
    expression.Global = True
    expression.MultiLine = True
    expression.Pattern = "[" & Chr$(145) & Chr$(39) & "]\d"
    Dim matches As MatchCollection
    Set matches = expression.Execute(buffer)
    Dim found As Match
    For Each found In matches
        buffer = Replace(buffer, found, Chr$(146) & Right$(found, 1))
    Next
    Selection.Range.Text = buffer
End Sub
NOTE: Requires a reference to Microsoft VBScript Regular Expressions 5.5 (or late binding).
EDIT:
The solution without using the Regular Expressions library is still avoiding working with Ranges. This can easily be converted to working with a byte array instead:
Sub FixNumericalReverseQuotes()
    Dim chars() As Byte
    chars = StrConv(Selection.Text, vbFromUnicode)
    Dim pos As Long
    For pos = 0 To UBound(chars) - 1
        If (chars(pos) = 145 Or chars(pos) = 39) _
           And (chars(pos + 1) >= 48 And chars(pos + 1) <= 57) Then
            chars(pos) = 146
        End If
    Next pos
    Selection.Text = StrConv(chars, vbUnicode)
End Sub
Benchmarks (100 iterations, 3 pages of text with 100 "hits" per page):
Regex method: 1.4375 seconds
Array method: 2.765625 seconds
OP method: (Ended task after 23 minutes)
About half as fast as the Regex, but still roughly 10ms per page.
EDIT 2: Apparently the methods above are not format safe, so method 3:
Sub FixNumericalReverseQuotesVThree()
    Dim full_text As Range
    Dim cached As Long
    Set full_text = ActiveDocument.Range
    full_text.Find.ClearFormatting
    full_text.Find.MatchWildcards = True
    cached = full_text.End
    Do While full_text.Find.Execute("[" & Chr$(145) & Chr$(39) & "][0-9]")
        full_text.End = full_text.Start + 2
        full_text.Characters(1) = Chr$(96)
        full_text.Start = full_text.Start + 1
        full_text.End = cached
    Loop
End Sub
Again, slower than both the above methods, but still runs reasonably fast (on the order of ms).
Q: Why do I get "Out of memory" when my system has plenty of it left (and Office is 64-bit)?
Q: Could it be that the data, when split, causes such strange behavior?
Q: If splitting that string causes trouble, then how do I sanitize/restore it just for the operations of storing/restoring that string?
Specs: Win 8.1 Pro + Office 2013 64bit, 8GB RAM in system
And here is the code, which just takes a single LARGE (~1-2 MB) string and splits it into multiple cells, so that the 32k-characters-per-cell limit does no harm:
Public Sub SaveConst(str As String)
    Dim i As Long
    i = 0
    ' Clear prior data
    Do While LenB(Range("ConstJSON").Offset(0, i)) <> 0
        Range("ConstJSON").Offset(0, i) = ""
        i = i + 1
    Loop
    Dim strLen As Long
    With Range("ConstJSON")
        .Offset(0, 0) = Left$(str, 30000)
        i = 1
        strLen = Len(str)
        Debug.Print strLen
        Do While strLen > i * 30000
            .Offset(0, i) = Mid$(str, i * 30000 + 1, 30000)
            Debug.Print i
            i = i + 1
        Loop
    End With
End Sub
Right now Len(str) reports ~270k characters, i goes up to 4 iterations, and then the "Out of memory" bug kicks in.
This is the n-th iteration of that bug in this place, but I have simplified/modified the code so that it works sometimes, for the exact same data set.
UPDATE:
Thanks to Jean's code, I'm confident that it's SAVING the partial string to the cell that causes that error.
.Offset(0, i) = Mid$(str, i * 30000 + 1, 30000)
Or
Range("ConstJSON").Resize(nPieces).Value2 = v
Both cause errors.
UPDATE 2:
I was saving that string to a single cell without any fuss. But now that the string has grown too big to fit, splitting it sometimes causes that "Out of memory" error.
Exemplary string:
[...]
""ebiZlecenias"":[{""id"":""91a75940-6d3e-06f8-bcf7-28ecd49e85f2"",""lp"":null,""name"":""ZLECENIE
GŁÓWNE"",""date_entered"":""2014-04-15
08:13:18"",""date_modified"":""2014-04-15
08:13:18"",""modified_user_id"":""2"",""budowa_id"":""8614aab5-29da-ffac-4865-e8c5913c729c"",""rodzaj"":""1"",""etap"":""1"",""data_akceptacji"":null,""opis"":null,""user_id"":null,""data_bazowa_od"":null,""data_bazowa_do"":null,""data_rzeczywista_od"":null,""data_rzeczywista_do"":null,""archiwalny"":null,""deleted"":null,""termin_raportowania"":null,""okres_raportowania"":null,
[...]
EDIT: I believe the problem with your specimen string is that some of the substrings begin with a "-". When that happens, Excel thinks the contents is a formula, and that is what causes the error. Pre-formatting the cell as text did not correct the problem, but preceding each entry with a 'single quote', which coerces the entry to text and will not show up except in the formula bar, seems to have corrected the problem in my macros, even when using your specimen string above as the "base" string.
EDIT2: What seems to be happening is that, if the string length is greater than 8,192 characters (the longest allowed in a formula), and also starts with a token that makes Excel think it might be a formula (e.g: -, +, =), the write to the cell will fail with an out of memory error EVEN IF the cell is formatted as text. This does not happen if the single quote is inserted first.
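Here is a minimal repro of that claim — a sketch of my own, not part of the original answer — which writes the same long string with and without the leading apostrophe (cell A1 and the error trapping are my choices):
Sub ReproFormulaCoercionError()
    'Build a string longer than 8,192 characters that starts with a formula-like token.
    Dim s As String
    s = "-" & String(10000, "x")
    On Error Resume Next
    Range("A1").Value = s                'reportedly fails even if A1 is formatted as text
    Debug.Print "plain write, Err = "; Err.Number
    Err.Clear
    Range("A1").Value = "'" & s          'leading apostrophe coerces to text; should succeed
    Debug.Print "apostrophe write, Err = "; Err.Number
    On Error GoTo 0
End Sub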
Below is some code that works on much longer strings.
The code below first creates a long string, in this case the string is slightly more than 100,000,000 characters, and then splits it into sequential columns. No errors:
Option Explicit

Sub MakeLongString()
    Dim S As String
    Const strLEN As Long = 100 * 10 ^ 6
    Const strPAT As String = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"
    S = strPAT
    Do
        S = S & S
    Loop Until Len(S) > strLEN
    Debug.Print Format(Len(S), "#,###")
    SplitString S
    Debug.Print Range("a1").End(xlToRight).Column
End Sub

Sub SplitString(STR)
    Dim R As Range
    Dim strLEN As Long
    Set R = [a1]
    Dim I As Long
    strLEN = Len(STR)
    Do Until I > strLEN
        R(1, I / 30000 + 1) = "'" & Mid(STR, I + 1, 30000)
        I = I + 30000
    Loop
End Sub
I just ran a test where the range being written to was a multi-cell range, and the target was set by the Offset method as you did, and it also ran to completion without error, filling in the first four rows.
Sub SplitString(STR)
    Dim R As Range
    Dim strLEN As Long
    Set R = [a1:a4]
    Dim I As Long
    strLEN = Len(STR)
    Do Until I > strLEN
        R.Offset(, I / 30000) = "'" & Mid(STR, I + 1, 30000)
        I = I + 30000
    Loop
End Sub
This is worth a try: first split the string into an array, then slap that entire array onto the sheet at once.
Const pieceLength As Long = 3000
Dim s As String
Dim i As Long
Dim nPieces As Long
Dim v As Variant
s = ... ' whatever your string is...
nPieces = WorksheetFunction.Ceiling(Len(s) / pieceLength, 1)
ReDim v(1 To nPieces, 1 To 1)
For i = 1 To nPieces
    v(i, 1) = Mid(s, (i - 1) * pieceLength + 1, pieceLength) ' (i - 1) so the first piece starts at character 1
Next i
Range("ConstJSON").Resize(nPieces).Value2 = v
I haven't tested your code, so can't say exactly what's wrong with it, but I know that writing to (or reading from) individual cells one at a time is slow and expensive; it's usually much better to read/write large swaths of cells to/from arrays, and manipulate the arrays (instead of the cells).
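In the same spirit, restoring the string can be done with a single block read followed by Join — a sketch, assuming nPieces (and i) from the snippet above are still in scope:
'Sketch: read all pieces back in one shot and reassemble the string.
Dim w As Variant
Dim parts() As String
Dim joined As String
w = Range("ConstJSON").Resize(nPieces).Value2   'one block read for all pieces
ReDim parts(1 To nPieces)
For i = 1 To nPieces
    parts(i) = w(i, 1)                          'assumes nPieces >= 2; a single piece comes back as a scalar
Next i
joined = Join(parts, vbNullString)              'the reassembled string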
I'm trying to minimize the value of the sum of the residuals squared by varying the value of De, which is found in F1. I want the values of CFL Calculated to be as close as possible to the values of CFL Measured. The smaller the sum of those residuals squared, the better the fit! After asking stackoverflow for some advice, I decided to use Goal Seek to minimize the sum of the residuals squared to get as close to zero as possible by varying the value of De, which I want to find the most ideal value of.
I got this program to run perfectly, or so I thought... I found out that instead of summing every single residual using =SUM(D2:D14), I accidentally used =SUM(D2,D14). So I was only summing up the first and last numbers.
Now that I'm trying to sum every residual squared up, I'm getting these crazy errors, and an insane value for De.
I know that the value of De has to be greater than zero and less than one. How can I use these bounds to keep this Goal Seek focused within a certain range? The answer for De in this case is about .012, if that helps. I keep getting the error #NUM! in all of the residual cells. Is this because of overflow issues?
If you've concluded that using Goal Seek to minimize these sums by finding the most ideal value of De will not work, how would you go about it? Are there any other solvers I could use?
Here is the code:
Option Explicit
Dim Counter As Long
Dim DeSimpleFinal As Double
Dim simpletime As Variant
Dim Tracker As Double
Dim StepAmount As Double
Dim Volume As Double
Dim SurfArea As Double
Dim pi As Double
Dim FinalTime As Variant
Dim i As Variant
Sub SimpleDeCalculationNEW()
'This is so you can have the data and the table I'm working with!
Counter = 13
Volume = 12.271846
SurfArea = 19.634954
pi = 4 * Atn(1)
Range("A1") = "Time(days)"
Range("B1") = "CFL(measured)"
Range("A2").Value = 0.083
Range("A3").Value = 0.292
Range("A4").Value = 1
Range("A5").Value = 2
Range("A6").Value = 3
Range("A7").Value = 4
Range("A8").Value = 5
Range("A9").Value = 6
Range("A10").Value = 7
Range("A11").Value = 8
Range("A12").Value = 9
Range("A13").Value = 10
Range("A14").Value = 11
Range("B2").Value = 0.0612
Range("B3").Value = 0.119
Range("B4").Value = 0.223
Range("B5").Value = 0.306
Range("B6").Value = 0.361
Range("B7").Value = 0.401
Range("B8").Value = 0.435
Range("B9").Value = 0.459
Range("B10").Value = 0.484
Range("B11").Value = 0.505
Range("B12").Value = 0.523
Range("B13").Value = 0.539
Range("B14").Value = 0.554
Range("H2").Value = Volume
Range("H1").Value = SurfArea
Range("C1") = "CFL Calculated"
Range("D1") = "Residual Squared"
Range("E1") = "De value"
Range("F1").Value = 0.1
'Inserting Equations
Range("C2") = "=((2 * $H$1) / $H$2) * SQRT(($F$1 * A2) / PI())"
Range("C2").Select
Selection.AutoFill Destination:=Range("C2:C" & Counter + 1), Type:=xlFillDefault
Range("D2") = "=((ABS(B2-C2))^2)"
Range("D2").Select
Selection.AutoFill Destination:=Range("D2:D" & Counter + 1), Type:=xlFillDefault
'Summing up the residuals squared
Range("D" & Counter + 2) = "=Sum(D2: D" & Counter + 1 & ")"
'Goal Seek
Range("D" & Counter + 2).GoalSeek Goal:=0, ChangingCell:=Range("F1")
Columns("A:Z").EntireColumn.EntireColumn.AutoFit
DeSimpleFinal = Range("F1")
MsgBox ("The Final Value for DeSimple is: " & DeSimpleFinal)
End Sub
You're getting #NUM! errors because the value of F1 is going negative in your current solution -- and you are trying to take the square root of F1 in one of your expressions.
Also, Goal Seek is, in this instance, incredibly sensitive to the particular initial starting "guess" for F1 that you are using. This will be evident if you vary the F1 initial value by a little bit on either side of the 0.1 you are using now. There are, in fact, large regions of instability in the Goal Seek solution, depending on the F1 value.
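You can check this yourself with a quick sweep — a sketch of my own, assuming the sheet has already been built by SimpleDeCalculationNEW above, so the residual sum sits in D15 and De in F1:
Sub SweepGoalSeekStartingValues()
    'Re-run Goal Seek from several starting guesses for F1 and print where each run lands.
    Dim guess As Variant
    For Each guess In Array(0.05, 0.08, 0.1, 0.12, 0.15)
        Range("F1").Value = guess
        Range("D15").GoalSeek Goal:=0, ChangingCell:=Range("F1")
        Debug.Print "start = "; guess; " -> De = "; Range("F1").Value
    Next guess
End Sub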
As you brought up in your question, you are more likely to get a useable result if you can set constraints on the possible inputs to your solution search. Excel comes with an add-in called Solver that allows that, as well as offers several different search methods. Solver is not loaded automatically when you first start Excel, but loading it is easy, as explained here.
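Once the Solver add-in is loaded (and, to call the Solver* VBA functions, referenced via Tools > References in the VBA editor), something along these lines should work — a sketch, again assuming the residual sum is in D15 and De in F1; the exact bounds are illustrative:
Sub FitDeWithSolver()
    'Minimise the sum of squared residuals in D15 by changing F1, with 0 < De <= 1.
    SolverReset
    SolverOk SetCell:="$D$15", MaxMinVal:=2, ByChange:="$F$1"        '2 = minimise
    SolverAdd CellRef:="$F$1", Relation:=3, FormulaText:="0.000001"  'F1 >= a small positive number
    SolverAdd CellRef:="$F$1", Relation:=1, FormulaText:="1"         'F1 <= 1
    SolverSolve UserFinish:=True                                     'True = skip the results dialog
End Sub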
You ask for other solvers. For alternatives and a bit of theory to help understand what's going on, have a look at Numerical Recipes (online books here). Chapter 10 deals with this. It includes ready-made code samples if you want to try something different than GoalSeek or the Solver add-in. Of course the code is in Fortran/C/C++ but these are readily translated into VBA (I've done this many times).
The Goal Seek function uses a dichotomy (bisection) algorithm, which can be coded like this:
Sub dicho(ByRef target As Range, ByRef modif As Range, ByVal targetvalue As Double, ByVal a As Double, ByVal b As Double)
    Dim i As Integer
    Dim imax As Integer
    Dim eps As Double
    eps = 0.01
    imax = 10
    i = 0
    While Abs(target.Value - targetvalue) / Abs(targetvalue) > eps And i < imax
        modif.Value = (a + b) / 2
        If target.Value - targetvalue > 0 Then
            a = (a + b) / 2
        Else
            b = (a + b) / 2
        End If
        i = i + 1
    Wend
End Sub
where a and b are your bounds.
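A usage sketch for the problem above (the cell addresses come from the question's sheet layout, so treat them as assumptions). Note that dicho's convergence test divides by Abs(targetvalue), so passing a target of exactly 0 would divide by zero; either pass a small positive target, as below, or change the test to an absolute tolerance such as Abs(target.Value - targetvalue) > eps.
'Search for De in [0, 1] that drives the residual sum in D15 down towards 0.001.
Call dicho(Range("D15"), Range("F1"), 0.001, 0, 1)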