I have created an Excel workbook that opens a form, but I seem to have hit an object limit of some kind. I say this because, with no other code changes, if I add a few more text boxes it no longer opens and I receive a compile error: Out of memory.
I have seen multiple posts about "out of memory" where the advice given is to use Set object = Nothing. However, I don't believe this will help in my case, because I am not using ANY global objects and all of my private Sub variables are Integer, Byte, or Currency. As I understand it, all locally declared variables should be cleaned up by VBA at the end of the procedure anyway.
I exported the form; UserForm1.frx is 4,143 KB and UserForm1.frm is 140 KB.
Did I indeed hit some cap? How can I confirm that?
Is there a way to get more out of Excel, since this appears to be a memory limit?
If I am at a cap, does Visual Studio Professional have a higher cap than Excel for forms?
By default, Excel is installed as 32-bit and has a low memory limit of 2 GB; however, I found that if you uninstall it and reinstall Excel 2016 as 64-bit, the usable virtual memory goes up to as much as 8 TB. This makes the program less widely usable, but it does allow you to build much larger programs! That was the fix I needed, anyway.
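If you want to confirm which build you ended up with, here is a minimal sketch (Win64 is a built-in conditional compilation constant that is only true in 64-bit Office; the procedure name is mine):

Sub CheckOfficeBitness()
    #If Win64 Then
        MsgBox "This is 64-bit Office"    ' 64-bit build: much larger virtual address space
    #Else
        MsgBox "This is 32-bit Office"    ' 32-bit build: limited to around 2 GB
    #End If
End Sub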
Here is the source that led me there:
http://www.decisionmodels.com/memlimitsc.htm
I am running a program in VB.NET, with VS 2013, targeting 64-bit, and have enabled gcAllowVeryLargeObjects.
I have a list of objects of a class. The class has various properties that hold data, something like:
Class cMyClass
    Property desc1 As String
    Property desc2 As String
    Property value As Double
End Class
I am populating this list via a read from SQL Server. I can successfully put 100 million objects of this class in the list, in debug or release mode, and operate on them just fine. At one point, though, I populate the list with 150 million objects. When I run the program through Visual Studio in debug mode (or even in release mode, but through VS), I have no problem populating the list with 150 million objects. But when I use an executable (compiled from release mode), it throws an error at this point (the error box tells me it is in a particular subroutine where the only thing happening is the filling of this list): "System.OutOfMemoryException: Array dimensions exceeded supported range."
I get that it's bad practice to load so much stuff into memory, but I am already very far down this road and need to solve it just once. I can solve it, clearly, by running the program through VS, but would like to understand why this works for me in VS (in debug mode or release mode) but not when running the executable.
I'll add that I don't think it's a hardware problem. The program is using over 20 GB of memory when running, but it's running on a box with 128 GB of RAM.
Thank you.
Enable gcAllowVeryLargeObjects in your exe.config file (https://learn.microsoft.com/en-us/dotnet/framework/configure-apps/file-schema/runtime/gcallowverylargeobjects-element)
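For example, in YourApp.exe.config (the file name is a placeholder; the element itself is taken from the linked documentation):

<configuration>
  <runtime>
    <gcAllowVeryLargeObjects enabled="true" />
  </runtime>
</configuration>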
Even when this is enabled, you are still bound by per-array limits:
4,294,967,295 elements in total in a multi-dimensional array
2,146,435,071 as the maximum index in any single dimension (for most element types)
2,147,483,591 as the maximum index for byte arrays and arrays of single-byte structures
Note that, as stated in the comment from Tycobb, gcAllowVeryLargeObjects works at the object level, not at the process level, so your process might use 20 GB of RAM made up of the sum of many objects that are each under 2 GB.
I am trying to declare a new variable in VBA for Excel. I have an Excel model which has 9 modules and 7 class modules. Each module is really large, with an average of 60 variables declared per module and anywhere from a few hundred to a couple of thousand lines of code in each. Every time I try typing a new variable, I get an error that says "Out Of Memory". How can I avoid this error and continue declaring more variables?
As mentioned in the comments, there is too little information here to give you a definite answer.
However, there are plenty of possible reasons:
You are creating a lot of objects (Set obj = ...) and never releasing them (Set obj = Nothing). As long as a reference to an object remains, the object stays in memory (see the sketch after this list).
You have a loop in which you keep creating objects/variables until memory runs out.
You are creating too many objects at once that each allocate a lot of memory (e.g. Internet Explorer automation objects).
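As a rough sketch of the first point (the object type and the loop are invented for illustration, not taken from your workbook):

Sub ProcessManyItems()
    Dim i As Long
    Dim fso As Object
    For i = 1 To 100000
        Set fso = CreateObject("Scripting.FileSystemObject")  ' some object created on every pass
        ' ... work with fso here ...
        Set fso = Nothing   ' release the reference before the next pass so objects do not pile up
    Next i
End Sub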
How to deal with this?
Start with the code that raises the error, as this is most likely happening in a loop or some other place that keeps increasing memory usage (use F8 to step through the code). There may be many solutions, depending on the source of your issue.
Log memory statistics at different milestones in your code (https://social.msdn.microsoft.com/Forums/office/en-US/e3aefd82-ec6a-49c7-9fbf-5d57d8ef65ca/check-size-of-excelexe-in-memory) or simply watch the process in Task Manager; a minimal logging sketch follows after these tips.
See if any of these tips help: https://www.add-ins.com/support/out-of-memory-or-not-enough-resource-problem-with-microsoft-excel.htm
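Here is a minimal sketch of the logging idea, assuming a rough working-set figure per milestone is enough (the procedure name and the WMI approach are mine, not from the linked thread; drop PtrSafe on pre-2010 Office):

Private Declare PtrSafe Function GetCurrentProcessId Lib "kernel32" () As Long

Sub LogMemory(ByVal milestone As String)
    Dim proc As Object
    ' Ask WMI for the working set of the current Excel process and print it to the Immediate window
    For Each proc In GetObject("winmgmts:\\.\root\cimv2").ExecQuery( _
        "Select WorkingSetSize From Win32_Process Where ProcessId = " & GetCurrentProcessId)
        Debug.Print milestone & ": " & Format(CDbl(proc.WorkingSetSize) / 1024 ^ 2, "0.0") & " MB"
    Next proc
End Sub

Call LogMemory "before step X" and LogMemory "after step X" around the suspect code and compare the figures.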
Every time I try typing a new variable, I get an error that says "Out Of Memory".
This sounds like a design-time error: one you get while editing the code, rather than a run-time error you get while running it.
If this is indeed a design-time error, your file may be corrupted. Try rebuilding it by copying all of the sheets into a new workbook and copying the code into new, blank modules.
Please note that there are size limits on VBA Forms, Standard, and Class Modules, Procedures, Types, and Variables.
I've only seen it documented here:
https://learn.microsoft.com/en-us/previous-versions/visualstudio/visual-basic-6/aa716182(v=vs.60)
You either need to reduce the scope of your program, splitting it into logical steps, or use a more robust programming language such as VB.NET.
I'm writing VBA code that will run everything after I leave the office.
The macro works fine; the problem is that sometimes (more often than I'd like) I get the message:
Excel cannot complete this task with the available resources. Choose less data or close other applications. Continue without Undo?
I just click OK and the code runs fine, but I have to click OK manually.
I've already tried the
Application.DisplayAlerts = False
but this doesn't work.
Does anyone know whether I can make Excel bypass this problem?
Thank you in advance.
I believe "Continue without Undo" means Excel is temporarily clearing the RAM it uses to track undo levels and then (it seems) your macro has the resources it needs to complete the process.
Take a look at what your macro is doing to use so much RAM: Is there a way to modify it so that less RAM is required? There are several options for this listed here:
How to clear memory to prevent "out of memory error" in excel vba?
A second option might be to add RAM to your machine, but that will not fix the cause of the error.
Third, if you are willing to risk a registry edit and fewer (or no) undo levels in Excel, you might be able to prevent this error by reducing the number of undo levels (http://support.microsoft.com/kb/211922).
I have a VB.NET application that uses CreateObject to use Excel and dump a lot of data into it. We are getting out of memory exceptions and our app is generally hitting 1GB of memory at this point. However I can't make all the numbers add up.
This is how the data is passed to Excel:
worksheet_object.Range("A1").Resize(rows, cols).Value = an_array
The app is around 400 MB with the data on screen (in a datagrid); when it crashes it has used an additional 600 MB, despite the fact that Excel with the CSV loaded is only 200 MB and the CSV itself is only 68 MB. I realise the in-memory array could be somewhat larger, but how does 600 MB get gobbled up passing the data to Excel, unless Excel is somehow using my app's memory?
I have tried to find out whether Excel launched via CreateObject runs in its own memory space or uses my app's memory space, and drew a complete blank. Process Explorer shows them as separate processes, so I don't know what to think.
We found that running the app as 64-bit rather than 32-bit solved the problem, but not all our clients will have 64-bit Office.
So my question is this: how can that one line use 600 MB, and is there a better way to pass the data to Excel?
I would bet that this one line takes some time to run as well: 9.7M cells!? But anyway, two questions:
1: How can that one line use 600 MB? (I assume you mean MB (megabytes), not Mb (megabits).)
I think you answered that yourself with 9.7M cells; that's only about 61 bytes per cell. I would assume you have some strings. Each string carries a few bytes of overhead, plus 2 bytes for every character in the cell. Numbers with a decimal point take up 16 bytes each, period. Not to mention that the array itself has some overhead to handle its own information. See this chart for an idea of what the data takes up in VB: 2012 VB Data Types. One amazing thing I just learned, because I had not looked at this chart in so long, is that by going to a 64-bit OS it should actually have taken MORE memory; luckily, 64-bit can also handle more.
2: Is there a better way to pass the data to Excel.
You might try using the OLEDB ACE driver to work with your Excel file. It is faster and does not require Excel to be installed on the computer, which are big pluses.
How To Use ADO.NET to Retrieve and Modify Records in an Excel Workbook With Visual Basic .NET
This post is old but still relevant and very helpful. For connection strings for newer versions of Excel, try ConnectionStrings.com.
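For example, a typical ACE connection string for an .xlsx file looks like this (the path is a placeholder; adjust HDR depending on whether the first row holds headers):

Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\path\to\file.xlsx;Extended Properties="Excel 12.0 Xml;HDR=YES"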
Novice programmer here. I have code that copies 12 hours' worth of data from a server and displays it in Excel. My code takes the displayed data and exports it into a file, appending each 12-hour block so that I can have a month's worth of data.
My problem is that after 20 days or so I run out of memory. Theoretically, it shouldn't take much more memory than the original program, and running out of memory after 20 days says "memory leak" to me. In an old Java program I had, I just called the garbage collector periodically and the problem went away. Is there some way to do this in Excel VBA? I've read about setting variables to Nothing, but I have a lot of variables, and I think the real issue is that all of the read-in data is being kept in RAM, and I don't know how to release it.
Another curious bit: after it crashes due to memory, I cannot start the program again without shutting down Excel. So after crashing it doesn't release things from memory?
Thanks for any help
As far as I understand your question, your Excel program is still running and remains open every day (that's what I've understood from after 20 days of running).
Using Set myVar = Nothing is still a best practice, but let's say you don't want to do this.
What I can think of is to create a new Excel instance to run your code in, and to quit that instance at the end of your code.
Something like:
'Don't forget to add the reference:
'Microsoft Excel X.X Object Library
Sub myTest()
    Dim xlApp As New Excel.Application   ' a separate Excel instance for the heavy work
    ' your code, using xlApp to open and fill the workbook
    xlApp.Quit                           ' shut the extra instance down when the run is finished
    Set xlApp = Nothing                  ' then drop the reference
End Sub
VBA doesn't have garbage collection in the traditional sense (so the answer to your particular question is no); instead it keeps a reference count for each object. As such, to free memory you need to do as you suggest and dereference your objects whenever they're no longer needed.
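A minimal sketch of that pattern (the workbook path is a placeholder):

Dim wb As Workbook
Set wb = Workbooks.Open("C:\data\export.xlsx")   ' placeholder path
' ... append the latest 12 hours of data ...
wb.Close SaveChanges:=True
Set wb = Nothing   ' drop the reference so the object can actually be released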