Microsoft Visual Studio 2015 increase max Process Memory (over 2GB) - vb.net

On Windows 8
Is there a way to increase the 2 GB process memory limit? My script needs about 2.5 GB of RAM to run, even after I perform garbage collection to the best of my knowledge.
I need to run as a 64-bit process (this is not about largeaddressaware).

Related

Why does the amount of memory available to x86 applications fluctuate in vb.net? [duplicate]

Which is the maximum amount of memory one can achieve in .NET managed code? Does it depend on the actual architecture (32/64 bits)?
There is no hard, exact figure for .NET code.
If you run on 32-bit Windows, your process can address up to 2 GB, or 3 GB if the /3GB switch is used on Windows Server 2003.
If you run a 64-bit process on a 64-bit box, your process can address up to 8 TB of address space (backed by physical RAM plus the page file).
This is not the whole story, however, since the CLR takes some overhead for each process. At the same time, .NET tries to allocate new memory in chunks, and if the address space is fragmented, that might mean you cannot allocate more memory even though some is available.
In C# 2.0 and 3.0 there is also a 2 GB limit on the size of a single object in managed code.
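To see that per-object ceiling for yourself, here is a minimal C# sketch (the 3 GB figure is just an illustration, and the gcAllowVeryLargeObjects aside only applies to .NET 4.5+ 64-bit processes):

    // Illustrates the single-object limit: one array larger than 2 GB.
    // Expected to throw unless the process is 64-bit and the .NET 4.5+
    // <gcAllowVeryLargeObjects> switch is enabled in app.config.
    using System;

    class SingleObjectLimitDemo
    {
        static void Main()
        {
            try
            {
                byte[] huge = new byte[3L * 1024 * 1024 * 1024]; // ~3 GB in one object
                Console.WriteLine("Allocated {0:N0} bytes in a single array.", huge.LongLength);
            }
            catch (OutOfMemoryException ex)
            {
                Console.WriteLine("Single-object allocation failed: " + ex.Message);
            }
        }
    }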
The amount of memory your .NET process can address depends both on whether it is running on a 32- or 64-bit machine and on whether it is running as a CPU-agnostic or CPU-specific process.
By default a .NET process is CPU agnostic, so it runs with the process type that is natural to the version of Windows: on 64-bit it will be a 64-bit process, and on 32-bit it will be a 32-bit process. You can, however, force a .NET process to target a particular CPU and, say, make it run as a 32-bit process on a 64-bit machine.
If you exclude the large address aware setting, the following are the various breakdowns
32 bit process can address 2GB
64 bit process can address 8TB
Here is a link to the full breakdown of addressable space based on the various options Windows provides.
http://msdn.microsoft.com/en-us/library/aa366778.aspx
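To confirm which of those breakdowns applies to your own build (for example after setting Platform target to x64, or unticking "Prefer 32-bit", under Project Properties > Build in Visual Studio), a quick check like the following helps. This is a C# sketch; the VB.NET equivalent is a line-for-line translation.

    // Reports whether this process is actually running as 64-bit.
    // Environment.Is64BitProcess/Is64BitOperatingSystem exist from .NET 4.0 on;
    // IntPtr.Size (4 = 32-bit, 8 = 64-bit) works on older frameworks too.
    using System;

    class BitnessCheck
    {
        static void Main()
        {
            Console.WriteLine("64-bit OS:      " + Environment.Is64BitOperatingSystem);
            Console.WriteLine("64-bit process: " + Environment.Is64BitProcess);
            Console.WriteLine("Pointer size:   " + IntPtr.Size + " bytes");
        }
    }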
For 64 bit Windows the virtual memory size is 16 TB divided equally between user and kernel mode, so user processes can address 8 TB (8192 GB). That is less than the entire 16 EB space addressable by 64 bits, but it is still a whole lot more than what we're used to with 32 bits.
I have recently been doing extensive profiling around memory limits in .NET in a 32-bit process. We all get bombarded with the idea that we can allocate up to 2 GB (2^31 bytes) in a .NET application, but unfortunately this is not true :(. The process has that much address space to use, and the operating system does a great job managing it for us; however, .NET itself seems to have its own overhead, which accounts for approximately 600-800 MB in typical real-world applications that push the memory limit. This means that as soon as you allocate an array of integers that takes about 1.4 GB, you should expect to see an OutOfMemoryException.
Obviously in 64-bit this limit occurs way later (let's chat in 5 years :)), but the general size of everything in memory also grows (I am finding ~1.7 to ~2 times) because of the increased word size.
What I know for sure is that the operating system's virtual memory idea definitely does NOT give you virtually endless allocation space within one process. It is only there so that the full 2 GB is addressable by all the (many) applications running at one time.
I hope this insight helps somewhat.
I originally answered something related here (I am still a newbie, so I am not sure how I am supposed to do these links):
Is there a memory limit for a single .NET process
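If you want to reproduce that kind of measurement yourself, a rough probe like this one (a sketch of my own, not the tool used for the profiling above; the 64 MB chunk size is arbitrary) keeps allocating until the runtime gives up and then reports how far it got. Build it once as x86 and once as x64 to compare.

    // Allocates fixed-size chunks until OutOfMemoryException, then reports the total.
    using System;
    using System.Collections.Generic;

    class AllocationProbe
    {
        static void Main()
        {
            const int chunkBytes = 64 * 1024 * 1024; // 64 MB per chunk (arbitrary)
            var chunks = new List<byte[]>();         // keep references so nothing is collected
            try
            {
                while (true)
                {
                    chunks.Add(new byte[chunkBytes]);
                }
            }
            catch (OutOfMemoryException)
            {
                long totalMb = (long)chunks.Count * chunkBytes / (1024 * 1024);
                Console.WriteLine("Allocated roughly {0:N0} MB before running out.", totalMb);
            }
        }
    }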
The .NET runtime can allocate all the free memory available for user-mode programs in its host. Mind that it doesn't mean that all of that memory will be dedicated to your program, as some (relatively small) portions will be dedicated to internal CLR data structures.
In 32-bit systems, assuming a setup with 4 GB or more (even if PAE is enabled), you should be able to get at the very most roughly 2 GB allocated to your application. On 64-bit systems you should be able to get 1 TB. For more information concerning Windows memory limits, please review this page.
Every figure mentioned there has to be divided by 2, as Windows reserves the upper half of the address space for code running in kernel mode (ring 0).
Also, mind that whenever the limit for a 32-bit system exceeds 4 GB, use of PAE is implied, and thus you still can't really exceed the 2 GB limit unless the OS supports 4GT, in which case you can reach up to 3 GB.
Yes, in a 32-bit environment you are limited to a 4 GB address space, but Windows claims about half of it. On a 64-bit architecture it is, well, a lot bigger; I believe it's 4G * 4G.
And on the Compact Framework it is usually on the order of a few hundred MB.
I think the other answers are quite naive; in the real world, after 2 GB of memory consumption your application will behave really badly. In my experience, GUIs generally become massively clunky and unusable after heavy memory consumption.
That was my experience; obviously the actual cause can be that objects grow too big, so all operations on those objects take too much time.
The following blog post has detailed findings on x86 and x64 max memory. It also has a small tool (source available) which allows easy testing of the different memory options:
http://www.guylangston.net/blog/Article/MaxMemory.

SSIS Designer and splitting Packages

I have a large solution which is currently in 1 big package. I've started to split the package into smaller packages but have noticed that more memory is being used on the SQL Server when running the solution.
Has anyone else seen this when using multiple packages?
SQL Server, sqlservr.exe, generally runs as a service, and as you use it, it will keep consuming memory until it has taken it all and Windows forces it back down. That is by design: the database performs best when it has as much data as possible in memory versus having to read from disk. You should configure your SQL Server instance with a maximum memory setting so that the OS has room to breathe. How much memory should you reserve?
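The cap itself is the 'max server memory (MB)' setting (sp_configure, or the Memory page of the server properties in SSMS). If you want to script it, here is a hedged C# sketch; the connection string, the sysadmin rights it assumes, and the 1536 MB figure (the roughly 1.5 GB cap suggested further down) are all assumptions to adjust for your environment.

    // Caps the instance's 'max server memory (MB)' via sp_configure.
    // Assumes a local default instance and a login with sysadmin rights.
    using System;
    using System.Data.SqlClient;

    class CapSqlServerMemory
    {
        static void Main()
        {
            const string connectionString = "Server=localhost;Integrated Security=true";
            const string sql = @"
                EXEC sp_configure 'show advanced options', 1;
                RECONFIGURE;
                EXEC sp_configure 'max server memory (MB)', 1536;
                RECONFIGURE;";

            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(sql, connection))
            {
                connection.Open();
                command.ExecuteNonQuery();
                Console.WriteLine("max server memory capped at 1536 MB.");
            }
        }
    }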
SQL Server Integration Services, SSIS, runs in its own address space - even though you can launch it from SQL Server, you'll see that it, dtexec.exe, handles asking for memory, and in the event the process crashes it does not bring down SQL Server. That separation is a very good thing. From a practical perspective, it means that if you are going to run SSIS packages on a machine, you need to leave enough memory for SSIS to run, and guess what, SSIS is a blazing-fast in-memory ETL solution. As much as it can, an SSIS data flow task will hold data in its own memory so it can manipulate it (change data types, do lookups, etc.) in one big pass before writing it to the destination, since the I/O is the most expensive part of ETL.
But, as you're developing these packages, you're running them from Visual Studio, devenv.exe. VS/SSDT needs memory to do its thing. And hey, when you run an SSIS package from Visual Studio, that's wrapped in the debugger call (can't recall the process name) and that too sucks memory to be able to provide debugging capability.
Sadly, a four gigabyte allocation of RAM for a developer machine is insufficient. And if this is a server, the licensing cost alone dwarfs what it'd cost to max that box out on memory.
Were it me, I'd cap SQL Server at about 1.5 GB. Under a gig is usually not enough for SQL Server to do much of anything. Assume that Visual Studio and the debugger will be good for about 2 GB when things get hot and heavy. That leaves 0.5 GB reserved for the OS (and Outlook, Excel, Windows Explorer, a web browser pointed at StackOverflow and MSDN documentation and crap, we're out of memory).
To address memory usage by SSIS: I would think, but have not tested, that given one package with 10 data flows connected serially versus 10 packages with one data flow each, the monolithic package would consume more memory, as it validates all of its data flows when it starts. Yes, there is startup overhead that the monolithic approach pays once but which is paid for each individual package in the split approach, but I can't imagine it is of any significance. Plus, that memory is returned to the OS once the dtexec process completes. It is not like SQL Server, which will hold on to the memory until the process cycles.

Netbeans out of memory exception

Netbeans out of memory exception
I increased the -Xmx value in the NetBeans configuration file,
but the IDE is still busy acquiring more memory to scan projects.
Memory usage keeps increasing, and the process is slow and unresponsive.
Sounds like your system is thrashing. The heap size is now so large that there is not enough physical memory on your system to hold it ... and all of the other things you are running.
The end result is that your system has to copy memory pages between physical memory and the disc page file. Too much of that and the system performance will drop dramatically. You will see that the disc activity light is "on" continually. (The behaviour is worst during Java garbage collection, because that entails accessing lots of VM pages in essentially random order.)
If this is your problem then there is no easy solution:
You could reduce the -Xmx a bit ...
You could stop other applications running; e.g. quit your web browser, email client, etc.
You could buy more memory. (This only works up to a point ... if you are using a 32bit system / 32bit OS / 32bit JVM.)
You could switch to a less memory-hungry operating system or distro.

Can a 32bit process access 64GB memory?

I have a strange situation: a server with 64 GB of memory runs a 64-bit SQL Server process that consumes 32 GB of memory. There is about 17 GB of memory available.
MS Dynamics NAV is running on top of SQL.
Besides the 64-bit SQL process, there is another SQL process and a NAS, both running as 32-bit processes.
Every now and then, an error message is logged in the eventviewer, saying
There is not enough memory to execute this function.
If you work in a single-user installation, you can try reducing the
value of the 'cache' program property. You can find information about
how to optimize the operating system in the documentation for yo
Now I'm wondering what the problem is, since there is still 17 GB memory available. Is it possible that a 32-bit process cannot allocate memory in the last segment (60 to 64 GB)?
32-bit processes are limited to at most 4 GB of virtual address space (2 GB by default, or up to 4 GB on 64-bit Windows if the executable is linked as large-address-aware). The x64 architecture allows a 32-bit process's memory to be backed by any of the available physical memory, so the 60-64 GB region itself is not the problem; the 32-bit process is simply hitting its own maximum addressable space (~4 GB).
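If you want to see how close those 32-bit processes actually are to their ceiling, a hedged C# sketch like this one (the "sqlservr" process name is an assumption; substitute the 32-bit NAS executable's name as well, and run it on the server itself as an elevated 64-bit process) prints each instance's memory counters.

    // Lists memory usage of every process named "sqlservr" on this machine.
    using System;
    using System.Diagnostics;

    class ProcessMemoryReport
    {
        static void Main()
        {
            foreach (Process p in Process.GetProcessesByName("sqlservr"))
            {
                Console.WriteLine("PID {0}: private {1:N0} MB, working set {2:N0} MB",
                    p.Id,
                    p.PrivateMemorySize64 / (1024 * 1024),
                    p.WorkingSet64 / (1024 * 1024));
            }
        }
    }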

Will more CPU cache help compliation/development in Visual Studio 2008?

I'm thinking of a new laptop to replace my current machine. I notice a lot of machines have the P8xx and T9xxx Intel Core 2 Duo. The T9xxx carry a premium, but I believe they have 6 MB of cache compared to the 3 MB in the P8xx. Will this help with compilation times or any other stat? Should I invest the premium in more RAM rather than the extra cache?
I do a lot of Web work in Visual Studio 2008, some C++/MFC. I just want to balance my budget around my needs without overkill. Thanks.
Usually that's not as helpful as increasing the number of CPU cores (which lets you run parallel builds, provided the project dependency tree isn't strictly one-by-one) or the speed of the CPU itself, but the result will still vary with the actual project you work on.
I don't know if more cache will help. It can't hurt I imagine. There are a couple things that helped my Visual Studio performance.
Put as much RAM in your system as possible. RAM is cheap, you should max out your machine.
Go to your power options and make sure your CPU is running at full speed. For instance, on my machine with Vista installed, switching the power options from "Balanced" to "High performance" roughly doubled the speed of compiles.