Colab Pro+ crashes every time - google-colaboratory

Why does Colab Pro+ give me only 12 GB of RAM? The running process is very slow and keeps crashing even though my dataset has only 1000 entries.

Related

How long can I run Google Colab if I don't use GPU

I know there is a limit on how much GPU you can use on Google Colab, but what if you are just running a regular CPU script? Is there a limit to how long I can run it?
I found this question, but it is unclear whether it is talking about runtimes with or without a GPU.
From their docs: "Notebooks run by connecting to virtual machines that have maximum lifetimes that can be as much as 12 hours. Notebooks will also disconnect from VMs when left idle for too long. Maximum VM lifetime and idle timeout behavior may vary over time, or based on your usage."
If your notebook is not idle: 12 hours. If it is idle: 90 minutes after it becomes idle. This applies whether you use a GPU or only the CPU.
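If you want to check how close the current VM is to that limit, a minimal sketch that reads the VM's Linux uptime (Colab VMs run Linux, so /proc/uptime is available; the 12 h figure is just the documented maximum):

# Minimal sketch: how long has this Colab VM been alive?
with open("/proc/uptime") as f:
    uptime_seconds = float(f.read().split()[0])
print(f"VM uptime: {uptime_seconds / 3600:.1f} h of the ~12 h maximum")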

Not able to clear Google colab disk space

I have been using Google Colab for quite some time. Recently I noticed that 43 GB of disk space is already occupied. Even if I change the runtime to TPU, 43 GB out of 107 GB remains occupied. I tried "Factory reset runtime", but it doesn't work. Even if I use Google Colab from another Google account, 43 GB of disk space is still occupied. How do I clear the disk space?
There's some amount of space that's used by the base operating system and libraries. You won't be able to trim this value very much by deleting files since most will be required for normal operation.
If you need a larger disk, consider Colab Pro, which has 2x the disk space.
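If you want to see the numbers for yourself, here is a minimal sketch using only the standard library to report overall disk usage on the VM (output values are illustrative):

# Report total/used/free disk space on the Colab VM's root filesystem.
import shutil

gb = 1 << 30  # bytes per GiB
total, used, free = shutil.disk_usage("/")
print(f"total {total / gb:.1f} GB, used {used / gb:.1f} GB, free {free / gb:.1f} GB")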

How to run a Jupyter notebook for long hours?

I am working on a project that involves training a deep learning model for a very long time (estimated to be around 30 hours on my PC). I am running this in a Jupyter notebook on my Windows 10 PC with an NVIDIA GTX 1050 Ti GPU.
The problem is that after running for 24 hours, the kernel automatically becomes idle. I have also tried running this in JupyterLab, but the result is the same: the kernel goes idle right after running for 24 hours.
I have also checked the GPU usage in Task Manager to see whether the training is still running in the background. It is not: GPU usage is just 1-2%, whereas it was around 50% while the training was going on.
So my question is: is it possible to run a Jupyter notebook for that many hours? If yes, how? Is there a default setting that needs to be changed to run a notebook for more than 24 hours?

Google Colab Pro not allocating more than 1 GB of GPU memory

I recently upgraded to Colab Pro. I am trying to use GPU resources from Colab Pro to train my Mask R-CNN model. I was allocated around 15 GB of GPU memory when I ran the model right after signing up for Pro. However, for some reason, I have been allocated just 1 GB of memory since the next morning. I understand that the allocation varies from day to day, but it has been like this for almost 3 days now. I am wondering if I am missing something or have perturbed the VM's preinstalled packages. The following attempts have already been made, but none seems to work:
I have made sure that the GPU and "High-RAM" options are selected.
I have tried restarting the runtime several times.
I have tried running other scripts (just to make sure the problem was not with the Mask R-CNN script).
I would appreciate any suggestions on this issue.
(screenshot: GPU info)
The "High-RAM" setting in that screen controls system RAM rather than GPU memory.
The command !nvidia-smi will show GPU memory. In its output, the memory-usage column shows GPU memory utilization, e.g. 0 MiB used of 16 GB total.
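If you prefer to read the numbers programmatically rather than eyeball the table, here is a minimal sketch wrapping nvidia-smi (assumes a GPU runtime; the query flags are standard nvidia-smi options):

# Minimal sketch: query total and used GPU memory from a notebook cell.
import subprocess

try:
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=memory.total,memory.used",
         "--format=csv,noheader"],
        text=True,
    )
    print(out.strip())  # e.g. "16280 MiB, 0 MiB"
except (FileNotFoundError, subprocess.CalledProcessError):
    print("No NVIDIA GPU visible on this runtime")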

Is it possible to increase the RAM in Google Colab another way?

When I run this code in Google Colab:
n = 100000000
i = []
while True:
    # keep appending huge integers until memory runs out
    i.append(n * 10**66)
the session crashes every time. My data is huge. After RAM usage hits 12.72 GB, I don't get the crash prompt with the option to increase my RAM.
I just get this: "Your session crashed after using all available RAM. View runtime logs."
What is the solution? Is there another way?
You either need to upgrade to Colab Pro or, if your own computer has more RAM than the Colab VM, you can connect to your local runtime instead.
Colab Pro will give you about twice as much memory as you have now. If that’s enough, and you’re willing to pay $10 per month, that’s probably the easiest way.
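To verify how much system RAM a given runtime actually has (before and after upgrading), here is a minimal sketch using psutil, which comes preinstalled on Colab:

# Minimal sketch: report total and available system RAM on the runtime.
import psutil

gb = 1 << 30
vm = psutil.virtual_memory()
print(f"total {vm.total / gb:.1f} GB, available {vm.available / gb:.1f} GB")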
If instead you want to use a local runtime, you can hit the down arrow next to "Connect" in the top right and choose "Connect to local runtime".
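For reference, Colab's local-runtimes documentation describes starting a local Jupyter server roughly like this (the exact flags and the jupyter_http_over_ws extension requirement may change over time, so check the current docs):

pip install jupyter_http_over_ws
jupyter serverextension enable --py jupyter_http_over_ws
jupyter notebook \
  --NotebookApp.allow_origin='https://colab.research.google.com' \
  --port=8888 \
  --NotebookApp.port_retries=0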
The policy was changed. However, this workaround currently works for me:
Open and copy this notebook to your Drive. Check whether you already have 25 GB of RAM by hovering over the RAM indicator in the top right (this was the case for me). If not, follow the instructions in the Colab notebook.
Source: GitHub
To double the RAM of Google Colab, use this notebook; it gives 25 GB of RAM! Note: set the runtime type to "None" to double the RAM, then change it back to GPU or TPU.
https://colab.research.google.com/drive/155S_bb3viIoL0wAwkIyr1r8XQu4ARwA9?usp=sharing
As you said, 12 GB: this needs a large amount of RAM. If you only need a small increase, you can use Colab Pro. If you need a large increase and are using a deep learning framework, my advice is to use:
1- a university computer (academic & research computing)
2- a cloud platform like AWS, GCP, etc.
3- your own high-end computer with a GPU (I don't recommend this)