How to transfer data from a CPU to a GPU runtime on Google Colab? - tensorflow

I used files.upload() to upload three HDF5 data files to Google Colab in order to train some TensorFlow models, and it took a few minutes to complete.
Everything ran smoothly with a few minor modifications from the local Jupyter notebook. However, when I changed the runtime from "None" to "GPU", none of the previously uploaded files were present in the home folder, and I had to re-upload them. Going back to the "None" runtime showed that the files were still there.
Is there a convenient way to copy the data from one runtime to another?
Thanks a lot.

I don't think there is any way to directly copy data from a CPU instance to a GPU one.
So you probably need to copy it to Google Drive (or mount Drive with google-drive-ocamlfuse).
Another way is to use git: add and push the files from one runtime, then clone the repository from the other.
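For the Drive route, a minimal sketch (the folder name "data" is just a placeholder, and the mount may show up as "My Drive" or "MyDrive" depending on the Colab version):

# In the CPU runtime: mount Google Drive and copy the uploaded files into it
from google.colab import drive
drive.mount('/content/drive')
!mkdir -p "/content/drive/My Drive/data"   # "data" is a placeholder folder name
!cp /content/*.hdf5 "/content/drive/My Drive/data/"

# In the GPU runtime: mount Drive again and copy the files back to local disk
from google.colab import drive
drive.mount('/content/drive')
!cp "/content/drive/My Drive/data/"*.hdf5 /content/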

Related

Is it possible to reuse binary file that I built in Google Colaboratory?

I have to rebuild some packages written in C++ whenever I use Google Colab because of the time restriction, and it costs me around 30 minutes each time.
Can I save the binary file to Google Drive and reuse it for the next runtime?
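One possible approach is to cache the build output in a mounted Drive folder and restore it at the start of each session; a minimal sketch, where build/ and the cache path are placeholders:

from google.colab import drive
drive.mount('/content/drive')

# After a successful build, save the artifacts to Drive (placeholder paths)
!mkdir -p "/content/drive/My Drive/colab_build_cache"
!cp -r build "/content/drive/My Drive/colab_build_cache/"

# In a later session, restore them instead of rebuilding
!cp -r "/content/drive/My Drive/colab_build_cache/build" .

Whether a restored binary still runs depends on the Colab VM image staying compatible (same libraries and architecture) between sessions.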

Is it possible to increase the RAM in Google Colab in another way?

When I run this code in Google Colab:
n = 100000000
i = []
while True:
    i.append(n * 10**66)
my session crashes every time. My data is huge. After hitting 12.72 GB of RAM the session dies, but I don't get the crash prompt with the option to increase my RAM.
All I get is: Your session crashed after using all available RAM. View runtime logs
What is the solution? Is there another way?
You either need to upgrade to Colab Pro, or, if your own computer has more RAM than the Colab VM, you can connect to your local runtime instead.
Colab Pro will give you about twice as much memory as you have now. If that’s enough, and you’re willing to pay $10 per month, that’s probably the easiest way.
If instead you want to use a local runtime, you can hit the down arrow next to “Connect” in the top right and choose “Connect to local runtime”.
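Connecting to a local runtime also requires a Jupyter server on your own machine that accepts connections from Colab; roughly what Colab's local-runtime instructions describe (the exact commands and flags may have changed), run in a local terminal:

pip install jupyter_http_over_ws
jupyter serverextension enable --py jupyter_http_over_ws
jupyter notebook \
  --NotebookApp.allow_origin='https://colab.research.google.com' \
  --port=8888 --NotebookApp.port_retries=0

Then paste the backend URL that Jupyter prints into the “Connect to local runtime” dialog.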
The policy was changed. However, this workaround currently works for me:
Open and copy this notebook to your Drive. Check whether you already have 25 GB of RAM by hovering over the RAM indicator at the top right (this was the case for me). If not, follow the instructions in the Colab notebook.
Source: Github
To double the RAM of Google Colab, use this notebook; it gives 25 GB of RAM! Note: set the runtime type to "None" to double the RAM, then change it back to GPU or TPU.
https://colab.research.google.com/drive/155S_bb3viIoL0wAwkIyr1r8XQu4ARwA9?usp=sharing
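Instead of hovering over the indicator, you can also check the runtime's total RAM from a cell; a quick sketch using psutil, which is normally preinstalled on Colab:

import psutil

# Total memory of the current runtime, in gigabytes
print(f"Total RAM: {psutil.virtual_memory().total / 1e9:.1f} GB")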
As you said, 12 GB is not enough; this needs a lot of RAM.
If you only need a small increase, you can use Colab Pro.
If you need a large increase and you are using a deep learning framework, my advice is to use:
1- your university's computers (academic & research computing)
2- a platform like AWS, GCP, etc.
3- your own high-end computer with a GPU (I don't recommend this)

When I download packages or files in Google Colab, does that consume my data?

Noob question here. Trying to set up Colab for work due to the situation right now.
If I download Python packages or datasets in Google Colab using wget or pip, does that consume my data? To be clear, I only want to run code on Colab, not download the models or files to my local system from Colab.
Asking because my data limits are pretty low (1GB per day) and one large pre-trained model can finish it all up.
No, it won't consume (much of) your data.
Google Colab runs on Google Cloud. When it downloads something, the data travels to Google Cloud, not to your computer.
Only the notebook contents travel to your computer: the text you type, the output text, and some images. So it uses only a little of your own data.

Can multiple Colab notebooks share the same Runtime?

In Q1 2019, I ran some experiments and I noticed that Colab notebooks with the same Runtime type (None/GPU/TPU) would always share the same Runtime (i.e., the same VM). For example, I could write a file to disk in one Colab notebook and read it in another Colab notebook, as long as both notebooks had the same Runtime type.
However, I tried again today (October 2019) and it now seems that each Colab notebook gets its own dedicated Runtime.
My questions are:
When did this change happen? Was this change announced anywhere?
Is this always true now? Will Runtimes sometimes be shared and sometimes not?
What is the recommended way to communicate between two Colab notebooks? I'm guessing Google Drive?
Thanks
Distinct notebooks are indeed isolated from one another. Isolation isn't configurable.
For file sharing, I think you're right that Drive is the best bet as described in the docs:
https://colab.research.google.com/notebooks/io.ipynb#scrollTo=u22w3BFiOveA
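A minimal sketch of that approach, with both notebooks mounting Drive and exchanging data through a shared file (the shared folder and filename are placeholders):

# Notebook A
from google.colab import drive
import os

drive.mount('/content/drive')
os.makedirs('/content/drive/My Drive/shared', exist_ok=True)
with open('/content/drive/My Drive/shared/message.txt', 'w') as f:
    f.write('hello from notebook A')

# Notebook B (its own, separate runtime)
from google.colab import drive

drive.mount('/content/drive')
with open('/content/drive/My Drive/shared/message.txt') as f:
    print(f.read())

Keep in mind that Drive syncing between runtimes is not instantaneous, so there can be a short delay before the file appears in the other notebook.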
I have found no easy way of running multiple notebooks within the same runtime. That being said, I have no idea how this affects the quota. On my real computer, I'd limit GPU memory per script and run multiple Python threads. Colab doesn't let you do this, and I think that if you don't use the whole amount of RAM, they shouldn't treat it the same as if you had used the whole GPU for 12 or 24 hours. They can pool your tasks with other users.

Unable to Unrar in Google Colab

I'm using Google Colab to learn and tinker with ML and TensorFlow. I had a huge dataset in multiple multi-part rar files. I tried simply
!unrar e zip-file 'extdir'
but after successfully extracting a couple of archives it starts throwing up errors, specifically input/output errors.
Does Google block you after a couple of GBs have been unrar-ed?
I have already tried resetting the runtime environment and changing the runtime from Py2 to Py3, but nothing made a difference.
True, it doesn't work after a couple of runs.
Try unrar-free, the free version of unrar.
Check out the help manual below:
https://helpmanual.io/help/unrar-free/
No, Google doesn't block you for extracting large files. Also, unrar-free gave me the same error as before. Instead, you can install p7zip, which can extract RAR v5 archives, and use the 7z command. This solved the exact same problem I was facing (I had a rar file of ~20 GiB).
!apt install p7zip-full p7zip-rar
and then
!7z e zip-file
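A slightly fuller sketch for a multi-part archive (the name dataset.part1.rar and the output directory are placeholders; 7z's x keeps the folder structure, while e flattens everything into one directory):

!apt-get -qq install p7zip-full p7zip-rar
# Point 7z at the first volume; it picks up the remaining parts automatically.
!7z x dataset.part1.rar -o/content/extracted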