How to open multiple notebooks on the same Google Colab runtime?

Scenario:
You have a GitHub repo you want to work on. The repo has the following file structure:
src/
  -- directory containing .py files
notebook1.ipynb
notebook2.ipynb
You head to Colab and create a new empty notebook as the entry point between your repo and the Google Colab runtime.
In that empty Colab notebook you add the following command to clone your GitHub repo:
!git clone your_repo_address
Checking the Colab file explorer, we see that our repo and its file structure have been copied to the Colab runtime.
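The same check can be done from a cell. A minimal sketch, assuming Colab's default working directory of /content (the repo directory name is whatever the clone created):
# List the runtime's working directory to confirm the clone landed there
!ls /content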
So far so good. Now say you want to open notebook1.ipynb, execute its cells, and work on it.
How the hell do you do that?
Every time I try, it opens in a file explorer without the possibility to execute the notebook cells.
Why can't Colab work similarly to Jupyter? It's extremely cumbersome and time-wasting in this regard compared to Jupyter.

Related

How to make the contents of two Colab files the same

I am learning Python programming using Google Colab. When I open two Google Colab files from the same account, the folders in the content drive are different.
I want to know how we can make the content folders of the two Google Colab files the same.
Thanks
Each Google Colab session allocates different resources from Google Cloud; you cannot combine them. Instead, you can connect each notebook to your Google Drive account, where you can create a folder and manage its contents.
Step 1: Connect to Google Drive:
from google.colab import drive
drive.mount('/content/drive')
Step 2: Create a folder. Note that !cd runs in a subshell and does not persist, so use the %cd magic to change the notebook's working directory:
%cd /content/drive/My\ Drive
!mkdir mydirectory
Step 3: Move into the newly created directory and start working there:
%cd mydirectory
After this, any OS-related operations (create, update, delete files, etc.) you perform will take place in this directory.
You can also create a Jupyter Notebook within this directory and work with that notebook, although that is not strictly necessary.
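As a quick sanity check, here is a minimal sketch, assuming the mydirectory folder created in the steps above, that both notebooks can run to confirm they see the same Drive folder:
from google.colab import drive
import os

drive.mount('/content/drive')
os.chdir('/content/drive/My Drive/mydirectory')  # same effect as %cd

# A file written here from one notebook is visible to the other
with open('shared.txt', 'w') as f:
    f.write('visible from both notebooks')
print(os.listdir('.'))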

Colab truncates folders with more than 200k files

Is there a maximum number of files allowed per folder when we read Google Drive from Colab? I create a folder from Colab with more than 200k files and run an ls command just after creation, and everything is OK. But every time I close the session and open it again (remount Google Drive), the folder gets truncated: I can't read that quantity anymore, actually not more than 20k, and I need to recreate/unzip the folder again. The folder contains images for training a DL model.
Update: I'm running drive.flush_and_unmount() from the notebook where I created the folder (without closing the session), and it is running smoothly. From another notebook I'm monitoring the number of files inside the folder (the same folder, but from another notebook), and the count seems to be slowly increasing. So it looks like the solution is to run drive.flush_and_unmount() to force a sync to Google Drive, but I'm not yet sure whether the folder will stay synced after closing the session and reopening it. Will let you know; at least it is progress.
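For reference, a minimal sketch of that workflow (the dataset path is an assumption; drive.flush_and_unmount() is part of google.colab):
from google.colab import drive

drive.mount('/content/drive')
# ... write the ~200k files under /content/drive/My Drive/dataset ...
drive.flush_and_unmount()  # blocks until buffered writes are pushed to Drive

And from the second notebook (with Drive mounted there as well), counting the files as they sync:
import os
print(len(os.listdir('/content/drive/My Drive/dataset')))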

Google Colab: mounted Drive but unable to read files

I am doing Stanford's course CS231n on deep learning and am using Google Colab.
The initialization code and all the files are given, so all I need to do is hit "run" on the given code.
I have followed the official instructions step by step and successfully mounted Google Drive, yet I get an error when trying to read the files:
"cp: cannot stat 'cs231n/assignments/assignment1/cs231n/': No such file or directory /content".
And then some more errors.
The files are located in my Drive under the path "FOLDERNAME".
The errors I get, and how it should be, are shown in screenshots (not reproduced here).
Official instructions:
https://cs231n.github.io/assignments2020/assignment1/
How can I solve it?
Thanks!
I was having exactly the same problem as you. And here is my solution:
1. Delete the folder named cs231n.
2. Follow the tutorial again.
3. Change the path %cd drive/My\ Drive into %cd drive/MyDrive.
As you can see, the default name of the mounted folder is 'MyDrive' (without a space in the middle), not 'My Drive' (My\ Drive).
Then everything should run as you expect.
Cheers.
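A quick way to check which name your mount actually uses, a sketch assuming the default /content/drive mount point:
import os
print(os.listdir('/content/drive'))  # shows whether the folder is 'MyDrive' or 'My Drive'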
Based on the course instructions, it sounds like you need to expand the course archive in your Drive.
Quoting here:
Create a folder in your personal Google Drive and upload assignment1/
folder to the Drive folder. We recommend that you call the Google
Drive folder cs231n/assignments/ so that the final uploaded folder has
the path cs231n/assignments/assignment1/.
If you already did that, I'd check the Drive web UI to make sure that the paths line up with the course instructions, and to move the files around if there's a mismatch.
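The same check can be done from a Colab cell. A minimal sketch, assuming Drive is mounted at /content/drive (adjust 'My Drive' vs 'MyDrive' per the previous answer):
import os
# Should print True if the assignment folder is where the course expects it
print(os.path.exists('/content/drive/My Drive/cs231n/assignments/assignment1'))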

How to access a CSV file saved in one Colab project from another Colab project

Help regarding a Colab project.
I am working on Walmart sales prediction and have created the required dataframe in a Colab file. Now, for prediction of future sales, I need to access that file; if I do that in the same Colab file, the RAM crashes because the file is big. So I am thinking of accessing it from another Colab file. How do I do that? I don't want to download the file and upload it into the other Colab project.
df.to_csv('wm_sales.csv')
I have saved the file, and it's showing in my folder in Colab; now I want to use it in another project.
You can simply mount your Google Drive and save any files in a directory there. A minimal sketch of the workflow, assuming the default /content/drive mount point:
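from google.colab import drive
drive.mount('/content/drive')

# In the first notebook: write the CSV to Drive instead of session storage
df.to_csv('/content/drive/My Drive/wm_sales.csv')

# In the second notebook, after mounting Drive the same way:
import pandas as pd
df = pd.read_csv('/content/drive/My Drive/wm_sales.csv')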

Is there any way to load a local data set folder directly from Google Drive to Google Colab?

See the image carefully: I couldn't load a custom data folder from Google Drive into Google Colab, though I mounted Google Drive. Instead of, say, the MNIST data set, I want to load my own image data set folder. I have tried the PyDrive wrapper, but I need a simple solution.
Suppose I have a dataset of images inside Google Drive. How do I load it into Google Colab?
from google.colab import drive
drive.mount('/content/gdrive')
then
with open('/content/gdrive/My Drive/foo.txt', 'w') as f:
    f.write('Hello Google Drive!')
!cat /content/gdrive/My\ Drive/foo.txt
Here, instead of foo.txt, I have an image folder called Dog inside an ml-data folder, but I can't load it. How do I load it into Google Colab directly from Google Drive, the same way as from my local hard drive?
To load data directly from the local machine, you need to follow these steps:
1. Go to Files [left side menu].
2. Click on "Upload to session storage".
3. Select file(s) from your machine to upload.
4. It will prompt something indicating that the file(s) will be available for the current session only; click OK.
5. The file(s) will be uploaded to the directory. Click on the file (left- and right-click both work the same).
And then:
Copy the path and use it inside the pd.read_csv() function, as in the sketch below.
Note: After the session is terminated, the files will be lost from the Colab session. To use them again, you'll need to upload them again.
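A minimal sketch of that last step (the file name is hypothetical; the path is whatever "Copy path" gives you):
import pandas as pd

# Path obtained from right-click -> "Copy path" on the uploaded file
df = pd.read_csv('/content/my_data.csv')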
Many times, we prefer to have all our data in a GitHub repository or in a Google Drive folder and fetch it from there.
Reading many files from Google Drive through Colab is going to be less performant and more unreliable than first copying a .zip or similar single file from Drive to the Colab VM, unzipping it outside the drive mount directory, and then using that copy of the data.
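A sketch of that copy-then-unzip pattern, assuming the Dog images from the question were archived as Dog.zip inside the ml-data folder (the folder names come from the question; the archive itself is an assumption):
from google.colab import drive
drive.mount('/content/drive')

# Copy the single archive from Drive to the VM's local disk, then unzip it there
!cp /content/drive/My\ Drive/ml-data/Dog.zip /content/
!unzip -q /content/Dog.zip -d /content/Dog

# Train against the fast local copy, e.g. /content/Dog/...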