Upload a folder of files to Google Colab - google-colaboratory

Google Colab does not have direct access to local folders. To run local scripts, the files need to be uploaded to Colab first. I have a collection of folders and files to upload. I was wondering if there is an efficient way to do so without disturbing the file organization, something like !git clone... ?

Related

How to make the contents of two Colab files the same

I am learning Python programming using Google Colab. When I open two Google Colab files from the same account, the folders in the content drive are different.
I wanted to know how we can make the content folders of the two Google Colab files the same.
Thanks
Each Google Colab session allocates different resources from Google Cloud. You cannot combine them. Instead, you can connect the notebook to your Google Drive account, where you can create a folder and manage its contents.
Step 1: Connect to Google Drive:
from google.colab import drive
drive.mount('/content/drive')
Step 2: Create a folder (the path is quoted because "My Drive" contains a space):
!mkdir -p "/content/drive/My Drive/mydirectory"
Step 3: Move to the newly created directory and start working there (use %cd rather than !cd, because each ! command runs in its own subshell and would not change the notebook's working directory):
%cd "/content/drive/My Drive/mydirectory"
After this, any OS-related operations (create, update, delete files, etc.) you perform will take place in this directory.
You can also create a Jupyter Notebook within this directory and work with that notebook, although that is not strictly necessary.
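For example, here is a minimal sketch (assuming the mydirectory folder created above and a hypothetical file name shared.txt) of how one notebook can write a file that any other notebook sees after mounting the same Drive:
# Notebook A: write a file into the shared Drive folder
with open('/content/drive/My Drive/mydirectory/shared.txt', 'w') as f:
    f.write('visible to any notebook that mounts this Drive')

# Notebook B (after running drive.mount the same way): read it back
with open('/content/drive/My Drive/mydirectory/shared.txt') as f:
    print(f.read())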

How to access a CSV file which I saved in one Colab project from another Colab project

Help regarding a Colab project.
I am working on Walmart sales prediction and I have created the required dataframe in a Colab file. Now, for prediction of future sales, I need to access that file, but if I do that in the same Colab file, the RAM crashes because the file is large... so I am thinking of accessing it in another Colab file. How do I do that? I don't want to download the file and then upload it into the other Colab project.
df.to_csv('wm_sales.csv')
I have saved the file and it is showing in my folder in Colab; now I want to use it in another project.
You can simply mount your Google Drive and save any files in the specified directory:
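A minimal sketch (assuming the wm_sales.csv file from the question and a hypothetical shared_data folder on Drive):
from google.colab import drive
drive.mount('/content/drive')

import os
os.makedirs('/content/drive/My Drive/shared_data', exist_ok=True)

# First notebook: save the dataframe (df from the question) to Drive
df.to_csv('/content/drive/My Drive/shared_data/wm_sales.csv')

# Second notebook (after mounting Drive the same way): read it back
import pandas as pd
df = pd.read_csv('/content/drive/My Drive/shared_data/wm_sales.csv', index_col=0)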

How to read images from a local drive into Google Colab

I want to run deep learning models in Google Colab, which has powerful GPU support. I am quite new to Colab. Originally I thought I could load the images using os.path.join and PIL.Image.open as I did in the Spyder environment, but Colab gives a FileNotFoundError: [Errno 2] No such file or directory (the images do exist in a local directory). It seems I did not do the uploading correctly.
You can upload directly to Colab, or upload the images to a Google Drive folder and then mount that folder in Google Colab to use it. On the left there is an arrow that opens a sidebar containing searchable code snippets. Here, you can search for drive to get Google Drive related code snippets.
This notebook contains some examples,
https://colab.research.google.com/notebooks/snippets/drive.ipynb#scrollTo=u22w3BFiOveA
The command below lists the contents of a folder named DeepLearning,
!ls -la "/content/gdrive/My Drive/DeepLearning"
Copy the contents of the Drive DeepLearning folder to a DeepLearning folder on the virtual machine (the -r flag is needed to copy a directory recursively),
!cp -r /content/gdrive/My\ Drive/DeepLearning ./DeepLearning
Copy the contents of the virtual machine's DeepLearning folder back to the Google Drive DeepLearning folder,
!cp -r ./DeepLearning /content/gdrive/My\ Drive/DeepLearning
You can run %cd DeepLearning to change directory to the DeepLearning folder.
GPU support can be enabled by Runtime > Change Runtime Type > Hardware Accelerator > GPU.
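Once the drive is mounted, the os.path.join and PIL.Image.open approach from the question works as long as it points at the mounted path. A minimal sketch (assuming a hypothetical DeepLearning/images folder on Drive that contains only image files):
import os
from PIL import Image

# Assumed location of the images inside the mounted Drive
image_dir = '/content/gdrive/My Drive/DeepLearning/images'

for name in os.listdir(image_dir):
    img = Image.open(os.path.join(image_dir, name))
    print(name, img.size)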

How to upload files to the current working directory in a Google Colab notebook?

I uploaded a Jupyter notebook to Google Colab. The notebook accessed a couple of images from the local path /images/pic1.png. How do I upload data to the directory in Colab where the notebook is running? Please note that these are temporary files, so I don't mind them being deleted when the session terminates.
I used the file upload feature, but it doesn't seem to upload to Google Drive. I want to upload the folder with the same hierarchy as in the local environment, without needing to change any of the file paths in the code.
The best way I've found is to mount a folder from your Google Drive. Upload the image folder to your Drive. After that, write this in your notebook:
from google.colab import drive
drive.mount('/content/drive')
After you allow the authentication, you can work using the path "/content/drive/My Drive" combined with the path to your image folder, like this:
from IPython.display import Image
Image("/content/drive/My Drive/images/pic1.png")
See more in the notebook with documentation: https://colab.research.google.com/notebooks/io.ipynb
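If the code expects the images at a fixed local path such as /images/pic1.png and you don't want to change those paths, one possible workaround (a sketch, assuming the images folder sits at the top level of your Drive) is to symlink the mounted Drive folder to that local path:
# Make /images point at the folder on Drive, so existing code that
# reads /images/pic1.png keeps working without any path changes
!ln -s "/content/drive/My Drive/images" /images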

Is there any way to load a local dataset folder directly from Google Drive into Google Colab?

I couldn't load a custom data folder from Google Drive into Google Colab, even though I mounted Google Drive. Instead of the MNIST dataset, I want to load my own image dataset folder. I have tried the PyDrive wrapper, but I need a simpler solution.
Suppose I have a dataset of images inside Google Drive. How do I load it into Google Colab?
from google.colab import drive
drive.mount('/content/gdrive')
then
with open('/content/gdrive/My Drive/foo.txt', 'w') as f:
    f.write('Hello Google Drive!')
!cat /content/gdrive/My\ Drive/foo.txt
Here, instead of foo.txt, I have an image folder called Dog inside an ml-data folder, but I can't load it. How do I load it in Google Colab directly from Google Drive, the way it is on my local hard drive?
To load data directly from the local machine, you need to follow these steps:
Go to files [left side menu]
Click on upload to session storage
Select file(s) from your machine to upload
A prompt will indicate that the file(s) will be available for the current session only; click OK.
The file(s) will be uploaded to the directory. Click on it (left-click and right-click both work the same).
And then;
Copy the path and use it inside the pd.read_csv() function (see the sketch after the note below).
Note: After the session is terminated, the files will be lost from the Colab session. To use them again, you'll need to upload them again.
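A minimal sketch of that last step (assuming a hypothetical uploaded file named mydata.csv):
import pandas as pd

# Path copied from the file browser after uploading to session storage
df = pd.read_csv('/content/mydata.csv')
df.head()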
Many times, we prefer to have all our data in a GitHub repository or in a Google Drive folder and fetch it from there.
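For the GitHub case, cloning the repository also preserves the folder structure. A sketch, assuming a hypothetical repository URL:
# Clone the repository into the Colab VM; the folder hierarchy is kept as-is
!git clone https://github.com/your-username/your-repo.git
%cd your-repo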
Reading many files from Google Drive through colab is going to be less performant and more unreliable than first copying a .zip or similar single file from Drive to the colab VM and unzipping it outside the drive mount directory, and then using that copy of the data.
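A sketch of that copy-then-unzip pattern (assuming a hypothetical dataset.zip stored at the top level of your Drive):
from google.colab import drive
drive.mount('/content/drive')

# Copy the single archive from Drive to the VM's local disk, then unzip it
# outside the mount directory so later reads are fast local reads
!cp "/content/drive/My Drive/dataset.zip" /content/
!unzip -q /content/dataset.zip -d /content/dataset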