Is there any way to load a local dataset folder directly from Google Drive into Google Colab? - tensorflow

I couldn't load a custom data folder from Google Drive into Google Colab, even though I mounted Google Drive. Instead of the MNIST dataset, I want to load my own image dataset folder. I have tried the PyDrive wrapper, but I need a simpler solution.
Suppose I have a dataset of images inside Google Drive. How do I load it into Google Colab?
from google.colab import drive
drive.mount('/content/gdrive')
then
with open('/content/gdrive/My Drive/foo.txt', 'w') as f:
    f.write('Hello Google Drive!')
!cat /content/gdrive/My\ Drive/foo.txt
Here, instead of foo.txt, I have an image folder called Dog inside an ml-data folder, but I can't load it. How do I load it in Google Colab directly from Google Drive, the same way I would from my local hard drive?

To load data directly from the local machine, you need to follow these steps:
Go to files [left side menu]
Click on upload to session storage
Select file(s) from your machine to upload
A prompt will indicate that the file(s) will be available for the current session only; click OK.
The file(s) will be uploaded to the directory. Click on a file (left-click and right-click both work the same).
And then:
Copy the path and use it inside the pd.read_csv() function (see the example after the note below).
Note: After the session terminates, the files are removed from the Colab session. To use them again, you'll need to re-upload them.
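For example, a minimal sketch assuming a hypothetical uploaded file named data.csv (substitute the path you copied from the file browser):
import pandas as pd
# Files uploaded to session storage land under /content by default.
# 'data.csv' is a placeholder name; paste your copied path here instead.
df = pd.read_csv('/content/data.csv')
print(df.head())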
Often, we prefer to keep all our data in a GitHub repository or a Google Drive folder and fetch it from there.

Reading many files from Google Drive through Colab is slower and less reliable than first copying a .zip (or similar single archive) from Drive to the Colab VM, unzipping it outside the Drive mount directory, and then using that local copy of the data.
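A minimal sketch of that approach, assuming a hypothetical archive at My Drive/ml-data/dog-images.zip (adjust the path to your own Drive layout):
from google.colab import drive
drive.mount('/content/gdrive')
# Copy the single archive from the Drive mount to the VM's local disk.
!cp '/content/gdrive/My Drive/ml-data/dog-images.zip' /content/
# Unzip outside the Drive mount so subsequent reads hit fast local storage.
!unzip -q /content/dog-images.zip -d /content/dog-images
Training code can then read the many small files under /content/dog-images at local-disk speed instead of going through the Drive mount for each file.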

Related

How to make the contents of two Colab files the same

I am learning Python programming using Google Colab. When I open two Google Colab files from the same account, the folders in the content drive are different.
I wanted to know how to make the Content folders of the two Google Colab files the same.
Thanks
Each Google Colab session allocates different resources from Google Cloud, and you cannot combine them. Instead, you can connect the notebook to your Google Drive account, where you can create a folder and manage its contents.
Step 1: Connect to Google Drive:
from google.colab import drive
drive.mount('/content/drive')
Step 2: Create a folder:
%cd "/content/drive/My Drive"
!mkdir mydirectory
Step 3: Move into the newly created directory and start working there:
%cd mydirectory
(Note: %cd is used rather than !cd, because each ! command runs in its own subshell, so !cd would not change the notebook's working directory.)
After this, any OS-related operations (create, update, delete files, etc.) you perform will take place in this directory.
You can also create a Jupyter Notebook within this directory and work with that notebook, although that is not strictly necessary.
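As an illustration, a minimal sketch of sharing data through that folder, where mydirectory is the example folder from the steps above and shared.txt is a placeholder filename:
import os
# Write a file into the shared Drive folder from one session...
with open('/content/drive/My Drive/mydirectory/shared.txt', 'w') as f:
    f.write('visible to any session that mounts this Drive')
# ...then any other notebook that mounts the same Drive can list and read it.
print(os.listdir('/content/drive/My Drive/mydirectory'))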

Upload a folder of files to Google Colab

Google Colab has no direct access to local folders. Running local scripts requires uploading the files to Colab first. I have a collection of folders and files to upload. I was wondering if there is an efficient way to do so without disturbing the file organization, something like !git clone ...?
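One common way to keep the hierarchy intact is to upload a single archive and extract it in Colab; a minimal sketch, assuming the local folder tree was zipped as project.zip (a placeholder name):
from google.colab import files
# Opens a browser upload dialog; select the zip created on your machine.
uploaded = files.upload()
# Extract next to the notebook, preserving the original directory structure.
!unzip -q project.zip -d /content/
For large or frequently reused folders, uploading the zip to Google Drive once and mounting the Drive (as in the answers above) avoids re-uploading every session.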

.ipynb file in Google Drive doesn't open with Colaboratory by default

I have two files in my Google Drive with the .ipynb extension, but one of them is marked as an Unknown File type and has a blue icon. What can I do to change the second one to a Colaboratory file?
If you want to get that icon, you should save a copy from Colab:
Upload your file to Drive
Open the file with Colab
Save a copy in Drive (File > Save a copy in Drive)
Go back to Drive and find the folder
Inside it you will find your file with the Colab icon

How to access a CSV file saved in one Colab project from another Colab project

Help regarding a Colab project.
I am working on Walmart sales prediction and have created the required dataframe in a Colab file. To predict future sales I need to access that file, but if I do it in the same Colab file the RAM crashes because of its size, so I am thinking of accessing it from another Colab file. How do I do that? I don't want to download the file and re-upload it to the other Colab project.
df.to_csv('wm_sales.csv')
I have saved the file and it shows up in my folder in Colab; now I want to use it in another project.
You can simply mount your Google Drive and save the file to a directory there.
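A minimal sketch, using the wm_sales.csv filename from the question:
from google.colab import drive
drive.mount('/content/drive')
# In the first notebook: write the CSV to Drive rather than session storage.
df.to_csv('/content/drive/My Drive/wm_sales.csv')
In the second notebook, mount Drive the same way and read it back with pd.read_csv('/content/drive/My Drive/wm_sales.csv'); nothing needs to be downloaded or re-uploaded.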

How to upload files to the current working directory in a Google Colab notebook?

I uploaded a Jupyter notebook to Google Colab. The notebook accessed a couple of images from the local path /images/pic1.png. How do I upload data to the directory in Colab where the notebook is running? Note that these are temporary files, so I don't mind them being deleted when the session terminates.
I used the Files upload feature, but it doesn't seem to upload to Google Drive. I want to upload the folder with the same hierarchy as in the local environment, without needing any changes to the file paths in the code.
The better way I've found is to mount your Google Drive. Upload the image folder to your Drive; after that, in your notebook you write:
from google.colab import drive
drive.mount('/content/drive')
After you allow the authentication, you can access your image folder using the prefix "/content/drive/My Drive", like this:
from IPython.display import Image
Image("/content/drive/My Drive/images/pic1.png")
See more in the I/O documentation notebook: https://colab.research.google.com/notebooks/io.ipynb