When I import .py files placed in the same folder in Google Colab, I get:
ModuleNotFoundError: No module named filename
This happens even though the 'filename' file is in the same folder and is imported with:
from filename import *
Thanks in advance
Your file layout in Drive is distinct from the file layout in Colab.
In order to use Drive files in Colab, you'll need to mount your Drive on the Colab backend using the following snippet:
from google.colab import drive
drive.mount('/content/drive')
Then, if you have a file like mylib.py in your Drive, you'll want to %cd into the folder that contains it (e.g. /content/drive/MyDrive) to change your working directory. After that, you can import mylib.
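Putting these steps together (a minimal sketch, assuming mylib.py sits at the top level of your Drive):
from google.colab import drive
drive.mount('/content/drive')
# change into the folder that holds mylib.py
%cd /content/drive/MyDrive
import mylib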
Here's a complete example:
https://colab.research.google.com/drive/12qC2abKAIAlUM_jNAokGlooKY-idbSxi
I have a shared link from Google Drive and I need to download the file into my Colab notebook.
I used this code, but it's not working:
!gdown https://drive.google.com/file/d/1DAQsttkUzc8iPhQMeLNqssuTpzryCDL1/view?usp=sharing
This is the file:
lian-v20.pdf
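As an aside, gdown generally can't resolve the full .../view?usp=sharing URL; it expects the bare file ID or a uc?id= URL. A sketch of the usual fix, reusing the ID from the link above:
!gdown "https://drive.google.com/uc?id=1DAQsttkUzc8iPhQMeLNqssuTpzryCDL1"
Recent gdown versions can also extract the ID themselves via the --fuzzy flag:
!gdown --fuzzy "https://drive.google.com/file/d/1DAQsttkUzc8iPhQMeLNqssuTpzryCDL1/view?usp=sharing"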
I believe the zip file is already placed in your Google Drive.
Now, if you want to access a zip file from Google Drive in Google Colab, you have to mount your Google Drive in Colab. Once you are done with mounting, write this piece of code in your Colab notebook:
from zipfile import ZipFile
# path where your zip file is found
with ZipFile('/content/drive/MyDrive/SHISHU==folder/Start/HELLO_TEST.zip', 'r') as zipObj:
    # path where you want to extract and save your zip file
    zipObj.extractall('/content/drive/MyDrive/TRAIN_SET')
Let's say I have a folder on GDrive with the path /content/drive/MyDrive/MyFolder. I know I can access the contents of the folder from Google Colab after mounting Drive. But is it also possible to access the ID/URL of this folder using Colab, and if so, how?
You can use kora to get the ID from a file path.
!pip install kora
from kora.xattr import get_id
fid = get_id('/content/drive/MyDrive/Colab Notebooks')
# 0B0l6No313QAhRGVwY0FtQ3l1ckk
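If you need the URL rather than just the ID, the ID can be dropped into the standard Drive folder URL pattern (a small sketch building on the snippet above):
from kora.xattr import get_id
fid = get_id('/content/drive/MyDrive/MyFolder')
# standard Drive folder URL format
url = f'https://drive.google.com/drive/folders/{fid}'
print(url)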
I want to run deep learning models in Google Colab, which has powerful GPU support. I am quite new to Colab. Originally I thought I could upload the images using os.path.join and PIL.Image.open as I did in the Spyder environment, but Colab gives a FileNotFoundError: [Errno 2] No such file or directory (the images do exist in a local directory). It seems I did not do the upload correctly.
You can upload directly to Colab, or upload the images to a Google Drive folder and then mount that folder in Colab to use it. On the left there is an arrow that opens a sidebar containing searchable code snippets; search for "drive" there to get Google Drive related snippets.
This notebook contains some examples,
https://colab.research.google.com/notebooks/snippets/drive.ipynb#scrollTo=u22w3BFiOveA
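Once Drive is mounted, the same os.path.join and PIL.Image.open code from your local environment works, as long as the paths point into the mount. A minimal sketch (the folder and file names below are placeholders):
import os
from PIL import Image
from google.colab import drive
drive.mount('/content/gdrive')
# placeholder folder: replace with the Drive folder that holds your images
img_dir = '/content/gdrive/My Drive/images'
img = Image.open(os.path.join(img_dir, 'example.png'))  # placeholder file name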
The command below lists the contents of a folder named DeepLearning,
!ls -la "/content/gdrive/My Drive/DeepLearning"
Copy the contents of the Drive DeepLearning folder to a DeepLearning folder on the virtual machine (note the -r flag, which cp needs in order to copy a directory),
!cp -r /content/gdrive/My\ Drive/DeepLearning ./DeepLearning
Copy the contents of the virtual machine's DeepLearning folder back to the Google Drive DeepLearning folder,
!cp -r ./DeepLearning /content/gdrive/My\ Drive/DeepLearning
You can run %cd DeepLearning to change directory to DeepLearning folder.
GPU support can be enabled by Runtime > Change Runtime Type > Hardware Accelerator > GPU.
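After switching the runtime, a quick way to confirm a GPU is attached (nvidia-smi ships with Colab's GPU runtimes):
!nvidia-smi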
I uploaded a Jupyter notebook to Google Colab. The notebook accessed a couple of images from the local path /images/pic1.png. How do I upload data to the directory in Colab where the notebook is running? Please note that these are temporary files, so I don't mind them being deleted when the session terminates.
I used the Files upload feature, but it doesn't seem to upload to Google Drive. I want to upload the folder with the same hierarchy as in the local environment, without needing to change any of the file paths in the code.
The best way I've found is mounting a folder from your Google Drive. Upload the image folder to your Drive. After that, write the following in your notebook:
from google.colab import drive
drive.mount('/content/drive')
After you allow the authentication, you can work with your image folder under the path "/content/drive/My Drive", like this:
from IPython.display import Image
Image("/content/drive/My Drive/images/pic1.png")
See more in the notebook with documentation: https://colab.research.google.com/notebooks/io.ipynb
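Since the notebook expects the relative path images/pic1.png, you can also copy the folder from Drive into the notebook's working directory after mounting, so the original paths keep working (a sketch, assuming the folder is named images at the top level of your Drive):
# copy the Drive folder next to the notebook so relative paths resolve
!cp -r "/content/drive/My Drive/images" /content/images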
I have my dataset on my local device. Is there any way to upload this dataset to Google Colab directly?
Note:
I tried this code:
from google.colab import files
uploaded = files.upload()
But it uploads file by file. I want to upload the whole dataset at once.
Here's the workflow I used to upload a zip file and create a local data directory:
Zip the data directory locally. Something like: $ zip -r data.zip data
Upload the zip file of your data directory to Colab using their (Google's) instructions:
from google.colab import files
uploaded = files.upload()
Once the zip file is uploaded, perform the following operations:
import zipfile
import io
zf = zipfile.ZipFile(io.BytesIO(uploaded['data.zip']), "r")
zf.extractall()
Your data should now be in Colab's working directory, under a 'data' directory.
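To confirm the extraction worked, you can list the new directory:
import os
print(os.listdir('data'))  # should show the files from your zipped data directory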
Zip or tar the files first, and then use tarfile or zipfile to unpack them.
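For the tar route, a minimal sketch with the standard-library tarfile module (data.tar.gz is a placeholder name for your uploaded archive):
import tarfile
# unpack the uploaded archive into the current working directory
with tarfile.open('data.tar.gz', 'r:gz') as tf:
    tf.extractall()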
Another way is to store the whole dataset in a NumPy object and upload it to Drive, where you can easily retrieve it. (Zipping and unzipping is also fine, but I had difficulty with it.)
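A minimal sketch of the NumPy approach, assuming the dataset fits into a single array (images is a placeholder name):
import numpy as np
# locally: pack the dataset into one array and save it
np.save('dataset.npy', images)
# in Colab, after uploading dataset.npy to Drive and mounting it:
data = np.load('/content/drive/MyDrive/dataset.npy')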