I followed the steps given in a Medium tutorial on Google Colab and then tried to clone a git repository, but I cannot see the repository anywhere in my Drive.
The code snippet I used is exactly the same as the one from the Medium tutorial:
To mount Google Drive in Google Colab, just use these commands:
from google.colab import drive
drive.mount('/content/drive')
This requires an authentication step: open the link that is printed, authorize access, and paste the authorization code back into the cell.
Your Google Drive is now mounted at the /content/drive/ directory of your Colab machine.
To clone a git repository, first change your current directory to a path inside your Drive, then clone the repository, like below:
path_clone = "/content/drive/MyDrive/my_project"  # this folder must already exist in your Drive
%cd {path_clone}
!git clone <Git project URL address>
Now the cloned Git project will be in the my_project folder in your Google Drive (which is also visible to your Google Colab runtime machine).
Adjust the path to wherever you want the repository cloned.
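If the repository still does not show up in Drive, a quick sanity check is to list the target folder right after cloning (a minimal sketch, assuming the same my_project path as above):
# list the Drive folder to confirm the clone actually landed inside your Drive
!ls -la /content/drive/MyDrive/my_project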
Try changing the current working directory to the drive folder:
import os
os.chdir("drive")
Just do cd drive (without !) or %cd drive.
See cd vs !cd vs %cd in IPython.
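The difference matters because each ! command runs in its own throwaway subshell, so !cd does not change the notebook's working directory, while %cd does. A small sketch to see this:
!cd /content/drive && pwd   # changes directory only inside this one-off subshell
!pwd                        # the notebook's working directory is unchanged
%cd /content/drive          # changes the working directory of the notebook process itself
!pwd                        # now prints /content/drive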
I am learning Python programming using Google Colab. When I open two Google Colab files from the same account, the folders in the content drive are different.
I want to know how I can make the content folders of the two Google Colab files the same.
Thanks
Each Google Colab session allocates different resources from Google Cloud, and you cannot combine them. Instead, you can connect each notebook to your Google Drive account, where you can create a folder and manage its contents.
Step 1: Connect to Google Drive:
from google.colab import drive
drive.mount('/content/drive')
Step 2: Create a folder:
%cd "/content/drive/My Drive"
!mkdir -p mydirectory
Step 3: Move to the newly created directory and start working there:
%cd mydirectory
After this, any OS-related operations (create, update, delete files, etc.) you perform will take place in this directory.
You can also create a Jupyter Notebook within this directory and work with that notebook, although that is not strictly necessary.
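A minimal sketch of the whole flow, assuming the mydirectory name from the steps above (the file name shared_notes.txt is just a placeholder); anything written here is stored in Drive, so it is visible from any other Colab notebook that mounts the same account:
from google.colab import drive
drive.mount('/content/drive')

%cd "/content/drive/My Drive"
!mkdir -p mydirectory
%cd mydirectory

# write a file; it lives in Drive, not in the temporary Colab VM
with open("shared_notes.txt", "w") as f:
    f.write("visible from every notebook that mounts this Drive\n")

!ls -la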
Let's say I have a folder on GDrive with the path /content/drive/MyDrive/MyFolder. I know I can access the contents of the folder from Google Colab after mounting Drive. But is it also possible to access the ID/URL of this folder using Colab, and if so, how?
You can use kora to get ID from a file path.
!pip install kora
from kora.xattr import get_id
fid = get_id('/content/drive/MyDrive/Colab Notebooks')
# 0B0l6No313QAhRGVwY0FtQ3l1ckk
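Once you have the ID you can also build the folder's Drive URL from it, since a folder link is just the standard prefix followed by the ID:
# standard Drive folder link format: https://drive.google.com/drive/folders/<ID>
folder_url = "https://drive.google.com/drive/folders/" + fid
print(folder_url)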
I want to run deep learning models in Google Colab, which has powerful GPU support. I am quite new to Colab. Originally I thought I could load the images using os.path.join and PIL.Image.open as I did in the Spyder environment, but Colab gives a FileNotFoundError: [Errno 2] No such file or directory (the images do exist in a local directory). It seems I did not upload them correctly.
You can upload files directly to Colab, or upload the images to a Google Drive folder and then mount that folder in Colab to use it. On the left there is an arrow that opens a sidebar containing searchable code snippets; search for drive to get Google Drive related snippets.
This notebook contains some examples:
https://colab.research.google.com/notebooks/snippets/drive.ipynb#scrollTo=u22w3BFiOveA
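As a minimal sketch of the loading step once Drive is mounted (assuming the images sit in a Drive folder named DeepLearning, as in the commands below; cat.jpg is just a placeholder file name):
import os
from PIL import Image
from google.colab import drive

drive.mount('/content/gdrive')

# build the path inside the mounted Drive, then open the image exactly as before
img_dir = "/content/gdrive/My Drive/DeepLearning"
img = Image.open(os.path.join(img_dir, "cat.jpg"))  # placeholder file name
print(img.size)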
The command below lists the contents of a folder named DeepLearning:
!ls -la "/content/gdrive/My Drive/DeepLearning"
Copy the contents of the Drive DeepLearning folder to a DeepLearning folder on the virtual machine:
!cp -r /content/gdrive/My\ Drive/DeepLearning ./DeepLearning
Copy the contents of the virtual machine's DeepLearning folder to the Google Drive DeepLearning folder:
!cp -r ./DeepLearning /content/gdrive/My\ Drive/DeepLearning
You can run %cd DeepLearning to change the working directory to the DeepLearning folder.
GPU support can be enabled by Runtime > Change Runtime Type > Hardware Accelerator > GPU.
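After enabling the GPU you can confirm that the runtime actually sees it (a quick check using TensorFlow, which Colab ships with, plus nvidia-smi):
import tensorflow as tf

# should list at least one GPU device when the accelerator is enabled
print(tf.config.list_physical_devices('GPU'))

!nvidia-smi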
I trained my convolutional neural network, implemented in TensorFlow, on Google Cloud, but now how do I export the model from Google Cloud Storage to my PC?
I want to download the trained model to use it for making predictions.
I have it like this
If your files are already in a bucket, you can download them from the web interface by clicking them one by one and saving them to your hard drive, or you can use
gsutil cp "gs://adeepdetectortraining-mlengine/prueba_27_BASIC_GPU/*" .
See the gsutil cp documentation for details.
To download all the files and folders inside a bucket recursively, I suggest you use the gsutil command:
gsutil cp -r gs://<BUCKET_NAME>/<FOLDER> .
The above command downloads the files and folders inside "FOLDER" recursively into the current path on the local machine; the recursion is enabled by the "-r" option.
You can also specify a destination folder, following the gsutil cp documentation:
gsutil cp [OPTIONS] src_url dst_url
Where:
src_url is the path for your bucket.
dst_url is the path for your local machine.
Note that the above commands should be executed on your local machine with the Cloud SDK installed and configured; this lets you copy the files from the bucket to your local machine.
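For example, a concrete invocation might look like this (the bucket and folder names are placeholders for your own):
gsutil cp -r gs://my-training-bucket/exported_model ./exported_model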
I have a Python project where the main .py file imports other .py files, and the main file is then run with different input arguments using a .sh file. Can anyone guide me on how to transfer and run the entire project on Google Colab?
Commit your files to a git repo.
In a new Google Colab notebook, run
!git clone <http url of your repo>
Then execute your program
!python main.py --foo bar
Alternatively, you could upload your project to Google Drive, mount Drive in the notebook, and run the project from there.
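A minimal sketch of that Drive route, assuming the project was uploaded to a Drive folder named my_project and that its driver script is called run_experiments.sh (both placeholder names):
from google.colab import drive
drive.mount('/content/drive')

# change into the uploaded project folder and run the shell driver script
%cd "/content/drive/MyDrive/my_project"
!bash run_experiments.sh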