I want to train YOLO on my own data and am using Google Colab, but I am unable to edit the .cfg files stored in the /content folder. One solution I can think of is to mount Google Drive, install darknet on the drive, and run the program with Drive as my working directory.
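A minimal sketch of that idea, assuming darknet has already been cloned into a Drive folder (the folder name and the .cfg edit below are only placeholders):
from google.colab import drive
drive.mount('/content/drive')

# keep darknet (and its .cfg files) on Drive so they stay editable and persist across sessions
%cd /content/drive/My Drive/darknet
!sed -i 's/classes=80/classes=2/' cfg/yolov3.cfg   # example of editing a .cfg value in place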
I am working on a deep learning architecture that uses shared library files built on my local system. My computer does not have enough space to run the code, so I want to run it in Colab. However, Colab does not show the shared library files that I am trying to build there.
You need to mount your Google Drive and copy the files from the Colab working directory into it:
!mv /content/filename /content/gdrive/My\ Drive/
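If Drive is not mounted yet, the full sequence looks roughly like this (libmylib.so is only a placeholder for your shared library file):
from google.colab import drive
drive.mount('/content/gdrive')

# copy the built shared library from the Colab working directory into Drive so it survives the session
!cp /content/libmylib.so /content/gdrive/My\ Drive/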
I cannot successfully install miniconda, so I decided to work in Google Colab. But my local repository is in Google Drive, and all of the Jupyter notebooks that I want to work on are stored there. I want to work on them in Google Colab. However, I cannot open any files other than the ones stored in the Colab Notebooks folder of Google Drive. What should I do?
What works for me is opening the notebook from Google Drive, not from Colab.
Follow these steps:
Open Google Drive in your browser, go to the notebook (ipynb) file in your repo (sub)folder;
Right-click the file and select 'Open with Google Colaboratory', now Colab opens with the notebook;
For this you might have to install the 'Open in Colab' Chrome extension;
Mount Google Drive from Colab (if not mounted yet), see code snippet;
Change directory to the folder where you want to run your notebook, see code snippet;
Now I can run my notebook.
Mount drive:
from google.colab import drive
drive.mount('/content/drive')
Change directory to your repository folder:
%cd drive/My Drive/ColabNotebooks/(YOUR_REPO_PATH)
To open and work on a notebook stored in your Google Drive, open it from the Google Drive UI in Chrome. From the Colab UI menu -> Open notebook, you can only open notebooks in the Colab Notebooks folder.
An easy way would be to save all your notebooks in the Colab Notebooks folder.
First you need to mount Google Drive:
from google.colab import drive
drive.mount('/content/gdrive')
Click the link shown in Colab and authenticate (copy the authorization code and paste it into Colab) to mount and access your Drive.
You can access your folder as shown below:
import os

path_to_data = '/content/gdrive/My Drive/MyFolder_where_data_stored'
Unique_Labels_List = os.listdir(path_to_data)  # list the entries (e.g. class folders) in that directory
After training the model, save it, zip the folder, and download it to your local machine:
from google.colab import files

!zip -r ./MyFolder.zip /content/MyFolder
files.download("/content/MyFolder.zip")
There are many other things you can do. Check this resource for some more commands, and you can find many more by searching on Google. Thanks!
I have to download a really large file (140 GB) from Colab, which I created in Colab. I tried downloading it manually by navigating to the Files tab, but it takes a long time and keeps failing. Is there any other way, or direct code, through which I can download it to my PC?
140 GB may be too big for Google Drive, so you might try copying it to a GCS bucket first, then downloading from there.
Authenticate using auth.authenticate_user() first, then copy the file to the bucket:
!gsutil cp file.zip gs://your-bucket-name/
Then, download it through the console.
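Putting the pieces together, a minimal sketch (the bucket must already exist in a project your account can write to, and file.zip is a placeholder):
from google.colab import auth
auth.authenticate_user()   # opens an authentication prompt for your Google account

!gsutil cp /content/file.zip gs://your-bucket-name/   # copy the large file into the bucket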
I have a large file in my Google Colab workspace. I want to download it to my local machine using the command files.download(<name of the file>). Will this use my internet connection or Colab's?
Both. Files downloaded from Colab will use the Internet connection of the client machine to receive the data, which will have been sent using the Internet connection of the Colab backend.
To inspect the file size, hover over the file in the file browser; the size (for example, 294 kilobytes) is shown there.
FYI, you can download by right clicking on the file and selecting download, which is a bit simpler than executing the code snippet in your question.
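For reference, the call from the question looks like this (results.zip is only a placeholder name):
from google.colab import files

# data is sent over the Colab backend's connection and received over yours
files.download('/content/results.zip')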
I have a Python file available under some URL, for example
https://gist.githubusercontent.com/messa/d19ad7fd4dc0f95df9caf984caef127c/raw/4d0daebdcfcf16ea3b7914ee6186bd98dbfb3c20/demo.py
In reality it will not be a gist but some courseware / homework-review software.
How can I open such a URL using Google Colab so that I can, for example, run the Python code?
I know I can build colab URL for Github gists or repositories, but can I do it for any arbitrary URL?
There is currently no way to do what you are asking for – to construct a URL that will cause Colab to automatically load the contents of a .py file at a particular URL into a new Colab notebook.
The closest thing to this is to host a notebook on GitHub and then use a Colab URL to open it, e.g.
http://github.com/username/repository/path/to/notebook.ipynb can be opened in Colab using http://colab.research.google.com/github/username/repository/path/to/notebook.ipynb
http://gist.github.com/username/hash/filename.ipynb can be opened in Colab using http://colab.research.google.com/gist/username/hash/filename.ipynb