I have to download a really large file (140 GB) that I created in Colab. I tried downloading it manually from the Files tab, but it takes a long time and keeps failing. Is there another way, or direct code, to download it to my PC?
140 GB may be too big for Google Drive, so you might try copying it to a Google Cloud Storage (GCS) bucket first and downloading it from there.
Authenticate with auth.authenticate_user() first, then run:
!gsutil cp file.zip gs://your-bucket-name/
Afterwards, download it through the Cloud Console.
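A minimal sketch of the whole flow in one Colab cell, assuming placeholder names your-project-id and your-bucket-name:

from google.colab import auth

# Authenticate this Colab session against your Google account
auth.authenticate_user()

# Point gcloud/gsutil at your GCP project (placeholder ID)
!gcloud config set project your-project-id

# Copy the file into the bucket; the composite-upload threshold splits
# a large file into parallel parts, which helps with a 140 GB upload
!gsutil -o GSUtil:parallel_composite_upload_threshold=150M cp /content/file.zip gs://your-bucket-name/

On your PC, with the Cloud SDK installed, gsutil cp gs://your-bucket-name/file.zip . is resumable, which tends to survive a transfer this size better than a browser download.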
I am working on a deep learning architecture that uses shared library files built on my local system. My computer does not have enough space to run the code, so I want to run it in Colab. However, Colab does not show the shared library files that I am trying to build there.
You need to mount your Google Drive and copy the files out of the Colab working directory:
!mv /content/filename /content/gdrive/My\ Drive/
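A minimal sketch, assuming the file lives at /content/filename (a placeholder):

from google.colab import drive

# Mount Google Drive at /content/gdrive (an authorization prompt appears)
drive.mount('/content/gdrive')

# Move the file into Drive so it persists after the Colab VM is recycled
!mv /content/filename /content/gdrive/My\ Drive/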
I have a large file in my Google Colab Workspace. I want to download it to my local machine using the command files.download(name of the file). Will this use my internet or Colab's?
Both. Files downloaded from Colab will use the Internet connection of the client machine to receive the data, which will have been sent using the Internet connection of the Colab backend.
To inspect the file size, hover over the file in the file browser; a tooltip will show the size (for example, 294 kilobytes).
FYI, you can download by right clicking on the file and selecting download, which is a bit simpler than executing the code snippet in your question.
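For reference, a minimal cell using files.download (the path is a placeholder):

from google.colab import files

# Streams the file from the Colab backend and triggers a browser
# download on the client machine, so both connections are used
files.download('/content/results.csv')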
When I try to read a CSV file using
read.csv(file="/Users/User1/Documents/file.csv", header=T)
R says the file path does not exist, although the same command works with the same file on a different computer.
So I tried read.csv(file.choose()), but that just freezes my console; it, too, works on the other computer with the same file.
I uninstalled and reinstalled the program and even tried an older version of R, but the problem persists.
I cannot change my working directory either. setwd() also freezes the console.
Has anyone had this same problem?
In my experience, I have faced the same problem with large datasets. Loading a big file also puts pressure on the machine, so check your RAM, storage, and processor. I would recommend working in Python instead; otherwise, you can try the h2o server. Search for how to do the analysis using the h2o server in R; that may also work.
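If you switch to Python, a minimal sketch of reading a large CSV in chunks with pandas (the path is taken from the question; the chunk size is arbitrary):

import pandas as pd

# Read 100,000 rows at a time instead of the whole file at once,
# keeping memory usage roughly constant
total_rows = 0
for chunk in pd.read_csv('/Users/User1/Documents/file.csv', chunksize=100_000):
    total_rows += len(chunk)  # replace with your own per-chunk processing

print('rows processed:', total_rows)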
I want to train YOLO on my own data using Google Colab, but I am unable to edit the .cfg files stored in the /content folder. One solution I can think of is to mount Google Drive, install darknet there, and run the program with Drive as my working directory.
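A minimal sketch of that approach, assuming the AlexeyAB/darknet fork (an assumption) and an illustrative .cfg edit:

import os
from google.colab import drive

# Mount Drive and work from a folder that persists across sessions
drive.mount('/content/gdrive')
os.chdir('/content/gdrive/My Drive')

# Clone darknet into Drive, then edit the .cfg in place; the exact
# key/value below is only an example of such an edit
!git clone https://github.com/AlexeyAB/darknet
!sed -i 's/batch=1/batch=64/' darknet/cfg/yolov3.cfg

With the repository in Drive, the .cfg files can also be edited directly in the Drive web UI, and the changes survive Colab restarts.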
I have crouton running on a Chromebook 11 with Ubuntu Precise in it. I am looking for a way to sync files in some folders in the Ubuntu chroot with Google Drive. I am thinking I can create a link between the mounted chroot partition and a synced folder in Chrome OS, but I can't find where the synced Google Drive folder is in Chrome OS.
Could anyone please help?
Bottom line: I want to sync files I create in the chroot to an online service, Google Drive or Dropbox, whatever works.
Thank you in advance
SUMMARY
The best way to sync files (code, in my case) is to use git and Bitbucket/GitHub. Install git in the chroot, and sync the code into the Downloads folder so it can be accessed from both Chrome OS and the chroot (and compiled and run locally in both environments). I stored all other files in Google Drive so they could be accessed from Chrome OS, other computers, and the chroot.
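A minimal sketch of that workflow inside the chroot (the repository URL is a placeholder; crouton shares ~/Downloads with Chrome OS):

sudo apt-get install git
cd ~/Downloads
git clone https://bitbucket.org/yourname/yourproject.git
cd yourproject
# after editing: stage, commit, and push to sync
git add -A
git commit -m "sync changes"
git push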
Here are the options I looked into but did not end up using:
Accessing the Google Drive cache directly on the Chromebook
Google Drive in Chrome OS stores the file data in
/home/chronos/user/GCache/v1/files
However, the files are not named as they are in Google Drive (they are named by some UUID, with metadata stored in another folder).
Third-party tools to mount a Google Drive folder in Linux
https://github.com/dsoprea/GDriveFS
I was able to get GDriveFS working, but it was slow, and chmod does not work in its file system. The permissions are all static at 666, so programs will not execute.
https://github.com/astrada/google-drive-ocamlfuse/
I was not able to get ocamlfuse working on an ARM Chromebook.
https://github.com/Grive/grive
I was able to get Grive working, but it has some problems and hasn't been updated in over a year. I would not recommend it if there is a chance of merge conflicts.