The simplest recommended way of reading files from a Drive folder, which also provides the simplest API ―
from google.colab import drive
drive.mount('/content/gdrive')
― only allows access to your own drive's files, not to files/folders shared with or by you in Google Drive, which makes it a poor fit for shared notebooks (or at least for shared notebooks that need to read data from shared Google Drive folders).
Is it possible to access Google Drive folders shared by/with you using the same API, or using any other approach that relies on Google Drive sharing permissions rather than "anyone with the link can access" hyperlinks?
I believe all other approaches require hard-coding file IDs, whereas the API mentioned above can access Drive files by name, which can be simpler to maintain.
Can you use the same API for shared Google Drive files?
A simple solution for shared files is to add them to your Drive: right-click the file in the 'Shared with me' list in Drive and select 'Add to Drive'. Afterward, the file will appear under your mount point, e.g. /content/gdrive/My Drive/
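After that, the file can be read by name through the normal mount, just like your own files. A minimal sketch, where shared_data.csv stands in for whatever file was shared with you (a hypothetical name):

from google.colab import drive
drive.mount('/content/gdrive')

# Once added via 'Add to Drive', the shared file shows up
# under My Drive like any other file.
with open('/content/gdrive/My Drive/shared_data.csv') as f:  # hypothetical name
    print(f.readline())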
Related
I use Google Colab extensively. To get easy access to files in my Google Drive, I mount the drive into the file system of the virtual machine that runs Colab, like this:
from google.colab import drive as cdrive
cdrive.mount('/content/gdrive')
%cd "/content/gdrive/My Drive/Colab Notebooks/my_directory"
At the beginning of each session, I need to grant permission to access my drive. To do that, I have to press 'Allow', copy a one-time password, and paste it into a dedicated text area. It's a bit tedious.
Is there a better way? Can I grant a permanent permission tied to my machine? Any other ideas?
At this moment, you can also mount Google Drive with the Mount Drive button in the left sidebar.
Confirm the 'Access Google Drive' prompt.
Then the drive will be mounted into your Colab notebook.
I need to understand whether there is a way to do incremental uploads to Google Colaboratory.
I was trying to upload a huge number of image files to Google Colaboratory when my Internet connection failed and I had to start over. I observed that the images which had already been uploaded were now duplicated.
Is there any way to upload only the missing files? This would save time and space.
I suggest not uploading them directly to Colab, because there is no solution to this problem there: you would have to manually re-select the files that were not uploaded yet. Instead, use the google.colab package to handle this. Upload everything you need to your Google Drive, then mount it:
from google.colab import drive
drive.mount('/content/gdrive')
This way, you only need to log in to your Google account through the Google authentication API, and you can then use the files/folders as if they had been uploaded to Colab. Connection errors also become manageable: since the upload goes to Google Drive, you can choose between overwriting existing files or simply skipping them.
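If you then need the files on the Colab VM itself (e.g. for faster local reads), copying from the mount while skipping files that already exist gives you the incremental behaviour you were after. A minimal sketch, assuming the images sit under a hypothetical My Drive/images folder:

import os
import shutil

src_dir = '/content/gdrive/My Drive/images'  # hypothetical Drive folder
dst_dir = '/content/images'
os.makedirs(dst_dir, exist_ok=True)

for name in os.listdir(src_dir):
    dst = os.path.join(dst_dir, name)
    if not os.path.exists(dst):  # skip files copied in a previous attempt
        shutil.copy(os.path.join(src_dir, name), dst)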
We are sharing a Google Drive folder where we keep our Colab notebooks. Now we need to upload some text files permanently for the notebooks to use; I do not want to upload the files every time I open Colab. From what I have found, I have to upload the files to Google Drive and mount it in Colab somehow.
So, when I mount Google Drive in Colab, can my teammates access all my files in it, or only the shared folder? If not, is there a way to expose only a single Drive folder or file in Colab?
If you share a folder with your teammates in Google Drive, that folder will appear in each of their Drive mounts in Colab. Each person running code in a notebook (even if they share the notebook) gets their own VM, so one person can never see another person's Drive mount.
An alternative to sharing a data-file folder in Drive is to upload your data to Google Cloud Storage (GCS) and have your notebook fetch it from there (example).
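For instance, fetching an object from a bucket you control might look like this (a minimal sketch; my-project, my-shared-bucket, and the object path are hypothetical names):

from google.colab import auth
auth.authenticate_user()  # grants the notebook access to your GCP credentials

from google.cloud import storage

client = storage.Client(project='my-project')  # hypothetical project ID
bucket = client.bucket('my-shared-bucket')     # hypothetical bucket name
blob = bucket.blob('datasets/train.csv')       # hypothetical object path
blob.download_to_filename('/content/train.csv')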
Colaboratory allows mounting Google Drive and using data from Drive, but I have massive datasets (including images) on my local system that would take a long time and a huge amount of Drive space to upload.
So I am looking for something similar, but where I mount my local system's drive instead.
One option is to run Jupyter locally and connect Colab to it: you keep the benefits of Drive storage and sharing for your notebooks while getting easy access to local files.
Instructions are here: https://research.google.com/colaboratory/local-runtimes.html
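In short, the linked instructions (at the time of writing) come down to enabling the jupyter_http_over_ws extension and starting Jupyter on your machine with Colab's origin allowed:

pip install jupyter_http_over_ws
jupyter serverextension enable --py jupyter_http_over_ws
jupyter notebook \
  --NotebookApp.allow_origin='https://colab.research.google.com' \
  --port=8888 --NotebookApp.port_retries=0

Then choose 'Connect to local runtime' in Colab and paste the backend URL that Jupyter prints.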
I am testing a Google Chrome notebook. Whenever I download a file, it goes to a folder called "File shelf" which is somehow connected to my Gmail account. Is it possible to access this folder from a "normal" system running any browser? I have not found a way yet. In general, is there a description of how to manage this folder: delete files, copy them to external storage (USB), etc.?
The 'File shelf' is not connected to your Gmail; it is a location on the local SSD. If you press Ctrl-M you will get a folder with all the files on your local SSD.
Once you are in the folder, right-clicking will give you options to delete or rename each file.
If you wish to do more with this new File API, here is a post I wrote on it.
Good luck.