Can you access Google Drive file revision history in Google Colab?

I overwrote filenames in Google Drive and want to link the old names listed under file revision history in the Drive GUI (e.g. medium.1.jpg in the screenshot) to the new ones (e.g. 504.jpg). Is there a way to access this metadata directly in Google Colab so I can build a dictionary?
[screenshot: the Google Drive file revision-history panel]
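One possible route, since Colab itself has no hook for revision history: the Drive API v3 exposes per-file revision metadata, including originalFilename (which is kept for binary uploads such as JPEGs), and Colab can call it after auth.authenticate_user(). The sketch below is untested; the folder ID is a placeholder, and pagination is ignored for brevity.

from google.colab import auth
from googleapiclient.discovery import build

auth.authenticate_user()
drive = build('drive', 'v3')

FOLDER_ID = 'your-folder-id'  # placeholder: the folder holding the renamed files

old_to_new = {}
files = drive.files().list(
    q=f"'{FOLDER_ID}' in parents and trashed = false",
    fields='files(id, name)',
).execute()

for f in files.get('files', []):
    revs = drive.revisions().list(
        fileId=f['id'],
        fields='revisions(originalFilename, modifiedTime)',
    ).execute()
    for rev in revs.get('revisions', []):
        old = rev.get('originalFilename')  # e.g. medium.1.jpg
        if old and old != f['name']:
            old_to_new[old] = f['name']    # e.g. {'medium.1.jpg': '504.jpg'}

print(old_to_new)

If several revisions of one file carry different original names, you may want to keep only the most recent by sorting on modifiedTime.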

Related

Is it possible to read a Google Drive folder (all files) as a BigQuery external data source?

I am using Google Drive as an external data source in BigQuery. I can access a single file, but I am unable to read a folder with multiple files.
Note:
I picked up the shareable link for the folder from Google Drive and used the "bq mk .." command referencing the link ID. It creates the table, but I am unable to pull data.
I've not tried it with Drive, so I have no sense of how performant it is, but when defining an external table (or load job) you can specify the source data as a list of URIs. My suspicion is that it's not particularly scalable and may run into limits in Drive, since that's not a typical access pattern. Google Cloud Storage is a much more suitable data source for this kind of thing.
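For what it's worth, here is a minimal sketch of the list-of-URIs approach using the BigQuery Python client; the project, dataset, and file IDs are placeholders, and the credentials need Drive read access for this to work.

from google.cloud import bigquery

client = bigquery.Client(project='your-project')  # placeholder project

# External table definition backed by several Drive files at once.
external_config = bigquery.ExternalConfig('CSV')
external_config.source_uris = [
    'https://drive.google.com/open?id=FILE_ID_1',  # placeholder file IDs
    'https://drive.google.com/open?id=FILE_ID_2',
]
external_config.autodetect = True  # infer the schema from the files

table = bigquery.Table('your-project.your_dataset.drive_table')
table.external_data_configuration = external_config
client.create_table(table)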

Does anybody know how to share a Google Colab notebook so people can run it but cannot see the actual code?

I am trying to share a Google Colab notebook (containing my data-visualization project) with friends so they can run the code without actually seeing it, because I don't want them to copy it. How do I do this?

How to access data from a local machine using Google Colab

I want to use Google Colab, but my data is pretty large, so I want to access it directly from my local machine in Colab. I also want to save output files directly to a directory on my machine. Is there a way to do that? I can't seem to find one.
Look at how to use a local runtime, described here:
https://research.google.com/colaboratory/local-runtimes.html
Otherwise, you can store your data on Google Drive, GCS, or S3 and simply mount it; there is no need to upload it every time.
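For the Drive option, mounting takes two lines in Colab; the dataset filename below is a placeholder:

from google.colab import drive

drive.mount('/content/drive')  # follow the one-time authorization prompt
data_path = '/content/drive/My Drive/my_dataset.csv'  # placeholder path into your Drive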

Write out files with Google Colab

Is there a way to write out files with Google Colab?
For example, if I use
import requests
r = requests.get(url)
Where will those files be stored? Can they be found?
Similarly, can I retrieve a file I wrote out via, say, the TensorFlow save function?
import tensorflow as tf

saver = tf.train.Saver(...)
...
path = saver.save(sess, "./my_model.ckpt")
Thanks!
In your first example, the data is still in memory in r.content, so you first need to write it to disk, e.g. with open('data.dat', 'wb').write(r.content). Files written this way land in the VM's working directory (/content by default) and disappear when the VM is recycled.
You can then download them to your machine with files.download:
from google.colab import files
files.download('data.dat')
Downloading your model is the same:
files.download('my_model.ckpt')
I find it easier to first mount your Google Drive to the non-persistent VM and then use os.chdir() to change your current working folder.
After doing this, you can do exactly the same things as on a local machine.
I have a gist listing several ways to save and transfer files between the Colab VM and Google Drive, but I think mounting Google Drive is the easiest approach.
For more details, please refer to mount_your_google_drive.md in this gist:
https://gist.github.com/Joshua1989/dc7e60aa487430ea704a8cb3f2c5d6a6
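As a minimal sketch of that mount-and-chdir workflow (the project folder name is an assumption):

from google.colab import drive
import os

drive.mount('/content/drive')                # authorize once per VM
os.chdir('/content/drive/My Drive/project')  # assumption: a 'project' folder in your Drive

# Relative reads and writes now behave as on a local machine
# and persist in Google Drive across VM restarts.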

Accessing Dropbox Datastore database

I use an outdated iOS app called Loggr and would now like to extract the data stored in it. It syncs with Dropbox Datastore, which I can see on my Dropbox account, but I can't find any corresponding files among my Dropbox files. My question: how do I extract the information from the Datastore?
Dropbox Datastores are a structured data storage system, separate from files, so they won't appear as files in your account. They should be available under "Apps you use" here though:
https://www.dropbox.com/developers/apps/datastores