Is there a way to check how many people are using google colab at the same time you are?
I tried searching Google and other sources and couldn't find any concrete information about how many users are on the GPUs at once.
No, there's no way to view overall Colab usage.
You could add analytics reporting to individual notebooks using Python APIs like this one:
https://developers.google.com/analytics/devguides/reporting/core/v4/quickstart/service-py
But that would only report usage by people who execute code in a given notebook, not usage of Colab overall.
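For reference, here is a trimmed sketch of the linked quickstart, updated to google-auth instead of the older oauth2client it uses; the key file path and view ID are placeholders you would replace with your own service-account key and Google Analytics view:

from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ['https://www.googleapis.com/auth/analytics.readonly']
KEY_FILE = 'service-account.json'  # placeholder path to your key file
VIEW_ID = '12345678'               # placeholder Google Analytics view ID

# Build an Analytics Reporting API v4 client from service-account credentials.
creds = service_account.Credentials.from_service_account_file(KEY_FILE, scopes=SCOPES)
analytics = build('analyticsreporting', 'v4', credentials=creds)

# Request session counts over the last 7 days for the given view.
response = analytics.reports().batchGet(body={
    'reportRequests': [{
        'viewId': VIEW_ID,
        'dateRanges': [{'startDate': '7daysAgo', 'endDate': 'today'}],
        'metrics': [{'expression': 'ga:sessions'}],
    }]
}).execute()
print(response)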
I wanted to subscribe to Colab Pro with one of my accounts, but while subscribing I didn't notice that I had been automatically switched to my default account.
I just want to use Colab Pro on my other account, since all my data is in that account's Drive.
Do you have an idea how to solve this?
Goal
I'd like to have a credentials file for things like API keys stored in a place where someone I've shared the Colab notebook with can't access the file or the information.
Situation
I'm calling several APIs in a Colab notebook and have multiple keys for different APIs. If there are approaches at different levels of complexity, I'd prefer a simpler one.
Current attempts
I'm storing the keys in the main Python notebook while I research the best way to approach this. I'm pretty new to authentication, so I'd prefer a simpler solution. I haven't seen any articles addressing this directly.
Greatly appreciate any input on this.
You can store the credential files in your Google Drive.
After mounting it, only you can access them under /content/drive/MyDrive/; anyone else running the notebook would need their own credential files in their own Drive.
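A minimal sketch, assuming the keys live in a hypothetical JSON file under MyDrive (adjust the path and key names to your setup):

import json
from google.colab import drive

# Mounting prompts an OAuth flow that only the Drive owner can complete.
drive.mount('/content/drive')

# Hypothetical location; anything under MyDrive is private to your account.
with open('/content/drive/MyDrive/secrets/api_keys.json') as f:
    keys = json.load(f)

some_api_key = keys['some_service']  # hypothetical key name

Anyone you share the notebook with sees this code but not the file; running it would mount their Drive, not yours.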
I'm making a simple script in Google Colab (Jupyter Notebook) that can grab data from our big-data environment (in BigQuery) and analyze it. I'm avoiding environment variables, as most of the engineers won't know how to set them up. Ideally, I'm looking for a way to authenticate using our Google username/password. Does anyone have experience authenticating into GBQ this way? Thanks
The Colab docs contain an example showing how to issue an authenticated BigQuery query. First, authenticate:
from google.colab import auth

# Opens an OAuth prompt so the notebook can act as your Google account.
auth.authenticate_user()
print('Authenticated')
Then, in a separate cell (a cell magic like %%bigquery must be the first line of its cell; outside Colab you may need to load it first with %load_ext google.cloud.bigquery):
%%bigquery --project yourprojectid
-- Display query output immediately; replace yourprojectid with your project ID.
SELECT
  COUNT(*) AS total_rows
FROM
  `bigquery-public-data.samples.gsod`
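If you'd rather not use the cell magic, roughly the same query can be run with the BigQuery client library directly; this is a sketch, with yourprojectid still a placeholder:

from google.cloud import bigquery

# Reuses the credentials set up by auth.authenticate_user() above.
client = bigquery.Client(project='yourprojectid')

query = """
    SELECT COUNT(*) AS total_rows
    FROM `bigquery-public-data.samples.gsod`
"""
df = client.query(query).to_dataframe()
print(df)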
I want to use Google Colab, but my data is pretty huge, so I want to access it in Colab directly from my machine, and also save files directly to my machine's directories. Is there a way to do that? I can't seem to find one.
Look at how to use a local runtime, described here:
https://research.google.com/colaboratory/local-runtimes.html
Otherwise, you can store your data on GDrive, GCS, or S3. Then you can just mount it; there's no need to upload it every time.
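For the GDrive option, a minimal sketch (the paths are hypothetical):

from google.colab import drive

# Mount once per session; files are then read and written in place.
drive.mount('/content/drive')

# Hypothetical paths -- anything under MyDrive persists between sessions.
with open('/content/drive/MyDrive/data/big_input.csv') as f:
    data = f.read()

with open('/content/drive/MyDrive/results/output.txt', 'w') as f:
    f.write('processed results')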
I just set up an IPython (0.13.1) Notebook server to use during my introductory course on Python. The server is password-protected and is currently up and running.
Now I need my students (about 15 people) to access the same ipynb document at the same time, play around with it, and eventually modify the code examples, while making sure no one overwrites the uploaded version of the ipynb file.
How can I set this up?
First, take a look at teaching with the IPython Notebook, and try to list what types of applications you want to run on it. Alternatively, you can use cloud computing resources, for example on Heroku.