I am trying to use the Jovian platform with Google Colab to run a Jupyter notebook. When I try to upload the notebook to my Jovian account, I see this error:
Colab commit failed: (HTTP 400) Unauthorized access to tahmid1989/01-pytorch-basics
I am entering the correct API key when prompted.
I am using the same Google account for both Google Colab and Jovian.
It still doesn't work.
Here is what I am trying:
!pip install jovian --upgrade --quiet
import jovian
jovian.commit(project='01-pytorch-basics')
Are you opening Colab from Jovian? It is better to open the notebook in Colab via Jovian.
I can only commit my notebook from Colab to Jovian when I open it via Jovian.
Alternatively, you can run Jupyter locally or on Binder; then you will be able to commit to Jovian.
Log out and log back in once at jovian.ai.
Restart the kernel on Colab.
Run jovian.commit as usual.
I understand why this is happening and will fix it. For now, logging out and back in should work for you.
My Colab notebook restarts after showing a memory error.
I am trying to train a CNN model in a Google Colab notebook, but after the first epoch the notebook restarts itself, showing a memory allocation error.
I was running the same notebook the day before and it ran fine.
Logs are attached here.
It sounds like you might be running out of memory in Google Colab.
One way to get around this is to use your own computer. With Anaconda, the method is:
Open Anaconda.
In the Anaconda prompt, type: pip install jupyter
Once it has installed, type: jupyter notebook
This opens a web page; in it there is a dropdown (I think called "New") which you should click on, then click on "Terminal".
In the terminal, type (all on one line): jupyter notebook --NotebookApp.allow_origin='https://colab.research.google.com' --port=8889 --NotebookApp.port_retries=0
Three output links will then be printed; copy the second one.
Back in Colab, click the dropdown in the top right next to "Connect", then click "Connect to local runtime".
Paste the copied URL into the prompt, then connect. (The server-side commands are collected below.)
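For convenience, here are the server-side commands from these steps in one place (run in the Anaconda prompt; port 8889 is just the value used above, and any free port works):
# one-time setup
pip install jupyter
# start a local server that Colab's domain is allowed to connect to
jupyter notebook --NotebookApp.allow_origin='https://colab.research.google.com' --port=8889 --NotebookApp.port_retries=0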
I know that for Jupyter notebooks and JupyterLab there are code formatter extensions available, such as nb_black or blackcellmagic. However, when I installed them, they didn't seem to work on Google Colab.
Do you know of any native option in Colab, or an extension, that formats code (PEP 8 compliant)?
I don't think there's an extension directly in Colab.
What you could do, though, is to download your notebook, run
pip install -U nbqa black
nbqa black notebook.ipynb
and then reupload your (now formatted) notebook to Colab
disclaimer: I'm the author of nbQA
UPDATE: as of version 21.8b0, Black runs directly on notebooks; no third-party tool is required.
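As a minimal sketch of that newer flow, run locally on the downloaded notebook (the jupyter extra pulls in the dependencies Black needs to parse .ipynb files):
pip install -U "black[jupyter]"
black notebook.ipynb  # formats the code cells in place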
I have tried everything; none of the JupyterLab/Notebook backend hacks seem to work as of February 2022. In the meantime, here is a relatively simple workaround:
[Run only once, at startup]
Connect to your Drive:
from google.colab import drive
drive.mount("/content/drive")
Install Black for Jupyter:
!pip install black[jupyter]
Restart the kernel.
[Then]
Place your .ipynb file somewhere on your drive
Any time you want to format your code, run:
!black /content/drive/MyDrive/YOUR_PATH/YOUR_NOTEBOOK.ipynb
Don't save your notebook; hit F5 to refresh the page.
Voila!
Now save!
I would like to be able to turn off my local machine while my code keeps running in JupyterLab, and come back to it later. However, as soon as the SSH session is terminated, the JupyterLab kernel is stopped. My code also stops executing when I close the JupyterLab browser tab.
From the Google Cloud Platform Marketplace I'm using a 'Deep Learning VM'. I SSH into it with the suggested gcloud command (Cloud SDK): gcloud compute ssh --project projectname --zone zonename vmname -- -L 8080:localhost:8080. This opens a PuTTY connection to the VM, which already has JupyterLab running, and I can access it on localhost.
What can I do to keep my code running in this setup while my local machine is off?
I usually use nohup when running Jupyter Notebook over SSH:
nohup jupyter notebook --ip=0.0.0.0 --port=xxxx --no-browser &
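The trailing & backgrounds the server, and nohup detaches it from the terminal, so it survives the SSH session ending. By default nohup writes the server's output (including the URL with the login token) to nohup.out, so after reconnecting you can check on it like this:
# confirm the server is still running after you reconnect
ps aux | grep "[j]upyter"
# read the server log, which includes the tokenized URL printed at startup
tail -n 50 nohup.out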
You can read more about it here.
Hope it helps!
You can use remote notebook execution.
Basically, your notebook code runs on a remote machine, and the results are stored there or in GCS for later viewing.
You have the following options:
nbconvert-based options:
nbconvert: provides a convenient way to execute the input cells of an .ipynb notebook file and save the results, both input and output cells, as a .ipynb file.
papermill: a Python package for parameterizing and executing Jupyter notebooks. (Uses nbconvert --execute under the hood; see the sketch after this list.)
notebook executor: a tool that can be used to schedule the execution of Jupyter notebooks from anywhere (local, GCE, GCP Notebooks) to a Cloud AI Deep Learning VM. You can read more about its usage here. (Uses the gcloud SDK and papermill under the hood.)
Notebook training tool: a Python package that allows users to run a Jupyter notebook on Google Cloud AI Platform Training jobs.
AI Platform Notebook Scheduler: this is in Alpha (Beta soon) with AI Platform Notebooks and is the recommended option. Scheduling a notebook for recurring runs follows the exact same sequence of steps, but requires a crontab-formatted schedule option.
There are other options which allow you to execute Notebooks remotely:
tensorflow_cloud (Keras for GCP): provides APIs that allow you to go easily from debugging and training your Keras and TensorFlow code in a local environment to distributed training in the cloud.
GCP runner: allows running any Jupyter notebook function on Google Cloud Platform.
Unlike the other solutions listed above, it can run training for a whole project, not just a single Python file or Jupyter notebook.
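As a sketch of the first two options, executing a notebook headlessly from a terminal looks roughly like this (the notebook names and the alpha parameter are made up for illustration):
# nbconvert: execute all cells and save inputs plus outputs to a new file
jupyter nbconvert --to notebook --execute input.ipynb --output output.ipynb
# papermill: same idea, but it can also inject parameters into the notebook
pip install papermill
papermill input.ipynb output.ipynb -p alpha 0.6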
I am working with TensorFlow 2.0 beta, and while I managed to get my GPU working in Anaconda through a few YouTube tutorials, I am unable to get my GPU running in Google Colab. I know Google has the option to enable a GPU on one of their servers, but my GTX 1070 is much faster, and I need to run from Colab, not just Jupyter.
So I read the documentation like a good boy, and the only thing I think I could have done wrong is my path settings; I have screenshots below.
I followed several different YouTube tutorials faithfully until the final one here gave me a way to install it for Jupyter. Which is great, but I also need it to run on Google Colab as well.
I've been trying this since Friday and it's now Tuesday, and I'm losing my mind over this. Help me, Stack Overflow, you're my only hope.
https://imgur.com/a/8WibGWT
If you can get it running on your own Jupyter server, then you can point Colab at that local server.
Full instructions are here: https://research.google.com/colaboratory/local-runtimes.html, but the edited highlights are:
Install jupyter_http_over_ws:
pip install jupyter_http_over_ws
jupyter serverextension enable --py jupyter_http_over_ws
Start your local server, allowing the Colab domain:
jupyter notebook \
--NotebookApp.allow_origin='https://colab.research.google.com' \
--port=8888 \
--NotebookApp.port_retries=0
Click 'Connect to local runtime' in Colab.
I have a Python file available under some URL, for example
https://gist.githubusercontent.com/messa/d19ad7fd4dc0f95df9caf984caef127c/raw/4d0daebdcfcf16ea3b7914ee6186bd98dbfb3c20/demo.py
In reality it will not be a gist but some courseware/homework-review software.
How can I open such a URL in Google Colab so that I can, for example, run the Python code?
I know I can build a Colab URL for GitHub gists or repositories, but can I do it for an arbitrary URL?
There is currently no way to do what you are asking for – to construct a URL that will cause Colab to automatically load the contents of a .py file at a particular URL into a new Colab notebook.
The closest thing is to host a notebook on GitHub and then use a Colab URL to open it, e.g.:
http://github.com/username/repository/path/to/notebook.ipynb can be opened in Colab using http://colab.research.google.com/github/username/repository/path/to/notebook.ipynb
http://gist.github.com/username/hash/filename.ipynb can be opened in Colab using http://colab.research.google.com/gist/username/hash/filename.ipynb
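If you just need to run the file inside an already-open Colab notebook (rather than have a URL that opens a new one), a minimal workaround, using the gist URL from the question, is to fetch it into the runtime and execute it from a cell:
# download the file into the Colab runtime
!wget -q -O demo.py https://gist.githubusercontent.com/messa/d19ad7fd4dc0f95df9caf984caef127c/raw/4d0daebdcfcf16ea3b7914ee6186bd98dbfb3c20/demo.py
# run it in the notebook's namespace
%run demo.py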