Unable to connect Colab with my local computer - google-colaboratory

When I run this command in a Jupyter notebook, it gives me a syntax error. I want to connect Colab to my local runtime (system):
jupyter notebook --NotebookApp.allow_origin='https://colab.research.google.com' --port=8888 --NotebookApp.port_retries=0

Related

Value_counts() not working in Kaggle notebook but works in Jupyter notebook?

I'm transferring my exploratory data analysis from a Jupyter notebook (localhost) to a Kaggle notebook, and I keep getting AttributeError: 'DataFrameGroupBy' object has no attribute 'value_counts', even though the same line of code works in the Jupyter notebook.
Here's how it looks in my local Jupyter notebook: [screenshot: local Jupyter notebook, the call succeeds]
But when I try to do the same in the Kaggle notebook, here's what happens: [screenshot: Kaggle notebook, the AttributeError]
Same code, but one works and the other doesn't.
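One plausible explanation (not stated in the question) is a pandas version difference: DataFrameGroupBy.value_counts() was only added in pandas 1.4, so an older pandas would raise exactly this AttributeError on the same line. Below is a minimal sketch of the version check and two workarounds that also run on older pandas; the toy DataFrame and column names are invented for illustration.

import pandas as pd

print(pd.__version__)  # compare the versions in the two environments

# Toy data invented for illustration.
df = pd.DataFrame({"city": ["NY", "NY", "LA", "LA", "LA"],
                   "segment": ["a", "b", "a", "a", "b"]})

# Requires pandas >= 1.4 (DataFrameGroupBy.value_counts):
# df.groupby("city").value_counts()

# Workarounds that also work on older pandas:
per_column = df.groupby("city")["segment"].value_counts()  # SeriesGroupBy.value_counts
combos = df.groupby(["city", "segment"]).size()            # count of each combination
print(per_column)
print(combos)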

Colab restarts in between CNN training

My Colab restarts after showing a memory error.
I am trying to train a CNN model in a Google Colab notebook, but after the first epoch the notebook restarts itself with a memory-allocation error.
I was running the same code a day before and it was running fine.
Logs are attached here.
It sounds like you might be running out of memory in Google Colab.
One way to get around this might be to use your own computer. The method for this with Anaconda is:
1. Open Anaconda.
2. In the Anaconda prompt, type: pip install jupyter
3. Once it is installed, type: jupyter notebook
4. This opens a webpage; in that page there is a dropdown (I think called New) which you should click on, then click Terminal.
5. In the terminal, type (all on one line): jupyter notebook --NotebookApp.allow_origin='https://colab.research.google.com' --port=8889 --NotebookApp.port_retries=0
6. There will then be three output links; copy the second one.
7. Back in Colab, click the dropdown at the top right next to Connect, then click 'Connect to local runtime'.
8. Paste the copied URL into the prompt, then connect.

'Upload' function not working for Jupyter Notebook in SSH mode on an Ubuntu 18.04 machine

I am new to using Jupyter, but am well versed in R. My new role requires me to use the R kernel inside a Jupyter notebook via SSH to share common data and save space. However, I am unable to upload any files from my local machine for some reason, although the permissions check out. There is no error; the entire computer just hangs the moment I click 'Upload'! Has anybody ever faced this issue?
I am using Jupyter 3.1.18 via SSH on Ubuntu 18.04.
I don't have Jupyter installed on my local machine.

Cannot upload notebook to jovian

I am trying to use the Jovian platform and Google Colab to run a Jupyter notebook. When I try to upload the notebook to my Jovian account, I see this error:
Colab commit failed: (HTTP 400) Unauthorized access to tahmid1989/01-pytorch-basics
I am giving the correct API key when it is prompted.
I am using the same Google account for my Google Colab and Jovian.
Still it is not working.
Here is what I am trying:
!pip install jovian --upgrade --quiet
import jovian
jovian.commit(project='01-pytorch-basics')
Are you opening Colab from Jovian? It would be better to open the notebook on Colab via Jovian.
I can only commit my notebook from Colab to Jovian when I open it via Jovian.
Alternatively, you can try to run Jupyter locally or run it on Binder; then you will be able to commit to Jovian.
Log out and log back in once at jovian.ai.
Restart the kernel on Colab.
Run jovian.commit as usual.
I understand why it's happening and will fix it. But for now, logging out and back in should work for you.

Keep jupyter lab notebook running when SSH is terminated on local machine?

I would like to be able to turn off my local machine while my code keeps running in JupyterLab and come back to it later; however, as soon as the SSH session is terminated, the JupyterLab kernel is stopped. My code also stops executing when I close the JupyterLab browser tab.
From the Google Cloud Platform Marketplace I'm using a 'Deep Learning VM'. From there, I SSH to it through the suggested gcloud command (Cloud SDK): gcloud compute ssh --project projectname --zone zonename vmname -- -L 8080:localhost:8080. This opens a PuTTY connection to the VM, which automatically has JupyterLab running that I can then access on localhost.
What can I do to be able to run my code with my local machine off in this case?
I usually use "nohup" when using a Jupyter notebook through SSH!
:~$ nohup jupyter notebook --ip=0.0.0.0 --port=xxxx --no-browser &
You can read more about it here.
Hope it helps!
You can use remote notebook execution.
Basically, your notebook code will run on a remote machine and the results will be stored there or in GCS for later viewing.
You have the following options:
nbconvert-based options:
nbconvert: provides a convenient way to execute the input cells of an .ipynb notebook file and save the results, both input and output cells, as a .ipynb file.
papermill: a Python package for parameterizing and executing Jupyter notebooks (it uses nbconvert --execute under the hood); a minimal usage sketch follows this list.
notebook executor: a tool that can be used to schedule the execution of Jupyter notebooks from anywhere (local, GCE, GCP Notebooks) on a Cloud AI Deep Learning VM. You can read more about the usage of this tool here. (It uses the gcloud SDK and papermill under the hood.)
Notebook training tool: a Python package that allows users to run a Jupyter notebook as a Google Cloud AI Platform Training job.
AI Platform Notebook Scheduler: this is in Alpha (Beta soon) with AI Platform Notebooks and is the recommended option. Scheduling a notebook for recurring runs follows the exact same sequence of steps, but requires a crontab-formatted schedule option.
There are other options which allow you to execute notebooks remotely:
tensorflow_cloud (Keras for GCP): provides APIs that allow you to easily go from debugging and training your Keras and TensorFlow code in a local environment to distributed training in the cloud.
GCP runner: allows running any Jupyter notebook function on Google Cloud Platform. Unlike all other solutions listed above, it allows running training for the whole project, not a single Python file or Jupyter notebook.
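As an illustration of the papermill route mentioned above, execution can be driven from a short Python script. This is only a hedged sketch: the notebook names, the GCS bucket path, and the parameter names are invented for the example, and writing directly to gs:// assumes papermill's optional GCS dependencies are installed.

import papermill as pm

# Runs every cell of train.ipynb on the machine where this script executes
# (e.g. the Deep Learning VM) and saves the executed notebook, with outputs,
# to a GCS path so it can be inspected later, even after the SSH session ends.
pm.execute_notebook(
    "train.ipynb",                              # hypothetical input notebook
    "gs://my-bucket/runs/train-output.ipynb",   # hypothetical output location in GCS
    parameters={"epochs": 5, "batch_size": 32}, # injected into the notebook's "parameters" cell
)

Launched under nohup (as in the answer above) or via one of the schedulers listed, this keeps the execution and its results on the remote side regardless of whether the local machine stays on.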