Following these instructions (Running a notebook server and Remote access to IPython Notebooks), I proceed as follows:
On the remote server:
1) Generate a password hash for NotebookApp.password
In [1]: from IPython.lib import passwd
In [2]: passwd()
Enter password:
Verify password:
Out[2]: 'sha1:67c9e60bb8b6:9ffede0825894254b2e042ea597d771089e11aed'
2) Create profile
user@remote_host$ ipython profile create
3) Edit ~/.ipython/profile_default/ipython_notebook_config.py
# Password to use for web authentication
c = get_config()
c.NotebookApp.password = u'sha1:67c9e60bb8b6:9ffede0825894254b2e042ea597d771089e11aed'
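(Side note: if you are on a newer Jupyter install rather than the legacy IPython notebook shown here, the same setting lives in ~/.jupyter/jupyter_notebook_config.py. Assuming a recent version, you can generate that file and paste the same hash:
jupyter notebook --generate-config
# then, in ~/.jupyter/jupyter_notebook_config.py:
c.NotebookApp.password = u'sha1:67c9e60bb8b6:9ffede0825894254b2e042ea597d771089e11aed'
)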
4) Start notebook on port 8889
user@remote_host$ ipython notebook --no-browser --port=8889
and the notebook starts
[I 16:08:10.012 NotebookApp] Using MathJax from CDN:https://cdn.mathjax.org/mathjax/latest/MathJax.js
[W 16:08:10.131 NotebookApp] Terminals not available (error was No module named 'terminado')
[I 16:08:10.132 NotebookApp] Serving notebooks from local directory: /cluster/home/user
[I 16:08:10.132 NotebookApp] 0 active kernels
[I 16:08:10.132 NotebookApp] The IPython Notebook is running at: http://localhost:8889/
[I 16:08:10.132 NotebookApp] Use Control-C to stop this server and shut down all kernels (twice to skip confirmation).
On my local machine
5) SSH tunneling
user@local$ ssh -N -f -L localhost:8888:127.0.0.1:8889 username@remote_host
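For reference, the general form is ssh -N -f -L <local_port>:<target_host>:<target_port> user@remote_host: -N runs no remote command, -f sends ssh to the background after authentication, and the -L directive makes the local machine listen on port 8888 and forward every connection, over the SSH session, to 127.0.0.1:8889 as seen from remote_host.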
On the remote host, /etc/hosts contains:
127.0.0.1 localhost localhost.localdomain localhost4 localhost4.localdomain4
::1 localhost localhost.localdomain localhost6 localhost6.localdomain6
6) Finally, I try to open localhost:8888 on my browser, and I get:
channel 2: open failed: connect failed: Connection refused
channel 2: open failed: connect failed: Connection refused
channel 2: open failed: connect failed: Connection refused
channel 2: open failed: connect failed: Connection refused
channel 2: open failed: connect failed: Connection refused
All these steps work on one server, but fail on another one.
I contacted the administrator, who said the following:
I assume that you are using two separate SSH connections: one from
which you run ipython and one that you use to do port forwarding.
There is no guarantee that the two connections will land you on the
same login node. In the case where the two connections are on
different hosts, you will experience the exact failure you have
encountered. Therefore you should set up the port forwarding in the
connection that you use to run ipython.
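(As a sanity check, and with hypothetical node names, you can compare the node the notebook is running on with the node a fresh SSH connection lands on:
user@remote_host$ hostname        # in the session running the notebook
login-node-01
user@local$ ssh username@remote_host hostname
login-node-02
If the two names differ, the tunnel forwards to a node on which nothing is listening on port 8889.)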
How can I set up the port forwarding in the connection that I use to run ipython?
I tried using my IP address, but it didn't work:
$ ssh -N -f -L local_ip_address:8888:127.0.0.1:8889 user@remote_host
Finally this is how the problem was solved:
# Login to the server from your local workstation and in the same connection do the port forwarding.
user@local$ ssh -L 8888:localhost:8889 username@remote_host
user@remote_host$ ipython notebook --no-browser --port=8889
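With the tunnel and the notebook running in the same SSH session, opening http://localhost:8888 in the browser on the local machine reaches the notebook on the remote host (it will prompt for the password set in step 1).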
Just follow these instructions:
https://coderwall.com/p/ohk6cg/remote-access-to-ipython-notebooks-via-ssh
Related
I have a server running Jupyter, and I use SSH port forwarding to access it:
ssh -L 8888:127.0.0.1:8889 -N -T Server
so I can access it from localhost:8888 or 127.0.0.1:8889, but I cannot access it from $MY_IPADDRESS:8888.
I have set my Jupyter config to allow remote access and listen on all IPs. Any suggestion is highly appreciated.
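For what it's worth, ssh -L binds the listening side to the loopback interface by default, so the forwarded port is only reachable via localhost on the machine running ssh, regardless of the Jupyter settings on the server. Assuming you want other machines to reach it through your workstation's IP, a sketch of the bind-to-all-interfaces form would be:
ssh -L 0.0.0.0:8888:127.0.0.1:8889 -N -T Server
(an explicit bind address like this, or the client-side GatewayPorts option, is needed, and your local firewall must also allow inbound connections on 8888).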
I think I'm missing one step in the script below.
The first time I run it, the VM gets created just fine, but the connection is refused. It continues to be refused even if I wait ten minutes after creating the VM.
However, if I use the GCP console to connect manually "Open in browser window", I get the message "Transferring SSH keys...", and the connection works. After this step, the script can connect fine.
What should I add to this script to get it to work without having to manually connect from the console?
#!/bin/bash
MY_INSTANCE="janne"
MY_TEMPLATE="dev-tf-nogpu-template"
HOME_PATH="/XXX/data/celeba/"
# Create instance
gcloud compute instances create $MY_INSTANCE --source-instance-template $MY_TEMPLATE
# Start instance
gcloud compute instances start $MY_INSTANCE
# Copy needed directories & files
gcloud compute scp ${HOME_PATH}src/ $MY_INSTANCE:~ --recurse --compress
gcloud compute scp ${HOME_PATH}save/ $MY_INSTANCE:~ --recurse --compress
gcloud compute scp ${HOME_PATH}pyinstall $MY_INSTANCE:~
gcloud compute scp ${HOME_PATH}gcpstartup.sh $MY_INSTANCE:~
# Execute startup script
gcloud compute ssh --zone us-west1-b $MY_INSTANCE --command "bash gcpstartup.sh"
# Connect over ssh
gcloud compute ssh --project XXX --zone us-west1-b $MY_INSTANCE
The full output of this script is:
(base) xxx@ubu-dt:/XXX/data/celeba$ bash gcpcreate.sh
Created [https://www.googleapis.com/compute/v1/projects/XXX/zones/us-west1-b/instances/janne].
NAME ZONE MACHINE_TYPE PREEMPTIBLE INTERNAL_IP EXTERNAL_IP STATUS
janne us-west1-b n1-standard-1 XXX XXX RUNNING
Starting instance(s) janne...done.
Updated [https://compute.googleapis.com/compute/v1/projects/xxx/zones/us-west1-b/instances/janne].
ssh: connect to host 34.83.3.161 port 22: Connection refused
lost connection
ERROR: (gcloud.compute.scp) [/usr/bin/scp] exited with return code [1].
ssh: connect to host 34.83.3.161 port 22: Connection refused
lost connection
ERROR: (gcloud.compute.scp) [/usr/bin/scp] exited with return code [1].
ssh: connect to host 34.83.3.161 port 22: Connection refused
lost connection
ERROR: (gcloud.compute.scp) [/usr/bin/scp] exited with return code [1].
ssh: connect to host 34.83.3.161 port 22: Connection refused
lost connection
ERROR: (gcloud.compute.scp) [/usr/bin/scp] exited with return code [1].
ssh: connect to host 34.83.3.161 port 22: Connection refused
ERROR: (gcloud.compute.ssh) [/usr/bin/ssh] exited with return code [255].
ssh: connect to host 34.83.3.161 port 22: Connection refused
ERROR: (gcloud.compute.ssh) [/usr/bin/ssh] exited with return code [255].
Edit: adding gcloud version info
(base) bjorn@ubu-dt:/media/bjorn/data/celeba$ gcloud version
Google Cloud SDK 269.0.0
alpha 2019.10.25
beta 2019.10.25
bq 2.0.49
core 2019.10.25
gsutil 4.45
kubectl 2019.10.25
The solution I found is this: wait.
For OS login, SSH starts working about 20 seconds after the instance is started.
For non-OS login, it takes about a minute.
So I just added this after gcloud compute instances start $MY_INSTANCE:
sleep 20s
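If you would rather not rely on a fixed delay, a small retry loop (a sketch, reusing the instance name and zone from the script above) can poll until SSH actually answers:
# Poll until the instance accepts SSH instead of sleeping a fixed time
until gcloud compute ssh --zone us-west1-b "$MY_INSTANCE" --command "true" >/dev/null 2>&1; do
  echo "Waiting for SSH to become available..."
  sleep 5
done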
When you connect through Console it manages the keys for you.
Your last comment leads me to believe that when you connect from the console, an SSH key is generated for you, and that this is what allows the script to run afterwards. I would recommend that you take a look at how to manage SSH keys in metadata and create your own SSH key to access the instance through the SDK.
If you cannot SSH directly through the SDK outside of the script either, then I assume it is for the same reason: the generated key.
Also please make sure that when using the SDK the service account has the correct permissions.
Let me know.
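As a rough sketch of the metadata approach (the username and key path are placeholders, and note that add-metadata replaces any existing ssh-keys value on the instance), you can push your own public key once and then SSH through the SDK or plain ssh:
# Add a public key to the instance's ssh-keys metadata
gcloud compute instances add-metadata $MY_INSTANCE --zone us-west1-b \
  --metadata ssh-keys="username:$(cat ~/.ssh/id_rsa.pub)"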
I was setting up a firewall with UFW on an Ubuntu server. I skipped the step sudo ufw allow ssh and instead ran the command sudo ufw enable. I rebooted the VPS, but now when I try to connect using SSH I get the following error: ssh: connect to host {IP Address} port 22: Operation timed out.
I am using Google Cloud compute infrastructure, and I don't understand the details in this article: https://cloud.google.com/compute/docs/ssh-in-browser#ssherror
Is there a way I can rollback?
You can log in to your instance using the serial console. After logging in, run sudo ufw allow ssh to allow SSH access to your instance again.
See Interacting with the Serial Console for more information.
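Assuming the gcloud SDK is configured (the instance name below is a placeholder), enabling and connecting to the serial console looks roughly like this:
# Enable interactive serial console access on the instance
gcloud compute instances add-metadata my-instance --metadata serial-port-enable=TRUE
# Connect to the serial console, log in, then run: sudo ufw allow ssh
gcloud compute connect-to-serial-port my-instance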
I'd like to create an SSH tunnel from my computer, through a remote server, to a Docker container running Jupyter Notebook (computer > server > Docker container), so that I can run a Jupyter Notebook in the browser on my computer.
The Docker container is hosted on a machine running OS X (El Capitan). Docker is using the default machine IP: 192.168.99.100.
$ docker-machine ls
NAME ACTIVE DRIVER STATE URL SWARM DOCKER ERRORS
default * virtualbox Running tcp://192.168.99.100:2376 v1.11.1
I am able to physically sit at the server running the Docker container and use my browser (192.168.99.100:8888) to create Jupyter Notebooks from that Docker container. This verifies that my Docker port bindings work and that I'm running the Jupyter Notebook correctly.
However, I don't know how to establish an SSH tunnel from a client machine to that remote machine's Docker container and launch a Jupyter Notebook in my browser on the client machine.
The output from:
$ docker ps
produces the following:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
48a8ac126c72 kubu4/bioinformatics:v11 "/bin/bash" 55 minutes ago Up 55 minutes 8787/tcp, 0.0.0.0:8888->8888/tcp stupefied_pasteur
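The 0.0.0.0:8888->8888/tcp entry shows that the container's port 8888 has been published to the Docker host; presumably the container was started with a port mapping along the lines of docker run -p 8888:8888 kubu4/bioinformatics:v11, which is what makes 192.168.99.100:8888 reachable when sitting at the server.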
My attempts at creating an SSH tunnel to the remote machine's Docker container result in the following error message in Terminal when I try to open the Jupyter Notebook in my browser on the client machine (localhost:8888):
channel 3: open failed: connect failed: Connection refused
I'm currently using the following in my .ssh/config file to create the tunnel:
Host tunnel3
HostName remote.ip.address
User user
ControlMaster auto
ServerAliveInterval 30
ServerAliveCountMax 3
LocalForward localhost:8888 localhost:8888
I can use this tunneling configuration to successfully launch Jupyter Notebooks in my client browser if I run the Jupyter Notebook on the remote machine outside of the Docker container that's on the remote machine.
Just for added info, this is the output when I launch the Jupyter Notebook in the remote machine's Docker container:
$ jupyter notebook
[I 18:23:32.951 NotebookApp] Writing notebook server cookie secret to /root/.local/share/jupyter/runtime/notebook_cookie_secret
[I 18:23:33.072 NotebookApp] Serving notebooks from local directory: /usr/local/bioinformatics
[I 18:23:33.073 NotebookApp] 0 active kernels
[I 18:23:33.073 NotebookApp] The Jupyter Notebook is running at: http://0.0.0.0:8888/
[I 18:23:33.074 NotebookApp] Use Control-C to stop this server and shut down all kernels (twice to skip confirmation).
I figured it out! The "A-ha!" moment was remembering that the remote machine running Docker was OS X (El Capitan). All my Docker builds/tests had been performed on a Linux (Ubuntu 14.04) machine. The difference, it turns out, is critical to solving this problem.
Docker installs on Ubuntu allow you to use "localhost" to address the Docker container. Docker installs on OSX generate an IP address to use to address the Docker container.
Realizing this, I changed my SSH tunneling configuration in the .ssh/config file on my client computer.
Old tunneling config:
Host tunnel3
HostName remote.ip.address
User user
ControlMaster auto
ServerAliveInterval 30
ServerAliveCountMax 3
LocalForward localhost:8888 localhost:8888
New tunneling config:
Host tunnel3
HostName remote.ip.address
User user
ControlMaster auto
ServerAliveInterval 30
ServerAliveCountMax 3
LocalForward localhost:8888 192.168.99.100:8888
With this change, I can successfully create/use Jupyter Notebooks in my client browser that are actually hosted in the Docker container on the remote machine, using localhost:8888 in the URL bar.
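(For reference, the address to forward to can be looked up on the OS X host with docker-machine ip default, which returns the 192.168.99.100 used above.)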
Had the same problem, trying to ssh-tunnel into a google cloud instance, then into a docker container.
Local machine: Ubuntu (14.04)
Cloud Instance: Debian (9-stretch)
Find the IP address Debian assigns to docker (credit):
docker inspect -f '{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' container_name_or_id
This gave me 172.18.0.2 for the first instance running, 172.18.0.3 for the second, ..0.4, ..0.5, etc. (Note: The below didn't work if I was running multiple containers on the same instance. Since I only need to run one container, I'm not going to figure out how to fix it)
ssh into the compute instance
Make sure ports are exposed between your Docker container and Compute instance (I used 8888:8888), then (credit):
gcloud compute ssh {stuff to your instance} -- -L 8888:172.18.0.2:8888
Run jupyter
jupyter-notebook --no-browser --ip=0.0.0.0 --allow-root
Now I can open my local browser to localhost:8888/?token... and use jupyter running in a container on my gcloud instance.
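To avoid copying the container IP by hand, a small sketch (instance and container names are placeholders) can look the address up over SSH and feed it straight into the tunnel command:
# Look up the container's bridge IP on the instance, then open the tunnel to it
CONTAINER_IP=$(gcloud compute ssh my-instance --command \
  "docker inspect -f '{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' my-container")
gcloud compute ssh my-instance -- -L "8888:${CONTAINER_IP}:8888"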
I'm trying to open an IPython notebook (which is running on a server) on a MacBook from a remote location through an SSH tunnel, but I get no data received.
This is the command for the SSH tunnel
ssh -L 5558:localhost:5558 -N -t -x user@remote-host
and this is the command I used to launch the notebook from the server:
ipython notebook --pylab=inline --port=5558 --ip=* --no-browser --notebook-dir notebooks
Then I tried to open remote-host:5558 in a new tab, but no data was received.
Thanks in advance!
The directive -L AAAA:somehost:BBBB will cause SSH to listen on port AAAA on localhost (the machine the ssh command is run on) and forward any connection to that port, over the SSH session, to the host somehost port BBBB. So, you need to open http://localhost:5558/ in the browser on the machine you run the ssh command on.
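In this case, assuming the tunnel command above is run on the MacBook, the notebook should be reachable at http://localhost:5558/ in the MacBook's browser, not at remote-host:5558, because the listening end of the tunnel lives on the machine that ran ssh.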
Read this: How do I add a kernel on a remote machine in IPython (Jupyter) Notebook?
Remote jupyter kernel/kernels administration utility (the rk) here: https://github.com/korniichuk/rk