Reverse tunneling in Raspberry Pi and cloud server? - ssh

I have a Raspberry Pi and I have set up reverse tunneling with an AWS instance. I ran the following command on my Raspberry Pi:
ssh -N -R 1234:localhost:22 username@instance_IP
and on my Linux instance I am able to ssh into the Pi using
ssh -l user_pi -p 1234 localhost
but I am not able to ssh directly into my Pi; instead I first have to log in to the AWS instance and then into my Pi.
How can I log in to my Pi directly using tunneling?
Thanks a lot!!

I found out that for direct remote connections you need to allow TCP forwarding on the server:
AllowTcpForwarding yes
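For reference, a minimal sketch of where that directive goes, assuming a stock OpenSSH server on the AWS instance (the file path and service name can differ between distributions):

# /etc/ssh/sshd_config on the AWS instance
AllowTcpForwarding yes
# GatewayPorts yes may also be needed if the forwarded port should be reachable
# from outside the instance rather than only from its own localhost

# reload sshd so the change takes effect (the service may be called "ssh" on Debian/Ubuntu)
sudo systemctl restart sshd

# afterwards, from any machine that can reach the instance, log in to the Pi directly:
ssh -p 1234 user_pi@instance_IP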

Related

Replicating a PuTTY multihop tunnel in bash

I'm having trouble replicating my PuTTY SSH tunneling with Cmder bash (on a Windows machine).
1. I want to access a web interface on port 7183 on server_2. To get there I have to go through jump_server first and tunnel twice, since from the jump_server the only visible port on server_2 is 22.
Steps with PuTTY:
1. connect to jump_server with a tunnel (L22 server_2:22) using username_1
2. connect to localhost with a tunnel (L7183 localhost:7183) using username_2
After that, I'm able to access the web interface by typing localhost:7183 into the browser on my local machine.
Now I'm trying to reproduce this in Cmder, but I haven't been able to do it with one big command, nor with two separate commands:
ssh -L 7183:localhost:7183 username_1@jump_server ssh -L 22:localhost:22 -N username_2@server_2 -vvv
This is only the last command I tried; I also tried interchanging ports and hosts without success.
2. Is the syntax different when I want to open port 12345 on my local machine and have it forwarded to port 21050 on server_2, or would that be remote tunneling?
I finally managed to solve question 1 with:
ssh username_1@jump_server -L 22:server_2:22 -N -vvv
ssh -L 7183:localhost:7183 username_2@localhost
Now I'm able to access the web interface from server_2 on my localhost:7183.
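For completeness, here is a sketch of one-command versions, assuming an OpenSSH client new enough to support -J (ProxyJump); the user and host names are simply the ones from the question:

# question 1: forward local port 7183 to port 7183 on server_2, hopping through jump_server
ssh -N -J username_1@jump_server -L 7183:localhost:7183 username_2@server_2

# question 2: this is still local (not remote) forwarding; local port 12345 is carried
# over SSH to port 21050 on server_2, since only port 22 is reachable from jump_server
ssh -N -J username_1@jump_server -L 12345:localhost:21050 username_2@server_2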

Connect to Spark running via YARN through a SSH tunnel

I have a Spark installation running under YARN on a remote cluster, with a firewall between me and the head node. I can use an SSH tunnel to access the head node:
> ssh -N -f -L 10000:remotenode:10000 between_machine
and this setup works, for example, to access a HiveServer2 running on remotenode. If Spark were running in cluster mode, I would just need to do the same for port 7077 and point the pyspark client at localhost with
> ssh -N -f -L 7077:remotenode:7077 between_machine
> ./pyspark --master spark://localhost:7077
How can I do that with Spark running under the YARN scheduler?
If you are looking for a port to connect to, here is a quote from the docs:
You can access this interface by simply opening http://<driver-node>:4040 in a web browser. If multiple SparkContexts are running on the same host, they will bind to successive ports beginning with 4040 (4041, 4042, etc).
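So, assuming the UI really is served from remotenode (under YARN it runs wherever the driver runs, which is not guaranteed to be the head node), the same local-forward pattern from the question should work for that port:

> ssh -N -f -L 4040:remotenode:4040 between_machine

and then open http://localhost:4040 in a local browser.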
If you are just looking for a more universal way to get to the host via an SSH "tunnel", you could try running ssh as a SOCKS proxy:
ssh user@host -D 20000
and then configure your browser to connect via the SOCKS proxy (host: localhost, port: 20000).

Reach webserver via SSH tunnel

I have a Raspberry Pi in my private local network (example: 192.168.1.2) and I have a dedicated server (example: 99.99.99.99) from some provider.
From my Raspberry Pi I can connect to the server via ssh without trouble; the opposite is not possible, as the Raspberry Pi is not reachable from the internet.
Now I want to reach the webserver on my Raspberry Pi from the internet through some SSH bridge/tunnel.
So if I enter the IP 99.99.99.99 in my browser, I want to see the website from the Raspberry Pi. How is this possible?
The -R option to ssh will permit a remote tunnel to be opened towards the ssh client. So, if from the Pi you run
ssh -R 0.0.0.0:8080:address_of_pi:80 99.99.99.99
then you will open an SSH session, and while that session is active anyone can go to 99.99.99.99:8080 and reach your Pi.
You need to use 8080 (rather than 80) as the listening port because the ssh process cannot bind to port 80 without being root.
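Note that by default OpenSSH binds remote-forwarded ports to the loopback interface only, so for the 0.0.0.0 bind above to be honored, GatewayPorts has to be set to yes or clientspecified in the server's sshd_config. A minimal sketch, assuming root access on 99.99.99.99:

# /etc/ssh/sshd_config on 99.99.99.99
GatewayPorts clientspecified
# restart sshd, then re-open the tunnel from the Pi as above

# quick check from any machine on the internet:
curl http://99.99.99.99:8080/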

Connect ipython-notebook via SSH tunnel from a remote location

I'm trying to open an IPython notebook (which is running on a server) on a MacBook from a remote location through an SSH tunnel, but I get no data received.
This is the command for the SSH tunnel
ssh -L 5558:localhost:5558 -N -t -x user#remote-host
and this is the command I used to launch the notebook from the server:
ipython notebook --pylab=inline --port=5558 --ip=* --no-browser --notebook-dir notebooks
Then I tried to open remote-host:5558 in a new tab, but no data was received.
Thanks in advance!
The directive -L AAAA:somehost:BBBB will cause SSH to listen on port AAAA on localhost (the machine the ssh command is run on) and forward any connection to that port, over the SSH session, to the host somehost port BBBB. So, you need to open http://localhost:5558/ in the browser on the machine you run the ssh command on.
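Concretely, keeping the port from the question, the working sequence should look roughly like this (assuming the notebook is already listening on port 5558 on the server):

# on the MacBook: open the tunnel (the -t -x flags from the question are not needed for this)
ssh -N -L 5558:localhost:5558 user@remote-host
# then browse to http://localhost:5558/ on the MacBook itself, not to remote-host:5558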
Read this: How do I add a kernel on a remote machine in IPython (Jupyter) Notebook?
Remote jupyter kernel/kernels administration utility (the rk) here: https://github.com/korniichuk/rk

Using telnet to connect to an SSH-based server

Is it possible to use tunneling to connect to an SSH server via telnet? I'm using an API that can only telnet to a host, but that host will only accept SSH connections. If it is possible, what do I need to do to set that up?
Use netcat and ssh
$ nc -l -p 12345 -c "ssh someone@remotehost.com"
Make sure that you have RSA (public key) authentication set up, since you cannot enter a password.
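With that listener running, the API just telnets to this machine on port 12345 and the connection is handed straight to ssh; for example, from the same machine:

$ telnet localhost 12345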
I think what would also work is to run a telnet server on a local port on the remote host and use ssh to forward that port to the local machine, where the API could connect to it; but that's just a bit silly.
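A rough sketch of that second idea, assuming a telnet daemon really is listening on port 23 of the remote host (the port numbers here are arbitrary examples):

$ ssh -N -L 2323:localhost:23 someone@remotehost.com
# then point the API at localhost port 2323
$ telnet localhost 2323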