ssh remoteforward over multiple hops

I am attempting a multi-hop SSH tunnel that needs to route traffic in both directions. My network config is:
My personal shell is on machineA
machineA can SSH into machineB
machineB can SSH into machineC
machineC is locally connected via ethernet to machineD
There is a service running on machineD: machineC sends UDP packets to machineD's portX, and machineD sends its replies via UDP to machineC's portY.
I have successfully done the following:
(from machineA)
ssh machineB
(from resulting shell)
ssh machineC
(from resulting shell)
echo "my command" | nc -u machineD portX #Send command to machineD's service
nc -ul portY #Read the results on machineC's port
I would like to do all of this via tunnels, so that I can run custom scripts directly on machineA to formulate service commands and parse the results. I tried the following in my ~/.ssh/config file:
host machineB
hostname x.x.x.x
user username_machineB
localforward 1234 machineC:22
host machineC
hostname localhost
user username_machineC
port 1234
localforward 1235 machineD:portX
remoteforward 1236 localhost:portY
I thought I could then do the following:
(from machineA)
ssh machineB
(from machineA again)
ssh machineC
(from machineA again)
echo "my command" | nc -u localhost 1235
nc -ul 1236
But... it doesn't seem to work. I don't see any of the expected replies on 1236, and I'm not exactly sure how to debug this. I'm also not entirely sure of the format of those "localforward" and "remoteforward" lines in machineC's configuration; I don't know which machine will be interpreted as "localhost" when those lines are evaluated. I suspect that remote forwarding might be disabled on machineC, but I want to make sure I have configured everything else correctly first. Am I Doing It Wrong?
Alternatively, is there another way to achieve my end goal without having to change any configuration on machineB, C, or D? What I would like to do is use machineA to programmatically construct complex commands intended for machineD, and parse the results using scripts on machineA as well.

You have to think backwards when you are doing this.
So basically machC can talk to machD's portX.
So you really want to run this on machA:
ssh machC
This is your end goal, since that is the machine that sends to and receives from machD.
Now, you cannot get to machC directly; this is where your ProxyCommand entries come in.
host machC
ProxyCommand ssh machB nc %h %p
So you said machA can ssh to machB, no problem. Now if you do:
ssh -v machC
You'll see it hop through machB on its way to machC.
But what you really want is port forwarding and a listener between machC and the ports on machD, so you change the machC settings:
host machC
ProxyCommand ssh machB nc %h %p
# first part is port on your current shell, second part is relative to machC
LocalForward 1234 machD:portX
RemoteForward 1235 localhost:portY
so using your example above:
host machineB
hostname x.x.x.x
user username_machineB
host machineC
ProxyCommand ssh machineB nc %h %p
hostname machineC
user username_machineC
localforward 1235 machineD:portX
remoteforward 1236 localhost:portY
Then you can use the command:
ssh machineC
Use -v to see the hops and tunnels, and -N if you don't care about getting a shell.
Now you can talk to your localhost's port 1235 to send to machineD's portX, and read from 1236 to listen for machineC's portY.
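Putting that together, here is a minimal sketch of driving the whole exchange from machineA once that config is in place; it simply points the nc commands from the question at the forwarded ports (-f just backgrounds ssh once authentication is done):
ssh -f -N machineC                         # open the hop through machineB plus both forwards, then background
echo "my command" | nc -u localhost 1235   # carried over to machineD's portX
nc -ul 1236                                # wait for the replies forwarded back from portY
If nothing comes back, rerun the first step as ssh -v -N machineC to confirm that both forwards are actually being set up.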

Related

SSH Config ProxyJump - Port forwarding from proxy

I have a question regarding port forwarding in combination with ProxyJump in my ssh config:
Is it possible to make use of DynamicForward from the host used as proxy? Here's my config:
Host proxy
HostName proxy.private.com
User user
IdentityFile ~/path/to/file
DynamicForward 3000
Host target
HostName target.somewhere.com
User user
IdentityFile ~/path/to/file
ProxyJump proxy
It does not work with this config, but this would be exactly what I need.
Any tips on how to get it to work?
If there is nothing preventing you from using ProxyCommand you can most likely use this approach:
In your ~/.ssh/config file:
Host target
HostName target.somewhere.com
User target-user
IdentityFile ~/path/to/target-user-file
ProxyCommand ssh -A <proxy-user>@<proxy-host> -i <proxy-user-key> -W %h:%p
DynamicForward 3000
You can then run this command on your local machine:
ssh target -D 3000
I was able to test this by running this command locally and retrieving the public IP of the target host:
curl -x socks5h://localhost:3000 https://ifconfig.me/
Useful links I read:
More details on these use cases can be found here
Details on this very approach can be found on this site (sadly, neither in English nor over HTTPS).
You can probably define another Host entry on top to avoid having to mess with ssh parameters each time. This could be done using CanonicalizeHostname, but I couldn't manage to get it working. An alias might be more interesting at that point?
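One way to read that last suggestion, as a sketch: add a second alias entry with the same connection details, so that running ssh target-socks brings the SOCKS forward up without extra flags (the alias name target-socks is made up here):
Host target-socks
HostName target.somewhere.com
User target-user
IdentityFile ~/path/to/target-user-file
ProxyCommand ssh -A <proxy-user>@<proxy-host> -i <proxy-user-key> -W %h:%p
DynamicForward 3000
ssh -N target-socks should then behave like the ssh target -D 3000 invocation above.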

Using SSH ProxyJump with Coda

I am away from home, and need to proxy jump via my home server to connect to a number of sites.
The settings in .ssh/config work 100% of the time, every time, when executing from the command line, but Coda simply refuses to use these connections.
Host home
Hostname my.home.server
Port 222
ProxyCommand bash -c '/usr/local/bin/knock -v %h $KNOCKS; sleep 1; exec /usr/bin/nc %h %p'
Host host1
Hostname host1.com
User root
Host home-host1
Hostname host1.com
User root
Host home-*
ProxyCommand ssh -W %h:%p home
So if I want to connect via home I run:
ssh home-host1
and it jumps through my home server to host1.
Now this works all the time, every time, for ssh from the console, but Coda simply won't connect.
In the Coda setup I have added home-host1 as the server, and tried both setting and clearing the user name and port so that, just like ssh in a terminal, everything comes from the config file.
I have also cleared known_hosts just in case it was caching something from there, but no go.
What am I doing wrong?
Wow, after a lot more trial and error I succeeded.
Coda does not appear to like ProxyJump, but it will work with ProxyCommand:
Host home-*
ProxyCommand ssh -W %h:%p home
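For reference, that ProxyCommand line is essentially what the newer ProxyJump directive expands to, so from a terminal the following should behave the same; it is apparently only Coda that rejects it:
Host home-*
ProxyJump home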

SSH tunnel through a Jump Host

We are using OpenVPN to access our servers over private IPs via a Jump Box. Our architecture is described below; it may help in understanding the environment.
We are using 5 different subnets and want to access them via OpenVPN. The subnets are:
10.110.0.0/20
10.120.0.0/20
10.121.0.0/20
10.130.0.0/20
10.133.0.0/20
Our VPN server and Jump Box are in the 10.133.0.0/20 subnet. Below are the routes from my laptop when connected to OpenVPN:
dagar@dagar:~$ route -n
Kernel IP routing table
Destination   Gateway        Genmask        Flags  Metric  Ref  Use  Iface
0.0.0.0       192.168.43.1   0.0.0.0        UG     600     0    0    wlp1s0
10.133.0.0    172.21.1.1     255.255.0.0    UG     0       0    0    tun0
Below is my .ssh/config file configuration:
Host 10.139.*.*
IdentityFile "/home/dagar/.ssh/id_rsa"
User admin
ProxyCommand ssh -W %h:%p abc@10.133.27.252 -p 911
Port 911
Host 10.133.*.*
IdentityFile "/home/dagar/.ssh/id_rsa"
User admin
ProxyCommand ssh -W %h:%p abc@10.133.27.252 -p 911
Port 911
In the above configuration 10.133.27.252 is my Jump Box server IP.
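In other words, each Host block above is meant to be equivalent to a one-off command along these lines (10.139.0.5 is just a made-up target matching the first pattern):
ssh -i /home/dagar/.ssh/id_rsa -p 911 -o ProxyCommand="ssh -W %h:%p -p 911 abc@10.133.27.252" admin@10.139.0.5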
We are able to SSH to servers in all subnets via the Jump Box with the above configuration, except for the 10.133.0.0/20 subnet.
We are able to SSH to 10.133.0.0/20 servers directly from the laptop when we comment out the ProxyCommand in the .ssh/config file for the 10.133.0.0/20 subnet.
We want to SSH to the 10.133.0.0/20 subnet machines via the Jump Box as well.
Can you please help me understand why it is not working for that one subnet?
Any help or guidance will be appreciated.
Thank you.

Connecting to a remote server from local machine via ssh-tunnel

I am running Ansible on my machine, and my machine does not have SSH access to the remote machine: port 22 connections originating from the local machine are blocked by the institute firewall. But I do have access to a machine (ssh-tunnel) through which I can log in to the remote machine. Is there a way to run an Ansible playbook from the local machine against the remote hosts?
In other words, is it possible to make Ansible/ssh connect to the remote machine via ssh-tunnel without actually logging in to ssh-tunnel, so that the connection just passes through the tunnel?
The other option is to install Ansible on ssh-tunnel and run the plays from there, but that is not the desired solution.
Please let me know if this is possible.
There are two ways to achieve this without installing Ansible on the ssh-tunnel machine.
Solution#1:
Use these variables in your inventory:
[remote_machine]
remote ansible_ssh_host=127.0.0.1 ansible_ssh_port=2222 ansible_ssh_user='username' ansible_ssh_private_key_file='/home/user/private_key'
I hope the above parameters are self-explanatory; if you need help, please ask in the comments.
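That inventory assumes something is already listening on local port 2222 and forwarding to the remote machine's SSH port; a sketch of opening such a tunnel beforehand (the host names here are placeholders):
ssh -f -N -L 2222:remote-machine:22 username@ssh-tunnel-host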
Solution#2:
Create ~/.ssh/config file and add the following parameters:
####### Access to the Private Server through ssh-tunnel/bastion ########
Host ssh-tunnel-server
HostName x.x.x.x
StrictHostKeyChecking no
User username
ForwardAgent yes
Host private-server
HostName y.y.y.y
StrictHostKeyChecking no
User username
ProxyCommand ssh -q ssh-tunnel-server nc -q0 %h %p
Hope that helps; if you need anything else, feel free to ask.
There is no need to install Ansible on the jump or remote servers; Ansible is an SSH-only tool. :-)
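With that ~/.ssh/config in place, Ansible's default OpenSSH transport picks up the ProxyCommand on its own, so the inventory only needs the alias; a minimal sketch (file names are just examples):
[remote_machine]
private-server
and then run:
ansible-playbook -i inventory playbook.yml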
First make sure you can make it work directly with an SSH tunnel.
From the local machine (Local_A), you can log in to the remote machine (Remote_B) via the jump box (Jump_C).
Log in to Local_A and run:
ssh -f user@remote_B -L 2000:Jump_C:22 -N
The options are:
-f tells ssh to background itself after it authenticates, so you don't have to sit around running something on the remote server for the tunnel to remain alive.
-N says that you want an SSH connection, but you don't actually want to run any remote commands. If all you're creating is a tunnel, then including this option saves resources.
-L [bind_address:]port:host:hostport
Specifies that the given port on the local (client) host is to be forwarded to the given host and port on the remote side.
There will be a password challenge unless you have set up DSA or RSA keys for a passwordless login.
There are lots of documents teaching you how to do the ssh tunnel.
Then try below ansible command from Local_A:
ansible -vvvv remote_B -m shell -a 'hostname -f' --ssh-extra-args="-L 2000:Jump_C:22"
You should see the remote_B hostname. Let me know the result.
Let's say you can ssh into x.x.x.x from your local machine, and ssh into y.y.y.y from x.x.x.x, while y.y.y.y is the target of your ansible playbook.
inventory:
[target]
y.y.y.y
playbook.yml
---
- hosts: target
  tasks: ...
Run:
ansible-playbook --ssh-common-args="-o ProxyCommand='ssh -W %h:%p root@x.x.x.x'" -i inventory playbook.yml
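The same ProxyCommand can also live in ~/.ssh/config so the command line stays clean; a sketch under the same x.x.x.x / y.y.y.y assumptions:
Host y.y.y.y
ProxyCommand ssh -W %h:%p root@x.x.x.x
After that, a plain ansible-playbook -i inventory playbook.yml should route through x.x.x.x without any extra arguments.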

How to add socks proxy to ssh config file?

I know how to forward a SOCKS proxy on the command line, like below:
ssh -D port_number user@host
This works well, but I want to be able to put that forwarding into my SSH config file. However, I am not able to locate any useful information or tutorial about it.
I have a bunch of normal SSH profiles in the config, so I would prefer to have the forwarding attached to those SSH profiles.
Use the config setting "DynamicForward". Here is a quick example of what it should look like:
Host example.com
User username
DynamicForward 8080
If the DynamicForward option is only given a port number, then it will bind to localhost:port.
You can add a specific IP to get it to bind to an address other than the localhost. Using "*:8080" will bind the proxy to all IP addresses on the box.
To use an IPv6 address enclose the address in square brackets:
[2001:0db8:85a3:0000:0000:8a2e:0370:7334]:8080
For details, please see the ssh_config man page (type man ssh_config).
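Once such an entry is in place, something along these lines should bring the proxy up and exercise it (ifconfig.me is only a test endpoint, as in the ProxyJump answer above; the socks5h scheme also sends DNS lookups through the proxy):
ssh -f -N example.com
curl -x socks5h://localhost:8080 https://ifconfig.me/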
I do not recommend using socat because it only supports SOCKS4, but you can use ncat instead:
Install ncat.
Add this to your ssh config file:
ProxyCommand ncat --proxy-type socks5 --proxy 127.0.0.1:1080 %h %p
You may need to check the ncat options if it does not work.
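Spelled out as a config entry, that ProxyCommand would sit under a Host block, for example (both names below are placeholders; 127.0.0.1:1080 is the local SOCKS5 proxy assumed above):
Host behind-socks
HostName real.host.example
ProxyCommand ncat --proxy-type socks5 --proxy 127.0.0.1:1080 %h %p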
This is how it is done:
Host server-fwd
Hostname a.b.c.d
User username
Port 22
LocalForward localhost:AAAA localhost:DD
LocalForward localhost:BBBB localhost:EEE
LocalForward localhost:CCCC localhost:FFFF
Change the "server-fwd" to whatever name you like, change "a.b.c.d" to the IP you're connecting to, change "username" to whatever your account is, maybe change the port number if necessary.
The LocalForward lines are the ones you have been looking for. The middle column (i.e. AAAA, BBBB and CCCC) holds the ports on the system you are running the ssh command from. The right column (i.e. DD, EEE and FFFF) holds the ports on the server you're connecting to. It's localhost in both cases because the first address is interpreted on the machine where you run the ssh command, and the second is interpreted relative to the server you just logged into.
Yes, I use this a lot. ;)
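To make that concrete with one hypothetical pair of ports, forwarding local port 8080 to port 80 on the server would look like this; once connected, a request to localhost:8080 on your own machine lands on the server's port 80:
Host server-fwd
Hostname a.b.c.d
User username
Port 22
LocalForward localhost:8080 localhost:80
Then run ssh -N server-fwd, and from another terminal:
curl http://localhost:8080/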