SSH through multiple hosts to execute another ssh session [closed]

I have spent too much time trying to do something that in plain words looks simple.
I am at home, with no firewall and all ports open. I need to ssh to the router at work, where I have access to ssh on port 22. My personal machine is on that subnet, with an internal IP address. So what I need to do is ssh from one machine to the second, and from the second to the third. On the third I need to execute another ssh that tunnels some ports back to my home machine. All of that in a bash script run from home. I have tried many solutions from the internet, but nothing works.
The whole idea is to get to my PC at work and run an ssh tunnel for port 22, which will let me sshfs to my work PC.
I can do it manually, by sshing to the router, then from the router to the work PC, and then executing the ssh tunnel. But I need a one-click solution.
Thanks in advance!

Have you tried just stacking the ssh commands, like ssh -t localhost ssh localhost? Be sure to add the -t option for each hop except the last one: ssh -t localhost ssh -t localhost ssh localhost
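Building on that, a one-click sketch: if the router's sshd permits TCP forwarding, a single local forward through it reaches the work PC's sshd, and sshfs then mounts over that forward. Note that router.example.com, workpc.internal, the user name, the paths, and port 2222 are all placeholders, not names from the question:
#!/bin/bash
# Forward local port 2222 through the work router to port 22 on the work PC;
# -f backgrounds the tunnel, -N runs no remote command:
ssh -f -N -L 2222:workpc.internal:22 user@router.example.com
# Mount the work PC's home directory over the forwarded port:
mkdir -p ~/mnt/workpc
sshfs -p 2222 user@localhost:/home/user ~/mnt/workpc
If more hops are really needed, each extra hop is just another -t stage as in the comment above, or another -L forward chained through the previous one.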

Maybe try VNC? With the right setup/port forwarding, you wouldn't have to jump from one PC to the next.

Related

CentOS SSH access denied [closed]

I'm pulling my hair out over this and can't find a solution anywhere.
After a reboot I started getting "Connection refused" on port 22000, which was the port I had configured and which sshd was still listening on (I double-checked). I'm connecting from a PC on the same LAN.
After that I could suddenly connect on port 22, but there I got "Access denied" after entering my password. After troubleshooting this I got tired and reinstalled OpenSSH. That gave me a clean config, and everything, including the firewall and SELinux, is now configured to use port 22 with these commands:
sudo semanage port -a -t ssh_port_t -p tcp 22
sudo firewall-cmd --permanent --zone=public --add-port=22/tcp
sudo firewall-cmd --reload
Still access denied, even though it's the correct password; I know, because I can use the exact same password directly on the server.
I have tried:
Putting "PermitRootLogin yes" in the sshd_config and login with root but that is also denied, same with a new test account I made. I removed "AllowUsers [username]" from the config before this.
Restarted the SSH service and rebooted as well several times.
The solution here to no avail: Centos 7 Remote SSH access denied
Setting selinux to "Permissive"
Disabling the firewall
Changing password to one without special characters
Triple checking that the SSH service is running
Neither "/var/log/secure" nor "/var/log/messages" log anything regarding my attempts to login.
I must have missed something, anyone have any ideas what?
Use ssh -vvv username@host to check the issue.
Try creating a key pair (a .pem file) and see if logging in with the key works.
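A minimal sketch of that suggestion; the server address centosbox is a placeholder. Note that ssh-copy-id itself needs one successful login, so if password auth is broken over the network, append the public key to ~/.ssh/authorized_keys directly at the server's console instead:
ssh-keygen -t ed25519 -f ~/.ssh/id_test        # generate a fresh key pair
ssh-copy-id -i ~/.ssh/id_test.pub user@centosbox   # install the public key on the server
ssh -i ~/.ssh/id_test user@centosbox           # try logging in with the key
If the key login succeeds where the password fails, the problem is in password authentication itself rather than in the network path.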
I solved it and I really don't want to post the answer since it was embarrassingly easy, but I refuse to leave the question unsolved for the poor souls with similar problems.
I rebooted my PC.... facedesk

rsync through ssh tunnel [closed]

I want to rsync to a cluster node to which I usually connect by passing through another system:
Say I connect first to
ssh user@bridge
and from there to
ssh user@clusternode
Now I want to rsync from my workstation to clusternode. I do the following:
I open an ssh tunnel:
ssh -L8000:clusternode:8000 user@bridge
I rsync from my workstation to clusternode:
rsync -e "ssh -p8000" source user@localhost:destination
and it does not work; I get
ssh_exchange_identification: Connection closed by remote host
Why does it not work? What do I have to do?
I have found a lot of information here:
http://toddharris.net/blog/2005/10/23/rsyncing-through-an-ssh-tunnel/
I think I understand that my problem is the second authentication, between the bridge and the destination, so I switched to method 2, which is not very elegant either, but works. I would like to try method 3, but I don't know how to configure an rsync daemon.
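For reference, method 3 only needs a few lines on clusternode. A minimal rsyncd.conf sketch; the module name and path are illustrative, and since the daemon listens on port 873 by default, the tunnel has to forward to that port rather than to 22:
# /etc/rsyncd.conf on clusternode
[destination]
path = /home/user/destination
read only = false
use chroot = false
# on clusternode, start the daemon:
rsync --daemon
# on the workstation, tunnel to the daemon port and push:
ssh -N -L 8000:clusternode:873 user@bridge &
rsync -av source rsync://localhost:8000/destination/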
Try this one-liner:
rsync -av -e "ssh -A root@proxy ssh" ./src root@target:/dst
The -e option makes rsync's transport first ssh to the proxy (forwarding the agent with -A) and then run a second ssh from there to the target.
Here's what worked for me.
I run a command in the background to tunnel to the remote host:
ssh -N -L 2222:remote.example.com:22 bridge.example.com &
then I rsync to localhost like this:
rsync -auve "ssh -p 2222" . me@localhost:/some/path
You should connect to port 22 of clusternode, so the tunnel should look like:
ssh -L localhost:8000:clusternode:22 user@bridge
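Putting it together, the corrected pair from this thread looks like:
ssh -N -L 8000:clusternode:22 user@bridge &
rsync -av -e "ssh -p 8000" source user@localhost:destination
The first connection may warn about an unknown host key, since localhost:8000 now presents clusternode's key.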

SSH config file paragraph to open a specific directory on remote server [closed]

Is there a way to ssh into a particular directory on a remote machine, specifically using the local ssh config file (not the terminal)? Something like the Dir option in the paragraph below, for example:
Host remote_dir
Hostname remote_server
User username
Dir path/to/remote_dir/
So, if I ssh using the Host value from the paragraph above,
ssh remote_dir
Then I would like to be logged in with the terminal ready for me at path/to/remote_dir/ on the remote server:
username#remote_server: path/to/remote_dir/ > pwd
/home/username/path/to/remote_dir/
In this post on ServerFault, they say you can't do it all through the ssh config file. But you can do it with the ssh config and your .bash_profile or whatever the terminal nerds call it.
In the ssh config file, add:
Host dev
Hostname server.com
User joe
Then in your .bash_profile add an alias:
alias domain1="ssh dev -t 'cd domains/domain1; bash'"
Here dev refers to what you set up in the config file.
In the terminal, just type domain1; you will be asked for your password and will go straight to the directory. Make a new alias for each of your domains, and logging in to each one becomes super easy.
Take a look at
https://serverfault.com/questions/167416/change-directory-automatically-on-ssh-login
This is the accepted answer:
LocalCommand isn't what you want, anyway. That's run on your machine. You want RemoteCommand. Something like this worked for me:
Host example.net
RemoteCommand cd / && exec bash --login
RequestTTY yes
(Old answer) For a similar use case, ssh -t is also an option:
ssh server -t "cd /my/remote/directory; bash --login"
It is not the same, as it does not use ssh config. But you can define an alias for the command and end up with a similar effect.
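Combining the accepted answer with the stanza from the question gives a config-only version; note that RemoteCommand in the client config needs OpenSSH 7.6 or newer:
Host remote_dir
Hostname remote_server
User username
RequestTTY yes
RemoteCommand cd path/to/remote_dir/ && exec bash --login
After that, a plain ssh remote_dir logs in and leaves the shell at path/to/remote_dir/, as asked.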

How do I ssh two deep with private keys? [closed]

At work, I can ssh to a server with private keys set up on my work machine.
jake@work$ ssh server
jake@server$
I'm trying to ssh from home to work to server with the private keys. The process should look like this:
jake@home$ ssh work
jake@work$ ssh server
jake@server$
But instead it's asking me for a password. If I call ssh server with -v, it shows that it's looking for the keys .ssh/id_dsa and .ssh/id_rsa, but my key is named differently.
I can get into server by specifying the key myself:
jake@home$ ssh work
jake@work$ ssh server -i .ssh/idfoo
jake@server$
How do I get ssh to find the right keys for this two step login process?
You can specify the key using Host+IdentityFile in your ~/.ssh/config on work:
Host server
IdentityFile ~/.ssh/idfoo
Or just this alone in the config file, to apply the key to all sessions:
IdentityFile ~/.ssh/idfoo
But I can't explain why this is required only when going from work to server inside an ssh session.
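Spelled out with the names from the question, ~/.ssh/config on the work machine would contain something like this (the key path is taken from the question):
Host server
HostName server
User jake
IdentityFile ~/.ssh/idfoo
With that in place, the plain ssh server in the second step picks up the key without needing -i.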

How to ssh to a remote server behind multiple firewalls? [closed]

Here is my situation:
I can access Server A from my home laptop via ssh.
Server B is only accessible from Server A via ssh.
Server C is only accessible from Server B via ssh.
Is there any way I could configure my .ssh/config so that I can ssh to Server C directly from my laptop? I need this because I regularly have to transfer files from Server C back to my laptop. I'm using scp, but going through this ssh hierarchy manually is too painful. I'm wondering whether there's a more straightforward way to do this via the magic of ssh.
You want to set up SSH tunnels to allow SSH connections like this:
A => B
B => C
Here's an example of how to set up the tunnel to B through A on Linux:
ssh -f myusername@hostA -N -L 4444:hostB:22
Then you should be able to ssh to port 4444 on your own machine and have the connection forwarded to port 22 (where sshd commonly listens) on hostB. After running the above command, try this:
ssh -p 4444 myusername@localhost
That should connect you to hostB. You may have to change ports for this to work; if port 4444 on your machine is already in use, pick a different one. Assuming this works, you can chain a second tunnel through the first one to reach C:
ssh -f myusername@hostA -N -L 4444:hostB:22
ssh -f -p 4444 myusername@localhost -N -L 5555:hostC:22
Now ssh -p 5555 myusername@localhost connects you to hostC, and scp -P 5555 myusername@localhost:/path/to/file . copies files from it.
This is also useful if you want to set up a SOCKS proxy for web browsing. I do this so that my web traffic looks like it's coming from my university, so that I can use online access to scientific journals.
References:
Tunneling protocol
Breaking firewalls with OpenSSH and Putty
How to create an SSH tunnel using Putty, and then use that tunnel as a Firefox SOCKS proxy
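Since the question asks about .ssh/config specifically: on clients with OpenSSH 7.3 or newer, the same chain can be declared there with ProxyJump, which makes single-step ssh and scp work directly. A sketch, using the host and user names from the answer above:
Host hostB
ProxyJump myusername@hostA
Host hostC
ProxyJump myusername@hostB
With that in place, ssh myusername@hostC opens the whole chain in one command, and scp myusername@hostC:/path/to/file . copies files straight to the laptop.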