How to copy files from remote PC-B via PC-A to local drive? - ssh

Introduction. My work computer (PC-B) is accessible only from inside the network (through PC-A), and I can connect to PC-B via SSH in one command: ssh -J user@PC-A user@PC-B.
Problem. I need to copy folders from remote PC-B to my local drive.
I tried:
(a) from my local PC: scp -r user@PC-A user@PC-B:/path/to/folder /home/ but it does not work.
(b) while remotely connected to PC-B: scp path/to/folder userHome@PC-HOME - connection timed out.
Is there any simple solution?

You can use ProxyJump directly in the scp command:
scp -r -o 'ProxyJump user@PC-A' user@PC-B:/path/to/folder /home/
You can also create an alias in ~/.ssh/config so that you do not have to type the address
of the proxy server each time:
Host PC-A-alias
  User user
  Hostname PC-A

Host PC-B-alias
  User user
  Hostname PC-B
  ProxyJump PC-A-alias
Now you can just use PC-B-alias with ssh, scp and other commands that use SSH such as rsync.
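With those aliases in place, the copy from the question becomes a single command (a sketch assuming the config above; adjust the paths to your setup):

```shell
# Copy a folder from PC-B to the local /home/ directory; ssh resolves
# PC-B-alias from ~/.ssh/config and jumps through PC-A automatically:
scp -r PC-B-alias:/path/to/folder /home/

# rsync works the same way and can resume interrupted transfers:
rsync -av PC-B-alias:/path/to/folder /home/
```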

Related

Multiple jumps ssh tunnel, one command line

I'm currently connecting my local machine to the target by running commands on my local machine (MobaXterm), on pivotone and on pivottwo; this is the flow of data:
mobaxterm <--- pivotone <--- pivottwo <--- target
These are the commands that I run on each machine:
local(mobaxterm)
ssh -L 5601:127.0.0.1:5601 root@pivotone
pivotone
ssh -L 5601:127.0.0.1:5601 root@pivottwo
pivottwo
ssh -L 5601:127.0.0.1:5601 root@target
I was wondering if I could do the same with just one command from my MobaXterm machine?
You don't need the -L option to manage jump hosts.
ssh -J root@pivotone,root@pivottwo root@target
You can automate this in your ~/.ssh/config file:
Host target
  ProxyJump root@pivotone,root@pivottwo
Then you can simply run
ssh root@target
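Since the original goal was forwarding port 5601, the jump syntax and the port forward can also be combined into one command run on the MobaXterm machine (a sketch assuming the same hosts and port as above):

```shell
# Jump through pivotone and pivottwo in one command, and forward local
# port 5601 to 127.0.0.1:5601 as seen from target:
ssh -J root@pivotone,root@pivottwo -L 5601:127.0.0.1:5601 root@target
```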

sshfs with two consecutive ssh authentications

By two consecutive ssh authentications I mean the following:
I ssh to remote system A
from remote system A, I ssh to remote system B
There is no way to ssh to B directly.
I have no problems mounting directories from A using sshfs.
I thought about mounting directories from B on A, but unfortunately A does not have sshfs installed. Even if it did, I am not sure that would work.
Is there maybe another way to access directories on B in a convenient way?
My ~/.ssh/config looks like this now:
Host A
  User user
  HostName A.example.com
  ControlMaster auto
  ControlPath ~/.ssh/%r@%h:%p

Host B
  User user
  HostName B.example.com
  ProxyCommand ssh -W %h:%p A
How would my sshfs command look like?
This does not work:
sshfs -o allow_other,defer_permissions user@B.example.com:/somedir ~/somedir
It outputs the error message:
remote host has disconnected
Use ProxyCommand or ProxyJump to do that transparently for the end application (sshfs). For example, in ~/.ssh/config:
Host A
  # other configuration options as needed

Host B
  ProxyCommand ssh -W %h:%p A
Then you should be able to use sshfs transparently by directly specifying host B.
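With that config in place, the mount command no longer needs to mention A at all (a sketch; /somedir and the mount point come from the question, and the extra -o options from the question are optional here):

```shell
# Mount /somedir from host B locally; ssh resolves the B alias from
# ~/.ssh/config and tunnels through A via the ProxyCommand:
sshfs B:/somedir ~/somedir

# Unmount when done (Linux; on macOS use: umount ~/somedir):
fusermount -u ~/somedir
```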

Connecting to a remote server from local machine via ssh-tunnel

I am running Ansible on my machine, and my machine does not have SSH access to the remote machine: port 22 connections originating from my local machine are blocked by the institute firewall. But I do have access to a machine (ssh-tunnel) through which I can log in to the remote machine. Is there a way to run an Ansible playbook from the local machine against the remote hosts?
In other words, is it possible to make Ansible/SSH connect to the remote machine via ssh-tunnel without actually logging in to ssh-tunnel? The connection would just pass through the tunnel.
The other option is to install Ansible on ssh-tunnel and run the plays from there, but that is not a desirable solution.
Please let me know if this is possible.
There are two ways to achieve this without installing Ansible on the ssh-tunnel machine.
Solution#1:
Use these variables in your inventory:
[remote_machine]
remote ansible_ssh_host=127.0.0.1 ansible_ssh_port=2222 ansible_ssh_user='username' ansible_ssh_private_key_file='/home/user/private_key'
I hope the above parameters are self-explanatory; if you need help, please ask in the comments.
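Solution #1 assumes an SSH tunnel is already listening on local port 2222 and forwarding to the remote machine's SSH port. A sketch of setting it up (the host names are placeholders):

```shell
# Forward local port 2222 to port 22 on the remote machine, going through
# the ssh-tunnel host; -f backgrounds ssh, -N skips running a remote command:
ssh -f -N -L 2222:remote-machine:22 username@ssh-tunnel-host
```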
Solution#2:
Create ~/.ssh/config file and add the following parameters:
####### Access to the Private Server through ssh-tunnel/bastion ########
Host ssh-tunnel-server
  HostName x.x.x.x
  StrictHostKeyChecking no
  User username
  ForwardAgent yes

Host private-server
  HostName y.y.y.y
  StrictHostKeyChecking no
  User username
  ProxyCommand ssh -q ssh-tunnel-server nc -q0 %h %p
Hope that helps; if you need anything else, feel free to ask.
There is no need to install Ansible on the jump or remote servers; Ansible is an SSH-only tool. :-)
First make sure the SSH tunnel itself works.
On the local machine (Local_A), you can log in to the remote machine (Remote_B) via the jump box (Jump_C).
On Local_A:
ssh -f user@Jump_C -L 2000:Remote_B:22 -N
The options are:
-f tells ssh to background itself after it authenticates, so you don't have to sit around running something on the remote server for the tunnel to remain alive.
-N says that you want an SSH connection, but you don't actually want to run any remote commands. If all you're creating is a tunnel, then including this option saves resources.
-L [bind_address:]port:host:hostport
Specifies that the given port on the local (client) host is to be forwarded to the given host and port on the remote side.
There will be a password challenge unless you have set up DSA or RSA keys for a passwordless login.
There are lots of documents teaching you how to do the ssh tunnel.
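To avoid the password challenge, set up key-based login first (a minimal sketch; modern OpenSSH prefers Ed25519 keys over the DSA/RSA keys mentioned above):

```shell
# Generate a key pair once (choose a passphrase or leave it empty):
ssh-keygen -t ed25519

# Install the public key on the jump box so the tunnel opens
# without a password prompt:
ssh-copy-id user@Jump_C
```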
Then try the ansible command below from Local_A, telling ssh to jump through Jump_C:
ansible -vvvv remote_B -m shell -a 'hostname -f' --ssh-extra-args="-J user@Jump_C"
You should see the remote_B hostname. Let me know the result.
Let's say you can ssh into x.x.x.x from your local machine, and into y.y.y.y from x.x.x.x, where y.y.y.y is the target of your Ansible playbook.
inventory:
[target]
y.y.y.y
playbook.yml:
---
- hosts: target
  tasks: ...
Run:
ansible-playbook --ssh-common-args="-o ProxyCommand='ssh -W %h:%p root@x.x.x.x'" -i inventory playbook.yml

How to transfer files between two computers with a server in the middle?

I have a PC-1 in my home and need to transfer files back and forth to my PC-2 at my University.
The problem is that PC-2 has only access to local network.
So, in order to access it from home I have to ssh to the University server and only then ssh to PC-2.
I know that scp can transfer files between two PCs, but I did not find anything in the documentation about the case where there is a server in the middle.
Can it be done with scp or other tool?
Alternative answer if the ssh tunnel is disabled on the server side:
PC-2 to PC-1
ssh university-server 'ssh PC-2 "cat remotefile"' > localfile
PC-1 to PC-2
ssh university-server 'ssh PC-2 "cat > remotefile"' < localfile
Explanation:
You are asking university-server to ssh to PC-2 with the specified command (in this case cat), using pipe redirection to write or read local files.
PS: Modified the answer according to the working correction in the comment.
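The cat trick works for single files; since the question is about folders, the same double-ssh pattern can stream a whole directory with tar (a sketch using the same host names; /path/to and folder are placeholders):

```shell
# PC-2 to PC-1: pack the folder on PC-2, stream it through
# university-server, and unpack it locally:
ssh university-server 'ssh PC-2 "tar cz -C /path/to folder"' | tar xz

# PC-1 to PC-2: the same idea in reverse:
tar cz -C /path/to folder | ssh university-server 'ssh PC-2 "tar xz -C /path/to"'
```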
You can use an ssh tunnel, to connect to PC-2 from PC-1 using university-server as an intermediate.
Establish the tunnel
ssh -f -N university-server -L 2222:PC-2:22
the tunnel will stay in the background until the ssh process is killed
scp file transfer
scp -P 2222 user#localhost:file .
scp -P 2222 file user#localhost:path
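rsync can use the same tunnel, which is handy for large or repeated transfers (a sketch assuming the tunnel above is listening on local port 2222):

```shell
# Pull from PC-2 through the tunnel; -e tells rsync which ssh command to use:
rsync -av -e 'ssh -p 2222' user@localhost:path/ ./path/

# Push to PC-2 the same way:
rsync -av -e 'ssh -p 2222' ./path/ user@localhost:path/
```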

Passwordless connect a remote server from vagrant vm through ssh (rsync)

I'd like to run an rsync command from a Vagrant VM to a remote server (to push files) without needing a password.
So, the involved machines are: host, guest vm and remote
host is authorized on remote via authorized_keys; however, when I run the rsync command from the VM, I am asked for a password.
Is there a way to get passwordless rsync from the vm using the keys on the already-authorized host?
I'd like to avoid copying a new authorized key to the remote every time I create a vm.
Also, adding my server's password in the vagrant file is not an option.
Use ssh key forwarding via ssh-agent. Follow these steps:
On your host machine:
ssh-add PATH_TO_KEY <use Tab if unsure>
vagrant ssh
In the vagrant box, edit your ~/.ssh/config:
Host name_or_ip_of_remote
  ForwardAgent yes
Now try to connect to the remote from the vagrant box:
ssh name_or_ip_of_remote
It should work without a password. As rsync uses ssh under the hood, it will work without a password too.
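If it still prompts for a password, it is worth checking that the agent actually made it into the box (a quick sketch):

```shell
# On the host, confirm the key is loaded in the agent:
ssh-add -l

# Inside the vagrant box, the same command should list the host's key if
# ForwardAgent is working; an error about the authentication agent means
# forwarding is not active:
vagrant ssh -c 'ssh-add -l'
```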