SCP through ssh tunnel - ssh

ssh abc@202.221.23.87 -p 22724 -L 12345:172.21.33.51:18081
I opened the tunnel successfully, and on my localhost I tried:
scp -P 12345 abc@127.0.0.1:/testfuel_20140714.zip .
But it's not working; it shows nothing. How can I make scp work? Thanks so much.

You don't appear to be creating the tunnel on localhost. Are you trying to download or upload a file?
ssh -p 22724 -L 12345:localhost:18081 abc@202.221.23.87

Related

Converting a putty ssh tunnel to a command line one

I am trying to convert the PuTTY configuration below to an ssh command.
I do not understand the mysql5:3306 part. What is this mysql5 element, and what is the corresponding command-line option in the ssh command on Linux?
Apparently,
ssh -N -L 3306:mysql5:3306 user@remote
worked to establish the ssh tunnel.
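For context: in -L 3306:mysql5:3306 the middle element (mysql5) is a hostname that is resolved by the remote server, not by your local machine; it corresponds to PuTTY's "Destination" field (host:port), while the leading 3306 corresponds to PuTTY's "Source port". The same tunnel can also be kept in ~/.ssh/config (a sketch; user and remote are the placeholder values from the command above):

```
Host mysql-tunnel
    HostName remote
    User user
    LocalForward 3306 mysql5:3306
```

After which ssh -N mysql-tunnel establishes the same tunnel.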

How to connect to expo via private tunnel (not ngrok)

I have the problem that at work I cannot connect to Expo over the network, so I need to use a tunnel, which is fine. However, sometimes the tunnel is really slow, destroying any developer experience.
Since I can also host Expo locally on localhost, I had the idea of simply SSH-tunneling to a remote server that has an open port.
My remote host runs Ubuntu,
so I SSH there like so:
ssh -R 0.0.0.0:19000:0.0.0.0:19000 user@ip
In order for this to work I also added
GatewayPorts clientspecified
to my /etc/ssh/sshd_config
...
sudo netstat -plutn
shows me
tcp 0 0 0.0.0.0:19000 0.0.0.0:* LISTEN 20183/2
so it is accepting requests (I also tried forwarding port 19001 to get something back when I enter it in the browser, which worked fine).
However, when I enter
exp://serverip:19000 into the Expo client on my Android phone, it can't connect.
Any ideas?
It looks like Expo uses multiple ports 19000, 19001, and 19002. So you will need to forward all of these.
e.g.
$ ssh -f -N -R 19000:localhost:19000 user@ip
$ ssh -f -N -R 19001:localhost:19001 user@ip
$ ssh -f -N -R 19002:localhost:19002 user@ip
Also, you can set the REACT_NATIVE_PACKAGER_HOSTNAME environment variable to use the remote host.
$ export REACT_NATIVE_PACKAGER_HOSTNAME="ip"
$ expo start
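The three -R forwards above can also be combined into a single ssh invocation. A small sketch that assembles the combined command as a string (user@ip is a placeholder, as above):

```shell
# Assemble one ssh command that reverse-forwards all three Expo ports.
cmd="ssh -f -N"
for port in 19000 19001 19002; do
  cmd="$cmd -R ${port}:localhost:${port}"
done
cmd="$cmd user@ip"  # placeholder remote host
echo "$cmd"
# prints: ssh -f -N -R 19000:localhost:19000 -R 19001:localhost:19001 -R 19002:localhost:19002 user@ip
```

Running the printed command keeps a single background ssh process holding all three forwards, instead of three separate ones.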

Connect over SSH using a .pem file

I would like to know how to connect over SSH using a .pem file to any server.
Currently I'm executing the following command:
ssh user@mydomain.example
What option should I use?
Use the -i option:
ssh -i mykey.pem user@mydomain.example
As noted in this answer, this file needs to have correct permissions set. The ssh man page says:
SSH will simply ignore a private key file if it is accessible by others.
You can change the permissions with this command:
chmod go= mykey.pem
That is, set permissions for group and others equal to the empty list of permissions.
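You can confirm the effect on a throwaway file (stat -c is the GNU coreutils form; on BSD/macOS use stat -f '%Lp' instead):

```shell
# Create a dummy key file, strip all group/other permissions, and inspect the mode.
touch demo.pem
chmod go= demo.pem
stat -c '%a' demo.pem  # the group and other digits are now 0 (e.g. 600 under the default umask)
rm demo.pem
```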
chmod 400 mykey.pem
ssh -i mykey.pem user@mydomain.example
will connect you over SSH to any server using a .pem file.
For AWS, if the user is ubuntu, use the following to connect to the remote server:
chmod 400 mykey.pem
ssh -i mykey.pem ubuntu@your-ip
To connect from Terminal to AWS AMI:
chmod 400 mykey.pem
ssh -i mykey.pem ec2-user@mydomain.example
You can connect to an AWS EC2 instance using the following commands:
chmod 400 mykey.pem
ssh -i mykey.pem username@your-ip
By default the username is often ubuntu, since Ubuntu machines are commonly used as servers, so the following command will work in that case:
ssh -i mykey.pem ubuntu@your-ip
If you still got error messages like:
Received disconnect from 34.219.50.0 port 22:2: Too many authentication failures. Disconnected from 34.219.50.0 port 22
Edit your SSH config located at ~/.ssh/config and add a new record at the end:
Host mydomain.example
User ubuntu
IdentityFile /home/you/path-to-pem/key.pem
IdentitiesOnly yes
Then call the short command: ssh mydomain.example
What resolved it for me was to run: sudo chown $USER: {.pem_file}

How to transfer files between two computers with a server in the middle?

I have a PC-1 in my home and need to transfer files back and forth to my PC-2 at my University.
The problem is that PC-2 has only access to local network.
So, in order to access it from home I have to ssh to the University server and only then ssh to PC-2.
I know that scp can transfer files between two PCs, but I did not find anything in the documentation about the case where there is a server in the middle.
Can it be done with scp or other tool?
Alternative answer, in case ssh tunneling is disabled on the server side:
PC-2 to PC-1
ssh university-server 'ssh PC-2 "cat remotefile"' > localfile
PC-1 to PC-2
ssh university-server 'ssh PC-2 "cat > remotefile"' < localfile
Explanation:
You are asking university-server to ssh to PC-2 with the specified command (in this case cat), using pipe redirection to write or read local files.
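This works because ssh connects the remote command's stdin and stdout straight through to the local shell, so ordinary redirection applies. A local simulation of the same redirection, with sh -c standing in for the nested ssh calls:

```shell
# Stand-in for: ssh university-server 'ssh PC-2 "cat remotefile"' > localfile
echo "data on PC-2" > remotefile
sh -c 'cat remotefile' > localfile  # 'sh -c' plays the role of the ssh hop
cat localfile  # prints: data on PC-2
rm remotefile localfile
```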
You can use an ssh tunnel, to connect to PC-2 from PC-1 using university-server as an intermediate.
Establish the tunnel:
ssh -f -N university-server -L 2222:PC-2:22
The tunnel will be kept in the background until the ssh process is killed.
scp file transfer:
scp -P 2222 user@localhost:file .
scp -P 2222 file user@localhost:path

How to SCP a file from a 2-deep connection

Say I SSH into a server Server1 and from there SSH into server Server2 which is only accessible from a connection to Server1. Below simulates the example terminal commands for this behaviour:
[name@mylaptop]$ ssh user@Server1
user@Server1's password:
*** Welcome to Server1! ***
[user@Server1]$ ssh user2@Server2
user2@Server2's password:
*** Welcome to Server2! ***
[user2@Server2]$
Now I have a file, named file.txt in my home directory on Server2:
[user2@Server2]$ ls
file.txt
[user2@Server2]$
Is it possible to use scp to copy file.txt from Server2 onto mylaptop with a single command (i.e. not needing to first copy the file to Server1)?
In other words, can this be done easier than the following:
[name@mylaptop]$ ssh user@Server1
user@Server1's password:
*** Welcome to Server1! ***
[user@Server1]$ scp user2@Server2:~/file.txt .
user2@Server2's password:
file.txt 100% 690 0.7KB/s 00:00
[user@Server1]$ logout
Connection to Server1 closed.
[name@mylaptop]$ scp user@Server1:~/file.txt .
user@Server1's password:
file.txt 100% 690 0.7KB/s 00:00
[name@mylaptop]$ ls
file.txt
It's possible and relatively easy, even when you need to use certificates for authentication (typical in AWS environments).
The command below will copy files from a remotePath on server2 directly into your machine at localPath. Internally the scp request is proxied via server1.
scp -i user2-cert.pem -o ProxyCommand="ssh -i user1-cert.pem -W %h:%p user1@server1" user2@server2:/<remotePath> <localpath>
If you use password authentication instead, try with
scp -o ProxyCommand="ssh -W %h:%p user1@server1" user2@server2:/<remotePath> <localpath>
If you use the same user credentials in both servers:
scp -o ProxyCommand="ssh -W %h:%p commonuser@server1" commonuser@server2:/<remotePath> <localpath>
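With OpenSSH 7.3 or newer, the ProxyCommand option can be replaced by ProxyJump, and the hop can be stored in ~/.ssh/config so scp no longer needs any extra flags (a sketch reusing the names from the commands above):

```
Host server2
    User user2
    ProxyJump user1@server1
```

After which scp server2:/<remotePath> <localpath> goes through server1 automatically.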
You can use port forwarding:
Execute
ssh -L60000:Server2:22 user@Server1
in one terminal and keep this process open.
Then in another terminal run
scp -P 60000 user2@localhost:file.txt .
(You can replace 60000 by your favourite port number)
Try the answers on ServerFault:
https://serverfault.com/questions/37629/how-do-i-do-multihop-scp-transfers.
The answers cover a variety of flavours of ssh.