Rsync over SSH with key gets an error on EC2

I can ssh into my EC2 with:
ssh -i /Users/User/Downloads/key.pem ubuntu@ec2-myec2.amazonaws.com
My directory:
/A
  /B
    /folderToTransfer
I can rsync into the same directory with:
rsync -avrz -e “ssh -i /Users/User/Downloads/key.pem” \
/Users/User/Documents/Programming/A/B/folderToTransfer \
ubuntu@ec2-myec2.amazonaws.com
New directory:
/A
  /B
    /folderToTransfer
    /ubuntu@ec2-myec2.amazonaws.com
But this fails (when adding :~/ to the end)
rsync -avrz -e “ssh -i /Users/User/Downloads/key.pem” \
/Users/User/Documents/Programming/A/B/folderToTransfer \
ubuntu@ec2-myec2.amazonaws.com:~/
With the error
rsync: Failed to exec \#342\#200\#234ssh: No such file or directory (2)
rsync error: error in IPC code (code 14) at /BuildRoot/Library/Caches/com.apple.xbs/Sources/rsync/rsync-47/rsync/pipe.c(86) [sender=2.6.9]
rsync: connection unexpectedly closed (0 bytes received so far) [sender]
rsync error: error in rsync protocol data stream (code 12) at /BuildRoot/Library/Caches/com.apple.xbs/Sources/rsync/rsync-47/rsync/io.c(453) [sender=2.6.9]
Others don't seem to have this problem when rsyncing with ssh -i. What have I done wrong?

rsync -avrz -e “ssh -i /Users/User/Downloads/key.pem” \
               ^                                    ^
You're not using the ASCII double quote character " here. You're using some kind of open- and close-quote characters intended for typesetting. Your command is failing because the shell doesn't treat these characters as quote marks; rsync ends up trying to execute a program named “ssh.
Replace the characters with ASCII double quotes:
rsync -avrz -e "ssh -i /Users/User/Downloads/key.pem" \
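For reference, the full command from the question with ASCII quotes (and @ before the hostname) would look roughly like this, using the same paths:
rsync -avrz -e "ssh -i /Users/User/Downloads/key.pem" \
  /Users/User/Documents/Programming/A/B/folderToTransfer \
  ubuntu@ec2-myec2.amazonaws.com:~/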

The AWS EC2 solutions below (with keys) only work for me when the command line is formatted correctly.
Solution 1:
rsync -avz -e "ssh -i /root/xxxxxx.pem" ec2-user@xxxxxx.com:/var/www/ /var/www/copywww/
Solution 2:
rsync -avrz -e " ssh -i /root/xxxxxx.pem " ec2-user@xxxxx.com:/var/www/ /var/www/copywww
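To sanity-check the quoting before copying anything, the same command can be run with rsync's -n (--dry-run) flag first; this is just a sketch reusing the placeholder paths above:
rsync -avzn -e "ssh -i /root/xxxxxx.pem" ec2-user@xxxxxx.com:/var/www/ /var/www/copywww/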

Related

Streaming stdout from remote shell call

I have a read-only remote filesystem that stores logs.
I use ssh -t to run grep queries on these logs. Sometimes the queries take too long and cause the SSH connection to time out.
Is there some way to stream the stdout back and keep ssh connection alive?
Example command:
ssh -t my-host.com "cd /path/to/my/folder ; find ./ -name '*' -print0 | xargs -0 -n1 -P8 zgrep -B 5 -H 'My search string'" > search_result.txt
Thanks
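Not from the original thread, but a commonly used way to keep an idle-looking SSH connection from being dropped is the client-side keepalive options ServerAliveInterval and ServerAliveCountMax; a sketch applied to the example command:
# send a keepalive probe every 60 seconds; allow up to 10 unanswered probes
ssh -t -o ServerAliveInterval=60 -o ServerAliveCountMax=10 my-host.com \
  "cd /path/to/my/folder ; find ./ -name '*' -print0 | xargs -0 -n1 -P8 zgrep -B 5 -H 'My search string'" > search_result.txt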

Sudo over SSH mixes up password tty and stdin

Setup:
Local *nix machine with a SQL script script.sql (Postgres).
Remote machine remote (Debian 7) with Postgres.
I can SSH in as some_user, who is a sudoer.
Anything with Postgres needs to be done as postgres user.
The server only listens on localhost:5432.
How do I execute script.sql on remote without copying it there first?
This works well:
ssh -t some_user@remote 'sudo -u postgres psql -c "COMMANDS FOO BAR"'
The -t flag means that sudo will ask for some_user's password correctly on the local terminal.
One thing remains, to be able to pipe script.sql to psql. This does not work:
ssh -t some_user@remote 'sudo -u postgres psql' < script.sql
It fails with the message:
Pseudo-terminal will not be allocated because stdin is not a terminal.
sudo: no tty present and no askpass program specified
Edit: simplified example
Postgres and psql don't seem to figure much in the problem. The following code has the same issues:
ssh some_user@remote xargs sudo ls < input_file
The problem seems to be: we need to send 2 inputs to sudo, both the password using a tty, and the stdin to pass to ls.
Edit: even simpler
ssh localhost xargs sudo ls < input_file
sudo: no tty present and no askpass program specified
Adding -t does not work:
$ ssh -t localhost xargs sudo ls < input_file
Pseudo-terminal will not be allocated because stdin is not a terminal.
sudo: no tty present and no askpass program specified
Adding another -t does not work either:
$ ssh -t -t localhost xargs sudo ls < input_file
<content of input_file>
<waiting on a prompt>
ssh -T some_user@remote "sudo -u postgres psql -f-" < script.sql
"-f-" will read the script from STDIN. Just redirect the file in there, and there you go.
Don't bother with the -t option to ssh; you don't need a full terminal for this.
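For example, the same pattern with a target database and early abort on errors (the -d mydatabase and -v ON_ERROR_STOP=1 parts are additions for illustration, not from the question):
ssh -T some_user@remote "sudo -u postgres psql -d mydatabase -v ON_ERROR_STOP=1 -f-" < script.sql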
ssh -T ${user}@${ip} sudo -u postgres DEBIAN_FRONTEND=noninteractive psql -f- < test.sql
Use DEBIAN_FRONTEND=noninteractive to resolve "no tty present", or the equivalent for your distribution.

Permission denied using ssh command in shell

I'm trying to execute this shell script from the command line:
host="192.168.X.XXX"
user="USERNAME"
pass="MYPASS"
sshpass -p "$pass" scp -o StrictHostKeyChecking=no /home/MYPATH/File.import "$user@$host:/"home/MYPATH/
This is meant to copy a file from my local server to a remote server. The remote server is a copy of the local one, but when I try to execute this script I get this error:
Permission denied, please try again.
I don't understand why the same command works when I run it directly on the command line:
USERNAME@MYSERVER:~$ sshpass -p 'MYPASS' scp -o StrictHostKeyChecking=no /home/MYPATH/File.import USERNAME@192.168.X.XXX:/home/MYPATH/
Does somebody have a solution?
Please use a pipe or the -e option for the password anyway.
export SSHPASS=password
sshpass -e ssh user@remote
Your simple command with -e option:
export SSHPASS=password
sshpass -e scp -o StrictHostKeyChecking=no /home/MYPATH/File.import user@192.168.X.XXX:/home/MYPATH/
Please remove the wrong quotes from your command:
sshpass -p "$pass" scp -o StrictHostKeyChecking=no /home/MYPATH/File.import $user@$host:/home/MYPATH/
You should also be able to remove the quotes around $pass.
Please ensure that you have no special characters in your pass variable or escape them correctly (and no typos anywhere).
For simplicity, use an ssh command instead of scp for testing (see the example below).
Use the -v or -vvv option for the scp command to check what scp is trying to do. Also check the secure log or auth.log on the remote server.
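A minimal test along those lines might look like this (a sketch; $user, $pass, and $host are the variables from the question):
export SSHPASS="$pass"
# -v shows which authentication methods are tried and which one fails
sshpass -e ssh -v -o StrictHostKeyChecking=no "$user@$host" true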
You have to install the "sshpass" command, then use the snippet below:
export SSHPASS=password
sshpass -e sftp user@hostname << !
cd sftp_path
put filename
bye
!
A gotcha I encountered was having to escape special characters in the password, which isn't necessary when entering it in interactive ssh mode.
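For instance, single-quoting the value when exporting it keeps the shell from touching characters like $ or ! before sshpass sees them (the password here is purely hypothetical):
export SSHPASS='Pa$$word!23'   # single quotes: nothing inside is expanded by the shell
sshpass -e ssh user@hostname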

Rsync issue pulling from remote server to local server

I've been hitting a wall with this one and I'm not entirely sure what the issue may be. I will post two examples of what I'm trying to do: one which works and one which doesn't. Unfortunately, I need the latter to work.
The Error:
Unexpected local arg: /data/
If arg is a remote file/dir, prefix it with a colon (:).
rsync error: syntax or usage error (code 1) at main.c(1215) [receiver=3.0.6]
This command executes without issues; however, it seems that it does not fully rsync my /data directory.
rsync -e "ssh -e 'none'" --force -azPxvIh --delete-after --exclude-from="/tmp/exclude.txt" root@myserver:/ / > "/tmp/wetrun.txt"
This is the command I've been trying to use to rsync my /data directory, and it fails.
rsync -e "ssh -e 'none'" --force -azPxvIh --delete-after --exclude-from="/tmp/exclude.txt" root@myserver:/data/ /data/ > "/tmp/wetrun.txt"
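Not an answer from the thread, but one way to narrow this down is to strip the failing command to the bare remote-to-local pair and add the other options back one at a time; if the minimal form parses cleanly, the quoting of one of the removed options is the likely culprit:
# minimal pull, dry run only (-n), no custom -e command and no exclude file
rsync -avn root@myserver:/data/ /data/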

Problems with ${(z)var}

Code:
HOST=localhost
PORT=1234
RSYNCCMD="rsync -avP -e \"ssh -p $PORT\""
${(z)RSYNCCMD} root@$HOST:"\"/foo\"" /bar
Output:
rsync: Failed to exec ssh -p 1234: No such file or directory (2)
...
If I enter the same thing (rsync -avP -e "ssh -p 1234" ...) directly into the console, it works.
How do I fix it?
Using ${(Q)${(z)RSYNCCMD}} might work for you (instead of ${(z)RSYNCCMD}).
(${(z)RSYNCCMD} seems to expand to rsync -avP -e \"ssh\ -p\ 1234\"; (Q) does an additional round of unquoting.)
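Spelled out with the snippet from the question (this is just the answer's suggestion applied; the host, port, and paths are still the question's placeholders):
HOST=localhost
PORT=1234
RSYNCCMD="rsync -avP -e \"ssh -p $PORT\""
# (z) splits the string into shell words, (Q) strips one level of quoting
${(Q)${(z)RSYNCCMD}} root@$HOST:"\"/foo\"" /bar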