Upload via scp gives "Permission denied" on Karaf ssh console - ssh

The scp command to fetch a file from the Karaf directories via the Karaf ssh console works well:
scp -P 8101 karaf@localhost:/deploy/README
(after I have entered the password)
But the reverse operation, uploading a file, fails with a "Permission denied" error:
scp README -v -P 8101 karaf@localhost:/deploy/
I tried removing the file first, same error. I gave 777 permissions on the "deploy" directory, and also tried with a new test directory.
Where can this come from?
Thanks
Arnaud

Found it: the -P option cannot go just anywhere on the line; it has to come before the local and remote filenames.
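With the options moved in front of the file arguments, the same upload goes through (same port and paths as in the question; scp stops parsing options at the first non-option argument, which is likely why the original ordering failed):

```shell
# Options (-v, -P) must precede the source and destination arguments:
scp -v -P 8101 README karaf@localhost:/deploy/
```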

Related

SSH failed: No such file or directory (2)

I am trying to copy files with rsync. I have checked multiple times that the address is correct, but I still get the feedback "no such directory". The command is:
rsync -azvrP -e ssh master_123@165.x.x.115:/applications/123/public_html/wp-content/uploads/2023/ /applications/321/public_html/wp-content/uploads/2023/
What could be the error?

SCP log file to server

Hello, I am trying to SCP a log file to a server and I keep getting this error:
Warning: Identity file ids-east-1.pem not accessible: No such file or directory.
ec2-11.com: Permission denied (publickey).
lost connection
I have tried all the solutions presented earlier but can't seem to figure out what's wrong.
The command I am using is:
scp -r -i ids-east-1.pem ~/int/resources/tests/tasks/lib/testing.log ec2-user@11.com:/home/wn/shelf/wrDb/fractions
Just a reminder: I am able to get a log file from this server using:
scp -i ids-east-1.pem ec2-user@11.com:/home/wn/shelf/wrDb/fractions/chrono.log ~/Desktop/aws_chrono.log
If one command works, but the other gives you:
Warning: Identity file ids-east-1.pem not accessible: No such file or directory.
You are likely not running the two commands from the same directory. Try specifying the full path to the key (something like):
scp -i ~/.ssh/ids-east-1.pem ...
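For example, the failing upload with the key path spelled out in full (assuming the key actually lives in ~/.ssh; adjust the path to wherever you saved it):

```shell
scp -i ~/.ssh/ids-east-1.pem \
    ~/int/resources/tests/tasks/lib/testing.log \
    ec2-user@11.com:/home/wn/shelf/wrDb/fractions
```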

Trying to transfer local files to web server

I recently set up a LAMP stack on Ubuntu 14.04 for my web server. I'm working through DigitalOcean. These are the steps I went through...
On my local machine I logged in to my web server with
sftp user@web_server_ip
Then
sftp> cd /var/www/html
How would I go about getting the files for the site from my local machine, and how would I transfer them?
I know that I have to use the [get] and [put] commands.
I'm just confused about what's considered local and what's remote, given that I'm logged in to the remote server from my local machine. Am I overthinking it?
This is the tutorial I'm trying to follow: How To Use SFTP to Securely Transfer Files with a Remote Server
Edit:
So I tried moving a whole directory from my local machine, and this is what I ended up doing:
scp -r /path/directory_name name@ip_address:/var/www/html
scp: /var/www/html/portfolio.take7: Permission denied
Should I be changing permissions by using sudo prior to scp -r?
Edit2:
I have also tried
Where_directory_is$ scp -r /path/directory_name name@ip_address:/var/www/html
/var/www/html: No such file or directory
It might be easier to start with SCP which allows you to copy files with one command. So for example, if you had a local file /path/filename.css and wanted to transfer it to your server, you could use the following command on your local machine:
scp /path/filename.css username@remote_hostname_or_IP:~
This command copies the local file and transfers it to the home directory of the username on the remote server using SSH. You can then SSH in (ssh username@remote_hostname_or_IP) and then do what you need with the file sitting in your home directory, such as move it to the proper Apache directory.
Once you start to get more comfortable, you can switch to sftp if you like.
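To make the local/remote distinction concrete, here is a minimal sftp session sketch (the file paths are illustrative): put always uploads from the machine you started sftp on, and get always downloads from the server you connected to, no matter which prompt you are looking at.

```shell
# put = local -> remote, get = remote -> local.
sftp user@web_server_ip <<'EOF'
cd /var/www/html
put /home/you/site/index.html
get index.html /tmp/
EOF
```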
Update
Here is how to set up your Apache permissions. Let's say you have an account you on the Linux computer running Apache, and we'll say the IP is 192.168.1.100.
On your local machine, create this shell script, secure.sh, and remember shell scripts need to have execute privileges (chmod +x secure.sh). Fill it with the following contents:
#!/usr/bin/env bash
# Lockdown the public web files
find /var/www -exec chown you:www-data {} \;
find /var/www -type d -exec chmod -v 750 {} \;
find /var/www -type f -exec chmod -v 640 {} \;
This shell script sets the permissions for everything in the /var/www/ directory: 750 for directories and 640 for files. That gives your account full read/write access to the files, and www-data (the account Apache runs as) read access. Run it any time you have uploaded files, to ensure the permissions are always set correctly.
Next, SSH into your remote computer and go to the /var/www/html directory. Ensure that the ownership is not set to root. If it is, scp the secure.sh file into your remote computer, become root and run it. This only needs to be done once, so you can remotely set the permissions.
Now you can copy directly to /var/www/html with the scp -r command, run on your local computer from the top of the directory tree you wish to copy:
scp -r ./ you@192.168.1.100:/var/www/html/
Then run this command to remotely run the secure.sh shell script and send the output to out.txt:
ssh you@192.168.1.100 -p 23815 ./secure.sh > out.txt
Then cat out.txt to see that the file permissions changed accordingly.
If this is a public-facing computer, then you should add an SSH key to your scp connection. You can use this tutorial to find out more about generating your own keys; it is quite easy. To use the key, you only need to add -i private_key_file to your scp and ssh commands. Lastly, it would actually be safer to keep the /var/www files owned by root, SSH into the computer, su to become root, then run secure.sh as root (with the owner changed to root in the shell script). It all depends on the level of security you need. If it is a development computer (which is what I am assuming), no worries.
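With a key, the two commands above only grow an -i flag (the key filename here is a made-up example):

```shell
# Copy the site up, then run the permission script, authenticating with a key:
scp -r -i ~/.ssh/deploy_key ./ you@192.168.1.100:/var/www/html/
ssh -i ~/.ssh/deploy_key you@192.168.1.100 ./secure.sh > out.txt
```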
For directories use
scp -r root@yourIp:/home/path/ /pathOfDirectory/
For a single file
scp root@yourIp:/home/path/file /pathOfDirectory/fileNameCopied

Can't rsync into subfolder, or even ssh at this point

I need to rsync log files like this:
rsync --progress -rvze ssh name@host:/path/to/folder/*.log
When I run this command though, I get an error:
rsync: getcwd(): No such file or directory (2)
No such file or directory? That's odd. So I try to ssh directly:
ssh name@host
it prompts me to log in, which I do, then I type
cd /path/to/folder
which works fine (log files are present).
I double checked my ssh keys, everything seems to be in order there, but for some reason I can't ssh into a subfolder on this host, so there's no way I can get rsync working correctly.
EDIT:
Running the identical rsync command on my Mac, it works fine. Running it in my ubuntu EC2 instance is still failing.
Are you sure there are any log files at all? If not, this command will fail with 'No such file or directory'.
Rather use:
rsync --progress -rvze ssh --include='*.log' --exclude='*' name@host:/path/to/folder/ local_folder
(the --exclude='*' is needed so that only the included *.log files are transferred)
The 'direct' ssh syntax you used in your second test is not supported:
ssh name@host:/path/to/folder/
because ssh would treat host:/path/to/folder/ as the hostname.
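If the intent was a one-off remote command rather than a copy, the supported form keeps the path out of the host part:

```shell
# ssh takes user@host plus an optional command to run remotely:
ssh name@host 'ls /path/to/folder/*.log'
```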

rsync deploy and file/directories permissions

I'm trying to use rsync to deploy my website, which resides on a shared web host.
Phpsuexec runs on it, and that caused me permission problems with the files and directories I transferred via rsync. Files should be set to 644 and directories to 755; otherwise I get a 500 error.
After several attempts, I came up with this rsync command:
rsync -avz -e ssh --chmod=Du=rwx,go=rx,Fu=rw,og=r -p --exclude-from=/var/www/mylocalfolder/.rsyncignore /var/www/mylocalfolder/ user@mywebsite.net:~/
Unfortunately this command doesn't work as expected: all the transferred directories end up with mode 744. On the other hand, file permissions are correctly set to 644.
I can't understand what is wrong.
P.S. I use Linux on my local machine.
Try it like this:
--chmod=Du=rwx,Dg=rx,Do=rx,Fu=rw,Fg=r,Fo=r
It worked for me. A --chmod rule without a D or F prefix applies to both directories and files, so the trailing og=r in your spec stripped the execute bit from the directories; prefixing every rule keeps the two sets separate.
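Since the asker is on Linux, the effect of the prefixed spec can be sanity-checked locally with plain chmod (a sketch: the paths are temporary, and stat -c assumes GNU coreutils):

```shell
#!/usr/bin/env bash
# Du=rwx,Dg=rx,Do=rx should yield 755 on directories,
# Fu=rw,Fg=r,Fo=r should yield 644 on files.
demo=$(mktemp -d)
mkdir "$demo/site"
touch "$demo/site/index.php"
chmod u=rwx,g=rx,o=rx "$demo/site"         # the directory rules
chmod u=rw,g=r,o=r "$demo/site/index.php"  # the file rules
stat -c '%a' "$demo/site"                  # prints 755
stat -c '%a' "$demo/site/index.php"        # prints 644
```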