SCP folder that does not exist in remote - scp

How can I copy a folder and its contents (including subfolders) from my local machine to a remote machine using SCP?
I want to copy:
/wp-content
to my remote server, but wp-content does not exist there.

I would tar the folder first and then send it over scp.
First, tar:
tar -zcvf wp-content.tar.gz ./wp-content
Then send it over scp:
scp ./wp-content.tar.gz your_username@remotehost.com:/some/remote/directory
Then log in to the remote machine:
ssh your_username@remotehost.com
Navigate to the directory and untar:
cd /some/remote/directory
tar -xzvf wp-content.tar.gz
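The tar-copy-untar steps above can also be collapsed into a single pipeline over ssh. Below is a minimal local sketch of the same tar round-trip (all file names and the commented-out host are placeholders), so you can see the `-zcf`/`-xzf` pairing work end to end:

```shell
# One-pass variant over ssh (placeholder host and path, not run here):
#   tar -zcf - ./wp-content | ssh your_username@remotehost.com 'tar -xzf - -C /some/remote/directory'

# Local round-trip demonstrating the same tar flags:
tmp=$(mktemp -d)
mkdir -p "$tmp/wp-content/themes"
echo "hello" > "$tmp/wp-content/themes/style.css"

# Pack the folder (z = gzip, c = create, f = archive file)
tar -zcf "$tmp/wp-content.tar.gz" -C "$tmp" wp-content

# Unpack into a separate directory, as you would on the remote side
mkdir "$tmp/remote"
tar -xzf "$tmp/wp-content.tar.gz" -C "$tmp/remote"

cat "$tmp/remote/wp-content/themes/style.css"   # prints "hello"
```

The `-C` flag keeps absolute paths out of the archive, so it unpacks cleanly wherever you extract it.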

rsync -av -e ssh /wp-content user@server_ip:/path/some_non_existing_dir_to_copy/

scp -r wp-content user@remotehost:/destination
scp -r (recursive copy) will create the destination folder on the remote side if it doesn't exist already (provided its parent directory does).


Windows 10 SSH folder and known_hosts file is missing

On Windows it is usually stored in the %USERPROFILE%\.ssh folder.
However, I do not see the .ssh folder when going to %USERPROFILE%.
Is it possible to create the ssh folder and the known_hosts file myself?
Yes, this is expected: the folder is not created until something needs it.
In a CMD session you can do:
cd "%USERPROFILE%"
mkdir .ssh
From there, assuming you have ssh-keygen in your PATH (which is included in Git For Windows for example), you can type:
ssh-keygen -t rsa -P ""
That will generate a key pair in the default path ~/.ssh/id_rsa[.pub], with ~/.ssh translating to %USERPROFILE%\.ssh
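As a sketch, the same steps work from any shell that has ssh-keygen on PATH. Here the key is written into a throwaway directory standing in for %USERPROFILE%\.ssh, and known_hosts is simply created empty (ssh also appends host entries to it automatically on first connect):

```shell
# Stand-in for %USERPROFILE%\.ssh (throwaway path, for illustration only)
sshdir=$(mktemp -d)/.ssh
mkdir -p "$sshdir"

# Generate a key pair non-interactively (-N "" = empty passphrase, -q = quiet)
ssh-keygen -t rsa -N "" -q -f "$sshdir/id_rsa"

# known_hosts can simply be created empty; ssh fills it in as you connect
touch "$sshdir/known_hosts"

ls "$sshdir"   # id_rsa  id_rsa.pub  known_hosts
```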

Can't write to a file with sshfs

I'm trying to mount a directory from a remote machine to my laptop.
Here is the command.
sshfs user01:/home/user01/somedir /home/user02/mount -o allow_other -o rw
When I try to write a file, I get:
E212: Can't open file for writing
Here are the contents of /etc/fuse.conf
user_allow_other
The permissions on the remote dir /home/user01/somedir are:
-rw-r--r-- 1 user01 users
I prefer not to change the permissions on the remote machine.
Here is the command that worked.
sshfs -o sftp_server="/usr/bin/sudo -u user01 /usr/libexec/sftp-server" you@<server>:/home/user01 /source/home_dir/mountpoint
Make sure /usr/libexec/sftp-server is the correct path on the destination server, or adjust accordingly.

Trying to transfer local files to web server

I recently set up a LAMP stack on Ubuntu 14.04 for my web server. I'm working through DigitalOcean. These are the steps I went through...
On local machine I logged in to my web server with
sftp user@web_server_ip
Then
sftp> cd /var/www/html
How would I go about getting the files for the site from my local machine? And how would I transfer them?
I know that I have to use the [get] and [put] commands.
I'm just confused about what's considered local/remote if I'm logged into the remote server from my local machine. Am I overthinking it?
This is the tutorial I'm trying to follow: How To Use SFTP to Securely Transfer Files with a Remote Server
Edit:
So I tried moving a whole directory from my local machine and this is what I ended up doing
scp -r /path/directory_name name@ip_address:/var/www/html
scp: /var/www/html/portfolio.take7: Permission denied
Should I be changing permissions with sudo prior to running scp -r?
Edit2:
I have also tried
Where_directory_is$ scp -r /path/directory_name name@ip_address:/var/www/html
/var/www/html: No such file or directory
It might be easier to start with SCP which allows you to copy files with one command. So for example, if you had a local file /path/filename.css and wanted to transfer it to your server, you could use the following command on your local machine:
scp /path/filename.css username@remote_hostname_or_IP:~
This command copies the local file to the home directory of username on the remote server over SSH. You can then SSH in (ssh username@remote_hostname_or_IP) and do what you need with the file sitting in your home directory, such as move it to the proper Apache directory.
Once you start to get more comfortable, you can switch to sftp if you like.
Update
Here is how to set up your Apache permissions. Let's say you have an account named you on the Linux computer running Apache, and we'll say its IP is 192.168.1.100.
On your local machine, create this shell script, secure.sh, and remember shell scripts need to have execute privileges (chmod +x secure.sh). Fill it with the following contents:
#!/usr/bin/env bash
# Lockdown the public web files
find /var/www -exec chown you:www-data {} \;
find /var/www -type d -exec chmod -v 750 {} \;
find /var/www -type f -exec chmod -v 640 {} \;
This shell script sets the permissions for everything under /var/www: 750 for directories and 640 for files. That gives you full read/write access and gives www-data (the account Apache runs as) read access. Run it any time you upload files to ensure the permissions are set correctly.
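You can try the same find/chmod pattern safely on a scratch directory first (paths below are made up; the chown step is skipped since it needs root):

```shell
tmp=$(mktemp -d)
mkdir -p "$tmp/www/html/css"
echo "body{}" > "$tmp/www/html/css/site.css"

# Same pattern as secure.sh, applied to the scratch tree
find "$tmp/www" -type d -exec chmod 750 {} \;
find "$tmp/www" -type f -exec chmod 640 {} \;

stat -c '%a' "$tmp/www/html"               # 750
stat -c '%a' "$tmp/www/html/css/site.css"  # 640
```

Separating the `-type d` and `-type f` passes is what lets directories keep the execute bit (needed to traverse them) while plain files don't.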
Next, SSH into your remote computer and go to the /var/www/html directory. Ensure that the ownership is not set to root. If it is, scp the secure.sh file to the remote computer, become root, and run it. This only needs to be done once; after that you can set the permissions remotely.
Now you can copy directly to /var/www/ through the scp -r command on your local computer from the top of the directory you wish to copy to /var/www/html:
scp -r ./ you#192.168.1.100:/var/www/html/
Then run this command to remotely run the secure.sh shell script and send the output to out.txt:
ssh you@192.168.1.100 -p 23815 ./secure.sh > out.txt
Then cat out.txt to see that the file permissions changed accordingly.
If this is a public facing computer, then you must add an SSH key to your scp connection. You can use this tutorial to find out more about generating your own keys, it is quite easy. To use the key, you only need to add -i private_key_file to your scp and ssh commands. Lastly, it would actually be safer to keep the /var/www files as root, SSH into the computer, su to become root, then run secure.sh as root (with the owner changed to root in the shell script). It all depends on the level of security you need to worry about. If it is a development computer (which is what I am assuming) no worries then.
For folders use
scp -r root@yourIp:/home/path/ /pathOfDirectory/
For a single file (optionally renaming it at the destination)
scp root@yourIp:/home/path/fileName /pathOfDirectory/fileNameCopied

How to copy entire folder and subfolders from remote server to local machine using PuTTY psftp

I am writing an automation to copy a folder and its subfolders from a remote server to my local machine. I know the command to copy all the files inside a folder:
mget *.extension
But I want to know is there any command in psftp to copy folder and subfolder recursively to my local machine.
You can use scp, for example: scp -r user@remote:/path/to/folder /home/user/Desktop/
To copy files recursively with psftp, use get or mget with the -r (recurse) switch.
For example:
get -r /remote/path C:\local\path
See:
https://the.earth.li/~sgtatham/putty/latest/htmldoc/Chapter6.html#psftp-cmd-get
https://the.earth.li/~sgtatham/putty/latest/htmldoc/Chapter6.html#psftp-cmd-mgetput
Or, as @jbarnett suggested, you can use SCP. PuTTY has an SCP client too, pscp:
pscp -r user#example.com:/remote/path C:\local\path
See https://the.earth.li/~sgtatham/putty/latest/htmldoc/Chapter5.html#pscp

How to use scp on ssh for downloading file to local machine

I had temporary access to server and I made command
scp user@example.com:/home/textfile.txt ./
Now it is just moved to the root of server. How to download it to local machine?
You'll find it in the directory you were in when you ran the scp command: ./ means the current working directory of the machine where the command runs, not a location on the server.
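The local path argument to scp resolves exactly like it does for any other command. A sketch with cp, which uses the same path semantics (the file names are made up):

```shell
tmp=$(mktemp -d)
mkdir "$tmp/downloads"
echo "remote data" > "$tmp/textfile.txt"

# Run the copy from inside downloads/; ./ resolves to that directory,
# just as it would for: scp user@example.com:/home/textfile.txt ./
cd "$tmp/downloads"
cp "$tmp/textfile.txt" ./

ls "$tmp/downloads"   # textfile.txt
```

So if the file ended up on the server, the scp command was run in an SSH session on the server; run it from a terminal on your local machine to download instead.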