I need to upload all the wordpress 4.9.6 files to a VM running Ubuntu on Google cloud.
So far, I've been able to upload individual files via SSH and move them within directories on the server, but when it comes to uploading a whole folder and then moving it, I just can't.
Can someone please be lovely and help me?
You can remote copy a whole folder with scp.
scp -r user@your.server.example.com:/path/to/foo /home/user/Desktop/
From man scp
-r Recursively copy entire directories
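The same flag works in the other direction, from your machine up to the VM. A rough sketch, assuming WordPress is unpacked locally in ~/wordpress and the web root on the VM is /var/www/html (both are assumptions):
# Upload the unpacked WordPress folder from your machine to the VM
scp -r ~/wordpress your_user@your.vm.example.com:/home/your_user/
# Then move it into the web root on the VM (-t gives sudo a terminal for its password prompt)
ssh -t your_user@your.vm.example.com "sudo mv /home/your_user/wordpress /var/www/html/wordpress"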
If you are using a version control system such as Git, you can clone the repository onto the Google Cloud VM. See this useful link.
git clone https://github.com/yourgitaccount/wordpress-project.git
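As a sketch of that approach, you can run the clone directly on the VM over SSH and then copy the files into the web root (the repository name and /var/www/html are assumptions):
# Run these on the VM
git clone https://github.com/yourgitaccount/wordpress-project.git
sudo cp -r wordpress-project/. /var/www/html/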
I'm running Ubuntu and have a remote CentOS system which stores (and has access to) various files and network locations. I have SSH access to the CentOS machine and want to be able to work locally on Ubuntu.
I'm trying to mirror a remote directory structure. The remote directory is structured:
/my_data/user/*
And I want to replicate this structure locally (a lot of scripts rely on absolute paths).
However, for reasons of speed, I want a certain subfolder, for example:
/my_data/user/sourcelibs/
To be stored locally on disk. I know the sourcelibs subfolder doesn't change much (but the rest might). So I can comfortably rsync it:
mkdir -p /my_data/user/sourcelibs/
rsync -r remote_user@remote_host:/my_data/user/sourcelibs/ /my_data/user/sourcelibs/
My question is, if I use sshfs to mount /my_data/user:
sudo sshfs -o allow_other,default_permissions remote_user@remote_host:/my_data/user /my_data/user
Will it overwrite my existing files? Is there a way to have sshfs mount but exclude certain subfolders?
Yes, sshfs will overwrite existing files. I have almost the same use case and just tested this myself. BTW, you'll need to add -o nonempty to your sshfs command since the destination dir /my_data/user already exists.
What I found to work is making a copy of the remote directory that excludes the large subdirectories. I don't know whether keeping two copies in sync on the remote machine is feasible for your use case, but if you'll mostly be updating on your local machine and rarely making changes remotely, that could work. A sketch of the idea is below.
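As a sketch (the trimmed copy's name and the exclude pattern are assumptions), the copy lives on the remote machine and is refreshed periodically with rsync; for your case the folder to leave out would be sourcelibs, since you already rsync that one locally:
# On the remote machine: maintain a copy of the tree without the subfolder you keep locally
rsync -a --delete --exclude='sourcelibs/' /my_data/user/ /my_data/user_trimmed/
You would then point sshfs at /my_data/user_trimmed instead of /my_data/user.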
I've switched to Windows and am having a hard time using Bitbucket with it.
Within the Downloads menu you can add files, but these are not added to source.
I've also tried using Sourcetree (web application), but when I attempt to push, the files do not exist in my directory. Any idea how I can do this, or a guide which explains how to upload files? It's been a number of years since I have used Bitbucket on Windows.
Bitbucket is simply a git provider. You'll have to use Git commands (or a GUI like Sourcetree, Gitkraken, etc.) to push (upload) your files to source.
A basic guide to working with Git can be found here.
On Windows or Linux, you only need a Git console, or the git CLI available in your default shell.
First, clone your repository to your local system. Go to your Bitbucket account, open the repository the file needs to be uploaded to, and copy its Git clone URL. Open the Git console and type the following command:
git clone "git URL"
Explore your repository and make your changes, e.g. add the file which you want to upload.
Then go to your Git console again and type the following commands:
git add .
git commit -m "commit message like - adding file"
git push origin master
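Put together, a minimal end-to-end run might look like this (the repository URL and file name are placeholders, not your actual repo):
git clone https://bitbucket.org/youraccount/yourrepo.git
cd yourrepo
cp ~/Documents/report.txt .      # the file you want to upload (name is an assumption)
git add .
git commit -m "Add report.txt"
git push origin master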
Original question: Let's say I have a remote folder with .R, .py and .pkl files. How to avoid syncing .pkl files or how to sync .R files only? How to avoid syncing specific sub-folders?
Sorry, my understanding was wrong: sshfs is not a sync utility, but software for mounting a remote system, accessible via SSH, onto a local folder.
I am not deleting this question so that it helps future readers.
Now, my next question would be: how do I selectively block sub-folders in the remote folder from being shown in the local folder, and how do I selectively block sub-folders in the local folder from being updated in the remote folder?
You can bind-mount the directories you need with mount's --bind option; those mounts shadow the corresponding paths and so exclude them from the sshfs view.
For example, if you have mounted a directory at /var/www/site/ via
sshfs remote_user@remote_host:/var/www/site /var/www/site
Then you can back the cache dir with a local directory:
mount --bind /tmp/site1cache /var/www/site/cache
And cache will be local-only :)
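Putting it together as a fuller sketch (the local directory /tmp/site1cache is an assumption), the order matters: mount the remote tree first, then shadow the subfolder you want to keep local:
# 1. Mount the remote tree over SSH
sshfs remote_user@remote_host:/var/www/site /var/www/site
# 2. Shadow the cache subfolder with a local directory so writes to it stay local
mkdir -p /tmp/site1cache
sudo mount --bind /tmp/site1cache /var/www/site/cache
# To undo, unmount in reverse order
sudo umount /var/www/site/cache
fusermount -u /var/www/site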
I'm not sure how to set the publication settings...
... My RPi is at 192.168.2.126, and is running Apache and ftp.
... The site is to be located in the folder /var/www/GarageDoor on the RPi
... The site is accessed as http://192.168.2.126/GarageDoor/GarageDoors.html
I'm also concerned because my ftp client can't move the file(s) associated with this site directly to /var/www/GarageDoor either. I end up transferring them to my /usr folder, then copying the files manually to the /var/www... folder.
Seems like you need "sudo" permission to copy a file to this folder. I can't figure out how to give either Kompozer or my ftp client such permission. (I'm using bitvise sftp client)
Any ideas would be appreciated.
This sounds a lot like a permissions error. Enter the following into the raspberry pi command line:
sudo chmod 777 /var/www/GarageDoor/GarageDoors.html
That will make the file readable (and writable) by anyone. If that doesn't work, try the +x option to make the files executable.
sudo chmod +x -R /var/www/GarageDoor
Note on the second command: This will make all files in /GarageDoor have executable permissions. (-R is for recursive)
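If you would rather not make the folder world-writable, an alternative sketch (assuming the default pi user is the one your SFTP client logs in as) is to give that user ownership of the site folder so files can be uploaded to it directly:
sudo chown -R pi:pi /var/www/GarageDoor
ls -l /var/www/GarageDoor    # verify the new owner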
I used scp to copy files and the transfer shows it was successful, but I don't see anything in my local folders.
The command:
scp name1@server1.edu:/file/*.* ~/Desktop/
I am running Debian, if that might be of some help.
Check the SFTP home path. Your successfully transferred files are copied to this path by default. Hope this helps!
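A quick way to check where the files actually went, as a sketch: list your home directory and the intended destination by modification time, and quote the remote glob so it is passed to the server rather than handled by your local shell:
ls -lt ~ | head
ls -lt ~/Desktop | head
scp 'name1@server1.edu:/file/*.*' ~/Desktop/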