Can I copy a hard drive through ssh?

Can I copy an entire hard drive of one computer in one go over ssh?
I have an old windows machine that I am ready to put Linux on but I would like to backup its contents.

The command you're looking for is scp
scp -r username@remotehost:/disk /some/local/directory
Note the -r flag to make it transfer directories recursively.
https://linux.die.net/man/1/scp
It transfers files over ssh.
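If the drive's contents are reachable at a single path, compression can speed the copy over slow links. A minimal sketch, assuming the Windows machine runs an SSH server and exposes the drive at /disk (both are assumptions about your setup):
# -r recurses into directories, -C compresses data in transit
scp -rC username@remotehost:/disk /some/local/directory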

Related

sshfs: will a mount overwrite existing files? Can I tell it to exclude a certain subfolder?

I'm running Ubuntu and have a remote CentOS system which stores (and has access to) various files and network locations. I have SSH access to the CentOS machine and want to be able to work locally on Ubuntu.
I'm trying to mirror a remote directory structure. The remote directory is structured:
/my_data/user/*
And I want to replicate this structure locally (a lot of scripts rely on absolute paths).
However, for reasons of speed, I want a certain subfolder, for example:
/my_data/user/sourcelibs/
To be stored locally on disk. I know the sourcelibs subfolder doesn't change much (but the rest might). So I can comfortably rsync it:
mkdir -p /my_data/user/sourcelibs/
rsync -r remote_user@remote_host:/my_data/user/sourcelibs/ /my_data/user/sourcelibs/
My question is, if I use sshfs to mount /my_data/user:
sudo sshfs -o allow_other,default_permissions remote_user@remote_host:/my_data/user /my_data/user
Will it overwrite my existing files? Is there a way to have sshfs mount but exclude certain subfolders?
Yes, in effect: mounting with sshfs will hide your existing files. The mount shadows whatever is in /my_data/user until you unmount it, so nothing local is deleted, but you won't see it while the mount is active. I have almost the same use case and just tested this myself. BTW, you'll need to add -o nonempty to your sshfs command, since the destination directory /my_data/user already exists.
What I found to work is to make a copy of the remote directory excluding the large subdirectories. I don't know if keeping two copies in sync on the remote machine is feasible for your use case, but if you'll mostly be updating on your local machine and rarely making changes remotely, that could work.
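A minimal sketch of that two-copy approach, assuming you have shell access on the remote machine and space there for a second copy (/my_data/user_trimmed is a hypothetical path):
# on the remote machine: maintain a copy without the big subfolder
rsync -a --delete --exclude='sourcelibs/' /my_data/user/ /my_data/user_trimmed/
# on the local machine: mount the trimmed copy over the full tree
sudo sshfs -o allow_other,default_permissions remote_user@remote_host:/my_data/user_trimmed /my_data/user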

Wordpress on Google Cloud VM

I need to upload all the WordPress 4.9.6 files to a VM running Ubuntu on Google Cloud.
So far, I've been able to upload individual files via SSH and move them within directories on the server, but when it comes to uploading a whole folder and then moving it, I just can't.
Can someone please be lovely and help me?
You can remote copy a whole folder with scp.
scp -r user@your.server.example.com:/path/to/foo /home/user/Desktop/
From man scp
-r Recursively copy entire directories
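Note that the example above copies from the server to your machine; to upload instead, swap the two arguments. A minimal sketch, assuming your WordPress files sit in a local folder named wordpress and the web root is /var/www/ (both are assumptions about your setup):
# push the local folder up to the VM over ssh
scp -r ~/wordpress user@your.server.example.com:/var/www/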
If you are using a version control system such as git, you can clone the repository straight to the Google Cloud VM:
git clone https://github.com/yourgitaccount/worpress-project.git

Kompozer Can't "Publish" web page to my Raspberry Pi Server

I'm not sure how to set the publication settings...
... My RPi is at 192.168.2.126, and is running Apache and ftp.
... The site is to be located in the folder /var/www/GarageDoor on the RPi
... The site is accessed as http://192.168.2.126/GarageDoor/GarageDoors.html
I'm also concerned because my ftp client can't move the file(s) associated with this site directly to /var/www/GarageDoor either. I end up transferring them to my /usr folder, then copying the files manually to the /var/www... folder.
Seems like you need "sudo" permission to copy a file to this folder. I can't figure out how to give either Kompozer or my ftp client such permission. (I'm using bitvise sftp client)
Any ideas would be appreciated.
This sounds a lot like a permissions error. Enter the following into the raspberry pi command line:
sudo chmod 777 /var/www/GarageDoor/GarageDoors.html
That makes the file readable by anyone. If that doesn't work, try adding execute permission, since directories must be executable to be traversed:
sudo chmod +x -R /var/www/GarageDoor
Note on the second command: -R applies the change recursively, so every file under /GarageDoor also becomes executable.
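As an alternative to loosening permissions for everyone, a common approach is to give your login user ownership of the site folder so the sftp client can write to it without sudo. A sketch assuming the default Raspbian user pi and Apache's www-data group (adjust to your setup):
# hand the folder to the pi user; keep the web server's group on it
sudo chown -R pi:www-data /var/www/GarageDoor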

SCP copies files successfully, but files not visible in local computer

I used scp to copy files; the transfer reports success, but I don't see anything in my local folders.
The command:
scp name1@server1.edu:/file/*.* ~/Desktop/
I am running Debian, if that might be of some help.
Check the SFTP home path; your successfully transferred files are copied there by default. Hope this helps!
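One quick check, assuming the command was run from your local Debian machine: list the destination explicitly. If the command was instead run while logged in to the server over ssh, the files landed in the Desktop folder of your account on the server, not on your local computer.
# confirm where the files actually landed
ls -l ~/Desktop/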

How to backup source repository and zip destination folders?

All sources are on a Windows machine, and the backup destination is on a Unix system (we are using Samba).
My source repository is similar to :
-Repository
--Folder1
---Files
--Folder2
---Files
etc...
I would like to get a destination similar to :
-Repository
--Folder1.zip
--Folder2.zip
etc...
After the first backup, I only want to back up files that have changed since the last backup (or back up new folders/files that have been created).
Does anyone know a tool or a script for my backup needs? Can we do that with Robocopy?
You may install Cygwin on your Windows machine and use a simple shell script like the following:
#!/bin/bash
# archive each top-level folder of the repository
for i in */; do
    tar czf "${i%/}.tar.gz" "$i"
done
# specify your destination here:
rsync -av *.tar.gz /tmp/
rm -f *.tar.gz
BTW, that's not the most straightforward way, I suppose.
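To cover the incremental requirement, here is a minimal sketch that only re-archives folders containing files newer than a marker file (.last_backup is a hypothetical name; run it from the repository root):
#!/bin/bash
# re-archive only folders changed since the previous run
for d in */; do
    if [ ! -e .last_backup ] || [ -n "$(find "$d" -newer .last_backup -print -quit)" ]; then
        tar czf "${d%/}.tar.gz" "$d"
    fi
done
touch .last_backup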
If you are open to forgetting about the zipping part, I would advise BackupPC on the Unix system and rsync (the win32 port) or a Samba configuration on the Windows system.
See https://help.ubuntu.com/community/BackupPC and http://backuppc.sourceforge.net/ for more info.