How to back up a source repository and zip destination folders?

All sources are on a Windows OS, and the destination backup is on a Unix system (we are using Samba).
My source repository is similar to:
-Repository
--Folder1
---Files
--Folder2
---Files
etc...
I would like to get a destination similar to:
-Repository
--Folder1.zip
--Folder2.zip
etc...
After the first backup, I only want to back up files that have changed since the last backup (or back up new folders/files that have been created).
Does anyone know a tool or a script for my backup needs? Can we do that with Robocopy?

You could install Cygwin on your Windows machine and use a simple shell script like the following one:
#!/bin/bash
# archive every top-level folder into its own .tar.gz
for i in */; do
    tar czf "${i%/}.tar.gz" "$i"
done
# specify your destination here:
rsync -av ./*.tar.gz /tmp
rm -f ./*.tar.gz
BTW, that's not the most straightforward way, I suppose.
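If only folders that changed since the last run should be re-archived, one possible refinement is to compare each folder's contents against its existing archive. This is just a sketch, assuming GNU find and that the previous .tar.gz files are kept next to the folders:
#!/bin/bash
for i in */; do
    name="${i%/}"
    # re-archive only if the archive is missing or something inside is newer
    if [ ! -f "$name.tar.gz" ] || [ -n "$(find "$name" -newer "$name.tar.gz" -print -quit)" ]; then
        tar czf "$name.tar.gz" "$name"
    fi
done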

If you are open to dropping the zipping part, I would advise BackupPC on the Unix system and rsync (the win32 port) or a Samba config on the Windows system.
See https://help.ubuntu.com/community/BackupPC and http://backuppc.sourceforge.net/ for more info.
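With the zipping dropped, the incremental part is essentially what rsync already does. A minimal sketch, assuming rsync is available under Cygwin (or cwRsync) and that the paths and host name are placeholders:
# mirror the repository to the Unix box, copying only changed files
rsync -avz --delete /cygdrive/c/Repository/ backupuser@unixhost:/backups/Repository/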


sshfs: will a mount overwrite existing files? Can I tell it to exclude a certain subfolder?

I'm running Ubuntu and have a remote CentOS system which stores (and has access to) various files and network locations. I have SSH access to the CentOS machine and want to be able to work locally on Ubuntu.
I'm trying to mirror a remote directory structure. The remote directory is structured:
/my_data/user/*
And I want to replicate this structure locally (a lot of scripts rely on absolute paths).
However, for reasons of speed, I want a certain subfolder, for example:
/my_data/user/sourcelibs/
To be stored locally on disk. I know the sourcelibs subfolder doesn't change much (but the rest might). So I can comfortably rsync it:
mkdir -p /my_data/user/sourcelibs/
rsync -r remote_user@remote_host:/my_data/user/sourcelibs/ /my_data/user/sourcelibs/
My question is, if I use sshfs to mount /my_data/user:
sudo sshfs -o allow_other,default_permissions remote_user@remote_host:/my_data/user /my_data/user
Will it overwrite my existing files? Is there a way to have sshfs mount but exclude certain subfolders?
Yes, sshfs will overwrite existing files. I have almost the same use case and just tested this myself. BTW, you'll need to add -o nonempty to your sshfs command since the destination dir /my_data/user already exists.
What I found to work is to make a copy of the remote directory excluding the large subdirs. I don't know if keeping two copies in sync on the remote machine is feasible for your use case, but if you'll mostly be updating on your local machine and rarely making changes remotely, that could work.
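As a sketch of the "copy excluding the large subdirs" step, run on the remote machine (the copy's location is an assumption, not something from the question):
# make a sibling copy of the tree without the big sourcelibs folder
rsync -a --exclude='sourcelibs/' /my_data/user/ /my_data/user_nolibs/
You would then keep this copy in sync and sshfs-mount it instead of the original, while syncing sourcelibs locally with the rsync command from the question.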

Can I Copy a hard drive through ssh?

Can I copy an entire hard drive of one computer in one go over ssh?
I have an old Windows machine that I am ready to put Linux on, but I would like to back up its contents.
The command you're looking for is scp
scp -r username@remotehost:/disk /some/local/directory
Note the -r flag to make it transfer directories recursively.
https://linux.die.net/man/1/scp
It transfers files over ssh.

Sync clients' files with server - Electron/node.js

My goal is to make an Electron application which synchronizes a client's folder with the server. To explain it more clearly:
If the client doesn't have the files present on the host server, the application downloads all of the files from the server to the client.
If the client has the files, but some files have been updated on the server, the application deletes ONLY the outdated files (leaving the unmodified ones) and downloads the updated files.
If a file has been removed from the host server but is present in the client's folder, the application deletes the file.
Simply put, the application has to make sure that the client has an EXACT copy of the host server's folder.
So far I have done this via wget -m; however, wget frequently did not recognize that some files had changed and left clients with outdated files.
Recently I've heard of zsync-windows and the webtorrent npm package, but I am not sure which approach is right and how to actually accomplish my goal. Thanks for any help.
rsync is a good approach, but you will need to access it via Node.js.
An npm package like this may help you:
https://github.com/mattijs/node-rsync
But things will get slightly more difficult on Windows systems:
How to get rsync command on windows?
If you have SSH access to the server, an approach could be using rsync through a Node.js package.
There's a good article here on how to implement this.
You can use rsync which is widely used for backups and mirroring and as an improved copy command for everyday use. It offers a large number of options that control every aspect of its behaviour and permit very flexible specification of the set of files to be copied.
It is famous for its delta-transfer algorithm, which reduces the amount of data sent over the network by sending only the differences between the source files and the existing files in the destination.
For your use case:
If the client doesn't have the files present on the host server, the application downloads all of the files from the server to the client. This can be achieved with a simple rsync.
If the client has the files, but some files have been updated on the server, the application deletes ONLY the outdated files (leaving the unmodified ones) and downloads the updated files. Use: --remove-source-files or --delete, based on whether you want to delete the outdated files from the source or the destination.
If a file has been removed from the host server but is present in the client's folder, the application deletes the file. Use: the --delete option of rsync.
rsync -a --delete source destination
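In the client-pulls-from-server form described above, that might look like this (host and paths are placeholders, not something given in the question):
rsync -az --delete user@server:/srv/shared-folder/ /path/to/local/folder/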
Given it's a folder list (and therefore has simple filenames without spaces, etc.), you can pick out the filenames with the code below:
# Get last item from each line of FILELIST
awk '{print $NF}' FILELIST | sort >weblist
# Generate a list of your files
find . -type f -print | sort >mylist
# Compare results
comm -23 mylist weblist >diffs
# Remove old files
xargs -r echo rm -fv <diffs
You'll need to remove the echo to let rm actually run.
Next time you want to update your mirror, you can modify the comm line (by swapping the two file arguments) to find the set of files you don't have, and feed those to wget.
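For example, the download side could be sketched like this (the mirror URL is the placeholder used in the rsync example below, and the exact wget path options depend on how deep the mirror's directory layout is):
# names present on the server but missing locally
comm -23 weblist mylist >missing
# turn each name into a full URL and fetch it, recreating the relative paths
sed 's|^|https://mirror.abcd.org/xyz/xyz-folder/|' missing | wget -x -nH --cut-dirs=2 -i -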
or
rsync -av --delete rsync://mirror.abcd.org/xyz/xyz-folder/ my-client-xyz-directory/

SCP is creating subdirectory... but I just want it to copy directly

I'm trying to use scp to copy recursively from a local directory to a remote directory.... I have created the folders on the remote side:
Remote Location (already created):
/usr/local/www/foosite
I am running scp from the local machine in directory:
/usr/local/web/www/foosite
But it's copying the "foosite" directory as a subdirectory... I just want the contents of the folder, not the folder itself...
Here is the command I'm using:
scp -r /usr/local/web/www/foosite scpuser@216.99.999.99:/usr/local/www/foosite
The problem is that if you don't use the asterisk (*) in the local part of the call, scp will create a new top-level directory on the remote server. It should look like this:
scp -r /usr/local/web/www/foosite/* scpuser@216.99.999.99:/usr/local/www/foosite
This says "Copy the CONTENTS" (but not the directory itself) to the remote location.
Hope this helps... Took me an hour or so to figure this out!!!
Old question, but I think there is a better answer. The trick is to leave the foosite directory off of the destination:
scp -r /usr/local/web/www/foosite scpuser@216.99.999.99:/usr/local/www
This will create the foosite directory on the destination if it does not exist, but will just copy files into foosite if the directory already exists. Basically the -r option will copy the last directory in the path and anything under it. If that last directory already exists on the destination, it just doesn't do the mkdir.

Restoring Apache Tomcat after an accidental delete

I have a server running Apache Tomcat. The path to the server is the following:
root@serverb:/usr/tomcat/apache-tomcat-7.0.23# pwd
/usr/tomcat/apache-tomcat-7.0.23
root@serverb:/usr/tomcat/apache-tomcat-7.0.23# ls
LICENSE NOTICE RELEASE-NOTES RUNNING.txt bin conf lib logs temp webapps work ws.war
From time to time, I have to go to the logs/ folder and run the following command:
find . -mtime +2 -exec rm {} \;
However, I accidentally ran this command in /usr/tomcat/apache-tomcat-7.0.23; as a result, my ws.war file and other files from within the bin/ folder got deleted.
I have a backup of ws.war but not of the Tomcat installation folder. Is there any way I can reinstall Tomcat and restore my server?
Most likely you're now asking how to create a backup after you need it (not before...), right?
Of course, you can get Tomcat at http://tomcat.apache.org, but if you don't have your configuration and changed settings (e.g. memory settings, host setup, etc.), you'll have to redo them from memory, or keep tweaking until nobody complains any more.
Congratulations, you've learnt about the importance of backups. When you're done with the new installation, consider having a proper backup from now on. Keep in mind: IMHO you're only allowed to call something a backup if you have demonstrated that you can use it to restore to a new environment in the time that you specify as acceptable downtime.
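As a minimal sketch of such a backup for this layout (the destination path is an assumption, and ideally the archive ends up on a different machine):
# archive the Tomcat configuration, startup scripts and deployed applications
tar czf /backup/tomcat-$(date +%F).tar.gz \
    /usr/tomcat/apache-tomcat-7.0.23/conf \
    /usr/tomcat/apache-tomcat-7.0.23/bin \
    /usr/tomcat/apache-tomcat-7.0.23/webapps
Restoring that archive onto a fresh Tomcat download, as described above, is also the test that it deserves the name "backup".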