Backup server permissions - permissions

Currently I'm developing a control website for my home server. The server has LDAP set up for Macs to log in. The home directories are also on the server. I want to create a backup tool for my family, so they can back up while I'm off. I don't want to do this scheduled (at least not always, since they must be able to start a backup right away).
I got stuck when I was trying to find a way to run the rsync commands as a privileged user.
I've got some ideas on this but I would like to hear the cons and pros of the options.
Create a simple daemon that runs as root and backs up folder -arg1 to -arg2, minding the old backup in -arg3.
Run rsync as the logged-in user by remembering the user's pass at login at the control panel. (Problem: running ps will reveal the password.)
Create special rsync user (Problem: rsync user can read everything).
The project is located at https://github.com/hermanbanken/ldap-control and this issue is also on GitHub at https://github.com/hermanbanken/ldap-control/issues/1.

sudo is on OSX later versions.
sudo rsync .....
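One way to scope the `sudo rsync` suggestion above, as an added example that is not part of the original posts or the ldap-control project: sudo permits a root-owned wrapper instead of rsync itself, and the wrapper derives the paths from the invoking account. The `family` group, the install path, `/home`, `/backups` and all function names are assumptions.

```shell
#!/bin/sh
# Added example: a root-owned wrapper that sudo allows instead of rsync itself.
# Assumed sudoers rule (edit with visudo):
#   %family ALL=(root) NOPASSWD: /usr/local/sbin/home-backup
HOME_ROOT=/home        # assumed location of the home directories
BACKUP_ROOT=/backups   # assumed backup destination, writable by root only

# Accept only a plain account name that has a real directory under HOME_ROOT.
valid_user() {
  case "$1" in
    ''|-*|.|..|*[!A-Za-z0-9._-]*) return 1 ;;
  esac
  [ -d "$HOME_ROOT/$1" ] && [ ! -L "$HOME_ROOT/$1" ]
}

# Print the rsync arguments, one per line, for user $1 and snapshot name $2.
backup_args() {
  valid_user "$1" || return 1
  printf '%s\n' --archive --one-file-system
  # Hard-link unchanged files against the previous snapshot when there is one.
  if [ -d "$BACKUP_ROOT/$1/latest" ]; then
    printf '%s\n' "--link-dest=$BACKUP_ROOT/$1/latest"
  fi
  printf '%s\n' -- "$HOME_ROOT/$1/" "$BACKUP_ROOT/$1/$2"
}

main() {
  # sudo sets SUDO_USER to the caller, so users cannot name someone else's home.
  user="${SUDO_USER:-}"
  stamp="$(date +%Y%m%d-%H%M%S)"
  args="$(backup_args "$user" "$stamp")" || { echo "refused: '$user'" >&2; return 1; }
  mkdir -p "$BACKUP_ROOT/$user" || return 1
  # Split on newlines only; valid_user already rejected names with whitespace.
  old_ifs="$IFS"; IFS='
'
  set -f; set -- $args; set +f; IFS="$old_ifs"
  rsync "$@" && ln -sfn "$stamp" "$BACKUP_ROOT/$user/latest"
}

# Run only when installed under its real name, so the file can be sourced.
case "${0##*/}" in home-backup) main ;; esac
```

Because the wrapper takes no arguments, the control panel only has to run `sudo /usr/local/sbin/home-backup` as the logged-in user; no password is stored and no account gets general read access.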

Related

cPanel / WHM restore from backup tar archive (from another cPanel server)

I am quite new to server migrations, but fairly familiar with cPanel though. My current task is to migrate an entire website from a server with cPanel to another one.
What I did so far:
Use the Backup Wizard on the old server to create a full backup archive, and FTP it to the new server.
The full archive (about 6 GB because there are a lot of images) is now in my new server's public_html directory.
Now, what I need is a way to make the server take this tar archive, which is a full backup, and restore from it.
What I tried:
I tried simply extracting the archive, but it is taking forever to finish (again the archive is 6 GB), and for some reason my browser tab has to stay open until the end of the process, otherwise the extracting halts.
Also, as I have WHM access, I tried the "Restore a Full Backup/cpmove File" option, but for some reason, under the "Username for the account that you wish to restore:" textbox, WHM does not find my cPanel username.
If anyone can either tell me what I am doing wrong, or propose another option, I would really appreciate it.
P.S.: I only have WHM access to the new server, not the old one.
Edit: I got the WHM method working now. My mistake was that my tar archive was not stored in the /home directory, but in the home directory of the cPanel user (which is /home/username)!
Move your cPanel full backup file to the /home directory and, if you have root access to the server, use the command below: /scripts/restorepkg (cpanel username)
OR
Login to WHM with root user and go through the steps mentioned on below URL.
http://documentation.cpanel.net:8090/display/68Docs/Restore+a+Full+Backup+cpmove+File

Dumping postgresql databases and accessing them via ftp

I'm developing an automated backup system for a server using PostgreSQL and Tomcat. The environment is CentOS minimal 7. Long story short, a VM will download the .sql dumps and a .tar.gz folder containing Tomcat via FTP.
No problems in setting up vsftpd, I can access the server via FTP with a custom user (ftpuser) which currently can access a specific folder (/home/ftpuser/backups/). I can compress Tomcat there so my script will fetch the backups/ folder and download it, but I can't figure out how to dump the PostgreSQL db to the /home/ftpuser/backups/ folder without having to do some stupid things with sudo.
The postgres user doesn't have permission to write there and I can't give it to him even with chown or chmod. I inserted postgres in sudoers and if I dump the db and then "sudo cp" it to that folder it is okay, but this way I can't use a script to do that, due to "sudo" asking for a password.
The question is: is there a way to enable "pg_dump" to write the .sql dump to the /home/ftpuser/backups/ folder?
Thanks for the replies.
pg_dump does not need to be run from the postgres user.
Run it from a user that can write to the desired folder, and pass the --username=database_user parameter to specify the desired database user. You'll probably need a .pgpass file for the password used by this user (unless it has been defined to be trusted on pg_hba.conf).
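The approach in this answer, as an added example that is not code from the original post: the dump runs as the account that owns the backup folder and authenticates through a `.pgpass` file. The database name, role, password-based TCP login on localhost and the function names are assumptions.

```shell
#!/bin/sh
# Added example: run as the account that owns the backup folder (ftpuser on
# this page), not as postgres. Database and role names are assumed values.
BACKUP_DIR=/home/ftpuser/backups
PGPASSFILE="$HOME/.pgpass"; export PGPASSFILE

# .pgpass fields are colon-separated; a literal ':' or '\' is backslash-escaped.
pgpass_escape() {
  printf '%s' "$1" | sed -e 's/\\/\\\\/g' -e 's/:/\\:/g'
}

# Write a one-line .pgpass: host:port:database:username:password
write_pgpass() {
  ( umask 077
    printf '%s:%s:%s:%s:%s\n' "$(pgpass_escape "$1")" "$2" \
      "$(pgpass_escape "$3")" "$(pgpass_escape "$4")" "$(pgpass_escape "$5")" \
      > "$PGPASSFILE" ) || return 1
  chmod 600 "$PGPASSFILE"
}

# libpq ignores a .pgpass that group or others can access, so check first.
pgpass_mode_ok() {
  case "$(ls -ld "$1" 2>/dev/null)" in
    -??-------*) return 0 ;;
  esac
  return 1
}

# $1 = database, $2 = database role.
dump_db() {
  pgpass_mode_ok "$PGPASSFILE" || { echo ".pgpass must be mode 0600" >&2; return 1; }
  out="$BACKUP_DIR/$1-$(date +%Y%m%d).sql"
  # --no-password fails fast instead of hanging a cron job on a prompt; the
  # rename keeps the FTP download from picking up a half-written dump.
  pg_dump --host=localhost --username="$2" --no-password \
    --file="$out.tmp" "$1" && mv "$out.tmp" "$out"
}

# One-time setup, then from cron (constructed values):
#   write_pgpass localhost 5432 appdb backup_ro 'example-password'
#   dump_db appdb backup_ro
```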

How to disable Google compute engine from resetting SFTP folder permissions when using SSH-Key

Currently running a Google compute engine instance and using SFTP on the server.
Followed details to lock a user to the SFTP path using steps listed here: https://bensmann.no/restrict-sftp-users-to-home-folder/
To lock the user to a directory, the home directory of that user needs to be owned by root. Initially, the setup worked correctly but found that Google compute engine sporadically "auto-resets" the permissions back to the user.
I am using an SSH key that is set in the Google Cloud Console and that key is associated with the username. My guess is that Google Compute Engine is using this "meta-data" and reconfiguring the folder permissions to match that of the user associated with the SSH key.
Is there any way to disable this "auto-reset"? Or, rather, is there a better method to hosting SFTP and locking a single user to a SFTP path without having to change the home folder ownership to root?
Set your sshd rule to apply to the google-sudoers group.
The tool that manages user accounts is the accounts daemon. You can turn it off temporarily but it's not recommended. The tool syncs the instance metadata's SSH keys with the Linux accounts on the VM. If you do this, any account changes won't be picked up and SSH from Cloud Console will probably stop working.
sudo systemctl stop google-accounts-daemon.service
That said it may be what you want if you ultimately want to block SSH access to the VM.

Can't add files to the website using Filezilla

I've been working with the server only for 2 days so I am sorry if that is simple question. I looked everywhere, but didn't find an answer.
So I have a Google compute engine account and I have owner privileges. When I run
gcloud compute ssh instance --zone us-central1-a
it works, but it creates a key with username that it takes from my computer account.
So when I am in Google shell I can add or remove files using sudo. But when I go to Filezilla I have to use the ssh key file and the username from that key. And the only folder that is accessible with that username is its own folder. I am not sure what the problem is, so I gave all the facts I could.
I'm not entirely sure I'm answering the right question, but I'll take a stab at it. The ssh keys created by/used by gcloud are specific to a particular linux user on your VM. As you note, you can use sudo when ssh'd in to edit files/directories owned by different users---the way this works is that you (roughly speaking) temporarily switch users to root when doing the file edit.
An scp client like Filezilla isn't going to be able to switch users that way. So you'll need a different technique to edit files with Filezilla.
I suggest ssh-ing in to your VM and using chmod or chown to change the ownership of files/directories that you want to use with Filezilla. Alternatively you could use useradd -G to add your username to a group that can edit the files you care about.
Exactly what you'll do depends on the security policy you want to enforce for your files, but there are lots of decent options. The key test to run---can you get to a state where you can edit the files when logged in with SSH, but not using sudo? If so then you should be able to edit the files with Filezilla.

Multiple Website Backup

Does anyone know of a script or program that can be used for backing up multiple websites?
Ideally, I would like to have it set up on a server where the backups will be stored.
I would like to be able to add the website login info, and it connects and creates a zip file or similar that would then be sent back to the remote server to be saved as a backup etc...
But it would also need to be able to be set up as a cron so it backed up everyday at least?
I can find PC to Server backups that are similar, but no server to server remote backup scripts etc...
It would be heavily used, and needs to be a gui so the less techy can use it too?
Does anyone know of anything similar to what we need?
HTTP-Track website mirroring utility.
Wget and scripts
RSync and FTP login (or SFTP for security)
Git can be used for backup and has security features and networking ability.
7Zip can be called from the command line to create a zip file.
In any case you will need to implement either secure FTP (SSH secured) OR a password-secured upload form. If you feel clever you might use WebDAV.
Here's what I would do:
Put a backup generator script on each website (outputting a ZIP)
Protect its access with a .htpasswd file
On the backupserver, make a cron script download all the backups and store them