rsync remote to local automatic backup

I would like to automatically back up my server monthly and weekly. My server is running CentOS 5.5, and while searching the web I found a tool named rsync. I made my first backup manually by running this command in a terminal:
sudo rsync -chavzP --stats USERNAME@IPADDRESS:PATH_TO_BACKUP LOCAL_PATH_TO_BACKUP
I'm then prompted for that user's password, and Bob's your uncle.
This backs up the necessary files from my remote server to my local device, but does somebody know how I can automate this, i.e. run this command automatically every Sunday?
EDIT
I forgot to mention that I let DirectAdmin back up the files I need and then copy those files from the remote server to a local server.

This command worked for me; combine it with a cron job:
rsync -avz username@ipaddress:/path/to/backup /path/to/save
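For an unattended weekly run, a crontab entry along these lines should do it (the 02:00 Sunday schedule, paths, and log file are placeholders; key-based SSH authentication is assumed so the job isn't stalled by a password prompt):
# one-time setup so rsync over SSH needs no password
ssh-keygen -t rsa
ssh-copy-id username@ipaddress
# crontab -e, then add (fields: minute hour day-of-month month day-of-week):
0 2 * * 0 rsync -avz username@ipaddress:/path/to/backup /path/to/save >> $HOME/backup.log 2>&1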

Related

Is it possible to edit code on my own machine and save it to an account I've SSH'd into?

Scenario:
I'm using SSH to connect to a remote machine. From the command line I run ssh with the machine's address, which connects me to it. I want to edit and run code on that remote machine. So far the only way I know is to create, edit, and run the files with vi in that window, because my only connection to that machine is that command window.
My Question is:
I'd love to be able to edit my code in VSCode on my own machine, and then use the command line to save that file to the remote machine. Does anyone know if this is possible? I'm using OS X and ssh'ing into a Linux Fedora machine.
Thanks!
Sounds like you're looking for a command like scp. SCP stands for Secure Copy Protocol; it builds on top of SSH to copy files from one machine to another. So to upload your code to your server, all you'd have to do is:
scp path/to/source.file username@host:path/to/destination.file
EDIT: As @Pam Stums mentioned in a comment below the question, rsync is also a valid solution, and is definitely less tedious if you want to automatically keep the client and server directories in sync.
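For example, a one-way sync of a local project directory up to the server might look like this (paths are placeholders; the trailing slash on the source means "copy the directory's contents", and --delete also removes server-side files you deleted locally, so preview with --dry-run before running it for real):
rsync -avz --delete --dry-run path/to/project/ username@host:path/to/project/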
You could export the directory on the remote machine using NFS or Samba, mount it as a share on your local machine, and then edit the files locally.
If you're happy using vim, check out netrw (it comes with most vim distributions; :help netrw for details), which lets you use MacVim locally to edit the remote files.
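For instance, netrw can open a remote file over scp directly from your local vim (username, host, and path are placeholders; note the double slash, which makes the remote path absolute):
vim scp://username@host//path/to/source.file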

Where can I find the sql file after mysqldump

I have successfully connected using SSH and entered the right credentials. Where can I find the backup SQL file? Thanks in advance.
Once connected to the remote server, take a dump of the database using the following command:
mysqldump -R -h localhost -u username -ppassword databasename > /home/krishna/databasename.sql
You will then be able to find your database dump in the /home/krishna/ folder.
Run pwd on the remote machine to see where the mysqldump file resides. You can transfer it to your personal computer using scp:
scp $PWD/dumpfile localuser@localhostip:/home/localuser
This command will prompt for your local PC's password; enter it, and the file will be copied to your home folder on the local machine.
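If you'd rather skip the intermediate file on the server entirely, the dump can also be streamed straight to your local machine in a single SSH invocation (run this from the local machine; host and credentials are placeholders):
ssh username@remotehost "mysqldump -u username -ppassword databasename" > databasename.sql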
I can see you have logged in to the remote server through SSH, so you will find your mysqldump file in your SSH user's home directory. If you want to download that file to your local PC, log in through FTP with the same SSH user details and download it.
Thank you for the answers! I consolidated all of them and came up with my own approach: I used mysqldump with the command line you suggested and made a backup, then used FTP to access the server's folders, and that's where I downloaded the file. Again, thank you all so much.

Firebird remote backup

I want to back up a Firebird database.
I am using gbak.exe utility. It works fine.
But when I want to do the backup from a remote computer, the backup file is stored on the server's file system.
Is there a way to force the gbak utility to download the backup file?
Thanks
Backup is stored on the Firebird Server
gbak -b -service remote_fb_ip:service_mgr absolute_path_to_db_file absolute_path_to_backupfile -user SYSDBA -pass masterkey
Backup is stored on the local machine
gbak -b remote_fb_ip:absolute_path_to_db_file path_to_local_file -user SYSDBA -pass masterkey
see:
remote server local backup
and
gbak documentation
It is always a problem to grab a remote database onto a different remote computer. For this purpose, our institute uses Handy Backup (for Firebird-based apps, too), but if you prefer GBAK, here are some more ways to do it.
The simplest method is to call the remote database directly from a local machine using GBAK (I see it was already described before me). Another method is installing GBAK on the remote machine using administrative instruments for Windows networks. This can be tricky, as in mixed-architecture networks (with domain and non-domain sections) some obstacles always exist.
Therefore, the most practical method is writing a backup script (batch file) that calls GBAK and then copies the resulting Firebird backup file to some other network destination, using a command-line network file manager or an FTP client like FileZilla. It requires some (minimal) skill and research, but after successful testing it can be reused many times.
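A minimal batch sketch of that idea, in which every path, credential, and share name is an assumption to adapt:
@echo off
rem back up the database locally, then push the result to a network share
"C:\Program Files (x86)\Firebird\Firebird_2_5\bin\gbak" -b -user SYSDBA -pass masterkey C:\data\mydatabase.fdb C:\backups\mydatabase.fbk
copy C:\backups\mydatabase.fbk \\backupserver\share\mydatabase.fbk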
Best regards!
If you have gbak locally, you can back up over a network. Simply specify the host name before the database.
For example:
gbak -B 192.168.0.10:mydatabase mylocalfile.fbk -user SYSDBA -password masterkey
Try this command:
"C:\Program Files (x86)\Firebird\Firebird_2_5\bin\gbak" -v -t -user SYSDBA -password "masterkey" 192.168.201.10:/database/MyDatabase.fdb E:\Backup\BackupDatabase.fbk
Of course you need to update your paths accordingly :)
I believe you should be able to do this if you use the service manager for the backup, and specify stdout as the backup file. In that case the file should be streamed to the gbak client and you can write it to disk with a redirect.
gbak -backup -service hostname:service_mgr employee stdout > backupfile.fbk
However, I am not 100% sure this actually works, as the gbak documentation doesn't mention it. I will check and amend my answer later this week.

Automatically run script/commands after connecting to OpenShift SSH

I set up an OpenShift application and configured my local PuTTY to connect to the server via SSH. Everything works fine, but I don't know how to automatically run a few commands (mainly alias definitions) after I connect to the server (I don't want to copy and paste the same commands every time I connect).
On my local Linux shell I can use .bashrc, but this doesn't seem to work on OpenShift. I can't write a file in my home directory (/var/lib/openshift/[some letters and numbers]/), and I don't know the right place to put such a file. Does anybody know where I have to put a file that will be run every time I log in?
I'd prefer a solution that doesn't involve my local SSH software, as I'm connecting to this OpenShift application from different machines.
You can use your .bash_profile located in your $OPENSHIFT_DATA_DIR.
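Assuming the gear really does source that file on login (as this answer suggests), appending your aliases there once should be enough; the alias itself is just an example:
echo "alias ll='ls -la'" >> $OPENSHIFT_DATA_DIR/.bash_profile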
This is normally done in the .bashrc, .profile, or .bash_profile files. Since you say those don't work, you can put the commands in a script file, scp that file to the remote server, and then run it as part of a single ssh command.
I have not used OpenShift, but I have used AWS EC2 instances a lot with Ruby scripts:
ssh ubuntu@ec2-address ruby basic-auto.rb
The above command executes the Ruby file after the SSH connection is made. You can have a script in any language, or perhaps a bash (.sh) file, which executes after ssh.
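Combining the two ideas above, one way to land in an interactive shell with your aliases already loaded might be (the file name and host are placeholders; -t forces a terminal so bash starts interactively with the copied rc file):
scp myaliases.sh username@remotehost:~/
ssh -t username@remotehost "bash --rcfile ~/myaliases.sh"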

Backup server permissions

Currently I'm developing a control website for my home server. The server has LDAP set up for the Macs to log in, and the home directories are also on the server. I want to create a backup tool for my family so they can back up while I'm away. I don't want this to be scheduled (at least not always, since they must be able to start a backup right away).
I got stuck trying to find a way to run the rsync commands as a privileged user.
I've got some ideas on this, but I would like to hear the pros and cons of the options:
Create a simple daemon that runs as root and backs up folder -arg1 to -arg2, minding the old backup in -arg3.
Run rsync as the logged-in user by remembering the user's password at login to the control panel. (Problem: running ps will reveal the password.)
Create a special rsync user. (Problem: the rsync user can read everything.)
The project is located at https://github.com/hermanbanken/ldap-control and this issue is also on GitHub at https://github.com/hermanbanken/ldap-control/issues/1.
sudo is available on later versions of OS X.
sudo rsync .....
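If you go the sudo route, a sudoers fragment along these lines could restrict the control panel's user to running only rsync as root without a password (the username and rsync path are assumptions; always edit with visudo):
# /etc/sudoers.d/backup (hypothetical)
wwwuser ALL=(root) NOPASSWD: /usr/bin/rsync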