Does anybody know a way to perform a periodic remote backup of the full environment (i.e. covering both the application servers and the SQL databases) in Jelastic?
I wanted to use Google Drive to store the backups as I was already using it with Plesk.
Thanks.
I am using an Azure server for a SQL database.
I want to enable a daily database backup.
I also need to dump a SQL file of the current database, as well as the other images uploaded to the server.
Any suggestions, please?
You can install backup software on your Azure server and back up your SQL Server to Azure cloud storage. There is plenty of such software (Duplicati, CloudBerry, Acronis, etc.).
Some of these tools have special features for backing up SQL Server in a proper way, and there are free versions among them.
You can do this in different ways: you can use third-party applications and schedule backup jobs, or you can use the native tools and configure everything yourself. Hope this is useful for you.
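If you take the native route, a scheduled full backup can be as simple as a T-SQL BACKUP statement run by SQL Server Agent, or by Task Scheduler via sqlcmd. A minimal sketch, assuming a hypothetical database named MyDb and a backup folder D:\Backups (replace both with your own):

    -- Full backup of one database to a dated file.
    -- Schedule nightly from SQL Server Agent, or from Task Scheduler with:
    --   sqlcmd -S . -E -i backup.sql
    DECLARE @file nvarchar(260) =
        N'D:\Backups\MyDb_' + CONVERT(nvarchar(8), GETDATE(), 112) + N'.bak';
    BACKUP DATABASE MyDb
        TO DISK = @file
        WITH INIT, CHECKSUM;

The resulting .bak files (and the uploaded images) can then be copied to Azure storage by whichever of the tools mentioned above you choose.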
Since you're going down the Azure services route, you ought to look at Azure Blob Storage for the images.
And to back it up... look at this answer.
I know that CloudBerry works with Azure. You can try this software for daily backups, either full image or incremental. The price is affordable and the tool is simple. I see the person above has already mentioned CloudBerry; it seems to be a good option.
I'm sure there are a good number of developers here who use DirectAdmin, and I have a quick question.
I've always used cPanel, and I'm now on a server that is using DirectAdmin instead. Where in DirectAdmin can you generate a full backup of the account at the user level?
Also, do DirectAdmin backups include everything related to the account like cPanel backups do? For example, not only the files and databases but also the cron jobs, DNS zones, email accounts, etc.?
And where are the backups stored by default? Is there an option to send the backups to a remote server via FTP like you can with cPanel?
There are two different backup systems built into DA:
Admin Tools | System Backup. This tool lets you back up configuration data and arbitrary directories, locally or via FTP or SCP.
Admin Tools | Admin Backup/Transfer. This tool is oriented toward backing up data account by account, in one archive per account, in a format that you can restore from (in the same tool) on the original or another DA server (e.g. if you want to transfer to a new server). You can back up locally and/or via FTP.
Both options can also be scheduled via cron.
Depending on your level of access, only one of these might be available to you. This page has further info for non-administrators: http://www.site-helper.com/backup.html.
You can improve your DirectAdmin backups with an incremental backup plugin that supports both local and remote backup locations; please check the setup guide here.
To start, I don't have much experience with DBs. In summary, I have an application for a client. They don't necessarily want to host their DB online, but they have a local SQL Server set up for their two computers.
I have a batch script that backs up the database every night. Is there a way in the batch script to send the backups to the cloud, like SkyDrive, etc.?
Try our SQLBackupAndFTP software. You can schedule backup jobs with SQLBackupAndFTP (full, differential and transaction log backups), save backups to local folders, FTP, Dropbox, Box, Google Drive, Amazon S3 or SkyDrive, delete old backups, and configure email notifications. Basic features are available in the free version, or you can try all features in trial mode.
Just back up to the SkyDrive folder. If the DB is big, use rsync for sending it to the cloud, or do full backups every week, differential backups every day and transaction log backups every hour (depending on your application, etc.); then there will be less data to send to the cloud.
Back up the database to the SkyDrive folder using backup compression (if available in your edition of SQL Server; see the WITH COMPRESSION clause of the BACKUP statement in Books Online) or using a 3rd-party backup compression tool (either free or paid), as sketched below.
Alternatively, you can back up the database to a temporary folder and then zip it into the SkyDrive folder.
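A minimal sketch of the compressed-backup option, assuming a hypothetical database named ClientDb and a SkyDrive folder synced at C:\Users\client\SkyDrive\Backups (substitute your own names and instance, and call it from your existing nightly batch file with sqlcmd):

    -- Compressed full backup straight into the locally synced SkyDrive folder.
    -- From the nightly batch script (adjust the instance name):
    --   sqlcmd -S .\SQLEXPRESS -E -i nightly_backup.sql
    DECLARE @file nvarchar(260) =
        N'C:\Users\client\SkyDrive\Backups\ClientDb_'
        + CONVERT(nvarchar(8), GETDATE(), 112) + N'.bak';
    BACKUP DATABASE ClientDb
        TO DISK = @file
        WITH COMPRESSION, INIT, CHECKSUM;

The SkyDrive client then uploads the file on its own. If your edition doesn't support WITH COMPRESSION, drop that option and zip the file into the folder afterwards instead.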
If you are willing to spend a bit of money, CloudBerry has an SQL backup tool that will do exactly that (and more).
http://www.cloudberrylab.com/sql-backup-amazon-s3-azure.aspx
I have an Apache Derby database hosted on a virtual server, used with a JSF and JPA application. Is there any method for performing regular backups, for example once a day? Like a script?
Here's a lot of information about making backups of a Derby database: http://db.apache.org/derby/docs/10.9/adminguide/cadminhubbkup98797.html
Choose a backup method that works well for you, then use your operating system's scheduling tools (cron, etc.) to arrange for that backup to be performed regularly.
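Since Derby provides an online-backup system procedure (SYSCS_UTIL.SYSCS_BACKUP_DATABASE, described in the guide linked above), one simple approach is an ij script driven by cron. A minimal sketch, assuming a hypothetical database named appdb served by the Derby network server and a backup directory /srv/backups (adjust the JDBC URL if you run Derby embedded):

    -- backup.sql: online backup of the running Derby database.
    -- Schedule daily, e.g. from cron: java -jar derbyrun.jar ij backup.sql
    CONNECT 'jdbc:derby://localhost:1527/appdb';
    CALL SYSCS_UTIL.SYSCS_BACKUP_DATABASE('/srv/backups');
    DISCONNECT;
    EXIT;

The procedure copies the database into /srv/backups while it remains available to the application; you can then ship that copy off the server with your usual tools.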
I am using the Azure VM role. I created a separate VHD (uploaded to a page blob) for storing SQL data files (to overcome the data-persistence issue with the VM role). SharePoint 2010 has been configured on the VM. I want to run 2 instances of the Azure VM, but I am failing, as mounting the data VHD in write mode on 2 instances is not possible. Can anyone help me out with this?
To add to what Joannes said:
A Cloud Drive may be mounted by exactly one writer, but you can make any number of read-only snapshots. This won't help with the scale-out scenario you're describing, but I just wanted to clarify.
SharePoint 2010 is not currently a supported configuration in a VM Role. There's licensing, compatibility with SQL Azure, scale-out, and potentially other issues to consider. The same goes for installing SQL Server in a VM Role.
Support issues aside, you could look into Azure Connect as a way to reach an on-premise SQL Server instance. This alleviates your need to store SQL Server data files in a Cloud Drive. This will have bandwidth-related performance and cost implications, but it's certainly an option.
CloudDrive is not intended for scaling out. In other words, a blob can be mounted by no more than 1 VM at a time. This limitation is very unlikely to be lifted in the future, as a single blob is not intended to support scalable writes.