How to backup/restore a Firebird database?

I am really confused about the Firebird v2.5 backup/restore process. What should I use to back up/restore a local Firebird database:
fbsvcmgr.exe, gbak.exe, isql.exe or nbackup.exe?
Are all of these valid options, or am I wrong about something?
What is the practical way to do it from a C++ application?
And how can I tell whether a database already exists the first time, so I can decide whether to restore it or not?

I usually use gbak (I don't know about the others).
Backup
gbak -b -v -user SYSDBA -password "masterkey" D:\database.FDB E:\database.fbk
Restore
gbak -c -user SYSDBA -password masterkey E:\database.fbk E:\database_restore.fdb
If the target file already exists, you can control the restore with gbak's restore flags:
-c = create a new file (fails if the file already exists)
-r = replace an existing file
Here is a good page on Firebird backup/restore: http://www.destructor.de/firebird/gbak.htm
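To address the "does the database already exist" part of the question: a minimal shell sketch is below. The paths and credentials are illustrative placeholders, and the gbak call is only printed rather than executed; it restores with -c only when the file is missing, since -c fails on an existing file.

```shell
#!/bin/sh
# Sketch: decide between "skip/replace" and "create" based on file existence.
# DB and FBK are placeholder paths for illustration only.
DB="/tmp/example_database.fdb"
FBK="/tmp/example_database.fbk"

if [ -f "$DB" ]; then
    # Database already exists: either skip the restore or replace it deliberately.
    echo "exists: $DB"
else
    # No database yet: a plain create (-c) restore is safe.
    echo "would run: gbak -c -user SYSDBA -password masterkey $FBK $DB"
fi
```

The same check is what lets an application decide between "first run: restore" and "normal run: open the existing database".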

There are two primary ways to create backups in Firebird:
gbak, which creates a logical backup of the database (object 'descriptions' (e.g. table structure, views, etc) and data)
nbackup (also known as nbak), which creates a physical backup of the database (physical data pages changed since the previous nbackup)
In most cases, I'd suggest using gbak, because it is simpler and also allows you to move backups between platforms and Firebird versions, while nbackup is only really suitable for the same platform and Firebird version (but has the advantage of allowing incremental backups).
ISQL is an interactive query CLI, and cannot be used to create backups. Fbsvcmgr is the "Firebird Service Manager" tool, which can be used to invoke service operations on a (remote) Firebird server. This includes backup and restore operations through gbak and nbackup. Fbsvcmgr is pretty low-level and hard to use (see fbsvcmgr -? for options).
For gbak, you'd normally invoke the services through the gbak executable (option -se[rvice] <service>), see also Remote Backups & Restores in the gbak documentation. For nbackup you either can use the nbackup tool locally, or you need to use the fbsvcmgr (or another tool that supports service operations) to invoke the same functionality remotely (action_nbak and action_nrest), see also Backups on remote machines (Firebird 2.5+) in the nbackup documentation.
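As a sketch, a remote level-0 nbackup through fbsvcmgr might look like the following. The host, paths and credentials are placeholder values, and the command is only printed here rather than executed; check fbsvcmgr -? for the exact parameter names on your Firebird version.

```shell
#!/bin/sh
# Sketch (dry run): level-0 physical backup of a remote database via the
# service manager. All names below are placeholder values.
HOST="remote_fb_ip"
DB="/data/database.fdb"
NBK="/data/database.nbk_0"

CMD="fbsvcmgr $HOST:service_mgr user SYSDBA password masterkey \
action_nbak dbname $DB nbk_file $NBK nbk_level 0"

echo "$CMD"   # print only; run with: eval "$CMD"
```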
For detailed description on gbak, see gbak - Firebird Backup & Restore Utility. For nbackup, see Firebird's nbackup tool.
With a gbak backup, you'd normally restore the database using 'create' (option -c[reate]) or 'recreate' (-r[ecreate] without o[verwrite] option), which will fail if the database file already exists. See also the gbak manual linked above for more information.
I won't really answer your question about how to do it from a C++ application, because I don't program in C++, and your question is already too broad as it is. But know that it is possible to invoke Firebird service operations, including backup and restore using both gbak and nbackup, from C++ code (that is essentially what the Firebird tools themselves do). If you want to know more about that, I'd suggest you ask on the firebird-support Google Group.

Related

Firebird remote backup

I want to back up a Firebird database.
I am using the gbak.exe utility, and it works fine.
But when I run a backup from a remote computer, the backup file is stored on the server's file system.
Is there a way to force the gbak utility to download the backup file?
Thanks
Backup is stored on the Firebird Server
gbak -b -service remote_fb_ip:service_mgr absolute_path_to_db_file absolute_path_to_backupfile -user SYSDBA -pass masterkey
Backup is stored on the local machine
gbak -b remote_fb_ip:absolute_path_to_db_file path_to_local_file -user SYSDBA -pass masterkey
See: remote server local backup, and the gbak documentation.
Grabbing a remote database onto a different remote computer is always a problem. For this purpose, our institute uses Handy Backup (for Firebird-based apps, too), but if you prefer GBAK, here are some more ways to do it.
The simplest method is to call a remote database directly from a local machine using GBAK (I see it was already described before me). Another method is to install GBAK on the remote machine using administrative tools for Windows networks. This can be tricky, as in mixed-architecture networks (with domain and non-domain sections) there are always some obstacles.
Therefore, the most practical method is to write a backup script (batch file) that calls GBAK and then copies the resulting Firebird backup file to some different network destination, using a command-line network file manager or an FTP client like FileZilla. It requires some (minimal) skill and research, but after successful testing it can be reused many times.
Best regards!
If you have gbak locally, you can back up over a network. Simply specify the host name before the database.
For example:
gbak -B 192.168.0.10:mydatabase mylocalfile.fbk -user SYSDBA -password masterkey
Try this command:
"C:\Program Files (x86)\Firebird\Firebird_2_5\bin\gbak" -v -t -user SYSDBA -password "masterkey" 192.168.201.10:/database/MyDatabase.fdb E:\Backup\BackupDatabase.fbk
Of course you need to update your paths accordingly :)
I believe you should be able to do this if you use the service manager for the backup, and specify stdout as the backup file. In that case the file should be streamed to the gbak client and you can write it to disk with a redirect.
gbak -backup -service hostname:service_mgr employee stdout > backupfile.fbk
However I am not 100% sure if this actually works, as the gbak documentation doesn't mention this. I will check this and amend my answer later this week.

Use SQL to delete old MediaWiki revisions without shell access?

Does anyone know a SQL query that will purge a MediaWiki database of old revisions? My database has grown out of control, and I need to prune it to make it possible to download and manage.
I don't have shell access, so I need to do this with a SQL query.
I have tried the solution suggested here, but it doesn't work: http://www.mediawiki.org/wiki/Extension_talk:SpecialDeleteOldRevisions2#Deleting_only_archived_revisions
Thanks for reading :)
Nicholas
Like you, I don't have shell access to my MediaWiki, so I can't do a lot of things like maintenance.
Here is my solution: host your MediaWiki site on your own computer, just to run the maintenance tasks
Back up your database
Back up your MediaWiki folder
Set up Apache (the web server) on your computer
Set up MySQL on your computer
Restore your MediaWiki database on your computer
Put your MediaWiki folder in the Apache root folder
Finally, run the maintenance task you want from the shell. I suggest the deleteOldRevisions script
After that, back up the folder and the database again and restore them on the remote host
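The local part of the round trip described above can be sketched as a few commands. This is a dry run: each command is only printed, the database name, credentials and paths are placeholders, and getting the dump off (and back onto) the remote host would go through phpMyAdmin or similar when you have no shell there.

```shell
#!/bin/sh
# Sketch (dry run) of the local maintenance cycle; all names are placeholders.
run() { echo "+ $*"; }   # replace 'echo' with real execution when ready

run mysql -u root -p wikidb '<' wiki_backup.sql                          # load the downloaded dump into local MySQL
run php /var/www/html/wiki/maintenance/deleteOldRevisions.php --delete   # purge old revisions (--delete actually deletes)
run mysqldump -u root -p wikidb '>' wiki_pruned.sql                      # dump the pruned DB for re-upload
```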
Use the Maintenance extension and run the relevant maintenance scripts with it. Direct database manipulation is pure madness, and using a local LAMP install as suggested by the other answer is quite cumbersome.
Shell access is really required to properly run a MediaWiki, but this is a common problem; please report your experience with the extension on the talk page, or file a bug if you find any.

Distributing a hsqldb database for use

I have gone through the docs and I haven't been able to fully understand them.
My question :
For testing purposes we created a replica of a database in HSQLDB and want to use it as an in-process DB for unit testing.
Is there a way I can distribute the replica database so that people can connect to it? I have used the backup command and have the tar file. But how do I open a connection to the DB from this backed-up database... something along the lines of handing a .mdb file to another user in the case of Access and asking him/her to use that.
Regards,
Chetan
You need to expand the backed up database using standard gzip / unzip tools, before you can connect to it.
The HSQLDB Jar can be used to extract the database files from the backup file. For example:
java -cp hsqldb.jar org.hsqldb.lib.tar.DbBackup --extract tardir/backup.tar dbdir
In the example, the first file path is the backup file, and the second one, dbdir, is the directory path where the database files are expanded.
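Once the files are expanded, whoever you distribute them to can open the database in-process with a jdbc:hsqldb:file: URL pointing at those files. A sketch follows; the directory "dbdir" and the alias "mydb" are placeholders, and the alias must match the base name of the extracted .script/.properties files.

```shell
#!/bin/sh
# Sketch: the in-process JDBC URL for the extracted database files.
# "dbdir" and "mydb" are placeholders; "mydb" must match the base name of the
# extracted mydb.script / mydb.properties files.
JDBC_URL="jdbc:hsqldb:file:dbdir/mydb"
echo "$JDBC_URL"
```

This is the same "hand over the files" workflow as sharing an Access .mdb: ship the extracted directory and everyone connects with a file: URL, no server process needed.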

Joomla 1.5 Site Backup Strategy

I would like to make a complete backup of my whole Joomla 1.5 based site from time to time. How would this ideally be done? Are there any common pitfalls? Note that I only have FTP access to the hosting server. Is there a step-by-step tutorial somewhere? I am using the latest Joomgallery and Kunena 1.0.9 (Legacy mode).
Maybe there is a good way to automate this?
There are two parts of the backup you have to worry about: the database and the files.
The first part is the database. It can be backed up using something like phpMyAdmin. If you don't have this available on your server already, it's not too hard to upload and get it going yourself. From there, you can just Export the entire database to a gzip file.
The second part is the code and uploaded files. The code base shouldn't change too often, so you could probably just make one backup of this. There's a number of ways. The simplest is to just download the entire folder via FTP, though if you're on Linux, I'm sure someone will know a single command line to get all the changed files (rsync?).
The database is the main thing you have to worry about though: everything else should be able to be rebuilt just by reinstalling.
I think this: http://www.joomlapack.net/ is what you need. I use it myself and it works like a charm. Both for backups and for moving my Joomla installations from developer sites and to the real site.
Get an FTP synchronisation tool and keep an up-to-date copy of your site locally. Then you could run the batch script
mysqldump -hhost -uuser -p%1 schema > C:\backup.sql
to create a backup of your mysql tables at various points in time.
Edit
You would need MySQL Server installed on your local machine and the path to its bin directory in your PATH, in order to run the mysqldump command without much hassle. -p%1 takes the password provided on the command line, as you wouldn't want to store passwords in your batch script.
If you only have FTP access you have a bit of a problem, as besides all the files you'll also have to back up the database. Without access to the database, a full backup won't do you any good.
Whatever backup strategy you choose - be sure it can handle UTF-8 correctly. Joomla 1.5 stores all content as UTF-8, even when the database charset is set to 'iso-8859-1' - so when the backup solution detects the database charset, characters like € or é will come out as "strange" sequences - not really what you'll want.
I absolutely endorse using Joomlapack - it works great. The optional remote tools allow you to initiate the backup from a Windows desktop machine - it performs the backup and downloads it. The remote has a scheduler, and you can also set it off to backup and download a list of sites.
Joomlapack also provides a file "kickstart.php" which you copy to your empty server account along with the backup, which automates the restore procedure. You do have to create an empty database with PHPMyAdmin or similar, and you are given the opportunity to supply the database parameters (host, database, username, password) during the process.
One pitfall I did run into with this though is that some common components can have absolute URLs in their configuration - e.g. SOBI2, Virtuemart. It's then just a matter of finding the appropriate configuration file, editing it and re-uploading it.
Another problem was that one archive file (either ZIP or their JPA format) contained a filename with a "?" character in it (from a Linux server), and this caused a bit of a problem when trying to install it locally on a Windows WAMP stack - the extract process on the ZIP file failed, which stopped the process from completing cleanly.
I suggest using the automatic backup service at http://www.everlive.net
Update:
Ok, here is some more information. EverLive.net is a website where you can create a free account. Enter your website details and you are ready to take your backups with just one click. Restore is also possible in the same way.
Furthermore, you can use the automatic backup option to take backups at defined intervals. Other than that, you can use the website health check service to inform you if your website is not available.

The most efficient way to move psql databases

What is the most efficient, secure way to pipe the contents of a PostgreSQL database into a compressed tarfile, and then copy it to another machine?
This would be used for local development hosting, or backing up to a remote server, using *nix-based machines at both ends.
This page has a complete backup script for a webserver, including the pg_dump output.
Here is the core syntax it uses (note that $NOW and $GZIP are defined earlier in the full script):
BACKUP="/backup/$NOW"
PFILE="$(hostname).$(date +'%T').pg.sql.gz"
PGSQLUSER="vivek"
PGDUMP="/usr/bin/pg_dump"
$PGDUMP -x -D -U${PGSQLUSER} | $GZIP -c > ${BACKUP}/${PFILE}
After you have gzipped it, you can transfer it to the other server with scp, rsync or nfs depending on your network and services.
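If the two machines can reach each other over SSH, you can also skip the intermediate file entirely and stream the compressed dump straight across. A sketch follows; this is a dry run where the host, user and database names are placeholders and the pipeline is only printed, not executed.

```shell
#!/bin/sh
# Sketch (dry run): stream a compressed dump to another machine over SSH.
# DB and REMOTE are placeholder values.
DB="mydatabase"
REMOTE="user@backuphost"

PIPELINE="pg_dump -U postgres $DB | gzip -c | ssh $REMOTE 'cat > $DB.sql.gz'"
echo "$PIPELINE"   # inspect first; run with: sh -c "$PIPELINE"
```

Streaming this way means the dump never touches the local disk, which also sidesteps space problems with very large databases.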
pg_dump is indeed the proper solution. Be sure to read the man page. In Espo's example, some options are questionable (-x and -D) and may not suit you.
As with every other database manipulation, test a lot!