I have gone through the docs and I haven't been able to fully understand them.
My question:
For testing purposes, we created a replica of a database in HSQLDB and want to use it as an in-process DB for unit testing.
Is there a way I can distribute the replica database so that people can connect to it? I have used the BACKUP command and have the tar file, but how do I open a connection to the DB from this backed-up file? Something along the lines of handing a .mdb file to another user in the case of Access and asking him/her to use it.
Regards,
Chetan
You need to expand the backed-up database using standard gzip/unzip tools before you can connect to it.
The HSQLDB Jar can be used to extract the database files from the backup file. For example:
java -cp hsqldb.jar org.hsqldb.lib.tar.DbBackup --extract tardir/backup.tar dbdir
In the example, the first file path is the backup file, and the second one, dbdir, is the directory path where the database files are expanded.
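Once the files are expanded, you connect with an ordinary in-process (file mode) JDBC URL pointing at them. A minimal sketch, assuming hsqldb.jar is on the classpath and the extracted files share the hypothetical base name testdb (i.e. dbdir/testdb.script, dbdir/testdb.properties):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;

public class OpenReplica {
    public static void main(String[] args) throws Exception {
        // File mode: HSQLDB runs in-process and opens dbdir/testdb.* directly
        Connection conn = DriverManager.getConnection(
                "jdbc:hsqldb:file:dbdir/testdb", "SA", "");
        // Quick sanity check that the replica opened
        ResultSet rs = conn.createStatement().executeQuery(
                "SELECT COUNT(*) FROM INFORMATION_SCHEMA.SYSTEM_TABLES");
        rs.next();
        System.out.println("tables visible: " + rs.getInt(1));
        conn.close();
    }
}

So distributing the replica amounts to handing colleagues the tar file plus the extraction step above; each of them then opens the expanded directory in-process, much like passing around an Access .mdb.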
We have call recordings posted to FTP every day. As a backup, I want to copy those recordings from FTP to a local drive on a daily basis. Is it possible to copy the entire folder as a whole, paste it locally, and rename it?
Thanks in advance
BIDS in 2008 has an FTP Task, but I never use it. There are also several third party extensions for SFTP and the like, but I don't use those.
[Edit: Since the above links are to CodePlex, which is shutting down, I'll provide more detail. The first link is to a project titled "SSIS SFTP Task Control Flow Component" and the second is to a project titled "SSIS Extensions - SFTP Task, PGP Task, Zip Task". As of June 2017 the projects do not appear to have an obvious rehosting.]
I tend to use WinSCP with a script file. It's certainly the least Microsoft-y approach, but, IMX, it's more foolproof. This has the advantage of working for FTP, FTPS, and SFTP. It has the disadvantage that you're storing the password in plain text. You can store it in the registry with WinSCP's session manager, but I believe that's a per-user registry which makes for lots of possible fun when you have the SQL Agent using an SSIS Proxy account.
Create your script file with the commands you want to use. Be sure to specify option batch abort and option confirm off. You also might want to specify option failonnomatch, but I haven't tested that one yet myself since I'm still on a slightly older version that doesn't support that option.
Mine tend to look like this:
# Set batch settings
option batch abort
option confirm off
# Connect
open sftp://user:password@sftp.server.com -hostkey="ssh-rsa 2048 00:11:22:33:44:55:66:77:88:99:aa:bb:cc:dd:ee:ff"
# Change local directory
lcd "M:\SSIS\eSchoolPlus to Clever\Output"
# Transfer files
put -nopreservetime -nopermissions -transfer=ascii file1.csv
put -nopreservetime -nopermissions -transfer=ascii file2.csv
put -nopreservetime -nopermissions -transfer=ascii file3.csv
put -nopreservetime -nopermissions -transfer=ascii file4.csv
put -nopreservetime -nopermissions -transfer=ascii file5.csv
close
exit
The options on put are due to how the remote server works for the script I happened to grab. get works about the same.
Then you use an Execute Process Task. The executable is C:\Program Files (x86)\WinSCP\WinSCP.exe and the Arguments I use are:
-console -script="X:\Path\To\Script\WinSCPScript.txt" -xmllog="X:\Path\To\Script\WinSCP-!S-!Y-!M-!D-!T.log" -xmlgroups
That creates a daily log of just the WinSCP information. IMO, the XML logs are more readable and useful than the older log format, which contains a lot of mostly debugging information.
You can also go the route of using the WinSCP .NET library with a Script Task, but that's obviously a lot more effort. They do have a HOWTO for that, however.
Yes - you would use a File System Task and set the Operation to Copy Directory (before Microsoft changed it with Win 95, folders were called directories). You can change the name in the destination connection where you choose the destination folder. You can use an expression to add a date if you want (see the sketch after the links below).
https://msdn.microsoft.com/en-us/ms140185.aspx
Here are some good step-by-step instructions if you need them, since Microsoft gives vague descriptions with no examples:
http://oops-solution.blogspot.com/2011/11/file-operation-copy-move-rename-and.html
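For the "expression to add a date" part, a property expression on the destination connection manager can build the folder name. A sketch of such an expression (the share and folder names are made up):

"\\\\backupserver\\recordings\\calls_" + (DT_WSTR, 4)YEAR(GETDATE())
    + RIGHT("0" + (DT_WSTR, 2)MONTH(GETDATE()), 2)
    + RIGHT("0" + (DT_WSTR, 2)DAY(GETDATE()), 2)

Assign it to the connection manager's ConnectionString property and each daily run lands in a folder like calls_20170612.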
I have a Windows Forms application that requires users to log in to access the information. I created a local compact database file for the credentials to be stored in. I added the database file to my project folder, but when I open my application and try to log in, it tells me that it cannot find the database file.
Should the file be stored in a different folder, or do I need to install an instance of SQL Server on the user's computer?
This is my first deployment, so I am not sure how to go about it. I have done some research on the subject, but it does not seem related to my issue. The help section of InstallShield was not clear either.
I am looking for some resources on how to accomplish this.
I figured out the issue: in order for it to work, all files, including the database files, need to be placed under the user profile folder.
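If anyone hits the same thing: with SQL Server Compact the usual way to keep the path flexible is the |DataDirectory| substitution in the connection string, so the .sdf is resolved against wherever the application's data actually lives. A sketch (the file name Users.sdf is made up):

Data Source=|DataDirectory|\Users.sdf

By default |DataDirectory| points at the application's base directory, and it can be redirected to the user profile folder from code via AppDomain.CurrentDomain.SetData("DataDirectory", ...).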
I am using a post-deployment script in my SQL Server database project to bulk insert data. This works great if I publish to a local database, but if I publish to a remote database, the CSV I am bulk inserting from obviously will not exist on the remote server. I have considered using a command-line copy of the CSV to a shared folder, but this raises security concerns, as anyone with access to this folder could possibly tamper with a deployment.
How should I be using post-deployment scripts to copy data to a remote server? A CSV is easier to maintain than a bunch of INSERTs, so I would prefer using a CSV. Has anyone run into and solved this issue?
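For reference, the kind of post-deployment statement in question might look like this (table name and share path are hypothetical):

BULK INSERT dbo.MyTable
FROM '\\deployserver\deploy$\MyTable.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);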
The best way is the one you mentioned. Command line copy it to a secured shared folder and BCP from there.
From that point on the security of the folder depends on network services.
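A sketch of that approach, with made-up server and share names (the share's ACL is what provides the security):

rem copy the CSV to the secured share, then load it from there
copy MyTable.csv \\deployserver\secured$\MyTable.csv
bcp MyDatabase.dbo.MyTable in \\deployserver\secured$\MyTable.csv -S remotesql -T -c -t,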
Does anyone know of a script or program that can be used for backing up multiple websites?
Ideally, I would like to have it set up on a server where the backups will be stored.
I would like to be able to add the website login info, and have it connect and create a zip file or similar, which would then be sent back to the backup server to be saved as a backup.
It would also need to be able to be set up as a cron job so it backs up at least once a day.
I can find PC-to-server backup tools that are similar, but no server-to-server remote backup scripts.
It would be heavily used, and needs a GUI so the less techy can use it too.
Does anyone know of anything similar to what we need?
HTTrack website mirroring utility.
Wget and scripts
RSync and FTP login (or SFTP for security)
Git can be used for backup and has security features and networking ability.
7Zip can be called from the command line to create a zip file.
In any case you will need to implement either secure FTP (SSH-secured) or a password-secured upload form. If you feel clever you might use WebDAV.
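As a concrete example of the rsync option from the list above, a one-liner like this mirrors a site over SSH (host and paths are placeholders):

rsync -avz -e ssh backup@www.example.com:/var/www/site/ /backups/site/

-a preserves permissions and timestamps, -z compresses in transit, and repeat runs only transfer files that changed.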
Here's what I would do:
Put a backup generator script on each website (outputting a ZIP)
Protect its access with a .htpasswd file
On the backup server, have a cron script download all the backups and store them (see the sketch below)
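A minimal sketch of that cron piece, with hypothetical URLs, credentials and paths:

# /etc/crontab entry: fetch each site's ZIP every night at 02:00
# (note that % must be escaped as \% inside crontab lines)
0 2 * * * backup wget --user=backup --password=secret -O /backups/site1-$(date +\%F).zip https://site1.example.com/backup.php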
I would like to make a complete backup of my whole Joomla 1.5 based site from time to time. How would this ideally be done? Are there any common pitfalls? Note that I only have FTP access to the hosting server. Is there a step-by-step tutorial somewhere? I am using the latest JoomGallery and Kunena 1.0.9 (Legacy mode).
Maybe there is a good way to automate this?
There are two parts of the backup you have to worry about: the database and the files.
The first part is the database. It can be backed up using something like phpMyAdmin. If you don't have this available on your server already, it's not too hard to upload and get it going yourself. From there, you can just Export the entire database to a gzip file.
The second part is the code and uploaded files. The code base shouldn't change too often, so you could probably just make one backup of it. There are a number of ways; the simplest is to just download the entire folder via FTP, though if you're on Linux, a single command line can fetch only the changed files (rsync - see the sketch below).
The database is the main thing you have to worry about though: everything else should be able to be rebuilt just by reinstalling.
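The rsync idea mentioned above might look like this, assuming you have SSH rather than only plain FTP access (host and paths are placeholders):

rsync -avz user@yourhost.example.com:/home/user/public_html/ /local/joomla-backup/

Repeat runs only transfer the files that changed since the last backup.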
I think this: http://www.joomlapack.net/ is what you need. I use it myself and it works like a charm, both for backups and for moving my Joomla installations from development sites to the live site.
Get an FTP synchronisation tool and keep an up-to-date copy of your site locally. Then you could run a batch script like
mysqldump -hhost -uuser -p%1 schema > C:\backup.sql
to create a backup of your mysql tables at various points in time.
Edit: you would have to have MySQL Server installed on your local machine and the path to its bin directory in your PATH in order to run the mysqldump command without much hassle. -p%1 takes the password from the command line, as you wouldn't want to store passwords in your batch script.
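You would then invoke the script with the password as its first argument, for example (the script name is made up):

backup.bat mysecretpassword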
If you only have FTP access you have a bit of a problem, as besides all the files you'll also have to back up the database. Without access to the database, a backup of the files alone won't do you any good.
Whatever backup strategy you choose, be sure it can handle UTF-8 correctly. Joomla 1.5 stores all content as UTF-8, even when the database charset is set to 'iso-8859-1' - so when the backup solution detects the database charset, characters like € or é will come out as mojibake such as â‚¬ or Ã© - not really what you'll want.
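With mysqldump, for instance, forcing the character set on the dump sidesteps the detection problem (a sketch building on the batch script above):

mysqldump --default-character-set=utf8 -hhost -uuser -p%1 schema > C:\backup.sql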
I absolutely endorse using Joomlapack - it works great. The optional remote tools allow you to initiate the backup from a Windows desktop machine - it performs the backup and downloads it. The remote tool has a scheduler, and you can also set it to back up and download a list of sites.
Joomlapack also provides a file "kickstart.php" which you copy to your empty server account along with the backup, which automates the restore procedure. You do have to create an empty database with PHPMyAdmin or similar, and you are given the opportunity to supply the database parameters (host, database, username, password) during the process.
One pitfall I did run into with this though is that some common components can have absolute URLs in their configuration - e.g. SOBI2, Virtuemart. It's then just a matter of finding the appropriate configuration file, editing it and re-uploading it.
Another problem was that one archive file (either ZIP or their JPA format) contained a filename with a "?" character in it (from a Linux server), which caused a problem when installing locally on a Windows WAMP stack - the extraction of the ZIP file failed, and the process did not complete cleanly.
I suggest using the automatic backup service at http://www.everlive.net
Update:
Ok, here is some more information. EverLive.net is a website where you can create a free account. Enter your website details and you are ready to take your backups with just one click. Restore is also possible in the same way.
Furthermore, you can use the automatic backup option to take backups at defined intervals. You can also use the website health check service to inform you if your website is not available.