In my WiX project I have file data and SQL scripts that create the database, create/alter tables, insert/update rows, etc. The scripts are split into three parts and executed via the SqlScript element. I use ContinueOnError="no", but a script that already executed successfully is not rolled back when a later one fails. Can I wrap all the scripts in a transaction and use try/catch blocks? Is there a way to handle the catch event from WiX? What do you advise for building this kind of installer?
We don't use the WiX SQL extension; we run custom actions to do the job we need.
On install, we use custom actions to first back up the database, then run the right upgrade scripts (based on the version of the current database), and if needed restore the database from the backup as a rollback action of the upgrade.
On uninstall, we back up the database, delete it (conditionally, based on user input), and restore it if anything goes wrong during the uninstall.
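For illustration, a minimal sketch of the backup and restore steps such custom actions might run; the database name and backup path here are made up:

    -- Taken before the upgrade scripts run
    BACKUP DATABASE [MyAppDb]
        TO DISK = N'C:\Backups\MyAppDb_PreUpgrade.bak'
        WITH INIT;

    -- Rollback action: put the database back the way it was
    RESTORE DATABASE [MyAppDb]
        FROM DISK = N'C:\Backups\MyAppDb_PreUpgrade.bak'
        WITH REPLACE;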
WiX does not handle SQL scripts that way.
I believe your choices are fairly limited.
Create a database backup before installation and restore it on install failure. Unless you know for certain that the data size will always be small, this probably should not be an automated part of the installer.
Provide rollback sql scripts to be sequenced and run in case of install failure. This can be a real pain in the ass to get correct depending on the types of DB changes you need.
Offhand, I am unaware of any installer toolkit that even attempts to automate database rollbacks as part of a larger install. There are just too many variables to account for (e.g. how long the rest of the non-DB installation takes and the effect that could have on database connection timeouts).
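To illustrate the rollback-script option above: a compensating script is just the hand-written inverse of the upgrade. A sketch, with made-up object names:

    -- The upgrade added a procedure and a column; the rollback removes both
    IF OBJECT_ID(N'dbo.usp_GetOrders', N'P') IS NOT NULL
        DROP PROCEDURE dbo.usp_GetOrders;

    IF COL_LENGTH(N'dbo.Orders', N'TrackingCode') IS NOT NULL
        ALTER TABLE dbo.Orders DROP COLUMN TrackingCode;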
Related
I have an application that has been released for several years. I have updated the SQL Server database around 90 times, and each time I have created an update SQL script, so when the user runs the new software version the database is updated.
I would like to amalgamate all the updates into a single SQL script to make deployment faster and simpler. Is there an easy way to do this? I presume I could simply grab the latest database after it has run through all 90 updates via SQL Server Management Studio?
EDIT
For clarification: the software I wrote automatically applies new database updates when the user downloads the latest version. This is done via C#/.NET, which on startup looks for embedded SQL scripts named in the format XX_update.sql and calls each script one by one, i.e.:
1_update.sql - this creates the tables and initial data etc. This was my initial release database.
2_update.sql - updates to the initial database, such as adding a new stored procedure or changing a column datatype, etc.
3_update.sql
4_update.sql
...
90_update.sql (4 years and lots of program updates later!).
Ideally, I would install my software and create a brand-new database by running through all 90 update scripts, then take this database and convert it into a single script that can replace the 90 scripts above.
This is too long for a comment.
There is no "easy" way to do this. It depends on what the individual updates are doing.
There is a process you can follow in the future, though. Whenever an update occurs, you should maintain scripts both for incremental updates and for complete updates. You might also want to periodically introduce major versions, and be able to upgrade to and from those.
In the meantime, you'll need to build the complete update by walking through the individual ones.
I use a similar system at work, and while I prefer to run the scripts separately, I have sometimes amalgamated several scripts without problems when they had to be deployed by another person.
In SQL Server, the rule is that as long as you separate the scripts with GO and use SQL Server Management Studio or another tool that processes the batch separator properly, there is no problem amalgamating them. The result still looks like separate scripts to SQL Server: instead of being sent to SQL Server as one big script, the tool sends it in batches, using GO as the batch separator.
The only problem is that if you have an error in the middle of the script, the tool will continue sending the rest of the batches to the server (at least by default; there may be an option to change this). That is why I prefer, when possible, to use a tool to run them separately, just to err on the safe side: it stops if there is a problem and makes it easy to locate the problematic command.
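To make that concrete: an amalgamated script is just the individual scripts with GO between them, and sqlcmd's -b switch makes it stop at the first failing batch instead of carrying on. Object names here are made up:

    -- combined_update.sql: each original script becomes one or more batches
    ALTER TABLE dbo.Customers ADD Email NVARCHAR(256) NULL;
    GO
    CREATE PROCEDURE dbo.usp_GetCustomers AS
        SELECT CustomerId, Name, Email FROM dbo.Customers;
    GO

Run it with, for example, sqlcmd -S myserver -d MyDb -b -i combined_update.sql; without -b you get the default continue-on-error behaviour described above.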
Also, for new deployments your best option is what you suggest: use a database that is already updated instead of taking the original one and applying all the updates. And to avoid being in this situation again, you can keep an empty template database that is not used for any testing and update it whenever you update the rest of the databases.
Are you running your updates manually on the server? Instead, you could create a .bat file, PowerShell script, or an exe. The update scripts in your folder can be numbered, and the script can pick up the updates one by one and execute them over the database connection instead of you doing it.
If you have multiple script files and you want to combine them into one, rename them as .txt, combine them, and rename the resulting file back to .sql.
https://www.online-tech-tips.com/free-software-downloads/combine-text-files/
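An alternative sketch that avoids physically concatenating the files: in SQLCMD mode, a master script can pull in the numbered files in order with the :r include directive, so the 90 originals stay untouched. Paths here are made up:

    -- master_update.sql; run with: sqlcmd -S myserver -d MyDb -b -i master_update.sql
    :r C:\updates\1_update.sql
    :r C:\updates\2_update.sql
    :r C:\updates\3_update.sql
    -- ...and so on through 90_update.sql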
This is easy. Download SQL Server Data Tools and use it to create the deployment script. If there's no existing database, it will create all the objects; if targeting an older version, it will perform a diff against the target database and create scripts that create the missing objects and alter the existing ones.
You can also generate scripts for the current version using SSMS, or use SSMS to take a backup and restore it during the install.
I'd be thankful if we could continue the subject discussed in the post linked below, which deals with running multiple script files on a SQL Server database.
TransactSQL to run another TransactSQL script
Here in my company we have a similar problem. We have two environments: pre-production and production. When we are developing new releases, a lot of changes are made in the pre-production database, which is a smaller copy of production. These changes generate a set of script files (*.sql files that can contain CREATE, ALTER VIEW, ALTER TABLE, INSERT, and other commands), and we try to control them using SVN (we know the file names). When we have to publish these modifications, the scripts are executed one by one on the production database.
The problem is that we (the developer team) do not have permission to run scripts on the production database; this is done by the DBA, and most of the time one of these scripts has an error that can cause a momentary system crash.
Is there a way to use the solution discussed in this post to run all the script files as one atomic execution, so that if something goes wrong we do a rollback? Maybe using the BEGIN TRAN, COMMIT, and ROLLBACK statements.
Any suggestions?
Thanks
My suggestion is to have another test environment before production where your deployment can be tested. This works quite well.
If Visual Studio Schema Compare is available, it can be used for these environment migrations.
Check out some of the concepts shown in this SO question.
Regards.
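On the atomic-execution part of the question: within a single session you can wrap the whole deployment in one transaction, and SET XACT_ABORT ON makes most run-time errors abort and roll back everything. A minimal sketch with made-up objects:

    SET XACT_ABORT ON;  -- a run-time error rolls back the whole transaction
    BEGIN TRANSACTION;

    ALTER TABLE dbo.Orders ADD ShippedDate DATETIME NULL;

    UPDATE dbo.Settings SET Value = '2.0' WHERE Name = 'SchemaVersion';

    COMMIT TRANSACTION;

A transaction opened this way even spans GO batch separators, as long as everything runs on the same connection; note, though, that a few statements (e.g. CREATE/ALTER DATABASE, full-text operations) cannot run inside a transaction at all.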
Whenever you make database changes, how do you apply these changes to the other databases on the team (and also to your servers)?
Currently we are using a file called changes.sql where we put all our changes, separated with ISO date comments.
Is there a better way?
We use an expanded version of your approach.
We have a database upgrade folder for each release, which contains all the scripts that are part of the release. There is one index file in the folder, which contains pseudo-links to all the scripts that should be run.
We have a cruise control job which runs each night to restore a copy of the current production database, then runs the current release's upgrade scripts against it (by executing the scripts defined in the index file). There's also a CI job which runs whenever anyone checks anything into the upgrade folder for the current release.
The scripts obviously need to be re-runnable, e.g. they should check for the existence of something before dropping or creating it.
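For example, a re-runnable script guards every change so it is safe to execute any number of times (names made up):

    -- Procedure: drop and recreate
    IF OBJECT_ID(N'dbo.usp_GetInvoices', N'P') IS NOT NULL
        DROP PROCEDURE dbo.usp_GetInvoices;
    GO
    CREATE PROCEDURE dbo.usp_GetInvoices AS
        SELECT InvoiceId, Total FROM dbo.Invoices;
    GO

    -- Column: add only if missing
    IF COL_LENGTH(N'dbo.Invoices', N'DueDate') IS NULL
        ALTER TABLE dbo.Invoices ADD DueDate DATE NULL;
    GO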
Take a look at http://dbmaintain.org/overview.html - It is a quite powerful tool to manage database updates. It basically works by executing several SQL scripts in the correct order. It remembers which scripts were already executed. If an executed script is changed it either reports an error (in production mode) or clears the database and executes all scripts again (in testing mode). There is a good tutorial too.
Edit: You can also group the SQL scripts (i.e. by release). The big advantage here is that you can use the same scripts for your unit tests, testing environments, continuous integration, near-live, and production environments.
Not at my current job, but in the past I used a database project in Visual Studio 2010, which was then published to SVN. We had an SOP rather than software automation to push changes from development to QA, staging, and production.
The team I worked with was small - five developers with shared responsibility for DB design and .NET development.
You should also consider using version control for your database. One example is Liquibase. By using version control you can comment all the changes to the table structure, so you don't need a changes.sql file.
We use a migration tool (migratordotnet - other alternatives exist) that lets you write C# classes that execute database commands. The migrations run locally on each invocation of the program or of the integration tests, and on the servers on each deployment. The migration framework automatically keeps track of which migrations have been applied. Of course, the migrations are a part of the version control repository.
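The bookkeeping such migration tools do boils down to a version table; a rough T-SQL sketch of the idea (this is not migratordotnet's actual schema, just an illustration):

    -- Records which migrations have already been applied
    IF OBJECT_ID(N'dbo.SchemaVersions', N'U') IS NULL
        CREATE TABLE dbo.SchemaVersions (
            Version      BIGINT   NOT NULL PRIMARY KEY,
            AppliedAtUtc DATETIME NOT NULL DEFAULT (GETUTCDATE())
        );
    GO

    -- A migration runs only if its version number is not recorded yet
    IF NOT EXISTS (SELECT 1 FROM dbo.SchemaVersions WHERE Version = 42)
    BEGIN
        ALTER TABLE dbo.Users ADD LastLoginUtc DATETIME NULL;
        INSERT INTO dbo.SchemaVersions (Version) VALUES (42);
    END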
I have an installer that installs procedures, scripts, views, etc. in SQL Server 2005/2008.
Now I want to add a condition to the installer: if there is any error while installing, I want to undo all the changes made in SQL Server.
I tried storing the procedures, views, etc. that I change while installing and reverting them back if I get an error, but I am not able to do it the way I want.
Can someone who has done the same thing guide me?
To be specific, I am using the WiX installer.
Also, if someone has tried SMO, that would be of great help.
The simplest and most robust way to handle this is not to use the installer at all. Rather, wrap all your SQL into a transaction block. This means that if anything fails for any reason (as part of the SQL), the transaction will gracefully roll back and all your DB changes will be reverted, without you having to implement anything more than defining the transaction block around your SQL statements.
Assuming MS SQL more information regarding transactions can be found here:
http://msdn.microsoft.com/en-us/library/ms188929.aspx
Most other mainstream SQL implementations follow a very similar model, but obviously refer to their docs instead.
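A minimal sketch of such a transaction block with TRY/CATCH error handling, which works on SQL Server 2005/2008 (object names are made up):

    BEGIN TRY
        BEGIN TRANSACTION;

        CREATE TABLE dbo.AuditLog (
            Id      INT IDENTITY(1, 1) PRIMARY KEY,
            Message NVARCHAR(4000) NOT NULL
        );

        INSERT INTO dbo.AuditLog (Message) VALUES (N'Schema installed');

        COMMIT TRANSACTION;
    END TRY
    BEGIN CATCH
        IF @@TRANCOUNT > 0
            ROLLBACK TRANSACTION;

        -- Re-raise so the caller (e.g. the installer) sees the failure
        DECLARE @msg NVARCHAR(4000);
        SET @msg = ERROR_MESSAGE();
        RAISERROR (@msg, 16, 1);
    END CATCH;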
If you need to trigger a "rollback" of the SQL component of your install when some other component of the install fails, then unfortunately you can't use transactions in this manner. However, in this case you could simply call a rollback script that deletes any SPs, tables, etc. you have added. That said, in .NET you can bring the SQL transaction handling into the code (i.e. C#); if that is available to you, you could use it to wrap up everything.
It can be difficult to roll back an SQL upgrade script, particularly if that script could fail at any point. One problem is that the built-in rollback machinery cannot handle everything; for example, some statements (such as CREATE or ALTER DATABASE and full-text operations) cannot run inside a transaction at all. For those changes you would have to implement rollbacks manually, with compensating scripts that undo the changes.
It might be simpler to back up the database at the outset of the installation and restore it should the installation fail.
My application database has no project and is not under SourceSafe. I planned to turn my DB into a project and add it to TFS, but I have no idea how to script the stored procedures, triggers, views, and functions, or what the best practice is for making an update script for all my stored procedures, triggers, views, and functions for my customers' databases.
The best procedure (IMHO) is to manually maintain strict versioning of your schema. Then, when you need to make changes, you write a delta script to move from one version to the next. I suggest you write the DDL scripts by hand -- you can make them concise and comment them.
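For example, a hand-written delta script moving a schema from one version to the next might look like this sketch (table and constraint names are made up):

    -- Delta v4 -> v5: orders gain a currency code, defaulting to USD
    ALTER TABLE dbo.Orders
        ADD CurrencyCode CHAR(3) NOT NULL
            CONSTRAINT DF_Orders_CurrencyCode DEFAULT ('USD');

    -- Record the new schema version (assumes some version table exists)
    UPDATE dbo.SchemaInfo SET Version = 5;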
You can use a tool like Visual Studio Team System for Database Architects; take a look at Running static code analysis on SQL Server database with Visual Studio Team System for database architects, which will show you how to import the data. Disregard the static code analysis that comes later; it does not apply to your question.
I've found that a good way to get SQL scripts into SCM from an existing database is to use SSMS's "export all to script" option, or whatever it's called, can't remember now.
Then, for every other change, you add the change script to your SCM with a different version number in the file name.
Every release (or set cycle, depending on your development/release methodology), you apply all change scripts, then re-script the entire database, tag it, and start again.
The best way to do it is to save the database in TFS as a set of database creation scripts, i.e. the MyTable table should be added to TFS as a MyTable.sql file (CREATE TABLE...), etc. We are using SQL Examiner to do this -- see the following article: How to keep your database under version control.
We are working with SVN, and I never tested SQL Examiner with TFS, but I know that the tool supports TFS.