Run multiple SQL script files with Transact-SQL

I'd like to continue the subject discussed in the post linked below, which deals with running multiple script files on a SQL Server database.
TransactSQL to run another TransactSQL script
Here in my company we have a similar problem. We have two environments: pre-production and production. When we are developing new releases, a lot of changes are made in the pre-production database, which is a smaller copy of production. These changes generate a set of script files (*.sql files that can contain CREATE, ALTER VIEW, CREATE TABLE, INSERT and other commands), and we try to control them using SVN (we know the file names). When we have to publish these modifications, the scripts are executed one by one on the production database.
The problem is that we (the developer team) do not have permission to run scripts on the production database; this is done by the DBA, and most of the time one of these scripts has an error that can cause a momentary system crash.
Is there a way to use the solution discussed in that post to run all the script files as one atomic execution and, if something goes wrong, roll back? Maybe using the BEGIN TRAN, COMMIT and ROLLBACK statements.
Any suggestions?
Thanks
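One possibility, sketched below, is to run everything through sqlcmd in a single session: a master script includes each file with the :r directive inside one explicit transaction. The file names are placeholders for your SVN-controlled scripts, and this assumes the included scripts do not contain their own COMMIT/ROLLBACK statements.

```sql
-- master.sql: run with sqlcmd (sqlcmd -S SERVER -d MyDb -i master.sql)
-- or in SSMS with "SQLCMD Mode" enabled.
-- File names below are placeholders for your SVN-controlled scripts.
:on error exit           -- abort at the first failing statement

BEGIN TRANSACTION;

:r "001_create_tables.sql"
:r "002_alter_views.sql"
:r "003_insert_data.sql"

COMMIT TRANSACTION;
-- If any script fails, sqlcmd exits, the connection closes,
-- and the still-open transaction is rolled back automatically.
```

Note that, unlike some other database engines, SQL Server can roll back most DDL (CREATE/ALTER), which is what makes this kind of atomic deployment feasible.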

My suggestion is to have another test environment before production where your deployment can be tested. This works quite well.

If Visual Studio Schema Compare is available, it can be used for these environment migrations.
Check the concepts shown in this SO question:
Regards.

Related

Visual Studio Database Project: Include 'If Exists' checks for all the objects

We use TFS Continuous Integration to handle our staging and deployments of code. In our current environment, we (developers) aren't allowed to manually update databases in Production. A script must be staged and then given to a DBA to run.
By default, the database project builds and outputs a database creation script that will create all the tables and stored procedures. However, it does not include checks to see if the object already exists.
For example, when it attempts to create the Customer table, I would like the script to check whether the table already exists and, if it does, alter it instead.
Is this at all possible?
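As a rough illustration of the kind of check being asked about (table and column names are made up), an existence-checked script might look like:

```sql
-- Hypothetical example: create dbo.Customer if missing, otherwise
-- patch it in place (here, by adding a new Email column).
IF OBJECT_ID(N'dbo.Customer', N'U') IS NULL
BEGIN
    CREATE TABLE dbo.Customer
    (
        CustomerId int IDENTITY(1,1) PRIMARY KEY,
        Name       nvarchar(100) NOT NULL
    );
END
ELSE IF COL_LENGTH(N'dbo.Customer', N'Email') IS NULL
BEGIN
    ALTER TABLE dbo.Customer ADD Email nvarchar(255) NULL;
END
```

Writing these guards by hand for every object is exactly the burden the answer below avoids by generating a change script instead.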
VS can create a script for just the changes. I think this approach will be better than using existence checks because it will be able to handle column changes, and overall it makes for a shorter and more targeted script.
1) Right-click the project and select Publish.
2) Click Edit and enter the connection details for your staging database.
3) Back on the Publish dialog, click Advanced and make sure "Always re-create database" is not checked.
4) Back on the Publish dialog, click Generate Script.
What this approach does is compare the objects in the database project to your staging database and generates a SQL script for just what is different. You can even save the publish settings to a file to make it easier to generate future scripts.
Keith is right: you need to script the changes rather than just using the CREATE statements.
You basically either need a copy of the production database to run a comparison against, or you give the DBAs a way to run the comparison and deploy.
The way I prefer to do it with TFS is to use SSDT in Visual Studio. I then have a custom build step as part of the .sqlproj file that builds the dacpac and uses sqlpackage.exe to compare the dacpac to a mirror of production (or dev, UAT, whatever); this outputs a script that will bring that version of the database up to the same version as the code in the dacpac.
You can adjust this slightly to auto-deploy to dev, UAT etc. and just create the script for production, but the choice of exactly what you do is up to you!
If you can't get a mirror of production or a copy of the production schema, you can give the dacpac to the DBAs along with either a batch file or a PowerShell script to drive sqlpackage.exe to create a script, or just go ahead and deploy.
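A minimal batch file along those lines might look like the following; the server, database, and file names are placeholders, and sqlpackage.exe is assumed to be on the PATH:

```bat
@echo off
rem Sketch: generate an upgrade script from a dacpac against a target database.
sqlpackage.exe /Action:Script ^
    /SourceFile:"MyDatabase.dacpac" ^
    /TargetServerName:"PRODSERVER" ^
    /TargetDatabaseName:"MyDatabase" ^
    /OutputPath:"upgrade_MyDatabase.sql"
```

The DBA can then review upgrade_MyDatabase.sql before running it, which fits the "developers stage, DBA executes" workflow described above.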
Exactly what works depends on the environment you are in!
Ed

Building SQL deployment scripts into the application?

We currently have a rather manual, fiddly, messy & error prone way of running SQL deployment scripts when we update our clients' software installations. We're considering finding a 3rd party SQL deployment tool to automate this process.
However, I'm pushing the idea of building our own SQL deployment tool into the application itself. It would be simple - on application startup, it would:
1) Check the existing database schema version (eg. "35")
2) Check against "up to date" database schema version (eg. "38")
3) Retrieve relevant SQL deployment scripts from resource files (eg. "36", "37", "38")
4) Lock the database and run each required SQL deployment script
Note that this would still be run by an IT technician in case any errors occurred, not by end users.
It seems unorthodox but I don't really see any problem. Your thoughts?
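A minimal sketch of steps 1) and 2) in T-SQL, assuming a hypothetical dbo.SchemaVersion table that the application owns (all names are made up):

```sql
-- Step 1: ensure a version-tracking table exists and read the current version.
IF OBJECT_ID(N'dbo.SchemaVersion', N'U') IS NULL
BEGIN
    CREATE TABLE dbo.SchemaVersion (Version int NOT NULL);
    INSERT INTO dbo.SchemaVersion (Version) VALUES (35);
END

DECLARE @current int = (SELECT TOP (1) Version FROM dbo.SchemaVersion);

-- Steps 3/4: the application would loop from @current + 1 up to the target
-- version, running the matching embedded resource script and bumping the number.
IF @current < 36
BEGIN
    -- ...execute the contents of resource script "36" here...
    UPDATE dbo.SchemaVersion SET Version = 36;
END
```

Updating the version number in the same transaction as the script it belongs to keeps the tracking table honest if a script fails partway.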
I don't see anything inherently wrong with this.
At a company I've worked for, they built a custom SQL-script installer that would allow them to automatically apply changes to the database, roll back the changes if necessary, and keep tabs on the version of what's been applied.
No matter the desired result, you'll need to set conventions (e.g. database releases should have this folder structure) and identify the needs and processes that will be used in running the tool (e.g. just how automated you'll make it).
Don't build your own. Far too common a problem for a bespoke solution.
You're looking for a database migration tool; my recommendation would be Liquibase. It can be run from the command line or integrated into the build process. A feature that is especially valuable to me is the generation of SQL upgrade (and downgrade) scripts, which are often demanded of us when supporting production installs.
For a more detailed listing of alternative migration tools, see the following answer:
Migrations for Java
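Liquibase changelogs are often XML, but it also accepts plain SQL files annotated with special comments, which keeps the scripts readable for DBAs; the --rollback comment is what enables the downgrade scripts mentioned above. A tiny sketch (author and change IDs are made up):

```sql
--liquibase formatted sql

--changeset alice:38
ALTER TABLE Customer ADD Email nvarchar(255);
--rollback ALTER TABLE Customer DROP COLUMN Email;
```

Liquibase records each applied changeset in a tracking table, so re-running the changelog only applies what is new.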

How to automatically export stored procedures for a release (SQL Server 2008)

Our team was releasing a new version of our system yesterday and we came across some issues with stored procedures. To cut a long story short, we had to upload the old stored procedures to fix the issues.
I have now been given the task of automatically backing up the stored procedures for our database before we release a build. I have gone through a lot of sites and looked at generating scripts, making batch files, doing whole backups, scheduling tasks, etc., but none of these solutions would automatically back up only the stored procedures.
Any help would be greatly appreciated. Thanks in advance.
Best Regards
Ryan
In Management Studio, right click on your database in the Object Explorer window, go to Tasks -> Generate Scripts... and follow the wizard.
You need to use SMO libraries to create your scripts and use them in command line batch files. Read more in http://msdn.microsoft.com/en-us/library/ms162153.aspx
Before running the script generator, set the option "Continue scripting on Error"; otherwise the script will not be generated.
If the DROP and CREATE option is chosen, also set the "Script Object-Level Permissions" option for stored procedures.
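As an alternative to the wizard or SMO, the procedure text can also be pulled straight from the catalog views; run via sqlcmd with -o to save the output to a file before each release. A sketch:

```sql
-- List every stored procedure in the current database with its full text.
SELECT
    OBJECT_SCHEMA_NAME(p.object_id) AS [schema],
    p.name,
    m.definition
FROM sys.procedures AS p
JOIN sys.sql_modules AS m
    ON m.object_id = p.object_id
ORDER BY [schema], p.name;
```

Because it is plain T-SQL, this is easy to drop into a scheduled job or a pre-release batch file, which is exactly the automation the question asks for.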
Is your software source code checked into source control? It might be of benefit if your database is as well. This is the method software has used to manage versions and releases for years, and it's about time DBs joined the party.
I suggest you look into a database project (available in the current 2015 free version of SQL Server Data Tools), which is a way of checking your objects in and out of a repository etc. It's a more complete way of managing database objects and fits into the software lifecycle. You can release your database codebase in conjunction with your software codebase and manage it all in one.

How do you share SQL changes within your team?

Whenever you make database changes, how do you apply these changes to other databases on the team (and also to your servers)?
Currently we are using a file called changes.sql where we put all our changes, separated with ISO date comments.
Is there a better way?
We use an expanded version of your approach.
We have a database upgrade folder for each release, which contains all the scripts that are part of the release. There is one index file in the folder, which contains pseudo links to all the scripts that should be run.
We have a cruise control job which runs each night to restore a copy of the current production database, then runs the current release's upgrade scripts against it (by executing the scripts defined in the index file). There's also a CI job which runs whenever anyone checks anything into the upgrade folder for the current release.
The scripts need to be re-runnable, obviously; e.g. they should check for the existence of something before dropping or creating it.
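For example, a re-runnable script that (re)creates a stored procedure might guard the drop like this (object names are made up):

```sql
-- Re-runnable: drop the procedure only if it already exists, then recreate it.
IF OBJECT_ID(N'dbo.GetOrders', N'P') IS NOT NULL
    DROP PROCEDURE dbo.GetOrders;
GO

CREATE PROCEDURE dbo.GetOrders
AS
BEGIN
    SELECT OrderId, CustomerId FROM dbo.Orders;
END
GO
```

Running this twice leaves the database in the same state, which is what lets the nightly job replay the whole release folder safely.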
Take a look at http://dbmaintain.org/overview.html - It is a quite powerful tool to manage database updates. It basically works by executing several SQL scripts in the correct order. It remembers which scripts were already executed. If an executed script is changed it either reports an error (in production mode) or clears the database and executes all scripts again (in testing mode). There is a good tutorial too.
Edit: You can also group the SQL scripts (e.g. by release). The big advantage here is that you can use the same scripts for your unit tests, testing environments, continuous integration, near-live and production environments.
Not at my current job, but in the past I used a database project in Visual Studio 2010, which was then published to SVN. We had an SOP rather than software automation to push changes from development to QA, staging, and production.
The team I worked with was small - five developers with shared responsibility for DB design and .NET development.
You should also consider using version control on your database. One example is Liquibase. By using version control you can comment all the changes to the table structure, so you don't need a changes.sql file.
We use a migration tool (migratordotnet - other alternatives exist) that lets you write C# classes that execute database commands. The migrations run locally on each invocation of the program or of the integration tests, and on the servers on each deployment. The migration framework automatically keeps track of which migrations have been applied. Of course, the migrations are a part of the version control repository.

Making a SETUP file

I have a VB project that runs on SQL SERVER 2005, while making the setup file for it, how do I include the DB?
You don't.
Typically you have a DB generation script that is run either as part of setup or as part of first run of application
You also need to consider migrations (changes to DB when new releases of your application are published)
Consider using MigratorDotNet or RikMigrations to solve these problems in a separate installer/upgrade program if you are still using VB6.
I disagree, you could include the database. Simply distribute the .MDF file with your application.
Of course, the setup application would have to know how to attach the database to an existing SQL Server instance.
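For instance, the setup could run something like the following against the target instance (database name and path are placeholders); FOR ATTACH_REBUILD_LOG means only the .mdf needs to be distributed:

```sql
-- Run by the setup program after copying the .mdf into place.
CREATE DATABASE MyAppDb
    ON (FILENAME = N'C:\Program Files\MyApp\Data\MyAppDb.mdf')
    FOR ATTACH_REBUILD_LOG;   -- rebuilds the log if no .ldf is shipped
```

The setup account needs CREATE DATABASE permission on the instance, which is worth documenting in the install guide.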
Both methods given in the above answers will work; I have tried them both. However, using a DB generation script reduces the size of the final deployment files considerably. I would launch the script on the first run of the application and not in the setup itself.
I will second Jack on this one.
In my experience, installs that require an actual database file tend to have more issues, both on first install and when updating, than those that run scripts. As Jack mentioned, another bonus is the reduced file size.
You can create whole database scripts by right-clicking the required database and selecting the script database option. Note, however, that this will only create the tables and fields and will not replicate any data.