Understanding SQL Database Project workflow

The build process for my SQL Server 2008 Database Project takes upwards of 15 minutes per build, and I only need to manage roughly 50 stored procedures. I created a database project and a server project.
The first thing I did was fix all the build errors. Now, whenever I modify a stored procedure, I have to build and script the entire database just to see whether that one procedure compiles.
Is there any way to test a stored procedure without going through a 15-minute build and then deploying the script? Can I build just the changes instead of the entire database?

First of all, you should look at why your builds take so long. Maybe you need more memory or something.
Second, why do you have to script the entire database to test the stored procedure? Just deploy to a test database, or even your local sandbox database.
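A minimal sketch of that quick check, assuming you keep a local sandbox copy of the database (all of the object names below are made up):
-- Run just the changed procedure against a sandbox copy to confirm it compiles
USE SandboxCopy;
GO
ALTER PROCEDURE dbo.GetCustomerOrders
    @CustomerId int
AS
BEGIN
    SELECT OrderId, OrderDate
    FROM dbo.Orders
    WHERE CustomerId = @CustomerId;
END
GO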

I decided to go with MSSCCI. Its simple UI, which plugs directly into SSMS and behaves like Team Explorer, is exactly what I've been searching for.

I've never used Visual Studio to "build" and maintain databases. My guess is that this will quickly become unmanageable as the database gets bigger, and I'm assuming that when you "build" it, it verifies and deploys every object in the database.
I would suggest you not use Visual Studio in this fashion. Just maintain your SQL code independently and store it in a version control system manually. That way you can update each stored procedure separately. In other words, keep each stored procedure as a file:
IF OBJECT_ID('dbo.<ProcName>', 'P') IS NOT NULL
    DROP PROCEDURE dbo.<ProcName>;
GO
CREATE PROCEDURE dbo.<ProcName>
AS
    ...
GO
Then, store that as ProcName.sql and let source control handle the rest. Sorry if this isn't helpful or if I'm just stating the obvious.

Related

How to deploy SQL script to clients

Our company is in the process of adopting TFS for source repository and project management. I am in charge of the database part of the project. We are using SQL Server 2008 R2, Visual Studio 2012, and TFS Online. We have a database that is used by several of our applications. So far I have been the only one handling changes to this database, but as the company is expanding we are going to have multiple dev teams, so I am planning to save the database as an SSDT project in TFS.
At the moment I maintain my database as follows:
I have separate folders for UDFs, Stored Procedures, and Config.
Under these folders I have subfolders for each object. For example, for stored procedures I have a subfolder for each stored procedure, which contains the SQL script to create the SP. The Config folder contains any script similar to SSDT's post-deployment script (for example, populating static data).
The SQL script contains code to drop the procedure and create it.
I have a C# app to concatenate all the SQL files into one single SQL file. Let's call it the FINAL script. When creating the FINAL script I can specify a version number, which adds an UPDATE statement to update the version table in the database.
The FINAL script is made available for customers to download and execute against the database, so the script mainly contains any add/edit to SPs, UDFs, and static data. It does not touch any existing data (data entered by users) in most cases.
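For example, the version update appended to the end of the FINAL script looks roughly like this (the table and column names here are simplified placeholders):
-- Appended to the FINAL script when a version number is supplied
UPDATE dbo.DatabaseVersion
SET    VersionNumber = '2.4.0',
       UpdatedOn     = GETDATE();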
As a newbie to TFS and SSDT I am not exactly sure how this can be done using SSDT/TFS, or whether there is a better way of doing something similar. So far what I have understood about SSDT and TFS is:
I can import an existing database to SSDT project.
This will create scripts for all objects including tables.
I can easily do a publish of the database to a local server or to a server I have access to.
Things that seem confusing so far:
How do I supply clients with my latest update script? I am thinking of manually including the FINAL script in the SSDT project, but there must be a better way of doing it.
How do I publish the changes to a copy of the database without losing any user-entered data? My guess is that publishing creates the tables; I can take care of the static data, but I am not sure how to handle data entered by users.
Maybe there is something fundamentally wrong in my understanding of this whole thing. That is why I am here... :)
You want to pull your DB into a SQL Project. Maintain all of your changes there. This tells your system what the schema of your database should be. From there, I'd generate the dacpac files (through building the project) and provide those to your clients along with having them install the SSDT tools that include SQLPackage. They can run SQLPackage to make changes to their database to handle the schema changes automatically. This will bring their database in line with your schema, no matter how far off it might be.
I'd also create a publish profile for them to use. This lets you control some of the settings.
You can choose to not drop any objects not in your project
You can choose to ignore users/permissions
You can set an option to not allow changes if there would be data loss.
You can wrap everything in a transaction so a failed update rolls back
If you give them a batch file to run, you can specify an output file or a Diff report, or have them generate their own script to do the update.
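As a rough sketch, the batch file might be little more than a SqlPackage call against the dacpac and publish profile you ship (the file names below are placeholders):
REM Apply the schema changes directly using the shipped publish profile
SqlPackage.exe /Action:Publish /SourceFile:MyDatabase.dacpac /Profile:MyDatabase.publish.xml /p:BlockOnPossibleDataLoss=True /p:DropObjectsNotInSource=False
REM Or generate a script for them to review instead of applying changes directly:
REM SqlPackage.exe /Action:Script /SourceFile:MyDatabase.dacpac /Profile:MyDatabase.publish.xml /OutputPath:Upgrade.sql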
I blogged about this at http://schottsql.blogspot.com/2013/10/all-ssdt-articles.html
(or http://schottsql.blogspot.com/search/label/SSDT if that doesn't work well). That will take you through some basics of why you might want to use SQL Projects, creating them, maintaining them, and publishing the changes to an existing database.

How to automatically export stored procedures for a release (SQL Server 2008)

Our team was releasing a new version of our system yesterday and we came across some issues with stored procedures. To cut a long story short, we had to re-apply the old stored procedures to fix the issues.
I have now been given the task of automatically backing up the stored procedures for our database before we release a build. I have gone through a lot of sites and looked at generating scripts, making batch files, doing whole backups, scheduling tasks, etc., but none of these solutions would automatically back up only the stored procedures.
Any help in this case would be greatly appreciated. Thanks in advance.
Best Regards
Ryan
In Management Studio, right click on your database in the Object Explorer window, go to Tasks -> Generate Scripts... and follow the wizard.
You need to use the SMO libraries to create your scripts, and you can call them from command-line batch files. Read more at http://msdn.microsoft.com/en-us/library/ms162153.aspx
Before running the script generator, set the option "Continue scripting on Error"; otherwise the script will not be generated.
If the "DROP and CREATE" option is chosen, also set the "Script Object-Level Permissions" option for stored procedures.
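If you only need the procedure text itself, a plain T-SQL alternative to the wizard is to pull every definition out of the catalog views and save the result set to a file yourself (a minimal sketch):
-- Dump the name and full definition of every stored procedure
SELECT
    OBJECT_SCHEMA_NAME(p.object_id) AS schema_name,
    p.name                          AS procedure_name,
    m.definition                    AS procedure_definition
FROM sys.procedures AS p
JOIN sys.sql_modules AS m
    ON m.object_id = p.object_id
ORDER BY schema_name, procedure_name;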
Is your software source code checked into source control? It might be of benefit if your database is as well. This is the method software has used to manage versions and releases for years, and it's about time DBs joined the party.
I suggest you look into a database project (available in the current 2015 free version of SQL Server Data Tools), which is a way of checking your objects in and out of a repository, etc. It's a more complete way of managing database objects and fits into the software lifecycle. You can release your database codebase in conjunction with your software codebase and manage it all in one place.

Visual Studio DB project fails to detect changes to stored procedure parameters. Is that normal?

Whilst working on a SQL Server 2008 database project in Visual Studio 2010 I added a new parameter to an existing stored procedure definition. When I built the project it failed to detect that references to the sproc elsewhere in the project did not have enough parameters. It even let me deploy the project.
Is this the way it's meant to behave or have I forgotten to tick a box somewhere?!
Sam : )
Database projects do not detect problems with procedure/function parameters. Also, you will notice you can delete the offending procedure/function from your project altogether and the build won't fail.
In my case, I use an external tool for managing programmability, so not failing the build because of missing procs is a plus.
If you want to validate your procedures and functions, you can write a script that executes all your stored procedures with SET FMTONLY ON. The procedure will be compiled, but no permanent changes will be made to the DB during execution. You can't use this with procedures that use temporary tables (#table syntax).
That's how Microsoft does it in Visual Studio to determine what the output of your stored procedure should be.
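A minimal sketch of that validation approach (the procedure name and parameter below are hypothetical):
-- Returns only result-set metadata; the procedure is parsed and compiled,
-- and per the note above, no permanent changes are made during execution.
SET FMTONLY ON;
EXEC dbo.GetCustomerOrders @CustomerId = NULL;
SET FMTONLY OFF;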
Unless you re-run the code generation wizard (by deleting the sproc in the VS Server Explorer then dragging it back in) your project doesn't know that the database has changed. You may get runtime errors but not compile errors.
If it doesn't know about any changes it will compile normally. So yes, it's supposed to behave that way.

SQL Server Project

My application database has no project and no source control (SourceSafe). I planned to turn my DB into a project and add it to TFS, but I have no idea how to script the stored procedures, triggers, views, and functions, or what the best practice is for producing an update script for all my stored procedures, triggers, views, and functions to apply to my customers' databases.
The best procedure (IMHO) is to manually maintain a strict version of your schemas. Then when you need to make changes you write a delta script to move from one version to the next. I suggest you write the DDL scripts by hand -- you can make them concise and comment them.
You can use a tool like Visual Studio Team System for Database Architects. Take a look at "Running static code analysis on SQL Server database with Visual Studio Team System for database architects"; it will show you how to import the data. Disregard the static code analysis that comes later, as it does not apply to your question.
I've found a good way to get SQL scripts into SCM from an existing database is to use SSMS's "export all to script" option, or whatever it's called; I can't remember now.
Then, for every subsequent change, you add the change script to your SCM with a different version number in the file name.
Every release (or set cycle depending on your development/release methodology) you apply all change scripts, then re-script the entire database, tag it, and start again.
The best way to do it is to save the database in TFS as a set of database creation scripts, i.e. the MyTable table should be added to TFS as a MyTable.sql file (CREATE TABLE...), etc. We are using SQL Examiner to do this - see the following article: How to keep your database under version control
We are working with SVN and I never tested SQL Examiner with TFS, but I know that the tool supports TFS.
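To illustrate the one-file-per-object idea, MyTable.sql might contain nothing but that table's creation script (the columns shown here are invented):
-- MyTable.sql: one versioned file per database object
CREATE TABLE dbo.MyTable
(
    MyTableId int IDENTITY(1, 1) NOT NULL PRIMARY KEY,
    Name      nvarchar(100)      NOT NULL
);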

Should you store your SQL Stored Procedures in Source Control?

When developing an application with lots of stored procedures, should you store them in some sort of source versioning system (such as source-safe, TFS, SVN)? If so, why? And is there a convenient front end way to do this with SQL Server Management Studio?
Yes. All code should be stored in source control.
Simply put, code is code and mistakes happen. It's nice to be able to look back and see what changed over time, and to be able to revert to an earlier version.
We have to add it manually to a source control system, but you can create add-ins for SQL Server Management Studio. I haven't ever created one to automatically add code to source control, but I suppose you could. Also, all the code is stored in system tables, so in theory you could create a process to go through the table(s), retrieve all the code, and commit it automatically.
Update: I would always write extra code to check whether the procedure exists; if it doesn't, create a filler procedure, and then have the actual script do an ALTER PROCEDURE:
IF NOT EXISTS (SELECT * FROM dbo.sysobjects
               WHERE id = OBJECT_ID(N'[dbo].[SomeStoredProcedure]')
                 AND OBJECTPROPERTY(id, N'IsProcedure') = 1)
    EXEC sp_executesql N'CREATE PROCEDURE [dbo].[SomeStoredProcedure] AS
        SELECT ''SPROC Template'''
GO
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
ALTER PROCEDURE [dbo].[SomeStoredProcedure]
AS
    ...
GO
Doing a drop and recreate will remove all the user permissions you have set up for it.
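If you do drop and recreate, the script needs to re-apply those permissions explicitly, along these lines (the role name here is made up):
-- Re-grant execute rights after recreating the procedure
GRANT EXECUTE ON [dbo].[SomeStoredProcedure] TO [app_users];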
ABSOLUTELY POSITIVELY WITHOUT QUESTION NO EXCEPTIONS IN ALL PERPETUITY THROUGHOUT THE UNIVERSE YES!
Get your database under version control. Check the series of posts by Scott Allen.
When it comes to version control, the database is often a second or even third-class citizen. From what I've seen, teams that would never think of writing code without version control in a million years-- and rightly so-- can somehow be completely oblivious to the need for version control around the critical databases their applications rely on. I don't know how you can call yourself a software engineer and maintain a straight face when your database isn't under exactly the same rigorous level of source control as the rest of your code. Don't let this happen to you. Get your database under version control.
I recommend that you do store them. You never know when you'll need to roll back, or dig into logic you may have removed.
Here's a good way to easily grab your Stored Procs into files that you can throw into whatever source control you desire..
Stored Procedures to .sql files
Storing stored procedures is a great idea. It's a pain, though. Just how do you get all that stuff into Subversion? You can do it manually, but then it's tedious and you end up not doing it at all.
I use a tool from the subsonic project.
sonic.exe version /server servername /db databasename /out outputdirectory
This command saves everything to two text files. One contains the database schema, stored procs, user accounts, constraints, and primary keys. The other contains the data.
Now that you have these two files you can use Subversion (or CVS, SourceSafe) to move them into source control.
More info on using the command-line tool (SubCommander).
Most definitely yes. Then the question becomes how you store them in source control. Do you drop and recreate the stored procedure or just ALTER it? Do you add permissions at the end of the script or in a separate script? There was a post on Coding Horror a while back about the topic that I found interesting: Is Your Database Under Version Control?
Sure you should.
In MS SQL 2008, you can do it right from Management Studio.
SQL is code. All code belongs under source code control.
That is all.
Absolutely.
Positively.
A set of SPs is an interface, and one that is likely to be modified more frequently than the underlying table structure.
And because SPs contain business logic, changes should be stored in version control to track the modifications and adjustments to the logic.
Storing these in version control is a symptom of organizational maturity at a coding level, and is a best practice.
Most definitely.
You should.
To my knowledge, no such tool exists to automate this process. At least, five years ago, when I was considering building one, there didn't seem to be any competition.
We store our procs in Subversion; all your SQL code, including DDL, should be in some kind of source control repository.
SPs, and table schemas for that matter, are all assets that should be under version control. In a perfect world the DB would be built from scripts, including the test data, as part of your CI process. Even if that's not the case, having a database per developer is a good model to follow. That way new ideas can be tried out in a local sandbox without impacting everyone; once the change is tested it can be checked in.
Management Studio can be linked to source control, although I don't have experience of doing this. We've always tracked our SPs/schema as files. Management Studio can automatically generate change scripts, which are very useful, as a table drop/recreate can be too heavy-handed for any table that has data.
SQL procs also surely need the same security/benefits of version control as the rest of the code in the project.
As others have said, yes they should be.
I don't know of an easy way to do this with SQL Server Management Studio, but if you also use Visual Studio, database projects are a great way to manage this.
There are methods in SMO to generate scripts if you prefer to code your own scripting tool.
http://www.sqlteam.com/article/scripting-database-objects-using-smo-updated
If you're not using asset management alongside source control, then I say throw everything in source control: images, Word documents, the whole shebang. You can't lose it, you can always reverse any changes to it, and if any machine goes down, nothing is lost.