I am new to working with SQL databases but have been confronted with testing my database. For security reasons, any queries or updates are done through stored procedures.
A peer suggested using stored procedures to test other stored procedures. Is this a good or bad approach to unit testing my stored procedures, to ensure they are doing what they are supposed to be doing?
I found an excellent solution using Visual Studio:
Visual Studio Unit Testing
It allows you to create unit tests for SQL stored procedures; you can also populate the database using regular expressions - very cool stuff.
Apart from DbUnit and NUnit, you can use tSQLt if you like writing T-SQL. tSQLt is an open-source framework, written in T-SQL, for writing unit tests for T-SQL code.
DbUnit from dbunit.org is a good framework. Other great tools are TSqlUnit and tSQLt from tsqlt.org.
I have been using TST as a testing framework for SQL Server, with very positive results.
It works well, and I could even integrate it into our CI environment in TeamCity.
The key is to be very careful about how you set up your test data. Whatever test framework you choose, be sure to apply the Test Data Builder pattern. If you don't, you may very quickly run into test-refactoring problems.
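To make the pattern concrete, here is a minimal sketch of a Test Data Builder in Python. The `CustomerBuilder` name, fields, and defaults are all made up for illustration; in a SQL test framework the builder would emit INSERT statements rather than dicts, but the idea is the same.

```python
# Minimal sketch of the Test Data Builder pattern: each test states only the
# fields it cares about, and the builder supplies sensible defaults for the
# rest. All names and values here are illustrative.

class CustomerBuilder:
    def __init__(self):
        # Defaults keep tests short and insulate them from schema growth:
        # adding a column later means touching only the builder, not every test.
        self._row = {"id": 1, "name": "any-name", "country": "SE", "active": True}

    def with_id(self, id_):
        self._row["id"] = id_
        return self

    def inactive(self):
        self._row["active"] = False
        return self

    def build(self) -> dict:
        return dict(self._row)

# A test that only cares about the "active" flag reads like this:
row = CustomerBuilder().with_id(42).inactive().build()
```

When the schema changes, only the builder's defaults need updating, which is exactly the refactoring problem the pattern avoids.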
I don't know of a tool that can unit test SQL statements or stored procedures, so I usually write a SQL script to test them:
Create an empty database with the same structure as the original one.
Prepare the data (import some data from the original database).
Invoke the stored procedure.
Then check whether the resulting data is correct.
Sometimes I write assert-style statements like:
IF EXISTS (SELECT * FROM XX INNER JOIN XXX ON XXXXXXXX WHERE XXX = XXX)
    RAISERROR (XXXXXX, 16, 1)
If an exception or error is raised, you know the stored procedure needs looking at.
But doing this for everything is a waste of time, so most of the time I just read through all the execution paths of the stored procedure and only write tests for the main expected error points.
I have been using Testcontainers (MySQL) to mock the MySQL server,
using Kotlin/Java and JUnit 5 as the test runner.
So each test case will:
start the Testcontainers container,
seed the data that is required for the test,
execute the stored procedure,
assert that the data is accurate.
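The seed / execute / assert steps above can be sketched in Python too; here an in-memory SQLite database stands in for the MySQL container (SQLite has no stored procedures, so the procedure body runs as plain SQL, and the table and values are made up for illustration):

```python
import sqlite3

# Sketch of the seed / execute / assert pattern, with an in-memory SQLite
# database standing in for the containerized MySQL server. The "procedure"
# body is executed as plain SQL because SQLite has no stored procedures;
# table and values are illustrative only.

PROC_BODY = """
    UPDATE accounts SET balance = balance - 100 WHERE id = 1;
    UPDATE accounts SET balance = balance + 100 WHERE id = 2;
"""

def run_transfer_test():
    conn = sqlite3.connect(":memory:")
    try:
        # 1. seed the data required for the test
        conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance INTEGER)")
        conn.executemany("INSERT INTO accounts VALUES (?, ?)", [(1, 500), (2, 0)])
        # 2. execute the "stored procedure"
        conn.executescript(PROC_BODY)
        # 3. assert that the data is accurate
        balances = dict(conn.execute("SELECT id, balance FROM accounts"))
        assert balances == {1: 400, 2: 100}, balances
        return balances
    finally:
        conn.close()
```

With a real container, only the connection setup changes; the seed / execute / assert shape of each test stays the same.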
I'm creating SQL procedures in Entity Framework by using Sql methods in a migration. For example, in an Up() method I'm doing:
Sql(@"SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE FUNCTION dbo.MyFunction
...
GO");
Now my problem is with the GOs: if I want to be able to script my migrations (for using on a live server) I need the GOs in the generated script, otherwise it doesn't work. However for running Update-Database without the -script option (for use while developing), it gets upset about the GOs.
I've found some partial answers:
How to add code to initialize sql database
The answer by Bart provides a method to call to divide the statement up into multiple Sql calls. The problem with this is that it affects the generated script when running with the -script option, so back to square 1.
How can I override SQL scripts generated by MigratorScriptingDecorator
This also looks like it could be useful.
My question is therefore: is there either a way to know whether you're running the migration with the -script option, or a better way to script my SQL procedures in the migrations?
I posted a similar question where someone actually gave me an awesome answer;
here is the answer to it. I hope it helps.
Basically, you have to alter the configuration file of your migration configurator.
The build process for my SQL 2008 database project takes upwards of 15 minutes per build, and I only need to manage roughly 50 stored procedures. I created a database project and a server project.
The next thing I do is fix all the build errors. Then I modify a stored procedure, and I have to build the entire database and script the entire database just to see if my stored procedure compiles.
Is there any way to test the stored procedure without going through a 15 minute build - then deploy the script? Can I build just changes instead of the entire DB?
First of all, you should look at why your builds take so long. Maybe you need more memory or something.
Second, why do you have to script the entire database to test the stored procedure? Just deploy to a test database, or even your local sandbox database.
I decided to go with MSSCCI. Its simple UI, which plugs directly into SSMS and behaves like Team Explorer, is exactly what I've been searching for. Getting started.
I've never used Visual Studio to "build" and maintain databases. This, I'd guess, will quickly become unmanageable as the database gets bigger. And I'm assuming that when you "build" it, it verifies and deploys all objects in the database.
I would suggest you not use Visual Studio in this fashion. Just maintain your SQL code independently, and store it in a version control system manually. That way you can update each stored procedure separately; in other words, keep each stored procedure as a file:
if object_id('<ProcName>') is not null
    drop proc <ProcName>
GO
create proc <ProcName>
...
GO
Then, store that as ProcName.sql and let source control handle the rest. Sorry if this isn't helpful or if I'm just stating the obvious.
I'm looking for a way to copy stored procedures from one sql database to another on the same instance. It needs to be automatic, preferably through code (t-sql or anything), so using the generate scripts trick is not viable (plus I don't want to maintain that many scripts, and people forget to run them).
I've searched a bit on this and have not found a workable solution. Someone suggested a clever trick with generating all the stored procedure text into a sql field and then converting that and executing it on the destination db but unfortunately that had issues.
Has anyone got any other ideas on how this can be done, if it's at all possible?
If I can't do it programmatically, would there be a quick solution using ssis?
Thanks.
Edit: Using mixture of sql 2005 and 2008 versions.
You can do it programmatically in .NET using the SMO framework.
A free and easy implementation can be done via PowerShell.
I have a script that does just this - it generates the scripts for SQL objects, including Stored Procs, and executes the creation scripts on the target server.
It's a handy workaround when security concerns don't allow linked servers but you need to keep certain resources in sync across multiple servers.
If all you care about are the procs, it should be fairly straightforward to check sys.sql_modules on your source and target DBs and execute any that don't exist in the target via the definition field in that view.
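As a sketch of that comparison step (assuming you have already pulled name/definition pairs out of `sys.sql_modules` on each side; the proc names below are made up):

```python
# Sketch of the sys.sql_modules comparison described above: given the proc
# definitions pulled from the source database and the proc names present on
# the target, return the CREATE scripts that still need to be executed.
# In practice both inputs would come from a query like
#   SELECT object_name(object_id), definition FROM sys.sql_modules
# run against each server; the names here are illustrative.

def missing_proc_scripts(source_defs: dict[str, str], target_names: set[str]) -> list[str]:
    """Return definitions of procs that exist on the source but not on the target."""
    return [definition
            for name, definition in sorted(source_defs.items())
            if name not in target_names]

# Each returned string is a full CREATE PROCEDURE batch, ready to be executed
# on the target connection one at a time.
source = {
    "usp_GetOrders": "CREATE PROCEDURE usp_GetOrders AS SELECT 1;",
    "usp_GetUsers": "CREATE PROCEDURE usp_GetUsers AS SELECT 2;",
}
print(missing_proc_scripts(source, {"usp_GetUsers"}))
# → ['CREATE PROCEDURE usp_GetOrders AS SELECT 1;']
```

Note this only handles procs missing from the target; detecting procs whose definitions have drifted would mean comparing the definition text as well.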
My application database has no project and no source control. I planned to turn my DB into a project and add it to TFS, but I have no idea how to script the stored procedures, triggers, views, and functions, or what the best practice is for making an update script of all my stored procedures, triggers, views, and functions for my customers' DBs.
The best procedure (IMHO) is to manually maintain a strict version of your schemas. Then when you need to make changes you write a delta script to move from one version to the next. I suggest you write the DDL scripts by hand -- you can make them concise and comment them.
You can use a tool like Visual Studio Team System for Database Architects. Take a look at Running static code analysis on SQL Server database with Visual Studio Team System for database architects; it will show you how to import the data. Disregard the static code analysis that comes later, as it does not apply to your question.
I've found that a good way to get SQL scripts from an existing database into SCM is to use SSMS's "export all to script" option, or whatever it's called; I can't remember now.
Then, for every subsequent change, you add the change script to your SCM with a different version number in the file name.
Every release (or set cycle depending on your development/release methodology) you apply all change scripts, then re-script the entire database, tag it, and start again.
The best way to do it is to save the database in TFS as a set of database creation scripts, i.e. a MyTable table should be added to TFS as a MyTable.sql file (CREATE TABLE...), etc. We use SQL Examiner to do this - see the following article: How to keep your database under version control.
We work with SVN and I have never tested SQL Examiner with TFS, but I know that the tool supports TFS.
Ok, so I've got a bit of a SQL and PowerShell problem. There are two SQL scripts: one sets up four different global stored procedures; the other executes them and manipulates data before returning it to PowerShell to be placed in a CSV file. The reason I'm not putting them into a single file is readability: the procs enclose huge chunks of SQL, and I cannot create permanent procs in our production environment.
The problem I'm running into is that the script runs fine in SQL Management Studio, but when run from PowerShell I get several errors around the GOs in the script.
I'm pretty sure this is a problem with the format that PowerShell and the .NET classes expect when executing and returning data sets, but... I'm at a loss.
I'm running SQL Server 2005 btw.
Any ideas or similar experiences?
What errors do you get? How are you executing each file? GO is a batch separator understood only by certain tools (e.g. Management Studio); PowerShell doesn't know what GO means. Have you tried executing the separate CREATE PROCEDURE scripts without issuing a GO command between them? If they are separate commands this shouldn't be an issue.
"GO" is a batch separator used by SQL Server Management Studio. It is not a valid T-SQL keyword. You can even configure SQL Management Studio to change "GO" to "ENGAGE" if you wanted to.
Just remove "GO" from the scripts.
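If removing the GOs by hand isn't practical, a sketch of the splitting step in Python shows what sqlcmd and Management Studio do client-side: split the script into batches on lines containing only GO, then send each batch separately. Executing each batch (e.g. via a .NET SqlCommand or pyodbc) is left out here.

```python
import re

# GO is a client-side batch separator, not T-SQL, so it must appear on a line
# by itself. Split on such lines (case-insensitively) and drop empty batches.
GO_LINE = re.compile(r"^\s*GO\s*(?:\r?\n|$)", re.IGNORECASE | re.MULTILINE)

def split_batches(script: str) -> list[str]:
    """Split a T-SQL script on GO separator lines into individual batches."""
    return [batch.strip() for batch in GO_LINE.split(script) if batch.strip()]
```

Each returned batch can then be run as its own command from PowerShell or .NET, which sidesteps the GO errors entirely.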