Unit tests for Stored Procedures in SQL Server

I want to implement Test First Development in a project that will be implemented only using stored procedures and functions in SQL Server.
Is there a way to simplify the implementation of unit tests for the stored procedures and functions? If not, what is the best strategy for creating those unit tests?

It's certainly possible to do xUnit style SQL unit testing and TDD for database development - I've been doing it that way for the last 4 years. There are a number of popular T-SQL based test frameworks, such as tsqlunit. Red Gate also have a product in this area that I've briefly looked at.
Then of course you have the option to write your tests in another language, such as C#, and use NUnit to invoke them, but that's entering the realm of integration rather than unit tests; those are better for validating the interaction between your back-end and your SQL public interface.
http://sourceforge.net/apps/trac/tsqlunit/
http://tsqlt.org/
Perhaps I can be so bold as to point you towards the manual for my own free (100% T-SQL) SQL Server unit testing framework - SS-Unit - as that provides some idea of how you can write unit tests, even if you don't intend to use it:
http://www.chrisoldwood.com/sql.htm
http://www.chrisoldwood.com/sql/ss-unit/manual/SS-Unit.html
I also gave a presentation to the ACCU a few years ago on how to unit test T-SQL code, and the slides for that are also available, with some examples of how you can write unit tests either before or after the code.
http://www.chrisoldwood.com/articles.htm
Here is a blog post based around my database TDD talk at the ACCU conference a couple of years ago that collates a few relevant posts (all mine, sadly) around this way of developing a database API.
http://chrisoldwood.blogspot.co.uk/2012/05/my-accu-conference-session-database.html
(That seems like a fairly gratuitous amount of navel gazing. It's not meant to be, it's just that I have a number of links to bits and pieces that I think are relevant. I'll happily delete the answer if it violates the SO rules)

It is doable. Create tests, and in the setup create a new instance of the database and give it some data, then execute the procs. Validate your assumptions, e.g. that you got the correct data back. Drop the test DB, then do it all again in the next test.
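A minimal T-SQL sketch of that cycle (the database, table, and procedure names are all hypothetical):

-- Set-up: create a throwaway database and seed it with known data.
CREATE DATABASE UnitTestDb;
GO
USE UnitTestDb;
CREATE TABLE dbo.Customers (CustomerId INT PRIMARY KEY, Name NVARCHAR(50));
INSERT INTO dbo.Customers (CustomerId, Name) VALUES (1, N'Alice');
GO
-- Exercise: deploy and run the proc under test, then validate the result,
-- e.g. EXEC dbo.GetCustomerName @CustomerId = 1;  -- expect N'Alice'
-- Tear-down: drop the database so the next test starts clean.
USE master;
DROP DATABASE UnitTestDb;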

Unit testing in databases is actually a big topic, and there are a lot of different ways to do it. The simplest way is to write your own test like this:
BEGIN TRY
    -- Run the statement that is expected to violate the constraint.
    <statement to test>
    -- Reaching this point means no error was raised: the test fails.
    THROW 50000, 'No error raised', 16;
END TRY
BEGIN CATCH
    -- Fail unless the error caught is the one we expected.
    IF ERROR_MESSAGE() NOT LIKE '%<constraint being violated>%'
        THROW 50000, '<Description of Operation> Failed', 16;
END CATCH
In this way you can implement different kinds of data tests:
- CHECK constraints, foreign key constraints, uniqueness tests, and so on...
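For instance, here is one concrete instance of that template, assuming a hypothetical table dbo.Employees with a CHECK constraint named CK_Employees_Salary that requires Salary >= 0:

BEGIN TRY
    -- This insert should violate CK_Employees_Salary (hypothetical names).
    INSERT INTO dbo.Employees (EmployeeId, Salary) VALUES (1, -100);
    THROW 50000, 'No error raised', 16;
END TRY
BEGIN CATCH
    -- SQL Server includes the constraint name in the error message.
    IF ERROR_MESSAGE() NOT LIKE '%CK_Employees_Salary%'
        THROW 50000, 'Negative salary test Failed', 16;
END CATCH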

Related

PL/SQL package automated testing

I would like to test private procedures/functions in a given PL/SQL package (Oracle environment).
I need rather simple tests (run a procedure, check the DB to see whether a row exists, or check a return value) but within a complex testing suite.
Which tool/approach would you recommend? (It's not possible to use a paid solution.)
I'm using SQL Developer, so option #1 is its internal unit testing framework.
Another option is utPLSQL, to be more independent of SQL Developer.
A third approach is completely different: I would use Apache JMeter to connect to the DB and write the tests in JMeter.
I'm quite familiar with this tool.
The downside is that it would probably be difficult or impossible for me to test private functions.
Your opinion?
I don't get your "simple test in a complex test suite". However, if your main testing scenario is:
Set up test data in database table(s)
Run a PL/SQL subprogram
Verify the subprogram modified data in database table(s) correctly
I have good news for you - http://dbfit.github.io/dbfit/ is a great tool for that kind of testing. I have used it on several occasions and I'm very happy with it.
Oh, and one can't access the package's private subprograms. You can only access the package's public interface (the specification).
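If you go the utPLSQL route, a test of the public interface might look like this minimal sketch (assuming utPLSQL v3's annotation syntax; the package under test, order_mgr, and its function order_count are hypothetical):

CREATE OR REPLACE PACKAGE ut_order_mgr AS
  --%suite(order_mgr tests)

  --%test(order_count matches the rows in orders)
  PROCEDURE order_count_matches_table;
END ut_order_mgr;
/

CREATE OR REPLACE PACKAGE BODY ut_order_mgr AS
  PROCEDURE order_count_matches_table IS
    l_expected INTEGER;
  BEGIN
    SELECT COUNT(*) INTO l_expected FROM orders;
    -- Call through the public spec; private subprograms stay out of reach.
    ut.expect(order_mgr.order_count).to_equal(l_expected);
  END order_count_matches_table;
END ut_order_mgr;
/

-- Run the suite with: EXEC ut.run('ut_order_mgr');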

Validating SQL Files - SQL Validator

We have an application where we add SQL changes in SQL files, and they are run during install in test and other environments. We have around 25 people working, and if someone makes any mistake it breaks in the test environment, as those are DDL, DML and sometimes even syntax errors.
Now, to avoid that, I was thinking about building a SQL validator that will run an explain plan on the query in each SQL file. The SQL statements are written as single or multiple lines. We have to format them, then find any syntax error or other error and report it, so that it breaks during validation rather than during the install in test, saving lots of time and rework.
We currently do manual review of the SQL files, but even after that a few errors are not caught and error out in test.
Any suggestion would be highly appreciated.
You say you want a SQL validator but what you probably need is a better development process.
Let me guess - you have a single development server that everybody works on. They make changes there, hope it doesn't break anything, extract SQL statements from that database, and then apply them to other servers.
That's how most Oracle developers work. It's painful and does not scale. There's a better way using some simple software development methodologies developed decades ago.
Version control as the single source of truth. Most Oracle shops only use version control as a glorified backup. Here's a quick test - drop every schema on the development server. If you can't get back up and running from the source code repository in 5 minutes, you're not really using version control.
Ask yourself an almost philosophical question - where does the true, ideal version of your product exist? Even if you're building a database product, the answer should be "version-controlled text files", not "the development database".
An infinite number of databases and schemas. This is easier than it sounds - have every Oracle developer and tester run Oracle on their own desktop. Then they can create as many instances and schemas as they want. I've seen many Java programmers do this, but sadly most Oracle developers think it's impossible.
Automated tests. Automated unit tests give you confidence. It sounds like right now you have 0 confidence in the code. You shouldn't be worried about syntax errors - that's a Programming 101 problem. Errors will decrease significantly if everyone on the team is constantly building and testing code. Things will still break sometimes - you can add something like continuous integration or just shame people who constantly break the builds.
The combination of those three things is the ultimate SQL validator. Version control to ensure you have the right code. Local instances to make it easy to install and validate the code. And automated tests to do the validation.
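That said, if you still want a mechanical first pass along the lines the question suggests, Oracle will parse and plan a statement without executing it. A minimal sketch (the table is hypothetical, and note this covers queries and DML only, not DDL):

-- Parse and plan the statement without running it; a syntax error or a
-- missing object surfaces here instead of during the install in test.
EXPLAIN PLAN FOR
SELECT order_id, status
FROM orders
WHERE status = 'OPEN';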

Is it possible to implement Test Driven Development in SQL?

I am not a DB guy. I am just curious whether there is a possibility to write asserts in SQL so that you can write unit tests for your scripts, your sprocs, etc., and then even implement a Test Driven Development approach for your sessions?
thanks!
You can do that actually - not directly from SQL, though, but via the language you build your application in.
Of course, your SQL must be encapsulated in a DAL (Data Access Layer) and all data fetched by Repositories (or other data access classes). You can unit test those classes, which would end up running the SQL scripts. So, basically, you will be testing your SQL code.
That is simpler to me than trying to write such tests in SQL. :)
Some time ago I wrote about that:
http://www.beletsky.net/2010/11/testing-database-and-test-database.html

Using TSQLUNIT for SQL unit testing: don't you need to duplicate your SQL code?

I'm considering writing some unit tests for my T-SQL stored procedures. I have two concerns:
I will have to write a lot of SQL to create test fixtures (test data prepared in _setup procedures)
I will have to "re-write" my query in the test procedure to obtain the results to compare against the results from the stored procedure I'm testing.
Considering that my DB has hundreds of tables and really complex stored procedures... I don't see how this will save me time. Any thoughts? Am I missing something? Is there any other way to go?
Automated unit testing often gets left by the wayside as managers push for quick releases rather than increasing project scope and budget to emphasize stability. The fact is, unit testing takes time. In my experience, the benefits far outweigh any drawbacks. In cases where stored procedures are being called by external systems, unit testing has been invaluable in eliminating unforeseen problems and guaranteeing stability prior to integration testing.
Regarding your concerns:
If you place the data required to unit test your stored procedure(s) in XML files that are read before the unit tests run, you can load it using the standard XML-handling routines and potentially re-use the same data for multiple tests. Run each test in the context of a transaction that is rolled back at the end of the test, so the overall environment can be configured once at the beginning of a test run rather than repeating lots of steps for each individual test. Unit tests can be bundled with automated nightly build processes to further bullet-proof your code.
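As a rough illustration of the XML fixture idea (the file path, element names, and target table are all assumptions):

-- Load shared fixture data from an XML file so several tests can reuse it.
DECLARE @fixture XML;

SELECT @fixture = CAST(BulkColumn AS XML)
FROM OPENROWSET(BULK 'C:\tests\customers.xml', SINGLE_BLOB) AS src;

INSERT INTO dbo.Customers (CustomerId, Name)
SELECT c.value('@id', 'INT'),
       c.value('@name', 'NVARCHAR(50)')
FROM @fixture.nodes('/customers/customer') AS t(c);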
There will be some overhead initially, but this will decrease over time as you and your team become more familiar with the unit-test concepts and how to leverage reusability.
You shouldn't need to re-write your query to compare the results. A standard scenario might be something like the following:
1. Load test data and prepare the environment
2. Begin a transaction
3. Run the stored procedure using the test data
4. Compare actual output to expected output using Assert statements
5. If actual and expected output don't match, the test fails
6. If actual and expected output match, the test passes
7. Roll back the transaction
8. Repeat steps 2 through 7 for any additional tests
9. Clean up the test environment
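A minimal T-SQL sketch of steps 2 through 7 (the procedure dbo.GetCustomerName and its expected output are hypothetical):

BEGIN TRANSACTION;

-- Step 3: run the procedure under test against the fixture data.
DECLARE @actual NVARCHAR(50);
EXEC dbo.GetCustomerName @CustomerId = 1, @Name = @actual OUTPUT;

-- Steps 4-6: compare actual to expected; hard-coded expectations are fine.
IF @actual = N'Alice'
    PRINT 'PASS: GetCustomerName';
ELSE
    RAISERROR('FAIL: GetCustomerName expected Alice, got %s', 16, 1, @actual);

-- Step 7: undo any changes the test made.
ROLLBACK TRANSACTION;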
Keep in mind, you are testing a specific set of conditions looking for pass/fail, so it's OK to hard-code the expected values within your test routines.
Hope this helps,
Bill
In theory, unit testing (in general) means more time up front writing tests, but it should make things easier for you later on. For example, the time invested pays dividends later when you can spot regression bugs very easily. The Wikipedia entry on unit testing has a good overview of the general benefits.
Whether it will be good for you in practice is a hard question to answer - depends on the project.
As for 'having to re-write the query to test the query results', obviously that isn't going to prove anything. I suppose what you need to do is set up test data that will return a predictable result when the query (or whatever) is run, and then test for that specific result. That way you are testing the query against your mental model of it, rather than testing the query against a copy of itself.
But yeah, sounds like that will take a lot of setting up time - I can imagine that preparing a SQL stored procedure test will involve doing a lot more setting-up than your average .Net object test.
The thing I wonder about is, WHY are you considering writing unit tests? Do you have operational issues with the database? Is it hard to implement changes? Is management making your raise dependent on unit tests?
If there's no clear reason, I wouldn't start with unit tests "for fun". When there's a well-oiled change system in place, unit tests add overhead but no value.
There are also serious risks with unit tests:
People start seeing unit tests as a "quality guarantee". Just keep hacking till the unit tests give the green light, and then it's good enough for production.
Small changes that used to be a "quick fix" will grow bigger, because they require (changes to) the unit tests. This way unit tests make you less flexible.
Unit tests often check many things that don't matter to anyone using the production system. So unit tests force you to spend resources on stuff only the unit tests care about.
Sorry for the rant (I've had bad experiences with unit tests).

ASP.NET MVC TDD with LINQ and SQL database

I am trying to start a new MVC project with tests, and I thought the best way to go would be to have two databases: one for testing against and one for when I run the app and use it (also testing, really, as it's not production yet).
For the test database I was thinking of putting CREATE TABLE scripts and fill-data scripts within the test setup method and then deleting all this in the tear-down method.
I am going to be using LINQ to SQL though, and I don't think that will allow me to do this?
Will I have to just go the ADO route if I want to do it this way? Or should I just use a mock object and store data as an array or something?
Any tips on best practices?
How did Jeff go about doing this for StackOverflow?
What I do is define an interface for a DataContext wrapper and use an implementation of the wrapper for the DataContext. This allows me to use an alternate, fake DataContext implementation in my tests (or mock it, if easier). This abstracts the database out of my unit tests completely. I found some starter code at http://andrewtokeley.net/archive/2008/07/06/mocking-linq-to-sql-datacontext.aspx, although I've extended it so that it handles the validation implementations on my entity classes.
I should also mention that I have a separate staging server for QA, so there is live testing of the entire system. I just don't use an actual database in my unit testing.
I checked out the link from tvanfosson and RikMigrations, and after playing about with them I prefer the mocking DataContext method. I realised I don't need to create tables and drop them all the time.
After a little more research I found Stephen Walther's article http://stephenwalther.com/blog/archive/2008/08/17/asp-net-mvc-tip-33-unit-test-linq-to-sql.aspx which to me seems easier and more reliable.
So I am going with this implementation.
Thanks for the help.
You may want to find some other way around actually hitting the database for your unit tests, because it takes a lot more time. That being said, have you considered using migrations for creating/deleting your tables instead of using SQL scripts? RikMigrations is what I have been using to create my database, so I can easily revision all of my code in one place. Justin Etheredge has a great article on using RikMigrations.
Consider these methods on DataContext:
http://msdn.microsoft.com/en-us/library/system.data.linq.datacontext.createdatabase.aspx
http://msdn.microsoft.com/en-us/library/system.data.linq.datacontext.executecommand(v=VS.100).aspx
I agree with much of the above relating to unit testing. However, I think it's important to raise the point that using mock repositories and unit tests doesn't give you the same level of assurance as a DB integration test would.
For example, our databases often have cascading deletes built right into the schema. In this case, deleting a primary entity in an aggregate will automatically delete all child entities. However, this would not automatically apply in a mocked repository that was not backed by a physical database with these business rules (unless you built all of those rules into the mock). This is important because if somebody comes along and changes the design of my schema, I need it to break my tests so I can adjust the code/schema accordingly. I appreciate that this is integration testing and not unit testing, but I thought it was worth mentioning.
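For illustration, a hypothetical schema fragment with exactly that behaviour - the cascade fires in the real database but not in a mock:

-- Deleting an order automatically removes its order lines;
-- a mocked repository won't reproduce this unless you code it in.
CREATE TABLE dbo.Orders
(
    OrderId INT NOT NULL PRIMARY KEY
);

CREATE TABLE dbo.OrderLines
(
    OrderLineId INT NOT NULL PRIMARY KEY,
    OrderId     INT NOT NULL
        REFERENCES dbo.Orders (OrderId) ON DELETE CASCADE
);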
My preferred option is to create a Master Design Database that contains sample data (the same sort of data you would create in your mocks). At the start of each test run, an automated script creates a backup of the MasterDB and restores it to "TestDB" (which all my tests use). That way, I maintain a repository of clean test data in Master that recreates itself upon each test run. My tests can play around with the data and test out all the scenarios needed.
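A minimal sketch of such a script (the database names, logical file names, and paths are all assumptions):

-- Snapshot the master design database...
BACKUP DATABASE MasterDB
    TO DISK = 'C:\Backups\MasterDB.bak'
    WITH INIT, COPY_ONLY;

-- ...and restore it over the test database before the run starts.
RESTORE DATABASE TestDB
    FROM DISK = 'C:\Backups\MasterDB.bak'
    WITH REPLACE,
         MOVE 'MasterDB'     TO 'C:\Data\TestDB.mdf',
         MOVE 'MasterDB_log' TO 'C:\Data\TestDB_log.ldf';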
When I debug the application, I have another script that backs up and restores the Master DB to a DEV database. I can play around with data here too without worrying about losing my sample data. I don't typically run this particular script every session because of the delay waiting for the DB to be recreated. I may run it once a day and then play around/debug the app throughout the day. If for example, I delete all the records from a table as part of my debugging, I would run the script to recreate the DevDB when I'm done.
These steps sound like they would add a huge amount of time to the process, but actually - they don't. Our application currently has in the region of 3500 tests, with about 3000 of them accessing the DB at some point. The database backup and restore typically takes around 10-12 seconds at the start of each test run. And since the whole test suite is only executed upon TFS checkin, we don't mind if we have to wait a while longer anyway. On an average day, our entire test suite takes about 15-20 minutes to run.
I appreciate and accept that integration testing is much slower than unit testing (because of the inherent need to use a real DB), but it more closely represents the "real world" app. For example, mock repositories don't return DB error codes, they don't time out, they don't lock up, they don't run out of disk space, etc.
Unit tests are OK for simple calculations, basic business rules, etc., and certainly they are absolutely the best choice for most operations that don't involve DB (or other resource) access. But I don't think they are as valuable as integration tests - people talk a lot about unit tests, but little is said about integration tests.
I expect those passionate about unit tests will be sending flames my way for this. That's fine - I'm just trying to bring some balance and to remind people that projects full of passing unit tests can still fail badly the moment you deploy them in the field.
This article gives an example of mocking LINQ to SQL with Typemock.
http://blog.benhall.me.uk/2007/11/how-to-unit-test-linq-to-sql-and.html