Tests with constraint-checking errors not caught when using AbstractTransactionalSpringContextTests (rollback) - NHibernate

The majority of my integration tests use Spring's AbstractTransactionalSpringContextTests to roll back instead of committing to the database. This works well normally, but because foreign key constraints are not applied until the commit stage, there's a hole in my testing strategy.
How can I improve my tests?
I want to avoid committing if possible, as this makes tests take much longer to run (when there are many).

I understand you don't want to commit to the database, but you can flush the session instead; the flush is what pushes the SQL (and therefore the constraint checks) to the database, and performance should be acceptable when using an in-memory SQLite database for this purpose.
I've done unit tests using NHibernate (with Fluent NHibernate) and an in-memory SQLite database (how-to here); this runs quite fast as long as you create only the relevant parts of your database instead of the complete schema.
You can easily extend the AbstractTransactionalSpringContextTests class to flush to the DB (see section 22.2.10 of the Spring.NET docs, or this thread on the Spring.NET forum), so you should be able to get this working quickly for your test suite.
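For the flushing part, a minimal sketch (the DAO, entity and session variables here are assumptions; the point is only the explicit Flush() before the automatic rollback):

[Test]
public void SavingOrderWithUnknownCustomerViolatesForeignKey()
{
    // hypothetical DAO call that schedules an INSERT in the session
    orderDao.Save(orderWithUnknownCustomer);

    // force NHibernate to issue the SQL now, so the constraint violation
    // surfaces inside the test, before the transaction is rolled back
    session.Flush();
}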

Related

Speed of CockroachDB schema changes during unit tests

I have issues with schema changes in my testing infrastructure. We use a unit test framework and truncate our database between each unit test. I've also noticed that running a series of CREATE statements is very slow, specifically when duplicating a database for running tests in parallel. I'm finding it takes upwards of 10 minutes to duplicate a database 10 times (I'm using SHOW CREATE ALL TABLES and running those statements).
The problem with CockroachDB schema changes getting slow in tests is known, but there are some workarounds. See this guide with some recommended settings for unit tests, which should help address some of the problems: https://www.cockroachlabs.com/docs/stable/local-testing.html#use-a-local-single-node-cluster-with-in-memory-storage
Future releases (that is, v22.2 and later) will continue improving the performance of schema changes when repeatedly dropping and creating tables.
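For reference, the heart of that guide is running a throwaway single-node cluster backed by memory instead of disk, along the lines of the following command (check the linked page for the exact flags in your CockroachDB version):

cockroach start-single-node --insecure --store=type=mem,size=2GiB --advertise-addr=localhost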

Schema generation using Hibernate Vs Manual schema generation

I have worked on projects where the schema was created by hand-coded SQL scripts, and we then used Hibernate for DML-related activities.
Now I am starting a project that involves extensive creation of database entities, and I was wondering whether it is a good idea to use Hibernate itself to generate the schema. In other words, is Hibernate capable of handling all possible DDL-related scenarios, especially the complex ones? Or is it advisable to hand-code the DDL SQL scripts and use Hibernate for DML-related tasks?
Thanks for your inputs.
No, Hibernate isn't able to handle all possible situations (synonyms, tablespaces, and all sorts of other constructs can't be handled by Hibernate).
I would only consider using Hibernate (to handle the schema creation and updates) for a quick and dirty POC. Otherwise, SQL scripts or Liquibase are your friends. You'll need them once you have a database in production that you need to migrate anyway.
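For the quick-and-dirty POC case, here is what that looks like in NHibernate terms, to stay consistent with the rest of this page (Hibernate proper exposes the same idea through the hbm2ddl.auto property); a sketch using the NHibernate.Cfg and NHibernate.Tool.hbm2ddl namespaces:

var cfg = new Configuration().Configure(); // reads hibernate.cfg.xml and the mappings

// drop and recreate the whole schema from the mappings; fine for a POC,
// not for a production database you need to migrate
new SchemaExport(cfg).Create(false, true);

// or attempt an in-place update of an existing schema
new SchemaUpdate(cfg).Execute(false, true);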

How to Unit Test when stored procedures are used by NHibernate Mapping

I've asked before about how to inject stored procedure support when working with views in SQL Server:
How to use stored procedures in NHibernate for Create/Update when working with database view
Here's a piece of the mapping file:
<sql-insert>
exec sp_cycle_insert ?,?,?
</sql-insert>
<sql-update>
exec sp_cycle_update ?,?,?,?,?,?
</sql-update>
<sql-delete>
raiserror ('Cycle can not be deleted', 10, 1)
</sql-delete>
So I did the refactoring, etc., and ran my tests... all failed.
The reason is that SQL Server has the view and the stored procedures, whereas every time I run a test I set up the database from scratch with:
// script: false (don't echo the DDL), export: true (run it against the DB), justDrop: false
new SchemaExport(configuration).Execute(false, true, false);
I've thought about possible solutions, and here they are. Is there a way to:
run additional scripts (I guess that is the solution) containing the extra objects the database needs (stored procedures, views, etc.)?
On the other hand, running scripts can fail (currently I use .sdf files, but what if I change to a different provider in the future?). Also, the procedures/views use the WITH construct as well as some SQL Server 2005 functions that may not be supported by the database used during testing.
So I thought it was time to mock repositories. But here too I see obstacles: the views compute some read-only properties, and NHibernate accesses the backing fields using:
access="nosetter.camelcase"
If I switch to mocking the repository, I would be responsible for implementing the view's logic in code. Are there any other solutions? Or am I in big trouble?
run additional scripts (I guess that is the solution) with stuff needed by the database (like stored procedures, views, etc.)
Use IAuxiliaryDatabaseObject object(s) for this. They contain the extra script(s) to run when creating/dropping your schema using SchemaExport. You pass these objects in to your NHibernate Configuration object (AddAuxiliaryDatabaseObject).
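A sketch of what that can look like with NHibernate's SimpleAuxiliaryDatabaseObject (from NHibernate.Mapping; the procedure body below is a placeholder for your real DDL):

var createProc = "CREATE PROCEDURE sp_cycle_insert /* parameters and body go here */ ...";
var dropProc = "DROP PROCEDURE sp_cycle_insert";
configuration.AddAuxiliaryDatabaseObject(new SimpleAuxiliaryDatabaseObject(createProc, dropProc));

// SchemaExport will now emit the extra DDL alongside the mapped tables
new SchemaExport(configuration).Execute(false, true, false);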
So I thought that it's time to mock repositories. But also here I see obstacles: views compute some readonly properties and NHibernate accesses backing fields using
You should probably do both. Do integration tests against your real database to verify that your infrastructure/DAL/whatever-you-call-it layer works. In higher layers you probably want to write unit tests instead, where things like repositories are mocked.
If I understand your question correctly, you have problems setting up your test state because some data is private on your entities? That's not really an "issue" caused by NH/repositories/data access, but a general one. There are different ways to solve it: you can relax your API to make it more testable, have a ctor accepting all data, use reflection one way or the other, let your entity's interface be read-only but its implementation have setters, and so on. It's hard to give a general recommendation, but try to find a way that suits your case.
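As an illustration of the reflection option: since access="nosetter.camelcase" means NHibernate writes a camelCase backing field directly, a test can do the same (the entity and field names here are hypothetical):

// using System.Reflection;
var cycle = new Cycle();
typeof(Cycle)
    .GetField("computedTotal", BindingFlags.Instance | BindingFlags.NonPublic)
    .SetValue(cycle, 42m); // seed the read-only property's backing field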

How to run Integration Testing on DB through repositories with LINQ2SQL?

How do you go about integration-testing your database through your domain layer/model (repositories) that uses LINQ to SQL in the implementation, and leave the DB as you found it? In other words, as in the ideal world of unit testing, the integration test would leave the DB exactly as it found it.
Are there tools out there that will handle this automagically? What are the best practices for performing integration tests on a DB through repositories?
The Spring Framework provides support for integration testing when using NUnit. The NUnit classes are located in the assembly Spring.Testing.NUnit.dll. In there are some classes that perform transaction management. These classes create and roll back a database transaction for each test. You simply write code that can assume the existence of a transaction.
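A minimal sketch of such a fixture (the class name, config location and test body are assumptions; check the Spring.Testing.NUnit docs for the exact base class in your version):

[TestFixture]
public class AccountRepositoryIntegrationTests : AbstractTransactionalDbProviderSpringContextTests
{
    // Spring context that defines the DbProvider, transaction manager and repositories
    protected override string[] ConfigLocations
    {
        get { return new[] { "assembly://MyApp.Tests/MyApp.Tests/TestContext.xml" }; }
    }

    [Test]
    public void SaveAccount_RunsInsideTransaction()
    {
        // anything done here is rolled back automatically after the test
    }
}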
Whether or not this will actually work with LINQ to SQL is another matter. Spring says this works with ORMs. SQL Server 2008 allows you to nest transactions, so in theory you could start a transaction, perform your test through the LINQ to SQL classes, and then roll your transaction back. But I haven't tried it.
Ryan Garaguay has an interesting article about this which uses TransactionScope and NUnit to roll back the database changes (although he is using SqlCommand and SqlConnection objects in his test code, rather than LINQ).
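The TransactionScope pattern itself is easy to sketch; MyDataContext and Customer below are hypothetical LINQ to SQL types, and SubmitChanges enlists in the ambient transaction:

// using System.Transactions;
[TestFixture]
public class RollbackIntegrationTests
{
    private TransactionScope _scope;

    [SetUp]
    public void SetUp()
    {
        _scope = new TransactionScope(); // open the ambient transaction
    }

    [TearDown]
    public void TearDown()
    {
        _scope.Dispose(); // disposed without Complete() => rolled back
    }

    [Test]
    public void InsertedCustomerIsRolledBackAfterTheTest()
    {
        using (var db = new MyDataContext())
        {
            db.Customers.InsertOnSubmit(new Customer { Name = "Test" });
            db.SubmitChanges(); // executes inside the ambient transaction
        }
    }
}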

ASP.NET MVC TDD with LINQ and SQL database

I am trying to start a new MVC project with tests, and I thought the best way to go would be to have two databases: one for testing against, and one for running the app against (really also a test database, as it's not production yet).
For the test database I was thinking of putting the CREATE TABLE scripts and data-fill scripts in the test setup method, and then deleting it all in the tear-down method.
I am going to be using LINQ to SQL, though, and I don't think that will allow me to do this.
Will I have to go the ADO route if I want to do it this way? Or should I just use a mock object and store data in an array or something?
Any tips on best practices?
How did Jeff go about doing this for Stack Overflow?
What I do is define an interface for a DataContext wrapper and use an implementation of the wrapper for the DataContext. This allows me to use an alternate, fake DataContext implementation in my tests (or mock it, if easier). This abstracts the database out of my unit tests completely. I found some starter code at http://andrewtokeley.net/archive/2008/07/06/mocking-linq-to-sql-datacontext.aspx, although I've extended it so that it handles the validation implementations on my entity classes.
I should also mention that I have a separate staging server for QA, so there is live testing of the entire system. I just don't use an actual database in my unit testing.
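A stripped-down version of the wrapper idea (Customer and MyDataContext stand in for your own LINQ to SQL types; needs System.Linq and System.Collections.Generic):

public interface IDataContextWrapper
{
    IQueryable<Customer> Customers { get; }
    void InsertCustomer(Customer customer);
    void SubmitChanges();
}

// production implementation, delegating to the real DataContext
public class LinqDataContextWrapper : IDataContextWrapper
{
    private readonly MyDataContext _db = new MyDataContext();
    public IQueryable<Customer> Customers { get { return _db.Customers; } }
    public void InsertCustomer(Customer customer) { _db.Customers.InsertOnSubmit(customer); }
    public void SubmitChanges() { _db.SubmitChanges(); }
}

// in-memory fake used by unit tests; no database involved
public class FakeDataContextWrapper : IDataContextWrapper
{
    private readonly List<Customer> _customers = new List<Customer>();
    public IQueryable<Customer> Customers { get { return _customers.AsQueryable(); } }
    public void InsertCustomer(Customer customer) { _customers.Add(customer); }
    public void SubmitChanges() { } // nothing to flush in memory
}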
I checked out the link from tvanfosson and RikMigrations, and after playing about with them I prefer the mocked-DataContext method. I realised I don't need to create and drop tables all the time.
After a little more research I found Stephen Walther's article http://stephenwalther.com/blog/archive/2008/08/17/asp-net-mvc-tip-33-unit-test-linq-to-sql.aspx which to me seems easier and more reliable.
So I am going with this implementation.
Thanks for the help.
You may want to find some other way around actually hitting the database for your unit tests because it takes a lot more time. That being said, have you considered using Migrations for creating / deleting your tables instead of using sql scripts? RikMigrations is what I have been using to create my database so I can easily revision all of my code in one place. Justin Etheredge has a great article on using RikMigrations.
Consider these methods on DataContext:
http://msdn.microsoft.com/en-us/library/system.data.linq.datacontext.createdatabase.aspx
http://msdn.microsoft.com/en-us/library/system.data.linq.datacontext.executecommand(v=VS.100).aspx
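Together, those two methods let you build and tear down a scratch database per test run, roughly like this (the connection string and seed SQL are assumptions):

using (var db = new MyDataContext(@"Data Source=.\SQLEXPRESS;Initial Catalog=MyAppTests;Integrated Security=True"))
{
    if (db.DatabaseExists())
        db.DeleteDatabase(); // drop leftovers from a previous run

    db.CreateDatabase(); // generate the schema from the LINQ to SQL mapping
    db.ExecuteCommand("INSERT INTO Customers (Name) VALUES ({0})", "Seed customer");
}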
I agree with much of the above relating to unit testing. However, I think it's important to raise the point that using mock repositories and unit tests doesn't give you the same level of testing as a DB integration test would.
For example, our databases often have cascading deletes built right into the schema. In this case, deleting a primary entity in an aggregate will automatically delete all child entities. However, this would not automatically apply in a mocked repository that was not backed by a physical database with these business rules (unless you built all of those rules into the mock). This is important because if somebody comes along and changes the design of my schema, I need it to break my tests so I can adjust the code/schema accordingly. I appreciate that this is integration testing and not unit testing, but I thought it was worth mentioning.
My preferred option is to create a Master Design Database that contains sample data (the same sort of data you would create in your mocks). At the start of each test run, an automated script creates a backup of the Master DB and restores it to "TestDB" (which all my tests use). That way, I maintain a repository of clean test data in Master that recreates itself upon each test run. My tests can play around with the data and test out all the scenarios needed.
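The backup/restore step is a few lines of ADO.NET against the master database (the database names, logical file names and paths below are assumptions; adjust them to your setup):

// using System.Data.SqlClient;
using (var conn = new SqlConnection(@"Data Source=.\SQLEXPRESS;Initial Catalog=master;Integrated Security=True"))
{
    conn.Open();
    new SqlCommand(
        @"BACKUP DATABASE MasterDesignDB TO DISK = 'C:\Backups\MasterDesign.bak' WITH INIT",
        conn).ExecuteNonQuery();
    new SqlCommand(
        @"RESTORE DATABASE TestDB FROM DISK = 'C:\Backups\MasterDesign.bak'
          WITH REPLACE,
               MOVE 'MasterDesignDB' TO 'C:\Data\TestDB.mdf',
               MOVE 'MasterDesignDB_log' TO 'C:\Data\TestDB_log.ldf'",
        conn).ExecuteNonQuery();
}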
When I debug the application, I have another script that backs up and restores the Master DB to a DEV database. I can play around with data here too without worrying about losing my sample data. I don't typically run this particular script every session because of the delay waiting for the DB to be recreated. I may run it once a day and then play around/debug the app throughout the day. If for example, I delete all the records from a table as part of my debugging, I would run the script to recreate the DevDB when I'm done.
These steps sound like they would add a huge amount of time to the process, but actually - they don't. Our application currently has in the region of 3500 tests, with about 3000 of them accessing the DB at some point. The database backup and restore typically takes around 10-12 seconds at the start of each test run. And since the whole test suite is only executed upon TFS checkin, we don't mind if we have to wait a while longer anyway. On an average day, our entire test suite takes about 15-20 minutes to run.
I appreciate and accept that integration testing is much slower than unit testing (because of the inherent need to use a real DB), but it more closely represents the 'real world' app. For example, mock repositories don't return DB error codes, they don't time out, they don't lock up, they don't run out of disk space, etc.
Unit tests are ok for simple calculations, basic business rules, etc. and certainly they are absolutely the best choice for most operations that don't involve DB (or other resource) access. But I don't think they are as valuable as integration tests - people talk a lot about unit tests, but little is said about integration tests.
I expect those passionate about unit tests will be sending flames my way for this. That's fine - I'm just trying to bring some balance and to remind people that projects that are full of passed unit tests can still fail badly the moment you implement them in the field.
This article gives an example of mocking LINQ to SQL with Typemock:
http://blog.benhall.me.uk/2007/11/how-to-unit-test-linq-to-sql-and.html