I'm trying to set up an integration test class that wraps each test in a transaction. That way I can roll back the transaction after each test instead of resetting the DB before each test.
I also want to be able to use transactions in the integration tests themselves.
I am using NHibernate and the Rhino Commons UnitOfWork for the project. I am using MSTest for the tests.
I want something like this:
[TestInitialize]
public void TestInit() {
    // create outer transaction scope
    UnitOfWork.Start();
    UnitOfWork.Current.BeginTransaction();
}

[TestCleanup]
public void TestCleanup() {
    // rollback outer transaction
    UnitOfWork.Current.Dispose();
}

[TestMethod]
public void IntegrationTest() {
    using (UnitOfWork.Start(UnitOfWorkNestingOptions.CreateNewOrNestUnitOfWork)) {
        UnitOfWork.Current.BeginTransaction();
        // integration test code
        UnitOfWork.Current.TransactionalFlush();
        // possibly more transactions
    }
}
This is the first time I have used NHibernate, Rhino Commons, or MSTest, and I am not clear on how sessions behave with nested Rhino Commons UnitOfWorks. What I have here does not roll back the changes from the integration test.
I tried using TransactionScope from System.Transactions, but I get the following error when the UnitOfWorks end:
System.InvalidOperationException: Disconnect cannot be called while a transaction is in progress..
So here are my questions:
Is there a way to get this behavior with UnitOfWork in Rhino Commons? If not, should I just reset the database before each test or is there another way to nest transactions that plays nicely with the UnitOfWork?
Thank you.
I believe UnitOfWork.Start().BeginTransaction() returns a RhinoTransaction, so to make the rollback explicit you can try rewriting the code like this:
using (IUnitOfWork uow = UnitOfWork.Start(UnitOfWorkNestingOptions.CreateNewOrNestUnitOfWork))
{
    RhinoTransaction tx = uow.BeginTransaction();

    // ... test code ...

    tx.Rollback();
}
Be warned though, I have not tried the code above, let me know if it works.
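If that works, the same idea can be applied to the outer unit of work in the test fixture: keep a reference to the transaction started in TestInitialize and roll it back explicitly in TestCleanup, rather than relying on Dispose() alone. A rough sketch, also untested, and assuming BeginTransaction() really does hand back a RhinoTransaction you can hold onto:

private IUnitOfWork _outerUnitOfWork;
private RhinoTransaction _outerTransaction;

[TestInitialize]
public void TestInit() {
    // outer scope for the whole test
    _outerUnitOfWork = UnitOfWork.Start();
    _outerTransaction = _outerUnitOfWork.BeginTransaction();
}

[TestCleanup]
public void TestCleanup() {
    // roll back everything the test did, then dispose the outer scope
    _outerTransaction.Rollback();
    _outerUnitOfWork.Dispose();
}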
I'm trying to implement unit tests in my company's project, and I'm running into some weird trouble trying to use a separate set of data in my database.
As I want tests to be performed in a confined environment, I'm looking for the easiest way to load data into a dedicated database. Long story short, to that end, I decided to use a MySQL dump of the inserted data.
This is basically my seeder code:
public function run()
{
    \Illuminate\Support\Facades\DB::unprepared(file_get_contents(__DIR__ . '/data1.sql'));
}
Now here's the problem.
In my unit test, I can call the seeder, but:
If I call the seeder in setUpBeforeClass(), it works. It doesn't fit my needs, though, as I want to be able to invoke different sets of data for different tests.
If I call the seeder within a test, the data is never inserted in the database (either with or without the transaction trait).
If I use DB::insert instead of ::raw, ::unprepared, or ::statement, without using a raw SQL file, it works. But my inserts are too complicated for that.
Here are a few things I tried, with the same results:
DB::raw(file_get_contents(__DIR__.'/database/data1.sql'));
DB::statement(file_get_contents(__DIR__ . '/database/data1.sql'));
$seeder = new CheckTestSeeder();
$seeder->run();
\Illuminate\Support\Facades\Artisan::call('db:seed', ['--class' => 'CheckTestSeeder']);
$this->seeInDatabase('jackpot.progressive', [
    'name_progressive' => 'aaa'
]);
Any pointers on how to proceed, and on why I get different behavior in setUpBeforeClass() versus within the test, would be appreciated!
You may use the Illuminate\Foundation\Testing\RefreshDatabase trait as explained here. If you need something more, you can override the refreshTestDatabase method of the RefreshDatabase trait.
protected function refreshTestDatabase()
{
    parent::refreshTestDatabase();

    \Illuminate\Support\Facades\Artisan::call('db:seed', ['--class' => 'CheckTestSeeder']);
}
Is there a convenient way to force Grails / Hibernate to recreate the database schema from an integration test?
If you add the following to DataSource.groovy, an empty database will be created before the integration tests are run:
environments {
    test {
        dataSource {
            dbCreate = "create"
        }
    }
}
By default each integration test executes within a transaction that is rolled back at the end of the test, so unless you've disabled this default behaviour there shouldn't be any need to programmatically recreate the database.
Update
Based on your comment, it seems you really do want to recreate the schema before some integration tests. In that case, the only way I can think of is to:
drop and recreate the schema
use grails schema-export to import a fresh schema
class MyIntegrationTest {

    SessionFactory sessionFactory

    /**
     * Helper for executing SQL statements
     * @param jdbcWork A closure that is passed an <tt>Sql</tt> object that is used to execute the JDBC statements
     */
    private doJdbcWork(Closure jdbcWork) {
        sessionFactory.currentSession.doWork(
            new Work() {
                @Override
                void execute(Connection connection) throws SQLException {
                    // do not close this Sql instance ourselves
                    Sql sql = new Sql(connection)
                    jdbcWork(sql)
                }
            }
        )
    }

    private recreateSchema() {
        doJdbcWork { Sql sql ->
            // use the sql object to drop the database and create a new blank database
            // something like the following might work for MySQL
            sql.execute("drop database my-schema")
            sql.execute("create database my-schema")
        }

        // generate the DDL and import it
        // there must be a better way to execute a grails command from within an
        // integration test, but unfortunately I don't know what it is
        'grails test schema-export export'.execute()
    }

    @Test
    void myTestMethod() {
        recreateSchema()
        // now do the test
    }
}
First and foremost, this code is completely untested, so treat it with deep suspicion and low expectations. Secondly, you may need to change the default transactional behaviour of integration tests (with @Transactional) in order for this to work.
This seems to work fine, but it's obviously very tightly coupled to H2, so it would have been nice if the Hibernate plugin had exposed an API to take care of this.
http://h2database.com/html/grammar.html#script
class SomethingTestingTransactionsSpec extends IntegrationSpec {
    static transactional = false // Why I need this

    SessionFactory sessionFactory // Injected by Spring
    DataSource dataSource // Also injected

    File schemaDump
    Sql sql

    void setup() {
        sql = new Sql(dataSource)
        schemaDump = File.createTempFile("test-database-dump", ".sql") // Java 7 API
        sql.execute("script drop to ${schemaDump.absolutePath}")
    }

    void cleanup() {
        sql.execute("runscript from ${schemaDump.absolutePath}")
        sessionFactory.currentSession.clear()
        schemaDump.delete()
    }

    // Spock tests ...
}
It should be trivial to extract this code into a bean registered only for test environments. That should clean up the test code a bit and improve efficiency by only having to dump the schema once.
Well, you can execute arbitrary SQL via the sessionFactory, so you could run a grails schema-export at the beginning of your tests and then just re-import the schema into your DB when needed.
Alternatively, I wonder if calling the database migration plugin externally would accomplish the same thing.
Or you can trick grails into thinking your domain class has changed and force a reload via https://github.com/grails/grails-core/blob/v2.1.1/grails-hibernate/src/main/groovy/org/codehaus/groovy/grails/plugins/orm/hibernate/HibernatePluginSupport.groovy#L340 (don't ask me how).
I have a system which, after getting a message, enqueues it (writes it to a table), and another process polls the DB and dequeues it for processing. In my automatic tests I've merged the operations in the same process, but cannot (conceptually) merge the NH sessions from the two operations.
Naturally - problems arise.
I've read everything I could about getting the SQLite-InMemory-NHibernate combination to work in the testing world, but I've now run into randomly failing tests, due to "no such table" errors. To make it clear: "random" means that the same test with the same exact configuration and code will sometimes fail.
I have the following SQLite configuration:
return SQLiteConfiguration
    .Standard
    .ConnectionString(x => x.Is("Data Source=:memory:; Version=3; New=True; Pooling=True; Max Pool Size=1;"))
    .Raw(NHibernate.Cfg.Environment.ReleaseConnections, "on_close");
At the beginning of my test (every test) I fetch the "static" session provider, and kindly ask it to flush the existing DB clean, and recreate the schema:
public void PurgeDatabaseOrCreateNew()
{
    using (var session = GetNewSession())
    using (var tx = session.BeginTransaction())
    {
        PurgeDatabaseOrCreateNew(session);
        tx.Commit();
    }
}

private void PurgeDatabaseOrCreateNew(ISession session)
{
    // http://ayende.com/Blog/archive/2009/04/28/nhibernate-unit-testing.aspx
    new SchemaExport(_Configuration)
        .Execute(false, true, false, session.Connection, null);
}
So yes, it's on a different session, but the connection is pooled on SQLite, so the next session I create will see the generated schema. Yet, while it works most of the time, sometimes the later "enqueue" operation will fail because it cannot see the table for my incoming messages.
Also, that seems to happen at most once or twice per test suite run; not all the tests fail, just the first one (and sometimes another one; I'm not quite sure if it's the second or not).
The worst part is the randomness, naturally. I've told myself I've fixed this several times now, just because it simply "stopped failing". At random.
This happens on FW 4.0, the System.Data.SQLite x86 version, Win7 64-bit and 2008R2 (three different machines in total), NH 2.1.2 configured with FNH, on TestDriven.NET 32-bit processes and NUnit console 32-bit processes.
Help?
Hi, I'm pretty sure I have the exact same problem as you. I open and close multiple sessions per integration test. After digging through the SQLite connection pooling code and some experimenting of my own, I've come to the following conclusion:
The SQLite pooling code caches the connection using WeakReferences, which isn't the best option for caching, since the reference to the connection(s) will be cleared as soon as there is no normal (strong) reference to the connection and the GC runs. Since you can't predict when the GC runs, this explains the "randomness". Try adding a GC.Collect(); between closing one session and opening another: your test will always fail.
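To illustrate, here is a hypothetical repro sketch (not tested against your setup) built around the GetNewSession() helper from the question; forcing a collection between two sessions should reliably trigger the failure:

using (var session = GetNewSession())
using (var tx = session.BeginTransaction())
{
    // the schema exists here, so the "enqueue" insert works
    tx.Commit();
}

// nothing holds a strong reference to the pooled connection any more,
// so a forced collection clears the WeakReference inside the SQLite pool
GC.Collect();
GC.WaitForPendingFinalizers();

using (var session = GetNewSession())
{
    // this session gets a brand new :memory: database with no schema,
    // hence the "no such table" error
}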
My solution was to cache the connection myself between opening sessions, like this:
public class BaseIntegrationTest
{
    private static ISessionFactory _sessionFactory;
    private static Configuration _configuration;
    private static SchemaExport _schemaExport;

    // I cache the whole session because I don't want it and the
    // underlying connection to get closed.
    // The "Connection" property of the ISession is what we actually want.
    // Using the NHibernate SQLite Driver to get the connection would probably
    // work too.
    private static ISession _keepConnectionAlive;

    static BaseIntegrationTest()
    {
        _configuration = new Configuration();
        _configuration.Configure();
        _configuration.AddAssembly(typeof(Product).Assembly);
        _sessionFactory = _configuration.BuildSessionFactory();
        _schemaExport = new SchemaExport(_configuration);
        _keepConnectionAlive = _sessionFactory.OpenSession();
    }

    [SetUp]
    protected void RecreateDB()
    {
        _schemaExport.Execute(false, true, false, _keepConnectionAlive.Connection, null);
    }

    protected ISession OpenSession()
    {
        return _sessionFactory.OpenSession(_keepConnectionAlive.Connection);
    }
}
Each of my integration tests inherits from this class and calls OpenSession() to get a session. RecreateDB is called by NUnit before each test because of the [SetUp] attribute.
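A typical test against this base class would then look something like the sketch below; Product is the entity already referenced in the configuration above, but its Name property is made up for illustration:

[TestFixture]
public class ProductPersistenceTests : BaseIntegrationTest
{
    [Test]
    public void Can_save_and_reload_a_product()
    {
        object id;
        using (var session = OpenSession())
        using (var tx = session.BeginTransaction())
        {
            // "Name" is a hypothetical property on Product
            id = session.Save(new Product { Name = "Widget" });
            tx.Commit();
        }

        using (var session = OpenSession())
        {
            // a second session on the same kept-alive connection still sees the row
            Assert.IsNotNull(session.Get<Product>(id));
        }
    }
}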
I hope this helps you or anyone else who gets this error.
The only thing that comes to mind is that you are randomly leaving a session open after the test. You must make sure any existing ISession is closed before you open another one. If you are not using a using() statement or calling Dispose() manually, the session might still be alive somewhere, causing those random exceptions.
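In other words, keep every session (and its transaction) inside a using block so it is disposed even when a test fails partway through; something along these lines, reusing the GetNewSession() helper from the question:

using (var session = GetNewSession())
using (var tx = session.BeginTransaction())
{
    // ... test code ...
    tx.Commit();
} // the session is disposed here, even if the test throws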
I have a suite of integration tests that run inside transactions.
Sometimes it seems that NHibernate transactions are not being correctly rolled back. I can't work out what causes this.
Here is a slightly simplified overview of the base class that these integration test fixtures run with:
public class IntegrationTestFixture
{
    private TransactionScope _transactionScope;
    private ConnectionScope _connectionScope;

    [TestFixtureSetUp]
    public virtual void TestFixtureSetUp()
    {
        var session = NHibernateSessionManager.SessionFactory.OpenSession();
        CallSessionContext.Bind(session);
        _connectionScope = new ConnectionScope();
        _transactionScope = new TransactionScope();
    }

    [TestFixtureTearDown]
    public virtual void TestFixtureTearDown()
    {
        _transactionScope.Dispose();
        _connectionScope.Dispose();
        var session = CurrentSessionContext.Unbind(SessionFactory);
        session.Close();
        session.Dispose();
    }
}
A call to the TransactionScope's commit method is never made, so how is it possible that data still ends up in the database?
Update: I never really got my head around the way NHibernate treats transactions, but I found that calling Session.Flush() within a transaction would sometimes result in the data remaining in the database, even if the transaction was then rolled back. I am not sure why you can't call Flush and then roll back. This is a pity, because during integration testing you want to be able to hit the database, and Flush is sometimes the only way to do this.
I had an issue with the Identity generator that sounds similar. In a nutshell, when saving an object that uses an identity generator, NHibernate has to write to the database in order to get the identity value, and it can do this outside the transaction.
Ayende has a blog post about this.
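If the identity generator turns out to be the culprit, switching to an id strategy that is generated client-side lets NHibernate defer the INSERT until the transaction actually commits. A hedged sketch using Fluent NHibernate's guid.comb support; the Message entity and its properties are made up for illustration:

public class MessageMap : ClassMap<Message>
{
    public MessageMap()
    {
        // guid.comb ids are generated in memory, so saving the entity
        // does not force an immediate INSERT just to obtain the key
        Id(x => x.Id).GeneratedBy.GuidComb();
        Map(x => x.Body);
    }
}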
I've used Moq to mock my repositories. However, someone recently said that they prefer to create hard-coded test implementations of their repository interfaces.
What are the pros and cons of each approach?
Edit: clarified meaning of repository with link to Fowler.
I generally see two scenarios with repositories: I ask for something and I get it, or I ask for something and it isn't there.
If you are mocking your repository, that means your system under test (SUT) is something that uses your repository. So you generally want to test that your SUT behaves correctly when it is given an object from the repository. You also want to test that it handles the situation properly when you expect to get something back and don't, or when you aren't sure whether you'll get something back.
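For example, with Moq both scenarios stay close to the test that needs them. A rough sketch; IUserRepository, UserService, and the method names are made-up stand-ins for whatever your SUT actually depends on:

[Test]
public void Returns_null_profile_when_the_user_is_missing()
{
    // "ask for something, and it isn't there"
    var repository = new Mock<IUserRepository>();
    repository.Setup(r => r.GetUser(42)).Returns((User)null);

    var service = new UserService(repository.Object);

    Assert.IsNull(service.GetProfile(42));
    repository.Verify(r => r.GetUser(42), Times.Once());
}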
Hard-coded test doubles are OK if you are doing integration testing. Say you want to save an object and then get it back. But this is testing the interaction of two objects together, not just the behavior of the SUT; they are two different things. If you start coding fake repositories, you need unit tests for those as well; otherwise you end up basing the success and failure of your code on untested code.
That's my opinion on Mocking vs. Test Doubles.
SCNR:
"You call yourself a repository? I've seen matchboxes with more capacity!"
I assume that by "repository" you mean a DAO; if not then this answer won't apply.
Lately I've been making "in memory" "mock" (or test) implementations of my DAOs that basically operate on data (a List, Map, etc.) passed into the mock's constructor. This way the unit test class is free to throw in whatever data it needs for the test, change it, etc., without forcing all unit tests operating on the "in memory" DAO to be coded to use the same test data.
One plus that I see in this approach is that if I have a dozen unit tests that need to use the same DAO for their test (to inject into the class under test, for example), I don't need to remember all of the details of the test data each time (as you would if the "mock" was hardcoded): the unit test creates the test data itself. On the downside, this means each unit test has to spend a few lines creating and wiring up its test data; but that's a small downside to me.
A code example:
public interface UserDao {
    User getUser(int userid);
    User getUser(String login);
}

public class InMemoryUserDao implements UserDao {
    private List users;

    public InMemoryUserDao(List users) {
        this.users = users;
    }

    public User getUser(int userid) {
        for (Iterator it = users.iterator(); it.hasNext();) {
            User user = (User) it.next();
            if (userid == user.getId()) {
                return user;
            }
        }
        return null;
    }

    public User getUser(String login) {
        for (Iterator it = users.iterator(); it.hasNext();) {
            User user = (User) it.next();
            if (login.equals(user.getLogin())) {
                return user;
            }
        }
        return null;
    }
}