I have been using Spring.NET and NHibernate for some years and I am very satisfied. However, I have always been playing around with multithreading, Reactive Extensions and eventually the Task Parallel Library, which is a great framework. Unfortunately, all kinds of multithreading approaches fail because NHibernate's session is not thread-safe.
I am asking how I can benefit from parallel programming while still using NHibernate.
For instance: I have a CustomerRegistrationService class whose Register method performs several tasks:
ICustomer customer = this.CreateCustomerAndAddresses(parameters);
this.CreateMembership(customer);
this.CreateGeoLookups(customer.Address);
this.SendWelcomeMail(customer);
The last two methods would be ideal candidates to run in parallel: CreateGeoLookups calls some web services to determine the geo locations of the customer's address and creates some new entities, as well as updating the customer itself. SendWelcomeMail does what it says.
Because CreateGeoLookups uses NHibernate (although through repository objects, so NHibernate is actually hidden behind interfaces/dependency injection), it won't work with Task.Factory.StartNew(...) or other threading mechanisms.
My question is not about solving this specific issue I have described; rather, I would like to hear from you about NHibernate, Spring.NET and parallel approaches in general.
Thank you very much
Max
In NH it's the ISession that isn't thread-safe, but the ISessionFactory is entirely thread-safe, and it easily supports what it seems you are after. If you have designed your session-lifecycle management (and the repositories that depend upon it) such that you assume one single consistent ISession across calls, then, yes, you will have this kind of trouble. But if you have designed your session-handling pattern to assume only a single ISessionFactory, without making assumptions about ISession, then there is nothing inherently preventing you from interacting with NH in parallel.
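As a rough illustration of what that can look like with the TPL (a minimal sketch; the sessionFactory field, the Customer entity and customerId are placeholders, not your actual code), each parallel task opens and disposes its own ISession from the shared ISessionFactory:

// Minimal sketch: one shared ISessionFactory, one ISession per task.
// 'sessionFactory', 'Customer' and 'customerId' are placeholders.
var lookupTask = Task.Factory.StartNew(() =>
{
    using (ISession session = sessionFactory.OpenSession())
    using (ITransaction tx = session.BeginTransaction())
    {
        var customer = session.Get<Customer>(customerId);
        // ... create geo lookups for customer.Address ...
        tx.Commit();
    }
});

var mailTask = Task.Factory.StartNew(() =>
{
    using (ISession session = sessionFactory.OpenSession())
    {
        var customer = session.Get<Customer>(customerId);
        // ... send the welcome mail (read-only work) ...
    }
});

Task.WaitAll(lookupTask, mailTask);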
Although you don't specifically mention your use case as being for the web, it's important to note that in web-centric use cases (e.g., what is a pretty common case for Spring.NET users as well as for many other NH-managing frameworks), the often-used 'Session-Per-Request' pattern of ISession management (often referred to in Spring.NET as 'Open Session In View' or just 'OSIV') will NOT work, and you will need to switch to a different duration for your ISession lifecycle. This is because (as the name suggests) the session-per-request/OSIV pattern makes the (now incorrect, in your case) assumption that there is only a single ISession instance for the duration of each HttpRequest (and presumably you would want to spawn these parallel NH calls all within the context of a single HttpRequest in the web use case).
Obviously, in the non-web case, where there's rarely a similar concept to session-per-request, you are less likely to run into this issue, since session-lifecycle management is rarely as fine-grained/short-lived as it is in web-based applications.
Hope this helps.
-Steve B.
This is a difficult thing you are asking for. The DTC has to be handled with care.
The only solution I know of is the use of reliable, transactional messaging (e.g. MSMQ + NServiceBus/MassTransit).
Such a design enables you to do this. It would look like this:
var customerUid = CreateCustomers();
Bus.Publish(new CustomerCreatedEvent() { CustomerUid = customerUid });
Then you could use two event handlers (Reactors) that handle the event and send an EMail or create the lookups.
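For illustration only, such a reactor might look roughly like this (sketched against an NServiceBus-style IHandleMessages<T> interface; the exact interface and signature depend on the bus and version you use, and the handler name is just made up from the event above):

// Sketch of a reactor that reacts to the published event.
// IHandleMessages<T> is the NServiceBus-style handler interface; the exact
// signature differs between buses/versions, so treat this as pseudo-wiring.
public class CreateGeoLookupsHandler : IHandleMessages<CustomerCreatedEvent>
{
    public void Handle(CustomerCreatedEvent message)
    {
        // Runs in its own handling transaction, only after the
        // customer-creation transaction has committed and the event was sent.
        // Load the customer by message.CustomerUid and create the geo lookups here.
    }
}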
This won't allow you to share the transaction either, but it will ensure that the Reactors run (each in a new transaction) once the creation of the customer has succeeded.
Also this has nothing to do with the TPL.
Well, thank you for answering. I know that 'it's the ISession that isn't thread-safe but the ISessionFactory is entirely thread-safe'. My problem in the code above, for example, is that the whole operation is wrapped in one transaction. So this.CreateCustomerAndAddresses(parameters) on main thread #1 will use, for instance, ISession #1 with transaction #1. Calling the other three methods in parallel will create three more threads and three more sessions and transactions, which leads to database timeouts in my case. My assumption is that transaction #1 is not successfully committed because it waits for the three concurrent tasks to complete, while those three tasks try to read from the database while that transaction is still active, leading to deadlocks/timeouts.
So is there some way to tell the other threads/sessions not to create a new transaction but to use the main transaction #1 instead?
I am using the TxScopeTransactionManager from Spring.NET, which utilises DTC (System.Transactions). I have googled and found that System.Transactions.DependentTransaction might work, but I do not have a clue how to integrate it into my Spring.NET-managed transaction scenario.
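I can't speak to wiring this into TxScopeTransactionManager, but in plain System.Transactions the DependentTransaction idea looks roughly like this (a sketch only; sessions opened inside the scope should enlist in the same ambient transaction):

// Sketch: clone the ambient transaction so it cannot commit
// before the parallel task has finished its part of the work.
DependentTransaction dependent =
    Transaction.Current.DependentClone(DependentCloneOption.BlockCommitUntilComplete);

var task = Task.Factory.StartNew(() =>
{
    try
    {
        using (var scope = new TransactionScope(dependent))
        {
            // Work that should take part in the main transaction #1 goes here.
            scope.Complete();
        }
    }
    finally
    {
        // Signal the parent transaction that this dependent clone is done.
        dependent.Complete();
        dependent.Dispose();
    }
});

Whether this actually removes the deadlocks also depends on what the parallel sessions read and at which isolation level, so treat it as a starting point rather than a finished solution.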
Thanks
I'm new to NHibernate.
I have an application with lazy loading.
I want to write a method
public User GetUser(int id)
in my UserPersister class.
Later in the application I want to use some referenced properties like User.Role or User.Address.
That won't work if I close the Session that I used to retrieve the user.
My first idea was to create a singleton Session, so I would always be able to get all the data.
I have read some articles saying that this is a bad idea because of performance and memory leaks.
Is that true? What is the solution to this problem?
Regards
Martin
Have a look at Effectus for a simple approach to WPF + NHibernate.
First of all, remember that an NH session != a SqlConnection. Having a global (singleton) session is generally not a good idea, even in WPF, because sooner or later you can end up in a multithreaded scenario. That said, I definitely wouldn't argue against it on performance grounds; that concern is mostly nonsense in your case.
I would suggest you open the session for the shortest possible time needed to complete the use case. Do some analysis where you identify the use cases of your app as "sessions" with a limited lifespan, for example a shopping cart: you start and you finish. You can keep the session alive until you finish such a use case, then throw it away...
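A minimal sketch of that idea for your GetUser scenario (the property names and the sessionFactory field are just illustrative):

// Sketch: the session lives for the whole use case, not per call,
// so lazy-loaded references (User.Role, User.Address) still resolve.
using (ISession session = sessionFactory.OpenSession())
using (ITransaction tx = session.BeginTransaction())
{
    User user = session.Get<User>(id);
    var roleName = user.Role.Name;     // lazy load works while the session is open
    var city = user.Address.City;      // illustrative property names
    tx.Commit();
}   // throw the session away once the use case is finished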
We are taking a long, hard look at our (Java) web application patterns. In the past, we've suffered from an overly anaemic object model and overly procedural separation between controllers, services and DAOs, with simple value objects (basically just bags of data) travelling between them. We've used declarative (XML) managed ORM (Hibernate) for persistence. All entity management has taken place in DAOs.
In trying to move to a richer domain model, we find ourselves struggling with how best to design the persistence layer. I've spent a lot of time reading and thinking about Domain Driven Design patterns. However, I'd like some advice.
First, the things I'm more confident about:
We'll have "thin" controllers at the front that deal only with HTTP and HTML - processing forms, validation, UI logic.
We'll have a layer of stateless business logic services that implements common algorithms or logic, unaware of the UI, but very much aware of (and delegating to) the domain model.
We'll have a richer domain model which contains state, relationships, and logic inherent to the objects in that domain model.
The question comes around persistence. Previously, our services would be injected (via Spring) with DAOs, and would use DAO methods like find() and save() to perform persistence. However, a richer domain model would seem to imply that objects should know how to save and delete themselves, and perhaps that higher level services should know how to locate (query for) domain objects.
Here, a few questions and uncertainties arise:
Do we want to inject DAOs into domain objects, so that they can do "this.someDao.save(this)" in a save() method? This is a little awkward since domain objects are not singletons, so we'll need factories or post-construction setting of DAOs. When loading entities from a database, this gets messy. I know Spring AOP can be used for this, but I couldn't get it to work (using Play! framework, another line of experimentation) and it seems quite messy and magical.
Do we instead keep DAOs (repositories?) completely separate, on par with stateless business logic services? This can make some sense, but it means that if "save" or "delete" are inherent operations of a domain object, the domain object can't express those.
Do we just dispense with DAOs entirely and use JPA to let entities manage themselves?
Herein lies the next subtlety: It's quite convenient to map entities using JPA. The Play! framework gives us a nice entity base class, too, with operations like save() and delete(). However, this means that our domain model entities are quite closely tied to the database structure, and we are passing objects around with a large amount of persistence logic, perhaps all the way up to the view layer. If nothing else, this will make the domain model less re-usable in other contexts.
If we want to avoid this, then we'd need some kind of mapping DAO - either using simple JDBC (or at least Spring's JdbcTemplate), or using a parallel hierarchy of database entities and "business" entities, with DAOs forever copying information from one hierarchy to another.
What is the appropriate design choice here?
Martin
Your questions and doubts ring an interesting alarm here; I think you went a bit too far in your interpretation of a "rich domain model". Richness doesn't go as far as implying that persistence logic must be handled by the domain objects. In other words, no, they shouldn't know how to save and delete themselves (at least not explicitly, though Hibernate actually adds some persistence logic transparently). This is often referred to as persistence ignorance.
I suggest that you keep the existing DAO injection system (a nice thing to have for unit testing) and leave the persistence layer as is, while trying to move some business logic into your entities where it fits. A good starting point is to identify Aggregates and establish your Aggregate Roots. They'll often contain more business logic than the other entities.
However, this is not to say domain objects should contain all logic (especially not logic needed by many other objects across the application, which often belongs in Services).
I am not a Java expert, but I use NHibernate in my .NET code so my experience should be directly translatable to the Java world.
When using an ORM (like Hibernate, which you mentioned) to build a Domain-Driven Design application, one of the good (I won't say best) practices is to create so-called application services between the UI and the Domain. They are similar to the stateless business logic services you mentioned, but should contain almost no logic. They should look like this:
public void sayHello(int id, String helloString)
{
    SomeDomainObject target = domainObjectRepository.findById(id); // This uses Hibernate to load the object.
    target.sayHello(helloString); // There is a single domain object method invocation per application service method.
    domainObjectRepository.save(target); // This one is optional. Hibernate should already know that this object needs saving because it tracks changes.
}
Any changes to objects contained by the DomainObject (including adding objects to collections) will be handled by Hibernate.
You will also need some kind of AOP to intercept application service method invocations, open Hibernate's session before the method executes, and save the changes after the method finishes without exceptions.
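Whatever you use to intercept the calls (declarative transactions, plain AOP, or a simple decorator), the shape of what happens around each application service method is roughly this; since I work in .NET I'll sketch it with NHibernate, and sessionFactory is assumed to be injected:

// Sketch of the around-advice: open a session and transaction, run the
// application service call, commit (which flushes tracked changes),
// and roll back automatically if the call throws.
public T ExecuteInSession<T>(Func<ISession, T> serviceCall)
{
    // If serviceCall throws, Commit is never reached and disposing the
    // transaction rolls it back.
    using (ISession session = sessionFactory.OpenSession())
    using (ITransaction tx = session.BeginTransaction())
    {
        T result = serviceCall(session);
        tx.Commit();   // flushes the session, so tracked changes are saved
        return result;
    }
}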
There is a really good sample of how to do DDD in Java here. It is based on the sample problem from Eric Evans' 'Blue Book'. The sample code for the application logic class is here.
I'm trying to get to grips with NHibernate, Fluent NHibernate and Spring.
Following domain-driven design principles, I'm writing a standard tiered web application composed of:
a presentation tier (ASP.Net)
a business tier, comprising:
an application tier (basically a set of methods made visible to UI tier)
repository interfaces and domain components (used by the application tier)
A persistence tier (basically the implementation of the repository interfaces defined in the business tier)
I would like help determining a way of instantiating an NHibernate ISession in such a way that it can be shared by multiple repositories over the lifetime of a single request to the business tier. Specifically, I would like to:
allow the ISession instance and any transaction to be controlled outwith the repository implementation (perhaps by some aspect of the IOC framework, an interceptor?)
allow the ISession instance to be available to the repositories in a test-friendly manner (perhaps via injection or through some shared 'context' abstraction)
avoid any unnecessary transactions being created (i.e. when only read-only operations have been executed)
allow me to write tests that use SQLite
allow me to use Fluent NHibernate
allow the repository implementation to remain ignorant of the host environment. I don't yet know if the business tier will run in-process with the presentation tier or will be hosted separately under WCF (in IIS), so I don't want to bind my code too closely to an HTTP context (for example).
My first attempt at solving this problem was to use the Registry pattern, storing the ISession instance in a ThreadStatic property. However, subsequent reading has suggested that isn't the best solution (as ASP.NET can switch threads within the page lifecycle, I believe).
Any thoughts, part solutions, pattern names, pointers to up-to-date samples (NHibernate 2) will be most gratefully received.
I have not used Spring.NET so I can't comment on that. However, the rest sounds remarkably (or perhaps not so remarkably; we're hardly the first to implement these things ;) similar to my own experience. I too had trouble finding a One True Best Practice so I just read as much as I could and came up with my own interpretation.
In my situation I wanted transaction/session management to be external to the repository, as well as to keep repository concerns from bubbling up out of it (i.e. the code using the repository should not need to know that it's using NHibernate internally and shouldn't need to know anything about NHibernate session management). In my case it was decided that transactions would be created by default lest developers forget them, so I had to have a read-only escape mechanism. I went with the Unit of Work pattern, with the NHibernate ISession instance stored inside. Calling code (I also created a DSL interface for the UoW) might look something like:
using (var uow = UoW.Start().ReadOnly().WithHttpContext()
                    .InNewScope().WithScopeContext(ScopeContextProvider.For<CRMModel>()))
{
    // Repository access
}
In practice, that could be as short as UoW.Start(), depending on how much context is already available. The HttpContext part refers to the storage location for the UoW, which is, unsurprisingly, the HttpContext in this case. As you mentioned, for an ASP.NET application the HttpContext is the safest place to store things. ScopeContextProvider basically makes sure the right data context is provided for the UoW (the ISession instance for the appropriate database/server, other settings). The "ScopeContext" concept also makes it easy to insert a "test" scope context.
Going this route makes the repositories explicitly dependent on the UoW interface. Actually, you might be able to abstract it somewhat, but I'm not sure I see the benefit. What I mean is, each repository method retrieves the current UoW instance and then pulls out the ISession object (or simply a SqlConnection for those methods that don't use NHibernate) to run the NHibernate query/operation. This works for me, though, because it also seems like the ideal time to make sure that the current UoW is not read-only for methods that might need to run CRUD.
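To make the shape of that more concrete, here's a stripped-down sketch of the idea (all of the names are illustrative; my real implementation also handles the read-only flag, scope contexts and nesting):

// Stripped-down Unit of Work sketch: the ISession lives inside the UoW,
// and the UoW itself is stored in HttpContext.Current.Items (swap this
// for another storage context in non-web or test scenarios).
public class UoW : IDisposable
{
    private const string Key = "Current.UoW";

    public ISession Session { get; private set; }

    private UoW(ISession session)
    {
        Session = session;
        HttpContext.Current.Items[Key] = this;
    }

    public static UoW Start(ISessionFactory sessionFactory)
    {
        return new UoW(sessionFactory.OpenSession());
    }

    public static UoW Current
    {
        get { return (UoW)HttpContext.Current.Items[Key]; }
    }

    public void Dispose()
    {
        HttpContext.Current.Items.Remove(Key);
        Session.Dispose();
    }
}

A repository method then grabs UoW.Current.Session and runs its query against it, without knowing where the UoW was started or stored.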
Overall, I think this is one approach that solves all your points:
Allows session management to be external to the repository
ISession context can be mocked or pointed at a context provider for a test environment
Avoids unnecessary transactions (well, you'd have to invert what I did and have a .Transactional() call or something)
I can't see why you couldn't test with SQLite since that's more of an NHibernate concern
I use Fluent NHibernate myself
Allows the repository to be ignorant of the host environment (that is, the repository caller controls the UoW storage context)
As for the UoW implementation, I'm partially kicking myself for not looking around more before I started. There's a project called machine.uow which I understand is fairly popular and works well with NHibernate. I haven't played with it much so I can't say if it solves all my requirements as neatly as the one I wrote myself, but it might have saved development time as well.
Perhaps we'll get some comments as to where I went wrong or how to improve things, but I hope this is at least helpful in some way.
For reference, the software stack I'm using is:
ASP.NET MVC
Fluent NHibernate on top of NHibernate
Ninject for dependency injection
What you are describing is supported by the Spring.NET framework almost out of the box. Only for Fluent NHibernate do you need to add a custom SessionFactory to Spring.NET (not a lot of code; see: Using Fluent NHibernate in Spring.NET).
Every repository can use the same ISession: just inject the SessionFactory into your repositories and use Spring.NET's transaction services.
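As a rough sketch of the wiring (the entity, repository and service names are made up, and this assumes the session context is configured so that GetCurrentSession() returns the session opened by Spring.NET's transaction support):

// Sketch: the repository gets the ISessionFactory injected and asks it for
// the current, Spring-managed session; the service method is made
// transactional declaratively with Spring.NET's [Transaction] attribute.
public class CustomerRepository : ICustomerRepository
{
    private readonly ISessionFactory sessionFactory;

    public CustomerRepository(ISessionFactory sessionFactory)
    {
        this.sessionFactory = sessionFactory;
    }

    public Customer GetById(int id)
    {
        return sessionFactory.GetCurrentSession().Get<Customer>(id);
    }
}

public class CustomerService
{
    private readonly ICustomerRepository customers;

    public CustomerService(ICustomerRepository customers)
    {
        this.customers = customers;
    }

    [Transaction]   // Spring.NET declarative transaction demarcation
    public virtual Customer Load(int id)
    {
        return customers.GetById(id);
    }
}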
Just try it out, they have pretty thorough documentation imho.
If I have 10 database calls on my web page, and none of them require any transactions (they are simply getting data from the database, i.e. reads), should I still use the unit of work class?
Or is it because creating a new session per database call is too expensive?
With NHibernate, session factory creation is very expensive (so you'll want to cache the session factory once it's created, probably on the HttpApplication), but session creation is very cheap. In other words, if it keeps your code cleaner, creating multiple sessions is not necessarily a bad thing. I think the NH documentation says it best:
"An ISessionFactory is an expensive-to-create, threadsafe object intended to be shared by all application threads. An ISession is an inexpensive, non-threadsafe object that should be used once, for a single business process, and then discarded."
So, using the UoW pattern is probably not more efficient due to the extra overhead, but it's a good practice and the overhead is probably not going to hurt you. Premature optimization and all that.
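In code, the "cache the factory, open sessions freely" advice comes down to something like this (a sketch; build the configuration however you already do, e.g. hbm.xml or Fluent NHibernate):

// Sketch: build the expensive ISessionFactory once and cache it;
// open cheap ISessions per database call or per page.
public static class NHibernateBootstrap
{
    // Built once per AppDomain; ISessionFactory is thread-safe.
    private static readonly ISessionFactory factory = BuildSessionFactory();

    public static ISessionFactory SessionFactory
    {
        get { return factory; }
    }

    private static ISessionFactory BuildSessionFactory()
    {
        // Replace with your own configuration bootstrap (hbm.xml, Fluent, ...).
        return new Configuration().Configure().BuildSessionFactory();
    }
}

// Per call:
using (var session = NHibernateBootstrap.SessionFactory.OpenSession())
{
    // read data
}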
Yes, you should use a transaction. From Ayende's blog:
"NHibernate assume that all access to the database is done under a transaction, and strongly discourage any use of the session without a transaction."
For more details, here's a link to his blog posting:
http://ayende.com/Blog/archive/2008/12/28/nh-prof-alerts-use-of-implicit-transactions-is-discouraged.aspx
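Concretely, even a pure read then looks like this (sketch; Customer is a placeholder entity):

// Even read-only work goes inside an explicit transaction.
using (var session = sessionFactory.OpenSession())
using (var tx = session.BeginTransaction())
{
    var customers = session.CreateQuery("from Customer").List<Customer>();
    tx.Commit();
}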
We're using the DTO pattern to marshal our domain objects from the service layer into our repository, and then down to the database via NHibernate.
I've run into an issue whereby I pull a DTO out of the repository (e.g. CustomerDTO) and then convert it into the domain object (Customer) in my service layer. I then try to save a new object back (e.g. SalesOrder) which contains the same Customer object. This is in turn converted to a SalesOrderDTO (and CustomerDTO) for pushing into the repository.
NHibernate does not like this: it complains that the CustomerDTO is a duplicate record. I'm assuming that this is because it pulled out the first CustomerDTO in the same session, and because the returned object has been converted back and forth it cannot recognise it as the same object.
Am I stuck here or is there a way around this?
Thanks
James
You can re-attach an object to a session in NHibernate by using Lock - e.g.
_session.Lock(myDetachedObject, NHibernate.LockMode.None);
which may or may not help depending on exactly what is happening here. On a side note, using DTOs with NHibernate is not the most common practice; the fact that NHibernate (mostly) supports persistence ignorance means that DTOs typically aren't as widely used as with some other ORM frameworks.
It's really about how the NHibernate session works: if, within a session, you pull out an instance of your CustomerDTO and then a while later retrieve the same CustomerDTO (say, by primary key), you will actually get a reference to the very same object as in your first retrieval.
So what you do is either merge the objects by calling session.Merge, or ask your session for the object by calling session.Get(primaryKey), apply your updates, and flush the session.
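In code that is roughly (a sketch; the variable names are just illustrative):

// Option 1: merge the detached, converted-back object into the session;
// Merge returns the persistent instance the session is tracking.
var persistentCustomer = session.Merge(detachedCustomerDto);

// Option 2: fetch the tracked instance and copy the changes onto it.
var tracked = session.Get<CustomerDTO>(primaryKey);
// ... apply the updates from your domain object to 'tracked' ...
session.Flush();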
However, as suggested by Steve, this is not usually what you do: you really want to get your domain object from the datastore and use DTOs (if needed) for transferring the data to the UI, a web service, or whatever.
As others have noted, implementing Equals and GetHashCode is a step in the right direction. Also look into NHibernate's support for the "attach" OR/M idiom.
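A common identifier-based implementation looks roughly like this (sketched for the CustomerDTO case with a Guid id; adjust for your own identifier type and be aware that the hash code is only stable once the id has been assigned):

// Identifier-based equality so NHibernate (and your own code) can tell
// that two in-memory instances represent the same row.
public class CustomerDTO
{
    public virtual Guid Id { get; set; }

    public override bool Equals(object obj)
    {
        if (ReferenceEquals(this, obj)) return true;
        var other = obj as CustomerDTO;
        return other != null && Id != Guid.Empty && Id == other.Id;
    }

    public override int GetHashCode()
    {
        return Id.GetHashCode();   // stable only once the Id is assigned
    }
}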
You also have the nosetter.camelcase option at your disposal: http://davybrion.com/blog/2009/03/entities-required-properties-and-properties-that-shouldnt-be-modified/
Furthermore, I'd like to encourage you not to be dissuaded by the lack of information out there online. It doesn't mean you're crazy, or doing things wrong. It just means you're working in an edge case. Unfortunately the biggest consumers of libraries like NHibernate are smallish in-house and/or web apps, where there exists the freedom to lean all your persistence needs against a single database. In reality, there are many exceptions to this rule.
For example, I'm currently working on a commercial desktop app where one of my domain objects has its data spread between a SQL CE database and image files on disk. Unfortunately NHibernate can only help me with the SQL CE persistence. I'm forced to use a sort of "Double Mapping" (see Martin Fowler's "Patterns of Enterprise Application Architecture") to map my domain model through a repository layer that knows what data goes to NHibernate and what goes to disk.
It happens. It's a real need. Sometimes an apparent lack in a tool indicates you're taking a bad approach. But sometimes the truth is that you just truly are in an edge case, and need to build out some of these patterns for yourself to get it done.
"I'm assuming that this is because it pulled out the first CustomerDTO in the same session, and because the returned object has been converted back and forth it cannot recognise it as the same object."
You are right, it can't. Consider implementing Equals and GetHashCode to fix this. I think a re-attach may only work if you haven't already loaded the object within this session.