WCF with MSMQ and DTC - closing NHibernate sessions

I have a WCF MSMQ service (hosted in a Windows service). My service method is marked with [OperationBehavior(TransactionScopeRequired = true)].
I am using NHibernate to save my data to the database, and I want to make sure I close each NHibernate session after each call.
I have been using the following approach (via the Castle facility's session manager) in my data access layer:
using (var session = sessionManager.OpenSession())
using (var transaction = session.BeginTransaction())
{
    // do work
    transaction.Commit();
}
But when my main service method exits I get an error, because I have already disposed of the NHibernate session and I think the DTC still needs it to perform its commit.
My question is:
What would be the best way to close the NHibernate session after the DTC has committed (i.e. after I have exited my service method)?
Thank you.

If you wrap your code in the following
using (TransactionScope sc = new TransactionScope(TransactionScopeOption.Suppress))
{
    // code here
    sc.Complete();
}
Then NHibernate will not enlist in the ambient transaction and therefore DTC will not have a dependency on the database transaction.
This is a hunch as you haven't supplied the error details in your question.
EDIT
Of course, by following this advice your database commit is no longer performed under the same transaction as the dequeue operation. A failure in the database work will therefore not necessarily cause the dequeue transaction to roll the message being processed back onto the queue, so you risk dropping messages this way. You can compensate for this in various ways, or simply run the risk if the cost of a dropped message is not high.
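For reference, the shape of the service operation with this fix would be roughly as in the sketch below. The service, contract and message type names are illustrative (not from the question); the injected ISessionManager is the Castle facility's session manager the question mentions.
public class OrderProcessingService : IOrderProcessingService   // illustrative names
{
    private readonly ISessionManager sessionManager;             // Castle NHibernate facility

    public OrderProcessingService(ISessionManager sessionManager)
    {
        this.sessionManager = sessionManager;
    }

    [OperationBehavior(TransactionScopeRequired = true)]
    public void Process(OrderMessage message)                    // OrderMessage is a placeholder type
    {
        // Suppress the ambient (DTC) transaction so NHibernate commits locally
        // and the session can be disposed before the operation returns.
        using (var sc = new TransactionScope(TransactionScopeOption.Suppress))
        using (var session = sessionManager.OpenSession())
        using (var transaction = session.BeginTransaction())
        {
            // do work with the session
            transaction.Commit();
            sc.Complete();
        }
    }
}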

Related

How to set up the in-memory outbox in ASP.NET and bind it to the EF context's SaveChanges

I have the following setup:
ASP.NET Core 2.1
EF Core 2.1
MassTransit 5.1
During a controller action where I both make database changes and publish events, I would like to hold the outgoing messages in an in-memory outbox until I know that the EF context transaction has been successfully committed to the database.
I'm willing to take the risk that the send then fails; as that risk is close to zero compared to database transactions failing due to, for example, concurrency exceptions, I don't see it as worth considering in my case.
Is this somehow possible to set up with the current UseInMemoryOutbox implementation?
Or do I have to roll my own outbox table, as discussed in this SO answer, save the messages to be sent in the same database transaction, and then have a background worker polling that table and sending the outgoing messages?
It would be better to send a command to a queue (via MT, using a send endpoint) in the controller action, and then use a consumer to (1) pull the message from the queue, (2) use the in-memory outbox for events, and (3) insert the relevant data into the database. That way, the controller isn't dependent upon database latency or the message queue to produce events, and the command is processed more durably by the consumer, allowing the API controller to quickly return an HTTP 202 (Accepted) response.
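A rough MassTransit 5.x sketch of that consumer approach might look like the following. RabbitMQ is an assumed transport, and SubmitOrder, OrderSubmitted, Order and OrdersDbContext are assumed types, not from the question.
// Consumer: save to the database first, then publish; with the in-memory outbox the
// publish is only dispatched once Consume returns without throwing.
public class SubmitOrderConsumer : IConsumer<SubmitOrder>
{
    readonly OrdersDbContext _db;
    public SubmitOrderConsumer(OrdersDbContext db) { _db = db; }

    public async Task Consume(ConsumeContext<SubmitOrder> context)
    {
        _db.Orders.Add(new Order { Id = context.Message.OrderId });
        await _db.SaveChangesAsync();                         // commit the EF changes first

        await context.Publish(new OrderSubmitted { OrderId = context.Message.OrderId });
    }
}

// Bus configuration: enable the in-memory outbox on the receive endpoint.
public static class BusConfig
{
    public static IBusControl Create() =>
        Bus.Factory.CreateUsingRabbitMq(cfg =>
        {
            var host = cfg.Host(new Uri("rabbitmq://localhost"), h => { });

            cfg.ReceiveEndpoint(host, "submit-order", e =>
            {
                e.UseInMemoryOutbox();   // hold published events until Consume completes
                e.Consumer(() => new SubmitOrderConsumer(new OrdersDbContext()));
            });
        });
}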

TransactionScopeAsyncFlowOption does not flow from WCF client to service

My WCF service method needs to perform concurrent tasks in a transaction that is flowed from the client to the service. To allow the transaction scope to flow across threads, I enabled TransactionScopeAsyncFlowOption in the TransactionScope constructor before calling the service.
using (var transaction = new TransactionScope(TransactionScopeAsyncFlowOption.Enabled))
{
    // service call here. Async flow option has no effect on the service side.
}
The transaction flows from the client to the service, but it does not flow to the sub-tasks. If, however, I create a new transaction scope in the service and enable its async flow there, the transaction does flow to the sub-tasks. So my question is: why does TransactionScopeAsyncFlowOption have no effect on the transaction at the service end? Shouldn't the service take the client's transaction scope settings, so there would be no need to create a new transaction scope in the service just to enable async flow?
I found a similar question here as well. What actually happens is that WCF loses nearly all of its context upon crossing the threshold of the first await, and you have to explicitly opt in to flow any ambient context, such as the ambient transaction, to subsequent async calls. There are some workarounds mentioned in this book; check them out if you need more detail. I'll go with creating a new scope before the async call in the service operation.
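In code, that workaround looks roughly like the sketch below; DoWorkAsync and the sub-task methods are illustrative names, not from the question.
[OperationBehavior(TransactionScopeRequired = true)]
public async Task DoWorkAsync()
{
    // Re-wrap the transaction flowed from the client in a new scope with async
    // flow enabled, so Transaction.Current survives across the awaits.
    using (var scope = new TransactionScope(
        TransactionScopeOption.Required,             // joins the flowed transaction
        TransactionScopeAsyncFlowOption.Enabled))
    {
        await Task.WhenAll(SubTaskA(), SubTaskB());  // sub-tasks now see the ambient transaction
        scope.Complete();
    }
}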

WCF Transaction Handling from Client/Consumer

I am using WCF to access my BL and DAL. I want to handle transactions from the client, not from the BL.
But when I use TransactionScope on the client side, the transaction does not work correctly: data is saved even if the transaction is aborted or an exception is thrown.
When I use TransactionScope in the BL it works well. My problem is that I want to handle transactions from the consumer applications, not from the service. Is there any way to do that?
Any suggestions? Please help.
http://www.codeproject.com/Articles/690136/All-About-TransactionScope
There can only be two problems:
Transaction flow is not enabled on the bindings.
Or you are using a binding that does not support distributed transactions.
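As a sketch (requires System.ServiceModel and System.Transactions; the contract, service and endpoint names are illustrative), flowing a client transaction needs the contract to allow it, a transaction-capable binding with flow enabled, and the service operation to require a scope:
// 1. Contract: allow the client's transaction to flow into the operation.
[ServiceContract]
public interface IOrderService
{
    [OperationContract]
    [TransactionFlow(TransactionFlowOption.Allowed)]
    void SaveOrder(string orderId);
}

// 2. Service: join the flowed transaction and auto-complete on success.
public class OrderService : IOrderService
{
    [OperationBehavior(TransactionScopeRequired = true, TransactionAutoComplete = true)]
    public void SaveOrder(string orderId)
    {
        // BL/DAL work here runs inside the client's transaction
    }
}

// 3. Client: transaction-capable binding with flow enabled, call wrapped in a TransactionScope.
public static class Client
{
    public static void Save(string orderId)
    {
        var binding = new WSHttpBinding { TransactionFlow = true };
        var factory = new ChannelFactory<IOrderService>(
            binding, new EndpointAddress("http://localhost/orders"));   // placeholder address
        IOrderService proxy = factory.CreateChannel();

        using (var scope = new TransactionScope())
        {
            proxy.SaveOrder(orderId);
            scope.Complete();   // skip this (or throw) and the service's work rolls back too
        }
    }
}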

Scoping transactions and sessions in NHibernate for long running tasks

When using NHibernate in web applications, I will usually let my IoC container take care of opening and closing an ISession per request and commit/rollback the transaction. The nature of HTTP makes it very easy to define a clear Unit-of-Work in such applications.
Now, I have been tasked with putting together a small program, which will be invoked regularly by a task scheduler, for sending out newsletters. The concepts of both newsletters and subscribers are already well defined entities in our domain model, and sending a newsletter to all subscribers would involve doing something similar to this:
var subscribers = _session
    .QueryOver<Subscription>()
    .Where(s => !s.HasReceivedNewsletter)
    .List();

foreach (var subscriber in subscribers)
{
    SendNewsletterTo(subscriber);
    subscriber.HasReceivedNewsletter = true;
}
Notice how each Subscriber object is updated within the loop, recording that she has now received the newsletter. The idea is that if the mail-sending program should crash, it can be restarted and continue sending newsletters from where it left off.
The problem I am facing is in defining and implementing the Unit-of-Work pattern here. I will probably need to commit changes to the database at the end of each iteration of the loop. Simply wrapping the loop body in a using (var trans = _session.BeginTransaction()) block seems to be extremely expensive in running time, and I also seem to experience locking issues between this long-running process and other (web) applications using the same database.
After reading some articles and documentation on NHibernate transactions, I have come to think that I might need to detach the list of subscribers from the session to avoid the locking issues, and reattach each one to a fresh session in the loop body. I am not sure how this will work out for performance, though.
So, NHibernate experts, how would you design and implement a long running job like this?
Don't you want to use asynchronous durable messaging here? Something like NServiceBus, Rhino Service Bus or MassTransit. It seems you don't have to send all the messages as soon as possible, so I think you should do it asynchronously, with one durable message per subscriber.
Don't you think that a stateless session with no transaction would do better here?
There's no problem having multiple transactions in a session. It's appropriate here to scope the transaction to updating a single subscriber because it's an independent operation. Depending on the number of subscribers and the likelihood of failure, it might be best to grab a small number of subscribers at a time.
foreach (var subscriber in subscribers)
{
    using (var txn = _session.BeginTransaction())
    {
        try
        {
            SendNewsletterTo(subscriber);
            subscriber.HasReceivedNewsletter = true;
            txn.Commit();
        }
        catch (Exception ex)
        {
            txn.Rollback();
            // log exception, clean up any actions SendNewsletterTo has taken if needed
            // Dispose of session and start over
        }
    }
}
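One way to implement the "small number of subscribers at a time" suggestion is to page the query and use a fresh session and short transaction per batch, so the first-level cache and any locks stay small. This is only a sketch: the batch size, the sessionFactory variable and committing per batch (rather than per subscriber, as above) are assumptions.
const int batchSize = 50;   // assumed batch size

while (true)
{
    using (var session = sessionFactory.OpenSession())
    using (var txn = session.BeginTransaction())
    {
        // Take only the next small slice of pending subscribers.
        var batch = session
            .QueryOver<Subscription>()
            .Where(s => !s.HasReceivedNewsletter)
            .Take(batchSize)
            .List();

        if (batch.Count == 0)
            break;   // nothing left to do

        foreach (var subscriber in batch)
        {
            SendNewsletterTo(subscriber);
            subscriber.HasReceivedNewsletter = true;
        }

        txn.Commit();   // one short transaction per batch
    }
}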

How long can/should an NHibernate session be kept open?

I've created a Windows service which listens to an MSMQ queue. For each message I receive, some database transactions need to be performed. Later there may be as many as one message every second.
Currently the NHibernate session is kept open until the service is stopped manually.
Is this good practice, or should I close the session after each message?
Thanks in advance
An NHibernate session is meant to be relatively short-lived, so it's generally not a good idea to keep it active for a long period. The session caches entities, and as more entities are fetched more data is cached if you don't manage the caching in some way, which leads to performance degradation.
The NHibernate docs describe ISession like this:
A single-threaded, short-lived object representing a conversation between the application and the persistent store. Wraps an ADO.NET connection. Factory for ITransaction. Holds a mandatory (first-level) cache of persistent objects, used when navigating the object graph or looking up objects by identifier.
I would suggest using a session-per-conversation, i.e. if you have a few db operations that "belong together" you use the same session for those operations, but when those operations are done you close the session.
So, using a new session for each message you process sounds like a good idea.
In contrast to the session factory (ISessionFactory), which is thread-safe and intended to live for the lifetime of the application, you should open and close a session (ISession) for every database transaction or unit of work.
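A minimal sketch of session-per-message in such a Windows service, assuming an _sessionFactory field built once at start-up; the message and entity types are placeholders, not from the question.
// Called once per MSMQ message; _sessionFactory (ISessionFactory) is created once at service start-up.
private void ProcessMessage(MyQueueMessage message)   // MyQueueMessage is a placeholder type
{
    using (var session = _sessionFactory.OpenSession())
    using (var transaction = session.BeginTransaction())
    {
        // map the message onto entities and save them
        session.Save(new AuditEntry { Payload = message.Body });   // AuditEntry is a placeholder entity
        transaction.Commit();
    }   // the session is disposed here, releasing its connection and first-level cache
}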