I have a C# .NET web service. I am calling a dll (C# .NET) that uses NHibernate to connect to my database. When I call the dll, it executes a query against the db and loads the parent object "Task". However, when the dll tries to access the child collection "Task.SubTasks", it throws the following error:
NHibernate.HibernateException failed to lazily initialize a collection of role: SubTasks no session or session was closed
I'm new to NHibernate, so I'm not sure what piece of code I'm missing.
Do I need to start a session from the session factory in my web service before calling the dll? If so, how do I do that?
EDIT: Added the web service code and the CreateContainer() method code. This code gets called just prior to calling the dll.
[WebMethod]
public byte[] GetTaskSubtask(string subtaskId)
{
    var container = CreateContainer(windsorPath);
    IoC.Initialize(container);

    //DLL CALL
    byte[] theDoc = CommonExport.GetSubtaskDocument(subtaskId);

    return theDoc;
}
/// <summary>
/// Register the IoC container.
/// </summary>
/// <param name="aWindsorConfig">The path to the windsor configuration
/// file.</param>
/// <returns>An initialized container.</returns>
protected override IWindsorContainer CreateContainer(string aWindsorConfig)
{
    //This method is a workaround. This method should not be overridden.
    //This method is overridden because the CreateContainer(string) method
    //in UnitOfWorkApplication instantiates a RhinoContainer instance that
    //has a dependency on Binsor. At the time of writing this the Mammoth
    //application did not have the libraries needed to resolve the Binsor
    //dependency.
    IWindsorContainer container = new RhinoContainer();

    container.Register(
        Component.For<IUnitOfWorkFactory>()
            .ImplementedBy<NHibernateUnitOfWorkFactory>());

    return container;
}
EDIT: Adding DLL code and repository code...
DLL Code
public static byte[] GetSubtaskDocument(string subtaskId)
{
    BOESubtask task = taskRepo.FindBOESubtaskById(Guid.Parse(subtaskId));

    foreach (Subtask st in task.Subtasks) // <-- this is the line that throws the error
    {
        //do some work
    }

    // ... document is built and returned here
}
Repository for task
/// <summary>
/// Queries the database for the Task whose ID matches the
/// passed in ID.
/// </summary>
/// <param name="aTaskId">The ID to find the matching Task
/// for.</param>
/// <returns>The Task whose ID matches the passed in
/// ID (or null).</returns>
public Task FindTaskById(Guid aTaskId)
{
    var task = new Task();

    using (UnitOfWork.Start())
    {
        task = FindOne(DetachedCriteria.For<Task>()
            .Add(Restrictions.Eq("Id", aTaskId)));
        UnitOfWork.Current.Flush();
    }

    return task;
}
Repository for subtask
/// <summary>
/// Queries the database for the Subtask whose ID matches the
/// passed in ID.
/// </summary>
/// <param name="aSubtaskId">The ID to find the matching Subtask
/// for.</param>
/// <returns>The Subtask whose ID matches the passed in
/// ID (or null).</returns>
public Subtask FindBOESubtaskById(Guid aSubtaskId)
{
    var subtask = new Subtask();

    using (UnitOfWork.Start())
    {
        subtask = FindOne(DetachedCriteria.For<Subtask>()
            .Add(Restrictions.Eq("Id", aSubtaskId)));
        UnitOfWork.Current.Flush();
    }

    return subtask;
}
You have apparently mapped a collection in one of your NHibernate data classes with lazy loading enabled (or rather: not disabled, as it is the default behavior). NHibernate loads the entity and creates a proxy for each mapped collection. As soon as a collection is accessed, NHibernate attempts to load its items. But if you close your NHibernate session before that happens, you get the error you received.
You are probably exposing your data object through your web service to the web service client. During serialization, the XmlSerializer tries to serialize the collection, which prompts NHibernate to populate it. As the session is already closed, the error occurs.
Two ways to prevent this:
close the session after the response has been sent
or
disable lazy loading for your collections so that they are loaded instantly
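For the second option, assuming Fluent NHibernate mappings (with hbm.xml the equivalent is lazy="false" on the collection element), the collection can be mapped eagerly. TaskMap and the TaskId key column below are assumptions about your model:
using FluentNHibernate.Mapping;

// Hypothetical mapping for the Task/SubTasks relationship from the question.
// .Not.LazyLoad() makes NHibernate fetch the collection together with the
// parent Task, so no open session is needed later when SubTasks is accessed.
public class TaskMap : ClassMap<Task>
{
    public TaskMap()
    {
        Id(x => x.Id);
        HasMany(x => x.SubTasks)
            .KeyColumn("TaskId")   // assumed foreign-key column name
            .Not.LazyLoad();
    }
}
The trade-off is that every query for a Task now also loads its SubTasks, so for larger object graphs the first option (keeping the session open until the response has been written) is usually preferable.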
Addition after the above edits:
In your repository, you start UnitsOfWork within a using statement. They are disposed as soon as the block completes. I don't know the implementation of UnitOfWork, but I assume it controls the lifetime of the NHibernate session. By disposing the UnitOfWork, you are probably closing the NHibernate session. As your mapping declares the collections as lazily loaded, those collections are not yet populated at that point, and the error occurs. NHibernate needs the exact session instance that loaded an entity in order to populate its lazily initialized collections.
You will run into problems like this if you use lazy loading and have a repository that closes the session before the response is complete. One option would be to initialize the UnitOfWork at the start of the request and close it after the response is complete (for instance in Application_BeginRequest/Application_EndRequest in Global.asax.cs). That would of course mean a closer integration of your repository into the web service.
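A rough sketch of that session-per-request idea, assuming the Rhino Commons-style UnitOfWork API you are already using (UnitOfWork.Start() returning a disposable scope):
using System;
using System.Web;

// Global.asax.cs -- hypothetical session-per-request scope.
// UnitOfWork.Start() opens the NHibernate session; disposing the returned
// scope at the end of the request closes it, after serialization has finished.
public class Global : HttpApplication
{
    private const string UnitOfWorkKey = "CurrentUnitOfWork";

    protected void Application_BeginRequest(object sender, EventArgs e)
    {
        HttpContext.Current.Items[UnitOfWorkKey] = UnitOfWork.Start();
    }

    protected void Application_EndRequest(object sender, EventArgs e)
    {
        var unitOfWork = HttpContext.Current.Items[UnitOfWorkKey] as IDisposable;
        if (unitOfWork != null)
        {
            unitOfWork.Dispose();
        }
    }
}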
In any case, creating a session for just a single database request in combination with lazy loading is a bad idea and is very likely to create similar problems in the future. If you can't change the repository implementation, you will probably have to disable lazy loading.
Using Garland's feedback, I resolved the issue. I removed the UnitOfWork code from the repository in the DLL and wrapped the web service call to the DLL in a UnitOfWork. See the code modifications below:
Web Service
[WebMethod]
public byte[] GetSubtaskDocument(string subtaskId)
{
    var container = CreateContainer(windsorString);
    IoC.Initialize(container);

    byte[] theDoc;

    using (UnitOfWork.Start())
    {
        //DLL call
        theDoc = CommonExport.GetSubtaskDocument(subtaskId);
        UnitOfWork.Current.Flush();
    }

    return theDoc;
}
Repository call in the DLL
public Subtask FindSubtaskById(Guid aSubtaskId)
{
    return FindOne(DetachedCriteria.For<Subtask>()
        .Add(Restrictions.Eq("Id", aSubtaskId)));
}
Is it possible to create our own HTTP method by just overriding the HttpMethodAttribute class and specifying our own supportedMethods?
In fact, depending on the case, we need to return the view as a complete view with the _Layout, and sometimes we just need to return the PartialView of that view. So my idea is to add a custom attribute, like [HttpPartial], so the client can indicate, via the method used in the request, whether it wants the complete view (GET method) or the partial view (PARTIAL method).
Yes, that's possible for APIs. You can look at how the HttpGetAttribute is implemented, and roll your own for a custom method, replacing "get" with "foo":
/// <summary>
/// Identifies an action that supports the HTTP FOO method.
/// </summary>
public class HttpFooAttribute : HttpMethodAttribute
{
    private static readonly IEnumerable<string> _supportedMethods = new[] { "FOO" };

    /// <summary>
    /// Creates a new <see cref="HttpFooAttribute"/>.
    /// </summary>
    public HttpFooAttribute()
        : base(_supportedMethods)
    {
    }

    /// <summary>
    /// Creates a new <see cref="HttpFooAttribute"/> with the given route template.
    /// </summary>
    /// <param name="template">The route template. May not be null.</param>
    public HttpFooAttribute(string template)
        : base(_supportedMethods, template)
    {
        if (template == null)
        {
            throw new ArgumentNullException(nameof(template));
        }
    }
}
Then apply it to your API action methods:
[Route("Bar")]
public class BarApiController : Controller
{
[HttpFoo("/"]
public IActionResult Foo()
{
return Ok("Foo");
}
}
Now you can request FOO https://your-api:44312/bar/.
This is less useful for actions returning views, as any HTML-rendering user agent only lets the user initiate GET or POST requests through hyperlinks and forms.
You could send more methods through an XMLHttpRequest or fetch(), but it'll require more documentation and client customization.
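For a non-browser client, the custom method can also be sent with HttpClient; a small sketch using the placeholder URL from above:
using System.Net.Http;
using System.Threading.Tasks;

// HttpMethod accepts any method token, so a custom FOO request needs no
// special client support beyond constructing the message yourself.
static async Task<string> CallFooAsync()
{
    using (var client = new HttpClient())
    {
        var request = new HttpRequestMessage(new HttpMethod("FOO"), "https://your-api:44312/bar/");
        HttpResponseMessage response = await client.SendAsync(request);
        return await response.Content.ReadAsStringAsync();
    }
}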
Don't break or hamper the web, don't invent new HTTP methods for your application logic. Simply use a query string parameter or send it in your body: &renderAsPartial=true, { "renderAsPartial": true }.
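The query-string variant can be handled inside a single action; renderAsPartial, the route and the view names below are just illustrative:
// One GET action serves both cases: ?renderAsPartial=true returns the partial,
// anything else returns the full view with its layout.
[HttpGet("details")]
public IActionResult Details(bool renderAsPartial = false)
{
    object model = null; // build your view model here
    return renderAsPartial
        ? PartialView("_Details", model)
        : View("Details", model);
}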
See the IANA's Hypertext Transfer Protocol (HTTP) Method Registry for existing methods and see RFC 7231 section 8.1 for how to register new HTTP methods.
I'm working on an application which should validate the model based on metadata saved in a database. The purpose of this is to allow administrators to change how some models are validated, depending on clients' preferences, without changing the code. The changes apply to the entire application, not to specific users accessing it. How it is changed doesn't matter at the moment: the rules could be modified directly in the database, or using an application. The idea is that they should be customizable.
Let's say I have the model "Person" with the property "Name" of type "string".
public class Person
{
    public string Name { get; set; }
}
This model is used by my app, which is distributed and installed on several servers. Each of them is independent. Some users may want the Name to have a maximum of 30 characters and to be required when creating a new "Person"; others may want a maximum of 25 and the field not to be required. Normally this would be solved using data annotations, but those are fixed at compile time and are effectively "hardcoded".
In short, I want a way to customize how the model is validated and store that configuration in a database, without altering the application code.
Also, it would be nice if it worked with jQuery validation and made as few requests to the database (or service) as possible. Besides that, I can't use any known ORM like EF.
You could create a custom validation attribute that validates by examining the metadata stored in the database. Custom validation attributes are easy to create: simply extend System.ComponentModel.DataAnnotations.ValidationAttribute and override the IsValid() method.
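A minimal sketch of such an attribute, assuming a PropertyRule class and a ValidationRuleStore gateway as stand-ins for however you load and cache the metadata (plain ADO.NET is fine, since EF is not an option):
using System;
using System.ComponentModel.DataAnnotations;

// Hypothetical rule record loaded from the metadata tables.
public class PropertyRule
{
    public bool IsRequired { get; set; }
    public int MaxLength { get; set; }
}

// Assumed gateway over the metadata tables (plain ADO.NET plus caching would do).
public static class ValidationRuleStore
{
    public static PropertyRule GetRule(string typeName, string propertyName)
    {
        // ...query the metadata tables here; hard-coded default for the sketch
        return new PropertyRule { IsRequired = true, MaxLength = 30 };
    }
}

// The concrete rule (required, max length) is not known at compile time but is
// looked up per type/property when validation runs.
[AttributeUsage(AttributeTargets.Property)]
public class DbConfiguredAttribute : ValidationAttribute
{
    protected override ValidationResult IsValid(object value, ValidationContext validationContext)
    {
        PropertyRule rule = ValidationRuleStore.GetRule(
            validationContext.ObjectType.Name, validationContext.MemberName);

        string text = (value as string) ?? string.Empty;

        if (rule.IsRequired && text.Length == 0)
            return new ValidationResult(string.Format("{0} is required.", validationContext.DisplayName));

        if (rule.MaxLength > 0 && text.Length > rule.MaxLength)
            return new ValidationResult(string.Format("{0} must be at most {1} characters.",
                validationContext.DisplayName, rule.MaxLength));

        return ValidationResult.Success;
    }
}
The Name property on Person would then just be decorated with [DbConfigured], and administrators change the stored rule instead of the code.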
To get client-side rules that work with jQuery validation, you will need to create a custom adapter for your custom validation attribute's type that extends System.Web.Mvc.DataAnnotationsModelValidator<YourCustomValidationAttribute>. This adapter then needs to be registered when the application starts, e.g. in the Application_Start() method of your Global.asax:
DataAnnotationsModelValidatorProvider.RegisterAdapter(typeof(YourCustomValidationAttribute), typeof(YourCustomAdapter));
Here's an example adapter:
public class FooAdapter : DataAnnotationsModelValidator<FooAttribute>
{
    /// <summary>
    /// This constructor is used by the MVC framework to retrieve the client validation rules for the attribute
    /// type associated with this adapter.
    /// </summary>
    /// <param name="metadata">Information about the type being validated.</param>
    /// <param name="context">The ControllerContext for the controller handling the request.</param>
    /// <param name="attribute">The attribute associated with this adapter.</param>
    public FooAdapter(ModelMetadata metadata, ControllerContext context, FooAttribute attribute)
        : base(metadata, context, attribute)
    {
        _metadata = metadata;
    }

    /// <summary>
    /// Overrides the definition in System.Web.Mvc.ModelValidator to provide the client validation rules specific
    /// to this type.
    /// </summary>
    /// <returns>The set of rules that will be used for client side validation.</returns>
    public override IEnumerable<ModelClientValidationRule> GetClientValidationRules()
    {
        return new[] { new ModelClientValidationRequiredRule(
            String.Format("The {0} field is invalid.", _metadata.DisplayName ?? _metadata.PropertyName)) };
    }

    /// <summary>
    /// The metadata associated with the property tagged by the validation attribute.
    /// </summary>
    private ModelMetadata _metadata;
}
This may also be useful if you would like to call server-side validation asynchronously: http://msdn.microsoft.com/en-us/library/system.web.mvc.remoteattribute(v=vs.108).aspx
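For reference, the RemoteAttribute from that link is applied roughly like this; the controller and action names are placeholders:
using System.Web.Mvc;

public class Person
{
    // jQuery validation will call /Validation/IsNameValid?Name=... via AJAX.
    [Remote("IsNameValid", "Validation")]
    public string Name { get; set; }
}

public class ValidationController : Controller
{
    public JsonResult IsNameValid(string name)
    {
        // Check the database-driven rules here and return true/false.
        bool isValid = name != null && name.Length <= 30;  // illustrative rule
        return Json(isValid, JsonRequestBehavior.AllowGet);
    }
}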
We're using NServiceBus to perform document processing for a number of tenants.
Each tenant has their own database and we're using NHibernate for data access. In the web application we're using our IoC tool (StructureMap) to handle session management. Essentially we maintain a session factory for each tenant. We're able to identify the tenant from HttpContext.
When we kick off document processing using NServiceBus we have access to the tenant identifier. We need this tenant id to be available throughout the processing of the document (we have 2 sagas and fire off a number of events).
We would need to create an NHibernate SessionFactory for each tenant, so we would need some way of obtaining the tenant id when we configure StructureMap.
I've seen a few posts suggesting to use a message header to store the tenant identifier but am unsure how to:
Set a message header when we first submit a document (sending a SubmitDocumentCommand)
Reference the header when we configure StructureMap
Access the header within our sagas/handlers
Ensure the header flows from one message to the next. When we send a SubmitDocumentCommand it is handled by the DocumentSubmissionSaga. If the submission succeeds we will send off a DocumentSubmittedEvent. We'd want to make sure the tenant id is available at all points in the process.
I believe with this information I can successfully implement multitenancy with NHibernate but anything more specific to this scenario would be appreciated.
You can flow the header using a message mutator that registers itself. Here is a quick example from my own code. And you can always use Bus.CurrentMessageContext.Headers to set/get the header anywhere...
Hope this helps :)
/// <summary>
/// Mutator to set the channel header
/// </summary>
public class FlowChannelMutator : IMutateOutgoingTransportMessages, INeedInitialization
{
    /// <summary>
    /// The bus is needed to get access to the current message context
    /// </summary>
    public IBus Bus { get; set; }

    /// <summary>
    /// Keeps track of the channel
    /// </summary>
    /// <param name="messages">The logical messages being sent.</param>
    /// <param name="transportMessage">The outgoing transport message whose headers are mutated.</param>
    public void MutateOutgoing(object[] messages, TransportMessage transportMessage)
    {
        if (Bus.CurrentMessageContext != null &&
            Bus.CurrentMessageContext.Headers.ContainsKey("x-messagehandler-channel"))
        {
            if (!transportMessage.Headers.ContainsKey("x-messagehandler-channel"))
            {
                transportMessage.Headers["x-messagehandler-channel"] =
                    Bus.CurrentMessageContext.Headers["x-messagehandler-channel"];
            }
        }
    }

    /// <summary>
    /// Initializes the mutator by registering it in the container.
    /// </summary>
    public void Init()
    {
        Configure.Instance.Configurer.ConfigureComponent<FlowChannelMutator>(DependencyLifecycle.InstancePerCall);
    }
}
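For the other points in your list (setting the header when the SubmitDocumentCommand is first sent, and reading it in a saga/handler), here is a rough sketch. It assumes the same older NServiceBus API as the mutator above, where messages expose a SetHeader extension method (in other versions you would use bus.OutgoingHeaders or SendOptions instead); the DocumentId property is also just assumed:
using System;
using NServiceBus;

// Sending side (e.g. the web application): stamp the tenant id on the first
// command so the mutator above can flow it to every subsequent message.
public class DocumentSubmitter
{
    public IBus Bus { get; set; }

    public void Submit(Guid documentId, string tenantId)
    {
        var command = new SubmitDocumentCommand { DocumentId = documentId };
        command.SetHeader("x-messagehandler-channel", tenantId);
        Bus.Send(command);
    }
}

// Handler/saga side: read the header back and use it to resolve the
// tenant-specific ISessionFactory from StructureMap.
public class SubmitDocumentHandler : IHandleMessages<SubmitDocumentCommand>
{
    public IBus Bus { get; set; }

    public void Handle(SubmitDocumentCommand message)
    {
        string tenantId = Bus.CurrentMessageContext.Headers["x-messagehandler-channel"];
        // ...open a session from the tenant's session factory here
    }
}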
I am wondering how much my service layer should know about my repository. In past projects I always returned lists and had a method for each thing I needed.
So if I needed to return all rows that had an Id of 5, that would be a method. I do have a generic repository for create, update, delete and other NHibernate operations, but not for querying.
Now I am starting to use IQueryable more, as I started to run into the problem of having so many methods, one for each case.
Say I needed to return everything with a certain Id with three tables eagerly loaded: that would be a new method. If I needed a certain Id and no eager loading, that would be a separate method.
So now I am thinking that if I have a method that does the where-clause part and returns IQueryable, I can build on the result (i.e. add eager loading when I need it).
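Something like this sketch is what I have in mind, assuming NHibernate 3's LINQ provider (session.Query<T>() plus the FetchMany extension from NHibernate.Linq) and Task/SubTasks as placeholder entities:
using System.Linq;
using NHibernate;
using NHibernate.Linq;

// Repository exposes only the filtered query; it says nothing about fetching.
public class TaskRepository
{
    private readonly ISession session;

    public TaskRepository(ISession session)
    {
        this.session = session;
    }

    public IQueryable<Task> ById(int id)
    {
        return session.Query<Task>().Where(t => t.Id == id);
    }
}

// The service layer decides about eager loading -- but note that FetchMany
// drags NHibernate.Linq into the service layer, which is exactly the coupling
// I'm worried about.
public class TaskService
{
    private readonly TaskRepository repository;

    public TaskService(TaskRepository repository)
    {
        this.repository = repository;
    }

    public Task LoadWithSubTasks(int id)
    {
        return repository.ById(id)
            .FetchMany(t => t.SubTasks)
            .ToList()            // with a collection fetch the root row may repeat
            .FirstOrDefault();
    }
}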
At the same time, though, this makes the service layer more aware of the repository layer, and I can no longer switch out the repository as easily, since I now have NHibernate-specific code in the service layer.
I am also not sure how that would affect mocking.
So now I am wondering, if I go down this route, whether the repository is needed at all, as the two layers now seem to have blended together.
Edit
If I get rid of my repository and just have the session in my service layer, is there a point to having a unit of work class then?
public class UnitOfWork : IUnitOfWork, IDisposable
{
    private ITransaction transaction;
    private readonly ISession session;

    public UnitOfWork(ISession session)
    {
        this.session = session;
        session.FlushMode = FlushMode.Auto;
    }

    /// <summary>
    /// Starts a transaction with the database. Uses IsolationLevel.ReadCommitted
    /// </summary>
    public void BeginTransaction()
    {
        transaction = session.BeginTransaction(IsolationLevel.ReadCommitted);
    }

    /// <summary>
    /// Starts a transaction with the database.
    /// </summary>
    /// <param name="level">IsolationLevel the transaction should run in.</param>
    public void BeginTransaction(IsolationLevel level)
    {
        transaction = session.BeginTransaction(level);
    }

    private bool IsTransactionActive()
    {
        return transaction.IsActive;
    }

    /// <summary>
    /// Commits the transaction and writes to the database.
    /// </summary>
    public void Commit()
    {
        // make sure a transaction was started before we try to commit.
        if (!IsTransactionActive())
        {
            throw new InvalidOperationException("Oops! We don't have an active transaction. Did a rollback occur before this commit was triggered: "
                + transaction.WasRolledBack + " did a commit happen before this commit: " + transaction.WasCommitted);
        }

        transaction.Commit();
    }

    /// <summary>
    /// Rollback any writes to the databases.
    /// </summary>
    public void Rollback()
    {
        if (IsTransactionActive())
        {
            transaction.Rollback();
        }
    }

    public void Dispose() // don't know where to call this to see if it will solve my problem
    {
        if (session.IsOpen)
        {
            session.Close();
        }
    }
}
Everyone has an opinion on how to use the repository, what to abstract, etc. Ayende Rahien has a few good posts about the issue: Architecting in the pit of doom: The evils of the repository abstraction layer and Repository is the new Singleton. Those give you some pretty good reasons why you shouldn't try to create yet another abstraction on top of NHibernate's ISession.
The thing about NHibernate is that it gives you the most if you don't try to abstract it away. Making your service layer depend on NHibernate is not necessarily a bad thing. It gives you control over sessions, caching and other NHibernate features, and thus enables you to improve performance, not to mention saving you from all the redundant wrapping code you've mentioned.
I've just discovered that if I get an object from an NHibernate session and change a property on the object, NHibernate will automatically update the object on commit without me calling Session.Update(myObj)!
I can see how this could be helpful, but as default behaviour it seems crazy!
Update: I now understand persistence ignorance, so this behaviour is now clearly the preferred option. I'll leave this now embarrassing question here to hopefully help other uninitiated users.
How can I stop this happening? Is this default NHibernate behaviour or something coming from Fluent NHibernate's AutoPersistenceModel?
If there's no way to stop this, what do I do? Unless I'm missing the point this behaviour seems to create a right mess.
I'm using NHibernate 2.0.1.4 and a Fluent NHibernate build from 18/3/2009
Is this guy right with his answer?
I've also read that overriding an Event Listener could be a solution to this. However, IDirtyCheckEventListener.OnDirtyCheck isn't called in this situation. Does anyone know which listener I need to override?
You can set Session.FlushMode to FlushMode.Never. This makes your write operations explicit: nothing is written until you explicitly call session.Flush(). Of course this will still update the database when you do flush. If you do not want that behavior either, call session.Evict(yourObj); the object then becomes detached and NHibernate will not issue any db commands for it.
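A small sketch of both options against an NHibernate 2.x ISession (FlushMode.Never is the older name; later versions call it FlushMode.Manual); sessionFactory, Person and personId are just placeholders:
using System;
using NHibernate;

static void DemonstrateExplicitFlushing(ISessionFactory sessionFactory, Guid personId)
{
    using (ISession session = sessionFactory.OpenSession())
    using (ITransaction tx = session.BeginTransaction())
    {
        // Option 1: no automatic flushing -- nothing is written unless you
        // explicitly call session.Flush().
        session.FlushMode = FlushMode.Never;

        var person = session.Get<Person>(personId);
        person.Name = "Changed";   // entity is dirty, but no UPDATE is issued

        // Option 2: detach the object -- NHibernate stops tracking it entirely
        // and will never issue SQL for it, even if you flush later.
        session.Evict(person);

        tx.Commit();               // commits the transaction; no UPDATE is sent
    }
}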
Response to your edit: Yes, that guy gives you more options on how to control it.
My solution:
In your initial ISession creation (somewhere inside your injection framework registrations), set DefaultReadOnly to true.
In your IRepository implementation, which wraps NHibernate and manages the ISession and such, have the Insert, Update, InsertUpdate and Delete (or similar) methods that call ISession.Save, Update, SaveOrUpdate, etc. call SetReadOnly for the entity with the flag set to false.
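Roughly like this; note that ISession.DefaultReadOnly and ISession.SetReadOnly were only introduced in NHibernate 3.0, so this needs a newer build than the 2.0.1.4 mentioned in the question. The repository wrapper below is only illustrative:
using NHibernate;

// Hypothetical repository wrapper showing the two calls involved.
public class ReadOnlyByDefaultRepository
{
    private readonly ISession session;

    public ReadOnlyByDefaultRepository(ISessionFactory sessionFactory)
    {
        session = sessionFactory.OpenSession();
        session.DefaultReadOnly = true;     // everything loads read-only: no automatic dirty-checking
    }

    public T Get<T>(object id)
    {
        return session.Get<T>(id);          // returned entity will not be auto-updated on flush
    }

    public void Update(object entity)
    {
        session.SetReadOnly(entity, false); // opt this one instance back into change tracking
        session.Update(entity);
        session.Flush();
    }
}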
Calling SaveOrUpdate() or Save() makes an object persistent. If you've retrieved it using an ISession or from a reference to a persistent object, then the object is persistent and flushing the session will save changes. You can prevent this behavior by calling Evict() on the object, which detaches it from the session.
Edited to add: I generally consider an ISession to be a unit of work. This is easily implemented in a web app using session-per-request, but it requires more control in WinForms.
We did this by using event listeners with NH (this isn't my work, but I can't find the link to where I got it from...).
We have an EventListener for when the data is read in, to set it as ReadOnly, and then one for Save (and SaveOrUpdate) to set the entity back to Loaded, so that the object will persist when we manually call Save() on it.
That - or you could use an IStatelessSession which has no State/ChangeTracking.
This sets the entity/item as ReadOnly immediately on loading.
I've only included one Insertion event listener, but my config code references all of them.
/// <summary>
/// A listener that once an object is loaded will change its status to ReadOnly so that
/// it will not be automatically saved by NH
/// </summary>
/// <remarks>
/// For this object to then be saved, the SaveUpdateEventListener is to be used.
/// </remarks>
public class PostLoadEventListener : IPostLoadEventListener
{
    public void OnPostLoad(PostLoadEvent @event)
    {
        EntityEntry entry = @event.Session.PersistenceContext.GetEntry(@event.Entity);
        entry.BackSetStatus(Status.ReadOnly);
    }
}
On saving the object, we call this to set that object to Loaded (meaning it will now persist)
public class SaveUpdateEventListener : ISaveOrUpdateEventListener
{
    public static readonly CascadingAction ResetReadOnly = new ResetReadOnlyCascadeAction();

    /// <summary>
    /// Changes the status of a ReadOnly entity back to Loaded when it is explicitly saved or updated.
    /// </summary>
    /// <remarks>
    /// This re-enables change tracking on the entity (and cascades to its children), so that NH will persist it.
    /// </remarks>
    public void OnSaveOrUpdate(SaveOrUpdateEvent @event)
    {
        var session = @event.Session;
        EntityEntry entry = session.PersistenceContext.GetEntry(@event.Entity);

        if (entry != null && entry.Persister.IsMutable && entry.Status == Status.ReadOnly)
        {
            entry.BackSetStatus(Status.Loaded);
            CascadeOnUpdate(@event, entry.Persister, @event.Entry);
        }
    }

    private static void CascadeOnUpdate(SaveOrUpdateEvent @event, IEntityPersister entityPersister,
        object entityEntry)
    {
        IEventSource source = @event.Session;
        source.PersistenceContext.IncrementCascadeLevel();
        try
        {
            new Cascade(ResetReadOnly, CascadePoint.BeforeFlush, source).CascadeOn(entityPersister, entityEntry);
        }
        finally
        {
            source.PersistenceContext.DecrementCascadeLevel();
        }
    }
}
And we wire it into NH like so:
public static ISessionFactory CreateSessionFactory(IPersistenceConfigurer dbConfig, Action<MappingConfiguration> mappingConfig, bool enabledChangeTracking, bool enabledAuditing, int queryTimeout)
{
    return Fluently.Configure()
        .Database(dbConfig)
        .Mappings(mappingConfig)
        .Mappings(x => x.FluentMappings.AddFromAssemblyOf<__AuditEntity>())
        .ExposeConfiguration(x => Configure(x, enabledChangeTracking, enabledAuditing, queryTimeout))
        .BuildSessionFactory();
}
/// <summary>
/// Configures the specified config.
/// </summary>
/// <param name="config">The config.</param>
/// <param name="enableChangeTracking">if set to <c>true</c> [enable change tracking].</param>
/// <param name="enableAuditing">if set to <c>true</c> [enable auditing].</param>
/// <param name="queryTimeOut">The query time out in minutes.</param>
private static void Configure(NHibernate.Cfg.Configuration config, bool enableChangeTracking, bool enableAuditing, int queryTimeOut)
{
    config.SetProperty(NHibernate.Cfg.Environment.Hbm2ddlKeyWords, "none");

    if (queryTimeOut > 0)
    {
        config.SetProperty("command_timeout", (TimeSpan.FromMinutes(queryTimeOut).TotalSeconds).ToString());
    }

    if (!enableChangeTracking)
    {
        config.AppendListeners(NHibernate.Event.ListenerType.PostLoad, new[] { new Enact.Core.DB.NHib.Listeners.PostLoadEventListener() });
        config.AppendListeners(NHibernate.Event.ListenerType.SaveUpdate, new[] { new Enact.Core.DB.NHib.Listeners.SaveUpdateEventListener() });
        config.AppendListeners(NHibernate.Event.ListenerType.PostUpdate, new[] { new Enact.Core.DB.NHib.Listeners.PostUpdateEventListener() });
        config.AppendListeners(NHibernate.Event.ListenerType.PostInsert, new[] { new Enact.Core.DB.NHib.Listeners.PostInsertEventListener() });
    }
}