Using NHibernate, what is the best way to handle InsertDate (a.k.a CreateDate) and UpdateDate columns?
For example:
public class Customer
{
    public virtual string Name { get; set; }
    public virtual DateTime InsertDate { get; set; }
    public virtual DateTime UpdateDate { get; set; }
}
There are probably multiple ways this could be handled, but could someone who has done this before provide me with some advice?
If it's not obvious, I want InsertDate to be set to the date that a record was inserted and after that to be immutable. UpdateDate needs to be changed every time a record is updated.
Bonus marks if you answer using Fluent NHibernate.
Use auditing interceptors for this. A good example can be found here.
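As a rough sketch of the idea (this is not the linked example; the property names InsertDate and UpdateDate are assumed to match the Customer class above):

using System;
using NHibernate;
using NHibernate.Type;

// Minimal auditing interceptor sketch: stamps InsertDate/UpdateDate by property name.
public class AuditInterceptor : EmptyInterceptor
{
    // Called when a new entity is inserted: stamp both dates.
    public override bool OnSave(object entity, object id, object[] state,
                                string[] propertyNames, IType[] types)
    {
        bool insertSet = Stamp(state, propertyNames, "InsertDate");
        bool updateSet = Stamp(state, propertyNames, "UpdateDate");
        return insertSet || updateSet;
    }

    // Called when an existing entity is flushed with changes: stamp UpdateDate only,
    // leaving InsertDate untouched.
    public override bool OnFlushDirty(object entity, object id, object[] currentState,
                                      object[] previousState, string[] propertyNames, IType[] types)
    {
        return Stamp(currentState, propertyNames, "UpdateDate");
    }

    // Returns true (state modified) if the named property exists on the entity.
    private static bool Stamp(object[] state, string[] propertyNames, string propertyName)
    {
        int index = Array.IndexOf(propertyNames, propertyName);
        if (index == -1)
            return false;
        state[index] = DateTime.UtcNow;
        return true;
    }
}

Register it when opening sessions (sessionFactory.OpenSession(new AuditInterceptor())) or globally via Configuration.SetInterceptor; with Fluent NHibernate the latter can go inside ExposeConfiguration.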
I've used something similar to Ayende Rahien's approach.
Not so important - just to be complete - my version used interceptors, not listeners. For more info on interceptors and listeners, read this Stack Overflow question.
I have pre-existing tables, using a kind of open schema. I have an Item table, and various entities are classified as Items, and then have properties stored in Item property tables. A single entity type may have fields stored in multiple tables. We expose entities with views. So, most entities correspond to a view, and then when we insert/update we have to systematically update the tables or use stored procedures.
I'm trying to determine if NHibernate will gain us anything over our custom-built repositories (which follow a factory pattern). Right now, I'm seeing great difficulty in getting NHibernate to deal with this kind of database schema. The way I see it, we'd either have to completely refactor our database to follow NHibernate's conventions, or completely refactor our entities somehow.
I'm not seeing much in the documentation about how to do this, except for the very simplest of examples that involve databases that more or less follow NHibernate's conventions.
Here's a representative database diagram. We have Episode as an entity that pulls info from Item, IP_Episode, IP_EpisodeBroadcastInfo, IP_Show, etc. to build all the fields that it needs.
You mention conventions. That is a Fluent NHibernate concept, and yes, what you are doing is not exactly in line with Fluent NHibernate's existing conventions. However, it is well within NHibernate's capabilities. NHibernate excels at mapping to all sorts of different database schemas. Don't feel constrained to the way Fluent NHibernate wants you to go. I'm not saying don't use Fluent NHibernate. If you are consistent and reasonable in your database schema, you can write your own conventions to match.
To illustrate NHibernate's flexibility, let's assume we have a table structure similar to this:
create table Episode (
    Id int not null primary key,
    NumberInSeries int null
);

create table Show (
    Episode_id int not null primary key,
    Title nvarchar(100) not null,
    foreign key (Episode_id) references Episode (Id)
);

create table Broadcast (
    Episode_id int not null primary key,
    InitialAirDate datetime not null,
    foreign key (Episode_id) references Episode (Id)
);
One row in Episode corresponds to zero or one rows in Show and zero or one rows in Broadcast. You could model this type of relationship several different ways in .NET. Here are the various options available to you via NHibernate:
1. Inheritance
public class Episode
{
    public virtual int Id { get; set; }
    public virtual int? NumberInSeries { get; set; }
}

public class Show : Episode
{
    public virtual string Title { get; set; }
}

public class Broadcast : Episode
{
    public virtual DateTime InitialAirDate { get; set; }
}
Use this when you want to model a relationship that does not change. If an Episode is a Show, it is always a Show. Also, this modeling would imply that an Episode cannot be both a Show and a Broadcast. I don't believe this is what you want, but you may find it useful elsewhere in your model.
For more info, see...
Official documentation on inheritance mapping
Ayende's blog post on inheritance mapping
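If you map this with Fluent NHibernate, a table-per-subclass sketch (untested, using the tables above) might look roughly like this:

using FluentNHibernate.Mapping;

public class EpisodeMap : ClassMap<Episode>
{
    public EpisodeMap()
    {
        Table("Episode");
        Id(x => x.Id);
        Map(x => x.NumberInSeries);
    }
}

public class ShowMap : SubclassMap<Show>
{
    public ShowMap()
    {
        Table("Show");
        KeyColumn("Episode_id");  // joined-subclass key back to Episode
        Map(x => x.Title);
    }
}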
2. one-to-one
public class Episode
{
    public virtual int Id { get; set; }
    public virtual int? NumberInSeries { get; set; }
    public virtual Show Show { get; set; }
    public virtual Broadcast Broadcast { get; set; }
}

public class Show
{
    public virtual Episode Episode { get; set; }
    public virtual string Title { get; set; }
}

public class Broadcast
{
    public virtual Episode Episode { get; set; }
    public virtual DateTime InitialAirDate { get; set; }
}
This gives you more control over which tables actually contain a row associated with a given Episode, because you can set episode.Broadcast = null for example. It's also fine to have both Show and Broadcast information for a given Episode.
For more info, see...
Official documentation on one-to-one
Ayende's blog post on one-to-one
3. join
public class Episode
{
    // These properties come from the Episode table...
    public virtual int Id { get; set; }
    public virtual int? NumberInSeries { get; set; }

    // This one comes from the Show table.
    public virtual string Title { get; set; }

    // This one comes from the Broadcast table.
    public virtual DateTime InitialAirDate { get; set; }
}
This is a nice and simple way to represent the data, but you do not get control over whether or not rows are inserted into the Show and Broadcast tables.
For more info, see...
Official documentation on join
Ayende's blog post on join
Since you said, "A single entity type may have fields stored in multiple tables", it sounds to me like join should be able to handle the way you currently have things modeled.
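By way of illustration (again an untested sketch), the join option could be expressed in Fluent NHibernate roughly like this:

using FluentNHibernate.Mapping;

public class EpisodeMap : ClassMap<Episode>
{
    public EpisodeMap()
    {
        Table("Episode");
        Id(x => x.Id);
        Map(x => x.NumberInSeries);

        // Title lives in the Show table, keyed on Episode_id.
        Join("Show", join =>
        {
            join.KeyColumn("Episode_id");
            join.Map(x => x.Title);
        });

        // InitialAirDate lives in the Broadcast table.
        Join("Broadcast", join =>
        {
            join.KeyColumn("Episode_id");
            join.Map(x => x.InitialAirDate);
        });
    }
}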
I'm a total newbie with ORMs and DDD, so please be patient with me. Also, I'm not a native speaker, so the domain lingo will be a little hard to express in English.
I'm developing a system to control lawsuits.
My domain has an Entity called Case.
public class Case
{
    public virtual int Id { get; set; }
    public virtual IList<Client> Clients { get; set; }
    public virtual LawsuitType LawsuitType { get; set; }
}
The LawsuitType is, from what I gathered, a Value Object. It's a simple type; it has only the lawsuit type description. Example: "Divorce", "Child Support", etc. It is only the description that interests me, but I don't want it to be a free-text descriptor. I want to control the options presented to the user, and also do some reports.
So I was thinking of mapping this to the database with a "LawsuitTypes" table. The table would have an int Id and a string descriptor.
Can I accomplish that using ComponentMap? Or have I got things wrong and LawsuitType is an Entity?
Thanks, Luiz Angelo.
Edit:
Using an enum was suggested. But that wouldn't work because it would mean that the LawsuitTypes are set by the developer, and not the user. Some users have the power to add/remove LawsuitTypes, while others don't.
IMHO you should treat LawsuitType as its own entity. Keep in mind that you may want to extend LawsuitType with additional information some day (requirements change very fast sometimes). What comes to mind is a "default" property or something like that... This means additional work of course, but this way you are more flexible for future needs.
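Treated as an entity, a Fluent NHibernate sketch could look roughly like this (class, table, and property names are assumed from the question):

using FluentNHibernate.Mapping;

public class LawsuitType
{
    public virtual int Id { get; set; }
    public virtual string Description { get; set; }
}

public class LawsuitTypeMap : ClassMap<LawsuitType>
{
    public LawsuitTypeMap()
    {
        Table("LawsuitTypes");
        Id(x => x.Id);
        Map(x => x.Description);
    }
}

public class CaseMap : ClassMap<Case>
{
    public CaseMap()
    {
        Id(x => x.Id);
        HasMany(x => x.Clients);
        References(x => x.LawsuitType);  // many-to-one to the LawsuitTypes table
    }
}

Users can then add or remove rows in LawsuitTypes at runtime, which an enum cannot offer.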
If I understand your question correctly, the Description("") attribute and a simple enum should work. More on that here.
public enum LawsuitTypes
{
    Divorce,
    [Description("Child Support")]
    ChildSupport,
    [Description("Some Other Element")]
    SomeOtherElement
}
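If the enum route does fit, the description can be read back with a small reflection helper along these lines (plain C#, not NHibernate-specific):

using System;
using System.ComponentModel;

public static class EnumDescription
{
    // Returns the [Description] text if present, otherwise the member name.
    public static string Of(Enum value)
    {
        var field = value.GetType().GetField(value.ToString());
        var attribute = (DescriptionAttribute)Attribute.GetCustomAttribute(field, typeof(DescriptionAttribute));
        return attribute != null ? attribute.Description : value.ToString();
    }
}

// Usage: EnumDescription.Of(LawsuitTypes.ChildSupport) returns "Child Support".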
Here's the scenario:
I've got an association between "Groups" and "Users", represented by a "UserGroupAssignment" object.
public class UserGroupAssignment
{
    [Key]
    public virtual long Id { get; set; }

    [Association("UserAssignmentToUser", "UserId", "Id", IsForeignKey = true)]
    public virtual User User { get; set; }

    [Association("UserAssignmentToGroup", "GroupId", "Id", IsForeignKey = true)]
    public virtual Group Group { get; set; }

    public virtual bool IsPrimary { get; set; }
    public virtual DateTime? ValidFrom { get; set; }
    public virtual DateTime? ValidTo { get; set; }
}
I have two business logic methods, GetUserAssignmentsForGroups and GetGroupAssignmentsForUsers, that return the assignments with the User and Group properties populated respectively. I.e. GetUserAssignmentsForGroups takes a GroupId and returns the assignments for that Group with the User property populated.
What I want is to expose those two methods as domain query methods like so:
[Query]
public IQueryable<UserGroupAssignment> GetAssignmentsForGroupWithUsers(long groupId)
{
    return this.businessLogic.GetUserAssignmentsForGroups(groupId);
}

[Query]
public IQueryable<UserGroupAssignment> GetAssignmentsForUserWithGroups(long userId)
{
    return this.businessLogic.GetGroupAssignmentsForUsers(userId);
}
My problem is that whilst the business logic methods return the correctly populated Assignments via NHibernate, RIA Services is NOT passing the sub-entities (User or Group) across the wire.
I don't want to use [Include] attributes on the User or Group properties of the UserAssignment class, as I want to minimise the payload over the wire - I don't want to send the group over when I'm only interested in the User of each UserAssignment, for example.
So my question is this:
How do I tell RIA Services to explicitly include User sub-entities in one domain query method and Group sub-entities in the other?
Remember, I'm using NHibernate at the back end and custom query methods in the RIA Services, so can't use the EF-style include in the client query.
Thanks
Joel
You should apply the [Include] attribute in the metadata class. Then create one domain service method for fetching data without properties included, and a separate method for fetching data with properties included.
You might find this thread helpful in understanding how [Include] attribute works.
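A sketch of what that metadata class might look like (type names taken from the question; this assumes the entity class is declared partial, and the exact namespace of IncludeAttribute depends on your WCF RIA Services version):

using System.ComponentModel.DataAnnotations;
using System.ServiceModel.DomainServices.Server;  // IncludeAttribute (WCF RIA Services SP1 namespace)

[MetadataType(typeof(UserGroupAssignment.Metadata))]
public partial class UserGroupAssignment
{
    internal sealed class Metadata
    {
        // Marks the associations as eligible for serialization to the client;
        // each query method then controls what is actually loaded and sent.
        [Include]
        public User User { get; set; }

        [Include]
        public Group Group { get; set; }
    }
}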
Old question, but still interesting. Did you find a solution?
As far as I know of the WCF RIA Services architecture, it isn't so easy.
A quick and dirty way could be to override the Query method, force the enumeration of the IQueryable being returned (I guess you're using LINQ to NHibernate, in which case, good luck), then examine the HttpContext (you're using WCF RIA Services so you must have aspNetCompatibility turned on) and set to null the reference that you don't want to send over the wire (User or Group).
Anyway, this approach forces you to use the [Include] attribute. However, I don't see any reasonable route that avoids its use, and this way allows you to send the entity over the wire only when you need to.
IMO, to totally avoid the use of [Include] you would have to roll out your own serializer server-side and deserializer client-side, or change the UserGroupAssignment entity so that the User property becomes a string containing the serialized User (or Group), which you decide to populate or not in each method.
Please let us know if you have already found a solution; the question is interesting.
I have several XML files and each file contains data of ‘root objects’ which I parse using Linq to XML and then create actual root objects which I persist using NHibernate and the Sharp Architecture repository. I have started to optimise the data insert and managed to add 30,000 objects in about 1 hour and 40 minutes to the database. However, this is still too slow.
I think one bottleneck is the lookup of objects in the database, which requires IO. Objects have to be looked up for reuse.
The root object has several authors:
public virtual IList<Author> Authors { get; set; }
Authors have this structure:
public class Author : Entity
{
    public virtual Initials Initials { get; set; }
    public virtual ForeName ForeName { get; set; }
    public virtual LastName LastName { get; set; }
}
I have achieved a great speed up by using a typed Id (something I wouldn't normally do):
public class LastName : EntityWithTypedId<string>, IHasAssignedId<string>
{
    public LastName()
    {
    }

    public LastName(string id)
    {
        SetAssignedIdTo(id);
    }

    public virtual void SetAssignedIdTo(string assignedId)
    {
        Id = assignedId;
    }
}
Which I look up (and potentially create) like this:
LastName lastName = LastNameRepository.Get(TLastName);
if (lastName == null)
{
    lastName = LastNameRepository.Save(new LastName(TLastName));
    LastNameRepository.DbContext.CommitChanges();
}
Author.LastName = lastName;
I am looking authors up like this:
propertyValues = new Dictionary<string, object>();
propertyValues.Add("Initials", Author.Initials);
propertyValues.Add("ForeName", Author.ForeName);
propertyValues.Add("LastName", Author.LastName);
Author TAuthor = AuthorRepository.FindOne(propertyValues);

if (TAuthor == null)
{
    AuthorRepository.SaveOrUpdate(Author);
    AuthorRepository.DbContext.CommitChanges();
    Root.Authors.Add(Author);
}
else
{
    Root.Authors.Add(TAuthor);
}
Can I improve this? Should I use stored procedures/HQL/pure SQL/ICriteria instead to perform the lookup? Could I use some form of caching to speed up the lookup and reduce IO? The CommitChanges seems to be necessary - or should I wrap everything in a transaction?
I already flush my session etc. every 10 root objects.
Any feedback would be very much welcome. Many thanks in advance.
Best wishes,
Christian
In all honesty I would say that you shouldn't even be using SA/NHibernate for something like this. It's a bulk data import from XML - an ETL tool like SSIS would be a better choice. Even a hand-cranked process on the DB server would work better - step 1, load XML to a table, step 2, do the UPSERT. Incidentally, SQL 2008 introduced the MERGE command for UPSERT operations, which might be of use.
I would also agree with Dan's comment - is it really necessary to treat initials, forename and surname as separate entities? Treating them as simple strings would boost performance. What in your domain model specifies that they are entities in their own right?
If you really must continue using SA/NHibernate, have a read of this:
http://www.lostechies.com/blogs/jimmy_bogard/archive/2010/06/24/bulk-processing-with-nhibernate.aspx
The suggestion in Jimmy's blog about batching SELECTs should help quite a lot. If you plan to process a batch of 250 records at once, do all the SELECTs as a single NH command, process all the data, then do all the updates as another single batch (which I believe your use of EntityWithTypedId and the adonet.batch_size config setting will help achieve)
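As a rough illustration of the batched-SELECT idea (the helper name and the way you reach the ISession are hypothetical; adapt to however Sharp Architecture exposes the session in your project):

using System.Collections.Generic;
using System.Linq;
using NHibernate;
using NHibernate.Criterion;

public static class LastNameBatchLookup
{
    // Instead of one Get() per last name, fetch all last names for a block of
    // ~250 records in a single round trip using a future (deferred) query.
    public static IDictionary<string, LastName> Load(ISession session, IEnumerable<string> names)
    {
        var distinctNames = names.Distinct().ToList();

        return session.CreateCriteria<LastName>()
            .Add(Restrictions.In("Id", distinctNames))
            .Future<LastName>()   // executes together with other pending futures
            .ToDictionary(l => l.Id);
    }
}

Names missing from the returned dictionary are the ones that still need to be created; those inserts can then be flushed together, which is where adonet.batch_size pays off.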
Finally - regarding the statement "which I parse using Linq to XML" - is that really the best way of doing it? I'm guessing that it might be, given the size of your input file, but are you aware of the approach of simply deserializing the XML file into an object graph? SO won't let me post the link to a page describing this, because I haven't earned enough reputation yet - but if you want to read up on it, Google "don't parse that xml" and the first article will explain it.
Hope this helps.
Jon
The first thing I would do is simplify the Author entity, as I don't think you need the Initials, ForeName, and LastName objects as separate entities. I think using plain strings would be more efficient:
public class Author : Entity
{
    public virtual string Initials { get; set; }
    public virtual string ForeName { get; set; }
    public virtual string LastName { get; set; }
}
I'm using S#arpArchitecture (ASP.NET MVC and Fluent NHibernate). I have an entity as follows:
public class User : Entity
{
    public User() { }

    public User(string firstName, string lastName)
    {
        FirstName = firstName;
        LastName = lastName;
    }

    public virtual string FirstName { get; set; }
    public virtual string LastName { get; set; }
    public virtual DateTime? LastUpdate { get; set; }
}
I will call the SaveOrUpdate method on my repository class, which will persist this User object. It works. How would I persist the LastUpdate property automatically with the latest date and time? I could override the SaveOrUpdate method in my repository to always set the LastUpdate property to the current date and time, but that does not seem correct: if nothing has changed in my entity, I don't think NHibernate will persist my object back to the DB, and forcing the setting of this property will always make it persist back.
I only want this LastUpdate property set if something else has changed in my entity.
You should use an NHibernate Interceptor to accomplish this. Override OnFlushDirty in your interceptor implementation to set the LastUpdate property only when something has changed (i.e. your object is dirty).
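A minimal sketch of that override (assuming the property is named LastUpdate as in the entity above):

using System;
using NHibernate;
using NHibernate.Type;

public class LastUpdateInterceptor : EmptyInterceptor
{
    // OnFlushDirty is only invoked for entities NHibernate has already decided are dirty,
    // so LastUpdate gets stamped only when something else actually changed.
    public override bool OnFlushDirty(object entity, object id, object[] currentState,
                                      object[] previousState, string[] propertyNames, IType[] types)
    {
        int index = Array.IndexOf(propertyNames, "LastUpdate");
        if (index == -1)
            return false;

        currentState[index] = DateTime.Now;
        return true;  // tell NHibernate the state array was modified
    }
}

Open your sessions with this interceptor (for example, sessionFactory.OpenSession(new LastUpdateInterceptor())).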
Since you didn't specify your database type, I will take the liberty of adding a solution for people using MySQL / MariaDB:
You can use the following in your fluent mapping class:
Map(y => y.LastUpdate).CustomSqlType("timestamp").Generated.Always();
This creates a MySQL TIMESTAMP column with a default value of CURRENT_TIMESTAMP. Thus, if NHibernate chooses to flush your entity, MySQL will make sure that the column is properly updated.
I'm pretty sure this does not play well with MSSQL, as TIMESTAMP is a different beast in MSSQL (here and here).
I'm going to point to another Stack Overflow question that I think answers my question.