Fluent NHibernate Has Many to Many Convention Questions - fluent-nhibernate

I am new to Fluent NHibernate and NHibernate. I want to write a Fluent NHibernate auto-persistence convention to handle creating the many-to-many mappings for my entities.
This is what I have right now:
using System;
using FluentNHibernate.Conventions;
using FluentNHibernate.Mapping;
namespace Namespace
{
    public class HasManyToManyConvention : IHasManyToManyConvention
    {
        public bool Accept(IManyToManyPart target)
        {
            return true;
        }

        public void Apply(IManyToManyPart target)
        {
            var parentName = target.EntityType.Name;
            var childName = target.ChildType.Name;
            const string tableNameFmt = "{0}To{1}";
            const string keyColumnFmt = "{0}Fk";

            // Order the two entity names so both sides of the association produce the same table name.
            string tableName;
            if (parentName.CompareTo(childName) < 0)
            {
                tableName = String.Format(tableNameFmt, parentName, childName);
            }
            else
            {
                tableName = String.Format(tableNameFmt, childName, parentName);
            }

            target.WithChildKeyColumn(String.Format(keyColumnFmt, childName));
            target.WithParentKeyColumn(String.Format(keyColumnFmt, parentName));
            target.WithTableName(tableName);
            target.Cascade.All();
        }
    }
}
It seems to work, but I feel that there is a better way to do this.
Now my questions:
Do you have a better way to do this?
Do you usually want the Cascade behavior here?
Do I need to worry about something besides making sure both sides of this association end up with the same table name?
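For completeness, this is roughly how the convention gets wired into the automapping setup. This is only a sketch: SomeEntity and the database configuration are placeholders, and the exact fluent calls vary between Fluent NHibernate versions.
// Sketch: registering the convention with an AutoPersistenceModel (placeholder types/config).
var model = AutoMap.AssemblyOf<SomeEntity>()
    .Conventions.Add<HasManyToManyConvention>();

var sessionFactory = Fluently.Configure()
    .Database(MsSqlConfiguration.MsSql2008.ConnectionString("...")) // placeholder connection string
    .Mappings(m => m.AutoMappings.Add(model))
    .BuildSessionFactory();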

Related

JPA - AttributeConverter not working with native query

I have created an AttributeConverter class which converts an enum to its DB value and have overridden the necessary methods.
If I use a JPA query like the one below, the converter is called and I get the correct result.
public List<Driver> findByStatus(DriverStatus status);
But if I use the @Query annotation, the AttributeConverter is not called. I have a more complex query where I need to use a native query with the AttributeConverter, but it is not working for me.
@Query(value = "select * from driver where status=:status", nativeQuery = true)
public List<Driver> findByStatus1(DriverStatus status);
Is there any way to handle this requirement?
Update 1 - below is the Converter code
@Converter(autoApply = true)
public class DriverStatusConverter implements AttributeConverter<DriverStatus, String> {

    @Override
    public String convertToDatabaseColumn(DriverStatus driverStatus) {
        if (driverStatus == null) {
            return null;
        }
        System.err.println("from converter" + driverStatus.getCode());
        return driverStatus.getCode();
    }

    @Override
    public DriverStatus convertToEntityAttribute(String code) {
        if (code == null) {
            return null;
        }
        return Stream.of(DriverStatus.values())
                .filter(c -> c.getCode().equals(code))
                .findFirst()
                .orElseThrow(IllegalArgumentException::new);
    }
}
I faced a similar issue and wasn't able to find a satisfying solution, so I had to resort to an ugly hack. The repository method signature was changed like so:
public List<Driver> findByStatus1(Integer status);
...and was called like so:
repository.findByStatus1(DriverStatus.YOUR_DESIRED_STATUS.getCode());
Yes, it's ugly and kind of defeats the purpose of AttributeConverter, but at least it works. In my particular case I had to resort to such measures with just one native query. AttributeConverter still works for all the other JPA queries.
I just came across the same problem and used the following to work around it.
In DriverStatusConverter, move the logic from convertToEntityAttribute into a public static method:
@Override
public DriverStatus convertToEntityAttribute(String code) {
    return convertStringToEnum(code);
}

public static DriverStatus convertStringToEnum(String code) {
    if (code == null) {
        return null;
    }
    return Stream.of(DriverStatus.values())
            .filter(c -> c.getCode().equals(code))
            .findFirst()
            .orElseThrow(IllegalArgumentException::new);
}
Create a new DriverDTO interface for the native query result:
public interface DriverDTO {
    public Integer getId();
    public String getName();
    public String getStatusCode();

    default DriverStatus getStatus() {
        return DriverStatusConverter.convertStringToEnum(getStatusCode());
    }
}
In DriverRepository, have findByStatusCode() select the columns in the query and return the DriverDTO interface:
#Query(value = "select id, name, status as statusCode from driver WHERE status=:statusCode", nativeQuery = true)
public List<DriverDTO> findByStatusCode(String statusCode);
Remove the nativeQuery = true and use the equivalent JPQL query instead:
@Query("select d from Driver d where d.status = :status")
public List<Driver> findByStatus1(DriverStatus status);
Alternatively, you can try using the EntityManager like so (this is just sample code):
TypedQuery<Trip> q = entityManager.createQuery("SELECT t FROM Trip t WHERE t.vehicle = :v", Trip.class);
q.setParameter("v", Vehicle.PLANE);
List<Trip> trips = q.getResultList();
Reference -
https://thorben-janssen.com/jpa-21-type-converter-better-way-to/

How can I use MEF to manage interdependent modules?

I found this question difficult to express (particularly in title form), so please bear with me.
I have an application that I am continually modifying to do different things. It seems like MEF might be a good way to manage the different pieces of functionality. Broadly speaking, there are three sections of the application that form a pipeline of sorts:
Acquisition
Transformation
Expression
In its simplest form, I can express each of these stages as an interface (IAcquisition etc.). The problems start when I want to use acquisition components that provide richer data than standard. I want to design modules that use this richer data, but I can't rely on it being there.
I could, of course, add all of the data to the interface specification. I could deal with poorer data sources by throwing an exception or returning a null value. This seems a long way from ideal.
I'd prefer to do the MEF binding in three stages, such that modules are offered to the user only if they are compatible with those selected previously.
So my question: Can I specify metadata which restricts the set of available imports?
An example:
Acquisition1 offers BasicData only
Acquisition2 offers BasicData and AdvancedData
Transformation1 requires BasicData
Transformation2 requires BasicData and AdvancedData
Acquisition module is selected first.
If Acquisition1 is selected, don't offer Transformation 2, otherwise offer both.
Is this possible? If so, how?
Your question suggests a structure like this:
public class BasicData
{
public string Basic { get; set; } // example data
}
public class AdvancedData : BasicData
{
public string Advanced { get; set; } // example data
}
Now you have your acquisition, transformation and expression components. You want to be able to deal with different kinds of data, so they're generic:
public interface IAcquisition<out TDataKind>
{
TDataKind Acquire();
}
public interface ITransformation<TDataKind>
{
TDataKind Transform(TDataKind data);
}
public interface IExpression<in TDataKind>
{
void Express(TDataKind data);
}
And now you want to build a pipeline out of them that looks like this:
IExpression.Express(ITransformation.Transform(IAcquisition.Acquire()));
So let's start building a pipeline builder:
using System;
using System.Collections.Generic;
using System.ComponentModel.Composition;
using System.ComponentModel.Composition.Hosting;
using System.ComponentModel.Composition.Primitives;
using System.Linq;
using System.Linq.Expressions;
// namespace ...
public static class PipelineBuilder
{
private static readonly string AcquisitionIdentity =
AttributedModelServices.GetTypeIdentity(typeof(IAcquisition<>));
private static readonly string TransformationIdentity =
AttributedModelServices.GetTypeIdentity(typeof(ITransformation<>));
private static readonly string ExpressionIdentity =
AttributedModelServices.GetTypeIdentity(typeof(IExpression<>));
public static Action BuildPipeline(ComposablePartCatalog catalog,
Func<IEnumerable<string>, int> acquisitionSelector,
Func<IEnumerable<string>, int> transformationSelector,
Func<IEnumerable<string>, int> expressionSelector)
{
var container = new CompositionContainer(catalog);
The class holds MEF type identities for your three contract interfaces. We'll need those later to identify the correct exports. Our BuildPipeline method returns an Action. That is going to be the pipeline, so we can just do pipeline(). It takes a ComposablePartCatalog and three Funcs (to select an export). That way, we can keep all the dirty work inside this class. Then we start by creating a CompositionContainer.
Now we have to build ImportDefinitions, first for the acquisition component:
var aImportDef = new ImportDefinition(def => (def.ContractName == AcquisitionIdentity), null, ImportCardinality.ZeroOrMore, true, false);
This ImportDefinition simply filters out all exports of the IAcquisition<> interface. Now we can give it to the container:
var aExports = container.GetExports(aImportDef).ToArray();
aExports now holds all IAcquisition<> exports in the catalog. So let's run the selector on this:
var selectedAExport = aExports[acquisitionSelector(aExports.Select(export => export.Metadata["Name"] as string))];
And there we have our acquisition component:
var acquisition = selectedAExport.Value;
var acquisitionDataKind = (Type)selectedAExport.Metadata["DataKind"];
Now we're going to do the same for the transformation and the expression components, but with one slight difference: The ImportDefinition is going to ensure that each component can handle the output of the previous component.
var tImportDef = new ImportDefinition(def => (def.ContractName == TransformationIdentity) && ((Type)def.Metadata["DataKind"]).IsAssignableFrom(acquisitionDataKind),
null, ImportCardinality.ZeroOrMore, true, false);
var tExports = container.GetExports(tImportDef).ToArray();
var selectedTExport = tExports[transformationSelector(tExports.Select(export => export.Metadata["Name"] as string))];
var transformation = selectedTExport.Value;
var transformationDataKind = (Type)selectedTExport.Metadata["DataKind"];
var eImportDef = new ImportDefinition(def => (def.ContractName == ExpressionIdentity) && ((Type)def.Metadata["DataKind"]).IsAssignableFrom(transformationDataKind),
null, ImportCardinality.ZeroOrMore, true, false);
var eExports = container.GetExports(eImportDef).ToArray();
var selectedEExport = eExports[expressionSelector(eExports.Select(export => export.Metadata["Name"] as string))];
var expression = selectedEExport.Value;
var expressionDataKind = (Type)selectedEExport.Metadata["DataKind"];
And now we can wire it all up in an expression tree:
var acquired = Expression.Call(Expression.Constant(acquisition), typeof(IAcquisition<>).MakeGenericType(acquisitionDataKind).GetMethod("Acquire"));
var transformed = Expression.Call(Expression.Constant(transformation), typeof(ITransformation<>).MakeGenericType(transformationDataKind).GetMethod("Transform"), acquired);
var expressed = Expression.Call(Expression.Constant(expression), typeof(IExpression<>).MakeGenericType(expressionDataKind).GetMethod("Express"), transformed);
return Expression.Lambda<Action>(expressed).Compile();
}
}
And that's it! A simple example application would look like this:
[Export(typeof(IAcquisition<>))]
[ExportMetadata("DataKind", typeof(BasicData))]
[ExportMetadata("Name", "Basic acquisition")]
public class Acquisition1 : IAcquisition<BasicData>
{
public BasicData Acquire()
{
return new BasicData { Basic = "Acquisition1" };
}
}
[Export(typeof(IAcquisition<>))]
[ExportMetadata("DataKind", typeof(AdvancedData))]
[ExportMetadata("Name", "Advanced acquisition")]
public class Acquisition2 : IAcquisition<AdvancedData>
{
public AdvancedData Acquire()
{
return new AdvancedData { Advanced = "Acquisition2A", Basic = "Acquisition2B" };
}
}
[Export(typeof(ITransformation<>))]
[ExportMetadata("DataKind", typeof(BasicData))]
[ExportMetadata("Name", "Basic transformation")]
public class Transformation1 : ITransformation<BasicData>
{
public BasicData Transform(BasicData data)
{
data.Basic += " - Transformed1";
return data;
}
}
[Export(typeof(ITransformation<>))]
[ExportMetadata("DataKind", typeof(AdvancedData))]
[ExportMetadata("Name", "Advanced transformation")]
public class Transformation2 : ITransformation<AdvancedData>
{
public AdvancedData Transform(AdvancedData data)
{
data.Basic += " - Transformed2";
data.Advanced += " - Transformed2";
return data;
}
}
[Export(typeof(IExpression<>))]
[ExportMetadata("DataKind", typeof(BasicData))]
[ExportMetadata("Name", "Basic expression")]
public class Expression1 : IExpression<BasicData>
{
public void Express(BasicData data)
{
Console.WriteLine("Expression1: {0}", data.Basic);
}
}
[Export(typeof(IExpression<>))]
[ExportMetadata("DataKind", typeof(AdvancedData))]
[ExportMetadata("Name", "Advanced expression")]
public class Expression2 : IExpression<AdvancedData>
{
public void Express(AdvancedData data)
{
Console.WriteLine("Expression2: ({0}) - ({1})", data.Basic, data.Advanced);
}
}
class Program
{
static void Main(string[] args)
{
var pipeline = PipelineBuilder.BuildPipeline(new AssemblyCatalog(typeof(Program).Assembly), StringSelector, StringSelector, StringSelector);
pipeline();
}
static int StringSelector(IEnumerable<string> strings)
{
int i = 0;
foreach (var item in strings)
Console.WriteLine("[{0}] {1}", i++, item);
return int.Parse(Console.ReadLine());
}
}

NHibernate vs petapoco loading mechanism

I am having difficulty understanding the NHibernate and PetaPoco loading mechanisms, so I did a test to compare how both behave when running a query.
My class is as follows:
UserTest.cs with the following properties:
private string name;
private int id;
private int? customerId; // nullable to match the CustomerID property below

public int ID
{
    get { return id; }
    set { id = value; }
}

public string Name
{
    get { return name; }
    set { name = value; }
}

public int? CustomerID
{
    get { return customerId; }
    set
    {
        if (value != customerId)
        {
            customerId = value;
            if (this.ID > 0)
            {
                DoSomeOtherWork();
            }
        }
    }
}
When I do a User.Load in NHibernate, I have observed that DoSomeOtherWork is never called, whereas in PetaPoco, when I load a User with a query such as Connection.db.Fetch<UserTest>(...) or Connection.db.Query<UserTest>(...), I can see that DoSomeOtherWork is called.
Why is that so?
Is there a way to avoid calling DoSomeOtherWork when using PetaPoco, so that it behaves the same way as NHibernate? I don't want to use PetaPoco.Ignore, as I need to get and set the CustomerID.
PetaPoco is a micro-ORM (much lighter than NHibernate) and materializes your POCO object by setting its properties when you fetch the record. There is no other magic than that, so the answer is:
No, you can't avoid calling the setter of the property.
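If the side effect is the real concern, one possible workaround (a sketch, assuming you control the entity class) is to keep the setter side-effect free and run the extra work through an explicit method instead:
public class UserTest
{
    private int? customerId;

    public int ID { get; set; }
    public string Name { get; set; }

    // Plain setter: safe for PetaPoco materialization, no side effects.
    public int? CustomerID
    {
        get { return customerId; }
        set { customerId = value; }
    }

    // Call this explicitly when a "real" change happens in application code.
    public void ChangeCustomer(int? newCustomerId)
    {
        if (newCustomerId != customerId)
        {
            customerId = newCustomerId;
            if (ID > 0)
            {
                DoSomeOtherWork();
            }
        }
    }

    private void DoSomeOtherWork() { /* ... */ }
}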

Many to many relationship using Fluent Nhibernate Automapping

We are facing a problem applying a many-to-many relationship using Fluent NHibernate automapping.
The simplified form of the domain model is as follows:
public class Group
{
    private readonly IList<Recipient> _recipients = new List<Recipient>();

    public virtual IList<Recipient> Recipients
    {
        get { return _recipients; }
    }
}

public class Recipient
{
    private readonly IList<Group> _groups = new List<Group>();

    public virtual IList<Group> Groups
    {
        get { return _groups; }
    }
}
As the code above shows, Group and Recipient have a many-to-many relationship.
We are using the automapping feature of Fluent NHibernate to map our domain model to the database, so we needed to use a convention for this mapping.
The following is the code we used for the many-to-many convention:
public class ManyToManyConvention : IHasManyToManyConvention
{
#region IConvention<IManyToManyCollectionInspector,IManyToManyCollectionInstance> Members
public void Apply(FluentNHibernate.Conventions.Instances.IManyToManyCollectionInstance instance)
{
if (instance.OtherSide == null)
{
instance.Table(
string.Format(
"{0}To{1}",
instance.EntityType.Name + "_Id",
instance.ChildType.Name + "_Id"));
}
else
{
instance.Inverse();
}
instance.Cascade.All();
}
#endregion
}
I found this solution here :
http://blog.vuscode.com/malovicn/archive/2009/11/04/fluent-nhibernate-samples-auto-mapping-part-12.aspx#Many%20to%20Many%20convention
But in the above code, while debugging, instance.OtherSide is not null for both Recipients -> Groups and Groups -> Recipients. The assumption was that the first time instance.OtherSide would be non-null and the second time it would be null, since the relationship is applied on one side and we would just apply Inverse to the other.
So it's creating two mapping tables which are identical.
Having two tables with the same schema is a burden on the database. Also, when I try to save our domain model using the many-to-many relationship, only one side is saved, i.e. it saves Recipients in the Groups but does not save Groups in Recipients. In the database there is an entry in only one mapping table, not in both.
So, the question is: are we doing the right thing? If not, how should we do it?
You could use Inverse itself as the criterion:
public void Apply(IManyToManyCollectionInstance instance)
{
Debug.Assert(instance.OtherSide != null);
// Hack: the cast is necessary because the compiler tries to take the method and not the property
if (((IManyToManyCollectionInspector)instance.OtherSide).Inverse)
{
instance.Table(
string.Format(
"{0}To{1}",
instance.EntityType.Name + "_Id",
instance.ChildType.Name + "_Id"));
}
else
{
instance.Inverse();
}
instance.Cascade.All();
}

JSON.NET and nHibernate Lazy Loading of Collections

Is anybody using JSON.NET with NHibernate? I notice that I am getting errors when I try to load a class that has child collections.
I was facing the same problem, so I tried to use @Liedman's code, but GetSerializableMembers() was never called for the proxied reference.
I found another method to override:
public class NHibernateContractResolver : DefaultContractResolver
{
protected override JsonContract CreateContract(Type objectType)
{
if (typeof(NHibernate.Proxy.INHibernateProxy).IsAssignableFrom(objectType))
return base.CreateContract(objectType.BaseType);
else
return base.CreateContract(objectType);
}
}
We had this exact problem, which was solved with inspiration from Handcraftsman's response here.
The problem arises from JSON.NET being confused about how to serialize NHibernate's proxy classes. Solution: serialize the proxy instances like their base class.
A simplified version of Handcraftsman's code goes like this:
public class NHibernateContractResolver : DefaultContractResolver {
protected override List<MemberInfo> GetSerializableMembers(Type objectType) {
if (typeof(INHibernateProxy).IsAssignableFrom(objectType)) {
return base.GetSerializableMembers(objectType.BaseType);
} else {
return base.GetSerializableMembers(objectType);
}
}
}
IMHO, this code has the advantage of still relying on JSON.NET's default behaviour regarding custom attributes, etc. (and the code is a lot shorter!).
It is used like this
var serializer = new JsonSerializer{
ReferenceLoopHandling = ReferenceLoopHandling.Ignore,
ContractResolver = new NHibernateContractResolver()
};
StringWriter stringWriter = new StringWriter();
JsonWriter jsonWriter = new Newtonsoft.Json.JsonTextWriter(stringWriter);
serializer.Serialize(jsonWriter, objectToSerialize);
string serializedObject = stringWriter.ToString();
Note: This code was written for and used with NHibernate 2.1. As some commenters have pointed out, it doesn't work out of the box with later versions of NHibernate; you will have to make some adjustments. I will try to update the code if I ever have to do this with later versions of NHibernate.
I use NHibernate with Json.NET and noticed that I was getting inexplicable "__interceptors" properties in my serialized objects. A Google search turned up this excellent solution by Lee Henson, which I adapted to work with Json.NET 3.5 Release 5 as follows.
public class NHibernateContractResolver : DefaultContractResolver
{
private static readonly MemberInfo[] NHibernateProxyInterfaceMembers = typeof(INHibernateProxy).GetMembers();
protected override List<MemberInfo> GetSerializableMembers(Type objectType)
{
var members = base.GetSerializableMembers(objectType);
members.RemoveAll(memberInfo =>
(IsMemberPartOfNHibernateProxyInterface(memberInfo)) ||
(IsMemberDynamicProxyMixin(memberInfo)) ||
(IsMemberMarkedWithIgnoreAttribute(memberInfo, objectType)) ||
(IsMemberInheritedFromProxySuperclass(memberInfo, objectType)));
var actualMemberInfos = new List<MemberInfo>();
foreach (var memberInfo in members)
{
var infos = memberInfo.DeclaringType.BaseType.GetMember(memberInfo.Name);
actualMemberInfos.Add(infos.Length == 0 ? memberInfo : infos[0]);
}
return actualMemberInfos;
}
private static bool IsMemberDynamicProxyMixin(MemberInfo memberInfo)
{
return memberInfo.Name == "__interceptors";
}
private static bool IsMemberInheritedFromProxySuperclass(MemberInfo memberInfo, Type objectType)
{
return memberInfo.DeclaringType.Assembly == typeof(INHibernateProxy).Assembly;
}
private static bool IsMemberMarkedWithIgnoreAttribute(MemberInfo memberInfo, Type objectType)
{
var infos = typeof(INHibernateProxy).IsAssignableFrom(objectType)
? objectType.BaseType.GetMember(memberInfo.Name)
: objectType.GetMember(memberInfo.Name);
return infos[0].GetCustomAttributes(typeof(JsonIgnoreAttribute), true).Length > 0;
}
private static bool IsMemberPartOfNHibernateProxyInterface(MemberInfo memberInfo)
{
return Array.Exists(NHibernateProxyInterfaceMembers, mi => memberInfo.Name == mi.Name);
}
}
To use it, just put an instance in the ContractResolver property of your JsonSerializer. The circular dependency problem noted by jishi can be resolved by setting the ReferenceLoopHandling property to ReferenceLoopHandling.Ignore. Here's an extension method that can be used to serialize objects using Json.NET:
public static void SerializeToJsonFile<T>(this T itemToSerialize, string filePath)
{
using (StreamWriter streamWriter = new StreamWriter(filePath))
{
using (JsonWriter jsonWriter = new JsonTextWriter(streamWriter))
{
jsonWriter.Formatting = Formatting.Indented;
JsonSerializer serializer = new JsonSerializer
{
NullValueHandling = NullValueHandling.Ignore,
ReferenceLoopHandling = ReferenceLoopHandling.Ignore,
ContractResolver = new NHibernateContractResolver(),
};
serializer.Serialize(jsonWriter, itemToSerialize);
}
}
}
Are you getting a circular dependency error? How do you exclude objects from serialization?
Since lazy loading generates proxy objects, any attributes your class members have will be lost. I ran into the same issue with the Newtonsoft JSON serializer, since the proxy object no longer had the [JsonIgnore] attributes.
You will probably want to eager load most of the object so that it can be serialized:
ICriteria ic = _session.CreateCriteria(typeof(Person));
ic.Add(Restrictions.Eq("Id", id));
if (fetchEager)
{
ic.SetFetchMode("Person", FetchMode.Eager);
}
A nice way to do this is to add a bool parameter (bool fetchEager) to your data provider method.
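A sketch of what such a data provider method might look like (the GetPerson name and the "Addresses" collection path are assumptions for illustration):
public Person GetPerson(int id, bool fetchEager)
{
    ICriteria ic = _session.CreateCriteria(typeof(Person));
    ic.Add(Restrictions.Eq("Id", id));
    if (fetchEager)
    {
        // Eagerly fetch the collection(s) you intend to serialize.
        ic.SetFetchMode("Addresses", FetchMode.Eager);
    }
    return ic.UniqueResult<Person>();
}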
I'd say this is a design problem. Because NHibernate talks to the database under the hood and puts proxies in the middle, it is not good for the transparency of your application to serialize the entities directly (and as you can see, Json.NET does not like them at all).
You should not serialize the entities themselves; instead, convert them into "view" objects, or POCO or DTO objects (whatever you want to call them), and then serialize those.
The difference is that while an NH entity may have proxies, lazy attributes, etc., view objects are simple objects with only primitives, which are serializable by default.
How to manage FKs?
My personal rule is:
Entity level: a Person class with an associated Gender class.
View level: a Person view with GenderId and GenderName properties.
This means that you need to expand your properties into primitives when converting to view objects. This way your JSON objects are also simpler and easier to handle.
When you need to push the changes back to the DB, in my case I use AutoMapper and write a ValueResolver class which can convert the incoming Guid back into the Gender object.
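A minimal sketch of that view-object conversion (the PersonView shape and property names are illustrative assumptions, not part of the original answer):
// Simple view/DTO object: only primitives, safe to serialize.
public class PersonView
{
    public int Id { get; set; }
    public string Name { get; set; }
    public Guid GenderId { get; set; }
    public string GenderName { get; set; }
}

// Manual mapping from the NHibernate entity; AutoMapper can do the same job.
public static PersonView ToView(Person person)
{
    return new PersonView
    {
        Id = person.Id,
        Name = person.Name,
        GenderId = person.Gender.Id,
        GenderName = person.Gender.Name
    };
}

// Serialize the view, not the entity, so no NHibernate proxies ever reach JSON.NET.
// string json = JsonConvert.SerializeObject(ToView(person));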
UPDATE: Check http://blog.andrewawhitaker.com/blog/2014/06/19/queryover-series-part-4-transforming/ for a way to get the view directly from NH (AliasToBean). This would be a boost on the DB side.
The problem can happen when NHibernate wraps the nested collection properties in a PersistentGenericBag<> type.
The GetSerializableMembers and CreateContract overrides cannot detect that these nested collection properties are "proxied". One way to resolve this is to override the CreateProperty method. The trick is to get the value from the property using reflection and test whether the type is of PersistentGenericBag. This method also has the ability to filter any properties that generated exceptions.
public class NHibernateContractResolver : DefaultContractResolver
{
protected override JsonProperty CreateProperty(MemberInfo member, MemberSerialization memberSerialization)
{
JsonProperty property = base.CreateProperty(member, memberSerialization);
property.ShouldSerialize = instance =>
{
try
{
PropertyInfo prop = (PropertyInfo)member;
if (prop.CanRead)
{
var value = prop.GetValue(instance, null);
if (value != null && typeof(NHibernate.Collection.Generic.PersistentGenericBag<>).IsSubclassOfRawGeneric(value.GetType()))
return false;
return true;
}
}
catch
{ }
return false;
};
return property;
}
}
The IsSubclassOfRawGeneric extension used above:
public static class TypeExtensions
{
public static bool IsSubclassOfRawGeneric(this Type generic, Type? toCheck)
{
while (toCheck != null && toCheck != typeof(object))
{
var cur = toCheck.IsGenericType ? toCheck.GetGenericTypeDefinition() : toCheck;
if (generic == cur)
{
return true;
}
toCheck = toCheck?.BaseType;
}
return false;
}
}
If you serialize objects that contain NHibernate proxy classes you might end up downloading the whole database, because once the property is accessed NHibernate would trigger a request to the database.
I've just implemented a Unit of Work for NHibernate, NHUnit, which fixes two of the most annoying issues with NHibernate: proxy classes and Cartesian products when using fetch.
How would you use this?
var customer = await _dbContext.Customers.Get(customerId) //returns a wrapper to configure the query
.Include(c => c.Addresses.Single().Country, //include Addresses and Country
c => c.PhoneNumbers.Single().PhoneNumberType) //include all PhoneNumbers with PhoneNumberType
.Unproxy() //instructs the framework to strip all the proxy classes when the Value is returned
.Deferred() //instructs the framework to delay execution (future)
.ValueAsync(token); //this is where all deferred queries get executed
The above code is basically configuring a query: return a customer by id with multiple child objects which should be executed with other queries (futures) and the returned result should be stripped of NHibernate proxies. The query gets executed when ValueAsync is called.
NHUnit determines whether it should join with the main query, create new future queries, or make use of batch fetching.
There is a simple example project on Github to show you how to use NHUnit package. If others are interested in this project I will invest more time to make it better.
This is what I use:
Have a marker interface and implement it on your entities; in my case an empty IEntity.
We will use the marker interface to detect NHibernate entity types in the contract resolver.
public class CustomerEntity : IEntity { ... }
Create a custom contract resolver for JSON.NET
public class NHibernateProxyJsonValueProvider : IValueProvider {
private readonly IValueProvider _valueProvider;
public NHibernateProxyJsonValueProvider(IValueProvider valueProvider)
{
_valueProvider = valueProvider;
}
public void SetValue(object target, object value)
{
_valueProvider.SetValue(target, value);
}
private static (bool isProxy, bool isInitialized) GetProxy(object proxy)
{
// this is pretty much what NHibernateUtil.IsInitialized() does.
switch (proxy)
{
case INHibernateProxy hibernateProxy:
return (true, !hibernateProxy.HibernateLazyInitializer.IsUninitialized);
case ILazyInitializedCollection initializedCollection:
return (true, initializedCollection.WasInitialized);
case IPersistentCollection persistentCollection:
return (true, persistentCollection.WasInitialized);
default:
return (false, false);
}
}
public object GetValue(object target)
{
object value = _valueProvider.GetValue(target);
(bool isProxy, bool isInitialized) = GetProxy(value);
if (isProxy)
{
if (isInitialized)
{
return value;
}
if (value is IEnumerable)
{
return Enumerable.Empty<object>();
}
return null;
}
return value;
}
}
public class NHibernateContractResolver : CamelCasePropertyNamesContractResolver {
protected override JsonContract CreateContract(Type objectType)
{
if (objectType.IsAssignableTo(typeof(IEntity)))
{
return base.CreateObjectContract(objectType);
}
return base.CreateContract(objectType);
}
protected override JsonProperty CreateProperty(MemberInfo member, MemberSerialization memberSerialization)
{
JsonProperty property = base.CreateProperty(member, memberSerialization);
property.ValueProvider = new NHibernateProxyJsonValueProvider(property.ValueProvider);
return property;
}
}
Normal uninitialized lazy loaded properties will result in null in the json output.
Collection uninitialized lazy loaded properties will result in an [] empty array in json.
So for a lazy loaded property to appear in the json output you need to eagerly load it in the query or in code before serialization.
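For example, here is a sketch using NHibernate's LINQ Fetch to initialize a collection before serializing (the Addresses collection on CustomerEntity is an assumption for illustration):
using System.Linq;
using NHibernate.Linq;
using Newtonsoft.Json;

// Eagerly initialize Addresses so it serializes as data rather than an empty array.
var customers = session.Query<CustomerEntity>()
    .Fetch(c => c.Addresses)
    .ToList();

string json = JsonConvert.SerializeObject(customers, new JsonSerializerSettings
{
    ContractResolver = new NHibernateContractResolver()
});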
Usage:
JsonConvert.SerializeObject(entityToSerialize, new JsonSerializerSettings() {
ContractResolver = new NHibernateContractResolver()
});
Or globally in the ASP.NET Core Startup class:
services.AddControllers().AddNewtonsoftJson(options =>
{
    options.SerializerSettings.ContractResolver = new NHibernateContractResolver();
});
Using:
.NET 5.0
NHibernate 5.3.8
JSON.NET latest via ASP.NET Core