Can I cache entities with non-mapped properties in NHibernate's 2nd level cache? - nhibernate

Hi, I have set up my SessionFactory to cache entities and queries:
private ISessionFactory CreateSessionFactory()
{
    var cfg = new Configuration()
        .Proxy(properties => properties.ProxyFactoryFactory<DefaultProxyFactoryFactory>())
        .DataBaseIntegration(properties =>
        {
            properties.Driver<SqlClientDriver>();
            properties.ConnectionStringName = this.namedConnection;
            properties.Dialect<MsSql2005Dialect>();
        })
        .AddAssembly(this.resourceAssembly)
        .Cache(properties =>
        {
            properties.UseQueryCache = true;
            properties.Provider<SysCacheProvider>();
            properties.DefaultExpiration = 3600;
        });
    cfg.AddMapping(this.DomainMapping);
    new SchemaUpdate(cfg).Execute(true, true);
    return cfg.BuildSessionFactory();
}
This is my user mapping:
public class UserMapping : EntityMapping<Guid, User>
{
    public UserMapping()
    {
        this.Table("USERS");
        this.Property(
            x => x.CorpId,
            mapper => mapper.Column(c =>
            {
                c.Name("CorporateId");
                c.UniqueKey("UKUserCorporateId");
                c.NotNullable(true);
            }));
        this.Set(
            x => x.Desks,
            mapper =>
            {
                mapper.Table("DESKS2USERS");
                mapper.Key(km => km.Column("UserId"));
                mapper.Inverse(false);
                mapper.Cascade(Cascade.All | Cascade.DeleteOrphans | Cascade.Remove);
            },
            rel => rel.ManyToMany(mapper => mapper.Column("DeskId")));
        this.Cache(mapper =>
        {
            mapper.Usage(CacheUsage.ReadWrite);
            mapper.Include(CacheInclude.All);
        });
    }
}
What I want to do is get a user (or query several users), add information to the domain object, and cache the updated object.
public class User : Entity<Guid>, IUser
{
    public virtual string CorpId { get; set; }
    public virtual ISet<Desk> Desks { get; set; }
    public virtual MailAddress EmailAddress { get; set; }
    public virtual string Name
    {
        get
        {
            return string.Format(CultureInfo.CurrentCulture, "{0}, {1}", this.SurName, this.GivenName);
        }
    }
    public virtual string GivenName { get; set; }
    public virtual string SurName { get; set; }
}
Something like this:
var users = this.session.Query<User>().Cacheable().ToList();
if (users.Any(user => user.EmailAddress == null))
{
    UserEditor.UpdateThroughActiveDirectoryData(users);
}
return this.View(new UserViewModel { Users = users.OrderBy(entity => entity.Name) });
or this:
var user = this.session.Get<User>(id);
if (user.EmailAddress == null)
{
    UserEditor.UpdateThroughActiveDirectoryData(user);
}
return this.View(user);
The UpdateThroughActiveDirectoryData methods work, but they are executed every time I get data from the cache; the updated entities do not keep the additional data. Is there a way to also store this data in NHibernate's 2nd level cache?

NHibernate doesn't cache the entire entity in the second level cache; it caches only the state/data from the mapped properties. You can read more about it here: http://ayende.com/blog/3112/nhibernate-and-the-second-level-cache-tips
There's an interesting discussion in comments of that post that explains this a little further:
Frans Bouma: Objects need to be serializable, are they not? As we're talking about multiple appdomains. I wonder what's more efficient: relying on the cache of the db server or transporting objects back/forth using serialization layers.
Ayende Rahien: No, they don't need that. This is because NHibernate doesn't save the entity in the cache. Doing so would open you to race conditions. NHibernate saves the entity data alone, which is usually composed of primitive data (that is what the DB can store, after all). In general, it is more efficient to hit a cache server, because those are very easily scalable to high degrees, and there is no I/O involved.
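The practical consequence for the question above: the Active Directory data will only survive cache round trips if NHibernate maps it. A minimal sketch of that workaround, assuming you can add an EmailAddress column to USERS and are willing to store the address as a mapped string (the Email property and column name here are placeholders, not part of the original model); since CreateSessionFactory runs SchemaUpdate, the new column would be created automatically:
// Hypothetical addition to User: a mapped backing property, so the value
// becomes part of the entity state the 2nd-level cache stores.
public virtual string Email { get; set; }

// Hypothetical addition to UserMapping:
this.Property(
    x => x.Email,
    mapper => mapper.Column(c =>
    {
        c.Name("EmailAddress");
        c.NotNullable(false);
    }));
After UserEditor.UpdateThroughActiveDirectoryData fills the property and the session flushes, the value is persisted and travels with the cached entity state, so the lookup no longer has to run on every cache hit.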

Related

DDD - updating nested collection of value objects throws NHibernate exception

TLDR version: I'm having trouble getting my DDD domain model to work with NHibernate. If my value object itself contains a collection of value objects, I can't assign a new value without getting an NHibernate exception, and want to know what the best practice is in this situation.
Longer version:
Say I have an entity which contains a value object as a property, ValueObjectA, which itself contains a set of different value objects of type ValueObjectB.
ValueObjectB only exists meaningfully as a property of ValueObjectA, i.e. if myEntity.ValueObjectA == null, it doesn't make sense for ValueObjectB to exist either.
I've written some example code to illustrate what I mean, with simplifications for brevity.
public class Entity
{
    public int Id { get; private set; }
    public ValueObjectA ValueObjectA { get; set; }
    // Constructor: public Entity(ValueObjectA valueObjectA)
}
public class ValueObjectA : IEquatable<ValueObjectA>
{
    public string X { get; private set; }
    public ISet<ValueObjectB> ValueObjectBs { get; private set; }
    // Constructor: public ValueObjectA(string x, ISet<ValueObjectB> valueObjectBs)
    // Implementation of Equals/GetHashCode
}
public class ValueObjectB : IEquatable<ValueObjectB>
{
    public int Y { get; private set; }
    public int Z { get; private set; }
    // Constructor: public ValueObjectB(int y, int z)
    // Implementation of Equals/GetHashCode
}
I have a corresponding mapping class using mapping by code:
public class EntityMap : ClassMapping<Entity>
{
    public EntityMap()
    {
        Table("Entity");
        Id(x => x.Id, map => map.Generator(Generators.Identity));
        Component(x => x.ValueObjectA, c =>
        {
            c.Property(x => x.X);
            // Component relation is equivalent to <composite-element> in xml mappings
            c.Set(x => x.ValueObjectBs, map =>
            {
                map.Table("ValueObjectB");
                map.Inverse(true);
                map.Cascade(Cascade.All | Cascade.DeleteOrphans);
                map.Key(k => k.Column("Id"));
            }, r => r.Component(ce =>
            {
                ce.Property(x => x.Y);
                ce.Property(x => x.Z);
            }));
        });
    }
}
The properties of ValueObjectA are mapped to the Entity table, but the properties of ValueObjectA.ValueObjectB are mapped to another table, since it is a one-to-many relationship. When a ValueObjectB is removed, I want that row to be deleted in the ValueObjectB table.
Since value objects are immutable, when I change the properties of entity.ValueObjectA, I should create a new instance of ValueObjectA. The problem is that the set of ValueObjectBs is a reference type, so when I try to save the entity with a different ValueObjectA, NHibernate will throw an exception because the original set that NHibernate is tracking is no longer referenced:
A collection with cascade="all-delete-orphan" was no longer referenced
by the owning entity instance.
Consider the following code:
var valueObjectBs_1 = new HashSet<ValueObjectB>
{
    new ValueObjectB(1, 2),
    new ValueObjectB(3, 4)
};
var valueObjectA_1 = new ValueObjectA("first", valueObjectBs_1);
var entity = new Entity(valueObjectA_1);
// Save entity, reload entity
var valueObjectBs_2 = new HashSet<ValueObjectB>
{
    new ValueObjectB(1, 2)
};
var valueObjectA_2 = new ValueObjectA("second", valueObjectBs_2);
entity.ValueObjectA = valueObjectA_2;
// Save entity again
// NHIBERNATE EXCEPTION
I've managed to get around this by creating another ValueObjectA in order to preserve the reference to the set, e.g.
valueObjectA_1.ValueObjectBs.Remove(new ValueObjectB(3, 4));
entity.ValueObjectA = new ValueObjectA(valueObjectA_2.X, valueObjectA_1.ValueObjectBs);
However... that feels like a code smell - even if I wrote a custom setter for Entity.ValueObjectA, the implementation is starting to get complicated where the design is supposed to be simple.
public class Entity
{
    // ...
    private ValueObjectA valueObjectA;
    public ValueObjectA ValueObjectA
    {
        // get
        set
        {
            // Add/Remove relevant values from ValueObjectA.ValueObjectBs
            valueObjectA = new ValueObjectA(value.X, ValueObjectA.ValueObjectBs);
        }
    }
}
What is the best practice in this type of situation? Or is this a sign that I'm trying to do something which violates the principles of DDD?
What you have is an anemic domain model.
You should replace the public setters of the entity with methods that have meaningful names from the ubiquitous language, that check the invariants, and that do all the necessary cleanup when value objects are replaced.
Although it may seem that things are more complicated, this is paid back by the fact that the entity is now in full control of what happens with its internals. You now have full encapsulation.
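A minimal sketch of what such a method could look like; ChangeValueObjectA is a placeholder name standing in for whatever the ubiquitous language suggests. The key move is reconciling the set NHibernate is tracking in place instead of swapping the collection reference:
public class Entity
{
    public int Id { get; private set; }
    public ValueObjectA ValueObjectA { get; private set; }

    // Hypothetical intention-revealing method; rename per your domain.
    public void ChangeValueObjectA(ValueObjectA newValue)
    {
        if (newValue == null) throw new ArgumentNullException("newValue");

        // Reuse the set instance NHibernate is tracking, so the
        // cascade="all-delete-orphan" collection stays referenced.
        var trackedSet = ValueObjectA.ValueObjectBs;
        trackedSet.IntersectWith(newValue.ValueObjectBs); // deletes orphans
        trackedSet.UnionWith(newValue.ValueObjectBs);     // adds new items
        ValueObjectA = new ValueObjectA(newValue.X, trackedSet);
    }
}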

Redis caching with ServiceStack OrmLite and SQL Server persistence

We have a Web app (ASP.NET/C#) with SQL Server backend. We use ServiceStack OrmLite as our POCO Micro ORM. We would now like to extend a part of our app to cache frequently-read data (mainly a collection of POCO objects as values, with numeric keys). But I'm not sure how to go about integrating a simple caching solution (in-memory or Redis based) that works seamlessly with OrmLite and MSSQL as the Master database.
I've read about the ServiceStack Redis Client, MemoryCacheClient and multi nested database connections (OrmLiteConnectionFactory), but I couldn't find any examples, tutorials or code samples to learn more about implementing caching that works with OrmLite.
Any suggestions or links will be helpful and much appreciated.
I use this extension to help simplify the integration between the db and the cache.
public static class ICacheClientExtensions
{
    public static T ToResultUsingCache<T>(this ICacheClient cache, string cacheKey, Func<T> fn, int hours = 1) where T : class
    {
        var cacheResult = cache.Get<T>(cacheKey);
        if (cacheResult != null)
        {
            return cacheResult;
        }
        var result = fn();
        if (result == null) return null;
        cache.Set(cacheKey, result, TimeSpan.FromHours(hours));
        return result;
    }
}
[Route("/data/{id}")]
public class GetData : IReturn<Data>
{
    public int Id { get; set; }
}

public class MyService : Service
{
    public Data Get(GetData request)
    {
        var key = UrnId.Create<Data>(request.Id);
        Func<Data> fn = () => Db.GetData(request.Id);
        return Cache.ToResultUsingCache(key, fn);
    }
}
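The extension only covers reads, so write paths need to drop the stale entry themselves. A hedged sketch, assuming a hypothetical UpdateData request DTO and mapping step; ICacheClient.Remove and UrnId.Create are the real ServiceStack APIs:
public Data Put(UpdateData request)
{
    var data = request.ConvertTo<Data>(); // hypothetical DTO-to-row mapping
    Db.Save(data);
    // Invalidate the cached copy so the next Get repopulates it.
    Cache.Remove(UrnId.Create<Data>(request.Id));
    return data;
}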
You'd need to implement the caching logic yourself, but it's not much work - here's a pseudocode example:
public class QueryObject
{
    public DateTime? StartDate { get; set; }
    public string SomeString { get; set; }
}

public class Foo
{
    public DateTime DateTime { get; set; }
    public string Name { get; set; }
}

public class FooResponse
{
    public List<Foo> Data { get; set; }
}
public FooResponse GetFooData(QueryObject queryObject)
{
    using (var dbConn = connectionFactory.OpenDbConnection())
    using (var cache = redisClientsManager.GetCacheClient())
    {
        //insert your own logic for generating a cache key here
        var cacheKey = string.Format("fooQuery:{0}", queryObject.GetHashCode());
        var response = cache.Get<FooResponse>(cacheKey);
        //return cached result
        if (response != null) return response;
        //not cached - hit the DB and cache the result
        response = new FooResponse()
        {
            Data = dbConn.Select<Foo>(
                x => x.DateTime > queryObject.StartDate.Value && x.Name.StartsWith(queryObject.SomeString))
        };
        //the same query within the next 15 minutes will return the cached result
        cache.Add(cacheKey, response, DateTime.Now.AddMinutes(15));
        return response;
    }
}
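Since the question mentions both in-memory and Redis options: with ServiceStack the cache client is just an IoC registration, so code like the above stays unchanged when you switch. A minimal sketch of the Funq wiring in an AppHost (the host string is a placeholder):
// In-memory cache, fine for a single web server:
container.Register<ICacheClient>(new MemoryCacheClient());

// Or Redis-backed, shared across web servers:
container.Register<IRedisClientsManager>(new PooledRedisClientManager("localhost:6379"));
container.Register<ICacheClient>(c => c.Resolve<IRedisClientsManager>().GetCacheClient());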
Have you checked the ServiceStack caching wiki? It gives detailed info about caching. In your case, from the details you are providing, I can say you could go with either kind of caching; as of now it will not make much difference.
PS: A piece of advice: caching should be done when there is no other option, or as the last thing pending in the application, because it comes with its own problems: invalidating the cache, managing it, and all that. So if your application is not too big, just leave it for now.

Losing the record ID

I have a record structure where I have a parent record with many child records. On the same page I will have a couple of queries to get all the children.
In a later query I will get a record set that, when I expand it, shows "Proxy". That is fine and all for getting data from the record, since everything is generally there. The only problem I have is when I go to grab the record "ID": it is always "0" since it is a proxy. This makes it pretty tough when building a dropdown list where I use the record ID as the "selected value". What makes this worse is that it is random, so out of a list of 5 items, 2 of them will have an ID of "0" because they are proxies.
I can use Evict to force it to load at times. However, when I need lazy loading (for grids), Evict is bad since it kills the lazy load and I can't display the grid contents on the fly.
I am using the following to start my session:
ISession session = FluentSessionManager.SessionFactory.OpenSession();
session.BeginTransaction();
CurrentSessionContext.Bind(session);
I even use ".SetFetchMode("MyTable", Eager)" within my queries and it still shows "Proxy".
Proxy is fine, but I need the record ID. Anyone else run into this and have a simple fix?
I would greatly appreciate some help on this.
Thanks.
Per request, here is the query I am running that will result in Patients.Children having an ID of "0" because it is showing up as "Proxy":
public IList<Patients> GetAllPatients()
{
    return FluentSessionManager.GetSession()
        .CreateCriteria<Patients>()
        .Add(Expression.Eq("IsDeleted", false))
        .SetFetchMode("Children", Eager)
        .List<Patients>();
}
I have found the silver bullet that fixes the proxy issue where you lose your record ID!
I was using ClearCache to take care of the problem. That worked just fine for the first couple of layers in the record structure. However, when you have a scenario like Parent.Child.AnotherLevel.OneMoreLevel.DownOneMore, that would not fix the 4th and 5th levels. The method I came up with does. I also found it mostly presented itself when I had a one-to-many followed by a many-to-one mapping. So here is the answer for everyone else out there who is running into the same problem.
Domain Structure:
public class Parent : DomainBase<int>
{
    public virtual int ID { get { return base.ID2; } set { base.ID2 = value; } }
    public virtual string Name { get; set; }
    ....
}
DomainBase:
public abstract class DomainBase<Y> : IDomainBase<Y>
{
    public virtual Y ID // Everything has an identity key.
    {
        get;
        set;
    }
    protected internal virtual Y ID2 // Real identity key
    {
        get
        {
            Y myID = this.ID;
            if (typeof(Y).ToString() == "System.Int32")
            {
                if (int.Parse(this.ID.ToString()) == 0)
                {
                    myID = ReadOnlyID;
                }
            }
            return myID;
        }
        set
        {
            this.ID = value;
            this.ReadOnlyID = value;
        }
    }
    protected internal virtual Y ReadOnlyID { get; set; } // Real identity key
}
IDomainBase:
public interface IDomainBase<Y>
{
    Y ID { get; set; }
}
Domain Mapping:
public class ParentMap : ClassMap<Parent, int>
{
    public ParentMap()
    {
        Schema("dbo");
        Table("Parent");
        Id(x => x.ID);
        Map(x => x.Name);
        ....
    }
}
ClassMap:
public class ClassMap<TEntityType, TIdType> : FluentNHibernate.Mapping.ClassMap<TEntityType> where TEntityType : DomainBase<TIdType>
{
    public ClassMap()
    {
        Id(x => x.ID, "ID");
        Map(x => x.ReadOnlyID, "ID").ReadOnly();
    }
}

Fluent NHibernate: CompositeId() of same types, getting message "no Id mapped"

I've scoured Google and SO but haven't come across anyone having the same problem. Here is my model:
public class Hierarchy
{
    public virtual Event Prerequisite { get; set; }
    public virtual Event Dependent { get; set; }

    public override bool Equals(object obj)
    {
        var other = obj as Hierarchy;
        if (other == null)
        {
            return false;
        }
        else
        {
            return this.Prerequisite == other.Prerequisite && this.Dependent == other.Dependent;
        }
    }

    public override int GetHashCode()
    {
        return (Prerequisite.Id.ToString() + "|" + Dependent.Id.ToString()).GetHashCode();
    }
}
Here is my mapping:
public class HierarchyMap : ClassMap<Hierarchy>
{
    public HierarchyMap()
    {
        CompositeId()
            .KeyReference(h => h.Prerequisite, "PrerequisiteId")
            .KeyReference(h => h.Dependent, "DependentId");
    }
}
And here is the ever-present result:
{"The entity 'Hierarchy' doesn't have an Id mapped. Use the Id method to map your identity property. For example: Id(x => x.Id)."}
Is there some special configuration I need to do to enable composite IDs? I have the latest FNh (as of 6/29/2012).
Edit
I consider the question open even though I've decided to map an Id and reference the two Events instead of using a CompositeId. Feel free to propose an answer.
I figured out this was due to automapping trying to map the ID.
Even though I had an actual map for my class, it still tried to automap the ID. Once I excluded the class from automapping, it worked just fine.
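For reference, a minimal sketch of one way to do that exclusion, assuming the automapping is driven by a custom DefaultAutomappingConfiguration (the configuration class name and connection string are placeholders):
public class MyAutomappingConfiguration : DefaultAutomappingConfiguration
{
    public override bool ShouldMap(Type type)
    {
        // Keep the automapper away from Hierarchy so the explicit
        // HierarchyMap with its CompositeId is the only mapping used.
        return base.ShouldMap(type) && type != typeof(Hierarchy);
    }
}

var model = AutoMap.AssemblyOf<Event>(new MyAutomappingConfiguration());
var config = Fluently.Configure()
    .Database(MsSqlConfiguration.MsSql2008.ConnectionString("your-connection-string"))
    .Mappings(m =>
    {
        m.AutoMappings.Add(model);
        m.FluentMappings.Add<HierarchyMap>();
    });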

dynamic-component fluent automapping

Does anyone know how we can automatically map dynamic components using fluent automapping in NHibernate?
I know that we can map normal classes as components, but I couldn't figure out how to map dictionaries as dynamic-components using fluent automapping.
Thanks
We've used the following approach successfully (with FluentNH 1.2.0.712):
public class SomeClass
{
    public int Id { get; set; }
    public IDictionary Properties { get; set; }
}

public class SomeClassMapping : ClassMap<SomeClass>
{
    public SomeClassMapping()
    {
        Id(x => x.Id);
        // Maps the MyEnum members to separate int columns.
        DynamicComponent(x => x.Properties,
            c =>
            {
                foreach (var name in Enum.GetNames(typeof(MyEnum)))
                    c.Map<int>(name);
            });
    }
}
Here we've mapped all members of some enum to separate columns, all of type int. Right now I'm working on a scenario where we use different types for the dynamic columns, which looks like this instead:
// ExtendedProperties contains custom objects with Name and Type members
foreach (var property in ExtendedProperties)
{
    var prop = property;
    part.Map(prop.Name).CustomType(prop.Type);
}
This also works very well.
What I'm still about to figure out is how to use References instead of Map for referencing other types that have their own mapping...
UPDATE:
The case with References is unfortunately more complicated; please refer to this Google Groups thread. In short:
// This won't work
foreach (var property in ExtendedProperties)
{
    var prop = property;
    part.Reference(dict => dict[part.Name]);
}

// This works but is not very dynamic
foreach (var property in ExtendedProperties)
{
    var prop = property;
    part.Reference<PropertyType>(dict => dict["MyProperty"]);
}
That's all for now.
I struggled with exactly the same problem. With Fluent NHibernate we cannot map this directly, but I was somehow able to solve it on my own. My solution is to build the lambda expression on the fly and then hand it to the mapping.
Let me copy the part of the post that Oliver referred to:
DynamicComponent(
    x => x.Properties,
    part =>
    {
        // Works
        part.Map("Size").CustomType(typeof(string));

        // Works
        var keySize = "Size";
        part.Map(keySize).CustomType(typeof(string));

        // Does not work
        part.Map(d => d[keySize]).CustomType(typeof(string));

        // Works
        part.References<Picture>(d => d["Picture"]);

        // Does not work
        var key = "Picture";
        part.References<Picture>(d => d[key]);
    });
So we have the problem that we need to hardcode "Picture" in the mapping. But after some research I came up with the following solution:
var someExternalColumnNames = GetFromSomewhereDynamicColumns();

// 'm' below is the DynamicComponent callback in Fluent NHibernate, e.g.:
// DynamicComponent(a => a.DynamicColumns, m => { ...the loop below... })
foreach (var x in someExternalColumnNames)
{
    if (x.IsReferenceToPerson)
    {
        // Build the lambda 'dict => dict[x.Name]' at runtime.
        var param = Expression.Parameter(typeof(IDictionary), "paramFirst");
        var key = Expression.Constant(x.Name);
        var body = Expression.Call(param, typeof(IDictionary).GetMethod("get_Item"), key);
        var lambda = Expression.Lambda<Func<IDictionary, object>>(body, param);
        m.References<Person>(lambda, x.Name);
    }
    else
    {
        m.Map(x.Name);
    }
}
// Some class that we want to reference, just an example of a Fluent NHibernate mapping
public class PersonMap : ClassMap<Person>
{
    public PersonMap()
    {
        Table("Person");
        Id(x => x.PersonId, "PersonId");
        Map(x => x.Name);
    }
}
public class Person
{
    public virtual Guid PersonId { get; set; }
    public virtual string Name { get; set; }

    public Person()
    { }
}
Maybe it will be helpful.