I have a record structure with a parent record and many child records. On the same page I run a couple of queries to get all the children.
In a later query I get a record set that, when I expand it, shows "Proxy". That is fine and all for getting data from the record, since everything is generally there. The only problem is that when I go to grab the record "ID" it is always "0" because it is a proxy. This makes it pretty tough when building a dropdown list where I use the record ID as the selected value. What makes this worse is that it is random: out of a list of 5 items, 2 of them will have an ID of "0" because they are proxies.
I can use Evict to force the record to load at times. However, when I need lazy loading (for grids), Evict is bad because it kills the lazy load and I can't display the grid contents on the fly.
I am using the following to start my session:
ISession session = FluentSessionManager.SessionFactory.OpenSession();
session.BeginTransaction();
CurrentSessionContext.Bind(session);
I even use .SetFetchMode("MyTable", FetchMode.Eager) within my queries and it still shows "Proxy".
Proxy is fine, but I need the record ID. Anyone else run into this and have a simple fix?
I would greatly appreciate some help on this.
Thanks.
Per request, here is the query I am running that will result in Patients.Children having an ID of "0" because it is showing up as "Proxy":
public IList<Patients> GetAllPatients()
{
    return FluentSessionManager.GetSession()
        .CreateCriteria<Patients>()
        .Add(Expression.Eq("IsDeleted", false))
        .SetFetchMode("Children", FetchMode.Eager)
        .List<Patients>();
}
I have found the silver bullet that fixes the proxy issue where you lose your record ID!
I was using ClearCache to take care of the problem. That worked just fine for the first couple of layers in the record structure. However, with a scenario like Parent.Child.AnotherLevel.OneMoreLevel.DownOneMore it would not fix the 4th and 5th levels. The method I came up with below does. I also found that the problem mostly presented itself when I had a one-to-many followed by a many-to-one mapping. So here is the answer for everyone else out there running into the same problem.
Domain Structure:
public class Parent : DomainBase<int>
{
    public virtual int ID { get { return base.ID2; } set { base.ID2 = value; } }
    public virtual string Name { get; set; }
    ....
}
DomainBase:
public abstract class DomainBase<Y> : IDomainBase<Y>
{
    public virtual Y ID // Everything has an identity key.
    {
        get;
        set;
    }

    protected internal virtual Y ID2 // Real identity key
    {
        get
        {
            Y myID = this.ID;
            if (typeof(Y) == typeof(int))
            {
                if (int.Parse(this.ID.ToString()) == 0)
                {
                    myID = ReadOnlyID;
                }
            }
            return myID;
        }
        set
        {
            this.ID = value;
            this.ReadOnlyID = value;
        }
    }

    protected internal virtual Y ReadOnlyID { get; set; } // Real identity key, mapped read-only
}
IDomainBase:
public interface IDomainBase<Y>
{
    Y ID { get; set; }
}
Domain Mapping:
public class ParentMap : ClassMap<Parent, int>
{
    public ParentMap()
    {
        Schema("dbo");
        Table("Parent");
        Id(x => x.ID);
        Map(x => x.Name);
        ....
    }
}
ClassMap:
public class ClassMap<TEntityType, TIdType> : FluentNHibernate.Mapping.ClassMap<TEntityType> where TEntityType : DomainBase<TIdType>
{
    public ClassMap()
    {
        Id(x => x.ID, "ID");
        Map(x => x.ReadOnlyID, "ID").ReadOnly();
    }
}
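As an aside, a lighter-weight alternative is sometimes enough (a sketch added for reference, not part of the mapping trick above): NHibernate can normally report the identifier of an uninitialized proxy without hitting the database, either through the proxy itself or through ISession.GetIdentifier, which may be all a dropdown needs.

using NHibernate;
using NHibernate.Proxy;

public static class ProxyIdHelper
{
    // Returns the identifier of an entity even when it is still an uninitialized proxy.
    public static object GetId(ISession session, object entity)
    {
        var proxy = entity as INHibernateProxy;
        if (proxy != null)
        {
            // The proxy already carries its identifier; reading it does not trigger a load.
            return proxy.HibernateLazyInitializer.Identifier;
        }

        // For a plain instance attached to the session, the session can resolve the id.
        return session.GetIdentifier(entity);
    }
}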
Related
I have a bunch of GUID constants in my code for certain tag categories that are important in my application. They are mapped in a simple two-column many-to-many table. I often want to avoid fetching the NHibernate object, because all I really need is the GUID and it's already hardcoded. I also noticed it's much quicker and easier to do certain queries with direct table access. So, the goal is to map those NHibernate many-to-many tables as a class so they can be read and written to without disrupting NHibernate's usage of them in the regular sense, while at the same time using GUIDs for identifiers.
Anyway, I have settled on using a ComposedId across the two columns NHibernate generates. But there is a problem. If I use ComposedId, make a CategoryTag object, and try to save my TagID and CategoryTagID directly, TagID gets saved to the CategoryTagID column and CategoryTagID gets saved to the TagID column!
public class CategoryTagMapping : ClassMapping<CategoryTag>
{
    public CategoryTagMapping()
    {
        Table("CategoryTag");
        /*Id(x => x.ID, map => map.Generator(Generators.Guid));*/
        Property(x => x.CategoryTagID, map => { map.Column("CategoryTagID"); });
        Property(x => x.TagID, map => { map.Column("TagID"); });
        ComposedId(p =>
        {
            // Note: the column names here are swapped (see the answer below).
            p.Property(p1 => p1.CategoryTagID, a => a.Column("TagID"));
            p.Property(p1 => p1.TagID, a => a.Column("CategoryTagID"));
        });
    }
}
public class CategoryTag
{
    /*public virtual Guid ID {get;set;}*/
    public virtual Guid CategoryTagID { get; set; }
    public virtual Guid TagID { get; set; }

    public override bool Equals(object obj)
    {
        if (obj == null)
            return false;
        var t = obj as CategoryTag;
        if (t == null)
            return false;
        if (this.CategoryTagID == t.CategoryTagID && this.TagID == t.TagID)
            return true;
        return false;
    }

    public override int GetHashCode()
    {
        return (this.CategoryTagID + "|" + this.TagID).GetHashCode();
    }
}
Trying to do this:
CategoryTag A = new CategoryTag { CategoryTagID = Constants.GUID1, TagID = Constants.GUID2 };
If I add the ID column by uncommenting the two lines, the saving works properly. But then that breaks the regular usage of the table, because MySQL can't auto-increment the GUID field and NHibernate won't generate an ID to go in the ID column.
Anyhow, maybe it's a bug, but maybe there's a workaround. Is there something wrong with the mapping, or with Equals/GetHashCode?
Thanks!
I am stupid. The columns are mismatched in the ComposedId part of the mapping.
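For completeness, the corrected ComposedId block simply aligns each key property with its own column (a sketch of the fix just described; nothing else in the mapping changes):

ComposedId(p =>
{
    // Each key property now writes to its matching column.
    p.Property(p1 => p1.CategoryTagID, a => a.Column("CategoryTagID"));
    p.Property(p1 => p1.TagID, a => a.Column("TagID"));
});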
I've scoured Google and SO but haven't come across anyone having the same problem. Here is my model:
public class Hierarchy
{
    public virtual Event Prerequisite { get; set; }
    public virtual Event Dependent { get; set; }

    public override bool Equals(object obj)
    {
        var other = obj as Hierarchy;
        if (other == null)
        {
            return false;
        }
        else
        {
            return this.Prerequisite == other.Prerequisite && this.Dependent == other.Dependent;
        }
    }

    public override int GetHashCode()
    {
        return (Prerequisite.Id.ToString() + "|" + Dependent.Id.ToString()).GetHashCode();
    }
}
Here is my mapping:
public class HierarchyMap : ClassMap<Hierarchy>
{
    public HierarchyMap()
    {
        CompositeId()
            .KeyReference(h => h.Prerequisite, "PrerequisiteId")
            .KeyReference(h => h.Dependent, "DependentId");
    }
}
And here is the ever-present result:
{"The entity 'Hierarchy' doesn't have an Id mapped. Use the Id method to map your identity property. For example: Id(x => x.Id)."}
Is there some special configuration I need to do to enable composite IDs? I have the latest FluentNHibernate (as of 6/29/2012).
Edit
I consider the question open even though I've decided to map an Id and reference the two Events instead of using a CompositeId. Feel free to propose an answer.
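For reference, a minimal sketch of that alternative (it assumes a surrogate virtual int Id property is added to Hierarchy; the column names come from the original mapping):

public class HierarchyMap : ClassMap<Hierarchy>
{
    public HierarchyMap()
    {
        // Surrogate key instead of the composite id.
        Id(x => x.Id).GeneratedBy.Identity();

        // The two Events become ordinary many-to-one references.
        References(x => x.Prerequisite, "PrerequisiteId");
        References(x => x.Dependent, "DependentId");
    }
}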
I figured out this was due to auto mapping trying to map the ID automatically.
Even though I had an explicit map for my class, it still tried to auto map the ID. Once I excluded the class from auto mapping, it worked just fine.
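A minimal sketch of one way to do that exclusion with FluentNHibernate's automapping (type and assembly names are illustrative):

using System;
using FluentNHibernate.Automapping;

public class AutomappingConfig : DefaultAutomappingConfiguration
{
    public override bool ShouldMap(Type type)
    {
        // Let the explicit HierarchyMap handle Hierarchy; automap everything else.
        return base.ShouldMap(type) && type != typeof(Hierarchy);
    }
}

// When building the session factory:
// .Mappings(m =>
// {
//     m.FluentMappings.AddFromAssemblyOf<HierarchyMap>();
//     m.AutoMappings.Add(AutoMap.AssemblyOf<Event>(new AutomappingConfig()));
// })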
Hi, I have set up my SessionFactory to cache entities and queries:
private ISessionFactory CreateSessionFactory()
{
    var cfg = new Configuration()
        .Proxy(properties => properties.ProxyFactoryFactory<DefaultProxyFactoryFactory>())
        .DataBaseIntegration(properties =>
        {
            properties.Driver<SqlClientDriver>();
            properties.ConnectionStringName = this.namedConnection;
            properties.Dialect<MsSql2005Dialect>();
        })
        .AddAssembly(this.resourceAssembly)
        .Cache(properties =>
        {
            properties.UseQueryCache = true;
            properties.Provider<SysCacheProvider>();
            properties.DefaultExpiration = 3600;
        });

    cfg.AddMapping(this.DomainMapping);
    new SchemaUpdate(cfg).Execute(true, true);
    return cfg.BuildSessionFactory();
}
This is my user mapping:
public class UserMapping : EntityMapping<Guid, User>
{
    public UserMapping()
    {
        this.Table("USERS");

        this.Property(
            x => x.CorpId,
            mapper => mapper.Column(
                c =>
                {
                    c.Name("CorporateId");
                    c.UniqueKey("UKUserCorporateId");
                    c.NotNullable(true);
                }));

        this.Set(
            x => x.Desks,
            mapper =>
            {
                mapper.Table("DESKS2USERS");
                mapper.Key(km => km.Column("UserId"));
                mapper.Inverse(false);
                mapper.Cascade(Cascade.All | Cascade.DeleteOrphans | Cascade.Remove);
            },
            rel => rel.ManyToMany(mapper => mapper.Column("DeskId")));

        this.Cache(
            mapper =>
            {
                mapper.Usage(CacheUsage.ReadWrite);
                mapper.Include(CacheInclude.All);
            });
    }
}
What I want to do is get a user or query some users, add information to the domain object, and cache the updated object.
public class User : Entity<Guid>, IUser
{
    public virtual string CorpId { get; set; }
    public virtual ISet<Desk> Desks { get; set; }
    public virtual MailAddress EmailAddress { get; set; }

    public virtual string Name
    {
        get
        {
            return string.Format(CultureInfo.CurrentCulture, "{0}, {1}", this.SurName, this.GivenName);
        }
    }

    public virtual string GivenName { get; set; }
    public virtual string SurName { get; set; }
}
something like this:
var users = this.session.Query<User>().Cacheable().ToList();
if (users.Any(user => user.EmailAddress == null))
{
    UserEditor.UpdateThroughActiveDirectoryData(users);
}
return this.View(new UserViewModel { Users = users.OrderBy(entity => entity.Name) });
or this:
var user = this.session.Get<User>(id);
if (user.EmailAddress == null)
{
    UserEditor.UpdateThroughActiveDirectoryData(user);
}
return this.View(user);
The UpdateThroughActiveDirectory methods work, but they are executed every time I get data from the cache; the updated entities do not keep the additional data. Is there a way to also store this data in NHibernate's second-level cache?
NHibernate doesn't cache the entire entity in the second-level cache. It caches only the state/data from the mapped properties. You can read more about it here: http://ayende.com/blog/3112/nhibernate-and-the-second-level-cache-tips
There's an interesting discussion in the comments of that post that explains this a little further:
Frans Bouma: Objects need to be serializable, are they not? As we're talking about multiple appdomains. I wonder what's more efficient: relying on the cache of the db server or transporting objects back/forth using serialization layers.
Ayende Rahien: No, they don't need that. This is because NHibernate doesn't save the entity in the cache. Doing so would open you to race conditions. NHibernate saves the entity data alone, which is usually composed of primitive data (that is what the DB can store, after all). In general, it is more efficient to hit a cache server, because those are very easily scalable to high degrees, and there is no I/O involved.
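In practical terms this means the Active Directory data will only survive a cache round trip if it is part of the mapped, persisted state. A rough sketch under that assumption (it presumes EmailAddress is actually mapped, for example as a string column or a user type, which the posted mapping does not show):

var users = this.session.Query<User>().Cacheable().ToList();

var incomplete = users.Where(u => u.EmailAddress == null).ToList();
if (incomplete.Any())
{
    UserEditor.UpdateThroughActiveDirectoryData(incomplete);

    // Only flushed, mapped state is written to the second-level cache,
    // so the enriched users have to be saved rather than kept in memory only.
    using (var tx = this.session.BeginTransaction())
    {
        foreach (var user in incomplete)
        {
            this.session.Update(user);
        }

        tx.Commit();
    }
}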
Imagine a database table that looks like this:
create table [dbo].[user]
(
    id int IDENTITY(1,1),
    username varchar(50) NOT NULL,
    firstname varchar(20) NOT NULL,
    lastname varchar(30) NOT NULL,
    currentid int NULL,
    processedby varchar(50) NOT NULL,
    processeddate varchar(50) NOT NULL,
    processedaction varchar(50) NOT NULL
)
What I want to do is set up NHibernate to load this into my user object, but I only want the current version of the "user" record to be brought back. I know how to do the SQL select for this on my own, and I have a feeling there's something in NHibernate involving triggers and event listeners, but can anyone tell me how to implement the NHibernate repository so I can do the following?
{Repository}.GetCurrent(id) <- pass it any of the IDs assigned to any of the historical records or the current record, and get back the current object.
{Repository}.Save(user) <- I want to always insert the changes as a new row, and then update the old versions to link back to the new ID.
Edit
So, there's some confusion here, and maybe I explained it wrong. What I'm trying to do, with regard to always getting the current record back, is this:
SELECT uc.*
FROM User uo
JOIN User uc ON uo.currentid = uc.id
WHERE uo.id = :id
But I don't want to expose "CurrentID" to my object model, since it has no bearing on the rest of the system, IMHO. In the above SQL statement, uo is the "original" record and uc is the current record in the system.
Edit #2:
Looking at this as a possible solution: http://ayende.com/blog/4196/append-only-models-with-nhibernate
I'm honestly being pigheaded, as I'm thinking about this backward. In this way of running a database, the auto-incrementing field should be the version field, and the "id" field should be whatever value the auto-incrementer had at the time of the initial insert.
Answer:
I don't want to take @Firo's fury, and I'm not going to take it away from him, as he took me down the right path. What I wound up with was:
1. Created a base generic class with two type parameters:
   a. the type of the object's "ID"
   b. the type of the object itself.
2. Instantiated all classes.
3. Created a generic IRepository interface with a type parameter for the object to store/retrieve.
4. Created an abstract generic class with a type parameter for the object to store/retrieve.
5. Created a concrete implementation class for each type to store/retrieve.
Inside of the create/update, the procedure looks like:
// "Type" below stands in for the concrete entity type handled by the repository.
Type Commit(Type item)
{
    var clone = item.DeepClone();
    _Session.Evict(item);
    clone.Id = 0;
    clone.ProcessedDate = DateTime.Now;
    if (clone.Action.HasValue)
    {
        if (clone.Action == ProcessedAction.Create)
            clone.Action = ProcessedAction.Update;
    }
    else
    {
        clone.Action = ProcessedAction.Create;
    }
    clone.ProcessedBy = UserRepos.Where(u => u.Username == System.Threading.Thread.CurrentPrincipal.Identity.Name).First().Current;
    var savedItem = (_Session.Merge(clone) as Type);
    _Session.CreateQuery("UPDATE Type SET CurrentID = :newID where ID = :newID OR CurrentID = :oldID")
        .SetParameter("newID", savedItem.Id)
        .SetParameter("oldID", item.Id)
        .ExecuteUpdate();
    return savedItem;
}
In the delete method, we simply set {object}.Action = ProcessedAction.Delete.
I wanted to do this another way, but since we eventually need to do historical comparisons, we couldn't ask NHibernate to filter out the deleted objects, as users will want to see them. We'll create a business facade to take care of the deleted records.
Again, much thanks to @Firo for his help with this.
So, with all that, I can finally do this:
var result = {Repository}.Where(obj => obj.Id == {objectID from caller}).FirstOrDefault();
if (result != null)
{
    return result.Current;
}
else
{
    return null;
}
and always get my current object back for any requested ID. I hope this helps someone who is in my situation.
In the mapping, if you use FluentNHibernate:
public class UserMap : ClassMap<User>
{
    public UserMap()
    {
        Where("id = currentid"); // always bring back the most recent
    }
}
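For what it's worth, Where() here is FluentNHibernate's class-level filter: it emits the given SQL fragment as an extra where clause on every select for User, so only rows whose id equals currentid (the current versions) are ever loaded.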
// in UserRepository
public void Update(User user)
{
    var clone = user.Clone();
    session.Evict(user); // to prevent flushing the changes
    var newId = session.Save(clone);
    session.CreateQuery("UPDATE User u SET u.currentid = :current") // <-- HQL
        .SetParameter("current", newId)
        .ExecuteUpdate();
}
Object graphs are a lot trickier with this simple code. I would then do one of the following:
- use NHibernate.Envers to store the auditing information for me
- explicitly create new entities in the business-layer code
I once saw an append-only model doing something like the following:
// UserBase is there to ensure that others referencing the User don't have to update just because user properties changed
class UserBase
{
    public virtual int Id { get; set; }
    public virtual ICollection<PersonDetails> AllDetails { get; private set; }

    private PersonDetails _currentDetails;
    public virtual PersonDetails CurrentDetails
    {
        get { return _currentDetails; }
        set { _currentDetails = value; AllDetails.Add(value); }
    }

    // same as above
    public virtual ICollection<ConfigDetails> AllConfigs { get; set; }
}
class Order
{
    public virtual int Id { get; set; }
    public virtual UserBase User { get; set; }
    public virtual IList<OrderDetail> AllDetails { get; private set; }
    public virtual IList<OrderDetail> ActiveDetails { get; private set; }

    public virtual void Add(OrderDetail detail)
    {
        AllDetails.Add(detail);
        ActiveDetails.Add(detail);
    }

    public virtual void Delete(OrderDetail detail)
    {
        detail.Active = false;
        ActiveDetails.Remove(detail);
    }
}
class OrderDetail
{
    public virtual int Id { get; set; }
    public virtual Order Parent { get; set; }
    public virtual bool Active { get; set; }
}
class OrderMap : ClassMap<Order>
{
    public OrderMap()
    {
        HasMany(o => o.AllDetails);
        HasMany(o => o.ActiveDetails).Where("active=1");
    }
}
// somewhere; assumes OrderDetail exposes a TaxCharge property and a Clone() helper
public void UpdateTaxCharge(OrderDetail detail, TaxCharge charge)
{
    var clone = detail.Clone();
    clone.TaxCharge = charge;
    detail.Parent.Delete(detail);
    detail.Parent.Add(clone);
}
You can tell NHibernate exactly what SQL it should generate when persisting and loading an entity. For example, you can tell NHibernate to use a stored procedure instead of a plain SQL statement. If this is an option for you, I can elaborate my answer further.
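If that route is of interest, here is a rough sketch of what custom write SQL looks like with NHibernate's mapping-by-code API (UserRow is a placeholder entity, the procedure names and parameter lists are made up, and this uses ClassMapping rather than the FluentNHibernate ClassMap shown elsewhere in this thread):

using NHibernate.Mapping.ByCode;
using NHibernate.Mapping.ByCode.Conformist;

public class UserRowMapping : ClassMapping<UserRow>
{
    public UserRowMapping()
    {
        Table("user");
        Id(x => x.Id, m => m.Generator(Generators.Identity));
        Property(x => x.Username);
        // ... remaining columns ...

        // Route writes through stored procedures so the "insert a new row,
        // then repoint currentid" versioning logic can live in the database.
        SqlInsert("exec user_insert ?, ?");
        SqlUpdate("exec user_update ?, ?, ?");
    }
}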
I want to select all requests that are outstanding for a given manager. A manager can have multiple teams.
I compose the queries by applying various restrictions based upon permissions, and alter them to provide row counts, existence checks, subqueries, etc.
The composition makes use of QueryOver, though using ICriteria instead would also be acceptable.
Given the following classes:
class Team {
    public virtual int Manager { get; set; }
    public virtual ISet<int> Members { get; set; }
}

class Request {
    public virtual int Owner { get; set; }
    public virtual bool IsOutstanding { get; set; }
}
static class SomeRestrictions {
    public static void TeamsForManager<TRoot>(this IQueryOver<TRoot, Team> query, int managerId) {
        // In reality this is a little more complex
        query.Where(x => x.Manager == managerId);
    }
}
This is the current query that I'm trying (which doesn't work).
var users = QueryOver.Of<Team>();
users.TeamsForManager(5);
users.Select(/* not sure */);

var requests = session.QueryOver<Request>()
    .Where(x => x.IsOutstanding)
    .WithSubquery.WhereProperty(x => x.Owner).In(users);
The HQL to select the users would be:
"SELECT m FROM Team t JOIN t.Members m WHERE <TeamsForManager restrictions>"
But I don't want to use HQL because I can't then compose it with other restrictions based upon permissions. I also wouldn't be able to compose it with other queries to turn it into row counts/existence checks, etc.
I saw you changed the model, but this would have been the way:
int membervalue = 0; // alias variable for the joined Members element

var users = QueryOver.Of<Team>();
users.TeamsForManager(5);
users.JoinAlias(t => t.Members, () => membervalue).Select(() => membervalue);
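The detached query can then be fed to the outer Request query exactly as in the question, via .WithSubquery.WhereProperty(x => x.Owner).In(users).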