NHibernate allows me to query a database and get an IList of objects in return. Suppose I get a list of a couple of dozen objects and modify a half-dozen or so. Does NHibernate have a way to persist changes to the collection, or do I have to persist each object as I change it?
Here's an example. Suppose I run the following code:
var hql = "from Project";
var query = session.CreateQuery(hql);
var myProjectList = query.List<Project>();
I will get back an IList that contains all projects. Now suppose I execute the following code:
var myNewProject = new Project("My New Project");
myProjectList.Add(myNewProject);
And let's say I do this several times, adding several new projects to the list. Now I'm ready to persist the changes to the collection.
I'd like to persist the changes by simply passing myProjectList to the current ISession for updating. But ISession.SaveOrUpdate() appears to take only individual objects, not collections like myProjectList. Is there a way that I can persist changes to myProjectList, or do I have to persist each new object as I create it? Thanks for your help.
David Veeneman
Foresight Systems
If you load objects like in your example, then yes, you have to persist them one by one.
However, if you make a small design change and load something like an Account that has an IList<Project>, and you specify cascade="what_cascade_you_need" in the mapping, then when you change the projects on the Account you only have to save the Account and everything will get saved.
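For illustration, here is a minimal sketch of such a mapping in Fluent NHibernate; the Account class, its Projects collection and the choice of cascade are assumptions for the example, not something taken from the original post:
// Hypothetical Fluent NHibernate mapping: Account owns a collection of Projects.
// Cascade.AllDeleteOrphan() is just one possible "what_cascade_you_need" choice.
public class AccountMap : ClassMap<Account>
{
    public AccountMap()
    {
        Id(x => x.Id);
        HasMany(x => x.Projects)
            .Cascade.AllDeleteOrphan(); // saving the Account cascades to its Projects
    }
}
With a mapping along those lines, adding new projects to account.Projects and calling session.SaveOrUpdate(account) should persist the new projects in one go.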
I am trying to get a record that has been updated in the database, using QueryOver.
My code initially creates an entity and saves it to the database. The same record is then updated externally (from another program, manually, or by the same program running on another machine). When I then call QueryOver, filtering by the changed field, the query returns the record, but without the latest changes.
This is my code:
//create the entity and save in database
MyEntity myEntity = CreateDummyEntity();
myEntity.Name = "new_name";
MyService.SaveEntity(myEntity);
// now the entity is updated externally changing the name property with the
// "modified_name" value (for example manually in TOAD, SQL Server,etc..)
//get the entity with QueryOver
var result = NhibernateHelper.Session
.QueryOver<MyEntity>()
.Where(param => param.Name == "modified_name")
.List<MyEntity>();
The previous statement returns a collection with only one record (good), BUT with the Name property set to the old value instead of "modified_name".
How can I fix this behaviour? Is the first-level cache getting in my way? The same problem occurs with
CreateCriteria<T>();
The session in my NhibernateHelper is never closed, due to application framework requirements; transactions are only created for each commit associated with a session.Save().
If I open a new session to execute the query I obviously get the latest changes from the database, but that approach is not allowed by the design requirements.
I have also checked in the NHibernate SQL output that a SELECT with a WHERE clause is being executed (so NHibernate does hit the database), yet it doesn't update the returned object!
UPDATE
Here's the code in SaveEntity after the call to session.Save; a call to the Commit method is made:
public virtual void Commit()
{
    try
    {
        this.session.Flush();
        this.transaction.Commit();
    }
    catch
    {
        this.transaction.Rollback();
        throw;
    }
    finally
    {
        this.transaction = this.session.BeginTransaction();
    }
}
The SQL generated by NHibernate for SaveEntity:
NHibernate: INSERT INTO MYCOMPANY.MYENTITY (NAME) VALUES (:p0);:p0 = 'new_name'.
The SQL generated by NHibernate for QueryOver:
NHibernate: SELECT this_.NAME as NAME26_0_
FROM MYCOMPANY.MYENTITY this_
WHERE this_.NAME = :p0;:p0 = 'modified_name' [Type: String (0)].
The queries have been modified due to company confidentiality policies.
Any help is very much appreciated.
As far as I know, you have several options:
have your Session as an IStatelessSession, by calling sessionFactory.OpenStatelessSession() instead of sessionFactory.OpenSession() (see the sketch after this list)
perform Session.Evict(myEntity) after persisting an entity in DB
perform Session.Clear() before your QueryOver
set the CacheMode of your Session to Ignore, Put or Refresh before your QueryOver (never tested that)
I guess the choice will depend on how you use your long-running sessions (which, IMHO, seem to bring more problems than solutions).
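For example, here is a minimal sketch of the stateless-session option; the sessionFactory variable is an assumption, and the entity and property names are just borrowed from the question's code:
// A stateless session has no first-level cache, so every query goes straight to the database.
using (IStatelessSession statelessSession = sessionFactory.OpenStatelessSession())
{
    var result = statelessSession
        .QueryOver<MyEntity>()
        .Where(x => x.Name == "modified_name")
        .List<MyEntity>();
}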
Calling session.Save(myEntity) does not cause the changes to be persisted to the DB immediately*. These changes are persisted when session.Flush() is called, either by the framework itself or by yourself. More information about flushing and when it is invoked can be found on this question and in the NHibernate documentation about flushing.
Also, performing a query will not cause the first-level cache to be hit. This is because the first-level cache only works with Get and Load, i.e. session.Get<MyEntity>(1) would hit the first-level cache if the MyEntity with an id of 1 had already been loaded, whereas session.QueryOver<MyEntity>().Where(x => x.Id == 1) would not.
Further information about NHibernate's caching functionality can be found in this post by Ayende Rahien.
In summary you have two options:
Use a transaction within the SaveEntity method, i.e.
using (var transaction = Helper.Session.BeginTransaction())
{
    Helper.Session.Save(myEntity);
    transaction.Commit();
}
Call session.Flush() within the SaveEntity method, i.e.
Helper.Session.Save(myEntity);
Helper.Session.Flush();
The first option is the best in pretty much all scenarios.
*The only exception I know to this rule is when using Identity as the id generator type.
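For reference, a hedged sketch of that identity-generator case, in Fluent NHibernate style (the mapping class itself is an assumption): with an identity id, session.Save has to issue the INSERT immediately, because NHibernate needs the database-generated id to track the entity.
public class MyEntityMap : ClassMap<MyEntity>
{
    public MyEntityMap()
    {
        Id(x => x.Id).GeneratedBy.Identity(); // Save() must INSERT right away to obtain the generated id
        Map(x => x.Name);
    }
}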
Try changing your last query to:
var result = NhibernateHelper.Session
    .QueryOver<MyEntity>()
    .CacheMode(CacheMode.Refresh)
    .Where(param => param.Name == "modified_name")
    .List<MyEntity>();
If that still doesn't work, try adding this after the query to refresh each returned entity:
foreach (var entity in result)
    NhibernateHelper.Session.Refresh(entity);
After much searching and thinking, I've found a solution.
The fix consists of opening a new session and calling QueryOver<T>() on that session; the data is then successfully refreshed. If child collections come back uninitialized, you can call NHibernateUtil.Initialize(entity) or set lazy="false" in your mappings. Take special care with lazy="false" on large collections, because it can give poor performance. To avoid that performance problem when loading large collections, set lazy="true" in your collection mappings and call NHibernateUtil.Initialize(collection) on the affected collection to get the child records from the database. For example, you can get all records from a table and, if you need access to all child records of a specific entity, call NHibernateUtil.Initialize(collection) only for the objects you are interested in.
Note: as #martin ernst says, the update problem may be a bug in NHibernate; my solution is only a temporary fix, and it should really be solved in NHibernate itself.
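A minimal sketch of that workaround; the sessionFactory variable and the Children collection are assumptions used purely for illustration:
// Query in a fresh session so the stale first-level cache of the long-running session is bypassed.
using (var freshSession = sessionFactory.OpenSession())
{
    var result = freshSession
        .QueryOver<MyEntity>()
        .Where(x => x.Name == "modified_name")
        .List<MyEntity>();

    foreach (var entity in result)
        NHibernateUtil.Initialize(entity.Children); // force-load a lazy collection before the session closes
}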
People here do not want to call Session.Clear() since it is too strong.
On the other hand, Session.Evict() may seem inapplicable when the objects are not known beforehand.
Actually it is still usable.
You need to first retrieve the cached objects using the query, call Evict() on them, and then retrieve fresh objects by running the same query again.
This approach is slightly inefficient if the object was not cached to begin with, since there would then effectively be two "fresh" queries, but there does not seem to be much that can be done about that shortcoming.
By the way, Evict() also accepts a null argument without throwing, which is useful in case the queried object is not actually present in the DB.
var cachedObjects = NhibernateHelper.Session
    .QueryOver<MyEntity>()
    .Where(param => param.Name == "modified_name")
    .List<MyEntity>();

foreach (var obj in cachedObjects)
    NhibernateHelper.Session.Evict(obj);

var freshObjects = NhibernateHelper.Session
    .QueryOver<MyEntity>()
    .Where(param => param.Name == "modified_name")
    .List<MyEntity>();
I'm getting something very similar, and have tried debugging NHibernate.
In my scenario, the session creates an object with a couple children in a related collection (cascade:all), and then calls ISession.Flush().
The records are written into the DB, and the session needs to continue without closing. Meanwhile, another two child records are written into the DB and committed.
When the original session then attempts to re-load the graph using QueryOver with JoinAlias, the generated SQL statement looks perfectly fine and the rows are returned correctly. However, the collection that should receive these new children is found to have already been initialized within the session (as it should be), and based on that NH decides, for some reason, to completely ignore the respective rows.
I think NH makes an invalid assumption here that if the collection is already marked "Initialized" it does not need to be re-loaded from the query.
It would be great if someone more familiar with NHibernate internals could chime in on this.
My application uses multiple threads with one managed object context per thread.
For clarity I will refer to the different managed object contexts as: moc1, moc2, ..., etc.
Let's assume we have two models with a simple one-many relationship:
User 1----* Document
When a user logs in I fetch the corresponding model from one of the contexts (eg. moc1).
(pseudo code)
UserModel *globalLoggedUser = ( Fetch the logged in user using moc1 )
I then store this user so that I can reference it later.
In another part of the application I need to loop through thousands of items from an array and create Document objects for it. Each document then needs to be bound to the current user. This happens in a different background thread (which has its own context)
for( NSString *documentName in documents) {
( Create new document using moc2 )
** THIS IS WHERE MY PROBLEM IS **
// What I currently do:
UserModel *tempUser = ( Fetch the logged in user using moc2 )
( bind new document to tempUser )
// What I would like to do:
( bind new document to globalLoggedUser )
// Note that in this case tempUser and globalLoggedUser are the same user, except they are attached to different contexts.
}
As you can see, I would like to avoid having to fetch a new user into the current context each time.
The problem is, globalLoggedUser is part of moc1, whereas the new document is part of moc2 (or moc3, moc4, etc, depends on the thread).
So what's the best way to go about this? How can I globally save/cache an object and then use that same object to bind relationships in different contexts without incurring a penalty of having to fetch each time?
Thanks for any help you can provide.
You are correct that you can't use the same NSManagedObject across threads.
From the Core Data Programming Guide:
Using thread confinement, you should not pass managed objects or managed object contexts between threads. To "pass" a managed object from one context to another across thread boundaries, you either:
Pass its object ID (objectID) and use objectWithID: or existingObjectWithID:error: on the receiving managed object context.
The corresponding managed objects must have been saved—you cannot pass the ID of a newly-inserted managed object to another context.
Execute a fetch on the receiving context.
I think you'd be fine if you just fetched the logged in user with moc2 before you run the 'document' loop, as I don't see any reason to do the fetch each time inside the loop. (Is there some reason you are doing that?)
Don't worry about binding anything to the UserModel from thread 1, the tempUser you get from moc2 is referencing the same data in the database as globalLoggedUser.
Starting with a List of entities and needing all dependent entities through an association, is there a way to use the corresponding navigation property to load all child entities with one DB round-trip? I.e., generate a single WHERE fkId IN (...) statement via the navigation property?
More details
I've found these ways to load the children:
Keep the set of parent entities as IQueryable<T>
Not good since the db will have to find the main set every time and join to get the requested data.
Put the parent-objects into an array or list, then get related data through navigation properties.
var children = parentArray.Select(p => p.Children).Distinct()
This is slow since it will generate a select for every main-entity.
Creates duplicate objects since each set of children is created independently.
Put the foreign keys from the main entities into an array then filter the entire dependent-ObjectSet
var foreignKeyIds = parentArray.Select(p => p.Id).ToArray();
var children = Children.Where(d => foreignKeyIds.Contains(d.ForeignKeyId));
Linq then generates the desired "WHERE foreignKeyId IN (...)"-clause.
This is fast but only possible for 1:*-relations since linking-tables are mapped away.
Removes the readability advantage of EF by using Ids after all
The navigation-properties of type EntityCollection<T> are not populated
Eager loading through the .Include() methods, included for completeness (this question is about lazy loading)
Allegedly joins everything included together and returns one giant flat result.
Have to decide up front which data to use
Is there some way to get the simplicity of 2 with the performance of 3?
You could attach the parent object to your context and get the children when needed.
foreach (var parent in parents)
{
    _context.Attach(parent);
}
var children = parents.Select(p => p.Children);
Edit: for attaching multiple, just iterate.
I think finding a good answer is not possible, or at least not worth the trouble. Instead, a micro-ORM like Dapper gives the big benefit of removing the need to map between SQL columns and object properties, and does it without the need to create a model first. You also simply write the desired SQL instead of working out what LINQ to write to have it generated. IQueryable<T> will be missed, though.
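As an illustration of that point, a hedged Dapper sketch; the connection string, table and column names, and the Child class are assumptions, not from the original question:
// Requires the Dapper package; one round-trip loads all children of the given parents.
// Dapper expands the ParentIds list into an IN (...) clause and maps columns to Child properties.
using (var connection = new SqlConnection(connectionString))
{
    var parentIds = parents.Select(p => p.Id).ToArray();
    var children = connection.Query<Child>(
        "SELECT * FROM Children WHERE ParentId IN @ParentIds",
        new { ParentIds = parentIds }).ToList();
}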
I couldn't find an answer to this issue so I assume it is something I am doing wrong.
I have a PersistenceModel set up where I have set a convention as follows:
persistenceModel.Conventions.Add(DefaultLazy.Always());
However, for one of the HasManyToMany relationships in one of my entities I want eager loading to take place, which I am setting up as follows:
HasManyToMany(x => x.Affiliates).Not.LazyLoad();
Intuitively, I expect eager loading to take place since I am overriding the lazy-load default that I specified as a convention, but it still lazy loads. If I set the DefaultLazy convention to Never and then set LazyLoad on an individual relationship, that doesn't work either.
Any ideas?
When you set Not.LazyLoad(), you tell NHibernate to load Affiliates when the parent loads. NHibernate will do this by performing another select on the Affiliates many-to-many table regardless of whether you access the Affiliates collection or not. NHibernate uses another select because that is the default fetching mode. You want to override the fetching mode as well, either in the query or in the mapping. To do it in the mapping, add the following:
HasManyToMany(x => x.Affiliates)
.Not.LazyLoad()
.Fetch.Join();
You might also want to include a ".Cascade.AllDeleteOrphan()" if you want NHibernate to persist new Affiliates added to the collection and delete orphaned ones, as shown in the sketch below. If you do not do this, you will have to explicitly call session.Save(newAffiliate); otherwise you'll receive a TransientObjectException when your Affiliates collection contains a new Affiliate.
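A hedged sketch combining both suggestions in one mapping (the rest of the class map is omitted and assumed):
HasManyToMany(x => x.Affiliates)
    .Not.LazyLoad()
    .Fetch.Join()
    .Cascade.AllDeleteOrphan(); // also persist new Affiliates and delete orphaned ones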
It may be a stupid thing to ask, but have you executed the query inside your session? Say,
using (var session = OpenSession())
{
    session.Query<Entity>().ToList();
}
I had this problem before, and finally realized the objects that I was accessing hadn't been queried before disposing the session.
Using NHibernate, is it possible to fill an existing object using the results of a query, rather than returning a new entity? For example:
var foo = new Foo();
session.GetById(foo, id);
Well... kind of. If your object is transient, you can Session.Get<Foo>(id) another object into the NH identity map and then manually copy its fields into your object. If your object is persistent (attached to a session), you can Session.Refresh(foo) to re-retrieve it from the DB.
I guess you can try doing Session.Lock on your transient instance to reattach it to the session and then Session.Refresh to refresh it... It should work... at least in theory...
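A minimal sketch of that Lock-then-Refresh idea, assuming Foo exposes a settable Id and the row already exists in the DB; treat it as untested, per the answer's own caveat:
var foo = new Foo { Id = id };    // transient instance carrying a known identifier
session.Lock(foo, LockMode.None); // reattach the instance to the session without forcing a version check
session.Refresh(foo);             // re-read its state from the database into the existing object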