I am using NHibernate in my application and have the following situation. (Note that the situation here is greatly simplified for the sake of understanding.)
An object is queried from the database. It is edited and updated using Session.Update(). The object is then Evicted from the Session (using Session.Evict(obj)) before the transaction is committed. Expected result is that the changes are persisted to the database.
This used to work fine when I had my Id column as a NHibernate identity column.
Recently, we changed the Id column to be non-identity. As a result, the above scenario does not persist to the database unless I explicitly call Session.Flush() before I Evict.
Can someone point/explain to me the reason for this behavior?
This link, NHibernate Session.Flush & Evict vs Clear, mentions something about Evict and the identity column, which to me is not very clear.
Your workflow is not correct.
First, when you retrieve an object from the database, calling session.Update(entity) on it does nothing; changes are persisted automatically on Flush/Commit.
Next, Evict removes all knowledge the session has of the object, therefore it will not persist any changes applied to it. You should almost never use this method under normal conditions, which makes me think you are not handling the session correctly.
Third, the fact that using identity causes inserts to happen immediately on Save is a limitation, not a feature.
The correct workflow is:
using (var session = factory.OpenSession())
using (var transaction = session.BeginTransaction())
{
    var entity = session.Get<EntityType>(id);
    entity.SomeProperty = newValue;
    // No Update or Flush call is needed: dirty checking picks up the change
    // and it is persisted when the transaction is committed.
    transaction.Commit();
}
The exact structure (using statements, etc) can change in a desktop application, but the basic idea is the same.
Identity forces NHibernate to save the entity to the database immediately on session.Save(); non-identity generators allow it to batch the inserts and send them as a whole. Evict removes all information about the object from the session. So with identity, even though the session forgets about the entity, it is already in the database; with a non-identity generator the pending insert is discarded along with it.
To remedy that you can:
set FlushMode.Auto (NHibernate then flushes pending changes automatically when needed)
call session.Flush() before Evict (see the sketch below)
Evict only after the transaction has completed
Which of these is the best option depends on the context.
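For illustration, a minimal sketch of the second option (Flush before Evict), reusing the placeholder names from the workflow above:
using (var session = factory.OpenSession())
using (var transaction = session.BeginTransaction())
{
    var entity = session.Get<EntityType>(id);
    entity.SomeProperty = newValue;
    // Push the pending UPDATE to the database now, so the change is not
    // lost when the entity is evicted from the session.
    session.Flush();
    session.Evict(entity);
    transaction.Commit();
}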
Related
When doing a criteria query with NHibernate, I want to get fresh results and not old ones from a cache.
The process is basically:
Query persistent objects into NHibernate application.
Change database entries externally (another program, manual edit in SSMS / MSSQL etc.).
Query the persistent objects again (with the same query code); previously loaded objects should be refreshed from the database.
Here's the code (slightly changed object names):
public IOrder GetOrderByOrderId(int orderId)
{
    ...
    IList result;
    var query =
        session.CreateCriteria(typeof(Order))
            .SetFetchMode("Products", FetchMode.Eager)
            .SetFetchMode("Customer", FetchMode.Eager)
            .SetFetchMode("OrderItems", FetchMode.Eager)
            .Add(Restrictions.Eq("OrderId", orderId));
    query.SetCacheMode(CacheMode.Ignore);
    query.SetCacheable(false);
    result = query.List();
    ...
}
The SetCacheMode and SetCacheable have been added by me to disable the cache. Also, the NHibernate factory is set up with config parameter UseQueryCache=false:
Cfg.SetProperty(NHibernate.Cfg.Environment.UseQueryCache, "false");
No matter what I do, including Put/Refresh cache modes for query or session, NHibernate keeps returning outdated objects the second time the query is called, without the externally committed changes. For information: the outdated value in this case is the value of a Version column (to test whether a stale object state can be detected before saving). But I need fresh query results for multiple reasons!
NHibernate even generates an SQL query, but it is never used for the values returned.
Keeping the sessions open is necessary to do dynamic updates on dirty columns only (also, no stateless sessions as a solution!); I don't want to add Clear(), Evict() or the like everywhere in the code, especially since the query is on a lower level and doesn't know about the objects previously loaded. Pessimistic locking would kill performance (multi-user environment!).
Is there any way to force NHibernate, by configuration, to send queries directly to the DB and get fresh results, not using unwanted caching functions?
First of all: this doesn't have anything to do with second-level caching (which is what SetCacheMode and SetCacheable control). Even if it did, those control caching of the query, not caching of the returned entities.
When an object has already been loaded into the current session (also called the "first-level cache" by some people, although it's not a cache but an Identity Map), querying it again from the DB using any method will never overwrite its values.
This is by design and there are good reasons for it behaving this way.
If you need to update potentially changed values in multiple records with a query, you will have to Evict them first.
Alternatively, you might want to read about Stateless Sessions.
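As a rough sketch of what that can look like, reusing the Order/OrderId names from the question (Refresh is the simpler alternative when you still hold the instance):
// Re-read the state of an instance you already hold directly from the DB.
session.Refresh(order);

// If a query should return fresh data for entities the session may already be
// tracking, evict them first (here the whole session is cleared for simplicity)
// and run the query again.
session.Clear();
var freshOrder = session.CreateCriteria(typeof(Order))
    .Add(Restrictions.Eq("OrderId", orderId))
    .UniqueResult<Order>();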
Is this code running in a transaction? Or is that external process running in a transaction? If one of those two is still in a transaction, you will not see any updates.
If that is not the case, you might be able to find the problem in the log messages that NHibernate is creating. These are very informative and will always tell you exactly what it is doing.
Keeping the sessions open is necessary to do dynamic updates on dirty columns only
This is either the problem or it will become a problem in the future. NHibernate is doing all it can to make your life better, but you are trying to do as much as possible to prevent NHibernate from doing its job properly.
If you want NHibernate to update only the dirty columns, look at the dynamic-update attribute in your class mapping file.
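For illustration, a sketch of that mapping expressed with Fluent NHibernate (used elsewhere on this page); the class and property names are placeholders, and the hbm.xml equivalent is dynamic-update="true" on the class element:
using FluentNHibernate.Mapping;

public class OrderMap : ClassMap<Order>
{
    public OrderMap()
    {
        Table("Orders");
        DynamicUpdate();      // UPDATE statements include only the dirty columns
        Id(x => x.OrderId);
        Map(x => x.CustomerName);
        // ... remaining property mappings
    }
}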
I'm using NHibernate with Fluent NHibernate.
I have code where I start a transaction, then I enter a loop which creates several objects. For each object I check certain conditions. If these conditions are met, then I execute a session.SaveOrUpdate() on the object. At the end of the loop, I issue a commit transaction.
I have a breakpoint set on the session.SaveOrUpdate command, proving that it is never reached (because the conditions have not been met by any of the objects in the loop). Nevertheless, when the transaction is committed, the objects are saved!
I am using an AuditInterceptor and have set a breakpoint in the OnSave method. It is being called, but the stack trace only traces back to the statement that commits the transaction.
There are no objects of any kind that have had SaveOrUpdate executed on them at this point, so cascading doesn't explain it.
Why is NHibernate saving these objects?
From NHibernate ISession.Update thread:
It's the normal and default behavior:
Hibernate maintains a cache of Objects that have been inserted, updated or deleted. It also maintains a cache of Objects that have been queried from the database. These Objects are referred to as persistent Objects as long as the EntityManager that was used to fetch them is still active. What this means is that any changes to these Objects within the bounds of a transaction are automatically persisted when the transaction is committed. These updates are implicit within the boundary of the transaction and you don't have to explicitly call any method to persist the values.
From Hibernate Pitfalls part 2:
Q) Do I still have to do Save and Update inside transactions?
Save() is only needed for objects that are not persistent (such as new objects). You can use Update to bring an object that has been evicted back into a session.
From NHibernate's automatic (dirty checking) update behaviour:
I've just discovered that if I get an object from an NHibernate session and change a property on the object, NHibernate will automatically update the object on commit without me calling Session.Update(myObj)!
Answer: You can set Session.FlushMode to FlushMode.Never. This will make your operations explicit, i.e. they happen on tx.Commit() or session.Flush(). Of course this will still update the database upon commit/flush. If you do not want this behavior, then call session.Evict(yourObj) and it will then become transient and NHibernate will not issue any db commands for it.
It's to do with the session's flush mode, which by default (FlushMode.Auto) flushes any changes made to objects tracked by the session when the transaction is committed, so those changes are persisted.
There's a FlushMode property on the session that you can set. If you want a read-only transaction, specify FlushMode.Manual.
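A minimal sketch of that approach, assuming the factory/entity names used earlier on this page (older NHibernate versions use FlushMode.Never for the same purpose):
using (var session = factory.OpenSession())
using (var transaction = session.BeginTransaction())
{
    // Suppress automatic flushing: modified entities are NOT written back to
    // the database unless session.Flush() is called explicitly.
    session.FlushMode = FlushMode.Manual;

    var order = session.Get<Order>(orderId);
    // ... read-only work with 'order' ...

    transaction.Commit();   // commits without flushing the pending changes
}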
Hope this helps!
I'm looking into using NHibernate to handle the persistence layer in my business application. Right now, I'm wondering how to handle concurrent edits/updates in NHibernate, or more specifically, how to let multiple, distributed instances of my application know that one user changed a given dataset.
I'm not looking for versioning, i.e. consistency in the face of concurrent edits - I know NHibernate supports optimistic/pessimistic concurrency, and I currently use optimistic concurrency via a version column, handling the StaleStateException.
I just want to know: given that user A changes a row in the dataset, how do I let user B know that a change occurred so that the dataset can be reloaded? Right now, I'm lazy loading a large list of customers into a grid using NHibernate. Now, if a new customer is added, I'd like to add it to the grid without reloading the whole data again - the application's first concern is performance.
Also, will NHibernate gracefully handle changes to existing rows? Will it somehow detect changes in the underlying database and update the in-memory .NET objects so that accessing their properties will yield the updated values?
I thought about using an additional table, saving the IDs of updated objects along with a timestamp to refresh items myself, but if NHibernate offers something of its own, that would obviously be a much better choice...
You need database-level notifications/events for this. That is, the database engine has to provide notifications. For example, SQL Server has this feature. NHibernate runs on the application, so all it could potentially do by itself is polling the database for changes (your idea of using an additional table looks like polling), and we all know that's not good. NHibernate's SysCache2 takes advantage of SqlCacheDependencies to invalidate cache entries when the database raises a notification, but I'm not aware that it can raise any events to be consumed by the app itself (which wouldn't be useful anyway, since 2nd-level caches don't work with whole entities).
Another possible way to implement this would be having a NHibernate event listener place a broadcast message on a bus after any updates, then each application instance would receive this message and re-fetch from database accordingly.
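As a rough sketch of that second idea (the exact listener interface members vary between NHibernate versions, and IMessageBus / EntityChangedMessage are hypothetical placeholders for whatever bus abstraction you use):
using NHibernate.Event;

public class EntityChangedBroadcaster : IPostUpdateEventListener
{
    private readonly IMessageBus bus;   // hypothetical messaging abstraction

    public EntityChangedBroadcaster(IMessageBus bus)
    {
        this.bus = bus;
    }

    public void OnPostUpdate(PostUpdateEvent @event)
    {
        // Broadcast the entity type and id; other app instances re-fetch on receipt.
        bus.Publish(new EntityChangedMessage(@event.Entity.GetType().Name, @event.Id));
    }
}

// Registration during bootstrap, on the NHibernate Configuration object:
// cfg.EventListeners.PostUpdateEventListeners =
//     new IPostUpdateEventListener[] { new EntityChangedBroadcaster(bus) };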
I had to solve some similar issues on my current SQLite Fluent NHibernate project.
For detecting database updates, I used a System.IO.FileSystemWatcher. See the sample code in my answer to my own, similar, question. Also, take a look at another answer to the same question, which suggested using NHibernate Interceptors.
I also lazy load my DB into a grid. Whenever the FileSystemWatcher detects the DB has been updated, I call the following Criteria Query, and add the new records that it returns to the BindingList that is the DataSource for my grid. (I just save the highest ID of the new records in a private variable)
/// <summary>
/// Return the list of measurements with Id fields higher than the id parameter.
/// </summary>
/// <param name="id"></param>
/// <returns></returns>
public static IList<T> GetMeasurementsNewerThan<T>(int id)
{
    var session = GetMainSession();
    using (var transaction = session.BeginTransaction())
    {
        var newestMeasurements =
            session.CreateCriteria(typeof(T))
                .Add(Expression.Gt("Id", id))
                .List<T>();
        transaction.Commit();
        return newestMeasurements;
    }
}
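A minimal sketch of the wiring described in this answer; the Measurement type, the file name and path, the BindingList and lastSeenId are illustrative names, not part of the original code:
int lastSeenId = 0;
var measurements = new BindingList<Measurement>();   // the grid's DataSource

var watcher = new FileSystemWatcher(databaseDirectory, "measurements.db");
watcher.Changed += (sender, args) =>
{
    // In a WinForms app this callback runs off the UI thread, so the grid
    // update would need to be marshalled back with Invoke/BeginInvoke.
    foreach (var m in GetMeasurementsNewerThan<Measurement>(lastSeenId))
    {
        measurements.Add(m);                          // new rows appear in the grid
        lastSeenId = Math.Max(lastSeenId, m.Id);
    }
};
watcher.EnableRaisingEvents = true;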
This is a common question, but the explanations found so far and observed behaviour are some way apart.
We want the following NHibernate strategy in our MVC website:
A SessionScope for the request (to track changes)
An ActiveRecord.TransactionScope to wrap our inserts only (to enable rollback/commit of the batch)
Selects to be outside a Transaction (to reduce extent of locks)
Delayed Flush of inserts (so that our insert/updates occur as a UoW at end of session)
Now currently we:
Don't get the implied transaction from the SessionScope (with FlushAction Auto or Never)
If we use ActiveRecord.TransactionScope there is no delayed flush and any contained selects are also caught up in a long-running transaction.
I'm wondering if it's because we have an old version of NHibernate (it was from trunk, very near 2.0).
We just can't get the expected NHibernate behaviour, and performance suffers (using NHProf and SQL Profiler to monitor db locks).
Here's what we have tried since:
Written our own TransactionScope (inherits from ITransactionScope) that:
Opens an ActiveRecord.TransactionScope on Commit, not in the ctor (delays the transaction until needed)
Opens a 'SessionScope' in the ctor if none are available (as a guard)
Converted our ids to Guid from identity
This stopped the auto flush of insert/update outside of the Transaction (!)
Now we have the following application behaviour:
Request from MVC
SELECTs needed by services are fired, all outside a transaction
Repository.Add calls do not hit the db until scope.Commit is called in our Controllers
All INSERTs / UPDATEs occur wrapped inside a transaction as an atomic unit, with no SELECTs contained.
... But for some reason NHProf now disagrees with SQL Profiler (SELECTs seem to happen in the DB before NHProf reports them).
NOTE
Before I get flamed I realise the issues here, and know that the SELECTs aren't in the Transaction. That's the design. Some of our operations will contain the SELECTs (we now have a couple of our own TransactionScope implementations) in serialised transactions. The vast majority of our code does not need up-to-the-minute live data, and we have serialised workloads with individual operators.
ALSO
If anyone knows how to get an identity column (non-PK) refreshed post-insert without needing to manually refresh the entity, in particular by using ActiveRecord markup (I think it's possible in NHibernate mapping files using a 'generated' attribute), please let me know!
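On that last point, here is a hedged sketch of the mapping route the question alludes to, expressed with Fluent NHibernate rather than ActiveRecord markup (class and property names are placeholders; in hbm.xml terms this corresponds to generated="insert" with insert="false" on the property, and the exact Fluent calls may differ by version):
public class ThingMap : ClassMap<Thing>
{
    public ThingMap()
    {
        Id(x => x.Id).GeneratedBy.GuidComb();
        Map(x => x.LegacyNumber)
            .Generated.Insert()   // value is produced by the database on insert...
            .Not.Insert();        // ...so NHibernate must not include it in the INSERT
        // After an INSERT, NHibernate selects the generated value back into the entity.
    }
}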
Is it possible to create a read-only connection in NHibernate?
Read-only: one where NHibernate will not flush any changes to the underlying database, implicitly or explicitly.
When an NHibernate connection is closed, it automatically flushes out the changes to the persistent objects.
Setting the flush mode to Never is one way, but it is reversible (i.e. some code can reset the flush mode).
I think you've already found the solution, setting flush mode to never. Yes, it is changeable but even if it wasn't, code could simply create another session that had a different flush mode.
I think the appropriate solution is to suggest read-only with session.FlushMode = FlushMode.Never and enforce it by using a connection to the database that only has SELECT permissions (or whatever is appropriate for your situation). Maintaining separate ISessionFactory factories might help by allowing something like ReadOnlySessionFactory.Create().
Take a look at Read Only entities that became available in NHibernate 3.1 GA
https://nhibernate.jira.com/browse/NH-908
There is a newer read-only feature in NHibernate (I don't know which version, but it's in 3.3.0 for sure). You can set the session to read-only using this:
session.DefaultReadOnly = true;
It disables the snapshot of old values that the session keeps for dirty checking, and therefore improves performance and memory consumption.
There is a chapter about read-only entities in the NHibernate reference documentation.
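For illustration, a minimal sketch of using it, with placeholder factory/entity names:
using (var session = factory.OpenSession())
{
    // Entities loaded by this session are read-only: NHibernate keeps no
    // snapshot for dirty checking and will not flush modifications to them.
    session.DefaultReadOnly = true;

    var order = session.Get<Order>(orderId);
    order.Status = "changed in memory only";   // never written to the database

    // Individual instances can still be opted back in:
    // session.SetReadOnly(order, false);
}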
Accumulating updates, and just never flushing seems like a bad solution to me.
I posted a similar question. The solution provided uses a different approach. All the events are set to empty, and thus ignored. My feeling is that it's a better approach.
I am surprised that this is not easier to do. I like the entity framework approach of using an extension method .AsNoTracking() which ensures that read only queries remain that way.
How to create an NHibernate read-only session with Fluent NHibernate that doesn't accumulate updates?