NHibernate update on single property updates all properties in SQL

I am performing a standard update of a single property in NHibernate. However, on commit of the transaction the SQL update seems to set all the fields I have mapped on the table, even though they have not changed. Surely this can't be normal behaviour in NHibernate? Am I doing something wrong? Thanks
using (var session = sessionFactory.OpenSession())
{
    using (var transaction = session.BeginTransaction())
    {
        var singleMeeting = session.Load<Meeting>(10193);
        singleMeeting.Subject = "This is a test 2";
        transaction.Commit();
    }
}

This is the normal behavior. You can try adding dynamic-update="true" to your class definition to override this behavior.
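For reference, a minimal sketch of what that might look like in an hbm.xml mapping (the class, table and property names below are illustrative, based on the Meeting entity in the question):

<class name="Meeting" table="Meeting" dynamic-update="true">
  <id name="Id">
    <generator class="native" />
  </id>
  <property name="Subject" />
</class>

With dynamic-update="true", NHibernate builds the UPDATE statement at flush time and includes only the columns that actually changed.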

Well, yes, this is normal behaviour for NHibernate. You can use the generated attribute on your properties to change the behaviour. Details are on Ayende's blog.
The reason this is the default is that with dynamic updates you don't get your query plan cached. And usually you don't mind sending a few more bytes over the high-speed network connection between your application server and the database, unless you are saving long strings, in which case this setting is perfectly appropriate.

Related

Session Flush not Showing SQL when persisting unsaved entities

The scenario is a (more complex) version of the following:
IList<T> ts = Session.QueryOver<T>().List();
// modify data of multiple objects
ts[0].Foo = "foo0";
ts[1].Foo = "foo1";
using (ITransaction trx = Session.BeginTransaction())
{
    // save only one object
    Session.Save(ts[0]);
    trx.Commit();
}
As NH goes, this will also save ts[1] by default, to prevent stale state (side note: we love control over our SQL, so we turn that off by setting Session.FlushMode = FlushMode.Never).
What really vexes me is the fact that, even though show_sql is activated, no SQL is shown for the ts[1] updates that are definitely sent to the database by the flush.
Is there any way I can get those to show up?
As stated in https://stackoverflow.com/a/9403516/1236044, you just need to add the adonet.batch_size setting with value 0 to your config:
<property name="adonet.batch_size">0</property>
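For completeness, that property sits alongside show_sql in the session-factory section of the standard NHibernate XML configuration; a sketch (connection and dialect properties omitted):

<session-factory>
  <!-- connection/dialect properties omitted -->
  <property name="show_sql">true</property>
  <property name="adonet.batch_size">0</property>
</session-factory>

With batching disabled (size 0), each statement is sent individually and therefore shows up in the show_sql output.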

NHibernate ISession.Save() - Why is this persisting my entities immediately?

I am creating a large number of entities with NHibernate, attaching them to my ISession, and then using a transaction to commit my changes to the database. Code sample is below:
ISession _context = SessionProvider.OpenSession();

//Create new entities
for (int i = 0; i < 100; i++)
{
    MyEntity entity = new MyEntity(i);
    //Attach new entity to the context
    _context.Save(entity);
}

//Persist all changes to the database
using (var tx = _context.BeginTransaction())
{
    //Flush the session
    tx.Commit();
}
I was under the impression that the line _context.Save() simply makes the ISession aware of the new entity, but that no changes are persisted to the database until I Flush the session via the line tx.Commit().
What I've observed though, is that the database gets a new entity every time I call _context.Save(). I end up with too many individual calls to the database as a result.
Does anyone know why ISession.Save() is automatically persisting changes? Have I misunderstood something about how NHibernate behaves? Thanks.
EDIT - Just to clarify (in light of the two suggested answers): my problem here is that the database IS getting updated as soon as I call _context.Save(). I don't expect this to happen; I expect nothing to be inserted into the database until I call tx.Commit(). Neither of the two suggested answers so far helps with this, unfortunately.
Some good information on identity generators can be found here
Try:
using (ISession _context = SessionProvider.OpenSession())
using (var tx = _context.BeginTransaction())
{
    //Create new entities
    for (int i = 0; i < 100; i++)
    {
        MyEntity entity = new MyEntity(i);
        //Attach new entity to the context
        _context.Save(entity);
    }
    //Flush the session
    tx.Commit();
}
Which identity generator are you using? If you are using post-insert generators like MSSQL/MySQL's Identity or Oracle's sequence to generate the value of your Id fields, that is your problem.
From NHibernate POID Generators Revealed:
Post-insert generators, as the name suggests, assign the id after the entity is stored in the database. A select statement is executed against the database. They have many drawbacks, and in my opinion they must be used only on brownfield projects. Those generators are what WE DO NOT SUGGEST as NH Team.
Some of the drawbacks are the following:
Unit of Work is broken with the use of those strategies. It doesn't matter if you're using FlushMode.Commit, each Save results in an insert statement against the DB. As a best practice, we should defer insertions to the commit, but using a post-insert generator makes it commit on save (which is what UoW doesn't do).
Those strategies nullify the batcher; you can't take advantage of sending multiple queries at once (as it must go to the database at the time of Save).
You can set your batch size in your configuration:
<add key="hibernate.batch_size" value="10" />
Or you can set it in code. And make sure you do your saves within a transaction scope.
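If you prefer to set it in code rather than in the config file, here is a sketch of one way to do it, assuming you build the configuration yourself before creating the session factory:

var cfg = new NHibernate.Cfg.Configuration();
cfg.Configure(); // read the usual XML configuration first
// NHibernate.Cfg.Environment.BatchSize is the "adonet.batch_size" property
cfg.SetProperty(NHibernate.Cfg.Environment.BatchSize, "10");
var sessionFactory = cfg.BuildSessionFactory();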
Try setting the FlushMode to Commit:
ISession _context = SessionProvider.OpenSession();
_context.FlushMode = FlushMode.Commit;
peer's suggestion to set the batch size is good also.
My understanding is that when using database identity columns, NHibernate will defer inserts until the session is flushed unless it needs to perform the insert in order to retrieve a foreign key or ensure that a query returns the expected results.
Well, a few things come together here:
rebelliard's answer is a possibility, depending on your mapping.
You are not using explicit transactions (StuffHappens' answer).
The default flush mode is Auto, and that complicates things (Jamie Ide's answer).
If by any chance you make any queries using the NHibernate API, the default behaviour is to flush the cache to the database first, so that the results of those queries match the session's entity representation.
What about:
ISession _context = SessionProvider.OpenSession();

//Persist all changes to the database
using (var tx = _context.BeginTransaction())
{
    //Create new entities
    for (int i = 0; i < 100; i++)
    {
        MyEntity entity = new MyEntity(i);
        //Attach new entity to the context
        _context.Save(entity);
    }
    //Flush the session
    tx.Commit();
}

NHibernate 3.0: TransactionScope and Auto-Flushing

In NHibernate 3.0, FlushMode.Auto does not work when running under an ambient transaction only (that is, without starting an NHibernate transaction). Should it?
using (TransactionScope scope = new TransactionScope())
{
    ISession session = sessionFactory.OpenSession();
    MappedEntity entity = new MappedEntity() { Name = "Entity", Value = 20 };
    session.Save(entity);
    entity.Value = 30;
    session.SaveOrUpdate(entity);

    // This returns one entity, when it should return none
    var list = session
        .CreateQuery("from MappedEntity where Value = 20")
        .List<MappedEntity>();
}
(Example shamelessly stolen from this related question)
In the NHibernate source I can see that it's checking whether there's a transaction in progress (in SessionImpl.AutoFlushIfRequired), but the relevant method (SessionImpl.TransactionInProgress) does not consider ambient transactions - unlike its cousin ConnectionManager.IsInActiveTransaction, which does.
Good news: thanks to Jeff Sternal (who nicely identified the problem) I updated https://nhibernate.jira.com/browse/NH-3583, and thanks to the NH team there is already a fix and a pull request, so this issue will be fixed in the upcoming 4.1.x.x release.
You should always use an explicit NHibernate transaction:
using (TransactionScope scope = new TransactionScope())
using (ISession session = sessionFactory.OpenSession())
using (ITransaction transaction = session.BeginTransaction())
{
    //Do work here
    transaction.Commit();
    scope.Complete();
}
I see you also wrote in the NH dev list - while this can change in the future, that's how it works now.
The answer provided by Diego does not work if you have an Oracle database (related question). The session.BeginTransaction() call will fail because the connection is already part of a transaction.
It looks like we have to write some code around this problem in our application (WCF, NHibernate, Oracle), but it just feels like something NHibernate should provide out of the box.
So if anyone has a good answer, it would be really appreciated.
For me, I don't know the reason behind it, but forcing a session flush before the session is disposed seemed to work, e.g.:
using (session)
{
    //do your work
    session.Flush();
}
I verified that this works with my distributed transaction; without it I always get "transaction has aborted" when the TransactionScope is disposed.
Even setting session.FlushMode to Commit did NOT work for me.

NHibernate ISession.Update

I have noticed, by using log4net, that when calling ISession.Update, it updates all the changed objects.
For example:
// Change 2 instances
user1.IsDeleted = true;
user2.UserName = "Xyz";

// Call session.Update to update the 2 users
using (ITransaction transaction = session.BeginTransaction())
{
    Session.Update(user1); // This updates both user1 & user2
    transaction.Commit();
}

using (ITransaction transaction = session.BeginTransaction())
{
    Session.Update(user2); // Now there is no need for this
    transaction.Commit();
}
Is this the default behavior of NHibernate or has something to do with my mapping file?
Can I make NHibernate update one by one?
It's the normal and default behavior:
Hibernate maintains a cache of objects that have been inserted, updated or deleted. It also maintains a cache of objects that have been queried from the database. These objects are referred to as persistent objects as long as the EntityManager that was used to fetch them is still active. What this means is that any changes to these objects within the bounds of a transaction are automatically persisted when the transaction is committed. These updates are implicit within the boundary of the transaction and you don't have to explicitly call any method to persist the values.
From Hibernate Pitfalls part 2:
Q) Do I still have to do Save and Update inside transactions?
Save() is only needed for objects that are not persistent (such as new objects). You can use Update to bring an object that has been evicted back into a session.
From NHibernate's automatic (dirty checking) update behaviour:
I've just discovered that if I get an object from an NHibernate session and change a property on the object, NHibernate will automatically update the object on commit without me calling Session.Update(myObj)!
Answer: You can set Session.FlushMode to FlushMode.Never. This will make your operations explicit, i.e. they happen on tx.Commit() or session.Flush(). Of course this will still update the database upon commit/flush. If you do not want this behavior, then call session.Evict(yourObj) and it will then become transient and NHibernate will not issue any db commands for it.
This is the default behavior when the FlushMode of the session is Auto or Commit.
In these cases, calling transaction.Commit() flushes the session and updates ALL dirty persistent objects, so removing the Session.Update calls wouldn't make any difference.
Can I make NHibernate update one by one?
Yes: use FlushMode.Never, or postpone committing the session if possible. I guess you don't need to use Evict for this case.
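To illustrate the two options above, here is a rough sketch based on the user1/user2 example from the question (not tested against your mapping):

session.FlushMode = FlushMode.Never; // nothing is written until you flush explicitly

user1.IsDeleted = true;
user2.UserName = "Xyz";

session.Evict(user2); // user2 is detached now; its change will not be persisted

using (ITransaction transaction = session.BeginTransaction())
{
    session.Flush();      // issues the UPDATE for user1 only
    transaction.Commit();
}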

Flushing in NHibernate

This question is a bit of a dupe, but I still don't understand the best way to handle flushing.
I am migrating an existing code base, which contains a lot of code like the following:
private void btnSave_Click()
{
    SaveForm();
    ReloadList();
}

private void SaveForm()
{
    var foo = FooRepository.Get(_editingFooId);
    foo.Name = txtName.Text;
    FooRepository.Save(foo);
}

private void ReloadList()
{
    fooRepeater.DataSource = FooRepository.LoadAll();
    fooRepeater.DataBind();
}
Now that I am changing the FooRepository to NHibernate, what should I use for the FooRepository.Save method? Should the repository always flush the session when the entity is saved?
I'm not sure if I understand your question, but here is what I think:
Think in terms of "putting objects into the session" instead of "getting and storing data". NH will store all new and changed objects in the session without any special call to it.
Consider these scenarios:
Data change:
Get data from the database with any query. The entities are now in the NH session
Change entities by just changing property values
Commit the transaction. Changes are flushed and stored to the database.
Create a new object:
Call a constructor to create a new object
Store it to the database by calling "Save". It is in the session now.
You still can change the object after Save
Commit the changes. The latest state will be stored to the database.
If you work with detached entities, you also need Update or SaveOrUpdate to put detached entities to the session.
Of course you can configure NH to behave differently. But it works best if you follow this default behaviour.
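To make that concrete for the question, here is a hedged sketch of what FooRepository.Save could look like under this approach (shown as an instance class holding the current ISession, with an int id assumed; adapt to however your repository gets its session):

public class FooRepository
{
    private readonly ISession _session;

    public FooRepository(ISession session)
    {
        _session = session;
    }

    public Foo Get(int id)
    {
        return _session.Get<Foo>(id);
    }

    public void Save(Foo foo)
    {
        // Just put the object into the session; no Flush() here.
        // The actual SQL is issued when the surrounding transaction commits
        // (or when the session is flushed by the unit of work).
        _session.SaveOrUpdate(foo);
    }
}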
It doesn't matter whether or not you explicitly flush the session between modifying a Foo entity and loading all Foos from the repository. NHibernate is smart enough to auto-flush itself if you have made changes in the session that may affect the results of the query you are trying to run.
Ideally I try to use one session per "unit of work". This means one cohesive piece of work which may involve several smaller steps. If you feel that you do not have a seam in your architecture where you can achieve this, then managing the session inside the repository will also work. Just be aware that you are missing out on some of the power that NHibernate provides you.
I'd vote up Stefan Moser's answer if I could - I'm still getting to grips with NH myself, but I think it's nice to be able to write code like this:
private void SaveForm()
{
    using (var unitofwork = UnitOfWork.Start())
    {
        var foo = FooRepository.Get(_editingFooId);
        var bar = BarRepository.Get(_barId);
        foo.Name = txtName.Text;
        bar.SomeOtherProperty = txtBlah.Text;
        FooRepository.Save(foo);
        BarRepository.Save(bar);
        UnitOfWork.CommitChanges();
    }
}
This way either the whole action succeeds or it fails and rolls back, keeping flushing/transaction management outside of the repositories.