I'm using NHibernate for my data access layer.
I have an entity previously loaded in memory, and I'm making changes to it in order to save them to the database later. The problem comes when my application is running on several machines at the same time, and another user has deleted from the database the same object that I hold in memory and want to save. When I try to save the changes or delete this entity, a StaleStateException is fired.
I check whether an entity exists in the database by calling session.Get&lt;T&gt; like this (it successfully returns null):
using (var session = NHibernateSessionHelper.OpenSession())
{
    using (var transaction = session.BeginTransaction())
    {
        // Get returns null when no row with this id exists.
        var entity = session.Get<T>(persistObject.Id);
        return entity != null;
    }
}
The problem is that I can't differentiate between the entity having been deleted by another session/user (so my in-memory entity is obsolete) and the entity being newly created and ready to save.
I think the only solution is to implement a mechanism that checks whether the entity has already been saved to, or loaded from, the database, in order to discard the entity or save it as appropriate.
Is there a way to check this with NHibernate? I've tried session.Refresh() and session.Get&lt;T&gt;, but I still can't tell whether the object is new and ready to save, or obsolete.
Help very appreciated.
The situation you have is a typical error-handling one. Because one user has decided to delete an object in your database while another user wants to save a change to the same object, you can't say what the right state should be. The user is the one who should decide what to do with the situation, and you can make your error handling smarter by giving him some choices. For situations where multiple users write to the same object, you should also implement a mechanism like optimistic locking, to prevent one user from overwriting a change he did not see. If you have many of these situations, you should think about redesigning your db/object structure so less data is edited at once, or rethink the work/processes your users perform with the system.
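As a minimal sketch of the optimistic-locking idea with Fluent NHibernate (the `Employee` entity, its properties, and the surrounding save code are illustrative assumptions, not taken from the question):

```csharp
// Hypothetical entity; the Version property is maintained by NHibernate.
public class Employee
{
    public virtual Guid Id { get; set; }
    public virtual string Name { get; set; }
    // Incremented on every update; a mismatch between this value and the
    // row in the database raises a StaleObjectStateException on save.
    public virtual int Version { get; set; }
}

public class EmployeeMap : ClassMap<Employee>
{
    public EmployeeMap()
    {
        Id(x => x.Id);
        Map(x => x.Name);
        Version(x => x.Version); // enables optimistic locking
    }
}

// At save time, surface the conflict to the user instead of crashing:
try
{
    using (var tx = session.BeginTransaction())
    {
        session.Update(employee);
        tx.Commit();
    }
}
catch (StaleObjectStateException)
{
    // Another user changed or deleted the row in the meantime;
    // let the user decide whether to reload, overwrite, or discard.
}
```

The catch block is where the "give the user some choices" part of the answer would live.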
I have a problem with the cache in Fluent NHibernate. I want to disable it for a query by ID, e.g.
session.Get<Person>(10);
Do you have any ideas ?
Are you referring to the first-level (session) cache?
You can refresh the state of an entity from the database by using Refresh, that is:
// Will get the state from the first-level cache if already present in the session:
var entity = Session.Get<EntityType>(entityId);
// Line below will update the entity with the current state from the database:
Session.Refresh(entity);
If you already hold the entity, call session.Refresh(person) on it directly instead of getting it again.
You may also evict it with session.Evict(person), causing it to no longer be in the session, and no longer tracked for changes either. Then discard it, and get it again later if you need it.
Otherwise, it is unusual to consider getting it from the session cache a problem. This is frequently a sign of bad session usage, such as using the same session across many user interactions (an anti-pattern).
You can still do what Fredy proposes, or call session.Clear() before getting, to clear the session cache (losing all pending changes along the way).
Instead of the mapped Person object, you could create a DTO for Person and do a QueryOver().
The PersonDTO object won't be cached in NHibernate's first-level cache.
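A hedged sketch of that DTO approach (the `PersonDTO` shape and its property names are assumptions):

```csharp
// Project directly into a DTO; the result is not a mapped entity,
// so the session's first-level cache is bypassed entirely.
PersonDTO dto = null;
var person = session.QueryOver<Person>()
    .Where(p => p.Id == 10)
    .SelectList(list => list
        .Select(p => p.Id).WithAlias(() => dto.Id)
        .Select(p => p.Name).WithAlias(() => dto.Name))
    .TransformUsing(Transformers.AliasToBean<PersonDTO>())
    .SingleOrDefault<PersonDTO>();
```

`Transformers.AliasToBean` lives in the `NHibernate.Transform` namespace; each call to this query hits the database rather than the session cache.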
The question is about Doctrine, but I think it can be extended to many ORMs.
Detach:
An entity is detached from an EntityManager and thus no longer managed
by invoking the EntityManager#detach($entity) method on it or by
cascading the detach operation to it. Changes made to the detached
entity, if any (including removal of the entity), will not be
synchronized to the database after the entity has been detached.
Merge:
Merging entities refers to the merging of (usually detached) entities
into the context of an EntityManager so that they become managed
again. To merge the state of an entity into an EntityManager use the
EntityManager#merge($entity) method. The state of the passed entity
will be merged into a managed copy of this entity and this copy will
subsequently be returned.
I understand (almost) how this works, but the question is: why would one need detaching/merging entities? Can you give me an example/scenario where these two operations can be used/needed?
When should I detach an entity?
Detaching an entity from an EM (EntityManager) is widely used when you deal with more than one EM and want to avoid concurrency conflicts, for example:
$user = $em->find('models\User', 1);
$user->setName('Foo');
// You cannot remove this user,
// because it is still attached to the first EntityManager.
$em2->remove($user);
$em2->flush();
You cannot take control of the $user object through $em2, because its session belongs to $em, which initially loaded $user from the database. So how do you solve the problem above? You need to detach the object from the original $em first:
$user = $em->find('models\User', 1);
$user->setName('Foo');
$em->detach($user);

$em2->remove($user);
$em2->flush();
When should I use the merge function?
Basically, when you want to update an entity:
$user = $em->find('models\User', 1);
$user->setName('Foo');
$em->merge($user);
$em->flush();
The EM will compare the $user in the database with the $user in memory. Once the EM recognizes the changed fields, it updates only those and keeps the old ones.
The flush method triggers a commit, and the user name will be updated in the database.
You would need to detach an entity when dealing with issues of concurrency.
Suppose you are using an asynchronous API that makes callbacks to your project. When you issue the API call along with callback instruction, you might still be managing the entity that is affected by the callback, and therefore overwrite the changes made by the callback.
You can also detach an entity when you have permanent data in your database but your code modifies those entities depending on the user account.
For example, take a browser game with characters and attacks to fight with. AttackOne used by "UserFoo" (lvl 90) will be modified with better bonuses than when it is used by "UserBarr" (lvl 20), but in our database AttackOne is the same attack the whole time.
Using Fluent NHibernate, I have a property on a class mapped using Version:
Version(x => x.Version);
When I save the object, the Version property gets incremented in the database as I would expect, but the value of the property on the object only seems to change sometimes.
using (var tx = session.BeginTransaction())
{
    session.Merge(item);
    tx.Commit();
    // item.Version is sometimes still 1 here, when I expect it to be 2.
}
The problem is then that if it remains as 1 and I make more changes and save again I get a StaleObjectStateException.
What's weird is that sometimes it works fine and the item.Version value does get correctly incremented, but I can't figure out the difference between the cases where it does and the cases where it doesn't.
I've tried searching but can't seem to find any documentation on this. Can anyone explain what NHibernate's expected behaviour is with the Version mapping?
[NHibernate version 2.1.2]
From the ISession.Merge documentation:
Copy the state of the given object onto the persistent object with the same identifier. If there is no persistent instance currently associated with the session, it will be loaded. Return the persistent instance. If the given instance is unsaved, save a copy of and return it as a newly persistent instance. The given instance does not become associated with the session.
So, it will not modify item.
(I might add I have never used Merge in my apps. You might want to review how you are dealing with attached and detached entities)
Did you try
item = session.Merge(item);
tx.Commit();
?
You need to flush the session before the updated version will propagate up to your entities. Unless you flush the session, you are responsible for keeping the entities up to date yourself.
You should TYPICALLY let the session flush on its own when it is closed. However, in some cases where you rely on database updates that happen via NHibernate rather than via changes you make to the entity itself, you might need to flush the session yourself after a commit. In that case, be aware that when you flush the session, ANY dirty entities will be committed. This may not be desirable, so be sure that the scope is very limited.
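For illustration, a minimal sketch of that narrow scope, assuming `item` is the versioned entity from the question:

```csharp
using (var tx = session.BeginTransaction())
{
    // Merge returns the attached copy; reassign so we hold the instance
    // whose Version property NHibernate maintains.
    item = session.Merge(item);
    tx.Commit();
}

// Explicit flush in a deliberately small scope: any other dirty entities
// tracked by this session would be written out here as well.
session.Flush();
```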
When I do this:
Cat x = Session.Load<Cat>(123);
x.Name = "fritz";
Session.Flush();
NHibernate detects the change and UPDATEs the DB. But, when I do this:
Cat x = new Cat();
Session.Save(x);
x.Name = "fritz";
Session.Flush();
I get NULL for name, because that's what was there when I called Session.Save(). Why doesn't NHibernate detect the changes - or better yet, take the values for the INSERT statement at the time of Flush()?
Added: To clarify: The Session.FlushMode is set to None, so there are no automatic flushes until I say so. I'm also using GUID primary keys (with guid.comb generator).
The reason I'm doing this is because I'm using the Session as a "dirtiness" tracker. I'm writing a Windows Forms application and every one of my forms has a separate session which lasts as long as the form does. The session is kept disconnected as much as possible so that I don't run out of ADO.NET connections (it's an MDI application and there's no limit how many forms can be opened). Every form also has an "OK" button and a "Cancel" button. The "OK" button calls Session.Flush() and then closes the form; the "Cancel" button just closes the form and silently discards all changes the user has made.
Well, at least that's what I would like. The above bug is giving me problems.
Unless you have a very good reason not to, you should use a transaction instead of an explicit flush call.
using (var session = createSession())
{
using (var transaction = session.BeginTransaction())
{
Cat x = new Cat();
session.Save(x);
x.Name = "fritz";
try
{
transaction.Commit();
}
catch
{
// prevents your database from getting corrupt when you have a bug
transaction.Rollback();
throw;
}
}
}
In a real application, it is good practice to hide transaction creation, commit, and rollback in a separate part of your application, so that you don't have to repeat them in each data-access block. In most applications with a relational database, one transaction is wrapped around all the code that processes one screen for a user. That way, all actions made by the user succeed, or all fail when there is a bug, as an atomic block: all data changed by the user on a screen is stored, or nothing is stored, never just parts of it.
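One way to hide that plumbing, as a sketch (the helper name is an assumption, not an NHibernate API):

```csharp
// Hypothetical unit-of-work helper wrapping an action in a transaction.
public static void InTransaction(ISession session, Action work)
{
    using (var tx = session.BeginTransaction())
    {
        try
        {
            work();
            tx.Commit();
        }
        catch
        {
            // Roll back so a bug cannot leave partial changes behind.
            tx.Rollback();
            throw;
        }
    }
}

// Usage: everything inside the lambda succeeds or fails atomically.
InTransaction(session, () =>
{
    var cat = new Cat();
    session.Save(cat);
    cat.Name = "fritz";
});
```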
NHibernate does track the changes, but your code as written will cause it to issue an insert with the values at the time you called Save and an update for the changes made after calling Save. Save makes the object persistent (insert), then NHibernate begins tracking changes made to the persistent object (update). Both the insert and the update are committed when the session is flushed.
We're facing similar issues with our Windows Forms application, and I think we're going to take a different approach. The FlushMode is left at Auto (the default), and the object is evicted (ISession.Evict) from the session if the user cancels the operation. Evicting the object detaches it from the session, which is exactly the desired behavior.
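As a sketch of that cancel path (the form event handlers and the `editedCat` field are assumptions):

```csharp
// With FlushMode.Auto, evicting the edited entity on cancel means the
// session no longer tracks it, so its changes are never flushed.
private void CancelButton_Click(object sender, EventArgs e)
{
    session.Evict(editedCat);
    Close();
}

private void OkButton_Click(object sender, EventArgs e)
{
    using (var tx = session.BeginTransaction())
    {
        tx.Commit(); // flushes the still-attached, dirty entities
    }
    Close();
}
```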
My main application form (WinForms) has a DataGridView, that uses DataBinding and Fluent NHibernate to display data from a SQLite database. This form is open for the entire time the application is running.
For performance reasons, I set the convention DefaultLazy.Always() for all DB access.
So far, the only way I've found to make this work is to keep a Session (let's call it MainSession) open all the time for the main form, so NHibernate can lazy load new data as the user navigates with the grid.
Another part of the application can run in the background, and Save to the DB. Currently, (after considerable struggle), my approach is to call MainSession.Disconnect(), create a disposable Session for each Save, and MainSession.Reconnect() after finishing the Save. Otherwise SQLite will throw "The database file is locked" exceptions.
This seems to be working well so far, but past experience has made me nervous about keeping a session open for a long time (I ran into performance problems when I tried to use a single session for both Saves and Loads - the cache filled up, and bogged down everything - see Commit is VERY slow in my NHibernate / SQLite project).
So, my question - is this a good approach, or am I looking at problems down the road?
If it's a bad approach, what are the alternatives? I've considered opening and closing my main session whenever the user navigates with the grid, but it's not obvious to me how I would do that - hook every event from the grid that could possibly cause a lazy load?
I have the nagging feeling that trying to manage my own sessions this way is fundamentally the wrong approach, but it's not obvious what the right one is.
Edit
It's been more than a year since I asked this question...and it turns out that keeping a main session open for the lifetime of the app has indeed led to performance problems.
There seem to be a lot more NH users on SO these days - anyone want to suggest a better approach?
Yeah, it's me again. ;-)
Stumbling upon your new question reminds me of the following: did you understand the principle of lazy loading, or are you mistaking lazy loading for pagination? NHibernate also provides functionality for the latter.
If you just want to display some defined properties within your grid that are of course within the object graph, I think you should retrieve all the data at once using fetched joins. If the row count of the data is too high, you can think about pagination; as far as I know, it's also possible using DataGridView and binding.
Lazy loading results in multiple database calls - in your case I'd think at least one per row. That does not seem to be the best-performing solution.
If instead you use paging with a fetch join, you can get rid of the long-running session and all your problems should be solved. So how about that?
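A hedged sketch of paged loading with an eager fetch (the `Cat` entity, its `Kittens` collection, and the paging variables are assumptions):

```csharp
// Load one page at a time, eagerly fetching the association the grid
// displays, so no lazy loads fire while the user scrolls.
var page = session.QueryOver<Cat>()
    .Fetch(c => c.Kittens).Eager
    .Skip(pageIndex * pageSize)
    .Take(pageSize)
    .List();
```

Because each page is fully materialized up front, the session can be short-lived: open it, fetch the page, close it.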
I had a project where there was a main grid for selection.
I had a class which paged through the set, and I called session.Clear() every time I got a new page.
class MyList : IList<Data>
{
    private const int PageSize = 50;
    private readonly ISession _session; // supplied via the constructor
    private int _firstResult = int.MinValue;
    private IList<Data> _cached;

    public MyList(ISession session)
    {
        _session = session;
    }

    public Data this[int index]
    {
        get
        {
            // Load a new page when the index falls outside the cached window.
            if (_cached == null || !index.Between(_firstResult, _firstResult + _cached.Count))
            {
                _firstResult = index;
                GetData();
            }
            if (!index.Between(_firstResult, _firstResult + _cached.Count))
                throw new IndexOutOfRangeException();
            return _cached[index - _firstResult];
        }
        set { throw new NotSupportedException(); }
    }

    private void GetData()
    {
        // Clear the session on every new page so the first-level cache
        // cannot grow without bound.
        _session.Clear();
        _cached = _session.QueryOver<Data>()
            .Skip(_firstResult)
            .Take(PageSize)
            .List();
    }

    // Remaining IList<Data> members and the Between() helper omitted.
}
If you need data binding, maybe implement IBindingList as well.