Create Entity by using EntityListener - cuba-platform

I have a Customer entity which has an association to a CustomerBudget entity. A CustomerEntityListener is supposed to create the CustomerBudget.
I get the following error:
IllegalStateException: During synchronization a new object was found through a relationship that was not marked cascade PERSIST: de.company.entity.Customer-c4775b5b-413b-0567-3612-e0860bca9300 [new,managed].
The code in onAfterInsert(Customer entity):
LoadContext<Customer> loadContext = LoadContext.create(Customer.class);
loadContext.setId(entity.getId());
Customer customer = dataManager.load(loadContext);
CustomerBudget customerBudget = new CustomerBudget();
customerBudget.setCustomer(customer);
CommitContext commitContext = new CommitContext(customerBudget);
dataManager.commit(commitContext);
How can I create and persist entities in an EntityListener?

You can implement the BeforeInsertEntityListener interface and create the new entity in the current persistence context via EntityManager.
The listener may look as follows:
@Component("demo_CustomerEntityListener")
public class CustomerEntityListener implements BeforeInsertEntityListener<Customer> {

    @Inject
    private Metadata metadata;

    @Inject
    private Persistence persistence;

    @Override
    public void onBeforeInsert(Customer entity) {
        // Create the dependent entity and persist it in the same
        // persistence context, so it is saved in the same transaction.
        CustomerBudget customerBudget = metadata.create(CustomerBudget.class);
        customerBudget.setCustomer(entity);
        persistence.getEntityManager().persist(customerBudget);
    }
}
The new entity will be saved to the database on the transaction commit together with the Customer.
The Metadata.create() method is the preferred way to instantiate entities - for entities with integer identifiers it assigns an ID from a sequence; it also handles entity extension if needed.
Entity listeners work in the transaction that saves the entity, which allows you to make atomic changes in your data - everything will be either saved or discarded together with the entity the listener is invoked on.
Unlike EntityManager, DataManager always creates a new transaction, so you should use it in an entity listener only if you really want to load or save entities in a separate transaction. I'm not sure why you get this particular exception, but your code looks odd: you load the Customer instance from the database instead of using the instance that is passed to the listener. When I tried to run your code on HSQLDB, the server went into an infinite lock: the new transaction started inside DataManager waits until the current transaction saving the Customer is committed.

Related

How to save and then update same class instance during one request with NHibernate?

I'm relatively new to NHibernate and I've got a question about it.
I use this code snippet in my MVC project in Controller's method:
MyClass entity = new MyClass
{
    Foo = "bar"
};
_myRepository.Save(entity);
....
entity.Foo = "bar2";
_myRepository.Save(entity);
The first time, the entity is saved to the database successfully. But the second time, no request goes to the database at all. My Save method in the repository just does:
public void Save(T entity)
{
    _session.SaveOrUpdate(entity);
}
What should I do to be able to save and then update this entity during one request? If I add _session.Flush(); after saving the entity it works, but I'm not sure if it's the right thing to do.
Thanks
This is the expected behavior.
Changes are only saved on Flush
Flush may be called explicitly or implicitly (see 9.6. Flush)
When using an identity generator (not recommended), inserts are sent immediately, because that's the only way to return the ID.
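A short hedged illustration of that last point (the mapping choice in the comment is an example, not taken from the question):

// With an identity-style POID generator (e.g. <generator class="identity"/>
// in the hbm.xml mapping), this call already issues the INSERT, because that
// is the only way for NHibernate to get the generated ID back:
_session.Save(entity);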
You should be using transactions.
A couple of good sources: here and here.
Also, Summer of NHibernate is how I first started with NHibernate. It's a very good resource for learning the basics.
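As a rough sketch of that advice, assuming the repository's ISession is available as _session (the entity names reuse the question's code; the transaction API itself is standard NHibernate):

using (ITransaction tx = _session.BeginTransaction())
{
    MyClass entity = new MyClass { Foo = "bar" };
    _session.SaveOrUpdate(entity);   // registered with the session

    entity.Foo = "bar2";             // the entity is already tracked,
                                     // no second Save call is needed

    tx.Commit();                     // pending changes are flushed and written here
}

Committing the transaction flushes the session, so there is no need to call _session.Flush() by hand.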

NHibernate does not delete entity

In the TestFixtureTearDown part of an NUnit test I try to delete some test entities created in the TestFixtureSetUp part. I use the following code:
sessionFactory = NHibernateHelper.CreateSessionFactory(cssc["DefaultTestConnectionString"].ConnectionString);
uow = new NHibernateUnitOfWork(sessionFactory);
var g = reposGebruiker.GetByName(gebruiker.GebruikerNaam);
reposGebruiker.Delete(g);
var k = reposKlant.GetByName(klant.Naam);
reposKlant.Delete(k);
// Commit changes to persistent storage
uow.Commit();
However, after the commit, the two entities were still in the database. After searching I came across this page on SO, so I added:
uow.Session.Flush();
However, still the entities remain in the DB. Does anyone have an idea as to why this is?
I've never used the UoW class you're using, but my projects are implemented using ISession.BeginTransaction and ISession.Transaction.Commit in a helper like this:
public void CreateContext(Action logic)
{
    // _session is the active ISession instance
    _session.BeginTransaction();
    logic();
    _session.Transaction.Commit();
}
And then:
CreateContext(() =>
    _session.Delete(someObject));
This should work.
I want to mention that this is an example, and you'd want to make some abstractions.
How are the repositories created? For the delete to succeed, the objects must be loaded in the same UoW (ISession) in which the Delete command is issued. The Delete method makes the objects non-persistent and marks them for deletion.
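To make that concrete, here is a minimal sketch that loads and deletes through one session and commits a transaction so the DELETE is actually flushed. The Gebruiker type and GebruikerNaam property come from the question; everything else is assumed:

using (ISession session = sessionFactory.OpenSession())
using (ITransaction tx = session.BeginTransaction())
{
    // Load and delete through the same session, then commit so the
    // pending DELETE statements reach the database.
    Gebruiker g = session.QueryOver<Gebruiker>()
                         .Where(x => x.GebruikerNaam == gebruiker.GebruikerNaam)
                         .SingleOrDefault();
    if (g != null)
        session.Delete(g);

    tx.Commit();
}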

NHibernate FlushMode: How do I set up NHibernate for automatically updating an entity

After I retrieve an entity, I change one of its properties.
Then I retrieve the same entity again.
How do I tell NHibernate that it should update the entity before loading it again?
Here is the code:
EmployeeRepository employeeRepository = new EmployeeRepository();
Employee employee = employeeRepository.GetById(4);
employee.LastName = "TEST!!!";
Employee employee2 = employeeRepository.GetById(4);
Currently NHibernate doesn't issue an update in my program. I thought just setting the FlushMode to Auto would update the entity automatically.
EDIT
The background is that I am trying to reproduce this behaviour in another application.
There is NO save method! Just this code. The NHibernate version is really old, version 1.2.1.4000; maybe that is the catch.
When I set the FlushMode in the brownfield application to Commit, no update statement is generated.
But in my own project I still cannot reproduce this "automatic" behaviour.
Are both calls to the employeeRepository ultimately using the same NHibernate ISession instance? If so, then they will return the same object, and the updated LastName value will be reflected. If not, then you will need to make sure you are disposing your ISession instance each time to take advantage of auto flushing.
According to the documentation for the default FlushMode of Auto:
"The ISession is sometimes flushed before query execution in order to ensure that queries never return stale state. This is the default flush mode."
So you have to manually flush the session to ensure that your changes are persisted before reading the object again.
EmployeeRepository employeeRepository = new EmployeeRepository();
Employee employee = employeeRepository.GetById(4);
employee.LastName = "TEST!!!";
session.Flush();
Employee employee2 = employeeRepository.GetById(4);
If your repository is using the same ISession for both calls (as it should imo) then employee 4 will be retrieved from the cache and have the change. However, the change will not have been persisted to the database yet.
If your repository GetById methods uses a new session for each call then it will always hit the database to retrieve the employee. If you're disposing of the session in the method then your objects are returned as detached from a session. This strategy defeats the purpose of NHibernate and relegates it to a simple data access tool.
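For illustration, a hypothetical EmployeeRepository that takes the ISession from the caller instead of opening and disposing its own (the question does not show the real repository, so this is only an assumption about its shape):

public class EmployeeRepository
{
    private readonly ISession _session;

    public EmployeeRepository(ISession session)
    {
        _session = session;
    }

    public Employee GetById(int id)
    {
        // Get() returns the instance already tracked by this session if it
        // has been loaded before, so in-memory changes stay visible.
        return _session.Get<Employee>(id);
    }
}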

Unit Of Work for non-trivial CRUD operations for multiple Repositories

I have seen the unit of work pattern implemented with something like the following code:
private HashSet<object> _newEntities = new HashSet<object>();
private HashSet<object> _updatedEntities = new HashSet<object>();
private HashSet<object> _deletedEntities = new HashSet<object>();
and then there are methods for adding entities to each of these HashSets.
On Commit UnitOfWork creates some Mapper instance for each entity and calls Insert, Update, Delete methods from some imaginary Mapper.
The problem for me with this approach is: the names of Insert, Update, Delete methods are hard-coded, so it seems such a UnitOfWork is capable only of doing simple CRUD operations. But what if I need the following usage:
UnitOfWork uow = new UnitOfWork();
uow.Start();
ARepository arep = new ARepository();
BRepository brep = new BRepository();
arep.DoSomeNonSimpleUpdateHere();
brep.DoSomeNonSimpleDeleteHere();
uow.Commit();
Now the three-HashSet approach fails, because then I could register A and B entities only for Insert, Update, and Delete operations, but I need those custom operations now.
So it seems that I cannot always stack the Repository operations and then perform them all with UnitOfWork.Commit();
How to solve this problem? The first idea is that I could store delegates to methods like
arep.DoSomeNonSimpleUpdateHere();
brep.DoSomeNonSimpleDeleteHere();
in the UoW instance and execute them on uow.Commit(), but then I also have to store all the method parameters. That sounds complicated.
The other idea is to make the repositories completely UoW-aware: in DoSomeNonSimpleUpdateHere I can detect that a UoW is running, so instead of performing the operation I save its parameters and a 'pending' status in some stack inside the Repository instance (obviously I cannot save everything in the UoW, because the UoW shouldn't depend on concrete Repository implementations). Then I register the involved Repository in the UoW instance. When the UoW calls Commit, it opens a transaction and calls something like Flush() for each pending Repository. Now every Repository method needs extra plumbing for UoW detection and for postponing the operation until Commit().
So the short question is - what is the easiest way to register all the pending changes in multiple repositories in UoW and then Commit() them all in a single transaction?
It would seem that even complicated updates can be broken down into a series of modifications to one or more DomainObjects. Calling DoSomeNonSimpleUpdateHere() may modify several different DomainObjects, which would trigger corresponding calls to UnitOfWork.registerDirty(DomainObject) for each object. In the sample code below, I have replaced the call to DoSomeNonSimpleUpdateHere with code that removes inactive users from the system.
UnitOfWork uow = GetSession().GetUnitOfWork();
uow.Start();

UserRepository repository = new UserRepository();
UserList users = repository.GetAllUsers();

// Iterate over a copy so that removing users does not modify the
// collection while it is being enumerated.
foreach (User user in new List<User>(users))
{
    if (!user.IsActive())
        users.Remove(user);
}

uow.Commit();
If you are concerned about having to iterate over all users, here is an alternative approach that uses a Criteria object to limit the number of users pulled from the database.
UnitOfWork uow = GetSession().GetUnitOfWork();
uow.Start();
UserRepository repository = new UserRepository();
Criteria inactiveUsersCriteria = new Criteria();
inactiveUsersCriteria.equal( User.ACTIVATED, 0 );
UserList inactiveUsers = repository.GetMatching( inactiveUsersCriteria );
inactiveUsers.RemoveAll();
uow.Commit();
The UserList.Remove and UserList.RemoveAll methods will notify the UnitOfWork of each removed User. When UnitOfWork.Commit() is called, it will delete each User found in its _deletedEntities. This approach allows you to create arbitrarily complex code without having to write SQL queries for each special case. Using batched updates will be useful here, since the UnitOfWork will have to execute multiple delete statements instead of only one statement for all inactive users.
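For illustration, a hypothetical UserList that notifies the unit of work on removal, along the lines this answer describes (UnitOfWork.RegisterDeleted and the constructor shape are invented names for the sketch, not a concrete library):

public class UserList
{
    private readonly List<User> _items;
    private readonly UnitOfWork _uow;

    public UserList(IEnumerable<User> items, UnitOfWork uow)
    {
        _items = new List<User>(items);
        _uow = uow;
    }

    public void Remove(User user)
    {
        _items.Remove(user);
        _uow.RegisterDeleted(user);   // queued; deleted when the UoW commits
    }

    public void RemoveAll()
    {
        // Copy first so Remove() can mutate _items during the loop.
        foreach (User user in new List<User>(_items))
            Remove(user);
    }
}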
The fact that you have this problem suggests that you aren't using the Repository pattern as such, but something more like multiple table data gateways. Generally, a repository is for loading and saving an aggregate root. As such, when you save an entity, your persistence layer saves all the changes in that aggregate root entity instance's object graph.
If, in your code, you have roughly one "repository" per one table (or Entity), you're probably actually using a table data gateway or a data transfer object. In that case, you probably need to have a means of passing in a reference to the active transaction (or the Unit of Work) in each Save() method.
In Evans DDD book, he recommends leaving transaction control to the client of a repository, and I would agree that it's not a good practice, though it may be harder to avoid if you're actually using a table data gateway pattern.
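As a hedged sketch of that last point, assuming a table-data-gateway-style class and an NHibernate ISession standing in for the "active transaction" being passed around (both names are illustrative, not from the question):

public class AGateway
{
    public void Save(AEntity entity, ISession session)
    {
        // The caller owns the session and its transaction; the gateway only
        // issues the write through whatever it was handed.
        session.SaveOrUpdate(entity);
    }
}

The client then opens the transaction, calls Save() on as many gateways as needed, and commits once at the end.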
I finally found this one:
http://www.goeleven.com/Blog/82
The author solves the problem using three Lists for update/insert/delete, but he does not store entities there. Instead, repository delegates and their parameters are stored, and on Commit the author invokes each registered delegate. With this approach I could register even complex repository methods and so avoid using a separate TableDataGateway.
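Below is a minimal sketch of that delegate-based idea; the class, its members, and the use of TransactionScope are assumptions for illustration, not the blog author's actual code:

using System;
using System.Collections.Generic;
using System.Transactions;

public class DelegateUnitOfWork
{
    // Pending operations are stored as closures, so any repository call
    // and its parameters can be registered, not just Insert/Update/Delete.
    private readonly List<Action> _pending = new List<Action>();

    public void Register(Action operation)
    {
        _pending.Add(operation);
    }

    public void Commit()
    {
        using (var scope = new TransactionScope())
        {
            foreach (Action operation in _pending)
                operation();

            scope.Complete();
        }
        _pending.Clear();
    }
}

// Usage, following the question's example:
// uow.Register(() => arep.DoSomeNonSimpleUpdateHere());
// uow.Register(() => brep.DoSomeNonSimpleDeleteHere());
// uow.Commit();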

Flushing in NHibernate

This question is a bit of a dupe, but I still don't understand the best way to handle flushing.
I am migrating an existing code base, which contains a lot of code like the following:
private void btnSave_Click()
{
    SaveForm();
    ReloadList();
}

private void SaveForm()
{
    var foo = FooRepository.Get(_editingFooId);
    foo.Name = txtName.Text;
    FooRepository.Save(foo);
}

private void ReloadList()
{
    fooRepeater.DataSource = FooRepository.LoadAll();
    fooRepeater.DataBind();
}
Now that I am changing the FooRepository to NHibernate, what should I use for the FooRepository.Save method? Should the FooRepository always flush the session when an entity is saved?
I'm not sure if I understand your question, but here is what I think:
Think in "putting objects to the session" instead of "getting and storing data". NH will store all new and changed objects in the session without any special call to it.
Consider this scenarios:
Data change:
Get data from the database with any query. The entities are now in the NH session
Change entities by just changing property values
Commit the transaction. Changes are flushed and stored to the database.
Create a new object:
Call a constructor to create a new object
Store it to the database by calling "Save". It is in the session now.
You still can change the object after Save
Commit the changes. The latest state will be stored to the database.
If you work with detached entities, you also need Update or SaveOrUpdate to put detached entities to the session.
Of course you can configure NH to behave differently. But it works best if you follow this default behaviour.
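A minimal sketch of both scenarios, assuming an open ISession named session, an active transaction, and a mapped Foo entity (the names reuse the question's model where possible):

using (ITransaction tx = session.BeginTransaction())
{
    // Data change: the loaded entity is tracked by the session,
    // so changing a property is enough.
    var foo = session.Get<Foo>(_editingFooId);
    foo.Name = "new name";

    // New object: Save puts it into the session; it can still be
    // changed before the commit.
    var bar = new Foo { Name = "created now" };
    session.Save(bar);
    bar.Name = "changed after Save";

    tx.Commit(); // all pending changes are flushed and written here
}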
It doesn't matter whether or not you explicitly flush the session between modifying a Foo entity and loading all Foos from the repository. NHibernate is smart enough to auto-flush itself if you have made changes in the session that may affect the results of the query you are trying to run.
Ideally I try to use one session per "unit of work". This means one cohesive piece of work which may involve several smaller steps. If you feel that you do not have a seam in your architecture where you can achieve this, then managing the session inside the repository will also work. Just be aware that you are missing out on some of the power that NHibernate provides you.
I'd vote up Stefan Moser's answer if I could - I'm still getting to grips with NHibernate myself, but I think it's nice to be able to write code like this:
private void SaveForm()
{
    using (var unitofwork = UnitOfWork.Start())
    {
        var foo = FooRepository.Get(_editingFooId);
        var bar = BarRepository.Get(_barId);
        foo.Name = txtName.Text;
        bar.SomeOtherProperty = txtBlah.Text;
        FooRepository.Save(foo);
        BarRepository.Save(bar);
        UnitOfWork.CommitChanges();
    }
}
This way either the whole action succeeds or it fails and rolls back, keeping flushing and transaction management outside of the repositories.