How to determine if nHibernate object changed - nhibernate

Probably a stupid question but I'm still trying to wrap my head around nHibernate.
As far as I can tell from using the software, nHibernate requires you to do a little bit of extra handling for saving changes properly.
Let's imagine I have an object X which can contain many of object Y. I'll create an X which has 2 Y's, each of which have their own properties. I then decide I want to update X. I'm going to add a new Y, and change one of the existing Y's.
So I load in my object X using its ID. I then iterate through the Y's that I'm adding, add them to the X, and save the lot using an update statement.
If you do this, you find the "old" Y's get orphaned in the database. Which, when I think about it, is exactly what I'd expect to happen - I haven't got rid of those objects after all, I've just created some new ones.
So there are two ways to look at this. Either I ought to delete all the Y data and then re-create it, or I ought to be able to flag up to NHibernate that what I'm doing is a change, so that it updates existing objects rather than creating new ones. Trouble is, I'm not sure which is the "right" approach or how best to do it - the former seems tremendously inefficient and the latter means setting a lot of "changed" flags and very fiddly code.
So I'm pretty sure there must be an easier solution that I'm missing in my stupidity. Can someone point me at the best approach and how best to handle it in nHibernate ... that is if the question makes any sense at all :)
Cheers,
Matt

You probably have a mapping or usage problem.
Correctly configured, your usage should be something like this:
using (var session = sessionFactory.OpenSession())
using (var tx = session.BeginTransaction())
{
    var x = session.Get<X>(theId);      // load X (and its Ys) into the session
    x.Ys[0].SomeProperty = theNewValue; // change an existing Y
    x.Ys.Add(theNewY);                  // add a new Y
    tx.Commit();                        // NHibernate flushes all pending changes here
}
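If it turns out to be a mapping problem, the collection from X to its Ys usually needs cascading (and orphan deletion) enabled, so that new Ys are inserted, changed Ys updated, and removed Ys deleted along with their parent. A hedged sketch using Fluent NHibernate - the class and property names are assumptions taken from the question, and the hbm.xml equivalent is cascade="all-delete-orphan" on the collection element:
using FluentNHibernate.Mapping;

// Sketch only: map the X -> Ys collection so changes cascade from the parent.
public class XMap : ClassMap<X>
{
    public XMap()
    {
        Id(x => x.Id).GeneratedBy.Native();
        HasMany(x => x.Ys)
            .Cascade.AllDeleteOrphan()  // save new Ys, update changed ones, delete removed ones
            .Inverse();                 // the Y side owns the foreign key column
    }
}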
You should post more details about the actual classes, mappings and usage.
Also, I suggest that you read the docs in full: http://nhibernate.info/doc/nh/en/index.html. It only takes a few hours and will save you many days of frustration.

Related

RavenDB Consistency - WaitForIndexesAfterSaveChanges() / WaitForNonStaleResultsAsOfNow

I am using RavenDB 3.5.
I know that querying entities is not ACID, but loading by ID is.
Apparently writing to the DB is also ACID.
So far so good.
Now a question:
I've found some code:
session.Advanced.WaitForIndexesAfterSaveChanges();
entity = session.Load<T>(id);
session.Delete(entity);
session.SaveChanges();
// Func<T, T> command
command?.Invoke(entity);
What would be the purpose of calling WaitForIndexesAfterSaveChanges() here?
Is this because of executing a command?
Or is it rather because dependent/consuming queries are supposed to immediately catch up with the changes made?
If that is the case, I could remove WaitForIndexesAfterSaveChanges() from this code block and just add WaitForNonStaleResultsAsOfNow() to the queries, couldn't I?
When would I use WaitForIndexesAfterSaveChanges() in the first place if my critical queries are already flagged with WaitForNonStaleResultsAsOfNow()?
The most likely reason for this behavior is that this particular operation wants to wait for the indexes to catch up before continuing.
A good example why you want to do that is when you create a new item, and the next operation is going to show a list of items. You can use WaitForIndexesAfterSaveChanges to wait, during the save, for the indexes to update.
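To make the difference concrete, here is a hedged sketch against the RavenDB 3.5 client API; the Item class and the store variable are placeholders. The first session pays the waiting cost at write time, the second leaves SaveChanges fast and makes the critical query wait instead:
using (var session = store.OpenSession())
{
    // Option 1: block during SaveChanges until the indexes have caught up
    session.Advanced.WaitForIndexesAfterSaveChanges();
    session.Store(new Item { Name = "new item" });
    session.SaveChanges(); // returns only once the relevant indexes are up to date
}

using (var session = store.OpenSession())
{
    // Option 2: save normally elsewhere, and make the critical query wait for non-stale results
    var items = session.Query<Item>()
        .Customize(x => x.WaitForNonStaleResultsAsOfNow())
        .ToList();
}
Which one you choose mostly comes down to whether the writer or the readers should absorb the indexing latency.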

Can I insert a Document into Lucene without generating a TokenStream?

Is there a way to add a document to the index by supplying terms and term frequencies directly, rather than via Analysis and/or TokenStream? I ask because I want to model some data where I know the term frequencies, but there is no underlying text document to be analyzed. I could create one by repeating the same term many times (I don't care about positions or highlighting in this case, either, just scoring), but that seems a bit perverse (and probably slower than just supplying the counts directly).
(also asked on the mailing list)
At any rate, you don't need to pass everything through an Analyzer in order to create the document. I'm not aware of any way to pass in Terms and Frequencies as you've asked (though I'd be interested to know if you find a good approach to it), but you can certainly pass in IndexableFields one term at a time. That would still require you to add each term multiple times, like:
// A single-term, non-stored field, added once per desired occurrence of the term
IndexableField field = new StringField(fieldName, myTerm, Field.Store.NO);
for (int i = 0; i < frequency; i++) {
    document.add(field);
}
You can also take a step further back and cut the Document class out entirely by using any Iterable<IndexableField> - a simple List, for instance - which might be a more direct way of modelling your data.
Not sure if that gets you any closer to what you are looking for, but perhaps a step vaguely in the right direction.

What does this error mean in NHibernate?

Out of the blue, I am getting this error when doing a number of updates using NHibernate.
Row was updated or deleted by another transaction (or unsaved-value mapping was incorrect): [MyDomainObject]
There is no additional information in the error. Is there a recommended way to help identify the root issue, or can someone give me a better explanation of what this error indicates or is a symptom of?
Some additional info
I looked at the object and all of the data looks fine; it has an ID, etc.
Note this is running in a single call stack from an ASP.NET MVC website, so I wouldn't expect there to be any threading issues to worry about in terms of concurrency.
NHibernate has an object, let's call it theObject. theObject.Id has a value of 42. NHibernate notices that the object is dirty. The object's Id is different from the unsaved-value, which is zero (0) by default for integer primary keys. So NHibernate issues an UPDATE statement, but no rows are affected, which means there is no row in the database for that type of object with an Id of 42. So the object has been deleted without NHibernate knowing about it. This could happen inside another transaction (e.g. you have threading issues), or if someone (or another application) deleted or altered the row using SQL directly against the database.
The other possibility is that your unsaved-value is wrong, e.g. you are using -1 to indicate an unsaved entity, but your mapping has an unsaved-value of zero. This is unlikely, as your application is generally working from the sounds of it. If the unsaved-value were wrong, you wouldn't have been able to save any entities to the database at all, because NHibernate would have been issuing UPDATE statements when it should have been issuing INSERTs.
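For reference, a hedged sketch of making the unsaved-value explicit, here using Fluent NHibernate (the hbm.xml equivalent is the unsaved-value attribute on the id element); MyDomainObjectMap and the zero convention are illustrative only:
using FluentNHibernate.Mapping;

// Sketch: declare which Id value marks an entity as "not yet saved", so that
// NHibernate issues an INSERT for new objects and an UPDATE for persistent ones.
public class MyDomainObjectMap : ClassMap<MyDomainObject>
{
    public MyDomainObjectMap()
    {
        Id(x => x.Id)
            .GeneratedBy.Native()
            .UnsavedValue(0);   // Id == 0 means "new"; use -1 if that is your convention
    }
}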
It means that you have multiple transactions accessing the same data, thus producing concurrency issues. You should improve your data-access handling: you are probably updating data from multiple threads, so funnel the changed data through a queue first and let that queue handle all access to the DB.
An old post, but hopefully my info will help someone. I was getting a similar error but only when persisting associations, after I had added in a new object. The error was of the form:
NHibernate.StaleObjectStateException: Row was updated or deleted by another transaction (or unsaved-value mapping was incorrect) [My.Entity#0]
Note the zero on the end, which is my identifier property. It should not be trying to save with a key of zero, as I was using identity specification in SQL Server (generator class=native). I had not changed my unsaved-value in my xml, so I had no idea what the problem was; for some reason NHibernate was trying to do an update with a key value of 0, instead of a save (which would get the next identity key), for my new object.
In the end I found the cause was that I was initialising the Version number to 1 for the new object in my constructor! Even though my identifier property was zero, NHibernate was also looking for a version property of zero to identify it as an unsaved transient instance. The book "NHibernate in Action" does actually mention this on page 120, but for some reason my objects were fine when persisting normally with a version number of 1, and only failed when saving a new object through an association.
So make sure you do not set your Version value yourself (leave it as zero or null).
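In other words, leave both the identifier and the version at their defaults for a brand-new object. A minimal sketch (class and property names are illustrative):
// Sketch: a versioned entity whose constructor does NOT touch Id or Version.
public class MyEntity
{
    public virtual int Id { get; protected set; }       // generated by the database
    public virtual int Version { get; protected set; }  // managed entirely by NHibernate

    public MyEntity()
    {
        // Version = 1;  <-- setting this here was what caused the StaleObjectStateException
    }
}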
You say that your data is OK, but check whether, for example, you are mapping the ID as self-generated. I had the exact same problem, but I was sending an object with an ID different from 0.
Hope it helps!
My problem was this:
[Bind(Include="Name")] EventType eventType
Should have been:
[Bind(Include="EventTypeId,Name")] EventType eventType
Just as the other answers suggest, NHibernate was using zero as the id for my entity.
If you have a trigger on the table, it can be the reason. In this case, add the following inside it:
SET ROWCOUNT 0;
SET NOCOUNT ON;
This error happened to me in the following way:
List<Device> allDevices = new List<Device>();

// Add devices to the list
allDevices.Add(aDevice);
// Save allDevices to the database -> works fine
// allDevices.Clear(); // should be called here

// Later we add more devices
allDevices.Add(anotherDevice);
// Save allDevices to the database -> we get the error, because the
// previously saved devices are sent again with stale state

// Solution: clear the list before reusing it for a new transaction
allDevices.Clear();

Fastest way to query for object existence in NHibernate

I am looking for the fastest way to check for the existence of an object.
The scenario is pretty simple: assume a directory tool which reads the current hard drive. When a directory is found, it should either be created or, if already present, updated.
First, let's focus only on the creation part:
public static DatabaseDirectory Get(DirectoryInfo dI)
{
    var result = DatabaseController.Session
        .CreateCriteria(typeof(DatabaseDirectory))
        .Add(Restrictions.Eq("FullName", dI.FullName))
        .List<DatabaseDirectory>()
        .FirstOrDefault();

    if (result == null)
    {
        result = new DatabaseDirectory
        {
            CreationTime = dI.CreationTime,
            Existing = dI.Exists,
            Extension = dI.Extension,
            FullName = dI.FullName,
            LastAccessTime = dI.LastAccessTime,
            LastWriteTime = dI.LastWriteTime,
            Name = dI.Name
        };
    }

    return result;
}
Is this the way to go regarding:
Speed
Separation of Concern
What comes to mind is the following: a scan will always be performed "as a whole". Meaning, during a scan of drive C, I know that nothing new gets added to the database (from some other process). So it MAY be a good idea to "cache" all existing directories prior to the scan and look them up that way. On the other hand, this may not be suitable for large sets of data, like files (which will be 600,000 or more)...
Perhaps some performance gain can be achieved using "index columns" or something like this, but I am not so familiar with this topic. If anybody has some references, just point me in the right direction...
Thanks,
Chris
PS: I am using NHibernate, Fluent Interface, Automapping and SQL Express (could switch to full SQL)
Note:
In the given problem, the path is not the ID in the database. The ID is an auto-increment, and I can't change this requirement (for other reasons). So the real question is: what is the fastest way to check for the existence of an object where the ID is not known, just a property of that object?
And batching might be possible, by selecting a big group with something like "starts with C:Testfiles\", but the problem then remains: how do I know in advance how big this set will be? I can't select "max 1000" and check against this buffered dictionary, because I might "hit next to the searched dir"... I hope this problem is clear. The most important part is: does buffering really affect performance this much? If so, does it make sense to load the whole DB into a dictionary containing only PATH and ID (which would be OK even if there are 1,000,000 objects, I think)?
First off, I highly recommend that you (anyone using NH, really) read Ayende's article about the differences between Get, Load, and query.
In your case, since you need to check for existence, I would use .Get(id) instead of a query for selecting a single object.
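When the primary key is known, that existence check is a hedged two-liner, since Get returns null when no matching row exists:
// Sketch: existence check by primary key ("id" is assumed to be known here).
var existing = session.Get<DatabaseDirectory>(id);
bool exists = existing != null;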
However, I wonder if you might improve performance by utilizing some knowledge of your problem domain. If you're going to scan the whole drive and check each directory for existence in the database, you might get better performance by doing bulk operations. Perhaps create a DTO object that only contains the PK of your DatabaseDirectory object to further minimize data transfer/processing. Something like:
Dictionary<string, DirectoryInfo> directories;

session.CreateQuery("select new DatabaseDirectoryDTO(dd.FullName) from DatabaseDirectory dd where dd.FullName in (:ids)")
    .SetParameterList("ids", directories.Keys)
    .List();
Then just remove those elements that match the returned ID values to get the directories that don't exist. You might have to break the process into smaller batches depending on how large your input set is (for the files, almost certainly).
As far as separation of concerns, just keep the operation at a repository level. Have a method like SyncDirectories that takes a collection (maybe a Dictionary if you follow something like the above) that handles the process for updating the database. That way your higher application logic doesn't have to worry about how it all works and won't be affected should you find an even faster way to do it in the future.
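A hedged sketch of one piece of such a SyncDirectories-style method - the existence check - where the batch size is arbitrary and just the FullName strings are selected, rather than a DTO, to keep the example self-contained:
using System.Collections.Generic;
using System.IO;
using System.Linq;
using NHibernate;

public class DirectoryRepository
{
    // Sketch only: removes every directory that already exists in the database
    // from the dictionary, leaving behind the ones that still need to be created.
    public void RemoveExisting(ISession session, IDictionary<string, DirectoryInfo> directories)
    {
        const int batchSize = 500; // illustrative; tune against your real data

        var batches = directories.Keys
            .Select((key, index) => new { key, index })
            .GroupBy(x => x.index / batchSize, x => x.key)
            .Select(g => g.ToList())
            .ToList(); // materialize before we start removing from the dictionary

        foreach (var batch in batches)
        {
            var existing = session
                .CreateQuery("select dd.FullName from DatabaseDirectory dd where dd.FullName in (:ids)")
                .SetParameterList("ids", batch)
                .List<string>();

            foreach (var fullName in existing)
                directories.Remove(fullName);
        }
    }
}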

Batch Update in NHibernate

Does batch update command exist in NHibernate? As far as I am aware it doesn't. So what's the best way to handle this situation? I would like to do the following:
Fetch a list of objects ( let's call them a list of users, List<User> ) from the database
Change the properties of those objects ( Users.Foreach(User=>User.Country="Antartica") )
Update each item back individually ( Users.Foreach(User=>NHibernate.Session.Update(User)) ).
Call Session.Flush to update the database.
Is this a good approach? Will this result in a lot of round trips between my code and the database?
What do you think? Or is there a more elegant solution?
I know I'm late to the party on this, but thought you may like to know this is now possible using HQL in NHibernate 2.1+
session.CreateQuery("update User set Country = 'Antarctica'")
    .ExecuteUpdate();
Starting with NHibernate 3.2, batch jobs have improvements which minimize database round trips. More information can be found on the HunabKu blog.
Here is an example from it - these batched operations make only 6 round trips:
using (ISession s = OpenSession())
using (s.BeginTransaction())
{
    for (int i = 0; i < 12; i++)
    {
        var user = new User { UserName = "user-" + i };
        var group = new Group { Name = "group-" + i };
        s.Save(user);
        s.Save(group);
        user.AddMembership(group);
    }
    s.Transaction.Commit();
}
You can set the batch size for updates in the NHibernate config file.
<property name="hibernate.adonet.batch_size">16</property>
And you don't need to call Session.Update(User) there - just flush or commit a transaction and NHibernate will handle things for you.
EDIT: I was going to post a link to the relevant section of the nhibernate docs but the site is down - here's an old post from Ayende on the subject:
As to whether the use of NHibernate (or any ORM) here is a good approach, it depends on the context. If you are doing a one-off update of every row in a large table with a single value - like setting all users to the country 'Antarctica' (which is a continent, not a country, by the way!) - then you should probably use a SQL UPDATE statement. If you are going to be updating several records at once with a country as part of your business logic in the general usage of your application, then using an ORM could be a more sensible approach. It depends on the number of rows you are updating each time.
Perhaps the most sensible option, if you are not sure, is to tweak the batch_size option in NHibernate and see how that works out. If the performance of the system is still not acceptable, you might look at implementing a straight SQL UPDATE statement in your code, as sketched below.
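If you do go the plain SQL route, here is a hedged sketch of issuing it through the same NHibernate session (the table and column names are guesses based on the question):
// Sketch: one set-based statement instead of loading and updating row by row.
using (var tx = session.BeginTransaction())
{
    session.CreateSQLQuery("UPDATE Users SET Country = :country")
        .SetParameter("country", "Antarctica")
        .ExecuteUpdate();
    tx.Commit();
}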
Starting with NHibernate 5.0, it is possible to perform bulk operations using LINQ.
session.Query<Cat>()
    .Where(c => c.BodyWeight > 20)
    .Update(c => new { BodyWeight = c.BodyWeight / 2 });
NHibernate will generate a single SQL "update" query.
See Updating entities
You don't need to call Update, nor Flush:
IList<User> users = session.CreateQuery (...).List<User>();
foreach (var user in users)
    user.Country = "Antartica";
session.Transaction.Commit();
I think NHibernate writes a batch for all the changes.
The problem is that your users need to be loaded into memory. If that becomes a problem, you can still use native SQL through NHibernate. But until you have proven that it is a performance problem, stick with the nice solution.
No it's not a good approach!
Native SQL is many times better for this sort of update.
UPDATE USERS SET COUNTRY = 'Antartica';
It just could not be simpler, and the database engine will process this one hundred times more efficiently than row-at-a-time code.