Transactions and delete using Fluent NHibernate

I am starting to play with (Fluent) NHibernate and I am wondering if someone can help with the following. I'm sure it's a total noob question.
I want to do:
delete from TABX where name = 'abc'
where table TABX is defined as:
ID int
name varchar(32)
...
I build the code based on internet samples:
using (ITransaction transaction = session.BeginTransaction())
{
    IQuery query = session.CreateQuery("FROM TABX WHERE name = :uid")
        .SetString("uid", "abc");
    session.Delete(query.List<Person>()[0]);
    transaction.Commit();
}
but alas, it's generating two queries (one SELECT and one DELETE). I want to do this in a single statement, as in my original SQL. What is the correct way of doing this?
Also, I noticed that in most samples on the internet, people tend to always wrap all queries in transactions. Why is that? If I'm only running a single statement, that seems like overkill. Do people tend to just mindlessly cut and paste, or is there a reason beyond that? For example, in my query above, if I do manage to get it from two queries down to one, I should be able to remove the begin/commit transaction, no?
If it matters, I'm using PostgreSQL for experimenting.

You can do a delete in one step with the following code:
session.CreateQuery("DELETE TABX WHERE name = :uid")
.SetString("uid", "abc")
.ExecuteUpdate();
However, by doing it that way you bypass event listener calls (it is just mapped to a single SQL statement), cache updates, etc.

Your first query comes from query.List<Person>().
Your actual delete statement comes from session.Delete(...).
Usually, when you are dealing with only one object, you will use Load() or Get().
Session.Load(type, id) will create the object for you without looking it up in the database. However, as soon as you access one of the object's properties, it will hydrate the object.
Session.Get(type, id) will actually look up the data for you.
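A minimal sketch of the difference (assuming a mapped Person entity with an int identifier):
// Load: returns a proxy immediately without hitting the database; the SELECT
// only happens when a non-identifier property is accessed (and it throws if the row is missing).
Person lazy = session.Load<Person>(42);

// Get: queries the database right away and returns null if no row exists.
Person eager = session.Get<Person>(42);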
As far as transactions go, this is a good article explaining why it is good to wrap all of your NHibernate queries in transactions:
http://nhprof.com/Learn/Alerts/DoNotUseImplicitTransactions

In NHibernate, I've noticed it is most common to do a delete with two queries, as you see here. I believe this is expected behavior. The only way around it off the top of my head is to use caching; then the first query could be served from the cache if it happened to have been run earlier.
As far as wrapping everything in a transaction: in most databases, transactions are implicit for every query anyway. The explicit transactions are just a guarantee that the data won't be changed out from under you mid-operation.
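For completeness, a sketch combining the two answers above: the single-statement HQL delete, wrapped in an explicit transaction (TABX is assumed to be a mapped entity name):
using (ITransaction transaction = session.BeginTransaction())
{
    session.CreateQuery("DELETE TABX WHERE name = :uid")
        .SetString("uid", "abc")
        .ExecuteUpdate();
    transaction.Commit();
}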

Related

NHibernate QueryCache in Multiuser-Environment

For our web application (ASP.NET) we're using Fluent NHibernate (2.1.2) with second-level caching not only for entities, but also for queries (we generate queries with the criteria API). We're using the session-per-request pattern and one SessionFactory application-wide, so the cache serves all NHibernate sessions.
Problem:
We have to deal with different access rights per user on the data objects in our legacy database (Oracle) - that is, views constrain the returned data according to each user's rights.
So there's the situation where, for example, the same view is queried by our criteria with the exact same query, but returns a different result set depending on the user's rights.
Now, to gain performance, the mentioned query is cached. But this gives us the problem that when the query is first fired from an action of user A, it caches the resulting IDs, which are the IDs to which user A has access rights. Shortly after, the same query is fired from an action of user B, and NHibernate then picks the cached IDs from the first call (from user A) and tries to get the corresponding entities, to which user B doesn't have access rights (or maybe not for all of them). We check the rights with event listeners, so our application throws an access-rights exception in the mentioned case.
Thoughts:
Not caching the queries could be an option against this. But performance is clearly an issue in our application, so it would be really desirable to have cached queries per user.
We even thought about a SessionFactory per user, to have a cache per user, sort of. But this clearly has an impact on resources, is somewhat of an overkill, and honestly isn't an option, because there are entities which have to be accessed and manipulated by multiple users (think of a user group), creating issues with stale data in the "individual caches" and so on. So that's a no-go.
What would be a valid solution for this? Is there something like a "best practice" for such a situation?
Idea:
As I was stuck with this yesterday, seeing no way out, I slept on it, and today I came up with some sort of a "hack".
As NHibernate caches the query by query text and parameters ("clauses"), I thought about a way to "smuggle" something user-dependent into the signature of the queries, so it would cache every query per user, but would not alter the query itself (concerning the result of the query).
So "creativity" guided me to this (example code):
string userName = GetCurrentUser();
ICriteria criteria = session.CreateCriteria(typeof(EntityType))
    .SetCacheable(true)
    .SetCacheMode(CacheMode.Normal)
    .Add(Expression.Eq("PropertyA", 1))
    .Add(Expression.IsNotNull("PropertyB"))
    .Add(Expression.Sql(string.Format("'{0}' = '{0}'", userName)));
return criteria.List();
This line:
.Add(Expression.Sql(string.Format("'{0}' = '{0}'", userName)))
results in a where clause which always evaluates to true, but "changes" the query from NHibernate's viewpoint, so it is cached separately per userName.
I know it's kind of ugly and I'm not really pleased with it.
Does anybody know of an alternative approach?
Thanks in advance.

Delete with QueryOver?

I would like to improve my code when deleting a group of objects in NHibernate (v3).
Currently, I iterate over a retrieved collection and call Delete on each object. This generates n+1 SQL statements.
I notice that the NHibernate ISession provides this method: Delete(string query).
By using this method I think I can do the same thing with a single SQL statement.
Do you know if there is a way to combine this method with the QueryOver API to avoid HQL?
As far as I know, the only way to do single-shot deletes and updates is using HQL. As a compromise, you might want to take a look at this workaround.
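A minimal sketch of such a single-shot HQL delete (Person and its Name property are placeholder names):
session.CreateQuery("delete Person where Name = :name")
    .SetString("name", "abc")
    .ExecuteUpdate();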

NHibernate handling multiple result sets from a stored procedure call

I'm using a stored procedure to handle search on my site; it includes full-text searching, relevance and paging. I also wanted it to return the total number of results that would have been returned had paging not been applied. So I've now got my SP returning two SELECT statements: the search itself and just SELECT @totalResults.
Is there any way I can get NHibernate to handle this? I'm currently accessing the ISession's connection, creating a command and executing the SP myself, and mapping the results. This isn't ideal, so I'm hoping I can get NH to handle this for me.
Or if anyone has any other better ways of creating complicated searches etc with NH, I'd really like to hear it.
No, NHibernate only uses the first result set returned by the stored procedure and ignores any others.
You will need to use an alternative method, like ADO.NET.
Or, you can incur processing overhead by having two stored procedures, one for each result set. Gross.
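A sketch of the ADO.NET fallback the question already describes, with a hypothetical search_site procedure whose second result set carries the total count:
using (var cmd = session.Connection.CreateCommand())
{
    cmd.CommandText = "search_site";            // hypothetical proc name
    cmd.CommandType = CommandType.StoredProcedure;
    // Enlist the command in NHibernate's current transaction (assumes one is open).
    session.Transaction.Enlist(cmd);

    using (var reader = cmd.ExecuteReader())
    {
        while (reader.Read())
        {
            // map the search-result rows by hand here
        }

        reader.NextResult();                    // advance to the second result set
        reader.Read();
        int totalResults = reader.GetInt32(0);  // the SELECT @totalResults value
    }
}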

Temporal data using NHibernate

Can anyone supply some tips/pointers/links on how to implement a temporal state table with NHibernate? I.e., each entity table has start_date and end_date columns that describe the time interval in which the row is considered valid.
When a user inserts a new entity, the start_date receives 'now' and the end_date will be null (or a date far into the future, I haven't decided yet).
When updating, I'd like to change the UPDATE query into the following two steps:
1. UPDATE the end_date of this entity's current row, and
2. INSERT a new row with the current date/time and a null end_date.
I tried using event listeners to manually write an UPDATE query for 1, but can't seem to figure out how to implement a new INSERT query for 2.
Is this the proper way to go? Or am I completely off-mark here?
Actually, we have a working solution where I work, but it effectively kills part of NHibernate's mechanism.
For 'temporal entities' NH acts only as an insert/select engine. Deletes and updates are done by a different utility, where the ORM part of NH comes in handy.
If you only have a handful of temporal entities you may get by with NHibernate alone, but be prepared to write your own code to ensure state relations are valid.
We went that route on our first try, and once the number of temporal entities started adding up, the mechanism was effectively broken.
Now, inserts don't need any special tooling: just place the values in the appropriate datetime properties and you're set. We implement selects with Filters (definitely check section 16.1 of the NH reference, as it has an example, but the condition must not use a BETWEEN), although if you go that way you will have to modify the NH source code to apply filters on all kinds of selects.
Check my post at http://savale.blogspot.com/2010/01/enabling-filters-on-mapped-entities.html for how to do that.
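A hedged sketch of what the filter approach looks like at query time (the filter name and the date condition are assumptions; the filter-def and class mapping are set up as in section 16.1 of the NH reference):
// Enable the validity filter for "now" on the current session.
session.EnableFilter("validOn")
    .SetParameter("asOfDate", DateTime.Now);

// Mapped selects on the filtered entity now get an extra
// "start_date <= :asOfDate and end_date > :asOfDate" style condition appended.
var currentRows = session.CreateCriteria<MyTemporalEntity>().List<MyTemporalEntity>();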
It might also work if you specify the "where" clause on the mapping (instead of filters), but I haven't tried or tested it yet, and it is my understanding that the mapped "where" does not support parameters (at least not officially).
As a side note, the reason for using a custom tool for updates/deletes will become clear once you have read Richard Snodgrass's books on temporal databases: http://www.cs.arizona.edu/~rts/publications.html
To directly answer your question, both the NULL _end value and a value far in the future will work (but prefer the NOT-NULL solution; it will make your queries easier, as you will not have to check with ISNULL).
For updates you effectively make a clone of the original entity, set the original entity's _end to now, then go to the clone and change the relevant properties, setting _start to now and _end to the far-in-the-future value.
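A minimal sketch of that clone-and-close update (MyTemporalEntity, its Start/End/Name properties and the id/newName variables are all assumptions):
using (var tx = session.BeginTransaction())
{
    // Close the currently valid row.
    var current = session.Get<MyTemporalEntity>(id);
    current.End = DateTime.Now;

    // Insert a clone carrying the new state, valid from now on.
    var next = new MyTemporalEntity
    {
        Start = DateTime.Now,
        End = DateTime.MaxValue,    // the far-in-the-future "still valid" marker
        Name = newName              // ...plus whatever else actually changed
    };
    session.Save(next);

    tx.Commit();
}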
I suggest the excellent timeNarrative article from Martin Fowler.
I think the best approach is to have something like a Java map (sorry, I'm a Java programmer) and let NHibernate map that. The map would map something like a Period, with "start" and "end" fields, to a value. You can then write a UserType to map the Period to two different database columns.
While this article is a few years old, the pattern still seems valid. I have not yet used this but I need a similar solution in a current project and will be modeling it the way it is described here.

LINQ to SQL updates

Does anyone have an idea how to run the following statement with LINQ?
UPDATE FileEntity SET DateDeleted = GETDATE() WHERE ID IN (1,2,3)
I've come to both love and hate LINQ, but so far there's been little that hasn't worked out well. The obvious solution which I want to avoid is to enumerate all file entities and set them manually.
foreach (var file in db.FileEntities.Where(x => ids.Contains(x.ID)))
{
    file.DateDeleted = DateTime.Now;
}
db.SubmitChanges();
The problem with the above code, apart from the sizable overhead, is that each entity has a Data field which can be rather large, so for a large update a lot of data travels across the database connection for no particular reason. (The LINQ solution is to delay-load the Data property, but this wouldn't be necessary if there were some way to just update the field with LINQ to SQL.)
I'm thinking some query expression provider thing that would result in the above T-SQL...
LINQ cannot perform in-store updates - it is language integrated query, not update. Most (maybe even all) OR mappers will generate a SELECT statement to fetch the data, modify it in memory, and perform the update using a separate UPDATE statement. Smart OR mappers will only fetch the primary key until additional data is required, but then they will usually fetch the whole rest, because it would be much too expensive to fetch only a single attribute at a time.
If you really care about this optimization, use a stored procedure or a hand-written SQL statement. If you want more compact code, you can use the following.
db.FileEntities
    .Where(x => ids.Contains(x.ID))
    .AsEnumerable()   // switch to LINQ to Objects so the statement lambda compiles
    .Select(x => { x.DateDeleted = DateTime.Now; return x; })
    .ToList();        // force enumeration so the assignments actually run
db.SubmitChanges();
I don't like this because I find it less readable, but some prefer such a solution.
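The hand-written SQL route mentioned above could look like the sketch below; DataContext.ExecuteCommand turns the {0}-style placeholders into SQL parameters, so for a variable-length ID list you would have to build the statement dynamically:
db.ExecuteCommand(
    "UPDATE FileEntity SET DateDeleted = GETDATE() WHERE ID IN ({0}, {1}, {2})",
    1, 2, 3);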
LINQ to SQL is an ORM like any other, and as such, it was not designed to handle bulk updates/inserts/deletes. The general idea with L2S, EF, NHibernate, LLBLGen and the rest is to handle the mapping of relational data to your object graphs for you, eliminating the need to manage a large library of stored procs, which ultimately limit your flexibility and adaptability.
When it comes to bulk updates, those are best left to the thing that does them best: the database server. L2S and EF both provide the ability to map stored procedures to your model, which allows your stored procs to be somewhat entity-oriented. Since you're using L2S, just write a proc that takes the set of identities as input and executes the SQL statement at the beginning of your question. Drag that stored proc onto your L2S model, and call it.
It's the best solution for the problem at hand, which is a bulk update. As with reporting, object graphs and object-relational mapping are not the best solution for bulk processes.
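A sketch of what that call could look like once the proc is mapped (the procedure name, its parameter shape and the ID-splitting inside it are all assumptions):
// Assuming a MarkFilesDeleted proc that accepts a comma-separated ID list and
// runs the UPDATE ... WHERE ID IN (...) itself; after dragging it onto the
// L2S designer it is exposed as a method on the DataContext.
db.MarkFilesDeleted("1,2,3");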