I am using a where clause on a bag to filter its contents. Is there any way to make NHibernate insert these filter values into the database?
IMHO, it is not worth trying to share a table between many-to-many relations. You can't access the boolean flags in this table, so you can't change them from NH except with native SQL, which is not how NHibernate should be used, and it will be very complicated to make changes to these bags. The table is completely invisible from the object-oriented point of view and completely managed by NH.
Just map it to different tables and you'll be happy. Believe me.
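For illustration, a rough sketch of that separate-table approach with Fluent NHibernate might look like this (the entity and table names are made up, since the original model isn't shown):

    using System.Collections.Generic;
    using FluentNHibernate.Mapping;

    // Hypothetical model: instead of one shared link table with a boolean
    // flag column (which NHibernate cannot see), each relation gets its
    // own link table and its own collection on the entity.
    public class Item
    {
        public virtual int Id { get; set; }
    }

    public class Order
    {
        public virtual int Id { get; set; }
        public virtual IList<Item> ActiveItems { get; set; }
        public virtual IList<Item> ArchivedItems { get; set; }
    }

    public class OrderMap : ClassMap<Order>
    {
        public OrderMap()
        {
            Id(x => x.Id);
            // Two independent many-to-many relations, each with its own table.
            // (Item would need its own ClassMap as well.)
            HasManyToMany(x => x.ActiveItems).Table("OrderActiveItem");
            HasManyToMany(x => x.ArchivedItems).Table("OrderArchivedItem");
        }
    }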
I'm using NHibernate and Fluent NHibernate.
I fell into the trap I think a lot of new users fall for, and ended up with all my varchar columns at 255 chars. For political reasons far too boring to go into, there was immediately data in these fields that I'm not supposed to delete (boo), so I need to update the column lengths without dropping and re-creating tables.
However, if I apply a convention for string length to the Fluent configuration and use the NHibernate UpdateSchema method, only new tables seem to get the new varchar length. Is this correct, and is there a way to apply this to the existing tables?
You don't necessarily need NHibernate for that.
You didn't mention what the underlying DB instance is, but I'm sure it has options for altering column properties. I think that's the simplest solution.
UpdateSchema only applies non-destructive updates. It is not meant as a migration utility, but for rapid changes to models/database tables during development. For more information, see my answer to NHibernate, ORM : how is refactoring handled? existing data?.
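If it helps, a string-length convention in Fluent NHibernate can be sketched roughly like this (the length is an arbitrary example, and the existing columns would still have to be widened by hand or with a migration tool, since UpdateSchema won't touch them):

    using FluentNHibernate.Conventions;
    using FluentNHibernate.Conventions.Instances;

    // Gives every mapped string property a default column length.
    public class LongStringConvention : IPropertyConvention
    {
        public void Apply(IPropertyInstance instance)
        {
            if (instance.Property.PropertyType == typeof(string))
                instance.Length(2000); // arbitrary example length
        }
    }

The convention is then registered through the Fluent configuration, e.g. .Conventions.Add<LongStringConvention>() on the mappings setup.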
I'm a relative newbie at NHibernate, so I'll beg forgiveness in advance if this is a stupid question. I've googled it and searched the documentation, and am getting all wrapped around the axle.
I'm maintaining/enhancing an existing application that uses NHibernate for a relatively straightforward table. The table has about 10-12 fields, and no foreign key relations. The table contains somewhere around a dozen or so rows, give or take.
Two of the fields are huge blobs (multi-megabytes). As a result, the table is taking an excessive amount of time (4 minutes) to load when working with a remote DB.
The thing is that those two fields are not needed until a user selects one of the rows and begins to work on it, and then they are only needed for the one row that he selects.
This seems like exactly what lazy loading was meant for. I just can't quite figure out how to apply it unless I break up the existing DB schema and put those columns in their own table with one-to-one mapping, which I don't want to.
If it matters, the program is using NHibernate.Mapping.Attributes rather than hbm files, so I need to be able to make alterations in the attributes of the domain objects that will propagate to the hbm.
Thanks for any help.
You need lazy properties, introduced in NHibernate 3, to accomplish this. I assume, but don't know, that you can set that using attributes.
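As a rough sketch, and assuming NHibernate.Mapping.Attributes exposes the hbm lazy flag on [Property] (I haven't verified the exact attribute member), the mapping might look something like this; the class and property names are invented:

    using NHibernate.Mapping.Attributes;

    [Class]
    public class Document
    {
        [Id(Name = "Id")] // generator omitted for brevity
        public virtual int Id { get; set; }

        [Property]
        public virtual string Title { get; set; }

        // The two huge blobs: the equivalent of lazy="true" on the
        // <property> element (NHibernate 3 lazy properties). The
        // properties must be virtual so the proxy can intercept them.
        [Property(Lazy = true)]
        public virtual byte[] BlobOne { get; set; }

        [Property(Lazy = true)]
        public virtual byte[] BlobTwo { get; set; }
    }

With that in place, loading the full list of rows would skip the blob columns, and they would only be fetched when one of the blob properties is actually read.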
Hey all, quick NHibernate question.
In my current project, we have a denormalized table that, for a given unique header record, will have one or more denormalized rows.
When the user is accessing a POCO representing the header and performs an update, I need this change to cascade down to all of the denormalized rows. For example, if the user changes field 'A' in the normalized header, I need all denormalized rows to now reflect the new value for field 'A'.
My current thought is to just do a foreach in the normalized header's property setters, since I already have an IList representing the denormalized rows, but I was hoping for a more elegant solution that does not involve writing a foreach loop for each normalized field that needs to propagate down to the denormalized table.
FYI, in the pure sproc world we'd just issue a second update command in a save sproc with an appropriate where clause, but we're also trying to move away from the sproc dependencies and perform most operations in C#.
TIA
Thanks all for the answers above. I looked into the event listener as suggested, and it seemed a bit too heavy for what we were trying to accomplish.
Since we're using a repository pattern and the intent is to embed as much of this kind of behavior in the model, we ultimately went with embedding the cascading updates in the setters of the header object's properties. Since these kinds of cascades can be tough to test, this lets us verify everything in the model among the POCOs without ever having to rely on a SQL trigger or NHibernate.
In short, when a header is updated in its setter, I do a quick foreach over the list of detail objects, also update any other denormalized POCOs in the object tree, and then drop this into the database with a simple SaveOrUpdate with NHibernate.
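A minimal sketch of that setter-based cascade, with invented class and property names:

    using System.Collections.Generic;

    public class Header
    {
        private string fieldA;

        public virtual IList<DetailRow> Details { get; set; } = new List<DetailRow>();

        public virtual string FieldA
        {
            get { return fieldA; }
            set
            {
                fieldA = value;
                // Push the new value down to every denormalized row so a
                // single SaveOrUpdate(header) persists a consistent graph.
                foreach (var detail in Details)
                    detail.FieldA = value;
            }
        }
    }

    public class DetailRow
    {
        public virtual string FieldA { get; set; }
    }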
-Bob
I'm struggling to choose between Linq to SQL and NHibernate.
Let me give you some insight in the application in point form:
this is an ASP.NET MVC application
it will have lots of tables, maybe 50-60 (SQL Server 2008)
I would want all the basic CRUD logic done for me (which I think NHibernate + the repository pattern can give me)
I don't have too complicated mappings; my tables will look something like:
User(userID, username)
UserProfile(userID, ...)
Content(contentID, title, body, date)
Content_User(contentID, userID)
So in general, I will have a PK table, then lots of other tables that reference that PK (i.e. FK tables).
I will also have lots of mapping tables, that will contain PK, FK pairs.
Entity-wise, I want User.cs, UserProfile.cs and then a way to load each object.
I am not looking for a User class that has a UserProfile property and a Content collection property (there will be maybe 10-20 tables that relate to the user; I just like to keep things linear, if that makes sense).
The one thing that makes me lean towards NHibernate is: cross-DB potential, and the repository pattern that will give me basic DB operations across all my main tables almost instantly!
Since you seem to have a quite straightforward mapping from class to table in mind, Linq to SQL should do the trick without any difficulties. That would let you get started very quickly, without the initial work of manually mapping the domain to the database.
An alternative could be using NHibernate + Fluent NHibernate and its AutoMapping feature, but keep in mind that the Fluent NHibernate AutoMapping is still quite young.
I'm not quite sure I understand what you want your entities to look like, but with Linq to SQL you will get a big generated mess, which you could then extend by using partial classes. NHibernate lets you design your classes however you want and doesn't generate anything for you out of the box. You could kind of use POCO classes with Linq to SQL, but that would take away all the benefits of using Linq to SQL rather than NHibernate.
Concerning the repository pattern, and the use of a generic repository, that can be done quite nicely with Linq to SQL as well, and not only with NHibernate. In my opinion that is one of the nice things about Linq to SQL.
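As an illustration, a generic repository over a Linq to SQL DataContext can be as small as this sketch (the entity types are whatever your dbml generates):

    using System.Data.Linq;
    using System.Linq;

    // Minimal generic repository over a Linq to SQL DataContext.
    public class Repository<T> where T : class
    {
        private readonly DataContext context;

        public Repository(DataContext context)
        {
            this.context = context;
        }

        public IQueryable<T> GetAll()
        {
            return context.GetTable<T>();
        }

        public void Add(T entity)
        {
            context.GetTable<T>().InsertOnSubmit(entity);
        }

        public void Delete(T entity)
        {
            context.GetTable<T>().DeleteOnSubmit(entity);
        }

        public void Save()
        {
            context.SubmitChanges();
        }
    }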
If you probably will need support for other databases than SQL Server, NHibernate is the only choice. However, if it probably won't be an issue I would recommend not using that as the primary factor when choosing. The other factors will probably influence your project more.
Conclusion:
All in all, I would recommend Linq to SQL in this case, since it would let you get started quickly and is sufficient for your needs. The precondition for that is that you don't have a problem with the thought of having generated, messy code in your domain, and that you are quite sure there will not be any need to support other databases in the future. Otherwise I would recommend NHibernate, since it is truly an awesome ORM.
Linq to SQL really wants you to work with a one-table-per-class mapping. So if you have a UserMaster and a UserDetail table, you are looking at two objects when using the default Linq object generation. You can get around that by mapping Linq entities to business entities (see Rob Conery's Storefront screencasts), but then you are back to writing object-mapping code or using something like AutoMapper.
If you want to be able to split your classes across multiple tables, then I'd say go with NHibernate. If not, then Linq to SQL has a lower learning curve.
The only way I'd ever use NHibernate is through the Castle Project's ActiveRecord library. Otherwise, NHibernate becomes its own little infrastructure project. Check out some questions in the nhibernate tag to see what I'm talking about.
The only thing I might change about AR is to return results of SELECT operations as List instead of T[]. Of course, with the source code in C# I can do that if I want.
With ActiveRecord, the mapping information is saved in attributes you decorate your classes with. It's genius and I am a huge proponent of the pattern and this particular library.
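A small illustration of the ActiveRecord style (the class and table names are just examples):

    using Castle.ActiveRecord;

    [ActiveRecord("Users")]
    public class User : ActiveRecordBase<User>
    {
        [PrimaryKey(PrimaryKeyType.Native)]
        public int Id { get; set; }

        [Property]
        public string Username { get; set; }
    }

    // Usage: the base class supplies the CRUD operations.
    // var user = new User { Username = "bob" };
    // user.Save();
    // User[] all = User.FindAll();   // arrays, as mentioned above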
I'm trying to setup a sample project using NHibernate and Fluent NHibernate.
I am using the example mappings from the Fluent NHibernate web-site.
My question is about the many-to-many mapping between Store and Product. It seems (when looking at the generated SQL) that when adding a product to the store, NHibernate deletes all the records from the association table (StoreProduct) that belong to that store, and then inserts all the records again, now including the association to the new product I added.
Is this the default behavior or am I missing something? It just seems not very efficient to delete and re-insert all the associations every time I need to add one.
This is expected behavior. I believe this should only happen when you use the bag mapping strategy, which it looks like they are using in the example. A bag is an unordered collection that can have duplicate items. Because bag items are not unique, NHibernate cannot easily tell when you've added or removed an item from a bag. The easiest thing for NHibernate to do, then, is delete all associations and re-add them.
It's been a while since I've played with many-to-many mappings (I usually just map them as two one-to-many relationships), but I believe that if you use a different construct, for example a set (which does not allow duplicates), you should find that the behavior is different. Of course, you should use whatever construct makes the most semantic sense for your application.
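For example, with Fluent NHibernate the association could be mapped as a set rather than a bag, roughly like this (based on the Store/Product model from the examples; the link table name is a guess):

    using System.Collections.Generic;
    using FluentNHibernate.Mapping;

    public class Product
    {
        public virtual int Id { get; set; }
        public virtual string Name { get; set; }
    }

    public class Store
    {
        public virtual int Id { get; set; }
        public virtual string Name { get; set; }
        // A set-compatible collection type (not IList), since the
        // collection is now treated as a set of unique associations.
        public virtual ICollection<Product> Products { get; set; }
    }

    public class StoreMap : ClassMap<Store>
    {
        public StoreMap()
        {
            Id(x => x.Id);
            Map(x => x.Name);
            // AsSet() lets NHibernate insert/delete individual
            // StoreProduct rows instead of deleting and re-inserting
            // the whole association list on every change.
            HasManyToMany(x => x.Products)
                .Table("StoreProduct")
                .AsSet();
        }
    }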