Mapping list of ints in NHibernate

In the NHibernate manual I found a mapping like this:
<bag name="Sizes" table="SIZES" order-by="SIZE ASC">
<key column="OWNER"/>
<element column="SIZE" type="Int32"/>
</bag>
I can't help wondering why anyone would want to do something like this. Is there something better about mapping plain integers than creating an entity corresponding to a given integer (a Size entity in this scenario) and creating a true one-to-many relationship?

The requirements of the business domain your application operates in will dictate whether you should have a list of ints (a collection of value types) or a list of entities. If there's a good use case for an IList<int>, then by all means go for it and let NHibernate map the association accordingly. Otherwise just remove it.
However, removing it just because it seems unfamiliar to you isn't a valid reason.
I use this a lot in my domain models. Right now I have a lot of 'Twitter Applications' that index tweets based on search 'Keywords', so I have mapped it like this:
public class TwitterApplication
{
public virtual int Id { get; set; }
public virtual string ApplicationName { get; set; }
// other properties (snip)
public virtual ISet<string> Keywords { get; set; }
}
I use this mapping because I know that:
The number of Keywords will be small (about 4-6)
I'm not interested in storing Keyword DateAdded etc.
I'm not going to be doing Paging or Querying on the Keywords, just retrieving them all at the same time, or not at all
On this basis, I decided mapping it as a collection of strings was appropriate.
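For reference, here is a sketch of how such a value-type collection could be mapped with Fluent NHibernate (the table and column names are assumptions, not from the original post; an hbm.xml <set> with an <element> child achieves the same thing):
// Illustrative Fluent NHibernate mapping for the Keywords value collection.
public class TwitterApplicationMap : ClassMap<TwitterApplication>
{
    public TwitterApplicationMap()
    {
        Id(x => x.Id);
        Map(x => x.ApplicationName);
        HasMany(x => x.Keywords)
            .Table("Keywords")                  // hypothetical table name
            .KeyColumn("TwitterApplicationId")  // FK back to the owning application
            .Element("Keyword")                 // maps the string itself, no child entity
            .AsSet();
    }
}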

The question is not whether you want to map it like this; the question is whether you need a list of ints in your model.
When you have a list of ints in your model, then you want to map it like this. You don't want to write complex code in your model just because of the mapping.
So, do you think it is useful to have a list of ints in a class? Or a list of Guids, enums, doubles?
class Graph
{
IList<double> Values { get; private set; }
}
Doesn't it make sense to you?

Related

Share a model between Azure and SQL Repositories

I have a set of entities that I would like to persist via the repository pattern.
Vanilla SQL is pretty straightforward: write some methods with queries that take/return the entities.
Azure table storage is also pretty straightforward, except that most of the implementations I have seen want the entities to be descended from some common Azure base class (TableServiceEntity, etc.).
EF works as well, but also wants to own a bit more of the entities.
Is there a good way to abstract away both the SQL and Azure table stuff so that the entities can be persisted either way?
Bidirectional support is not really needed, we are just going to have two different deployment types that need to be supported.
I would like the models to be as agnostic as possible of the repository they are being persisted in, with as few dependencies as possible (none?!).
This is doable. I've helped architect this for a client on a decently large scale.
1) Do not inherit from TableServiceEntity; instead, apply the following attribute to your entities:
[DataServiceKey(new string[] { "PartitionKey", "RowKey" }), Serializable]
Also, implement some sort of interface on your entities that provides PartitionKey, RowKey, and Timestamp:
public interface ITableEntity
{
string PartitionKey { get; set; }
string RowKey { get; set; }
DateTime Timestamp { get; set; }
}
At least this approach will allow you to have your own inheritance strategy for your own entities and not be restricted by the lack of multiple inheritance. Try to have PartitionKey and RowKey simply provide a pass-through to the real key properties instead of duplicating the keys.
public string PartitionKey
{
get
{
return this.Id;
}
set
{
this.Id = value;
}
}
2) Do realize that you will have two types of repositories in your system: relational-specific and ATS-specific.
3) You can generate your entities via EDMX and use partial classes to inject them with the ITableEntity interface and the DataServiceKey attribute (see the sketch after this list).
4) At some point you will need your ATS-specific repositories to do some transformations of your entities for persistence's sake, because the way you'll be saving data into ATS is not the way you'll want it modeled in your domain (this especially relates to hierarchical or relational data).
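As a sketch (the entity and property names below are illustrative, not from the original answer), a partial class pairing an EDMX-generated entity with the attribute and interface from step 1 might look like this:
// Sketch: Customer, Region and Id are hypothetical EDMX-generated members.
[DataServiceKey(new string[] { "PartitionKey", "RowKey" }), Serializable]
public partial class Customer : ITableEntity
{
    // Pass-throughs to the entity's real keys, as recommended above.
    public string PartitionKey
    {
        get { return this.Region; }
        set { this.Region = value; }
    }
    public string RowKey
    {
        get { return this.Id.ToString(); }
        set { this.Id = int.Parse(value); }
    }
    public DateTime Timestamp { get; set; }
}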
HTH

Fluent NHibernate automapping: One-to-many entities, many-to-many backend?

My goal is to use NHibernate schema generation along with Fluent NHibernate's automapper to generate my database. I'm having trouble with what I'll call "unidirectional many-to-many relationships."
Many of my entities have localized resources. A single class might look like this:
public class Something {
public virtual int Id {get; private set;}
public virtual Resource Title {get;set;}
public virtual Resource Description {get;set;}
public virtual IList<Resource> Bullets {get;set;}
}
The Resource class doesn't have any references back; these are entirely unidirectional.
public class Resource {
public virtual int Id {get; private set;}
public virtual IList<LocalizedResource> LocalizedResources {get;set;}
// etc.
}
public class LocalizedResource {
public virtual int Id {get; private set; }
public virtual string CultureCode {get;set;}
public virtual string Value {get;set;}
public virtual Resource Resource {get;set;}
}
Without the IList<Resource>, everything is generated as I'd want: Resource IDs are in the Title and Description fields. When I add in the IList, though, NHibernate adds the field something_id to the Resource table. I understand why it does this, but in this situation it's not a sustainable approach.
What I want is to create a junction table for the bullets. Something like:
CREATE TABLE SomethingBullet (
Id int NOT NULL PRIMARY KEY IDENTITY(1,1),
Something_Id int NOT NULL,
Resource_Id int NOT NULL
)
This way when I add the other twenty-odd entities into the database I won't end up with a ridiculously wide and sparse Resource table.
How do I instruct the Automapper to treat all IList<Resource> properties this way?
Every many-to-many is in fact composed of one-to-manys in the object model. If your relationship doesn't need to be bidirectional, just don't map the second side. The mapping on your mapped side is not affected at all:
HasManyToMany(x => x.Bullets).AsSet();
In this case, NHibernate already knows that it needs to generate the intermediate table.
See also this article for many-to-many tips.
:)
The only way I found to make this work with automapping is by constructing your own custom automapping step and replacing the "native" HasManyToManyStep. It's either that or an override, I'm afraid.
I lifted mine off of Samer Abu Rabie, posted here.
The good news is that Samer's code, so far, seems to work flawlessly with my conventions and whatnots, so, once it was in place, it was completely transparent to everything else in my code.
The bad news is that it costs you the ability to have unidirectional one-to-many relationships, as Samer's code assumes that all x-to-many unidirectional relationships are many-to-many. Depending on your model, this may or may not be a good thing.
Presumably, you could code up a different implementation of ShouldMap that would distinguish between what you want to be many-to-many and what you want to be one-to-many, and everything would then work again. Do note that that would require having two custom steps to replace the native HasManyToManyStep, although, again, Samer's code is a good starting point.
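If you do take the override route instead, a minimal sketch could look like this (assuming Fluent NHibernate's IAutoMappingOverride; the junction table name is an assumption):
// Sketch: a per-class override forcing the Bullets collection onto a junction table.
public class SomethingOverride : IAutoMappingOverride<Something>
{
    public void Override(AutoMapping<Something> mapping)
    {
        mapping.HasManyToMany(x => x.Bullets)
               .Table("SomethingBullet")
               .AsSet();
    }
}
Overrides like this are typically registered with UseOverridesFromAssemblyOf<SomethingOverride>() when building the AutoPersistenceModel.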
Let us know how it goes. :)
Cheers,
J.

NHibernate QueryOver value collection

I have a project using NH 3.1 and have been using the QueryOver syntax for everything thus far.
One aspect of this project lives in an organization-wide database that I have read-only access to and which uses a completely different DBMS (Oracle vs. MSSQL). So I store references from my objects (Foos) to their objects (Bars) using a standard many-to-many table:
FooBars
FooID int not null PK
BarID int not null PK
And my domain object, instead of having an ISet<Bar>, has an ISet<int> BarIDs which is manually mapped to the FooBars table. This prevents NH from trying to do the impossible and join all the way over to the Bars table (I can use a BarRepository.Get() to retrieve the details of the Bars later if I need them, and in this case I wouldn't, because I just need the IDs to filter the list of objects returned).
Given IList<int> SelectedBars, how can I write a QueryOver<Foo> where BarIDs contains any element in SelectedBars?
SQL something like
...FROM foos INNER JOIN foobars on foo.fooID = foobars.fooID WHERE barID IN ( ... )
It is not possible with QueryOver. Two years ago, I had a similar question about filtering value collections. (Note: QueryOver is based on Criteria API).
I'm not 100% sure, but it probably works with HQL. It is much more powerful.
You may include an SQL statement into the QueryOver criteria.
I don't really understand why you don't map it as a list of entities. There is lazy loading to avoid unnecessary loading, although there are some trade-offs sometimes. You can access the ID of NH proxies without hitting the database. Mapping IDs usually makes life much harder.
Try:
session.QueryOver<Foo>()
.JoinQueryOver(x => x.FooBars)
.WhereRestrictionOn(x => x.BarId).IsIn( ... )
So 3 years later, I'm back to report how I did solve this.
public class Foo : Entity {
public virtual ISet<FooBar> BarIDs { get; protected internal set; }
} ...
public class FooBar : Entity {
protected internal FooBar() { }
protected internal FooBar(Foo f, int BarID) { ... }
public virtual Foo Foo { get; protected internal set; }
public virtual int BarID { get; protected internal set; }
}
This is basically what Stefan suggested, and what's hinted at in the related post. You just have to eat the overhead of writing an extra entity and referencing it. Remember that I'm storing BarIDs instead of full Bar objects because I'm dealing with a hard boundary between two databases: the Bars are stored in an entirely different database, on a different platform, than the Foos. Otherwise, of course, you're far better off just telling Foo that it has an ISet<Bar>.
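For reference, a Fluent NHibernate sketch of how this pairing might be mapped (table and column names are assumptions, not taken from the original post):
// Sketch: maps Foo's set of FooBar rows and the FooBar entity itself.
public class FooMap : ClassMap<Foo>
{
    public FooMap()
    {
        Id(x => x.Id);                          // assumes Entity supplies an Id
        HasMany(x => x.BarIDs)                  // the ISet<FooBar> shown above
            .Table("FooBars")
            .KeyColumn("FooID")
            .Inverse()
            .Cascade.AllDeleteOrphan()
            .AsSet();
    }
}
public class FooBarMap : ClassMap<FooBar>
{
    public FooBarMap()
    {
        Id(x => x.Id);
        References(x => x.Foo).Column("FooID");
        Map(x => x.BarID).Column("BarID");
    }
}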
Finding Foos by SelectedBarIDs is then easy, much like Thilak suggested:
session.QueryOver<Foo>().JoinQueryOver<FooBar>(f => f.BarIDs).
WhereRestrictionOn(b => b.BarID).IsIn(...)...
It's an interesting problem, working across the database boundary like this. I can't say I like having to do it, but if someone else is going to take the time to maintain a list of Bars and make it available for my use, it would be a giant waste of resources for me to do the same. So a small inefficiency in having the wrapper class is a very easy cost to justify.

Keeping NHibernate from loading some fields

This is a follow-on question to my earlier question on lazy loading properties. Since the application is an enhancement to a production application in a fairly major enterprise, and it is currently running on NHib 1.2, upgrading to version 3 is not going to happen, so the suggested answer of using lazy properties in 3.0 won't work for me.
To summarize the problem, I have a simple database with 2 tables. One has about a dozen simple fields, plus a one-to-many relation to the second table as a child table. Two of the fields are very large blobs (several megabytes each), and I want to, effectively, lazy load them. This table has about 10 records, and they populate a grid at start up, but access to the large blobs is only needed for whatever row is selected.
The object structure looks something like this:
[Serializable]
[Class(Schema = "dbo", Lazy = false)]
public class DataObject
{
[Id(-2, Name = "Identity", Column="Id")]
[Generator(-1, Class="native")]
public virtual long Identity { get; set;}
[Property]
public string FieldA { get; set;}
[Property]
public byte[] LongBlob {get; set;}
[Property]
public string VeryLongString { get; set;}
[Bag(-2, Cascade=CascadeStyle.All, Lazy= false, Inverse=true)]
[Key(-1, Column="ParentId")]
[OneToMany(0, ClassType=typeof(DataObjectChild))]
public IList<DataObjectChild> ChildObjects { get; set;}
}
Currently, the table is accessed with a simple query:
objectList = (List<DataObject>) Session.CreateQuery("from DataObject").List<DataObject>();
And that takes care of everything, including loading the child objects.
What I would like is a query to do exactly the same thing, except select a list of the properties of the DataObject that includes everything EXCEPT the two blobs. Then I can add custom property Getters that will go out and load those properties when they are accessed.
So far, I have not been successful at doing that.
Things I have tried:
a) constructing an HQL query using a SELECT list.
It looks something like this:
objectList = (List<DataObject>) Session.CreateQuery(
"SELECT new DataObject " +
"(obj.Identity, obj.FieldA) " +
"from DataObject as obj").List<DataObject>();
That works, though I have to add a constructor to the DataObject that will construct it from fields I select, which is rather cumbersome. But then it doesn't load and expand the list of child objects, and I can't figure out how to do that easily (and don't really want to - I want to tell NHib to do it.)
b) removing the [Property] attribute from the two fields in the object definition. That keeps the NHibernate.Mapping.Attributes from mapping those fields, so they don't get included in the query, but then I have no way to access them from NHib at all, including writing them out when I go to save a new or modified DataObject.
I'm thinking there has to be an easier way. Can somebody point me in the right direction?
Thanks
I think you're on the right path. However, you're going to have to do more manual work since you're using a very old version of NHibernate. I would fetch projections of your entities that contain only the columns you want to eagerly load. Then when the UI requests the large blob objects, you're going to have to write another query to get them and supply them to the UI.
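A sketch of what that second, on-demand query might look like (plain HQL, so it works on NHibernate 1.2; property names follow the DataObject class from the question, and selectedId is a hypothetical variable holding the Identity of the selected row):
// Sketch: pull only the two blobs for the row the user selected.
object[] blobs = (object[]) Session.CreateQuery(
    "select obj.LongBlob, obj.VeryLongString " +
    "from DataObject obj where obj.Identity = :id")
    .SetInt64("id", selectedId)
    .UniqueResult();
byte[] longBlob = (byte[]) blobs[0];
string veryLongString = (string) blobs[1];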
Another option would be to have a second class (e.g. SmallDataObject) with the same mapping (but without the blobs) and use that for the list. When editing a list item you would load the class with the blobs using the id of the selected list item.
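A rough sketch of such a companion class, using the same attribute-mapping style as the question (the explicit Table value on [Class] is an assumption about the table name):
// Sketch: a slimmer class mapped onto the same table, used only to fill the grid.
[Serializable]
[Class(Schema = "dbo", Table = "DataObject", Lazy = false)]
public class SmallDataObject
{
    [Id(-2, Name = "Identity", Column = "Id")]
    [Generator(-1, Class = "native")]
    public virtual long Identity { get; set; }
    [Property]
    public string FieldA { get; set; }
    // LongBlob and VeryLongString are deliberately absent, so they are never selected.
}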
In any case, modifying the mapping after the creation of the SessionFactory is not possible, so you cannot get rid of the mapped properties on demand.

NHibernate: mapping single column from many-to-one to a primitive type

I have a following mapping:
<set name="People" lazy="true" table="ProjectPeople">
<key column="ProjectId" />
<composite-element class="PersonRole">
<many-to-one name="Person" column="PersonId" cascade="save-update" not-null="true" />
<many-to-one name="Role" column="RoleId" cascade="save-update" not-null="true" />
</composite-element>
</set>
Now, I do not really want to have a separate class for Role in the domain; I need only the Role name. However, in the DB, Roles should still be normalized into a separate table Role (Id, Name).
How do I map it so that People uses the following PersonRole class?
public class PersonRole {
public virtual Person Person { get; set; }
public virtual string Role { get; set; }
}
Update: added a bounty; seems like a question that's useful not only to me.
You won't actually get the answer you hope for, simply because it is not possible. (N)Hibernate is an Object-Relational Mapping framework and supports three kinds of mapping strategies:
table per class hierarchy
table per subclass
table per concrete class
It also allows you to deviate from this by using formula or sql-insert etc, but as you've found out, these only cause you more pain in the end, are not encouraged by the Hibernate community and are bad for the maintainability of your code.
Solution?
Actually, it is very simple. You do not want to use a class for Role. I assume you mean that you do not want to expose a class of type Role and that you do not want to have to type prObject.Role.Name all the time. Just prObject.Role, which should return a string. You have several options:
Use an inner class in, say, PersonRole; this class can be internal or private. Add a property Role that sets and updates a member field;
Use an internal class. Add a property Role that sets and updates a member field;
Let's examine option 2:
// mapped to table Role, will not be visible to users of your DAL
// class can't be private, it's on namespace level, it can when it's an inner class
internal class Role
{
// typical mapping, need not be internal/protected when class is internal
// cannot be private, because then virtual is not possible
internal virtual int Id { get; private set; }
internal virtual string Name { get; set; }
}
// the composite element
public class PersonRole
{
// mapped properties public
public virtual Person Person { get; set; }
// mapped properties hidden
internal virtual Role dbRole { get; set; }
// not mapped, but convenience property in your DAL
// for clarity, it is actually better to rename to something like RoleName
public string Role /* need not be virtual, but can be */
{
get
{
return this.dbRole.Name;
}
set
{
this.dbRole.Name = value; /* this works and triggers the cascade */
}
}
}
And the mapping can look as expected. Result: you have not violated the one-table-per-class rule (EDIT: asker says that he explicitly wants to violate that rule, and Hib supports it, which is correct), but you've hidden the objects from modification and access by using typical object-oriented techniques. All NH features (cascade, etc.) still work as expected.
(N)Hibernate is all about this type of decisions: how to make a well thought-through and safe abstraction layer to your database without sacrificing clarity, brevity or maintainability or violating OO or ORM rules.
Update (after q. was closed)
Other excellent approaches I use a lot when dealing with this type of issue are:
Create your mappings normally (i.e., one-class-per-table, I know you don't like it, but it's for the best) and use extension methods:
// trivial general example
public static string GetFullName(this Person p)
{
return String.Format("{0} {1}", p.FirstName, p.LastName);
}
// getter / setter for Role.Name
public static string GetRoleName(this PersonRole pr)
{
return pr.Role == null ? "" : pr.Role.Name;
}
public static void SetRoleName(this PersonRole pr, string name)
{
pr.Role = (pr.Role ?? new Role());
pr.Role.Name = name;
}
Create your mappings normally but use partial classes, which enable you to "decorate" your class any which way you like. The advantage: if you use generated mappings of your tables, you can regenerate as often as you wish. Of course, the partial classes should go in separate files, so considering your wish for diminishing "bloat" this probably isn't a good scenario currently.
public partial class PersonRole
{
public string Role {...}
}
Perhaps simplest: just override ToString() for Role, which makes it suitable for use in String.Format and friends, but of course doesn't make it assignable. By default, each entity class or POCO should have a ToString() override anyway.
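For example, on the (hidden) Role class from earlier, the override is a one-liner:
// Sketch: let Role render as its name wherever a string is expected.
public override string ToString()
{
    return this.Name;
}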
Though it is possible to do this with NHibernate directly, the question was closed before I had time to look at it (no one's fault, I just didn't have the time). I'll update if I find the time to do it through Hibernate HBM mapping, even though I don't agree with the approach. It is not good to wrestle with advanced concepts of Hib when the end result is less clear for other programmers and less clear overall (where did that table go? why isn't there an IDao abstraction for that table? See also NHibernate Best Practices and S#arp). However, the exercise is interesting nevertheless.
Considering the comments on "best practices": in typical situations, it shouldn't be only "one class per table", but also one IDaoXXX, one DaoConcreteXXX and one GetDaoXXX for each table, where you use a class/interface hierarchy to differentiate between read-only and read/write tables. That's a minimum of four classes/lines of code per table. This is typically auto-generated but gives a very clear access layer (DAO) to your data layer (DAL). The data layer is best kept as spartan as possible. None of these "best practices" prevents you from using extension methods or partial classes for moving Role.Name into Role.
These are general best practices. It's not always possible, feasible, or even necessary in certain special or typical situations.
Personally, I would create a Role class like Yassir suggests.
But if you want to use the structure that you have at the moment, then create a view that contains the foreign key to your Person table and the Role description.
Modify the set mapping's table to point at your new view.
Then modify your Role mapping so that it is a property instead of the many-to-one mapping.
However, I think taking this approach will mean that you will not be able to update your role, as it is referencing a view.
Edit: to update the role you could add <sql-insert>, <sql-update> and <sql-delete> to your mapping file so that the cascade-all will work.
I don't think it is possible to map a many-to-one to a primitive type. If I were you, I would add a Role class to the model.
This is the biggest turn-off of the whole OO-purist thing.
Surely the goal is to have a working application, not somebody's version of a perfect class hierarchy. So what if you have to code "prObject.Role.Name" instead of "prObject.Role"? How does that help you make a better, more reliable program?
From an application-design purist's point of view, what you want is just plain wrong. A person can have several roles, and a role can usually be assigned to several people.
Why go to all this trouble to enforce an unrealistic one-role-per-person class hierarchy when the underlying data model is many roles per person?
If you really do have an "only one role per person" rule, then it should be reflected in the underlying data model.