Fluent NHibernate, Custom Types and Id Mapping

I have an object in C# that I want to use as a primary key in a database that auto-increments when new objects are added. The object is basically a wrapper around a ulong value that uses some bits of the value for additional hints. I want to store it as a 'pure' ulong value in the database, but I would like to get an automatic conversion when the value is loaded from / saved to the DB, i.e. apply the 'hint' bits to the value based on the table it comes from.
I went on a journey of implementing my own IUserType based on a number of examples I found online (tons of help on this forum).
I have an ObjectId class that acts as the object ID, and an ObjectIdType class implementing IUserType for it:
class ObjectIdType : IUserType
{
    private static readonly NHibernate.SqlTypes.SqlType[] SQL_TYPES = { NHibernateUtil.UInt64.SqlType };

    public NHibernate.SqlTypes.SqlType[] SqlTypes
    {
        get { return SQL_TYPES; }
    }

    public Type ReturnedType
    {
        get { return typeof(ObjectId); }
    }

    ...
}
I have a mapping class that looks like this:
public class ObjectTableMap : ClassMap<ObjectTable> // entity type name assumed
{
    public ObjectTableMap()
    {
        Id(x => x.Id)
            .Column("instance_id")
            .CustomType<ObjectIdType>()
            .GeneratedBy.Native();
    }
}
At this point I get an exception at configuration time saying the Id can only be an integral type. I guess that makes sense, but I was half expecting that, with the custom type implemented, the native ulong database type would take over and work.
I've tried to go down the path of creating a custom generator, but it's still a bit beyond my skill level, so I am stumbling through it.
My question is, is it possible for me to accomplish what I am trying to do with the mapping?

I think it is not possible, because your mapping uses the native generator for the Id, which can only be used for integral types (and GUIDs). You could try using assigned Ids with your custom type instead, which makes you responsible for assigning a value to the Id property before saving.
There is another alternative: why not set your hint bits at the class level, instead of depending on the table? Your entities represent the tables, so you have the same information available in your entity classes. Example:
class Entity
{
    protected virtual ulong InternalId { get; set; } // Mapped as Id

    public virtual ulong Id // This property is not mapped
    {
        get
        {
            var retVal = InternalId;
            // Flip your hint bits here based on class information
            return retVal;
        }
    }
}
You could also turn InternalId into a public property and make the setter protected.
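If you go the assigned-Id route, a minimal mapping sketch could look like the following (this is my illustration, not code from the original post, and the ObjectTable entity name is assumed; the application has to set the Id itself before saving):
public class ObjectTableMap : ClassMap<ObjectTable>
{
    public ObjectTableMap()
    {
        // The custom IUserType still handles the ulong <-> ObjectId conversion,
        // but with an assigned generator NHibernate no longer insists on an
        // integral Id type.
        Id(x => x.Id)
            .Column("instance_id")
            .CustomType<ObjectIdType>()
            .GeneratedBy.Assigned();
    }
}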

Related

How to easily access widely different subsets of fields of related objects/DB tables?

Imagine we have a number of related objects (equivalently DB tables), for example:
public class Person {
private String name;
private Date birthday;
private int height;
private Job job;
private House house;
..
}
public class Job {
private String company;
private int salary;
..
}
public class House {
private Address address;
private int age;
private int numRooms;
..
}
public class Address {
private String town;
private String street;
..
}
How to best design a system for easily defining and accessing widely varying subsets of data on these objects/tables? Design patterns, pros and cons, are very welcome. I'm using Java, but this is a more general problem.
For example, I want to easily say:
I'd like some object with (Person.name, Person.height, Job.company, Address.street)
I'd like some object with (Job.company, House.numRooms, Address.town)
Etc.
Other assumptions:
We can assume that we're always getting a known structure of objects on the input, e.g. a Person with its Job, House, and Address.
The resulting object doesn't necessarily need to know the names of the fields it was constructed from, i.e. for subset defined as (Person.name, Person.height, Job.company, Address.street) it can be the array of Objects {"Joe Doe", 180, "ACompany Inc.", "Main Street"}.
The object/table hierarchy is complex, so there are hundreds of data fields.
There may be hundreds of subsets that need to be defined.
A minority of fields to obtain may be computed from actual fields, e.g. I may want to get a person's age, computed as (now().getYear() - Person.birthday.getYear()).
Here are some options I see:
1. A SQL view for each subset.
Minuses:
They will be almost the same for similar subsets. This is OK just for field names, but not great for the joins part, which could ideally be refactored out to a common place.
Less testable than a solution in code.
2. Using a DTO assembler, e.g. http://www.genericdtoassembler.org/
This could be used to flatten the complex structure of input objects into a single DTO.
Minuses:
I'm not sure how I'd then proceed to easily define subsets of fields on this DTO. Perhaps if I could somehow set the ones irrelevant to the current subset to null? Not sure how.
Not sure if I can do computed fields easily in this way.
3. A custom mapper I came up with.
Relevant code:
// The enum has a value for each field in the Person objects hierarchy
// that we may be interested in.
public enum DataField {
PERSON_NAME(new PersonNameExtractor()),
..
PERSON_AGE(new PersonAgeExtractor()),
..
COMPANY(new CompanyExtractor()),
..
}
// This is the container for field-value pairs from a given instance of
// the object hierarchy.
public class Vector {
private Map<DataField, Object> fields;
..
}
// Extractors know how to get the value for a given DataField
// from the object hierarchy. There's one extractor per each field.
public interface Extractor<T> {
public T extract(Person person);
}
public class PersonNameExtractor implements Extractor<String> {
public String extract(Person person) {
return person.getName();
}
}
public class PersonAgeExtractor implements Extractor<Integer> {
public Integer extract(Person person) {
return now().getYear() - person.getBirthday().getYear();
}
}
public class CompanyExtractor implements Extractor<String> {
public String extract(Person person) {
return person.getJob().getCompany();
}
}
// Building the Vector using all the fields from the DataField enum
// and the extractors.
public class FullVectorBuilder {
public Vector buildVector(Person person) {
Vector vector = new Vector();
for (DataField field : DataField.values()) {
vector.addField(field, field.getExtractor().extract(person));
}
return vector;
}
}
// Definition of a subset of fields on the Vector.
public interface Selector {
public List<DataField> getFields();
}
public class SampleSubsetSelector implements Selector {
private List<DataField> fields = ImmutableList.of(PERSON_NAME, COMPANY);
...
}
// Finally, a builder for the subset Vector, choosing only
// fields pointed to by the selector.
public class SubsetVectorBuilder {
public Vector buildSubsetVector(Vector fullVector, Selector selector) {
Vector subsetVector = new Vector();
for (DataField field : selector.getFields()) {
subsetVector.addField(field, fullVector.getValue(field));
}
return subsetVector;
}
}
Minuses:
Need to create a tiny Extractor class for each of hundreds of data fields.
This is a custom solution that I came up with; it seems to work and I like it, but I feel this problem must have been encountered and solved before, likely in a better way. Has it?
Edit
A fourth option: each object knows how to turn itself into a Map of fields, keyed on an enum of all fields.
E.g.
public enum DataField {
PERSON_NAME,
..
PERSON_AGE,
..
COMPANY,
..
}
public class Person {
private String name;
private Date birthday;
private int height;
private Job job;
private House house;
..
public Map<DataField, Object> toMap() {
return ImmutableMap.<DataField, Object>builder()
.put(DataField.PERSON_NAME, name)
.put(DataField.BIRTHDAY, birthday)
.put(DataField.HEIGHT, height)
.put(DataField.PERSON_AGE, now().getYear() - birthday.getYear())
.build();
}
}
Then, I could build a Vector combining all the Maps, and select subsets from it like in 3.
Minuses:
Enum name clashes, e.g. if Job has an Address and House has an Address, then I want to be able to specify a subset taking street name of both. But how do I then define the toMap() method in the Address class?
No obvious place to put code doing computed fields requiring data from more than one object, e.g. physical distance from Address of House to Address of Company.
Many thanks!
Rather than in-memory object mapping in the application, I would favor processing the data in the database for better performance. Views, or more elaborate OLAP/data-warehouse tooling, could do the trick. If the calculated fields remain basic, as in "age = now - birth", I see nothing wrong with having that logic in the DB.
On the code side, given the large number of DTOs you have to deal with, you could use classless dynamic objects (available in some JVM languages) or JSON objects. The idea is that when a data structure changes, you only need to modify the DB and the UI, saving you the cost of changing a whole bunch of classes in between.

WCF, LINQ Error: cannot implicitly convert type System.Linq.IOrderedQueryable<> to System.Collections.Generic.List<>

I am getting an error. I am using Entity Framework and WCF.
Error: cannot implicitly convert type System.Linq.IOrderedQueryable<xDataModel.Info> to System.Collections.Generic.List<xServiceLibrary.Info>
Below is my code:
WCF Service:
namespace xServiceLibrary
{
    public List<Info> GetScenario()
    {
        xEntities db = new xEntities();
        var query = from qinfo in db.Infoes
                    select qinfo;
        //return query.Cast<Info>().ToList(); (not working)
        //return query.ToList(); (not working)
        return query;
    }
}
Interface:
namespace xServiceLibrary
{
    [OperationContract]
    List<Info> GetScenario();
}
Class:
namespace xServiceLibrary
{
    [DataContract]
    public class Info
    {
        [DataMember]
        public int Scenario_Id;

        [DataMember]
        public string Scenario_Name { get; set; }

        [DataMember]
        public string Company_Name { get; set; }
    }
}
Update (2):
I have two class library projects.
One is the xDataModel namespace, in which I have created the xmodel.edmx file.
The second is the xServiceLibrary namespace, where I am implementing the WCF service.
I have referenced xDataModel.dll in my xServiceLibrary project so that I can query my EF model.
I am not able to understand the concept. Any help would be appreciated.
The problem is that you have two different types named Info: DataModel.Info and ServiceLibrary.Info - because these are different types you cannot cast one into the other.
If there is no strong reason for both being there I would eliminate one of them. Otherwise as a workaround you could project DataModel.Info to ServiceLibrary.Info by copying the relevant properties one by one:
var results = (from qinfo in db.Infoes
               select new ServiceLibrary.Info()
               {
                   Scenario_Id = qinfo.Scenario_Id,
                   //and so on
               }).ToList();
The problem is that you have two different classes, both called Info, both in scope at the time you run your query. This is a very very bad thing, especially if you thought they were the same class.
If DataModel.Info and ServiceLibrary.Info are the same class, you need to figure out why they are both in scope at the same time and fix that.
If they are different classes, you need to be explicit about which one you are trying to return. Assuming that your EF model includes a set of DataModel.Info objects, your options there are:
1. Return a List<DataModel.Info>, which you can get by calling query.ToList()
2. Return a List<ServiceLibrary.Info>, which you can get by copying the fields from your DataModel.Info objects:
var query = from qinfo in db.Infoes
            select new ServiceLibrary.Info
            {
                Scenario_Id = qinfo.Scenario_Id,
                Scenario_Name = qinfo.Scenario_Name,
                Company_Name = qinfo.Company_Name
            };
3. Return something else, such as a custom DTO object - similar to option 2, but with only the specific fields you need (e.g. if ServiceLibrary.Info is a heavy object you don't want to pass around).
In general, though, your problem is centered around the fact that the compiler is interpreting List<Info> as List<ServiceLibrary.Info> and you probably don't want it to.
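Putting it together, a rough sketch of the corrected service method might look like this (the ScenarioService class name is my placeholder; the original post does not show the enclosing class). It materializes the EF entities first and then maps them into the WCF data contract in memory:
namespace xServiceLibrary
{
    public class ScenarioService
    {
        public List<Info> GetScenario()
        {
            xEntities db = new xEntities();

            // Materialize the xDataModel entities, then project each one into
            // the xServiceLibrary.Info data contract before returning.
            return db.Infoes
                .ToList()
                .Select(qinfo => new Info
                {
                    Scenario_Id = qinfo.Scenario_Id,
                    Scenario_Name = qinfo.Scenario_Name,
                    Company_Name = qinfo.Company_Name
                })
                .ToList();
        }
    }
}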

Fluent NHibernate - mapping an Entity as a different type

I have a class which I would like to map as a component onto any table which contains it:
public class Time
{
    public int Hours { get; set; }
    public int Minutes { get; set; }
    public int Seconds { get; set; }
}
I would like to store this class as a bigint in the database - the same way a TimeSpan is stored - but my class has completely different behaviour, so I decided to create my own.
I'm using FNH's automapper and have this class set as a component (other classes have Time as a property). I've got as far as creating an override, but am not sure how to go about mapping it:
I gave it a try this way:
public class TimeMappingOverride : IAutoMappingOverride<Time>
{
    public void Override(AutoMapping<Time> mapping)
    {
        mapping.Map(x => x.ToTimeSpan());
        mapping.IgnoreProperty(x => x.Hours);
        mapping.IgnoreProperty(x => x.Minutes);
        mapping.IgnoreProperty(x => x.Seconds);
    }
}
But got this error:
Unable to cast object of type 'System.Linq.Expressions.UnaryExpression' to type 'System.Linq.Expressions.MethodCallExpression'.
How should I go about this?
Details of components can be found here: http://wiki.fluentnhibernate.org/Fluent_mapping#Components
But first of all, you can't map a method.
Assuming you change ToTimeSpan() to a property AsTimeSpan, there are two ways to do it, only the harder of which will work for you because you are using automapping:
Create a ComponentMap<Time> -- once done, your existing mapping will just work. This is not compatible with automapping.
Declare the component mapping inline:
mapping.Component(x => x.AsTimeSpan, component =>
{
    component.Map(t => t.Hours);
    component.Map(t => t.Minutes);
    component.Map(t => t.Seconds);
});
You'll have to do this every time, though.
Of course, this doesn't address "I would like to store this class as bigint…"
Are you saying you want to persist it as seconds only? If so, scratch everything at the top and again you have two options:
Implement NHibernate IUserType (ugh)
Create a private property or field that stores the value as seconds only, and wire only this up to NHibernate. The getters and setters of the public properties will have to convert to/from seconds (see the sketch below).
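As a rough sketch of that second option (the field name and layout are my own, not from the answer), the public properties delegate to a single seconds value, and only that member is exposed to NHibernate:
public class Time
{
    // The only member you would wire up to NHibernate (e.g. via field access
    // in the mapping), so the column stays a single bigint.
    private long _totalSeconds;

    public int Hours
    {
        get { return (int)(_totalSeconds / 3600); }
        set { _totalSeconds = value * 3600L + Minutes * 60 + Seconds; }
    }

    public int Minutes
    {
        get { return (int)(_totalSeconds % 3600 / 60); }
        set { _totalSeconds = Hours * 3600L + value * 60 + Seconds; }
    }

    public int Seconds
    {
        get { return (int)(_totalSeconds % 60); }
        set { _totalSeconds = Hours * 3600L + Minutes * 60 + value; }
    }
}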
I personally haven't worked with AutoMappings yet, but my suggestion would be to look into NHibernate's IUserType to change how a type is being persisted. I believe that's a cleaner way of defining your custom mapping of Time <-> bigint.
Reading the code above, Map(x => x.ToTimeSpan()) will not work, as you cannot embed application-to-database transformation code into your mappings. Even if that were possible, the declaration would be missing the transformation from the database back to the application. An IUserType, on the other hand, can do custom transformations in the NullSafeGet and NullSafeSet methods.
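For illustration, here is a minimal IUserType sketch for the Time <-> bigint conversion (my own sketch, not code from either answer). It assumes the older NHibernate 2.x/3.x IUserType signatures - newer versions take DbDataReader/DbCommand plus a session parameter - and it stores the value as total seconds:
using System;
using System.Data;
using NHibernate;
using NHibernate.SqlTypes;
using NHibernate.UserTypes;

public class TimeUserType : IUserType
{
    public SqlType[] SqlTypes
    {
        get { return new[] { NHibernateUtil.Int64.SqlType }; }
    }

    public Type ReturnedType
    {
        get { return typeof(Time); }
    }

    public bool IsMutable
    {
        get { return false; }
    }

    public new bool Equals(object x, object y)
    {
        return object.Equals(x, y);
    }

    public int GetHashCode(object x)
    {
        return x == null ? 0 : x.GetHashCode();
    }

    // Database -> application: read the bigint and rebuild a Time instance.
    public object NullSafeGet(IDataReader rs, string[] names, object owner)
    {
        object raw = NHibernateUtil.Int64.NullSafeGet(rs, names[0]);
        if (raw == null)
            return null;

        long total = (long)raw;
        return new Time
        {
            Hours = (int)(total / 3600),
            Minutes = (int)(total % 3600 / 60),
            Seconds = (int)(total % 60)
        };
    }

    // Application -> database: collapse the Time instance into total seconds.
    public void NullSafeSet(IDbCommand cmd, object value, int index)
    {
        object dbValue = null;
        if (value != null)
        {
            var time = (Time)value;
            dbValue = time.Hours * 3600L + time.Minutes * 60 + time.Seconds;
        }
        NHibernateUtil.Int64.NullSafeSet(cmd, dbValue, index);
    }

    // Time is treated as immutable here, so these simply pass values through.
    public object DeepCopy(object value) { return value; }
    public object Replace(object original, object target, object owner) { return original; }
    public object Assemble(object cached, object owner) { return cached; }
    public object Disassemble(object value) { return value; }
}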

How do you automap List<float> or float[] with Fluent NHibernate?

Having successfully gotten a sample program working, I'm now starting
to do Real Work with Fluent NHibernate - trying to use Automapping on my project's class
hierarchy.
It's a scientific instrumentation application, and the classes I'm
mapping have several properties that are arrays of floats e.g.
private float[] _rawY;

public virtual float[] RawY
{
    get { return _rawY; }
    set { _rawY = value; }
}
These arrays can contain a maximum of 500 values.
I didn't expect Automapping to work on arrays, but tried it anyway,
with some success at first. Each array was auto mapped to a BLOB
(using SQLite), which seemed like a viable solution.
The first problem came when I tried to call SaveOrUpdate on the
objects containing the arrays - I got "No persister for float[]"
exceptions.
So my next thought was to convert all my arrays into ILists e.g.
public virtual IList<float> RawY { get; set; }
But now I get:
NHibernate.MappingException: Association references unmapped class: System.Single
Since Automapping can deal with lists of complex objects, it never occurred to me that it would not be able to map lists of basic types. But after doing some Googling for a solution, this seems to be the case.
Some people seem to have solved the problem, but the sample code I
saw requires more knowledge of NHibernate than I have right now - I
didn't understand it.
Questions:
1. How can I make this work with Automapping?
2. Also, is it better to use arrays or lists for this application?
I can modify my app to use either if necessary (though I prefer
lists).
Edit:
I've studied the code in Mapping Collection of Strings, and I see there is test code in the source that sets up an IList of strings, e.g.
public virtual IList<string> ListOfSimpleChildren { get; set; }

[Test]
public void CanSetAsElement()
{
    new MappingTester<OneToManyTarget>()
        .ForMapping(m => m.HasMany(x => x.ListOfSimpleChildren).Element("columnName"))
        .Element("class/bag/element").Exists();
}
so this must be possible using pure Automapping, but I've had zero luck getting anything to work, probably because I don't have the requisite knowledge of manual mapping with NHibernate.
Starting to think I'm going to have to hack this (by encoding the array of floats as a single string, or creating a class that contains a single float which I then aggregate into my lists), unless someone can tell me how to do it properly.
End Edit
Here's my CreateSessionFactory method, if that helps formulate a
reply...
private static ISessionFactory CreateSessionFactory()
{
ISessionFactory sessionFactory = null;
const string autoMapExportDir = "AutoMapExport";
if( !Directory.Exists(autoMapExportDir) )
Directory.CreateDirectory(autoMapExportDir);
try
{
var autoPersistenceModel =
AutoMap.AssemblyOf<DlsAppOverlordExportRunData>()
.Where(t => t.Namespace == "DlsAppAutomapped")
.Conventions.Add( DefaultCascade.All() )
;
sessionFactory = Fluently.Configure()
.Database(SQLiteConfiguration.Standard
.UsingFile(DbFile)
.ShowSql()
)
.Mappings(m => m.AutoMappings.Add(autoPersistenceModel)
.ExportTo(autoMapExportDir)
)
.ExposeConfiguration(BuildSchema)
.BuildSessionFactory()
;
}
catch (Exception e)
{
Debug.WriteLine(e);
}
return sessionFactory;
}
I would probably do a one-to-many relationship and make the list another table...
But maybe you need to rethink your object: is there also a RawX that you could compose into a RawPoint? This would give you a table with three columns (ParentID, X, Y).
The discontinuity comes from wanting to map a List to a value that won't fit neatly into a single column in an RDBMS. A table is really how an RDBMS stores lists of data.
This is the whole point of using an ORM like NHibernate. When doing all the querying and SQL composition by hand in your application, adding a table had a high cost in maintenance and implementation. With NHibernate the cost is nearly 0, so take advantage of the strengths of the RDBMS and let NHibernate abstract the ugliness away.
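To illustrate the suggested recomposition (class and property names are mine, not from the answer):
// One row per point instead of a serialized float array, which gives you a
// (ParentId, X, Y) shaped table that the RDBMS and NHibernate both handle well.
public class RawPoint
{
    public virtual int Id { get; set; }
    public virtual float X { get; set; }
    public virtual float Y { get; set; }
}

// On the measurement entity, the two float arrays become a single collection:
// public virtual IList<RawPoint> RawPoints { get; set; }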
I see your problem with mapping the array. Try it with an override mapping first and see if it works; then you could maybe create a convention override if you want the automapping to handle it.
.Override<MyType>(map =>
{
    map.HasMany(x => x.RawY).AsList();
})
Not sure if that will work - I need to get an NHibernate testing setup configured for this stuff.
Since I posted my question, the Fluent NHibernate team have fixed this problem.
You can now automap ILists of C# value types (strings, ints, floats, etc).
Just make sure you have a recent version of FNH.
Edit
I recently upgraded from FNH 1.0 to FNH 1.3.
This version will also automap arrays - float[], int[], etc.
Seems to map them as BLOBs. I assume this will be more efficient than ILists, but have not done any profiling to confirm.
I eventually got an override to work - see the end of the code listing. The key points are:
a new mapping class called DlsAppOverlordExportRunDataMap
the addition of a UseOverridesFromAssemblyOf clause in
CreateSessionFactory
Also, it turns out that (at least with v. 1.0.0.594) there is a very big gotcha with Automapping - the mapping class (e.g. DlsAppOverlordExportRunDataMap) cannot be in the same Namespace as the domain class (e.g. DlsAppOverlordExportRunData)!
Otherwise, NHibernate will throw "NHibernate.MappingException: (XmlDocument)(2,4): XML validation error: ..." , with absolutely no indication of what or where the real problem is.
This is probably a bug, and may be fixed in later versions of Fluent NHibernate.
namespace DlsAppAutomapped
{
public class DlsAppOverlordExportRunData
{
public virtual int Id { get; set; }
// Note: List<float> needs overrides in order to be mapped by NHibernate.
// See class DlsAppOverlordExportRunDataMap.
public virtual IList<float> RawY { get; set; }
}
}
namespace FrontEnd
{
// NEW - SET UP THE OVERRIDES
// Must be in different namespace from DlsAppOverlordExportRunData!!!
public class DlsAppOverlordExportRunDataMap : IAutoMappingOverride<DlsAppOverlordExportRunData>
{
public void Override(AutoMapping<DlsAppOverlordExportRunData> mapping)
{
// Creates table called "RawY", with primary key
// "DlsAppOverlordExportRunData_Id", and numeric column "Value"
mapping.HasMany(x => x.RawY)
.Element("Value");
}
}
}
private static ISessionFactory CreateSessionFactory()
{
ISessionFactory sessionFactory = null;
const string autoMapExportDir = "AutoMapExport";
if( !Directory.Exists(autoMapExportDir) )
Directory.CreateDirectory(autoMapExportDir);
try
{
var autoPersistenceModel =
AutoMap.AssemblyOf<DlsAppOverlordExportRunData>()
.Where(t => t.Namespace == "DlsAppAutomapped")
// NEW - USE THE OVERRIDES
.UseOverridesFromAssemblyOf<DlsAppOverlordExportRunData>()
.Conventions.Add( DefaultCascade.All() )
;
sessionFactory = Fluently.Configure()
.Database(SQLiteConfiguration.Standard
.UsingFile(DbFile)
.ShowSql()
)
.Mappings(m => m.AutoMappings.Add(autoPersistenceModel)
.ExportTo(autoMapExportDir)
)
.ExposeConfiguration(BuildSchema)
.BuildSessionFactory()
;
}
catch (Exception e)
{
Debug.WriteLine(e);
}
return sessionFactory;
}
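As a usage sketch (my addition, not part of the original answer), the override can be sanity-checked with a simple save-and-reload round trip:
// Persist an entity with a float list, clear the session, and reload it.
// With the HasMany override above, the values land in the generated "RawY"
// element table keyed by DlsAppOverlordExportRunData_Id.
using (var session = CreateSessionFactory().OpenSession())
{
    int id;
    using (var tx = session.BeginTransaction())
    {
        var run = new DlsAppOverlordExportRunData
        {
            RawY = new List<float> { 0.5f, 1.25f, 2.0f }
        };
        session.Save(run);
        tx.Commit();
        id = run.Id;
    }

    session.Clear(); // force the next Get to hit the database
    var reloaded = session.Get<DlsAppOverlordExportRunData>(id);
    Debug.Assert(reloaded.RawY.Count == 3);
}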
Didn't get any answers here or on the Fluent NHibernate mailing list that actually worked, so here's what I did.
It smells like a horrible hack, but it works. (Whether it will scale up to large data sets remains to be seen).
First, I wrapped a float property (called Value) in a class:
// Hack - need to embed simple types in a class before NHibernate
// will map them
public class MappableFloat
{
    public virtual int Id { get; private set; }
    public virtual float Value { get; set; }
}
I then declare the properties in other classes that need to be Lists of floats e.g.
public virtual IList<MappableFloat> RawYMappable { get; set; }
NHibernate creates a single database table, with multiple foreign keys, e.g.
create table "MappableFloat" (
Id integer,
Value NUMERIC,
DlsAppOverlordExportRunData_Id INTEGER,
DlsAppOverlordExportData_Id INTEGER,
primary key (Id)
)
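To keep the rest of the code working in terms of plain floats, a couple of small conversion helpers can sit next to the wrapper (my own addition, not part of the original workaround):
using System.Collections.Generic;
using System.Linq;

// Conversion helpers between raw float values and the MappableFloat wrappers.
public static class MappableFloatExtensions
{
    public static IList<MappableFloat> ToMappable(this IEnumerable<float> values)
    {
        return values.Select(v => new MappableFloat { Value = v }).ToList();
    }

    public static float[] ToFloatArray(this IEnumerable<MappableFloat> values)
    {
        return values.Select(m => m.Value).ToArray();
    }
}

// Usage:
// runData.RawYMappable = rawYValues.ToMappable();
// float[] rawY = runData.RawYMappable.ToFloatArray();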

NHibernate - Do I have to have a class to interface with a table?

I have a class called Entry. This class has a collection of strings called TopicsOfInterest. In my database, TopicsOfInterest is represented by a separate table, since there is a one-to-many relationship between entries and their topics of interest. I'd like to use NHibernate to populate this collection, but since the table stores very little (only an entry id and a string), I was hoping I could somehow bypass the creation of a class to represent it and all that goes with it (mappings, configuration, etc.).
Is this possible, and if so, how? I'm using Fluent NHibernate, so something specific to that would be even more helpful.
public class Entry
{
private readonly IList<string> topicsOfInterest;
public Entry()
{
topicsOfInterest = new List<string>();
}
public virtual int Id { get; set; }
public virtual IEnumerable<string> TopicsOfInterest
{
get { return topicsOfInterest; }
}
}
public class EntryMapping : ClassMap<Entry>
{
public EntryMapping()
{
Id(entry => entry.Id);
HasMany(entry => entry.TopicsOfInterest)
.Table("TableName")
.AsList()
.Element("ColumnName")
.Cascade.All()
.Access.CamelCaseField();
}
}
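One small addition you may want on Entry (my suggestion, not part of the answer above): because the collection is exposed only as IEnumerable<string> over a readonly field, the entity itself has to offer a way to add topics, for example:
// Hypothetical method on the Entry class; the backing field remains the
// mutable list that NHibernate populates via CamelCaseField access.
public virtual void AddTopicOfInterest(string topic)
{
    topicsOfInterest.Add(topic);
}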
I had a similar requirement to map a collection of floats.
I'm using Automapping to generate my entire relational model - you imply that you already have some tables, so this may not apply, unless you choose to switch to an Automapping approach.
Turns out that NHibernate will NOT Automap collections of basic types - you need an override.
See my answer to my own question, How do you automap List<float> or float[] with Fluent NHibernate?
I've provided a lot of sample code - you should be able to substitute "string" for "float", and get it working. Note the gotchas in the explanatory text.