automapping nested components naming conventions - fluent-nhibernate

I'm making the switch from Fluent Mapping to Automapping in my current project.
If I have the following domain:
public class Matter
{
    public Client Client { get; set; }
}

public class Client
{
    public Name Name { get; set; }
}

public class Name
{
    public string FirstName { get; set; }
    public string LastName { get; set; }
}
When automapping this model, the generated column names for the Name component are:
Name_FirstName
Name_LastName
(I already have an underscore convention.)
Is there a convention I could implement that would get the automapper to generate column names like:
Client_Name_FirstName
Client_Name_LastName
I hope I've described that effectively.
Cheers,
Byron

Sorry to pick up an old message, but for the sake of others coming from a search engine: perhaps http://wiki.fluentnhibernate.org/Conventions is what you're after?
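To illustrate the convention mechanism that page describes, here is a minimal sketch assuming the standard IPropertyConvention/IPropertyInstance API. It only shows how a column-naming convention is written and registered; it is not a verified fix for the nested Client_Name_FirstName prefix, which may still need a component convention or an explicit mapping override.

using FluentNHibernate.Automapping;
using FluentNHibernate.Conventions;
using FluentNHibernate.Conventions.Instances;

public class PrefixedColumnConvention : IPropertyConvention
{
    public void Apply(IPropertyInstance instance)
    {
        // Prefix every column with the declaring type's name, e.g. Name_FirstName.
        instance.Column(instance.EntityType.Name + "_" + instance.Property.Name);
    }
}

// Registering the convention with the automapper (Matter is the question's own type):
// var model = AutoMap.AssemblyOf<Matter>()
//     .Conventions.Add<PrefixedColumnConvention>();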

Related

DDD ValueObject and Enumeration: is there any good way to implement serialization?

In DDD, Value Object and Enumeration are quite elegant, so I want to use both in everyday program logic, not only in domain logic. When using customized value objects and enumerations, a serialization problem appears: should I implement a System.Text.Json.JsonConverter<T> for every value object and enumeration, or is there a better way to handle serialization and deserialization?
Update:
To make it clear, an Enumeration demo is below (the ValueObject-derived classes are similar):
[JsonConverter(typeof(CustomizedConverter))]
public class CustomizedEnumeration1 : Enumeration
{
    public string Customized { get; protected set; }
    // ... some other customized properties or classes

    public CustomizedEnumeration1(int id, string name, string customized) : base(id, name)
    {
        Customized = customized;
    }
}
public class Customized2 : Enumeration
{ ... }
public class OtherCustomized: Enumeration
{ ... }
In DDD, properties are sometimes sealed behind protected/private setters, so deserialization has no way to set their values. Many derived classes can't be deserialized as expected, so we have to write a System.Text.Json.JsonConverter<T> for each of them one by one. Writing a converter for every derived Enumeration/ValueObject is not good; can anyone point out an easy abstraction for this?
You can achieve your desired result by switching to Newtonsoft.Json serialization.
Call this in Startup.cs in the ConfigureServices method:
services.AddControllers().AddNewtonsoftJson();
After this, deserialization will call your constructor for classes with private setters.
There is no need for custom converters.
For reference, I am using ASP.NET Core 3.1.
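To make that concrete, here is a minimal, self-contained sketch. The Enumeration base and the OrderStatus type are illustrative stand-ins, not from the original post; the point is that Newtonsoft.Json matches JSON properties to constructor parameters by name, so protected setters can stay protected and no custom converter is needed.

using Newtonsoft.Json;

// Illustrative minimal Enumeration base; the real one in the question is assumed
// to expose at least Id and Name.
public abstract class Enumeration
{
    public int Id { get; protected set; }
    public string Name { get; protected set; }

    protected Enumeration(int id, string name)
    {
        Id = id;
        Name = name;
    }
}

// Hypothetical derived type with protected setters and no custom converter.
public class OrderStatus : Enumeration
{
    public string Customized { get; protected set; }

    public OrderStatus(int id, string name, string customized) : base(id, name)
    {
        Customized = customized;
    }
}

public static class Demo
{
    public static void Main()
    {
        var status = new OrderStatus(1, "Submitted", "extra");
        var json = JsonConvert.SerializeObject(status);

        // Newtonsoft.Json binds the JSON properties (id, name, customized) to the
        // constructor parameters by name, so no setter access is required.
        var roundTripped = JsonConvert.DeserializeObject<OrderStatus>(json);
    }
}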

Object mapper vs Object wrapper

I would appreciate a little help here...
Let's say that in an application we have a Data Layer and a Business Logic Layer. In the DAL we have the following entities:
public class Customer {
    public string Name { get; set; }
    public ICollection<Address> Addresses { get; set; }
}

public class Address {
    public string Street { get; set; }
}
In the BLL we have the following POCOs:
public class CustomerDto {
    public string Name { get; set; }
    public ICollection<AddressDto> Addresses { get; set; }
}

public class AddressDto {
    public string Street { get; set; }
}
The entities in the DAL are populated with a lightweight ORM and retrieved from the BLL through a repository. For example:
public class CustomerInformationService {
    private readonly ICustomerRepository _repository;

    public CustomerInformationService(ICustomerRepository repository)
    {
        _repository = repository;
    }

    public CustomerDto Get(int id)
    {
        var customerEntity = _repository.Get(id);
        var customerDto = /* SOME TRANSFORMATION HERE */
        return customerDto;
    }
}
My question is about the /* SOME TRANSFORMATION HERE */ part. There is a discussion in our team about how to do the "mapping".
One approach is to use a mapper, either an automapper or manual mapping.
The second approach is to use a sort of wrapper around the entity inside the DTO, to save a copying operation between objects. Something like this:
public class CustomerDto
{
    private IEntity _customerEntity;
    public IEntity CustomerEntity { get { return _customerEntity; } }

    public CustomerDto(IEntity customerEntity)
    {
        _customerEntity = customerEntity;
    }

    public string Name
    {
        get { return _customerEntity.Name; }
    }

    public ICollection<Address> Addresses
    {
        get { return _customerEntity.Addresses; }
    }
}
The second approach feels a little weird to me because _customerEntity.Addresses looks like a leak of the DAL into my BLL (the _customerEntity reference), but I am not sure.
Are there any advantages/disadvantages of using one approach over the other?
Additional info: we usually pull a maximum of 1000 records at a time that would need to be transformed between entity and DTO.
You did not mention which "lightweight ORM" you are using, so I will answer in two sections.
If you are using an ORM that creates proxies
You should avoid exposing entities outside a certain boundary. ORMs like NHibernate/EF implement lazy loading based on proxies. If you expose entities to the application/UI layer, you will have little control over the ORM's behavior. This may lead to many unexpected issues, and debugging will also be very difficult.
Wrapping entities in DTOs gains nothing; you are accessing the entities anyway.
Using DTOs and mapping them with a mapper tool like AutoMapper is a good solution here.
If you are using an ORM that does not create proxies
Do NOT use DTOs; use your entities directly. DTOs are useful and recommended in many cases, but the example you gave in the question does not need DTOs at all.
If you do choose to use DTOs, wrapping entities in DTOs does not make sense. If you want to use the entity anyway, why wrap it? Again, a tool like AutoMapper could help.
Refer to this question. It's a bit different; I am asking yes/no and you are asking how. But it will still help you.
I would bet on the service layer approach, basically because something that looks like a business object or domain object has nothing to do with DTOs.
And, indeed, you and your team should use AutoMapper instead of repeating the same code tons of times just to set some properties from A to B, A to C, C to B...
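For completeness, a minimal sketch of the AutoMapper approach both answers recommend, using the question's own Customer/CustomerDto types. It assumes an AutoMapper version where MapperConfiguration takes a configuration lambda; in a real application the configuration would be built once at startup.

using AutoMapper;

public static class MappingDemo
{
    public static CustomerDto MapCustomer(Customer customerEntity)
    {
        // Configure once at startup in a real application; shown inline for brevity.
        var config = new MapperConfiguration(cfg =>
        {
            cfg.CreateMap<Customer, CustomerDto>();
            cfg.CreateMap<Address, AddressDto>();
        });

        IMapper mapper = config.CreateMapper();

        // This is the /* SOME TRANSFORMATION HERE */ step from the question.
        return mapper.Map<CustomerDto>(customerEntity);
    }
}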

Fluent NHibernate HasMany relation with different subtypes of same superclass

I'm using Fluent NHibernate with automapping and am having problems setting up a bi-directional HasMany relationship because of my current inheritance.
A simplified version of my code looks like this:
public abstract class BaseClass
{
    public BaseClass Parent { get; set; }
}

public class ClassA : BaseClass
{
    public IList<ClassB> BChilds { get; protected set; }
    public IList<ClassC> CChilds { get; protected set; }
}

public class ClassB : BaseClass
{
    public IList<ClassD> DChilds { get; protected set; }
}

public class ClassC : BaseClass
{
}

public class ClassD : BaseClass
{
}
Every class can have one parent, and some parents can have children of two types. I'm using table-per-type inheritance, which results in the tables:
"BaseClass"
"ClassA"
"ClassB"
"ClassC"
"ClassD"
To get a working bi-directional mapping I have made the following overrides (one example, from ClassA):
mapping.HasMany<BaseType>(x => x.BChilds).KeyColumn("Parent_Id");
mapping.HasMany<BaseType>(x => x.CChilds).KeyColumn("Parent_Id");
This works fine on classes with only one type of child, but ClassA, with two child types, will get all subtypes of BaseType in each list, which of course ends up in an exception. I have looked at two different workarounds, though neither of them feels really sufficient, and I believe there is a better way to solve this.
Workaround 1: Point to the concrete subtype in the HasMany mapping. (Updated with more info)
mapping.HasMany<ClassB>(x => x.BChilds).KeyColumn("Parent_Id");
(BaseType replaced with ClassB)
With this mapping, NHibernate will in some cases look in the ClassB table for a column named Parent_Id; obviously there is no such column, as it belongs to the BaseClass table. The problem only occurs if you add a statement based on BChilds during a ClassA select. E.g. loading a ClassA entity and then calling ClassA.BChilds seems to work, but in a query (using NHibernate LINQ) something like
Query<ClassA>().Where(c => c.BChilds.Count == 0)
the wrong table will be used. Therefore I have to manually create a new column in this table with the same name and copy all the values. It works, but it's risky and not flexible at all.
Workaround 2: Add a column to the BaseClass that tells the concrete type and add a where statement to the HasMany mapping.
(After my update to workaround 1, I'm no longer sure whether this is a workable solution.)
This means adding a column the same way it's done when using table-per-hierarchy inheritance with a discriminator value, i.e. the base-class table gets a new column with a value of ClassA, ClassB... Though given how well NHibernate handles the inheritance overall, and from reading the NHibernate manual, I believe a discriminator shouldn't be needed in a table-per-type scenario; it seems NHibernate is already doing the hard part and should be able to handle this cleanly without adding a new column. I just can't figure out how.
What's your base class mapping and what does your subclass map look like?
You should be able to do
mapping.HasMany(x => x.BChilds);
And with the correct mapping, you shouldn't have a problem.
If it's Fluent NHibernate, look into
UseUnionSubclassForInheritanceMapping();
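For reference, the overrides quoted in the question would normally live in an override class roughly like the sketch below, assuming the standard FluentNHibernate IAutoMappingOverride<T> API; the KeyColumn name and the explicit child types come from the question itself, and whether they remain necessary once the base mappings are correct is what the answer above is getting at.

using FluentNHibernate.Automapping;
using FluentNHibernate.Automapping.Alterations;

public class ClassAOverride : IAutoMappingOverride<ClassA>
{
    public void Override(AutoMapping<ClassA> mapping)
    {
        // Workaround 1 from the question: point each collection at its concrete subtype.
        mapping.HasMany<ClassB>(x => x.BChilds).KeyColumn("Parent_Id");
        mapping.HasMany<ClassC>(x => x.CChilds).KeyColumn("Parent_Id");
    }
}

// The overrides are picked up when building the model, e.g.:
// AutoMap.AssemblyOf<BaseClass>()
//     .UseOverridesFromAssemblyOf<ClassAOverride>();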

AutoMapping Custom Collections with FluentNHibernate

I am retrofitting a very large application to use NHibernate as its data access strategy. Everything is going well with AutoMapping. Luckily, when the domain layer was built, we used a code generator. The main issue I am running into now is that every collection is hidden behind a custom class that derives from List<>. For example:
public class League
{
    public OwnerList owners { get; set; }
}

public class OwnerList : AppList<Owner> { }

public class AppList<T> : List<T> { }
What kind of Convention do I have to write to get this done?
I don't think you're going to be able to achieve this with a convention. You will have to create an auto mapping override and then do the following:
mapping.HasMany(l => l.owners).CollectionType<OwnerList>();
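Spelled out, that override might look like the following sketch of the answer's suggestion, assuming the standard IAutoMappingOverride<T> API:

using FluentNHibernate.Automapping;
using FluentNHibernate.Automapping.Alterations;

public class LeagueOverride : IAutoMappingOverride<League>
{
    public void Override(AutoMapping<League> mapping)
    {
        // Tell NHibernate to treat the collection as the custom OwnerList type.
        mapping.HasMany(l => l.owners).CollectionType<OwnerList>();
    }
}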

(Fluent)NHibernate: Mapping an IDictionary<MappedClass, MyEnum>

I've found a number of posts about this, but none seem to help me directly. Also, there seems to be confusion about solutions working or not working at different stages of FluentNHibernate's development.
I have the following classes:
public class MappedClass
{
    ...
}

public enum MyEnum
{
    One,
    Two
}

public class Foo
{
    ...
    public virtual IDictionary<MappedClass, MyEnum> Values { get; set; }
}
My questions are:
Will I need a separate (third) table for MyEnum?
How can I map the MyEnum type? Should I?
What should Foo's mapping look like?
I've tried mapping HasMany(x => x.Values).AsMap("MappedClass")...
This results in: NHibernate.MappingException : Association references unmapped class: MyEnum
It looks like this question is a duplicate of Fluent code for mapping an IDictionary<SomeEntity, int>?. The solution there was to use hbm.xml to map a ternary association table. It appears that, at the time, FluentNHibernate's AsTernaryAssociation() method only worked for entity types. I can't tell whether this has changed, or whether it is a planned feature.
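For anyone landing here, the hbm.xml ternary map described above would look roughly like this sketch; the table and column names are illustrative, not from the original posts, and the enum element relies on NHibernate's built-in handling of enum types.

<!-- Sketch of an hbm.xml map for Foo.Values; Foo_id, MappedClass_id and Value are
     hypothetical names, and MyEnum must be given as an assembly-qualified type name. -->
<map name="Values" table="FooValues">
  <key column="Foo_id" />
  <index-many-to-many column="MappedClass_id" class="MappedClass" />
  <element column="Value" type="MyNamespace.MyEnum, MyAssembly" />
</map>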