Spring Data Rest: Target bean is not of type of the persistent entity - spring-data-rest

I have a data model called Project which is managed over its lifecycle from an Angular app via Spring Data REST. The Project has Scenes, Scenes have UpstreamKeys held in a Map<Integer, UpstreamKey>, and UpstreamKey is an abstract class with two implementations, ChromaKey and LumaKey (see code below).
When I edit (PUT) an existing Project with a ChromaKey in the map, and change that to a LumaKey, I get an error message in the backend:
Caused by: java.lang.IllegalArgumentException: Target bean of type io.mewald.notime.designer.model.scene.usk.LumaKey is not of type of the persistent entity (io.mewald.notime.designer.model.scene.usk.chroma.ChromaKey)!: io.mewald.notime.designer.model.scene.usk.LumaKey
at org.springframework.util.Assert.instanceCheckFailed(Assert.java:702) ~[spring-core-5.3.10.jar:5.3.10]
at org.springframework.util.Assert.isInstanceOf(Assert.java:621) ~[spring-core-5.3.10.jar:5.3.10]
at org.springframework.data.mapping.model.BasicPersistentEntity.verifyBeanType(BasicPersistentEntity.java:584) ~[spring-data-commons-2.5.5.jar:2.5.5]
at org.springframework.data.mapping.model.BasicPersistentEntity.getPropertyAccessor(BasicPersistentEntity.java:458) ~[spring-data-commons-2.5.5.jar:2.5.5]
at org.springframework.data.rest.webmvc.json.DomainObjectReader$MergingPropertyHandler.<init>(DomainObjectReader.java:639) ~[spring-data-rest-webmvc-3.5.5.jar:3.5.5]
at org.springframework.data.rest.webmvc.json.DomainObjectReader.lambda$mergeForPut$1(DomainObjectReader.java:141) ~[spring-data-rest-webmvc-3.5.5.jar:3.5.5]
at java.base/java.util.Optional.map(Optional.java:265) ~[na:na]
at org.springframework.data.rest.webmvc.json.DomainObjectReader.mergeForPut(DomainObjectReader.java:139) ~[spring-data-rest-webmvc-3.5.5.jar:3.5.5]
at org.springframework.data.rest.webmvc.json.DomainObjectReader.lambda$mergeMaps$6(DomainObjectReader.java:429) ~[spring-data-rest-webmvc-3.5.5.jar:3.5.5]
at java.base/java.util.Optional.map(Optional.java:265) ~[na:na]
at org.springframework.data.rest.webmvc.json.DomainObjectReader.mergeMaps(DomainObjectReader.java:418) ~[spring-data-rest-webmvc-3.5.5.jar:3.5.5]
at org.springframework.data.rest.webmvc.json.DomainObjectReader.access$000(DomainObjectReader.java:65) ~[spring-data-rest-webmvc-3.5.5.jar:3.5.5]
at org.springframework.data.rest.webmvc.json.DomainObjectReader$MergingPropertyHandler.doWithPersistentProperty(DomainObjectReader.java:673) ~[spring-data-rest-webmvc-3.5.5.jar:3.5.5]
at org.springframework.data.mapping.model.BasicPersistentEntity.doWithProperties(BasicPersistentEntity.java:374) ~[spring-data-commons-2.5.5.jar:2.5.5]
at org.springframework.data.rest.webmvc.json.DomainObjectReader.lambda$mergeForPut$1(DomainObjectReader.java:143) ~[spring-data-rest-webmvc-3.5.5.jar:3.5.5]
at java.base/java.util.Optional.map(Optional.java:265) ~[na:na]
at org.springframework.data.rest.webmvc.json.DomainObjectReader.mergeForPut(DomainObjectReader.java:139) ~[spring-data-rest-webmvc-3.5.5.jar:3.5.5]
at org.springframework.data.rest.webmvc.json.DomainObjectReader.lambda$mergeCollections$7(DomainObjectReader.java:469) ~[spring-data-rest-webmvc-3.5.5.jar:3.5.5]
at java.base/java.util.Optional.map(Optional.java:265) ~[na:na]
at org.springframework.data.rest.webmvc.json.DomainObjectReader.mergeCollections(DomainObjectReader.java:452) ~[spring-data-rest-webmvc-3.5.5.jar:3.5.5]
at org.springframework.data.rest.webmvc.json.DomainObjectReader.access$100(DomainObjectReader.java:65) ~[spring-data-rest-webmvc-3.5.5.jar:3.5.5]
at org.springframework.data.rest.webmvc.json.DomainObjectReader$MergingPropertyHandler.doWithPersistentProperty(DomainObjectReader.java:675) ~[spring-data-rest-webmvc-3.5.5.jar:3.5.5]
at org.springframework.data.mapping.model.BasicPersistentEntity.doWithProperties(BasicPersistentEntity.java:374) ~[spring-data-commons-2.5.5.jar:2.5.5]
at org.springframework.data.rest.webmvc.json.DomainObjectReader.lambda$mergeForPut$1(DomainObjectReader.java:143) ~[spring-data-rest-webmvc-3.5.5.jar:3.5.5]
at java.base/java.util.Optional.map(Optional.java:265) ~[na:na]
at org.springframework.data.rest.webmvc.json.DomainObjectReader.mergeForPut(DomainObjectReader.java:139) ~[spring-data-rest-webmvc-3.5.5.jar:3.5.5]
at org.springframework.data.rest.webmvc.json.DomainObjectReader.readPut(DomainObjectReader.java:116) ~[spring-data-rest-webmvc-3.5.5.jar:3.5.5]
at org.springframework.data.rest.webmvc.config.JsonPatchHandler.applyPut(JsonPatchHandler.java:100) ~[spring-data-rest-webmvc-3.5.5.jar:3.5.5]
at org.springframework.data.rest.webmvc.config.PersistentEntityResourceHandlerMethodArgumentResolver.readPutForUpdate(PersistentEntityResourceHandlerMethodArgumentResolver.java:234) ~[spring-data-rest-webmvc-3.5.5.jar:3.5.5]
... 90 common frames omitted
My interpretation is that it is somehow trying to merge the LumaKey into the ChromaKey and - of course - that doesn't work. Why is it not simply replacing the bean? What would be a fix to get this working?
Here's the structure of the data model relevant to this question:
@NoArgsConstructor @Data @SuperBuilder @AllArgsConstructor
@Document
public class Project {
    @Id
    private String id;
    ...
    private List<Scene> scenes;
}

@NoArgsConstructor @Data @SuperBuilder @AllArgsConstructor @EqualsAndHashCode(callSuper = false)
public class Scene {
    ...
    private Map<Integer, UpstreamKey> upstreamKeys;
}

@Data @NoArgsConstructor @AllArgsConstructor @SuperBuilder(toBuilder = true)
@JsonTypeInfo(use = JsonTypeInfo.Id.NAME, include = JsonTypeInfo.As.PROPERTY, property = "type")
@JsonSubTypes({
    @JsonSubTypes.Type(value = ChromaKey.class, name = "ChromaKey"),
    @JsonSubTypes.Type(value = LumaKey.class, name = "LumaKey")
})
public abstract class UpstreamKey {
    ...
    public abstract String getType();
}

@Data @EqualsAndHashCode(callSuper = true) @NoArgsConstructor @AllArgsConstructor @SuperBuilder(toBuilder = true)
public class ChromaKey extends UpstreamKey {
    ...
    @Override
    public String getType() {
        return "ChromaKey";
    }
}

@Data @EqualsAndHashCode(callSuper = true) @NoArgsConstructor @AllArgsConstructor @SuperBuilder(toBuilder = true)
public class LumaKey extends UpstreamKey {
    ...
    @Override
    public String getType() {
        return "LumaKey";
    }
}
EDIT: Taken from the documentation at https://docs.spring.io/spring-data/rest/docs/current/reference/html/#repository-resources.item-resource
The PUT method replaces the state of the target resource with the supplied request body.
The documentation explicitly talks about replace so I am wondering why a method called mergeForPut is involved at all.
EDIT: Just realised my spring-boot-starter-parent was a bit old. I updated to 2.6.4 but the error remains.
EDIT: I just refactored everything so that scene.upstreamKeys can be a List instead of a Map just in case this might cause it. But it does not. Still the same error.
EDIT: As requested, I created a minimum project that replicates the error I am describing above: https://github.com/mathias-ewald/demo-sdr-target-bean-is-not-of-type

Though I can achieve the update in Spring Data MongoDB, it is not currently possible in Spring Data REST. In the current Spring Data REST implementation, a PUT request tries to merge nested collection members by calling DomainObjectReader.mergeForPut(...), which fails on a type mismatch rather than replacing the member. See this github issue and pull request.
So the workaround is to implement a controller yourself.
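A minimal sketch of such a controller, assuming the Project model from the question and a hypothetical ProjectRepository (the class, handler, and path names are illustrative, not from the original project): because the request body is deserialized from scratch, Jackson's @JsonTypeInfo handling picks the concrete UpstreamKey subtype and nothing is merged into the old ChromaKey bean.

```
// Hypothetical sketch: a custom PUT endpoint that replaces the entity
// outright instead of letting Spring Data REST merge into the old instance.
@RepositoryRestController
public class ProjectPutController {

    private final ProjectRepository repository; // assumed Spring Data repository

    public ProjectPutController(ProjectRepository repository) {
        this.repository = repository;
    }

    @PutMapping("/projects/{id}")
    public ResponseEntity<Project> replaceProject(@PathVariable String id,
                                                  @RequestBody Project body) {
        // Keep the path id authoritative, then save the fresh object wholesale.
        body.setId(id);
        return ResponseEntity.ok(repository.save(body));
    }
}
```

This bypasses DomainObjectReader.mergeForPut(...) entirely, at the cost of losing Spring Data REST's automatic item-resource handling for that route.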
Edit 2022-04-05
Annotating UpstreamKey with @Immutable makes Spring Data REST perform a replacement rather than a merge. This solution only works if the nested collection member is a plain value (no id, no audit data, no @JsonIgnore hiding server-side data) rather than an entity (with an id).
Find the updated project and the diff
Below is the resulting Mongo data after the PUT operation.
{
    _id: ObjectId('624b9fe3f32185579ff56849'),
    name: 'Project 1',
    scenes: [
        {
            name: 'Scene 1',
            usks: [
                {
                    c: 3,
                    a: 1,
                    _class: 'com.example.demo.LumaKey'
                }
            ]
        }
    ],
    _class: 'com.example.demo.Project'
}

In short, you are trying to cast between LumaKey and ChromaKey, which is not possible in Java.
You can check this answer. I debugged your code and this is what is happening.
https://stackoverflow.com/a/17004028/8909313
Edit 1
You are trying to delete a ChromaKey and add a LumaKey.
Edit 2
Please debug this:
PersistentEntityResourceHandlerMethodArgumentResolver.java (line 158, the readPutForUpdate method)
In this method, the request JSON and the Mongo JSON are deserialized, and the ObjectMapper merges them.
The MongoDB object's usks field has type ChromaKey, but your request's type is LumaKey.
Spring then calls the mapper method to do the mapping:
mapper.readerFor(target.getClass()).readValue(source);
For this line, the target's type is ChromaKey but the source's type is LumaKey.
So the mapper cannot cast your ChromaKey to a LumaKey.
I tried writing a custom JsonDeserializer, but it didn't solve your problem. Maybe you can try that route.
I think it's a logical problem.
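The cast failure described above can be reproduced with plain Java (class names borrowed from the question, bodies elided):

```java
// Sibling subclasses of a common base cannot be cast to one another:
// a ChromaKey "is an" UpstreamKey, but it is never a LumaKey.
abstract class UpstreamKey { }
class ChromaKey extends UpstreamKey { }
class LumaKey extends UpstreamKey { }

public class CastDemo {
    // Safe check before attempting a downcast.
    static boolean isLumaKey(UpstreamKey key) {
        return key instanceof LumaKey;
    }

    public static void main(String[] args) {
        UpstreamKey key = new ChromaKey();
        System.out.println(isLumaKey(key)); // false
        try {
            LumaKey luma = (LumaKey) key; // fails at runtime
        } catch (ClassCastException e) {
            System.out.println("cannot cast ChromaKey to LumaKey");
        }
    }
}
```

This is exactly the situation mergeForPut ends up in: the target bean is a ChromaKey, the incoming value is a LumaKey, and no amount of mapping can turn one into the other.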

Related

@Cacheable annotation cannot work as expected when deserializing beans with LocalDateTime type property

I found that the annotation @Cacheable does not work when the method returns a Java Bean type. Here is the complete description:
I annotated @Cacheable on a method to use Spring Cache:
@Cacheable(cacheNames = "userCache", key = "#userId")
public User getUser(long userId) {
    return userRepository.getUserById(userId);
}
And the User class looks like this:
public class User {
    Long userId;
    String username;
    @JsonSerialize(using = LocalDateTimeSerializer.class)
    @JsonFormat(pattern = "yyyy-MM-dd HH:mm:ss")
    private LocalDateTime birthDateTime;
}
As you can see, I added the relevant Jackson annotations to make Jackson deserialization of LocalDateTime types work, and this is the related dependency in pom.xml:
<dependency>
    <groupId>com.fasterxml.jackson.datatype</groupId>
    <artifactId>jackson-datatype-jsr310</artifactId>
    <version>2.12.5</version>
</dependency>
After that, I call the @Cacheable method getUser like this:
User user = userCache.getUser(1L);
and it throws an exception:
org.redisson.client.RedisException: Unexpected exception while processing command
at org.redisson.command.CommandAsyncService.convertException(CommandAsyncService.java:326)
at org.redisson.command.CommandAsyncService.get(CommandAsyncService.java:123)
at org.redisson.RedissonObject.get(RedissonObject.java:82)
...blabla
Caused by: com.fasterxml.jackson.databind.exc.InvalidDefinitionException: Java 8 date/time type java.time.LocalDateTime not supported by default: add Module "com.fasterxml.jackson.datatype:jackson-datatype-jsr310" to enable handling at [Source: (io.netty.buffer.ByteBufInputStream); line: 1, column: 101] (through reference chain: com.stackoverflow.domain.User["birthDateTime"])
at com.fasterxml.jackson.databind.exc.InvalidDefinitionException.from(InvalidDefinitionException.java:67)
at com.fasterxml.jackson.databind.DeserializationContext.reportBadDefinition(DeserializationContext.java:1764)
at com.fasterxml.jackson.databind.deser.impl.UnsupportedTypeDeserializer.deserialize(UnsupportedTypeDeserializer.java:36)
at com.fasterxml.jackson.databind.deser.impl.MethodProperty.deserializeAndSet(MethodProperty.java:129)
Before I used @Cacheable, there was no problem getting the User straight from the database. But once I began using @Cacheable, it always throws the exception above, no matter how I configure Jackson deserialization for LocalDateTime. Can @Cacheable not work well with a Java Bean that has a LocalDateTime property, or is my Jackson configuration wrong?
I had the same problem. Spring Cache doesn't use the implicit ObjectMapper used by other Spring components.
Include the module (you already did that).
Then create a configuration which will override the default Spring Cache configuration:
@Configuration
@EnableCaching
public class CacheConfiguration {

    @Bean
    public RedisSerializationContext.SerializationPair<Object> serializationPair() {
        ObjectMapper objectMapper = new ObjectMapper();
        objectMapper.registerModule(new JavaTimeModule())
                .activateDefaultTyping(
                        objectMapper.getPolymorphicTypeValidator(),
                        ObjectMapper.DefaultTyping.EVERYTHING,
                        JsonTypeInfo.As.PROPERTY
                );
        return RedisSerializationContext.SerializationPair.fromSerializer(new GenericJackson2JsonRedisSerializer(objectMapper));
    }

    @Bean
    public RedisCacheConfiguration redisCacheConfiguration(
            @Value("${cache.default-ttl-in-seconds}") Integer ttl,
            RedisSerializationContext.SerializationPair<Object> serializationPair
    ) {
        return RedisCacheConfiguration.defaultCacheConfig()
                .disableCachingNullValues()
                .entryTtl(Duration.ofSeconds(ttl))
                .serializeValuesWith(serializationPair);
    }
}

Why is Jackson not serializing this?

@Data
public class IdentificacaoBiometricaDto {
    private Integer cdIdentifBiom;
    private String nrMatricula;
    private String deImpressaoDigital;
    private Integer cdFilialAtualizacao;
}
I am using Retrofit 2.6.1, Jackson 2.9.9 and Lombok 1.8.10.
The exception is:
Caused by: com.fasterxml.jackson.databind.exc.InvalidDefinitionException: No serializer found for class br.com.clamed.modelo.loja.dto.central.IdentificacaoBiometricaDto and no properties discovered to create BeanSerializer (to avoid exception, disable SerializationFeature.FAIL_ON_EMPTY_BEANS)
at com.fasterxml.jackson.databind.exc.InvalidDefinitionException.from(InvalidDefinitionException.java:77)
at com.fasterxml.jackson.databind.SerializerProvider.reportBadDefinition(SerializerProvider.java:1191)
at com.fasterxml.jackson.databind.DatabindContext.reportBadDefinition(DatabindContext.java:313)
at com.fasterxml.jackson.databind.ser.impl.UnknownSerializer.failForEmpty(UnknownSerializer.java:71)
at com.fasterxml.jackson.databind.ser.impl.UnknownSerializer.serialize(UnknownSerializer.java:33)
at com.fasterxml.jackson.databind.ser.DefaultSerializerProvider._serialize(DefaultSerializerProvider.java:480)
at com.fasterxml.jackson.databind.ser.DefaultSerializerProvider.serializeValue(DefaultSerializerProvider.java:400)
at com.fasterxml.jackson.databind.ObjectWriter$Prefetch.serialize(ObjectWriter.java:1392)
at com.fasterxml.jackson.databind.ObjectWriter._configAndWriteValue(ObjectWriter.java:1120)
at com.fasterxml.jackson.databind.ObjectWriter.writeValueAsBytes(ObjectWriter.java:1017)
at retrofit2.converter.jackson.JacksonRequestBodyConverter.convert(JacksonRequestBodyConverter.java:34)
at retrofit2.converter.jackson.JacksonRequestBodyConverter.convert(JacksonRequestBodyConverter.java:24)
at retrofit2.ParameterHandler$Body.apply(ParameterHandler.java:355)
... 14 more
The object mapper:
return new ObjectMapper().registerModule(new ParameterNamesModule())
        .registerModule(new Jdk8Module())
        .registerModule(new JavaTimeModule())
        .disable(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS);
I am setting all fields; when passing the object as a request body, Retrofit fails because Jackson could not serialize it.
Retrofit call:
@POST("/usuario/v1.0/cadastraBiometria")
Call<IdentificacaoBiometricaDto> cadastraBiometria(@Body IdentificacaoBiometricaDto identificacaoBiometricaDto);
Rest service:
@RestController
@RequestMapping("/usuario")
public class UsuarioController {
    @PostMapping(value = "/v1.0/cadastraBiometria")
    public ResponseEntity<IdentificacaoBiometricaDto> cadastraBiometria(@RequestBody IdentificacaoBiometricaDto identificacaoBiometricaDto) {
    }
}
Update:
If I change the Retrofit converter to Gson, it works;
If I serialize it using Jackson directly, it works;
Removing Lombok makes no difference.
Found the problem. The biometric reader library was causing this. For some reason it's incompatible with OpenJDK 11 and causes all sorts of unrelated problems.
Yes, very weird. But the lib is very poorly done.

Jackson, how to expose fields when serializing a class which extends a collection?

I have a class that we use for paginated results, as follows:
public class PaginatedList<T> extends LinkedList<T> {
    private int offset;
    private int count;
    private int totalResultCount;
    //...
}
and I'd like Jackson to serialize it like this:
{
    "results": [1, 2, 3],
    "offset": 0,
    "count": 3,
    "totalResultCount": 15
}
(where the parent list contains the three integer values 1,2 and 3.)
In my first attempt I discovered that Jackson effectively ignores any properties on classes which are assignable to a Collection class. In hindsight, this makes sense, and so I'm now in search of a workaround. A search of SO resulted in two similar questions:
jackson-serialization-includes-subclasss-fields
jaxb-how-to-serialize-fields-in-a-subclass-of-a-collection
However, both of these resulted in the suggestion to switch from inheritance to composition.
I am specifically looking for a solution that allows the class to extend a collection. This 'PaginatedList' class is part of the common core of the enterprise, and extends Collection so that it can be used (and introspected) as a collection throughout the code. Changing to composition isn't an option. That being said, I am free to annotate and otherwise change this class to support serialization as I described above.
So, from what I can tell, there's two parts I'm missing (what I'm looking for in an answer):
How to get Jackson to 'see' the added properties?
How to get Jackson to label the collection's content as a 'results' property in the JSON output?
(PS: I'm only concerned with serialization.)
Ashley Frieze pointed this out in a comment, and deserves the credit for this answer.
I solved this by creating a JsonSerializer instance as follows:
public class PaginatedListSerializer extends JsonSerializer<PaginatedList> {

    @Override
    public Class<PaginatedList> handledType() {
        return PaginatedList.class;
    }

    @Override
    public void serialize(PaginatedList value, JsonGenerator jgen, SerializerProvider provider) throws IOException, JsonProcessingException {
        jgen.writeStartObject();
        jgen.writeArrayFieldStart("results");
        for (Object entry : value) {
            jgen.writeObject(entry);
        }
        jgen.writeEndArray();
        jgen.writeNumberField("offset", value.offset);
        jgen.writeNumberField("count", value.count);
        jgen.writeNumberField("totalResultCount", value.totalResultCount);
        jgen.writeEndObject();
    }
}
and, of course, register it as a module:
SimpleModule testModule = new SimpleModule("PaginatedListSerializerModule", new Version(1, 0, 0, null, null, null));
testModule.addSerializer(new PaginatedListSerializer());
mapper.registerModule(testModule);

Register JodaModule in Jax-RS Application

I'm writing a Jax-RS application using Jersey, and Jackson2 under the hood to facilitate JSON i/o. The service itself works fine, but I'd like to improve it by having the Jackson mapper automagically serialize/deserialize date and date-times to JodaTime objects.
I'm following the documentation here and have added the relevant jars, but I'm lost on this instruction:
Registering module
To use Joda datatypes with Jackson, you will first need to register the module (same as with all Jackson datatype modules):
ObjectMapper mapper = new ObjectMapper();
mapper.registerModule(new JodaModule());
I've tried to do this in the custom class that extends javax.ws.rs.core.Application, but I'm not at all confident in that solution. I'm currently getting this error:
Can not instantiate value of type [simple type, class org.joda.time.DateTime] from String value ('2014-10-22'); no single-String constructor/factory method
at [Source: org.glassfish.jersey.message.internal.ReaderInterceptorExecutor$UnCloseableInputStream#3471b6d5; line: 7, column: 25]
Other than the general impression that this module registration needs to happen at application (servlet?) startup, I have no idea what to do with this information. Do I need to annotate a custom class with something in particular to have it picked up? Should I be extending some class?
The examples I find on StackOverflow usually stick it in main() and call the mapper directly, but I'm relying on Jackson Databinding so the examples aren't relevant. Any direction is appreciated.
You'll basically want to create/configure/return the ObjectMapper in a ContextResolver. Something like
@Provider
public class ObjectMapperContextResolver implements ContextResolver<ObjectMapper> {

    final ObjectMapper mapper = new ObjectMapper();

    public ObjectMapperContextResolver() {
        mapper.registerModule(new JodaModule());
    }

    @Override
    public ObjectMapper getContext(Class<?> type) {
        return mapper;
    }
}
If you are using package scanning to discover your resources, then the #Provider annotation should allow this class to be discovered and registered also.
Basically what happens is that the MessageBodyReader and MessageBodyWriter provided by Jackson (used for unmarshalling and marshalling, respectively) will call the getContext method on the ContextResolver to determine the ObjectMapper to use. The reader/writer passes in the class (in a reader it will be the type expected in a method param; in a writer, the type returned as-a/in-a response), meaning we are allowed to use differently configured ObjectMappers for different classes, as seen here. In the above solution, it is used for all classes.

Mule DataMapper IOException

I have a class hierarchy with a base class called EntityModel, and two classes InvestorModel and AgentModel that each inherit directly from it and add a few properties. I am then creating Mule Data Maps to map JSON to each child class individually.
The InvestorModel map works fine, but the AgentModel map fails (in the IDE preview) with an IOException stating that it can't instantiate EntityModel. This seems strange as it can instantiate it in the InvestorModel map. I'm posting the error, but I don't really have any source to post as these are just mapping files. I just don't know where to start looking.
Mule Studio is up to date and v3.5.0
java.io.IOException: org.jetel.exception.JetelException: za.co.sci.core.shared.EntityModel can not be instantiated.
at org.jetel.component.tree.writer.TreeFormatter.write(TreeFormatter.java:72)
at org.jetel.util.MultiFileWriter.writeRecord2CurrentTarget(MultiFileWriter.java:420)
at org.jetel.util.MultiFileWriter.write(MultiFileWriter.java:297)
at org.jetel.component.TreeWriter.execute(TreeWriter.java:464)
at org.jetel.graph.Node.run(Node.java:465)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:744)
Caused by: org.jetel.exception.JetelException: za.co.sci.core.shared.EntityModel can not be instantiated.
at com.opensys.cloveretl.component.tree.writer.bean.BeanWriter.a(Unknown Source)
at com.opensys.cloveretl.component.tree.writer.bean.BeanWriter.a(Unknown Source)
at com.opensys.cloveretl.component.tree.writer.bean.BeanWriter.a(Unknown Source)
at com.opensys.cloveretl.component.tree.writer.bean.BeanWriter.writeStartNode(Unknown Source)
at org.jetel.component.tree.writer.model.runtime.WritableObject.writeContent(WritableObject.java:67)
at org.jetel.component.tree.writer.model.runtime.WritableContainer.write(WritableContainer.java:67)
at org.jetel.component.tree.writer.model.runtime.WritableObject.writeContent(WritableObject.java:77)
at org.jetel.component.tree.writer.model.runtime.WritableContainer.write(WritableContainer.java:67)
at org.jetel.component.tree.writer.model.runtime.WritableObject.writeContent(WritableObject.java:77)
at org.jetel.component.tree.writer.model.runtime.WritableContainer.write(WritableContainer.java:67)
at org.jetel.component.tree.writer.model.runtime.WritableObject.writeContent(WritableObject.java:77)
at org.jetel.component.tree.writer.model.runtime.WritableContainer.write(WritableContainer.java:67)
at org.jetel.component.tree.writer.model.runtime.WritableObject.writeContent(WritableObject.java:77)
at org.jetel.component.tree.writer.TreeFormatter.write(TreeFormatter.java:69)
... 7 more
Class snippets:
public abstract class EntityModel implements Serializable {
    protected Long id;
    private long entityNumber;
    private EntityStatus status;
    private String entityName;
    ...

public class AgentModel extends EntityModel implements Serializable {
    private int agentCode;
    private AgentType agentType;
    private AgentClass agentClass;
    ...

public class InvestorModel extends EntityModel implements Serializable {
    private boolean blockedRand;
    private String utAUTType;
    ...
Turns out the error was due to the base class being abstract. Kind of obvious, really.
The reason one map worked but not the other was the order of the fields. The first field mapped on the InvestorModel was defined in InvestorModel itself, so the mapper knew which class to instantiate. On the AgentModel map, the first field was defined on the abstract class EntityModel, so the mapper tried to instantiate that class and failed; it didn't matter that I had chosen AgentModel as the destination.
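The failure mode is easy to reproduce with plain reflection (class names borrowed from the question, fields elided): reflectively instantiating the concrete subclass works, while instantiating the abstract base throws InstantiationException, which is presumably what the mapper's bean writer ran into.

```java
import java.lang.reflect.InvocationTargetException;

// Minimal stand-ins for the classes in the question.
abstract class EntityModel { }
class AgentModel extends EntityModel { }

public class AbstractDemo {
    // Tries to reflectively instantiate the given class, as a bean mapper would.
    static boolean canInstantiate(Class<?> type) {
        try {
            type.getDeclaredConstructor().newInstance();
            return true;
        } catch (InstantiationException | IllegalAccessException
                | NoSuchMethodException | InvocationTargetException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(canInstantiate(AgentModel.class));  // true
        System.out.println(canInstantiate(EntityModel.class)); // false: abstract
    }
}
```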