Avro serialization issue with KafkaProducer using Scala - confluent-schema-registry

I am getting the error below. I tried changing the Schema Registry port from 8081 to 18081, and I also added dependencies such as avro-tools, kafka-schema-registry, etc.
"first_name": "John", "last_name": "Doe", "age": 34, "height": 178.0, "weight": 75.0, "automated_email": false}
21/07/18 19:26:56 INFO clients.Metadata: Cluster ID: Gv4kdRbATLK6YtRxsf94vw
Exception in thread "main" org.apache.kafka.common.errors.SerializationException: Error serializing Avro message
Caused by: java.net.SocketException: Unexpected end of file from server
at java.base/sun.net.www.http.HttpClient.parseHTTPHeader(HttpClient.java:866)
at java.base/sun.net.www.http.HttpClient.parseHTTP(HttpClient.java:689)
at java.base/sun.net.www.http.HttpClient.parseHTTPHeader(HttpClient.java:863)
at java.base/sun.net.www.http.HttpClient.parseHTTP(HttpClient.java:689)
at java.base/sun.net.www.protocol.http.HttpURLConnection.getInputStream0(HttpURLConnection.java:1610)
at java.base/sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1515)
at java.base/java.net.HttpURLConnection.getResponseCode(HttpURLConnection.java:527)
at io.confluent.kafka.schemaregistry.client.rest.RestService.sendHttpRequest(RestService.java:212)
at io.confluent.kafka.schemaregistry.client.rest.RestService.httpRequest(RestService.java:256)
at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:356)
at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:348)
at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:334)
at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.registerAndGetId(CachedSchemaRegistryClient.java:168)
at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.register(CachedSchemaRegistryClient.java:222)
at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.register(CachedSchemaRegistryClient.java:198)
at io.confluent.kafka.serializers.AbstractKafkaAvroSerializer.serializeImpl(AbstractKafkaAvroSerializer.java:70)
at io.confluent.kafka.serializers.KafkaAvroSerializer.serialize(KafkaAvroSerializer.java:53)
at org.apache.kafka.clients.producer.KafkaProducer.doSend(KafkaProducer.java:807)
at org.apache.kafka.clients.producer.KafkaProducer.send(KafkaProducer.java:784)
at com.example.Producerexample$.main(Producerexample.scala:50)
at com.example.Producerexample.main(Producerexample.scala)
Process finished with exit code 1

This is the code that produces the error above:
package com.example

import java.util.Properties
//import org.apache.avro.util.Utf8
import io.confluent.kafka.serializers.KafkaAvroSerializer
import org.apache.kafka.clients.producer.{KafkaProducer, Producer, ProducerRecord, Callback, RecordMetadata}
import org.apache.kafka.common.serialization.StringSerializer
import com.example.Customer

object Producerexample {
  def main(args: Array[String]): Unit = {
    val properties: Properties = new Properties()
    // normal producer
    properties.setProperty("bootstrap.servers", "http://localhost:9092")
    properties.setProperty("acks", "all")
    properties.setProperty("retries", "10")
    // avro part
    properties.setProperty("key.serializer", classOf[StringSerializer].getName)
    properties.setProperty("value.serializer", classOf[KafkaAvroSerializer].getName)
    properties.setProperty("schema.registry.url", "http://localhost:18081")

    val producer: Producer[String, Customer] =
      new KafkaProducer[String, Customer](properties)

    val topic: String = "customer-avro2"

    // copied from avro examples
    val customer: Customer = Customer.newBuilder()
      .setFirstName("John")
      .setLastName("Doe")
      .setAge(34)
      .setHeight(178f)
      .setWeight(75f)
      .setAutomatedEmail(false)
      .build()

    val producerRecord: ProducerRecord[String, Customer] =
      new ProducerRecord[String, Customer](topic, customer)

    println(customer)

    producer.send(producerRecord, new Callback() {
      def onCompletion(metadata: RecordMetadata, exception: Exception): Unit = {
        if (exception == null) {
          println(metadata)
        } else {
          exception.printStackTrace()
        }
      }
    })

    producer.flush()
    producer.close()
  }
}
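(Not an answer, just a diagnostic sketch.) The SocketException is thrown from the serializer's HTTP call to the Schema Registry when it tries to register the schema, so the first thing to confirm is that something is actually speaking HTTP at the configured schema.registry.url (the registry's default port is 8081; 18081 is non-standard). Below is a plain-Java probe of the registry's standard /subjects endpoint, with the URL assumed from the producer config above:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class RegistryCheck {
    public static void main(String[] args) throws Exception {
        // Same URL as schema.registry.url in the producer properties; adjust if needed.
        URL url = new URL("http://localhost:18081/subjects");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("GET");
        // If nothing is listening here, this fails with a connect/EOF error,
        // reproducing the producer's failure outside of Kafka.
        int code = conn.getResponseCode();
        System.out.println("HTTP " + code);
        if (code == 200) {
            try (BufferedReader in = new BufferedReader(
                    new InputStreamReader(conn.getInputStream()))) {
                // A healthy registry answers with a JSON array of subjects, e.g. []
                System.out.println(in.readLine());
            }
        }
    }
}

If this probe fails with a connection error or the same "Unexpected end of file", the problem is the registry endpoint (wrong port, registry not started, or an HTTPS-only listener) rather than the producer code.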

Related

Problem deserializing JSON to a Java object of generic type with Jackson 2.13.4.2

There's a message type defined as below; note there are different implementations of the generic type M.
// The message definition
@Value
@Builder(toBuilder = true)
@Jacksonized
public class MyMessage<M> {
    @Builder.Default
    Map<String, String> props = new HashMap<>();
    M content;
}

// One implementation of the generic type M in MyMessage
class ContentType1 {
    String name;
    SomeSimplePojo pojo;
    Map<String, String> contentProps;
}
Here's an example of the above message:
{
  "props": {
    "trace-id": "3468f6022b749dbc"
  },
  "content": {
    "name": "contentExample1",
    "pojo": {
      "field1": "val1",
      "field2": "val2"
    },
    "contentProps": {
      "/Count": "9",
      "/Email": "someone@stackoverflow.com"
    }
  }
}
The message was deserialized with the code snippet below, using com.fasterxml.jackson.* version 2.10.4, and it worked fine before.
ObjectMapper objectMapper = new ObjectMapper();
MyMessage<String> rawMsg = objectMapper.readValue(jsonStr, new TypeReference<MyMessage<String>>() {});
// Here clazzOfM is the class of type M
MyMessage<M> convertedMsg = MyMessage.<M>builder().content(objectMapper.convertValue(rawMsg.getContent(), clazzOfM)).props(rawMsg.getProps()).build();
But recently I upgraded com.fasterxml.jackson.databind to 2.13.4.2 and all other com.fasterxml.jackson.* to 2.13.4. Then it failed at the line MyMessage<String> rawMsg = objectMapper.readValue(jsonStr, new TypeReference<MyMessage<String>>() {}); with an exception that points to the generic type M (field content) at column 52:
Caused by: com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot deserialize value of type `java.lang.String` from Object value (token `JsonToken.START_OBJECT`)
at [Source: (String)"{"props":{"trace-id":"3468f6022b749dbc"},"content":{"name":"contentExample1","pojo":{"field1":"val1","field2":"val2"},"contentProps":{"/Count":"9","/Email":"someone@stackoverflow.com"}}}"; line: 1, column: 52] (through reference chain: com.demo.example.MyMessage$MyMessageBuilder["content"])
at com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:59)
at com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1741)
at com.fasterxml.jackson.databind.DeserializationContext.handleUnexpectedToken(DeserializationContext.java:1515)
at com.fasterxml.jackson.databind.DeserializationContext.handleUnexpectedToken(DeserializationContext.java:1420)
at com.fasterxml.jackson.databind.DeserializationContext.extractScalarFromObject(DeserializationContext.java:932)
at com.fasterxml.jackson.databind.deser.std.StringDeserializer.deserialize(StringDeserializer.java:62)
at com.fasterxml.jackson.databind.deser.std.StringDeserializer.deserialize(StringDeserializer.java:11)
at com.fasterxml.jackson.databind.deser.impl.MethodProperty.deserializeSetAndReturn(MethodProperty.java:158)
at com.fasterxml.jackson.databind.deser.BuilderBasedDeserializer.vanillaDeserialize(BuilderBasedDeserializer.java:293)
at com.fasterxml.jackson.databind.deser.BuilderBasedDeserializer.deserialize(BuilderBasedDeserializer.java:217)
at com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:323)
at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:4674)
at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:3629)
at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:3612)
I tried changing the problematic line to MyMessage<M> rawMsg = objectMapper.readValue(jsonStr, new TypeReference<MyMessage<M>>() {});. There is no exception then, but I still have two questions.
Is the above change a correct approach to deserializing JSON text to MyMessage with Jackson 2.13.4(.2)? If not, what is the best practice for this kind of deserialization?
After the above change, I notice that the type of rawMsg.content is LinkedHashMap; it isn't M (ContentType1 in this test) as I expected. But the type of convertedMsg.content IS ContentType1 after executing the converting line MyMessage<M> convertedMsg = MyMessage.<M>builder().content(objectMapper.convertValue(rawMsg.getContent(), clazzOfM)).props(rawMsg.getProps()).build();.
I can't understand why the type of rawMsg.content is LinkedHashMap instead of ContentType1. Could someone help explain?
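(Not an answer, just a sketch of the mechanics.) When M is only an erased type variable at the readValue call site, Jackson has no concrete class to bind it to, so the content object is materialized with the default Object handling, which for a JSON object is a LinkedHashMap; the later convertValue call is what finally maps that map onto clazzOfM. Below is a sketch of binding the type parameter explicitly so the value is read as the target class in one step (the helper name is made up; MyMessage and ContentType1 are the classes above):

import com.fasterxml.jackson.databind.JavaType;
import com.fasterxml.jackson.databind.ObjectMapper;

public class GenericReadSketch {
    // clazzOfM is the concrete content class (ContentType1 in the test above).
    static <M> MyMessage<M> read(ObjectMapper objectMapper, String jsonStr, Class<M> clazzOfM)
            throws Exception {
        // Build MyMessage<M> as a concrete JavaType instead of relying on an erased
        // type variable; readValue can then bind the "content" field directly to M.
        JavaType messageType = objectMapper.getTypeFactory()
                .constructParametricType(MyMessage.class, clazzOfM);
        return objectMapper.readValue(jsonStr, messageType);
    }
}

Called as read(objectMapper, jsonStr, ContentType1.class), this should populate content as ContentType1 directly, making the two-step convertValue approach unnecessary.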

java.lang.VerifyError in Ktor server POST in full-stack tutorial

In the official Kotlin tutorial https://kotlinlang.org/docs/multiplatform-full-stack-app.html
I have a Ktor server running, and when I do a GET I get the correct JSON response. When I POST:
POST http://localhost:9090/shoppingList
Content-Type: application/json
{
  "desc": "Peppers 🌶",
  "priority": 5
}
The server returns a 500 with this error:
2023-01-10 08:53:06.605 [DefaultDispatcher-worker-1] INFO ktor.application - Responding at http://0.0.0.0:9090
2023-01-10 08:53:12.317 [eventLoopGroupProxy-4-2] ERROR ktor.application - Unhandled: POST - /shoppingList
java.lang.VerifyError: class kotlinx.serialization.json.internal.StreamingJsonDecoder overrides final method kotlinx.serialization.encoding.AbstractDecoder.decodeSerializableElement(Lkotlinx/serialization/descriptors/SerialDescriptor;ILkotlinx/serialization/DeserializationStrategy;Ljava/lang/Object;)Ljava/lang/Object;
at java.base/java.lang.ClassLoader.defineClass1(Native Method)
at java.base/java.lang.ClassLoader.defineClass(ClassLoader.java:1016)
at java.base/java.security.SecureClassLoader.defineClass(SecureClassLoader.java:174)
at ...
This is just the first part of the tutorial for building a full-stack web app in Kotlin, so I'd like to know whether the issue is the tutorial itself or something I am missing.
The server code is below, but this is also copied and pasted right out of the tutorial, using the tutorial's git repo.
val shoppingList = mutableListOf(
    ShoppingListItem("Cucumbers 🥒", 1),
    ShoppingListItem("Tomatoes 🍅", 2),
    ShoppingListItem("Orange Juice 🍊", 3)
)

fun main() {
    embeddedServer(Netty, 9090) {
        install(ContentNegotiation) {
            json()
        }
        install(CORS) {
            allowMethod(HttpMethod.Get)
            allowMethod(HttpMethod.Post)
            allowMethod(HttpMethod.Delete)
            anyHost()
        }
        install(Compression) {
            gzip()
        }
        routing {
            route(ShoppingListItem.path) {
                get {
                    call.respond(shoppingList)
                }
                post {
                    shoppingList += call.receive<ShoppingListItem>()
                    call.respond(HttpStatusCode.OK)
                }
                delete("/{id}") {
                    val id = call.parameters["id"]?.toInt() ?: error("Invalid delete request")
                    shoppingList.removeIf { it.id == id }
                    call.respond(HttpStatusCode.OK)
                }
            }
        }
        routing {
            get("/hello") {
                call.respondText("Hello, API!")
            }
        }
    }.start(wait = true)
}
This was because the main branch of the tutorial repo is out of date and needs its dependencies updated per this issue: https://github.com/kotlin-hands-on/jvm-js-fullstack/issues/21

JsonbDeserializer's deserialize method does NOT read existing element

Currently I'm using WildFly 21.0.2 with the JSON-B and JSON-P APIs. The Yasson version in the WildFly modules is 1.0.5. I have the following JSON coming from a REST endpoint:
{
  "circuitInfoResponseList": [
    {
      "org.my.company.dto.FiberCircuitInfoResponse": {
        "attendanceType": "BY_RADIUS",
        "index": 0,
        ...
This is my JsonbDeserializer implementation:
public CircuitInfoResponse deserialize(JsonParser jsonParser, DeserializationContext deserializationContext, Type type) {
    // Expect a wrapping object whose single key is the concrete class name:
    // advance to that key, read it, then advance to the nested value object.
    jsonParser.next();
    String className = jsonParser.getString();
    jsonParser.next();
    try {
        // Delegate the actual mapping of the value object to the resolved subclass.
        return deserializationContext.deserialize(Class.forName(className).asSubclass(CircuitInfoResponse.class), jsonParser);
    } catch (ClassNotFoundException e) {
        e.printStackTrace();
        throw new JsonbException("Cannot deserialize object.");
    }
    //return deserializationContext.deserialize(FiberCircuitInfoResponse.class, jsonParser);
}
This method gets the SECOND entry from the JSON, attendanceType, and NOT the desired org.my.company.dto.FiberCircuitInfoResponse. By the way, when I serialize the JSON object I can see the string org.my.company.dto.FiberCircuitInfoResponse; however, when it arrives at the client side it does NOT contain that string. It comes like this:
[
  {
    "circuitInfoResponseList": [
      {
        "attendanceType": "BY_RADIUS",
        "index": 0,
Without that information I cannot tell which subclass to create. I've already tried to follow these tips, but without success (a sketch of the wrapping pattern those examples use follows the links):
https://javaee.github.io/jsonb-spec/users-guide.html
https://github.com/m0mus/JavaOne2016-JSONB-Demo/blob/4ecc22f69d57fda765631237d897b0a487f58d90/src/main/java/com/oracle/jsonb/demo/serializer/AnimalDeserializer.java
https://javaee.github.io/javaee-spec/javadocs/javax/json/bind/serializer/JsonbDeserializer.html
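For reference, the pattern in the linked JavaOne demo pairs the custom deserializer with a serializer that writes the concrete class name as the single key of a wrapping object, which is exactly what the first jsonParser.getString() call above expects to find. A minimal sketch of such a serializer (hypothetical name, not the asker's CircuitInfoResponseJsonbXerializer):

import javax.json.bind.serializer.JsonbSerializer;
import javax.json.bind.serializer.SerializationContext;
import javax.json.stream.JsonGenerator;

public class CircuitInfoResponseWrappingSerializer implements JsonbSerializer<CircuitInfoResponse> {
    @Override
    public void serialize(CircuitInfoResponse obj, JsonGenerator generator, SerializationContext ctx) {
        // Wrap each element as { "<fully qualified class name>": { ...fields... } }
        // so the deserializer's first getString() call sees the class name.
        generator.writeStartObject();
        ctx.serialize(obj.getClass().getName(), obj, generator);
        generator.writeEnd();
    }
}

If that wrapping key is present in the payload the client actually receives, the deserializer's cursor logic above should land on the class name rather than on attendanceType.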
These are my POJO classes.
Parent class:
import lombok.*;
import lombok.experimental.SuperBuilder;
import javax.json.bind.annotation.JsonbTypeDeserializer;

@Data
@SuperBuilder
@NoArgsConstructor(access = AccessLevel.PROTECTED)
@JsonbTypeDeserializer(CircuitInfoResponseJsonbXerializer.class)
public class CircuitInfoResponse {
    ...
}
Child class:
import lombok.AccessLevel;
import lombok.Data;
import lombok.EqualsAndHashCode;
import lombok.NoArgsConstructor;
import lombok.experimental.SuperBuilder;

@Data
@SuperBuilder
@EqualsAndHashCode(callSuper = false)
@NoArgsConstructor(access = AccessLevel.PROTECTED)
public class FiberCircuitInfoResponse extends CircuitInfoResponse {
    ...
}
Serialize code:
Type responseListType = new ArrayList<SimulationServiceResponse>() {}.getClass().getGenericSuperclass();
JsonbConfig config = new JsonbConfig()
.withSerializers(new CircuitInfoResponseJsonbXerializer());
Jsonb jsonb = JsonbBuilder.create(config);
String json = jsonb.toJson(response, responseListType);
System.out.println(json);
return Response.status(Response.Status.OK).entity(json).build();
Deserialize code:
String restJsonResponse = restResponse.readEntity(String.class);
JsonbConfig config = new JsonbConfig()
.withDeserializers(new CircuitInfoResponseJsonbXerializer());
Jsonb jsonbCustom = JsonbBuilder.create(config);
List<SimulationServiceResponse> restResponseEntity = jsonbCustom.fromJson(restJsonResponse, new ArrayList<SimulationServiceResponse>() {}.getClass().getGenericSuperclass());
This is the class that contains a list of the parent class above:
import lombok.AccessLevel;
import lombok.Data;
import lombok.Getter;
import lombok.Setter;

@Data
public class SimulationServiceResponse {
    ...

    @Getter(AccessLevel.NONE)
    @Setter(AccessLevel.NONE)
    private List<CircuitInfoResponse> circuitInfoResponseList;

    public List<CircuitInfoResponse> getCircuitInfoResponseList() {
        if (circuitInfoResponseList == null) {
            circuitInfoResponseList = new ArrayList<>();
        }
        return circuitInfoResponseList;
    }

    public void setCircuitInfoResponseList(List<CircuitInfoResponse> list) {
        this.circuitInfoResponseList = list;
    }
}
Do you guys have any idea of what I'm doing wrong?

Controlling job execution based on exceptions in simple chunk job in Spring Batch

I have a simple chunk-based CSV processing job.
I would like to change the execution flow when there is a particular type of error during processing (e.g. an invalid line structure).
In order to prevent errors from being thrown, I need to provide a custom exceptionHandler that will swallow the parsing exception:
@Bean
fun processCsvStep(
    stepBuilderFactory: StepBuilderFactory,
    reader: ItemReader<InputRow>,
    processor: ItemProcessor<InputRow, OutputObject>,
    writer: ItemWriter<OutputObject>
) = stepBuilderFactory.get(PROCESS_CSV_STEP)
    .chunk<InputRow, OutputObject>(CHUNKS_NUMBER)
    .reader(reader)
    .processor(processor)
    .writer(writer)
    .exceptionHandler { context: RepeatContext, throwable: Throwable ->
        context.setTerminateOnly()
        logger.error { "Exception during parsing: ${throwable.message}" }
    }
    .build()!!
Then in my Job I can rely only on rollback count:
@Bean
fun createCsvJob(jobs: JobBuilderFactory, processCsvStep: Step, moveCsvStep: Step, moveFailedCsvStep: Step) = jobs.get(PROCESS_CSV_JOB)
    .start(processCsvStep)
    .next { jobExecution: JobExecution, stepExecution: StepExecution ->
        return@next when (stepExecution.rollbackCount) {
            0 -> FlowExecutionStatus.COMPLETED
            else -> FlowExecutionStatus.FAILED
        }
    }
    .on(FlowExecutionStatus.FAILED.name)
    .to(moveFailedCsvStep)
    .on(FlowExecutionStatus.COMPLETED.name)
    .to(moveCsvStep)
    .end()
    .build()!!
Is there any way to pass information from the exception handler to the JobExecutionDecider? I would like to make the execution decision based on the type of exception that happened during parsing. Is this possible?

I would like to make the execution decision based on the type of exception that happened during parsing. Is this possible?
You can get access to the exception that happened during the step from the decider through stepExecution#getFailureExceptions. Here is an example:
import java.util.Arrays;
import java.util.List;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.job.flow.FlowExecutionStatus;
import org.springframework.batch.core.job.flow.JobExecutionDecider;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemWriter;
import org.springframework.batch.item.support.ListItemReader;
import org.springframework.batch.repeat.RepeatStatus;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.ApplicationContext;
import org.springframework.context.annotation.AnnotationConfigApplicationContext;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
@Configuration
@EnableBatchProcessing
public class MyJob {

    @Autowired
    private JobBuilderFactory jobs;

    @Autowired
    private StepBuilderFactory steps;

    @Bean
    public ItemReader<Integer> itemReader() {
        return new ListItemReader<>(Arrays.asList(1, 2, 3, 4, 5, 6, 7, 8, 9, 10));
    }

    @Bean
    public ItemWriter<Integer> itemWriter() {
        return items -> {
            for (Integer item : items) {
                if (items.contains(3)) {
                    throw new IllegalArgumentException("no 3!");
                }
                System.out.println("item = " + item);
            }
        };
    }

    @Bean
    public Step step1() {
        return steps.get("step1")
                .<Integer, Integer>chunk(5)
                .reader(itemReader())
                .writer(itemWriter())
                .build();
    }

    @Bean
    public Step step2() {
        return steps.get("step2")
                .tasklet((contribution, chunkContext) -> {
                    System.out.println("step2");
                    return RepeatStatus.FINISHED;
                })
                .build();
    }

    @Bean
    public Step step3() {
        return steps.get("step3")
                .tasklet((contribution, chunkContext) -> {
                    System.out.println("step3");
                    return RepeatStatus.FINISHED;
                })
                .build();
    }

    @Bean
    public JobExecutionDecider decider() {
        return (jobExecution, stepExecution) -> {
            int rollbackCount = stepExecution.getRollbackCount();
            List<Throwable> failureExceptions = stepExecution.getFailureExceptions();
            System.out.println("rollbackCount = " + rollbackCount);
            System.out.println("failureExceptions = " + failureExceptions);
            // make the decision based on rollbackCount and/or failureExceptions and return the status accordingly
            return FlowExecutionStatus.COMPLETED;
        };
    }

    @Bean
    public Job job() {
        return jobs.get("job")
                .start(step1())
                .on("*").to(decider())
                .from(decider()).on("COMPLETED").to(step2())
                .from(decider()).on("FAILED").to(step3())
                .build()
                .build();
    }

    public static void main(String[] args) throws Exception {
        ApplicationContext context = new AnnotationConfigApplicationContext(MyJob.class);
        JobLauncher jobLauncher = context.getBean(JobLauncher.class);
        Job job = context.getBean(Job.class);
        jobLauncher.run(job, new JobParameters());
    }
}
In this example, if an exception occurs during step1, the decider can get it from the step execution and make the decision accordingly (go to step2 or step3).
So I'm not sure you really need an exception handler and a way to pass information to the decider. The same idea applies if you want to make the decision based on the rollbackCount, commitCount, readCount, or any other metric.
Hope this helps.

Using a Jackson MixIn class for a list of objects

I'm having a problem deserializing the following JSON:
{
  "GrpHdr": {
    "MsgId": "Message-1",
    "CreDtTm": "2018-03-02T10:15:30+01:00[Europe/Paris]",
    "NbOfTxs": "1",
    "InitgPty": {
      "Nm": "Remitter"
    }
  },
  "PmtInf": [
    {
      "PmtInfId": "1"
    },
    {
      "PmtInfId": "2"
    }
  ]
}
I have created a MixIn class:
public abstract class CustomerCreditTransferInitiationMixIn {

    public PaymentInstructions paymentInstructions;

    @JsonCreator
    public CustomerCreditTransferInitiationMixIn(
            @JsonProperty("GrpHdr") GroupHeader GrpHdr,
            @JsonProperty("PmtInf") List<PaymentInstruction> PmtInf
    ) {
        this.paymentInstructions = PaymentInstructions.valueOf(PmtInf);
    }

    @JsonProperty("GrpHdr")
    abstract GroupHeader getGroupHeader();

    @JsonProperty("PmtInf")
    abstract List<PaymentInstruction> getPaymentInstructions();
}
I'm having no trouble deserializing the group header in this case, mapping the different names. But the PmtInf case confuses me: it is an array that I want to deserialize to a List of PaymentInstructions, while each PmtInf element is a PaymentInstruction.
I have created a test:
@Test
public void JacksonMixinAnnotationTestJsonIsoFileFromTester() throws JsonProcessingException, Throwable {
    CustomerCreditTransferInitiation customerCreditTransferInitiation;
    String jsonFile = "testWithShortNames";
    InputStream inputStream = new ClassPathResource(jsonFile + ".json").getInputStream();
    ObjectMapper objectMapper = buildMapper();
    objectMapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);
    objectMapper.registerModule(new JavaTimeModule());
    objectMapper.addMixIn(CustomerCreditTransferInitiation.class, CustomerCreditTransferInitiationMixIn.class);
    objectMapper.addMixIn(GroupHeader.class, GroupHeaderMixIn.class);
    objectMapper.addMixIn(PaymentInstruction.class, PaymentInstructionMixIn.class);
    objectMapper.addMixIn(PartyIdentification.class, PartyIdentificationMixIn.class);
    customerCreditTransferInitiation = objectMapper.readValue(inputStream, CustomerCreditTransferInitiation.class);
    // GroupHeader
    Assert.assertNotNull(customerCreditTransferInitiation.getGroupHeader());
    Assert.assertNotNull(customerCreditTransferInitiation.getGroupHeader().getMessageId());
    Assert.assertNotNull(customerCreditTransferInitiation.getGroupHeader().getCreationDateTime());
    Assert.assertNotNull(customerCreditTransferInitiation.getGroupHeader().getNumberOfTransactions());
    Assert.assertNotNull(customerCreditTransferInitiation.getGroupHeader().getInitiatingParty());
    Assert.assertNotNull(customerCreditTransferInitiation.getGroupHeader().getInitiatingParty().getName());
    // PaymentInstructions
    Assert.assertNotNull(customerCreditTransferInitiation.getPaymentInstructions());
}
Getting the following error:
com.fasterxml.jackson.databind.exc.UnrecognizedPropertyException:
Unrecognized field "PmtInfId" (class
com.seb.payment.iso.domain.PaymentInstruction), not marked as
ignorable (19 known properties: "paymentInformationId",
"paymentMethod", "created", "paymentTypeInformation", "controlSum",
"debtorAgent", "instructionForDebtorAgent", "numberOfTransactions",
"requestExecutionTime", "debtorAccount", "creditTransferTransactions",
"debtorAgentAccount", "batchBooking", "poolingAdjustmentDate",
"ultimateDebtor", "chargeBearerType", "debtor", "chargesAccount",
"chargesAccountAgent"]) at [Source: UNKNOWN; line: -1, column: -1]
(through reference chain:
com.seb.payment.iso.domain.CustomerCreditTransferInitiation["PmtInf"]->com.seb.payment.iso.domain.PaymentInstruction["PmtInfId"])
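The exception suggests that no mapping for the short element names is being applied to PaymentInstruction, so Jackson only knows the bean property names it lists (paymentInformationId and so on). A minimal sketch of what a PaymentInstructionMixIn covering the failing field could look like; the target property name is taken from the error message, and the getter name is assumed:

import com.fasterxml.jackson.annotation.JsonProperty;

public abstract class PaymentInstructionMixIn {

    // Map the short ISO element name used in the JSON ("PmtInfId")
    // to the bean property Jackson reported as known ("paymentInformationId").
    @JsonProperty("PmtInfId")
    abstract String getPaymentInformationId();
}

Registered via objectMapper.addMixIn(PaymentInstruction.class, PaymentInstructionMixIn.class) as in the test, the mix-in should be applied to every element of the PmtInf array; the list itself needs no separate handling once the element names line up.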
In our case we have implemented our own deserializers in an abstract iterable. On this line:
ObjectReader objectReader = ObjectMapperFactory.instance().readerFor(this.itemClass);
the MixIn classes are lost.
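Mix-ins are per-ObjectMapper configuration, so a reader obtained from a different mapper instance (as with the ObjectMapperFactory.instance() call above) will not carry them. A small sketch of the difference, assuming the mix-in classes from the test above:

import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.ObjectReader;

public class MixInReaderSketch {
    public static void main(String[] args) {
        ObjectMapper configured = new ObjectMapper();
        configured.addMixIn(PaymentInstruction.class, PaymentInstructionMixIn.class);

        // A reader created from the configured mapper inherits its mix-ins.
        ObjectReader withMixIns = configured.readerFor(PaymentInstruction.class);

        // A reader created from a fresh (or differently configured) mapper knows nothing about them.
        ObjectReader withoutMixIns = new ObjectMapper().readerFor(PaymentInstruction.class);
    }
}

If the inner deserializer needs its own reader, reusing (or passing in) the already-configured mapper keeps the mix-ins in effect.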