@JsonIdentityInfo doesn't handle circular references correctly in a generic list - jackson

I'm serializing and deserializing a list containing two objects that hold circular references to each other.
First I tried ObjectIdGenerators.StringIdGenerator as the generator.
// Using the latest versions of the dependencies
// jackson.version = 2.13.1
// jackson-jsog = 1.1.2
@NoArgsConstructor
@JsonIdentityInfo(generator = ObjectIdGenerators.StringIdGenerator.class)
public static class Outer {
    @Setter
    @Getter
    private Inner inner;
}

@NoArgsConstructor
@JsonIdentityInfo(generator = ObjectIdGenerators.StringIdGenerator.class)
public static class Inner {
    @Getter
    private Outer outer;

    public Inner(Outer outer) {
        this.outer = outer;
        outer.setInner(this);
    }
}
public static void main(String[] args) throws Exception {
    // There are circular references between Outer and Inner
    Outer outer = new Outer();
    Inner inner = new Inner(outer);
    // Turn on default typing
    PolymorphicTypeValidator ptv = BasicPolymorphicTypeValidator.builder()
            .allowIfSubType(Object.class).build();
    ObjectMapper mapper = new ObjectMapper()
            .activateDefaultTyping(ptv, ObjectMapper.DefaultTyping.EVERYTHING,
                    JsonTypeInfo.As.PROPERTY);
    List<Object> source = Lists.newArrayList(outer, inner);
    String json = mapper.writeValueAsString(source);
    System.out.println(json);
    // This is the JSON serialized by StringIdGenerator, which doesn't make sense
    // for the second element (no type info there, just a UUID). See below.
    List<Object> target = mapper.readerForListOf(Object.class).readValue(json);
    System.out.println(target);
    // So I finally got an Outer instance and a plain String in the target list,
    // not equal to the source list at all:
    // [com.foo.bar.SomeTest$Outer@4b29d1d2, acb5231d-13de-4e22-92c6-cb1ec0530de1]
}
This is the JSON serialized by StringIdGenerator, which doesn't make sense for the second element (no type info there, just a UUID):
[
  "java.util.ArrayList",
  [
    {
      "@class": "com.foo.bar.SomeTest$Outer",
      "@id": "df6ed346-6b27-4983-8dc2-5c07ddfa8f8f",
      "inner": {
        "@class": "com.foo.bar.SomeTest$Inner",
        "@id": "acb5231d-13de-4e22-92c6-cb1ec0530de1",
        "outer": "df6ed346-6b27-4983-8dc2-5c07ddfa8f8f"
      }
    },
    "acb5231d-13de-4e22-92c6-cb1ec0530de1"
  ]
]
Then I tried the JSOGGenerator:
@NoArgsConstructor
@JsonIdentityInfo(generator = JSOGGenerator.class)
public static class Outer {
    @Setter
    @Getter
    private Inner inner;
}

@NoArgsConstructor
@JsonIdentityInfo(generator = JSOGGenerator.class)
public static class Inner {
    @Getter
    private Outer outer;

    public Inner(Outer outer) {
        this.outer = outer;
        outer.setInner(this);
    }
}
public static void main(String[] args) throws Exception {
    // There are circular references between Outer and Inner
    Outer outer = new Outer();
    Inner inner = new Inner(outer);
    // Turn on default typing
    PolymorphicTypeValidator ptv = BasicPolymorphicTypeValidator.builder()
            .allowIfSubType(Object.class).build();
    ObjectMapper mapper = new ObjectMapper()
            .activateDefaultTyping(ptv, ObjectMapper.DefaultTyping.EVERYTHING,
                    JsonTypeInfo.As.PROPERTY);
    List<Object> source = Lists.newArrayList(outer, inner);
    String json = mapper.writeValueAsString(source);
    System.out.println(json);
    // This is the JSON serialized by JSOGGenerator, which does make sense.
    // See below.
    List<Object> target = mapper.readerForListOf(Object.class).readValue(json);
    // However, I got an InvalidTypeIdException while deserializing:
    // Exception in thread "main" com.fasterxml.jackson.databind.exc.InvalidTypeIdException:
    // Could not resolve subtype of [simple type, class com.voodoodyne.jackson.jsog.JSOGRef]:
    // missing type id property '@class'
}
This is the JSON serialized by JSOGGenerator, which does make sense:
[
  "java.util.ArrayList",
  [
    {
      "@class": "com.foo.bar.SomeTest$Outer",
      "@id": "1",
      "inner": {
        "@class": "com.foo.bar.SomeTest$Inner",
        "@id": "2",
        "outer": {
          "@ref": "1"
        }
      }
    },
    {
      "@ref": "2"
    }
  ]
]
However, I got an InvalidTypeIdException while deserializing,
and I found that the "@id" property is not handled correctly.
See ObjectIdValueProperty._valueDeserializer._baseType, whose type is JSOGRef; it cannot parse "1" (a String) as an object id value.
My questions:
What's the correct way to deal with this?
Is there some other generator to try?

Related

@JsonIdentityReference does not recognize equal values

I'm trying to serialize an object (Root) with some duplicated entries of MyObject. Since I only want to store each whole object once, I'm using @JsonIdentityReference, which works pretty well.
However, I realized that it generates an undeserializable object if there are equal objects with different references. I wonder if there's a configuration in Jackson to change this behavior. Thanks!
@Value
@AllArgsConstructor
@NoArgsConstructor(force = true)
class Root {
    private List<MyObject> allObjects;
    private Map<String, MyObject> objectMap;
}

@Value
@AllArgsConstructor
@NoArgsConstructor(force = true)
@JsonIdentityReference
@JsonIdentityInfo(generator = ObjectIdGenerators.PropertyGenerator.class, property = "id")
class MyObject {
    private String id;
    private int value;
}
public class Main {
    public static void main() throws JsonProcessingException {
        // Constructing equal objects
        val obj1 = new MyObject("a", 1);
        val obj2 = new MyObject("a", 1);
        assert obj1.equals(obj2);
        val root = new Root(
                Lists.newArrayList(obj1),
                ImmutableMap.of("lorem", obj2)
        );
        val objectMapper = new ObjectMapper();
        val json = objectMapper.writeValueAsString(root);
        // {"allObjects":[{"id":"a","value":1}],"objectMap":{"lorem":{"id":"a","value":1}}}
        // Note here both obj1 and obj2 are expanded.
        // Exception: Already had POJO for id
        val deserialized = objectMapper.readValue(json, Root.class);
        assert root.equals(deserialized);
    }
}
I'm using Jackson 2.10.
Full stacktrace:
Exception in thread "main" com.fasterxml.jackson.databind.JsonMappingException: Already had POJO for id (java.lang.String) [[ObjectId: key=a, type=com.fasterxml.jackson.databind.deser.impl.PropertyBasedObjectIdGenerator, scope=java.lang.Object]] (through reference chain: Root["objectMap"]->java.util.LinkedHashMap["lorem"]->MyObject["id"])
at com.fasterxml.jackson.databind.JsonMappingException.wrapWithPath(JsonMappingException.java:394)
at com.fasterxml.jackson.databind.JsonMappingException.wrapWithPath(JsonMappingException.java:353)
at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.wrapAndThrow(BeanDeserializerBase.java:1714)
at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:371)
at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeWithObjectId(BeanDeserializerBase.java:1257)
at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:157)
at com.fasterxml.jackson.databind.deser.std.MapDeserializer._readAndBindStringKeyMap(MapDeserializer.java:527)
at com.fasterxml.jackson.databind.deser.std.MapDeserializer.deserialize(MapDeserializer.java:364)
at com.fasterxml.jackson.databind.deser.std.MapDeserializer.deserialize(MapDeserializer.java:29)
at com.fasterxml.jackson.databind.deser.impl.FieldProperty.deserializeAndSet(FieldProperty.java:138)
at com.fasterxml.jackson.databind.deser.BeanDeserializer.vanillaDeserialize(BeanDeserializer.java:288)
at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:151)
at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:4202)
at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:3205)
at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:3173)
at Main.main(Main.java:53)
Caused by: java.lang.IllegalStateException: Already had POJO for id (java.lang.String) [[ObjectId: key=a, type=com.fasterxml.jackson.databind.deser.impl.PropertyBasedObjectIdGenerator, scope=java.lang.Object]]
at com.fasterxml.jackson.annotation.SimpleObjectIdResolver.bindItem(SimpleObjectIdResolver.java:24)
at com.fasterxml.jackson.databind.deser.impl.ReadableObjectId.bindItem(ReadableObjectId.java:57)
at com.fasterxml.jackson.databind.deser.impl.ObjectIdValueProperty.deserializeSetAndReturn(ObjectIdValueProperty.java:101)
at com.fasterxml.jackson.databind.deser.impl.ObjectIdValueProperty.deserializeAndSet(ObjectIdValueProperty.java:83)
at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:369)
... 14 more
As I mentioned earlier, this setup only works if obj1 == obj2, as two objects with the same ID should be identity-equal. In that case, the second object would also not get expanded during serialization (alwaysAsId = false only expands the first occurrence).
However, if you want to have this setup and are fine with the serialization, you could use a custom Resolver for deserialization that stores a single instance per key:
@JsonIdentityReference(alwaysAsId = false)
@JsonIdentityInfo(generator = ObjectIdGenerators.PropertyGenerator.class, property = "id", resolver = CustomScopeResolver.class)
static class MyObject {
    private String id;
    // ...
}

class CustomScopeResolver implements ObjectIdResolver {
    Map<String, MyObject> data = new HashMap<>();

    @Override
    public void bindItem(final IdKey id, final Object pojo) {
        data.put(id.key.toString(), (MyObject) pojo);
    }

    @Override
    public Object resolveId(final IdKey id) {
        return data.get(id.key);
    }

    @Override
    public ObjectIdResolver newForDeserialization(final Object context) {
        return new CustomScopeResolver();
    }

    @Override
    public boolean canUseFor(final ObjectIdResolver resolverType) {
        return false;
    }
}
NEW EDIT: Apparently, it's very easy: just turn on objectMapper.configure(SerializationFeature.USE_EQUALITY_FOR_OBJECT_ID, true); so that the DefaultSerializerProvider uses a regular HashMap instead of an IdentityHashMap to manage the serialized beans.
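For reference, a minimal round-trip sketch with that feature enabled (reusing the Root and MyObject classes from the question; nothing else changes):
ObjectMapper objectMapper = new ObjectMapper();
// Compare object ids by equals() instead of reference identity during
// serialization, so the equal-but-not-identical obj2 is written as an id reference.
objectMapper.configure(SerializationFeature.USE_EQUALITY_FOR_OBJECT_ID, true);
String json = objectMapper.writeValueAsString(root);
Root deserialized = objectMapper.readValue(json, Root.class);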
DEPRECATED: Update for serialization: it is possible to achieve this by adding a custom SerializerProvider:
class CustomEqualObjectsSerializerProvider extends DefaultSerializerProvider {
    private final Collection<MyObject> data = new HashSet<>();
    private final SerializerProvider src;
    private final SerializationConfig config;
    private final SerializerFactory f;

    public CustomEqualObjectsSerializerProvider(
            final SerializerProvider src,
            final SerializationConfig config,
            final SerializerFactory f) {
        super(src, config, f);
        this.src = src;
        this.config = config;
        this.f = f;
    }

    @Override
    public DefaultSerializerProvider createInstance(final SerializationConfig config, final SerializerFactory jsf) {
        return new CustomEqualObjectsSerializerProvider(src, this.config, f);
    }

    @Override
    public WritableObjectId findObjectId(final Object forPojo, final ObjectIdGenerator<?> generatorType) {
        // check if there is an equivalent pojo, use it if it exists
        final Optional<MyObject> equivalentObject = data.stream()
                .filter(forPojo::equals)
                .findFirst();
        if (equivalentObject.isPresent()) {
            return super.findObjectId(equivalentObject.get(), generatorType);
        } else {
            if (forPojo instanceof MyObject) {
                data.add((MyObject) forPojo);
            }
            return super.findObjectId(forPojo, generatorType);
        }
    }
}
@Test
public void main() throws IOException {
    // Constructing equal objects
    final MyObject obj1 = new MyObject();
    obj1.setId("a");
    final MyObject obj2 = new MyObject();
    obj2.setId("a");
    assert obj1.equals(obj2);
    final Root root = new Root();
    root.setAllObjects(Collections.singletonList(obj1));
    root.setObjectMap(Collections.singletonMap("lorem", obj2));
    final ObjectMapper objectMapper = new ObjectMapper();
    objectMapper.setSerializerProvider(
            new CustomEqualObjectsSerializerProvider(
                    objectMapper.getSerializerProvider(),
                    objectMapper.getSerializationConfig(),
                    objectMapper.getSerializerFactory()));
    final String json = objectMapper.writeValueAsString(root);
    System.out.println(json); // second object is not expanded!
}

Spring Batch - Unable to deserialize the execution context - OffsetDateTime - cannot deserialize

I'm trying to create a Spring Batch job with multiple steps, passing objects from step to step.
To do this I use an ExecutionContext that I promote from step to job context.
On the first run, no problem, the data flows correctly from step to step.
On subsequent runs, I get the error:
"Unable to deserialize the execution context" Caused by: com.fasterxml.jackson.databind.exc.InvalidDefinitionException: Cannot construct instance of java.time.OffsetDateTime (no Creators, like default constructor, exist): cannot deserialize from Object value (no delegate- or property-based Creator)
I write the context in an ItemWriter like so:
@Override
public void write(List<? extends Employee> items) throws Exception {
    ExecutionContext stepContext = this.stepExecution.getExecutionContext();
    List<Employee> e = new ArrayList<Employee>();
    e.addAll(items);
    stepContext.put("someKey", e);
}
And read it back in an ItemReader (from another step) with:
@BeforeStep
public void retrieveInterstepData(StepExecution stepExecution) {
    JobExecution jobExecution = stepExecution.getJobExecution();
    ExecutionContext jobContext = jobExecution.getExecutionContext();
    this.someObject = (List<Employee>) jobContext.get("someKey");
}
I checked the Spring Batch context table in the database, and my dates (LocalDate, OffsetDateTime, ...) are stored like:
"LocalDate": {
"year": 2019,
"month": "OCTOBER",
"dayOfMonth": 30,
"monthValue": 10,
"era": ["java.time.chrono.IsoEra", "CE"],
"dayOfWeek": "WEDNESDAY",
"dayOfYear": 303,
"leapYear": false,
"chronology": {
"id": "ISO",
"calendarType": "iso8601"
}
}
"OffsetDateTime": {
"offset": {
"totalSeconds": 0,
"id": "Z",
"rules": {
"fixedOffset": true,
"transitionRules": ["java.util.Collections$UnmodifiableRandomAccessList", []],
"transitions": ["java.util.Collections$UnmodifiableRandomAccessList", []]
}
},
"month": "OCTOBER",
"year": 2019,
"dayOfMonth": 28,
"hour": 13,
"minute": 42,
"monthValue": 10,
"nano": 511651000,
"second": 36,
"dayOfWeek": "MONDAY",
"dayOfYear": 301
}
I guess it's Jackson's choice to store it like that (I didn't customize anything), but it seems Jackson can't read its own format on the next run?!
My stubs are generated from Swagger with the swagger-codegen-maven-plugin and configOptions/dateLibrary=java8, so I can't change them.
I tried to add
<dependency>
    <groupId>com.fasterxml.jackson.datatype</groupId>
    <artifactId>jackson-datatype-jsr310</artifactId>
</dependency>
And
@PostConstruct
public void init() {
    objectMapper.registerModule(new JavaTimeModule());
    objectMapper.disable(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS);
}
in the @SpringBootApplication class, with no change.
Any ideas? Either to store dates more simply, like "2019-11-04", or to make Jackson read its own format?
Your object mapper should be set on the Jackson2ExecutionContextStringSerializer used by the job repository. You can extend DefaultBatchConfigurer and override createJobRepository:
@Bean
public JobRepository createJobRepository() throws Exception {
    ObjectMapper objectMapper = new ObjectMapper().registerModule(new JavaTimeModule());
    objectMapper.disable(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS);
    Jackson2ExecutionContextStringSerializer defaultSerializer = new Jackson2ExecutionContextStringSerializer();
    defaultSerializer.setObjectMapper(objectMapper);
    JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
    factory.setDataSource(dataSource);
    factory.setTransactionManager(transactionManager);
    factory.setSerializer(defaultSerializer);
    factory.afterPropertiesSet();
    return factory.getObject();
}
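Wired into a DefaultBatchConfigurer subclass, this could look roughly like the sketch below (an illustration, not the answer's exact code; it assumes Spring Batch 4.x, where createJobRepository() is a protected method of DefaultBatchConfigurer, and the class name CustomBatchConfigurer is hypothetical):
public class CustomBatchConfigurer extends DefaultBatchConfigurer {

    private final DataSource dataSource;

    public CustomBatchConfigurer(DataSource dataSource) {
        super(dataSource);
        this.dataSource = dataSource;
    }

    @Override
    protected JobRepository createJobRepository() throws Exception {
        // Same customized mapper as above, plugged into the job repository's serializer.
        ObjectMapper objectMapper = new ObjectMapper().registerModule(new JavaTimeModule());
        objectMapper.disable(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS);
        Jackson2ExecutionContextStringSerializer serializer = new Jackson2ExecutionContextStringSerializer();
        serializer.setObjectMapper(objectMapper);
        JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
        factory.setDataSource(dataSource);
        factory.setTransactionManager(getTransactionManager());
        factory.setSerializer(serializer);
        factory.afterPropertiesSet();
        return factory.getObject();
    }
}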
EDIT: My bad, I just saw that I have a
@Bean
public BatchConfigurer batchConfigurer(@Qualifier("batchDataSource") DataSource dataSource) {
    return new DefaultBatchConfigurer(dataSource);
}
which provides two BatchConfigurers to Spring.
Thanks!
ORIGINAL:
Thanks, it seems promising.
But I can't find where to extend and use it, i.e. on which class.
I have a batch configuration class:
@Configuration
@EnableConfigurationProperties(BatchProperties.class)
public class BatchDatabaseConfiguration {

    @Value("${spring.datasource.driver-class-name}")
    private String driverClassName;

    @Value("${spring.datasource.url}")
    private String dbURL;

    @Bean("batchDataSource")
    public DataSource batchDataSource() {
        final DriverManagerDataSource dataSource = new DriverManagerDataSource();
        dataSource.setDriverClassName(driverClassName);
        dataSource.setUrl(dbURL);
        return dataSource;
    }

    @Bean
    public BatchConfigurer batchConfigurer(@Qualifier("batchDataSource") DataSource dataSource) {
        return new DefaultBatchConfigurer(dataSource);
    }

    @Bean(name = "batchTransactionManager")
    public PlatformTransactionManager batchTransactionManager(@Qualifier("batchDataSource") DataSource dataSource) {
        DataSourceTransactionManager tm = new DataSourceTransactionManager();
        tm.setDataSource(dataSource);
        return tm;
    }
}
And a class with the job definition:
@Configuration
@EnableBatchProcessing
public class ExtractionJobConfiguration {

    @Autowired
    private JobBuilderFactory jobBuilderFactory;

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    @Bean
    public Job creationJob() {
        ...
    }

    [...]
}
And the main class:
@EntityScan(basePackages = { "..." })
@SpringBootApplication
@EnableAsync
public class App {
    public static void main(String[] args) {
        ApplicationContext ctx = SpringApplication.run(App.class, args);
    }
}
What do you think?
I also read that Spring Batch 4.2.0+ allows customization of the ObjectMapper used by Jackson2ExecutionContextStringSerializer (https://jira.spring.io/browse/BATCH-2828).
Is that what you're proposing? (I can't find any other information.)

Using a Jackson MixIn class for a list of objects

I'm having a problem deserializing the following JSON:
{
    "GrpHdr": {
        "MsgId": "Message-1",
        "CreDtTm": "2018-03-02T10:15:30+01:00[Europe/Paris]",
        "NbOfTxs": "1",
        "InitgPty": {
            "Nm": "Remitter"
        }
    },
    "PmtInf": [
        {
            "PmtInfId": "1"
        },
        {
            "PmtInfId": "2"
        }
    ]
}
I have created a MixIn class:
public abstract class CustomerCreditTransferInitiationMixIn {

    public PaymentInstructions paymentInstructions;

    @JsonCreator
    public CustomerCreditTransferInitiationMixIn(
            @JsonProperty("GrpHdr") GroupHeader GrpHdr,
            @JsonProperty("PmtInf") List<PaymentInstruction> PmtInf
    ) {
        this.paymentInstructions = PaymentInstructions.valueOf(PmtInf);
    }

    @JsonProperty("GrpHdr")
    abstract GroupHeader getGroupHeader();

    @JsonProperty("PmtInf")
    abstract List<PaymentInstruction> getPaymentInstructions();
}
I have no trouble deserializing the group header in this case, mapping the different names. But the PmtInf case confuses me: it is a JSON array that I want to deserialize into a List of PaymentInstruction objects, yet each PmtInf element is a single PaymentInstruction.
I have created a test:
@Test
public void JacksonMixinAnnotationTestJsonIsoFileFromTester() throws JsonProcessingException, Throwable {
    CustomerCreditTransferInitiation customerCreditTransferInitiation;
    String jsonFile = "testWithShortNames";
    InputStream inputStream = new ClassPathResource(jsonFile + ".json").getInputStream();
    ObjectMapper objectMapper = buildMapper();
    objectMapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);
    objectMapper.registerModule(new JavaTimeModule());
    objectMapper.addMixIn(CustomerCreditTransferInitiation.class, CustomerCreditTransferInitiationMixIn.class);
    objectMapper.addMixIn(GroupHeader.class, GroupHeaderMixIn.class);
    objectMapper.addMixIn(PaymentInstruction.class, PaymentInstructionMixIn.class);
    objectMapper.addMixIn(PartyIdentification.class, PartyIdentificationMixIn.class);
    customerCreditTransferInitiation = objectMapper.readValue(inputStream, CustomerCreditTransferInitiation.class);
    // GroupHeader
    Assert.assertNotNull(customerCreditTransferInitiation.getGroupHeader());
    Assert.assertNotNull(customerCreditTransferInitiation.getGroupHeader().getMessageId());
    Assert.assertNotNull(customerCreditTransferInitiation.getGroupHeader().getCreationDateTime());
    Assert.assertNotNull(customerCreditTransferInitiation.getGroupHeader().getNumberOfTransactions());
    Assert.assertNotNull(customerCreditTransferInitiation.getGroupHeader().getInitiatingParty());
    Assert.assertNotNull(customerCreditTransferInitiation.getGroupHeader().getInitiatingParty().getName());
    // PaymentInstructions
    Assert.assertNotNull(customerCreditTransferInitiation.getPaymentInstructions());
}
Getting the following error:
com.fasterxml.jackson.databind.exc.UnrecognizedPropertyException: Unrecognized field "PmtInfId" (class com.seb.payment.iso.domain.PaymentInstruction), not marked as ignorable (19 known properties: "paymentInformationId", "paymentMethod", "created", "paymentTypeInformation", "controlSum", "debtorAgent", "instructionForDebtorAgent", "numberOfTransactions", "requestExecutionTime", "debtorAccount", "creditTransferTransactions", "debtorAgentAccount", "batchBooking", "poolingAdjustmentDate", "ultimateDebtor", "chargeBearerType", "debtor", "chargesAccount", "chargesAccountAgent"])
    at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: com.seb.payment.iso.domain.CustomerCreditTransferInitiation["PmtInf"]->com.seb.payment.iso.domain.PaymentInstruction["PmtInfId"])
In our case we have implemented our own deserializers in an abstract iterable. On:
ObjectReader objectReader = ObjectMapperFactory.instance().readerFor(this.itemClass);
the MixIn classes are lost.
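MixIns are per-ObjectMapper configuration, so a reader built from a mapper that never saw the addMixIn(...) calls knows nothing about them. A minimal sketch of the idea (assuming ObjectMapperFactory.instance() returns an ObjectMapper; the registrations mirror the ones in the test above):
// Register the MixIns on the same mapper the reader is created from.
ObjectMapper mapper = ObjectMapperFactory.instance();
mapper.addMixIn(PaymentInstruction.class, PaymentInstructionMixIn.class);
mapper.addMixIn(PartyIdentification.class, PartyIdentificationMixIn.class);
// Readers created from this mapper now see the MixIn annotations.
ObjectReader objectReader = mapper.readerFor(this.itemClass);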

The implementation of the FlinkKafkaConsumer010 is not serializable error

I created a custom class that is based on Apache Flink. The following are some parts of the class definition:
public class StreamData {
    private StreamExecutionEnvironment env;
    private DataStream<byte[]> data;
    private Properties properties;

    public StreamData() {
        env = StreamExecutionEnvironment.getExecutionEnvironment();
    }

    public StreamData(StreamExecutionEnvironment e, DataStream<byte[]> d) {
        env = e;
        data = d;
    }

    public StreamData getDataFromESB(String id, int from) {
        final Pattern TOPIC = Pattern.compile(id);
        Properties properties = new Properties();
        properties.setProperty("bootstrap.servers", "localhost:9092");
        properties.setProperty("group.id", Long.toString(System.currentTimeMillis()));
        properties.setProperty("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        properties.setProperty("value.deserializer", "org.apache.kafka.common.serialization.ByteArrayDeserializer");
        properties.put("metadata.max.age.ms", 30000);
        properties.put("enable.auto.commit", "false");
        if (from == 0)
            properties.setProperty("auto.offset.reset", "earliest");
        else
            properties.setProperty("auto.offset.reset", "latest");
        StreamExecutionEnvironment e = StreamExecutionEnvironment.getExecutionEnvironment();
        DataStream<byte[]> stream = env
                .addSource(new FlinkKafkaConsumer011<>(TOPIC, new AbstractDeserializationSchema<byte[]>() {
                    @Override
                    public byte[] deserialize(byte[] bytes) {
                        return bytes;
                    }
                }, properties));
        return new StreamData(e, stream);
    }

    public void print() {
        data.print();
    }

    public void execute() throws Exception {
        env.execute();
    }
}
Using the StreamData class, I try to get some data from Apache Kafka and print it in the main function:
StreamData stream = new StreamData();
stream.getDataFromESB("original_data", 0);
stream.print();
stream.execute();
I got the error:
Exception in thread "main" org.apache.flink.api.common.InvalidProgramException: The implementation of the FlinkKafkaConsumer010 is not serializable. The object probably contains or references non serializable fields.
Caused by: java.io.NotSerializableException: StreamData
As mentioned here, I think it's because some data type in the getDataFromESB function is not serializable, but I don't know how to solve the problem!
Your AbstractDeserializationSchema is an anonymous inner class, which therefore contains a reference to the outer StreamData class, which isn't serializable. Either let StreamData implement Serializable, or define your schema as a top-level class.
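A minimal sketch of the second option (illustrative; only the Flink classes already used in the question are assumed, and the class name is hypothetical):
// A top-level (or static nested) schema class carries no hidden reference
// to StreamData, so it serializes cleanly.
public class ByteArrayDeserializationSchema extends AbstractDeserializationSchema<byte[]> {
    @Override
    public byte[] deserialize(byte[] bytes) {
        return bytes;
    }
}
It would then be used as new FlinkKafkaConsumer011<>(TOPIC, new ByteArrayDeserializationSchema(), properties).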
It seems that you are importing FlinkKafkaConsumer010 in your code but using FlinkKafkaConsumer011. Please use the following dependency in your sbt file:
"org.apache.flink" %% "flink-connector-kafka-0.11" % flinkVersion

SerializationException of Avro Date Object (Date LogicalType)

I have a publisher that accepts a GenericRecord class.
@Override
public Future<RecordMetadata> publish(GenericRecord genericRecord) {
    Future<RecordMetadata> recordMetadataFuture =
            getPublisher().send(new ProducerRecord<>(producerConfiguration.getProperties()
                    .getProperty(ProducerConfiguration.PROPERTY_NAME_TOPIC), "sample.key", genericRecord));
    return recordMetadataFuture;
}

private KafkaProducer<String, GenericRecord> getPublisher() {
    return new KafkaProducer<>(producerConfiguration.getProperties());
}
And I have the following Avro schema:
{
    "type": "record",
    "name": "SampleDate",
    "namespace": "com.sample.data.generated.avro",
    "doc": "sample date",
    "fields": [
        {
            "name": "sampleDate",
            "type": {
                "type": "int",
                "logicalType": "date"
            }
        }
    ]
}
I have built my own serializer:
Date Serializer:
@Component
public class SampleDateSerializer implements Serializer<GenericRecord> {

    private AvroGenericSerializer serializer;

    @Override
    public void configure(Map<String, ?> configs, boolean isKey) {
        serializer = new AvroGenericSerializer(SampleDate.SCHEMA$);
    }

    @Override
    public byte[] serialize(String topic, GenericRecord data) {
        return serializer.serialize(data);
    }

    @Override
    public void close() {
    }
}
Generic Serializer:
public class AvroGenericSerializer {
    private EncoderFactory avroEncoderFactory;
    private DecoderFactory avroDecoderFactory;
    private GenericDatumWriter<GenericRecord> avroWriter;
    private GenericDatumReader<GenericRecord> avroReader;

    public AvroGenericSerializer(Schema schema) {
        avroEncoderFactory = EncoderFactory.get();
        avroDecoderFactory = DecoderFactory.get();
        avroWriter = new GenericDatumWriter<>(schema);
        avroReader = new GenericDatumReader<>(schema);
    }

    public byte[] serialize(GenericRecord data) {
        final ByteArrayOutputStream stream = new ByteArrayOutputStream();
        final BinaryEncoder binaryEncoder = avroEncoderFactory.binaryEncoder(stream, null);
        try {
            avroWriter.write(data, binaryEncoder);
            binaryEncoder.flush();
            stream.close();
            return stream.toByteArray();
        } catch (IOException e) {
            throw new RuntimeException("Can't serialize Avro object", e);
        }
    }

    public GenericRecord deserialize(byte[] bytes) {
        try {
            return avroReader.read(null, avroDecoderFactory.binaryDecoder(bytes, null));
        } catch (IOException e) {
            throw new RuntimeException("Can't deserialize Avro object", e);
        }
    }
}
However, when testing my publisher class, I am encountering the following error:
org.apache.kafka.common.errors.SerializationException: Can't convert value of class com.sample.data.generated.avro.SampleDate to class com.sample.message.serialize.SampleDateSerializer specified in value.serializer
Debugging the code, I found out that the
GenericDatumWriter.write()...
method gets a null Conversion back from the call
Conversion conversion = this.getData().getConversionByClass(datum.getClass(), logicalType);
which is implemented in org.apache.avro.generic.GenericData:
public <T> Conversion<T> getConversionByClass(Class<T> datumClass, LogicalType logicalType) {
    Map conversions = (Map) this.conversionsByClass.get(datumClass);
    return conversions != null ? (Conversion) conversions.get(logicalType.getName()) : null;
}
In this regard, is there a way for me to populate the GenericData.conversionsByClass map so that it returns the correct conversion to use for the given date logical type?
I solved it by passing a GenericData object to my GenericDatumWriter.
My generic serializer now looks like this:
public AvroGenericSerializer(Schema schema) {
    avroEncoderFactory = EncoderFactory.get();
    avroDecoderFactory = DecoderFactory.get();
    final GenericData genericData = new GenericData();
    genericData.addLogicalTypeConversion(new TimeConversions.DateConversion());
    avroWriter = new GenericDatumWriter<>(schema, genericData);
    avroReader = new GenericDatumReader<>(schema);
}
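For a symmetric round trip, the same GenericData instance can presumably be handed to the reader side as well; GenericDatumReader has a constructor taking a writer schema, a reader schema, and a GenericData:
// Sketch: let the reader apply the same logical-type conversions as the writer.
avroReader = new GenericDatumReader<>(schema, schema, genericData);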