Java-8 LocalDateTime serializing with DateTimeFormatter - jackson

I am using Dropwizard 0.8.4 and jackson-datatype-jsr310.
a)
Would like to serialise my LocalDateTime to JSON output as DateTimeFormatter.ISO_INSTANT, but I could not find any clean way to do that (without implementing custom serialising classes). Shouldn't this be a very standard thing to do with simple annotations?
Currently my code works with:
@JsonProperty
@JsonFormat(shape=JsonFormat.Shape.STRING, pattern="yyyy-MM-dd'T'HH:mm:ss'Z'")
@JsonSerialize(using = LocalDateTimeSerializer.class)
public LocalDateTime getTime() {
...
}
but that pattern isn't really the same as ISO_INSTANT, and as I debugged it a bit, ISO_INSTANT cannot even be represented by a string pattern.
b) Would I be better off using Joda-Time, which seems to be supported by Dropwizard by default?
c) Is there a way to skip serialising a Java field into JSON based on its value (a boolean being false)? I tried @JsonFilter and SimpleBeanPropertyFilter but didn't get them to work, and they seem to be deprecated as well. @JsonProperty(defaultValue=...) didn't seem to work either.
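Not from the thread, but one possible direction for (a) and (c), assuming a reasonably recent jackson-datatype-jsr310 (JavaTimeModule; older releases call it JSR310Module). The class and field names below are illustrative only:
import java.time.Instant;
import com.fasterxml.jackson.annotation.JsonInclude;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.SerializationFeature;
import com.fasterxml.jackson.datatype.jsr310.JavaTimeModule;

public class Example {
    public static class Event {
        // ISO_INSTANT strictly formats an Instant (always UTC, trailing 'Z'),
        // so exposing an Instant avoids faking a zone on a zone-less LocalDateTime.
        public Instant time = Instant.now();

        // (c) omit the field when it holds its default value (false for a boolean)
        @JsonInclude(JsonInclude.Include.NON_DEFAULT)
        public boolean active = false;
    }

    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        mapper.registerModule(new JavaTimeModule());                    // jackson-datatype-jsr310
        mapper.disable(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS); // ISO-8601 strings, not numeric arrays
        // prints something like {"time":"2015-09-01T12:34:56.789Z"}, with "active" omitted
        System.out.println(mapper.writeValueAsString(new Event()));
    }
}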

Related

Best way to define model using AndroidAnnotations @Rest and Kotlin?

I am totally new to Android and Kotlin and I was looking into Android Annotations.
I managed to decode a JSON response using the following code:
class ExampleModel {
@JvmField
final var id: Int = 0
lateinit var title: String
var description: String? = null
var author: Author? = null
}
@Rest(
rootUrl = "...",
converters = [MappingJackson2HttpMessageConverter::class]
)
interface ExampleClient {
@Get("/promotions")
fun getModels(): List<ExampleModel>
}
Now it does work but there are a couple of questions I'd like to ask.
Is it possible to use data classes? I tried but I kept getting an error from MappingJackson2HttpMessageConverter saying that there was no constructor available.
Is it somehow possible to just ignore extra keys that might appear in the JSON? Let's say that I am not interested in the author data for now, is there a way to just remove its declaration without having the decoding fail with "unexpected key"?
Consider that I usually work with Swift so if you could point me to the "Codable" equivalent in Kotlin I would really appreciate it.
Cheers
Kotlin data classes don't have a default constructor, which is usually required by JSON deserialization libraries. A data class requires at least one constructor parameter, but you can work around it by defining default values (null works). For example:
data class Pojo(val name: String? = null, val age: Int? = null)
Such code allows you to call the Pojo() constructor. It should work, but it's better to use a JSON deserializer that is more Kotlin-native, or to generate data classes with AutoValue.
Jackson, which you're using here, lets you ignore unknown fields with @JsonIgnoreProperties.
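A minimal sketch combining both points, assuming Jackson annotations are on the classpath (the fields mirror the question's ExampleModel):
import com.fasterxml.jackson.annotation.JsonIgnoreProperties

// Default values give Jackson a usable no-arg constructor;
// ignoreUnknown = true silently drops keys you don't model (e.g. "author").
@JsonIgnoreProperties(ignoreUnknown = true)
data class ExampleModel(
    val id: Int = 0,
    val title: String? = null,
    val description: String? = null
)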
If you're learning Android, don't start with AndroidAnnotations if you don't have to. It's not a very popular or modern solution. I used it in a few projects back in the day, and those were very difficult to maintain or to onboard new developers onto. Look into Android Architecture Components and Jetpack; Google has made a few nice codelabs. Also, for JSON, pick Moshi or Gson.

Jackson private constructors, JDK 9+, Lombok

I'm looking for documentation on how Jackson works with private constructors on immutable types. I'm using Jackson 2.9.6 and the default object mapper provided by Spring Boot 2, running on JDK 10.0.1.
Given JSON:
{"a":"test"}
and given a class like:
public class ExampleValue {
private final String a;
private ExampleValue() {
this.a = null;
}
public String getA() {
return this.a;
}
}
Deserialisation (surprisingly, at least to me) seems to work.
Whereas this does not:
public class ExampleValue {
private final String a;
private ExampleValue(final String a) {
this.a = a;
}
public String getA() {
return this.a;
}
}
And this does:
public class ExampleValue {
private final String a;
@java.beans.ConstructorProperties({"a"})
private ExampleValue(final String a) {
this.a = a;
}
public String getA() {
return this.a;
}
}
My assumption is that the only way the first example can work is by using reflection to set the value of the final field (which I presume it does via java.lang.reflect.AccessibleObject.setAccessible(true)).
Question 1: am I right that this is how Jackson works in this case? I presume this would have the potential to fail under a security manager which does not allow this operation?
My personal preference, therefore, would be the last code example above, since it involves less "magic" and works under a security manager. However, I have been slightly confused by various threads I've found about Lombok and constructor generation: it used to generate @java.beans.ConstructorProperties(...) by default, but then changed the default to no longer do this, and now lets you opt back in by configuring lombok.anyConstructor.addConstructorProperties=true.
Some people (including in the lombok release notes for v1.16.20) suggest that:
Oracle more or less broke this annotation with the release of JDK9, necessitating this breaking change.
but I'm not precisely clear on what is meant by this: what did Oracle break? For me, using JDK 10 with Jackson 2.9.6, it seems to work fine.
Question 2: Is anyone able to shed any light on how this annotation was broken in JDK 9, and why Lombok no longer considers it desirable to generate this annotation by default?
Answer 1: This is exactly how it works (also to my surprise). According to the Jackson documentation on Mapper Features, the settings INFER_PROPERTY_MUTATORS, ALLOW_FINAL_FIELDS_AS_MUTATORS, and CAN_OVERRIDE_ACCESS_MODIFIERS all default to true. Therefore, in your first example, Jackson
creates an instance using the private constructor with the help of AccessibleObject#setAccessible (CAN_OVERRIDE_ACCESS_MODIFIERS),
detects a fully accessible getter method for a (private) field and considers the field a mutable property (INFER_PROPERTY_MUTATORS),
ignores the final on the field due to ALLOW_FINAL_FIELDS_AS_MUTATORS, and
gains access to that field using AccessibleObject#setAccessible (CAN_OVERRIDE_ACCESS_MODIFIERS).
However, I agree that one should not rely on that, because as you said a security manager could prohibit it, or Jackson's defaults may change. Furthermore, it feels "not right" to me, as I would expect that class to be immutable and the field to be unsettable.
Example 2 does not work because Jackson does not find a usable constructor (it cannot map the field names to the parameter names of the only existing constructor, as those names are not present at runtime). @java.beans.ConstructorProperties in your third example bypasses this problem, as Jackson explicitly looks for that annotation at runtime.
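Not from the original answer, but for completeness, a sketch of one common alternative that names the constructor parameters for Jackson explicitly, so the mapping does not depend on @ConstructorProperties or on parameter names being retained at runtime:
import com.fasterxml.jackson.annotation.JsonCreator;
import com.fasterxml.jackson.annotation.JsonProperty;

public class ExampleValue {
    private final String a;

    // Explicit parameter mapping; works whether or not parameter names are compiled in.
    @JsonCreator
    private ExampleValue(@JsonProperty("a") final String a) {
        this.a = a;
    }

    public String getA() {
        return this.a;
    }
}
Compiling with -parameters plus jackson-module-parameter-names is another way to make the parameter names visible to Jackson.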
Answer 2:
My interpretation is that @java.beans.ConstructorProperties is not really broken, but simply can no longer be assumed to be present with Java 9+. This is due to its membership in the java.desktop module (see, e.g., this thread for a discussion of the topic). Since modularized Java applications may have a module path without this module, Lombok would break such applications if it generated the annotation by default. (Furthermore, this annotation is generally not available on the Android SDK.)
So if you have a non-modularized application, or a modularized application with java.desktop on the module path, it's perfectly fine to let Lombok generate the annotation by setting lombok.anyConstructor.addConstructorProperties=true, or to add the annotation manually if you are not using Lombok.
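For reference, that Lombok setting lives in a lombok.config file at the project root (shown here as a sketch):
# lombok.config
lombok.anyConstructor.addConstructorProperties = true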

How to convert existing POCO classes in C# to google Protobuf standard POCO

I have POCO classes, and I use Newtonsoft Json for serialization. Now I want to migrate to Google Protocol Buffers. Is there any way I can migrate all my classes (not manually) so that I can use Protocol Buffers for serialization and deserialization?
Do you just want it to work? The absolute simplest way to do this would be to use protobuf-net and add [ProtoContract(ImplicitFields = ImplicitFields.AllPublic)]. What this does is tell protobuf-net to make up the field numbers, which it does by taking all the public members, sorting them alphabetically, and just counting upwards. Then you can use your type with ProtoBuf.Serializer and it should behave in the way you expect.
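As a rough sketch of that implicit approach (the type and members here are illustrative, not from the question):
using ProtoBuf;

// Field numbers are inferred from the public members, sorted alphabetically.
[ProtoContract(ImplicitFields = ImplicitFields.AllPublic)]
public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}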
This is simple, but it isn't very robust. If you add, remove or rename members it can all get out of sync. The problem here is that the protocol buffers format doesn't include names - just field numbers, and it is much harder to guarantee numbers over time. If your type is likely to change, you probably want to define field numbers explicitly. For example:
[ProtoContract]
public class Foo {
[ProtoMember(1)]
public int Id {get;set;}
[ProtoMember(2)]
public List<string> Names {get;} = new List<string>();
}
One other thing to watch out for would be non-zero default values. By default protobuf-net assumes certain things about implicit default values. If you are routinely using non-zero default values without doing it very carefully, protobuf-net may misunderstand you. You can turn that off globally if you desire:
RuntimeTypeModel.Default.UseImplicitZeroDefaults = false;
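And a minimal usage sketch with ProtoBuf.Serializer, assuming the Foo type above (the file name is illustrative):
using System.IO;
using ProtoBuf;

// Round-trip Foo through a protobuf-encoded file.
using (var stream = File.Create("foo.bin"))
{
    Serializer.Serialize(stream, new Foo { Id = 1 });
}
using (var stream = File.OpenRead("foo.bin"))
{
    Foo copy = Serializer.Deserialize<Foo>(stream);
}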

F#, Json.NET 6.0 and WebApi - serialization of record types

Json.NET 6.0.1 adds F# support for records and discriminated unions. When serializing an F# record type using Json.NET, I now get nicely formatted JSON.
The serialization is done as follow:
let converters = [| (new StringEnumConverter() :> JsonConverter) |]
JsonConvert.SerializeObject(questionSet, Formatting.Indented, converters)
However, when I try to expose my F# types through an ASP.NET WebApi 5.0 service, written in C#, the serialized JSON includes an @-sign in front of all properties. The @-sign comes from the internal backing field of the record type (this used to be a known problem with Json.NET and F#).
But since I'm using the updated version of Json.NET, shouldn't the result be the same as when calling JsonConvert? Or does JsonConvert behave differently from JsonTextWriter and JsonTextReader?
As far as I can tell from reading the JsonMediaTypeFormatter in the WebApi source, JsonTextWriter and JsonTextReader are used by WebApi.
You can adorn your records with the [<CLIMutable>] attribute:
[<CLIMutable>]
type MyDtr = {
Message : string
Time : string }
That's what I do.
For nice XML formatting, you can use:
GlobalConfiguration.Configuration.Formatters.XmlFormatter.UseXmlSerializer <- true
For nice JSON formatting, you can use:
config.Formatters.JsonFormatter.SerializerSettings.ContractResolver <-
Newtonsoft.Json.Serialization.CamelCasePropertyNamesContractResolver()
I believe it's because the backing fields that are emitted by F# records don't follow the same naming convention as C# property backing fields.
The easiest way I've found to get around this is to change the ContractResolver at the startup of your web application from System.Net.Http.Formatting.JsonContractResolver to Newtonsoft.Json.Serialization.DefaultContractResolver instead:
Formatters.JsonFormatter.SerializerSettings.ContractResolver <- DefaultContractResolver()
You'll then get all JSON formatting done via Newtonsoft's JSON formatter rather than the .NET one.

Is protobuf-net suited for serializing arbitrary object/domain models?

I have been exploring the CQRS/DDD-principles and patterns for a while now and have started implementing a sample project where I have split my storage-model into a WriteModel and a ReadModel. The WriteModel will use a simple NoSQL-like database where aggregates are stored in a key-value style, with value being just a serialized version of the aggregate.
I am now looking at ProtoBuf-Net for serializing and deserializing my domain model aggregates in and out of storage. Other than this post I haven't found any guidance or tips for using ProtoBuf-Net in this area. The point is that the (ideal) requirements for serialization and deserialization of aggregates are that the domain model should have as little knowledge as possible about this infrastructural concern, which implies the following:
No attributes on the classes
No constructors, getters, setters or any other piece of code just for the sake of serialization.
Ability to use any (custom) type possible and have it serialized/deserialized.
Thus far I have implemented just the serialization of the first versions of my aggregates which works perfectly fine. I use the RuntimeTypeModel.Default-instance to configure the MetaModel at runtime and have UseConstructor = false everywhere, which enables me to completely separate the serialization mechanics from my domain-assembly. I have even implemented a custom post-deserialization mechanism that enables me to just-in-time initialize fields after ProtoBuf-Net has deserialized it into a valid instance. So suppose I have class AggregateA like so:
[Version(1)]
public sealed class AggregateA
{
private readonly int _x;
private readonly string _y;
...
}
Then in my serialization-library I have code something along the following lines:
var metaType = RuntimeTypeModel.Default.Add(typeof(AggregateA), false);
metaType.UseConstructor = false;
metaType.AddField(1, "_x");
metaType.AddField(2, "_y");
...
However, I realize that up to this point I have only implemented the basic scenario, and I am now starting to think about how to approach versioning of my model. I am particularly interested in larger refactoring scenarios, where type A has been split into types A1 and A2, for example:
[Version(2)]
public sealed class AggregateA1
{
private readonly int _x;
...
}
[Version(2)]
public sealed class AggregateA2
{
private readonly string _y;
...
}
Suppose I have a serialized bunch of instances of AggregateA, but now my domain model knows only AggregateA1 and AggregateA2, how would you handle this scenario with ProtoBuf-Net?
A second question deals with point 3: is ProtoBuf-Net capable of handling arbitrary types if you're willing to put in some extra configuration-effort? I've read about exceptions raised when using the DateTimeOffset-type, which makes me think not all types can be serialized by the framework out-of-the-box, but can I serialize these types by registering them in the RuntimeTypeModel? Should I even want to go there? Or better to forget about serializing common .NET types other than the simple ones?
protobuf-net is intended to work with predictable known models. It is true that everything can be configured at runtime, but I have not put any thought as to how to handle your A1/A2 scenario, precisely because that is not a supported scenario (in my defense, I can't see that working nicely with most serializers). Thinking off the top of my head, if you have the configuration/mapping data somewhere, then you could simply deserialize twice; i.e. as long as we still tell it that AggregateA1._x maps to 1 and AggregateA2._y maps to 2, you can do:
object a1 = model.Deserialize(source, null, typeof(AggregateA1));
source.Position = 0; // rewind
object a2 = model.Deserialize(source, null, typeof(AggregateA2));
However, more complex tweaks would require additional thought.
Re "arbitrary types"... define "arbitrary" ;p In particular, there is support for "surrogate" types which can be useful for some transformations - but without a very specific "problem statement" it is hard to answer completely.
Summary:
protobuf-net has an intended usage, which includes both serialization-aware (attributed, etc) and non-aware scenarios (runtime configuration, etc) - but it also works for a range of more bespoke scenarios (letting you drop to the raw reader/writer API if you want to). It does not and cannot guarantee to be a direct fit for every serialization scenario imaginable, and how well it behaves will depend on how far from that scenario you are.