I've got a data class with a map in it. One of the values stored in the map is a Kotlin enum class.
enum class SecurityRole {
    User,
    Admin
}
It seems to serialize and deserialize correctly, but when I try to pull it out of the map and cast it back to the proper type, it throws:
com.fff.security.SecurityRole cannot be cast to com.fff.security.SecurityRole
Looking at it in the debugger, everything seems fine; all the data is there, so it just makes no sense. I've tried Java serialization, FST serialization, and Klaxon JSON serialization, and they all fail to deserialize this enum in a way that's castable afterward. What am I doing wrong?
This happens when SecurityRole is loaded by two different ClassLoaders. Even though it is the same class, the loaded Class object is not the same instance, so the cast fails. In most cases the solution is to construct one ClassLoader with the other ClassLoader as its parent, so the class resolves through a single loader.
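As a minimal illustration (the plugin.jar path is hypothetical), building the second loader with the application loader as its parent makes both sides see the same Class instance:

import java.io.File
import java.net.URLClassLoader
import com.fff.security.SecurityRole

fun main() {
    // Hypothetical plugin loader: because the application loader is passed as its parent,
    // parent-first delegation resolves SecurityRole to the Class the application already
    // loaded, so values pulled out of the deserialized map can be cast without error.
    val parent = Thread.currentThread().contextClassLoader
    val child = URLClassLoader(arrayOf(File("plugin.jar").toURI().toURL()), parent)

    val loaded = child.loadClass("com.fff.security.SecurityRole")
    check(loaded === SecurityRole::class.java) // one Class instance, shared by both loaders
}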
Related
I am writing a custom Spring Boot starter that provides a uniform error response class for all repositories that include this starter. It also provides the corresponding exception handler.
The problem is that this error response needs an error code, which might differ between the repositories using my starter. So the solution would be to create an error response whose error code is an interface. Other repositories can then create enums implementing this interface to achieve the desired behavior.
It is written in Kotlin:
interface BaseErrorCode {
    val message: String
}
class ErrorResponse(
    val customMessage: String,
    val errorCode: BaseErrorCode,
    val timestamp: OffsetDateTime
)
Now, in another repository, I use this starter, get access to the classes above and create my error code enum:
enum class MyCustomErrorCodes : BaseErrorCode {
    FOO;

    override val message: String get() = name
}
Now I can throw an exception, and through the handler this JSON is produced:
{
    "customMessage": "My message",
    "errorCode": "FOO",
    "timestamp": "2023-01-27T12:15:31.7730645+01:00"
}
So serializing works absolutely fine.
However, when deserializing the ErrorResponse in my integration-tests, I get the following exception:
com.fasterxml.jackson.databind.exc.InvalidDefinitionException: Cannot construct instance of `my.package.BaseErrorCode` (no Creators, like default constructor, exist): abstract types either need to be mapped to concrete types, have custom deserializer, or contain additional type information
at [Source: (String)"{"customMessage":"My message","errorCode":"FOO","timestamp":"2023-01-27T12:12:06.4932227+01:00"}"; line: 1, column: 54] (through reference chain: my.package.ErrorResponse["errorCode"])
Approach 1
I know a solution where you write a custom deserializer. But this cannot be placed in the Spring Boot starter, because I would need to know the classes implementing the BaseErrorCode interface.
Approach 2
Using @JsonSubTypes, I can also tell Jackson how to handle this interface. But I would again need information about the implementing enums, which I do not have in my starter.
Is there any way to deserialize this error response
- only by modifying the starter, not each repository,
- without knowing the classes that implement BaseErrorCode, but knowing that it will ALWAYS be an enum?
I think the answer lies in intentionally using dissimilar objects. I have not tested it, but I think:
- Keep serialization as you have it, noting that you are not serializing the whole enum, but rather a single enum constant that conforms to BaseErrorCode.
- On deserialization, provide a concrete implementation of BaseErrorCode (this can be private to the starter) that you deserialize the JSON into, as sketched below. Add @JsonIgnoreProperties(ignoreUnknown = true) to guard against an unknown subclass sending more data than you can deal with.
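A minimal, untested sketch of that idea (class and function names are hypothetical; it assumes the enum was serialized to its bare name, as in the JSON above, and uses Jackson's abstract-type mapping to wire the stand-in in from the starter):

import com.fasterxml.jackson.annotation.JsonCreator
import com.fasterxml.jackson.annotation.JsonIgnoreProperties
import com.fasterxml.jackson.annotation.JsonValue
import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.databind.module.SimpleModule

// Starter-private stand-in for whatever enum the repository used.
// All the starter can know is the serialized name, so that is all it keeps.
@JsonIgnoreProperties(ignoreUnknown = true)
internal data class RawErrorCode(@get:JsonValue val name: String) : BaseErrorCode {
    override val message: String get() = name

    companion object {
        @JvmStatic
        @JsonCreator(mode = JsonCreator.Mode.DELEGATING)
        fun fromJson(name: String): RawErrorCode = RawErrorCode(name)
    }
}

// Registered once in the starter's Jackson configuration: whenever a BaseErrorCode
// has to be constructed, Jackson builds the stand-in instead of the unknown enum.
fun registerErrorCodeModule(mapper: ObjectMapper) {
    mapper.registerModule(
        SimpleModule().addAbstractTypeMapping(BaseErrorCode::class.java, RawErrorCode::class.java)
    )
}

The trade-off is that the original enum type is not recoverable in the starter; the stand-in only carries the name, which is usually enough for asserting on error codes in integration tests.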
Jackson's @JacksonInject annotation is useful for declaring properties of your deserialized object that are to be "injected" by the code calling for deserialization (as opposed to only being parsed from the JSON). To use this feature, it seems you have to either:
- set the InjectableValues on the ObjectMapper (which ties them to that ObjectMapper instance and applies to all calls made on it), or
- get an ObjectReader [via ObjectMapper.reader(InjectableValues)] and use that ObjectReader directly to parse the JSON.
Both are sketched below.
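Roughly, the two usages look like this (untested sketch; the Order type and the "currentUser" key are made up):

import com.fasterxml.jackson.annotation.JacksonInject
import com.fasterxml.jackson.annotation.JsonCreator
import com.fasterxml.jackson.annotation.JsonProperty
import com.fasterxml.jackson.databind.InjectableValues
import com.fasterxml.jackson.databind.ObjectMapper

// Hypothetical target type: "currentUser" is not in the JSON, it is injected at read time.
data class Order @JsonCreator constructor(
    @JsonProperty("id") val id: String,
    @JacksonInject("currentUser") val currentUser: String
)

fun main() {
    val injectables = InjectableValues.Std().addValue("currentUser", "alice")

    // Option 1: tie the injectable values to the ObjectMapper (applies to every call on it).
    val mapper = ObjectMapper().setInjectableValues(injectables)
    val first: Order = mapper.readValue("""{"id":"42"}""", Order::class.java)

    // Option 2: leave the mapper untouched and build a one-off ObjectReader for this read.
    val second: Order = ObjectMapper().readerFor(Order::class.java)
        .with(injectables)
        .readValue("""{"id":"42"}""")

    println(first == second)
}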
Unfortunately, neither of these is doable (from what I can see) when using Spring's RestTemplate without jumping through a lot of hoops. I don't want every object deserialized by the RestTemplate to use injected values; nor do I see a way to customize how RestTemplate uses the underlying ObjectMapper [1].
Is there a way to incorporate InjectableValues into RestTemplate's JSON deserialization?
[1] I suppose I could write my own custom HttpMessageConverter and figure out how to inject that into RestTemplate. But even then I don't see a way to pass the InjectableValues into ObjectMapper's read... methods. It's a lot of work even if I could.
I'm trying to solve the problem of serializing and deserializing Box<SomeTrait>. I know that in the case of a closed type hierarchy, the recommended way is to use an enum and there are no issues with their serialization, but in my case using enums is an inappropriate solution.
At first I tried to use Serde as it is the de-facto Rust serialization mechanism. Serde is capable of serializing Box<X> but not in the case when X is a trait. The Serialize trait can’t be implemented for trait objects because it has generic methods. This particular issue can be solved by using erased-serde so serialization of Box<SomeTrait> can work.
The main problem is deserialization. To deserialize polymorphic type you need to have some type marker in serialized data. This marker should be deserialized first and after that used to dynamically get the function that will return Box<SomeTrait>.
std::any::TypeId could be used as a marker type, but the main problem is how to dynamically get the deserialization function. I do not consider the option of registering a function for each polymorphic type that should be called manually during application initialization.
I know two possible ways to do it:
- Languages that have runtime reflection, like C#, can use it to get the deserialization method.
- In C++, the cereal library uses the magic of static objects to register deserializers in a static map at library initialization time.
But neither of these options is available in Rust. How can deserialization of polymorphic objects be added in Rust if at all?
This has been implemented by dtolnay in the typetag crate.
The concept is quite clever and is explained in the README:
How does it work?
We use the inventory crate to produce a registry of impls of your trait, which is built on the ctor crate to hook up initialization functions that insert into the registry. The first Box<dyn Trait> deserialization will perform the work of iterating the registry and building a map of tags to deserialization functions. Subsequent deserializations find the right deserialization function in that map. The erased-serde crate is also involved, to do this all in a way that does not break object safety.
To summarize, every implementation of the trait declared as [de]serializable is registered at compile time, and the registry is consulted at runtime when a trait object is [de]serialized.
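For illustration, a minimal sketch of what this looks like with typetag (trait and struct names are made up; serde, serde_json and typetag are assumed as dependencies):

use serde::{Deserialize, Serialize};

#[typetag::serde]
trait Shape {
    fn area(&self) -> f64;
}

#[derive(Serialize, Deserialize)]
struct Circle {
    radius: f64,
}

#[typetag::serde]
impl Shape for Circle {
    fn area(&self) -> f64 {
        std::f64::consts::PI * self.radius * self.radius
    }
}

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let original: Box<dyn Shape> = Box::new(Circle { radius: 2.0 });
    // Serialization embeds a tag naming the concrete type ("Circle").
    let json = serde_json::to_string(&original)?;
    // Deserialization reads the tag and finds Circle's deserializer in the registry
    // that inventory/ctor built at startup.
    let restored: Box<dyn Shape> = serde_json::from_str(&json)?;
    assert_eq!(original.area(), restored.area());
    Ok(())
}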
All your libraries could provide a registration routine, guarded by std::sync::Once, that registers some identifier into a common static mut, but obviously your program must call them all.
I've no idea if TypeId yields consistent values across recompiles with different dependencies.
A library to do this should be possible. To create such a library, we would create a bidirectional mapping from TypeId to type name before using the library, and then use that for serialization/deserialization with a type marker. It would be possible to have a function for registering types that are not owned by your package, and to provide a macro annotation that automatically does this for types declared in your package.
If there's a way to access a type ID in a macro, that would be a good way to instrument the mapping between TypeId and type name at compile time rather than runtime.
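A rough sketch of that manual-registration idea, using a string tag instead of TypeId and a OnceLock-guarded map in place of the static mut (all names and the toy payload format are hypothetical):

use std::collections::HashMap;
use std::sync::{Mutex, OnceLock};

trait Shape {
    fn area(&self) -> f64;
}

struct Circle {
    radius: f64,
}

impl Shape for Circle {
    fn area(&self) -> f64 {
        std::f64::consts::PI * self.radius * self.radius
    }
}

// A deserializer takes the raw payload for one value and rebuilds the trait object.
type DeserializeFn = fn(&str) -> Box<dyn Shape>;

fn registry() -> &'static Mutex<HashMap<&'static str, DeserializeFn>> {
    static REGISTRY: OnceLock<Mutex<HashMap<&'static str, DeserializeFn>>> = OnceLock::new();
    REGISTRY.get_or_init(|| Mutex::new(HashMap::new()))
}

// The per-library registration routine: each library registers its own concrete types,
// and the program must remember to call every library's routine during startup.
fn register(tag: &'static str, f: DeserializeFn) {
    registry().lock().unwrap().insert(tag, f);
}

fn deserialize_shape(tag: &str, payload: &str) -> Option<Box<dyn Shape>> {
    registry().lock().unwrap().get(tag).copied().map(|f| f(payload))
}

fn circle_from_payload(payload: &str) -> Box<dyn Shape> {
    // Toy payload format: just the radius as text.
    Box::new(Circle { radius: payload.parse().unwrap_or(0.0) })
}

fn main() {
    register("Circle", circle_from_payload);
    let shape = deserialize_shape("Circle", "2.0").expect("tag not registered");
    println!("area = {}", shape.area());
}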
I am currently using Guava's ForwardingMap as a base class and have numerous types that extend it. I need to maintain the Map type because instances need to be treated as such by consumers. So even though ForwardingMap uses composition internally, the external interface still has to be a Map.
As a map, deserializing just the key-value entries using @JsonAnyGetter and @JsonAnySetter works fine, but I also need to take into account custom properties, declared with @JsonProperty, which may be part of the instance as well.
So, when serializing or deserializing, I want all of the entries plus any custom properties that may be part of the extended class.
I have looked at numerous kinds of solutions, such as using Shape.OBJECT and applying interfaces, but none of them seems to work properly for me. I believe I need to create a custom serializer/deserializer to handle the combined bean + map processing in Jackson, but cannot find any examples of how to do this.
These links help explain what I am trying to do, so far with no luck:
http://www.cowtowncoder.com/blog/archives/2013/10/entry_482.html
How to serialize with Jackson a java.util.Map based class (cannot change base of ForwardingMap)
Jackson - ignore Map superclass when serializing (cannot change base because it needs to remain a Map)
Ideally, I would like an example or pointer for how to serialize and deserialize an instance that extends ForwardingMap using @JsonAnySetter and @JsonAnyGetter and also has custom properties using @JsonProperty (a sketch of the kind of model I mean follows the desired output below).
I would want my output to look like:
"modules": {
    "MyModel": {                  <-- extends ForwardingMap<>
        "domain": "typeinfo",
        "property": "hello",      <-- comes from @JsonProperty
        "another": "GoodBye",     <-- comes from @JsonAnyGetter
        "another2": 50            <-- comes from @JsonAnyGetter
    }
}
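For reference, a sketch of roughly what such a model looks like; this only restates the setup described above (names are hypothetical), it is not a working answer:

import com.fasterxml.jackson.annotation.JsonAnyGetter
import com.fasterxml.jackson.annotation.JsonAnySetter
import com.fasterxml.jackson.annotation.JsonProperty
import com.google.common.collect.ForwardingMap

// Hypothetical model: still a Map to its consumers (via ForwardingMap),
// but it also carries one declared bean property next to the free-form entries.
class MyModel : ForwardingMap<String, Any?>() {

    private val backing = linkedMapOf<String, Any?>()

    // Satisfies the Map contract by delegating to the backing map.
    override fun delegate(): MutableMap<String, Any?> = backing

    // Declared property that should appear alongside the entries ("property": "hello").
    @get:JsonProperty("property")
    var property: String? = null

    // Free-form entries ("another", "another2", ...) flow through these two.
    @JsonAnyGetter
    fun anyEntries(): Map<String, Any?> = backing

    @JsonAnySetter
    fun putEntry(key: String, value: Any?) {
        backing[key] = value
    }
}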
So we have a Java class with two ArrayLists of generics. It looks like:
public class Blah
{
    public ArrayList<ConcreteClass> a;
    public ArrayList<BaseClass> b;
}
By using [ArrayElementType('ConcreteClass')] in the ActionScript class, we are able to get all the "a" elements converted fine. However, with "b", since what actually comes across the wire is a heterogeneous mix of classes like BaseClassImplementation1, BaseClassImplementation2, etc., the elements get typed as plain Objects. Is there a way to convert them to the specific concrete classes, assuming that a strongly typed AS version of each Java class exists on the client side?
thanks for your help!
Regis
To ensure that all of your DTO classes are marshalled between AS and Java, you need to define each remote class as a "remote class" in AS by using the "RemoteClass" attribute pointing to the Java class definition, like this: [RemoteClass(alias="com.myco.class")].
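For example, each concrete subclass that can show up in the heterogeneous list would carry its own alias (the package and field here are made up):

package com.myco
{
    // The alias must match the fully qualified name of the Java class BlazeDS sends.
    [RemoteClass(alias="com.myco.BaseClassImplementation1")]
    public class BaseClassImplementation1 extends BaseClass
    {
        public var someField:String;
    }
}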
BlazeDS will perform introspection on the class as it is being serialized/deserialized and convert it appropriately (see the doc below). It doesn't matter how the classes are packed or nested in an array; as long as the class can be introspected, it should work.
If you need special serialization for a class, you can create your own serialization proxies (called bean proxies) by extending AbstractProxy and loading them into BlazeDS using the PropertyProxyRegistry register method on startup.
You will find most of this in the Blaze developers guide http://livedocs.adobe.com/blazeds/1/blazeds_devguide/.
For creating your own bean proxy class, look here: http://livedocs.adobe.com/blazeds/1/javadoc/flex/messaging/io/BeanProxy.html