How to use serde to serialize/deserialize a HashMap with enum keys that contain data? [duplicate]

This question already has answers here:
Can't serialize when using enum as key in HashMap
(2 answers)
Closed 9 months ago.
I need to serialize and deserialize a HashMap, h, with enum Foo as a key to and from JSON. Foo's variants contain data (here simplified to u32, but actually are enums themselves):
use serde::{Serialize, Deserialize};
use serde_json;
use std::collections::HashMap;

#[derive(Serialize, Deserialize, Hash, Eq, PartialEq)]
enum Foo {
    A(u32),
    B(u32),
}

// Tried several different things here! Just deriving the relevant traits
// compiles, but fails at runtime.
#[derive(Serialize, Deserialize)]
struct Bar {
    h: HashMap<Foo, i32>, // The i32 value type is arbitrary
}

fn main() {
    let mut bar = Bar { h: HashMap::new() };
    bar.h.insert(Foo::A(0), 1);
    // I want to be able to do this
    let bar_string = serde_json::to_string(&bar).unwrap();
    let bar_deser: Bar = serde_json::from_str(&bar_string).unwrap();
}
Since the JSON specification requires keys to be strings, I know I need to customise the way serialization and deserialization are done for Foo when it is a key in h. I've tried the following:
Custom implementations of Serialize and Deserialize (e.g. in the accepted answer here)
Serde attributes e.g.: #[serde(into = "String", try_from = "String")] + implementing Into<String> for Foo and TryFrom<String> for Foo (described in the answers here)
Using the 'serde_with' crate (also described in the answers here).
Unfortunately none of these have worked - all eventually failed with different panics after compiling successfully.
What is a good way to achieve what I want in serde, if there is one? If not, I'd be very grateful for any workaround suggestions.
Bonus question: why doesn't serde/serde_json provide a default serialization to a String + deserialization of an enum like Foo when it is used as a key in a HashMap?

Here is a working solution with serde_as from the serde_with helper crate:
use serde::{Deserialize, Serialize};
use serde_json;
use std::collections::HashMap;

#[derive(Serialize, Deserialize, Hash, Eq, PartialEq, Debug)]
enum Foo {
    A(u32),
    B(u32),
}

#[serde_with::serde_as]
#[derive(Serialize, Deserialize, Debug)]
struct Bar {
    #[serde_as(as = "HashMap<serde_with::json::JsonString, _>")]
    h: HashMap<Foo, i32>,
}

fn main() {
    let mut bar = Bar { h: HashMap::new() };
    bar.h.insert(Foo::A(0), 1);
    let bar_string = serde_json::to_string(&bar).unwrap();
    let bar_deser: Bar = serde_json::from_str(&bar_string).unwrap();
    println!("{:?}", bar);
    println!("{}", bar_string);
    println!("{:?}", bar_deser);
}
This prints:
Bar { h: {A(0): 1} }
{"h":{"{\"A\":0}":1}}
Bar { h: {A(0): 1} }
Bonus question: why doesn't serde/serde_json provide a default serialization to a String + deserialization of an enum like Foo when it is used as a key in a HashMap?
As you noticed, all of those approaches compile and then fail at runtime.
That is because serde itself does have a data-model representation of a map that works with arbitrary Rust key types, so serde is not where the restriction lives.
The problem comes when serde_json tries to feed the map into its JSON Serializer. And yes, this Serializer could indeed convert any key into a JSON string and back if that were implemented.
I think it is a design decision that this isn't done: if JSON only supports string keys, then so should the JSON serializer. If you want to use other types as keys, you should convert them to strings explicitly, as in the code example above.
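To make the failure mode concrete, here is a minimal sketch of the runtime error (the exact error string is an assumption based on current serde_json behaviour):
use serde::Serialize;
use std::collections::HashMap;

#[derive(Serialize, Hash, Eq, PartialEq)]
enum Foo {
    A(u32),
}

fn main() {
    let mut h = HashMap::new();
    h.insert(Foo::A(0), 1);
    // serde walks the map just fine, but serde_json's Serializer
    // rejects the non-string key at runtime:
    let err = serde_json::to_string(&h).unwrap_err();
    println!("{}", err); // e.g. "key must be a string"
}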

Related

Rust define trait interface for any type?

I am building a rendering engine. One thing I want is to handle management of arbitrary mesh data, regardless of representation.
My idea was: define a trait that enforces a function signature, and then serialization can be handled by the user while I handle all the GPU stuff. This is the trait I made:
pub enum GpuAttributeData {
    OwnedData(Vec<Vec<i8>>, Vec<u32>),
}

pub trait GpuSerializable {
    fn serialize(&self) -> GpuAttributeData;
}
So very simple, give me a couple of arrays.
When I tested things inside the crate it worked, but then I moved my example outside the crate, i.e. I have this snippet in an example:
use std::mem::size_of;
use std::ptr::copy_nonoverlapping;

impl<const N: usize> GpuSerializable for [Vertex; N] {
    fn serialize(&self) -> GpuAttributeData {
        let size = size_of::<Vertex>() * self.len();
        let data = unsafe {
            let mut data = Vec::<i8>::with_capacity(size);
            copy_nonoverlapping(self.as_ptr() as *const i8, data.as_mut_ptr(), size);
            data.set_len(size);
            data
        };
        // let indices: Vec<usize> = (0..self.len()).into_iter().collect();
        let indices = vec![0, 1, 2];
        let mut buffers: Vec<Vec<i8>> = Vec::new();
        buffers.push(data);
        return GpuAttributeData::OwnedData(buffers, indices);
    }
}
Which gives me this error:
error[E0117]: only traits defined in the current crate can be implemented for arbitrary types
--> examples/01_spinning_triangle/main.rs:41:1
|
41 | impl <const N : usize> GpuSerializable for [Vertex; N]
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^-----------
| | |
| | this is not defined in the current crate because arrays are always foreign
| impl doesn't use only types from inside the current crate
|
= note: define and implement a trait or new type instead
This utterly breaks my design. The whole point of what I am trying to achieve is, anyone anywhere should be able to implement that trait, inside or outside any crate, to be able to serialize whatever data they have, in whatever esoteric format it is.
Can I somehow bypass this restriction? If not, is there another way I could enforce at runtime "give me an object that has this function signature among its methods"?
Rust has the orphan rule, which says that any implementation of a trait on a type must exist in the same crate as at least one of the trait or the type. In other words, both types can't be foreign. (It's a bit more complex than this -- for example, you can implement a foreign generic trait on a foreign type if a generic type argument to the generic trait is a type declared in the local crate.)
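For instance, here is a minimal sketch of that exception (the names are made up for illustration):
struct MyLocalType;

// Allowed: both `From` and `Vec` are foreign, but the local type
// `MyLocalType` appears as the foreign trait's generic argument.
impl From<MyLocalType> for Vec<u8> {
    fn from(_: MyLocalType) -> Vec<u8> {
        Vec::new()
    }
}

fn main() {
    let bytes: Vec<u8> = MyLocalType.into();
    assert!(bytes.is_empty());
}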
This is why you frequently see so-called "newtypes" in Rust, which are typically unit structs with a single member, whose sole purpose is to implement a foreign trait on a foreign type. The newtype lives in the same crate as the implementation, so the compiler accepts the implementation.
This could be realized in your example with a newtype around the array type:
#[repr(transparent)]
struct VertexArray<const N: usize>(pub [Vertex; N]);
impl<const N: usize> Deref for VertexArray<N> {
type Target = [Vertex; N];
fn deref(&self) -> &Self::Target {
&self.0
}
}
impl<const N: usize> DerefMut for VertexArray<N> {
fn deref_mut(&mut self) -> &mut Self::Target {
&mut self.0
}
}
impl<const N: usize> GpuSerializable for VertexArray<N> {
// ...
}
Note that because of #[repr(transparent)], the layout of both VertexArray<N> and [Vertex; N] are guaranteed to be identical, meaning you can transmute between them, or references to them. This allows you to, for example, reborrow a &[Vertex; N] as a &VertexArray<N> safely, which means you can store all of your data as [Vertex; N] and borrow it as the newtype at zero cost whenever you need to interact with something expecting an implementation of GpuSerializable.
impl<const N: usize> AsRef<VertexArray<N>> for [Vertex; N] {
    fn as_ref(&self) -> &VertexArray<N> {
        unsafe { std::mem::transmute(self) }
    }
}
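A hypothetical usage sketch, assuming a Vertex type and the impls above (upload is a made-up stand-in for whatever consumes the trait):
fn upload<const N: usize>(vertices: &[Vertex; N]) {
    // reborrow the plain array as the newtype at zero cost...
    let wrapped: &VertexArray<N> = vertices.as_ref();
    // ...and use the trait implementation through it
    let _data = wrapped.serialize();
}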

Deserialize a Vec of values into a struct by implementing serde `Deserializer`

I have a data format that uses a custom enum of values. From the database I receive a Vec<MyVal>. I want to convert this to a struct (and fail if that doesn't work). I want to use serde because after processing I want to return the API response as JSON, and serde makes this super easy.
Playground link for the example
enum MyVal {
    Bool(bool),
    Text(String),
}

#[derive(Serialize, Deserialize)]
struct User {
    name: String,
    registered: bool,
}
The challenge is converting the data format into the serde data model. For this I can implement a Deserializer whose deserialize_seq visits the Vec<MyVal> as if it were a sequence and returns the values one by one. The visitor for User can consume the visited values to build the struct User.
However, I'm not able to figure out how to convert the Vec into something visit_seq can consume. Here's some sample code.
struct MyWrapper(Vec<MyVal>);

impl<'de> Deserializer<'de> for MyWrapper {
    type Error = de::value::Error;

    // skip unnecessary methods

    fn deserialize_seq<V>(self, visitor: V) -> Result<V::Value, Self::Error>
    where
        V: serde::de::Visitor<'de>,
    {
        let convert_myval = move |myval: &MyVal| match myval {
            MyVal::Bool(b) => visitor.visit_bool(*b),
            MyVal::Text(s) => visitor.visit_string(s.clone()),
        };
        // now I have a vec of serde values
        let rec_vec: Vec<Result<V::Value, Self::Error>> =
            self.0.iter().map(|myval| convert_myval(myval)).collect();
        // giving error!!
        visitor.visit_seq(rec_vec.into_iter())
    }
}
The error is
92 | visitor.visit_seq(rec_vec.into_iter())
| --------- ^^^^^^^^^^^^^^^^^^^ the trait `SeqAccess<'de>` is not implemented for `std::vec::IntoIter<Result<<V as Visitor<'de>>::Value, _::_serde::de::value::Error>>`
| |
| required by a bound introduced by this call
So I looked into SeqAccess, and it has an implementor that requires that whatever is passed to it implements the Iterator trait. But I thought I had that covered, because Vec::into_iter() returns an IntoIter, a consuming iterator which does implement the Iterator trait.
So I'm completely lost as to what's going wrong here. Surely there must be a way to visit a Vec<Result<Value, Error>> as a seq?
Preface: The question wants to treat a Rust data structure Vec<MyVal> like a serialized piece of data (e.g. like a JSON string) and allow deserializing it into any other Rust data structure that implements Deserialize. This is quite unusual, but not without precedent. And since the MyVals are actually pieces of data with various types which get returned from a database access crate, this approach does make sense.
The main problem with the code in the question is that it tries to deserialize two different data structures (MyWrapper(Vec<MyVal>) and MyVal) with a single Deserializer. The obvious way out is to define a second struct MyValWrapper(MyVal) and implement Deserializer for it:
struct MyValWrapper(MyVal);

impl<'de> Deserializer<'de> for MyValWrapper {
    type Error = de::value::Error;

    fn deserialize_any<V>(self, visitor: V) -> Result<V::Value, Self::Error>
    where
        V: de::Visitor<'de>,
    {
        match self.0 {
            MyVal::Bool(b) => visitor.visit_bool(b),
            MyVal::Text(s) => visitor.visit_string(s),
        }
    }

    // Maybe you should call deserialize_any from all other deserialize_*
    // functions to get a passable error message, e.g.:
    forward_to_deserialize_any! {
        bool i8 i16 i32 i64 i128 u8 u16 u32 u64 u128 f32 f64 char str string
        bytes byte_buf option unit unit_struct newtype_struct seq tuple
        tuple_struct map struct enum identifier ignored_any
    }
    // Though maybe you want to handle deserialize_enum differently...
}
MyValWrapper can now deserialize a MyVal. To use it to deserialize a Vec of MyVals, the serde::de::value::SeqDeserializer adapter is convenient: it can turn an iterator of (Into)Deserializers into a sequence Deserializer:
let data: Vec<MyVal> = …;

// Vec deserialization snippet
let mut sd = SeqDeserializer::new(data.into_iter().map(MyValWrapper));
let res = visitor.visit_seq(&mut sd)?;
sd.end()?;
Ok(res)
For some reason, SeqDeserializer requires the iterator items to be IntoDeserializer, and no blanket impl of IntoDeserializer for all Deserializers exists, so we need to make sure that MyValWrapper is not only a Deserializer but trivially also an IntoDeserializer:
impl<'de> IntoDeserializer<'de> for MyValWrapper {
    type Deserializer = MyValWrapper;

    fn into_deserializer(self) -> Self::Deserializer {
        self
    }
}
Finally, you need to impl Deserializer for MyWrapper (you can use the "Vec deserialization snippet" for that) — if you do actually need Deserializer to be implemented, which I suspect you don't: the SeqDeserializer already implements Deserializer, and it is a wrapper struct (just as MyWrapper is a wrapper struct). Especially, if your final goal is having a function like
fn turn_bad_myvals_into_good_T<T: DeserializeOwned>(
    v: Vec<MyVal>,
) -> Result<T, de::value::Error> {
    T::deserialize(SeqDeserializer::new(v.into_iter().map(MyValWrapper)))
}
then you entirely don't need MyWrapper.
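To round this off, here is a hedged usage sketch, assuming the User struct from the question and the impls above (the derived Deserialize for a struct accepts a sequence of its fields in declaration order):
use serde::de::value::SeqDeserializer;

fn main() {
    let vals = vec![MyVal::Text("Alice".to_string()), MyVal::Bool(true)];
    let user: User = turn_bad_myvals_into_good_T(vals).expect("deserialization failed");
    assert_eq!(user.name, "Alice");
    assert!(user.registered);
}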

Understanding polymorphism in Go

I am stuck thinking about a polymorphism solution to the following problem:
Let's say I have a BaseTX struct with fields for a transaction. Now I have two special types of transactions: a RewardTX struct and an AllowanceTX struct.
The RewardTX struct is at the moment only the composition of the BaseTX struct.
The AllowanceTX struct is a composition of the BaseTX struct and an extra AddField.
I also have a function logicAndSaveTX(), which has some logic on fields from BaseTX but at the end serializes the whole object using json.Marshal() and saves the []byte somewhere.
type TXapi interface {
    logicAndSaveTX()
}

type BaseTX struct {
    Field1 string
    Field2 string
}

type RewardTX struct {
    BaseTX
}

type AllowanceTX struct {
    BaseTX
    AddField string
}

func (tx BaseTX) logicAndSaveTX() {
    // logic on BaseTX fields; simplified:
    tx.Field1 = "overwritten"
    tx.Field2 = "logic done"
    // here would be marshal to json and save; simplified to print object:
    fmt.Printf("saved this object: %+v \n", tx)
}

func SaveTX(tx TXapi) {
    tx.logicAndSaveTX()
}

func main() {
    rewardTX := RewardTX{BaseTX: BaseTX{Field1: "Base info1", Field2: "Base info2"}}
    SaveTX(rewardTX) // prints rewardTX with fields from BaseTX

    allowanceTX := AllowanceTX{BaseTX: BaseTX{Field1: "Base info1", Field2: "Base info2"}, AddField: "additional field"}
    SaveTX(allowanceTX) // would like to print allowanceTX with fields from BaseTX + AddField >>> instead it only prints the fields from BaseTX
}
https://play.golang.org/p/0Vu_YXktRIk
I am trying to figure out how to implement the structures and the function so that it operates on both kinds of transactions but still serializes both structures properly. My problem is that AddField is not seen in my current implementation.
Maybe I have some brain failure here; I would really like to implement this the "proper Go way". :)
Go is not object-oriented. The only form of polymorphism in Go is interfaces.
Coming from other, object-oriented languages can be difficult, because you have to get rid of a lot of ideas you might try to carry over - things like, for example, "base" classes/types. Just remove "base" from your design thinking; you're trying to turn composition into inheritance, and that's only going to get you into trouble.
In this case, maybe you have a legitimate case for composition here; you have some common shared fields used by multiple types, but it's not a "base" type. It's maybe "metadata" or something - I can't say what to call it given that your example is pretty abstract, but you get the idea.
So maybe you have:
type TXapi interface {
    logicAndSaveTX()
}

type Metadata struct {
    Field1 string
    Field2 string
}

type RewardTX struct {
    Metadata
}

func (tx RewardTX) logicAndSaveTX() {
    // logic on Metadata fields; simplified:
    tx.Field1 = "overwritten"
    tx.Field2 = "logic done"
    // here would be marshal to json and save; simplified to print object:
    fmt.Printf("saved this object: %+v \n", tx)
}

type AllowanceTX struct {
    Metadata
    AddField string
}

func (tx AllowanceTX) logicAndSaveTX() {
    // logic on Metadata fields; simplified:
    tx.Field1 = "overwritten"
    tx.Field2 = "logic done"
    tx.AddField = "more stuff"
    // here would be marshal to json and save; simplified to print object:
    fmt.Printf("saved this object: %+v \n", tx)
}
If the handling of the metadata (or whatever) fields is identical in all uses, maybe you give that type its own logicTX method to fill those fields, which can be called by the logicAndSaveTX of the structs that embed it.
The key here is to think of the behavior (methods) on a type to be scoped to that type, instead of thinking of it as somehow being able to operate on "child types". Child types don't exist, and there is no way for a type that is embedded in another type to operate on its container.
Also note that Go only supports runtime polymorphism, through interfaces; compile-time polymorphism is not possible in Go.
Source: https://golangbyexample.com/oop-polymorphism-in-go-complete-guide/

Is there a way to convert a trait reference to an object of another unconnected type? [duplicate]

This question already has answers here:
How to get a reference to a concrete type from a trait object?
(2 answers)
Closed 4 years ago.
I have a collection of interfaces that are loaded dynamically from shared libraries. I want to be able to convert those type-erased interfaces back to their original type (trait).
struct A {}

fn abstract_a<'l>() -> &'l Any {
    return &A {};
}

trait TargetTrait {
    fn some_method();
}

impl TargetTrait for A {
    fn some_method() {
        println!("HELLO");
    }
}

fn main() {
    let x: &Any = abstract_a();
    let y: &TargetTrait = magic_conversion<&TargetTrait>(x);
}
// question: does 'magic_conversion' (or 'dynamic_cast') exist? What is it?
While loading these is not a problem, I have no idea how to get the target functionality from such an interface. In other words:
/* simplified for readability */
// this part is known
let some_lib = loadlib("path/to/lib.so")
let some_interface: &Any = some_lib.loadfunc<&Any>("constructor_func")()
/* loader does not know what target type constructor has, so it presumes 'Any' */
// the problem:
let dependent_class = Some(class)
dependent_class.graphics = dynamic_cast<IGraphics>(some_interface)
In this example, dependent_class uses an extern interface and does not care about handling libloading and all of that complicated stuff.
If there is another way to achieve my goal, I would also be very happy to see it, but the only solution I came up with is 'dynamic_cast'
I think what you're looking for is downcast_ref::<A>:
let y: &TargetTrait = Any::downcast_ref::<A>(x).expect("Expected an A");
You have to specify the concrete type A. Any trait objects don't hold any information about what traits the underlying type implements, so you can't "cross-cast" from &Any to &TargetTrait directly; you have to know the underlying type.
The expect will panic if downcast_ref returns None; if that's not what you want, you have to decide what you want to happen when x is not an A and match against the result of downcast_ref instead.
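Putting it together, here is a minimal runnable sketch (modernized to dyn syntax, and with &self added to some_method so it can be called through a trait object):
use std::any::Any;

struct A {}

trait TargetTrait {
    fn some_method(&self);
}

impl TargetTrait for A {
    fn some_method(&self) {
        println!("HELLO");
    }
}

fn abstract_a() -> &'static dyn Any {
    &A {}
}

fn main() {
    let x: &dyn Any = abstract_a();
    // downcast to the concrete type first...
    let a: &A = x.downcast_ref::<A>().expect("Expected an A");
    // ...then use it through the trait
    let y: &dyn TargetTrait = a;
    y.some_method();
}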

JSON marshall nullable type value in database/sql Go

I am trying to figure out how one can properly marshal nullable types (string, int, time) to JSON in Go. I know that database/sql provides sql.NullTime, sql.NullInt64, etc., but when you marshal these values, you get something like:
{"first_name": {"String": "", "Valid": false}}
What I really want is
{"first_name": null}
I understand that you can implement your own MarshalJSON to do this (I wrote about it here http://dennissuratna.com/marshalling-nullable-string-db-value-to-json-in-go/)
But I am wondering if anyone knows a better, less tedious way to do this.
This may be late, but when searching for the same problem, Google ranks this page highly, so: my solution is to define a special type that wraps sql.NullInt64 (for example) and customizes how it is exported to JSON.
Example for NullInt64:
// NullInt64 is the same as sql.NullInt64,
// but when exported to JSON it is null or the int64 value that is exported,
// not a JSON object containing Int64 and Valid properties.
type NullInt64 struct {
    sql.NullInt64
}

// MarshalJSON redefines how NullInt64 is exported to JSON.
func (r NullInt64) MarshalJSON() ([]byte, error) {
    if r.Valid {
        return json.Marshal(r.Int64)
    }
    return json.Marshal(nil)
}
And then replace all sql.NullInt64 with NullInt64.
You can easily do the same for sql.NullString.
Hope it helps
Create a type that embeds (e.g.) sql.NullInt64 and implements the json.Marshaler interface.
If you really want to have null fields, you will have to resort to a custom marshaller (or maybe, just maybe, use *string fields in the struct and assign nil instead of an empty string).
However, if you look at the original goal of JSON (JavaScript Object Notation), you will notice that in JavaScript there is hardly any difference between:
var obj = JSON.parse('{"first_name": null }');
alert(obj.first_name)
and:
var obj = JSON.parse('{}');
alert(obj.first_name)
In other words: assigning null to a field has the same effect as not specifying it at all. And most JSON parsers work like this.
Not specifying empty fields is supported by the Go JSON marshaller:
type MyType struct {
    // the field must be exported, and the tag value must be quoted:
    Firstname string `json:"firstname,omitempty"`
}
In the end, it depends on what you want to do with your JSON :)