An object that implements custom serialization can be serialized and deserialized to different formats, for example to XML or byte[].
I have run into a problem where, when I put an object into the cache, AppFabric runs the class's IXmlSerializable implementation when I would rather force it to use binary. What are AppFabric Caching's serialization and deserialization requirements for an object?
Can I configure this?
(At the moment the workaround is to serialize the object programmatically to a byte[] and then send that into the cache, reversing the process on the way out.)
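Roughly, the workaround looks like this (a sketch only - BinaryFormatter stands in for whatever binary serializer is used, and MyType is a placeholder):
byte[] ToBytes(MyType value)
{
    using (var ms = new System.IO.MemoryStream())
    {
        // Any binary serializer works here; BinaryFormatter needs [Serializable] on MyType.
        new System.Runtime.Serialization.Formatters.Binary.BinaryFormatter().Serialize(ms, value);
        return ms.ToArray();
    }
}

MyType FromBytes(byte[] bytes)
{
    using (var ms = new System.IO.MemoryStream(bytes))
    {
        return (MyType)new System.Runtime.Serialization.Formatters.Binary.BinaryFormatter().Deserialize(ms);
    }
}

// cache.Put("key", ToBytes(obj));                 // AppFabric just sees a byte[]
// var obj = FromBytes((byte[])cache.Get("key"));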
In the MSDN documentation it says we could implement IDataCacheObjectSerializer to achieve this goal. You can read about it here: http://msdn.microsoft.com/en-us/library/windowsazure/hh552969.aspx
class MySerializer : IDataCacheObjectSerializer
{
    public object Deserialize(System.IO.Stream stream)
    {
        // Deserialize the System.IO.Stream 'stream' from the cache and
        // return the object - sketched here with BinaryFormatter.
        var formatter = new System.Runtime.Serialization.Formatters.Binary.BinaryFormatter();
        return formatter.Deserialize(stream);
    }

    public void Serialize(System.IO.Stream stream, object value)
    {
        // Serialize the object 'value' into a System.IO.Stream
        // that can be stored in the cache.
        var formatter = new System.Runtime.Serialization.Formatters.Binary.BinaryFormatter();
        formatter.Serialize(stream, value);
    }
}
After that, you can set the custom serializer on the DataCacheFactory configuration:
DataCacheFactoryConfiguration configuration = new DataCacheFactoryConfiguration();
configuration.SerializationProperties =
    new DataCacheSerializationProperties(DataCacheObjectSerializerType.CustomSerializer,
        new MyNamespace.MySerializer());

// Assign other DataCacheFactoryConfiguration properties...
// Then create a DataCacheFactory with this configuration
DataCacheFactory factory = new DataCacheFactory(configuration);
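Once the factory is built, cache operations go through the custom serializer transparently; a quick usage sketch (the cache name "default" is illustrative):
DataCache cache = factory.GetCache("default");
cache.Put("myKey", myObject);               // serialized by MySerializer
object roundTripped = cache.Get("myKey");   // deserialized by MySerializer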
Hope this helps.
I am trying to implement a SignalR client and server with JSON serialization.
Currently I am targeting .NET 5 and using the Microsoft JSON serializer implementation.
My messages are represented by complex objects and there is a JsonConverter used for reading and writing.
What I see is that on the client the On event is never raised unless the handler parameter is declared as object:
connection.On("EntityEventAsync", (object obj) =>
{
    // obj will be a JSON object here
});
On the client side I can clearly see that the messages are received, since the JsonConverter is called and reads them as it should, but the On event is never raised.
Typed client hub code:
public interface IEventsClient
{
    Task EntityEventAsync(DetailedMessage message);
}

[Authorize(AuthenticationSchemes = "Basic,Bearer")]
public class EventHub : Hub<Clients.IEventsClient>
{
    #region CONSTRUCTOR
    public EventHub()
    {
    }
    #endregion
}
What can I be missing here?
If someone else struggles with the same problem, it might be an issue with the JsonConverter implementation, as it was in my case.
It's possible to enable SignalR client logging as described here: https://learn.microsoft.com/en-us/aspnet/core/signalr/diagnostics?view=aspnetcore-5.0 - that should make it easier to figure out the problem.
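For reference, client-side logging is wired up when building the connection; a minimal sketch (the hub URL is a placeholder, and AddConsole comes from the Microsoft.Extensions.Logging.Console package):
var connection = new HubConnectionBuilder()
    .WithUrl("https://example.com/hubs/events")
    .ConfigureLogging(logging =>
    {
        // Log everything, including handler binding and message parsing.
        logging.SetMinimumLevel(LogLevel.Debug);
        logging.AddConsole();
    })
    .Build();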
I have downloaded sample code from GitHub and run AtLeastOnceDelivery.sln.
Every new run it keeps sending messages. And if I change the message namespace, it shows an error starting with
Error loading snapshot [SnapshotMetadata<pid: delivery, seqNr: 0, timestamp: 2018/09/24>], remaining attempts: [0]
If I could clear the persisted state, hopefully it would then accept the changed namespace and restart the message IDs.
By default, all snapshots are stored as files directly in the ./snapshots directory of the application, while events are stored in memory. Because of that, you should consider using one of the akka.persistence plugins for production purposes.
Your problem happens because you're using Akka.NET's default serializers (dedicated to networking), which are not very version tolerant - changing any fields, their types, class names or namespaces makes the previous version of the class non-deserializable - and their format will be subject to change in the future. This is also why it's strongly discouraged to use the default serializers for persistence.
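In this sample, then, clearing the persisted state comes down to deleting the ./snapshots directory (the in-memory events are gone after every restart anyway). If you want the snapshots somewhere else, the location is configurable via HOCON - the line below shows the key with the default location described above:
akka.persistence.snapshot-store.local.dir = "./snapshots"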
How to make a custom Akka.NET Serializer
While there are plans to improve the serializers API, at the current moment (Akka.NET v1.3.9) making your own serializer simply requires inheriting from the Akka.Serialization.Serializer class:
public sealed class MySerializer : Serializer
{
    public MySerializer(ExtendedActorSystem system) : base(system) { }

    // Globally unique serializer id - values below 100 are reserved
    // for Akka.NET's internal serializers (see the note below).
    public override int Identifier => 101;

    public override bool IncludeManifest => true;

    public override byte[] ToBinary(object obj)
    {
        // Serialize the object; sketched here as JSON via Newtonsoft.Json -
        // substitute whatever version-tolerant format you prefer.
        var json = Newtonsoft.Json.JsonConvert.SerializeObject(obj);
        return System.Text.Encoding.UTF8.GetBytes(json);
    }

    public override object FromBinary(byte[] bytes, Type type)
    {
        // Deserialize the object from its binary form.
        var json = System.Text.Encoding.UTF8.GetString(bytes);
        return Newtonsoft.Json.JsonConvert.DeserializeObject(json, type);
    }
}
Keep in mind that the Identifier property must be unique within the cluster's scope - values below 100 are usually taken by Akka.NET's internal serializers, so it's better to use higher values.
How to bind serializer to be used for a given type
By convention, Akka.NET uses empty interfaces to mark message types that are supposed to be serialized in a particular way. You can then set up your HOCON configuration to use a specific serializer for a given interface:
akka.actor {
    serializers {
        my-serializer = "MyNamespace.MySerializer, MyAssembly"
    }
    serialization-bindings {
        "MyNamespace.MyInterface, MyAssembly" = my-serializer
    }
}
where MyInterface is the marker interface assigned to the message types you want to serialize/deserialize with MySerializer.
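For completeness, the marker convention itself looks like this (using the placeholder names from the config above):
// Empty marker interface referenced in serialization-bindings.
public interface MyInterface { }

// Any message implementing the marker goes through MySerializer.
public sealed class MyMessage : MyInterface
{
    public string Payload { get; set; }
}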
Assuming I have the following object
public class DataObjectA {
    private Stream<DataObjectB> dataObjectBStream;
}
How can I serialize them using Jackson?
As others have pointed out, you can only iterate once over a stream. If that works for you, you can use this to serialize:
new ObjectMapper().writerFor(Iterator.class).writeValueAsString(dataObjectBStream.iterator())
If you're using a Jackson version prior to 2.5, use writerWithType() instead of writerFor().
See https://github.com/FasterXML/jackson-modules-java8/issues/3 for the open issue to add java.util.Stream support to Jackson. There's a preliminary version of the code included. (edit: this is now merged and supported in 2.9.0).
Streaming support feels like it would work naturally/safely if the stream is the top-level object you were (de)serializing, e.g. returning a java.util.stream.Stream<T> from a JAX-RS resource, or reading a Stream from a JAX-RS client.
A Stream as a member variable of a (de)serialized object, as you have in your example, is trickier, because it's mutable and single-use:
private Stream<DataObjectB> dataObjectBStream;
Assuming it was supported, all of the caveats around storing references to streams would apply. You wouldn't be able to serialize the object more than once, and once you deserialized the wrapping object, presumably its stream member would retain a live connection back through the JAX-RS client and HTTP connection, which could create surprises.
You don’t.
A Stream is a single-use chain of operations and was never meant to be persistent. Even storing it in an instance field, as in your question, is an indicator of a misunderstanding of its purpose. Once a terminal operation has been applied to the stream, it is useless, and streams can't be cloned. Thus, there is no point in keeping the unusable stream in a field.
Since the only operations a Stream offers are chaining more operations onto the pipeline and finally evaluating it, there is no way to query its state in a manner that would allow creating a stream with equivalent behavior. Therefore, no persistence framework can store it. The only thing a framework could do is traverse the resulting elements of the stream operation and store them, but that effectively means storing a kind of collection of objects rather than the Stream. Besides that, the single-use nature of a Stream implies that a storage framework traversing the stream in order to store its elements would have the side effect of making the stream unusable at the same time.
If you want to store elements, resort to an ordinary Collection.
On the other hand, if you really want to store behavior, you’ll end up storing an object instance whose actual class implements the behavior. This still works with Streams as you can store an instance of a class which has a factory method producing the desired stream. Of course, you are not really storing the behavior but a symbolic reference to it, but this is always the case when you use an OO storage framework to store behavior rather than data.
I had the class below with two fields, one of which was a Stream. I had to annotate the stream's getter with @JsonSerialize and override the serialize method of a custom JsonSerializer; this produces a stream of JSON in my response API:
public class DataSetResultBean extends ResultBean
{
    private static final long serialVersionUID = 1L;

    private final List<ComponentBean> structure;
    private final Stream<DataPoint> datapoints;

    private static class DataPointSerializer extends JsonSerializer<Stream<DataPoint>>
    {
        @Override
        public void serialize(Stream<DataPoint> stream, JsonGenerator gen, SerializerProvider serializers) throws IOException, JsonProcessingException
        {
            gen.writeStartArray();
            try
            {
                stream.forEach(dp -> serializeSingle(gen, dp));
            }
            catch (UncheckedIOException e)
            {
                throw (IOException) e.getCause();
            }
            finally
            {
                stream.close();
            }
            gen.writeEndArray();
        }

        public synchronized void serializeSingle(JsonGenerator gen, DataPoint dp) throws UncheckedIOException
        {
            try
            {
                gen.writeStartObject();
                for (Entry<DataStructureComponent<?, ?, ?>, ScalarValue<?, ?, ?>> entry: dp.entrySet())
                {
                    gen.writeFieldName(entry.getKey().getName());
                    gen.writeRawValue(entry.getValue().toString());
                }
                gen.writeEndObject();
            }
            catch (IOException e)
            {
                throw new UncheckedIOException(e);
            }
        }
    }

    public DataSetResultBean(DataSet dataset)
    {
        super("DATASET");

        structure = dataset.getMetadata().stream().map(ComponentBean::new).collect(toList());
        datapoints = dataset.stream();
    }

    public List<ComponentBean> getStructure()
    {
        return structure;
    }

    @JsonSerialize(using = DataPointSerializer.class)
    public Stream<DataPoint> getDatapoints()
    {
        return datapoints;
    }
}
I have a net.tcp WCF service and its client, each in its own assembly and sharing another assembly containing the service interface and DTOs.
The client is implemented as a proxy to the service using a Channel instantiated through ChannelFactory:
public class ServiceClient : IService
{
    private readonly IService _channel;

    public ServiceClient()
    {
        _channel = new ChannelFactory<IService>("NetTcp_IService")
            .CreateChannel();
    }

    public DTO ServiceMethod()
    {
        return _channel.ServiceMethod();
    }
}

public class DTO
{
    public IList<int> SomeList;
}
As expected, the SomeList field of the DTO returned by the client is an array, but I would like it to be converted by WCF to a List. As you may suspect from the described set-up, I don't use svcutil (or the Add Service Reference dialog, for that matter), so I can't use configureType.
I don't want the client proxy to instantiate the List and fix up the received DTO, because the actual implementation uses a command processor with interfaces resolved through dependency injection at run-time to avoid coupling - this solution would do the opposite by requiring the client to know about specific service commands.
Therefore, I'm currently using the work-around of having the DTO create the List instance internally:
public class DTO
{
    private IList<int> _someList;

    public IList<int> SomeList
    {
        get { return _someList; }
        set
        {
            if (value != null)
                _someList = new List<int>(value);
            else
                _someList = new List<int>();
        }
    }
}
However, I'd rather avoid this. So the question is:
How can I configure the WCF deserialization so that the array is converted to the expected List?
Is there any way to configure the deserialization through the binding either in the App.config or from code upon Channel creation? Maybe through ImportOptions.ReferencedCollectionTypes or CollectionDataContract?
There are four ways:
1. Convert the data to a List in your save methods on the client side.
2. Change the property type from
public IList<int> SomeList;
to
public List<int> SomeList;
3. The approach you have shown above (changing the type on assignment).
4. Implement IDataContractSurrogate - but you will have to apply a behaviour on the client side (a sketch follows below).
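For option 4, an entirely untested sketch of what such a surrogate could look like (the class name is made up, and the GetDeserializedObject hook may need adjusting depending on where the arrays surface in your object graph):
using System;
using System.CodeDom;
using System.Collections.Generic;
using System.Collections.ObjectModel;
using System.Reflection;
using System.Runtime.Serialization;

public class ListSurrogate : IDataContractSurrogate
{
    public Type GetDataContractType(Type type) => type;

    public object GetObjectToSerialize(object obj, Type targetType) => obj;

    public object GetDeserializedObject(object obj, Type targetType)
    {
        // Hypothetical conversion: wrap any deserialized int[] in a List<int>.
        if (obj is int[] array)
            return new List<int>(array);
        return obj;
    }

    // The remaining members are not needed for this scenario.
    public object GetCustomDataToExport(MemberInfo memberInfo, Type dataContractType) => null;
    public object GetCustomDataToExport(Type clrType, Type dataContractType) => null;
    public void GetKnownCustomDataTypes(Collection<Type> customDataTypes) { }
    public Type GetReferencedTypeOnImport(string typeName, string typeNamespace, object customData) => null;
    public CodeTypeDeclaration ProcessImportedType(CodeTypeDeclaration typeDeclaration, CodeCompileUnit compileUnit) => typeDeclaration;
}
The "behaviour on the client side" part means attaching the surrogate to the contract's operations, e.g. via the DataContractSurrogate property of DataContractSerializerOperationBehavior.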
Can anyone point me to a protobuf-net serializer for NEventStore 3.0?
I'm having trouble, I think mainly because the serialization in EventStore 3 wraps the event body and headers in an EventMessage.
I'm not sure how to set up the custom serializer correctly.
This is entirely untested guesswork based on a very brief glance at GitHub, but it looks like you want to use the wire-up API to specify a custom serializer, for example:
var store = Wireup.Init()
    .UsingSqlPersistence("Name Of EventStore ConnectionString In Config File")
    .InitializeStorageEngine()
    .UsingCustomSerialization(mySerializer)
    ... etc
where mySerializer is an instance of a type that implements the ISerialize interface. It looks like this should work:
class ProtobufSerializer : EventStore.Serialization.ISerialize
{
    public void Serialize<T>(Stream output, T graph)
    {
        ProtoBuf.Serializer.Serialize<T>(output, graph);
    }

    public T Deserialize<T>(Stream input)
    {
        return ProtoBuf.Serializer.Deserialize<T>(input);
    }
}
(so obviously mySerializer here would be a new ProtobufSerializer())
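One caveat worth adding: protobuf-net needs contract information for the types it serializes, so the event bodies and headers (and presumably NEventStore's EventMessage wrapper) must be types it knows how to handle. A hypothetical event type would be decorated like this:
[ProtoContract]
public class SomethingHappened
{
    [ProtoMember(1)]
    public Guid Id { get; set; }

    [ProtoMember(2)]
    public string Description { get; set; }
}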