I am trying to implement a SignalR client and server with JSON serialization.
Currently I am targeting .NET 5 and using the Microsoft JSON serializer implementation.
My messages are represented by complex objects, and a JsonConverter is used for reading and writing them.
What I see is that on the client the On event is never raised unless the handler parameter is declared as object:
connection.On("EntityEventAsync", (object obj) =>
{
    // obj will be a JSON object here
});
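For comparison, a strongly typed registration along these lines (DetailedMessage being my own message type) never fires its callback:
connection.On<DetailedMessage>("EntityEventAsync", message =>
{
    // never reached in my case
});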
On the client side I can clearly see that the messages are received, since the JsonConverter is called and reads them as it should, BUT the On event is never raised.
Typed client hub code
public interface IEventsClient
{
    Task EntityEventAsync(DetailedMessage message);
}

[Authorize(AuthenticationSchemes = "Basic,Bearer")]
public class EventHub : Hub<Clients.IEventsClient>
{
    #region CONSTRUCTOR
    public EventHub()
    {
    }
    #endregion
}
What could I be missing here?
If someone else struggles with the same problem: it might be an issue with the JsonConverter implementation, as it was in my case.
It's possible to enable SignalR client logging as described here https://learn.microsoft.com/en-us/aspnet/core/signalr/diagnostics?view=aspnetcore-5.0 and that should make it easier to figure out the problem.
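For example, a minimal client-side logging setup looks roughly like this (the hub URL is a placeholder; AddConsole comes from the Microsoft.Extensions.Logging.Console package):
var connection = new HubConnectionBuilder()
    .WithUrl("https://example.com/hubs/events") // placeholder hub URL
    .ConfigureLogging(logging =>
    {
        // Debug level shows handler binding and message dispatch details
        logging.SetMinimumLevel(LogLevel.Debug);
        logging.AddConsole();
    })
    .Build();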
An external company has given me a WSDL to consume which has a couple of odd characteristics that I don't want leaking into my client code.
Firstly, each OperationContract requires the same username parameter to be sent over. Instead of setting this each time in my client code, I'd like to do it globally.
I believe an IClientMessageInspector is my best bet; however, since this is a SOAP service, I'm a little confused about how to add the parameter into the body.
public class CustomInspector : IClientMessageInspector
{
    public object BeforeSendRequest(ref Message request, IClientChannel channel)
    {
        // Add an additional parameter to the SOAP body
        return null;
    }
}
Secondly, whilst the service does return mapped objects, one of the objects contains an XML document shoved inside a CDATA section :(
<a:ResponseData>
<![CDATA[ INSERT XML DOCUMENT HERE]]>
</a:ResponseData>
I'm looking to extract the XML and add it back without the CDATA and XML declaration, so I can add the appropriate properties to my response object. That way it should deserialize as normal (hope that makes sense).
public class CustomInspector : IClientMessageInspector
{
    public void AfterReceiveReply(ref Message reply, object correlationState)
    {
        // Get the XML from the ResponseData element and remove the CDATA.
        // Add the XML back in (minus the <?xml ?> declaration).
    }
}
Firstly, each OperationContract requires the same username parameter sent over. Instead of setting this each time in my client code I'd like to do this globally. I believe setting this in a IClientMessageInspector is my best bet, however, with this being a SOAP service I'm a little confused at how to add this into the body.
If you want to add a custom message header to the request, you can use code like the following.
public object BeforeSendRequest(ref Message request, System.ServiceModel.IClientChannel channel)
{
    // Adds custom SOAP headers to every outgoing request.
    request.Headers.Add(MessageHeader.CreateHeader("username", "", "user"));
    request.Headers.Add(MessageHeader.CreateHeader("password", "", "pass"));
    return null;
}
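If the username really has to live in the SOAP body rather than a header, the inspector can buffer the message, edit the body XML, and rebuild the Message. A rough sketch (using System.Xml and System.ServiceModel.Channels; the element name, namespace and value below are placeholders you would adapt to the WSDL):
public object BeforeSendRequest(ref Message request, System.ServiceModel.IClientChannel channel)
{
    // Buffer the message so its body can be read more than once.
    MessageBuffer buffer = request.CreateBufferedCopy(int.MaxValue);
    Message original = buffer.CreateMessage();

    // Load the body contents into an XmlDocument for editing.
    var doc = new XmlDocument();
    using (XmlReader bodyReader = buffer.CreateMessage().GetReaderAtBodyContents())
    {
        doc.Load(bodyReader);
    }

    // Placeholder: append a username element to the operation's request element.
    XmlElement username = doc.CreateElement("username", doc.DocumentElement.NamespaceURI);
    username.InnerText = "user";
    doc.DocumentElement.AppendChild(username);

    // Rebuild the message with the edited body, preserving headers and properties.
    Message modified = Message.CreateMessage(original.Version, original.Headers.Action, new XmlNodeReader(doc));
    modified.Headers.CopyHeadersFrom(original);
    modified.Properties.CopyProperties(original.Properties);
    request = modified;
    return null;
}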
Take a look at IClientMessageInspector.
Here are some links that may be useful to you.
Adding custom SOAP headers from Silverlight client
https://weblogs.asp.net/paolopia/handling-custom-soap-headers-via-wcf-behaviors
https://social.msdn.microsoft.com/Forums/vstudio/en-US/f1f29779-0121-4499-a2bc-63ffe8025b21/wcf-security-soap-header
I have downloaded sample code from GitHub and run AtLeastOnceDelivery.sln.
Every new run it keeps sending messages, continuing where the previous run left off. And if I change the message namespace it shows an error starting with:
Error loading snapshot [SnapshotMetadata<pid: delivery, seqNr: 0, timestamp: 2018/09/24>], remaining attempts: [0]
If I could clear the persistence, hopefully it would accept the changed namespace and restart the message id.
By default, all snapshots are stored as files directly in the ./snapshots directory of the application, while events are stored in memory. Because of that, you should consider using one of the Akka.Persistence plugins for production purposes.
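Since the snapshots are just files and the events live only in memory, clearing the persisted state during development comes down to deleting the ./snapshots directory and restarting the process. If you want the snapshot files somewhere else, the local snapshot store's directory can be changed in HOCON (the path below is just an example):
akka.persistence.snapshot-store.local {
    dir = "C:/temp/my-app-snapshots"
}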
Your problem happens because you're using the Akka.NET default serializers (dedicated to networking), which are not very version tolerant: changing any fields, their types, class names or namespaces makes the previous version of the class non-deserializable, and they will be subject to change in the future. This is also why it's strongly discouraged to use the default serializers for persistence.
How to make a custom Akka.NET Serializer
While there are plans to improve the serializer API, at the current moment (Akka.NET v1.3.9) you make your own serializer simply by inheriting from the Akka.Serialization.Serializer class:
public sealed class MySerializer : Serializer
{
    public MySerializer(ExtendedActorSystem system) : base(system) { }

    // Must be globally unique in the cluster; see the note below.
    public override int Identifier => 1001;

    public override bool IncludeManifest => true;

    public override byte[] ToBinary(object obj)
    {
        // serialize the object to a byte[] here
        throw new NotImplementedException();
    }

    public override object FromBinary(byte[] bytes, Type type)
    {
        // deserialize the object from the byte[] here
        throw new NotImplementedException();
    }
}
Keep in mind that the Identifier property must be unique in cluster scope. Values below 100 are usually used by the Akka.NET internal serializers, therefore it's better to use higher values.
How to bind serializer to be used for a given type
By convention, Akka.NET uses empty interfaces to mark message types that are supposed to be serialized. Then you can set up your HOCON configuration to use a specific serializer for a given interface:
akka.actor {
    serializers {
        my-serializer = "MyNamespace.MySerializer, MyAssembly"
    }
    serialization-bindings {
        "MyNamespace.MyInterface, MyAssembly" = my-serializer
    }
}
Where MyInterface is the interface assigned to the message types you want to serialize/deserialize with MySerializer.
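A minimal sketch of that convention (the type names are illustrative):
// Empty marker interface referenced in the serialization-bindings section above.
public interface IMyInterface { }

// Any message implementing the marker interface is handled by MySerializer.
public sealed class MyMessage : IMyInterface
{
    public string Payload { get; set; }
}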
I have a WebAPI service using SimpleInjector. I have this set up using AsyncScopedLifestyle for my scoped dependencies, and one of these dependencies is my Entity Framework DataContext. Many things in my service depend on the DataContext, and it is generally injected into my MediatR handlers using constructor injection; this works well. Separately, I have a few areas where I need to create an instance of an object given its type (as a string), so I have created a custom activator class (ResolvingActivator) that is configured with a reference to Container.GetInstance(Type):
In my container bootstrap code:
ResolvingActivator.Configure(container.GetInstance);
I can then create objects by using methods such as:
ResolvingActivator.CreateInstance<T>(typeName)
When I'm using WebAPI, the above is working perfectly.
A further part of the project is a legacy API that uses WCF. I have implemented this as a translation layer, where I translate old message formats to new message formats and then dispatch the messages to the Mediator; I then translate the responses (in the new format) back to the old format and return those to the caller. Because I need access to the Mediator in my WCF services, I'm injecting it in their constructors, and using the SimpleInjector.Integration.Wcf package to let SimpleInjector's supplied SimpleInjectorServiceHostFactory build instances of the services. I've also created a hybrid lifestyle, so I can use the same container for both my WebAPI and WCF services:
container.Options.DefaultScopedLifestyle = Lifestyle.CreateHybrid(
    new AsyncScopedLifestyle(),
    new WcfOperationLifestyle());
This works well for some calls, but when a call ultimately calls my ResolvingActivator class, I get an ActivationException thrown, with the following message:
The DataContext is registered as 'Hybrid Async Scoped / WCF Operation' lifestyle, but the instance is requested outside the context of an active (Hybrid Async Scoped / WCF Operation) scope.
As I only receive this error when making WCF calls, I'm wondering if I have something wrong in my configuration. In a nutshell, this will work:
public class SomeClass
{
    private readonly DataContext db;

    public SomeClass(DataContext db)
    {
        this.db = db;
    }

    public bool SomeMethod() => this.db.Table.Any();
}
But this will not:
public class SomeClass
{
    public bool SomeMethod()
    {
        // Code behind is calling container.GetInstance(typeof(DataContext))
        var db = ResolvingActivator.CreateInstance<DataContext>();
        return db.Table.Any();
    }
}
Any ideas where I'm going wrong?
Edit: here is the stack trace from the ActivationException:
at SimpleInjector.Scope.GetScopelessInstance[TImplementation](ScopedRegistration`1 registration)
at SimpleInjector.Scope.GetInstance[TImplementation](ScopedRegistration`1 registration, Scope scope)
at SimpleInjector.Advanced.Internal.LazyScopedRegistration`1.GetInstance(Scope scope)
at lambda_method(Closure )
at SimpleInjector.InstanceProducer.GetInstance()
at SimpleInjector.Container.GetInstance(Type serviceType)
at Service.Core.ResolvingActivator.CreateInstance(Type type) in Service.Core\ResolvingActivator.cs:line 43
at Service.Core.ResolvingActivator.CreateInstance(String typeName) in Service.Core\ResolvingActivator.cs:line 35
at Service.Core.ResolvingActivator.CreateInstance[TService](String typeName) in Service.Core\ResolvingActivator.cs:line 69
With a full stack trace here: https://pastebin.com/0WkyHGKv
After close inspection of the stack trace, I can conclude what's going on: async.
The WcfOperationLifestyle under the covers depends on WCF's OperationContext.Current property, but this property has thread affinity and doesn't flow with async operations. This is something that has to be fixed in the integration library for Simple Injector; it simply doesn't support async at the moment.
Instead, wrap a decorator around your handlers that starts and ends a new async scope. This prevents you from having to use the WcfOperationLifestyle altogether. Take a look at the ThreadScopedCommandHandlerProxy<T> implementation here to get an idea how to do this (but use AsyncScopedLifestyle instead).
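A minimal sketch of such a decorator, assuming a generic ICommandHandler<TCommand> abstraction (the interface and class names here are illustrative, not from the original project):
public class AsyncScopedCommandHandlerProxy<TCommand> : ICommandHandler<TCommand>
{
    private readonly Container container;
    private readonly Func<ICommandHandler<TCommand>> decorateeFactory;

    public AsyncScopedCommandHandlerProxy(
        Container container,
        Func<ICommandHandler<TCommand>> decorateeFactory)
    {
        this.container = container;
        this.decorateeFactory = decorateeFactory;
    }

    public void Handle(TCommand command)
    {
        // Begin a fresh async scope, resolve the real handler inside it,
        // and dispose the scope when the handler has finished.
        using (AsyncScopedLifestyle.BeginScope(this.container))
        {
            this.decorateeFactory().Handle(command);
        }
    }
}
Registered as a singleton decorator, for example with container.RegisterDecorator(typeof(ICommandHandler<>), typeof(AsyncScopedCommandHandlerProxy<>), Lifestyle.Singleton), it makes every call run inside its own async scope, so the hybrid WCF lifestyle is no longer needed.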
Assuming I have the following object
public class DataObjectA {
    private Stream<DataObjectB> dataObjectBStream;
}
How can I serialize them using Jackson?
As others have pointed out, you can only iterate once over a stream. If that works for you, you can use this to serialize:
new ObjectMapper().writerFor(Iterator.class).writeValueAsString(dataObjectBStream.iterator())
If you're using a Jackson version prior to 2.5, use writerWithType() instead of writerFor().
See https://github.com/FasterXML/jackson-modules-java8/issues/3 for the open issue to add java.util.Stream support to Jackson. There's a preliminary version of the code included. (edit: this is now merged and supported in 2.9.0).
Streaming support feels like it would work naturally/safely if the stream is the top level object you were (de)serializing, eg returning a java.util.stream.Stream<T> from a JAX-RS resource, or reading a Stream from a JAX-RS client.
A Stream as a member variable of a (de)serialized object, as you have in your example, is trickier, because it's mutable and single use:
private Stream<DataObjectB> dataObjectBStream;
Assuming it was supported, all of the caveats around storing references to streams would apply. You wouldn't be able to serialize the object more than once, and once you deserialized the wrapping object, presumably its stream member would retain a live connection back through the JAX-RS client and HTTP connection, which could create surprises.
You don’t.
A Stream is a single-use chain of operations and was never meant to be persistent. Even storing it into an instance field, like in your question, is an indicator of a misunderstanding of its purpose. Once a terminal operation has been applied to the stream, it is useless, and streams can't be cloned. Thus, there is no point in keeping the unusable stream in a field.
Since the only operations offered by Stream are chaining more operations to the pipeline and finally evaluating it, there is no way of querying its state in a way that would allow creating an equivalent stream regarding its behavior. Therefore, no persistence framework can store it. The only thing a framework could do is traverse the resulting elements of the stream operation and store them, but that means effectively storing a kind of collection of objects rather than the Stream. Besides that, the single-use nature of a Stream also implies that a storage framework traversing the stream in order to store the elements would have the side effect of making the stream unusable at the same time.
If you want to store elements, resort to an ordinary Collection.
On the other hand, if you really want to store behavior, you’ll end up storing an object instance whose actual class implements the behavior. This still works with Streams as you can store an instance of a class which has a factory method producing the desired stream. Of course, you are not really storing the behavior but a symbolic reference to it, but this is always the case when you use an OO storage framework to store behavior rather than data.
I had the class below with two elements, one of them being a Stream. I had to annotate the stream getter with @JsonSerialize and override the serialize method; it produces a stream of JSON in my response API:
public class DataSetResultBean extends ResultBean
{
    private static final long serialVersionUID = 1L;

    private final List<ComponentBean> structure;
    private final Stream<DataPoint> datapoints;

    private static class DataPointSerializer extends JsonSerializer<Stream<DataPoint>>
    {
        @Override
        public void serialize(Stream<DataPoint> stream, JsonGenerator gen, SerializerProvider serializers) throws IOException, JsonProcessingException
        {
            gen.writeStartArray();
            try
            {
                stream.forEach(dp -> serializeSingle(gen, dp));
            }
            catch (UncheckedIOException e)
            {
                throw (IOException) e.getCause();
            }
            finally
            {
                stream.close();
            }
            gen.writeEndArray();
        }

        public synchronized void serializeSingle(JsonGenerator gen, DataPoint dp) throws UncheckedIOException
        {
            try
            {
                gen.writeStartObject();
                for (Entry<DataStructureComponent<?, ?, ?>, ScalarValue<?, ?, ?>> entry: dp.entrySet())
                {
                    gen.writeFieldName(entry.getKey().getName());
                    gen.writeRawValue(entry.getValue().toString());
                }
                gen.writeEndObject();
            }
            catch (IOException e)
            {
                throw new UncheckedIOException(e);
            }
        }
    }

    public DataSetResultBean(DataSet dataset)
    {
        super("DATASET");

        structure = dataset.getMetadata().stream().map(ComponentBean::new).collect(toList());
        datapoints = dataset.stream();
    }

    public List<ComponentBean> getStructure()
    {
        return structure;
    }

    @JsonSerialize(using = DataPointSerializer.class)
    public Stream<DataPoint> getDatapoints()
    {
        return datapoints;
    }
}
I have a net.tcp WCF service and its client, each in one assembly and sharing another assembly containing the service interface and DTOs.
The client is implemented as a proxy to the service using a Channel instantiated through ChannelFactory:
public class ServiceClient : IService
{
    IService _channel;

    public ServiceClient()
    {
        _channel = new ChannelFactory<IService>("NetTcp_IService")
            .CreateChannel();
    }

    public DTO ServiceMethod()
    {
        return _channel.ServiceMethod();
    }
}
public class DTO
{
    public IList<int> SomeList;
}
As expected, the SomeList field of the DTO returned by the client is an array, but I would like it to be converted by WCF to a List. As you may suspect from the described set-up, I don't use svcutil (or the Add Service Reference dialog for that matter), so I can't use configureType.
I don't want to modify the client proxy to instantiate the List and fix up the received DTO, because the actual implementation uses a command processor with interfaces resolved through dependency injection at run-time to avoid coupling, and that solution would do the opposite by requiring the client proxy to know about specific service commands.
Therefore, I'm currently using the work-around which modifies the DTO to internally create the List instance:
public class DTO
{
    private IList<int> _someList;

    public IList<int> SomeList
    {
        get { return _someList; }
        set
        {
            if (value != null)
                _someList = new List<int>(value);
            else
                _someList = new List<int>();
        }
    }
}
However, I'd rather avoid this. So the question is:
How can I configure the WCF deserialization so that the array is converted to the expected List?
Is there any way to configure the deserialization through the binding either in the App.config or from code upon Channel creation? Maybe through ImportOptions.ReferencedCollectionTypes or CollectionDataContract?
There are 4 ways:
1. Convert the data to a List in your save methods on the client side.
2. Change the property type from
   public IList<int> SomeList;
   to
   public List<int> SomeList;
3. The approach you have shown above (changing the type on assignment).
4. Implement IDataContractSurrogate, but you will have to apply a behaviour on the client side.
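For option 4, here is a rough, unverified sketch of what the surrogate plus the client-side wiring could look like. The ListSurrogate type is hypothetical, and whether GetDeserializedObject is actually invoked for the nested int[] member is something you would need to verify against your contract:
// Requires System.Runtime.Serialization, System.CodeDom, System.Reflection,
// System.Collections.ObjectModel and System.ServiceModel.Description.
public class ListSurrogate : IDataContractSurrogate
{
    public Type GetDataContractType(Type type) => type;

    public object GetObjectToSerialize(object obj, Type targetType) => obj;

    public object GetDeserializedObject(object obj, Type targetType)
    {
        // Replace deserialized int[] instances with List<int>.
        if (obj is int[] array)
            return new List<int>(array);
        return obj;
    }

    // The remaining members are not needed for this scenario.
    public object GetCustomDataToExport(MemberInfo memberInfo, Type dataContractType) => null;
    public object GetCustomDataToExport(Type clrType, Type dataContractType) => null;
    public void GetKnownCustomDataTypes(Collection<Type> customDataTypes) { }
    public Type GetReferencedTypeOnImport(string typeName, string typeNamespace, object customData) => null;
    public CodeTypeDeclaration ProcessImportedType(CodeTypeDeclaration typeDeclaration, CodeCompileUnit compileUnit) => typeDeclaration;
}
The surrogate is then attached to every operation on the client channel factory before the channel is created:
var factory = new ChannelFactory<IService>("NetTcp_IService");
foreach (OperationDescription operation in factory.Endpoint.Contract.Operations)
{
    var behavior = operation.Behaviors.Find<DataContractSerializerOperationBehavior>();
    behavior.DataContractSurrogate = new ListSurrogate();
}
var channel = factory.CreateChannel();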