How complex an object can be passed to Silverlight from the server using WCF?

Please note that my experience with Silverlight/.NET and WCF amounts to about two weeks of googling and deciphering tutorials. I need to provide feedback to a client on whether Silverlight is a viable solution for their application, which needs a RIA front end.
The client has a rather large .NET-based application with a UI layer that relies heavily on creating and manipulating their own custom classes and objects from the backend (which would be the server side).
A summary of what I understand the general procedure to be: one can pass simple objects containing simple data types, or more complex .NET types; basically anything that both the client and server sides can understand after serialization.
But what is the limit on the complexity of an object I can pass? Put differently: would Silverlight and WCF support passing a custom object that may contain references to other classes/objects, variables, etc.?
Additional Info (in case it can help):
I am not allowed direct access to their backend code, but from the information I have been given I can safely say that their classes make heavy use of inheritance and of overloaded functions/methods.

As far as I know there is nothing specific to Silverlight here. There are some things to keep in mind though.
WCF serialization doesn't like circular references.
All types need to be specified in the contract, so watch out with inheritance etc.
In general, using DTOs (Data Transfer Objects) and not exposing your business objects is the way to go.
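As a rough illustration of the points above, here is a minimal sketch of what such DTOs could look like (all type names are invented). Note that the child type does not point back at its parent, which keeps the graph free of cycles, and that a derived contract is announced on its base with [KnownType]:

using System.Collections.Generic;
using System.Runtime.Serialization;

// Hypothetical DTO: only the data the client needs.
[DataContract]
public class OrderDto
{
    [DataMember]
    public int Id { get; set; }

    [DataMember]
    public string CustomerName { get; set; }

    // Child items are plain DTOs that do not reference the parent order,
    // so the DataContractSerializer never encounters a circular reference.
    [DataMember]
    public List<OrderLineDto> Lines { get; set; }
}

[DataContract]
public class OrderLineDto
{
    [DataMember]
    public string Product { get; set; }

    [DataMember]
    public decimal Price { get; set; }
}

// When a member is declared as a base type, derived contracts must be
// announced with [KnownType] so they end up in the generated schema.
[DataContract]
[KnownType(typeof(PremiumCustomerDto))]
public class CustomerDto
{
    [DataMember]
    public string Name { get; set; }
}

[DataContract]
public class PremiumCustomerDto : CustomerDto
{
    [DataMember]
    public decimal Discount { get; set; }
}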

The metaphor is one of message passing as opposed to passing objects: DTOs, as Maurice said.
You can get pretty complex, but each object needs to have its contract defined.


What is the counterpart (pattern) to services on the client side?

Let's say I have a service, which is just a REST API. This REST API provides some data.
As far as I understand, I can encapsulate the data sent to and from this service in DTOs. This makes sense, since you'll have business objects but will often need to serialize them in some form. So, as far as I understand, this is a generally accepted and well-known way of abstracting that part.
These DTOs are then sent through the REST API. The server side seems pretty straightforward: some controllers provide or receive the data, and I don't see any issues there (at least for now).
Now to my question. On the client side there are objects that access this API. In my implementation such an object contains an HTTP client (although I may decouple that from these objects) as well as the methods to access the API. So, in one way or another, they abstract away the use of the HTTP client and the API access.
How do you name these objects that access the API?
I'm currently naming them XXXManager/XXXHandler/..., but these names feel far too generic, and I feel like there must be some convention or pattern for this. Naming them XXXService does not feel completely right either, because to me 'service' means the server-side part; these objects are accessing the service.
So how would you name these kinds of objects, and are there deeper patterns for handling this kind of service/API accessor?
The model/pattern that works here is a classic layered architecture, which goes like this:
The HttpClient is wrapped in a class (let's name it ApiClient) that exposes methods for accessing the REST API. In each of those methods, the HttpClient is used to execute the HTTP call.
There is a layer of Service/Manager classes that use the ApiClient and also apply their own business logic.
There is a layer of UI components which inject the Services/Managers to fetch the data and render it in the UI.
In this way you decouple the layers, which improves both the scalability and the testability of your code.
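A minimal sketch of that layering (ApiClient, TransactionService, TransactionDto and the URL are all invented for illustration, and System.Text.Json is assumed to be available for deserialization):

using System.Collections.Generic;
using System.Net.Http;
using System.Text.Json;
using System.Threading.Tasks;

// Plain DTO matching the shape of the API's JSON.
public class TransactionDto
{
    public decimal Amount { get; set; }
    public string Description { get; set; }
}

// Layer 1: the only class that knows about HttpClient and URLs.
public class ApiClient
{
    private readonly HttpClient _http;

    public ApiClient(HttpClient http) => _http = http;

    public async Task<List<TransactionDto>> GetTransactionsAsync()
    {
        var json = await _http.GetStringAsync("https://example.com/api/transactions");
        return JsonSerializer.Deserialize<List<TransactionDto>>(json);
    }
}

// Layer 2: a service/manager that uses the ApiClient and applies client-side rules.
public class TransactionService
{
    private readonly ApiClient _api;

    public TransactionService(ApiClient api) => _api = api;

    public async Task<List<TransactionDto>> GetIncomingTransactionsAsync()
    {
        var all = await _api.GetTransactionsAsync();
        return all.FindAll(t => t.Amount > 0); // business rule lives here, not in the UI
    }
}

// Layer 3: a UI component injects TransactionService and renders whatever it returns.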
The naming depends somewhat on the client-side implementation/framework you have.
If you have a web front-end client, then the name TransactionService would tell me that this class talks to some external transaction service ('Service' is not a name tied to server-side components).
This naming model applies to Angular, for example.
Patterns of Enterprise Application Architecture suggests Gateway, but I'd just go with Client.

Converting a Library to WCF web service

As the subject line describes, I am in the process of exposing a C# library as a WCF service. Eventually we want to expose all of the functionality, but at present the scope is limited to a subset of the library API. One of the goals of this exercise is also to make sure that the WCF service uses a request/response message exchange pattern, so the interface/API will change, as the existing library does not use this pattern.
I have started off by implementing the Service Contracts and the Request/Response objects, but when it comes to designing the DataContracts, I am not sure which way to go.
I am torn between going back and annotating the existing library classes with DataContract/DataMember attributes vs. defining new classes that act as surrogates for the existing classes.
Does anyone have experience with a similar task, or any recommendations on which way works best? I would like to point out that our team owns the existing library, so we do have the source code for it. Any pointers or best practices would be helpful.
My recommendation is to use the Adapter pattern, which in this case basically means creating brand-new DataContracts and ServiceContracts. This allows everything to vary independently, and lets you optimize the WCF stuff for WCF and the API stuff for the API (if that makes sense). The last thing you want is to go down the modification route and find that something just won't map right once you are almost done.
Starting from .NET 3.5 SP1 you no longer need to decorate objects you want to expose with [DataContract]/[DataMember] attributes; all public properties will be automatically exposed. That being said, I personally prefer to use dedicated DTO objects that I expose and decorate with those attributes. I then use AutoMapper to map between the actual domain models and the objects I want to expose.
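For example, a sketch of that mapping step (Customer, CustomerDto and the mapping holder are hypothetical; this uses AutoMapper's MapperConfiguration API, present in current versions of the library):

using AutoMapper;
using System.Runtime.Serialization;

// Domain class from the existing library (hypothetical).
public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string InternalNotes { get; set; }   // not something to put on the wire
}

// DTO actually exposed by the WCF service.
[DataContract]
public class CustomerDto
{
    [DataMember] public int Id { get; set; }
    [DataMember] public string Name { get; set; }
}

public static class CustomerMapping
{
    // Configure once, reuse the mapper for every service call.
    private static readonly IMapper Mapper =
        new MapperConfiguration(cfg => cfg.CreateMap<Customer, CustomerDto>()).CreateMapper();

    public static CustomerDto ToDto(Customer customer) => Mapper.Map<CustomerDto>(customer);
}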
If you are going to continue to use the existing library but want to have control over what you expose as the web service API, I would recommend defining new classes as wrapper(s) around the library.
What I mean is: don't "convert" the existing library, even if you think you're not going to keep using it in other contexts. If it has been tested and proven, take advantage of that fact and wrap around it.
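To make the wrapper idea concrete (every type here is invented, and the existing library is represented by a stand-in class), the service defines its own request/response messages and simply delegates to the library:

using System.Runtime.Serialization;
using System.ServiceModel;

// Request/response messages designed for the service, not reused from the library.
[DataContract]
public class GetPriceRequest
{
    [DataMember] public string ProductCode { get; set; }
}

[DataContract]
public class GetPriceResponse
{
    [DataMember] public decimal Price { get; set; }
}

[ServiceContract]
public interface IPricingService
{
    [OperationContract]
    GetPriceResponse GetPrice(GetPriceRequest request);
}

// The service implementation is a thin adapter: it translates messages and
// delegates the real work to the existing, already-tested library code.
public class PricingService : IPricingService
{
    private readonly PricingLibrary _library = new PricingLibrary();

    public GetPriceResponse GetPrice(GetPriceRequest request)
    {
        var price = _library.CalculatePrice(request.ProductCode);
        return new GetPriceResponse { Price = price };
    }
}

// Stand-in for the existing library so the sketch is self-contained.
public class PricingLibrary
{
    public decimal CalculatePrice(string productCode) => 42m;
}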

WCF Data Objects Best Practice

I am in the process of migrating from web services to WCF, and rather than trying to make the old code work in WCF, I am just going to rebuild the services. As part of this process, I have not yet figured out the best design to provide easy-to-consume services that also support future changes.
My service follows the pattern below; I actually have many more methods than this, so code duplication is an issue.
<ServiceContract()>
Public Interface IPublicApis
    <OperationContract(AsyncPattern:=False)>
    Function RetrieveQueryDataA(ByVal req As RequestA) As ResponseA
    <OperationContract(AsyncPattern:=False)>
    Function RetrieveQueryDataB(ByVal req As RequestB) As ResponseB
    <OperationContract(AsyncPattern:=False)>
    Function RetrieveQueryDataC(ByVal req As RequestC) As ResponseC
End Interface
Following this advice, I first created the schemas for the Request and Response objects. I then used SvcUtil to create the resulting classes so that I am assured the objects are consumable by other languages, and the clients will find the schemas easy to work with (no references to other schemas). However, because the Requests and Responses have similar data, I would like to use interfaces and inheritance so that I am not implementing multiple versions of the same code.
I have thought about writing my own version of the classes using interfaces and inheritance in a separate class library, and implementing all of the logging, security, and data retrieval logic there. Inside each operation I would just convert the RequestA to my InternalRequestA and call InternalRequestA's process function, which returns an InternalResponseA. I would then convert that back to a ResponseA and send it to the client.
Is this idea crazy?!? I am having trouble finding another solution that takes advantage of inheritance internally but still gives the client clean schemas that support future updates.
Contracts defined using WCF data contracts generally produce relatively straightforward schemas that are highly interoperable. I believe this was one of the guiding principles behind the design of WCF. However, this interoperability relates to the messages themselves, not to the objects that some other system might produce from them. How the messages are converted to/from objects at the other end depends entirely on the other system.
We have had no real issues using inheritance with data contract objects.
So, given that you clearly have control over the schemas (i.e. they are not being specified externally) and can make good use of WCF's built-in data contract capabilities, I struggle to see what benefit you would get from the additional complexity and effort implied by your proposed approach.
In my view the logic associated with processing the messages should be kept entirely separate from the messages themselves.
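For what it's worth, here is a sketch (in C#, with invented member names) of how shared request data can sit in a base data contract without losing a clean schema; because each operation declares the concrete derived type, no [KnownType] attribute is needed:

using System.Runtime.Serialization;
using System.ServiceModel;

[DataContract]
public class RequestBase
{
    [DataMember] public string ApiKey { get; set; }
    [DataMember] public string CorrelationId { get; set; }
}

[DataContract]
public class RequestA : RequestBase
{
    [DataMember] public int CustomerId { get; set; }
}

[DataContract]
public class ResponseA
{
    [DataMember] public string CustomerName { get; set; }
}

[ServiceContract]
public interface IPublicApis
{
    // The operation takes the concrete derived request, so the schema stays explicit.
    [OperationContract]
    ResponseA RetrieveQueryDataA(RequestA request);
}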

WCF Object Design - OOP vs SOA

What is the proper way to handle polymorphic business objects in a WCF/SOAP world?
It seems to me that SOA and OOP are at odds with each other - to expose a clean WSDL you need concrete objects, typically not even utilizing inheritance. On the other hand, presumably in the underlying system, you'll want to follow proper OO design.
What do people typically do here? Build a set of WCF contract objects, forgoing OOP principles, then convert to and from another set of objects in the actual logic layers?
"What do people typically do here? Build a set of WCF contract objects, forgoing OOP principles, then convert to and from another set of objects in the actual logic layers?"
Yes.
The way WCF serializes things ends up putting a lot of limitations on what you can and can't do with the contract objects. What you can't do ends up being "most anything useful".
I've found it makes things much clearer if you think of the WCF-contract objects as just a data transfer mechanism. Basically like strongly/statically typed XML.
Instead of converting your business object to an XML string (and back again), you convert your business object to a WCF-contract object (and back again), but it's otherwise similar.
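A small sketch of that idea (Invoice and InvoiceContract are made-up names): the contract object is a dumb bag of data, and the business object converts to and from it much as it would to an XML string:

using System.Runtime.Serialization;

// WCF-contract object: nothing but serializable data.
[DataContract]
public class InvoiceContract
{
    [DataMember] public int Number { get; set; }
    [DataMember] public decimal Total { get; set; }
}

// Business object: behaviour, invariants and polymorphism live here.
public class Invoice
{
    public int Number { get; private set; }
    public decimal Total { get; private set; }

    public Invoice(int number, decimal total)
    {
        Number = number;
        Total = total;
    }

    public InvoiceContract ToContract() =>
        new InvoiceContract { Number = Number, Total = Total };

    public static Invoice FromContract(InvoiceContract contract) =>
        new Invoice(contract.Number, contract.Total);
}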
After reading the Thomas Erl library, I came to the following conclusion:
Think of the WCF contracts/SOAP messages as simply messages that the services use to communicate (don't tie them tightly to the objects in your code).
You can then use OOP to design a code-base that gracefully handles those messages using common OOP techniques.
You use an abstraction (interface type) annotated with WCF attributes in order to define your Service contract.
This both depends on an abstraction, in keeping with OOP, and defines a service endpoint, which is SOA.
In general, if you find that you are getting business objects with dependencies, consider pulling those dependencies up into the service business layer rather than injecting dependencies into the business objects.
The service business layer then acts as a mediator, operating on both the WCF service proxy and the business objects, as opposed to having the business objects act on the WCF service proxy.
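A rough sketch of that arrangement (the proxy interface and all type names are invented for illustration):

// The business object stays free of any service dependency.
public class Account
{
    public string Id { get; }
    public decimal Balance { get; private set; }

    public Account(string id, decimal balance)
    {
        Id = id;
        Balance = balance;
    }

    public void ApplyInterest(decimal rate) => Balance += Balance * rate;
}

// Hypothetical generated WCF proxy surface.
public interface IAccountServiceProxy
{
    decimal GetBalance(string accountId);
    void SaveBalance(string accountId, decimal balance);
}

// The service business layer mediates: it talks to the proxy and to the
// business objects, so neither needs to know about the other.
public class AccountBusinessService
{
    private readonly IAccountServiceProxy _proxy;

    public AccountBusinessService(IAccountServiceProxy proxy) => _proxy = proxy;

    public void ApplyInterest(string accountId, decimal rate)
    {
        var account = new Account(accountId, _proxy.GetBalance(accountId));
        account.ApplyInterest(rate);
        _proxy.SaveBalance(account.Id, account.Balance);
    }
}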
All great comments on this topic! I'll add my vote to the notion of an adapter for mediation between your service orientation and object orientation. I also like Thomas Erl's approach, where in his service model he introduces the notions of "application services" and "business services." These are the way to go for your integration points with your specific application/business environment (i.e. your object-oriented and component-oriented framework/API). This approach should result in much better composability, and thus capability, for you enterprise framework gurus out there.

Lazy Loading with a WCF Service Domain Model?

I'm looking to push my domain model into a WCF Service API and wanted to get some thoughts on lazy loading techniques with this type of setup.
Any suggestions when taking this approach?
When I implemented this technique and stepped through my app, just before the server returns my list it hits the getter of each property that is supposed to be lazily loaded... thus eager loading. Could you explain this issue or suggest a resolution?
Edit: It appears you can use the XmlIgnore attribute so the property doesn't get touched during serialization... still reading up on this, though.
Don't do lazy loading over a service interface. Define explicit DTOs and consume those as your data contracts in WCF.
You can use NHibernate (or other ORMs) to properly fetch the objects you need to construct the DTOs.
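As an illustration (entity and DTO names are invented, and NHibernate's LINQ provider is assumed), the data access code can eagerly fetch exactly what the DTO needs and map it while the session is still open:

using System.Collections.Generic;
using System.Linq;
using NHibernate;
using NHibernate.Linq;

public class OrderSummaryDto
{
    public int Id { get; set; }
    public List<string> ProductNames { get; set; }
}

public class OrderQueries
{
    private readonly ISessionFactory _sessionFactory;

    public OrderQueries(ISessionFactory sessionFactory) => _sessionFactory = sessionFactory;

    public IList<OrderSummaryDto> GetOrdersForCustomer(int customerId)
    {
        using (var session = _sessionFactory.OpenSession())
        {
            // Fetch the children the DTO needs in one controlled query,
            // instead of letting lazy loading fire during serialization.
            var orders = session.Query<Order>()
                .Where(o => o.CustomerId == customerId)
                .Fetch(o => o.Lines)
                .ToList();

            return orders.Select(o => new OrderSummaryDto
            {
                Id = o.Id,
                ProductNames = o.Lines.Select(l => l.ProductName).ToList()
            }).ToList();
        }
    }
}

// Hypothetical mapped entities (members virtual so NHibernate can proxy them).
public class Order
{
    public virtual int Id { get; set; }
    public virtual int CustomerId { get; set; }
    public virtual IList<OrderLine> Lines { get; set; }
}

public class OrderLine
{
    public virtual string ProductName { get; set; }
}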
As with any remoting architecture, you'll want to avoid loading a full object graph "down the wire" in an uncontrolled way (unless you have a trivially small number of objects).
The Wikipedia article on lazy loading pretty much summarises the standard techniques (and in C#, too!). I've used both ghosts and value holders, and they work pretty well.
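For instance, a minimal value-holder style sketch using .NET's Lazy&lt;T&gt; (Customer, Purchase and the loader delegate are all hypothetical):

using System;

public class Purchase
{
    public decimal Total { get; set; }
}

// Value holder: the expensive collection is only fetched the first time
// it is asked for; later accesses reuse the cached result.
public class Customer
{
    private readonly Lazy<Purchase[]> _purchases;

    public Customer(string name, Func<Purchase[]> purchasesLoader)
    {
        Name = name;
        _purchases = new Lazy<Purchase[]>(purchasesLoader);
    }

    public string Name { get; }

    public Purchase[] Purchases => _purchases.Value;   // first access triggers the load
}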
To implement this kind of technique, make sure that you separate concerns strictly. On the server, your service contract implementation classes should be the only bits of the code that work with data contracts. On the client, the service access layer should be the only code that works with the proxies.
Layering like this lets you adjust the way that the service is implemented relatively independently of the UI layers calling the service and the business tier that's being called. It also gives you half a chance of unit testing!
You could try using something REST-based (e.g. ADO.NET Data Services) and wrap it transparently into your client code.