WCF and ADO.NET Entity Framework - wcf

We are using LINQ to Entities in a WCF service. We created an .edmx file which contains the auto-generated entities. When we create the client proxy, those entities do not appear in the proxy class, even though the DataContract and DataMember attributes are there. We found that the problem is that the auto-generated entities inherit from System.Data.Objects.DataClasses.EntityObject; if we create a class without any inheritance, that class does appear in the proxy. Is there any way to resolve this?
Regards
Sekar

The way we do this is:
Auto generate entity framework entities
Create separate classes to be used in the data contracts
Write mapping code to convert between the contract classes and the entity classes (a sketch follows below)
This may be a bit cumbersome, but it works (it also isolates your services from changes in your database). This should become much easier in the next version of Entity Framework.
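For illustration only, here is a minimal sketch of that mapping, assuming a hypothetical auto-generated Customer entity and a hand-written CustomerContract data contract (the names are made up for the example):

    using System.Runtime.Serialization;

    // Hand-written class used in the service contract (not the EF-generated entity).
    [DataContract]
    public class CustomerContract
    {
        [DataMember]
        public int Id { get; set; }

        [DataMember]
        public string Name { get; set; }
    }

    // Mapping code between the EF entity (Customer, assumed to be generated
    // from the .edmx) and the data contract class.
    public static class CustomerMapper
    {
        public static CustomerContract ToContract(Customer entity)
        {
            return new CustomerContract { Id = entity.Id, Name = entity.Name };
        }

        public static Customer ToEntity(CustomerContract contract)
        {
            return new Customer { Id = contract.Id, Name = contract.Name };
        }
    }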

Related

Entity Framework through WCF Generates Backing Fields

I'm new to EF 5 (switching from LINQ to SQL).
I auto-generated an .edmx based on my database within my WCF project.
I then updated my Windows client, which points to this same WCF service, to auto-generate all the entities on the client.
My issue is that the generation process appends __BackingField to all properties.
For example:
User.Name within my WCF service becomes User.Name__BackingField on my client.
My entities generated through LINQ to SQL did not have this problem.
Any help to remove that __BackingField is appreciated.
Thanks,
Mathieux
I found out what was causing the issue.
I had extended some of my entities with partial classes that were marked with [Serializable].
Removing [Serializable] from my own partial classes resolved the issue.
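To illustrate the fix with a hypothetical User entity (the DisplayName member and the Name property it reads are assumptions for the example):

    // Before (this caused the __BackingField property names in the client proxy):
    // [Serializable]
    // public partial class User
    // {
    //     public string DisplayName { get { return Name; } }
    // }

    // After: the same hand-written partial class, without [Serializable].
    // The DataContract-based serialization of the generated entity is enough.
    public partial class User
    {
        public string DisplayName { get { return Name; } }
    }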

How can I send POCO Entities through WCF Service when I don't want to track the entity *later*?

I have an ASP.NET MVC 4 project where the Controller calls a WCF service layer, which calls a business layer, which uses a repository of EF 5.0 entities. The results are then returned as POCO entities to the Controller.
It works fine while the WCF service is directly referenced as a library, but I know it won't work when referenced as a service, because the entities will need to be serialized, and with proxy creation enabled this is not possible.
I don't want to create DTOs because I use generated POCO entities, that's why they exist in my humble opinion.
I want to track changes only before the POCO entities reach the service layer.
A lot of people talk about using DTOs even when they are identical to the POCOs. If I did that, I would be creating auto-generated copies of classes, just with different names, to serve as "proxy-disabled POCOs as DTOs", which would be a little strange.
Could I kill the proxy class of a POCO, in a way the object could be serialized when returned from the Service layer?
Also, I don't know if this idea is good practice, but it would be great to send "clean" entities to my Controllers, ready to be mapped to ViewModels.
I'm looking for performance too.
The problem is solved using ProxyDataContractResolver. We must also use [Serializable] and [DataContract(IsReference=true)]. With this combination, proxy creation can stay enabled.
The way we handled this was by doing the following:
Customize the T4 template generating the POCO classes so that it generates classes decorated with the
[Serializable] and [DataContract(IsReference=true)] attributes.
Both the frontend (views) and the backend (WCF service / business layer) reference the generated POCO classes, since you won't be using a client proxy thanks to IsReference=true.
And that's basically it.
With this, you don't have to create DTOs and can just use the POCO classes both in the backend and the frontend (a sketch of the generated output is shown below).
Keep in mind, though, that WCF with IsReference=true does not like redundant objects (so this can be an issue on some POCO classes with navigation properties).
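As an illustration only, the customized T4 output might look roughly like this for a hypothetical Order entity (the class and property names are assumptions, not the template's actual output):

    using System;
    using System.Collections.Generic;
    using System.Runtime.Serialization;

    [Serializable]
    [DataContract(IsReference = true)]
    public partial class Order
    {
        [DataMember]
        public int OrderId { get; set; }

        [DataMember]
        public DateTime CreatedOn { get; set; }

        // Navigation property; IsReference = true lets the serializer preserve
        // object identity instead of duplicating related objects or failing on cycles.
        [DataMember]
        public virtual ICollection<OrderItem> Items { get; set; }
    }

    [Serializable]
    [DataContract(IsReference = true)]
    public partial class OrderItem
    {
        [DataMember]
        public int OrderItemId { get; set; }

        [DataMember]
        public virtual Order Order { get; set; }
    }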

Share POCO types between WCF Data Service and Client Generated by Add Service Reference

I have a WCF Data Service layer that is exposing POCO entities generated by the POCO T4 template. These POCO entities are created in their own project (i.e. Company.ProjectName.Entities) because I'd like to share them wherever possible.
I have a set of interfaces in another project (Company.ProjectName.Clients) that reference these POCO types through an assembly reference to Company.ProjectName.Entities.dll. One implementation of these interfaces is a .NET client that consumes the service using the WCF Data Services client library.
I used Add Service Reference to add a reference to the service. This generated the DataServiceContext client class and the POCO entities used by the service. However, these POCO types generated by the Add Service Reference utility now have a different namespace (i.e. Company.ProjectName.Clients.Implementation.WcfDsReference).
That means the POCO types used in the interfaces cannot be used with the types generated by the utility without having to cast or map.
i.e. Suppose I have:
1. POCO Entity: Company.ProjectName.Entities.Account
2. Interface: interface IRepository<Company.ProjectName.Entities.Account>{....}
3. Implementation: ServiceClientRepository : IRepository<Company.ProjectName.Entities.Account>
4. WcfDsReference: Company.ProjectName.Clients.Implementation.WcfDsReference and Company.ProjectName.Clients.Implementation.WcfDsReference.Account
Let's say I want to create a DataServiceQuery on Account; I won't be able to do this:
var client = new WcfDsReference(baseUrl);
var accounts = client.CreateQuery<Company.ProjectName.Entities.Account>(...)
or: client.AddToAccounts(Company.ProjectName.Entities.Account)
because CreateQuery<T>() expects T to be of type Company.ProjectName.Clients.Implementation.WcfDsReference.Account.
What I currently have to do is pass the correct entity to the CreateQuery method and then map the results back to the type the interface understands. (Possible with a mapper, but that doesn't seem like a good solution.)
So the question is, is there a way to get the Add Service Reference utility to generate methods that use the POCO types that are in the Company.ProjectName.Entities namespace?
One solution I am thinking of is to not use the utility to generate the DataServiceContext and other types, but to create my own.
The other solution is to update the IRepository<T> interface to use the POCO types generated by the utility. But this sounds a little bit hacky.
Is there any better solution that anyone has come up with or if there's any suggestion?
Ok, a few hours after starting the bounty I found out why it wasn't working as expected on my end.
It turns out that the sharing process is quite easy. All that needs to be done is to mark the model classes with the [DataServiceKey] attribute (a sketch is shown below). This article explains the process quite well, in the 'Exposing another Data Model' section.
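For example, a shared model class in project C might be marked like this (Unit and its entity set are taken from the client code further down; the key and property names are assumptions):

    using System.Data.Services.Common;

    // Shared model class, referenced by both the data service project and the client project.
    // [DataServiceKey] tells the WCF Data Services runtime which property is the entity key.
    [DataServiceKey("Id")]
    public class Unit
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }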
With that in mind, what I was trying to do is the following:
Placing the model on a separate class library project C, sharing it with both webapplication projects A and B
Create the data service on project A
Add the service reference on project B
Delete the generated model proxies out of the service reference, and update it to use my model classes in project C
Add the DataServiceKey attribute to the models, specifying the correct keys
When I tried this it did not work, giving me the following error:
There is a type mismatch between the client and the service. Type {MyType} is not an entity type, but the type in the response payload represents an entity type. Please ensure that types defined on the client match the data model of the service, or update the service reference on the client.
This problem was caused by a version mismatch between project C (which was using the stock implementations on the System.Data.OData assemblies) and the client project B that was calling the service (using the Microsoft.Data.OData assemblies in the packages). By matching the version on both ends, it worked the first time.
After all this, one problem remained: the service reference procedure still does not detect the models to be shared, so proxies are generated as usual. This led me to opt out of the automatic service reference mechanism and instead write a simple class of my own to serve as the client to the WCF Data Service. It is basically a heavily trimmed version of the normally auto-generated class:
    using System;
    using System.Data.Services.Client;
    using System.Data.Services.Common;
    using Model;

    public class DataServiceClient : DataServiceContext
    {
        private readonly Lazy<DataServiceQuery<Unit>> m_units;

        public DataServiceClient(Uri _uri)
            : base(_uri, DataServiceProtocolVersion.V3)
        {
            // The entity set name is hardcoded here; the generated service
            // reference would normally take care of this.
            m_units = new Lazy<DataServiceQuery<Unit>>(() => CreateQuery<Unit>("Units"));
        }

        public DataServiceQuery<Unit> Units
        {
            get { return m_units.Value; }
        }
    }
This is simple enough because I'm only using the service in readonly mode. I would still like to use the service reference feature though, potentially avoiding future maintenance problems, as evidenced by the hardcoded EntitySet name in this simple case. At the moment, I'm using this implementation and have deleted the service reference altogether.
I would really like to see this fully integrated with the service reference approach if anyone can share a workaround to it, but this custom method is acceptable for our current needs.

n-tiers, Linq and WCF

We have an n-tiers architecture :
-a WCF Service that communicates with the database and handles all the business logic.
-an ASP.NET MVC website that communicates with the WCF service.
Here is a scenario of data serialization-deserialization from the database to the HTML view of a 'guitar':
-Guitar_1: a class generated by LINQ,
-Guitar_2: the DataContract exposed by the WCF service and consumed by the ASP.NET MVC website,
-Guitar_3: the model passed to the View.
When an end user wants to retrieve a guitar, Guitar_1 is transformed into Guitar_2 and then into Guitar_3. That's really not a problem, but if the end user requests a list of guitars then this whole process is repeated for each guitar (a loop).
If I had to handle all the serialization-deserialization programmatically, I'd have only one class per layer. It could still be done, for example in the WCF project, by annotating the LINQ class with 'DataContract'/'DataMember', but if I refresh my database model all my annotations disappear (same thing in the ASP.NET MVC project: refreshing the service reference deletes all the added code).
Also, is it really more productive to use these automatic serializers? The time taken to write a serializer-deserializer is about the same as annotating classes (DataContract/DataMember) and handling the conversion from Guitar_1 to Guitar_2... Add to that the loss of performance (the loop and the conversions)...
What do you guys think? Do some of you code as in the old days because of this?
UPDATE: As suggested by 'Abhijit Kadam', I used partial classes when consuming a web service; however, I found a better solution when using Linq2SQL: POCO classes.
If the main concern is that the model classes created by the framework are automatically regenerated and your changes, such as annotations on those classes, are wiped out, then you can use partial classes (info here). Say the auto-generated class is Employee. In a separate file, create a partial class Employee and include in that partial definition the members you want to add or annotate. This file will not be wiped out and regenerated. When you compile, the resulting Employee class is the combination of the original Employee class and the partially defined Employee class. A sketch follows below.
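A minimal sketch of the partial-class idea (Employee is the hypothetical generated entity; the extra member is made up for the example):

    // Auto-generated file (regenerated whenever the model is refreshed):
    // public partial class Employee
    // {
    //     public int Id { get; set; }
    //     public string FirstName { get; set; }
    //     public string LastName { get; set; }
    // }

    // Hand-written file, kept out of the code generator's way.
    // At compile time both parts are merged into a single Employee class.
    public partial class Employee
    {
        public string FullName
        {
            get { return FirstName + " " + LastName; }
        }
    }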
Also, converting from class Guitar_1 to Guitar_2 is OK, and at times we have to do such things to meet specific requirements. I prefer JSON data to be transferred across the wire, for example from WCF to the MVC web app, with the browser then fetching the JSON data from the MVC app. I then use frameworks like jsrender or knockout to render the data as HTML on the client side (browser). JSON is readable and compact, and JavaScript and JavaScript libraries love JSON.

WCF and Inheritance

I'm working on a project where I have an abstract class called Appointment. There are Workouts, Meals and Measurements that all derive from Appointment. My architecture looks like this so far:
DAO - with the data access layer being Entity Framework 4 right now
POCO classes using the T4 templates
WCF
Silverlight client, ASP.NET MVP, mobile clients
Would I put business rules in the POCO classes? Or map my entities to business objects with rules and then map those to DTOs and pass those through WCF? And when I pass the DTOs, do I pass them as type Appointment, or write a service method for each subclass like Workout or Meal?
I haven't found any good material on using table-per-type inheritance with WCF.
thanks in advance!
-ajax
It mainly depends on the complexity you require. You are using POCO classes, which is a good starting point. You now have to choose how complex an application you are going to build, how much business logic you want to add, and what you want to expose to your clients.
A POCO entity can be just a DTO, or you can turn the POCO entity into a business object by adding business methods and rules directly to it, transforming the entity into an Active Record or a domain object. I don't see any reason to map your POCOs to another set of business objects.
Exposing POCO entities on the WCF service is the simplest way. You can use operations that work directly with the Appointment class. Additionally, you have to give your service information about all classes derived from Appointment: check KnownTypeAttribute and ServiceKnownTypeAttribute (a sketch is shown below). Using entities often means that service calls transport more than is needed, which can be a problem for mobile clients with slow internet connections. There is one special point to be aware of when exposing an entity that is an aggregate root (it contains references to other entities and collections of entities): if you don't have full control over the client applications and you allow clients to send back a fully modified object graph, you have to validate not only each entity but also that the client changed only what it was allowed to change. Example: suppose the client wants to modify an Order entity. You send it the Order with all its OrderItem entities, each item with a reference to its Product entity, i.e. the full object graph. What happens if, instead of modifying the Order and OrderItems, the client changes one of the Products (for example the price)? If you don't check this in the business logic exposed by WCF and you pass the modified object graph into the EF context, it will modify the price in your database.
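A minimal sketch of the known-types part, assuming the Appointment hierarchy from the question (the members and the service contract are made up for illustration):

    using System;
    using System.Collections.Generic;
    using System.Runtime.Serialization;
    using System.ServiceModel;

    // The base class advertises its derived types to the DataContractSerializer.
    [DataContract]
    [KnownType(typeof(Workout))]
    [KnownType(typeof(Meal))]
    [KnownType(typeof(Measurement))]
    public abstract class Appointment
    {
        [DataMember]
        public int Id { get; set; }

        [DataMember]
        public DateTime ScheduledOn { get; set; }
    }

    [DataContract]
    public class Workout : Appointment
    {
        [DataMember]
        public int DurationMinutes { get; set; }
    }

    [DataContract]
    public class Meal : Appointment
    {
        [DataMember]
        public int Calories { get; set; }
    }

    [DataContract]
    public class Measurement : Appointment
    {
        [DataMember]
        public double WeightKg { get; set; }
    }

    // A single operation can then accept or return the base type,
    // and the serializer resolves the concrete subclass at runtime.
    [ServiceContract]
    public interface IAppointmentService
    {
        [OperationContract]
        IList<Appointment> GetAppointments(DateTime from, DateTime to);
    }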
If you decide to use your entities as business objects, you usually don't expose those entities; instead you create a larger set of DTOs. Each operation works with a precisely defined DTO for its request and response. That DTO carries only the information that is really needed, which reduces the data payload of service calls and avoids problems like the modified product price, because you can simply define your DTO so that it does not transfer the price, or even the whole Product, from the client. This solution is much more time consuming to implement and adds an additional layer of complexity. A sketch of such per-operation DTOs follows below.
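For illustration, per-operation DTOs for the order example might look like this (all names are hypothetical):

    using System.Collections.Generic;
    using System.Runtime.Serialization;

    // Request DTO: carries only what the client is allowed to change.
    // Product prices are deliberately not part of the contract at all.
    [DataContract]
    public class UpdateOrderRequest
    {
        [DataMember]
        public int OrderId { get; set; }

        [DataMember]
        public IList<OrderItemDto> Items { get; set; }
    }

    [DataContract]
    public class OrderItemDto
    {
        [DataMember]
        public int OrderItemId { get; set; }

        [DataMember]
        public int Quantity { get; set; }
    }

    // Response DTO: carries only what the caller needs to display or confirm.
    [DataContract]
    public class UpdateOrderResponse
    {
        [DataMember]
        public int OrderId { get; set; }

        [DataMember]
        public decimal NewTotal { get; set; }
    }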
Because I have mentioned object graphs, I must clarify that there is another hidden level of complexity when using them: change tracking. The EF context needs to know what has changed in the object graph (at least which OrderItem was modified, which was added or deleted, etc.) for correct persistence. Tracking changes in a multi-tier solution is a challenge. The simplest approach does not track changes at all and instead issues an additional query through EF: that query returns the currently persisted state of the object graph, and the modified graph is merged into it (special care is needed for concurrency checks). Other solutions use some tracking support in the entity itself: check Tracking changes in POCO and Self-tracking entities. But this only covers entities; if you want to track changes in DTOs, you have to implement your own change tracking. You can also read these MSDN Magazine articles about multi-tier applications and EF:
Anti-Patterns To Avoid In N-Tier Applications;
Building N-Tier Apps with EF4