CRUD Operations on Underlying POCO Data via OData/WCF Data Services

I'm trying to write an OData service in C# (Visual Studio 2010) that exposes some POCO entities to a jQuery web client via JSON, but also allows updates to the underlying data. I've found lots of examples of read-only POCO data via OData, and lots of examples of updatable data via Entity Framework and OData.
My problem is that the data lives in a proprietary database, so there needs to be a business logic layer to handle the DB updates, and I don't see where this fits in the OData/WCF Data Services model. I'm populating the POCO entities as IQueryable lists and exposing them using SetEntitySetAccessRule, but how do I call a method in the business logic/data model layer to persist data to the DB?
Should I be using SetServiceOperationAccessRule to expose service methods? If so, could anyone point me in the direction of a simple example please?
Thanks

My suggestion would be a custom WCF Data Services provider, so that you can have a custom implementation of IDataServiceUpdateProvider. There is a good blog series at http://blogs.msdn.com/b/alexj/archive/2010/01/07/data-service-providers-getting-started.aspx

Rich's suggestion to implement IUpdatable/IDataServiceUpdateProvider is correct. That's the way to support update operations (the EF provider implements this in-box; the reflection provider doesn't, so you have to do it yourself).
You can implement IUpdatable even when using the reflection provider. Just make your context class (the one you pass as the type parameter T to DataService<T>) implement the IUpdatable interface.
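To make that concrete, here is a minimal sketch of a reflection-provider context that implements IUpdatable and routes all writes through a business-layer object. Every name here (Order, OrderRepository, OrderDataService) is illustrative rather than taken from the question, and the repository is just an in-memory list so the sketch is self-contained; a real implementation would call into the proprietary-database layer and flesh out the navigation-property members.

using System;
using System.Collections.Generic;
using System.Data.Services;
using System.Data.Services.Common;
using System.Linq;

// Illustrative POCO entity; the attribute tells the reflection provider which property is the key.
[DataServiceKey("Id")]
public class Order
{
    public int Id { get; set; }
    public string Customer { get; set; }
}

// Stand-in for the business logic/data model layer that talks to the
// proprietary database (an in-memory list here so the sketch runs).
public class OrderRepository
{
    private static readonly List<Order> Store = new List<Order>();
    public IQueryable<Order> GetOrders() { return Store.AsQueryable(); }
    public void Insert(Order o) { Store.Add(o); }
    public void Update(Order o) { /* persist the change via the business layer */ }
    public void Delete(Order o) { Store.Remove(o); }
}

// The context passed as T to DataService<T>: IQueryable properties feed reads,
// IUpdatable handles writes. Work is queued and only pushed to the repository
// in SaveChanges, so a failed request does not leave half-applied changes.
public class OrderContext : IUpdatable
{
    private readonly OrderRepository _repository = new OrderRepository();
    private readonly List<Action> _pending = new List<Action>();

    public IQueryable<Order> Orders { get { return _repository.GetOrders(); } }

    public object CreateResource(string containerName, string fullTypeName)
    {
        var order = new Order();
        _pending.Add(() => _repository.Insert(order));
        return order;
    }

    public object GetResource(IQueryable query, string fullTypeName)
    {
        // The runtime hands us a query that identifies exactly one entity.
        var resource = query.Cast<object>().SingleOrDefault();
        if (resource == null) throw new DataServiceException(404, "Resource not found.");
        return resource;
    }

    public object ResetResource(object resource) { return resource; }

    public void SetValue(object target, string propertyName, object value)
    {
        target.GetType().GetProperty(propertyName).SetValue(target, value, null);
        if (target is Order) _pending.Add(() => _repository.Update((Order)target));
    }

    public object GetValue(object target, string propertyName)
    {
        return target.GetType().GetProperty(propertyName).GetValue(target, null);
    }

    public void DeleteResource(object target) { _pending.Add(() => _repository.Delete((Order)target)); }

    public void SaveChanges() { foreach (var action in _pending) action(); _pending.Clear(); }

    public object ResolveResource(object resource) { return resource; }
    public void ClearChanges() { _pending.Clear(); }

    // Navigation-property plumbing, not needed for this flat entity set.
    public void SetReference(object target, string propertyName, object value) { throw new NotSupportedException(); }
    public void AddReferenceToCollection(object target, string propertyName, object resource) { throw new NotSupportedException(); }
    public void RemoveReferenceFromCollection(object target, string propertyName, object resource) { throw new NotSupportedException(); }
}

public class OrderDataService : DataService<OrderContext>
{
    public static void InitializeService(DataServiceConfiguration config)
    {
        config.SetEntitySetAccessRule("Orders", EntitySetRights.All);
        config.DataServiceBehavior.MaxProtocolVersion = DataServiceProtocolVersion.V2;
    }
}

With that in place, a POST or MERGE against /OrderDataService.svc/Orders flows through CreateResource/SetValue and ends up in SaveChanges, which is where the calls into your real business layer belong.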

Related

OData with WCF Data Services / Entity Framework

Apologies in advance, this is a long question.
(TL;DR: does anyone have any advice on using EF with dynamic fields exposed using WCF Data Services/OData?)
I am having some conceptual problems with WCF Data Services and EF, specifically pertaining to exposing some data as an OData service.
Basically my issue is this: the database I am exposing allows users to add fields dynamically (user-defined fields), and it does so by adding those fields directly to the underlying SQL tables. Furthermore, when you want to add data to the tables you cannot use direct SQL; you have to go via an API that the vendor provides (it's SAP Business One, FWIW).
I have already successfully built a system that exposes various objects via XML and allows a client to update or add new entities in SBO by sending in XML messages. Although it works well, it's not really suited to mobile apps, as it's very XML-heavy and the entry point is an old-school ASMX web service. I want to jazz it up for mobile development and use OData with WCF or Web API. (I know I could change up to a WCF service, allow handling of JSON-format requests, and start returning JSON data, but it just seems like there must be a more... native... way.)
Initially I had discounted the possibility of using the EF for this because (a) the dynamic fields, and (b) the EF model could only be read-only; adds and updates would have to be intercepted and routed to the SBO DI Server. However, I am coming back to thinking about it and am looking for some advice (negative or otherwise!) on how to approach it.
What I basically want to do is this
Expose the base tables from SBO (which don't change except when they themselves issue a patch) as EF entities, with all the usual relational goodness. In fact I will not actually be exposing the tables directly; I will use a set of filtered SQL views as the data sources, as this ties in with various other stuff we do to allow exposing only certain data to third parties.
Expose any UDFs a particular user has added as some kind of EAV sub-collection per entity.
Intercept any requests to ADD or UPDATE an object, and route these through an existing engine I have for interfacing with the SAP Data import services.
I suppose my main question is this: suppose I implement an EF entity representing a sales order, which comprises a header and a details collection. To each of these classes I add an EAV-type collection of user-defined fields and values. How much work is involved in allowing the OData filtering system to work directly on the EAV collection (e.g. for a client to be able to ask for Service/Orders?$filter=SomeUdfField eq SomeValue, where this request has to be passed down into the EAV collection of the order header entity)?
Or is it possible, for example, to generate an EF model from some kind of metadata on the fly (I don't mind how - code generation or a model-building library), which would mean I could just expose each entity, dynamic fields included, as a proper EF model? Many thanks in advance if you read this far :)
For basic CRUD against an existing EF context, WCF Data Services works out great. As soon as you want to add some custom functionality, as you described above, it takes a bit more work.
What you described is possible, but you would need to build your own custom data provider to handle the dynamic generation of metadata as well as custom hooks into add/update/delete.
It may be worth looking into the WCF Data Services Toolkit; it's a custom provider which slaps a repository pattern over WCF Data Services for ease of use, but it does not provide custom metadata generation.
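For the EAV part of the question, the shape below is one way it could look (all type and property names are illustrative, not SAP's). The catch is on the query side: with the UDFs modelled as a collection, a client cannot write $filter=SomeUdfField eq SomeValue as if the UDF were a first-class property; it has to drill into the collection, which needs the OData v3 any() operator (WCF Data Services 5.x) or a custom provider that projects the UDFs as real properties in the metadata.

using System.Collections.Generic;

// Illustrative EAV shape: every header (and detail) entity carries a collection
// of user-defined field name/value pairs alongside the fixed columns.
public class UdfValue
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string Value { get; set; }
}

public class OrderHeader
{
    public int DocEntry { get; set; }
    public string CardCode { get; set; }
    public ICollection<UdfValue> Udfs { get; set; }
}

// With that model, the filter from the question would have to be phrased as:
//   /Service.svc/Orders?$filter=Udfs/any(u: u/Name eq 'SomeUdfField' and u/Value eq 'SomeValue')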

WCF Data Services with Custom Entities

I have a set of custom entities which reflect the business representation of data. Then I also have a set of entities that map 1-to-1 to the database that represent the storage of the data. My business layer converts between the 2 types and performs any other logic needed. I only expose the custom objects through my service interface.
From what I can tell I cannot use WCF Data Services, because Data Services (1) need to be bound directly to a DB source (or some slight abstraction over the direct DB connection), and (2) that results in using the data entities.
Correct me if I'm wrong, but I can't see any way to use WCF Data Services and its built-in queryability with custom entities while using my business layer.
I do not necessarily agree with that. If you look at the architecture overview at http://msdn.microsoft.com/en-us/library/cc668794.aspx you see two other options next to the EF/DB connectivity. You can have data service providers that just take an alternative (your custom) information model, made up of queryable CLR classes, and expose it using WCF Data Services.
So if you create your business layer using this approach, your custom entities can just as easily be exposed with WCF Data Services.
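As a small illustration of that read path (all names below are made up for the example): the custom business entities only need to be queryable CLR classes with a discoverable key, and the class you hand to DataService<T> simply exposes them as IQueryable properties populated by whatever your business layer returns.

using System.Collections.Generic;
using System.Data.Services;
using System.Data.Services.Common;
using System.Linq;

// A custom business entity; [DataServiceKey] tells the reflection provider
// which property identifies it (the name "CustomerNumber" is just an example).
[DataServiceKey("CustomerNumber")]
public class Customer
{
    public string CustomerNumber { get; set; }
    public string Name { get; set; }
}

// The information model: queryable CLR classes. The business layer decides
// what goes into each IQueryable (stubbed here with an in-memory list).
public class BusinessModel
{
    public IQueryable<Customer> Customers
    {
        get
        {
            return new List<Customer>
            {
                new Customer { CustomerNumber = "C001", Name = "Contoso" }
            }.AsQueryable();
        }
    }
}

public class CustomerService : DataService<BusinessModel>
{
    public static void InitializeService(DataServiceConfiguration config)
    {
        // Read-only exposure of the custom entities; writes would additionally
        // require the model class to implement IUpdatable.
        config.SetEntitySetAccessRule("Customers", EntitySetRights.AllRead);
        config.DataServiceBehavior.MaxProtocolVersion = DataServiceProtocolVersion.V2;
    }
}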

OData WCF Data Service with NHibernate and corporate business logic

Let me first apologise for the length of the entire topic. It will be fairly long, but I wish to be sure that the message comes across clearly and without errors.
Here at the company we have an existing ASP.NET web application, written in C# on .NET Framework 3.5 SP1. Some time ago an initial API was developed for this web application using WCF and SOAP, to allow external parties to communicate with the application without relying on the browser.
This API survived for some time, but eventually the request came to create a new API that was RESTful and relied on newer technologies. I was given this assignment, and I created the initial API using the ASP.NET MVC 2 framework, running inside our ASP.NET web application. It initially took quite some time to get it running properly, but at the moment we're able to make REST calls on the application to receive XML detailing our resources.
I attended a Microsoft WebCamp, and I was immediately sold on the OData concept. It was very similar to what we were doing, but as a protocol supported by many players instead of our own implementation. Currently I'm working on a PoC (proof of concept) to recreate the API I developed using the OData protocol and the WCF Data Services technology.
After searching the Internet for how to get NHibernate 2 to work with Data Services, I succeeded in creating a read-only version of the API that allows us to read out the entities from the internal business layer by mapping the incoming query requests onto it.
However, we wish to have a functional API that also allows the creation of entities using the OData protocol, so now I'm a bit stuck on how to proceed. I've been reading the following article: http://weblogs.asp.net/cibrax/default.aspx?PageIndex=3
The above article nicely explains how to map a custom DataService onto the NHibernate layer. I've used this as a base to continue from, but I have the "problem" that I don't want to map my requests directly to the database using NHibernate; I wish to map them to our business layer (a separate DLL) that performs a large batch of checks, constraints and updates based upon access rights, privileges and triggers.
So what I want to ask: if I, for example, create my own NHibernateContext class as in the above article, but rely on our business layer instead of NHibernate sessions, could it work? I'd probably have to rely on reflection a lot to figure out the type of object I'm working with at runtime and call the correct business classes to perform the updates and deletes.
To demonstrate with a small ASCII picture:

*-------------------*
*      Database     *
*-------------------*
*--------------------------*
*  DAL (Data Access Layer) *
*--------------------------*
*--------------------------*
*  BUL (Business Layer)    *
*--------------------------*
*-----------------*   *----------------*
*  My OData stuff *   *  Internal API  *
*-----------------*   *----------------*
*-------------------*
*  Web Application  *
*-------------------*
So, would this work, or would the performance make it useless?
Or am I just missing the ball here?
The idea is that I wish to reuse whatever logic is stored in the BUL & DAL layer from the OData WCF DataService.
I was thinking about creating new classes that inherit from the EntityModel classes in the Data.Services namespace, and creating a new DataService object that maps all calls onto the BUL & DAL & API layers. I'm however not sure where/how to intercept the requests for creating and deleting resources.
I hope it's a bit clear what I'm trying to explain, and I hope someone can help me on this.
The devil is in the details, but it sounds like the design you're proposing should work.
The DataService class is where you get to define the access rights applicable to everyone, configuration settings, and custom operations. In this scenario, I think you will be focusing more on the data context instead (the 'T' in DataService).
For the context, there are really two interesting paths: reads and writes. Reads happen through the IQueryable entry points. Writing a LINQ provider is a good chunk of work, but NHibernate already supports this, although it would return what I imagine we're calling DAL entities. You can use query interceptors to do access checks here if you can express those in terms that the database would understand.
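A short sketch of that query-interceptor idea (entity, property and context names are placeholders, not the poster's code; the in-memory context stands in for the NHibernate-backed one). The returned lambda is composed into the IQueryable for every read of that set, so as long as the predicate is translatable by the underlying LINQ provider, the access check runs in the database:

using System;
using System.Data.Services;
using System.Data.Services.Common;
using System.Linq;
using System.Linq.Expressions;

[DataServiceKey("Id")]
public class Order
{
    public int Id { get; set; }
    public string OwnerId { get; set; }
}

// Minimal read-only context standing in for the real, NHibernate-backed one.
public class OrderContext
{
    public IQueryable<Order> Orders
    {
        get { return new[] { new Order { Id = 1, OwnerId = "demo-user" } }.AsQueryable(); }
    }
}

public class OrderDataService : DataService<OrderContext>
{
    public static void InitializeService(DataServiceConfiguration config)
    {
        config.SetEntitySetAccessRule("Orders", EntitySetRights.AllRead);
    }

    // Composed into every query against the Orders set; with a real LINQ
    // provider underneath, this predicate ends up in the generated SQL.
    [QueryInterceptor("Orders")]
    public Expression<Func<Order, bool>> OnQueryOrders()
    {
        string user = CurrentUserId();
        return o => o.OwnerId == user;
    }

    private static string CurrentUserId()
    {
        // Placeholder: resolve the caller from your authentication mechanism.
        return "demo-user";
    }
}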
The update path is, from what I understand, where you are trying to run more business logic (you mentioned validation, extra updates, etc.). To do this, you'll want to focus on the IUpdatable implementation (IDataServiceUpdateProvider if you're using the latest version). Here you can use whichever objects you want - they could be DAL objects or business objects. You can do everything in the DAL and then run validation on SaveChanges(), or do everything on business objects if they validate as they go.
There are two places where you might 'jump' from one kind of object to another. One is in the GetResource() API, where you get an IQueryable, presumably in terms of DAL entities. The other is in ResolveResource(), where the runtime is asking for an object to serialize, just like it would get from an IQueryable, so it's presumably also a DAL entity.
Hope this helps - doing uniform access over non-uniform APIs can be hard, but often well worth it!

WCF Service with DataContracts vs. Default Entity Framework Entity Objects

What are the pros and cons of using a WCF service with DataContracts vs. the default Entity Framework entity objects?
If I generate data contracts using the ADO.NET Self-Tracking Entity Generator for the classes in my data layer, what will be the best way of using them in my WCF service?
Will the data contracts generated by the ADO.NET Self-Tracking Entity Generator be exchanged via the service, or will the WCF service still use the default Entity Framework objects?
The main advantage of STEs (self-tracking entities) is their implementation of a change set. It means you can return an STE from a web service operation, modify the entity (or a whole entity graph) on the client, and call another operation to post the updated STE back to the web service for processing. EF will automatically detect the changes in the STE and process them.
This is not possible with plain Entity Framework entities, because EF can track changes only while an entity is attached to an ObjectContext, and the entity is detached when returned from a web service operation.
The drawback of STEs is that you have to share the assembly which defines them between the service and all clients. STEs are not for interoperable solutions.
At the moment most projects are developed with a third type of entity - POCOs. POCOs are also not able to track changes when detached from the ObjectContext; that is the distinguishing feature of STEs.
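To illustrate the change-set round trip described above (all names here - IOrderService, MyEntities, Order - are placeholders for whatever the STE templates and your model actually generate): the client receives the STE, edits it while detached, and posts it back; on the server, the ApplyChanges extension method generated by the STE context template replays the recorded changes into a fresh ObjectContext.

using System.Linq;
using System.ServiceModel;

[ServiceContract]
public interface IOrderService
{
    [OperationContract]
    Order GetOrder(int id);

    [OperationContract]
    void UpdateOrder(Order order);
}

public class OrderService : IOrderService
{
    public Order GetOrder(int id)
    {
        using (var context = new MyEntities())
        {
            // The STE records its own changes once it is detached and sent to the client.
            return context.Orders.Single(o => o.Id == id);
        }
    }

    public void UpdateOrder(Order order)
    {
        using (var context = new MyEntities())
        {
            // ApplyChanges (generated by the STE context template) reads the change
            // set recorded inside the entity graph, attaches everything to the new
            // context in the right state, and SaveChanges persists it.
            context.Orders.ApplyChanges(order);
            context.SaveChanges();
        }
    }
}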
It depends on what type of work you are doing.
Using DTOs (Data Transfer Objects), which form your data contracts and are separate from the EF model, will give you more control over what gets serialized and what doesn't. This is important for compatibility and versioning with multiple clients.
http://martinfowler.com/eaaCatalog/dataTransferObject.html
Using EF with POCOs is probably next in terms of control and separation, with the default database-generated entities last. However, those two are easier to use and more flexible when used with Silverlight clients.
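A bare-bones sketch of the DTO option (every name below is illustrative, and MyEntities stands in for your EF context): the service projects EF entities into DataContract classes, so only the members you list ever cross the wire and the EF model can change without breaking clients.

using System.Linq;
using System.Runtime.Serialization;
using System.ServiceModel;

[DataContract]
public class OrderDto
{
    [DataMember] public int Id { get; set; }
    [DataMember] public string CustomerName { get; set; }
    [DataMember] public decimal Total { get; set; }
}

[ServiceContract]
public interface IOrderService
{
    [OperationContract]
    OrderDto GetOrder(int id);
}

public class OrderService : IOrderService
{
    public OrderDto GetOrder(int id)
    {
        using (var context = new MyEntities())   // EF context, placeholder name
        {
            // Project the entity into the DTO; the EF entity itself never leaves the service.
            return context.Orders
                          .Where(o => o.Id == id)
                          .Select(o => new OrderDto
                          {
                              Id = o.Id,
                              CustomerName = o.Customer.Name,
                              Total = o.Total
                          })
                          .Single();
        }
    }
}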

WCF: how to send data between service and client

I have a service that gets data from SQL Server.
What is the best way to send the information to the client?
Should I use ADO.NET or Entity Framework?
You can use whichever data technology you want. However, up to Entity Framework 4 in .NET 4 (currently in Release Candidate status), it is recommended not to return an Entity Framework entity or a LINQ to SQL class from a web service. Both technologies unfortunately leak their implementation over the wire - the generated client-side proxies end up with classes corresponding to the base classes used by the data framework.
Instead, use a Data Transfer Object, which is an object that has nothing but properties that correspond one-to-one with the properties of the data you want transferred.
From such a brief description it is impossible to say which one is preferred. My personal favourite for such a scenario is Linq to SQL.
If they are both .NET then I say WCF. If the server does simple manipulations with the data, consider LINQ to SQL. Or NHibernate.