I am looking at OData and it is very powerful; at the same time it's very disconcerting. It is the equivalent of exposing your data source to a remote user. There is no service layer, nothing, nada, and very few injection points, resulting in an almost comical 2-tier architecture.
My concerns are:
It's hard to enforce patterns such as DDD while using OData.
It is also hard to use OData against a set of SOA Data Transfer Objects, because these are not usually queryable; i.e., by the time you get the DTOs, the DB query is already done, but OData is just starting up. There is not a lot of value in querying against them, because the DTOs are already in memory.
I can create a view on the DB itself that is a representation of the OData entity, but this puts a UI concern into the persistence layer. Big no-no.
At best, combining the result sets from various DDD services now happens at the UI layer using kludgy JavaScript - a maintenance nightmare with a poor reusability record.
Another possibility is to write a translator for the OData entity (which is likely a ViewModel-level class) that translates to the DTOs, which translate to Domain objects, which translate to the ConceptualModels; the rest the ORM can handle. But this would require an inordinate amount of resources.
In short, how do you make OData play nice with SOA, OO encapsulation principles, DDD and just good old SOC?
There needs to be clear separation of the OData standard and the OData implementations.
As for the standard:
The standard itself, to my view, lets you have an out-of-the-box accessible data endpoint with full metadata in a portable manner. With support for relations and projection, consumers can compose viewmodels on the server or later on the client. It is important to note that OData supports exposing operations (functions), so on top of automatic CRUD you can have remote methods that seamlessly integrate into the relational pattern (functions can have results that you can further query, still on the server, and functions can also act as the "smart writer" code).
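To make that "functions" point concrete, here is a minimal sketch using WCF Data Services; the Product/Catalog/DiscountedProducts names are made up for illustration, and a real service would more likely sit on top of an EF context:

```csharp
using System;
using System.Collections.Generic;
using System.Data.Services;
using System.Data.Services.Common;
using System.Linq;
using System.ServiceModel.Web;

// Hypothetical model - with the reflection provider, any class exposing
// IQueryable properties can back a read-only OData endpoint.
[DataServiceKey("Id")]
public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
    public bool Discontinued { get; set; }
}

public class Catalog
{
    private static readonly List<Product> Data = new List<Product>();
    public IQueryable<Product> Products { get { return Data.AsQueryable(); } }
}

public class ProductService : DataService<Catalog>
{
    public static void InitializeService(DataServiceConfiguration config)
    {
        config.SetEntitySetAccessRule("Products", EntitySetRights.AllRead);
        // make the operation below callable over the wire
        config.SetServiceOperationAccessRule("DiscountedProducts", ServiceOperationRights.AllRead);
    }

    // An OData service operation: because it returns IQueryable, clients can keep
    // filtering/paging its result with $filter/$top, and that still runs on the server.
    [WebGet]
    public IQueryable<Product> DiscountedProducts()
    {
        return CurrentDataSource.Products.Where(p => !p.Discontinued);
    }
}
```

A client could then request something like /ProductService.svc/DiscountedProducts?$top=10 and the extra query options are applied server-side, on top of the operation's result.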
To wrap it up: OData gives a shape to what actually happens most of the time when someone needs massive REST-compliant data access: it formalizes common, always-repeated scenarios such as how to query for data or submit data back. This might still be affected by what you write at point 4, but in my opinion it's not an OData-related issue. This is simply how AJAX and mashups work: you'll have a client with lots of code dealing with combining data.
Your other issues can be answered by selecting the most appropriate server implementation. There are a couple of implementations already:
EF4/EF5 + WCF Data Services being the most "automatic". In this use case you might just be correct regarding some of your concerns, but with the fine extensibility model of EF you can interact with the automatic operations as you wish. Having an application that is driven by the actual EDMX model is a true DDD scenario.
ASP.NET Web API lets you have a totally code-based back end for what the client perceives as a static, relational endpoint, so this is where you can think in three layers: the DB, a middle tier to bridge between DB data and what is best for the clients, and a client tier as a smart consumer of that model.
JayData provides these in Server Side JavaScript with the added dynamism of JavaScript.
SAP NetWeaver Gateway exposes complex SAP data in a manner that is easy to consume for simple scenarios.
OO concerns:
With V3 of OData we now have "instance methods" (which are definitely server methods too), so what SOA took away by brutally separating things into data and functionality, OData really gives back: functionality and data defined together, encapsulated, but mapped onto the SOA concept of static methods with context data that acts like the "this".
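For illustration only (the entity set and action names are made up, and the exact URL conventions depend on the server implementation), invoking an OData V3 action bound to a single entity looks roughly like this:

```
POST /service.svc/Orders(42)/Approve
```

The order with key 42 plays the role of "this", and the server-side method implements the behavior, so data and functionality travel together.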
2Tier concerns:
IMHO 2-tier architectures became "ancient" not because the client has too much knowledge about the server-side structure, but because they did not scale well and the new thin clients were too dumb to act like a real DB client. In fact, 2-tier was always easier to work with (from a developer's point of view), and now that OData server implementations are indeed a middle tier with logic, you can actually get the best of both worlds:
- code against a virtual 2-tier architecture
- and scale as a normal n-tier application can.
I am working on an application design and investigating passing data between WCF services and an ASP.NET web application. The options I am looking at are either "DataSets" or Entity Framework.
I have a bunch of questions:
If I use Entity Framework to pass data, would it add more overhead to the WCF communication?
Are DataSets considered "lightweight" in terms of communication overhead?
If I use Entity Framework, how can I maintain object models if I use stored procedures to return complex data?
Overall I need to figure out pros and cons of using these technologies.
Your question touches on the principles of service-oriented architecture (SOA). In SOA, a service should expose operations (methods) that apply to business objects through contracts. The business objects should be defined in a standard way which is why WCF uses WSDL and XML Schema to do this.
An SOA tenet is that business objects are shared through a schema and/or a contract but not as a specific implementation. By this principle, the DataSet is a heavyweight, .NET-specific, database-oriented object which should not be used to represent a business object. The Microsoft Patterns & Practices group apparently disregards SOA tenets, since it shows how to use DataSets as Data Transfer Objects. Whether they're just promoting vendor lock-in or trashing the whole concept of SOA is anybody's guess. If there is even the remotest chance your service will ever be consumed by a non-.NET client, please do not use a DataSet.
If you decide on using the Entity Framework then I'd recommend using Code First to define the Entity Framework data model and just expose those "code first" business objects in your service. This SO question & answers provide a good discussion on using Entity Framework with WCF.
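A minimal sketch of that shape, with hypothetical Customer/ShopContext/ICustomerService names:

```csharp
using System.Data.Entity;
using System.ServiceModel;

// Plain "code first" business object - no EF base class, safe to expose on the contract.
public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class ShopContext : DbContext
{
    public DbSet<Customer> Customers { get; set; }
}

[ServiceContract]
public interface ICustomerService
{
    [OperationContract]
    Customer GetCustomer(int id);
}

public class CustomerService : ICustomerService
{
    public Customer GetCustomer(int id)
    {
        using (var db = new ShopContext())
        {
            // turning proxy creation off avoids sending EF dynamic proxies over WCF
            db.Configuration.ProxyCreationEnabled = false;
            return db.Customers.Find(id);
        }
    }
}
```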
Are you really considering passing entire DataSets up and down? This will destroy your object model and be more difficult to maintain, though I suppose you could implement the Repository pattern. Still, even just sending the changes, you would need to do strange things like ensuring you do not transfer the schema and perhaps using the compress option. It is a very poor choice compared to Entity Framework Code First with nice clean POCOs in a separate assembly and without any EF infrastructure polluting the DTOs.
EF should not add overhead as such, but it depends on how it is implemented and what data objects are being passed around. If you pass data into a context and use the correct option when SaveChanges is called, such as SaveChangesOptions.ReplaceOnUpdate, only the changed entities will be updated. The queries executed are efficient as long as you are mindful not to lazy load stuff you don't need. You need to understand LINQ to Entities well and batch your updates as you would any other potentially expensive method call. Run some tests with your database profiler running and try to improve the efficiency of your interactions with EF; monitor your IIS logs for data sizes, transmission times, etc.
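As an example of the "don't lazy load what you don't need" point, here is a rough sketch assuming a hypothetical Code First model (Order, Customer, ShopContext are made-up names):

```csharp
using System;
using System.Collections.Generic;
using System.Data.Entity;
using System.Linq;

// Hypothetical Code First model, just to illustrate the query shape.
public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class Order
{
    public int Id { get; set; }
    public DateTime? ShippedDate { get; set; }
    public virtual Customer Customer { get; set; }
}

public class ShopContext : DbContext
{
    public DbSet<Order> Orders { get; set; }
    public DbSet<Customer> Customers { get; set; }
}

public static class OrderQueries
{
    // One round trip with the related Customer eagerly loaded,
    // instead of one extra lazy-load query per order.
    public static List<Order> GetUnshippedOrders()
    {
        using (var db = new ShopContext())
        {
            return db.Orders
                     .Include(o => o.Customer)
                     .Where(o => o.ShippedDate == null)
                     .ToList();
        }
    }
}
```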
DataSets are not considered lightweight because they need to encapsulate a schema, and somebody might make the mistake of sending up a whole heap of data in multiple tables, including dependent tables. These might need to be pulled in anyway, either on the client or the server - very messy! EF does support stored procedures in a sensible manner, as they can be part of your model and get called when specific entities need to be saved. An ORM will complement your OO design and lead to cleaner code.
Also, if you are doing something simple and only require CRUD without much in the way of business logic, consider WCF Data Services.
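If that is the case, the whole service can be as small as this sketch (ShopEntities stands in for your hypothetical EF context; every entity set it exposes becomes a full CRUD OData endpoint with no further code):

```csharp
using System.Data.Services;
using System.Data.Services.Common;

// ShopEntities is the (hypothetical) EF-generated context for your model.
public class CrudService : DataService<ShopEntities>
{
    public static void InitializeService(DataServiceConfiguration config)
    {
        // expose every entity set for read and write
        config.SetEntitySetAccessRule("*", EntitySetRights.All);
        config.DataServiceBehavior.MaxProtocolVersion = DataServiceProtocolVersion.V3;
    }
}
```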
I was looking at how you can create a WCF Data Service around an Entity Framework context and then consume it on the client much like an EF context as well.
Creating an OData API for StackOverflow including XML and JSON in 30 minutes
I really just started looking at this, but I was wondering: where would the business logic go? As a service, I would expect that you couldn't just freely add/delete etc. without some validation.
If I wrote an MVC app to consume this service, how would I best implement business logic? Not simple property-level validation that you could do with attributes, but more complex stuff that needs to check the data store first, etc.
It sounds like you need a custom data service provider (msdn link). They're quite powerful and give you full control over all of your reading/writing logic.
For example I wrote one that enforces our licensing logic in the update provider.
You can put some in the Data Service class, but you are limited as to what you can and can't do there. And then of course you can put some in the client above the service, but that's not ideal either.
I've only spent a few weeks with WCF Data Services but you highlight (one of) the big problems with it - lack of flexibility. It's fantastic for rapid development and banging out LOB applications, but anything with a deliberate design is very difficult to implement. I needed to include objects in my entity model simply to allow them to be exposed through the service, and I had huge headaches just trying to extend those classes with a simple property.
I'd only recommend using WCF Data Services for trivially simple applications that need extremely fast development - a one- or two-day development cycle, for example. Anything else is worth doing thoroughly with regular WCF services, writing your own data layer, and so on.
Depending on your specific needs, it sounds like Web API might be a good fit. Web API may never get the full range of OData support that WCF Data Services has, but it does make certain things easier (like adding business logic). I'm quite confident that Web API's initial support for OData will cover a significant number of use cases, and that support will grow over time.
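As a rough sketch of what "easier to add business logic" means (this assumes the early [Queryable] attribute from Web API's first OData support; later versions call it [EnableQuery], and all type names here are made up):

```csharp
using System.Collections.Generic;
using System.Linq;
using System.Web.Http;

// Hypothetical entity; in a real app this would come from your data layer.
public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
    public bool IsArchived { get; set; }
}

public class ProductsController : ApiController
{
    private static readonly List<Product> Store = new List<Product>();

    // [Queryable] lets clients append $filter/$orderby/$top to the request,
    // while the method body is plain code where business rules can live.
    [Queryable]
    public IQueryable<Product> Get()
    {
        // e.g. enforce a business rule before handing out the queryable
        return Store.Where(p => !p.IsArchived).AsQueryable();
    }
}
```

Any rule that needs to hit the data store can run in the controller (or a service it calls) before the IQueryable is returned, which is exactly the gap the question points at.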
While a custom data provider will almost certainly do about anything you want, and may well be a great solution for you if you have a complex architecture, I wasn't really thrilled when I attempted to save back through the client and found out I had to implement the IUpdatable interface as part of my context (I was attempting to build a repository pattern out of my context and DataService).
I'm sure it's very useful for many people, but I really only needed the functionality the Entity provider already contained and didn't have the time in my project schedule to figure out the IUpdatable piece of the custom provider, so my team, specifically Geoff, stuck with the Entity provider and used Change and Query Interceptors to route the DataService requests through our business logic classes on the server. They provide a central point of control. We used these to provide security checks, run calculations and other operations on insert/update, etc. It turned out great. You can also use service methods as another way to provide specific business logic functions to your clients.
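To illustrate what that looks like (this is a sketch, not our actual code - the Order model, the list-backed context and the GetCurrentCustomerId helper are all made up; in our case the context was the EF-generated one, which also supports writes out of the box, whereas a plain reflection context like this one is read-only unless it implements IUpdatable):

```csharp
using System;
using System.Collections.Generic;
using System.Data.Services;
using System.Data.Services.Common;
using System.Linq;
using System.Linq.Expressions;

[DataServiceKey("Id")]
public class Order
{
    public int Id { get; set; }
    public int CustomerId { get; set; }
    public decimal Total { get; set; }
}

public class OrderData
{
    private static readonly List<Order> Store = new List<Order>();
    public IQueryable<Order> Orders { get { return Store.AsQueryable(); } }
}

public class OrderService : DataService<OrderData>
{
    public static void InitializeService(DataServiceConfiguration config)
    {
        config.SetEntitySetAccessRule("Orders", EntitySetRights.All);
    }

    // Runs for every read of the Orders set - a central place for security trimming.
    [QueryInterceptor("Orders")]
    public Expression<Func<Order, bool>> OnQueryOrders()
    {
        int customerId = GetCurrentCustomerId();
        return o => o.CustomerId == customerId;
    }

    // Runs for every insert/update/delete of the Orders set - validation, calculations, etc.
    [ChangeInterceptor("Orders")]
    public void OnChangeOrders(Order order, UpdateOperations operations)
    {
        if (operations == UpdateOperations.Add && order.Total < 0)
            throw new DataServiceException(400, "Order total cannot be negative.");
    }

    private int GetCurrentCustomerId()
    {
        // hypothetical helper - e.g. read the authenticated user from the request context
        return 0;
    }
}
```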
I want to know if using Self-Tracking Entities (in Entity Framework) is recommended with WCF services? If yes, can you point me to a tutorial that shows how to do that?
Actually, I am going to develop a WPF application using Prism with MEF and MVVM. I have decided to use Entity Framework. I want suggestions and advice regarding this approach.
Any help will be appreciated.
I want to know if using Self-Tracking Entities (in Entity Framework) is recommended with WCF services?
It depends who you ask. If you ask MS, they will tell you yes, because they simply don't have anything better to offer. STEs were the response to this very old MS Connect suggestion. The problem is that EF itself has very poor support for merging changes between two entity graphs (you must do it completely yourself), and developers working on the MS platform (sometimes including me) share some common behaviors:
They are too lazy to develop their own solution to a problem and expect some magic directly in the APIs provided by MS.
Most of the time they are not trained / skilled / competent in the technology they have to use, because they have to move to a new one too often.
The only APIs they know are part of the .NET Framework. They don't look for other options, nor do they compare features.
The first two points are the result of the MS strategy where RAD became a synonym for the designer (or newly also T4 templates).
I share @Richard's opinion about STEs. I would add one additional drawback of STEs - they move large data sets between participants. If you decide to get an entity graph from the server, change a single entity in the graph and push the data back, they will transfer the whole graph again. Transferring only changed entities means fighting with the STEs' core logic. I'm also afraid that they track changes completely at the per-entity level instead of the per-property level. In the case of modifications to entities with large binary or string data, this can result in transferring too much unneeded data between the service and the database and between the service and the client.
Anyway, for a simple application with low data traffic and small entities they can do a good job and allow you to build your application quickly, but without strict separation of concerns. You will get entities from the service and bind them directly to the WPF UI, and they will be able to track changes for you. Later you will push the entities back to the service and it will be able to persist the changes. Your client and service will be tightly coupled, but in some scenarios that can be good enough.
I would avoid self tracking entities in general - I blogged about it here.
Create your own DTOs and use them to manage the transfer of data - then build your POCO objects in the service and use them with Entity Framework for persistence.
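A rough sketch of the idea, assuming hypothetical CustomerDto/Customer/ShopContext types:

```csharp
using System.Data.Entity;

// DTO that travels over the wire - no EF types, no change-tracking baggage.
public class CustomerDto
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// Persistent POCO + context live in the service.
public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class ShopContext : DbContext
{
    public DbSet<Customer> Customers { get; set; }
}

// Inside the service: map between the DTO and the persistent POCO by hand.
public class CustomerAppService
{
    public void UpdateCustomer(CustomerDto dto)
    {
        using (var db = new ShopContext())
        {
            var customer = db.Customers.Find(dto.Id);
            if (customer == null) return;   // or raise a fault
            customer.Name = dto.Name;        // apply only what the client may change
            db.SaveChanges();
        }
    }
}
```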
If you want self tracking then there is a slightly cleaner approach here
My team is evaluating a bigger business portal (invoicing, bookkeeping, salaries, ...).
We are all used to working with DDD and O/R mappers, with NHibernate as our first choice.
We have chosen to work with CompositeWPF to keep modularity between all the modules and sub-systems in the business portal.
Now we have evaluated RIA Services and are kind of disappointed by how data-oriented it is. Data-oriented can be good in a service-oriented scenario, but we feel we can use an object-oriented approach too, and that the OO approach gives us an application with less complexity than the DO approach.
For example, it doesn't allow value objects or many-to-many relations, everything needs to have keys, and so on.
We haven't looked at WCF Data Services yet, so our question is: are WCF Data Services our answer? Do they integrate well with Silverlight 4? Can we work with them in an OO manner?
RIA / WCF is not about replacing O/R mappers etc. It is about exposing data in an open format to another application. Not high end, but basically for integration. It is IMHO pretty stupid to put that within an application, but it is a great external interface, especially as it gets tooling support.
Good examples:
Bank account access. If only I could do home banking using OData ;) and get my account statements into Excel.
Trading ;) Yeah, ok - I have a trading server (that then connects to various brokers) and a web front end. I now will expose certain data through OData too, so I can easily get things out into Excel etc., or even use a Silverlight application for some stuff... but I will NOT use OData within one application to replace my object infrastructure - way too much overhead.
eBay could provide an OData interface for larger customers. Nice to get an overview of your auctions AND do some basic maintenance on your account. Nothing high-performance, but again, TOOLING support. Excel, Reporting Services - all will soon support OData.
If you look at it from that integration point of view it makes a LOT more sense. It is not a full environment - that "never" works. It is a great standardization, though, to open up an application with semantics (better than web services - standardized query and filter logic) AND tooling support.
Somehow I don't really run into many problems with a lot of the items you mention, though:
Anything I work with has a key by definition.
I never do many-to-many relations. I always have an interim object WITH A KEY... so that I can add properties to it (even if that is only a timestamp) - see the sketch below.
The services ARE data oriented, and seriously - I love them. I am a big OO fan, but the tooling support makes them a PERFECT external interface for applications.
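Here is the kind of interim object I mean (all names are made up for illustration):

```csharp
using System;

// Explicit link entity instead of a pure many-to-many - it has its own key,
// so it round-trips happily, and there is room for extra properties.
public class Enrollment
{
    public int Id { get; set; }              // the key the tooling insists on
    public int StudentId { get; set; }
    public int CourseId { get; set; }
    public DateTime EnrolledOn { get; set; } // "even if that is only a timestamp"
}
```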
In my SOA architecture, I have several WCF services.
All of my services need to access the database.
Should I create a specialized WCF service in charge of all the database access?
Or is it OK if each of my services has its own database access?
In one version, I have just one entity layer instantiated in one service, and all the other services depend on this service.
In the other, the entity layer is duplicated in each of my services.
The main drawback of the first version is the coupling it induces.
The drawback of the other version is the layer duplication, and maybe it is SOA bad practice?
So, what do you think, good people of Stack Overflow?
Just my personal opinion: if you create a service for all database access, then multiple services depend on ONE service, which sort of defeats the point of SOA (i.e., services are autonomous), as you have articulated. When you talk of layer duplication, if each service has its own data to deal with, is it really duplication? I realize that you probably have the same means of interacting with your relational databases, or back from the OOA days you had a common class library that encapsulated data access for you. This is one of those things I struggle with myself, but I see no problem in each service having its own data layer. In fact, in Michele Bustamante's book (Chapter 1, page 8) she actually depicts this and adds "Services encapsulate business components and data access". If you notice, each service has a separate DALC layer. This is a good question.
It sounds as if you have several services but a single database.
If this is correct, you do not really have a pure SOA architecture since the services are not independent. (There is nothing wrong with not having a pure SOA architecture; it can often be the correct choice.)
Adding an extra WCF layer would just complicate and slow down your solution.
I would recommend that you create a single data access DLL which contains all data access and is referenced by each WCF service. That way you do not have any duplication of code. Since you have a single database, any change in the database/data layer would require a redeployment of all services in any case.
Why not just use a dependency injection framework? If the services are currently using the same database, then just allow them to share the same code; if they were in the same project, they would all use the same DLL.
That way, later, if you need to put in some code that you don't want the others to share, you can make changes and just create a new DAO layer.
If there is a certain singleton that all will use, then you can just inject that when you inject the DAO layer.
But this will require that they all use the same DI framework.
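A framework-agnostic sketch of what I mean (all type names are hypothetical):

```csharp
// Shared data-access contract, defined once in an assembly every service references.
public interface ICustomerRepository
{
    Customer GetById(int id);
}

public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// A WCF service implementation taking the repository through its constructor;
// the DI container of your choice supplies the concrete implementation.
public class BillingService
{
    private readonly ICustomerRepository _customers;

    public BillingService(ICustomerRepository customers)
    {
        _customers = customers;
    }

    public string GetCustomerName(int customerId)
    {
        return _customers.GetById(customerId).Name;
    }
}
```

Note that WCF's default instancing wants a parameterless constructor, so constructor injection like this relies on whatever WCF integration (instance provider / service host factory) the chosen DI framework provides.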
The real win that SOA brings is that it reduces the number of linkages between applications.
In the past I've worked with organizations who have done it many different ways. Some data layers are integrated, and some are abstracted.
The way I've seen it done most successfully is when you create generic data-layer services for each app/database and build the higher-level services on top of that newly created data layer.