What's the difference between a "Data Service Layer" and a "Data Access Layer"?

I remember reading that one abstracts the low-level calls into a data-agnostic framework (e.g. ExecuteCommand methods, etc.), and the other usually contains business-specific methods (e.g. UpdateCustomer).
Is this correct? Which is which?

To me this is a personal design decision about how you want to structure your project. At times the data access and data service layers are one and the same; for .NET and LINQ that is the case.
To me the data service layer is what actually makes the call to the database. The data access layer receives objects and creates or modifies them so the data service layer can make the call to the database.
In my designs the Business Logic Layer manipulates the objects based on the business rules, then passes them to the data access layer, which formats them to go into the database (or builds the objects coming out of it), and the data service layer handles the actual database call.
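A minimal sketch of that ordering (all class names here are hypothetical, just to make the hand-offs concrete):

    using System.Collections.Generic;

    public class Customer { public int Id; public string Name; }

    // DSL: the only thing that actually talks to the database.
    public class DataService
    {
        public void Execute(string sql, IDictionary<string, object> parameters)
        {
            /* open connection, bind parameters, run the command */
        }
    }

    // DAL: shapes objects for the DSL.
    public class CustomerDataAccess
    {
        private readonly DataService _service = new DataService();

        public void Save(Customer c)
        {
            _service.Execute(
                "UPDATE Customer SET Name = @Name WHERE Id = @Id",
                new Dictionary<string, object> { { "Id", c.Id }, { "Name", c.Name } });
        }
    }

    // BLL: applies business rules, then hands off.
    public class CustomerBusinessLogic
    {
        private readonly CustomerDataAccess _dal = new CustomerDataAccess();

        public void Rename(Customer c, string newName)
        {
            if (!string.IsNullOrEmpty(newName)) c.Name = newName; // business rule
            _dal.Save(c);
        }
    }

Note the direction of the calls: the BLL never talks to the database, and only the DSL actually executes SQL.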

I think in general the two terms are interchangeable, but could have more specific meanings depending on the context of your development environment.
A Data Access Layer sits on the border between data and the application. The "data" is simply the diverse set of data sources used by the application. This can mean that substantial coding must be done in each application to pull data together from multiple sources. The code which creates the data views required will be redundant across some applications.
As the number of data sources grows and becomes more complex, it becomes necessary to isolate various tasks of data access to address details of data access, transformation, and integration. With well-designed data services, Business Services will be able to interact with data at a higher level of abstraction. The data logic that handles data access, integration, semantic resolution, transformation, and restructuring to address the data views and structures needed by applications is best encapsulated in the Data Services Layer.
It is possible to break the Data Services Layer down even further into its constituent parts (i.e. data access, transformation, and integration). In such a case you might have a "Data Access Layer" that concerns itself with only retrieving data, and a "Data Service Layer" that retrieves its data through the Data Access Layer and combines and transforms the retrieved data into the various objects required by the Business Service Layer.
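For example, under that split the Data Access Layer only fetches, and the Data Service Layer does the combining. The names below are hypothetical, sketched purely to illustrate the division of labor:

    using System.Collections.Generic;

    public class CustomerRecord { public string Name; }
    public class OrderRecord { }
    public class CustomerView { public string Name; public int OrderCount; }

    // Data Access Layer: retrieval only, one reader per source.
    public class CustomerReader
    {
        public CustomerRecord Read(int id) { /* query the customer source */ return new CustomerRecord(); }
    }

    public class OrderReader
    {
        public List<OrderRecord> Read(int customerId) { /* query the order source */ return new List<OrderRecord>(); }
    }

    // Data Service Layer: integrates and reshapes for the Business Service Layer.
    public class CustomerDataService
    {
        private readonly CustomerReader _customers = new CustomerReader();
        private readonly OrderReader _orders = new OrderReader();

        public CustomerView GetCustomerView(int id)
        {
            CustomerRecord c = _customers.Read(id);
            List<OrderRecord> o = _orders.Read(id);
            return new CustomerView { Name = c.Name, OrderCount = o.Count }; // integration + transformation
        }
    }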

Here's another perspective, deep from the trenches! A Data Access Layer is a software abstraction layer which hides the complexity / implementation of actually getting the data. The application asks the Data Access Layer (see the DAO design pattern) to "get me this" or "update that", etc. (indirection). The Data Access Layer is responsible for performing implementation-specific operations, such as reading or updating various data sources (Oracle, MySQL, Cassandra, RabbitMQ, Redis, a simple file system, a cache), or even delegating to another Data Service Layer.
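In code, that indirection is simply an interface the application codes against. The store-specific classes below are made-up illustrations of the DAO pattern:

    // The application only ever sees this contract (DAO pattern).
    public interface IUserDao
    {
        User GetById(string id);
        void Update(User user);
    }

    public class User { public string Id; public string Name; }

    // Implementations hide the store-specific details.
    public class OracleUserDao : IUserDao
    {
        public User GetById(string id) { /* SELECT ... via ADO.NET */ return new User(); }
        public void Update(User user) { /* UPDATE ... */ }
    }

    public class RedisUserDao : IUserDao
    {
        public User GetById(string id) { /* GET user:{id} */ return new User(); }
        public void Update(User user) { /* SET user:{id} */ }
    }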
If all this work happens inside a single machine and in the same application, the term Data Service Layer is equivalent to a Service Facade (indirection). It is responsible for servicing and delegating application calls to the correct Data Access Layer.
Somewhat confusingly, in a distributed computing world, or Service Oriented Architecture, a Data Service Layer can actually be a web service that acts as a standalone application. In this context, the Data Service Layer delegates received upstream application data requests to the correct Data Access Layer. Here, web services indirect data access away from applications: the application only needs to know what service to call to get the data, so as a rule of thumb, in distributed computing environments this approach reduces application complexity (though there will always be exceptional cases).
So just to be clear, the application uses a DSL and a DAL. The DSL in the app should talk to a DAL in the same application. DALs have the choice of using a single datasource, or delegating to another web service. A web service's DSL can delegate the work to the DAL for that request. Indeed, it's possible for a single web service request to use a number of data sources in order to build its response.
With all that said, from a pragmatic perspective, it's only when systems become increasingly complex that more attention should be paid to architectural patterns. It's good practice to do things right, but there's no point in unnecessarily gold-plating your work. Remember YAGNI? Well, it stops resonating the day you actually need it!
To conclude: a famous aphorism of David Wheeler goes, "All problems in computer science can be solved by another level of indirection" [1]; this is often deliberately misquoted with "abstraction layer" substituted for "level of indirection". Kevlin Henney's corollary to this is, "...except for the problem of too many layers of indirection."

The Data Service Layer concept as described in the WebSphere Commerce documentation is straightforward:
The data service layer (DSL) provides an abstraction layer for data access that is independent of the physical schema.
The purpose of the data service layer is to provide a consistent interface (called the data service facade) for accessing data, independent of the object-relational mapping framework.
On the internet the DSL concept is currently associated mainly with SOAs (Service-Oriented Architectures), but not exclusively; it is also mentioned, for example, in discussions of N-tier applications.
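A facade in that spirit might look like the sketch below. The interface and types are hypothetical; the point is only that the ORM never leaks through it:

    // The "data service facade": callers never see the physical schema or the ORM.
    public interface ICatalogDataService
    {
        Product FindProduct(string sku);
    }

    public class Product { public string Sku; public string Title; }

    // One possible implementation; swapping the ORM does not change the facade.
    public class EfCatalogDataService : ICatalogDataService
    {
        public Product FindProduct(string sku)
        {
            /* LINQ-to-Entities query would go here */
            return new Product { Sku = sku, Title = "..." };
        }
    }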

Related

Should I use Datasets or Entity Framework to transfer data via WCF services?

I am working on an application design and investigating how to pass data between WCF services and an ASP.NET web application. The options I am looking at are to use either "Datasets" or Entity Framework.
I have a bunch of questions:
If I use Entity Framework to pass data, would it add more overhead to the WCF communication?
Are Datasets considered "lightweight" in terms of communication overhead?
If I use Entity Framework, how can I maintain object models if I use stored procedures to return complex data?
Overall I need to figure out the pros and cons of using these technologies.
Your question touches on the principles of service-oriented architecture (SOA). In SOA, a service should expose operations (methods) that apply to business objects through contracts. The business objects should be defined in a standard way, which is why WCF uses WSDL and XML Schema to do this.
An SOA tenet is that business objects are shared through a schema and/or a contract, but not as a specific implementation. By this principle, a Dataset is a heavyweight, .NET-specific, database-oriented object which should not be used to represent a business object. The Microsoft Patterns & Practices group apparently disregards SOA tenets, since it shows how to use Datasets as Data Transfer Objects. Whether they're just promoting vendor lock-in or just trashing the whole concept of SOA is anybody's guess. If there is even the remotest chance your service will ever be consumed by a non-.NET client, please do not use a Dataset.
If you decide on using the Entity Framework then I'd recommend using Code First to define the Entity Framework data model and just exposing those "code first" business objects in your service. This SO question and its answers provide a good discussion on using Entity Framework with WCF.
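As a hedged illustration of that recommendation: a plain Code First POCO can double as the wire format by marking it up explicitly. The entity names are invented; the attributes are the standard WCF and EF ones:

    using System.Collections.Generic;
    using System.Data.Entity;               // EF Code First (EF 4.1+)
    using System.Runtime.Serialization;     // WCF data contracts

    [DataContract]
    public class Customer
    {
        [DataMember] public int Id { get; set; }
        [DataMember] public string Name { get; set; }

        // Not a [DataMember]: navigation properties can stay server-side.
        public virtual ICollection<Order> Orders { get; set; }
    }

    public class Order
    {
        public int Id { get; set; }
    }

    public class ShopContext : DbContext
    {
        public DbSet<Customer> Customers { get; set; }
        public DbSet<Order> Orders { get; set; }
    }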
Are you really considering passing entire datasets up and down? This will destroy your object model and be more difficult to maintain, though I suppose you could implement the Repository pattern. Still, even just to send the changes you would need to do strange things like ensuring you do not transfer the schema, and perhaps using the compress option. But it is a very poor choice when compared to Entity Framework Code First with nice clean POCOs in a separate assembly, without any of the EF infrastructure polluting the DTOs.
EF should not add overhead as such, but it depends on how it is implemented and what data objects are being passed around. If you pass data into a context and use the correct option when SaveChanges is called, such as SaveChangesOptions.ReplaceOnUpdate, only the changed entities will be updated. The queries executed are efficient so long as you are mindful not to lazy-load stuff you don't need. You need to understand LINQ to Entities well and batch your updates as you would any other potentially expensive method call. Run some tests with your database profiler running and try to improve the efficiency of your interactions with EF; monitor your IIS logs for data sizes, transmission times, etc.
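For instance, being explicit about what you load keeps the generated queries lean. This sketch reuses the hypothetical ShopContext from above; Include and the LINQ operators are standard EF calls:

    using System.Collections.Generic;
    using System.Linq;

    public static class CustomerQueries
    {
        // Eager-load only what the service call needs; avoids N+1 lazy loads.
        public static List<Customer> CustomersWithOrders(ShopContext context)
        {
            return context.Customers
                          .Include("Orders")          // one JOIN up front
                          .Where(c => c.Name != null) // filter in the database, not in memory
                          .ToList();
        }
    }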
Datasets are not considered lightweight, due to the need to encapsulate a schema, and because somebody might make a mistake and send up a whole heap of data in multiple tables, including tables that are dependencies. These might need to be pulled in anyway, either on the client or the server - very messy! EF does support stored procedures in a sensible manner, as they can be part of your model and get called when specific entities need to be saved. An ORM will complement your OO design and lead to cleaner code.
Also if you are doing something simple and only require CRUD without much in the way of business logic consider WCF Data Services.

Does OData violate separation of concern?

I am looking at OData and it is very powerful; at the same time it's very disconcerting. It is the equivalent of exposing your datasource to a remote user. There is no service layer, nothing, nada, and very few injection points, resulting in an almost comical 2-tier architecture.
My concerns are:
It's hard to enforce patterns such as DDD while using OData.
It is also hard to use OData against a set of SOA Data Transfer Objects, because these are not usually queryable; i.e. by the time you get the DTOs, the DB query is done, but OData is just starting up. There is not a lot of value in querying against it, because the DTOs are already in memory.
I could create a view on the DB itself that is a representation of the OData entity, but that is putting a UI concern in persistence. Big no-no.
At best, combining result sets from various DDD services now happens at the UI layer using kludgy JavaScript - a maintenance nightmare with a poor reusability record.
Another possibility is to write a translator for the OData entity, which is likely a ViewModel-level class, which then translates to the DTOs, then to the domain objects, then to the conceptual models; the rest the ORM can handle. But this would require an inordinate amount of resources.
In short, how do you make OData play nice with SOA, OO encapsulation principles, DDD and just good old SOC?
There needs to be clear separation of the OData standard and the OData implementations.
As for the standard:
To my view, the standard itself lets you have an out-of-the-box accessible data endpoint with full metadata, in a portable manner. With support for relations and projection, consumers can compose viewmodels on the server or later on the client. It is important to note that OData supports exposing operations (functions), so on top of automatic CRUD you can have remote methods that seamlessly integrate into the relational pattern (functions can have results that you can query further, still on the server, and functions can also act as the "smart writer" code).
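As a concrete example of that composition, everything is expressed in the URL. The service root and entity names below are invented, but the $-options are part of the OData standard:

    GET /odata/Customers?$filter=Country eq 'Germany'&$select=Name,City&$top=10
    GET /odata/Customers(1)?$expand=Orders

The first request filters, projects and pages on the server; the second pulls a customer together with its related orders in one round trip.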
To wrap it up: OData gives a shape to what actually happens most of the time when someone needs massive REST-compliant data access: it formalizes common, endlessly repeated scenarios like how to query for data or submit data back. This might still be affected by what you write in point 4, but in my opinion that's not an OData-specific issue. It is simply how AJAX and mashups work: you'll have a client with lots of code dealing with combining data.
Other issues of yours can be answered by selecting the most appropriate server implementation. There are a couple of implementations already:
EF4/EF5 + WCF Data Services is the most "automatic". In this use case you might just be correct regarding some of your concerns, but with the fine extensibility model of EF you can interact with the automatic operations as you wish. Having an application that is driven by the actual EDMX model is a true DDD scenario.
ASP.NET Web API lets you have a totally code-based back-end for what the client perceives as a static, relational endpoint, so this is where you can think in 3 layers: the DB, a middle tier to bridge between the DB data and what is best for the clients, and a client tier as a smart consumer of that model (see the sketch after this list).
JayData provides these in Server Side JavaScript with the added dynamism of JavaScript.
SAP NetWeaver Gateway exposes complex SAP data in a manner that is easy to consume for simple scenarios.
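As promised above, a rough sketch of the Web API flavor of that middle tier. Note this is era-specific: the [Queryable] attribute is how OData query options were switched on in early Web API releases (it was later renamed), so treat the details as illustrative rather than definitive:

    using System.Collections.Generic;
    using System.Linq;
    using System.Web.Http;

    public class Customer
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    public class CustomersController : ApiController
    {
        // Stand-in for the real DB-backed source.
        private static readonly List<Customer> Store = new List<Customer>
        {
            new Customer { Id = 1, Name = "Ada" },
            new Customer { Id = 2, Name = "Grace" }
        };

        // The middle tier decides what the "relational endpoint" exposes;
        // [Queryable] lets clients append $filter/$orderby/$top to it.
        [Queryable]
        public IQueryable<Customer> Get()
        {
            return Store.AsQueryable();
        }
    }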
OO concerns:
With V3 of OData we now have "instance methods" (which are definitely server methods, too), so what SOA took away by brutally separating things into data and functionality, OData gives back: functionality and data defined together and encapsulated, but mapped to the SOA concept of static methods with context data acting as the "this".
2-Tier concerns:
IMHO 2-tier architectures became "ancient" not because the client had too much knowledge about the server-side structure, but because they did not scale well and the new thin clients were too dumb to act like a real DB client. In fact 2-tier was always easier to work with (from a developer's point of view), and now that OData server implementations are indeed middle tiers with logic, you can actually get the best of both worlds:
- code against a virtual 2-tier architecture
- and scale as a normal n-tier application can.

approaches to WCF service version control

We are implementing numerous services in our company and running into versioning issues with data contracts. One of the problems we have is that our data contracts are also used as the model of the actual application behind the service. I was wondering what approach others have taken in this kind of situation, or to service versioning in general. I am aware of the Microsoft best practices guide, but wanted to see if anybody has any other ideas on how to version.
The first rule of Services: Business Object != Message Object. Basically, never expose your business objects as data contracts. Or as I like to say, you can't fax a cat. You can send a facsimile of a cat, but you can't send a cat over the wire. Here's a great picture to remind you: http://www.humorhound.com/2009/04/demotivational-poster-youre-doing-it-wrong/
In more modern terms, it is really the MVVM pattern. The view of the model that the domain layer uses is not built for a client, so you have to create a separate model and view for the other layers. Yes, it seems like a lot more work, but in the end it is a much easier and better way to build service-oriented applications. Versioning is just one of the ways it makes life easier. The other important thing is that you tend to build models geared around how they are going to be used, and you wind up with more explicit code (less crazy branching).
The way that we have implemented this is to build a facade layer on top of the business layer.
The facade layer talks to the rest of the world using the objects defined in the data contracts.
The facade layer maps the objects to internal objects before sending the data into the business layer.
This isolates the internal functionality of your system from the objects used in the data contracts.
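A minimal sketch of that facade; every name is hypothetical, and the mapping is the whole point:

    using System.Runtime.Serialization;

    // Wire-level contract: versioned independently of the domain model.
    [DataContract]
    public class CustomerDto
    {
        [DataMember] public int Id { get; set; }
        [DataMember] public string DisplayName { get; set; }
    }

    // Internal business object: free to change without breaking clients.
    public class Customer
    {
        public int Id;
        public string FirstName;
        public string LastName;
    }

    public class CustomerFacade
    {
        // Talks to the world in DTOs, to the business layer in domain objects.
        public CustomerDto Get(int id)
        {
            Customer c = LoadFromBusinessLayer(id);
            return new CustomerDto { Id = c.Id, DisplayName = c.FirstName + " " + c.LastName };
        }

        private Customer LoadFromBusinessLayer(int id)
        {
            /* stand-in for the real business layer call */
            return new Customer { Id = id, FirstName = "Ada", LastName = "Lovelace" };
        }
    }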

3 Tier Architecture?

Using VB.NET 2008
I want to know: what is 3-tier architecture for a Windows application?
Can anyone give an example of how to write code for inserting, deleting, and updating in a database using 3-tier architecture?
Note: I am not asking for real code, just an example.
From Multitier architecture
Three-tier [2] is a client-server architecture in which the user interface, functional process logic ("business rules"), computer data storage and data access are developed and maintained as independent modules, most often on separate platforms.
These days, a normal 3 tier application consists of a user interface written in Javascript, CSS and HTML which runs in the browser, a business rules layer which runs in a web server, and could indeed be built in VB.NET, and a storage layer which runs on a database server written in SQL and stored procedures.
Now it would be possible to do a user interface layer in VB.NET as a Windows application which then calls the business rules layer on the web server using a web services interface. This would give you more flexibility than the browser, and would not require learning as many APIs; however, it is not common. It can really only be done in an enterprise situation.
This article has a simple VB.NET application that is a Windows GUI app, which calls Google's web services API to do searches and to check spelling. That is a good example of a user interface layer. Then check this article for an example of a web service developed in VB.NET. This corresponds to the business rules layer, and in a real 3-tier application it would be based around a database such as SQL Server. If you were to use Access then it would not be a real 3-tier application: the database needs to run on its own server and be accessed across the network in order to be considered a tier.
The advantage of a 3-tier application is that you can scale each layer separately, and because each layer is simpler, scaling is also simpler. The DBAs can scale up to a database cluster, the business rules layer can scale up with a load-balancer and multiple servers, and the user-interface just gets replicated across as many clients as you need.
I don't know if it is the right way to use it, but I often use 3-tier in the following way:
One big solution, with the name of the project
One DLL project which has the connection to the DB, using LINQ or whatever, validating only the required fields of the DB
Another DLL project, which has a reference to the project that connects to the DB, and validates all data using the business rules. Sometimes you may want a repository class which has methods that can be used from the GUI layer
Finally, the GUI layer, which can be HTML or WinForms, which references the business layer and calls all the appropriate methods, passing the data transparently and waiting for validation by the business rules.
You can communicate with each layer using bool methods that return true if everything is good, plus personalized exceptions for each of the possible errors, and catch them in the upper layer.
I'll give you the gist of it. Real crash course.
You have three tiers:
DAL - Data Access Layer
BRL - Business Rule Layer
Presentation - The Forms, and such.
In the DAL, you configure how your application connects to databases, how it receives datasets, etc. Everything that has to do with data access.
In the BRL, you lay down how your program is going to handle the data it receives from the DAL. Methods and other things go in here.
And in the Presentation area, you simply make things purty and instantiate things from the BRL. The presentation area never has to touch the DAL, and that's the beauty of the 3-tier layout. You can work on different areas and not step on other people's toes.
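To make the crash course concrete, here is a toy sketch (in C# rather than VB.NET, but the shape is identical; every name is made up):

    using System.Collections.Generic;

    // DAL: everything that touches the database.
    public class ProductDal
    {
        public List<string> GetProductNames()
        {
            /* open connection, run SELECT, map rows */
            return new List<string> { "Widget", "Gadget" };
        }
    }

    // BRL: business rules applied to what the DAL returns.
    public class ProductBrl
    {
        private readonly ProductDal _dal = new ProductDal();

        public List<string> GetSellableProducts()
        {
            var names = _dal.GetProductNames();
            names.RemoveAll(n => n.Length == 0);   // a (trivial) business rule
            return names;
        }
    }

    // Presentation: instantiates the BRL, never touches the DAL.
    public class ProductForm
    {
        public void Load()
        {
            foreach (var name in new ProductBrl().GetSellableProducts())
                System.Console.WriteLine(name);
        }
    }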
I find the best way to understand it is to look at an example. If you go here:
http://www.codeproject.com/KB/vb/N-Tier_Application_VB.aspx
You can download an example and read the walk through for creating a very basic 3 Tier App in VB.Net. It's a little old in that it's a Visual Studio 2003 project but it should be easy enough to follow the upgrade wizard and get it up and running to check it out.
I'd like to give a brief overview of this style of programming, and maybe I'll explain it in more detail next time.
First of all, the 3-tier concept involves dividing the program or application you are designing into 3 layers. The first layer is for manipulating the database via the operations known as CRUD, which stands for Create, Read, Update, Delete, using any kind of database: for example Oracle, SQL Server, MySQL, etc. That means you can connect your application to any type of database without hard-wiring a connection string to only one database; we'll get more details about this next time.
The second layer is the business layer, which includes user data verification and other similar operations in which you process your business rules and the core of the program.
The third and last layer is the presentation layer, which relates to user input and UI (user interfaces), i.e. the different forms for input.
Frankly, you can divide your solution (program, application, website) into sub-programs to avoid data loss, organize your work, and divide the development of your application among team members.
In my point of view this is a great thing to learn in development, and as I heard through the grapevine, if you want to enrich your knowledge and experience then you should get acquainted with this important subject.

SOA architecture data access

In my SOA architecture, I have several WCF services.
All of my services need to access the database.
Should I create a specialized WCF service in charge of all the database access?
Or is it OK if each of my services has its own database access?
In one version, I have just one Entity layer instantiated in one service, and all the other services depend on this service.
In the other one the Entity layer is duplicated in each of my services.
The main drawback of the first version is the coupling induced.
The drawback of the other version is the layer duplication, and maybe it is SOA bad practice?
So, what do you think, good people of Stack Overflow?
Just my personal opinion: if you create a service for all database access, then multiple services depend on ONE service, which sort of defeats the point of SOA (i.e. services are autonomous), as you have articulated.
When you talk of layer duplication, if each service has its own data to deal with, is it really duplication? I realize that you probably have the same means of interacting with your relational databases, or that back in the OOA days you had a common class library that encapsulated data access for you. This is one of those things I struggle with myself, but I see no problem in each service having its own data layer. In fact, in Michele Bustamante's book (Chapter 1 - Page 8) she actually depicts this and adds "Services encapsulate business components and data access"; if you notice, each service has a separate DALC layer. This is a good question.
It sounds as if you have several services but a single database.
If this is correct you do not really have a pure SOA architecture, since the services are not independent. (There is nothing wrong with not having a pure SOA architecture; it can often be the correct choice.)
Adding an extra WCF layer would just complicate and slow down your solution.
I would recommend that you create a single data access dll which contains all data access and is referenced by each WCF service. That way you do not have any duplication of code. Since you have a single database, any change in the database/datalayer would require a redeployment of all services in any case.
Why not just use a dependency injection framework? If the services are currently using the same database, then just allow them to share the same code; if they were in the same project, they would all use the same dll.
That way, later, if you need to put in some code that you don't want the others to share, you can make changes and just create a new DAO layer.
If there is a certain singleton that all will use, then you can just inject that in when you inject in the dao layer.
But, this will require that they use the same DI framework container.
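The idea in code, independent of any particular DI container (a hypothetical sketch):

    // Each service depends on an abstraction, not a concrete DAO.
    public interface ICustomerDao
    {
        string GetName(int id);
    }

    public class SharedCustomerDao : ICustomerDao
    {
        public string GetName(int id) { /* shared data-access code */ return "stub"; }
    }

    public class BillingService
    {
        private readonly ICustomerDao _dao;

        // The container injects the shared DAO today; swap in a
        // service-specific implementation later without touching this class.
        public BillingService(ICustomerDao dao) { _dao = dao; }

        public string CustomerNameFor(int customerId) { return _dao.GetName(customerId); }
    }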
The real win that SOA brings is that it reduces the number of linkages between applications.
In the past I've worked with organizations who have done it many different ways. Some data layers are integrated, and some are abstracted.
The way I've seen it most successfully done is when you create generic data-layer services for each app/database, and then build the higher-level services on top of that newly created data layer.