Implement Repository Pattern in ASP.NET MVC with SOA architecture - WCF

We are starting a new project in our company and have finalized the architecture as follows. There are 5 different projects:
1) BusinessEntities (class library), which contains the data contracts, for example:
[DataContract]
public class Cities
{
    /// <summary>
    /// Gets or sets City Id
    /// </summary>
    [DataMember]
    public int Id { get; set; }

    /// <summary>
    /// Gets or sets City name
    /// </summary>
    [DataMember]
    [Display(Name = "CityName", ResourceType = typeof(DisplayMessage))]
    [Required(ErrorMessageResourceName = "CityName", ErrorMessageResourceType = typeof(ErrorMessage))]
    [RegularExpression(@"[a-zA-Z ]*", ErrorMessageResourceName = "CityNameAlphabates", ErrorMessageResourceType = typeof(ErrorMessage))]
    [StringLength(50, ErrorMessageResourceName = "CityNameLength", ErrorMessageResourceType = typeof(ErrorMessage))]
    public string Name { get; set; }
}
2) Interface, which contains the service contracts:
[ServiceContract]
public interface ICity : IService<CityViewModel>
{
    [OperationContract]
    Status Add(Cities entity);
}
3) DAL, which contains the implementation of the WCF service:
[ServiceBehavior(InstanceContextMode = InstanceContextMode.PerCall)]
public class City : ICity
{
    public Status Add(Cities entity)
    {
        // Insert logic goes here
    }
}
4) WebComponent, which calls the WCF service:
public class City
{
    public static Status Add(Cities entity)
    {
        using (var service = new WcfServiceProvider<ICity>())
        {
            return service.GetProxy().Add(entity);
        }
    }
}
5) UI (ASP.NET MVC project), which calls the web component to access the service:
City.Add(entity);
We have now finalized this structure, but the problem is: how do we use the Repository pattern for unit testing? Is it possible to use the Repository pattern with this structure, and if so, how? Or is there another pattern we should use instead?

I recommend that you read about separation of concerns. Right now you are using your business object as both DTO and BO. That effectively couples your WCF service with your domain layer AND with the UI layer.
That means that versioning will be nearly impossible. If you want to make any change in the UI or in the DL, you have to make sure that all changes are made in both layers, as the UI won't be able to talk to the BL otherwise.
It's much better to have dedicated DTOs, since you can then handle versioning issues a lot more easily (such as default values or a newly introduced property).
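To illustrate the idea, here is a minimal sketch of what splitting the type could look like; the CityDto name and the members shown are assumptions for illustration, not part of the original design:

using System.Runtime.Serialization;

// Wire-format contract: exposes only what service consumers need.
[DataContract(Name = "City")]
public class CityDto
{
    [DataMember]
    public int Id { get; set; }

    [DataMember]
    public string Name { get; set; }
}

// Business object: free to change without breaking existing clients.
public class City
{
    public int Id { get; set; }
    public string Name { get; set; }

    // Business rules live here rather than on the wire contract.
    public bool HasValidName()
    {
        return !string.IsNullOrWhiteSpace(Name) && Name.Length <= 50;
    }
}

The service implementation then maps between the two, so a change to the business object only requires touching the mapping, not every client.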
Your naming does not make sense either. Your Cities class contains ONE city, right? Why not name it City?
[ServiceContract]
public interface ICity : IService<CityViewModel>
{
    [OperationContract]
    Status Add(Cities entity);
}
Can you explain what that service definition is? I don't see the relation between the view model and the DTO. The same goes for the name: ICity is misleading. If it's a repository, name it as such. Most of us use the City name for the object that we work with, and names like ICityService or ICityRepository to point out the access technology.
Now to the real question:
But the problem is how to use the Repository pattern for unit testing?
You don't.
The only responsibility of repositories is to load and store data in the data source. You can of course mock the DbConnection etc., but that doesn't guarantee anything at all, since the repositories are effectively coupled to the data source. Even with mocks you'll still get failures from incorrect SQL queries, invalid column types, incorrect table relations etc.
Hence, if you truly want to make sure that the repositories work, you have to query a real database.
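What you can unit test is the code that consumes the repository, by hiding data access behind an interface and substituting a fake or mock in the test. A minimal sketch under those assumptions, reusing the City/Status types from the question (renamed as suggested above); ICityRepository, CityAppService and the Status members used here are hypothetical names, not part of the original design:

// Abstraction the rest of the application depends on.
public interface ICityRepository
{
    Status Add(City entity);
}

// Application/service layer that can be unit tested without a database.
public class CityAppService
{
    private readonly ICityRepository _repository;

    public CityAppService(ICityRepository repository)
    {
        _repository = repository;
    }

    public Status Register(City city)
    {
        // Rules like this are what the unit test actually verifies.
        if (string.IsNullOrWhiteSpace(city.Name))
            return Status.ValidationFailed;   // assumed Status member

        return _repository.Add(city);
    }
}

// Hand-rolled fake used by the test (a mocking library works equally well).
public class FakeCityRepository : ICityRepository
{
    public City LastAdded { get; private set; }

    public Status Add(City entity)
    {
        LastAdded = entity;
        return Status.Success;   // assumed Status member
    }
}

The repository implementation itself is then covered by integration tests against a real database, as described above.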

Related

Silverlight domain service doesn't allow returning a generic object

I have a domain service running smoothly, with some exposed functions that return generic lists of defined entities. I had to add some common information, so I created a generic object to wrap the collection with the extra information that I need to return.
But after making the change and trying to use the service in the client, the function doesn't show up in the context. I have already searched for this, and what I found were attributes for generic IQueryable.
My wrap class:
public class Wrap<T>
{
    public String commonProperty { get; set; }
    public String anotherCommonProperty { get; set; }
    public List<T> items { get; set; }
}
In my domain service:
public Wrap<SomeClass> GetAll()
{
    Wrap<SomeClass> myObject = new Wrap<SomeClass>();
    myObject.items = new List<SomeClass>();
    myObject.commonProperty = "some info";
    myObject.anotherCommonProperty = "some info";
    return myObject;
}
Maybe try adding the [KnownType(typeof(SomeClass))] attribute to the Wrap<T> class. The catch is that you need to include one KnownType attribute for every class in your domain (this is because you are building a polymorphic service).
Also try adding [ServiceKnownType(typeof(SomeClass))] to the GetAll method on the service (this is for WCF services; I don't know if it is valid for domain services).
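A sketch of where those attributes would go, using the types from the question; whether this is enough for a RIA domain service is uncertain, as noted above:

using System.Collections.Generic;
using System.Runtime.Serialization;
using System.ServiceModel;

// One KnownType entry per concrete class that can appear inside the wrapper.
[KnownType(typeof(SomeClass))]
public class Wrap<T>
{
    public string commonProperty { get; set; }
    public string anotherCommonProperty { get; set; }
    public List<T> items { get; set; }
}

public class SomeClass { }

public class MyDomainService
{
    // WCF-style serializer hint; it may not be honoured by RIA domain services.
    [ServiceKnownType(typeof(SomeClass))]
    public Wrap<SomeClass> GetAll()
    {
        return new Wrap<SomeClass>
        {
            items = new List<SomeClass>(),
            commonProperty = "some info",
            anotherCommonProperty = "some info"
        };
    }
}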
WCF RIA domain services do not support generic entity types; IEnumerable<T> and IQueryable<T> are special cases.
Your method was ignored because it did not match a supported method type.
Before the change, GetAll was recognized as a query method. You can force that by adding the attribute:
[Query]
public Wrap<SomeClass> GetAll()
Now it does not disappear silently, but generates a compile-time error instead:
Type 'Wrap`1' is not a valid entity type. Entity types cannot be generic.

WCF DataContract Versioning

Alright, here goes nothing. After reading Best Practices on Service Versioning and Data Contract Versioning (http://msdn.microsoft.com/en-us/library/ms733832.aspx), I mostly understand how it's all done. I am planning to use agile versioning for data contracts, but I can't figure out what the difference or better practice is between creating a WorkRequestV2 to add new properties and just adding the new properties to WorkRequestV1. I have tried it both ways and both worked, but when I create WorkRequestV2 I have to modify the service contract to use WorkRequestV2. Why do this rather than just adding properties to WorkRequestV1? What is the difference?
The example I looked at is here (http://msdn.microsoft.com/en-us/library/ms731138.aspx): CarV1 and CarV2. Why not add HorsePower to CarV1 and avoid having to create a whole new contract?
[DataContract(Name = "WorkRequest")]
public class WorkRequestV1 : IExtensibleDataObject
{
    [DataMember(Name = "workrequest", Order = 1, IsRequired = true)]
    public int workrequest { get; set; }

    [DataMember(Name = "CQ")]
    public string CrewHeadquarter { get; set; }

    [DataMember(Name = "JobCode")]
    public string JobCode { get; set; }

    [DataMember(Name = "JobType")]
    public string JobType { get; set; }

    [DataMember(Name = "Latitude")]
    public string Latitude { get; set; }

    [DataMember(Name = "Longitute")]
    public string Longitute { get; set; }

    private ExtensionDataObject theData;

    public ExtensionDataObject ExtensionData
    {
        get { return theData; }
        set { theData = value; }
    }
}
Have another read of Data Contract Versioning (your second link). Here is a quote from that page:
Breaking vs. Nonbreaking Changes: Changes to a data contract can be breaking or nonbreaking. When a data contract is changed in a nonbreaking way, an application using the older version of the contract can communicate with an application using the newer version, and an application using the newer version of the contract can communicate with an application using the older version. On the other hand, a breaking change prevents communication in one or both directions.
For your case, adding some additional properties is a non-breaking change. You can quite safely add the properties to the existing data contract rather than create a new one, as long as you don't have strict schema validation (for example, the new properties must not be marked as required).
Old clients communicating with new services will continue to work; the new properties will simply keep their default values. New clients communicating with old services will also work, as the new properties will be ignored.
But as you can see, you then run into the problem of how to ensure that new clients talk to new services and old clients to old services. If this isn't an issue, then you don't have a problem; otherwise you may need to introduce a new data contract.
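As a sketch, a non-breaking addition to the existing contract could look like this; the Priority property is purely illustrative, and IsRequired is left false so old clients and services are unaffected:

[DataContract(Name = "WorkRequest")]
public class WorkRequestV1 : IExtensibleDataObject
{
    [DataMember(Name = "workrequest", Order = 1, IsRequired = true)]
    public int workrequest { get; set; }

    // ... existing members as before ...

    // New, optional member: old clients simply never send or receive it.
    [DataMember(Name = "Priority", IsRequired = false)]
    public string Priority { get; set; }

    public ExtensionDataObject ExtensionData { get; set; }
}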
Further reading:
MSDN Service Versioning
IBM Best practice for Web service versioning
Oracle Web services versioning
What are your WebService Versioning best practices?

Problems with EF-Agnostic design consumed by WCF service.

I am trying to set up EF to work with WCF while keeping the domain class models EF-agnostic.
The code is organized into 3 projects. (I am taking a stab at DDD; I am very new to it but am looking forward to learning more.)
Project: QA - Domain layer. Contains the DataContract models/entities.
References:
QA.Data
Project: QA.Data - Data layer. Contains the context and EDMX (code generation strategy = "none").
References:
Entity Framework/System.Data.Entity
Project: QA.Repository - Data access/repository. Contains the repository classes.
References:
QA [Domain Layer]
QA.Data [Data Layer]
Entity Framework/System.Data.Entity
My understanding is that the domain layer can reference the data layer, but the data layer should never reference the domain. The problem this presents is that my domain models/classes are defined in the domain layer, but the context which creates and returns them is in the data layer. In order for my context to know to return a "Widget" object, it would need a reference to the domain layer which defines "Widget".
My (failed) solution: I created interfaces for each domain model and placed them in the data layer. The context would return ... IDbSet ... These interfaces would, in turn, be implemented by the domain models, thereby keeping my data layer from directly needing to reference my domain (which would cause illegal circular references anyway). The domain models were originally constructed using the "ADO.NET DbContext Generator w/ WCF Support" T4 templates. This process resulted in the inclusion of [KnownType(typeof(IWidgetPiece))] at the beginning of the Widget class definition. (A Widget has a navigation property ... ICollection ...)
The problem appears when I attempt to access the service; I get the following error:
'QA.Data.IWidgetPiece' cannot be added to list of known types since another type 'System.Object' with the same data contract name 'http://www.w3.org/2001/XMLSchema:anyType' is already present. If there are different collections of a particular type - for example, List<Test> and Test[] - they cannot both be added as known types. Consider specifying only one of these types for addition to the known types list.
I can change these to the concrete implementations ... [KnownType(typeof(WidgetPiece))] ... but I continue to get this error, because the navigation property they refer to still returns an IWidgetPiece interface type, which it MUST do in order to satisfy the interface implementation.
I am trying to figure out how to keep things appropriately divided and still have the context return what it should. The context returning interfaces still doesn't "sit" right with me, for this and other reasons, but I cannot think of another way to do this, and even this approach is causing the aforementioned issue. HELP!
Some code to hopefully clarify my previous ramblings:
namespace QA.Data
{
    public interface IWidgetPiece
    {
        String ID { get; set; }
    }

    public interface IWidget
    {
        String ID { get; set; }
        ICollection<IWidgetPiece> Pieces { get; set; }
    }

    public partial class WidgetEntities : DbContext
    {
        public IDbSet<IWidget> Widgets { get; set; }
        public IDbSet<IWidgetPiece> WidgetPieces { get; set; }
    }
}

namespace QA
{
    [KnownType(typeof(IWidgetPiece))]
    // [KnownType(typeof(WidgetPiece))]
    [DataContract(IsReference = true)]
    public partial class Widget : QA.Data.IWidget
    {
        [DataMember]
        public String ID { get; set; }

        [DataMember]
        public virtual ICollection<IWidgetPiece> Pieces { get; set; }
    }

    [DataContract(IsReference = true)]
    public partial class WidgetPiece : QA.Data.IWidgetPiece
    {
        [DataMember]
        public string ID { get; set; }
    }
}

namespace QA.Repository
{
    public class WidgetRepository
    {
        public List<Widget> GetWidgetbyID(String sId)
        {
            WidgetEntities context = new WidgetEntities();
            List<IWidget> objs = context.Widgets.Where(b => b.ID == sId).ToList();
            List<Widget> widgetList = new List<Widget>();
            foreach (var iwidget in objs)
                widgetList.Add((Widget)iwidget);
            return widgetList;
        }
    }
}
Do you really want/need two separate models, i.e. your data access layer model (EDMX) and your "real" domain model? The whole point of an ORM framework like EF is that you can map your domain model to your database tables, using mappings between the physical (database) and conceptual models.
Since EF 4.1, you can construct your domain model and then, in your data access layer, map it to your database directly using a fluent API. You can also elect to reverse-engineer your POCO domain model from a database if you want to get up and running quickly.
It just seems a bit of unnecessary complexity to create an entire EF class model, only to then have to map it again into another class model (which will most likely be fairly close to the EF-generated one).
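A rough sketch of that approach with EF 4.1+ Code First, using plain POCO versions of the Widget classes; the table names and key configuration here are assumptions:

using System.Collections.Generic;
using System.Data.Entity;   // EF 4.1+ (Code First)

// Plain domain classes; note that EF maps concrete types, not interfaces.
public class Widget
{
    public string ID { get; set; }
    public virtual ICollection<WidgetPiece> Pieces { get; set; }
}

public class WidgetPiece
{
    public string ID { get; set; }
}

public class WidgetContext : DbContext
{
    public DbSet<Widget> Widgets { get; set; }
    public DbSet<WidgetPiece> WidgetPieces { get; set; }

    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        // Map the POCOs to tables directly; no EDMX and no interfaces needed.
        modelBuilder.Entity<Widget>().HasKey(w => w.ID).ToTable("Widgets");
        modelBuilder.Entity<WidgetPiece>().HasKey(p => p.ID).ToTable("WidgetPieces");
    }
}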

When upgrading from ASMX to WCF, does every member need to be decorated with the DataMember attribute?

I have a web service that currently uses ASMX. The operations are decorated with WebMethod and each takes a request and returns a response. I started creating a WCF app and I am referencing the business layer so I can reuse the web methods. My question is: do I have to decorate each class with DataContract and each property of the request with DataMember?
Currently, one of the classes is decorated with SerializableAttribute, XmlTypeAttribute, and XmlRootAttribute. Do I need to remove these and add DataContract, or can I just add DataContract alongside them? It is a .NET 2.0 app, by the way. The class also contains a bunch of private fields and public properties; do I need to decorate these with a DataMember attribute? Is this even possible if it is using the .NET 2.0 framework?
The WCF service is currently targeting .NET Framework 4.0. A few of the methods still need to use the XmlSerializer, so does this mean I can just decorate those operations with [XmlSerializerFormat]?
Can you elaborate on not using any business objects on the service boundary? And what is a DTO?
If possible, can you give an example?
Since .NET 3.5 SP1, the DataContractSerializer does not require the use of attributes (so-called POCO support), although this gives you little control over the XML that is produced.
However, if you already have an ASMX service you want to port, then to maintain the same serialization you really want to use the XmlSerializer. You can wire this in in WCF using the [XmlSerializerFormat] attribute, which can be applied at the service contract or individual operation level.
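For example, a sketch of applying it at the operation level; the contract and types shown here are illustrative only:

using System.ServiceModel;

[ServiceContract]
public interface ILegacyService
{
    // This operation keeps the ASMX-compatible XmlSerializer wire format.
    [OperationContract]
    [XmlSerializerFormat]
    LegacyResponse GetData(LegacyRequest request);

    // Other operations can stay on the default DataContractSerializer.
    [OperationContract]
    string Ping();
}

public class LegacyRequest
{
    public int Id;
}

public class LegacyResponse
{
    public string Value;
}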
Edit: adding section on DTOs
However, putting business objects on service boundaries causes potential issues:
You may be exposing unnecessary data that is purely part of your business rules.
You tightly couple your service consumers to your business layers, introducing fragility in their code and preventing you from refactoring freely.
The idea of Data Transfer Objects (DTOs) is to create classes whose sole role in life is to manage the transition between the XML and object worlds. This also conforms to the Single Responsibility Principle. The DTOs only expose the necessary data and act as a buffer between business changes and the wire format. Here is an example:
[ServiceContract]
interface ICustomer
{
    [OperationContract]
    CustomerDTO GetCustomer(int id);
}

class CustomerService : ICustomer
{
    ICustomerRepository repo;

    public CustomerService(ICustomerRepository repo)
    {
        this.repo = repo;
    }

    public CustomerService()
        : this(new DBCustomerRepository())
    {
    }

    public CustomerDTO GetCustomer(int id)
    {
        Customer c = repo.GetCustomer(id);
        return new CustomerDTO
        {
            Id = c.Id,
            Name = c.Name,
            AvailableBalance = c.Balance + c.CreditLimit,
        };
    }
}

class Customer
{
    public int Id { get; private set; }
    public string Name { get; set; }
    public int Age { get; set; }
    public decimal Balance { get; set; }
    public decimal CreditLimit { get; set; }
}

[DataContract(Name = "Customer")]
class CustomerDTO
{
    [DataMember]
    public int Id { get; set; }

    [DataMember]
    public string Name { get; set; }

    [DataMember]
    public decimal AvailableBalance { get; set; }
}
Using DTOs allows you to expose existing business functionality via services without having to make changes to that business functionality for purely technical reasons.
The one issue people tend to baulk at with DTOs is the necessity of mapping between them and business objects. However, when you consider the advantages they bring, I think it is a small price to pay, and it is a price that can be heavily reduced by tools such as AutoMapper.
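For instance, the hand-written projection in GetCustomer above could roughly be replaced with an AutoMapper mapping. This is only a sketch using the older static Mapper API (newer AutoMapper versions configure a MapperConfiguration instead):

// One-time configuration, e.g. at application start-up.
Mapper.CreateMap<Customer, CustomerDTO>()
    .ForMember(dto => dto.AvailableBalance,
               opt => opt.MapFrom(c => c.Balance + c.CreditLimit));

// The service method then shrinks to a single projection call.
public CustomerDTO GetCustomer(int id)
{
    Customer c = repo.GetCustomer(id);
    return Mapper.Map<CustomerDTO>(c);
}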
WCF uses the DataContractSerializer, which is primarily based upon attributes like DataContract, DataMember, ServiceContract and so forth, but it also supports SerializableAttribute amongst others. This document gives you all the insight you need: http://msdn.microsoft.com/en-us/library/ms731923.aspx
So it might be that you don't need to refactor all your existing code, but it requires some further investigation and testing ;)

How to simply map an NHibernate ISet to IList using AutoMapper

I'm trying to use AutoMapper to map from DTOs to my domain.
My DTOs might look like this:
public class MyDTO
{
    public string Name { get; set; }
    public bool OtherProperty { get; set; }
    public ChildDTO[] Children { get; set; }
}

public class ChildDTO
{
    public string OtherName { get; set; }
}
My domain objects look like this:
public class MyDomain
{
    public string Name { get; set; }
    public bool OtherProperty { get; set; }
    public ISet<ChildDomain> Children { get; set; }
}

public class ChildDomain
{
    public string OtherName { get; set; }
}
How would I set up AutoMapper to be able to map from these arrays to sets? It seems like AutoMapper takes the arrays and converts them into ILists, then fails on the conversion to ISet.
Here's the exception:
Unable to cast object of type 'System.Collections.Generic.List`1[DataTranser.ChildDTO]' to type 'Iesi.Collections.Generic.ISet`1[Domain.ChildDomain]'.
I'm hoping to find a simple, generic way to do this so that I can minimize the infrastructure needed to map from DTOs to the domain. Any help is greatly appreciated.
UPDATE:
So how would I model MyDomain -> ChildDomain without ending up with an anemic domain model? I understand that without business logic in MyDomain or ChildDomain the domain model is currently anemic, but the goal is to add business logic as we move forward. I just want to ensure that my view model can be translated into the domain model and persisted.
What would you suggest for this scenario: moving from a simple mapping between view and domain, and later adding in business rules?
Thanks again for your help.
If your persistence layer is simple, using UseDestinationValue() will tell AutoMapper to not replace the underlying collection:
ForMember(dest => dest.Children, opt => opt.UseDestinationValue())
However, if it's not simple, we just do the updating back into the domain manually. The logic to update the domain model generally gets more complex, and doing reverse mapping puts constraints on the shape of your domain model, which you might not want.
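A sketch of what that configuration could look like for the types in the question, assuming the DTO is mapped onto an entity whose Children set has already been initialized (for example by NHibernate when the entity was loaded); the older static Mapper API is used here:

// Configuration: map children, and keep the existing destination collection.
Mapper.CreateMap<ChildDTO, ChildDomain>();
Mapper.CreateMap<MyDTO, MyDomain>()
    .ForMember(dest => dest.Children, opt => opt.UseDestinationValue());

// Usage: map onto an already-loaded entity so the NHibernate-backed ISet
// instance is reused instead of being replaced with a List.
MyDomain entity = session.Get<MyDomain>(id);   // 'session' is an NHibernate ISession
Mapper.Map(dto, entity);                       // 'dto' is the incoming MyDTO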
The answer:
You have to create your own IObjectMapper to map a custom collection like ISet.
Create your own configuration instance with all the standard object mappers plus your new set object mapper.
Use an IMappingEngine instance created from that configuration (with your own object mapper) instead of the static AutoMapper.Mapper class.
Some remarks:
It's easy to configure the IMappingEngine construction in an inversion of control container.
The source of AutoMapper itself might help you with creating the IObjectMapper implementation.
You are using AutoMapper the opposite way from what it is designed for: it's designed to map complex objects to simple objects, whereas you are trying to map a simple DTO to a complex entity. (This does not mean that what you want is hard to do with AutoMapper, but you might run into different problems in the future.)
You are using the anemic domain model anti-pattern. The domain should hold all the business logic, so it should not expose a complex collection like ISet (and should not have public setters for collections at all).