ASP.NET Core - Middleware vs. DispatchProxy

I just found the DispatchProxy class and I'm wondering whether I should wrap my repository and service objects with it (the wrapping would be done by a factory).
Currently I use middleware and action filters for logging and authentication, but the proxy approach seems better because it would make my code more platform independent (DispatchProxy is supported in .NET Standard).
Is there a big drawback to DispatchProxy compared to middleware/action filters?
Does anyone have experience with the DispatchProxy class in professional environments?
Are they much slower because they use delegates?
Thank you for your efforts!
(Here is an article about DispatchProxy: https://dzone.com/articles/aspect-oriented-programming-in-c-using-dispatchpro)
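For context, a minimal sketch of the kind of wrapper I have in mind (IRepository and SqlRepository are made-up names for illustration):

using System;
using System.Reflection;

public interface IRepository
{
    string GetById(int id);
}

// A generic logging decorator built on DispatchProxy.
public class LoggingProxy<T> : DispatchProxy where T : class
{
    private T _target;

    // Factory method: wraps any interface implementation in a logging proxy.
    public static T Wrap(T target)
    {
        var proxy = Create<T, LoggingProxy<T>>() as LoggingProxy<T>;
        proxy._target = target;
        return proxy as T;
    }

    // Every call on the proxied interface funnels through here via reflection.
    protected override object Invoke(MethodInfo targetMethod, object[] args)
    {
        Console.WriteLine($"Calling {targetMethod.Name}");
        var result = targetMethod.Invoke(_target, args);
        Console.WriteLine($"Finished {targetMethod.Name}");
        return result;
    }
}

// Usage: IRepository repo = LoggingProxy<IRepository>.Wrap(new SqlRepository());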

Related

ASP.NET Core AddDbContext

I'm architecting a new app and I'm really uncomfortable with the approach ASP.NET Core has made "normal": registering a DbContext in services by using AddDbContext.
I'd like to know whether you think using AddDbContext in ASP.NET Core is a bad practice, since it forces my web app to have a dependency on my database access layer.
I've researched a lot, and surprisingly there is almost nothing covering this subject.
How should I proceed to overcome this concern?
Thanks!
it forces my Web App to have a dependency on my database access layer
That's exactly the place where it should be: the composition root. Your application startup code is the place where you glue your components together.
What else would you want: to create a separate library containing interfaces for all the classes in your entire DAL, and wire that up using magic during startup?
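For illustration, the wiring in the composition root might look like this (AppDbContext, IOrderRepository and EfOrderRepository are hypothetical names):

using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;

public class AppDbContext : DbContext
{
    public AppDbContext(DbContextOptions<AppDbContext> options) : base(options) { }
}

public interface IOrderRepository { }

public class EfOrderRepository : IOrderRepository
{
    public EfOrderRepository(AppDbContext context) { }
}

public class Startup
{
    private readonly IConfiguration _configuration;

    public Startup(IConfiguration configuration)
    {
        _configuration = configuration;
    }

    public void ConfigureServices(IServiceCollection services)
    {
        // The web project references the DAL here, and only here.
        services.AddDbContext<AppDbContext>(options =>
            options.UseSqlServer(_configuration.GetConnectionString("Default")));

        // Controllers depend on abstractions; the concrete type is bound once.
        services.AddScoped<IOrderRepository, EfOrderRepository>();
    }
}

The rest of the application only sees IOrderRepository; the DbContext dependency never leaks past this file.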

WCF OData for multiplatform development?

The OP in this question asks about using a WCF/OData service as an internal data access layer.
Arguments of using WCF/OData as access layer instead of EF/L2S/nHibernate directly
The resounding reply seems to be "don't do it". I'm in a similar position to the OP, but have a concern not raised in the original question. I'm trying to develop (natively) for a lot of different platforms but want to keep as much of the data and business logic server side as possible. So I'll have iOS/Android/Web (MVC)/desktop applications. Currently, I have a single WinForms application with an ORM data access layer (LLBLGen Pro).
I'm envisioning moving most of my business/data access logic (possibly still with LLBLGen or another ORM) behind a WCF/OData interface, then making all my clients on the different platforms very thin (basically UI and WCF calls).
Is this also overengineered? Am I missing a simpler solution?
I can't see any problem in your architecture, nor would I consider it overengineered: OData is a standard protocol, and your concept conforms to the DRY principle as well.
I'd turn the question around: why would you implement the same business logic in each client, introducing more possible bugs and losing the ability to fix errors in one single, centralized place? Your idea also makes it possible to implement the security layer only once.
OData is a cross-platform standard and you can find OData libraries for every development platform (MSDN, OData.org, JayData for JavaScript). Furthermore, you can use OData FunctionImports/service methods and entity-level methods, which will simplify your queries.
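To see how platform-agnostic the protocol is, here is a query issued with nothing but an HTTP client and the standard OData query options (the service URL is made up):

using System;
using System.Net.Http;
using System.Threading.Tasks;

class ODataQueryDemo
{
    static async Task Main()
    {
        using (var http = new HttpClient())
        {
            // $filter, $orderby and $top are standard OData query options;
            // any stack that can issue an HTTP GET can consume the service.
            var url = "https://example.com/odata/Products?$filter=Price gt 20&$orderby=Name&$top=10";
            var response = await http.GetStringAsync(url);
            Console.WriteLine(response); // Atom/XML or JSON, depending on the Accept header
        }
    }
}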
If you are doing multiplatform development, then you may find it more practical to choose a platform-agnostic communication protocol, such as HTTP, rather than bringing in multiple drivers and ORMs to access your data sources directly. In addition, since OData is a REST protocol, you don't need much on the client side: anything that can format OData HTTP requests and parse HTTP responses will do. There are, however, a few aspects to be aware of:
An OData server is not a replacement for an SQL database. It supports batches, but they are not the same as DB transactions (although in many cases they can be used to model transactional operations). It supports parent-child relations, but it does not support JOINs in the classic SQL sense. So you have to plan what you expose as an OData entity. It's too easy to build an OData server by using WCF Data Services to wrap an EF model; too easy, because people often expose low-level database content instead of building high-level domain types.
As of today, multiplatform OData clients are still under development, but they are coming. If I may suggest something I am personally working on, have a look at the Simple.Data OData adapter (https://github.com/simplefx/Simple.OData, see its wiki pages for examples) - it has a NuGet package. While this is a client library that only supports .NET 4.0, part of it is being extracted to be published as a portable class library, Simple.OData.Client, to support .NET 4.x, Windows Store, Silverlight 5, Windows Phone 8, Android and iOS. In fact, if you check the winrt branch of the Git repository, you will find a multiplatform PCL already; it's just not published on NuGet yet.
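For a flavor of the client-side code, a query through the library's untyped fluent API looks roughly like this (the URL and collection name are invented, and the exact API may differ between versions - check the wiki):

using System;
using System.Threading.Tasks;
using Simple.OData.Client;

class SimpleODataDemo
{
    static async Task Main()
    {
        // Point the client at the OData service root (made-up URL).
        var client = new ODataClient("https://example.com/odata/");

        // Fetch entries from a collection using an OData filter expression.
        var products = await client
            .For("Products")
            .Filter("Price gt 20")
            .Top(10)
            .FindEntriesAsync();

        foreach (var product in products)
            Console.WriteLine(product["Name"]);
    }
}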

What is the best practice for using NHibernate 3.0 with WCF Web Services?

There seems to be quite a bit of information about using NHibernate and WCF web services together, but I'm struggling to find a definitive guide on how to combine the two technologies in an efficient, thread-safe way.
Specifically, I want to grab the ISession object and use it to get and save data through my existing repositories. My business objects, unit tests and ASP.NET web application all use the NHibernate framework, and it works just great (it's my first hobby project using an ORM). My big question is how to combine this great framework with WCF web services.
I've read that NHibernate 3.0 has NHibernate.Context.WcfOperationSessionContext, but I'm unsure of its implementation (see this question). From what I understand, one option is to store the ISession object in the OperationContext?
Can anyone point me in the direction of an implementation example?
Many thanks.
Here is a post describing, in detail, all the steps for registering and using the WcfOperationSessionContext. It also includes instructions for using it with the agatha-rrsl project.
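In rough outline, the wiring looks like this - where exactly you open and bind the session (e.g. in a WCF message inspector) is the part the post covers in detail, so treat this as a sketch:

using NHibernate;
using NHibernate.Cfg;
using NHibernate.Context;

// 1. Tell NHibernate to scope the current session to the WCF operation.
var cfg = new Configuration();
cfg.Configure(); // reads hibernate.cfg.xml
cfg.SetProperty(NHibernate.Cfg.Environment.CurrentSessionContextClass, "wcf_operation");
var sessionFactory = cfg.BuildSessionFactory();

// 2. When a WCF operation starts, open a session and bind it to the context.
var session = sessionFactory.OpenSession();
CurrentSessionContext.Bind(session);

// 3. Repositories anywhere in the call can now resolve the ambient session.
var current = sessionFactory.GetCurrentSession();

// 4. When the operation ends, unbind and dispose.
var bound = CurrentSessionContext.Unbind(sessionFactory);
bound.Dispose();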
WCF and NHibernate work together in the Sharp Architecture project. You can have a look at their implementation.

Entity Framework: Change tracking in SOA with POCO approach

In our layered application, we access the database via WCF calls. We create and dispose of a context per request. We are also using the POCO approach.
My question is: in a pure POCO model (completely persistence-ignorant POCOs), is it possible to track changes while we create and dispose of the context per request (the previous context having been disposed in the earlier service call)? If yes, how does EF handle this situation? As far as I can see, neither of the two mechanisms (snapshot-based change tracking, and notification-based change tracking with proxies) can handle this. If not, how should we handle the context so that we are able to track changes?
I'd say: do not use self-tracking entities in a pure SOA environment. Self-tracking entities only work when your clients use the generated proxy classes. When you're doing SOA by the book, you cannot expect your clients to be .NET, or even .NET 4.0, which is the only scenario in which self-tracking entities will work. Your services would be useless to any other clients.
Just my 2 cents,
Regards,
Koen
Self-tracking entities do indeed solve this problem if you are able to share the model with the client, as opposed to using the metadata generated by the service reference.
Abstract the STEs and reference them from the client, and you will have change tracking disconnected from the context.
Unfortunately, you won't find a simple solution to this in Entity Framework v1.0.
There has been much discussion and little resolution. It is one of the many known problems with EF v1.0, and one way or another you will end up having to write lots of code to handle it.
In .NET 4.0, the ADO.NET team introduced self-tracking entities to tackle this very problem.
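If self-tracking entities are not an option (see the caveats above), the common pure-POCO fallback is to give up on cross-call tracking and reattach on the server, marking the entity as modified. A sketch using the DbContext API (type names are made up):

using System.Data.Entity; // Entity Framework's DbContext API

public class Order
{
    public int Id { get; set; }
    public string Status { get; set; }
}

public class OrdersContext : DbContext
{
    public DbSet<Order> Orders { get; set; }
}

public static class OrderService
{
    // 'order' arrives detached from the WCF call; its original values were
    // lost when the previous context was disposed, so we mark the whole
    // entity as modified and let EF issue a full UPDATE.
    public static void Save(Order order)
    {
        using (var context = new OrdersContext())
        {
            context.Orders.Attach(order);
            context.Entry(order).State = EntityState.Modified;
            context.SaveChanges(); // updates all mapped columns, not a minimal diff
        }
    }
}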

Are NHibernate and XML Webservices (.asmx) a good match?

I'm looking at a new architecture for my site and was wondering whether pairing NHibernate with a web service core is a good idea. What I want to do is make my web service the core of my business, from the site front ends to the utilities I write. I'm trying to make all of my UIs completely ignorant of anything but my service APIs.
In a simple strawman experiment, I'm running into issues with serializing my Iesi ISets... this is causing me to rethink the strategy altogether.
I know I could just develop a core library (DLL) and reference that in each of my applications, but maintaining that DLL's version across a minimum of 6 applications seems like it's going to cause me a lot of pain.
With NHibernate, what are the pros and cons of those two approaches?
I see no problem in using NHibernate and web services together - I just don't think it's a good idea to send the entities themselves over "to the other side".
A better approach is to use a set of DTOs made for the service - then you won't run into issues like serializing unknown types and such.
You can use a library like AutoMapper to do the mapping from the entities to the DTOs.
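A minimal sketch of that entity-to-DTO mapping with AutoMapper (Order and OrderDto are made-up types, and the configuration API shown is from the newer AutoMapper versions):

using System;
using AutoMapper;

public class Order
{
    public int Id { get; set; }
    public string CustomerName { get; set; }
}

public class OrderDto
{
    public int Id { get; set; }
    public string CustomerName { get; set; }
}

public static class MappingDemo
{
    public static void Main()
    {
        // Configure once at startup; same-named properties map automatically.
        var config = new MapperConfiguration(cfg => cfg.CreateMap<Order, OrderDto>());
        var mapper = config.CreateMapper();

        // In the service method: return the DTO, never the NHibernate entity,
        // so lazy-loaded collections and Iesi sets never hit the serializer.
        var dto = mapper.Map<OrderDto>(new Order { Id = 1, CustomerName = "Ada" });
        Console.WriteLine(dto.CustomerName);
    }
}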
There's a lot of stuff written about this, some of it:
http://martinfowler.com/bliki/FirstLaw.html
http://ayende.com/Blog/archive/2009/05/14/the-stripper-pattern.aspx
http://elegantcode.com/2008/04/27/dtos-or-serialized-domain-entities/
DTOs vs Serializing Persisted Entities
As a side note, for the service itself you could, design-wise, use an approach like the one Davy Brion describes here: http://davybrion.com/blog/2009/11/requestresponse-service-layer-series/
I don't know NHibernate, but I want to remind you that you should be using WCF for new web service development, unless you are stuck in the past (.NET 2.0). Microsoft now considers ASMX web services to be "legacy technology", and you can imagine what that means.