How to implement Transactions with Generic Repository Pattern? - asp.net-core

I am developing a .NET Core application where I use the Generic Repository pattern, and I would like to know how I can implement a transaction:
IGenericRepository
public interface IGenericRepository<T>
{
    Task InsertAsync(T insert);
    Task<bool> RemoveAsync(object id);
    Task UpdateAsync(T entity);
    Task<T> GetByIdAsync(object id, string includeProperties = "");
    Task<IQueryable<T>> GetAsync(Expression<Func<T, bool>> filter = null,
        int? skip = null,
        int? take = null,
        Func<IQueryable<T>, IOrderedQueryable<T>> orderBy = null,
        string includeProperties = "");
    Task SaveAsync();
}
I was looking at this implementation, which also uses a UnitOfWork, but in .NET Core I do not have a DbContextTransaction.
I am not using UnitOfWork yet. Currently my service looks like this:
public class SomeService
{
    IGenericRepository<A> arepo;
    IGenericRepository<B> brepo;

    public SomeService(IGenericRepository<A> arepo, IGenericRepository<B> brepo)
    {
        this.arepo = arepo;
        this.brepo = brepo;
    }

    public async Task DoTransaction(object id)
    {
        var a = await arepo.GetByIdAsync(id);
        await brepo.RemoveAsync(a.Id);
        await brepo.SaveAsync();
        await arepo.InsertAsync([something]);
        await arepo.SaveAsync();
    }
}
I would want to make this transactional and also avoid using SaveChangesAsync for all repositories that get involved.
What would be a solution?

Well, I am not an expert in Entity Framework, but I am answering in terms of the repository and unit of work patterns.
To begin with, avoid the unnecessary wrapper of an additional generic repository, as you are already using a full ORM. Please refer to this answer.
but in .NET Core I do not have a DbContextTransaction.
The DbContextTransaction is important, but it is not the key to implementing a unit of work in this case. What is important is the DbContext. It is the DbContext that tracks and flushes the changes. You call SaveChanges on the DbContext to signal that you are done.
I would want to make this transactional
I am sure there must be something available to replace DbContextTransaction or to represent a transaction.
One way suggested by Microsoft is to use it as below:
context.Database.BeginTransaction()
where context is your DbContext.
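As a rough sketch of how that could look for the service above (assuming EF Core, that both repositories share the same DbContext instance which is also visible to the service as context, and a hypothetical newEntity to insert), the whole operation can be wrapped like this:
public async Task DoTransaction(object id)
{
    // Minimal sketch: one explicit transaction around both repositories,
    // a single SaveChangesAsync, then Commit. Disposing the transaction
    // without committing rolls everything back.
    using (var transaction = context.Database.BeginTransaction())
    {
        var a = await arepo.GetByIdAsync(id);
        await brepo.RemoveAsync(a.Id);
        await arepo.InsertAsync(newEntity); // newEntity is a placeholder

        await context.SaveChangesAsync();
        transaction.Commit();
    }
}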
Another way is explained here.
also, avoid using SaveChangesAsync for all repos that get involved
That is possible. Do not put SaveChanges in the repositories. Put it in a separate class. Inject that class into each concrete/generic repository. Finally, simply call SaveChanges once when you are done. For sample code, you can have a look at this question; the code in that question has a bug, which is fixed in the answer I provided there.
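A rough sketch of that idea, assuming all repositories are registered against one shared DbContext; the DbContextSaver name is made up, and it is injected into the service here (rather than into each repository) as a slight simplification:
using System.Threading.Tasks;
using Microsoft.EntityFrameworkCore;

// Illustrative saver class: it owns SaveChanges so it is called exactly once.
public class DbContextSaver
{
    private readonly DbContext _context;

    public DbContextSaver(DbContext context)
    {
        _context = context;
    }

    public Task SaveAsync()
    {
        return _context.SaveChangesAsync();
    }
}

public class SomeService
{
    private readonly IGenericRepository<A> arepo;
    private readonly IGenericRepository<B> brepo;
    private readonly DbContextSaver saver;

    public SomeService(IGenericRepository<A> arepo, IGenericRepository<B> brepo, DbContextSaver saver)
    {
        this.arepo = arepo;
        this.brepo = brepo;
        this.saver = saver;
    }

    public async Task DoTransaction(object id)
    {
        var a = await arepo.GetByIdAsync(id);
        await brepo.RemoveAsync(a.Id);
        await arepo.InsertAsync(a);   // placeholder insert, standing in for the question's [something]
        await saver.SaveAsync();      // single SaveChanges for everything tracked by the shared context
    }
}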

Related

Inject httpcontext into custom attribute outside controller in .net core

I need to inject HttpContext into a custom attribute that is used outside the controller. I found several solutions for doing it in a controller, but my case is a little tricky. Currently I have the following code in my PermissionController:
[PermissionFilter(PermissionEnum.Permission, AccessLevelEnum.Create)] // <-- it works perfectly
[HttpPost("users/{userId}")]
public async Task<IActionResult> AssignPermissionToUser([FromBody] List<PermissionToVM> permissions, int userId)
{
    await _permissionService.Assign(permissions); // <-- .Assign() extension
    //code goes here
}
In the method above there is a call to the extension method .Assign. The code of this method is shown below.
//[SecondPermissionFilter(PermissionEnum.Permission, AccessLevelEnum.Create)]
// ^-- here I check permissions but don't know how to inject the HttpContext
public async Task Assign(List<PermissionToVM> permissions)
{
    //code goes here
}
As mentioned on many websites I visited, e.g. https://dotnetcoretutorials.com/2017/01/05/accessing-httpcontext-asp-net-core/, injecting HttpContext outside the controller can be done using IHttpContextAccessor. The problem is that I don't know how to use it without passing it into the constructor. My custom attribute should be applied as a decoration, [SecondPermissionFilter(PermissionEnum.Permission, AccessLevelEnum.Create)], where only the permission settings are passed, so there is no reference to an IHttpContextAccessor.
Is this even possible? If not, is there maybe another way to do this?
EDIT: Here is the code of SecondPermissionFilter class:
public sealed class SecondPermissionFilterAttribute : Attribute
{
    private readonly PermissionEnum _requestedPermission;
    private readonly IEnumerable<AccessLevelEnum> _accessLevelCollection;
    private readonly IHttpContextAccessor _contextAccessor; //<-- how to inject?

    public SecondPermissionFilterAttribute(PermissionEnum requestedPermission, params AccessLevelEnum[] accessLevelCollection)
    {
        _requestedPermission = requestedPermission;
        _accessLevelCollection = accessLevelCollection;
    }
}
What you are after is something called Property Injection. As per the official docs this is not something that is supported out of the box by the .NET Core DI Container.
You can however use a third party library such as Ninject or Autofac - both of which are available via NuGet.
In my opinion the Ninject syntax is nicer; however, as noted in this answer and this answer, property injection itself is considered bad practice, so if possible I would try to avoid it.
Instead, you should use one of the three methods specified in the filter documentation; this answer breaks things down a bit more.
Edit
This answer deals specifically with attribute injection; the second answer there looks to achieve this without external dependencies.
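For illustration, here is a minimal sketch of the filter-based route (the SecondPermissionFilter class and its body are hypothetical; only the enums and the action come from the question). MVC filters run around controller actions, so the check moves back onto the action; the attribute data travels through TypeFilter.Arguments, and the current HttpContext is already available on the filter context, so no property injection is needed:
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.Mvc.Filters;

// Hypothetical filter type, built by MVC through DI.
public class SecondPermissionFilter : IAsyncActionFilter
{
    private readonly PermissionEnum _requestedPermission;
    private readonly AccessLevelEnum _accessLevel;

    public SecondPermissionFilter(PermissionEnum requestedPermission, AccessLevelEnum accessLevel)
    {
        _requestedPermission = requestedPermission;
        _accessLevel = accessLevel;
    }

    public async Task OnActionExecutionAsync(ActionExecutingContext context, ActionExecutionDelegate next)
    {
        var httpContext = context.HttpContext; // current HttpContext, no IHttpContextAccessor required
        // ... permission check against _requestedPermission / _accessLevel goes here ...
        await next();
    }
}

// Usage on the action: only the enum values are passed; any additional constructor
// parameters that are services (e.g. IHttpContextAccessor) would be resolved from DI.
[TypeFilter(typeof(SecondPermissionFilter),
    Arguments = new object[] { PermissionEnum.Permission, AccessLevelEnum.Create })]
[HttpPost("users/{userId}")]
public IActionResult AssignPermissionToUser([FromBody] List<PermissionToVM> permissions, int userId)
{
    //code goes here
    return Ok();
}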

What is the benefit of using a Ninject.Factory over just injecting the IKernel?

According to this article (first paragraph), it is bad practice to inject your IKernel wherever you need it.
Instead it is proposed to introduce a factory interface that is automatically implemented by Ninject (internally doing the same resolution).
This is an actual code snippet I am working on:
Former implementation:
public class CommandServer
{
    [Inject]
    public IKernel Kernel { get; set; }
    ....
    public TResponse ExecuteCommand<TRequest, TResponse>(TRequest request)
        where TResponse : ResponseBase, new()
    {
        ...
        var command = Kernel.Get<ICommand<TRequest, TResponse>>();
        ...
    }
}
Using a factory:
public class CommandServer
{
    [Inject]
    public ICommandFactory CommandFactory { get; set; }
    ....
    public TResponse ExecuteCommand<TRequest, TResponse>(TRequest request)
        where TResponse : ResponseBase, new()
    {
        ...
        var command = CommandFactory.CreateCommand<TRequest, TResponse>();
        ...
    }
}

// at binding time:
public interface ICommandFactory
{
    ICommand<TRequest, TResponse> CreateCommand<TRequest, TResponse>();
}

Bind<ICommandFactory>().ToFactory();
I am not saying I don't like it (it looks nice and clean) - just not exactly sure why the former is particularly bad and the latter is so much better?
Generally you should not be using the Service Locator pattern. Why, you ask? Please see Mark Seeman (the comments, too!) and this SO question. Using the IKernel (or, somewhat better, only the IResolutionRoot part of it) smells like Service Locator.
Now Mark would suggest that you should apply the Abstract Factory Pattern instead - and he also mentions the Dynamic proxy approach.
I personally think that using Ninject auto-generated factories (= the dynamic proxy approach) instead is worth the trade-off.
You should not use a factory like:
public interface IServiceLocator
{
    T Create<T>();
}
because, well... it's a service locator ;-)
However, using something like
public interface IResponseHandleFactory
{
    IResponseHandle Create(int responseId);
}
is perfectly fine.
Of course you can also do this by using the IResolutionRoot directly, instead of the factory. The code would look like:
resolutionRoot.Get<IResponseHandle>(
    new ConstructorArgument("responseId", theResponseIdValue));
Reasons not to use IResolutionRoot directly
A lot of the IResolutionRoot "methods" are in fact extension methods. That complicates unit testing a lot (it's basically not a sensible choice if you want to unit test it at all).
Slightly worse decoupling from the container (=> ease of changing DI containers) than when using a factory interface. The auto-generated factory feature can also be implemented as an add-on for other containers if they don't have it already (I've done so personally for Unity and AutoFac), but it requires some know-how about dynamic proxies.
Alternative to factory interfaces: using Func<> factories. The above example could also be replaced by a Func<int, IResponseHandle>. Quite a lot of DI containers support this out of the box or with standard plugins (Ninject needs the Factory extension), so you'd be decoupled from the container even more. Disadvantage: harder to unit test and no clearly named parameters.
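As a small sketch of the Func<> alternative (the ResponseProcessor class is made up; with Ninject this relies on the Factory extension, which is assumed to pass the int through to the resolution of IResponseHandle):
using System;

public class ResponseProcessor
{
    private readonly Func<int, IResponseHandle> _handleFactory;

    // The container injects the delegate; no reference to IKernel/IResolutionRoot here.
    public ResponseProcessor(Func<int, IResponseHandle> handleFactory)
    {
        _handleFactory = handleFactory;
    }

    public void Process(int responseId)
    {
        var handle = _handleFactory(responseId); // resolved through the container per call
        // ... work with the handle ...
    }
}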

Using Test Doubles with DbEntityEntry and DbPropertyEntry

I am using the new test doubles in EF6 as outlined here on MSDN (VS2013 with Moq & NUnit).
All was good until I had to do something like this:
var myFoo = context.Foos.Find(id);
and then:
myFoo.Name = "Bar";
and then :
context.Entry(myFoo).Property("Name").IsModified = true;
At this point I get an error:
Additional information: Member 'IsModified' cannot be called for property 'Name' because the entity of type 'Foo' does not exist in the context. To add an entity to the context call the Add or Attach method of DbSet.
Although, when I examine the 'Foos' in the context with an Add Watch, I can see all items I Add'ed before running the test. So they are there.
I have created the FakeDbSet (or TestDbSet) from the article. I am putting each FakeDbSet into the FakeContext in the constructor, where each one gets initialized, like this:
Foos = new FakeDbSet<Foo>();
My question is: is it possible to work with the FakeDbSet and the FakeContext in the test-doubles scenario in such a way as to have access to DbEntityEntry and DbPropertyEntry from the test double? Thanks!
I can see all items I Add'ed before running the test. So they are there.
Effectively, you've only added items to an ObservableCollection. The context.Entry method reaches much deeper than that. It requires a change tracker to be actively involved in adding, modifying and removing entities. If you want to mock this change tracker, the ObjectStateManager (ignoring the fact that it's not designed to be mocked at all), good luck! It's got over 4000 lines of code.
Frankly, I don't understand all these blogs and articles about mocking EF. The numerous differences between LINQ to Objects and LINQ to Entities alone should be enough to discourage it. These mock contexts and DbSets build an entirely new universe that's a source of bugs in itself. I've decided to do integration tests only, when and wherever EF is involved in my code. A working end-to-end test gives me a solid feeling that things are OK. A unit test (faking EF) doesn't. (Others do, don't get me wrong.)
But let's assume you'd still like to venture into mocking DbContext.Entry<T>. Too bad, impossible.
The method is not virtual
It returns a DbEntityEntry<T>, a class with an internal constructor that is a wrapper around an InternalEntityEntry, which is an internal class. And, by the way, DbEntityEntry doesn't implement an interface.
So, to answer your question
is it possible to (...) have access to DbEntityEntry and DBPropertyEntry from the test double?
No. EF's mocking hooks are only very superficial; you'll never even come close to how EF really works.
Just abstract it. If you are working against an interface, then when creating your own doubles you can put the "set modified" stuff in a separate method. My interface and implementation (generated by EF, but I altered the template) look like this:
//------------------------------------------------------------------------------
// <auto-generated>
// This code was generated from a template.
//
// Manual changes to this file may cause unexpected behavior in your application.
// Manual changes to this file will be overwritten if the code is regenerated.
// </auto-generated>
//------------------------------------------------------------------------------
namespace Model
{
    using System;
    using System.Data.Entity;
    using System.Data.Entity.Infrastructure;

    public interface IOmt
    {
        DbSet<DatabaseOmtObjectWhatever> DatabaseOmtObjectWhatever { get; set; }
        int SaveChanges();
        void SetModified(object entity);
        void SetAdded(object entity);
    }

    public partial class Omt : DbContext, IOmt
    {
        public Omt()
            : base("name=Omt")
        {
        }

        protected override void OnModelCreating(DbModelBuilder modelBuilder)
        {
            throw new UnintentionalCodeFirstException();
        }

        public virtual DbSet<DatabaseOmtObjectWhatever> DatabaseOmtObjectWhatever { get; set; }

        public void SetModified(object entity)
        {
            Entry(entity).State = EntityState.Modified;
        }

        public void SetAdded(object entity)
        {
            Entry(entity).State = EntityState.Added;
        }
    }
}
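For what it's worth, a hand-rolled double for that interface can then simply record the calls; here is a minimal sketch (the FakeOmt name and the recording lists are illustrative, and FakeDbSet is the test DbSet from the MSDN article / the question):
using System.Collections.Generic;
using System.Data.Entity;

// Illustrative test double: instead of trying to mock DbContext.Entry, the test
// asserts against the entities recorded by SetModified/SetAdded.
public class FakeOmt : IOmt
{
    public FakeOmt()
    {
        DatabaseOmtObjectWhatever = new FakeDbSet<DatabaseOmtObjectWhatever>();
        ModifiedEntities = new List<object>();
        AddedEntities = new List<object>();
    }

    public DbSet<DatabaseOmtObjectWhatever> DatabaseOmtObjectWhatever { get; set; }

    public List<object> ModifiedEntities { get; private set; }
    public List<object> AddedEntities { get; private set; }
    public int SaveChangesCallCount { get; private set; }

    public int SaveChanges()
    {
        SaveChangesCallCount++;
        return 0;
    }

    public void SetModified(object entity)
    {
        ModifiedEntities.Add(entity);
    }

    public void SetAdded(object entity)
    {
        AddedEntities.Add(entity);
    }
}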

sharp architecture contrib transaction attribute in windows service

For some reason this:
[Transaction]
public void DoSomething()
{
...
}
does not work; I still have to use the transaction explicitly like this:
public void DoSomething()
{
    using (var tx = NHibernateSession.Current.BeginTransaction())
    {
        ....
        tx.Commit();
    }
}
Any ideas why?
I am using something like this to bootstrap stuff:
_container = new WindsorContainer();
ComponentRegistrar.AddComponentsTo(_container);
...
ServiceLocator.SetLocatorProvider(() => new WindsorServiceLocator(_container));
ComponentRegistrar.AddComponentsTo(_container, typeof(NHibernateTransactionManager));
NHibernateSession.Init(new ThreadSessionStorage(),
    new[] { "Bla.Domain.dll" },
    new AutoPersistenceModelGenerator().Generate(),
    "NHibernate.config");
As Doan said, the component that has the method is not proxied.
Since the method is not virtual, I am assuming that your class implements an interface. Make sure that the dependency in the class calling DoSomething is defined as the interface and not the implementing class.
If you debug the code and check the runtime type of the object, it should be a Castle proxy.
For more details, check the troubleshooting section of the Sharp Architecture Contrib wiki:
https://github.com/sharparchitecture/Sharp-Architecture-Contrib/wiki/Troubleshooting
Normally, this kind of problem is caused by a failure to invoke the dynamic proxy that provides the transaction management service. Two of the most common errors are:
The method cannot be proxied: most likely it does not implement any interface method, or the object was not proxied.
The method was called from within the same class, which bypasses all proxies.
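To illustrate the second case (the class names here are hypothetical): only calls that go through the container-provided reference are intercepted, so a call made from inside the same class bypasses the proxy and no transaction is started:
public interface IOrderService
{
    void DoSomething();
    void DoSomethingElse();
}

public class OrderService : IOrderService
{
    [Transaction]
    public void DoSomething()
    {
        // transactional work
    }

    public void DoSomethingElse()
    {
        // This call goes straight to the implementation, not through the proxy,
        // so the [Transaction] interceptor never runs.
        DoSomething();
    }
}

public class OrderProcessor
{
    private readonly IOrderService _orderService;

    public OrderProcessor(IOrderService orderService)
    {
        _orderService = orderService; // resolved by Windsor as a Castle proxy
    }

    public void Run()
    {
        _orderService.DoSomething(); // intercepted: the transaction is opened here
    }
}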
Edit:
I guess you use Castle Windsor as your IoC container. The [Transaction] decoration requires the Automatic Transaction Management Facility in order to work. If you successfully configured the facility, i.e. you made [Transaction] work in one method but not another, then the answer above applies. If no [Transaction] decoration works at all, then you have to review the configuration of the facility first.

How to correctly dispose objects registered with Autofac

I've implemented the Unit of Work/Repository pattern, as described here, but I'm also using Autofac and constructor injection, so I registered the UnitOfWork and DbContext (PsyProfContext) classes like this:
builder.Register(context => new PsyProfContext()).InstancePerHttpRequest();
builder.RegisterType<UnitOfWork>().As<IUnitOfWork>().InstancePerHttpRequest();
And everything works great!
Except for one thing: I'm also using the Enterprise Library Logging Block, and I have implemented a CustomTraceListener which uses Entity Framework to write log entries into the database.
My controller looks like this (it is empty because at the moment I just tried to verify that all the pieces (IoC, logging, Entity Framework) are working):
public class HomeController : Controller
{
    private readonly UnitOfWork unitOfWork;

    public HomeController(IUnitOfWork unitOfWork)
    {
        this.unitOfWork = (UnitOfWork) unitOfWork;
    }

    //
    // GET: /Home/
    public ActionResult Index()
    {
        throw new HttpException();
        return View();
    }

    protected override void Dispose(bool disposing)
    {
        unitOfWork.Dispose();
        base.Dispose(disposing);
    }
}
And in the Write method of the CustomTraceListener class, I've tried to resolve the UnitOfWork:
var unitOfWork = DependencyResolver.Current.GetService<IUnitOfWork>() as UnitOfWork;
But I get an instance which is already disposed! So I put some breakpoints and found out that the Dispose method of the controller is called before the Write method of the CustomTraceListener class, so in the end I didn't find any other solution than using the DbContext (PsyProfContext) directly:
public override void Write(object o)
{
    using (var context = new PsyProfContext())
    {
        var customLogEntry = o as CustomLogEntry;
        if (customLogEntry != null)
        {
            var logEntry = new LogEntry
            {
                //a bunch of properties
            };
            context.Exceptions.Add(logEntry);
            context.SaveChanges();
        }
    }
}
But I don't like this solution! What's the point of using the UnitOfWork and Repository pattern if you access the DbContext directly? And what's the point of using DI in a project if you create a registered object manually in some cases?
So I wanted to hear your opinion on how to deal with this kind of situation. Is my current implementation fine, or is it definitely wrong and I should think about another one?
Any help will be greatly appreciated and any ideas are welcome!
It looks like you may have a couple of problems.
First, if you're manually disposing the unit of work object in your controller, your controller should take an Owned<IUnitOfWork> in the constructor. When the request lifetime is disposed it will automatically dispose of any IDisposable components - including the controller and any resolved dependencies - unless you specify somehow that you're going to take over ownership of the lifetime. You can do that by using Owned<T>.
public class HomeController : Controller
{
    Owned<IUnitOfWork> _uow;

    public HomeController(Owned<IUnitOfWork> uow)
    {
        this._uow = uow;
    }

    protected override void Dispose(bool disposing)
    {
        if (disposing)
        {
            this._uow.Dispose();
        }
        base.Dispose(disposing);
    }
}
(Note a minor logic fix in the Dispose override there - you need to check the value of disposing so you don't double-dispose your unit of work.)
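(Side note: Owned<T> comes from the Autofac.Features.OwnedInstances namespace, so the controller above needs a using Autofac.Features.OwnedInstances; directive.)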
Alternatively, you could register your units of work as ExternallyOwned, like
builder
    .RegisterType<UnitOfWork>()
    .As<IUnitOfWork>()
    .ExternallyOwned()
    .InstancePerHttpRequest();
ExternallyOwned also tells Autofac that you'll take control of disposal. In that case, your controller will look like it does already. (Generally I like to just let Autofac do the work, though, and not take ownership if I can avoid it.)
In fact, looking at the way things are set up, you might be able to avoid the disposal problem altogether if you let Autofac do the disposal for you - the call to DependencyResolver would return the unit of work that isn't disposed yet and it'd be OK.
If that doesn't fix it... you may want to add some detail to your question. I see where your controller is using the unit of work class, but I don't see where it logs anything, nor do I see anything in the listener implementation that's using the unit of work.
(Also, as noted in the first comment on your question, in the constructor of your controller you shouldn't be casting your service from IUnitOfWork to UnitOfWork - that's breaking the abstraction that the interface was offering in the first place.)