Risks of holding an Entity Framework dynamic proxy object in session? - asp.net-mvc-4

So I have a fairly comprehensive activity-based access control system I built for a web app under MVC 4 using Entity Framework. Well, to be precise the access control doesn't care if it's using EF or not, but the app is.
Anyway, I'm loading the user's permissions on each request right now. I get a reference to my DbContext injected from the IoC container into my ApplicationController, which overrides OnAuthorization to stuff the user's profile into HttpContext.Current.Items. It seems to work fairly well, but I can't help but wonder if it's the best way.
My thought was that since a user's permissions don't change often, if ever, the better way would be to load the profile of permissions into the Session instead, and then not change it at all until the user logs out and logs back in (pretty common in desktop OSes anyway). But I'm concerned that if I fetch using the DbContext, the object I get back is a dynamic proxy which holds a reference to the DbContext, and I certainly don't want to keep that around for the whole session.
Thoughts? Is this a good approach, and if so how do I ensure that my DbContext doesn't linger beyond when I really need it?
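For context, a minimal sketch of the per-request approach described above might look like the following; the context type, permission entity, and Items key are illustrative assumptions, not the asker's actual code.
public abstract class ApplicationController : Controller
{
    private readonly MyDbContext _db; // injected by the IoC container

    protected ApplicationController(MyDbContext db)
    {
        _db = db;
    }

    protected override void OnAuthorization(AuthorizationContext filterContext)
    {
        base.OnAuthorization(filterContext);

        // Load the user's permission profile once per request and stash it
        // where the rest of the request pipeline can reach it.
        var profile = _db.Set<UserPermission>()
            .SingleOrDefault(x => x.UserName == filterContext.HttpContext.User.Identity.Name);
        System.Web.HttpContext.Current.Items["UserPermissions"] = profile;
    }
}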

Invoke .AsNoTracking() on the Set<UserPermission> before you query out. Entities will still be proxied, but will be detached from the DbContext.
var userPermission = dbContext.Set<UserPermission>().AsNoTracking()
    .SingleOrDefault(x => x.UserName == User.Identity.Name);
Thoughts? Is this a good approach?
Putting a dynamically proxied entity in session will break as soon as you load balance your code across more than one web server. Why? Because of the dynamic proxy class. Server A understands the type DynamicProxies.UserPermission_Guid, because it queried out the entity. However, Servers B through N do not, and therefore cannot deserialize it from the Session. The other servers will dynamically proxy the entity with a different GUID.
That said, you could map your data into a POCO DTO and put that in session instead. In that case you no longer need to worry about the entity being attached to the context when you first query it out; AsNoTracking only makes the query perform a bit faster.
// you can still call .AsNoTracking for performance reasons
var userPermissionEntity = dbContext.Set<UserPermission>().AsNoTracking()
    .SingleOrDefault(x => x.UserName == User.Identity.Name);

// this can safely be put into session and restored by any server with a
// reference to the DLL where the DTO class is defined.
var userPermissionSession = new UserPermissionInSession
{
    UserName = userPermissionEntity.UserName,
    // etc.
};
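To round out that example, a hedged sketch of the DTO itself and of the session write; the members beyond UserName and the session key are assumptions:
// Plain serializable POCO - no dynamic proxy, no DbContext reference - so any
// server with a reference to the assembly defining it can deserialize it.
[Serializable]
public class UserPermissionInSession
{
    public string UserName { get; set; }
    public List<string> Permissions { get; set; } // whatever your profile needs
}

// back in the controller/action, after mapping:
Session["UserPermissions"] = userPermissionSession;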

Thoughts? Is this a good approach?
Another problem with this approach arises when you use the common pattern of creating one DbContext per HTTP request. That pattern typically disposes the DbContext when the request ends:
protected virtual void Application_EndRequest(object sender, EventArgs e)
But what happens when we try to access a navigation property on a proxy entity whose DbContext has already been disposed?
We get an ObjectDisposedException.
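A minimal sketch of how that failure typically shows up; the context type and the Roles navigation property are illustrative:
UserPermission permission;
using (var dbContext = new MyDbContext())
{
    // the dynamic proxy created here keeps a reference to this context
    permission = dbContext.Set<UserPermission>()
        .Single(x => x.UserName == "alice");
} // context disposed, e.g. at Application_EndRequest

// later, touching a lazily loaded navigation property forces the proxy
// to go back to its (now disposed) context:
var roles = permission.Roles; // throws ObjectDisposedException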

Related

What is the best practice when adding data in one-to-many relationship?

I am developing a website for a beauty salon. There is an admin part of the website, where the esthetician can add a new care. A care is linked to a care category (all cares related to hands, feet, massages, ...). To solve this I wrote this code into the CareRepository in the .NET API:
public async Task<Care?> AddAsync(Care care)
{
    // var dbCareCategory = await this._careCategoryRepository.GetByNameAsync(care.CareCategoryName);
    if (string.IsNullOrEmpty(care.CareCategoryName) || string.IsNullOrWhiteSpace(care.CareCategoryName))
        return null;

    var dbCareCategory = await this._instituteDbContext.CareCategories
        .Where(careCategory => Equals(careCategory.Name, care.CareCategoryName))
        .Include(careCategory => careCategory.Cares)
        .FirstOrDefaultAsync();

    if (dbCareCategory == null || dbCareCategory.Cares.Contains(care))
        return null; // TODO : improve handling

    dbCareCategory.Cares.Add(care);
    await this._instituteDbContext.SaveChangesAsync();
    return care;
}
My problem here is that I am struggling a bit with the best practice to follow, because in order to add a care to a category, I have to get the category first. At first, I called the CareCategoryRepository to get the category (commented line). But then EF was not tracking the change, so when I tried to add the care, it was not registered in the database. Once I query the category from the CareRepository instead, EF tracks the change and saves the Care in the database.
I assume this is because when calling from another repository, it is a different db context that tracks the changes of my entity. Please correct me if my assumption is wrong.
So, I am wondering, what is the best practice in this case?
1. Keep doing what I am doing here; there is no issue with calling the category entities from the care repository.
2. Change my solution and put the AddCare method into the CareCategoryRepository, because it makes more sense to call the category entities from the CareCategoryRepository.
3. Something else?
This solution works fine, however I feel like it may not be the best way to solve this.
The issue with passing entities around in web applications is that when your controller passes an entity to serve as a model for the view, this goes to the view engine on the server to consume and build the HTML for the view, but what comes back to the controller when a form is posted or an AJAX call is made is not an entity. It is a block of data cast to look like an entity.
A better way to think of it is that your Add method accepts a view model called AddCareViewModel which contains all of the fields and FKs needed to create a new Care entity. Think about the process you would use in that case. You would want to validate that the required fields are present, and that the FKs (CareCategory etc.) are valid, then construct a Care entity with that data. Accepting an "entity" from the client side of the request is trusting the client browser to construct a valid entity without any way to validate it. Never trust anything from the client.
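A minimal sketch of such a view model, assuming data annotations are used for validation; the property names other than the category FK are illustrative:
public class AddCareViewModel
{
    [Required]
    public string Name { get; set; }          // required field for a new Care

    [Required]
    public int CareCategoryId { get; set; }   // FK validated server-side

    public string Description { get; set; }   // optional data
}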
Personally I use the repository pattern to serve as a factory for entities, though this could also be a separate class. I use the Repository since it already has access to everything needed. Factory methods offer a standard way of composing a new entity and ensuring that all required data is provided:
var care = _careRepository.Create(addCareViewModel.CareCategoryId,
/* all other required fields */);
care.OptionalField = addCareViewModel.OptionalField; // copy across optional data.
_context.SaveChanges(); // or ideally committed via a Unit of Work wrapper.
So for instance if a new Care requires a name, a category Id, and several other required fields, the Create method accepts those required fields and validates that they are provided. When it comes to FKs, the repository can load a reference to set the navigation property. (Also ensuring that a valid ID was given at the same time)
I don't recommend using a Generic Repository pattern with EF (i.e. Repository<TEntity>()). Arguably the only reason to use a Repository pattern at all with EF would be to facilitate unit testing. Your code will be a lot simpler to understand/maintain, and almost certainly perform faster, without a Repository. The DbContext already serves all of those needs and acts as a Unit of Work. When using a Repository to serve as a point of abstraction for enabling unit testing, instead of Generic or per-entity repositories, I would suggest defining repositories like you would a Controller, with effectively a one-to-one responsibility.
If I have a CareController then I'd have a CareRepository that served all needs of the CareController. (Not just Care entities) The alternative is that the CareController would need several repository references, and each Repository would now potentially serve several controllers meaning it would have several reasons to change. Scoping a repository to serve the needs of a controller gives it only one reason to change. Sure, several repositories would potentially have methods to retrieve the same entity, but only one repository/controller should be typically responsible for creating/updating entities. (Plus repositories can always reference one another if you really want to see something as simple as Read methods implemented only once)
If using multiple repositories, the next thing to check is to ensure that the DbContext instance used is always scoped to the Web Request, and not something like Transient.
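For example, with the built-in ASP.NET Core container, AddDbContext registers the context with a Scoped (per-request) lifetime by default, which is exactly what you want here; the context and connection-string names below are illustrative:
// Startup.ConfigureServices / Program.cs
services.AddDbContext<InstituteDbContext>(options =>
    options.UseSqlServer(configuration.GetConnectionString("Institute")));
// Scoped is the default lifetime, so every repository resolved during the
// same web request receives the same InstituteDbContext instance.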
For instance if I have a CareRepository with a Create method that calls a CareCategoryRepository to get a CareCategory reference:
public Care Create(string name, int careCategoryId)
{
    if (string.IsNullOrEmpty(name)) throw new ArgumentNullException(nameof(name));

    var care = new Care { Name = name };
    var careCategory = _careCategoryRepository.GetById(careCategoryId);
    care.CareCategory = careCategory;
    _context.Cares.Add(care);
    return care;
}
We would want to ensure that the DbContext reference (_context) in all of our repositories - and in our controller/Unit of Work, if the controller is going to signal the commit with SaveChanges - points at the exact same instance. This applies whether repositories call each other or a controller fetches data from multiple repositories.

Autofac Multitenant Database Configuration

I have a base abstract context which has a couple hundred shared objects, and then 2 "implementation" contexts which both inherit from the base and are designed to be used by different tenants in a .net core application. A tenant object is injected into the constructor for OnConfiguring to pick up which connection string to use.
public abstract class BaseContext : DbContext
{
    protected readonly AppTenant Tenant;

    protected BaseContext(AppTenant tenant)
    {
        Tenant = tenant;
    }
}

public class TenantOneContext : BaseContext
{
    public TenantOneContext(AppTenant tenant)
        : base(tenant)
    {
    }
}
In startup.cs, I register the DbContexts like this:
services.AddDbContext<TenantOneContext>();
services.AddDbContext<TenantTwoContext>();
Then, using the Autofac container and the Multitenant package, I register tenant-specific contexts like this:
IContainer container = builder.Build();
MultitenantContainer mtc = new MultitenantContainer(container.Resolve<ITenantIdentificationStrategy>(), container);

mtc.ConfigureTenant("1", config =>
{
    config.RegisterType<TenantOneContext>().AsSelf().As<BaseContext>();
});
mtc.ConfigureTenant("2", config =>
{
    config.RegisterType<TenantTwoContext>().AsSelf().As<BaseContext>();
});

Startup.ApplicationContainer = mtc;
return new AutofacServiceProvider(mtc);
My service layers are designed around the BaseContext being injected for reuse where possible, and then services which require specific functionality use the TenantContexts.
public class BusinessService
{
    private readonly BaseContext _baseContext;

    public BusinessService(BaseContext context)
    {
        _baseContext = context;
    }
}
In the above service at runtime, I get an exception "No constructors on type 'BaseContext' can be found with the constructor finder 'Autofac.Core.Activators.Reflection.DefaultConstructorFinder'". I'm not sure why this is broken....the AppTenant is definitely created as I can inject it other places successfully. I can make it work if I add an extra registration:
builder.RegisterType<TenantOneContext>().AsSelf().As<BaseContext>();
I don't understand why the above registration is required for the tenant container registrations to work. This seems broken to me; in StructureMap (SaasKit) I was able to do this without adding an extra registration, and I assumed the built-in AddDbContext registrations would take care of creating a default registration for the containers to overwrite. Am I missing something here, or is this possibly a bug in the multitenant functionality of Autofac?
UPDATE:
Here is a fully runnable repo of the question: https://github.com/danjohnso/testapp
Why is line 66 of Startup.cs needed if I have lines 53/54 and lines 82-90?
As I expected, your problem has nothing to do with multitenancy as such. You've implemented it almost entirely correctly, and you're right: you do not need that additional registration. By the way, you don't need these two (below) either, because you register them in the tenant scopes a bit later:
services.AddDbContext<TenantOneContext>();
services.AddDbContext<TenantTwoContext>();
So, you've made only one very small but very important mistake in the TenantIdentificationStrategy implementation. Let's walk through how you create the container - this is mainly for other people who may run into this problem as well. I'll mention only the relevant parts.
First, TenantIdentificationStrategy gets registered in the container along with other stuff. Since there's no explicit specification of lifetime scope, it is registered as InstancePerDependency() by default - but that does not really matter, as you'll see. Next, the "standard" IContainer gets created by Autofac's builder.Build(). The next step in this process is to create the MultitenantContainer, which takes an instance of ITenantIdentificationStrategy. This means that the MultitenantContainer and its captive dependency - the ITenantIdentificationStrategy - will be singletons regardless of how ITenantIdentificationStrategy is registered in the container. In your case it gets resolved from that standard "root" container in order to manage its dependencies - well, this is what Autofac is for anyway. Everything is fine with this approach in general, but this is where your problem actually begins. When Autofac resolves this instance it does exactly what it is expected to do - it injects all the dependencies into TenantIdentificationStrategy's constructor, including IHttpContextAccessor. So, right there in the constructor you grab an instance of HttpContext from that context accessor and store it for use in the tenant resolution process - and this is a fatal mistake: there's no HTTP request at this time, and since TenantIdentificationStrategy is a singleton, there never will be one for it! It therefore holds a null request context for the whole application lifespan. This effectively means that TenantIdentificationStrategy will not be able to resolve a tenant identifier based on HTTP requests - because it never actually sees them. Consequently, MultitenantContainer will not be able to resolve any tenant-specific services.
Now that the problem is clear, its solution is obvious and trivial - just move the fetching of the request context (context = _httpContextAccessor.HttpContext) into the TryIdentifyTenant() method. It gets called in the proper context and will be able to access the request context and analyze it.
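A minimal sketch of a corrected strategy, assuming tenants are identified by a query-string parameter; the class name and the "tenant" parameter are illustrative, not taken from the linked repo:
public class QueryStringTenantIdentificationStrategy : ITenantIdentificationStrategy
{
    private readonly IHttpContextAccessor _httpContextAccessor;

    public QueryStringTenantIdentificationStrategy(IHttpContextAccessor httpContextAccessor)
    {
        // Store only the accessor; do NOT capture HttpContext here, because
        // this strategy is effectively a singleton and no request exists yet.
        _httpContextAccessor = httpContextAccessor;
    }

    public bool TryIdentifyTenant(out object tenantId)
    {
        tenantId = null;
        var context = _httpContextAccessor.HttpContext; // fetched per call
        if (context == null)
            return false;

        tenantId = context.Request.Query["tenant"].ToString();
        return !string.IsNullOrEmpty((string)tenantId);
    }
}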
PS. This digging has been highly educational for me since I had absolutely no idea about autofac's multi-tenant concept, so thank you very much for such an interesting question! :)
PPS. And one more thing: this question is a perfect example of how important a well-prepared example is. You provided a very good one. Without it, no one would have been able to figure out what the problem was, since the most important part of it was not presented in the question - and sometimes you just don't know where that part actually is...

Entity Framework and WCF (Returning Entities Attached To Context)

I have a WCF service that calls the following method in one of my Repository objects to create a new sale object in the database:
public static Sale New(Sale sale)
{
    using (var ctx = new DBcontext())
    {
        ctx.Sales.AddObject(sale);
        ctx.SaveChanges();
        return sale;
    }
}
The WCF method calling this looks like this:
public Sale SaleNew(Sale sale)
{
    return SaleRepository.New(sale);
}
When I call this from a client application I get the following error
"The underlying connection was closed: The connection was closed unexpectedly."
If I step through, all the code seems to run fine and the record gets inserted into the database. If I add the following line to my repository method after the SaveChanges, it works fine:
ctx.Detach(sale);
Is the exception happening because I'm disposing the context as soon as the method returns? Is using the entity context in this way bad practice, i.e. disposing of it straight away? I'm only doing this because it's SOA and pretty much stateless, so all my repository methods create the context, return the value, and dispose the context. Anything that is passed in will either get added to the context or re-attached.
As advised, I turned on tracing in WCF and watched what was happening. There was a proxy exception occurring. In this case, as I'm using my own POCO objects, I don't really want proxy objects, so I set the ContextOptions.ProxyCreationEnabled property in my DatabaseContext to false and it now works fine.
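A minimal sketch of that change, assuming an EF4-style ObjectContext (for a DbContext-based model the equivalent flag is this.Configuration.ProxyCreationEnabled = false); the constructor and connection-string name are illustrative:
public class DBcontext : ObjectContext
{
    public DBcontext() : base("name=DBcontext")
    {
        // Return plain POCOs instead of dynamic proxies so the WCF
        // serializer never sees a proxy type it cannot handle.
        ContextOptions.ProxyCreationEnabled = false;
    }

    public ObjectSet<Sale> Sales
    {
        get { return CreateObjectSet<Sale>(); }
    }
}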
1) Is using the entity context in this way bad practice, i.e. disposing of it straight away?
No, that's how I do it - and I believe it is the proper way of doing it. But creating context can be expensive and with EF we are stuck with no ideal way of reusing the context.
2) ctx.Detach(sale);
This, as far as I know, should not be required for what you are doing, although I have had loads of issues with attaching and detaching when I reuse the same entities. It should only be needed if you need to re-attach to the context. Are you using lazy loading?

Managing NHibernate ISession with Autofac

Does anyone have any tips or best practices regarding how Autofac can help manage the NHibernate ISession Instance (in the case of an ASP.NET MVC application)?
I'm not overly familiar with how NHibernate sessions should be handled. That said, Autofac has excellent instance lifetime handling (scoping and deterministic disposal). Some related resources are this article and this question. Since you're in ASP.NET MVC land, make sure you also look into the MVC integration stuff.
To illustrate the point, here's a quick sample on how you can use Autofac factory delegates and the Owned generic to get full control over instance lifetime:
public class SomeController
{
    private readonly Func<Owned<ISession>> _sessionFactory;

    public SomeController(Func<Owned<ISession>> sessionFactory)
    {
        _sessionFactory = sessionFactory;
    }

    public void DoSomeWork()
    {
        using (var session = _sessionFactory())
        {
            var transaction = session.Value.BeginTransaction();
            ....
        }
    }
}
The container setup to get this to work is quite simple. Notice that we don't have to do anything special to get the Func<> and Owned<> types; these are made available automatically by Autofac:
builder.Register(c => cfg.BuildSessionFactory())
    .As<ISessionFactory>()
    .SingleInstance();

builder.Register(c => c.Resolve<ISessionFactory>().OpenSession());
Update: my reasoning here is that, according to this NHibernate tutorial, the lifetime of the session instance should be that of the "unit of work". Thus we need some way of controlling both when the session instance is created and when the session is disposed.
With Autofac we get this control by requesting a Func<> instead of the type directly. Not using Func<> would require that the session instance be created upfront before the controller instance is created.
Next, the default in Autofac is that instances have the lifetime of their container. Since we know that we need the power to dispose this instance as soon as the unit of work is done, we request an Owned instance. Disposing the owned instance will in this case immediately dispose the underlying session.
Edit: Sounds like Autofac and probably other containers can scope the lifetime correctly. If that's the case, go for it.
It isn't a good idea to use your IoC container to manage sessions directly. The lifetime of your session should correspond to your unit of work (transaction boundary). In the case of a web application, that should almost certainly be the lifetime of a web request.
The most common way to achieve this is with an HttpModule that both creates your session and starts your transaction when a request begins, then commits when the request has finished. I would have the HttpModule register the session in the HttpContext.Items collection.
In your IoC container, you could register something like HttpContextSessionLocator against ISessionLocator.
I should mention that your generic error handling should locate the current session and roll back the transaction automatically, or you could end up committing half a unit of work.
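ISessionLocator and HttpContextSessionLocator are not types that ship with NHibernate, so the following is only a rough sketch of the idea under those assumptions (the session key, commit policy, and configuration source are all illustrative):
public interface ISessionLocator
{
    ISession GetCurrentSession();
}

public class NHibernateSessionModule : IHttpModule
{
    // Assumes hibernate.cfg.xml is present; the factory is built once per AppDomain.
    private static readonly ISessionFactory SessionFactory =
        new NHibernate.Cfg.Configuration().Configure().BuildSessionFactory();

    public void Init(HttpApplication context)
    {
        context.BeginRequest += (s, e) =>
        {
            var session = SessionFactory.OpenSession();
            session.BeginTransaction();
            HttpContext.Current.Items["nh.session"] = session;
        };

        context.EndRequest += (s, e) =>
        {
            var session = (ISession)HttpContext.Current.Items["nh.session"];
            if (session == null) return;
            try { session.Transaction.Commit(); } // error handling should roll back instead
            finally { session.Dispose(); }
        };
    }

    public void Dispose() { }
}

public class HttpContextSessionLocator : ISessionLocator
{
    public ISession GetCurrentSession()
    {
        return (ISession)HttpContext.Current.Items["nh.session"];
    }
}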

NHibernate sessions - what's the common way to handle sessions in windows applications?

I've just started using NHibernate, and I have some issues that I'm unsure how to solve correctly.
I started out creating a generic repository containing CUD and a couple of search methods. Each of these methods opens a separate session (and transaction if necessary) during the DB operation(s). The problem when doing this (as far as I can tell) is that I can't take advantage of lazy loading of related collections/objects.
As almost every entity relation has .Not.LazyLoad() in the fluent mapping, it results in the entire database being loaded when I request a list of all entities of a given type.
Correct me if I'm wrong, 'cause I'm still a complete newbie when it comes to NHibernate :)
What is most common to do to avoid this? Have one global static session that remains alive as long as the program runs, or what should I do?
Some of the repository code:
public T GetById(int id)
{
    using (var session = NHibernateHelper.OpenSession())
    {
        return session.Get<T>(id);
    }
}
Using the repository to get a Person:
var person = m_PersonRepository.GetById(1); // works fine
var contactInfo = person.ContactInfo; // Throws exception with message:
// failed to lazily initialize a collection, no session or session was closed
Your question actually boils down to object caching and reuse. If you load a Foo object from one session, can you keep hold of it and, at a later point in time, lazily load its Bar property?
Each ISession instance is designed to represent a unit of work, and comes with a first level cache that will allow you to retrieve an object multiple times within that unit of work yet only have a single database hit. It is not thread-safe, and should definitely not be used as a static object in a WinForms application.
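A small illustration of what the first-level cache means in practice; Person and the id are illustrative:
using (var session = sessionFactory.OpenSession())
{
    var first = session.Get<Person>(1);   // hits the database
    var second = session.Get<Person>(1);  // served from the session's first-level cache
    var same = ReferenceEquals(first, second); // true within one session
}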
If you want to use an object when the session under which it was loaded has been disposed, then you need to associate it with a new session using Session.SaveOrUpdate(object) or Session.Update(object).
You can find all of this explained in chapter 10 of the Hibernate documentation.
If this seems inefficient, then look into second-level caching. This is provided at ISessionFactory level - your session factory can be static, and if you enable second-level caching this will effectively build an in-memory cache of much of your data. Second-level caching is only appropriate if there is no underlying service updating your data - if all database updates go via NHibernate, then it is safe.
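For reference, a hedged sketch of enabling the second-level cache; the provider choice and cache strategy are assumptions (NHibernate ships several providers, and the in-memory HashtableCacheProvider below is meant for testing rather than production):
var cfg = new NHibernate.Cfg.Configuration().Configure();
cfg.SetProperty(NHibernate.Cfg.Environment.UseSecondLevelCache, "true");
cfg.SetProperty(NHibernate.Cfg.Environment.CacheProvider,
    typeof(NHibernate.Cache.HashtableCacheProvider).AssemblyQualifiedName);
// Each cached class/collection also needs a cache usage, e.g. with Fluent
// NHibernate inside the ClassMap:  Cache.ReadWrite();
var sessionFactory = cfg.BuildSessionFactory(); // keep this factory static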
Edit in light of code posted
Your session usage is at the wrong level - you are using it for a single database get, rather than a unit of work. In this case, your GetById method should take in a session which it uses, and the session instance should be managed at a higher level. Alternatively, your PersonRepository class should manage the session if you prefer, and you should instantiate and dispose an object of this type for each unit of work.
public T GetById(int id)
{
    return m_session.Get<T>(id);
}

using (var repository = new PersonRepository())
{
    var person = repository.GetById(1);
    var contactInfo = person.ContactInfo;
} // make sure the repository Dispose method disposes the session.
The error message you are getting is because there is no longer a session to use to lazy load the collection - you've already disposed it.