NHibernate not persisting changes to my object

My ASP.NET MVC 4 project is using NHibernate (behind repositories) and Castle Windsor, using the AutoTx and NHibernate facilities. I've followed the guide written by haf and I can create and read objects.
My PersistenceInstaller looks like this
public class PersistenceInstaller : IWindsorInstaller
{
public void Install(Castle.Windsor.IWindsorContainer container, Castle.MicroKernel.SubSystems.Configuration.IConfigurationStore store)
{
container.AddFacility<AutoTxFacility>();
container.Register(Component.For<INHibernateInstaller>().ImplementedBy<NHibernateInstaller>().LifeStyle.Singleton);
container.AddFacility<NHibernateFacility>(
f => f.DefaultLifeStyle = DefaultSessionLifeStyleOption.SessionPerWebRequest);
}
}
The NHibernateInstaller is straight from the NHib Facility Quickstart.
I am using ISessionManager in my base repository...
protected ISession Session
{
get
{
return _sessionManager.OpenSession();
}
}
public virtual T Commit(T entity)
{
Session.SaveOrUpdate(entity);
return entity;
}
Finally, my application code which is causing the problem:
[HttpPost]
[ValidateAntiForgeryToken]
[Transaction]
public ActionResult Maintain(PrescriberMaintainViewModel viewModel)
{
if (ModelState.IsValid)
{
var prescriber = UserRepository.GetPrescriber(User.Identity.Name);
//var prescriber = new Prescriber { DateJoined = DateTime.Today, Username = "Test" };
prescriber.SecurityQuestion = viewModel.SecurityQuestion;
prescriber.SecurityAnswer = viewModel.SecurityAnswer;
prescriber.EmailAddress = viewModel.Email;
prescriber.FirstName = viewModel.FirstName;
prescriber.LastName = viewModel.LastName;
prescriber.Address = new Address
{
Address1 = viewModel.AddressLine1,
Address2 = viewModel.AddressLine2,
Address3 = viewModel.AddressLine3,
Suburb = viewModel.Suburb,
State = viewModel.State,
Postcode = viewModel.Postcode,
Country = string.Empty
};
prescriber.MobileNumber = viewModel.MobileNumber;
prescriber.PhoneNumber = viewModel.PhoneNumber;
prescriber.DateOfBirth = viewModel.DateOfBirth;
prescriber.AHPRANumber = viewModel.AhpraNumber;
prescriber.ClinicName = viewModel.ClinicName;
prescriber.ClinicWebUrl = viewModel.ClinicWebUrl;
prescriber.Qualifications = viewModel.Qualifications;
prescriber.JobTitle = viewModel.JobTitle;
UserRepository.Commit(prescriber);
}
return View(viewModel);
}
The above code will save a new prescriber (tested by uncommenting the commented-out line).
I am using NHProf and have confirmed that no SQL is sent to the database for the update. I can see the read being performed, but that's it.
It seems to me that NHibernate doesn't recognise the entity as being changed and therefore does not generate the SQL. Or possibly the transaction isn't being committed?
I've been scouring the web for a few hours now trying to work this one out, and as a last act of desperation have posted on SO. Any ideas? :)
Oh and in NHProf I see three sessions (one for the GetPrescriber call from the repo, one I assume for the update (with no SQL), and one for some action in my action filter on the base class). I also get an alert about the use of implicit transactions. This confuses me because I thought I was doing everything I needed to get a transaction - using AutoTx and the Transaction attribute. I also expected there to be only one session per web request, as per my Windsor config.
UPDATE: After spending the day reading through the source for the NHibernate and AutoTx facilities, it seems that AutoTx is not setting the interceptors on my implementation of INHibernateInstaller. This appears to mean that whenever SessionManager calls OpenSession it calls the default overload with no parameters, rather than the one that accepts an interceptor. Internally, AutoTxFacility registers TransactionInterceptor with Windsor so that Windsor can add the interceptor to my INHibernateInstaller concrete class via AutoTx's TransactionalComponentInspector.
AutoTxFacility source on github

To me it looks like you are creating a session for every call to the repository. A session should span the whole business operation: it should be opened at the beginning and committed and disposed at the end.
There are other strange things in this code.
Commit is a completely different concept from SaveOrUpdate.
And you don't need to tell NH to store changes anyway. You don't need to call session.Save for objects that are already in the session; their changes are flushed automatically. You only need to call session.Save when you add new objects.
Make sure that you use a transaction for the whole business operation.
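For illustration, a minimal sketch of what that looks like with a plain, explicitly managed NHibernate unit of work (the sessionFactory variable and the prescriberId lookup are assumptions for the example, not code from the question):
using (var session = sessionFactory.OpenSession())
using (var tx = session.BeginTransaction())
{
    // Load and modify the entity inside the same session and transaction.
    var prescriber = session.Get<Prescriber>(prescriberId);
    prescriber.EmailAddress = viewModel.Email;
    // ...other property assignments...

    // Committing flushes the session; the dirty entity produces the UPDATE automatically.
    tx.Commit();
}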

There is one most likely unintended part in the code snippet above, proven by the observation made with NHProf:
Oh and in NHProf I see three Sessions (1 for the GetPrescriber call
from the repo, one I assume for the update (with no sql) - and one for
some action in my actionfilter on the base class).
Calling OpenSession() triggers the creation of a new session instance.
protected ISession Session
{
get { return _sessionManager.OpenSession(); }
}
So, whenever the code accesses the Session property, a new session instance is created behind the scenes (again and again): one session for the get, one for the update, one for the filter...
As we can see here, the session returned by SessionManager.OpenSession() must be used for the whole scope (Unit of work, web request...)
http://docs.castleproject.org/Windsor.NHibernate-Facility.ashx
What we need is to create one session (when first accessed) and reuse it until the end of the scope (then later correctly close it, commit or roll back the transaction...). Anyhow, the first thing to do right now is to change the Session property this way:
ISession _session;
protected ISession Session
{
get
{
if (_session == null)
{
_session = _sessionManager.OpenSession();
}
return _session;
}
}

After spending a full day yesterday searching through the AutoTx and NHibernate facilities on GitHub and getting nowhere, I started a clean project in an attempt to replicate the problem. Unfortunately for the replication, everything worked! I ran Update-Package on my source, which brought down a new version of Castle.Transactions, and everything started working correctly. I did make one small adjustment to my own code: removing the UserRepository.Commit line.
I did not need to modify how I opened sessions; that was taken care of by the SessionManager instance. With the update to Castle.Transactions, the Transaction attribute is being recognised and a transaction is being created (as evidenced by the implicit-transaction alert no longer appearing in NHProf).

Related

Blazor concurrency problem using Entity Framework Core

My goal
I want to create a new IdentityUser and show all the users already created through the same Blazor page. This page has:
a form through you will create an IdentityUser
a third-party's grid component (DevExpress Blazor DxDataGrid) that shows all users using UserManager.Users property. This component accepts an IQueryable as a data source.
Problem
When I create a new user through the form (1) I will get the following concurrency error:
InvalidOperationException: A second operation started on this context before a previous operation completed. Any instance members are not guaranteed to be thread-safe.
I think the problem is related to the fact that CreateAsync(IdentityUser user) and UserManager.Users refer to the same DbContext.
The problem isn't related to the third-party component, because I can reproduce the same problem by replacing it with a simple list.
Step to reproduce the problem
create a new Blazor server-side project with authentication
change Index.razor with the following code:
#page "/"
<h1>Hello, world!</h1>
number of users: #Users.Count()
<button #onclick="#(async () => await Add())">click me</button>
<ul>
#foreach(var user in Users)
{
<li>#user.UserName</li>
}
</ul>
#code {
[Inject] UserManager<IdentityUser> UserManager { get; set; }
IQueryable<IdentityUser> Users;
protected override void OnInitialized()
{
Users = UserManager.Users;
}
public async Task Add()
{
await UserManager.CreateAsync(new IdentityUser { UserName = $"test_{Guid.NewGuid().ToString()}" });
}
}
What I noticed
If I change Entity Framework provider from SqlServer to Sqlite then the error will never show.
System info
ASP.NET Core 3.1.0 Blazor Server-side
Entity Framework Core 3.1.0 based on SqlServer provider
What I have already seen
Blazor A second operation started on this context before a previous operation completed: the solution proposed doesn't work for me because even if I change my DbContext lifetime from Scoped to Transient, I am still using the same instance of UserManager, and it contains the same instance of DbContext
others on StackOverflow suggest creating a new instance of DbContext per request. I don't like this solution because it goes against dependency injection principles. Anyway, I can't apply it because the DbContext is wrapped inside UserManager
Create a generator of DbContext: this solution is pretty like the previous one.
Using Entity Framework Core with Blazor
Why I want to use IQueryable
I want to pass an IQueryable as the data source for my third-party component because it can apply pagination and filtering directly to the query. Furthermore, IQueryable is sensitive to CUD operations.
UPDATE (08/19/2020)
Here you can find the documentation about how to use Blazor and EFCore together
UPDATE (07/22/2020)
The EF Core team introduced DbContextFactory in Entity Framework Core .NET 5 Preview 7
[...] This decoupling is very useful for Blazor applications, where using IDbContextFactory is recommended, but may also be useful in other scenarios.
If you are interested you can read more at Announcing Entity Framework Core EF Core 5.0 Preview 7
UPDATE (07/06/2020)
Microsoft released an interesting new video about Blazor (both models) and Entity Framework Core. Please take a look at 19:20, where they talk about how to manage the concurrency problem with EF Core.
General solution
I asked Daniel Roth about this problem (BlazorDeskShow - 2:24:20) and it seems to be a Blazor Server-Side problem by design.
The DbContext's default lifetime is Scoped. So if you have at least two components on the same page trying to execute an async query, you will encounter the exception:
InvalidOperationException: A second operation started on this context before a previous operation completed. Any instance members are not guaranteed to be thread-safe.
There are two workaround about this problem:
(A) set DbContext's lifetime to Transient
services.AddDbContext<ApplicationDbContext>(opt =>
opt.UseSqlServer(Configuration.GetConnectionString("DefaultConnection")), ServiceLifetime.Transient);
(B) as Carl Franklin suggested (after my question): create a singleton service with a static method which returns a new instance of DbContext (a sketch follows this list).
Anyway, each solution works because it creates a new instance of DbContext.
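A rough sketch of workaround (B), here using an instance method on a singleton rather than a static one; ApplicationDbContext and the registration details are assumptions, not Carl Franklin's exact code:
// Startup: services.AddSingleton<DbContextFactoryService>();
// (this assumes DbContextOptions<ApplicationDbContext> is registered with a
// singleton optionsLifetime so it can be injected into a singleton service)
public class DbContextFactoryService
{
    private readonly DbContextOptions<ApplicationDbContext> _options;

    public DbContextFactoryService(DbContextOptions<ApplicationDbContext> options)
    {
        _options = options;
    }

    // Every caller gets its own short-lived context and disposes it after use.
    public ApplicationDbContext CreateContext() => new ApplicationDbContext(_options);
}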
About my problem
My problem wasn't strictly related to DbContext but to UserManager<TUser>, which has a Scoped lifetime. Setting the DbContext's lifetime to Transient didn't solve my problem because ASP.NET Core creates a new instance of UserManager<TUser> when I open the session for the first time, and it lives until I close it. This UserManager<TUser> is inside two components on the same page. Then we have the same problem described before:
two components that own the same UserManager<TUser> instance which contains a transient DbContext.
Currently, I solved this problem with another workaround:
I don't use UserManager<TUser> directly; instead, I create a new instance of it through IServiceProvider, and then it works. I am still looking for a way to change the UserManager's lifetime instead of going through IServiceProvider.
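Something like the following sketch is what I mean; the injected IServiceProvider and the scope handling are assumptions about the approach, not the exact code:
[Inject] IServiceProvider ServiceProvider { get; set; }

public async Task Add()
{
    // Resolve a fresh UserManager (and therefore a fresh DbContext) from its own
    // DI scope instead of reusing the one shared by the components on the page.
    using var scope = ServiceProvider.CreateScope();
    var userManager = scope.ServiceProvider.GetRequiredService<UserManager<IdentityUser>>();
    await userManager.CreateAsync(new IdentityUser { UserName = $"test_{Guid.NewGuid()}" });
}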
Tip: pay attention to services' lifetimes.
This is what I learned. I don't know if it is all correct or not.
I downloaded your sample and was able to reproduce your problem. The problem is caused because Blazor will re-render the component as soon as you await in code called from EventCallback (i.e. your Add method).
public async Task Add()
{
await UserManager.CreateAsync(new IdentityUser { UserName = $"test_{Guid.NewGuid().ToString()}" });
}
If you add a System.Diagnostics.Debug.WriteLine to the start and end of Add, and also add one at the top of your Razor page and one at the bottom, you will see the following output when you click your button.
//First render
Start: BuildRenderTree
End: BuildRenderTree
//Button clicked
Start: Add
(This is where the `await` occurs)
Start: BuildRenderTree
Exception thrown
You can prevent this mid-method rerender like so....
private bool MayRender = true;
protected override bool ShouldRender() => MayRender;
public async Task Add()
{
MayRender = false;
try
{
await UserManager.CreateAsync(new IdentityUser { UserName = $"test_{Guid.NewGuid().ToString()}" });
}
finally
{
MayRender = true;
}
}
This will prevent re-rendering whilst your method is running. Note that if you define Users as IdentityUser[] you will not see this problem, because the array is not set until after the await has completed and is not lazily evaluated, so you avoid the reentrancy problem.
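A minimal sketch of that array-based alternative (ToArrayAsync comes from Microsoft.EntityFrameworkCore and works here only because UserManager.Users is backed by EF Core):
IdentityUser[] Users = Array.Empty<IdentityUser>();

protected override async Task OnInitializedAsync()
{
    // Materialize the users once; rendering only ever sees a fully built array.
    Users = await UserManager.Users.ToArrayAsync();
}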
I believe you want to use IQueryable<T> because you need to pass it to 3rd party components. The problem is, different components can be rendered on different threads, so if you pass IQueryable<T> to other components then
They might render on different threads and cause the same problem.
They most likely will have an await in the code that consumes the IQueryable<T> and you'll have the same problem again.
Ideally, what you need is for the 3rd party component to have an event that asks you for data, giving you some kind of query definition (page number etc). I know Telerik Grid does this, as do others.
That way you can do the following
Acquire a lock
Run the query with the filter applied
Release the lock
Pass the results to the component
You cannot await inside a lock() block, so you'd need to use something like SpinLock to lock the resource.
private SpinLock Lock = new SpinLock();
private async Task<WhatTelerikNeeds> ReadData(SomeFilterFromTelerik filter)
{
bool gotLock = false;
while (!gotLock) Lock.Enter(ref gotLock);
try
{
var result = await ApplyFilter(MyDbContext.Users, filter).ToArrayAsync().ConfigureAwait(false);
return new WhatTelerikNeeds(result);
}
finally
{
Lock.Exit();
}
}
Perhaps not the best approach, but rewriting the async method as non-async fixes the problem:
public void Add()
{
Task.Run(async () =>
await UserManager.CreateAsync(new IdentityUser { UserName = $"test_{Guid.NewGuid().ToString()}" }))
.Wait();
}
It ensures that UI is updated only after the new user is created.
The whole code for Index.razor
#page "/"
#inherits OwningComponentBase<UserManager<IdentityUser>>
<h1>Hello, world!</h1>
number of users: #Users.Count()
<button #onclick="#Add">click me. I work if you use Sqlite</button>
<ul>
#foreach(var user in Users.ToList())
{
<li>#user.UserName</li>
}
</ul>
#code {
IQueryable<IdentityUser> Users;
protected override void OnInitialized()
{
Users = Service.Users;
}
public void Add()
{
Task.Run(async () => await Service.CreateAsync(new IdentityUser { UserName = $"test_{Guid.NewGuid().ToString()}" })).Wait();
}
}
I found your question looking for answers about the same error message you had.
My concurrency issue appears to have been caused by a change that triggered a re-rendering of the visual tree at the same time as (or because) I was trying to call DbContext.SaveChangesAsync().
I solved this by overriding my component's ShouldRender() method like this:
protected override bool ShouldRender()
{
if (_updatingDb)
{
return false;
}
else
{
return base.ShouldRender();
}
}
I then wrapped my SaveChangesAsync() call in code that set a private bool field _updatingDb appropriately:
try
{
_updatingDb = true;
await DbContext.SaveChangesAsync();
}
finally
{
_updatingDb = false;
StateHasChanged();
}
The call to StateHasChanged() may or may not be necessary, but I've included it just in case.
This fixed my issue, which was related to selectively rendering a bound input tag or just text depending on if the data field was being edited. Other readers may find that their concurrency issue is also related to something triggering a re-render. If so, this technique may be helpful.
Well, I have a quite similar scenario, and I 'solved' mine by moving everything from OnInitializedAsync() to
protected override async Task OnAfterRenderAsync(bool firstRender)
{
if(firstRender)
{
//Your code in OnInitializedAsync()
StateHasChanged();
}
}
It seemed solved, but I had no idea how to prove it. I guess skipping the work during initialization lets the component build successfully, and then we can go further.
/******************************Update********************************/
I'm still facing the problem; it seems I gave a wrong solution. When I checked Blazor A second operation started on this context before a previous operation completed, my problem became clear: I'm actually dealing with a lot of component initializations that perform dbContext operations. According to @dani_herrera, if you have more than one component executing Init at a time, the problem will probably appear.
I took his advice to change my dbContext service to Transient, and I got away from the problem.
@Leonardo Lurci has covered this conceptually. If you are not yet ready to move to the .NET 5.0 preview, I would recommend looking at the NuGet package 'EFCore.DbContextFactory'; the documentation is pretty neat. Essentially it emulates AddDbContextFactory. Of course, it creates a context per component.
So far, this is working fine for me without any problems...
I ensure single-threaded access by only interacting with my DbContext via a new DbContext.InvokeAsync method, which uses a SemaphoreSlim to ensure only a single operation is performed at a time.
I chose SemaphoreSlim because you can await it.
Instead of this
return Db.Users.FirstOrDefaultAsync(x => x.EmailAddress == emailAddress);
do this
return Db.InvokeAsync(() => ...the query above...);
// Add the following methods to your DbContext
private SemaphoreSlim Semaphore { get; } = new SemaphoreSlim(1);
public TResult Invoke<TResult>(Func<TResult> action)
{
Semaphore.Wait();
try
{
return action();
}
finally
{
Semaphore.Release();
}
}
public async Task<TResult> InvokeAsync<TResult>(Func<Task<TResult>> action)
{
await Semaphore.WaitAsync();
try
{
return await action();
}
finally
{
Semaphore.Release();
}
}
public Task InvokeAsync(Func<Task> action) =>
InvokeAsync<object>(async () =>
{
await action();
return null;
});
public Task InvokeAsync(Action action) =>
InvokeAsync(() =>
{
action();
return Task.CompletedTask;
});
@Leonardo Lurci has a great answer with multiple solutions to the problem. I will give my opinion about each solution and which one I think is the best.
Making the DbContext transient - it is a solution, but it is not optimized for these cases.
Carl Franklin's suggestion - the singleton service will not be able to control the lifetime of the context and will depend on the service requester to dispose the context after use.
Microsoft documentation - they talk about injecting a DbContext factory into a component that implements IDisposable so it can dispose the context when the component is destroyed. This is not a very good solution, because a lot of problems come with it; for example, performing a context operation and leaving the component before that operation finishes will dispose the context and throw an exception.
Finally, the best solution so far is to inject the DbContext factory into the component, but whenever you need a context, you create a new instance with a using statement like below:
public async Task<List<Something>> GetSomething()
{
using var context = DBFactory.CreateDBContext();
return await context.Something.ToListAsync();
}
Since the DbContext factory is optimized for creating new context instances, there is no significant overhead, making it a better choice and better performing than a transient context. It also disposes the context at the end of the method because of the using statement.
Hope it was useful.
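For completeness, a hedged sketch of the registration side using the built-in factory from EF Core 5+ (ApplicationDbContext and the connection string name are assumptions); the EFCore.DbContextFactory package mentioned above plays a similar role on EF Core 3.x:
// In Startup.ConfigureServices: register the factory once...
services.AddDbContextFactory<ApplicationDbContext>(options =>
    options.UseSqlServer(Configuration.GetConnectionString("DefaultConnection")));

// ...then inject IDbContextFactory<ApplicationDbContext> into the component and call
// CreateDbContext() per operation, disposing the context with a using statement as shown above.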

RavenDB fails with ConcurrencyException when using new transaction

This code always fails with a ConcurrencyException:
[Test]
public void EventOrderingCode_Fails_WithConcurrencyException()
{
Guid id = Guid.NewGuid();
using (var scope1 = new TransactionScope())
using (var session = DataAccess.NewOpenSession)
{
session.Advanced.UseOptimisticConcurrency = true;
session.Advanced.AllowNonAuthoritativeInformation = false;
var ent1 = new CTEntity
{
Id = id,
Name = "George"
};
using (var scope2 = new TransactionScope(TransactionScopeOption.RequiresNew))
{
session.Store(ent1);
session.SaveChanges();
scope2.Complete();
}
var ent2 = session.Load<CTEntity>(id);
ent2.Name = "Gina";
session.SaveChanges();
scope1.Complete();
}
}
It fails at the last session.SaveChanges, stating that it is using a non-current etag. If I use Required instead of RequiresNew for scope2 - i.e. using the same transaction - it works.
Now, since I load the entity (ent2), it should be using the newest etag, unless this is some cached value attached to scope1 that I am using (but I have disabled caching). So I do not understand why this fails.
I really need this setup. In the production code the outer TransactionScope is created by NServiceBus, and the inner is for controlling an aspect of event ordering. It cannot be the same Transaction.
And I need the optimistic concurrency too - if other threads uses the entity at the same time.
BTW: This is using Raven 2.0.3.0
Since no one else has answered, I had better give it a go myself.
It turns out this was a human error. Due to a bad configuration of our IOC container the DataAccess.NewOpenSession gave me the same Session all the time (across other tests). In other words Raven works as expected :)
Before I found out about this, I also experimented with using TransactionScopeOption.Suppress instead of RequiresNew. That also worked; then I just had to make sure that whatever I did in the suppressed scope could not fail, which was a valid option in my case.
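For reference, the Suppress variant only changes the inner scope; the rest of the test stays as above:
using (var scope2 = new TransactionScope(TransactionScopeOption.Suppress))
{
    // Runs outside any ambient transaction instead of starting a new one.
    session.Store(ent1);
    session.SaveChanges();
    scope2.Complete();
}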

MVC 3/EF repository pattern and proper data access

Being rather new to MVC 3 and EF, I'm trying to understand the best architectural approach to developing an application for my company. The application will be a large-scale application that potentially handles hundreds of users at the same time, so I want to make sure I understand and am following proper procedures. So far, I've determined that a simple repository pattern (such as Controller -> Repository -> EF) approach is the best and easiest to implement, but I'm not sure if that is definitely the best way to do things. The application will basically return data that is shown to a user in a devexpress grid and they can modify this data/add to it etc.
I found this article and it is rather confusing for me at this time, so I'm wondering if there is any reason to attempt to work with a disconnected EF and why you would even want to do so: http://www.codeproject.com/Articles/81543/Finally-Entity-Framework-working-in-fully-disconne?msg=3717432#xx3717432xx
So to summarize my question(s):
Is the code below acceptable?
Should it work fine for a large-scale MVC application?
Is there a better way?
Will unnecessary connections to SQL remain open from EF? (SQL Profiler makes it look like it stays open a while even after the using statement has exited)
Is the disconnected framework idea a better one and why would you even want to do that? I don't believe we'll need to track data across tiers ...
Note: The repository implements IDisposable and has the dispose method listed below. It creates a new instance of the entity context in the repository constructor.
Example Usage:
Controller (LogOn using Custom Membership Provider):
if (MembershipService.ValidateUser(model.UserName, model.Password))
{
User newUser = new User();
using (AccountRepository repo = new AccountRepository())
{
newUser = repo.GetUser(model.UserName);
...
}
}
Membership Provider ValidateUser:
public override bool ValidateUser(string username, string password)
{
using (AccountRepository repo = new AccountRepository())
{
try
{
if (string.IsNullOrEmpty(password.Trim()) || string.IsNullOrEmpty(username.Trim()))
return false;
string hash = FormsAuthentication.HashPasswordForStoringInConfigFile(password.Trim(), "md5");
bool exists = false;
exists = repo.UserExists(username, hash);
return exists;
}catch{
return false;
}
}
}
Account Repository Methods for GetUser & UserExists:
Get User:
public User GetUser(string userName)
{
try
{
return entities.Users.SingleOrDefault(user => user.UserName == userName);
}
catch (Exception Ex)
{
throw new Exception("An error occurred: " + Ex.Message);
}
}
User Exists:
public bool UserExists(string userName, string userPassword)
{
if (userName == "" || userPassword == "")
throw new ArgumentException(InvalidUsernamePassword);
try
{
bool exists = (entities.Users.SingleOrDefault(u => u.UserName == userName && u.Password == userPassword) != null);
return exists;
}
catch (Exception Ex)
{
throw new Exception("An error occurred: " + Ex.Message);
}
}
Repository Snippets (Constructor, Dispose etc):
public class AccountRepository : IDisposable
{
private DbContext entities;
public AccountRepository()
{
entities = new DbContext();
}
...
public void Dispose()
{
entities.Dispose();
}
}
What's acceptable is pretty subjective, but if you want to do proper data access I suggest you do NOT use the repository pattern, as it breaks down as your application gets more complex.
The biggest reason is minimizing database access. So for example look at your repository and notice the GetUser() method. Now take a step back from the code and think about how your application is going to be used. Now think about how often you are going to request data from the user table without any additional data. The answer is almost always going to be "rarely" unless you are creating a basic data entry application.
You say it your application will show a lot of grids. What data is in that Grid? I'm assuming (without knowing your application domain) that the grids will combine user data with other information that's relevant for that user. If that's the case, how do you do it with your repositories?
One way is to call on each repository's method individually, like so:
var user = userRepository.GetUser("KallDrexx");
var companies = companyRepository.GetCompaniesForUser(user.Id);
This now means you have 2 database calls for what really should be just one. As your screens get more and more complex, this will cause the number of database hits to increase and increase, and if your application gets significant traffic this will cause performance issues. The only real way to do this in the repository pattern is to add special methods to your repositories to do that specific query, like:
public class UserRepository
{
public User GetUser(string userName)
{
// GetUser code
}
public User GetUserWithCompanies(string userName)
{
// query code here
}
}
So now what happens if you need users and, say, their contact data in one query? Now you have to add another method to your user repository. Now say you need to do another query that also returns the number of clients each company has, so you need to add yet another method (or add an optional parameter). Now say you want to add a query that returns all companies and which users they contain. Now you need a new query method, but then comes the question: do you put that in the User repository or the Company repository? How do you keep track of which one it's in, and make it simple to choose between GetUserWithCompany and GetCompanyWithUsers when you need it later?
Everything gets very complex from that point on, and it's those situations that made me drop the repository pattern. What I do now for data access is create individual query and command classes; each class represents one (and only one) query or data update command to the database. Each query class returns a view model that contains only the data I need for one specific usage scenario. There are other data access patterns that will work too (the specification pattern, for example; some good devs even say you should just do your data access in your controllers, since EF is your data access layer).
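As a rough illustration of that query-class idea (the class names, the view model, and the Companies navigation property below are made up for the example, not taken from the question):
public class UserWithCompaniesViewModel
{
    public string UserName { get; set; }
    public List<string> CompanyNames { get; set; }
}

public class UserWithCompaniesQuery
{
    private readonly DbContext entities;

    public UserWithCompaniesQuery(DbContext entities)
    {
        this.entities = entities;
    }

    // One class, one query: fetch the user and their companies in a single round trip
    // and return only what the screen needs.
    public UserWithCompaniesViewModel Execute(string userName)
    {
        var user = entities.Set<User>()
            .Include("Companies")
            .SingleOrDefault(u => u.UserName == userName);

        if (user == null) return null;

        return new UserWithCompaniesViewModel
        {
            UserName = user.UserName,
            CompanyNames = user.Companies.Select(c => c.Name).ToList()
        };
    }
}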
The key to doing data access successfully is good planning. Do you know what your screens are going to look like? Do you know how users are going to use your system? Do you know all the data that is actually going to be on each screen? If the answer to any of these is no, then you need to take a step back and forget about the data layer, because the data layer is (or should be for a good application) determined based on how the application is actually going to be used, the UI and the screens should not be dependent on how the data layer was designed. If you don't take your UI needs and user usage scenarios into account when developing the data access, your application will not scale well and will not be performant. Sometimes that's not an issue if you don't plan on your site being big, but it never hurts to keep those things in mind.
No matter what you do, you may consider moving instantiation and disposing of your context to your controller like this:
public class MyController : Controller
{
private Entities context = new Entities();
...
public override void Dispose()
{
context.Dispose();
}
}
You can then pass that context into any method that needs it without duplicating the overhead of creating it.
I disagree that the repository pattern is necessarily bad for the same reason. You create multiple classes to break up your code to make it manageable and still reuse the same context. That could look something like this:
repository.Users.GetUser(userName);
In this case "Users" is a lazy loaded instance of your user repository class which reuses the context from your repository. So the code for that Users property in your repository would look something like this:
private UserRepository users;
public UserRepository Users
{
get
{
if (users == null)
{
users = new UserRepository(this);
}
return users;
}
}
You can then expose your context to these other lazy loaded classes via a property.
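A rough sketch of that exposure and how the child repository uses it (the property name, access level, and parent field are assumptions about the intent, not code from the answer):
// In the parent repository: expose the shared context to child repositories.
internal DbContext Context
{
    get { return entities; }
}

// In UserRepository, which received the parent repository ("this") in its constructor:
public User GetUser(string userName)
{
    return parent.Context.Set<User>().SingleOrDefault(u => u.UserName == userName);
}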
I don't think this necessarily conflicts with KallDrexx's pattern. His method simply flips this so instead of
repository.Users.GetUser(userName);
You would have something like
UserQuery query = new UserQuery(repository.Users);
This then becomes an issue of syntax. Do you want this:
repository.Area.Query(value1, value2, ...);
Or this:
AreaQuery query = new AreaQuery { Property1 = value1, ... };
The latter actually works nicer with model binding but obviously is more verbose when you actually have to code it.
The best advice KallDrexx gave is to just put your code in your actions and then figure it out. If you are doing simple CRUD, then let MVC instantiate and populate your model; then all you have to do is attach and save. If you find you can reuse code, move it to where it can be reused. If your application starts getting too complicated, try some of these recommendations until you find what works for you.

RavenDB Catch 22 - Optimistic Concurrency AND Seeing Changes from Other Clients

With RavenDB, creating an IDocumentSession upon app start-up (and never closing it until the app is closed) allows me to use optimistic concurrency by doing this:
public class GenericData : DataAccessLayerBase, IGenericData
{
public void Save<T>(T objectToSave)
{
Guid eTag = (Guid)Session.Advanced.GetEtagFor(objectToSave);
Session.Store(objectToSave, eTag);
Session.SaveChanges();
}
}
If another user has changed that object, then the save will correctly fail.
But what I can't do, when using one session for the lifetime of the app, is see changes made to documents by other instances of the app (say, Joe, five cubicles away). When I do this, I don't see Joe's changes:
public class CustomVariableGroupData : DataAccessLayerBase, ICustomVariableGroupData
{
public IEnumerable<CustomVariableGroup> GetAll()
{
return Session.Query<CustomVariableGroup>();
}
}
Note: I've also tried this, but it didn't display Joe's changes either:
return Session.Query<CustomVariableGroup>().Customize(x => x.WaitForNonStaleResults());
Now, if I go the other way, and create an IDocumentSession within every method that accesses the database, then I have the opposite problem. Because I have a new session, I can see Joe's changes. Buuuuuuut... then I lose optimistic concurrency. When I create a new session before saving, this line produces an empty GUID, and therefore fails:
Guid eTag = (Guid)Session.Advanced.GetEtagFor(objectToSave);
What am I missing? If a Session shouldn't be created within each method, nor at the app level, then what is the correct scope? How can I get the benefits of optimistic concurrency and the ability to see others' changes when doing a Session.Query()?
You won't see the changes because you use the same session. See my other replies for more details.
Disclaimer: I know this can't be the long-term approach, and therefore won't be an accepted answer here. However, I simply need to get something working now, and I can refactor later. I also know some folks will be disgusted with this approach, lol, but so be it. It seems to be working. I get new data with every query (new session), and I get optimistic concurrency working as well.
The bottom line is that I went back to one session per data access method. And whenever a data access method does some type of get/load/query, I store the eTags in a static dictionary:
public IEnumerable<CustomVariableGroup> GetAll()
{
using (IDocumentSession session = Database.OpenSession())
{
IEnumerable<CustomVariableGroup> groups = session.Query<CustomVariableGroup>();
CacheEtags(groups, session);
return groups;
}
}
Then, when I'm saving data, I grab the eTag from the cache. This causes a concurrency exception if another instance has modified the data, which is what I want.
public void Save(EntityBase objectToSave)
{
if (objectToSave == null) { throw new ArgumentNullException("objectToSave"); }
Guid eTag = Guid.Empty;
if (objectToSave.Id != null)
{
eTag = RetrieveEtagFromCache(objectToSave);
}
using (IDocumentSession session = Database.OpenSession())
{
session.Advanced.UseOptimisticConcurrency = true;
session.Store(objectToSave, eTag);
session.SaveChanges();
CacheEtag(objectToSave, session); // We have a new eTag after saving.
}
}
I absolutely want to do this the right way in the long run, but I don't know what that way is yet.
Edit: I'm going to make this the accepted answer until I find a better way.
Bob, why don't you just open up a new Session every time you want to refresh your data?
Opening a new session for every request has many trade-offs, and your solution for optimistic concurrency (managing etags within your own singleton dictionary) shows that it was never intended to be used that way.
You said you have a WPF application. Alright, open a new Session on startup. Load and query whatever you want, but don't close the Session until you want to refresh your data (e.g. a list of orders, customers, I don't know...). Then, when you want to refresh it (after a user clicks a button, a timer event fires, or whatever), dispose the session and open a new one. Does that work for you?

NHibernate - ISession

About the declaration of ISession.
Should we close the Session every time we use it, or should we keep it open?
I'm asking this because in the NHibernate manual (nhforge.org) they recommend declaring it once in Application_Start, for example, but I don't know if we should close it every time we use it.
Thanks
You can keep one single static reference to an ISessionFactory, which can indeed be instantiated in Application_Start for web applications.
However, ISession must not be kept open and cannot be shared between two or more requests. You should adopt the "one session per request" pattern which allows you to build a single ISession for each HTTP request and dispose it safely once the request has been handled (this is assuming you are writing a web application).
For instance, the code handling NHibernate sessions in your project might look like this:
public static class NHibernateHelper {
static ISessionFactory _factory;
static NHibernateHelper(){
//This code runs once when the application starts
//Use whatever is needed to build your ISessionFactory (read configuration, etc.)
_factory = CreateYourSessionFactory();
}
const string SessionKey = "NhibernateSessionPerRequest";
public static ISession OpenSession(){
var context = HttpContext.Current;
//Check whether there is an already open ISession for this request
if(context != null && context.Items.ContainsKey(SessionKey)){
//Return the open ISession
return (ISession)context.Items[SessionKey];
}
else{
//Create a new ISession and store it in HttpContext
var newSession = _factory.OpenSession();
if(context != null)
context.Items[SessionKey] = newSession;
return newSession;
}
}
}
This code is probably far too simple and has not been tested (nor compiled, in fact), but it should work. For safer handling of your sessions you could also use an IoC container (Autofac, for instance) and register your ISession with a lifetime that depends on HTTP requests (Autofac will handle everything for you in that case).
Sessions should be closed when you are done with them. There are multiple possible ways to manage the lifetime of a session and choosing the right one is specific to each scenario. "Unit of Work" and "Session per Request" are the two most often used session lifetime management patterns.
In Application_Start, you should create the SessionFactory, not the Session. The SessionFactory does not need to be closed.
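For example, a minimal sketch of that for a web application; the Configure() call assumes your settings come from hibernate.cfg.xml or Web.config, so adjust it to however you configure NHibernate:
public class Global : System.Web.HttpApplication
{
    // One factory for the whole application; it is never closed while the app runs.
    public static ISessionFactory SessionFactory { get; private set; }

    protected void Application_Start(object sender, EventArgs e)
    {
        var configuration = new NHibernate.Cfg.Configuration().Configure();
        SessionFactory = configuration.BuildSessionFactory();
    }
}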