NHibernate second-level caching of collections

I am having trouble working out how to correctly cache one-to-many or many-to-many relationships in NHibernate.
For example, an office class may have the following mapping:
public OfficeDbMap()
{
    ...
    HasMany(x => x.Employees)
        .Cache.NonStrictReadWrite();
}
However I find that when I delete an employee (without specifically removing its relationship to the office), that the cache of office->employees does not get invalidated and the employee continues to appear in the office's list of employees.
I suspect it may have something to do with cache regions, but I don't know whether the region should be the office's region or the employee's region (actually I have tried specifying both and neither works).

The problem may be the NonStrictReadWrite configuration. Nonstrict read-write makes no guarantee that the cache stays consistent with the database, so stale entries like this can linger; you have to use the read-write strategy here.
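For example, the collection mapping above would become (a minimal sketch of that one change):

HasMany(x => x.Employees)
    .Cache.ReadWrite();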


NHibernate - using tenant specific table prefixes

We are developing a multi-tenant application using NHibernate where all tenants share the same database.
One option we considered was to use a tenant specific prefix for our database objects (I believe this is the same approach taken by Orchard).
This would at least give us some kind of recovery model for tenant data and would mean we wouldn't have to include a tenantid filter on most of our queries.
So my question: has anyone employed this strategy? If so, how did you go about it?
Specifically, can we use the same SessionFactory for all tenants, and can we use NHibernate to generate a new set of tables for a "new" tenant at runtime (is it safe to do so)?
Thanks
Ben
[Update]
This was a worthwhile investigation but ultimately we decided that a shared schema was more suitable for our needs. Schema per tenant clearly offers better separation of tenant data but makes maintenance more difficult. Since our tenants are only storing small amounts of data, the thought of having 10 tables * 1K tenants is a little off-putting.
There are a couple of points of customization / extension that you may want to consider.
I don't think that you will be able to share the same session factory across tenants. I think that the simplest thing to do may be to update the mappings based on the tenant associated with the session factory.
public class EntityMap : ClassMap<Entity>
{
    public EntityMap()
    {
        Table("TableName");
        Schema(Session.TenantName);
        Id(p => p.Id, "Id").GeneratedBy.Identity();
    }
}
If you want to have each tenant in their own schema, this should work. If you want to keep the schema the same but have a prefix on the table, you could change to:
public class EntityMap : ClassMap<Entity>
{
    public EntityMap()
    {
        Table(Session.TenantPrefix + "TableName");
        Schema("SCHEMA");
        Id(p => p.Id, "Id").GeneratedBy.Identity();
    }
}
You can also try providing your own ConnectionProvider. Derive a class from NHibernate.Connection.DriverConnectionProvider and reference your class in the NHibernate configuration file, so instead of:
<property name="connection.provider">NHibernate.Connection.DriverConnectionProvider</property>
use
<property name="connection.provider">My.Project.DAL.NHibernate.CustomConnectionProvider, My.Project.DAL</property>
When GetConnection is called, you can supply your own connection string based on the tenant.
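A rough sketch of such a provider (TenantContext here is a hypothetical helper that resolves the current tenant's connection string; it is not part of NHibernate):

using System;
using System.Data;
using NHibernate.Connection;

namespace My.Project.DAL.NHibernate
{
    public class CustomConnectionProvider : DriverConnectionProvider
    {
        public override IDbConnection GetConnection()
        {
            // Create the connection via the configured driver, but swap in
            // a tenant-specific connection string instead of the one from
            // the configuration file.
            IDbConnection connection = Driver.CreateConnection();
            try
            {
                connection.ConnectionString = TenantContext.Current.ConnectionString;
                connection.Open();
            }
            catch (Exception)
            {
                connection.Dispose();
                throw;
            }
            return connection;
        }
    }
}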
can we use NHibernate to generate a new set of tables for a "new" tenant
at runtime (is it safe to do so)
I'd suggest that you wouldn't want to grant your web application the level of permissions required to perform these DDL tasks. I'd leave the web app with the minimum level of permissions for normal DML operations and have a background service operating as a 'Provisioning Service'. Its role would be schema modifications for the new tenant, and it is also a good place to put any other tenant provisioning tasks such as creating new folders, configuring IIS, etc. All these tasks take time and are best not done in a single web request. The background service can update a provisioning table with information about its progress until it's complete, and the web UI moves on to the next step.

should my POCO classes communicate with repositories, and if so- how to do it?

I'm writing an ASP.NET WebForms app using NHibernate. Currently I'm not using any IoC framework.
Firstly, my scenarios:
I don't want to allow two Employees with the same name. I would like to enable the Employee class to search for another Employee with the same name and raise an error.
A Department may have 1000s of Employees. I need to know exactly how many Employees each Department has. I don't want to load them all into memory, so I use a calculated property, EmployeesCount. However, this property is initialized the first time the Department object is loaded; if I add or remove employees, the changes are not reflected in it (especially since I'm using the 2nd level cache, so my objects persist across multiple session scopes).
My idea of solving these problems is to have my domain objects hold reference to the appropriate Repository objects.
I think the best solution is to have my repositories implement interfaces and use some sort of IoC container / DI mechanism.
So, in Employee I'll have:
if (_empRepository.GetEmployeeByName(newEmployeeName) != null) //...
and in Department:
public int EmployeesCount
{
    get
    {
        return _departmentRepository.GetCurrentEmployeesCount(this);
    }
}
my questions:
Which is preferable: constructor DI,
public Employee(IEmployeeRepository repository)
or resolving from an IoC container inside the class?
public Employee()
{
    _empRepository = container.Resolve<IEmployeeRepository>();
}
How do I implement the desired solution? I understand that interceptors / tuplizers are the way to go, but I couldn't really get my head around the whole thing. Is this a good example? Also, in which namespace should I define the interfaces, the IoC container, etc.?
The POCO-to-persistence relationship is usually uni-directional: the persistence tier knows about POCOs, but not the other way around.
The moment you make the POCO responsible for things like searching for other Employees with the same name, you make it bi-directional. I'd be concerned about growing cycles between modules in your design.
Putting that activity in the POCO might sound like a good idea, because it helps to avoid the "anemic domain model" label, but I wouldn't do it. I'd put those methods on persistence interfaces and let them handle it. It's also more likely that the service in charge of fulfilling the use case would want to know about duplicate Employee names; let it do the asking and leave POCO out of the conversation.
This approach will also make your testing life easier. You'll find out how painful module cycles can be when you go to test: you have to drag too much machinery into the test to get it to work.
I also think POCOs shouldn't know about persistence. Have you considered alternatives like:
class Department
{
    public int Id { get; set; }
    public IList<Employee> ActiveEmployees { get; set; }
}
public DepartmentMap()
{
    ...
    HasMany(d => d.ActiveEmployees)
        .KeyColumn("department_id")
        .Where("stillactiveindepartment = true") // for example
        .ExtraLazyLoad(); // Count issues a SELECT COUNT(*) instead of initializing the collection; other methods like Contains() also go to SQL
}
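With extra-lazy loading, reading the count alone doesn't hydrate the collection. For example (assuming department came from an open session):

// Issues a SELECT COUNT(*) rather than loading every Employee.
int employeesCount = department.ActiveEmployees.Count;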
For Employees with unique names:
1) In the service/controller: before saving a new employee, issue a query which checks whether an employee with the same name already exists (see the sketch below).
2) A unique constraint in the database will give an exception on save.
3) A custom interceptor in the session which hooks the save/flush event; it can throw if an employee with that name already exists.
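A minimal sketch of option 1, assuming the IEmployeeRepository from the question also exposes a Save method and that Employee has a Name property:

public void AddEmployee(Employee newEmployee)
{
    // Check for a duplicate name before saving. This isn't race-proof on
    // its own, so the unique constraint from option 2 is still worth having.
    if (_empRepository.GetEmployeeByName(newEmployee.Name) != null)
        throw new InvalidOperationException(
            "An employee named '" + newEmployee.Name + "' already exists.");

    _empRepository.Save(newEmployee);
}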
The other answer mostly addresses the questions you asked, and I agree with the advice there.
For your question about accessing employee counts, look into using projections with NHibernate. I'd put a query in your service layer (for whatever web service operation needs to get the employee count) that uses a count projection.
EDIT: Forgot you're using Services/Repositories/POCOs. In that case, create an employee count query in the repository and invoke it from the service.
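A rough sketch of such a query using the ICriteria API (assumes Employee has a mapped Department reference, Session is an open ISession, and NHibernate.Criterion is imported):

public int GetCurrentEmployeesCount(Department department)
{
    return Session.CreateCriteria<Employee>()
        .Add(Restrictions.Eq("Department", department))
        .SetProjection(Projections.RowCount())   // COUNT(*) done by the database
        .UniqueResult<int>();
}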

NHibernate caching entities across sessions using SysCache

I'm developing a web application and I would like caching to persist across web requests.
I am aware that the first level cache is per-session only. I have second-level caching enabled and this is working for queries.
However, the second-level cache does not seem to work for "getting" entities... therefore most of the DB work the application does is not being cached across web requests.
Is this normal / desirable behaviour? I'm reviewing one particular page that makes lots of round trips to the database; although each query is quick, these trips seem unnecessary if the entities could be cached.
Edit
Okay so I have second level cache enabled, and working for queries. I just can't seem to get it working for entities. I have Cache.Is(c => c.ReadWrite()) (fluent nhibernate) on my main entity that I'm testing. But nope, it still hits the DB each time. Any ideas?
Edit
I've tried using transactions like so:
public override Accommodation Get(int id)
{
    using (var tx = Session.BeginTransaction())
    {
        var accomm = Session.Get<Accommodation>(id);
        tx.Commit();
        return accomm;
    }
}
My mapping is such (and you can see we have a nasty schema):
public void Override(AutoMapping<Core.Entities.Itinerary.Accommodation.Accommodation> mapping)
{
    mapping.HasManyToMany(x => x.Features).Table("AccommodationLinkFeatureType").ChildKeyColumn("FeatureTypeId").NotFound.Ignore();
    mapping.HasManyToMany(x => x.SimilarAccommodation).Table("AccommodationLinkSimilarAccommodation").ChildKeyColumn("SimilarAccommodationId").NotFound.Ignore();
    mapping.HasMany(x => x.TourItinerary).Table("AccommodationTourItinerary");
    mapping.HasOne(x => x.Images).ForeignKey("AccommodationId").Cascade.All().Not.LazyLoad();
    mapping.References(x => x.CollectionType).NotFound.Ignore().Not.LazyLoad();
    mapping.References(x => x.AccommodationUnitType).NotFound.Ignore().Not.LazyLoad();
    Cache.Is(c => c.ReadWrite());
}
However, this still doesn't seem to fetch from the 2nd level cache.
Incidentally, I see a lot of examples online using Cache.ReadWrite() but I can only see an Is method on the Cache helper, so I'm trying Cache.Is(c => c.ReadWrite()) -- has the fluent interface changed?
I have not tested this, but my understanding is that committing a transaction is the magic that places objects into the second-level cache. If you do read operations outside of a transaction, the objects will not be placed in the second-level cache.
I had the same problem.
In my case the cause was that the reference was mapped with NotFound().Ignore() (i.e. if no entity is found for this foreign key, just ignore it, which is actually a data consistency error anyway). Remove NotFound.Ignore and fix your db.
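For example, for the CollectionType reference in the mapping above, the fix would look like this (a sketch; only after the inconsistent rows have been cleaned up):

mapping.References(x => x.CollectionType).Not.LazyLoad(); // NotFound.Ignore removed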
Maybe you're having some problem with the configuration of your cache provider. I've been able to do what you want using Couchbase as a 2nd level cache provider, as described here:
http://blog.couchbase.com/introducing-nhibernate-couchbase-2nd-level-cache-provider
If your deployment environment is on Azure, I guess this might be useful. Note that the SysCache module can't co-exist with the AzureMemcached module.
http://www.webmoco.com/webmoco-development-blog/orchard-cms-second-level-caching

Where to put NHibernate query logic?

I am trying to set up proper domain architecture using Fluent NHibernate and Linq to NHibernate. I have my controllers calling my Repository classes, which do the NHibernate thang under the hood and pass back ICollections of data. This seems to work well because it abstracts the data access and keeps the NHibernate functionality in the "fine print".
However, now I'm finding situations where my controllers need to use the same data calls in a different context. For example, my repo returns a list of Users. That's great when I want to display a list of users, but when I want to start utilizing the child classes to show roles, etc., I run into SELECT N+1 issues. I know how to change that in NHibernate so it uses joins instead, but my specific question is WHERE do I put this logic? I don't want every GetAllUsers() call to return the roles also, but I do want some of them to.
So here are my three options that I see:
Change the setting in my mapping so the roles are joined to my query.
Create two Repository calls - GetAllUsers() and GetUsersAndRoles().
Return my IQueryable object from the Repository to the Controller and use the NHibernate Expand method.
Sorry if I didn't explain this very well. I'm just jumping into DDD and a lot of this terminology is still new to me. Thanks!
As lomaxx points out, you need query.Expand.
To prevent your repository from becoming obscured with all kinds of methods for every possible situation, you could create Query Objects which make configurable queries.
I posted some examples using the ICriteria API on my blog. The ICriteria API has FetchMode instead of Expand, but the idea is the same.
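For example, a small query object along these lines (a sketch; it assumes the Roles collection is mapped on Users):

// FetchMode.Join eagerly joins Roles for this query only, leaving the
// mapping's default lazy behaviour untouched.
var users = session.CreateCriteria<Users>()
    .SetFetchMode("Roles", FetchMode.Join)
    .List<Users>();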
I try and keep all the query logic in my repositories and try to only pass back the ICollection from them.
In your situation, I'd pass in some parameters to determine if you want to eager load roles or not and construct the IQueryable that way. For example:
public IList<Users> GetAllUsers(bool loadRoles)
{
    var query = session.Linq<Users>();
    if (loadRoles)
        query.Expand("Roles");
    return query.ToList();
}
I would choose 2, creating the two repository calls. Perhaps I would also consider a third call, GetRoleByUser(User user). You could then fetch a user's roles upon selection change (on a separate thread if required), which improves performance because you avoid loading the roles of every user, which would require the most resources.
It sounds like you are asking if it is possible to make GetAllUsers() sometimes return just the Users entities and sometimes return the Users and the roles.
I would either make a separate repository method called GetRolesForUser(User user), use lazy loading for Roles, or use the GetAllUsers(bool loadRoles) approach from lomaxx's answer.
I would lean toward lazy loading roles or a separate method in your repository.

Fluent NHibernate hierarchical data

Hey all. Quick question on Fluent syntax. I had thought I had this down, but I'm getting a weird failure. Basically, I have a hierarchical kind of structure that I'm trying to persist, and it all seems to work, except when I do an actual integration test w/ the db.
I have a Node object which has a Parent property, which is another Node, and a _children field backing a readonly Children property, which is a collection of Nodes as well.
The properties handle correlating the relationships, and the in-memory objects test out just fine. When I retrieve them from the repository (an in-memory SQLite db in my tests), though, any Node's Children include itself for some reason. Any ideas?
My mappings are mostly done w/ AutoMap, but I've overridden the following:
mapping.References(x => x.Parent);
mapping.HasMany(x => x.Children).Inverse().Access.LowerCaseField(Prefix.Underscore);
I've also tried it w/o the Inverse() call.
Got it. The problem was that I needed to tell the children collection what Id field to hook into for the foreign key.
I changed that mapping to look like so:
mapping.HasMany(m => m.Children)
    .Inverse()
    .KeyColumn("ParentId")
    .Access.CamelCaseField(Prefix.Underscore)
    .Cascade.All();