I'm developing a web application and I would like caching to persist across web requests.
I am aware that the first level cache is per-session only. I have second-level caching enabled and this is working for queries.
However, the second-level cache does not seem to work for "getting" entities... therefore most of the DB work the application does is not being cached across web requests.
Is this normal / desirable behaviour? I'm reviewing one particular page that makes lots of round trips to the database; although each query is quick, these trips seem unnecessary if the entities could be cached.
Edit
Okay, so I have the second-level cache enabled and working for queries. I just can't get it working for entities. I have Cache.Is(c => c.ReadWrite()) (Fluent NHibernate) on the main entity I'm testing, but it still hits the DB each time. Any ideas?
Edit
I've tried using transactions like so:
public override Accommodation Get(int id)
{
    using (var tx = Session.BeginTransaction())
    {
        var accomm = Session.Get<Accommodation>(id);
        tx.Commit();
        return accomm;
    }
}
My mapping is such (and you can see we have a nasty schema):
public void Override(AutoMapping<Core.Entities.Itinerary.Accommodation.Accommodation> mapping)
{
    mapping.HasManyToMany(x => x.Features).Table("AccommodationLinkFeatureType").ChildKeyColumn("FeatureTypeId").NotFound.Ignore();
    mapping.HasManyToMany(x => x.SimilarAccommodation).Table("AccommodationLinkSimilarAccommodation").ChildKeyColumn("SimilarAccommodationId").NotFound.Ignore();
    mapping.HasMany(x => x.TourItinerary).Table("AccommodationTourItinerary");
    mapping.HasOne(x => x.Images).ForeignKey("AccommodationId").Cascade.All().Not.LazyLoad();
    mapping.References(x => x.CollectionType).NotFound.Ignore().Not.LazyLoad();
    mapping.References(x => x.AccommodationUnitType).NotFound.Ignore().Not.LazyLoad();
    Cache.Is(c => c.ReadWrite());
}
However, this still doesn't seem to fetch from the 2nd level cache.
Incidentally, I see a lot of examples online using Cache.ReadWrite() but I can only see an Is method on the Cache helper, so I'm trying Cache.Is(c => c.ReadWrite()) -- has the fluent interface changed?
I have not tested this, but my understanding is that committing transactions is what places objects into the second-level cache. If you do read operations outside of a transaction, the objects will not be placed in the second-level cache.
I had the same problem.
In my case the cause was that the references were mapped with NotFound().Ignore() (i.e. if no entity is found with this foreign key, just ignore it, which is actually a data consistency error anyway). Remove NotFound.Ignore and fix your DB.
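Applied to the mapping from the question, that would mean dropping NotFound.Ignore from the reference mappings (a sketch; the entity and property names are taken from the question above):

```csharp
// Before: the NotFound.Ignore() hint prevented entity caching from working
// mapping.References(x => x.CollectionType).NotFound.Ignore().Not.LazyLoad();

// After: remove NotFound.Ignore and repair any orphaned foreign keys in the DB
mapping.References(x => x.CollectionType).Not.LazyLoad();
mapping.References(x => x.AccommodationUnitType).Not.LazyLoad();
```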
Maybe you're having some problem with the configuration of your cache provider. I've been able to do what you want using Couchbase as a 2nd-level cache provider, as described here:
http://blog.couchbase.com/introducing-nhibernate-couchbase-2nd-level-cache-provider
If your deployment environment is on Azure, I guess this might be useful. Note that the SysCache module can't co-exist with the AzureMemcached module.
http://www.webmoco.com/webmoco-development-blog/orchard-cms-second-level-caching
Related
So I have a fairly comprehensive activity-based access control system I built for a web app under MVC 4 using Entity Framework. Well, to be precise the access control doesn't care if it's using EF or not, but the app is.
Anyway, I'm loading the user's permissions on each request right now. I get a reference to my DbContext injected from the IoC container into my ApplicationController, and it overrides OnAuthorization to stuff the user's profile into the HttpContext.Current.Items. Seems to work fairly well, but I can't help but wonder if it's the best way.
My thought was that since the users' permissions don't change often, if ever, the better way to do it would be to load the profile of permissions into the Session instead, and then not have to change them at all until the user logs out and logs back in (pretty common in desktop OS's anyway). But I'm concerned that if I fetch using the DbContext, then the object I get back is a dynamic proxy which holds a reference to the DbContext and I certainly don't want to do that for the whole session.
Thoughts? Is this a good approach, and if so how do I ensure that my DbContext doesn't linger beyond when I really need it?
Invoke .AsNoTracking() on the Set<UserPermission> before you query out. Entities will still be proxied, but will be detached from the DbContext.
var userPermission = dbContext.Set<UserPermission>().AsNoTracking()
    .SingleOrDefault(x => x.UserName == User.Identity.Name);
Thoughts? Is this a good approach?
Putting a dynamically proxied entity in session will break as soon as you load balance your code across more than 1 web server. Why? Because of the dynamic proxy class. Server A understands the type DynamicProxies.UserPermission_Guid, because it queried out the entity. However Server B through N do not, and therefore cannot deserialize it from the Session. The other servers will dynamically proxy the entity with a different GUID.
That said, you could map your data into a POCO DTO and put that in session instead. Then you no longer need to worry about your entity being attached to the context when you first query it out; AsNoTracking will only make the query perform a bit faster.
// you can still call .AsNoTracking for performance reasons
var userPermissionEntity = dbContext.Set<UserPermission>().AsNoTracking()
    .SingleOrDefault(x => x.UserName == User.Identity.Name);

// this can safely be put into session and restored by any server with a
// reference to the DLL where the DTO class is defined.
var userPermissionSession = new UserPermissionInSession
{
    UserName = userPermissionEntity.UserName,
    // etc.
};
Thoughts? Is this a good approach?
Another problem with this approach arises when you use the common pattern of creating one DbContext per HTTP request. This pattern typically disposes the DbContext when the request ends.
protected virtual void Application_EndRequest(object sender, EventArgs e)
But what happens when we try to access a navigation property of a proxy entity that references a disposed DbContext?
We get an ObjectDisposedException.
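A minimal sketch of that failure mode (the context and entity names are illustrative; `Roles` stands in for any lazily loaded navigation property on the proxied entity):

```csharp
UserPermission permission;
using (var dbContext = new AppDbContext()) // e.g. disposed at Application_EndRequest
{
    // Returns a dynamic proxy that keeps a reference to dbContext
    permission = dbContext.Set<UserPermission>()
        .SingleOrDefault(x => x.UserName == "alice");
}

// The proxy still points at the now-disposed context, so touching a lazily
// loaded navigation property throws ObjectDisposedException.
var roles = permission.Roles;
```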
Ok, so I have an n-tiered model (WPF, ASP.NET, etc.) talking to backend services via WCF. These services use NHibernate to communicate with the database. Some of these services will be run in InstanceContextMode.Single mode.
Questions:
1. In singleton service instances, should I try to utilize one session object for the entire time the WCF service is alive to get the most out of my cache?
2. If I use one session instance in this singleton instance and never create new ones, I assume I have to worry about eventually removing cached entities from the session, or dumping it altogether, to avoid performance issues with the session?
3. Is it a good idea at all to use the session in this way for a singleton WCF service? It seems like it would be if I want to utilize caching.
4. Should I utilize the 2nd-level cache in a scenario like this?
5. Outside of this scenario, when should I avoid caching? I would assume that I would want to avoid it in any sort of batching scenario where a large number of objects are created/updated and never really used again outside of the creation or updates.
6. Are items automatically cached in the session when I create/read/update/delete, or do I need to specify something in the mapping files or configuration?
1-3: As far as I know, ISession objects are supposed to be lightweight, short-lived objects, which live only for the duration for which they're needed. I would advise AGAINST using the same ISession object for the whole lifetime of your service.
What I would suggest instead is using the same ISessionFactory instance, and creating new ISessions from it as necessary (you can try something similar to the session-per-request pattern).
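A sketch of what that could look like in the singleton service (the names are illustrative; `BuildSessionFactory()` stands in for your actual Fluent NHibernate configuration):

```csharp
public class MyService : IMyService // hosted with InstanceContextMode.Single
{
    // One ISessionFactory for the lifetime of the service: expensive to build,
    // thread-safe, and the owner of the 2nd-level cache.
    private static readonly ISessionFactory Factory = BuildSessionFactory();

    public Accommodation Get(int id)
    {
        // A fresh, short-lived ISession per operation.
        using (var session = Factory.OpenSession())
        using (var tx = session.BeginTransaction())
        {
            var result = session.Get<Accommodation>(id);
            tx.Commit();
            return result;
        }
    }
}
```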
If you enable 2nd level cache, you can have all the benefits of caching in this scenario.
5: Yep, pretty much. Also remember that the 2nd-level cache is per ISessionFactory instance. That means that if you're using more than one ISessionFactory instance you'll have a lot of problems with your cache.
6: For the 1st-level cache you don't need to define anything.
For the 2nd-level cache you need to enable the cache when you configure NHibernate (fluently, in my case):
.Cache(c => c.UseQueryCache()
    .ProviderClass(
        isWeb ? typeof(NHibernate.Caches.SysCache2.SysCacheProvider).AssemblyQualifiedName // in web environment - use SysCache2
              : typeof(NHibernate.Cache.HashtableCacheProvider).AssemblyQualifiedName // in dev environment - use stupid cache
    ))
and specify for each entity and each collection that you want to enable cache for them:
mapping.Cache.ReadWrite().Region("myRegion");
and for a collection:
mapping.HasMany(x => x.Something)
.Cache.ReadWrite().Region("myRegion");
We are developing a multi-tenant application using NHibernate where all tenants share the same database.
One option we considered was to use a tenant-specific prefix for our database objects (I believe this is the same approach taken by Orchard).
This would at least give us some kind of recovery model for tenant data and would mean we wouldn't have to include a tenantid filter on most of our queries.
So my question - has anyone employed this strategy? If so, how did you go about it.
Specifically, can we use the same SessionFactory for all tenants, and can we use NHibernate to generate a new set of tables for a "new" tenant at runtime (is it safe to do so)?
Thanks
Ben
[Update]
This was a worthwhile investigation but ultimately we decided that a shared schema was more suitable for our needs. Schema per tenant clearly offers better separation of tenant data but makes maintenance more difficult. Since our tenants are only storing small amounts of data, the thought of having 10 tables * 1K tenants is a little off-putting.
There are a couple of points of customization / extension that you may want to consider.
I don't think that you will be able to share the same session factory across tenants. I think that the simplest thing to do may be to update the mappings based on the tenant associated with the session factory.
public class EntityMap : ClassMap<Entity>
{
    public EntityMap()
    {
        Table("TableName");
        Schema(Session.TenantName);
        Id(p => p.Id, "Id").GeneratedBy.Identity();
    }
}
If you want to have each tenant in their own schema, this should work. If you want to keep the schema the same but have a prefix on the table, you could change to:
public class EntityMap : ClassMap<Entity>
{
    public EntityMap()
    {
        Table(Session.TenantPrefix + "TableName");
        Schema("SCHEMA");
        Id(p => p.Id, "Id").GeneratedBy.Identity();
    }
}
You can also try providing your own ConnectionProvider. Derive a class from NHibernate.Connection.DriverConnectionProvider and reference your own copy in the nhibernate configuration file instead of:
<property name="connection.provider">NHibernate.Connection.DriverConnectionProvider</property>
use
<property name="connection.provider">My.Project.DAL.NHibernate.CustomConnectionProvider, My.Project.DAL</property>
When GetConnection is called on your provider, you can supply a connection string based on the tenant.
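A sketch of such a provider (TenantContext is a hypothetical helper that resolves the current tenant's connection string; the exact GetConnection signature varies between NHibernate versions):

```csharp
public class CustomConnectionProvider : NHibernate.Connection.DriverConnectionProvider
{
    public override System.Data.IDbConnection GetConnection()
    {
        // Create a connection via the configured driver, but point it at the
        // current tenant's database instead of the configured connection string.
        var conn = Driver.CreateConnection();
        conn.ConnectionString = TenantContext.Current.ConnectionString;
        conn.Open();
        return conn;
    }
}
```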
can we use NHibernate to generate a new set of tables for a "new" tenant
at runtime (is it safe to do so)
I'd suggest that you wouldn't want to grant your web application the level of permissions required to perform these DDL tasks. I'd leave the web app with the minimum permissions needed for normal DML operations and have a background service operating as a 'provisioning service'. Its role would be to make the schema modifications for the new tenant, and it is also a good place to put any other tenant provisioning tasks, such as creating new folders, configuring IIS, etc. All these tasks take time and are best not done in a single web request. The background service can update a provisioning table with information about its progress until it's complete and the web UI moves to the next step.
Hey all. Quick question on Fluent syntax. I had thought I had this down, but I'm getting a weird failure. Basically, I have a hierarchical kind of structure that I'm trying to persist, and it all seems to work, except when I do an actual integration test w/ the db.
I have a Node object which has a Parent property, which is another Node, and a _children field backing a readonly Children property, which is a collection of Nodes as well.
The properties handle correlating the relationships, and the in-memory objects test out just fine. When I retrieve them from the repository (an in-memory SQLite db in my tests), though, any Node's Children include itself for some reason. Any ideas?
My mappings are mostly done w/ AutoMap, but I've overridden the following:
mapping.References(x => x.Parent);
mapping.HasMany(x => x.Children).Inverse().Access.LowerCaseField(Prefix.Underscore);
I've also tried it w/o the Inverse() call.
Got it. The problem was that I needed to tell the children collection what Id field to hook into for the foreign key.
I changed that mapping to look like so:
mapping.HasMany(m => m.Children)
    .Inverse()
    .KeyColumn("ParentId")
    .Access.CamelCaseField(Prefix.Underscore)
    .Cascade.All();
I've just started using NHibernate, and I have some issues that I'm unsure how to solve correctly.
I started out creating a generic repository containing CUD and a couple of search methods. Each of these methods opens a separate session (and transaction if necessary) during the DB operation(s). The problem when doing this (as far as I can tell) is that I can't take advantage of lazy loading of related collections/objects.
As almost every entity relation has .Not.LazyLoad() in the fluent mapping, it results in the entire database being loaded when I request a list of all entities of a given type.
Correct me if I'm wrong, 'cause I'm still a complete newbie when it comes to NHibernate :)
What is most common to do to avoid this? Have one global static session that remains alive as long as the program runs, or what should I do?
Some of the repository code:
public T GetById(int id)
{
    using (var session = NHibernateHelper.OpenSession())
    {
        return session.Get<T>(id);
    }
}
Using the repository to get a Person
var person = m_PersonRepository.GetById(1); // works fine
var contactInfo = person.ContactInfo; // Throws exception with message:
// failed to lazily initialize a collection, no session or session was closed
Your question actually boils down to object caching and reuse. If you load a Foo object from one session, then can you keep hold of it and then at a later point in time lazy load its Bar property?
Each ISession instance is designed to represent a unit of work, and comes with a first level cache that will allow you to retrieve an object multiple times within that unit of work yet only have a single database hit. It is not thread-safe, and should definitely not be used as a static object in a WinForms application.
If you want to use an object when the session under which it was loaded has been disposed, then you need to associate it with a new session using Session.SaveOrUpdate(object) or Session.Update(object).
You can find all of this explained in chapter 10 of the Hibernate documentation.
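For example, reattaching a detached instance to a new session before touching its lazy collection (a sketch; `sessionFactory` and the Person entity come from the surrounding discussion):

```csharp
// person was loaded under a session that has since been disposed
using (var session = sessionFactory.OpenSession())
{
    session.Update(person);               // reattach the detached instance
    var contactInfo = person.ContactInfo; // lazy load now has a live session
}
```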
If this seems inefficient, then look into second-level caching. This is provided at ISessionFactory level - your session factory can be static, and if you enable second-level caching this will effectively build an in-memory cache of much of your data. Second-level caching is only appropriate if there is no underlying service updating your data - if all database updates go via NHibernate, then it is safe.
Edit in light of code posted
Your session usage is at the wrong level - you are using it for a single database get, rather than a unit of work. In this case, your GetById method should take in a session which it uses, and the session instance should be managed at a higher level. Alternatively, your PersonRepository class should manage the session if you prefer, and you should instantiate and dispose an object of this type for each unit of work.
public T GetById(int id)
{
    return m_session.Get<T>(id);
}
using (var repository = new PersonRepository())
{
    var person = repository.GetById(1);
    var contactInfo = person.ContactInfo;
} // make sure the repository Dispose method disposes the session.
The error message you are getting is because there is no longer a session to use to lazy load the collection - you've already disposed it.
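A minimal sketch of such a repository (assuming the NHibernateHelper from the question; the names are illustrative):

```csharp
public class PersonRepository : IDisposable
{
    private readonly ISession m_session;

    public PersonRepository()
    {
        // One session per unit of work, owned by the repository.
        m_session = NHibernateHelper.OpenSession();
    }

    public Person GetById(int id)
    {
        return m_session.Get<Person>(id);
    }

    public void Dispose()
    {
        // Dispose the session only once the whole unit of work is finished,
        // so lazy loading still works while the repository is in scope.
        m_session.Dispose();
    }
}
```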