NHibernate Clear Cache for a Specific Region

I am trying to manually clear the second-level cache for a specific region. I found the method posted in answer to this question. While this works to clear my entities, for some reason the query cache is not getting cleared. This results in a separate query for each entity the next time the entities are retrieved from the database. It does work when I call sessionFactory.EvictQueries() without any parameters; it only fails when I pass in a specific region name. Any ideas as to what is going wrong?
Code is from the above link:
private void ClearRegion(string regionName)
{
    _sessionFactory.EvictQueries(regionName);

    foreach (var collectionMetaData in _sessionFactory.GetAllCollectionMetadata().Values)
    {
        var collectionPersister = collectionMetaData as NHibernate.Persister.Collection.ICollectionPersister;
        if (collectionPersister != null)
        {
            if ((collectionPersister.Cache != null) && (collectionPersister.Cache.RegionName == regionName))
            {
                _sessionFactory.EvictCollection(collectionPersister.Role);
            }
        }
    }

    foreach (var classMetaData in _sessionFactory.GetAllClassMetadata().Values)
    {
        var entityPersister = classMetaData as NHibernate.Persister.Entity.IEntityPersister;
        if (entityPersister != null)
        {
            if ((entityPersister.Cache != null) && (entityPersister.Cache.RegionName == regionName))
            {
                _sessionFactory.EvictEntity(entityPersister.EntityName);
            }
        }
    }
}
Caching is working and verified using NHProfiler.

OK, so I figured out my issue. I did not realize that it is necessary to specify a cache region when querying the data, in addition to specifying it in the entity mapping. After adding .CacheRegion("regionName") to my queries everything works. By not adding the region when querying, the results were going into the query cache without a region name, which is why it worked when I called .EvictQueries() without a region name parameter.
To sum it up, it is necessary to add the region name both when mapping the entities (.Region("regionName") when using Fluent) and when querying with ISession using .CacheRegion("regionName").
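For reference, a minimal sketch of that combination, assuming a hypothetical Product entity mapped with Fluent NHibernate and queried with QueryOver:
// Mapping: put the entity's second-level cache entries in a named region
public class ProductMap : ClassMap<Product>
{
    public ProductMap()
    {
        Cache.ReadWrite().Region("regionName");
        Id(x => x.Id);
        Map(x => x.Name);
    }
}

// Query: mark the query cacheable AND give it the same region, otherwise
// its results land in the default query cache region
var products = session.QueryOver<Product>()
    .Cacheable()
    .CacheRegion("regionName")
    .List();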
Thank you for your responses.

Related

Updating complex type with EF Code First

I have a complex type called account, which contains a list of licenses.
Licenses in turn contains a list of domains (a domain is a simple id + url string).
In my repository I have this code
public void SaveLicense(int accountId, License item)
{
    Account account = GetById(accountId);
    if (account == null)
    {
        return;
    }
    if (item.Id == 0)
    {
        account.Licenses.Add(item);
    }
    else
    {
        ActiveContext.Entry(item).State = EntityState.Modified;
    }
    ActiveContext.SaveChanges();
}
When I try to save an updated License (with modified domains) what happens is that strings belonging straight to the license get updated just fine.
However no domains get updated.
I should mention that what I have done is allow the user to add and remove domains in the user interface. Any new domains get id=0 and any deleted domains are simply not in the list.
So what I want is:
Any domains that are in the list and database and NOT changed - nothing happens
Any domains that are in the list and database, but changed in the list - database gets updated
Any domains with id=0 should be inserted (added) into database
Any domains NOT in the list but that are in the database should be removed
I have played around with it a bit without success, but I have a sneaking suspicion that I am doing something wrong in the bigger picture, so I would love tips on whether I am misunderstanding something design-wise or have simply missed something.
Unfortunately updating object graphs - entities with other related entities - is a rather difficult task and there is no very sophisticated support from Entity Framework to make it easy.
The problem is that setting the state of an entity to Modified (or generally to any other state) only influences the entity that you pass into DbContext.Entry and only its scalar properties. It has no effect on its navigation properties and related entities.
You must handle this object graph update manually by loading the entity that is currently stored in the database including the related entities and by merging all changes you have done in the UI into that original graph. Your else case could then look like this:
//...
else
{
    var licenseInDb = ActiveContext.Licenses.Include(l => l.Domains)
        .SingleOrDefault(l => l.Id == item.Id);
    if (licenseInDb != null)
    {
        // Update the license (only its scalar properties)
        ActiveContext.Entry(licenseInDb).CurrentValues.SetValues(item);

        // Delete domains from DB that have been deleted in UI
        foreach (var domainInDb in licenseInDb.Domains.ToList())
            if (!item.Domains.Any(d => d.Id == domainInDb.Id))
                ActiveContext.Domains.Remove(domainInDb);

        foreach (var domain in item.Domains)
        {
            var domainInDb = licenseInDb.Domains
                .SingleOrDefault(d => d.Id == domain.Id);
            if (domainInDb != null)
                // Update existing domains
                ActiveContext.Entry(domainInDb).CurrentValues.SetValues(domain);
            else
                // Insert new domains
                licenseInDb.Domains.Add(domain);
        }
    }
}
ActiveContext.SaveChanges();
//...
You can also try out this project called "GraphDiff" which intends to do this work in a generic way for arbitrary detached object graphs.
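If you try GraphDiff, the usage roughly comes down to a single UpdateGraph call; a sketch, assuming the RefactorThis.GraphDiff NuGet package and the License/Domains types from the question:
using RefactorThis.GraphDiff;
//...
else
{
    // Declare Domains as an owned collection so GraphDiff adds, updates
    // and removes child rows to match the detached graph
    ActiveContext.UpdateGraph(item, map => map.OwnedCollection(l => l.Domains));
}
ActiveContext.SaveChanges();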
The alternative is to track all changes in some custom fields in the UI layer and then evaluate the tracked state changes when the data gets posted back, setting the appropriate entity states. Because you are in a web application it basically means that you have to track changes in the browser (most likely requiring some JavaScript) while the user changes values, adds new items or deletes items. In my opinion this solution is even more difficult to implement.
This should be enough to do what you are looking to do. Let me know if you have more questions about the code.
public void SaveLicense(License item)
{
    if (item.Id == 0)
    {
        // New license: just add it
        context.Licenses.Add(item);
    }
    else if (item.Id > 0)
    {
        // Existing license: copy the scalar values onto the tracked entity
        var currentItem = context.Licenses
            .Single(t => t.Id == item.Id);
        context.Entry(currentItem).CurrentValues.SetValues(item);
    }
    context.SaveChanges();
}

Timeout expired while querying through criteria using NHibernate

I am using criteria to query the database based on the unique key. But I am running into a weird scenario: after two or three queries, it starts giving me a timeout expired error.
using (NHibernate.ISession session = m_SessionFactory.OpenSession())
{
    using (ITransaction transaction = session.BeginTransaction())
    {
        if (cashActivity.ActivityState == ApplicationConstants.TaxLotState.Deleted || cashActivity.ActivityState == ApplicationConstants.TaxLotState.Updated)
        {
            IList<CashActivity> lsCActivity = RetrieveEquals<CashActivity>("UniqueKey", cashActivity.UniqueKey);
            if (lsCActivity != null && lsCActivity.Count > 0)
                cashActivity.CashActivityID = lsCActivity[0].CashActivityID;
        }
        if (cashActivity.ActivityState == ApplicationConstants.TaxLotState.Deleted)
        {
            session.Delete(cashActivity);
        }
        else
            session.SaveOrUpdate(cashActivity);
    }
}
public IList<T> RetrieveEquals<T>(string propertyName, object propertyValue)
{
    using (ISession session = m_SessionFactory.OpenSession())
    {
        ICriteria criteria = session.CreateCriteria(typeof(T));
        criteria.Add(Restrictions.Eq(propertyName, propertyValue));
        IList<T> matchingObjects = criteria.List<T>();
        return matchingObjects;
    }
}
I made changes to the code and started using a stateless session, but that change only reduced the frequency of the timeout error.
After debugging, I found that IList<T> matchingObjects = criteria.List<T>(); is the cause of the exception. But this returns only one value, so it should not result in a timeout error, since the table doesn't contain more than 100 rows as of now. Any suggestions?
Unless you have wrapped NHibernate's ISessionFactory in something else, each call to OpenSession() will yield a new session. So the above code involves multiple sessions and it isn't clear if this is required.
Theoretically, a query on the session in RetrieveEquals() could block because of locks taken on the connection used in the calling method. But given the code as shown I can't see anything to prove this.
The calling method first updates a property of cashActivity, then in some cases goes on to delete the object. And there is no Commit(). This seems strange - is this really the code in use, or might there be a copy/paste error?
You also say "after two or three queries"... do you imply that there is a loop somewhere which isn't shown?
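If the separate session is not actually needed, a minimal sketch of a single-session variant (keeping the names from the question and committing the transaction explicitly) could look like this:
using (NHibernate.ISession session = m_SessionFactory.OpenSession())
using (ITransaction transaction = session.BeginTransaction())
{
    if (cashActivity.ActivityState == ApplicationConstants.TaxLotState.Deleted || cashActivity.ActivityState == ApplicationConstants.TaxLotState.Updated)
    {
        // Look up the existing row on the same session instead of opening a second one
        IList<CashActivity> existing = session.CreateCriteria(typeof(CashActivity))
            .Add(Restrictions.Eq("UniqueKey", cashActivity.UniqueKey))
            .List<CashActivity>();
        if (existing.Count > 0)
            cashActivity.CashActivityID = existing[0].CashActivityID;
    }

    if (cashActivity.ActivityState == ApplicationConstants.TaxLotState.Deleted)
        session.Delete(cashActivity);
    else
        session.SaveOrUpdate(cashActivity);

    // Without an explicit Commit the transaction is rolled back when it is disposed
    transaction.Commit();
}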

NHibernate second-level caching - evicting regions

We have a number of cache regions set up in our NHibernate implementation. In order to avoid trouble with load-balanced web servers, I want to effectively disable the caching on the pages that edit the cached data. I can write a method that clears out all my query caches, my class caches and my entity caches easily enough.
But what I really want is to clear the cache by region. sessionFactory.EvictQueries() will take a region parameter, but Evict() and EvictCollection() do not. I don't really want to throw away the whole cache here, nor do I want to maintain some sort of clumsy dictionary associating types with their cache regions. Does NHibernate have a way to ask an entity or collection what its caching settings are?
Thanks.
I've just done the same thing. For everyone's benefit, here is the method I constructed:
public void ClearCache(string regionName)
{
    // Use your favourite IOC to get to the session factory
    var sessionFactory = ObjectFactory.GetInstance<ISessionFactory>();

    sessionFactory.EvictQueries(regionName);

    foreach (var collectionMetaData in sessionFactory.GetAllCollectionMetadata().Values)
    {
        var collectionPersister = collectionMetaData as NHibernate.Persister.Collection.ICollectionPersister;
        if (collectionPersister != null)
        {
            if ((collectionPersister.Cache != null) && (collectionPersister.Cache.RegionName == regionName))
            {
                sessionFactory.EvictCollection(collectionPersister.Role);
            }
        }
    }

    foreach (var classMetaData in sessionFactory.GetAllClassMetadata().Values)
    {
        var entityPersister = classMetaData as NHibernate.Persister.Entity.IEntityPersister;
        if (entityPersister != null)
        {
            if ((entityPersister.Cache != null) && (entityPersister.Cache.RegionName == regionName))
            {
                sessionFactory.EvictEntity(entityPersister.EntityName);
            }
        }
    }
}
OK, looks like I've answered my own question. The default interface that's returned when you pull out the NHibernate metadata doesn't provide information on caching; however, if you dig around in the implementations of it, it does. A bit clumsy, but it does the job.
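As a usage illustration (the "Products" region, repository and entity names here are placeholders), an edit page would save and then drop only the affected region:
public void UpdateProduct(Product product)
{
    _repository.Save(product);
    // Evict only the region that caches product data; other regions stay warm
    ClearCache("Products");
}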

Given a list of objects, using C# push them to RavenDB without knowing which ones already exist

Given 1000 documents with a complex data structure, e.g. a Car class that has three properties: Make, Model and an Id property.
What is the most efficient way in C# to push these documents to RavenDB (preferably in a batch) without having to query the Raven collection individually to find which to update and which to insert? At the moment I am going about it like so, which is totally inefficient.
note : _session is a wrapper on the IDocumentSession where Commit calls SaveChanges and Add calls Store.
private void PublishSalesToRaven(IEnumerable<Sale> sales)
{
    var page = 0;
    const int total = 30;
    do
    {
        var paged = sales.Skip(page * total).Take(total);
        if (!paged.Any()) return;
        foreach (var sale in paged)
        {
            var current = sale;
            var existing = _session.Query<Sale>().FirstOrDefault(s => s.Id == current.Id);
            if (existing != null)
                existing = current;
            else
                _session.Add(current);
        }
        _session.Commit();
        page++;
    } while (true);
}
Your session code doesn't seem to track with the RavenDB API (we don't have Add or Commit).
Here is how you do this in RavenDB
private void PublishSalesToRaven(IEnumerable<Sale> sales)
{
    foreach (var sale in sales)
    {
        session.Store(sale);
    }
    session.SaveChanges();
}
Your code sample doesn't work at all. The main problem is that you cannot just switch out the references and expect RavenDB to recognize that:
if (existing != null)
    existing = current;
Instead you have to update each property one-by-one:
existing.Model = current.Model;
existing.Make = current.Make;
This is the way you can facilitate change-tracking in RavenDB and many other frameworks (e.g. NHibernate). If you want to avoid writing this uninteresting piece of code I recommend using AutoMapper:
existing = Mapper.Map<Sale>(current, existing);
Another problem with your code is that you use Session.Query where you should use Session.Load. Remember: If you query for a document by its id, you will always want to use Load!
The main difference is that one uses the local cache and the other does not (the same applies to the equivalent NHibernate methods).
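A rough sketch of the difference, using the Sale type from the question (here session is the raw IDocumentSession, and the exact Load overload depends on how your document ids are typed):
// Query goes through an index and does not consult the session's identity map
var viaQuery = session.Query<Sale>().FirstOrDefault(s => s.Id == current.Id);

// Load fetches by document id and is answered from the session cache
// when that document has already been loaded in this session
var viaLoad = session.Load<Sale>(current.Id);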
Ok, so now I can answer your question:
If I understand you correctly you want to save a bunch of Sale-instances to your database while they should either be added if they didn't exist or updated if they existed. Right?
One way is to correct your sample code with the hints above and let it work. However that will issue one unnecessary request (Session.Load(existingId)) for each iteration. You can easily avoid that if you setup an index that selects all the Ids of all documents inside your Sales-collection. Before you then loop through your items you can load all the existing Ids.
However, I would like to know what you actually want to do. What is your domain/use-case?
This is what works for me right now. Note: the InjectFrom method comes from Omu.ValueInjecter (NuGet package).
private void PublishSalesToRaven(IEnumerable<Sale> sales)
{
    var ids = sales.Select(i => i.Id);
    var existingSales = _ravenSession.Load<Sale>(ids);
    existingSales.ForEach(s => s.InjectFrom(sales.Single(i => i.Id == s.Id)));
    var existingIds = existingSales.Select(i => i.Id);
    var nonExistingSales = sales.Where(i => !existingIds.Any(x => x == i.Id));
    nonExistingSales.ForEach(i => _ravenSession.Store(i));
    _ravenSession.SaveChanges();
}

An NHibernate audit trail that doesn't cause "collection was not processed by flush" errors

Ayende has an article about how to implement a simple audit trail for NHibernate (here) using event handlers.
Unfortunately, as can be seen in the comments, his implementation causes the following exception to be thrown: collection xxx was not processed by flush()
The problem appears to be the implicit call to ToString on the dirty properties, which can cause trouble if the dirty property is also a mapped entity.
I have tried my hardest to build a working implementation but with no luck.
Does anyone know of a working solution?
I was able to solve the same problem using the following workaround: set the processed flag to true on all collections in the current persistence context within the listener.
public void OnPostUpdate(PostUpdateEvent postEvent)
{
    if (IsAuditable(postEvent.Entity))
    {
        //skip application specific code
        foreach (var collection in postEvent.Session.PersistenceContext.CollectionEntries.Values)
        {
            var collectionEntry = collection as CollectionEntry;
            collectionEntry.IsProcessed = true;
        }
        //var session = postEvent.Session.GetSession(EntityMode.Poco);
        //session.Save(auditTrailEntry);
        //session.Flush();
    }
}
Hope this helps.
The fix should be the following. Create a new event listener class and derive it from NHibernate.Event.Default.DefaultFlushEventListener:
[Serializable]
public class FixedDefaultFlushEventListener : DefaultFlushEventListener
{
    private static readonly log4net.ILog log = log4net.LogManager.GetLogger(System.Reflection.MethodBase.GetCurrentMethod().DeclaringType);

    protected override void PerformExecutions(IEventSource session)
    {
        if (log.IsDebugEnabled)
        {
            log.Debug("executing flush");
        }
        try
        {
            session.ConnectionManager.FlushBeginning();
            session.PersistenceContext.Flushing = true;
            session.ActionQueue.PrepareActions();
            session.ActionQueue.ExecuteActions();
        }
        catch (HibernateException exception)
        {
            if (log.IsErrorEnabled)
            {
                log.Error("Could not synchronize database state with session", exception);
            }
            throw;
        }
        finally
        {
            session.PersistenceContext.Flushing = false;
            session.ConnectionManager.FlushEnding();
        }
    }
}
Register it during NHibernate configuration:
cfg.EventListeners.FlushEventListeners = new IFlushEventListener[] { new FixedDefaultFlushEventListener() };
You can read more about this bug in Hibernate JIRA:
https://hibernate.onjira.com/browse/HHH-2763
The next release of NHibernate should include that fix as well.
This is not easy at all. I wrote something like this, but it is very specific to our needs and not trivial.
Some additional hints:
You can test if references are loaded using
NHibernateUtil.IsInitialized(entity)
or
NHibernateUtil.IsPropertyInitialized(entity, propertyName)
You can cast collections to IPersistentCollection. I implemented an IInterceptor where I get the NHibernate Type of each property; I don't know where you can get this when using events:
if (nhtype.IsCollectionType)
{
    var collection = previousValue as NHibernate.Collection.IPersistentCollection;
    if (collection != null)
    {
        // just skip uninitialized collections
        if (!collection.WasInitialized)
        {
            // skip
        }
        else
        {
            // read collections previous values
            previousValue = collection.StoredSnapshot;
        }
    }
}
When you get the update event from NHibernate, the instance is initialized. You can safely access properties of primitive types. When you want to use ToString, make sure that your ToString implementation doesn't access any referenced entities or any collections.
You may use NHibernate meta-data to find out if a type is mapped as an entity or not. This could be useful to navigate in your object model. When you reference another entity, you will get additional update events on this when it changed.
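For example, a ToString that is safe to call from the audit listener sticks to primitive, already-loaded members (the Order properties here are placeholders):
public override string ToString()
{
    // Only scalar properties: no lazy references, no collections, so the
    // audit listener cannot trigger extra loads during the flush
    return string.Format("Order #{0} ({1})", Id, Status);
}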
I was able to determine that this error is thrown when application code loads a lazy property where the entity has a collection.
My first attempt involved watching for new CollectionEntries (which I never want to process, as there shouldn't actually be any changes) and then marking them as IsProcessed = true so they wouldn't cause problems.
var collections = args.Session.PersistenceContext.CollectionEntries;
var collectionKeys = args.Session.PersistenceContext.CollectionEntries.Keys;
var roundCollectionKeys = collectionKeys.Cast<object>().ToList();
var collectionValuesCount = collectionKeys.Count;

// Application code that loads a lazy property where the entity has a collection

var postCollectionKeys = collectionKeys.Cast<object>().ToList();
var newLength = postCollectionKeys.Count;
if (newLength != collectionValuesCount)
{
    foreach (var newKey in postCollectionKeys.Except(roundCollectionKeys))
    {
        var collectionEntry = (CollectionEntry)collections[newKey];
        collectionEntry.IsProcessed = true;
    }
}
However this didn't entirely solve the issue. In some cases I'd still get the exception.
When OnPostUpdate is called the values in the CollectionEntries dictionary should all already be set to IsProcessed = true. So I decided to do an extra check to see if the collections not processed matched what I expected.
var valuesNotProcessed = collections.Values.Cast<CollectionEntry>().Where(x => !x.IsProcessed).ToList();
if (valuesNotProcessed.Any()) {
    // Assert: valuesNotProcessed.Count() == (newLength - collectionValuesCount)
}
In the cases that my first attempt fixed these numbers would match exactly. However in the cases where it didn't work there were extra items alreay in the dictionary. In my I could be sure these extra items also wouldn't result in updates so I could just set IsProcessed = true for all the valuesNotProcessed.