LINQ to SQL Attach, Update Check set to Never, but still concurrency conflicts

In the DBML designer I've set Update Check to Never on all properties, but I still get an exception when calling Attach: "An attempt has been made to Attach or Add an entity that is not new, perhaps having been loaded from another DataContext. This is not supported." This approach seems to have worked for others on here, so there must be something I've missed.
Member test;
using (TheDataContext dc = new TheDataContext())
{
    test = dc.Members.FirstOrDefault(m => m.fltId == 1);
}

test.Name = "test2";

using (TheDataContext dc = new TheDataContext())
{
    dc.Members.Attach(test, true);
    dc.SubmitChanges();
}

The error message says exactly what is going wrong: you are trying to attach an object that was loaded from another DataContext, in your case from another instance of the DataContext. Don't dispose your DataContext (it gets disposed at the end of the using statement) before you change values and submit the changes; do everything in one using statement. I also see that you attach the object to the Members collection again, but it is already in there, so there is no need to do that. This should work just as well:
using (TheDataContext dc = new TheDataContext())
{
    var test = dc.Members.FirstOrDefault(m => m.fltId == 1);
    test.Name = "test2";
    dc.SubmitChanges();
}
Just change the value and submit the changes.
Latest update:
(Removed all three previous updates.)
My previous solution (now removed from this post), found here, is dangerous. I just read this in an MSDN article:
"Only call the Attach methods on new
or deserialized entities. The only way
for an entity to be detached from its
original data context is for it to be
serialized. If you try to attach an
undetached entity to a new data
context, and that entity still has
deferred loaders from its previous
data context, LINQ to SQL will thrown
an exception. An entity with deferred
loaders from two different data
contexts could cause unwanted results
when you perform insert, update, and
delete operations on that entity. For
more information about deferred
loaders, see Deferred versus Immediate
Loading (LINQ to SQL)."
Use this instead:
// Get the object the first time by some id
using (TheDataContext dc = new TheDataContext())
{
    test = dc.Members.FirstOrDefault(m => m.fltId == 1);
}

// Somewhere else in the program
test.Name = "test2";

// Again somewhere else
using (TheDataContext dc = new TheDataContext())
{
    // Create a copy with the values of the 'test' object
    Member modifiedMember = new Member()
    {
        Id = test.Id,
        Name = test.Name,
        Field2 = test.Field2,
        Field3 = test.Field3,
        Field4 = test.Field4
    };
    dc.Members.Attach(modifiedMember, true);
    dc.SubmitChanges();
}
After copying the object, all references are detached and none of the event handlers (deferred loading from the DB) are connected to the new object. Only the value fields are copied to the new object, which can now be safely attached to the Members table. As a bonus, you do not have to query the DB a second time with this solution.
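If you need this copy-then-attach pattern for more than one entity type, the manual copy can be generalized with reflection. Below is only a rough sketch of that idea (the CloneForAttach name is made up; it assumes a parameterless constructor and that copying the writable value-type and string properties is enough for your mapping):
// Requires System.Linq and System.Reflection.
static T CloneForAttach<T>(T source) where T : new()
{
    var copy = new T();
    var scalarProps = typeof(T).GetProperties()
        .Where(p => p.CanRead && p.CanWrite &&
                    (p.PropertyType.IsValueType || p.PropertyType == typeof(string)));
    foreach (var p in scalarProps)
    {
        // Copy only plain column values; associations and EntitySet<T>
        // collections are skipped, so no deferred loaders are carried over.
        p.SetValue(copy, p.GetValue(source, null), null);
    }
    return copy;
}

// Usage sketch:
// using (var dc = new TheDataContext())
// {
//     dc.Members.Attach(CloneForAttach(test), true);
//     dc.SubmitChanges();
// }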

It is possible to attach entities from another DataContext.
The only thing that needs to be added to the code in the first post is this:
dc.DeferredLoadingEnabled = false;
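For completeness, a minimal sketch of how that fits the code from the first post (assuming the same Member entity and TheDataContext):
using (TheDataContext dc = new TheDataContext())
{
    dc.DeferredLoadingEnabled = false;  // must be set before the context is first used
    dc.Members.Attach(test, true);      // attach as modified
    dc.SubmitChanges();
}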
But this is a drawback, since deferred loading is very useful. I read elsewhere on this page that another solution would be to set Update Check on all properties to Never; this article says the same: http://complexitykills.blogspot.com/2008/03/disconnected-linq-to-sql-tips-part-1.html
But I can't get it to work, even after setting Update Check to Never.

This is a function in my repository class which I use to update entities:
protected void Attach(TEntity entity)
{
    try
    {
        _dataContext.GetTable<TEntity>().Attach(entity);
        _dataContext.Refresh(RefreshMode.KeepCurrentValues, entity);
    }
    catch (DuplicateKeyException) // Data context already knows about this entity, so just update the values
    {
        _dataContext.Refresh(RefreshMode.KeepCurrentValues, entity);
    }
}
Here TEntity is your DB class, and depending on your setup you might just want to do:
_dataContext.GetTable<TEntity>().Attach(entity);
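For illustration, a hedged sketch of how this might be called from an update method in the same repository (the Update name is made up; _dataContext comes from the code above):
public void Update(TEntity entity)
{
    Attach(entity);                // attach the detached entity, keeping its current values
    _dataContext.SubmitChanges();  // persist the changes
}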

Related

Update Document with external object

I have a database containing Song objects. The Song class has more than 30 properties.
My music-tagging application makes changes to a song on the file system.
It then does a lookup in the database using the filename.
Now I have a Song object which I created in my tagging application by reading the physical file, and a Song object which I have just retrieved from the database and which I want to update.
I thought I could just grab the ID from the database object, replace the database object with my local Song object, set the saved ID and store it.
But Raven claims that I am replacing the object with a different object.
Do I really need to copy every single property over, like this?
dbSong.Artist = songFromFileSystem.Artist;
dbSong.Album = songFromFileSystem.Album;
Or are there other possibilities?
Thanks,
Helmut
Edit:
I was a bit too optimistic. The suggestion below works only in a test program.
When doing it in my original code I get the following exception:
Attempted to associate a different object with id 'TrackDatas/3452'
This is produced by the following code:
try
{
    originalFileName = Util.EscapeDatabaseQuery(originalFileName);

    // Lookup the track in the database
    var dbTracks = _session.Advanced.DocumentQuery<TrackData, DefaultSearchIndex>()
        .WhereEquals("Query", originalFileName)
        .ToList();

    if (dbTracks.Count > 0)
    {
        track.Id = dbTracks[0].Id;
        _session.Store(track);
        _session.SaveChanges();
    }
}
catch (Exception ex)
{
    log.Error("UpdateTrack: Error updating track in database {0}: {1}", ex.Message, ex.InnerException);
}
I am first looking up a song in the database and get a TrackData object in dbTracks.
The track object is also of type TrackData; I just take the ID from the object I retrieved and try to store it, which gives the above error.
I would have thought the message means that the objects are of different types, but they aren't.
The same error happens if I use AutoMapper.
Any idea?
You can do what you're trying: replace an existing object using just the ID. If it's not working, you might be doing something else wrong. (In which case, please show us your code.)
When it comes to updating existing objects in Raven, there are a few options:
Option 1: Just save the object using the same ID as an existing object:
var song = ... // load it from the file system or whatever
song.Id = "Songs/5"; // Set it to an existing song ID
DbSession.Store(song); // Overwrites the existing song
Option 2: Manually update the properties of the existing object.
var song = ...;
var existingSong = DbSession.Load<Song>("Songs/5");
existingSong.Artist = song.Artist;
existingSong.Album = song.Album;
Option 3: Dynamically update the existing object:
var song = ...;
var existingSong = DbSession.Load<Song>("Songs/5");
existingSong.CopyFrom(song);
Where you've got some code like this:
// Inside Song.cs
public virtual void CopyFrom(Song other)
{
    var props = typeof(Song)
        .GetProperties(System.Reflection.BindingFlags.Public | System.Reflection.BindingFlags.Instance)
        .Where(p => p.CanWrite);
    foreach (var prop in props)
    {
        var source = prop.GetValue(other);
        prop.SetValue(this, source);
    }
}
If you find yourself having to do this often, use a library like AutoMapper.
AutoMapper can automatically copy one object to another with a single line of code.
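For example, a minimal AutoMapper sketch (assuming the Song type from the question and a recent AutoMapper version with the instance-based API):
var config = new MapperConfiguration(cfg => cfg.CreateMap<Song, Song>());
var mapper = config.CreateMapper();

// Copies all matching properties from the file-system song onto the tracked document.
mapper.Map(song, existingSong);
DbSession.SaveChanges();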
Now that you've posted some code, I see 2 things:
First, is there a reason you're using the Advanced.DocumentQuery syntax?
// This is advanced query syntax. Is there a reason you're using it?
var dbTracks = _session.Advanced.DocumentQuery<TrackData, DefaultSearchIndex>().WhereEquals("Query", originalFileName).ToList();
Here's how I'd write your code using standard LINQ syntax:
var escapedFileName = Util.EscapeDatabaseQuery(originalFileName);

// Find the ID of the existing track in the database.
var existingTrackId = _session.Query<TrackData, DefaultSearchIndex>()
    .Where(t => t.Query == escapedFileName)
    .Select(t => t.Id)
    .FirstOrDefault();

if (existingTrackId != null)
{
    track.Id = existingTrackId;
    _session.Store(track);
    _session.SaveChanges();
}
Finally, #2: what is track? Was it loaded via session.Load or session.Query? If so, that's not going to work, and it's causing your problem. If track is loaded from the database, you'll need to create a new object and save that:
var escapedFileName = Util.EscapeDatabaseQuery(originalFileName);

// Find the ID of the existing track in the database.
var existingTrackId = _session.Query<TrackData, DefaultSearchIndex>()
    .Where(t => t.Query == escapedFileName)
    .Select(t => t.Id)
    .FirstOrDefault();

if (existingTrackId != null)
{
    var newTrack = new Track(...);
    newTrack.Id = existingTrackId;
    _session.Store(newTrack);
    _session.SaveChanges();
}
This means you already have a different object in the session with the same id. The fix for me was to use a new session.
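A rough sketch of that fix, assuming documentStore is your IDocumentStore and existingTrackId was found as in the snippets above:
using (var freshSession = documentStore.OpenSession())
{
    track.Id = existingTrackId;
    freshSession.Store(track);   // the fresh session has no competing tracked instance for this id
    freshSession.SaveChanges();
}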

NHibernate - Handling StaleObjectStateException to always commit client changes - Need advice/recommendation

I am trying to find the best way to handle this exception and force client changes to overwrite any other changes that caused the conflict. The approach I came up with is to wrap the call to Session.Transaction.Commit() in a loop. Inside the loop I use a try-catch block and handle each stale object individually: copy its properties (except the row-version property), refresh the object to get the latest DB data, recopy the original values onto the refreshed object, and then merge. After that I commit again, and if another StaleObjectStateException occurs the same handling applies. The loop keeps running until all conflicts are resolved.
This method is part of a UnitOfWork class. To make it clearer I'll post my code:
// 'Client wins' rule: any conflict found will always cause client changes to
// overwrite anything else.
public void CommitAndRefresh()
{
    bool saveFailed;
    do
    {
        try
        {
            _session.Transaction.Commit();
            _session.BeginTransaction();
            saveFailed = false;
        }
        catch (StaleObjectStateException ex)
        {
            saveFailed = true;

            // Get the stale object with the client changes
            var staleObject = _session.Get(ex.EntityName, ex.Identifier);

            // Extract the row-version property name
            IClassMetadata meta = _sessionFactory.GetClassMetadata(ex.EntityName);
            string rowVersionPropertyName = meta.PropertyNames[meta.VersionProperty];

            // Store all property values from the client changes
            var propertyValues = new Dictionary<string, object>();
            var publicProperties = staleObject.GetType().GetProperties();
            foreach (var p in publicProperties)
            {
                if (p.Name != rowVersionPropertyName)
                {
                    propertyValues.Add(p.Name, p.GetValue(staleObject, null));
                }
            }

            // Get the latest data for the stale object from the database
            _session.Refresh(staleObject);

            // Reapply the original client changes, except for the row version
            foreach (var p in publicProperties)
            {
                if (p.Name != rowVersionPropertyName)
                {
                    p.SetValue(staleObject, propertyValues[p.Name], null);
                }
            }

            // Merge
            _session.Merge(staleObject);
        }
    } while (saveFailed);
}
The above code works fine and handles concurrency with the client-wins rule. However, I was wondering whether NHibernate has any built-in capability to do this for me, or whether there is a better way to handle it.
Thanks in advance,
What you're describing is a lack of concurrency checking. If you don't use a concurrency strategy (optimistic-lock, version or pessimistic), a StaleObjectStateException will not be thrown and the update will simply be issued.
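For reference, a hedged sketch of enabling version-based optimistic concurrency with a Fluent NHibernate ClassMap (the entity and property names are illustrative, not from your code):
// using FluentNHibernate.Mapping;
public class OrderMap : ClassMap<Order>
{
    public OrderMap()
    {
        Id(x => x.Id);
        Version(x => x.Version);   // NHibernate checks and increments this column on every update
        Map(x => x.Description);
    }
}
With a mapping like this, a conflicting concurrent update makes the later flush/commit throw StaleObjectStateException, which is what your loop then handles.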
Okay, now I understand your use case. One important point is that the ISession should be discarded after an exception is thrown. You can use ISession.Merge to merge changes between a detached and a persistent object rather than doing it yourself. Unfortunately, Merge does not cascade to child objects, so you still need to walk the object graph yourself. The implementation would look something like this:
catch (StaleObjectStateException ex)
{
    if (isPowerUser)
    {
        var newSession = GetSession();
        // Merge will automatically fetch the current state first
        newSession.Merge(staleObject);
        newSession.Flush();
    }
}

Persistence of 2 one-to-many associations with NHibernate

I have the following code, which uses the repository pattern backed by a Fluent NHibernate automapping to a MySQL DB.
The following snippet adds a user and a person to the model, and then persists them to the database.
It works, but I have to remember to set user.Person.User = user; otherwise, when the object is persisted, Person has its userId FK set to null.
Is there a way to automatically populate Person.User, or do I need to create a method to do this?
Is there a way to persist all objects associated with that user (i.e. call repository.Save(obj) on one object and have the others persisted as well)?
using (MySQLRepositoryBase repository = new MySQLRepositoryBase())
{
    try
    {
        repository.BeginTransaction();

        User user = new User() { Password = "12345", Username = "jbloggs", Person = new Person() };
        user.Person.Firstname = "Joe";
        user.Person.Lastname = "Bloggs";
        user.Person.User = user;

        repository.Save(user);
        repository.Save(user.Person);
    }
    catch (Exception ex)
    {
        MessageBox.Show(ex.ToString());
        repository.RollbackTransaction();
    }
}
NHibernate can't set this automatically; you have to do it yourself.
The most common approach is an AddSomething() method on the parent entity which adds the child to the collection and sets the parent reference on the child. But if you don't need the parent reference in your code, you could just remove it (and map only the collection).
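A sketch of that pattern applied to the question's User/Person model (the method name is illustrative):
// Inside the User class:
public virtual void AttachPerson(Person person)
{
    Person = person;
    person.User = this;   // keep both ends of the association in sync
}

// Calling code:
// User user = new User { Password = "12345", Username = "jbloggs" };
// user.AttachPerson(new Person { Firstname = "Joe", Lastname = "Bloggs" });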
Use the Cascade options that are available on collections and references.
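A hedged sketch of what that could look like with an explicit Fluent NHibernate ClassMap (the question uses automapping, where the same effect is usually achieved via a convention or mapping override):
public class UserMap : ClassMap<User>
{
    public UserMap()
    {
        Id(x => x.Id);
        Map(x => x.Username);
        Map(x => x.Password);
        References(x => x.Person).Cascade.All();   // saving the User also saves its Person
    }
}
With a cascade in place, a single repository.Save(user) should be enough to persist both objects.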

How to work around NHibernate caching?

I'm new to NHibernate and was assigned a task where I have to change the value of an entity property and then compare whether this new (cached) value is different from the actual value stored in the DB. However, every attempt to retrieve the value from the DB returned the cached value. As I said, I'm new to NHibernate; maybe this is easy to do, and it obviously could be done with plain ADO.NET, but the client demands that we use NHibernate for every access to the DB. To make things clearer, these were my "successful" attempts (i.e., no errors):
Attempt 1:
DetachedCriteria criteria = DetachedCriteria.For<User>()
    .SetProjection(Projections.Distinct(Projections.Property(UserField.JobLoad)))
    .Add(Expression.Eq(UserField.Id, userid));
return GetByDetachedCriteria(criteria)[0].Id; // this is the value I want

Attempt 2:
var JobLoadId = DetachedCriteria.For<User>()
    .SetProjection(Projections.Distinct(Projections.Property(UserField.JobLoad)))
    .Add(Expression.Eq(UserField.Id, userid));
ICriteria criteria = JobLoadId.GetExecutableCriteria(NHibernateSession);
var ids = criteria.List();
return ((JobLoad)ids[0]).Id;
I hope I made myself clear; sometimes it's hard to explain a problem when even you don't quite understand the underlying framework.
Edit: Of course, this is a method body.
Edit 2: I found out that it doesn't work properly because the method call is inside a transaction context. If I remove the transaction, it works fine, but I need it to be in this context.
I do that by opening a new stateless session and getting the actual object from the database:
User databaseuser;
using (IStatelessSession session = SessionFactory.OpenStatelessSession())
{
    databaseuser = session.Get<User>("id");
}
// do your checks
Within a session, NHibernate will return the same object from its Level-1 Cache (aka Identity Map). If you need to see the current value in the database, you can open a new session and load the object in that session.
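A minimal sketch of that approach, assuming sessionFactory is your ISessionFactory and using the User/JobLoad types from the question:
var cachedJobLoad = user.JobLoad;   // the value held by the current session

JobLoad databaseJobLoad;
using (var freshSession = sessionFactory.OpenSession())
{
    databaseJobLoad = freshSession.Get<User>(user.Id).JobLoad;
}

bool hasChanged = cachedJobLoad.Id != databaseJobLoad.Id;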
I would do it like this:
public class MyObject : Entity
{
    private string myField;

    public string MyProperty
    {
        get { return myField; }
        set
        {
            if (value != myField)
            {
                myField = value;
                DoWhateverYouNeedToDoWhenItIsChanged();
            }
        }
    }
}
A quick Google for NHForge turns up this article on finding dirty properties in NHibernate:
http://nhibernate.info/doc/howto/various/finding-dirty-properties-in-nhibernate.html
This may be able to help you.
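The gist of the approach described there, as a rough sketch only (the extension-method shape and names are mine; verify the details against the article): inside a post-update event listener you can ask the entity persister which properties changed.
// Rough sketch; requires System.Linq and the NHibernate event model (PostUpdateEvent).
public static string[] GetDirtyProperties(PostUpdateEvent @event)
{
    // FindDirty compares the new state with the old state and returns the indexes
    // of the changed properties (or null if nothing changed).
    int[] dirtyIndexes = @event.Persister.FindDirty(
        @event.State, @event.OldState, @event.Entity, @event.Session);

    if (dirtyIndexes == null)
        return new string[0];

    return dirtyIndexes.Select(i => @event.Persister.PropertyNames[i]).ToArray();
}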

An NHibernate audit trail that doesn't cause "collection was not processed by flush" errors

Ayende has an article about how to implement a simple audit trail for NHibernate (here) using event handlers.
Unfortunately, as can be seen in the comments, his implementation causes the following exception to be thrown: collection xxx was not processed by flush()
The problem appears to be the implicit call to ToString on the dirty properties, which can cause trouble if the dirty property is also a mapped entity.
I have tried my hardest to build a working implementation but with no luck.
Does anyone know of a working solution?
I was able to solve the same problem using the following workaround: within the listener, set the IsProcessed flag to true on all collections in the current persistence context:
public void OnPostUpdate(PostUpdateEvent postEvent)
{
    if (IsAuditable(postEvent.Entity))
    {
        // ... application-specific code skipped ...

        foreach (var collection in postEvent.Session.PersistenceContext.CollectionEntries.Values)
        {
            var collectionEntry = collection as CollectionEntry;
            collectionEntry.IsProcessed = true;
        }

        //var session = postEvent.Session.GetSession(EntityMode.Poco);
        //session.Save(auditTrailEntry);
        //session.Flush();
    }
}
Hope this helps.
The fix should be the following. Create a new event listener class and derive it from NHibernate.Event.Default.DefaultFlushEventListener:
[Serializable]
public class FixedDefaultFlushEventListener : DefaultFlushEventListener
{
    private static readonly log4net.ILog log =
        log4net.LogManager.GetLogger(System.Reflection.MethodBase.GetCurrentMethod().DeclaringType);

    protected override void PerformExecutions(IEventSource session)
    {
        if (log.IsDebugEnabled)
        {
            log.Debug("executing flush");
        }

        try
        {
            session.ConnectionManager.FlushBeginning();
            session.PersistenceContext.Flushing = true;
            session.ActionQueue.PrepareActions();
            session.ActionQueue.ExecuteActions();
        }
        catch (HibernateException exception)
        {
            if (log.IsErrorEnabled)
            {
                log.Error("Could not synchronize database state with session", exception);
            }
            throw;
        }
        finally
        {
            session.PersistenceContext.Flushing = false;
            session.ConnectionManager.FlushEnding();
        }
    }
}
Register it during NHibernate configuration:
cfg.EventListeners.FlushEventListeners = new IFlushEventListener[] { new FixedDefaultFlushEventListener() };
You can read more about this bug in the Hibernate JIRA:
https://hibernate.onjira.com/browse/HHH-2763
The next release of NHibernate should include this fix as well.
This is not easy at all. I wrote something like this, but it is very specific to our needs and not trivial.
Some additional hints:
You can test whether references are loaded using
NHibernateUtil.IsInitialized(entity)
or
NHibernateUtil.IsPropertyInitialized(entity, propertyName)
You can cast collections to IPersistentCollection. I implemented an IInterceptor where I get the NHibernate type of each property; I don't know where you can get this when using events:
if (nhtype.IsCollectionType)
{
    var collection = previousValue as NHibernate.Collection.IPersistentCollection;
    if (collection != null)
    {
        if (!collection.WasInitialized)
        {
            // just skip uninitialized collections
        }
        else
        {
            // read the collection's previous values
            previousValue = collection.StoredSnapshot;
        }
    }
}
When you get the update event from NHibernate, the instance is initialized. You can safely access properties of primitive types. When you want to use ToString, make sure that your ToString implementation doesn't access any referenced entities or collections.
You can use NHibernate metadata to find out whether a type is mapped as an entity or not. This can be useful for navigating your object model. When you reference another entity, you will get additional update events on it when it changes.
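A short sketch of that metadata check (sessionFactory is assumed to be your ISessionFactory, and value the property value you are auditing):
var metadata = sessionFactory.GetClassMetadata(value.GetType());
bool isMappedEntity = metadata != null;   // null means the type is not mapped as an entity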
I was able to determine that this error is thrown when application code loads a lazy property while the entity has a collection.
My first attempt involved watching for new CollectionEntries (which I never want to process, as there shouldn't actually be any changes) and marking them as IsProcessed = true so they wouldn't cause problems.
var collections = args.Session.PersistenceContext.CollectionEntries;
var collectionKeys = args.Session.PersistenceContext.CollectionEntries.Keys;
var roundCollectionKeys = collectionKeys.Cast<object>().ToList();
var collectionValuesCount = collectionKeys.Count;

// Application code that loads a lazy property where the entity has a collection

var postCollectionKeys = collectionKeys.Cast<object>().ToList();
var newLength = postCollectionKeys.Count;

if (newLength != collectionValuesCount)
{
    foreach (var newKey in postCollectionKeys.Except(roundCollectionKeys))
    {
        var collectionEntry = (CollectionEntry)collections[newKey];
        collectionEntry.IsProcessed = true;
    }
}
However, this didn't entirely solve the issue. In some cases I'd still get the exception.
When OnPostUpdate is called, the values in the CollectionEntries dictionary should all already be set to IsProcessed = true. So I decided to do an extra check to see whether the collections not yet processed matched what I expected:
var valuesNotProcessed = collections.Values.Cast<CollectionEntry>().Where(x => !x.IsProcessed).ToList();
if (valuesNotProcessed.Any())
{
    // Assert: valuesNotProcessed.Count() == (newLength - collectionValuesCount)
}
In the cases that my first attempt fixed, these numbers would match exactly. However, in the cases where it didn't work there were extra items already in the dictionary. In my case I could be sure these extra items also wouldn't result in updates, so I could just set IsProcessed = true for all of the valuesNotProcessed.