Having started to use NHibernate for persistence, seduced by the promise that it respects your domain model, I tried to implement a relation manager for my domain objects. Basically, to DRY my code with respect to managing bidirectional one-to-many and many-to-many relations, I decided to have those relations managed by a separate class. When a one-to-many or many-to-one property is set, an entry for the two objects is made in a dictionary: the key is either the one side, with a collection value holding the many sides, or a many side, with the one side as its value.
A one-to-many relation for a specific combination of types looks as follows:
public class OneToManyRelation<TOnePart, TManyPart> : IRelation<IRelationPart, IRelationPart>
    where TOnePart : class, IRelationPart
    where TManyPart : class, IRelationPart
{
    private readonly IDictionary<TOnePart, Iesi.Collections.Generic.ISet<TManyPart>> _oneToMany;
    private readonly IDictionary<TManyPart, TOnePart> _manyToOne;

    public OneToManyRelation()
    {
        _manyToOne = new ConcurrentDictionary<TManyPart, TOnePart>();
        _oneToMany = new ConcurrentDictionary<TOnePart, Iesi.Collections.Generic.ISet<TManyPart>>();
    }

    public void Set(TOnePart onePart, TManyPart manyPart)
    {
        if (onePart == null || manyPart == null) return;
        if (!_manyToOne.ContainsKey(manyPart)) _manyToOne.Add(manyPart, onePart);
        else _manyToOne[manyPart] = onePart;
    }

    public void Add(TOnePart onePart, TManyPart manyPart)
    {
        if (onePart == null || manyPart == null) return;
        if (!_manyToOne.ContainsKey(manyPart)) _manyToOne.Add(manyPart, onePart);
        else _manyToOne[manyPart] = onePart;
        if (!_oneToMany.ContainsKey(onePart)) _oneToMany.Add(onePart, new HashedSet<TManyPart>());
        _oneToMany[onePart].Add(manyPart);
    }

    public Iesi.Collections.Generic.ISet<TManyPart> GetManyPart(TOnePart onePart)
    {
        if (!_oneToMany.ContainsKey(onePart)) _oneToMany[onePart] = new HashedSet<TManyPart>();
        return _oneToMany[onePart];
    }

    public TOnePart GetOnePart(TManyPart manyPart)
    {
        if (!_manyToOne.ContainsKey(manyPart)) _manyToOne[manyPart] = default(TOnePart);
        return _manyToOne[manyPart];
    }

    public void Remove(TOnePart onePart, TManyPart manyPart)
    {
        _manyToOne.Remove(manyPart);
        _oneToMany[onePart].Remove(manyPart);
    }

    public void Set(TOnePart onePart, Iesi.Collections.Generic.ISet<TManyPart> manyPart)
    {
        if (onePart == null) return;
        if (!_oneToMany.ContainsKey(onePart)) _oneToMany.Add(onePart, manyPart);
        else _oneToMany[onePart] = manyPart;
    }

    public void Clear(TOnePart onePart)
    {
        var list = new HashedSet<TManyPart>(_oneToMany[onePart]);
        foreach (var manyPart in list)
        {
            _manyToOne.Remove(manyPart);
        }
        _oneToMany.Remove(onePart);
    }

    public void Clear(TManyPart manyPart)
    {
        if (!_manyToOne.ContainsKey(manyPart)) return;
        if (_manyToOne[manyPart] == null) return;
        _oneToMany[_manyToOne[manyPart]].Remove(manyPart);
        _manyToOne.Remove(manyPart);
    }
}
On the many side a code snippet looks like:
public virtual SubstanceGroup SubstanceGroup
{
    get { return RelationProvider.SubstanceGroupSubstance.GetOnePart(this); }
    protected set { RelationProvider.SubstanceGroupSubstance.Set(value, this); }
}
On the one side, in this case the SubstanceGroup, the snippet looks like:
public virtual ISet<Substance> Substances
{
    get { return RelationProvider.SubstanceGroupSubstance.GetManyPart(this); }
    protected set { RelationProvider.SubstanceGroupSubstance.Set(this, value); }
}
When I am just using my domain objects, this works excellently. In the domain object I only have to reference an abstract factory that retrieves the appropriate relation, and I can set the relation from one side, which thus automatically becomes bidirectional.
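For completeness, a simplified, illustrative sketch of what such a relation factory could look like (not the actual code, just the idea, using the names from the snippets above):
public static class RelationProvider
{
    // One shared relation instance per association (illustrative only).
    private static readonly OneToManyRelation<SubstanceGroup, Substance> _substanceGroupSubstance =
        new OneToManyRelation<SubstanceGroup, Substance>();

    public static OneToManyRelation<SubstanceGroup, Substance> SubstanceGroupSubstance
    {
        get { return _substanceGroupSubstance; }
    }
}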
However, when NH kicks in, the problem is that I get duplicate keys in my dictionaries. Somehow NH sets a relation property, with a null value(!), on what seems to be a new copy(?) of a domain object. So when the domain object gets saved, I have two entries for that domain object in, for example, the many side of the relation, i.e. the _manyToOne dictionary.
This problem is making me lose my hair; I just do not get what is happening.
To answer your first, very general question, "Does NHibernate really deliver transparent persistence?", I can only say: nothing is perfect. NH tries its best to be as transparent as possible, while also trying to keep its complexity as low as possible.
There are some assumptions, particularly regarding collections: collections and their implementations are not considered part of your domain model. NH provides its own collection implementations. You are not only expected to use interfaces like ISet and IList; you should also keep the instances given by NH when the object is read from the database and never replace them with your own. (I don't know what your relation class is actually used for, so I don't know whether this is the problem here.)
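To make that rule concrete, a typical entity only mutates whatever set instance it currently holds and never assigns a fresh collection after loading. A minimal sketch (not your code; SubstanceGroup/Substance are taken from your snippets, the rest is illustrative):
public class SubstanceGroup
{
    // NH assigns its own ISet implementation here on load;
    // keep mutating that instance instead of swapping it out.
    private ISet<Substance> _substances = new HashedSet<Substance>();

    public virtual ISet<Substance> Substances
    {
        get { return _substances; }
        protected set { _substances = value; } // NH calls this when loading
    }

    public virtual void AddSubstance(Substance substance)
    {
        _substances.Add(substance);                // fine: mutates NH's instance
        // _substances = new HashedSet<Substance>(); // this is what breaks NH
    }
}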
Domain objects are unique within the same instance of the session. If you get new instances of domain objects each time, you probably implemented the "session-per-call" anti-pattern, which creates a new session for each database interaction.
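In other words, one session should span a whole unit of work rather than a single call. A rough sketch using the standard NHibernate interfaces (sessionFactory and groupId are assumed to be available):
// Session-per-unit-of-work instead of session-per-call: all interactions
// of the unit of work share one ISession, so each entity is represented
// by exactly one instance.
using (ISession session = sessionFactory.OpenSession())
using (ITransaction tx = session.BeginTransaction())
{
    var group = session.Get<SubstanceGroup>(groupId);
    group.Substances.Add(new Substance());
    tx.Commit();
}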
I don't have a clue what you are actually doing. What is this OneToManyRelation actually used for? What are you doing when NH doesn't behave as expected? This is a problem very specific to your implementation.
Leaving aside the comments on 'convoluted code' and 'what the heck are you doing': the problem was that I was replacing the persistent collections of NH, as in the code snippet below:
public void Add(TOnePart onePart, TManyPart manyPart)
{
    if (onePart == null || manyPart == null) return;
    if (!_manyToOne.ContainsKey(manyPart)) _manyToOne.Add(manyPart, onePart);
    else _manyToOne[manyPart] = onePart;
    if (!_oneToMany.ContainsKey(onePart)) _oneToMany.Add(onePart, new HashedSet<TManyPart>());
    _oneToMany[onePart].Add(manyPart);
}
I create a new HashedSet for the many part, and that was the problem. If I had just set the many part to the collection coming in (which, when NH is at work, is its own persistent collection implementation), it would have worked.
As an NH newbie, this replacing of NH's special collection implementations has been an important source of errors for me. Consider this a warning to other NH newbies.
Related
I have searched for a while without finding concrete answers to this question, so I want to ask it here.
I have made a database and used LINQ2SQL to auto-generate the classes needed.
I have set the serialization mode to Unidirectional to make sure the classes are serialized and the data members are generated.
Now, what I want to know is how I can send the references to the other classes (which have been generated through LINQ2SQL).
For example, I have a class called Scheduler which references Reservation and Seat, because Reservation and Seat have foreign keys to it.
You can see the dbml here:
http://imgur.com/rR6OxDi
The dbml file. This is the model of our database
Also, you can see that when I run the WCF test client, it does not return the Seat and Reservation objects.
http://imgur.com/brxNBz7
Hopefully you can all help.
UPDATE
Here is the snippet of the code provided by LINQ2SQL.
These are the fields for the Scheduler:
[global::System.Data.Linq.Mapping.TableAttribute(Name="dbo.Scheduler")]
[global::System.Runtime.Serialization.DataContractAttribute()]
public partial class Scheduler : INotifyPropertyChanging, INotifyPropertyChanged
{
private static PropertyChangingEventArgs emptyChangingEventArgs = new PropertyChangingEventArgs(String.Empty);
private int _SchID;
private System.Nullable<System.DateTime> _Date;
private System.Nullable<System.TimeSpan> _Starttime;
private System.Nullable<int> _MovieID;
private System.Nullable<int> _HallID;
private EntitySet<Seat> _Seats;
private EntitySet<Reservation> _Reservations;
private EntityRef<Hall> _Hall;
private EntityRef<Movie> _Movie;
private bool serializing;
And here is the part of the code where it references Reservation and Seat:
[global::System.Data.Linq.Mapping.AssociationAttribute(Name="Scheduler_Seat", Storage="_Seats", ThisKey="SchID", OtherKey="SchedulerID")]
[global::System.Runtime.Serialization.DataMemberAttribute(Order=6, EmitDefaultValue=false)]
public EntitySet<Seat> Seats
{
get
{
if ((this.serializing
&& (this._Seats.HasLoadedOrAssignedValues == false)))
{
return null;
}
return this._Seats;
}
set
{
this._Seats.Assign(value);
}
}
[global::System.Data.Linq.Mapping.AssociationAttribute(Name="Scheduler_Reservation", Storage="_Reservations", ThisKey="SchID", OtherKey="SchedulerID")]
[global::System.Runtime.Serialization.DataMemberAttribute(Order=7, EmitDefaultValue=false)]
public EntitySet<Reservation> Reservations
{
get
{
if ((this.serializing
&& (this._Reservations.HasLoadedOrAssignedValues == false)))
{
return null;
}
return this._Reservations;
}
set
{
this._Reservations.Assign(value);
}
}
Update 2
Here is the Reservation class which LINQ2SQL made:
Here are the fields:
[global::System.Data.Linq.Mapping.TableAttribute(Name="dbo.Reservation")]
[global::System.Runtime.Serialization.DataContractAttribute()]
public partial class Reservation : INotifyPropertyChanging, INotifyPropertyChanged
{
private static PropertyChangingEventArgs emptyChangingEventArgs = new PropertyChangingEventArgs(String.Empty);
private int _ResID;
private System.Nullable<int> _CustomerID;
private System.Nullable<int> _SchedulerID;
private string _Row;
private string _Seat;
private EntityRef<Customer> _Customer;
private EntityRef<Scheduler> _Scheduler;
And here is the Scheduler reference part of the class:
[global::System.Data.Linq.Mapping.AssociationAttribute(Name="Scheduler_Reservation", Storage="_Scheduler", ThisKey="SchedulerID", OtherKey="SchID", IsForeignKey=true, DeleteRule="SET DEFAULT")]
public Scheduler Scheduler
{
get
{
return this._Scheduler.Entity;
}
set
{
Scheduler previousValue = this._Scheduler.Entity;
if (((previousValue != value)
|| (this._Scheduler.HasLoadedOrAssignedValue == false)))
{
this.SendPropertyChanging();
if ((previousValue != null))
{
this._Scheduler.Entity = null;
previousValue.Reservations.Remove(this);
}
this._Scheduler.Entity = value;
if ((value != null))
{
value.Reservations.Add(this);
this._SchedulerID = value.SchID;
}
else
{
this._SchedulerID = default(Nullable<int>);
}
this.SendPropertyChanged("Scheduler");
}
}
}
All of this should make it possible to get the objects like this:
Scheduler[] schedulers = client.GetAllSchedulers();
Reservation reservation = schedulers[0].Reservations.First();
But I get the following error, because WCF is not sending the objects (which you can see in the first picture):
Description: An unhandled exception occurred during the execution of
the current web request. Please review the stack trace for more
information about the error and where it originated in the code.
Exception Details: System.InvalidOperationException: Sequence contains
no elements
UPDATE 3:
OK, so it appears that it works somehow.
I just had to make a join between the Scheduler and Reservation.
Also, whenever I debug the code I can see the variables are there. (Due to my reputation I cannot post links.)
But some of you might recognize the following whenever you try to view a result in debug mode:
"Expanding the Results View will enumerate the IEnumerable"
Whenever I do this it works, but not if I run it in release mode.
It looks like only the object types (Reservation, Seat) have null values.
I'm guessing you are either missing DataContract/DataMember attributes on your complex types, or you might need to include KnownTypeAttribute.
It'd be easier to tell if you could provide some code.
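For example, known types can be declared on a partial class (illustrative only; whether you need this depends on what the serializer actually complains about, since the LINQ to SQL designer normally emits the DataContract/DataMember attributes itself in Unidirectional mode):
// Illustrative sketch: declaring associated types as known types
// (KnownTypeAttribute lives in System.Runtime.Serialization).
[KnownType(typeof(Reservation))]
[KnownType(typeof(Seat))]
public partial class Scheduler
{
}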
EDIT
What you are talking about later is deferred loading. See this blog for more information on deferred vs. immediate loading.
When you expand the IEnumerable in debug mode, that triggers the request to retrieve/load the objects.
What you probably want is to load your Reservation and Seat objects along with your Scheduler object, something like the following:
using (YourDatabaseContext database = new YourDatabaseContext())
{
    DataLoadOptions options = new DataLoadOptions();
    options.LoadWith<Scheduler>(sch => sch.Reservations);
    options.LoadWith<Scheduler>(sch => sch.Seats);
    database.LoadOptions = options;
}
See DataLoadOptions for more details.
If you want to understand deferred execution. See this article for more details.
Quote from the article:
By default LINQ uses deferred query execution. This means when you write a LINQ query it doesn't execute. LINQ queries execute when you 'touch' the query results. This means you can change the underlying collection and run the same query subsequent times in the same scope. Touching the data means accessing the results, for instance in a for loop or by using an aggregate operator like Average or AsParallel on the results.
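A small illustration of that, reusing the placeholder context name from above and assuming the Scheduler's Date property from your field list:
using (var database = new YourDatabaseContext())
{
    // Building the query does not hit the database yet.
    var query = database.GetTable<Scheduler>().Where(s => s.Date != null);

    // The SQL executes here, when the results are touched/enumerated.
    var schedulers = query.ToList();
}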
I'm writing a simple movie database which has a three tier stack of service layer, data access layer and SQL DB.
I'm using LINQ to SQL to access the DB and return a Film from the Film table in the DB. This will then be returned as a Film object from the DataContracts in the service layer.
I thought this would work OK, but it has resulted in some awkward code which doesn't look right. Can someone sanity-check this, please?
Is it best practice to map every LINQ result to it's DataContract?
public static class DBConnection
{
    private static RMDB_LINQDataContext _db;

    static DBConnection()
    {
        _db = new RMDB_LINQDataContext();
    }

    public static RMDB.DTO.Film GetFilm(string name)
    {
        var LINQ_film = from film in _db.GetTable<Film>()
                        where film.name == name
                        select film;

        if (LINQ_film.ToList().Count != 1)
        {
            // TODO - faultException
        }
        else
        {
            foreach (Film f in LINQ_film.ToList())
            {
                // Yuck
                return new RMDB.DTO.Film(f.name,
                    f.releaseDate.GetValueOrDefault(), "foo", f.rating.GetValueOrDefault());
            }
        }
        return null;
    }
}
This is a bit neater and more efficient, as your formulation performs the query twice:
public static List<RMDB.DTO.Film> GetFilm(string name)
{
    var LINQ_film = from film in _db.GetTable<Film>()
                    where film.name == name
                    select new RMDB.DTO.Film(film.name,
                                             film.releaseDate.GetValueOrDefault(),
                                             "foo",
                                             film.rating.GetValueOrDefault());
    var list = LINQ_film.ToList();
    if (list.Count != 1)
    {
        // TODO - faultException
    }
    return list;
}
Copying one instance of a type to another is standard practice when mapping from a domain object to a DTO. It may seem like a lot more work, but it is a purely in-memory mapping and provides a layer of isolation between your business layer and your clients. Without it, clients will be impacted by refactoring of the business layer.
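If the per-property copying bothers you, one common option is to keep the mapping in a single extension method so the service code stays clean. A sketch ("foo" is kept from the original code, and the property names are assumed to match the DBML):
public static class FilmMappings
{
    // Central place for the entity-to-DTO mapping.
    public static RMDB.DTO.Film ToDto(this Film film)
    {
        return new RMDB.DTO.Film(
            film.name,
            film.releaseDate.GetValueOrDefault(),
            "foo",
            film.rating.GetValueOrDefault());
    }
}
The query result can then be projected with LINQ_film.AsEnumerable().Select(f => f.ToDto()).ToList().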
I've been using NH Validator for some time, mostly through ValidationDefs, but I'm still not sure about two things:
Is there any special benefit of using ValidationDef for simple/standard validations (like NotNull, MaxLength etc)?
I'm worried about the fact that those two methods throw different kinds of exceptions on validation, for example:
ValidationDef's Define.NotNullable() throws PropertyValueException
When using [NotNull] attribute, an InvalidStateException is thrown.
This makes me think mixing these two approaches isn't a good idea - it will be very difficult to handle validation exceptions consistently. Any suggestions/recommendations?
ValidationDef is probably more suitable for business-rule validation although, having said that, I have used it even for simple validation. There's more here.
What I like about ValidationDef is that it has a fluent interface.
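For example, a simple definition using the NotNullable/MaxLength constraints you mention could look something like this (a sketch; the Domain.User class and its UserName property are made up here):
// Hypothetical ValidationDef using the fluent (Loquacious) API.
public class User : ValidationDef<Domain.User>
{
    public User()
    {
        Define(u => u.UserName)
            .MaxLength(50)
            .And.NotNullable();
    }
}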
I've been playing around with this engine for quite a while and I've put together something that works quite well for me.
I've defined an interface:
public interface IValidationEngine
{
    bool IsValid(Entity entity);
    IList<Validation.IBrokenRule> Validate(Entity entity);
}
Which is implemented in my validation engine:
public class ValidationEngine : Validation.IValidationEngine
{
    private NHibernate.Validator.Engine.ValidatorEngine _Validator;

    public ValidationEngine()
    {
        var vtor = new NHibernate.Validator.Engine.ValidatorEngine();
        var configuration = new FluentConfiguration();
        configuration
            .SetDefaultValidatorMode(ValidatorMode.UseExternal)
            .Register<Data.NH.Validation.User, Domain.User>()
            .Register<Data.NH.Validation.Company, Domain.Company>()
            .Register<Data.NH.Validation.PlanType, Domain.PlanType>();
        vtor.Configure(configuration);
        this._Validator = vtor;
    }

    public bool IsValid(DomainModel.Entity entity)
    {
        return (this._Validator.IsValid(entity));
    }

    public IList<Validation.IBrokenRule> Validate(DomainModel.Entity entity)
    {
        var Values = new List<Validation.IBrokenRule>();

        NHibernate.Validator.Engine.InvalidValue[] values = this._Validator.Validate(entity);
        if (values.Length > 0)
        {
            foreach (var value in values)
            {
                Values.Add(
                    new Validation.BrokenRule()
                    {
                        // Entity = value.Entity as BpReminders.Data.DomainModel.Entity,
                        // EntityType = value.EntityType,
                        EntityTypeName = value.EntityType.Name,
                        Message = value.Message,
                        PropertyName = value.PropertyName,
                        PropertyPath = value.PropertyPath,
                        // RootEntity = value.RootEntity as DomainModel.Entity,
                        Value = value.Value
                    });
            }
        }
        return (Values);
    }
}
I plug all my domain rules in there.
I bootstrap the engine at the app startup:
For<Validation.IValidationEngine>()
    .Singleton()
    .Use<Validation.ValidationEngine>();
Now, when I need to validate my entities before save, I just use the engine:
if (!this._ValidationEngine.IsValid(User))
{
    BrokenRules = this._ValidationEngine.Validate(User);
}
and return, eventually, the collection of broken rules.
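For reference, the broken-rule container can be as simple as the following sketch, inferred from the initializer above (the actual Validation.IBrokenRule/BrokenRule types are not shown in this answer):
// Sketch only: plain data holders matching the properties used above.
public interface IBrokenRule
{
    string EntityTypeName { get; set; }
    string Message { get; set; }
    string PropertyName { get; set; }
    string PropertyPath { get; set; }
    object Value { get; set; }
}

public class BrokenRule : IBrokenRule
{
    public string EntityTypeName { get; set; }
    public string Message { get; set; }
    public string PropertyName { get; set; }
    public string PropertyPath { get; set; }
    public object Value { get; set; }
}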
I've got a Silverlight 4 RIA Services (SP1) app using Entity Frameworks 4 CTP5. I can databind a grid or listbox to the IEnumerable loaded by the domain context and it shows data from the server. Great.
Now I want to create a new instance of MyEntity and add it to the client-side data so that the user can see the newly added entity. MyEntity is a true entity descendant, not a POCO.
The only Add method I can find is domainContext.EntityContainer.GetEntitySet<MyEntity>().Add(newobj)
This does add the new entity to the domain context, and the domainContext.HasChanges does become true, but the new entity doesn't show up in the databound controls.
How do I get the new entity to show up in the databound controls prior to SubmitChanges?
(Probably related to this SO question from years ago that never got an answer)
Here are the server-side declarations of the domain service, as requested:
[EnableClientAccess()]
public class MyDomainService : LinqToEntitiesDomainService<MyObjectContext>
{
    protected override MyObjectContext CreateObjectContext()
    {
        return new MyObjectContext();
    }

    public IQueryable<MyEntity> GetMyEntities()
    {
        return this.ObjectContext.MyEntities;
    }

    public void InsertMyEntity(MyEntity MyEntity)
    {
        // ...
    }

    public void UpdateMyEntity(MyEntity currentMyEntity)
    {
        // ...
    }

    public void DeleteMyEntity(MyEntity MyEntity)
    {
        // ...
    }
}
I've figured this out with a combination of my own trial and error and hints provided by some of the other responses to this question.
The key point I was missing was that it's not enough for the ViewModel to keep track of the DomainContext and hand out query results to the View for databinding. The ViewModel also has to capture and retain the query results if you want entity adds and deletes performed by the ViewModel to appear in the UI before DomainContext.SubmitChanges(). The ViewModel has to apply those adds to the collection view of the query results.
The ViewModel collection property for View databinding. In this case I'm using the Telerik QueryableDomainServiceCollectionView, but other collection views can be used:
public IEnumerable<MyEntity> MyEntities
{
    get
    {
        if (this.view == null)
        {
            DomainContextNeeded();
        }
        return this.view;
    }
}

private void DomainContextNeeded()
{
    this.context = new MyDomainContext();
    var q = context.GetMyEntitiesQuery();
    this.view = new Telerik.Windows.Data.QueryableDomainServiceCollectionView<MyEntity>(context, q);
    this.view.Load();
}
The ViewModel function that adds a new entity for the UI to display:
public void AddNewMyEntity(object selectedNode)
{
    var ent = new MyEntity() { DisplayName = "New Entity" };
    if (selectedNode == null)
    {
        this.view.AddNew(ent);
    }
    else if (selectedNode is MyEntity)
    {
        ((MyEntity)selectedNode).Children.Add(ent);
    }
}
Other responses mentioned ObservableCollection. The query results and the collection view may not return instances of ObservableCollection. They could be just IEnumerables. What is critical is that they implement INotifyCollectionChanged and IEditableCollectionView.
Thanks to those who contributed responses. I've +1'd each response that was helpful, but since none directly solved my problem I couldn't justify marking any as the definitive answer.
Your domainContext will have a property domainContext.MyEntities. Does it not show up in there when you add it?
Bind to that collection or watch that collection for changes.
domainContext.MyEntities.PropertyChanged += MyEventHandler;
I assume you bind your control to the IEnumerable which is provided by LoadOperation<TEntity>.Entities. In that case your binding source is not the DomainContext.GetEntitySet<MyEntity>().
DomainContext.GetEntitySet<MyEntity>() holds all your currently tracked instances of MyEntity, including the one you add with .Add().
LoadOperation<TEntity>.Entities only contains the instances of MyEntity that were actually loaded by your last LoadOperation/Query.
You have two options: either add the new entity to the ItemsSource collection of your control (I recommend that), or rebuild the collection from the contents of DomainContext.GetEntitySet<MyEntity>(); the latter may contain other elements that you have not cleared out before, though. A rough sketch of the first option follows below.
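A sketch of option one, assuming an ObservableCollection bound as ItemsSource (the names myGrid, items, and AddEntity are illustrative, not from your code):
// Keep the loaded entities in a collection the control is bound to,
// and add new entities to both the context and that collection.
private ObservableCollection<MyEntity> items;

private void OnLoadCompleted(LoadOperation<MyEntity> loadOperation)
{
    items = new ObservableCollection<MyEntity>(loadOperation.Entities);
    myGrid.ItemsSource = items;
}

private void AddEntity(MyEntity newObj)
{
    domainContext.EntityContainer.GetEntitySet<MyEntity>().Add(newObj); // tracked for SubmitChanges
    items.Add(newObj);                                                  // visible immediately in the UI
}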
Ayende has an article about how to implement a simple audit trail for NHibernate (here) using event handlers.
Unfortunately, as can be seen in the comments, his implementation causes the following exception to be thrown: collection xxx was not processed by flush()
The problem appears to be the implicit call to ToString on the dirty properties, which can cause trouble if the dirty property is also a mapped entity.
I have tried my hardest to build a working implementation but with no luck.
Does anyone know of a working solution?
I was able to solve the same problem using the following workaround: set the processed flag to true on all collections in the current persistence context within the listener.
public void OnPostUpdate(PostUpdateEvent postEvent)
{
    if (IsAuditable(postEvent.Entity))
    {
        // skip application specific code
        foreach (var collection in postEvent.Session.PersistenceContext.CollectionEntries.Values)
        {
            var collectionEntry = collection as CollectionEntry;
            collectionEntry.IsProcessed = true;
        }

        //var session = postEvent.Session.GetSession(EntityMode.Poco);
        //session.Save(auditTrailEntry);
        //session.Flush();
    }
}
Hope this helps.
The fix should be the following. Create a new event listener class and derive it from NHibernate.Event.Default.DefaultFlushEventListener:
[Serializable]
public class FixedDefaultFlushEventListener : DefaultFlushEventListener
{
    private static readonly log4net.ILog log = log4net.LogManager.GetLogger(System.Reflection.MethodBase.GetCurrentMethod().DeclaringType);

    protected override void PerformExecutions(IEventSource session)
    {
        if (log.IsDebugEnabled)
        {
            log.Debug("executing flush");
        }
        try
        {
            session.ConnectionManager.FlushBeginning();
            session.PersistenceContext.Flushing = true;
            session.ActionQueue.PrepareActions();
            session.ActionQueue.ExecuteActions();
        }
        catch (HibernateException exception)
        {
            if (log.IsErrorEnabled)
            {
                log.Error("Could not synchronize database state with session", exception);
            }
            throw;
        }
        finally
        {
            session.PersistenceContext.Flushing = false;
            session.ConnectionManager.FlushEnding();
        }
    }
}
Register it during NHibernate configuration:
cfg.EventListeners.FlushEventListeners = new IFlushEventListener[] { new FixedDefaultFlushEventListener() };
You can read more about this bug in Hibernate JIRA:
https://hibernate.onjira.com/browse/HHH-2763
The next release of NHibernate should include that fix as well.
This is not easy at all. I wrote something like this, but it is very specific to our needs and not trivial.
Some additional hints:
You can test if references are loaded using
NHibernateUtil.IsInitialized(entity)
or
NHibernateUtil.IsPropertyInitialized(entity, propertyName)
You can cast collections to IPersistentCollection. I implemented an IInterceptor where I get the NHibernate Type of each property; I don't know where you can get this when using events:
if (nhtype.IsCollectionType)
{
    var collection = previousValue as NHibernate.Collection.IPersistentCollection;
    if (collection != null)
    {
        // just skip uninitialized collections
        if (!collection.WasInitialized)
        {
            // skip
        }
        else
        {
            // read collections previous values
            previousValue = collection.StoredSnapshot;
        }
    }
}
When you get the update event from NHibernate, the instance is initialized. You can safely access properties of primitive types. When you want to use ToString, make sure that your ToString implementation doesn't access any referenced entities nor any collections.
You may use NHibernate meta-data to find out if a type is mapped as an entity or not. This could be useful to navigate in your object model. When you reference another entity, you will get additional update events on this when it changed.
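A small helper for that check could look like the sketch below (sessionFactory is assumed to be available; GetClassMetadata returns null for types that are not mapped):
// Sketch: a type is a mapped entity if the session factory has metadata for it.
private static bool IsMappedEntity(ISessionFactory sessionFactory, object value)
{
    return value != null
        && sessionFactory.GetClassMetadata(value.GetType()) != null;
}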
I was able to determine that this error is thrown when application code loads a lazy property where the entity has a collection.
My first attempt involved watching for new CollectionEntries (which I never want to process, as there shouldn't actually be any changes) and then marking them as IsProcessed = true so they wouldn't cause problems.
var collections = args.Session.PersistenceContext.CollectionEntries;
var collectionKeys = args.Session.PersistenceContext.CollectionEntries.Keys;
var roundCollectionKeys = collectionKeys.Cast<object>().ToList();
var collectionValuesCount = collectionKeys.Count;

// Application code that loads a lazy property where the entity has a collection

var postCollectionKeys = collectionKeys.Cast<object>().ToList();
var newLength = postCollectionKeys.Count;

if (newLength != collectionValuesCount)
{
    foreach (var newKey in postCollectionKeys.Except(roundCollectionKeys))
    {
        var collectionEntry = (CollectionEntry)collections[newKey];
        collectionEntry.IsProcessed = true;
    }
}
However, this didn't entirely solve the issue. In some cases I'd still get the exception.
When OnPostUpdate is called the values in the CollectionEntries dictionary should all already be set to IsProcessed = true. So I decided to do an extra check to see if the collections not processed matched what I expected.
var valuesNotProcessed = collections.Values.Cast<CollectionEntry>().Where(x => !x.IsProcessed).ToList();
if (valuesNotProcessed.Any())
{
    // Assert: valuesNotProcessed.Count() == (newLength - collectionValuesCount)
}
In the cases that my first attempt fixed, these numbers would match exactly. However, in the cases where it didn't work, there were extra items already in the dictionary. In my case I could be sure these extra items also wouldn't result in updates, so I could just set IsProcessed = true for all the valuesNotProcessed.