Updating complex type with ef code first - asp.net-mvc-4

I have a complex type called Account, which contains a list of licenses. Each license in turn contains a list of domains (a domain is a simple id + url string).
In my repository I have this code:
public void SaveLicense(int accountId, License item)
{
    Account account = GetById(accountId);
    if (account == null)
    {
        return;
    }
    if (item.Id == 0)
    {
        account.Licenses.Add(item);
    }
    else
    {
        ActiveContext.Entry(item).State = EntityState.Modified;
    }
    ActiveContext.SaveChanges();
}
When I try to save an updated license (with modified domains), the scalar properties belonging directly to the license get updated just fine.
However, none of the domains get updated.
I should mention that I allow the user to add and remove domains in the user interface. Any new domain gets id = 0, and any deleted domains are simply no longer in the list.
So what I want is:
- Any domains that are in both the list and the database and are NOT changed: nothing happens.
- Any domains that are in both the list and the database but changed in the list: the database gets updated.
- Any domains with id = 0: inserted (added) into the database.
- Any domains NOT in the list but still in the database: removed from the database.
I have played with it a bit without success, but I have a sneaking suspicion that I am doing something wrong in the bigger picture, so I would love tips on whether I am misunderstanding something design-wise or have simply missed something.

Unfortunately, updating object graphs - entities together with their related entities - is a rather difficult task, and Entity Framework offers little built-in support to make it easy.
The problem is that setting the state of an entity to Modified (or generally to any other state) only influences the entity you pass into DbContext.Entry, and only its scalar properties. It has no effect on its navigation properties and related entities.
You must handle this object graph update manually: load the entity that is currently stored in the database (including its related entities) and merge all the changes made in the UI into that original graph. Your else case could then look like this:
//...
else
{
    var licenseInDb = ActiveContext.Licenses.Include(l => l.Domains)
        .SingleOrDefault(l => l.Id == item.Id);
    if (licenseInDb != null)
    {
        // Update the license (only its scalar properties)
        ActiveContext.Entry(licenseInDb).CurrentValues.SetValues(item);
        // Delete domains from DB that have been deleted in UI
        foreach (var domainInDb in licenseInDb.Domains.ToList())
            if (!item.Domains.Any(d => d.Id == domainInDb.Id))
                ActiveContext.Domains.Remove(domainInDb);
        foreach (var domain in item.Domains)
        {
            var domainInDb = licenseInDb.Domains
                .SingleOrDefault(d => d.Id == domain.Id);
            if (domainInDb != null)
                // Update existing domains
                ActiveContext.Entry(domainInDb).CurrentValues.SetValues(domain);
            else
                // Insert new domains
                licenseInDb.Domains.Add(domain);
        }
    }
}
ActiveContext.SaveChanges();
//...
You can also try out the project called "GraphDiff", which aims to do this work in a generic way for arbitrary detached object graphs.
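For reference, here is a rough sketch of what that could look like with GraphDiff. The UpdateGraph and OwnedCollection names are taken from the GraphDiff documentation, so treat this as an assumption to verify against the version you install rather than tested code:
// At the top of the repository file:
using RefactorThis.GraphDiff;
// ...the else branch of SaveLicense could then be replaced with:
// Merge the detached license into the stored graph; Domains is declared as an
// owned collection, so added, changed and removed domains are handled for you.
ActiveContext.UpdateGraph(item, map => map.OwnedCollection(l => l.Domains));
ActiveContext.SaveChanges();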
The alternative is to track all changes in some custom fields in the UI layer and then evaluate the tracked state changes when the data gets posted back, setting the appropriate entity states. Because you are in a web application, this basically means tracking changes in the browser (most likely requiring some JavaScript) while the user changes values, adds new items, or deletes items. In my opinion this solution is even more difficult to implement.
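Even so, here is a hedged sketch of what the server-side evaluation of those tracked states could look like, assuming a hypothetical DomainViewModel posted back from the browser (the State and Url property names, and the postedDomains variable, are assumptions made for this illustration):
// Hypothetical view model posted back from the browser.
public class DomainViewModel
{
    public int Id { get; set; }
    public string Url { get; set; }
    public string State { get; set; } // "Unchanged", "Modified", "Added" or "Deleted"
}
// In the save code, translate the tracked UI states into entity states.
foreach (var domainModel in postedDomains)
{
    var domain = new Domain { Id = domainModel.Id, Url = domainModel.Url };
    switch (domainModel.State)
    {
        case "Added":
            // For new domains you would also set the license foreign key here.
            ActiveContext.Domains.Add(domain);
            break;
        case "Modified":
            ActiveContext.Entry(domain).State = EntityState.Modified;
            break;
        case "Deleted":
            ActiveContext.Entry(domain).State = EntityState.Deleted;
            break;
    }
}
ActiveContext.SaveChanges();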

This should be enough to do what you are looking to do. Let me know if you have more questions about the code.
public void SaveLicense(License item)
{
    if (item.Id == 0)
    {
        // New license: insert it
        ActiveContext.Licenses.Add(item);
    }
    else
    {
        // Existing license: copy the posted scalar values onto the tracked entity
        var currentItem = ActiveContext.Licenses
            .Single(t => t.Id == item.Id);
        ActiveContext.Entry(currentItem).CurrentValues.SetValues(item);
    }
    ActiveContext.SaveChanges();
}

Related

EF Core include related ids but not related entities

Before I go creating my own SQL scripts by hand for this, I have a scenario where I want to get the ids of a foreign key, but not the entirety of the foreign entities, using EF Core.
Right now, I'm getting the ids manually by looping through the related entities and extracting the ids one at a time, like so:
List<int> ClientIds = new List<int>();
for (var i = 0; i < Clients.Count; i++) {
    ClientIds.Add(Clients.ElementAt(i).Id);
}
To my understanding, this will either cause data returns much larger than needed (my entity + every related entity) or a completely separate query to be run for each related entity I access, which obviously I don't want to do if I can avoid it.
Is there a straightforward way to accomplish this in EF Core, or do I need to head over the SQL side and handle it myself?
Model:
public class UserViewModel {
    public UserViewModel(UserModel userModel) {
        var clientIds = new List<int>();
        for (var i = 0; i < userModel.Clients.Count; i++) {
            clientIds.Add(userModel.Clients.ElementAt(i).Id);
        }
        ClientIds = clientIds;
        //...all the other class assignments
    }
    public IEnumerable<int> ClientIds { get; set; }
    //...all the other irrelevant properties
}
Basically, I need my front-end to know which Client to ask for later.
It looks like you are trying to query this from within the parent entity. I.e.
public class Parent
{
    public virtual ICollection<Client> Clients { get; set; }
    public void SomeMethod()
    {
        // ...
        List<int> ClientIds = new List<int>();
        for (var i = 0; i < Clients.Count; i++)
        {
            ClientIds.Add(Clients.ElementAt(i).Id);
        }
        // ...
    }
}
This is not ideal because unless your Clients were eager loaded when the Parent was loaded, this would trigger a lazy load to load all of the Clients data when all you want is the IDs. Still, it's not terrible as it would only result in one DB call to load the clients.
If they are already loaded, there is a more succinct way to get the IDs:
List<int> ClientIds = Clients.Select(x => x.Id).ToList();
Otherwise, if you have business logic involving the Parent and Clients whereby you want to be more selective about when and how the data is loaded, it is better to leave the entity definition to represent just the data state and basic rules/logic about the data, and to move the selective business logic outside of the entity into a business logic container that scopes the DbContext and queries the entities to fetch what it needs.
For instance, if the calling code went and did this:
var parent = _context.Parents.Single(x => x.ParentId == parentId);
parent.SomeMethod(); // which resulted in checking the Client IDs...
The simplest way to avoid the extra DB call is to ensure the related entities are eager loaded.
var parent = _context.Parents
    .Include(x => x.Clients)
    .Single(x => x.ParentId == parentId);
parent.SomeMethod(); // which resulted in checking the Client IDs...
The problem with this approach is that it still loads all details about all of the Clients, and you end up defaulting to eager loading everything all of the time because the code might call something like that SomeMethod(), which expects to find related entity details. This is the use case for lazy loading, but that comes with the performance overhead of ad-hoc DB hits and the need to ensure that the entity's DbContext is always available to perform the read when necessary.
Instead, move the logic out of the entity and into the caller (or another container that can take the relevant details), so that this caller projects just the data it needs from the entities in an efficient query:
var parentDetails = _context.Parents
    .Where(x => x.ParentId == parentId)
    .Select(x => new
    {
        x.ParentId,
        // other details from parent or related entities...
        ClientIds = x.Clients.Select(c => c.Id).ToList()
    }).Single();
// Do logic that SomeMethod() would have done here, or pass these
// loaded details to a method / service to do the work rather than
// embedding it in the Entity.
This doesn't load a Parent entity, but rather executes a query to load just the details about the parent and related entities that we need. In this example it is projected into an anonymous type to hold the information we can later consume, but if you are querying the data to send to a view then you can project it directly into a view model or DTO class to serialize and send.
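For example, here is a hedged sketch of projecting into a small DTO instead of the anonymous type; ParentDto is a hypothetical class invented for this illustration:
// Hypothetical DTO holding only what the view needs.
public class ParentDto
{
    public int ParentId { get; set; }
    public List<int> ClientIds { get; set; }
}
// The projection stays server-side: only the parent id and the client ids
// appear in the generated SQL.
var dto = _context.Parents
    .Where(x => x.ParentId == parentId)
    .Select(x => new ParentDto
    {
        ParentId = x.ParentId,
        ClientIds = x.Clients.Select(c => c.Id).ToList()
    }).Single();
This keeps the query shape identical to the anonymous-type version while giving you a concrete type to return from a controller action or serialize.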

Nhibernate Clear Cache for a Specific Region

I am trying to manually clear the second-level cache for a specific region. I found the method posted in an answer to this question. While this works to clear my entities, for some reason the query cache is not getting cleared. This results in a separate query for each entity the next time the entities are retrieved from the database. It does work when I call sessionFactory.EvictQueries() without any parameters; it only fails when I pass in a specific region name. Any ideas as to what is going wrong?
Code is from the above link:
private void ClearRegion(string regionName)
{
    _sessionFactory.EvictQueries(regionName);
    foreach (var collectionMetaData in _sessionFactory.GetAllCollectionMetadata().Values)
    {
        var collectionPersister = collectionMetaData as NHibernate.Persister.Collection.ICollectionPersister;
        if (collectionPersister != null)
        {
            if ((collectionPersister.Cache != null) && (collectionPersister.Cache.RegionName == regionName))
            {
                _sessionFactory.EvictCollection(collectionPersister.Role);
            }
        }
    }
    foreach (var classMetaData in _sessionFactory.GetAllClassMetadata().Values)
    {
        var entityPersister = classMetaData as NHibernate.Persister.Entity.IEntityPersister;
        if (entityPersister != null)
        {
            if ((entityPersister.Cache != null) && (entityPersister.Cache.RegionName == regionName))
            {
                _sessionFactory.EvictEntity(entityPersister.EntityName);
            }
        }
    }
}
Caching is working and verified using NHProfiler.
Ok, so I figured out my issue. I did not realize that it is necessary to specify a cache region when querying the data, aside from specifying it in the entity mapping. After adding .CacheRegion("regionName") to my queries, everything works. By not adding the region when querying, the results were going into the query cache without a region name. That is why it worked when I called .EvictQueries() without a region name parameter.
To sum it up, it is necessary to add the region name both when mapping the entities (.Region("regionName") when using Fluent) and when querying with ISession using .CacheRegion("regionName").
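For anyone hitting the same thing, here is a minimal sketch of the two places the region has to be named; the Product entity, its Fluent mapping and the QueryOver call are placeholders for your own types, so adjust them to your mappings:
// Fluent NHibernate mapping: cache this entity in an explicit region.
public class ProductMap : ClassMap<Product>
{
    public ProductMap()
    {
        Id(x => x.Id);
        Map(x => x.Name);
        Cache.ReadWrite().Region("regionName");
    }
}
// Query side: the cached query results only land in "regionName"
// if the query itself names the region as well.
var products = session.QueryOver<Product>()
    .Cacheable()
    .CacheRegion("regionName")
    .List();
With both in place, ClearRegion("regionName") above evicts the entities and the cached query results together.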
Thank you for your responses.

Why is my record being deleted from the db when I attempt to update the record from entity framework MVC?

When I attempt to update a record from Entity Framework, the record gets deleted from the table. No errors are thrown, so I'm baffled as to what is happening.
I am fairly new to Entity Framework and ASP.NET. I've been learning them for about a month now.
I can update the record without any issues from SQL Server, but not from Visual Studio. Here is the code to update the db:
// GET: /Scorecard/Edit/5
public ActionResult Edit(int? id, string EmployeeName)
{
    if (id == null)
    {
        return new HttpStatusCodeResult(HttpStatusCode.BadRequest);
    }
    CRS_Monthly crs_monthly = GetAgentById(id.Value);
    if (crs_monthly == null)
    {
        return HttpNotFound();
    }
    crs_monthly.EmployeeName = EmployeeName;
    return View(crs_monthly);
}
// POST: /Scorecard/Edit/5
// To protect from overposting attacks, please enable the specific properties you want to bind to, for
// more details see http://go.microsoft.com/fwlink/?LinkId=317598.
[HttpPost]
[ValidateAntiForgeryToken]
public ActionResult Edit([Bind(Include="REC_ID,Cur_Plan,Plan_Update,Comments,Areas_Improve,Strengths,UPDATED_BY,UPDATED_TIME,Agent_Recognition")] CRS_Monthly crs_monthly)
{
    if (ModelState.IsValid)
    {
        crs_monthly.UPDATED_TIME = DateTime.Now;
        crs_monthly.UPDATED_BY = Request.LogonUserIdentity.Name.Split('\\')[1];
        db.Entry(crs_monthly).State = EntityState.Modified;
        db.SaveChanges();
        return RedirectToAction("Index");
    }
    return View(crs_monthly);
}
When I run the debugger crs_monthly is valid and looks fine until db.SaveChanges(). Any help is greatly appreciated!
You should never save an instance of your entity created from a post, especially when you're utilizing Bind to restrict which properties are bound from the post data. Instead, always pull the entity fresh from the database and map the posted values onto it. This ensures that no data is lost.
Using Bind is a horrible practice anyway. The chief problem with it is that all your properties are listed as string values, which introduces maintenance concerns. If you remove one of these properties or change its name, the Bind list is not automatically updated; you must remember to change every single instance. Worse, if you add properties, you have to remember to go back and include them in this list, or else your data just gets silently dropped with no notice.
If you only need to work with a subset of properties on your entity, create a view model containing just those properties. Then, again, map the posted values from your view model onto an instance of your entity pulled fresh from the database.
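A rough sketch of that pattern applied to the action above; the CrsMonthlyEditViewModel class and the db.CRS_Monthly DbSet name are assumptions made for the illustration, so adjust them to your project:
[HttpPost]
[ValidateAntiForgeryToken]
public ActionResult Edit(CrsMonthlyEditViewModel model)
{
    if (!ModelState.IsValid)
    {
        return View(model);
    }
    // Pull the entity fresh from the database...
    var crs_monthly = db.CRS_Monthly.Find(model.REC_ID);
    if (crs_monthly == null)
    {
        return HttpNotFound();
    }
    // ...and copy only the editable values from the view model onto it.
    crs_monthly.Cur_Plan = model.Cur_Plan;
    crs_monthly.Plan_Update = model.Plan_Update;
    crs_monthly.Comments = model.Comments;
    crs_monthly.Areas_Improve = model.Areas_Improve;
    crs_monthly.Strengths = model.Strengths;
    crs_monthly.Agent_Recognition = model.Agent_Recognition;
    crs_monthly.UPDATED_TIME = DateTime.Now;
    crs_monthly.UPDATED_BY = Request.LogonUserIdentity.Name.Split('\\')[1];
    db.SaveChanges();
    return RedirectToAction("Index");
}
Because the entity is tracked by the context, only the properties that actually changed end up in the UPDATE statement, and nothing else on the record is touched.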

How do I add a lazy loaded column in EntitySpaces?

If you do not have experience with, or aren't currently using, the EntitySpaces ("ES") ORM, this question is not meant for you.
I have a 10-year-old application that, after 4 years, now needs my attention again. My application uses a now-defunct ORM called EntitySpaces, and I'm hoping that if you're reading this you have experience with it or maybe still use it too! Switching to another ORM is not an option at this time, so I need to find a way to make this work.
Between the time I last actively worked on my application and now (ES version 2012-09-30), EntitySpaces ("ES") has gone through a significant change in the underlying ADO.NET back-end. The scenario that I'm seeking help on is when an entity collection is loaded with only a subset of the columns:
_products = new ProductCollection();
_products.Query.SelectAllExcept(_products.Query.ImageData);
_products.LoadAll();
I then override the properties that weren't loaded in the initial select so that I can lazy-load them in the accessor. Here is an example of one such lazy-loaded property that used to work perfectly.
public override byte[] ImageData
{
    get
    {
        bool rowIsDirty = base.es.RowState != DataRowState.Unchanged;
        // Check if we have loaded the blob data
        if (base.Row.Table != null && base.Row.Table.Columns.Contains(ProductMetadata.ColumnNames.ImageData) == false)
        {
            // add the column before we can save data to the entity
            this.Row.Table.Columns.Add(ProductMetadata.ColumnNames.ImageData, typeof(byte[]));
        }
        if (base.Row[ProductMetadata.ColumnNames.ImageData] is System.DBNull)
        {
            // Need to load the data
            Product product = new Product();
            product.Query.Select(product.Query.ImageData).Where(product.Query.ProductID == base.ProductID);
            if (product.Query.Load())
            {
                if (product.Row[ProductMetadata.ColumnNames.ImageData] is System.DBNull == false)
                {
                    base.ImageData = product.ImageData;
                    if (rowIsDirty == false)
                    {
                        base.AcceptChanges();
                    }
                }
            }
        }
        return base.ImageData;
    }
    set
    {
        base.ImageData = value;
    }
}
The interesting part is where I add the column to the underlying DataTable DataColumn collection:
this.Row.Table.Columns.Add(ProductMetadata.ColumnNames.ImageData, typeof(byte[]));
I had to comment out all the ADO.NET-related stuff from that accessor when I updated to the current (and open source) edition of ES (version 2012-09-30). That means the "ImageData" column isn't properly configured, and when I change its data and attempt to save the entity I receive the following error:
Column 'ImageData' does not belong to table .
I've spent a few days looking through the ES source and experimenting, and it appears that they no longer use a DataTable to back the entities; instead they use an 'esSmartDictionary'.
My question is: is there a known, supported way to accomplish the same lazy-loaded behavior in the new version of ES, where I can update a property (i.e. column) that wasn't included in the initial select by telling the ORM to add it to the entity's backing store?
After analyzing how ES constructs the DataTable that it uses for updates, it became clear that columns not included in the initial select (i.e. load) operation needed to be added to the esEntityCollectionBase.SelectedColumns dictionary. I added the following method to handle this.
/// <summary>
/// Appends the specified column to the SelectedColumns dictionary. The selected columns collection is
/// important as it serves as the basis for DataTable creation when updating an entity collection. If you've
/// lazy loaded a column (i.e. it wasn't included in the initial select) it will not be automatically
/// included in the selected columns collection. If you want to update the collection including the lazy
/// loaded column you need to use this method to add the column to the Selected Columns list.
/// </summary>
/// <param name="columnName">The lazy loaded column name. Note: Use the {yourentityname}Metadata.ColumnNames
/// class to access the column names.</param>
public void AddLazyLoadedColumn(string columnName)
{
    if (this.selectedColumns == null)
    {
        throw new Exception(
            "You can only append a lazy-loaded Column to a partially selected entity collection");
    }
    if (this.selectedColumns.ContainsKey(columnName))
    {
        return;
    }
    else
    {
        // Using the count because I can't determine what the value is supposed to be or how it's used.
        // From what I can tell it's just the ordinal of the column as it was selected: if 8 columns were
        // selected, the values would be 1 through 8 - ??
        int columnValue = selectedColumns.Count;
        this.selectedColumns.Add(columnName, columnValue);
    }
}
You would use this method like this:
public override System.Byte[] ImageData
{
    get
    {
        var collection = this.GetCollection();
        if (collection != null)
        {
            collection.AddLazyLoadedColumn(ProductMetadata.ColumnNames.ImageData);
        }
        ...
It's a shame that nobody is interested in the open source EntitySpaces. I'd be happy to work on it if I thought it had a future, but it doesn't appear so. :(
I'm still interested in any other approaches or insight from other users.

Given a list of objects using C#, push them to RavenDB without knowing which ones already exist

Given 1000 documents with a complex data structure, e.g. a Car class that has three properties: Make, Model, and an Id.
What is the most efficient way in C# to push these documents to RavenDB (preferably in a batch) without having to query the Raven collection individually to find which to update and which to insert? At the moment I am going about it like so, which is totally inefficient.
Note: _session is a wrapper around IDocumentSession where Commit calls SaveChanges and Add calls Store.
private void PublishSalesToRaven(IEnumerable<Sale> sales)
{
    var page = 0;
    const int total = 30;
    do
    {
        var paged = sales.Skip(page * total).Take(total);
        if (!paged.Any()) return;
        foreach (var sale in paged)
        {
            var current = sale;
            var existing = _session.Query<Sale>().FirstOrDefault(s => s.Id == current.Id);
            if (existing != null)
                existing = current;
            else
                _session.Add(current);
        }
        _session.Commit();
        page++;
    } while (true);
}
Your session code doesn't seem to match the RavenDB API (we don't have Add or Commit).
Here is how you do this in RavenDB:
private void PublishSalesToRaven(IEnumerable<Sale> sales)
{
    foreach (var sale in sales)
        session.Store(sale);
    session.SaveChanges();
}
Your code sample doesn't work at all. The main problem is that you cannot just switch out the references and expect RavenDB to recognize that:
if (existing != null)
existing = current;
Instead you have to update each property one-by-one:
existing.Model = current.Model;
existing.Make = current.Make;
This is how you facilitate change tracking in RavenDB and many other frameworks (e.g. NHibernate). If you want to avoid writing this uninteresting piece of code, I recommend using AutoMapper:
existing = Mapper.Map<Sale>(current, existing);
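For reference, here is a minimal sketch of the one-time configuration the classic static AutoMapper API expects before that call; the Sale-to-Sale self-map is an assumption about your setup:
// Register a Sale-to-Sale map once at startup so Map can copy
// every matching property onto the tracked instance.
Mapper.CreateMap<Sale, Sale>();
// Then, inside the update loop:
existing = Mapper.Map(current, existing);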
Another problem with your code is that you use Session.Query where you should use Session.Load. Remember: If you query for a document by its id, you will always want to use Load!
The main difference is that one uses the session's local cache while the other does not (the same applies to the equivalent NHibernate methods).
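A small sketch of the difference, assuming the conventional RavenDB document id format ("sales/1"):
// Load: answered from the session's unit-of-work cache when possible, no index involved.
var byLoad = session.Load<Sale>("sales/1");
// Query: goes through an index (which may be stale) and is not answered from the session cache.
var byQuery = session.Query<Sale>().FirstOrDefault(s => s.Id == "sales/1");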
Ok, so now I can answer your question:
If I understand you correctly, you want to save a bunch of Sale instances to your database, where each should either be added if it doesn't exist yet or updated if it does. Right?
One way is to correct your sample code with the hints above and let it work. However, that will issue one unnecessary request (Session.Load(existingId)) for each iteration. You can easily avoid that if you set up an index that selects the Ids of all documents inside your Sales collection; before you loop through your items, you can then load all the existing Ids.
However, I would like to know what you actually want to do. What is your domain/use-case?
This is what works for me right now. Note: the InjectFrom method comes from Omu.ValueInjecter (NuGet package).
private void PublishSalesToRaven(IEnumerable<Sale> sales)
{
    var ids = sales.Select(i => i.Id);
    // Load returns a null entry for every id that doesn't exist yet,
    // so filter those out before updating.
    var existingSales = _ravenSession.Load<Sale>(ids).Where(s => s != null).ToList();
    existingSales.ForEach(s => s.InjectFrom(sales.Single(i => i.Id == s.Id)));
    var existingIds = existingSales.Select(i => i.Id);
    var nonExistingSales = sales.Where(i => !existingIds.Any(x => x == i.Id));
    nonExistingSales.ToList().ForEach(i => _ravenSession.Store(i));
    _ravenSession.SaveChanges();
}