Edit only changed or mentioned values with Entity Framework Core - asp.net-core

I need to update only the fields mentioned in the PUT request body. The current issue is that all values not mentioned in the entity to update are set to null.
Below is my current Update implementation in the generic repository.
public virtual void Update(T entity)
{
Context.Attach(entity);
Context.Entry(entity).State = EntityState.Modified;
}

You need two different steps. First you have to perform a patch operation. Description here
public IActionResult PatchEntity(int id, [FromBody] JsonPatchDocument<Entity> patchDoc)
{
    // Find takes the primary key value(s), not a predicate.
    var entity = dbContext.Entities.Find(id);
    if (entity == null)
        return NotFound();

    patchDoc.ApplyTo(entity);
    dbContext.Update(entity);
    dbContext.SaveChanges();
    return Ok(entity);
}
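Note that JsonPatchDocument is bound from a JSON Patch request body (an array of operations such as replace/add/remove) and relies on Newtonsoft.Json. If the project targets ASP.NET Core 3.0 or later, the Newtonsoft-based formatter has to be registered for this binding to work; a minimal sketch (not needed on 2.x, where Json.NET is already the default):
// Requires the Microsoft.AspNetCore.Mvc.NewtonsoftJson package on ASP.NET Core 3.0+.
public void ConfigureServices(IServiceCollection services)
{
    services.AddControllers().AddNewtonsoftJson();
}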
Here is a method to perform a partial update on the DB (take a look at this question too):
public virtual void Update(T entity, params object[] keys)
{
    // Load the current row, then copy the incoming values onto it;
    // only the properties that actually differ are marked as modified.
    var current = Context.Set<T>().Find(keys);
    Context.Entry(current).CurrentValues.SetValues(entity);
    Context.SaveChanges();
}
If you don't need to partially update the database record, you are fine with:
public virtual void Update(T entity)
{
Context.Update(entity); // entity is attached by default after select of entity
Context.SaveChanges();
}

What you could do is get the entity before updating it:
1. Get your entity from your Context.
2. Update the fields of your entity with data from your model. You can use a tool like AutoMapper to achieve this in a clean way.
3. Then call your Update method on the entity.
Another way would be to check the state of each field, such as in this answer.
EDIT: Updated point 2.
Hope it helps.

Finally figured it out without even changing the repository.
I just added a config in the AutoMapper profile to ignore any null source value:
CreateMap<TeamDto, Team>().ForAllMembers(opts => opts.Condition((src, dest, srcMember) => srcMember != null));
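With that condition in place, the update action just loads the tracked entity and maps the DTO onto it, so only non-null fields from the request overwrite existing values. A minimal sketch, assuming an injected IMapper and DbContext (the names _context, _mapper and the Teams DbSet are illustrative):
[HttpPut("{id}")]
public async Task<IActionResult> UpdateTeam(int id, [FromBody] TeamDto dto)
{
    var team = await _context.Teams.FindAsync(id);   // tracked entity
    if (team == null)
        return NotFound();

    // Null DTO members are skipped thanks to the ForAllMembers condition,
    // so only the fields present in the request body are overwritten.
    _mapper.Map(dto, team);

    await _context.SaveChangesAsync();                // EF Core updates only the changed columns
    return NoContent();
}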

Related

Exclude columns from INSERT [duplicate]

We have a field in our SQL Server database table which is autogenerated by SQL Server, the field is called CreatedTime.
We have mapped the whole database table to our datamodel in Entity Framework, thus also the field CreatedTime.
When we insert a new row in the database, via Entity Framework, we thus do not provide any value for CreatedTime.
This causes the insert to fail with the error:
SqlDateTime overflow. Must be between 1/1/1753 12:00:00 AM and 12/31/9999 11:59:59 PM
So the question is: is there a way to exclude a particular field of the Entity data model from the Entity Framework insert statement, so that we do not get the above error?
We would like to keep the field CreatedTime in the Entity model, because we might want to access it later.
If using Fluent API:
using System.ComponentModel.DataAnnotations.Schema;
this.Property(t => t.CreatedTime)
.HasDatabaseGeneratedOption(DatabaseGeneratedOption.Computed);
If using Annotations
[DatabaseGenerated(DatabaseGeneratedOption.Computed)]
public System.DateTime CreatedTime { get; set; }
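For context, the Fluent API call above would typically sit in an EF6 EntityTypeConfiguration class (or in OnModelCreating); a minimal sketch, where MyTable is a placeholder entity name:
public class MyTableMap : EntityTypeConfiguration<MyTable>
{
    public MyTableMap()
    {
        // CreatedTime is populated by SQL Server, so EF never includes it in
        // INSERT/UPDATE statements but still reads it back after saving.
        this.Property(t => t.CreatedTime)
            .HasDatabaseGeneratedOption(DatabaseGeneratedOption.Computed);
    }
}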
I found a simple solution to the problem on this thread:
http://social.msdn.microsoft.com/Forums/en-US/adodotnetentityframework/thread/7db14342-b259-4973-ac09-93e183ae48bb
There Fernando Soto writes:
"If you go to the EDM designer click on the field in the table that is auto-generated by the database, right click on it and select Properties and
look at the properties windows click on StoreGeneratedPattern and set its value to Computed, I believe it will give you what you are looking for."
The above solution was super quick and easy and it seems to work.
Also thank you for your contributions guys, but the above solution seems to do the job.
Try to use the NotMapped attribute on this property:
http://msdn.microsoft.com/en-us/library/system.componentmodel.dataannotations.schema.notmappedattribute.aspx
There are two things you can do:
If you have access to the database, check if the field has a default value. If it doesn't you can set it to GETDATE(), and the field should be set correctly, and you don't have to add/update it through Entity Framework.
If you don't have access to the database, or don't want to make any changes there, you can alter the behavior of the Entity Data Model to automatically set the date. Simply extend your ObjectContext model.
public partial class MyEntities
{
public override int SaveChanges()
{
var entityChangeSet = ChangeTracker.Entries<SomeEntity>();
if (entityChangeSet != null)
{
foreach (DbEntityEntry<SomeEntity> entry in entityChangeSet )
{
switch (entry.State)
{
case EntityState.Modified:
entry.Entity.LastModifiedDate = DateTime.UtcNow;
break;
case EntityState.Added:
entry.Entity.CreatedDate = DateTime.UtcNow;
break;
}
}
}
return base.SaveChanges();
}
}
This way you don't have to add any information for those fields when you add or update an item, the model will do it for you. If you have multiple entities which need this behavior, you can create an interface and make the Entity classes inherit that:
public interface IHaveCreatedDate {
DateTime CreatedDate { get; set; }
}
public partial class MyEntity : IHaveCreatedDate {
//MyEntity already implements this!
}
Then all you need to do is change the call to the ChangeTracker:
var entityChangeSet = ChangeTracker.Entries<IHaveCreatedDate>();
Is CreatedTime nullable?
One possible workaround - if CreatedTime is NOT nullable:
DateTime sqlServerMinDateTime = new DateTime(1753, 1, 1, 12, 0, 1, 0);
if(myEntity.CreatedTime < sqlServerMinDateTime)
{
myEntity.CreatedTime = sqlServerMinDateTime;
}
// do insert here
// ....
One possible workaround - if CreatedTime is nullable:
DateTime sqlServerMinDateTime = new DateTime(1753, 1, 1, 12, 0, 1, 0);
if(myEntity.CreatedTime < sqlServerMinDateTime)
{
myEntity.CreatedTime = null;
}
// do insert here
// ....

Save complex object to session ASP .NET CORE 2.0

I am quite new to ASP.NET Core, so please help. I would like to avoid a database round trip for an ASP.NET Core application. I have functionality to dynamically add columns in a datagrid. Column settings (visibility, enabled, width, caption) are stored in the DB.
So I would like to store a List<PersonColumns> on the server only for the actual session. But I am not able to do this. I already use JsonConvert methods to serialize and deserialize objects to/from session. This works for List<Int32> or objects with simple properties, but not for a complex object with nested properties.
My object I want to store to session looks like this:
[Serializable]
public class PersonColumns
{
public Int64 PersonId { get; set; }
List<ViewPersonColumns> PersonCols { get; set; }
public PersonColumns(Int64 personId)
{
this.PersonId = personId;
}
public void LoadPersonColumns(dbContext dbContext)
{
LoadPersonColumns(dbContext, null);
}
public void LoadPersonColumns(dbContext dbContext, string code)
{
PersonCols = ViewPersonColumns.GetPersonColumns(dbContext, code, PersonId);
}
public static List<ViewPersonColumns> GetFormViewColumns(SatisDbContext dbContext, string code, Int64 formId, string viewName, Int64 personId)
{
var columns = ViewPersonColumns.GetPersonColumns(dbContext, code, personId);
return columns.Where(p => p.FormId == formId && p.ObjectName == viewName).ToList();
}
}
I would also like to ask whether my approach of saving a list of 600 records to session is bad. Is it better to access the DB and load the columns each time the user wants to display the grid?
Any advice appreciated
Thanks
EDIT: I have tested storing a List<ViewPersonColumns> in session and it is correctly saved. When I save an object where the List<ViewPersonColumns> is a property, only the built-in types are saved and the List property is null.
The object I want to save in session
[Serializable]
public class UserManagement
{
public String PersonUserName { get; set; }
public Int64 PersonId { get; set; }
public List<ViewPersonColumns> PersonColumns { get; set; } //not saved to session??
public UserManagement() { }
public UserManagement(DbContext dbContext, string userName)
{
var person = dbContext.Person.Single(p => p.UserName == userName);
PersonUserName = person.UserName;
PersonId = person.Id;
}
/*public void PrepareUserData(DbContext dbContext)
{
LoadPersonColumns(dbContext);
}*/
public void LoadPersonColumns(DbContext dbContext)
{
LoadPersonColumns(dbContext, null);
}
public void LoadPersonColumns(DbContext dbContext, string code)
{
PersonColumns = ViewPersonColumns.GetPersonColumns(dbContext, code, PersonId);
}
public List<ViewPersonColumns> GetFormViewColumns(Int64 formId, string viewName)
{
if (PersonColumns == null)
return null;
return PersonColumns.Where(p => p.FormId == formId && p.ObjectName == viewName).ToList();
}
}
Save columns to the session
UserManagement userManagement = new UserManagement(_context, user.UserName);
userManagement.LoadPersonColumns(_context);
HttpContext.Session.SetObject("ActualPersonContext", userManagement);
HttpContext.Session.SetObject("ActualPersonColumns", userManagement.PersonColumns);
Load columns from the session
//userManagement build-in types are set. The PersonColumns is null - not correct
UserManagement userManagement = session.GetObject<UserManagement>("ActualPersonContext");
//The cols is filled from session with 600 records - correct
List<ViewPersonColumns> cols = session.GetObject<List<ViewPersonColumns>>("ActualPersonColumns");
Using a list for the columns is better than querying the database each time.
You can't create and store sessions in .NET Core the same way as in .NET Framework 4.0. Try it like this:
Startup.cs
public void ConfigureServices(IServiceCollection services)
{
//services.AddDbContext<GeneralDBContext>(options => options.UseSqlServer(Configuration.GetConnectionString("DefaultConnection")));
services.AddMvc().AddSessionStateTempDataProvider();
services.AddSession();
}
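The session middleware also has to be enabled in the request pipeline, otherwise session reads and writes throw at runtime; a minimal Configure sketch for ASP.NET Core 2.0 (if session complains about a missing IDistributedCache, also add services.AddDistributedMemoryCache() in ConfigureServices):
public void Configure(IApplicationBuilder app)
{
    app.UseSession();   // must be registered before MVC so controllers can use the session
    app.UseMvc(routes =>
    {
        routes.MapRoute("default", "{controller=Home}/{action=Index}/{id?}");
    });
}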
Common/SessionExtensions.cs
using Microsoft.AspNetCore.Http;
using Newtonsoft.Json;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
namespace IMAPApplication.Common
{
public static class SessionExtensions
{
public static T GetComplexData<T>(this ISession session, string key)
{
var data = session.GetString(key);
if (data == null)
{
return default(T);
}
return JsonConvert.DeserializeObject<T>(data);
}
public static void SetComplexData(this ISession session, string key, object value)
{
session.SetString(key, JsonConvert.SerializeObject(value));
}
}
}
Usage
==> Create Session
public IActionResult Login([FromBody]LoginViewModel model)
{
LoggedUserVM user = GetUserDataById(model.userId);
//Create Session with complex object
HttpContext.Session.SetComplexData("loggerUser", user);
// 'result' would come from your own login/validation logic (not shown in this snippet).
return Json(new { status = result.Status, message = result.Message });
}
==> Get Session data
public IActionResult Index()
{
//Get Session data
LoggedUserVM loggedUser = HttpContext.Session.GetComplexData<LoggedUserVM>("loggerUser");
}
Hope this is helpful. Good luck.
This is an evergreen post. Even though Microsoft recommends serialisation to store objects in session, it is not a correct solution unless your object is read-only. I have a blog explaining all the scenarios here, and I have pointed out the issues in the ASP.NET Core GitHub repository in issue 18159.
Synopsis of the problems are here:
A. A serialised copy is not the same as the object itself. True, it helps in a distributed-server scenario, but it comes with a caveat that Microsoft has failed to highlight: it works without unpredictable failures only when the object is meant to be read, not written back.
B. If you want a read-write object in the session, then every time you change the object that was deserialised from the session, it has to be serialised and written back again. This alone leads to complexity, because you must either track the changes or write back to the session after every property change. Within a single request, the object may be written back multiple times before the response is sent.
C. For a read-write object in the session this fails even on a single server: user actions can trigger multiple rapid requests, and sooner or later one thread will be serialising or deserialising the object while another edits it and writes it back. The result is that threads overwrite each other's object state, and even locks won't help much, since the object is not one real object but a temporary copy created by deserialisation.
D. There are issues with serialising complex objects. It is not just a performance hit; it may even fail in certain scenarios, especially with deeply nested objects that refer back to themselves.
The synopsis of the solution is here, full implementation along with code is in the blog link:
First, implement this as a cache object: create one item in IMemoryCache for each unique session.
Keep the cache in sliding-expiration mode, so that each read revives the expiry time, keeping the object cached for as long as the session is active.
The second point alone is not enough; you also need a heartbeat technique, triggering a call to the session every T-minus-1 minute or so from JavaScript. (We used to do this anyway to keep the session alive while the user is working in the browser, so it is nothing new.)
Additional Recommendations
A. Make an object called SessionManager - so that all your code related to session read / write sits in one place.
B. Do not keep a very high value for the session timeout; if you implement the heartbeat technique, even a 3-minute timeout is enough.
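A minimal sketch of the cache-backed approach described above (the SessionManager shape shown here and the 20-minute sliding window are illustrative, not taken from the blog):
public class SessionManager
{
    private readonly IMemoryCache _cache;
    private readonly IHttpContextAccessor _accessor;

    public SessionManager(IMemoryCache cache, IHttpContextAccessor accessor)
    {
        _cache = cache;
        _accessor = accessor;
    }

    // One cache entry per session; sliding expiration keeps the real object alive
    // for as long as the heartbeat (or any other request) keeps touching it.
    // Note: Session.Id only becomes stable once something has been written to the session.
    public T GetOrCreate<T>(string key, Func<T> factory)
    {
        var sessionKey = $"{_accessor.HttpContext.Session.Id}:{key}";
        return _cache.GetOrCreate(sessionKey, entry =>
        {
            entry.SlidingExpiration = TimeSpan.FromMinutes(20);
            return factory();
        });
    }
}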

Why is my record being deleted from the db when I attempt to update the record from entity framework MVC?

When I attempt to update a record from entity framework the record is being deleted from the table. There are no errors thrown so it really has me baffled what is happening.
I am fairly new to entity framework and asp.net. I've been learning it for about a month now.
I can update the record without any issues from SQL Server, but not from Visual Studio. Here is the code to update the DB:
// GET: /Scorecard/Edit/5
public ActionResult Edit(int id, string EmployeeName)
{
if (id == null)
{
return new HttpStatusCodeResult(HttpStatusCode.BadRequest);
}
CRS_Monthly crs_monthly = GetAgentById(id);
crs_monthly.EmployeeName = EmployeeName;
if (crs_monthly == null)
{
return HttpNotFound();
}
return View(crs_monthly);
}
// POST: /Scorecard/Edit/5
// To protect from overposting attacks, please enable the specific properties you want to bind to, for
// more details see http://go.microsoft.com/fwlink/?LinkId=317598.
[HttpPost]
[ValidateAntiForgeryToken]
public ActionResult Edit([Bind(Include="REC_ID,Cur_Plan,Plan_Update,Comments,Areas_Improve,Strengths,UPDATED_BY,UPDATED_TIME,Agent_Recognition")] CRS_Monthly crs_monthly)
{
if (ModelState.IsValid)
{
crs_monthly.UPDATED_TIME = DateTime.Now;
crs_monthly.UPDATED_BY = Request.LogonUserIdentity.Name.Split('\\')[1];
db.Entry(crs_monthly).State = EntityState.Modified;
db.SaveChanges();
return RedirectToAction("Index");
}
return View(crs_monthly);
}
When I run the debugger crs_monthly is valid and looks fine until db.SaveChanges(). Any help is greatly appreciated!
You should never save an instance of your entity created from a post, especially when you're utilizing Bind to restrict which properties are bound from the post data. Instead, always pull the entity fresh from the database and map the posted values on to it. This ensures that no data is lost.
Using Bind is a horrible practice anyway. The chief problem is that all your properties are listed as string values, which introduces maintenance concerns. If you remove one of these properties or change its name, the Bind list is not automatically updated; you must remember to change every single instance. Worse, if you add properties, you have to remember to go back and include them in this list, or else your data just gets silently dropped with no notice.
If you need to only work with a subset of properties on your entity, create a view model containing just those properties. Then, again, map the posted values from your view model onto an instance of your entity pulled fresh from the database.
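A minimal sketch of that approach for the Edit action above, assuming a hypothetical CrsMonthlyEditViewModel that holds just the editable fields (the DbSet name CRS_Monthly and the property list are illustrative):
[HttpPost]
[ValidateAntiForgeryToken]
public ActionResult Edit(CrsMonthlyEditViewModel model)
{
    if (!ModelState.IsValid)
        return View(model);

    // Load the current row so that untouched columns keep their values.
    var entity = db.CRS_Monthly.Find(model.REC_ID);
    if (entity == null)
        return HttpNotFound();

    // Copy only the editable fields from the view model onto the entity.
    entity.Cur_Plan = model.Cur_Plan;
    entity.Comments = model.Comments;
    entity.UPDATED_TIME = DateTime.Now;
    entity.UPDATED_BY = Request.LogonUserIdentity.Name.Split('\\')[1];

    db.SaveChanges();
    return RedirectToAction("Index");
}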

Atomic Read and Write with Entity Framework

I have two different processes (on different machines) that are reading and updating a database record.
The rule I need to enforce is that the record must only be updated if its current value is, let's say, "Initial". Also, after the commit I want to know whether it was actually updated by the current process or not (in case the value was something other than "Initial").
Now, the below code performs something like:
var record = context.Records
.Where(r => (r.id == id && r.State == "Initial"))
.FirstOrDefault();
if(record != null) {
record.State = "Second";
context.SaveChanges();
}
Now, a couple of questions:
1) From looking at the code it appears that after the record is fetched with state "Initial", some other process could have updated it to state "Second" before this process performs SaveChanges.
In this case we are unnecessarily overwriting the state with the same value. Is this what happens here?
2) If case 1 is not what happens then EntityFramework may be translating the above to something like
update Record set State = "Second" where Id = someid and State = "Initial"
and performing this as a transaction. This way only one process writes the value. Is this the case with EF's default TransactionScope?
In both cases, again, how do I know for sure that the update was made by my process as opposed to some other process?
If these were in-memory objects, it would translate to code like the following, assuming multiple threads accessing the same data structure:
Record rec = FindRecordById(id);
lock (someobject)
{
if(rec.State == "Initial")
{
rec.State = "Second";
//Now, that I know I updated it I can do some processing
}
}
Thanks
In general there are 2 main concurrency patterns that can be used:
Pessimistic concurrency: You lock a row to prevent others from unexpectedly changing the data you are currently attempting to update. EF does not provide any native support for this type of concurrency pattern.
Optimistic concurrency: Citing from EF's documentation: "Optimistic concurrency involves optimistically attempting to save your entity to the database in the hope that the data there has not changed since the entity was loaded. If it turns out that the data has changed then an exception is thrown and you must resolve the conflict before attempting to save again." This pattern is supported by EF, and can be used rather simply.
Focusing on the optimistic concurrency option, which EF does support, let's compare how your example behaves with and without EF's optimistic concurrency control handling. I'll assume you are using SQL Server.
No concurrency control
Let's start with the following script in the database:
create table Record (
Id int identity not null primary key,
State varchar(50) not null
)
insert into Record (State) values ('Initial')
And here is the code with the DbContext and Record entity:
public class MyDbContext : DbContext
{
static MyDbContext()
{
Database.SetInitializer<MyDbContext>(null);
}
public MyDbContext() : base(@"Server=localhost;Database=eftest;Trusted_Connection=True;") { }
public DbSet<Record> Records { get; set; }
protected override void OnModelCreating(DbModelBuilder modelBuilder)
{
base.OnModelCreating(modelBuilder);
modelBuilder.Conventions.Remove<PluralizingTableNameConvention>();
modelBuilder.Configurations.Add(new Record.Configuration());
}
}
public class Record
{
public int Id { get; set; }
public string State { get; set; }
public class Configuration : EntityTypeConfiguration<Record>
{
public Configuration()
{
this.HasKey(t => t.Id);
this.Property(t => t.State)
.HasMaxLength(50)
.IsRequired();
}
}
}
Now, let's test your concurrent update scenario with the following code:
static void Main(string[] args)
{
using (var context = new MyDbContext())
{
var record = context.Records
.Where(r => r.Id == 1 && r.State == "Initial")
.Single();
// Insert sneaky update from a different context.
using (var sneakyContext = new MyDbContext())
{
var sneakyRecord = sneakyContext.Records
.Where(r => r.Id == 1 && r.State == "Initial")
.Single();
sneakyRecord.State = "Sneaky Update";
sneakyContext.SaveChanges();
}
// attempt to update row that has just been updated and committed by the sneaky context.
record.State = "Second";
context.SaveChanges();
}
}
If you trace the SQL, you will see that the update statement looks like this:
UPDATE [dbo].[Record]
SET [State] = 'Second'
WHERE ([Id] = 1)
So, in effect, it doesn't care that another transaction sneaked in an update. It just blindly writes over whatever the other update did. And so, the final value of State in the database for that row is 'Second'.
Optimistic concurrency control
Let's adjust our initial SQL script to include a concurrency control column to our table:
create table Record (
Id int identity not null primary key,
State varchar(50) not null,
Concurrency timestamp not null -- add this row versioning column
)
insert into Record (State) values ('Initial')
Let's also adjust our Record entity class (the DbContext class stays the same):
public class Record
{
public int Id { get; set; }
public string State { get; set; }
// Add this property.
public byte[] Concurrency { get; set; }
public class Configuration : EntityTypeConfiguration<Record>
{
public Configuration()
{
this.HasKey(t => t.Id);
this.Property(t => t.State)
.HasMaxLength(50)
.IsRequired();
// Add this config to tell EF that this
// property/column should be used for
// concurrency checking.
this.Property(t => t.Concurrency)
.IsRowVersion();
}
}
}
Now, if we try to re-run the same Main() method we used for the previous scenario, you will notice a change in how the update statement is generated and executed:
UPDATE [dbo].[Record]
SET [State] = 'Second'
WHERE (([Id] = 1) AND ([Concurrency] = <byte[]>))
SELECT [Concurrency]
FROM [dbo].[Record]
WHERE @@ROWCOUNT > 0 AND [Id] = 1
In particular, notice how EF automatically includes the column defined for concurrency control in the where clause of the update statement.
In this case, because there was in fact a concurrent update, EF detects it, and throws a DbUpdateConcurrencyException exception on this line:
context.SaveChanges();
And so, in this case, if you check the database, you'll see that the State value for the row in question will be 'Sneaky Update', because our 2nd update failed to pass the concurrency check.
Final thoughts
As you can see, there isn't much that needs to be done to activate automatic optimistic concurrency control in EF.
Where it gets tricky though is, how do you handle the DbUpdateConcurrencyException exception when it gets thrown? It will largely be up to you to decide what you want to do in this case. But for further guidance on the topic, you'll find more information here: EF - Optimistic Concurrency Patterns.
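As a starting point, one common resolution strategy ("database wins") looks roughly like the sketch below, reusing the Record entity from this answer; whether you retry, merge, or surface the conflict to the user is an application decision:
try
{
    context.SaveChanges();
}
catch (DbUpdateConcurrencyException ex)
{
    // Another process changed (or deleted) the row after we loaded it.
    var entry = ex.Entries.Single();
    var databaseValues = entry.GetDatabaseValues();

    if (databaseValues == null)
    {
        // The row was deleted by the other process; decide whether to re-insert or give up.
    }
    else
    {
        // "Database wins": refresh the entity (including its Concurrency token)
        // with the current database values; inspect/merge and call SaveChanges again if desired.
        entry.OriginalValues.SetValues(databaseValues);
        entry.CurrentValues.SetValues(databaseValues);
    }
}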

Get existing entity if it exists or create a new one

I'm importing data that may or may not exist already in my database. I'd like NHibernate to associate any entities with the existing db one if it exists (probably just setting the primary key/id), or create a new one if it doesn't. I'm using S#arp architecture for my framework (MVC 2, NHibernate, Fluent).
I've added the [HasUniqueDomainSignature] attribute to the class, and a [DomainSignature] attribute to the properties I want to use for comparison. The only way I can think to do it (which is not an acceptable solution and may not even work) is the following (pseudo C#):
foreach (Book importedBook in importedBooks){
foreach (Author author in importedBook.Authors){
if (!author.IsValid()){ // NHibernate Validator will check DomainSignatures
author = _authorRepository.GetByExample(author); // This would be to get the db object with the same signature,
//but I don't think I could even update this as I iterate through it.
}
}
}
As you can see, this is both messy, and non-sensical. Add to that the fact that I've got a half dozen associations on the Book (subject, format, etc), and it doesn't make any sense. There's got to be an easy way to do this that I'm missing. I'm not a novice with NHibernate, but I'm definitely not an expert.
I might not be understanding the problem, but how can the data "may or may not exist in the database"? For example, if a Book has 2 Authors, how is the relationship stored at the database level if the Author doesn't exist?
It seems as if you're trying to use NHibernate to import your data (or create an entity if it doesn't exist) which doesn't seem correct.
Most database implementations support a conditional UPDATE-or-INSERT syntax. Oracle, for example, has a MERGE command. In combination with a Hibernate <sql-insert> block in your mapping you should be able to work something out. I don't know Fluent but I assume it supports this too.
Just realized I never gave an answer or accepted another's answer. I ended up writing a new SaveOrUpdate which takes a parameter to check for an existing entity before persisting. I also added an attribute to my domain models to mark properties to overwrite when saving/updating (although in retrospect it's only on updating that it'd be overwriting).
Here's the code if it can help anyone else in this dilemma:
public TEntity SaveOrUpdate<TEntity>(TEntity entity, bool checkForExistingEntity)
{
IRepository<TEntity> repository = new Repository<TEntity>();
if (checkForExistingEntity) {
if (entity is Entity) {
IEnumerable<PropertyInfo> props = (entity as Entity).GetSignatureProperties();
Dictionary<string, object> parameters =
props.ToDictionary(propertyInfo => propertyInfo.Name, propertyInfo => propertyInfo.GetValue(entity, null));
TEntity duplicateEntity = repository.FindOne(parameters);
if (duplicateEntity != null) {
// Update any properties with the OverwriteOnSaveUpdate attribute
foreach (var property in RepositoryHelper.GetUpdatableProperties(typeof(TEntity)))
{
object initialValue = property.GetValue(entity, null);
property.SetValue(duplicateEntity, initialValue, null);
}
// Fill in any blank properties on db version
foreach (var property in typeof(TEntity).GetProperties())
{
if (property.GetValue(duplicateEntity, null) == null) {
object initialValue = property.GetValue(entity, null);
property.SetValue(duplicateEntity, initialValue, null);
}
}
return duplicateEntity;
}
}
}
return SaveOrUpdate(entity);
}