For various reasons I am having to send a typed DataSet to a WCF service endpoint. This works fine, except that upon deserializing, the RowState of each row in each DataTable is set to 'Added', regardless of what it was on the client. If I write the serialized stream out to a file, I can see that the RowState is not part of the serialized data. How can I add it so that the RowState is preserved across service boundaries? Not that I think it matters, but the client process is running .NET 3.5 while the service process is running .NET 4.0.
I had this problem also, and found a very simple solution for it:
Instead of using the DataSet's WriteXml method, serialize the object 'manually' using a BinaryFormatter:
// Requires: using System.IO; and using System.Runtime.Serialization.Formatters.Binary;
BinaryFormatter bf = new BinaryFormatter();
using (FileStream fs = File.Open("datastore.dat", FileMode.Create, FileAccess.Write))
{
    bf.Serialize(fs, ds);
}
When you deserialize, the object is in the exact same state as it was before, including the RowState data.
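For the service-boundary case specifically, here is a minimal sketch of the same idea that round-trips the DataSet through a byte[] instead of a file (the DataSetTransport helper and the RemotingFormat tweak are my own additions, assuming your service contract can expose a byte[] parameter):

// Requires: using System.Data; using System.IO;
// and using System.Runtime.Serialization.Formatters.Binary;
public static class DataSetTransport
{
    public static byte[] ToBytes(DataSet ds)
    {
        ds.RemotingFormat = SerializationFormat.Binary; // smaller, faster payload
        BinaryFormatter bf = new BinaryFormatter();
        using (MemoryStream ms = new MemoryStream())
        {
            bf.Serialize(ms, ds);
            return ms.ToArray();
        }
    }

    public static DataSet FromBytes(byte[] data)
    {
        BinaryFormatter bf = new BinaryFormatter();
        using (MemoryStream ms = new MemoryStream(data))
        {
            return (DataSet)bf.Deserialize(ms); // RowState values come back intact
        }
    }
}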
Here is the (decompiled) code for the DataRow.RowState property:
public DataRowState RowState
{
get
{
if (this.oldRecord == this.newRecord)
{
if (this.oldRecord == -1)
{
return DataRowState.Detached;
}
if (0 < this._columns.ColumnsImplementingIChangeTrackingCount)
{
foreach (DataColumn column in this._columns.ColumnsImplementingIChangeTracking)
{
object obj2 = this[column];
if ((DBNull.Value != obj2) && ((IChangeTracking)obj2).IsChanged)
{
return DataRowState.Modified;
}
}
}
return DataRowState.Unchanged;
}
if (this.oldRecord == -1)
{
return DataRowState.Added;
}
if (this.newRecord == -1)
{
return DataRowState.Deleted;
}
return DataRowState.Modified;
}
}
As you can see, there may be nothing you can do about its value directly, because it is calculated rather than stored. The easiest solution may be to add another column to the DataSet that contains the state of each row.
(Why does it always calculate out to Added? Most likely because when your serialized DataSet is rehydrated on the server, new rows are created and added to the DataSet - so the value is quite literally true. If you follow the suggestion above and add another column to the DataSet, the server code will need to change to examine and process it; if you are going to make that sort of change, maybe it is worth going the whole way and recoding the service to use proper serializable DTOs instead.)
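Here is a rough sketch of that extra-column workaround (the OriginalRowState column name is made up for illustration; deleted rows need care because their current values cannot be read or written):

// Before serializing on the client: record each row's state in a normal column.
foreach (DataTable table in ds.Tables)
{
    if (!table.Columns.Contains("OriginalRowState"))
        table.Columns.Add("OriginalRowState", typeof(string));

    foreach (DataRow row in table.Rows)
    {
        if (row.RowState != DataRowState.Deleted) // deleted rows cannot be written to
            row["OriginalRowState"] = row.RowState.ToString();
    }
}

On the server you could then call AcceptChanges() and re-apply the recorded states with DataRow.SetAdded() or SetModified() where appropriate.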
If you do not have experience with, or aren't currently using, the EntitySpaces ("ES") ORM, this question is not meant for you.
I have a 10-year-old application that after 4 years now needs my attention. My application uses a now-defunct ORM called EntitySpaces, and I'm hoping that if you're reading this you have experience with it, or maybe still use it too! Switching to another ORM is not an option at this time, so I need to find a way to make this work.
Between the time I last actively worked on my application and now (ES version 2012-09-30), EntitySpaces has gone through a significant change in the underlying ADO.NET back-end. The scenario I'm seeking help on is when an entity collection is loaded with only a subset of its columns:
_products = new ProductCollection();
_products.Query.SelectAllExcept(_products.Query.ImageData);
_products.LoadAll();
I then override the properties that weren't loaded in the initial select so that I may lazy-load them in the accessor. Here is an example of one such lazy-loaded property that used to work perfectly:
public override byte[] ImageData
{
get
{
bool rowIsDirty = base.es.RowState != DataRowState.Unchanged;
// Check if we have loaded the blob data
if(base.Row.Table != null && base.Row.Table.Columns.Contains(ProductMetadata.ColumnNames.ImageData) == false)
{
// add the column before we can save data to the entity
this.Row.Table.Columns.Add(ProductMetadata.ColumnNames.ImageData, typeof(byte[]));
}
if(base.Row[ProductMetadata.ColumnNames.ImageData] is System.DBNull)
{
// Need to load the data
Product product = new Product();
product.Query.Select(product.Query.ImageData).Where(product.Query.ProductID == base.ProductID);
if(product.Query.Load())
{
if (product.Row[ProductMetadata.ColumnNames.ImageData] is System.DBNull == false)
{
base.ImageData = product.ImageData;
if (rowIsDirty == false)
{
base.AcceptChanges();
}
}
}
}
return base.ImageData;
}
set
{
base.ImageData = value;
}
}
The interesting part is where I add the column to the underlying DataTable's DataColumn collection:
this.Row.Table.Columns.Add(ProductMetadata.ColumnNames.ImageData, typeof(byte[]));
I had to comment out all the ADO.NET-related stuff from that accessor when I updated to the current (and open-source) edition of ES (version 2012-09-30). That means the "ImageData" column isn't properly configured, and when I change its data and attempt to save the entity I receive the following error:
Column 'ImageData' does not belong to table .
I've spent a few days looking through the ES source and experimenting, and it appears that they no longer use a DataTable to back the entities; instead they use an 'esSmartDictionary'.
My question is: Is there a known, supported way to accomplish the same lazy loaded behavior that used to work in the new version of ES? Where I can update a property (i.e. column) that wasn't included in the initial select by telling the ORM to add it to the entity backing store?
After analyzing how ES constructs the DataTable that it uses for updates, it became clear that columns not included in the initial select (i.e. load) operation needed to be added to the esEntityCollectionBase.SelectedColumns dictionary. I added the following method to handle this.
/// <summary>
/// Appends the specified column to the SelectedColumns dictionary. The selected columns collection is
/// important as it serves as the basis for DataTable creation when updating an entity collection. If you've
/// lazy loaded a column (i.e. it wasn't included in the initial select) it will not be automatically
/// included in the selected columns collection. If you want to update the collection including the lazy
/// loaded column you need to use this method to add the column to the Select Columns list.
/// </summary>
/// <param name="columnName">The lazy loaded column name. Note: Use the {yourentityname}Metadata.ColumnNames
/// class to access the column names.</param>
public void AddLazyLoadedColumn(string columnName)
{
if(this.selectedColumns == null)
{
throw new Exception(
"You can only append a lazy-loaded Column to a partially selected entity collection");
}
if (this.selectedColumns.ContainsKey(columnName))
{
return;
}
else
{
        // Using the count because I can't determine what the value is supposed to be or how
        // it's used. From what I can tell it's just the ordinal of the column as it was
        // selected: if 8 columns were selected, the values would be 1 through 8 - ??
int columnValue = selectedColumns.Count;
this.selectedColumns.Add(columnName, columnValue);
}
}
You would use this method like this:
public override System.Byte[] ImageData
{
get
{
var collection = this.GetCollection();
if(collection != null)
{
collection.AddLazyLoadedColumn(ProductMetadata.ColumnNames.ImageData);
}
...
It's a shame that nobody is interested in the open source EntitySpaces. I'd be happy to work on it if I thought it had a future, but it doesn't appear so. :(
I'm still interested in any other approaches or insight from other users.
I am trying to find the best way to handle this exception and force client changes to overwrite whatever other changes caused the conflict. The approach I came up with is to wrap the call to Session.Transaction.Commit() in a loop. Inside the loop, a try-catch block handles each stale object individually: copy its properties (except the row-version property), refresh the object to get the latest DB data, re-copy the original values onto the refreshed object, and then merge. After that the loop commits again, and if another StaleObjectStateException occurs the same handling applies. The loop keeps going until all conflicts are resolved.
This method is part of a UnitOfWork class. To make it clearer I'll post my code:
// 'Client-wins' rules, any conflicts found will always cause client changes to
// overwrite anything else.
public void CommitAndRefresh() {
bool saveFailed;
do {
try {
_session.Transaction.Commit();
_session.BeginTransaction();
saveFailed = false;
} catch (StaleObjectStateException ex) {
saveFailed = true;
// Get the staled object with client changes
var staleObject = _session.Get(ex.EntityName, ex.Identifier);
// Extract the row-version property name
IClassMetadata meta = _sessionFactory.GetClassMetadata(ex.EntityName);
string rowVersionPropertyName = meta.PropertyNames[meta.VersionProperty];
// Store all property values from client changes
var propertyValues = new Dictionary<string, object>();
var publicProperties = staleObject.GetType().GetProperties();
foreach (var p in publicProperties) {
if (p.Name != rowVersionPropertyName) {
propertyValues.Add(p.Name, p.GetValue(staleObject, null));
}
}
// Get latest data for staled object from the database
_session.Refresh(staleObject);
// Update the data with the original client changes except for row-version
foreach (var p in publicProperties) {
if (p.Name != rowVersionPropertyName) {
p.SetValue(staleObject, propertyValues[p.Name], null);
}
}
// Merge
_session.Merge(staleObject);
}
} while (saveFailed);
}
The above code works fine and handles concurrency with the client-wins rule. However, I was wondering whether NHibernate has any built-in capabilities to do this for me, or whether there is a better way to handle it.
Thanks in advance,
What you're describing is a lack of concurrency checking. If you don't use a concurrency strategy (optimistic-lock, version, or pessimistic), StaleObjectStateException will not be thrown and the update will simply be issued.
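For reference, a minimal sketch of an entity set up for version-based optimistic concurrency (all names are illustrative; the Version property must also be mapped, e.g. as a <version> element in hbm.xml or via Version() in Fluent NHibernate):

public class User
{
    public virtual int Id { get; protected set; }

    // Incremented by NHibernate on every update; a version mismatch at
    // commit time is what produces the StaleObjectStateException.
    public virtual int Version { get; protected set; }

    public virtual string Name { get; set; }
}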
Okay, now I understand your use case. One important point is that the ISession should be discarded after an exception is thrown. You can use ISession.Merge to merge changes between a detached and a persistent object rather than doing it yourself. Unfortunately, Merge does not cascade to child objects, so you still need to walk the object graph yourself. The implementation would look something like:
catch (StaleObjectStateException ex)
{
if (isPowerUser)
{
var newSession = GetSession();
// Merge will automatically get first
newSession.Merge(staleObject);
newSession.Flush();
}
}
I'm new to NHibernate and was assigned a task where I have to change the value of an entity property and then check whether this new (cached) value differs from the actual value stored in the DB. However, every attempt to retrieve the value from the DB returned the cached value. As I said, I'm new to NHibernate; maybe this is something easy to do, and it could obviously be done with plain ADO.NET, but the client demands that we use NHibernate for every access to the DB. To make things clearer, these were my "successful" attempts (i.e., no errors):
Attempt 1:
DetachedCriteria criteria = DetachedCriteria.For<User>()
.SetProjection(Projections.Distinct(Projections.Property(UserField.JobLoad)))
.Add(Expression.Eq(UserField.Id, userid));
return GetByDetachedCriteria(criteria)[0].Id; //this is the value I want
Attempt 2:
var JobLoadId = DetachedCriteria.For<User>()
.SetProjection(Projections.Distinct(Projections.Property(UserField.JobLoad)))
.Add(Expression.Eq(UserField.Id, userid));
ICriteria criteria = JobLoadId.GetExecutableCriteria(NHibernateSession);
var ids = criteria.List();
return ((JobLoad)ids[0]).Id;
I hope I made myself clear; sometimes it is hard to explain a problem when even you don't quite understand the underlying framework.
Edit: Of course, this is a method body.
Edit 2: I found out that it doesn't work properly because the method call is inside a transaction context. If I remove the transaction it works fine, but I need it to run inside this context.
I do that by opening a new stateless session to get the actual object from the database:
User databaseuser;
using (IStatelessSession session = SessionFactory.OpenStatelessSession())
{
    databaseuser = session.Get<User>(id); // use the stateless session; 'id' is the entity's identifier
}
//do your checks
Within a session, NHibernate will return the same object from its Level-1 Cache (aka Identity Map). If you need to see the current value in the database, you can open a new session and load the object in that session.
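A minimal sketch of that second-session approach, borrowing the User/JobLoad names from the question (session is assumed to be your current ISession and sessionFactory your ISessionFactory):

User cached = session.Get<User>(userId);   // served from the level-1 cache

User fresh;
using (ISession other = sessionFactory.OpenSession())
{
    fresh = other.Get<User>(userId);       // independent session, hits the database
}

bool jobLoadChanged = cached.JobLoad.Id != fresh.JobLoad.Id;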
I would do it like this:
public class MyObject : Entity
{
    private string myField; // not readonly: it is assigned in the property setter
public string MyProperty
{
get { return myField; }
set
{
if (value != myField)
{
myField = value;
DoWhateverYouNeedToDoWhenItIsChanged();
}
}
}
}
A quick search turned up this article on NHForge, which may be able to help you:
http://nhibernate.info/doc/howto/various/finding-dirty-properties-in-nhibernate.html
In the dbml designer I've set Update Check to Never on all properties, but I still get an exception when calling Attach: "An attempt has been made to Attach or Add an entity that is not new, perhaps having been loaded from another DataContext. This is not supported." This approach seems to have worked for others here, but there must be something I've missed.
using(TheDataContext dc = new TheDataContext())
{
test = dc.Members.FirstOrDefault(m => m.fltId == 1);
}
test.Name = "test2";
using(TheDataContext dc = new TheDataContext())
{
dc.Members.Attach(test, true);
dc.SubmitChanges();
}
The error message says exactly what is going wrong: you are trying to attach an object that was loaded from another DataContext, in your case from another instance of the DataContext. Don't dispose your DataContext (it gets disposed at the end of the using statement) before you change values and submit the changes. I also see that you want to attach the object to the Members collection again, but it is already in there, so there is no need. This should work just as well, all in one using statement:
using(TheDataContext dc = new TheDataContext())
{
var test = dc.Members.FirstOrDefault(m => m.fltId == 1);
test.Name = "test2";
dc.SubmitChanges();
}
Just change the value and submit the changes.
Latest Update:
(Removed all previous 3 updates)
My previous solution (removed again from this post), found here, is dangerous. I just read this in an MSDN article:
"Only call the Attach methods on new
or deserialized entities. The only way
for an entity to be detached from its
original data context is for it to be
serialized. If you try to attach an
undetached entity to a new data
context, and that entity still has
deferred loaders from its previous
data context, LINQ to SQL will thrown
an exception. An entity with deferred
loaders from two different data
contexts could cause unwanted results
when you perform insert, update, and
delete operations on that entity. For
more information about deferred
loaders, see Deferred versus Immediate
Loading (LINQ to SQL)."
Use this instead:
// Get the object the first time by some id
using(TheDataContext dc = new TheDataContext())
{
test = dc.Members.FirstOrDefault(m => m.fltId == 1);
}
// Somewhere else in the program
test.Name = "test2";
// Again somewhere else
using(TheDataContext dc = new TheDataContext())
{
// Get the db row with the id of the 'test' object
Member modifiedMember = new Member()
{
Id = test.Id,
Name = test.Name,
Field2 = test.Field2,
Field3 = test.Field3,
Field4 = test.Field4
};
dc.Members.Attach(modifiedMember, true);
dc.SubmitChanges();
}
After the object has been copied, all references are detached and no event handlers (deferred loading from the db) are connected to the new object. Only the value fields are copied to the new object, which can now safely be attached to the Members table. Additionally, you do not have to query the db a second time with this solution.
It is possible to attach entities from another datacontext.
The only thing that needs to be added to code in the first post is this:
dc.DeferredLoadingEnabled = false;
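In context that would look something like this (reusing the question's types; the flag has to be set before the entity is attached):

using (TheDataContext dc = new TheDataContext())
{
    dc.DeferredLoadingEnabled = false; // no deferred loaders on this context
    dc.Members.Attach(test, true);     // attach as modified
    dc.SubmitChanges();
}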
But this is a drawback since deferred loading is very useful. I read somewhere on this page that another solution would be to set the Update Check on all properties to Never. This text says the same: http://complexitykills.blogspot.com/2008/03/disconnected-linq-to-sql-tips-part-1.html
But I can't get it to work even after setting the Update Check to Never.
This is a function in my Repository class that I use to update entities:
protected void Attach(TEntity entity)
{
try
{
_dataContext.GetTable<TEntity>().Attach(entity);
_dataContext.Refresh(RefreshMode.KeepCurrentValues, entity);
}
    catch (DuplicateKeyException) // the data context already knows about this entity, so just update the values
{
_dataContext.Refresh(RefreshMode.KeepCurrentValues, entity);
}
}
Where TEntity is your DB class; depending on your setup, you might just want to do:
_dataContext.Attach(entity);
Here is the situation:
Silverlight 3 application hits an ASP.NET-hosted WCF service to get a list of items to display in a grid. Once the list is brought down to the client it is cached in IsolatedStorage. This is done by using the DataContractSerializer to serialize all of these objects to a stream, which is then zipped and then encrypted. When the application is relaunched, it first loads from the cache (reversing the process above) and then deserializes the objects using the DataContractSerializer.ReadObject() method. All of this was working wonderfully under all scenarios until recently, with the entire "load from cache" path (decrypt/unzip/deserialize) taking hundreds of milliseconds at most.
On some development machines, but not all (all machines Windows 7), the deserialize step - that is, the call to ReadObject(stream) - takes several minutes and seems to lock up the entire machine, BUT ONLY WHEN RUNNING IN THE DEBUGGER in VS2008. Running the Debug-configuration build outside the debugger shows no problem.
One thing that looks suspicious: when you turn on stopping on exceptions, you can see that ReadObject() throws many, many System.FormatExceptions indicating that a number was not in the correct format. When I turn off "Just My Code", thousands of these get dumped to the screen. None go unhandled. They occur both on the read back from the cache AND on the deserialization at the conclusion of a web service call to get the data from the WCF service. HOWEVER, these same exceptions occur on my laptop development machine, which does not experience the slowness at all. And FWIW, my laptop is really old and my desktop is a 4-core, 6GB RAM beast.
Again, no problems unless running under the debugger in VS2008. Has anyone else seen this? Any thoughts?
Here is the bug report link: https://connect.microsoft.com/VisualStudio/feedback/details/539609/very-slow-performance-deserializing-using-datacontractserializer-in-a-silverlight-application-only-in-debugger
EDIT: I now know where the FormatExceptions are coming from. It seems that they are "by design": they occur when I have doubles being serialized that are double.NaN, so the xml contains a literal NaN as the element value. It seems that the DCS first tries to parse the value as a number; that fails with an exception, and then it looks for "NaN" and friends and handles them. My problem is not that this does not work - it does - it is that it completely cripples the debugger. Does anyone know how to configure the debugger/VS2008 SP1 to handle this more efficiently?
cartden,
You may want to consider switching over to XmlSerializer instead. Here is what I have determined over time:
The XmlSerializer and DataContractSerializer classes provide a simple means of serializing and deserializing object graphs to and from XML.
The key differences are:
1. XmlSerializer produces a much smaller payload than DCS if you use [XmlAttribute] instead of [XmlElement]; DCS always stores values as elements.
2. DCS is "opt-in" rather than "opt-out":
- With DCS you explicitly mark what you want to serialize with [DataMember].
- With DCS you can serialize any field or property, even if it is marked protected or private.
- With DCS you can use [IgnoreDataMember] to have the serializer ignore certain properties.
- With XmlSerializer only public properties are serialized, and they need setters to be deserialized.
- With XmlSerializer you can use [XmlIgnore] to have the serializer ignore public properties.
3. BE AWARE! DCS.ReadObject DOES NOT call constructors during deserialization. If you need to perform initialization, DCS supports the following callback hooks: [OnDeserializing], [OnDeserialized], [OnSerializing], [OnSerialized] (also useful for handling versioning issues); see the sketch after this list.
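For illustration, a small sketch of the callback pattern, since constructors are skipped during deserialization (the Sample type is made up; requires System.Runtime.Serialization and System.Collections.Generic):

[DataContract]
public class Sample
{
    [DataMember]
    public List<string> Tags { get; set; }

    [OnDeserializing]
    private void OnDeserializing(StreamingContext context)
    {
        // Runs before members are populated; do constructor-style initialization here.
        Tags = new List<string>();
    }
}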
If you want the ability to switch between the two serializers, you can use both sets of attributes simultaneously, as in:
[DataContract]
[XmlRoot]
public class ProfilePerson : NotifyPropertyChanges
{
[XmlAttribute]
[DataMember]
public string FirstName { get { return m_FirstName; } set { SetProperty(ref m_FirstName, value); } }
private string m_FirstName;
[XmlElement]
[DataMember]
public PersonLocation Location { get { return m_Location; } set { SetProperty(ref m_Location, value); } }
private PersonLocation m_Location = new PersonLocation(); // Should change over time
[XmlIgnore]
[IgnoreDataMember]
public Profile ParentProfile { get { return m_ParentProfile; } set { SetProperty(ref m_ParentProfile, value); } }
private Profile m_ParentProfile = null;
public ProfilePerson()
{
}
}
Also, check out my Serializer class, which can switch between the two:
using System;
using System.IO;
using System.Runtime.Serialization;
using System.Text;
using System.Xml;
using System.Xml.Serialization;
namespace ClassLibrary
{
// Instantiate this class to serialize objects using either XmlSerializer or DataContractSerializer.
// Note: ToByteArray(), ToUtfString(), and RegexHelper are the author's own helper/extension methods.
internal class Serializer
{
private readonly bool m_bDCS;
internal Serializer(bool bDCS)
{
m_bDCS = bDCS;
}
internal TT Deserialize<TT>(string input)
{
MemoryStream stream = new MemoryStream(input.ToByteArray());
if (m_bDCS)
{
DataContractSerializer dc = new DataContractSerializer(typeof(TT));
return (TT)dc.ReadObject(stream);
}
else
{
XmlSerializer xs = new XmlSerializer(typeof(TT));
return (TT)xs.Deserialize(stream);
}
}
internal string Serialize<TT>(object obj)
{
MemoryStream stream = new MemoryStream();
if (m_bDCS)
{
DataContractSerializer dc = new DataContractSerializer(typeof(TT));
dc.WriteObject(stream, obj);
}
else
{
XmlSerializer xs = new XmlSerializer(typeof(TT));
xs.Serialize(stream, obj);
}
// be aware that the Unicode Byte-Order Mark will be at the front of the string
return stream.ToArray().ToUtfString();
}
internal string SerializeToString<TT>(object obj)
{
StringBuilder builder = new StringBuilder();
XmlWriter xmlWriter = XmlWriter.Create(builder);
if (m_bDCS)
{
DataContractSerializer dc = new DataContractSerializer(typeof(TT));
dc.WriteObject(xmlWriter, obj);
}
else
{
XmlSerializer xs = new XmlSerializer(typeof(TT));
xs.Serialize(xmlWriter, obj);
}
string xml = builder.ToString();
xml = RegexHelper.ReplacePattern(xml, RegexHelper.WildcardToPattern("<?xml*>", WildcardSearch.Anywhere), string.Empty);
xml = RegexHelper.ReplacePattern(xml, RegexHelper.WildcardToPattern(" xmlns:*\"*\"", WildcardSearch.Anywhere), string.Empty);
xml = xml.Replace(Environment.NewLine + " ", string.Empty);
xml = xml.Replace(Environment.NewLine, string.Empty);
return xml;
}
}
}
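A quick usage sketch, using the ProfilePerson type from above (note that Deserialize expects the string produced by Serialize, since both rely on the author's ToByteArray/ToUtfString helpers):

Serializer dcs = new Serializer(true);    // true = DataContractSerializer
string payload = dcs.Serialize<ProfilePerson>(person);
ProfilePerson copy = dcs.Deserialize<ProfilePerson>(payload);

Serializer xs = new Serializer(false);    // false = XmlSerializer
string xml = xs.SerializeToString<ProfilePerson>(person);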
This is a guess, but I think it runs slowly in debug mode because, for every exception, the debugger performs extra work to surface the exception in the debug window, etc. In release mode these extra steps are not taken.
I've never done this, so I really don't know if it would work, but have you tried setting just that one assembly to build in release mode while all the others are set to debug? If I'm right, it may solve your problem. If I'm wrong, you've only wasted a minute or two.
About your debugging problem, have you tried disabling the Exception Assistant (Tools > Options > Debugging > Enable the exception assistant)?
Another place to look is the exception handling in Debug > Exceptions: you can disable the user-unhandled stuff for the CLR, or just uncheck the System.FormatException entry.
Ok - I figured out the root issue; it was what I alluded to in the EDIT to the main question. The problem was that the xml correctly contained serialized doubles with a value of double.NaN. I was using those values to indicate "n/a" when the denominator was 0D. Example: ROE (Return on Equity = Net Income / Average Equity) with an Average Equity of 0D would be serialized as:
<ROE>NaN</ROE>
When the DCS tries to deserialize it, evidently it first tries to parse the number, catches the exception when that fails, and then handles the NaN. The problem is that this generates a lot of overhead when in DEBUG mode.
Solution: I changed the property to double? and set it to null instead of NaN. Everything now happens instantly in DEBUG mode. Thanks to all for your help.
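A sketch of that fix, with illustrative names: the property becomes nullable, null stands in for "n/a", and no NaN ever reaches the xml:

[DataMember]
public double? ROE { get; set; }  // was double, with NaN meaning "n/a"

private static double? SafeDivide(double numerator, double denominator)
{
    // Return null instead of NaN when the denominator is zero.
    return denominator == 0D ? (double?)null : numerator / denominator;
}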
Try disabling some IE add-ons. In my case, the LastPass toolbar killed my Silverlight debugging: my computer would freeze for minutes each time I interacted with Visual Studio after a breakpoint.