Upgraded NHibernate and FNH DLLs - now getting "No persister" exceptions - nhibernate

I'm trying to upgrade my FNH Automapping project to the latest versions of NHibernate and Fluent NHibernate (NH 3.2 and FNH 1.3), but now I get a "no persister" exception on a derived class (though the base class still seems to be persisted properly).
This derived class automapped fine with the old DLLs (FNH 1.0, NH 2.1.2) - I have not changed my mapping logic or these classes in any way.
I upgraded my project by just copying the new DLLs over the old ones, and deleting references to DLLs that are no longer needed by the new versions (e.g. Antlr 3, Castle).
Exact versions I'm using:
NHibernate 3.2.0.4000
FluentNHibernate 1.3.0.0
System.Data.SQLite 1.0.76.0
VS 2008 9.0.30729.1 SP
Windows XP SP3 (32 bit)
The mapping code that works with the old DLLs, but not with the new ones:
return AutoMap.Assemblies(_assemblies)
    // Don't map the abstract base class
    .IgnoreBase<OfeEntity>()
    // Only map subclasses of OfeEntity
    .Where(t => t.IsSubclassOf(typeof(OfeEntity)))
    .Conventions.Add(
        // Do cascading saves on all entities so lists will be
        // automatically saved
        DefaultCascade.All(),
        // Turn on lazy loading, so only data that is actually
        // displayed will be read
        DefaultLazy.Always()
    );
Edit:
After turning FNH diagnostics on, I can see that FNH is not creating a table for my derived class with the new DLLs.
Also, one thing I noticed: the class that is not being persisted is subclassed two levels deep. That is, I have the following classes:
public abstract class OfeEntity
public class OfeMeasurementBase : OfeEntity
public class OfeDlsMeasurement : OfeMeasurementBase
OfeDlsMeasurement is the class that is not being persisted. OfeMeasurementBase, as well as several other classes that inherit from OfeEntity, are being persisted properly.
Old versions had no problem with this - maybe new versions have a bug when there's more than one level of inheritance?
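For reference, one workaround sometimes suggested for multi-level automapping (not verified against these exact versions) is to include the intermediate base class explicitly, so the automapper picks up the third level as a subclass:

return AutoMap.Assemblies(_assemblies)
    .IgnoreBase<OfeEntity>()
    // Assumption: treating OfeMeasurementBase as a mapped base may make
    // the automapper map OfeDlsMeasurement as its subclass
    .IncludeBase<OfeMeasurementBase>()
    .Where(t => t.IsSubclassOf(typeof(OfeEntity)))
    .Conventions.Add(DefaultCascade.All(), DefaultLazy.Always());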

I migrated our project from some older (F)NH versions to the latest NH 3.2. I suspect it now uses different key field names in collections and such, because I needed to specify exact column names to keep working with the existing database.
Also, I suggest rebuilding Fluent NHibernate from source against NHibernate 3.2, just to be sure everything falls into place.

The article How to upgrade your apps to NHibernate 3.2 with Fluent NHibernate 1.2 may be helpful. I used it myself to upgrade a project and it worked.

Related

Using Virtual on navigation properties

I want to know when we use the keyword virtual with navigation properties (I learnt it's for lazy loading), but I'm reading a tutorial at https://docs.asp.net/en/latest/data/ef-mvc/intro.html that creates an ASP.NET Core web application, and they are not using virtual anymore.
I checked older versions (MVC 4, MVC 5) and it is always there, but not in Core.
Can anyone explain to me why?
You use virtual properties on entities so Entity Framework can create a proxy class at runtime that inherits from your entity and injects a stub into the overridden properties. This stub makes a database call when you access the property's getter from code.
Entity Framework Core does not support lazy loading (yet, and probably never will), so there's no reason for it to mark properties as virtual.
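For illustration, a minimal sketch of the difference (Customer and Order are hypothetical names):

using System.Collections.Generic;

public class Order { public int Id { get; set; } }

// EF6 style: virtual lets the runtime proxy override the property and
// trigger a database query on first access (lazy loading).
public class Customer
{
    public int Id { get; set; }
    public virtual ICollection<Order> Orders { get; set; }
}

// EF Core style: no virtual; related data is loaded explicitly, e.g.
// context.Customers.Include(c => c.Orders).ToList();
public class CoreCustomer
{
    public int Id { get; set; }
    public List<Order> Orders { get; set; }
}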
See also: Loading Related Data in the official Entity Framework Core documentation, Lazy Loading (Issue #3797) in the aspnet/EntityFramework repository on GitHub, and Why use 'virtual' for class properties in Entity Framework model definitions? here on Stack Overflow.

NopCommerce => Moving the Mapping Folder of the Data Access Layer into a Separate Project

In the NopCommerce MVC version, I am trying to move the mapping folder out of the DAL project into a separate class library project, to make the DAL more generic so that it can be used in other projects as well.
But when I run the application, for every entity it says "The entity type [EntityName] is not part of the model for the current context."
I think it's happening because Autofac is not finding IRepository for injection. Any tips or ideas about where and what I am doing wrong?
Thanks in advance
OK! I found the solution to this issue. In the ObjectContext file there is an overridden method named OnModelCreating, which was creating instances of the mapping types in the assembly through reflection.
I pointed this method at the specific DLL that contains those mapping entries, and it started working.
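A sketch of what that change can look like, assuming EF Code First's EntityTypeConfiguration<> base class (which NopCommerce's mapping classes derive from); the assembly name is hypothetical:

using System;
using System.Data.Entity;
using System.Data.Entity.ModelConfiguration;
using System.Linq;
using System.Reflection;

// Inside the ObjectContext (DbContext) subclass:
protected override void OnModelCreating(DbModelBuilder modelBuilder)
{
    // Load the class library that now holds the mapping classes,
    // instead of scanning the DAL assembly itself.
    var mappingAssembly = Assembly.Load("MyProject.Data.Mapping");

    var typesToRegister = mappingAssembly.GetTypes()
        .Where(t => t.BaseType != null
                 && t.BaseType.IsGenericType
                 && t.BaseType.GetGenericTypeDefinition() == typeof(EntityTypeConfiguration<>));

    foreach (var type in typesToRegister)
    {
        dynamic configurationInstance = Activator.CreateInstance(type);
        modelBuilder.Configurations.Add(configurationInstance);
    }

    base.OnModelCreating(modelBuilder);
}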

Class versioning

I'm looking for a clean way to make incremental updates to my code library, without breaking backwards compatibility. This could mean adding new members to classes, or changing existing members to provide additional functionality. Sometimes I am required to change a member in such a way that it would break existing code (e.g. renaming a method or changing its return type), so I'd rather not touch any of my existing types once they are shipped.
The way I currently set this up is through inheritance and polymorphism by creating a new class that extends the previous "version" of that class.
The way this works is by creating the appropriate version of StatusResult (e.g. StatusResultVersion3), based on the actual value of the ProtocolVersion property, and returning it as an instance of CommandResult.
Because .NET does not seem to have a concept of class versioning, I had to come up with my own: appending the version number to the end of the class name. This will no doubt make you cringe; I can easily imagine you scratching your eyes out after zooming in on the diagram. But it works, and it lets me add new members and override existing members without introducing any breaking changes.
Is there a better way to version my classes?
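For illustration, a minimal sketch of the scheme described above (the member names are hypothetical, reconstructed from the description):

// Common base type returned to callers.
public abstract class CommandResult { }

public class StatusResult : CommandResult
{
    public virtual string Status { get { return "OK"; } }
}

// Each protocol revision extends the previous "version" of the class.
public class StatusResultVersion2 : StatusResult
{
    public override string Status { get { return "OK (extended)"; } }
}

public class StatusResultVersion3 : StatusResultVersion2
{
    public int StatusCode { get { return 200; } }
}

// A factory picks the appropriate version based on the protocol version
// and returns it as the common base type.
public static class StatusResultFactory
{
    public static CommandResult Create(int protocolVersion)
    {
        if (protocolVersion >= 3) return new StatusResultVersion3();
        if (protocolVersion == 2) return new StatusResultVersion2();
        return new StatusResult();
    }
}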
There are typically two approaches when considering existing code and assembly updates:
Regression Testing
This is a great approach for non-breaking changes, where you can simply overload functions to provide new parameters, etc. Visual Studio has some very advanced unit testing capabilities to make your regression testing relatively easy and automated.
Assembly Versions
If your changes are going to start breaking things, like rewriting the way some utility works, then it's time for a new assembly version. .NET is very good about working with assembly versions. You can deploy the versioned assemblies to different folders so that existing code can continue to reference the old version while new code can take advantage of the features in the new version.
The problem with interfaces is that once published they're largely set in stone. To quote Anders Hejlsberg:
... It's like adding a method to an interface. After you publish an interface, it is for all practical purposes immutable, because any implementation of it might have the methods that you want to add in the next version. So you've got to create a new interface instead.
So you can never just update an interface, you need to create a completely new one. Of course, you can have a single class implement both interfaces so your maintainability effort is fairly low compared with (say) polymorphic classes where your code will become spread out between multiple classes over time.
Multiple interfaces also allow you to remove methods in a way that classes do not (sure, you can deprecate them, but that can result in very noisy IntelliSense after a few iterations).
I personally lean towards having entirely stand-alone versions of the interface in each assembly version.
That is to say...
v 0.1.0.0
interface IExample
{
    String DoSomething();
}
v 0.2.0.0
interface IExample
{
    void DoSomethingElse();
}
How you implement them behind the scenes is up to you, but most likely it'll be the same classes with slightly different methods doing similar jobs (otherwise, why use the same interface?)
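For illustration, a sketch of that (the V1/V2 namespaces are hypothetical stand-ins for the two assembly versions):

namespace V1 { public interface IExample { string DoSomething(); } }
namespace V2 { public interface IExample { void DoSomethingElse(); } }

// One class can serve both published interface versions.
public class Example : V1.IExample, V2.IExample
{
    public string DoSomething() { return "original behaviour"; }
    public void DoSomethingElse() { /* new behaviour */ }
}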
All the old code should be referencing 0.1.x.x and new code will reference 0.2.x.x. About the only issue is when you find (say) a security flaw and the fix needs to be back-ported to an earlier version. This is where a decent VCS comes in (Personal preference is TFS but SVN or anything else which supports branching/merging will do).
Merge the fixes from the 0.2 branch back into the 0.1 branch and then do a recompile to result in (say) 0.1.1.0.
As long as you stick to a process like this:
Major or Minor build will increment if there are any breaking changes (aka signatures will not change on Build/Revision increments)
Use publisher policies if the new Major/Minor version should be used by older programs (equivalent to guaranteeing nothing broke so use the new version anyway)
References in client apps should point at a Major/Minor version but not specify revision/build
This gives you:
A clean codebase without legacy clutter
Allows clients to use the latest version with no code changes if nothing has broken
Prevents clients using newer versions of an assembly which do have breaking changes until they recompile (and, one hopes, update their code as appropriate to take advantage of the new features.)
Allows you to release security patches for previous versions
The OP solved his problem as indicated by this comment:
In the end, I went with the interfaces idea because it allows me to keep multiple versions of a class member in a single class file. When I need to update the class, I'll just add the new interface, shadowing the member that has been changed, and change the return type on some of my methods. This works without breaking backwards compatibility because of polymorphism.
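A minimal sketch of that idea (names hypothetical), where the new interface shadows the changed member with a different return type:

public enum StatusCode { Ok, Failed }

public interface IStatusResult
{
    string GetStatus();
}

public interface IStatusResult2 : IStatusResult
{
    // Same member name, new return type in the new version.
    new StatusCode GetStatus();
}

public class StatusResult : IStatusResult2
{
    // Old clients coded against IStatusResult still get the string version.
    string IStatusResult.GetStatus() { return "OK"; }

    // New clients coded against IStatusResult2 get the richer type.
    public StatusCode GetStatus() { return StatusCode.Ok; }
}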
If this is mainly for serialization, this can be achieved in .NET using DataContractSerializer and data annotations. It can deserialize different versions of an object into the same type, allowing different versions of the same class to be deserialized while leaving any properties it can't map blank.
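A hedged sketch of that approach with DataContractSerializer (names hypothetical):

using System.Runtime.Serialization;

[DataContract(Name = "StatusResult")] // same contract name across versions
public class StatusResult : IExtensibleDataObject
{
    [DataMember]
    public string Status { get; set; }

    // Added in a later version; older payloads simply leave it null.
    [DataMember(IsRequired = false)]
    public int? StatusCode { get; set; }

    // Round-trips members this version does not know about.
    public ExtensionDataObject ExtensionData { get; set; }
}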

System.ComponentModel.DataAnnotations.compare vs System.Web.Mvc.Compare

MVC 4 Beta project fails to compile after upgrading to .Net 4.5.
This happens due to a conflict between
System.ComponentModel.DataAnnotations.CompareAttribute and System.Web.Mvc.CompareAttribute
System.ComponentModel.DataAnnotations.CompareAttribute MSDN documentation says:
Provides an attribute that compares two properties.
While System.Web.Mvc.CompareAttribute MSDN documentation says:
Provides an attribute that compares two properties of a model.
What is the difference between the two, and when would it be "smarter" to use each of them?
Thanks.
So, looking at the MSDN documentation and doing a literal comparison of the two classes, I noticed both classes are derived from System.ComponentModel.DataAnnotations.ValidationAttribute. In fact, the classes are almost exactly the same. The only notable difference is that the MVC version also implements IClientValidatable, which adds the following members:
FormatPropertyForClientValidation - (static member) Formats the property for client validation by prepending an asterisk and a dot.
GetClientValidationRules - Gets a list of compare-value client validation rules for the property using the specified model metadata and controller context.
As for which class you should use, if the model will be directly bound to a view, use the MVC version so that you can take advantage of the client-side validation. However, if you're using ViewModels, you can stick with the ComponentModel class and avoid the unnecessary overhead of the additional properties. Your call!
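For example (hypothetical model), when both namespaces are in scope you can disambiguate with the fully qualified name:

using System.ComponentModel.DataAnnotations;
using System.Web.Mvc;

public class ChangePasswordModel
{
    [Required]
    public string NewPassword { get; set; }

    // Fully qualified to pick the MVC version (with its client validation
    // rules) and avoid the ambiguity with the DataAnnotations attribute.
    [System.Web.Mvc.Compare("NewPassword")]
    public string ConfirmPassword { get; set; }
}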
System.Web.Mvc.CompareAttribute
System.ComponentModel.DataAnnotations.CompareAttribute
Microsoft Connect work-around is:
Posted by GavK on 6/17/2012 at 5:13 AM
I added a full reference to [System.Web.Mvc.Compare(...)] rather than
just using [Compare(...)]
Works for me in VS 2012...
Vinney nailed most of it with the exception of which one you should use...
The reason you have a conflict after changing your target framework to 4.5 is that prior to .NET 4.5 there was no CompareAttribute class in the System.ComponentModel.DataAnnotations namespace, and the class defined under System.Web.Mvc filled the gap. Therefore, if you were using the [Compare] and [Required] attributes in your model class before updating your target framework, you ended up with a conflict when you upgraded.
Assuming you are not using anything else in the System.Web.Mvc namespace in your model class, you should remove that using statement and let it rely on the System.ComponentModel.DataAnnotations namespace. Unobtrusive client-side validation will continue to work exactly as it did before, just as it does for other attributes that you decorate your model's properties with from the same namespace (eg Required).
If you wish to be explicit about the reference, you can simply add this line:
using CompareAttribute = System.Web.Mvc.CompareAttribute;
Using Visual Studio 2013 (MVC 5 project, .NET 4.5) the IntelliSense suggests that System.Web.Mvc.CompareAttribute is deprecated.
I used System.ComponentModel.DataAnnotations.CompareAttribute and it works fine.
It also does the client-side validation!
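A minimal sketch (hypothetical view model) using the DataAnnotations version in an MVC 5 project:

using System.ComponentModel.DataAnnotations;

public class RegisterViewModel
{
    [Required]
    [DataType(DataType.Password)]
    public string Password { get; set; }

    [DataType(DataType.Password)]
    [Compare("Password", ErrorMessage = "The passwords do not match.")]
    public string ConfirmPassword { get; set; }
}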
On this post, they also suggest another solution, which is to move the using directive for the preferred namespace for Compare() inside the model's namespace. E.g., if you prefer to use Compare from System.Web.Mvc, use:
using System.ComponentModel.DataAnnotations;

namespace MyProject.MyViewModel
{
    // The inner using takes precedence for name resolution in this namespace.
    using System.Web.Mvc;
    // ... model classes ...
}
The compiler will search inside the model's namespace first.

NH 3.2 Fluent Mapping Lazy Loading

I used NH 3.2 mapping by code, and I tried NHibernate Mapping Generator (http://nmg.codeplex.com/), which looked like a great tool.
I found a big difference between my code and theirs: on each class they have a call to the LazyLoad() function (although I thought that was the default behaviour).
Now I fear that my application doesn't use lazy loading. Does someone know the default behaviour of NH 3.2 with mapping by code (when we don't call the LazyLoad method)?
Regards
Depends on the default-lazy attribute of the hibernate-mapping tag which can be changed in Fluent NHibernate by adding the DefaultLazy.Always() or DefaultLazy.Never() convention.
If no default-lazy attribute is defined (no convention added in Fluent NHibernate), lazy loading is enabled.
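For illustration, a sketch making that default explicit in Fluent NHibernate (MyEntity is a placeholder):

using FluentNHibernate.Automapping;
using FluentNHibernate.Conventions.Helpers;

// Equivalent to the default behaviour: entities and collections are lazy
// unless a mapping opts out explicitly.
var model = AutoMap.AssemblyOf<MyEntity>()
    .Conventions.Add(DefaultLazy.Always());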