Prior to using ORMs we always performed object caching in our service layer. This gave us the ability to switch between different data layers without having to change our caching implementation.
Nowadays we use both Entity Framework (mainly code first) and NHibernate. NHibernate seems to have much better caching features with several 2nd level cache providers available.
Another problem I am faced with is that for both of the above ORMs we make use of lazy-loaded properties. So if we retrieve an object from the cache, we normally have to reattach it to the current ObjectContext/ISession, something we can't really do in our service layer.
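For reference, the reattach step in NHibernate looks something like the sketch below (the Order entity and the cached instance are hypothetical stand-ins):

    using NHibernate;

    public static class CacheReattachExample
    {
        // Sketch: reassociate a detached, cached entity with the current
        // ISession so its lazy-loaded properties can be resolved again.
        public static void Reattach(ISession session, Order cachedOrder)
        {
            // LockMode.None reattaches without issuing a version check.
            session.Lock(cachedOrder, LockMode.None);

            int lineCount = cachedOrder.Lines.Count; // lazy load works again
        }
    }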
So should I really be looking at implementing caching at the repository/data layer; and is it likely that I will find a common solution that will work for EF and NH?
Thanks
Ben
Have you looked at the CQRS pattern? Because caching is policy, I'm suspicious of trying to automate these decisions with a "one size fits all" cache like a second-level cache. E.g., if what you really want to cache is the HTML for your site's home page, then a second-level cache is just a waste of memory and an invitation to bugs.
CQRS turns these mechanical considerations into policy decisions. And it's ORM-agnostic.
My experience is that the 2nd level cache provided by the ORM caches data based on the actual result set retrieved from the database. This can give you a lot of overhead, because object instantiation and data population are pretty costly.
The advantage is that lazy loading etc. will work fine, but large result sets will still use up a lot of resources when the objects are repopulated.
We use a combination of the 2nd level cache and the ASP.NET cache in our web application because of that costly overhead, which in rare cases saves us as much as several seconds (!), but with the drawback of not being able to lazy load collections or update the entities.
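Roughly, the ASP.NET-cache side of that combination looks like this (the entity type and cache key are hypothetical):

    using System;
    using System.Web;
    using System.Web.Caching;
    using NHibernate;

    public static class OrderCache
    {
        // Sketch: keep fully populated objects in the ASP.NET cache so a hit
        // skips both the database round trip and NHibernate's
        // re-instantiation cost. The returned object is detached: no lazy
        // loading, no updates.
        public static Order GetOrder(ISession session, int id)
        {
            string key = "order-" + id;
            var cached = (Order)HttpRuntime.Cache.Get(key);
            if (cached != null)
                return cached;

            Order order = session.Get<Order>(id);
            HttpRuntime.Cache.Insert(key, order, null,
                DateTime.UtcNow.AddMinutes(10), Cache.NoSlidingExpiration);
            return order;
        }
    }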
This is based on NHibernate only; I have never worked with Entity Framework, but I'm guessing all ORM frameworks suffer from this.
For my application, I am looking at using an ORM and am currently trying to decide whether the domain layer should interface with it through a Data Access Object, repositories, or something else. I am hesitant to pair an ORM with repositories because they can become redundant if the ORM entities are identical to the domain objects, but having one big DAO seems kludgy. I want to keep my SQL centralized, but I can't figure out which of these options, if any, would make the most sense. Any suggestions on an appropriate design pattern?
This is very opinion-based, but I tend toward creating entities separate from my domain models. The domain model needs to closely match your domain, whereas your entities need to closely model your storage. They may initially match very closely and seem redundant, but they often drift apart from each other quickly.
That being said, wrappers that do nothing but map domain entities to persistence entities often feel horrible and like a giant waste of time. Additionally, the separation doesn't pay off until much later in the game, when you are refactoring and realize that your domain isn't quite right, but you don't want to modify your persistence layer.
The good news is, most languages/frameworks have some form of mapping library that will let you automatically map from one object to another that is similarly structured. This is a great way to speed things up initially, while still giving yourself the flexibility to create a manual mapping when the requirements change out from under you.
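In .NET, for instance, AutoMapper can do the structural mapping; a minimal sketch using the newer MapperConfiguration API (both types are hypothetical, and in real code the configuration would be built once at startup):

    using AutoMapper;

    // Hypothetical persistence entity and domain model that happen to be
    // structurally similar.
    public class CustomerEntity { public int Id { get; set; } public string Name { get; set; } }
    public class Customer      { public int Id { get; set; } public string Name { get; set; } }

    public static class CustomerMapping
    {
        private static readonly IMapper Mapper = new MapperConfiguration(
            cfg => cfg.CreateMap<CustomerEntity, Customer>()).CreateMapper();

        // Map the storage shape to the domain shape; replace with a manual
        // mapping once the two models drift apart.
        public static Customer ToDomain(CustomerEntity entity)
        {
            return Mapper.Map<Customer>(entity);
        }
    }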
What are the advantages/disadvantages of using NHibernate?
What kinds of applications should (and should not) be built using NHibernate?
Since other people have listed the advantages, I will just list the disadvantages.
Disadvantages
Increased startup time due to metadata preparation (not good for desktop-style apps).
Huge learning curve if you have no ORM background.
Comparatively hard to fine-tune the generated SQL.
Hard to get session management right if used in non-typical environments (read: non-web apps).
Not suited for apps without a clean domain object model (and not all apps in the world need a clean domain object model).
You have to jump through hoops if you have a badly designed (legacy) database schema.
Advantages:
Flexible and very powerful mapping capabilities.
Caching.
Very polished UnitOfWork implementation.
Future queries (article; see the sketch after this list).
Model classes are POCOs - which effectively means you can avoid the anemic domain antipattern.
Interceptors - you can do a kind of aspect-oriented programming, such as very easily implementing auditing, logging, authorization, and validation for your domain.
Lucene.NET and NHibernate are well integrated with each other, which gives you a very fast and effective implementation of full-text indexing.
It's very mature and popular in enterprise environments.
Big community.
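The future-query feature mentioned above batches several queries into a single round trip; a rough sketch (the entity name is hypothetical):

    using NHibernate;

    public static class FutureQueryExample
    {
        // Sketch: neither query executes immediately; both are sent to the
        // database in one batch when the first result is enumerated.
        public static void LoadDashboard(ISession session)
        {
            var orders = session.QueryOver<Order>().Future();
            var count = session.QueryOver<Order>().ToRowCountQuery().FutureValue<int>();

            foreach (var order in orders)
            {
                // ...render the order; the batch fired on first enumeration
            }
            int total = count.Value; // already loaded, no extra round trip
        }
    }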
Disadvantages:
The already-mentioned learning curve. You can start using NHibernate very quickly, but it will take you months to master it. I'd highly recommend reading the Manning NHibernate book.
Writing the XML mappings can be very tedious, especially for big databases with hundreds or thousands of tables, views and stored procedures. Yes, there are tools that will help you by generating those mappings, but you will still have to do quite a lot of manual work. Fluent NHibernate seems to simplify this process by getting rid of the XML mappings, as does Castle ActiveRecord (though AR is impossible to use for an anemic domain, as you define the mappings in attributes on your model classes).
Performance may be low for certain scenarios, for instance large bulk operations. For those you might have to use IStatelessSession, but it's an awkward experience, to say the least...
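For reference, a bulk insert with IStatelessSession looks roughly like this (the entity type is a hypothetical stand-in); note there is no first-level cache, no event listeners, and no cascading:

    using System.Collections.Generic;
    using NHibernate;

    public static class BulkInsertExample
    {
        // Sketch: IStatelessSession bypasses the first-level cache, dirty
        // checking and cascades, which makes large bulk inserts much cheaper.
        public static void BulkInsert(ISessionFactory factory, IEnumerable<Order> orders)
        {
            using (IStatelessSession session = factory.OpenStatelessSession())
            using (ITransaction tx = session.BeginTransaction())
            {
                foreach (var order in orders)
                    session.Insert(order); // child collections are NOT cascaded

                tx.Commit();
            }
        }
    }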
Advantages:
Open source
Based on widely approved patterns
NH is not a code generator :)
Disadvantages:
Half-done LINQ support
Low performance
(see, for example, the performance and LINQ tests on ormbattle.net)
Advantages:
Caching
Simplicity in your code
Power
Flexibility
Multi-database support
Disadvantages:
Stops you having to write your own persistence code
May reduce your knowledge of SQL
Applications you should use it for:
Any that use a database
A few more specific reasons to like NHibernate
Disadvantages: NHibernate is not a Microsoft product and will therefore face some resistance from coworkers who haven't heard of it - especially anti-FOSS bigots. Configuring the mapping files and the lazy/eager loading behavior can be time-consuming. If your database has a bizarre naming convention, an atypical design, or very strict performance requirements, more work may be required than expected.
I say this a lot, but ActiveRecord is a great layer over NHibernate. It uses attributes to map the data points to class members right in the classes themselves. People are not using this thing enough.
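A minimal sketch of what that looks like with Castle ActiveRecord (the table and members are hypothetical):

    using Castle.ActiveRecord;

    // Sketch: the mapping lives in attributes on the class itself instead
    // of a separate XML mapping file.
    [ActiveRecord("Customers")]
    public class Customer : ActiveRecordBase<Customer>
    {
        [PrimaryKey(PrimaryKeyType.Native)]
        public virtual int Id { get; set; }

        [Property]
        public virtual string Name { get; set; }
    }

    // Usage (after ActiveRecordStarter initialization):
    //   var customer = Customer.Find(42);
    //   customer.Name = "Renamed";
    //   customer.Save();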
The high-level answer is that NHibernate is in a class by itself, and there is no near competition.
If you need CRUD against a database from a .NET application, you should be using NHibernate, for at least two reasons:
1) You get LINQ support (which requires something like an ORM)
2) NHibernate is very mature
There are no significant disadvantages. There are other options, but those other options have significant disadvantages.
I wrote some more on this a while ago:
.NET and ORM - Decisions, decisions
For those who know the inner workings of NHibernate: do you think a large-scale web application like, say, Facebook or MySpace would use NHibernate?
Or is NHibernate better suited to lower-traffic sites like company sites, etc.? I.e., is it not enterprise-ready because of its chatty nature?
NHibernate is not chatty at all. Regarding scalability, there was already a question on NH's groups, which was more about the complexity of the database than about traffic, but it might still be interesting for you.
Even though there are always complaints about unnecessary queries with every ORM, because of an ORM's generic nature, that doesn't mean it is chatty. On the other hand, it optimizes situations that would be too complex to optimize in hand-written DALs, e.g. query batching or lazy loading.
NHibernate is quite lightweight compared to other ORMs, especially considering its powerful features.
NHibernate (like any other ORM) could be considered overkill if there is no object-oriented business model but you need to optimize for the highest performance. I don't think that Google could make use of NHibernate for its search engine, for instance.
Edit:
The performance and power of NHibernate are not entirely free. They require that the developers understand at least the basics of relational databases. Other ORMs try to hide the relational problems entirely, which leads to much less optimized behaviour.
NHibernate is a professional joke.
In my company, its use has been prohibited for several reasons.
As a tool it is quite unproductive; you'll spend countless hours trying to figure things out, or finding alternative strategies in the scarce documentation.
Much better: use your own generated DAL and SPs to achieve high performance. You'll have a cached execution plan, and in the end that's what really matters.
NHibernate has no advanced support for memcached, which is precisely what you are going to use if you want to build a scalable web solution like Facebook.
I work for a social gaming company, and we have specifically forbidden the use of NHibernate.
NHibernate supports query caching, 2nd level caching based on primary keys, and also a session cache for repeated hits on the same entity within the same session.
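For illustration, opting a query into those caches looks something like this (the entity and region names are hypothetical; a second-level cache provider must be configured separately):

    using System;
    using System.Collections.Generic;
    using NHibernate;

    public static class QueryCacheExample
    {
        // Sketch: the query cache stores the matching IDs; the second-level
        // cache stores the entities those IDs point to.
        public static IList<Order> GetRecentOrders(ISession session)
        {
            DateTime cutoff = DateTime.UtcNow.AddDays(-1);
            return session.QueryOver<Order>()
                .Where(o => o.CreatedOn > cutoff)
                .Cacheable()                  // opt into the query cache
                .CacheRegion("recent-orders") // hypothetical region name
                .List();
        }
    }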
That's all a great help, but as long as you are hitting a database under a large load, you are going to have scaling problems. The best way to scale a database is to minimise the amount of time you actually have to use it. A distributed cache such as memcached, and caching your output (either post-datacrunched views or HTML), are the best ways to scale an application. If clients are regularly hitting the database, you are doing it wrong, ORM or not. A .NET application, like a typical MVC app, has the advantage of being able to use vary-by output caching and donut/donut-hole caching, as well as memcache clients that can be used both with NHibernate and for your ViewModels.
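For example, output caching in ASP.NET MVC is a single attribute (the duration, parameter, and helper here are illustrative):

    using System.Web.Mvc;

    public class OrdersController : Controller
    {
        // Sketch: cache the rendered HTML for 60 seconds, one entry per id,
        // so repeated hits never reach NHibernate or the database.
        [OutputCache(Duration = 60, VaryByParam = "id")]
        public ActionResult Details(int id)
        {
            var model = LoadOrderViewModel(id); // hypothetical helper
            return View(model);
        }

        private object LoadOrderViewModel(int id)
        {
            return new { Id = id }; // placeholder for a real ViewModel
        }
    }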
I've been having a brief look at NHibernate and Linq2Sql. I'm also intending to have a peek at Entity Framework.
The question that gets raised when I talk about these ORMs is "they can't scale" - so can they? From Google I get the impression they're able to scale well, but ultimately I suppose there must be a price to pay. Is it worth paying it for a richer, simpler business layer?
This is a good question, and IMHO they can scale just as well as any custom DAL. I've only used NHibernate, so I will focus on it and the features it has that can help scale a system.
Lazy Loading - Since it supports lazy loading, you can avoid loading any unnecessary items. Of course, you need to watch out for the select N+1 problem; however, there are features in the system to prevent it.
Eager Fetching - There are various ways to eagerly fetch objects you know you will need, allowing you to avoid extra trips to SQL (see the sketch after this list).
Second Level Cache - NHibernate has support for a second-level cache, which can be used to increase scalability by reducing trips to the DB. There are various backing providers available, which gives you some flexibility.
Write your own SQL - In NHibernate you can call stored procedures, or provide an inline SQL query that will return your entities. This lets you use your own SQL when the generated SQL doesn't cut it, for example when eager loading a self-joining tree using a recursive query.
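A quick sketch of the eager-fetching point (the entities are hypothetical): this turns a potential N+1 into a single joined query.

    using System.Collections.Generic;
    using NHibernate;
    using NHibernate.Transform;

    public static class EagerFetchExample
    {
        // Sketch: fetch the Lines collection in the same SQL statement as
        // the orders, instead of one extra SELECT per order.
        public static IList<Order> GetOrdersWithLines(ISession session)
        {
            return session.QueryOver<Order>()
                .Fetch(o => o.Lines).Eager
                .TransformUsing(Transformers.DistinctRootEntity) // de-dup join rows
                .List();
        }
    }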
Now, with that said, I think it is easier to initially tweak a custom DAL because you are intimate with its construction and can fine-tune it; however, a good ORM provides plenty of hooks that allow you to optimize quite a bit. You just need to spend some time learning it.
I also feel that if you have a performance-critical area of code and you can't get your ORM to work within your requirements, then for that tiny area of your application you can custom-build your own DAL. If you're using a decent design pattern, such as a repository created by a factory, all you need to do is swap out the implementation of that repository.
Hibernate Shards is being ported to NHibernate, which will allow for horizontal scaling.
There are also some very cool hacks like this one to implement sharding.
So the answer is yes, NHibernate can scale, in a persistence-ignorant and fully transparent way.
It's simply incorrect to say that apps built on an ORM do not scale well. Certainly it has happened before that careless or lazy devs abuse an ORM by writing code that generates horribly inefficient SQL. Building performant apps means understanding something about what all the lovely abstractions actually do under the hood. It does not take much to stay out of this trap, however. Using an ORM doesn't mean never opening SQL Profiler or NHibernate Profiler.
And regarding the claim that SPs are just a whole lot faster, read this and this. And besides, ORMs (NHibernate, at least) give you pretty easy ways to use SPs if you ever need to.
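In NHibernate, for instance, calling a stored procedure and mapping the results back to entities can look like this (the procedure and entity are hypothetical):

    using System.Collections.Generic;
    using NHibernate;

    public static class StoredProcExample
    {
        // Sketch: run a stored procedure through NHibernate and hydrate the
        // result set into mapped entities.
        public static IList<Order> GetOrdersForCustomer(ISession session, int customerId)
        {
            return session.CreateSQLQuery("exec dbo.GetOrdersForCustomer :customerId")
                .AddEntity(typeof(Order))
                .SetInt32("customerId", customerId)
                .List<Order>();
        }
    }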
I'm a student currently dabbling in a .NET n-tier app that uses NHibernate + WCF + WPF.
One of the things that is done quite terribly is object graph serialisation; in fact, it isn't done at all. Currently associations are ignored and we are using DTOs everywhere.
As far as I can tell, one way to proceed is to predefine which objects and collections should be loaded and serialised to go across the wire, thus being able to present some associations to the client. However, this seems limited, inflexible and inconsistent (can you tell that I don't like this idea?).
One option that occurred to me was to simply replace the NH proxies that lazy-load collections on the client tier with a "disconnected proxy" that would retrieve the associated stuff over the wire. This would mean we'd have to expand our web service signature a little and do some hackery on our generated proxies, but it seemed like a good T4/other code-gen experiment.
As far as I can tell this seems to be a common stumbling block, but after doing a lot of reading I haven't been able to find any good, generally accepted solutions. I'm looking for a bit of direction as much as any particular solution, but if there is an easy way to make the client "feel" connected, please let me know.
You ask a very good question that unfortunately does not have a very clean answer. Even if you were able to get lazy loading to work over WCF (which we were able to do), you would still have issues using the proxy interceptor. Trust me on this one: you want POCO objects on the client tier!
What you really need to consider - what has been conceived as the industry-standard approach to this problem, from the research I have seen - is called persistence vs. usage, or persistence ignorance. In other words, your object model and mappings represent your persistence domain, but they do not match your ideal usage scenarios. You don't want to bring the whole database down to the client just to display a couple of properties, right?
It seems like such a simple problem, but the solution is either very simple or very complex. On one hand you can design your entities around your usage scenarios, but then you end up with a proliferation of your object domain, making it difficult to maintain. On the other, you still want the rich object model relationships in order to write granular business logic.
To simplify this problem, let's examine the two main gaps we need to fill: between the database and the service layer, and between the service layer and the client. NHibernate fills the first one just fine by providing an ORM to load data into your objects. It does a decent job, but in order to achieve great performance it needs to be tweaked using custom loading strategies. I digress...
The second gap, between the server and the client, is where things get dicey. To simplify, imagine you did not send any mapped entities over the wire to the client. Try creating a mechanism that translates business entities into DTOs, and likewise DTOs back into business entities. That way your client deals only with DTOs (POCOs, of course), and your business logic can maintain its rich structure. This allows you to leverage not only NHibernate's lazy loading mechanism, but other benefits of the session such as the L1 cache.
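A deliberately minimal illustration of that translation step (this is not the author's mechanism; all types are hypothetical):

    using System.Runtime.Serialization;

    // Sketch: the client only ever sees OrderDto; the rich Order entity,
    // with its lazy-loaded associations, never crosses the wire.
    [DataContract]
    public class OrderDto
    {
        [DataMember] public int Id { get; set; }
        [DataMember] public string CustomerName { get; set; }
        [DataMember] public int LineCount { get; set; }
    }

    public static class OrderTranslator
    {
        // Runs server-side, inside the ISession, so lazy loads and the L1
        // cache still work while the DTO is being populated.
        public static OrderDto ToDto(Order order)
        {
            return new OrderDto
            {
                Id = order.Id,
                CustomerName = order.Customer.Name, // lazy load resolved here
                LineCount = order.Lines.Count
            };
        }
    }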
For brevity and intellectual property reasons I will not go into the design of said mechanism, but hopefully this is enough information to point you in the right direction. If you don't care about performance or latency at all, just turn lazy loading off altogether and work through the serialization issues.
It has been a while for me, but the injected/disconnected proxies may not be as bad as they sound. Since you are a student, I am going to assume you have some time and want to muck around a bit.
If you want to inject your own custom serialization/deserialization logic, you can use IDataContractSurrogate, which can be applied using DataContractSerializerOperationBehavior. I have only done a few basic things with this, but it may be worth looking into. By adding some fun logic (read: potentially hackish) at this layer, you might be able to make it feel more connected.
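A rough sketch of the idea, assuming the goal is to unwrap NHibernate's lazy-loading proxies before WCF serializes them (the proxy-handling details here are an assumption, not a tested recipe):

    using System;
    using System.CodeDom;
    using System.Collections.ObjectModel;
    using System.Reflection;
    using System.Runtime.Serialization;
    using NHibernate.Proxy;

    public class NHibernateProxySurrogate : IDataContractSurrogate
    {
        // Proxy classes subclass the mapped entity, so report the base type.
        public Type GetDataContractType(Type type)
        {
            return typeof(INHibernateProxy).IsAssignableFrom(type) ? type.BaseType : type;
        }

        // Replace a proxy with its underlying entity (this forces a load).
        public object GetObjectToSerialize(object obj, Type targetType)
        {
            var proxy = obj as INHibernateProxy;
            return proxy != null ? proxy.HibernateLazyInitializer.GetImplementation() : obj;
        }

        public object GetDeserializedObject(object obj, Type targetType) { return obj; }

        // The remaining members are not needed for this scenario.
        public object GetCustomDataToExport(MemberInfo memberInfo, Type dataContractType) { return null; }
        public object GetCustomDataToExport(Type clrType, Type dataContractType) { return null; }
        public void GetKnownCustomDataTypes(Collection<Type> customDataTypes) { }
        public Type GetReferencedTypeOnImport(string typeName, string typeNamespace, object customData) { return null; }
        public CodeTypeDeclaration ProcessImportedType(CodeTypeDeclaration typeDeclaration, CodeCompileUnit compileUnit) { return typeDeclaration; }
    }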
Here is an MSDN post from someone who came to the same realization: the DynamicProxy used by NHibernate makes it impossible to directly serialize NHibernate objects that do lazy loading.
If you are really determined to transport the object graph across the network and preserve lazy-loading functionality, take a look at some code I produced over here: http://slagd.com/?page_id=6 . Basically it creates a fake session on the other side of the wire and allows the NHibernate proxies to retain their functionality. Not saying it's the right way to do things, but it might give you some ideas.