NHibernate: Override ISession for Faking the Database during testing

I am working on a project that has over 2,000 integration tests that make full round trips to the database. I want to speed up the process, so I thought: why not fake the database?
We are using Fluent NHibernate as our ORM, which is probably why I am having such trouble. We have already implemented this concept in a program that does not use NHibernate, but rather basic CRUD operations.
Basically, on any CRUD operation I would like to save the object in memory, say in a dictionary. This would speed up our tests, which would hopefully shorten our build times. Not to mention the cool factor.
I have looked into having two separate sessions and having the session factory use one or the other, but I would have to implement many methods/properties that I really don't care about.
I also considered using a go-between class that does the translating myself, but that would probably mean changing A LOT of existing code. I am trying to limit the impact on the rest of the project as much as possible so that regression testing will not be a HUGE factor.
Please let me know of any other suggestions anyone has!
Thanks!

Use a SQLite in-memory database. Here's the blog post that originally showed me how...
http://jasondentler.com/blog/2009/08/nhibernate-unit-testing-with-sqlite-in-memory-db/
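For illustration, a minimal sketch of that approach with Fluent NHibernate, assuming your mappings live in the same assembly as a hypothetical YourEntityMap class. Since a SQLite in-memory database only lives as long as its connection, the schema is exported onto each test session's own connection:

    using FluentNHibernate.Cfg;
    using FluentNHibernate.Cfg.Db;
    using NHibernate;
    using NHibernate.Cfg;
    using NHibernate.Tool.hbm2ddl;

    public class InMemoryDatabaseFixture
    {
        private static ISessionFactory _sessionFactory;
        private static Configuration _configuration;

        public ISession OpenSession()
        {
            if (_sessionFactory == null)
            {
                _sessionFactory = Fluently.Configure()
                    .Database(SQLiteConfiguration.Standard.InMemory())
                    .Mappings(m => m.FluentMappings.AddFromAssemblyOf<YourEntityMap>())
                    .ExposeConfiguration(cfg => _configuration = cfg)
                    .BuildSessionFactory();
            }

            // The in-memory schema disappears when the connection closes,
            // so build it on this session's connection for every test.
            var session = _sessionFactory.OpenSession();
            new SchemaExport(_configuration).Execute(false, true, false, session.Connection, null);
            return session;
        }
    }

Each test then opens its own session, runs against a fresh schema, and throws everything away when the session closes.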

Related

Starting on ORMs - NHibernate

I am starting to delve into the realm of ORMs, particularly NHibernate, for developing .NET data-aware applications. I must say that the learning curve is pretty steep and that there is a lot to take in. Apparently, it changes the way you do data-aware applications: the manner of coding, the development process, just about everything.
Anyway, I want to ask: do you set some criteria when deciding whether to USE or NOT TO USE an ORM in your applications? How do you then decide on the approach needed to make it valuable to your organization?
The organization I work for now has a lot of SQL and data access code running through the back end, and I must say that these classes/methods/procedures have successfully performed their task of providing the data which is needed, when it is needed. I think it would be a tremendous effort just to map some of this into an ORM and derive the same business value that the company has had for the last few years.
Nevertheless, I know that ORMs pave the way for applications to talk with database servers, if properly implemented. I must admit that I am at a learning stage and that I will possibly need all the help, resources and guidance to make this transition. I was also thinking of buying the book from Manning, but I feel that with so many changes to NHibernate the book may be a bit outdated. Perhaps waiting for the Packt book on NHibernate (release in May 2010??) would help me get up and running better.
Kindly share your thoughts. By the way, if you could also point me to a small sample web app which uses NHibernate + Visual Web Developer 2008 Express and SQL Server, that would be highly appreciated.
Thanks.
For me, the short of it is the following:
If you don't use an established ORM, and you develop correctly (meaning you refactor out duplication and look to simplify where you can), you'll wind up building your own ORM through the evolution of your data access layer.
The question then becomes:
"Do I want my developers spending time learning the idiosyncrasies of my home-grown ORM or learning those of a well-documented and well-tested ORM?"
Furthermore:
"If I'm hiring a new developer, wouldn't it be nicer to bring in a developer that knows the established ORM tool we're using rather than having to train someone up on this thing I built?"
I use NHibernate, particularly Fluent - and it's great; if given the choice, I wouldn't develop on an RDBMS any other way.
To be successful with an ORM you must make sure to normalize correctly and use the database for its designed purpose: storing data.
I don't use an ORM when:
I don't use a relational database (relational databases are not the best choice of database for every application)
the database has a very small number of tables (I might need less code without an ORM)
I use a very simple database that can map to code with simple naming conventions (mapping to dumb DTO classes, with all queries like select * from tablename where id=#id)
Learning a good ORM is worth the time and effort; it will save you from writing a lot of code if you use relational databases a lot.
You can find example apps/tutorials/videos about NHibernate with a Stack Overflow search. There is another book in progress from Manning; it may be possible to read it through their early access program.

NHibernate Interceptor - What is it

What is NHibernate Interceptor, and what purpose does it serve in an application?
Also, in this article, I learned that using NHibernate makes a desktop application slower at startup, so to avoid this, I need to save the configuration in a file, and later load it from the saved file. How can I do that? I didn't find any examples in that tutorial.
An interceptor allows you to execute additional functionality when an entity is retrieved / deleted / updated / inserted in the DB ...
Interceptors article
Hibernate doc
other useful info
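To make that concrete, a minimal hedged sketch: you typically inherit from NHibernate's EmptyInterceptor and override only the hooks you need. The AuditInterceptor name here is hypothetical.

    using System;
    using NHibernate;
    using NHibernate.Type;

    // A hypothetical auditing interceptor: logs every insert.
    public class AuditInterceptor : EmptyInterceptor
    {
        // Called just before an entity is inserted. Return true only if
        // you modified the entity's state array.
        public override bool OnSave(object entity, object id, object[] state,
                                    string[] propertyNames, IType[] types)
        {
            Console.WriteLine("Saving {0} ({1})", entity.GetType().Name, id);
            return false; // state untouched
        }
    }

    // Hooked up when opening a session:
    // var session = sessionFactory.OpenSession(new AuditInterceptor());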
About making your app slower:
I'd suggest that you only look at optimizing start-up time when it actually becomes a problem.
When you build a session factory, NHibernate parses all the mappings, and that is a somewhat expensive operation. But as long as you have a limited number of entities, the performance hit isn't that big.
I have never had to optimize NHibernate's initialization because of slow startup times.
I'd suggest you first concentrate on the core of your application (the problem you're trying to solve) and afterwards look at how you could improve startup performance, if you ever have to at all.
Interceptors, as the name itself says, allow you to intercept NHibernate operations (save/update/delete/load/flush/etc.).
A newer, more flexible API to achieve this is the event system.
About serializing the configuration: the code is there, in the class Effectus.Infrastructure.BootStrapper, which is called at application startup.
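The trick relies on Configuration being serializable. A sketch under the assumption that your mappings are stable between runs; real code (like that BootStrapper) must also invalidate the cache when mapping assemblies change, and the file name here is arbitrary:

    using System.IO;
    using System.Runtime.Serialization.Formatters.Binary;
    using NHibernate.Cfg;

    public static class ConfigurationCache
    {
        private const string CacheFile = "hibernate.cfg.bin"; // arbitrary name

        public static Configuration LoadOrBuild()
        {
            var formatter = new BinaryFormatter();

            // Reuse the cached configuration if we already have one.
            if (File.Exists(CacheFile))
                using (var stream = File.OpenRead(CacheFile))
                    return (Configuration)formatter.Deserialize(stream);

            // Otherwise do the expensive mapping parse once and cache it.
            var cfg = new Configuration().Configure();
            using (var stream = File.Create(CacheFile))
                formatter.Serialize(stream, cfg);
            return cfg;
        }
    }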
A series of mine dissecting interceptors can be found here:
http://blog.scooletz.com/2011/02/03/nhibernate-interceptor-magic-tricks-pt-1/
Hope it helps.

O/R Mappers - Good or bad

I am really torn right now between using O/R mappers or just sticking to traditional data access. For some reason, every time I bring up O/R mappers, fellow developers cringe and speak about performance issues or how they're just bad in general. What am I missing here? I'm looking at LINQ to SQL and Microsoft Entity Framework. Is there any basis to any of these claims? What kind of things do I have to compromise if I want to use an O/R mapper. Thanks.
This will seem like an unrelated answer at first, but: one of my side interests is WWII-era fighter planes. All of the combatant nations (US, Great Britain, Germany, USSR, Japan etc.) built a bunch of different fighters during the war. Some of them used radial engines (P47, Corsair, FW-190, Zero); some used inline liquid-cooled engines (Bf-109, Mustang, Yak-7, Spitfire); and some used two engines instead of one (P38, Do-335). Some used machine guns, some used cannons, and some used both. Some were even made out of plywood, if you can imagine.
In the end, they all went really really fast, and in the hands of a competent, experienced pilot, they would shoot your rookie ass down in a heartbeat. I don't imagine many pilots flew around thinking "oh, that idiot is flying something with a radial engine - I don't have to worry about him at all". Everyone understood that there were many different ways of achieving the ultimate goal, and each approach had its particular advantages and disadvantages, depending on the circumstances.
The debate between ORM and traditional data access is just like this, and it behooves any programmer to become competent with both approaches, and choose the option that is right for the job at hand.
I struggled with this decision for a long time. I think I was hesitant for two primary reasons. First, O/R mappers represented a lack of control over what was happening in a critical part of the app and, second, because so many times I've been disappointed by solutions that are awesome for the 90% case but miserable for the last 10%. Everything works for select * from authors, of course, but when you crank up the complexity and have a high-volume, critical system and your career is on the line, you feel you need to have complete control to tune every query pattern and byte over the wire. Most developers, including me, get frustrated the first time the tool fails us, and we cannot do what we need to do, or our need deviates from the established pattern supported by the tool. I'll probably get flamed for mentioning specific flaws in tools, so I'll leave it at that.
Fortunately, Anderson Imes finally convinced me to try CodeSmith with the netTiers template. (No, I don't work for them.) After more than a year using this, I can't believe we didn't do it sooner. My team uses Visual Studio DB Pro, and on every check-in our continuous integration build drops out a new set of data access layer assemblies. This handles all the common, low-risk stuff automatically, yet we can still write custom sprocs for the tricky bits and have them included as methods on the generated classes, and we can customize the templates for the generated code as well. I highly recommend this approach. There may be other tools that allow this level of control as well, and there is a newer CodeSmith template called PLINQO that uses LINQ to SQL under the hood. We haven't examined that yet (haven't needed to), but this overall approach has a lot of merit.
Jerry
O/RM tools are designed to perform very well in most situations. They will cache entities for you, execute queries in batches, use low-level optimised access to objects that is far faster than manually assigning values to properties, give you an easy way to incorporate variations of aspect-oriented programming using techniques like interceptors, manage entity state for you and help resolve conflicts, and much more.
The cons of this approach usually lie in a lack of understanding of how things work at a very low level. The most classic problem is "SELECT N+1" (link), sketched below.
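For example (the Order/Customer entities here are hypothetical), this innocent-looking loop issues one query for the orders plus one lazy-load query per order, while eagerly fetching the association collapses it into a single joined query:

    using System;
    using NHibernate;

    public static class SelectNPlusOneDemo
    {
        public static void Run(ISession session)
        {
            // One query for the orders... plus one more per order
            // when each lazy Customer proxy is first touched.
            foreach (var order in session.CreateCriteria<Order>().List<Order>())
                Console.WriteLine(order.Customer.Name);

            // Eager fetch: a single SELECT with a JOIN instead.
            var orders = session.CreateCriteria<Order>()
                .SetFetchMode("Customer", FetchMode.Join)
                .List<Order>();
        }
    }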
I've been working with NHibernate for 2.5 years now, and I'm still discovering something new about it almost on a daily basis...
Good. In most cases.
The productivity benefit of using an ORM, will in most case outweigh the loss of control over how the data is accessed.
Not many people would avoid C# in order to program in MSIL or assembly, although that would give them more control.
The problem that I see with a lot of O/R mappers is that you get bloated domain objects, which are usually highly coupled with the rest of your data access framework. Our developers cringe at that as well :) It's just harder to port these objects to another data access technology. If you use L2S, you can take a look at the generated code. It looks like a complete mess. NHibernate is probably one of the best at this. Your entities are completely unaware of your data access layer, if you design them right.
It really depends on the situation.
I went from a company that used a tweaked-out ORM to a company that did not use an ORM and wrote SQL queries all the time. When I asked about using an ORM to simplify the code, I got a blank look, followed by all of its supposed negatives:
it's highly bloated
you don't have fine control over your queries, and you execute unnecessary ones
there is heavy object-to-table mapping
it's not DRY code, because you have to repeat yourself
and so on
Well, after working there for a few weeks, I noticed that:
we had several queries that were almost identical, and a lot of the time, if there was a bug, only a handful of them would get fixed
instead of caching common table queries, we would end up reading a table multiple times
we were repeating ourselves all over the place
we had a range of skill levels, so some queries were not written very efficiently
After I pointed most of this out, they wrote a "DBO", because they didn't want to call it an ORM. They decided to write one from scratch instead of adapting an existing one.
Also, I feel a lot of the arguments against ORMs come from ignorance. Every ORM that I have seen allows for custom queries, and even following the ORM's conventions you can write very complex and detailed queries that are normally more human-readable. They also tend to be very DRY: you give them your schema, and they figure the rest out, down to relationship mapping.
Modern ORMs have a lot of tools to help you out, like migration scripts, and some can map multiple database types to the same objects so you can leverage the advantages of both NoSQL and SQL databases. But you have to pick the right ORM for your project if you're going to use one.
I first got into ORM mapping and data access layers from reading Rockford Lhotka's book on C# business objects. He has spent years working on a framework for DALs. While his framework is quite bloated out of the box and in some cases overkill, he has some excellent ideas. I highly recommend the book for anyone looking at ORM mappers. I was influenced by his book enough to take away a lot of his ideas and build them into my own framework and code generation.
There is no simple answer to this, since each ORM provider will have its own particular pluses and minuses. Some ORM solutions are more flexible than others. The onus is on the developer to understand these before using one.
However, take LINQ to SQL: if you are sure you are not going to need to switch away from SQL Server, then it solves a lot of the common problems seen in ORM mappers. It allows you to easily add stored procedures (as static methods), so you aren't limited to generated SQL. It uses deferred execution, so you can chain queries together efficiently. It uses partial classes so you can easily add custom logic to generated classes without needing to worry about what happens when you regenerate them. There is also nothing stopping you using LINQ to create your own, abstracted DAL; it just speeds up the process. The main thing, though, is that it alleviates the tedium and time required to create a basic CRUD layer.
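A sketch of the deferred-execution and partial-class points, assuming a hypothetical designer-generated NorthwindDataContext with a Customers table:

    using System;
    using System.Linq;

    // Partial class: members added here survive regeneration of the designer file.
    public partial class Customer
    {
        public string DisplayName
        {
            get { return CompanyName + " (" + Country + ")"; }
        }
    }

    public static class DeferredExecutionDemo
    {
        public static void PrintTopUkCustomers()
        {
            using (var db = new NorthwindDataContext())
            {
                // Each step only composes the query; no SQL runs yet.
                var query = db.Customers
                              .Where(c => c.Country == "UK")
                              .OrderBy(c => c.CompanyName)
                              .Take(10);

                // Enumeration triggers a single generated SQL statement.
                foreach (var customer in query)
                    Console.WriteLine(customer.DisplayName);
            }
        }
    }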
But there are downsides, too. There will be a tight coupling between your tables and classes, there will be a slight performance drop, and you may occasionally generate queries that are not as efficient as you expected. And you are tied to SQL Server (though some other ORM technologies are database agnostic).
As I said, the main thing is to be aware of the pros and cons before pinning your colours to a particular methodology.

Can Persistence Ignorance Scale?

I've been having a brief look at NHibernate and Linq2Sql. I'm also intending to have a peek at Entity Framework.
The question that gets raised when I talk of these ORMs is "they can't scale". So, can they? From Google I get the impression they're able to scale well, but ultimately I suppose there must be a price to pay. Is it worth paying for a richer, simpler business layer?
This is a good question, and IMHO they can scale just as well as any custom DAL. I've only used NHibernate, so I will focus on it and the features it has that can help scale a system.
Lazy Loading - Since it supports lazy loading, you can avoid loading any unnecessary items. Of course you need to watch out for the SELECT N+1 problem; however, there are things in the system to prevent it.
Eager Fetching - There are various ways to eagerly fetch objects you might need, allowing you to avoid extra round trips to the database.
Second Level Cache - NHibernate has support for a second-level cache, which can increase scalability by reducing trips to the DB. There are various backing providers available, which gives you some flexibility.
Write your own SQL - In NHibernate you can call stored procedures, or provide inline SQL queries that return your entities. This lets you use your own SQL when the generated SQL doesn't cut it, for example eagerly loading a self-joining tree using a recursive query (see the sketch below).
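For instance, dropping to a hand-written stored procedure for that last case might look like this sketch (the dbo.GetCategoryTree procedure and the Category entity are hypothetical):

    using System.Collections.Generic;
    using NHibernate;

    public static class CategoryQueries
    {
        // Loads a whole category tree in one round trip via a recursive
        // stored procedure, mapping the rows back onto a mapped entity.
        public static IList<Category> LoadTree(ISession session, int rootId)
        {
            return session
                .CreateSQLQuery("EXEC dbo.GetCategoryTree :rootId")
                .AddEntity(typeof(Category))
                .SetInt32("rootId", rootId)
                .List<Category>();
        }
    }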
Now, with that said, I think it is easier to initially tweak a custom DAL because you are intimate with its construction and can fine-tune it; however, a good ORM will provide plenty of hooks that allow you to optimize quite a bit. You just need to spend some time learning them.
I also feel that if you have a performance-critical area of code and you can't get your ORM to work within your requirements, then for that tiny area of your application you can custom-build your own DAL. If you're using a decent design pattern, such as a repository created by a factory, then all you need to do is swap out the implementation of your repository (see the sketch below).
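A sketch of that shape, with all names illustrative: business code depends only on the interface, so the hot spot can get a hand-tuned implementation without touching anything else.

    using NHibernate;

    public interface IOrderRepository
    {
        Order GetById(int id);
        void Save(Order order);
    }

    // The everyday implementation rides on the ORM.
    public class NHibernateOrderRepository : IOrderRepository
    {
        private readonly ISession _session;
        public NHibernateOrderRepository(ISession session) { _session = session; }

        public Order GetById(int id) { return _session.Get<Order>(id); }
        public void Save(Order order) { _session.SaveOrUpdate(order); }
    }

    public static class RepositoryFactory
    {
        // Swap this line for a hand-rolled ADO.NET implementation in the
        // one performance-critical area, and callers never notice.
        public static IOrderRepository CreateOrderRepository(ISession session)
        {
            return new NHibernateOrderRepository(session);
        }
    }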
Hibernate Shards is being ported to NHibernate, which will allow for horizontal scaling.
There are also some very cool hacks like this one to implement sharding.
So the answer is yes, NHibernate can scale, in a persistence-ignorant and fully transparent way.
It's simply incorrect to say that apps built in an ORM do not scale well. Certainly it has happened before that careless or lazy devs abuse an ORM by writing code that generates horribly inefficient SQL. Building performant apps means understanding something about what all the lovely abstractions actually do under the hood. It does not take much to stay out of this trap however. Using an ORM doesn't mean never opening SQL profiler or NHibernate Profiler.
And regarding the claim that SPs are just a whole lot faster, read this and this. And besides, ORMs (NHibernate, at least) give you pretty easy ways to use SPs if you ever need to.

Code generators or ORMs?

What do you suggest for Data Access layer? Using ORMs like Entity Framework and Hibernate OR Code Generators like Subsonic, .netTiers, T4, etc.?
For me, this is a no-brainer, you generate the code.
I'm going to go slightly off topic here because there's a bigger underlying fallacy at play. The fallacy is that these ORM frameworks solve the object/relational impedance mismatch. This claim is a barefaced lie.
I find the best way to resolve the object/relational impedance mismatch is to either use OOP exclusively and use an object database or use the idioms of the relational database exclusively and ignore OOP.
The abstraction "everything is a table" is to me, much more powerful than the abstraction "everything is a class." It takes less code, less intellectual effort and leads to faster code when you code to the database rather than to an object model.
To me this seems obvious. If your application is data driven then surely your code should be data driven too? Yet to say this is hugely controversial.
The central problem here is that OOP becomes a really leaky abstraction when used in conjunction with a database. Code that looks perfectly sensible when written to the idioms of OOP looks completely insane when you see the traffic that code generates at the database. When that messiness becomes a performance problem, OOP is the first casualty.
There is really no way to resolve this. Databases work with sets of data. OOP focus on instances of classes. Trying to marry the two is always going to end in divorce.
So to answer your question, I believe you should generate your classes and try and make them map the underlying database structure as closely as possible.
Perhaps controversially, I've always felt that using code generators for the ADO.NET plumbing is fundamentally solving the wrong problem.
At some point, hopefully not too long after learning about Connection Strings, SqlCommands, DataAdapters, and all that, we notice that:
Such code is ugly
It is very boring to write
It's very easy to miss something if you're doing it by hand
It has to be repeated every time you want to access the database
So, the problem to solve is "how to do the same thing lots of times fast"?
I say no: that's the wrong problem to be solving.
Using code generators to make this process quick still means that you have a ton of code, all the same, all over your business classes (or your data access layer, if you separate that from your business logic).
And then, if you want to do something generically (like track stored procedure usage, for instance), you end up having to customise your code generator if it doesn't already have the feature you want. And even if it does, you still have to regenerate everything all the time.
I like to do things once, not many times, no matter how fast I can do them.
So I rolled my own Data Access class that knows how to add parameters, set up and close connections, manage transactions, and other cool stuff. It only had to be written once, and calling its methods from a Business object that needs some database stuff done consists of one line of code.
When I needed to make the application support multithreaded database accesses, it required a change to the Data Access class only, and all the business classes just do what they already did.
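The shape of such a class might look like this sketch (simplified, SQL Server-specific, and with illustrative names):

    using System.Data;
    using System.Data.SqlClient;

    public class DataAccess
    {
        private readonly string _connectionString;

        public DataAccess(string connectionString)
        {
            _connectionString = connectionString;
        }

        // All connection, parameter, and cleanup plumbing lives here, once.
        public DataTable ExecuteQuery(string sql, params SqlParameter[] parameters)
        {
            using (var connection = new SqlConnection(_connectionString))
            using (var command = new SqlCommand(sql, connection))
            {
                command.Parameters.AddRange(parameters);
                var table = new DataTable();
                new SqlDataAdapter(command).Fill(table); // opens/closes for us
                return table;
            }
        }
    }

    // A business object call is then one line:
    // var orders = dal.ExecuteQuery("SELECT * FROM Orders WHERE CustomerId = @id",
    //                               new SqlParameter("@id", customerId));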
There is no right answer; it all depends on your project. As Simon points out, if your application is all data-driven, then depending on the size and complexity of the domain it may make sense to use a non-OOP paradigm. I had a lot of success building a system using the Transaction Script pattern, which passed XML messages around the system.
However, this system started to break down after five or six years as the application grew in size and complexity (5 or 6 web apps, several web services, tons of COM+ components, legacy and .NET code, 8+ databases with 800+ tables and 4,000+ procedures). No one knew what anything was, and duplication ran rampant.
There are ways other than OOP to alleviate the maintenance burden; however, if you have a very complex domain, then having a rich domain model is ideal IMHO, as it allows the business rules to be expressed in nicely encapsulated components.
To answer your question: avoid code generators if you can. Code generators are a recipe for disaster, but if you do go with code generation, do not modify the generated code. Also be sure to have a good process in place that makes it easy for developers to get the newly generated code.
I recommend using either an ORM or a hand-rolled lightweight DAL. I am currently transitioning a project from my hand-rolled DAL over to NHibernate and am having a lot of success; however, I like having the option of using either one. Furthermore, if you properly separate your concerns (data access from business layer from presentation) you can have a single service layer that talks to DAOs (Data Access Objects), where the DAO for one object is an ORM and for another is hand-rolled. I like this flexibility, as it lets you apply the best tool to the job.
I like NHibernate over a hand-rolled DAL because, while my DAL does abstract away most of the ADO.NET code, you still have to write the code that maps a data reader to an object, or takes an object and creates the parameters.
I've always preferred to go the code generator route, especially in C# where you can make use of extended classes to add functionality to the basic data objects.
Hate to say this, but it depends. If you find an ORM tool that fits your needs, go for it. We wrote our own system in small steps while developing the application. We use C++, and there are not that many tools out there anyway. Ours ended up being an XML description of the database, from which the SQL generation script, the basic object layer and the metadata were generated.
Do your homework and evaluate these tools, and you will find a good fit for your needs.