I have NHibernate configured with Fluent NHibernate, connecting to a PostgreSQL database.
I have a worker class that takes an ISessionFactory as a constructor parameter and consumes messages from a queue. For each message, the worker calls ISessionFactory.OpenSession() and does some database processing.
When I add more worker processes, the performance of the system stays the same, which is odd.
After some more investigation I realized that all worker processes are using a single database connection. For example, I would add 8 worker processes, but on the database side I could see only one connection.
My understanding is that ISessionFactory.OpenSession() will open a new database connection unless the Connection Pool is full.
So is my understanding wrong, or is this an issue with the Postgres NHibernate driver?
OpenSession does not open a database connection until needed, and it closes it (i.e. releases it back into the pool) as soon as possible.
By default the session will keep the connection open for the lifetime of a transaction, and as Diego said, it only opens it when needed.
If you want to manage your own connections you can call
ISessionFactory.OpenSession(myConnection);
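A rough sketch of what that looks like (the connection string, entity type and id are placeholders; Npgsql is assumed as the ADO.NET provider since the question targets PostgreSQL):

// Sketch: supplying NHibernate with a connection you manage yourself.
// connectionString, Order and orderId are placeholders.
using (var connection = new Npgsql.NpgsqlConnection(connectionString))
{
    connection.Open();

    using (var session = sessionFactory.OpenSession(connection))
    using (var tx = session.BeginTransaction())
    {
        var order = session.Get<Order>(orderId);
        // ... do the per-message processing ...
        tx.Commit();
    }

    // NHibernate does not close a connection it did not open itself,
    // so disposing it here hands it back to the Npgsql connection pool.
}

In most cases, though, letting NHibernate acquire a pooled connection per session/transaction is the simpler option.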
I use WebFlux with R2DBC.
I know that, by default, one thread is created per CPU core.
However, the default size of R2DBC's connection pool is 10.
I can't make sense of that.
DB connections operate asynchronously, so DB connections shouldn't need a large thread pool. Because a DB connection only sends and receives, the thread can do other work (other sends or receives) while it waits for a response.
Rather, since the main threads (one per CPU core) perform various calculations, the number of main threads should be large.
What am I missing?
I think you kind of have it backwards.
since the main threads (one per CPU core) perform various calculations, the number of main threads should be large
A (logical) CPU can perform only one task at a time, and because threads in a reactive application should never just wait (all calls are non-blocking), it doesn't make sense to have more threads than CPUs.
DB connections operate asynchronously, so DB connections shouldn't need a large thread pool.
Correct, you don't need a thread pool for your database connectivity. But a connection pool isn't a thread pool: it holds connections. Database connections are still expensive to create, so you want to reuse them, and database transactions are bound to a connection, so you need multiple connections in order to process different requests in separate transactions.
Of course how big a connection pool should be is a completely different and rather complex question.
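To illustrate the "transactions are bound to a connection" point outside of R2DBC (a C#/ADO.NET sketch, since the other examples in this collection are .NET-based; the connection string is a placeholder), two transactions that must run concurrently each need their own pooled connection:

using System.Data.SqlClient;

// Each transaction lives on exactly one connection, so two concurrent
// transactions require two connections from the pool.
using (var connA = new SqlConnection(connectionString))
using (var connB = new SqlConnection(connectionString))
{
    connA.Open();
    connB.Open();

    using (var txA = connA.BeginTransaction())
    using (var txB = connB.BeginTransaction())
    {
        // request A's statements run on connA/txA, request B's on connB/txB
        txA.Commit();
        txB.Commit();
    }
}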
I am a DBA, not a developer, and could use some insight. The development staff is using VB.NET to create web based applications with connections to a DB2 database. Please assume that the connection string in the web.config file is coded correctly.
I can see the number of orphaned connections from the web servers grow over time. By orphaned I mean that there is no activity associated with the connection for hours, yet I can see other connections being created and destroyed every couple of seconds.
I suspect that the connections are not being properly closed, but there are two different groups looking at the problem and so far they haven't turned up anything. By the end of the day I can have hundreds of these connections, all of which are cleared when the application pool is reset every night. (This means it is NOT a database problem.)
Is there a coding technique to ensure a connection is closed using VB.NET on IIS v7+?
Am I seeing the symptom of another problem on IIS?
You need to have the developers implement the Dispose pattern, which is facilitated by the Using statement in VB.NET, like this (this is pertinent to SQL Server, but whatever connection object you are using for DB2 should work as well):
Using theConnection As New SqlConnection(connectionString) ' connectionString read from web.config
    theConnection.Open()
    ' Command, parameter logic goes here
End Using
By wrapping the connection object inside a Using block, it is guaranteed that the connection is closed and properly disposed, even if the code inside the Using block throws an exception.
Sounds like a code-review is in order for whoever is in charge of the developers.
A long-running application uses the MS C ODBC API to create and use SQL connections to an Oracle DB. The application was originally designed to establish an ODBC connection at startup and keep that connection for as long as the application runs, potentially for weeks or months.
We're seeing very infrequent cases where a connection suddenly dies and I wondered if that's because we're using them wrong, or if it's considered OK to hold a connection like this. Can anyone point me to some definitive information on the subject?
I'm not sure there's definitive information regarding this, but with long-running programs you always have to be prepared for this kind of incident; it just happens (and not only with DB connections, but also with sockets that remain open for extended periods). I have no experience with Oracle, but I have a very similar setup with Informix, and this is (in pseudocode) what we do:
while (program_is_supposed_to_run) {
    open_db();
    do {
        your_activities();
    } while (db_is_ok);
    close_db_and_cleanup();
}
As long as you are able to correctly detect that the connection died and can resume processing without losing data, you should be OK.
I'm not familiar with ODBC, but such use cases are best handled with a connection pool. Your application can simply request a connection from the pool when it has some work and release it as soon as it's done; the pool will take care of actually (re)connecting to the database.
A quick search for ODBC connection pooling brought this up: Driver Manager Connection Pooling
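For what it's worth, if any part of the system is reachable from managed code, the "open late, release early" pattern that lets the Driver Manager pool do the reuse looks roughly like this (a sketch using System.Data.Odbc; the DSN and query are placeholders):

using System.Data.Odbc;

// Sketch: borrow a connection per unit of work and give it straight back.
// "MyOracleDsn" is a placeholder DSN; the pooling itself is handled by the
// ODBC Driver Manager as described in the link above.
void DoUnitOfWork()
{
    using (var conn = new OdbcConnection("DSN=MyOracleDsn"))
    {
        conn.Open();   // reuses a pooled connection when one is available

        using (var cmd = new OdbcCommand("SELECT SYSDATE FROM DUAL", conn))
        {
            cmd.ExecuteScalar();
        }
    }                  // Dispose releases the connection back to the pool
}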
I am considering using Fluent NHibernate for a new application with SQL Server 2008 and I am having trouble understanding the connection handling behavior I am seeing.
I am monitoring connections using sp_who2 and here is what I see:
When the SessionFactory is created, a single connection is opened. This connection seems to stay open until the app is killed.
No connection is opened when a new session is opened. (That's ok, I understand NHibernate waits until the last moment to create database connections).
No new connection is opened even when I run a query through NHibernate. I must assume it is using the connection created when the SessionFactory was created, which is still open. I set a breakpoint after the query (before the session was closed), and no new connections had appeared in sp_who2.
Running an entire app through a single connection is not acceptable (obviously). How can I ensure each ISession gets its own connection? I'm sure there's something obvious I'm missing here...
Thanks in advance.
The behaviour that you see is nothing NHibernate-specific: connection pooling is the default behaviour of ADO.NET when talking to SQL Server.
Even if it may seem odd at first glance, it actually is a good thing, because creating new connections and maintaining them is expensive.
(for more information, see the Wikipedia article about connection pooling)
So there is no need to try to get NH to open a new connection for each session, because reusing existing connections actually improves SQL Server performance.
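If you do want to convince yourself that sessions are not funnelled through one physical connection, the pooling knobs live on the ADO.NET connection string (the one you pass to your Fluent NHibernate configuration), not in NHibernate itself. A rough sketch, with placeholder server/database names and the standard SqlClient pooling keywords:

// Pooling is configured on the connection string handed to Fluent NHibernate;
// "Pooling", "Min Pool Size" and "Max Pool Size" are standard SqlClient keywords.
var connectionString =
    "Server=.;Database=MyDb;Integrated Security=SSPI;" +
    "Pooling=true;Min Pool Size=0;Max Pool Size=100";

// Two sessions doing work at the same time each draw their own pooled
// connection, so sp_who2 shows two SPIDs while both transactions are open.
using (var session1 = sessionFactory.OpenSession())
using (var tx1 = session1.BeginTransaction())
using (var session2 = sessionFactory.OpenSession())
using (var tx2 = session2.BeginTransaction())
{
    // ... run queries on session1 and session2 ...
    tx2.Commit();
    tx1.Commit();
}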
On different sites it is written that the DataSource maintains the connection pool. I agree that it maintains the pool if we don't close the connection once we are done with the transaction. But generally we do close the connection once we are done with the transaction, so in practice the DataSource would never have any connections left to maintain in the pool. Is this correct?
Would that scenario only arise when a programmer forgets to close a connection?
The connection pool maintains open connections for a certain period of time and usually closes them after some time of inactivity. Forgetting to close a connection prevents it from being released back to the pool and reused later, which is why you should always close them. What is the question, anyway?
Well, if this is MS .NET, then as long as the connection string is identical, ADO.NET will pool the connections automatically. This happens even if you 'close' them, so it is very efficient.
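In other words, a loop like the following (an ADO.NET sketch; the connection string is a placeholder) opens a physical connection once and then keeps handing the same pooled connection back out on every Open():

using System.Data.SqlClient;

// Sketch: "closing" a pooled connection just returns it to the pool.
// The next Open() with an identical connection string reuses it rather
// than establishing a brand-new connection to the server.
for (var i = 0; i < 100; i++)
{
    using (var conn = new SqlConnection("Server=.;Database=MyDb;Integrated Security=SSPI"))
    {
        conn.Open();    // served from the pool after the first iteration
        // ... execute commands ...
    }                   // Dispose returns the connection to the pool
}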