When working with NHibernate and SQL Compact in a Windows Forms application, I am wondering what the best practice is for managing connections. With SQL CE I have read that you should keep your connection open, versus closing it as one would typically do with standard SQL Server. If that is the case, and you're using an IoC container, would you make your repositories singletons so they exist forever, or dispose of them after you perform a "Unit of Work"?
Also, is there a way to determine the number of open connections to SQL CE?
In my DAL or DataService, which will have a lifetime of the entire app, I'd create and hold open a connection to the database and then let the ORM do whatever it wants for its own connection management. I would only do this in a Compact Framework app, though, where the speed of building up and tearing down the connection for each query might make a difference.
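Something like the following is what I have in mind; a rough sketch assuming System.Data.SqlServerCe, where the DataService class and its shape are hypothetical:
Imports System.Data.SqlServerCe

Public Class DataService
    Implements IDisposable

    ' Held open for the lifetime of the app to avoid the cost of
    ' repeatedly building up and tearing down the connection.
    Private ReadOnly _connection As SqlCeConnection

    Public Sub New(ByVal connectionString As String)
        _connection = New SqlCeConnection(connectionString)
        _connection.Open()
    End Sub

    Public ReadOnly Property Connection() As SqlCeConnection
        Get
            Return _connection
        End Get
    End Property

    ' Disposed once, when the application shuts down.
    Public Sub Dispose() Implements IDisposable.Dispose
        _connection.Close()
    End Sub
End Class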
Related
I have been under the understanding that database connections are best opened, used, and closed. However, with SQLite I'm not sure that this applies. I do all the queries with a Using Connection statement, so it is my understanding that I open a connection and then close it each time. When it comes to SQLite and optimal usage, is it better to open one permanent connection for the duration of the program being in use, or do I continue with the method that I currently use?
I am using the database for a VB.NET Windows program with a fairly large DB of about 2 GB.
An example of my current connection method:
Using oMainQueryR As New SQLite.SQLiteCommand
    oMainQueryR.CommandText = "SELECT * FROM CRD"
    Using connection As New SQLite.SQLiteConnection(connectionString)
        Using oDataSQL As New SQLite.SQLiteDataAdapter
            oMainQueryR.Connection = connection
            oDataSQL.SelectCommand = oMainQueryR
            connection.Open()
            oDataSQL.FillSchema(crd, SchemaType.Source)
            oDataSQL.Fill(crd)
            connection.Close()
        End Using
    End Using
End Using
As with all things database, it depends. In this specific case of SQLite, there are two "depends" you need to look at:
Are you the only user of the database?
When are implicit transactions committed?
For the first item, you probably want to open and close separate connections frequently if there are other users of the database, or if it's at all possible that more than one process will be hitting your SQLite database file at the same time.
For the second item, I'm not sure how SQLite specifically behaves. Some database engines don't commit implicit transactions until the connection is closed. If that is the case for SQLite, you probably want to close your connection a little more often.
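If you want to take the guesswork out of that second point, you can commit explicitly instead of relying on implicit behavior. A minimal sketch using the same System.Data.SQLite classes as the question (the table and UPDATE statement are made up):
Using connection As New SQLite.SQLiteConnection(connectionString)
    connection.Open()
    ' An explicit transaction removes any doubt about when work is committed.
    Using transaction As SQLite.SQLiteTransaction = connection.BeginTransaction()
        Using command As New SQLite.SQLiteCommand("UPDATE CRD SET Flag = 1", connection, transaction)
            command.ExecuteNonQuery()
        End Using
        transaction.Commit()
    End Using
End Using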
The idea that connections should be short-lived in .NET applies mainly to Microsoft SQL Server, because the .NET provider for SQL Server is able to take advantage of a feature known as connection pooling. Outside of SQL Server this advice is not entirely without merit, but it's not as much of a given.
If it is a local application being used by only one user, I think it is fine to keep one connection open for the life of the application.
I think with most databases the "opened, used, and closed" idea comes from the perspective of saving memory by ensuring you only keep the minimum number of connections open.
In reality, opening a connection can carry a fair amount of overhead, so it should not be done more often than needed. This is why managed server infrastructure (WebLogic etc.) promotes the use of connection pooling. That way you have N connections available at any given time: you never "waste" resources, but you also aren't left with the responsibility of managing them at a global level.
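In ADO.NET the same idea is configured right in the connection string; a hedged example (the "Min Pool Size"/"Max Pool Size" keywords are standard SqlClient ones, the values here are arbitrary):
' Illustrative SqlClient connection string; pooling is on by default.
Dim connectionString As String = _
    "Data Source=.;Initial Catalog=Shop;Integrated Security=True;" & _
    "Min Pool Size=2;Max Pool Size=50"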
Related: Multiple Queries on single connection
In an enterprise application, is having a single connection execute different queries better than disposing of the connection after every execution? By the way, this is .NET 2.0 and SQL Server 2005 I am talking about, but the question applies to all database systems.
It is generally better to just open and dispose connections when you need them. Let connection pooling do its job.
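In practice that just means wrapping each operation in a Using block rather than trying to share one long-lived connection yourself. A minimal sketch (the Orders table and connectionString are hypothetical); with pooling on, which is the default, Open is cheap because it is usually served from the pool, and Dispose simply returns the connection to it:
Using connection As New System.Data.SqlClient.SqlConnection(connectionString)
    Using command As New System.Data.SqlClient.SqlCommand("SELECT COUNT(*) FROM Orders", connection)
        connection.Open()                              ' usually served from the pool
        Dim total As Integer = CInt(command.ExecuteScalar())
    End Using
End Using                                              ' returns the connection to the pool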
I'm interested to find out what techniques developers are using to connect to a SQL Azure instance running in the cloud.
From what I understand it is very similar to SQL Server, with two of the key differences being that Multiple Active Result Sets (MARS) are not supported and idle/long-running connections are automatically terminated by Azure. For this, Microsoft suggests incorporating retry logic in your application to detect a closed connection and then attempt to complete the interrupted action. Does anyone have example code that they are currently using for this?
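The rough shape of what I have in mind is below; the retry count and fixed back-off are naive placeholders, and azureConnectionString is assumed, so I'd be interested in what people actually run in production:
Const maxRetries As Integer = 3
Dim attempt As Integer = 0
Do
    Try
        Using connection As New System.Data.SqlClient.SqlConnection(azureConnectionString)
            connection.Open()
            ' ... perform the interrupted action here ...
        End Using
        Exit Do                               ' success
    Catch ex As System.Data.SqlClient.SqlException
        attempt += 1
        If attempt >= maxRetries Then Throw   ' give up after the last attempt
        System.Threading.Thread.Sleep(1000)   ' crude fixed delay before retrying
    End Try
Loop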
To build out the data layer I was looking at various ORMs. Since I'm going to be accessing SQL Azure from Windows Azure (i.e. separate boxes), it would seem key that any ORM support asynchronous methods so as not to block any Windows Azure instances.
Any suggestions as to which ORM to use, or comments on what you are currently using?
I have successfully used NHibernate with Azure, and we are in the process of building a commercial app on top of NHibernate. The only problem I had was with the connection pools when running locally and connecting to SQL Azure in the cloud, which was fixed by turning connection pooling off.
You may find similar problems with other ORMs... SQL Azure is less patient (for obvious reasons) than most people are used to. Connections time out more quickly, are recycled sooner, and so on.
Test first!
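For reference, turning pooling off is just a connection string change; a sketch (the server, database and credentials are placeholders, "Pooling=false" is the standard SqlClient keyword):
Dim connectionString As String = _
    "Server=tcp:myserver.database.windows.net;Database=MyDb;" & _
    "User ID=user@myserver;Password=...;Encrypt=True;Pooling=false"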
Here's one specifically designed for Azure:
"Telerik recently announced the
availability of Open Access, the first
ORM that works seamlessly with SQL
Azure relational databases in the
Windows Azure cloud."
And a few commenters at the Azure User Group recommend LLBLGen and Entity Framework.
I've been using Entity Framework - runs without any problems, just a different connection string.
What you do have to think about is your connection strategy and how efficient your queries are. I've got a method that's easy to write in EF: I've got a new record that could be duplicated, so I check if it's there and, if not, add it.
EF makes it really easy to do this, as if you're just accessing a local collection. BUT... if you're paying for your DB access because it's in Azure and not on your local network, hmm, maybe there's a better (aka cheaper) way of doing that.
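To make that concrete, the check-then-add looks roughly like this in VB.NET; the MyEntities context, Customers set and newCustomer variable are made-up names, and this is EF 4 ObjectContext style:
Using ctx As New MyEntities()
    ' The existence check is a separate round trip you pay for in Azure.
    If Not ctx.Customers.Any(Function(c) c.Email = newCustomer.Email) Then
        ctx.Customers.AddObject(newCustomer)
        ctx.SaveChanges()
    End If
End Using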
According to Ayende, NHibernate "just works" with SQL Azure.
We have been using NHibernate without any customization on Azure (indeed, it just works); you can check Lokad.Translate as an open-source example of such use.
My database must be updated at arbitrary intervals, manually, with new info to put in standard tables that don't require structural modifications. One app will update the database.
Another app, the one I will distribute (to friends and family only, but doesn't require any security feature to it, like encrypting the database) will read the database and use its data on labels, listviews, etc.
The problem is, I'm the perfect definition of a full-fledged n00b at programming of any sort, and I still don't know what database to use.
I thought that I should use a SQL CE (*.sdf) file and store that database on an FTP server. Then I could download it and get data from it every time the "client" app runs and a certain button ("connect") is clicked.
After some hard-core googling, I found out how to connect to the sdf thing, using this connection string:
Provider=Microsoft.SQLSERVER.CE.OLEDB.3.5;Data Source=D:\Documents and Settings\Camilo\JCTM.sdf
So it connects, or at least didn't show any error.
I don't know if it's a good idea to use SQL CE (.sdf) files as databases; if it's too hard, maybe I should go for XML? What do you guys suggest? What is the easiest way to implement very simple databases in VB.NET?
By simple databases I mean:
- no search needed
- no advanced features except storing strings on tables with columns and rows
- easy to access, read, edit, etc. by different VB.NET apps
Is sdf a good idea?
I would recommend SQL Server Express. It's free and can be redistributed with .NET applications as part of the install process.
The challenge will be syncing the changes between the different clients. If you have access to an FTP server, you may have the ability to host a website in IIS. If you can do that, you can just use web services and read against one central database instead of copying one locally.
Luckily for you, you can abstract away the need to be concerned with which back-end database you use to store your data.
Technologies such as ODBC (Open Database Connectivity) and OLEDB (Object Linking and Embedding, Database) allow you to limit your concern for the backend datastore to the task of crafting your connection string. In your example, the connection string says, "I'm going to connect to a SQL Server CE database via its OLEDB provider, and it's physically located over on D:/...
Within the code, you use standard OLEDB mechanisms to access and manage the database. Because of this abstraction, you can use OLEDB providers for SQL Server, Oracle, XML, Access or comma delimited text files as your backing store if you wish, and the only change you need to make to your code is the connection string. Your choice then should be to pick the database that you have the tools and know-how to set up and manage initially.
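Here is a sketch of that abstraction in VB.NET using System.Data.OleDb; the Access connection string and table name are illustrative, and to target a different backing store you would change only the string:
Imports System.Data.OleDb

' Swap this string for another provider and the rest of the code stays the same.
Dim connectionString As String = _
    "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\data\MyDb.mdb"

Using connection As New OleDbConnection(connectionString)
    Using command As New OleDbCommand("SELECT * FROM MyTable", connection)
        connection.Open()
        Using reader As OleDbDataReader = command.ExecuteReader()
            While reader.Read()
                Console.WriteLine(reader(0))
            End While
        End Using
    End Using
End Using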
I'd start with Microsoft Access because it has its own UI, and can play well with .NET.
You can also try the ADO.NET provider for SQLite, which I've also found very useful.
I've written a small (8-10 laptops) point-of-sale system running over a wireless network, as an HTA that reads from/writes to an Access MDB located on a network share.
I need to use ADO - GetString and the user roster are not available with DAO.
I also need to use DAO - the MDB cannot be compacted with ADO.
I know that:
1) If the database backend is not an Access MDB, I should use ADO.
2) If the backend is an MDB, but I want to upgrade to SQL Server at some point, I should use ADO.
3) Within an Access application, or any other VBA/VB application, I should use DAO, as ADO must go through a translation layer of the Jet OLE DB Provider, while DAO is more direct.
4) VBScript/JScript allows me to use either DAO or ADO.
The two-part question is as follows:
1) In this software environment (HTA/scripting), is it better to use ADO rather than DAO?
2) Does ADO offer any benefits because the HTA is reading/writing over a wireless network?
If the only reason you need DAO is to compact the database, you can use DAO for that, and use ADO for everything else. You are not limited to using only ADO or DAO.
The biggest benefit of using ADO is that it will be easier to move to SQL Server Express when the time comes. You should do that sooner rather than later, as SQL Server Express offers all of the benefits of MSAccess databases without the drawbacks. SQL Server Express is free, and it will easily handle the system size you are proposing.
Access databases corrupt easily in a multi-user environment, especially when a wireless network is involved. If you are worried about losing the benefits of working in MSAccess, you can still attach to SQL Server using linked tables, and work with your SQL Server Express database that way.
You can also use JRO to compact your MDB file. This will be included with any recent version of MDAC, installed by default on XP and later systems. No installation of Access is necessary.
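For what it's worth, the JRO call can be driven late-bound without Access installed; a sketch (the paths are placeholders, and late binding in VB.NET requires Option Strict Off):
' JetEngine.CompactDatabase is the documented JRO method; it writes a
' compacted copy to the destination path, which must not already exist.
Dim jro As Object = CreateObject("JRO.JetEngine")
jro.CompactDatabase( _
    "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\data\old.mdb", _
    "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\data\compacted.mdb")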
To answer your specific questions:
1) I would opt for ADO just because it is more current, and the same API can be used in other kinds of scripting, like LDAP/Active Directory access, reading file system folders, reading MAPI mail, and working with other types of semi-structured text files like fixed-width text and CSV. It's not specifically better for the HTA programming environment, but it's perhaps better for you to learn a more widely applicable API. I also think it's an easier API to work with, but I started with it and only later worked on some older DAO projects.
2) One possible benefit that ADO provides is that of disconnected recordsets, which may have an advantage or at least suggest some architectural alternatives in your wireless network setup. You open a recordset, then disconnect it, so you can still work with the data in memory, but not have to leave a database connection open. Then at a later time you can reconnect and update the database. Also, you can work in a fully disconnected style by managing tables as local XML or ADTG files.
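A sketch of the disconnected pattern, late-bound against ADO; the numeric arguments are ADO's adUseClient, adOpenStatic and adLockBatchOptimistic constants, and the share path and SQL are placeholders:
' Late-bound ADO (Option Strict Off); the same calls work from VBScript in an HTA.
Dim conn As Object = CreateObject("ADODB.Connection")
Dim rs As Object = CreateObject("ADODB.Recordset")

conn.Open("Provider=Microsoft.Jet.OLEDB.4.0;Data Source=\\server\share\pos.mdb")
rs.CursorLocation = 3                        ' adUseClient
rs.Open("SELECT * FROM Sales", conn, 3, 4)   ' adOpenStatic, adLockBatchOptimistic
rs.ActiveConnection = Nothing                ' disconnect; the data stays in memory
conn.Close()

' ... work with rs offline: read rows, edit fields ...

conn.Open("Provider=Microsoft.Jet.OLEDB.4.0;Data Source=\\server\share\pos.mdb")
rs.ActiveConnection = conn                   ' reconnect
rs.UpdateBatch()                             ' push the batched changes back
conn.Close()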
You might shoehorn DAO into working from VBScript but that's an odd pairing. ADO makes the most sense in general.
Another advantage of ADO would be that it supports RDS over DCOM or HTTP. This can be used to overcome many of the limitations of Jet MDBs used via file sharing, such as the corruption that unreliable networks and clients can lead to. It also cuts the amount of traffic over the network, improving performance. In addition, it offers a middle tier in which business objects can "live", and all of this can be mediated using COM+ where applicable.
Of course you no longer have the option of using a simple P2P network and a file share to host the database. RDS needs a server to host the process and run the Jet engine, which no longer needs to run on each client system. This means you can use Jet stored procedures that run on the server, offloading more client processing and network traffic. While not as sophisticated as T-SQL or other alternatives, this ADO/Jet 4.0 OLE DB Provider exclusive technology offers tangible benefits that can't be had using DAO.
RDS can mask much of the process of using disconnected Recordsets, simplifying client code. It uses ADTG under the hood, which was developed and optimized for this very purpose.
However using RDS requires more infrastructure and expertise than a simple file share. You might as well look into a low-end version of SQL Server.
In general I'd recommend using the jetcomp.exe utility to compact and repair, over either DAO or JRO. It offers a number of advantages.
If you can afford to use an HTA application with Access instead of a more performant platform, I would say you should go with the easiest API for you. The bottleneck will never be the abstraction of DB access in your case. Still, DAO is really old.