How to close connection created with WebMatrix.Data.Database.Open? - vb.net

I'm using the Database class from WebMatrix in a Windows service. (Debugging it in a WinForms app produces the same error.)
The error I receive (after running for a while) is:
Timeout expired. The timeout period elapsed prior to obtaining a connection from the pool. This may have occurred because all pooled connections were in use and max pool size was reached.
This should be solved by wrapping connections in Using statements so they are always closed (as described in this good answer). But I've recently also started using the WebMatrix Database class, and it does not seem to close the connections I open with it.
Question is: How to close connections created with the Database class from WebMatrix?
Using this code:
Public Shared Function GetProperty(pid As Integer, propertyName As String) As String
    Dim db = Database.Open("SSEConnectionString")
    Dim item = db.QuerySingle("select PropertyValue from eConfiguration where PID=#0 and PropertyName=#1", pid, propertyName)
    Dim retVal = item.PropertyValue
    db.Connection.Close()
    db.Close()
    Return retVal
End Function
Each time I run this code I get a new entry in my sys.sysprocesses table (and from what I've figured out, that indicates a new connection is being created and kept open).
My connectionstring looks like this:
<add name="SSEConnectionString"
     connectionString="Data Source=123.45.67.890;Initial Catalog=SSE;User ID=+++;Password=+++;Connect Timeout=5;Application Name=SSE_Service"
     providerName="System.Data.SqlClient" />
Any idea what I'm doing wrong?
Edit: I've asked a second question regarding the sysprocesses table; I'm not sure whether it helps with debugging.
Thanks for any help
Edit 2: Arghhhhh... I just found what was causing the error, and it's entirely my own fault. I forgot to close a connection. :-(
Larsi

Use:
db.Connection.Close()
db.Close()
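Alternatively (a hedged sketch, not from the original answer): since WebMatrix.Data.Database implements IDisposable, a Using block will dispose the Database object, and with it the underlying connection, even if the query throws. A minimal VB.NET version of the question's function written that way, with the same connection-string name and query:

Public Shared Function GetProperty(pid As Integer, propertyName As String) As String
    ' Using guarantees Dispose(), which closes the underlying connection
    ' and returns it to the pool, even if QuerySingle throws.
    Using db As Database = Database.Open("SSEConnectionString")
        Dim item = db.QuerySingle("select PropertyValue from eConfiguration where PID=#0 and PropertyName=#1", pid, propertyName)
        Return item.PropertyValue
    End Using
End Function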

Related

EFUtilities with EFProfiler running

I was wondering if there is a way to get EFUtilities running at the same time as EFProfiler.
I appreciate that the profiler would not show the bulk insert, since it is done outside the confines of the DbContext. At the moment I cannot run batch jobs, as the profiler has the connection wrapped. It runs fine when the profiler is not enabled.
The exception I am getting is:
A first chance exception of type 'System.InvalidOperationException' occurred in EntityFramework.Utilities.dll
Additional information: No provider supporting the InsertAll operation for this datasource was found
The inner exception is null.
This is because EFUtilities automatically finds the correct provider. But when the connection is wrapped this is no longer possible.
InsertAll looks like this.
public void InsertAll<TEntity>(IEnumerable<TEntity> items, DbConnection connection = null, int? batchSize = null)
To use the SqlProvider (which is actually the only provider out of the box) you can create a new SqlConnection() and pass that to InsertAll.
So basically you would need to do this:
using (var db = new YourContext())
using (var con = new SqlConnection(YourConnectionString))
{
    EFBatchOperation.For(db, db.PartialTestClass1).InsertAll(partials, con);
}
Now, maybe you are doing more and want both parts to run under the same transaction. In that case you can wrap that code block in a TransactionScope.
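A hedged VB.NET sketch of that idea, using the same placeholder names (YourContext, YourConnectionString, partials) as the C# snippet above; whether the transaction gets promoted to MSDTC depends on your setup:

Imports System.Data.SqlClient
Imports System.Transactions

' Sketch only: wrap both the context's own work and the bulk insert in one
' TransactionScope so they commit (or roll back) together.
Using ts As New TransactionScope()
    Using db As New YourContext(),
          con As New SqlConnection(YourConnectionString)
        EFBatchOperation.For(db, db.PartialTestClass1).InsertAll(partials, con)
        ' ... any other changes made through db ...
        db.SaveChanges()
    End Using
    ts.Complete()   ' both parts commit here
End Using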

Control the timeout for locking Exclusive SQLite3 database

I have a SQLite database that I want to lock for synchronization purposes. I don't want a process running asynchronously on a different box to process data that has been added from another box until that box has finished its updates. DataAccess is a class that connects to sPackageFileName and reuses the same connection as long as sPackageFileName stays the same, or until the .Close method is called. So basically DataAccess.ExecCommand executes a command.
Searching Google, I found this:
DataAccess.ExecCommand("PRAGMA locking_mode = EXCLUSIVE", sPackageFileName)
DataAccess.ExecCommand("BEGIN EXCLUSIVE", sPackageFileName)
DataAccess.ExecCommand("COMMIT", sPackageFileName)
This works as advertised. If I run this on box A and then on box B, I get a "database locked" exception. The problem is how long it takes. I found PRAGMA busy_timeout, but that timeout controls access locks, not database locks. I am starting to think there is no PRAGMA for a database-lock timeout. Right now it takes about 3-4 minutes. One other note: sPackageFileName does not live on either box; box A and box B both connect to it over a shared drive.
Also I am using the VB.NET wrapper for the SQLite dll.
CL got me on the right track. It was the timeout of the .NET command. Here is the code that sets it up in my class:
Dim con As DbConnection = OpenDb(DatabaseName, StoreNumber, ShareExclusive, ExtType)
Dim cmd As DbCommand = con.CreateCommand()
If _QueryTimeOut > -1 Then cmd.CommandTimeout = _QueryTimeOut
Don't get hung up on the variables; the purpose of posting the code is to show the property I was talking about. The default _QueryTimeOut was set to 300 (seconds). I set cmd.CommandTimeout to 1 (second) and it returned as expected.
As CL finally got through to me, the timeout was happening someplace else. Sometimes it takes a kick to get you out of the box. :-)
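Putting the pieces together, a hedged sketch of the fail-fast pattern, assuming the System.Data.SQLite ADO.NET provider rather than the poster's DataAccess wrapper (sPackageFileName is the question's variable):

Imports System.Data.SQLite

' Sketch only: take the exclusive lock but fail fast instead of waiting
' out a long CommandTimeout (the poster's default was 300 seconds).
Using con As New SQLiteConnection("Data Source=" & sPackageFileName)
    con.Open()
    Using cmd = con.CreateCommand()
        cmd.CommandTimeout = 1                       ' seconds
        cmd.CommandText = "PRAGMA locking_mode = EXCLUSIVE"
        cmd.ExecuteNonQuery()
        Try
            cmd.CommandText = "BEGIN EXCLUSIVE"      ' throws quickly if another box holds the lock
            cmd.ExecuteNonQuery()
            ' ... do the synchronized work here ...
            cmd.CommandText = "COMMIT"
            cmd.ExecuteNonQuery()
        Catch ex As SQLiteException
            ' database locked elsewhere: back off and retry later
        End Try
    End Using
End Using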

Hangs with LINQ-SQL Server and TransactionScope

I'm encountering a hang when the program tries to access the fruit database. I've already enabled MSDTC network access on both my development computer and the SQL Server machine.
Code:
(pardon the code coloring...SO's misinterpreting my VB .NET)
Using ts As New TransactionScope
    Dim fruit As New FruitDataContext
    Dim thingies As New ThingiesDataContext
    If (From f In fruit.tblApples Where f.Rotten = "Yes" AndAlso f.batch = 1).Count >= 1 Then
        'Record today's date as the day that the rotten apples were dumped.
    End If
    'Other complicated code that uses ThingiesDataContext and FruitDataContext
    du.SubmitChanges()
    ts.Complete()
End Using
Edit:
I've dug around a bit more, and it turns out that the problem lies in the LINQ query. When I try to view it with the LINQ to SQL Visualizer, I get the following error:
System.InvalidCastException: Specified cast is not valid.
at LinqToSqlQueryVisualizer.SqlQueryInfo.deserialize(Stream stream)
at LinqToSqlQueryVisualizer.Visualizer.Display(IDialogVisualizerService windowService, Stream rawStream)
at LinqToSqlQueryVisualizer.DialogChooser.Show(IDialogVisualizerService windowService, IVisualizerObjectProvider objectProvider)
at Microsoft.VisualStudio.DebuggerVisualizers.DebugViewerShim.ManagedShim.DelegatedHost.CreateViewer(IntPtr hwnd, HostServicesHelper hsh, SafeProxyWrapper proxy)
I've also edited the LINQ statement to be closer to my real code.
Final edit:
I tried using a normal SqlConnection instead of a "thingies as New ThingiesDataContext" and the problem still occurs.
It appears that TransactionScope cannot handle multiple SQL connections inside the same transaction.
Official Microsoft note (from MSDN: http://msdn.microsoft.com/en-us/library/bb896149.aspx):
"Parallel transactions are not supported by SQL Server."
This is not an MSDTC issue. If it were, you would get an error saying DTC is not enabled and needs to be. It's also not a deadlock issue, because you would get a specific error about that as well.
If I had to guess, I would say that the 'Other complicated code...' is attempting to perform a database operation and is being blocked by one or the other database context objects.
One way you can determine this is to run SQL Profiler to see what SQL statements are actually being executed on the server, and check for blocks.
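If both contexts really do point at the same database, one way to avoid that kind of cross-session blocking is to hand both DataContexts the same open connection, so everything inside the scope runs on a single SQL session. A hedged sketch, not from the original answer; fruitConnectionString is a placeholder:

Imports System.Data.SqlClient
Imports System.Transactions

' Sketch only: one shared SqlConnection means one SQL session, so the two
' contexts cannot block each other and the transaction is not promoted
' to a distributed (MSDTC) transaction.
Using ts As New TransactionScope()
    Using con As New SqlConnection(fruitConnectionString)   ' placeholder connection string
        con.Open()
        Using fruit As New FruitDataContext(con),
              thingies As New ThingiesDataContext(con)
            ' ... the query and the 'other complicated code' from the question ...
            fruit.SubmitChanges()
            thingies.SubmitChanges()
        End Using
    End Using
    ts.Complete()
End Using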

SELECT through oledbcommand in vb.net not picking up recent changes

I'm using the following code to work out the next unique order number in an Access database. serverDB is a System.Data.OleDb.OleDbConnection.
Dim command As New OleDb.OleDbCommand("", serverDB)
command.CommandText = "SELECT max (ORDERNO) FROM WORKORDR"
iOrder = command.ExecuteScalar()
NewOrderNo = (iOrder + 1)
If I subsequently create a WORKORDR (using a different DB connection), the code will not pick up the new "next order number."
e.g.
iFoo = NewOrderNo
CreateNewWorkOrderWithNumber(iFoo)
iFoo2 = NewOrderNo
will return the same value to both iFoo and iFoo2.
If I Close and then reopen serverDB, as part of the "NewOrderNo" function, then it works. iFoo and iFoo2 will be correct.
Is there any way to force a System.Data.OleDb.OleDbConnection to refresh the database in this situation without closing and reopening the connection?
e.g. Is there anything equivalent to serverdb.refresh or serverdb.FlushCache
How I create the order.
I wondered if this could be caused by not updating my transactions after creating the order. I'm using an XSD for the order creation, and the code I use to create the record is ...
Sub CreateNewWorkOrderWithNumber(ByVal iNewOrder As Integer)
    Dim OrderDS As New CNC
    Dim OrderAdapter As New CNCTableAdapters.WORKORDRTableAdapter
    Dim NewWorkOrder As CNC.WORKORDRRow = OrderDS.WORKORDR.NewWORKORDRRow
    NewWorkOrder.ORDERNO = iNewOrder
    NewWorkOrder.name = "lots of fields filled in here."
    OrderDS.WORKORDR.AddWORKORDRRow(NewWorkOrder)
    OrderAdapter.Update(NewWorkOrder)
    OrderDS.AcceptChanges()
End Sub
From MSDN
Microsoft Jet has a read-cache that is updated every PageTimeout milliseconds (default is 5000ms = 5 seconds). It also has a lazy-write mechanism that operates on a separate thread to main processing and thus writes changes to disk asynchronously. These two mechanisms help boost performance, but in certain situations that require high concurrency, they may create problems.
If you possibly can, just use one connection.
Back in VB6 you could force the connection to refresh itself using ADO. I don't know whether it's possible with VB.NET. My Google-fu seems to be weak today.
You can change the PageTimeout value in the registry, but that will affect all programs on the computer that use the Jet engine (i.e. programmatic use of Access databases).
I always throw away a Connection object after I've used it. Thanks to connection pooling, getting a new connection is cheap.
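A hedged sketch of that approach for the order-number read, keeping the table and column names from the question (serverConnectionString is a placeholder): a short-lived connection per call means a stale Jet read cache on a long-lived connection can never serve the old MAX(ORDERNO), while pooling keeps the repeated opens cheap.

Imports System.Data.OleDb

' Sketch only: throw-away connection per read; pooling makes this cheap.
Function NewOrderNo() As Integer
    Using con As New OleDbConnection(serverConnectionString)   ' placeholder connection string
        Using command As New OleDbCommand("SELECT MAX(ORDERNO) FROM WORKORDR", con)
            con.Open()
            ' Assumes WORKORDR already contains at least one row.
            Return CInt(command.ExecuteScalar()) + 1
        End Using
    End Using
End Function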

Transaction timeout expired while using Linq2Sql DataContext.SubmitChanges()

Please help me resolve this problem:
There is an ambient MSMQ transaction, and I'm trying to use a new transaction for logging, but I get the following error when I attempt to submit changes: "Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding." Here is the code:
public static void SaveTransaction(InfoToLog info)
{
    using (TransactionScope scope =
        new TransactionScope(TransactionScopeOption.RequiresNew))
    {
        using (TransactionLogDataContext transactionDC =
            new TransactionLogDataContext())
        {
            transactionDC.MyInfo.InsertOnSubmit(info);
            transactionDC.SubmitChanges();
        }
        scope.Complete();
    }
}
Please help me.
Thx.
You could consider increasing the timeout or eliminating it altogether.
Something like:
using (TransactionLogDataContext transactionDC = new TransactionLogDataContext())
{
    transactionDC.CommandTimeout = 0; // No timeout.
}
Be careful with removing the timeout entirely, though.
You said:
"Thank you, but this solution raises a new question: if the transaction scope was changed, why does the submit operation become so time-consuming? The database and application are on the same machine."
That is because you are creating a new DataContext right there:
TransactionLogDataContext transactionDC = new TransactionLogDataContext()
With a new data context, ADO.NET opens a new connection (even if the connection strings are the same, unless you do some clever connection pooling).
Within a transaction context, when you try to work with more than one connection instance (which you just did), ADO.NET automatically promotes the transaction to a distributed transaction and tries to enlist it in MSDTC. Enlisting the very first transaction per connection into MSDTC takes time (for me it takes 30+ seconds); subsequent transactions are fast, however (in my case 60 ms). Take a look at http://support.microsoft.com/Default.aspx?id=922430
What you can do is reuse the transaction and connection (if possible) when you create the new DataContext:
TransactionLogDataContext tempDataContext =
    new TransactionLogDataContext(ExistingDataContext.Transaction.Connection);
tempDataContext.Transaction = ExistingDataContext.Transaction;
Where ExistingDataContext is the one which started ambient transaction.
Or attempt to speed up your MSDTC.
Also, do use the SQL Profiler suggested by billb and compare the SessionId across the different commands (save and savelog in your case). If the SessionId changes, you are in fact using two different connections, and in that case you will have to reuse the transaction (if you don't want it to be promoted to MSDTC).