I don't know if this is possible, but I figured I'd ask.
Is there an equivalent to On Error GoTo Next for ADODB connections in Excel VBA?
I have a stored procedure I'm calling using an ADODB.Command object. The problem is, if any one statement in that stored procedure throws an error, the entire process gets shut down. Yes, that's appropriate in some cases, but in my case it's not a big deal; I'd just like it to continue executing the rest of the stored procedure anyway.
On Error GoTo 0 shows the SQL error message and gives options to "End" or "Debug".
On Error Resume Next skips the SQL error message, but silently cancels the SQL command and moves on to the next VBA statement.
Any ideas?
EDIT: I've had a request for my code. I'm not sure it will help you much, but here:
Dim ServerConnection As ADODB.Connection
Set ServerConnection = New ADODB.Connection
On Error GoTo 0
ServerConnection.Open "Provider=MSDASQL.1;Persist Security Info=True;Extended Properties=""DRIVER=SQL Native Client;Trusted_Connection=Yes;SERVER=DBServer;DATABASE=Database"";"
Dim SqlCommand As ADODB.Command
Set SqlCommand = New ADODB.Command
With SqlCommand
.ActiveConnection = ServerConnection
.CommandText = "EXEC CacheTables #CacheTableType1=True"
.CommandType = adCmdText
.Execute
End With
ServerConnection.Close
Set SqlCommand = Nothing
Set ServerConnection = Nothing
I don't really control the stored procedure, unfortunately, but the reason it's throwing errors is that the person who wrote it used RAISERROR statements to tell the user where the execution is up to. This is fine in SQL Server, which understands the severity level and hence continues executing, but VBA appears to treat every error as equally fatal.
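For what it's worth, here is a rough sketch of what trapping the error on the VBA side and dumping the provider messages would look like. It assumes the SqlCommand and ServerConnection objects from the code above, and it doesn't change whether the server carries on with the rest of the procedure:
On Error Resume Next
SqlCommand.Execute
If Err.Number <> 0 Then
    ' Dump everything the provider reported, including any RAISERROR text
    Dim adoErr As ADODB.Error
    For Each adoErr In ServerConnection.Errors
        Debug.Print adoErr.Number, adoErr.NativeError, adoErr.Description
    Next adoErr
    Err.Clear
End If
On Error GoTo 0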
Because the processing of your stored procedure is all taking place on your DB server, you would have to have some kind of error handling inside the stored procedure itself. For example, a TRY/CATCH block in your T-SQL if you are using SQL Server:
BEGIN TRY
{ sql_statement | statement_block }
END TRY
BEGIN CATCH
[ { sql_statement | statement_block } ]
END CATCH
[ ; ]
http://msdn.microsoft.com/en-us/library/ms175976.aspx
Alternatively you could break up your sprocs into logical divisions and surround each call with On Error Resume Next blocks in VBA. This latter option is a pretty dirty hack though, I believe.
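If you did go the VBA route, the shape of that hack would be roughly this (a sketch only: CacheTablesPart1 and CacheTablesPart2 are hypothetical stand-ins for whatever logical divisions the original procedure could be split into, and SqlCommand is the command object from the question):
On Error Resume Next
SqlCommand.CommandText = "EXEC CacheTablesPart1"   ' hypothetical first chunk
SqlCommand.Execute
If Err.Number <> 0 Then Debug.Print "Part 1 failed: " & Err.Description: Err.Clear
SqlCommand.CommandText = "EXEC CacheTablesPart2"   ' hypothetical second chunk
SqlCommand.Execute
If Err.Number <> 0 Then Debug.Print "Part 2 failed: " & Err.Description: Err.Clear
On Error GoTo 0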
I've tried
CurrentProject.Connection.BeginTrans
' This acts as if you didn't begin a transaction.
' Changes are posted real time then gets an error on Commit that you are not in a transaction
Private conn As ADODB.Connection
Set conn = CurrentProject.Connection
conn.BeginTrans
' This fails silently in the sense that nothing posts to the database when it gets committed
Set conn = New ADODB.Connection
conn.Open GetConnectionString
conn.BeginTrans
' This fails on the first SQL statement with
' "Object variable or with block variable not set"
' But it clearly is set because it works if I don't start a transaction
' You would think that starting a transaction explicitly would work.
Private conn As ADODB.Connection
Set conn = CurrentProject.Connection
conn.Execute "Begin Transaction", , adCmdText Or adExecuteNoRecords
' This also fails silently in the sense that nothing posts to the database when it gets committed.
Commit looks just as you would expect. Whichever of these is appropriate:
CurrentProject.Connection.CommitTrans
conn.CommitTrans
conn.Execute "Commit Transaction", , adCmdText Or adExecuteNoRecords
Nothing works.
I can easily get it to work on DAO.
DAO.Workspace.BeginTrans
works just as you would expect, except that only DAO statements are committed. Anything posted through ADO is lost, disappearing as if it never happened.
But I don't want to switch hundreds of SQL routines to use DAO if I can avoid it.
Note that I'm not trying to create a new connection to the back end. These are all linked tables, with Access compatible SQL statements.
The solution I posted here doesn't appear to be working after all.
What is the recommended approach to transactions in ADO on local tables?
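For reference, the DAO pattern that does behave as expected looks roughly like this (a minimal sketch; the table name and statement are placeholders):
Dim ws As DAO.Workspace
Dim db As DAO.Database
Set ws = DBEngine.Workspaces(0)
Set db = CurrentDb
ws.BeginTrans
db.Execute "UPDATE SomeLocalTable SET SomeField = 1", dbFailOnError   ' placeholder statement
ws.CommitTrans   ' or ws.Rollback on error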
I am executing a stored procedure in SQL Server from Excel VBA. The stored procedure takes several minutes to run, but the VBA code originally kept running and displayed a "done" message even while the stored procedure was still executing.
To remedy this problem, I added a loop to check if the connection to SQL Server is executing something, but for some reason, I am getting stuck in the loop and never reaching the end message.
Here is what I have written:
Dim cn As Object
Dim startTime As Variant, endTime As Date
Set cn = CreateObject("ADODB.Connection")
cn.Open "...connection string stuff to open..."
cn.Execute "stored_procedure", , adAsyncExecute
Do While (cn.State And adStateExecuting) = adStateExecuting
DoEvents
Loop
cn.Close
Set cn = Nothing
MsgBox "Execution completed"
For some reason, when I print the connection state inside of the "Do while" loop, it only says 1 (1 = connection is open, instead of 4 = executing). Yet the loop is never exited!! Am I missing something here?
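For comparison, here is the same polling pattern written with early binding. This is only a sketch and assumes a reference to the Microsoft ActiveX Data Objects library is set, so that adAsyncExecute and adStateExecuting are defined constants rather than empty variants:
Dim cn As ADODB.Connection
Set cn = New ADODB.Connection
cn.Open "...connection string stuff to open..."
cn.Execute "stored_procedure", , adAsyncExecute   ' returns immediately
Do While (cn.State And adStateExecuting) = adStateExecuting
    DoEvents   ' keep Excel responsive while the procedure runs
Loop
cn.Close
Set cn = Nothing
MsgBox "Execution completed"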
In a Classic ASP script, I have a sql statement that executes a stored procedure:
sql="exec my_stored_proc"
I then attempt to fill an ADO Recordset object with the results and loop through the recordset.
The stored procedure itself is populating a virtual table and the last line of the procedure returns the results:
select tableName, subTable, FieldLabel, total, FieldOrder from #AdmissionsTable
When executed directly from SQL management studio, I get results, but when executed from the ASP page, this line:
do while not rs.eof
results in the error "Operation is not allowed when the object is closed."
I know it's not because I'm trying to execute a stored proc, because I tested it against another stored proc, and was able to get results into the rs object.
Is it because I'm using a virtual table? Or something else?
Extra code:
After the sql string is assigned I open the connection, set the rs object, and execute the query:
conn.Open strConnection
Set rs = server.createobject("ADODB.Recordset")
rs.open sql, conn
Then comes the error line:
do while not rs.eof
Further update:
Right after the rs.open line, I added some debug code to response.write "RS Populated," and this line executes. But then the error still appears directly after that, so I'm guessing this means that somehow the RS is getting closed right away. Not sure why.
Still further update:
I've at least isolated the problem. The err object returns "Query timeout" as the error, and rs.ActiveConnection.CommandTimeout returns "30" as my timeout value.
The query is large and takes at least 60 seconds to run, so I tried setting rs.ActiveConnection.CommandTimeout = 120, but it still fails and - most maddeningly - it still returns "30" as the timeout value.
If rs.ActiveConnection.CommandTimeout = 120 didn't work to increase the timeout, how can I get that value up?
Problem solved. I had to set the CommandTimeout property at the connection level:
conn.CommandTimeout=120
Now it works.
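So, for anyone else hitting this, the working sequence ends up looking roughly like this (connection string and field handling omitted):
Set conn = Server.CreateObject("ADODB.Connection")
conn.CommandTimeout = 120   ' set at the connection level, before the query runs
conn.Open strConnection
sql = "exec my_stored_proc"
Set rs = Server.CreateObject("ADODB.Recordset")
rs.Open sql, conn
Do While Not rs.EOF
    ' ... read tableName, subTable, FieldLabel, total, FieldOrder here ...
    rs.MoveNext
Loop
rs.Close
conn.Close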
As the title says, I'm trying to run a pass-through query asynchronously.
I have tried
db.Execute "QrySSRSOneParameter", dbRunAsync
but this doesn't work.
So I found this code that passes the SQL statement through.
I run the following code but I get a
Could not find stored procedure 'sptest'.
It does exist.
Set ws = DBEngine.CreateWorkspace("ODBCWorkspace", "LESTERASSOCIATE\Malcolm", "access", dbUseODBC)
Set myconn = ws.OpenConnection("TestConnection", dbRunAsync, False, connstring)
Set myqry = myconn.CreateQueryDef("", "EXECUTE sptest")
myconn.Execute "EXECUTE sptest", dbRunAsync
Set myconn = Nothing
Set ws = Nothing
Just looking at this code briefly, one thing struck me: you're setting up your connection, then creating a QueryDef... and then not using the QueryDef.
Shouldn't the execute line read
myqry.Execute dbRunAsync
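In other words, something along these lines (untested, reusing the workspace and connection from the question):
Set myqry = myconn.CreateQueryDef("", "EXECUTE sptest")
myqry.Execute dbRunAsync   ' control returns immediately; the query keeps running on the server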
My application requires a user to log in and allows them to edit a list of things. However, it seems that if the same user repeatedly logs in and out and edits the list, this user will run into a "System.Data.SqlClient.SqlException: Timeout expired." error. I've read a comment about it possibly being caused by uncommitted transactions. And I do have one going in the application.
I'll provide the code I'm working with; there is an IF statement in there that I was a little iffy about, but it seemed like a reasonable thing to do.
I'll just go over what's going on here, there is a list of objects to update or add into the database. New objects created in the application are given an ID of 0 while existing objects have their own ID's generated from the DB. If the user chooses to delete some objects, their IDs are stored in a separate list of Integers. Once the user is ready to save their changes, the two lists are passed into this method. By use of the IF statement, objects with ID of 0 are added (using the Add stored procedure) and those objects with non-zero IDs are updated (using the Update stored procedure). After all this, a FOR loop goes through all the integers in the "removal" list and uses the Delete stored procedure to remove them. A transaction is used for all this.
Public Shared Sub UpdateSomethings(ByVal SomethingList As List(Of Something), ByVal RemovalList As List(Of Integer))
Using DBConnection As New SqlConnection(conn)
DBConnection.Open()
Dim MyTransaction As SqlTransaction
MyTransaction = DBConnection.BeginTransaction()
Try
Using MyCommand As New SqlCommand()
MyCommand.Transaction = MyTransaction
MyCommand.CommandType = CommandType.StoredProcedure
For Each SomethingItem As Something In SomethingList
MyCommand.Connection = DBConnection
If SomethingItem.ID > 0 Then
MyCommand.CommandText = "UpdateSomething"
Else
MyCommand.CommandText = "AddSomething"
End If
MyCommand.Parameters.Clear()
With MyCommand.Parameters
If MyCommand.CommandText = "UpdateSomething" Then
.Add("#id", SqlDbType.Int).Value = SomethingItem.ID
End If
.Add("#stuff", SqlDbType.Varchar).Value = SomethingItem.Stuff
End With
MyCommand.ExecuteNonQuery()
Next
MyCommand.CommandText = "DeleteSomething"
For Each ID As Integer In RemovalList
MyCommand.Parameters.Clear()
With MyCommand.Parameters
.Add("#id", SqlDbType.Int).Value = ID
End With
MyCommand.ExecuteNonQuery()
Next
End Using
MyTransaction.Commit()
Catch ex As Exception
MyTransaction.Rollback()
' Exception handling goes here
End Try
End Using
End Sub
There are three stored procedures used here as well as some looping so I can see how something can be holding everything up if the list is large enough.
I'm using Visual Studio 2008 to debug and am using SQL Server 2000 for the DB.
Edit: I still seem to be getting this error. I've even removed the whole transaction thing and I still encounter it. At this point, I'm assuming there is some kind of leak happening here. I've tried not using the Using statements and explicitly telling the command and connection to dispose themselves, but no dice. Memory usage by SQL Server also increases quite a bit if this method is called a lot in a short period of time.
I've read that increasing the CommandTimeout property of the SQLCommand would help. I'm wondering if there are any big disadvantages or consequences from doing so.
I would suggest using the following; that way Dispose will always be called, and the transaction will be rolled back in every non-committed case.
using (SqlConnection sqlCn = new SqlConnection())
{
sqlCn.Open();   // the connection must be open before BeginTransaction
using (SqlTransaction myTrans = sqlCn.BeginTransaction())
{
...
myTrans.Commit();
}
}
Also, I don't believe you need to make a new SqlCommand for every execution. Just maintain the same one and update the CommandText and Parameters.
If you have a large number of commands, you may want to build them all before opening the connection. After you start the transaction and open the connection, spin through and execute them.
You probably want to use TransactionScope
Using _tx as New System.Transactions.TransactionScope(<add your own timeout here>)
'Do all your sql work'
If _noErrors Then
_tx.Complete()
End If
End Using
With the transaction scope, you can set a timeout of up to 20 minutes without modifying server settings.
I believe I have managed to solve the problem. I have modified the application so that unnecessary calls to the database are not made (i.e. unchanged objects do not need to be updated again) and increased the CommandTimeout property for the SQLCommand object. So far, no problems.
Big thanks for suggestions too.