I am currently trying to improve performance in my Classic ASP application, and I have reached the point of optimising SQL transactions.
I have been reading a little from http://www.somacon.com/p236.php
So, my question is:
How does the line set rs = conn.execute(sql) actually work?
I believe that this line defines a variable named rs and binds ALL the data returned from the database by the SQL statement (e.g. select * from users) to it.
Then afterwards I can throw the database connection to hell and redefine my sql variable if I please. Is this true?
If that is true, will I then get the best performance by executing my code like this:
set conn = server.createobject("adodb.connection")
dsn = "Provider = sqloledb; Data Source = XXX; Initial Catalog = XXX; User Id = XXX; Password = XXX"
conn.open dsn
sql = "select id, name from users"
set rs = conn.execute(sql)
conn.close
' Do whatever I want with the variable rs
conn.open dsn
sql = "select id from logins"
set rs = conn.execute(sql)
conn.close
' Do whatever I want with the variable rs
conn.open dsn
sql = "select id, headline from articles"
set rs = conn.execute(sql)
conn.close
' Do whatever I want with the variable rs
set conn = nothing
In this example I open and close the connection each time I execute a SQL query.
Is this a good idea?
Is this a good idea?
No, but not for the reasons indicated by Luke. The reality is that ADODB will cache connections anyway, so opening and closing connections isn't all that expensive after all. However, the question proceeds from the misinformation you appear to have about the behaviour of a recordset...
From your comment on Luke's answer:
But it is correct, that it stores all the data in the variable when executed?
Not unless you have carefully configured the recordset to return a static, client-side cursor, and even then you would have to ensure that the cursor is completely filled. Only then could you disconnect the recordset from the connection and still continue to use the data in it.
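For completeness, a disconnected recordset looks roughly like this. This is only a sketch, assuming the dsn string from the question; the numeric values are the ADO constants (include adovbs.inc or reference the type library for the named forms):

```vb
Dim conn, rs
Set conn = Server.CreateObject("ADODB.Connection")
conn.Open dsn

Set rs = Server.CreateObject("ADODB.Recordset")
rs.CursorLocation = 3                               ' adUseClient: client-side cursor
rs.Open "select id, name from users", conn, 3, 4    ' adOpenStatic, adLockBatchOptimistic

Set rs.ActiveConnection = Nothing                   ' disconnect: data now lives client-side
conn.Close                                          ' the connection is no longer needed
Set conn = Nothing

' rs remains fully navigable here, with no open connection
```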
By default a SQL Server connection delivers a simple "fire-hose" rowset (this isn't even really a cursor): the data arrives raw from the query, only a small amount of buffering of incoming records occurs, and you can't navigate backwards.
The most efficient way to minimise the amount of time you need the connection is to use the ADODB Recordset's GetRows method. This will suck all the rows into a 2-dimensional array of variants. Having got this array you can dispense with the recordset and connection.
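A minimal sketch of the GetRows approach, assuming an open ADODB.Connection named conn as in the question:

```vb
Dim rs, rows, i
Set rs = conn.Execute("select id, name from users")
rows = rs.GetRows()            ' 2-D variant array: rows(fieldIndex, recordIndex)
rs.Close
Set rs = Nothing
conn.Close                     ' connection held only as long as needed

For i = 0 To UBound(rows, 2)   ' the second dimension indexes the records
    Response.Write rows(0, i) & " - " & rows(1, i) & "<br>"
Next
```

Note that GetRows raises an error on an empty recordset, so check rs.EOF first in real code.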
Much is still made of minimising the number of connections maintained on a server, but in reality, on modern hardware, that is not a real issue for the majority of apps. The real problem is the amount of time an ongoing query holds locks in the DB. By consuming and closing a recordset quickly you minimise the time locks are held.
A word of caution though. The tradeoff is an increased demand for memory on the web server. You need to be careful you aren't just shifting one bottleneck to another. That said there are plenty of things you can do about that. Use a 64Bit O/S and stuff plenty of memory in it or scale out the web servers into a farm.
Nope, opening and closing connections is costly. Open it, reuse the recordset like you are, then close it.
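In other words, the code from the question collapses to a single open/close pair, something like this sketch (reusing the dsn string from the question):

```vb
set conn = server.createobject("adodb.connection")
conn.open dsn                          ' open the connection once

set rs = conn.execute("select id, name from users")
' ... consume rs ...
rs.close

set rs = conn.execute("select id from logins")
' ... consume rs ...
rs.close

set rs = conn.execute("select id, headline from articles")
' ... consume rs ...
rs.close

conn.close                             ' close once, at the end
set conn = nothing
```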
Related
I have noticed that a lot of code I have found on the internet will have variables defined and set that I would simply avoid...such as:
Dim db as Database
Dim rs as Recordset
Set db = CurrentDB
Set rs = db.OpenRecordset
I would write this as:
Dim rs as Recordset
Set rs = CurrentDB.OpenRecordset
I save 2 lines of code at the top, plus one to Set db = Nothing at the end, which over hundreds of subs and functions can really add up...
But is one more preferred over the other by most coders? Is there an actual reason to use one or the other? Is there an advantage to defining and spelling out the whole thing? Is there any real savings in doing it my way, other than staving off carpal tunnel for a few more minutes?
In terms of execution, there is no real difference between the two VBA methods. However, the first is the more generic version, as the db object can be set to either the local DAO database or an external DAO database using the Workspace.OpenDatabase method. The second is the shortcut version, as usually the needed recordsets, querydefs, etc. derive from the working, local database. Below are examples of referencing external databases.
Shortcut version:
Set db = OpenDatabase("C:\Path\ToExternalDB\someotherdb.accdb")
Full generic version:
Set db = DBEngine.Workspaces(0).OpenDatabase("C:\Path\ToExternalDB\someotherdb.accdb")
Therefore, should a developer change database environments (i.e., transition from a local to an external database source) or even workspaces, they can simply change the Set db = ... line, instead of having to add that line throughout the code, which would be necessary if the developer had originally used only CurrentDb.
This is a good example of the trade-off decision between efficiency (less scripted lines) and scalability (translatable lines).
"The CurrentDb method creates another instance of the current database... The CurrentDb method enables you to create more than one variable of type Database that refers to the current database" from https://msdn.microsoft.com/en-us/library/office/aa221178%28v=office.11%29.aspx?f=255&MSPPError=-2147217396.
So if you use CurrentDb just once in your code, then it's fine not to declare a variable for it. However, if you use it multiple times, you won't get the same instance: a new one is created each time, which may create strange bugs for you.
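So when CurrentDb is needed more than once, capturing it in a variable keeps everything on a single Database instance. A sketch (the table names are placeholders):

```vb
Dim db As DAO.Database
Dim rs1 As DAO.Recordset
Dim rs2 As DAO.Recordset

Set db = CurrentDb                        ' one instance, reused below
Set rs1 = db.OpenRecordset("Table1")      ' placeholder table names
Set rs2 = db.OpenRecordset("Table2")

' ... work with rs1 and rs2 against the same Database object ...

rs1.Close
rs2.Close
Set rs1 = Nothing
Set rs2 = Nothing
Set db = Nothing
```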
I know VB6 is a bit out of date, but that's the application that I have inherited at the moment.
I have to do an update based on the results of an array to an access table.
The array contains a double and the id of the record to update.
The issue is that there are 120,000 records to update but on a test it took 60 seconds just to run it on 374 records.
This is my update code
Dim oCon As New ADODB.Connection
Dim oRs As New ADODB.Recordset
Dim string3 As String
oCon.Open "Driver={Microsoft Access Driver (*.mdb)};Dbq=" & App.Path & "\hhis.mdb;Pwd=/1245;"
oCon.BeginTrans
For i = 0 To maxnumberofrecords - 1
string3 = "UPDATE YourRatings SET yourRatings=" & YourTotalRatingsAll(i) & " Where thisID = " & thisID(i) & ";"
oRs.Open string3, oCon, adOpenStatic, adLockOptimistic
Next i
oCon.CommitTrans
oCon.Close
I added the "CommitTrans" based on some other articles that I had read but that doesn't seem to increase the speed.
The other problem is that I am going to have to run another query to rank the highest(1) to lowest (374) and update the db again...although I can probably do something with the array to add that column at the same time.
This seems quite slow to me, especially when other posts mention 200,000 records in 14 seconds.
Have I just missed something?
Would it be quicker to load from a text file?
Thank you in advance for any help.
Mal
With Open you are always constructing a new Recordset object. Try oCon.Execute string3, which just sends the SQL to your database without the Recordset overhead.
Ensure that you have an index on thisID.
Maybe your Access DB sits on a network drive. That can have a large performance impact. Try it locally.
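Putting the first two points together, the update loop might look like this sketch; adExecuteNoRecords tells ADO not to build a recordset at all:

```vb
Dim i As Long
oCon.BeginTrans
For i = 0 To maxnumberofrecords - 1
    ' Execute the UPDATE directly on the connection; no Recordset is created
    oCon.Execute "UPDATE YourRatings SET yourRatings=" & YourTotalRatingsAll(i) & _
                 " WHERE thisID = " & thisID(i), , adExecuteNoRecords
Next i
oCon.CommitTrans
```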
Why are you using the creaky old Access Desktop ODBC Driver instead of the Jet 4.0 OLEDB Provider?
Why are you opening a Recordset to do an UPDATE instead of calling Execute on the Connection?
Is there any reason you can't open the database for exclusive access to perform this operation? Locking overhead is all but eliminated and "cargo cult" techniques like thrashing with BeginTrans/CommitTrans lose any significance.
Do you have an index on your thisID field in the table?
Please do move to the magic bullet .Net as soon as possible. Your programs will be even slower but we won't have to read all the whinging blaming VB6.
Just to add to Wumpz's answer, you might want to try adding the query directly to the Access database and calling it with parameters. This SO thread says far more about it.
Parameters are far more stable and less open to injection than concatenated SQL.
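A rough sketch of the parameterised version with an ADODB.Command (the parameter types here are assumptions; match them to your actual column types):

```vb
Dim cmd As New ADODB.Command
Dim i As Long

cmd.ActiveConnection = oCon
cmd.CommandText = "UPDATE YourRatings SET yourRatings = ? WHERE thisID = ?"
cmd.Parameters.Append cmd.CreateParameter("rating", adDouble, adParamInput)
cmd.Parameters.Append cmd.CreateParameter("id", adInteger, adParamInput)

oCon.BeginTrans
For i = 0 To maxnumberofrecords - 1
    cmd.Parameters(0).Value = YourTotalRatingsAll(i)
    cmd.Parameters(1).Value = thisID(i)
    cmd.Execute , , adExecuteNoRecords   ' no recordset built
Next i
oCon.CommitTrans
```

The command and its query plan are prepared once and reused on every iteration, which is where the speedup over per-row string concatenation comes from.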
Has anyone ever encountered a problem where a SQL query from a classic ASP page returns partial results, but no errors?
I have a situation where a particular query on a page (one of many throughout the system) is returning a different number of rows each time it is run, and always less than the "correct" number, as determined by running the SQL against the server directly.
I think this may be related to the connection timeout (since this occurs with a long-running query and my timing shows that it is returning very close to the timeout), but I am receiving no timeout errors. Instead, as far as I can tell, it causes no errors, and returns a valid DataSet which the code then loops over to build the results table.
Since there is no indication that an error has occurred, there is no suggestion that the data is incomplete, which means that users can no longer trust this report. Generally with this system, where SQL timeouts do occur frequently for "large" queries, we get error messages displayed on the page.
Investigations
I've checked the HTML source to make sure there are no injected errors that I'm missing, that all tags are well-formed, and that the expected page elements are present. This indicates it's not an error writing a particular row from the results.
Furthermore, the number of rows returned is different each time.
I've verified that the exact same query is being run each time.
I've verified that the data in the DB is not changing underneath the report (it's historic, and I've cross-checked by running the report and the query against the DB at the same time).
I've tried to manually print any errors from the query, but get nothing.
I've tried changing the timeout (though this hasn't helped, as I can only do this in the Dev environment, where there is not sufficient data in the DB to reach the timeout).
There are only around 20 rows in total expected, so not an issue with a very large dataset.
Has anyone run into a situation where a SQL query from a classic asp page only returns partial results, and is there any way to check for or prevent this condition?
Setup:
Classic asp web application
Heavy use throughout of ADODB.Connection objects for connecting to the DB2 server backend databases.
The database is static as far as the queried data is concerned.
Shared connection initialization as follows:
connString = "Provider=MSDASQL.1;User ID=xxx;Data Source=XXX;Extended Properties=""DSN=XXX;UID=XXX;PWD=XXX;"""
Set conn = Server.CreateObject( "ADODB.Connection" )
Set cmd = Server.CreateObject("ADODB.Command")
conn.Mode = adOpenReadOnly
conn.Open connString
cmd.ActiveConnection = conn
cmd.CommandTimeout = 600
Usage as follows:
sql = " SELECT blah FROM Foo WHERE ... " ' big long list of clauses defined from user selections
cmd.CommandText = sql
Set oRs = cmd.Execute
Response.Write "<table>..." ' write table headers here
Do While Not oRs.EOF
Response.Write "<tr>...</tr>" ' write details from oRs here
oRs.MoveNext
Loop
Response.Write "</table>"
Try adding an ORDER BY to the query; that way it should send all rows at once, and you can rule timeout issues in or out.
I have a task involving database operations using ADODB in VB.NET. Can anyone help me with using the same recordset multiple times?
I have tried closing the recordset after performing the first operation and then opening it for the second operation, etc.
Example:
Dim rs As New ADODB.Recordset()
OpenConnection()
rs.Open("table1")
' Some operations
' Now using the same recordset for the second operation
rs.Close()
rs.Open("table2")
In ADO.NET we have "MultipleActiveResultSets=True" in the connection string for SqlDataReader. Do we have any property like this in ADODB?
The most similar thing that exists in ADODB is sending multiple queries in your SQL string and then processing them one by one using rs.NextRecordset (rs is the Recordset); here is an example: ADO NextRecordset Method Demonstration
But, personally, I prefer doing it as you're doing it now, using one recordset per query. Take into account that using multiple recordsets inside one, as in the previous example, may require some additional commands in some DBs to ensure no unwanted extra recordsets are created from messages coming from the backend; for example, for SQL Server it's a good idea to use:
SET NOCOUNT ON and SET ANSI_WARNINGS OFF
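A sketch of the NextRecordset pattern against SQL Server, assuming an open ADODB.Connection named con (the table names are placeholders):

```vb
Dim rs As ADODB.Recordset
Set rs = con.Execute("SET NOCOUNT ON; SELECT id FROM table1; SELECT id FROM table2;")

Do Until rs Is Nothing
    Do While Not rs.EOF
        ' ... process the current result set ...
        rs.MoveNext
    Loop
    Set rs = rs.NextRecordset   ' returns Nothing after the last result set
Loop
```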
I have some ASP (Classic) code that queries a SQL 2005 Express database. I'm currently handling the case where this DB goes down by catching the error when someone tries to connect and can't. I capture the error and bypass subsequent queries to this database using a session variable.
My problem is that this first query takes about 20 seconds before it timeouts.
I'd like to reduce this timeout length but can't find which property either in the code or database is the right one to reduce.
I've tried following in the code;
con.CommandTimeout = 5
con.ConnectionTimeout = 5
Any suggestions please?
Thanks,
Andy
First off you should investigate why the DB is going down at all. We manage servers for hundreds of clients and have never run into a problem with the DB going down unless it was scheduled maintenance.
Besides that, you're already onto the right properties.
"Connect Timeout" is set in the connection string and controls how long the client waits to establish a connection to the database. It should be safe to lower this value in most cases--connections should never take long to establish.
"CommandTimeout" is a property on the IDbCommand implementation and controls how long the client waits for a particular query to return. You can lower this value if you know your queries will not take longer than the value you're setting.
Ended up using the "Connect Timeout" option within ADODB.Connection string.
e.g.
On Error Resume Next ' needed so the Err.Number check below can run after a failed Open
Set con = Server.CreateObject("ADODB.Connection")
con.Open "Provider=SQLOLEDB;Server=databaseserver;User ID=databaseuser;Password=databasepassword;Initial Catalog=databasename;Connect Timeout=5;"
If Err.Number = 0 And con.Errors.Count = 0 Then
'connected to database successfully
Else
'did not connect to database successfully within the timeout period specified (5 seconds in this example)
End If