VB6: speed of UPDATEs to an Access table

I know VB6 is a bit out of date, but that's the application I have inherited at the moment.
I have to update an Access table based on the results of an array.
The array contains a double and the ID of the record to update.
The issue is that there are 120,000 records to update, but in a test it took 60 seconds just to run it on 374 records.
This is my update code:
Dim oCon As New ADODB.Connection
Dim oRs As New ADODB.Recordset
Dim string3 As String

oCon.Open "Driver={Microsoft Access Driver (*.mdb)};Dbq=" & App.Path & "\hhis.mdb;Pwd=/1245;"
oCon.BeginTrans
For i = 0 To maxnumberofrecords - 1
    string3 = "UPDATE YourRatings SET yourRatings=" & YourTotalRatingsAll(i) & " WHERE thisID = " & thisID(i) & ";"
    oRs.Open string3, oCon, adOpenStatic, adLockOptimistic
Next i
oCon.CommitTrans
oCon.Close
I added the "CommitTrans" based on some other articles I had read, but that doesn't seem to increase the speed.
The other problem is that I will then have to run another query to rank the records from highest (1) to lowest (374) and update the database again, although I can probably do something with the array to add that column at the same time.
This seems quite slow to me, especially when other posts mention 200,000 records in 14 seconds.
Have I just missed something?
Would it be quicker to load from a text file?
Thank you in advance for any help.
Mal

With Open you are always constructing a new Recordset object. Try oCon.Execute string3 instead, which just sends the SQL to your database without the Recordset overhead.
Ensure that you have an index on thisID.
Maybe your Access DB sits on a network drive; that can have a large performance impact. Try it locally.
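For what it's worth, a minimal sketch of the Execute-based loop, reusing the names from the question; the adExecuteNoRecords flag tells ADO not to build a Recordset at all:

Dim oCon As New ADODB.Connection
Dim i As Long

oCon.Open "Driver={Microsoft Access Driver (*.mdb)};Dbq=" & App.Path & "\hhis.mdb;Pwd=/1245;"
oCon.BeginTrans
For i = 0 To maxnumberofrecords - 1
    ' No Recordset is created or returned for this statement
    oCon.Execute "UPDATE YourRatings SET yourRatings=" & YourTotalRatingsAll(i) & _
                 " WHERE thisID=" & thisID(i), , adCmdText + adExecuteNoRecords
Next i
oCon.CommitTrans
oCon.Close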

Why are you using the creaky old Access Desktop ODBC Driver instead of the Jet 4.0 OLEDB Provider?
Why are you opening a Recordset to do an UPDATE instead of calling Execute on the Connection?
Is there any reason you can't open the database for exclusive access to perform this operation? Locking overhead is all but eliminated and "cargo cult" techniques like thrashing with BeginTrans/CommitTrans lose any significance.
Do you have an index on your thisID field in the table?
Please do move to the magic bullet .Net as soon as possible. Your programs will be even slower but we won't have to read all the whinging blaming VB6.
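In that spirit, a hedged sketch of opening the same database through the Jet 4.0 OLEDB provider with exclusive access; the keywords are standard Jet OLEDB ones, but the path and password are carried over from the question:

Dim oCon As New ADODB.Connection
oCon.Mode = adModeShareExclusive   ' exclusive open: set Mode before calling Open
oCon.Open "Provider=Microsoft.Jet.OLEDB.4.0;" & _
          "Data Source=" & App.Path & "\hhis.mdb;" & _
          "Jet OLEDB:Database Password=/1245;"
oCon.Execute "UPDATE YourRatings SET yourRatings=0 WHERE thisID=1", , _
             adCmdText + adExecuteNoRecords   ' same Execute pattern as above
oCon.Close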

Just to add to Wumpz's answer: you might want to try adding the query directly to the Access database and calling it with parameters. This SO thread says far more about it.
Parameters are far more stable and less hackable than string injection.
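A rough sketch of the parameterized approach with an ADODB.Command, reusing the oCon connection from above and assuming yourRatings is a Double and thisID a Long; the statement is prepared once and only the parameter values change per row:

Dim cmd As New ADODB.Command
Set cmd.ActiveConnection = oCon
cmd.CommandText = "UPDATE YourRatings SET yourRatings = ? WHERE thisID = ?"
cmd.CommandType = adCmdText
cmd.Parameters.Append cmd.CreateParameter("rating", adDouble, adParamInput)
cmd.Parameters.Append cmd.CreateParameter("id", adInteger, adParamInput)

For i = 0 To maxnumberofrecords - 1
    cmd.Parameters(0).Value = YourTotalRatingsAll(i)
    cmd.Parameters(1).Value = thisID(i)
    cmd.Execute , , adExecuteNoRecords   ' no Recordset built
Next i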

Related

Copying Tables contents of one Database to another from a vb.NET app using OracleDataAdapter.InsertCommand

So the rundown of what I'm trying to achieve is essentially an update app that will pull data from our most recent production databases and copy its contents to the Devl or QA databases. I plan to limit how many rows are chosen so that we only get what we need and the update can happen more consistently; as it stands, these databases rarely get updated due to the vast size of the copy job. The actual PL/SQL commands will be stored in a table that I plan to reference for each table.
I'm currently stuck on the best and easiest way to transfer data between these two databases while still getting my CommandText to be used. I figured the best way would be the OracleDataAdapter.InsertCommand, but very few examples of what I'm doing can be found. Any suggestions aside from .InsertCommand are welcome, as I'm still getting my footing with Oracle altogether.
Dim da As OracleDataAdapter = New OracleDataAdapter
Dim cmd As New OracleCommand()
GenericOraLoginProvider.Connect()
' Create the SelectCommand.
cmd = New OracleCommand("SELECT * FROM TAT_TESTTABLE ", GenericOraLoginProvider.Connection())
da.SelectCommand = cmd
' Create the InsertCommand.
cmd = New OracleCommand("INSERT INTO TAT_TEMP_TESTTABLE", GenericOraLoginProvider.Connection())
da.InsertCommand = cmd
Question: this is an example of what I've been trying as a first step with the InsertCommand; TAT_TESTTABLE and TAT_TEMP_TESTTABLE are just junk tables that I loaded with data to see if I could move things around this way.
The reason I'm asking is that the data isn't transferring over. While these tables are on the same database today, in the future they will not be, along with the change to the previously mentioned PL/SQL commands. Thank you for any help or words of wisdom you can provide, and sorry for the wall of text; I tried to keep it specific.
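For reference, a hedged sketch of how a DataAdapter's InsertCommand is normally wired up; this assumes ODP.NET, and the single ID column is invented for illustration. The INSERT needs a column/VALUES list with parameters mapped to source columns, Fill must leave the rows in the Added state, and nothing is written until Update is called:

Dim da As New OracleDataAdapter("SELECT * FROM TAT_TESTTABLE",
                                GenericOraLoginProvider.Connection())
Dim ins As New OracleCommand("INSERT INTO TAT_TEMP_TESTTABLE (ID) VALUES (:pId)",
                             GenericOraLoginProvider.Connection())
Dim p As New OracleParameter(":pId", OracleDbType.Int32)
p.SourceColumn = "ID"                 ' bind the parameter to the DataTable column
ins.Parameters.Add(p)
da.InsertCommand = ins

da.AcceptChangesDuringFill = False    ' keep fetched rows in the Added state
Dim table As New DataTable()
da.Fill(table)                        ' read from the source table
da.Update(table)                      ' fires InsertCommand once per Added row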
Look up SqlBulkCopy; I use it to transfer data between all kinds of vendor databases: https://msdn.microsoft.com/en-us/library/ex21zs8x(v=vs.110).aspx
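A minimal sketch of that approach. Note that SqlBulkCopy can read from any data reader but only writes to SQL Server, so this assumes a SQL Server destination (for an Oracle destination, ODP.NET offers a similar OracleBulkCopy class); the connection strings are placeholders:

Imports System.Data.SqlClient
Imports Oracle.ManagedDataAccess.Client

Module BulkCopyDemo
    Sub Main()
        Using src As New OracleConnection("User Id=xxx;Password=xxx;Data Source=PROD"),
              dst As New SqlConnection("Server=xxx;Database=QA;Integrated Security=true")
            src.Open()
            dst.Open()
            Dim cmd As New OracleCommand("SELECT * FROM TAT_TESTTABLE", src)
            Using reader = cmd.ExecuteReader()
                Using bulk As New SqlBulkCopy(dst)
                    bulk.DestinationTableName = "TAT_TEMP_TESTTABLE"
                    bulk.WriteToServer(reader)   ' streams rows straight across
                End Using
            End Using
        End Using
    End Sub
End Module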

CurrentProject vs CurrentDb

I'm new to VBA and very new to SQL, so I don't quite understand why I am having this problem. I'm trying to import Excel files into an Access database. For a few reasons I have to store my Excel data in an array, then upload that array into a temporary table, and then append the pertinent parts of the table. To my current understanding, the best way to get my data from the array into a new temporary table is to create that table and then populate it row by row using SQL. My problem is that when I try to create the table, I get an error. I've simplified my code; I believe this contains what you need to know.
Dim strSQL As String
strSQL = "CREATE TABLE TestTable (`Value` VARCHAR(20))"
CurrentDb.Execute strSQL
This works fine
Dim strSQL As String
strSQL = "CREATE TABLE TestTable (`Value` DECIMAL(4,2))"
CurrentDb.Execute strSQL
This results in run-time error 3292 "Syntax error in field definition"
Dim strSQL As String
strSQL = "CREATE TABLE TestTable (`Value` DECIMAL(4,2))"
CurrentProject.Connection.Execute strSQL
This works fine
Is there a reason why I should ever use one over the other? Simply changing to CurrentProject seems to fix any problems I have, but I just want to understand what is happening here. I have done my own research, but any answer I find goes over my head (again, I am relatively new to VBA). I apologize in advance; there may be no answer that I am currently able to comprehend.
IIRC, CurrentDb.Execute supports a different ANSI SQL standard, I think ANSI-89, although I could be wrong. CurrentProject.Connection.Execute supports ANSI-92, which allows that statement to execute correctly. It's a different spec of the SQL language. If you've ever used or will use SQL Server, this becomes obvious pretty quickly.
You should notice that your SQL statement won't work when run as a plain saved query either. To run it successfully as a query, one option is to enable ANSI-92, which isn't enabled by default in Access; the relevant toggle is the "SQL Server Compatible Syntax (ANSI 92)" option under Access Options > Object Designers > Query design.
That being said, changing the SQL standard may be less than ideal if you are familiar with Access's own SQL syntax; however, it might be very helpful if you already have experience with SQL Server. Also worth noting: changing the ANSI option in Access won't change how CurrentDb.Execute executes SQL statements in VBA. You'll still need your current approach, or a stored query.
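To make the difference concrete, here's a small sketch using the DDL from the question; only the execution path changes:

Dim strSQL As String
strSQL = "CREATE TABLE TestTable (`Value` DECIMAL(4,2))"

' DAO/Jet path: the traditional Jet dialect does not recognise
' DECIMAL(precision, scale), so this raises run-time error 3292:
' CurrentDb.Execute strSQL, dbFailOnError

' ADO path: the Jet OLEDB provider always speaks the ANSI-92 dialect,
' where DECIMAL(4,2) is a valid field definition:
CurrentProject.Connection.Execute strSQL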

How does "set rs = conn.execute(sql)" actually work?

I am currently trying to improve the performance of my classic ASP application, and I am at the point of improving SQL transactions.
I have been reading a little from http://www.somacon.com/p236.php
So, my question is:
How does the line set rs = conn.execute(sql) actually work?
I believe that the line defines a variable named rs and binds ALL the data collected from the database through the SQL statement (e.g. select * from users) to it.
Then afterwards I can actually throw the database connection to hell and redefine my sql variable if I please. Is this true?
If that is true, will I then get the best performance by executing my code like this:
set conn = server.createobject("adodb.connection")
dsn = "Provider = sqloledb; Data Source = XXX; Initial Catalog = XXX; User Id = XXX; Password = XXX"
conn.open dsn

sql = "select id, name from users"
set rs = conn.execute(sql)
conn.close
' do whatever I want with the variable rs

conn.open dsn
sql = "select id from logins"
set rs = conn.execute(sql)
conn.close
' do whatever I want with the variable rs

conn.open dsn
sql = "select id, headline from articles"
set rs = conn.execute(sql)
conn.close
' do whatever I want with the variable rs

set conn = nothing
In this example I open and close the connection each time I do a SQL transaction.
Is this a good idea?
Is this a good idea?
No, but not for the reasons indicated by Luke. The reality is that ADODB will cache connections anyway, so opening and closing connections isn't all that expensive after all. However, the question proceeds from the misinformation you appear to have about the behaviour of a recordset...
From your comment on Luke's answer:
But it is correct, that it stores all the data in the variable when executed?
Not unless you have carefully configured the recordset to be a static, client-side cursor, and even then you would have to ensure that the cursor is completely filled. Only then could you disconnect the recordset from the connection and yet continue to use the data in it.
By default a SQL Server connection will deliver a simple "fire-hose" rowset (this isn't even really a cursor): the data is delivered raw from the query, only a small amount of buffering of incoming records occurs, and you can't navigate backwards.
The most efficient way to minimise the time you need the connection is to use the ADODB Recordset's GetRows method. This will suck all the rows into a 2-dimensional array of Variants. Having got this array, you can dispense with the recordset and the connection.
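A small sketch of that pattern in classic ASP, assuming the dsn and users table from the question; note that GetRows returns the array as (field, record):

dim conn, rs, rows, i
set conn = server.createobject("adodb.connection")
conn.open dsn
set rs = conn.execute("select id, name from users")
if not rs.eof then rows = rs.getrows()   ' pull everything into a variant array
rs.close : set rs = nothing
conn.close : set conn = nothing          ' connection released as early as possible

if isarray(rows) then
    for i = 0 to ubound(rows, 2)         ' second dimension indexes the records
        response.write rows(0, i) & " " & rows(1, i) & "<br>"
    next
end if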
Much is still made of minimising the number of connections maintained on a server, but in reality on modern hardware that is not a real issue for the majority of apps. The real problem is the amount of time an ongoing query holds locks in the DB. By consuming and closing a recordset quickly, you minimise the time locks are held.
A word of caution, though: the tradeoff is an increased demand for memory on the web server, so you need to be careful you aren't just shifting one bottleneck to another. That said, there are plenty of things you can do about that. Use a 64-bit OS and stuff plenty of memory in it, or scale the web servers out into a farm.
Nope, opening and closing connections is costly. Open it, reuse the recordset like you are, then close it.

Insert Records repeatedly faster

I'm monitoring a folder for Jpg files and need to process the incoming files.
I decode the file name to get all the information I want, insert it into a table, and then move the file to another folder.
The file name already contains all the information I want, e.g.
2011--8-27_13:20:45_MyLocation_User1.jpg.
Right now I'm using an INSERT statement:
Private Function InsertToDB(ByVal SourceFile As String, ByVal Date_Time As DateTime, ByVal Loc As String, ByVal User As String) As Boolean
    Dim conn As SqlConnection = New SqlConnection(My.Settings.ConString)
    Dim sSQL As String = "INSERT INTO StageTbl ...."
    Dim cmd As SqlCommand
    cmd = New SqlCommand(sSQL, conn)
    ' .... parameters set ....
    conn.Open()
    cmd.ExecuteNonQuery()
    conn.Close()
    conn = Nothing
    cmd = Nothing
End Function
The function will be called for every single file found.
Is this the most efficient way? It looks very slow, and I need to process about 20-50 files/sec. Perhaps a stored procedure?
I need to do this as fast as possible; I guess a bulk insert is not applicable here.
Please help.
Bulk insert could be applicable here: do you need the records to be in the DB instantly, or could you build them up in memory and then push them into the database in batches? (See the sketch after this answer.)
Are you multi-threading as well? Otherwise your end-to-end process could get behind.
Another solution would be to use message queues: pop a message onto the queue for every file, then have a process (on a different thread) continually reading the queue and adding records to the database.
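A hedged sketch of the batch-up-in-memory idea: rows accumulate in a DataTable and are pushed with SqlBulkCopy once enough have arrived. The table and column names are assumptions carried over from the question:

Imports System.Data
Imports System.Data.SqlClient

Module BatchedInserts
    Private ReadOnly pending As New DataTable()

    Sub Setup()
        pending.Columns.Add("SourceFile", GetType(String))
        pending.Columns.Add("Date_Time", GetType(DateTime))
        pending.Columns.Add("Loc", GetType(String))
        pending.Columns.Add("UserName", GetType(String))
    End Sub

    Sub Queue(sourceFile As String, dt As DateTime, loc As String, user As String)
        pending.Rows.Add(sourceFile, dt, loc, user)
        If pending.Rows.Count >= 500 Then Flush()   ' push a batch every 500 files
    End Sub

    Sub Flush()
        Using conn As New SqlConnection(My.Settings.ConString)
            conn.Open()
            Using bulk As New SqlBulkCopy(conn)
                bulk.DestinationTableName = "StageTbl"
                bulk.WriteToServer(pending)
            End Using
        End Using
        pending.Rows.Clear()
    End Sub
End Module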
There are several things you can do to optimize the speed of this process:
Don't open and close the connection for every insert. That alone will yield a (very) significant performance improvement (unless you were using connection pooling already).
You may gain performance if you disable autocommit and perform inserts in blocks, committing the transaction after every N rows (100-1000 rows is a good number to try for a start). A sketch combining this with the previous point follows this list.
Some DB systems provide a syntax to allow insertion of multiple rows in a single query. SQL Server doesn't, but you may be interested in this: http://blog.sqlauthority.com/2007/06/08/sql-server-insert-multiple-records-using-one-insert-statement-use-of-union-all/
If there are many users/processes accessing this table, access can be slow depending on your transaction isolation level. In your case (20-50 inserts/sec) this shouldn't make a big difference. I don't recommend modifying this unless you understand well what you are doing: http://en.wikipedia.org/wiki/Isolation_%28database_systems%29 and http://technet.microsoft.com/es-es/library/ms173763.aspx
I don't think a stored procedure will necessarily provide a big performance gain. You are only parsing/planning the insert 20-50 times per second. Use a stored procedure only if it fits your development model well; if all your other queries are in code, you can avoid it.
Ensure your bottleneck is actually the database (i.e. that moving the files is not taking a lot of time); the OS should be good at this, but check the points above first. If you find that moving files is the bottleneck, delaying the moves or doing them in the background (another thread) can help to a certain extent.
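A minimal sketch of the first two points, assuming the StageTbl columns invented above: one connection held open, one prepared command, and explicit transactions committed in blocks:

Imports System.Data
Imports System.Data.SqlClient

Public Class StageWriter
    Private ReadOnly conn As New SqlConnection(My.Settings.ConString)
    Private tx As SqlTransaction
    Private cmd As SqlCommand
    Private count As Integer

    Public Sub Open()
        conn.Open()
        tx = conn.BeginTransaction()
        cmd = New SqlCommand(
            "INSERT INTO StageTbl (SourceFile, Date_Time, Loc, UserName) " &
            "VALUES (@f, @d, @l, @u)", conn, tx)
        cmd.Parameters.Add("@f", SqlDbType.NVarChar, 260)
        cmd.Parameters.Add("@d", SqlDbType.DateTime)
        cmd.Parameters.Add("@l", SqlDbType.NVarChar, 100)
        cmd.Parameters.Add("@u", SqlDbType.NVarChar, 100)
    End Sub

    Public Sub Insert(f As String, d As DateTime, l As String, u As String)
        cmd.Parameters("@f").Value = f
        cmd.Parameters("@d").Value = d
        cmd.Parameters("@l").Value = l
        cmd.Parameters("@u").Value = u
        cmd.ExecuteNonQuery()
        count += 1
        If count Mod 500 = 0 Then        ' commit a block, start a new transaction
            tx.Commit()
            tx = conn.BeginTransaction()
            cmd.Transaction = tx
        End If
    End Sub
End Class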

SQL bottleneck, how to fix

This is related to my previous thread: SQL Query takes about 10 - 20 minutes
However, I've mostly figured out the problem. The problem (as described in the previous thread) is not the insert (though that's still slow); the problem is looping through the data itself.
Consider the following code:
Dim rs As DAO.Recordset
Dim sngStart As Single, sngEnd As Single
Dim sngElapsed As Single

Set rs = CurrentDb().QueryDefs("select-all").OpenRecordset
MsgBox "All records retrieved"

sngStart = Timer
Do While Not rs.EOF
    rs.MoveNext
Loop
sngEnd = Timer

sngElapsed = Format(sngEnd - sngStart, "Fixed") ' elapsed time
MsgBox "The query took " & sngElapsed & " seconds to run."
As you can see, this loop does NOTHING. You'd expect it to finish in seconds, yet it takes about 857 seconds (roughly 14 minutes) to run. I don't know why it is so slow. Maybe the Lotus NotesSQL driver?
Any other ideas? (A Java-based solution, or anything else, would be fine.)
My goal: to get all the data from the remote server and insert it into a local Access table.
This document has some information about performance tuning in NotesSQL. If you aren't already, select your data from Notes Views instead of Notes Forms. NotesSQL will then leverage the indexes within the views for faster queries. You may need to create the view in the Notes database, but the performance benefit will make it worthwhile.
My recommendation is that you create a pass-through query to get the data from the remote server, then create a make-table query that uses the pass-through query as its source. Your function would then be simplified to a call to this second query.
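A hedged sketch of that setup in Access VBA; the DSN, view, and query names are placeholders:

Dim qdf As DAO.QueryDef

' 1. Pass-through query: the SQL runs on the remote server via ODBC
Set qdf = CurrentDb.CreateQueryDef("qryRemoteData")
qdf.Connect = "ODBC;DSN=NotesSQL;"        ' your NotesSQL DSN here
qdf.SQL = "SELECT * FROM SomeNotesView"
qdf.ReturnsRecords = True
qdf.Close

' 2. Make-table query: materialises the result into a local Access table
CurrentDb.Execute "SELECT * INTO LocalTable FROM qryRemoteData", dbFailOnError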
The loop isn't doing "nothing"; it's calling MoveNext, which is potentially doing A LOT.