Access: increase speed of DLookup by "caching" tables or another strategy?

Background
I have an Access split form with multiple DLookups. There are about 10 DLookups in total on the form, and approximately 25-50 records are displayed at any one time in the split form view.
The Access frontend is linked to SQL Server tables.
When the DLookup values are displayed in the datasheet view, viewing the information becomes quite slow because of frequent recalculations (each time anything in the dataset changes, Access appears to recalculate every DLookup for the entire split form datasheet). This was noticeably and unacceptably slow when connecting through VPN.
Research
I decided to investigate and wrote the following to determine why things were so slow. I also wanted to check if DLookup was slower than a SQL query for some reason.
Sub TestLotsOfDLookups()
    Dim count As Integer
    Dim startTime As Date
    Dim numbTries As Integer
    Dim t As String
    Dim dbs As DAO.Database
    Dim rsSQL As DAO.Recordset
    Dim strSQL As String
    Dim mDiff As Double

    numbTries = 100
    Set dbs = CurrentDb

    'Open a snapshot-type Recordset based on an SQL statement
    strSQL = "SELECT FullName FROM ToolDesigners WHERE ToolDesignersID=4;"
    startTime = Now
    For count = 1 To numbTries
        Set rsSQL = dbs.OpenRecordset(strSQL, dbOpenSnapshot)
        t = rsSQL.Fields(0)
        rsSQL.Close
    Next count
    mDiff = DateDiff("s", startTime, Now)
    Debug.Print "SQL Total time:" & vbTab & mDiff
    Debug.Print "SQL Average time:" & vbTab & mDiff / numbTries

    startTime = Now
    For count = 1 To numbTries
        t = DLookup("FullName", "ToolDesigners", "ToolDesignersID=4")
    Next count
    mDiff = DateDiff("s", startTime, Now)
    Debug.Print "DLookup Total time:" & vbTab & mDiff
    Debug.Print "DLookup Average time:" & vbTab & mDiff / numbTries
End Sub
(I understand this is only precise to single seconds)
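(For sub-second precision I could presumably switch to VBA's Timer function, which returns fractional seconds since midnight - something like:)
Dim t0 As Single
t0 = Timer                      'seconds since midnight, with sub-second resolution
'... run the loop being measured ...
Debug.Print "Elapsed seconds:", Timer - t0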
Interestingly, I found that each DLookup and each SQL query was taking nearly 0.5 seconds on average. Even while working on the company intranet, I still see averages of over 0.10 seconds. The two approaches are very comparable in speed.
This causes very slow form refresh as well as VERY slow datasheet refresh.
I then tested against a SQLExpress database hosted on my machine - times dropped to 0.0005 seconds on average.
Question
It seems DLookups are slow in this application. I am hoping to find an alternative and faster approach.
What I would like to be able to do is somehow make the DLookups run against the local tables Access presumably keeps, rather than the SQL tables on the server. It seems I could create temp tables every time I open a form or the database (not a fan) - is there a better way?
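For concreteness, the temp-table idea above would be something like this - a rough sketch only, with ToolDesigners_Local as a hypothetical local copy of the linked lookup table:
'Refresh a local copy of the small lookup table when the form opens,
'so the DLookups can reference ToolDesigners_Local instead of the linked SQL table.
Private Sub Form_Open(Cancel As Integer)
    With CurrentDb
        .Execute "DELETE FROM ToolDesigners_Local", dbFailOnError
        .Execute "INSERT INTO ToolDesigners_Local SELECT * FROM ToolDesigners", dbFailOnError
    End With
End Sub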
It seems that if I were referring to another Access database I could just use OpenDatabase, which then keeps it in memory and speeds up queries against that database. 100% of the examples I find refer to Access databases, though, not SQL.
Alternatively I could use something other than DLookup, which is what I was thinking when testing the SQL queries, but I'm not really sure what to try because the SQL approach was comparable in speed.

If it's just single values then I'd be inclined to use a simple in-memory cache -
Private mToolDesignerFullNameCache As New Scripting.Dictionary

Function GetToolDesignerFullName(Criteria As String) As Variant
    If mToolDesignerFullNameCache.Exists(Criteria) Then
        GetToolDesignerFullName = mToolDesignerFullNameCache(Criteria)
    Else
        Dim Result As Variant
        Result = DLookup("FullName", "ToolDesigners", Criteria)
        mToolDesignerFullNameCache.Add Criteria, Result
        GetToolDesignerFullName = Result
    End If
End Function

Sub ResetToolDesignerFullNameCache()
    mToolDesignerFullNameCache.RemoveAll
End Sub
Requires adding 'Microsoft Scripting Runtime' as a VBA reference to compile. In the past I have found this sort of thing useful even when using an Access backend given how often the Access UI will poll for data.
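Wiring it up might look something like this - just a sketch, once the function sits in a standard module, where DesignerID stands in for whatever foreign-key field your DLookup criteria actually use:
'Control Source of the textbox that previously called DLookup directly
'(DesignerID is a hypothetical field on the form's record source):
'=GetToolDesignerFullName("ToolDesignersID=" & [DesignerID])

'Clear the cache whenever fresh data is needed, e.g. when the form loads:
Private Sub Form_Load()
    ResetToolDesignerFullNameCache
End Sub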

Related

Can I open a recordset using application-level features (user-defined functions, form-based parameters) in Access?

I want users to be able to provide a query they made in the GUI, using a combo box, and then load that query into a recordset to do further processing on it. This fails if the query contains a user-defined function or form-based parameter.
My code looks like this:
Private Sub cmbSelectionColumn_AfterUpdate()
    Dim r As DAO.Recordset
    Set r = CurrentDb.OpenRecordset("SELECT DISTINCT " & EscapeSQLIdentifier(Me.cmbSelectionColumn.Value) & _
                                    " FROM " & EscapeSQLIdentifier(Me.cmbSelectionTable.Value))
    Do While Not r.EOF
        'Do stuff
        r.MoveNext
    Loop
End Sub
Here cmbSelectionColumn is a user-selected column, cmbSelectionTable is a user-selected table or query, and EscapeSQLIdentifier is a function that escapes and brackets the field and table names so they are safe. This mostly works fine, but it fails in several cases, for example with pass-through queries, user-defined functions, and form-based parameters.
Is there a way I can create a recordset from any query that works in Access, without having to worry about this?
Yes, there is, but you will have to do some trickery.
Forms support these queries just fine. And forms have a .RecordsetClone property that allows us to retrieve the recordset.
To allow us to retrieve the recordset from code, we're going to create a new blank form, and add a module to it (in fact, any form with a module will do). We'll name it frmBlank.
Then, we can adjust the code to use this form to retrieve the recordset.
Private Sub cmbSelectionColumn_AfterUpdate()
    Dim r As DAO.Recordset
    Dim frm As New Form_frmBlank
    frm.RecordSource = "SELECT DISTINCT " & EscapeSQLIdentifier(Me.cmbSelectionColumn.Value) & _
                       " FROM " & EscapeSQLIdentifier(Me.cmbSelectionTable.Value)
    Set r = frm.RecordsetClone
    Do While Not r.EOF
        'Do stuff
        r.MoveNext
    Loop
End Sub
This allows us to retrieve the recordset. The form will not pop up (since we haven't set .Visible to True), and it will close once the code is done running, since there is no active reference to it. I haven't yet seen any table or query that works in Access but does not work with this approach, and the performance penalty is minor. It does make for somewhat odd code, and for an odd blank form with a blank module that will cause your database to malfunction if it is ever deleted.
The following may present an alternative approach to opening DAO recordsets which reference form-based parameters:
Dim db As DAO.Database
Dim pr As DAO.Parameter
Set db = CurrentDb
With db.CreateQueryDef("", "SELECT DISTINCT " & EscapeSQLIdentifier(Me.cmbSelectionColumn.Value) & _
                           " FROM " & EscapeSQLIdentifier(Me.cmbSelectionTable.Value))
    For Each pr In .Parameters
        pr.Value = Eval(pr.Name)
    Next pr
    With .OpenRecordset
        If Not .EOF Then
            .MoveFirst
            Do Until .EOF
                ' Do stuff
                .MoveNext
            Loop
        End If
        .Close
    End With
End With
Here, references to objects outside the scope of the query (such as form controls) become query parameters whose names match the original references. Evaluating each parameter name with Eval yields the value currently held by the form control, and that result is then assigned to the parameter before the recordset is opened.
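If this comes up often, the same idea could be wrapped into a small helper - a sketch only, covering the form-parameter case (the name OpenAnyRecordset is just for illustration; the blank-form trick above remains the more general option):
'Opens a recordset from any saved query, resolving form-based parameters
'and similar external references via Eval.
Public Function OpenAnyRecordset(QueryName As String) As DAO.Recordset
    Dim qdf As DAO.QueryDef
    Dim pr As DAO.Parameter
    Set qdf = CurrentDb.QueryDefs(QueryName)
    For Each pr In qdf.Parameters
        pr.Value = Eval(pr.Name)
    Next pr
    Set OpenAnyRecordset = qdf.OpenRecordset(dbOpenSnapshot)
End Function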

Access 2010 + VBA: Parameterised append queries – one works, two don’t, but no error messages

This one has had me tearing my hair out for ages but the solution still eludes me. And my last 15 years before retiring were working with Access & VBA, so my pride is hurting even more than my hair!
The project:
A database to catalogue portraits, recording for each one the artist's name, the sitter's name, the date painted (when known), brief details of the content (there may be none), and the portrait's location (if known). In addition to the table for portrait details there are a table for artist information and one for sitter information (plus various lookup & other tables that aren't relevant to this issue).
My client was super-keen to get data entered so she uses the memo field on the artist form (the first one designed) to enter details of portraits painted by that artist – following a strict pattern set by me (and checked before attempting to process). That’s not the issue – my parsing routine correctly identifies the data to be processed for the creation of the new sitter records and portrait records: I know this to be true because my code is liberally scattered with message boxes showing the VBA’s interpretation of the data (vital for debugging!)
The process of parsing + posting is briefly as follows (the memo field is assigned to a variable which is then parsed one character at a time):
The sitter’s name is identified (always the first entry on a new line) and assigned to a variable, then (if relevant/if known) brief details, date painted & location;
once it’s reached the end of the details for the portrait it shows a message box giving that information as interpreted by the routine for me to confirm;
it then appends the sitter’s name to the sitters’ table and returns the ID of the new record. This works. It then appends (or, rather, it should append but doesn’t!) the relevant data, including the sitter ID (and the artist ID from the form) to the portraits table.
It then continues (on that line if there are more portraits for that sitter, otherwise on the next line, until it reaches the end of the memo).
Originally the two append processes were by dynamically-created SQL – building the SQL strings using the appropriate variables – using database.Execute. But when I found that the second append was failing – with no error messages – I spent several hours looking at various tech sites, and one message was coming through strongly – use parameterised queries! OK, hadn’t used these before (not in VBA – of course I’d used them on forms for select & other queries) so I set up a simple test database to mimic the process but without putting my client’s data at risk!
It took a while – and a bit more time online – before I got it right, but yay, it worked with all combinations of missing data. (Incidentally, as you’ll note from the table defs below, the “year painted” is an integer field, which of course doesn’t accept “Nulls”, and I don’t want a zero where there’s no date, so there are two append portrait queries, one omitting posting to the ‘year’ field.)
And now I’ve gone back to the live database – and the query that appends the sitter is working, but the other two aren’t, again with no error message. So it’s over to you, please!
TABLE DEFS (I’m only listing the relevant fields; no fields have Required set to Yes; zero-length strings allowed):
taArtists: arID – autonumber; arNotes – memo; (plus other fields)
taSitters_Sub: ssID – autonumber; ssFullname – text 70; (plus other fields)
xtaPortraits: xrID – autonumber; xrArtistRef – long integer (link to taArtists = arID); xrSitterRef – long integer (link to taSitters_Sub = ssID); xrYearPainted – integer; xrPortraitName – text 25; xrLocationCode – text 20; (plus other fields)
PARAMETERISED QUERIES
qu_app_sitter: INSERT INTO taSitters_Sub ( ssFullname ) SELECT [par1] AS Expr1;
qu_app_portrait: INSERT INTO xtaPortraits ( xrArtistRef, xrSitterRef, xrYearPainted, xrPortraitName, xrLocationCode ) SELECT [par2] AS Expr1, [par3] AS Expr2, [par4] AS Expr3, [par5] AS Expr4, [par6] AS Expr5;
qu_app_portrait_NoYear: INSERT INTO xtaPortraits ( xrArtistRef, xrSitterRef, xrPortraitName, xrLocationCode ) SELECT [par2] AS Expr1, [par3] AS Expr2, [par5] AS Expr4, [par6] AS Expr5;
EXTRACTS OF VBA
(Were I to start again I’d probably feed the values for par2 to par6 to a separate subroutine but while that would reduce the amount of code I’m not sure it’d actually be more efficient! I am, of course, open to expert advice on that! Were my client to start again I’d get her to wait for the system to be complete before entering data – making all this redundant – or get her to enter the data in an Excel spreadsheet & I’d process it from there. One lives, one learns [hopefully!])
1 – declarations, setting database & querydefs:
[only showing relevant Dims]
Private Sub Command77_Click()
    On Error GoTo myError
    Dim myID As Long            'the ID for the painter
    Dim myName As String        'the name of the sitter
    Dim myDesc As String        'any text description **could include part-dates or other digits
    Dim myLoc As String         'the location code
    Dim myDate As Integer       'myNum converted from string if it's 4 digits
    Dim Errline As Integer      'used so error messages get me close to the problem
    '**** Now the database stuff
    Dim db As DAO.Database
    Dim rs As DAO.Recordset
    Dim mySitterID As Long, myNewRows As Integer
    Dim qd1 As DAO.QueryDef, qd2 As DAO.QueryDef, qd3 As DAO.QueryDef
    Dim par1 As DAO.Parameter, par2 As DAO.Parameter, par3 As DAO.Parameter, par4 As DAO.Parameter, par5 As DAO.Parameter, par6 As DAO.Parameter
    '**** Set these once, and turn off in 'Leave'
    Set db = CurrentDb()
    Set qd1 = db.QueryDefs("qu_app_sitter")
    Set qd2 = db.QueryDefs("qu_app_portrait")
    Set qd3 = db.QueryDefs("qu_app_portrait_noyear")
2 showing routine calling the insert queries plus preceding message box
[there are 9 of these, similar but not identical, depending on missing information]
MsgBox "Got name & location only" & vbCrLf & vbCrLf & "Sitter: " & myName & vbCrLf & "Location: " & myLoc, vbInformation, "LINE " & Errline
qd1.Parameters("par1").Value = myName
qd1.Execute
myNewRows = qd1.RecordsAffected
MsgBox myNewRows & " record added", vbInformation, "NEW SITTER"
myNewRows = 0
qd1.Close
Set rs = db.OpenRecordset("SELECT ##IDENTITY AS LastID;")
mySitterID = rs!lastid
qd3.Parameters("par2").Value = myID
qd3.Parameters("par3").Value = mySitterID
qd3.Parameters("par5").Value = myDesc
qd3.Parameters("par6").Value = myLoc
qd3.Execute
myNewRows = qd3.RecordsAffected
MsgBox myNewRows & " record added", vbInformation, "NEW PORTRAIT (year not known)"
myNewRows = 0
qd3.close
Both queries should insert 1 row - the first always does, the second always reports 0 rows inserted (and indeed none is inserted)
I'm painfully aware that this post is much longer than many here but hopefully I've given all the info required.
I did a quick look over your code. I suggest:
Are you getting the correct value of mySitterID?
I'm not 100% sure, but isn't the @@IDENTITY actually the ArtistID?
Meir
OK, thanks partly to a prompt from Meir Rotfleisch, I dug deeper - running the query manually (omitting each field in turn) revealed that one was causing a key violation error. Then typing manually into that field pointed me to a need for a related entry in another table (which table wasn't visible on the relationship diagram!) Removing that link (which will be reinstated once the full system has been built & tested) resolved the problem.
One vital lesson I'd like to pass on - make sure your relationship diagram shows ALL tables!
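(One further note, offered as a suggestion rather than as part of the fix above: DAO's Execute fails silently on things like key violations unless you ask it not to, which is why no error message ever appeared. Passing dbFailOnError would raise a trappable error at the point of failure - a minimal sketch:)
'With dbFailOnError, a failed insert (e.g. a key or relationship violation)
'raises a run-time error instead of silently affecting 0 rows.
qd3.Execute dbFailOnError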

Setting two rowsources in the same procedure is terribly slow - Access 2010

I have two listboxes which are filled via VBA on a click event. The table 'Project' is an ODBC data source with 250 records.
List1.RowSource = "SELECT Name FROM Project WHERE ProjectID = " & ProjectID.Caption & " AND Year = " & ActualYear.Caption & " ORDER BY Name"
List2.RowSource = "SELECT ProjectShare FROM Project WHERE ProjectID = " & ProjectID.Caption & " AND Year = " & ActualYear.Caption & " ORDER BY Name"
So far so good. But when I run this code, it takes up to 30 seconds to complete every time. I thought, okay, it's because of ODBC and so on. But when I run only one of these lines (no matter which), it is fast as lightning (0.1 sec).
How can it be that one query takes 0.1 sec and two queries take 30 sec? Maybe I could add a pause between the two lines? By the way, without ODBC everything works like a charm, no matter how many lines.
You can bind both listboxes to the same recordset by manually creating the recordset. This allows Access to only query the table once instead of twice at the same time, avoiding any locking conflicts, and tends to avoid other problems as well.
This also allows you to use parameters, fixing any errors introduced by string concatenation.
Dim rs As DAO.Recordset
With CurrentDb.CreateQueryDef("", "SELECT Name, ProjectShare FROM Project WHERE ProjectID = p1 AND Year = p2 ORDER BY Name")
    .Parameters(0).Value = ProjectID.Caption
    .Parameters(1).Value = ActualYear.Caption
    Set rs = .OpenRecordset(dbOpenSnapshot) 'Snapshot because it won't be updated
End With
Set List1.Recordset = rs
Set List2.Recordset = rs
Note that I have had errors occur when an object bound to a recordset with parameters was requeried, so you might want to use string concatenation if that's happening.
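That fallback might look something like this - a sketch reusing the original concatenation (the values here are numeric, so no quoting is needed), while still opening the recordset only once:
Dim rs As DAO.Recordset
Dim strSQL As String
strSQL = "SELECT Name, ProjectShare FROM Project " & _
         "WHERE ProjectID = " & ProjectID.Caption & _
         " AND Year = " & ActualYear.Caption & " ORDER BY Name"
Set rs = CurrentDb.OpenRecordset(strSQL, dbOpenSnapshot)
Set List1.Recordset = rs
Set List2.Recordset = rs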

Update query in MS Access 2010 exhausts system resources

For reasons beyond my control we are using Access 2010 to update linked SharePoint lists to keep them synchronized with our CMDB. We obtain reports from the CMDB in CSV format, and link them to Access as well. We then use a combination of Access VBA and predefined queries to add new data, or to update or soft delete existing data. One list in particular is causing problems. Specifically, inserts/soft deletes seem to work, but Access exhausts resources and crashes when running the update query. Pulling up the resource monitor shows that memory usage constantly increases as the application runs, and Access finally fails when ~1.6 GB of RAM has been allocated to it (on a 4 GB machine with a 6 GB swap file, Windows 7 64-bit, but 32-bit Access).
I use two queries in addition to the VBA code. One query retrieves a result set that allows me to determine which row in the SharePoint list is to be updated (if any), while the other identifies which columns from the report update corresponding columns in the SharePoint list, the join condition between the linked report and the corresponding list, and the row in SP to be updated, identified by its composite key. Fairly standard stuff, I think.
We have to use this approach (or one substantially similar) due to the fact that the SharePoint list has associated workflows. We found that if we wrote our SQL to perform standard set-type updates, the updates occurred too quickly, overloading Sharepoint's workflow engine and causing the workflows to fail.
I've tried a number of alternate techniques:
Using a recordset edit/update sequence rather than the query/exec shown below. That consumes memory even more quickly, and spikes the CPU to 26% vs. 12%.
As shown in the VBA code below, I've tried closing and reopening the queries every 100 rows, as well as using transactions. Neither technique results in an improvement.
I've tried disabling then re-enabling and extending Access' SharePoint caching mechanism, with no success.
I've tried using parameterized queries. This technique does not work as we must update a number of memo fields, and query parameters max out at 255 characters (but see the sketch just after this list).
Running a database compact/repair does not release allocated memory.
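A possible variant I have not fully verified: declaring the memo-bound parameter as LONGTEXT in the query's PARAMETERS clause is supposed to lift the 255-character cap, so parameterized queries might still be usable. A rough sketch (SomeMemoColumn and the dummy values are placeholders, not our real schema; CHGTXT and CID are the existing parameter names):
'Hypothetical illustration only - declare the memo parameter as LONGTEXT so it
'is not capped at 255 characters. SomeMemoColumn is a placeholder column name.
Dim qdf As DAO.QueryDef
Set qdf = CurrentDb.CreateQueryDef("", _
    "PARAMETERS [CHGTXT] LONGTEXT, [CID] TEXT(255); " & _
    "UPDATE AppFileSystemFin INNER JOIN RAppAH ON AppFileSystemFin.FSCID = RAppAH.FSCID " & _
    "SET AppFileSystemFin.SomeMemoColumn = [CHGTXT] " & _
    "WHERE RAppAH.FSCID = [CID];")
qdf.Parameters("CHGTXT").Value = String(4000, "x")   'value longer than 255 characters
qdf.Parameters("CID").Value = "ABC123"               'placeholder key value
qdf.Execute dbFailOnError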
This is the VBA code to execute the queries:
Private Sub runUpdt()
    Dim oQdfUpdt As DAO.QueryDef
    Dim oRs As DAO.Recordset
    Dim oWrkSpc As DAO.Workspace
    Dim strmsg As String
    On Error GoTo Handler
    logMsg "Entering method runUpdt in class clsAppFsFin"
    Debug.Print "Entering method runUpdt in class clsAppFsFin", Now()
    Set oRs = CurrentDb.QueryDefs("slctAppFsFinRowsForUpdt").OpenRecordset(dbOpenDynaset, dbReadOnly)
    Set oQdfUpdt = CurrentDb.QueryDefs("updtAppFsFin")
    Set oWrkSpc = DBEngine.Workspaces(0)
    Do While (Not oRs.EOF)
        oWrkSpc.BeginTrans
        If (isUpdated(oRs)) Then
            oQdfUpdt.Parameters("CHGTXT") = "System Change"
            oQdfUpdt.Parameters("CID") = oRs.Fields("RYCID")
            oQdfUpdt.Execute
            ' inserts a row into the flg_is_updt table
            oFlgUpdt.insFlgIsUpdt oRs.Fields("RYAID")
        ElseIf (oRs.Fields("SPCTX") <> "System NoChange") Then
            oQdfUpdt.Parameters("CHGTXT") = "System NoChange"
            oQdfUpdt.Parameters("CID") = oRs.Fields("RYCID")
            oQdfUpdt.Execute
            ' inserts a row into the flg_is_updt table
            oFlgUpdt.insFlgIsUpdt oRs.Fields("RYAID")
        End If
        oWrkSpc.CommitTrans
        If ((oRs.AbsolutePosition Mod 100 = 0) And (oRs.AbsolutePosition > 0)) Then
            strmsg = "Updated " & oRs.AbsolutePosition & " rows. Class: clsAppFsFin, Method: runUpdt."
            Debug.Print strmsg, Now()
            logMsg strmsg
            Dim curFSCID As String
            curFSCID = oRs.Fields("RYCID")
            oRs.Close
            Set oRs = Nothing
            oQdfUpdt.Close
            Set oQdfUpdt = Nothing
            Set oRs = CurrentDb.QueryDefs("slctAppFsFinRowsForUpdt").OpenRecordset
            Set oQdfUpdt = CurrentDb.QueryDefs("updtAppFsFin")
            oRs.FindFirst "RYCID = '" & curFSCID & "'"
        End If
        ' sleep .1 seconds to avoid overloading the upstream workflow
        Sleep SLEEPTIMEINMILLIS
        oRs.MoveNext
    Loop
    strmsg = "Final update count: " & oRs.RecordCount & " rows. Class: clsAppFsFin, Method: runUpdt."
    logMsg strmsg
    Debug.Print strmsg, Now()
    oRs.Close
    oQdfUpdt.Close
    Set oRs = Nothing
    Set oQdfUpdt = Nothing
    Debug.Print "Exiting method runUpdt in class clsAppFsFin", Now()
    logMsg "Exiting method runUpdt in class clsAppFsFin"
    Exit Sub
Handler:
    oWrkSpc.Rollback
    Debug.Print Err.Number, Err.Description
    logError Err.Number, Err.Description
End Sub
Here are the select and update queries executed by the VBA code
Select query:
SELECT APFF.[App ID] AS SPAID,
APFF.Server AS SPHST,
APFF.Directory AS SPDIR,
RAppAH.AppID AS RYAID,
RAppAH.Host AS RYHST,
RAppAH.FSCID AS RYCID
<snip>
FROM (AppCert
INNER JOIN AppFileSystemFin AS APFF
ON AppCert.[App ID] = APFF.[App ID])
LEFT JOIN RAppAH
ON APFF.FSCID = RAppAH.FSCID
WHERE APFF.FSCID = [RAppAH].[FSCID]
AND AppCert.State = "8 - Complete"
AND RAppAH.FSCID IS NOT NULL
AND APFF.[Change In SoR - Text] <> "System Remove"
ORDER BY APFF.ID;
Update query:
UPDATE AppFileSystemFin
INNER JOIN RAppAH
ON AppFileSystemFin.FSCID = RAppAH.FSCID
SET AppFileSystemFin.Server = [RAppAH].[Host],
AppFileSystemFin.Directory = [RAppAH].[Directory],
<snip>
WHERE AppFileSystemFin.ID = [ID];
The issue is now resolved. In the update query shown above, the line:
WHERE AppFileSystemFin.ID = [ID];
does not refer to Sharepoint's system-generated ID column. Instead, it refers to an internally generated key field that we had to use in order to be able to perform SQL join operations between lists.
The query has been updated to use SharePoint's generated ID column instead. This minor update resolves the memory allocation issue and in turn, allows updates to proceed more quickly - now requiring only about a third of the previous runtime to complete execution.

What's the best way to read a very large database table in VB.NET?

I'm developing a VB.NET application to obtain data from any data source (using an ODBC connection string), and because of this I can't use specific .NET connectors like the MySQL .NET connector, and I can't know in advance whether the file/DBMS supports a LIMIT clause. This app will read data from a table in the data source and use the information to build some files. So far there was no problem with some tables because they are less than 3,000,000 records long, but there is a table with 5,000,000+ rows, and when I send the query the connection is lost. I'm working with OdbcDataReader, because I read on the MSDN site that this is the best way to read a table one row at a time, and I only use each record once.
This is an example of my code:
Private Sub ReadData()
    'Assumes Imports System.Data at the top of the file.
    Dim cnn As Odbc.OdbcConnection
    Dim coma As Odbc.OdbcCommand
    Dim reg As Odbc.OdbcDataReader
    Try
        cnn = New Odbc.OdbcConnection("Driver={MySQL ODBC 3.51 Driver}; server=localhost; Database=datos; User=usuario; Password=contrasenia; option=3;")
        cnn.Open()
        coma = New Odbc.OdbcCommand("select * from tabla", cnn)
        reg = coma.ExecuteReader() '<- when this line is executed it fails with the 5,000,000+ row table.
    Catch ex As Exception
        MsgBox("Error: " & ex.Message, MsgBoxStyle.Critical, "Error")
    End Try
    '... Do anything with the data
End Sub
In VBA or VB6 I do something like this:
Private Sub ReadData()
    Dim cnn As Object
    Dim rst As Object
    Set cnn = CreateObject("ADODB.Connection")
    Set rst = CreateObject("ADODB.Recordset")
    cnn.CursorLocation = 3   'adUseClient
    cnn.Open "Driver={MySQL ODBC 3.51 Driver}; server=localhost; Database=datos; User=usuario; Password=contrasenia; option=3;"
    rst.Open "tabla", cnn, , 2
    '... Do anything with the data
End Sub
And this code executes without problems.
Any idea how to retrieve the data in a more efficient way in VB.NET? Or is there any way to do it like ADODB (just indicating the table name rather than an SQL statement)?
Sorry if anything is incomprehensible.
Perhaps try increasing the timeout? On OdbcConnection the ConnectionTimeout property is read-only, so set the timeout on the command instead, before calling ExecuteReader:
coma.CommandTimeout = 50000 ' Number of seconds before the command times out
When I had a similar problem, my solution was to add a LimitQuery function that added keywords to the query to limit the number of results, depending on the provider.
Something like this:
Public Function LimitQuery(ByVal query As String, ByVal RowLimit As Integer) As String
    'm_DbType is a class member holding the detected provider (a custom enum).
    If RowLimit > 0 Then
        Select Case m_DbType
            Case DbType.Oracle
                Return "SELECT * FROM (" & query & ") WHERE ROWNUM < " & CStr(RowLimit + 1)
            Case DbType.SQLServer
                Return Replace(query, "SELECT", "SELECT TOP " & CStr(RowLimit), 1, 1)
            Case DbType.MySQL
                Return query & " LIMIT " & CStr(RowLimit)
            Case Else
                Return query   'unknown provider: leave the query unchanged
        End Select
    Else
        Return query
    End If
End Function
This is just a quick hack; if you want to support any data source, sooner or later you'll need some kind of database abstraction layer.