I am working on migrating an MS Access database over to a newer SQL platform.
But, with all of the users who are currently using it, we're migrating slowly/carefully.
The first step is that we are re-writing the VBA code into C#, which is then deployed in a .dll along with the database.
Now, the VBA code calls into the C# to do the business logic, then the VBA continues to do the displays/UI, while Access still hosts the database.
The problem comes in that, in one place, I have a report that runs right after the business logic in the C#, and apparently MS Access has a cache, which clears every 5 seconds. So the transaction that occurs in the C# code writes to the database, but the VBA code is still reading from the cache. This is causing errors, as the records just added to the database (which the VBA report is trying to report on) don't exist in the cache yet...
I'm guessing that the C# .dll must be getting treated as a "second connection" to the MS Access database, which is what seems to typically cause this error in my searches (Access thinks that one process is writing, and the other is reading).
Since the cache is cleared out every 5 seconds, we could just put the process to sleep, wake it up after 5 seconds, and then run the report, but that's pretty terrible for an end user.
And, making things difficult, the cache only seems to come into play in the deployed version (so, when running from source / in debug mode, the error never happens).
Doing some searches, there seem to be plenty of people saying "just refresh the cache." But the question is: within VBA, how do you refresh the cache?
Any advice would be welcome.
Thanks
I've been fighting the same issue for years, as I write a lot of tools around an old PowerBuilder application that has an Access MDB back end.
The cache does exist, and it is VERY real. When data is inserted on a different connection than the one it is queried on, the cache can be directly observed and measured. It was also documented by Microsoft before they black-holed a bunch of their old articles...
Microsoft Jet has a read-cache that is updated every PageTimeout milliseconds (default is 5000ms = 5 seconds). It also has a lazy-write mechanism that operates on a separate thread to main processing and thus writes changes to disk asynchronously. These two mechanisms help boost performance, but in certain situations that require high concurrency, they may create problems.
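Since that refresh interval is driven by PageTimeout, one thing you could try (untested on my end; "Jet OLEDB:Page Timeout" is a documented Jet 4.0 provider property, but whether the provider honors it in your setup is an assumption, and the database path below is a placeholder) is lowering it on the connection that does the reads:

' Untested sketch: lower Jet's read-cache refresh interval from the
' default 5000ms to 500ms for this connection.
Dim cn As Object
Set cn = CreateObject("ADODB.Connection")
cn.Open "Provider=Microsoft.Jet.OLEDB.4.0;" & _
        "Data Source=C:\path\to\db.mdb;" & _
        "Jet OLEDB:Page Timeout=500"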
I've found a couple of workarounds that are not the best, but somewhat make do until I find something better or can re-write the app with a better back-end database.
The seemingly best answer I've found (and one that may actually work for you, since you say you need VBA) is to use JRO.RefreshCache. I've been trying to figure out how to implement this using C# or VB.net without any luck. Below is a link to a code example where you execute the RefreshCache method on your second connection, the one that needs to pull the data. I have not tested this myself.
https://documentation.help/MSJRO/jrmthrefreshcachex.htm
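Going by that documentation, the VBA usage should look roughly like this (not tested by me; late-bound so no references are needed, and the database path is a placeholder):

' Sketch based on the JRO documentation linked above.
Dim cn As Object   ' ADODB.Connection
Dim je As Object   ' JRO.JetEngine

Set cn = CreateObject("ADODB.Connection")
cn.Open "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\path\to\db.mdb"

Set je = CreateObject("JRO.JetEngine")
je.RefreshCache cn   ' flush the read cache on this connection before querying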
A workaround I've found that will deliver the query results within 500ms to 1000ms of insert time (instead of anywhere between 500 and 5000 ms - or more):
Use System.Data.Odbc instead of OleDb, with connection string: Driver={Microsoft Access Driver (*.mdb, *.accdb)};Dbq=;
If someone knows how to use the JRO.RefreshCache method with OLEDB and C# or VB.net, I'd be forever grateful. I believe the issue is that it's looking for an ADO connection to be passed in, not an OLEDB connection.
I'm not aware of ANYTHING suggesting that some 5-second cache exists. Where did this idea come from?
Furthermore, if you have 5 users, then you're not going to be able to update their caches, are you?
In other words, the issue of some cache for one user is still not going to solve anything or work with multiple users anyway, is it?
The simple matter is, if you load up a form with 100 records, and other users are ALSO working on those 100 rows, then no user will see the other users' changes until you tell Access to re-load the form.
You can do this with a Me.Refresh in the form, and then it will show changes made by other users (or even by your C# code!).
However, that's not really the solution here.
How does nearly EVERY system deal with this issue?
Answer:
You don't. You "design" the software to take the user's work flow into account.
So, in place of loading up a form with 100 rows of data (which you should not do, unless a SUPER DUPER reason exists for doing that)?
You provide a UI in which the user FIRST searches for whatever it is they want to work on.
In other words, say you just booked a user on a tour. Now they call the office back and want to change some details of that tour. But a different tour staff member might pick up the phone. So, now a 2nd user opens the tour?
So, you solve that issue by NOT loading all the tours into that form in the first place.
You provide a search screen, so they can search for the user, find the user, maybe type in an invoice number or whatever.
You display the results in a pick list, and then launch the form to the ONE record (and perhaps detail records from child tables).
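In Access, launching the edit form filtered to that ONE record is a one-liner. The form, list box, and field names here are just examples, not from any particular app:

' open the edit form on the single record picked in the search results
DoCmd.OpenForm "frmTourEdit", acNormal, , "TourID = " & Me.lstResults.Value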
So there's no more a concept of a cache in Access than there is in C#.
However, if you load up a DataTable in C#, and then display that data?
Well, what about the other users on that system? They will not see changes to that data ANY MORE than users of the current Access form will.
So, if you want to update some data in c#? Then fine, but you need/want to do two things:
First, before you call any C# code that may update the current form's record, you need to FORCE a save of that current record BEFORE you call any code, be it VBA code or C# code, that is going to update the current record the user is working on.
You can save the current record in Access in MANY different ways, but the typical approach is:
' single record save - current record
If Me.Dirty Then Me.Dirty = False

' VBA or C# code goes here.

' optional: refresh the current form to reflect changes
Me.Refresh
So, in most cases, it is the "design" of your software that will solve this issue.
For example, in the tour example, or in fact ANY system, the user can't work, can't update, and can't do their job UNLESS they first find/search and have a means to bring up that form + record data in the first place.
So, ANY typical good design will:
Ask the user for that name, invoice number or whatever.
Display the results of the search, and THEN allow the user to pick the record/data to work on. When they are done, they close that form and are RIGHT BACK at the search form, ready to do battle with the next customer or task or phone call or whatever.
So, a search form might look like this:
[screenshot: search form - typing "smi" brings up a pick list of matching names]
The user can further type in, say, part of the first name, and thus narrow the list:
[screenshot: the pick list narrowed down by first name]
So, maybe they type in an invoice number, customer number, booking number or whatever.
You display the results, and then they can select the row or "thing" to work on.
Thus, we click on the row (or the glasses button in the screenshot), and then jump to the ONE record.
So, the user does whatever they have to do with the customer. Now, when done, they close the ONE thing, the ONE main record.
This not only saves the data (so others in the office can now use that booking data), it also puts them RIGHT BACK at the search screen, ready to do battle with the next customer.
So, not only do we have a VERY bandwidth-friendly design (we only pull the one main record into that form), it is also better for work flow.
The Access form's cache thus becomes a non-issue, since we are only dealing with the one record.
And as I pointed out, if the system is multi-user, then you're NOT going to be able to update multiple users' cached data anyway, are you?
Think of ANY system you EVER used from a software point of view.
When you use Google, does it download the WHOLE internet, and then you use Ctrl-F to search megs and megs of data in the browser?
Nope!
You search first, get a list from that search, and THEN pick one!
And while that list is displayed, maybe others on the internet are updating and adding new data - but if all of that were cached in your browser, then it would not work!
And the same goes for a desktop accounting system. You don't load up all accounts and THEN have the user go Ctrl-F to search all the data. You search for the customer or invoice number and PICK ONE to work on.
It does not make sense to load up a form with 1000 customers and then go Ctrl-F to find that customer. The same goes for an instant banking machine. It does not download ALL customers and THEN let you search. It asks you FIRST for what you need. So, be it browser based, desktop based, or JUST ABOUT ANY software you use?
You pretty much eliminate the cache issue, since you're not pre-loading boatloads of data, but asking and letting the user search for the data they need.
So, in regards to the Access form data and cache?
If you are on a form, and call VBA code, or c# code or whatever?
If that code updates the current form, you have NO MORE and NO LESS of an issue when calling VBA code than when calling C# code! If that code updates the current form while the current record is dirty (has pending edits), then you get that message about the form's record having been updated by another user!
So, your cache issue does NOT exist ANY MORE or ANY LESS as an issue in typical Access software.
As a general rule, if you are on a form with pending edits, and say you want to pop up some form to edit related data?
You have to ensure that pending edits are SAVED before you launch a form that can edit the same data, or run code that can/may edit that data.
As a result, ZERO cache issues should exist, and they exist no more and no less when calling SQL or VBA update code from a form than when calling some C# code from that form.
So, save the pending update for that form.
Then run your VBA, SQL, or C# code.
And then do a Me.Refresh to display any changes made by those external routines.
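Put together, a button's click event might look like this (the commented-out line is a hypothetical example of calling a COM-visible C# class; substitute however your .dll is actually exposed):

Private Sub cmdRunLogic_Click()
    ' FORCE a save of the current record before any external code runs
    If Me.Dirty Then Me.Dirty = False
    ' run your VBA, SQL, or C# code here, e.g. (hypothetical COM class):
    ' CreateObject("MyCompany.BizLogic").ProcessBooking Me!BookingID
    ' then pull in whatever those routines changed
    Me.Refresh
End Sub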
There is no documentation, nor ANY article I can find, that suggests some kind of 5-second cache or update; it is an urban myth. And your software challenge here in regards to using C# or VBA, or even SQL Server stored procedures?
They are all the same issue, and I dare say that Access is often used as a front end to SQL Server, and ALL of the SAME issues exist when using SQL Server with MS Access.
What I'm aiming at is getting a list of all currently running scripts, in order to check whether other users are running the same VBA macro at the same time (and if so, then stop the code etc., similar to what the OP of the question below wanted). This would be for a shared workbook (I've learned it's not designed for this type of work, but I need to try it).
https://stackoverflow.com/a/36116091/5947935
I've been trying to make the code in the above answer work in VBA, but it seems it's a VBScript thing, and I would like to avoid that.
I'm not an expert to say the least, so I'm having trouble understanding how to get this to work in Excel VBA. I don't even know if it's possible at all.
I've found this as well: "VBA Getting program names and task ID of running processes", and it works fine, but it only lists the running processes.
I've no idea, however, how to merge the two... or even whether WMI is the correct way to go.
I'd appreciate any sort of help.
I used to create a "locking file" which was just an empty text file with the name of the workbook followed by the username and an extension of .LCK
The first thing my code did on auto-open was look for a locking file, then report back to the user which user had it open, and then cancel the open.
If it didn't find a locking file, then it created one and proceeded as normal. If it found one, but it was for the same username (i.e. that user had the workbook crash on them), it proceeded with the open.
The last thing the code did was delete the file.
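From memory, the pattern was roughly this (an untested reconstruction, not the original code; names and paths are illustrative). It goes in the ThisWorkbook module:

' Rough reconstruction of the locking-file idea.
Private Sub Workbook_Open()
    Dim base As String, myLock As String, found As String
    base = ThisWorkbook.Path & "\" & ThisWorkbook.Name
    myLock = base & "_" & Environ("USERNAME") & ".LCK"
    found = Dir(base & "_*.LCK")                 ' any existing lock file?
    If Len(found) > 0 And InStr(found, Environ("USERNAME")) = 0 Then
        MsgBox "Workbook is locked by: " & found ' filename says which user
        ThisWorkbook.Close SaveChanges:=False    ' cancel the open
    ElseIf Len(Dir(myLock)) = 0 Then
        Dim f As Integer
        f = FreeFile
        Open myLock For Output As #f             ' create our lock file
        Close #f
    End If
End Sub

Private Sub Workbook_BeforeClose(Cancel As Boolean)
    On Error Resume Next                         ' ignore if already gone
    Kill ThisWorkbook.Path & "\" & ThisWorkbook.Name & _
         "_" & Environ("USERNAME") & ".LCK"
End Sub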
No code here, and the theory is not tested yet, but the idea of preventing a different user from executing a macro on a shared workbook requires some thinking.
I would create a hidden worksheet, and use one of its cells to store Environ("USERNAME") when the macro is first started (to indicate who has it running), then clear it when complete: first-in, first-out.
Let's say the named range MUser (macro user) is range A1 in that hidden worksheet.
When the macro runs, it will first check whether MUser is empty; if so, it changes the value to Environ("USERNAME") and saves the file before the next step (here I am not certain the value is updated in other users' sessions).
If MUser is not empty, either abort or retry in a few seconds.
When the macro completes, clear MUser (ClearContents) and save the file to free up the workbook for the next macro run.
The idea is here, but please test, and post your own code for us to troubleshoot; a rough sketch follows below. You may also use Workbook events to "lock" the macro execution this way, or even use this hidden sheet to keep a log record for debugging. Some fail-safe also needs to be implemented (such as a time stamp at macro start, so the lock can be overridden after some minutes).
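As a starting point, here is an untested sketch of that idea (it assumes the workbook-level named range MUser from above):

' Untested sketch of the hidden-sheet lock described above.
Sub RunLockedMacro()
    Dim lockCell As Range
    Set lockCell = ThisWorkbook.Names("MUser").RefersToRange

    If Len(lockCell.Value) > 0 Then
        MsgBox "Macro is in use by " & lockCell.Value & ". Try again shortly."
        Exit Sub
    End If

    lockCell.Value = Environ("USERNAME")   ' claim the lock
    ThisWorkbook.Save                      ' push the claim to other sessions

    On Error GoTo Cleanup
    ' ... the actual macro work goes here ...

Cleanup:
    lockCell.ClearContents                 ' release the lock
    ThisWorkbook.Save
End Sub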
I've been trying to create a macro that is run via a batch file to generate a report and have been having some massively annoying issues with connecting to a SQL Server, probably due to some stupid mistake on my part.
My code to create a connection is as follows:
Dim dbs As Database
Dim qdf As QueryDef
Dim rst As Recordset

' open a DAO connection using the ODBC connection string in 'connect'
Set dbs = OpenDatabase("", False, False, connect)
' create a temporary (unnamed) QueryDef on that connection
Set qdf = dbs.CreateQueryDef("")
where 'connect' is a string containing connection details.
My problem is that Excel (2007) seems to have a problem with the very last line of that block, for whatever reason. I've added a watch on the dbs variable, and it seems perfectly fine, so then why does it not like qdf?
The even stranger thing is that, if I put a 'Stop' command at the very beginning of the macro, start execution from the batch file, and then continue manually upon reaching the Stop, it works perfectly and does exactly what I'd like it to.
I've tried looking at DBEngine.Errors, and that tells me that I have the following problems:
General error: Invalid window handle
Connection not open
Again, this makes no sense to me, especially given that it works under manual execution, and Google hasn't yielded any answers.
P.S. If this is unclear, please also take a look at this question, which seems to be asking the same thing, but has no answer.
So, using the scientific method, a co-worker and I were able to figure out what the issue was, though I have no idea how it makes sense or why it would even be an issue in the first place. The layout of the macro was, in a rough sense, as follows:
1. Test database connection and close it
2. Create new workbook
3. Connect to database again and populate workbook from database calls
For some reason, however, the problem ended up being that VBA didn't like the fact that we were doing connection stuff both before and after creating the workbook. Changing the program flow to:
1. Create new workbook
2. Test database connection and close it
3. Connect to database again and populate workbook from database calls
did the trick. Again, I have no clue why this solved our problem, or if changing the order of this sequence changed something more significant within the code, but it works now! If anyone has any ideas as to why this might be, feel free to comment.
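For reference, a sketch of the order that worked (using the declarations from the question; 'connect' is still the ODBC connection string, and the populate step is elided):

Dim wb As Workbook
Dim dbs As Database

Set wb = Workbooks.Add                               ' 1. create the workbook FIRST

Set dbs = OpenDatabase("", False, False, connect)    ' 2. test the connection...
dbs.Close                                            '    ...and close it

Set dbs = OpenDatabase("", False, False, connect)    ' 3. reconnect, then populate wb
' ... populate wb from database calls ...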
I'm trying to load data from my database into an Excel file based on a standard template. The package is ready and it's running, throwing a couple of validation warnings stating that truncation may occur, because my template has fields of a slightly smaller size than the DB columns I've matched them to.
However, no data is getting populated to my Excel sheet.
No errors are reported, and when I click Preview for my OLE DB source, it shows me rows of results. None of these are getting populated into my Excel sheet, though.
You should first make sure that you have data coming through the pipeline. Double-click the arrow connecting your Source task to your Destination task (I'm assuming you don't have any steps between) and you'll open the Data Flow Path Editor. Click on Data Viewer, then Add, and click OK. That will allow you to see what is moving through the pipeline.
Something to consider with Excel is that it prefers Unicode data types to non-Unicode. Chances are you have a database collation that is non-Unicode, so you might have to convert the values in a Data Conversion task.
ALSO, you may need to force the package to execute in the 32-bit runtime. Visual Studio develops in a 32-bit environment, so the drivers you have visibility to are 32-bit. If there is no 64-bit equivalent, it will break when you try to run the package. Right-click on your project, click Properties, and under the Debug menu change the Run64BitRuntime setting to FALSE.
You don't provide much information. Add a Data Viewer between your source and your Excel destination to see if data is passing through. To do it, just double-click the data flow path, select Data Viewer, and then add a grid.
Run your app. If you see data, provide more details so we can help you.
Couple of questions that may lead to an answer:
Have you checked that data is actually passed through the SSIS package at run time?
Have you double checked your mapping?
Try converting within the package so you don't have the truncation issue
If you add some more details about what you're running, I may be able to give a better answer.
EDIT: Considering what you wrote in your comment, I'd definitely try the third option. Let us know if this doesn't solve the problem.
Just as an assist for anyone else running into this - I had a similar issue and beat my head against the wall for a long time before I found out what was going on. My export WAS writing data to the file, but because I was using a template file as the destination, and that template file had previous data in it that had since been deleted, the process was appending the data BELOW the previously used rows. So I was writing out three lines of data, for example, but the data did not start until row 344!
The solution was to select the entire spreadsheet in my template file and delete every bit of it, so that I had a completely clean sheet to begin with. I then added my header lines to the clean sheet and saved it. Then I ran the data flow task and... ta-da! A perfect export!
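If you'd rather script that one-time cleanup than do it by hand, something like this should be equivalent (the path, sheet name, and header values are placeholders, not from my setup):

' One-time template cleanup: wipe the sheet so Excel forgets the old used range.
Dim wb As Workbook
Set wb = Workbooks.Open("C:\path\to\template.xls")
wb.Worksheets("Sheet1").Cells.Delete          ' removes data AND the stale row memory
wb.Worksheets("Sheet1").Range("A1:C1").Value = Array("Col1", "Col2", "Col3") ' headers
wb.Save
wb.Close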
Hopefully this will help some poor soul who runs into this same issue in the future!
I have created a database/app where a report is created when a particular button is clicked. Just now, two people managed to hit the button at exactly the same time, which caused all sorts of not-good.
Is there a way to make a button invisible across instances once it's clicked by one person? Or some way to lock the database so nothing can be done until the person who clicked first is done?
I have a solution (basically, a global check variable that stops the report creation) but now I want to know if either of the other two options can be done.
It would really help to know more about your architecture here. What database? What language have you written your application in? Concurrent reading is usually an important and basic feature of most multi-user databases.
Seconding Daniel Cook's general notion, and maybe explicating a bit: don't have the button run the report directly. Have it run a little subroutine that first checks a special-purpose table in which you represent report "runs" as records with a start date-time and an end date-time. If there is a record sitting in the table with no (null) end date-time, then the report must still be running; therefore, do NOT begin the report, and turn off the button instead. Otherwise, insert into that same table and then start running the report. Add to this a periodic, not-too-frequent callback on that button to perform the same check, and you've got something that comes close. It isn't "realtime", but it should work in most architectures (not knowing anything about your session management capabilities).
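A sketch of that check in Access VBA (the table tblReportRuns, its fields, and the button name are my own invention, not from the answer):

Private Sub btnReport_Click()
    Dim db As DAO.Database
    Set db = CurrentDb

    ' a record with no end date-time means the report is still running
    If DCount("*", "tblReportRuns", "EndTime Is Null") > 0 Then
        MsgBox "The report is already running - please try again shortly."
        Exit Sub
    End If

    db.Execute "INSERT INTO tblReportRuns (StartTime) VALUES (Now())", dbFailOnError
    ' ... run the report here ...
    db.Execute "UPDATE tblReportRuns SET EndTime = Now() WHERE EndTime Is Null", dbFailOnError
End Sub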
Here's what I did:
If DLookup("PayLock", "table", "pkID=1") Then 'it's locked - exit
MsgBox "Someone else has already started the pay process.", vbOKOnly
Exit Sub
Else
blah blah blah......
The "PayLock" field in the table holds the check variable. After "Else" comes the actual code to run when the button is clicked.
Just FYI, since they were asked:
it is a split database
there are multiple users
yes, the report just reads data and exports it into an Excel spreadsheet
It looks like this is the only solution, which works, but seems inelegant. I keep discovering that the way I get around my lack of knowledge is the actual way to do it...