I am working on migrating a MS Access Database over to a newer SQL platform.
But, with all of the users who are currently using it, we're migrating slowly/carefully.
The first step is that we are re-writing the VBA code into C#, which is then deployed in a .dll along with the database.
Now, the VBA code calls into the C# to do the business logic, then the VBA continues to do the displays/UI, while Access still hosts the database.
The problem comes in that I have a report that runs right after the C# business logic in one place, and apparently MS Access has a cache which clears every 5 seconds. So, the transaction in the C# code writes to the database, but the VBA code is still using the cache. This is causing errors, as the records added to the database (which the VBA report is trying to report on) don't exist in the cache yet...
I'm guessing that the C# .dll must be getting treated as a "second connection" to the MS Access database, which is what seems to typically cause this error in my searches (it thinks that one process is writing and the other is reading).
Since the cache is cleared out every 5 seconds, we can just put the process to sleep, and wake it up after 5 seconds, and then run the report, but that's pretty terrible for an end user.
And, making things difficult, the cache seems like it only gets used in the deployed version (so, when running from source / in debug mode, the error never happens).
Doing some searches, there seems to be plenty of people who have said "just refresh the cache." But, the question is: within VBA, how do you refresh the cache?
Any advice would be welcome.
Thanks
I've been fighting the same issue for years as I write a lot of tools around an old Powerbuilder application that has an Access MDB back end.
The cache does exist and it is VERY real. When data is inserted on a different connection than it is queried on, the cache can be directly observed and measured. It was also documented by Microsoft before they blackholed a bunch of their old articles...
Microsoft Jet has a read-cache that is updated every PageTimeout milliseconds (default is 5000ms = 5 seconds). It also has a lazy-write mechanism that operates on a separate thread to main processing and thus writes changes to disk asynchronously. These two mechanisms help boost performance, but in certain situations that require high concurrency, they may create problems.
I've found a couple of workarounds that are not the best, but somewhat make do until I find something better or can re-write the app with a better back-end database.
The seemingly best answer I've found (that may actually work for you since you say you need VBA) is to use JRO.RefreshCache. I've been trying to figure out how to implement this using C# or VB.net without any luck. Below is a link to a code example where you execute the RefreshCache method on your 2nd connection that needs to pull the data. I have not tested this myself.
https://documentation.help/MSJRO/jrmthrefreshcachex.htm
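For what it's worth, a minimal VBA sketch of that call (untested, per the above; it assumes JRO is available on the machine and uses the ADO connection Access already holds on the current database):

' Force Jet to refresh its read cache before running the report.
Dim je As Object
Set je = CreateObject("JRO.JetEngine")
' RefreshCache expects an ADO connection; CurrentProject.Connection is the
' ADO connection Access itself holds on the current database.
je.RefreshCache CurrentProject.Connection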
A workaround I've found that will deliver the query results within 500ms to 1000ms of insert time (instead of anywhere between 500 and 5000 ms - or more):
Use System.Data.ODBC instead of OleDB, with connection string: Driver={Microsoft Access Driver (*.mdb, *.accdb)};Dbq=;
If someone knows how to use the JRO.RefreshCache method with OLEDB and C# or VB.net, I'd be forever grateful. I believe the issue is it's looking for an ADO connection to be passed in, not an OLEDB connection.
I'm not aware of ANYTHING suggesting that some 5 second cache exists. Where did this idea come from?
Furthermore, if you have 5 users, then you're not going to be able to update their cache, are you?
In other words, the issue of some cache for one user is still not going to solve or work for multiple users anyway, is it?
The simple matter is, if you load up a form with 100 records, and other users are ALSO working on those 100 rows, then users will not see each other's changes until such time as you tell Access to re-load the form.
You can do this with a Me.Refresh in the form, and then it will show changes made by other users (or even your C# code!!!).
However, that's not really the solution here.
How does nearly EVERY system deal with this issue?
Answer:
You don't, you "design" the software to take the user work flow into account.
So, in place of loading up a form with 100 rows of data (which you should not, unless a SUPER DUPER reason exists for doing that)?
Then you provide a UI in which the user FIRST searches for whatever it is they want to work on.
In other words, say you just booked a user on a tour. Now, they call the office back, and want to change some details of that tour. But a different staff member might pick up the phone. So, now a 2nd user opens the tour?
So, you solve that issue by NOT loading all the tours into that form in the first place.
You provide a search screen, so they can search for the user, find the user, maybe type in an invoice number or whatever.
You display the results in a pick list, and then launch the form to the ONE record (and perhaps detail records from child tables).
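As a rough sketch (the form, control, and field names below are made up for illustration), the pick list's double-click event might open the edit form to just that one record:

Private Sub lstResults_DblClick(Cancel As Integer)
    ' Open the booking form filtered down to the ONE selected record.
    DoCmd.OpenForm "frmBooking", acNormal, , "BookingID = " & Me.lstResults
End Sub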
So there's no more a concept of a cache in Access than there is in C#.
However, if you load up a DataTable in C#, and then display that data?
Well, what about the other users on that system? They will not see changes to that data ANY MORE than the current Access form does.
So, if you want to update some data in C#? Then fine, but you need/want to do two things:
First, before you call any C# code that may update the current form record? You need to FORCE a data save of that current record BEFORE you call any code, be it VBA code or C# code that is going to update the current record the user is working on.
You can save the current record in Access in MANY different ways, but the typical approach is:
' single record save - current record
If Me.Dirty Then Me.Dirty = False
' VBA or C# code goes here.
' optional: refresh the current form to reflect changes
Me.Refresh
So, in most cases, it is the "design" of your software that will solve this issue.
For example, in the tour example, or in fact ANY system, the user can't work, can't update, and can't do their job UNLESS they first find/search and have a means to bring up that form + record data in the first place.
So, ANY typical good design will:
Ask the user for that name, invoice number or whatever.
Display the results of the search, and THEN allow the user to pick the record/data to work on. When they are done, they close that form and are RIGHT BACK to the search form to do battle with the next customer or task or phone call or whatever.
So, a search form might look like this:
In the above, I typed in smi, and it displayed a pick list.
The user can further type in say part of the first name, and thus now get this:
So, maybe they type in a invoice number, customer number, booking number or whatever.
So, you display the results, and then they can select the row or "thing" to work on.
Thus, we click on the row (or the glasses button above), and then jump to the ONE record.
So, the user does whatever they have to do with the customer. Now, when done, they close the ONE thing, the ONE main record.
This not only saves the data (so others in the office can now use that booking data), but it also means they are NOW right back at the search screen, ready to do battle with the next customer.
So, not only does this mean we have a VERY bandwidth friendly design (we only pull the one main record into that form), but it is also better for work flow.
The Access form's cache thus becomes a non issue, since we are only dealing with the one record.
And as I pointed out, if the system is multi-user, then you're NOT going to be able to update and deal with multiple users' cached data anyway, are you?
Think of ANY system you EVER used from a software point of view.
When you use google, does it download the WHOLE internet, and then you use ctrl-f to search megs and megs of data in the browser?
Nope!
You search first, get a list from that search, and THEN pick one!!
And when that list is displayed, maybe others on the internet are updating and adding new data - but if that was cached in your browser, then it would not work!!!
And same goes for a desktop accounting system. You don't load up all accounts, and THEN have the user go ctrl-f to search all the data. You search for the customer, invoice number and PICK ONE to work on.
And it does not make sense to load up a form with 1000 customers, and then go ctrl-f to find that customer. Same goes for an instant banking machine. It does not download ALL customers and THEN let you search. It asks you FIRST to get what you need. So, be it browser based, desktop based, or JUST ABOUT ANY software you use?
You pretty much eliminate the cache issue, since you are not pre-loading boatloads of data, but asking and letting the user search for the data they need.
So, in regards to the Access form data and cache?
If you are on a form, and call VBA code, or c# code or whatever?
If that code updates the current form, you have NO MORE OR LESS of an issue when calling VBA code or C# code!!!! If that code updates the current form, and the record is dirty (has pending edits), then you get that message about the current form's record having been updated by another user!!!
So, your cache issue does NOT IN ANY WAY exist MORE or LESS as an issue in typical Access software.
As a general rule, if you are on a form with pending edits, and say want to pop up some form to edit related data?
You have to ensure that pending edits are SAVED before you launch a form that can edit the same data, or run code that can/may edit that data.
As a result, ZERO cache issues should exist, and they no more or no less exist when calling SQL or VBA update code in a form than when calling some C# code from that form.
So, write out the pending update for that form.
Then run your VBA, SQL, or c# code.
And then do a me.Refresh to display any changes made by those external routines.
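Putting those three steps together, a minimal sketch might look like this (the ProgID, method, and field names for the C# call are hypothetical - they assume your .dll is exposed to VBA as a COM object):

' 1) Force a save of the current record so the external code sees it.
If Me.Dirty Then Me.Dirty = False

' 2) Run the external business logic (VBA, SQL, or the C# .dll).
Dim biz As Object
Set biz = CreateObject("MyCompany.BusinessLogic")   ' hypothetical COM ProgID
biz.ProcessBooking Me!BookingID                     ' hypothetical method/field

' 3) Re-display any changes those routines made.
Me.Refresh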
There is no documentation, nor ANY article I can find, that suggests some kind of 5 second cache or update - it is an urban myth. And your software challenge here, in regards to using C# or VBA, or even SQL server stored procedures?
They are all the same issue, and I dare say that often access is used as a front end to SQL server, and ALL OF the SAME issues exist when using SQL server with ms-access.
So, record-locking in Access is pretty awful. I can't use the built-in record locking because it locks a "page" of records instead of just the individual records (I've tried changing the settings for using record-level locking, but it's still locking a page instead of just one record), but even if I could get that working, it wouldn't solve my issue because the record doesn't lock until the user starts to make changes in the form.
The issue is, when two people open the same record, they can start making changes and both save (thus overwriting the earlier change). To make matters worse, there are listboxes on the form that link to other tables (keyed on an ID) and the changes they make to those tables are then overwritten by any change that comes after if they both opened the same record.
Long story short, I need to make sure it's impossible for two people to even open the same record at the same time (regardless of whether or not they've made any edits to it yet).
To do this, I added a field to the table which indicates if a record has been locked by a user. When they open a form, it sets their name in the field and other users who try to open that record get a notification that it's already locked. The problem is, this "lock" isn't instantaneous. It takes a few seconds for other users to "detect" that the record is locked, so if two people try to open the same record at roughly the same time, it will allow them both to open it. I've applied a transaction to the UPDATE statement that sets the lock, but it still leaves a short window wherein the lock doesn't "take" and two people can open the same record.
So, is there a way to make an UPDATE instantaneous (so all other users immediately see its results), or better yet, a robust and comprehensive way to lock records in an Access multi-user environment?
It's not clear why you are only getting “page” locking.
If you turn on row locking in file->options, then you ALSO need to set the particular form to lock the current record. So just turning on record locking will not help you. That setting ONLY sets the default for new forms - it is not a system wide setting.
If you correctly turn on locking for a form, then if two users are viewing the same record and one user starts to edit the record, then all others CANNOT edit the record. Any other user attempting to edit a record will see a “lock” icon in the record selector bar (assuming record selector is turned on for the given form). They also will receive a "beep" if they try to type into any editable control on the given form.
And when they try to edit, they will see a visible "lock" icon on the selector bar like this:
A few things:
If two users are able to edit a record, then you have not turned on locking for that given form. This feature MUST be set on a form-by-form basis. Changing the setting in file->options->client settings ONLY SETS THE DEFAULT for NEW forms you create! So the setting ONLY applies as the default for new forms – it does NOT change existing forms.
So setting record locking is ONLY a form-by-form setting.
So you ALWAYS MUST set each form you want locking to the current edited record. You set this in form design, in the data tab of the properties sheet like this:
And also keep in mind that the setting of record level locking (a different setting and feature) is an Access client setting and does NOT travel with the given application.
So since you state that two users can edit the same record, then CLEARLY you NEVER turned on record locking for that given form. The systemwide “default” record locking ONLY sets the above form default (so existing forms you have are NOT changed).
Next up:
The setting of [x] Open database by using record-level locking is an Access client setting and NOT saved with the application. So this is an Access-wide setting, not an application setting, nor one that travels with the application.
So you have to set this on each client workstation, or you have to set this in your start-up code.
If you can’t go around and change each workstation to change this setting (or you are using the Access runtime), then you can use this VBA in your start-up code to set this feature:
Application.SetOption "Use Row Level Locking", True
Note that the setting does NOT take effect until you exit the application, but that’s really a “non” issue; it just means that the first time you run this code, some users might well be in page locking mode, and others in row locking mode. Most of the time this causes little issue.
However the next time any user launches the application then they will be in row locking mode.
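A small sketch of how that start-up code might look (the function name is arbitrary; you would call it from the AutoExec macro, and it only toggles the option when needed):

Public Function SetRowLocking()
    ' Takes effect the NEXT time the application is opened.
    If Application.GetOption("Use Row Level Locking") = False Then
        Application.SetOption "Use Row Level Locking", True
    End If
End Function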
I have in the past also written custom locking code, and can outline how to make this work well. But from what you posted so far, you never turned on locking, nor had locking working correctly, for any of the forms you have now anyway.
OK, I finally figured out all of the issues contributing to this and worked out a solution.
The problem is multi-faceted so I'll cover the issues separately:
First issue: My custom locks weren't instantaneous. Even though I was using a transaction, there were several seconds after a lock was placed where users could still access the same record at the same time. I was using CurrentDb.Execute to UPDATE the record and Workspaces(0).BeginTrans for the transaction. For some reason (despite Microsoft's assurances to the contrary from here: https://msdn.microsoft.com/en-us/library/office/ff197654.aspx) the issue was that the transaction wasn't working when using the Workspaces object. When I switched to DBEngine.BeginTrans, the lock was instantaneous and solved my immediate problem.
The irony is that I almost always use DBEngine for my transactions but went with Workspaces this time for no reason, so that was a bad move obviously.
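For anyone hitting the same thing, a rough sketch of the working version of that custom lock (the table, field, and variable names are placeholders, not the actual schema):

Dim strUser As String
Dim lngID As Long
strUser = Environ("USERNAME")   ' whoever is claiming the record
lngID = Me!ID                   ' the record being opened

' Claim the record inside a DBEngine transaction so the lock lands immediately.
DBEngine.BeginTrans
CurrentDb.Execute "UPDATE tblBookings SET LockedBy = '" & strUser & "' " & _
    "WHERE ID = " & lngID & " AND LockedBy IS NULL", dbFailOnError
DBEngine.CommitTrans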
Second issue: The reason I had to use custom locking in the first place was because record-level locking wasn't working as expected (despite being properly configured). It was still using page-level locking. This was due to a performance trick I was using from here: https://msdn.microsoft.com/en-us/library/dd942824%28v=office.12%29.aspx?f=255&MSPPError=-2147217396
The trick involves opening a connection to the database where your linked tables are contained, which speeds up linked table operations. The problem is that the OpenDatabase method is NOT compatible with record-level locking so it opens the db using page-level locking, and since the first user to open a database determines its lock level (as explained here: https://msdn.microsoft.com/en-us/library/aa189633(v=office.10).aspx), all subsequent connections were forced to page-level.
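For context, the persistent-connection trick in question looks roughly like this (the UNC path is a placeholder), which is why it silently dictated the lock level for everyone who came after:

' Module-level variable so the connection stays open for the life of the app.
Private dbBackEnd As DAO.Database

Public Sub OpenPersistentConnection()
    ' Speeds up linked-table operations, but per the articles above,
    ' OpenDatabase is not compatible with record-level locking, so the
    ' back end ends up opened with page-level locks for every later user.
    Set dbBackEnd = DBEngine.OpenDatabase("\\server\share\BackEnd.mdb")
End Sub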
Third issue: My problem is that my forms are not just simple bound forms to a single table. They open a single record (not allowing the user to navigate) and provide several functions which allow the user to make modifications which affect other records in other tables that are related to the record they're editing (through comboboxes and pop-up forms and what not). As a result, I can't allow two people to open the same record at the same time as it leaves way too many opportunities for users to walk over each other's changes. So even if I remove the OpenDatabase performance trick, I'd still have to force the Form to be Dirty as soon as they open it so the record locks immediately and no one else can open it. I don't know if this would be as instantaneous as my custom locking and haven't yet tested that aspect.
In any event, I need a record to be locked the instant a user opens it and for now I've decided to keep using my custom locking (with the fix for the transaction). If something else comes to light that makes that less than ideal, I can try removing the OpenDatabase trick and switching to Access's built-in locking and force an immediate lock on every record when it is opened.
You could use the method described here:
Handle concurrent update conflicts in Access silently
to handle your lock field.
Since Access doesn't make locking records easy, I'm wondering whether adding a table with locked-record entries would solve the problem, even though it would be the "duct tape, soup can and coat hanger" solution:
You create a "Locked_Record" table with 2 fields: a) the ID of the record being updated and b) the user name of the person updating that record. That table controls exactly who owns, and therefore can edit, which record.
Your form would have a search field, and when the search term is entered and "Enter" pressed, the form would look for the record in the data and in the Locked_Record table. If found in the Locked_Record table, the user gets an error saying "Record in use already" and a display of who owns the record. If not found in the data, the appropriate message is displayed. If found in the data and not found in the Locked_Record table, a Locked_Record entry is created and the user gets the data displayed in the form. At this point nobody else can edit that record.
When the user is done updating, either the user would need to press a button saying "Done updating" or the form would have to be closed. Either way, the Locked_Record entry would be deleted so others could use that record. If the record owner doesn't close the form or doesn't press the button, then that is a training issue.
This method could be used for multiple entities such as Customers, Employees, Departments, etc. You would just have to ensure your application and DB are set up so any sub-forms which might lock other tables ONLY affect that record's entries in the other tables.
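A rough sketch of what that check might look like in the search form's code (the table, field, form, and control names are all invented for illustration):

Dim lngRecordID As Long
lngRecordID = Me.lstResults   ' the record the user picked

Dim rs As DAO.Recordset
Set rs = CurrentDb.OpenRecordset( _
    "SELECT UserName FROM Locked_Record WHERE RecordID = " & lngRecordID)

If Not rs.EOF Then
    MsgBox "Record in use already by " & rs!UserName
Else
    ' Claim the record, then open the edit form on it.
    CurrentDb.Execute "INSERT INTO Locked_Record (RecordID, UserName) " & _
        "VALUES (" & lngRecordID & ", '" & Environ("USERNAME") & "')", dbFailOnError
    DoCmd.OpenForm "frmEdit", , , "ID = " & lngRecordID
End If
rs.Close

' In frmEdit's Close event (or the "Done updating" button): release the lock.
' CurrentDb.Execute "DELETE FROM Locked_Record WHERE RecordID = " & Me!ID, dbFailOnError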
I know this is a bit old but the information here inspired me to use the following. Basically, Me.txtApplication is a text box on the bound form. The form is bound to the table and is set to lock the edited record in the property sheet. This code won't do anything other than trigger that editing lock and promptly undo the change. If another user tries to load the same record, it will attempt the same edit, trigger the error, and move to the next record or start a new record without the user being any the wiser.
' Runs when the form loads a record (e.g. in the form's Current event).
' Lock the current record with an edit-level lock by making and immediately
' removing a trivial edit to a bound field.
' If the record is already locked, move to the next record.
Private Sub Form_Current()
    On Error Resume Next
    Me.txtApplication = Me.txtApplication & "-%$^$^$$##$"
    Me.txtApplication = Replace(Me.txtApplication, "-%$^$^$$##$", "")
    If Err.Number = -2147352567 Then
        If Me.CurrentRecord < Me.Recordset.RecordCount Then
            DoCmd.GoToRecord , , acNext
        Else
            MsgBox "No available records.", vbOKOnly, "No Records"
            DoCmd.GoToRecord , , acNewRec
            ' [If the condition is not true, then we are on the last record,
            '  so don't go to the next one]
        End If
    End If
End Sub
I have a table which has three fields with a few records. If a user is going to edit a record from the table, other users should not be allowed to edit that record at the same time. What steps can I take to make this happen?
A lot of people from a desktop application background will wonder how this is done in a web application.
Locked record flag
One approach in the desktop world is to have a Boolean column on the row that indicates that it is being edited, and by whom. You could certainly do this with a web app, but it is a very bad approach because if a user visits the Edit page, placing the record into a locked state, then leaves the page, it will forever be in a locked state. You have no definitive way to tell that the user doesn't still have the edit page open.
Time sensitive lock
The airline reservation approach is a variation on the above, but you would also have a LockedUntilUtc column, a datetime indicating how long the record is locked for. Let's say Bob visits a page for a record; when serving the page from the GET action you also set the locked flag, and set LockedUntilUtc to 10 minutes in the future. 5 minutes later Sarah visits the page but gets a "currently locked" error because you checked the LockedUntilUtc and it is currently in the future. Another 6 minutes elapses (a total of 11 minutes since locking) and someone visits the page; the LockedUntilUtc is now in the past, so you give the lock to the new user.
This seems like a reasonable compromise, but it is rife with problems sure to frustrate users. First, there is no easy way to queue up users who need access to edit the record. Sarah could try 10 times, and then just as it passes 10 minutes, Jimmy visits the page, and because he was the first person after the lock expired, he grabs the next lock without Sarah getting a chance. Sarah calls your help desk and says she waited 10 minutes for the lock to expire, and it's now been 15 minutes and she still can't get to the page. Your helpdesk probably doubts she really waited a full 10 minutes, and back and forth ensues.
You also must implement a client side timer/display for whoever currently has the lock so they know how much time they have left before it can expire.
Optimistic concurrency
This is the right approach in most cases. You don't actually lock the record in any way at all. Instead, many users can visit the edit page. When they save an edit, the form includes both the original values and the new edited values. The server will compare the original values from the form, with the current values in the database, to see if there was an interim edit.
The original values are from some point in the past (when Bob initially visited the edit page). The current values are from right now. Between the past and now, if Sarah also visited the edit page, and successfully saved changes to the database values, then Bob's original values will be different from the current values in the database. Thus when Bob attempts to save his changes, the server will see that his original values are different than current values in the DB, and throw an error. You will need to decide how you handle this situation. Usually you let the user know that someone else has edited the page since they did, and refresh the page, and they lose their edits. Entity Framework supports optimistic concurrency.
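Framework details aside, the core check boils down to a single guarded UPDATE. A rough sketch in the VBA/DAO terms used elsewhere in this thread (table, field, and variable names are placeholders):

' Only succeed if the row still holds the value the user originally read.
Dim db As DAO.Database
Set db = CurrentDb
db.Execute "UPDATE Customers SET Phone = '" & strNewPhone & "' " & _
    "WHERE ID = " & lngID & " AND Phone = '" & strOriginalPhone & "'", dbFailOnError
If db.RecordsAffected = 0 Then
    MsgBox "Someone else edited this record since you opened it - please reload."
End If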
Ajax'ified Optimistic Concurrency
You can also have the client occasionally ping the server with the original values so the server can check whether your page is stale (i.e. another user changed something) and pop up a message. This improves the user's experience by giving them earlier notice that another user has edited the page. Thus they don't get too far along in making edits which they are going to lose anyway. They can also take note of / copy / paste their edits out of the browser so they can refresh the page and have a reference of what they changed.
There is a Timestamp column type in SQL Server which can work in tandem with Entity Framework to lower the overhead involved in checking for changes, such that you don't need to keep the entire record of original values in each client and pass them back and forth: http://www.remondo.net/entity-framework-concurrency-checking-with-timestamp/
Granular edits
One approach we use a lot is to ajax'ify every field so that edits to a single field are committed immediately. This is accomplished using a jQuery library called x-editable. The user edits a single field, confirms the edit, and that value is sent to the server. You could combine this with optimistic concurrency if you wanted to check the entire record for changes, or just the single field. If changes are detected, then you reject the edit and refresh the page. This can be a much friendlier experience for the user, primarily because the user gets the "another user edited the page" error instantly when editing a single field. This prevents them from wasting a lot of time editing a large number of fields, only to find their edit was rejected and they have to redo all of their edits again. Instead, they edit a single field, get the error, the page refreshes, and they only have to repeat that one field edit and continue from there.
http://vitalets.github.io/x-editable/demo-bs3.html
I have an excel file on shared location where multiple users (4 in this case) are accessing the file at the same time.
This file has a “Master data” tab where all the base data is there and then there are 4 identical tabs (one for each user).
Each user tab has a set of filters using which the user will be able to extract relevant data based on the filters selected and can add or edit the rows. Once the user is done editing/adding rows, user will submit the data which will get updated/appended in the master data tab.
Users can select same or different options in the filters. I am facing errors when multiple users click on the submit button (macro) at the same time.
How can I resolve this?
Like some comments say, Excel is not designed for this...
But if you want to use Excel, I would recommend something like this:
Every time someone writes to the master data, you have to "lock" the master data tab. Just put a boolean in a cell, set it to true while you are writing and back to false as soon as you have finished altering the master data tab.
Now, if someone wants to change values in the master data tab at the same time, check if the boolean is set to true. If yes, then you have to wait, if not, you can write the data.
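A minimal sketch of that lock cell, assuming the flag lives in A1 of a (possibly hidden) "Control" sheet - all names here are made up:

Sub SubmitToMaster()
    Dim flag As Range
    Set flag = ThisWorkbook.Worksheets("Control").Range("A1")

    ' Wait until nobody else is writing to the master data tab.
    Do While flag.Value = True
        Application.Wait Now + TimeValue("0:00:01")
    Loop

    flag.Value = True    ' claim the lock
    ' ... copy/append this user's rows to the "Master data" tab here ...
    flag.Value = False   ' release the lock
End Sub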
Adding to Manuel Allenspach's response, my suggestion is to create a queue.
The queue should have one spot for the user currently processing and other spots for users waiting.
Then, before running the code, you should include a check to make sure no two users have their macros updating the database at the same time.
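A rough sketch of that queue idea, using a "Queue" sheet where column A holds waiting user names and row 1 is the user allowed to write (all names are made up):

Sub EnterQueueAndSubmit()
    Dim ws As Worksheet
    Set ws = ThisWorkbook.Worksheets("Queue")

    ' Take a spot at the end of the line.
    Dim myRow As Long
    If ws.Cells(1, 1).Value = "" Then
        myRow = 1
    Else
        myRow = ws.Cells(ws.Rows.Count, 1).End(xlUp).Row + 1
    End If
    ws.Cells(myRow, 1).Value = Application.UserName

    ' Wait until we are first in line before touching the master data.
    Do While ws.Cells(1, 1).Value <> Application.UserName
        Application.Wait Now + TimeValue("0:00:01")
    Loop

    ' ... update the master data tab here ...

    ws.Rows(1).Delete    ' leave the queue so the next user can proceed
End Sub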
What I'm trying to do is separate my existing MS Access application into a front-end (which will run locally on a user's machine) and a back-end (which will be hosted on a networked file server) and allow users to choose between "read-only" and "write" modes. The idea is that only one user can use the "write" mode at a time, thus preventing the same piece of inventory being allocated to multiple customers. My problem is that the application currently handles concurrency by requiring users to open a .bat file which only allows them to enter the application if a .ldb file does not already exist (there is no read-only mode currently), so I need to prevent users accessing the production data in "read-only" mode from creating a .ldb file and unnecessarily blocking out other users.
The biggest challenge to implementing this is that users must have write access to the temporary tables in the MS Access (.mdb) file installed locally. I have tried to implement this using a linked table, but I'm not sure how I can control when records become locked using linked tables (which creates a .ldb file).
You could change the sharing setting back to Exclusive Mode. Then only one person can access the file at a time. Check out this link and the other sharing options you have.
http://office.microsoft.com/en-us/access-help/set-options-for-a-shared-access-database-mdb-HP005188297.aspx
Side note: Yikes. Using Access in a shared network environment is not fun. I hope nothing important/time sensitive/secure is in this file. The .ldb file not being deleted and blocking other users is something that I used to see happen regularly in this situation. I believe splitting the Access file into a front-end and back-end like you've done is the first step. Then using linked tables to a SQL Server database can help resolve these issues. But if you're going to this level of effort you may want to consider dumping Access and getting a COTS product or creating a new application.
Depending on which version of Access you are using, there's a lot of flexibility in the UI development. In other words, this sounds more like an "interface" issue as opposed to a "database" issue. Given everybody is able to write to a table, you should be able to check in somewhat real time (performance can be an issue with larger datasets) whether a particular item has been added to inventory or not.
The way I handled this problem is to have two tables, an incoming and an outgoing log, and set up a query that did the math against the inventory list on the amount of products. And, like a general ledger, select a set amount of time to "close the log" (monthly, quarterly) so that the query is not taking into account stuff that happened two years ago.
If you need more help with Access related stuff, Access Monster is a good forum site that deals with nothing but Access.
My problem is that the application currently handles concurrency by requiring users to open a .bat file which only allows them to enter the application if a .ldb file does not already exist (there is no read-only mode currently), so I need to prevent users accessing the production data in "read-only" mode from creating a .ldb file and unnecessarily blocking out other users.
--> If every user has his own copy of the front-end on his own machine, you'd have to check the .ldb file of the back-end.
I guess it would be easier to give everyone write access to the backend and manage the actual writing programmatically with a "locked by User X" field in the backend:
You said:
preventing the same piece of inventory being allocated to mutliple customers
If this is the only reason for putting all users but one in read-only mode, you could put a "locked by User X" field on the inventory table. If someone starts to modify (or even opens) a piece of inventory, update the record with his user name, and delete the user name again when he's done.
If another user tries to open the same piece of inventory as well, the name of the first user will already be in the "locked by User X" field, so you can put the second user in read-only mode.
If the inventory pieces are not the only problem and all the other users really are not allowed to change anything as soon as someone else already is editing, you can create a new table with only one column and one row and use this as the "locked by User X" field. As soon as there is a user name inside, you can put everyone else in readonly mode.
No matter how you do it, you will have to provide some kind of admin menu, so if someone's front-end crashes while editing, someone else needs to be able to unlock this user's locked data (=delete his username from the "locked by User X" field).
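A rough sketch of the "locked by User X" field approach on the inventory table (the table, field, and form names are invented for illustration), including the unlock an admin menu would need:

' When a user opens a piece of inventory: try to claim it first.
Dim lngID As Long
lngID = Me.lstInventory   ' the piece of inventory the user picked

Dim db As DAO.Database
Set db = CurrentDb
db.Execute "UPDATE tblInventory SET LockedBy = '" & Environ("USERNAME") & "' " & _
    "WHERE InventoryID = " & lngID & " AND LockedBy IS NULL", dbFailOnError

If db.RecordsAffected = 0 Then
    ' Someone else already holds it: open read-only instead.
    DoCmd.OpenForm "frmInventory", , , "InventoryID = " & lngID, acFormReadOnly
Else
    DoCmd.OpenForm "frmInventory", , , "InventoryID = " & lngID, acFormEdit
End If

' When the user is done (form Close event), and from the admin menu for crashed sessions:
' db.Execute "UPDATE tblInventory SET LockedBy = NULL WHERE InventoryID = " & lngID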