Update Access back-end schema with VBA code

I have created an application in MS Access with VBA code. The app is split into a front end and a back end; the back end holds the user data. Now I want to add some new features, but this requires some extensive changes to the back-end schema structure.
My question is: what is the best way or practice to deliver the changes to the end user every time I upgrade my application? Of course, it has to be something that keeps the current data intact.
I have seen some suggestions about a free program called CompareEm that compares the old and new databases and produces the appropriate VBA code to do the job.
I was also considering whether it would be more convenient to ship an empty database that has the wanted schema alongside the upgraded front end, and to have a module that compares the user's old database with the empty one and changes the old schema to match the new one (first by removing all relations, then converting the tables, and then reapplying the new relations).
In any case, I would like something that could be done automatically or by some custom code, in order to avoid messing up the users' data.
Any help is appreciated.

This is one weak spot of Access: we don't have a really great scripting language or system built in for scripting out the database schema.
So while the split design is a must-have, and makes updating the code and application part very easy, an update of the back end (BE) is a challenge.
I have done this in the past, and the way you go about it is this:
You have to make up a rule, and the rule is this:
Any time you add a new field/column or a new table?
You MUST write the change(s) into a code module. In other words, you NEVER use the GUI anymore to add a new column (or, say, change its length). You add code to a sub that will do this for you. I did exactly this in one application I had (many customers, all 100% Access runtime).
So, to add new features, changes, and fixes, I often had to add a few new columns or tables. What I would do is WRITE the code into that "change/update" routine (and I also passed it a version number). On startup, that code would be called.
It started out with about 5 lines of code. A few years later? I think it had well over 100 lines of code, and it worked very well. A user might open an older data file (the BE), say because they had not paid for upgrades, so their current version of the software could be several upgrades behind. But, as noted, since all of the "updates" to the database were ALWAYS written as code that ran on startup, even if they were 5 versions back, all of the version-numbered code would run, make the changes to the BE, and they would be up to date.
So, this means you have to code this out. It's not all that hard, but you DO HAVE to adopt this way of working.
So, the code in that module looked like this:
There were two common cases. The first case was a WHOLE new table. I did not want to write out the code to create that table, so what I would do is INCLUDE the whole new table in the FE for this purpose (and I would append the letter "C" to the table name).
So, the code on startup would check whether the table exists, and if it does not, I would execute a transfer command to copy the table. The code stub looked like this:
' check for new table tblRemindDefaults
Dim rst As DAO.Recordset
On Error GoTo reminddefaultsadd
Set rst = CurrentDb.OpenRecordset("tblRemindDefaults")
So, in the above, I set the error handler and try to open the table. If the table does not exist, the code jumps to the routine reminddefaultsadd. Its job, of course, is to add this new table to the BE.
That code stub looked like this:
reminddefaultsadd:
Dim strToDB As String
strToDB = strBackEnd   ' path to the back-end file
' copy the template table shipped in the FE (the "C" copy) into the BE
DoCmd.TransferDatabase acExport, "Microsoft Access", strToDB, acTable, "tblRemindDefaultsC", "tblRemindDefaults", True
Resume Next
Note how I then simply copy a table from the FE to the BE.
The next type of upgrade was adding a new column to an existing table.
Again, similar to the above, the code would check for the column and, if it was not present, add it. In this case we're not creating a new table, so there's no copy of the table.
Typical code looked like this:
' add our new default user field
Dim db As DAO.Database
Dim nT As DAO.TableDef
Dim strToDB As String
strToDB = strBackEnd
Set db = OpenDatabase(strToDB)   ' open the BE directly, not via linked tables
Set nT = db.TableDefs("tblEmployee")
nT.Fields.Append nT.CreateField("DefaultUser", dbText, 25)
nT.Fields.Refresh
Set nT = Nothing
db.Close
So, in the above, we add a new field called DefaultUser to the table.
If you OPEN the BE DIRECTLY, and do NOT use linked tables, then you are free to modify the table in question with code. You can ALSO use SQL DDL statements. So while I noted that scripting support in Access is not all that great, you can do this:
Set db = OpenDatabase(strToDB)
strSql = "ALTER TABLE tblEmployee ADD COLUMN DefaultUser varCHAR(25);"
db.Execute strSql, dbFailOnError
So, you can write out code using TableDefs to create the field and append it, or you can use SQL DDL statements and execute those. Of course, each upgrade part I wrote had a version-number test.
So, the simple concept is that every time you need a new column, a new table, etc.?
You write out the code to make the change. You might work for a month or two and add a whole bunch of new features, and some of these features will require new columns. You NEVER use the GUI to do this. You write little code routines in your upgrade module and run them, then keep on developing. When you're done, all your updates to the table(s) are already expressed as code, and you can deploy the new FE to those users. On startup, the update code will run based on version number. (I have a small one-row table in the FE with a version number, and also a small one-row table in the BE with a version number.) After an upgrade, the new columns and new tables are now in the BE, and then of course I update that version number.
So, you can use a combination of SQL DDL commands, or use code to create the field definition as I did above.
The only really important issue is that table changes canNOT be made against linked tables. So you have to create and open a SEPARATE instance of the database object, as I did above. And on startup, you have to ensure that no main form bound to a linked table is running yet (since that will open the BE, and you can't make table changes against an open table).
So, it's not too hard to make the updates. The HARD part is your self-discipline: you have to ALWAYS go to your upgrade routines and add the new column in code. That way, after working for a few weeks, you have coded out the table changes, and thus are able to re-run that code against older existing BEs out in the field.
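The version-driven startup routine described above can be sketched roughly like this. It is only a sketch of the pattern, not the answerer's actual code: the version-table name (tblVersionBE), its VerNum column, and the specific upgrade steps are made-up illustrations, and it assumes strBackEnd holds the BE path as in the snippets above.

```vba
' Sketch of the startup upgrade dispatcher (illustrative names).
Sub UpgradeBackEnd()
    Dim db As DAO.Database
    Dim rst As DAO.Recordset
    Dim lngBEVer As Long

    Set db = OpenDatabase(strBackEnd)        ' open the BE directly, not via links
    Set rst = db.OpenRecordset("SELECT VerNum FROM tblVersionBE")
    lngBEVer = rst!VerNum
    rst.Close

    ' each numbered block runs only if the BE is older than that version,
    ' so a BE that is several versions behind catches up in one pass
    If lngBEVer < 2 Then
        db.Execute "ALTER TABLE tblEmployee ADD COLUMN DefaultUser varCHAR(25);", dbFailOnError
    End If
    If lngBEVer < 3 Then
        DoCmd.TransferDatabase acExport, "Microsoft Access", strBackEnd, _
            acTable, "tblRemindDefaultsC", "tblRemindDefaults", True
    End If

    ' stamp the BE so the blocks above are skipped next time
    db.Execute "UPDATE tblVersionBE SET VerNum = 3;", dbFailOnError
    db.Close
End Sub
```

Call this once on startup, before any bound form opens the BE.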

Welcome to Stack Overflow! When you send out a new version of your front-end db, you would need either to (1) include a script to update the back-end db, or (2) include a fresh copy of the back-end db plus a script to transfer the data from the previous version. In either case, you need to consider customers who may have skipped an update, so you would need some kind of version stamp on the back-end db.
You can accomplish either strategy with Access DDL (see here: https://learn.microsoft.com/en-us/office/client-developer/access/desktop-database-reference/data-definition-language) and/or Access DAO (see here: https://learn.microsoft.com/en-us/office/client-developer/access/desktop-database-reference/microsoft-data-access-objects-reference).
Please let us know if you have any specific questions but keep in mind SO is not intended for offering general advice or product reviews.

Related

MS Access 2016 Woes - Lockups in MultiUser (Inserting Dummy Entries to Add New Item)

I found an interesting conundrum with a database I administer. To create a new Stock Item, it adds a dummy entry and then opens this new entry in the usual Form for editing. See the code below.
The code works perfectly fine until you have the database open on more than one PC.
If the user on the 1st PC adds a new item, the 2nd PC freaks out over the dummy entry. This causes a 10-20 second delay on everything they do on the 2nd PC.
I'm trying to think of a simple/elegant way to achieve this without using a dummy entry. (Because it doesn't actually have a StockCode until the user enters one, I think the program on the 2nd PC chokes on the dummy entry with no StockCode.)
I really have no idea at this point.
Case vbKeyF1 ' F1 Key to Add New Record
stokmastSQL = "INSERT INTO tblSTOKMAST (STOCKCODE, PER, SELL1, SELL2, SELL3, GST) VALUES ('', '', 0.00, 0.00, 0.00, 'N');"
DoCmd.SetWarnings False ' Turn off SQL warnings for Action Queries
DoCmd.RunSQL stokmastSQL
DoCmd.SetWarnings True ' Re-enable SQL warnings for Action Queries
[Forms]![frmSTOKMASTLIST]![lst_STOKMASTLIST].Requery ' Requery ListBox after change
SQL = "SELECT STOCKCODE, DESCR, PER, SELL1 FROM tblSTOKMAST ORDER BY STOCKCODE" ' Re-initialize Record Source of ListBox
[Forms]![frmSTOKMASTLIST]![lst_STOKMASTLIST].RowSource = SQL
[Forms]![frmSTOKMASTLIST]!lst_STOKMASTLIST.SetFocus ' Set Focus to ListBox and select first record
[Forms]![frmSTOKMASTLIST]!lst_STOKMASTLIST.Selected(0) = True
DoCmd.OpenForm "frmSTOKMASTEDIT", , , "[STOCKCODE] = '" & [Forms]![frmSTOKMASTLIST]![lst_STOKMASTLIST].Column(0) & "'" ' Open new record in frmSTOKMASTEDIT
KeyCode = 0
End Select
Well, there is a MASSIVE, HUGE difference here between a long delay and something not working.
What you have described so far is something that is SLOW, not something that does not work.
About once a week on SO this issue comes up, and it has come up for about the last 20 years in a row!
You don't want to try to write a boatload of code to fix this issue, since virtually EVERYTHING you do would have to be fixed, and in 99% of cases writing code will NOT fix this issue!
In other words, this is NOT due to the dummy entry, but to something else that is wrong.
You don't mention whether the database is split, but let's leave that issue for a bit.
The first thing to try?
Do this:
Launch the application. (You also have not noted whether you are using linked tables here, which is a MASSIVE issue we need to know about.)
Now, from the nav pane, open ANY table, and minimize it (assuming the application is in windowed mode). Regardless, JUST OPEN any table.
At this point, launch the form that creates the new record. Is it still slow?
The above relies on what we call a persistent connection. By opening a table (any table), you force Access to keep the back-end database open. In multi-user mode, this "opening" process can be VERY slow (like 20 seconds).
So, first and foremost, try the above. You can open ANY table (and if using linked tables, then ANY linked table). Just leave that table open, and now try your form again. The delay (even with 2 users) should now be gone.
If you don't address this delay, then it will start to appear everywhere, and it will appear even without writing code.
And conversely, writing code will not fix this issue.
If you determine that the above fixes the issue? Then on application startup you can create what we call a persistent connection in VBA code. Another way is to open a form (any form) bound to an existing table and leave it open.
As noted, if the above does not fix the issue, then we have to look at how that new record is added. Perhaps it uses something like DMax() to get the "max ID value"; in that case, adding an index to that column can fix the issue.
So, try the above first (since you can do this without any code). Just open any table, THEN launch/use the form(s) in question, and see if the long delay goes away.
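If the manual test above works, the persistent connection can be created in code on startup, roughly like this. This is a sketch: the table name is a placeholder (any linked table will do), and the variable names are illustrative.

```vba
' Sketch: hold one recordset open for the life of the application so
' Access keeps the back-end file open (a "persistent connection").
' Module-level, so it is not released when the sub exits.
Public gPersistRst As DAO.Recordset

Public Sub OpenPersistentConnection()
    ' tblSTOKMAST is a placeholder - any linked table works
    Set gPersistRst = CurrentDb.OpenRecordset("tblSTOKMAST", dbOpenSnapshot)
End Sub
```

Call OpenPersistentConnection from your startup form's Load event, and only set gPersistRst = Nothing when the application shuts down.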

'Update requires a valid UpdateCommand when passed DataRow collection with modified rows.'

In my application I have three tables all from the same Access database. I have used the Wizard in Visual Studio Express 2019. This is an extension of my unresolved question here.
Database.TwixBindingSource.EndEdit()
Database.NPCsBindingSource.EndEdit()
Database.EffectsBindingSource.EndEdit()
Database.TableAdapterManager.UpdateAll(Database.DatabaseDataSet)
I have these three tables ("Twix", "NPCs", and "Effects"). The above code automatically runs every 15 seconds, like an autosave feature. My issue is that when I try to edit "NPCs" or "Effects" I receive the following error:
'Update requires a valid UpdateCommand when passed DataRow collection
with modified rows.'
Most of the online answers regarding this error say it is solved by ensuring each table has a primary-key column that is recognised by Visual Studio, so it can automatically generate the necessary commands to edit, update, delete, etc.
However, I am stumped because I have PKs on each table.
The only thing I can think of is that it has something to do with the TableAdapterManager,
although I have no clue at this point.
EDIT: I have replaced the UpdateAll with individual Update calls, but the same problem persists...
Database.TwixBindingSource.EndEdit()
Database.TwixTableAdapter.Update(Database.DatabaseDataSet)
Database.NPCsBindingSource.EndEdit()
Database.NPCsTableAdapter.Update(Database.DatabaseDataSet)
Database.EffectsBindingSource.EndEdit()
Database.EffectsTableAdapter.Update(Database.DatabaseDataSet)
Right-click on your TableAdapter (on the header, not the Fill command) and choose Configure.
Click Advanced Options and verify that "Generate Insert, Update and Delete statements" is ticked.
When this isn't ticked, the resulting TableAdapter doesn't have any DML statements built:
If your SELECT statement doesn't select the column that is set as the primary key in the database, then this Generate IUD tickbox may be greyed out, or it will be ticked but the DML statements won't generate. Pay attention to the final page of the wizard. Here is what happened when I made a table "Other" that had no primary key:
INSERT generates, because it's easy to generate an insert on a keyless table, but update and delete cannot be generated
If you don't select the PK column you get a warning:
It's important that your DB tables have a PK; declaring some DataColumns of a DataTable in a DataSet to be a primary key is not the same thing. A DataSet is not a database; it may have more or fewer tables/columns, and the presentation and datatypes of row data do not have to match the DB. I can see that your dataset screenshot shows some tables with PKs declared on the DataSet side, but this is not a statement that they are definitely PKs on the DB side.
Feel free to delete the DataTable; it will delete the TableAdapter too. You can then recreate that one TableAdapter by right-click, New, TableAdapter, SELECT * FROM table.
If you hadn't already realized, remember that you can (and should) declare more queries per TableAdapter than just the basic Fill, which appears to be a SELECT * FROM without a WHERE clause in your case. Personally, I always make my first query SELECT * FROM table WHERE id = #id, because it's really rare that you want to download a whole table. You can leave the default as a WHERE-less query, but consider adding others, such as SELECT * FROM Twix WHERE Location = #location, naming the query FillByLocation. In code you can then fill just the locations you want, rather than downloading 10,000 Twixes into the app just to show some of them (with a RowFilter, I guess).
This question and answers helped me figure out my problem.
NOTE that if your problem is your table (for example, you forgot to set a primary key and you went back and fixed it in SSMS while debugging), you will need to find the menus in Caius's answer, ensure the boxes are ticked, and hit Finish, if nothing else.
This will refresh something about the generated statements (unclear exactly what) and allow this to work. Otherwise you will continue to receive these errors as if you'd fixed nothing, and probably continue to search to no avail.

Error adding vfp table to existing

I copied a new version of a table into an existing VFP installation, and when the application then tries to access that table it comes up with a "Variable not found" error. The old and new versions of the table appear to have the same structure. Why could this happen? Does the DBC need to be updated in some way if I copy a new version of the table in? The structure is the same, but the data in it is different.
I copied the table in using Windows Explorer.
Is the DBC in the same folder as the table? If not, are they in the same relative position on the two different drives? If not, you'll get errors, though I wouldn't expect "Variable not found."
Did you bring along the FPT and CDX for the new file? Again, that's not the error I would expect, but failure to do so would cause problems.
Assuming all that is right, what's the actual line of code that's failing?
Was the table that you copied in "freed" from its previous DBC before copying? If not, as soon as you attempt to USE it in the new location, I believe VFP will try to locate the DBC that it belongs to.
If you believe the table structures are identical, you might be better off leaving the existing table in place, ZAPping it to clean it out, and then appending the records from the other copy. Of course, you might need to temporarily switch off any INSERT triggers or row-level validation if you've got anything clever happening therein, such as updating a "last modified" field. AutoInc fields will also need to be handled with care, but it doesn't sound like this is something you're expecting to do on a regular basis, so it shouldn't be too onerous as a one-off exercise.

Access 2010 vba Array vs. Query

I maintain an Access DB at work that we use to send out hourly updates on employee productivity. Currently the form is arranged into columns: the first column contains a ComboBox from which we can select an employee's name. Once a name is selected, the next column fills in automatically with the agent's employee ID (TID), through this code:
AgentName = rs.Fields("AgentName")
sqlString2 = "SELECT * FROM AllAgents WHERE FullName = '" & AgentName & "'"
rs2.Open sqlString2, CurrentProject.Connection, adOpenKeyset, adLockOptimistic
AgentTID = rs2.Fields("TID").Value
rs2.Close
Everything works fine when working in the office on the corporate network, but I've just discovered that working over the VPN causes horrendous slowdown when using this form. Slowness is an issue I've fought with over the VPN forever, but I had a thought in this case that may potentially alleviate the slowness. I just want to know if I'm correct before I go to the trouble of re-coding everything.
My idea is that I could create an array in VBA that would be populated with the agents' names and IDs when the form first loads, or even when the DB is opened on each individual laptop. Then the DB would only need to read from the AllAgents table once, and could simply use the array as a source instead. What I don't know is whether this would have an effect or not. Basically, if an Access DB is saved on a network drive and accessed over a VPN, would the array be stored in the RAM of the laptop? If it is, I would assume this would alleviate the speed issues and be worth taking the time to re-code.
Thanks,
The thing about form-level or global variables in Access is that you had better have good error handling in the application. If an unhandled error occurs, it can result in those variables being, for lack of a better word, discarded. The result would be that the next time you try to access the data in the array, you get another exception.
Here are a few things you could try before going the array route:
Your combo box probably doesn't need to be bound to your recordset rs directly. Set the source of the combo box at design time to the underlying query or table.
This makes it possible to simply refer to the combo box's bound field with something like: AgentName = cboAgentName.Value
(The fewer unnecessary recordset objects on the form, the better off you will be in the long run.)
Your lookup code shouldn't use SELECT *; that just isn't a good practice in production code. Instead, use SELECT TID. Basically, only return the fields you actually need in your query.
You don't need the adOpenKeyset option, which adds unnecessary overhead. You should be able to use adOpenForwardOnly.
I would also suggest looking at the AllAgents table to make sure that there is an index on the field you are using for the lookup. If there isn't, think about adding one.
You still might need to go the array route, but these are relatively simple things that you can use to try to troubleshoot performance without introducing massive code changes to the application.
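If you do end up going the array route, the load-once cache could be sketched like this. Names here are illustrative (a Scripting.Dictionary is used instead of a raw array, since it keys naturally on the agent's full name), and error handling is omitted for brevity, though per the warning above you will want it.

```vba
' Sketch: cache agent TIDs in memory at form load so AllAgents is
' read over the VPN only once, instead of on every ComboBox change.
Private mAgentTIDs As Object   ' Scripting.Dictionary, keyed by FullName

Private Sub Form_Load()
    Dim rs As ADODB.Recordset
    Set mAgentTIDs = CreateObject("Scripting.Dictionary")
    Set rs = New ADODB.Recordset
    ' one forward-only, read-only pass, fetching only the needed fields
    rs.Open "SELECT FullName, TID FROM AllAgents", _
            CurrentProject.Connection, adOpenForwardOnly, adLockReadOnly
    Do Until rs.EOF
        mAgentTIDs(rs!FullName.Value) = rs!TID.Value
        rs.MoveNext
    Loop
    rs.Close
End Sub

' later, instead of opening a recordset per lookup:
'   AgentTID = mAgentTIDs(AgentName)
```

The dictionary lives in the laptop's RAM, so lookups after Form_Load never touch the network.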

How update a SQL table from a modified datatable?

I used the DataSet Designer to create FTWDataSet, which holds the AlarmText table from a SQL Express database. So far my form contains ONLY DataGridView1. The code below successfully shows the contents of the AlarmText table plus the one added checkbox column (which I will populate with display-only data, and which is not an issue here).
Dim ta As New FTWDataSetTableAdapters.AlarmTextTableAdapter
Dim dt As New FTWDataSet.AlarmTextDataTable
ta.Fill(dt)
DataGridView1.DataSource = dt
'create a new Bool column in the datatable
dt.Columns.Add("NewCol", (New Boolean).GetType)
What else do I need to do to use the DataGridView to edit and save values in the AlarmText table?
Here's a brief MSDN walkthrough on this topic.
Some notes:
You shouldn't need a binding source to persist changes back to the database.
To make the table adapter accessible to other procedures in your form, make it a form-scoped (a.k.a. member) variable instead of method-scoped, as in your example.
If you created your Dataset using the Dataset Designer, and you're fetching your original data from anything more than a simple table or view, then your adapter won't know how to update anything in the original database. You have to manually configure the UPDATE command in this situation. For reference, see the TableAdapter Update Commands section in the above link.
I should also mention that I avoid TableAdapters in ADO.NET like the plague. In theory they're very convenient and powerful. In practice, many of them (especially the one for the Oracle provider) are buggy, and if they don't work exactly right, you're totally screwed.
If you're pulling from just one underlying table, the adapter should work fine (so ignore my nihilistic advice for the time being). It may be that adding an additional column in code breaks the adapter (since there's no corresponding column in the database table).