A problem has come up after a SQL Server database I use was migrated to a new server. Now, when I try to edit a record in Access (in a form or directly in the table), it says: WRITE CONFLICT: This record has been changed by another user since you started editing it...
Are there any non-obvious reasons for this? No one else is using the server, and I've disabled all triggers on the table. I've just found that it has something to do with NULLs: rows that have none are fine, but some rows that have NULLs are not. Could it be to do with indexes? If it is relevant, I have recently started bulk-uploading daily, rather than inserting rows one at a time from Access with INSERT INTO.
Possible problems:
1 Concurrent edits
One reason might be that the record in question is open in a form that you are editing. If you change the record programmatically during your editing session and then try to close the form (and thus save the record), Access says that the record has been changed by someone else (of course it's you, but Access doesn't know that).
Save the form before changing the record programmatically.
In the form:
'This saves the form's current record
Me.Dirty = False
'Now, make changes to the record programmatically
2 Missing primary key or timestamp
Make sure the SQL-Server table has a primary key as well as a timestamp (= rowversion) column.
The timestamp column helps Access determine whether the record has been edited since it was last selected. If no timestamp column is available, Access does this by comparing all fields, which may not work well with NULL entries (see 3 Null bits issue).
The timestamp actually stores a row version number and not a time.
Don't forget to refresh the table link in access after adding a timestamp column, otherwise Access won't see it. (Note: Microsoft's Upsizing Wizard creates timestamp columns when converting Access tables to SQL-Server tables.)
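A minimal sketch of adding such a column, assuming a table named dbo.MyTable (relink the table in Access afterwards):
-- rowversion (the current name for the timestamp type) is filled in automatically
-- and changes on every update, so Access can use it to detect concurrent edits
-- instead of comparing every field
ALTER TABLE dbo.MyTable ADD RowVer rowversion;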
3 Null bits issue
According to @AlbertD.Kallal, this could be the null bits issue described in KB280730 (last snapshot on the Wayback Machine; the original article was deleted). If you are using bit fields, set their default value to 0 and replace any NULLs entered before with 0. I usually use BIT DEFAULT 0 NOT NULL for Boolean fields, as it most closely matches the idea of a Boolean.
The KB article says to use an *.adp instead of a *.mdb; however, Microsoft discontinued the support for Access Data Projects (ADP) in Access 2013.
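For a single Boolean column, the fix looks roughly like this (dbo.MyTable and IsActive are placeholder names):
-- Backfill existing NULLs, then forbid them and default new rows to 0
UPDATE dbo.MyTable SET IsActive = 0 WHERE IsActive IS NULL;
ALTER TABLE dbo.MyTable ALTER COLUMN IsActive bit NOT NULL;
ALTER TABLE dbo.MyTable ADD CONSTRAINT DF_MyTable_IsActive DEFAULT (0) FOR IsActive;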
I had this problem, the same as the original poster, even when editing directly with no form involved. The problem is with bit fields: if your field is NULL, Access converts the NULL to 0 when you touch the record, and your own edit then counts as a second change, so the two changes conflict. I followed Olivier's suggestion:
"Make sure the table has a primary key as well as a timestamp column."
And it solved the problem.
I have seen a similar situation with MS Access 2003 (and earlier) linked to MS SQL Server 2000 (and earlier). In my case I found the issue to be the bit fields in the SQL Server tables; bit fields with no default and NULL values cause problems. When I added a record to a linked table through the MS Access 2003 database window, an error would be returned unless I specifically set the bit field to True or False. To remedy this, I changed the SQL Server tables so that every bit field defaulted to either 0 or 1. Once I did that, I was able to add and edit data in the linked table via MS Access.
I found the problem to be a conflict between Jet/Access Boolean fields and SQL Server bit fields.
Described here under pitfall #4
https://blogs.office.com/2012/02/17/five-common-pitfalls-when-upgrading-access-to-sql-server/
I wrote a SQL script to alter all bit fields to NOT NULL and give them a default - zero, in my case.
Just execute this in SQL Server Management Studio, paste the results into a fresh query window, and run them - it's hardly worth putting this in a cursor and executing it automatically.
-- Generates, for every user-table bit column, a line that backfills NULLs with 0,
-- makes the column NOT NULL, and adds a DEFAULT (0) constraint
SELECT
    'UPDATE [' + o.name + '] SET [' + c.name + '] = ISNULL([' + c.name + '], 0);' +
    'ALTER TABLE [' + o.name + '] ALTER COLUMN [' + c.name + '] BIT NOT NULL;' +
    'ALTER TABLE [' + o.name + '] ADD CONSTRAINT [DF_' + o.name + '_' + c.name + '] DEFAULT ((0)) FOR [' + c.name + ']'
FROM
    sys.columns c
    INNER JOIN sys.objects o
        ON o.object_id = c.object_id
WHERE
    c.system_type_id = 104   -- 104 = bit
    AND o.type = 'U'         -- user tables only (skip views etc.)
    AND o.is_ms_shipped = 0;
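For example, for a hypothetical table Orders with a nullable bit column IsShipped, the query emits a row like:
UPDATE [Orders] SET [IsShipped] = ISNULL([IsShipped], 0);ALTER TABLE [Orders] ALTER COLUMN [IsShipped] BIT NOT NULL;ALTER TABLE [Orders] ADD CONSTRAINT [DF_Orders_IsShipped] DEFAULT ((0)) FOR [IsShipped]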
This is a known bug with Microsoft Access.
To work around this problem, use one of the following methods:
Update the form that is based on the multi-table view
On the first occurrence of the error message that is mentioned in the "Symptoms" section, you must click either Copy to Clipboard or Drop Changes in the Write Conflict dialog box. To avoid the repeated occurrence of the error message that is mentioned in the "Symptoms" section, you must update the recordset in the form before you edit the same record again.
Notes
To update the form in Access 2003 or in Access 2002, click Refresh on the Records menu.
To update the form in Access 2007, click Refresh All in the Records group on the Home tab.
Use a main form with a linked subform
To avoid the repeated occurrence of the error message that is mentioned in the "Symptoms" section, you can use a main form with a linked subform to enter data in the related tables. You can enter records in both tables from one location without using a form that is based on the multi-table view.
To create a main form with a linked subform, follow these steps:
1. Create a new form that is based on the related (child) table that is used in the multi-table view. Include the required fields on the form.
2. Save the form, and then close the form.
3. Create a new form that is based on the primary table that is used in the multi-table view. Include the required fields on the form.
4. In the Database window, add the form that you saved in step 2 to the main form. This creates a subform.
5. Set the Link Child Fields property and the Link Master Fields property of the subform to the name of the field or fields that are used to link the tables.
Workaround methods taken from Microsoft Support.
I have experienced both of the causes detailed above: directly changing data in a table that is currently bound to a form, and having a bit field in SQL Server whose Default Value is not set to 0 (zero).
The only way I have been able to get around the latter issue is to add the default value of zero to the bit field AND run an update query to set all current values to zero.
To get around the former error, I have had to be inventive. Sometimes I can change the order of the VBA statements and move a Refresh or Requery to a different location, which prevents the error message.
In most cases, however, what I do is Dim a String variable in the subroutine where I call the direct table update. Before I call the update, I set this String variable to the value of the RecordSource behind the bound form, capturing the exact SQL statement in use at the time. Then I set the form's RecordSource to an empty string ("") to disconnect it from the data, perform the data update, and finally set the form's RecordSource back to the value saved in the String variable, re-establishing the binding and letting the form pick up the new value(s) from the table. If the form contains one or more subforms, their Link fields need to be handled in a similar manner to the RecordSource.
When the RecordSource is set to an empty string, you may see #Name? in the now-unbound controls. I simply set the Visible property to False at the highest possible level (Detail section, subform, etc.) while the RecordSource is empty, hiding the #Name? values from the user.
Setting the RecordSource to an empty string is my go-to solution when a coding change can't be found. I am wondering, though, if my design skills are lacking and there is a way to avoid the issue altogether.
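A minimal sketch of that unbind/update/rebind pattern, written in the bound form's own module; DoTheDirectTableUpdate stands in for whatever routine performs the direct update:
Private Sub ApplyDirectUpdate()
    Dim strSource As String
    ' Remember the form's current recordsource
    strSource = Me.RecordSource
    ' Hide the detail section so users don't see #Name? in the unbound controls,
    ' then disconnect the form from its data
    Me.Section(acDetail).Visible = False
    Me.RecordSource = ""
    ' Perform the direct table update (hypothetical routine)
    DoTheDirectTableUpdate
    ' Rebind the form; it picks up the new values from the table
    Me.RecordSource = strSource
    Me.Section(acDetail).Visible = True
End Sub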
One final thought on addressing the error message: instead of calling a routine to directly update the data in the table, I find a way to update the data via the form instead, by adding a bound control to the form and updating the data through it, so that the form data and the table data do not get out of sync.
To get over this problem, I created VBA to change another field in the same row: I added a separate field and increment its contents by 1 when the form is saved on closing. This solved the issue.
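A minimal sketch of that trick, assuming a spare numeric column bound to a control named EditCounter on the form:
Private Sub Form_BeforeUpdate(Cancel As Integer)
    ' Bump the dummy counter so every save writes a genuinely changed value
    Me.EditCounter = Nz(Me.EditCounter, 0) + 1
End Sub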
I've dealt with this issue with MS Access tables linked to MS SQL Server tables multiple times. The answer above was extremely helpful and indeed identified the source of most of my issues.
I also ran into this issue when I accidentally added a bit field with a space in the field name... yeah...
I had run alter table tablename add [fieldname ] bit default 0. The solution I found was to drop that field and re-add it without the space in the name.
I had this issue and realized it was caused by adding a new bit field to an existing table. I deleted the new field and everything went back to working fine.
If you are using linked tables, ensure you have updated these and retry before doing anything else.
I thought I had updated them, but hadn't. It turned out someone had updated the form validation and the SQL tables to allow 150 characters but hadn't refreshed the linked table, so Access still thought only 50 characters were allowed - boom, write conflict.
Not sure this is the most appropriate error for the scenario, but hey, most of the interesting issues are never flagged appropriately in any Microsoft software!
I'm using this workaround and it has worked for me:
Front end: MS Access
Back end: MySQL
On the Before Update event of a given field:
Private Sub tbl_comuna_id_comuna_BeforeUpdate(Cancel As Integer)
    ' If the value has not actually changed, cancel the update and
    ' undo the edit so no write is sent to the server
    If Me.tbl_comuna_id_comuna.OldValue = Me.tbl_comuna_id_comuna.Value Then
        Cancel = True
        Me.Undo
    End If
End Sub
I just had very heavy write-conflict problems (Access 2013 32-bit, SQL Server 2017 Express) with a rather heavily loaded split form.
For me, the solution that finally got rid of the write-conflict problems was simply to set the split form datasheet to read-only. (I haven't a clue why it was read/write anyway; I must have set it that way by mistake...)
It cost me nearly a whole week to find that out.
I was having this problem, and saving the record, setting Dirty to False, etc. did not work. In the end, adding a timestamp column to the SQL table is what avoided/fixed the issue.
The last time I got this error, it was the bit-field-with-NULL issue.
This time, however, it was a mismatch between the text size of the source table field and the linked table field.
I checked all my bit fields in the various tables but didn't find any issue; all of them had default values, so there were no NULL bit values. Then I noticed that a text field of nvarchar(500) was giving the error: the linked table was still using the old field size of 50 instead of the recently changed 500. Relinking the tables solved the problem.
So another finding: if the data type or size of a column behind a linked table is changed, you need to relink the table.
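A minimal sketch of relinking every linked table from VBA (the Linked Table Manager does the same thing interactively):
Dim tdf As DAO.TableDef
' Refresh each linked table so Access re-reads the server-side column definitions
For Each tdf In CurrentDb.TableDefs
    If Len(tdf.Connect) > 0 Then   ' linked tables have a connect string
        tdf.RefreshLink
    End If
Next tdf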
Just had this issue on MS Access 365 connected to PostgreSQL server. The error only occurred when trying to edit the first row.
I manually deleted the first row in pgAdmin 4, and then manually added it again. This solved the issue.
I was receiving the same error message.
The Id column in the database table was set to bigint; changing it to int resolved the issue, likely because Access versions without bigint (Large Number) support do not handle that type well in linked tables.
Related
I created a table for use in MS Access 2010 by running the following script on SQL Server 2008:
SELECT * into qryInstrumentInterfacelog FROM tblInstrumentInterfaceLog
qryInstrumentInterfacelog is used to populate a subform on the main form. After a "Process" button is pressed, files are read in and stored in the database; a new record is inserted into tblInstrumentInterfaceLog every time a new file is read in. My problem is that qryInstrumentInterfacelog does not update along with tblInstrumentInterfaceLog; it just keeps the same data it had when the script was first run on the server. I tried different methods of requerying the subform, but then realized the subform had no issues; it was the actual table that wasn't changing. How can I get qryInstrumentInterfacelog to be dynamic and update as tblInstrumentInterfaceLog updates? Is my SQL code wrong?
Well, one important concern is that, indeed, you cannot repeat the query as written.
"Select... into" creates a new table only. It does not insert/append to such a table.
So if you are really calling that a second time, it is probably erroring out.
If you really want to drop and replace the table, make sure to call an explicit "Drop Table" in advance of your "Select...Into".
--
A typical pattern in SQL Server T-SQL is:
IF OBJECT_ID('your_table_name') IS NOT NULL
    DROP TABLE your_table_name;

-- ...your SELECT ... INTO...
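Applied to the tables from the question, the whole refresh might look like this, re-run whenever you want qryInstrumentInterfacelog brought back in sync:
IF OBJECT_ID('qryInstrumentInterfacelog') IS NOT NULL
    DROP TABLE qryInstrumentInterfacelog;

SELECT * INTO qryInstrumentInterfacelog FROM tblInstrumentInterfaceLog;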
Sometimes I face the following situation in my database design, and I want to know the best practice for handling it:
For example, I have a table, and after a while, when the database is in operation and some real data has already been entered, I need to add some required fields (that are not supposed to accept NULL).
What is the best practice in this situation?
Make the field accept NULL (since some data is already in the table), sacrificing the important constraint, and try to force the user to fill in this field through validation in the code?
Truncate all the entered data and re-enter it again (tedious work)?
Any other suggestions?
It depends on requirements. If the data to populate existing rows for the new column isn't available immediately then I would generally prefer to create a new table and just populate new rows when the data exists. If and when you have all the data for every row then put the new column into the original table.
If possible, I would set a default value for the new column.
e.g. for varchar:
alter table table_name
add column_name varchar(10) not null
constraint column_name_default default ('Test')
After you have updated the data, you could then drop the default:
alter table table_name
drop constraint column_name_default
A lot will come down to your requirements.
It depends on your application, your database schema, and your entities.
The best way to go about it is to truncate the data and re-enter it, but it need not be too tedious a job. Temporary tables and table variables can assist a great deal here. A simple procedure comes to mind:
1. In SQL Server Management Studio, right-click the table you wish to modify and select Script Table As > CREATE To > New Query Editor Window.
2. Add a # in front of the table name in the CREATE statement and run the script to create the temporary table.
3. Move all records into the temporary table with something to the effect of: INSERT INTO #temp SELECT * FROM original
4. Truncate your original table and make any changes necessary.
5. Right-click the table and select Script Table As > INSERT To > Clipboard, paste it into your query editor window, and modify it to read records from the temporary table using INSERT .. SELECT.
That's it. Admittedly not quite straightforward, but a well-kept database is almost always worth a slight hassle.
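A condensed sketch of that round trip, with hypothetical column names; it uses SELECT ... INTO instead of the scripted CREATE and ignores identity columns for brevity:
-- 1. Park the existing rows in a temporary table
SELECT * INTO #original_backup FROM dbo.original;

-- 2. Empty the original table and make the structural change
TRUNCATE TABLE dbo.original;
ALTER TABLE dbo.original ADD required_field varchar(10) NOT NULL
    CONSTRAINT DF_original_required_field DEFAULT ('Test');

-- 3. Reload the data; the new required column is filled in by its default
INSERT INTO dbo.original (existing_col1, existing_col2)
SELECT existing_col1, existing_col2 FROM #original_backup;

DROP TABLE #original_backup;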
So I'm trying to do something I thought would be straightforward. I have a table in the DB named "Images". Its 'Description' column is of type nvarchar(50). I simply want to make it nvarchar(250). Every time I try, it says it can't save because some tables would have to be dropped and re-created. I can't just delete the table (I think) because there's already data in it, and I can't lose it.
EDIT:
Exact error message:
"Saving changes is not permitted. The
changes you have made require the
following tables to be dropped and
re-created. You have either made
changes to a table that can't be
re-created or enabled the option
Prevent saving changes that require
the table to be re-created."
Should I just disable the 'Prevent saving changes that require table re-creation' option and save it from there?
This KB article explains it.
Do you have any tables referencing the "Description" column? That would prevent you from changing the data type/length.
Were you doing this from the SSMS GUI, or were you running a script using ALTER TABLE to make the change?
If you did it through the designer, I believe it creates another table, drops the original, and renames the new table. If that table is in a PK/FK relationship, it can't drop the table. Never make table changes except by using a script. You also need scripts to put the changes into source control properly.
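A minimal sketch of the script approach for the original question, assuming the default dbo schema (ALTER COLUMN must restate the existing NULL/NOT NULL setting, so check that first):
-- Widen the column in place; no table drop/re-create is needed
ALTER TABLE dbo.Images ALTER COLUMN [Description] nvarchar(250) NULL;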
I am really posting this out of desperation after searching around a lot for an answer and trying a few different things with no success.
I have an Access database whose tables I recently migrated to SQL Server 2005; Access continues to function for the users as a front end providing forms, reports, and queries.
However, since moving to the Access FE / SQL BE setup, the users have been reporting that sometimes, when they are entering a new record and click into a subform (saving the record) or click Save on the menu itself, the form jumps to an existing record. The new record has been saved, but for some reason Access switches to a different record as it refreshes. The user then has to close out, find the saved record, and continue editing it.
Scenario: A user is entering a quote and fills out all the quote details, customer,
date, etc, then clicks in the line-items subform to add a product (or clicks save in the menu), and suddenly
the quote form (and line-item subform) is showing the details of some random quote. The random quote could be recent, or from years ago, and has nothing in common with the quote they were entering.
This weird behavior only happens on inserting a new record, never on editing an existing record. Users tell me that it happens 'more often' when they go to add a new (quote, customer, whatever) after opening the database.
I have noticed it is only happening on forms that have subforms, so my first thought was that it had to do with Access sending through the subform data before the form data was saved, causing a PK violation. But this doesn't appear to be happening: there are no errors on the SQL Server, and the record is saved successfully. Forcing the users to save the main-form record before adding subform records (i.e. on a quote, forcing them to save the quote before they can add line items) didn't work; it just causes the jump (sometimes) on the save.
It isn't VBA running on the Save or Current events; I have set breakpoints on all the event handlers as it jumps, and no VBA is being executed. Some of the 'jumping' forms have no VBA at all, but all have subforms. I suspect it has to do with record locking.
The server running the tables is SQL Server 2005, the users are using a mix of Access 2000 and 2003, mostly XP SP3 with a couple of old Win2k boxes. They are using Merge replication and a couple of users are running replicated SSEE2005 editions and subscribing to the main server. Most users are not replicated, just connecting directly to the server via ODBC or SQL native client connections. But I have verified that this is happening to all users, usually once or twice a day, and it has happened to me before. So it isn't a user issue.
The worst part about this behavior is that it only happens some of the time and I haven't managed to find a scenario that will always cause it to happen.
If anyone has experienced anything like this before, please let me know how you sorted it out, or even suggestions would be welcome.
Update:
(1/10/09) Problem solved, thanks to David Fenton. Setting the form to Data Entry mode (Form.DataEntry = True) before opening it to add records does indeed prevent the jumping. The client reports no issues at all since I changed this a week ago.
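A minimal sketch of that fix, assuming the quote form is named frmQuote:
' Either set the form's Data Entry property to Yes in design view, or
' open it for new records only, so there is no existing record to jump to
DoCmd.OpenForm "frmQuote", DataMode:=acFormAdd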
A client is reporting occasional similar problems. It started immediately after they started using merge replication.
I've informed several contacts within the Microsoft Access product group as well as my fellow Access and SQL Server MVPs.
Please email me your email address so I can forward that to my contacts at Microsoft as I would assume they would want to contact you directly. tony at granite.ab.ca
BTW, excellent troubleshooting and detailed problem description.
It definitely sounds like a record-locking issue. Are you using autonumbers as the PK? Have you tried two computers adding a record on the same form at the same time (meaning one of them fires the insert event while the other has added a new record on the form but is still editing it)?
Could you check, one way or another, whether the PK of the inserted record stays the same after insertion as the PK shown before the insertion (for example, by adding a few Debug.Print statements to your code)?
A possible scenario: two pending inserts are given the same PK by the machine, and the second one is then automatically changed at insert time, resulting in your form losing the 'active' record.
I'm wondering why you are using a form to add records that has any other records loaded for the user to jump to in the first place.
That is, I don't believe in using the same form to edit records as is used to create them.
Instead, I use an unbound dialog to collect all the required fields, insert the record in SQL, then open the main editing form to that single record (not a form with the whole table navigated to the record that was just added).
Keep in mind that in a main form/subform scenario, creating a record in the subform when the parent form is unsaved causes the parent record to be saved. You might want to check if there is any code in the Insert and Update events of the main form that would cause a requery of the main form on the insert of a new record (triggered by editing the subform).
But I would still suggest that the best architecture is to avoid this kind of possible scenario by loading only single records, so there is no other record to jump to. That would certainly limit the possibilities of where the user could end up when the problem occurs.
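A minimal sketch of that approach, with hypothetical form, table, and control names; here the insert is done through a DAO recordset so the new identity key is easy to read back:
Private Sub cmdSave_Click()
    Dim rs As DAO.Recordset
    Dim lngNewID As Long
    ' Insert the new quote from the unbound controls
    ' (dbSeeChanges is required for SQL Server tables with an identity column)
    Set rs = CurrentDb.OpenRecordset("tblQuote", dbOpenDynaset, dbSeeChanges)
    rs.AddNew
    rs!CustomerID = Me.cboCustomer
    rs!QuoteDate = Me.txtQuoteDate
    rs.Update
    rs.Bookmark = rs.LastModified   ' move to the row we just added
    lngNewID = rs!QuoteID
    rs.Close
    ' Open the editing form on that single record only
    DoCmd.OpenForm "frmQuoteEdit", WhereCondition:="QuoteID=" & lngNewID
    DoCmd.Close acForm, Me.Name
End Sub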
I have seen behavior 'like' this when there are multiple ways of doing the same thing (e.g. tabbing out of a textbox triggering LostFocus vs. clicking a button). So make sure that this isn't the case, if you haven't already.
This problem is caused by the merge replication trigger. In this trigger (the problem starts with SQL Server 2005; SQL Server 2000 does not cause it), replication inserts some data into replication tables that have identity columns, and Access picks up that identity number instead of the identity of the row the form actually inserted. I have read that Access uses @@IDENTITY instead of SCOPE_IDENTITY(), and that is the problem. To avoid it, change the merge insert trigger so that at the beginning you save the current value of @@IDENTITY in a variable, and at the end of the trigger you insert a value into a temp table whose identity column is seeded with the saved value. This corrects @@IDENTITY and Access gets the right value.
At the beginning of the trigger:
DECLARE @identity int
DECLARE @strsql varchar(128)
set @identity = @@IDENTITY
At the end, something like:
set @strsql = 'select identity(int,' + CAST(@identity as varchar(15)) + ',1) as id into #temp'
exec(@strsql)
At the end it should be placed between
if @@error <> 0
goto FAILURE
and
return
The problem in Access will arise not only on forms but also when editing the linked ODBC table directly.
I'm looking for a way to add this automatically to the merge replication triggers (mainly the insert trigger).
This is a bug in the communication between Access and SQL Server. Access takes the identity of a new record from @@IDENTITY, and when you finish entering the record it reloads the data based on that @@IDENTITY value from SQL Server. With SQL Server 2000, the inserted merge trigger and Access usually work fine. From SQL Server 2005 on, the merge trigger has a part in which data is entered into a merge replication table that also has an identity column, which changes the value of @@IDENTITY away from that of the record newly entered from Access.
One solution is to change every merge insert trigger so that it saves @@IDENTITY in a variable at the beginning, and at the end inserts a dummy record into a #temp table whose identity column starts at the previously saved value.
I found this solution somewhere on the net a week ago when I was hit by this problem too. I was moving a database from SQL Server 2000 to SQL Server 2008 and then found this identity problem in Access. I suspected replication because when I removed one of the subscriptions everything started to work, but after recreating it the problem appeared again.
I use this to solve the problem (taken from somewhere on the net):
At the beginning of the merge insert trigger:
DECLARE @identity int
DECLARE @strsql varchar(128)
set @identity = @@IDENTITY
And at the end of the merge insert trigger:
set @strsql = 'select identity(int,' + CAST(@identity as varchar(15)) + ',1) as id into #temp'
exec(@strsql)
The last code should be placed where /*insert end on this place */ appears in the merge replication code:
if @@error <> 0
goto FAILURE
/*insert end on this place */
return
But I'm searching for a way to do that automatically for all existing merge triggers on the publication and on all existing and future subscriptions.
I'm slowly learning SQL and how to use Form Builder 6. The situation: I have a simple table named 'players'; within the table I have three columns:
player_no (primary key)
position
goals
Within Form Builder 6 I have created a very simple form using these three fields. The form is named 'TEAM'. At the foot of the form I have a button labelled 'Add'. The goal is for the user to enter a player_no, position, and goals and then click 'Add'. This information should then go into my table.
All attempts so far have failed miserably. I have set up a trigger on the button (WHEN-MOUSE-CLICK) and entered the following code:
BEGIN
  INSERT INTO players ( player_no )
  VALUES ( :TEAM.player_no );
END;
For the purpose of testing it out, I have only been using the one field (player_no). The code compiles with no errors, yet when I run the form, enter a player_no, and hit the button, I get the following error in the status bar:
frm-40735: WHEN-MOUSE-CLICK trigger raised unhandled exception ORA-01400
Am I doing something horribly wrong? I am very much new to SQL and Form Builder so any help would be greatly appreciated.
ORA-01400 means "cannot insert NULL". It seems one of your columns is defined as NOT NULL and you omitted it in the insert, or the value of :TEAM.player_no is NULL at insert time.
Also, from somewhere on the web:
FRM-40735: ON-INSERT trigger raised unhandled exception - We have had a similar problem since 11.5.9. We clear the JInitiator cache and temporary internet files (Tools > Internet Options, then under Temporary Internet Files, the Clear Files button). Seems to work.
One of the benefits of using Form Builder is that you almost never need to write the DML statements yourself.
Just base the block on the table - then the user can add and modify as many records as they like, and when they save (i.e. COMMIT), the Forms runtime automatically works out what INSERTs and UPDATEs are required to save the changes.
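So the 'Add' button's trigger can usually be reduced to a single built-in call that saves every pending change in the block (a sketch, assuming the TEAM block is based on the PLAYERS table):
BEGIN
  -- Forms generates and executes the necessary INSERT/UPDATE statements
  COMMIT_FORM;
END;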