I was wondering how I can update a table of mine without losing the original data. I have an application that lets users enter data, and if they make a mistake they can modify what they entered. But when a user modifies the data, it overwrites the data previously entered.
Is there a way to keep the old data, so that when the user modifies a record the change shows in an added column that might say "data modified"?
This is what I have taken from my application's region source:
select "PROBLEM_ID",
"PROBLEM_TYPE_ID",
"DATE_REPORTED",
"DESCRIPTION",
"POSTCODE"
from "#OWNER#"."CS_PROBLEMS"
You can write an "after update, for each row" trigger to save the old data in a history table, or a "before update, for each row" trigger to set extra columns in the existing table. Then you can create a screen or procedure to retrieve the historical data, or restore it with a manual process.
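As a sketch of the first option (Oracle, to match the APEX region source above) — the history table and trigger below are assumptions: the CS_PROBLEMS_HIST name, column types, and extra audit columns are invented, so adjust them to your real table definition.

```sql
-- Hypothetical history table mirroring CS_PROBLEMS, plus audit columns.
CREATE TABLE CS_PROBLEMS_HIST (
  PROBLEM_ID       NUMBER,
  PROBLEM_TYPE_ID  NUMBER,
  DATE_REPORTED    DATE,
  DESCRIPTION      VARCHAR2(4000),
  POSTCODE         VARCHAR2(20),
  CHANGE_NOTE      VARCHAR2(30),
  CHANGED_ON       DATE
);

CREATE OR REPLACE TRIGGER CS_PROBLEMS_AUD
AFTER UPDATE ON CS_PROBLEMS
FOR EACH ROW
BEGIN
  -- Keep the pre-update values; the live row keeps the new ones.
  INSERT INTO CS_PROBLEMS_HIST
    (PROBLEM_ID, PROBLEM_TYPE_ID, DATE_REPORTED, DESCRIPTION, POSTCODE,
     CHANGE_NOTE, CHANGED_ON)
  VALUES
    (:OLD.PROBLEM_ID, :OLD.PROBLEM_TYPE_ID, :OLD.DATE_REPORTED,
     :OLD.DESCRIPTION, :OLD.POSTCODE, 'data modified', SYSDATE);
END;
/
```

A report region over CS_PROBLEMS_HIST would then show every superseded version of a row, labelled "data modified".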
I need to audit changes in a situation where triggers are not performing well enough to use. In the audit I need to know exactly who made each change, based on a column named LastModifiedBy (gathered at login and used in inserts and updates). We use a single SQL account to access the database, so I can't use that to tie a change to a user.
Scenario: we are now researching the SQL transaction log to determine what has changed. The table has a LastUpdatedBy column that we used with the trigger solution. With the previous solution I had before-and-after transaction data, so I could tell whether the user making the change was the same user or a new one.
Problem: while looking at tools like dbForge Transaction Log and ApexSQL Audit, I can't find a solution that works. I can see the UPDATE command, but I can't tell whether all the fields actually changed (just because SQL says to update a field does not mean its value actually changed). ApexSQL Audit does have a before-and-after capability, but if the LastUpdatedBy field does not change then I don't know what the original value was.
Trigger problem: large updates and inserts are crushing performance because of the triggers. I gather before-and-after data in the triggers so I can tell exactly what changed, but that volume of data turns a 2-second update of 1,000 rows into one lasting more than 3 minutes.
I have created a table in SQL by copying data from another table in a different database. How can I make changes made to the old table get reflected in the new table automatically?
You can use a trigger to reflect the changes.
Refer to this.
Given that you have access to the first table, a trigger would work fine.
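A minimal sketch of such a trigger in SQL Server; SourceTable, CopyTable, the id key column, and the CopyDb database name are all placeholders, so substitute your own names. Run it in the database that holds the source table:

```sql
-- Keeps CopyDb.dbo.CopyTable in step with dbo.SourceTable.
-- Assumes both tables share the same columns and a key column id.
CREATE TRIGGER trg_SourceTable_sync
ON dbo.SourceTable
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
  SET NOCOUNT ON;

  -- Drop copy rows that were deleted or are about to be replaced.
  DELETE c
  FROM CopyDb.dbo.CopyTable AS c
  WHERE c.id IN (SELECT id FROM deleted);

  -- Re-insert the current version of any inserted or updated rows.
  INSERT INTO CopyDb.dbo.CopyTable
  SELECT * FROM inserted;
END;
```

The delete-then-insert pattern handles inserts, updates, and deletes with one trigger, at the cost of rewriting whole rows on every update.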
I have a SQL view created from normalized tables linked into Access. I created a form off of it to help control user access. I can make all the updates I want in the linked view, but, in the form, if I try to change a record I already updated I get the following error: "The data has been changed. Another user edited this record and saved the changes before you attempted to save your changes."
Dirty is set to False and all tables that will update have a timestamp.
Sounds like the auto form save event fires more than once.
You might want to have more control over the update transaction, with the following:
Do not bind the form's Record Source property to the table or query.
Use queries to load data:
a select-based recordset, as in Set Rst1 = dbCurr.OpenRecordset("SELECT ...");
and update data using Rst1.Update, or an action query: DoCmd.RunSQL "UPDATE Query;"
Test the timestamp field before you save the changes.
The cost is that you will need a bit more code to transfer data from the recordset/query to the form fields and vice versa; plus you need to build a Save or Update button to initiate the save transaction.
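A rough VBA sketch of that pattern. The control, field, view, and table names here (MyView, txtName, lngID, TS) are placeholders, and the timestamp comparison is simplified; a SQL Server rowversion column may need a byte-wise compare:

```vba
' Form is unbound; data is loaded and saved explicitly.
Private mTS As Variant   ' concurrency/timestamp value as loaded

Private Sub Form_Load()
    Dim Rst1 As DAO.Recordset
    Set Rst1 = CurrentDb.OpenRecordset( _
        "SELECT ID, CustomerName, TS FROM MyView WHERE ID = " & Me.lngID)
    Me.txtName = Rst1!CustomerName
    mTS = Rst1!TS            ' remember the row version we loaded
    Rst1.Close
End Sub

Private Sub cmdSave_Click()
    Dim Rst1 As DAO.Recordset
    Set Rst1 = CurrentDb.OpenRecordset( _
        "SELECT * FROM MyView WHERE ID = " & Me.lngID)
    ' Test the timestamp before writing, as described above.
    If Rst1!TS <> mTS Then
        MsgBox "Record was changed by another user; reload before saving."
    Else
        Rst1.Edit
        Rst1!CustomerName = Me.txtName
        Rst1.Update
    End If
    Rst1.Close
End Sub
```

Because the form never holds a bound edit, Access's own "data has been changed" collision check no longer fires; you decide in cmdSave_Click what counts as a conflict.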
Is it possible to capture an event AFTER the paste confirmation message has been displayed when a user pastes records directly into a datasheet subform? I need this to be able to log the creation of new records in an audit table.
By capturing the Before/After Update and Insert events, I can easily build a collection of the records that have been added, ready to insert into the audit log. However, after all these events have fired, the user is prompted to confirm with a "You are about to paste x record(s)" message.
So the problem is that the user may click "No" here, and I can't find any way of capturing that, meaning the insertions would all be recorded in the audit log even though, because the user cancelled the paste, the records wouldn't actually exist.
The only way I can think of handling this is to create a temp table to display the existing records, and adding a "Save" button to write the temp table back, but running a comparison beforehand to update the audit log. However, this isn't ideal, especially as there is more than one of these tables.
If you use Data Macros you can achieve this. I set up a table to be audited, TestDataTable, that looks like this
and an audit table like this
I added 3 data macros to my TestDataTable
The After Insert looks like this
After Update looks like this
and After Delete looks like this
Which generates records which looks like this
And if you paste data in but click No on the paste confirmation, Access takes care of everything for you: the records are not added to your main table and no audit records are inserted.
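For anyone rebuilding this without the screenshots, the After Insert data macro is roughly the following outline (the audit table and its field names are assumptions; build it in the macro designer rather than as text). After Update and After Delete are the same shape, with [Old].[...] available there to capture prior values:

```
CreateRecord In AuditTable
    SetField  AuditTable.TableName,  "TestDataTable"
    SetField  AuditTable.Action,     "Insert"
    SetField  AuditTable.RecordID,   [TestDataTable].[ID]
    SetField  AuditTable.ChangedOn,  Now()
```

Data macros run inside the table engine after the row actually commits, which is why a cancelled paste never produces audit rows.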
I'm trying to figure out whether there's a method for copying the contents of a table in a main schema into a table in another schema, and then somehow updating or "refreshing" that copy as the main schema gets updated.
For example:
The schema "BBLEARN" has a table users:
SELECT * INTO SIS_temp_data.dbo.bb_users FROM BBLEARN.dbo.users
This selects and inserts 23k rows into the table bb_course_users in my placeholder schema SIS_temp_data.
Thing is, the users table in the BBLEARN schema gets updated on a constant basis, whether or not new users get added, or there are updates to accounts or disables or enables, etc. The main reason for copying the table into a temp table is for data integration purposes and is unrelated to the question at hand.
So, is there a method in SQL Server that will allow me to "update" my new table in the spare schema based on when the data in the main schema gets updated? Or do I just need to run a scheduled task that does a SELECT * INTO every few hours?
Thank you.
You could create a trigger that updates the spare table whenever an update or insert is performed on the main table.
See http://msdn.microsoft.com/en-us/library/ms190227.aspx
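For example, run in the BBLEARN database (this assumes the users table has a single-column primary key, called pk1 here in Blackboard style; replace it with the real key column):

```sql
-- Keeps SIS_temp_data.dbo.bb_users in step with dbo.users.
CREATE TRIGGER trg_sync_bb_users
ON dbo.users
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
  SET NOCOUNT ON;

  -- Remove copies of rows that were deleted or are being replaced.
  DELETE t
  FROM SIS_temp_data.dbo.bb_users AS t
  WHERE t.pk1 IN (SELECT pk1 FROM deleted);

  -- Re-insert the current version of inserted or updated rows.
  INSERT INTO SIS_temp_data.dbo.bb_users
  SELECT * FROM inserted;
END;
```

If near-real-time freshness is not required, the scheduled job you describe (dropping and re-creating the copy with SELECT * INTO every few hours) remains the simpler option and adds no load to each individual write on the source table.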