I'm looking for an auditing solution that does exactly what Change Data Capture (CDC) does, except I need it to also track the application user that made the change. I'm currently using SQL Server 2012 Enterprise and may be upgrading to 2014 later this year.
We already have an auditing solution in place that leverages Delete, Insert, and Update triggers, but some new requirements might force us to update every audit trigger and corresponding audit table. Given the various problems we've run into with that solution over the years, this seems like as good a time as any to reevaluate and potentially replace it.
To give you an idea of what I'm currently working with (and may be able to leverage), we use a stored procedure (ConnectionInitialize) to store a user id with a SPID in a table (ApplicationUser) and then we delete the row using another stored procedure (ConnectionReset) once we're done making our deletes, inserts, and updates.
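For reference, here is a rough sketch of that pattern (simplified; the real procedures and table aren't shown here, so the column names are approximations):

CREATE TABLE dbo.ApplicationUser (
    SPID int NOT NULL PRIMARY KEY,
    UserId int NOT NULL,
    ConnectedAt datetime NOT NULL DEFAULT GETDATE()
);
GO

CREATE PROCEDURE dbo.ConnectionInitialize @UserId int
AS
    -- remember which application user owns this connection
    INSERT INTO dbo.ApplicationUser (SPID, UserId) VALUES (@@SPID, @UserId);
GO

CREATE PROCEDURE dbo.ConnectionReset
AS
    -- clear the mapping once the DML work is done
    DELETE FROM dbo.ApplicationUser WHERE SPID = @@SPID;
GO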
If we were to use CDC, one idea was to add a trigger to something like the cdc.lsn_time_mapping table, but I couldn't find a way to map the LSN back to the SPID (and therefore the user id) that was being used. CDC also presented some other issues in that the capture is always a little bit behind.
I looked into SQL Server Audit a little bit, but that presented some challenges of its own. We're using Transparent Data Encryption (TDE) to appease some of our security requirements, but SQL Server Audit looks like it'd need a separate encryption strategy; that and I'm more interested in the columns than in the actual SQL statements. Even so, these aren't deal-breakers for me, so I'm still looking into it.
Given what I'm trying to accomplish, does anyone have any feedback or recommendations?
By itself, CDC doesn't meet the requirements, because CDC only captures changes to your data, not the context under which those changes were made. You can, however, get what you're looking for if you're willing to tag your data with some audit columns. The basic idea is that you append a column to your table (or to a different table if you aren't able to modify the actual table for whatever reason) and populate it with the user who last modified the record (pretty simple to do via an insert/update trigger). Once that is actual data, you can consume it however you need to (CDC being one possible mechanism).
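A minimal sketch of that idea, assuming a hypothetical dbo.Orders table keyed by OrderId and the ApplicationUser/SPID mapping described in the question:

ALTER TABLE dbo.Orders ADD LastModifiedBy int NULL;
GO

CREATE TRIGGER dbo.trg_Orders_StampUser ON dbo.Orders
AFTER INSERT, UPDATE
AS
BEGIN
    SET NOCOUNT ON;
    -- stamp each affected row with the application user mapped to this SPID
    UPDATE o
    SET LastModifiedBy = au.UserId
    FROM dbo.Orders AS o
    JOIN inserted AS i ON i.OrderId = o.OrderId
    JOIN dbo.ApplicationUser AS au ON au.SPID = @@SPID;
END;
GO

With that column populated, CDC (or any other consumer) sees who made the change as ordinary column data.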
Late answer but hopefully useful.
There is a third party tool, ApexSQL Audit, capable of meeting your requirements. My previous company used it for years and was satisfied with it.
There is also a helpful comparison article that covers audited data, auditing mechanisms, integrity protection, etc. for both CDC and the audit tool in one place.
I need to audit DDL changes made to a database. Those changes need to be replicated in many other databases at a later time. I found here that one can enable DDL triggers to keep track of DDL activities, and that works great for create table and drop table operations, because the trigger gets the T-SQL that was executed, and I can happily store it somewhere and simply execute it on the other servers later.
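For reference, a minimal sketch of such a DDL trigger; the dbo.DDLLog table is just an assumption for somewhere to store the statements:

CREATE TRIGGER trg_CaptureDDL ON DATABASE
FOR DDL_DATABASE_LEVEL_EVENTS
AS
BEGIN
    DECLARE @evt xml;
    SET @evt = EVENTDATA();
    -- store the raw T-SQL plus some context for later replay
    INSERT INTO dbo.DDLLog (EventType, ObjectName, TSQLCommand, LoginName, EventDate)
    VALUES (
        @evt.value('(/EVENT_INSTANCE/EventType)[1]', 'nvarchar(128)'),
        @evt.value('(/EVENT_INSTANCE/ObjectName)[1]', 'nvarchar(128)'),
        @evt.value('(/EVENT_INSTANCE/TSQLCommand/CommandText)[1]', 'nvarchar(max)'),
        ORIGINAL_LOGIN(),
        GETDATE()
    );
END;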
The problem I'm having is with alter operations: when a column name is changed from Management Studio, the event that is produced doesn't contain any information about columns! It just says the table was locked... What's more, if many columns are changed at once (say, column foo => oof, and also, column bar => rab) the event is fired only once!
My poor man's solution would be to have a table to store the structure of the table that's going to be altered, before and after the alter operation. That way, I could compare both structures and figure out what happened to which column(s).
But before I do that, I was wondering if it is possible to do it using some other feature from SQL Server that I have overlooked, or maybe there's a better way. How would you go about this?
There is a product meant for doing just that (I wrote it).
It monitors scripts that contain DDL changes, who wrote them and when, together with their effect on performance, and it lets you easily copy them out as a single deployment script. For what you're asking, the free version is sufficient.
http://www.seracode.com/
There is no special feature in SQL Server for exactly this need. You can use triggers, but they require a lot of T-SQL coding to work properly. A faster solution would be a third-party tool, but those aren't free. Please take a look at this answer regarding third-party tools: https://stackoverflow.com/a/18850705/2808398
I'm working on the design of a relational database. It has several tables, and there are multiple users at the application level. I need to know which changes were made to a given record of a given table, by which user, at what time, and what actually changed. There is a table for saving the users' information, and that table is also included in this behavior.
How should I do this in the SQL database design so I can let users see which one of them made these changes?
What you want is a Wiki-like versioning. Basically, for every table you want to keep versions, you'll want to create at least a copy of that table with the fields you mentioned added (userid, when it was added). That's probably all there is to it, as long as you only need to track changes. Then, upon an edit, you just create a backup of the current row in that copied table and put the new one in the actual table. This way you can (hopefully) add the versioning without having to touch existing presentational code.
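A minimal sketch of that idea, assuming a hypothetical dbo.Article table keyed by ArticleId; here the backup is taken by a trigger, though doing it in application code works just as well:

CREATE TABLE dbo.Article_History (
    ArticleId int NOT NULL,
    Title nvarchar(200) NOT NULL,
    Body nvarchar(max) NULL,
    ChangedBy nvarchar(128) NOT NULL,
    ChangedAt datetime NOT NULL DEFAULT GETDATE()
);
GO

CREATE TRIGGER dbo.trg_Article_Version ON dbo.Article
AFTER UPDATE
AS
BEGIN
    SET NOCOUNT ON;
    -- back up the pre-update version of each affected row
    INSERT INTO dbo.Article_History (ArticleId, Title, Body, ChangedBy)
    SELECT d.ArticleId, d.Title, d.Body, SUSER_SNAME()
    FROM deleted AS d;
END;
GO

SUSER_SNAME() records the database login here; if you need the application-level user instead, substitute whatever mechanism carries that id down to the database.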
It gets a little trickier if you need to record additional actions like creation of new rows and deletion.
If you need a code example, just have a look under the hood of some wiki such as https://mediawiki.org/
For starters you can look at SQL Server's version tracking mechanisms (row versioning or row changes). After that you can look at the SQL Server Audit features. I think SQL Server Audit would be the best fit for your needs.
On the other hand, if you want to build ad-hoc versioning yourself, do not go down the trigger route. Imagine having to create triggers on every table for inserts, updates, and deletes; that is bad practice.
I think ad-hoc versioning should be avoided (it degrades performance and is difficult to support), but if it cannot be avoided, I would certainly use CONTEXT_INFO to track the current user. I would then build something that reads the table's schema, picks up the changes via SQL Server's change tracking mechanisms, and stores them in a tablename, changeduser, changedtime, column, prevValue, newValue style. I would not replicate each and every table just to record changes.
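For what it's worth, a minimal sketch of the CONTEXT_INFO part; encoding the user id in the first 4 bytes is just one possible convention:

-- when the application opens a connection, stamp it with the current user id
DECLARE @UserId int;
DECLARE @ctx varbinary(128);
SET @UserId = 42;
SET @ctx = CAST(@UserId AS varbinary(128));
SET CONTEXT_INFO @ctx;
GO

-- later, e.g. inside a trigger, read the user id back
SELECT CAST(SUBSTRING(CONTEXT_INFO(), 1, 4) AS int) AS CurrentAppUserId;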
I have a database with 50 tables and I want to log users' requests, such as inserts, updates, or deletes, on all the tables in the database. I could also create a trigger for each request type.
What is the best way to do this from a performance perspective or is there a better way to track this?
You can also create audit tables which are populated by triggers (and which allow much more flexibility than Change Data Capture). The critical component is to capture sets of data, not to work row-by-row. It does add some overhead, yes, but if you write the triggers correctly it isn't that much. Be sure to capture who (including which application, if you have multiple applications hitting the database) and when, as well as the old and new values. Set up one audit table per table you want audited (too much locking if you use only one audit table). And at the time you set up your system, write the code to get data back from a bad transaction or set of transactions. That makes it easier to recover when something does go wrong and you need to revert.
We use two tables per audited table: one contains the info about the process that did the changes (name of the application, date, user, etc. and an audit id), the other contains the details about what was changed (old and new values, the ID of the record affected, and the column affected). This structure lets us use the same layout for every table being audited, allows the tables to change without having to change the audit tables, and makes it easy to script the audit tables for new tables.
It is also easy for us to see which records were changed at the same time or in the same process, or to find out which of the many applications that touch our database was responsible for bad data, as well as who in particular was responsible. This helps us track down application bugs and find out why the data was changed the way it was in some cases. It also makes it easier for us to track down all the data that was affected by a broken process rather than just the one record we knew about.
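A rough sketch of that two-table layout (the names are illustrative, not the actual schema described above):

CREATE TABLE dbo.AuditHeader (
    AuditId int IDENTITY(1,1) PRIMARY KEY,
    TableName sysname NOT NULL,
    ApplicationName nvarchar(128) NOT NULL,   -- e.g. APP_NAME()
    UserName nvarchar(128) NOT NULL,
    AuditDate datetime NOT NULL DEFAULT GETDATE()
);

CREATE TABLE dbo.AuditDetail (
    AuditDetailId int IDENTITY(1,1) PRIMARY KEY,
    AuditId int NOT NULL REFERENCES dbo.AuditHeader (AuditId),
    RecordId int NOT NULL,
    ColumnName sysname NOT NULL,
    OldValue nvarchar(max) NULL,
    NewValue nvarchar(max) NULL
);

Each trigger writes one AuditHeader row per statement and one AuditDetail row per changed column, which is what keeps the structure identical no matter which table is being audited.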
If you have Enterprise Edition, look into Change Data Capture. If you don't have Enterprise and aren't interested in capturing the historical values of the columns that change, look into Change Tracking.
See Comparing Change Data Capture and Change Tracking to understand the differences between the two.
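For reference, a minimal sketch of enabling each one; the database and table names are placeholders:

-- Change Data Capture (Enterprise Edition): captures the changed column values
EXEC sys.sp_cdc_enable_db;
EXEC sys.sp_cdc_enable_table
    @source_schema = N'dbo',
    @source_name   = N'MyTable',
    @role_name     = NULL;

-- Change Tracking (any edition): records that rows changed, but not the values
ALTER DATABASE MyDatabase
    SET CHANGE_TRACKING = ON (CHANGE_RETENTION = 7 DAYS, AUTO_CLEANUP = ON);
ALTER TABLE dbo.MyTable
    ENABLE CHANGE_TRACKING WITH (TRACK_COLUMNS_UPDATED = ON);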
Assuming all requests to insert, update and/or delete data go through some middle-tier data access layer, I would suggest you do your logging there. This is where we do all of ours. It is much simpler than trying to extract the actual insert/delete/update statements out of SQL Server.
If you want to do auditing of data, you can look into Change Data Capture (CDC). But this requires the Enterprise Edition.
I wanted to see if others are using SQL Server 2008 Change Data Capture and if so how do you like it? We currently use APEXSQL Audit Triggers for our auditing purposes which seems to work pretty well, but means we have to add triggers to all of our "audited" tables.
Some of the articles I have read have pointed out things like having to create a new capture table when you change a schema and then drop the old one, but as far as general maintenance is concerned it seems fairly straightforward.
Any comments/input are greatly appreciated.
--S
How busy is the system, and what is the end goal for the auditing: tracking changes over a short period of time, or auditing changes for the long term? One of the biggest problems I have with CDC is that it uses the log reader and SQL Agent jobs to capture changes, so a busy system can get behind to the point that it will never catch up unless you turn off CDC, leading at worst to a full transaction log, or at best to delayed truncation causing the log to grow in size. If your intent is to do real auditing, CDC is not built for that; it's more for synchronizing changes than for long-term auditing, unless you set up jobs to pull the data over into audit tables as you would with a triggered solution.
You don't mention the new Server Audit Specifications here, which would be another option to look at, but keep in mind that Server Audit Specifications audit by inclusion. This is one of the reasons I still use the old tried-and-true triggers-and-audit-tables method in my SQL Server 2008 Enterprise databases; it's still the easiest solution until the newer features get past being v1.0 features in the product.
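For reference, the feature looks roughly like this (shown with a database audit specification, which is what captures table-level DML; the file path and object names are placeholders):

CREATE SERVER AUDIT DataChangeAudit
    TO FILE (FILEPATH = N'C:\AuditLogs\');
GO
ALTER SERVER AUDIT DataChangeAudit WITH (STATE = ON);
GO
-- the database audit specification picks the DML actions to capture
CREATE DATABASE AUDIT SPECIFICATION DataChangeAuditSpec
    FOR SERVER AUDIT DataChangeAudit
    ADD (INSERT, UPDATE, DELETE ON dbo.MyTable BY public)
    WITH (STATE = ON);
GO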
If you have a working auditing solution, I wouldn't even try it.
Another problem I noted when looking into this was that you can't add things like the user who made the change to the tables (or at least I couldn't figure out how), so your audit tables may be more flexible than CDC allows.
Finally, CDC data expires after 3 days by default (you can change the retention, but you still have to set a specific time frame). We keep our audit records for longer than that, so you still need to copy them out of the CDC tables into an audit table.
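If it helps, the retention is a property of the CDC cleanup job and is specified in minutes (the default of 4320 minutes is the 3 days mentioned above):

-- keep CDC data for 14 days instead of the default 3
EXEC sys.sp_cdc_change_job
    @job_type  = N'cleanup',
    @retention = 20160;   -- minutes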
I would like to log changes made to all fields in a table to another table. This will be used to keep a history of all the changes made to that table (Your basic change log table).
What is the best way to do it in SQL Server 2005?
I am going to assume the logic will be placed in some Triggers.
What is a good way to loop through all the fields checking for a change without hard coding all the fields?
As you can see from my questions, example code would be veeery much appreciated.
I noticed SQL Server 2008 has a new feature called Change Data Capture (CDC). (Here is a nice Channel9 video on CDC). This is similar to what we are looking for, except we are using SQL Server 2005, already have a Log Table layout in place, and are also logging the user who made the changes. I also find it hard to justify writing out the before and after image of the whole record when only one field might change.
Our current log file structure in place has a column for the Field Name, Old Data, New Data.
Thanks in advance and have a nice day.
Updated 12/22/08: I did some more research and found these two answers on Live Search QnA
You can create a trigger to do this. See
How do I audit changes to sql server data.
You can use triggers to log the data changes into the log tables. You can also purchase Log Explorer from www.lumigent.com and use that to read the transaction log to see what user made the change. The database needs to be in full recovery for this option however.
Updated 12/23/08: I also wanted a clean way to compare what changed and this looked like the reverse of a PIVOT, which I found out in SQL is called UNPIVOT. I am now leaning towards a Trigger using UNPIVOT on the INSERTED and DELETED tables. I was curious if this was already done so I am going through a search on "unpivot deleted inserted".
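For anyone following along, here is a rough sketch of where I'm heading with the UNPIVOT idea, using a hypothetical dbo.Customer table; everything is cast to nvarchar(4000) so the unpivoted columns share one type, and dbo.ChangeLog stands in for our Field Name / Old Data / New Data log table:

CREATE TRIGGER dbo.trg_Customer_Log ON dbo.Customer
AFTER UPDATE
AS
BEGIN
    SET NOCOUNT ON;

    WITH OldRows AS (
        SELECT CustomerId, FieldName, OldData
        FROM (SELECT CustomerId,
                     CAST(Name  AS nvarchar(4000)) AS Name,
                     CAST(Phone AS nvarchar(4000)) AS Phone
              FROM deleted) AS d
        UNPIVOT (OldData FOR FieldName IN (Name, Phone)) AS u
    ),
    NewRows AS (
        SELECT CustomerId, FieldName, NewData
        FROM (SELECT CustomerId,
                     CAST(Name  AS nvarchar(4000)) AS Name,
                     CAST(Phone AS nvarchar(4000)) AS Phone
              FROM inserted) AS i
        UNPIVOT (NewData FOR FieldName IN (Name, Phone)) AS u
    )
    INSERT INTO dbo.ChangeLog (CustomerId, FieldName, OldData, NewData, ChangedBy, ChangedAt)
    SELECT o.CustomerId, o.FieldName, o.OldData, n.NewData, SUSER_SNAME(), GETDATE()
    FROM OldRows AS o
    JOIN NewRows AS n
      ON n.CustomerId = o.CustomerId AND n.FieldName = o.FieldName
    WHERE ISNULL(o.OldData, N'') <> ISNULL(n.NewData, N'');   -- only log fields that actually changed
END;

One caveat: UNPIVOT drops NULLs, so a change from NULL to a value (or back) won't show up with this exact shape and would need extra handling.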
The posting "Using update function from an after trigger" had some different ideas, but I still believe UNPIVOT is going to be the route to go.
Quite late but hopefully it will be useful for other readers…
Below is a modification of my answer I posted last week on a similar topic.
Short answer is that there is no “right” solution that would fit all. It depends on the requirements and the system being audited.
Triggers
Advantages: relatively easy to implement, a lot of flexibility on what is audited and how the audit data is stored because you have full control
Disadvantages: It gets messy when you have a lot of tables and even more triggers. Maintenance can get heavy unless there is some third party tool to help. Also, depending on the database it can cause a performance impact.
Creating audit triggers in SQL Server
Log changes to database table with trigger
CDC
Advantages: Very easy to implement, natively supported
Disadvantages: Only available in enterprise edition, not very robust – if you change the schema your data will be lost. I wouldn’t recommend this for keeping a long term audit trail
Reading transaction log
Advantages: all you need to do is put the database in full recovery mode, and all the info is stored in the transaction log
Disadvantages: You need a third party log reader in order to read this effectively
Read the log file (*.LDF) in sql server 2008
SQL Server Transaction Log Explorer/Analyzer
Third party tools
I’ve worked with several auditing tools from ApexSQL but there are also good tools from Idera (compliance manager) and Krell software (omni audit)
ApexSQL Audit – trigger-based auditing tool. Generates and manages auditing triggers
ApexSQL Log – Allows auditing by reading transaction log
Under SQL '05 you actually don't need to use triggers. Just take a look at the OUTPUT clause. OUTPUT works with inserts, updates, and deletes.
For example:
-- the capture table has to exist first; column types here are just assumptions
CREATE TABLE #TempTable (description varchar(100), phone varchar(20));

INSERT INTO mytable (description, phone)
OUTPUT INSERTED.description, INSERTED.phone INTO #TempTable
VALUES ('blah', '1231231234');
Then you can do whatever you want with the #TempTable, such as inserting those records into a logging table.
As a side note, this is an extremely easy way of capturing the value of an identity field.
You can use Log Rescue. It's much the same as Log Explorer, but it is free.
It can view the history of each row in any table, with logging info about the user, action, and time.
And you can undo back to any version of a row without setting the database to full recovery mode.