How to set Database Audit Specification for all the tables in db - sql

I need to create an audit to track all CRUD events for all the tables in a database. I have more than 100 tables in the DB; is there a way to create a specification that will include all the tables in the DB?
P.S.: I am using SQL Server 2008

I had the same question. The answer is actually simpler than expected and doesn't need a custom C# app to generate lots of SQL to cover all the tables; example SQL is below. The important point is to specify the DATABASE securable and the [public] role for INSERT/UPDATE/DELETE, which covers every table at once.
USE [master]
GO
-- Create the server-level audit and its file target
CREATE SERVER AUDIT [CancerStatsAudit]
TO FILE
(   FILEPATH = N'I:\CancerStats\Audit\'
    ,MAXSIZE = 128 MB
    ,MAX_ROLLOVER_FILES = 64
    ,RESERVE_DISK_SPACE = OFF
)
WITH
(   QUEUE_DELAY = 1000
    ,ON_FAILURE = CONTINUE
    ,AUDIT_GUID = '5a0a18cf-fe42-4171-ad01-5e19af9e27d1'
)
ALTER SERVER AUDIT [CancerStatsAudit] WITH (STATE = ON)
GO
USE [CancerStats]
GO
-- Audit DML across the whole database for every principal (everyone is a member of [public])
CREATE DATABASE AUDIT SPECIFICATION [CancerStatsDBAudit]
FOR SERVER AUDIT [CancerStatsAudit]
ADD (INSERT ON DATABASE::[CancerStats] BY [public]),
ADD (UPDATE ON DATABASE::[CancerStats] BY [public]),
ADD (DELETE ON DATABASE::[CancerStats] BY [public]),
ADD (EXECUTE ON DATABASE::[CancerStats] BY [public]),
ADD (DATABASE_OBJECT_CHANGE_GROUP),
ADD (SCHEMA_OBJECT_CHANGE_GROUP)
WITH (STATE = ON)
GO
NB: DATABASE_OBJECT_CHANGE_GROUP and SCHEMA_OBJECT_CHANGE_GROUP are not needed for auditing INSERT, UPDATE and DELETE - see additional notes below.
Additional notes:
The example above also includes the DATABASE_OBJECT_CHANGE_GROUP and the SCHEMA_OBJECT_CHANGE_GROUP. These were included since my requirement was to also track CREATE/ALTER/DROP actions on database objects. It is worth noting that the documentation is wrong for these.
https://learn.microsoft.com/en-us/sql/relational-databases/security/auditing/sql-server-audit-action-groups-and-actions
The above page states that DATABASE_OBJECT_CHANGE_GROUP tracks CREATE, ALTER and DROP. This is not true (I've tested in SQL Server 2016); only CREATE is tracked, see:
https://connect.microsoft.com/SQLServer/feedback/details/370103/database-object-change-group-audit-group-does-not-audit-drop-proc
In fact, to track CREATE, ALTER and DROP, use SCHEMA_OBJECT_CHANGE_GROUP. Despite the above learn.microsoft.com documentation page suggesting this only works for schemas, it actually works for objects within the schema as well.

Change Data Capture
You can use the Change Data Capture mechanism provided by SQL Server 2008.
http://msdn.microsoft.com/en-us/library/bb522489.aspx
Note that this only captures Create (insert), Update and Delete - it does not cover reads.
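A minimal sketch of enabling CDC, assuming a placeholder database, schema and table (MyDatabase, dbo, MyTable); CDC also requires SQL Server Agent to be running and an edition that supports it:
USE [MyDatabase]
GO
-- Enable CDC for the database, then for each table you want tracked
EXEC sys.sp_cdc_enable_db;
GO
EXEC sys.sp_cdc_enable_table
    @source_schema = N'dbo',
    @source_name   = N'MyTable',
    @role_name     = NULL;   -- NULL = no gating role required to read the changes
GO
-- Changed rows are then exposed through cdc.fn_cdc_get_all_changes_dbo_MyTable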
Triggers and Audit tables
Even for 100 tables, you can use a single script to generate the audit tables and the necessary triggers (a sketch follows below). Note that this is not a very good mechanism - it slows things down, because control is not returned to the caller until the trigger execution is complete.
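As a rough illustration of that generation approach, here is a sketch that builds (and only prints) one skeleton audit trigger per user table from sys.tables; the trigger naming and the audit-table logic are placeholders you would fill in:
-- Generate one skeleton audit trigger per user table and PRINT it for review.
DECLARE @schema SYSNAME, @table SYSNAME, @sql NVARCHAR(MAX);
DECLARE table_cur CURSOR FOR
    SELECT s.name, t.name
    FROM sys.tables AS t
    JOIN sys.schemas AS s ON s.schema_id = t.schema_id;
OPEN table_cur;
FETCH NEXT FROM table_cur INTO @schema, @table;
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @sql = N'CREATE TRIGGER ' + QUOTENAME(@schema) + N'.' + QUOTENAME(N'tr_Audit_' + @table)
             + N' ON ' + QUOTENAME(@schema) + N'.' + QUOTENAME(@table)
             + N' AFTER INSERT, UPDATE, DELETE AS BEGIN SET NOCOUNT ON;'
             + N' /* copy rows from the inserted/deleted tables into your audit table here */ END;';
    PRINT @sql;   -- swap for EXEC sp_executesql @sql once the output looks right
    FETCH NEXT FROM table_cur INTO @schema, @table;
END;
CLOSE table_cur;
DEALLOCATE table_cur;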

Found a way to create the Database Audit Specification:
I wrote C# code that dynamically generated the SQL statement for all the tables and all the actions I needed, and executed the resulting string.
Frankly, the wizard provided is no help at all if you are creating a Database Audit Specification for more than a couple of tables.
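For what it's worth, the same statement can also be generated in T-SQL instead of C#; a sketch, assuming a server audit named [MyServerAudit] and covering INSERT/UPDATE/DELETE on every user table:
-- Build one CREATE DATABASE AUDIT SPECIFICATION statement listing every user table.
DECLARE @sql NVARCHAR(MAX) =
      N'CREATE DATABASE AUDIT SPECIFICATION [AllTablesDmlAudit]'
    + N' FOR SERVER AUDIT [MyServerAudit]';
SELECT @sql = @sql + N' ADD (INSERT, UPDATE, DELETE ON OBJECT::'
            + QUOTENAME(s.name) + N'.' + QUOTENAME(t.name) + N' BY [public]),'
FROM sys.tables AS t
JOIN sys.schemas AS s ON s.schema_id = t.schema_id;
SET @sql = LEFT(@sql, LEN(@sql) - 1) + N' WITH (STATE = ON);';
PRINT @sql;   -- note: PRINT truncates long strings; SELECT @sql to see it all, then EXEC sp_executesql @sql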

Related

How can I know what changed in a SQL database for a certain time?

I have a system that manages a company's printers, and I need to understand how the workflow between the website and the database works by seeing what is added or changed in the database with each user interaction. Is there a way to find or create some kind of log, for one database or even the entire SQL Server instance, that can show me what I need?
You can use the Extended Events feature in some cases:
https://learn.microsoft.com/en-us/sql/relational-databases/extended-events/quick-start-extended-events-in-sql-server?view=sql-server-2017
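For example, a minimal Extended Events session that captures completed statements for one database; the session name, file path and database name below are placeholders:
CREATE EVENT SESSION [TrackDml] ON SERVER
ADD EVENT sqlserver.sql_statement_completed
    (ACTION (sqlserver.sql_text, sqlserver.username)
     WHERE (sqlserver.database_name = N'MyDatabase'))
ADD TARGET package0.event_file (SET filename = N'C:\Temp\TrackDml.xel');
GO
ALTER EVENT SESSION [TrackDml] ON SERVER STATE = START;
GO
-- Read the captured events later with sys.fn_xe_file_target_read_file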
I think what you're looking for are triggers.
You can create tables to log the updated or changed data and use triggers to automatically feed the log table on any change, for example:
-- SQL Server (T-SQL) trigger template
CREATE TRIGGER [trigger_name]
ON [table_name]
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;
    -- [create an entry in the log table here]
END;
Inside a SQL Server trigger, use the inserted and deleted virtual tables to refer to the data (inserted holds the new values, deleted holds the old ones).
Just for the record, a tool for exactly that purpose exists and is installed together with SQL Server: SQL Server Profiler.

SQL 2008 audit - show data deleted, etc

I'm using SQL 2008 and have DELETE, UPDATE & INSERT auditing enabled on table XYZ. It works great other than when I query the data:
SELECT * FROM sys.fn_get_audit_file('H:\SQLAudits\*', default, default)
It doesn't actually show me what was deleted or inserted or updated, only that a deletion, etc ... occurred. The statement column of the above query shows this snippet:
delete [dbo].[XYZ] where ([Name] = #0)
I want it to show me what the value of #0 is. Is there a way of doing this?
From what I've found about it, SQL Server 2008's "auditing" feature is very lacking. It does not act as a traditional data audit trail, where you store a new row every time something changes (via Triggers), with complete information such as the user who made the change. It more or less just tells you something has changed without much detail. I really wish SQL Server would include full data audit trail features.
For reference: when creating a Database Audit Specification, you select the operations for the audit action types INSERT, UPDATE and DELETE. The resulting logs show that a Select, Insert, Update or Delete occurred, but the individual values can never be seen.
The SQL Server Audit tool is very powerful; however, it was never designed to record data changes (e.g. col1 was changed from 'fred' to 'santa' in table 'dummy' in db 'test' by 'sa').
For this you will need Change Data Capture (http://msdn.microsoft.com/en-us/library/bb522489.aspx).
Cheers,
Mark
You can monitor the DELETE statements using SQL Server Profiler; you will be able to see the changes.
Another way to monitor is using the CDC (Change Data Capture) feature in SQL Server. This feature will let you monitor changes in the tables.
Finally, there are related third-party tools such as ApexSQL Trigger.

Auditing data changes in SQL Server 2008

I am trying to find a highly efficient method of auditing changes to data in a table. Currently I am using a trigger that looks at the inserted and deleted tables to see which rows have changed, and inserts these changes into an audit table.
The problem is that this is proving to be very inefficient (obviously!). It's possible that with 3,000 rows inserted into the database at one time (which wouldn't be unusual), 215,000 rows would have to be inserted in total to audit them.
What is a reasonable way to audit all this data without the inserts taking a long time? It needs to be fast!
Thanks.
A correctly written trigger should be fast enough.
You could also look at Change Data Capture
Auditing in SQL Server 2008
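To show what a set-based ("correctly written") trigger looks like, here is a sketch for a hypothetical table dbo.Customer with a matching dbo.Customer_Audit table:
CREATE TRIGGER dbo.tr_Customer_Audit
ON dbo.Customer
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;
    -- One set-based insert per statement, no row-by-row processing.
    INSERT INTO dbo.Customer_Audit (CustomerID, Name, AuditAction, AuditDate, AuditUser)
    SELECT i.CustomerID, i.Name, 'INSERT/UPDATE', GETDATE(), SUSER_SNAME()
    FROM inserted AS i
    UNION ALL
    SELECT d.CustomerID, d.Name, 'DELETE', GETDATE(), SUSER_SNAME()
    FROM deleted AS d
    WHERE NOT EXISTS (SELECT 1 FROM inserted AS i WHERE i.CustomerID = d.CustomerID);
END;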
I quite often use AutoAudit:
AutoAudit is a SQL Server (2005, 2008, 2012) code-gen utility that creates audit trail triggers with:
- Created, CreatedBy, Modified, ModifiedBy, and RowVersion (incrementing INT) columns added to the table
- Insert events logged to the Audit table
- Updates: old and new values logged to the Audit table
- Deletes: all final values logged to the Audit table
- A view to reconstruct deleted rows
- A UDF to reconstruct row history
- A schema audit trigger to track schema changes
- Trigger re-generation when ALTER TABLE changes the table
Update: a major upgrade to version 3.20 was released in November 2013 with these added features:
- Handles tables with up to 5 primary key columns
- Performance improvements, up to 90% faster than version 2.00
- Improved historical data retrieval UDF
- Handles column/table names that need QUOTENAME [ ]
- An archival process to keep the live audit tables smaller/faster while retaining the older data in archive AutoAudit tables
As others have already mentioned, you can use the Change Data Capture, Change Tracking, and Audit features in SQL Server, but to keep it simple and use one solution to track all SQL Server activities, including these DML operations, I suggest trying ApexSQL Comply. You can disable all other options and leave only DML auditing enabled.
It uses a centralized repository for the information captured from multiple SQL Server instances and their databases.
It would be best to read this article first, and then decide on using this tool:
http://solutioncenter.apexsql.com/methods-for-auditing-sql-server-data-changes-part-9-the-apexsql-solution/
SQL Server Notifications on insert update delete table change
The SqlTableDependency C# component provides a low-level implementation for receiving database notifications, creating a SQL Server queue and Service Broker service behind the scenes.
Have a look at http://www.sqltabledependency.it/
For any record change, SqlTableDependency's event handler receives a notification containing the modified table record's values as well as the DML operation (insert, update or delete) executed on your database table.
You could allow the table to be self-auditing by adding additional columns, for example:
For an INSERT - this is a new record, and its existence in the table is the audit itself.
With a DELETE - you can add columns like IsDeleted BIT \ DeletingUserID INT \ DeletingTimestamp DATETIME to your table.
With an UPDATE - you can add columns like IsLatestVersion BIT \ ParentRecordID INT to track version changes.
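A sketch of what those extra columns could look like on a hypothetical table dbo.MyTable:
-- Hypothetical table; column names follow the answer above.
ALTER TABLE dbo.MyTable ADD
    IsDeleted         BIT      NOT NULL CONSTRAINT DF_MyTable_IsDeleted DEFAULT (0),
    DeletingUserID    INT      NULL,
    DeletingTimestamp DATETIME NULL,
    IsLatestVersion   BIT      NOT NULL CONSTRAINT DF_MyTable_IsLatestVersion DEFAULT (1),
    ParentRecordID    INT      NULL;   -- points at the previous version of the row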

How to figure out how many tables are affected in database after inserting a record?

A third-party app is storing data in a huge database (SQL Server 2000/2005) with more than 80 tables. How can I tell how many tables are affected when the application stores a new record in the database? Is there something available that lets me retrieve the list of affected tables?
You might be able to tell by running a trace in SQL Profiler on the database - the SQL:StmtCompleted event is probably the one to monitor - i.e. if the application does a series of inserts into multiple tables, you should see them go through in Profiler.
You can use SQL Server Profiler to trace the SQL queries, so you will see the sequence of calls caused by one button click in your application.
You can also use the metadata or SQL tools to get the list of triggers, which could perform a lot of additional actions on a simple insert.
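On SQL Server 2005 and later, that trigger list can be pulled straight from the metadata; for example:
-- List every DML trigger and the table it belongs to.
SELECT  OBJECT_SCHEMA_NAME(tr.parent_id) AS table_schema,
        OBJECT_NAME(tr.parent_id)        AS table_name,
        tr.name                          AS trigger_name,
        tr.is_disabled
FROM    sys.triggers AS tr
WHERE   tr.parent_class = 1   -- 1 = DML triggers on tables/views
ORDER BY table_schema, table_name;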
If you have the SQL script that is used to store the new record (usually an INSERT statement, or another DML statement such as UPDATE or MERGE), then you can work out how many tables are affected by parsing that SQL script.
Take this SQL for example:
Insert into emp(fname, lname)
Values('john', 'reyes')
Running it through a SQL parser can give you a result like this:
sstinsert
emp(tetInsert)
Tables:
emp
Fields:
emp.fname
emp.lname
You can add triggers on the tables that fire on update - you could use these to update a log table that records what was being updated.
See more here: http://www.devarticles.com/c/a/SQL-Server/Using-Triggers-In-MS-SQL-Server/
Profiler is the way to go, as others have said, especially with an unfamiliar third-party database.
I would also spend some time creating diagrams so you can see the foreign key relationships and understand how the database is put together. I usually know my database structure so well that I can tell from the fields being inserted which tables they affect, and I know what triggers are on my tables and what they affect. There is no substitute for taking the time to understand the database you support.

Suggestions for implementing audit tables in SQL Server?

One simple method I've used in the past is basically just creating a second table whose structure mirrors the one I want to audit, and then creating an update/delete trigger on the main table. Before a record is updated/deleted, the current state is saved to the audit table via the trigger.
While effective, the data in the audit table is not the most useful or simple to report off of. I'm wondering if anyone has a better method for auditing data changes?
There shouldn't be too many updates of these records, but it is highly sensitive information, so it is important to the customer that all changes are audited and easily reported on.
How much writing vs. reading of this table(s) do you expect?
I've used a single audit table, with columns for Table, Column, OldValue, NewValue, User, and ChangeDateTime - generic enough to work with any other changes in the DB, and while a LOT of data got written to that table, reports on that data were sparse enough that they could be run at low-use periods of the day.
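For reference, a rough sketch of that generic audit table; the name and column sizes are illustrative:
CREATE TABLE dbo.AuditLog
(
    AuditLogID     BIGINT IDENTITY(1,1) NOT NULL PRIMARY KEY,
    TableName      SYSNAME       NOT NULL,
    ColumnName     SYSNAME       NOT NULL,
    OldValue       NVARCHAR(MAX) NULL,
    NewValue       NVARCHAR(MAX) NULL,
    ChangedBy      SYSNAME       NOT NULL CONSTRAINT DF_AuditLog_ChangedBy DEFAULT (SUSER_SNAME()),
    ChangeDateTime DATETIME      NOT NULL CONSTRAINT DF_AuditLog_ChangeDateTime DEFAULT (GETDATE())
);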
Added:
If the amount of data vs. reporting is a concern, the audit table could be replicated to a read-only database server, allowing you to run reports whenever necessary without bogging down the master server from doing their work.
We are using a two-table design for this.
One table holds data about the transaction (database, table name, schema, column, the application that triggered the transaction, the host name of the login that started the transaction, date, number of affected rows and a couple more).
The second table is only used to store the data changes, so that we can undo changes if needed and report on old/new values.
Another option is to use a third-party tool for this, such as ApexSQL Audit, or the Change Data Capture feature in SQL Server.
I have found these two links useful:
- Using CLR and a single audit table: Creating a generic audit trigger with SQL 2005 CLR
- Using triggers and a separate audit table for each table being audited: How do I audit changes to SQL Server data?
Are there any built-in audit packages? Oracle has a nice package, which will even send audit changes off to a separate server outside the access of any bad guy who is modifying the SQL.
Their example is awesome... it shows how to alert on anybody modifying the audit tables.
OmniAudit might be a good solution for your needs. I've never used it before because I'm quite happy writing my own audit routines, but it sounds good.
I use the approach described by Greg in his answer and populate the audit table with a stored procedure called from the table triggers.