According to http://blogs.msdn.com/b/sqlcat/archive/2011/10/17/updating-a-database-snapshot.aspx I should be able to successfully execute an INSERT, UPDATE and DELETE against a Database Snapshot.
The idea is to create a view of a table in the source database before you create the snapshot, then create the snapshot, and then run your updates through that view while connected to the snapshot.
I have tried this on my SQL Server 2014 (v12.0.2269) and I still get the error
Failed to update database "Snapshot2015_07" because the database is read-only.
The reason I am keen for this to work is that financials need to be frozen at a particular date, but need to be updated if errors are found in the snapshot.
Has anyone had success recently doing this?
I know there are alternatives like AutoAudit, but it is a lot of work to implement for one or two updates/deletes on a database with multiple tables of 5 million+ rows.
The view has to specify the database name (which is the original database name, not the snapshot database name), along with the schema and table name. Ensure the view you created specifies those three parts of the fully qualified object name.
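For example, a sketch of what the answer above describes (the view, table, and column names here are illustrative, not from the original post):

-- In the source database, before taking the snapshot: the view must use the
-- three-part name of the source table (database.schema.table).
CREATE VIEW dbo.vw_Financials
AS
SELECT Id, Amount, PostedDate
FROM Financials2015.dbo.Ledger;
GO

-- After the snapshot (e.g. Snapshot2015_07) has been created, connect to the
-- snapshot and issue the change through the view. Because the view resolves to
-- the source database by name, the write lands there rather than in the
-- snapshot's read-only files, which is why no "read-only" error is raised.
-- USE Snapshot2015_07;
-- UPDATE dbo.vw_Financials SET Amount = 0 WHERE Id = 42;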
How do I create a trigger in Microsoft SQL Server to keep track of all deleted data from any table in the database in a single audit table? I do not want to write a trigger for each and every table in the database. There will be only one single audit table which keeps track of all the deleted data from any table.
For example:
If a row is deleted from the Person table, capture all of that row's data and store it in XML format in the audit table.
Please check the solution I tried to describe at SQL Server Log Tool for Capturing Data Changes.
The solution is built on dynamically creating triggers on selected tables to capture data changes (after insert, update, delete) and store them in a general table.
A job then executes periodically and parses the data captured in this general table. Once the data is parsed, it is much easier for a human to see which table field changed along with its old and new values.
I hope this proposed approach helps you build your own solution.
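A minimal sketch of the trigger side of such an approach, using a single shared audit table and the Person table from the example above (the table structure and all names are assumptions):

-- Shared audit table (assumed structure)
CREATE TABLE dbo.AuditLog (
    AuditId   int IDENTITY(1,1) PRIMARY KEY,
    TableName sysname     NOT NULL,
    Action    varchar(10) NOT NULL,
    AuditData xml         NOT NULL,
    AuditDate datetime2   NOT NULL DEFAULT SYSUTCDATETIME(),
    AuditUser sysname     NOT NULL DEFAULT SUSER_SNAME()
);
GO

-- One trigger per audited table is still needed, but the body is generic,
-- which is what makes generating the triggers dynamically (as described above) practical.
CREATE TRIGGER dbo.trg_Person_Delete ON dbo.Person
AFTER DELETE
AS
BEGIN
    SET NOCOUNT ON;
    IF NOT EXISTS (SELECT 1 FROM deleted) RETURN;   -- a DELETE that touched no rows

    INSERT INTO dbo.AuditLog (TableName, Action, AuditData)
    SELECT N'dbo.Person',
           'DELETE',
           (SELECT * FROM deleted FOR XML RAW('row'), ELEMENTS, ROOT('rows'), TYPE);
END;
GO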
Our test database is linked to a database owned by another department within our company. Whenever they bring their database down (like when refreshing with production data) our application goes down as well. The only thing we are doing with their database is we have a view that selects from one of their tables and we join to this view in a number of queries.
Ideally, whenever their system goes down, I'd like our view to pull from a backup of their table that exists in our database. It has slightly stale data, but at least we would be able to continue working. I thought of using TRY...CATCH in the view or in a SQL function, but it is not supported in either. A stored procedure might work, except that you can't join to the results of a stored procedure in queries, can you?
How can I make my SELECT statements fall back to a backup table when the linked server's table is unavailable?
So what I ended up doing was to create a SQL Server Agent job that calls sp_testlinkedserver in a TRY...CATCH every few minutes. If the linked server is down, we alter the view to point to our backup table; if it's up, we alter it to point to the "live" data again. We also track the previous state, so we only alter the view when the state has changed. It works pretty slick.
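A rough sketch of such a job step (the linked server, database, view, and table names are assumptions):

BEGIN TRY
    -- raises an error if the linked server cannot be reached
    EXEC sp_testlinkedserver N'OTHERDEPT';

    -- ALTER VIEW must be the only statement in its batch, hence sp_executesql
    EXEC sp_executesql N'ALTER VIEW dbo.vw_TheirData AS
        SELECT * FROM OTHERDEPT.TheirDb.dbo.TheirTable;';
END TRY
BEGIN CATCH
    EXEC sp_executesql N'ALTER VIEW dbo.vw_TheirData AS
        SELECT * FROM dbo.TheirTable_Backup;';
END CATCH;

In the actual job the previous state is also recorded (for example in a small status table) so the view is only altered when the state has changed.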
I'm trying to figure out if there's a method for copying the contents of a table in a main schema into a table in another schema, and then somehow updating or "refreshing" that copy as the main schema gets updated.
For example:
schema "BBLEARN", has table users
SELECT * INTO SIS_temp_data.dbo.bb_users FROM BBLEARN.dbo.users
This selects and inserts 23k rows into the table bb_users in my placeholder schema SIS_temp_data.
Thing is, the users table in the BBLEARN schema gets updated on a constant basis, whether new users get added, accounts get updated, or accounts get disabled or enabled. The main reason for copying the table into a temp table is for data integration purposes and is unrelated to the question at hand.
So, is there a method in SQL Server that will allow me to "update" my new table in the spare schema based on when the data in the main schema gets updated? Or do I just need to run a scheduled task that does a SELECT * INTO every few hours?
Thank you.
You could create a trigger which updates the spare table whenever an update or insert is performed on the main schema.
see http://msdn.microsoft.com/en-us/library/ms190227.aspx
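A rough sketch of such a trigger, assuming both databases are on the same instance, the copy table already exists, and the users table has a key column named user_id (that column name is an assumption):

-- Run this in the BBLEARN database; a trigger must be created in the same
-- database as its base table, but it can write to another database.
CREATE TRIGGER dbo.trg_users_sync ON dbo.users
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;

    -- remove rows that were deleted or are about to be replaced
    DELETE t
    FROM SIS_temp_data.dbo.bb_users AS t
    WHERE t.user_id IN (SELECT user_id FROM deleted)
       OR t.user_id IN (SELECT user_id FROM inserted);

    -- copy over the current version of the changed rows
    INSERT INTO SIS_temp_data.dbo.bb_users
    SELECT i.*
    FROM inserted AS i;
END;
GO

If near-real-time freshness isn't required, the scheduled SELECT * INTO (dropping and recreating the copy) every few hours is the simpler option.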
I am very new to Microsoft SQL Server and I am not really into databases.
Yesterday I made a mistake and deleted all the rows in the wrong table (I should have deleted the records in another table).
So now it is very important for me to restore all the deleted records in this table in some way (only these records and not the whole DB, if that is possible somehow).
For completeness, the table is named dbo.VulnerabilityWorkaround and has the following fields:
Id: int not null (is the PK)
Description: varchar(max), not null
I think that SQL Server retains the information related to the deleted records in a log file (or in something like it, maybe a DB table... I don't know).
Can I in some way restore my original dbo.VulnerabilityWorkaround with a query or something like it?
There is the transaction log, but as far as I know whether it can be used depends on the database's recovery model and backup strategy, meaning you would have to fire up a restore operation.
Other than restoring a previous backup, I don't think you have many options.
Since you just need one table it could be easier to restore a backup to a different server and then copy/move only the data you need using SSIS or Bulk Import/Export.
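For example, something along these lines (the backup path and logical file names are assumptions):

-- Restore the most recent backup side by side under a different name
RESTORE DATABASE MyDb_Restored
FROM DISK = N'D:\Backups\MyDb_Full.bak'
WITH MOVE N'MyDb'     TO N'D:\Data\MyDb_Restored.mdf',
     MOVE N'MyDb_log' TO N'D:\Data\MyDb_Restored.ldf',
     RECOVERY;
GO

-- Copy back only the rows missing from the live table
-- (if Id is an IDENTITY column, wrap this in SET IDENTITY_INSERT ... ON/OFF)
INSERT INTO MyDb.dbo.VulnerabilityWorkaround (Id, Description)
SELECT r.Id, r.Description
FROM MyDb_Restored.dbo.VulnerabilityWorkaround AS r
WHERE NOT EXISTS (
    SELECT 1
    FROM MyDb.dbo.VulnerabilityWorkaround AS o
    WHERE o.Id = r.Id
);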
I'm using SQL 2008 and have DELETE, UPDATE & INSERT auditing enabled on table XYZ. It works great other than when I query the data:
SELECT * FROM sys.fn_get_audit_file('H:\SQLAudits\*', default, default)
It doesn't actually show me what was deleted or inserted or updated, only that a deletion, etc ... occurred. The statement column of the above query shows this snippet:
delete [dbo].[XYZ] where ([Name] = #0)
I want it to show me what the value of #0 is. Is there a way of doing this?
From what I've found about it, SQL Server 2008's "auditing" feature is very lacking. It does not act as a traditional data audit trail, where you store a new row every time something changes (via Triggers), with complete information such as the user who made the change. It more or less just tells you something has changed without much detail. I really wish SQL Server would include full data audit trail features.
When creating a Database Audit Specification, you select operations for the Audit Action Type: INSERT, UPDATE, DELETE.
This results in logs saying that a Select, Insert, Update, or Delete happened, but the individual values can never be seen.
The SQL Server Audit tool is very powerful; however, it was never designed to record data changes (e.g. col1 was changed from 'fred' to 'santa' in table 'dummy' in db 'test' by 'sa').
For this you will need Change Data Capture (http://msdn.microsoft.com/en-us/library/bb522489.aspx).
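Enabling it for the XYZ table from the question would look roughly like this (CDC requires Enterprise or Developer edition on SQL Server 2008):

-- Run in the database that contains dbo.XYZ
EXEC sys.sp_cdc_enable_db;
GO
EXEC sys.sp_cdc_enable_table
     @source_schema = N'dbo',
     @source_name   = N'XYZ',
     @role_name     = NULL;   -- NULL = no gating role required to read the changes
GO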
Cheers,
Mark
You can monitor the delete statements using SQL Server Profiler. You will be able to see the changes.
Another way to monitor is using the CDC (Change Data Capture) feature in SQL Server. This feature will let you monitor changes in the tables.
Finally, there are other related tools like ApexSQL Trigger.
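If CDC is enabled as sketched earlier, the captured changes, including before and after images for updates, can be read like this (dbo_XYZ is the default capture instance name for dbo.XYZ):

DECLARE @from_lsn binary(10) = sys.fn_cdc_get_min_lsn('dbo_XYZ');
DECLARE @to_lsn   binary(10) = sys.fn_cdc_get_max_lsn();

-- __$operation: 1 = delete, 2 = insert, 3 = update (before), 4 = update (after)
SELECT *
FROM cdc.fn_cdc_get_all_changes_dbo_XYZ(@from_lsn, @to_lsn, N'all update old');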