Scheduled job to copy data - SQL

I need help with setting up a scheduled job.
I have two SQL Server databases on two different servers. The job would do a SELECT on database A and an INSERT on database B. When something changes in database A, the job would work out what has changed and do an UPDATE on database B.
Is this possible if I have SQL Server 2008 R2 Management Studio?
Thank you very much in advance.

I would suggest setting up replication if possible. Read more about it here.
Otherwise, if you really need your own job, you have two options.
Execute the job with SQL Server Agent every X minutes/hours, check for new data and execute the INSERT statements.
Or create a trigger on the source table which sets a flag in a table (or on the source table itself) after an insert is executed. Your job on the target server then runs every X minutes (or even seconds), checks the source table, evaluates whether a change has happened, and copies just the flagged rows to your target.
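A minimal sketch of the flag-and-copy approach, assuming a flag column named needs_copy and a key column named id on the source table, and the source server reachable from server B as a linked server called ServerA (all of these names are placeholders):

-- Trigger on the source table (server A): mark newly inserted rows as pending.
CREATE TRIGGER trg_source_insert
ON dbo.source_table
AFTER INSERT
AS
    UPDATE s
    SET    s.needs_copy = 1
    FROM   dbo.source_table AS s
    JOIN   inserted AS i ON i.id = s.id;
GO

-- Run by the SQL Agent job on server B every few minutes:
-- copy flagged rows across, then clear the flag.
INSERT INTO dbo.target_table (id, col1, col2)
SELECT id, col1, col2
FROM   ServerA.databaseA.dbo.source_table
WHERE  needs_copy = 1;

UPDATE ServerA.databaseA.dbo.source_table
SET    needs_copy = 0
WHERE  needs_copy = 1;

(The copy and the flag reset should really run inside one transaction so that rows inserted between the two statements are not missed.)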

You could set up one-way replication between the two servers and let it take care of everything for you.
Or, you could add server B as a linked server, then take responsibility for inspecting the records and crafting the insert/update/delete statements yourself.
Have you tried either?
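The linked-server route might look roughly like this (a sketch only; the server, database, table and column names are placeholders, and the change detection is a simple key/value comparison):

-- Register server B as a linked server (run once on server A).
EXEC sp_addlinkedserver @server = N'ServerB', @srvproduct = N'SQL Server';

-- Agent job step on server A: push rows that do not yet exist on server B,
-- then bring changed rows up to date.
INSERT INTO ServerB.databaseB.dbo.target_table (id, col1, col2)
SELECT s.id, s.col1, s.col2
FROM   dbo.source_table AS s
WHERE  NOT EXISTS (SELECT 1
                   FROM   ServerB.databaseB.dbo.target_table AS t
                   WHERE  t.id = s.id);

UPDATE t
SET    t.col1 = s.col1,
       t.col2 = s.col2
FROM   ServerB.databaseB.dbo.target_table AS t
JOIN   dbo.source_table AS s ON s.id = t.id
WHERE  t.col1 <> s.col1 OR t.col2 <> s.col2;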

Related

MSSQL Automatic Merge Database

I have a PC that has a MSSQL database with 800 variables being populated every second. I need that database to merge/backup to a second database on another server PC at least every 10 minutes. Additionally, the first database needs to be wiped clean once per week, in order to save local drive space, so that only 1 week's worth of data is stored on that first database at any given time; meanwhile, the second database keeps everything intact and never gets cleared, only being added upon by the merges that occur every 10 minutes.
To my knowledge, this means I cannot rely on database mirroring, since the first one will be wiped every week. So from what I have gathered, this means I have to have scheduled merges going on every 10 minutes.
I will readily admit I know next to nothing about SQL. So my two questions are:
How do I set up scheduled merges to occur from one database to another in 10 minute frequencies?
How do I set a database to be scheduled/scripted so that it gets cleared every week?
(Note: both databases are running on MS SQL Server 2012 Standard.)
Assuming you can create a linked server on server A that connects to server B (Here's a guide)
Then create a trigger on your table, for example table1:
CREATE TRIGGER trigger1
ON table1
AFTER INSERT
AS
INSERT INTO ServerB.databaseB.dbo.table1
SELECT *
FROM inserted
More on triggers here.
For part 2, you can schedule a job to truncate the table on whatever schedule you would like. How to create a scheduled job.
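If you prefer to script the cleanup job rather than click through the Agent GUI, it could look something like this (a sketch; the job name, database name and schedule are placeholders, and it assumes table1 can simply be truncated because the trigger above has already copied every insert to server B):

USE msdb;
EXEC sp_add_job @job_name = N'Weekly wipe of table1';
EXEC sp_add_jobstep @job_name = N'Weekly wipe of table1',
     @step_name     = N'Truncate table1',
     @subsystem     = N'TSQL',
     @database_name = N'databaseA',
     @command       = N'TRUNCATE TABLE dbo.table1;';
EXEC sp_add_jobschedule @job_name = N'Weekly wipe of table1',
     @name = N'Every Sunday',
     @freq_type = 8,                  -- weekly
     @freq_interval = 1,              -- on Sunday
     @freq_recurrence_factor = 1,     -- every week
     @active_start_time = 10000;      -- 01:00:00
EXEC sp_add_jobserver @job_name = N'Weekly wipe of table1';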
The trigger only fires on inserts, so deleting rows does nothing to the table on server B.
How is the purging/deleting of the data happening, via a stored proc? If so, you could also try transactional replication and replicate the execution of that particular stored proc, but make the proc a dummy on the subscriber, so that when it is replicated and executed on the subscriber nothing gets deleted/purged.

SQL Server - mirror some columns from tables to another database on the same server without replication

I have a SQL Server 2012 Web Edition (11.0.5058.0) instance on a VPS which hosts two databases. I would like to mirror a couple of columns from 3 tables to the second database, but I don't have transactional replication installed.
So I have a Staff table on the source database, from which I just want staff_code and unique_id; I have an Activity table, from which I just need activity_code, description and unique_id, and so on.
What is the best way to go about this - would that be triggers? The data is not regularly updated, possibly once a week, but I would still like the synchronisation to be fast if possible.
The data in the source database may be deleted, updated or inserted, by another application, so I want to ensure the data in my database reflects that information correctly.
Thanks for any suggestions!
UPDATED: Table comparison example:
SELECT CASE WHEN NOT EXISTS
    ( SELECT [COLUMN1],[COLUMN2],[UNIQUE_ID] FROM [SOURCE-DATABASE].[dbo].[SOURCE-TABLE]
      EXCEPT
      SELECT [COLUMN1],[COLUMN2],[UNIQUE_ID] FROM [DESTINATION-DATABASE].[dbo].[DESTINATION-TABLE]
    )
    AND NOT EXISTS
    ( SELECT [COLUMN1],[COLUMN2],[UNIQUE_ID] FROM [DESTINATION-DATABASE].[dbo].[DESTINATION-TABLE]
      EXCEPT
      SELECT [COLUMN1],[COLUMN2],[UNIQUE_ID] FROM [SOURCE-DATABASE].[dbo].[SOURCE-TABLE]
    )
    THEN 'True'
    ELSE 'False' -- grab new or updated data here
END AS result;
As long as the two databases can be connected (e.g. can you do a SELECT * FROM SecondDB.dbo.Activity?), then I would just:
- set up a query (stand-alone, or in a stored procedure) that checks whether the data on the source has changed
- update the second database using normal SELECT, INSERT, UPDATE and possibly DELETE statements
- set up that query/stored procedure with a SQL Server Agent job to run at regular intervals, e.g. once every night, once every week - whatever works for you
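A rough sketch of such a stored procedure for the Staff table, using MERGE (available on SQL Server 2012) so that inserts, updates and deletes at the source are all reflected; the database names are taken from the comparison example above, while the procedure name is made up:

CREATE PROCEDURE dbo.sync_staff
AS
BEGIN
    SET NOCOUNT ON;

    -- Bring the mirrored copy in line with the source: insert new rows,
    -- update changed ones and remove rows deleted at the source.
    MERGE [DESTINATION-DATABASE].dbo.Staff AS t
    USING (SELECT unique_id, staff_code
           FROM [SOURCE-DATABASE].dbo.Staff) AS s
        ON t.unique_id = s.unique_id
    WHEN MATCHED AND t.staff_code <> s.staff_code
        THEN UPDATE SET t.staff_code = s.staff_code
    WHEN NOT MATCHED BY TARGET
        THEN INSERT (unique_id, staff_code) VALUES (s.unique_id, s.staff_code)
    WHEN NOT MATCHED BY SOURCE
        THEN DELETE;
END;

Repeating the pattern for the Activity table (and the third table) covers everything, and the Agent job then just executes the procedures.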
I don't think triggers would be a good choice here - triggers should be kept very small, lean and fast - and "replicating" to another database sounds like too much processing work for a nimble trigger. Also, if your triggers take a long time to complete, the calling application has to wait for that whole time - not good for your application performance!

How to sense inserted or updated record in SQL tables via log table

I have two disjoint databases with some common tables. I cannot make any modifications to the tables of one database (it is always in use, and I cannot, for example, add a column to a table), but I need to sync these two databases every night. What is the best solution for this job? For example, is there any system stored procedure that detects updated or inserted records in a table?
I should also mention that only one of these two databases gets written to.
You can write a batch script to sync the tables every night and use a scheduler to start the script, or I guess you could also use triggers.
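For instance, the nightly script could use EXCEPT to find rows that are new or different in the writable database and record them in a log table before applying them to the second database (a sketch; the database, table and column names are invented):

-- Rows that exist, or look different, in DatabaseA compared with DatabaseB.
INSERT INTO dbo.sync_log (id, col1, col2, logged_at)
SELECT d.id, d.col1, d.col2, GETDATE()
FROM (
    SELECT id, col1, col2 FROM DatabaseA.dbo.common_table
    EXCEPT
    SELECT id, col1, col2 FROM DatabaseB.dbo.common_table
) AS d;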

SQL 2008 audit - show data deleted, etc

I'm using SQL 2008 and have DELETE, UPDATE & INSERT auditing enabled on table XYZ. It works great other than when I query the data:
SELECT * FROM fn_get_audit_file('H:\SQLAudits\*', default, default)
It doesn't actually show me what was deleted or inserted or updated, only that a deletion, etc ... occurred. The statement column of the above query shows this snippet:
delete [dbo].[XYZ] where ([Name] = #0)
I want it to show me what the value of #0 is. Is there a way of doing this?
From what I've found about it, SQL Server 2008's "auditing" feature is very lacking. It does not act as a traditional data audit trail, where you store a new row every time something changes (via Triggers), with complete information such as the user who made the change. It more or less just tells you something has changed without much detail. I really wish SQL Server would include full data audit trail features.
Reference
When creating a Database Audit Specification, you select the operations for the Audit Action Type: INSERT, UPDATE, DELETE.
This results in logs saying that a Select, Insert, Update or Delete happened... but the individual values can never be seen.
The SQL Server Audit tool is very powerful; however, it was never designed to record data changes (e.g. col1 was changed from 'fred' to 'santa' in table 'dummy' in db 'test' by 'sa').
For this you will need Change Data Capture (http://msdn.microsoft.com/en-us/library/bb522489.aspx).
Cheers,
Mark
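Enabling CDC is just a couple of system procedure calls, sketched here for the XYZ table from the question (the database name is a placeholder; note that CDC on SQL Server 2008 requires Enterprise or Developer edition):

USE MyAuditedDb;
EXEC sys.sp_cdc_enable_db;

EXEC sys.sp_cdc_enable_table
     @source_schema = N'dbo',
     @source_name   = N'XYZ',
     @role_name     = NULL;      -- no gating role

-- Changed rows, including the before and after images of updates:
SELECT *
FROM cdc.fn_cdc_get_all_changes_dbo_XYZ(
         sys.fn_cdc_get_min_lsn('dbo_XYZ'),
         sys.fn_cdc_get_max_lsn(),
         N'all update old');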
You can monitor the DELETE statements using SQL Server Profiler. You will be able to see the changes.
Another way to monitor is using the CDC (Change Data Capture) feature in SQL Server. This feature will let you monitor changes in the tables.
Finally, there are related tools such as ApexSQL Trigger.

How to figure out how many tables are affected in database after inserting a record?

One third-party app is storing data in a huge database (SQL Server 2000/2005). This database has more than 80 tables. How would I know how many tables are affected when the application stores a new record in the database? Is there anything available from which I can retrieve the list of affected tables?
You might be able to tell by running a trace in SQL Profiler on the database - the SQL:StmtCompleted event is probably the one to monitor - i.e. if the application does a series of inserts into multiple tables, you should see them go through in Profiler.
You can use SQL Profiler to trace SQL queries, so you will see the sequence of calls caused by one button click in your application.
Also, you can use metadata or SQL tools to get the list of triggers, which could perform a lot of actions on a simple insert.
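For example, a quick way to list every table trigger in the database (this uses catalog views available from SQL Server 2005 onwards, so it will not help on a 2000 instance):

SELECT t.name                   AS trigger_name,
       OBJECT_NAME(t.parent_id) AS table_name,
       t.is_disabled
FROM   sys.triggers AS t
WHERE  t.parent_class = 1       -- table (object) triggers only
ORDER  BY table_name, trigger_name;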
If you have the SQL script that was used to store the new record (usually an INSERT statement, or another DML statement such as UPDATE or MERGE), then you may be able to work out how many tables were affected by parsing that SQL script.
Take this SQL for example:
INSERT INTO emp(fname, lname)
VALUES('john', 'reyes')
You can get result like this:
sstinsert
emp(tetInsert)
Tables:
emp
Fields:
emp.fname
emp.lname
You can add triggers on tables that get fired on update - you could use this to update a log table that would report what was being updated.
see more here: http://www.devarticles.com/c/a/SQL-Server/Using-Triggers-In-MS-SQL-Server/
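A bare-bones version of that idea, with all object names invented (a real audit table would usually also store the old values from the deleted pseudo-table):

-- Log table plus a trigger that writes one row per updated record.
CREATE TABLE dbo.change_log (
    log_id     INT IDENTITY(1,1) PRIMARY KEY,
    table_name SYSNAME,
    key_value  INT,
    changed_by SYSNAME  DEFAULT SUSER_SNAME(),
    changed_at DATETIME DEFAULT GETDATE()
);
GO

CREATE TRIGGER trg_orders_update
ON dbo.orders
AFTER UPDATE
AS
    INSERT INTO dbo.change_log (table_name, key_value)
    SELECT 'orders', i.order_id
    FROM   inserted AS i;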
Profiler is the way to go, as others have said, especially with an unfamiliar third-party database.
I would also spend some time creating diagrams so you can see the foreign key relationships and understand how the database is put together. I usually know my database structure so well that I can tell from the fields being inserted which tables they affect, and I know what triggers are on my tables and what they affect. There is no substitute for taking the time to understand the database you support.