Does PostgreSQL provide a change tracking feature similar to SQL Server change tracking?

Does PostgreSQL provide a change tracking feature like the one in SQL Server?
This is basically what I want: I want to move my data to another database at intervals of a few minutes, and for that I only want to fetch the data that has changed in PostgreSQL, the way SQL Server change tracking allows. What is the best way to achieve this?

It's not so easy with PostgreSQL. You can use the WAL (Write-Ahead Log) or triggers. Maybe the best approach is to use an external library such as https://debezium.io

I'm trying to achieve the same goal. This is what I found while searching the Internet. There are a few possible approaches (a minimal trigger-based sketch follows the list):
Streaming replication (ships a binary change log to a standby server)
Slony (uses triggers to accumulate DML changes into tables that are periodically shipped to standby servers)
Logical changeset logging
Audit trigger 91plus
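For the trigger-based route, here is a minimal sketch I put together, assuming PostgreSQL 9.5+ and a hypothetical "orders" source table with a generic audit_log table (the Audit trigger 91plus wiki page linked above is a far more complete implementation):

-- Generic change log table; one row per modified source row.
CREATE TABLE audit_log (
    id         bigserial PRIMARY KEY,
    table_name text        NOT NULL,
    operation  text        NOT NULL,
    changed_at timestamptz NOT NULL DEFAULT now(),
    row_data   jsonb
);

CREATE OR REPLACE FUNCTION log_change() RETURNS trigger AS $$
BEGIN
    IF TG_OP = 'DELETE' THEN
        INSERT INTO audit_log (table_name, operation, row_data)
        VALUES (TG_TABLE_NAME, TG_OP, to_jsonb(OLD));
    ELSE
        INSERT INTO audit_log (table_name, operation, row_data)
        VALUES (TG_TABLE_NAME, TG_OP, to_jsonb(NEW));
    END IF;
    RETURN NULL;  -- return value is ignored for AFTER row triggers
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER orders_audit
AFTER INSERT OR UPDATE OR DELETE ON orders
FOR EACH ROW EXECUTE PROCEDURE log_change();

The downstream job can then periodically pull everything from audit_log with an id greater than the last one it processed.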

Related

How to transfer data from one SQL Server to another without transactional replication

I have a database connected to a website; data from the website is inserted into that database. I need to transfer data from that database to another primary database (SQL Server) on another server in real time (minimum latency).
I cannot use transactional replication in this case. What are the other alternatives to achieve this? Can I integrate data streams such as Apache Kafka with SQL Server?
Without more detail it's hard to give a full answer. There's what's technically possible, and there's architecturally what actually makes sense :)
Yes, you can stream from an RDBMS to Kafka, and from Kafka to an RDBMS. You can use the Kafka Connect JDBC source and sink. There are also CDC tools (e.g. Attunity, GoldenGate, etc.) that support integration with MS SQL and other RDBMSs.
BUT… it depends why you want the data in the second database. Do you need an exact replica of the first? If so, DB-to-DB replication may be a better option. Kafka is a great option if you want to process the data elsewhere and/or persist it in another store. But if you just want MS SQL to MS SQL, Kafka itself may be overkill.

Is there a way to "bind" to an Oracle SQL database and get notified of every update operation in it?

I was wondering... Is there a way to "bind" to an Oracle SQL database and get notified of every create / update / delete operation in it, by any user?
A bit of a far-reaching demand, I know... My goal is to investigate how a specific application uses the DB. A good tool for comparing the data (not the schema) between two states of the database would also be a fair solution. A solution that doesn't require dumping the DB to a file every time is preferred.
Thanks in advance!
I would go with
Flashback Data Archive (Oracle Total Recall) available in Enterprise edition, and
Auditing available in any Oracle edition.
The two can be combined to suit your needs.
@a_horse_with_no_name suggested using LogMiner, and it is a nice solution. But if you are a novice DBA, you can check Oracle Flashback Transaction Query, which has a friendlier interface (though it still uses LogMiner underneath to analyze archived redo log files and retrieve transaction details).
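As a rough, hypothetical example of a Flashback version query (it assumes an EMPLOYEES table, sufficient undo retention, and the FLASHBACK privilege):

-- Every version of a row over the last hour, with the operation and
-- transaction that produced it.
SELECT versions_starttime,
       versions_xid,
       versions_operation,   -- I = insert, U = update, D = delete
       employee_id,
       salary
FROM   employees
       VERSIONS BETWEEN TIMESTAMP (SYSTIMESTAMP - INTERVAL '1' HOUR)
                            AND SYSTIMESTAMP
WHERE  employee_id = 100
ORDER  BY versions_starttime;

-- A VERSIONS_XID value can then be fed to FLASHBACK_TRANSACTION_QUERY
-- to see the whole transaction, including the undo SQL.
SELECT operation, table_name, undo_sql
FROM   flashback_transaction_query
WHERE  xid = HEXTORAW('...');  -- substitute a VERSIONS_XID value from above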
Some useful info on using built-in Oracle auditing follows.
How to get index last modified time in Oracle?
Enabling and using Oracle Standard Auditing
Find who and when changed a specific value in the database – using Oracle Fine-Grained Auditing, plus some info regarding Log Miner.

How to automatically push data from SQL Server to Oracle?

I have users entering data in SharePoint (Running on SQL Server), but my application to view that data will be an Oracle Apex app running on Oracle, obviously. How do I have the data be pushed into the Oracle db automatically?
First off, are you sure that you need to replicate the data to Oracle? Oracle Heterogeneous Services allows you to create a database link in Oracle that connects to a non-Oracle database using ODBC (assuming you use the Transparent Gateway for ODBC which is free). Your APEX application could then query and report on data that is in SQL Server by issuing queries that run over the database link. Tim Hall has a good article (though it's a bit dated and some of the components have been renamed, the general approach is still the same) on configuring Heterogeneous Services.
If you do need to replicate the data, you can create materialized views in Oracle that query the objects in SQL Server using the database link you created with Heterogeneous Services and schedule those materialized views to refresh on a regular basis. The materialized views will need to do a complete refresh, though, which means that every row will need to be copied from SQL Server to Oracle every time there is a refresh. That generally limits the frequency with which you can realistically have refreshes happen. If you need the data to be replicated to the Oracle database and you need to send incremental changes so that the Oracle side doesn't lag too far behind, you can use Streams from a non-Oracle database to an Oracle database but that involves a lot more work.
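Roughly, and with every name below being a placeholder (the ODBC gateway configuration is assumed to already be in place), the Heterogeneous Services route looks like this:

-- Database link from Oracle to SQL Server via the ODBC gateway; "sqlserver_hs"
-- is a hypothetical tnsnames.ora entry pointing at the gateway listener.
CREATE DATABASE LINK sqlserver_link
  CONNECT TO "sharepoint_user" IDENTIFIED BY "password"
  USING 'sqlserver_hs';

-- APEX can query the SQL Server data directly (note the quoted identifiers,
-- which Heterogeneous Services generally needs for case-sensitive names).
SELECT * FROM "SharePointList"@sqlserver_link;

-- Or replicate it locally with a materialized view that does a complete
-- refresh every 10 minutes.
CREATE MATERIALIZED VIEW sharepoint_list_mv
  REFRESH COMPLETE
  START WITH SYSDATE NEXT SYSDATE + 10/1440
  AS SELECT * FROM "SharePointList"@sqlserver_link;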
In SQL Server you can set up linked servers that allow you to view data from other DBs. You might see if Oracle has something similar, if not the same. Alternatively, you could use SQL Server Integration Services (SSIS) to push the data over to an Oracle table. Unfortunately, I only know how to set up linked servers in SQL Server and I don't have a lot of experience with SSIS, so I can't tell you how to do that, but those are the first two options I can think of that you might explore further.
Here's a link I found that might be helpful as well: http://www.dba-oracle.com/t_connecting_sql_server_oracle.htm
There's no way to do it "automatically" that I know of that will work across DBMSs. ETL tools like SQL Server Integration Services might help, but there's going to be a loading delay (as it will have to poll for changes). You could build update triggers on the SharePoint database tables, but that's going to turn into a support nightmare.

How do I keep a table synchronized with a query in SQL Server - ETL?

I wasn't sure how to word this question, so I'll try to explain. I have a third-party database on SQL Server 2005. I have another SQL Server 2008 instance to which I want to "publish" some of the data in the third-party database. That database will then be used as the back end for a portal and Reporting Services; it will be the data warehouse.
On the destination server I want to store the data in table structures different from those in the third-party DB. Some tables I want to denormalize, and there are lots of columns that aren't necessary. I'll also need to add additional fields to some of the tables, which I'll need to populate based on data stored in the same rows. For example, there are varchar fields containing info I'll want to populate other columns with. All of this should cleanse the data and make it easier to report on.
I can write the queries to get all the info I want into a particular destination table. However, I want to be able to keep it up to date with the source on the other server. It doesn't have to be updated immediately (although that would be good), but I'd like it to be updated perhaps every 10 minutes. There are hundreds of thousands of rows of data, but the volume of changes and new rows isn't huge.
I've had a look around, but I'm still not sure of the best way to achieve this. As far as I can tell, replication won't do what I need. I could manually write the T-SQL to do the updates, perhaps using the MERGE statement, and then schedule it as a job with SQL Server Agent (a rough sketch of that approach follows this question). I've also been looking at SSIS, and that looks to be geared at this kind of ETL thing.
I'm just not sure what to use to achieve this and I was hoping to get some advice on how one should go about doing this kind-of thing? Any suggestions would be greatly appreciated.
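For reference, here is roughly the kind of MERGE-plus-Agent-job approach I had in mind; every table, column, and linked-server name below is a placeholder, and the real query would carry the cleansing/denormalization logic:

-- Runs on the 2008 warehouse server (MERGE needs 2008); SOURCE2005 is a
-- hypothetical linked server pointing at the third-party 2005 instance.
MERGE INTO dbo.CustomerWarehouse AS target
USING (
    SELECT CustomerId,
           LTRIM(RTRIM(Name)) AS Name,
           LEFT(Notes, 50)    AS NotesSummary   -- example of a derived column
    FROM   SOURCE2005.ThirdPartyDb.dbo.Customer
) AS source
ON (target.CustomerId = source.CustomerId)
WHEN MATCHED AND (target.Name <> source.Name
               OR target.NotesSummary <> source.NotesSummary) THEN
    UPDATE SET Name = source.Name,
               NotesSummary = source.NotesSummary
WHEN NOT MATCHED BY TARGET THEN
    INSERT (CustomerId, Name, NotesSummary)
    VALUES (source.CustomerId, source.Name, source.NotesSummary)
WHEN NOT MATCHED BY SOURCE THEN
    DELETE;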
For those tables whose schemas/relations are not changing, I would still strongly recommend replication.
For the tables whose data and/or relations are changing significantly, I would recommend that you develop a Service Broker implementation to handle it. The high-level approach with Service Broker (SB) is:
Table-->Trigger-->SB.Service >====> SB.Queue-->StoredProc(activated)-->Table(s)
I would not recommend SSIS for this, unless you want to go to something like daily exports/imports. It's fine for that kind of thing, but IMHO far too kludgey and cumbersome for either continuous or short-period incremental data distribution.
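A very rough sketch of the Table-->Trigger-->SB pipeline diagrammed above follows; every object name (SyncMessage, dbo.Orders, dbo.Orders_Staging, etc.) is a placeholder, and a real implementation also needs error handling, conversation reuse, and handling of deletes:

-- Broker plumbing: message type, contract, queues, services.
CREATE MESSAGE TYPE SyncMessage VALIDATION = WELL_FORMED_XML;
CREATE CONTRACT SyncContract (SyncMessage SENT BY INITIATOR);
CREATE QUEUE SyncInitiatorQueue;
CREATE QUEUE SyncTargetQueue;
CREATE SERVICE SyncInitiatorService ON QUEUE SyncInitiatorQueue;
CREATE SERVICE SyncTargetService ON QUEUE SyncTargetQueue (SyncContract);
GO

-- Trigger packages the changed rows as XML and puts them on the queue,
-- so the write to the source table only pays for a SEND.
CREATE TRIGGER trg_Orders_Sync ON dbo.Orders AFTER INSERT, UPDATE
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @payload XML, @handle UNIQUEIDENTIFIER;
    SET @payload = (SELECT * FROM inserted FOR XML AUTO, ROOT('rows'), TYPE);
    IF @payload IS NULL RETURN;

    BEGIN DIALOG CONVERSATION @handle
        FROM SERVICE SyncInitiatorService
        TO SERVICE 'SyncTargetService'
        ON CONTRACT SyncContract
        WITH ENCRYPTION = OFF;
    SEND ON CONVERSATION @handle MESSAGE TYPE SyncMessage (@payload);
END;
GO

-- Activated procedure drains the queue asynchronously and applies the
-- changes to the destination (here just a staging table).
CREATE PROCEDURE dbo.ProcessSyncQueue
AS
BEGIN
    DECLARE @body VARBINARY(MAX), @handle UNIQUEIDENTIFIER;
    RECEIVE TOP (1) @handle = conversation_handle, @body = message_body
    FROM SyncTargetQueue;

    IF @body IS NOT NULL
        INSERT INTO dbo.Orders_Staging (payload, received_at)
        VALUES (CAST(@body AS XML), GETDATE());
END;
GO

ALTER QUEUE SyncTargetQueue WITH ACTIVATION
    (STATUS = ON, PROCEDURE_NAME = dbo.ProcessSyncQueue,
     MAX_QUEUE_READERS = 1, EXECUTE AS OWNER);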
Nick, I have gone the SSIS route myself. I have jobs that run every 15 minutes, built in SSIS, that do exactly what you are trying to do. We have a huge relational database and wanted to do complicated reporting on top of it using a product called Tableau. We quickly discovered that our relational model wasn't really so hot for that, so I built a cube over it with SSAS, and that cube is updated and processed every 15 minutes.
Yes SSIS does give the aura of being mainly for straight ETL jobs but I have found that it can be used for simple quick jobs like this as well.
I think staging and partitioning will be too much for your case. I am implementing the same thing in SSIS now, but with a frequency of 1 hour, as I need to allow some time for support activities. I am sure that SSIS is a good way of doing it.
During the design, I thought of another way to achieve custom replication: by customizing the Change Data Capture (CDC) process. This way you can get near-real-time replication, but it is a tricky thing.
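For completeness, enabling and reading CDC looks roughly like this (SQL Server 2008+ and, at the time, Enterprise edition; dbo.Orders is a placeholder, and a real sync job would persist the last LSN it processed):

-- One-time setup: enable CDC on the database and on the table to track.
EXEC sys.sp_cdc_enable_db;
EXEC sys.sp_cdc_enable_table
     @source_schema = N'dbo',
     @source_name   = N'Orders',
     @role_name     = NULL;

-- On each sync run: pull only the rows that changed in the LSN window.
DECLARE @from_lsn BINARY(10), @to_lsn BINARY(10);
SET @from_lsn = sys.fn_cdc_get_min_lsn('dbo_Orders');
SET @to_lsn   = sys.fn_cdc_get_max_lsn();

SELECT *
FROM   cdc.fn_cdc_get_all_changes_dbo_Orders(@from_lsn, @to_lsn, N'all');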

Log changes made to all fields in a table to another table (SQL Server 2005)

I would like to log changes made to all fields in a table to another table. This will be used to keep a history of all the changes made to that table (Your basic change log table).
What is the best way to do it in SQL Server 2005?
I am going to assume the logic will be placed in some Triggers.
What is a good way to loop through all the fields checking for a change without hard coding all the fields?
As you can see from my questions, example code would be veeery much appreciated.
I noticed SQL Server 2008 has a new feature called Change Data Capture (CDC). (Here is a nice Channel9 video on CDC). This is similar to what we are looking for except we are using SQL Server 2005, already have a Log Table layout in-place and are also logging the user that made the changes. I also find it hard to justify writing out the before and after image of the whole record when one field might change.
Our current log file structure in place has a column for the Field Name, Old Data, New Data.
Thanks in advance and have a nice day.
Updated 12/22/08: I did some more research and found these two answers on Live Search QnA
You can create a trigger to do this. See
How do I audit changes to SQL Server data.
You can use triggers to log the data changes into the log tables. You can also purchase Log Explorer from www.lumigent.com and use that to read the transaction log to see which user made the change. The database needs to be in full recovery mode for this option, however.
Updated 12/23/08: I also wanted a clean way to compare what changed, and this looked like the reverse of a PIVOT, which I found out is called UNPIVOT in SQL. I am now leaning towards a trigger using UNPIVOT on the INSERTED and DELETED tables. I was curious whether this had already been done, so I am searching for "unpivot deleted inserted".
The posting Using update function from an after trigger had some different ideas, but I still believe UNPIVOT is going to be the route to go (a rough sketch follows).
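Here is a rough sketch of what I think that trigger could look like, assuming a hypothetical dbo.Customer table and the (FieldName, OldData, NewData) log layout described above; all columns have to be cast to a common type for UNPIVOT, and note that UNPIVOT silently drops NULLs, so NULL-to-value changes need extra handling:

CREATE TRIGGER trg_Customer_Audit ON dbo.Customer AFTER UPDATE
AS
BEGIN
    SET NOCOUNT ON;

    WITH OldVals AS (
        SELECT CustomerId, FieldName, OldData
        FROM (SELECT CustomerId,
                     CAST(FirstName AS VARCHAR(255)) AS FirstName,
                     CAST(Phone     AS VARCHAR(255)) AS Phone
              FROM deleted) d
        UNPIVOT (OldData FOR FieldName IN (FirstName, Phone)) AS u
    ),
    NewVals AS (
        SELECT CustomerId, FieldName, NewData
        FROM (SELECT CustomerId,
                     CAST(FirstName AS VARCHAR(255)) AS FirstName,
                     CAST(Phone     AS VARCHAR(255)) AS Phone
              FROM inserted) i
        UNPIVOT (NewData FOR FieldName IN (FirstName, Phone)) AS u
    )
    -- Only log the fields whose value actually changed.
    INSERT INTO dbo.ChangeLog (FieldName, OldData, NewData)
    SELECT o.FieldName, o.OldData, n.NewData
    FROM   OldVals o
    JOIN   NewVals n ON n.CustomerId = o.CustomerId
                    AND n.FieldName  = o.FieldName
    WHERE  o.OldData <> n.NewData;
END;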
Quite late but hopefully it will be useful for other readers…
Below is a modification of an answer I posted last week on a similar topic.
The short answer is that there is no "right" solution that fits all cases. It depends on the requirements and the system being audited.
Triggers
Advantages: relatively easy to implement, and a lot of flexibility over what is audited and how the audit data is stored, because you have full control
Disadvantages: it gets messy when you have a lot of tables and even more triggers. Maintenance can get heavy unless there is a third-party tool to help. Also, depending on the database, it can cause a performance impact.
Creating audit triggers in SQL Server
Log changes to database table with trigger
CDC
Advantages: Very easy to implement, natively supported
Disadvantages: only available in Enterprise edition, and not very robust – if you change the schema, your data will be lost. I wouldn't recommend this for keeping a long-term audit trail
Reading transaction log
Advantages: all you need to do is put the database in full recovery mode, and all the info will be stored in the transaction log
Disadvantages: You need a third party log reader in order to read this effectively
Read the log file (*.LDF) in sql server 2008
SQL Server Transaction Log Explorer/Analyzer
Third party tools
I’ve worked with several auditing tools from ApexSQL but there are also good tools from Idera (compliance manager) and Krell software (omni audit)
ApexSQL Audit – Trigger based auditing tool. Generated and manages auditing triggers
ApexSQL Log – Allows auditing by reading transaction log
Under SQL '05 you actually don't need to use triggers. Just take a look at the OUTPUT clause. OUTPUT works with inserts, updates, and deletes.
For example:
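-- Note: #TempTable must already exist with matching columns before this runs.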
INSERT INTO mytable(description, phone)
OUTPUT INSERTED.description, INSERTED.phone INTO #TempTable
VALUES('blah', '1231231234')
Then you can do whatever you want with the #TempTable, such as inserting those records into a logging table.
As a side note, this is an extremely easy way of capturing the value of an identity field.
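Since the goal here is logging changes, an UPDATE variant can write the before/after values straight into a log table; dbo.ChangeLog and the literal values below are just placeholders:

UPDATE dbo.mytable
SET phone = '5555555555'
OUTPUT 'phone', DELETED.phone, INSERTED.phone
INTO dbo.ChangeLog (FieldName, OldData, NewData)
WHERE description = 'blah';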
You can use Log Rescue. It is quite similar to Log Explorer, but it is free.
It can view the history of each row in any table, with logging info on the user, action, and time.
And you can undo to any version of a row without putting the database into full recovery mode.