kafka-users mailing list archives

From Joe Ammann <...@pyx.ch>
Subject Re: Live data streaming from Oracle to Oracle using Kafka
Date Tue, 11 Jun 2019 07:40:47 GMT
Hi Kailash

On 6/11/19 9:24 AM, Kailash Kota wrote:
> I understand Oracle Golden Gate is a data replication tool which uses log-based technology
> to stream all changes to a database from source, to target. Can you please help me in understand
> what is the role of Kafka after the data is provided to it by OGG ?

The tool in question here is not "standard Oracle GoldenGate", but rather "Oracle GoldenGate
for Big Data" https://docs.oracle.com/goldengate/bd123010/gg-bd/. OGG-BD captures change records
from the Oracle redo logs (or from other source databases) and publishes those changes to Kafka
(or to other typical "Big Data" technologies like HDFS, HBase, or Flume).
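To make the Kafka side concrete, here is a sketch of what a consumer of those change records might do. The payload below is an illustrative assumption modeled on the typical output of the OGG-BD JSON formatter (op_type I/U/D, before/after row images); the table and column names are invented, and the exact fields depend on your formatter configuration.

```python
import json

# A representative change record, roughly as the OGG-BD Kafka handler publishes
# it when configured with the JSON formatter. Field names and the sample table
# are assumptions for illustration -- verify against your own setup.
raw = """
{
  "table": "HR.EMPLOYEES",
  "op_type": "U",
  "op_ts": "2019-06-11 07:30:00.000000",
  "before": {"EMPLOYEE_ID": 101, "SALARY": 9000},
  "after":  {"EMPLOYEE_ID": 101, "SALARY": 9500}
}
"""

def describe_change(record: dict) -> str:
    """Turn one change record into a human-readable summary."""
    ops = {"I": "insert", "U": "update", "D": "delete"}
    op = ops.get(record["op_type"], "unknown")
    return f"{op} on {record['table']} at {record['op_ts']}"

record = json.loads(raw)
print(describe_change(record))

# Diff the before/after images to find which columns actually changed.
changed = {
    col for col in record["after"]
    if record["before"].get(col) != record["after"][col]
}
print(changed)  # here: the SALARY column
```

In a real deployment this logic would sit in a Kafka consumer loop (or a Kafka Streams / Connect sink), reading one such record per message from the topic OGG-BD writes to.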

The alternatives that I'm aware of that do similar things are:
- Attunity (commercial) https://www.attunity.com/content/change-data-capture-cdc-oracle/
- StreamSets (partially OSS) https://streamsets.com/blog/change-data-capture-from-oracle-with-streamsets-data-collector/
- Debezium https://debezium.io/
- Kafka Connect Oracle https://github.com/erdemcer/kafka-connect-oracle

There may be more.

> Also if we just need to data replicate our oracle DB, is there any other ways we can
> do it without Oracle Golden Gate ? Is Apache Storm/Flink a prospect for us to look into (though
> they are streaming tools).

As Robin Moffatt's article notes, besides log-based CDC there is also query-based CDC. The
choice of tools there is much wider, and you could certainly use Storm or Flink to implement a
query-based CDC pipeline.
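The core of query-based CDC is just polling for rows newer than a high-water mark. A minimal sketch, using SQLite from the Python standard library so it runs anywhere; the orders table and its updated_at watermark column are invented for the demo, and a real Oracle setup would run the equivalent query over JDBC/OCI against something like a LAST_UPDATED column:

```python
import sqlite3

# Demo source table; in practice this would be your Oracle table.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, status TEXT, updated_at INTEGER)"
)
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, "NEW", 100), (2, "SHIPPED", 150), (3, "NEW", 200)],
)

def poll_changes(conn, high_water_mark):
    """Fetch rows changed since the last poll; return them plus the new watermark."""
    rows = conn.execute(
        "SELECT id, status, updated_at FROM orders "
        "WHERE updated_at > ? ORDER BY updated_at",
        (high_water_mark,),
    ).fetchall()
    new_mark = rows[-1][2] if rows else high_water_mark
    return rows, new_mark

# First poll picks up everything after t=100; a scheduler would call this
# repeatedly, carrying the watermark forward between runs.
changes, mark = poll_changes(conn, 100)
```

Note the usual caveats of this approach: it misses deletes and intermediate updates between polls, which is exactly why log-based CDC exists.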

Since most people I have seen doing this want a persistent store/buffer for the change records,
rather than processing them directly without any buffering, most solutions push the records
into Kafka via some Kafka Connect setup.
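As one example of such a setup, a common choice is Confluent's JDBC source connector polling the table in timestamp+incrementing mode. The connection URL, credentials, table, and column names below are placeholders for illustration:

```json
{
  "name": "oracle-orders-source",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:oracle:thin:@//dbhost:1521/ORCLPDB1",
    "connection.user": "cdc_user",
    "connection.password": "********",
    "table.whitelist": "ORDERS",
    "mode": "timestamp+incrementing",
    "timestamp.column.name": "UPDATED_AT",
    "incrementing.column.name": "ID",
    "topic.prefix": "oracle-"
  }
}
```

You would POST this to the Kafka Connect REST API, and the connector then writes each changed row as a message to the oracle-ORDERS topic.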

If your requirements are for direct processing, I would look into Flink for this.

CU, Joe
